{"id":1353,"date":"2019-11-06T14:03:49","date_gmt":"2019-11-06T22:03:49","guid":{"rendered":"https:\/\/www.lightsondata.com\/?p=1353"},"modified":"2019-11-04T14:21:00","modified_gmt":"2019-11-04T22:21:00","slug":"how-to-implement-data-profiling-for-successful-source-data-discovery","status":"publish","type":"post","link":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/","title":{"rendered":"How to implement data profiling for successful source data discovery"},"content":{"rendered":"<p>Effective Data Warehoues (DW) data source profiling is often an overlooked step in data warehouse data preparation. DW project teams need to understand all quality aspects of source data before preparation for downstream consumption. Beyond simple visual examination, you need to profile, visualize, detect outliers, and find null values and other junk data in your source data sets.<\/p>\n<p>The first purpose of this profiling analysis is to decide if the data source is even worth including in your project. As data warehouse guru Ralph Kimball writes in his book The Data Warehouse Toolkit, <em>\u201cEarly disqualification of data sources is a responsible step that can earn you respect from the rest of the team, even when it seems to be bad news.\u201d<\/em><\/p>\n<p>If the data source is deemed worthy of inclusion, results from profiling this source will help you evaluate the data for overall quality and estimate the ETL work necessary to cleanse the data for downstream analysis.<\/p>\n<p>A leading cause of data warehousing barriers during planning and development is extracting erroneous or poor quality source data as input to data warehouse ETLs.<\/p>\n<h2>Data discovery, data mappings &amp; design, ETL development<\/h2>\n<p>Typical ETLs extract data from sources and loads it to targets. 
Project teams that perform data discovery profiling on data sources before building ETL data mappings can expect to achieve the following:<\/p>\n<ul>\n<li>A more accurate understanding of the data types, formats, and precision of each source<\/li>\n<li>A more precise specification for the types of data transformations required to clean, de-duplicate, aggregate, and apply business rules<\/li>\n<li>Discovery of source data anomalies and outliers that require further investigation for possible remediation prior to the migration of data \u2013 leading to a more robust error-handling and exception-handling process<\/li>\n<li>Identification of previously unknown source data that meets the business requirements and needs to be migrated &#8211; discovery can uncover data that was previously undefined or unobtainable for data migrations<\/li>\n<\/ul>\n<p>Profiling each source candidate will likely highlight information about the actual source data of which neither technical nor business resources were aware. However, both technical and business resources will be needed to understand and act on the profiling results. All profiling information will be necessary in order to detail the correct mapping and data transformation requirements for the source data (see Figure 1).<br \/>\nBoth technical and business knowledge of the source and target data are critical to the success of DW projects. 
The need for multiple resources from distinct functional areas to coordinate each step in the DW\/BI development life cycle is a challenging aspect of data integration development.<\/p>\n<figure id=\"attachment_1355\" aria-describedby=\"caption-attachment-1355\" style=\"width: 384px\" class=\"wp-caption aligncenter\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" data-attachment-id=\"1355\" data-permalink=\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/source-data-profiling\/\" data-orig-file=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/source-data-profiling.png?fit=384%2C321&amp;ssl=1\" data-orig-size=\"384,321\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"source data profiling\" data-image-description=\"\" data-image-caption=\"&lt;p&gt;Figure 1 : Profiling of all source data to provide metadata as input to data warehouse design&lt;\/p&gt;\n\" data-medium-file=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/source-data-profiling.png?fit=300%2C251&amp;ssl=1\" data-large-file=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/source-data-profiling.png?fit=384%2C321&amp;ssl=1\" class=\"size-full wp-image-1355\" src=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/source-data-profiling.png?resize=384%2C321&#038;ssl=1\" alt=\"source data profiling\" width=\"384\" height=\"321\" 
srcset=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/source-data-profiling.png?w=384&amp;ssl=1 384w, https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/source-data-profiling.png?resize=300%2C251&amp;ssl=1 300w\" sizes=\"auto, (max-width: 384px) 100vw, 384px\" \/><figcaption id=\"caption-attachment-1355\" class=\"wp-caption-text\">Figure 1: Profiling of all source data to provide metadata as input to data warehouse design<\/figcaption><\/figure>\n<h2>Implementing data profiling for successful source data discovery<\/h2>\n<p>Source data discovery is an inventory and analysis of data from various \u201cpotential\u201d sources to gain insight into obscure patterns and trends. It is the first step in fully harnessing and understanding an organization&#8217;s data to inform critical business decisions in response to a DW\/BI project\u2019s requirements.<br \/>\nAll too often, after just a few meetings, business\/data analysts begin developing ETL mappings; next, developers code ETLs; then comes unit testing. Soon, however, the project team realizes that although ETLs were written to requirements specifications, data loads don\u2019t \u201clook right\u201d.<br \/>\nAfter troubleshooting, they find the cause is source data that does not meet DW\/BI requirements. The team might decide that data transformations and \u201cdata joins\u201d will eliminate some of the discrepancies. In effect, they are performing two critical functions left out of the original development plan: 1) data discovery, including profiling, followed by 2) data enhancement \/ cleansing \/ enrichment.<br \/>\nA well-planned data discovery process will result in a more efficient DW ETL design. You can profile your source data to discover structure, relationships, and data rules. 
Data profiling provides statistical information about compliant data and outliers.<br \/>\nData discovery tools can overcome problems of scale because many of them can scan large environments and identify data in a fraction of the time required by a team of human analysts. Tools also offer a much greater chance of finding the best sources of critical business data.<br \/>\nGeneral steps to source data discovery include:<\/p>\n<ol>\n<li>Identify specific data domains needed to meet required business reporting<\/li>\n<li>Uncover potential internal and external sources of that data from enterprise applications that store, collect, or consume it<\/li>\n<li>Prioritize candidate source data for analysis and movement into a data warehouse by using (for example) data lake metadata<\/li>\n<li>Ensure that each source meets the privacy and regulatory requirements to be used for your purpose<\/li>\n<li>Ensure that each source will be adequately available and accessible according to required frequencies<\/li>\n<li>Use data preparation (e.g., profiling and cleansing) tools to perform portions of an overall refinement process, integrating with other types of curation and preparation as part of an iterative approach to data refinement<\/li>\n<\/ol>\n<p>Table 1 lists and describes useful profiling tasks. Single-column profiling refers to the analysis of values in a single column, ranging from simple counts and aggregation functions to distribution analysis and discovery of patterns and data types. Multi-column profiling extends these activities across columns, analyzing inter-value dependencies to support association rules, clustering, and outlier detection.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Effective Data Warehouse (DW) data source profiling is often an overlooked step in data warehouse data preparation. 
DW project teams need to understand all quality aspects of source data before preparation for downstream consumption. Beyond simple visual examination, you need to profile, visualize, detect outliers, and find null values and other junk data in your [&hellip;]<\/p>\n","protected":false},"author":12,"featured_media":1368,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"How to implement data profiling for successful source data discovery\r\n#DW #lightsondata #dataprofiling","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[4],"tags":[54,40,52,53,86],"class_list":["post-1353","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-data-quality","tag-bi","tag-data-profiling","tag-data-warehouse","tag-dw","tag-etl","post-wrapper","thrv_wrapper"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.4 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>How to implement data profiling for successful source data discovery | LightsOnData<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"How to implement data profiling for successful source data discovery | LightsOnData\" \/>\n<meta property=\"og:description\" content=\"Effective Data Warehoues (DW) data source profiling is often an overlooked step in data warehouse data preparation. 
DW project teams need to understand all quality aspects of source data before preparation for downstream consumption. Beyond simple visual examination, you need to profile, visualize, detect outliers, and find null values and other junk data in your [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/\" \/>\n<meta property=\"og:site_name\" content=\"LightsOnData\" \/>\n<meta property=\"article:published_time\" content=\"2019-11-06T22:03:49+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"450\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Wayne Yaddow\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@georgefirican\" \/>\n<meta name=\"twitter:site\" content=\"@georgefirican\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Wayne Yaddow\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/\",\"url\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/\",\"name\":\"How to implement data profiling for successful source data discovery | LightsOnData\",\"isPartOf\":{\"@id\":\"https:\/\/www.lightsondata.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1\",\"datePublished\":\"2019-11-06T22:03:49+00:00\",\"author\":{\"@id\":\"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe\"},\"breadcrumb\":{\"@id\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#primaryimage\",\"url\":\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1\",\"contentUrl\":\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.pn
g?fit=800%2C450&ssl=1\",\"width\":800,\"height\":450,\"caption\":\"how to implement data profiling for source data discovery\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.lightsondata.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"How to implement data profiling for successful source data discovery\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.lightsondata.com\/#website\",\"url\":\"https:\/\/www.lightsondata.com\/\",\"name\":\"LightsOnData\",\"description\":\"Practical resources, online courses, free articles and videos for data management, data governance, data quality, and business intelligence\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.lightsondata.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe\",\"name\":\"Wayne Yaddow\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.lightsondata.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g\",\"caption\":\"Wayne Yaddow\"},\"description\":\"Wayne Yaddow is an independent consultant with more than 20 years\u2019 experience leading data integration, data warehouse, and ETL testing projects with J.P. Morgan Chase, Credit Suisse, Standard and Poor\u2019s, AIG, Oppenheimer Funds, and IBM. 
He taught IIST (International Institute of Software Testing) courses on data warehouse and ETL testing and wrote DW\/BI articles for Better Software, The Data Warehouse Institute (TDWI), Tricentis, and others. Wayne continues to lead numerous ETL testing and coaching projects on a consulting basis. You can contact him at wyaddow@gmail.com.\",\"url\":\"https:\/\/www.lightsondata.com\/author\/wyaddowgmail-com\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"How to implement data profiling for successful source data discovery | LightsOnData","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/","og_locale":"en_US","og_type":"article","og_title":"How to implement data profiling for successful source data discovery | LightsOnData","og_description":"Effective Data Warehoues (DW) data source profiling is often an overlooked step in data warehouse data preparation. DW project teams need to understand all quality aspects of source data before preparation for downstream consumption. Beyond simple visual examination, you need to profile, visualize, detect outliers, and find null values and other junk data in your [&hellip;]","og_url":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/","og_site_name":"LightsOnData","article_published_time":"2019-11-06T22:03:49+00:00","og_image":[{"width":800,"height":450,"url":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1","type":"image\/png"}],"author":"Wayne Yaddow","twitter_card":"summary_large_image","twitter_creator":"@georgefirican","twitter_site":"@georgefirican","twitter_misc":{"Written by":"Wayne Yaddow","Est. 
reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/","url":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/","name":"How to implement data profiling for successful source data discovery | LightsOnData","isPartOf":{"@id":"https:\/\/www.lightsondata.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#primaryimage"},"image":{"@id":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#primaryimage"},"thumbnailUrl":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1","datePublished":"2019-11-06T22:03:49+00:00","author":{"@id":"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe"},"breadcrumb":{"@id":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#primaryimage","url":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1","contentUrl":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1","width":800,"height":450,"caption":"how to implement data profiling for source data 
discovery"},{"@type":"BreadcrumbList","@id":"https:\/\/www.lightsondata.com\/how-to-implement-data-profiling-for-successful-source-data-discovery\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.lightsondata.com\/"},{"@type":"ListItem","position":2,"name":"How to implement data profiling for successful source data discovery"}]},{"@type":"WebSite","@id":"https:\/\/www.lightsondata.com\/#website","url":"https:\/\/www.lightsondata.com\/","name":"LightsOnData","description":"Practical resources, online courses, free articles and videos for data management, data governance, data quality, and business intelligence","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.lightsondata.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe","name":"Wayne Yaddow","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.lightsondata.com\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g","caption":"Wayne Yaddow"},"description":"Wayne Yaddow is an independent consultant with more than 20 years\u2019 experience leading data integration, data warehouse, and ETL testing projects with J.P. Morgan Chase, Credit Suisse, Standard and Poor\u2019s, AIG, Oppenheimer Funds, and IBM. He taught IIST (International Institute of Software Testing) courses on data warehouse and ETL testing and wrote DW\/BI articles for Better Software, The Data Warehouse Institute (TDWI), Tricentis, and others. 
Wayne continues to lead numerous ETL testing and coaching projects on a consulting basis. You can contact him at wyaddow@gmail.com.","url":"https:\/\/www.lightsondata.com\/author\/wyaddowgmail-com\/"}]}},"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/11\/how-to-implement-data-profiling.png?fit=800%2C450&ssl=1","jetpack_shortlink":"https:\/\/wp.me\/p9BPV6-lP","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts\/1353","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/users\/12"}],"replies":[{"embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/comments?post=1353"}],"version-history":[{"count":9,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts\/1353\/revisions"}],"predecessor-version":[{"id":1363,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts\/1353\/revisions\/1363"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/media\/1368"}],"wp:attachment":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/media?parent=1353"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/categories?post=1353"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/tags?post=1353"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}