{"id":989,"date":"2019-05-15T06:11:42","date_gmt":"2019-05-15T13:11:42","guid":{"rendered":"http:\/\/www.lightsondata.com\/?p=989"},"modified":"2019-07-03T19:16:30","modified_gmt":"2019-07-04T02:16:30","slug":"how-to-identify-and-reduce-dw-bi-data-quality-risks","status":"publish","type":"post","link":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/","title":{"rendered":"How to identify and reduce DW\/ BI data quality risks"},"content":{"rendered":"<h4>An introduction to DW\/ BI data quality risk assessments<\/h4>\n<p>Data warehouse and business intelligence (DW\/ BI) projects are showered with risks &#8211; from data quality in the warehouse to analytic values in BI reports. If not addressed properly, data quality risks can bring entire projects to a halt, leaving planners scrambling for cover, sponsors looking for remedies, and budgets being wiped out.<\/p>\n<p>Data consumers, such as end users, often don\u2019t know exactly what they want delivered until they start seeing early versions of a BI application (e.g., reports). This circumstance often requires DW\/ BI teams to build the data warehouse and application reports before they are fully defined and specified. Couple this challenge with the data quality problems inherent when sourcing operational systems, and the potential for risk is very real. (Editor note: Learn which attributes you should track to\u00a0<a href=\"https:\/\/www.lightsondata.com\/developing-a-comprehensive-report-inventory\/\" target=\"_blank\" rel=\"noopener noreferrer\">develop a comprehensive report inventory<\/a>).<\/p>\n<p>This article outlines methods for recognizing and minimizing the\u00a0<strong>data quality risks<\/strong>\u00a0often associated with DW\/ BI projects. Addressing additional DW\/ BI project risks (including performance, schedules, defects discovered late in the software development life cycle, etc.)
is also important, but is beyond the scope of this piece.<\/p>\n<p>Data Quality is the desired state where an organization\u2019s data assets reflect the following attributes:<\/p>\n<ul>\n<li>Clearly defined<\/li>\n<li>Correct values in sources &#8211; during extraction, while loading to targets, and in analytic reports<\/li>\n<li>Understandable presentation format in analytic applications<\/li>\n<li>Useful in supporting targeted business processes<\/li>\n<\/ul>\n<p>Figure 1\u00a0illustrates the primary control points (i.e., testing points) in an end-to-end data quality auditing and reporting process.<\/p>\n<figure id=\"attachment_991\" aria-describedby=\"caption-attachment-991\" style=\"width: 940px\" class=\"wp-caption aligncenter\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" src=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/how-to-identify-and-reduce-dq-1024x508.png?resize=940%2C466\" alt=\"checkpoints for DW\/BI data quality\" width=\"940\" height=\"466\" \/><figcaption id=\"caption-attachment-991\" class=\"wp-caption-text\">Figure 1: A few of the many checkpoints that are necessary to adequately audit data quality for DW\/ BI projects.<\/figcaption><\/figure>\n<h4>Data quality risks should be addressed early and often<\/h4>\n<p>The extraction, transformation, and loading (ETL) process is still the most underestimated, under-budgeted part of most DW\/ BI iterations.
The biggest reason the ETL portion of a project often raises more questions than it resolves is a lack of understanding of the source data quality, and of the data quality that results from the ETL processes themselves.<\/p>\n<p>Before diving into the most common data quality risks for data warehouse and business intelligence projects and their risk management, let\u2019s ensure we are on the same page with the following two definitions, as we will address them both:<\/p>\n<blockquote>\n<p><strong><em>Risk assessment<\/em><\/strong><em> is the conversion of risk assessment data into risk decision-making information. <strong>Risks<\/strong> are composed of two factors: (1) <strong>risk<\/strong> probability and (2) <strong>risk<\/strong> impact.<\/em><\/p>\n<p><strong><em>Data quality risk management<\/em><\/strong><em> is a structured approach for the identification, assessment, and prioritization of data quality risks followed by planning of resources to minimize, monitor, and control the probability and impact of undesirable events.<\/em><\/p>\n<\/blockquote>\n<p>The most common DW\/ BI project data quality risks are listed below, each accompanied by the probability \/ <strong>odds<\/strong>\u00a0that they typically exist, their likely\u00a0<strong>impacts<\/strong>, and recommendations for mitigating those risks.
The listing is not intended to be exhaustive.<\/p>\n<table data-rows=\"14\" data-cols=\"5\">\n<thead>\n<tr>\n<th>\n<p style=\"text-align: center;\">#<\/p>\n<\/th>\n<th>\n<p style=\"text-align: center;\"><strong><span>Data Quality Risks<\/span><\/strong><\/p>\n<\/th>\n<th>\n<p style=\"text-align: center;\"><strong><span>Odds<\/span><\/strong><\/p>\n<\/th>\n<th>\n<p style=\"text-align: center;\"><strong><span>Impact<\/span><\/strong><\/p>\n<\/th>\n<th colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: center;\"><strong><span>Potential Risk Mitigation Tasks<\/span><\/strong><\/p>\n<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>\n<p style=\"text-align: center;\">1<\/p>\n<\/td>\n<td>\n<p style=\"text-align: left;\"><strong>HUMAN RESOURCE SKILLS:<\/strong> Insufficiently qualified resources with required knowledge of data warehouse and business intelligence testing; lack of skills with data testing toolsets, methods, and best practices.<\/p>\n<\/td>\n<td>\n<p>Medium<\/p>\n<\/td>\n<td>\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Engage DW\/ BI training resources<\/p>\n<p style=\"text-align: left;\">\u2022 Recruit staff with DW experience<\/p>\n<p style=\"text-align: left;\">\u2022 Contract DW\/ BI professional consultants<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td>\n<p style=\"text-align: center;\">2<\/p>\n<\/td>\n<td>\n<p style=\"text-align: left;\"><strong>MASTER TEST PLAN\/STRATEGY:<\/strong> A master test plan\/strategy does not exist or is inadequate in scope<\/p>\n<\/td>\n<td>\n<p>Medium<\/p>\n<\/td>\n<td>\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Create a test plan to document the overall structure and objectives of all project testing\u2014from unit to component, system, and performance testing.
The plan should cover activities over the DW\/ BI lifecycle and identify evaluation criteria for the testers.<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td>\n<p style=\"text-align: center;\">3<\/p>\n<\/td>\n<td>\n<p style=\"text-align: left;\"><strong>SOURCE DATA QUALITY IN DOUBT:<\/strong> The data integration effort may not meet the planned schedule because the quality of source data is unknown<\/p>\n<\/td>\n<td>\n<p>High<\/p>\n<\/td>\n<td>\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Perform formal <a href=\"https:\/\/www.lightsondata.com\/quick-data-profiling-solution\/\" target=\"_blank\" rel=\"noopener noreferrer\">data profiling<\/a> of all source data early (i.e., during requirements gathering) to understand whether data quality meets project needs<\/p>\n<p style=\"text-align: left;\">\u2022 Inaccuracies, omissions, and inconsistencies in the source data should be identified and resolved before or during the extract \/ transform process<\/p>\n<p style=\"text-align: left;\">\u2022 Often, specific data elements exist on multiple source systems. Identify the various sources and discuss with the users which are the most applicable<\/p>\n<p style=\"text-align: left;\">\u2022 Use commercial data quality tools, accompanied by consultation and training<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td>\n<p style=\"text-align: center;\">4<\/p>\n<\/td>\n<td>\n<p style=\"text-align: left;\"><strong>SOURCE DATA HISTORY INCOMPLETE:<\/strong> Differing levels of historical data among the source systems<\/p>\n<\/td>\n<td>\n<p>Medium<\/p>\n<\/td>\n<td>\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 What if your business requirement calls for four years of historical data, but the best, most recent data contains only one year for some sources and three years for other sources? The missing years would need to be extracted from other data sources, possibly of questionable quality.
Follow the risk mitigation tasks listed above to address that data quality.<\/p>\n<p style=\"text-align: left;\">\u2022 Flag this risk early to project sponsor(s) and see if a change in business requirements is desirable<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">5<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>SOURCE &amp; TARGET DATA MAPS SUSPECT:<\/strong> Source data may be inaccurately mapped due to the absence of data dictionaries, metadata, or data models<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>Medium<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Data dictionaries should be developed and maintained to support all data associated with the project. High-quality data mapping documents can then be produced from them.<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">6<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>ETL TARGET DATA IN ERROR:<\/strong> Only a subset of the loaded data could be tested<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>Medium<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Ensure that the target data sampling process is high quality<\/p>\n<p style=\"text-align: left;\">\u2022 Use test tools that allow for extensive data coverage<\/p>\n<p style=\"text-align: left;\">\u2022 Choose a data sampling approach that\u2019s extensive enough to avoid missing defects in both source and target data<\/p>\n<p style=\"text-align: left;\">\u2022 Choose an appropriate technology to match source and target data to determine whether source and target are equal or the target data has been transformed<\/p>\n<p style=\"text-align: left;\">\u2022 Verify that no data or information is lost during ETL processes.
The data warehouse must receive all relevant data from the source application into the target according to business rules<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">7<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>SOURCE &amp; TARGET END-TO-END TESTING UNCOORDINATED:<\/strong> Poor or non-existent testing of source-to-warehouse data flows<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 End-to-end \u201cauditability\u201d must include validation that the information in a source system (such as a spreadsheet) is accurate, so that there is a high level of confidence that it can be trusted when it\u2019s loaded to the warehouse. Organizations that perform quality checks at only a subset of points in the warehouse will probably fail to adequately protect themselves from the data quality problems that emerge when information is exchanged between those points.<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">8<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>DATA DICTIONARIES AND DATA MODELS ARE LACKING:<\/strong> Data and information within the warehouse and\/ or marts cannot be easily interpreted by developers and the quality assurance (QA) team<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>Medium<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Ensure accurate and current documentation of data models and mapping documents<\/p>\n<p style=\"text-align: left;\">\u2022 Use automated documentation tools<\/p>\n<p style=\"text-align: left;\">\u2022 Create meaningful documentation of data definitions and data descriptions in a data dictionary<\/p>\n<p style=\"text-align: left;\">\u2022 Create procedures for maintaining documentation in line with changes to the
source systems<\/p>\n<p style=\"text-align: left;\">\u2022 Have data stewards\/ owners provide training to the QA team<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">9<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>EXCESSIVE DATA DEFECTS:<\/strong> Data defects are found late in each iteration<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Ensure that data requirements are complete and that data dictionaries are available and current<\/p>\n<p style=\"text-align: left;\">\u2022 Profile all data sources and targets after each ETL run<\/p>\n<p style=\"text-align: left;\">\u2022 Ensure that data mapping and all other specification documents are kept current<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">10<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>COMPLEX DATA TRANSFORMATIONS:<\/strong> Complex data transformations and BI reports<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Early validation of table join complexity, queries, and resulting business reports<\/p>\n<p style=\"text-align: left;\">\u2022 Validation and clarification of business requirements, as well as early and careful translation of the data requirements<\/p>\n<p style=\"text-align: left;\">\u2022 Validation of the number and accessibility of source data fields<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">11<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>DATA VOLUME SCALABILITY IN DOUBT:<\/strong> Growing data volumes due to changing requirements<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>Medium<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p
style=\"text-align: left;\">\u2022 Employ toolsets for data volume estimation<\/p>\n<p style=\"text-align: left;\">\u2022 Have the technical design for data volumes done by experienced database administrators\/ data architects<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">12<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>DATA REQUIREMENTS INCOMPLETE:<\/strong> Quality issues due to unclear or non-existent data requirements documentation<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>Medium<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Ensure that requirements are always updated after change requests are approved<\/p>\n<p style=\"text-align: left;\">\u2022 Make the data requirements documentation part of the project\u2019s deliverables<\/p>\n<p style=\"text-align: left;\">\u2022 Perform validation and clarification of requirements, as well as early and careful translation of the data requirements in relation to business and technical requirements<\/p>\n<\/td>\n<\/tr>\n<tr>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: center;\">13<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p style=\"text-align: left;\"><strong>REGRESSION TESTS NOT AUTOMATED:<\/strong> Minimal automation of regression tests<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>Medium<\/p>\n<\/td>\n<td rowspan=\"1\" colspan=\"1\">\n<p>High<\/p>\n<\/td>\n<td colspan=\"1\" rowspan=\"1\">\n<p style=\"text-align: left;\">\u2022 Without automated regression tests, manual testing may result in fewer tests being run after each build is deployed.
Therefore, make sure your test plan accounts for this, and ask for more time and\/ or human resources to complete the necessary testing<\/p>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h4>More potential data quality risks to consider:<\/h4>\n<ul>\n<li><strong>Legacy data architecture and data definition artifacts<\/strong>\u00a0may be unavailable or incomplete to aid in project planning. Source data may be inaccurately mapped due to a lack of (or an outdated) legacy system data dictionary<\/li>\n<li><strong>The project team may encounter incompatible software<\/strong>, hardware, and\/or processes due to multiple operating systems or vendors, or format incompatibilities. For example: <a href=\"https:\/\/en.wikipedia.org\/wiki\/Database#Database_management_system\" target=\"_blank\" rel=\"noopener noreferrer\">Database Management System<\/a> (DBMS) to DBMS, DBMS to Operating System, etc.<\/li>\n<li><strong>The integrity and quality of the converted data<\/strong>\u00a0may be compromised due to lack of enterprise-wide <a href=\"https:\/\/www.lightsondata.com\/what-is-data-governance\/\" target=\"_blank\" rel=\"noopener noreferrer\">data governance<\/a><\/li>\n<li><strong>Independent data validation<\/strong>\u00a0&#8211; the quality of the target system data may not meet departmental standards because independent data validation (e.g., by a Quality Assurance department or an off-shore team) was not considered part of the scope of work<\/li>\n<li><strong>Source data may be inaccurately transformed<\/strong>\u00a0and migrated due to lack of involvement of key business subject matter experts in the requirements and business rule process<\/li>\n<\/ul>\n<p>The following DW\/ BI project tasks should also be given robust assessment and attention, with verification and validation responsibilities assigned to each, to enhance data quality management throughout your data warehouse project:<\/p>\n<ol>\n<li>Business requirements collection and analysis<\/li>\n<li>Logical data modeling<\/li>\n<li>Data
mapping<\/li>\n<li>Physical modeling and implementation<\/li>\n<li>Extraction, transformation, and loading (ETL) design<\/li>\n<li>Report\/ data visualization and cube design<\/li>\n<li>Project planning<\/li>\n<li>Data quality management<\/li>\n<li>Testing and validation<\/li>\n<li>Data warehouse archiving<\/li>\n<li>Backup and recovery of the data warehouse<\/li>\n<li>Change management<\/li>\n<li>Return on Investment (ROI) determination<\/li>\n<\/ol>\n<h4>Concluding Remarks<\/h4>\n<p>Data warehouse and business intelligence projects can be highly complex and inherently risky. It is the responsibility of the project manager to lead the DW\/ BI team in identifying all data quality risks associated with a particular data warehouse implementation. The goal of this process is to document essential information relating to project risk.<\/p>\n<p>If the project team and its designers fail to assess the quality of source data, then they are exposing the entire project to excessive risk. Consider this: if the DW\/ BI team does not take the time to assess all source data quality, then it is entirely possible that you will purchase and install all the DW\/ BI technology, do all the analysis, write all the source-to-target code to populate target tables, and still fail.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>An introduction to DW\/ BI data quality risk assessments Data warehouse and business intelligence (DW\/ BI) projects are showered with risks &#8211; from data quality in the warehouse to analytic values in BI reports.
If not addressed properly, data quality risks can bring entire projects to a halt, leaving planners scrambling for cover, sponsors looking [&hellip;]<\/p>\n","protected":false},"author":12,"featured_media":1012,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[3,4],"tags":[54,52,51,53],"class_list":["post-989","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-business-intelligence","category-data-quality","tag-bi","tag-data-warehouse","tag-dqm","tag-dw","post-wrapper","thrv_wrapper"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.4 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>How to identify and reduce DW\/ BI data quality risks | LightsOnData<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"How to identify and reduce DW\/ BI data quality risks | LightsOnData\" \/>\n<meta property=\"og:description\" content=\"An introduction to DW\/ BI data quality risk assessments Data warehouse and business intelligence (DW\/ BI) projects are showered with risks &#8211; from data quality in the warehouse to analytic values in BI reports. 
If not addressed properly, data quality risks can bring entire projects to a halt, leaving planners scrambling for cover, sponsors looking [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/\" \/>\n<meta property=\"og:site_name\" content=\"LightsOnData\" \/>\n<meta property=\"article:published_time\" content=\"2019-05-15T13:11:42+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2019-07-04T02:16:30+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"450\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Wayne Yaddow\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@georgefirican\" \/>\n<meta name=\"twitter:site\" content=\"@georgefirican\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Wayne Yaddow\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/\",\"url\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/\",\"name\":\"How to identify and reduce DW\/ BI data quality risks | LightsOnData\",\"isPartOf\":{\"@id\":\"https:\/\/www.lightsondata.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1\",\"datePublished\":\"2019-05-15T13:11:42+00:00\",\"dateModified\":\"2019-07-04T02:16:30+00:00\",\"author\":{\"@id\":\"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe\"},\"breadcrumb\":{\"@id\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#primaryimage\",\"url\":\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1\",\"contentUrl\":\"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1\",\"width\":800,\"height\":450,\"caption\":\"how to identify reduce DW BI data quality 
risks\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.lightsondata.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"How to identify and reduce DW\/ BI data quality risks\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.lightsondata.com\/#website\",\"url\":\"https:\/\/www.lightsondata.com\/\",\"name\":\"LightsOnData\",\"description\":\"Practical resources, online courses, free articles and videos for data management, data governance, data quality, and business intelligence\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.lightsondata.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe\",\"name\":\"Wayne Yaddow\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.lightsondata.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g\",\"caption\":\"Wayne Yaddow\"},\"description\":\"Wayne Yaddow is an independent consultant with more than 20 years\u2019 experience leading data integration, data warehouse, and ETL testing projects with J.P. Morgan Chase, Credit Suisse, Standard and Poor\u2019s, AIG, Oppenheimer Funds, and IBM. 
He taught IIST (International Institute of Software Testing) courses on data warehouse and ETL testing and wrote DW\/BI articles for Better Software, The Data Warehouse Institute (TDWI), Tricentis, and others. Wayne continues to lead numerous ETL testing and coaching projects on a consulting basis. You can contact him at wyaddow@gmail.com.\",\"url\":\"https:\/\/www.lightsondata.com\/author\/wyaddowgmail-com\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"How to identify and reduce DW\/ BI data quality risks | LightsOnData","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/","og_locale":"en_US","og_type":"article","og_title":"How to identify and reduce DW\/ BI data quality risks | LightsOnData","og_description":"An introduction to DW\/ BI data quality risk assessments Data warehouse and business intelligence (DW\/ BI) projects are showered with risks &#8211; from data quality in the warehouse to analytic values in BI reports. If not addressed properly, data quality risks can bring entire projects to a halt, leaving planners scrambling for cover, sponsors looking [&hellip;]","og_url":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/","og_site_name":"LightsOnData","article_published_time":"2019-05-15T13:11:42+00:00","article_modified_time":"2019-07-04T02:16:30+00:00","og_image":[{"width":800,"height":450,"url":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1","type":"image\/png"}],"author":"Wayne Yaddow","twitter_card":"summary_large_image","twitter_creator":"@georgefirican","twitter_site":"@georgefirican","twitter_misc":{"Written by":"Wayne Yaddow","Est. 
reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/","url":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/","name":"How to identify and reduce DW\/ BI data quality risks | LightsOnData","isPartOf":{"@id":"https:\/\/www.lightsondata.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#primaryimage"},"image":{"@id":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#primaryimage"},"thumbnailUrl":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1","datePublished":"2019-05-15T13:11:42+00:00","dateModified":"2019-07-04T02:16:30+00:00","author":{"@id":"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe"},"breadcrumb":{"@id":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#primaryimage","url":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1","contentUrl":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1","width":800,"height":450,"caption":"how to identify reduce DW BI data quality 
risks"},{"@type":"BreadcrumbList","@id":"https:\/\/www.lightsondata.com\/how-to-identify-and-reduce-dw-bi-data-quality-risks\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.lightsondata.com\/"},{"@type":"ListItem","position":2,"name":"How to identify and reduce DW\/ BI data quality risks"}]},{"@type":"WebSite","@id":"https:\/\/www.lightsondata.com\/#website","url":"https:\/\/www.lightsondata.com\/","name":"LightsOnData","description":"Practical resources, online courses, free articles and videos for data management, data governance, data quality, and business intelligence","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.lightsondata.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.lightsondata.com\/#\/schema\/person\/4503a79021fcf6acf4850c36356b6ffe","name":"Wayne Yaddow","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.lightsondata.com\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/0c5eaab6104044bec265d829746d99a451bd1acb77e74af42b55c757e191fe76?s=96&d=retro&r=g","caption":"Wayne Yaddow"},"description":"Wayne Yaddow is an independent consultant with more than 20 years\u2019 experience leading data integration, data warehouse, and ETL testing projects with J.P. Morgan Chase, Credit Suisse, Standard and Poor\u2019s, AIG, Oppenheimer Funds, and IBM. He taught IIST (International Institute of Software Testing) courses on data warehouse and ETL testing and wrote DW\/BI articles for Better Software, The Data Warehouse Institute (TDWI), Tricentis, and others. 
Wayne continues to lead numerous ETL testing and coaching projects on a consulting basis. You can contact him at wyaddow@gmail.com.","url":"https:\/\/www.lightsondata.com\/author\/wyaddowgmail-com\/"}]}},"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/www.lightsondata.com\/wp-content\/uploads\/2019\/05\/howtoidentifyDWBI.png?fit=800%2C450&ssl=1","jetpack_shortlink":"https:\/\/wp.me\/p9BPV6-fX","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts\/989","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/users\/12"}],"replies":[{"embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/comments?post=989"}],"version-history":[{"count":14,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts\/989\/revisions"}],"predecessor-version":[{"id":1145,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/posts\/989\/revisions\/1145"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/media\/1012"}],"wp:attachment":[{"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/media?parent=989"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/categories?post=989"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.lightsondata.com\/wp-json\/wp\/v2\/tags?post=989"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}