
THE DATA DOWNLOAD

As a business intelligence consulting company, we pride ourselves on delivering on our projects as well as providing high-quality content to our readers.

Webinar Q&A: ROI on Data Quality

Aug 31, 2018 11:39:11 AM   Sally McCormack

Topics: Data Quality

1. What areas do companies typically focus on when they want to create a pilot program to demonstrate ROI? Are there specific areas that may have more business relevance?

Read More

Addressing Banking Challenges in 2018

Jan 23, 2018 1:58:31 PM   Solomon Williams

Topics: Data Governance, Financial Services, Data Quality, Master Data Management

The banking and financial industries face significant challenges and added regulation in the year ahead. While the number and degree of these challenges will likely shift throughout 2018, the list below covers the most pressing ones and how your organization can leverage data management solutions to ensure preparedness.

Read More

Top Ten Data Quality Problems: Part II

Jul 3, 2017 10:23:52 AM   Sally McCormack

Topics: Data Quality

Recently, we outlined the top five data quality problems in enterprise data management and offered best practices to solve those challenges. This post explores the topic further to highlight five additional roadblocks associated with managing the critical data of an organization. Organizations that take time to decipher the root cause behind data challenges will run more successful enterprise data programs. They’ll also have a foundation in place for sustainable growth with an enterprise-wide view of customers, manufacturing, supply chains, sales, and operations. 

Read More

Top Ten Data Quality Problems

Jun 27, 2017 10:16:54 AM   Sally McCormack

Topics: Data Quality

Top Ten Data Quality Problems: Part I

By: Sally McCormack, Data Quality Competency Director, Datasource Consulting, LLC

Problems with data quality are costly to an enterprise. When facing the potential for missed opportunities, uninformed decision-making, non-compliance sanctions, and low customer satisfaction, today’s business leaders are making data quality a priority in their organizations’ data management programs. An Experian report found that 88 percent of companies see a direct effect of inaccurate data on their bottom line, losing an average of 12 percent of their revenue. In a similar study by Database Marketing, organizations estimate that they could increase sales by nearly a third (29%) with corrected customer data. (Source: Internal Results)

 

Read More

Business Intelligence Industry Review and Trends for 2014

Dec 20, 2016 10:17:24 AM   Steve Dine & David Crolene

Topics: Blog, Program Management, Data Integration, Data Quality, Master Data Management, Data Security, Big Data, Agile, Cloud

Business Intelligence Industry: 10 Considerations for 2014


By: Steve Dine and David Crolene

Each year, we reflect on the business intelligence (BI) and enterprise information management (EIM) industries and review the noteworthy trends we encounter in the field. Our review draws from five sources: our customers, industry conferences, articles, social media, and software vendors. This year has proved to be an interesting one on many fronts. Here is our business intelligence industry review and observations for 2013, along with predicted trends for the remainder of 2014.

Read More

How Business-IT Partnerships Develop Operational Analytics

Nov 21, 2016 9:39:56 AM   Nancy Couture

Topics: Blog, Program Management, Data Integration, Data Quality

A great example of business and IT working together to innovate is the development of operational analytics. As data volumes and the frequency of data delivery continue to increase, organizations have realized that it is not enough to analyze their data; they must also act on it.

Read More

Is Your Data Golden?

Nov 16, 2016 10:23:00 AM   Datasource

Topics: Data Quality, Blog, Data Profiling

We all know that ‘bad data’ is bad, but to what extent? Have you ever been a part of an email campaign where you get more email bounce-backs than successfully sent messages? While the annoyance factor is high, research tells us it can’t compare to the true cost of having customer data that’s out of date, duplicated, or inaccurate. As a frame of reference, think about the grocery store loyalty cards in your wallet. Do you always have them with you when you shop, or do you have one under your cell phone number and one under your defunct home phone?
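The loyalty-card scenario is, at its core, a duplicate-record problem. As a minimal Python sketch (the sample customers and the matching key are hypothetical; real customer matching typically adds fuzzy comparison and survivorship rules), grouping records on a normalized name and address surfaces the duplicate entries:

    # Minimal sketch of exact-key duplicate detection on customer records.
    # Sample data and the matching key are hypothetical illustrations;
    # production matching usually adds fuzzy comparison and survivorship rules.
    from collections import defaultdict

    customers = [
        {"name": "Pat Smith",  "address": "12 Elm St",  "phone": "303-555-0101"},  # cell number
        {"name": "PAT SMITH",  "address": "12 Elm St.", "phone": "303-555-0150"},  # defunct home phone
        {"name": "Lee Jordan", "address": "9 Oak Ave",  "phone": "720-555-0199"},
    ]

    def normalize(value):
        """Lower-case and drop punctuation/whitespace to build a crude match key."""
        return "".join(ch for ch in value.lower() if ch.isalnum())

    groups = defaultdict(list)
    for row in customers:
        key = (normalize(row["name"]), normalize(row["address"]))
        groups[key].append(row)

    duplicates = {key: rows for key, rows in groups.items() if len(rows) > 1}
    print(duplicates)  # the two Pat Smith records collapse to a single match key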

Read More

The Master Data Management Difference: Master then Monetize your Data

Jun 13, 2016 6:00:00 AM   Ryan Baca

Topics: Data Quality, Blog, Master Data Management

Are businesses undervaluing the impact of Master Data Management (MDM) initiatives? If they are not looking at the many, often unexpected, ways that data management can impact the top and bottom line, they might very well be doing so.

We know firsthand that data management initiatives are seldom viewed as an opportunity to create significant value within an organization. You may be wondering how to entice your stakeholders with an MDM project when the ROI seems intangible. Although it may seem elusive, the business case below outlines how one company realized a return of over $75M on a $25M investment.

Read More

Informatica Data Quality Standards and Tips

May 10, 2016 1:21:40 PM   Sally McCormack

Topics: Informatica Data Quality, Blog, Data Quality

CONSISTENT NAMING AND CODING STANDARDS

When designing rules in Informatica Data Quality, developers and data stewards see the same objects, so it is important to develop consistent naming and coding standards. For example, both a data steward and a developer will understand what a "rule_" prefix means, while not everyone will understand "mplt_". Mapplets should therefore be named with the rule_ prefix if they are used in both the Analyst and Developer tools.
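Informatica objects are built in the tool rather than written by hand, but the same convention can be verified in a review script. Here is a minimal Python sketch, assuming a simple list of (name, shared_with_analyst) pairs; the object names and the helper below are hypothetical illustrations, not part of the Informatica product:

    # Hypothetical naming-convention check: mapplets shared with the Analyst
    # tool should carry the rule_ prefix; developer-only mapplets use mplt_.
    SHARED_PREFIX = "rule_"
    DEVELOPER_PREFIX = "mplt_"

    def check_mapplet_names(mapplets):
        """Return names of Analyst-shared mapplets missing the rule_ prefix."""
        return [name for name, shared_with_analyst in mapplets
                if shared_with_analyst and not name.startswith(SHARED_PREFIX)]

    # Hypothetical example objects: (name, shared_with_analyst)
    example = [
        ("rule_validate_email", True),
        ("mplt_standardize_address", False),
        ("mplt_check_phone", True),   # shared with Analyst but mis-named
    ]

    print(check_mapplet_names(example))  # ['mplt_check_phone']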

Read More

The Importance of Data Quality in CCAR Compliance

Apr 25, 2016 1:54:13 PM   Sally McCormack

Topics: Data Quality, Blog

When Comprehensive Capital Analysis and Review (CCAR) was mandated by the Federal Reserve in response to the financial meltdown of 2008, it provided a framework for assessing banking organizations with consolidated assets over $10 billion. Designed to help prevent future turmoil in the financial services industry, this stress testing ensures large institutions are able to withstand changing economic conditions, providing uniform and consistent service.

Data quality is a critical component of CCAR compliance. The Federal Reserve Board (FRB) provides detailed rules, called schedule instructions, which define the specific checks that must be performed against a financial institution's data. These checks, known as edit checks, cover a wide variety of issues related to overall data quality.
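The schedule instructions themselves define the real edit checks; purely as an illustration of the pattern (the field names, rule IDs, and thresholds below are hypothetical, not rules from any actual FRB schedule), an edit check usually reduces to a small set of validity, range, and consistency rules evaluated against each submitted record:

    # Illustrative sketch of edit-check style validation in Python.
    # Field names, rule IDs, and thresholds are hypothetical examples,
    # not rules from any actual FRB schedule instruction.
    def run_edit_checks(record):
        """Evaluate simple data-quality rules against one submitted record."""
        failures = []

        # Validity: a required identifier must be present.
        if not record.get("obligor_id"):
            failures.append("EC001: obligor_id is missing")

        # Range: outstanding balance must be non-negative.
        if record.get("outstanding_balance", 0) < 0:
            failures.append("EC002: outstanding_balance is negative")

        # Consistency: committed exposure should cover the drawn balance.
        if record.get("committed_exposure", 0) < record.get("outstanding_balance", 0):
            failures.append("EC003: committed_exposure < outstanding_balance")

        return failures

    sample = {"obligor_id": "A-1001",
              "outstanding_balance": 250_000,
              "committed_exposure": 200_000}
    print(run_edit_checks(sample))  # ['EC003: committed_exposure < outstanding_balance']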

Read More