
THE DATA DOWNLOAD

As a business intelligence consulting company, we pride ourselves on delivering
on our projects as well as providing high-quality content to our readers.

The Importance of Data Quality in CCAR Compliance

Apr 25, 2016 1:54:13 PM   Sally McCormack

Topics: Data Quality, Blog

When Comprehensive Capital Analysis and Review (CCAR) was mandated by the Federal Reserve in response to the financial meltdown of 2008, it provided a framework for assessing banking organizations with consolidated assets over $10 billion. Designed to help prevent future turmoil in the financial services industry, this stress testing ensures large institutions are able to withstand changing economic conditions and continue to provide uniform, consistent service.

Data quality is a critical component of CCAR compliance. The Federal Reserve Board (FRB) provides detailed rules, called schedule instructions, which define the specific checks that must be performed against a financial institution's data. These tests, called edit checks, focus on a wide variety of issues related to overall data quality.
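
To make the idea concrete, here is a minimal, hypothetical sketch of what one such edit check might look like in code. The field names and rules below are invented for illustration and are not taken from the actual FRB schedule instructions.

    import java.util.ArrayList;
    import java.util.List;

    public class EditCheckSketch {

        // One reported line item from a hypothetical schedule.
        record LineItem(String field, Double value) {}

        // Validity check: every reported value must be present and non-negative.
        static List<String> validate(List<LineItem> items) {
            List<String> failures = new ArrayList<>();
            for (LineItem item : items) {
                if (item.value() == null) {
                    failures.add(item.field() + ": value is missing");
                } else if (item.value() < 0) {
                    failures.add(item.field() + ": value is negative");
                }
            }
            return failures;
        }

        public static void main(String[] args) {
            List<LineItem> report = List.of(
                new LineItem("total_loans", 1_250_000.0),
                new LineItem("total_deposits", null),
                new LineItem("tier1_capital", -5.0));
            validate(report).forEach(System.out::println);
        }
    }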

Read More

How to Implement a Robust Data Quality Solution

Apr 25, 2016 1:53:51 PM   Nancy Couture

Topics: Data Quality, Blog

Data quality is becoming a key concern for companies that rely on data on a daily basis. Without a purposeful data quality program, information becomes inconsistent and unreliable.

This first series of articles describes foundational steps that enable agile data warehouse development, and my articles published thus far cover several of those steps.

The next focus in setting yourself up for a best-in-class agile data warehouse environment is to develop a robust data quality solution.

Read More

Data Profiling with Informatica Data Quality

Apr 25, 2016 1:53:29 PM   Sally McCormack

Topics: Data Profiling, Informatica Data Quality, Blog

One of the first steps in solving a data quality problem is to perform data profiling. As seen in Jason Hover's article, Data Profiling: What, Why and How?, data profiling allows you to analyze your data to determine what it looks like and what problems it contains. Profiling can be performed manually; however, software such as Informatica Data Quality allows data stewards and developers to collaboratively profile the data in a common repository more quickly, often yielding a more thorough analysis.
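
As a rough illustration of the kind of metrics profiling produces, here is a hand-rolled sketch that computes null counts, distinct values, and length ranges for a single column. The sample data is invented, and a tool like Informatica Data Quality automates this analysis across entire sources.

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    public class ProfileSketch {
        public static void main(String[] args) {
            // Invented sample column: note the mixed codes and missing values
            // that profiling is meant to surface.
            List<String> stateColumn = Arrays.asList("CO", "CO", "Colorado", null, "", "TX");

            int nulls = 0, blanks = 0, minLen = Integer.MAX_VALUE, maxLen = 0;
            Set<String> distinct = new HashSet<>();
            for (String value : stateColumn) {
                if (value == null) { nulls++; continue; }
                if (value.isEmpty()) blanks++;
                distinct.add(value);
                minLen = Math.min(minLen, value.length());
                maxLen = Math.max(maxLen, value.length());
            }

            System.out.printf("rows=%d nulls=%d blanks=%d distinct=%d length=[%d..%d]%n",
                stateColumn.size(), nulls, blanks, distinct.size(), minLen, maxLen);
        }
    }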

Read More

Informatica PowerCenter Directory Listing Using a Java Transformation

Apr 25, 2016 1:53:10 PM   Jerry Perez

Topics: Informatica PowerCenter, Blog

Sometimes as developers we find ourselves locked out of the Informatica PowerCenter environment, unable to verify whether a particular folder, target, source, or parameter file exists in the PowerMart directory. A Java Transformation provides a potential solution to this problem. The following explains how to build a Java Transformation that takes as input the PowerMart root directory (or any directory the Informatica PowerCenter account has access to) and returns information about all the files within the directory and its sub-directories as rows.
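
As a sketch of the recursive walk at the heart of such a transformation, written here as a standalone class so the logic can be tested outside Informatica: inside PowerCenter you would assign each file's attributes to output ports and call generateRow() rather than printing.

    import java.io.File;

    public class DirectoryListing {

        // Emit one "row" per file under dir, descending into sub-directories.
        static void listFiles(File dir) {
            File[] entries = dir.listFiles();
            if (entries == null) return; // unreadable, or not a directory
            for (File entry : entries) {
                if (entry.isDirectory()) {
                    listFiles(entry);
                } else {
                    // In the transformation these values would feed output ports.
                    System.out.printf("%s|%d|%tF%n",
                        entry.getAbsolutePath(), entry.length(), entry.lastModified());
                }
            }
        }

        public static void main(String[] args) {
            listFiles(new File(args.length > 0 ? args[0] : "."));
        }
    }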

Read More

ORA-01652 Informatica

Apr 25, 2016 1:52:56 PM   Tom Nats

Topics: Informatica PowerCenter

We kept getting this error when trying to query the REP_SESS_TBL_LOG view in the PowerCenter repository:

ORA-01652: unable to extend temp segment by 64 in tablespace TEMP

We re-ran the statistics (even though they had been run the day before), but that didn't help.

Running the following command fixed the issue:

Read More

Using PowerCenter Command-line Tasks: Manipulating Workflows

Apr 25, 2016 1:52:39 PM   Jerry Perez

Topics: Informatica PowerCenter, Blog, Data Integration

PowerCenter Workflows

Command-line tasks are great components that can facilitate reusability, standardization, and scalability of processes tied to PowerCenter workflows. Building a library of reusable command tasks to use throughout a Subject Area can be beneficial for tasks such as managing dependencies, tracking objects, or interfacing with processes or tools external to PowerCenter.
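
For example, a reusable task might wrap pmcmd's startworkflow call so the same invocation can be shared across a Subject Area. The sketch below shows the idea from Java; the service, domain, folder, and workflow names are placeholders, so check your pmcmd reference for the exact flags your version supports.

    import java.io.IOException;

    public class StartWorkflow {
        public static void main(String[] args) throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                "pmcmd", "startworkflow",
                "-sv", "INT_SVC",              // integration service (placeholder)
                "-d", "DOMAIN_DEV",            // domain (placeholder)
                "-u", System.getenv("INFA_USER"),
                "-p", System.getenv("INFA_PASS"),
                "-f", "SUBJECT_AREA",          // repository folder (placeholder)
                "-wait",                       // block until the workflow completes
                "wf_LOAD_CUSTOMERS");          // workflow name (placeholder)
            pb.inheritIO(); // stream pmcmd output to this process's console
            int rc = pb.start().waitFor();
            System.exit(rc); // pmcmd's exit code signals success or failure
        }
    }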

Read More

Data Profiling: What, Why and How?

Apr 25, 2016 1:52:21 PM   Jason Hover

Topics: Data Profiling, Data Quality, Blog

Like it or not, many of the assumptions you have about your data are probably not accurate. Despite our best efforts, gremlins inevitably find their way into our systems. The end result – poor data quality – has a host of negative consequences. This brief article will provide an introduction to data quality concepts, and illustrate how data profiling can be used to improve data quality.

Read More

Querying the Informatica PowerCenter Repository

Apr 25, 2016 1:52:05 PM   Sally McCormack

Topics: Informatica PowerCenter, Blog

PowerCenter Tips

As an Informatica PowerCenter administrator, you may often need to obtain a list of users and associated groups, workflows that have last run, mappings in a folder, default values within a mapping, etc. This information can be queried in the PowerCenter tools; however, a more efficient way of collecting this data is to query the repository metadata tables directly in the database. This method is especially helpful when performing a large repository upgrade or decommissioning an environment.
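
As a minimal sketch of this approach, the JDBC snippet below pulls recent workflow runs from the REP_WFLOW_RUN repository view. The connection details are placeholders, and view and column names can vary by PowerCenter version, so verify them against your repository's documentation.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RepositoryQuery {
        public static void main(String[] args) throws Exception {
            // Placeholder connection string; requires the Oracle JDBC driver.
            String url = "jdbc:oracle:thin:@//repo-host:1521/ORCL";
            try (Connection conn = DriverManager.getConnection(url,
                     System.getenv("REPO_USER"), System.getenv("REPO_PASS"));
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT subject_area, workflow_name, start_time, end_time "
                     + "FROM rep_wflow_run ORDER BY start_time DESC")) {
                while (rs.next()) {
                    System.out.printf("%s.%s ran %s -> %s%n",
                        rs.getString(1), rs.getString(2),
                        rs.getTimestamp(3), rs.getTimestamp(4));
                }
            }
        }
    }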

Read More

New Data Integration Tool Improves Functionality, Quality and Speed

Apr 25, 2016 1:51:49 PM   Jeff Hensiek

Topics: Press Releases

Introducing InfaTools Stage Mapping Generator: A New Data Integration Tool

InfaTools Stage Mapping Generator, by Datasource Consulting, provides more efficient, higher-quality generation of staging mappings for the Enterprise Information Management and Data Warehousing industry while enabling Agile project delivery with an innovative solution that helps deliver data integration projects faster.

Datasource Consulting, LLC, a leading Enterprise Information Management and Data Warehouse consultancy serving large enterprise customers throughout the United States, announces the release of InfaTools Stage Mapping Generator (InfaTools), a proprietary data integration tool that speeds up the process of creating staging mappings, sessions and workflows in Informatica while improving efficiency and quality.

Read More

Sending an Email as Someone Else

Apr 25, 2016 1:51:07 PM   Tom Nats

Topics: Informatica PowerCenter, Blog

Sometimes when working with Informatica, you have to write a shell script or two (or three...). In those scripts, you might need to send an email on failure, etc.

To do this with the standard Linux client, you would use the following syntax:

Read More

Governing Leading Edge BI & Analytics Applications with Traditional Techniques, Part 1

Apr 25, 2016 1:50:20 PM   Matt Caton

Topics: Tool Selection, Business Intelligence, Program Management, Blog

Over the last 10 years, the Business Intelligence (BI) and Analytics software landscape has experienced two major trends. The first was a series of acquisitions consolidating some of the largest players. 2007 was the peak of this trend, starting with Oracle's purchase of Hyperion ($3.3 B), followed by SAP acquiring Business Objects (BO) ($6.78 B), and ending the year with IBM's acquisition of Cognos ($4.9 B). Before being acquired, Hyperion, Cognos, and BO had positioned themselves as visionaries and leaders in the space, and the behemoths took notice and action.

The second major trend saw the rise of a newer class of tools like Tableau, Birst and Qlik, boasting higher-quality visualizations, self-service, speed to delivery, and ease of use. Architecturally, these tools are leaner and require a smaller footprint. They have eliminated many of the burdensome installation and configuration tasks, in turn eliminating much of the need for IT intervention. The emergence of these new tools has significantly eroded the legacy vendors' stronghold on the market, dropping their share to 70% of sales in 2013 and even further in recent years.

Read More

Governing Leading Edge BI & Analytics Applications with Traditional Techniques, Part 2

Apr 25, 2016 1:49:26 PM   Matt Caton

Topics: Tool Selection, Business Intelligence, Program Management, Blog

In part one of this two-part series, we looked at the shift in the BI and analytics technology market from traditional tools to emerging technologies. We then briefly discussed how the new wave of BI and analytics technology often fails to meet all enterprise BI requirements. In part two, we discuss how an organization can adopt traditional techniques to govern leading-edge technologies.

Read More

Informatica Data Quality Standards: Tips

Apr 25, 2016 1:35:00 PM   Sally McCormack

Topics: Informatica Data Quality, Blog, Data Quality

CONSISTENT NAMING AND CODING STANDARDS

When designing rules in Informatica Data Quality, developers and data stewards see the same rules, so it is important to develop consistent naming and coding standards. For example, both the data steward and the developer will understand what "rule_" means, while not everyone will understand what "mplt_" means. Therefore, mapplets should be named with the "rule_" prefix if they are used in both the Analyst and Developer tools.

Read More