


Governing Leading Edge BI & Analytics Applications with Traditional Techniques, Part 2

Posted by Matt Caton on Apr 25, 2016 1:49:26 PM

In Part 1 of this two-part series, we looked at the shift in the BI and analytics technology market from traditional tools to emerging technologies, and briefly discussed how the new wave of BI and analytics technology often fails to meet all enterprise BI requirements. In Part 2, we will discuss how an organization can adopt traditional techniques to govern leading edge technologies.


When comparing technologies, it is easy to spot a glaring difference between how modern and traditional tools are configured and supported. With traditional tools, administrators are equipped with a number of guardrails to govern reports and analytics within a well-configured environment. For instance, Cognos reports and reporting models are primarily deployed within an online enterprise reporting portal, and providing access to the folders and reports within that portal requires an administrator to define a security model (ideally within an existing framework like Microsoft’s Active Directory). In contrast, with Tableau’s flagship product, Tableau Desktop, developers can simply distribute reports to anyone who has downloaded the public Tableau Reader. With Tableau Server, administrators can configure an online reporting portal, but this capability is still relatively immature there, whereas it is foundational for traditional tools.

In many ways, we are comparing apples to oranges when trying to compare the architectural features of traditional tools, like Cognos and MicroStrategy, to newer ones, like Tableau and Spotfire. The ultimate goal of both classes of tools is to serve enterprise reporting and analytic communities. With traditional tools, establishing strong governance is more natural and consistent with the administrative tasks necessary to deliver reports. Establishing governance with newer technologies, like Tableau, requires more attention and focus. Without discipline and strong governance standards, organizations will likely experience the same frustrations that they are currently encountering with tools like Microsoft Excel.

When a newer, leading edge analytics application fails to be adopted and trusted, it’s often the result of poor internal processes and a lack of available data. In many respects, modern applications offer superior benefits; however, a tool is only as good as the people and processes that support it. If an organization wants to realize the benefits of newer tools, it must govern the new technology with traditional techniques.

Governing New Technology with Proven Techniques

To guarantee both the integrity of results and a return on investment (ROI), an organization has to put guardrails in place. Clearly, there are caveats and exceptions that must be accommodated for advanced data discovery and statistical analysis. Ultimately, though, to effectively govern new technologies, we need a set of best practices that includes the following:

  • Establish the gold standard – This two-step best practice requires the BI program to create, or leverage an already established, validation process that clearly differentiates reports that have been through exhaustive validation from those that have not.
    • Validate – To be established as a Gold Standard report – and to be clear, we aren’t talking about a one-time report but a dynamic report based on relative dates and dimensions – a strong validation process must be in place. This is typically achieved with a strong internal peer review, followed by an external audit, and then full user acceptance testing.
    • Brand – Next, a brand qualifying this validation should be established and promoted. Adding a watermark indicating to consumers that a report has been through the gauntlet of validation will help instill trust and confidence in the BI application used to create it.
  • Define and govern data access and, ideally, logical semantic layers – Enable self-service by establishing and governing data access connections for consumers. Newer advanced analytics applications don’t typically support the development of a common semantic layer (in other words, a predefined ad-hoc reporting data model). When applicable, create, govern, and validate logical self-service reporting and analytics data models. Define, document, and train users on how to access semantic data layers and/or other available, validated data sources. Provide data model navigation documentation so consumers know exactly how to query models and are aware of any nuances. Ideally, enterprise standard reports and executive-level dashboards will leverage the same semantic data access layers to further promote data consistency, trust, and confidence in the BI and analytics tool.
    • Logical – Enable data analysis by providing logical, rich, accurate, and accessible data models. Logically organize hierarchies and groups to clearly delineate between data types like dates, dimensions, and measures. Define clean and logical organization within a guard-railed environment so that users can quickly ask and answer any number of questions.
    • Governed – Semantic models and data access layers that are rooted in governance, with a clear lineage back to their sources, will truly enable and fast-track self-service. Data analysts won’t be forced to integrate and aggregate their own data, hoping it’s correct; instead, they can focus all their attention on analyzing data they know is correct.
  • Socialize and align with the business – This best practice is at the heart of data governance and, although it sounds straightforward, involves a great deal of discipline and consistency. BI programs that intersect with IT and the business are instrumental in cross-aligning multiple departments and user types.
    • Cheerlead the efforts – Promoting the effort leading up to the acquisition and integration of a new tool will help set expectations for key end users. This will also help create awareness of, and an appetite for, the technology.
    • Establish a governance board and meet often – This aspect of governance is broader than the newly acquired tool itself but vital for the data feeding it. If one is not already in place, establish a governance board and define the primary objectives of the new enterprise reporting and analytics package.
  • Create clean, logical security structures – Security models will be most effective if they merge technical and business requirements and are logical and easy to administer. Establish cohesive security models to efficiently and effectively govern your newly acquired BI and analytics tool.
  • Automate tool usage and health monitoring – Design, or leverage, the tool’s ability to provide usage statistics. Set up recurring emails to monitor who is logging in and querying semantic models and/or data sources most often. Where applicable, manage server-side system health with automatic triggers and monitoring. Tool usage statistics enable stronger tool governance by highlighting who is most and least active in the BI and analytics solution.
  • Manage metadata and business rules – The tool is only going to be as good as the data feeding it, and to instill trust and confidence, users will need insight into data lineage and sources. This also requires insight into the conformed business rules defined to support certain groupings and integration points.
    • Conform to the business and other BI tools – These business rules, where applicable, should conform across the enterprise and across other BI and reporting tools sourcing the same data.
    • Define data stewards and meet often – In addition to a Governance Board, data stewards should be in place to represent and make decisions regarding their data. For instance, business rules and data definitions should be owned and maintained by those closest to that data. This data must be available for Enterprise consumption and data stewards will function as gatekeepers and subject matter experts.
  • Train and enable – Initially, with a newly acquired analytics tool, the goal is to enable analysts and power users to ask and answer their own questions. Ideally, users will become strong enough with the technology to create their own dynamic reports and dashboards. With a well-defined report or dashboard, the BI program can facilitate putting it through the full validation process to create a ‘Gold Standard’ report. Enabled consumers will take ownership of and pride in the system and its output.
  • Define deliverables and set appropriate expectations – Setting a clear scope and definition for success with regard to the newly acquired technology is essential. Define deliverables such as the examples below. Appropriately setting achievable expectations will in turn help the BI Program and business begin to define and monitor the ROI of their purchase.
    • To consolidate 50 Sales reports into 3 dynamic Sales dashboards
    • To enable the Finance department with a semantic model for Self-Service
    • To promote Gold Standard reports within a Mobile based reporting solution
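The two-step gold-standard practice described above (validate, then brand) can be sketched as simple report metadata. This is an illustrative sketch only; the class, the validation step names, and the watermark text are assumptions for this example, not part of any particular BI tool's API:

```python
from dataclasses import dataclass, field

# Hypothetical validation stages drawn from the text:
# internal peer review, external audit, then user acceptance testing.
VALIDATION_STEPS = ("peer_review", "external_audit", "user_acceptance")

@dataclass
class Report:
    name: str
    completed_steps: set = field(default_factory=set)

    def complete(self, step: str) -> None:
        if step not in VALIDATION_STEPS:
            raise ValueError(f"unknown validation step: {step}")
        self.completed_steps.add(step)

    @property
    def is_gold_standard(self) -> bool:
        # Only reports that pass every stage earn the brand.
        return self.completed_steps == set(VALIDATION_STEPS)

    def watermark(self) -> str:
        # The brand consumers see on the rendered report.
        return "GOLD STANDARD" if self.is_gold_standard else "UNVALIDATED DRAFT"

report = Report("Quarterly Sales")
for step in VALIDATION_STEPS:
    report.complete(step)
```

The key design point is that the watermark is derived from the validation state rather than set by hand, so a report cannot carry the brand without having passed every stage.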
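The "clean, logical security structures" recommendation can also be sketched in miniature. Assuming group membership comes from an existing directory such as Active Directory, a security model reduces to a mapping from groups to folder permissions; the group names, folders, and permission levels below are hypothetical:

```python
# Illustrative role-to-folder permission mapping; all names are made up.
ROLE_PERMISSIONS = {
    "BI_Admins":        {"Finance": "admin", "Sales": "admin"},
    "Finance_Analysts": {"Finance": "write"},
    "Sales_Viewers":    {"Sales": "read"},
}

# Ordering of permission levels, lowest to highest.
LEVELS = {"none": 0, "read": 1, "write": 2, "admin": 3}

def effective_access(user_groups, folder):
    """Return the highest access level a user's groups grant on a folder."""
    best = "none"
    for group in user_groups:
        level = ROLE_PERMISSIONS.get(group, {}).get(folder, "none")
        if LEVELS[level] > LEVELS[best]:
            best = level
    return best
```

Keeping the model this declarative is what makes it "easy to administer": granting access means editing one table, not touching individual reports.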
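Likewise, the usage-monitoring practice amounts to aggregating an audit log per user so the most and least active consumers stand out. The log format below is an assumption for illustration; real BI servers expose usage statistics in their own formats:

```python
from collections import Counter

# Hypothetical login/query audit log entries.
usage_log = [
    {"user": "asmith", "action": "query", "model": "Sales"},
    {"user": "asmith", "action": "query", "model": "Sales"},
    {"user": "bjones", "action": "login"},
    {"user": "cdoe",   "action": "query", "model": "Finance"},
]

def usage_summary(log):
    """Count activity per user across the audit log."""
    return Counter(entry["user"] for entry in log)

summary = usage_summary(usage_log)
# most_common() surfaces the heaviest users for a recurring email report.
most_active, count = summary.most_common(1)[0]
```

A summary like this, emailed on a schedule, is what lets the BI program spot both power users worth enabling further and licensed users who never log in.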

A common thread within each of the above recommendations is perception: specifically, a focus on instilling confidence and trust in the reports and analytics produced by your BI tool. In all cases, this requires an intense focus on the end consumer. Establishing traditional governance techniques as outlined above is imperative to promote confidence in the tool as well as its usage. These techniques require diligence and an upfront commitment to establish a strong governance foundation. By setting a solid foundation in governing both the BI tool and the data it sources, you’ll be well positioned to realize its true potential.

About the Author:

Matt Caton is the BI & Analytics Practice Lead at Datasource Consulting. Matt is a certified Informatica MDM Specialist and a certified Cognos Multi-Dimensional Report Developer. He is knowledgeable in numerous enterprise reporting tools including IBM Cognos, Tableau, Oracle Business Intelligence Enterprise Edition (OBIEE), SAP BusinessObjects, Microsoft SQL Server Reporting Services (SSRS), MicroStrategy, Pentaho, and Jaspersoft.

Topics: Tool Selection, Business Intelligence, Program Management, Blog

Written by Matt Caton