

Posted by David Crolene on Dec 20, 2016 10:19:12 AM

Business Intelligence Tools: 10 Steps to a Successful Tool Evaluation


Datasource recently completed a project where we helped a client evaluate and compare two business intelligence tools, both of which were data virtualization products.  As an objective facilitator, Datasource Consulting was called upon to ensure that both products had a fair and even playing field and to help uncover the strengths and weaknesses of each tool to drive a decision.  Whether you are evaluating Business Intelligence (BI) tools, Extract-Transform-Load (ETL) tools, relational databases (RDBs), etc., this article provides the step-by-step approach we use to compare and evaluate products.


10 Steps to a Successful Business Intelligence Tools Evaluation


Evaluating Business Intelligence Tools Step 1 – Basic PM Tasks

As with any project, it is important to plan ahead.  At a minimum, establish the following:

  • Project timeline
  • List of team resources and their availability
  • Internal communication plan and recurring meetings
  • Vendor communication plan – particularly important to keep an even playing field
  • Issues log

Evaluating Business Intelligence Tools Step 2 - Ask the Vendors for Information

Since most vendor evaluations last only a few weeks, it is important to ask all vendors involved to answer a series of questions.  This fills in gaps that might not be covered by the hands-on evaluation tasks (e.g., use cases and demonstrations).  Typically, such a request takes the form of a formal Request for Information (RFI) or a more basic questionnaire.  Either way, we recommend breaking the questions into three categories:

  • Capability – The feature set offered by each tool.
  • Integration - Integration within the company environment, existing data warehouse tools, and established infrastructure and policies.
  • Vendor - Company factors including product vision, market share, industry experience/track record, corporate stability, references, industry reviews, and price.

These categories are further broken into subjects and then specific questions.  An example question might be:

Category     Subject   Question
Capability   Design    Is an API/SDK available for the tool?
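
For teams that like to track the questionnaire programmatically, the category-subject-question hierarchy maps naturally onto simple structured data.  Here is a minimal sketch in Python; the field names and the answers placeholder are our own illustration, not a prescribed format:

    # Minimal sketch: one record per RFI question, grouped by category and subject.
    # Field names are illustrative only.
    rfi_questions = [
        {
            "category": "Capability",
            "subject": "Design",
            "question": "Is an API/SDK available for the tool?",
            "answers": {"Vendor A": None, "Vendor B": None},  # filled in from RFI responses
        },
        # ...one entry per question
    ]

Keeping the answers alongside the questions makes it easier, in Step 4, to tie each question back to the subject it helps score.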

Note: As you evaluate and finalize the questions to be asked of the vendors, consider how the questions might serve to help score the products.

Evaluating Business Intelligence Tools Step 3 – Determine the Scoring Team

Picking your scoring team may seem easy, but it is important to create a group of people that:

  • Has available time to attend meetings, demonstrations, and conduct necessary research
  • Has some amount of background in the technology field or adjacent field, so that prerequisite education is kept to a manageable level
  • Serves as a representative cross-section of the ultimate user base and support base.  One might consider including resources from Architecture, Development, and Operations.

It is a good idea to create a Scoring Team that is large enough to provide a representative cross-section, while keeping it small enough to remain streamlined.  We like to keep the team to between 4 and 10 people.

Evaluating Business Intelligence Tools Step 4 - Create a Scoring Matrix

A scoring matrix provides a means to objectively evaluate one tool versus others.  Ultimately, the Scoring Team will provide scores that, when weighted, produce a final score for each of the business intelligence tools being evaluated.

You already have a list of Categories (Step 2), and now you can define the scoring subjects.  This should be a finite list of assessment areas that will be individually scored.  Typically, we try to categorize these subjects into the same categories used during the RFI.  An example of categories and their corresponding subjects might be:

Capability – The feature set offered by each tool
  • Design
  • Performance
  • Metadata
  • Ease of Use

Integration – Integration within the company environment, existing data warehouse tools, and established infrastructure and policies
  • Security
  • Configuration/Data Sources
  • Scheduling/Monitoring
  • Support Resources
  • POC Performance
  • Company-specific Capabilities

Vendor – Company factors including product vision, market share, industry experience/track record, corporate stability, references, industry reviews, and price
  • Vendor Overview
  • Pricing
Finally, we identify the list of evaluation criteria under each subject.  We find it is best to generate scores at the subject level, while identifying more specific evaluation criteria that help the scoring team conceptualize what they are scoring.  Often, these criteria are associated with a use case or a questionnaire item.  Other times, you may choose to simply write a sentence that helps quantify the subject.  An example of an evaluation criterion might be:

“Vendor performance during POC - quality of resources, communication, displayed interest in project, etc.” 

Either way, it is helpful to display the associated use case(s), questionnaire item(s), and/or written evaluation criteria within the scoring matrix.

Note:  It is easy to forget to weight and score soft factors that are discovered during the evaluation of each of the business intelligence tools.  Don’t forget to include criteria that will capture such soft factors as:  quality of experience with Vendor during evaluation, uptake by internal resources of each tool, prospective end-user general opinion, number of bugs discovered during evaluation, etc.

Datasource Consulting maintains a Scoring Matrix template, but others may be found on the Internet.  The first section of the Datasource matrix is where the Scoring Team enters weights for each evaluation item.

The second section of the template includes an area where the Scoring Team can enter their scores.

Finally, the third section displays a vendor comparison and graphical results.
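
To make the roll-up concrete, here is a minimal sketch of how weighted subject scores produce a final score per tool.  All weights, scores, and tool names below are hypothetical; in practice they come from Steps 5 and 9:

    # Hypothetical subject weights (summing to 100) and average team scores (0-3 scale).
    weights = {"Design": 15, "Performance": 20, "Security": 25, "Pricing": 40}
    scores = {
        "Tool A": {"Design": 2.5, "Performance": 2.0, "Security": 3.0, "Pricing": 1.5},
        "Tool B": {"Design": 3.0, "Performance": 1.5, "Security": 2.0, "Pricing": 2.5},
    }

    for tool, subject_scores in scores.items():
        # Weighted average: sum of (weight x score) divided by the total weight.
        final = sum(weights[s] * subject_scores[s] for s in weights) / sum(weights.values())
        print(f"{tool}: {final:.2f}")  # final score on the same 0-3 scale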

Evaluating Business Intelligence Tools Step 5 – Weight the Scoring Matrix

Assemble the Scoring Team in a conference room and facilitate a weighting session.  We find that it is best to first weight the categories.  To accomplish this, one would start by force ranking the importance of the three categories.  This helps draw out communication within the team as to the importance of each category.

Once the categories are force ranked, ask each member of the scoring team to divvy up 100 points among the three categories.  When every member has done so, calculate an average value for each category; these averages become the category weights.

The next step is to ask each scoring team member to divvy up 100 points among the subjects within each category.  The subject points are then pro-rated by the corresponding category weight.  The resulting subject weights should add up to a total of 100 points.
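
As a worked example of this arithmetic, the sketch below computes category weights from each member's 100-point allocation and then pro-rates one category's subject points (all numbers are hypothetical):

    # 1. Each team member divides 100 points among the categories; average per category.
    category_points = {
        "Capability":  [50, 40, 45],   # one entry per scoring-team member
        "Integration": [30, 40, 35],
        "Vendor":      [20, 20, 20],
    }
    category_weights = {c: sum(p) / len(p) for c, p in category_points.items()}
    # -> Capability 45.0, Integration 35.0, Vendor 20.0 (sums to 100)

    # 2. Members divide 100 points among each category's subjects (averages shown here),
    #    then each subject is pro-rated by its category weight.
    capability_points = {"Design": 30, "Performance": 40, "Metadata": 10, "Ease of Use": 20}
    capability_weights = {
        s: p / 100 * category_weights["Capability"] for s, p in capability_points.items()
    }
    # -> Design 13.5, Performance 18.0, Metadata 4.5, Ease of Use 9.0 (sums to 45.0)

Repeating step 2 for Integration and Vendor yields subject weights that sum to 100 across the whole matrix.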

Evaluating Business Intelligence Tools Step 6 – Finalize Use Cases

Now that the team has developed a Scoring Matrix, Use Cases should be developed to help evaluate the subjects.  We try to ensure the Use Cases test a breadth of functionality while keeping the scope achievable within the project's duration.  More Use Cases aren't always better; sometimes fewer, well-written Use Cases are more telling of the business intelligence tools' strengths and weaknesses.

Using traditional Use Case structure, one would document:

  • Use Case Number
  • Use Case Name
  • Actors
  • Use Case Description

The Use Case Description provides a detailed discussion of the steps to perform during the evaluation of the business intelligence tools.  This should include enough information to execute the Use Case in all business intelligence tools being evaluated, without giving undue advantage to any one tool.
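
As a minimal illustration, a single Use Case record might look like the following.  The content is hypothetical; use whatever template your organization prefers:

    # Minimal sketch of one Use Case record; all content is illustrative.
    use_case = {
        "number": "UC-03",
        "name": "Federate two heterogeneous sources",
        "actors": ["ETL Developer"],
        "description": (
            "Join a customer table in the relational database to a flat file, "
            "publish the result as a reusable view, and record build time and "
            "any workarounds required. Execute the same steps in every tool."
        ),
    }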

Evaluating Business Intelligence Tools Step 7 – Conduct Evaluation

Perform the tool evaluation.  Execute each Use Case using each tool.  Document the results, impressions, performance, and any other characteristics called for within the Use Case Description.  Issues with each tool should be captured and cataloged.

Evaluating Business Intelligence Tools Step 8 – Demonstrations, Results & Vendor Presentations

This step consists of several meetings.  First, a demonstration of each tool is presented to the Scoring Team.  Then, the evaluation results are shared with the Scoring Team.  Questions should be captured and more complex questions shared with the corresponding vendors for response.  It is important for the Scoring Team to try to keep an open mind and hold off on scoring the business intelligence tools until after the final Vendor Presentation.

Finally, each vendor delivers a final presentation.  The Vendor Presentations typically focus more on the overall tool suite and the vendor's fit within the industry.  This is important for scoring the final Category, "Vendor".

Evaluating Business Intelligence Tools Step 9 – Conduct Scoring

After all demonstrations, Q&A sessions, and vendor presentations are concluded, set up a meeting with the Scoring Team to conduct scoring.   Prior to this meeting, ask the team to do all necessary homework and to come ready to provide a score.  The meeting provides a forum in which final questions may be asked and score values may be debated.

It is a good idea to be explicit as to what each score represents.  The Datasource Consulting template includes scores ranging from 0-3:

  • 0 - not available
  • 1 - limited functionality
  • 2 - satisfies
  • 3 - strongly satisfies

In this session, step through each of the subjects (within the Categories) and ask each team member to provide a score.  Avoid spending too much time debating each score; instead, focus on gathering the initial scores for each of the business intelligence tools.  Once the scores are gathered, feel free to circle back and discuss outlier scores with the team.  If the scoring team is large enough, consider throwing out the high and low scores.  This helps reduce the impact of individual bias on the overall evaluation results.
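
If you do throw out the extremes, the calculation is a simple trimmed average.  A minimal sketch, assuming one high and one low score are dropped only when the team is large enough:

    # Drop the single highest and lowest score before averaging; keep all
    # scores when the team is too small for trimming to make sense.
    def trimmed_average(scores, min_team_size=5):
        if len(scores) < min_team_size:
            return sum(scores) / len(scores)
        trimmed = sorted(scores)[1:-1]      # discard one low and one high outlier
        return sum(trimmed) / len(trimmed)

    print(trimmed_average([0, 2, 2, 3, 2, 3]))  # -> 2.25 after dropping the 0 and one 3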

If the scoring team composition is skewed to include more resources from one organizational function than another (e.g., more people from Support than Development or Architecture), consider weighting the scores of the business intelligence tools to create a fair representation of the ultimate user base.  An easy way to accomplish this is to take an average of scores within each group and then average the group averages to arrive at the total score.  The table below shows an example of this:

Group          Team Member   Score   Number of Scorers   Group Average   Total Average
Architecture   Arch 1        2       2                   1.5             1.6
               Arch 2        1
Development    Dev 1         1       4                   2.0
               Dev 2         2
               Dev 3         3
               Dev 4         2
Support        Support 1     2       3                   1.3
               Support 2     1
               Support 3     1
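
A short sketch of this two-level averaging, reproducing the numbers in the table above:

    # Average scores within each group first, then average the group averages,
    # so an over-represented group does not dominate the result.
    groups = {
        "Architecture": [2, 1],
        "Development":  [1, 2, 3, 2],
        "Support":      [2, 1, 1],
    }
    group_averages = {g: sum(s) / len(s) for g, s in groups.items()}
    total_average = sum(group_averages.values()) / len(group_averages)
    # group_averages -> Architecture 1.5, Development 2.0, Support 1.3 (rounded)
    # total_average  -> 1.6 (rounded), matching the table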

Evaluating Business Intelligence Tools Step 10 – Document Findings

Once the scoring is complete, create a final report outlining and compiling the findings.  This should include an overview on the status of the Use Cases, open issues, performance findings, and scoring results.  A sample report table of contents follows:

  • Project Background
  • Evaluation Approach
    • Overview
    • Use Cases
    • Diagram(s)
    • Performance Testing
    • Scoring
  • Evaluation Findings
    • Use Cases
    • Performance
    • Scoring Results
  • Conclusion
  • Appendix (supporting materials)

The Scoring Results section should display a summarized view of the score outcomes.

Ultimately, this document will serve to provide enough information to arrive at a recommendation.  Datasource Consulting maintains a template for such a final report that we leverage on Tool Evaluation projects.

Keep in mind that sometimes multiple tools might appear to be close enough to be viable and the decision may boil down to pricing.  However, having conducted a thorough evaluation on all other aspects of the business intelligence tools, one is empowered to negotiate with the vendors with confidence.

Are you ready to discuss your project?

Let's chat about it.  We look forward to helping and becoming your data partner.

