Evaluation Guide: How to choose the right modern BI & analytics platform



About this guide

This evaluation guide aims to support IT organizations as they evaluate and select a modern BI & analytics platform suitable for a broad, enterprise-wide deployment.

The transition to a self-service-based, modern BI model requires IT to adopt a collaborative approach that includes the business in all aspects of the overall program (see Redefining the Role of IT in a Modern BI World). This guide focuses on the platform evaluation and selection aspect of a modern BI program. It is intended for IT to use collaboratively with business users and analysts as they assess each platform’s ability to execute on the modern analytics workflow and address the diverse needs of users across the organization.

The modern analytics workflow is a cycle of interrelated capabilities.

IT enables the modern analytics workflow, but it is primarily driven by business users and analysts throughout the organization, and its successful implementation requires collaboration and participation from all roles. To select a modern BI & analytics platform that can be adopted and widely deployed, organizations should weigh the following set of foundational core attributes throughout the evaluation process; they are covered in detail in the “Core platform attributes to consider” section below:

• Platform integration & accessibility
• Ease of use
• User enablement
• Deployment flexibility
• Pricing and packaging

Intended audience

This guide assumes the following core role types will be represented and available to participate in applicable aspects of the evaluation:

  • IT/BI professional – performs all initial setup tasks including software installation, user provisioning, access rights, governance oversight, and some development tasks (content and data source).
  • Content creator – performs most of the content creation tasks including data preparation, free-form exploration, content promotion, and data validation.
  • Information consumer – primarily accesses and interacts with curated content and trusted data sources.

Throughout the guide, a primary role will be identified for each stage within the analytics workflow as this is the lead role for that specific stage of the evaluation. However, it is imperative that every stage of the evaluation includes participation and input from all of the above role types to ensure all needs and concerns are addressed through the process.

It should also be noted that in some organizations the same person may serve multiple roles, so it would not be uncommon for a single person to evaluate a platform from more than one perspective. Ultimately, the modern approach to business analytics will evolve to the point where it will no longer be possible (or necessary) to differentiate between an enabler, a producer, or a consumer of analytics within an organization.

Prerequisites for evaluation

In order to conduct a comprehensive evaluation of a modern analytics platform, the following tasks should be completed prior to kicking off the evaluation process.

  • Desktop/server/cloud software licenses needed for evaluation
  • Professional services/implementation partner engagement (if applicable)
  • Role identification and evaluation assignments:
    • IT/BI professional
    • Content creator
    • Information consumer
  • Access to cloud data source(s) and on-premises data source(s)
  • Initial environment setup
  • Initial user provisioning and security
  • Confirm availability of mobile devices (iOS, Android, other), including phones and tablets

Modern BI & analytic platform evaluation

Core platform attributes to consider

This guide primarily focuses on evaluating specific inter-related capabilities that are important when selecting a modern BI & analytics platform. However, it is critical that the evaluation team considers the following list of non-technical core attributes that are essential to the successful implementation and execution of the modern analytic workflow in an organization. These attributes should factor heavily into the ultimate decision as they collectively serve as the glue that holds together the individual capabilities of the workflow and are foundational in nature.

Platform integration & accessibility

  • Can all of the steps of the modern analytics workflow be executed seamlessly within the platform without the need to move between modules/products in a disconnected manner?
  • Can all of the steps of the modern analytics workflow be executed without IT involvement or specialized skills?

Ease of use

  • Is it easy for BI platform administrators to install, configure, and manage the platform?
  • Is it easy for content creators to prepare data and curate data sources without upfront or ongoing assistance from IT?
  • Is it easy for content creators to author content and access the analytic capabilities of the platform without upfront or ongoing assistance from IT?
  • Is it easy for non-technical content consumers to find, view and interact with available analytic content?
  • Is it easy for non-technical content consumers to ask deeper questions autonomously and customize existing published content to suit their specific needs?

User enablement

  • Is role-specific training available and accessible to all users?
  • Are there self-paced tutorials and/or online webinars that users can access?
  • Is it easy for users to search and find answers to product-specific questions?
  • Is there a robust and active user community accessible to share and learn best practices, tips & tricks, etc.?
  • What is the platform vendor’s reputation for resolving technical support issues?
  • Are professional services (through vendor or partners) readily available?
  • What is the platform vendor’s reputation for ensuring customer success and ongoing engagement with customers?

Deployment flexibility

  • Does the platform offer flexible deployment options (e.g. SaaS, public/private cloud deployment, on-premises, etc.)?
  • Does the platform offer flexible data storage options (e.g. in-DB vs. platform storage (in-memory))?
  • Does the platform support hybrid connectivity of on-premises and cloud data sources?
  • Is the platform scalable to accommodate increasing data volumes and additional users over time?
  • Can the platform easily scale up and out depending on the needs of the organization?

Pricing and packaging

  • Is the product packaging easy to understand?
  • Are the available licensing options clear and transparent?
  • Is the pricing model for the platform easy to understand?
  • Is the pricing model for the platform flexible and scalable?

Access and view

As organizations begin the transition from a traditional top-down approach driven by IT to one based on self-service, it is often advantageous for IT (or a centralized BI team) to develop an initial set of trusted data sources and analytic content. Business users can then access and use this content as a starting point for their analysis. Over time, as users are encouraged to ask and answer their own questions as part of the modern analytics workflow, the domain of available trusted content will grow organically, giving users access to a wider range of analytic content for self-service. For the purposes of this section, the origin of the content available to end users is disregarded; the evaluation criteria for getting content into a governed state are addressed in the “Promote and govern” section.

The evaluation criteria for this section will first be addressed from the perspective of the IT/BI professional who is ultimately responsible for the administration of the centralized environment where analytic content is stored and maintained, and data sources are administered and monitored.

Evaluation criteria:

IT/BI professionals should be able to:

  • Define and update underlying data refreshes and monitor status.
  • Choose where underlying data used for analysis should be stored and how it should be accessed.
  • Extend the platform to include partner-provided capabilities.
  • Monitor and audit usage of available content and perform impact analysis.
  • Diagnose and tune performance-related issues.

Evaluation considerations:

  • Can the refresh schedule be set and managed independently for each item centrally stored in the analytic content repository?
  • Can a specific person or role be set up to be notified of issues/failures in the data refresh process?
  • Can queries originating from the analytic platform be pushed down to the underlying database where the data resides?
  • Can data be ingested into the analytic platform’s in-memory/columnar storage for performance optimization?
  • Can on-premises data be accessed live from the analytic platform when deployed in the cloud?
  • Can the platform be extended via APIs/SDKs to include supplemental analytic capabilities not natively delivered by the platform?
  • Can usage of specific data sources and available analytic content be tracked and audited by an administrator?
  • Can an administrator perform an impact analysis to determine the scope and severity of a proposed change to downstream content and processes?
  • Does the platform offer utilities to an administrator to identify, diagnose, and resolve performance-related issues?
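As an illustration of the refresh-monitoring and failure-notification questions above, the following Python sketch shows one way an administrator might flag failed or stale refreshes and route them to a responsible owner. The job records, field names, and owner mapping are all hypothetical; a real platform would expose this information through its own audit log or administrative API.

```python
from datetime import datetime, timedelta

# Hypothetical refresh-job records, as an administrator might pull them from
# a platform's audit log (the schema here is illustrative, not any vendor's).
REFRESH_JOBS = [
    {"source": "sales_orders", "status": "success", "finished": "2017-06-01T02:05"},
    {"source": "inventory", "status": "failure", "finished": "2017-06-01T02:10"},
    {"source": "hr_headcount", "status": "success", "finished": "2017-05-30T02:00"},
]

# Who should be notified per data source, supporting the "can a specific
# person or role be notified of refresh failures" evaluation question.
OWNERS = {
    "sales_orders": "bi-team@example.com",
    "inventory": "data-steward@example.com",
    "hr_headcount": "bi-team@example.com",
}

def refresh_alerts(jobs, owners, now, max_age_hours=24):
    """Return (owner, source, reason) tuples for failed or stale refreshes."""
    alerts = []
    for job in jobs:
        finished = datetime.fromisoformat(job["finished"])
        if job["status"] != "success":
            alerts.append((owners[job["source"]], job["source"], "failed"))
        elif now - finished > timedelta(hours=max_age_hours):
            alerts.append((owners[job["source"]], job["source"], "stale"))
    return alerts

now = datetime(2017, 6, 1, 8, 0)
for owner, source, reason in refresh_alerts(REFRESH_JOBS, OWNERS, now):
    print(f"notify {owner}: refresh of '{source}' is {reason}")
```

In an evaluation, the equivalent question is whether the platform provides this routing natively, per item in the content repository, without custom scripting.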

The second perspective in this section to consider is that of the information consumer who drives the specific usage requirements and parameters that the IT/BI professional is responsible for successfully delivering.

Evaluation criteria:

Information consumers should be able to:

  • Search the repository for existing content based on a keyword or topic.
  • Define alerts and notification preferences if a metric/KPI exceeds a threshold or is triggered by a specific condition.
  • Subscribe to relevant content and set update/notification preferences.
  • Access and view analytic content on any preferred form factor.

Evaluation considerations:

  • Can a user perform a search to find and view available content that has already been created by another user that may assist in answering a business question?
  • Can a user easily determine whether analytic content and/or data sources have been certified and should be considered trusted?
  • Can a user access and view field-level metadata to understand the underlying details of a particular data element?
  • Can a user define data-driven or static thresholds to indicate when a notification should be triggered?
  • Can a user specify how and where relevant alerts and notifications are delivered?
  • Can a user subscribe to specific content and set notification preferences following updates or other events impacting content subscriptions?
  • Can a user search for and access analytic content on any device (phone, tablet, laptop, etc.)?
  • Can a user access and download analytic content via a mobile device for offline viewing?


Interact

The interact phase is an extension of the initial access-and-view phase of the analytics workflow. It enables information consumers to perform guided analysis of available content within fixed boundaries predetermined by the content publisher. The following considerations should be the focus of the evaluation for this section from the perspective of the information consumer:

Evaluation criteria:

Information consumers should be able to:

  • Change the scope of analysis through direct interaction with the visual interface.
  • Leverage controls provided by the content author to increase analytic depth.
  • Use search capabilities to interact with available content.
  • Interact with content on any preferred form factor.

Evaluation considerations:

  • Can a user control the scope of analysis interactively through native capabilities of the platform? The following questions should be evaluated to determine the extent to which this is addressed directly within the flow of visual interaction:
    • Can a user drill up and drill down using predefined or custom-built hierarchies?
    • Can a user focus his or her analysis on a specific data point or set of data points identified through the visual interaction process?
    • Can a user exclude a specific data point or a set of data points identified through the visual interaction process?
    • Can the user interact with parameters to change the analytical view or perform what-if analysis/scenario modeling?
    • Can the user interact with visible filter controls to change the scope of analysis?
    • Can a user search on keywords to drive filters and change the scope of analysis?
    • Can a user interact with available analytic content through natural language query?
  • Can a user perform the same level of interaction on any device, regardless of form factor?
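One way to picture the search-driven interaction described above is a keyword lookup over content metadata that narrows what the user sees. The sketch below uses illustrative titles and tags rather than any platform's actual search index:

```python
# Hypothetical content catalog; titles and tags are illustrative only.
CONTENT = [
    {"title": "Quarterly Sales by Region", "tags": {"sales", "region"}},
    {"title": "Inventory Aging", "tags": {"inventory", "supply"}},
    {"title": "Sales Pipeline Health", "tags": {"sales", "pipeline"}},
]

def search_content(keyword, items):
    """Return content whose title or tags match the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [c for c in items
            if kw in c["title"].lower() or kw in c["tags"]]

hits = search_content("sales", CONTENT)
print([c["title"] for c in hits])
```

During an evaluation, the question is whether this kind of keyword-driven narrowing is available directly in the consumption interface, not whether it can be scripted.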

Analyze and discover

This phase of the modern analytics workflow spans a broad spectrum of user needs, and it is imperative that the platform seamlessly addresses these needs. This phase is of particular significance in the workflow as it differentiates data-visualization tools used to build charts from rich visual-analysis tools that use visualizations as the primary metaphor for analysis. As users interact with dashboards and generate new questions, they will inevitably encounter barriers and roadblocks as they reach the limits of the guided experience offered by existing dashboards. When this occurs, users require a self-driven, autonomous framework for asking and answering the new questions that have emerged. Users of all skill levels must be able to “visualize as you analyze” and access the analytics capabilities of the platform while in the flow of analysis, without having to move to a different module or product in the suite.

The concepts of platform integration and ease of use are covered in greater detail in the “Core platform attributes to consider” section of this guide, but they are most critical to consider here. The transition from the “interact” phase to “analyze & discover” is often where the analytics workflow is disrupted due to a lack of overall continuity among the platform components needed to ask the next level of questions.

The first scenario to consider is from the perspective of an information consumer who has generated new questions that cannot be addressed by any available dashboards. The following considerations should be the focus of the evaluation for this scenario:

Evaluation criteria:

Information consumers should be able to:

  • Access the trusted data source that serves as the source of a dashboard to autonomously launch a deeper contextual analysis.
  • Search the repository of trusted data sources to identify curated data sets that are available to augment the analysis.
  • Enhance the data model of trusted sources to customize for their specific needs.

Evaluation considerations:

  • Can a user, from within a production dashboard, launch a new analysis using the data set(s) that the dashboard sources from? This should allow for self-service exploration and analysis of all data elements contained within the data source without the need to access a separate product or module within the platform.
  • Can a user browse or search the repository of available production data sources and launch a new analysis from a selected data source? The success criteria are the same as the previous step, the only difference being that the analysis starts from a data source, not an existing dashboard.
  • Can a user, once connected to a trusted data source, modify and augment the existing data model within the flow of analysis and content creation? This should be done in the context of the analysis, not in a separate product or module within the platform, and each of the following questions should be addressed:
    • Can a user enrich the existing data model to create new dimensions and measures needed for analysis?
    • Can a user combine and group related data points into a new field in the data model to streamline analysis?
    • Can a user isolate specific data points of interest and save them dynamically within the data model for further analysis?
    • Can a user modify the data model and create custom drill paths and hierarchies to align with their analytic needs?
    • Can a user interactively correct data issues that surface during the analysis process? This would include NULL value handling and renaming/replacing values globally for consistency.
  • Assess the breadth and depth of assistive analytics capabilities available within the product to augment the analytics workflow where appropriate, using the following questions:
    • Is the user presented with recommended best-fit visualizations throughout the discovery process based on the chosen analysis path?
    • Are advanced analytic capabilities accessible to the user to enrich analysis without necessarily having to understand or access the underlying models or algorithms used within the product?
    • Can a user access the underlying statistical detail used for advanced analytics if needed to share with more advanced users who may request it for further analysis and validation?
    • Is field-level metadata accessible throughout the analysis process, and can it be updated and augmented by the user as appropriate?
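The data-model enrichment questions above (new measures, grouped members, custom hierarchy levels) can be sketched in a few lines of Python. The data, field names, and region grouping are illustrative; the point is that nothing in the underlying source changes:

```python
# Source rows, as they might come from a trusted data source (illustrative).
rows = [
    {"country": "Finland", "sales": 120, "cost": 80},
    {"country": "Sweden",  "sales": 200, "cost": 150},
    {"country": "Japan",   "sales": 310, "cost": 240},
]

# A user-defined grouping of dimension members into a new field ("region"),
# analogous to grouping related data points without touching the source.
REGION = {"Finland": "EMEA", "Sweden": "EMEA", "Japan": "APAC"}

def enrich(row):
    """Add a calculated measure (profit) and a grouped dimension (region)."""
    out = dict(row)
    out["profit"] = row["sales"] - row["cost"]
    out["region"] = REGION.get(row["country"], "Other")
    return out

enriched = [enrich(r) for r in rows]

# Aggregate along the new custom hierarchy level (region -> country).
profit_by_region = {}
for r in enriched:
    profit_by_region[r["region"]] = profit_by_region.get(r["region"], 0) + r["profit"]
print(profit_by_region)  # {'EMEA': 90, 'APAC': 70}
```

In a modern platform the evaluation question is whether this happens in the flow of analysis, visually, rather than in a separate modeling tool or script.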

The second scenario to consider is from the perspective of a content creator who has new questions that cannot be addressed by any available dashboards or any trusted data sources within the environment. The following considerations should be the focus of the evaluation for this scenario:

Evaluation criteria:

Content creators should be able to:

  • Ingest and model data that is not yet in a trusted state to explore and discover new insights.
  • Combine trusted and untrusted data to create new data sources for analysis.
  • Use existing and newly created data sources to build new analytic content to share and promote.
  • Modify existing analytic content based on new findings resulting from discovery and exploration.
  • Create a guided analytics experience to facilitate broader use by information consumers.

Evaluation considerations:

  • Can a user connect to data sources that are not currently being centrally governed?
  • Does the platform offer broad connectivity options to include structured and unstructured data sources for ingestion and analysis?
  • Can a content creator perform all of the analyze and discover tasks included in the information consumer section against new, untrusted data sources?
  • Can a user virtually extend a trusted data source without modifying the underlying data structure or load process?
  • Can a user build new analytic content using a new data source or one that is a hybrid of trusted and untrusted data?
  • Can a user create alternate versions of governed content for sharing and track the lineage of changes over time?
  • Can a user redirect the underlying data connection of governed analytic content to use a newly created or enhanced source with no downstream impact?
  • Can a user build programmatic controls into analytic content to facilitate interaction and provide a guided experience to a broad audience of information consumers?
  • Can a user create and save style sheets or design themes to apply in the creation of other content?


Share and collaborate

The approach to sharing content has evolved. Within traditional BI platforms, sharing meant the delivery of static printed or exported reports to an inbox or a user’s desk. Within the modern analytics approach, sharing now includes collaboration and the social interactions we have grown accustomed to in our other business tools. This shift is driven by the simple fact that information is outdated as soon as a report is printed or exported, which does not align with the needs of today’s consumer seeking the latest information. Some aspects of content sharing involve making information broadly available to users, while others entail collaboration as a core component of the analysis process. Both scenarios are included in the evaluation criteria for this section.

The push model of making information accessible to a broad range of users will be addressed first. This is more reminiscent of the traditional approach; however, modern platforms should also enable organizations to make information broadly accessible to a wide range of internal and external users. Many of these tasks fall into the domain of the IT/BI professional, and the following criteria should be evaluated from that perspective.

Evaluation criteria:

IT/BI professionals should be able to:

  • Deliver content on any form factor used throughout the organization.
  • Embed analytic content for broader access and contextual use.
  • Provide for external access and consumption.

Evaluation considerations:

  • Can analytic content be rendered in any form factor that may be used across the organization to access the environment? This would include tablets, phones, laptops, large displays, etc.
  • Can analytic content be embedded in an organization’s web portals and applications that users access as part of their normal business processes?
  • Can analytic content be shared to external consumers who are outside the corporate firewall?
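To make the embedding question concrete, the sketch below generates a portal page that frames a dashboard by URL. The URL, page structure, and parameters are purely illustrative; most platforms supply their own embed codes or a JavaScript API for this purpose, and that native support is what the evaluation should verify.

```python
from string import Template

# Illustrative portal page that embeds a dashboard via an iframe.
EMBED_PAGE = Template("""\
<html>
  <body>
    <h1>$title</h1>
    <iframe src="$dashboard_url"
            width="$width" height="$height" frameborder="0"></iframe>
  </body>
</html>""")

def embed_page(title, dashboard_url, width=800, height=600):
    """Render a minimal portal page embedding the given dashboard URL."""
    return EMBED_PAGE.substitute(title=title, dashboard_url=dashboard_url,
                                 width=width, height=height)

page = embed_page("Sales Overview",
                  "https://bi.example.com/views/sales/overview")
print(page)
```

A platform's embedding story should additionally cover single sign-on and interactivity inside the host application, which a bare iframe like this does not address.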

The second scenario is that of true collaboration, where both trusted and untrusted content is discussed, reviewed, and validated at the peer-to-peer, workgroup, or enterprise level. This collaboration should be an integral step in the process of deriving new insights and serve as an input into the governance process. The primary participant in this scenario is the content creator, whose perspective should be used to evaluate the following criteria.

Evaluation criteria:

Content creators should be able to:

  • Collaborate with others on the development and validation of analytic content.
  • Annotate and discuss findings in a social media-style conversation.
  • Follow specific content types or content authors.
  • Provide quality ratings for specific analytic content.
  • Create storyboards to share findings and insights.
  • Add descriptive narratives to augment and enhance visual content.

Evaluation considerations:

  • Can users across the organization collaborate on shared content in real-time to discuss and elaborate on findings?
  • Can users annotate and provide comments directly within the content using any form factor?
  • Can users follow a conversation via a timeline to track lineage of the conversation and view a snapshot of the content being discussed as it looked when a comment was added?
  • Can users follow specific users within an organization and receive updates and notifications of their activity?
  • Can users follow and track specific topics or types of content and receive updates and notifications when new content meeting the criteria is published?
  • Can users rate content either through a rating system or by using social media-style “likes”?
  • Can users create stories to represent a logical sequence of findings for the purpose of walking another user through the analytics journey?
  • Can users integrate descriptive narratives to augment the visual content within an analysis, either manually or automatically through platform capabilities?

Promote and govern

There are various approaches to governance, and every organization will fall at a different point on the spectrum, ranging from an IT-led, highly governed, and controlled environment to one with little to no controls, with many organizations landing somewhere in between. Often, even within one organization, governance requirements may vary depending on the needs of users in a given area as well as on the data itself.

When choosing a modern analytics platform, flexibility is important to consider in order to meet those varying needs of the business and to ensure that you can alter your governance needs as you scale. An organization may choose to facilitate the transition from traditional to modern by initially using the modern platform in a traditional manner then gradually expanding the range of capabilities that are accessible to users through self-service. It is equally as important to evaluate a platform’s distinct capabilities in the separate but related areas of data governance and analytics governance (as depicted below) to ensure that an adequate amount of flexibility is afforded within the platform to put the most appropriate governance model in place and adjust over time as needed.

For most modern analytics use cases, a self-service-driven organic approach to governance will lead to greater adoption, deeper insights, and improved business outcomes. As such, this is the approach that should be considered primary for the purpose of this evaluation.

In this approach, a subset of content creators, referred to as information stewards in this guide, are primarily responsible for defining and navigating the overall governance process. The sections below will consider aspects of both data governance as well as analytics governance from the perspective of the content creators as well as the IT/BI professional.

Data governance

The task of defining and ensuring compliance with an organization’s governance framework is a core responsibility of the content creator, and as such, the following data governance-related items should be considered from that perspective:

Evaluation criteria:

Content creators should be able to:

  • Define, manage, and update data models used for analysis (data source management).
  • Autonomously define, update, and expose field-level metadata to users (metadata management).
  • Centrally capture and expose data-cleansing and enrichment rules applied to published data models (data enrichment and data quality).
  • Monitor and track usage metrics of centrally-defined data models (monitoring & management).

Evaluation considerations:

  • Can an information steward publish a data model into the system-of-record environment for broader use across the organization?
  • Can a published data model be augmented with validated user-defined fields through a promotion process?
  • Can an information steward visibly mark trusted data models with a watermark?
  • Can a published data model be virtually extended with additional sources/data elements without impacting downstream content and/or users?
  • Can an impact assessment be conducted prior to any data model changes?
  • Can a content creator add and update descriptive metadata to dimensions and measures within a published data model?
  • Can business rules and data transformations used to create and populate published data models be exposed to end users?
  • Can data model changes be tracked and audited and reverted if needed?
  • Can an information steward access utilization statistics and platform capabilities to identify data-model attribute redundancy, inconsistency, non-use, etc.?

Administration and enablement of the entire governance process is largely the responsibility of the IT/BI professional, and as such, the following data governance-related items should be considered from that perspective:

Evaluation criteria:

IT/BI professionals should be able to:

  • Define security parameters and access controls to published data models (data security).
  • Monitor and audit usage to ensure compliance and appropriate use of data assets (monitoring & management).
  • Create new data models as needed to enforce consistency across departments and information stewards (data source management).
  • Comply with the organization’s overarching data strategy (data source management).

Evaluation considerations:

  • Can security be inherited from source systems where applicable?
  • Can an administrator allow/deny access at the user/group level for each data source?
  • Can access rights be defined at the row level to allow a user to access a subset of data for each data source?
  • Can an administrator define specific roles and privileges for each user in the system to control who can create, edit, and promote shared data sources?
  • Can system-wide usage be tracked and analyzed by an administrator?
  • Can an administrator access a system-wide view of the environment to identify redundancies and inconsistencies across data models being managed by individual information stewards?
  • Can an administrator create a new data source and seamlessly switch downstream users and analytic content to reference it in place of an existing source?
  • Can an administrator decide the most appropriate storage strategy for data required by the analytics platform based on an organization’s reference architecture?
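The row-level access question above can be sketched simply: each user sees only the subset of rows their entitlement allows. The data and entitlement table below are illustrative; real platforms typically enforce this in the data source or through user filters tied to groups, and the evaluation should confirm the enforcement is automatic for all downstream content.

```python
# Illustrative data set and per-user entitlements ("*" means all rows).
ROWS = [
    {"region": "EMEA", "revenue": 500},
    {"region": "APAC", "revenue": 300},
    {"region": "AMER", "revenue": 700},
]

ENTITLEMENTS = {
    "admin": {"*"},
    "emea_manager": {"EMEA"},
    "analyst": {"EMEA", "APAC"},
}

def visible_rows(user, rows, entitlements):
    """Return only the rows the user is entitled to see; unknown users see none."""
    allowed = entitlements.get(user, set())
    if "*" in allowed:
        return list(rows)
    return [r for r in rows if r["region"] in allowed]

print(len(visible_rows("admin", ROWS, ENTITLEMENTS)))            # 3
print([r["region"] for r in visible_rows("analyst", ROWS, ENTITLEMENTS)])
```

The key evaluation point is that this filtering must be defined once, at the data-source level, rather than re-implemented in every dashboard.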

Analytics governance

The responsibility of defining and ensuring compliance with an organization’s governance framework is a core responsibility of the content creator, and as such, the following analytics governance-related items should be considered from that perspective:

Evaluation criteria:

Content creators should be able to:

  • Access platform capabilities to assist with validation and accuracy verification of user-generated analytic content (content validation).
  • Promote validated analytic content to a centralized, trusted environment as determined by the governance process (content promotion).
  • Certify content as trusted and delineate from untrusted content in the same environment (content certification).
  • Monitor and audit usage of published content and track usage of untrusted content (content usage monitoring).

Evaluation considerations:

  • Can an information steward access and reference benchmark data stored in the platform to validate the accuracy of content being evaluated for promotion?
  • Can user-developed content be promoted to a shared environment for broader consumption?
  • During the promotion process, can underlying data sources be redirected to reference trusted data models already published?
  • Can watermarks be applied to published analytic content to indicate that it has been certified and can be trusted?
  • Can an information steward access and analyze usage metrics of published content, both trusted and untrusted, to ensure appropriate use?

Administration and enablement of the entire governance process is largely the responsibility of the IT/BI professional, and as such, the following analytics governance-related items should be considered from that perspective:

Evaluation criteria:

IT/BI professionals should be able to:

  • Create and maintain an environment for storing and organizing published content (content management).
  • Secure analytic content and grant users appropriate levels of access based on content type, sensitivity, business need, etc. (security, permissions & access controls).
  • Monitor broad usage patterns across organizational business units (content usage monitoring).

Evaluation considerations:

  • Can the environment be customized to suit the needs and preferences of the organization related to content organization and overall management?
  • Can the IT/BI professional enable access to the platform’s content through the organization’s portals to leverage existing content-management investments?
  • Can security be applied at a granular level to allow/deny users access to specific analytic content?
  • Can security defined at the data-model level be enforced for all downstream analytic content automatically?
  • Can usage patterns and consumption preferences be tracked and analyzed to provide an administrator an overall assessment of the environment and how it is being used?

The shift from traditional BI platforms to modern analytics platforms is one that is necessary to truly realize the impact data can have on an organization. Modern analytics platforms bring together self-service and governance to empower the entire organization with trusted data to gain insights into the business. These platforms should be evaluated through a different lens as they break the mold of traditional, IT-run BI platforms.

Tableau, a proven leader in the modern analytics space, enables organizations to explore trusted data in a secure and scalable environment. It gives people access to intuitive visual analytics, interactive dashboards, and limitless ad hoc analyses that reveal hidden opportunities and eureka moments alike. It also provides the security, governance, and management you require to confidently integrate Tableau into your business, on-premises or in the cloud, and deliver the power of true self-service analytics at scale.

About the authors: 

Josh Parenteau

Market Intelligence Director, Tableau

About Solutive Oy

Tableau NEMEA Partner of the Year nominee 2017, Tableau EMEA Partner of the Year nominee 2015, Tableau EMEA Rookie Partner of the Year 2014.

As one of Europe’s leading Tableau partners, we make data visible, understandable, and useful. We transform our customers’ ability to leverage data in managing and developing their operations: from rapid prototypes all the way to scalable, enterprise-level analytics.
