Organizations are collecting data from multiple sources and a variety of systems to enrich their analytics and business intelligence (BI). But collecting data is only half of the equation. As the data grows, it becomes challenging to find the right data at the right time. Many organizations can’t take full advantage of their data lakes because they don’t know what data actually exists. Also, there are more regulations and compliance requirements than ever before. It is critical for organizations to understand the kind of data they have, who is handling it, what it is being used for and how it needs to be protected. They also have to avoid putting too many layers and wrappers around the data, as these can make the data difficult to access. These challenges create a need for more automated ways to discover, track, research and govern data.
I have written previously that the world of data and analytics will become more and more centered around real-time, streaming data. Data is created constantly, and increasingly it is collected as it is created. Technology advances now enable organizations to process and analyze information as it is being collected to respond in real time to opportunities and threats. Not all use cases require real-time analysis and response, but many do, including multiple use cases that can improve customer experiences. For example, best-in-class e-commerce interactions should provide real-time updates on inventory status to avoid stock-out or back-order situations. Customer service interactions should provide real-time recommendations that minimize the time to resolution. Location-based offers should be targeted at the customer’s current location, not their location several minutes ago. Another domain where real-time analyses are critical is internet of things (IoT) applications. Additionally, use cases like predictive maintenance require timely information to prevent equipment failures, helping avoid additional costs and damage.
For years, maybe decades, we have heard about the struggles between IT and line-of-business functions. In this perspective, we will look at some of the data from our Analytics and Data Benchmark Research about the roles of IT and line-of-business teams in analytics and data processes. We will also look at some of the disconnects between these two groups. And, by looking at how organizations are operating today and the results they are achieving, we can discern some of the best practices for improving the outcomes of analytics and data processes.
Despite all the advances organizations have made with respect to analytics, our most recent research shows that in the majority of organizations, the majority of the workforce is not using analytics and business intelligence (BI). Fewer than one-quarter (23%) of organizations report that one-half or more of their workforce is using analytics and BI. This is a problem. It means organizations are not enabling their workforce to perform at peak efficiency and effectiveness. It means the workforce in many organizations does not have access to the same information by which they are being measured. It means organizations must find other ways to communicate with, and manage, the workforce.
Many organizations invest in data governance out of concern over misuse of data or potential data breaches. These are important considerations and valid aspects of data governance programs. However, good data governance also has positive impacts on organizations. For example, I have previously written about the valuable connection between the use of data catalogs and satisfaction with an organization’s data lake. Our most recent Analytics and Data Benchmark Research demonstrates some of the beneficial links between data governance and analytics. In this Perspective, I’ll share some of the correlations identified in our research.
Organizations of all sizes are dealing with exponentially increasing data volume and data sources, which creates challenges such as siloed information, increased technical complexity across various systems and slow reporting of important business metrics. Migrating to the cloud does not solve the problems associated with performing analytics and business intelligence on data stored in disparate systems. Also, processing large volumes of data requires clusters of servers with hundreds or thousands of nodes that can be difficult to administer. Our Analytics and Data Benchmark Research shows that organizations have concerns about current analytics and BI technology. Findings include difficulty integrating data with other business processes, systems that are not flexible enough to scale operations and trouble accessing data from various data sources.
I’m proud to share Ventana Research’s 2022 Market Agenda for Digital Technology. Our focus in this agenda is to deliver expertise to help organizations prioritize technology investments that increase workforce effectiveness and organizational agility, ensuring ongoing operations during any type of disruption.
Organizations today have huge volumes of data across various cloud and on-premises systems, and those volumes keep growing by the second. To derive value from this data, organizations must query the data regularly and share insights with relevant teams and departments. Automating this process using natural language processing (NLP) and artificial intelligence and machine learning (AI/ML) enables line-of-business personnel to query the data faster, generate reports themselves without depending on IT, and make quick decisions. Some organizations have started using NLP in self-service analytics to quickly identify patterns and simplify data visualization. Our Analytics and Data Benchmark Research finds that about 81% of organizations expect to use natural language search for analytics to make timely and informed decisions.
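To make the idea concrete, here is a minimal sketch of how a natural language question can be translated into a database query. The table, column names and keyword patterns are hypothetical; production NLP-driven analytics tools use trained language models and a semantic layer rather than hand-written rules, but the shape of the translation step is similar.

```python
import re

# Hypothetical mapping of question phrases to SQL templates. A real
# natural-language query tool would use a language model plus metadata
# about the organization's schema instead of keyword rules.
PATTERNS = [
    (re.compile(r"total sales by region", re.I),
     "SELECT region, SUM(amount) FROM sales GROUP BY region;"),
    (re.compile(r"top (\d+) customers", re.I),
     "SELECT customer, SUM(amount) AS total FROM sales "
     "GROUP BY customer ORDER BY total DESC LIMIT {n};"),
]

def question_to_sql(question: str) -> str:
    """Translate a natural-language question into SQL, if a pattern matches."""
    for pattern, template in PATTERNS:
        match = pattern.search(question)
        if match:
            # Substitute any captured value (e.g. "top N") into the template.
            n = match.group(1) if match.groups() else None
            return template.format(n=n) if n else template
    raise ValueError(f"No translation available for: {question!r}")
```

With rules like these, a business user can type "show total sales by region" and receive a report without writing SQL, which is the self-service benefit the research describes.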
Organizations today are working with multiple applications and systems, including enterprise resource planning (ERP), customer relationship management (CRM), supply chain management (SCM) and other systems, where data can easily become fragmented and siloed. And as the organization increases its data sources and adds more systems and custom applications, it becomes challenging to manage the data consistently and keep data definitions up to date. This increases the need to use master data management (MDM) software that can provide a single source of truth to drive accurate analytics and business operations.
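The core mechanic of MDM, consolidating fragmented records into a single source of truth, can be sketched in a few lines. The ERP and CRM extracts, the email-based match key and the "first non-empty value wins" survivorship rule below are all illustrative assumptions; commercial MDM software adds fuzzy matching, configurable survivorship rules and stewardship workflows.

```python
# Minimal sketch of master data consolidation: duplicate customer records
# from hypothetical ERP and CRM extracts are merged into one "golden"
# record, keyed by a normalized email address.
def consolidate(records):
    golden = {}
    for rec in records:
        key = rec["email"].strip().lower()   # simple, illustrative match key
        merged = golden.setdefault(key, {})
        for field, value in rec.items():
            # Survivorship rule (assumed): first non-empty value wins.
            if value and not merged.get(field):
                merged[field] = value
    return list(golden.values())

# Fabricated sample extracts: the same customer appears in both systems.
erp = [{"email": "Ana@x.com", "name": "Ana Ruiz", "phone": ""}]
crm = [{"email": "ana@x.com ", "name": "", "phone": "555-0100"}]
```

Running `consolidate(erp + crm)` yields one record combining the name from the ERP system with the phone number from the CRM system, which is the single source of truth that downstream analytics and business operations can rely on.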
TIBCO is a large, independent cloud computing and data analytics software company that offers integration, analytics, business intelligence and event-processing software. It enables organizations to analyze streaming data in real time and provides the capability to automate analytics processes. It offers more than 200 connectors and more than 200 enterprise cloud computing and application adapters, and supports more than 30 non-relational (NoSQL) databases, relational database management systems and data warehouses.