Ventana Research recently published the findings of our benchmark research on Data Preparation, which examines the practices organizations use to accomplish data preparation. We view data preparation as a sequence of steps: identifying, locating and then accessing the data; aggregating data from different sources; and enriching, transforming and cleaning it to create a single uniform data set. Using data to accomplish organizational goals requires that it be prepared for use; to do this job properly, businesses need flexible tools that enable them to enrich the context of data drawn from multiple sources and collaborate on its preparation as well as ensure security and consistency. Users of data preparation tools range from analysts to operations professionals in the lines of business to IT professionals.
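To make that sequence concrete, here is a minimal sketch in Python using pandas. The source data, column names and business rules are hypothetical illustrations of the identify-aggregate-enrich-clean steps, not a depiction of any particular product or of the benchmark research itself.

```python
import pandas as pd

# Step 1: identify, locate and access the data (two hypothetical in-memory
# sources standing in for, say, a CRM extract and an order-system extract).
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "name": ["Acme Corp", "  beta llc", "GAMMA INC"],
    "region": ["West", "east", None],
})
orders = pd.DataFrame({
    "cust_id": [101, 102, 102, 104],
    "order_total": [1200.0, 450.0, 300.0, 975.0],
})

# Step 2: aggregate data from the different sources into one view.
order_totals = orders.groupby("cust_id", as_index=False)["order_total"].sum()
combined = (crm.merge(order_totals, left_on="customer_id",
                      right_on="cust_id", how="left")
               .drop(columns="cust_id"))

# Step 3: enrich, transform and clean to produce a single uniform data set.
combined["name"] = combined["name"].str.strip().str.title()      # normalize text
combined["region"] = combined["region"].str.title().fillna("Unknown")
combined["order_total"] = combined["order_total"].fillna(0.0)
combined["segment"] = pd.cut(combined["order_total"],             # enrichment
                             bins=[-1, 500, float("inf")],
                             labels=["SMB", "Enterprise"])

print(combined)
```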
I recently attended SAP TechEd in Las Vegas to hear the latest from the company regarding its analytics and business intelligence offerings as well as its data management platform. The company used the event to launch SAP Data Hub and made several other data and analytics announcements that I’ll cover below.
Many organizations continue to struggle with preparing data for use in operational and analytical processes. We see these issues reported in our Data and Analytics in the Cloud benchmark research, where 55 percent of organizations identify data preparation as the most time-consuming task in their analytical processes. Similarly, in our Next-Generation Predictive Analytics research, 62 percent of companies report dissatisfaction because the data they need to access or integrate is not readily available. In our Big Data Integration research, 52 percent report that in working with big data integration processes they spend the most time reviewing data for quality and consistency. And nearly half of companies (48%) report this same issue in our Internet of Things research. We are currently conducting further research into this critical issue with our Data Preparation benchmark research.
Informatica reintroduced itself to the world at its recent customer conference, Informatica World, in San Francisco. The company took advantage of the event to showcase its new branding in an effort to change the way customers think about the company. Informatica has been providing information services in the cloud for more than a decade. Even though cloud revenue comprises a minority of Informatica’s business, in absolute terms, the revenue is significant, and company executives want the public to recognize Informatica as a leader in cloud-based data management services for enterprises. Presenters also made notable product announcements, discussed below, including the application of machine learning to the data management process.
Topics: Big Data, Data Science, Analytics, Business Intelligence, Cloud Computing, Data Governance, Data Integration, Data Preparation, Information Optimization, Machine Learning and Cognitive Computing, Digital Technology
I recently attended SAS Institute’s analyst relations conference, where the company provided updates on its financial performance and its Viya platform and offered a glimpse into some of its future plans.
Topics: Big Data, Data Science, Mobile Technology, Business Intelligence, Analytics, Cloud Computing, Collaboration, Data Governance, Data Integration, Data Preparation, Internet of Things, Information Optimization, Machine Learning and Cognitive Computing, Digital Technology
Big data initially was characterized in terms of “the three V’s”: volume, velocity and variety. Nearly five years ago I wrote about the three V’s as a way to explain why new and different technologies were needed to deal with big data. Since then the industry has tackled many of the technical challenges associated with the three V’s. In 2017 I propose that we focus instead on a different letter: the four A’s of analytics, awareness, anticipation and action. I’ll explain why each is important at this stage of big data evolution.
Big data has become an integral part of information management. Nearly all organizations have some need to access big data sources and produce actionable information for decision-makers. Recognizing this connection, we merged these two topics when we put together our recently published research agendas for 2017. As we plan our research, we focus on current technologies and how they can be used to improve an organization’s performance. We then share those results with our readers.
Topics: Big Data, Data Science, Analytics, Data Governance, Data Integration, Data Preparation, Information Management, Internet of Things, Machine Learning and Cognitive Computing, Digital Technology
The business intelligence market is bounded on one side by big data and on the other side by data preparation. That is, to maximize their performance in using information, organizations have to collect and analyze ever-increasing volumes of data while the tools available are constantly evolving in the big data ecosystem that I have written about. In our benchmark research on big data analytics, half (51%) of organizations said they want to access big data using their existing BI tools. At the same time, as I have noted, end users are demanding self-service access to data preparation capabilities to facilitate their analyses.
Data preparation is critical to the effectiveness of both operational and analytic business processes. Operational processes today are fed by streams of constantly generated data. Our Data and Analytics in the Cloud benchmark research shows that more than half (55%) of organizations spend the most time in their analytic processes preparing data for analysis – a situation that reduces their productivity. Data now comes from more sources than ever, at a faster pace and in a dizzying array of formats; it often contains inconsistencies in both structure and content.
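As a small illustration of that inconsistency, and of why preparation tools must reconcile both structure and content, here is a hedged Python sketch; the field names, formats and cleanup rules are hypothetical examples, not drawn from the research.

```python
import pandas as pd

# Hypothetical records from two sources with inconsistent field names and formats.
raw_records = [
    {"CustomerID": "0101", "Revenue": "$1,200.00", "Date": "2017-03-01"},
    {"cust_id": "102", "revenue": "450", "date": "03/02/2017"},
]

def normalize(record):
    """Map one source record onto a uniform structure with consistent types."""
    fields = {k.lower(): v for k, v in record.items()}
    return {
        "customer_id": int(fields.get("customerid") or fields.get("cust_id")),
        "revenue": float(str(fields["revenue"]).replace("$", "").replace(",", "")),
        "date": pd.to_datetime(fields["date"]),
    }

uniform = pd.DataFrame([normalize(r) for r in raw_records])
print(uniform)
```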
Qlik helped pioneer the visual discovery market with its QlikView product. In some respects, Qlik and its competitors also spawned the self-service trend rippling through the analytics market today. Their aim was to enable business users to perform analytics for themselves rather than building a product with the perfect set of features for IT. After establishing success with end users, the company began to address more of the concerns of IT, eventually creating a robust enterprise-grade analytics platform. This approach has worked for Qlik, driving growth that led to an initial public offering in 2010. The company now generates more than half a billion dollars in revenue annually, making it one of the largest independent analytics vendors. Based on its company and products, Qlik was rated a Hot Vendor in our 2015 Value Index on Analytics and Business Intelligence and was one of the highest ranked in usability.