Ventana Research recently announced its 2020 research agenda for data, continuing the guidance we’ve offered for nearly two decades to help organizations derive optimal value and improve business outcomes. Data volumes continue to grow while data latency requirements continue to shrink. Meanwhile, virtually every organization is confronting a need for good data governance.
Organizations now must store, process and use data of significantly greater volume and variety than in the past. These factors plus the velocity of data today — the unrelentingly rapid rate at which it is generated, both in enterprise systems and on the internet — add to the challenge of getting the data into a form that can be used for business tasks.
Domopalooza 2019 marked Domo's first annual user conference since going public, but the energy, excitement and new feature announcements have not slowed. With thousands in attendance and growing fast, this year's conference focused on five key areas: digitization, real-time connectivity, driving insight-based actions, applying AI and machine learning, and building applications. All of these announcements are aimed at broadening the workloads supported by Domo.
IBM's Analytics University (held in both Miami and Stockholm) brought some large changes. Big announcements this year included the consolidation of IBM's Watson Analytics into Cognos 11.1, providing some clarity to the company's analytics offerings, along with new visualizations and better data preparation. This includes a new conversational assistant that helps generate narrative explanations of displays and interactive queries. For the full breakdown of IBM's Analytics University 2018, and my analysis of all the largest announcements, watch my latest hot take.
Once again I attended Tableau's user conference, along with 17,000 other attendees, affectionately self-referred to as "data nerds." Pushing the envelope in data capabilities and access, Tableau introduced the "Ask Data" feature, allowing users to pose natural language queries and receive a response, along with new data preparation capabilities and other enhancements to help data analysts. Further, Tableau announced new developer enhancements, including a new developer program to better align tools built for Tableau with Tableau's interface. For the full breakdown of Tableau Conference 2018, and my analysis of all the largest announcements, watch my hot take video.
This year, Teradata rebranded its user conference from "Partners" to "Analytics Universe," and for good reason. For decades, Teradata has represented the high end of the analytic database market, but new innovations and technologies are adding flexibility to Teradata's licensing as it competes. For the full breakdown of Teradata's Analytics Universe 2018, and my analysis of all the largest announcements, watch my hot take video.
In 2017 Strata + Hadoop World was changed to the Strata Data Conference. As I pointed out in my coverage of last year’s event, the focus was largely on machine learning and artificial intelligence (AI). That theme continued this year, but my impression of the event was of a community looking to get value out of data regardless of the technology being used to manage that data. The change was subtle: The location was the same; the exhibitors were largely the same; attendance was similar this year and last. But there was no particular vendor or technology dominating the event.
Ventana Research recently published the findings of our benchmark research on Data Preparation, which examines the practices organizations use to accomplish data preparation. We view data preparation as a sequence of steps: identifying, locating and then accessing the data; aggregating data from different sources; and enriching, transforming and cleaning it to create a single uniform data set. Using data to accomplish organizational goals requires that it be prepared for use; to do this job properly, businesses need flexible tools that enable them to enrich the context of data drawn from multiple sources and collaborate on its preparation as well as ensure security and consistency. Users of data preparation tools range from analysts to operations professionals in the lines of business to IT professionals.
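The sequence of steps above can be sketched in code. This is a minimal illustration using pandas, with hypothetical file contents and column names standing in for two separate enterprise sources; it is not drawn from the benchmark research itself.

```python
import pandas as pd
from io import StringIO

# Hypothetical source data standing in for two separate systems
# (e.g., a CRM export and an ERP export).
crm_csv = StringIO("customer_id,region,revenue\n1,East,100\n2,West,200\n2,West,200\n3,,150\n")
erp_csv = StringIO("customer_id,orders\n1,4\n2,7\n3,2\n")

# Step 1: identify, locate and access the data.
crm = pd.read_csv(crm_csv)
erp = pd.read_csv(erp_csv)

# Step 2: aggregate data from the different sources.
combined = crm.merge(erp, on="customer_id", how="left")

# Step 3: clean (remove duplicates, fill gaps) and enrich
# (derive a new column) to create a single uniform data set.
combined = combined.drop_duplicates()
combined["region"] = combined["region"].fillna("Unknown")
combined["revenue_per_order"] = combined["revenue"] / combined["orders"]

print(combined)
```

In practice, each step is where the flexibility and collaboration the research calls for come into play: analysts, operations professionals and IT each touch different parts of this pipeline.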
I recently attended SAP TechEd in Las Vegas to hear the latest from the company regarding its analytics and business intelligence offerings as well as its data management platform. The company used the event to launch SAP Data Hub and made several other data and analytics announcements that I’ll cover below.
Many organizations continue to struggle with preparing data for use in operational and analytical processes. We see these issues reported in our Data and Analytics in the Cloud benchmark research, where 55 percent of organizations identify data preparation as the most time-consuming task in their analytical processes. Similarly, in our Next-Generation Predictive Analytics research, 62 percent of companies report dissatisfaction because the data they need to access or integrate is not readily available. In our Big Data Integration research, 52 percent report that in working with big data integration processes they spend the most time reviewing data for quality and consistency. And nearly half of companies (48%) report this same issue in our Internet of Things research. We are currently conducting further research into this critical issue with our Data Preparation benchmark research.