IBM's Analytics University (held in both Miami and Stockholm) brought about some large changes. Big announcements this year included the consolidation of IBM's Watson Analytics into Cognos 11.1, providing some clarity to its analytics offerings, along with new visualizations and improved data preparation. The release also includes a new conversational assistant that generates narrative explanations of displays and supports interactive queries. For the full breakdown of IBM's Analytics University 2018, and my analysis of all the largest announcements, watch my latest hot take.
Once again I attended the Tableau User Conference, along with 17,000 other attendees who affectionately refer to themselves as "data nerds". Pushing the envelope in data capabilities and access, Tableau introduced the "Ask Data" feature, which allows users to pose natural language queries and receive a response, along with new data preparation capabilities and other enhancements for data analysts. Tableau also announced developer enhancements, including a new developer program to better align tools built for Tableau with Tableau's interface. For the full breakdown of Tableau User Conference 2018, and my analysis of all the largest announcements, watch my hot take video.
This year, Teradata rebranded its users conference from "Partners" to "Analytics Universe", and there is a reason for it. For decades, Teradata has represented the high end of the analytic database market, but new innovations and technologies are adding flexibility to Teradata's licensing as it competes. For the full breakdown of Teradata's Analytics Universe 2018, and my analysis of all the largest announcements, watch my hot take video.
In 2017 Strata + Hadoop World was changed to the Strata Data Conference. As I pointed out in my coverage of last year’s event, the focus was largely on machine learning and artificial intelligence (AI). That theme continued this year, but my impression of the event was of a community looking to get value out of data regardless of the technology being used to manage that data. The change was subtle: The location was the same; the exhibitors were largely the same; attendance was similar this year and last. But there was no particular vendor or technology dominating the event.
Topics: Big Data, Data Science, Machine Learning, Analytics, Business Intelligence, Data Governance, Data Integration, Data Preparation, Information Optimization, Digital Technology, Machine Learning and Cognitive Computing
Ventana Research recently published the findings of our benchmark research on Data Preparation, which examines the practices organizations use to accomplish data preparation. We view data preparation as a sequence of steps: identifying, locating and then accessing the data; aggregating data from different sources; and enriching, transforming and cleaning it to create a single uniform data set. Using data to accomplish organizational goals requires that it be prepared for use; to do this job properly, businesses need flexible tools that enable them to enrich the context of data drawn from multiple sources and collaborate on its preparation as well as ensure security and consistency. Users of data preparation tools range from analysts to operations professionals in the lines of business to IT professionals.
I recently attended SAP TechEd in Las Vegas to hear the latest from the company regarding its analytics and business intelligence offerings as well as its data management platform. The company used the event to launch SAP Data Hub and made several other data and analytics announcements that I’ll cover below.
Many organizations continue to struggle with preparing data for use in operational and analytical processes. We see these issues reported in our Data and Analytics in the Cloud benchmark research, where 55 percent of organizations identify data preparation as the most time-consuming task in their analytical processes. Similarly, in our Next-Generation Predictive Analytics research, 62 percent of companies report dissatisfaction because the data they need to access or integrate is not readily available. In our Big Data Integration research, 52 percent report that, in working with big data integration processes, they spend the most time reviewing data for quality and consistency. And nearly half of companies (48%) report this same issue in our Internet of Things research. We are currently conducting further research into this critical issue with our Data Preparation benchmark research.
Informatica reintroduced itself to the world at its recent customer conference, Informatica World, in San Francisco. The company took advantage of the event to showcase its new branding in an effort to change the way customers think about the company. Informatica has been providing information services in the cloud for more than a decade. Even though cloud revenue comprises a minority of Informatica’s business, in absolute terms, the revenue is significant, and company executives want the public to recognize Informatica as a leader in cloud-based data management services for enterprises. Presenters also made notable product announcements, discussed below, including the application of machine learning to the data management process.
Topics: Big Data, Data Science, Analytics, Business Intelligence, Cloud Computing, Data Governance, Data Integration, Data Preparation, Information Optimization, Machine Learning and Cognitive Computing, Digital Technology
I recently attended SAS Institute’s analyst relations conference. There the company provided updates on its financial performance and its Viya platform and a glimpse into some of its future plans.
Topics: Big Data, Data Science, Mobile Technology, Business Intelligence, Analytics, Cloud Computing, Collaboration, Data Governance, Data Integration, Data Preparation, Internet of Things, Information Optimization, Machine Learning and Cognitive Computing, Digital Technology
Big data initially was characterized in terms of "the three V's": volume, velocity and variety. Nearly five years ago I wrote about the three V's as a way to explain why new and different technologies were needed to deal with big data. Since then the industry has tackled many of the technical challenges associated with the three V's. In 2017 I propose that we focus instead on a different letter, the four A's: analytics, awareness, anticipation and action. I'll explain why each is important at this stage of big data evolution.