Big data initially was characterized in terms of “the three V’s” — volume, velocity and variety. Nearly five years ago I wrote about the three V’s as a way to explain why new and different technologies were needed to deal with big data. Since then the industry has tackled many of the technical challenges associated with the three V’s. In 2017 I propose that we focus instead on a different letter: A, for analytics, awareness, anticipation and action. I’ll explain why each is important at this stage of big data evolution.
Ventana Research analysts recently published our research agendas for 2017. As we put together these plans, we think about the forces that are shaping the markets we cover and then craft agendas that study these issues to provide insights for our community. I’ve been working in the business intelligence (BI) and analytics market for nearly 25 years, and throughout that time the industry has been trying to make analytics useful to increasingly wider audiences. That focus continues today. Better search and presentation methods, including visual discovery and natural-language processing, are promising ways to engage more users. We also see organizations supporting their users in specific functional roles with relevant and accessible analytics. My colleagues examine these issues as part of their agendas in the Office of Finance, Sales, Marketing, Customer Experience, Operations and Supply Chain, and Human Capital Management. While their agendas include analytics within specific domains, my own research focuses on a range of analytics issues across domains including cloud computing, mobility, collaboration, data science and the Internet of Things.
The business intelligence market is bounded on one side by big data and on the other by data preparation. That is, to maximize their performance in using information, organizations have to collect and analyze ever-increasing volumes of data while the tools available are constantly evolving in the big data ecosystem that I have written about. In our benchmark research on big data analytics, half (51%) of organizations said they want to access big data using their existing BI tools. At the same time, as I have noted, end users are demanding self-service access to data preparation capabilities to facilitate their analyses.
The big data market continues to evolve, as I have written previously. Vendors are attempting to differentiate their offerings as they seek to encourage customers to pay for technology that they could potentially download for free.
More than 13,000 self-described “data and visualization nerds” gathered in Austin, TX, recently for Tableau Software’s annual customer conference. In his inaugural keynote, Tableau’s new CEO, Adam Selipsky, said that nearly 9,000 were first-time attendees. I was impressed with the enthusiasm of the customers who had gathered for the event, cheering as company officials reviewed product plans and demonstrated new features. This enthusiasm suggests Tableau has provided capabilities that resonate with its users. Among other things, the company used the conference to outline a number of planned product enhancements.
Teradata recently held its annual Partners conference, which gathered several thousand customers and partners from around the world. This was the first Partners event since Vic Lund was appointed president and CEO in May. Year on year, Teradata’s revenues are down about 5 percent, which likely prompted some changes at the company. Over the past few years Teradata made several technology acquisitions and perhaps spread its resources too thin. At the event, Lund committed the company to a focus on customers, which was a significant part of Teradata’s success in the past. This commitment was well received by customers I spoke with at the event.
Ventana Research has newly published its Mobile Analytics and Business Intelligence 2016 Value Index. The Value Index provides a comprehensive evaluation of vendors and their product offerings across seven categories. In performing that analysis, I realized that this software category is at a crossroads. Once an optional capability often reserved for executives, mobile analytics is becoming a requirement for business users across organizations. The blurring of lines between work and personal lives has driven a shift from single-device BI to BI on multiple devices including smartphones and tablets as well as laptops and desktops. From a platform standpoint, the adoption of HTML5 is challenging the prevalence of native mobile applications.
It’s part of my job to cover the ecosystem of Hadoop, the open source big data technology, but sometimes it makes my head spin. If this is not your primary job, how can you possibly keep up? I hope that a discussion of what I’ve found to be most important will help those who don’t have the time and energy to devote to this wide-ranging topic.
Data virtualization is not new, but it has changed over the years. The term describes a process of combining data on the fly from multiple sources rather than copying that data into a common repository such as a data warehouse or a data lake, which I have written about. There are many reasons for an organization concerned with managing its data to consider data virtualization, most stemming from the fact that the data does not have to be copied to a new location. It could, for instance, eliminate the cost of building and maintaining a copy of one of the organization’s big data sources. Recognizing these benefits, many database and data integration companies offer data virtualization products. Denodo, one of the few independent, best-of-breed vendors in this market today, brings these capabilities to big data sources and data lakes.
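To make the idea concrete, here is a minimal sketch of the virtualization pattern described above: two independent data stores are queried in place and joined at query time, with no copy of either source landing in a shared repository. This is a hypothetical illustration using in-memory SQLite databases as stand-ins for real sources; it is not how Denodo or any particular product implements the technique.

```python
import sqlite3

# Two independent "sources" -- stand-ins for, say, a CRM system
# and a big data store. Neither is copied into a warehouse.
crm_db = sqlite3.connect(":memory:")
crm_db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm_db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Acme"), (2, "Globex")])

sales_db = sqlite3.connect(":memory:")
sales_db.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
sales_db.executemany("INSERT INTO sales VALUES (?, ?)",
                     [(1, 100.0), (2, 250.0), (1, 50.0)])

def virtual_sales_by_customer():
    """Combine the two sources on the fly at query time.

    Each source answers only its own part of the question; the
    'virtual' layer stitches the results together in memory.
    """
    names = dict(crm_db.execute("SELECT id, name FROM customers"))
    totals = sales_db.execute(
        "SELECT customer_id, SUM(amount) FROM sales GROUP BY customer_id")
    return {names[cid]: total for cid, total in totals}

print(virtual_sales_by_customer())  # {'Acme': 150.0, 'Globex': 250.0}
```

The point of the sketch is the absence of a load step: when a source changes, the next query sees the new data, which is one of the cost and freshness benefits noted above.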
Predictive analytics is a rewarding yet challenging subject. In our benchmark research on next-generation predictive analytics at least half the participants reported that predictive analytics allows them to achieve competitive advantage (57%) and create new revenue opportunities (50%). Yet even more participants said that users of predictive analytics don’t have enough skills training to produce their own analyses (79%) and don’t understand the mathematics involved (66%). (In the term “predictive analytics” I include all types of data science, not just one particular type of analysis.)