The Internet of Things (IoT) extends digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This advance enables virtually any device to transmit its data, to which analytics can then be applied for monitoring and a range of operational functions. IoT can deliver value in several ways. It can provide organizations with more complete data about their operations, helping them improve efficiency and thus reduce costs. It can also deliver a competitive advantage by shortening the elapsed time between an event occurring and the response, action or decision it prompts.
Big data initially was characterized in terms of “the three V’s”: volume, velocity and variety. Nearly five years ago I wrote about the three V’s as a way to explain why new and different technologies were needed to deal with big data. Since then the industry has tackled many of the technical challenges associated with them. In 2017 I propose that we focus instead on a different letter: four A’s, namely analytics, awareness, anticipation and action. I’ll explain why each is important at this stage of big data’s evolution.
Big data has become an integral part of information management. Nearly all organizations have some need to access big data sources and produce actionable information for decision-makers. Recognizing this connection, we merged these two topics when we put together our recently published research agendas for 2017. As we plan our research, we focus on current technologies and how they can be used to improve an organization’s performance. We then share those results with our readers.
Topics: Big Data, Data Science, Data Governance, Data Integration, Data Preparation, Information Management, Internet of Things, Analytics, Machine Learning and Cognitive Computing, Digital Technology
Ventana Research analysts recently published our research agendas for 2017. As we put together these plans, we think about the forces that are shaping the markets we cover and then craft agendas that study these issues to provide insights for our community. I’ve been working in the business intelligence (BI) and analytics market for nearly 25 years, and throughout that time the industry has been trying to make analytics useful to increasingly wider audiences. That focus continues today. Better search and presentation methods, including visual discovery and natural-language processing, are promising ways to engage more users. We also see organizations supporting their users in specific functional roles with relevant and accessible analytics. My colleagues examine these issues as part of their agendas in the Office of Finance, Sales, Marketing, Customer Experience, Operations and Supply Chain, and Human Capital Management. While their agendas include analytics within specific domains, my own research focuses on a range of analytics issues across domains including cloud computing, mobility, collaboration, data science and the Internet of Things.
The business intelligence market is bounded on one side by big data and on the other by data preparation. That is, to maximize their performance in using information, organizations must collect and analyze ever-increasing volumes of data while the tools available are constantly evolving in the big data ecosystem I have written about. In our benchmark research on big data analytics, half (51%) of organizations said they want to access big data using their existing BI tools. At the same time, as I have noted, end users are demanding self-service access to data preparation capabilities to facilitate their analyses.
The big data market continues to evolve, as I have written previously. Vendors are attempting to differentiate their offerings as they seek to encourage customers to pay for technology that they could potentially download for free.
IBM recently held its inaugural World of Watson event. Formerly known as IBM Insight, and before that IBM Information on Demand, the annual event, attended by 17,000 people this year, showcases IBM’s data and analytics products as well as its broader efforts in cognitive computing. The theme of the event, as you might guess, was the Watson family of cognitive computing products. I, for one, was glad to spend more time getting to know the Watson product line, and I’d like to share some of my observations from the event.
Topics: Big Data, Data Science, Machine Learning, Cloud Computing, Business Intelligence, Data Governance, Data Integration, Internet of Things, Information Optimization, Analytics, Digital Technology
More than 13,000 self-described “data and visualization nerds” gathered in Austin, TX, recently for Tableau Software’s annual customer conference. In his inaugural keynote, Tableau’s new CEO, Adam Selipsky, said that nearly 9,000 were first-time attendees. I was impressed with the enthusiasm of the customers who had gathered for the event, cheering as company officials reviewed product plans and demonstrated new features. This enthusiasm suggests Tableau has provided capabilities that resonate with its users. Among other things, the company used the conference to outline a number of planned product enhancements.
Fall is a busy time for software industry analysts. It’s a season filled with vendors’ user conferences and some industry conferences. In the course of attending these events I’ve come to the realization that big vendors are often treated as the Rodney Dangerfield of the software industry: They get no respect. The lack of respect shows up in snarky social media comments, less enthusiastic coverage from the tech media than smaller vendors receive and a general sense that big vendors don’t do anything new with their development efforts. However, I suggest this is a shortsighted view of the software world. Smaller vendors serve a valuable function as a source of innovation for the industry, but they get a disproportionate share of attention. Big vendors deserve businesses’ attention, too, when they consider new software purchases.
Data preparation is critical to the effectiveness of both operational and analytic business processes. Operational processes today are fed by streams of constantly generated data. Our data and analytics in the cloud benchmark research shows that more than half (55%) of organizations spend the most time in their analytic processes preparing data for analysis – a situation that reduces their productivity. Data now comes from more sources than ever, at a faster pace and in a dizzying array of formats; it often contains inconsistencies in both structure and content.
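To make the kinds of inconsistencies described above concrete, here is a minimal sketch in Python. The field names, date formats and records are hypothetical, invented purely for illustration; the point is that structural inconsistency (the same fact arriving under different field names) and content inconsistency (the same value arriving in different formats) both must be resolved before analysis can begin, which is where that preparation time goes.

```python
# Hypothetical illustration: records from two imagined sources with
# inconsistent structure (different field names) and inconsistent
# content (mixed date formats, inconsistent casing and whitespace).
from datetime import datetime

def normalize(record):
    """Map a raw record from either source onto one common schema."""
    # Structural inconsistency: the customer field goes by two names.
    name = record.get("customer") or record.get("cust_name")
    # Content inconsistency: dates arrive in more than one format.
    raw_date = record.get("date") or record.get("order_date")
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            date = datetime.strptime(raw_date, fmt).date().isoformat()
            break
        except ValueError:
            continue
    else:
        date = None  # leave unparseable dates flagged for review
    return {"customer": name.strip().title(), "date": date}

raw = [
    {"customer": "ACME corp ", "date": "2016-11-03"},
    {"cust_name": "acme corp", "order_date": "11/03/2016"},
]
clean = [normalize(r) for r in raw]
# Both records now share one schema with consistent values.
```

Even this toy example needs explicit rules for every variation it expects; real feeds multiply those rules across many more sources and formats, which is why self-service preparation tooling has become a priority.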