Businesses are transforming their organizations, building a data culture and deploying sophisticated analytics more broadly than ever. However, using data and analytics effectively is not always easy. The necessary tools are often separate, yet our research shows organizations prefer an integrated environment. In our Data Preparation Benchmark Research, 41% of participants reported using analytics and business intelligence tools for data preparation.
Traditional on-premises data processing solutions have led to a hugely complex and expensive set of data silos where IT spends more time managing the infrastructure than extracting value from the data. Big data architectures have attempted to solve the problem with large pools of cost-effective storage, but in doing so have often created on-premises management and administration challenges. These challenges of acquiring, installing and maintaining large clusters of computing resources gave rise to cloud-based implementations as an alternative. Public cloud is becoming the new center for data as organizations migrate from static on-premises IT architectures to global, dynamic and multi-cloud architectures.
In this analyst perspective, Dave Menninger takes a look at data lakes. He explains the term “data lake,” describes common use cases and shares his views on some of the latest market trends. He explores the relationship between data warehouses and data lakes and shares some of Ventana Research’s findings on the subject. He also provides an assessment of the risks organizations face in working with data lakes and offers recommendations for maximizing the potential of data.
Effectively managing data privacy and security is a high-stakes matter. When an organization doesn’t get it right, it often becomes front-page news and occasionally becomes a subject of litigation. Yet organizations face an equally challenging imperative to ensure that business users have easy access to the data they need. Depending on how they are implemented, data governance policies can inhibit access to data, making it harder to find and utilize the data assets of an organization.
Artificial intelligence (AI) and machine learning (ML) are all the rage right now. Our Machine Learning Dynamic Insights research shows that organizations are using these techniques to achieve a competitive advantage and improve both customer experiences and their bottom line. One type of analysis an organization can perform using AI and ML is predictive analytics. Organizations also need to predict the amount of cash they will need, their inventory levels and their staffing requirements in order to plan their operations. Unfortunately, while planning begins with predictions, organizations can’t plan with AI and ML. Let me explain what I mean.
MicroStrategy recently held its annual user conference, which focused on the theme of the “Intelligent Enterprise.” The star of the event was HyperIntelligence, an innovative product for delivering analytics throughout organizations that the company introduced a year ago. The company announced enhancements to HyperIntelligence and the latest version of its flagship platform, MicroStrategy 2020, as well as a new two-tiered education and certification program.
Ventana Research recently announced its 2020 research agenda for analytics, continuing the guidance we’ve offered for nearly two decades to help organizations derive optimal value from their technology investments and improve business outcomes.
It’s been exciting to follow the emergence of innovative capabilities in the analytics market, but for businesses it can be challenging to stay on top of all these changes. To help, we craft our research agenda using our firm’s knowledge of technology vendors and products and our experience with and expertise on business requirements.
Organizations now must store, process and use data of significantly greater volume and variety than in the past. These factors plus the velocity of data today — the unrelentingly rapid rate at which it is generated, both in enterprise systems and on the internet — add to the challenge of getting the data into a form that can be used for business tasks.
I am happy to share some insights gleaned from our latest Value Index research, which provides our assessment of how well vendors’ offerings meet buyers’ requirements. The Ventana Research Value Index: Collaborative Analytics and Business Intelligence 2019 is the distillation of a year of market and product research efforts by Ventana Research. Drawing on our benchmark research and expertise, we apply a structured research methodology built on evaluation categories that are designed to reflect the real-world criteria incorporated in a request for proposal to vendors in analytics and business intelligence. Using this methodology, we evaluated vendor submissions in seven categories, five relevant to the product (adaptability, capability, manageability, reliability and usability) and two related to the vendor (TCO/ROI and vendor validation). This research-based index is the first such evaluation to assess the full business value of collaborative analytics and business intelligence software. You can learn more about our Value Index as an effective vendor selection and RFI/RFP tool at https://www.ventanaresearch.com/value-indexes.
Domopalooza 2019 marked the first annual user conference after Domo went public, but the energy, excitement and new feature announcements have not slowed. With attendance in the thousands and growing fast, this year's conference focused on five key areas: digitization, real-time connectivity, driving insight-based actions, applying AI and machine learning, and building applications. All of the announcements are aimed at broadening the workloads Domo supports.