Businesses are transforming their organizations, building a data culture and deploying sophisticated analytics more broadly than ever. However, the process of using data and analytics is not always easy. The necessary tools are often separate, but our research shows organizations prefer an integrated environment. In our Data Preparation Benchmark Research, we found that 41% of participants use analytics and business intelligence tools for data preparation.
Traditional on-premises data processing solutions have led to a hugely complex and expensive set of data silos where IT spends more time managing the infrastructure than extracting value from the data. Big data architectures have attempted to solve the problem with large pools of cost-effective storage, but in doing so have often created on-premises management and administration challenges. These challenges of acquiring, installing and maintaining large clusters of computing resources gave rise to cloud-based implementations as an alternative. Public cloud is becoming the new center for data as organizations migrate from static on-premises IT architectures to global, dynamic and multi-cloud architectures.
Organizations are always looking to improve their ability to use data and AI to gain meaningful and actionable insights into their operations, services and customer needs. But unlocking value from data requires multiple analytics workloads, data science tools and machine learning algorithms to run against the same diverse data sets. Organizations still struggle with limited data visibility and insufficient insights, which often stem from multiple factors: analytic workloads running independently, data spread across multiple data centers and data governance constraints. In our ongoing benchmark research project, we are researching the ways in which organizations work with big data and the challenges they face.
Ventana Research has been evaluating analytics and business intelligence (BI) software for almost 20 years. Our methodology for these assessments is referred to as a Value Index. We use weightings derived from our benchmark research about how you, as buyers of these technologies, value and evaluate vendors. You can view our 2019 Value Index results here. I am in the process of completing the 2020 evaluation now.
Artificial intelligence (AI) and machine learning (ML) are all the rage right now. Our Machine Learning Dynamic Insights research shows that organizations are using these techniques to achieve a competitive advantage and improve both customer experiences and their bottom line. One type of analysis an organization can perform using AI and ML is predictive analytics. Organizations also need predictions to plan their operations: the amount of cash they will need, inventory levels and staffing requirements. Unfortunately, while planning begins with predictions, organizations can’t plan with AI and ML. Let me explain what I mean.
I was recently asked to identify key modern data architecture trends. Data architectures have changed significantly to accommodate larger volumes of data as well as new types of data such as streaming and unstructured data. Here are some of the trends I see continuing to impact data architectures.
Ventana Research recently announced its 2020 research agenda for analytics, continuing the guidance we’ve offered for nearly two decades to help organizations derive optimal value from their technology investments and improve business outcomes.
It’s been exciting to follow the emergence of innovative capabilities in the analytics market, but for businesses it can be challenging to stay on top of all these changes. To help, we craft our research agenda using our firm’s knowledge of technology vendors and products and our experience with and expertise on business requirements.
Organizations’ use of data and information is evolving as the amount of data and the frequency with which that data is collected increase. Data now streams into organizations from myriad sources, among them social media feeds and internet-of-things devices. These seemingly ever-increasing volumes of devices and data streams offer both challenges and opportunities to capture information about a business and improve its operations.
The emerging internet of things (IoT) is an extension of digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This innovation means that virtually any appropriately designed device can generate and transmit data about its operations, which can facilitate monitoring and a range of automatic functions. To do this, IoT requires a set of event-centered information and analytic processes that enable people to use that event information to make optimal decisions and act effectively.
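To make the idea of event-centered processing concrete, here is a minimal sketch of consuming a stream of device events and flagging those that warrant action. The event fields, device names and threshold are illustrative assumptions, not part of any specific IoT product.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class SensorEvent:
    device_id: str       # hypothetical device identifier
    temperature_c: float # hypothetical reading, in degrees Celsius

# Illustrative threshold; a real deployment would derive this from
# operating specs or historical baselines.
ALERT_THRESHOLD_C = 80.0

def monitor(events: Iterable[SensorEvent]) -> Iterator[str]:
    """Yield an alert message for each event exceeding the threshold."""
    for event in events:
        if event.temperature_c > ALERT_THRESHOLD_C:
            yield f"ALERT: {event.device_id} at {event.temperature_c}C"

# Example stream (in practice these events would arrive continuously
# from a message bus or device gateway).
stream = [
    SensorEvent("pump-1", 72.5),
    SensorEvent("pump-2", 85.1),
]
alerts = list(monitor(stream))
```

The point of the sketch is the shape of the process, not the code itself: events arrive as a stream, analytic logic evaluates each one as it occurs, and the output is an actionable signal rather than a static report.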
Organizations now must store, process and use data of significantly greater volume and variety than in the past. These factors plus the velocity of data today — the unrelentingly rapid rate at which it is generated, both in enterprise systems and on the internet — add to the challenge of getting the data into a form that can be used for business tasks.