Machine learning is valuable for organizations, but it can be hard to deploy. Our Machine Learning Dynamic Insights research identifies a shortage of skilled resources and the difficulty of building and maintaining ML systems as the most pressing challenges organizations face in applying ML. Traditional ML model development is resource-intensive, requiring significant domain knowledge and time to produce and compare dozens of models. And as the number of ML models grows, their management becomes difficult. By bringing automation to ML, organizations can reduce the time it takes to create production-ready ML models. AutoML can also make data science initiatives more accessible across the organization.
The amount of data flowing into organizations is growing exponentially, creating a need to process more data more quickly than ever before. Our Data Preparation Benchmark Research shows that accessing and preparing data continues to be the most time-consuming part of making data available for analysis. This can slow down the organizational functions that depend on the analysis results. Trying to get ahead of the backlog with incremental improvements to existing approaches and traditional technologies alone can be frustrating.
Process-mining software isn’t exactly new, but it’s also not widely known in the software technology market. The discipline has been around for at least a decade, but is generating more interest these days with both specialist vendors and major enterprise software vendors offering process-mining products and services. We assert that through 2022, 1 in 4 organizations will look to streamline their operations by exploring process mining.
Organizations are accelerating their digital transformation and looking for innovative ways to engage with customers in this new digital era of data management. The goal is to understand how to manage the growing volume of data in real time, across all sources and platforms, and use it to inform, streamline and transform internal operations. Over the years, the adoption of cloud computing has gained momentum, with more organizations running applications, data, analytics and self-service business intelligence (BI) tools on cloud-computing infrastructure to improve efficiency. However, cloud adoption means living with a mix of on-premises and multiple cloud-based systems in a hybrid computing environment. The challenge is to ensure that processes, applications and data can still be integrated across cloud and on-premises systems. Our research shows that organizations still have a significant requirement for on-premises data management but also have a growing requirement for cloud-based capabilities.
Every organization performing analytics with multiple employees needs to collaborate, both in the analytics process itself and in communicating the results of those analyses. As I continue my evaluation of analytics and data vendors, I have to admit some disappointment at the level of collaborative capabilities some analytics vendors provide. To be fair, the level of capabilities varies widely, but I expected collaborative capabilities to be more uniformly available as a standard feature in analytics technologies by now. I had anticipated that three-quarters of analytics vendors would include collaboration capabilities. While more than half the vendors I have evaluated support some comments and discussion in their products, only a few have incorporated social recognition and wall posting as part of their collaborative capabilities. So, what impact does a lack of analytics collaboration have on organizations undergoing digital transformation?
Ventana Research has been evaluating analytics and business intelligence (BI) software for a long time—almost 20 years. Our methodology for these assessments is referred to as a Value Index. We use weightings derived from our benchmark research about how you, as buyers of these technologies, value and evaluate vendors. You can view our 2019 Value Index results here. I am in the process of completing the 2020 evaluation now.
I was recently asked to identify key modern data architecture trends. Data architectures have changed significantly to accommodate larger volumes of data as well as new types of data such as streaming and unstructured data. Here are some of the trends I see continuing to impact data architectures.
Organizations’ use of data and information is evolving as the amount of data and the frequency with which that data is collected increase. Data now streams into organizations from myriad sources, among them social media feeds and internet-of-things devices. These seemingly ever-increasing volumes of devices and data streams offer both challenges and opportunities to capture information about a business and improve its operations.
The emerging internet of things (IoT) is an extension of digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This innovation means that virtually any appropriately designed device can generate and transmit data about its operations, which can facilitate monitoring and a range of automatic functions. To do this, IoT requires a set of event-centered information and analytic processes that enable people to use that event information to make optimal decisions and act effectively.
I am happy to offer some insights on Yellowfin drawn from our latest Value Index research, which provides an analytic representation of our assessment of how well vendors’ offerings meet buyers’ requirements. The Ventana Research Value Index: Analytics and Business Intelligence 2019 is the distillation of a year of market and product research efforts by Ventana Research. We utilized a structured research methodology that includes evaluation categories designed to reflect the breadth of the real-world criteria incorporated in a request for proposal (RFP) and vendor selection process for analytics and business intelligence. We evaluated Yellowfin and 14 other vendors in seven categories, five relevant to the product (adaptability, capability, manageability, reliability and usability) and two related to the vendor (TCO/ROI and vendor validation). To arrive at the Value Index rating for a given vendor, we weighted each category to reflect its relative importance in an RFP process, with the weightings based on our experience and data derived from our benchmark research on analytics and business intelligence.