All Analyst Perspectives
Posted by David Menninger on Oct 5, 2018 11:31:56 AM
In 2017 Strata + Hadoop World was changed to the Strata Data Conference. As I pointed out in my coverage of last year’s event, the focus was largely on machine learning and artificial intelligence (AI). That theme continued this year, but my impression of the event was of a community looking to get value out of data regardless of the technology being used to manage that data. The change was subtle: The location was the same; the exhibitors were largely the same; attendance was similar this year and last. But there was no particular vendor or technology dominating the event.
Posted by David Menninger on Sep 20, 2018 8:35:29 AM
All too often, software vendors view analytics as the end rather than the beginning of a process. I’m reminded of some of the advanced math classes I’ve taken in which the teaching process focused on a few key aspects of a mathematical proof or solution, leaving the rest of the exercise to be worked out by the students. In other contexts, you may hear people say the numbers speak for themselves.
Posted by David Menninger on Feb 23, 2018 6:00:00 AM
We at Ventana Research recently published our research agendas for 2018. The world of data and information management continues to evolve, as does our research on the use of these technologies to improve your organization’s operations. Relational databases are no longer the only viable enterprise data store as more organizations adopt a polyglot database infrastructure. And while their exact form may still be changing, as I have recently written, big data technologies are here to stay. Our Data and Analytics in the Cloud Benchmark Research indicates that an increasing number of organizations are opting for cloud-based deployments: A modern data infrastructure includes a hybrid of on-premises and cloud deployments for 44 percent of organizations. Our upcoming research will track how these changes are affecting data- and information-management processes.
Posted by David Menninger on Feb 12, 2018 5:17:18 AM
We at Ventana Research recently published our research agendas for 2018. Analytics and business intelligence are evolving and so is our research on their use across practice areas. Earlier research has shown that analytics can deliver significant value to organizations; for example, our predictive analytics research shows that 57 percent of organizations reported achieving a competitive advantage and half created new revenue opportunities with predictive analytics. Waves of investment in self-service analytics have propelled the market for analytics tools, significantly empowering line-of-business organizations to create their own analytics and set their own analytic priorities. But organizations are also beginning to recognize some of the limitations of current analytics implementations – for self-service, for example. Our Data Preparation Benchmark Research reveals that fewer than half (42%) of organizations are comfortable allowing business users to work with data not prepared by IT. Our research this year will continue to explore both the successes and challenges organizations face as they continue to use analytics and BI.
Posted by David Menninger on Dec 29, 2017 5:53:35 AM
Ventana Research recently published the findings of our benchmark research on Data Preparation, which examines the practices organizations use to accomplish data preparation. We view data preparation as a sequence of steps: identifying, locating and then accessing the data; aggregating data from different sources; and enriching, transforming and cleaning it to create a single uniform data set. Using data to accomplish organizational goals requires that it be prepared for use; to do this job properly, businesses need flexible tools that enable them to enrich the context of data drawn from multiple sources and collaborate on its preparation as well as ensure security and consistency. Users of data preparation tools range from analysts to operations professionals in the lines of business to IT professionals.
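As a rough sketch of that sequence, the following uses two hypothetical in-memory sources (a CRM table and an order list, both invented for illustration) to show the aggregate, enrich, transform and clean steps; a real pipeline would read from databases or files and typically use a dedicated data preparation tool:

```python
# Sketch of the data preparation sequence described above, using two
# hypothetical in-memory "sources" in place of real databases or files.

# Access: two sources with inconsistent, messy data.
crm = [{"id": 1, "name": " Acme ", "region": "east"},
       {"id": 2, "name": "Globex", "region": None}]
orders = [{"id": 1, "total": 120.0}, {"id": 1, "total": 80.0},
          {"id": 2, "total": 50.0}]

# Aggregate: roll order totals up to one row per customer.
totals = {}
for o in orders:
    totals[o["id"]] = totals.get(o["id"], 0.0) + o["total"]

# Enrich, transform, clean: join the sources, trim whitespace,
# and fill missing regions with a default value, producing a
# single uniform data set ready for analysis.
prepared = []
for c in crm:
    prepared.append({
        "id": c["id"],
        "name": c["name"].strip(),
        "region": c["region"] or "unknown",
        "lifetime_total": totals.get(c["id"], 0.0),
    })

print(prepared)
```

Even this toy version shows why the flexibility and collaboration the research calls for matter: each step embeds a judgment (what counts as a duplicate, what the default region should be) that business users, not just IT, need to weigh in on.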
Posted by David Menninger on Dec 14, 2017 8:22:19 AM
I recently attended SAP TechEd in Las Vegas to hear the latest from the company regarding its analytics and business intelligence offerings as well as its data management platform. The company used the event to launch SAP Data Hub and made several other data and analytics announcements that I’ll cover below.
Posted by David Menninger on Nov 19, 2017 7:30:17 AM
The Strata Data Conference is changing and it’s changing in a good way. At the recent Strata Data Conference in New York, Mike Olson, chief strategy officer at Cloudera, which co-sponsored the event, commented that at prior events we used to talk about the “Hadoop zoo animals,” meaning the various components of the Hadoop ecosystem of which I have written previously. Following last fall’s Strata event, I observed that the conference was evolving to focus on the use of data. Advancing that evolution, this year’s event focused on a particular type of usage: artificial intelligence (AI) and machine learning. The evolution from a focus on zoo animals to a focus on business value using advanced analytics shows further maturation of the big data market.
Posted by David Menninger on Nov 9, 2017 7:46:42 AM
There’s been some speculation in the market that Hadoop may be disappearing. Some of this speculation has been driven by vendors that have recently downplayed Hadoop in their marketing efforts. For example, the Strata+Hadoop World conference is now known as the Strata Data Conference. The Hadoop Summit is now known as the Dataworks Summit. In Cloudera’s S-1 filing with the SEC for its initial public offering, the term “Hadoop” appears only 14 times, while the term “machine learning” appears 83 times. So, if some of the vendors that created the market appear to be pivoting away from Hadoop, does your organization need to do something similar, or is there a role for Hadoop in your IT architecture?
Posted by David Menninger on Sep 4, 2017 9:35:36 AM
Recently Hortonworks announced some significant additions to its products at the DataWorks Summit. These additions reflect the fact that the big data market continues to evolve, as I have previously written.
Posted by David Menninger on Sep 1, 2017 10:09:08 AM
Natural language generation (NLG) is the process of generating text or narratives based on a set of data values. NLG narratives can be used for a variety of purposes, but in this perspective I focus on how NLG can enhance business intelligence (BI) processes. In the case of BI, NLG can be used to explain what has happened, why it is happening and even what actions to take. These narratives can be understood by a broader range of business users than the tables and charts of data that are the typical output of most BI applications and analytics tools.
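The core idea can be illustrated with a toy template-based generator; the metric name and the 10 percent threshold below are arbitrary illustrations, not features of any particular NLG product:

```python
# Sketch of template-based narrative generation from BI data.
# Metric names and thresholds are hypothetical illustrations.

def narrate(metric, current, prior):
    """Turn two data points into a short English narrative."""
    change = (current - prior) / prior * 100
    direction = "rose" if change > 0 else "fell"
    sentence = f"{metric} {direction} {abs(change):.1f}% versus the prior period."
    if abs(change) > 10:  # arbitrary threshold for flagging a large move
        sentence += " This move is unusually large and may warrant investigation."
    return sentence

print(narrate("Quarterly revenue", 1_150_000, 1_000_000))
```

Commercial NLG engines go far beyond templates, varying phrasing and selecting which facts are worth narrating, but the contrast with a chart is the same: the output is a sentence a business user can read, not a figure to interpret.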
Posted by David Menninger on Aug 24, 2017 3:07:01 AM
Many organizations continue to struggle with preparing data for use in operational and analytical processes. We see these issues reported in our Data and Analytics in the Cloud benchmark research, where 55 percent of organizations identify data preparation as the most time-consuming task in their analytical processes. Similarly, in our Next-Generation Predictive Analytics research, 62 percent of companies report dissatisfaction because the data they need is not readily available to access or integrate. In our Big Data Integration research, 52 percent report that in their big data integration processes they spend the most time reviewing data for quality and consistency. And nearly half of companies (48%) report this same issue in our Internet of Things research. We are currently conducting further research into this critical issue with our Data Preparation benchmark research.
Posted by David Menninger on Jul 3, 2017 5:33:07 AM
This is my second analyst perspective based on our IoT Benchmark Research. In the first, I discussed the business focus of IoT applications and some of the challenges organizations are facing. Now I’ll share some of the findings about technologies used in IoT applications and the impact those technologies appear to have on the success of users’ projects.
Posted by David Menninger on Jun 1, 2017 10:26:20 AM
This year various types of organizations are embracing machine learning like it is going out of style – or perhaps it would be better to say coming into style. A little investigation on LinkedIn finds over half a million professionals with machine learning in their job titles. Machine learning is the application of data science algorithms that become more accurate as the system records more outcomes and processes more data. This improvement is referred to as “learning,” hence the name. There are good reasons machine learning is growing so rapidly, but there are pitfalls to avoid as well.
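That definition can be illustrated with a deliberately tiny example: an "estimate" of an unknown outcome rate that gets closer to the truth as more outcomes are recorded. The true rate and sample sizes here are arbitrary, and a real machine learning model fits far richer structure than a single rate, but the improve-with-data behavior is the same:

```python
import random

# Toy illustration of "learning": an estimate of an outcome rate
# gets closer to the truth as more outcomes are recorded.
# The true rate and sample sizes are arbitrary illustrations.

random.seed(42)           # fixed seed for reproducibility
TRUE_RATE = 0.3           # the unknown quantity being learned

def estimate_after(n_outcomes):
    """Record n outcomes and return the learned estimate of the rate."""
    hits = sum(random.random() < TRUE_RATE for _ in range(n_outcomes))
    return hits / n_outcomes

for n in (10, 1_000, 100_000):
    est = estimate_after(n)
    print(f"after {n:>7} outcomes: estimate={est:.3f}, "
          f"error={abs(est - TRUE_RATE):.3f}")
```

The pitfalls alluded to above show up even here: with only a handful of outcomes the estimate can be badly wrong while looking just as precise, which is one reason unexamined machine learning output can mislead.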
Posted by David Menninger on May 27, 2017 12:19:53 AM
Informatica reintroduced itself to the world at its recent customer conference, Informatica World, in San Francisco. The company took advantage of the event to showcase its new branding in an effort to change the way customers think about the company. Informatica has been providing information services in the cloud for more than a decade. Even though cloud revenue comprises a minority of Informatica’s business, in absolute terms, the revenue is significant, and company executives want the public to recognize Informatica as a leader in cloud-based data management services for enterprises. Presenters also made notable product announcements, discussed below, including the application of machine learning to the data management process.
Posted by David Menninger on May 17, 2017 9:12:02 AM
I recently attended the MicroStrategy World conference in Washington, D.C. The company celebrated its 20th anniversary at the event, which is why it was hosted near MicroStrategy’s headquarters. Over the past 20 years the market and technology for business intelligence and analytics have changed significantly, and in several of those changes MicroStrategy has been at the forefront. Now is a good time to examine the company’s position in the market and its latest offerings in the context of the analytics market direction that I recently presented.
Posted by David Menninger on Apr 15, 2017 12:36:52 AM
Some 3,000 people attended Domo’s recent customer event, called Domopalooza. That’s nearly double the attendance of the previous event, which my colleague Mark Smith covered. Formerly a bit “stealthy,” Domo has started sharing more information, some of which I’ll pass along, as well as observations about product announcements made at the event.
Posted by David Menninger on Apr 5, 2017 9:47:03 AM
I recently attended SAS Institute’s analyst relations conference. There the company provided updates on its financial performance and its Viya platform and a glimpse into some of its future plans.
Posted by David Menninger on Mar 14, 2017 11:04:49 PM
The Internet of Things (IoT) is a technology that extends digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This advance enables virtually any device to transmit its data, to which analytics can then be applied to facilitate monitoring and a range of operational functions. IoT can deliver value in several ways. It can provide organizations with more complete data about their operations, which helps them improve efficiencies and so reduce costs. It also can deliver a competitive advantage by enabling them to reduce the elapsed time between an event occurring and operational responses, actions taken or decisions made in response to it.
Posted by David Menninger on Mar 1, 2017 11:11:30 PM
Big data initially was characterized in terms of “the three V’s”: volume, velocity and variety. Nearly five years ago I wrote about the three V’s as a way to explain why new and different technologies were needed to deal with big data. Since then the industry has tackled many of the technical challenges associated with the three V’s. In 2017 I propose that we focus instead on a different letter, A: analytics, awareness, anticipation and action. I’ll explain why each is important at this stage of big data evolution.
Posted by David Menninger on Feb 15, 2017 9:09:44 AM
Big data has become an integral part of information management. Nearly all organizations have some need to access big data sources and produce actionable information for decision-makers. Recognizing this connection, we merged these two topics when we put together our recently published research agendas for 2017. As we plan our research, we focus on current technologies and how they can be used to improve an organization’s performance. We then share those results with our readers.
Posted by David Menninger on Feb 8, 2017 8:19:34 AM
Ventana Research analysts recently published our research agendas for 2017. As we put together these plans we think about the forces that are shaping the markets that we cover and then craft agendas that study these issues to provide insights for our community. I’ve been working in the business intelligence (BI) and analytics market for nearly 25 years, and throughout that time the industry has been trying to make analytics useful to increasingly wider audiences. That focus continues today. Better search and presentation methods, including visual discovery and natural-language processing, are promising ways to engage more users. We also see organizations supporting their users in specific functional roles with relevant and accessible analytics. My colleagues examine these issues as part of their agendas in the Office of Finance, Sales, Marketing, Customer Experience, Operations and Supply Chain, and Human Capital Management. While their agendas include analytics within specific domains, my own research focuses on a range of analytics issues across domains including cloud computing, mobility, collaboration, data science and the Internet of Things.
Posted by David Menninger on Jan 27, 2017 7:21:37 AM
The business intelligence market is bounded on one side by big data and on the other by data preparation. That is, to maximize their performance in using information, organizations have to collect and analyze ever-increasing volumes of data while the tools available in the big data ecosystem that I have written about are constantly evolving. In our benchmark research on big data analytics, half (51%) of organizations said they want to access big data using their existing BI tools. At the same time, as I have noted, end users are demanding self-service access to data preparation capabilities to facilitate their analyses.
Posted by David Menninger on Jan 19, 2017 8:52:07 AM
The big data market continues to evolve, as I have written previously. Vendors are attempting to differentiate their offerings as they seek to encourage customers to pay for technology that they could potentially download for free.
Posted by David Menninger on Dec 16, 2016 11:52:04 PM
IBM recently held its inaugural World of Watson event. Formerly known as IBM Insight, and prior to that IBM Information on Demand, the annual event, attended by 17,000 people this year, showcases IBM’s data and analytics and the broader IBM efforts in cognitive computing. The theme for the event, as you might guess, was the Watson family of cognitive computing products. I, for one, was glad to spend more time getting to know the Watson product line, and I’d like to share some of my observations from the event.
Posted by David Menninger on Dec 10, 2016 6:17:32 AM
More than 13,000 self-described “data and visualization nerds” gathered in Austin, TX, recently for Tableau Software’s annual customer conference. In his inaugural keynote, Tableau’s new CEO, Adam Selipsky, said that nearly 9,000 were first-time attendees. I was impressed with the enthusiasm of the customers who had gathered for the event, cheering as company officials reviewed product plans and demonstrated new features. This enthusiasm suggests Tableau has provided capabilities that resonate with its users. Among other things, the company used the conference to outline a number of planned product enhancements.
Posted by David Menninger on Nov 26, 2016 8:33:57 AM
Fall is a busy time for software industry analysts. It’s a season filled with vendors’ user conferences and some industry conferences. Throughout the course of attending these events I’ve come to the realization that big vendors are often considered the Rodney Dangerfield of the software industry: They get no respect. What I mean by no respect is revealed in snarky social media comments, less enthusiastic coverage by tech media than smaller vendors get and a general sense that big vendors don’t do anything new with their development efforts. However, I suggest this is a shortsighted view of the software world. Smaller vendors serve a valuable function as a source of innovation for the industry, but they get a disproportionate share of attention. I suggest the big vendors deserve businesses’ attention, too, when they consider new software purchases.
Posted by David Menninger on Nov 21, 2016 9:15:25 AM
Data preparation is critical to the effectiveness of both operational and analytic business processes. Operational processes today are fed by streams of constantly generated data. Our data and analytics in the cloud benchmark research shows that more than half (55%) of organizations spend the most time in their analytic processes preparing data for analysis – a situation that reduces their productivity. Data now comes from more sources than ever, at a faster pace and in a dizzying array of formats; it often contains inconsistencies in both structure and content.
Posted by David Menninger on Nov 9, 2016 6:14:21 AM
I recently attended .conf2016, Splunk’s seventh annual user conference. Splunk created the market for analyzing machine data (shorthand for machine-generated data), which consists of log files and event data from various types of systems and devices. Our big data analytics benchmark research shows that these are two of the most common sources of big data that organizations analyze. This market has proven to be fertile ground for Splunk, growing steadily with revenues more than doubling over the previous two fiscal years. Machine data is also the backbone for the Internet of Things (IoT) and operational intelligence, which form the basis of forthcoming benchmark research from Ventana Research.
Posted by David Menninger on Oct 25, 2016 4:20:26 AM
I recently spent time at Strata+Hadoop World 2016 in New York. I have attended this event and its predecessor, Hadoop World, off and on for the past six years. This one had a different feel from previous events, including the most recent one in San Jose at the end of March. Perhaps because of its location in one of the financial and commercial hubs of the world, the event had much more of a business orientation. But it’s not just the location; past events have been held in New York as well, and I see the business focus as a sign of the Hadoop market maturing.
Posted by David Menninger on Oct 15, 2016 12:42:15 AM
I recently attended Oracle OpenWorld for the first time in several years. The message at this year’s event was clear: Oracle is all in on the cloud. I had heard the message, but I didn’t get the full impact until I arrived at the Moscone Center in San Francisco. All signage at the event contained the word “cloud,” and Oracle issued 18 press releases in conjunction with OpenWorld related to cloud computing. I also found out that Oracle has its own definition of “cloud.”
Posted by David Menninger on Sep 22, 2016 9:29:59 AM
Teradata recently held its annual Partners conference, which brings together several thousand customers and partners from around the world. This was the first Partners event since Vic Lund was appointed president and CEO in May. Year on year, Teradata’s revenues are down about 5 percent, which likely prompted some changes at the company. Over the past few years Teradata made several technology acquisitions and perhaps spread its resources too thin. At the event, Lund committed the company to a focus on customers, which was a significant part of Teradata’s success in the past. This commitment was well received by the customers I spoke with at the event.
Posted by David Menninger on Sep 9, 2016 9:04:14 AM
Ventana Research has newly published its Mobile Analytics and Business Intelligence 2016 Value Index. The Value Index provides a comprehensive evaluation of vendors and their product offerings across seven categories. In performing that analysis, I realized that this software category is at a crossroads. Once an optional capability often reserved for executives, mobile analytics is becoming a requirement of business users across organizations. The blurring of lines between work and personal lives has provoked a change from single device BI to BI on multiple devices including smartphones and tablets as well as laptops and desktops. From a platform standpoint, the adoption of HTML5 is contesting the prevalence of native mobile applications.
Posted by David Menninger on Aug 23, 2016 7:01:41 AM
It’s part of my job to cover the ecosystem of Hadoop, the open source big data technology, but sometimes it makes my head spin. If this is not your primary job, how can you possibly keep up? I hope that a discussion of what I’ve found to be most important will help those who don’t have the time and energy to devote to this wide-ranging topic.
Posted by David Menninger on Aug 18, 2016 10:05:21 AM
Data virtualization is not new, but it has changed over the years. The term describes a process of combining data on the fly from multiple sources rather than copying that data into a common repository such as a data warehouse or a data lake, which I have written about. There are many reasons for an organization concerned with managing its data to consider data virtualization, most stemming from the fact that the data does not have to be copied to a new location. It could, for instance, eliminate the cost of building and maintaining a copy of one of the organization’s big data sources. Recognizing these benefits, many database and data integration companies offer data virtualization products. Denodo, one of the few independent, best-of-breed vendors in this market today, brings these capabilities to big data sources and data lakes.
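A toy sketch of the contrast: a virtual view that reaches into its sources at query time instead of copying them into a common repository. The source names and schemas below are invented for illustration; a real data virtualization product adds query optimization, security and many source connectors on top of this basic idea:

```python
# Sketch of data virtualization: a virtual view answers queries by
# reaching into its underlying sources at query time, rather than
# copying their data into a new repository. Names are hypothetical.

# Two independent "systems" whose data stays where it is.
warehouse = {"SKU-1": {"product": "widget", "on_hand": 40}}
pricing   = {"SKU-1": {"list_price": 9.99}}

def virtual_product_view(sku):
    """Combine both sources on the fly; nothing is materialized."""
    stock = warehouse.get(sku, {})
    price = pricing.get(sku, {})
    return {"sku": sku, **stock, **price}

print(virtual_product_view("SKU-1"))

# Because there is no copy to refresh, a change in a source is
# visible the next time the view is queried.
pricing["SKU-1"]["list_price"] = 8.99
print(virtual_product_view("SKU-1"))
```

The second query reflects the price change immediately, which is exactly the maintenance cost that a copied repository would have to pay for with a refresh job.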
Posted by David Menninger on Jul 28, 2016 3:45:18 AM
Predictive analytics is a rewarding yet challenging subject. In our benchmark research on next-generation predictive analytics at least half the participants reported that predictive analytics allows them to achieve competitive advantage (57%) and create new revenue opportunities (50%). Yet even more participants said that users of predictive analytics don’t have enough skills training to produce their own analyses (79%) and don’t understand the mathematics involved (66%). (In the term “predictive analytics” I include all types of data science, not just one particular type of analysis.)
Posted by David Menninger on Jul 2, 2016 9:25:29 AM
Qlik helped pioneer the visual discovery market with its QlikView product. In some respects, Qlik and its competitors also spawned the self-service trend rippling through the analytics market today. Their aim was to enable business users to perform analytics for themselves rather than building a product with the perfect set of features for IT. After establishing success with end users, the company began to address more of the concerns of IT, eventually creating a robust enterprise-grade analytics platform. This approach has worked for Qlik, driving growth that led to an initial public offering in 2010. The company now generates more than half a billion dollars in revenue annually, making it one of the largest independent analytics vendors. Qlik was rated a Hot Vendor in our 2015 Analytics and Business Intelligence Value Index and was among the highest ranked in usability.
Posted by David Menninger on May 11, 2016 9:20:56 AM
It has been more than five years since James Dixon of Pentaho coined the term “data lake.” His original post suggests, “If you think of a data mart as a store of bottled water – cleansed and packaged and structured for easy consumption – the data lake is a large body of water in a more natural state.” The analogy is a simple one, but in my experience talking with many end users there is still mystery surrounding the concept. In this post I’d like to clarify what a data lake is, review the reasons an organization might consider using one and the challenges they present, and outline some developments in software tools that support data lakes.
Posted by David Menninger on Apr 29, 2016 10:26:40 AM
The emerging Internet of Things (IoT) is an extension of digital connectivity to devices and sensors in homes, businesses, vehicles and potentially almost anywhere. This innovation means that virtually any device can generate and transmit data about its operations – data to which analytics can be applied to facilitate monitoring and a range of automatic functions. To do these tasks IoT requires what Ventana Research calls operational intelligence (OI), a discipline that has evolved from the capture and analysis of instrumentation, networking and machine-to-machine interactions of many types. We define operational intelligence as a set of event-centered information and analytic processes operating across an organization that enable people to use that event information to take effective actions and make optimal decisions. Ventana Research first began covering operational intelligence over a decade ago.
Posted by David Menninger on Mar 24, 2016 11:29:15 PM
On Monday, March 21, Informatica, a vendor of information management software, announced Big Data Management version 10.1. My colleague Mark Smith covered the introduction of v. 10.0 late last year, along with Informatica’s expansion from data integration to broader data management. Informatica’s Big Data Management 10.1 release offers new capabilities, including for the hot topic of self-service data preparation for Hadoop, which Informatica is calling Intelligent Data Lake. The term “data lake” describes large collections of detailed data from across an organization, often stored in Hadoop. With this release Informatica seeks to add more enterprise capabilities to data lake implementations.
Posted by David Menninger on Mar 11, 2016 6:59:49 AM
I recently attended the SAS Analyst Summit in Steamboat Springs, Colo. (Twitter Hashtag #SASSB) The event offers an occasion for the company to discuss its direction and to assess its strengths and potential weaknesses. SAS is privately held, so customers and prospects cannot subject its performance to the same level of scrutiny as public companies, and thus events like this one provide a valuable source of additional information.
Posted by David Menninger on Feb 26, 2016 7:03:13 AM
Last week I attended Spark Summit East 2016 at the New York Hilton Midtown. It revealed several ways in which Spark technology might impact the big data market.
Posted by David Menninger on Feb 7, 2016 10:10:09 PM
The big data market continues to expand and enable new types of analyses, new business models and new revenue streams for organizations that implement these capabilities. Following our previous research into big data and information optimization, we’ll investigate the technology trends affecting both of these domains as part of our 2016 research agenda.
Posted by David Menninger on Feb 7, 2016 9:46:23 PM
Some followers of Ventana Research may recall my work here several years ago. Here and elsewhere I have spent most of my career in the data and analytics markets matching user requirements with technologies to meet those needs. I’m happy to be returning to Ventana Research to resume investigating ways in which organizations can make the most of their data to improve their business processes; for a first look, please see our 2016 research agenda on Big Data and Information Optimization. I relish the opportunity to conduct primary market research in the form of Ventana’s well-known benchmark research and to help end users and vendors apply the information collected in those studies.
Posted by David Menninger on Feb 7, 2016 9:39:23 PM
Throughout the course of our research in 2016, we’ll be exploring ways in which organizations can maximize the value of their data. Ventana Research believes that analytics is the engine and data is the fuel to power better business decisions. Several themes emerged from our benchmark research on incorporating data and analytics into organizational processes, and we will follow them in our 2016 Business Analytics Research Agenda:
Posted by David Menninger on Mar 27, 2012 11:56:10 AM
As a technology, predictive analytics has existed for years, but adoption has not been widespread among businesses. In our recent benchmark research on business analytics among more than 2,600 organizations, predictive analytics ranked only 10th among technologies they use to generate analytics, and only one in eight of those companies use it. Predictive analytics has been costly to acquire, and while enterprises in a few vertical industries and specific lines of business have been willing to invest large sums in it, they constitute only a fraction of the organizations that could benefit from it. Ventana Research has just completed a benchmark research project to learn how the organizations that have adopted predictive analytics are using it and to acquire real-world information about their levels of maturity, trends and best practices. In this post I want to share some of the key findings from our research.
Posted by David Menninger on Mar 26, 2012 11:08:46 AM
I want to share my observations from the recent annual SAS analyst briefing. SAS is a huge software company with a unique culture and a history of success. Being privately held, SAS is not required to make the same financial disclosures as publicly held organizations, but it released enough information to suggest another successful year, with more than $2.7 billion in revenue and 10 percent growth in its core analytics and data management businesses. Smaller segments showed even higher growth rates. With only selective information disclosed, it’s hard to dissect the numbers to spot specific areas of weakness, but the top-line figures suggest SAS is in good health.
Posted by David Menninger on Mar 16, 2012 10:50:17 AM
In our definition, information management encompasses the acquisition, organization, dissemination and use of information by organizations to create and enhance business value. Effective information management ensures optimal access, relevance, timeliness, quality and security of this data with the aim to improve organizational performance. This goal is not easily met, especially as organizations acquire ever more data at an ever faster pace. In our business analytics benchmark research of more than 2,600 organizations, almost half (45%) have to integrate six or more types of data in their analyses. More than two-thirds reported that they spend more time preparing data than analyzing it. To assist in dealing with these sorts of issues and others, we’ve laid out an ambitious information management research agenda for 2012.
Posted by David Menninger on Mar 15, 2012 11:43:40 AM
For most people involved with business intelligence (BI), these are exciting times. Using BI to improve business processes continues to motivate organizations to invest in it. BI also increasingly powers business analytics and can now be rented through the cloud computing model of accessing software. New technologies are adding dimensions to BI, creating both excitement and confusion for enterprises implementing them. We offer a variety of completed research that can help organizations see past the hype and understand how to use these technologies to improve business decision-making, and we’re planning new research in 2012 on these topics.
Posted by David Menninger on Mar 14, 2012 10:07:12 AM
My colleague Mark Smith and I recently attended data integration vendor Informatica’s annual industry analyst event. The company offered some impressive numbers regarding growth and profitability over the years, with 30 consecutive quarters of growth even during the recent recession. Through acquisition and its own research and development activities, Informatica now has a broad portfolio of products that includes data integration along with migration, replication and synchronization, master data management, complex event processing and other elements of the information management spectrum. As at last year’s event, the company retains a sharp focus on its data integration-related portfolio, and its product roadmap addresses four key themes impacting that market: big data, cloud computing, social media and mobile technology. We also see these themes as significant technology trends, and our approach is outlined in our 2012 research agendas for information management and in the larger business technology innovation agenda. Thus it was interesting to hear Informatica’s take on them.
Posted by David Menninger on Feb 1, 2012 8:34:46 AM
Revolution Analytics recently announced the winners of its “Applications of R in Business” contest. Revolution Analytics has built a business around supporting R, an open source statistical software package, and extending it with features it licenses to customers. I served as a judge in the contest. Since I was in the midst of analyzing the data for our predictive analytics benchmark research, I was interested to see how the contestants applied predictive analytics techniques to specific business problems.
Posted by David Menninger on Feb 1, 2012 8:22:48 AM
When it comes to technology, debates about whether a particular name suits its category are rampant. Here is a link to one such argument about the term “big data” from Curt Monash, an analyst whom I respect a great deal. This debate rages in the Twittersphere also, as in this comment from Neil Raden, another analyst I respect, suggesting that “big data is a marketing term … imprecise by design.” Another term I’ve encountered resistance to recently is “predictive analytics” (see “Revolution Analytics Hosts Contest on Business Predicting the Future”).
Posted by David Menninger on Jan 31, 2012 7:38:21 AM
MicroStrategy, one of the largest independent vendors of business intelligence (BI) software, recently held its annual user conference, which I attended with some of my colleagues and more than 2,000 other attendees. At this year’s event, the company emphasized four key themes: mobility, cloud computing, big data and social media. In this post, I’ll assess what MicroStrategy is doing in each of the first three areas. My colleague, Mark Smith, covered MicroStrategy’s social intelligence efforts in his blog. I’ll also share some opinions on what might be missing from the company’s vision.
Posted by David Menninger on Jan 25, 2012 10:44:47 AM
We recently published the results of our benchmark research on Big Data to complement the previously published benchmark research on Hadoop and Information Management. Ventana Research undertook this research to acquire real-world information about levels of maturity, trends and best practices in organizations’ use of large-scale data management systems now commonly called Big Data. The results are illuminating.
Posted by David Menninger on Dec 28, 2011 11:49:27 AM
Rather than make predictions for 2012, which are everywhere right now, I want to look back at some of the surprising events of 2011. I think it’s worth considering what happened that wasn’t expected and what these things might tell us about the information technology market. Here, in no particular order, are the most important ones I see.
Posted by David Menninger on Dec 20, 2011 3:56:24 AM
Collaborative and mobile technologies continue to influence business intelligence (BI) software products. The recent release of Yellowfin 6 embraces these innovations in a visually appealing, end-user-oriented BI product. Yellowfin is an independent BI software vendor based in Australia that was recently recognized, along with its customer Macquarie University, as a Ventana Leadership Award winner for the use of location-based aspects of its technology for effective planning and student acquisition initiatives.
Posted by David Menninger on Dec 15, 2011 9:00:14 AM
Talend recently announced version 5 of its information management platform, which emphasizes unifying its various components. Through a combination of development activities, acquisitions and partnerships, Talend has been steadily building its portfolio of information management capabilities. In addition to its core data integration capabilities, it has added data quality, master data management, application integration and with this release business process management (BPM).
Posted by David Menninger on Nov 30, 2011 11:13:52 AM
Kalido recently introduced version 9 of its Information Engine product. The company has been around for 10 years but has had difficulty establishing its identity in the information management market. Kalido was perhaps ahead of its time, partly a vendor of data integration, partly master data management and partly data governance. As an example of the positioning challenge, its core product, Information Engine, while not a data integration tool, could in some cases provide sufficient capabilities to meet an organization’s data integration needs. Its real value, however, comes from authoring and management of information about the user’s data warehouse.
Posted by David Menninger on Nov 17, 2011 7:54:31 AM
Tibco recently introduced Spotfire 4.0, the most recent version of its interactive discovery and business intelligence (BI) tool. Spotfire comes at BI through visualization. It uses in-memory processing and good user interface design to develop highly interactive displays of data. Version 4.0 attempts to enhance Spotfire’s dashboard capabilities and offers integration with enterprise collaboration tools. The former capabilities are necessary to broaden Spotfire’s appeal and applicability for more BI projects, but the latter capabilities are more interesting since they represent a fundamental shift in the way enterprises use business intelligence.
Posted by David Menninger on Nov 15, 2011 11:06:29 AM
Cloudera’s recent Hadoop World 2011 event confirmed that the world of big data is getting even bigger. As I wrote of last year’s event, Hadoop, the open source large-scale data processing technology, has gone mainstream. And while 75% of the audience attended this year for the first time and so may not have realized the breadth of Hadoop’s acceptance, statistics announced in the opening keynote show widespread use of it. Mike Olson, Cloudera CEO, reported that the event was sold out, with 1,400 attendees from 580 organizations and 27 countries. In independent confirmation, our benchmark research shows that 54% of organizations are either using or evaluating Hadoop for their big-data needs.
Posted by David Menninger on Nov 4, 2011 12:25:17 PM
IBM’s Information on Demand (IOD) event showcased its products for both information management and business intelligence. I’ve covered the information management aspects of IOD in a separate post. In this post I’ll look at the business intelligence aspects. Earlier this year IBM made predictive analytics a major focus of its Business Analytics analyst summit, an event that often foreshadows the IOD messages. In addition to predictive analytics, IBM emphasized both large-scale “big” data and a concept it calls “personal analytics” at the summit. Both of these received more attention at IOD.
Posted by David Menninger on Nov 4, 2011 11:49:48 AM
IBM made more than two dozen announcements in conjunction with its recent Information on Demand (IOD) event. In this post I’ll address the impact of IOD from an information management perspective and in a separate post shortly from an analytics perspective. Trying to organize the mass of information IBM brought forth at IOD 2011, I group the announcements into three general categories of enhancements and extensions to InfoSphere, big data (which is technically part of InfoSphere) and databases.
Posted by David Menninger on Nov 2, 2011 8:57:26 AM
Informatica recently introduced HParser, an expansion of its capabilities for working with Hadoop data sources. Beginning with Version 9.1, introduced earlier this year, Informatica’s flagship product has been able to access data stored in HDFS as either a source or a target for information management processes. However, it could not manipulate or transform the data within the Hadoop environment. With this announcement, Informatica starts to bring its data transformation capabilities to Hadoop.
Posted by David Menninger on Oct 28, 2011 8:13:08 AM
QlikTech recently introduced QlikView 11, the latest version of its business intelligence (BI) software, which emphasizes new collaboration features as well as enhancements to its user interface. In an about-face, though, in its approach to mobile access, the company has moved away from its native iPad application to a browser-based app using HTML5 technology.
Posted by David Menninger on Oct 20, 2011 8:45:48 AM
In a move that may indicate the beginning of a new wave of activity in the business intelligence (BI) market, Oracle has announced its intention to acquire Endeca. Founded in 1999, Endeca originally focused on search capabilities for online commerce. Users selected a product attribute, and the software automatically revised the remaining selection criteria based on products matching the previous selection. For instance, if the matching products only come in one color, the color attribute would be removed from the selection criteria and possibly replaced by other relevant criteria. Most of us take this behavior for granted, as it has been adopted or imitated by many e-commerce sites and other Web properties. We have been covering Endeca as part of the BI and information applications markets.
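The faceted-navigation behavior described above can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Endeca’s actual implementation; the data and function names are invented for the example:

```python
# Minimal sketch of faceted navigation: after each selection, recompute
# which attribute values can still narrow the matching products.
products = [
    {"type": "shirt", "color": "red", "size": "M"},
    {"type": "shirt", "color": "red", "size": "L"},
    {"type": "hat", "color": "blue", "size": "M"},
]

def remaining_facets(products, selections):
    """Return the facet values still available after applying selections."""
    matches = [p for p in products
               if all(p.get(k) == v for k, v in selections.items())]
    facets = {}
    for p in matches:
        for attr, value in p.items():
            if attr not in selections:
                facets.setdefault(attr, set()).add(value)
    # Drop facets with a single value -- they can no longer narrow results,
    # which is why "color" vanishes once only red shirts match.
    return {a: vals for a, vals in facets.items() if len(vals) > 1}

print(remaining_facets(products, {"type": "shirt"}))
# the "color" facet disappears because every matching shirt is red;
# only "size" remains as a useful refinement
```

Real engines add counts per facet value and indexes for scale, but the revise-the-criteria-after-each-selection loop is the core idea.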
Posted by David Menninger on Oct 15, 2011 4:33:47 AM
Posted by David Menninger on Oct 15, 2011 4:05:24 AM
Oracle made several announcements at its recent Open World event demonstrating its strengths in the business computing market but also that it is standing on the shoulders of giants. The company has developed the expertise, processes and market share to scale out the ideas and innovations of others. Don’t get me wrong: That statement is not an indictment. Large organizations often have challenges with innovation. They are not as nimble as their smaller competitors. On the other hand, small organizations often have challenges scaling out their successes. In an earlier post I characterized the software market as a sort of ecosystem, and this is how it works. Large organizations often look to imitate or acquire smaller firms for their innovations.
Posted by David Menninger on Oct 7, 2011 6:01:12 AM
About 30 years ago, perhaps on this very day, I was sitting in front of an Apple II working on a VisiCalc spreadsheet. At the time, I don’t think I even knew who Steve Jobs was. I wasn’t in the software industry yet. I was working for a public accounting firm. The Apple II sat in a corner of the office “typing pool.” For those of you who don’t know what a typing pool was, there was no swimming involved – it was a group of full-time employees with dedicated equipment who did all the typing and word processing tasks of the office.
Posted by David Menninger on Oct 3, 2011 11:14:12 AM
Oracle kicked off its Open World 2011 conference with the announcement of Exalytics, a new data warehouse appliance specifically for business intelligence (BI). Three years ago, when Oracle introduced the Exadata product line, it was based on hardware from Hewlett-Packard. Since then it has acquired Sun Microsystems and replaced the HP components in Exadata, assuming complete control over the hardware and software included in the appliance. Oracle also introduced two other appliance products: Exalogic, which is focused on Oracle Applications, and more recently the Oracle Database Machine. Oracle’s new tag line, “Hardware and software, engineered to work together,” indicates its emphasis on these appliances and the potential for more, perhaps even some to be announced at Open World.
Posted by David Menninger on Sep 29, 2011 12:13:42 PM
This has been the year of the cloud for MicroStrategy. After ignoring early competition from cloud-based business intelligence (BI) providers, the company has jumped on the cloud BI bandwagon. At MicroStrategy World early this year it announced a program called Cloud Intelligence and this summer introduced MicroStrategy Cloud, a complete BI platform with the option of using either IBM Netezza or ParAccel as the database and Informatica as the data integration environment. Now the company has expanded its cloud offerings to include MicroStrategy Cloud Personal, which enables individuals to easily upload spreadsheet data, analyze it and share it with others. (A free version is currently in beta testing.)
Posted by David Menninger on Sep 26, 2011 6:26:57 AM
At its user conference, PivotLink introduced version 5 of its business intelligence (BI) product, which it delivers in the cloud computing environment through software as a service (SaaS). Demonstrating one of the unique benefits of the SaaS approach, PivotLink did not require any of the attendees to request a download, get a new license file or do any type of software upgrade to receive the new version. It had already been done for them. One purpose of the conference was to inform customers of the new features they now had at their disposal. Any enterprise that has had to coordinate a major software upgrade should appreciate the time savings SaaS can provide on normal software maintenance activities. Our recent benchmark research on business data in the cloud indicates that companies are adopting SaaS-based products to varying degrees across all the lines of business.
Posted by David Menninger on Sep 23, 2011 11:03:05 AM
Recently Karmasphere introduced version 1.5 of its Analyst product, which helps organizations analyze “big data” stored in Hadoop, the open source large-scale data processing technology. An independent software vendor focused exclusively on the Hadoop market, Karmasphere made available a community edition of its developer product in September 2009 and launched the company in March 2010. Since then it has been active and visible in Hadoop-related events including Hadoop World, the IBM Big Data Symposium and others.
Posted by David Menninger on Sep 9, 2011 9:35:04 AM
Splunk may be one of the biggest software companies you’ve never heard of. I’ve been following the seven-year-old company for over six months now and recently attended its second annual user conference. Splunk focuses on analyzing large volumes of machine-generated data in underlying applications and systems, which includes application and system logs, network traffic, sensor data, click streams and other loosely structured information sources. Many of these “big data” sources are the same sources analyzed with Hadoop, according to our recently published benchmark research. However, Splunk takes a different approach that focuses on performing simple analyses on this data in real time rather than the batch-based advanced analytics we see as the most common use for Hadoop.
Posted by David Menninger on Aug 24, 2011 9:45:16 AM
We cite collaboration as one of five key technology influences on the business intelligence (BI) market, and I get many questions about collaboration and BI from end users and vendors alike. The rise of social media websites such as Facebook and Twitter has raised awareness of collaborative platforms and created a critical mass of participants, which is a necessary ingredient for successful collaboration. However, I have to point out that consumer-oriented social media tools do not provide all the necessary components for collaborative BI.
Posted by David Menninger on Aug 23, 2011 9:35:41 AM
In preparation for conducting a benchmark research study on predictive analytics, I’ve been speaking with vendors in this market segment to gather background. One of them is Opera Solutions, which I was not familiar with despite having worked in predictive analytics for more than 10 years. I was surprised to learn the company claims to be generating $100 million in revenue annually. Founded in 2004, Opera provides predictive analytics as a service, employing a staff of approximately 150 data scientists along with hundreds of other employees. It claims this is the largest private group of scientists outside IBM. (I’ve written previously about IBM’s significant efforts in the predictive analytics market.)
Posted by David Menninger on Aug 19, 2011 10:27:08 AM
I recently attended Teradata’s Third-Party Influencers Meeting (Twitter Hashtag #TD3PI) in Las Vegas, where the company updated industry analysts and consultants on upcoming product plans and the status of two product lines it added via acquisition in the past year. Randy Lea, Teradata’s VP of product management and marketing, provided a company overview recapping highlights and defining the corporate strategy, building on its work in 2010 that my colleague assessed. Teradata focuses on three markets: its core business of data warehousing, which it identified as a $27 billion market; the $15 billion business applications market represented by its Aprimo acquisition; and the big-data analytics market, a $2 billion market addressed in part by its Aster Data acquisition. I mention Teradata’s estimates of market size as they may indicate the emphasis and investment the company will make in each segment. Lea referred to Teradata’s nearly four years of independence from NCR as a good thing. Independence, he said, has allowed Teradata to focus, to release products more frequently and to acquire companies that enhance its market position. Judging by the performance of its stock (NYSE: TDC), the financial markets seem to agree independence has been good – even with the recent market downturn its stock is up 75% in the last year.
Posted by David Menninger on Jul 7, 2011 1:32:12 PM
SnapLogic, a cloud-based data integration vendor, has extended its product line with new data quality capabilities. This is worth comment because SnapLogic sits at the intersection of two recent trends in information management.
Posted by Ventana Research on Jun 28, 2011 11:48:16 PM
For months the speculation was rampant, and now the rumors have proven to be true. Yahoo has officially announced that it will become a player in the emerging Hadoop market. Hadoop provides distributed computing capabilities that enable organizations to process very large amounts of data quickly. Backed by Yahoo and Benchmark Capital, a new entity called Hortonworks has formed around a team from Yahoo that consists of more than 20 key architects of and contributors to the Apache Hadoop project. The company will start with some 25 employees and “will be hiring aggressively from our collective networks,” according to Rob Bearden, Hortonworks president and COO.
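Hadoop’s processing model divides work into a map step that emits key/value pairs and a reduce step that aggregates them per key, which is what lets the work spread across many machines. A minimal single-process sketch of that model (illustrative only; real Hadoop distributes these phases across a cluster and runs them as Java jobs):

```python
# Minimal sketch of the MapReduce model: map emits (key, value) pairs,
# reduce aggregates all values that share a key.
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) for every word -- the classic word-count mapper.
    for line in lines:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Group by key and sum the counts, as a reducer would per key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = ["error timeout", "error disk", "ok"]
print(reduce_phase(map_phase(logs)))
# → {'error': 2, 'timeout': 1, 'disk': 1, 'ok': 1}
```

Because each mapper works on its own slice of the input and reducers work on disjoint keys, the same logic scales out to the very large data volumes the Hortonworks team targets.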
Posted by David Menninger on Jun 24, 2011 10:55:19 AM
InetSoft is a business intelligence vendor that is not well-known but has more than 3,000 customers. Why do you need to know about another BI vendor? As I’ve written in the past, there’s a place in this market for both the megavendors and smaller vendors. InetSoft, one of the latter, has developed a broad set of capabilities over the years that have resonated with its customers. It recently announced and brought to market a significant new release, Style Intelligence 11.
Posted by David Menninger on Jun 22, 2011 10:26:10 AM
As recently as two years ago, Pentaho was all about open source business intelligence. The company used an open source business model to build a base of more than 1,200 paying customers and establish more than 8,000 production deployments. It still has an open source business model, but the company has created a broad yet integrated product line that deserves to be evaluated on its features, not just its licensing scheme. This week Pentaho announced version 4.0 of its BI suite along with version 4.2 of Pentaho Data Integration (aka Kettle).
Posted by David Menninger on Jun 19, 2011 7:36:37 AM
Cloudera is riding the wave of big data. I first learned about the company while working at Vertica, one of Cloudera’s partners. Customers that managed large amounts of structured relational data also needed to process large amounts of semistructured data such as the type found in web logs and application logs. The emerging channel of social media provided another source of data lacking the structure that would lend itself to analysis in a relational database. Other organizations needed to perform calculations and analyses that were difficult to express in SQL. Seeing this market, Cloudera recognized earlier than others an opportunity to leverage the Apache Hadoop project; it has been offering the Cloudera Distribution for Hadoop (CDH) since early 2009.
Posted by David Menninger on Jun 17, 2011 1:02:00 PM
I recently attended IBM’s analyst summit on business analytics. Since last year’s event was largely a preview of Cognos 10, which was several years in the making, I wondered what this year’s event would be about. IBM focused much of the attention on predictive analytics, strengthened by its acquisition of SPSS. My colleague Robert Kugel covered another theme from the event in his post on Cognos Planning.
Posted by David Menninger on Jun 8, 2011 1:13:34 PM
Informatica has announced version 9.1 for Big Data. I wrote previously about Informatica 9.1, the latest iteration of the company’s data integration platform, following its industry analyst summit. At that event in February, company officials alluded to future plans regarding Hadoop and other big-data sources yet to be finalized. This announcement reveals those plans. Informatica will support three types of “big data”: big transaction data from relational databases and data warehouse systems, big interaction data from social media, customer interaction systems and other systems, and big data processing, which means Hadoop, the open source software framework. Let’s look at each of these types.
Posted by David Menninger on May 18, 2011 8:06:57 AM
Last week I attended the IBM Big Data Symposium at the Watson Research Center in Yorktown Heights, N.Y. The event was held in the auditorium where the recent Jeopardy shows featuring the computer called Watson took place and which still features the set used for the show – a fitting environment for IBM to put on another sort of “show” involving fast processing of lots of data. The same technology featured prominently in IBM’s big-data message, and the event was an orchestrated presentation more like a TV show than a news conference. Although it announced very little news at the event, IBM did make one very important statement: The company will not produce its own distribution of Hadoop, the open source distributed computing technology that enables organizations to process very large amounts of data quickly. Instead it will rely on and throw its weight behind the Apache Hadoop project – a stark contrast to EMC’s decision to do exactly that, announced earlier in the week. As an indication of IBM’s approach, Anant Jhingran, vice president and CTO for information management, commented, “We have got to avoid forking. It’s a death knell for emerging capabilities.”
Posted by David Menninger on May 12, 2011 4:29:33 PM
Earlier this week EMC announced it will create its own distribution of Apache Hadoop. Hadoop provides distributed computing capabilities that enable organizations to process very large amounts of data quickly. As I have written previously, the Hadoop market continues to grow and evolve. In fact, the rate of change may be accelerating. Let’s start with what EMC announced and then I’ll address what the announcement means for the market.
Posted by David Menninger on Apr 29, 2011 7:49:11 AM
As part of our largest-ever research study on business analytics, which surveyed more than 2,600 organizations covering the maturity and competency of business, IT and vertical industries, we looked at how IT is applying analytics to support their own business activities. One of the things we found is that, charged with enabling business units to use information systems as effectively as possible, the IT department, like the shoemaker’s barefoot children in the old tale, typically stands last in line for resources to manage its own performance. In trying to understand and tune the collection of networking and operating systems, middleware and applications an enterprise needs to operate, IT professionals usually have to make do with small sets of historical data stored in spreadsheets and data warehouses and marts that are not as well managed as the systems they maintain to support the business. In most cases IT cannot apply the same level of analytics to its own operations that it provides to business units. This also has effects beyond IT itself: To the extent that the result is subpar performance of its core information systems, the business will suffer.
Posted by David Menninger on Apr 20, 2011 8:07:48 AM
In various forms, business intelligence (BI) – as queries, reporting, dashboards and online analytical processing (OLAP) – is being used increasingly widely. And as basic BI capabilities spread to more organizations, innovative ones increasingly are exploring how to take advantage of the next step in the strategic use of BI: predictive analytics. The trend in Web searches for the phrase “predictive analytics” gives one indication of the rise in interest in this area. From 2004 through 2008, the number of Web searches was relatively steady. Beginning in 2009, the number of searches rose significantly and has continued to rise.
Posted by David Menninger on Apr 1, 2011 7:33:05 AM
The information management (IM) technology market is undergoing a revolution similar to the one in the business intelligence (BI) market. We define information management as the acquisition, organization, control and use of information to create and enhance business value. It is a necessary ingredient of successful BI implementations, and while some vendors such as IBM, Information Builders, Pentaho and SAP are integrating their BI and IM offerings, each discipline involves different aspects of the use of information and will require tools that are sometimes integrated and sometimes separate.
Posted by David Menninger on Mar 20, 2011 8:29:01 PM
The business intelligence (BI) technology market is undergoing a revolution. I’ve been working in this segment for 20 years, and it is and has been an exciting market in which to work, but its dynamic nature can be daunting to organizations trying to evaluate, purchase and deploy BI to improve their business processes. And despite these advances, our benchmark research shows high levels of dissatisfaction with, and immaturity in, BI capabilities within organizations.
Posted by Ventana Research on Mar 17, 2011 7:40:43 PM
I recently attended SAS Institute’s annual analyst conference. My colleague covered the multibillion-dollar company’s strategy and the event. Now I want to look into some of the details of SAS’s products for business analytics and how they are supported with business intelligence (BI) and information management. Although SAS is not a publicly traded company and therefore is not required to make the financial disclosures that others are, the company revealed numerous financial statistics. Business intelligence represents over $200 million in license revenue to SAS. That’s a significant figure, larger than the revenue of publicly traded BI vendors QlikTech (NASDAQ: QLIK) and Actuate (NASDAQ: BIRT), and smaller than but still in the same order of magnitude as that of MicroStrategy (NASDAQ: MSTR) and Information Builders. These figures are consistent with results in our benchmark research on business intelligence and performance management: 18% of our research respondents reported using SAS products, which places it in the middle of the pack.
Posted by David Menninger on Mar 13, 2011 9:56:18 PM
SAP has launched its Enterprise Information Management (EIM) 4.0 release as part of its “Run Better Tour.” It includes a broad range of information management components spanning data integration, data quality, data profiling, metadata management and more. The launch was done in conjunction with SAP Business Intelligence (BI) 4.0, which got much bigger billing at the event – to the point where one might call this a stealth marketing campaign. However, the event did identify three themes intended to highlight EIM capabilities: event insight, trusted data and text processing. The goal here was to communicate the integration SAP has achieved within and between its BI and EIM products. IBM announced a similar advance with its InfoSphere products, and Informatica has also invested heavily in integrating its information management products. Our Information Management benchmark research validates this approach, finding that incompatible tools create a significant obstacle to organizations’ quest for consistent sets of data.
Posted by David Menninger on Mar 4, 2011 7:14:49 AM
There has been a spate of acquisitions in the data warehousing and business analytics market in recent months. In May 2010 SAP announced an agreement to acquire Sybase, primarily for its mobility technology, while continuing to advance its own efforts with SAP HANA and BI. In July 2010 EMC agreed to acquire data warehouse appliance vendor Greenplum. In September 2010 IBM countered by acquiring Netezza, a competitor of Greenplum. In February 2011 HP, after giving up on its original focus with HP Neoview, announced its acquisition of analytics vendor Vertica. Even Microsoft shipped a new release of its SQL Server database and appliance products in 2010. Now, less than one month later, Teradata has announced its intent to acquire Aster Data for analytics and data management. Teradata bought an 11% stake in Aster Data in September, so its purchase of the rest of the company shouldn’t come as a complete surprise. My colleague had raised the question of whether Aster Data could be the new Teradata; now it is part of Teradata.
Posted by David Menninger on Mar 3, 2011 9:52:44 PM
This is the second in a series of posts on the architectures of analytic databases. The first post addressed massively parallel processing (MPP) and database technology. In this post, we’ll look at columnar database technology. Given the recent announcement of HP’s plan to acquire Vertica, a columnar database vendor, there is likely to be even more interest in columnar database technology, how it operates and what benefits it offers.
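The row-versus-column contrast at the heart of columnar databases can be shown in a tiny sketch. This is a simplified illustration of the storage layout trade-off, not any vendor’s implementation; the sample data is invented:

```python
# Simplified contrast between row-oriented and column-oriented storage.
# A row store keeps each record's fields together:
rows = [
    ("2011-03-01", "east", 100.0),
    ("2011-03-01", "west", 250.0),
    ("2011-03-02", "east", 175.0),
]

# A column store keeps each column's values together instead:
columns = {
    "date":   [r[0] for r in rows],
    "region": [r[1] for r in rows],
    "amount": [r[2] for r in rows],
}

# Summing one measure in a row store touches every field of every row...
total_row_store = sum(r[2] for r in rows)

# ...while a column store scans only the one column it needs. Values of
# a single type also sit contiguously, which is why columnar databases
# compress analytic data so well.
total_col_store = sum(columns["amount"])

assert total_row_store == total_col_store == 525.0
```

Both layouts yield the same answer; the difference is how much data a typical warehouse query (many rows, few columns) must read from disk.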
Posted by David Menninger on Mar 2, 2011 9:47:05 PM
It’s clear that now we are living in the era of big data. The stores of data on which modern businesses rely are already vast and increasing at an unprecedented pace. Organizations are capturing data at deeper levels of detail and keeping more history than they ever have before. Managing all of the data is thus emerging as one of the key challenges of the new decade.
Posted by David Menninger on Mar 1, 2011 9:25:27 PM
Last week SAP launched the 4.0 Release of its Business Intelligence and Enterprise Information Management products in conjunction with the New York City stop on its “SAP Run Better Tour”. My colleague Mark Smith has already covered the announcement in the context of some of today’s major technology trends. In this post, I’ll focus on the specifics of the product announcements.
Posted by David Menninger on Feb 27, 2011 11:42:32 AM
At Informatica’s recent industry analyst summit, Chris Boorman, the company’s chief marketing officer, opened the event by describing Informatica as expanding beyond its core offering of data integration into a broader set of capabilities. He compared this growth to Amazon expanding from being an online bookseller to offering computing resources via Amazon Web Services. I see it almost the opposite way. Informatica has always been in the data integration business. It has excelled at making this area of IT more relevant and more applicable to broader audiences. My colleague described its latest efforts to focus on line-of-business users in a recent post. My purpose here is to review some of the highlights of the company’s latest product releases.
Posted by David Menninger on Feb 14, 2011 3:33:01 PM
Kognitio announced the addition of MultiDimensional eXpressions (MDX) capabilities for its WX2 product line. John Thompson, CEO of U.S. operations, and Sean Jackson, VP of marketing, shared some of the details with me recently. I find the marriage of MDX and large-scale data both technically challenging and potentially valuable to the market.
Posted by David Menninger on Feb 7, 2011 11:12:23 AM
Last week I attended MicroStrategy World 2011 in Las Vegas, the North American version of the business intelligence (BI) vendor’s annual user conference. The event was well attended, and the company claimed attendance was up 40% over last year. The purpose of this post is to recap the announcements made, highlight the areas where MicroStrategy is making investments and comment on the overall direction implied by these investments.
Posted by David Menninger on Jan 28, 2011 12:22:47 PM
Open source business intelligence (BI) software vendor Jaspersoft recently announced general availability of its flagship product Jaspersoft 4 and earlier this week announced a new reporting project that provides data connectors to a variety of large-scale data sources.
Posted by David Menninger on Jan 19, 2011 11:06:15 AM
This is the first in a series of posts on the architectures of analytic databases, that is, relational database technology that has been “supercharged” in some way to handle large amounts of data such as typical data warehouse workloads. The series will examine massively parallel processing (MPP), columnar databases, appliances and in-database analytics. Our purpose is to help those evaluating analytic database technologies understand some of the alternative approaches so they can differentiate between vendors’ offerings. There is no single best solution for all types of analytic applications; usually the decision involves a series of trade-offs. By understanding what you might be giving up or gaining, you can make a better decision about which solution best fits your organization’s needs.
Posted by David Menninger on Jan 4, 2011 4:55:16 PM
Cloud computing is having an increasingly large influence on the IT landscape. Whether you realize it or not, it’s likely that corporate data already exists outside the walls of your organization or is migrating there. Recent research by Ventana Research shows that in areas such as customer service, sales, workforce and human capital management, software as a service (SaaS) or cloud-based applications increasingly are being accepted and adopted. In our benchmark research on business intelligence and performance management, for example, only 53 percent of organizations prefer their systems on-premises, and we expect that percentage to decline over the next 12 to 24 months, a period in which more than one-third of organizations plan to begin using cloud-based or SaaS applications.
Posted by David Menninger on Dec 15, 2010 2:03:01 PM
Last week I attended salesforce.com’s Dreamforce user conference in San Francisco (Twitter #DF10). As a user of salesforce applications for the last four years in my previous positions, I was familiar with its analytic capabilities, or lack thereof. Certainly you can accomplish simple reporting and produce dashboards displaying salesforce data, which is adequate for narrowly focused reporting and analysis. However, as a user I was underwhelmed: there are no built-in capabilities for pixel-perfect reporting, drill-and-pivot navigation of data, advanced visualization or predictive analytics. If you need to perform serious what-if analysis or predictive analytics against your sales and marketing data, you’ll have to do at least some custom programming to make it work in salesforce. Given the overall importance of business intelligence, I was expecting to hear more at Dreamforce about new analytics capabilities for specific lines of business.
Posted by David Menninger on Dec 9, 2010 5:47:33 PM
Talend, a vendor of open source data integration tools, recently announced its acquisition of Sopera, an open source application integration company whose products are based on a service-oriented architecture (SOA). It simultaneously announced an additional $34 million of funding. As I pondered what the announcements mean, I couldn’t help but think of the bigger picture. Is this entrepreneurial action typical of an open source vendor?
Posted by David Menninger on Nov 30, 2010 11:14:29 AM
There’s a lot going on in search technology, still or again, depending on your perspective. We’ve analyzed search in a business context periodically over the years. Here I want to provide further analysis of the business side of search, prompted by the many announcements I have been analyzing over the last two months from Endeca, IBM Cognos, MarkLogic and QlikView, all of which include significant enhancements to existing search capabilities in their most recent product upgrades.
Posted by David Menninger on Nov 26, 2010 6:05:27 PM
Tableau Software officially released Version 6 of its product this week. Tableau approaches business intelligence from the end user’s perspective, focusing primarily on delivering tools that allow people to easily interact with data and visualize it. With this release, Tableau has advanced its in-memory processing capabilities significantly. Fundamentally Tableau 6 shifts from the intelligent caching scheme used in prior versions to a columnar, in-memory data architecture in order to increase performance and scalability.
Posted by David Menninger on Nov 26, 2010 5:57:53 PM
Interest in and development of in-memory technologies have increased over the last few years, driven in part by the widespread availability of affordable 64-bit hardware and operating systems and by the performance advantages in-memory operations provide over disk-based operations. Some software vendors, such as SAP with its steadily advancing High-Performance Analytic Appliance (HANA) project, have even suggested that we can put our entire analytic systems in memory.
Posted by David Menninger on Nov 26, 2010 5:56:00 PM
In the weeks leading up to and as part of its Information On Demand Conference that my colleague assessed, IBM introduced version 8.5 of InfoSphere Information Server and several related product updates. As my colleague suggested earlier, IBM has an ambitious agenda to provide comprehensive information management capabilities through a combination of product development and acquisitions. The breadth of this portfolio is impressive, and InfoSphere Information Server 8.5 makes significant strides in tying the various pieces together.
Posted by David Menninger on Nov 26, 2010 5:53:19 PM
On October 25, IBM introduced Cognos 10 at its Information on Demand and Business Analytics Forum in Las Vegas, which I attended to take a closer look at the technology following my initial examination at IBM’s Business Analytics analyst summit in September. According to Rob Ashe, IBM’s general manager of business analytics, Cognos 10 was in development for more than six years. You’re probably aware that in that period IBM made a variety of acquisitions, including Cognos itself. These acquisitions and their impact on the new product are clearly in evidence in the release.
Posted by David Menninger on Nov 26, 2010 5:51:15 PM
My colleague recently wrote about QlikView, noting its rapid ascent among BI vendors, its robust support for mobile technology platforms and its integration with SAP. On the occasion of the release of a major product revision, QlikView 10, I’d like to add my perspective on the company and its most recent release. I first learned of QlikView about five years ago while working on the TM1 product line, which, like QlikView, is a 64-bit, in-memory analytic technology supporting business intelligence needs across business and IT.
Posted by David Menninger on Nov 26, 2010 5:48:46 PM
If you enjoyed my previous blog, “Hadoop Is the Elephant in the Room,” perhaps you’d be interested in what your organization might do with Hadoop. As I mentioned, the Hadoop World event this week showcased some of the biggest and most mature Hadoop implementations, such as those of eBay, Facebook, Twitter and Yahoo. Those of you who need 8,500 processors and 16 petabytes of storage like eBay likely already know about Hadoop. But is Hadoop relevant to organizations whose data, while still substantial, is far smaller?
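For readers new to the technology, the MapReduce programming model at the heart of Hadoop can be sketched in a few lines. This is plain Python illustrating the idea only (Hadoop itself runs distributed jobs, typically written in Java, across a cluster): a map step emits key/value pairs, the framework shuffles and groups pairs by key, and a reduce step aggregates each group. The classic example is a word count.

```python
from collections import defaultdict

def map_phase(line):
    # Map: emit a (word, 1) pair for each word in an input line.
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(word, counts):
    # Reduce: sum all counts emitted for a single word.
    return word, sum(counts)

lines = ["Hadoop scales out", "Hadoop counts words"]

# Shuffle: group mapped pairs by key, as the framework would across nodes.
grouped = defaultdict(list)
for line in lines:
    for word, count in map_phase(line):
        grouped[word].append(count)

result = dict(reduce_phase(w, c) for w, c in grouped.items())
# e.g. result["hadoop"] == 2
```

The appeal for large datasets is that both the map and reduce steps can run in parallel across many machines, with the framework handling the distribution; whether that machinery pays off at smaller scales is exactly the question posed above.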
Posted by David Menninger on Nov 26, 2010 5:47:23 PM
Earlier this week I attended Hadoop World in New York City. Hosted by Cloudera, the one-day event was by almost all accounts a smashing success. Attendance was approximately double that of last year. There were five tracks filled mostly with user presentations. According to Mike Olson, CEO of Cloudera, the conference’s tweet stream (#hw2010) was one of the top 10 trending topics of that morning. Cloudera did an admirable job of organizing the event for the Hadoop community rather than co-opting it for its own purposes. Certainly, this was not done out of altruism, but it was done well and in a way that respected the time and interests of those attending.
Posted by David Menninger on Nov 26, 2010 5:45:43 PM
I attended the IBM Business Analytics Analyst Summit in Ottawa, and while I can’t tell you much about what was discussed there due to confidentiality restrictions that will be lifted shortly, I can share with you some of my own observations regarding the state of BI, particularly what’s wrong with it. By “wrong,” I mean why aren’t adoption rates higher? Why aren’t users more satisfied? Our Ventana Research benchmark research on BI indicates that only 37 percent of organizations are satisfied or very satisfied with their BI efforts.
Posted by David Menninger on Nov 26, 2010 5:44:00 PM
With the ongoing spate of mergers and acquisitions in the software industry, I’d like to offer some perspective on what these acquisitions mean to software-purchasing organizations. Think of the software industry as a thriving ecosystem, with large software companies at the top of the food chain. Small companies are formed and, if they have an interesting idea and some good marketing, they grow. The really good ones may continue to grow and be independent, but most of the good ideas eventually get bought by larger, more established companies.
Posted by David Menninger on Nov 26, 2010 5:42:22 PM
Here’s a big shout-out to the Ventana Research community. I’m happy to be here. I think it would be appropriate to introduce myself and tell you a little bit about why I’m here and what I hope to accomplish as a member of the Ventana Research team.
Posted by Ventana Research on Jun 4, 2010 10:43:56 AM
Cliché or not, a business’s most valuable asset is its people, and for 15 years Softscape has been dedicated to providing applications that help human resources organizations handle a range of processes that I call workforce performance management, including what the industry refers to as talent management. Privately held Softscape operates in 156 countries, and its customers range from large companies (with 1,000 to 10,000 employees) to the extremely large (which have over 50,000 employees). It has one of the highest customer renewal rates in the industry, is growing consistently and is attracting new customers with its cloud computing offering that provides software as a service (SaaS) that organizations rent rather than license and install themselves. Softscape also has entered the expanding market for customer experience management to support processes for maximizing customer relationships.