
        David Menninger's Analyst Perspectives


        Tableau 6 Combines In-Memory Processing and Visualization


        Tableau Software officially released Version 6 of its product this week. Tableau approaches business intelligence from the end user’s perspective, focusing primarily on delivering tools that allow people to easily interact with data and visualize it. With this release, Tableau has advanced its in-memory processing capabilities significantly. Fundamentally, Tableau 6 shifts from the intelligent caching scheme used in prior versions to a columnar, in-memory data architecture in order to increase performance and scalability.
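
        Tableau has not published the internals of its engine, so the following is only a conceptual sketch with made-up data, not a description of Tableau’s implementation. It illustrates why a columnar layout favors analytic queries: aggregating one field scans a single contiguous array instead of visiting every field of every row.

```python
# Conceptual sketch with made-up data, not Tableau's implementation:
# why columnar storage favors analytic queries.

rows = [  # row-oriented: each record stores every field together
    {"region": "East", "product": "A", "sales": 100.0},
    {"region": "West", "product": "B", "sales": 250.0},
    {"region": "East", "product": "B", "sales": 175.0},
]

columns = {  # column-oriented: the same data pivoted into per-field arrays
    "region": ["East", "West", "East"],
    "product": ["A", "B", "B"],
    "sales": [100.0, 250.0, 175.0],
}

# A row store visits every record in full just to read one field;
# a column store scans one contiguous array and ignores the rest.
assert sum(r["sales"] for r in rows) == sum(columns["sales"]) == 525.0
```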

        Tableau provides an interesting twist in its implementation of in-memory capabilities, combining in-memory data with data stored on disk. One of the big knocks against in-memory architectures has been the limitation imposed by the physical memory on the machine. In some cases, products were known to crash when data exceeded available memory. In others, the system didn’t crash but slowed so dramatically past that point that it almost appeared to have crashed.

        The advent of 64-bit operating systems dramatically raised the memory limits imposed by 32-bit systems. Servers can now be configured with significant amounts of memory at reasonable prices, but putting an entire warehouse or large-scale data set in memory on a single machine is still a stretch for most organizations. With Tableau 6 a portion of the data can be loaded into memory and the remainder left on disk. Coupled with the ability to link to data in an RDBMS, this provides considerable flexibility. Data can be loaded into memory, put on disk or linked to one of many supported databases. As the user interacts with the data, it is retrieved from the appropriate location. Tableau 6 also includes assistance in managing and optimizing the dividing line between in-memory and on-disk data, based on usage patterns.
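
        To make the idea concrete, here is a minimal Python sketch of one way such a hybrid scheme can work. The class, its API and the usage-based eviction policy are all assumptions for illustration; Tableau has not documented how its engine divides data between memory and disk.

```python
# Illustrative sketch only; the class, its API and the usage-based
# eviction policy are assumptions, not Tableau's documented design.
import os
import pickle
import tempfile
from collections import Counter

class HybridColumnStore:
    """Serve hot columns from memory; fault cold ones in from disk."""

    def __init__(self, memory_budget):
        self.memory_budget = memory_budget  # max columns resident in memory
        self.resident = {}                  # column name -> in-memory values
        self.on_disk = {}                   # column name -> path on disk
        self.reads = Counter()              # usage pattern per column

    def add_column(self, name, values):
        # New columns start on disk; read traffic decides what earns memory.
        path = os.path.join(tempfile.gettempdir(), name + ".col")
        with open(path, "wb") as f:
            pickle.dump(values, f)
        self.on_disk[name] = path

    def read_column(self, name):
        self.reads[name] += 1
        if name in self.resident:
            return self.resident[name]          # fast path: in memory
        with open(self.on_disk[name], "rb") as f:
            values = pickle.load(f)             # slow path: fetch from disk
        if len(self.resident) < self.memory_budget:
            self.resident[name] = values        # room left: promote
        else:
            # Evict the least-read resident column if this one is hotter.
            coldest = min(self.resident, key=lambda c: self.reads[c])
            if self.reads[name] > self.reads[coldest]:
                del self.resident[coldest]
                self.resident[name] = values
        return values
```

        A production engine would work at finer granularity and compress its columns, but the trade-off being managed, fast resident reads against slower disk fetches, is the same one Tableau 6 automates.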

        However, one place where this new architecture comes up short is the data refresh process. In the current Tableau 6, users must manually request a refresh of the data held in memory. Ideally there would be an optional automated way to keep the in-memory data up to date. The other thing I would like to see in Tableau 6 and other in-memory products is better read/write facilities. Although this version includes better “table calcs,” which can be used to display some derived data and perform limited what-if analyses, there is no write-back capability that would let you use Tableau as a planning tool and record the changes you explore.

        Tableau 6 includes a number of other features beyond the in-memory capabilities. It now supports a form of data federation in which data from multiple sources can be combined in a single analysis. The data can be joined on the fly in the Tableau server. Tableau refers to this as “data blending.” Users can also create hierarchies on the fly simply by dragging and dropping dimensions. And there are some new interactive graphing features such as dual-axis graphs with different levels of detail on each axis and the ability to exclude items from one axis but not the other, which can be helpful to correct for outliers such as the impact of one big sale on profitability or average deal size.
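
        In spirit, blending is a join performed at query time across sources that were never loaded into a common database. A minimal sketch, using hypothetical data and field names, might look like this:

```python
# Hypothetical data and field names; a minimal sketch of combining two
# sources on a shared dimension at query time, in the spirit of blending.

crm_deals = [  # primary source, e.g. rows extracted from a CRM
    {"region": "East", "deal_size": 120.0},
    {"region": "West", "deal_size": 300.0},
    {"region": "East", "deal_size": 80.0},
]

quotas = {"East": 150.0, "West": 250.0}  # secondary source, keyed on region

# Join on the fly: each primary row picks up the matching secondary value.
blended = [{**deal, "quota": quotas.get(deal["region"])} for deal in crm_deals]

for row in blended:
    print(row["region"], row["deal_size"], row["quota"])
```

        Tableau performs this kind of join inside its server rather than requiring the sources to be co-located in one database, which is what makes the blending feel on the fly.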

        This release also supports several new data sources, including the Open Data Protocol (OData), the DataMarket section of Windows Azure Marketplace and Aster Data, which my colleague recently assessed.

        Version 6 also includes some IT-oriented enhancements. As Tableau has grown, its software has been deployed in ever-larger installations, which places a premium on its administrative facilities. The new release improves management of large numbers of users, with grouping, privilege assignment and specific selection and edit options. It also includes a new project view of objects created and managed within Tableau. All of these help advance Tableau toward departmental- and enterprise-class analytics deployments.

        Overall, the release includes features that should be well received by both end users and IT. It shares the end-user analytics category with QlikView 10, which I recently assessed, and Tibco Spotfire. I’ll be eager to see whether the company can push the in-memory capabilities even further in future releases. It is clear that, with its new in-memory computing and blending of data across sources, Tableau brings another viable option to the category of analytics for analysts.

        Let me know your thoughts or come and collaborate with me on Facebook, LinkedIn and Twitter.

        Regards,

        David Menninger – VP & Research Director

        David Menninger
        Executive Director, Technology Research

        David Menninger leads technology software research and advisory for Ventana Research, now part of ISG. Building on over three decades of enterprise software leadership experience, he guides the team responsible for a wide range of technology-focused data and analytics topics, including AI for IT and AI-infused software.


