Organizations today work with multiple applications and systems, including enterprise resource planning (ERP), customer relationship management (CRM) and supply chain management (SCM), where data can easily become fragmented and siloed. As an organization adds data sources, systems and custom applications, it becomes harder to manage data consistently and keep data definitions up to date. This increases the need for master data management (MDM) software that can provide a single source of truth to drive accurate analytics and business operations.
TIBCO is a large, independent cloud computing and data analytics software company that offers integration, analytics, business intelligence and event-processing software. It enables organizations to analyze streaming data in real time and to automate analytics processes. The platform offers more than 200 connectors, more than 200 enterprise cloud computing and application adapters, and support for more than 30 NoSQL (non-relational) databases, relational database management systems and data warehouses.
Organizations have become more agile and responsive in part by making their information technology more agile. Adopting a DevOps approach to application deployment has allowed organizations to deploy new and revised applications more quickly. DataOps is enabling organizations to be more agile in their data processes. As organizations embrace artificial intelligence (AI) and machine learning (ML), they are recognizing the need to adopt MLOps. The same desire for agility suggests that organizations should also adopt AnalyticOps.
Talend is a data integration and management software company that offers applications for cloud computing, big data integration, application integration, data quality and master data management. The platform enables personnel to work with relational databases, Apache Hadoop, Spark and NoSQL databases for cloud or on-premises jobs. Talend's data integration software offers an open, scalable architecture and can be integrated with multiple data warehouses, systems and applications to provide a unified view of all data. Its code-generation architecture uses a visual interface to generate Java or SQL code.
How does your organization define and display its metrics? I believe many organizations are not defining and displaying metrics in the way that benefits them most. If an organization goes to the trouble of measuring and reporting a metric, the analysis ought to include all the information needed to evaluate that metric effectively. A number by itself gives no indication of whether the result is good or bad. Too often, the reader is expected to know the difference, but why leave that evaluation to chance? Why not be explicit about what results are expected?
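To make the point concrete, here is a minimal sketch (a hypothetical `Metric` class, not drawn from any vendor's product) showing one way to pair each measured value with its expected value, so the display itself states whether the result is good or bad:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A measured result paired with the target needed to evaluate it."""
    name: str
    actual: float
    target: float
    higher_is_better: bool = True  # e.g., False for churn rate or defect count

    def status(self) -> str:
        # Compare the result to its target in the appropriate direction,
        # so the reader never has to guess whether the number is good.
        if self.higher_is_better:
            meets = self.actual >= self.target
        else:
            meets = self.actual <= self.target
        return "on target" if meets else "below target"

    def display(self) -> str:
        # Report the number, the expectation, and the evaluation together.
        return f"{self.name}: {self.actual} (target {self.target}), {self.status()}"
```

A report built this way shows `On-time delivery: 88.0 (target 95.0), below target` rather than a bare `88.0`, making the evaluation explicit instead of leaving it to the reader.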
Our research shows that nearly all financial services organizations (97%) consider it important to accelerate the flow of information and improve responsiveness. Even a few years ago, capturing and evaluating this information quickly was much more challenging. With the advent of streaming data technologies that capture and process large volumes of data in real time, however, financial services organizations can quickly turn events into valuable business outcomes in the form of new products, services or revenue.
Access to external data can provide a competitive advantage. Our research shows that more than three-quarters (77%) of participants consider external data an important part of their machine learning (ML) efforts. The most important external data source identified is social media, followed by demographic data from data brokers. Organizations also identified government data, market data, environmental data and location data as important external data sources. External data is not just part of ML analyses, though. Our research shows that external data sources are also a routine part of data preparation processes, with 80% of organizations incorporating one or more external data sources. And a similar proportion of participants in our research (84%) include external data in their data lakes.
The technology industry throws around a lot of similar terms with different meanings as well as entirely different terms with similar meanings. In this post, I don’t want to debate the meanings and origins of different terms; rather, I’d like to highlight a technology weapon that you should have in your data management arsenal. We currently refer to this technology as data virtualization. Other similar terms you may have heard include data fabric, data mesh and [data] federation. I’ll briefly discuss these terms and how I see them being used, but ultimately, I’d like to share with you some research that shows why data virtualization can be valuable, regardless of what you call it.
Data governance is a hot topic these days. In fact, we are conducting benchmark research on the subject here. With increasing regulations such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), organizations face more external oversight of their data governance practices. The risk of significant fines under these and other regulations, coupled with organizations' internal compliance requirements, has brought more attention to data governance practices. We anticipate that through 2023, governing the privacy and security of their organization's data will be the primary concern of three-quarters of chief data officers.
The annual Ventana Research Digital Innovation Awards showcase advances in the productivity and potential of business applications, as well as technology that contributes significantly to the improved processes and performance of an organization. Our goal is to recognize technology and vendors that have introduced noteworthy digital innovations to advance business and IT.