I have written before about the continued use of specialist operational and analytic data platforms. Most database products can be used for operational or analytic workloads, and the number of use cases for hybrid data processing is growing. However, a general-purpose database is unlikely to meet the most demanding operational or analytic data platform requirements. Factors including performance, reliability, security and scalability necessitate the use of specialist data platforms. I assert that through 2026, and despite increased demand for hybrid operational and analytic processing, more than three-quarters of data platform use cases will have functional requirements that encourage the use of specialized analytic or operational data platforms. It is for that reason that specialist database providers, including Ocient, continue to emerge with new and innovative approaches targeted at specific data-processing requirements.
Organizations are continuously increasing their use of analytics and business intelligence to turn data into meaningful and actionable insights. Our Analytics and Data Benchmark Research shows some of the benefits of using analytics: improved efficiency in business processes, improved communication and a competitive edge in the market top the list. With a unified BI system, organizations can gain a comprehensive view of all organizational data to better manage processes and identify opportunities.
I have written previously that the world of data and analytics will become more and more centered on real-time, streaming data. Data is created constantly and, increasingly, is collected as it is generated. Technology advances now enable organizations to process and analyze information as it is being collected and to respond in real time to opportunities and threats. Not all use cases require real-time analysis and response, but many do, including multiple use cases that can improve customer experiences. For example, best-in-class e-commerce interactions should provide real-time updates on inventory status to avoid stock-out or back-order situations. Customer service interactions should provide real-time recommendations that minimize the time to resolution. Location-based offers should be targeted at the customer’s current location, not their location several minutes ago. Another domain where real-time analyses are critical is internet of things (IoT) applications. For instance, predictive maintenance requires timely information to anticipate equipment failures, helping avoid additional costs and damage.
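To make the predictive maintenance example concrete, the sketch below shows the general pattern of analyzing readings as they arrive rather than in a later batch: each new sensor value updates a rolling window, and an alert fires the moment the window's average crosses a limit. This is a minimal illustration, not any vendor's implementation; the function name, window size and threshold are all hypothetical.

```python
from collections import deque

def rolling_alert(readings, window=5, threshold=80.0):
    """Flag maintenance alerts when the rolling average of the last
    `window` readings exceeds `threshold`. Illustrative only; the
    window size and threshold are hypothetical values."""
    recent = deque(maxlen=window)  # keeps only the most recent readings
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        # Evaluate the condition on each arrival, not after the fact.
        if len(recent) == window and sum(recent) / window > threshold:
            alerts.append(i)  # index of the reading that triggered the alert
    return alerts

# A stream of temperature readings; the last few run hot.
stream = [70, 71, 69, 72, 70, 85, 88, 90, 91, 93]
print(rolling_alert(stream))  # → [7, 8, 9]
```

The same per-event logic is what streaming engines apply at scale; the point is that the decision is made while the data is still timely, before a batch job would have run.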
Teradata introduced enhancements to its Vantage platform last year, expanding its analytics functions and language support and strengthening tools to improve collaboration among data scientists, business analysts, data engineers and business personnel. Key enhancements included expanded native support for R and Python, an extended ability to execute a wide range of open-source analytics algorithms, and automatic generation of SQL from R and Python code. These updates are intended to reduce data silos, enabling a wide range of data and analytics personas to collaboratively run complex analytics in a self-service manner.
The amount of data flowing into organizations is growing exponentially, creating a need to process more data more quickly than ever before. Our Data Preparation Benchmark Research shows that accessing and preparing data continues to be the most time-consuming part of making data available for analysis. This can slow down the organizational functions that depend on the analysis results. Trying to get ahead of the backlog with incremental improvements to existing approaches and traditional technologies alone can be frustrating.