Sensitivity to environmental, social and governance issues – or ESG – has grown over the years, and with it, attention from investors and government entities urging organizations to measure and disclose ESG metrics.
The topic of revenue operations has been extensively covered recently, not least by vendors extolling the virtues of their particular offerings. But as with much of the software industry, vendors often see the market through the lens of their current product capabilities rather than what is actually needed. With the rise of the mixed-revenue model – which combines subscription and usage pricing with one-time sales – and the growth of self-service commerce, more teams within an organization are directly involved in supporting revenue generation. In response, many organizations have appointed a Chief Revenue Officer (CRO) who is responsible and accountable for all sources of revenue. Given the rise of the CRO role and the growing adoption of mixed-revenue models, we see this as an increasingly necessary shift. We believe that leadership will need to drive this change in approach, recognizing that it will require a shift in responsibilities and, as importantly, accountability.
It’s likely that finance analytics trace back to when people first began to record transactions on clay tablets. Financial analytics were given a boost by the codification of double-entry bookkeeping, an elegant system for recording transactions that facilitates the assessment of the performance and health of an organization. Further advances came with the first mechanical – and then digital – systems for automating computations, while personal computing devices made the numbers accessible to all.
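The elegance of double-entry bookkeeping lies in a simple invariant: every transaction is recorded as equal debits and credits, so the ledger always balances, and any imbalance signals an error. A minimal sketch of that idea, with illustrative account names and amounts:

```python
# A minimal sketch of double-entry bookkeeping. Account names and amounts
# are hypothetical; the point is the balancing invariant.

ledger = []

def record(debit_account, credit_account, amount):
    # One transaction always produces two entries of equal size.
    ledger.append({"account": debit_account, "debit": amount, "credit": 0})
    ledger.append({"account": credit_account, "debit": 0, "credit": amount})

record("Cash", "Sales", 500)       # a cash sale
record("Inventory", "Cash", 200)   # buying stock with cash

total_debits = sum(e["debit"] for e in ledger)
total_credits = sum(e["credit"] for e in ledger)
# The balancing check at the heart of the system: debits equal credits.
assert total_debits == total_credits
```

Because the invariant holds across every transaction, summing any subset of accounts yields figures that can be cross-checked, which is what makes the system useful for assessing performance and health.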
I have written a few times in recent months about vendors offering functionality that addresses data orchestration. This is a concept that has grown in popularity over the past five years amid the rise of Data Operations (DataOps), which describes more agile approaches to data integration and data management. In a nutshell, data orchestration is the process of combining data from multiple operational data sources and preparing and transforming it for analysis. To those unfamiliar with the term, this may sound very much like the tasks that data management practitioners have been undertaking for decades. As such, it is fair to ask what separates data orchestration from traditional approaches to data management. Is it really something new that can deliver innovation and business value, or just the rebranding of existing practices designed to drive demand for products and services?
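The "combine, prepare and transform" pattern described above can be sketched in a few lines. This is not any vendor's API – the sources, field names and join logic here are hypothetical stand-ins for operational systems:

```python
# A minimal sketch of a data orchestration pipeline, assuming two hypothetical
# operational sources (orders and customers) represented as lists of dicts.

def extract_orders():
    # Stand-in for pulling records from an operational orders system.
    return [
        {"order_id": 1, "customer_id": "c1", "amount": "120.50"},
        {"order_id": 2, "customer_id": "c2", "amount": "75.00"},
    ]

def extract_customers():
    # Stand-in for pulling records from a customer master system.
    return [
        {"customer_id": "c1", "region": "EMEA"},
        {"customer_id": "c2", "region": "AMER"},
    ]

def transform(orders, customers):
    # Prepare and combine: cast string amounts to numbers and join on customer_id.
    regions = {c["customer_id"]: c["region"] for c in customers}
    return [
        {
            "order_id": o["order_id"],
            "region": regions.get(o["customer_id"], "UNKNOWN"),
            "amount": float(o["amount"]),
        }
        for o in orders
    ]

def orchestrate():
    # Sequence the extract and transform steps into an analysis-ready dataset.
    return transform(extract_orders(), extract_customers())

print(orchestrate())
```

What distinguishes orchestration from ad hoc scripting is the explicit sequencing and dependency management of steps like these – in practice handled by a scheduler rather than a single function call.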
Artificial intelligence and machine learning are valuable to data and analytics activities. Our research shows that organizations using AI/ML report gaining competitive advantage, improving customer experiences, responding faster to opportunities and threats and improving the bottom line with increased sales and lower costs. No wonder nearly 9 in 10 (87%) research participants report using AI/ML or planning to do so.
Planful recently acquired Plannuh, a marketing-performance management application, to integrate into the Planful platform so that organizations can connect their marketing planning and analysis group with the finance department. There’s the old story of a CEO who said, “I know half my marketing spend is wasted, I just don’t know which half.” Plannuh is designed to answer that question.
Ventana Research’s Data Lakes Dynamics Insights research illustrates that while data lakes are fulfilling their promise of enabling organizations to economically store and process large volumes of raw data, data lake environments continue to evolve. Data lakes were initially based primarily on Apache Hadoop deployed on-premises but are now increasingly based on cloud object storage. Adopters are also shifting from data lakes based on homegrown scripts and code to open standards and open formats, and they are beginning to embrace the structured data-processing functionality that supports data lakehouse capabilities. These trends are driving the evolution of vendor product offerings and strategies, as typified by Cloudera’s recent launch of Cloudera Data Platform (CDP) One, described as a data lakehouse software-as-a-service (SaaS) offering.
Much has been written in recent years on the emergence of subscription management as a new revenue model that both vendors and buyers are embracing as the future. The benefits speak to the value of a predictable revenue stream for the vendor, but more importantly, the advantages to the customer: a lower initial outlay, a predetermined expense over the lifetime of usage and the ability to cancel or suspend on demand.
As I recently pointed out, process mining has emerged as a pivotal technology for data-driven organizations to discover, monitor and improve processes through the use of real-time event data, transactional data and log files. With recent advancements, process mining has become more efficient at discovering insights in complex processes using algorithms and visualizations. Organizations use it to better understand the current state of systems and business processes. It is also used to enable business process intelligence and improvement in any function or industry, using events and activity models for data-driven decision-making. We assert that through 2024, 1 in 4 organizations will look to streamline their operations by exploring process mining to optimize workflows and business processes.
I have written before about the continued use of specialist operational and analytic data platforms. Most database products can be used for operational or analytic workloads, and the number of use cases for hybrid data processing is growing. However, a general-purpose database is unlikely to meet the most demanding operational or analytic data platform requirements. Factors including performance, reliability, security and scalability necessitate the use of specialist data platforms. I assert that through 2026, and despite increased demand for hybrid operational and analytic processing, more than three-quarters of data platform use cases will have functional requirements that encourage the use of specialized analytic or operational data platforms. It is for that reason that specialist database providers, including Ocient, continue to emerge with new and innovative approaches targeted at specific data-processing requirements.
A year ago, I wrote about how technology could be useful in an inflationary period, correctly anticipating the world we live in now. Responding effectively to changes in costs is always challenging, but even more so because of the choppy and chaotic nature of the current environment. Many organizations have limited or no ability to raise prices and are forced to find ways to minimize the impact of rising costs. And while it’s true that some organizations have a degree of pricing power, behind this generalization there is a more complex reality, because the ability to raise prices often varies by product, customer and channel. Companies can best address the challenges of inflation by adopting a technique that Ventana Research calls “profitability management.”
Through 2025, establishing customer experience application suites on a common platform will be the focal point of the drive to optimize customer and organizational engagement. Organizations that are passionate about improving the customer experience are choosing to empower processes and people with intelligence through smarter applications that embrace analytics, artificial intelligence and automation to personalize and optimize the customer journey, whatever channel the customer chooses.
Earlier this year I described the growing use cases for hybrid data processing. Although it is anticipated that the majority of database workloads will continue to be served by specialist data platforms targeting operational and analytic workloads respectively, there is increased demand for intelligent operational applications infused with the results of analytic processes, such as personalization and artificial intelligence-driven recommendations. There are multiple data platform approaches to delivering real-time data processing and analytics, including the use of streaming data and event processing and specialist, real-time analytic data platforms. We also see operational data platform providers, such as Aerospike, adding analytic processing capabilities to support these application requirements via hybrid operational and analytic processing.
A predictive finance department is one that uses technology to be more forward-looking and action-oriented while still fulfilling its core role of handling the financial elements of the organization, including accounting, treasury and corporate finance. Beyond just automating rote tasks, technology also facilitates the shift toward becoming a predictive finance organization. Greater amounts of information, now available in near real time, and the increasing use of artificial intelligence (AI) enable more immediate analyses and assessments of possible courses of action, giving executives and managers the ability to better anticipate change and the agility to adapt quickly to unexpected circumstances.
Process mining is defined as the analysis of application telemetry – including log files, transaction data and other instrumentation – to understand and improve operational processes. Log data provides an abundance of information about what operations are occurring, the sequences involved in the processes, how long the processes take and whether they complete successfully. As computing power has increased and storage costs have decreased, the economics of collecting and analyzing large amounts of log data have become much more attractive.
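The three questions log data answers – what happened and in what order, how long it took, and whether it finished – can be sketched with a toy event log. This is an illustrative example, not any process mining tool's API; the case IDs, activities and timestamps are invented:

```python
# A minimal sketch of process mining over an event log. Case IDs, activities
# and timestamps are hypothetical stand-ins for real log records.
from collections import defaultdict
from datetime import datetime

events = [
    # (case_id, activity, timestamp) tuples as they might appear in a log file.
    ("A", "receive", "2023-01-01T09:00"),
    ("A", "approve", "2023-01-01T10:30"),
    ("A", "ship",    "2023-01-01T12:00"),
    ("B", "receive", "2023-01-01T09:15"),
    ("B", "approve", "2023-01-01T11:45"),
]

def mine(events, final_activity="ship"):
    # Group events into per-case traces, ordered by timestamp.
    cases = defaultdict(list)
    for case_id, activity, ts in sorted(events, key=lambda e: e[2]):
        cases[case_id].append((activity, datetime.fromisoformat(ts)))
    summary = {}
    for case_id, steps in cases.items():
        sequence = [activity for activity, _ in steps]
        hours = (steps[-1][1] - steps[0][1]).total_seconds() / 3600
        summary[case_id] = {
            "sequence": sequence,                        # what happened, in order
            "hours": hours,                              # how long it took
            "complete": sequence[-1] == final_activity,  # did it finish
        }
    return summary

print(mine(events))
```

Real process mining systems apply the same grouping at scale and layer discovery algorithms and visualizations on top, but the raw material is exactly this kind of case-ordered event trace.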