I recently noted that as demand for real-time interactive applications becomes more pervasive, the use of streaming data is becoming more mainstream. Streaming data and event processing have been part of the data landscape for many decades, but for much of that time, data streaming was a niche activity. Although adopted in industry segments with high-performance, real-time data processing and analytics requirements, such as financial services and telecommunications, data streaming was far less common elsewhere. That has changed significantly in recent years, fueled by the proliferation of open-source and cloud-based streaming data and event technologies that have lowered the cost and technical barriers to developing new applications able to take advantage of data in motion. This is a trend we expect to continue, to the extent that streaming data and event processing becomes an integral part of mainstream data-processing architectures.
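The defining trait of data in motion is that results are produced continuously as events arrive, rather than after a batch completes. A minimal sketch of that pattern (the function name, window size and sample readings are illustrative, not drawn from any particular streaming product):

```python
from collections import deque

def rolling_average(events, window=3):
    """Consume a stream of numeric events and yield a rolling average
    after each one -- output is produced continuously as data arrives,
    not once at the end as in batch processing."""
    buf = deque(maxlen=window)  # bounded buffer over the most recent events
    for value in events:
        buf.append(value)
        yield sum(buf) / len(buf)

# A finite list stands in for an unbounded event source here.
readings = [10, 20, 30, 40]
print(list(rolling_average(readings)))  # [10.0, 15.0, 20.0, 30.0]
```

Because the function is a generator, each result is available the moment its triggering event is processed, which is the property real-time applications depend on.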
The analytics and business intelligence market landscape continues to grow as more organizations seek robust tools and capabilities to visualize and better understand data. BI systems are used to perform data analysis, identify market trends and opportunities, and streamline business processes. They can collect and combine data from internal and external systems to present a holistic view.
The applicant tracking system, for all its shortcomings, revolutionized the way people found and applied for jobs when it first hit the market in the mid-1990s. Electronic applications quickly became the norm, resume and application review became more accessible for hiring teams, and compliance was much more trackable and achievable, thanks to streamlined application processes. Today, tracking and compliance aren’t enough to power the complex world of recruitment. The Great Resignation has made it abundantly clear that candidates expect the same consumerized experience in the hiring process that they get when shopping online. To win or keep the best talent, organizations must make the hiring process personalized and enjoyable, and a traditional ATS simply cannot support that mandate.
Anaplan offers a cloud-based business planning platform that incorporates a modeling and calculation engine. The tool makes it relatively easy to add or expand the scope of plans that can be connected and monitored on a single platform. This Integrated Business Planning (IBP) approach enables organizations to use the software for financial planning or budgeting, sales, supply chain, workforce, marketing and IT planning. These are the types of plans in which companies often need to create models that incorporate their specific requirements, business systems and strategy. I expect that by 2025, one-fourth of financial planning and analysis (FP&A) groups will have implemented IBP.
I have recently written about the importance of healthy data pipelines to ensure data is integrated and processed in the sequence required to generate business intelligence, and the need for data pipelines to be agile in the context of real-time data processing requirements. Data engineers, who are responsible for monitoring, managing and maintaining data pipelines, are under increasing pressure to deliver high-performance and flexible data integration and processing pipelines that are capable of handling the rising volume and frequency of data. Automation is a potential solution to this challenge, and several vendors, such as Ascend.io, have emerged in recent years to reduce the manual effort involved in data engineering.
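The core guarantee a data pipeline provides is ordered, dependency-aware processing: each stage runs only after the stage it depends on has completed. A minimal sketch of that idea (the stage names and sample records are illustrative; this is not Ascend.io's API or any vendor's):

```python
def extract():
    # Stand-in source; a real pipeline would read from a database or stream.
    return [{"id": 1, "amount": "12.50"}, {"id": 2, "amount": "7.25"}]

def transform(rows):
    # Normalize types so downstream stages receive clean, typed records.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(rows):
    # Stand-in sink; a real pipeline would write to a warehouse.
    return sum(r["amount"] for r in rows)

def run_pipeline():
    # Function composition makes the required sequence explicit:
    # load depends on transform, which depends on extract.
    return load(transform(extract()))

print(run_pipeline())  # 19.75
```

Pipeline automation tools take this same dependency graph and add the monitoring, scheduling and error handling that data engineers would otherwise build and maintain by hand.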
The contact center industry is reexamining how organizations engage with contact center agents. One thing that we learned from the forced movement to work-from-home was that organizations have to provide agents with appropriate tools to collaborate and communicate with peers and supervisors as well as workers in the back office who participate in all sorts of customer-facing or customer-adjacent processes. It is also important to provide supervisors with visibility into agent activity. That means extending existing coaching and evaluation methods. Ventana Research believes that by 2025, nearly every organization will have dedicated systems or processes that help supervisors manage remotely.
I recently explained how emerging application requirements were expanding the range of use cases for NoSQL databases, increasing adoption based on the availability of enhanced functionality. These intelligent applications require a close relationship between operational data platforms and the output of data science and machine learning projects. This ensures that machine learning and predictive analytics initiatives are not only developed and trained based on the relationships inherent in operational applications, but also that the resulting intelligence is incorporated into the operational application in real time to support capabilities such as personalization, recommendations and fraud detection. Graph databases already support operational use cases such as social media, fraud detection, customer experience management and recommendation engines. Graph database vendors such as Neo4j are increasingly focused on the role that graph databases can play in supporting data scientists, enabling them to develop, train and run algorithms and machine learning models on graph data in the graph database, rather than extracting it into a separate environment.
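The recommendation use case mentioned above often rests on a simple graph idea: rank entities a user is not yet connected to by how many connections they share. A minimal sketch of that common-neighbors heuristic on an in-memory adjacency structure (the names and edges are made up; a graph database such as Neo4j stores and traverses these relationships natively rather than in Python dictionaries):

```python
from collections import defaultdict

# Illustrative social graph as undirected edges.
edges = [("alice", "bob"), ("alice", "carol"), ("bob", "dave"),
         ("carol", "dave"), ("carol", "erin")]

graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

def recommend(user):
    """Rank non-neighbors of `user` by shared connections --
    the common-neighbors heuristic behind simple link prediction."""
    scores = defaultdict(int)
    for friend in graph[user]:
        for candidate in graph[friend]:
            if candidate != user and candidate not in graph[user]:
                scores[candidate] += 1  # one shared connection found
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # ['dave', 'erin']
```

Running such algorithms inside the graph database, as the vendors described above enable, avoids extracting the relationship data into a separate environment like this one.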
Environmental, social and governance issues have grown increasingly pressing over the past few years as investors and government entities urge organizations to measure and disclose ESG metrics. I’ve already covered the broader topic as it relates to external reporting and how financial planning and analysis groups are likely to own this mandate going forward. (It’s mainly been a marketing and public relations effort up to now.) FP&A departments are also likely to be charged with responsibility for internal ESG analysis and reporting, because to achieve environmental and social goals, organizations will need to assign specific objectives to individual business units and their responsible parties. I assert that by 2025, more than one-half of corporations required to comply with ESG reporting will centralize responsibility for preparing related reports and filings with FP&A to achieve accuracy, control and efficiency objectives. To do so, FP&A groups must immediately establish a data management strategy consistent with their targeted ESG analysis and reporting approach.
I often use the term “analytics” to refer to a broad set of capabilities, deliberately broader than business intelligence. In this Perspective, I’d like to share what decision-makers should consider as they evaluate the range of analytics requirements for their organization.