Ventana Research coined the term intercompany financial management (IFM) to define a discipline for structuring and handling transactions within a corporation and between its legal entities. The approach is designed to maximize staff efficiency and accounting accuracy while optimizing tax exposure, minimizing tax leakage and ensuring consistent tax and regulatory compliance. Technology has advanced to a point where this approach is feasible and cost-effective. For that reason, Ventana Research asserts that by 2026, one-half of organizations with 10,000 or more employees will have implemented IFM to achieve tax, risk management and accelerated financial close benefits.
The Internet of Things describes machines and objects that are enhanced with sensors and communication connections, enabling them to report on their operational status, including outages, faults or threshold conditions that require attention. The technology has gradually made its way into business and consumer systems over the past decade. Today, these connected devices are playing an important role in customer experience processes such as field service.
In January of 2020, I was head of product innovation for a newly launched product in the human capital management technology space, targeting high-volume hiring. We had big ambitions for that year, tied to product development and sales, all documented during our annual goal-setting and performance review process. And then the pandemic hit, and everything changed overnight. Everything, that is, except for my annual goals, or those of my team, which had already been set in stone. When the annual review process came around, nothing we’d set forth earlier in the year was even applicable. How is a leader to evaluate and make compensation recommendations for a team member based on completely outdated criteria?
Despite the emphasis on organizations being more data-driven and making an increasing proportion of business decisions based on data and analytics, it remains the case that some of the most fundamental questions about an organization are difficult to answer using data and analytics. Ostensibly simple questions such as, “how many customers does the organization have?” can be fiendishly difficult to answer, especially for organizations with multiple business entities, regions, departments and applications. Increasing volumes and sources of data can hinder, rather than help. Only 1 in 5 participants (20%) in Ventana Research’s Analytics and Data Benchmark research are very confident in their organization’s ability to analyze the overall quantity of data. This is a perennial issue that data and application integration vendors, such as SnapLogic, are aiming to address – increasingly through automation and products for business users as well as data management professionals.
FinancialForce offers cloud-based ERP and professional services automation (PSA) software. The company targets midsize and larger services companies, especially those that provide professional services (such as consultants or field service organizations) as well as those that offer subscription-based or recurring revenue services and products. FinancialForce’s key point of differentiation is that it is built natively on the Salesforce platform, ensuring that CRM data is already located on the same platform as accounting and back-office data so organizations can orchestrate end-to-end, front-office to back-office processes without having to integrate different systems.
Ventana Research recently announced its 2023 Market Agenda for Human Capital Management, continuing the guidance we’ve offered for two decades to help organizations derive maximum potential from workforce- and people-related technology investments and initiatives. In crafting this Market Agenda, we focused on three critical themes top-of-mind for both HCM vendors and buyers: Organizational resiliency, employee engagement and utilizing digital technology to derive deeper insights into the state of the workforce so that leadership can take action to promote retention and growth.
I am happy to share insights from our latest Ventana Research Value Index research, which assesses how well vendors’ offerings meet buyers’ requirements. The 2023 Analytic Data Platforms Value Index is the distillation of a year of market and product research by Ventana Research. Drawing on our Benchmark Research, we apply a structured methodology built on evaluation categories that reflect real-world criteria incorporated in a request for proposal to data platform vendors supporting the spectrum of analytic use cases. Using this methodology, we evaluated vendor submissions in seven categories: five relevant to the Product Experience: Adaptability, Capability, Manageability, Reliability and Usability, and two related to the Customer Experience: Total Cost of Ownership/Return on Investment and Validation. This research-based index evaluates the full business and information technology value of analytic data platforms offerings. I encourage you to learn more about our Value Index and its effectiveness as a vendor selection and request for information/request for proposal tool.
I am happy to share insights gleaned from our latest Value Index research, an assessment of how well vendors’ offerings meet buyers’ requirements. The 2023 Ventana Research Value Index: Customer Experience Management is the distillation of a year of market and product research by Ventana Research. Drawing on our Benchmark Research, we apply a structured methodology built on evaluation categories derived from RFP responses submitted by customer experience (CX) vendors. These categories reflect the real-world criteria required by organizations for CX software procurement. Using this methodology, we evaluated vendor submissions in seven categories: five relevant to the Product Experience — Adaptability, Capability, Manageability, Reliability and Usability — and two related to the Customer Experience — TCO/ROI and Vendor Validation.
With Ventana Research’s 2023 Market Agenda, we continue the guidance we’ve offered for two decades to help organizations derive maximum value from digital business technology investments. Through our market research and expertise, we identify trends and best practices and share insights on how to achieve technological effectiveness, particularly in key processes and systems to engage the workforce.
Topics: Performance Management, Business Continuity, Digital transformation, Digital Business, Digital Security, Digital Communications, Work Management, Experience Management, Governance & Risk, Sustainability & ESG
Markets have been more volatile than ever, creating a need for decision-makers to utilize technologies such as artificial intelligence and machine learning (AI/ML) to better understand the external factors that impact their business. By identifying these factors, organizations can better plan for changing market environments and seize market opportunities. However, manual modeling is time-consuming and yields a limited number of models and tests, and updating those models is slow and laborious. Add market volatility, and the result is multiple challenges for CFOs, managers and financial planning specialists. With limited visibility into external drivers of demand and delivery, the process becomes very costly. Developing accurate forecasts requires integrating exogenous data with internal performance data, but it is challenging to find quality external data and then get that raw data clean enough to input into any model. My colleague, Robert Kugel, recently shared his perspective on using external data for forecasting, budgeting and planning to enhance predictive capabilities.
Ventana Research recently announced its 2023 Market Agenda in the expertise area of Marketing, continuing the guidance we have offered for nearly two decades to help organizations derive optimal value from business technology and improve outcomes.
The subscription and recurring revenue business models became a significant part of the economy this century with the advent of streaming services for entertainment and software as a service. They have grown in popularity because they enhance customer lifetime value, evolving what had previously been a one-time-sale relationship into the ongoing delivery of services, which can create a more loyal customer relationship as well as provide a regular, more predictable revenue stream. I recommend that corporations that have adopted or are planning to adopt either of these business models take a continuous accounting approach to managing their record keeping. Ventana Research asserts that by 2026, one-half of subscription organizations will use continuous accounting to remove constraints limiting sales and marketing flexibility, streamline back-office processes, shorten the accounting close and improve customer satisfaction.
Ventana Research recently published the 2023 Analytic Data Platforms Value Index. As organizations strive to be more data-driven, increasing reliance on data as a fundamental factor in business decision-making, the importance of the analytic data platform has never been greater. In this post, I’ll share some of my observations about how the analytic data platforms market is evolving.
Ventana Research recently announced its 2023 research agenda for Operations and Supply Chain, continuing the guidance we’ve offered for nearly two decades to help organizations across industries derive optimal value and improved outcomes from business technology.
Having just completed the 2023 Ventana Research Value Index for Customer Experience Management, I want to share some of my observations about how the market has developed. We found that there are many tools available for various needs related to customer management and communication, ranging from marketing tools to contact center systems to data and analytics applications. It is rare to find all the components fully integrated into a single platform, although that appears to be where the industry is headed. We found that despite the lack of clarity in the marketplace, technology is moving quickly to provide users with more extensive tools that work better together. The vendor landscape is fractured, but most are taking advantage of developments in artificial intelligence (AI), machine learning (ML) and workflow automation to deliver functionality that, in some cases, was simply not possible as recently as five years ago.
I am happy to share insights from our latest Ventana Research Value Index research, which assesses how well vendors’ offerings meet buyers’ requirements. The 2023 Operational Data Platforms Value Index is the distillation of a year of market and product research by Ventana Research. Drawing on our Benchmark Research, we apply a structured methodology built on evaluation categories that reflect real-world criteria incorporated in a request for proposal to data platform vendors supporting the spectrum of operational use cases. Using this methodology, we evaluated vendor submissions in seven categories: five relevant to the Product Experience: Adaptability, Capability, Manageability, Reliability and Usability, and two related to the Customer Experience: Total Cost of Ownership/Return on Investment and Validation.
Ventana Research recently announced its 2023 Market Agenda for the Office of Finance, continuing the guidance we have provided since 2003 on the practical use of technology for the finance and accounting department. Our insights and best practices aim to enable organizations to operate with agility and resiliency, improving performance and delivering greater value as a strategic partner.
Configure, price and quote (CPQ) software has traditionally been viewed as a specialist outpost of software: almost exclusively B2B and predominantly used in manufacturing, specialty chemicals and other industries with large numbers of variants of similar products that required careful configuration and pricing and, ultimately, the generation of a quote that could be forwarded to a prospective customer for consideration. But in recent years this has changed, both in the number of vendors offering CPQ products, either standalone or, more commonly, as part of a broader offering, and in the number of selling organizations recognizing the value of the “C,” the “P,” the “Q” or some combination of the three.
Ventana Research recently announced its 2023 Market Agenda for Data, continuing the guidance we have offered for two decades to help organizations derive optimal value and improve business outcomes.
Ventana Research recently announced its 2023 Market Agenda for Analytics, continuing the guidance we have offered for nearly two decades to help organizations derive optimal value from technology investments to improve business outcomes.
Ventana Research recently published the 2023 Operational Data Platforms Value Index. The importance of the operational data platform has never been greater as organizations strive to be more data-driven, incorporating intelligence into operational applications via personalization and recommendations for workers, partners and customers. In this post, I’ll share some of my observations on how the operational data platforms market is evolving.
A professional services automation application marries front- and back-office functions, helping services organizations address core business challenges by ensuring that:
Ventana Research recently announced its 2023 research agenda for the Office of Revenue, continuing the guidance we’ve offered for nearly two decades to help organizations realize optimal value from applying technology to improve business outcomes. Chief Sales and Revenue Officers face an imperative to manage their sales and revenue organizations, but they don’t always have the guidance they need to embrace technology to achieve the best possible outcomes. As we look forward to 2023, we are focusing on the entire selling and buying journey, as well as on the activities that ensure renewal and expansion and on newer digital engagement and selling channels. We are looking at applications that simplify processes and tasks across the customer experience, from beginning to end.
Topics: Sales, Analytics, Internet of Things, Data, Sales Performance Management, Digital Technology, Digital Commerce, Conversational Computing, AI and Machine Learning, mobile computing, Subscription Management, extended reality, intelligent sales, partner management, Sales Engagement
Ventana Research recently announced its Market Agenda in the expertise area of Customer Experience. CX has emerged as a way for organizations to demonstrate value and stand out in the marketplace. The technology underlying modern CX is transitioning from tools that are based on communication to those centered on data analysis and process automation. This allows organizations to build great experiences and reap the benefits in customer loyalty and value. It also forces companies to reckon with the complexity and disruption that technologies like artificial intelligence and automation bring to an organization.
I am happy to share insights from our latest Ventana Research Value Index, which assesses how well vendors’ offerings meet buyers’ requirements. The 2023 Data Platforms Value Index is the distillation of a year of market and product research by Ventana Research. Drawing on our Benchmark Research, we apply a structured methodology built on evaluation categories that reflect real-world criteria incorporated in a request for proposal to data platform vendors that support the spectrum of operational and analytic use cases. Using this methodology, we evaluated vendor submissions in seven categories: five relevant to the Product Experience: Adaptability, Capability, Manageability, Reliability and Usability, and two related to the Customer Experience: Total Cost of Ownership/Return on Investment and Validation.
Since 2021, Verint’s message to the customer experience community has focused on the “engagement capacity gap,” a way of describing how the available resources for CX clash with the high level of customer expectations. The argument is that efforts to close that gap require a rethink of how contact centers (and their parent organizations) operate and plan.
I’m proud to share Ventana Research’s 2023 Market Agenda for Digital Technology. Our focus in this agenda is to deliver expertise to help organizations prioritize technology investments that improve customer, partner and workforce experiences while also increasing organizational effectiveness and agility.
Topics: Analytics, Cloud Computing, Internet of Things, Data, Digital Technology, blockchain, AI and Machine Learning, mobile computing, extended reality, robotic automation, Collaborative & Conversational Computing
Vertical strategies for enterprise resource planning systems are not new. They emerged more than two decades ago as vendors looked for ways to reduce costs and shorten time-to-value in a software category that was notorious for high costs and extended timelines. A vertical-plus strategy – the plus means it’s a platform, not just an application – takes advantage of recently available technology to extend the ease of implementation and maintenance of the system by having deeper integration with complementary applications, available low-code/no-code customization capabilities and a data pantry that enables the amalgamation of data from multiple sources for situational awareness and decision support. Moreover, the ongoing shift from on-premises to cloud-based ERP systems, especially those designed to address specific types of businesses, will accelerate over the next five years as more configurable and customizable systems designed for specific business verticals become available. A cloud-based platform facilitates the creation of a digital ecosystem that can enable a software vendor’s users to enhance customer experiences.
Data observability is a hot topic and trend. I have written about the importance of data observability for ensuring healthy data pipelines, and have covered multiple vendors with data observability capabilities, offered both as standalone and part of a larger data engineering system. Data observability software provides an environment that takes advantage of machine learning and DataOps to automate the monitoring of data quality and reliability. The term has been adopted by multiple vendors across the industry, and while they all have key functionality in common – including collecting and measuring metrics related to data quality and data lineage – there is also room for differentiation. A prime example is Acceldata, which takes a position that data observability requires monitoring not only data and data pipelines but also the underlying data processing compute infrastructure as well as data access and usage.
Having recently completed the 2023 Data Platforms Value Index, I want to share some of my observations about how the market is evolving. Although this is our inaugural assessment of the market for data platforms, the sector is mature and products from many of the vendors we assess can be used to effectively support operational and analytic use cases.
Ventana Research has announced its market agenda for 2023, continuing a 20-year tradition of credibility and trust in our objective efforts to educate and guide the technology market. Our research and insights are backed by our expertise and independence, as we do not share our Market Agenda or our market research – including analyst and market perspectives – with any external party before it is published. We continuously refine our Market Agenda throughout the year to ensure we offer the expertise and insights organizations rely on to better assess and navigate the direction of the technology industry.
As the world of work continues to evolve, so too must the way organizations evaluate talent. Traditional evaluation criteria like education and prior job titles still have a place in hiring, promotion and succession-planning processes. However, organizations that consider transferable skills first have the opportunity to screen in talent that they may not have otherwise considered, creating a substantial advantage in the proverbial war for talent.
Consumer and mobile applications have influenced our expectations. Nearly all of us carry a smartphone, and we interact with a variety of applications on our devices. Those applications have forever influenced what we expect from computing systems. When I search the web for a gas station, I’m not searching for all gas stations. I’m searching for those stations that are near me. Not just near my regular location, but near my current location. We expect personalized interactions, not generic, one-size-fits-all interactions. These expectations have spilled over from consumer applications to enterprise applications. Unfortunately, enterprise IT applications have not fully met these expectations yet.
Topics: Digital Technology
Those of us who have worked in or alongside sales teams have observed that many sales fundamentals have changed over the years. Yet, in many ways, they have not. One essential that has not changed is sales enablement. How do you onboard salespeople and quickly bring them up to speed? Or, how do you introduce new products and services so the sales team is conversant and knowledgeable about them? In years past, this would have happened at the head office or a regional office, with videos and internally produced documentation and product fact sheets. But thanks to the internet, this has all changed. Digitization has improved – and will continue to enhance – sales enablement.
In today’s organization, the myriad of analytics and permutations of dashboards challenge workers’ ability to take contextual actions efficiently. Unfortunately, conventional wisdom for investing in analytics does not recognize the benefits of empowering the workforce to understand the situation, examine options and work together to make the best possible decision.
Since its inception, HCM software has upended how people interact with their workplace. Paper resumes have given way to online applications. Physical time clocks have largely disappeared in favor of apps or clocks housed within a point-of-sale system. Even benefits enrollment has entered the digital age, adding tools like decision-support modeling to help enrollees determine which of the myriad offerings will better fit their specific circumstances. This digital transformation does more than just improve productivity and drive compliance. It helps to create an experience for workers that can build trust and engagement. In our research, 64% of organizations indicated that delivering a superior employee experience is a top priority. We assert that by 2025, two-thirds of organizations will require artificial intelligence (AI) in all HCM and adjacent systems to curate personalized experiences for all workers to drive engagement, productivity and retention. UKG, a company formed in 2020 by the merger of two HCM technology giants, Ultimate Software and Kronos, excels in this arena.
The management of work is a focal point for every organization that has people and resources directed to accomplish the smallest to largest of tasks. But many organizations are not easily able to manage complex activities because the details of how people are assigned and complete work are not as simple as they should be. Traditional project management methods and technologies have failed to work in an enterprise manner, so new approaches have emerged to meet today’s challenges. The essence of work management is to ensure the automation and intelligence afforded by the technology industry is infused in how organizations operate, as outlined in my perspective.
Organizations conduct data analysis in many ways. The process can include multiple spreadsheets, applications, desktop tools, disparate data systems, data warehouses and analytics solutions. This creates difficulties for management to provide and maintain updated information across multiple departments. Our Analytics and Data Benchmark Research shows that organizations face a variety of challenges with analytics and business intelligence. One-third of participants find it difficult to integrate analytics and BI with other business processes. Participants also find that not all software is flexible enough for the constantly changing business environment, and that it is hard to access all data sources.
Workiva offers an environmental, social and governance application that enables organizations to manage the highly distributed tasks necessary for reporting to regulators and stakeholders on ESG matters. ESG issues have grown increasingly pressing over the past few years as investors and government entities urge organizations to measure and disclose relevant metrics. I’ve already covered the broader topic as it relates to external reporting and how financial planning and analysis groups are likely to own this mandate going forward. I’ve also addressed the data strategy that finance organizations should adopt to meet regulatory compliance requirements. Notably, I assert that by 2025, more than one-half of corporations required to comply with ESG reporting will centralize responsibility for preparing reports and filings with financial planning and analysis to achieve accuracy, control and efficiency objectives.
For far too long, business intelligence technologies have left the rest of the exercise to the reader. Many of these tools do an excellent job providing information in an interactive way that lets organizations dive into the data and learn a lot about what has happened across all aspects of the business. More recently, many of these tools have added augmented intelligence capabilities that help explain why things happened. But rarely did any of these tools provide information about what to do or how to evaluate the alternative ways in which you might respond.
The shift from on-premises server infrastructure to cloud-based and software-as-a-service (SaaS) models has had a profound impact on the data and analytics architecture of many organizations in recent years. More than one-half of participants (59%) in Ventana Research’s Analytics and Data Benchmark research are deploying data and analytics workloads in the cloud, and a further 30% plan to do so. Customer demand for cloud-based consumption models has also had a significant impact on the products and services that are available from data and analytics vendors. Data platform providers, both operational and analytic, have had to adapt to changing customer demand. The initial response — making existing products available for deployment on cloud infrastructure — only scratched the surface in terms of responding to emerging expectations. We now see the next generation of products, designed specifically to deliver innovation by taking advantage of cloud-native architecture, being brought to market by both emerging startups and established vendors, including InterSystems.
Topics: Business Intelligence, Cloud Computing, Data Management, Data, natural language processing, AI and Machine Learning, data operations, Analytics & Data, operational data platforms, Analytic Data Platforms
Analytics processes are all about how organizations use data to create metrics that help manage and improve operations. Yet, the discipline applied to analytics processes seems to be lacking compared to data processes. I’ve pointed out that the weak link in data governance is often analytics. Organizations can also do a better job tying AnalyticOps to DataOps and do more to define and manage metrics. Our research has shown that creating and managing metrics in a semantic model improves analytics processes.
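The benefit of defining metrics in a semantic model can be illustrated with a small sketch: each metric is defined once, centrally, so every report that asks for it by name computes it the same way. The metric names and record fields below are invented for illustration.

```python
from typing import Callable

# A toy semantic model: each metric has exactly one definition, so
# dashboards and reports cannot drift apart in how they compute it.
semantic_model: dict[str, Callable[[list[dict]], float]] = {
    "revenue": lambda rows: sum(r["amount"] for r in rows),
    "average_order_value": lambda rows: sum(r["amount"] for r in rows) / len(rows),
}

def compute_metric(name: str, rows: list[dict]) -> float:
    """Look up a metric by name and apply its single, shared definition."""
    return semantic_model[name](rows)

orders = [{"amount": 100.0}, {"amount": 250.0}, {"amount": 150.0}]
revenue = compute_metric("revenue", orders)
aov = compute_metric("average_order_value", orders)
```

Managing definitions this way is also what makes AnalyticOps tractable: a metric change is a change to one definition, which can be versioned and tested like any other artifact.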
There is always space for innovation in the data platforms sector, and new vendors continue to emerge at regular intervals with new approaches designed to serve specialist data storage and processing requirements. Factors including performance, reliability, security and scalability provide a focal point for new vendors to differentiate from established vendors, especially for the most demanding operational or analytic data platform requirements. It is never easy, however, for developers of new data platform products to gain significant market traction, given the dominance of the established relational database vendors and cloud providers. Targeting requirements that are not well-served by general purpose data platforms can help new vendors get a toe in the door of customer accounts. The challenge to gaining further market traction is for new vendors to avoid having products become pigeon-holed as only being suitable for a niche set of requirements. This is precisely the problem facing the various distributed SQL database providers.
Ventana Research uses the term “data pantry” to describe a method of data storage (and the technology and process blueprint for its construction) created for a specific set of users and use cases in business-focused software. It’s a pantry because all the data one needs is readily available and easily accessible, with labels that are immediately recognized and understood by the users of the application. In tech speak, this means the semantic layer is optimized for the intended audience. It is stocked with data gathered from multiple sources and immediately available for analysis, forecasting, planning and reporting. This does away with the need for analysts to repeatedly perform data extraction, enrichment or transformation motions from the required source systems, all but eliminating the substantial amount of time analysts and business users routinely spend on data preparation.
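The "labels that are immediately recognized" idea amounts to a mapping from technical source columns to business-friendly names. A minimal sketch, with all column names and labels invented:

```python
# A toy semantic layer: technical source-system columns mapped to the
# labels the application's business audience actually recognizes.
semantic_layer = {
    "tx_amt_usd": "Transaction Amount (USD)",
    "cust_seg_cd": "Customer Segment",
    "fcst_rev_q": "Forecasted Quarterly Revenue",
}

def relabel(record: dict) -> dict:
    """Present a raw record using business-friendly labels."""
    return {semantic_layer.get(key, key): value for key, value in record.items()}

labeled = relabel({"tx_amt_usd": 1250.0, "cust_seg_cd": "SMB"})
```

In a data pantry this relabeling is done once, when the pantry is stocked, rather than repeated by every analyst who pulls from the source systems.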
Topics: Continuous Planning, Business Intelligence, Data Management, Business Planning, Data, Financial Performance Management, Enterprise Resource Planning, AI and Machine Learning, continuous supply chain, data operations, digital finance, profitability management, Analytics & Data, Streaming Data & Events
In previous perspectives in this series, I’ve discussed some of the realities of cloud computing including costs, hybrid and multi-cloud configurations and business continuity. This perspective examines the realities of security and regulatory concerns associated with cloud computing. These issues are often cited by our research participants as reasons they are not embracing the cloud. To be fair, the majority of our research participants are embracing the cloud. However, among those that have not yet made the transition to the cloud, security and regulatory concerns are among the most common issues cited across the various studies we have conducted.
In the face of a very uncertain future, companies have been discovering the value of rapid planning and budgeting cycles. As events unfold, they’re changing expectations for the future significantly on a daily or weekly basis. However, even when the world returns to a steadier state, companies will benefit from making their planning and budgeting processes faster, easier, more relevant, more strategic, more agile and more accurate.
Recently, I suggested you need to “mind the gap” between data and analytics. This perspective addresses another gap — the gap in skills between business intelligence (BI) and artificial intelligence/machine learning (AI/ML).
After decades of overpromising and underdelivering, technology has now evolved to the point where it is fundamentally changing how accountants work – for the better. The pandemic and the resulting support of remote work set the stage for a transformation of how accounting efforts are structured and performed. Remote audits that became routine during lockdowns are evolving into virtual ones, where auditors take full advantage of advanced software to achieve dependably higher audit quality with less effort, while improving working conditions for auditors and staff accountants. Although discussions I’ve had with practitioners over the past two years indicate that organizations are using this approach to some extent, widespread use has become practical only recently.
The market and buyer landscape for contact center operating services has changed significantly since the onset of the pandemic, now almost three years ago. Three years would have been enough time for some significant shifts, even without the pressure the pandemic put on service operations. Nevertheless, with on-premises systems now taking a backseat industrywide, it’s fair to say that CCaaS, which typically refers to cloud-based systems, now represents the lion’s share of spending and therefore stands as a proxy for the industry as a whole. Ventana Research predicts that by 2026, 7 in 10 organizations will have moved all or part of their contact center technology into the cloud to attain greater flexibility and scalability.
The technology industry has established itself as a pivotal force in its ability to help organizations become more intelligent and automated. But doing so has required a journey of epic proportions for most organizations, which have had to endure a transition of competencies and skills that was, in many cases, handed off to consulting firms hired to manage the change. Unfortunately, this step led, in many cases, to an extended focus on digital transformation rather than the necessary modernization of business processes and technology. Through 2024, after concerted investment into digital transformation, one-half of organizations will require a new digital business and technology agenda for organizational resilience.
Earlier this year, I wrote about the increasing importance of data observability, an emerging product category that takes advantage of machine learning (ML) and Data Operations (DataOps) to automate the monitoring of data used for analytics projects to ensure its quality and lineage. Monitoring the quality and lineage of data is nothing new. Manual tools exist to ensure that it is complete, valid and consistent, as well as relevant and free from duplication. Data observability vendors, including Monte Carlo Data, have emerged in recent years with the goal of increasing the productivity of data teams and improving organizations’ trust in data using automation and artificial intelligence and machine learning (AI/ML).
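The kinds of automated checks that data observability tools run can be illustrated with a minimal sketch; the field names, rules and thresholds below are hypothetical examples, not taken from any vendor’s product:

```python
# Minimal sketch of the data-quality checks data observability products
# automate: completeness, validity and duplication. The record schema
# and the "status" rule are hypothetical examples.

def profile_records(records, required_fields, valid_status=frozenset({"active", "closed"})):
    issues = {"incomplete": 0, "invalid": 0, "duplicate": 0}
    seen_ids = set()
    for rec in records:
        # Completeness: every required field must be present and non-empty
        if any(not rec.get(f) for f in required_fields):
            issues["incomplete"] += 1
        # Validity: status must come from an agreed set of values
        if rec.get("status") not in valid_status:
            issues["invalid"] += 1
        # Duplication: the same id should not appear twice
        if rec.get("id") in seen_ids:
            issues["duplicate"] += 1
        seen_ids.add(rec.get("id"))
    return issues

sample = [
    {"id": 1, "status": "active", "amount": 100},
    {"id": 2, "status": "unknown", "amount": 50},   # invalid status
    {"id": 2, "status": "active", "amount": 50},    # duplicate id
    {"id": 3, "status": "closed", "amount": None},  # missing amount
]
report = profile_records(sample, required_fields=("id", "status", "amount"))
```

Commercial products apply ML to learn such rules and thresholds from historical data rather than having them hand-coded, but the underlying signals are of this kind.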
Emburse offers a single platform that enables organizations — small, midsize and larger — to manage their travel and related expenses, pay invoices and handle their corporate spend. Today, technology has the ability to significantly increase the efficiency with which organizations handle expenditures while simultaneously containing costs, increasing controls and improving visibility into where the money is going. This is part of a broader trend toward digitizing outlays: I assert that by 2025, more than two-thirds of organizations will be using spend management software and corporate cards to achieve greater control and increased efficiency.
One of the most significant considerations when choosing an analytic data platform is performance. As organizations compete to benefit most from being data-driven, the lower the time to insight the better. As data practitioners have learnt over time, however, lowering time to insight is about more than just high-performance queries. There are opportunities to improve time to insight throughout the analytics life cycle, which starts with data ingestion and integration, includes data preparation and data management, as well as data storage and processing, and ends with data visualization and analysis. Vendors focused on delivering the highest levels of analytic performance, such as SQream, understand that lowering time to insight relies on accelerating every aspect of that life cycle.
Embedded business intelligence (BI) continues to transform the business landscape, enabling organizations to quickly interpret data and convert it into actionable insights. It allows organizations to extract information in real time and answer wide-ranging business questions. Embedding analytics helps tackle the issue of extracting information from data, which is a time-consuming process. Our research shows organizations spend more time cleaning and optimizing data for analysis than creating insights. On top of that, they are adding more data sources and information systems, which in turn introduces more complexity. Our Analytics and Data Benchmark Research shows that organizations face various challenges with analytics and BI. More than one-third of participants (35%) responded that they find it hard to integrate analytics and BI with business processes and connect to multiple data sources. By embedding analytics and BI into business processes and workflows, organizations can enable users to make critical decisions fast, enhancing overall business agility.
Managing corporate income taxes is a challenge for chief financial officers and their tax department professionals. Tax codes are often complex, so tax accounting as well as the data required for tax provisions and tax compliance are different enough from statutory accounting to create significant workloads for the tax department. The provision for income tax expense and, for public companies, the assembly of information related to tax-related disclosures, can be a factor holding up the completion of the accounting close.
Organizations are increasingly utilizing cloud object storage as the foundation for analytic initiatives. There are multiple advantages to this approach, not least of which is enabling organizations to store larger volumes of data relatively inexpensively, increasing the amount of data queried in analytics initiatives. I assert that by 2024, 6 in 10 organizations will use cloud-based technology as the primary analytics data platform, making it easier to adopt and scale operations as necessary.
Vena Solutions offers organizations a platform for financial planning, analysis and reporting as well as software to manage accounting consolidation and close processes. From the start, Vena has designed its applications to meet the needs of midsize organizations, which typically have the same requirements as large enterprises but with significantly fewer resources to acquire, manage and maintain technology. Ventana Research named Vena a Value Index Leader in Adaptability and a Vendor of Merit in its 2022 Value Index on Business Planning.
In today’s data-driven world, organizations need real-time access to up-to-date, high-quality data and analysis to keep pace with changing market dynamics and make better strategic decisions. By mining meaningful insights from enterprise data quickly, they gain a competitive advantage in the market. Yet, organizations face a multitude of challenges when transitioning into an analytics-driven enterprise. Our Analytics and Data Benchmark Research shows that more than one-quarter of organizations find it challenging to access data sources and integrate data and analytics in business processes. Vendors such as IBM offer a broad set of analytics tools with self-service capabilities that allows organizations to reduce IT dependencies and enables decision-makers to recognize performance gaps, market trends and new revenue opportunities. Its technology can simplify data access for self-service applications, enabling users to make business decisions informed by insights and take the guesswork out of decision-making.
The theme of this year’s Oracle NetSuite SuiteWorld was “Full Suite Ahead,” with content aimed at demonstrating to customers (and prospective buyers) the value of using more of what NetSuite has to offer. The business logic behind this concept goes beyond the obvious objective of upselling existing customers to increase the average annual recurring revenue. As is often the case with subscription businesses, customers fail to take advantage of what’s already included in their service. Ensuring that customers are achieving full value is essential to retaining them, and almost always a precondition to selling them more. For a cloud software vendor, this translates to having an effective customer success organization backed by a customer-centric product strategy and a product management organization that delivers on the strategy. All of this was on display at the event.
The pandemic years saw an exponential rise in organizational investment in digital learning, and for good reason. With in-person learning no longer an option, organizations were forced to quickly adapt to the changing world of work in order to ensure workers were prepared with the knowledge and training necessary to operate in ways they had never before seen. Beyond operational imperatives, organizations have turned to digital learning systems to find new ways to track and bolster productivity, sentiment and engagement of the workforce. This serves to protect the investments made in hiring, onboarding and upskilling or reskilling talent by mitigating risk and regrettable attrition as well as helping employees along a career path that is achievable and desirable for the employee and strategically advantageous to the employer.
Almost all organizations are investing in data science, or planning to, as they seek to encourage experimentation and exploration to identify new business challenges and opportunities as part of the drive toward creating a more data-driven culture. My colleague, David Menninger, has written about how organizations using artificial intelligence and machine learning (AI/ML) report gaining competitive advantage, improving customer experiences, responding faster to opportunities and threats, and improving the bottom line with increased sales and lower costs. One-quarter of participants (25%) in Ventana Research’s Analytics and Data Benchmark Research are already using AI/ML, while more than one-third (34%) plan to do so in the next year, and more than one-quarter (28%) plan to do so eventually. As organizations adopt data science and expand their analytics initiatives, they face no shortage of options for AI/ML capabilities. Understanding which is the most appropriate approach to take could be the difference between success and failure. The cloud providers all offer services, including general-purpose ML environments, as well as dedicated services for specific use cases, such as image detection or language translation. Software vendors also provide a range of products, both on-premises and in the cloud, including general-purpose ML platforms and specialist applications. Meanwhile, analytic data platform providers are increasingly adding ML capabilities to their offerings to provide additional value to customers and differentiate themselves from their competitors. There is no simple answer as to which is the best approach, but it is worth weighing the relative benefits and challenges. Looking at the options from the perspective of our analytic data platform expertise, the key choice is between AI/ML capabilities provided on a standalone basis or integrated into a larger data platform.
In my previous perspectives on cloud computing, I addressed some of the realities of cloud costs as well as hybrid and multi-cloud architectures. In the midst of the pandemic, my colleague, Mark Smith, authored a series of perspectives on considerations for business continuity in general, beginning with this look at some of the investments organizations must make to mitigate the risk of business disruptions. In this perspective, I’d like to address some of the realities of business continuity and cloud computing and how they impact the digital technologies of an organization. The cloud can be both advantageous and disadvantageous when it comes to providing business continuity.
The starting point of an era is never precise and rarely conforms to neat calendar delineations. For example, the start of the 20th century is associated with the outbreak of war in 1914. So I expect that decades from now, the consensus will hold that what became known as the 21st century began in the year 2020, with the pandemic serving as a catalyst that accelerated already existing trends and forced changes to prevailing norms and practices. This and other disruptive events that have followed are reverberating through economic and social networks and will ultimately result in some new equilibrium, but the ructions on the way there will be sharp and ever-present. Large-scale disruptions in most aspects of doing business have forced change on organizations. In this climate, the financial planning and analysis group can play a far more important role by using technology to enhance organizational agility and improve performance.
IBM Planning Analytics with Watson is a comprehensive, cloud-based business planning application that supports what Ventana Research calls integrated business planning. We coined this term in 2007 to describe a high-participation approach to business planning that integrates strategy, operations and finance. Our Next Generation Business Planning Benchmark Research demonstrated the value of IBP: Organizations that link planning processes get better results. Sixty-six percent of organizations that have an integrated method say it works well or very well, compared to only 25% that have little or no connection between plans.
The idea of partnerships in business is most definitely not new. Wholesaling through distributors and retailers is centuries old. For some industries, their entire model is selling and servicing through partners. Think auto parts, and the auto part stores visible in most neighborhoods. But what is new is that partnerships are moving beyond this reseller model towards product partnerships, where a seller’s products and services are supplemented by other vendors’ offerings from adjacent and complementary markets.
It has been three years since Oracle hosted a CloudWorld event live in Las Vegas, and it is clear the time has not been wasted. With 12 new releases since the last event, the Oracle human capital management product team has made significant advances in both product and service. Three areas of note: Oracle HCM’s continued focus on personalization of experience, meeting the HCM needs of the healthcare industry and advancing the Oracle Cloud Recruiting offering.
If you’ve ever been to London, you are probably familiar with the announcements on the London Underground to “mind the gap” between the trains and the platform. I suggest we also need to mind the gap between data and analytics. These worlds are often disconnected in organizations, and this disconnect limits their effectiveness and agility.
I have previously written about growing interest in the data lakehouse as one of the design patterns for delivering hydroanalytics — the analysis of data in a data lake. Many organizations have invested in data lakes as a relatively inexpensive way of storing large volumes of data from multiple enterprise applications and workloads, especially semi- and unstructured data that is unsuitable for storing and processing in a data warehouse. However, early data lake projects lacked the structured data management and processing functionality to support multiple business intelligence efforts as well as data science and even operational applications.
The door opened to a new world in 2020, one that renders old assumptions suspect and future outcomes more varied and uncertain. It’s likely that the transition to what’s next will be bumpy, which makes planning more effectively that much more strategic.
Prophix offers cloud financial software for planning, budgeting, reporting and statutory financial consolidation designed to meet the requirements of midsize organizations and divisions of larger corporations. The company was one of the first to offer a planning platform capable of bringing together a company’s diverse planning processes and financial planning and budgeting. Its consolidation and close automation enable a shorter close and improved accounting staff productivity for midsize corporations that have even moderately complex legal entity structures that operate in multiple currencies. Increasingly, organizations are finding that having the right finance and accounting department software tools helps attract and retain the best talent.
For quite a few years now, two trends have put the contact center on a collision course. First, the technology used to handle customer inquiries has been evolving quickly, moving organizations farther and farther away from the traditional mode of primarily answering voice calls. At the same time, consumers have become much more demanding. There’s no doubt that customers are more likely to use quality of service as a gauge for whether they should continue doing business with an organization. They’re more willing to bolt for a competitor if they have a bad experience. In short, they want more of everything, and contact centers have been trying to accommodate these expectations.
I have written recently about the similarities and differences between data mesh and data fabric. The two are potentially complementary. Data mesh is an organizational and cultural approach to data ownership, access and governance. Data fabric is a technical approach to automating data management and data governance in a distributed architecture. There are various definitions of data fabric, but key elements include a data catalog for metadata-driven data governance and self-service, agile data integration.
The worldwide market for software to manage indirect income taxes, which includes sales and use, goods and services taxes (GST) and value-added taxes (VAT), has been growing because of recent compliance mandates, the growth of e-commerce as well as a desire to accelerate business processes by reducing friction in areas such as tax compliance, cutting administration costs and lowering risk. Vertex provides businesses with cloud-based software that manages indirect tax processes for midsize and larger companies, especially for those with complex tax profiles. Vertex enables local and worldwide compliance backed by its ongoing tax research that continually compiles tax rules for over 19,000 jurisdictions. Because links with core financials are an essential capability for organizations of any size, Vertex maintains pre-built integrations with the leading ERP and financial management systems. Cloud-based systems are now the norm to support teams that are geographically dispersed and enable hybrid work environments. Ventana Research asserts that by 2026, a majority of midsize and larger companies will have digitized their indirect tax compliance to ensure accuracy as jurisdictions step up audits to increase revenues.
ERP systems have been a fixture of organizational process management and record keeping for so long (more than three decades) that it is likely that few who use the software are aware that ERP is an acronym for Enterprise Resource Planning. Its smooth and uninterrupted functioning is essential to an organization’s accounting and finance processes. In manufacturing and distribution, ERP manages inventory and logistics. Some organizations use it to handle human resources functions like tracking workers, payroll and related costs. Its initial introduction represented a major advance, but its subsequent evolution has been slow and mainly a series of incremental refinements.
Pressures to engage consumers through every interaction and provide a delightful customer experience are influencing advancements in business and technology. Organizations are challenged to manage friction points experienced by billions of consumers amid expanding digital channels. These issues must be addressed to engage and respond to customers every second of the day.
Traditional key performance indicators used for performance measurement in contact centers are no longer sufficient. These outdated standards don’t reliably inform mid- and upper-level leadership about the true impact of agent work and behavior. Organizations should begin to expand the notion of what’s important in order to make the contact center a stronger organizational institution, more closely tied to others who impact the customer experience. Outside the contact center, people are keen to understand the relationship between what’s being spent and what’s coming in: revenue and growth.
Some weeks back I published my thoughts about the traditional applicant tracking system and how that technology is no longer sufficient to support organizations’ complex recruiting needs. This is particularly true when trying to take a one-size-fits-all approach to hiring processes. Having identical hiring processes and requirements for professional hires as for low-complexity, low barrier-to-entry roles will inevitably result in lost candidates, frustrated recruiters and hiring managers, and low application-to-hire conversion rates. It doesn’t have to be that way. The right technology can support a hiring process that is purpose-built for the segment it is intended to attract and hire.
In their pursuit to be data-driven, organizations are collecting and managing more data than ever before as they attempt to gain competitive advantage and respond faster to worker and customer demands for more innovative, data-rich applications and personalized experiences. As data is increasingly spread across multiple data centers, clouds and regions, organizations need to manage data on multiple systems in different locations and bring it together for analysis. As data volumes increase and more data sources and data types are introduced, organizations face challenges storing, managing, connecting and analyzing information spread across multiple locations. Having a strong foundation and scalable data management architecture in place can help alleviate many of the challenges organizations face when scaling and adding more infrastructure. We have written about the potential for hybrid and multi-cloud platforms to safeguard data across heterogeneous environments, which plays to the strengths of companies, such as Actian, that provide a single environment with the ability to integrate, manage and process data across multiple locations.
Sensitivity to environmental, social and governance issues – or ESG – has grown over the years, and with it, attention from investors and government entities urging organizations to measure and disclose ESG metrics.
The topic of revenue operations has been extensively covered recently, not least by vendors extolling the virtues of their particular offering. But as with much of the software industry, vendors often see the market through the lens of their current product capabilities rather than what is actually needed. With the rise of the mixed-revenue model that includes subscription and usage pricing as well as one-time sales, combined with the growth in self-service commerce, more teams within an organization are directly involved with supporting revenue generation. In response, many organizations have appointed a Chief Revenue Officer (CRO) who is responsible and accountable for all sources of revenue — a shift we see as increasingly necessary as adoption of mixed-revenue models grows. We believe that leadership will need to drive this change in approach, recognizing that it will require a shift in responsibilities and, as importantly, accountability.
It’s likely that finance analytics trace back to when people first began to record transactions on clay tablets. Financial analytics were given a boost with the codification of double-entry bookkeeping, an elegant system for recording transactions that facilitates the assessment of the performance and health of an organization. Further advances were achieved with the first mechanical – and then digital – systems for automating computations, while personal computing devices made the numbers accessible to all.
I have written a few times in recent months about vendors offering functionality that addresses data orchestration. This is a concept that has been growing in popularity in the past five years amid the rise of Data Operations (DataOps), which describes more agile approaches to data integration and data management. In a nutshell, data orchestration is the process of combining data from multiple operational data sources and preparing and transforming it for analysis. To those unfamiliar with the term, this may sound very much like the tasks that data management practitioners have been undertaking for decades. As such, it is fair to ask what separates data orchestration from traditional approaches to data management. Is it really something new that can deliver innovation and business value, or just the rebranding of existing practices designed to drive demand for products and services?
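In its simplest form, the pattern described above — combine data from multiple operational sources, then prepare and transform it for analysis — can be sketched in a few lines of Python. The source systems, field names and join key here are hypothetical stand-ins:

```python
# Toy illustration of data orchestration: pull records from two
# hypothetical operational sources, join them on a shared key and
# emit an analysis-ready result. All names are invented examples.

def extract_crm():
    # Stand-in for an operational CRM source
    return [{"customer_id": 1, "name": "Acme"}, {"customer_id": 2, "name": "Globex"}]

def extract_billing():
    # Stand-in for an operational billing source with a different key name
    return [{"cust": 1, "revenue": 1200.0}, {"cust": 2, "revenue": 450.0}]

def transform(crm_rows, billing_rows):
    # Prepare: index billing by its key, then join and reshape
    revenue_by_id = {row["cust"]: row["revenue"] for row in billing_rows}
    return [
        {"customer_id": r["customer_id"],
         "name": r["name"],
         "revenue": revenue_by_id.get(r["customer_id"], 0.0)}
        for r in crm_rows
    ]

def run_pipeline():
    # Orchestration: run the steps in dependency order
    return transform(extract_crm(), extract_billing())

result = run_pipeline()
```

What orchestration products add on top of this hand-coded sequencing — scheduling, dependency management, retries, monitoring — is precisely the ground on which the “new category or rebranding” question is argued.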
Artificial intelligence and machine learning are valuable to data and analytics activities. Our research shows that organizations using AI/ML report gaining competitive advantage, improving customer experiences, responding faster to opportunities and threats and improving the bottom line with increased sales and lower costs. No wonder nearly 9 in 10 (87%) research participants report using AI/ML or planning to do so.
Planful recently acquired Plannuh, a marketing-performance management application, to integrate into the Planful platform so that organizations can connect their marketing planning and analysis group with the finance department. There’s the old story of a CEO who said, “I know half my marketing spend is wasted, I just don’t know which half.” Plannuh is designed to answer that question.
Ventana Research’s Data Lakes Dynamics Insights research illustrates that while data lakes are fulfilling their promise of enabling organizations to economically store and process large volumes of raw data, data lake environments continue to evolve. Data lakes were initially based primarily on Apache Hadoop deployed on-premises but are now increasingly based on cloud object storage. Adopters are also shifting from data lakes based on homegrown scripts and code to open standards and open formats, and they are beginning to embrace the structured data-processing functionality that supports data lakehouse capabilities. These trends are driving the evolution of vendor product offerings and strategies, as typified by Cloudera’s recent launch of Cloudera Data Platform (CDP) One, described as a data lakehouse software-as-a-service (SaaS) offering.
Much has been written in recent years on the emergence of subscription management as a new revenue model that both vendors and buyers are embracing as the future. The benefits speak to the value of a predictable revenue stream for the vendor, but more importantly, the advantages to the customer who needs a lower initial outlay, predetermined expense over the lifetime of usage and the ability to cancel or suspend on demand.
As I recently pointed out, process mining has emerged as a pivotal technology for data-driven organizations to discover, monitor and improve processes through use of real-time event data, transactional data and log files. With recent advancements, process mining has become more efficient at discovering insights in complex processes using algorithms and visualizations. Organizations use it to better understand the current state of systems and business processes. It is also used to enable business process intelligence and improvement in any function or industry using events and activity models for data-driven decision-making. We assert that through 2024, 1 in 4 organizations will look to streamline their operations by exploring process mining to optimize workflow and business processes.
I have written before about the continued use of specialist operational and analytic data platforms. Most database products can be used for operational or analytic workloads, and the number of use cases for hybrid data processing is growing. However, a general-purpose database is unlikely to meet the most demanding operational or analytic data platform requirements. Factors including performance, reliability, security and scalability necessitate the use of specialist data platforms. I assert that through 2026, and despite increased demand for hybrid operational and analytic processing, more than three-quarters of data platform use cases will have functional requirements that encourage the use of specialized analytic or operational data platforms. It is for that reason that specialist database providers, including Ocient, continue to emerge with new and innovative approaches targeted at specific data-processing requirements.
A year ago, I wrote about how technology could be useful in an inflationary period, correctly anticipating the world we live in now. Responding effectively to changes in costs is always challenging, but even more so because of the choppy and chaotic nature of the current environment. Many organizations have limited or no ability to raise prices and are forced to find ways to minimize the impact of rising costs. And while it’s true that some organizations have a degree of pricing power, behind this generalization there is a more complex reality because this ability to raise prices often varies depending on specific products, customers and channels. Companies can best address the challenges of inflation by adopting a technique that Ventana Research calls “profitability management.”
Through 2025, establishing customer experience application suites on a common platform will be the focal point of the drive to optimize customer and organizational engagement. Organizations that are passionate about improving the customer experience are choosing to empower processes and people with intelligence through smarter applications that embrace analytics, artificial intelligence and automation to personalize and optimize the customer journey, whatever the channel of customer choice.
Earlier this year I described the growing use cases for hybrid data processing. Although it is anticipated that the majority of database workloads will continue to be served by specialist data platforms targeting operational and analytic workloads respectively, there is increased demand for intelligent operational applications infused with the results of analytic processes, such as personalization and artificial intelligence-driven recommendations. There are multiple data platform approaches to delivering real-time data processing and analytics, including the use of streaming data and event processing and specialist, real-time analytic data platforms. We also see operational data platform providers, such as Aerospike, adding analytic processing capabilities to support these application requirements via hybrid operational and analytic processing.
A predictive finance department is one that can command technology to be more forward-looking and action-oriented while still fulfilling its core role of handling the financial elements of its organization including accounting, treasury and corporate finance. Beyond just automating rote tasks, technology also facilitates a shift toward becoming a predictive finance organization. Greater amounts of information, now available in near real time, and the increasing use of artificial intelligence (AI), enable more immediate analyses and assessments of possible courses of action, providing executives and managers the ability to better anticipate change and the agility to adapt quickly to unexpected circumstances.
Process mining is defined as the analysis of application telemetry including log files, transaction data and other instrumentation to understand and improve operational processes. Log data provides an abundance of information about what operations are occurring, the sequences involved in the processes, how long the processes are taking and whether or not the processes are completed successfully. As computing power has increased and storage costs have decreased, the economics of collecting and analyzing large amounts of log data have become much more attractive.
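The mechanics described above can be sketched in a few lines: group log events by case, then derive each case's activity sequence, elapsed time and completion status. This is an illustrative sketch only; the event format, activity names and "shipped" end state are hypothetical, not taken from any particular process mining product.

```python
from datetime import datetime

# Hypothetical event log: one record per activity occurrence in a process.
events = [
    {"case": "order-1", "activity": "received", "ts": "2023-01-05T09:00:00"},
    {"case": "order-1", "activity": "approved", "ts": "2023-01-05T11:30:00"},
    {"case": "order-1", "activity": "shipped",  "ts": "2023-01-06T08:00:00"},
    {"case": "order-2", "activity": "received", "ts": "2023-01-05T10:00:00"},
    {"case": "order-2", "activity": "approved", "ts": "2023-01-07T10:00:00"},
]

def summarize(events, final_activity="shipped"):
    """Group log events by case, then derive each case's activity
    sequence, elapsed hours and whether it reached the final activity."""
    cases = {}
    for e in sorted(events, key=lambda e: e["ts"]):  # ISO timestamps sort lexically
        cases.setdefault(e["case"], []).append(e)
    summary = {}
    for case, evs in cases.items():
        start = datetime.fromisoformat(evs[0]["ts"])
        end = datetime.fromisoformat(evs[-1]["ts"])
        summary[case] = {
            "sequence": [e["activity"] for e in evs],
            "hours": (end - start).total_seconds() / 3600,
            "completed": evs[-1]["activity"] == final_activity,
        }
    return summary

print(summarize(events))
```

Even this toy version surfaces the questions process mining answers at scale: order-1 completed in 23 hours, while order-2 stalled after approval.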
The learning management system technology market has evolved dramatically over the past two decades. Learning management systems, now commonly referred to as learning experience platforms, are an integral resource for any organization concerned about productivity, organizational agility and operational excellence. These technologies enable organizations to demonstrate an investment in people, as the LMS not only facilitates regulatory and legal compliance and other forms of cost and risk avoidance but also improves internal mobility, career growth and the employee experience, leading to improved employee productivity, engagement and retention.
I have recently written about the organizational and cultural aspects of being data-driven, and the potential advantages data-driven organizations stand to gain by responding faster to worker and customer demands for more innovative, data-rich applications and personalized experiences. I have also explained that data-driven processes require more agile, continuous data processing, with an increased focus on extract, load and transform processes — as well as change data capture and automation and orchestration — as part of a DataOps approach to data management. Safeguarding the health of data pipelines is fundamental to ensuring data is integrated and processed in the sequence required to generate business intelligence. The significance of these data pipelines to delivering data-driven business strategies has led to the emergence of vendors, such as Astronomer, focused on enabling organizations to orchestrate data engineering pipelines and workflows.
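What a pipeline orchestrator contributes can be illustrated with a minimal sketch: execute tasks in dependency order so downstream steps only consume fully processed data. The task names and dependency graph below are hypothetical, and this toy stands in for what production orchestrators do with scheduling, retries and monitoring layered on top.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

run_log = []

# Hypothetical extract-load-transform (ELT) tasks; real tasks would move data.
def extract():   run_log.append("extract")
def load():      run_log.append("load")
def transform(): run_log.append("transform")

tasks = {"extract": extract, "load": load, "transform": transform}
# Each key depends on the tasks in its set: load needs extract, and so on.
deps = {"load": {"extract"}, "transform": {"load"}}

# Run tasks in topological (dependency) order, as an orchestrator would.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()  # an orchestrator would also retry failures and record status

print(run_log)
```

The dependency declaration is the essential idea: the pipeline is described as a graph, and correct ordering falls out of the graph rather than being hand-coded.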
Workday held its first in-person Rising user group meeting since 2019 in Orlando. Three topics are worth commenting on: Workday’s Extend offering, its industry accelerators and its progress with the Workday Adaptive Planning offering.
In my first perspective on cloud computing realities, I covered some of the cost considerations associated with cloud computing and how the cloud costing model may be different enough from on-premises models that some organizations are taken by surprise. In this perspective, I’d like to focus on the realities of hybrid and multi-cloud deployments.
Organizations are collecting data from multiple data sources and a variety of systems to enrich their analytics and business intelligence (BI). But collecting data is only half of the equation. As the data grows, it becomes challenging to find the right data at the right time. Many organizations can’t take full advantage of their data lakes because they don’t know what data actually exists. Also, there are more regulations and compliance requirements than ever before. It is critical for organizations to understand the kind of data they have, who is handling it, what it is being used for and how it needs to be protected. They also have to avoid putting too many layers and wrappers around the data, as these can make the data difficult to access. These challenges create a need for more automated ways to discover, track, research and govern the data.
Kinaxis recently announced it has acquired MPO, a Netherlands-based company whose cloud-based software orchestrates multiparty supply chain execution. The combination is designed to enable Kinaxis to extend its concurrent planning platform to handle core elements of supply chain execution. Kinaxis acquired all the shares of MPO for approximately US$45 million, with some of the final consideration dependent on performance. MPO will continue to operate as a standalone business, but will be increasingly integrated into Kinaxis’ operations worldwide.
The data catalog has become an integral component of organizational data strategies over the past decade, serving as a conduit for good data governance and facilitating self-service analytics initiatives. The data catalog has become so important, in fact, that it is easy to forget that just 10 years ago it did not exist in terms of a standalone product category. Metadata-based data management functionality has had a role to play within products for data governance and business intelligence for much longer than that, of course, but the emergence of the data catalog as a product category provided a platform for metadata-based data inventory and discovery that could span an entire organization, serving multiple departments, use cases and initiatives.
I have written about vendor efforts to use artificial intelligence (AI) and advanced analytics in their applications targeted at sales and revenue teams to improve focus and prioritize activities, both for pipeline management as well as individual opportunities. Since then, vendors have continued to innovate, and there have been more releases showcasing efforts to aid sales and revenue. And with this continuing innovation, we believe that by 2026, two-thirds of revenue leaders will begin considering a new generation of revenue analytics and data-driven applications designed to improve performance and productivity.
Today’s contact centers need to revisit core assumptions around measuring agent performance. Changes in business conditions influencing agent engagement raise new questions about whether traditional performance models are sufficient to address the more complex customer needs that have taken center stage in recent years.
Business intelligence has evolved. It now includes a spectrum of analytics, one of the most promising of which has been described as augmented intelligence. Some organizations have used the term to describe the practical reality that artificial intelligence with machine learning is not replacing human intelligence, but augmenting it. The term also represents the application of AI/ML to make business intelligence and analytics tools more powerful and easier to use. It’s this latter usage that I prefer and I’d like to explore in this perspective.
Organizations do not live in a vacuum and things happening outside their walls have a direct impact on how they perform. So, it is essential for them to incorporate external data in their forecasting, planning and budgeting, especially for predictive analytics and machine learning (ML) to support artificial intelligence (AI). I use the term external data to include any information about the world outside an organization (including economic and market statistics), competitors (such as pricing and locations), and customers. Until recently, it was adequate for organizations to regard external data as a “nice to have” item, but that is no longer the case. External data is necessary for many functions, including useful and accurate competitive intelligence used by sales and marketing groups. It is also essential for the effective application of AI using ML for business-focused planning and budgeting and predictive analytics.
Payroll management is one of the six major focus areas in the Human Capital Management research and advisory practice at Ventana Research. “Continuous payroll” is a hot topic in this area, with much discussion about the always-on nature of this enhanced payroll function and its related demands for supporting technologies. Advancements in payroll technologies and practices have paved the way for off-cycle payroll transactions and pay modes – like earned wage access, for example – to become the new normal. Injecting continuous payroll practices into global organizations operating in multiple countries, each with its own pay-related laws, regulations and customs, requires complex functionality that many payroll systems simply do not have. Managing global payroll requires expertly combining human knowledge and intelligent systems to meet both international and regional requirements.
Outbound communication is used in a number of different contexts. For potential customers, traditional telemarketing still exists, though it is limited these days due to its minimal effectiveness. Instead, many customer-experience planners have substituted digital outbound over voice for lead generation and nurturing campaigns. Customers find text messages in the channel of their choice to be much less intrusive, and they are considerably less expensive than having contact center agents reach out.
I recently wrote about the need for organizations to take a holistic approach to the management and governance of data in motion alongside data at rest. As adoption of streaming data and event processing increases, it is no longer sufficient for streaming data projects to exist in isolation. Data needs to be managed and governed regardless of whether it is processed in batch or as a stream of events. This requirement has resulted in established data management vendors increasing their focus on streaming data and event processing through product development as well as acquisitions. It has also resulted in streaming and event specialists, such as Confluent, adding centralized management and governance capabilities to their existing offerings as they seek to establish or reinforce the strategic importance of streaming data as part of a modern approach to data management.
General Omar Bradley is credited with saying, “Amateurs study strategy, professionals study logistics.” This is a battlefield commander’s perspective on the often-overlooked importance of mastering the nitty-gritty in achieving military objectives. I think the same is true when it comes to data in business computing because, in my experience, it is often an overlooked or secondary consideration.
You would be forgiven for thinking that no one buys anything in person anymore, given the pages of digital ink spilled over the rise of digital commerce led by the rise and rise of Amazon. However, one quick errand run on a Saturday morning would easily give the lie to this, as parking lots are full, not just at grocery stores but at everyday retail and big box stores as well. Likewise, in business-to-business (B2B) commerce, despite the advertised demise, person-to-person sales are still a major part of B2B purchases.
People analytics enable organizations to gain data-driven insights that optimize the impact and value of the workforce. For decades, human capital management (HCM) leaders have been sold tools marketed as analytics that were no more than dashboards filled with nice visualizations of historic data, with no context as to what each individual data point meant to their strategic objectives and initiatives. And yet, our recent Analytics and Data Benchmark Research shows that 83% of organizations report that dashboards are very important or already in use for analytics. A dashboard, while important for a snapshot view of key metrics, is not an analytics tool. Today, advances in technology allow systems to provide actionable insights into potential people risks or opportunities before it’s too late.
Zoho presented analysts with a deep look at its strategy and roadmap at its July analyst conference, describing how it intends to meld its many business applications together through integration at the level of the platform. The company, which is privately owned and funded, has generally sought to build its own tools rather than buy or partner. This approach has allowed the firm to create a suite of tightly linked tools that share a common interface.
The migration to the cloud is well underway. Organizations are adopting cloud computing for a wide variety of applications and use cases. Managed cloud services, commonly referred to as software as a service (SaaS), offer many benefits to organizations, including significantly reduced labor costs for system administration and maintenance, as many of these costs are shifted to the software vendor. SaaS also provides organizations with faster time to value as they adopt new technologies by eliminating the need to acquire and configure hardware or install software. In fact, we assert that by 2025, nine in 10 organizations will be using multiple cloud applications in order to minimize the costs of administration and maintenance. Yet, there are some challenges associated with cloud computing I’d like to address in a series of Analyst Perspectives:
The lockdowns of 2020 forced accounting departments to adapt to managing their close-to-report cycle without face-to-face contact, prompting many to adopt digital technologies to facilitate the process. It gave further impetus to the digital transformation of the department, which aims to use software automation to eliminate unnecessary manual effort in tasks such as consolidations and reconciliations. And, rather than looking at the close as a set of discrete tasks, controllers and CFOs increasingly are managing the process as a connected stream of responsibilities, from pre-close activities to creating and publishing financial, management and external reports. This approach is consistent with what Ventana Research calls continuous accounting.
I have written recently about increased demand for data-intensive applications infused with the results of analytic processes, such as personalization and artificial intelligence (AI)-driven recommendations. Almost one-quarter of respondents (22%) to Ventana Research’s Analytics and Data Benchmark Research are currently analyzing data in real time, with an additional 10% analyzing data every hour. There are multiple data platform approaches to delivering real-time data processing and analytics and more agile data pipelines. These include the use of streaming and event data processing, as well as the use of hybrid data processing to enable analytics to be performed on application data within operational data platforms. Another approach, favored by a group of emerging vendors such as Rockset, is to develop these data-intensive applications on a specialist, real-time analytic data platform specifically designed to meet the performance and agility requirements of data-intensive applications.
Organizations are managing and analyzing large datasets every day, identifying patterns and generating insights to inform decisions. This can provide numerous benefits for an organization, such as improved operational efficiency, cost optimization, fraud detection, competitive advantage and enhanced business processes. By bringing the right, actionable data to the right user, organizations can potentially speed up processes and make more effective operational decisions.
Especially in the United States, baby boomer retirements and fewer graduates with accounting degrees are posing a growing challenge to finance department executives in attracting and retaining the best accounting talent. The solution, which may not seem obvious, is to make accounting cool again.
It has been nearly two and a half years since the world was thrust into one of the most dramatic periods of workforce transformation in the modern era. Organizations have been forced to reevaluate everything about the workforce, from the physical spaces in which work is done, to compensation, to non-traditional benefits and work/life enhancement offerings. Even so, many continue to struggle to attract and retain the right talent to support operational needs. As leaders continue to redesign how they look at their talent pools, they have come to rely on their technology stacks more than ever to inform and enable new processes and experiences for candidates and employees, and recruiters and managers, throughout the entire talent life cycle. We at Ventana Research assert that by 2025, two-thirds of organizations will expect full talent life cycle support from their talent platform to bolster and unify the experience for candidates, recruiters, employees and managers.
When I looked at the state of analytics recently, it was clear that analytics are not as widely deployed within organizations as they should be. Only 23% of participants in our Analytics and Data Benchmark Research reported that more than one-half of their organization’s workforce are using analytics. There are many elements to becoming a data-driven organization, as my colleague Matt Aslett points out, but analytics are a necessary component. Our research shows that organizations recognize the importance of embedded analytics, ranking it the second most important digital technology in their analytics and data efforts behind big data and ahead of artificial intelligence and machine learning (AI/ML).
“Digital finance transformation” became an even more important topic over the past two years as finance and accounting departments have had to cope with an unrelenting set of new challenges that have had a profound impact on business operations, financial markets and regulatory environments. Digital technologies enable organizations to cope with change and improve performance by increasing efficiency, reducing risk, achieving greater visibility into opportunities, shortening process cycles and completing core processes. Digitizing department operations helps attract and retain the best talent because professionals spend less time on mechanical, repetitive tasks. Unfortunately, our research suggests that transformation is more talked about than done. I assert that by 2025, only one-third of finance departments will have achieved a level of technology competence to be described as digitally transformed while the CFOs of those that do will have greater influence in their organization's management.
I recently noted that as demand for real-time interactive applications becomes more pervasive, the use of streaming data is becoming more mainstream. Streaming data and event processing has been part of the data landscape for many decades, but for much of that time, data streaming was a niche activity. Although adopted in industry segments with high-performance, real-time data processing and analytics requirements such as financial services and telecommunications, data streaming was far less common elsewhere. That has changed significantly in recent years, fueled by the proliferation of open-source and cloud-based streaming data and event technologies that have lowered the cost and technical barriers to developing new applications able to take advantage of data in motion. This is a trend we expect to continue, to the extent that streaming data and event processing becomes an integral part of mainstream data-processing architectures.
The analytics and business intelligence market landscape continues to grow as more organizations seek robust tools and capabilities to visualize and better understand data. BI systems are used to perform data analysis, identify market trends and opportunities and streamline business processes. They can collect and combine data from internal and external systems to present a holistic view.
The applicant tracking system, for all its shortcomings, revolutionized the way people found and applied for jobs when it first hit the market in the mid-1990s. Electronic applications quickly became the norm, resume or application review became more accessible for hiring teams and compliance was much more trackable and achievable, thanks to streamlined application processes. Today, tracking and compliance aren’t enough to power the complex world of recruitment. The Great Resignation has made it abundantly clear that candidates expect the same type of consumerized experience in the hiring process as they do when buying anything at all. To win or keep the best talent, organizations must make the hiring process personalized and enjoyable, and a traditional ATS simply cannot support that mandate.
Anaplan offers a cloud-based business planning platform that incorporates a modeling and calculation engine. The tool makes it relatively easy to add or expand the scope of plans that can be connected and monitored on a single platform. This Integrated Business Planning (IBP) approach enables organizations to use the software for financial planning or budgeting, sales, supply chain, workforce, marketing and IT planning. These are the types of plans in which companies often need to create models that incorporate their specific requirements, business systems and strategy. I expect that by 2025, one-fourth of financial planning and analysis (FP&A) groups will have implemented IBP.
I have recently written about the importance of healthy data pipelines to ensure data is integrated and processed in the sequence required to generate business intelligence, and the need for data pipelines to be agile in the context of real-time data processing requirements. Data engineers, who are responsible for monitoring, managing and maintaining data pipelines, are under increasing pressure to deliver high-performance and flexible data integration and processing pipelines that are capable of handling the rising volume and frequency of data. Automation is a potential solution to this challenge, and several vendors, such as Ascend.io, have emerged in recent years to reduce the manual effort involved in data engineering.
The contact center industry is reexamining how organizations engage with contact center agents. One thing that we learned from the forced movement to work-from-home was that organizations have to provide agents with appropriate tools to collaborate and communicate with peers and supervisors as well as workers in the back office who participate in all sorts of customer-facing or customer-adjacent processes. It is also important to provide supervisors with visibility into agent activity. That means extending existing coaching and evaluation methods. Ventana Research believes that by 2025, nearly every organization will have dedicated systems or processes that help supervisors manage remotely.
I recently explained how emerging application requirements were expanding the range of use cases for NoSQL databases, increasing adoption based on the availability of enhanced functionality. These intelligent applications require a close relationship between operational data platforms and the output of data science and machine learning projects. This ensures that machine learning and predictive analytics initiatives are not only developed and trained based on the relationships inherent in operational applications, but also that the resulting intelligence is incorporated into the operational application in real time to support capabilities such as personalization, recommendations and fraud detection. Graph databases already support operational use cases such as social media, fraud detection, customer experience management and recommendation engines. Graph database vendors such as Neo4j are increasingly focused on the role that graph databases can play in supporting data scientists, enabling them to develop, train and run algorithms and machine learning models on graph data in the graph database, rather than extracting it into a separate environment.
Environmental, social and governance issues have grown increasingly pressing over the past few years as investors and government entities urge organizations to measure and disclose ESG metrics. I’ve already covered the broader topic as it relates to external reporting and how financial planning and analysis groups are likely to own this mandate going forward. (It’s mainly been a marketing and public relations effort up to now.) FP&A departments are also likely to be charged with responsibility for internal ESG analysis and reporting, because to achieve environmental and social goals, organizations will need to assign specific objectives to individual business units and their responsible parties. I assert that by 2025, more than one-half of corporations required to comply with ESG reporting will centralize responsibility for preparing related reports and filings with FP&A to achieve accuracy, control and efficiency objectives. To do so, FP&A groups must immediately establish a data management strategy consistent with their targeted ESG analysis and reporting approach.
I often use the term “analytics” to refer to a broad set of capabilities, deliberately broader than business intelligence. In this Perspective, I’d like to share what decision-makers should consider as they evaluate the range of analytics requirements for their organization.
I spent years in the talent acquisition space, and I think that at least several months of that time – cumulatively – was spent just trying to get people to calm down. Talent acquisition is a critically important business process, but if I had a dollar for every time I had to remind someone that there really are no recruiting emergencies, I’d be a wealthy woman.
Streaming data has been part of the industry landscape for decades but has largely been focused on niche applications in segments with the highest real-time data processing and analytics performance requirements, such as financial services and telecommunications. As demand for real-time interactive applications becomes more pervasive, streaming data is becoming a more mainstream pursuit, aided by the proliferation of open-source streaming data and event technologies, which have lowered the cost and technical barriers to developing new applications that take advantage of data in motion. Ventana Research’s Streaming Data Dynamic Insights enables an organization to assess its relative maturity in achieving value from streaming data. I assert that by 2024, more than one-half of all organizations’ standard information architectures will include streaming data and event processing, allowing organizations to be more responsive and provide better customer experiences.
Field service is a segment of customer experience that is dominated by two elements: the complexity of the issues handled, and the high cost of providing on-site services. It is recognized as a critical component of the service experience, especially when managing the condition of high-precision equipment in the medical, manufacturing and utility industries. It is also a high-risk moment in the customer life cycle. Consumers often experience the process as a series of disconnected visits and handoffs that fail to resolve issues the first time.
Organizations are collecting vast amounts of data every day, utilizing business intelligence software and data visualization to gain insights and identify patterns and errors in the data. Making sense of these patterns can enable an organization to gain an edge in the marketplace and plan more strategically.
Although the digital transformation of the finance department was a topic of discussion before 2020, it became a front-and-center issue as organizations locked down and in-office interactions became impossible. Finance and accounting departments were immediately confronted with a challenge because of their limited adoption of technology that would support a virtual working environment. As our 2019 Office of Finance Benchmark Research found, they are technological laggards: 45% are at the tactical or lowest level of competence in using technology across multiple processes and functions, while only 12% are at the highest. In my experience, many finance and accounting professionals and those running the department do not necessarily think that such competence is necessary, but this thinking is outdated because, increasingly, technology is the only practical way to address the department’s responsibilities (for example, the new revenue recognition accounting standards for contracts). To gain full advantage of technology, finance and accounting organizations must become “fast followers,” avoiding the bleeding edge but breaking the habit of waiting until the last possible moment before adopting proven advances.
When joining Ventana Research, I noted that the need to be more data-driven has become a mantra among large and small organizations alike. Data-driven organizations stand to gain competitive advantage, responding faster to worker and customer demands for more innovative, data-rich applications and personalized experiences. Being data-driven is clearly something to aspire to. However, it is also a somewhat vague concept without clear definition. We know data-driven organizations when we see them — the likes of Airbnb, DoorDash, ING Bank, Netflix, Spotify, and Uber are often cited as examples — but it is not necessarily clear what separates the data-driven from the rest. Data has been used in decision-making processes for thousands of years, and no business operates without some form of data processing and analytics. As such, although many organizations may aspire to be more data-driven, identifying and defining the steps required to achieve that goal are not necessarily easy. In this Analyst Perspective, I will outline the four key traits that I believe are required for a company to be considered data-driven.
Since its inception 20 years ago, Ventana Research has advocated for a shorter accounting close because it can improve the performance of the entire organization, not just finance and accounting. An important benefit of a shorter close is increased staff time for analysis and the preparation of reports and narratives that improve communications with the board and outside investors. Similarly, the department can provide those in operating roles the financial and managerial accounting results to highlight opportunities and issues they must address.
There are more digital channels in the commerce space than ever before: the web, mobile apps, text, voice-activated “agents,” video and social channels. Conversational computing and hyper-personalization are transforming customer engagement, and organizations may need to undergo a digital platform renovation to optimize customer and product experiences or risk lagging behind competitors. B2B selling and buying are increasingly using methods similar to B2C digital approaches to mirror the digital commerce experience that has grown substantially within the last few years. Salesforce Commerce Cloud is one of the platforms utilizing this approach.
We’ve recently published our latest Benchmark Research on Data Governance and it’s fair to say, “you’ve come a long way, baby.” Many of you reading this weren’t around when that phrase was introduced in 1968 to promote Virginia Slims cigarettes, but you may have heard the phrase because it went on to become a part of popular culture. We’ve learned a lot about cigarettes since then, and we’ve learned a lot about data governance, too.
In my more than two decades in the world of human resources and human capital management technology, I have never seen a topic become so completely ubiquitous so quickly as has employee experience. This is great news from my perspective. As I addressed in this recent analyst perspective, market factors have forced organizations to acknowledge the tremendous bottom-line value of an engaged workforce, and that engagement is wholly dependent upon an employer's commitment to providing a personalized, well-rounded experience.
We conducted our recent Smart Close Dynamic Insights Research in part to assess to what extent the substantial disruptions of the pandemic have impacted the accounting close. When office lockdowns began in the first quarter of 2020, many finance departments were challenged by having to do their quarterly close remotely without their normal face-to-face interactions. In the United States, the Securities and Exchange Commission was so concerned that corporations would be unable to meet their filing deadlines that they gave registrants carte blanche to extend their filing if necessary. As it turned out, only a relative handful did, and all but one of those was based in China; but for many, that first calendar close required a heroic effort. Since then, organizations have made concerted efforts to adopt and use technology to enable them to operate resiliently under any conditions. Our research finds that while organizations have to some extent adapted to operating a more remote working environment, progress toward a faster close has been elusive. The research also confirms that organizations that use technology effectively to automate processes are better able to complete their close sooner.
OneStream offers a platform designed to serve the needs of accounting and financial planning and analysis organizations. The software handles financial close and consolidation, planning and budgeting, analysis and reporting. For me, the most significant announcement at the company’s recent user conference was the unveiling of its Sensible ML (Machine Learning) offering, which is in limited general release. I’ve commented on the importance of artificial intelligence in business applications, and Sensible ML is a promising and important step in that direction.
I recently wrote about the growing range of use cases for which NoSQL databases can be considered, given increased breadth and depth of functionality available from providers of the various non-relational data platforms. As I noted, one category of NoSQL databases — graph databases — is inherently suitable for use cases that rely on relationships, such as social media, fraud detection and recommendation engines, since the graph data model represents the entities and values and also the relationships between them. The native representation of relationships can also be significant in surfacing “features” for use in machine learning modeling. There has been a concerted effort in recent years by graph database providers, including TigerGraph, to encourage and facilitate the use of graph databases by data scientists to support the development, testing and deployment of machine learning models.
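To make the point concrete, here is a minimal sketch of how relationships stored natively in a graph can be surfaced as machine learning features. It uses a plain Python adjacency list rather than any particular graph database API, and the account names and edges are invented for illustration.

```python
from collections import defaultdict

# Toy payment graph: accounts (nodes) and payments between them (edges).
# Entity names and values are illustrative, not from any real dataset.
edges = [
    ("acct_a", "acct_b"), ("acct_a", "acct_c"),
    ("acct_b", "acct_c"), ("acct_d", "acct_a"),
]

# Adjacency-list representation: relationships are first-class, so
# graph-native features fall out of simple traversals.
neighbors = defaultdict(set)
for src, dst in edges:
    neighbors[src].add(dst)
    neighbors[dst].add(src)

def degree(node):
    """Number of direct counterparties -- a common fraud-model feature."""
    return len(neighbors[node])

def common_neighbors(a, b):
    """Shared counterparties between two accounts -- a link-prediction feature."""
    return neighbors[a] & neighbors[b]

# Feature vector inputs for a downstream ML model:
features = {n: degree(n) for n in neighbors}
```

A graph database performs the same traversals at scale; the point is that features such as degree or shared neighbors are direct reads of the data model rather than expensive multi-way joins.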
“Lead to cash” is an often-used term and is a companion to “quote to cash” and “order to cash”. What they all represent is an approach which recognizes that there is a process designed to convert a lead from a qualified interest to an active sale, through quote and contract negotiation, to order or contract, invoice and payment. “Quote to cash” and “order to cash” are subsets of this process, with different starting places, but ultimately end in the same place: with a payment for a delivered product or service.
A few years ago – somewhat tongue in cheek – I began using the term “data pantry” to describe a type of data store that’s part of a business application platform, created for a specific set of users and use cases. It’s a data pantry because, unlike a general-purpose data store such as a data warehouse, everything the user needs is readily available and easily accessible, with labels that are immediately recognized and understood.
Organizations are continuously increasing the use of analytics and business intelligence to turn data into meaningful and actionable insights. Our Analytics and Data Benchmark Research shows some of the benefits of using analytics: Improved efficiency in business processes, improved communication and gaining a competitive edge in the market top the list. With a unified BI system, organizations can have a comprehensive view of all organizational data to better manage processes and identify opportunities.
I previously described the concept of hydroanalytic data platforms, which combine the structured data processing and analytics acceleration capabilities associated with data warehousing with the low-cost and multi-structured data storage advantages of the data lake. One of the key enablers of this approach is interactive SQL query engine functionality, which facilitates the use of existing business intelligence (BI) and data science tools to analyze data in data lakes. Interactive SQL query engines have been in use for several years — many of the capabilities were initially used to accelerate analytics on Hadoop — but have evolved along with data lake initiatives to enable analysis of data in cloud object storage. The open source Presto project is one of the most prominent interactive SQL query engines and has been adopted by some of the largest digital-native organizations. Presto managed-services provider Ahana is on a mission to bring the advantages of Presto to the masses.
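The appeal of an interactive SQL query engine is that existing BI and data science tools keep issuing ordinary SQL regardless of where the data lives. The sketch below uses Python's in-memory SQLite purely as a stand-in for an engine such as Presto; the table and values are hypothetical, and a real engine would federate the same query over files in cloud object storage.

```python
import sqlite3

# Stand-in for an interactive SQL query engine over a data lake:
# SQLite here plays the role Presto plays over object storage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("emea", 120.0), ("emea", 80.0), ("apac", 50.0)],
)

# BI and data science tools issue ordinary SQL; the engine decides
# where and how the underlying data is read.
rows = conn.execute(
    "SELECT region, SUM(revenue) FROM events GROUP BY region ORDER BY region"
).fetchall()
```

The separation matters: the tool sees tables and SQL, while the engine handles scanning Parquet or ORC files in S3-style storage behind the scenes.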
I’ve never been a fan of talking about semantic models because most of the workforce probably doesn’t understand what they are, or doesn’t recognize them by name. But the findings in our recent Analytics and Data Benchmark Research have changed my mind. The research shows how important a semantic model can be to the success of data and analytics processes. Organizations that have successfully implemented a semantic model are more than twice as likely to report satisfaction with analytics (77%) compared with a 33% overall satisfaction rate. Therefore, I owe it to all of you to write about them.
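For readers unfamiliar with the term, a semantic model is essentially a translation layer between business vocabulary and physical schemas. This minimal sketch shows the idea; the table and column names are hypothetical, and production semantic layers add hierarchies, security and governed metrics on top.

```python
# A semantic model maps business-friendly terms onto physical tables and
# columns, so analysts can ask for "revenue by customer" without knowing
# the schema. The mappings below are invented for illustration.
SEMANTIC_MODEL = {
    "customer": ("dim_customer", "cust_nm"),
    "revenue": ("fact_orders", "ord_net_amt"),
}

def resolve(term):
    """Translate a business term into its physical table.column reference."""
    table, column = SEMANTIC_MODEL[term]
    return f"{table}.{column}"

def build_select(measure, dimension):
    """Generate SQL from business terms instead of physical names."""
    return (f"SELECT {resolve(dimension)}, SUM({resolve(measure)}) "
            f"FROM {SEMANTIC_MODEL[measure][0]} GROUP BY {resolve(dimension)}")

query = build_select("revenue", "customer")
```

Because every tool and user resolves "revenue" through the same mapping, reports agree with one another, which is one plausible reason the research links semantic models to higher satisfaction.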
Artificial intelligence using machine learning has passed through the bright, shiny object stage and software vendors are well into the process of making the concept a reality in their offerings. Ventana Research defines AI as the use of technology to process information in much the way humans do, including improving accuracy in recommendations, actions and conclusions as more data is received. I like the alternative term “augmented intelligence” because it emphasizes that these systems enhance – rather than replace – the capabilities of the humans employing them, especially through improved decision-making and eliminating the need to perform repetitive work.
I previously explained how the data lakehouse is one of two primary approaches being adopted to deliver what I have called a hydroanalytic data platform. Hydroanalytics involves the combination of data warehouse and data lake functionality to enable and accelerate analysis of data in cloud storage services. The term data lakehouse has been rapidly adopted by several vendors in recent years to describe an environment in which data warehousing functionality is integrated into the data lake environment, rather than coexisting alongside. One of the vendors that has embraced the data lakehouse concept and terminology is Dremio, which recently launched the general availability of its Dremio Cloud data lakehouse platform.
Compensation management is a key talent management process involving all workers and managers within an organization. Determining and providing the appropriate compensation for each person — whether it involves base pay, merit pay, or variable pay and incentives such as bonuses — is critical to being able to attract and retain productive members of the workforce, including full- and part-time employees, contingent workers and contractors. The complexities of compensation often prove to be a core challenge for human resources departments as they strive to keep the organization productive, satisfied and motivated while ensuring equitable and defensible pay practices across the entire workforce.
Organizations are constantly trying to streamline and optimize business data to solve complex problems and identify opportunities to increase revenue and accelerate business growth. The data is usually stored in multiple systems with various data governance rules, which makes it complicated to democratize data and analytics within an organization. The most pressing concerns cited by participants in our Analytics and Data Benchmark Research include difficulty integrating with business processes, systems not flexible or adaptable to change and challenges accessing data sources.
As I recently described, it is anticipated that the majority of database workloads will continue to be served by specialist data platforms targeting operational and analytic workloads, albeit with growing demand for hybrid data processing use-cases and functionality. Specialist operational and analytic data platforms have historically been the preferred option, but there have always been general-purpose databases that could be used for both analytic and operational workloads, with tuning and extensions to meet the specific requirements of each.
Organizations are scaling business intelligence initiatives to gain a competitive advantage and increase revenue as more data is created. Lack of expertise, data governance and slow performance can impact these efforts. Our Analytics and Data Benchmark Research finds some of the most pressing complaints about analytics and BI include difficulty integrating with other business processes and flexibility issues. Kyvos is a BI acceleration platform that enables BI and analytics tools to analyze massive amounts of data. It offers support for online analytical processing-based multidimensional analytics, enabling workers to access large datasets with their analytics tools. It operates with major cloud platforms, including Google Cloud, Amazon Web Services and Microsoft Azure.
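The core trick behind OLAP-style BI acceleration is pre-aggregation: summing measures over combinations of dimensions ahead of time so queries become lookups rather than scans. The sketch below builds a tiny cube in plain Python; the dimensions and figures are invented, and it is a conceptual illustration rather than how any particular product, Kyvos included, is implemented.

```python
from collections import defaultdict
from itertools import combinations

# Source rows that would normally require a full scan per query.
rows = [
    {"region": "east", "product": "a", "sales": 10},
    {"region": "east", "product": "b", "sales": 5},
    {"region": "west", "product": "a", "sales": 7},
]

def build_cube(rows, dimensions, measure):
    """Pre-aggregate the measure over every subset of the dimensions."""
    cube = defaultdict(float)
    for row in rows:
        for r in range(len(dimensions) + 1):
            for dims in combinations(dimensions, r):
                key = tuple((d, row[d]) for d in dims)
                cube[key] += row[measure]
    return dict(cube)

cube = build_cube(rows, ["region", "product"], "sales")

# Point lookups replace full scans:
total = cube[()]                    # grand total
east = cube[(("region", "east"),)]  # one-dimension rollup
```

At scale the same principle lets an analytics tool answer rollup queries over billions of rows by reading a small pre-computed aggregate.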
I recently wrote about the potential benefits of data mesh. As I noted, data mesh is not a product that can be acquired, or even a technical architecture that can be built. It’s an organizational and cultural approach to data ownership, access and governance. While the concept of data mesh is agnostic to the technology used to implement it, technology is clearly an enabler for data mesh. For many organizations, new technological investment and evolution will be required to facilitate adoption of data mesh. Meanwhile, the concept of the data fabric, a technology-driven approach to managing and governing data across distributed environments, is rising in popularity. Although I previously touched on some of the technologies that might be applicable to data mesh, it is worth diving deeper into the data architecture implications of data mesh, and the potential overlap with data fabric.
At Enterprise Connect in March, Amazon announced new functionality in its cloud contact center platform, Amazon Connect. The company is now including a full Workforce Optimization component, which includes built-in forecasting, capacity planning and scheduling capabilities. It's no surprise that Amazon is adding these capabilities, as WFO has become a core component of a complete CCaaS platform.
Kinaxis is a sales and operations planning software company headquartered in Ottawa, Canada. Its RapidResponse is an S&OP platform for concurrent planning, designed to integrate an organization’s supply chain planning silos, accelerate planning cycles and optimize supply chain execution to match customer demand.
I recently described the use cases driving interest in hybrid data processing capabilities that enable analysis of data in an operational data platform without impacting operational application performance or requiring data to be extracted to an external analytic data platform. Hybrid data processing functionality is becoming increasingly attractive to aid the development of intelligent applications infused with personalization and artificial intelligence-driven recommendations. These applications can be used to improve customer service and engagement, detect and prevent fraud, and increase operational efficiency. Several database providers now offer hybrid data processing capabilities to support these application requirements. One of the vendors addressing this opportunity is SingleStore.
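A conceptual sketch of the hybrid pattern: the same data platform serves transactional writes and analytical reads, with no extract to a separate warehouse. SQLite stands in here for an HTAP-capable database; the schema and values are hypothetical.

```python
import sqlite3

# One store, two workloads: operational inserts and analytic reads
# against the same live data (no ETL to a separate warehouse).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount REAL)")

# Operational path: the application records transactions as they occur.
db.execute("INSERT INTO orders VALUES ('c1', 30.0)")
db.execute("INSERT INTO orders VALUES ('c1', 70.0)")
db.execute("INSERT INTO orders VALUES ('c2', 20.0)")

# Analytic path: a personalization or fraud-scoring query runs against
# the live transactional data, not a stale extract.
(avg_spend,) = db.execute(
    "SELECT AVG(amount) FROM orders WHERE customer = 'c1'"
).fetchone()
```

Real hybrid platforms add the part SQLite cannot show: isolating analytical scans so they do not degrade transactional latency, typically via separate row and column storage of the same data.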
There is a fundamental flaw in information technology, or at least in the way it is most commonly delivered. Most technology systems are developed under the assumption that all people will use the system primarily in the same way. Sure, there are some options built in — perhaps the same action can be initiated by either clicking on a button, selecting a menu item or invoking a keyboard shortcut. The problem is that when every variation needs to be coded into the system, the prospect of providing personalized software programs to every individual is impractical.
The data governance landscape is growing rapidly. Organizations handling vast amounts of data face multiple challenges as more regulations are added to govern sensitive information. Adoption of multi-cloud strategies increases governance concerns with new data sources that are accessed in real time. Our Data Governance Benchmark Research shows that organizations face multiple challenges when deploying data governance. Nearly three-quarters (73%) of organizations report disparate data sources as the biggest challenge, and half report creating, modifying, managing and enforcing governance policies as the second-biggest challenge.
Personalization is everywhere, from clothes that are selected for us and delivered to our homes, to the ads we see, to the movies we stream. It’s no surprise that employees expect that same level of curated experience in the workplace. And yet, evidence abounds to the contrary. The proverbial black hole of recruiting is still the bane of existence for everyone who has applied for a job online and has heard nothing back. Companies still tout “self-directed career advancement” as a positive quality while employees don’t even know what opportunities may be available to them and long for guidance and support. Benefits departments offer subsidized childcare, while offering pet insurance would be more beneficial (and less costly!) for a forgotten portion of the employee population.
The server is a key component of enterprise computing, providing the functional compute resources required to support software applications. Historically, the server was so fundamentally important that it – along with the processor, or processor core – was also a definitional unit by which software was measured, priced and sold. That changed with the advent of cloud-based service delivery and consumption models.
Many – myself included – have written about the growth in technologies designed to aid in business-to-business sales and sales management by serving sales reps, line managers, executives and operations. But one area that has been ill-served is technical presales, or sales engineering. You may ask why this should matter. Aren’t presales engineers all about demonstrations? How could technology – beyond video conferencing – help?
Organizations need to incorporate external information into planning and budgeting, both raw data and third-party forecasts. This need also extends to external data in training artificial intelligence systems to assist in planning and for predictive analytics. Companies do not live in a vacuum, and things occurring outside their physical facilities have a direct impact on how an organization performs. Incorporating external data and third-party forecasts in any systemic fashion is really only practical if you’re using dedicated planning and budgeting software. And increasingly, planning and budgeting software will be incorporating AI capabilities. Watch this brief video presentation by Ventana Research SVP and Research Director Robert Kugel to uncover the benefits of organizations using external data.
Over a decade ago, I coined the term NewSQL to describe the new breed of horizontally scalable, relational database products. The term was adopted by a variety of vendors that sought to combine the transactional consistency of the relational database model with elastic, cloud-native scalability. Many of the early NewSQL vendors struggled to gain traction, however, and were either acquired or ceased operations before they could make an impact in the crowded operational data platforms market. Nonetheless, the potential benefits of data platforms that span both on-premises and cloud resources remain. As I recently noted, many of the new operational database vendors have now adopted the term “distributed SQL” to describe their offerings. In addition to new terminology, a key trend that separates distributed SQL vendors from the NewSQL providers that preceded them is a greater focus on developers, laying the foundation for the next generation of applications that will depend on horizontally scalable, relational-database functionality. Yugabyte is a case in point.
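The defining mechanism behind horizontally scalable, relational functionality is that rows are spread across nodes by hashing a key, while clients still see one logical table. This is a simplified sketch of that placement logic in plain Python; node names are hypothetical, and real distributed SQL databases layer replication, rebalancing and distributed transactions on top.

```python
import hashlib

# Hypothetical cluster members; a real deployment would discover these.
NODES = ["node-1", "node-2", "node-3"]

def shard_for(key, nodes=NODES):
    """Deterministically map a row key to a node by hashing.

    Every client computes the same placement, so no central
    coordinator is needed to route a single-row read or write.
    """
    digest = hashlib.sha256(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

placement = {k: shard_for(k) for k in ["user:1", "user:2", "user:3"]}
```

Production systems typically refine this with consistent hashing or range-based splits so that adding a node moves only a fraction of the keys, but the developer-facing promise is the same: SQL semantics over data that happens to live on many machines.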
Contact centers are undergoing a radical reshuffling of the workforce, partly because the pandemic shifted agents to remote work. But the trends were in place to reorganize the world of work long before the pandemic. Digital contact channels, which are gaining in popularity, require workers who are better informed and capable of handling more complex and interdependent interactions and processes. That’s changing the nature of training, management and even process design between departments.
Sage recently announced that it is expanding its Sage Intacct software offering to support discrete manufacturing, with its initial foray into this competitive market centered in France. The move supports the company’s strategy of building out the scope of industries served by its cloud applications to include product-oriented business models and expanding Sage Intacct’s geographic footprint. The company has been extending the functionality it offers customers with human capital management as well as budgeting and planning and extending beyond its sole focus on service organizations to be able to support product-focused businesses. These include wholesale distribution, construction, retail (with the recently completed Brightpearl acquisition) and now discrete manufacturing, specifically industrial machinery and supplies, electrical equipment and electronic parts.
I recently described how the operational data platforms sector is in a state of flux. There are multiple trends at play, including the increasing need for hybrid and multicloud data platforms, the evolution of NoSQL database functionality and applicable use-cases, and the drivers for hybrid data processing. The past decade has seen significant change in the emergence of new vendors, data models and architectures as well as new deployment and consumption approaches. As organizations adopted strategies to address these new options, a few things remained constant – one being the influence and importance of Oracle. The company’s database business continues to be a core focus of innovation, evolution and differentiation, even as it expanded its portfolio to address cloud applications and infrastructure.
I am excited to announce that I have joined Ventana Research to lead our market coverage of Human Capital Management, including the focus areas of Continuous Payroll, Employee Experience, Learning Management, Talent Management, Total Compensation Management and Workforce Management.
Customer Service and Support (CSS) software is about more than case tracking and trouble tickets. Many organizations view the service call as an opportunity to solidify a positive customer relationship and perhaps enhance the loyalty and value of the customer. That has propelled the emphasis on workflows and automation that currently drives CSS, particularly when it comes to managing self-service and field service, and the ability to provide agents with contextually relevant information during interactions.
I first wrote about a new era of trade a few years ago to make the point that the period of optimizing supply chains for the lowest cost was over, and that companies needed to redesign them to achieve greater resiliency. That observation proved correct. Now we are hearing about “the end of globalization,” a hyperbolic phrase describing the effects of ongoing changes to the international political order that have been underway for more than a decade. These changes are forcing companies to make sometimes significant adjustments to sourcing and supply chain management. Globalization, which started in 1492, isn’t over, but managing international trade requires the ability to deal with shifts in strategic planning assumptions and agility in dealing with tactical events. Software will play an important role in enabling corporations to meet these ongoing challenges caused by a major reordering of global trade.
Organizations have been using data virtualization to collect and integrate data from various sources, and in different formats, to create a single source of truth without redundancy or overlap, thus improving and accelerating decision-making and giving them a competitive advantage in the market. Our research shows that data virtualization is popular in the big data world. One-quarter (27%) of participants in our Data Lake Dynamic Insights Research reported they were currently using data virtualization, and nearly half (46%) planned to include data virtualization in the future. Even more interesting, those who are using data virtualization reported higher rates of satisfaction (79%) with their data lake than those who are not (36%). Our Analytics and Data Benchmark Research shows more than one-third of organizations (37%) are using data virtualization in that context. Here, too, those using data virtualization reported higher levels of satisfaction (88%) than those that are not (66%).
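The essence of data virtualization is that the unified view is computed at query time rather than materialized. This sketch federates two invented in-memory "source systems" on demand; a real virtualization layer would push queries down to CRM, billing and other live systems instead.

```python
# Two hypothetical source systems, left in place and never copied.
crm_source = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
billing_source = {1: 1200.0, 2: 850.0}

def virtual_customer_view():
    """Join the sources on demand -- nothing is materialized or duplicated."""
    for record in crm_source:
        yield {
            "id": record["id"],
            "name": record["name"],
            "billed": billing_source.get(record["id"], 0.0),
        }

# Consumers see one coherent result set, assembled at query time.
unified = list(virtual_customer_view())
```

Because the join happens when the view is read, a change in the billing source is visible on the next query with no refresh job, which is the trade-off virtualization makes against the cost of repeated remote reads.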
I recently wrote about the importance of data pipelines and the role they play in transporting data between the stages of data processing and analytics. Healthy data pipelines are necessary to ensure data is integrated and processed in the sequence required to generate business intelligence. The concept of the data pipeline is nothing new of course, but it is becoming increasingly important as organizations adapt data management processes to be more data driven.
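In its simplest form, a data pipeline is just stages wired in sequence, each consuming the previous stage's output. The sketch below composes extract, transform and load as plain functions over invented records; production pipelines add scheduling, retries and monitoring, which is where pipeline health tooling comes in.

```python
def extract():
    # Stand-in for pulling raw records from a source system.
    return [" Widget,4 ", "Gadget,2", ""]

def transform(raw_rows):
    # Clean and parse; drop rows that fail validation so bad data
    # does not propagate downstream.
    out = []
    for row in raw_rows:
        row = row.strip()
        if not row:
            continue
        name, qty = row.split(",")
        out.append({"name": name, "qty": int(qty)})
    return out

def load(records, target):
    # Stand-in for writing to a warehouse table.
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

A "healthy" pipeline in this framing is one where each stage's contract holds: the transform step receives what extract promised and emits what load expects, in the required order.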
Data governance is an issue that impacts all organizations, large and small, new and old, in every industry and every region of the world. Data governance ensures that an organization’s data can be cataloged, trusted and protected, improving business processes to accelerate analytics initiatives and support compliance with regulatory requirements. Not all data governance initiatives will be driven by regulatory compliance; however, the risk of falling foul of privacy (and human rights) laws ensures that regulatory compliance influences data-processing requirements and all data governance projects. Multinational organizations must be cognizant of the wide variety of regional data security and privacy requirements, not least the European Union’s General Data Protection Regulation (GDPR). The GDPR became enforceable in 2018, protects the privacy of personal or professional data, and carries with it the threat of fines of up to 20 million euros ($22 million) or 4% of a company’s global revenue. Europe is not alone in regulating the use of personally identifiable information (other similar regulations include The California Consumer Privacy Act), but Ventana Research’s Data Governance Benchmark Research illustrates that there are differing attitudes and approaches to data governance on either side of the Atlantic.
Although the bulk of contact center seats are still served by on-premises equipment, there appears to be a consensus that the cloud is better suited to delivering a successful, omnichannel customer experience, and that most new contact center deployments will be run on cloud-computing platforms.
I recently described the growing level of interest in data mesh which provides an organizational and cultural approach to data ownership, access and governance that facilitates distributed data processing. As I stated in my Analyst Perspective, data mesh is not a product that can be acquired or even a technical architecture that can be built. Adopting the data mesh approach is dependent on people and process change to overcome traditional reliance on centralized ownership of data and infrastructure and adapt to its principles of domain-oriented ownership, data as a product, self-serve data infrastructure and federated governance. Many organizations will need to make technological changes to facilitate adoption of data mesh, however. Starburst Data is associated with accelerating analysis of data in data lakes but is also one of several vendors aligning their products with data mesh.
How payments are effected is an afterthought to many involved in a transaction, but flaws in this process can be a source of pain and frustration for those in the back office, especially in accounting and treasury. To improve the way payments are handled in business-to-business transactions, the once ubiquitous paper checks are giving way to electronic payments. This category includes credit, debit and virtual cards, wire transfers, as well as ACH (Automated Clearing House) transmissions that may be in the form of direct deposits, direct debits and electronic checks. Electronic payments are supplanting checks because they lower processing costs for both parties in a transaction; increase accuracy, auditability and control of the accounting; provide better visibility into payment status; and enable deeper insight into spend or customer metrics. Building on these digital advances, blockchain payment systems (BPS), now at an early stage in development and adoption, have significant potential in the market because they offer similar advantages at an even lower cost. I assert that by 2025, fewer than 20% of organizations will be using blockchain payment systems, but those that do will speed transactions, reduce overhead and cut costs.
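The core mechanism that gives blockchain payment systems their auditability is the hash chain: each block commits to its predecessor's hash, so a recorded payment cannot be silently altered. This is a single-party toy sketch using Python's standard library; real BPS add the parts omitted here, notably consensus among distributed parties. The field names are invented.

```python
import hashlib
import json

def block_hash(block):
    # Canonical serialization so the same block always hashes identically.
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()

def append_payment(chain, payer, payee, amount):
    # Each new block commits to the hash of the previous one.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "payer": payer,
                  "payee": payee, "amount": amount})

def verify(chain):
    """True only if every block still matches its successor's commitment."""
    for i in range(1, len(chain)):
        if chain[i]["prev"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
append_payment(ledger, "buyer", "supplier", 500.0)
append_payment(ledger, "buyer", "supplier", 125.0)
```

Tampering with any earlier payment changes that block's hash and breaks the next block's `prev` commitment, which is what lets both parties to a transaction trust a shared payment record without reconciling separate ledgers.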
Ventana Research is happy to share insights gleaned from the latest Value Index research, an assessment of how well vendors’ offerings meet buyers’ requirements. The 2022 Revenue Performance Management (RPM) Value Index is the distillation of a year of market and product research. Drawing on our Benchmark Research, we apply a structured methodology built on evaluation categories that reflect the real-world criteria incorporated in a request for proposal to vendors supporting the spectrum of revenue performance management. Using this methodology, we evaluated vendor submissions in seven categories: five relevant to the Product Experience (Adaptability, Capability, Manageability, Reliability and Usability) and two related to the Customer Experience (Total Cost of Ownership or Return on Investment, and Vendor Validation).
I have written previously that the world of data and analytics will become more and more centered around real-time, streaming data. Data is created constantly and, increasingly, is collected as soon as it is created. Technology advances now enable organizations to process and analyze information as it is being collected to respond in real time to opportunities and threats. Not all use cases require real-time analysis and response, but many do, including multiple use cases that can improve customer experiences. For example, best-in-class e-commerce interactions should provide real-time updates on inventory status to avoid stock-out or back-order situations. Customer service interactions should provide real-time recommendations that minimize the time to resolution. Location-based offers should be targeted at the customer’s current location, not their location several minutes ago. Another domain where real-time analyses are critical is internet of things (IoT) applications. Additionally, use cases like predictive maintenance require timely information to prevent equipment failures, helping avoid additional costs and damage.
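A minimal illustration of analyzing data as it is collected: a fixed-size sliding window over sensor readings flags threshold breaches as each event arrives, rather than after a batch load. The readings and threshold are invented for illustration; streaming platforms apply the same windowed logic at scale.

```python
from collections import deque

class SlidingAverage:
    """Rolling mean over the last `size` events, evaluated per event."""

    def __init__(self, size, threshold):
        self.window = deque(maxlen=size)  # old readings fall off automatically
        self.threshold = threshold

    def observe(self, value):
        """Ingest one event; return True if the rolling mean breaches."""
        self.window.append(value)
        mean = sum(self.window) / len(self.window)
        return mean > self.threshold

# Hypothetical temperature feed for a predictive-maintenance check.
monitor = SlidingAverage(size=3, threshold=80.0)
alerts = [monitor.observe(v) for v in [70, 75, 90, 95, 99]]
```

The alert fires while the trend is developing, which is the operational difference between streaming analysis and discovering the same pattern in tomorrow's batch report.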
As organizations shift focus to a broader definition of sales that includes all sources of revenue, vendors are also pivoting to include “revenue” as part of promotional messaging. But it’s my view that just changing your message or description does not necessarily deliver the capabilities and product experiences customers need to successfully plan, execute and achieve revenue targets and objectives. The just-completed 2022 Ventana Research Value Index for Revenue Performance Management addressed this shift, focusing on available product capabilities that support customer needs as well as their overall experience.
The term "corporate spend" usually refers to the incidental but still significant outlays organizations make to support operations. Especially in nonmanufacturing industries, purchases of indirect goods and business services – such as computers, office supplies, furniture and services – as well as travel and entertainment can represent a significant percentage of total costs. Technology has evolved to the point where executives – especially the chief financial officer – need to take an overarching approach to corporate spend that utilizes technology to tighten controls, deepen visibility into expenditures, increase productivity and reduce process frictions. Spend management software and corporate spend cards – either physical or virtual – offer a means of achieving spend management objectives. This is part of a broader trend to digitizing outlays: I assert that by 2025, more than two-thirds of organizations will be using spend management software and corporate cards to achieve greater control and increased efficiency.
Data mesh is the latest trend to grip the data and analytics sector. The term has been rapidly adopted by numerous vendors — as well as a growing number of organizations — as a means of embracing distributed data processing. Understanding and adopting data mesh remains a challenge, however. Data mesh is not a product that can be acquired, or even a technical architecture that can be built. It is an organizational and cultural approach to data ownership, access and governance. Adopting data mesh requires cultural and organizational change. Data mesh promises multiple benefits to organizations that embrace this change, but doing so may be far from easy.
Ventana Research defines subscription management as the processes and technology needed to manage the subscriber experience from the first digital touch to the continuous modifications of orders for services and billing. Effective subscription management requires a new generation of applications designed to manage the life cycle of subscriptions and provide subscribers with the experiences they expect. The subscription business model has grown in popularity across many industries, and for many organizations it is now part of how they conduct business. Organizations, whether through line extensions, completely new businesses or mergers and acquisitions, now have a mixed business model combining subscription and usage with one-time sales, often as a bundle of related products and services. The model establishes a regular, predictable income stream and monetizes existing and new assets. In addition, usage-based pricing is preferred by many customers, both B2B and B2C, because it is more closely aligned to actual consumption patterns. For product companies, selling by subscription enables them to maintain ongoing contact with customers to facilitate future sales. Subscription is also popular with customers as it allows a degree of control from the buyer’s point of view and can be cancelled or modified, typically online, in a frictionless manner.
Today’s contact center agents find themselves handling increasingly more complex interactions due to changes in consumer demand, advances in self-service and the proliferation of digital contact channels. This added complexity requires continuous agent support for successful customer experience outcomes. Intelligent software can reduce agent workload and improve customer interactions by picking up customer cues.
For years, maybe decades, we have heard about the struggles between IT and line-of-business functions. In this perspective, we will look at some of the data from our Analytics and Data Benchmark Research about the roles of IT and line-of-business teams in analytics and data processes. We will also look at some of the disconnects between these two groups. And, by looking at how organizations are operating today and the results they are achieving, we can discern some of the best practices for improving the outcomes of analytics and data processes.
Ventana Research recently published the results of our Business Planning Value Index Research and I commented on its connection to our emphasis on using software to unify planning processes across an enterprise to improve performance. Since 2007, we have advocated what we call Integrated Business Planning (IBP): a high-participation, collaborative, action-oriented approach to planning and budgeting built on frequent, short planning sprints. Short planning cycles enable companies to achieve greater agility in responding to market or competitive changes.
Despite widespread and increasing use of the cloud for data and analytics workloads, it has become clear in recent years that, for most organizations, a proportion of data-processing workloads will remain on-premises in centralized data centers or distributed-edge processing infrastructure. As we recently noted, as compute and storage are distributed across a hybrid and multi-cloud architecture, so, too, is the data it stores and relies upon. This presents challenges for organizations to identify, manage and analyze all the data that is available to them. It also presents opportunities for vendors to help alleviate that challenge. In particular, it provides a gap in the market for data-platform vendors to distinguish themselves from the various cloud providers with cloud-agnostic data platforms that can support data processing across hybrid IT, multi-cloud and edge environments (including Internet of Things devices, as well as servers and local data centers located close to the source of the data). Yellowbrick Data is one vendor that has seized upon that opportunity with its cloud Data Warehouse offering.
Digital Transformation. The Subscription Economy. Omni-Channel Selling. Customer Centric. These are all terms used to label trends and events that are changing the way business is being conducted, a change that has accelerated due to recent events. Regardless of the terminology, there is no doubt that the way vendors and buyers are interacting, whether B2C or B2B, is different today for many organizations than it was even five years ago. But to be fair, no technology on its own can transform your business without changes to the other two key elements: people and processes. In addition, change is unlikely to happen if you rely solely on your existing ERP or CRM systems.
The technology underpinning customer experience (CX) is a hodgepodge of tools that have been developed for niche use cases and then expanded to fill broader roles. Examples include the old (CRM, help desk software and speech analytics) and the new (customer data platforms and conversational AI). This is because CX is a set of very specialized processes that happen in different parts of the enterprise, managed by people who often do not connect with peers handling related processes. Service-related activities are focused in the contact center, personalization and loyalty in marketing departments, and so forth.
Organizations face various challenges with analytics and business intelligence processes, including data curation and modeling across disparate sources and data warehouses, maintaining data quality and ensuring security and governance. Traditional processes are slow when transforming large and diverse datasets into something which is easily consumable in BI. And, it can take days or weeks to create reports and dashboards — maybe longer if processes change and new data sources are introduced. Our Analytics and Data Benchmark Research shows that the most time-consuming processes are preparing data, reviewing it for quality issues and preparing reports for presentation and distribution.
Value-added tax is a type of levy that is applied at each step of a transaction chain, from basic inputs to the final good or service. The amount assessed is based on the value added by an organization (hence the name) when a transaction occurs. VAT is used throughout the world because, historically, it has been harder to evade compared to income taxes. VAT is a common method of national taxation: Approximately 85% of countries impose it worldwide. A notable exception is the United States, where sales and use taxes are imposed at the state and local level and applied only to the price of the final good or service. As commerce has become global and cross-border sales have increased, traditional methods for calculating, applying and complying with VAT regimes have grown more complex. To achieve higher tax revenue while ensuring better compliance, governments are turning to technology to make collections more effective while making processes more efficient.
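The mechanics described above can be illustrated with a toy calculation. The figures, chain and 20% rate below are hypothetical, chosen only to show how the tax remitted at each step, based on value added, sums to the tax on the final sale price:

```python
# Hypothetical three-step chain (20% rate). Each seller remits VAT on its
# sales (output VAT) minus the VAT already paid on its purchases (input VAT),
# which equals the rate applied to the value it added.

VAT_RATE = 0.20

# (seller, purchase cost, sale price); value added = sale price - cost.
chain = [
    ("raw materials supplier", 0, 100),
    ("manufacturer", 100, 250),
    ("retailer", 250, 400),
]

total_remitted = 0.0
for seller, cost, price in chain:
    net_vat = (price - cost) * VAT_RATE  # output VAT minus input VAT
    total_remitted += net_vat
    print(f"{seller}: remits {net_vat:.2f}")

# The amounts collected along the chain equal the tax on the final price.
final_price = chain[-1][2]
assert abs(total_remitted - final_price * VAT_RATE) < 1e-9  # 80.00 on a 400 sale
```

This additive structure is what makes VAT harder to evade than a single point-of-sale tax: each participant's input-VAT credit creates an incentive to document the prior step.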
I recently examined how evolving functionality had fueled the adoption of NoSQL databases, recommending that organizations evaluate NoSQL databases when assessing options for data transformation and modernization efforts. This recommendation was based on the breadth and depth of functionality offered by NoSQL database providers today, which has expanded the range of use cases for which NoSQL databases are potentially viable. There remain a significant number of organizations that have not explored NoSQL databases, as well as several workloads for which NoSQL databases are assumed to be inherently unsuitable. Given the advances in functionality, organizations would be well-advised to maintain up-to-date knowledge of available products and services and an understanding of the range of use cases for which NoSQL databases are a valid option.
Today, organizations understand the importance of good external data that can be integrated with internal data to train machine learning models. Our Machine Learning Dynamic Insights research showed that external data adds significant value in gaining competitive advantage, improving customer experience and increasing sales. But getting the right external data for a particular requirement is not always easy. Internal data is usually not enough to train different models because of its narrow scope of usage and lack of relevance. Manual data acquisition methods are resource-intensive and can take weeks or months to get the data ready to feed into models.
I recently attended an analyst conference held by Unit4, an enterprise resource planning vendor focused on midsize organizations in people-centric industries. The conference was intended to communicate the company’s strategy, product updates and roadmap. The meeting took place shortly after the announcement of the availability of Unit4 Industry Mesh and the acquisition of Compright, a compensation planning provider, as well as in the context of the broad technology shifts affecting ERP applications.
The various NoSQL databases have become a staple of the data platforms landscape since the term entered the IT industry lexicon in 2009 to describe a new generation of non-relational databases. While NoSQL began as a ragtag collection of loosely affiliated, open-source database projects, several commercial NoSQL database providers are now established as credible alternatives to the various relational database providers, while all the major cloud providers and relational database giants now also have NoSQL database offerings. Almost one-quarter (22%) of respondents to Ventana Research’s Analytics and Data Benchmark Research are using NoSQL databases in production today, and adoption is likely to continue to grow. More than one-third (34%) of respondents are planning to adopt NoSQL databases within two years (21%) or are evaluating (14%) their potential use. Adoption has been accelerated by the evolving functionality offered by NoSQL products and services, the growing maturity of specialist NoSQL vendors, and new commercial offerings from cloud providers and established database providers alike. This evolution is exemplified by the changing meaning of the term NoSQL itself. While it was initially associated with a rejection of the relational database hegemony, it has retroactively been reinterpreted to mean “Not Only SQL,” reflecting the potential for these new databases to coexist with and complement established approaches.
Software that automates the full scope of the accounting close, including reconciliations, consolidation and reporting, has grown more capable and affordable over the past five years. By enabling consistent process management that captures best practices, and by automating rote, repetitive activities to boost staff productivity, these applications enable organizations to shorten the close, make the process more efficient and reduce the risk of material errors by strengthening accounting controls. As accounting departments have learned over the past two years, close automation software helps ensure business continuity under any circumstance, especially as remote workforces that are able to perform the close virtually become more commonplace.
Natural language processing (NLP) is a field that combines artificial intelligence (AI), data science and linguistics that enables computers to understand, interpret and manipulate text or spoken words. NLP includes generating narratives based on a set of data values, using text or speech as inputs to access information, and analyzing text or speech, for instance, to determine its sentiment. There are various techniques for interpreting human language, ranging from statistical and machine learning (ML) methods to rules-based and algorithmic approaches. In this perspective, we will focus on two aspects of NLP: natural language query (NLQ), which offers the ability to use natural language expressions to discover and understand data, and natural language generation (NLG), which uses AI to produce written or spoken narratives from a dataset. NLQ and NLG enable business personnel to communicate information needs with business intelligence (BI) systems more easily.
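To make the NLQ idea concrete, here is a deliberately minimal, hypothetical sketch of the simplest rules-based approach: mapping a plain-English question onto a SQL template via keyword matching. The `sales` table, the metric and dimension vocabularies, and the `nlq_to_sql` function are all invented for illustration; commercial BI products use far more sophisticated parsing and semantic models:

```python
# Toy rules-based NLQ: match known metric and dimension words in a question
# and assemble a SQL query. Purely illustrative, not a product's method.

import re

# Hypothetical vocabularies for an assumed "sales" dataset.
METRICS = {"revenue": "SUM(revenue)", "orders": "COUNT(order_id)"}
DIMENSIONS = {"region": "region", "month": "order_month"}

def nlq_to_sql(question: str) -> str:
    q = question.lower()
    # First metric word found wins; fall back to a row count.
    metric = next((sql for word, sql in METRICS.items() if word in q), "COUNT(*)")
    # Any dimension words become grouping columns.
    dims = [col for word, col in DIMENSIONS.items() if re.search(rf"\b{word}\b", q)]
    group_by = f" GROUP BY {', '.join(dims)}" if dims else ""
    select_dims = f"{', '.join(dims)}, " if dims else ""
    return f"SELECT {select_dims}{metric} FROM sales{group_by}"

print(nlq_to_sql("What was revenue by region?"))
# -> SELECT region, SUM(revenue) FROM sales GROUP BY region
```

Even this toy version shows why NLQ depends on a curated semantic layer: the quality of the answer is bounded by how well business terms are mapped to the underlying data model.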
As businesses become more data-driven, they are increasingly dependent on the quality of their data and the reliability of their data pipelines. Making decisions based on data does not guarantee success, especially if the business cannot ensure that the data is accurate and trustworthy. While there is potential value in capturing all data — good or bad — making decisions based on low-quality data may do more harm than good.
In a previous Analyst Perspective, we discussed some of the big-picture trends that are bringing cost control back as a core driver of contact center operations. In this report we will tackle some of the practical ramifications: how those trends affect decision-making and operations.
There is much vendor activity and customer interest in making better use of data to improve the sales process in the face of increased pressure to achieve organization revenue goals. As detailed in my Analyst Perspective: The Art and Science of Sales from the “Inside Out,” enhanced buyer research as well as the inclusion of more people in the buying process have made selling harder, evidenced by a general trend of declining quota attainment. There is no denying that better use of data can help in prioritizing and helping to advance the sales process more effectively. But this is not the whole story. Whereas generating interest and qualifying opportunity is a key part of the sales team’s role, all this progress can be undone with a cumbersome and clunky configure, price and quote (CPQ) and contract life cycle management (CLM) process. Automated and digitized systems that handle these elements contribute greatly to a successful close process and will set the right tenor for a continuing and sustained customer relationship. And although CPQ is often thought of as part of the finance department, as contracts are with legal, both of these processes should be seen as adjuncts of the sales process, and both sales and revenue leadership and operations teams need to align with finance and legal. My colleague Robert Kugel covers the finance perspective in more detail in his Analyst Perspective: Configure, Price and Quote Software Supports Profitability Management.
Reconciling accounts at the end of a period is one of those mundane finance department tasks that are ripe for automation. Reconciliation is the process of comparing account data (at the balance or item level) that exists either in two accounting systems or in an accounting system and somewhere else (such as in a spreadsheet or on paper). The purpose of the reconciling process is to identify things that do not match (as they must in double-entry bookkeeping systems) and then assess the nature and causes of the variances. This is followed by making adjustments or corrections to ensure that the information in an organization’s books is accurate. Most of the time, reconciliation is a matter of good housekeeping that identifies errors and omissions in the accounting process, including invalid journal postings and duplicate accounting entries, so they can be corrected. Reconciliation also is an important line of defense against fraud since inconsistencies may be a sign of such activity.
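The comparison step described above lends itself naturally to automation. The following is a minimal sketch, with an invented data model and invoice numbers, of the core matching logic: compare entries from two systems by a shared key and flag anything missing or differing in amount for review. Production reconciliation software adds tolerance rules, many-to-one matching, workflow and audit trails on top of this:

```python
# Hypothetical example: match two sets of entries by key and report
# exceptions (missing items and amount variances) for investigation.

ledger = {"INV-001": 1200.00, "INV-002": 540.50, "INV-003": 75.00}
subledger = {"INV-001": 1200.00, "INV-002": 545.50, "INV-004": 310.00}

def reconcile(a: dict, b: dict) -> dict:
    """Return {key: issue} for every entry that fails to match."""
    exceptions = {}
    for k in sorted(a.keys() | b.keys()):
        if k not in a:
            exceptions[k] = "missing from ledger"
        elif k not in b:
            exceptions[k] = "missing from subledger"
        elif abs(a[k] - b[k]) > 0.005:  # tolerance for rounding
            exceptions[k] = f"amount variance {a[k] - b[k]:+.2f}"
    return exceptions

for key, issue in reconcile(ledger, subledger).items():
    print(key, "->", issue)
# INV-002 -> amount variance -5.00
# INV-003 -> missing from subledger
# INV-004 -> missing from ledger
```

Note that automation handles the mechanical matching; assessing the nature and cause of each exception, and posting the correcting entry, remains an accounting judgment.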
Despite all the advances organizations have made with respect to analytics, our most recent research shows the majority of the workforce in the majority of organizations are not using analytics and business intelligence (BI). Less than one-quarter (23%) report that one-half or more of their workforce is using analytics and BI. This is a problem. It means organizations are not enabling their workforce to perform at peak efficiency and effectiveness. It means the workforce in many organizations does not have access to the same information by which they are being measured. It means organizations must find other ways to communicate with, and manage, the workforce.
Contact centers have always been very cost-centric and attuned to the kinds of constraints they have to operate within, but many organizations were diverted from that kind of focus when the pandemic first hit. In 2020, there was a sudden need for new tools and equipment just to keep centers running, and the costs involved in enabling agents to work from home — equipping them and their supervisors with the tools they needed to collaborate and stay in sync — were unavoidable.
I recently described the emergence of hydroanalytic data platforms, outlining how the processes involved in generating energy from a lake or reservoir were analogous to those required to generate intelligence from a data lake. I explained how structured data processing and analytics acceleration capabilities are the equivalent of turbines, generators and transformers in a hydroelectric power station. While these capabilities are more typically associated with data warehousing, they are now being applied to data lake environments as well. Structured data processing and analytics acceleration capabilities are not the only things required to generate insights from data, however, and the hydroelectric power station analogy further illustrates this. For example, generating hydroelectric power also relies on pipelines to ensure that the water is transported from the lake or reservoir at the appropriate volume to drive the turbines. Ensuring that a hydroelectric power station is operating efficiently also requires the collection, monitoring and analysis of telemetry data to confirm that the turbines, generators, transformers and pipelines are functioning correctly. Similarly, generating intelligence from data relies on data pipelines that ensure the data is integrated and processed in the correct sequence to generate the required intelligence, while the need to monitor the pipelines and processes in data-processing and analytics environments has driven the emergence of a new category of software: data observability.
The use of artificial intelligence (AI) with machine learning (ML) will be the single most important trend in business software this decade because it can multiply the investment value of such applications and provide vendors an important source of differentiation to achieve a competitive advantage in what are today very mature software categories. I assert that by 2025, almost all Office of Finance software vendors will have incorporated some AI capabilities to reduce workloads and improve performance. However, software vendors will be challenged to apply innovations in this area quickly while ensuring that the AI capabilities function well enough in the real world to foster rapid adoption while avoiding user frustration. The failures of the Apple Newton and Microsoft’s Clippy office assistant stand out as examples of too-ambitious-too-soon attempts at infusing intelligent automation.
Many organizations invest in data governance out of concern over misuse of data or potential data breaches. These are important considerations and valid aspects of data governance programs. However, good data governance also has positive impacts on organizations. For example, I have previously written about the valuable connection between the use of data catalogs and satisfaction with an organization’s data lake. Our most recent Analytics and Data Benchmark Research demonstrates some of the beneficial links between data governance and analytics. In this Perspective, I’ll share some of the correlations identified in our research.
A formal Voice of the Customer (VoC) program is a necessity for any organization that wants to grow its customer base and differentiate from its competitors. Unfortunately, many organizations have not updated their notion of “formal” in quite a few years.
As I stated when joining Ventana Research, the socioeconomic impacts of the pandemic and its aftereffects have highlighted more than ever the differences between organizations that can turn data into insights and are agile enough to act upon it and those that are incapable of seeing or responding to the need for change. Data-driven organizations stand to gain competitive advantage, responding faster to worker and customer demands for more innovative, data-rich applications and personalized experiences. One of the key methods that accelerates business decision-making is reducing the lag between data collection and data analysis.
Revenue performance management and the role of revenue operations (RevOps) are moving to the forefront of sales organizations, aligning departments around a single view of the business with shared revenue targets and goals. This facilitates the needs of the sales department as well as customer experience, marketing and renewals. The concept of RevOps does not yet have a widely shared common definition within organizations. Because revenue organizations include workers associated with sales operations, there tends to be a bias that RevOps leans towards sales management with the addition of customer success for retention and marketing.
I am happy to share insights gleaned from our latest Value Index research, an assessment of how well vendors’ offerings meet buyers’ requirements. The Ventana Research Value Index: Business Planning 2022 is the distillation of a year of market and product research. Drawing on our Benchmark Research, we apply a structured methodology built on evaluation categories that reflect real-world criteria incorporated in a request for proposal to business planning vendors supporting the spectrum of planning. Using this methodology, we evaluated vendor submissions in seven categories: five relevant to the product experience (Adaptability, Capability, Manageability, Reliability and Usability) and two related to the customer experience (Total Cost of Ownership/Return on Investment and Vendor Validation).
I recently described how the data platforms landscape will remain divided between analytic and operational workloads for the foreseeable future. Analytic data platforms are designed to store, manage, process and analyze data, enabling organizations to maximize data to operate with greater efficiency, while operational data platforms are designed to store, manage and process data to support worker-, customer- and partner-facing operational applications. At the same time, however, we see increased demand for intelligent applications infused with the results of analytic processes, such as personalization and artificial intelligence-driven recommendations. The need for real-time interactivity means that these applications cannot be served by traditional processes that rely on the batch extraction, transformation and loading of data from operational data platforms into analytic data platforms for analysis. Instead, they rely on analysis of data in the operational data platform itself via hybrid data processing capabilities to accelerate worker decision-making or improve customer experience.
Having just completed the 2022 Ventana Research Value Index for Business Planning, I want to share some of my observations about the business planning software market and how it has advanced as an important part of our market coverage for almost two decades. Dedicated applications for planning and budgeting have been around since the 1980s and are, therefore, quite mature, with robust features and functionality as well as continual refinements in usability and performance. Outwardly, the specifications for offerings in this category appear very similar, but how the software works is at least as important to buyers’ preferences. Moreover, planning is not a mechanical process, so despite limited differentiation at the surface, an organization can find that one vendor’s offering is a better fit for its individual approach to planning than others.
Organizations of all sizes are dealing with exponentially increasing data volume and data sources, which creates challenges such as siloed information, increased technical complexities across various systems and slow reporting of important business metrics. Migrating to the cloud does not solve the problems associated with performing analytics and business intelligence on data stored in disparate systems. Also, the computing power needed to process large volumes of data consists of clusters of servers with hundreds or thousands of nodes that can be difficult to administer. Our Analytics and Data Benchmark Research shows that organizations have concerns about current analytics and BI technology. Findings include difficulty integrating data with other business processes, systems that are not flexible enough to scale operations and trouble accessing data from various data sources.
Ventana Research recently announced its 2022 Market Agenda for the Office of Finance, continuing the guidance we have offered since 2003 on the practical use of technology for the finance and accounting department. Our insights and best practices aim to enable organizations to operate with agility and resiliency, improving performance and delivering greater value as a strategic partner.
Ventana Research recently announced its 2022 Market Agenda for the Office of Revenue, continuing the guidance we have offered for nearly two decades to help organizations realize optimal value from applying technology to improve business outcomes. Chief sales and revenue officers and their associated operations teams are experts in their respective fields but may not have the guidance needed to employ technology effectively. As we look to 2022, we are focusing on the entire selling and buying life cycle and the applications that simplify and improve interactions throughout the customer experience.