In today’s organizations, the sheer volume of analytics and the many permutations of dashboards challenge workers’ ability to take contextual action efficiently. Unfortunately, conventional wisdom for investing in analytics does not recognize the benefits of empowering the workforce to understand the situation, examine options and work together to make the best possible decision.
Since its inception, HCM software has upended how people interact with their workplace. Paper resumes have given way to online applications. Physical time clocks have largely disappeared in favor of apps or clocks housed within a point-of-sale system. Even benefits enrollment has entered the digital age, adding tools like decision-support modeling to help enrollees determine which of the myriad offerings will best fit their specific circumstances. This digital transformation does more than just improve productivity and drive compliance. It helps to create an experience for workers that can build trust and engagement. In our research, 64% of organizations indicated that delivering a superior employee experience is a top priority. We assert that by 2025, two-thirds of organizations will require artificial intelligence (AI) in all HCM and adjacent systems to curate personalized experiences for all workers to drive engagement, productivity and retention. UKG, a company formed in 2020 by the merger of two HCM technology giants, Ultimate Software and Kronos, excels in this arena.
The management of work is a focal point for every organization, each of which directs people and resources to accomplish tasks from the smallest to the largest. But many organizations struggle to manage complex activities because the details of how people are assigned and complete work are not as simple as they should be. Traditional project management methods and technologies have failed to work at enterprise scale, so new approaches have emerged to meet today’s challenges. The essence of work management is to ensure the automation and intelligence afforded by the technology industry are infused into how organizations operate, as outlined in my perspective.
Organizations conduct data analysis in many ways. The process can include multiple spreadsheets, applications, desktop tools, disparate data systems, data warehouses and analytics solutions. This makes it difficult for management to provide and maintain up-to-date information across multiple departments. Our Analytics and Data Benchmark Research shows that organizations face a variety of challenges with analytics and business intelligence. One-third of participants find it difficult to integrate analytics and BI with other business processes. Participants also find that not all software is flexible enough for the constantly changing business environment, and that it is hard to access all data sources.
Workiva offers an environmental, social and governance application that enables organizations to manage the highly distributed tasks necessary for reporting to regulators and stakeholders on ESG matters. ESG issues have grown increasingly pressing over the past few years as investors and government entities urge organizations to measure and disclose relevant metrics. I’ve already covered the broader topic as it relates to external reporting and how financial planning and analysis groups are likely to own this mandate going forward. I’ve also addressed the data strategy that finance organizations should adopt to meet regulatory compliance requirements. Notably, I assert that by 2025, more than one-half of corporations required to comply with ESG reporting will centralize responsibility for preparing reports and filings with financial planning and analysis to achieve accuracy, control and efficiency objectives.
For far too long, business intelligence technologies have left the rest of the exercise to the reader. Many of these tools do an excellent job providing information in an interactive way that lets organizations dive into the data and learn a lot about what has happened across all aspects of the business. More recently, many of these tools have added augmented intelligence capabilities that help explain why things happened. But rarely have any of these tools provided information about what to do, or how to evaluate the alternative ways in which an organization might respond.
The shift from on-premises server infrastructure to cloud-based and software-as-a-service (SaaS) models has had a profound impact on the data and analytics architecture of many organizations in recent years. More than one-half of participants (59%) in Ventana Research’s Analytics and Data Benchmark research are deploying data and analytics workloads in the cloud, and a further 30% plan to do so. Customer demand for cloud-based consumption models has also had a significant impact on the products and services that are available from data and analytics vendors. Data platform providers, both operational and analytic, have had to adapt to changing customer demand. The initial response — making existing products available for deployment on cloud infrastructure — only scratched the surface in terms of responding to emerging expectations. We now see the next generation of products, designed specifically to deliver innovation by taking advantage of cloud-native architecture, being brought to market both by emerging startups and by established vendors, including InterSystems.
Analytics processes are all about how organizations use data to create metrics that help manage and improve operations. Yet, the discipline applied to analytics processes seems to be lacking compared to data processes. I’ve pointed out that the weak link in data governance is often analytics. Organizations can also do a better job tying AnalyticOps to DataOps and do more to define and manage metrics. Our research has shown that creating and managing metrics in a semantic model improves analytics processes.
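Defining metrics once in a semantic model, rather than in each spreadsheet or dashboard, is the discipline the research points to. The sketch below illustrates the idea in minimal form; the model structure, metric names and sample data are all hypothetical, not drawn from any particular product.

```python
# A minimal sketch of a semantic model: each metric is defined once,
# so every analytics process computes it the same way.
# All names and data here are illustrative.

SEMANTIC_MODEL = {
    "revenue": {
        "description": "Gross revenue per order line",
        "expression": lambda row: row["units"] * row["unit_price"],
    },
    "discount_amount": {
        "description": "Total discount given per order line",
        "expression": lambda row: row["discount"],
    },
}

def compute_metric(name, rows):
    """Evaluate a governed metric definition over raw records."""
    expr = SEMANTIC_MODEL[name]["expression"]
    return sum(expr(r) for r in rows)

orders = [
    {"units": 10, "unit_price": 5.0, "discount": 5.0},
    {"units": 2, "unit_price": 20.0, "discount": 0.0},
]
print(compute_metric("revenue", orders))  # 90.0
```

Because every report calls `compute_metric` against the same definitions, a change to a metric is made once and propagates everywhere — the tie between AnalyticOps and DataOps the paragraph describes.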
There is always space for innovation in the data platforms sector, and new vendors continue to emerge at regular intervals with new approaches designed to serve specialist data storage and processing requirements. Factors including performance, reliability, security and scalability provide a focal point for new vendors to differentiate from established vendors, especially for the most demanding operational or analytic data platform requirements. It is never easy, however, for developers of new data platform products to gain significant market traction, given the dominance of the established relational database vendors and cloud providers. Targeting requirements that are not well-served by general-purpose data platforms can help new vendors get a foot in the door of customer accounts. The challenge to gaining further market traction is for new vendors to avoid having products become pigeonholed as only being suitable for a niche set of requirements. This is precisely the problem facing the various distributed SQL database providers.
Ventana Research uses the term “data pantry” to describe a method of data storage (and the technology and process blueprint for its construction) created for a specific set of users and use cases in business-focused software. It’s a pantry because all the data one needs is readily available and easily accessible, with labels that are immediately recognized and understood by the users of the application. In tech speak, this means the semantic layer is optimized for the intended audience. It is stocked with data gathered from multiple sources and immediately available for analysis, forecasting, planning and reporting. This does away with the need for analysts to repeatedly perform data extraction, enrichment or transformation from the required source systems, all but eliminating the substantial amount of time analysts and business users routinely spend on data preparation.
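The pantry metaphor can be sketched in a few lines of code: source-system fields are gathered once and relabeled into terms the business audience recognizes. The field names and mapping below are hypothetical, purely to illustrate the semantic-layer idea.

```python
# Illustrative sketch of "stocking" a data pantry: raw fields from
# source systems are relabeled under business-friendly names, so
# analysts never repeat the extraction and preparation work.
# All field names here are hypothetical.

SOURCE_TO_PANTRY = {
    "erp.gl_acct_4000_bal": "Revenue",
    "crm.opp_amt_weighted": "Weighted Pipeline",
    "hcm.headcount_fte": "Headcount (FTE)",
}

def stock_pantry(raw_records):
    """Relabel raw source fields into terms the intended audience recognizes."""
    return [
        {SOURCE_TO_PANTRY.get(field, field): value for field, value in rec.items()}
        for rec in raw_records
    ]

raw = [{"erp.gl_acct_4000_bal": 125000, "hcm.headcount_fte": 42}]
print(stock_pantry(raw))
# [{'Revenue': 125000, 'Headcount (FTE)': 42}]
```

The point of the design is that the mapping is built once, by whoever constructs the pantry, and every downstream analysis, forecast and report reads the friendly labels.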
In previous perspectives in this series, I’ve discussed some of the realities of cloud computing including costs, hybrid and multi-cloud configurations and business continuity. This perspective examines the realities of security and regulatory concerns associated with cloud computing. These issues are often cited by our research participants as reasons they are not embracing the cloud. To be fair, the majority of our research participants are embracing the cloud. However, among those that have not yet made the transition to the cloud, security and regulatory concerns are among the most common issues cited across the various studies we have conducted.
In the face of a very uncertain future, companies have been discovering the value of rapid planning and budgeting cycles. As events unfold, they’re changing expectations for the future significantly on a daily or weekly basis. However, even when the world returns to a steadier state, companies will benefit from making their planning and budgeting processes faster, easier, more relevant, more strategic, more agile and more accurate.
Recently, I suggested you need to “mind the gap” between data and analytics. This perspective addresses another gap — the gap in skills between business intelligence (BI) and artificial intelligence/machine learning (AI/ML).
After decades of overpromising and underdelivering, technology has now evolved to the point where it is fundamentally changing how accountants work — for the better. The pandemic and resulting support of remote work set the stage for a transformation of how accounting efforts are structured and performed. Remote audits that became routine during lockdowns are evolving into virtual ones, where auditors take full advantage of advanced software to achieve dependably higher audit quality with less effort, while improving working conditions for auditors and staff accountants. Although discussions I’ve had with practitioners over the past two years indicate that organizations are using this approach to some extent, widespread use has become practical only recently.
The market and buyer landscape for contact center operating services has changed significantly since the onset of the pandemic, now almost three years ago. Three years would have been enough time for some significant shifts, even without the pressure the pandemic put on service operations. Nevertheless, with on-premises systems now taking a backseat industrywide, it’s fair to say that CCaaS, which typically refers to cloud-based systems, now represents the lion’s share of spending and therefore stands as a proxy for the industry as a whole. Ventana Research predicts that by 2026, seven in 10 organizations will have moved all or part of their contact center technology into the cloud to attain greater flexibility and scalability.
The technology industry has established itself as a pivotal force in its ability to help organizations become more intelligent and automated. But doing so has required a journey of epic proportions for most organizations, which have had to endure a transition of competencies and skills that was, in many places, handed off to consulting firms hired to manage the changes. Unfortunately, this step led, in many cases, to an extended focus on digital transformation rather than the necessary modernization of business processes and technology. Through 2024, after concerted investment into digital transformation, one-half of organizations will require a new digital business and technology agenda for organizational resilience.
Earlier this year, I wrote about the increasing importance of data observability, an emerging product category that takes advantage of machine learning (ML) and Data Operations (DataOps) to automate the monitoring of data used for analytics projects to ensure its quality and lineage. Monitoring the quality and lineage of data is nothing new. Manual tools exist to ensure that it is complete, valid and consistent, as well as relevant and free from duplication. Data observability vendors, including Monte Carlo Data, have emerged in recent years with the goal of increasing the productivity of data teams and improving organizations’ trust in data using automation and artificial intelligence/machine learning (AI/ML).
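The checks these products automate — completeness, duplication and the like — can be illustrated in a few lines. The sketch below is a generic illustration of such checks, not Monte Carlo Data’s product or API, and the data is invented.

```python
# A hedged sketch of the kinds of checks a data observability tool
# automates and runs continuously over pipelines. Generic illustration
# only; field names and data are hypothetical.

def check_completeness(rows, required):
    """Return indexes of records missing or holding null required fields."""
    return [i for i, r in enumerate(rows)
            if any(r.get(field) is None for field in required)]

def check_duplicates(rows, key):
    """Return indexes of records whose key value has already been seen."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        k = r.get(key)
        if k in seen:
            dupes.append(i)
        seen.add(k)
    return dupes

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # incomplete record
    {"id": 1, "amount": 10.0},   # duplicate key
]
print(check_completeness(rows, ["id", "amount"]))  # [1]
print(check_duplicates(rows, "id"))                # [2]
```

What the observability category adds over running such checks by hand is automation at scale: learning expected patterns from history and alerting when fresh data deviates, rather than relying on someone to remember to look.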
Emburse offers a single platform that enables organizations — small, midsize and larger — to manage their travel and related expenses, pay invoices and handle their corporate spend. Today, technology has the ability to significantly increase the efficiency with which organizations handle expenditures while simultaneously containing costs, increasing controls and improving visibility into where the money is going. This is part of a broader trend toward digitizing outlays: I assert that by 2025, more than two-thirds of organizations will be using spend management software and corporate cards to achieve greater control and increased efficiency.
One of the most significant considerations when choosing an analytic data platform is performance. As organizations compete to benefit most from being data-driven, the lower the time to insight the better. As data practitioners have learned over time, however, lowering time to insight is about more than just high-performance queries. There are opportunities to improve time to insight throughout the analytics life cycle, which starts with data ingestion and integration, includes data preparation and data management as well as data storage and processing, and ends with data visualization and analysis. Vendors focused on delivering the highest levels of analytic performance, such as SQream, understand that lowering time to insight relies on accelerating every aspect of that life cycle.
Embedded business intelligence (BI) continues to transform the business landscape, enabling organizations to quickly interpret data and convert it into actionable insights. It allows organizations to extract information in real time and answer wide-ranging business questions. Embedding analytics helps tackle the time-consuming process of extracting information from data. Our research shows organizations spend more time cleaning and optimizing data for analysis than creating insights. On top of that, they are adding more data sources and information systems, which in turn introduces more complexity. Our Analytics and Data Benchmark Research shows that organizations face various challenges with analytics and BI. More than one-third of participants (35%) responded that they find it hard to integrate analytics and BI with business processes and connect to multiple data sources. By embedding analytics and BI into business processes and workflows, organizations can enable users to make critical decisions fast, enhancing overall business agility.
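Embedding an analytic directly in a workflow step, rather than in a separate BI tool, is easiest to see in code. The sketch below is hypothetical: the metric, the approval rule and the threshold are invented to illustrate insight surfacing at the point of decision.

```python
# Illustrative sketch of embedding an analytic inside a business
# workflow, so the insight appears at the moment of decision rather
# than in a standalone dashboard. Names and thresholds are hypothetical.

def approve_order(customer, order_amount, max_utilization=0.8):
    """Workflow step that computes credit utilization inline and
    returns the decision alongside the analytic that drove it."""
    projected = (customer["open_balance"] + order_amount) / customer["credit_limit"]
    return {
        "approved": projected <= max_utilization,
        "projected_utilization": round(projected, 2),
    }

cust = {"open_balance": 6000.0, "credit_limit": 10000.0}
print(approve_order(cust, 1500.0))
# {'approved': True, 'projected_utilization': 0.75}
```

The user never leaves the order screen: the analytic and the action it informs live in the same step, which is the agility gain the paragraph describes.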
Managing corporate income taxes is a challenge for chief financial officers and their tax department professionals. Tax codes are often complex, so tax accounting as well as the data required for tax provisions and tax compliance are different enough from statutory accounting to create significant workloads for the tax department. The provision for income tax expense and, for public companies, the assembly of information related to tax-related disclosures, can be a factor holding up the completion of the accounting close.
Organizations are increasingly utilizing cloud object storage as the foundation for analytic initiatives. There are multiple advantages to this approach, not least of which is enabling organizations to keep higher volumes of data relatively inexpensively, increasing the amount of data queried in analytics initiatives. I assert that by 2024, six in 10 organizations will use cloud-based technology as the primary analytics data platform, making it easier to adopt and scale operations as necessary.