The learning management system technology market has evolved dramatically over the past two decades. Learning management systems, now commonly referred to as learning experience platforms, are an integral resource for any organization concerned about productivity, organizational agility and operational excellence. These technologies enable organizations to demonstrate an investment in people. The LMS not only facilitates regulatory and legal compliance and other forms of cost and risk avoidance, but also improves internal mobility, career growth and the employee experience, leading to greater employee productivity, engagement and retention.
I have recently written about the organizational and cultural aspects of being data-driven, and the potential advantages data-driven organizations stand to gain by responding faster to worker and customer demands for more innovative, data-rich applications and personalized experiences. I have also explained that data-driven processes require more agile, continuous data processing, with an increased focus on extract, load and transform processes, as well as change data capture, automation and orchestration, as part of a DataOps approach to data management. Safeguarding the health of data pipelines is fundamental to ensuring data is integrated and processed in the sequence required to generate business intelligence. The significance of these data pipelines to delivering data-driven business strategies has led to the emergence of vendors, such as Astronomer, focused on enabling organizations to orchestrate data engineering pipelines and workflows.
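To make the orchestration concept concrete, here is a minimal sketch of an extract, load and transform pipeline expressed as a directed acyclic graph of tasks in Apache Airflow, the open-source orchestrator at the core of Astronomer’s platform. The task names and placeholder functions are illustrative assumptions, not drawn from any vendor’s product:

```python
# A minimal, illustrative ELT pipeline expressed as an Apache Airflow DAG.
# The extract/load/transform functions are placeholders for real source,
# warehouse and transformation logic.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a source system (placeholder data).
    return [{"id": 1, "amount": 42}]


def load(**context):
    # Land the raw records before transforming them (the "EL" in ELT).
    records = context["ti"].xcom_pull(task_ids="extract")
    print(f"loaded {len(records)} raw records")


def transform():
    # Apply business rules once the data has landed in the warehouse.
    print("transforming loaded records")


with DAG(
    dag_id="elt_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Orchestration: extract, then load, then transform, in sequence.
    extract_task >> load_task >> transform_task
```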
Workday held its first in-person Rising user group meeting since 2019 in Orlando. Three topics are worth commenting on: Workday’s Extend offering, its industry accelerators and its progress with the Workday Adaptive Planning offering.
In my first perspective on cloud computing realities, I covered some of the cost considerations associated with cloud computing and how the cloud costing model may be different enough from on-premises models that some organizations are taken by surprise. In this perspective, I’d like to focus on the realities of hybrid and multi-cloud deployments.
Organizations are collecting data from a wide variety of sources and systems to enrich their analytics and business intelligence (BI). But collecting data is only half of the equation. As the data grows, it becomes challenging to find the right data at the right time. Many organizations can’t take full advantage of their data lakes because they don’t know what data actually exists. Also, there are more regulations and compliance requirements than ever before. It is critical for organizations to understand the kind of data they have, who is handling it, what it is being used for and how it needs to be protected. They also have to avoid putting too many layers and wrappers around the data, as doing so can make the data difficult to access. These challenges create a need for more automated ways to discover, track, research and govern the data.
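As a hypothetical illustration of what more automated discovery can look like, the sketch below walks a directory of CSV files in a data lake, records each dataset’s columns and row count, and flags column names that suggest personal data so those datasets can be routed into governance workflows. The paths, heuristics and catalog structure are assumptions for illustration only:

```python
# Illustrative sketch of automated data discovery over a file-based data lake.
# The directory path, PII heuristics and catalog structure are hypothetical;
# real catalog tools use far richer metadata, classifiers and lineage.
import csv
from pathlib import Path

PII_HINTS = ("email", "phone", "ssn", "birth", "address")


def profile_dataset(path: Path) -> dict:
    """Build a simple catalog entry: schema, row count and possible PII columns."""
    with path.open(newline="") as f:
        reader = csv.reader(f)
        columns = next(reader, [])
        row_count = sum(1 for _ in reader)
    flagged = [c for c in columns if any(h in c.lower() for h in PII_HINTS)]
    return {
        "dataset": path.name,
        "columns": columns,
        "rows": row_count,
        "possible_pii": flagged,
    }


def scan_lake(root: str) -> list[dict]:
    """Discover every CSV under the root and return a catalog entry for each."""
    return [profile_dataset(p) for p in Path(root).rglob("*.csv")]


if __name__ == "__main__":
    for entry in scan_lake("/data/lake"):  # hypothetical data lake root
        print(entry)
```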
Kinaxis recently announced it has acquired MPO, a Netherlands-based company whose cloud-based software orchestrates multiparty supply chain execution. The combination is designed to enable Kinaxis to extend its concurrent planning platform to handle core elements of supply chain execution. Kinaxis acquired all the shares of MPO for approximately US$45 million, with some of the final consideration dependent on performance. MPO will continue to operate as a standalone business, but will be increasingly integrated into Kinaxis’ operations worldwide.
The data catalog has become an integral component of organizational data strategies over the past decade, serving as a conduit for good data governance and facilitating self-service analytics initiatives. The data catalog has become so important, in fact, that it is easy to forget that just 10 years ago it did not exist as a standalone product category. Metadata-based data management functionality has had a role to play within products for data governance and business intelligence for much longer than that, of course, but the emergence of the data catalog as a product category provided a platform for metadata-based data inventory and discovery that could span an entire organization, serving multiple departments, use cases and initiatives.
I have written about vendor efforts to use artificial intelligence (AI) and advanced analytics in their applications targeted at sales and revenue teams to improve focus and prioritize activities, both for pipeline management and for individual opportunities. Since then, vendors have continued to innovate, and there have been more releases showcasing efforts to aid sales and revenue. With this continuing innovation, we believe that by 2026, two-thirds of revenue leaders will begin considering a new generation of revenue analytics and data-driven applications designed to improve performance and productivity.
Today’s contact centers need to revisit core assumptions around measuring agent performance. Changes in business conditions influencing agent engagement raise new questions about whether traditional performance models are sufficient to address the more complex customer needs that have taken center stage in recent years.
Business intelligence has evolved. It now includes a spectrum of analytics, one of the most promising of which has been described as augmented intelligence. Some organizations have used the term to describe the practical reality that artificial intelligence with machine learning is not replacing human intelligence, but augmenting it. The term also represents the application of AI/ML to make business intelligence and analytics tools more powerful and easier to use. It’s this latter usage that I prefer and that I’d like to explore in this perspective.
Organizations do not live in a vacuum, and things happening outside their walls have a direct impact on how they perform. So, it is essential for them to incorporate external data in their forecasting, planning and budgeting, especially for predictive analytics and machine learning (ML) to support artificial intelligence (AI). I use the term external data to include any information about the world outside an organization (including economic and market statistics), competitors (such as pricing and locations), and customers. Until recently, it was adequate for organizations to regard external data as a “nice to have” item, but that is no longer the case. External data is necessary for many functions, including useful and accurate competitive intelligence used by sales and marketing groups. It is also essential for the effective application of AI using ML for business-focused planning and budgeting and predictive analytics.
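As a simple, hypothetical example of putting external data to work, the sketch below enriches an internal sales history with an external economic indicator and fits a basic predictive model. The column names, figures and indicator are invented for illustration and are not real benchmarks:

```python
# Illustrative only: blending internal sales data with an external economic
# indicator before fitting a simple predictive model. All figures are invented.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Internal data: monthly unit sales.
sales = pd.DataFrame({
    "month": ["2024-01", "2024-02", "2024-03", "2024-04"],
    "units": [120, 135, 128, 150],
})

# External data: a hypothetical consumer-confidence index for the same months.
external = pd.DataFrame({
    "month": ["2024-01", "2024-02", "2024-03", "2024-04"],
    "confidence_index": [98.2, 101.5, 99.8, 104.1],
})

# Enrich the internal history with the external indicator.
training = sales.merge(external, on="month")

# Fit a simple model that uses the external signal to explain unit sales.
model = LinearRegression()
model.fit(training[["confidence_index"]], training["units"])

# Forecast next month's units given an assumed external reading.
next_month = pd.DataFrame({"confidence_index": [105.0]})
print(model.predict(next_month))
```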
Payroll Management is one of the six major focus areas in the Human Capital Management research and advisory practice at Ventana Research. “Continuous payroll” is a hot topic in this area, with much discussion about the always-on nature of this enhanced payroll function and its related demands for supporting technologies. Advancements in payroll technologies and practices have paved the way for off-cycle payroll transactions and pay modes, such as earned wage access, to become the new normal. Injecting continuous payroll practices into global organizations operating in multiple countries, each with its own pay-related laws, regulations and customs, requires complex functionality that many payroll systems simply do not have. Managing global payroll requires expertly combining human knowledge and intelligent systems to meet both international and regional requirements.
Outbound communication is used in a number of different contexts. For potential customers, traditional telemarketing still exists, though it is limited these days due to its minimal effectiveness. Instead, many customer-experience planners have substituted digital outbound channels for voice in lead generation and nurturing campaigns. Customers find messages in the channel of their choice to be much less intrusive, and they are considerably less expensive than having contact center agents reach out.
I recently wrote about the need for organizations to take a holistic approach to the management and governance of data in motion alongside data at rest. As adoption of streaming data and event processing increases, it is no longer sufficient for streaming data projects to exist in isolation. Data needs to be managed and governed regardless of whether it is processed in batch or as a stream of events. This requirement has resulted in established data management vendors increasing their focus on streaming data and event processing through product development as well as acquisitions. It has also resulted in streaming and event specialists, such as Confluent, adding centralized management and governance capabilities to their existing offerings as they seek to establish or reinforce the strategic importance of streaming data as part of a modern approach to data management.
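To illustrate the principle of governing data consistently across batch and streaming, the sketch below applies one shared schema to records whether they arrive as a file or as individual events. The schema, record shapes and function names are hypothetical and are not drawn from Confluent’s products:

```python
# Illustrative sketch: one governed schema applied to records whether they are
# processed as a batch or consumed one event at a time. The schema and records
# are hypothetical; production systems would typically use a schema registry.
from jsonschema import ValidationError, validate

ORDER_SCHEMA = {
    "type": "object",
    "required": ["order_id", "amount"],
    "properties": {
        "order_id": {"type": "string"},
        "amount": {"type": "number", "minimum": 0},
    },
}


def conforms(record: dict) -> bool:
    """Accept a record only if it matches the shared, governed schema."""
    try:
        validate(instance=record, schema=ORDER_SCHEMA)
        return True
    except ValidationError:
        return False


def process_batch(records: list[dict]) -> list[dict]:
    """Batch path: filter an entire file's worth of records at once."""
    return [r for r in records if conforms(r)]


def process_event(record: dict) -> None:
    """Streaming path: the same rule applied to each event as it arrives."""
    print("accepted" if conforms(record) else "rejected", record)


if __name__ == "__main__":
    sample = [{"order_id": "A1", "amount": 10.5}, {"order_id": "A2", "amount": -3}]
    print(process_batch(sample))
    for event in sample:
        process_event(event)
```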
General Omar Bradley is credited with saying, “Amateurs study strategy, professionals study logistics.” This is a battlefield commander’s perspective on the often-overlooked importance of mastering the nitty-gritty in achieving military objectives. I think the same is true when it comes to data in business computing because, in my experience, it is often an overlooked or secondary consideration.
You would be forgiven for thinking that no one buys anything in person anymore, given the pages of digital ink spilled over the rise of digital commerce led by Amazon. However, one quick errand run on a Saturday morning easily gives the lie to this, as parking lots are full, not just at grocery stores but also at everyday retail and big box stores. Likewise, in business-to-business (B2B) commerce, despite the advertised demise, person-to-person sales are still a major part of B2B purchases.
People analytics enables organizations to gain data-driven insights that optimize the impact and value of the workforce. For decades, human capital management (HCM) leaders have been sold tools marketed as analytics that were no more than dashboards filled with nice visualizations of historical data, with no context as to what each individual data point meant for their strategic objectives and initiatives. And yet, our recent Analytics and Data Benchmark Research shows that 83% of organizations indicate that dashboards are very important or are currently in use for analytics. A dashboard, while useful for a snapshot view of key metrics, is not an analytics tool. Today, advances in technology allow systems to provide actionable insights into potential people risks or opportunities before it’s too late.
Zoho presented analysts with a deep look at its strategy and roadmap at its July analyst conference, describing how it intends to meld its many business applications together through integration at the platform level. The company, which is privately owned and funded, has generally sought to build its own tools rather than buy or partner. This approach has allowed the firm to create a suite of tightly linked tools that share a common interface.
The migration to the cloud is unmistakable: organizations are adopting cloud computing for a wide variety of applications and use cases. Managed cloud services, commonly referred to as software as a service (SaaS), offer many benefits to organizations, including significantly reduced labor costs for system administration and maintenance, as many of these costs are shifted to the software vendor. SaaS also provides organizations with faster time to value as they adopt new technologies, eliminating the need to acquire and configure hardware or to install software. In fact, we assert that by 2025, nine in 10 organizations will be using multiple cloud applications in order to minimize the costs of administration and maintenance. Yet, there are some challenges associated with cloud computing I’d like to address in a series of Analyst Perspectives: