Ventana Research Analyst Perspectives

SAP Opens Road for HANA and Big Data at SAPPHIRE NOW

Posted by Ventana Research on May 27, 2011 7:53:32 AM

At this year’s SAPPHIRE NOW conference (Twitter: #SAPPHIRENOW) SAP demonstrated its in-memory computing technology and applications. SAP’s High-Performance Analytic Appliance (HANA), which I think of as a high-availability network appliance, is part of the technology industry movement to increase performance and scalability across a range of applications, from analytics to transactions, to drive timely insights on data or real-time interactions across a business value chain that includes everyone from customers to suppliers. As part of the in-memory computing initiative, SAP demonstrated its in-memory database, which uses a columnar data store that employs technology SAP acquired with the Sybase IQ product. As I noted before the conference, in-memory technology is part of a major new focus for this global business applications company.

Read More

Topics: Sales Performance, SAP, Supply Chain Performance, Sustainability, IT Performance, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, Business Technology, CIO, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Information Technology, Workforce Performance, CFO, COO

RIM Has a BlackBerry and PlayBook for Business

Posted by Ventana Research on May 25, 2011 5:46:14 AM

At its BlackBerry World conference earlier this month, RIM promoted its own tablet computer to challenge other providers’ tablet offerings. The BlackBerry PlayBook, which was unveiled at the beginning of 2011, addresses the growing demand for business mobility – a factor I noted as one of the five key business technology innovations of this year.

Read More

Topics: Sales Performance, Social Media, Supply Chain Performance, Sustainability, Google, Playbook, RIM, Smart Phones, IT Performance, IT Research, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, CIO, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Location Intelligence, Mobility, Operational Intelligence, Workforce Performance, Sybase, Mobile Industry, Tablets, Digital Technology

SAP Brews New Human Capital Management for the Cloud

Posted by Ventana Research on May 25, 2011 5:40:58 AM

At SAP’s annual SAPPHIRE NOW conference (Twitter: #SAPPHIRENOW) this month, the company introduced a new portfolio of human capital management applications that will be available on many devices and added mobility options for users, including offerings for smartphones and tablets and cloud computing. This move beyond the traditional on-premises approach of SAP’s ERP Human Capital Management product suite is a critical step forward for SAP if it is to remain relevant for HR organizations.

Read More

Topics: Big Data, Performance Management, Sales Performance, Social Media, Supply Chain Performance, Sustainability, Human Capital Management, Metrics, Mobile Applications, Business Technology Innovation, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Management, Workforce Performance, data mart, Talent Management, Workforce Analytics

IBM Provides Clarity for Finance

Posted by Ventana Research on May 25, 2011 5:37:32 AM

IBM Software recently held a user group conference called Vision 2011 that focused on its Clarity Systems acquisition’s users but also covered broader finance department topics. For me, the highlight of the show was the continued evolution and enrichment of the Clarity FSR external reporting application designed to automate the close-to-report cycle. This process is commonly referred to as “the last mile of finance,” a term coined by a now-defunct company, Movaris, and adopted by Gartner. If you think about it, though, it isn’t “the last mile” for the tens of thousands of companies that don’t publish financial statements and is only one of several important finance department processes that follow the accounting close (such as internal reporting and tax statement preparation). 

Finance departments have long needed to automate the assembly of periodic documents that combine words and numbers. These documents include the quarterly and annual reports public corporations are required to submit to the United States Securities and Exchange Commission (SEC), the Canadian Securities Administrators, the United Kingdom’s Financial Services Authority (FSA) and other agencies. Historically, companies have cobbled together these filings from bits of text created by a variety of people in several departments (chiefly finance and legal), using numbers that come from a range of sources. These sources include accounting data from a consolidation system, other enterprise systems, data warehouses and spreadsheets that track headcount, leased premises, stock performance, advertising expense and executive compensation, to name just five. 

FSR automates the document creation process, eliminating the need to perform repetitive, mechanical functions and reducing the time needed to ensure accuracy and the time spent managing the process. Manually assembling this information into a document has always been a chore, even after word processing and spreadsheets were adapted to this purpose decades ago. These filings are legal documents that must be completely accurate and conform to mandated presentation styles. They require careful review to ensure accuracy and completeness. Complicating this effort recently are increasingly stringent deadlines, especially in the U.S. Anyone who has been a party to these efforts knows that there can be frequent changes in the numbers as they are reviewed by different parties, and those responsible need to ensure that any change to a number that occurs (such as the depreciation and amortization figure) is automatically reflected everywhere that amount is cited in the document (in this example, that would include the statement of cash flows, income statement, the text of the management discussion and analysis and the text or tables of one or more footnotes). Those managing the process spend a great deal of energy simply checking the document to ensure that the various sections include the latest wording, that the numbers are consistent in the tables and text, that amounts have been rounded properly (which can be really complicated) and that the right people have signed off on each and every part of the filing. FSR workflow-enables the process, meaning that handoffs are automated, participants get alerts if they haven’t completed their steps in timely fashion, and administrators can keep track of where everyone is in the process. 
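
To make the single-source-of-truth idea concrete, here is a minimal, hypothetical sketch (not the FSR product itself) of how automated document assembly keeps a figure consistent everywhere it appears: the number is stored once, and every table and narrative reference is rendered from that one value, so a late change propagates automatically.

```python
# Minimal illustration (not Clarity FSR): store each reported figure once,
# then render every table and narrative reference from that single value,
# so a late change propagates to the whole document automatically.

figures = {
    "depreciation_and_amortization": 412.7,  # hypothetical amount, in $ millions
}

templates = [
    "Cash flow statement: add back depreciation and amortization of "
    "${depreciation_and_amortization:,.1f} million.",
    "MD&A: operating expenses include ${depreciation_and_amortization:,.1f} "
    "million of depreciation and amortization.",
    "Footnote 7: depreciation and amortization totaled "
    "${depreciation_and_amortization:,.1f} million for the period.",
]

def render(figures: dict) -> list[str]:
    """Expand every template from the single set of figures."""
    return [t.format(**figures) for t in templates]

# A late revision is made in exactly one place...
figures["depreciation_and_amortization"] = 418.2
# ...and every citation of the amount is regenerated consistently.
for line in render(figures):
    print(line)
```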

Despite the fact that technology (specifically document management systems) has been widely available to automate the close-to-file process for a couple of decades, it was not widely adopted by finance departments. Some of this reflected the cost and effort required to deploy these heavy-duty systems and some was the usual “we’ve always done it this way” resistance to change. To be fair, about 50 years ago the SEC’s 10-K (annual report) and 10-Q filings were rather sparse and there wasn’t much to check. They have only gradually become the data- and disclaimer-rich documents we know today. Companies would have kept pulling these reports together manually except that the SEC mandated that they tag filings using eXtensible Business Reporting Language (XBRL). This represented a tipping point in the workload because although tagging the basic financial statements is not labor-intensive, the broader requirement for tagging footnotes is. This has been enough for many companies to adopt tools like Clarity FSR. 
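
For readers unfamiliar with what tagging entails, the sketch below builds a toy XBRL-style fact using only Python's standard library. The element names, namespace URIs, context and identifier are simplified placeholders for illustration, not a compliant US-GAAP/SEC instance document.

```python
# Toy illustration of XBRL-style tagging using only the standard library.
# Element names, namespaces and the context are simplified placeholders,
# not a compliant US-GAAP/SEC instance document.
import xml.etree.ElementTree as ET

NS = {"xbrli": "http://www.xbrl.org/2003/instance",
      "us-gaap": "http://example.com/us-gaap-simplified"}
for prefix, uri in NS.items():
    ET.register_namespace(prefix, uri)

root = ET.Element(f"{{{NS['xbrli']}}}xbrl")

# A context identifies the reporting entity and the period the fact applies to.
context = ET.SubElement(root, f"{{{NS['xbrli']}}}context", id="FY2010")
entity = ET.SubElement(context, f"{{{NS['xbrli']}}}entity")
ET.SubElement(entity, f"{{{NS['xbrli']}}}identifier",
              scheme="http://www.sec.gov/CIK").text = "0000000000"
period = ET.SubElement(context, f"{{{NS['xbrli']}}}period")
ET.SubElement(period, f"{{{NS['xbrli']}}}startDate").text = "2010-01-01"
ET.SubElement(period, f"{{{NS['xbrli']}}}endDate").text = "2010-12-31"

# The tagged fact: a single revenue number tied to its concept and context.
fact = ET.SubElement(root, f"{{{NS['us-gaap']}}}Revenues",
                     contextRef="FY2010", unitRef="USD", decimals="-6")
fact.text = "2895000000"

print(ET.tostring(root, encoding="unicode"))
```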

FSR, built on Microsoft software components, takes advantage of a wide familiarity with Excel and Word to reduce the amount of training required of end users. The time required to prepare the document is reduced, since once a company has configured its system to establish, in effect, a template, it’s relatively easy to create each quarterly or annual XBRL-tagged filing for the SEC. IBM Clarity has continued to incorporate new techniques in FSR for simplifying and further automating the creation and tagging processes. 

The users conference included a presentation by Time Warner, which was an early adopter of FSR. Its reasons for using the software to do the work, rather than relying on a third party (such as a financial printer or service provider), seem sound to me. Namely, it saves time and reduces the effort required to produce an accurate and complete document. Moreover (and personally I think this is extremely important), it gives those responsible for external financial reporting, the legal department and the company as a whole greater control over the process. Corporations can have more time (even a crucial day or two) to review what is in the document and concentrate more on what the document should contain rather than defaulting to what’s practical in the time allotted. (As they like to say in auditing, the threshold of materiality rises exponentially as deadlines near.) 

Although FSR was designed specifically for the SEC’s XBRL mandate, once it is in place it can be used in many other ways. For example, Time Warner is using it to file statutory reports in the U.K. The number of jurisdictions that require XBRL-tagged filings is increasing worldwide, and not just for periodic corporate financials. This is especially true for financial services companies engaged in banking and insurance. Companies can and should also offer their financial press releases in a tagged format to make it easier for analysts and investors to incorporate these numbers into their models at the time earnings are announced. (This was one of the reasons why XBRL was created.) 

Beyond external financial reporting, FSR can be used by finance organizations to create any periodic document (even ones simply for internal consumption) that combines words and numbers. This would be especially useful where multiple people must collaborate to produce narratives and collect data from multiple sources. It can cut the amount of time and effort required to produce them and it gives whoever is responsible a valuable administrative tool for automating workflows and monitoring the status of each component.

FSR has evolved from its original release, with ongoing improvements that have increased the efficiency of the process. I think finance departments in midsize and larger corporations, especially public companies, can benefit from utilizing a tool such as FSR. I also believe most companies that are outsourcing the tagging process and have avoided automating their document assembly are making a strategic mistake. The benefits of automation are greater and the net cost of using this sort of tool is much lower than they probably realize. I recommend that companies that are considering a tool for automating their periodic external filing include IBM Clarity FSR in their software evaluation list.

Best Regards,

Robert Kugel – SVP Research

Read More

Topics: Big Data, Performance Management, Social Media, Sustainability, Human Capital Management, Metrics, Mobile Applications, Business Analytics, Business Intelligence, Business Performance, Cloud Computing, Financial Performance, Governance, Risk & Compliance (GRC), Workforce Performance, data mart, Talent Management, Workforce Analytics

Conference Highlights Social Media, Analytics and the Customer Experience

Posted by Ventana Research on May 25, 2011 5:33:23 AM

The Directors Club of the U.K. recently held its inaugural National Customer Show in London. The event was well attended and attracted sponsorship from some of the biggest vendors in the contact center industry; among them were platinum sponsors Interactive Intelligence and salesforce.com, and session sponsors Nexidia and SwordCiboodle. I noticed three common themes, covering very different aspects of managing the customer, and I’ll hit the highlights of each.

Social Media

These days you can’t escape discussions about social media and the impact it is having on how companies interact with customers and prospects. As I recently wrote, social media is here, and millions of people are using it to communicate with friends, colleagues, businesses and even government. But all the hype, statistics and YouTube videos are masking the realities of business use. My research shows that as yet social media has had little impact on the contact center, and at the show I found confirmation of this point. I also heard more people than usual, even from salesforce.com, uttering cautionary words about social media. The reality is that business use outside of marketing and training videos is still low and consumer use is largely confined to complaining. Several people picked up the latter point; whereas in the past one complaint was heard by a few tens of people, now a single complaint may be heard by thousands (and potentially by millions) of people. Companies need to be aware of this; they need to monitor comments, take positive action on them and do whatever they can to keep the issues that generated the complaints from recurring. So the message was: monitor social media, have a process and people in place to take appropriate action, and have a process to address the root cause of customer issues.

Contact Center Analytics

Throughout the show, and at one session I chaired, I heard conversations about the need for companies to review their existing metrics and add new metrics that reflect their business goals, rather than settle for efficiency metrics that just show how well things are or are not working. The general consensus seemed to be that no one metric is going to fit the bill for all companies. Yes, net promoter scores add insight into potential new business, and good customer effort scores make sense from an efficiency and customer perspective because making it easy for customers to interact with your company is likely to generate more business at lower cost. But companies need a balanced set of metrics, drawn from the contact center analytics I recently researched, that reflects their business and priorities. I was pleased to find considerable support for my view that having “metrics for metrics’ sake” is pointless and that companies need to have in place processes that ensure action is taken based on their key performance metrics.
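
As a concrete example of the kind of balanced scoring discussed above, the sketch below computes a net promoter score and an average customer effort score from invented survey responses. It assumes the commonly used 0-10 NPS scale and a 1-5 effort scale; the data and thresholds are illustrative only.

```python
# Illustrative only: compute a net promoter score (0-10 scale) and an
# average customer effort score (1-5 scale) from invented survey data.

nps_responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 5]   # "How likely are you to recommend us?"
effort_responses = [2, 1, 3, 2, 4, 1, 2, 5, 1, 2]  # "How much effort did it take?" (1 = low)

def net_promoter_score(scores):
    """Percent promoters (9-10) minus percent detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def average_effort(scores):
    """Mean customer effort; lower means it was easier to do business with you."""
    return sum(scores) / len(scores)

print(f"NPS: {net_promoter_score(nps_responses):+.0f}")
print(f"Average customer effort: {average_effort(effort_responses):.1f} (1 = low effort)")
```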

The Customer Experience

As a concept, customer experience management is going the same way as CRM, in that it means many things to different people. For me it is about proactively managing the experience customers receive at any touch point. So when it comes to the most popular channel – calls to the contact center – CEM is about how agents handle each and every customer call.

This theme was echoed during one session I attended that connected customer experience with agent empowerment. My benchmark research into CEM has shown that the major influence on the customer experience is the agent’s attitude. The discussion took up the theme that if agents are not empowered to handle calls effectively then customers are likely to go away unhappy. Empowerment seemed to come down to doing some basic things well: process (not doing dumb things), training and coaching, motivation, and setting rewards and performance metrics that positively encourage agents to do a good job and deliver to the company’s business requirements.

Personally I like the theme “take the dumb out of handling customer interactions.” Too many times companies do dumb things: asking customers to repeat information they have given before, having metrics that drive agents to do the wrong things (keep calls short rather than solve the problem), using IVR menus that don’t match what customers want to do, providing inconsistent information on different channels – the list goes on. If companies would stand back and examine objectively the dumb things they are doing and put them right, we would all get a better experience.

In this day of social media, fixing broken processes is even more important. Dumb things will end up exposed in public. This raises the question of who should be responsible for social media, because one dumb response can cause more trouble than the original issue. As a result companies need to pay more attention to interaction-handling than ever before.

Are you ready to cope with this new environment? Can you be certain that interactions are being handled consistently and effectively across all channels? How is social media impacting the customer experience, and do you use analytics to gain better visibility into what you do not know? If so, I’d love to hear how you do it.

Regards

Richard Snow – VP & Research Director

Read More

Topics: Predictive Analytics, Sales Performance, Salesforce.com, Customer Analytics, Customer Experience, Customer Feedback Management, Social CRM, Speech Analytics, Voice of the Customer, Nexidia, Operational Performance, Analytics, Business Analytics, Business Collaboration, Business Performance, Cloud Computing, Customer & Contact Center, Customer Service, Call Center, Contact Center, Contact Center Analytics, CRM, Desktop Analytics, Interactive Intelligence, Text Analytics, Workforce Management, SwordCiboodle

IBM Chooses Hadoop Unity; Not Shipping the Elephant

Posted by Ventana Research on May 23, 2011 11:06:33 PM

Last week I attended the IBM Big Data Symposium at the Watson Research Center in Yorktown Heights, N.Y. The event was held in the auditorium where the recent Jeopardy shows featuring the computer called Watson took place and which still features the set used for the show – a fitting environment for IBM to put on another sort of “show” involving fast processing of lots of data. The same technology featured prominently in IBM’s big-data message, and the event was an orchestrated presentation more like a TV show than a news conference. Although it announced very little news at the event, IBM did make one very important statement: The company will not produce its own distribution of Hadoop, the open source distributed computing technology that enables organizations to process very large amounts of data quickly. Instead it will rely on and throw its weight behind the Apache Hadoop project – a stark contrast to EMC’s decision to do exactly that, announced earlier in the week. As an indication of IBM’s approach, Anant Jhingran, vice president and CTO for information management, commented, “We have got to avoid forking. It’s a death knell for emerging capabilities.”
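
For readers new to Hadoop, the sketch below shows the shape of a MapReduce job written for Hadoop Streaming, which lets any executable act as the mapper or reducer. It is a generic word-count example, not IBM's or Yahoo's code, and the file name and job-submission command in the comments are illustrative.

```python
#!/usr/bin/env python
# wordcount_streaming.py -- a generic Hadoop Streaming example (illustrative,
# not IBM's or Yahoo's code). Run the same file as mapper and reducer, e.g.:
#   hadoop jar hadoop-streaming.jar \
#     -input /data/text -output /data/counts \
#     -mapper "python wordcount_streaming.py map" \
#     -reducer "python wordcount_streaming.py reduce"
import sys

def mapper():
    # Emit one (word, 1) pair per word; Hadoop sorts pairs by key between phases.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Input arrives grouped by key, so a running total per word is enough.
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```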

The event brought together organizations presenting interesting and diverse use cases, ranging from traditional big-data stories from Web businesses such as Yahoo to less well known scenarios: informatics in life sciences and healthcare from Illumina and the University of Ontario Institute of Technology (UOIT), respectively, low-latency financial services from eZly and customer demographic data from Acxiom.

Eric Baldeschwieler, vice president of Hadoop development at Yahoo, shared some impressive statistics about Yahoo’s Hadoop usage, one of the largest deployments in the world with over 40,000 servers. Yahoo manages 170 petabytes of data with Hadoop and runs more than 5 million Hadoop jobs every month. The models it uses to help prevent spam and others that do ad targeting are in some cases retrained every five minutes to ensure they are based on up-to-date content. As a point of reference, CPU utilization on Yahoo’s Hadoop computing resources averages greater than 30% and at its best is greater than 80%. It appears from these figures that the Hadoop clusters are configured with enough spare capacity to handle spikes in demand.

During the discussions, I detected a bit of a debate about who is the driving force behind Hadoop. According to Baldeschwieler, Yahoo has contributed 70% of the Apache Hadoop project code, but on April 12, Cloudera claimed in a press release, “Cloudera leads or is among the top three code contributors on the most important Apache Hadoop and Hadoop-related projects in the world, including Hadoop, HDFS, MapReduce, HBase, Zookeeper, Oozie, Hive, Sqoop, Flume, and Hue, among others.” Perhaps Yahoo wants to reestablish its credentials as it mulls whether to spin out its Hadoop software unit. If such a spinoff were to occur, it could further fracture the Hadoop market.

I found it interesting that the customers IBM brought to the event, while having interesting use cases, were not necessarily leveraging IBM products in their applications. This fact led me to the initial conclusion that the event was more of a show than a news conference. Reflecting further on IBM’s stated direction of supporting the Apache Hadoop distribution, I wondered which IBM Hadoop-related products they would use. IBM will be announcing version 1.1 of InfoSphere BigInsights in both a free basic edition and an enterprise edition. The product includes BigSheets, which can integrate large amounts of unstructured Web data. InfoSphere Streams 2.0, announced in April, adds Netezza TwinFin, Microsoft SQL Server and MySQL support to other SQL sources already supported. But this event was not about those products. It was about IBM’s presence in and knowledge of the big-data marketplace. Executives did say that the IBM product portfolio will be extended “in all the places you would expect” to support big data but offered few specifics.

IBM emphasized the combination of streaming data, via InfoSphere Streams, and big data more than other big-data vendors do. The company painted a context of “three V’s” (volume, velocity and variety) of data, which attendees, Twitter followers and eventually the IBM presenters expanded to include a fourth V, validity. To illustrate the potential value of combining streaming data and big data, Dr. Carolyn McGregor, chair in health informatics at UOIT, shared how the institute is literally saving lives in neonatal intensive care units by monitoring and analyzing neonatal data in real time.

Rob Thomas, IBM vice president of business development for information management, explained the role of partners in the IBM big data ecosystem. As stated above, IBM will rely on Apache Hadoop as the foundation of its work but will partner with vendors further up the stack. Datameer, Digital Reasoning and Karmasphere all participated in the event as examples of the types of partnerships IBM will seek.

IBM has already demonstrated, via Watson, that it knows how to deal with large-scale data and Hadoop, but to date, if you want those same capabilities from IBM, they will come mostly in the form of services. The event made it clear that IBM backs the Apache Hadoop effort but not in the form of new products. In effect, IBM used its bully pulpit (not to mention its size and presence in the market) to discourage others from fragmenting the market. The announcements may also have been intended to buy time for further product development. I look for more definition from IBM on its product roadmap. If it wants to remain competitive in the big-data market, IBM needs to articulate how its products will interact with and support Hadoop. The Hadoop and Information Management benchmark research I am completing, to be released soon, will provide some facts on whether IBM is making the right bet on Hadoop.

Regards,

Ventana Research

Read More

Topics: Big Data, EMC, Business Intelligence, Cloudera, Greenplum, IBM, Information Applications, Information Management, InfoSphere, Strata+Hadoop

Roambi Innovates Mobile Industry with Simpler Information and Analytics

Posted by Mark Smith on May 21, 2011 3:21:43 PM

To maintain a productive workforce, businesses need to be able to put information in front of users at every level, from executives to front-line managers. Mobile technologies such as smartphones and tablets can provide analytics and business intelligence (BI), but so far this market niche has been dominated by publishing dashboards and reports that conform to the limits of mobile platforms. Analytics and BI software developers usually opt to publish charts and tables to Web pages on a smartphone or tablet. However, the usability of mobile-based Web browsers leaves a lot to be desired, which is particularly unfortunate in light of our recent benchmark research in business analytics, which found that usability was the number one consideration in 57 percent of organizations, while 89 percent said mobile applications need to be simpler to understand and use. A company called MeLLmo appears poised to capitalize on the demand for accessible mobile BI information.

Read More

Topics: Sales Performance, SAP, Supply Chain Performance, Sustainability, Google, Smart Phones, IT Performance, Operational Performance, Analytics, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, CIO, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Mobility, Workforce Performance, Roambi, Sybase, Mobile Industry, Tablets, Digital Technology

SAP Sales OnDemand Comes Alive in the Cloud and Mobile at SAPPHIRE NOW

Posted by Mark Smith on May 21, 2011 12:59:36 PM

At the SAPPHIRE NOW conference this week, SAP released the production version of the cloud-based Sales OnDemand software that it unveiled earlier in the year. There has been a lot of esoteric commentary on SAP Sales OnDemand from those who exclusively cover the IT industry. Unfortunately, the majority of them have never worked in sales or held a quota, which prevents them from providing a deeper perspective on the offering’s relevance to the sales organization and what it can provide to existing SAP customers or those evaluating it for the first time. I covered some critical perspectives in my research agenda on sales as a background to my analysis of this new offering.

Read More

Topics: Sales, Sales Performance, SAP, Sales Compensation, Sales Force Automation, Sales Forecasting, Operational Performance, Business Performance, Customer & Contact Center, Financial Performance, Workforce Performance, CRM, Sales Performance Management, SFA

Mercer Promotes Possibility of the New Empowered Workforce

Posted by Ventana Research on May 21, 2011 12:18:21 AM

Less than a week after attending ADP’s industry analyst day, I flew to Washington, D.C., to attend Mercer’s analyst forum, which gave me a chance to compare another human resources juggernaut with ADP. While ADP is known primarily for payroll and business process outsourcing, Mercer is known for HR consulting and benefits outsourcing. Mercer is not as big as ADP, with $3.5 billion in annual revenue and over 27,000 customers, most of which are large multinational and midmarket companies, servicing over 4.2 million employees. But it is just as influential because of the global benchmark research and market data it provides to clients.

Read More

Topics: Performance Management, Human Capital Management, Metrics, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Performance, Financial Performance, Governance, Risk & Compliance (GRC), Workforce Performance, Compensation, data mart, Talent Management, Workforce Analytics, Workforce Planning

SAP Advances Enterprise Performance Management in Version 10

Posted by Robert Kugel on May 21, 2011 12:12:30 AM

SAP announced the release of version 10 of its SAP BusinessObjects Enterprise Performance Management (EPM) Solutions suite, an enhanced and updated set of applications and capabilities for executives and managers. In our Value Index assessment of financial performance management suites and my analysis of it last year, Ventana Research gave SAP’s offering the highest score, and this new release builds on the solid foundation I assessed in my blog. It has been several years since SAP began acquiring and assembling its performance management and analytical software assets, and the company has progressed to the point where discussing the integration efforts is becoming irrelevant. This release revamps the user interface of the different components to provide a more consistent look and feel – a crucial factor in facilitating training and improving user productivity. Outside of the suite itself, the current release is designed to integrate better with ERP, SAP NetWeaver BW, risk management and BI. In fact, it establishes a foundation for the finance analytics I have researched, which is essential for what I call putting the “A” back in FP&A.

EPM incorporates a range of financial and performance management functionality, including strategy management, planning, sales and operations planning (S&OP), financial information management, profitability and cost management, spend management and supply chain performance management, as well as finance department process management software for financial consolidation, intercompany reconciliations and disclosure management. These components now have a more consistent user interface, and all have been given enhancements to their functionality, especially on the path to supporting what I call integrated business planning, which SAP has indicated is strategic to its future and to the use of its in-memory computing technology, HANA.

SAP also has improved integration of EPM with mobile devices like Apple iPad, which allows executives and managers who spend a large portion of their time away from their desks to have access to the information they need in a timely and contextual fashion, and lets them interact with the data to gain deeper understanding of underlying causes and potential outcomes. (My colleague Mark Smith covered mobile business intelligence in this blog.)

Release 10.0 includes the Disclosure Management application, which enables companies to automate the process of preparing external financial reports and regulatory disclosures. This capability will aid the increasing number of public companies in the U.S. that need to file their financial statements with a more complete set of eXtensible Business Reporting Language (XBRL) tags, a requirement whose automation I have already assessed. Companies can save considerable time using the software by systematizing their data collection, using workflows for managing the assembly of the text that goes into these filings, applying tags to text and data (if necessary) and automating the assembly of text and numbers in the exact format required. Automating this process gives executives more time to review filings and lessens the risk of reporting errors by changing mainly manual processes into more systematized ones. Performing this work in-house rather than outsourcing it gives companies greater control over the process and likely will save them a considerable amount of time following a relatively short learning curve. I provided some insight into this advancement when SAP acquired the software assets behind this new offering, which has now come to market. 

The current release builds enhanced enterprise risk management procedures into the overall performance management process. Outside of financial services, few companies explicitly quantify risk in their planning and performance assessment processes. Too often, managers are evaluated solely on productivity measures and therefore can be given disincentives to weigh risk factors. These risks may be well understood by business unit and divisional managers but are almost never communicated to senior executives. As I noted in a previous blog, this gives rise to agency risk within a company.

Although almost every company is mindful of achieving its profitability objectives, many fall short in coordinating the actions of their various silos and operating units to optimize the trade-offs they must make, especially as events unfold after the annual planning process. Profitability management enables senior executives to analyze and assess alternatives and optimize these trade-offs. 

EPM 10 continues the necessary evolution of the financial performance management suite. It’s not necessary for finance organizations to manage performance and core finance operations using software from a single vendor (and most don’t). However, suites give companies the option of doing so, which can be a less costly way of buying and maintaining this functionality. Finance organizations looking for a consistent user experience and technology for GRC will find that SAP BusinessObjects GRC 10 is empowered by SAP EPM 10 capabilities. 

Today, technology is pushing a fundamental shift in how companies use financial performance management software. The increasing availability of in-memory computing (HANA in SAP’s case, which my colleague David Menninger discussed in his blog), cloud computing and mobile devices enables a fundamental shift from today’s once-a-month, accounting-based, rear-view-mirror approach to assessing performance to an anywhere, anytime interactive view that blends financial and operating results and provides a richer, more accurate measure of results. In fact, at the SAPPHIRE NOW 2011 user conference my colleague saw SAP demonstrate new dynamic cash flow management on SAP HANA to help advance the efficiency of accounting and financial operations. 

I recommend that organizations considering any component of a financial performance management suite include SAP BusinessObjects EPM 10 in their list of products to investigate. This application suite can clearly help finance and is a better path than doing what I call the ERP forklift migration.

Regards,

Robert Kugel – SVP Research

Read More

Topics: Planning, Sales Performance, SAP, Supply Chain Performance, Sustainability, Forecast, Office of Finance, budget, Budgeting, XBRL, Operational Performance, Business Intelligence, Business Performance, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Management, Workforce Performance, CFO, agile, budgeting software, CEO, Corporate Finance, Financial Performance Management, Integrated Business Planning

Workforce Planning Is Part of the Next Generation of Human Capital Management

Posted by Ventana Research on May 20, 2011 11:57:49 PM

Workforce planning is a business process that, done right, assures an organization suitable access to talent for future business success. At a Mercer analyst summit I attended recently, which I wrote about in “Mercer Promotes Possibility of the New Empowered Workforce,” one of the sponsor execs kept challenging the HR industry analyst community to do more research on workforce planning, since her company and its customers are spending more time and money on just that. 

Read More

Topics: Performance Management, Sales Performance, Supply Chain Performance, Human Capital Management, Metrics, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Performance, Customer & Contact Center, Financial Performance, Information Applications, Information Management, Workforce Performance, Compensation, data mart, Talent Management, Workforce Analytics, Workforce Planning

The New Mobile SAP Evolves Stronger from Sybase Investment

Posted by Mark Smith on May 20, 2011 11:45:15 PM

At the SAPPHIRE NOW annual conference, (Twitter: #SAPPHIRENOW) the advantage of the mobile technology SAP gained through its acquisition of Sybase is becoming evident. In a blog before the conference I touched on the importance of mobility to the company’s future. From walking around, assessing keynotes and sessions and talking to companies using SAP, it seems that the big bet that SAP made on mobility is paying off.

Read More

Topics: Sales Performance, SAP, Social Media, Supply Chain Performance, Google, Smart Phones, IT Performance, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, CIO, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Mobility, Workforce Performance, Sybase, Mobile Industry, Tablets, Digital Technology

SAP’s New Management and Products Face the Future at SAPPHIRE NOW

Posted by Mark Smith on May 13, 2011 1:22:58 PM

It is no easy task to change the culture of a global technology company, especially one that has a very demanding customer base with high expectations for advancing its widespread product lines. This is the challenge that SAP faces as it transitions from a company of three-letter-acronym collections of applications including CRM, SCM and ERP to one that focuses on specific business processes and needs. (My colleague recently discussed the problems in forklift migrations of ERP.) This transition is necessitated by the shift of purchasing power and influence for applications back to business after over a decade of IT control. This alone might not seem like a drastic change, but reframing its entire dialogue and sales approach is not simple for a company the size of SAP. It must continue to grow through new applications and substantive upgrades of existing ones and cannot rely just on maintenance fees from the installed base. Over the last several months we’ve kept an eye on SAP as it built up to its annual SAPPHIRE NOW conference, investigating changes in products and management. I’d like to share some of our firm’s analysis with those of you who have invested in or are looking to invest in SAP.

Read More

Topics: Sales Performance, SAP, Social Media, Supply Chain Performance, Sustainability, Business Technology Innovation, IT Performance, IT Research, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, Business Technology, CIO, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Information Technology, Location Intelligence, Operational Intelligence, Workforce Performance, CFO, COO

EMC Enters Elephant Race with Hadoop

Posted by Ventana Research on May 12, 2011 5:21:09 PM

Earlier this week EMC announced it will create its own distribution for Apache Hadoop.  Hadoop provides distributed computing capabilities that enable organizations to process very large amounts of data quickly. As I have written previously, the Hadoop market continues to grow and evolve. In fact, the rate of change may be accelerating. Let’s start with what EMC announced and then I’ll address what the announcement means for the market.

 EMC announced three new offerings, slated for the third quarter of 2011, that leverage its acquisition of Greenplum last year, ranging from an open source version to incorporation in its data warehouse appliance.

The EMC Greenplum HD Community Edition is a free, open source version of the Apache Hadoop stack comprising HDFS, MapReduce, Zookeeper, Hive and HBase. EMC extends Hadoop with fault tolerance for the Name Node and Job Tracker, both of which are well-known points of failure in standard Hadoop implementations.

The EMC Greenplum HD Enterprise Edition, interface-compatible with the Apache Hadoop stack, provides several additional features including snapshots, wide-area replication, a Network File System (NFS) interface and some management tools. EMC also claims performance increases of two to five times over standard packaged versions of Apache Hadoop.

The EMC Greenplum HD Data Computing Appliance integrates Apache Hadoop with the Greenplum database and computing hardware. The appliance configuration provides SQL access and analytics to Hadoop data residing on the Hadoop Distributed File System (HDFS) as external tables, eliminating the need to materialize the data in the Greenplum database.
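
To illustrate what "SQL access to Hadoop data as external tables" can look like in practice, here is a hedged sketch using Python and psycopg2 to register HDFS-resident files as an external table in a Greenplum database and query them with ordinary SQL. The gphdfs protocol string, host names, ports and file layout are assumptions for illustration, not the documented syntax of the EMC appliance.

```python
# Hedged sketch: query HDFS-resident data from Greenplum as an external table.
# The gphdfs protocol string, host names and file layout below are assumptions
# for illustration; consult the appliance documentation for the exact syntax.
import psycopg2

conn = psycopg2.connect(host="greenplum-master", dbname="analytics",
                        user="gpadmin", password="secret")
cur = conn.cursor()

# Register the HDFS files as an external table; no data is copied into Greenplum.
cur.execute("""
    CREATE EXTERNAL TABLE ext_clickstream (
        ts        timestamp,
        user_id   bigint,
        url       text
    )
    LOCATION ('gphdfs://namenode:8020/data/clickstream/*.txt')
    FORMAT 'TEXT' (DELIMITER '|');
""")

# Ordinary SQL (joins, aggregates) now runs over the Hadoop-resident data.
cur.execute("""
    SELECT date_trunc('day', ts) AS day, count(*) AS clicks
    FROM ext_clickstream
    GROUP BY 1
    ORDER BY 1;
""")
for day, clicks in cur.fetchall():
    print(day, clicks)

conn.commit()
conn.close()
```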

Until now Cloudera has dominated the emerging commercial Hadoop market and faced little or no competition since it introduced the Cloudera Distribution for Hadoop (CDH). The EMC announcements are both good and bad news for Cloudera. On the one hand they suggest – you might even say validate – that Cloudera has chosen a valuable market. EMC seems to be willing to invest heavily to try to get a share of it. On the other hand, Cloudera now faces a competitor that has significant resources. For customers competition is generally a good thing, of course, as it pushes vendors to innovate and improve their products to win more business.

EMC’s approach to the market differs dramatically from IBM’s strategy. IBM announced on Twitter and at its Big Data Symposium held this week that it is putting all its weight behind Apache Hadoop in the hope of avoiding the fragmentation that plagued the UNIX market for years. EMC’s Enterprise Edition promises to tackle issues well known to the Hadoop market, but EMC faces competition from others who are also tackling these issues. If lower-cost or free competitive offerings adequately address these issues, they could seriously undercut the market for EMC’s Enterprise Edition. While EMC brings more enterprise credentials to the Hadoop market than Cloudera, it has less experience with Hadoop. Multiple vendors are attempting to bring enterprise-class capabilities to Hadoop, and it’s too soon to see who will succeed. However, overall, the Hadoop market will benefit from all the attention and investment.

I find it interesting and a little ironic that prior to its acquisition by EMC, Greenplum (along with Aster Data, now part of Teradata) helped popularize MapReduce, one of Hadoop’s most commonly used components, by embedding MapReduce in their databases. These proprietary implementations could be credited with helping to bring Hadoop into the mainstream big-data market because they combined data warehousing with MapReduce. This spawned a debate in which database guru Mike Stonebraker at first dismissed MapReduce and then embraced it. The debate attracted attention, a key ingredient in building any new market. Now EMC Greenplum completes the circle by embracing Hadoop.

 To its credit, EMC aligned a dozen partners around these announcements, creating an ecosystem of third-party products and services. Concurrent, CSC, Datameer, Informatica, Jaspersoft, Karmasphere, MicroStrategy, Pentaho, SAS, SnapLogic, Talend and VMware all announced their support for the EMC products in one form or another. Most of these companies also partner with Cloudera, so this is a good move but not a coup for EMC.

The Hadoop market continues to evolve. We are now analyzing the data collected in our benchmark research on the state of the large-scale data market – now commonly called the big data market – including Hadoop. Stay tuned for the results. It will be interesting to see where the market ends up. I expect more changes and innovation driven in part by the increased competition.

 The Hadoop market is no longer a one-elephant race.

 Regards,

 David Menninger – VP & Research Director

Read More

Topics: Big Data, EMC, Social Media, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Cloud Computing, Cloudera, Customer & Contact Center, Greenplum, Information Applications, Information Management, Strata+Hadoop

ADP Plots Course for Workforce Analytics

Posted by Ventana Research on May 12, 2011 3:34:43 PM

One thing became crystal clear while I was at ADP’s industry analyst day last week: The world is more connected than ever before, and this contributes to making the world more complex than ever before.

Read More

Topics: Big Data, Performance Management, Social Media, Human Capital Management, Metrics, Mobile Applications, Operational Performance, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, Cloud Computing, Financial Performance, Information Management, Workforce Performance, Compensation, data mart, Talent Management, Workforce Analytics

ADP Advances Workforce Mobility with Vantage HCM

Posted by Ventana Research on May 12, 2011 2:46:25 PM

Going into ADP’s industry analyst day, I was curious about where a 61-year-old “payroll” company fits in today’s market for human capital management. It certainly has a presence, with over 550,000 customers across multiple lines of business – HR, payroll, tax and benefits administration – and nearly $9 billion in revenue with three consecutive quarters of growth coming out of the worst recession since the Great Depression.  

Read More

Topics: Big Data, Performance Management, Sales Performance, Social Media, Supply Chain Performance, Human Capital Management, Mobile Applications, Operational Performance, Business Analytics, Business Collaboration, Business Performance, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Workforce Performance, Compensation, Talent Management

Verint Bolsters Workforce Optimization

Posted by Ventana Research on May 12, 2011 2:43:28 PM

Recently Verint Witness Actionable Solutions announced the latest release of its Impact 360 Workforce Optimization software, which it calls the first  “fifth-generation” product in this space.

Read More

Topics: Predictive Analytics, Social Media, Customer Analytics, Customer Experience, Customer Feedback Management, Social CRM, Speech Analytics, Voice of the Customer, Operational Performance, Analytics, Business Performance, Cloud Computing, Customer & Contact Center, Customer Service, Workforce Performance, Call Center, Contact Center, Contact Center Analytics, CRM, Desktop Analytics, Text Analytics, Workforce Management, Verint

Disaster, Risk Management and the Lean Supply Chain

Posted by Ventana Research on May 12, 2011 2:40:20 PM

The earthquake, tsunami and nuclear plant trifecta that devastated Japan has also had a negative impact on companies that embraced the concept of managing a lean supply chain – one that minimizes inventories at each stage. If news accounts are to be believed, there seem to be legions regretting that decision as disruptions caused by the disasters have a ripple impact, hampering manufacturers’ ability to deliver goods worldwide. But although current events are a wake-up call highlighting the risks inherent in a lean supply chain approach, a worse danger is that some companies may overreact, especially those where blame for bad outcomes – not bad decisions – is the focal point of damaging reviews and assessments.

Read More

Topics: Performance Management, Sales Performance, Supply Chain Performance, Sustainability, Human Capital Management, Marketing, IT Performance, Operational Performance, Analytics, Business Analytics, Business Collaboration, Business Intelligence, Business Performance, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Workforce Performance, Supply Chain

Analytics for IT: Cobbler’s Children Need Shoes

Posted by Ventana Research on May 12, 2011 2:38:09 PM

As part of our largest-ever research study on business analytics, which surveyed more than 2,600 organizations covering the maturity and competency of business, IT and vertical industries, we looked at how IT is applying analytics to support their own business activities. One of the things we found is that, charged with enabling business units to use information systems as effectively as possible, the IT department, like the shoemaker’s barefoot children in the old tale, typically stands last in line for resources to manage its own performance. In trying to understand and tune the collection of networking and operating systems, middleware and applications an enterprise needs to operate, IT professionals usually have to make do with small sets of historical data stored in spreadsheets and data warehouses and marts that are not as well managed as the systems they maintain to support the business. In most cases IT cannot apply the same level of analytics to its own operations that it provides to business units. This also has effects beyond IT itself: To the extent that the result is subpar performance of its core information systems, the business will suffer.

To break out of this frustrating cycle, IT needs to make the rest of the organization aware of the role it actually performs, of course, and it needs metrics and measurements, which require analytics to standardize and routinely generate them. IT needs to be able to analyze both historical and real-time events involving data and processes so managers can determine the right level of automation and efficiency to demand from the technology. And IT needs the capability delivered by predictive analytics  to anticipate situations and outcomes so it can prepare properly for them. In short, the CIO and IT staff need to manage their portfolio as a business asset, not merely a collection of technologies.

Metrics about its own operations and systems also enable IT to determine priorities for improvement. To fully understand the state of their existing investments and processes, IT organizations should not just measure them but analyze them to develop insights on future outcomes of their systems. This more sophisticated approach to analytics can help IT determine where to focus resources and what to do with legacy systems. Knowing this, it is possible to prioritize precious budget dollars and justify IT investments more convincingly.

Our research found that IT’s concerns currently center on cost and operational efficiency. The most important financial metrics are return on investment, cost per project, budget utilization and adherence to budget. The most important process metrics address timeliness in IT’s core function of service to the business: delivery of projects on time, speed of technology implementation and help desk response time.
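
As an illustration of turning these priorities into routinely generated numbers, the sketch below computes a few of the metrics named above (on-time project delivery, budget adherence and budget utilization) from a small, invented project log; the field names and figures are hypothetical.

```python
# Illustrative only: derive a few of the IT metrics named above from a small,
# invented project log. Field names and figures are hypothetical.
projects = [
    {"name": "ERP upgrade",    "budget": 250_000, "actual": 280_000,
     "planned_days": 120, "actual_days": 135},
    {"name": "BI rollout",     "budget": 90_000,  "actual": 85_000,
     "planned_days": 60,  "actual_days": 58},
    {"name": "Service portal", "budget": 40_000,  "actual": 41_000,
     "planned_days": 30,  "actual_days": 30},
]

on_time = sum(1 for p in projects if p["actual_days"] <= p["planned_days"])
on_budget = sum(1 for p in projects if p["actual"] <= p["budget"])
total_budget = sum(p["budget"] for p in projects)
total_actual = sum(p["actual"] for p in projects)

print(f"On-time delivery:   {100 * on_time / len(projects):.0f}% of projects")
print(f"Budget adherence:   {100 * on_budget / len(projects):.0f}% of projects")
print(f"Budget utilization: {100 * total_actual / total_budget:.0f}% of total budget")
```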

In our research, which we presented in a webinar on IT analytics, participants’ perceptions of which metrics are most important for executives and managers highlighted two: business user satisfaction and compliance with service level agreements (SLAs). The executives themselves rated the two metrics nearly equal in importance, but their direct reports (vice presidents) most often named, by a slight margin, adherence to governance and risk management requirements rather than either of those. These responses suggest that people may work somewhat at cross-purposes in pursuing IT analytics.

The research also finds strong suggestions that organizations ought to involve more people in the process of establishing requirements for defining analytics. Research participants asserted overwhelmingly that they and the head of their business unit are involved in establishing requirements important to their jobs, but percentages drop for heads of other business units and business analysts in other business units. This disparity takes on more weight when we recall that business user satisfaction and SLA compliance are important metrics for leaders.

For analytics to deliver value, they must be available to those who need them; the research shows that this is an issue for many organizations. No more than half have analytics generally available to address any of seven major IT management tasks, and only for budget analysis are analytics completely available in even one-fourth of organizations. In a related finding, more than half said it is very important to make it simpler to provide analytics and metrics; less than 10 percent said that is only somewhat important or not important. As well, over a third said they can significantly improve their use of analytics and performance indicators, and over a third are not satisfied with the process currently used to create analytics.

The process of applying analytics also impacts IT’s effectiveness. The IT Analytics benchmark research found that users in nearly two-thirds of all organizations spend most of their time in unproductive chores that precede analyzing their data: preparing it for analysis, reviewing it for quality and consistency and waiting for it. And before that, issues in collecting the data raise another roadblock. In more than half of organizations, doing that is very difficult or a challenge that impedes creating metrics and performance indicators.

These functional barriers also can get in the way of analysts performing important tasks. Among capabilities they need in order to work effectively with analytics and metrics, 42 percent said access to source data is the most important, and at least one-third identified as most important the abilities to search for existing data and analytics, to take action based on analytics and to design and maintain both business models and metrics for analytics. Applying predictive analytics to project future outcomes, a hallmark of advanced maturity in the use of IT analytics, was cited by 31 percent.

IT professionals need appropriate tools to facilitate these and other analytics-related activities. In more than half of these organizations, business intelligence technologies for query, reporting and analysis are the most important of these tools. Yet even in this technologically astute environment, desktop spreadsheets are often used to generate analytics and are an important information source for building IT analytics. But spreadsheets require manual effort to populate the data and are prone to error, and thus are not appropriate for collaborative and enterprise-wide activities. We think their widespread use is a factor in half of organizations being only somewhat satisfied or not satisfied with their organization’s current technology for creating and applying analytics.

As part of our benchmark research methodology, Ventana Research has developed a model for assessing maturity that classifies organizations at four maturity levels (from bottom to top, Tactical, Advanced, Strategic and Innovative) in each of four categories: People, Process, Information and Technology. With respect to their use of and plans for IT analytics, our Maturity Index analysis found only 15 percent whose responses place them at the highest Innovative level of maturity. One important finding reflecting on organizations’ maturity is that two-thirds said the data used in preparing metrics and performance indicators is only somewhat accurate or somewhat inaccurate. As well, it takes 35 percent of organizations more than one week to provide updated metrics and performance indicators to people and nearly as many up to a week to provide them.

It is a positive sign that improvements, if made, will be done most often to improve business processes or decision-making rather than for operational efficiency and cost savings. The first two motivations are more likely to produce better business results. Similarly, maximizing IT effectiveness and improving the value of IT to business managers are more important than issues involving resources, costs and budget.

However, these opinions come from organizations that plan to change the way they generate and apply analytics in the next 12 to 18 months, and they comprise only 28 percent of the total; another 36 percent said changes are needed but are not currently a priority. The primary barriers to such an initiative are both fiscal (lack of resources and budget) and perceptual (lack of awareness and a sense that the business case is not strong enough). Recognizing a problem but not being willing or able to remedy it is another sign of immaturity.

To maximize its value, IT should use analytics and metrics to help set its own goals and objectives and to ensure they serve the business strategies of the organization. This innovative path is embracing IT performance management. Few organizations have taken the necessary steps to actually manage performance and align, optimize and understand the range of their IT processes and resources. We believe, and this benchmark research confirms, that it is time for them to take those steps, supported by executive management in providing resources.

Regards,

Ventana Research

Read More

Topics: Predictive Analytics, IT Performance, Analytics, Business Analytics, Business Intelligence, Business Performance, Information Applications, Information Management, Information Technology, IT Analytics, IT Service Management, ITIL, ITSM, IT Performance Management (ITPM)

Think Carefully about Social Media and Your Customers

Posted by Ventana Research on May 12, 2011 2:35:39 PM

Unless you have been on a long vacation somewhere without newspapers, mobile phones or the Internet, you must have noticed all the buzz about social media – some of it factual and lots of it hype. Hundreds of millions of people use Facebook, millions of tweets are posted on Twitter every day, and YouTube has become the place to share videos, whether for a laugh, for a company’s brand awareness or for training courses. The key question for business is how much of this is useful for commerce and how much is just socializing. I started researching this movement and its intersection with business some time back, and last year I spoke about Customer Service in the Social Media Age.

Companies should be looking at social media as another channel of communication with their customers and prospects. My research into the state of technology in contact centers shows that companies on average now support four communication channels but that social media is as yet the least used. This is due partly to its newness, but I believe other factors also come into play. Social media is different from other channels. It is much more open-ended, and it is impossible to control who (and how many people) might see an entry; therefore companies and customers alike should be careful about what they post (or allow employees to post in their name). Social media generates high volumes of communications and thus can consume lots of time and effort to keep up with and respond to entries. And like it or not, it is open to abuse, such as disgruntled consumers running negative campaigns against companies, companies manipulating entries to sway consumers’ views and both sides reacting badly to provocative entries.

Another significant difference is that use of social media transcends business units; this might be the hardest thing for companies to reconcile. As a speaker pointed out at the recent IQPC Executive Customer Contact Exchange (ECCE) conference, businesses can use social media for four activities – brand management (marketing), sales, customer service and product development. Of these, the most common use is brand management, with marketing departments treating it as a “cheap” channel for placing advertising and for monitoring consumer comments about the company or brand. The next widest use is the largely negative side of customer service, as customers post negative comments about companies, products and the quality of service they receive, and some companies respond. At the very least companies should be monitoring these comments with one of the many social media analytics tools; by doing so they can extract a wealth of insight into what they and others are doing right and wrong (most often the latter).
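As a toy illustration of what such monitoring might look like at its simplest, the Python sketch below scores captured comments with a keyword-based sentiment rule; the word lists are invented for illustration, and the commercial social media analytics tools mentioned in this post are far more sophisticated.

```python
# Toy sentiment scoring of captured social media comments.
# Purely illustrative; real text analytics products are far more sophisticated.

POSITIVE = {"love", "great", "excellent", "helpful"}
NEGATIVE = {"broken", "waiting", "frustrated", "worst", "terrible"}

def sentiment(text):
    words = set(text.lower().replace(",", " ").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

comments = [
    "Love the new release, and the support team was really helpful",
    "Still waiting on hold, worst service ever",
]
for comment in comments:
    print(sentiment(comment), "-", comment)
```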

Other uses are at present less common. A few companies have extended social media into their end-to-end customer service processes, for example by picking up entries asking how to get a product working. This typically involves capturing social media entries using one of the engines now available, routing service-related entries to the contact center or customer service group, and then having someone post a response through the same channel or, if appropriate, a different one. In a similar way some companies are picking up potential sales opportunities, such as entries requesting information about a product, and routing them into their sales process. Finally, some innovative companies are using social media forums to solicit feedback on potential product developments or enhancements.
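Here is a minimal sketch of the capture-route-respond pattern just described, written in Python; the classification rules, channels and queue names are hypothetical and stand in for the capture engines and routing platforms a real deployment would use.

```python
# Minimal capture -> classify -> route sketch for social media entries.
# All rules, channels and queue names are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class SocialEntry:
    channel: str   # e.g. "twitter", "facebook"
    author: str
    text: str

def classify(entry):
    """Naive keyword rules standing in for a text analytics engine."""
    text = entry.text.lower()
    if any(phrase in text for phrase in ("how do i", "not working", "help")):
        return "service"
    if any(phrase in text for phrase in ("price", "which model", "where can i buy")):
        return "sales"
    return "brand"

def route(entry):
    """Return the queue that should own the response (posted back on the same channel)."""
    queues = {"service": "contact_center", "sales": "sales_team", "brand": "marketing"}
    return queues[classify(entry)]

entry = SocialEntry("twitter", "@a_customer", "How do I get this product working?")
print(route(entry))   # -> contact_center
```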

It is still uncertain which of these uses will deliver real business value, but as companies experiment with social media, I advise them to take into account that each of these four uses is typically the responsibility of a different business unit. My research on the use of technology shows that one of the most important things for companies and customers alike is consistency – of information and of experience. Inconsistency in either means increased costs (providing multiple channels to get an answer), increased customer frustration and loss of potential business. To avoid these, companies should treat social media as a cross-business-unit responsibility and ensure that all units draw on a single source of customer information and synchronize their processes across unit boundaries.

There was also a lot of discussion at the ECCE event about how companies should put together a social media strategy. It seems to me that the first thing companies should do is “listen” to how their customers are using social media and what they are saying on different sites. Several vendors I have been assessing address this, including Attensity, Clarabridge, Genesys, ResponseTek, RightNow, salesforce.com and SAS. These products, some of which are deployed in the cloud, can extract relevant entries from different sites and use text analytics to assess the content. Once you have this ability to listen, you will be in a position to decide your strategy and how best to benefit from social media going forward. Where does your company stand with regard to social media? What uses are you making of it? Do you have a product in place to monitor what is happening? Drop me a line and tell me about your experience.

Regards,

Richard Snow – VP & Research Director

Read More

Topics: Predictive Analytics, Sales Performance, Salesforce.com, SAS, Social Media, Customer Analytics, Customer Experience, Social CRM, Speech Analytics, Voice of the Customer, Clarabridge, Genesys, ResponseTek, RightNow, Operational Performance, Analytics, Business Collaboration, Business Intelligence, Business Mobility, Cloud Computing, Customer & Contact Center, Customer Service, Information Applications, Call Center, Contact Center, Contact Center Analytics, CRM, Desktop Analytics, Text Analytics

SAP Diversifies into Contact Center and Communications Technology Market

Posted by Ventana Research on May 12, 2011 2:32:49 PM

Most people associate SAP with enterprise software: ERP, CRM and, more recently, business analytics and business intelligence. Most also see the company as committed to providing these as on-premises applications and as having only begun to establish a presence in cloud computing for business applications. But there is more to the story, as I recently discovered. With its Business Communications Management (BCM) software SAP has quietly diversified into the contact center market while at the same time increasing its presence in the cloud.

BCM originated with SAP’s acquisition of Wicom Communications, a Finnish company, in 2007. Wicom developed its product from a customer project and had some success selling it in the Nordic countries. SAP has built on this foundation and now offers BCM globally. It is a multichannel, VoIP-based communications management application that helps companies control their interactions with customers. It is designed so that calls are handled locally, but where they are routed and which are recorded can be controlled either on-premises or in the cloud. BCM includes call recording, IVR and unified interaction routing, in which interactions from multiple channels are routed through a single queue. It has a built-in directory of valid users that works in conjunction with presence capabilities, so one user (agent) can identify others who are available on the network, either to collaborate on resolving an interaction or to transfer the interaction to another user. The directory lists users’ skills to help an agent find someone with the right skills to handle a particular customer interaction.
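To illustrate the kind of unified, skills-aware routing described above, here is a simplified sketch in Python; it is a hypothetical illustration of the idea, not SAP BCM’s actual interfaces or behavior.

```python
# Hypothetical sketch of a single interaction queue with presence-aware,
# skills-based assignment. This is not SAP BCM's actual API or behavior.

from dataclasses import dataclass

@dataclass
class User:
    name: str
    skills: set
    available: bool = True   # presence flag

@dataclass
class Interaction:
    channel: str             # "voice", "email", "chat", ...
    required_skill: str

def assign(queue, directory):
    """Match each queued interaction to an available user with the required skill."""
    assignments = []
    for interaction in queue:
        for user in directory:
            if user.available and interaction.required_skill in user.skills:
                user.available = False   # the user is now busy with this interaction
                assignments.append((interaction.channel, user.name))
                break
        else:
            assignments.append((interaction.channel, "waiting in queue"))
    return assignments

directory = [User("Anna", {"billing"}), User("Ben", {"technical"})]
queue = [Interaction("voice", "technical"), Interaction("chat", "billing"),
         Interaction("email", "technical")]
print(assign(queue, directory))
# -> [('voice', 'Ben'), ('chat', 'Anna'), ('email', 'waiting in queue')]
```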

All of BCM’s capabilities are fully integrated with each other, and there is a single point of administration. This close coupling makes it possible to centralize reporting and analysis and to combine information from multiple sources to provide a broader base of information for reports and analysis. Integration also extends to other applications, particularly others from SAP such as SAP CRM, ERP, ByDesign, BOBJ and BI for more extensive reporting. These integrate at the lowest level, thus providing more out-of-the-box interoperability than normally is possible between third-party applications. Other non-SAP applications can be integrated using Web services.
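For the non-SAP case, integration over Web services could look roughly like the following sketch; the endpoint URL and payload shape are invented for illustration and do not reflect BCM’s documented interfaces.

```python
# Hypothetical example of pushing an interaction record to another system over
# a simple Web service. The endpoint and payload are invented for illustration.

import json
import urllib.request

def post_interaction(record, endpoint):
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example call against a hypothetical endpoint:
# post_interaction({"channel": "voice", "duration_seconds": 312, "outcome": "resolved"},
#                  "https://integration.example.com/api/interactions")
```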

The products are available on-premises from SAP and in the cloud from its partners. SAP also provides consulting services to help customers get up and running. In summary, the set of products offers tightly integrated capabilities: VoIP-based smart PBX functionality, core unified communications capabilities (presence and collaboration), multichannel routing, and reporting and analysis, alongside tight integration with CRM. This does not amount to a fully functional contact center, but the communications management capabilities support companies as they try to improve the way they handle customer interactions.

SAP positions BCM as enabling “communications-enabled business processes.” I have two issues with this concept. First, in my experience most call centers don’t think in terms of “process” but rather of a set of activities, such as getting incoming calls (and other interactions) to the most qualified person and enabling that person to get on and resolve the call; for many people technology just gets in the way. Second, in my experience applications are not very friendly to call handling; callers don’t structure their conversations in the logical way applications work and don’t respect which screens have to be accessed and what data has to be entered in what sequence; that is, the applications don’t flow the way conversations flow. So I’m not sure about communications-enabled processes, but from what I have seen and heard BCM does enable smart interaction management and therefore should help companies improve the way they interact with customers, something I have researched extensively in my work on customer interaction technology. SAP is clearly deepening its focus on CRM, as my colleague noted recently.

Are you ready for communications-enabled processes or customer interaction activities and technologies? If so, I’d love to know what you are doing and what technology you use to support your efforts.

Regards,

Richard Snow – VP & Research Director

 
Read More

Topics: Sales Performance, SAP, Customer Analytics, Customer Experience, Social CRM, Voice of the Customer, Operational Performance, Analytics, Business Collaboration, Business Intelligence, Cloud Computing, Customer & Contact Center, Customer Service, Information Applications, Call Center, Contact Center, Contact Center Analytics, CRM, Unified Communications

MarkLogic Revs Up Information Applications with New Energy and Leadership

Posted by Ventana Research on May 12, 2011 2:30:54 PM

At this year’s user conference, it was clear that change is afoot at MarkLogic, whose technology platform makes information more easily accessible within applications and devices. Last month the board of directors appointed a new CEO, Ken Bado, created the new position of chief marketing officer (CMO) and named a head of global services and alliances, all within three weeks. The Silicon Valley software company has been growing over the last several years, but apparently not fast enough for its board. There have been many advances since my in-depth analysis in 2010 and last year’s conference.

Read More

Topics: Sales Performance, Supply Chain Performance, MarkLogic, Reporting, XML, IT Performance, Operational Performance, Business Analytics, Business Intelligence, Business Performance, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Workforce Performance, Content Management, Document Management, Information Platform, Search

SuccessFactors Finds Learning Management System and More with Plateau Acquisition

Posted by Ventana Research on May 12, 2011 2:28:44 PM

Consolidation in the human capital management software market is in full swing. Recently Peopleclick Authoria acquired Aquire Solutions, and just this week Golden Gate Capital and Infor agreed to buy Lawson Software for about $2 billion, a deal on which my colleague Robert Kugel has commented.

Read More

Topics: Performance Management, Sales Performance, Human Capital Management, Learning, Learning Management System, Operational Performance, Business Analytics, Business Collaboration, Business Mobility, Business Performance, Cloud Computing, Customer & Contact Center, Financial Performance, Workforce Performance, Compensation, Talent Management, Workforce Planning

Acquisition of Lawson Complements Infor’s Portfolio

Posted by Ventana Research on May 12, 2011 2:25:52 PM

Golden Gate Capital and Infor (which is owned largely by Golden Gate Capital) will acquire Lawson Software for approximately $2 billion in a transaction that is expected to be completed sometime in this year’s third quarter. Lawson is the latest in a string of enterprise software acquisitions made or financed by Golden Gate that began almost a decade ago. Today, Infor is made up of legacy companies such as Baan, Comshare, ePiphany, Dun & Bradstreet Software, SSA, Sun Systems and Symix, to name just a handful. Compared to Oracle’s acquisition approach, I would describe Golden Gate’s as more of a “rollup” of applications software vendors because it incorporates a larger number of smaller companies. While Oracle has focused primarily on serving the largest corporations, Infor’s customers tend to be midsize to large companies or divisions of very large corporations. Nonetheless, with this acquisition Infor will have a larger base of revenue and installations to work from in an industry where size and economies of scale drive profitability and competitiveness.

Lawson’s focus has been on two main vertical segments that I think nicely complement Infor’s lineup: services-oriented S3 strategic industries, which include healthcare and public sector organizations as well as the cross-industry market for human capital management (HCM) software, whose importance for 2011 my colleague recently outlined; and light-manufacturing-oriented M3 strategic industries, which target fashion companies, equipment service management and rental, and food and beverage. Lawson’s HCM portfolio will significantly help Infor, which has not been aggressive with the workforce management product it acquired many years ago, in a market that has been growing and consolidating rapidly over the last several years. Lawson’s strategy has been to focus on midsize-to-larger organizations in its core markets with a vertical-specific product focus and a value proposition of lower cost of ownership.

One objective in an acquisition such as this is to keep customers paying maintenance as long as possible. (I covered this topic in an earlier blog, “The Technology Stack and Innovation.”) When the final deal was announced it was accompanied by a letter from Infor’s CEO, Charles Phillips, to Lawson’s customers aimed at reassuring them that Infor is in it for the long run and that Lawson’s current products will continue to receive support.

Beyond the goal of continuing to receive maintenance fees on Lawson’s existing product lines, I think the acquisition reaffirms Infor’s basic product approach of making it simple for customers to migrate from their existing software to a next-generation Infor offering. Software companies that, like Infor, have acquired an array of similar business applications have big incentives to move established customers onto a new or substantially updated system as painlessly as possible; otherwise those customers are likely to stop paying maintenance and start evaluating a full set of alternatives. (I covered this point in a recent blog on ending “forklift migrations.”) Reducing migration pain makes it much easier for a vendor to keep customers on maintenance and hold onto an important and highly profitable source of revenue. Moreover, it’s a way for these vendors to consolidate the number of code bases they maintain, which at the very least will make their development programs more effective, rationalize sales efforts and yield operating savings.

While the price Golden Gate and Infor are paying is hardly cheap (about 2.6 times this year’s projected revenue for Lawson), it does give the acquirers a large, incremental, maintenance-paying installed base that can be targeted with a “pain-free” migration offering. Whether this ultimately pays off for Infor’s and Golden Gate’s investors depends, of course, on execution. Infor has been a company with good (and some not-so-good) products and unfulfilled potential. It’s up to Charles Phillips and his team to realize that potential and to make good on the commitments in his letter to Infor and Lawson customers at the announcement.

Regards,

Robert Kugel – SVP Research

 
Read More

Topics: Sales Performance, Supply Chain Performance, ERP, Human Capital Management, Operational Performance, Business Analytics, Business Performance, Business Technology, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Oracle, Workforce Performance, CFO, Infor, Talent Management, Corporate Finance, Financial Performance Management

Datawatch Offers Shorter Path from Data to Information

Posted by Ventana Research on May 12, 2011 2:23:08 PM

Turning data into information for taking actions and making decisions has bedeviled businesses throughout the computer age. Many organizations have data in dozens of applications and legacy systems along with many reports in various business intelligence systems. The challenge is to get data from each of the reports and assemble it into contextualized views of information for particular business needs. In our benchmark research on what we call information applications, only 11 percent of organizations said they are satisfied with their existing efforts to do this; more than half of organizations see the current process as too slow and not adaptable to the changes that necessarily occur in assembling actionable information.
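As a simple illustration of the general problem (not of Datawatch’s product), the Python sketch below pulls rows from two differently structured report extracts and assembles them into one contextualized view; the report contents are invented.

```python
# Illustrative sketch of assembling one view from two differently structured
# report extracts. The data is invented; this shows the problem, not a product.

import csv
import io

billing_report = "account,amount_due\nA-100,250.00\nA-200,0.00\n"
crm_report = "AccountId;Owner;Segment\nA-100;Jones;Enterprise\nA-200;Lee;SMB\n"

def rows(text, delimiter=","):
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

crm = {row["AccountId"]: row for row in rows(crm_report, delimiter=";")}
view = [
    {
        "account": bill["account"],
        "owner": crm[bill["account"]]["Owner"],
        "segment": crm[bill["account"]]["Segment"],
        "amount_due": float(bill["amount_due"]),
    }
    for bill in rows(billing_report)
]
print(view)
```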

Read More

Topics: Sales Performance, Sustainability, Reporting, Business Technology Innovation, IT Performance, Operational Performance, Business Analytics, Business Intelligence, Business Performance, Cloud Computing, Customer & Contact Center, Financial Performance, Governance, Risk & Compliance (GRC), Information Applications, Information Management, Workforce Performance, Datawatch, Document Management

Zyme Solutions Tackles Channel Data Challenge

Posted by Ventana Research on May 4, 2011 7:49:25 AM

Companies (especially in high technology) that sell through an indirect channel face a difficult challenge because global sales channels are complex, fragmented and changeable, with business practices and customs that differ from those of direct channels. Keeping track of which products have sold in to and sold through which partners can be a difficult task. Unless a company is working with only a handful of channel partners, just collecting the data is time-consuming. Not only is the data complex, but much of it comes from the disparate IT systems of individual channel partners. Partners report their data at different times and in different ways, using a mishmash of data structures, aggregations and nomenclature, so companies have to go through a data-cleansing step to acquire a consistent data set to work with. Yet accurate, detailed and timely data is important to both the day-to-day and strategic management of a corporation. Without it, it’s hard to manage customer and partner relationships effectively and to maintain a timely, accurate view of aggregate indirect channel sales and inventory positions.
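A minimal sketch of that data-cleansing step might look like the following; the partner feed formats, field names and product aliases are invented for illustration.

```python
# Minimal sketch of normalizing channel partner sell-through feeds that arrive
# with different field names and product nomenclature. All formats are invented.

FIELD_MAP = {                      # partner field name -> canonical field name
    "sku": "product", "item": "product", "prod_code": "product",
    "qty": "units", "units_sold": "units",
    "week": "period", "month": "period",
}
PRODUCT_ALIASES = {"WP2": "Widget Pro 2", "WIDGET-PRO-2": "Widget Pro 2"}

def normalize(record, partner):
    clean = {"partner": partner}
    for field, value in record.items():
        canonical = FIELD_MAP.get(field.lower())
        if canonical:
            clean[canonical] = value
    clean["product"] = PRODUCT_ALIASES.get(clean.get("product"), clean.get("product"))
    clean["units"] = int(clean.get("units", 0))
    return clean

feeds = [
    ("DistributorA", {"SKU": "WP2", "Qty": "14", "Week": "2011-W17"}),
    ("ResellerB", {"Item": "WIDGET-PRO-2", "Units_Sold": "9", "Month": "2011-04"}),
]
print([normalize(record, partner) for partner, record in feeds])
```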

Read More

Topics: Sales, Sales Performance, Salesforce.com, Human Capital Management, Zyme Solutions, Operational Performance, Business Analytics, Business Collaboration, Financial Performance, Governance, Risk & Compliance (GRC), channel, CRM

Prospects for Ending Forklift Migrations of ERP

Posted by Ventana Research on May 3, 2011 9:40:48 PM

Back in the old days (20 years ago or so) companies that wanted to expand or update their telephone systems had to do what was called a “forklift migration.” In other words, they had to remove big, heavy and very expensive boxes of electronics from an equipment room and replace them with newer big, heavy and very expensive boxes. The process of adding, deleting or changing people, offices and phone numbers was equally burdensome and costly. This all seems quaint now because digital telephony and voice over IP (VOIP) have completely changed the technology underpinnings of voice communications. I bring this up because we may be on the verge of substantially reducing the “forklift migration” equivalent of replacing or updating on-premises ERP systems and other enterprise software. This possibility is important for software vendors as well as users. Retaining a maintenance base and revenue stream has become a key strategic objective for any enterprise software provider. In North America in particular, companies that have outgrown their enterprise system or want to replace it almost never exhibit total brand loyalty. Instead they begin the replacement process by looking at alternatives, winnow it down to a short list and then select the best of the lot. If migration is as much work as implementing a new system, organizations are likely to view replacement as an equally attractive option, increasing the probability that the incumbent vendor will lose a customer. But if there’s little pain in changing an ERP system to acquire new functional capabilities or meet other objectives, incumbent vendors stand to benefit.

Read More

Topics: SAP, ERP, Office of Finance, Business Performance, Financial Performance, Governance, Risk & Compliance (GRC), Oracle, Infor

It’s Time for the Contact Center To Change

Posted by Ventana Research on May 3, 2011 9:07:01 PM

Twenty years ago, when I began consulting in the contact center industry, building a call center was a hard, resource-consuming task. Just to begin handling calls required purchasing lots of proprietary equipment, such as PBXs and automatic call distributors (ACDs), as well as software for computer/telephony integration (CTI) and business applications such as case management and CRM – and then spending a lot of time and effort integrating them. Lots of tasks were managed using spreadsheets, and if you wanted anything more than the basic reports available from your PBX/ACD supplier, you would have to budget a great deal more money. Right from those early days, call center managers focused on efficiency and relied on basic metrics such as queue lengths, average call-handling time, hold times and call transfers.

Read More

Topics: Predictive Analytics, Sales Performance, SAP, Social Media, Customer Analytics, Customer Data Management, Customer Experience, Customer Feedback Management, Social CRM, Speech Analytics, Voice of the Customer, InContact, LiveOps, Operational Performance, Analytics, Cloud Computing, Customer & Contact Center, Customer Service, Workforce Performance, Call Center, Contact Center, Contact Center Analytics, CRM, Desktop Analytics, Interactive Intelligence, Text Analytics, Unified Communications, Workforce Management, Contactual
