Ventana Research Analyst Perspectives

The Technology Stack and Innovation: SAP & The Rest

Written by Ventana Research | Dec 14, 2010 5:51:04 PM

Vishal Sikka raised an important point about the software business during his remarks at the SAP Global Influencer Summit that my colleague just assessed (see: “SAP Elevates Technology Strategy for Enterprise Software and Solutions”). He contrasted the business strategy of consolidation that other companies are pursuing with his view of SAP’s strategy of innovation. In one sense, this assertion is an attempt to disparage Oracle’s and to some extent IBM’s approach to constructing an IT business portfolio, even though SAP itself has been a consolidator in recent years. (Business Objects and Sybase, for example, are significant components of SAP’s product universe and go-forward strategy.) However, I believe consolidation vs. innovation is an important point to consider as we enter the second decade of the 21st century, because it points to the potential for a basic shift in the dynamics of the software business.

In contrast with the first five decades of the software business, technology innovation was not much of a driving force in business applications over the past decade. It’s not that there weren’t steady evolutionary enhancements, but these did not have a fundamental impact on demand or competitive dynamics. Instead, the lack of a major technological catalyst created the business dynamic that characterized the 2000s: consolidation.

One of the big “Aha!” (or WTF) moments in the software business in that decade took place in February 2004 when (I imagine) Safra Catz (or an analyst working for her) changed the assumptions Oracle was making in its valuation of PeopleSoft during the hostile takeover that began the previous June. Simply changing assumptions about the maintenance tail that PeopleSoft would bring made it possible – at a stroke – to increase the price Oracle was offering to pay by one-third. Until that point, it was common to make conservative assumptions about customer retention. Conventional wisdom was that, even assuming you keep 90 percent of your customers on maintenance each year, the compounding means half of them will be gone after just seven years. So until 10 years ago, conservative assumptions in valuing enterprise software companies were called for because highly disruptive technology would emerge regularly, and expecting to lose half your customers over a product cycle was therefore reasonable in considering what to pay. Recall that after Microsoft released Windows 3.0 it took less than seven years for Lotus to lose just about all of its 1-2-3 customer base to Excel.
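A quick back-of-the-envelope check shows why the retention assumption matters so much. This is my own illustration, not Oracle’s actual model; the fee, discount rate and horizon below are assumed purely for the sketch:

```python
def retained(annual_retention: float, years: int) -> float:
    """Fraction of the original customer base still paying after
    `years`, assuming a constant annual retention rate."""
    return annual_retention ** years


def maintenance_npv(annual_fee: float, annual_retention: float,
                    years: int, discount: float = 0.10) -> float:
    """Present value of a maintenance revenue stream that decays with
    customer attrition. Illustrative only -- fee, horizon and discount
    rate are hypothetical."""
    return sum(annual_fee * annual_retention ** t / (1 + discount) ** t
               for t in range(1, years + 1))


# 90% annual retention compounds to roughly half the base gone in 7 years:
print(f"{retained(0.90, 7):.1%}")   # 47.8%

# Raising the assumed retention rate materially raises the valuation:
conservative = maintenance_npv(100.0, 0.90, 15)
optimistic = maintenance_npv(100.0, 0.98, 15)
print(f"uplift: {optimistic / conservative - 1:.0%}")
```

With these assumed inputs the uplift from moving 90 percent retention to 98 percent is on the order of half again the conservative value, which is the same mechanism (if not the same numbers) behind the one-third jump in Oracle’s offer.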

For someone like me who pays attention to financial fads and fashions, the near-elimination of disruptive technology from the valuation assumptions in software mergers and acquisitions has been interesting because, while reasonable at the time, it must someday prove to be wrong. This is why Vishal’s comments – although a bit self-serving – are worth everyone’s attention.

The last big schism in enterprise software took place with the introduction of client/server computing. That term is misleading because it wasn’t just the architecture that was important. It was accompanied by the mainstreaming of the relational database, event-driven programming languages and the graphical user interface. These improvements combined to make it far easier to create flexible business applications that could support a wider range of functions and processes. All of these elements reached the market within a couple of years of each other, so in hindsight it looks like one big event, but it was a convergence of several that made the difference.

In contrast, the rapid adoption of the Web was more additive than disruptive to the software business. I think that those predicting another client/server revolution were caught up in the buzz.

This time around, in the next couple of years what might be different is that several disparate threads arriving over a longer period of time may accelerate the obsolescence of existing business software. In-memory computing (such as SAP’s high-performance analytic appliance HANA) may be a game changer when combined with (for example) cloud computing, server virtualization, open source software, mobility and tablet computing and other user interface devices that substantially enhance usability and the user experience. It’s true that all of these things have been around for a while. However, as business computing has matured and become more complex, it takes longer for threads like these to coalesce into products and services that corporations will demand. If these prove attractive enough, they could spawn a wave of replacement of existing software.

One of the questions that innovation vs. consolidation raises is whether owning a “stack” of technology is good, bad or indifferent when it comes to dealing with a major replacement cycle. Vishal’s observation that nobody buys a stack of technology is true – sort of. Most of the time people don’t buy “performance management,” for instance; they purchase one or several IT elements necessary to support performance management. Yet some companies do buy suites precisely because they have one throat to choke or it’s easier to manage the integration. Although nobody is going to do a full green-field deployment at a large company these days, the stack represents a vendor consolidation option that some (even many) companies will consider. It presents an opportunity for vendors that own a stack to increase their footprint in an enterprise. Based on corporate behavior over the past 50 years, I’d say owning a stack is a positive, but only if most or all of the elements of that stack are on the cutting edge.

So which software vendors are most vulnerable in a new decade of innovation outstripping consolidation? That depends on how one defines the nature of the vulnerability – market share or profitability, for example. For starters, consider those with high-margin quasi-monopolies such as Microsoft (both Office and Windows) and Oracle (database and, to a much lesser extent, business applications maintenance). Their mighty operating profit margins are the result of their huge economies of scale, which could be eroded by increased competition. Yet people have been predicting the demise of their fat margins for years now, and though IT executives grumble, they keep paying. Another company that is vulnerable is Infor, which is why hiring Oracle’s ex-co-president, Charles Phillips, as its CEO was an important move. With far fewer technical resources and tighter purse strings than his former employer, Phillips faces the difficult job of steering a company built by consolidating the weaker players in the enterprise software business through an era of innovation. His task is daunting but far from impossible.

I think in this coming decade the introduction of innovation in enterprise applications will accelerate. That will change the terms of engagement in the market, but it’s not clear to me at this point that any of the market leaders is better or worse positioned to adapt. That will become clearer over the next two years. It should be interesting to watch.

Let me know your thoughts or come and collaborate with me on Facebook, LinkedIn and Twitter.

Regards,

Robert Kugel – SVP Research