Teradata recently gave me a technology update and a peek into the future of its portfolio for big data, information management and business analytics at its annual technology influencer summit. The company continues to innovate and build upon its Teradata 14 releases and its new processing technology. Since my last analysis of Teradata’s big data strategy, it has embraced technologies like Hadoop with its Teradata Aster Appliance, which won our 2012 Technology Innovation Award in Big Data. Teradata is steadily extending beyond providing just big data technology to offer a range of analytic options and appliances through advances in Teradata Aster and its overall data and analytic architectures. One example is its data warehouse appliance business, which according to our benchmark research is one of the key technological approaches to big data; Teradata has also advanced its own technology offering to support in-memory databases, specialized databases and Hadoop in one integrated architecture. It takes an enterprise management approach to these technologies through Teradata Viewpoint, which helps monitor and manage systems and supports a more distributed computing architecture.
By expanding its platform to include workload-based appliances that can support terabytes to petabytes of data, its Unified Data Architecture (UDA) can meet a broad class of enterprise needs. That can help support a range of big data analytic needs, as my colleague Tony Cosentino has pointed out, by providing a common approach to getting data from Hadoop into Teradata Aster and then into Teradata’s analytics. This UDA can begin to address challenges in data activities and tasks in the analytic process, which our research finds are issues for 42 percent of organizations. Teradata Aster Big Analytics Appliance is for organizations that are serious about retaining and analyzing more data, which 29 percent of organizations in our research cited as the top benefit of big data technology. This appliance can handle up to 5 petabytes and is tightly integrated with Aster and Hadoop technology from Hortonworks, a company that is rapidly expanding its footprint, as I have already assessed.
The packaged approach of an appliance can help organizations address what our technology innovation research identified as the largest challenges in big data: not enough skilled resources (for 56% of organizations) and systems that are hard to build and maintain (52%). These challenges can be overcome if an organization designs a big data strategy around a common set of skills, and the Teradata technology portfolio can help with that.
At the influencer summit, I was surprised that Teradata did not go into the role of data integration processes and the steps to profile, cleanse, master, synchronize and even migrate data (which its closest partner, Informatica, emphasizes) but focused more on access to and movement of data through its own connectors, Unity Data Mover, Smart Loader for Hadoop and support of SQL-H. In most of its deployments a range of complementary data integration technology from partners is used alongside Teradata’s own approach. For SQL-H, Teradata takes advantage of the HCatalog metadata store to improve access to data in HDFS. I like how Teradata Studio 14 helps simplify the view and use of data in Hadoop, Teradata Aster and even spreadsheets and flat files for building sandbox and test environments for big data. (To learn more, look into the Teradata Developer Exchange.) Teradata has made it easy to add connectors for access to Hadoop on its Exchange, which is a great way to get the latest advances in its utilities and add-ons.
Teradata provided an early peek at the just-announced Teradata Intelligent Memory, a significant step in adapting big data architectures to the next generation of memory management. This new advancement analyzes data to determine its importance (described as hot, warm or cold) and then caches and pools data in high demand across any number of Teradata workload-specific platforms for fast, efficient access and analytics. It can utilize both solid-state and conventional disk storage to ensure the fastest access and computation of the data for a range of needs. This is a unique and powerful way to support an extended memory space for big data and to intelligently adapt to the data patterns of user organizations; its algorithms can interoperate across Teradata’s family of appliances.
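The general idea of temperature-based data placement can be sketched conceptually. The snippet below is my own illustration, not Teradata’s implementation: the thresholds, tier names and table names are invented assumptions, and real systems classify data with far more sophisticated access-pattern statistics.

```python
from collections import Counter

# Illustrative sketch of hot/warm/cold data placement: classify data blocks
# by access frequency, then map each temperature to a storage tier.
# Thresholds and tier names are hypothetical, chosen only for the example.
HOT_THRESHOLD = 100   # accesses per window -> keep in memory (hot)
WARM_THRESHOLD = 10   # accesses per window -> keep on SSD (warm)

def classify(access_counts):
    """Assign each data block a storage tier based on access frequency."""
    tiers = {}
    for block, hits in access_counts.items():
        if hits >= HOT_THRESHOLD:
            tiers[block] = "memory"   # hot data
        elif hits >= WARM_THRESHOLD:
            tiers[block] = "ssd"      # warm data
        else:
            tiers[block] = "disk"     # cold data
    return tiers

# Hypothetical access counts for three tables over a sampling window
accesses = Counter({"orders_2013": 250, "orders_2012": 40, "orders_2005": 2})
print(classify(accesses))
```

The point of the sketch is only the policy shape: frequently touched data migrates toward faster, costlier media while rarely touched data settles on disk, which is the pattern Intelligent Memory automates across appliances.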
Teradata has also invested further into its data and computing architecture through what it calls fabric-based computing. That can help connect nodes across systems through access on the company’s Fabric Switch using its BYNET, InfiniBand and other methods. (Teradata participates in the OpenFabrics Alliance, which works to optimize access and interconnection of systems data across storage-area networks.) Fabric Switch provides an access point through which other aspects of Teradata’s UDA can access and use data for various purposes, including backup and restore or data movement. These advances will significantly increase the throughput and combined reliability of systems and enhance performance and scalability at both the user and data levels.
Tony Cosentino pointed out the various types of analytics that Teradata can support; one of them is analytics for discovery through its recently launched Teradata Aster Discovery Platform. This directly addresses two of the four types of discovery I have just outlined: data and visual discovery. Teradata Aster has a powerful library of analytics such as path, text, statistical and cluster analysis as core elements of its platform. Its nPath analytic expression has significant potential in enabling Aster to process distributed sets of data from Teradata and Hadoop in one platform. Analytic architectures should apply the same computational analytics across systems, from core database technology to Teradata Aster to the analytics tools that an analyst is actually using. Aster’s approach to visual and data discovery is challenging in that it requires a high level of expertise in SQL to make customizations; the majority of analysts who could use this technology don’t have that level of knowledge. But here Teradata can turn to partners such as MicroStrategy and Tableau, which have built more integrated support for Teradata Aster and offer easier-to-use, interactive and visual tools designed for analysts who do not want to muck with SQL. Teradata has internal challenges in improving support for analysts and the analytic processes they are responsible for; its IT-focused, data-centric approach will not help here. Our big data research finds that staffing and training are the top two barriers to using this technology, according to more than 77 percent of organizations; vendors should note this and reduce the custom and manual work that requires specific SQL and data skills in their products.
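The kind of sequence-pattern matching that path analysis performs can be illustrated in plain Python. This is only a conceptual sketch, not Aster’s actual nPath SQL-MR syntax; the event names, symbol encoding and pattern are invented for the example.

```python
import re

# Conceptual sketch of path analysis: match a regex-style pattern over an
# ordered sequence of user events. Real nPath queries express this in
# SQL-MR over database rows; the event vocabulary here is hypothetical.
def matches_path(events, pattern):
    """Return True if the ordered event sequence matches the pattern."""
    symbols = {"search": "S", "product": "P", "cart": "C", "buy": "B"}
    encoded = "".join(symbols.get(e, "?") for e in events)
    return re.search(pattern, encoded) is not None

# Did the user search, view one or more products, then abandon the cart?
session = ["search", "product", "product", "cart"]
print(matches_path(session, "SP+C(?!B)"))
```

The design point is that expressing a behavioral pattern declaratively over ordered events, rather than hand-coding joins over timestamps, is what makes path expressions powerful for discovery work.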
Regarding analytics specifically, Teradata has continued to deepen its analytics efforts with partner SAS. A new release of Teradata Appliance supports SAS High-Performance Analytics for up to 52 terabytes of data and also supports SAS Visual Analytics, which I have tried and assessed myself.
Through its Teradata Aprimo applications Teradata continues its efforts to attract marketing executives in business-to-consumer companies that require big data technology to utilize a broad range of information. Teradata has outlined a larger role for the CMO with big data and analytics capabilities that go well beyond its marketing automation software. The company announced expansion to support predictive analytics and has outlined its direction for supporting customer engagement. It needs to take steps such as these to ensure it tunes into business needs beyond what CIOs and IT are doing with Teradata as a big data environment for the enterprise.
Along these lines I have also pointed out that we should be cautious about accepting research that predicts the CMO will outspend the CIO in the future. These assertions are flawed in many facets and often come from those who have no experience in market research or in marketing’s role in technology expenditure. As we have done research into both the business and IT sides, we have discovered the complexities of making practical technology investments; for example, our research into customer relationship maturity found that inbound interactions from customers occur across many departments; they occur in marketing (in 46% of organizations), but more often through contact centers (77%), where Teradata should strengthen its efforts. On the plus side Teradata continues to demonstrate success in assisting customers in marketing, winning our 2013 Leadership Award for Marketing Excellence with its deployment at International Speedway Corp. and in 2012 at Nationwide Insurance with Teradata Aprimo. Our current research into next-generation customer engagement already identifies a need to support multichannel and multidepartment interactions. Teradata could further expand its efforts in these areas with existing customers; KPN won our 2013 Leadership Award in Customer Excellence after connecting Teradata with its Oracle-based applications and supporting BI systems.
Overall Teradata is doing a great job of focusing on its strengths in big data and areas where it can maximize the impact of its analytics, especially marketing and customer relations. While IBM, Oracle, SAP and other large technology providers in the database and analytic markets tend to minimize what Teradata has created, it has a loyal customer base that is attracted to the expanded architectures of its appliances and its broader UDA and intelligent memory systems. I think with more focus on the processes of real business analysts and further simplifying usability, Teradata’s opportunity could grow significantly. In helping its customers process more of the vast volumes of data and information from the Internet, such as weather, demographic and social media, it could make clear the broader value of big data in optimizing information from the variety of data in content and documents. It could expand its new generation of tools and applications to exploit the use of this information as it is beginning to do with marketing applications from Teradata Aprimo. If Teradata customers find it easier to access information and share it across lines of business through social collaboration and mobile technology, that will further increase demand for its technology to operate on larger scales in both the number of users and the places where it can be accessed, even via cloud computing. Exploiting in-memory computing along with providing more discovery potential from analytics will help its customers utilize the power of big data and trust in Teradata to supply it.
CEO & Chief Research Officer
In some parts of the world, bribing government officials is still considered a normal cost of doing business. Elsewhere there has been a growing trend over the past 40 years to make it illegal for a corporation to pay bribes. In the United States, Congress passed the Foreign Corrupt Practices Act (FCPA) in 1977 in the wake of a succession of revelations of companies paying off government officials to secure arms deals or favorable tax treatment. More recently other governments have implemented anticorruption statutes. The U.K., for instance, enacted the strict Bribery Act in 2010 to replace increasingly ineffective statutes dating back to 1879. The purpose of these actions is to enable ethical and law-abiding companies to compete on a level playing field with those that are neither. A cynic might wonder about the real, functional difference between, say, Wal-Mart’s recent payments to officials in Mexico to accelerate approval of building permits and the practice in New York City of having to engage expediters to ensure timely sign-offs on construction approval documents. No matter – the latter is legal (it’s a domestic issue, after all) while the former is not.
Moreover, the U.S. Department of Justice (DOJ) and the Securities and Exchange Commission (SEC) have increased their oversight of bribery. At the beginning of 2013 they jointly issued the Resource Guide to the U.S. Foreign Corrupt Practices Act. For its part, the SEC has stepped up enforcement using its own resources. Recently, it charged a group of bond traders with enabling a Venezuelan finance official to embezzle millions of dollars by disguising the money as fees paid to the broker/dealer to handle apparently legitimate transactions. Tellingly, though, in another relatively recent bribery matter involving Morgan Stanley, the SEC declined to include the company in an enforcement action because it had demonstrated diligence in preventing violations.
Before anticorruption laws, it was expedient for corporations to pay government officials to close business, get preferred status or prevent punishment. Once the laws were established, that stopped being the case. However, from a management standpoint, compliance with the law became complicated because of the dual nature of the corporation, which is both an entity and a group of individuals. When an individual breaks the law, is that person at fault, the corporation, or both? Regardless of how a case is decided, there can be severe reputational damage to a company found violating the law, and that will have repercussions for corporate boards and executives.
This question leads to the agency dilemma, an important consideration in enterprise risk management. Economists long ago recognized the agency dilemma when the modern corporation separated the roles of its principals (that is, the shareholders) from its management. The agency issue exists where the best interests of the principals are either not aligned or in conflict with the interests of the agents (the professional managers running the corporation). But agency issues also extend to the company’s executives and may be rife in any large-scale business. Within the management group, authority to act independently is delegated down through the hierarchy, and the interests of the lower-level managers may be in conflict with those of senior executives, the board of directors and shareholders. For example, suppose that a local manager believes his performance evaluation, compensation and prospects for promotion hinge on the timely opening of a new facility. Confronted with a culture of payoffs for permits, that manager may try to find a way to pay officials for expedited consideration, especially if he is local to the area. From that individual’s perspective, corrupt activity may be the norm, and he may believe himself to be clever enough to violate company policy without detection.
It was once acceptable for a company to claim that it had a stated policy prohibiting bribery and that executives were ignorant of an employee’s actions. Absent proof to the contrary, that often was enough. However, the FCPA changed this norm, imposing the need for diligence and affirmative actions on the part of companies to prevent employees from breaking the law as well as to detect and report any such violations that do occur (which is how the Wal-Mart situation came to light). Public standards, too, have changed since the 1970s. Despite its self-disclosure after the fact and the steps it took to address the corrupt behavior, Wal-Mart suffered severe reputational damage. Yet even with the likelihood of such consequences, our benchmark research reveals that just 6 percent of companies have effective controls for managing reputational risk.
We assert that the most effective control is to prevent illegal activity from taking place at all. Short of that, companies that can demonstrate that they have taken all reasonable steps to prevent a violation of the law are in a better position to claim that the individual, not the company, is at fault.
An organization should have clearly articulated and documented antibribery and corruption policies and procedures, institute mandatory training for executives and managers (with signed acknowledgements that they have completed it), and put in place incentives and disciplinary measures. However, these required measures are increasingly insufficient to demonstrate diligence in preventing corrupt activities. Companies also must have a software-supported internal control system that flags suspicious activity immediately and triggers a rigorous remediation process that analyzes, investigates and documents the disposition of each incident. Incidents that are detected long after their commission are more difficult to cope with and pose much higher legal, financial and reputational risk.
Software is available that helps detect activities that violate anticorruption laws and regulations as they occur or shortly thereafter; this is far more effective than waiting for internal audits or (worse still) whistleblowers to uncover malfeasance. To prevent violations of the FCPA and other antibribery statutes, corporations must be able to monitor their financial and other systems for warning signs. These applications take advantage of operational intelligence, a class of analytical capabilities built on event-focused information-gathering that can uncover suspicious actions as they occur. Our research on innovating with operational intelligence shows that companies use an array of systems (led by IT systems management and major enterprise applications such as ERP and CRM) to track events, analyze them, report results and create alerts when conditions warrant them, as detailed in the related chart. The research also shows that about half (53%) use 11 or more information sources in implementing their operational intelligence efforts. In the future, effective FCPA software increasingly will need to look at a wider range of internal data as well as information from external sources and social media to determine, for example, whether a consulting company that just received a finder’s fee is run by or employs a relative of a government official. Today, companies can utilize software from large vendors such as IBM, Oracle and SAP, as well as vendors with FCPA-specific software such as Compliancy and Oversight Systems.
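To make the event-focused monitoring idea concrete, here is a minimal rule-based sketch of screening each payment as it occurs rather than waiting for a periodic audit. It is my own illustration, not any vendor’s product logic: the rules, thresholds, field names and risk lists are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of an event-driven anticorruption monitor: every
# payment event is screened against simple rules at the time it occurs.
# Real systems combine many more data sources and statistical models.
@dataclass
class Payment:
    payee: str
    amount: float
    category: str
    country: str

HIGH_RISK_COUNTRIES = {"XX", "YY"}  # placeholder jurisdiction codes
SUSPICIOUS_CATEGORIES = {"consulting fee", "finder's fee"}
AMOUNT_THRESHOLD = 10_000           # illustrative review threshold

def screen(event):
    """Return the list of rules a payment event trips (empty if clean)."""
    flags = []
    if event.amount >= AMOUNT_THRESHOLD:
        flags.append("large amount")
    if event.category in SUSPICIOUS_CATEGORIES:
        flags.append("suspicious category")
    if event.country in HIGH_RISK_COUNTRIES:
        flags.append("high-risk jurisdiction")
    return flags

alert = screen(Payment("Acme Advisors", 25_000, "finder's fee", "XX"))
print(alert)
```

Each non-empty result would feed the remediation workflow described above: analyze, investigate and document the disposition of the incident while it is still fresh.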
Bribery and corruption are unlikely to disappear entirely. Regardless of anyone’s best intentions, corporate boards and executives can find themselves enmeshed in a scandal not of their own devising. The best defense in such cases is plain evidence that the organization has done everything reasonable to prevent its occurrence and has discovered and dealt with it promptly if it does. Policies and training are vital components, but software can be the extra component necessary to improve the effectiveness of monitoring and auditing to support anticorruption efforts.
Robert Kugel – SVP Research