This has been a dramatic year for Informatica, a major provider of data integration software. In August the company was acquired and taken private by Permira funds and Canada Pension Plan Investment Board for about US$5.3 billion. The change was accompanied by a shift in management: CEO Sohaib Abbasi became chairman and has since left, many executives were replaced, and Anil Chakravarthy moved from chief product officer to CEO. The new owners appear to have shifted the company’s strategic priorities to emphasize profitability, reduced headcount and a return on the purchase investment. Despite these changes, during the past six months Informatica has made key product announcements that will shape its future and the future of data management.
Topics: Big Data, Data Quality, Master Data Management, MDM, Operational Performance Management (OPM), Cloud Computing, Data Integration, Data Management, Data Preparation, Governance, Risk & Compliance (GRC), Informatica, Information Management, Business Performance Management (BPM), Information Optimization
Organizations today create and collect data at ever-faster rates, which introduces challenges in ensuring that data is not just managed but used consistently for a range of operational and analytic tasks. This is made more difficult by new data sources whose definitions vary from standard, widely used formats. Making all information available and consistent is essential to support business processes and decision-making. A key technology for this effort is master data management (MDM). Every business area needs MDM, whether it manages data about customers, products, employees or finances, individually or collectively in what is called multidomain MDM. It is an essential tool for data governance across an organization, which has become a focal point for improvement as many organizations spend significant time on data-related tasks. Our benchmark research on information optimization shows that preparing data for analysis (47%) and reviewing data for quality and consistency issues (45%) are the two information tasks that consume the most time. Properly used, MDM enables data stewards and other IT professionals to improve the consistency and quality of departmental and enterprise data.
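To make the idea of consistency concrete, here is a minimal sketch of the kind of record consolidation an MDM tool performs when it builds a "golden record" from duplicate entries in different systems. The record shapes, field names, match key and survivorship rule below are illustrative assumptions, not any vendor's actual logic.

```python
# Illustrative sketch of MDM-style consolidation into a golden record.
# All field names, sample data and rules here are hypothetical.

def normalize(record):
    """Normalize fields so records from different systems can be compared."""
    return {
        "email": record["email"].strip().lower(),
        "name": record["name"].strip().title(),
        "phone": "".join(ch for ch in record["phone"] if ch.isdigit()),
    }

def consolidate(records):
    """Group records by a match key (email here) and merge each group into
    one golden record, keeping the most complete value for each field."""
    golden = {}
    for rec in map(normalize, records):
        merged = golden.setdefault(rec["email"], {})
        for field, value in rec.items():
            # Survivorship rule: prefer the longest (most complete) value.
            if len(value) > len(merged.get(field, "")):
                merged[field] = value
    return list(golden.values())

# The same customer entered differently in a CRM and a billing system.
crm = {"name": "pat lee", "email": "Pat.Lee@example.com", "phone": "555-0100"}
billing = {"name": "Patricia Lee", "email": "pat.lee@example.com ", "phone": "(555) 0100"}
print(consolidate([crm, billing]))  # one merged record instead of two
```

Real MDM products apply far richer matching (fuzzy names, addresses, hierarchies) and configurable survivorship policies, but the goal is the same: one authoritative version of each entity.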
Topics: Data Quality, Master Data Management, Sales Performance, Social Media, Supply Chain Performance, Golden Records, MDM, Operational Performance, Business Analytics, Business Performance, Cloud Computing, Customer & Contact Center, Data Management, Financial Performance, Information Applications, Information Management, Workforce Performance
When applying information technology to drive better business performance, companies and the systems integrators that assist them often underestimate the importance of organizing data management around processes. For example, companies that do not execute their quote-to-cash cycle as an end-to-end process often experience a related set of issues in their sales, marketing, operations, accounting and finance functions that stem from entering the same data into multiple systems. The inability to automate the passing of data from one functional group to the next forces people to spend time re-entering data and leads to fragmented, disconnected data stores. The absence of a single authoritative data source also creates conflicts about whose numbers are “right.” Even when the actual figures recorded are identical, discrepancies can crop up because of issues in synchronization and data definition. Lacking an authoritative source, organizations may need to check for and resolve errors and inconsistencies between systems to ensure, for example, that what customers purchased was what they received and were billed for. The negative impact of this lack of automation is multiplied when transactions are complex or involve contracts for recurring services.
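The kind of cross-system checking described above can be sketched in a few lines: given order, shipping and billing records keyed by order number, flag any order where what was shipped or billed does not match what was sold. The system names, record shapes and sample figures are illustrative assumptions only.

```python
# Hypothetical sketch of reconciling quote-to-cash data across three systems.
# Record layouts and sample values are invented for illustration.

orders   = {"SO-1001": {"qty": 10, "amount": 500.00},
            "SO-1002": {"qty": 5,  "amount": 250.00}}
shipping = {"SO-1001": {"qty": 10},
            "SO-1002": {"qty": 4}}          # short shipment
billing  = {"SO-1001": {"amount": 500.00},
            "SO-1002": {"amount": 250.00}}  # billed for the full order anyway

def reconcile(orders, shipping, billing):
    """Return discrepancies between what was ordered, shipped and billed."""
    issues = []
    for order_id, order in orders.items():
        shipped = shipping.get(order_id, {}).get("qty")
        billed = billing.get(order_id, {}).get("amount")
        if shipped != order["qty"]:
            issues.append((order_id, "shipped %s of %s" % (shipped, order["qty"])))
        if billed != order["amount"]:
            issues.append((order_id, "billed %s, expected %s" % (billed, order["amount"])))
    return issues

for issue in reconcile(orders, shipping, billing):
    print(issue)  # flags the short-shipped but fully billed order SO-1002
```

With an end-to-end process and a single authoritative data source, this reconciliation step, and the labor behind it, largely disappears.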
Topics: Big Data, Mobile, Sales Performance, Supply Chain Performance, ERP, Operations, Management, close, closing, computing, end-to-end, quote-to-cash, requisition-to-pay, Operational Performance, Analytics, Business Performance, Cloud, Data Management, Information Applications, Information Management, Accounting, CRM, Data, finance, FPM
At the Informatica World 2014 conference, the company known for its data integration software unveiled the Intelligent Data Platform. In the last three years Informatica has expanded beyond data integration and now has a broad software portfolio that facilitates information management within the enterprise and through cloud computing. The Intelligent Data Platform forms a framework for its portfolio. This expression of broad potential is important for Informatica, which has been slow to position its products as capable of more than data integration. A large part of the value it provides lies in what its products can do to help organizations strengthen their enterprise architectures for managing applications and data. We see Informatica’s sweet spot in facilitating efficient use of data for business and IT purposes; we call this information optimization.
Topics: Big Data, Master Data Management, Sales Performance, Supply Chain Performance, application architecture, Operational Performance, Business Analytics, Business Intelligence, Business Performance, CIO, Cloud Computing, Customer & Contact Center, Data Integration, Data Management, Financial Performance, Informatica, Information Applications, Information Management, Workforce Performance, Information Optimization, Product Information Management
Many businesses are close to being overwhelmed by the unceasing growth of data they must process and analyze to find insights that can improve their operations and results. To manage this big data they can choose from a rapidly expanding portfolio of technology products. A significant vendor in this market is SAS Institute. I recently attended the company’s annual analyst summit, Inside Intelligence 2014 (Twitter hashtag #SASSB). SAS reported more than $3 billion in software revenue for 2013 and is known globally for its analytics software. Recently it has become a more significant presence in data management as well. SAS provides applications for various lines of business and industries in areas as diverse as fraud prevention, security, customer service and marketing. To accomplish this it applies analytics to what is now called big data, though the company has many decades of experience in dealing with large volumes of data. Recently SAS set a goal to be the vendor of choice for analytic, data and visualization software for Hadoop. To achieve this aggressive goal the company will have to make significant further investments not only in its products but also in marketing and sales. Our benchmark research on big data analytics shows that three out of four (76%) organizations view big data analytics as analyzing data from all sources, not just one, which sets the bar high for vendors seeking to win their business.
Topics: Big Data, Predictive Analytics, SAS, Event Stream, Operational Performance, Analytics, Business Analytics, Business Intelligence, Business Performance, CIO, Customer & Contact Center, Data Management, Information Applications, Information Management, Location Intelligence, Operational Intelligence, Discovery