Ventana Research Analyst Perspectives

Teradata Brings In-Memory Computing and Data Discovery to Big Data

Posted by Mark Smith on May 8, 2013 11:20:05 PM

Teradata recently gave me a technology update and a peek into the future of its portfolio for big data, information management and business analytics at its annual technology influencer summit. The company continues to innovate and build upon its Teradata 14 releases and its new processing technology. Since my last analysis of Teradata’s big data strategy, it has embraced technologies like Hadoop with its Teradata Aster Appliance, which won our 2012 Technology Innovation Award in Big Data. Teradata is steadily extending beyond providing just big data technology to offer a range of analytic options and appliances through advances in Teradata Aster and its overall data and analytic architectures. One example is its data warehouse appliance business, which according to our benchmark research is one of the key technological approaches to big data; in addition, Teradata now supports in-memory databases, specialized databases and Hadoop with its own technology offerings in one integrated architecture. It is taking an enterprise management approach to these technologies through Teradata Viewpoint, which helps monitor and manage systems and supports a more distributed computing architecture.

Read More

Topics: Big Data, MicroStrategy, SAS, Tableau, Teradata, Customer Excellence, Operational Performance, Analytics, Business Analytics, Business Intelligence, CIO, Cloud Computing, Customer & Contact Center, In-Memory Computing, Information Applications, Information Management, Location Intelligence, Operational Intelligence, CMO, Discovery, Intelligent Memory, Teradata Aster, Strata+Hadoop

Information Optimization is a Key Benefit of Big Data Investments

Posted by Mark Smith on Mar 8, 2013 10:26:18 AM

Data is a commodity in business. To become useful information, data must be put into a specific business context. Without information, today’s businesses can’t function. Without the right information, available to the right people at the right time, an organization cannot make the right decisions, take the right actions, or compete effectively and prosper. Information must be crafted and made available to employees, customers, suppliers, partners and consumers in the forms they want and at the moments they need it. Optimizing information in this manner is essential to business success. Yet I see organizations today investing in big data because they believe it can effortlessly bring analysts insights. That premise is incorrect.

Read More

Topics: Big Data, Analytics, Business Analytics, Cloud Computing, In-Memory Computing, Information Applications, Information Management, Information Optimization, Strata+Hadoop, Digital Technology

Encountering New Bottlenecks with Oracle’s Breakthrough Technology

Posted by Robert Kugel on Oct 3, 2012 11:43:23 AM

Two key themes that emerged from Larry Ellison’s Sunday night keynote at this year’s Oracle OpenWorld were faster processing speed and cheaper storage. An underlying purpose of these themes was to assert the importance of Oracle’s strategic vertical integration of hardware and software following its acquisition of Sun. I try to view technology keynotes like this from the perspective of a practical business user. Advancements such as these are important because enhancing the performance and cost-effectiveness of IT infrastructure can drive substantially improved business capabilities. As I’ve noted in the past, the ability to rapidly process large amounts of data provides business users with significant new capabilities in areas such as complex event processing, social media analytics and the analysis of unstructured or semi-structured data. It also has the potential to change how companies perform a wide range of analytics-driven processes, especially planning, budgeting and forecasting. It becomes feasible to more fully explore the impact of different courses of action because, rather than waiting hours or days for answers to questions that start with “What happens if we…”, the answers come back in seconds, as the sketch below illustrates. Review and planning sessions can focus more on what’s next rather than rehashing history.
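
To make that interactive what-if pattern concrete, here is a minimal, hypothetical Python sketch; the planning model and every figure in it are invented for illustration and have nothing to do with Oracle’s products, but they show why answers that come back in seconds rather than hours change how planning sessions run.

```python
# Hedged, toy sketch of the "what happens if we..." pattern.
# With data and the model held in memory, many scenarios can be
# evaluated interactively instead of in overnight batch runs.
# All figures and the model itself are invented for illustration.

baseline_revenue = 10_000_000.0   # hypothetical annual revenue
baseline_cost = 7_500_000.0       # hypothetical annual cost

def operating_margin(price_change: float, volume_change: float) -> float:
    """Very simplified planning model: revenue scales with price and
    volume, cost scales with volume only."""
    revenue = baseline_revenue * (1 + price_change) * (1 + volume_change)
    cost = baseline_cost * (1 + volume_change)
    return (revenue - cost) / revenue

# Sweep a grid of scenarios in memory; each answer comes back immediately.
for price in (-0.05, 0.0, 0.05):
    for volume in (-0.10, 0.0, 0.10):
        margin = operating_margin(price, volume)
        print(f"price {price:+.0%}, volume {volume:+.0%} -> margin {margin:.1%}")
```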

Read More

Topics: Big Data, Customer Experience, executive, IT Performance, Business Analytics, Business Performance, Data Management, Financial Performance, In-Memory Computing, Information Management, Business Process Management, Data, FPM

Tidemark Reaches the Starting Gate

Posted by Robert Kugel on Jan 3, 2012 11:50:33 AM

My colleague Mark Smith and I recently chatted with executives of Tidemark, a company in the early stages of providing business analytics for decision-makers. It has a roster of experienced executive talent and solid financial backing. There’s a strategic link with Workday that reflects a common background at the operational and investor levels. As it gets rolling, Tidemark is targeting large and very large companies as customers for its cloud-based system for analyzing data. The system can automate alerts to enhance operating visibility, support collaborative assessment of the potential impacts of decisions, and help implement those decisions.

Read More

Topics: Big Data, Data Warehousing, Master Data Management, Performance Management, Planning, Predictive Analytics, Sales Performance, GRC, Budgeting, Risk Analytics, Operational Performance, Analytics, Business Analytics, Business Collaboration, Business Intelligence, Business Mobility, Business Performance, Cloud Computing, Customer & Contact Center, Data Governance, Data Integration, Financial Performance, In-Memory Computing, Information Management, Mobility, Workforce Performance, Risk, Workday, Financial Performance Management, Integrated Business Planning, Strata+Hadoop

Tableau 6 Combines In-Memory Processing and Visualization

Posted by Ventana Research on Nov 28, 2010 12:14:12 PM

Tableau Software officially released Version 6 of its product this week. Tableau approaches business intelligence from the end user’s perspective, focusing primarily on delivering tools that allow people to easily interact with data and visualize it. With this release, Tableau has advanced its in-memory processing capabilities significantly. Fundamentally, Tableau 6 shifts from the intelligent caching scheme used in prior versions to a columnar, in-memory data architecture in order to increase performance and scalability.
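
As a hedged illustration of the general idea behind a columnar layout (a toy Python sketch, not a description of Tableau’s proprietary data engine), the fragment below contrasts row-oriented and columnar in-memory storage and shows why an analytic aggregation touches far less data in the columnar form.

```python
# Illustrative only: a toy contrast between row-oriented and columnar
# in-memory storage; it does not represent Tableau's actual engine.

# Row-oriented: each record is stored together.
rows = [
    {"region": "East", "product": "A", "sales": 120.0},
    {"region": "West", "product": "B", "sales": 95.5},
    {"region": "East", "product": "A", "sales": 210.25},
]

# Columnar: each field is stored as its own contiguous sequence.
columns = {
    "region": ["East", "West", "East"],
    "product": ["A", "B", "A"],
    "sales": [120.0, 95.5, 210.25],
}

# An analytic query such as "total sales" needs only one field.
# Row storage scans every field of every record...
total_from_rows = sum(r["sales"] for r in rows)

# ...while columnar storage scans just the one array it needs,
# which is cache-friendly and compresses well at scale.
total_from_columns = sum(columns["sales"])

assert total_from_rows == total_from_columns
print(total_from_columns)  # 425.75
```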

Read More

Topics: Data Visualization, Enterprise Data Strategy, Tableau, Analytics, Business Analytics, Business Intelligence, CIO, In-Memory Computing

Advantages and Challenges of In-Memory Databases and Processing

Posted by Ventana Research on Nov 28, 2010 12:12:40 PM

Interest in and development of in-memory technologies have increased over the last few years, driven in part by the widespread availability of affordable 64-bit hardware and operating systems and by the performance advantages in-memory operations provide over disk-based operations. Some software vendors, such as SAP, whose High-Performance Analytic Appliance (HANA) project has been advancing with momentum, have even suggested that we can put entire analytic systems in memory.
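
As a rough, hedged illustration of that performance gap (a toy Python comparison, not a database benchmark; results will vary with hardware, data volume and file-system caching), the sketch below times the same aggregation over data held in RAM and over data re-read from a file on disk.

```python
# Toy illustration of in-memory vs. disk-based aggregation.
# Not a database benchmark; absolute numbers depend entirely on hardware,
# file-system caching, and data volume.
import os
import time

N = 1_000_000
values = list(range(N))          # data held in memory

path = "values.txt"
with open(path, "w") as f:       # the same data persisted to disk
    f.write("\n".join(str(v) for v in values))

start = time.perf_counter()
in_memory_total = sum(values)    # pure in-memory scan
in_memory_secs = time.perf_counter() - start

start = time.perf_counter()
with open(path) as f:            # disk read plus parsing on every run
    on_disk_total = sum(int(line) for line in f)
disk_secs = time.perf_counter() - start

assert in_memory_total == on_disk_total
print(f"in-memory: {in_memory_secs:.4f}s, from disk: {disk_secs:.4f}s")
os.remove(path)
```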

Read More

Topics: Database, Enterprise Data Strategy, IT Performance, Analytics, Business Analytics, Business Intelligence, CIO, Complex Event Processing, In-Memory Computing, Information Management, Information Technology
