Using information technology to make data useful is as old as the Information Age. The difference today is that the volume and variety of available data have grown enormously. Big data gets almost all of the attention, but there is also cryptic data. Both are difficult to harness with basic tools, and both require new technology to help organizations glean actionable information from a large, chaotic mass of data. “Big data” refers to extremely large data sets that can be analyzed computationally to reveal patterns, trends and associations, especially those related to human behavior and interaction. The challenges in dealing with big data include computational power that can scale to the processing requirements of the volumes involved; analytical tools that can work with very large data sets; and governance to manage those data sets so that the results of analysis are accurate and meaningful. But that’s not all organizations have to deal with now. I’ve coined the term “cryptic data” to focus on a different, less well-known sort of data challenge that many companies and individuals face.
The need for businesses to process and analyze data has grown in intensity along with the volumes of data they are amassing. Our benchmark research consistently shows that preparing data is the most widespread impediment to analytic and operational efficiency. In our recent research on data and analytics in the cloud, more than half (55%) of organizations said that preparing data for analysis is a major impediment, followed by other preparatory tasks: reviewing data for quality and consistency (48%) and waiting for data and information (28%). Organizations that want to apply analytics to make more effective decisions and take prompt action need to find ways to shorten this preparatory work. Conventional analytics and business intelligence tools are not designed for data preparation, but new software tools can enable business users, independently or in concert with IT, to perform the tasks needed.
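To make the preparatory tasks above concrete, here is a minimal sketch of a data-preparation pass: normalizing inconsistent values, flagging incomplete records for quality review, and removing duplicates before analysis. The record layout and field names are hypothetical illustrations, not drawn from the research.

```python
def prepare(records):
    """Return (clean, flagged) from a list of record dicts.

    Normalizes string fields, flags rows with missing values for
    review rather than silently dropping them, and deduplicates.
    """
    seen = set()
    clean, flagged = [], []
    for rec in records:
        # Normalize: trim whitespace; lowercase the (hypothetical) region code
        rec = {k: (v.strip() if isinstance(v, str) else v)
               for k, v in rec.items()}
        if isinstance(rec.get("region"), str):
            rec["region"] = rec["region"].lower()
        # Flag incomplete rows so data quality can be reviewed separately
        if any(v in (None, "") for v in rec.values()):
            flagged.append(rec)
            continue
        # Deduplicate on the full normalized record
        key = tuple(sorted(rec.items()))
        if key in seen:
            continue
        seen.add(key)
        clean.append(rec)
    return clean, flagged
```

Even a simple pass like this illustrates why preparation dominates the analytic workflow: each rule encodes a judgment about quality and consistency that must be settled before any analysis can be trusted.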
Big data has become a big deal as the technology industry has invested tens of billions of dollars to create the next generation of databases and data processing. After the accompanying flood of new categories and marketing terminology from vendors, most in the IT community are now beginning to understand the potential of big data. Ventana Research covered the evolving state of the big data and information optimization sector thoroughly in 2014 and will continue this research in 2015 and beyond. As the sector matures, it becomes critical to make big data systems interoperate with existing enterprise and information architectures and with digital transformation strategies. Done properly, this lets companies take advantage of big data innovations to optimize their established business processes and execute new business strategies. But deploying big data and applying analytics to understand it is just the beginning. Innovative organizations must go beyond the usual exploratory and root-cause analyses through applied analytic discovery and other techniques, which of course requires them to develop competencies in information management for big data.
We recently released our benchmark research on big data analytics, and it sheds light on many of the most important discussions occurring in business technology today. The study’s structure was based on the big data analytics framework that I laid out last year as well as the framework that my colleague Mark Smith put forth on the four types of discovery technology available. These frameworks view big data and analytics as part of a major change: a movement from designed data to organic data, the bringing together of analytics and data in a single system, and a corresponding shift from the technology-oriented three Vs of big data to the business-oriented three Ws of data. Our big data analytics research confirms these trends but also reveals some important subtleties and new findings with respect to this important emerging market. I want to share three of the most interesting and even surprising results and their implications for the big data analytics market.
Our recently released benchmark research on information optimization shows that 97 percent of organizations find it important or very important to make information available to the business and customers, yet only 25 percent are satisfied with the technology they use to provide that access. This wide gap between importance and satisfaction reflects the complexity of preparing and presenting information in a world where users need to access many forms of data that exist across distributed systems.