Composite Software Extends Big Data Analytics Support
June 12, 2013
SOURCE: Composite Software
SAN MATEO, Calif. — June 11, 2013 — Composite Software today introduced a new version of its Composite Data Virtualization Platform that enables organizations to turn big data analytics into competitive advantage by dramatically simplifying integration of big data with enterprise, cloud and desktop sources. With Composite 6.2 SP3, the company updates its Hortonworks, Cloudera and Apache Hadoop big data integrations through Hive and adds data integration with Cloudera through Impala.
“Big data is becoming part of normal operations for leading organizations,” according to Gartner, Inc.’s February 2013 report Big Data Adoption in the Logical Data Warehouse.¹ “Approximately 50 percent of all organizations planning on incorporating big data into their normal operations are planning to use data integration and data warehouses to do it.”
Big Data Skills Shortage
The emergence of big data has opened the door to unprecedented analytic opportunities for business innovation, customer retention and profit growth. However, the big data skills shortage is a bottleneck for organizations moving from early big data experiments into enterprise-scale adoption, and it is limiting big data analytics success.
Composite’s data virtualization offerings address this skills shortage directly, providing five-to-10 times faster time-to-build than traditional data integration methods by leveraging industry-standard SQL. As a result, Composite users are able to develop more-extensive big data analytics far sooner.
“The key to analytics is data. The more data the better, and that includes big data as well as the myriad other sources,” said David Besemer, CTO of Composite Software. “SQL is the key to unlocking big data’s value to the rest of the enterprise. Nearly everyone knows SQL, but very few know MapReduce. Composite is pleased we can help organizations bridge this gap and accelerate their big data analytics efforts.”
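The SQL-versus-MapReduce gap described above can be illustrated with a toy example. This sketch is purely illustrative — it uses Python stand-ins rather than Composite’s actual platform or a real Hadoop cluster — and shows the same aggregation written once as a single declarative SQL statement and once as hand-coded map, shuffle and reduce phases:

```python
# Illustrative comparison (not Composite's API): the same aggregation
# expressed as declarative SQL versus hand-written map/reduce steps.
import sqlite3
from collections import defaultdict

orders = [("east", 100), ("west", 250), ("east", 50), ("west", 75)]

# Declarative SQL: one statement, familiar to most analysts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", orders)
sql_totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))

# MapReduce style: explicit map, shuffle and reduce phases.
def map_phase(records):
    # Emit (key, value) pairs for each input record.
    for region, amount in records:
        yield region, amount

def shuffle(pairs):
    # Group emitted values by key, as the framework would between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values into a final result.
    return {key: sum(values) for key, values in groups.items()}

mr_totals = reduce_phase(shuffle(map_phase(orders)))

print(sql_totals == mr_totals)  # both yield {'east': 150, 'west': 325}
```

The one-line query and the three hand-written phases produce identical results; the point is that the SQL form requires only a skill most analysts already have.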
Composite was the first data virtualization vendor to support Hadoop integration, with the June 2011 release of its Data Virtualization Platform. Composite 6 provided a standardized SQL approach to augment specialized MapReduce coding of Hadoop queries. By simplifying integration of Hadoop data, organizations could for the first time extend their analytics and BI to include this emerging “big data” source as well as enterprise, cloud and other data sources.
With Composite 6.1 in February 2012, Composite became the first data virtualization vendor to enable MapReduce programs to easily query Composite as a data source, on demand and with high performance. This allowed enterprises to extend MapReduce analyses beyond Hadoop stores to include diverse enterprise data managed by Composite.
With today’s release of Composite 6.2 SP3, Composite maintains its big data integration leadership with updated Hive access to the leading Hadoop distributions, including Apache Hadoop, Cloudera Distribution (CDH) and Hortonworks Data Platform (HDP). In addition, Composite now also supports access to Hadoop through HiveServer2 and to Cloudera CDH through Impala.
According to Jai Malhotra, cofounder of Xtremeinsights, a new-breed technology services provider focused on big data and advanced analytics, “The big data market is evolving rapidly and my customers want to stay on the leading edge. Composite helps us do just that.”
¹ “Big Data Adoption in the Logical Data Warehouse,” Gartner, Inc., February 2013, Mark A. Beyer, Ted Friedman.