
Hadoop: How Open Source Can Whittle Big Data Down to Size

March 2, 2012

SOURCE: Computerworld

Techworld Australia caught up with Doug Cutting to talk about Apache Hadoop, a software framework he created for processing massive amounts of data.

In 2011, ‘Big Data’ was, next to ‘Cloud’, the most-dropped buzzword of the year. In 2012, Big Data is set to become a serious issue that many IT organisations across the public and private sectors will need to come to grips with.

The challenge essentially comes down to this: How do you store the massive amounts of often-unstructured data generated by end users, and then transform that data into meaningful, useful information?

One tool that enterprises have turned to for help with this is Hadoop, an open source framework for the distributed processing of large amounts of data.
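To make that concrete, below is a minimal sketch of Hadoop's MapReduce programming model: the canonical word-count job, in which mappers emit a count of 1 for every word in their input split and reducers sum those counts per word. It is written against the org.apache.hadoop.mapreduce API; the input and output paths are placeholders supplied on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: runs in parallel over splits of the input files,
  // emitting (word, 1) for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: receives all counts for a given word (shuffled and
  // sorted by the framework) and sums them into a total.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    // The reducer doubles as a combiner: partial sums are computed
    // map-side to cut down the data shuffled across the cluster.
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // placeholder input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // placeholder output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, a job like this would typically be launched with something along the lines of `hadoop jar wordcount.jar WordCount /user/input /user/output`, with the framework handling input splitting, shuffling, and fault tolerance across the cluster.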




