
Hadoop: How Open Source Can Whittle Big Data Down to Size

March 2, 2012

SOURCE: Computerworld

Techworld Australia caught up with Doug Cutting to talk about Apache Hadoop, a software framework he created for processing massive amounts of data.

In 2011, ‘Big Data’ was, next to ‘Cloud’, the most frequently dropped buzzword of the year. In 2012, Big Data is set to become a serious issue that many IT organisations across the public and private sectors will need to come to grips with.

The challenge essentially comes down to this: How do you store the massive amounts of often-unstructured data generated by end users and then transform it into meaningful, useful information?

One tool that enterprises have turned to for help with this is Hadoop, an open-source framework for the distributed processing of large data sets.
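The processing model at Hadoop's core is MapReduce: map input records to key-value pairs, group the pairs by key, then reduce each group to a result. The canonical example is counting words across many documents. Below is a minimal sketch of that model simulated in pure Python, with no Hadoop cluster involved; the function names are our own, not part of any Hadoop API.

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (word, 1) pairs, as a Hadoop mapper would for each input split."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group values by key, mimicking Hadoop's shuffle-and-sort stage."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts for each word, as a Hadoop reducer would."""
    return {word: sum(counts) for word, counts in grouped.items()}

# Two tiny "documents" stand in for the terabytes a real cluster would split
# across many machines; the structure of the computation is the same.
docs = ["big data big cloud", "open source data"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'big': 2, 'data': 2, 'cloud': 1, 'open': 1, 'source': 1}
```

What makes this model scale is that the map and reduce steps are independent per record and per key, so Hadoop can run them in parallel across a cluster and move the computation to wherever the data is stored.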


Categories: DATA and ANALYTICS, Featured Articles, OPEN SOURCE, SOCIAL BUSINESS


