
Hadoop: How Open Source Can Whittle Big Data Down to Size

March 2, 2012

SOURCE: Computerworld

Techworld Australia caught up with Doug Cutting to talk about Apache Hadoop, a software framework he created for processing massive amounts of data.

In 2011, ‘Big Data’ was, next to ‘Cloud’, the most frequently dropped buzzword of the year. In 2012, Big Data is set to become a serious issue that many IT organisations across the public and private sectors will need to come to grips with.

The challenge essentially comes down to this: How do you store the massive amounts of often-unstructured data generated by end users and then transform it into meaningful, useful information?

One tool that enterprises have turned to for help with this is Hadoop, an open source framework for the distributed processing of large amounts of data.
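The article itself links out rather than showing code, but to make Hadoop's programming model concrete, here is a minimal sketch of the canonical MapReduce word-count job written against the Hadoop Java API (the org.apache.hadoop.mapreduce classes in Hadoop 2.x-style usage; the class name, job name, and paths are illustrative, not from the article). The mapper emits a (word, 1) pair for every token, and the reducer sums the counts for each word across the cluster.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emit (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // local pre-aggregation on each mapper
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A job like this would typically be packaged into a jar and submitted to the cluster with something along the lines of hadoop jar wordcount.jar WordCount /input /output, where the input and output paths are HDFS directories chosen for the example.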

Read More

DATA and ANALYTICS, Featured Articles, OPEN SOURCE, SOCIAL BUSINESS
