
From Cloud Computing to Fog Computing

October 2, 2013

How Cloud Computing Started

We have all heard the story – at least, fragments of it – of how Cloud Computing came to be what it is today. In the early 2000s, after the dot-com bust, Amazon – by then already a heavyweight retailer, but not yet the giant we know today – sought to monetize unused capacity in its data centers by “renting it out” to external customers on a utility computing basis. Like most data centers, Amazon’s were sized for occasional spikes such as Black Friday, and often ran at as little as 10 percent of capacity. Today, Amazon’s AWS division represents an undisclosed but significant share of Amazon’s $60bn+ total revenue, and is rumored to be its most profitable. And while Amazon is no longer the sole provider of cloud services, it has managed to hold on to its first-mover advantage.

In the 10+ years since the Cloud was first deployed, the skies have passed through several degrees of cloudiness, growing from scattered clouds to overcast – and now it’s even getting foggy…

Scattered Clouds

In the early days of Cloud Computing, applications were built and run on premises – inside the firewall, or in a DMZ, but almost always on servers owned and/or managed by IT. These servers could be hosted in third-party data centers, or rented – but in every case, applications ran on identified physical machines.

Some forward-thinking IT organizations started to recognize the value of offloading peak traffic to the Cloud. Instead of oversizing their data centers for the highest predicted load (a prediction that would never be reliable anyway), they could simply provision extra capacity in the Cloud when needed. Amazon Web Services’ elastic scalability and pricing model made this fairly easy. As a result, retailers could absorb the holiday season rush, tax preparers no longer feared the April deadline, and online flower vendors would not crash and burn on Valentine’s Day and Mother’s Day.
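The pattern described here is often called “cloud bursting”: serve traffic from fixed on-premises capacity, and provision cloud instances only for the overflow. A minimal sketch of that decision logic might look as follows (the capacity figures and function name are illustrative assumptions, not taken from any particular provider’s API):

```python
import math

ON_PREM_CAPACITY = 1_000        # requests/sec the on-prem servers handle (assumed)
CLOUD_INSTANCE_CAPACITY = 250   # requests/sec per cloud instance (assumed)

def cloud_instances_needed(demand: int) -> int:
    """Return how many cloud instances to provision for the given demand."""
    overflow = demand - ON_PREM_CAPACITY
    if overflow <= 0:
        return 0  # on-prem capacity covers everything; no burst needed
    # Round up: even a small overflow needs a whole extra instance.
    return math.ceil(overflow / CLOUD_INSTANCE_CAPACITY)

# A quiet Tuesday stays on premises; a Black Friday spike bursts to the cloud.
print(cloud_instances_needed(800))    # -> 0
print(cloud_instances_needed(2_000))  # -> 4
```

In practice the same ceiling-of-overflow logic sits behind real autoscaling policies; the difference is that a provider’s autoscaler measures demand continuously and starts or stops instances on your behalf.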

However, applications remained primarily on premises. Clouds remained scattered – their elasticity was used only when on-premises systems couldn’t cope.

And because every rule has its exception, some organizations were making high-stakes bets on the Cloud. The most visible example is Salesforce.com, which launched at roughly the same time Amazon was putting its Cloud on the map.

Overcast

A few years later, many organizations completed their transition to the Cloud, which had become a standard deployment mode for a broad variety of systems. With the exception of some highly-regulated industries, deploying systems in the Cloud is now a no-brainer. Not only is this true for SaaS application providers like Workday, NetSuite and ServiceSource (who have flourished in the wake of Salesforce), but IT organizations in many industries today view the Cloud as a deployment model for most of their applications, even home-grown ones.

Cloud providers themselves offer a breadth of deployment alternatives, from public Cloud to private Cloud, that satisfy the security and privacy demands of customers. These providers include global organizations such as Amazon and Microsoft (Azure) that offer a broad palette of services, as well as specialized vendors and projects like Eucalyptus or OpenStack that provide technology to build private Clouds.

Everywhere you look, there are Clouds. This is what is called ‘overcast’.

It’s Becoming Foggy

We are now entering a new era of cloudiness. The choice is no longer binary: applications are no longer deployed either on-premises (with the Cloud as an “overflow” option) or in the Cloud. Applications and IT systems are becoming hybrid themselves: they are deployed as a combination of on-premises and Cloud components. IT organizations work with a pool of infrastructure resources, some under their own control, some provided by a third party. They also face a distribution of data and application assets they don’t entirely control – social media data, open data, or B2B services, for example. Hybrid is no longer just an infrastructure term; it also applies to applications.

As a result, the boundary between Cloud and on-premises becomes blurry. Clouds are reaching on-premises data centers, creating a layer of fog: low clouds, reaching to the ground.

Cloud Computing is changing. It is now Fog Computing.


Yves de Montcheuil is the Vice President of Marketing at Talend, the recognized leader in open source integration. Yves holds a master’s degree in electrical engineering and computer science and has 20 years of experience in software product management, product marketing and corporate marketing. He is also a presenter, author, blogger, social media enthusiast, and can be followed on Twitter: @ydemontcheuil.

 
