
IBM’s New Data Center Analytics

December 7, 2010

As if everything wasn’t changing fast enough: Now data centers built for decades need continual redesign.

On Tuesday, IBM will announce it is selling predictive analytics software for the design, planning, maintenance and upgrades of some of business' biggest cost centers. Not long ago, these multibillion-dollar computing palaces were built to the same spec for decades. Now, says IBM, they should be under near-continual review.

“Seventy-one percent of data centers are technically obsolete,” says Steven Sams, IBM’s vice president of Global Site and Facilities Services. “Operating them is five times the cost of building them.” Savings from the new method, he suggested, could be about 30% of operational costs, meaning that over a 30-year lifespan the savings would pay for the cost of construction. Typical return on investment, he said, had gone from three to five years down to six to 18 months.
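A quick back-of-the-envelope check shows how the two ratios Sams cites fit together. The sketch below uses normalized placeholder figures, not numbers from the article, and assumes the five-to-one operating-to-construction ratio is measured over the same 30-year lifespan:

```python
# Rough check of the payback claim, using normalized numbers rather
# than real dollar figures (the article only gives ratios).

build_cost = 1.0                  # construction cost, normalized to 1
lifetime_opex = 5.0 * build_cost  # "operating them is five times the cost of building them"
savings_rate = 0.30               # ~30% operational savings from the new method

lifetime_savings = savings_rate * lifetime_opex
print(f"Lifetime savings: ~{lifetime_savings:.1f}x construction cost")
# Output: Lifetime savings: ~1.5x construction cost
```

Under those assumptions, 30% of a 5x operating bill comes to 1.5x the construction cost, which is consistent with the claim that the savings alone would cover building the facility.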

The explosion of Internet-based cloud computing services has Sams convinced that today’s 55 million servers in use worldwide will grow to 82 million by 2013, and that between now and the end of 2012 global data storage capacity may reach 6.5 times today’s level. (Those are also the kind of numbers big consultancies and big vendors throw around to scare the bejeezus out of the rest of us. Worked for me.)
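For a sense of scale, here is a small sketch converting those forecasts into implied annual growth rates. It assumes the server figure compounds over roughly three years (2010 to 2013) and the storage figure over two, timeframes the article does not state explicitly:

```python
# Implied compound annual growth rates behind the forecasts above,
# assuming "now" means late 2010 (the post's publication date).

servers_now, servers_2013, years = 55e6, 82e6, 3
server_growth = (servers_2013 / servers_now) ** (1 / years) - 1
print(f"Servers: {server_growth:.1%} per year")   # ~14.2% per year

storage_multiple, storage_years = 6.5, 2          # 6.5x by end of 2012
storage_growth = storage_multiple ** (1 / storage_years) - 1
print(f"Storage: {storage_growth:.1%} per year")  # ~155% per year
```

The asymmetry is the striking part: server counts growing at a steady double-digit clip, while storage demand more than doubles every year.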

Read More of Quentin Hardy’s Blog Post on Forbes.com

