A Roadmap for the Edge of the Internet
Topic: Technology 7:37 am EDT, Aug 13, 2008

"I need data for my blade server!"

In the curious way of technological evolution, we first had computers that occupied entire rooms, watched them shrink to desktop, laptop and palm-sized devices, and now find ourselves coming full circle, and then some, Alan Benner reports. He tells this MIT class about warehouse-sized data centers, linking processors, and ensembles of processors, in dizzyingly complex hierarchies. These gigantic operations, some with their own power and air conditioning plants, are central to the enterprise of Internet behemoths Google, Amazon and YouTube, but have not yet percolated out to more traditional companies like insurance firms -- a situation Benner and his IBM colleagues would like to remedy.

Benner describes in broad strokes how these data operations are organized into levels of “virtualization and consolidation,” where the hardware is hidden, yet the data remains fully accessible and secure, no matter where the user and the computers are located. These new enterprise data centers aim to maximize efficiency, in both utilization and power consumption. It’s better, says Benner, to have fewer, bigger, well-integrated machines working as much of the time as possible. Since even idle servers use a lot of power, users should share processing time in a manner that keeps the processors occupied. Benner describes computer architecture and software aimed at “statistically multiplexing jobs,” matching peaks in one group’s workload to lulls in another’s. Ideally, users remain blissfully unaware of this traffic management, and need never worry whether their information is getting crunched next door or on the other side of the planet.
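As a rough illustration of the statistical multiplexing idea, the sketch below compares two groups whose demand peaks at different times of day. Provisioned separately, each group must buy enough servers for its own peak; consolidated onto shared infrastructure, capacity only has to cover the peak of the combined load, which is smaller because the peaks do not coincide. The server capacity and hourly demand figures here are hypothetical, chosen only to make the arithmetic visible, not numbers from Benner's talk.

```python
# Minimal sketch of statistical multiplexing of workloads.
# All capacities and demand curves are hypothetical illustration values.
import math

SERVER_CAPACITY = 100  # hypothetical jobs per hour one server can handle

# Hypothetical hourly demand (jobs/hour) over a 24-hour day.
# Group A peaks during business hours; group B peaks overnight.
group_a = [20, 15, 10, 10, 10, 20, 60, 120, 180, 200, 210, 200,
           190, 200, 210, 190, 160, 120, 80, 60, 40, 30, 25, 20]
group_b = [200, 210, 190, 180, 170, 150, 100, 60, 40, 30, 25, 20,
           20, 25, 30, 40, 60, 90, 120, 150, 170, 190, 200, 205]

def servers_for_peak(load):
    """Servers needed if capacity is provisioned for the worst hour."""
    return math.ceil(max(load) / SERVER_CAPACITY)

# Each group provisioned in isolation pays for its own peak.
separate = servers_for_peak(group_a) + servers_for_peak(group_b)

# Shared infrastructure only needs to cover the peak of the combined load.
combined = [a + b for a, b in zip(group_a, group_b)]
shared = servers_for_peak(combined)

print(f"Servers if provisioned separately: {separate}")   # 3 + 3 = 6
print(f"Servers if workloads are multiplexed: {shared}")  # 3
```

With these made-up curves, the two groups need six servers when provisioned independently but only three when consolidated, which is the efficiency argument for keeping fewer, larger machines busy around the clock.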

Benner hopes that companies will see advantages in migrating their data and services to a bigger, shared infrastructure, especially now with the near-ubiquity of high bandwidth networks. Given the rapid rise of energy costs, and the burdens of supporting a growing IT administration, it may save money “to move work to where it can be done most efficiently,” he says.

See also:

I want to stress that last point because there is no denying it: the system failed. The active wrong-doing detailed in the two joint reports was not systemic in that only a few people were directly implicated in it. But the failure was systemic in that the system – the institution – failed to check the behavior of those who did wrong.
