Monday, July 4, 2011

What would you do with infinite compute?

Firing up a 1000-processor deep-analytics cluster in the cloud to answer a market segmentation question about your customer orders during Christmas 2010, or to run a sentiment analysis of your company's Facebook fan page, now costs less than having lunch in Palo Alto.
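To make that claim concrete, here is a back-of-envelope sketch. The instance size, hourly rate, and runtime are illustrative assumptions (roughly 2011-era spot pricing), not actual vendor quotes:

```python
# Rough cost estimate for a 1000-processor cluster run.
# All figures below are assumptions for illustration only.

cores_needed = 1000              # the "1000 processor" cluster
cores_per_instance = 8           # assumed 8-core instance type
price_per_instance_hour = 0.12   # assumed USD/hour (hypothetical spot rate)
runtime_hours = 1                # assume the job finishes within an hour

instances = cores_needed // cores_per_instance
total_cost = instances * price_per_instance_hour * runtime_hours

print(f"{instances} instances, ~${total_cost:.2f} total")
```

Under these assumed numbers, 125 instances for an hour land in lunch-money territory; the point is the order of magnitude, not the exact price.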

The cloud effectively provides infinite compute, and to some degree infinite storage, although the cost of non-ephemeral storage muddies that analogy a bit. So what would you do differently now that you have access to a global supercomputer?

When I pose this question to my clients, it quickly becomes clear that their business processes are ill-prepared to take advantage of this opportunity. We are roughly half a decade into the cloud revolution, and at least a decade into the 'competing on analytics' mindset, yet the typical enterprise IT shop is still unable to make a difference in the cloud.

However, change may be near. Given the state of functionality in software stacks like RightScale and Enstratus, we may see a discontinuity in this inability to take advantage of the cloud. These stacks are getting to the point where an IT novice can provision complex applications into the cloud. Supported by solid open-source provisioning stacks like Eucalyptus and Cloud.com, building reliable and adaptive software service stacks in the cloud is becoming child's play.

What I like about these environments is that they are cloud agnostic. For proper DR/BCP, a single cloud provider would be a single point of failure and thus a non-starter. But these tools make it possible to run a live application across multiple cloud vendors, meeting the productivity and agility requirements that come with the territory of an Internet application.
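The cloud-agnostic idea can be sketched in a few lines: the application talks to one vendor-neutral provisioning interface, and concrete drivers adapt it to each cloud, so a vendor outage triggers failover rather than downtime. The class and method names below are hypothetical, for illustration only; they are not the actual RightScale, Enstratus, or Libcloud APIs:

```python
# Minimal sketch of cloud-agnostic provisioning with failover.
# All names here are hypothetical, not a real vendor API.

class CloudDriver:
    """Vendor-neutral provisioning interface."""
    name = "abstract"

    def create_node(self, image, size):
        raise NotImplementedError

class VendorA(CloudDriver):
    name = "vendor-a"

    def __init__(self, healthy=True):
        self.healthy = healthy

    def create_node(self, image, size):
        if not self.healthy:
            raise RuntimeError("vendor-a outage")
        return {"provider": self.name, "image": image, "size": size}

class VendorB(CloudDriver):
    name = "vendor-b"

    def create_node(self, image, size):
        return {"provider": self.name, "image": image, "size": size}

def provision(drivers, image, size):
    """Try each provider in turn: no single cloud is a single point of failure."""
    for driver in drivers:
        try:
            return driver.create_node(image, size)
        except RuntimeError:
            continue  # fail over to the next vendor
    raise RuntimeError("all providers failed")

# Vendor A is down, so the node lands on vendor B instead.
node = provision([VendorA(healthy=False), VendorB()], "ubuntu-11.04", "large")
print(node["provider"])  # prints "vendor-b"
```

The design choice is the same one the tools above make: keep provider-specific logic inside drivers, and keep the application's deployment logic written against the neutral interface.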
