Saturday, November 28, 2009

And now something completely different: brain simulation

Our good folks at the national labs have been doing cloud-scale computing for two decades, so what they are doing now is a possible indication of what we'll be doing with the cloud a decade from now. Researchers at IBM Almaden have been working on the largest brain simulation to date: 1.6 billion neurons with 9 trillion connections. The scale of this endeavor still dwarfs the capacity and capability of any commercial cloud offering; the simulation uses roughly 150,000 processors and 150 TB of memory.
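As a back-of-envelope view of that scale, the per-unit numbers below are derived from the figures in this post, not taken from the researchers' publications:

```python
# Rough per-unit breakdown of the Almaden simulation, using the
# headline figures quoted above (illustrative arithmetic only).
neurons = 1.6e9                 # simulated neurons
synapses = 9e12                 # simulated connections
processors = 147_456            # Dawn's processor count
memory_bytes = 147_000 * 1e9    # ~147 TB of memory

neurons_per_proc = neurons / processors      # roughly 11,000 neurons per core
bytes_per_synapse = memory_bytes / synapses  # roughly 16 bytes of state per synapse

print(f"{neurons_per_proc:,.0f} neurons per processor")
print(f"{bytes_per_synapse:.1f} bytes of memory per synapse")
```

Even at this scale, each processor carries only on the order of ten thousand neurons, which hints at how far commodity clouds are from this kind of workload.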

Just to provide a sense of the OPEX of such an installation: Dawn, an IBM Blue Gene/P supercomputer at LLNL, hums and breathes inside an acre-size room on the second floor of the lab's Terascale Simulation Facility. Its 147,456 processors and 147,000 gigabytes of memory fill 10 rows of computer racks, woven together by miles of cable. Dawn devours a million watts of electricity through power cords as thick as a bouncer's wrists, racking up an annual power bill of $1 million. The roar of refrigeration fans fills the air: 6,675 tons of air-conditioning hardware labor to dissipate Dawn's body heat, blowing 2.7 million cubic feet of chilled air through the room every minute.

Given that a real brain consumes only about 25 watts, there is clearly a lot of room for technology innovation. Silicon innovation, however, has come to a standstill, with venture capital having abandoned the segment almost completely: no VC firm in the US or EU currently has a fund targeting this vertical. Google is rumored to be designing its own silicon because no commercial chip manufacturer is providing the innovation it needs.
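The gap can be put in numbers. The electricity rate below is an assumption, chosen because it roughly reproduces the $1 million annual power bill quoted above:

```python
# Sanity check on Dawn's power bill and the brain-vs-machine gap.
# The $/kWh rate is an assumption, not a figure from LLNL.
dawn_watts = 1e6                  # "a million watts of electricity"
brain_watts = 25.0                # approximate power draw of a human brain
hours_per_year = 24 * 365

kwh_per_year = dawn_watts / 1000 * hours_per_year  # 8.76 million kWh
annual_bill = kwh_per_year * 0.11                  # ~$0.96M at an assumed $0.11/kWh
efficiency_gap = dawn_watts / brain_watts          # 40,000x the brain's power budget

print(f"annual energy: {kwh_per_year / 1e6:.2f} million kWh")
print(f"estimated bill: ${annual_bill:,.0f}")
print(f"power gap vs. a brain: {efficiency_gap:,.0f}x")
```

A four-orders-of-magnitude power gap, before even considering that Dawn simulates only a fraction of a brain's connectivity, is the room for innovation the paragraph above refers to.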

Friday, November 20, 2009

Governmental IT: Analytics is not a dirty word

Over at Smart Data Collective, Bill Cooper wrote a wonderful article on deep analytics. In particular, I liked his assessment of the resistance from customers who can't see the forest for the trees.

I’ve watched many government agencies balk at the idea of data mining and complex analytics. They are concerned about switching to a new data architecture and the potential risks involved in implementing a new solution or making a change in methodology.

Having been there, I do understand their concerns, but fear of change is what’s holding government agencies back from being able to fully leverage the data that already exists to effect change at the local, regional, state and national levels. Analytics are the key to lowering costs, increasing revenue and streamlining government programs.

In my own government experience and now, looking at it from the other side, I have come to believe that government clients need to think about data the way the world’s top corporations do. Like all federal agencies, these companies already had huge repositories of data that were never analyzed – never used to support decisions, plan strategies or take immediate actions. Once they began to treat that data as a corporate asset, they started to see real results. The best part is that leveraging these mountains of data does not require a "rip and replace" approach. Inserting a data warehousing/data mining or complex analytics capability into a SOA or cloud computing environment can be very low risk and even elegant in its implementation. The potential rewards are immense!

That’s what’s needed in the government sector. We need to view analytics not as a dirty word but as a secret weapon against fraud and other challenges impacting all areas of the government sector.

I am a big believer that the future of the cloud consists of federated systems, for the simple reason that large data sets are captive to their storage devices. Federation makes service-oriented architecture (SOA) a natural pattern for collating information. The fact that Google Gears and Microsoft Azure exhibit SOA at different levels of abstraction is clear evidence of the power of the pattern. Add coarse-grained SOA on top of these fine-grained patterns and you can support federation and scale internal IT systems even if the core runs in Gears or Azure.
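The federation pattern described above can be sketched in a few lines: a coarse-grained service fans a query out to fine-grained services, each sitting next to its own captive data set, and collates the results. The service names and payloads here are hypothetical, chosen only to illustrate the shape of the pattern:

```python
# Minimal sketch of a federated, coarse-grained SOA query.
# Each "service" stands in for a remote endpoint that owns its own data.
from concurrent.futures import ThreadPoolExecutor

def records_service(query):
    # Hypothetical endpoint over one agency's data set.
    return [{"source": "records", "match": query}]

def archive_service(query):
    # Hypothetical endpoint over a different, captive data set.
    return [{"source": "archive", "match": query}]

FEDERATION = [records_service, archive_service]

def federated_query(query):
    """Fan the query out to every member service, then collate the answers."""
    with ThreadPoolExecutor() as pool:
        partials = pool.map(lambda svc: svc(query), FEDERATION)
    return [row for part in partials for row in part]

print(federated_query("case-id:1234"))
```

The point of the sketch is that the data never moves: each service answers over the storage it already owns, and only the small result sets travel to the collation layer.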

Interactive Map of cloud services

Appirio, a company that helps enterprise customers leverage PaaS cloud platforms such as Google Apps, has put a nice interactive navigator on its website.
The Appirio cloud computing ecosystem map aims to bring clarity to the fast-evolving cloud services market, helping enterprise decision makers accelerate their adoption of the cloud by providing a standard taxonomy.

Ryan Nichols, head of cloud strategy at Appirio, states: "The cloud ecosystem is evolving so quickly that it's difficult for most enterprises to keep up. We created the ecosystem map to track this evolution ourselves, and have decided to publish it to help others assess the lay of the land. With broader community involvement, we can create a living, breathing map where anyone can access, drill down and interact with dynamic information. This will bring some much-needed clarity to the cloud market."

Unfortunately, since the map is geared toward the enterprise customer, it ignores all of the innovation taking place in mashups, the programmable web, and mid-market products such as Zementis ADAPA in the Cloud. Given the new application architectures and services the cloud enables, the enterprise market is the worst indicator of the evolving cloud ecosystem.

Monday, November 2, 2009

PC sales decline

In his post PCs at a Crossroads, Michael Friedenberg reports on IDC's measurement of the PC marketplace. From the article:

"Case in point is the PC market. Market researcher IDC reports that 2009 will be the first year since 2001 where PC shipments will decline. I believe this drop is driven by a more rapid intersection of the cyclical and the systemic as the PC value proposition is challenged and then transformed. As Intel CEO Paul Otellini recently said, "We're moving from personal computers to personal computing." That comment signals Intel's way of moving into new markets, but it also acknowledges that the enterprise PC market has arrived at a crossroads."

Cloud computing is one development that is dramatically changing the desktop and server ecosystem. The performance of a single server or desktop hasn't kept pace with the computational needs of modern science, engineering, or business. Cloud computing moves away from capital equipment toward procuring just the computational output, at effectively unlimited scale for most use cases. Most desktops idle most of the time, yet are too slow to get real work done when you need it. This pushes work toward elastic resources that are consumed as you go. If the browser is all you need, then a move toward server consolidation and thin clients is not far behind.
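The economics behind that shift can be put in a toy calculation. Every figure below (purchase price, utilization, hourly rate) is an assumption chosen for illustration, not market data:

```python
# Toy cost model for the "idling desktop" argument above.
# All numbers are illustrative assumptions.
desktop_price = 1500.0             # assumed purchase price of a desktop
amortization_hours = 3 * 365 * 24  # amortized over an assumed three-year life
utilization = 0.05                 # "idling most of the time"

# Cost attributed to each hour in which the desktop does useful work
cost_per_useful_hour_owned = desktop_price / (amortization_hours * utilization)

cloud_rate = 0.10                  # assumed on-demand price per instance-hour

print(f"owned, per useful hour:  ${cost_per_useful_hour_owned:.2f}")
print(f"rented, per useful hour: ${cloud_rate:.2f}")
```

Under these assumptions, an hour of useful work on an owned, mostly idle machine costs roughly ten times the rented rate, which is exactly the pressure toward elastic, pay-as-you-go resources described above.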