Friday Jan 18, 2008

Choosing Your Demand Curve

As a high school student, I never really thought about energy all that much (I didn't drive and therefore didn't pay for gas) until late one Saturday night, when I saw a commercial for Pink Floyd's Animals featuring the now culturally iconic stacks of the Battersea Power Station. The commercial literally raised the hair on the back of my neck: it's hard to see what's floating between the smokestacks until the image resolves to reveal a pig, looking into the frame, not out at the audience. OK, it was freaky at the time, but maybe I had just read too much Heinlein. Listening to the album not long after, the reference made sense: big power (corporate or generated) meant big pigs, with a healthy dose of George Orwell thrown in for good measure. I hadn't given the Battersea power plant or Floyd's pig much thought until I watched Children of Men recently, and laughed when I saw the floating pig in the background, framed by Battersea -- another reference to animalistic behavior in the movie.

I'm relying on ancient history to make up for failing to plug (no pun intended) some recent history in Innovating@Sun podcasts that indirectly address questions of supply, demand, and piggish capitalism. I started by joining Eco-Computing VP Dave Douglas to talk about the disruptive effects of eco-computing -- primarily from a financial, not a cultural, perspective. Dave sees eco-computing driven by cost savings, not a desire to be friendlier to the environment or a more responsible corporation. Those are honorable and solid goals, but if you want to get traction talking about a new IT initiative, it has to change the supply and demand curves modulated by the guys who build data centers. That's the economic disruption driven by eco-computing: a focus on reducing demand for power, space and cooling. Systems vendors are normally predisposed to work the supply curve, driving prices for infrastructure down or increasing performance (or both) -- two effects that tend to move you to a different point on the demand curve, where overall consumption is increased. If you assume there's elasticity in demand for IT (and I've yet to find a customer that believed it had sufficient IT resources for every project, every year), then some rough rules of supply and demand apply, with each new product cycle shifting the supply curve a bit.
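The supply-shift effect described above can be sketched with a toy linear model (all of the numbers and the linear form are hypothetical, purely for illustration): when a new product cycle lowers the supply curve, the market clears at a lower price and higher overall consumption.

```python
# Toy linear supply/demand model for IT capacity (all numbers hypothetical).
# Demand: price falls as quantity rises. Supply: price rises with quantity.
def equilibrium(d_intercept, d_slope, s_intercept, s_slope):
    """Solve d_intercept - d_slope*q == s_intercept + s_slope*q."""
    q = (d_intercept - s_intercept) / (d_slope + s_slope)
    p = d_intercept - d_slope * q
    return q, p

# Baseline product cycle.
q0, p0 = equilibrium(d_intercept=100, d_slope=2, s_intercept=20, s_slope=2)

# A new product cycle shifts the supply curve down (cheaper, faster gear).
q1, p1 = equilibrium(d_intercept=100, d_slope=2, s_intercept=10, s_slope=2)

print(q0, p0)  # equilibrium before the shift: (20.0, 60.0)
print(q1, p1)  # after: (22.5, 55.0) -- more consumption at a lower price
```

The demand curve itself never moved here; the market just slid along it, which is exactly the pattern the supply-side approach is stuck with.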

The problem with approaching eco-computing from the supply side is that you never drive a disruption in demand. Dave Douglas makes a very strong argument for carbon offsets; Dave wants us to look for ways to use less energy (and therefore create fewer carbon emissions to offset) in the aggregate. True disruption in the market comes from sliding onto a completely different demand curve, leading to a major shift in aggregate demand or in the price of goods. Long tail, open source, cultural icon status: all pick up demand curves and move them.
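The contrast between moving along a demand curve and sliding onto a new one can be made concrete with another toy calculation (hypothetical linear curves and numbers, for illustration only):

```python
# Moving along a demand curve vs. shifting the whole curve (toy numbers).
def demand_q(price, intercept, slope):
    """Quantity demanded at a given price on a linear demand curve."""
    return (intercept - price) / slope

# Movement along the same curve: a supply-side price cut.
q_at_60 = demand_q(60, intercept=100, slope=2)   # 20.0
q_at_50 = demand_q(50, intercept=100, slope=2)   # 25.0

# A true demand disruption: the whole curve shifts (new intercept),
# so aggregate demand changes even at the unchanged price.
q_shifted = demand_q(60, intercept=140, slope=2)  # 40.0

print(q_at_60, q_at_50, q_shifted)
```

A price cut buys a modest bump along the old curve; a genuine disruption changes what the market wants at every price.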

I first started thinking about IT and economic theories when we recorded the Innovating@Sun podcast about the Niagara 2 processor. Moore's Law is often equated with an expression of computing power, but more accurately it defines the number of transistors that can be delivered in a processor over time. VLSI design teams can spend those transistors on memory, threads, network acceleration, multi-core implementations, or any other arrangement that satisfies the demand barbarians at the gates (Wall Street, not Microsoft -- pun intended). When the most typical distribution of transistors aims for higher clock rates and better single-thread performance, you get more power consumption and more heat dissipation. The multi-thread, multi-core design of Niagara 2, along with the partitioning of gates for system elements, yields a completely different demand curve, one with radically lower heat production and power consumption. It's eco-computing from the processor level looking up, rather than the application and storage level looking down. In either direction, it forces us to reconsider how we build applications to take advantage of these new relationships between the total cost of computing and computing performance.

Two years ago, more than half of my customer conversations around CMT and the Niagara processor included the phrase "but we don't develop parallelized applications." In that time, nearly every processor house has announced or delivered multi-core and/or multi-threaded chips, so that Niagara is no longer the exception but the rule. Today, those same discussions focus on how customers can take advantage of multi-threading so that they can manage their aggregate IT demand in terms of the future platform supply.
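The restructuring customers are asking about has a simple shape: independent units of work fanned out across many hardware threads instead of handled one at a time. A minimal sketch, using Python's standard thread pool rather than anything Sun-specific, with a hypothetical `handle_request` standing in for real per-request work:

```python
# Minimal sketch of "parallelizing" a serial workload so it can occupy
# the many hardware threads of a CMT chip (illustrative only; handle_request
# is a hypothetical stand-in for independent, typically I/O-bound work).
from concurrent.futures import ThreadPoolExecutor

def handle_request(req_id):
    # Stand-in for independent per-request work.
    return req_id * req_id

requests = range(32)

# Serial version: one request at a time, one thread busy.
serial = [handle_request(r) for r in requests]

# Threaded version: the same work spread across worker threads --
# the shape that maps onto a multi-thread, multi-core part.
with ThreadPoolExecutor(max_workers=8) as pool:
    threaded = list(pool.map(handle_request, requests))

assert serial == threaded  # same answers, different demand on the hardware
```

The point isn't the thread pool itself; it's that once work is expressed this way, aggregate IT demand can be planned against throughput-oriented platform supply rather than single-thread speed.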


Hal Stern's thoughts on software, services, cloud computing, security, privacy, and data management

