Monday May 26, 2008

Eco Business Case

Several weeks ago I was interviewed by a business school student building a case study for "eco" as a corporate imperative. What started with a line of questions aimed at highlighting the growing interest in "green" approaches to business turned quickly into a discussion of the other kind of green -- financial incentives. I don't think you'll find any company that wants to admit to being anti-green or anti-environment; but there are few companies that will actively fund and endorse "eco" as a strategy.

We (and I mean my generation, the late boomers, the current group of IT professionals) treat "eco" as a social initiative, and not as a business initiative. That's no longer possible: social, business and large-scale economic initiatives are intertwined, and companies that "get it" will gain greater acceptance in the market, have an easier time hiring new college graduates, and find that their marketing and sales efforts are amplified by the tangentially understood "social media". Put another way: the next generation of consumers -- of IT, of consumer products, of social services -- wants social action, has grown up immensely connected to their communities and the world, believes it can enact gross change, and provides instant feedback both good and bad. If a company doesn't make the link between eco as a green computing initiative and eco as a business initiative, they will sink into the Millennial equivalent of the Rust Belt.

Put another way, we fail, miserably, if we simply treat eco-computing, and eco-business, as something Carol Cone calls the ribbonization of America -- a nice cause for which we'll slap a ribbon on the back of our cars. Eco has to go beyond a corporate cause and turn into a set of actions and priorities embedded in how you think about the long-term priorities of your business. One of the best things Scott McNealy ever said to me was "Don't ask me to solve the problem, you're the engineer. Tell me what to execute because I'm an executive." Sounds trite, but it's been outstanding career advice. So here are my four actionable - executive - imperatives for dealing with eco as a business-driven, financially motivated priority.

1. Reduce demand. The most obvious one, but bound by just how much demand you can take out. Power efficiency is driven by an entire chain of transmission functions from power entering the data center to the power supply units inside of servers; moving toward more efficient distribution and conversion is a requirement.
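Because each stage in that chain multiplies against the others, losses compound quickly. A minimal sketch of the arithmetic, with per-stage efficiency figures that are illustrative assumptions rather than measurements from any particular data center:

```python
# Sketch: end-to-end efficiency of a power delivery chain is the
# product of per-stage efficiencies. Stage values are illustrative
# assumptions, not measurements from a specific data center.
stages = {
    "UPS": 0.92,
    "PDU": 0.98,
    "server power supply": 0.85,
    "voltage regulators": 0.90,
}

overall = 1.0
for name, eff in stages.items():
    overall *= eff

watts_in = 1000.0
print(f"End-to-end efficiency: {overall:.1%}")
print(f"Of {watts_in:.0f} W entering the building, "
      f"{watts_in * overall:.0f} W reaches the silicon")
```

With these assumed numbers, less than 70% of the power entering the building does useful work, which is why improving any single stage of distribution and conversion pays off across the whole chain.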

2. Switch demand curves. If you think of power and cooling capacity demand along a classical economic supply/demand curve, eco-computing demands that you find completely different curves as well as demand-reduced points along existing curves. One approach: carefully examine where you can use tape instead of disk, trading latency for power. Tape has the eco-wonderful property of drawing zero power when it's not spinning in a drive. Another approach: look at the relationship between software and heat. Huh? Inefficient software uses more CPU cycles; increasing CPU utilization drives power consumption and demand for cooling. Yet another view: Put Moore's Law into historical context. Moore predicted the number of transistors in a processor, not the actual processor performance. How we allocate those transistors into cores, threads of execution, cache and system support functions like I/O and memory control governs the overall throughput and power efficiency of the processor. Why does Sun continue to invest in processor design? Because major refactoring of transistors in a high throughput processor creates these new demand curves.
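The tape-versus-disk trade is easy to put in rough numbers. A hedged sketch, where every figure (drive counts, wattages, mount hours) is a hypothetical for illustration, not vendor data:

```python
# Sketch: annual energy to keep 100 TB of archival data on spinning
# disk versus tape. Figures are rough assumptions: an idle disk draws
# ~8 W around the clock, while a tape cartridge on the shelf draws
# nothing and the drive burns power only while a cartridge is mounted.
HOURS_PER_YEAR = 24 * 365

disk_drives = 100          # assume 100 drives of 1 TB each
disk_idle_watts = 8.0      # per drive, spinning 24x7
disk_kwh = disk_drives * disk_idle_watts * HOURS_PER_YEAR / 1000

tape_drive_watts = 90.0    # only while a cartridge is mounted
mount_hours_per_year = 50  # archival data is rarely read back
tape_kwh = tape_drive_watts * mount_hours_per_year / 1000

print(f"Disk: {disk_kwh:,.0f} kWh/year")
print(f"Tape: {tape_kwh:,.1f} kWh/year")
```

Under these assumptions the archival tier on disk burns thousands of kilowatt-hours a year while the same data on tape draws almost nothing; the latency penalty is the price of moving to the other curve.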

3. Scale with sub-linear cost. Increasing delivery of IT services over the network, and rapid adoption of those services are "network scale effects." At some point, we hit a non-linear cost point in data center capacity planning where we need a new data center design or a new physical data center. What's our incremental cost of adding data center capacity? If it's a major real estate project, it tends to be measured in tens to hundreds of millions of dollars and in years, not weeks. Projects like Sun's Modular Data Center (AKA "Project Blackbox") stimulate thinking about raising capacity without building another raised floor. Being able to scale up requires a service delivery design that spans the architectural milieu from hardware layouts to load balancers and security blueprints; we've captured a set of design ideas in Sun's architectural wiki. With technology adoption increasingly driven by social mechanisms (from iPods to Twitter), companies need to identify the social drivers of growth for compute, storage and networking so that we can avoid the equivalent of an impulse function in meeting that demand.
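The non-linearity described above behaves like a step function, and you can sketch it in a few lines. All the dollar figures here are hypothetical placeholders, not Sun pricing:

```python
# Sketch of the step-function economics above: adding load within
# existing capacity is cheap, but crossing the capacity ceiling
# triggers a large fixed cost (a new building or major real estate
# project). Dollar figures are hypothetical.
def incremental_cost(current_kw, added_kw, capacity_kw,
                     cost_per_kw=500.0, new_building=50_000_000):
    """Cost of adding `added_kw` of load to a site capped at capacity_kw."""
    if current_kw + added_kw <= capacity_kw:
        return added_kw * cost_per_kw           # fit-out only
    return new_building + added_kw * cost_per_kw  # real estate project

print(incremental_cost(900, 50, 1000))    # fits: fit-out cost only
print(incremental_cost(900, 200, 1000))   # forces a new build
```

Modular designs like Project Blackbox aim to replace that single giant step with many small ones, which is exactly what "sub-linear cost" requires.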

4. Scenario plan for 10 or more years. You would never tell your investors that you don't plan to be in full operational mode a decade from now. Failure to address long-term growth and scale issues creates constraints for your successive set of business executives. The root of this scenario planning is to answer questions about who is using your products (and services) and how they'll be used (and re-used). Sun's eco-computing initiative has spurred seemingly minor product design changes that have had long-lasting impact. For example, we've removed plastic bezels and trims from our hardware products, making the systems skins fully recyclable. At a completely different abstraction level, we need to look at the encoding of corporate data, from databases to documents to intellectual property: how will it be retrieved, read, re-used and retained in five, ten or fifty years? We need interoperability across different representations today, and across future representations if we want this data to be of any use.

All four of these actions are cost driven: current operating cost, future operating cost, or cost avoidance planning. The social aspects of eco-computing make it appealing while the cost aspects make it compelling.

Monday Feb 18, 2008

Innovation versus Regulation

Growing up in a reform Jewish synagogue, we always had a somewhat tangential relationship to the more traditional Jewish organizations and agencies; our rabbi had a pony tail (in 1970) rather than a long beard. One religious artifact that stuck with me was that each year, we'd give small amounts of money (tzedekah, Hebrew for "righteousness" rather than "charity") to buy "trees for Israel" through the Jewish National Fund. The JNF is dedicated to building out infrastructure and the long-term development of natural resources. It's not a conservation fund; it's about adopting a multiple decade view for a part of the world that has a history measured in millennia.

What got me thinking about the JNF and my grade school pile of tree certificates was a comment made by Virgin Galactic President Will Whitehorn at the annual nerd dinner that's part of our Analyst Summit. When asked why Virgin felt they'd be able to develop economically viable spaceflight, Whitehorn simply stated that private enterprise almost always outperforms government mandates. As a research or development area becomes more institutionalized, rules and regulations stagnate work. Whitehorn's point was that if you wait for regulation or oversight, work is reduced to meeting the letter of the law, instead of innovating to drive the spirit or intent of its creation. Whitehorn was implicitly dismissing NASA and other government-funded agencies who have "missions" but not economic development goals. Building on materials research and manufacturing interests outside of the Virgin empire, Galactic is going to build a spacecraft and then drive its economics through both supply and demand sides of the equation.

At the Analyst Summit, a number of people asked me about the motivations for Sun's eco-computing initiatives. It's not purely about the environmental aspects -- those are adjunct benefits, like having an old growth forest cultivated from the pocket change of school children. It's about driving real innovation and change in the development of our computing infrastructure, before government regulation establishes rules and boundaries that reduce the problem to an exercise in compliance. If we, as technology producers and consumers, choose to truly innovate in both the supply and demand sides of the computing equation, it means investment in reducing our net demand for power, space and cooling while allowing computing infrastructure to scale. Failure to do so means that rate limiting factors in data center scale -- those stemming from energy and space constraints -- will be the subject of government regulation.

Sun's eco-initiatives are centered on the costs of computing, but they're not limited to the silicon domain. The intersection of space flight and thinking about forests stemmed from a comment made by Dave Douglas in his SAS breakout: When Sun produced its annual report online, and skipped production of the glossy paper version, more than 99 million sheets of paper were conserved. That's the equivalent of 11,000 trees, or a small forest in any locale. Securities regulations dictate that we produce an annual report; innovation in how we deliver it to shareholders, potential investors and government agencies lets us challenge long-term views of the infrastructure required. It's seeing the forest through the regulatory trees.

Friday Jan 18, 2008

Choosing Your Demand Curve

As a high school student, I never really thought about energy all that much (I didn't drive and therefore didn't pay for gas) until late one Saturday night when I saw a commercial for Pink Floyd's Animals, featuring the now culturally relevant stacks of the Battersea Power Station. The commercial literally raised the hair on the back of my neck, as it's hard to see what's floating between the smokestacks until the image resolves to reveal a pig, floating between the stacks, looking into the frame, not out at the audience. OK, it was freaky at the time, but maybe I had just read too much Heinlein. Listening to the album not too long after, the reference made sense: big power (corporate or generated) meant big pigs, with a healthy dose of George Orwell thrown in for good measure. I hadn't given the Battersea power plant or Floyd's pig much thought until I watched Children of Men recently, and laughed when I saw the floating pig in the background, framed by Battersea, another reference to animalistic behavior in the movie.

I'm relying on ancient history to make up for failing to plug (no pun) some recent history in Innovating@Sun podcasts that indirectly address questions of supply, demand, and piggish capitalism. I started by joining Eco-Computing VP Dave Douglas to talk about the disruptive effects of eco-computing -- primarily from a financial, not a cultural perspective. Dave sees eco-computing driven by cost savings, not a desire to be friendlier to the environment or a more responsible corporation. Those are indeed honorable and solid goals, but if you want to get traction talking about a new IT initiative, it has to change supply and demand curves modulated by the guys who build data centers. That's the economic disruption driven by eco-computing: focus on reducing demand for power, space and cooling. Normally systems vendors are predisposed to look at the supply curve, and either drive prices for infrastructure down or increase performance (or both), two effects that tend to move to a different point on the demand curve where overall consumption is increased. If you assume there's elasticity in demand for IT (and I've yet to find a customer that believed it had sufficient IT resources for every project, every year), then some rough rules of supply and demand apply, with each new product cycle shifting the supply curve a bit.
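That supply-side dynamic can be shown with a toy linear model: shift the supply curve outward while leaving demand alone, and the market clears at a lower price but higher overall consumption. The coefficients below are arbitrary, chosen only to make the effect visible:

```python
# Toy linear supply/demand model of the point above: a vendor-driven
# supply shift (cheaper, faster infrastructure) moves the market to a
# point of *higher* aggregate consumption. Coefficients are arbitrary.
def equilibrium(a, b, c, d):
    """Demand Q = a - b*P, supply Q = c + d*P; solve for (P, Q)."""
    p = (a - c) / (b + d)
    return p, a - b * p

p0, q0 = equilibrium(a=100, b=2, c=10, d=1)   # original supply curve
p1, q1 = equilibrium(a=100, b=2, c=40, d=1)   # supply shifted outward
print(f"before: P={p0:.1f}, Q={q0:.1f}")
print(f"after:  P={p1:.1f}, Q={q1:.1f}")
assert q1 > q0 and p1 < p0  # cheaper gear, more total consumption
```

Which is the point: improving supply alone never disrupts demand. For that you have to move to a different demand curve entirely.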

The problem with approaching eco-computing from the supply-side is that you never drive a disruption in demand. Dave Douglas makes a very strong argument for carbon offsets; Dave wants us to look for ways to use less energy (and therefore create fewer carbon emissions to offset) in the aggregate. True disruption in the market comes from sliding onto a completely different demand curve, leading to a major shift in aggregate demand or in the price of goods. Long tail, open source, cultural icon status: all pick up demand curves and move them.

I first started thinking about IT and economic theories when we recorded the Innovating@Sun podcast about the Niagara 2 processor. Moore's Law is often equated to an expression of computing power, but more accurately it defines the number of transistors that can be delivered in a processor over time. VLSI design teams can use those transistors for memory, threads, network acceleration, multi-core implementations, or any other arrangement to satisfy the demand barbarians at the gates (Wall Street, not Microsoft; pun intended). As the most typical distribution of transistors aims for higher clock rates and better single thread performance, you get more power consumption and more heat dissipation. The multi-thread, multi-core design of Niagara 2, along with the partitioning of gates for system elements, yields a completely different demand curve, one with radically lower heat production and power consumption. It's eco-computing from the processor level looking up, rather than the application and storage level looking down. In either direction, it forces us to reconsider how we build applications to take advantage of these new relationships between the total cost of computing and computing performance.
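The clock-rate-versus-cores trade-off above can be made concrete with the textbook CMOS dynamic power relation, P proportional to C·V²·f. This is a back-of-envelope sketch with illustrative numbers, not Niagara 2 specifications; it also ignores parallelization overhead, which real workloads never do:

```python
# Back-of-envelope CMOS dynamic power sketch (P ~ C * V^2 * f): many
# slower cores can match the aggregate clock of one fast core at a
# fraction of the power, because lower clocks permit lower voltage.
# All numbers are illustrative, not measurements of any real chip.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts**2 * freq_ghz

# One aggressive single-thread core: high clock, high voltage.
fast = dynamic_power(cap=1.0, volts=1.3, freq_ghz=3.6)

# Eight modest cores at lower clock and voltage, same aggregate GHz.
slow = 8 * dynamic_power(cap=1.0, volts=1.0, freq_ghz=0.45)

print(f"one fast core:  {fast:.2f} (relative units)")
print(f"8 slow cores:   {slow:.2f} (relative units)")
```

The quadratic dependence on voltage is what makes the multi-core allocation of transistors land on a different power/throughput curve rather than just a different point on the old one.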

Two years ago, more than half of my customer conversations around CMT and the Niagara processor included the phrase "but we don't develop parallelized applications." In that time, nearly every processor house has announced or delivered multi-core and/or multi-threaded chips, so that Niagara is no longer the exception but the rule. Today, those same discussions focus on how customers can take advantage of multi-threading so that they can manage their aggregate IT demand in terms of the future platform supply.

Thursday Jul 26, 2007


Much of this week has been spent in a variety of VP-level meetings discussing Sun's FY08 strategies and initiatives. The top of the list is growth, in terms of new application areas and infrastructure wins as well as design patterns for the "next" data center. Top of that list is power, cooling, and virtualization, creating a denser and more flexible computing fabric. It's possible to grow our business (through new data center designs) while also being eco-friendly (through more efficient servers, better systems design, and macro-level packaging like Project BlackBox).

One of my favorite "tell it like it is" customers put it rather succinctly for me: He virtualized two dozen servers down to four. With site licenses for all of the software needed, his software costs didn't change. He's running the same number of OS instances, so systems administration, networking and other per-image costs remained flat. But the four new servers are consuming more power than the twenty-four old ones, and he's wondering where the savings went (aside from a smaller physical footprint). Eco-computing involves looking at the whole stack, from the processor up to the administrator, because each of those levels contributes to the operational cost.
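The customer's arithmetic is worth sketching, because the trap is easy to fall into. Wattage figures below are hypothetical stand-ins, not his actual numbers:

```python
# A hedged sketch of the consolidation arithmetic in the anecdote:
# fewer, bigger boxes don't automatically draw less power. Wattage
# figures are hypothetical, not the customer's actual data.
old_servers, old_watts_each = 24, 350     # two dozen small 1U boxes
new_servers, new_watts_each = 4, 2400     # four large virtualization hosts

old_kw = old_servers * old_watts_each / 1000
new_kw = new_servers * new_watts_each / 1000

print(f"before: {old_kw:.1f} kW, after: {new_kw:.1f} kW")
if new_kw > old_kw:
    print("consolidation increased total power draw")
```

Consolidation only saves power if the per-server draw of the new hosts, divided across the instances they carry, comes in below the old boxes; counting sockets instead of watts hides that.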

So this is my first new blog category since splitting off the hockey ramblings into a world of their own. I expect eco-friendly, eco-computing, eco-logical, eco-nomic and if I'm truly random, Umberto Eco to make appearances.


Hal Stern's thoughts on software, services, cloud computing, security, privacy, and data management

