We held an HPC panel session yesterday in Second Life for Sun employees interested in
learning more about HPC. Our speakers were
Cheryl Martin, Director of HPC Marketing; Peter Bojanic, Director for Lustre; Mike Vildibill,
Director of Sun's Strategic Engagement Team (SET); and myself. We covered several
aspects of HPC: what it is, why it is important, and how Sun views it from a business
perspective. We also talked about some of the hardware and software technologies and products
that are key enablers for HPC: Constellation,
Lustre, MPI, etc.
As we were all in-world at the time, I thought it would be interesting to ponder
whether Second Life itself could be described as "HPC" and whether we were in
fact holding the HPC meeting within an HPC application.
Having viewed this excellent SL Architecture talk given by Ian (Wilkes) Linden,
VP of Systems Engineering at Linden Lab, I conclude that
SL is definitely an HPC application. Consider the following information taken from Ian's presentation.
As Ian's numbers show, the geography of SL has been exploding in size over the last 5-6 years. As of Dec 2008
that geography is simulated using more than 15K instances of the SL simulator process; in addition to
computing the physics of SL, those instances also run an average of 30 million simultaneous server-side scripts to
create additional aspects of the SL user experience. And look at the size of
their dataset: 100TB is very respectable from an HPC perspective. And a billion files! Many HPC sites are worrying about what will happen when they reach that level of scale,
while Linden Lab is already dealing with it. I was surprised they aren't using Lustre, since I assume their
storage needs are exploding as well. But I digress.
The SL simulator described above would be familiar to any HPC programmer. It's a big
C++ code. The problem
space (the geography of SL) has been decomposed into 256m x 256m chunks that are
each assigned to one instance of the simulator. Each simulator process runs on its
own CPU core, and "adjacent" simulator instances exchange edge data to ensure
consistency across sub-domain boundaries, the classic halo-exchange pattern (see the sketch below). And it's a high-level physics simulation.
Smells like HPC to me.
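
For readers who haven't done this kind of programming, here is a minimal sketch of what that edge exchange looks like in a typical HPC code. To be clear, this is my own illustration, not Linden Lab's code: I've used MPI (Ian's talk doesn't say what they use for inter-simulator communication), reduced the exchange to a 1-D strip for brevity (SL's 2-D grid of regions would trade edges with four neighbors), and invented the REGION_SIZE and field names.

// Minimal halo-exchange sketch: each MPI rank owns one "region"
// (a 1-D strip here for brevity) and trades one cell of edge data
// with each neighbor per step, just as adjacent SL simulator
// instances must agree on what happens at region boundaries.
// REGION_SIZE and field are illustrative names, not Linden Lab's.
#include <mpi.h>
#include <vector>

static const int REGION_SIZE = 256;   // interior cells per rank

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    // Local state plus one ghost cell on each side for neighbor data.
    std::vector<double> field(REGION_SIZE + 2, static_cast<double>(rank));

    // Ranks on the ends of the strip have no neighbor on that side.
    int left  = (rank > 0)          ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < nranks - 1) ? rank + 1 : MPI_PROC_NULL;

    // Send my edge cells; receive neighbors' edges into my ghost cells.
    MPI_Sendrecv(&field[1],               1, MPI_DOUBLE, left,  0,
                 &field[REGION_SIZE + 1], 1, MPI_DOUBLE, right, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Sendrecv(&field[REGION_SIZE],     1, MPI_DOUBLE, right, 1,
                 &field[0],               1, MPI_DOUBLE, left,  1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    // ... physics update on the interior cells would go here ...

    MPI_Finalize();
    return 0;
}

The appeal of this decomposition is locality: each simulator only ever needs to talk to its immediate neighbors, so the grid can scale out to 15K instances and beyond without any global communication step.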