Monday Apr 25, 2005

Sun "Kit"

Have you noticed the increasing use of the term "kit" to refer to a hardware vendor's products? Articles will refer to, for example, Sun's "kit" when discussing our latest servers, storage, and desktops.

I really like that term - because it drives home the point that when you are in the market to purchase "kit" from a product vendor, you sign up to be the kit builder. And for the hobbyist out there, that can be really fun and educational, even thrilling to some degree.

Many of us grew up building kits. I \*loved\* building ships, trucks, airplanes, tanks, cars, rockets, etc. It was a blast, and possibly contributed to (and/or was because of) my engineering mindset. The sense of accomplishment of building highly realistic, detailed and customized models, from a bunch of bare parts, is quite rewarding.

However, most IT shops I work with are less interested in the process of constructing their own unique one-off configurations from collections of parts (kit). I applaud clients for their increasing demand for solutions built from established patterns and reference implementations. I applaud IT vendors for their increasing portfolios of pre-integrated and hardened solutions.

Kit building is a great weekend hobby for kids (and adults). But when it comes to running our businesses and defending our country, we need to leverage, as much as possible, the experience and factory integration of trusted IT solution vendors. For some, it is hard to give up the thrill/challenge of the IT equivalent of "junk yard wars". But there are even more interesting and higher-valued challenges and rewards awaiting those who free up their time from the tyranny of the "nuts and bolts".

The following is a great weekend hobby project. But you don't need to let your IT projects look like this...

Saturday Apr 09, 2005

Project Lifecycle Cartoon

While this is intended to be funny, it's a little too close for comfort in many cases. But due diligence would have made this a very boring cartoon: up-front VOC (voice of the customer) needs-assessment interviews, a subsequent translation into well-formed and reconciled SMART (1, 2) [Specific, Measurable, Achievable/Attainable, Realistic/Realizable, Traceable/Time-Bounded] requirements, and an ongoing Risk Log. A lesson we would be well advised to remember in many contexts.


Friday Apr 08, 2005

Stocks: SUNW -vs- IBM, HPQ, MSFT, ORCL

In the following graphs I've compared Sun Microsystems (SUNW) to some of our competitors and/or partners: IBM, HP, Oracle, and Microsoft. The charts look at the five companies all the way back to the late '80s, and then back just five years. In the first graph, you clearly see the "exuberant" six-year ramp that SUNW experienced starting in 1995. That's the year we launched Java and the UltraSPARC processor. I also joined Sun that year :-). The post-Y2K dot-com implosion hit us pretty hard, but after a two-year slide we've settled down and ended up a significantly better long-term investment than some. In hindsight at least.

The second graph looks at the same companies since Y2K. It's interesting to see that we all declined (at various rates) until mid-2002, at which point we all found a plateau that we've pretty much sustained for the last two and a half years.

I don't know about you, but I think the market is primed to move again. The IT industry landscape has changed a lot since the Y2K peak. Pressure is building. Innovation has been occurring all along. Which of the five will break out of the horizontal? My bet is that it'll be those companies that successfully combine targeted innovation and exceptional services.


Wednesday Mar 30, 2005

x86: 64-bit & SMP

The following news story "IBM, HP take different tack as Xeon MP moves to 64-bit" has some interesting quotes: http://www.nwfusion.com/news/2005/0330ibmhpta.html

First: "HP has decided to cease production of its eight-way ProLiant DL740 and DL760 systems...". HP is following Dell's withdraw of the 8-socket server space. Apparently Dell and HP believe that there is little market demand for more than a handful of threads (today an OS schedules one thread per core or hyperthread context). Or, could it be that their Operating Systems of choice (Windows and Linux) simply can't (yet) scale to larger thread counts? Hey Dell & HP... you might want to check out Solaris 10. A million of your prospects have downloaded and registered this OS in just the last two months! And it runs just fine on your (small and large) x86/x64 servers, up to hundreds of threads.

Second, Andy Lees, corporate vice president with Microsoft's server and tools business, said: "If you run a 32-bit application on 64-bit Windows [Windows Server 2003 x64 Edition] on 64-bit hardware, you'll get about a 5% bump in terms of performance. If you go ahead and add 64-bit [application] capabilities, then things get dramatically better."

Hmmm. This is an interesting admission that 64-bit might actually be worthwhile. It is (not really) amazing that up until Microsoft (x64 Edition) and Intel (EM64T) had decent 64-bit offerings, they told the world that 32-bit was all that anyone would need for the foreseeable future - except maybe for huge databases and extremely large memory-footprint compute jobs. I guess "foreseeable" means until we can field a team. Oh, by the way, Solaris has been 64-bit forever (in Internet years), has unmatchable security features and reliability, and a bundled virtualization technology that alone is worth the price of admission (oh yeah, it's free).

Combine that performance at both low and high thread counts with security, reliability, and virtualization... and Solaris 10 will allow you to stack multiple applications on a single x86/x64 server with confidence. All of a sudden an 8-socket server (with 16 high-performance cores) looks like an important sweet spot for driving utilization rates up and operational cost and complexity down.

HP and Dell have withdrawn from that space (a strategic blunder I believe). It'll be interesting to see who steps up to claim that prize!

Tuesday Mar 29, 2005

Itanic: Davy Jones' Locker

In the year 2000, just as the first Itanium processor from Intel hit the market, IDC predicted that 2004 Itanium server sales would hit the $28 billion mark! But IDC missed their projection slightly. They were off by about $26.6 billion, or ~95%. Ouch!!

Of the few Itanium-based servers that were actually sold in all of 2004, HP led the "crawl" and accounted for 76% of them. But HP, as of mid-2004, joined Sun and IBM in the Opteron-based server market, so expect Itanium sales at HP in 2005 to decline faster than HP's general server sales numbers. IBM came in 2nd with 10% of the Itanium market, but has strongly hinted that it is killing off its Itanium-based server offerings in favor of Opteron, Power, and traditional Intel processors. Dell captured 3rd place with just 5% of the tiny Itanium pie, and so far Dell has resisted selling Opteron-based servers... but how long will Michael watch from the sidelines?

For those who like to look under the hood, it seems to me there are three server-oriented processor families that deserve attention and will still be important in 2010:

  1. Sun's (and Fujitsu's) SPARC-based CMT families (US-IV, Olympus, Niagara, Rock, etc)
  2. IBM's Power family (Power4, Power5, Power6, etc)
  3. AMD/Intel's x86/x64 families:
    1. Opteron/AMD64 [Egypt, Italy, etc]
    2. IA-32/EM64T [Nocona, Potomac, Smithfield, Tulsa, etc]

It will be fun to watch. They all have well-funded R&D, aggressive rates of innovation, compelling roadmaps, and market/ISV traction. I believe all three horses will be in the race five years from now, but only two will be perceived as the market leaders. Unpredictable market dynamics and execution challenges will likely cause one of the three to stumble and fall behind. But anyone's guess as to which will stumble would be just that - a guess. Intel can survive a $25 billion mistake, and learn from it; and AMD is actually delivering new processors faster than their roadmaps suggest (an amazing feat for a processor design shop)! IBM's roadmap and processor technology look great, but massive CMT could explode and their Cell processor could turn into the next Itanic for server applications. Sun has Olympus to compete with Power6, and very exciting new yearlings (Niagara and Rock) that could, well, Rock the world soon. Single-threaded deep-pipeline performance processors, throughput-oriented massive-CMT chips, and price-efficient desktop/presentation CPUs are all up for grabs. I doubt one horse will win the Triple Crown. Stay tuned.

Of course, OS traction will dictate this to some degree (Solaris, Linux, and Windows64 are all interesting candidates), as will J2EE -vs- .NET adoption and COTS app support. I think that security and efficient/reliable virtualization technology will be key drivers of platform selection in future years.

The one thing we can predict with near certainty is that Itanium (aka: Itanic) is headed to Davy Jones' locker.

Wednesday Mar 23, 2005

Good Enough -vs- Gratuitous Upgrades

Sun offers a really cool thin client called the SunRay. Check out this flash! We've got 30,000 or so running our desktops throughout Sun. Zero-admin, highly reliable, energy-efficient clients have saved us millions and driven up productivity. Many of our customers are running these as well. There isn't much to the device... no OS, no disk, no fan, no viruses, no patching, no state... you can almost think of it as a remote/networked frame buffer on steroids. Coupled with USB peripheral support, mobile session capability, Java Card security, DoD-approved multi-compartment support, and VoIP telephony, this is a device that deserves all the attention and acceptance it is getting.

Using Tarantella, Citrix, or other techniques, this device can even display full-screen Windows (indistinguishable from a Windoze thin client) if desired, or it can run "Windows in a window" from a native GNOME Linux or Solaris desktop. With the Java Desktop System's integration of hundreds of bundled apps (StarOffice [MS Office], Mr. Project [MS Project], GIMP [Photoshop], Evolution [Outlook], etc, etc) some are looking at the opportunity to stop payments to Redmond.

Whatever your choice of display and environment, just pull your Java Card (your session is preserved on the server) and reinsert it later at home, or the next day in another office, and your session will "instantly" pop up in front of you ready to continue your work.

However, a customer recently expressed a concern that the SunRay isn't powered by the latest processor technology, and isn't populated by a huge bank of RAM. Hmmm. I wonder if this person might also consider writing to and asking:

Norelco why their electric razors are powered by two AA batteries, when MegaRaz offers your choice of 220V 3-phase or dual-feed 30A single-phase units that can rip thru facial hair and auto-exfoliate the top layer of skin in record time.

Panasonic why their microwave ovens are still powered by radio-wave emitting magnetrons. Don't they know that MicroRad now offers lead-lined plutonium-powered resonant-coupled chamber ovens that can cut food prep time by a factor of 50 over obsolete microwave ovens?

Kenmore why their refrigerators have not kept up with the times, when DeepFrz and many others now offer a turbo-switch option that circulates liquid hydrogen to drop the freezer compartment temp to near absolute zero, extending food storage times for future generations. Many use this feature to preserve small pets during vacations, eliminating the need for pet sitting or boarding.

Those were designed to be funny, and to make the point that engineering often makes design choices that are "good enough". The SunRay has to have enough power to paint pixels. And it does. Future versions might require more capable processors to handle stronger encryption at faster network speeds, 3D acceleration, etc. But gratuitously incorporating leading-edge technology into a design can increase cost, heat, power, noise, and instability with no added benefit. Be careful what you ask for... because you'll end up paying for it. Requirements should be linked to the business value they provide, and not to an emotional "got to have it just because" craving that is fueled by consumer marketing campaigns.

Sunday Mar 13, 2005

SOA: Debating our CTO

I have the utmost respect for our CTO of Enterprise Web Services, John Crupi. He is a great guy and one of our sharpest arrows. If you get a chance to hear him speak, you will enjoy the time and take away valuable insights. John recently joined the BSC community (blogs.sun.com) and posted a brief intro to SOA. Welcome John! I look forward to future updates on this topic on your blog.

Me, I'm a Lead Architect with a background built on consulting and systems engineering primarily at the IT Infrastructure level, focused on most of the solution stack - up to but not generally into the functional business logic or S/W app design space. Prior to Sun I spent years as a programmer translating business requirements into S/W solutions... but that's been a while.

With that context (the fact that I come to the table with certain biases and experiences that color my perceptions, and I suspect John does as well, to some degree) I'm going to suggest that maybe John is slightly off-base w.r.t. his premise about SOA and IT / Business Unit (BU) alignment. In the spirit of extracting deeper insights and clarifying positions, I'm going to challenge John with an alternate view (a debate), and ask him to either agree with me or defend his position. Hmmm...is this a Career Limiting Move - publicly challenging one of our Chief Technology Officers? No... not at Sun. We encourage our folks to question assumptions and even our leaders, resolve/align, and then move forward in unity. Okay, with that:

John, you suggest that: one of the critical success factors for SOA is a tighter relationship/alignment between Business Units and IT. In fact you say we can not even do SOA without effort on the part of the Business Unit.

Now I could not agree more that Business/IT alignment is absolutely paramount. The lack of business focus and alignment is one of the top reasons why so many IT initiatives fail to deliver, fail to meet expectations, or fail to provide a higher return to the business than their cost. I've blogged about that very topic.

However, that alignment, IMHO, is not related to SOA. In fact, I believe there are benefits to isolating service construction techniques from the consumers and owners of those services. To reuse the power utility metaphor:

You don't care how S&L built the power plants that deliver your electric service, or how power distribution provisioning logic taps into multiple grid suppliers and peak-load gas turbines. You simply have specific service level and financial demands, and expect a quality experience when/if you have to interact with the service desk to resolve a dispute, request a change in your service, or report an incident.

There are two primary components to IT... the design/development of services, and the operation/delivery of services.

"Business - IT Development" alignment is driven by business requirements (functional, service level, cost, time-to-market, etc). SOA isn't a "requirement", but a technique that helps IT achieve the desires of the business to support their business processes.

"Business/IT Operations" alignment is properly performed as defined by ITSM/ITIL Best Practices, and as illustrated in my graphic below. Business and IT need to work as a intimate partnership to define, implement, deliver, and continually refine an optimized Service Portfolio at contracted service levels and an established and predictable cost point. Again, SOA is simply a technique that helps IT achieve operational excellence.

All other functions are internal to IT. The fact that requirements are fleshed out in an Agile fashion and constructed/deployed using a SOA strategy is meaningless to the Business Unit. They simply want IT to build the capability they need, adjust it when asked, and deliver it as expected.

[Graphic: Business/IT Operations alignment model]

As a consumer and purchaser of various utilities (electricity, gas, cable, phone, water, etc) you don't need nor want to know the details of how the utility achieves scale economy or service resiliency or security or efficiency/utilization or adaptability or regulatory compliance. Well, okay, you and I by nature might be curious and like to know how these things work. But, in general, exposing the internal details of how a Service Delivery Platform is constructed is, IMHO, counter-productive to the Business/IT conversation and partnership. Some curious BU stakeholders will likely want to understand and even attempt to influence your model (eg: buy EMC, use .NET, etc). But that kind of inquiry can expose dysfunction and introduce inefficiency in the model. You don't tell Pacific Power to buy GE turbines or supply power at 62 hertz, unless you want to pay extraordinary fees for your own custom power plant.

I strongly believe in the principles of Agile development and architecture. Clearly the days of throwing a fixed requirements document over the wall are over. Business Units, IT Operations, and IT Development all must work together in a healthy partnership focused on continuous business process optimization and refinement. However, in my opinion, the true value of SOA is in the benefits it delivers to the internal IT function w.r.t. scale economy, resiliency, efficiency, adaptability, etc. Business Units don't need nor want to know about SOA... they simply have (frequently changing) requirements and expectations.
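
To make that concrete, here's a minimal Java sketch (the names and logic are entirely made up for illustration). The Business Unit conversation is about the capability the contract represents; whether IT fulfills that contract with a local class, an EJB session bean, or a SOAP endpoint behind a registry is an internal SOA decision they never need to see.

```java
// A made-up service contract: the business-facing capability is all
// the consumer (or the Business Unit conversation) ever deals with.
public interface OrderService {
    boolean isInStock(String sku, int quantity);
    String placeOrder(String customerId, String sku, int quantity);
}

// One of many possible fulfillments of that contract. Swapping this for a
// web service or a session bean changes nothing for the consumer.
class SimpleOrderService implements OrderService {
    public boolean isInStock(String sku, int quantity) {
        return quantity <= 100;                      // placeholder logic
    }
    public String placeOrder(String customerId, String sku, int quantity) {
        return "ORD-" + System.currentTimeMillis();  // placeholder order id
    }
}
```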

Bottom line: SOA is an architectural style/technique that IT shops will employ to quickly respond to changing service level demands, while operating IT as an efficient, adaptable business with an ability to tap into (integrate with) external/outsourced partners (blog on Coase's Law).

John - I respectfully invite a reply.


Friday Mar 11, 2005

ITSM: Transforming IT

Here are two recent letters I sent to customers following workshops designed to map out a strategy to transform their IT organization thru the assessment of their people, processes, and technologies and the application of best practices. I thought that these might be beneficial to others who are attempting to do likewise. There are no great pearls of wisdom here, but it might get you thinking about having the conversation. ITSM = IT Service Management.

One client is attempting to synthesize several frameworks (ITIL, Sigma, ISO, and CMM-I) into a multi-year strategy to uplevel their operational capability. They asked for a mapping between ISO and ITIL, to which I replied in the 2nd letter (below).

Hi <--->,

I'm glad to see you are moving forward with this. As we mentioned during our workshop, some clients choose to perform the SunTONE assessment by themselves. Others seek assistance from Sun or a partner. Still others do both... performing an informal survey themselves and then requesting a formal evaluation from Sun. Either way, since I'm just down the street from you, I would like to keep tabs on your efforts and help ensure you get the assistance you need and the results you desire. If you find there are areas that you'd like to target for improvement, I can also help suggest services and/or technologies and/or best practices that will help improve your "score". Of course, it isn't about the score - but a firm's ability to deliver a quality service and experience that meets documented SLOs at a desired level of security and cost.

As I've mentioned, your operational capability is already (it appears) at a higher state of maturity than most. A SunTONE "stamp" will certify this capability and is a badge of honor. You'll join hundreds of other firms that have attained this status, and will differentiate yourselves from the other hosting centers.

If you have a standing meeting to discuss status and actions and gaps associated with this effort, and if you think I could add value to this meeting, I would be more than happy to attend and provide insights and suggestions where appropriate.


Hi <--->,

I'm more of a Sigma guy than an ISO guy... But from my investigation of ISO, it seems clear that a clean mapping exists between ISO and Sigma. These are initiatives to create and document and control processes to ensure a high degree of quality and predictability and continuous improvement/refinement. These are wonderful tools to ensure a process continues to be aligned with expectations and goals, and is as efficient as possible.

ITIL and SunTONE, on the other hand, DEFINE best practices and processes.

See the difference? ITIL is a set of practices/processes, whereas Sigma and ISO are mechanisms to ensure any process is (and continues to be) optimized.

So, in that sense, they are HIGHLY complementary, but orthogonal. I don't believe there is overlap or mapping between ISO and ITIL. You really need both the processes (ITIL) and the means to define and measure and analyze and implement and control (Sigma/ISO) those processes.

Note that both ITIL and Sigma/ISO are systemic/intrusive frameworks that, if done right, will infiltrate the whole organization and will be embraced and promoted from the highest levels. It is a culture change that takes more than a training campaign, MBOs, and a tiger team. You already know this, but many clients fail because they are not prepared to endure the multi-year evolution that this kind of change requires. But, for those that succeed, there are great rewards all along the way... incremental quick-hit benefits that don't require huge time or resource investments.

Many IT shops, I believe, will be outsourced and/or be "consolidated" over the next few years because they can not control their costs, security, and service levels. ITIL+Sigma/ISO is the path to survival and excellence.

Hope this helps!!

Java Jingle

Java Jingle from 1997: http://blogs.sun.com/roller/resources/dcb/Java.mp3

I think Sun employees wrote and recorded that song. Anyone recall who? A verse near the end states: "Nobody can tell you what the future may bring...". Well, that was 8 years ago. Check this out!

As Java technology enters its 10th year, the Java brand is one of the most powerful technology brands on the planet. You'll see it on your Java-powered mobile phone from Sony Ericsson, Motorola, etc or your Palm PDA, on a variety of new PCs from the factory, built into various printers from Ricoh, baked into mobile games, and part of a slew of websites from our partners like Borland, Oracle, and others. Java technology is on over 2 billion devices and counting!

The Facts
In our most recent study we found that 86% of consumers and 100% of developers and IT professionals recognize the Java brand. In addition, we have seen the association of Java and Sun grow by 15% year over year. Over 80% of developers and IT professionals know that Java comes from Sun. In addition, 1 in 3 consumers will buy a product with the Java brand over a comparable product, up from 1 in 5 just a year ago. Java.com just blew past 10 million visitors per month, which is more visitors than Nintendo.com, Wired.com, Playstation.com, Time.com, Businessweek.com, and many others. Here are some facts and figures:

  • 2 million downloads of J2EE 1.4 - the most popular release ever!
  • 4.5 million Java developers, up 7% from June 2004
  • 2 billion Java-enabled devices, up 14% from June 2004
  • 750M Java Cards, up 25% from June 2004
  • 579M phones, up 65% from June 2004
  • 650M PCs, up 8% from June 2004

Power Hungry Grids!!

I find it ironic that our industry uses power generation and distribution grids as a metaphor to describe the utility-based computing model that is being promoted by vendors and demanded by an increasing number of customers. Actually, it is a reasonable and appropriate analogy. You don't build your own unique power generator for your home or business, and you don't hire a Chief Electrical Officer. Instead you plug into the power grid(s)... and leverage standards, scale economics, and the variable cost structure of a reliable shared service provider, paying for what you consume at a predictable cost per unit. Because it is a commodity that adheres to standards, you can easily switch providers with little or no impact to your operation. You demand a level of service quality, and know what you are willing to pay for that service.

I find it ironic simply because it will take a main artery from the power grid to, well, power the compute grids being designed. There are plans on drawing boards to increase the compute density of future servers such that a standard 19" datacenter rack (fully populated with the most dense compute servers) will consume up to 25 kW of power!! That's huge. Consider a data center floor filled with these racks. You can imagine the engineering challenges associated with extracting that much heat from these blast furnaces. And then, of course, it's up to the datacenter to do something with all that heat. One customer measured hurricane-force chilled air speeds underneath their raised floor tiles! To make matters worse, according to p.20 of this report (see the table below), computer equipment accounts for less than half of the power demand for a typical data center.
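
For a rough sense of scale, here's a back-of-the-envelope sketch in Java (my own assumed numbers: the standard ~3,412 BTU/hr per kW conversion, 12,000 BTU/hr per ton of cooling, and a facility overhead factor of roughly 2x based on the "less than half" figure above):

```java
public class RackHeatEstimate {
    public static void main(String[] args) {
        double rackKw = 25.0;                        // fully populated rack, per the post
        double btuPerHour = rackKw * 3412.0;         // ~3,412 BTU/hr of heat per kW of load
        double coolingTons = btuPerHour / 12000.0;   // 12,000 BTU/hr per ton of cooling
        double facilityKw = rackKw * 2.0;            // assume IT gear is ~half of total demand

        System.out.printf("Heat to remove: %.0f BTU/hr (~%.1f tons of cooling) per rack%n",
                          btuPerHour, coolingTons);
        System.out.printf("Total facility draw: ~%.0f kW per rack, IT load plus overhead%n",
                          facilityKw);
    }
}
```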

The good news is that you'll have an unprecedented amount of compute power on each floor tile, so in theory, you won't need as many racks. Of course, we all know that the demand for compute capability exceeds the supply. On the other hand, the ultimate realization of the utility model suggests that you might not even have your own datacenter. Like your gas, water, electricity, cable, and phone services, the cost of the building, of powering, cooling, and administering the equipment, of security, insurance, disaster recovery, etc, will all be absorbed by the utility provider. You simply pay for the service at a known rate per unit of consumption.

That sure sounds great in theory (unless you are the Chief Integration Officer, or Chief Infrastructure Officer). It'll be fun to watch this play out. And watch IT earn the title: "Information Technology".

Tuesday Mar 08, 2005

The Science of Data Recovery

Chris Gerhard made an offhand comment about the fact that disk scrubbing simply hinders (doesn't necessarily prevent) a motivated attempt to retrieve information from a disk drive. Disk scrubbing is the process of (attempting to) securely erase a disk to prevent others from accessing previously stored information. This is typically done by writing (possibly multiple times) random data over the entire surface of a disk.
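
For the curious, here's what that looks like as a minimal Java sketch (the file name and pass count are placeholders; real scrubbing tools work against the raw device and follow formal standards, and as Chris points out even then the data may not be gone for good):

```java
import java.io.RandomAccessFile;
import java.security.SecureRandom;

public class SimpleScrub {
    // Overwrite a file several times with random data. Illustrative only:
    // it hinders, rather than guarantees against, forensic recovery.
    public static void scrub(String path, int passes) throws Exception {
        SecureRandom rng = new SecureRandom();
        RandomAccessFile f = new RandomAccessFile(path, "rws"); // "rws" pushes writes to the media
        try {
            long length = f.length();
            byte[] buffer = new byte[64 * 1024];
            for (int pass = 0; pass < passes; pass++) {
                f.seek(0);
                long remaining = length;
                while (remaining > 0) {
                    rng.nextBytes(buffer);
                    int chunk = (int) Math.min(buffer.length, remaining);
                    f.write(buffer, 0, chunk);
                    remaining -= chunk;
                }
            }
        } finally {
            f.close();
        }
    }

    public static void main(String[] args) throws Exception {
        scrub("scratch.dat", 3);  // hypothetical file and pass count
    }
}
```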

Since I work with various government accounts/agencies/programs, this is an area of interest to me and some of my clients.

You might think that a digital medium designed to store only zeros and ones would be immune to forensic recovery of residual data once the zeros and ones are randomly altered. The fallacy with this is that magnetic storage is not a digital medium at all. Magnetic domains are created when the read/write head applies energy to a bit location to align some (not all) of the particles to reflect either a zero or a one. The precise location of the "domain" for each write varies slightly in three dimensions (including depth). This reality provides interesting opportunities or risks (depending on your perspective).

A colleague (thanks Joe) pointed me to a fascinating report on techniques involved in recovering data from ostensibly erased disks and computer memory. This is amazing and spooky stuff for the technically inclined. Here is another report (thanks Kurt) that's also very interesting and enlightening. Joe also pointed me to Prof. Gutmann's website; he has a lifetime of security-related knowledge to share!

Here are a few brief excerpts (read the article for context):

When all the above factors are combined it turns out that each (disk) track contains an image of everything ever written to it, but that the contribution from each "layer" gets progressively smaller the further back it was made. Intelligence organisations have a lot of expertise in recovering these palimpsestuous images.

To effectively erase a medium to the extent that recovery of data from it becomes uneconomical requires a magnetic force of about five times the coercivity of the medium... (a modern hard drive has a coercivity of 1400-2200 Oe).... Even the most powerful commercial AC degausser cannot generate the Oe needed for full erasure. It may be necessary to resort to physical destruction of the media to completely sanitise it (in fact since degaussing destroys the sync bytes, ID fields, error correction information, and other paraphernalia needed to identify sectors on the media, thus rendering the drive unusable, it makes the degaussing process mostly equivalent to physical destruction).

One example of an adequate degausser was the 2.5 MW Navy research magnet used by a former Pentagon site manager to degauss a 14" hard drive. It bent the platters on the drive...

Monday Mar 07, 2005

"Sun DB" The Open Database

Our President & COO recently talked to the press about our plans regarding Sun's Open Source SQL database (see the link and excerpt below).

I believe "Sun DB" (a generic term for the concept) will provide huge value to our industry. Many will continue to choose to deploy their largest, most active, and most mission critical data stores on technology from traditional database vendors. However, Sun DB will provide a supported open standard and open source SQL data store at an extremely attractive price point (free?). IT Shops, Government Programs, Research Facilities, etc, will find this offering to be technically and financially irresistible for many types of deployments. And, I'm guessing that traditional database vendors will find intensified market pressure to readdress license models increasingly irresistible. It's a win-win for everyone... Well, almost everyone.

http://www.infoworld.com/article/05/02/16/HNsunpresident_1.html

Sun president talks databases, Sparc, and HP

Jonathan Schwartz talks about Sun's open source plans and offers Fiorina's successor some advice

IDG: Does Sun have a concrete plan to offer an open source database, or was Scott McNealy just being provocative when he suggested that recently?

Schwartz: To be a complete application platform you have to have some form of persistent storage. You can achieve that through a file system, a directory engine, a messaging store, the persistence engine in our application server -- those are all forms of databases. What we haven't done is address the SQL access database, which has been served well in the open source community by MySQL and PostgreSQL. We're committed to filling the hole -- all of the hole, not just the file system. We have to address the requirements of the SQL database, so I think we're quite serious about it.

IDG: Would you use the same model as you did with Linux on the Java Desktop System, i.e. take an existing open source product, tweak it for your needs and put a Sun label on it?

Schwartz: That's to be determined. Customers have said, 'We'd like an alternative to the existing choices we have.' And they are consistently asking Sun to go work on that issue.

IDG: So it's a matter of when and not if?

Schwartz: Absolutely.

Friday Mar 04, 2005

In Good Company: McNealy/Vass

My 30 minutes of fame! Some of our C-levels came to town yesterday to present at the IPIC 2005 Conference. Our execs love to meet with customers at every opportunity, so we were given a couple hours of their time before their flights - to host an exec roundtable. We invited some of our top customers. Scott entertained and enlightened the crowd from 10-11am. Bill Vass was on from 1-2pm. And, during our catered lunch between Scott's and Bill's talks, I was asked to talk about "Innovation & Value". It was a blast. MapQuest tells me I'm 2908 miles from Corporate HQ. CityDistance tells me I'm 2443 miles away. But, for 30 minutes... I was on the "A", um, "C" team!  :-)

Tuesday Mar 01, 2005

Client Engagement Prep Form

I created this a few years ago when I was an Area Product Specialist, flying into accounts all over the place for workshops and architectural or technology discussions. At the time, I needed a way to synchronize details about the account, the specific challenges/opportunities we needed to flesh out at the meeting, and travel logistics. It helps to set expectations and align messaging before a customer-facing meeting. Account teams were great at filling these out... I have 100+ of these in my e-mail archive! I generally used a descriptive subject line, such as:

Subject: Brillhart Customer Engagement: Xerox@Rochester [9 May 03]

Here is the form. Feel free to adapt and reuse!

This note contains important information regarding our upcoming meeting(s). Please verify that the meeting logistics are correct, and then complete the Meeting Questionnaire (see below).

If you intend for us to disclose any confidential information, please ensure you've completed all the Non-Disclosure (ND) paperwork and secured any approvals in advance. Some account teams believe they have a general bi-lateral ND in place, when in fact each meeting requires a separate approval. Please have the paperwork at the meeting. Thanks!

This Questionnaire doesn't take long to complete, and it really does help ensure success. Sales teams often benefit from this exercise as much as the presenter.

MEETING LOGISTIC SUMMARY
I'm scheduled to meet with you and your customer, [Xerox], in the [Rochester Area] on [Friday, May 9th] for [about 2 hours]. This engagement [\*is\*] covered by a signed ND agreement. The primary focus of this meeting will be [item #3] as described below, with particular emphasis on VCS competitive positioning.

Please let me know ASAP if any of this has changed. It might be useful for your customers to know a little about me before the meeting: http://brillharts.com/sig

MEETING QUESTIONNAIRE
In order to prepare for our upcoming meetings, I'd like you to fill out the following brief questionnaire as soon as possible. Please try to fill out everything just to be sure we are all on the same page. I've found this process really helps ensure a successful meeting. Thanks in advance for your time!!

1. Account Team Contact Info:
Sales Rep: 10-digit office/pager/cell
Client Solutions Contact: 10-digit office/pager/cell

2. Customer Name and their Function, Department or Group:

3. Directions to Meeting
(or an address - and I'll use MapQuest)
Hotel Recommendations, if an overnight stay is required
Do I need a car, or will you be picking me up at the airport?

4. Customer Prep Call
Do we have a customer con call scheduled with one of the key meeting participants to better understand their expectations for this meeting?

5. Primary \*Business\* Challenges/Goals
What are the primary \*business\* challenges/goals we are trying to help them with during this meeting?

6. Key Discussion Topics & Desired Outcome/Takeaway/Actions/Agreements
When we leave, what do \*we\* hope was accomplished?

7. How many people will be attending? Who are they?
What is their experience level or technical competence related to the topic of the meeting? Are they generally advocates, skeptics, or opponents of our approach to or stand on this topic? What level of influence do they have to make commitments and/or decisions? Who else from Sun will be in attendance? Consider inviting SunES personnel and strategic partners. Should someone from Sales Mgmt attend?

8. Do you anticipate the need to talk about Futures?
CPUs, Servers, Clustering, Storage/SANs, Solaris, Web Services/SOA, etc.... If so, have you secured ND approval?

9. Competition / Position / Traps?
What is the main competitive threat related to the topic of this meeting? Are we the incumbent or the challenger in this space? What "traps" might have the competition set for us?

10. Service Escalations / Quality Issues?
Have they had any serious product or service issues that might surface in this meeting?

11. Odds and ends:
What is the dress code?
Will there be a laptop projector?
Do they understand the general Sun product line and vision?

A quick FYI: Presentations are often more effective in a "chalk talk" interactive format. Please ensure there is something to write on (white board or easel). Sometimes the best approach is a laptop projector that projects onto a white board to facilitate annotations to the slides that relate to the customer's situation. Also, if we only expect the meeting to last a couple hours, try to secure other meetings to make the most of the day.

MY SERVICES
1. Engage the customer in an open discussion about their technical and business requirements, goals, and the expectations of both their mgmt and the end-users of the services they plan to deliver. Assist the customer in thinking through the various options and tradeoffs they can choose from during the architecture and design phase. Work with the Sales Team to produce a solution proposal. Continue to provide support to the Sales Team and customer as needed to secure the order.

2. Discuss our Vision and Roadmap and the Technologies that surround Datacenter Architecture and Operations. This can include N1, SOA and Web Services, ITIL Disciplines, Operational Capability, Utility Computing, Managed Services, etc.

3. Discuss High Availability using SunCluster 3, Replication Techniques for Disaster Recovery, and End-to-End Solution Architectures, and help the customer design a solution that solves the business challenge they are facing.

4. Perform an Architectural Review and Systems Performance Audit of the customer's current environment, and propose changes that will optimize their environment for their current and projected business requirements.

5. Deliver an in-depth technical review of our Servers, Interconnects, and Chip Architectures and position Sun w.r.t. competitive offerings, to help guide the customer to a decision that is appropriate for their current and projected needs.

6. Provide a high-level strategic overview of our Vision, Value Proposition, Broad Product and Technology Overview, and Competitive Positioning, to help the customer make an informed and confident decision to partner with Sun.

7. Work with Customer Engineers and SysAdmins at the customer's site to build a Proof of Concept evaluation environment using Best Practices, and then assist the customer in exercising the POC to demonstrate how its features and functions will enable the customer to succeed.

8. Other. Such as Storage NDs, Blade NDs, Volume Server NDs, etc.

Thursday Feb 24, 2005

Rocket Science & Open Standards

Here is a letter I sent to a Lead Architect I met at a particular "space agency", as a follow up to our discussion about one of their infrastructure redesign projects. I think many clients are wrestling with this topic, so I offer this as an open/anonymous letter for your consideration.

Hi <----->,

It was great meeting with you yesterday. Thanks for sharing some insights into your strategy and challenge. I applaud you for starting to think about your future infrastructure needs and the potential risk of status quo at this point. Too many clients wait until the last minute and then they find themselves in an urgent/reactive mode making poor and costly choices.

I was thinking more about your philosophy regarding the use of non-proprietary open standards. This is very important, and I'm a huge advocate of this approach. I agree that it is critical to architect a solution that promotes choice and permits you to migrate to different products and technologies and vendors without cost, delay, or pain. To me, and I would guess to you as well, this is the reason to select interoperable standards and "open" platforms.

As you know (although many people confuse the two), the "open source" movement is different from the value proposition of "open standards". The measure of whether something is open or not is determined by the cost/pain of extracting that component out of your solution and replacing it with another choice. Examples could include the server vendor (eg: HP to Sun Opteron), the OS (eg: Linux to Solaris), the SAN fabric switches, the J2EE App Server, the SQL Database, the Compilers, etc, etc.... Note that open source does not factor into the measure of being "open".

I do also recognize the value of open source. It can increase the rate of innovation through a global community. It can provide for independent security assessment and validation. It can offer a client the ability to tweak the product for their own needs (although I generally discourage this for support, quality, and complexity reasons). As you know, Sun has open sourced the code to Solaris 10! I never thought I'd see that happen, but it has.

There is another aspect that I believe is part of your strategy. If you build the upper layers using interoperable standards, then the layers below are often interchangeable even if they aren't fully open. For example, if you build your business logic using J2EE running in an App Server, then the OS and the H/W choice is much less "sticky". You can switch between SPARC and Opteron or between Solaris and Linux without cost or delay or pain. Also, if you code and compile your own apps, you can choose to use standard library calls that make the underlying platform easy to change.
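
A concrete (and hypothetical) example of that last point: if your code asks the J2EE container for a database connection by a logical JNDI name, the database vendor, the OS, and the hardware underneath all become configuration details. This sketch assumes it runs inside an app server with a DataSource bound at the made-up name "jdbc/MissionDB":

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class PortableDataAccess {
    public long countTelemetryRows() throws Exception {
        // Look the database up by a logical name configured in the app server.
        // Swapping the database, the OS, or the hardware underneath means
        // changing that configuration, not this code.
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup("java:comp/env/jdbc/MissionDB"); // placeholder name

        Connection conn = ds.getConnection();
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM telemetry"); // illustrative query
            return rs.next() ? rs.getLong(1) : 0;
        } finally {
            conn.close();
        }
    }
}
```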

There are, however, drawbacks associated with choosing products that do not have a well established and directed engineering and support mechanism. The key, in my opinion, is to select partners and products that embrace open standards (and open source) and yet have an auditable and proven support and engineering model. This gives you high confidence in your solution as well as the ability to change at will.

I believe Linux is fine as a personal desktop operating environment. I also think Linux can be a viable choice on which to run stateless, replicated (load-balanced) presentation-tier services. However, relying on Linux to host mission-critical applications and tier 2+ services will, in my professional opinion, significantly increase the risk associated with your mission support. It just isn't mature enough yet. There are reliability concerns, security concerns, scalability concerns, functionality concerns, support concerns, bug-fix responsiveness concerns, legal indemnification concerns, etc.

I offer the same counsel about the choice of your supplier of Opteron servers and other components in your solution stack. Many have found that the potential initial cost savings associated with building a whitebox generic server and using freeware software are often lost many times over in the frustration and hassle of dealing with bugs, quality issues, and the lack of features. These issues are highly mitigated when using "open standard" products offered by a partner like Sun that pours billions per year into R&D and QA.

I'll close by reiterating my suggestion that these should probably play a role in your infrastructure redesign:
  - Sun's industry leading Opteron servers (btw, our future roadmap is extremely interesting)
  - Sun's open source Solaris 10 operating environment
  - Sun's open standards "platform software stack"
          (app server, directory server with ActiveX interoperability, portal server, identity server, etc, etc)

We also have an interesting suite of virtualization and automation solutions, including our N1 Service Provisioning Server.

I'd love to support you in learning more about and even evaluating these products and technologies and strategies. I'd also be glad to act as a general sounding board and/or provide architectural review and guidance as desired.

Please feel free to contact me any time. I look forward to hearing from you.

Best Regards,

-- Dave

Dave Brillhart
Lead Architect - Strategic Government
Client Solutions Organization
Sun Microsystems, Inc.
