September 22, 2009

Musings on 1st day of IDF - Whither Itanium?

OK, so I managed another early start on my day and caught Caltrain #217 to San Francisco. Destination: Moscone West, my day home away from home, and Intel's Developer Forum (IDF).

The trek to IDF is a late summer early autumn pilgrimage that the Intel faithful have been engaging in for years. While the venerable PC geek fest is still filled with technical sessions and other nuts and bolts revelations for the developer, the focus of IDF has continued its movement away from solely that of PC hardware to embrace a much broader audience including smaller and embedded devices, servers and big-stuff, as well as software.

The theme of Paul Otellini's keynote was the Spectrum of Computing on Intel Architecture (IA), with a particular emphasis on the Continuum, i.e. illustrating the scaling of IA from Atom through Core up to Xeon. No longer is the focus on personal computing, but rather on making all computing personal. This was followed by a long list of examples: the tiniest of handheld devices, special application hardware, laptops, desktops, servers, etc. Intel went out of its way to show that IA can handle the workloads and form factors of just about every size imaginable.

While Mr. Otellini made it clear that Intel believes PC sales will be flat, not down, in 2009 and will show substantial growth in 2010, it is equally clear that the future will be driven by smaller handheld devices such as netbooks, and the even smaller. These devices are the domain of the Atom processor family, a growing reality that Intel reaffirmed by announcing the Intel Atom Developer Program. Atom-based smaller devices were clearly on Intel's mind and were plugged frequently as part of the Continuum, often shown side by side with next generation laptops to illustrate the seamlessness of the user experience across a variety of form factors.

Intel also spoke in detail about its strategy for enabling cross platform applications, including support for Windows 7, Adobe AIR, and Microsoft Silverlight, amongst others. Cross platform has always been a part of the Intel heritage, but when one stops to think about the potential impact in the embedded or purpose built hardware marketplace, there are many market forces at play that could help Intel drive its embedded Atom approach.

The universe of non-PC, non-laptop, non-mobile phone, non-server, and non-storage devices is enormous. In the realm of IT, it is the true elephant in the middle of the room. Yet for the most part, the embedded space remains the realm of proprietary chips and operating systems that by definition have not taken advantage of the cost efficiencies of standardized components and software. With the scale that Intel can bring with Atom, it is hard to imagine a marketplace that would not be touched by the efficiency of scale that Atom portends, combined with the choice of operating system. While servers, desktops, and laptops remain interesting, the embedded market is where we see the latent potential for very big action, including medical devices, transportation and shipping logistics, mobile devices, and, with all puns intended, slots and gaming machines.

Speaking of servers, it was interesting to note the complete lack of any mention of Itanium in the morning's keynote, and only a passing mention in Sean Maloney's afternoon keynote. The number of enterprise-focused (formerly Itanium-only) features appearing in the Xeon roadmap continues to grow, and the glaring omission from Intel's Continuum of processors and architecture discussion cannot be all that reassuring to the Itanium dependent. Sean did repeat the now familiar mantra that the value of Itanium server sales exceeds that of SPARC-based systems, but in a market where Sun's sales have gone on hold pending resolution of its acquisition by Oracle, is this really all that surprising? If I were an Itanium loyalist, the muted tones of support and minute placement of my cherished platform would be raising the hair on my back. I wonder what the tone will be tomorrow night at the Itanium Solutions Alliance Awards party.

Of course any trip to Moscone West would not be complete without yet another loosely defined Cloud experience. In both of the first two keynotes, the Cloud discussion came up, and once again we see the effects of marketing buzzwords overtaking anything resembling consistent definition. Paul Otellini made the comment that we are no longer in a client-server age, but rather a many clients-many clouds age. In the afternoon keynote Sean Maloney talked of the continuum of the data center redefined including one environment called the Internet/Cloud. Argh!

Yes, I would completely agree that the client/server era is dead, absolutely. However, this industry wide fascination with ill defined Clouds is truly annoying. If the cloud is an entity that lives "outside" or "elsewhere", is it not the totality of everything that is not "inside" somewhere else? If so, how can we have a many clients to many clouds relationship? If the Cloud is a class of computing solution, as implied in the afternoon keynote, then what is the difference between Cloud and Internet? I may be picking on this topic a lot this year, but the lack of clarity is not good for anyone. If everything is a Cloud, then nothing is a Cloud.

Overall, it is encouraging to hear Intel continue to calibrate itself beyond the traditionally narrow focus of the semi-recent past (PC) to embrace a much broader view of the computing opportunity. While today’s keynotes were not earth shattering and revolutionary, they were incremental, future focused, and largely rational in nature. From a big picture perspective, the company has illustrated a long term vision that to our way of thinking makes a good deal of sense. Of course executing upon a vision can be more difficult, and the Santa Clara company is not without its executional shortcomings. Nevertheless, its vision is largely rational, which is more than what some companies can offer.

September 02, 2009

Pondering VMware and the Road Ahead

Monday was the beginning of the weeklong virtualization love fest known as VMworld 2009. Although the exposition did not start until Tuesday, Monday was a day of tutorials, preparation, and an industry analyst event where VMware parted the corporate kimono for us market watchers to gaze inwards on the company’s present and future plans.

While certain announcements will be made during this week, other content will remain under NDA for the near future; nevertheless, it is always enlightening to hear directly from company executives their assessment of the corporate strengths and weaknesses, as well as their big picture vision for the company. Happily, there was plenty of information and perspective shared, even when it challenged the presentation schedule.

Not surprisingly, VMware extolled the virtues of virtualization, and its market success especially in the upper echelon of the marketplace. At the same time, the company recognized that its lowest hanging fruit, x86 server virtualization, is starting to mature in the marketplace, and that a hypervisor alone does not make for a long term corporate revenue growth strategy. Hence, there is a market imperative for the company to move beyond the tactical achievement of server virtualization and consolidation towards a strategic position of extolling the business benefits of virtualization.

Given the growing market position of Citrix/Xen and Microsoft, it is clear that VMware needs to maintain a competitive differentiation that exceeds being a supplier of a software hypervisor. At the same time, hardware based hypervisors such as IBM’s Power based CPUs are continuing to push the performance envelope and questioning the assumption that an additional software hypervisor is needed to support Linux workloads. Many x86 Linux binaries can run well in emulation on most any current vintage System p or i.

To VMware’s credit, the company has outlined several initiatives that seek to elevate the discussion of virtualization to a higher level, to include the often-overlooked components of any virtualization solution: storage, networking, management, security, compliance, provisioning, scheduling, and monitoring. Informally, the company is beginning to refer to its approach as the “software mainframe” (with the requisite disclaimers). Although it is hard to imagine an array of virtualized x86 systems matching the technical achievement of the System z, the notion of reducing the complexity while increasing the agility of the datacenter remains laudable. We applaud VMware's vision and its realization that there is much more to virtualization than server consolidation. Abstracting the view of the datacenter to a set of all encompassing virtual services proffers many benefits, not only to IT professionals but also to end users within the organization.

At many levels, the concept of what VMware executives were touting as The New Age Desktop makes sense. The notion that desktops are virtual environments delivered through a catalog of IT services on demand undoubtedly has great appeal to IT professionals, and with the correct positioning and user empowerment, could address many of the functional needs within organizations. If the desktop is truly a portal to applications, then it matters little where the application, processing power, storage, etc., reside. Yes, there are compliance and religious issues that may affect this, but these are not functional issues. Moving away from dedicated thick client access devices can help turn hardware management issues into service and scheduling issues, which are typically simpler and less costly to address.

As rational and “simple” as this all sounds, it is also a very familiar refrain that, depending upon the year, could be associated with the Network Computer, NetPC, Windows Terminals, Thin Clients, ASPs, RDP, Consolidated Client Infrastructure, and the list keeps on going. The mantra of centralized computing is well understood by IT. The cost effectiveness is well documented. Now vendors are telling us that clouds will solve all of IT’s problems by centralizing computing and abstracting software from hardware; clouds will provide a simplified, consolidated, cost effective IT solution that will make things simple, and we won’t have to worry anymore. Yet, despite all of this, a quick gaze across the IT landscape does not show that centralized computing has toppled distributed computing.

So what’s the problem (if it were only that simple)? The problem is that there is not a single problem. IT is an amalgamation of resources deployed across a considerable stretch of time, and in most cases it reflects the intersection of the business needs of the moment and the limitations of the technology marketplace at a given point in time. Even though the TCO and ROI metrics favor much of the approach posited by VMware, this is a huge undertaking for most any organization, one that will for a variety of reasons be incremental in nature and likely take years to achieve.

The other issue, easy to paper over at a high level but practically much harder to budge, is the role of existing legacy systems, especially in mission critical applications. It is clear that interconnectivity of the legacy with the new will be paramount for success. Some of the current Cloud Computing FUD would cause one to believe that the legacy will simply disappear, but we believe reality will take a far different path. While Cloud Computing, whatever it is or is believed to be at any given moment, may develop alongside existing IT investment, it is doubtful the cloud will subsume all of IT in its current form.

Hence, not only is there an internal rationalization, virtualization, and simplification path to be followed, but it will be complicated by the degree of interconnection with the even less well defined abstraction known as the cloud. As IT of the here and now illustrates so well, it is difficult to maintain order over physical (and, to a lesser extent, virtual) beasts that are well defined and understood. It is a much more perplexing challenge when one of the major components of the IT strategy, the cloud, is comparatively poorly defined and understood.

Overall, the challenges for VMware, or any vendor for that matter, to elevate virtualization from a tactical cost saver to a strategic business imperative are clear and substantial. That is the bad news, but it is also the good news. For a software vendor, there are few better opportunities than a difficult challenge that is waiting to be solved. However, this next step down the virtualization path is immensely more complex and hence risky than the relative ease of tactical virtualization and consolidation that has made the company so successful. It would seem a tall order to expect that all of the past “sins” of IT, and IT related business process would be cleansed from the modern enterprise by its wholesale investment in the latest and greatest “scale out” application development models on virtualized x86 platforms. However, given its competitive prowess to date, we expect that VMware will give this its best shot, and that the marketplace and IT consumers overall will be better off for it.