November 11, 2009

EMC, the Acadian Accomplices, and the Private Cloud

Once again I find myself the lucky recipient of the honorable identifier, AA 197 seat 22D, and as such have a few hours to kill in a rather austere low humidity environment. Thank goodness for my latest iDevice with its requisite collection of jazz, news podcasts, and noise cancelling headphones.

I have spent the last two days at EMC’s Analyst Event in the Boston ‘burbs; it was an interesting study in just how much the Hopkinton Storage Master has transformed over the past few years. A scant decade ago, EMC was easily summed up as Symmetrix plus the other stuff the company sold. High performance storage was the game, and in the booming, best-of-breed focused 1990s, it wasn’t a bad fit. Of course the 1990s eventually ended with a resounding collapse of the marketplace that gave us rampant consolidation and retrenchment by much of the IT industry.

It was during the shaky years that followed that the focus and look of IT began to change, as the distinction between servers, storage, and networking began to blur considerably. Not all that long ago the notion of the storage server was an oddity, but IBM, then HP, and finally Sun began to embrace this blurring of purpose in their product lines. Yet today, this is an almost antiquated view given the transformative impact that virtualization has had across the board in IT. Perhaps some did not understand why in 2003 EMC, a storage company, would buy VMware, but today, this prescient purchase was an obvious underpinning for the transformation of EMC into something much more than a storage vendor.

Today’s buzz is all about cloud computing. At first blush one would expect that server and networking vendors would be the primary proponents, and for the most part they have been, but perhaps counterintuitively it is EMC’s far reaching embrace of the cloud approach that is most interesting. The recent announcement of the Acadia joint venture with Cisco, VMware, and Intel illustrates the degree to which EMC has embraced an internally controlled cloud as the future of enterprise IT. It was enlightening to hear the collective strategy of EMC and its Acadian Accomplices during the past two days; so much so that the relative lack of product announcements typical for such events was hardly noticed.

While many of the specifics discussed fall under NDA and thus are not to be repeated without the secret handshake, it was very clear from the event that EMC has focused its vision on delivering a new abstracted approach to the datacenter it has dubbed the Private Cloud. It is also apparent that this transformative message is more than buzzword saber rattling in that it not only seeks to depose the traditional silos of IT procurement (and the vendor community aligned therewith) but simultaneously cleverly engages the customer about what their business is seeking to do as opposed to how they do it. Who’d a thunk a “storage” company would transform itself into a thought leader for the datacenter with a vision that ultimately will obscure the mechanics of storage from not only the end user but most everyone in IT?

Readers know that I have taken issue with much of the “cloud dusting” in IT marketing of late, especially with respect to the notion of public clouds taking over IT as we know it in the very near future. However, much as in the early days of the public Internet, the opportunity more rationally lies within the enterprise. The Private Cloud as articulated by EMC et al. seeks to extend virtualization to its logical conclusion, where all IT resources are dynamically provisioned from a pool of well described and understood resources under the control of the enterprise. It also seeks to transform a capital expenditure mindset into an operational expenditure mentality, which can align more closely with servicing business objectives as opposed to technological deployment.

OK, this sounds wonderful, if not too good to be true; but enough gushing at the potential, let’s return to the ever present practical reality. Embracing the Private Cloud will take much more than technology; it will take behavioral change – the toughest product there is to sell. In a legal environment where compliance and best practices can instill a sense of dread if not fear into most any LOB professional, the sanctity of information control is paramount to corporate success. I would argue this need trumps any cost benefits that internal cloud scenarios may deliver.

But to its credit, this is where the multitude of other software acquisitions made by EMC over the past few years comes into play. Unlike the other major systems vendors, EMC has put a razor sharp edge on information management, not just the storage equipment, but the information being stored. With ILM in the early 2000s, followed by content management with Documentum, security through RSA, deduplication, and several others, EMC began to drive storage from the perspective of the information and its business value as opposed to a collection of hardware. Add to this mix the notion of virtualization, and a primordial ooze of datacenter wide virtualization and abstraction begins to form. This ooze is non-trivial in its impact on IT attitudes, and it is not easily created; in fact it could be argued that the Acadia venture would be difficult for any other combination of vendors to replicate.

The combination of EMC, VMware, Cisco, and Intel notably lacks any traditional systems vendor or services bureau presence. (Yes, each has some services business, but not in the big integrator sense.) Perhaps this lack is what has enabled the Acadian Accomplices to take a different approach to virtualization and clouds, bereft of the temptation to simply rename existing solutions as “cloud” without offering a transformative approach that focuses on the information (and indirectly the applications) first, and the physical considerations of server, storage, and networking second. For example, HP’s announced acquisition of 3Com should bolster its networking credentials, but its combination of server + storage + networking does not articulate the same vision as Acadia.

There are many, many specifics that need to be worked out, and the evolution to the datacenter building block that EMC is driving will be a long one. This is not a flick-the-switch, everything-is-instantly-different kind of endeavor. However, it is one with the potential to fundamentally alter how we view the datacenter, virtualization, IT provisioning, and oh yeah, storage. This is a strong message about virtualization that embraces all aspects of IT, not just servers, and focuses ultimately on the value of information and its enabling power to the enterprise. This is pretty dang cool.

September 22, 2009

Musings on 1st day of IDF - Whither Itanium?

OK, so I managed another early start on my day and caught Caltrain #217 to San Francisco. Destination: Moscone West, my day home away from home, and Intel's Developer Forum (IDF).

The trek to IDF is a late summer early autumn pilgrimage that the Intel faithful have been engaging in for years. While the venerable PC geek fest is still filled with technical sessions and other nuts and bolts revelations for the developer, the focus of IDF has continued its movement away from solely that of PC hardware to embrace a much broader audience including smaller and embedded devices, servers and big-stuff, as well as software.

The theme of Paul Otellini's keynote was the Spectrum of Computing on Intel Architecture (IA), with a particular emphasis on the Continuum, i.e. illustrating the scaling of IA from Atom through Core up to Xeon. No longer is the focus on personal computing, but rather on making all computing personal. This was followed by a long list of examples spanning the tiniest of handheld devices, special application hardware, laptops, desktops, servers, etc. Intel went out of its way to show that IA can handle the workloads and form factors of just about every size imaginable.

While Mr. Otellini made it clear that Intel believes that PC sales will be flat, not down, in 2009 and will show substantial growth in 2010, it is clear that the future will be driven by smaller handheld devices such as netbooks, and the even smaller. These devices are the domain of the Atom processor family, a growing reality that Intel reaffirmed by announcing the Intel Atom Developer Program. Atom based smaller devices were clearly on the mind of Intel and were plugged frequently as part of the Continuum, often shown side by side with next generation laptops to illustrate the seamlessness of the user experience across a variety of form factors.

Intel also spoke in detail about its strategy for enabling cross platform applications including support for Windows 7, moblin.org, Adobe Air, and MS Silverlight amongst others. Cross platform has always been a part of the Intel heritage, but when one stops to think about the potential impact in the embedded or purpose built hardware marketplace, there are many market forces at play that could help Intel drive its embedded Atom approach.

The universe of non-PC, non-laptop, non-mobile phone, non-server, and non-storage devices is enormous. In the realm of IT, it is the true elephant in the middle of the room. Yet for the most part, the embedded space remains the realm of proprietary chips and operating systems that by definition have not taken advantage of the cost efficiencies of standardized components and software. With the scale that Intel can bring with Atom, it is hard to imagine a marketplace that would not be touched by the efficiency of scale that Atom portends, combined with the choice of operating system. While servers, desktops, and laptops remain interesting, the embedded market is where we see the latent potential for very big action, including medical devices, transportation and shipping logistics, mobile devices, and, with all puns intended, slot and gaming machines.

Speaking of servers, it was interesting to note the complete lack of any mention of Itanium in the morning's keynote, and only a passing mention in Sean Maloney's afternoon keynote. The number of enterprise focused (formerly Itanium only) features appearing in the Xeon roadmap continues to grow and the glaring omission from Intel's Continuum of processors and architecture discussion cannot be all that reassuring to the Itanium dependent. Although Sean did mention the now often repeated mantra that the value of Itanium server sales now exceeds that of SPARC based systems, in a market where Sun's sales have gone on hold pending resolution of its acquisition by Oracle, is this really all that surprising? If I were an Itanium loyalist, the muted tones of support and minute placement of my cherished platform would be raising the hair on my back. I wonder what the tone will be tomorrow night at the Itanium Solutions Alliance Awards party.

Of course any trip to Moscone West would not be complete without yet another loosely defined Cloud experience. In both of the first two keynotes, the Cloud discussion came up, and once again we see the effects of marketing buzzwords overtaking anything resembling consistent definition. Paul Otellini made the comment that we are no longer in a client-server age, but rather a many clients-many clouds age. In the afternoon keynote Sean Maloney talked of the continuum of the data center redefined including one environment called the Internet/Cloud. Argh!

Yes, I would completely agree that the client/server era is dead, absolutely. However, this industry wide fascination with ill defined Clouds is truly annoying. If the cloud is an entity that lives "outside" or "elsewhere", is it not the totality of everything that is not "inside" somewhere else? If so, how can we have a many clients to many clouds relationship? If the Cloud is a class of computing solution, as implied in the afternoon keynote, then what is the difference between Cloud and Internet? I may be picking on this topic a lot this year, but the lack of clarity is not good for anyone. If everything is a Cloud, then nothing is a Cloud.

Overall, it is encouraging to hear Intel continue to calibrate itself beyond the traditionally narrow focus of the semi-recent past (PC) to embrace a much broader view of the computing opportunity. While today’s keynotes were not earth shattering and revolutionary, they were incremental, future focused, and largely rational in nature. From a big picture perspective, the company has illustrated a long term vision that to our way of thinking makes a good deal of sense. Of course executing upon a vision can be more difficult, and the Santa Clara company is not without its executional shortcomings. Nevertheless, its vision is largely rational, which is more than what some companies can offer.

September 02, 2009

Pondering VMware and the Road Ahead

Monday was the beginning of the weeklong virtualization love fest known as VMworld 2009. Although the exposition did not start until Tuesday, Monday was a day of tutorials, preparation, and an industry analyst event where VMware parted the corporate kimono for us market watchers to gaze inwards on the company’s present and future plans.

While certain announcements will be made during this week, other content will remain under NDA for the near future; nevertheless, it is always enlightening to hear directly from company executives their assessment of the corporate strengths and weaknesses, as well as their big picture vision for the company. Happily, there was plenty of information and perspective shared, even when it challenged the presentation schedule.

Not surprisingly, VMware extolled the virtues of virtualization, and its market success especially in the upper echelon of the marketplace. At the same time, the company recognized that its lowest hanging fruit, x86 server virtualization, is starting to mature in the marketplace, and that a hypervisor alone does not make for a long term corporate revenue growth strategy. Hence, there is a market imperative for the company to move beyond the tactical achievement of server virtualization and consolidation towards a strategic position of extolling the business benefits of virtualization.

Given the growing market position of Citrix/Xen and Microsoft, it is clear that VMware needs to maintain a competitive differentiation that exceeds being a supplier of a software hypervisor. At the same time, hardware based hypervisors such as those in IBM’s POWER based systems continue to push the performance envelope and call into question the assumption that an additional software hypervisor is needed to support Linux workloads. Many x86 Linux binaries can run well in emulation on most any current vintage System p or i.

To VMware’s credit, the company has outlined several initiatives that seek to elevate the discussion of virtualization to a higher level, to include the often-overlooked components of any virtualization solution: storage, networking, management, security, compliance, provisioning, scheduling, and monitoring. Informally, the company is beginning to refer to its approach as the “software mainframe” (with the requisite disclaimers). Although it is hard to imagine an array of virtualized x86 systems providing the same technical achievement as the System z, the notion of reducing the complexity while increasing the agility of the datacenter remains laudable. Nevertheless, we applaud VMware's vision and its realization that there is much more to virtualization than server consolidation. Abstracting the view of the datacenter to a set of all encompassing virtual services proffers many benefits not only to IT professionals, but to end users within their organizations.

At many levels, the concept of what VMware executives were touting as The New Age Desktop makes sense. The notion that desktops are virtual environments delivered through a catalog of IT services on demand undoubtedly has great appeal to IT professionals, and with the correct positioning and user empowerment, could address many of the functional needs within organizations. If the desktop is truly a portal to applications, then it matters little where the application, processing power, storage, etc., reside. Yes, there are compliance and religious issues that may affect this, but these are not functional issues. Moving away from dedicated thick client access devices can help turn hardware management issues into service and scheduling issues, which are typically simpler and less costly to address.

As rational and “simple” as this all sounds, it is also a very familiar refrain that, depending upon the year, could be associated with the Network Computer, NetPC, Windows Terminals, Thin Clients, ASPs, RDP, Consolidated Client Infrastructure, and the list keeps on going. The mantra of centralized computing is well understood by IT. The cost effectiveness is well documented. Now vendors are telling us that clouds will solve all of IT’s problems by centralizing computing and abstracting software from hardware; clouds will provide a simplified, consolidated, cost effective IT solution that will make things simple, and we won’t have to worry anymore. Yet, despite all of this, a quick gaze across the IT landscape does not show that centralized computing has toppled distributed computing.

So what’s the problem (if it were only that simple)? The problem is that there is not a single problem. IT is an amalgamation of resources deployed across a considerable stretch of time, and in most cases reflects the intersection of the business needs of the moment and the limitations of the technology marketplace at a given point in time. Even though the TCO and ROI metrics favor much of the approach posited by VMware, this is a huge undertaking for most any organization, one that will, for a variety of reasons, be incremental in nature and likely take years to achieve.

The other issue, easy to paper over at a high level but practically much harder to budge, is the role of existing legacy systems, especially in mission critical applications. It is clear that interconnectivity of the legacy with the new will be paramount for success. Some of the current Cloud Computing FUD would cause one to believe that the legacy will simply disappear, but we believe reality will take a far different path. While Cloud Computing, whatever it is or is believed to be at any given moment, may develop alongside existing IT investment, it is doubtful the cloud will subsume all of IT in its current form.

Hence, not only is there an internal rationalization, virtualization, and simplification path to be followed, but it will be complicated by the degree of interconnection with the even less well defined abstraction known as the cloud. As IT of the here and now illustrates so well, it is difficult to maintain order over physical (and to a lesser extent virtual) beasts that are well defined and understood. It is a much more perplexing challenge when one of the major components of the IT strategy, the cloud, is comparatively poorly defined and understood.

Overall, the challenges for VMware, or any vendor for that matter, to elevate virtualization from a tactical cost saver to a strategic business imperative are clear and substantial. That is the bad news, but it is also the good news. For a software vendor, there are few better opportunities than a difficult challenge that is waiting to be solved. However, this next step down the virtualization path is immensely more complex and hence risky than the relative ease of tactical virtualization and consolidation that has made the company so successful. It would seem a tall order to expect that all of the past “sins” of IT, and IT related business process would be cleansed from the modern enterprise by its wholesale investment in the latest and greatest “scale out” application development models on virtualized x86 platforms. However, given its competitive prowess to date, we expect that VMware will give this its best shot, and that the marketplace and IT consumers overall will be better off for it.

August 19, 2009

When New Becomes Old and Old Is Renewed Again?

OK, so I find myself again at 37,000 feet fighting dehydration and the limitations of American Airlines 1023 seat 9D (hey, at least it is an improvement over 14E of the previous flight leg). Not that I relish flying anywhere these days, but at least my stay in Austin provided me with some renewed vigor in my esteem for certain technology. In contrast, last week I was riding Caltrain, which provides infinitely more comfortable travel, but the event at Moscone West left me with a far less good feeling about the technology marketplace.

Last week was the triple trade show: OpenSource World, Next Generation Data Center, and CloudWorld in San Francisco. For those not familiar with SF conventioneering, Moscone Center (North and South) comprises the mega underground convention center where many a successful trade show has boasted endless educational opportunities as well as an expo floor filled with many vendors plying their wares for the adoring public. Moscone West is kitty corner from North and South, and does not boast the scale, or perhaps the cachet, of its larger twin sibling. This triple trade show was in Moscone West, possibly a warning signal of its truly consolidated and limited nature.

Although the many educational sessions at the show undoubtedly proved valuable to their attendees, the modestly sized expo floor reflected poorly on the market level support for what are supposedly three of the hottest IT topics. While I will not claim that one could have been bowling down the expo aisles without hitting any attendees, it was the first hour of the exhibition on its first day. This was prime time, but the sheer numbers and excitement level seemed more akin to off season reruns on a Friday night. There were no elaborate mega booths, not very many big vendor names, and perhaps more telling, few vendors related to Linux or Open Source.

It is easy for me to accept the presence of few Open Source vendors in 2009, as most customers typically care much less about the software’s molecular composition than about what it can do for their business. Perhaps Open Source, much like TCP/IP, is now just an accepted part of IT reality and no longer needs a trade show to raise its profile. However, for the other two topics, I don’t believe this to be the case. So, why is there apparently limited support from the marketplace?

With respect to Cloud Computing, the level of marketing FUD and muck has led to an environment where there are no broadly accepted definitions (even within some vendors) of just what the “cloud” is and where exactly it is collecting stray IT water molecules to be dispatched as needed. Is “cloud” a collection of technology, an API set, a philosophical approach, an outsourcing contract, another name for “grid”, or is it just an abused marketing term such as “ASP” that we will all run from in a couple of years? When the market cannot agree upon definitions, then articulating product and service offerings let alone differentiating them from the competition is very challenging. Maybe no one came, because no one knew what “cloud” meant and thus why it would be relevant to them?

While there has been talk of the next generation of data center for as long as there have been data centers, the reality of CAPEX on the balance sheet, the expense and availability of additional power and cooling, as well as stretched IT resources, have the attention of many an IT director and operations executive. The pain is obvious and regrettably commonplace; however, the problem is so massive that in some respects it might be easier to look the other way as long as possible, or at least pass the buck to facilities. The increasingly symbiotic relationship between IT and facilities is not well understood by most. Thus, it is a lot to ask of an expo floor to deliver simple off-the-shelf solutions to vexing, entrenched, systemic problems where the human elements are not even in alignment yet. Perhaps it was unrealistic to expect so.

So it seems that one of these “new” IT technologies (Open Source) is in danger of becoming mature and no longer worthy of being a discrete talking point, one lacks the basic definitions by which to have a meaningful discussion (Cloud), and one may prove too big and complex to talk about (NGDC). Not being able to define and therefore be able to talk about a subject or not wanting to talk about a complex subject leads to little discourse about either. Without talking, there is no chance of a conversation, let alone a sale. Perchance it should not be surprising that three shows combined into one yielded perhaps a third or fewer attendees than other “hot” topic shows in their prime. Underwhelmed, I rode Caltrain #256 home; at least I could feel good about supporting mass transit.

At least my flight to Austin would bring me a more positive technology outlook. One of the old marketplace stalwarts, the POWER processor, is due to have its latest incarnation discussed in some detail next week at Hot Chips 21. Although old by IT standards, POWER keeps being reinvigorated; it just keeps getting better with age. At an analyst only event in Austin, IBM briefed a few of us about many of the new capabilities that POWER7 as well as AIX 7 will bring. For aficionados of the processor family, POWER7 is likely to be welcomed with open arms. For those who are the competition, POWER7 may prove itself to be more of a competitive nuisance than POWER6 or POWER5 were. Given the NDA nature of the briefing, I can’t spill the beans on some of the exciting capabilities planned for the next round of POWER and AIX. However, some information about POWER7 will be made available at a Hot Chips session.

The preview of POWER7, AIX 7, and the Power System Software we collectively experienced in Austin addressed many of the issues that were thematic of the triple trade show of the week past. In fact, if the preview had taken place at the previous week’s show, maybe there would have been some palpable energy flowing on the floor.

Nevertheless, the three issues thematic to the under-attended trade show will be addressed in many ways by POWER7 et al. once the details are made public. It is interesting to see, once again, an “old” technology, driven by a “very old” company, very aptly addressing a contemporary set of issues in the marketplace while a three-in-one trade show could not seem to muster an audience commensurate with the applicability of its theme.

American Airlines 1023 is now descending on approach to SJC. I definitely feel consolidated in this economy seat. It’s time for me to switch context and go sit in traffic on I-280. Now if only Cloud Computing could do something about that.

June 02, 2009

JavaONE: Bright Future or Riding off into the Sunset?

Today, I set my alarm early, 6:00 AM early. The reason was to catch Caltrain #313 at 6:57 AM so I could make it to Moscone Center in time for Sun Microsystems’ keynote at JavaONE. As I made my way to the train station, I pondered what I was going to see and hear at JavaONE – an event that is very long in the tooth, at least in Internet years.

Today’s keynote was spun around what 14 years has brought, and there has been plenty of good brought by Java. In 1996, who realistically could have believed that literally billions of remote access devices would have Java as an embedded capability? Perhaps a few die-hard believers would have, but the scale that Java has achieved since its launch is beyond remarkable. Today we witnessed mobile phones, Blu-ray players, televisions, web sites, and CPUs, but precious little mention of “computers” except when seeing development tools. This morning’s guests included eBay, Research in Motion, Sony, Verizon, Intel, and Jagex (of RuneScape fame). As shocking as this would be from a 1996 perspective, in 2009, it is testimony to what Java has accomplished.

The notion of communities of consumers was prevalent throughout, whether they were Java developers, eBay buyers and sellers, BlackBerry users, or Blu-ray viewers at home (conveniently networked with numerous others through their PS3). While an obvious extension of the notion of Networked Computing, I wonder just how far the idea of automatic community extends to discrete users who just happen to be doing the same task at a given time.

Nevertheless, the potential of associating users into communities, even if they are of only short duration (such as the audience for this morning’s keynote) for commercial benefit is considerable. With each association, there is a data point, with each data point there needs to be a data repository, and for each demand for a data repository, there is a database vendor standing in the hall. Perhaps this relationship between consumers and Java devices is the ultimate reason why the Redwood Shores Company would want to purchase the creator of all things Java.

Java and I have a love-hate relationship that spans well over a decade. I have always loved the idealism, the technical achievement, and its role as the first technology that really made the Internet “fun.” However, I have always hated the arrogance, the early attempts at exercising proprietary behavior through an ill-fated ISO standardization scheme, and Java’s positioning as a replacement for underlying operating systems and hence their purveyors.

Probably what I found most remarkable was the tone of Java today: very consumer friendly. Yet Larry Ellison seemingly turned that positioning on its head during the last 15 minutes of the keynote. The message during the first 75 minutes was perhaps best summed up by a JavaFX TV architect who stated that Java was all about delivering content to the user’s “favorite screen in their life." It was all about consumers, info consumption, and developers seeking to convert labors of love into revenue streams.

Later on, when Oracle’s fearless leader joined Scott McNealy and James Gosling on stage, Java’s message seemed to revert to its early days. Larry boasted of Oracle’s 100% Java middleware stack, and soon to be 100% Java based applications. All of this is impressive to be sure, but it also sounds very much like traditional computing as manifest through networks seeking to solve business problems, not a consumer empowerment experience. For those seated in the audience, the question was simple, “which of these is the Java vision in a post-Oracle merger universe?”

Although Oracle through Ellison made many statements about the value of Java, it is hard to imagine that the attendees did not have some degree of worry introduced. There were at best only non-committal statements about the future of JavaONE. It will prove interesting to see just how Oracle ultimately chooses to leverage its new Java assets. Having billions of enabled devices implies untold numbers of potential database transactions, all of which surely would cause a smile to appear across Oracle’s collective brow. The collective $4 - $5 billion in potential R&D spending bandied around by McNealy and Ellison is impressive, and such an investment could help take Java to the next level.

However, just how interested is a business data and applications provider in driving consumer focused application sales through a newly announced Java store? How much desire is there to continue to push the developer communities in an open source paradigm? Does Oracle want all of what Java has become or will it be quite content to cherry pick the desired morsels, and simply wither away the remainder? Only time will tell.

It is clear that even if there is another JavaONE in the future, 2009 will likely be known as the last “real” JavaONE. Overall, the exhibition floor seemed subdued, and there was a subtle theme throughout Sun’s keynote message: “Java is really big, so big we have to keep telling you it’s big.” Seeing McNealy in action again on stage brought fond memories, but it also reminded me that these are memories, and that Sun and much of its unique history will soon mount their horses to begin riding off into the sunset. One can only hope that all of the good aspects of Java will endure through its new owner, and that the communities that have arisen around it are not inadvertently shown the pasture as well.

Change is inevitable, and technology has the inherent ability to speed up the rate of change. Java, as were Sun and the Internet, was clearly a game changer in the grand scheme of all things IT. Embracing change, as difficult as it is, leads to opportunity. Judging by what was shown today and the words spoken, change is in the air. The question is, “how will the marketplace embrace the change in motion?” Will developers be further empowered in a post Oracle-Sun universe, or will they experience seismic change in their platform underpinnings? Again, only time will tell. Speaking of time, I had better head back to the Caltrain station for the ride home.

April 20, 2009

Oracle and Sun, Sitting in a Tree, M-E-R-G-I-N-G

Oracle Corporation and Sun Microsystems announced that they have entered into a definitive agreement under which Oracle will acquire Sun common stock for $9.50 per share in cash. The transaction is valued at approximately $7.4 billion, or $5.6 billion net of Sun’s cash and debt. Oracle stated its belief that the acquired business will contribute over $1.5 billion to Oracle’s non-GAAP operating profit in the first year, increasing to over $2 billion in the second year. Sun Microsystems’ BoD has unanimously approved the transaction, which is anticipated to close this summer, subject to Sun stockholder approval, certain regulatory approvals, and customary closing conditions.

OK, so the proverbial other shoe has dropped. As we pondered a few days ago, there were two likely scenarios for Sun Microsystems’ future. It appears that Death by Acquisition is now the preferred outcome, at least according to the unanimous actions of Sun’s BoD.

Given the long established and well-regarded technical give and take of the two firms, Oracle’s proposed acquisition of Sun leads to the obvious conclusion that Solaris will unquestionably be the platform of choice for Oracle’s bevy of software assets. For current Oracle and Sun customers, this is good news as it elevates the Solaris platform to a premium position from an R&D and feature enhancement/customization perspective. In addition, Sun’s storage assets may become a more critical consideration for DBMS and related application design than they have been in the past.

Sun’s portfolio of open source applications and initiatives may offer some competitive differentiation and new opportunities in the database realm and perhaps to a lesser extent in other commercial application market segments. Perhaps having MySQL in the Oracle fold would draw attention toward open source DBMS for commercial users and might lay the path to a DBMS offering akin to IBM’s WebSphere Community Edition. While these are different product areas, the notion of community versions with an upgrade path to full fledged commercial applications would seem applicable in this context as well. Sun’s other open source assets, including Java, OpenSolaris, and ZFS, amongst others, would serve to bolster Oracle’s technical prowess while potentially expanding the scope, if not the reach, of Oracle access within and outside of the enterprise. This also bodes well for Oracle Fusion Middleware, which is built upon Java.

Outside of the database and commercial applications space, the positive impact of this announcement is less clear. From a hardware perspective, Oracle’s lack of history as a hardware systems vendor may give some pause to channel partners including SIs, as well as VARs and other resellers with a historically hardware-centric focus. With control over a leading platform for its cash cow DBMS and commercial applications, will there be a temptation to focus the hardware business too narrowly on being a carrying case for the software as opposed to a viable standalone business? How will traditional partners such as EMC, IBM, HP, etc. react to Oracle now being in a position to compete with their hardware and related services businesses? In the short term, none of this should present an issue, but over time, the potential synergies of Oracle + Sun could provoke consternation from important business partners. These challenges will need to be very carefully considered lest an important channel be harmed.

Since Sun contracts out much of its manufacturing, Oracle could easily continue this modus operandi, and not assume the potential long-term issues of investing in or shutting down aging manufacturing plants. This is a plus for Oracle, but it does remove one impediment that would otherwise justify the continued operation of certain hardware based product lines whose capital equipment cost has not fully depreciated. Nevertheless, in the short term we do not envision Oracle suddenly idling Sun’s substantial hardware business, but the long term synergies and strategies of the combined company may look rather different from those of a traditional hardware systems vendor.

With this acquisition, Oracle could start looking similar to a version of Big Blue from a couple of decades ago, i.e. a vendor that provided a complete vertically integrated package of server hardware, storage, operating system, middleware, commercial applications, and services. Will this combination meet with marketplace acceptance, or skepticism? Does the one stop shop with a comprehensive single vendor solution meet the expectations of the best of breed, open source, heterogeneous marketplace of the 21st century? Will customers perceive a bias in the support and R&D of Oracle products that favors Sun’s underlying technology? How will Oracle assuage customers’ fears that its improved efficiency and cohesion in delivering technology will also allow it to more efficiently command a greater share of its customers’ IT spend while reducing choice? All of these are mighty questions with a potentially significant impact on the IT marketplace in general, and on Oracle’s bottom line in particular.

Big Blue taught the industry a key lesson about being a systems vendor a few years ago, namely leave some space on the playground for everyone. With IBM, HP, EMC, Fujitsu, and the rest, there are opportunities for hardware and software vendors to collaborate, compete, and co-exist. Under a combined Oracle/Sun, there may be less space in the playground, at least for Oracle customers. To our way of thinking, this would not be good for the industry, nor for Oracle in the long term. If this acquisition is ultimately given regulatory approval, it would be in everyone’s best interest to keep a watchful eye on Oracle’s behavior. This merger is systemically different from Oracle’s past software-only acquisitions, and as such has the potential to disrupt the value proposition of hardware in many segments of the IT marketplace.

April 06, 2009

No Sun in the Big Blue sky

OK, so IBM is backing away from Sun after the latest round of negotiations. While there has been much speculation about the sale price and just how much CYA IBM would be willing to afford the Copernican Company, as of now, the deal has fallen apart.

So, where does this leave Sun, and more importantly its customers? Since SGI went down a few days ago for a mere $25 million, one could reasonably ask, "Is this the same trajectory for Sun?" While it is clear, to our way of thinking, that Sun is in a weaker position now than before the talk of merger began, the question now shifts to the likely long-term outcome for the company.

From our perspective, there are two likely scenarios for Sun, as we know it today: 1) Death by Acquisition, and 2) Death by a Thousand Cuts.

Under Death by Acquisition, Sun would ultimately be dismantled, but with a varying degree of strategy. The big potential suitors would fall into the "other system vendors" camp, i.e. HP, Cisco, EMC, Oracle, etc., or a VC/fund/private equity type of buyer. With Big Blue out of the picture, neither camp would seem to want to purchase and maintain Sun as we know it. Instead, they would be interested in running the company through a chop shop and parting out the desired pieces to the highest bidder (some of which might be the buyer itself), with the rest going to the IT equivalent of the scrap pile.

From a customer perspective, it seems that HP ownership of Sun would guarantee an outcome similar to that experienced by DEC customers, i.e. loss of the platform, followed by a lukewarm embrace if the customer would deign to port to Itanic. Customers never like forced ports. Granted, HP has expertise in supporting Solaris and Java environments, and their services folks, like all good services folks, ultimately deliver and support whatever equipment the customer demands. However, with control over the platform choices at the point of origination, this could change.

If an EMC, Oracle, Cisco, etc. were to acquire Sun, the customer base faces some more interesting, and upsetting, challenges. None of these companies has the systems-wide perspective of Sun, and while EMC might deliver some more interesting storage solutions as a result and Oracle might optimize further around its DBMS, the clear compelling value for existing customers seems somewhat elusive.

If a financial interest were to acquire Sun, then again it's a path to the chop shop. While taking over declining companies was fashionable a few years back, the thought of bought out or refinanced companies rising to their former greatness is undercut by the realities of Chrysler, SGI, etc. Thus, while a fast parting out of Sun's unique assets might be a quick financial fix, it would be a sad loss of an innovator and cohesive IT supplier with a vision for the future.

Under Death by a Thousand Cuts, Sun continues as an independent systems vendor, but one that is not on a recovery path. The prognosis is terminal, but the patient gets to live in its own home, and in a varying state of denial that the virile life of the past is destined to remain in the past. Sun's customers would not immediately be harmed or dislocated, but over time, the inevitable loss of key channel partners, employees, and customers would slowly bleed the company to the point where it could no longer effectively support and invest in itself. Then at some point, an attempted fire sale occurs and the company ends up in the chop shop or worse yet, simply fades away and is quietly subsumed a la SGI.

To our way of thinking, either of these scenarios is less desirable than Sun having become part of Big Blue. The impact on customers from IBM would be more gradual, the services organization would have little issue in maintaining and supplementing existing installations, and while SPARC would probably be put out to pasture at some point, POWER seems infinitely more palatable than Itanic as a platform alternative.

The failure to complete this merger illustrates an all too common, yet unfortunate, reality that plagues companies that still have their founders involved in some fashion. Reports have indicated that Scott McNealy led the faction opposing the IBM deal. This brings to mind another failed deal, namely Microsoft's bid for Yahoo!, which fell apart largely due to the actions and objections of founder Jerry Yang. Founders too often have an emotional attachment to "their baby" that can cloud judgment, especially when dealing in adverse economic environments. Did Yahoo! fare better by following Yang’s lead and refusing to sell to Microsoft? Obviously not. Will Sun? Probably not.

Despite all of this, I still have a lot of respect for Scott McNealy and Sun Microsystems. He and they played an important role in the valley during the last few decades, and we are all better off for it. I still think of McNealy as a very funny fellow, one whom I would have liked to see guest host Late Night or the Tonight Show, and a very knowledgeable force in the IT market. Nevertheless, the reality of 2009 seems to dictate that a midsized systems vendor would be relegated to RC Cola status in a world dominated by Coke and Pepsi. Eventually the big two will subsume most of the shelf space, with the "store brand" (read white box) filling out the rest.

While there may be a magic trick or two left in Sun's collective hat, it doesn't seem obvious to the outsider. Whatever deal might be forged by Sun with other potential suitors is going to be tarnished by the failure of this one. Hence, a maligned $9.40 deal might ultimately be replaced with a $7, $6, or even $5 deal. It's just hard to see a different scenario.

All in all, it's just too bad. In fact, it's a damn shame. Old cowboys don't die; they just ride off into the sunset. It's just too bad that it may be the Sun set that ultimately rides off.