December 06, 2007

Is Bot Defense the IDS of 2008?

I don’t think there is any question that bots and botnets are a dangerous threat. The combination of a worm delivery vehicle and a malware payload of varying capabilities is a potent one that attackers have morphed to suit their own purposes. Bot defense is proving to be a difficult task, even as traditional AV vendors and others purport to include bot defense among the various protections they offer.

There are also a couple of specialty vendors that focus on the threat and claim to be able to identify not just the threat, but the best way to defeat it in the future. If this all sounds strangely like the rhetoric surrounding Intrusion Detection Systems in the early days, that’s because it is. As you may recall, IDS vendors all touted their ability to identify attacks. The market bifurcated into network-based and host-based camps, and vendors pretty much settled on one side or the other.

Then one day, at a Gartner security conference of all places, an analyst (Richard Stiennon, now with Fortinet) coined the phrase “IDS is dead!” The market went into a tizzy, with much scurrying around by vendors to re-position themselves as Intrusion Prevention rather than Intrusion Detection. In retrospect, Stiennon merely stated the obvious: end-user organizations didn’t want a complete description of their problem, they wanted technology to make sure the problem didn’t occur in the first place.

So should it be with bots and botnets. The community wants and needs prevention more than it needs detection and identification. I offer this blog as a call for vendors to develop measures that do more than diagnose the threat: measures that provide detailed guidance to non-security professionals, such as those who work in the Network Operations Center (NOC), to help them thwart these efforts in a timely manner. Ideally, the products would also offer the capability to invoke the recommended remediation with a keystroke or two, in accordance with previously approved security and operations protocols and permissions.
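To make that concrete, here is a minimal, hypothetical sketch (in Python) of the kind of detection-to-remediation playbook I have in mind: each identified bot behavior carries a pre-approved action a NOC operator could invoke with a keystroke. The detection names, actions, and approval policy below are my own illustrative assumptions, not a description of any vendor’s product.

from dataclasses import dataclass

@dataclass
class Remediation:
    description: str
    pre_approved: bool  # cleared in advance under security/ops protocols

# Hypothetical playbook mapping detected bot behaviors to countermeasures.
PLAYBOOK = {
    "irc_c2_beacon":   Remediation("Block outbound TCP 6667 at the edge firewall", True),
    "spam_burst":      Remediation("Rate-limit SMTP from the offending subnet", True),
    "worm_scan_sweep": Remediation("Quarantine the host's VLAN pending an AV sweep", False),
}

def recommend(detection: str) -> str:
    """Return the guidance a NOC operator would see for a given detection."""
    r = PLAYBOOK.get(detection)
    if r is None:
        return "Unknown pattern: escalate to the security team"
    action = "press ENTER to apply" if r.pre_approved else "requires security sign-off"
    return f"{r.description} ({action})"

print(recommend("irc_c2_beacon"))

The point is less the code than the division of labor: security staff approve the playbook in advance, and the NOC executes it without having to interpret raw detection data under pressure.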

We know that the edge belongs to the attacker. Security professionals have to win all the time to keep their IT world safe; attackers only have to win a few times to accomplish their goals. Let’s hope that the botnet world becomes a proving ground for staying one step ahead of the enemy, rather than behind them.

October 09, 2007

Networking Re-Pondered at 37,000 feet

Once again I find myself the resident of seat 30C (power equipped) in a nearly full pressurized tin can, aka a Boeing 767-200, being flown courtesy of American Airlines. It is no surprise anymore that I spend more of my life than most people would consider tolerable at 37,000 feet, with 10% humidity and insufficient oxygen to support normal cognitive thought. So, while taking a break from doing “real work”, I plug in my trusty iPod and Bose noise-cancelling headphones and escape into the vast neverlands of my electronic palace (it is much larger than the 17.2 inch wide, 32 inch pitch seat AA has provided). Although I ponder the chord changes and melodic composition of a lot of Hard Bop and other Jazz while doing time on aircraft, this time I was pulled into something different.

I was thinking it was too bad I don’t have the ill-fated Boeing Connexion on board, as halfway through a flight I always discover the document, file, or website I need to access but don’t have at my disposal. Worse, when I do have what I need, I have managed to fill up my paltry disk drive with enough stuff that I am inviting the wrath of the fragmented paging file and inevitable system meltdown. It seems that I can no longer operate for very long without being connected, even in a faux fashion (with offline files, briefcases, and what not). This started my exhausted mind wandering through all the meanings and contexts that networking has come to define.

Today there is much discussion about social networking, as epitomized by FaceBook (hipsters) and LinkedIn (us stuffy professional types). This notion of networking is, of course, the modern equivalent of the Good Ol’ Boy network of those "in the know" and "should be known". Then there is networking as epitomized by Cisco, well OK, all of us, aka the Internet. The Internet is one of my fondest technological pursuits, and I remind myself that several of the services on it could prove real handy mid-air. If I were online, I could get to those missing files, grab them out of my backup Gmail box, or better yet get them from a Mozy backup (one of the neat tricks now up EMC's burgeoning sleeve), and I would be back on track. Of course, my dream (or worst nightmare) come true is not having to lug all of my context around with me (sorry, not a thin client pitch this time) but having access to my files from anywhere, anytime. Then again, maybe I am pitching a thin client thought, perhaps thin enough to fit into seat 30C.

Just having this mental exercise about connectivity shows how far along networking, and the expectations of it, has come. In my early commercial Internet days, 28.8k dial-up was fast, a 56k leased line was expensive, and being able to attach a file to an email was a snazzy affair. But this is modern history compared with my initial experiences of the ARPANet at 110 baud, or 300 on a good day. That network was a bit more stoic than the current one; however, there was an even more fascinating network underlying it that few thought of, namely the PSTN, or telephone network.

Being a closet Phone Phreak myself, I have to admit to spending many an idle hour pondering just how far one could make a connection, either through the PSTN or in conjunction with the ARPANet. The great revelation that I could type a character and within only 1 or 2 seconds have it echo back in full duplex from the UK, and thus carry on a conversation in the stone age equivalent of IM for the price of a local phone call, was fascinating. I knew at that time (it was 1977) that a world of communications interconnecting computers (OK, I was thinking terminals and teletypes) was going to happen and become the norm, if for no other reason than to avoid Ma Bell’s long distance tariffs. Yes, some of us had cracked the Sprint and Ralston Purina private long distance networks, but that kind of five finger calling did not hold the same commercial appeal as an alternative computing message network accessed for the price of a local call.

With network speeds slow enough you could almost watch the electrons move, these networks would obviously have to change to support what we now take for granted, but the mystery, and mechanical actions of connectivity were fascinating. The assumptions that we today make about connectivity, at work, home, or mobile seem so normal, and yet were beyond science fiction not all that long ago. Egad, I was beginning to be humbled by my lack of connectivity in good ol' 30C.

At this point, I took refuge in my disconnected state and turned back to the iPod. Amongst the many CDs and podcasts was a directory of special recordings, known as Phone Trips. I dialed in (pun intended) and started to listen to Evan Doorbell narrate some of his trips through 1XB, 5XB, and 1ESS central offices, stacking tandems, and reliving the general exploration of the greatest network of the 1970s, AT&T’s Long Lines. Odd as it may seem (yes, I admitted I was a phreak), my frustration about not having perpetual connectivity began to wane, and I found myself reconnecting with the sense of adventure that networks once held for me. This was perhaps the best-connected network feeling I have had in some time -- and I was completely offline.

With arguably good reason, pondering my past is an affair best left to the experts. Nevertheless, it was good to remember some of the special, almost bizarre excitement that networks once held, and to reconnect with the potential that networks have for all of us, even if that excitement has “matured” into expectation and a sense of taking it all for granted. With this in mind, the Internet, Web 2.0, VoIP, and zillion-bits-per-second LANs and WANs are all part of one of the most significant behavioral modifications of recent times. Whenever we are quick to minimize the value prop of the latest and greatest Internet service, we need to remember the quaint beginnings of networks, and look at just how far they have come, and how fast.

Nevertheless, for the remainder of this flight, I will delve into one of my favorite, organic, purely analog networks, the dynamic interconnected nodes of the Horace Silver Quintet. The workload varies, has high customer facing value, and makes immense use of discrete neurological networks, both local and remote, and you never quite know what to expect. Grid processing at its best. It's time for this camper to nod off...

September 25, 2007

The empowerment of Power

I am in Austin, Texas attending the Power Architecture Developer Conference. While at first blush we all might wonder why an ex-technical guy like me is at a developer’s conference, the reasons are in fact of a marketing nature. Although the audience at the conference is obviously a technical one, there is much here to illustrate just what an interesting beast power.org has become. From a sheer marketing perspective, it is compelling to see the likes of IBM, Freescale, AMCC, Cadence, Synopsys, Wind River, and many others all gleefully talking about all different kinds of solutions in many seemingly unrelated markets. Well, unrelated except that the Power architecture is the underpinning for it all.

Not that everyone is talking about one processor, as there seem to be more processors being showcased than the number of fingers on both hands, but there is a singularity in discussion about the broad architecture, whether the solutions are small low-power embedded devices, personal computer chips, large server technology, or HPC-focused devices. It is interesting to be standing in a hallway with signs touting the location of the Cell Hack-a-thon, the POWER6 partition mobility session, the AMCC SATA RAID controller processor, and a tutorial on networking Sony PS3s. These, along with many other diverse offerings for the automotive, control system, computing, networking, and switching industries, to mention but a few, illustrate just how pervasive and important the Power architecture has become.

In a time where there is so much discussion of open standards, and the value of ecosystems and multi-vendor cooperation, it is amazing at times how relatively few recognize the role that Power plays to this end. Mention Power to most server folks and they will talk about it from the perspective of POWER5 or POWER6 and then tell you that it is only an IBM, or worse, proprietary solution. Funny that the same processor lives inside EMC storage systems, which are definitely not from IBM. Mention PowerPC and many will tell you that it was the Mac processor, but the Mac is now industry standard with Intel. Yet the many PowerPC-based processors from AMCC for networking and storage solutions were never part of any Apple solution, and these products remain very much in demand. These are just a couple of examples.

The Power architecture holds a unique, if not ubiquitous, position in the marketplace. While the number of attendees (a few hundred) at this developer’s conference would pale in comparison with a big industry trade show, the numbers are impressive when one considers that they are all here from many divergent industries and all are seeking to learn how to gain greater leverage from their investment in the Power architecture. It is hard to think of another platform that garners the interest and support of such a diverse audience. In this era of multi-core, multi-threaded computing, the Power architecture in many cases is the epitome of an industry standard with a thriving ecosystem in which no single vendor dominates all industries.

Given the consolidation in the IT industry of the past few years, it is reassuring that some sectors remain vibrant and competitive, and that a single architectural platform is underpinning so much of the market growth. Despite the continued efforts of some to equate all things Intel as Industry Standard, outside of x86 (which of course, is huge), this assertion seems hard to accept. On the contrary, when considering the wide and far-flung impact of the Power architecture, would not this more closely align with the notion of industry standard, or better yet industries standard?

The Power Architecture Developer Conference is a testimony to the importance of this architecture, even though it will probably only garner secondary news status in the grand scheme of the moment. The fact that its ecosystem is sufficiently rich to support a developer’s conference that spans from the smallest of embedded devices to high performance computing is the front-page news item, but perhaps one that will remain one of the best-kept secrets in the industry.

August 29, 2007

History Isn’t Always The Best Teacher

There’s an old adage that those who don’t know history are bound to repeat it. A couple of recent experiences lead me to adopt the more topical Richard Clarke view that “the future will not be like the past”. Let’s start with a relatively simple world – automobiles. Everyone loves a new car: the roar of the new engine, the new car smell, and the envy of your friends. Once upon a time I worked for a company called Wang Laboratories – and no, that’s not an acronym. Fresh from a stint at Chrysler Corporation, I was the lead marketing guy for the auto dealer vertical, which at the time was 40% of Wang’s domestic revenue and grew 300% during my three-year tenure. We sold programmable calculators (some of which are now on display at the Computer History Museum in Mountain View, CA) to car dealers to help them sell cars and include things like financing, insurance, undercoating, etc. Our independent software vendor (ISV) partners would add localized software to do calculations for the local city and state. The result was a display of numbers that could break costs down to pennies a day and print out every single form needed at the point of sale. Every time something new came along – whether air conditioning as an option, new and improved chemical coating, or an extended warranty – the dealers figured out a way to sell it along with the car.

Loyal readers will remember my blog on satellite radio for the car. Having taken a 2,000 mile (3,200 km) road trip, I’ve become quite accustomed to satellite radio. Last night HRH the QM and I went shopping for a new Lexus. Imagine my amazement to find that they don’t have Sirius or XM, nor is it available! I was totally shocked; even HRH’s 2007 Chrysler came with satellite radio. Notwithstanding the salesman trying to convince me that satellite radio was not worthwhile, my feeling is that Toyota couldn’t cut a satisfactory deal. What’s the moral of the story? Economics trumps technology, or lack of customer demand can put technology purchases on the back burner.

Another historical blip, in my opinion, is Vista. We all remember the painful migration from DOS to Windows this and Windows that, ultimately culminating in Windows XP. By and large XP works just fine. The hullabaloo and hype over Vista towards the end of 2006 and the rather ho-hum launch in 2007 ushered in the Vista era. Knowing the ‘issues’ we dealt with in other migrations, when it came time for me to get a new laptop and HRH to upgrade her desktop, we drank the Kool-Aid and went for the Vista machines, hoping to be ahead of the power curve for once.

Imagine my surprise at the number of software vendors who still don’t have their Vista acts together (such as Symantec for example) and the sizeable number of other software vendors who simply don’t care. Combine that with a dearth of support people who are knowledgeable in the new OS and you have history not repeating itself at all.

I guess this all means that the technology future will not be like the technology past. Consumers, employees and IT users of all stripes have to learn to take the good and leave the bad. While I’m not a prospect for an iPhone, I may very well be a prospect for a Mac for my next computer – we’ll see. In the meanwhile to my US readers – a Happy and Safe Labor Day.

August 27, 2007

HP, MIT, and DSpace Foundation

HP and the MIT Libraries recently announced the DSpace Foundation, a non-profit organization that will provide support to institutions that use DSpace, an open source software solution for accessing, managing, and preserving scholarly works in a digital archive. There are more than 200 DSpace projects worldwide that are digitally capturing, preserving and sharing artifacts, documents, collections and research data. Some notable new projects include the 2008 Virtual Olympic Museum, which will archive the 2008 China Summer Olympics; the Texas Digital Library, which will provide a digital infrastructure for open access journals, electronic theses and dissertations, faculty datasets, departmental databases, digital archives, course management and learning materials, digital media and special collections from Texas A&M University, Texas Tech University, The University of Houston and The University of Texas; as well as the China Digital Museum, which will include 18 campus museums, each with 20,000 - 50,000 objects covering geosciences, biology, anthropology, science and technology.
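For those wondering what “accessing” such an archive looks like in practice, DSpace repositories expose their item metadata through the standard OAI-PMH harvesting protocol. The minimal Python sketch below, which assumes a purely hypothetical repository URL, fetches one page of records and prints their Dublin Core titles.

import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://dspace.example.edu/oai/request"  # hypothetical endpoint
OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def list_titles(base_url=BASE_URL, prefix="oai_dc"):
    """Harvest one page of records and return their Dublin Core titles."""
    url = f"{base_url}?verb=ListRecords&metadataPrefix={prefix}"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    titles = []
    for record in tree.iter(f"{OAI_NS}record"):
        title = record.find(f".//{DC_NS}title")
        if title is not None and title.text:
            titles.append(title.text)
    return titles

if __name__ == "__main__":
    for t in list_titles():
        print(t)

Because the protocol is the same across institutions, a harvester written against one repository can, at least in principle, be pointed at any of the 200-plus projects mentioned above.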

When one considers the potential for DSpace just to catalog scholastic and public museum undertakings, the sheer magnitude can be overwhelming. Toss in other privately controlled content, and suddenly the few million entries in Wikipedia seem to pale by comparison. However, if there were ever an application that could showcase the reach and depth of the Internet, this would certainly be one, and a very fitting one given the humble research and scholastic origins of the Internet and its predecessors. Nevertheless, the likely number of items to be placed into DSpace repositories, especially in developing regions such as China, will be enormous.

Although I would suspect that cultural artifacts and items that are in the public domain would remain easily accessible through the various independent repositories, it does raise the issue of how far the various organizations would go in depositing and making available the scholastic research that might have commercial or competitive value. Of course, this is no different than current restrictions on such material but a DSpace, much like early Internet endeavors, could create an environment where the law of unintended consequences rears its ugly head.

Which scholastic endeavors would be shared, and under what conditions? How would institutions of higher learning, which often court the financial assistance of commercial entities and non-profits, change their behavior in a DSpace-enabled universe? Could DSpace inadvertently cause some types of information to be withheld? Information that might be freely shared when the burden is on the user to find and assess it could be held back precisely because of the greater ease of access that DSpace would afford.

In the case of digital media, such as images, animations, etc., the potential for a greatly enhanced repository of public domain or royalty-free content is enormous, especially if library developers take seriously the federated capabilities of DSpace. The indexing and archiving abilities of DSpace could translate into a very rich user experience, and assemble some truly breathtaking archives of humanity’s achievements on earth. The potential vastness of DSpace repositories could become a mind-numbing thought in and of itself.

When I think about the impact that the incredibly basic tools of email, FTP, Gopher, and the early Web had on research and development, the contrast with DSpace and the Internet technologies of today is striking. If DSpace has even a fraction of the impact on research that the early Internet tools did, we are in for a real intellectual treat. It will be interesting to see what these early DSpace initiatives morph into and how they will alter the expectations of the research and academic communities. It could be pretty darn cool.

August 09, 2007

FCC, 700 MHz, Wireless Carriers, and the Mobile Internet

As part of the impending 700 MHz spectrum auction, the Federal Communications Commission has circulated a proposal that includes a requirement that the new mobile Internet spectrum be accessible to new applications and devices in a fashion similar to the existing Internet today. The FCC Chairman has stated publicly that his goal is to allow any wireless device to download any mobile broadband application, with no restrictions, provided the software itself is not illegal and does not pose a threat to the network. At the same time, CTIA, a wireless industry association, is disputing any requirement that the new network carriers be required to resell bandwidth at wholesale rates to perceived competitors, specifically Google, that would seek to create a competitive offering without incurring the cost of deploying network infrastructure.


At some level this argument sounds reminiscent of a little over a decade ago when the various carriers were arguing with Enhanced Service Providers about who had to connect to whom and who had to sell what to whom and at what price in the context of the Internet, and later VoIP, and then DSL and cable, and so on. At one level it is a revisitation of the Bell Heads vs. the Net Heads drama vintage 1996 but the basis of the discussion remains strikingly similar, namely who has to pay for building the network, and what rights do others have to access this network at a “fair” price.


Rightfully, the carriers argue that not only do they have to pay to construct the network, they also have to pay Uncle Sam for the right to use the spectrum, and all of this costs a lot of money that will be recouped slowly over time. At the same time, Internet juggernauts such as Google are eyeing any network as a delivery platform for their content and services and want to ensure that they are not locked out of the game, as so many content providers find themselves in the locked-down environment of mobile telephones in the USA.


In a more cynical view, I think we can see a US vs. THEM mentality, in that much of the venture money chasing this opportunity is coming from Silicon Valley while the carriers are traditionally an East Coast affair. Yes, this may be simplistic, but it underscores just one of the many differences between the Bell Heads and Net Heads, two groups of players who have managed some sort of relative peace during the past decade as the Internet grew in importance to both communities and the population at large. However, in the case of mobile Internet access, things have been less sanguine as carriers encourage, or even mandate, that customers access their services, not those of competitors. While there may be perfectly good corporate reasons for doing this, it does hinder the customer’s choices in what services they can purchase and how they may wish to go about using them.


Net neutrality proponents argue that unfettered access to networks is the only way to guarantee that carriers won’t limit access or impose usurious tariffs to ward off the competition. Yet at the same time, it is unrealistic to believe that corporations would continue to invest in new technologies if they know that they must bear the cost of design and deployment while a competitor could use the network solely on a usage-based fee, without assuming any of the risk of the initial investment. While I could accept either of these arguments in a vacuum, the reality is that the marketplace will not benefit from either extreme, and this is what I consider to be the ultimate consideration.


In the “good old” days, spectrum was licensed with the pursuit of the public good as a significant factor. During the past couple of decades, public service has been displaced by a revenue generation scheme whereby auctioning spectrum to enrich the Federal coffers has become the goal. In some cases, for totally private use, this may make some sense similar to royalty payments made by lumber, mineral, oil, and many other interests when extracting wealth from publicly owned lands.


But in the case of broad-based communications, perhaps it would be better to take some of that auction money and use it to build out and maintain a minimum portion of the network that could be secured as access points for the competitors the carriers fear. To protect against those seeking massive access, there could be prescribed limits on how much they could purchase through this minimum network; beyond that, they would treat access as a commercial endeavor and, just like the carriers, pay the prevailing burdened cost of delivery. But for smaller entities, access would be protected. Perfect? No. Better than a monopoly or forced subsidization of the competition? I think so.

Still, the auction process is not complete, and the final rules have not been written. It will be interesting to see how it all plays out. I for one hope that we do not end up with another closed/proprietary network, but at the same time realize that without investment and the chance for a positive return on that investment, the network is unlikely to be built.

August 01, 2007

Satellite Communications – Not As Heavenly As You Would Think

In January 2007 my beloved wife, HRH the QM (Her Royal Highness the Queen Mother for you new readers) got a new car. A brand spanking new Chrysler Town & Country Van. One of its salient and most attractive features was the Sirius radio. While the thought of paying for what has always been free was not in my mind, HRH gladly signed up for a two year subscription. Not only has she been reliving her past via the 60s channel but is re-building her knowledge of Broadway hits and her passion for Tony Bennett and Frank Sinatra.

Whilst in her car (a rare occurrence since HRH would much rather criticize than drive) I am of course a prisoner of her music. As it turns out, over time I rather liked the idea of no commercials and being able to listen to a particular kind of music. With the offer of equipment and subscription as a Father’s Day gift I thought I would give it a try because silly me, I figured since this was a popular consumer item it must be pretty easy technology to deal with.

I should have harkened back to my tactical days in the Army, when a key part of any exercise was trying to establish communications. For those of you who don’t know what that means – here’s the picture. Let’s start with the fact that there were no cell phones. If you wanted a phone, you brought your own phones, wire and switchboards. Oh yes, batteries for the phones as well. Remarkably, these field phones, connected into an SB-22 Field Switchboard (see http://www.prc68.com/I/SB22.shtml for a picture), worked pretty well.

FM tactical radio – well, that was another matter. Not really reliable, and line-of-sight only, these things always seemed to be finicky. Never mind that the crypto equipment ran so hot that one day we had to pack it into a plastic bag and surround it with ice in order for it to work. Rounding out the communications menagerie was Radio Teletype. It was the SMS of its day, which was quite some time ago.

Flinging forward into 2007, I bought the appropriate gear for the house and car from Circuit City. The installation, while more or less painless, was not without its wrinkles. Turns out my Lexus 300 required a special type of wired installation because wireless wouldn’t work. All in all it’s working OK. Of course there are spots on the highway where the local PBS radio station ‘jams’ or overpowers my signal, but I live with that.

The inside rig is another matter. While in the back of my mind I knew I needed to have an antenna, it never occurred to me that I would actually have to go out on the roof to position it, and that it would be a two-person job. I had limited success putting it on the window sill, but it was too flaky to rely on. So now I’m waiting to fit into HRH’s schedule, planning on her being the inside sound tester. Fortunately the inside rig also serves as an iPod stand, so it has some utility notwithstanding the Sirius-inside debacle.

Bottom line – no communication technology is completely reliable and redundancy is the key to success or in this case – musical enjoyment.

July 11, 2007

HP and On Demand DVDs

Recently HP and Trans World Entertainment Corporation inked an agreement to offer a new service that would give consumers access to a diverse catalog of movie, TV and specialty video content not readily available in stores or online. The companies stated that to date, there have been nearly 1 million movies and TV episodes created worldwide, yet less than 10 percent have been captured on DVD. People can log on to Trans World Entertainment’s f.y.e.com online store and place orders, choosing from a variety of titles. The order is then fulfilled by HP’s custom manufacturing facility, where most purchases are mailed within 24 hours. All orders include a DVD in full color packaging that is green-certified as environmentally responsible packaging. The service is a component of HP Video Merchant Services, which enables retailers to deliver video content in a variety of ways, including digital downloads and traditional packaged DVDs, as well as new formats such as HD-DVD and Blu-ray.

OK, so you are thinking, aren’t the pundits saying that downloads will decimate all physical movie and music purchases and rentals real soon now? Well, this pundit never bought into that argument, and the numbers in the market show that downloads, while growing, have hardly displaced physical media as the dominant delivery mechanism for movies and music. However, downloads do address one of the gating factors for cost-effective sales of content, namely moving the cost of goods as close to zero as possible to enable a positive ROI on small volumes. For niche content, the overhead of producing a modest run of DVDs, say 2,000 copies, to sell perhaps a couple hundred makes it extremely difficult to break even, let alone earn a buck. If the fixed cost of a release could be reduced to creating a disc master, something that a mainstream PC with a few hundred dollars of software can easily do, and developing simple but very acceptable packaging printed on demand, then the economics of content distribution change dramatically.
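A bit of back-of-the-envelope arithmetic shows just how far the break-even point moves. The cost figures in the Python sketch below are my own rough assumptions for illustration, not HP’s or Trans World Entertainment’s actual numbers.

def breakeven_units(fixed_cost, unit_cost, price):
    """Copies that must sell before a release turns a profit."""
    return fixed_cost / (price - unit_cost)

# Traditional pressed run: mastering plus a 2,000-copy replication order up front.
pressed_fixed = 2000 + 2000 * 1.50   # all copies paid for before the first sale
pressed_unit = 0.0
# On-demand: author a master once, then burn, print and package per order.
on_demand_fixed = 500                # PC, burning software, packaging templates
on_demand_unit = 4.00                # disc, printing, fulfillment per copy

price = 19.95
print(round(breakeven_units(pressed_fixed, pressed_unit, price)))      # ~251 copies
print(round(breakeven_units(on_demand_fixed, on_demand_unit, price)))  # ~31 copies

With assumptions like these, a title that will only ever sell a couple hundred copies is a money-loser as a pressed run but comfortably profitable on demand, which is the whole point.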

There is a potential goldmine in old television, movies, newsreels, animated shorts, commercial short subjects, and so forth that have commercial value to a select group in the marketplace. In many cases these enthusiasts, researchers, geeks, or just plain regular folks with irregular interests would happily pay the market rate for a volume release, or perhaps even more for content that is of special interest to them. Yes, some might be willing to download it, but downloads do not offer a permanent archive for the content, unless the purchaser wants to burn a DVD, print up a sleeve or box, and then assemble the product for long-term storage and use. This is where HP’s solution is so cool: it does the “hard” stuff, but at a level of economic investment on which a niche content provider could reasonably expect a return.

While some of the early success for Trans World Entertainment will probably be on older volume content such as TV shows, sports events, foreign movies, educational programming, and pieces of historic interest, in the long term we believe the value in this HP solution will be much higher for independent commercial producers of content, community groups/organizations, non-profits, etc. who may have the aptitude to produce content but lack the infrastructure or wherewithal to effect distribution and sales. We could easily envision this service at the local photo/video shop, similar to the kiosks that print digital snapshots. Creative components such as templates for thematic packaging, or even custom content, could be added on to create a higher value service for the vendor. If placed correctly, the potential for HP as a back-end content fulfillment provider could be enormous.

A few years ago it was common to hear HP’s competitors slamming the company as rapidly becoming little more than a printer company. For those who truly believed this, history has shown them to be dead wrong. HP is not just a printer company; it is very much a printing, imaging, and content enablement and distribution company. Through its Video Merchant Services, the company has lowered the barrier to entry and fundamentally changed the playing field for niche video content providers. While success is not automatic, this approach could well challenge those who postulate that the future of media distribution will be predominantly based upon downloads.

June 29, 2007

Hip Pocket E-Mail – Epilog

It’s been a couple of weeks since I got my spiffy new Cingular 8525 and I thought I would follow-up on my last posting with some updated reporting.

The Good

Overall I’ve been pretty pleased. The ability to check e-mail without powering up the computer is a good thing. I’ve been able to verify that there are no crises brewing and that nothing requires my attention – all without leaving the comfort of the leather easy chair that the dog and I seem to share. This has saved me quite a bit of time and no doubt some electricity as well.

I also had the opportunity to go off site – and I mean off site. I attended weekend Red Cross Training in Public Affairs at a private High School which offered NO Internet access at all. My little 8525 kept me plugged in pretty effortlessly.

Not sure how the camera capability will work out, but we visited the Monterey Bay Aquarium the other weekend and I forgot my camera. Hopefully the shots of the otters will come out OK and I’ll figure out how to get them out of the camera and on to the computer.

The Internet Access is a good thing. I was able to find some shopping and food info while on the road and found that a tremendous convenience.

Since HRH the QM (Her Royal Highness, The Queen Mother for new readers) and I have the same device some of my tech support responsibility burdens have eased – I was able to reconfigure her AOL pretty quickly after her phone was repaired.

The Bad

There is no way I can use the phone while driving. While that’s good for safety, it’s bad for productivity and convenience. I mean the keyboard is so unfriendly that I can’t even dial the thing unless I’m stopped and out of direct sunlight.

I lost my stylus this morning. The phone shouldn’t really require one – ask the Apple people, they know. Luckily I’ve stopped biting my nails since leaving my former employer (imagine that?) so I’m able to pretty much navigate without one, and since I usually have a pen as well, I hope to be OK. I found an online site selling a 3-pack of styluses and ordered one today. This way I’ll have an extra when HRH the QM loses her first one.

The Ugly

The documentation is, to be kind, awful. I’ve tried to program Voice Tags; you’d think I was trying to alter the ratio of rods in a nuclear reactor. While I could program AOL, I needed professional help (a local VAR) to program e-mail from one of my other sources, and I needed the same guy’s help to figure out how to make my 8525 act as a modem for my Vista laptop. I’m hoping to solve the ‘modem’ issue later today.

Overall I’m a reasonably happy customer, but as with any new technology – time will tell the tale in the long haul.

Happy 4th to my US friends and colleagues!

June 14, 2007

Hip Pocket E-mail – Blessing or Curse?

I’ve resisted getting a PDA capable of receiving e-mail and doing all the other whiz-bang things we associate with being ‘connected’ at all hours of the day and night. I’ve been on countless airplanes watching people embrace their PDAs with a mix of disdain and envy. My philosophy has been: if it’s really that important, you call me on the phone like a person. For the most part this has worked out well. Of course, it was not career enhancing in the passive-aggressive, e-mail-uber-alles environment that prevailed at my former employer, where conflict was avoided at all costs and anyone who had a different opinion was treated like the psycho of the week on Criminal Minds.

Nevertheless, my wife and I decided to buy ourselves an ultra romantic anniversary present – his and hers new super phones on Saturday, 9 June. Over time I’ve found that my personal level of aggravation goes down if the Queen Mother (QM) has the same hardware and software that I do. We had been on Cingular, moved to AT&T – got converted to Cingular and back to AT&T. We thought we would start at the local AT&T Store. The sales person was competent and personable. We decided that the new 8525 would be best for us because it had larger keys and a bigger screen than the Palm devices and others.

We went up the block to the Sprint store. The experience was quite different: the sales person who worked with us did not seem to be nearly as knowledgeable or as personable, and the product line, although cheaper, was quite limited.

We trudged back to AT&T and were delighted when the sales person was able to move our phone numbers from the old phones to the new ones. We purchased extra phone chargers for each car and state-of-the-art Jawbone wireless headsets, and $1,300 later we were out the door, tethered to a 2-year contract. Alas, they had run out of cases (this will be important a bit later on) and we were advised that the cases would be in with a ‘big shipment’ next week. As it turned out, the cases are really Palm cases.

Needless to say we felt very 21st century. Over Sunday I learned about the new device (the manual was sparse to say the least) and with a bit of effort was able to program AOL e-mail to reach the device. Strangely enough, when I programmed my wife’s device, there was a different set of instructions on the AT&T website. All was ultimately working fine. I decided to wait until later in the week to deal with transferring appointments and contacts and installing the ‘trial version’ of Outlook 2007. You would think for the kind of money we paid, the appointment calendar software would have been included – but even Microsoft has to make a buck, I guess.

The next major event was the QM having a bit of a trip and cracking her LCD. We found out, a bit to our chagrin, that AT&T doesn’t take responsibility for what it sells. Rather, it routes you to a local repair shop. That shop ordered the parts, which would take about a week to arrive. The bad news was that the cost to repair was almost the same amount we paid for the device with the 2-year contract; the good news is they gave her a loaner phone.

Wednesday I was at an all-day off-site conference. I thought this would be a great test of being able to get my e-mail and stay in touch. For reasons that remain unclear, not only was I not able to get my e-mail, but all my settings for my AOL account (which I keep because the wife has one and I’m customer support – remember) were gone. I was not a happy camper. Also, I started to get gibberish text messages at the rate of a couple an hour – they looked like the inside of a PGP key.

Today was a bit more sane. I delivered a presentation at a local Bar Association and was able to check my e-mail without incident. Surprise avoidance is a good thing. However, there is no way I can use that device and drive at the same time. I need reading glasses to see the device and I have not progressed to the point where I’ve voice coded my more common phone numbers.

A minor annoyance is the fact that the device has valuable screen real estate devoted to a MSN connection. I don’t have one and don’t want one, nor do I want to waste valuable screen space on it, but I have no idea how to remove it or if it can be removed at all.

So what’s my overall take: it was time to make the move, and no new technology is acquired without pain. It is somewhat frustrating when a large company like AT&T is not a single point of contact. It would be like your local auto dealer saying thanks for buying that new car, now take it somewhere else to get it fixed. Interestingly enough, the repair shop told me that AT&T doesn’t even offer insurance on the device because they are so expensive to repair. I wonder if that would have been the case had I bought the same thing at Best Buy.

Time will tell if our new hip pocket e-mail will be a blessing or a curse. So far, it’s been a bit of both.

May 31, 2007

SOA and its IMPACT 2007

Before the Memorial Day holiday I was fortunate enough to attend the IMPACT 2007 user conference in Orlando, Florida. There were about 4,000 attendees from a variety of vendors, end user organizations, analyst firms, and the media, all discussing with great intensity the state of Service Oriented Architecture and its impact on businesses. Overall, I am pleased to say that the high level theme of the event was not about technology but rather the business value of technology and how SOA has transformed the way many organizations view IT and align it with their business processes.

On day one, there was a bit of levity given that Don McMillan, the engineer comic, was the master of ceremonies for the various keynote speeches given by a variety of IBM executives and partners. However, all of the speeches illustrated the degree of acceptance that SOA has achieved in many organizations in a very short period of time. This seems to be in contrast to the state of SOA just a few years ago.

Around 1999 or 2000 we started hearing much about web services and how they would become the new way of computing in the enterprise. Not too surprisingly, at the time the focus was all about the technological implementation. Yet at the same time, some of us pundit types even came up with names like Service Computing (from Zona Research lore) to describe the shift web services implied, whereby business process would start to drive IT rather than the historic opposite. SOA has supplanted web services in most respects as a much broader and more strategic blueprint by which to deliver IT services within the enterprise, and happily to my way of thinking, it appears to have caught the attention of many others as well.

While there were many sessions about implementing SOA in software design, it was quite notable that there was a Business Leader track that targeted C-level and other executives as well as line-of-business professionals. This non-technical track was focused on the business agility and competitive advantage that SOA could offer an organization. I can think of no better illustration that SOA has reached a respectable degree of maturity than the existence of this track.

It is encouraging to see that business process was such a dominant theme. While some may take this to mean BPM, I see it in a much larger context. After all, the reason organizations purchase IT products and services is to support their business. Although the late 1980s and 1990s often had many businesses wondering aloud if they had in fact started a second business, i.e. the IT datacenter business, the reality is there would be no IT market without demonstrable business results from the use of IT. This clarity of mission is made much stronger by SOA, and I am pleased to say that, based upon the crowds at the IMPACT 2007 event, this message seems to be well received and resonating.

May 18, 2007

Crisis Action Planning, Unlike Chicken Soup – Does Not Get Better With Age

Bad things happen to good people, and unless they are prepared to deal with them, bad things turn into disasters or worse. Like most aspects of running an organization, disaster planning is a mesh of people, process and technology. Most disruptions to business operations are unplanned; consequently, knowing what to do instinctively before something bad happens can mean the difference between success and failure, and sometimes even life and death.

This week I had the opportunity to be an observer as a client went through a ‘table top’ Crisis Management Plan exercise. Key representatives came from the Executive team, finance, corporate treasury, legal, corporate communications and HR. They were run through an expanding scenario that required them to state their priorities and indicate what they would need by way of information from the various teams in the room. Issues as to which organization would be the lead for various aspects of the “crisis” were also hashed out.

As the exercise unfolded it was clear that Corporate Security and HR had worked on many of these issues before, and that there was a general spirit of teamwork and cooperation. It wasn’t until after the exercise was over that I learned that IT wasn’t involved and that the Information Security functions were spread out over several “Managers”. There was good news and bad news here. The good news was that the overall team functioned well and could work on the few areas where they needed improvement. The bad news was that the focus had shifted so far from technology that a second level exercise, one with real players and data, would very likely not be so smooth.

Disaster preparedness for organizations takes many forms. A good place to start is identifying the critical people and processes that need to continue to function regardless of interruptions. Then determine the tools they will need under a variety of circumstances to execute those functions and develop the plans and logistics needed to achieve these ends.

A couple of key things that often get missed are 7x24 crisis management and engagement of law enforcement. In the case of 7x24 operations, it is important to realize that a special team needs to be identified and that team removed from its day-to-day duties to focus on crisis management and actions.

The issue of engaging law enforcement is a bit more complex. Organizations recognize that they may need to involve law enforcement quickly in certain cases such as workplace violence; however, in cases of theft of intellectual property or improper employee behavior such as ‘legal’ pornography, industry generally is in no rush to engage law enforcement. In any event, organizations need to determine their philosophy ahead of time. They need to identify: incidents that will immediately involve law enforcement; which law enforcement agency should be notified and under what circumstances; individuals who are the principal points of contact; etc. These decisions need to be made prior to the stress of incidents.

It should also be borne in mind that organizations do not exist in a vacuum. Natural disasters and selected manmade ones will likely involve the geographic area surrounding the organization and affect employee welfare and freedom of movement. It is prudent to work with local government and key non-governmental organizations (NGOs) such as the Red Cross to understand the total setting. Communal planning for disasters is a continuous process for many organizations – it should be for yours as well.

May 01, 2007

Of Biometrics and Privacy

At each RSA show, I’ve noticed that the emphasis on biometric security was growing, and that the vendors of this type of security were in deadly earnest about the usefulness and reliability of their products. They were right. A consumer has happened along that values privacy, and the security of that privacy, to an extent that puts Army Intelligence, the CIA, and the NSA combined to shame. This particular consumer would endure weeks of torture rather than reveal secrets. If the Mossad were to emulate this consumer, the security of Israel would be absolute.

I’m speaking, of course, about the recent adaptation of voice recognition biometric security devices being installed on preteen girls’ diaries. Really - I’ve seen commercials for them. If voice recognition can pass the rigorous demands and fanatical testing that is no doubt being conducted by this new class of consumers, then this is a security technology that should be incorporated in the highest levels of the Pentagon. Any person who has met a preteen girl knows that I’m not being facetious, here. The person who marketed voice recognition to this segment of the population is brilliant. True, maybe the fate of the world doesn’t depend on the diarist’s little brother not knowing about her crush on Bobby in Homeroom, but perhaps the fate of the little brother depends on his inability to read her diary. And like little brothers everywhere, he is going to be using everything short of a nuclear warhead to try and open that diary.

There is real world testing being conducted at this moment. If this technology passes the test – we’ll wait and see what the girls have to say about it – then with a little tweaking, it should be able to withstand more serious assaults. Visions of Tom Cruise being lowered by a wire into a frilly pink bedroom aside, voice recognition technology is most likely going to be taken much more seriously.

Bobby from Homeroom will be relieved.

April 27, 2007

Sun Labs: Where Engineers Engage Their Passion And Have Fun

Yesterday I had the good fortune of attending the annual open house at Sun Labs in Menlo Park, California. If I had known that engineers could have that much fun, or that network security/encryption experts could have the bandwidth of knowledge and sense of humor of Radia Perlman, I might have stuck with engineering and not gone to law school.

The day was an intellectual holiday, free of the usual trappings and marketing hoopla surrounding vendor-sponsored events. I learned about Project Live* (pronounced Live Star), a new virtualization technology; Project Sun SPOT, an experimental platform for developing wireless sensor, robotics and swarm intelligence applications in Java; new authentication schemas for web security; and the Ephemerizer project, which is designed to provide assured delete by employing public key cryptography and the planned destruction of keys as a means to delete access.
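The assured-delete idea behind Ephemerizer is easier to grasp with a toy example: store the data encrypted under a key that is held separately, and “delete” by destroying the key, so every lingering copy of the ciphertext becomes unreadable at once. The Python sketch below uses simple symmetric keys for brevity, whereas the actual project employs public key cryptography and a separate key-holding service, so treat it strictly as an illustration of the concept.

from cryptography.fernet import Fernet

class EphemeralStore:
    def __init__(self):
        self._keys = {}   # doc_id -> key (the only copy anywhere)
        self._blobs = {}  # doc_id -> ciphertext (may be replicated or backed up freely)

    def put(self, doc_id, plaintext: bytes):
        key = Fernet.generate_key()
        self._keys[doc_id] = key
        self._blobs[doc_id] = Fernet(key).encrypt(plaintext)

    def get(self, doc_id) -> bytes:
        return Fernet(self._keys[doc_id]).decrypt(self._blobs[doc_id])

    def assured_delete(self, doc_id):
        # Destroying the key is the delete; the ciphertext can stay behind harmlessly.
        del self._keys[doc_id]

store = EphemeralStore()
store.put("memo-1", b"draft litigation strategy")
print(store.get("memo-1"))
store.assured_delete("memo-1")
# store.get("memo-1") now fails: without the key, the blob is unrecoverable.

That property is appealing from both the technical and the legal side of document control: the retention decision is enforced by mathematics rather than by hoping every stray backup copy gets found.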

The environment was purely intellectual, and the only sales type I ran into was from one of the major research houses. Engineers were indeed exercising their passion and zeal as much as any attorney representing a client. They were hoping to enlist disciples who would help to move their various projects from the laboratory to reality. In many respects it was refreshing to engage in discussion about so many concepts in their product infancy. It was especially intriguing for me to address the issue of document control simultaneously from the technical and legal perspectives.

I bought a copy of Whit Diffie’s latest book “Privacy on the Line”, which he was kind enough to autograph with a personal note. Whit and I go back a long way – in fact to my first published market forecast at my very first research job. The day was not all joy, however, because Whit told me that one of our mutual friends had recently passed away. Paul Heckle was a good and gentle man who was always concerned with the future of software and ease of design. A rare bright light in the technical landscape.

I’m grateful to Sun Labs for giving me a day away to see the future, but remember the past.

April 25, 2007

More Thoughts on Energy and Efficiency

The other day I was again pondering energy efficiency in the data center, as I am prone to do with my own idle brain/CPU cycles. Virtualization remains a hot topic as many organizations seek to shift their physical server workloads onto virtualized servers supported by a smaller number of servers/blade solutions. One part of the rationale is to reduce ongoing operational costs and capital expenditure, but another common reason now is energy savings. Although perhaps a later entrant into this efficiency discussion, storage solutions are getting with the energy efficiency program as organizations begin to consider the ramifications of having “all those disks” spinning around all the time. The ever-declining price points of SATA and other lower-cost disk technologies have further highlighted the operational vs. procurement costs of storage, especially in secondary and tertiary tier solutions. However, for the most part, storage still tends to be over-deployed, underutilized, and full of duplication, as managing it proves more difficult in some cases than simply throwing more empty disks at the problem.

One company that has been taking a new approach to reducing storage over-deployment is 3PAR. The company describes itself as a utility storage provider and uses an approach it has dubbed “Thin Provisioning”, with the goal of delivering more usable capacity with less physical storage. According to a recent press release, 3PAR Thin Provisioning customers collectively have a worldwide energy savings of approximately $6.6 million annually. This savings eliminates 48,000 metric tons of CO2, the carbon emissions equivalent of roughly 9,000 cars. Hewlett-Packard of course has been on a tear regarding energy efficiency in the datacenter, and EMC has elevated the discussion to the power consumption of its large storage arrays. Sun Microsystems has a greener side to its servers, and VMware has participated with PG&E, the California electric utility, to offer rebates on virtualized servers that displace older physical servers. EMC, through its acquisition of Avamar, has taken a big step forward in eliminating duplication in the storage network and reducing the amount of bandwidth taken up by shuffling around duplicate copies of files and backing them up multiple times. It would be cool to figure out what the equivalent reduction in automobile emissions of CO2 would be for reduced delivery of files across the network.
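For readers unfamiliar with the term, thin provisioning simply means a volume advertises more capacity than it physically consumes, with physical space allocated only when blocks are actually written. The Python toy below illustrates the idea; the block size and dictionary-backed storage are assumptions for the sake of the sketch, not a description of how 3PAR’s arrays actually work.

class ThinVolume:
    BLOCK = 1 << 20  # 1 MiB allocation granularity, assumed for the sketch

    def __init__(self, virtual_gb):
        self.virtual_bytes = virtual_gb * (1 << 30)  # capacity the host is told about
        self.blocks = {}  # block index -> data; absent means never written

    def write(self, offset, data: bytes):
        idx = offset // self.BLOCK
        self.blocks[idx] = data  # physical space is consumed only on first write

    def physical_bytes(self):
        return len(self.blocks) * self.BLOCK

vol = ThinVolume(virtual_gb=500)                 # host sees 500 GB
vol.write(0, b"boot sector")
vol.write(10 * ThinVolume.BLOCK, b"database page")
print(vol.virtual_bytes, vol.physical_bytes())   # 500 GB advertised, 2 MiB consumed

The energy angle follows directly: disks that never have to be installed for capacity that is promised but not yet used are disks that never draw power or spin off heat.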

Nevertheless, it is clear that the issue of power consumption will remain at the forefront of IT budgets, especially as the price of oil creeps back toward $70.00 a barrel with correspondingly high prices for electricity in many jurisdictions. Although for many this means a budget-driven desire to reduce power consumption, the reality is that competitive advantage can be achieved by reducing power consumption and datacenter footprint regardless of the price of energy. Free market energy prices are dynamic, and in some cases government subsidies distort prices; however, the price is typically constant amongst competitors in a given region. Company A can’t buy electricity for less than Company B; therefore, regardless of the price paid, if Company A reduces its overall consumption, it is in a better competitive position than Company B. Of course, a corollary reduction in carbon emissions is also achieved by a reduction in fuel consumption, and in some markets the carbon footprint and proposed taxes/mitigation fees may ultimately prove to be a higher economic penalty than the cost of the fuels being consumed. But at the end of the day, consuming less energy has a growing number of factors in its favor.

A couple of years ago I had the opportunity to tour Fujitsu Siemens Computers’ manufacturing plant in Augsburg, Germany. It was somewhat surprising to see actual high tech manufacturing still taking place in a western European country. However, what was more enlightening was the comment from one of the tour guides that the price advantage China has in electronics manufacturing is predicated on the price of oil remaining below $90.00 a barrel. If the price of energy were to exceed this level, then the labor costs and other resource-driven issues no longer work to China’s advantage, as the cost of shipping trumps these other advantages. Thus, it follows that $100-per-barrel oil could make IT manufacturing competitive again in regions such as North America or Western Europe despite more stringent regulatory environments and higher costs of doing business. This illustrates how the price of energy may be a short-term advantage, but reduced consumption of energy can translate into a long-term competitive advantage.

So while many will continue to be fixated on the dollar cost of energy consumption in the datacenter, I think the ultimate winners will be the ones who can orient themselves to use less of any resource, be it energy, server CPU time, storage capacity, or networking bandwidth, amongst others. All of these resources have costs in deployment, operations, and retirement. One of the current children’s songs by Jack Johnson, The 3 R’s, includes the phrase “Reduce, Reuse, and Recycle.” We can all learn a lot from this approach. As consumers of IT, we have become spoiled with the expectation of improved performance and decreased acquisition costs for equipment, but have also become wasteful in how we deploy IT.

Historically, many cultures have supported being good stewards of the resources they were blessed with, whether crops, land, livestock or money. The notion of living well, but living efficiently and with a small footprint, was the way of life. Although many consider the 21st century to be far more advanced and civilized than past cultures, the reality is that life on earth has not changed all that much. Human beings still face the challenge of allocating resources efficiently and effectively. So while we may look to IT to be the silver bullet that fixes many of the business challenges organizations face, the reality is that IT is just another collection of tools and resources in the corporate competitive tool chest. By being as efficient as possible in the deployment and operation of these tools, organizations stand the best chance of becoming the long-term survivors, and hence the winners, in the marketplace.

April 19, 2007

Mainframe Security Still High Card In Security Deck

I had the good fortune to attend a briefing from IBM on their latest security announcements for their Z Systems. It's been a really long time since my last encounter with mainframes. In fact it dates back to my first iteration as an analyst way back in the 1980s, the days of 35 mm projectors and 5 ¼" floppy disks. My first encounter goes back even further than that.

In those days there weren't very many alternatives to mainframes, and many of us cut our teeth on programming by punching holes in cards and eagerly waiting for reams of printouts on very large sheets of bar-laden paper. The first computer I ran was an IBM 1130 for Chrysler. About the size of an attorney's desk, it had three huge plastic discs, a card reader, and a very impressive, loud, and dust-generating 1403 printer. It could do only one thing at a time, with a Star Trek-like cluster of green lights flashing away and one great big green light which, as I recall, said simply 'run'. I remember sitting at the console doing some of my MBA homework when the Dodge Division Regional Manager came in – they were Gods in the Regions and this one in particular terrified many of my colleagues. "Larry, why aren't you working? Why are you just sitting there reading a book?" he bellowed at me. "Mr. Robbins," I replied, "as long as this little green light is on, this little machine is working its little heart out and so am I."

The mainframe was the workhorse for Chrysler and its safety net. When I accidentally erased all of the vehicle sales for a particular month by punching the wrong number in the month column, the IT guys and gals in Detroit came to my rescue and retransmitted the files. Ah yes, I’ve always been quite fond of mainframes.

In some ways it was gratifying to know that they are still cranking away and that IBM has evolved the mainframe architecture to be as relevant today as back then. The briefing was an eye opener for me. I was especially intrigued by the notion of a specialty engine and how this architecture contributed to optimizing the encryption process. End-to-end encryption in the IPSec world is important and may, over time, evolve to be the standard for sensitive data such as personally identifiable data. Consequently, efficient encryption, without sacrificing bandwidth or throughput, is essential.

I was also quite impressed by the attention given to Linux. I had always assumed that Linux was a creature of servers and PCs that had outlived their usefulness a generation or two ago. Ever mindful of new marketing approaches, IBM touts that mainframes are really green frames because according to them “the IBM System z9 Enterprise Class (EC) may provide up to 4 times the same work in the same space and may provide up to 12 times the work for the same power consumption.”

So as I sit here learning the ins and outs of my new Dell Vista laptop I take comfort in the fact that there are mainframes snuggling in their data centers just cranking away as they’ve done for 40 years.

March 29, 2007

YouTube, VIPs, and System i

Today there is a bewildering array of individual and community-driven communications means that less than a decade ago simply did not exist. Blogs, Podcasts, and YouTube are becoming a very accepted, dare I say conventional if not expected, means of getting the message out. While much of this content started as personal diatribes (blogs) being released to anyone willing to get online and view the web page, the next stage of content, i.e. Podcasts (take the content and listen later) and now YouTube (view the video now or later), has definitely caught the attention of mainstream content providers. In particular, media interests, especially those with a position to spin or a product/service to sell, have begun to embrace these communication channels as another way to reach their various audiences and their pocketbooks. At the same time, each of these content providers is seeking to make their constituencies feel special, part of the greater community, VIPs if you will, so that they will bond with the underlying message and become proactive supporters of the cause.

As new as these approaches may seem, they are addressing an age-old issue, i.e. how to get the message out to one's audience and influence their actions. In IT marketing, this quandary is the focus of much attention, especially when it comes to business partners. With reported growth in the number of SMBs and their purchasing power continuing to outstrip the growth of the larger enterprise, vendors are increasingly reliant upon their indirect channels to reach a larger proportion of the business opportunity. At the same time, the number of ways in which to reach the audience is growing, and there is no longer an automatic acceptance that the only and best information comes directly from vendors. Further, user communities are becoming more differentiated along verticals, geographies, IT expertise, and behavioral demographics. So what is a vendor to do? Focus on the past, present, or future? I would argue in many cases, all of the above.

I'll be upfront, I like the IBM System i. Yet its legacy, and for many an outdated perception of it in the marketplace, at times limits its potential community of users. However, we have seen much change about the System i, and recently have witnessed an interesting confluence of marketing and positioning that reaches back into the platform's legacy while at the same time pursuing new audiences through new communication channels.

The recently launched Vertical Industry Program, aka VIP, has targeted the traditional heart of System i's success, i.e. being the integrated platform for industry applications. The last few years have offered ISVs many alternative platforms upon which to ply their wares, be they Linux, Windows, or Open Source technologies. For some, the value prop of System i may have become overlooked. Refocusing on a core market is generally a good idea, and the support of 3400+ revitalized System i applications is testimony to the opportunity. But something that is different this time is the micro focus of VIP.

Rather than targeting a few broad verticals, VIP is focused on 80+ micro-verticals, such as travel and entertainment subsets like gaming table or restaurant management, manufacturing sub-segments such as after-market auto parts, or labor union pension funds. While VIP has the expected partner program features such as co-marketing, technology assistance, and so forth, ultimately I think the narrower focus of targeted solutions may prove to be the success driver. Through close targeting, the channel partner as well as their customers individually become more important and may once again think of themselves as VIPs as opposed to just one of many vying for attention. With the proliferation of formal and ad hoc user communities, the feeling of importance imparted by a vendor to its partners and their customers should not be underestimated, as it is a powerful viral marketing tool.

With an eye towards capturing the attention of new users, consisting of demographically younger, Windows-centric, or Web-savvy citizens, System i has posted some videos, "IT Revenge" v1.1 - v1.4, on YouTube that tap into many of the common frustrations of SMB server administrators. While smashing a server suspended as a piñata with a baseball bat may not seem very businesslike, the reality is this taps into that visceral feeling many server administrators have experienced when dealing with server sprawl, and sets the tone that System i might be a bit different. Add to this the "i want control" advertising campaign, the iSociety online community, and The Truth web site, and you have a collection of marketing and influencing platforms that reach far beyond the traditional IT education channels. Plant a seed and watch it grow might be the adage, but it is a good one, especially if the seed is planted in soil that has never before grown the new crop.

I find it ironic that the classic value of the System i (load the app, fire it up, watch it run, and then leave it alone) is so contemporary. It all gets back to why organizations deploy computers and applications in the first place. It is not about perfecting the black arts of operating and maintaining a fleet of disparate computing resources, it's about getting business done as competitively as possible. This is as true for business partners as for end user organizations. By focusing on both constituencies' needs, IBM can help each feel like they are VIPs, and garner positive viral marketing in numerous end user communities. Getting a closer view and understanding of the customer and partner is always essential to this end. By combining more traditional approaches like the VIP program with blogs, YouTube, and user communities, IBM is seeking to widen its marketing net while at the same time making each of its constituencies feel unique and special. This integrated combination of old and new style marketing may be reflective of the System i itself: a platform with a heritage of integrated simplicity and ease of use whose potential is relevant today in more scenarios than ever.

March 23, 2007

CSOs – Trend or Fad?

The notion of combining physical security and logical (information) security has been around for some time. Some industry thought leaders, such as Steve Hunt, feel that convergence of the responsibilities for physical and information security is not only a best practice, but inevitable. Recently AT&T published a white paper with the results of a survey conducted for them by the Economist Intelligence Unit. The paper stated that "Typically, the CEO remains the primary decision-maker for electronic security decisions (although in Europe the CIO is more likely to hold this role). But the importance of the chief security officer (CSO) is rising—this figure is cited as the main decision-maker at 12% of companies."

This made me wonder if the role of CSO makes sense or if it is simply wishful thinking. I pondered the history of the responsibility for information and physical security during my Army career. At battalion (a unit commanded by a Lieutenant Colonel) and above, there is a principal staff officer responsible for “Intelligence and Security”. At one point this officer (the S2 if working for a Lieutenant Colonel or Colonel and G2 if working for a General officer) was responsible for information security as well. Over time this proved untenable since intelligence officers were not IT professionals and it wasn’t practical to have them learn the technical details and nuances necessary to be effective. The responsibility was transferred to the “6” who was the lead for Communications and IT within the organization.

In the commercial sector physical security is the province of facilities while information security is typically within IT and usually reports to the CIO. Ultimately the CIO and the facilities lead may report to a common VP such as the CFO.

Given all the above, suppose you had the ability to re-orient security: what would the ideal structure be in light of the growing array of regulations, the pressure for data privacy, and looming e-discovery rules?

I'd argue that the CEO needs a focal point, and perhaps the logical keystone is a single individual responsible for Security and Compliance (S&C). Of necessity this would cross the lines of other key direct reports to the CEO such as HR, the CFO, and of course legal. Staff elements within the Security and Compliance Office could be set up with dotted-line supervision over their respective functional counterparts, while the S&C Officer would be the CEO's representative in all matters related to security and compliance across the organization.

March 13, 2007

Venyon & NFC -- Cool, but is it Viable?

Recently I attended a breakfast in San Francisco given by Venyon, a joint venture of Nokia and Giesecke & Devrient, focused on the topic of Near Field Communication (NFC). NFC is a short-range wireless connectivity technology recognized by the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC), the European Telecommunications Standards Institute (ETSI), and ECMA, a European association for standardizing information and communications systems. NFC is optimized for proximity transactions, operates globally in the 13.56 MHz range, and offers data exchange rates of 106 to 424 kbps. It is also purported to be compatible with existing and future contact-free payment and ticketing card infrastructures based on ISO 14443 standards.

Venyon envisions a future whereby mobile phone manufacturers will integrate its secured chip and management platform into handsets, thus allowing users to use their phone as a secure bank vault with which to commence instantaneous consumer gratification. One comparison given was the use of the mobile phone much like the Oyster cards of London Transport, or equivalent tracking/payment systems in use in various metro transport systems. In addition, Venyon also foresees the embedded technology as a part of multi-factor authentication solutions based upon its secure chip, the mobile phone SIM, and your knowledge of secrets. One such scenario might be gaining access to a secured door by touching the contact pad with the phone (as opposed to a card key), which triggers a phone call to the registered number for the mobile phone; the resulting conversation could be monitored by a remote security camera, the parties then exchange secrets or pass codes, and the door is unlocked remotely.

OK, this technology is the stuff that futurists love. It integrates, shares electrons, proffers a future of great enablement, professes a revenue stream for service providers, includes a piggy bank, and has a zillion possibilities for savvy business folk to attempt to have the piggy bank sent their way. I can hear the ads now, “With Cingular and Nokia you can download the latest MP3s, cool ring tones, your bank deposits, and your latest hot club listings, all into one totally cool device that will reduce your carbon footprint while we bill you automatically from your personal cash vault located inside the phone.” Argh!!

To be fair, I am no longer a twenty-something, I am two twenty-somethings. Cool is not as hot as it once was, and I am no longer easily captivated by gizmos. While I believe there is a market populated by younger people who judge social status by the latest device or wireless functionality, looking beyond this crowd, just how useful could Venyon's use of NFC be for a payment and authentication solution? Technically, it would probably work, but could most people be convinced to change their behavior to where the phone becomes the next credit or debit card? Would a unified card key system be swapped out in favor of a collection of mobile phones from various and sundry service providers? Would lightweight modern phones survive the impact and abuse they would receive if they started being slapped against payment and access pads multiple times per day? But most importantly, will banks, payment providers, retailers, and all the other requisite parts of a successful ecosystem sign up and play?

In the realm of IT, one axiom remains constant. It is much easier to swap out technology than to change human behavior. In order to effect behavior change, the new way of doing things must provide benefits not available through the old way and offer a high enough economic imperative to bribe the user into desiring the change. At present, it is hard to see where the economic imperative would be high enough to cause substantial change in the general public.

However, by courting the tweens and twenty-somethings, who have disposable income and time, Venyon and its NFC cohorts could plant behavioral seeds that would carry over into working adulthood. This could groom a future where such solutions would be as expected as listening to music on a mobile phone is in this demographic today. So the challenge will be to find deep enough pockets to drive the development of an NFC ecosystem, place the technology into the hands of enough early adopters, and hopefully watch their NFC behaviors grow. This would take a lot of patience, but it is not impossible. Ask anyone 10 years ago if they would listen to music on their cell phone (analog, heavy, and expensive) and they would have laughed you out the door. Yet to the major mobile providers today, it is no laughing matter.

Although I think Venyon's vision will not be an overnight success, when technology offers new ways to help separate people from their money, there are always folks interested in taking up the challenge. Time will tell if Venyon's solution ultimately addresses a need in the marketplace, or if it will join the long list of technologies that were seeking a problem to solve, but did not find one before their own problem of financial viability became insurmountable.

March 07, 2007

Defense Technology – Still A Major Source of Commercial Products

I read yet one more notice of one company buying a piece of another, but this one struck me as a bit noteworthy. Nomad Digital, provider and operator of specialist mobility networks, today announced its acquisition of QinetiQ Rail, the commercial rail division of QinetiQ. Most people have never heard of the company with the funny Q name, but I recognized them from their defense business centered in the UK, which has recently expanded to the US.
According to the acquirer, “The transportation sector is full of opportunities for a wide range of WiMAX broadband and narrowband mobile wireless services and it is largely under-served by conventional mobile network operators. …. enhance our offering of value added on-train services, such as live CCTV, train operating system applications, more reliable train-to-shore communications and entertainment services for passengers.” He goes on to add “By retaining an interest in Nomad, QinetiQ has demonstrated its conviction that we have a strong business here."
There are some interesting market dynamics here. Wireless technology is clearly important in the defense community, and rail systems are a key piece of the national infrastructure, moving large amounts of cargo and sometimes people. Railroads need to ensure safe and efficient operation of their rolling stock. Some, like Eurostar and Amtrak, are looking for ways to make themselves more attractive to passengers, especially those able to afford higher fares and add-on service purchases.
Having recently experienced a cruise on the largest cruise ship in the world (Royal Caribbean's Freedom of the Seas), I'm aware of the demand for entertainment and instant access. Migration of defense technology into the commercial sector will certainly help increase accessibility; I just hope we're all smart enough to know when to turn it off.

February 27, 2007

If it had been The Great Firewall of China, Would the Manchus Still Have Invaded?

It's a war out there.

The Forces of Good have been battling the Forces of Evil ever since someone made up the first law. Since not everyone agrees that following the rules is in their best interest, enforcers (traditionally ranging from hired thugs to your friendly neighborhood policeman) have been in place. The point of this mini history lesson? To highlight the fact that it's never going to end. Good is never going to triumph over Evil, because part of being Good is not hitting first. Good has to wait for Evil to do something before enforcing the law.

Cyberspace is no different. Hackers and spammers and phishers and all others out there making victims of Joe Q. always get the first blow in the war. The only preemptive move Good can make is to build a wall; but even the Great Wall of China failed, so we shouldn't put all of our trust in our defenses. Once a cybercriminal strikes, all the security companies can do is mop up the pieces and try to make sure that that particular avenue is closed.

It's a never-ending, frustrating battle that is, by definition, going to have innocent victims. (No victim, no crime, right?) So when one is shopping for cybersecurity products, one should keep in mind that they are all based on past attacks and won't necessarily be precognitive enough to protect precious data from the Next Evil Thing.

That's not to say that our cyberspace enforcers aren't doing their best - they are certainly making an all-out effort to try to predict and prevent the next attack. Law enforcement in the real world is working with security specialists in the cyber world, but I believe that more needs to be done. Cyber crime and real world crime are becoming increasingly enmeshed, as is highlighted by the real world crime of a stolen laptop enabling the cyber crime of stolen data and identity theft. The recent theft of a cell phone triggered the real world embarrassment of certain celebrity figures, but what if a stolen PDA results in a criminal being able to ambush someone?

Guarding cell phones, laptops, PDAs or other personal electronics should be on the same level of priority as guarding one's house keys. The sooner the public at large realizes the dangers of not taking their electronic privacy seriously, the better. And perhaps, if enough awareness is raised, my teenager will quit losing his cell phone.

February 20, 2007

Happy Birthday Blade.org

Last week I attended a party in San Francisco celebrating the first birthday of blade.org. As expected, various VIPs from blade.org and ancillary organizations were present to celebrate, but there were also guests from the VC community as well as two authors/professors, namely Raymond Miles and Henry Chesbrough. There was the slate of presentations one would expect at such an occasion, but if you were expecting to hear a pitch about how great IBM BladeCenter was, you were in the wrong place. Rather, the tone of the morning was intriguing, as it did not focus on blades per se, but on a far more compelling topic, collaboration in the marketplace.

At the highest level, one could argue the premise of blade.org is simple: get more companies interested in designing blades to fit in the BladeCenter, and then sell more BladeCenters. At one level, this is true; Big Blue likes to sell things and make money just like everyone else. Nevertheless, what is much more fascinating is not what is being done, but how it is being done. In 2006, blade.org had eight companies/members pondering the potential of the blade architecture. That number quickly jumped to 40, and now exceeds 100. At the same time, $1B in VC funding has poured into member companies. It does not take long to realize that blade.org is much more than a front to promote IBM BladeCenter; rather it is a thriving example of what Raymond Miles dubbed "Collaborative Entrepreneurship."

When you look at the array of companies participating in blade.org, you see a wide spectrum of players, not just server, switch, or networking vendors, but component vendors, application developers, integrators, etc. Each of these is bringing complementary expertise to the blade opportunity and while some may be competitors, the embodiment of blade.org is a shared experience aiming to raise the water level for all boats floating in the blade harbor. This phenomenon of implied trust at the highest levels was a focal point for all of the speakers that morning. When such trust can be achieved, then collaborative innovation can take place even in a competitive marketplace.

Until recently, such an approach would have been limited to a narrowly defined consortium of interests, or would have been treated as suspect by participants, always looking out for the inevitable competitive stab in the back. But as blade.org has illustrated, it is possible to come together in a well-defined and trusted environment to innovate. The financial backing of VCs is evidence of the trust that has been garnered, as VCs are not quick to part with their money unless they have a high degree of faith and trust in the situation. The result is that vendors are looking at the opportunities for blades with a greater deal of interest than might otherwise have occurred. Further, the R&D and S&M efforts of each member have leveraged the investments of others to grow the overall opportunity at a much more rapid pace than any single vendor or very small group of vendors could.

Blade.org typifies an open approach. Yes, open in the sense of technical standards, but more importantly, open in the approach to sharing information and doing business. The resultant trust encourages all to participate at a greater level and with a higher expectation of success. The recognition of this need for trust is not new in business, as the Japanese keiretsus of the latter 20th century illustrated another approach to maintaining trust, i.e. cross ownership among all the participants. However, that approach was closed by its nature and dependent upon a single source of financing. Blade.org has the best of both worlds: a strong environment of trust and incentives to contribute, but an open model for participation, financial independence of its members, and an affirming stance toward independent and collaborative entrepreneurship.

The notion of Collaborative Entrepreneurship in general, and Blade.org in particular, is an exciting experiment/activity to watch. We look forward to seeing what it will have achieved when Blade.org's second birthday comes around.

February 14, 2007

Cuba and Venezuela – Unlikely Good Examples of Open Source Preference

A recent headline in my local paper, the San Jose Mercury News, attracted my attention: “Cuba moving to ditch Microsoft, its products” (http://www.mercurynews.com/mld/mercurynews/news/world/16721400.htm). While many would tend to chalk this up to anti-US security paranoia, in my opinion this would be the wrong conclusion.

During 2006 I had the opportunity of meeting with many government officials from around the world and uniformly they were all interested in one thing: saving money on their software license costs. While this was especially prevalent in Asia, this goal was not unique to developing countries. Even the most developed of nations such as Japan is aggressively exploring ways to make better use of open source software and reduce their dependency on Microsoft.

Government users, particularly those in the defense sector, have always harbored a distrust of commercial software for sensitive applications. The cry of "Commercial Off the Shelf" (COTS) or "Government Off The Shelf" (GOTS) does not echo as loudly as the crescendo of fewer budget dollars going out the door. Many organizations will likely find they can do more with the same budget through the promise of reduced software purchasing and support costs.

The article mentions China, Brazil and Norway as countries that have encouraged the development of Linux and the move away from Microsoft. They are by no means alone, and it would follow that the Cuban approach of mobilizing university students to develop open source products could be easily emulated by many nations. In fact, once upon a time (1997) in a far off land called Bosnia, I suggested to US trade officials that the country's universities would be ideal places to check Y2K code. Engineering students and graduates had been trained in the old Soviet mainframe mold and could easily adapt to the tasks inherent in ridding any code of a potential Y2K problem. Alas, no one thought this was such a good idea.

When I look at this open source movement from a geographical perspective, it strikes me that the big winners in open source product trade are likely to be China, Brazil and India.

In the case of open source, the innovator's dilemma may be more about how to make a profit than how to make a usable product.

RSA San Francisco 2007 - A Perspective

The RSA Conference has emerged as the leading annual US information security event. The one-time crypto-geek fest, originally held in the Sofitel Hotel in Redwood City, CA in the early 1990s, has blossomed from a gathering of the cryptographic community to the anchor of the information security marketing year.

I’ve been on the Security Speaker circuit for quite some time. I did my first CSI in New York in 1983 and as I recall, my first RSA speaker slot in 1996. The growth of RSA in both size and scope has been nothing short of remarkable. This year’s event featured over 340 exhibitors, 500 speakers and 200 sessions. Many of the keynote speakers called for the conference to broaden its coverage and extend into more general business topics.

Frankly this would be a sad thing. There are plenty of general venues, but not very many places where security oriented start-ups and new technology companies can mingle with end users, venture capital companies, competitors, would be acquirers and of course, analysts. This year’s event was upbeat and by all accounts successful. I’ve noticed that the number of end user attendees has increased. Attendees tend to be people actually doing the work rather than executive management. Engineers and project leaders apparently use RSA as way to stay abreast of industry developments in the sessions and see all the key vendors in one spot. As with many other shows, once the workshops start, the exhibit floor empties out.

Last year may have been the year of compliance, when every vendor seemed to base their marketing appeal on 'compliance'. This year would have to be the year of leakage, with many vendors touting their ability to prevent leakage of sensitive data or intellectual property.

This year, as with years past, there is growing attention to legal and government matters. While this trend may pay tribute to the early days when bashing the NSA over crypto export restrictions was de rigueur, it strikes me that legal penalties, data forensics, electronic discovery and government policy concerns have taken a higher mind share with information users and security vendors alike.

For me, the highlight of the event is always the Speaker's dinner. Being selected as a speaker or panelist is always an accomplishment. Art Coviello, head of the RSA Division of EMC and former CEO of RSA, told us that 500 speakers were accepted from 2,300 applicants. In years past the number of acceptances has been a tenth of the applications. It's also interesting to see the body of speakers and appreciate the range of talent from cryptographers to attorneys. As you might expect the table chatter is lively and often sarcastic, cynical and even thought provoking.

See you there in San Francisco next April?

January 23, 2007

Looking Forward: Power, Cooling, and the Data Center

In the future, we may look back on 2006 as the year that power consumption, cooling, and energy efficiency in the data center ceased being a back-burner issue for IT and facilities managers and elevated itself to become one of the forefront, if not leading, issues for many. While those “in the know” have always been aware of HVAC and power distribution limitations, until recently it has not been a noticeable issue. During the past several months, we have seen vendors focus on the issue of energy efficiency through various initiatives including HP’s Smart Cooling, EMC’s Energy Efficiency Tool, Sun’s Cool Threads, the latest Energy Star Specification, and VMware’s energy utility rebates. With the competitive attention now being brought to bear, we expect to see this topic remain at an elevated level during 2007 as vendors line up their competitive differentiation and definitions of what exactly energy efficiency is all about.

Although much of the cost cutting and resource gutting by CIOs and CFOs during the first part of the twenty-first century focused on infrastructure consolidation and headcount reductions, it didn’t take too long for the impact of $75/bbl oil and 22¢/kWh electricity to reach into the data center. At the same time, rather ironically, all the focus on server and storage consolidation combined with ever denser form factors such as blades has changed the heat generation and dissipation characteristics of the data center. Thus, the limitation of physical reality has once again impeded progress in our collective journey to a virtual IT existence. Yet there are many similarities and lessons to be drawn from the “consolidate, simplify, and virtualize” mantra of the past few years. Just as inefficiencies in server and storage utilization have led to consolidations featuring closely monitored virtualization schemes, we are now witnessing the same opportunity with cooling and power consumption.

Over-provisioning of cooling and power is inherently just as inefficient as over-provisioning anything else. If machine rooms are continuously cooled to meet peak loads, then a lot of kilowatt-hours are going to waste. Likewise, if the actual power being drawn by equipment is less than the wiring supports and designed to a worst-case scenario that is unlikely to be achieved, there will be unnecessary breaker panels and conduit being installed. From a financial and operational perspective, targeting the cooling where it is needed, only when it is needed, just makes sense; anything emitted beyond this is simply waste that impacts the bottom line of the business. Similarly, electrical circuits that are underutilized represent an underperforming investment.
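
As a rough illustration of the over-provisioning point, the sketch below (all numbers hypothetical) compares cooling that runs continuously at a level sized for peak load against cooling that tracks what the room actually needs on average.

```python
# Hypothetical comparison: cooling run flat-out for peak vs. cooling that tracks demand.
HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.12            # assumed electricity price, USD/kWh

peak_cooling_kw = 400          # cooling power sized for the worst-case IT load
avg_required_kw = 250          # what the room actually needs on average

always_at_peak_kwh = peak_cooling_kw * HOURS_PER_YEAR
tracking_load_kwh = avg_required_kw * HOURS_PER_YEAR
wasted_kwh = always_at_peak_kwh - tracking_load_kwh

print(f"Cooling run continuously at peak: {always_at_peak_kwh:,} kWh/year")
print(f"Cooling that tracks demand:       {tracking_load_kwh:,} kWh/year")
print(f"Waste: {wasted_kwh:,} kWh/year (~${wasted_kwh * RATE_PER_KWH:,.0f} at the assumed rate)")
```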

Although initiatives in the marketplace vary in their impact, we believe that the players who can provide dynamic realtime monitoring and control of the power consumption and cooling envelopes will be the long-term winners in this space. At present HP probably has the most comprehensive offering; however, other vendors certainly have many requisite pieces of the puzzle, and one cannot overlook the expertise of IBM’s Global Services to pull together just about any solution given enough money. At a minimum, a combination of systems management, facilities management, myriad sensors, and realtime data acquisition and control software will be required to achieve enhanced data center power and cooling efficiency. In addition, the knowledge, planning, and wherewithal to pull this together should not be underestimated. But despite the higher barrier to entry to play effectively in this space, we believe the opportunity is too great for most systems or management vendors to overlook.
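
To give a flavor of what dynamic realtime monitoring and control might look like in practice, here is a deliberately simplified, vendor-neutral sketch of such a closed loop. The sensor calls, thresholds, and rack names are illustrative stand-ins, not any particular vendor's API.

```python
# A vendor-neutral sketch of a monitoring/control loop (names and thresholds are illustrative).
import random
import time

TARGET_INLET_TEMP_C = 24.0   # desired rack inlet temperature
DEADBAND_C = 1.0             # ignore small fluctuations to avoid hunting

def read_inlet_temps(racks):
    """Stand-in for polling real facility/IT sensors; returns simulated readings."""
    return {rack: random.uniform(21.0, 29.0) for rack in racks}

def adjust_cooling(rack, delta_c):
    """Stand-in for a CRAC or airflow control call; here we only log the decision."""
    direction = "increase" if delta_c > 0 else "decrease"
    print(f"{rack}: inlet {delta_c:+.1f} C off target -> {direction} local cooling")

def control_loop(racks, cycles=3, interval_s=1):
    for _ in range(cycles):
        for rack, temp in read_inlet_temps(racks).items():
            delta = temp - TARGET_INLET_TEMP_C
            if abs(delta) > DEADBAND_C:
                adjust_cooling(rack, delta)
        time.sleep(interval_s)

if __name__ == "__main__":
    control_loop(["rack-01", "rack-02", "rack-03"])
```

A real deployment would of course draw on actual facility sensors and cooling controls, and would feed its decisions back into the systems management layer rather than simply printing them.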

For systems management vendors such as HP, IBM, BMC, CA and others, the myriad sensors necessary to monitor environmental conditions represent an opportunity to extend management solutions to reach beyond the traditional bounds of IT. The dividing line between IT and facilities is clearly blurring in the data center. This disruption in thinking highlights the latent opportunity.
We expect to see more initiatives where vendors such as Sun, VMware, EMC, IBM, et al will work with utilities to implement creative programs to help organizations reduce data center power consumption. Besides reducing the power bill, the reduced demand for data center power forestalls the need for additional generation capacity on the power grid and is a win-win scenario for the utility and ratepayers alike. In addition, environmental factors such as air quality and ambient noise levels will likely emerge as drivers as well, as organizations rationalize and change how they approach office/work space internally.

Organizations will reap these benefits incrementally as they refresh their technology over its lifecycle, and in some cases, the savings might encourage earlier refresh of equipment that may still be functional, but less efficient. Of course, a significant upgrade of the data center as a whole would bring more ROI sooner. If power utilities were to embrace power savings programs for computer technologies, like many do with older household appliances, lighting, heating, and cooling equipment, the potential to enhance the energy efficiency of the data center would grow significantly. Hopefully, in 2007 this is exactly the kind of market place behavior we will see as chipmakers, drive manufacturers, systems vendors, storage specialists, systems management companies, and utilities all work towards improving the efficiency of the physical operation of the data center.
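
For the early-refresh argument, a simple payback calculation is the natural test. The sketch below uses hypothetical figures for a refresh that also consolidates several older, lightly utilized servers onto one newer machine, and folds in an assumed cooling overhead per kWh of IT load.

```python
# Simple payback sketch for an early refresh (all figures hypothetical).
HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.12               # assumed electricity price, USD/kWh
COOLING_OVERHEAD = 0.5            # assumed extra kWh of cooling per kWh of IT load

old_servers = 4                   # lightly utilized boxes being retired
old_draw_kw = 0.40                # average draw per old server
new_draw_kw = 0.30                # average draw of the consolidated replacement
new_server_cost = 6000            # purchase price of the replacement

kw_saved = old_servers * old_draw_kw - new_draw_kw
annual_kwh_saved = kw_saved * HOURS_PER_YEAR * (1 + COOLING_OVERHEAD)
annual_savings = annual_kwh_saved * RATE_PER_KWH

print(f"Power avoided:  {kw_saved:.2f} kW continuous")
print(f"Energy saved:   {annual_kwh_saved:,.0f} kWh/year")
print(f"Dollars saved:  ${annual_savings:,.0f}/year")
print(f"Simple payback: {new_server_cost / annual_savings:.1f} years")
```

A utility rebate of the kind described above would shorten that payback further, which is exactly the behavior such programs are meant to encourage.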

January 18, 2007

HP's Customer Dis-Service

Ahhh the joys of owning a PC. I purchased an HP Pavilion 7495.it back in October when my laptop motherboard crashed and burned. I'd been putting off buying because I was waiting for an AMD chip to come out, but I bit the bullet and bought the system because I was in dire need of a PC and couldn't wait any longer. The day I brought it home, I installed it, and then, paranoid little nerd that I am, I went straight to the Microsoft Update site and started downloading critical updates. Things went well for a while, but then something went very wrong. The PC went into a continuous reboot loop after I'd installed all the updates.

Seeing as there were really no apps installed yet, I did an F10 and started over. Because I was in a hurry, I let it go. Flash forward to January. I did a backup of my system (still paranoid) and thought, well, maybe there was a glitch; I'll try this again. No glitch. The same thing happened. I couldn't boot into safe mode, and every other fix I tried told me that the computer was hopelessly corrupt. I did the F10 maneuver again, but this time I decided I'd get to the root of the problem before I put all my apps back on the system. So I went to the Microsoft site to see if I could get online support, and it told me to contact the PC's manufacturer.

So I called HP in Italy and sat on hold for 4 minutes until the lady came back and told me to call back later because all the lines were busy. Nice. So I went online to HP and got an online techie. Alvin P is what he told me his name was. So I wrote out an explanation of what had happened, and what I had tried. I explained that I needed to know which one of the updates was not getting along with HP’s version of XP Media Center Edition 2005 and what I could do to get around it. His response – you have to do each update manually to figure out which one doesn’t work. Mind you there are 63 of them according to Microsoft. Mind you this has been a problem since October, and you’d think I wouldn’t be the only one with this problem, so maybe someone at HP and/or Microsoft had worked it out by then and could tell me. Nope.

So in the end, this is HP’s response – go figure it out by yourself. AND THEN WHAT!!!????? So once I know which patch is damning my system – which it will do again – then I’ll have to do F10 AGAIN and install all the other patches again and then hope that was the only patch giving me a hard time. However, that only isolates the problem. It doesn’t solve it. What do I do about the bad patch? Skip over it and leave my system vulnerable to whatever that patch was supposed to fix? I think what really annoys me most is that Alvin clearly didn’t go digging through any databases to see what might have happened previously. Nope he just blew me off with – well, you have to install each one and see what happens!!!

I did warn him that I was an industry analyst and if that was his last and best answer then I was irritated enough to blog. He said I would just have to wade through each patch and figure out for myself what the problem was. Now that's fine if you're a reasonably tech-literate person who likes technology. But what about HP's thousands and thousands of customers who aren't particularly tech literate? What would they do at this point? It makes me think twice about purchasing another HP system.

January 16, 2007

Compliance 2007: The King's Sox Had Holes

For most of 2006 the term compliance was synonymous with the dreaded U.S. Sarbanes Oxley (SOX) law. The overwhelming majority of large organizations, especially multi-nationals, found themselves spending oodles of money on myriad projects all earmarked as necessary for “SOX Compliance.” Major trading countries and regions all took notice of how U.S. organizations were scurrying about trying to ensure that their top executives would not be clad in orange jumpsuits and headed to jail. Some countries such as Japan decided to go on the offensive and put the world on notice that they too would be passing legislation designed to bolster investor confidence and mend the sins of past malfeasance on the part of several executives and organizations.
Organizations have been facing a maze of regulations for quite some time; furthermore, it was not uncommon for the regulations to be technology-neutral in their guidance and perhaps even conflicting in their requirements. Laws and regulations could be based on the jurisdiction: federal (country level), state or provincial, or even municipal. Examples include the California disclosure law popularly known as SB 1386 and the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA). Organizations also found that they would be subject to regulations based on their size (revenue, market capitalization, number of employees), or their industry. For example, in health care there is HIPAA, more properly known as the Health Insurance Portability and Accountability Act of 1996; in financial services there is GLBA, the Gramm-Leach-Bliley Act of 1999; and for power and energy there are regulations promulgated by the North American Electric Reliability Council (NERC) that affect Canada, Mexico, and the United States.
An unintended result of this web of regulations is that top management is not necessarily clear on what the organization must do in terms of personnel issues, policies, and procedures. This can leave IT as the tail on the business dog. Top management must clearly describe business goals and objectives so that IT can implement them. In the case of compliance, IT is not able to sequentially address each and every rule, regulation, and law. Rather, the organization must employ IT as a tool for its own governance. IT, information security, and privacy technology in particular can be employed to enforce standards within the operation of the organization. These standards, taken together, will help ensure that the IT infrastructure the organization and its top management rely on to provide accurate and current information actually does so. IT can also be judiciously employed to ensure that the organization can function in spite of unforeseen interruptions, whether they are acts of nature, intentional acts by adversaries, or accidents.
We are cautiously optimistic about the compliance outlook for 2007. We feel fairly confident in saying that U.S. lawmakers have been made aware of the negative effects of SOX and have hopefully taken notice of heightened IPO activity in financial markets outside the U.S., such as Hong Kong. This is likely to translate into a loosening of the perceived SOX stranglehold. The lack of a successful SOX prosecution may also embolden executives to take a commonsense approach to running the organization: one that states goals and objectives with respect to governance, and that translates business objectives into IT standards, policies, and procedures ensuring the integrity of the IT infrastructure, which was the core intent of SOX in the first place.