
Is the Itanium Finally Dead?

The Intel Itanium was a stillborn monster from day one. I had my doubts as soon as I had any real information on the processor. When the first machines started coming out, it was clear that this was a total disaster.

The Itanium was quickly nicknamed the Itanic, and the name stuck. If Google's numbers are to be believed, the use of the term 'Itanic' outnumbers the use of the term 'Itanium' about 20 to 1. The product has been synonymous with EPIC failure (pun intended; Itanium's architecture is 'Explicitly Parallel Instruction Computing') almost since day one.

The wonder is not that Intel is finally killing off the Itanium. The wonder is why it took so long and so much money to learn what should have been clear from the start.

I published an article about the ongoing CPU struggle between AMD and Intel in 2000 ("Why Should AMD Drop Mustang?", written under the pseudonym 'DeepNorth'). In the article, I say:

"For the first half of 2002, Intel needs for McKinley to have such compelling performance advantages over 32-bit systems (and 64-bit Hammer) that it is worth doing a software rewrite for the majority of developers. My 20 years in the industry tell me the chances of that happening are virtually nil." [McKinley was the coming next generation Itanium]

I was not expecting much from McKinley anyway, but it underwhelmed even me when it was released. Nobody I worked with was writing stuff for the Itanium. I do not remember even seeing an Itanium box in the wild. They exist, but in tiny quantities compared to the x86.

In the closing footnote of that article, I say the following:

"I don't believe that there is much argument that the 'Itanic' is a failure, even from Intel. Will this processor ever be anything other than a historical curiosity?"

That was over ten years ago. About two years ago, it was still 'steady as she goes' from Intel and HP, even though the processor had been all but dead for most of its existence. This ship would have sunk long ago had Intel, HP, Microsoft, and similar big players not been madly bailing water all these years. However, over the past year, the Itanium has begun to take on too much water even for industry titans. It is sinking fast now. Even Intel has dropped C++ compiler support for the Itanic. This week, Oracle delivered the coup de grâce by announcing that it was discontinuing support for the Itanium. HP is the only major player that has not dropped the hot potato that is the Itanium. They can't hold it forever.

Possibly the only thing the Itanium had going for it was that it could run Windows. It can still run Windows, but not for long: mainstream support for the current Itanium version of Windows ends in 2013, and 'extended support' ends in 2018.

Dan Reger, Senior Technical Product Manager for Windows Server at Microsoft, has this to say about the Itanium versus the x86 chips:

"Microsoft will continue to focus on the x64 architecture, and it’s [(sic)] new business-critical role, while we continue to support Itanium customers for the next 8 years as this transition is completed."

Hardware needs software to breathe. The Itanium's oxygen has been cut off.

I predicted that the Itanium was doomed a decade ago. That was in the face of a firm commitment of billions of dollars by major industry players to make Itanium work. Nothing at all has been added to the 'plus' column since then, and many entries have been removed. Any remaining entries in the 'plus' column are either trivial or irrelevant.

The Itanium is *still* not actually dead, but it has been taken off life support. It is just a matter of time. I think it is reasonable to say that it will be official soon, probably before the end of Microsoft's mainstream support in 2013.

Having programmed in assembly on both x86 processors and non-x86 processors, I have to say that I am not fond of the x86 architecture. Nobody would be happier to see it replaced than me. I think it will be replaced. In fact, it has arguably already been replaced with a less horrible (but still horrible) version of itself.

The Itanium was the wrong answer to the wrong question. It never would have been a success. That is a shame because it was attempting to address very real problems with the x86 architecture and many of the worst of these problems remain.

For displaced chip designers who might find themselves on another team, I will try to put up a wish list from a longtime programmer (me) that should be food for thought.

--- Copyright (C) Bob Trower. You may use this freely as you wish, but please attribute the work if you can ---
