I wanted to tell you all about this article: "From the ashes: The next stage of EDA". I know it's a bit technical for some of you, but I think it's worth reading carefully anyway, because it gives a good overview of "where we are now": where we are in the ongoing Moore's Law saga, the current problems facing the semiconductor industry, and the surprising ways people are changing what they do in response. (The figures are in annoying PDF files, but they're still worth checking out.)
For example, we keep hearing about how all the hardware design jobs are going to India. But this article talks about the increasing difficulty of digital design, and if it is to be believed, the jobs going to India are the "dumb" digital design jobs: jobs done by someone who went to college and learned Verilog or VHDL, but who understands very little of the underlying semiconductor physics. Maybe they should study under Britney Spears for a while. This follows a similar trend in analog circuit design. Brian Hettinga, CEO of National Semiconductor, recently testified (though not under oath) that analog design jobs are not going to India. Analog design is a "black art" that requires a deep understanding of physics, and the best people learn from the gurus here in the US. Analog design has enjoyed double-digit growth in spite of the tech recession, because "analog" factors, rather than digital ones, have become the key selling points, especially in mobile computing: power consumption (and therefore battery life) and form factor. Digital performance, thanks to Moore's Law, is less of a differentiator. That's what Hettinga says, but this article makes the point that doing digital design at the cutting edge requires, like analog design, a lot of physics knowledge.
Another interesting point is the way software costs are becoming a larger portion of the cost of electronics. In the last few years, EDA startups have been one of the few areas where, quite remarkably, there have been many successful startup companies, even in the depths of the tech recession. (EDA, Electronic Design Automation, is the industry term for the software used for circuit design.) Two examples off the top of my head are Nassda and Magma, both of which are still in business and profitable. However, if you read the article, you'll realize that by "software costs" he doesn't just mean EDA software. He also means software written by the circuit design team that runs on the chip in place of hardware circuits.
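To make that concrete, here's a minimal sketch (my own, not from the article) of the kind of thing he means: a "bit-banged" serial transmitter done entirely in firmware, performing a job that used to require a dedicated UART circuit on the chip. The gpio_write() and delay_us() helpers and the pin assignment are hypothetical placeholders, not any real vendor's API.

    /* A software UART transmitter: firmware standing in for a
     * hardware peripheral. gpio_write() and delay_us() are
     * hypothetical board-support functions. */
    #include <stdint.h>

    void gpio_write(int pin, int level);  /* hypothetical: drive a GPIO pin */
    void delay_us(uint32_t us);           /* hypothetical: busy-wait delay */

    #define TX_PIN 3                      /* hypothetical pin assignment */
    #define BIT_US 104                    /* ~104 microseconds per bit at 9600 baud */

    /* Transmit one byte as 8-N-1 serial: a start bit, 8 data bits
     * (least significant first), and one stop bit. */
    void soft_uart_putc(uint8_t byte)
    {
        gpio_write(TX_PIN, 0);            /* start bit */
        delay_us(BIT_US);

        for (int i = 0; i < 8; i++) {
            gpio_write(TX_PIN, (byte >> i) & 1);
            delay_us(BIT_US);
        }

        gpio_write(TX_PIN, 1);            /* stop bit */
        delay_us(BIT_US);
    }

The trade is exactly the one the article describes: the design team ships a cheaper chip with one less circuit on it, and the cost moves into the software budget instead.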
What happens as the costs increase but the shelf life, and with it the opportunity to recoup those costs, keeps getting shorter? I would suggest that the result will be consolidation, in the sense that fewer chip designs will be produced and reach the market. Those few chip designs must therefore sell in larger volumes. Is that possible? It would be, if each of those fewer chips can do more. And in fact, that AMD technology chief guy whose name I can't remember... insists that the long-term trend in high tech is for "general purpose" technology to take over the roles once played by specialized circuitry. As examples he cited the special multimedia instruction sets that Intel and AMD chips have shipped with for many years. These instruction sets take over the basic functions that were once done by specialized DSP (digital signal processor) chips that people had to buy separately from the main CPU. (The interviewer didn't think to ask him whether this trend of general-purpose processors taking over specialized processors would continue to the point where all the computations performed by the human brain will someday be done by a general-purpose chip from AMD. *I* would have asked. He probably wouldn't have been prepared to answer that one anyway. But why not? Every general-purpose CPU from AMD is Turing-complete.)
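To make the DSP point concrete, here's a rough sketch (mine, not his) of a multiply-accumulate loop, the bread-and-butter operation of a DSP chip, written in C with the SSE intrinsics that those multimedia instruction sets expose on every recent Intel and AMD processor. It assumes the array length is a multiple of four, just to keep the example short.

    /* A dot product, the classic DSP workload, running on a
     * general-purpose CPU via SSE: four multiply-accumulates per
     * loop iteration instead of one. */
    #include <xmmintrin.h>  /* SSE intrinsics */

    float dot_product(const float *a, const float *b, int n)
    {
        __m128 acc = _mm_setzero_ps();          /* four running partial sums */

        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_loadu_ps(a + i);    /* load 4 floats from a */
            __m128 vb = _mm_loadu_ps(b + i);    /* load 4 floats from b */
            acc = _mm_add_ps(acc, _mm_mul_ps(va, vb)); /* multiply-accumulate */
        }

        /* Combine the four partial sums into one scalar result. */
        float parts[4];
        _mm_storeu_ps(parts, acc);
        return parts[0] + parts[1] + parts[2] + parts[3];
    }

Nothing here requires a separate DSP chip, which is exactly his point: the commodity CPU has absorbed the specialized part.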
Is he right about this trend? As an anecdotal example, when video games first came out, they required special hardware in their arcade cabinets. Today, I can play the same games with "general purpose" software: a web browser running Flash. So the rule seems to hold. Of course, cutting-edge video games today still require special graphics hardware. But hey, it makes a good segue, since today we are meeting at the Swiss Science place to find out "Why We Play Games". See you then...
Nice segue. See you tonight, Mark.
Posted by: Mark Finnern | May 21, 2004 at 10:05