It is always a pleasure interacting with Dr. Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics, and vice chairman of the EDA Consortium, USA. I started by enquiring about the global semiconductor industry.
Dr. Wally Rhines said: "The absolute size of the semiconductor industry (in terms of total revenue) differs depending on which analyst you ask, because of differences in methodology and the breadth of analysts' surveys. Current 2012 forecasts include $316 billion from Gartner, $320 billion from IDC, $324.5 billion from IHS iSuppli, $327.2 billion from Semico Research and $339 billion from IC Insights.
"These numbers reflect growth rates from 4 per cent to 9.2 per cent, based on the different analyst-specific 2011 totals. Capital spending forecasts for the three largest semiconductor companies have increased by almost 50 per cent just since the beginning of this year. However, the initial spurt of demand was influenced by the replenishment of computer and disc drive inventories caused by the Thailand flooding. Now that this is largely complete, there is some uncertainty about the second half.
"So, overall it looks like the industry will pass $310 billion this year, but it may not be by very much. The strong capital spending and demand for leading edge capacity should impact the second half but the bigger impact will probably be in 2013.
What's with 28/20nm?
Has 28/20nm semiconductor technology become a major 'workhorse'? What's going on in that area? It is certainly one of considerable interest right now.
Dr. Rhines said that the semiconductor industry's transition to the 28nm family of technologies, which broadly includes 32nm and 20nm, is a much larger transition than we have experienced for many technology generations.
The world's 28nm-capable capacity now comprises almost 20 per cent of the total silicon area in production, and yet the silicon foundries are fully loaded with more 28nm demand than they can handle. In fact, high demand for 28/20nm has created a capacity pinch that is currently spurring additional capital expenditure by foundries.
He added: "As yields and throughput mature at 28nm, the major wave of capital investment will provide plentiful foundry capacity at lower cost, stimulating a major wave of design activity. Cost-effective, high yield 28nm foundry capacity will not only drive increasing numbers of new designs but it will also force re-designs of mature products to take advantage of the cost reduction opportunity."
Handling 22nm and sub-22nm levels
One would like to know what EDA now needs to do to handle the 22nm and sub-22nm nodes. Also, is that already happening?
Dr. Rhines said that as we move toward smaller geometries, we need better techniques to manage the growing problem of variability in nanometer integrated circuit manufacturing. We are really starting to see that DFM (design for manufacturing), something the industry has been talking about for years, is now becoming critical to design.
DFM requires a detailed understanding of OPC (optical proximity correction). Specialists in optics have joined traditional electronic design specialists at EDA companies to create these key technologies. The EDA companies are working closely with semiconductor manufacturers on process technology.
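To give a feel for what OPC does, here is a deliberately tiny, rule-based sketch in Python. It is not how Mentor's (or anyone's) production tools work, and every rule and number in it is an assumption made up for illustration; it only shows the core idea that mask edges are biased according to how densely features are packed, because dense lines print differently from isolated ones.

```python
# Toy illustration of rule-based OPC (not a production flow): drawn line edges
# are biased according to the spacing to the neighbouring feature, because
# densely packed lines print narrower than isolated ones.
# All numbers below are made-up values for illustration only.

BIAS_RULES = [           # (max spacing in nm, bias per edge in nm) - assumed
    (80, 4.0),           # very dense lines: widen each edge by 4 nm
    (150, 2.0),          # moderately dense: widen by 2 nm
    (float("inf"), 0.0), # isolated lines: no correction
]

def edge_bias(spacing_nm):
    """Return the per-edge bias for a line with the given spacing to its neighbour."""
    for max_spacing, bias in BIAS_RULES:
        if spacing_nm <= max_spacing:
            return bias
    return 0.0

def corrected_width(drawn_width_nm, spacing_nm):
    """Drawn width plus the bias applied to both edges of the line."""
    return drawn_width_nm + 2 * edge_bias(spacing_nm)

if __name__ == "__main__":
    for spacing in (60, 120, 400):
        print(f"spacing {spacing:>3} nm -> mask width "
              f"{corrected_width(45, spacing):.1f} nm (drawn 45 nm)")
```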
Challenges ahead!
And, what are the other challenges that the EDA industry will likely face?
Dr. Rhines added that low power design at higher levels is a pressing challenge. Architectural choices made in the front end of design have the most significant impact on power consumption. In addition, the speed of simulation is orders of magnitude faster at the system level. But assessing power at this abstract level traditionally has been extremely difficult to do with any degree of accuracy.
Fortunately, advanced design tools are emerging that provide accurate power modeling early in the design flow where architectural tradeoffs can easily be made. This will enable designers to explore more alternatives for applying the most efficient power strategies.
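As an illustration of why architectural choices dominate, consider the standard dynamic-power relationship P = activity x C x V^2 x f. The short Python sketch below compares two hypothetical architectures at the same throughput; all capacitance, voltage, frequency and activity figures are assumptions chosen purely to show the style of early trade-off analysis, not output from any particular tool.

```python
# A minimal sketch of the kind of spreadsheet-level power estimate an architect
# might run before RTL exists. The dynamic-power formula alpha * C * V^2 * f is
# standard; every number below is an assumption chosen to illustrate a trade-off.

def dynamic_power_w(activity, cap_farads, volts, freq_hz):
    """Classic CMOS dynamic power: alpha * C * V^2 * f."""
    return activity * cap_farads * volts ** 2 * freq_hz

# Option A: one core running fast at nominal voltage (assumed figures).
single_core = dynamic_power_w(activity=0.2, cap_farads=1.0e-9,
                              volts=1.0, freq_hz=2.0e9)

# Option B: two cores at half the frequency, allowing a lower supply voltage.
dual_core = 2 * dynamic_power_w(activity=0.2, cap_farads=1.0e-9,
                                volts=0.8, freq_hz=1.0e9)

print(f"single fast core : {single_core:.3f} W")
print(f"two slower cores : {dual_core:.3f} W")  # same throughput, ~36% less dynamic power
```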
In the area of functional verification, the explosion in the complexity of verification continues to be a formidable challenge for EDA. As designs expand in size and complexity, simulation runs cannot reach effective coverage within a reasonable amount of time. To keep on schedule, designers are being forced to either lower their coverage goals or change methodologies. The changes include ESL (electronic system level) design, coverage-based verification, emulation, hardware acceleration of test benches and assertion-based verification.
One of the most promising new methodologies is intelligent testbench automation that removes redundant simulation, giving top priority to running each unique test first. This results in achieving target coverage linearly, leaving more time for running random tests, or even expanding the test space to previously untested functionality.
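A toy sketch helps make the idea concrete. The Python below is not the algorithm inside any commercial product; it simply orders a handful of hypothetical tests so that those adding new coverage bins run first and purely redundant tests are deferred, which is the spirit of intelligent testbench automation as described above.

```python
# Toy sketch of the idea behind intelligent testbench automation: run the tests
# that add new coverage first, and defer tests whose coverage is redundant.
# Test names and coverage bins below are hypothetical.

tests = {
    "t_reset":     {"reset", "idle"},
    "t_burst_rd":  {"read", "burst", "idle"},
    "t_burst_rd2": {"read", "burst"},          # redundant with t_burst_rd
    "t_wr_err":    {"write", "error"},
    "t_rd_wr_mix": {"read", "write"},
}

def prioritise(tests):
    """Greedy ordering: at each step pick the test covering the most new bins."""
    covered, order, remaining = set(), [], dict(tests)
    while remaining:
        name, bins = max(remaining.items(), key=lambda kv: len(kv[1] - covered))
        if not bins - covered:
            break                      # everything left is pure redundancy
        covered |= bins
        order.append(name)
        del remaining[name]
    return order, covered, list(remaining)

order, covered, redundant = prioritise(tests)
print("run first:", order)             # unique tests, in coverage-per-test order
print("covered bins:", sorted(covered))
print("deferred (redundant):", redundant)
```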
Emulation is also evolving to address the growing complexity of system design and the increasing need for hardware/software co-verification. While traditional 'in-circuit emulation' is still used, the trend among leading-edge users is toward acceleration of test benches (co-modeling), virtual IP stimulus (rather than plug-in hardware) and software debug for dozens of simultaneous users.
As a result, emulation can be set up as a typical IT server farm, with users remotely accessing the portion of the emulator capacity they need. The cost per cycle of emulation is two to three orders of magnitude lower than simulation on a traditional server farm. A large, and increasing, share of emulation deployment is in systems companies, which are using emulation both to debug multi-chip systems and to develop and verify embedded software.
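A quick back-of-envelope calculation shows where a figure like 'two to three orders of magnitude' can come from. All the prices, clock rates and utilisation numbers in the Python sketch below are assumptions for illustration only, not vendor data; the point is simply that an emulator's far higher cycle rate outweighs its far higher price per seat.

```python
# Back-of-envelope arithmetic behind the cost-per-cycle comparison. Every figure
# here is an assumption for illustration, not vendor data: an emulator costs far
# more per seat than a server, but executes design cycles thousands of times
# faster than RTL simulation.

SECONDS_PER_YEAR = 365 * 24 * 3600

def cost_per_cycle(annual_cost_usd, cycles_per_second, utilisation=0.5):
    """Annualised cost divided by the design cycles actually delivered in a year."""
    return annual_cost_usd / (cycles_per_second * utilisation * SECONDS_PER_YEAR)

emulator = cost_per_cycle(annual_cost_usd=1_000_000, cycles_per_second=1_000_000)
simfarm  = cost_per_cycle(annual_cost_usd=50_000,    cycles_per_second=500)

print(f"emulation : {emulator:.2e} $/cycle")
print(f"simulation: {simfarm:.2e} $/cycle")
print(f"ratio     : ~{simfarm / emulator:.0f}x cheaper per cycle on the emulator")
```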
Another challenging area is embedded software. The cost of designing hardware has actually not increased much over the last twenty years, according to the ITRS roadmap and Gary Smith EDA.
What has increased is the cost of system engineering and embedded software development. The enablement software for the SoC (drivers, Linux, lightweight executives, etc.) is becoming the bottleneck in the release process.
He noted: "Ideally, the SoC design process would enable embedded software development ahead of silicon. But that's not as easy as it sounds. First, the virtual representation needs to run sufficiently fast. Second, an environment that is native to the software development team needs to be established, trying to train the embedded software team on the use of hardware design tools is a non-starter. The solution that is emerging is a single embedded software development environment that is the same whether the target is a simulation, emulation, prototype or final product. Popularity of this kind of environment is growing rapidly.
"For example, Mentor's Sorcery Codebench, which is built upon the GNU open source tool chain for cross development of Linux, RTOS and bare-metal based embedded systems, is experiencing more than 20,000 downloads per month."
Tuesday, September 18, 2012