Speaking on 'Enabling business growth through effective and collaborative innovation' at the recently held International Electronics Forum (IEF) 2010 in Dresden, Germany, Dr. Jack Sun, CTO and vice president, R&D, TSMC, said that TSMC leads and invests heavily in competitive, energy-efficient, and eco-friendly technologies to enable product innovation -- such as CMOS platform scaling (40/28/20nm/FinFET, low-R, ELK), More-than-Moore, and integrated package/3D-IC.
He added that TSMC strives for manufacturing excellence, capacity, and economy of scale to support customers' innovation and business growth. The company is also pushing the acceleration of EUV and Multi-Ebeam capabilities for cost-effective density scaling. His clear message was, "We must and can collaborate to innovate and overcome the technical and cost challenges." That is, collaborative innovation among government, industry and academia is required to overcome the cost hurdle.
Earlier, he said that the IC industry will continue to grow -- with a 22 percent growth likely in 2010, reaching $276 billion. During 2011-2014, he estimated a 4.2 percent CAGR for the IC industry and 7.2 percent CAGR for fabless companies.
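Just to sanity-check those numbers, here is a quick back-of-the-envelope projection in Python (the $276 billion base and the 4.2 percent CAGR are from Dr. Sun's talk; the extrapolation itself is mine, purely illustrative):

```python
# Project a market size forward from a base year at a constant CAGR.
def project(base_value, cagr, years):
    return base_value * (1 + cagr) ** years

ic_2010 = 276e9                        # estimated 2010 IC market, $276 billion
ic_2014 = project(ic_2010, 0.042, 4)   # 4.2 percent CAGR over 2011-2014
print(f"Projected 2014 IC market: ${ic_2014 / 1e9:.0f} billion")  # ~$325 billion
```

So even a modest 4.2 percent CAGR adds roughly $50 billion to the market over four years.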
Dwelling on the application and technology trend, Dr. Sun pointed out that the trend is SoC and heterogeneous integration at chip, package, and product level with embedded power-efficient processors, hardware accelerators, and special features.
TSMC continues to further expand its offering by including packaging services and silicon foundry services. This will allow the fabless semiconductor companies to achieve 'More than Moore' gains in integration by using TSMC as a foundry partner.
Dr. Sun also detailed how TSMC enables innovation by providing best-in-class technology and design solutions.
* 'Green' CMOS technology platform – Moore’s Law.
-- High-density, energy-efficient transistors and interconnect -- most desirable for embedded SoC.
-- Pushing reduced-cost lithography and 450mm.
* Eco-friendly fine-pitch integrated packaging technology and 3D-IC
* Special/derivative technologies to interact with the external world – More-than-Moore.
-- MCU, MEMS, RF, analog, BCD power, CIS, Display Driver, etc.
-- Re-use and leverage compatible CMOS platform backbone and IP
* Open innovation platform and ecosystem of IPs.
TSMC's 20nm and 28nm leadership
TSMC's CMOS platform leadership spans current 28nm as well as future 20nm technology. Dr. Sun also highlighted how customers innovate with TSMC 40nm. Currently, there are more than 60 customer product tape-outs, more than half of which are in production with D0 of 0.1~0.18. The monthly 40nm CyberShuttle has delivered >780 blocks for design/IP verification.
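For context on what a D0 in that range means for yield, the standard Poisson model (a generic textbook model, not TSMC's own methodology; the 100 sq. mm die size is an assumption for illustration) can be sketched as:

```python
import math

def poisson_yield(d0_per_cm2, die_area_mm2):
    """Estimate die yield with the Poisson model: Y = exp(-D0 * A)."""
    die_area_cm2 = die_area_mm2 / 100.0
    return math.exp(-d0_per_cm2 * die_area_cm2)

# Hypothetical 100 sq. mm die at the quoted D0 range of 0.1-0.18 defects/sq. cm
for d0 in (0.10, 0.18):
    print(f"D0 = {d0}: yield ~ {poisson_yield(d0, 100):.0%}")
```

At that die size, the quoted D0 range corresponds to die yields somewhere in the mid-80s to low-90s percent.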
Moving on to TSMC's 28nm technology highlights, Dr. Sun said that the 28LP (poly/SiON) yield is approaching mature levels on 64Mb SRAM. Also, the 28nm HKMG (28HP/28HPL) development is on track. Here, TSMC developed a gate-last process with N+/P+ work functions and superior performance, yield, manufacturability, variability, and reliability. It has also achieved double-digit 64Mb yield, good Vccmin, close-to-target transistors, and good pre-qual reliability.
Dr. Sun added that a steady stream of shuttles has been running since the first one was launched in Jan '09, and almost every shuttle is 100 percent utilized. This implies intensive customer engagement by TSMC. Over two dozen customers are said to be working with TSMC on 28nm technology across all application segments.
Now, on to TSMC's 20nm highlights. The key technology features include planar transistors with 2nd-generation HKMG and 5th-generation strained Si; low-resistance ultra-shallow junction with M0 and enhanced millisecond anneal and silicide; and enhanced ELK and 2nd-generation Low-R interconnect.
Some other 20nm technology highlights include immersion lithography with innovative patterning and layout solutions to achieve 2x density over 28nm, with the EDA tools likely to be ready by mid-2010. Also, the design rules are compatible with EUV and Multi-Ebeam insertion for selected layers in 2013-2014. Wow, this is really something!
TSMC's lithography roadmap
TSMC is also pushing the lithography roadmap with multiple partners. With the concept and feasibility proven, innovation and consortium collaboration are essential for throughput/cost improvement and infrastructure setup. As for EUV lithography at TSMC and the EUV production tool, ~2013-14 is the potential timeframe for N20/N14 production.
Dr. Sun also touched upon the Multi-Ebeam maskless lithography at TSMC. The MAPPER pre-alpha tool (110 beam, 5KeV) has been at TSMC since July 2009. It has accomplished 10 percent beam-to-beam CD uniformity for 45-nm half-pitch and 30-nm half-pitch for contact holes. This will be upgraded to 13,000 beams for 10 WPH, and clustered for 100 WPH for N20 and N14.
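The quoted figures scale roughly linearly with beam count, which a quick back-of-the-envelope calculation illustrates (the linear-scaling assumption is mine; real throughput also depends on dose, data path and stage overheads):

```python
# Rough scaling of multi-ebeam writer throughput, assuming throughput
# grows linearly with the number of beams -- an illustrative simplification.
beams_prealpha = 110        # MAPPER pre-alpha tool at TSMC
beams_upgraded = 13_000     # planned upgrade
wph_upgraded = 10           # wafers per hour for the 13,000-beam tool
target_cluster_wph = 100    # clustered target for N20/N14

tools_in_cluster = target_cluster_wph // wph_upgraded
wph_prealpha = wph_upgraded * beams_prealpha / beams_upgraded
print(f"Cluster of {tools_in_cluster} tools; pre-alpha ~{wph_prealpha:.2f} WPH")
```

In other words, the 100 WPH target implies clustering about ten of the upgraded tools, while the 110-beam pre-alpha tool would manage well under one wafer per hour.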
He listed TSMC's manufacturing leadership as follows:
* First Si success.
* Yield and fast seamless ramp.
* Giga-Fabs at your service.
* Green fab initiative.
* Capacity leader and economy of scale.
* 450mm initiatives.
On the N40 volume production ramp, Dr. Sun said that two giga-fabs are in volume production with good D0 and device performance.
Also, two of TSMC's parallel capacity-building lines will keep it the 12" capacity leader -- Giga Fab 12 and Giga Fab 14, respectively. TSMC is determined to continue expanding capacity: Fab 12 Phase 5 -- planned tool move-in in 3Q'10; Fab 12 Phase 6 -- land secured; and Fab 14 Phase 4 -- ground-breaking in 1Q10.
Dr. Sun talked about how TSMC aims to achieve cost effective manufacturing through 450mm production as well.
Thursday, May 27, 2010
Monday, May 24, 2010
Xilinx’s ISE Design Suite 12.1 focuses on power, productivity and plug-and-play!
Early this month, Xilinx Inc. released the ISE Design Suite 12.1. I remember that last year, around the end of April, the ISE Design Suite 11 had been released, so this release should surely have something new to offer, considering that it is among Xilinx's key milestones this year.
Just for the record, Xilinx's notable milestones this year so far have been: the 28nm architecture supported by clock gating and partial reconfiguration in February 2010, followed by the release of the AMBA 4/AXI4 specifications in March. And now, the ISE Design Suite 12.1, which, incidentally, has been available since May 3, 2010.
For those who came in late, Xilinx’s ISE Design Suite 11.1 (released a year ago) was said to be the industry’s first FPGA design solution with fully interoperable domain-specific design flows and user-specific configurations for logic, digital signal processing (DSP), embedded processing, and system-level design.
What’s new with ISE Design Suite 12.1?
Now, what's new in Xilinx's ISE Design Suite 12.1? Three things -- power, productivity and plug-and-play! Xilinx's ISE Design Suite 12.1's thrust has been on improving power efficiency (or power reduction), productivity and plug-and-play capability. Let's take a look at each one of them.
On power, Xilinx claims to have achieved (or made available) a 30 percent dynamic power reduction using its innovative automated clock-gating technology. On productivity, the ISE Design Suite 12.1 offers design preservation, faster run-times and fourth-generation partial reconfiguration. On plug-and-play, the suite allows plug-and-play FPGA design with AXI4-compliant IP.
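To put the 30 percent power figure in context, dynamic power in CMOS follows P ≈ αCV²f, and clock gating works by cutting the activity factor α on idle logic. A minimal sketch (the capacitance, voltage, frequency and activity values are made-up illustrative numbers, not Xilinx data):

```python
def dynamic_power(alpha, cap_farads, vdd_volts, freq_hz):
    """Classic CMOS dynamic power estimate: P = alpha * C * V^2 * f."""
    return alpha * cap_farads * vdd_volts ** 2 * freq_hz

# Illustrative values only -- not from any Xilinx datasheet.
baseline = dynamic_power(alpha=0.20, cap_farads=2e-9, vdd_volts=1.0, freq_hz=200e6)
gated = dynamic_power(alpha=0.14, cap_farads=2e-9, vdd_volts=1.0, freq_hz=200e6)
print(f"Dynamic power reduction: {1 - gated / baseline:.0%}")
```

Here a 30 percent drop in switching activity translates directly into a 30 percent drop in dynamic power, since the other terms are unchanged.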
I will try and add some more details on the three aspects, time permitting.
Xilinx has also outlined the next steps in the ISE Design Suite roadmap. These are:
* In May 2010, it introduced intelligent clock-gating for Virtex-6 and improved the design preservation flow for timing predictability in ISE Design Suite 12.1.
* In the summer of 2010, Xilinx will offer partial reconfiguration to all users and intelligent clock-gating support for Spartan 6.
* In the fall of 2010, Xilinx will drive plug-and-play FPGA design with embedded, DSP and connectivity IP support for AXI4.
Sunday, May 23, 2010
What’s new with Mentor’s PADS 9.2?
Lots, if you look closely! In the previous PADS 9.0 flow release, a typical Suite configuration included:
* Design entry including DxDesigner, variant management, and data import/translators from competitors’ PCB design systems.
* HyperLynx pre-and post-layout signal integrity simulation, thermal analysis and analog simulation.
* PADS Layout with powerful auto-routing, high-speed rule adherence, unlimited database and layers, and design reuse.
* Advanced manufacturing rules.
* 3D Viewer and integration to multi-disciplined collaboration solutions.
This latest release of PADS seems to have gone a mile or two further. The final verdict stays with the end users.
Well, it was a great pleasure to meet up with Jim Martens, product marketing manager, PADS Solutions Group, Mentor Graphics and Yan Killy, CID, Application Engineer Consultant, Mentor Graphics, while they were on a three-city tour of India in connection with the Mentor Graphics PCB Technology Day.
Specifically, there were sessions on PADS 9.2, the newest and most powerful release of the PADS solution yet! PADS Layout is said to be the most widely used desktop PCB design solution in the world. There have also been major enhancements in DxDesigner, and the PADS flow has been extended to now include 3D viewing, power integrity, thermal analysis, collaborative design tools, and better ties to fabrication and assembly.
Live product demonstrations were the major highlight of Mentor's PCB Technology Day. If I am correct, PADS 9.2 is due for release shortly!
PADS solves design challenges in handhelds
Touching upon design challenges in handhelds, Martens highlighted a few, such as handheld vendors adding more functions in the same space. Packaging poses a challenge as well. For example, the available board area is 476mm2, while the component area is 400mm2. Other areas posing challenges include multiband connectivity, re-use of the external casing, mechanical assembly, etc.
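Those two numbers imply a remarkably dense layout, as a one-line calculation shows (using the board and component areas quoted above):

```python
board_area_mm2 = 476       # available board area quoted by Martens
component_area_mm2 = 400   # total component area
utilization = component_area_mm2 / board_area_mm2
print(f"Board utilization: {utilization:.0%}")  # roughly 84% of the board is covered
```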
Yan Killy, who also provided a demo of using PADS with consumer handhelds, added that while component management could be a problem, the PADS integrated design allows for component management. Features include wizards for fast and efficient library creation. Part attributes have been organized in standard databases, thus making it easier to find parts and manage libraries. Another feature is design data management, which offers spreadsheet views for fast editing of design data and parametric search for finding optimal parts.
PADS' new release features conceptual signal integrity analysis — where you can optimize the layer stack-up and also perform spectrum analysis for EMI potential; and advanced constraint management — users can assign net classes, differential pairs, min./max. lengths, and matched-length groups, as well as define hierarchical rules. System integration allows for tighter integration between design definition and layout for higher productivity.
Killy added that interactive route editing is one of the strengths of PADS, which features unique aids for interactive routing, as well as dynamic and manual route modes. PADS also features angle-free automatic routing as well as physical design reuse.
Another feature of PADS is its advanced packaging toolkit, which features wirebond editor, die flag generation and wirebond report generation. Of course, there is the PADS RF support as well, with features such as DXF import in layout and Decal editor, along with solder mask and CAM output improvements.
FPGA I/O Designer and Hyperlynx
The next session was on the FPGA I/O Designer, which advances FPGA on-board productivity, which was followed by sessions on Hyperlynx Power Integrity (PI) and Green Systems Design using HyperLynx Power and Signal Integrity Engine.
Friday, May 21, 2010
FPGA Camp in Bangalore discusses various industry aspects
Today, FPGACentral hosted its first-ever FPGA Camp in Bangalore. The conference mainly aimed at bringing engineers together to discuss various aspects of FPGAs — mainly next-generation FPGA technology, applications, methodology, best practices and challenges. The morning session rolled out with a session on 'Today's FPGA Ecosystem,' where the participants included Neeraj Varma, country manager — Sales, India and Australia/NZ, Xilinx India; Wai Leng Cheong, regional sales manager, South Asia Pacific, Altera Singapore; and Rakesh Agarwal, country manager, India & ANZ, Lattice.
Adrian Hernandez, senior manager, Xilinx USA, gave a presentation on ‘Mastering FPGA Design through Debug.’ This was followed by John Wei, High Speed System Specialist, Altera, Hong Kong delivering a lecture on the ‘Trends and challenges in designing with high speed transceivers based FPGAs, and signal Integrity concerns.’ The morning session was wrapped up by Srinivasan Venkataramanan, CTO, CVC, who presented on ‘Upgrading to SystemVerilog for FPGA Designs.’
A highlight of the afternoon session was a panel discussion on ‘State of FPGA technology and its adoption in India.”
Now, I am not really posting anything specifically on the sessions as these were mainly targeted toward engineers, and I, for a change, decided to simply sit back and listen to the speakers, rather than take notes.
Just a few points from here and there. For instance, Lattice's Rakesh Agarwal mentioned that the company's mid-range ECP3 is the lowest-power SerDes-enabled FPGA in the market. The company is focused on markets where it can differentiate with high-value, low-power solutions, and where it has the scale to compete effectively.
The single most important feature that one must keep in mind when designing and verifying FPGA-based projects is device reconfiguration. Xilinx's Adrian Hernandez suggested that users should build on the FPGA's reconfiguration. He called upon them to share knowledge and experiences. One of the points raised by John Wei was that an advanced oscillator and hybrid CDR enable 25Gbps in FPGAs at the 28nm CMOS process node.
SystemVerilog interfaces have quickly found their way into new designs, as they are useful for RTL designers and verification engineers. Srinivasan Venkataramanan touched upon the ecosystem around SV-FPGA, adding that all of the major EDA vendors support SystemVerilog for design.
On the event itself, Navin Kumar and his team, including the volunteers, deserve a huge round of applause for pulling off this event. It was the first of its kind in India — an open conference with free attendance. I believe more people turned up than originally expected. The turnout itself was interesting, with a mix of engineers, students and, of course, the industry.
There were some minor hiccups regarding the location/venue and the positioning of booths — some of which looked really cramped for space, etc. However, these are really very minor issues, which the FPGACentral India team is sure to address in its forthcoming events. Well done guys!
Thursday, May 20, 2010
GlobalFoundries enabling the next wave of ‘foundry’ innovation
According to Mojy Chian, Senior Vice President, Design Enablement, GlobalFoundries, continued innovation in the foundry business demands a new approach. He was speaking at the recently held International Electronics Forum (IEF) 2010 organized by Future Horizons in Dresden, Germany.
GlobalFoundries is bringing a highly integrated model to foundry, which involves the extension of customer operations, early customer-foundry engagement, as well as close collaboration and joint technology development. This would enable faster time to volume and market, leading to smooth ramps to mature yields. Chian added that design, manufacturing, and EDA/IP solutions must work in unison to accomplish this.
According to him, the industry desperately needs a new approach. Here, he discussed GlobalFoundries’ 28nm collaborative innovation, which involves four phases.
Phase 1: Exploration
* For advanced technology, foundry engagement begins 2.5 years before product tapeout.
* Starts with exploration of design architecture, specification, and methodology.
* Foundry value proposition drives corresponding process selection.
* Early engagement locks in the process to the customer’s design requirements.
Phase 2: Optimization
* Design architecture and IP development begins.
* Performance, power, density, cost, TTM targets analyzed.
* Trade-off analysis of design and process targets, TTM, and manufacturability.
* Design and process technology are co-optimized.
Phase 3: Iteration
* Initial process and design test structures taped out.
* Process targets frozen – PDK 0.1 is released.
* Design implementation methodology finalized.
* Design performance and power targets are defined.
Phase 4: Implementation
* Concurrent and target-driven process and design implementation.
* Fine tuning of process and design – design implementation in high gear.
* Test chips taped out, chip level validation, PDK 0.2, 0.5, 0.9, and 1.0 released.
* Incremental march towards process qualification and risk production.
The ultimate goal: process qualified on same day as tapeout! Target-driven technology development and design enablement are at the core of accelerating time-to-market.
Design Enablement at GlobalFoundries is said to be a unique approach to bridging the gap between design and manufacturing, with close collaboration and early engagement. Although product engineering heritage is not typically found at foundries, GlobalFoundries has been able to draw on best practices from Chartered's foundry experience.
All of this leads to optimized performance, leakage and yields; accelerated time to market; reduced design and manufacturing risks; and integrated system-level functionality.
Earlier, Chian said that while design starts may be slowing, advanced technology continues to drive innovation and provide new value. Also, while the R&D costs may be rising, so is the revenue for advanced technology.
Changing landscape; close relationship with customers
There is a changing landscape at the leading edge. One, industry design rules are no longer binary. Also, passing design rules does not guarantee robust manufacturing. Pushing design rules can provide denser designs. There is an increasing need for interdependency of design and process. Also, design-technology co-optimization is essential. Chian added that the leading-edge chip designs are unique and require equally flexible design solutions.
Here, he highlighted GlobalFoundries' attempt at delivering differentiated design solutions. According to him, close partnership with customers enables a collaborative development environment. There is early engagement that influences product planning and architecture. Not only these, it facilitates access to technology and design enablement experts as well, and leads to the establishment of trust and intimacy at the technical level.
Further, there is availability of customized solutions that maximize ROI. Also, you can optimize design/process to meet requirements. There are concurrent design and process development to improve TTM and TTV — also leading to interim PDK releases, as well as version 0.1 based IP releases.
Close partnership with the customer also optimizes the interface of design and process, leading to improved yields and increasing the competitive advantage.
Wednesday, May 19, 2010
Plastic Logic's QUE proReader looks to mean business!
Konrad Herre, VP Manufacturing, Plastic Logic, gave a very interesting presentation on the commercialization of plastic electronic technology -- specifically, new product segments based on organic electronics, at the recently held International Electronics Forum (IEF) 2010, organized by Future Horizons in Dresden, Germany.
The company has been in the news for its QUE proReader (eReader), which it claims is a milestone in the evolution of plastic electronics. Electronic reading will be natural, easy and comfortable while using the QUE, according to Herre. The QUE proReader features a unique form factor and a large display, and is lightweight, thin, and even shatterproof. It has an intuitive user interface featuring touch navigation. Obviously, a main strength of the product is its use of powerful software tools, which enable notes, mark-up, zoom, etc. Of course, the wireless download capabilities provide easier access to content. The QUE uses a Li-ion battery.
The QUE ProReader has been built using Plastic Logic's unique plastic electronics technology. The result is a stunning form factor that is sleek, lightweight and incredibly easy-to-read. Its touchscreen-based interface is elegant and easy-to-use.
Plastic Logic is a 2004 spinoff from Cambridge University's Cavendish Laboratory, with proof-of-concept validation completed the same year. The company started focusing on flexible display module manufacturing in 2006. The groundbreaking in Dresden happened in 2007, and the first displays from the factory appeared in 2008, with the production ramp slated for 2010. Product trials took place in 2009, with launch expected sometime this year.
Touching upon the industrialization of the flexible display concept, Herre said there had been private funding of over $100 million for a production facility. Over 200 locations were evaluated globally before the company decided to develop a manufacturing facility in the 'Silicon Saxony' region of Dresden, Germany, starting in May 2007, for volume production in 2009.
Why select Dresden as a location? Plastic Logic cited reasons such as excellent local support in all areas (ground, workforce, regulations, etc.), experience and infrastructure for R&D and volume manufacturing, development grants, and a good industrial area for development. Perhaps the fact that the IEF 2010 was held here should be proof enough!
I later checked the site que.com and found two versions of the QUE proReader -- one with 4GB and Wi-Fi for $649, and one with 8GB, Wi-Fi and 3G for $799.
Product specifications for the Wi-Fi and 3G model are:
Connectivity: Cellular (GSM), Wi-Fi (802.11 b/g), USB, Bluetooth 2.0.
Memory: 8 GB (Approx. 7.5 GB available for user data).
Display (active area): 10.7" diagonal, 960 x 1280 pixels at 150ppi, 8 gray levels.
User Interface: Full touchscreen, Virtual keyboard.
Battery: Rechargeable lithium-ion battery, charging via computer or wall charger.
Dimensions: 8.5" x 11" x 0.3".
Weight: Approximately 17 ounces.
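As a quick sanity check on the display specification above, the quoted resolution and pixel density are self-consistent with the 10.7-inch diagonal. This back-of-the-envelope sketch is my own, not from the presentation:

```python
import math

# QUE display spec: 960 x 1280 pixels at 150 ppi
w_px, h_px, ppi = 960, 1280, 150

diag_px = math.hypot(w_px, h_px)  # diagonal in pixels: 1600 (a 3:4 panel)
diag_in = diag_px / ppi           # diagonal in inches

print(round(diag_in, 1))          # -> 10.7, matching the quoted 10.7" diagonal
```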
It can be used with a BlackBerry smartphone over Bluetooth. With the QUE software on a Windows PC, Mac or BlackBerry smartphone, a user can quickly and easily convert and transfer documents to the QUE. Now for that long-due product launch!
Tuesday, May 18, 2010
New frontiers in MEMS around the human body
Staying with Future Horizons' International Electronics Forum (IEF) 2010 held recently in Dresden, Germany, Benedetto Vigna, Group Vice President and General Manager, MEMS, Sensors and High Performance Analog Division, STMicroelectronics, made a wonderful presentation on how MEMS can be useful for the human body, especially from the medical electronics point of view.
MEMS (microelectromechanical systems) is a three-dimensional device embedded in silicon, and uses silicon’s mechanical (and electrical) properties. It supports multifunctional systems of actuators, electronics and sensors.
Three critical waves of MEMS
Vigna highlighted the three very important waves of MEMS -- automotive airbags, consumerization, and MEMS in, on, around the body! The last part is the most interesting one!
Automotive airbags formed the 1st wave of MEMS. The application supported big and not-so-precise accelerometers. Additional automotive applications followed, such as tyre pressure sensors and stability control. Vigna heralded consumerization as the 2nd wave of MEMS. There have been high-volume fabrication techniques, leading to higher performance/greater reliability at lower costs.
He also pointed out the Wii effect! In this case, the high-volume commitment of vendors + UI benefits led to consumerization of MEMS.
Vigna added that MEMS has seen a speeding spiral of success in recent times. Earlier, it took 25 years from labs to fabs. Now, three product generations are developed and released in 12 months!
Another example of the 2nd MEMS wave is the move from keyboard and mouse to free motion. In this case, MEMS sensors change interaction with consumer electronics and propel new applications. There are now:
* Motion user interfaces in phones, games and remotes.
* Advanced navigation and location-based services.
* Free-fall protection in portable devices.
MEMS market
Vigna focused a moment on the MEMS motion sensors market 2009-2013 and the MEMS market. As far as the MEMS motion sensors market is concerned, accelerometers are likely to grow at 14.5 percent CAGR for the period 2009-2013. On the other hand, gyroscopes are likely to grow at 17.3 percent CAGR during 2009-2013.
Cell phones and CE form the major market segment in both cases, registering 19.5 percent CAGR and 25.4 percent CAGR, respectively, followed by automotive at 10.7 percent CAGR and 12.3 percent CAGR, respectively.
It is to be noted that in 2009, the overall MEMS market was almost flat compared to 2008, but volumes rose significantly, showing increasing penetration of MEMS in consumer devices.
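To make those CAGR figures concrete, here is a minimal sketch of how a constant annual growth rate compounds over the 2009-2013 window. The base value of 100 is an arbitrary index for illustration, not a revenue figure from the talk:

```python
def project(base, cagr, years):
    """Compound a base-year value at a constant annual growth rate (CAGR)."""
    return base * (1 + cagr) ** years

# Indexing 2009 to 100 (illustrative only), over the four years 2009-2013:
accel_2013 = project(100, 0.145, 4)  # accelerometers, 14.5 percent CAGR
gyro_2013 = project(100, 0.173, 4)   # gyroscopes, 17.3 percent CAGR

print(round(accel_2013, 1))  # -> 171.9, i.e. ~72 percent cumulative growth
print(round(gyro_2013, 1))   # -> 189.3, i.e. ~89 percent cumulative growth
```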
Current trends in MEMS
Coming to current trends, MEMS is now pushing the limits of size and power -- motion sensors are squeezing the footprint to 2x2 mm and current consumption to well below 10uA in full operating mode. Multiple sensor integration is another trend: the integration of motion, magnetic, pressure and temperature sensors in a single package brings more degrees of freedom.
Embedded intelligence is the third key trend. On-chip processing capabilities are enabling smart autonomous sensors and decreasing power consumption at the system level. Finally, software is now the 'S' in MEMS! Vigna said that hardware and software integration is a key added value and differentiating factor.
Third wave of MEMS
Now, to the critical and most exciting third wave of MEMS -- in, on, around the human body!
If you look at today's population trends, the population is not only increasing, people are also living longer. The global average age has been rising, as has the cost of healthcare. In fact, healthcare is estimated to account for ~25 percent of the US GDP by 2025.
So, what can MEMS do for healthcare? Actually, a whole lot of things! Some of these are sensing -- motion, molecular detection and pressure, temperature cycling, microfluidics -- pumps and valves, electrophoresis and energy capture. Vigna also gave an example of the bio-to-bit networks as the third wave of MEMS, which actually makes home as the point of care!
There are some common themes in this third MEMS Wave. For instance, connectivity leads to low power wireless data transmission, with improved accuracy, reliability and efficiency of diagnosis and treatment.
The applications of MEMS in its third wave in healthcare are themselves eye catching! Some of these are:
DNA/RNA analysis -- a disposable lab-on-chip for rapid, point-of-care diagnosis. Then, Nanopump diabetes management -- the nanopump has been developed by Debiotech and ST.
Further, remote patient monitoring -- a remote heart monitor by ST and Mayo Clinic. It features fall and ‘smart’ motion detection.
Next, the wireless body sensor network. We already have the SensAction AAL, an EU-funded pilot of a remote motion monitoring system for the elderly. A wearer’s movements are tracked and communicated wirelessly to a PDA/PC. It is using ST’s MotionBee wireless sensor technology.
Glaucoma detection is yet another application -- a 24-hour disposable contact lens with pressure sensor, courtesy, ST and Sensimed. And lastly, smart micro-robotics for less invasive surgeries.
Sunday, May 16, 2010
Providing ‘real solutions’ will be next challenge for IC suppliers
At the recently held International Electronics Forum (IEF 2010) in Dresden, Germany, Rich Beyer, chairman and CEO, Freescale Semiconductor, highlighted that the “need for providing ‘real solutions’” would be the next challenge for the various IC suppliers.
Increasing complexity means that the OEMs are now relying heavily on the IC suppliers for system-level support and software development. Also, connected intelligence is really blurring the traditional market boundaries. This requires system-level expertise combined with the knowledge of multiple market technologies.
There is also a great need for innovation teamwork, which would require focusing on the entire product value chain — starting from definition and design on to software and support. Delivering ‘real solutions’ would involve wrapping the ecosystems around OEM application expertise to create value through differentiation.
More details later!
Thursday, May 6, 2010
Thrive or survive…going for gold in post-recession recovery: Malcolm Penn @ IEF2010, Dresden
According to Malcolm Penn, chairman and CEO, Future Horizons, 2010 — a barnstorming year — will likely see the global semiconductor industry grow by 31+ percent. He was delivering the company’s forecast at the ongoing 19th International Electronics Forum (IEF) 2010 in Dresden, Germany, which ends here tomorrow. He said it would take a disaster of the scale of Lehman Brothers to derail this now!
Some of the other forecasts made by Malcolm Penn include:
* 2011: +28 percent; based on: peak of the structural cyclical boom (could stretch into 2012).
* 2012: +18 percent; based on: normal cyclical crash cycle starting 2H-2012 (1H-2013?).
* 2013: +3 percent; based on: market correction in full flow (could be negative, depending on cap ex overspend and inventory build).
* 2014: +12 percent; based on: start of the next cyclical recovery (single digit, if 2013 is negative).
Given the now unavoidable 2010-11 fab shortage, the growth upside for 2010-12 is huge!
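Taken at face value, Penn's year-on-year forecasts compound as shown in this illustrative sketch. The ~$300 billion 2010 base is the figure Future Horizons cites for its 2010 forecast; the resulting dollar amounts are my own arithmetic, not numbers from the talk:

```python
# Penn's forecast year-on-year growth rates, compounded from
# an assumed ~$300 billion 2010 market size
growth = {2011: 0.28, 2012: 0.18, 2013: 0.03, 2014: 0.12}

market = 300.0  # $ billion, 2010 base
for year, rate in growth.items():
    market *= 1 + rate
    print(year, round(market, 1))
```

Compounded this way, the forecasts imply a market of roughly $520 billion by 2014 (assuming the +3 percent 2013 figure rather than its possible negative variant).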
The forecast track record of Future Horizons is quite interesting. As per forecasts made during IFS2010 in January 2010, the chip fundamentals were said to be in very good shape. The industry was starting its recovery with shortages. Also, the ASPs had already stopped falling. Inventory levels were at an all-time low. Finally, capacity was tight, and spending, weak!
All of this added up to two years of very strong growth in prospect. Penn had said: “It doesn’t get much better than this.” But, despite what the numbers say, still no one believes beyond the next quarter! “Ah but” is still driving the industry consensus!
Industry fundamentals don’t lie — believe in them or die! The capacity famine was instigated two-plus years ago — well before the crash — so today’s shortage was inevitable. The recovery dynamics will continue to strengthen. Future Horizons’ forecast is now +31 percent, ~$300 billion. The next crash dynamic has still not been triggered; it is unlikely to happen before 2011, meaning a 2012 impact. However, economic uncertainty remains the biggest risk. Also, the global financial system is fundamentally flawed.
Current industry status and outlook
There is said to be underlying fear, uncertainty and doubt. Some of the reasons are:
* Is unit demand overheating or sustainable?
* Is inventory starting to get out of control?
* Will the economy slip back into recession?
* Is the capacity crunch a blip or more fundamental?
* What will happen to the economy when the stimulus funding stops?
* Will the end market demand hold up or slip back?
All of these have been going around, largely due to the loss of collective confidence, after 'five bad years' of industry growth.
According to Penn, risk aversion is not risk management. As recessions draw to a close, few are ready to believe that it is over, exaggerating the catch-up reactions and amplifying the cycle peaks and troughs. He cautioned that fears of a 2H-2010 dip will exacerbate the next crash cycle. By the time the industry waits for 2010 clarity (September?), it could be too late to rescue 2011. Well, if you really can't stand the semiconductor heat, it may be best to consider a career change!
Now let's analyze the industry fundamentals -- economy, unit demand, fab capacity and ASPs, respectively.
Regarding the economy, the industry had entered the recession in structurally good shape. The 2009 chip market was the victim, not the cause, of the recession. While the chip market depends on the economy, it marches to its own drum. It must be pointed out that the chip market actually recovered much faster than the economy. Asia, and not the USA, is now driving global GDP growth. Also, business is set to replace the consumer as the global growth driver.
Forecast health warning #1: Economic disruption will derail the chip market. The only questions are: "by how much and for how long?"
With regard to unit demand, riding the IC unit shipments trend wave takes judgement. The monthly run rate varies dramatically from the trend line. It is impossible to balance supply with demand (demand changes in days; supply changes take months). A mismatch makes it ‘feel’ that capacity expansion is out of control.
Forecast health warning #2: Need to watch inventory plus double ordering in tight demand cycles.
Turning attention to fab capacity, long-term wafer supply security is said to be fundamental. It is the fundamental fabless (fab-lite) Achilles heel, and the reason that the FSA (now GSA) was formed in 1994.
Kit ordered today equals units out one year later. Then, there’s the ramp-up time. The next four quarters’ capacity is cast in stone. Even if we splurged on cap ex today, there would be no new sales impact until May 2011. Therefore, 'sell the kit today' means capacity two quarters later.
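Penn's lead-time argument reduces to simple quarter-offset arithmetic. In this sketch, the two- and four-quarter offsets are the talk's rules of thumb, and the Q2-2010 order date is an assumption matching the May 2010 timing of the forum:

```python
def quarters_later(year, quarter, n):
    """Return the (year, quarter) that is n quarters after the given one."""
    total = quarter - 1 + n
    return year + total // 4, total % 4 + 1

order = (2010, 2)  # kit ordered in Q2-2010 (assumed 'today')

print(quarters_later(*order, 2))  # capacity online ~2 quarters later: (2010, 4)
print(quarters_later(*order, 4))  # new IC sales ~4 quarters later: (2011, 2)
```

The four-quarter offset lands in Q2-2011, consistent with Penn's point that cap ex spent today yields no new sales impact until May 2011.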
On the semiconductor equipment sales trends, there have been two years of under investment. No amount of productivity gains, second-hand equipment sales, cannibalisation, or other ‘tricks’ will compensate for this decline.
Looking at the front end book-to-bill, equipment orders placed today equal new IC sales four quarters later. The Q1 cap ex growth won't have an impact until Q1-11. The front end book-to-bill has never been so low for so long.
As for the cap ex spend and book-to-bill impact, the 2009 cap ex was down 48 percent on 2008 (down 31 percent on 2007). The 2010 cap ex is up 80 percent on 2009 (which is back to 2008's level). The 2010-11 capacity is condemned to leading edge famine.
Looking at the medium-term new capacity outlook, there is still no serious capacity build out yet in prospect!
If you look at the global MOS wafer fab capacity, Q1-09 was -8 percent, the biggest quarterly capacity fall in semiconductor history. Q2-09 was not much better at a further -2.7 percent. Q3-09 ‘bottomed’ at -0.7 percent. The Q3-09 capacity was down 12.5 percent on Q3-08’s peak and not increasing. Also, the Q3-09 demand was down only 10.6 percent, but recovering.
Comparing the capital spend and the semiconductor market, cap ex was well below the 20 percent trend line (2008 = 12 percent; 2009 = 7 percent). Even the 80 percent cap ex growth in 2010 is too low (still only 11-12 percent).
Looking at the MOS capacity build-out by wafer size, there has been no change in the volume ramp profile. In the MOS capacity mix by feature size, 35 percent of capacity is <55nm, vs. 18 percent this time last year (54 percent is 65nm and under).
On the supply/demand balance, he said that during 2010, new capacity addition will be minimal. The era of cheap and readily available wafers is over. Excess capacity had zeroed out in Q4-2009, exactly as forecast!
Forecast health warning #3: Need to watch front end cap ex spending in 2011. Excess capacity will kill the market stone dead!
Finally, turning to ASPs, Penn said these are the least understood industry wild card. ASPs can’t really keep falling forever. He listed five factors, all of which have already run their course. These were:
* 130nm yield bust … destroying the node’s ASP price enhancement.
* 300mm wafer transition.
* Two to three year memory price war -- a result of 300mm conversion.
* Brutal 32-bit MPU price war -- ASPs fell from $100 to $70.
* The overall price pressure due to excess capacity.
According to Penn, there is lots of structural ASP recovery potential.
Forecast health warning #4: ASPs are diagnostic and complex -- systemic (Moore's Law); structural (capacity/wafer size); and sabotage (price wars).
Wednesday, May 5, 2010
EDA360 unplugged with Cadence's Jaswinder Ahuja
Following the announcement of the EDA360 last week, I managed to get in touch with Jaswinder Ahuja, corporate vice president and managing director, Cadence Design Systems (I) Pvt Ltd.
We discussed a variety of topics such as: why the EDA industry is at the crossroads, EDA360 unplugged, the integrators vs. creators concept, the IP stack and the road ahead for EDA360.
First, why is the EDA industry at the crossroads?
According to Ahuja, if you look at the evolution in the electronic design world, systems companies are finding differentiation and value through the creative, innovative applications or “apps” that are being demanded by end consumers. This is true not only in the mobile handset world, where iPhone and Android are obvious examples, but anywhere there’s a processor. Therefore, software is becoming a very important part in the scheme of things.
"Semiconductor companies are being asked by system companies to provide the hardware platform as well as the software that will run on that particular platform. That is the trend that Cadence is seeing today, and that is what is discussed in the EDA360 manifesto," he added.
EDA is at a crossroads because EDA companies can no longer provide tools only for IP integration and silicon realization, as they have been doing all these years. EDA now has to encompass SoC realization (including bare-metal software) and then move toward system realization, which includes mechanical/board design, he noted.
EDA360 and its key features
As mentioned earlier, EDA360 is a five-year vision defining the trends in the EDA industry, based on what Cadence is observing and the direction in which it feels the industry will go.
Ahuja said that EDA360 represents System Realization, the development of a complete hardware/software platform ready for applications development; SoC Realization, the creation of a single SoC including hardware-dependent software; and Silicon Realization, which includes complex digital, analog, and mixed-signal designs.
The traditional approach to system development starts with the hardware, and appends the software and the applications later. With application-driven System Realization, designers start by envisioning the applications that will run on the system, define requirements, and then work their way down to hardware and software IP creation and integration. This flow requires some new and expanded capabilities.
Part of system realization is project management. EDA360 reaches beyond engineering teams to help customers meet project and business objectives.
Key features of EDA360 include:
* Outlining how companies can bridge the profitability gap, not just the productivity gap.
* Explaining the shifts to integration and profitability.
* Software-aware SoC realization, which includes an integrated, optimized IP stack.
The four chapters of the EDA360 manifesto take a look at:
Chapter 1: EDA Industry Focus Shifts to Integration and Profitability.
Chapter 2: Application-Driven System Realization.
Chapter 3: Software-Aware SoC Realization.
Chapter 4: EDA360 Enables Silicon Realization.
Integrators vs Creators concept
One of the key points made in the EDA360 is that the EDA industry to date has only served the needs of creators. It has almost completely ignored integrators, who need a different set of tools and capabilities.
Ahuja said that the fundamental manner in which electronic design is being done is now changing. "Systems companies are demanding that their semiconductor suppliers provide not just silicon but application-ready hardware/software platforms.
Semiconductor makers, meanwhile, are facing projected SoC development costs of $100 million at 32 nm and below. One result of these pressures is that fewer companies will be design creators and more will become integrators who make heavy use of pre-designed IP, including both hardware and software. It is only a reflection of the evolution of the industry."
The needs of integrators are different from those of creators. Thus far, EDA has focused almost exclusively on creators. While EDA360 continues to serve creators, it also brings new tools and methodologies to integrators.
He added: "Creators are most concerned about a productivity gap. EDA360 will help close that gap through better design, verification, and implementation approaches. Integrators are more concerned about a lesser-known profitability gap. EDA360 will help close that gap by enabling integration-optimized intellectual property (IP) creation and selection, IP integration into SoCs and systems, and system cost optimization."
According to him, EDA360 is a comprehensive new vision and call-to-action for the electronics industry to address a disruptive transformation—a shift in focus from design creation to integration. It attempts to define where the industry is going.
Bringing profitability gap into the equation
Now, how does the profitability gap come into the equation?
Ahuja said: "EDA largely focuses on the design community which, until recently, has been mainly concerned about productivity. The productivity gap was the difference between what could be accomplished and what actually was. To the chip company, that gap was missed opportunity and lost value. Therefore, so far the focus of EDA companies has been on helping design managers address the productivity gap."
However, because of the growing complexity and integration of designs that we see today, the semiconductor ecosystem now has to address the profitability gap, not just the productivity gap. EDA360 acknowledges the importance of the overall ecosystem, and looks at the key drivers that make a company successful. No one company can do it alone. EDA360 requires a collaborative ecosystem including EDA vendors, embedded system providers, IP providers, foundries and customers.
EDA360 dashboard
Part of EDA360 is providing new tools that offer a “dashboard” to help companies manage system development projects, and provide metrics that make sense in hardware and software engineering environments. The metric-driven verification capability available now points the way to what needs to be done.
Today, design and verification managers can create an executable verification plan that identifies key project metrics, executes simulation engines, and tracks coverage metrics. They can then review reports and charts that will help them manage a “plan to closure” verification process, and determine when verification is done. As a result, verification resources are used effectively and overall costs are reduced, helping close the profitability gap.
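The "plan to closure" loop described above is easy to picture in miniature. The sketch below is purely illustrative -- the metric names, targets and closure rule are invented for this example, and are not taken from any Cadence tool:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str      # a coverage category from the verification plan
    target: float  # goal, in percent
    actual: float  # coverage achieved so far, in percent

def closure_report(plan):
    """Report whether every metric has hit its goal, plus average coverage."""
    done = all(m.actual >= m.target for m in plan)
    average = sum(m.actual for m in plan) / len(plan)
    return done, average

plan = [
    Metric("code coverage", 95.0, 97.2),
    Metric("functional coverage", 90.0, 78.5),  # still short of its goal
    Metric("assertion coverage", 100.0, 100.0),
]

done, average = closure_report(plan)
print(f"verification done: {done}, average coverage: {average:.1f}%")
```

Tracking metrics this way is what lets a manager declare verification "done" on evidence rather than gut feel, which is the point being made here.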
The IP stack
While System Realization produces a complete hardware/software platform ready for applications deployment, SoC Realization ensures the successful development of a single SoC to meet system needs. Typically, SoCs are considered to be “done” when the silicon is completed. In the EDA360 view, however, SoC Realization is not complete without software device drivers for each hardware subsystem.
"We believe these drivers should be developed with the SoC rather than tacked on later—and that leads to a completely new view of how silicon IP should be provided," added Ahuja.
"Instead of thinking of IP as isolated “blocks,” we propose an IP stack that includes “bare-metal software” as well as hardware IP. Bare-metal software refers to everything below the OS layer, and the most prominent feature of bare-metal software is device drivers." The IP stack described here also includes verification IP (VIP) that validates IP functionality and integration. The stack may include hard macros with fixed layouts along with synthesizable IP at the register-transfer level or the transaction-level modeling (TLM) level. It also includes design constraints.
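As a thought experiment, the IP stack Ahuja describes can be modeled as a simple record. The field and file names below are my own shorthand for illustration, not Cadence terminology:

```python
from dataclasses import dataclass, field

@dataclass
class IPStack:
    """Toy model of an EDA360-style IP stack: hardware IP bundled with
    its bare-metal driver, verification IP, and design constraints."""
    hard_macro: str             # fixed-layout block, if any ("" if none)
    rtl: str                    # synthesizable RTL or TLM model
    bare_metal_driver: str      # software below the OS layer
    verification_ip: str        # VIP validating function and integration
    constraints: list = field(default_factory=list)

    def integration_ready(self):
        # In this view, IP is not "done" until its driver and VIP ship with it.
        return bool(self.rtl and self.bare_metal_driver and self.verification_ip)

usb = IPStack(
    hard_macro="usb_phy",
    rtl="usb_ctrl.v",
    bare_metal_driver="usb_driver.c",
    verification_ip="usb_vip",
    constraints=["usb.sdc"],
)
print(usb.integration_ready())  # True: all layers of the stack are present
```

The design point the model captures is that an IP block missing its driver or VIP fails the readiness check, which is exactly the shift from "blocks" to "stacks" that the manifesto argues for.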
What next for EDA360?
EDA360 is entirely Cadence's own initiative.
"You will see over the period of time that EDA360 will be backed up by several announcements. Some of them have already been made – for example, the announcements around Palladium XP and the partnership with Wind River. Watch out for more announcements shortly," he concluded.
Tuesday, May 4, 2010
EDA360 to help integrators close profitability gap
Cadence Design Systems laid out its EDA360 vision last week -- what it says is 'a new vision for the semiconductor industry'.
The EDA360 vision paper says: "Today, systems and semiconductor companies are undergoing a disruptive transformation so profound that even the best-known companies will be impacted. The EDA industry now stands at a crossroads where it also must change in order to continue as a successful, independent business. Without that change, EDA will become a fragmented industry offering suboptimal, poorly targeted solutions that fail to solve customer problems. As a result, the huge leap forward provided by the electronics revolution will come to a standstill. The result? A squandered opportunity for technology innovation, and a diminished contribution by the electronics industry to re-build the global economy."
You can download the vision paper from eda360.com, if you like!
What is EDA360?
Now, what exactly is EDA360? It is a five-year vision that acknowledges the trends being observed in the EDA marketplace and looks at how EDA can add value in the future.
The vision paper is essentially looking at where EDA should be heading over the next five years.
The four chapters of the EDA360 are:
* EDA industry focus shifts to integration and profitability;
* Application-driven system realization;
* Software-aware SoC realization; and
* EDA360 enables silicon realization.
Why is the EDA industry at crossroads?
If you look at the evolution in the electronic design world, application-level system design has been happening. Software is becoming a very important part in the scheme of things. Hardware companies are being asked to provide the hardware platform as well as the software that will run on that particular platform. So, this is an ongoing evolution.
The EDA industry to date has only served the needs of the creators. It has almost completely ignored integrators, who need a different set of tools and capabilities. So, how can the EDA360 go about achieving this?
When one says that the EDA industry has so far only served the needs of the creators, it is only a reflection of the evolution of the industry. The fundamental manner in which electronic design is being done is now changing. While it is shifting, it also takes a while to understand the entire paradigm. The industry is also moving toward IP re-use, etc. -- those are all the shifts.
The industry is now said to be looking at a new paradigm: integration-ready IP. What the vision paper does is look at where the industry is heading and spell out what is needed -- this is what the integrators will require in the future.
Bridging the profitability gap!
The EDA360 also talks about the growing “profitability gap” in the electronics industry that has rarely been discussed.
The 'profitability gap' in the electronics industry is a reflection of this evolution. EDA largely focuses on the engineering community -- who, in turn, are worried about their productivity. Going forward, there is definitely a need to bridge the profitability gap. EDA360 acknowledges the importance of the overall ecosystem.
The paper says the industry must create, integrate and optimize for profitability at the end. It notes: "Closing the productivity gap requires innovative and differentiated capabilities in design, verification, and implementation. EDA 360 needs to deliver the following capabilities to close the productivity gap." These capabilities are:
* Design for productivity.
* Verify for productivity.
* Implement for productivity.
The EDA industry also needs to respond to support application-driven system realization. An example is Android from Google. System integration is all in the context of the application. For instance, there's a need for application development kits, drivers, etc.
EDA 360 dashboard
Part of EDA360 is providing new tools that offer a “dashboard” to help companies manage system development projects, and provide metrics that make sense in hardware and software engineering environments.
Now, designers are familiar with Cadence's methodologies. There is something called the V (verification) Manager. One of the biggest problems that verification teams contend with is whether they have done enough verification and whether the product is ready to go. This is a closed-loop approach to that particular problem. All the tools work together, and everything is integrated.
Open integration platform
Chapter 3 of the EDA360 talks about the open integration platform at the highest level, and optimized IP.
The paper says: "Instead of thinking of IP as isolated “blocks,” we propose an IP stack that includes “bare-metal software” as well as hardware IP. Bare-metal software refers to everything below the OS layer, and the most prominent feature of bare-metal software is device drivers. The IP stack depicted below also includes verification IP (VIP) that validates IP functionality and integration. The stack may include hard macros with fixed layouts along with synthesizable IP at the register-transfer level or the transaction-level modeling (TLM) level. It also includes design constraints."
The same chapter also talks about the open integration platform. It says that SoC integration involves three key steps:
* Analyze the architecture.
* Develop or source integration-optimized IP.
* Integrate IP to realize the SoC.
Enabling silicon realization
The last chapter talks about how the EDA360 enables silicon realization. According to this chapter, the three concepts to silicon realization are:
* Merge top-down design and bottom-up design.
* Raise the level of abstraction.
* Apply unified design intent.
The vision paper concludes, saying: "The newer challenge is helping integrators close the profitability gap. Here, EDA 360 will provide solutions that make IP re-use easier, allow “iterative correctness,” optimize cost, and manage change across external suppliers and internal teams."
Well, this is entirely Cadence's initiative. The industry is quite likely to see, over a period of time, the EDA360 backed up by several announcements. Cadence is said to already have a partnership with Wind River, and more announcements will likely happen this week.
It makes one wonder whether the other EDA companies have such vision papers and, most importantly, whether all of them should not be working together to take the EDA industry several notches higher in the future.
Update on global semicon sales forecast estimates: Cowan’s LRA model
This is a continuation of my coverage of the fortunes of the global semiconductor industry. I would like to acknowledge and thank Mike Cowan, an independent semiconductor analyst and developer of the Cowan LRA model, who has provided me the latest numbers.
Here are the latest forecast results for 2010 global semicon sales estimates associated with the forecasting model — the Cowan LRA model for predicting worldwide semicon sales.
These latest forecast numbers are displayed in the second table below and are based upon the Mar. 2010 actual sales result of $26.533 billion, recently posted on the WSTS website, with a corresponding 3MMA of $23.060 billion. Actual 1Q/10 sales came in at $69.101 billion.
It should be noted that there were very minor upward revisions to both the January and February actual sales numbers relative to the previously published monthly global S/C sales numbers, as highlighted below. Source: Cowan LRA Forecasting Model (May 2010).
The updated sales forecast estimate for 2010 — $294.98 billion — shows a drop from last month’s forecast estimate — $298.88 billion. This corresponds to a decrease in the year-over-year 2010 sales growth estimate of 1.8 percentage points, namely from 32.1 percent to 30.3 percent.
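As a sanity check on those growth figures: taking 2009 worldwide sales as roughly $226.3 billion (my assumption, based on approximate WSTS full-year 2009 sales -- not a number from the model itself), the two forecast estimates reproduce the quoted growth rates:

```python
# Year-over-year growth implied by the two 2010 forecast estimates.
# BASE_2009 is an assumption (~WSTS 2009 worldwide sales), not a model input.
BASE_2009 = 226.3  # $ billion

def yoy_growth(forecast_billion, base=BASE_2009):
    """Percent growth of a 2010 forecast over the assumed 2009 base."""
    return round((forecast_billion / base - 1.0) * 100, 1)

print(yoy_growth(298.88))  # last month's estimate
print(yoy_growth(294.98))  # updated estimate
```

Under that assumed base, the two estimates work out to 32.1 percent and 30.3 percent respectively, matching the update above.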
Monday, May 3, 2010
ST intros STM32L EnergyLite ultra-low-power MCUs
STMicroelectronics recently launched the STM32L EnergyLite ultra-low-power MCUs. I caught up with Vinay Thapliyal, technical marketing manager, MCUs, ST India, to learn more.
The highlights of the STM32L series of MCUs include a commitment to ultra-low power -- the EnergyLite platform is common to the 8-bit (STM8L) and 32-bit (STM32L) MCUs. The series is also strong on pure energy efficiency, with high performance combined with ultra-low power, i.e., high energy saving. Finally, this ultra-low-power member of the STM32 portfolio enriches both the STM32 ultra-low-power EnergyLite platform and the STM32 portfolio.
According to Thapliyal, STMicroelectronics has been involved in the MCU market for a long time. Of late, it has started focusing on the STM32 -- the ARM Cortex-based MCU -- and the STM8, its 8-bit family. "We have started converging our old families into these two domains," he added.
The STM32F is the foundation of the STM32 family -- a family of low-power MCUs based on the 32-bit ARM Cortex-M3 architecture. The STM8 is a family of MCUs based on ST's proprietary architecture. The STM32L is STMicroelectronics' ultra-low-power family, mainly used for portable and very-low-power applications.
The ultra-low-power EnergyLite platform, featuring the STM32L and the STM8L, is based on STMicroelectronics' 130nm ultra-low-leakage process technology. The two families share common technology, architecture and peripherals. The STM8, which was launched in 2009, has caught on very fast. It is a high-performance, low-cost MCU.
He added that STMicroelectronics started with 130nm technology, and low pin count and low flash on STM8, while higher memory and high pin count is available on the STM32.
Common features across all STM8L and STM32L devices include:
* Multiple communication peripherals: up to 3 x USART, 2 x SPI, and 2 x I2C.
* Up to 8 x 16-bit timers.
* Internal 16MHz and 38kHz RC oscillators.
* 2 x watchdogs.
* Reset circuitry POR/PDR.
* 2 x comparators.
* Temperature sensors.
Targeted applications of the STM8L/32L include portable medical electronics, alarm systems, metering, general portable devices, factory automation, mobile devices and sensors.
Some more details later!