Justin Rattner, VP, Senior Fellow and CTO, Intel Corp., speaking at the recently held CXO Forum organized by the India Semiconductor Association (ISA), highlighted the various innovations Intel has created over the years and continues to create.
There are some hurdles to innovation. For instance, success brings conservatism, and then there is the curse of high-volume manufacturing -- the mindset that led to Andy Grove's famous statement: "Only the paranoid survive." Obviously, massive inertia and innovation do not blend. Rattner also mentioned certain external hurdles, especially anti-innovation policies and standards that hamper innovation.
Moore's Law drives innovation
Moore's Law has been driving innovation at Intel -- in the form of high-k metal gate transistors at 45nm, the first new transistor architecture in 35 years. There is more to come: phase-change memory (PCM) below 45nm, non-planar tri-gate transistors beyond 32nm, and carbon nanotube transistors.
Some recent multi-disciplinary innovations include the 45nm Core 2 Duo, the Nehalem microarchitecture, power management, quad core through package technology, Silverthorne/LPIA, USB/PCI Express, vPro, and WiMAX and 802.11n. Intel has also made sustained advances in silicon technology: in 2007 it developed a 32nm SRAM with 1.9 billion transistors, with 32nm production slated for 2009.
Sustained advances in micro-architecture include Intel Core -- a new microarchitecture at 65nm (2006), Penryn -- a compaction/derivative at 45nm (2007), Nehalem -- a new microarchitecture at 45nm (2008), Westmere -- a compaction/derivative at 32nm (2009), and Sandy Bridge -- a new microarchitecture at 32nm (2010). "This shows our sustained microprocessor leadership," added Rattner.
There are plans to further reinvigorate the Intel architecture -- through high-throughput computing, IA programmability, ease of scaling for software, arrays of enhanced IA cores, and increasing teraflops of performance. The 45nm Silverthorne processor, at the heart of the Menlow platform, promises the 'full Internet in your pocket.'
Intel has also made advances in integration and packaging, such as multi-chip packages (WiFi + WiMAX, processor + chipsets + accelerators), 60 percent smaller CPU packages, and 100 percent lead-free technology. Intel is also committed to making all its 45nm CPUs halogen free in 2008.
Innovations in memory, communications
Intel has also made innovations in memory technology. Robson technology, a NAND flash cache, delivers 1.5X faster application load times and 1.5X faster resume from hibernate, gives 0.4W of average power savings, and is used in the Santa Rosa platform.
The other innovation is solid-state drives, which can be embedded in a range of devices, from handhelds to servers. Compared to HDDs, they draw one-tenth the power, deliver more than 10X the performance, and are 1,000X more durable.
Innovations in communication technology include adaptive antennas and front-ends, digital CMOS radio, and reconfigurable baseband. Intel has also developed the world's first teraFLOPS supercomputer on a die, featuring 80 cores, 1 TFLOPS at 62W, and 256 GB/s of bisection bandwidth.
Intel has also unleashed the 'era of tera' -- terabits via silicon photonics and terabytes via 3D stacked memory. The latter features 256KB of SRAM per core, 4X C4 bump density, and 3,200 through-silicon vias.
Similarly, Intel's innovations in tera-scale software include RMS workloads, C++ for parallelism, Ct for nested data parallelism (race-free irregular parallel computation), and hardware-assisted STM (software transactional memory), which ensures concurrent access without errors.
Intel has also been innovating for emerging regions. Rattner touched upon research at the Berkeley lablet on a long-distance WiFi solution: 6Mb/s at 100+ km. This has been tested at the Aravind Eye Hospital in Tamil Nadu, where it enables doctor/patient videoconferences. Thirteen rural villages have been connected so far, on the way to 50. The impact has been tremendous -- 2,500 exams per month and over 30,000 so far; cataracts, glaucoma and cornea problems have been diagnosed; and 3,000 people have had their vision restored.
Intel has also been a champion of innovating through collaboration. This takes the form of open academic collaboration (pre-competitive research) with Carnegie Mellon, Berkeley and the University of Washington; industrial partnerships (product differentiation), an example being ciscointelalliance.com; and consortia (ecosystem benefit) -- via the ISA, Continua, the Innovation Value Institute, the Trusted Computing Group, etc.
Intel's focus areas for 2008 include:
a) tera-scale computing -- unleashing the next generation of applications.
b) Platform on a chip (POC) -- 'platform' integration on chip with IA.
c) Trusted services -- technologies for secure service opportunities.
d) Carry small, live large -- context aware usage models and platforms.
e) Ultimate connectivity -- connected 'all-ways' for future platforms.
Wednesday, January 23, 2008
LabVIEW 8.5 delivers power of multicore processors
National Instruments (NI) recently released the latest version of LabVIEW -- LabVIEW 8.5, which delivers the power of multi-core processors to engineers and scientists.
According to Jayaram Pillai, MD, India, Russia & Arabia, NI, new multicore processors are coming out, with processing power split across two or more cores -- which means parallel processing. LabVIEW is very much a dataflow programming tool: execution is not sequential but is decided by the data itself, so it has inherently always been about parallel processing. NI has taken advantage of this since LabVIEW 5, which used a multi-threaded architecture to exploit dual-core processors.
Assign different tasks on different cores
With processor technology shifting to multi-core, applications need to run efficiently on the processor. In LabVIEW, you can assign different, independent tasks to different cores; they don't have to run at the same speed. These are the challenges that multi-core raises and that LabVIEW seeks to address.
With the technology NI has, a single application built in LabVIEW can take advantage of multicore: part of a program can be assigned to one core and, if you don't want to club them together, another part can run on another core -- all within one program.
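LabVIEW expresses this graphically rather than in text, but the underlying idea -- two independent parts of one program that the scheduler can place on separate cores -- can be sketched in a conventional language. Below is a minimal C++ analogue; the channel data and the average/peak tasks are hypothetical placeholders for illustration, not NI APIs.

```cpp
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

// Hypothetical stand-ins for two independent parts of a measurement program.
double average(const std::vector<double>& samples) {
    return std::accumulate(samples.begin(), samples.end(), 0.0) / samples.size();
}

double peak(const std::vector<double>& samples) {
    double p = 0.0;
    for (double s : samples) if (s > p) p = s;
    return p;
}

int main() {
    std::vector<double> channelA(1000, 1.5), channelB(1000, 3.0);
    double avgA = 0.0, peakB = 0.0;

    // Two independent tasks: the OS is free to place each thread on a
    // different core, and neither has to wait for, or match the speed of,
    // the other -- the textual analogue of two parallel LabVIEW loops.
    std::thread t1([&] { avgA = average(channelA); });
    std::thread t2([&] { peakB = peak(channelB); });
    t1.join();
    t2.join();

    std::printf("avg(A) = %.2f, peak(B) = %.2f\n", avgA, peakB);
    return 0;
}
```

The point is simply that the two tasks share no data, so they can run concurrently -- the same property that LabVIEW's dataflow diagrams expose naturally.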
So how is the new version better? As you want to accomplish more, you have to move to higher levels of abstraction. Graphical programming is a very high level of abstraction, so you can accomplish more with it.
Another area where LabVIEW is finding a place is the design space. Building systems today is very complex, and NI breaks the process into three stages -- design, prototype and deploy.
As an example, the Railways want to detect trains that have bad wheels. Data is collected from the railway lines, and a lot of signal processing and maths goes into working out how to detect a defective wheel. Sensors are put in to capture that signal, and trials are done in the field. Once you know the algorithms and the I/O required, you can go into designing, and then take the prototype and make it into a product. In the past, each stage used different tools.
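The article does not spell out NI's actual algorithm, but the flavour of the signal processing involved -- watching the vibration signal from a track-side sensor and flagging a wheel whose impacts stand out from the normal rolling baseline -- can be sketched as follows. The thresholds and signal shapes below are invented purely for illustration.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// Toy defect detector: a flat spot on a wheel produces periodic impacts that
// are much larger than the normal rolling vibration. Compare each sample
// against the signal's RMS baseline and flag repeated large excursions.
bool wheel_looks_defective(const std::vector<double>& vibration,
                           double threshold_ratio = 4.0) {
    if (vibration.empty()) return false;

    double sum_sq = 0.0;
    for (double v : vibration) sum_sq += v * v;
    const double rms = std::sqrt(sum_sq / vibration.size());

    int impacts = 0;
    for (double v : vibration)
        if (std::fabs(v) > threshold_ratio * rms) ++impacts;

    return impacts > 3;  // a couple of isolated spikes may just be noise
}

int main() {
    std::vector<double> healthy(500, 0.2);
    std::vector<double> damaged = healthy;
    for (std::size_t i = 50; i < damaged.size(); i += 100) damaged[i] = 5.0;

    std::printf("healthy wheel flagged: %d\n", wheel_looks_defective(healthy));
    std::printf("damaged wheel flagged: %d\n", wheel_looks_defective(damaged));
    return 0;
}
```

In the graphical system design flow described here, this kind of algorithm would be worked out and prototyped first, and then carried into the deployed product.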
Since it goes all the way from design to deployment, it can all be done with LabVIEW. This capability -- being able to start at design, move to prototype and so on -- brings greater value to engineers. NI calls this process graphical system design.
Community LabVIEW
Commenting on NI's activities in India, Jayaram Pillai said LabVIEW 8.5 is a major release. "Over the last 10 years we've been in India, we have changed its hardware platform. One thing that remains common is LabVIEW and what it does for scientists and engineers. We've been able to create a community around LabVIEW. There are users and programmers of LabVIEW. There are certified developers, companies that have been built around LabVIEW. There are applications built around LabVIEW," he added.
Looking at graphical programming over the last 10 years, the biggest challenge for Pillai has been addressing customers' need for engineers who can program in LabVIEW. There's enough excitement in the market about LabVIEW -- there are at least 1,100 jobs for LabVIEW programmers.
There are people writing toolkits that work with LabVIEW -- IP, developers, companies, customers, products and systems built around LabVIEW. This has been happening within India.
Pillai said that, as with any technology adoption, there's a take-off stage. "It is called the S curve, and we are at the bottom of the curve. The potential is huge. Our big challenge is to be able to create more engineers in the market. I feel that over 1,000 engineers are required," he added.
From an academic standpoint, India has over 400,000 engineers passing out each year, but only about 25 percent are hireable by companies with reasonable training. The IITs, state governments, etc., take care of part of it. That means there are business opportunities for companies to start finishing schools to train these people for the industry.
Lots of companies are getting into these finishing schools. These students will learn about the tools that they would later need to use.
Pillai stressed that it was important how the country looks at academicians. He said: "We haven't got into this stage as much. The IT industry has done very well and created a huge appetite for engineers. The majority of engineers passing out are hired by the IT companies."
He added: "LabVIEW is getting accepted by people in various projects. We've created a LabVIEW community -- the whole ecosystem."
Outlook for 2008
According to Pillai, algorithm engineering is key during the design phase. LabVIEW 8.5 has a new tool for the design stage of graphical system design.
With LabVIEW, users can increase performance through graphical programming for multi-core processors and FPGAs. Pillai said: "At the design stage, you want to give the engineer multiple computing options. The engineer is concerned about deployment, but he would like to move from Windows to real-time. This should be done seamlessly. LabVIEW FPGA has been around for a long time."
Programming an FPGA is not simple. NI's offer to engineers is: look at the FPGA as a platform and program it freely, without even needing to know the VHDL code. LabVIEW can run on Windows and move to an RTOS and an FPGA, and all of it is seamless.
As an example, Lego has a robotics product called Mindstorms -- a robot with sensors, targeted at 10-year-olds, that is programmed with LabVIEW. At one end, a 10-year-old can program in LabVIEW; at the other, LabVIEW is used in the largest physics experiments. So LabVIEW is not complex to use.
LabVIEW 8.5 has been shipping formally since last month.
Monday, January 14, 2008
Power awareness critical for chip designers
The holy grail of electronics -- low-power design, or having the requisite power awareness -- is extremely critical for chip designers working on both high-performance and portable applications. For one, it determines the battery lifetime of a device, besides the cooling and energy costs. Many of today's chip designs are said to be power limited while still requiring maximum performance.
Touching on the global factors, S.N. Padmanabhan, Senior Vice President, Mindtree Consulting, said the Kyoto Protocol mandates energy conservation efforts.
Low-power design challenges
Asia, as we all know, has been emerging as a major energy-consuming region, and shortage of electricity is becoming a major concern. There is huge strain on nations to meet the rising demand, or at least to halt its rise, and there is a rapid increase in all types of electronic goods in the growing economies. Increased efficiency and reduced consumption would therefore benefit everyone.
In the Indian context, the country has around 125 million television sets, 5 million automatic washing machines, 10 million white goods, 200+ million other electronic devices, over 90 million cell phones, 50 million landlines, and so on. A 1W reduction in white goods and TVs would save about 140 MW of power, and a 10mW reduction in phones would save about 1.4 MW (the arithmetic is spelled out below). It therefore makes even more sense to go low power!
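For the record, these savings follow directly from the installed-base numbers above -- roughly 140 million TVs and white goods, and roughly 140 million cell phones and landlines:

$$140 \times 10^{6} \times 1\,\text{W} = 140\,\text{MW}, \qquad 140 \times 10^{6} \times 10\,\text{mW} = 1.4\,\text{MW}$$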
Mindtree's Padmanabhan said IC power budgets have come down drastically -- to less than 2W for four out of five chips designed. There is also a simultaneous manipulation of multiple parameters (P = CV²f). Next, there are several leakage issues at 65nm and smaller geometries, which can no longer be ignored.
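For reference, the P = CV²f that Padmanabhan cites is the standard expression for CMOS dynamic (switching) power -- a textbook relation rather than anything specific to this talk:

$$P_{\text{dynamic}} = \alpha \, C \, V_{dd}^{2} \, f$$

where C is the switched capacitance, V_dd the supply voltage, f the clock frequency, and α the switching-activity factor (often folded into C, giving the shorthand above). The quadratic dependence on supply voltage is why voltage scaling is such a powerful low-power lever, and why techniques such as AVS and DVFS, discussed below, attack the supply voltage first.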
Add to all of this a lack of comprehensive tools and techniques, particularly on the analog design side. In such a scenario, designers need to be very clear about their objective -- is it lowering average power, lowering maximum peak power, or lowering energy?
Jayanta Kumar Lahiri, Director, ARM, pointed out the challenges associated with batteries. Battery storage has been a limiting factor: battery energy roughly doubles in a decade, and certainly does not follow Moore's Law. There have hardly been major changes in basic battery technology, and energy density, size and safe handling are limiting factors as well.
He added that the low-power challenge is four-fold in the VLSI domain: leakage; more integration, which means more W/cm²; EDA tools that are not that good in the low-power domain and sometimes do not correlate with silicon; and variability of device parameters, which makes things worse.
Toshiyuki Saito, Senior Manager, Design Engineering Division, NEC Electronics Japan, said low power is necessary for customers' success -- in the form of heat suppression for wired systems and improved battery lifetime for mobile systems. It also brings cost competitiveness for SoC suppliers in terms of packaging cost, development cost and turnaround time. Finally, it contributes to preserving the global environment.
Addressing low power challenges
What are semiconductor and EDA companies doing to address the low-power design challenges? Padmanabhan said several techniques were being employed at the circuit level. However, each one of those had limitations.
These include AVS (adaptive voltage scaling), which provides maximum savings but reduces speed and may need compensation; clock gating, which does not help reduce leakage and needs additional gates; adaptive clock scaling, which needs sophistication and is not very simple; and finally, the use of multi-threshold cells for selective trade-offs.
Emerging techniques include efficient RTL synthesis (trading off fast-but-leaky against slow-but-low-power implementations); power-aware resource sharing, planned at the architectural and synthesis levels but not yet as widely used as other techniques; and power-gating methodology, which makes use of sleep transistors, comes in coarse- and fine-grained variants, reduces both dynamic and leakage power, and exploits the circuit's idle times.
He added that power optimization should start at the architecture and design stages, since maximum optimization can be achieved at the system level. Also, the evolving power optimization tools and methodologies require collaborative approaches.
Power Forward Initiative
Pankaj Mayor, Group Director, Industry Alliances, Cadence Design Systems, said the low-power imperative is driving the semiconductor and EDA industries. He said: "design-based low power solution is the only answer!" Traditional design-based solutions are fragmented. Basic low-power design techniques, such as area optimization, multi-Vt optimization and clock gating, were automated in the 1990s.
Advanced low-power techniques have since made an impact. These include multi-supply voltage (MSV), power shut-off (PSO), dynamic and adaptive voltage/frequency scaling (DVFS and AVS), and substrate biasing. Cadence's low-power solution uses these advanced techniques.
According to Mayor, the Power Forward Initiative (PFI) has also created an ecosystem: as of the end of December 2007, it included Cadence and 23 other companies from across the design chain.
The year 2007 also saw continued Power Forward industry momentum. In Q1 2007, the Common Power Format (CPF) became an Si2 (Silicon Integration Initiative) standard, and version 1.0 of the Cadence Low Power Solution had its production release in the same quarter. In H2 2007, the industry saw over 100 customers adopting the CPF-based advanced low-power solution, along with around 50 tapeouts.
CPF allows holistic automation and validation at every design step. Arijit Dutta, Manager, Design Methodology, Freescale Semiconductor, demonstrated the advantages of using CPF in Freescale's wireless, networking and automotive verticals.
Indian designers could lead in EDA product development
The product development process will become multi-nationalized, according to Walden C. Rhines, chairman and CEO, Mentor Graphics. More executives recognize that "access to qualified personnel" is the key driver, and as per the A.T. Kearney Global Services Location Index 2007, India is the most attractive offshoring destination. He was speaking at the Thought Leadership Forum organized by the India Semiconductor Association.
Touching on the evolution of EDA and the role of Indian designers, Rhines said that most electronic engineers do not consider themselves "risk takers". Most electronic engineers also don't like to change tools, and fewer still consider "hot" new tools.
On the contrary, young engineers and recent university graduates eagerly adopt new technology: it is a way for them to distinguish themselves and gain a productivity advantage, and they are less invested in existing methodologies.
Indian designers smart
Comparing Indian designers with the rest of the world, he said that electronic designers in India are, on average, less experienced than those in the United States, Europe and Japan. However, they are on average as smart as, or smarter than, their counterparts there. Indian design centers have an increasing influence on multinational design flows and tools.
Disruptive change creates leadership opportunities. Power and cost have been improved through better system architecture, and C synthesis enables faster architectural exploration and a shorter path to Verilog.
C synthesis matches or exceeds the efficiency of hand-coded RTL, as per STMicroelectronics (Reed-Solomon, Galois field multiplier). There have also been system architectural innovations to reduce die size, as at Ericsson Mobile Platforms. The key is being able to iterate to find the optimum architecture.
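The Galois field multiplier cited above is a good example of the kind of algorithmic C/C++ that C synthesis (high-level synthesis) tools turn into RTL: a short, fixed-bound loop with no pointers or dynamic memory. Here is a minimal sketch of a GF(2^8) multiplier, using the AES reduction polynomial purely for illustration -- a Reed-Solomon codec would use whatever field polynomial its standard specifies, and this is not the STMicroelectronics code.

```cpp
#include <cstdint>
#include <cstdio>

// Multiply two elements of GF(2^8), reducing by the AES polynomial
// x^8 + x^4 + x^3 + x + 1 (0x11B). Shift-and-XOR, eight fixed iterations --
// easy for an HLS tool to unroll into combinational logic.
static std::uint8_t gf256_mul(std::uint8_t a, std::uint8_t b) {
    std::uint8_t result = 0;
    for (int i = 0; i < 8; ++i) {
        if (b & 1)
            result ^= a;               // add (XOR) the current multiple of 'a'
        const bool carry = (a & 0x80) != 0;
        a <<= 1;
        if (carry)
            a ^= 0x1B;                 // reduce modulo the field polynomial
        b >>= 1;
    }
    return result;
}

int main() {
    // 0x53 and 0xCA are multiplicative inverses in this field, so the product is 0x01.
    std::printf("0x53 * 0xCA = 0x%02X\n",
                static_cast<unsigned>(gf256_mul(0x53, 0xCA)));
    return 0;
}
```

The attraction of C synthesis is that a designer can iterate on a function like this -- changing the polynomial, unrolling or pipelining it -- and regenerate the Verilog each time, which is exactly the faster architectural exploration Rhines refers to.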
Indian designers have been early adopters of C-based design. The reasons: one, a willingness to try new approaches; and two, it caused multinational parent companies to accelerate their own adoption.
India is likely to be a leader in transaction-level design. Indian teams have been able to extract fast, accurate power and timing models from RTL; achieve runs 100x-1,000x faster than RTL while retaining gate-level and RTL accuracy; run those models with application software for hardware/software co-simulation; and perform transaction-based verification using emulation. The same holds for UPF-compatible verification.
Some other areas of verification where India may lead the way are assertions, coverage based verification, and algorithmic test bench synthesis.
Adoption of place and route technology
Let us see how the adoption of new place-and-route technology has influenced the industry. When a design flow breaks, what breaks most often? Place and route breaks roughly every two technology generations, as each generation ramps to peak volume. Place and route relied on semiconductor companies' internal software until gate-array routers emerged.
Cell-based layout required a hierarchical router with timing, and Tangent attacked leading-edge 0.75-micron designs. Later, Cadence became the dominant place-and-route supplier at 0.35 micron as the fabless industry grew demand [Cadence acquired Tangent in 1989]. However, at 0.25 micron, SoC drove new technology requirements, and this led to the collapse of the FAM business model.
SoC needs "break" the flow. ArcSys emerges at 0.35/0.25 micron, addressing SoC requirements for large design sizes and interconnect delay; it goes public as Avant! and is later acquired by Synopsys. However, a timing-closure crisis emerges at 0.13 micron, and Cadence and Avant!/Synopsys try to extend their older tool architectures.
Now, timing closure "breaks" the design flow. At this point, Magma emerges at 0.13 micron with a timing-driven layout solution. At 90nm, Magma dominates timing-driven design and approaches Cadence's and Avant!/Synopsys' place-and-route market share. What the industry witnesses is that every two nodes, a new problem emerges and a new leading-edge solution provider enters.
And now, pressures are creating a 65/45nm discontinuity: process and design variation, low-power requirements, and large design data sizes. Explosive growth in complexity requires multi-corner, multi-mode analysis.
Achieving power/performance design goals requires analysis of corner cases for manufacturing and operational variability, and manufacturing variability multiplies the required corner cases. Hence, manufacturing variability now "breaks" the place-and-route flow at 65nm. The advent of 45nm demands design for manufacturing (DFM) and ushers in still more corners.
Implications for EDA in India
So what are the implications for EDA in this scenario, especially in the Indian context? One, leading-edge design tools will be introduced and supported in India. Two, EDA startups will focus their initial sales efforts on San Jose and India. Three, purchasing decisions will increasingly involve India design teams driving flows and decisions. Four, India will emerge as the test bed for new design ideas. As a result, Indian designers will exercise their influence by demanding best-in-class design tools and capabilities.
Indian designers should always remain open to new design approaches and should beware of becoming risk averse as they become more experienced. They need to stay abreast of emerging innovations by maintaining close contact with EDA companies, including start-ups, and also need to make EDA suppliers aware of their issues and challenges.
Tuesday, January 1, 2008
Can we expect exciting times in 2008?
Welcome 2008! May I wish all my readers a very happy and prosperous 2008. Another year's gone past, and we have a habit of looking back to see what happened and what could have been.
A lot has been written already about 2007 and what to expect in 2008. So let's just touch upon some of the events from 2007 and some expectations from 2008.
For India, 2007 was a great year for the semiconductor industry -- first, the Indian government announced the semiconductor policy, followed some months later by the fab policy. Both were tremendous firsts in the history of India's science and technology (as distinct from IT). Everyone hopes that the Indian semiconductor industry will take off this year. Eyes are focused on the embedded segment, what with the global semiconductor industry reportedly facing 'an embedded dilemma.'
An issue hitting the EDA industry is that the cost of developing the embedded software for an SoC actually passed the cost of designing the SoC itself in 2007. The world needs to avert this software crisis, and India is well placed to take full advantage and play a major role, given its strength in embedded.
In IT, it's been a mixed sort of year for Apple, which hit the big time with the iPhone but seemed not to make waves with either the Safari browser or the Leopard OS. Microsoft had the Vista OS, but Vista didn't exactly warm the hearts of users or of those who wished to upgrade their OS, yours truly included. Maybe 2008 will ring in better times for Vista.
While on browsers, Firefox has gained a lot of ground. However, by the end of 2007 came the news that the Netscape Web browser -- which started it all -- would soon be confined to history.
Netscape Navigator was the world's first commercial Web browser and the launch pad of the Internet boom. It will be retired on February 1, 2008, after a 13-year run. Time Warner's AOL, its current owner, has reportedly decided to kill further development and technical support to focus on growing the company as an advertising business. The first version of Netscape had come out in late 1994.
In gaming, the Wii, PS3 and Xbox 360 all have their admirers, and that will remain the case. Which of these gaming consoles will eventually reign supreme is difficult to predict.
In consumer electronics, the lines are surely blurring between portable media players (PMPs) and portable navigation devices. Also, it will be interesting to see how digital photo frames fare in 2008. A reported tight supply, especially for seven-inch models, has led some makers in Asia either to postpone mass production or to extend lead times. And makers cannot keep adding entertainment functions to smaller-screen models if they are to keep costs down.
In the security products market, IP cameras and video servers should have a better year, with more emphasis now on video surveillance. In fact, some friends have been querying me as well regarding their potential.
On components, we can hope to see more growth for solid polymer capacitors in 2008, and among PCBs some fabricators should start manufacturing high-density interconnect (HDI) PCBs this year.
In wireless, we should see TD-SCDMA in operation before the Beijing Olympic Games. Backers would like to see TD-SCDMA succeed, given the effort Datang-Siemens has put into the technology, as has the Chinese government, which issued spectrum for TD-SCDMA nearly five years ago!
Let's all welcome 2008 and look forward to more exciting things happening.