This is the third installment on verification, now taken up by Synopsys. Regarding the biggest verification mistakes today, Arindam Ghosh, director – Global Technical Services, Synopsys India, listed these as:
* Spending no time on verification planning (not documenting what needs to be verified) and focusing more on running simulations or on execution.
* No or very low investment in building better verification environments (based on best/new methodologies and best practices); instead maintaining older verification environments.
* Compromising on verification completeness because of tape out pressures and time-to-market considerations.
Would you agree that many companies STILL do not know how to verify a chip?
It could be true for smaller companies or start-ups, but most of the major semiconductor design engineers know about the better approaches/methodologies to verify their chips. However, they may not be investing in implementing the new methodologies for multiple reasons and may instead continue to follow the traditional flows.
How are companies trying to address those? One way to address these mistakes would be to set up strong methodology teams to create a better verification infrastructure for future chips. However, few companies are doing this.
Are companies realizing this and building an infrastructure that gets them a business advantage? According to him, some companies do realize this and are investing in building a better infrastructure (in terms of better methodology and flows) for verification.
When should good verification start -- after design, or as you are designing and architecting your design environment?
He said that good verification starts as soon as we start designing and architecting the design. Verification leads should start discussing the verification environment components with the lead architect and also start writing the verification plan.
Are folks making a mistake by looking at tools and not at the verification process itself? He noted that tools play a major role in the effectiveness of any verification process, but we still see a lot of scope for methodology improvements beyond the tools.
What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities? As per Ghosh, there is no single foolproof recipe for a ‘right’ verification path. It depends on multiple factors, including whether the design is a new product or a derivative, the design application, etc. But yes, it is very important to do comprehensive verification planning before starting the verification process.
How is Synopsys addressing this? Synopsys says it is building the comprehensive, unified and integrated verification environment required for today’s revolutionary SoCs, one that offers a fundamental shift forward in productivity, performance, capacity and functionality.
Synopsys’ Verification Compiler provides the software capabilities, technology, methodologies and VIP required for the functional verification of advanced SoC designs in one solution.
Verification Compiler includes:
* Better capacity and compile and runtime performance.
* Next-generation static and formal technology delivering performance improvement and the capacity to analyze a complete SoC (property checking, LP, CDC, connectivity).
* Comprehensive low power verification solution.
* Verification planning and management.
* Next-generation verification IP and a deep integration between VIP and the simulation engine, which in turn can greatly improve productivity. The constraint engine is tuned for optimal performance with its VIP library. It has integrated debug solutions for VIP, so one can do protocol-level analysis and transaction-based analysis with the rest of the testbench.
* Support for industry standard verification methodologies.
* X-propagation simulation with both RTL and low power simulations (see the sketch after this list).
* Common debug platform with better debug technology having new capabilities, tight integrations with simulation, emulation, testbench, transaction debug, power-aware debug, hw/sw debug, formal, VIP and coverage.
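To make the X-propagation item concrete, here is a minimal, generic SystemVerilog sketch (illustrative only, not Synopsys tool code; the module and signal names are invented) of the optimism problem that X-propagation simulation is designed to expose:

module xprop_example (
  input  logic       clk,
  input  logic       rst_n,
  input  logic       mode,   // may be X after power-up or in a power-gated domain
  input  logic [7:0] a, b,
  output logic [7:0] y
);
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n)
      y <= '0;
    else if (mode)   // if 'mode' is X, a conventional RTL simulator treats the
      y <= a + b;    // condition as false and silently takes the subtract branch,
    else             // hiding a real silicon ambiguity; an X-propagation-aware
      y <= a - b;    // simulation propagates the X into 'y' so checkers catch it
  end
endmodule

In power-aware simulation the same hazard appears when registers in a switched-off domain come back as X, which is presumably why X-propagation is offered for both RTL and low power simulations.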
Top verification recommendations
What would be Synopsys' top five recommendations for verification?
* Spend a meaningful amount of time and effort on verification planning before execution.
* Continuously invest in building a better verification infrastructure and methodologies across the company for better productivity.
* Collaborate with EDA companies to develop, evaluate and deploy new technologies and flows, which can bring more productivity to verification processes.
* Nurture fresh talent through regular on- and off-the-job training (on flows, methodologies, tools and technology).
* Conduct regular reviews of the completed verification projects with the goal of trying to improve the verification process after every tapeout through methodology enhancements.
Monday, April 21, 2014
Monday, April 14, 2014
Cadence: Plan verification to avoid mistakes!
Following Mentor Graphics, Cadence Design Systems Inc. has entered the verification debate. ;) I met Apurva Kalia, VP R&D – System & Verification Group, Cadence Design Systems. In a nutshell, he advised that there needs to be proper verification planning in order to avoid mistakes. First, let's try to find out the biggest verification mistakes.
Top verification mistakes
Kalia said that the biggest verification mistakes made are:
* Verification engineers do not define a structured notion of verification completeness.
* Verification planning is not done up front and is carried out as verification is going along.
* A well-defined reusable verification methodology is not applied.
* Legacy tools continue to be used for verification; new tools and technologies are not adopted.
In that case, why do some companies STILL not know how to verify a chip?
He added: "I would not describe the situation as companies not knowing how to verify a chip. Instead, I think a more accurate description of the problem is that the verification complexity has increased so much that companies do not know how to meet their verification goals.
"For example, the number of cycles needed to verify a current generation processor – as calculated by traditional methods of doing verification – is too prohibitive to be done in any reasonable timeframe using legacy verification methodologies. Hence, new methodologies and tools are needed. Designs today need to be verified together with software. This also requires new tools and methodologies. Companies are not moving fast enough to define, adopt and use these new tools and methodologies thereby leading to challenges in verifying a chip."
Addressing challenges
How are companies trying to address the challenges?
Companies are trying to address the challenges in various ways:
* Companies at the cutting edge of designs and verification are indeed trying to adopt structured verification methodologies to address these challenges.
* Smaller companies are trying to address these challenges by outsourcing their verification to experts and by hiring more verification experts.
* Verification acceleration and prototyping solutions are being adopted for faster verification, which allows companies to do more verification in the same amount of time.
* Verification environment re-use helps to cut down the time required to develop verification environments.
* Key requirements for SoC integration and verification—which must cover functionality, compliance, power, performance, etc.—include hardware/software debug efficiency, multi-language verification, low power, mixed signal, fast time to debug, and execution speed.
Cadence has the widest portfolio of tools to help companies meet verification challenges, including:
Incisive Enterprise Manager, which provides hierarchical verification technology for multiple IPs, interconnects, hardware/software, and plans to improve management productivity and visibility;
The recently launched vManager solution, a verification planning and management solution enabled by client/server technology to address the growing verification closure challenge driven by increasing design size and complexity;
Incisive Enterprise Verifier, which delivers dual power from tightly integrated formal analysis and simulation engines; and
Incisive Enterprise Simulator, which provides the most comprehensive IEEE language support with unique capabilities supporting the intent, abstraction, and convergence needed to speed silicon realization.
Are companies building an infrastructure that gets them a business advantage? Yes, companies are realizing the problems, and it is these companies that are the winners in managing today’s design and verification challenges, he said.
Good verification
When should good verification start?
Kalia noted: "Good verification should start right at the time of the high level architecture of the design. A verification strategy should be defined at that time, and an overall verification plan should be written at that time. This is where a comprehensive solution like Incisive vManager can help companies manage their verification challenges by ensuring that SoC developers have a consistent methodology for design quality enhancements."
Are folks making a mistake by looking at tools and not at the verification process itself?
He added that the right tools and methodology are needed to resolve today’s verification challenges. Users need to work on defining verification methodologies and, at the same time, look at the tools that are needed to achieve verification goals.
Verification planning
Finally, there's verification planning! What should be the ‘right’ verification path?
Verification planning needs to include:
* A formal definition of verification goals;
* A formal definition of coverage goals at all levels – starting with code coverage all the way to functional coverage (see the sketch after this list);
* Required resources – human and compute;
* Verification timelines;
* All the verification tools to be used for verification; and
* Minimum and maximum signoff criteria.
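To illustrate the coverage-goals item above, here is a small, generic SystemVerilog sketch (the class, field and bin names are hypothetical and not tied to any Cadence tool) showing how one functional coverage goal from the plan can be made executable and measurable:

class pkt_coverage;
  logic [1:0] pkt_type;   // sampled from a monitored transaction
  logic [9:0] pkt_len;

  covergroup pkt_cg;
    cp_type    : coverpoint pkt_type;       // every packet type must be observed
    cp_len     : coverpoint pkt_len {
      bins short_pkt  = {[0:63]};
      bins normal_pkt = {[64:511]};
      bins long_pkt   = {[512:1023]};
    }
    type_x_len : cross cp_type, cp_len;     // corner cases: each type at each length
  endgroup

  function new();
    pkt_cg = new();                         // embedded covergroup must be built here
  endfunction

  function void sample_pkt(logic [1:0] t, logic [9:0] l);
    pkt_type = t;
    pkt_len  = l;
    pkt_cg.sample();
  endfunction
endclass

Each "what" in the plan then maps to a queryable metric, so the minimum and maximum signoff criteria can be judged against collected coverage rather than intuition alone.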
Monday, April 7, 2014
Five recommendations for verification: Dr. Wally Rhines
It seems to be the season of verification. The Universal Verification Methodology (UVM 1.2) is being discussed across conferences. Dennis Brophy, director of Strategic Business Development, Mentor Graphics, says that the UVM 1.2 release is imminent, and UVM remains a topic of great interest.
Biggest verification mistakes
Before I add Dennis Brophy’s take on UVM 1.2, I discussed the intricacies of verification with Dr. Wally Rhines, chairman and CEO, Mentor Graphics Corp. First, I asked him about the biggest verification mistakes today.
Dr. Rhines said: “The biggest verification mistake made today is poor or incomplete verification planning. This generally results in underestimating the scope of the required verification effort. Furthermore, without proper verification planning, some teams fail to identify which verification technologies and tools are appropriate for their specific design problem.”
Would you agree that many companies STILL do not know how to verify a chip?
Dr. Rhines added: “I would agree that many companies could improve their verification process. But let’s first look at the data. Today, we are seeing that about 1/3 of the industry is able to achieve first silicon success. But what is interesting is that silicon success within our industry has remained constant over the past ten years (that is, the percentage hasn’t become any worse).
“It appears that, while design complexity has increased substantially during this period, the industry is at least keeping up with this added complexity through the adoption of advanced functional verification techniques.
“Many excellent companies view verification strategically (and as an advantage over their competition). These companies have invested in maturing both their verification processes and teams and are quite productive and effective. On the other hand, some companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.”
How are companies trying to address those?
According to him, the recent Wilson Research Group Functional Verification Study revealed that the industry is maturing its verification processes through the adoption of various advanced functional verification techniques (such as assertion-based verification, constrained-random simulation, coverage-driven techniques, and formal verification).
Complexity is generally forcing these companies to take a hard look at their existing processes and improve them.
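To show what one of these techniques looks like in practice, here is a minimal, generic SystemVerilog Assertions (SVA) sketch of assertion-based verification; the signal and property names are hypothetical, and the same property could be exercised in simulation or handed to a formal property checker:

module req_gnt_checker (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  // Every request must be granted within 1 to 8 clock cycles.
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:8] gnt;
  endproperty

  a_req_gets_gnt : assert property (p_req_gets_gnt)
    else $error("req was not granted within 8 cycles");

  // The matching cover property feeds coverage-driven closure: it records that
  // the request/grant scenario was actually exercised, not merely never violated.
  c_req_gets_gnt : cover property (p_req_gets_gnt);
endmodule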
Getting business advantage
Are companies realizing this and building an infrastructure that gets them a business advantage?
He added that in general, there are many excellent companies out there that view verification strategically and as an advantage over their competition, and they have invested in maturing both their verification processes and teams. On the other hand, some other companies are struggling to figure out the entire SoC space and its growing complexity and verification challenges.
When should good verification start?
When should good verification start — after design; as you are designing and architecting your design environment?
Dr. Rhines noted: “Just like the design team is often involved in discussion during the architecture and micro-architecture planning phase, the verification team should be an integral part of this process. The verification team can help identify architectural aspects of the design that are going to be difficult to verify, which ultimately can impact architectural decisions.”
Are folks mistaken by looking at tools and not at the verification process itself? What can be done to reverse this?
He said: “Tools are important! However, to get the most out of the tools and ensure that the verification solution is an efficient and repeatable process is important. At Mentor Graphics, we recognize the importance of both. That is why we created the Verification Academy, which focuses on developing skills and maturing an organization’s functional verification processes.”
What needs to go into verification planning, given that the ‘right’ verification path is fraught with complexities?
Dr. Rhines said: “During verification planning, too many organizations focus first on the “how” aspect of verification versus the “what.” How a team plans to verify its designs is certainly important, but first you must identify exactly what needs to be verified. Otherwise, something is likely to slip through.
“In addition, once you have clearly identified what needs to be verified, it’s an easy task to map the functional verification solutions that will be required to productively accomplish your verification goals. This also identifies what skill sets will need to be developed or acquired to effectively take advantage of the verification solutions that you have identified as necessary for your specific problem.”
How is Mentor addressing this situation?
Mentor Graphics’ Verification Academy was created to help organizations mature their functional verification processes—and verification planning is one of the many excellent courses we offer.
In addition, Mentor Graphics’ Consulting provides customized solutions to technical challenges on real projects with real schedules. By helping customers successfully integrate advanced functional verification technologies and methodologies into their work flows, we help ensure they meet their design and business objectives.
Five recommendations for verification
Finally, I asked Dr. Rhines what would be the top five recommendations for verification?
Here are the five recommendations for verification from Dr. Rhines:
* Ensure your organization has implemented an effective verification planning process.
* Understand which verification solutions and technologies are appropriate (and not appropriate) for various classes of designs.
* Develop or acquire the appropriate skills within your organization to take advantage of the verification solutions that are required for your class of design.
* For the SoC class of designs, don’t underestimate the effort required to verify the hardware/software interactions, and ensure you have the appropriate resources to do so.
* For any verification processes you have adopted, make sure you have appropriate metrics in place to help you identify the effectiveness of your process—and identify opportunities for process improvements in terms of efficiency and productivity.
Thursday, April 3, 2014
Semicon industry needs to keep delivering value: Anil Gupta
In 2013, the global semiconductor industry had touched $306 billion or so. Sales had doubled from $100 billion to $200 billion in six years — from 1994 to 2000. It was enterprise sales that was driving this. It has taken 14 years to move past $300 billion, said Anil Gupta, managing director, Applied Micro Circuits India Pvt Ltd, at the UVM 1.2 day.
This time, consumption of semiconductors is not only around enterprise, but social networks as well. Out of the $306 billion, logic was approximately $86 billion, memory was $67 billion, and micro was $58 billion. We, as consumers, are starting to play a huge role.
However, the number of large players seems to be shrinking. Mid-size firms, like Applied Micro, are said to be struggling. Technology is playing an interesting role. There is a very significant investment in FinFETs. It may only get more difficult for all of us. Irrespective, all of this is a huge barrier to mid-sized and small companies. Acquisitions are probably the only route, unless you are in software.
In India, we have been worried for a while about whether the situation is a passing phase. We definitely will have a role to play. From an expertise perspective, thanks to our background, we have been a poor nation. For us, the job is the primary goal. We need to think: how do we deliver value? We have to try and keep creating value for as long as possible.
As more and more devices actually happen, many other things are also happening. An example for devices is power. We still have a fair number of years ahead where there will be opportunities to deliver value.
What’s happening between hardware and software? The latter is in demand. Clearly, there is a trend to make the hardware a commodity. However, hardware is not going away! Therefore, the opportunity for us to deliver value is huge.
Taking the tools to make something is critical. UVM tools are critical. But, somewhere along the way, we seem to stop at that. We definitely need to add value. UVM’s aim is to make things re-usable.
Don’t lose your focus while doing verification. Think about the block, the subsystem and the top. You need to, and will, discover and realize how valuable it is to find a bug before the tape out of the chip.
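As a purely hypothetical illustration of that re-use goal, here is a short UVM sketch in which a block-level environment is instantiated, unchanged, inside a subsystem-level environment; all class names are invented and the agent is a bare placeholder:

import uvm_pkg::*;
`include "uvm_macros.svh"

class my_bus_agent extends uvm_agent;            // placeholder block-level agent
  `uvm_component_utils(my_bus_agent)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

class blk_env extends uvm_env;                   // verified standalone at block level
  `uvm_component_utils(blk_env)
  my_bus_agent agent;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    agent = my_bus_agent::type_id::create("agent", this);
  endfunction
endclass

class subsys_env extends uvm_env;                // the block env re-used, twice, at
  `uvm_component_utils(subsys_env)               // subsystem (and later chip) level
  blk_env blk0, blk1;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    blk0 = blk_env::type_id::create("blk0", this);
    blk1 = blk_env::type_id::create("blk1", this);
  endfunction
endclass

A bug found with the block-level version of such an environment is far cheaper than the same bug found at the top, which is the point of the block/subsystem/top discipline above.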
Wednesday, April 2, 2014
Are we at an inflection point in verification?
Are we at an inflection point in verification today? Delivering the guest keynote at the UVM 1.2 day, Vikas Gautam, senior director, Verification Group, Synopsys, said that today, mobile and the Internet of Things are driving growth.
Naturally, the SoCs are becoming even more complex. It is also opening up new verification challenges, such as power efficiency, more software, and reducing time-to-market. There is a need to shift left to be able to meet time-to-market goals.
The goal is to complete your verification as early as possible. There have been breakthrough verification innovations. SystemVerilog brought in a single language. Every 10-15 years, there has been a need to upgrade verification.
Today, many verification technologies are needed. There is a growing demand for smarter verification. There is a need for more upfront verification planning. There is automated setup and re-use with VIP. There is a need to deploy new technologies and different debug environments. The current flows are limiting smart verification. There are disjointed environments with many tools and vendors.
Synopsys has introduced the Verification Compiler. You get access to each required technology, as well as next-gen technology. These technologies are natively integrated. All of this enables 3X verification productivity.
Regarding next-gen static and formal platforms, there will be capacity and performance for SoCs. They should be compatible with implementation products and flows. There is a comprehensive set of applications. NLP+X-Prop can help find tough wake-up bugs at RTL. Simulation is tuned for the VIP. There is a ~50 percent runtime improvement.
SystemVerilog has brought in many new changes. Now, we have the Verification Compiler. Verdi is an open platform. It offers VIA – a platform for customizing Verdi. VIA improves debug efficiency.
Three things in Indian semicon: Vinay Shenoy
There have been a variety of announcements made by the Government of India in the last one year or so. In the pre-90s period, the country showed just 1 percent GDP growth rate. It was averse to FDI and had a regulated market. All of this led to deregulation under the late PM, PV Narasimha Rao.
The Indian government was averse to foreign investment, which was opened up around 1994. Since then, we have seen 6-8 percent growth, said Vinay Shenoy, MD, Infineon Technologies (India). He was delivering the keynote at the UVM 1.2 day, being held in Bangalore, India.
Around 1997, India signed the ITA-1 with the WTO. A lot of electronic items had their import duty reduced to zero. It effectively destroyed the electronics manufacturing industry in India. We were now reduced to being a user of screwdriver technology. In 1985, the National Computer Policy, and in 1986, the National Software Policy, were drafted. The government of India believed that there existed some opportunities. The STPI was also created, as well as 100 percent EoUs. So far, we have been very successful in services, but have a huge deficit in manufacturing.
We made an attempt to kick off semicon manufacturing in 2007, but that didn’t take off for several reasons. It was later revived in 2011-12. Under the latest national policy on electronics, there have been a couple of announcements – one, the setting up of two semicon fabs in India. The capital grant – nearly 25-27 percent – is being given by the government. It has provided a financial incentive of about $2 billion.
Two, for electronics manufacturing per se, unless it is completely an EoU, the industry will find it difficult to survive. There is the M-SIPS package that offers a 25 percent capital grant to a wide range of industries.
Three, we have granted some incentives for manufacturing. But how are you going to sell? The government has also proposed ‘Made in India’, where 30 percent of the products will be used within India. These will largely be in government procurements, so that the BoM should be at least 30 percent from India. The preferential market policy applies to all segments, except defense.
Skill development is also key. The government has clearly stated that there should be innovation-led manufacturing. The government also wants to develop PhDs in selected domains. It intends to provide better lab facilities, better professors, etc. Also, young professors seeking to expand can seek funding from the government.
TSMC promotes small IP companies. Similarly, it should be done in India. For semicon, these two fabs in India will likely come up in two to three years' time. “Look at how you can partner with these fabs. Your interest in the semicon industry will be highly critical. The concern of the industry has been the stability of the tax regime. The government of India has assured 10 years of a stable tax regime. The returns will come in 10-15 years,” added Shenoy.
The government has set up electronics manufacturing clusters (EMCs). These will make it easier for companies to set up within an EMC. The NSDC is tying up with universities to bring in skill-sets. The industry is also defining what skills will be required. The government is funding PhDs to pursue specialization.