
Future Challenges in Design Verification and Creation


Gabe Moretti, Senior Editor

Dr. Wally Rhines, Chairman and CEO of Mentor Graphics, delivered the keynote address at the recently concluded DVCon U.S. in San Jose.  The title of the presentation was: “Design Verification Challenges: Past, Present, and Future”.  Although one must know the past and recognize the present challenges, the future ones were those that interested me the most.

But let’s start from the present.  As can be seen in Figure 1, designers today use five major techniques to verify a design.  The techniques are not integrated with each other; they function as five separate silos within the verification methodology.  The near-term goal, as Wally explained, is to integrate the verification process.  The work of the Portable Stimulus Working Group within the Accellera Systems Initiative is addressing the problem.  The goal, according to Bill Hodges of Intel, is: “Users should not be able to tell if their job was executed on a simulator, emulator, or prototype”.

Figure 1.  Verification Silos

Present EDA development work addresses the functionality of the design at both the logical and the physical level.  But, especially with the growing introduction of Internet of Things (IoT) devices and applications, security and safety are becoming requirements, and we have not yet learned how to verify a device’s robustness in these areas.

Security

Figure 2, courtesy of Mentor Graphics, encapsulates the security problem.  The number of security breaches seems to increase with every passing day, and the financial and privacy losses are significant.

Figure 2

Chip designers must worry about malicious logic inside the chip, counterfeit chips, and side-channel attacks.  Malicious logic is normally inserted dynamically into the chip using Trojan malware; it must be detected and disabled.  The first thing designers need to do is implement countermeasures within the chip.  Designers must implement logic that analyzes runtime activity to recognize foreign-induced activity through a combination of hardware and firmware.  Although simulation can be used for verification, static tests that determine the chip performs as specified and does not execute unspecified functions should be used during the development process.  Well-formed and complete assertions can approximate a specification document for the design.
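As a minimal sketch of what such specification-derived checks might look like, the Python fragment below replays a recorded activity trace against two invented rules.  The transaction format, the permitted address ranges, and the rules themselves are assumptions made for illustration, not details from Dr. Rhines’ presentation.

```python
# Illustrative only: an assertion-style monitor that checks a recorded
# activity trace against rules derived from a (hypothetical) specification.
ALLOWED_RANGES = [(0x0000, 0x0FFF),   # program memory (read-only at runtime)
                  (0x4000, 0x4FFF)]   # peripheral registers

def violations(txn):
    """Return the list of specification rules this transaction breaks."""
    broken = []
    if not any(lo <= txn["addr"] <= hi for lo, hi in ALLOWED_RANGES):
        broken.append(f"access to unspecified address {txn['addr']:#06x}")
    if txn["write"] and txn["addr"] <= 0x0FFF:
        broken.append("write to read-only program memory (possible Trojan activity)")
    return broken

trace = [{"addr": 0x4004, "write": True},    # legal peripheral write
         {"addr": 0x0100, "write": True},    # illegal write into program memory
         {"addr": 0x9000, "write": False}]   # read from an unspecified region

for i, txn in enumerate(trace):
    for problem in violations(txn):
        print(f"transaction {i}: {problem}")
```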

Another security threat, the “side-channel attack”, is similar to the Trojan attack but differs in that it takes advantage of back doors left open, intentionally or not, by the developers.  Back doors are built into systems by the developers’ institution to deal with special security circumstances, but they can be used criminally when discovered by unauthorized third parties.  To defend against such an eventuality, designers can use hardened IP or special logic to verify authenticity.  Clearly, during development these countermeasures must be verified and their weaknesses discovered.  The question to be answered is: “Is the desired path the only path possible?”
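One way to make that question concrete is to model the control flow as a small state graph and exhaustively search for any route to the protected state that bypasses authentication.  The sketch below is only illustrative; the state names and transitions are invented for this example and are not from the keynote.

```python
# Illustrative only: exhaustively check whether "unlocked" is reachable
# without ever passing through "authenticated" -- i.e., whether a back
# door exists in this toy state machine.
EDGES = {
    "reset":         ["idle"],
    "idle":          ["challenge", "debug"],     # "debug" models a back door
    "challenge":     ["authenticated", "idle"],
    "authenticated": ["unlocked"],
    "debug":         ["unlocked"],               # bypasses authentication
    "unlocked":      [],
}

def reachable_without(start, target, forbidden):
    """True if target can be reached from start without visiting forbidden."""
    stack, seen = [start], set()
    while stack:
        state = stack.pop()
        if state == target:
            return True
        if state in seen or state == forbidden:
            continue
        seen.add(state)
        stack.extend(EDGES[state])
    return False

if reachable_without("reset", "unlocked", forbidden="authenticated"):
    print("back door found: unlocked is reachable without authentication")
```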

Safety

As the use of electronic systems grows at an increasing pace in all sorts of products, the reliability of such systems grows in importance.  Although many products can be replaced when they fail without serious consequences for their users, an increasing number of system failures put the safety of human beings in great jeopardy.  Dr. Rhines identified in particular systems in the automotive, medical, and aerospace industries.  Safety standards covering electronic systems have been developed in these industries: ISO 26262 in the automotive industry, IEC 60601 in the medical field, and DO-254 in aerospace applications.  These certification standards aim to ensure that no harm will come to systems, their operators, or bystanders by verifying the functional robustness of the implementation.

Clearly no one would want a heart pacemaker (Figure 3) that is not fail-safe to be implanted in a living organism.

Figure 3. Implementation subject to IEC 60601 requirements

The certification standards address a safe system development process by requiring evidence that all reasonable system safety objectives are satisfied.  The goal is to avoid the risk of systematic failures or random hardware failures by establishing appropriate requirements and processes.  Before a system is certified, auditors check that each applicable requirement in the standard has been implemented and verified.  They must identify the specific tests used to verify compliance with each requirement and must also be assured that automatic requirements tracking is available for a number of years.

Dr. Rhines presented a slide that dealt with the following question: “Is your system safe in the presence of a fault?”

To answer the question, verification engineers must inject faults into the verification stream.  Doing this helps determine whether the response of the system matches the specification despite the presence of faults.  It also helps developers understand the effects of faults on target system behavior and assess the overall risk.  Wally noted that formal-based fault injection/verification can exhaustively verify the safety aspects of the design in the presence of faults.
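As a rough illustration of such a fault-injection campaign, the sketch below flips one bit at a time in a toy triple-modular-redundancy model and checks that the voted output still matches the fault-free result.  The model, its width, and the fault list are assumptions made for this example, not material from Dr. Rhines’ slides.

```python
# Illustrative only: exhaustive single-fault injection into a toy
# triple-modular-redundancy (TMR) design, checking the voted output
# against the fault-free ("golden") result.
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote over three redundant 8-bit copies."""
    return (a & b) | (a & c) | (b & c)

def run(value: int, fault=None) -> int:
    """Return the voter output, optionally flipping one bit of one copy."""
    copies = [value, value, value]
    if fault is not None:
        copy, bit = fault
        copies[copy] ^= 1 << bit
    return majority_vote(*copies)

spec = run(0b1011_0010)                      # golden, fault-free result
for copy in range(3):                        # exhaustive single-fault campaign
    for bit in range(8):
        assert run(0b1011_0010, (copy, bit)) == spec, \
            f"fault in copy {copy}, bit {bit} propagated to the output"
print("all single faults masked: output matches the specification")
```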

Conclusion

Dr. Rhines focused on the verification aspects during his presentation and his conclusions covered four points.

  • Despite design re-use, verification complexity continues to increase at 3-4X the rate of design creation
  • Increasing verification requirements drive new capabilities for each type of verification engine
  • Continuing verification productivity gains require EDA to abstract the verification process from the underlying engines, develop common environments, methodologies, and tools, and separate the “what” from the “how”
  • Verification for security and safety is providing another major wave of verification requirements.

I would like to point out that developments in verification alone are not enough.  What EDA really needs is to develop a system approach to the problem of developing and verifying a system.  The industry has given lip service to system design, and the tools available so far still maintain a “silos” approach to the problem.  What is really required is the ability to work at the architectural level and evaluate a number of possible solutions against a well-specified requirements document.  Formal tools provide good opportunities to approximate, if not fully implement, an executable requirements document.  Designers need to be able to evaluate a number of alternatives that include the use of mixed hardware and software implementations, analog and mixed-signal solutions, IP re-use, and electro-mechanical devices such as MEMS.
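As a hedged sketch of what an executable requirements document could look like, the fragment below expresses each requirement as a predicate and scores several candidate architectures against all of them before any implementation begins.  The budgets, requirement names, and candidate architectures are invented purely for illustration.

```python
# Illustrative only: requirements as executable predicates, evaluated
# against hypothetical candidate architectures at the system level.
REQUIREMENTS = {
    "latency_ms <= 5":     lambda c: c["latency_ms"] <= 5,
    "power_mw <= 250":     lambda c: c["power_mw"] <= 250,
    "unit_cost_usd <= 12": lambda c: c["unit_cost_usd"] <= 12,
}

CANDIDATES = {
    "all-hardware":      {"latency_ms": 2, "power_mw": 300, "unit_cost_usd": 15},
    "hw/sw partitioned": {"latency_ms": 4, "power_mw": 220, "unit_cost_usd": 10},
    "all-software":      {"latency_ms": 9, "power_mw": 180, "unit_cost_usd": 6},
}

for name, candidate in CANDIDATES.items():
    failed = [req for req, check in REQUIREMENTS.items() if not check(candidate)]
    verdict = "meets all requirements" if not failed else f"fails: {', '.join(failed)}"
    print(f"{name:20s} {verdict}")
```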

It is useless, or even dangerous, to begin development under false assumptions whose impact will be discovered, if at all, only when designers are well into the implementation stage.  The EDA industry is still focusing too much on fault identification and not enough on fault avoidance.

