Freewheeling on Formal

ICCD 2006 panel extends the reach ...


by Peggy Aycinena

October 31, 2006

********************************

Tuesday, October 3rd, was a good day for the verification world. A lunchtime panel – "Extending the Reach of Formal Verification for Electronic Design" – held in conjunction with the IEEE International Conference on Computer Design, proved as tasty as the buffet lunch offered up that day at the DoubleTree Hotel in San Jose.

EDN's Ron Wilson moderated, while Real Intent's Rich Faris, NVIDIA's Prosenjit Chatterjee, Mentor Graphics' Steve White, industry expert Tom Anderson, and Synopsys' Swami Venkat were arrayed along the table adjacent to the podium. The conversation was substantive.

Ron Wilson began with a few brief comments and then opened the forum to Q&A – always a great way to promote conversation. He noted that when formal verification burst onto the scene, it seemed possible at last to submit RTL or ESL code to the commercially available tools to see if a design worked correctly or not. The reality has not been quite that idyllic, however, he said. Some of the tools have worked, some have not, and some require "three PhDs and a computer scientist" to actually produce any kind of useful verification answer.

** Ron Wilson suggested the panelists start out by defining the various pieces that constitute formal verification to help explain where the situation is today.

Rich Faris said Real Intent is a formal verification tool vendor. They've got an equivalence-checking tool that looks at C code and RTL and makes sure they're the same, and they've also got a model checker. They focus on static, formal ABV (assertion-based verification) with assertions written in PSL.

Steve White said there are various ways of applying formal: there's static formal, with exhaustive proofs and counterexamples; there are established properties that need to be checked; and there's the idea of marrying simulation with formal techniques. He added that looking for bugs is not exhaustive, but does produce a range of counterexamples.

Swami Venkat said he agreed with Steve about property verification. A property must be synthesized along with the RTL, he said, and can provide constraints or assertions to be proven.

** Ron Wilson asked the panel to define all of that from the user's perspective.

Tom Anderson said he believes there's a verification continuum that goes from LINT checking, to static checking, to clock-domain checking – but in an ideal world people would write automatic checks. In addition, there should be automatic checking underlying the tests.

Prosenjit Chatterjee said there's applied formal, and there's property verification to prove properties. These can be applied on many projects, but only among selected design teams. He added that NVIDIA is in transition from formal alone to formal plus simulation to generate coverage vectors.

** To clarify the lexicon, Ron Wilson asked the panelists to define a property.

Prosenjit Chatterjee said it's describing the functional aspects of a design using assertions.

Rich Faris said it's something that can span time within the design, and that it's different from a testbench.
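
To make the lexicon concrete, here is a minimal sketch of a property in SystemVerilog Assertions – a language the panel returns to below – showing the "spanning time" quality Rich Faris describes. The handshake signals are hypothetical, not from any panelist's design:

    // A property constrains behavior across a window of cycles,
    // not at a single instant the way a testbench check does:
    // every request must be granted within 1 to 3 clocks.
    module handshake_props (input logic clk, rst_n, req, gnt);

      property req_gets_gnt;
        @(posedge clk) disable iff (!rst_n) req |-> ##[1:3] gnt;
      endproperty

      a_req_gnt: assert property (req_gets_gnt);

    endmodule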

** Ron Wilson asked the panelists to what extent people are really applying this stuff. He noted that a couple of years ago, it was all considered experimental. Is that still the reality today, he wanted to know.

Swami Venkat said Synopsys is focusing on property verification, not equivalence checking. Some properties, he said, are written by users who expect the formal tools to prove or disprove them. Some tools take in the design and verification assertions for things like tri-state busses or multiple clock domains. More and more people are using that, he said, but it's still an art. It's a trick to define a constraint and to use the declarative language. Until people are comfortable describing assertions, he added, only a handful of people will use the technology.

Tom Anderson said that, from his perspective as an EDA consultant, his clients doing "average" designs are not using formal and have no experience with it, while those clients with a dedicated verification team are using it. In either case, though, assertions are becoming mainstream – most customers have experience with assertions today, he said. Answering Ron Wilson's lexicon question, Tom said properties come from assertions in the design.

** Still pursuing the lexicon, Ron Wilson asked how an assertion differs from a property.

Tom Anderson said that a property is something that lives inside a formal tool, and that an assertion is equivalent to a property for the sake of the panel discussion.

Steve White reminded the audience that he had founded 0-In Design (acquired by Mentor Graphics in 2004), and that there's been a transition from "What's an assertion and its value?" to "How do I deploy it?" He noted that every design services company is using formal verification, whether they're designing microprocessors, embedded controllers, or video processors. Steve White warned, however, that formal's not a substitute for simulation. Still, with things like clock-domain crossings or static-timing constraints, formal's the only way to be sure they're correct. In a design flow, many still see formal as something only for experts and specialists, he added, but even that's changing as people become better educated about the technology and the user community matures.

Prosenjit Chatterjee said formal's a technology that's being picked up slowly. He said NVIDIA has been applying it for 4 years. But when there's a lack of resources – say, you have a particular project today that's using conventional simulation or emulation – who's going to do the formal work? The value is definitely there, however, so why not use it across the entire chip? That's possible if there are additional engineers, he said, answering his own question. And as the tools have matured, they are being applied to bigger designs. Prosenjit Chatterjee estimated that 20 percent to 30 percent of designs today are using assertion-based verification.

Rich Faris said that every large design gets touched these days by some formal verification, but he would like to see it applied to even more designs – not just those with large verification teams and design teams.

** For further clarification, Ron Wilson asked if the formal tools are for designers or verification engineers.

Rich Faris said the question has to be looked at through a software paradigm. Years ago, software engineers put assertions into their code – scattered throughout the code. It should be the same for hardware today, he said. The designer should be asking, "What assertions am I making?" There still needs to be a lot of work to make that happen, he added, but eventually SystemVerilog Assertions will make a big difference.
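
The software analogy translates almost line for line. Where a C programmer scatters assert() calls, a designer can embed an immediate assertion right in the RTL – a minimal sketch, with a hypothetical one-hot mux select:

    // The hardware analog of C's assert(): flag any cycle,
    // out of reset, in which the mux select is not one-hot.
    module mux_guard (input logic clk, rst_n, input logic [3:0] sel);

      always_ff @(posedge clk)
        if (rst_n)
          assert ($onehot(sel))
            else $error("sel not one-hot: %b", sel);

    endmodule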

Prosenjit Chatterjee said designers should be using test plans that are formal friendly – and what are the properties there? He said it's not necessary to write assertions before the RTL. They can be written while writing the RTL, if there's a flow that encourages that, to describe what the corner cases are – particularly if there's a separate verification team.

Steve White said verification tools that require designer input impede adoption. The reality is that everybody is understaffed, but constraints are still the real issue – it's counterintuitive to think in terms of constraints, he said. The barriers must be lowered to see broad adoption. Clearly, NVIDIA has had the right idea, Steve White added: they bought into test planning from the beginning and thought about their strategy. He noted that there has to be a culture change in the industry, where people think about methodologies in terms of ROI, and added that standards are helping here.

Tom Anderson said that the planning process is important. He said his customers have a strategy to work with their design and verification engineers, and sometimes their architects as well, to call out corner cases. Assertions can deal with those illegal corner cases and customers are really understanding that, he said.

Swami Venkat said that customers don't want design engineers verifying designs. He said his team comes across customers who have verification engineers writing assertions, but it's always easier for the designers to write the assertions. And although designers find it comfortable to do that for some of the complex protocols, it's better if the verification engineers do that work.

** Ron Wilson then asked a trick question – aren't assertions part of the design intent?

Steve White said people are expanding the definition of what constraint means. It's important to do it, he said, but the tools are starting to do things that are useful – things like automatically detecting reconvergence. Every designer has tons of clocks to deal with, he said, when they're dealing with large designs. And that's when formal is the most exciting, when there are clear benefits and the technology is directly applicable. That's the part where vendors find responses from lots of design teams.

** Ron Wilson opened the floor to questions. Someone asked about state-space coverage.

Swami Venkat said to look to SVA – SystemVerilog Assertions – which lets users specify assertions. What's hyped now, he said, is using formal tools without having to change anything. Simulation and coverage tools can provide the metrics, but users need to define the coverage targets.
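
A coverage target of the kind Swami Venkat describes is just a cover directive on a property. A minimal SVA sketch, again with hypothetical handshake signals:

    // Ask the tool to show (or count) at least one trace in
    // which a request is granted on the very next cycle.
    module handshake_cov (input logic clk, req, gnt);

      c_back_to_back: cover property (@(posedge clk) req ##1 gnt);

    endmodule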

** Another question from the floor asked if formal equivalence can also cover the state space.

Swami Venkat said that different engines cover the state space differently. Each vendor's tool has a different state machine, he said, and some provide programs for a particular state machine. Progress is being made, but there is also a lot of work still to be done there, he added.

Steve White was more emphatic – it's not possible today to cover the full state space with formal equivalence checking. A modern design has more states than there are atoms in the universe, he noted wistfully. So, in the case of the logic for a given assertion, it's a question of exhaustive proofs versus bounded searches. Assertions must be described in a heuristic manner, he added. "Do I have sufficient coverage?" is the state of the art today.
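
Steve White's arithmetic is easy to check, since the state count grows as 2 to the power of the number of state bits:

    2^300 ≈ 2 x 10^90   candidate states for a block with just 300 flip-flops
    10^80               rough estimate of atoms in the observable universe

So even a modest block, let alone a full chip, is beyond brute-force enumeration.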

** Wistful as well, Ron Wilson asked if there are any tools other than just experience for extracting assertions.

Steve White said it depends on the distance from the registers to the assertions. If you're, say, 5 registers from the assertion, you're probably okay. And you just know that from experience, which is why designers are appreciative when they're kept in the loop, he said.

Rich Faris said there's a lot more work to be done there, but it would help if we standardized on one format.

Tom Anderson added that each tool has its own metric for assertion density.

Prosenjit Chatterjee said he doesn't think a bounded proof is sufficient. He added that sometimes assertions are more simulation friendly than formal friendly. For example: a FIFO doesn't drop data. That's assertion friendly, but not formal friendly. In the microarchitecture phase, he said, you just write high-level assertions. But during verification, you see whether the property needing proof is expressed in the right form or not.
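
The FIFO example is worth unpacking. "Doesn't drop data" is easy to state and easy to check in simulation, but for a formal tool it's an unbounded liveness claim. A minimal sketch of the usual tactic – track one arbitrary word end to end – with hypothetical port names:

    // Watch a single pushed word and assert that it eventually
    // reappears at the output. Simulation-friendly; for formal,
    // the unbounded ##[1:$] is exactly what makes proofs hard.
    module fifo_no_drop #(parameter WIDTH = 8)
      (input logic clk, rst_n, push, pop,
       input logic [WIDTH-1:0] din, dout);

      logic [WIDTH-1:0] tracked;
      logic             watching;

      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n)
          watching <= 1'b0;
        else if (!watching && push) begin
          tracked  <= din;
          watching <= 1'b1;
        end
        else if (watching && pop && dout == tracked)
          watching <= 1'b0;

      no_drop: assert property (@(posedge clk) disable iff (!rst_n)
        watching |-> ##[1:$] (pop && dout == tracked));

    endmodule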

** Another question from the floor asked the panel what kind of assertions a designer should write versus those the verification engineer should write.

Prosenjit Chatterjee said that, for the microarchitect, you should see that the test plan is completed – what is the high-level functionality needing to be proved? The verification engineer, though, sees to it that what's written in English is being checked in the verification. Therefore, it's very designer dependent. It's not the high level that governs the syntax.

** Another question from the floor asked about the difficulty of training designers to use a formal engine.

Prosenjit Chatterjee said that data out is data in, but assertions are what designers need to accelerate the process of design. At one of NVIDIA's customers in Japan, he said, they ran their specifications through formal to determine what somebody would like to see in the system design.

Tom Anderson answered the question with a question: "What level of granularity is there without people writing formal assertions for the complete chip?"

Swami Venkat said SVA does a good job of verifying designs with assertions. It makes things a lot easier – there's a specification available that's all written down. It's bringing the industry a lot closer to having the specification defined by the assertions, he said.

** Another question from the floor was about retiming latches, or power-saving modes – what verification methodology did the panel consider to be accurate?

Swami Venkat said when there's a team, it's whatever the verification engineers want to use.

Prosenjit Chatterjee said, in general, NVIDIA doesn't have separate verification teams, so it's whatever the team wants to use overall.

** A forward-looking question from the floor inquired about automatic assertions.

Rich Faris said that's going to be important for coping with different parts of the design coming from different design teams within a company, or for coping with IP from a third-party vendor.

Prosenjit Chatterjee said if a team designs a FIFO or a repetitive structure, you should have your own flow where good properties are extracted automatically from the project as a whole. But often there isn't time to write assertions. If a designer has no time, the verification team can read the documentation and write the assertions.

** Ron Wilson asked if teams ever ask for more time for verification.

Swami Venkat said there are some things like FIFOs or state machines, where now we write checker libraries. And, Synopsys just donated a checker library to Accellera.
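
The checker-library idea is easy to picture: a reusable module of assertions that's bound to a design instance without touching its RTL. A minimal sketch – the checker and the target instance name here are hypothetical, not the donated Synopsys library:

    // A reusable FIFO checker: never push when full, never pop
    // when empty, tracked with a simple occupancy counter.
    module fifo_checker #(parameter DEPTH = 16)
      (input logic clk, rst_n, push, pop);

      int unsigned count;
      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n) count <= 0;
        else        count <= count + (push && !pop) - (pop && !push);

      no_overflow:  assert property (@(posedge clk) disable iff (!rst_n)
                      push |-> count < DEPTH);
      no_underflow: assert property (@(posedge clk) disable iff (!rst_n)
                      pop  |-> count > 0);

    endmodule

    // Bound into an existing design unit without editing its RTL
    // (my_fifo is a hypothetical module with matching signal names):
    bind my_fifo fifo_checker #(.DEPTH(16)) u_chk (.*);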

** A difficult question from the floor addressed the barrier of the hardcore user and their attitudes toward using automatic assertions. How big is the gap between the ideal and the actual technologies being used?

Steve White said that for general-purpose design, the gap is closing quickly. It's coming, he said, through a combination of automatic checkers, libraries from industry, and the ability to express assertions in a credible way. The industry is light years ahead of where it has been, he said.

Swami Venkat said there are more users of verification IP, plus there's support for verification coverage engines with built-in transaction coverage monitors being used very broadly. For formal, the challenge is to define the constraints. He said Synopsys works closely with its customers to write those constraints.

** Ron Wilson asked whether, if a committee is working on protocols, that work should also include assertions.

Tom Anderson said standard sets of assertions can be downloaded from the web.

Rich Faris said to check the protocol rules – standards are really going to help out here. He added that you could consider verification to be a black-box problem, but an engineer wants to look at the proprietary aspects of the chip – the proprietary internal states.

** As the discussion wound down, a question from the floor asked about management's commitment to verification.

Prosenjit Chatterjee said it can be a problem, because bugs might be fixable anyway. But seeing that, the engineers often wish they had written assertions beforehand. Now assertions should be required, not just recommended. Formal was a curiosity in the beginning, he said, and it's taken time to be accepted. Today, however, verification teams are using formal.

** A final question from the floor asked which is more promising as an ideal – having assertions to verify against, or using specifications to create the design by constraints?

Steve White said there is potential to have executable assertions, but that's a long way off. It's practical to use assertions nonetheless, he said, and Mentor sees people using formal techniques for different parts of their design – wherever people are iterating their designs.

Rich Faris said the question gets into equivalence checking and reminded the audience that models are ultimately the thing designers need.

** The hour was up, Ron Wilson thanked his panelists, and the audience rushed off to their afternoon sessions.

********************************

I went to the Special Session on Hardware Equivalence and heard Dr. Alan Hu from the University of British Columbia – "High-Level versus RTL Combinational Equivalence."

After his talk, Professor Hu kindly agreed to respond to some questions by e-mail. Published both here and in my EDA Weekly on verification planning, his responses help to clarify some of the ideas currently swirling around in the world of formal and beyond:

** Please define formal verification.

Alan Hu – Formal verification means proving a property about a model of a system. "Proving" is in the sense of mathematical proof – the highest level of confidence of anything known to humanity. In other words, you have 100% coverage of all possible behaviors of your (model of your) system.

"Property" means you have to specify what you're proving, and this is the same problem whether you're working with formal or informal verification. If you don't know what you want (you don't specify to look for a particular case), you won't know to verify it. And "model" means you're always verifying something other than the actual product in the customer's hands. The model could be the layout, it could be gates, it could be a very abstract protocol, or an algorithm.

** Please distinguish between an assertion and a property.

Alan Hu – I don't. Keep in mind that I'm speaking with the luxury of being an academic. In different contexts, some communities may draw a distinction.

For example, some people will say that "assertions" are embedded in the design, whereas "properties" are external to it. Other people will have other definitions. I don't think any usage has become the de facto standard, so if I'm talking to someone, I'll just keep listening until I figure out what distinction they're really trying to make.

** At what point did formal verification become more than an "academic curiosity?"

Alan Hu – What we're seeing is a gradual process, and it's hard to cite a defining moment. Even thirty years ago, for certain niche markets, formal verification was already being used. Twenty years ago, companies like IBM had started doing formal equivalence verification in-house. Ten years ago, formal equivalence checking was taking over the RTL-to-gate market, and the big semiconductor and EDA companies (Intel, IBM, Cadence, Synopsys, etc.) and several start-ups were all ramping up on model checking.

Today, the model-checking (or property-checking) tools are gaining market traction, and RTL-to-gate equivalence checking is so completely owned by formal tools that most people don't think of it as formal verification anymore. Ten years from now, even more formal techniques will be mainstream parts of the design flow, but some formal verification problems will still be in the realm of academic research. It's important not to keep redefining formal verification to mean whatever we can't quite do yet.

** Is it possible to verify a large SOC without using some form of formal verification?

Alan Hu – Sure. You can carefully hand-pick exceptionally good design and verification engineers, choose an extremely conservative (i.e., low performance) architecture and design that's "obviously correct," buy a ton of emulation boxes and server farms, simulate for a really, really long time, and do some extra spins of silicon until you get most of the bugs out.

The real question is whether it's cost-effective to do a large SOC without using some form of formal verification. And I'd have to say the answer is no. People vote with their wallets. Everyone these days is using formal for verifying RTL-to-gate equivalence, for example. People have decided that's the most cost-effective way to do that step.

For property checking, the value proposition is less clear. All of the people doing the most complex chips – the microprocessors and GPUs and very complex SOCs – are all either buying or developing in-house formal property checking tools. In some cases, they are developing extremely sophisticated formal methodologies, as well.

So, on the leading edge, the value proposition is definitely there. For the trailing-edge designs, though, people apparently can get by with older technology for a bit longer.

********************************


Peggy Aycinena owns and operates EDA Confidential. She can be reached at peggy@aycinena.com


Copyright (c) 2006, Peggy Aycinena. All rights reserved.