The future of VLSI verification

The future of VLSI verification is in trouble.


Chip verification is slowing down innovation. Is verification up to the task?


I have seen verification engineers desperately in need of modern tools and flows to deal with rapidly increasing design size.


Existing tools are not capable of dealing with system-level issues, and that is where the most complex bugs hide. A single verification mistake can cost billions.


Verification teams are expected to do more in less time.


I have seen only incremental improvements in tool capacity and performance, while design sizes have increased rapidly.


Do we need to catalyse every verification engineer, because we simply haven't got enough of them?


But how?


AI can assist by drawing correlations across simulation data, and regressions can be optimised in several ways.
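
As a toy illustration of one such optimisation (the test names, coverage points, and data below are invented for the sketch), a greedy pass over per-test coverage data can shrink a regression while keeping the same coverage:

```python
# Toy sketch: shrink a regression by greedily picking tests that add new coverage.
# Test names and coverage points are made up for illustration.

def reduce_regression(coverage_by_test):
    """Greedy set cover: pick tests until all coverage points of the full suite are covered."""
    all_points = set().union(*coverage_by_test.values())
    selected, covered = [], set()
    while covered != all_points:
        # Pick the test that adds the most not-yet-covered points.
        best = max(coverage_by_test, key=lambda t: len(coverage_by_test[t] - covered))
        gained = coverage_by_test[best] - covered
        if not gained:
            break  # remaining tests add nothing new
        selected.append(best)
        covered |= gained
    return selected

coverage_by_test = {
    "smoke_axi":  {"axi_read", "axi_write"},
    "stress_axi": {"axi_read", "axi_write", "axi_backpressure"},
    "dma_basic":  {"dma_copy"},
    "dma_corner": {"dma_copy", "dma_abort"},
}

print(reduce_regression(coverage_by_test))
# ['stress_axi', 'dma_corner'] -- half the tests, same coverage points
```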

After a change in the design, AI could determine which tests target that area and run only those.
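
A minimal sketch of that idea, assuming we already log which design units each test exercised in previous regressions (all names here are hypothetical):

```python
# Hypothetical sketch: select tests whose historical coverage touches the changed design units.

# Per-test record of which design units each test exercised in past regressions.
units_hit_by_test = {
    "smoke_axi": {"axi_if", "arbiter"},
    "dma_basic": {"dma_ctrl", "arbiter"},
    "usb_enum":  {"usb_phy"},
}

def tests_for_change(changed_units, units_hit_by_test):
    """Return the tests whose past coverage overlaps the changed design units."""
    return sorted(
        test for test, units in units_hit_by_test.items()
        if units & set(changed_units)   # any overlap with the change
    )

# A commit touched the arbiter: run only the tests that have exercised it before.
print(tests_for_change(["arbiter"], units_hit_by_test))
# ['dma_basic', 'smoke_axi']
```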


Many problems are caused by ambiguity in the design specification. 

We can use GenAI large language models to parse the specification and act as a co-pilot for the verification engineer.
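
As a rough sketch of what such a co-pilot could look like (the prompt, the `ask_llm` callable, and the spec text are placeholders of mine, not a real tool or API):

```python
# Rough sketch of a spec co-pilot. `ask_llm` stands in for whatever LLM API you use;
# it is a placeholder, not a real library call.

PROMPT = """You are a verification co-pilot.
From the specification excerpt below, list:
1. The testable requirements, one per line.
2. Any statements that are ambiguous and need clarification from the architect.

Specification excerpt:
{spec}
"""

def review_spec(spec_text, ask_llm):
    """Ask an LLM to turn a spec excerpt into requirements and open questions."""
    return ask_llm(PROMPT.format(spec=spec_text))

if __name__ == "__main__":
    # Stubbed-out model so the sketch runs on its own.
    fake_llm = lambda prompt: "1. FIFO must assert 'full' within one clock of reaching capacity."
    spec = "The FIFO shall assert 'full' within one clock of reaching capacity."
    print(review_spec(spec, fake_llm))
```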
