Adding Verilog Timing support

Hello all,

I’m an engineer at a semiconductor firm, and we are interested in providing QSpice models of our parts. For our use case, the digital models are written in Verilog and need support for delay statements; converting them to C++ is not viable. Verilator supports most timing constructs when invoked with the --timing option (the unsupported features are not a problem for our use case). It appears that QSpice does not invoke Verilator with this option, and the C++ wrapper it generates for Verilog models does not implement the needed functionality anyway.

It looks like delay support could be implemented roughly as follows:

  1. Invoke Verilator with the --timing option.
  2. Add MaxExtStepSize to the Verilated C++ wrapper; it can call nextTimeSlot() on the Verilated model to limit the step size.
  3. The model eval function in the wrapper needs to update the Verilator context with the current simulation time. As far as I can tell this is not done currently, which seems wrong, so it’s probably just being done somewhere I don’t know about.
  4. Trunc likely needs to be modified as well, although I’m less sure about the changes needed there.
  5. Compile/link/etc. into a .dll.
  6. Simulate.

The most significant issue I see at the moment is that --timing generates C++ that requires C++20 support in the compiler, which the Digital Mars toolchain doesn’t seem to provide. This could be an issue for adding timing support natively in QSpice, but isn’t an issue for models compiled externally.

Anyone have any thoughts or pointers for implementing this? I’m going to attempt to implement delay support using the approach outlined above and see how far that gets me.


Might be best to talk to this guy: mike.engelhardt@qorvo.com

If it can be implemented, I would use it. IMO I’d rather write Verilog for simulations than C++ because it ports directly to an FPGA. I have been using the current Verilog implementation, but I have to drive the Verilog block with a 1 GHz pulse, which forces the simulation to take sub-ns steps, and the whole thing runs slowly. I also have to simulate the delay (filter and comparator) on the outputs of the FPGA.
