
I have always read that delays declared in RTL code can never be synthesized. They are meant only for simulation purposes, and modern synthesis tools will simply ignore delay declarations in the code.

For example, `x = #10 y;` will be treated as `x = y;` by the synthesis tool.

Can someone please explain why delay declarations in a hardware description language (e.g. VHDL, Verilog or SystemVerilog) cannot be synthesized?

Anand
  • This is a [cross posted question](http://electronics.stackexchange.com/q/121121/13513) as it falls between the overlap of SO and ElectronicsSE. – Morgan Jul 14 '14 at 14:29
  • What do you want them to synthesize into? – shrm Jul 14 '14 at 21:28
  • @mishr I want them to synthesize into a delay generating hardware. – Anand Jul 15 '14 at 02:26
  • @Anand VHDL/Verilog provides delays to model the inherent delay of each component (gates, for example), not the other way round. What is the utility of the 'delay generating hardware' you speak of? – shrm Jul 15 '14 at 11:57

2 Answers


In Verilog we can describe logic that changes value on clock edges; this synthesises to flip-flops. We can also describe unclocked Boolean logic, which synthesises to combinatorial logic, just a bunch of ANDs and ORs.
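
As a rough illustration (the module and signal names below are made up for this sketch, not taken from the question), both styles, plus the kind of delay the question asks about, look like this:

    module delay_sketch (
      input  wire clk,
      input  wire a,
      input  wire b,
      output reg  q,   // becomes a flip-flop
      output wire y    // becomes combinatorial logic
    );
      // Clocked assignment: synthesises to a D flip-flop
      always @(posedge clk)
        q <= a & b;

      // Unclocked continuous assignment: synthesises to an AND gate.
      // The #10 is a simulation-only delay; synthesis tools ignore it.
      assign #10 y = a & b;
    endmodule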

When synthesising clock trees, the tool balances the branches of the tree by adding delays so that all nodes receive the clock at the same time, so it would seem that the synthesis tool does have the ability to add delays.

However, when ASICs are manufactured there is a variance in speed which, at a high level, can be viewed as Slow, Typical and Fast corners. In practice there are hundreds of variations of these corners, where certain types of device in the silicon run fast and others run slow.

These corners of the silicon also have a temperature rating; the worst cases might be +140°C Fast silicon and -40°C Slow silicon. The variation of the delay through a buffer across these conditions could be from 1 ns to, say, 30 ns.

To bring this back to Verilog: if #10 were synthesisable, say as a chain of ten such buffers, you would actually get 155 ns ± 145 ns, i.e. anywhere from 10 ns to 300 ns. If you have also designed something with #20 as part of the same interface or control structure, it is going to have a range of 20 ns to 600 ns. The whole thing is therefore no longer valid against your design intent.

Clock trees are designed in a way that caps the maximum and minimum delays, so that all nodes on the clock tree scale relative to each other. They are never given a rule as strict as "it must be #10 ns", because this is physically impossible to guarantee in a combinatorial circuit.

Morgan

There is no such thing as code that can never be synthesized. If you can write code that can be compiled and executed on a piece of hardware, it can be synthesized into hardware. It's just a matter of what synthesis tool vendors have chosen to interpret. There have been behavioral synthesis tools in the past that, knowing the clock cycle period is 100, would interpret

assign #120 A = B * C;

as meaning that the result A will appear in the next clock cycle. But those tools no longer exist. A big problem with this kind of methodology is that you don't know whether these delays are assumptions or requirements.
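
For comparison, here is a minimal sketch (the module and parameter names are my own, not from the original answer) of how that intent is expressed in synthesizable code today: register the result explicitly, and let a separate constraint file define the 100 ns clock period.

    module mult_reg #(parameter W = 8) (
      input  wire           clk,
      input  wire [W-1:0]   B,
      input  wire [W-1:0]   C,
      output reg  [2*W-1:0] A
    );
      // The multiply itself is combinational; the register captures the
      // result, so A holds B * C one clock cycle after B and C change.
      always @(posedge clk)
        A <= B * C;
    endmodule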

The way most synthesis tools are written, all timing information is specified separately from your Verilog description. Only the functional aspects of your description are extracted from your Verilog code. It is much easier to get the timing information for the clocks and input/output requirements from a separate file than to try to analyze your testbench.

dave_59