I am using Cadence's Ethernet eVC, in which the agent's monitor is tapped to the following signals:
.           ____________                    _____
.clk ______|            |__________________|
.     ________ _______ ________________ _________
.data __0a____X__07___X______0b________X_________
.                     ^ ^
The monitor samples data on both the rising and falling edges of the clock. In the example above, 0x07 is garbage data; the valid values are 0x0a (sampled at the clk rise) and 0x0b (sampled at the clk fall). However, at the falling edge the monitor samples 0x07 instead of 0x0b!
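To make the sampling scheme concrete, here is a rough sketch of what I understand the monitor to be doing (the unit, port, and path names are mine for illustration, not the eVC's actual code):

    <'
    // Sketch only -- names are hypothetical, not the Ethernet eVC's code.
    unit dual_edge_monitor_sketch {
       clk_p  : in simple_port of bit is instance;
       data_p : in simple_port of uint (bits: 8) is instance;
       keep bind(clk_p, external);
       keep bind(data_p, external);
       keep clk_p.hdl_path()  == "clk";    // hypothetical HDL paths
       keep data_p.hdl_path() == "data";

       // Sample on both edges of the clock, in simulator time
       event clk_rise is rise(clk_p$) @sim;
       event clk_fall is fall(clk_p$) @sim;

       on clk_rise {
          message(LOW, "rise sample: ", hex(data_p$));  // expect 0x0a
       };
       on clk_fall {
          message(LOW, "fall sample: ", hex(data_p$));  // expect 0x0b, I get 0x07
       };
    };
    '>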
I suspect this is a Specman-simulator synchronization issue. If it is, how can it be resolved?
- Simulator - IES 13.10
- irun 13.10 options (I'll include only those that I think could be relevant to the issue, plus those whose purpose I don't know yet):
-nomxindr -vhdlsync +neg_tchk -nontcglitch +transport_path_delays -notimezeroasrtmsg -pli_export -snstubelab
- Languages - VHDL (top testbench), Verilog (DUT), Specman (virtual sequence, Enet and OCP eVCs)
- Time between the end of 0x07 (left ^ in the waveform above) and the falling edge of the clock (right ^) = 0.098 ns
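Given that sub-nanosecond skew, the only workaround I've come up with so far is to defer the fall-edge sample slightly past the edge, e.g. by one Specman time unit (the sketch below uses the same made-up names as above, and assumes one Specman time unit is larger than the 0.098 ns skew):

    <'
    extend dual_edge_monitor_sketch {
       // Defer the fall-edge sample so it lands after the data transition.
       deferred_fall_sample() @clk_fall is {
          while TRUE {
             wait cycle;       // the falling edge itself
             wait delay(1);    // one Specman time unit later, past the skew
             message(LOW, "deferred fall sample: ", hex(data_p$));
          };
       };
       run() is also {
          start deferred_fall_sample();
       };
    };
    '>

But that feels like papering over the mismatch rather than fixing the synchronization itself.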
A colleague suggested using -sntimescale, but I still can't see how the timescale could be causing the issue or how that option would resolve it. None of these search strings turned up helpful hints, not even among the Cadence articles: "specman tick synchronization delta delay timescale precision"
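For reference, my understanding is that the suggestion amounts to adding the option to the existing irun invocation, roughly like this (the 1ns/1ps argument is only my guess at the format; I have not verified it against the irun documentation):

    irun -nomxindr -vhdlsync +neg_tchk -nontcglitch +transport_path_delays \
         -notimezeroasrtmsg -pli_export -snstubelab \
         -sntimescale 1ns/1ps \
         ... (files and remaining options)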