I am using the Synopsys VCS simulator. My testbench is coded in UVM. I have a set of C routines that perform some standalone functions, and I call these C routines through DPI-C imports in the UVM environment.
Here is a simplified version of the code:
uint64_t blah, var1, blah_1;
var1   = UVM_class::C_function_1(blah);
blah_1 = UVM_class::C_function_2(var1);
if(blah_1 != blah) assert(0);
#
uint64_t C_function_1(uint64_t blah)
{
.....
.....
uint64_t x = function1(...);
return x;
}
#
uint64_t function1(...)
{
uint64_t y;
/* calculate some stuff and assign it to y */
return y;
}
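For reference, the DPI-C imports are declared along the lines of the sketch below (this is not my exact code; the context/pure qualifiers and argument names may differ in my environment). uint64_t on the C side corresponds to longint unsigned on the SystemVerilog side:

import "DPI-C" context function longint unsigned C_function_1(longint unsigned blah);
import "DPI-C" context function longint unsigned C_function_2(longint unsigned var1);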
Here is the issue: when I run this as part of a regression, it works perfectly for about 10,000 iterations.
On roughly the 10,001st iteration, this is what happens.
function1 returns the correct value, and I can see that when I print y. However, when I print x inside C_function_1, x contains something like 0xffffff_fffff_y: the value of y is present, but there is garbage attached to it. This corrupts all subsequent calculations that involve x.
I have read a lot about the stack getting corrupted, and I made sure that every pointer passed as an argument between functions is properly malloc'd and free'd.
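As a purely hypothetical illustration of that pattern (some_helper and the buffer size are made up, just to show the allocation discipline I follow):

#include <stdint.h>
#include <stdlib.h>

static void some_helper(uint64_t *buf, size_t n)
{
    for (size_t i = 0; i < n; i++)
        buf[i] = i;                        /* placeholder work */
}

void caller_example(void)
{
    size_t n = 16;
    uint64_t *buf = malloc(n * sizeof *buf);   /* caller allocates */
    if (buf == NULL)
        return;
    some_helper(buf, n);
    free(buf);                                 /* caller frees */
}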
I also tried running the C portion standalone; there is no error and the regression is clean.
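The standalone run uses a small C harness that looks roughly like this (the loop count and stimulus generation are placeholders, not my real values):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

uint64_t C_function_1(uint64_t blah);
uint64_t C_function_2(uint64_t var1);

int main(void)
{
    for (long i = 0; i < 20000; i++) {
        uint64_t blah   = ((uint64_t)rand() << 32) | (uint64_t)rand();
        uint64_t var1   = C_function_1(blah);
        uint64_t blah_1 = C_function_2(var1);
        if (blah_1 != blah) {
            printf("mismatch at iteration %ld: got 0x%llx, expected 0x%llx\n",
                   i, (unsigned long long)blah_1, (unsigned long long)blah);
            return 1;
        }
    }
    printf("standalone regression clean\n");
    return 0;
}

Compiled and linked against the same C sources, this check never fires.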
The failure only occurs when I run the UVM test that calls these C routines.
I have spent a lot of time debugging this to no avail.
Does anybody have any suggestions?