
Here are two similar constraint blocks, one written using decimal notation, and the other using hexadecimal notation. The first works as expected, but the second only generates positive values (including 0) out of the 5 available values:

-- positive and negative values generated as expected
var rnd_byte : int(bits: 8);
for i from 0 to 9 {
  gen rnd_byte keeping {
    soft it == select {
      90 : [-1, -128, 127, 1];
      10 : 0x00;
    };
  };
  print rnd_byte;
};

-- only positive values (including 0) generated!!!
var rnd_byte : int(bits: 8);
for i from 0 to 9 {
  gen rnd_byte keeping {
    soft it == select {
      90 : [0xFF, 0x80, 0x7F, 0x01];
      10 : 0x00;
    };
  };
  print rnd_byte;
};

How can I make the second example behave like the first one while keeping the hexadecimal notation? I don't want to write large decimal numbers.

evilpascal

3 Answers


0xff and 0x80 are not in the range of the rnd_byte data type. You need to declare rnd_byte as uint(bits: 8). Alternatively, try typecasting the literals (I could not verify the syntax): (0xff).as_a(int(bits: 8))
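A sketch of how that cast could look in the original select block (the as_a() syntax on literals is unverified, as noted above; only 0xFF and 0x80 fall outside the int(bits: 8) range, so only those two need the cast):

-- sketch: cast only the two literals that exceed the signed range
var rnd_byte : int(bits: 8);
for i from 0 to 9 {
  gen rnd_byte keeping {
    soft it == select {
      90 : [(0xFF).as_a(int(bits: 8)), (0x80).as_a(int(bits: 8)), 0x7F, 0x01];
      10 : 0x00;
    };
  };
  print rnd_byte;
};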

Thorsten

Some more about this issue: in procedural code there is auto-casting, so you can write

var rnd_byte : int(bits: 8);
rnd_byte = 0xff;

and it will result in rnd_byte == -1.

Constraints, however, work with int(bits: *) semantics (numbers keep their natural values), and this code would fail:

var rnd_byte : int(bits: 8);
gen rnd_byte keeping {it == 0xff};

As suggested, to get 0xff, define the field as unsigned.
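A minimal sketch of the unsigned variant, using the same loop as in the question; with uint(bits: 8) the hex literals are all in range, and 0xFF and 0x80 print as 255 and 128 rather than -1 and -128:

var rnd_byte : uint(bits: 8);
for i from 0 to 9 {
  gen rnd_byte keeping {
    soft it == select {
      90 : [0xFF, 0x80, 0x7F, 0x01];
      10 : 0x00;
    };
  };
  print rnd_byte;
};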

user3467290

In procedural code, automatic casting between numeric types takes care of the vast majority of cases. However, in generation, numbers are viewed by their natural values, as in int(bits: *) semantics. Hex notation means the value is unsigned.
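A short sketch of the contrast (the as_a() cast on a literal is an assumption, echoing the first answer):

var rnd_byte : int(bits: 8);
rnd_byte = 0xFF;                    -- procedural: auto-cast, rnd_byte becomes -1
gen rnd_byte keeping {
  it == (0xFF).as_a(int(bits: 8));  -- generation: 0xFF by itself means 255, so cast it
};
print rnd_byte;                     -- expected to print -1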

Rodion Melnikov