
I have a hex string, FFFF, that represents a signed value. When I convert it to an integer it should be -1, but int.Parse/int.TryParse and System.Convert.ToInt32 give me 65535.

var rawValue1 = int.Parse("FFFF", System.Globalization.NumberStyles.HexNumber);
// rawValue1 = 65535
var rawValue2 = Convert.ToInt32("FFFF", 16);
// rawValue2 = 65535

Any ideas how I can get the parse/convert functions to recognise it's a signed hex string?

Kevin Brydon
  • FFFF is 65535 as an Int32, so... Are you asking "is there some type like `System.Int16`, which is normally called `short` in other languages?" – Alexei Levenkov Nov 24 '20 at 16:50
  • I think there is an answer to your question in [this post](https://stackoverflow.com/questions/3705429/how-do-i-convert-hex-string-into-signed-integer) – TuDatTr Nov 24 '20 at 16:52
  • I thought there might be some way to indicate to the conversion function that the hex was "signed", so it would recognise that hex strings starting "FF..." were going to be negative. Something to do with two's complement? – Kevin Brydon Nov 24 '20 at 16:55

1 Answer


FFFF is -1 if it is converted to a 16-bit signed integer. So you can try:

var x = Int16.Parse("FFFF", System.Globalization.NumberStyles.HexNumber); // x will be -1
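If you then need the value as a plain int, the implicit short-to-int conversion sign-extends, so it stays -1 (a small sketch building on the line above):

short x = Int16.Parse("FFFF", System.Globalization.NumberStyles.HexNumber); // -1
int y = x; // implicit widening conversion sign-extends, so y is also -1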

If you convert it to 32 bits (which is what int.Parse does), it will be 65535, because the value occupies only the lower 16 bits and the sign bit (bit 31) will be 0:

var x = Int32.Parse("FFFF", System.Globalization.NumberStyles.HexNumber); // x will be 65535
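
Alternatively, to get the two's-complement reinterpretation mentioned in the comments, you can parse the digits into the unsigned type of the same width and then cast the bits to the signed type. A self-contained sketch (one way to do it, not the only one):

using System;
using System.Globalization;

class HexSignDemo
{
    static void Main()
    {
        // Parse the raw bits as an unsigned 16-bit value.
        ushort bits = ushort.Parse("FFFF", NumberStyles.HexNumber); // 65535

        // Reinterpret the same bits as a signed 16-bit value (two's
        // complement); unchecked makes the narrowing cast wrap instead
        // of throwing if overflow checking is enabled. The short then
        // widens to int with sign extension.
        int value = unchecked((short)bits);

        Console.WriteLine(value); // -1
    }
}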
tozlu