I am using Python 3.4 in a venv.
I am writing a script for a sensor where, after reading a configuration file, I need to send an int to the serial port as a bytearray. A snippet of the class method:
    def set_sampling(self, type, sampling_us):
        conf_buffer = bytearray([0xfe, 0x06])
        if type not in self.sensor_ids:
            print('Sensor not identified')
        else:
            conf_buffer.append(self.sensor_ids[type])
            conf_buffer.append(0x02)
            if sampling_us > 0:
                print(sampling_us)
                sampling_bytes = sampling_us.to_bytes(4, 'little')
                conf_buffer += sampling_bytes
                self.send_frame(conf_buffer)
                self.enable(type)
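As an aside, the same 4-byte little-endian conversion can also be written with the standard struct module; this is just an equivalent sketch, assuming sampling_us fits in an unsigned 32-bit int:

    import struct

    # '<I' = little-endian unsigned 32-bit int; equivalent to
    # sampling_us.to_bytes(4, 'little') for values that fit in 4 bytes.
    sampling_bytes = struct.pack('<I', 1000000)
    print(sampling_bytes)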
The frame structure is 0xfe 0x06 sensor_id 0x02 sampling_us, where sampling_us should be in little-endian format. I currently have sampling_us as 1000000 (i.e. 1 second in microseconds).
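For concreteness, here is a sketch of building the full frame for that layout, using a hypothetical sensor id of 0x01 (the real id comes from self.sensor_ids):

    # Frame layout: 0xfe 0x06 sensor_id 0x02 sampling_us (little-endian).
    # 0x01 is a made-up sensor id for illustration only.
    frame = bytearray([0xfe, 0x06, 0x01, 0x02])
    frame += (1000000).to_bytes(4, 'little')
    print(frame)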
When I perform the following in the interpreter:

    >>> (1000000).to_bytes(4, 'little')
    b'@B\x0f\x00'
However, I cross-checked with a reference script for the sensor, where the bytes for 1000000 are given as b'\x40\x42\x0f\x00'.
I reversed the check by performing:

    >>> int.from_bytes(b'\x40\x42\x0f\x00', 'little')
    1000000
The correct bytes seem to be b'\x40\x42\x0f\x00', as the sensor does not respond when the bytearray I send is b'@B\x0f\x00'. Why am I getting a discrepancy here? What am I doing wrong?