
I have created a JavaFX application to retrieve data from an FTDI peripheral device. I used JNAerator to generate the API, and everything works beautifully on my development machine (OS X). However, when tested on a coworker's box (Windows), the BridJ `Pointer.getBytes()` method returns byte arrays where every value is off by exactly 128.

Is there a known platform difference, or something else in Java, that would explain this inconsistent behavior? Or is this more likely a problem in the native FTDI drivers?

Is there a cleaner way to resolve it than introducing ugly platform-specific logic to modify every byte read or written?

EDIT

I'm not sure my problem description was clear. Here is a specific example.

I request 3 bytes from the FTDI device to confirm it is ready to send data. I get `[-91, -1, -1]`, which matches the documentation saying to expect "A5 FF FF". My code is written to accept that answer, and everything proceeds just fine.

My coworker gets `[37, 127, 127]`, which is "25 7F 7F". Since that is not the expected value, my code reports an error and exits.
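For reference, this is how those signed Java byte values map to the hex strings above (plain Java, no FTDI code involved):

```java
public class HexDemo {
    // Java bytes are signed; Byte.toUnsignedInt (Java 8+) restores the
    // 0-255 value before formatting as hex.
    static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            if (sb.length() > 0) sb.append(' ');
            sb.append(String.format("%02X", Byte.toUnsignedInt(b)));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toHex(new byte[] {-91, -1, -1}));  // A5 FF FF
        System.out.println(toHex(new byte[] {37, 127, 127})); // 25 7F 7F
    }
}
```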

Jonathon
  • Java bytes are signed. If your device data isn't 8 bits of signed data, you'll need to sign-extend and mask the data, e.g. `int value = ((int)mybyte) & 0xFF;` – technomage Jan 02 '17 at 23:22
  • or more idiomatic since java 8: `Byte.toUnsignedInt(val)` – the8472 Jan 03 '17 at 00:03
  • I use that when I need to convert the byte values to hex, but it does not solve my problem. I added a specific example for clarification. – Jonathon Jan 03 '17 at 18:39
  • 1
    _You_ have the high bits set in each byte, which results in the negative numbers, your coworker does not. Do you have native code that works properly on windows? – technomage Jan 03 '17 at 23:23
  • JNA copies memory directly into your byte array (`Pointer.getByteArray()`) using the JNI function `SetByteArrayRegion`. Nothing magic going on there. What are the JVM make/model/version in question? – technomage Jan 03 '17 at 23:27
  • Your comment reminded me to look at the byte size in the driver documentation. The driver can be set to use either 7 or 8 bit words. I tried setting it to 7 bit words on my computer and can now reproduce the behavior my coworker is experiencing. I am guessing the Mac driver defaults to 8 bit words while the Windows driver defaults to 7 bit words. I will adjust the code to always use 8 bit words and have him try again. – Jonathon Jan 04 '17 at 00:14

1 Answer


Calling `SetDataCharacteristics` to force the driver to use 8-bit words solved my problem.

Jonathon