I'm writing code that reads data from a Student ID card. After card initialization and selecting the directory and file on the card, I start reading data from it with READ BINARY (INS = 0xB0).
According to ISO 7816-4 (Interindustry Commands for Interchange):
If bit 8 = 0 in P1, then P1||P2 is the offset of the first byte to be read, in data units, from the beginning of the file.
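The way I read that rule (just a sketch of my interpretation, not code from the standard), an offset expressed in data units is split big-endian across P1 and P2, with bit 8 of P1 kept at 0:

```java
final class OffsetEncoding {
    // Split an offset (counted in data units) into P1||P2 for READ BINARY,
    // keeping bit 8 of P1 clear so the two bytes are interpreted as an offset.
    static byte[] toP1P2(int offsetInDataUnits) {
        byte p1 = (byte) ((offsetInDataUnits >> 8) & 0x7F);
        byte p2 = (byte) (offsetInDataUnits & 0xFF);
        return new byte[] { p1, p2 };
    }
}
```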
I'm reading the data in 16-byte chunks. In the first iteration I want to read from offset = 0,
so I set P1 = 0x00
and P2 = 0x00,
and everything works fine: the first 16 bytes are transmitted from the card.
Problems start in the second iteration. The next 16 bytes should be read from offset = 16,
so I set P1 = 0x00
and P2 = 0x10,
but I receive data from offset = 64.
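The loop itself looks roughly like this (javax.smartcardio; `channel` is a CardChannel on which I have already done the SELECTs, and CLA = 0x00 and Le = 16 are my choices):

```java
import javax.smartcardio.CardChannel;
import javax.smartcardio.CardException;
import javax.smartcardio.CommandAPDU;
import javax.smartcardio.ResponseAPDU;

final class ReadBinaryLoop {
    private static final int CHUNK = 16; // bytes requested per READ BINARY (Le)

    // Read one chunk starting at `offset` (same P1||P2 encoding as above).
    static byte[] readChunk(CardChannel channel, int offset) throws CardException {
        int p1 = (offset >> 8) & 0x7F;
        int p2 = offset & 0xFF;
        ResponseAPDU r = channel.transmit(new CommandAPDU(0x00, 0xB0, p1, p2, CHUNK));
        if (r.getSW() != 0x9000) {
            throw new CardException(String.format("READ BINARY failed, SW=%04X", r.getSW()));
        }
        return r.getData();
    }
}
```

readChunk(channel, 0) returns bytes 0..15 as expected, but readChunk(channel, 16) returns data that clearly starts at byte 64 of the file.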
After some testing I've found that P1||P2 is always interpreted as an offset in units of 4 bytes. It seems that my card uses data units (as mentioned in ISO 7816-4) of one DWORD (4 bytes) each.
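In other words, the card behaves as if every increment of P1||P2 moves the read position by 4 bytes. A quick sanity check of the arithmetic (the 4-byte data unit is only my deduction from the test above, not something I read from the card):

```java
final class DataUnitMath {
    public static void main(String[] args) {
        // Observed: P1||P2 = 0x0010 -> data-unit offset 16 -> byte offset 16 * 4 = 64.
        int assumedDataUnitSize = 4;  // deduced from behaviour, not read from the card
        int desiredByteOffset = 16;
        int p1p2 = desiredByteOffset / assumedDataUnitSize; // = 4 -> P1 = 0x00, P2 = 0x04
        System.out.printf("To read from byte %d, P1||P2 = %04X%n", desiredByteOffset, p1p2);
    }
}
```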
All the examples I've found on the Web treat the data unit as a single byte, and that seems to work for their authors/users. My question is: how can I determine the data unit size at application runtime?