
I'm writing code that reads data from a Student ID card. After initializing the card and selecting the directory and file on it, I start reading data from it with READ BINARY (B0).

According to ISO 7816-4: Interindustry Commands for Interchange:

If bit 8 = 0 in P1, then P1||P2 is the offset of the first byte to be read, in data units, from the beginning of the file.

I'm reading the data in chunks of 16 bytes. In the first iteration I want to read from offset = 0, so I set P1 = 0x00 and P2 = 0x00, and everything works fine: the first 16 bytes are transmitted from the card.

The problem starts in the second iteration: the next 16 bytes should be read from offset = 16, so I set P1 = 0x00 and P2 = 0x10, but I receive data from offset = 64.
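My read loop looks roughly like this (a minimal sketch assuming a PC/SC reader driven through Java's javax.smartcardio; names like CHUNK_SIZE and readChunk are mine, and the offset is passed straight through as P1||P2):

    import javax.smartcardio.*;

    public class ReadBinaryLoop {
        static final int CHUNK_SIZE = 16;

        // Reads CHUNK_SIZE bytes starting at the given offset, sending the
        // offset directly as P1||P2 of READ BINARY (INS = 0xB0).
        static byte[] readChunk(CardChannel channel, int offset) throws CardException {
            int p1 = (offset >> 8) & 0x7F; // bit 8 must stay 0 for a plain offset
            int p2 = offset & 0xFF;
            ResponseAPDU r = channel.transmit(
                    new CommandAPDU(0x00, 0xB0, p1, p2, CHUNK_SIZE));
            if (r.getSW() != 0x9000)
                throw new CardException(String.format("READ BINARY: SW=%04X", r.getSW()));
            return r.getData();
        }
    }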

After some testing I've found that P1||P2 is always interpreted as an offset in units of 4 bytes: P1||P2 = 0x0010 addresses byte 16 × 4 = 64.

It seems my card uses data units (in the sense of the ISO 7816-4 quote above) of one DWORD (4 bytes) each.

All the examples I find on the Web treat the data unit as a single byte, and everything seems to work for their authors/users. My question is: how can I determine the data unit size at application runtime?
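One idea I'm considering (a sketch only, not verified against my card, and it assumes the file content is not uniform over the probed window): read a reference window with P1||P2 = 0, read again with P1||P2 = 1, and find how many bytes into the reference window the second read starts. That distance is the data unit size in bytes.

    import javax.smartcardio.*;
    import java.util.Arrays;

    public class DataUnitProbe {
        // Returns the data unit size in bytes, probing unit sizes 1..16.
        public static int detectDataUnitSize(CardChannel ch) throws CardException {
            byte[] ref   = read(ch, 0, 32); // window at data-unit offset 0
            byte[] probe = read(ch, 1, 16); // same file at data-unit offset 1
            for (int unit = 1; unit <= 16; unit++) {
                if (Arrays.equals(
                        Arrays.copyOfRange(ref, unit, unit + probe.length), probe))
                    return unit; // probe data starts `unit` bytes into ref
            }
            throw new CardException("could not detect data unit size (uniform data?)");
        }

        private static byte[] read(CardChannel ch, int unitOffset, int len)
                throws CardException {
            ResponseAPDU r = ch.transmit(new CommandAPDU(
                    0x00, 0xB0, (unitOffset >> 8) & 0x7F, unitOffset & 0xFF, len));
            if (r.getSW() != 0x9000)
                throw new CardException(String.format("READ BINARY: SW=%04X", r.getSW()));
            return r.getData();
        }
    }

Once the unit size is known, a byte offset converts to P1||P2 as byteOffset / unitSize (byte offsets that are not unit-aligned cannot be addressed directly).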

elklepo
  • Which card are you using? MPCOS? It is always better to check the particular card's documentation. Unfortunately, a lot of cards behave differently than ISO 7816 defines. Good luck! – vlp Oct 05 '17 at 17:21
  • You can evaluate the response from the file SELECT with the option to return FCI/FCP/FMD and check whether there is any clue there (see the sketch below). – Paul Bastian Oct 07 '17 at 10:01
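A sketch of the FCP approach from the last comment: SELECT the EF with P2 = 0x04 so the card returns the FCP template (tag 0x62), then look for the file descriptor, tag 0x82. If that TLV is at least two bytes long, its second byte is the data coding byte, whose low nibble should encode the data unit size as a power of two in quartets (0x1 = 1 byte, 0x3 = 4 bytes) per ISO 7816-4. Cards are not required to return this, so treat the result as a hint; the fileId value here is a placeholder.

    import javax.smartcardio.*;

    public class FcpDataUnit {
        // fileId is the two-byte EF identifier (placeholder, e.g. {0x01, 0x02}).
        public static int dataUnitSizeFromFcp(CardChannel ch, byte[] fileId)
                throws CardException {
            // SELECT by file identifier, P2 = 0x04 -> return FCP template
            ResponseAPDU r = ch.transmit(
                    new CommandAPDU(0x00, 0xA4, 0x00, 0x04, fileId, 256));
            if (r.getSW() != 0x9000)
                throw new CardException(String.format("SELECT: SW=%04X", r.getSW()));

            byte[] fcp = r.getData();
            if (fcp.length < 2 || (fcp[0] & 0xFF) != 0x62)
                throw new CardException("no FCP template in SELECT response");

            // Walk the top-level TLVs inside the template (short-form lengths only).
            for (int i = 2; i + 1 < fcp.length; ) {
                int tag = fcp[i] & 0xFF, len = fcp[i + 1] & 0xFF;
                if (tag == 0x82 && len >= 2 && i + 3 < fcp.length) {
                    int exp = fcp[i + 3] & 0x0F;  // low nibble of data coding byte
                    return (1 << exp) / 2;        // 2^exp quartets -> bytes
                }
                i += 2 + len;
            }
            throw new CardException("FCP has no data coding byte");
        }
    }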

0 Answers