I'm working on a driver for reading smart cards (PC/SC), and I've been reading the data in a forced 8-bit manner, even if the card itself might have a 16-bit chip. I have two questions, one is: how would I tell whether the card conforms to a 16-bit or 8-bit architecture, and the other is: would there be a performance boost to treating the 16-bit system as 16-bit?
1 Answer
1
Would there be a performance boost to treating the 16-bit system as 16-bit?
No.
The CPU core may internally be 8, 16 or even 32 bit. But all current processor cards communicate over either an ISO/IEC 7816-3 (contact) or ISO/IEC 14443 (contactless) interface, and it is this interface that limits the speed, not the CPU. The interface timing is derived from an external clock supplied by the reader, but most recent smart cards run the CPU itself from an internal clock at much higher speeds.
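To make the interface-speed point concrete: under ISO/IEC 7816-3 the card advertises its transmission parameters in the ATR, where the optional TA1 byte encodes the clock-rate conversion factor Fi and baud-rate adjustment factor Di. A minimal sketch in Python (the function and table names here are my own, not part of any PC/SC library; the Fi/Di values follow the ISO/IEC 7816-3 tables):

```python
# Parse the TA1 byte of an ISO 7816-3 ATR to see the interface speed the
# card advertises. FI_TABLE/DI_TABLE values are taken from ISO/IEC 7816-3.

FI_TABLE = {0: 372, 1: 372, 2: 558, 3: 744, 4: 1116, 5: 1488,
            6: 1860, 9: 512, 10: 768, 11: 1024, 12: 1536, 13: 2048}
DI_TABLE = {1: 1, 2: 2, 3: 4, 4: 8, 5: 16, 6: 32, 7: 64, 8: 12, 9: 20}

def speed_from_atr(atr: bytes, clock_hz: int = 5_000_000):
    """Return (Fi, Di, bits_per_second) advertised by the ATR's TA1 byte."""
    t0 = atr[1]                        # format byte: bit 5 signals TA1
    if not t0 & 0x10:                  # TA1 absent: defaults Fi=372, Di=1
        fi, di = 372, 1
    else:
        ta1 = atr[2]                   # high nibble selects Fi, low nibble Di
        fi = FI_TABLE[ta1 >> 4]
        di = DI_TABLE[ta1 & 0x0F]
    return fi, di, clock_hz * di // fi

# Example ATR prefix: TS=0x3B, T0=0x10 (TA1 present), TA1=0x18
print(speed_from_atr(bytes([0x3B, 0x10, 0x18])))  # -> (372, 12, 161290)
```

At the default 5 MHz reader clock, even an aggressive Fi/Di of 372/12 yields roughly 161 kbit/s on the wire, which is why the interface, not the CPU word size, is the bottleneck.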
As long as the interface stays the same, the "choice" between 8 and 16 bit doesn't matter a bit, let alone 8. I've put "choice" in quotes because I don't see where you have any choice in this.

Maarten Bodewes
- What about for encrypted sessions? Running encryption over individual packets as opposed to one big packet seems costly. – Alex Londeree Feb 23 '12 at 14:47
- What kind of encrypted sessions are you referring to? APDUs with secure messaging? – Maarten Bodewes Feb 23 '12 at 16:25