
The ASN.1 definition of Accuracy is:

Accuracy ::= SEQUENCE {
    seconds     INTEGER          OPTIONAL,
    millis  [0] INTEGER (1..999) OPTIONAL,
    micros  [1] INTEGER (1..999) OPTIONAL  }

What is unclear to me is how to handle millis and micros. It can't work to put both the SEC_ASN1_INTEGER universal tag and the [0] and [1] context tags into the same 'kind' field of a SEC_ASN1Template struct, as they would both go into the same part (the lowest byte) of that field.
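
For concreteness, here is roughly what I would guess the template should look like, if implicitly tagged fields are supposed to carry only the context tag in 'kind' and get the underlying INTEGER handling from a sub-template. This is an untested sketch, and the Accuracy struct, the template name and the field layout are only illustrative:

#include <stddef.h>      /* offsetof */
#include "seccomon.h"    /* SECItem */
#include "secasn1.h"     /* SEC_ASN1Template, SEC_ASN1_* flags, SEC_IntegerTemplate */

/* Hypothetical destination struct; each optional field decodes into a SECItem. */
typedef struct AccuracyStr {
    SECItem seconds;   /* INTEGER                        */
    SECItem millis;    /* [0] IMPLICIT INTEGER (1..999)  */
    SECItem micros;    /* [1] IMPLICIT INTEGER (1..999)  */
} Accuracy;

/* Sketch: with IMPLICIT tagging the universal INTEGER tag is replaced on the
 * wire, so 'kind' holds only the context tag, and the INTEGER handling is
 * supplied by the sub-template instead of by the low byte of 'kind'.        */
static const SEC_ASN1Template AccuracyTemplate[] = {
    { SEC_ASN1_SEQUENCE, 0, NULL, sizeof(Accuracy) },
    { SEC_ASN1_OPTIONAL | SEC_ASN1_INTEGER,
      offsetof(Accuracy, seconds) },
    { SEC_ASN1_OPTIONAL | SEC_ASN1_CONTEXT_SPECIFIC | SEC_ASN1_XTRN | 0,
      offsetof(Accuracy, millis), SEC_ASN1_SUB(SEC_IntegerTemplate) },
    { SEC_ASN1_OPTIONAL | SEC_ASN1_CONTEXT_SPECIFIC | SEC_ASN1_XTRN | 1,
      offsetof(Accuracy, micros), SEC_ASN1_SUB(SEC_IntegerTemplate) },
    { 0 }
};

Decoding would presumably then go through SEC_ASN1DecodeItem or SEC_QuickDERDecodeItem into a zero-initialized Accuracy, with absent OPTIONAL fields simply left empty.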

  • Actually I most likely won't be needing to model the Accuracy type after all, so this particular question is too specific. But oh boy, the Mozilla people, or some volunteer who understands the stuff, really should write better documentation for their ASN.1 encoding and decoding APIs. – tml Feb 20 '15 at 06:51

1 Answer


Note that

millis [0] INTEGER (1..999) OPTIONAL

is not really of INTEGER type, but of [0] type.

Perhaps the following equivalent type definition might help you:

Accuracy ::= SEQUENCE {
    seconds     AccuracySeconds      OPTIONAL,
    millis  [0] AccuracyMilliseconds OPTIONAL,
    micros  [1] AccuracyMicroseconds OPTIONAL  }

AccuracySeconds ::= INTEGER
AccuracyMilliseconds ::= INTEGER(1..999)
AccuracyMicroseconds ::= INTEGER(1..999)

You also need to take into account whether your type definition uses IMPLICIT or EXPLICIT tagging; the encoding would be different. For example, with seconds = 2 and millis = 2:

(implicit)

30 06           -- SEQUENCE, 6 content octets
   02 01 02     -- seconds: universal INTEGER, value 2
   80 01 02     -- millis:  [0] IMPLICIT replaces the INTEGER tag, value 2

vs (explicit)

30 08           -- SEQUENCE, 8 content octets
   02 01 02     -- seconds: universal INTEGER, value 2
   A0 03        -- millis:  [0] EXPLICIT constructed wrapper, 3 content octets
      02 01 02  --          inner universal INTEGER, value 2
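
In NSS template terms, a hypothetical entry for the EXPLICIT form would set SEC_ASN1_EXPLICIT and SEC_ASN1_CONSTRUCTED alongside the context tag and supply the inner INTEGER via a sub-template. Roughly, and only as an untested fragment that assumes a destination struct named Accuracy with a SECItem member millis:

/* Hypothetical entry for an EXPLICIT [0]: the context tag is a constructed
 * wrapper around an ordinary INTEGER, so SEC_ASN1_EXPLICIT and
 * SEC_ASN1_CONSTRUCTED are both set and the inner encoding comes from the
 * sub-template. ("Accuracy" is an illustrative struct, not an NSS type.)  */
{ SEC_ASN1_OPTIONAL | SEC_ASN1_EXPLICIT | SEC_ASN1_CONSTRUCTED |
      SEC_ASN1_CONTEXT_SPECIFIC | SEC_ASN1_XTRN | 0,
  offsetof(Accuracy, millis), SEC_ASN1_SUB(SEC_IntegerTemplate) },
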
jsantander