I'm given a 2-byte sequence and asked to Base64-encode it:

00000001 00010001

From what I understand, Base64 works on groups of 6 bits at a time. Since 16 bits is not divisible by 6, I'm a little stuck.

The solution I can see is to extend the given 2-byte sequence into a 3-byte sequence so its length becomes divisible by 6. But how do I do this without changing the value of the initial sequence?

1 Answer

Basically, you pad the bit sequence out with zeroes to the next multiple of 6 bits, and pad the last four-character group of output out with =s. Since the two padding zero bits don't make up a full input byte, the decoder knows to ignore them. (The = padding isn't strictly necessary, but it's customary to make the end result always a multiple of 4 characters long.)
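To make that concrete, here is a minimal Python sketch of the procedure (a hand-rolled illustration with a made-up helper name, b64_encode_manual; in practice you'd just call base64.b64encode):

    # Minimal, hand-rolled illustration of the padding scheme described above.
    B64_ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

    def b64_encode_manual(data: bytes) -> str:
        # Concatenate the input bytes into one bit string.
        bits = "".join(f"{byte:08b}" for byte in data)
        # Pad with zero bits up to the next multiple of 6.
        bits += "0" * (-len(bits) % 6)
        # Map each 6-bit group to its character in the Base64 alphabet.
        chars = [B64_ALPHABET[int(bits[i:i + 6], 2)] for i in range(0, len(bits), 6)]
        # Pad with '=' up to the next multiple of 4 characters.
        return "".join(chars) + "=" * (-len(chars) % 4)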

For instance, the sequence you've got is:

00000001 00010001

Breaking that up into groups of 6, we get:

000000 010001 0001

Pad with zeroes:

000000 010001 000100

Convert each 6-bit value (0, 17, 4) to its character in the Base64 alphabet:

ARE

And pad that out:

ARE=
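You can cross-check this with Python's standard library (the byte values below are just your two bytes written as binary literals):

    import base64

    # The question's two bytes, 00000001 and 00010001.
    data = bytes([0b00000001, 0b00010001])
    print(base64.b64encode(data))  # b'ARE='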