
After a lot of troubleshooting I have managed to get an encoder and decoder working: I feed a number into the encoder, it turns it into binary and then into ASCII characters and returns that string; I feed that string into the decoder, it turns the characters back into binary and strings that together into a number. This works for all bytes except those that look like 11xx xxxx, which come back as 10xx xxxx.

Encoder:

public static String convert(int number) {
    String binary = Integer.toBinaryString(number);
    String string = "";
    byte[] bytes = new byte[4];

    while (binary.length() < 32) {
        binary = "0" + binary;
    }

    for (int i = 0; i < bytes.length; i++) {
        string += (char) Integer.parseInt(binary.substring(i * 8, (i + 1) * 8), 2);
    }

    return string;
}

Decoder:

public static int convert(String message) {
    byte[] bytes = new byte[message.length()];
    try {
        for (int i = 0; i < message.length(); i++) {
            bytes[i] = message.substring(i, i + 1).getBytes("UTF-8")[message.substring(i, i + 1).getBytes("UTF-8").length - 1];
        }
        StringBuilder binary = new StringBuilder(32);
        String s;
        for (int i = 0; i < bytes.length; i++) {
            s = Integer.toBinaryString(bytes[i] & 0xFF);
            while (s.length() < 8)
                s = "0" + s;
            binary.append(s);
        }
        int result = 0;
        result = Integer.parseInt(binary.toString(), 2);
        return result;

    } catch (UnsupportedEncodingException e) {
        return -1;
    }
}
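
A round trip like the following reproduces the problem (Encoder and Decoder are just placeholder class names for wherever the two convert methods above live):

public static void main(String[] args) {
    int original = 0x00C50000;                  // second byte is 1100 0101
    String encoded = Encoder.convert(original); // the encoder above
    int decoded = Decoder.convert(encoded);     // the decoder above
    // The 11xx xxxx byte (0xC5) comes back as 10xx xxxx (0x85)
    System.out.println(Integer.toHexString(original) + " vs. " + Integer.toHexString(decoded));
}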
Mojken

1 Answer


You have misunderstood what character encoding does. Throw out everything with getBytes() in the decoder.

getBytes("UTF-8") encodes to UTF-8, which is not one byte per character for anything above 0x7F. If you encoded one byte per character in the encoder, then decode one byte per character in the decoder.
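
You can see what getBytes() actually does with a tiny experiment of my own (using the standard java.nio.charset.StandardCharsets):

import java.nio.charset.StandardCharsets;

class Utf8Demo {
    public static void main(String[] args) {
        char c = (char) 0xC5;   // 1100 0101 -- one of the chars your encoder produces
        byte[] utf8 = String.valueOf(c).getBytes(StandardCharsets.UTF_8);
        // UTF-8 needs two bytes for this char: c3 85. Keeping only the last byte
        // gives 0x85 (1000 0101) -- exactly the 11xx xxxx -> 10xx xxxx corruption.
        for (byte b : utf8) {
            System.out.printf("%02x ", b & 0xFF);   // prints: c3 85
        }
    }
}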

Your encoder is treacherously convoluted, by the way. The simple version would be to decompose the int directly into bytes and cast each to char:

String encode(int i) {
    char[] result = new char[4];
    result[0] = (char) (i >>> 24);          // highest byte first
    result[1] = (char) ((i >> 16) & 0xFF);
    result[2] = (char) ((i >> 8) & 0xFF);
    result[3] = (char) (i & 0xFF);
    return new String(result);
}

The decoder can be much simpler as well:

int decode(String s) {
    int result = 0;
    for (int i = 0; i < s.length(); ++i) {
        char c = s.charAt(i);
        if (c > 255)
            throw new IllegalArgumentException("invalid character: " + c);
        result = (result << 8) | (c & 0xFF);
    }
    return result;
}
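
A quick round-trip check, just a sketch assuming encode and decode are reachable (e.g. made static in the same class):

// e.g. inside a main method of the class that holds encode/decode:
int[] samples = { 0, 42, 0x00C50000, -1, Integer.MIN_VALUE };
for (int value : samples) {
    // every value should come back unchanged, including those with 11xx xxxx bytes
    System.out.println(Integer.toHexString(value) + " -> " + Integer.toHexString(decode(encode(value))));
}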

All your intermediate conversions achieve nothing except making the code more confusing and error-prone.

Durandal