I'm using Python 3 to get a usable UUID from a string like this: \227L\310\210\r\232MG\2110\221?h$0\313
When I do it manually by pasting the string into the code as a bytes literal, everything works fine:
>>> from uuid import UUID
>>> print(UUID(bytes=b"\227L\310\210\r\232MG\2110\221?h$0\313"))
974cc888-0d9a-4d47-8930-913f682430cb
>>>
But if I take this string from the input() function, the backslashes are kept as literal characters (the escape sequences are never interpreted), so I can't use the result as the UUID bytes argument:
from uuid import UUID
from codecs import encode

def give_client_id():
    input_str = input('Paste bynary string: ')
    print(input_str)
    bytes_str = bytes(input_str, 'ascii')
    print(bytes_str)
    encoded_str = input_str.encode()
    print(encoded_str)
    print(UUID(bytes=bytes_str))

if __name__ == "__main__":
    give_client_id()
This code always produces the same output for me:
Paste bynary string: \227L\310\210\r\232MG\2110\221?h$0\313
\227L\310\210\r\232MG\2110\221?h$0\313
b'\\227L\\310\\210\\r\\232MG\\2110\\221?h$0\\313'
b'\\227L\\310\\210\\r\\232MG\\2110\\221?h$0\\313'
Traceback (most recent call last):
File "give_client_id_from_bytes.py", line 19, in <module>
give_client_id()
File "give_client_id_from_bytes.py", line 15, in give_client_id
print(UUID(bytes=bytes_str))
File "/usr/lib/python3.8/uuid.py", line 178, in __init__
raise ValueError('bytes is not a 16-char string')
ValueError: bytes is not a 16-char string
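The doubled backslashes in the repr above show what goes wrong: in a bytes literal the parser interprets the octal escapes, but text typed at the prompt keeps every backslash as a literal character. A minimal sketch of the difference (the raw string simulates what input() returns):

```python
# The parser interprets the octal escapes: three bytes 0x97, 0x4C, 0xC8.
literal = b"\227L\310"
print(len(literal))  # → 3

# input() returns the characters exactly as typed; a raw string
# simulates that: the backslashes and digits stay literal.
typed = r"\227L\310"
print(len(typed.encode('ascii')))  # → 9
```

So the 16-byte value becomes a much longer ASCII string, and UUID(bytes=...) rejects it.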
How can I make Python accept such a string for UUID, or convert it into something more useful?
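One approach that seems to work (a sketch, assuming the pasted text uses standard Python escape sequences; the function name is just illustrative) is to interpret the escapes with the unicode_escape codec and then map the resulting code points back to raw bytes via latin-1:

```python
from uuid import UUID

def parse_escaped_uuid(escaped: str) -> UUID:
    """Interpret backslash escape sequences in the pasted text,
    then map the resulting code points 0-255 back to raw bytes."""
    raw = escaped.encode('ascii').decode('unicode_escape').encode('latin-1')
    return UUID(bytes=raw)

print(parse_escaped_uuid(r"\227L\310\210\r\232MG\2110\221?h$0\313"))
# → 974cc888-0d9a-4d47-8930-913f682430cb
```

Is this the idiomatic way, or is there a cleaner conversion?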