
I use a UTF-16 character picker to create ASCII art in a textbox in HTML, where UTF-16 characters are supported and visible as-is. Now I need to save such ASCII art into an array of UTF-16 characters and process it with JavaScript as strings, to build ASCII-art animations for Twitter like this:

[Screencap of Twitter feed]



For a definition of UTF-16, see Wikipedia (http://en.wikipedia.org/wiki/UTF-16):

"UTF-16 (16-bit Unicode Transformation Format) is a character encoding for Unicode capable of encoding 1,112,064[1] numbers (called code points) in the Unicode code space from 0 to 0x10FFFF. It produces a variable-length result of either one or two 16-bit code units per code point."

I already built a 2-byte Unicode (UTF-16) picker and can generate UTF-16 input for Twitter.
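The array-of-frames approach described above can be sketched as follows. This is a minimal illustration, not code from the question; the names (`frames`, `frameAt`) and the sample art are my own. JavaScript strings are sequences of UTF-16 code units, so characters taken from a UTF-16 picker round-trip through them unchanged:

```javascript
// Store each ASCII-art frame as a plain JavaScript string.
// JS strings hold UTF-16 code units, so picker output survives as-is.
const frames = [
  "(°□°)",
  "(ノ°□°)ノ",
  "(ノ°□°)ノ︵ ┻━┻"
];

// Cycle through the frames; posting each one in turn yields the animation.
function frameAt(tick) {
  return frames[tick % frames.length];
}
```

Posting `frameAt(0)`, `frameAt(1)`, … as successive tweets would produce the animation shown in the screencap.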

== re:

Removed the link as it's pointing to a Twitter account which doesn't show the mentioned content anymore (without scrolling). May appear like spam then. – david Nov 20 '12 at 04:09

That way it may take a much longer time to get the right answer.

  • Voting down because I really, really don't want to see this on Twitter. – ceejayoz Nov 28 '12 at 15:13

1 Answer


UTF-16 is a character encoding. Twitter only accepts UTF-8 as input. You can convert UTF-16 to UTF-8 without any data loss, so just do that and then send it to Twitter.
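The conversion the answer describes can be sketched with the standard TextEncoder API (modern JavaScript; in 2012 the browser performed this step implicitly when submitting a form). This is an illustration, not part of the original answer; the variable names are my own. JavaScript strings hold UTF-16 code units, and TextEncoder emits the equivalent UTF-8 bytes:

```javascript
const utf8 = new TextEncoder();

// U+20AC EURO SIGN: one UTF-16 code unit, three UTF-8 bytes.
const euro = utf8.encode("\u20AC");        // Uint8Array [226, 130, 172]

// U+1D11E MUSICAL SYMBOL G CLEF: a surrogate pair (two UTF-16 code units),
// four UTF-8 bytes. Nothing is lost in the conversion.
const clef = utf8.encode("\uD834\uDD1E");  // Uint8Array [240, 157, 132, 158]
```

Both encodings cover the full Unicode code space, which is why the round trip is lossless.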

Tom van der Woerdt
  • I am really sorry, but Twitter does accept Unicode UTF-16, as seen above, and I input UTF-16 directly into Twitter, since web browsers support Unicode UTF-16 as HTML code. – user1831960 Nov 26 '12 at 16:36
  • 1
    You don't seem to understand what UTF-16 is, sorry. Twitter only accepts UTF-8. (Source: their docs, plus experience) – Tom van der Woerdt Nov 26 '12 at 16:47
  • You don't have to be sorry. Twitter accepts UTF-16 as ASCII art. For a UTF-16 definition, go to Wikipedia (http://en.wikipedia.org/wiki/UTF-16): "UTF-16 (16-bit Unicode Transformation Format) is a character encoding for Unicode capable of encoding 1,112,064[1] numbers (called code points) in the Unicode code space from 0 to 0x10FFFF. It produces a variable-length result of either one or two 16-bit code units per code point." I already built a 2-byte Unicode picker (UTF-16) and can generate UTF-16 input into Twitter. – user1831960 Nov 28 '12 at 14:19
  • 1
    I think I know what UTF-16 and UTF-8 are, thank you. Twitter accepts Unicode in UTF-8 format. If you send UTF-16 data to Twitter, it will either refuse it or interpret it as UTF-8. I recommend doing some reading about what Unicode actually is and what the use of UTF8/16 is. You claim that you can use UTF-16 on Twitter, but that is not true. You can use Unicode, and if you enter Unicode characters in your browser then it will convert that to UTF-8 and send it to Twitter as UTF-8. – Tom van der Woerdt Nov 28 '12 at 14:48
  • I intend to close and delete this thread, since I am not getting any relevant response. I built the UTF-16 (2-byte Unicode) picker for Twitter and can generate 2-byte Unicode ASCII art for Twitter. I am no longer interested in irrelevant responses. – user1831960 Nov 30 '12 at 14:37
  • I enter 2-byte Unicode characters directly into Twitter, in any web browser, to be interpreted as 2-byte Unicode (UTF-16) and generate the Unicode ASCII art shown at the top. Job done. Job closed. No longer interested in irrelevant responses. – user1831960 Nov 30 '12 at 14:42
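One quick way to see Tom's point in the comments above, that whatever the in-memory (UTF-16) representation is, the data leaving the browser is UTF-8: the percent-encoding the browser applies to non-ASCII characters is defined over their UTF-8 bytes. A small sketch (the variable name is my own):

```javascript
// encodeURIComponent escapes a character as the percent-encoded
// bytes of its UTF-8 encoding, not its UTF-16 code units.
const euroEscaped = encodeURIComponent("\u20AC"); // "%E2%82%AC", the three UTF-8 bytes of U+20AC
```

If the wire format were UTF-16, the escape would instead reflect the single 16-bit code unit 0x20AC.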