
How do you display Regional Indicator Symbols (things like country flags) in the Linux terminal? Is this even possible?

I'm writing a VPN client and would like to show flags for the countries the client supports. For example, I would like to display the Japanese flag (🇯🇵). I pasted the two Unicode symbols 🇯 and 🇵 next to each other in a Bash script, but when I ran the script in the terminal I just got the two symbols side by side (they didn't "merge"). Part of the file is shown below (note there is a space between the symbols so they don't "merge" in the browser; the space is not there in the real script).

#!/bin/bash
echo "Please choose a server:"
echo -e "  Japan (1)"     # in the real script there is no space here.
echo -e "..."
read -p "> " choice
...

And upon running:

$ ./script.sh
Please choose a server:
🇯 🇵 Japan (1)            [ with no space in between ]
...

I understand the concept of Regional Indicator Symbols; my question is whether it's possible to use them in the terminal.

Edit: I noticed that some answerers were having trouble understanding my problem, so I have provided a screenshot of what my terminal looks like when I run this script:

[Screenshot: Regional Indicator Symbols shown as two separate symbols in the terminal]

Anonymous
1 Answer


Not sure if you copied the correct byte sequences, but you can simply use the correct escapes instead:

$ echo -e "\U1f1ef\U1f1f5 Japan (1)"
🇯🇵 Japan (1)

It may also be an issue with your terminal understanding the Unicode sequence and rendering it properly, rather than a problem with your script.
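
As a quick sanity check on the script side (a minimal sketch, assuming a UTF-8 locale and that the standard locale and hexdump utilities are available), you can confirm the character map and inspect the exact bytes the shell writes. In UTF-8, U+1F1EF and U+1F1F5 encode to f0 9f 87 af and f0 9f 87 b5; the trailing 0a is the newline from echo:

$ locale charmap
UTF-8
$ echo -e '\U1f1ef\U1f1f5' | hexdump -C
00000000  f0 9f 87 af f0 9f 87 b5  0a                       |.........|
00000009

If those bytes are present but you still see two separate symbols, the problem lies in the terminal's font and rendering rather than in the script.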

chepner
  • I have tried using the escapes instead, with no luck. I can confirm that the terminal is having trouble rendering the sequence, but my question was whether it's possible to work around this (maybe using a different terminal font, etc.). – Anonymous Oct 09 '18 at 19:06
  • Check the output of `echo -e '\U1f1ef\U1f1f5' | hexdump -C`. Assuming you are using a UTF-8 locale, the output should indicate that `bash` produces the bytes `f0`, `9f`, `87`, `af`, `f0`, `9f`, `87` and `b5` for the two Unicode characters. Your terminal must *also* be set up to use UTF-8, so that it knows to reassemble those 8 bytes into the intended Unicode characters. Finally, your terminal must also be configured to use a font which provides the correct flag glyph for the terminal's rendering engine to display the pair. – chepner Oct 09 '18 at 19:17
  • I checked the `hexdump` output, and all is well (there was a `0a` at the end, though...). I have also confirmed that the terminal is using UTF-8 character encoding. From some research, it appears that Monospace [Regular] (the font I use) supports a variety of glyphs, but not flag glyphs? I'm not sure, although I figured I would have this problem. My question was about a workaround. – Anonymous Oct 09 '18 at 19:24
  • There's no programming workaround. You are correctly writing the bytes needed to display the flag; this is purely a matter of configuring your terminal to be able to display them correctly, which is why I directed you to superuser.com. – chepner Oct 09 '18 at 19:49
  • Ok... you directed me to Super User? – Anonymous Oct 09 '18 at 20:15
  • Sorry, I thought I had. (I might have done so in my first comment, then edited it out before submitting the comment.) In any case, try superuser.com :) – chepner Oct 09 '18 at 20:22
  • No problem, I was just confused. Thanks for all of the help! – Anonymous Oct 09 '18 at 20:23
  • Is there any update regarding this? I am also having a similar issue: instead of the country flag, it's printing J P. – Izaz Mar 30 '22 at 10:54
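
A note on the "J P" rendering mentioned in the last comment: when the bytes reach the terminal intact but its font has no flag glyph for the regional indicator pair, many terminals fall back to drawing the two indicator letters separately, which looks like J P. A rough check for whether an emoji-capable font is installed at all (a sketch assuming fontconfig is available; Noto Color Emoji is just one example of a font that ships flag glyphs):

$ fc-list | grep -i emoji

Any output line naming an emoji font (e.g. Noto Color Emoji) means such a font exists on the system; the terminal emulator must then also be able to use it for fallback, which is the terminal-configuration issue chepner describes above.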