
I am trying to display a playing card using Unicode in Java/Android Studio. The Unicode code point for the card is U+1F0A1, which I understand can't be stored in a single `char` and must be converted to a surrogate pair.

The code I have entered is:

    public String getShortName() {
        char spades = 0xD83C/0xDCA1;
        return String.valueOf(spades);
    }

However, this doesn't display the card on the emulator. I have tried numerous variations of the surrogate pair, but nothing happens. Can anyone help, please?

Many thanks

Heather
  • Use a string literal with Unicode escapes: `"\uD83C\uDCA1"`, or simply a string literal with the character: `"🂡"`. – teppic Sep 25 '17 at 19:21
  • Where did you find the expression `0xD83C/0xDCA1`? – f1sh Sep 25 '17 at 19:23
  • 1
  • The expression `0xD83C/0xDCA1` is performing integer division, so you are actually setting `char spades = 55356 / 56481;`, which is 0, which is not what you want. You should use `return "\uD83C\uDCA1";` or `return "🂡";`, like teppic suggested. Or, you can put the codepoint value into an `int[]` array and use the `String` constructor that takes codepoints as input: `return new String(new int[]{ 0x1F0A1 }, 0, 1);` (see the sketch after these comments) – Remy Lebeau Sep 25 '17 at 20:47
  • Thanks for the quick responses. I have tried to input the surrogate pair as suggested above (`"\uD83C\uDCA1"`), but this does not print anything to the emulator. The reason I was using `0xD83C/0xDCA1` is that I have successfully implemented Unicode this way before: `char hearts = 0x2665; return String.valueOf(hearts);` – Heather Sep 25 '17 at 21:48
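
Pulling the comments' suggestions together, here is a minimal sketch in plain Java rather than Android-specific code (the class name `CardDemo` is illustrative; whether the glyph actually renders on the emulator also depends on its font including the Playing Cards block):

    public class CardDemo {
        public static void main(String[] args) {
            // U+1F0A1 is outside the Basic Multilingual Plane, so in Java it
            // is represented as a surrogate pair: 0xD83C followed by 0xDCA1.

            // Option 1: a string literal containing both surrogate escapes.
            String viaEscapes = "\uD83C\uDCA1";

            // Option 2: build the String directly from the code point.
            String viaCodePoint = new String(new int[]{ 0x1F0A1 }, 0, 1);

            // Option 3: let Character.toChars produce the surrogate pair.
            String viaToChars = new String(Character.toChars(0x1F0A1));

            System.out.println(viaEscapes);   // 🂡 (if the font supports it)
            System.out.println(viaCodePoint); // 🂡
            System.out.println(viaToChars);   // 🂡

            // The original expression performs integer division:
            // 0xD83C / 0xDCA1 == 55356 / 56481 == 0, i.e. the NUL character.
            char broken = 0xD83C / 0xDCA1;
            System.out.println((int) broken); // prints 0
        }
    }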

0 Answers