
Suppose somebody stores A in memory. As I understand it, it will be stored as 01000001 (8 consecutive bits in a byte of memory). So where does the ASCII transition take place? Is there some kind of program which takes "A", checks its data type, and stores it as binary, and then the same program retrieves the binary number, looks at the ASCII chart again, and converts it back to A according to the data type?

Jonathan Leffler
wupiku
  • Is this for `c` or for `java`? In C, a char is just an integer, so the "conversion" would happen when you print it with a function like `printf()`. – Reticulated Spline Sep 19 '19 at 15:55
  • There is no conversion. `A` **is** `01000001`. – Max Vollmer Sep 19 '19 at 15:55
  • It takes place on your screen, in your printer etc. Everything in your computer is a number, and what that number means is relative to the context. – Weather Vane Sep 19 '19 at 15:57
  • Possible duplicate of [ASCII table and character presentation](https://stackoverflow.com/questions/48485573/ascii-table-and-character-presentation) – Max Vollmer Sep 19 '19 at 15:57
  • @ReticulatedSpline In Java, a char is also just an integer. The main difference is that in C a char is 1 byte, while in Java it's 2 bytes. – Max Vollmer Sep 19 '19 at 16:02
  • The program stores the ID of a glyph. Depending on the encoding, the same ID may represent a different glyph, and the way the ID is stored and/or interpreted may differ. There is no strong guarantee about the actual value: in practice, almost all systems use the ASCII table for characters between 0 and 127. – AugustinLopez Sep 19 '19 at 16:24
  • Fundamentally, the transition between `A` and 65 (or vice versa) takes place in the I/O technology. When you type `A` on the keyboard, the keyboard generates the number 65 and sends it to the computer. When you display `A`, the display technology takes the number 65 and converts it into an appropriate set of pixels, with colour and background etc. as required. It's all 'black magic'. In the program, a byte containing the bits 01000001 can be interpreted as `A` or 65, or as part of some bigger unit of data. – Jonathan Leffler Sep 19 '19 at 16:31
  • @MaxVollmer Another key difference is that in Java, a char is a UTF-16 code unit. In C, it is only known from context or source whether it is actually text and if text then which character encoding. – Tom Blodget Sep 29 '19 at 15:43
  • @wupiku Several answers describe the concept of rendering. A value is rendered as a character because the program causes it to be. In practice on a typical system, the rendering engine uses a **font file**, which gives glyph drawing instructions for each supported character code. Other than perhaps giving a preference for which font to use and at what scale, program code is hardly ever involved in the text rendering. – Tom Blodget Sep 29 '19 at 15:53
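
A minimal C sketch of what the comments above describe, assuming an ASCII-compatible system: the same stored byte can be presented as a character or as a number purely by the output format chosen, and nothing in memory changes.

```c
#include <stdio.h>

int main(void)
{
    char c = 'A';        /* stored as the single byte 01000001 (decimal 65) */

    printf("%c\n", c);   /* prints A  : the byte is sent out and the terminal draws the glyph */
    printf("%d\n", c);   /* prints 65 : the same byte formatted as a decimal number */
    printf("%c\n", 65);  /* prints A  : an int with value 65 is indistinguishable here */

    return 0;
}
```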

3 Answers


Remember that character representations are really only useful for users and programmers; as far as the system is concerned, it's a series of 1s and 0s.

The only time true ASCII translation occurs is when the stored value is either programmatically assigned/compared, e.g. `char a = 'A';`, or displayed. In the latter case, a lookup occurs that provides an environment-specific rendering, e.g. a pixel bitmap for a character-cell display, perhaps a per-symbol static image on a graphics system, or a stored/fixed character set on a hardcopy device such as a printer.
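
A small sketch of the assignment/comparison case mentioned above, assuming an ASCII-compatible compiler: `'A'` and `65` are just two spellings of the same value in the source, so no run-time translation takes place.

```c
#include <stdio.h>

int main(void)
{
    char a = 'A';              /* the compiler simply stores the value 65; no run-time lookup */

    if (a == 'A' && a == 65)   /* both comparisons test the same stored value */
        printf("'A' and 65 compare equal: they are the same byte\n");

    return 0;
}
```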

David W

A is 01000001. No conversion is ever made. If you print it with printf, it won't do a conversion of any kind.

It will simply call write(2), which writes 01000001 to the standard output. The program that turns that data into pixels on the screen is your terminal emulator, or whatever else is displaying the text. A C program has no idea what ASCII is; all it knows is that a char is 1 byte long.
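
A minimal sketch of that idea, assuming a POSIX system so that write(2) and `STDOUT_FILENO` are available: the program only hands the raw byte to the kernel, and it is the terminal emulator that decides to draw it as `A`.

```c
#include <unistd.h>   /* write(), STDOUT_FILENO; assumes a POSIX system */

int main(void)
{
    char c = 'A';                      /* one byte: 01000001 */

    /* Hand the raw byte to the kernel; the terminal emulator, not this
       program, decides how to draw it as the glyph A. */
    write(STDOUT_FILENO, &c, 1);
    write(STDOUT_FILENO, "\n", 1);

    return 0;
}
```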


It happens at the output device. The output device receives that sequence of bits and an A is drawn on the console, or printed on the paper sheet of a printer... The idea is that whenever you send that sequence to the printer, an A will be printed on the paper. Nowhere else is that pattern associated with the letter A.

In an input device such as the keyboard, each key is a simple switch that connects two wires... whenever the A key is pressed, that sequence of bits is sent to the computer. So, as long as every device prints an A each time it receives that pattern, we all agree on what it means.

By the way, you can use that pattern for a completely different purpose. But every time that symbol is sent to the printer, it will print an A and no other letter.
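
A small sketch of that last point, assuming an ASCII-compatible system: the same bit pattern can be used as a plain number in arithmetic, and only when it is sent to an output device does it become the letter A (or B, or a).

```c
#include <stdio.h>

int main(void)
{
    char c = 'A';

    printf("%d\n", c + 1);   /* prints 66 : the pattern treated as a plain number        */
    printf("%c\n", c + 1);   /* prints B  : 66 is the next code in the ASCII table       */
    printf("%c\n", c + 32);  /* prints a  : 65 + 32 = 97, the lowercase letter in ASCII  */

    return 0;
}
```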

Luis Colorado