It's called pointer arithmetic. It takes the pointer to the string literal and adds six "units", where a unit is the size of the underlying type pointed to, in this case sizeof(char) (which is always 1).
You can picture the string in memory like this:
+--+--+--+--+--+--+--+--+--+--+--+--+--+
| H| e| l| l| o|  | p| r| i| n| t| f|\0|
+--+--+--+--+--+--+--+--+--+--+--+--+--+
  0  1  2  3  4  5  6  7  8  9 10 11 12
The numbers underneath are the offsets (or indexes, if using array notation) of each character in the string. Adding six to the pointer therefore gives you a pointer to offset 6, the 'p' that starts "printf".
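As a minimal sketch (the variable s is just for illustration; the arithmetic works the same applied directly to the string literal):

    #include <stdio.h>

    int main(void)
    {
        const char *s = "Hello printf";

        /* s + 6 skips the first six characters ("Hello ")
           and points at the 'p' at offset 6 */
        printf("%s\n", s + 6);   /* prints "printf" */

        return 0;
    }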
The important thing to know here is that pointer arithmetic doesn't add bytes; it only looks that way in this case because the base type, char, is one byte in size. If you had an array of short (which is usually two bytes), then adding six would advance the pointer by 6 * sizeof(short) bytes, which in the normal case is 12 bytes.
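You can verify that with a quick sketch like this, casting to char * so the subtraction measures the distance in bytes:

    #include <stdio.h>

    int main(void)
    {
        short arr[8] = {0};
        short *p = arr;

        /* p + 6 advances by six elements, i.e. 6 * sizeof(short) bytes */
        printf("sizeof(short): %zu\n", sizeof(short));
        printf("byte distance: %td\n",
               (char *)(p + 6) - (char *)p);   /* typically 12 */

        return 0;
    }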