I expect the output to be 131200 (2^17 + 2^7 = 131072 + 128) rather than 2. This makes sense to me in theory, but it is not what happens. I think the bit pattern of arr[3] after the line with the pointer cast should be [00000000 00000010 00000000 10000000]. (Please tell me if I am wrong somewhere.)
#include <stdio.h>

int main(void) {
    int arr[5];
    arr[3] = 128;
    ((short *) arr)[6] = 2;
    printf("%i\n", arr[3]);
    return 0;
}