Based on what I've read, Google Glass exposes most (if not all) of the standard Android sensors, and the existing Android sensor APIs (e.g. `SensorManager`) can be used to access them, just as on any other Android device.
However, the GDK (gdk.jar) contains only gesture- and voice-related classes, which suggests that custom 'glue' code is required to read sensor data (from the gyroscope, accelerometer, and so forth) and render it on the Glass display.
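To make it concrete, by 'glue' code I mean something like the following sketch, which uses only the stock Android sensor APIs (nothing GDK-specific); the Activity name, layout, and TextView id are hypothetical:

```java
// Sketch: read the gyroscope with the standard Android SensorManager API
// and push the readings into an on-screen TextView. Nothing here comes
// from the GDK; the layout and view id are hypothetical placeholders.
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.widget.TextView;

public class SensorDemoActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor gyroscope;
    private TextView readout;  // assumed to exist in the (hypothetical) layout

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sensor_demo);   // hypothetical layout
        readout = (TextView) findViewById(R.id.readout); // hypothetical view id
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        gyroscope = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager.registerListener(this, gyroscope, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);  // stop listening to save battery
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        readout.setText(String.format("x=%.2f y=%.2f z=%.2f",
                event.values[0], event.values[1], event.values[2]));
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* no-op */ }
}
```

My assumption is that the GDK adds the Glass-specific UI pieces (cards, voice triggers, gestures), while all sensor plumbing like the above remains plain Android.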
Is this correct?