I'm currently trying to implement an augmented reality iPhone application (iOS 4.2) that uses accelerometer data to translate and rotate an OpenGL object on the screen. I have already succeeded in getting the object to respond to the phone's rotation, but that was always going to be the easy part.
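For context, the rotation part boils down to deriving tilt angles from the gravity vector. Here's a minimal Swift sketch of that idea using CoreMotion (not my actual code, which is Objective-C against the older UIAccelerometer API):

```swift
import CoreMotion

let motionManager = CMMotionManager()
motionManager.accelerometerUpdateInterval = 1.0 / 60.0

motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    // When the phone is roughly static, the accelerometer reads gravity,
    // so pitch and roll fall out of simple trigonometry.
    let roll  = atan2(a.y, a.z)
    let pitch = atan2(-a.x, sqrt(a.y * a.y + a.z * a.z))
    // Feed roll/pitch into the OpenGL model-view rotation here.
    _ = (roll, pitch)
}
```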
For the translational part, I've tried implementing some of the techniques from this paper (http://www.freescale.com/files/sensors/doc/app_note/AN3397.pdf), but the result is still not very accurate. I'm now in the process of implementing a Kalman filter to smooth the accelerometer data.
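In case it helps to see what I mean, here's roughly what my implementation of the app note's approach looks like, as a Swift sketch rather than my actual code: discard near-zero samples (the paper's "mechanical filtering" discrimination window), double-integrate with the trapezoid rule, and zero the velocity once no movement has been detected for a while so drift stops accumulating. The thresholds below are placeholders I'm still tuning:

```swift
struct PositionIntegrator {
    var velocity = 0.0
    var position = 0.0
    private var previousAccel = 0.0
    private var previousVel = 0.0
    private var stillCount = 0

    mutating func update(accel rawAccel: Double, dt: Double) {
        // Discrimination window: treat small accelerations as zero
        // to suppress sensor noise.
        let accel = abs(rawAccel) < 0.02 ? 0.0 : rawAccel

        // Trapezoidal integration: acceleration -> velocity -> position.
        velocity += (previousAccel + accel) * 0.5 * dt
        position += (previousVel + velocity) * 0.5 * dt
        previousAccel = accel
        previousVel = velocity

        // Movement-end check: after enough consecutive zero samples,
        // force velocity to zero so integration drift stops.
        stillCount = (accel == 0.0) ? stillCount + 1 : 0
        if stillCount >= 25 {
            velocity = 0.0
            previousVel = 0.0
        }
    }
}
```

The Kalman filter I'm working on is a simple scalar one run per axis before integration, along these lines (assuming a constant-value process model; q and r are guesses at this point):

```swift
struct ScalarKalman {
    var estimate = 0.0
    var errorCovariance = 1.0
    let q = 1e-4   // process noise
    let r = 1e-2   // measurement noise

    mutating func filter(_ measurement: Double) -> Double {
        errorCovariance += q                         // predict
        let gain = errorCovariance / (errorCovariance + r)
        estimate += gain * (measurement - estimate)  // update
        errorCovariance *= (1.0 - gain)
        return estimate
    }
}
```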
Has anyone had any luck determining a phone's translational movement? If so, how accurate did you get it, and what techniques did you use to achieve that accuracy?