
So this is a rather specific question about the OpenAL implementation on iOS. There are two parts.

1.) When humans hear sound, we can typically tell whether it is coming from behind or in front of us, because sound coming from behind is more muffled and reaches our inner ear differently.

Is this accounted for in the OpenAL implementation? I can't really tell from playing around with it.

2.) When humans hear sound, there is a slight delay between the time a sound reaches our left ear and our right ear depending on where the source is.

Is this accounted for in the OpenAL implementation?


1 Answer


1.) To achieve this effect, you need to apply an EFX extension filter to the sound to muffle it, or just decrease the volume manually when the source is behind the listener (see the sketch below).
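
A minimal sketch of the manual-volume fallback, using the plain OpenAL C API (nothing iOS-specific). The helper name attenuateIfBehind and the 0.5f attenuation factor are assumptions for illustration, not values from the spec:

    #include <OpenAL/al.h>  /* <AL/al.h> on non-Apple platforms */

    /* Assumed attenuation factor for sources behind the listener. */
    #define BEHIND_GAIN 0.5f

    void attenuateIfBehind(ALuint source)
    {
        ALfloat listenerPos[3];
        ALfloat orientation[6];  /* { at_x, at_y, at_z, up_x, up_y, up_z } */
        ALfloat sourcePos[3];

        alGetListenerfv(AL_POSITION, listenerPos);
        alGetListenerfv(AL_ORIENTATION, orientation);
        alGetSourcefv(source, AL_POSITION, sourcePos);

        /* Vector from the listener to the source. */
        ALfloat toSrc[3] = {
            sourcePos[0] - listenerPos[0],
            sourcePos[1] - listenerPos[1],
            sourcePos[2] - listenerPos[2]
        };

        /* A negative dot product with the listener's "at" (forward)
           vector means the source is behind the listener. */
        ALfloat dot = toSrc[0] * orientation[0]
                    + toSrc[1] * orientation[1]
                    + toSrc[2] * orientation[2];

        alSourcef(source, AL_GAIN, dot < 0.0f ? BEHIND_GAIN : 1.0f);
    }

You would call this each frame (or whenever the listener or source moves) for every source you want the front/back cue applied to.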

2.) There is no support for delaying a sound when the source is far away; setting AL_SPEED_OF_SOUND only affects the Doppler frequency shift.
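
For reference, a short sketch of what the speed-of-sound setting actually controls; the values shown are the OpenAL 1.1 defaults:

    #include <OpenAL/al.h>

    /* alSpeedOfSound feeds only into the Doppler pitch calculation;
       it does not introduce any propagation or interaural delay. */
    void configureDoppler(void)
    {
        alSpeedOfSound(343.3f);  /* meters per second, the OpenAL 1.1 default */
        alDopplerFactor(1.0f);   /* 0.0f disables the Doppler shift entirely */
    }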
