Questions tagged [audiobuffer]

92 questions
2
votes
0 answers

Passing an AudioBuffer to AudioContext Analyser in CreateJS

I have made an audioCtx in JavaScript using the AudioContext() class. I have an analyser made with audioCtx.createAnalyser(). If my audio is an audio tag and I make a source with audioCtx.createMediaElementSource(audio) then pass that to the…
Dan Zen
  • 480
  • 3
  • 10
2
votes
1 answer

How to convert an AudioBuffer to a mp3 file?

Is there an easy way of doing that, or do I need to interleave the channels and create a DataView that contains a specific header format as well as the interleaved data?
Maxime Dupré
  • 5,319
  • 7
  • 38
  • 72
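The interleave-plus-header approach the question describes can be sketched for WAV (browsers have no built-in mp3 encoder; a library such as lamejs would be needed for mp3). This is a minimal sketch, assuming the input is an array of Float32Array channels plus a sample rate; `encodeWav` is a hypothetical helper name.

```javascript
// Sketch: interleave Float32 channels and prepend a 44-byte RIFF/WAVE header.
// encodeWav is a hypothetical helper, not part of any library.
function encodeWav(channels, sampleRate) {
  const numChannels = channels.length;
  const numFrames = channels[0].length;
  const bytesPerSample = 2; // 16-bit PCM
  const dataSize = numFrames * numChannels * bytesPerSample;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);

  const writeString = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };

  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);   // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);             // fmt chunk size
  view.setUint16(20, 1, true);              // audio format: PCM
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
  view.setUint16(32, numChannels * bytesPerSample, true);              // block align
  view.setUint16(34, 16, true);             // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);

  // Interleave: frame 0 ch0, frame 0 ch1, frame 1 ch0, ...
  let offset = 44;
  for (let frame = 0; frame < numFrames; frame++) {
    for (let ch = 0; ch < numChannels; ch++) {
      const clamped = Math.max(-1, Math.min(1, channels[ch][frame]));
      view.setInt16(offset, Math.round(clamped * 0x7FFF), true);
      offset += 2;
    }
  }
  return buffer;
}
```

With an AudioBuffer, the channel arrays would come from `getChannelData(i)` and the sample rate from `buffer.sampleRate`.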
2
votes
2 answers

iOS Core Audio: how to get samples from AudioBuffer with interleaved audio

I have read an audio file into an AudioBufferList with the ExtAudioFileRead function. This is the ASBD for the audio: AudioStreamBasicDescription importFormat; importFormat.mFormatID = kAudioFormatLinearPCM; importFormat.mFormatFlags =…
jangofett
  • 59
  • 1
  • 12
2
votes
1 answer

Convert MediaElementAudioSourceNode to AudioBufferSourceNode

If you want to decode audio data, createMediaElementSource() does not work on mobile devices; however, the createBufferSource() method works properly. This code works in desktop web browsers, but not on mobile devices: var audioSource = new…
2
votes
2 answers

Javascript: Converting from Int16 to Float32

I'm trying to put a WAV file in an AudioBuffer so that I can manipulate it. I've created WAV files from an AudioBuffer before, and that required converting a Float32Array to a DataView containing Int16 values. I used this handy function I picked…
Paulie
  • 1,940
  • 3
  • 20
  • 34
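The reverse of the conversion the questioner already has can be sketched as follows: reading little-endian Int16 samples out of a DataView and scaling them into the [-1, 1) range an AudioBuffer expects. `int16ToFloat32` is a hypothetical helper name; the 0x8000 divisor mirrors the common Float32-to-Int16 encoding.

```javascript
// Sketch: decode 16-bit little-endian PCM from a DataView into a Float32Array.
// int16ToFloat32 is a hypothetical helper, not a standard API.
function int16ToFloat32(dataView, byteOffset = 0) {
  const length = (dataView.byteLength - byteOffset) / 2;
  const out = new Float32Array(length);
  for (let i = 0; i < length; i++) {
    const int16 = dataView.getInt16(byteOffset + i * 2, true); // true = little-endian
    out[i] = int16 / 0x8000; // map [-32768, 32767] into roughly [-1, 1)
  }
  return out;
}
```

The resulting Float32Array can then be copied into an AudioBuffer channel with `copyToChannel`.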
2
votes
1 answer

How to set AudioChannel as left headphone and right headphone?

I am playing audio as soon as any input is received from the microphone. I am using OSStatus for recording and playing audio, and both recording and playing are working fine. I have to activate the left headphone, the right headphone, or center as…
Nirmalsinh Rathod
  • 5,079
  • 4
  • 26
  • 56
2
votes
4 answers

How to create AudioBuffer/Audio from NSData

I am a beginner in streaming applications. I created NSData from an AudioBuffer and I am sending the NSData to the client (receiver), but I don't know how to convert NSData back to an AudioBuffer. I am using the following code to convert AudioBuffer to NSData (This…
2
votes
2 answers

Uncaught TypeError: Value is not of type AudioBuffer

I get this error when I try to run an XHR to load a sample. Uncaught TypeError: Value is not of type AudioBuffer. Everything seems to be right, but I'm not sure what the problem is. Kit.prototype.load = function(){ if(this.startedLoading) …
Cameron
  • 2,427
  • 1
  • 22
  • 27
2
votes
0 answers

Does JavaScript have abstractions which would provide a (MediaElement) API for AudioBuffer?

This would help to have an AudioBuffer as a drop-in replacement for the HTML5 audio element. You would not need to change the old playback code and event…
Mikko Ohtamaa
  • 82,057
  • 50
  • 264
  • 435
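No standard abstraction exists, but one can be sketched. The class below is a hypothetical, minimal MediaElement-like facade over an AudioBuffer, assuming the Web Audio `AudioContext` is passed in; only `play`, `pause`, `paused`, `duration`, and `currentTime` are modeled.

```javascript
// Hypothetical sketch: a MediaElement-style wrapper around an AudioBuffer.
// The AudioContext is injected so the timing logic is testable in isolation.
class BufferAudioElement {
  constructor(buffer, context) {
    this.buffer = buffer;
    this.context = context;
    this.paused = true;
    this._startedAt = 0; // context time when play() was called
    this._offset = 0;    // position within the buffer when paused
    this._source = null;
  }
  get duration() { return this.buffer.duration; }
  get currentTime() {
    return this.paused
      ? this._offset
      : this._offset + (this.context.currentTime - this._startedAt);
  }
  play() {
    if (!this.paused) return;
    // AudioBufferSourceNodes are one-shot, so a fresh node is created per play().
    this._source = this.context.createBufferSource();
    this._source.buffer = this.buffer;
    this._source.connect(this.context.destination);
    this._source.start(0, this._offset);
    this._startedAt = this.context.currentTime;
    this.paused = false;
  }
  pause() {
    if (this.paused) return;
    this._offset = this.currentTime;
    this._source.stop();
    this.paused = true;
  }
}
```

Events (`ended`, `timeupdate`) and seeking would need to be layered on top before this could truly replace an audio element.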
2
votes
0 answers

Passing an AudioBuffer to LibGdx FFT

I have a problem with LibGdx. I wish to make an audio visualizer for Android using LibGdx. I could use the Android FFT from the Visualizer class, but since I want my app to work on desktop too, I can't. What I want is to finally be able to make…
Psilopat
  • 51
  • 3
1
vote
1 answer

Remote IO audio is very noisy

I am new to Core Audio and Remote IO. I need data of size 320 bytes, which I encode and send, at a minimum of 50 frames per second. Here is what I have done: AudioComponentDescription desc; desc.componentType = kAudioUnitType_Output; …
Sujithra
  • 527
  • 1
  • 6
  • 14
1
vote
1 answer

While streaming audio chunk by chunk it has pause interval between chunks

I am currently working on implementing a streaming audio feature, and I've encountered an issue related to merging audio buffers using the AudioContext. My goal is to fetch 5-second audio chunks and play them to create a continuous audio…
callmenikk
  • 1,358
  • 2
  • 9
  • 24
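The usual fix for audible gaps between streamed chunks is to schedule each AudioBufferSourceNode at an accumulated start time on the AudioContext clock rather than calling `start()` when each chunk arrives. A minimal sketch, with the pure scheduling arithmetic factored out (`scheduleChunks` and the lead-in value are hypothetical, not part of any API):

```javascript
// Sketch: compute gapless start times for a sequence of chunk durations.
// contextCurrentTime would be audioCtx.currentTime in a browser; leadIn is a
// small safety margin so the first start() is not in the past.
function scheduleChunks(chunkDurations, contextCurrentTime, leadIn = 0.1) {
  let next = contextCurrentTime + leadIn;
  return chunkDurations.map((duration) => {
    const startAt = next;
    next += duration; // the following chunk begins exactly where this one ends
    return startAt;
  });
}

// Hypothetical browser usage for each decoded chunk i:
//   const source = audioCtx.createBufferSource();
//   source.buffer = decodedChunks[i];
//   source.connect(audioCtx.destination);
//   source.start(startTimes[i]);
```

Because `start(when)` is sample-accurate on the AudioContext clock, back-to-back start times remove the pauses that appear when playback is triggered from JavaScript timers or fetch callbacks.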
1
vote
0 answers

Create AudioBuffer from a WAV file in Google Drive; webContentLink field not available

I have to process many WAV files from Google Drive, so I need to obtain the AudioBuffers. I'm trying this code, but it returns the error: TypeError: response.body.getReader is not a function. The code I'm trying: const query = `parents='${folderId}'…
Eli7
  • 21
  • 4
1
vote
1 answer

How do I turn an array of Float into an AVAudioPCMBuffer?

I have an array of Float (representing audio samples) and I want to turn it into an AVAudioPCMBuffer so I can pass it to AVAudioFile's write(from:). There's an obvious way (actually not obvious at all, I cribbed it from this gist): var floats:…
Robert Atkins
  • 23,528
  • 15
  • 68
  • 97
1
vote
1 answer

AudioBufferSourceNode not looping when duration is set

I've been playing around with the JS Web Audio API. The thing I'm trying to achieve is to play a piece of a track in a loop. There's no problem playing the whole track in a loop, but if I define a duration then it doesn't loop anymore... I guess what I need is like…