
I am working on an Android music player. I am trying to convert audio bytes to RGB colors so that I can show a waveform in color while audio is playing. I am currently using the felixpalmer API, and it is able to generate a single-color waveform.

Here is the code:

private void setupVisualizerFxAndUI() {

    // Create the Visualizer object and attach it to our media player.
    Paint linePaint = new Paint();
    linePaint.setStrokeWidth(2f);
    linePaint.setAntiAlias(true);
    linePaint.setColor(ContextCompat.getColor(getContext(), R.color.colorPrimary));

    final LineRenderer lineRenderer = new LineRenderer(linePaint, true);
    mVisualizerView.addRenderer(lineRenderer);

    mVisualizer = new Visualizer(mediaPlayer.getAudioSessionId());
    mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
    mVisualizer.setDataCaptureListener(
            new Visualizer.OnDataCaptureListener() {
                public void onWaveFormDataCapture(Visualizer visualizer,
                                                  byte[] bytes, int samplingRate) {
                    mVisualizerView.updateVisualizer(bytes);
                    onFFTData(bytes, lineRenderer);
                }

                public void onFftDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
                    //onFFTData(bytes, lineRenderer);
                }
            }, Visualizer.getMaxCaptureRate() / 2, true, false);
}

Please point me to some resource where I can read about how to generate dynamic RGB colors from these audio bytes, so that my music player can show waves in dynamic colors instead of the single color it shows currently.

pradex

1 Answer


I noticed these two lines:

    final LineRenderer lineRenderer = new LineRenderer(linePaint, true);
    mVisualizerView.addRenderer(lineRenderer);

and I thought that looked promising. So I followed the GitHub link and found: https://github.com/felixpalmer/android-visualizer/blob/master/src/com/pheelicks/visualizer/renderer/LineRenderer.java

It's all there. He does some math on the audio data, generates some points, and then calls

      canvas.drawLines(mPoints, mPaint);

All you have to do is duplicate or subclass the LineRenderer class: calculate the color you want, then call mPaint.setColor(color) before the drawLines call.
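To make the "calculate the color you want" step concrete, here is a minimal, framework-free sketch. The class name WaveformColor and the red/blue mapping are my own invention, not part of the library; the one library fact it relies on is that Visualizer waveform bytes are unsigned 8-bit samples centered at 128, stored in a signed byte[].

```java
// Hypothetical helper: maps one frame of Visualizer waveform bytes to an
// ARGB color int. Visualizer delivers unsigned 8-bit samples in a signed
// byte[], centered at 128, so (b & 0xFF) - 128 is the signed amplitude.
class WaveformColor {

    /** Returns the mean absolute amplitude of the frame, in the range 0..128. */
    static int averageIntensity(byte[] waveform) {
        long sum = 0;
        for (byte b : waveform) {
            sum += Math.abs((b & 0xFF) - 128);
        }
        return (int) (sum / Math.max(1, waveform.length));
    }

    /** Maps intensity 0..128 to a color: quiet frames blue, loud frames red. */
    static int colorFor(byte[] waveform) {
        int intensity = averageIntensity(waveform);  // 0..128
        int red = Math.min(255, intensity * 2);      // louder -> more red
        int blue = 255 - red;                        // quieter -> more blue
        return 0xFF000000 | (red << 16) | blue;      // opaque ARGB int
    }
}
```

In your renderer subclass you could then call something like mPaint.setColor(WaveformColor.colorFor(bytes)) just before drawing. The mapping itself (amplitude to red/blue) is arbitrary; swap in whatever scheme suits your player.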

Because the Visualizer addRenderer method takes any Renderer instance (Renderer is the abstract class that all his renderers inherit from), you can create your own subclass and his visualizer will happily use your class to draw the visualization.

In fact, your code snippet shows a constructor LineRenderer(linePaint, true) which I don't even see in his source. He does have a constructor LineRenderer(Paint paint, Paint flashPaint, boolean cycleColor). When you set cycleColor to true, the color changes on each frame, so LineRenderer can in fact draw a waveform in changing colors. You can see in the source how he does it (line 99). Start with that as an example and work towards what you want.

His source code has all the ingredients you need. Happy coding.
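For comparison, the cycleColor behaviour boils down to advancing a hue each frame and converting it to RGB. Here is a rough standalone sketch of that idea; the class names and the step size are my own, and it reimplements the HSV-to-RGB conversion rather than using android.graphics.Color.HSVToColor so it runs outside Android:

```java
// Hypothetical sketch of per-frame color cycling: advance a hue and
// convert it to an opaque ARGB int. Call nextColor() once per frame.
class ColorCycler {
    private float hue = 0f;

    /** Advances the hue a few degrees and returns the resulting color. */
    int nextColor() {
        hue = (hue + 3f) % 360f;  // step size is arbitrary
        return hsvToArgb(hue, 1f, 1f);
    }

    /** Minimal HSV -> ARGB conversion for h in [0, 360), s and v in [0, 1]. */
    static int hsvToArgb(float h, float s, float v) {
        float c = v * s;
        float x = c * (1 - Math.abs((h / 60f) % 2 - 1));
        float m = v - c;
        float r, g, b;
        if (h < 60)       { r = c; g = x; b = 0; }
        else if (h < 120) { r = x; g = c; b = 0; }
        else if (h < 180) { r = 0; g = c; b = x; }
        else if (h < 240) { r = 0; g = x; b = c; }
        else if (h < 300) { r = x; g = 0; b = c; }
        else              { r = c; g = 0; b = x; }
        int ri = Math.round((r + m) * 255);
        int gi = Math.round((g + m) * 255);
        int bi = Math.round((b + m) * 255);
        return 0xFF000000 | (ri << 16) | (gi << 8) | bi;
    }
}
```

You could combine the two ideas: use the audio intensity to pick the hue (or the brightness) instead of stepping it blindly, which gives colors that actually follow the music rather than just cycling.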

kris larson
    I've done that and I know how it works, but I am using the above code to send RGB values to an LED device, and I need to generate the color from the audio bytes using wave intensity, while cycleColor from the source only cycles within a certain range. Also, I removed flashPaint from the constructor as it wasn't needed. – pradex Oct 19 '16 at 14:53