So I am getting back onto a project that I haven't touched in months. I was messing around with it a few days ago with no issues (at least after my most recent post before this). For whatever reason, today I went to run it and I've been wrestling with this issue for a few hours now...
Problem:
I am creating a FirebaseVisionImage object to use for various ML Vision tasks.
FirebaseVisionImage image = FirebaseVisionImage.fromMediaImage(mediaImage, rotation);
Pretty standard setup before that (similar to what they have in their example using CameraX with ImageAnalysis.Analyzer). For whatever reason, I am now getting this error that I wasn't getting before:
java.lang.ArithmeticException: divide by zero
at com.google.android.gms.internal.firebase_ml.zzrb.zza(com.google.firebase:firebase-ml-vision@@24.0.0:55)
at com.google.android.gms.internal.firebase_ml.zzrb.zza(com.google.firebase:firebase-ml-vision@@24.0.0:48)
at com.google.firebase.ml.vision.common.FirebaseVisionImage.fromMediaImage(com.google.firebase:firebase-ml-vision@@24.0.0:20)
at com.divertinc.visiondispositiontesting.MainActivity$4.analyze(MainActivity.java:248)
What I've done so far:
Okay, well, not a problem, let me just follow the stack trace. I can see that this line:
int var9 = var2 / var8;
is the issue. Okay, cool, so let's figure out those values.
Let's work backwards:
When I call FirebaseVisionImage.fromMediaImage, it's supposed to return this (based on my image):
return new FirebaseVisionImage(zzrb.zza(var2, var0.getWidth(), var0.getHeight()), (new Builder()).setFormat(17).setWidth(var0.getWidth()).setHeight(var0.getHeight()).setRotation(var1).build())
K cool, so we know the method in question here is: zza(Plane[] var0, int var1, int var2)
- K then the line that follows (as far as the error is concerned) is: zza(var0[0], var1, var2, var4, 0, 1);
- Which is then calling: zza(Plane var0, int var1, int var2, byte[] var3, int var4, int var5)
- Which finally calls: int var9 = var2 / var8 (I've sketched my reading of this whole chain just below)
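Putting those decompiled pieces together, my read of the failing path (paraphrased from the decompiled class, obviously not the real source) is roughly this:

// my paraphrase of what zza(Plane var0, int var1, int var2, byte[] var3, int var4, int var5)
// does with the Y plane (copyPlane is just my name for it)
static void copyPlane(Image.Plane plane, int height /* var2 */) {
    ByteBuffer buffer = plane.getBuffer();
    int rowStride = plane.getRowStride();
    // var8: how many rows of data the buffer actually holds (rounded up)
    int rowsInBuffer = (buffer.remaining() + rowStride - 1) / rowStride;
    // var9 = var2 / var8: the divide by zero, since remaining() == 0 makes rowsInBuffer == 0
    int rowsPerChunk = height / rowsInBuffer;
    // ...the rest copies the plane row by row into the NV21 byte[]...
}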
Now, I'm an Android Studio newbie, so I couldn't figure out how to inspect values inside the library's class files while stepping through. Instead, I went back to where I create the FirebaseVisionImage and, right before that, did this:
Image.Plane var0 = mediaImage.getPlanes()[0];  // Y plane
ByteBuffer var6 = var0.getBuffer();
int var2 = mediaImage.getHeight();
int var8 = (var6.remaining() + var0.getRowStride() - 1) / var0.getRowStride();  // same math as the library
Log.d("divide debug: ", String.valueOf(var2));  // height
Log.d("divide debug: ", String.valueOf(var8));  // var8, the divisor
Log.d("divide debug: ", String.valueOf(var6.remaining()));  // bytes left in the buffer
Log.d("divide debug: ", String.valueOf(var0.getRowStride()));  // row stride
Log.d("divide debug: ", String.valueOf((var6.remaining() + var0.getRowStride() - 1)));  // numerator
This resulted in:
480
0
0
640
639
K well, 639/640 is 0.998..., and if I remember how Java works, int division truncates toward zero, so var8 ends up as 0 and var2 / var8 is literally a divide by zero. Okay, I guess that makes complete sense then. As far as I remember, the only thing I changed between now and when it was working was dependency upgrades, which I've since downgraded again to see if that would change anything; based on the release notes it shouldn't have mattered either way.
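Just to sanity check that arithmetic in plain Java (nothing Firebase-specific here):

int remaining = 0, rowStride = 640, height = 480;   // the values from the log
int var8 = (remaining + rowStride - 1) / rowStride;  // 639 / 640 == 0, integer division truncates
int var9 = height / var8;                            // java.lang.ArithmeticException: divide by zero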
EDIT: So I found that ACTUALLY, when I look at the logging I added earlier, 4 frames are being analyzed, with the crash landing between the third and fourth. The values below are in the same order as the Log.d calls above:
Log 1:
480
480
307200
640
307839
Log 2:
480
0
0
640
639
Log 3:
480
480
307200
640
307839
CRASH MESSAGE
Log 4:
480
0
0
640
639
Where I'm stuck :(
The error happens less than a hundredth of a millisecond after the third log, and the fourth log is hit nearly 300 milliseconds after that.
So my assumption is that something is going wrong with buffering on every other frame, since .remaining() returns 0 for every other frame, which shouldn't be happening. Unfortunately I know very little about that side of things, so I wanted to see if anyone could point me in the right direction D: On the upside, I've learned a ton throughout the process of writing this post (I've been working on it for about 45 minutes).
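The only stopgap I can think of (a total guess at this point, not a real fix) would be to skip any frame whose Y-plane buffer comes back empty before handing it to Firebase, something like:

// hypothetical guard, assuming the empty buffers are what trips the division
Image.Plane yPlane = mediaImage.getPlanes()[0];
if (yPlane.getBuffer().remaining() == 0) {
    Log.d("analyze", "skipping frame with an empty Y-plane buffer");
    return;  // bail before FirebaseVisionImage.fromMediaImage divides by zero
}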
Below is my whole camera functionality, which I assume has a hell of a lot of issues as-is D:
private void startCamera() {
    //make sure there isn't another camera instance running before starting
    CameraX.unbindAll();

    /* start preview */
    int aspRatioW = txView.getWidth(); //get width of screen
    int aspRatioH = txView.getHeight(); //get height
    Rational asp = new Rational(aspRatioW, aspRatioH); //aspect ratio
    Size screen = new Size(aspRatioW, aspRatioH); //size of the screen

    //config obj for preview/viewfinder thingy.
    PreviewConfig pConfig = new PreviewConfig.Builder().setTargetResolution(screen).build();
    Preview preview = new Preview(pConfig); //lets build it

    preview.setOnPreviewOutputUpdateListener(
            new Preview.OnPreviewOutputUpdateListener() {
                //to update the surface texture we have to destroy it first, then re-add it
                @Override
                public void onUpdated(Preview.PreviewOutput output) {
                    ViewGroup parent = (ViewGroup) txView.getParent();
                    parent.removeView(txView);
                    parent.addView(txView, 0);
                    txView.setSurfaceTexture(output.getSurfaceTexture());
                    updateTransform();
                }
            });
    /* image capture */
    ImageCaptureConfig imgCapConfig = new ImageCaptureConfig.Builder()
            .setTargetRotation(getWindowManager().getDefaultDisplay().getRotation()).build();
    final ImageCapture imgCap = new ImageCapture(imgCapConfig);
    findViewById(R.id.imgCapture).setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            Log.d("image taken", "image taken");
        }
    });

    /* image analyser */
    ImageAnalysisConfig imgAConfig = new ImageAnalysisConfig.Builder()
            .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE).build();
    ImageAnalysis analysis = new ImageAnalysis(imgAConfig);

    analysis.setAnalyzer(
            Executors.newSingleThreadExecutor(), new ImageAnalysis.Analyzer() {
                @Override
                public void analyze(ImageProxy imageProxy, int degrees) {
                    while (weCanAnalyzeFrame) {
                        if (!isMachineLearning) {
                            Log.d("analyze", "just analyzing");
                            if (imageProxy == null || imageProxy.getImage() == null) {
                                Log.d("imageProxy: ", "is null");
                                return;
                            }
                            Image mediaImage = imageProxy.getImage();
                            int rotation = degreesToFirebaseRotation(degrees);
                            Log.d("degrees: ", String.valueOf(degrees));
                            Log.d("rotation: ", String.valueOf(rotation));
                            //same math the library does internally, logged per frame (see above)
                            Image.Plane var0 = mediaImage.getPlanes()[0];
                            ByteBuffer var6 = var0.getBuffer();
                            int var2 = mediaImage.getHeight();
                            int var8 = (var6.remaining() + var0.getRowStride() - 1) / var0.getRowStride();
                            // int var9 = var2 / var8;
                            Log.d("divide debug: ", String.valueOf(var2));
                            Log.d("divide debug: ", String.valueOf(var8));
                            Log.d("divide debug: ", String.valueOf(var6.remaining()));
                            Log.d("divide debug: ", String.valueOf(var0.getRowStride()));
                            Log.d("divide debug: ", String.valueOf((var6.remaining() + var0.getRowStride() - 1)));
                            Log.d("divide debug: ", " ");
                            FirebaseVisionImage image = FirebaseVisionImage.fromMediaImage(mediaImage, rotation);
                            Log.d("analyze", "isMachineLearning is about to be true");
                            isMachineLearning = true;
                            extractBarcode(image, image.getBitmap());
                        }
                    }
                }
            });

    //bind to lifecycle:
    CameraX.bindToLifecycle(this, analysis, imgCap, preview);
}
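(In case it's relevant: degreesToFirebaseRotation isn't shown above; it's the usual degrees-to-ROTATION_* switch, along the lines of the one in the ML Kit sample:)

// standard degrees -> FirebaseVisionImageMetadata rotation mapping (as in the ML Kit sample)
private int degreesToFirebaseRotation(int degrees) {
    switch (degrees) {
        case 0: return FirebaseVisionImageMetadata.ROTATION_0;
        case 90: return FirebaseVisionImageMetadata.ROTATION_90;
        case 180: return FirebaseVisionImageMetadata.ROTATION_180;
        case 270: return FirebaseVisionImageMetadata.ROTATION_270;
        default: throw new IllegalArgumentException("Rotation must be 0, 90, 180, or 270.");
    }
}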