Questions tagged [android-vision]

The vision package (com.google.android.gms.vision) provides common functionality for working with visual object detectors. The Mobile Vision APIs include detectors for optical character recognition, face detection, and barcode reading.

With the release of Google Play services 7.8, Google announced the addition of new Mobile Vision APIs, which include a new Face API that finds human faces in images and video better and faster than before. This API is also smarter at distinguishing faces at different orientations and with different facial features and facial expressions.

You need the Google Play services SDK level 25 or greater (Google Play services 7.8).

Note: the Face API is a leap forward from the previous Android FaceDetector.Face API.

For more info, see Face Detection in Google Play services.
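
A minimal sketch of what the Face API looks like in code, detecting faces in a Bitmap you already have (the Context and Bitmap are assumed to come from your own app):

    import android.content.Context;
    import android.graphics.Bitmap;
    import android.util.Log;
    import android.util.SparseArray;

    import com.google.android.gms.vision.Frame;
    import com.google.android.gms.vision.face.Face;
    import com.google.android.gms.vision.face.FaceDetector;

    public class FaceDetectionSketch {

        // Detects faces in a Bitmap and logs where they are and whether they smile.
        public static void detectFaces(Context context, Bitmap bitmap) {
            FaceDetector detector = new FaceDetector.Builder(context)
                    .setTrackingEnabled(false)                               // one-shot detection, no tracking
                    .setLandmarkType(FaceDetector.ALL_LANDMARKS)             // eyes, nose, mouth, ...
                    .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS) // smiling / eyes-open probabilities
                    .build();

            if (!detector.isOperational()) {
                // The native libraries may still be downloading on first use.
                return;
            }

            Frame frame = new Frame.Builder().setBitmap(bitmap).build();
            SparseArray<Face> faces = detector.detect(frame);

            for (int i = 0; i < faces.size(); i++) {
                Face face = faces.valueAt(i);
                Log.d("FaceDemo", "Face at " + face.getPosition()
                        + ", smiling probability " + face.getIsSmilingProbability());
            }

            detector.release(); // free native resources when done
        }
    }
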

194 questions
3
votes
0 answers

openCV NDK (/javaCV) vs FastCV Vs android Vision for face processing

I am trying to process faces in images on a device/tablet. I used OpenCV (NDK) a while ago. I see that there are a couple of other options available for processing faces. Just wondering how OpenCV, the android-vision API and FastCV would compare…
3
votes
2 answers

SurfaceView using Mobile Vision displays the camera in landscape

I am using SurfaceView and Google's Mobile Vision library. On many devices it looks fine, but on a few devices like the Nexus 7 the camera view comes up in landscape mode, which makes it difficult for scanning barcodes etc., as it is difficult to…
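
A minimal sketch of the CameraSource-to-SurfaceView wiring being discussed. It assumes the CAMERA permission is already granted and does not itself correct the rotation; the official vision samples solve the rotated preview with their CameraSourcePreview helper class, which is the usual fix for devices like the Nexus 7 mentioned above:

    import java.io.IOException;

    import android.content.Context;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;

    import com.google.android.gms.vision.CameraSource;
    import com.google.android.gms.vision.barcode.BarcodeDetector;

    public class PreviewSketch implements SurfaceHolder.Callback {

        private final CameraSource cameraSource;

        public PreviewSketch(Context context, SurfaceView surfaceView) {
            BarcodeDetector detector = new BarcodeDetector.Builder(context).build();
            cameraSource = new CameraSource.Builder(context, detector)
                    .setFacing(CameraSource.CAMERA_FACING_BACK)
                    .setRequestedPreviewSize(1600, 1024) // pick a size close to the view's aspect ratio
                    .setAutoFocusEnabled(true)
                    .build();
            surfaceView.getHolder().addCallback(this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                cameraSource.start(holder); // assumes the CAMERA permission was granted earlier
            } catch (IOException | SecurityException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            cameraSource.stop();
        }
    }
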
3
votes
0 answers

Google Vision API: Support for negative colors on QRCode detector

I was working on a QRCode reader within my app using the Vision API. I realized the API can't detect negative colors. My customer has thousands of cards with white QRCodes on a blue background. Attached is an example of 2 QRCodes. The first works…
Nom4d3
  • 51
  • 4
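
The detector generally expects dark modules on a light background, so one workaround worth trying for white-on-blue codes is to invert the bitmap before handing it to the BarcodeDetector. A sketch, assuming the card image is already available as a Bitmap:

    import android.content.Context;
    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.ColorMatrix;
    import android.graphics.ColorMatrixColorFilter;
    import android.graphics.Paint;
    import android.util.SparseArray;

    import com.google.android.gms.vision.Frame;
    import com.google.android.gms.vision.barcode.Barcode;
    import com.google.android.gms.vision.barcode.BarcodeDetector;

    public class InvertedQrSketch {

        // Inverts the colors of a bitmap so a light-on-dark QR code becomes dark-on-light.
        static Bitmap invert(Bitmap src) {
            Bitmap out = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
            ColorMatrix invertMatrix = new ColorMatrix(new float[] {
                    -1,  0,  0, 0, 255,
                     0, -1,  0, 0, 255,
                     0,  0, -1, 0, 255,
                     0,  0,  0, 1,   0});
            Paint paint = new Paint();
            paint.setColorFilter(new ColorMatrixColorFilter(invertMatrix));
            new Canvas(out).drawBitmap(src, 0, 0, paint);
            return out;
        }

        // Returns the raw value of the first QR code found in the inverted image, or null.
        public static String readQr(Context context, Bitmap cardImage) {
            BarcodeDetector detector = new BarcodeDetector.Builder(context)
                    .setBarcodeFormats(Barcode.QR_CODE)
                    .build();

            Frame frame = new Frame.Builder().setBitmap(invert(cardImage)).build();
            SparseArray<Barcode> codes = detector.detect(frame);
            detector.release();

            return codes.size() > 0 ? codes.valueAt(0).rawValue : null;
        }
    }
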
3
votes
0 answers

Android Vision Face Detection with Video Stream

I am trying to integrate the face detection API into a video stream I am receiving from a Parrot Bebop drone. The stream is decoded with the MediaCodec class (http://developer.android.com/reference/android/media/MediaCodec.html) and this is working…
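
If the decoded frames can be obtained as NV21 byte buffers, they can be wrapped in a Frame and pushed straight into a FaceDetector without a CameraSource. A sketch, where nv21Data, width and height are assumed to come from your own MediaCodec output handling:

    import java.nio.ByteBuffer;

    import android.content.Context;
    import android.graphics.ImageFormat;
    import android.util.SparseArray;

    import com.google.android.gms.vision.Frame;
    import com.google.android.gms.vision.face.Face;
    import com.google.android.gms.vision.face.FaceDetector;

    public class StreamFaceSketch {

        private final FaceDetector detector;
        private int frameId = 0;

        public StreamFaceSketch(Context context) {
            detector = new FaceDetector.Builder(context)
                    .setTrackingEnabled(true) // tracking helps across consecutive video frames
                    .build();
        }

        // Call this for every decoded frame (NV21 layout assumed).
        public SparseArray<Face> onFrame(byte[] nv21Data, int width, int height) {
            Frame frame = new Frame.Builder()
                    .setImageData(ByteBuffer.wrap(nv21Data), width, height, ImageFormat.NV21)
                    .setId(frameId++)
                    .setTimestampMillis(System.currentTimeMillis())
                    .build();
            return detector.detect(frame);
        }
    }
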
3
votes
1 answer

Android vision barcode-reader not working with some Samsung devices

I'm using the barcode-reader project on two Samsung devices (SM-N9005 and SM-G920F) and they are not capable of recognizing any barcode. There are no problems on others like the OnePlus One, or other Samsung devices like the S3 or S4. There is not a problem…
Mun0n
  • 4,438
  • 4
  • 28
  • 46
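
One failure mode that differs from device to device is that the detector's native files have not finished downloading yet, in which case detect() silently returns nothing. A quick check worth running before scanning; the low-storage check mirrors what the official barcode-reader sample does:

    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;
    import android.util.Log;

    import com.google.android.gms.vision.barcode.BarcodeDetector;

    public class DetectorCheckSketch {

        // Returns true only if the barcode detector is actually ready to be used.
        public static boolean isDetectorReady(Context context, BarcodeDetector detector) {
            if (detector.isOperational()) {
                return true;
            }
            // The native barcode library is downloaded on first use; until that finishes,
            // detect() returns no results. Low device storage can block the download.
            IntentFilter lowStorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
            boolean lowStorage = context.registerReceiver(null, lowStorageFilter) != null;
            Log.w("BarcodeCheck", lowStorage
                    ? "Detector dependencies cannot be downloaded: storage is low"
                    : "Detector dependencies are not yet available, try again later");
            return false;
        }
    }
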
2
votes
2 answers

Using OCR mobile vision to anchor image to detected text

I am using the Text Recognition (Mobile Vision/ML) API by Google to detect text in the camera feed. Once I detect text and ensure it is equal to "HERE WE GO", I draw a heart shape beside the detected text using the passed boundaries. The problem I am…
Snake
  • 14,228
  • 27
  • 117
  • 250
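
One way to anchor a drawable to the recognized phrase is to search the detected TextBlocks for the target string and use the block's bounding box to position the graphic. A sketch of that placement logic; the SparseArray is assumed to come from a TextRecognizer and the drawing to happen on an overlay View's Canvas, with the heart Bitmap being your own asset:

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Rect;
    import android.util.SparseArray;

    import com.google.android.gms.vision.text.TextBlock;

    public class TextAnchorSketch {

        // Draws the given bitmap just to the right of the block whose text matches the target.
        public static void drawBesideText(Canvas canvas, SparseArray<TextBlock> blocks,
                                          String target, Bitmap heart) {
            for (int i = 0; i < blocks.size(); i++) {
                TextBlock block = blocks.valueAt(i);
                if (block.getValue() != null && block.getValue().contains(target)) {
                    Rect box = block.getBoundingBox();
                    // Place the graphic immediately to the right, vertically centered on the text.
                    float left = box.right + 16;
                    float top = box.centerY() - heart.getHeight() / 2f;
                    canvas.drawBitmap(heart, left, top, null);
                }
            }
        }
    }

Note that in a live preview the detector reports coordinates in frame space, so they usually still need to be scaled and translated into view space before drawing (the samples' GraphicOverlay class does this).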
2
votes
0 answers

CameraSource.Builder setFlashMode method missing in latest android vision version

I'm building a simple barcode reader with the Android Vision API using googlesamples as a guide. The problem is that I want to add a toggle button for turning the flash on/off, but the method public Builder setFlashMode(@FlashMode String mode) used in the…
krdzy
  • 31
  • 4
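
A workaround that is often shared for this is to reach the underlying android.hardware.Camera inside CameraSource via reflection and toggle the flash there. This is a fragile hack, not a supported API, and it can break with any library update; a sketch:

    import java.lang.reflect.Field;

    import android.hardware.Camera;

    import com.google.android.gms.vision.CameraSource;

    public class FlashWorkaroundSketch {

        // Fragile workaround: digs the private Camera out of CameraSource via reflection
        // so its flash mode can be toggled. The field name is obfuscated, so the code
        // searches by type rather than by name.
        public static boolean setFlash(CameraSource cameraSource, boolean on) {
            for (Field field : CameraSource.class.getDeclaredFields()) {
                if (field.getType() == Camera.class) {
                    field.setAccessible(true);
                    try {
                        Camera camera = (Camera) field.get(cameraSource);
                        if (camera == null) {
                            return false; // camera source not started yet
                        }
                        Camera.Parameters params = camera.getParameters();
                        params.setFlashMode(on ? Camera.Parameters.FLASH_MODE_TORCH
                                               : Camera.Parameters.FLASH_MODE_OFF);
                        camera.setParameters(params);
                        return true;
                    } catch (IllegalAccessException e) {
                        return false;
                    }
                }
            }
            return false;
        }
    }
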
2
votes
1 answer

What is the difference between getCornerPoints() and getBoundingBox() in TextBlock (android vision)

I am confused about the difference between getCornerPoints() and getBoundingBox() in TextBlock, as they both seem to return the coordinates of the corner points of the bounding box. Can anybody clarify?
farahkh
  • 51
  • 7
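
The short version: getBoundingBox() returns an axis-aligned android.graphics.Rect, while getCornerPoints() returns the four corner Points of the detected region, which for tilted text trace a rotated quadrilateral rather than the enclosing rectangle. A small sketch that logs both so they can be compared on a rotated sample:

    import android.graphics.Point;
    import android.graphics.Rect;
    import android.util.Log;
    import android.util.SparseArray;

    import com.google.android.gms.vision.text.TextBlock;

    public class CornersVsBoxSketch {

        public static void logGeometry(SparseArray<TextBlock> blocks) {
            for (int i = 0; i < blocks.size(); i++) {
                TextBlock block = blocks.valueAt(i);

                // Axis-aligned rectangle that fully contains the text; never rotated.
                Rect box = block.getBoundingBox();

                // Four corners of the detected region; differs from the box for tilted text.
                Point[] corners = block.getCornerPoints();

                Log.d("Geometry", "box=" + box);
                for (Point p : corners) {
                    Log.d("Geometry", "corner=(" + p.x + "," + p.y + ")");
                }
            }
        }
    }
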
2
votes
1 answer

Can Mobile Vision API's Face Tracking, not Detection, be used independent of its CameraSource?

This question is about using Google's Mobile Vision Face API on Android. The Story (Background) and what I want to do: I am trying to implement a function that detects faces in a camera view and overlays images on those faces. Now, I have already…
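
Tracking does not strictly require CameraSource: a FaceDetector can be given a processor and tracker and then fed frames directly through receiveFrame() from whatever camera pipeline is in use. A sketch of that wiring, with the tracker callbacks left as placeholders for your own overlay logic:

    import android.content.Context;
    import android.graphics.Bitmap;

    import com.google.android.gms.vision.Detector;
    import com.google.android.gms.vision.Frame;
    import com.google.android.gms.vision.Tracker;
    import com.google.android.gms.vision.face.Face;
    import com.google.android.gms.vision.face.FaceDetector;
    import com.google.android.gms.vision.face.LargestFaceFocusingProcessor;

    public class TrackingWithoutCameraSourceSketch {

        private final FaceDetector detector;
        private int frameId = 0;

        public TrackingWithoutCameraSourceSketch(Context context) {
            detector = new FaceDetector.Builder(context)
                    .setTrackingEnabled(true)
                    .build();

            // The tracker receives callbacks as the face appears, moves and disappears.
            Tracker<Face> tracker = new Tracker<Face>() {
                @Override
                public void onNewItem(int id, Face face) { /* start drawing the overlay */ }

                @Override
                public void onUpdate(Detector.Detections<Face> detections, Face face) {
                    /* move the overlay to face.getPosition() */
                }

                @Override
                public void onMissing(Detector.Detections<Face> detections) { /* hide the overlay */ }

                @Override
                public void onDone() { /* clean up */ }
            };
            detector.setProcessor(new LargestFaceFocusingProcessor(detector, tracker));
        }

        // Feed frames from your own camera pipeline instead of CameraSource.
        public void pushFrame(Bitmap bitmap) {
            detector.receiveFrame(new Frame.Builder()
                    .setBitmap(bitmap)
                    .setId(frameId++)
                    .setTimestampMillis(System.currentTimeMillis())
                    .build());
        }
    }
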
2
votes
3 answers

Sort TextBlocks from top to bottom in the Vision API

While I am scanning for text using the Vision API, the overlay returns multiple text boxes as an unsorted list. So when I read the text by looping over them, I sometimes get the text in the wrong order, i.e., text from the bottom of the page appears first. Sample code…
Gunaseelan
  • 14,415
  • 11
  • 80
  • 128
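
The detector returns blocks keyed by an internal id rather than in reading order, so sorting them by their bounding box is the usual fix. A sketch that orders the blocks top to bottom, and left to right within the same row:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    import android.util.SparseArray;

    import com.google.android.gms.vision.text.TextBlock;

    public class SortBlocksSketch {

        // Copies the detected blocks out of the SparseArray and orders them by position.
        public static List<TextBlock> sortTopToBottom(SparseArray<TextBlock> detected) {
            List<TextBlock> blocks = new ArrayList<>();
            for (int i = 0; i < detected.size(); i++) {
                blocks.add(detected.valueAt(i));
            }
            Collections.sort(blocks, new Comparator<TextBlock>() {
                @Override
                public int compare(TextBlock a, TextBlock b) {
                    int byTop = Integer.compare(a.getBoundingBox().top, b.getBoundingBox().top);
                    return byTop != 0 ? byTop : Integer.compare(a.getBoundingBox().left,
                                                                b.getBoundingBox().left);
                }
            });
            return blocks;
        }
    }
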
2
votes
1 answer

Using Google's Text Recognition API to detect horizontal lines instead of blocks in images

Is there a way to detect full-sized, horizontal lines (max width) instead of text blocks in images using Google's Text Recognition API? Say, if I wanted to retrieve the total due from a receipt image like this: ... because as of now, the API…
DaveNOTDavid
  • 1,753
  • 5
  • 19
  • 37
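
The API does not return full-width ruled lines, but each TextBlock can be broken into its Line components via getComponents(), which is often close enough for receipt-style layouts. A sketch, assuming the SparseArray comes from a TextRecognizer; merging lines that share the same vertical band across blocks is left out:

    import java.util.ArrayList;
    import java.util.List;

    import android.util.SparseArray;

    import com.google.android.gms.vision.text.Line;
    import com.google.android.gms.vision.text.Text;
    import com.google.android.gms.vision.text.TextBlock;

    public class LinesFromBlocksSketch {

        // Flattens the detected blocks into individual text lines.
        public static List<String> extractLines(SparseArray<TextBlock> blocks) {
            List<String> lines = new ArrayList<>();
            for (int i = 0; i < blocks.size(); i++) {
                TextBlock block = blocks.valueAt(i);
                for (Text component : block.getComponents()) {
                    if (component instanceof Line) {
                        lines.add(component.getValue());
                    }
                }
            }
            return lines;
        }
    }
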
2
votes
0 answers

inflate error google mobile vision

Hey guys, for my school project I'm building a Dutch license plate scanner. I came a long way with just the text, but now I'm trying to add the actual scanning part. Therefore I am using the Google Mobile Vision code. After running the code samples…
MESP
  • 486
  • 2
  • 17
2
votes
1 answer

Is image pre-processing required (Google Mobile Vision Text Recognition API)?

In short: Is it needed or not to improve accuracy? A bit longer: I went through the documentation and the Internet and I did not find any references concerning Mobile Vision and whether it does some sort of image pre-processing by itself or…
KaljaTolkki
  • 133
  • 1
  • 7
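
The library does not document any built-in pre-processing, so in practice it can be worth experimenting with a simple grayscale and contrast pass of your own and comparing recognition results with and without it. A sketch of one such pass, assuming the input is a Bitmap you control:

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.ColorMatrix;
    import android.graphics.ColorMatrixColorFilter;
    import android.graphics.Paint;

    public class PreprocessSketch {

        // Produces a grayscale, contrast-adjusted copy of the input for OCR experiments.
        public static Bitmap preprocess(Bitmap src, float contrast) {
            ColorMatrix matrix = new ColorMatrix();
            matrix.setSaturation(0f); // drop color information

            // Simple linear contrast adjustment around mid-gray.
            float translate = (1f - contrast) * 128f;
            ColorMatrix contrastMatrix = new ColorMatrix(new float[] {
                    contrast, 0, 0, 0, translate,
                    0, contrast, 0, 0, translate,
                    0, 0, contrast, 0, translate,
                    0, 0, 0, 1, 0});
            matrix.postConcat(contrastMatrix);

            Bitmap out = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
            Paint paint = new Paint();
            paint.setColorFilter(new ColorMatrixColorFilter(matrix));
            new Canvas(out).drawBitmap(src, 0, 0, paint);
            return out;
        }
    }
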
2
votes
0 answers

How to read a single character with Google OCR vision-samples

Is there any way to recognize a single character with the Android OCR vision-samples? I know it is a known issue: the OCR works well with lines but not with single characters or numbers. I'm wondering if it could be possible in a scenario where there is a…
Giulio Pettenuzzo
  • 786
  • 1
  • 5
  • 20
2
votes
1 answer

How to change targetSandboxVersion in production update?

I updated my production app with targetSandboxVersion="2" because the Google Play Console wouldn't let me release my instant app without that. Come to find out, that was a bug that was fixed by the Play Console team. A Google engineer mentioned, in…
Josh Logier
  • 103
  • 1
  • 8