Is it possible for the Mobile Vision API to detect Chinese, Japanese and Korean text? If not, is there any way to detect these languages on Android?
- Try with the Google Translate API. – Android Geek Jun 07 '17 at 06:33
- Does the Google Translate API support OCR? – Rice Jun 07 '17 at 07:23
2 Answers
The Google Cloud Vision API supports a much larger set of languages for OCR, including Chinese, Japanese and Korean: https://cloud.google.com/vision/docs/languages
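As a sketch of what switching to the cloud service looks like, a text-detection request is a `POST` to `https://vision.googleapis.com/v1/images:annotate` with a body like the one below. The field names follow the public REST reference; the bucket path is a placeholder, and `languageHints` is optional (the service usually auto-detects the script):

```json
{
  "requests": [
    {
      "image": { "source": { "imageUri": "gs://your-bucket/sign.jpg" } },
      "features": [ { "type": "TEXT_DETECTION" } ],
      "imageContext": { "languageHints": ["zh", "ja", "ko"] }
    }
  ]
}
```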

Joseph Lam
- You should probably mention that switching to a service from a library will also mean adding character limitations and possible fees. – Abandoned Cart Feb 03 '20 at 14:50
You can detect Chinese, Japanese and Korean with Tesseract, but you will not get 100% accuracy.
Tutorial link: http://imperialsoup.com/2016/04/29/simple-ocr-android-app-using-tesseract-tutorial/
Other language data files: https://github.com/tesseract-ocr/tessdata
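As a sketch of the setup this answer relies on: the tess-two wrapper used in the tutorial expects each language pack at `<dataPath>/tessdata/<lang>.traineddata` before `TessBaseAPI.init(dataPath, "kor")` is called, so the `.traineddata` file (usually shipped in the APK's assets) has to be copied out on first run. The helper below shows only that directory layout in plain `java.io`/`java.nio`; the Android asset-manager plumbing is omitted:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class TessSetup {
    // Copies a .traineddata stream into <dataDir>/tessdata/<lang>.traineddata
    // and returns the directory that the OCR engine should be initialised with.
    static Path installPack(InputStream pack, Path dataDir, String lang) throws IOException {
        Path tessdata = dataDir.resolve("tessdata");
        Files.createDirectories(tessdata);
        Files.copy(pack, tessdata.resolve(lang + ".traineddata"),
                   StandardCopyOption.REPLACE_EXISTING);
        return dataDir;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in bytes for the real kor.traineddata download.
        Path dir = Files.createTempDirectory("ocr");
        installPack(new ByteArrayInputStream(new byte[]{1, 2, 3}), dir, "kor");
        System.out.println(Files.exists(dir.resolve("tessdata/kor.traineddata")));
        // prints: true
    }
}
```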

Janak
- Thanks. I used this before. It takes 3s to detect English but 30s to detect Chinese and Korean. Do you know why? – Rice Jun 07 '17 at 07:22
- I have never tried it for languages other than English, but if you provide a grayscale image it may take less time. – Janak Jun 07 '17 at 07:23
- How can I improve Korean recognition? Is there no way to improve it by myself? Do I have to use the provider's data file? – c-an Nov 15 '18 at 09:32
- @ChanjungKim For better results and faster processing you can now use the Google Vision API, which also supports Korean. Check this link: https://cloud.google.com/vision/docs/languages and try the API: https://cloud.google.com/vision/ – Janak Nov 15 '18 at 19:53
- @ChanjungKim Yes, I have checked the same for you. The best option is to use the Korean language pack (trained data): https://github.com/tesseract-ocr/tessdata/blob/master/kor.traineddata or https://github.com/tesseract-ocr/tessdata/blob/master/kor_vert.traineddata. You need to supply a very clear, zoomed-in, grayscale image so it processes faster than before (do some image processing yourself). – Janak Nov 15 '18 at 20:20
- @Janak A `very clear, zoom and grayscale image` is not practical for any sort of application outside of a proof of concept. – Abandoned Cart Feb 03 '20 at 14:51
- @AbandonedCart You are right, a very clear, zoomed-in grayscale image is not practical for most applications. But you can provide some image-processing functionality in which a user can zoom the image, with grayscale applied by default, so users get the best result. – Janak Feb 12 '20 at 20:02
- @Janak Are you demanding I provide something with the same limitations, or did you intend to demand I provide something with fewer limitations? I should have warned you that I didn't point out your flaws without considering the argument "but what else is there?" I use https://ocr.space/OCRAPI with full-color and somewhat blurry images and have had nothing but excellent results. The caveat is that you ship the image off to a server in real time, but it is worth it to not cripple the experience. – Abandoned Cart Feb 13 '20 at 16:26