My team is working on a mobile web application using the Cordova framework, initially targeting Android. A feature of the application is to record video on the user's phone via a custom media capture plugin, save it locally, read it using the Cordova File plugin (cordova-plugin-file), and stream it to a Node.js server for distribution to other connected users via the Stream API.
Devices receiving the stream save each incoming chunk within an array of ArrayBuffers, and then convert this to a Blob via the Blob constructor:
// Chunks of video data received from the Node.js server,
// each saved as an ArrayBuffer; the ArrayBuffer[] is passed
// to the Blob constructor as its array of blob parts.
let receivedChunks: ArrayBuffer[] = [];
const videoBlob = new Blob(receivedChunks, { type: 'video/mp4' });
We then use the File plugin to write this Blob to our application's Android cacheDirectory and obtain a file:/// URL, which we use to load the video into an HTML5 <video> element. The app queues up playback of these <video> elements using the Media API and media events.
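For reference, the write-and-play step looks roughly like the sketch below. `saveBlobToCache` is a hypothetical helper of ours that wraps the real cordova-plugin-file APIs (resolveLocalFileSystemURL, FileEntry.createWriter) in a Promise; error handling is simplified:

```typescript
// Assumes cordova-plugin-file is installed and 'deviceready' has fired.
declare const window: any;
declare const cordova: any;

// Hypothetical helper: write the Blob into the app's cacheDirectory and
// resolve with the file:/// URL we assign to the <video> element's src.
function saveBlobToCache(blob: Blob, fileName: string): Promise<string> {
  return new Promise((resolve, reject) => {
    window.resolveLocalFileSystemURL(cordova.file.cacheDirectory, (dir: any) => {
      dir.getFile(fileName, { create: true, exclusive: false }, (entry: any) => {
        entry.createWriter((writer: any) => {
          writer.onwriteend = () => resolve(entry.toURL()); // file:/// URL
          writer.onerror = reject;
          writer.write(blob);
        }, reject);
      }, reject);
    }, reject);
  });
}

// Rebuild the Blob from the received chunks, exactly as above.
function chunksToBlob(chunks: ArrayBuffer[]): Blob {
  return new Blob(chunks, { type: 'video/mp4' });
}
```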
None of the published Cordova (or PhoneGap) plugins quite suited our UI requirements, so we wrote our own based on the Camera2 API (we've sacked off support for Android 4.x and below for the time being). We based our plugin on the Google samples, and it worked fine until we hit the same issue another Stack Overflow user has reported: Camera2 video recording without preview on Android: mp4 output file not fully playable
Turns out there are some issues with Deep Sleep on Samsung Galaxy devices running Android 6.0 Marshmallow (a Galaxy S7, in our case). We implemented the workaround described in my answer to that question, which partly solved the problem but left us with scrambled metadata: we lost the device orientation hints (our app uses sensorLandscape to keep the UI the right way up, so we have to apply an orientation fix to recorded video to stop it playing back upside-down).
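For context, the orientation hint we restore is just a lookup from the display rotation. Our plugin's DEFAULT_ORIENTATIONS table (taken from the Google Camera2 samples) amounts to the following mapping, shown here in TypeScript for brevity (the real table lives in the Java plugin as a SparseIntArray keyed by Surface.ROTATION_*; treat the exact values as our assumption from the sample code):

```typescript
// Mirrors the Camera2 sample's DEFAULT_ORIENTATIONS table. Keys are the
// Surface.ROTATION_* constants (0..3); values are the degrees we pass to
// MediaMuxer.setOrientationHint().
const DEFAULT_ORIENTATIONS: Record<number, number> = {
  0: 90,   // Surface.ROTATION_0
  1: 0,    // Surface.ROTATION_90
  2: 270,  // Surface.ROTATION_180
  3: 180,  // Surface.ROTATION_270
};

function orientationHintFor(rotation: number): number {
  return DEFAULT_ORIENTATIONS[rotation] ?? 0;
}
```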
So we took our workaround a little further and decided to re-mux the corrected video into a fresh MP4 container (despite the method name, this copies samples rather than re-encoding them):
private void transcodeVideo(String pathToVideo) throws IOException {
    // Pull the samples out of the recorded file...
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(pathToVideo);
    int trackCount = extractor.getTrackCount();

    // ...and copy them into a fresh MP4 container.
    MediaMuxer muxer = new MediaMuxer(pathToVideo + "transcoded",
            MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

    // Map extractor track indices to muxer track indices.
    HashMap<Integer, Integer> indexMap = new HashMap<Integer, Integer>(trackCount);
    for (int i = 0; i < trackCount; i++) {
        extractor.selectTrack(i);
        MediaFormat format = extractor.getTrackFormat(i);
        int dstIndex = muxer.addTrack(format);
        indexMap.put(i, dstIndex);
    }

    boolean sawEOS = false;
    int bufferSize = 256 * 1024;
    int frameCount = 0;
    int offset = 100;
    ByteBuffer dstBuf = ByteBuffer.allocate(bufferSize);
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();

    // Restore the orientation hint that the deep-sleep workaround lost.
    muxer.setOrientationHint(DEFAULT_ORIENTATIONS.get(_savedVideoRotation));
    muxer.start();

    while (!sawEOS) {
        bufferInfo.offset = offset;
        bufferInfo.size = extractor.readSampleData(dstBuf, offset);
        if (bufferInfo.size < 0) {
            Log.d(TAG, "saw input EOS.");
            sawEOS = true;
            bufferInfo.size = 0;
        } else {
            bufferInfo.presentationTimeUs = extractor.getSampleTime();
            bufferInfo.flags = MediaCodec.BUFFER_FLAG_KEY_FRAME;
            int trackIndex = extractor.getSampleTrackIndex();
            muxer.writeSampleData(indexMap.get(trackIndex), dstBuf, bufferInfo);
            extractor.advance();
            frameCount++;
            Log.d(TAG, "Frame (" + frameCount + ") "
                    + "PresentationTimeUs:" + bufferInfo.presentationTimeUs
                    + " Flags:" + bufferInfo.flags
                    + " TrackIndex:" + trackIndex
                    + " Size(KB) " + bufferInfo.size / 1024);
        }
    }

    muxer.stop();
    muxer.release();
}
This is where things get weird.
The re-muxed video seems to play back just fine on most other devices, including a rather long-in-the-tooth Moto G LTE (1st gen). However, when we stream and save more than a couple of videos at the same time on the Moto G, the re-muxed video stops playing properly. There's no audio or video, but the <video> element emits all the normal media events we'd expect to see if the video were playing properly; in particular, the 'ended' event fires after the expected duration.
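One way we've considered narrowing down whether the element is actually decoding anything is to inspect getVideoPlaybackQuality() when 'ended' fires (the API is standard, though support varies across Android WebViews). A hypothetical diagnostic sketch, with the decision logic factored out:

```typescript
// Hypothetical check: if the element reports a positive duration but zero
// decoded frames at 'ended', the clip "played" its duration without ever
// decoding anything -- the symptom we see on the Moto G.
function playbackLooksDead(decodedFrames: number, durationSec: number): boolean {
  return durationSec > 0 && decodedFrames === 0;
}

// In the app this would be wired up roughly as:
// video.addEventListener('ended', () => {
//   const q = (video as any).getVideoPlaybackQuality?.();
//   if (q && playbackLooksDead(q.totalVideoFrames, video.duration)) {
//     // log / retry the download instead of silently advancing the queue
//   }
// });
```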
If we only stream and save two videos, the Moto G plays the re-muxed video just fine. Other devices in the same session (all receiving the same set of videos from the server) have no problem with the S7's re-muxed video. If we remove the S7 from the session, we sometimes see the same problem on the Moto G and sometimes don't, but the failure is 100% consistent whenever the S7's re-muxed video is involved.
Is there anything obviously wrong with our MP4 encoding? Is anyone aware of issues with simultaneously writing multiple files to the flash storage of a slower Android device like a Moto G? Has anyone else seen this odd playback behaviour, where a video element fires media events without actually playing audio or video?
I'm aware that this question may be a little lacking in focus, and there are a lot of variables involved (multiple possible points of failure in code, multiple devices, and it's unclear whether the problem is in encoding, playback, or something else), but if it rings a bell for anyone and they can provide a little insight, it would be greatly appreciated!