Using libmp4v2 to record live audio/video streams to an MP4 container


Over the past few days, I have been researching a recording library to write live audio/video streams from a video conference server or peer into an MP4 file container.

Requirements

Before we get to the implementation, I'd like to say a bit more about the working environment and the functional requirements.

1. Audio format: The audio in the live stream can be encoded with several different codecs, including but not limited to G.722.1C, MP3, and AAC; in most circumstances, though, it will be G.722.1C. Regardless of the codec, most of the audio sample parameters are the same: sample rate 32000 Hz, 16 bits per sample, 1 channel (except for AAC-LC and AAC-LD).

2. Video format: Just like the audio, the video can be H.264, H.263, or MPEG-4, depending on which Kedacom product is sending the stream.

3. What do we need to do?

a. Write the live audio and video streams to a common container that is supported by most popular players on different OSs (Windows, Linux, Android, iOS, etc.).

b. Minimize the processing cost. At the very least we cannot afford to transcode the video; transcoding the audio, however, seems unavoidable if it is encoded in a format like G.722.1C.

c. The target container should preferably support dual streams, because we may receive multiple video streams at the same time (video captured from the desktop and from the camera) that share one copy of the audio.

Target solution

MP4 with H.264 video + PCMA audio

Given these requirements, my first attempt was to use libmp4v2 to record H.264 video together with PCMA (G.711 A-law) audio.

Video samples are written into the MP4 file directly; audio samples are first decoded to PCM, resampled from 32000 Hz down to 8000 Hz, re-encoded to PCMA, and then written into the MP4 file.
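The audio path of this approach can be sketched as follows. This is a self-contained illustration, not our production code: a textbook G.711 A-law compressor plus a naive 4-sample-averaging decimator from 32 kHz down to 8 kHz. (In the real recorder the resulting bytes would be handed to libmp4v2, e.g. via MP4WriteSample on an A-law audio track.)

```c
#include <stddef.h>

/* G.711 A-law compression of one 16-bit PCM sample
   (classic Sun g711.c algorithm; A-law uses 13-bit precision). */
static unsigned char linear2alaw(int pcm_val)
{
    static const int seg_aend[8] = {
        0x1F, 0x3F, 0x7F, 0xFF, 0x1FF, 0x3FF, 0x7FF, 0xFFF
    };
    int mask, seg;
    unsigned char aval;

    pcm_val >>= 3;                 /* 16-bit -> 13-bit */
    if (pcm_val >= 0) {
        mask = 0xD5;               /* sign bit set */
    } else {
        mask = 0x55;
        pcm_val = -pcm_val - 1;
    }

    for (seg = 0; seg < 8; seg++)  /* find segment number */
        if (pcm_val <= seg_aend[seg])
            break;

    if (seg >= 8)                  /* out of range: clamp to maximum */
        return (unsigned char)(0x7F ^ mask);

    aval = (unsigned char)(seg << 4);
    aval |= (seg < 2 ? (pcm_val >> 1) : (pcm_val >> seg)) & 0xF;
    return aval ^ mask;
}

/* Crude 32 kHz -> 8 kHz decimation: average each group of 4 samples,
   then A-law-compress the result.  Averaging is only a rough low-pass;
   without a proper anti-aliasing filter the output quality suffers. */
static size_t pcm32k_to_alaw8k(const short *in, size_t n, unsigned char *out)
{
    size_t i, o = 0;
    for (i = 0; i + 4 <= n; i += 4) {
        int avg = (in[i] + in[i + 1] + in[i + 2] + in[i + 3]) / 4;
        out[o++] = linear2alaw(avg);
    }
    return o; /* number of A-law bytes produced */
}
```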

Everything went smoothly, except that the audio in the resulting MP4 file sounded terrible because of the resampling step. I used CoolEdit to inspect the PCM samples before and after resampling, and found that the PCM buffer was already ruined right after the resample process.

So it seems this solution (H.264 + PCMA) is not suitable for us.

MP4 with H.264 video + AAC audio

Actually, I had tried this solution first, but I made a so-called "minor" mistake when converting the BYTE buffer to a FLOAT buffer without handling little-endian vs. big-endian byte order correctly. The result of this "minor" mistake was constant loud noise in the background of the audio.
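The fix amounts to assembling each 16-bit sample from its two bytes explicitly instead of blindly casting the byte pointer to a short pointer. A sketch (the function name is mine), assuming the stream delivers little-endian 16-bit PCM:

```c
#include <stddef.h>
#include <stdint.h>

/* Convert little-endian 16-bit PCM bytes to normalized floats.
   Building each sample from its two bytes explicitly is correct on any
   host, unlike casting the byte pointer to (short *), which silently
   assumes the host's endianness matches the stream's. */
static void pcm16le_bytes_to_float(const uint8_t *in, size_t nbytes, float *out)
{
    size_t i;
    for (i = 0; i + 2 <= nbytes; i += 2) {
        /* low byte first: little-endian */
        int16_t s = (int16_t)((uint16_t)in[i] | ((uint16_t)in[i + 1] << 8));
        out[i / 2] = (float)s / 32768.0f;
    }
}
```

Getting the byte order wrong here scrambles the magnitude of every sample, which is exactly why the output sounded like pure noise.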

After reviewing my code once, twice, again and again, and writing several test programs to compare the output, I eventually dragged the bug out.

What a disaster I'd made. Sigh… wasting my life on this kind of thing.

Some detail

Some of the libraries I used for this job:

FAAC source code: http://sourceforge.net/projects/faac/ or, alternatively,

the AAC encoder from Android's libstagefright (vo-aacenc): http://ffmpeg.zeranoe.com/builds/source/external_libraries/vo-aacenc-0.1.2.tar.xz

libmp4v2 source code: http://code.google.com/p/mp4v2/

In fact, when I ran into the noise issue while writing H.264 + AAC to an MP4 file with libmp4v2, I suspected at one point that libmp4v2 itself might be buggy, so I tried libavformat (ffmpeg) to write the container instead; the result remained the same.

And one more issue:

As I said before, the audio in the live stream is mono, but after I encoded it to AAC with libfaac, with the parameters set properly, VLC reported the audio as stereo. This also confused me for a while; after digging into the AAC codec information via Google, it turned out not to be an issue, at least not for me.
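For reference, the libfaac setup looked roughly like this. It is a sketch from memory of the faac API (the helper name open_mono_aac is mine, and error checks are omitted); the channel count passed to faacEncOpen really is 1, even when a player's UI later reports the decoded track as stereo:

```c
#include <faac.h>

/* Open a mono (1-channel) AAC-LC encoder at 32 kHz.  faacEncOpen reports
   how many PCM samples to feed per call and how large the output buffer
   must be. */
static faacEncHandle open_mono_aac(unsigned long *inputSamples,
                                   unsigned long *maxOutputBytes)
{
    faacEncHandle enc = faacEncOpen(32000, 1, inputSamples, maxOutputBytes);
    faacEncConfigurationPtr cfg = faacEncGetCurrentConfiguration(enc);

    cfg->mpegVersion   = MPEG4;
    cfg->aacObjectType = LOW;              /* AAC-LC */
    cfg->inputFormat   = FAAC_INPUT_FLOAT; /* note: faac expects floats in the
                                              16-bit range (+/-32768), not +/-1 */
    cfg->outputFormat  = 0;                /* raw AAC (no ADTS) for MP4 muxing */
    faacEncSetConfiguration(enc, cfg);

    /* The AudioSpecificConfig belongs in the MP4 audio track, e.g.:
       faacEncGetDecoderSpecificInfo(enc, &asc, &ascLen);
       MP4SetTrackESConfiguration(file, audioTrack, asc, ascLen);       */
    return enc;
}
```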

 

