June 2013: Graduation season

Graduation season

June 28th, 2013, a Friday, happened to be a free day for me, and it also happened to be Catherine's last day of kindergarten. Congratulations! She'll soon be a primary school pupil.

And she now has one more certificate than her father: a painting certificate. :'( :'( :'( 😀 😀 😀

I ain’t got no worries. We ain’t got no worries

The HEAT have won this season's NBA championship. LBJ got his second ring in two years. After the game, he said: "I ain't got no worries. We ain't got no worries," no matter how many difficulties they have encountered. I've never been a true fan of the NBA, not even when Yao Ming was still playing, but I've heard a lot about LBJ: that he's a fighter, maybe the greatest one, and that he's a really nice guy. So, all I want to say is congratulations!

In the meantime, the words also apply to me, because I got hurt, again, in last Friday (June 28th) evening's badminton game. I ain't got no worries either, after winning my own "championship".


I ain’t got no worries, either.

Recruiting/Calling former colleagues

Once upon a time, when I joined Kedacom, my supervisor asked me about my plans at Kedacom. I said my first goal here was to get to know more about Kedacom and its products, then to lead a team and take charge of a specific technical area; but my very first goal should be to pass the probation.

Earlier this month, I was told by my supervisor, for the second time, that there are higher expectations of me here; he wishes I could play a more important role in the division. There are vacant positions now, and he asked me whether I could introduce some people to Kedacom. Once that is done, he can assign them to me, and maybe I can be their leader.
So I tried to reach out to several former colleagues, like Ben, Bruse, XiaoQ, Jason…
The sad thing is that all of them seem to have better opportunities at their new companies, and some of them even have a better salary than mine (at Kedacom). All I can say is congratulations. :'(:'(:'(
Ben was once the best shot, however he's in a special period now: he's going to apply for a Shanghai Hukou in the coming months, after the famous seven years of tax contribution in Shanghai.
I told him to confirm with the HR or administration department of his employer whether he meets all the requirements, and then give me a call ASAP once he got the result.
But… weeks have passed, and no call so far. So it seems I can send my best wishes to him too now.

Motivation can sometimes be a double-edged blade.

After finishing the entrance exam last month, someone asked me why I put so much time into it, compared to the quarter to a third of classmates who did not attend most of the courses.
I told him it's because I'm having a good time here: my hard work paid off, and I feel recognized. That's my motivation. I need to be recognized by others, and I need to be recognized by myself. This is a typical reaction for most of us: when you feel recognized, you put more effort in.
Back to the career topic: when I decided to join Kedacom, I was, and still am, confident in myself, believing that I could prove myself in a short time, and blah-blah-blah.

But it seems now that I was naive.

I did a lot of things in the past months at Kedacom, resolving several technical points for Kedacom's online products. I was so eager to be recognized.

However, I'm still under probation, right on schedule, rather than passing it in advance. In the meantime, I did not put much effort into some training courses that don't overlap with my job responsibilities (for now, and in my own opinion); I thought they were not relevant, but it seems they are. And I skipped some of the code analysis steps while coding some extra but independent functions for KdvMediaSDK; my focus was directly on the outcome, and I skipped some of the new company's coding style and coding rules, which also seems like a big mistake for a senior programmer.

So I must keep telling myself now, be patient.

Start GPU encoding/decoding with VA-API (Video Acceleration API)

About

The main motivation for VA-API (Video Acceleration API) is to enable hardware accelerated video decode/encode at various entry-points (VLD, IDCT, Motion Compensation etc.) for the prevailing coding standards today (MPEG-2, MPEG-4 ASP/H.263, MPEG-4 AVC/H.264, and VC-1/WMV3). Extending XvMC was considered, but due to its original design for MPEG-2 MotionComp only, it made more sense to design an interface from scratch that can fully expose the video decode capabilities in today's GPUs.

The current video decode/encode interface is window system independent, so that potentially it can be used with graphics sub-systems other than X. In a nutshell it is basically a scheme to pass various types of data buffers from the application to the GPU for decoding or encoding. Feedback on the API is greatly welcomed, as this is intended to be a community collaborative effort.
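
As a quick orientation before the links below, here is a minimal, hedged sketch (mine, not from the wiki) of how an application typically brings VA-API up on X11: open the X display, obtain a VADisplay with vaGetDisplay() from va_x11.h, and initialize the library with vaInitialize(); the actual decode/encode flow (vaCreateConfig, vaCreateContext, vaCreateSurfaces, vaBeginPicture/vaRenderPicture/vaEndPicture) is only hinted at in a comment.

#include <stdio.h>
#include <X11/Xlib.h>
#include <va/va.h>
#include <va/va_x11.h>

int main(void)
{
    /* VA-API needs a native display handle; here we use X11. */
    Display *x11_display = XOpenDisplay(NULL);
    if (!x11_display) {
        fprintf(stderr, "cannot open X display\n");
        return 1;
    }

    /* Obtain a VADisplay from the X11 display (entry point from va_x11.h). */
    VADisplay va_display = vaGetDisplay(x11_display);

    /* Initialize libva; the driver (back-end) is loaded at this point. */
    int major = 0, minor = 0;
    VAStatus status = vaInitialize(va_display, &major, &minor);
    if (status != VA_STATUS_SUCCESS) {
        fprintf(stderr, "vaInitialize failed: %s\n", vaErrorStr(status));
        XCloseDisplay(x11_display);
        return 1;
    }
    printf("VA-API %d.%d, vendor: %s\n", major, minor,
           vaQueryVendorString(va_display));

    /* A real decoder/encoder would continue with vaCreateConfig,
     * vaCreateContext, vaCreateSurfaces, then per-frame
     * vaBeginPicture/vaRenderPicture/vaEndPicture calls. */

    vaTerminate(va_display);
    XCloseDisplay(x11_display);
    return 0;
}

Something like "gcc va_init.c -o va_init -lva -lva-x11 -lX11" should build it against libva.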

Download

The latest releases of libva software can be found at: http://www.freedesktop.org/software/vaapi/

Git

libva, an implementation of VA-API for Linux, is now available via git from the following location (http://cgit.freedesktop.org/libva/):

git clone git://anongit.freedesktop.org/git/libva

The gstreamer-vaapi elements are available at: https://gitorious.org/vaapi/gstreamer-vaapi

git clone git://gitorious.org/vaapi/gstreamer-vaapi.git

Specification

The latest VA-API decode/encode specification can be found at http://cgit.freedesktop.org/libva/tree/va/va.h.

The X11 rendering interface can be found at http://cgit.freedesktop.org/libva/tree/va/va_x11.h

Drivers (back-ends) that implement VA-API

  • Broadcom Crystal HD (work-in-progress):
    * <http://gitorious.org/crystalhd-video>
    
  • Intel Embedded Graphics Drivers (IEGD):
    * <http://edc.intel.com/Software/Downloads/IEGD/>
    
  • Intel Embedded Media and Graphics Drivers (EMGD):
    * <http://edc.intel.com/Software/Downloads/EMGD/> 
    
  • Intel GMA500 driver (OEM only):
    * <https://launchpad.net/~ubuntu-mobile/+archive/ppa> 
    
  • Intel integrated G45 graphics chips:
    * <http://cgit.freedesktop.org/vaapi/intel-driver> 
    
  • IMG VXD375/385 and VXE250/285 video engines:
    * <http://cgit.freedesktop.org/vaapi/pvr-driver/> 
    
  • VDPAU back-end for NVIDIA and VIA chipsets:
    * <http://cgit.freedesktop.org/vaapi/vdpau-driver/> 
    
  • VIA / S3 Graphics Accelerated Linux Driver:
    * <http://www.s3graphics.com/en/index.aspx> 
    
  • XvBA / ATI Graphics Backend (for proprietary driver only)
    * <http://cgit.freedesktop.org/vaapi/xvba-driver/> 
    

    Other back-ends are currently under development.

Decoding Hardware with no backend available

  • NONE FOR NOW

Software using VA-API

  • Clutter toolkit (through clutter-gst, thus GStreamer):
    * <http://clutter-project.org/> 
    
  • FFmpeg (upstream SVN tree >= 2010/01/18 / version 0.6.x and onwards):
    * <http://ffmpeg.org/> 
    
  • Fluendo video codec pack for Intel Atom (GStreamer):
    * <http://www.fluendo.com/> 
    
  • Gnash flash player:
    * <http://wiki.gnashdev.org/Hardware_Video_decoding> 
    
  • GStreamer:
    * <http://gitorious.org/vaapi/gstreamer-vaapi> 
    
  • Lightspark flash player:
    * <http://lightspark.sourceforge.net/> 
    
  • MPlayer/VAAPI:
    * <http://gitorious.org/vaapi/mplayer> (`hwaccel-vaapi` branch) 
    
  • MythTV (work-in-progress):
    * <http://www.mythtv.org/wiki/VAAPI> 
    
  • RealPlayer for MID:
    * <https://community.helixcommunity.org/Licenses/realplayer_for_mid_faq.html> 
    
  • Totem movie player (simply requires GStreamer VA-API plug-ins):
    * <http://projects.gnome.org/totem/> 
    
  • VideoLAN – VLC media player:
    * <http://www.videolan.org/> 
    
  • XBMC:
    * <http://www.xbmc.org/> 
    
  • Xine:
    * <https://github.com/huceke/xine-lib-vaapi/tree/vaapi> 
    

libVA sample code

  • Hardware video decoding acceleration demos:
    * <http://gitorious.org/hwdecode-demos/> 
    
  • Decode sample program:
    * <http://cgit.freedesktop.org/libva/tree/test/decode/mpeg2vldemo.c> 
    
  • Encode sample program:
    * <http://cgit.freedesktop.org/libva/tree/test/encode/h264encode.c> 
    
  • Post-processing sample program:
    * <http://cgit.freedesktop.org/libva/tree/test/putsurface/putsurface.c> 
    

Architecture

(Architecture diagram: Linux_vaAPI.gif)

Contact

Jonathan Bian (jonathan.bian@intel.com); Austin Yuan (shengquan.yuan@intel.com)

From: http://www.freedesktop.org/wiki/Software/vaapi/

Integrating Intel® Media SDK with FFmpeg for mux/demuxing and audio encode/decode usages

Download Article and Source Code

Download Integrating Intel® Media SDK with FFmpeg for mux/demuxing and audio encode/decode usages (PDF 568KB)
Download Source Code. (ZIP 98KB) (Note: Licensing terms match Media SDK 2012)

Introduction

The provided samples intend to illustrate how Intel® Media SDK can be used together with the popular FFmpeg suite of components to perform container muxing and demuxing (splitting). The samples also showcase integration of rudimentary FFmpeg audio decode and encode.
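
The real integration code lives in the download above; as a warm-up, here is a small, hedged sketch of the demuxing half on its own, assuming the libavformat API of that era (FFmpeg 0.x/1.x): open a container, locate the first video stream, and read compressed packets that would then be handed to a video decoder (for example an Intel Media SDK decode session) or to FFmpeg's audio decoders.

#include <stdio.h>
#include <libavformat/avformat.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <input file>\n", argv[0]);
        return 1;
    }

    av_register_all();   /* register demuxers/decoders (pre-FFmpeg-4.0 API) */

    /* Open the container and read the stream information. */
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    /* Locate the first video stream. */
    int video_index = -1;
    for (unsigned i = 0; i < fmt->nb_streams; i++) {
        if (fmt->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
            video_index = (int)i;
            break;
        }
    }

    /* Read packets; video packets would go to the video decoder,
     * the rest (audio) to FFmpeg audio decoding. */
    AVPacket pkt;
    int video_packets = 0;
    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.stream_index == video_index)
            video_packets++;
        av_free_packet(&pkt);   /* av_packet_unref() in newer FFmpeg versions */
    }
    printf("read %d video packets\n", video_packets);

    avformat_close_input(&fmt);
    return 0;
}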

Reading book “GOD’S DEBRIS – A Thought Experiment”

Earlier this month, I watched the movie 《中国合伙人》. I don't know the English name of this movie; Chinese Partners, maybe? It doesn't matter. What I want to say here is that the lead character in this movie was really not a great speech maker at all, yet he is now one of the greatest, simply by, or at least starting with, telling people what a loser he was.

And today, while reading the book <GOD'S DEBRIS – A Thought Experiment>, I got the exact same suggestion from it. Frankly, I think I'm not good at expressing myself; even worse, I have trouble expressing my "fluent", "boundless" thoughts. I really need suggestions like this. So, mark it.

Chapter 25 – Relationship

Q: How can I be more trusted?

A: Lie. You should lie about your talents and accomplishments, describing your victories in dismissive terms as if they were the result of luck. And you should exaggerate your flaws.

Q: Why in the world would I want to tell people I was a failure and an idiot? Isn’t it better to be honest?

A: Honesty is like food. Both are necessary, but too much of either creates discomfort. When you downplay your accomplishments, you make people feel better about their own accomplishments. It is dishonest, but it is kind.

Ingredients for successful social living:

  • Express gratitude.
  • Give more than expected.
  • Speak optimistically.
  • Touch people.
  • Remember names.
  • Don’t confuse flexibility with weakness.
  • Don’t judge people by their mistakes; rather, judge them by how they respond to their mistakes.
  • Remember that your physical appearance is for the benefit of others.
  • Attend to your own basic needs first, otherwise you will not be useful to anyone else.

Reading the code of WebRTC: trying to implement a common video module wrapper from WebRTC

My goal here is only to test & research, practice, and get to know more about WebRTC & its code, so here we go.

Wrapping an independent video capture class from WebRTC's video capture module

Why are my VC2010 Resource View and Class View both blank?

I changed the vcxproj file location for one of the projects in the solution,  and re-added this project to the solution.

The project compiled fine and worked.

However, recently I went to the Resource View to edit some dialogs and the window was completely empty. I can't remember whether I had opened it since I changed the location of the project:


After Googling this issue, I found two solutions for it:

Solution 1: Disable Database

It might be because your Browsing/Navigation Database is disabled.

Check the current setting under: Tools -> Options -> Text Editor -> C++ -> Advanced. “Disable Database” should be false.

This is a bug as far as I know, and they said they are working on the problem.

More Info: http://connect.microsoft.com/VisualStudio/feedback/details/535971/solution-resource-view-empty-when-option-disable-database-c-is-set-to-true

It didn't work for me.

Solution 2: Unload & reload all the projects

I had a similar problem that I solved by reloading the projects in the solution (unload all projects and then reload them).

Simple, but it just worked as expected.

A simple guide to start GPU programming

This is a simple guide for getting started with GPU programming using the CUDA SDK; my working environment is WinXP + VS 2010.

If you are looking for a comprehensive guide of GPU programming, you need to visit http://docs.nvidia.com/cuda/index.html.

Here we go.

1. Installing CUDA Development Tools

Key steps:

  • Verify the system has a CUDA-capable GPU.
  • Download the NVIDIA CUDA Toolkit.
  • Install the NVIDIA CUDA Toolkit.
  • Test that the installed software runs correctly and communicates with the hardware.

URL: https://developer.nvidia.com/cuda-downloads

For WinXP(32 bit): http://developer.download.nvidia.com/compute/cuda/5_0/rel-update-1/installers/cuda_5.0.35_winxp_general_32-3.msi

You can choose what to install from the following packages:

  1. CUDA Driver

    The CUDA Driver installation can be done silently or by using a GUI. A silent installation of the driver is done by enabling that feature when choosing what to install.

    • Silent: Only the display driver will be installed.
    • GUI: A window will appear after the CUDA Toolkit installation, if you allowed it at the last dialog, with the full driver installation UI. You can choose which features you wish to install.

    Note: If you want to install the CUDA Driver for new hardware, and have already installed the CUDA Driver before, you can launch the CUDA Driver installer from the Start Menu under:

    NVIDIA Corporation\CUDA Toolkit\v5.0, or

    NVIDIA Corporation\CUDA Toolkit\v5.0 (64 bit)
  2. CUDA Toolkit

    The CUDA Toolkit installation defaults to

    C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v#.#, where

    #.# is the version number (3.2 or higher). This directory contains the following:

    Bin\ : the compiler executables and runtime libraries
    Include\ : the header files needed to compile CUDA programs
    Lib\ : the library files needed to link CUDA programs
    Doc\ : the CUDA C Programming Guide, CUDA C Best Practices Guide, documentation for the CUDA libraries, and other CUDA Toolkit-related documentation

    Note: CUDA Toolkit versions 3.1 and earlier installed into

    C:\CUDA by default, requiring prior CUDA Toolkit versions to be uninstalled before the installation of new versions. Beginning with CUDA Toolkit 3.2, multiple CUDA Toolkit versions can be installed simultaneously.

  3. CUDA Samples

    The CUDA Samples contain source code for many example problems and templates with Microsoft Visual Studio 2008 and 2010 projects.

    For Windows XP, the samples can be found here:

    C:\Documents and Settings\All Users\Application Data\NVIDIA Corporation\CUDA Samples\v5.0

    For Windows Vista, Windows 7, and Windows Server 2008, the samples can be found here:

    C:\ProgramData\NVIDIA Corporation\CUDA Samples\v5.0

2. Compiling CUDA Programs

The project files in the CUDA Samples have been designed to provide simple, one-click builds of the programs that include all source code. To build the 32-bit or 64-bit Windows projects (for release or debug mode), use the provided *.sln solution files for Microsoft Visual Studio 2008 or 2010 (and likewise for the corresponding versions of Microsoft Visual C++ Express Edition). You can use either the solution files located in each of the examples' directories in

CUDA Samples\v5.0\C\<category>\<sample_name>

or the global solution files Samples*.sln located in

CUDA Samples\v5.0\C

CUDA Samples are organized according to <category>. Each sample is placed into one of the following folders: 0_Simple, 1_Utilities, 2_Graphics, 3_Imaging, 4_Finance, 5_Simulations, 6_Advanced, 7_CUDALibraries.
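
If you prefer starting from a blank project rather than the shipped samples, here is a minimal, hedged CUDA C sketch of my own (not one of the NVIDIA samples): a vector-add kernel plus the host-side allocate/copy/launch steps. Save it as a .cu file and build it with nvcc or inside a VS 2010 CUDA project.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread adds one element: the classic "hello world" of CUDA.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers and host-to-device copies.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}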

LINK : fatal error LNK1104: cannot open file ‘atlthunk.lib’

I was building WebRTC on my notebook, whose OS is Win7, when I ran into this issue. I followed the same steps with which I had previously succeeded in downloading and building the complete WebRTC project:

  • Install Visual Studio 2010
  • Install DirectX SDK 2010(June)
  • Install Microsoft SDK v7.1

then download the WebRTC project, including the project files & settings for Visual Studio, and try to build it. You can view the detailed steps here: http://rg4.net/archives/736.html.

"fatal error LNK1104: cannot open file 'atlthunk.lib'"

First, I tried to add this pragma to the top of my header file, but the problem remained.

#pragma comment(linker, "/NODEFAULTLIB:atlthunk.lib")

Then I dug deeper into it.

I found out that atlthunk.lib belongs to ATL 7.1, and ATL 7.1 ships with the WinDDK. So what we actually need to do is download and install the WinDDK. You can download it from this URL:

http://download.microsoft.com/download/4/A/2/4A25C7D5-EFBE-4182-B6A9-AE6850409A78/GRMWDK_EN_7600_1.ISO

After downloading and installing the WinDDK, you still need to manually add this directory to VC's library directories:

C:\WinDDK\7600.16385.1\lib\ATL\amd64

Then re-open webrtc's all.sln and rebuild the solution.

So the closing note will be:

If you are building WebRTC on WinXP, you need to install the DirectX SDK (June 2010) and the Microsoft SDK v7.1. But if you are building it on Win7, you need to install one more SDK: the WinDDK v7.1.

Darwin Streaming Server Relay Setting

Introduction

Streaming relays and reflectors can be used to scale streaming infrastructure by distributing load between servers and making the most efficient use of network bandwidth. A streaming reflector "tunes in" on an incoming stream and reflects it to clients. The most common setup for a reflector is reflecting a live stream (see the section on Live Webcasting). A streaming relay forwards a stream from a source to a destination server. One of the primary advantages of relays is segmentation of network traffic. Clients can tune in on relays that make the most effective use of limited network resources instead of loading a single network segment with streaming traffic.

One of the easiest ways to distribute load between a group of servers is to reflect a multicast stream. This requires a multicast enabled network between the broadcaster/encoder and the streaming servers (see terminology below). The multicast .sdp file from the broadcaster/encoder simply needs to be placed on each streaming server. When clients open an rtsp stream to the multicast .sdp file from a server, they will receive a reflected unicast of the multicast stream. Using this method, a number of servers can reflect live webcasts. Combined with load balancing this technique can be used to distribute load for live streaming across a group of servers (a simple cgi for load balancing is presented in the Using CGI section of this site).

Terminology

Network Bandwidth is the capacity of the network to move data. Bandwidth is measured in bits-per-second. Analog modems over telephone lines have a capacity of up to 56 kilobits per second. Because digital video files are large (read below), video streamed over 56 kilobit modems will be low quality. Higher quality video requires a broadband connection. Broadband generally refers to a network connection that can sustain at least 200 kilobits per second. The table below outlines the maximum bandwidth of dedicated Internet connections:

Connection Bandwidth (bits/second)
T1 or DS-1 1,544,000
T2 or DS-2 6,000,300
T3 or DS-3 44,736,000
OC-3 155,000,000
T4 or DS-4 274,000,000
OC-12 600,000,000
OC-48 2,400,000,000
OC-192 10,000,000,000

Unicast packets on typical IP networks have a single source and destination. Most traffic on IP networks between clients and servers on today’s networks is unicast. Multicast packets have a single source and multiple destinations. Multicast packets can save network bandwidth when multiple clients need to view the same streaming media simultaneously. Instead of sending out individual unicast packets to each client, a single stream of multicast packets can be viewed by multiple clients.

Today's commodity Internet does not support multicast (Internet2 does). The capacity of a streaming server is frequently limited by the bandwidth of the network connection. The table below summarizes the maximum number of 500-kilobit unicast streams supported by the dedicated Internet connections listed above (for example, a T3 at 44.736 megabits per second divided by 500 kilobits per stream gives roughly 89 simultaneous streams):

Unicast Streams Minimum Connection
3 T1
12 T2 or DS-2
89 T3 or DS-3
310 OC-3
548 T4 or DS-4
1,200 OC-12
4,800 OC-48
20,000 OC-192

A single Xserve running QuickTime Streaming could saturate a T4/DS-4 connection. Faster connections would require additional servers set up as reflectors and relays to support more simultaneous Unicast streams.

A streaming relay server requests a stream from a broadcast or video-on-demand server and relays the stream to clients. Streaming relays can be used to relay live multicast streams as unicast streams or vice versa. Relays can also be used to connect clients to streams on a local server, lowering streaming media traffic across IP routers.

Note that content replication and content caching servers can also be used to reduce streaming network traffic. By replicating content files closer to clients, traffic across IP routers can be minimized. Content caching servers intelligently replicate content based on client requests. This can be accomplished without relays or reflectors, which are more appropriate for live streams.

Sample Relay Deployment

The following diagram illustrates a network environment with streaming servers and relays. In this diagram:

  • Two primary servers are placed on the campus backbone (Internal Campus servers). One server is for production, the other is for testing/backup of the primary.
  • Each subnet has its own streaming server (subnet A shown). The primary servers are configured to relay live streams to the subnet servers. Clients on the subnet tune in to the local subnet relay, reducing traffic across the subnet routers.
  • An external streaming server is placed in the network DMZ for publicly accessible content.
  • Each remote campus has a streaming server that replicates video-on-demand content from the primary campus servers and functions as a relay to reduce traffic across the Internet routers/firewalls.

Relaying from Server to Server

Relay Sources:

In the diagram above, the Primary Server will be configured to relay live/playlist streams to servers on each subnet and remote campus. Every relay has a source and a destination. Using Server Admin in Mac OS X Server, the Relay Source can be configured by creating a new Relay or in the web based administrative interface of QuickTime/Darwin Streaming Server:


Server Admin on Mac OS X Server/QuickTime Streaming Server


Web Administration on QuickTime/Darwin Streaming Server

The source for the relay can be:

  1. Request Incoming Stream:

    The stream will be requested from the source. If the source is a .sdp file on the local machine (127.0.0.1), no username and password needs to be specified. If the source is another streaming server, you must specify the administrative username and password of that server.

  2. Unannounced UDP:

    The stream will be received on a specific IP address and port. The IP address of the live encoder and port numbers must be entered. Note: there is no GUI in the web administrative interface for this source type.

  3. Announced UDP:

    The server will start relaying when a new stream is announced on the source IP address. This source requires an encoder that supports “Automatic Unicast.”

It is very common for the local server (127.0.0.1) to be the source of any of the relay source types above.

Relay Destinations:

Each relay can have one or more destinations. Incoming streams from the source will be sent to the destination. Each source can have one or more destinations. Using Server Admin in Mac OS X Server, the Destinations can be configured by adding them in the Destinations pane of a relay or in the web based administrative interface of QuickTime/Darwin Streaming Server:


Server Admin on Mac OS X Server/QuickTime Streaming Server


Web Administration on QuickTime/Darwin Streaming Server

The destination(s) for the relay can be:

  1. Announced UDP:

    The RTSP announce protocol will be used to automatically generate the .sdp file on the destination. This is a convenient way to automatically propagate .sdp files on the destination. Note: If your relay source and destination have firewall restrictions or network congestion, this technique is not recommended – this technique requires an active session between source and destination.

  2. Unannounced UDP:

    Packets will be sent to the specified IP address and port number. This method requires manual generation of the .sdp file on the destination. This technique does not require a session between the relays, and is a better choice when firewalls or network congestion may be factors. The source .sdp file can be used as a starting point for the manually generated .sdp file on the destination. The following lines must be edited in the .sdp file (a sample edited file is sketched after this list):

    • The line that begins with “c=IN IP4” should be edited with the destination server’s IP address.
    • The first line that begins with “m=” should use the port number specified in the destination relay.
    • The second line that begins with “m=” should use the port number + 2 specified in the destination relay.
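
For illustration only, here is a hedged sketch of what the manually edited .sdp on the destination might look like, assuming a made-up destination address of 192.168.10.20, a relay base port of 9000, and typical AAC/H.264 payload lines; your own broadcaster's file will differ:

v=0
o=- 0 0 IN IP4 192.168.10.20
s=Relayed live stream
c=IN IP4 192.168.10.20
t=0 0
m=audio 9000 RTP/AVP 96
a=rtpmap:96 mpeg4-generic/32000/2
m=video 9002 RTP/AVP 97
a=rtpmap:97 H264/90000

The c= line carries the destination server's address, the first m= line (audio) carries the base port, and the second m= line (video) carries the base port + 2.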

Configuring a Unicast to Multicast Relay

It is possible to relay an incoming stream as a multicast stream. The following procedure outlines the steps required for this configuration:

  1. Set up a multicast relay on the streaming server. This relay will be configured with the IP address of your Broadcaster/Encoder as the “Source Hostname or IP Address:”, the <broadcastMountPoint> specified in step 2 below as the “Mount Point:”. Check “Wait for announced stream(s)” if you are using Automatic Unicast on the Broadcaster/Encoder. The “Hostname or IP Address:” for the destination must be a valid multicast address. Select “Relay via UDP” and set the base port to an even number (something in the 9000-9996 range works well). The multicast TTL is the number of router hops the multicast will work through. Set this for the topology of the network you are working on. The server must also be configured to accept an announced broadcast. This can now be done from QTSS Web Admin General Settings by clicking on “Change Movie Broadcast Password…”
    Note: If you are not using a Broadcaster/Encoder that supports RTSP Announce, you can set the source to 127.0.0.1, select “Request Incoming Stream” and set the mount point to the path to the Unicast sdp file. The unicast sdp file must be manually copied to the server from your encoding software or device. No username and password is necessary if you use the loopback address. The mount point is relative to whatever you specify as your Movies directory.
  2. Set up QT Broadcaster to send an “Automatic Unicast (Announce)” to the server. The “File:” you specify in broadcaster should include the “.sdp” extension – for example “webcast.sdp”. After you start the broadcast, test accessing it from the server by accessing the url rtsp://<serverIP>/<broadcastMountPoint> from QT Player. You will be tuning in on a Unicast relay of the stream.
    Note: If you are using a software/hardware encoder that does not support RTSP Announce, you must generate the Unicast sdp file and copy it to the Movies directory on the server.
  3. To tune in on the multicast relay, you have to make a copy of, and edit, the .sdp file that was created by the announced broadcast and/or copied manually as outlined above. After you complete step 2, you should see a file with the name you specified in your streaming server's Movies directory. Copy this file (see step 4 below for options on where to place the copy) and edit the copy. Look for the line beginning with "c=IN IP4" (at the top of the file) and change the IP address to the multicast IP address specified in step 1 above. Next, look for the first line beginning with "m=" (usually m=audio) and change the 0 to the base port you specified in step 1 (i.e. 9000). Then look for the next line beginning with "m=" (usually m=video) and change the 0 to the base port + 2 (i.e. 9002).
  4. The multicast sdp file can be accessed via ftp, http, from a file server, or e-mailed to clients. It cannot be accessed directly from the QTSS process. Save the file where it can be accessed via http or ftp, from a file server, or e-mail it to clients. I usually put it on a web server (see note below on mime types). If you place the file on a http or ftp server, you can access the multicast from QuickTime Player by using the url:

    http://<webServerIP>/<pathAndFileNameOfsdpFile>

    or

    ftp://<ftpServerIP>/<pathAndFileNameOfsdpFile>

    If you put the multicast sdp file on a web server, or e-mail it to clients, they can just open up the sdp file with QuickTime Player. Note that sdp files are often associated with Real Player – you might have to drag and drop the file or use File-Open from QT Player.
    Alternatively you can open up the sdp file or URL with QT Player Pro, save the file as a self-contained .mov file, send this to clients, embed in a web page, etc.

Closing Notes:

If you keep the name of the broadcast sdp file the same, you can stop and start new broadcasts using the same name. However, you will have to do steps 3 and 4 above for each new broadcast if you change any broadcast parameters that are reflected in the .sdp file. The steps outlined in steps 3 and 4 could be automated by a cgi or other script.

If you place the multicast sdp file on a web server, the mime type needs to be configured properly on the web server:

mime type extension
application/sdp sdp

May 2013: Biggest exam in a decade

Post-graduate

Last weekend, I took my biggest exam in a decade: the postgraduate entrance exam of USTC.

To make sure I could pass the exam, I attended a training course earlier this month and kept myself busy preparing for weeks.

Everything went great, for both phases of the exam (the first was purely a written exam, covering English, mathematics, and a specialty course; then came an interview held by professors from USTC).

And now I believe I can pass; I only need to wait for the result, which will be revealed in July.

Job

I am, personally, a modest man, and a man who values honesty as a great virtue.

But it seems that I act too modest at certain times. I just joined a new company last month, and I am now surrounded by lots of unfamiliar faces. My title at my previous employer, UniSVR, was R&D manager for mainland China, in charge of the R&D teams in both the Shanghai & Beijing offices, but that was once upon a time. Now I'm nobody but an ordinary SDE, and people here don't know me yet. Once, I was given a chance to introduce myself. I said nothing about the titles and track record I had achieved, because I didn't think they would be of any help. Now I'm starting to think that was a big mistake, just like another big mistake I made earlier in a conference interview through WebEx.

In that interview, I must say, for the record, that I completely met the requirements of that position.

However, I failed to pass the interview. After reviewing the interview myself, I found two key points that led to the failure: video codecs & streaming QoS, both of which I am experienced in and capable of. But when I was asked about these skills, I was too modest; I only said that I had not gotten myself involved in them, because:

  • Video codec

In fact, earlier, when I was working on the WinCE/Windows Mobile client for the 3GVAU and UMA projects, I spent over a year researching x264 (encoder) and ffmpeg (decoder) (not full time, because there were other projects to work on), but only for porting and bit rate control purposes. I'm not claiming to be an expert in H.264 codec algorithms, but at least I knew and was familiar with them, although that was 5 or 6 years ago. When I was asked about video codecs, I said nothing about these experiences; instead, I said I had not worked on the codec algorithms in detail, and that what I'm good at is how to use the codecs and how to parse some header information. For example, when streaming video, we need to parse H.264 SPS/PPS and MPEG-4 VOL/VOP info.

And I ended my answer with a saying: there are things you haven't gotten yourself involved in, but that doesn't mean you are not capable of them. What's more, I told them that I could do the video codec algorithm work if necessary.

  • Streaming QoS

I once researched this topic for months to improve the streaming quality of the UniArgus series of products, covering, but not limited to, standard protocols like RTCP and UniArgus's private protocol. However, these implementations did not work as well as we wished when the network bandwidth was really poor, especially when viewing live video on a smartphone over a 3G connection.

Besides, the UniArgus series of products is designed to stream a limited number of source streams to hundreds or thousands of different clients, including clients connecting via LAN, WiFi, WAN, or 3G. That means it is impossible for the server to encode/transcode too many different streams with different profiles. For example, take one stream on the server and two clients: one client is on the LAN, the other is connecting through 3G (which is poor). If the server can only produce one stream profile, then we cannot take both clients' QoS requirements, such as frame rate and bitrate adjustment, into consideration. All we can do is, when one client encounters packet loss, re-send the lost packets or skip some useless video packets (like dropping all the remaining B/P frames until the next I frame).

However, I never mentioned any of this, maybe because I was nervous, or because I just didn't want to talk about things whose outcome I wasn't 100% confident in, if I were asked to do them the next minute…

A conclusion for this:

I failed that interview not because I'm not qualified, but only because I did not show them what I am capable of in that short conversation, or because what I expressed and showed did not fully represent my experience or background.

My one biggest problem is just that. I'm a modest man, and I never like to talk big. Years of experience in project schedule planning have made me even more cautious when talking about specific things. And for the past ten-plus years, I was always the one sitting on the interviewer's side of the table; I had never been interviewed, so I do lack skill in presenting myself.

Now, here I am, at Kedacom. I hope I can do better from now on.

Family

My father resigned from his job at the beginning of 2013. And earlier this month, on May 10, he left Shanghai for his new start in Hangzhou. I wish him all the best in his new position.

Catherine's kindergarten education is about to be over, and she received the letter of admission from the 1st junior school of DAHUA, thanks to Lucy's efforts through her boss.

My little uncle went to Hangzhou to have a surgical operation on his leg. It was a really big operation, and after it was done the pain was almost unbearable, so Mom went to Hangzhou to look after her little brother for two days. Now he has already been discharged and is back in Dongyang for recuperation. I wish him good health, and hope his leg will not be lame any more.