iPhone HTTP Streaming with FFMpeg and an Open Source Segmenter
Media_Dev 2010. 3. 10. 00:18

    http://www.ioncannon.net/programming/452/iphone-http-streaming-with-ffmpeg-and-an-open-source-segmenter/comment-page-1/

    With the release of the iPhone OS 3 update came the ability to do live streaming. There are a few types of streaming, and each requires a certain encoding and segmentation. I've put together a cheat sheet on how I went about building a static stream using FFMpeg and an example segmenter that someone has posted. I'm not covering windowed streams in this post, but if you are thinking about implementing one, the following will help you take a step in that direction.

    Before getting started it is best to read over the Apple documentation on HTTP live streaming. Start out with the iPhone streaming media overview. This document covers the basics of how the streaming works and has some nice diagrams.

    If you want even more information after reading the overview you can take a look at the HTTP Live streaming draft proposal that was submitted to the IETF by Apple. It covers the streaming protocol in complete detail and has examples of the stream file format for reference.

    Once you are ready to start, grab a decent quality video clip to use. If you don't have one handy, I found a nice list of downloadable HD clips in various formats for testing.

     

    Step 1: Grab the latest version of FFMpeg

    You may be able to get away with anything after FFMpeg 0.5 but you might as well pull down a more recent version. The FFMpeg download page has instructions on getting the latest version. I pulled the version I used out of git.

    I used the following command to configure FFMpeg:

    ./configure --enable-gpl --enable-nonfree --enable-pthreads --enable-libfaac --enable-libfaad --enable-libmp3lame --enable-libx264

    One of the main things to note is the --enable-libx264 flag.
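
    If you are starting from scratch, the whole fetch-and-build sequence looks roughly like the sketch below. The repository URL and the install step are assumptions based on a standard FFMpeg source build, so check the download page for the address and options that match your setup:

    git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg
    cd ffmpeg
    ./configure --enable-gpl --enable-nonfree --enable-pthreads --enable-libfaac --enable-libfaad --enable-libmp3lame --enable-libx264
    make
    sudo make install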

    Step 2: Encode your video for the iPhone

    Once you have a working version of FFMpeg it is time to create an X264 encoded stream that will work with the iPhone. There are a few things to note before diving in:

    1. The supported bitrates for streaming are: 100 Kbps to 1.6 Mbps
    2. The suggested bitrates for streaming are*:
      • Low – 96 Kbps video, 64 Kbps audio
      • Medium – 256 Kbps video, 64 Kbps audio
      • High – 800 Kbps video, 64 Kbps audio
    3. The iPhone screen size is: 480×320

    * See step 7 for more information on what I think are better bitrates.

    Taking all that into account, someone on the iPhone developer forums suggested the following, and it works well for me:

    ffmpeg -i <in file> -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s 320x240 -vcodec libx264 -b 96k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 96k -bufsize 96k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 320:240 -g 30 -async 2 <output file>

    If you want more detail on any of these options, check out the X264 encoding guide and, more generally, the FFMpeg documentation to see what all the flags mean.

    Note that I have the bitrate set to 96k in the above example. That can be changed to fit your needs. Use the script that I have created later in the post or just make sure you change the -b, -maxrate, and -bufsize values.
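
    For example, to target the medium quality setting from the bitrate table above, you would run the same command with those three rate flags raised to 256k (the output file name here is just an example):

    ffmpeg -i <in file> -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s 320x240 -vcodec libx264 -b 256k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 256k -bufsize 256k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 320:240 -g 30 -async 2 sample_medium.ts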

    Step 3: Download and build the segmenter

    Now you have a complete video but you don't want to toss the entire thing up or you wouldn't be reading about HTTP streaming. What you need is a way to segment the video stream into smaller chunks. You can download Apple's segmenter (see the overview above for more information on where to find it) or you can download one created by the forum user corp186.

    There is an SVN repository set up for the segmenter source. It is only a couple of files and is easy to build. The trouble you may run into is that the Makefile it comes with won't link the binary correctly. Don't worry, it just takes some extra link flags to make it work. The following is what I needed in the Makefile to get it to build on my system:

    all:
            gcc -Wall -g segmenter.c -o segmenter -lavformat -lavcodec -lavutil -lbz2 -lm -lz -lfaac -lmp3lame -lx264 -lfaad

    clean:
            rm segmenter

    After compiling the segmenter you are ready to create your first HTTP streaming content.

    The format of the segmenter command is:

    segmenter <input MPEG-TS file> <segment duration in seconds> <output MPEG-TS file prefix> <output m3u8 index file> <http prefix>

    Following is an example used to create a stream from a video file created with the above FFMpeg command split into 10 second intervals:

    segmenter sample_low.ts 10 sample_low stream_low.m3u8 http://www.ioncannon.net/
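
    When the segmenter finishes you should have a series of numbered .ts segment files next to the index file. The index it writes is an ordinary M3U8 playlist; a minimal one looks roughly like the sketch below (the exact segment names depend on the prefix you pass in, so treat these as illustrative):

    #EXTM3U
    #EXT-X-TARGETDURATION:10
    #EXTINF:10,
    http://www.ioncannon.net/sample_low-1.ts
    #EXTINF:10,
    http://www.ioncannon.net/sample_low-2.ts
    #EXT-X-ENDLIST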

    Step 4: Prepare the HTTP server

    At this point you should have a set of files that represent the stream and a stream definition file. Those files can be uploaded to a web server now, but there is another important step that ensures they will be downloaded correctly, and that is setting up mime types. There are two mime types that matter for the streaming content:

    .m3u8 application/x-mpegURL
    .ts video/MP2T

    If you are using Apache you would want to add the following to your httpd.conf file:

    AddType application/x-mpegURL .m3u8
    AddType video/MP2T .ts

    If you are using lighttpd you would want to put this in your configuration file (if you already have other mime types defined, make sure you append these to the existing list rather than replacing it):

    mimetype.assign = ( ".m3u8" => "application/x-mpegURL", ".ts" => "video/MP2T" )
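
    If your lighttpd configuration already assigns other mime types, appending with += keeps the existing entries intact, along these lines:

    mimetype.assign += ( ".m3u8" => "application/x-mpegURL", ".ts" => "video/MP2T" )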

    Step 5: Test the stream

    The video is encoded for the iPhone, segmented for streaming, and the server is configured. The only thing left to do is test the stream, and the fastest way to do that is with the new HTML5 video tag. Here is an example of how to set it up:

    <html>
      <head>
        <title>Video Test</title>
        <meta name="viewport" content="width=320; initial-scale=1.0; maximum-scale=1.0; user-scalable=0;"/>
      </head>
      <body style="background-color:#FFFFFF; ">
        <center>
          <video width='150' height='150' src="stream-128k.m3u8" />
        </center>
      </body>
    </html>

    If everything has been done correctly you should see the video.

    If you want to test the stream out in an application then download the MoviePlayer iPhone demo application from the iPhone developer site. Build and run it in the simulator or put it on an actual phone and then type the URL in for the server you uploaded your stream to.
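
    You can also sanity-check the server from the command line before reaching for a device. The host and file names below are placeholders for wherever you uploaded your stream; the Content-Type header in each response should match the mime types from step 4:

    curl -I http://www.example.com/stream_low.m3u8
    curl -I http://www.example.com/sample_low-1.ts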

    That is all there is to building a single static HTTP stream. It is a good number of steps, but if you have some experience using FFMpeg it isn't too hard to set up. The only pitfalls I ran into revolve around trying to segment the stream without the segmenter code. I don't know enough about how the segmentation works to say why this is so difficult, but I believe it could have something to do with synchronization points in the stream. Of course, when you stray from the path the stream just doesn't work and you get a generic error message, so that is just my best guess. I'll also guess that Apple may tighten up the player over time and make it work better with misformatted streams.

    Step 6: Automating the stream encoding and segmentation

    Here is a little script I put together that first encodes an input file and then segments it into 10 second chunks:

    #!/bin/sh

    # Encode the input file for the iPhone at the bitrate below, then cut it
    # into 10 second segments. The output names and HTTP prefix simply follow
    # the earlier examples; adjust them for your own server.

    BR=800k

    ffmpeg -i $1 -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s 320x240 -vcodec libx264 -b $BR -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate $BR -bufsize $BR -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 320:240 -g 30 -async 2 sample_$BR.ts

    segmenter sample_$BR.ts 10 sample_$BR stream_$BR.m3u8 http://www.ioncannon.net/
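
    Assuming the script is saved as something like encode_and_segment.sh (the name is just an example) and made executable, running it against a source clip is a one-liner:

    ./encode_and_segment.sh sample.mov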
