If you need to stream your audio/video content over the internet, you'll usually need a streaming (broadcasting) server, one of which is ffserver. It is able to collect multiple input sources (feeds) and transcode/remux/broadcast each of them using multiple outputs (streams). To better describe its possibilities, consider the following image:
There are several elements shown on the image. Let's name them all first:
- Input sources (I)
- Feeds (F)
- Streams (S)
- Media players (P)
Input sources and media players are not part of ffserver's internal structure; they are external applications. Input sources (usually ffmpeg) send audio/video streams to ffserver, which distributes (broadcasts) them to all the viewers (media players). Since ffmpeg is by far the most common input source, it is the one described in this document.
Input sources connect to ffserver and attach to one or more feeds, as long as those feeds are not already in use by another input source at that moment. In the image you can see that input source I1 is connected to only one feed, F1, while input source I2 is connected to two feeds (F2 and F3) and sends streams to both of them.
A feed element is an internal part of ffserver whose purpose is to connect one input source with one or more output streams. Connecting a feed to several output streams is useful when you want to broadcast one input source (for example, your webcam with audio) in several different output formats at the same time (for example, a full HD video and a small-size preview video for mobile phones). In short, each feed element logically represents one of your input sources. You can think of it as an "input jack" of ffserver, into which you plug your audio/video sources.
A stream element is an internal part of ffserver and represents a connection point for all the viewers who wish to receive a specific stream. For example, to broadcast one full HD video and a small-size preview video for mobile phones, you would create two stream elements with different frame sizes and possibly different encodings and/or output formats. Each stream element can handle multiple connected clients, just like one web server can handle multiple web clients. You can think of it as an "output jack" of ffserver, to which your viewers connect to watch your audio/video stream. The key difference between a feed element and a stream element is that a single stream element can serve many viewers at once, while a single feed element is always connected to exactly one input source.
Media player elements are not internal parts of ffserver. They represent your viewers from the "outside world" who connect to the stream elements to watch your multimedia content. Popular media players include ffplay, VLC and Windows Media Player.
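These elements map directly onto the ffserver configuration file: each feed becomes a <Feed> section and each stream a <Stream> section. A minimal sketch of how one input jack and one output jack could be declared (the names feed1.ffm, test.flv and the values shown are only illustrative examples, not defaults):

```
HTTPPort 8090                   # port that both input sources and viewers connect to

<Feed feed1.ffm>                # one "input jack" (F1 in the image)
    File /tmp/feed1.ffm         # buffer file for the incoming stream
    FileMaxSize 200K
</Feed>

<Stream test.flv>               # one "output jack" (S1), fed by F1
    Feed feed1.ffm
    Format flv
</Stream>
```

A second <Stream> section referencing the same Feed would give you the one-feed, many-streams layout described above.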
To successfully start ffserver, you'll first need a valid configuration file. Once you have created one, you can start ffserver simply by running the following command:
ffserver -f /etc/ffserver.conf
Depending on your configuration file, ffserver will either start or refuse to :) More often than not it will not start until you have debugged all the issues that usually creep in, including syntax errors, so you'll most probably want to run ffserver in debug mode with the "-d" option, like this:
ffserver -d -f /etc/ffserver.conf
You can always get a full list of options with:
ffserver -h
When you finally have a valid configuration file, you'll want to run ffserver in the background (as a daemon). You can do this either with a trailing ampersand in the shell command or, more conveniently, by commenting out the "NoDaemon" directive in your config file (which works on Windows too).
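For example, the relevant part of the config would then look like this (a sketch; whether daemon mode is available at all depends on your ffserver version):

```
HTTPPort 8090        # global directives come first
# NoDaemon           # commented out, so ffserver detaches into the background
```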
Connecting your input sources
Once your ffserver is up and running, it's time to connect input sources to it. Without input sources, your ffserver is not going to broadcast anything to the outside world and will be perfectly useless. So, let's see how we can connect a couple of input sources to ffserver.
The simplest way is to use ffmpeg tool. Let's assume that we want to stream our webcam video together with audio to our friends. We will simply run an ffmpeg command line that will capture our webcam video and audio input and forward it to ffserver. The command will look something like this:
ffmpeg \
    -f v4l2 -s 320x240 -r 25 -i /dev/video0 \
    -f alsa -ac 1 -i hw:0 \
    http://localhost:8090/feed1.ffm
This is the same thing as this:
ffmpeg -f v4l2 -s 320x240 -r 25 -i /dev/video0 -f alsa -ac 1 -i hw:0 http://localhost:8090/feed1.ffm
but the line breaks make it easier to see and understand each part of the command line.
The first part, "-f v4l2 -s 320x240 -r 25 -i /dev/video0", captures the webcam input. For more info, read How to capture a webcam input. The second part, "-f alsa -ac 1 -i hw:0", captures the audio input; the exact device depends on your audio configuration. For more info, read Capturing audio with FFmpeg and ALSA. The last, but not least important, part, "http://localhost:8090/feed1.ffm", tells ffmpeg to connect to ffserver and send it the audio+video streams for broadcasting. Make sure your feed name ends with ".ffm"; if it doesn't, prepend "-f ffm" to your URL to specify the output format manually (because ffmpeg won't be able to figure it out automatically any more), like this: "-f ffm http://localhost:8090/blah.bleh".
As soon as you type that command, you should see ffmpeg displaying some statistics about your input streams and counting output frames, which is a pretty good sign that everything works (so far).
Viewing your streams
If you've completed all the steps so far without errors, you're now ready to view your streams. The simplest way is to use ffplay to connect to ffserver and request a specific stream. Assuming your configuration defines a stream named test.flv, the command would look like this:
ffplay http://localhost:8090/test.flv
Your stream should appear relatively shortly, in a matter of seconds (depending on the encoding used and the caching enforced).
Creating the configuration file
It would be very wise to start by reading ffserver's sample configuration file. It is self-documented with a lot of comments and is a good starting point for beginners, since it also contains various examples; it would be a waste of time and space to repeat all of it here. ffserver's documentation page might help too. In general, the configuration file consists of global directives, a list of feed elements, a list of stream elements, and a special status stream element, which provides a way for you to view the status of all your running streams.
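The status stream is an ordinary <Stream> section whose Format is "status". A sketch, following the conventions of the sample config (the name stat.html and the addresses are only examples):

```
<Stream stat.html>                        # special status stream
    Format status
    ACL allow localhost                   # restrict who may view the status page
    ACL allow 192.168.0.0 192.168.255.255
</Stream>
```

With this in place, opening http://localhost:8090/stat.html in a browser shows the state of all feeds and streams.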
TODO: add some popular streaming examples, like:
- h264+aac in flv/ts
- theora/vorbis in ogg
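As a starting point, here are hedged sketches of the two setups (codec availability depends on how your ffmpeg was built, and all names and values are illustrative only):

```
<Stream test.flv>            # h264+aac in flv
    Feed feed1.ffm
    Format flv
    VideoCodec libx264
    VideoFrameRate 25
    VideoSize 640x360
    VideoBitRate 512         # kbit/s
    AudioCodec aac
    AudioBitRate 96          # kbit/s
    AudioChannels 2
    AudioSampleRate 44100
</Stream>

<Stream test.ogg>            # theora/vorbis in ogg
    Feed feed1.ffm
    Format ogg
    VideoCodec libtheora
    AudioCodec libvorbis
</Stream>
```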