Introduction
Intel® GPUs contain fixed-function hardware to accelerate video encode, decode, and frame processing, which can be used through a variety of interfaces. Media SDK and Media Server Studio provide great performance through an API designed to expose the full hardware capabilities while remaining portable between OSes. However, there is a significant limitation: the Media SDK API only processes video elementary streams. FFmpeg is one of the most popular media frameworks. It is open source and easily extensible, and consequently offers a very wide range of functionality beyond codecs: muxing and demuxing (splitting), audio, network streaming, and more. It is straightforward to extend FFmpeg with wrappers for Intel® hardware acceleration. Various forms of these wrappers have existed for many years, and they provide important ease-of-use benefits compared to writing encode/decode code directly against the Media SDK API. The tradeoff for this ease of use, however, is that some performance is left on the table. To get the best of both worlds – full performance and access to the full range of capabilities in FFmpeg – a hybrid approach is recommended.
Intel® provides several ways for you to use hardware acceleration in FFmpeg.
- FFmpeg wrappers for lower level APIs "underneath" Media SDK in the stack: libva (Linux) and DXVA (Windows)
- FFmpeg supports the default Media SDK plugin; this article describes the transcoding performance of the plugin, and the detailed installation and validation guide is here.
- The Intel® FFmpeg plug-in project is a fork of FFmpeg which attempts to explore additional options to improve performance for Intel hardware within the FFmpeg framework.
- A 2012 article by Petter Larsson began exploring how to use the FFmpeg libav* APIs and Media SDK APIs together in the same application.
This article provides important updates to the 2012 article. It describes the process of using the FFmpeg libraries on Ubuntu 16.04. The example code is based on our tutorial code, so the reader can see more clearly how the FFmpeg API is integrated into the media pipeline. The example code also replaces the deprecated FFmpeg APIs so that it stays in sync with the latest FFmpeg releases.
Build FFmpeg libraries and run the tutorial code
Requirements
- Hardware: An Intel® platform with Intel® Quick Sync Video capability. The latest hardware generation is recommended because it has the best support. For Linux, a computer with a 5th or 6th generation Intel® Core™ processor; for Windows®, a 5th generation processor or later.
- Linux OS: The sample code was tested on Ubuntu 16.04.3 LTS, but other Linux distributions such as CentOS should also work.
- Intel® Media Server Studio (MSS): For your hardware, go to the MSS documentation page, check the release notes, and identify the right MSS version. For the latest release, click the Linux link under "Essential/Community Edition"; for previous releases, click the link under "Historical release notes and blogs".
- FFmpeg: Preferably the latest release from the FFmpeg website; this article uses version 3.4.
- Video file: Any MP4 container with H.264 video content. For testing purposes, we use BigBuckBunny_320x180.mp4.
Project File Structure
The project to run the tutorial has the following file structure:
| Folder | Content | Notes |
|---|---|---|
| simple_decode_ffmpeg | src/simple_decode_ffmpeg.cpp, Makefile | simple_decode_ffmpeg.cpp is the Media SDK application that creates a simple decode pipeline and calls the functions defined in ffmpeg_utils.h to hook up the demux APIs of the FFmpeg library. |
| simple_encode_ffmpeg | src/simple_encode_ffmpeg.cpp, Makefile | simple_encode_ffmpeg.cpp is the Media SDK application that creates a simple encode pipeline and calls the FFmpeg adapter functions defined in ffmpeg_utils.h to hook up the mux APIs of the FFmpeg library. |
| common | ffmpeg_utils.h, ffmpeg_utils.cpp | These files define and implement the API to initialize, execute, and close the mux and demux functions of the FFmpeg library. |
| $(HOME)/ffmpeg_build | include, lib | The built FFmpeg libraries; the libraries involved are libavformat.so, libavcodec.so, and libavutil.so. |
How to build and execute the workload
- Download the Media Server Studio and validate the successful installation
- Based on the hardware platform, identify the right Media Server Studio version.
- Go to Media Server Studio landing page to download the release package.
- Follow these instructions to install Media Server Studio on Ubuntu 16.04; follow these instructions if you are installing on CentOS 7.3 (the instructions can also be found in the release package).
- Follow the instructions above to validate the installation before moving to the next step.
- Download the FFmpeg source code package and build the libraries.
- Follow the generic compilation guide of FFmpeg; in the guide, select Linux and the distribution you are working on, for example, the Ubuntu build instructions. This project requires the shared FFmpeg libraries; refer to the following instructions to build them.
- After building the required FFmpeg modules, several arguments must be appended to the general instructions to build the shared libraries. When configuring the final build, append "--enable-shared --enable-pic --extra-cflags=-fPIC" to the original "./configure ..." command, for example:
PATH="$HOME/bin:$PATH" PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
  --prefix="$HOME/ffmpeg_build" \
  --pkg-config-flags="--static" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" \
  --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
  --extra-libs=-lpthread \
  --bindir="$HOME/bin" \
  --enable-gpl \
  --enable-libass \
  --enable-libfdk-aac \
  --enable-libfreetype \
  --enable-libmp3lame \
  --enable-libopus \
  --enable-libtheora \
  --enable-libvorbis \
  --enable-libvpx \
  --enable-libx264 \
  --enable-libx265 \
  --enable-nonfree \
  --enable-shared \
  --enable-pic \
  --extra-cflags=-fPIC
- Note: the general instructions download the latest (snapshot) package with the command "wget http://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2". Because the snapshot is not an official release, it may have build/configure problems; if the build fails, download a release package instead. This tutorial uses version 3.4.
- Set LD_LIBRARY_PATH to include $HOME/ffmpeg_build/lib. It is recommended to set it as a system environment variable, for example, by adding it to /etc/environment. Alternatively, use the following command to set it temporarily in the current shell:
# export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$HOME/ffmpeg_build/lib"
- Download the sample code attached to this article and uncompress it in a local directory.
- Build the source code
- Check the Makefile in the directories "simple_decode_ffmpeg" and "simple_encode_ffmpeg". Note the line "FFMPEG_BUILD=$(HOME)/ffmpeg_build"; the directory "$(HOME)/ffmpeg_build" is the default build directory if you followed the general FFmpeg compilation instructions. If you built the libraries in a different directory, change the $(FFMPEG_BUILD) variable to that directory.
- At the root directory of the project, run "make"; the binaries will be built in the "~/_build" directory.
- The user can disable the audio code and test video only as follows:
  - Remove "-DDECODE_AUDIO" from the Makefile in the simple_decode_ffmpeg project.
  - Remove "-DENCODE_AUDIO" from the Makefile in the simple_encode_ffmpeg project.
- The user can also turn off the debug build by changing the definition of "CFLAG" in the Makefile.
- Run the binary with the video workload
- Download the BigBuckBunny320x180.mp4 and save it locally.
- Decode the video file with the following command:
# _build/simple_decode_ffmpeg ~/Downloads/BigBuckBunny_320x180.mp4 out.yuv
The command generates two output files: out.yuv, the raw video stream, and audio.dat, the raw 32-bit PCM audio stream.
- Encode the result of the decoding with the following command:
# _build/simple_encode_ffmpeg -g 320x180 -b 20000 -f 24/1 out.yuv out.mp4
The command reads the raw audio from the file named "audio.dat" by default.
Known Issue
- When running the sample to validate the MSS installation, a failure occurs if the patched kernel was not applied to the platform. Run the following command to check (taking patched kernel 4.4 as an example):
# uname -r
4.4.0
In the installation instructions, the 4.4 kernel was patched; this provides the driver updates needed to access the media fixed-function hardware. If the command does not show the expected kernel version, the user has to switch kernels at boot time in the GRUB option menu; to show the GRUB menu, refer to this page.
- The following table shows all the video clips that were tested successfully; for other codecs and containers, feel free to extend the current code.
Tested samples vs. containers with codecs:

| Container | sample_decode_ffmpeg | sample_encode_ffmpeg |
|---|---|---|
| .mp4 | h.264/hevc/MPEG2, aac | h.264, aac |
| .mkv | h.264/hevc/MPEG2, ac3 | h.264, ac3 |
| .ts | h264/hevc, ac3 | MPEG2, aac |
| .mpg, .mpeg | MPEG2, ac3 | MPEG2, aac |
- The audio codecs come from FFmpeg's libraries. Among them, only AAC is well tested; Vorbis and AC3 have encoding errors, so the default audio for the ".mkv", ".mpeg", ".mpg", and ".ts" containers is forced to a different audio codec.
- To validate the successful installation of Media Server Studio, download the Media SDK samples from this page after installation and run the following command:
./sample_multi_transcode -i::h264 test_stream.264 -o::h264 out.264
Multi Transcoding Sample Version 8.0.24.698
libva info: VA-API version 0.99.0
libva info: va_getDriverName() returns 0
libva info: User requested driver 'iHD'
libva info: Trying to open /opt/intel/mediasdk/lib64/iHD_drv_video.so
libva info: Found init function __vaDriverInit_0_32
libva info: va_openDriver() returns 0
Pipeline surfaces number (DecPool): 20
MFX HARDWARE Session 0 API ver 1.23 parameters:
Input video: AVC
Output video: AVC
Session 0 was NOT joined with other sessions
Transcoding started
..
Transcoding finished
Common transcoding time is 0.094794 sec
-------------------------------------------------------------------------------
*** session 0 PASSED (MFX_ERR_NONE) 0.094654 sec, 101 frames
-i::h264 test_stream.264 -o::h264 out.264
-------------------------------------------------------------------------------
The test PASSED
The design of the mux/demux functions with the FFmpeg library APIs
The sample code is based on our original tutorial code, simple_decode and simple_encode. The calls to the FFmpeg integration were added to the original source code; the modified areas are wrapped by the following comment lines:
// =========== ffmpeg splitter integration ============
......
// =========== ffmpeg splitter integration end ============
Demux functions
The structure demuxControl keeps the control parameters of the demux process. The function openDemuxControl() initializes and configures the demuxControl structure, which is then used for the demux and decode process. During decoding, the function ffmpegReadFrame() reads a demuxed video frame; finally, closeDemuxControl() releases the system resources.
In the code, "DECODE_AUDIO" turns on audio decoding: it demuxes the audio stream and uses the FFmpeg audio decoder to decompress it into the raw audio file "audio.dat".
Mux functions
The structure muxControl keeps the control parameters of the mux process. The function openMuxControl() initializes and configures the muxControl structure, which is then used for the encode and mux process. During encoding, the function ffmpegWriteFrame() writes the encoded stream into the output container via the FFmpeg muxer; finally, closeMuxControl() releases the system resources.
In the code, "ENCODE_AUDIO" turns on audio encoding: it compresses the raw audio data from "audio.dat" and muxes it into the video container.
Reference
FFmpeg: build with shared libraries
Luca Barbato's blog about the bitstream filtering
Luca Barbato's blog about the new AVCodec API