Open Source Media Framework - Intel® Media SDK Decoder Latency

ID: 757386
Updated: 3/10/2022
Version: Latest
Public

There are three stages that can legitimately introduce latency in the Intel® Media SDK decoder:

  • Bitstream parsing: when the application calls DecodeFrameAsync with some amount of bitstream data, the decoder does not know whether that data contains a complete frame, so it starts the real decode process only when the application sends the beginning of the next frame. If your application receives data from a splitter and you are sure each buffer holds exactly one full frame, set the MFX_BITSTREAM_COMPLETE_FRAME flag in the mfxBitstream object (see the decoder-side sketch after this list). 
  • AsyncDepth: the AsyncDepth parameter also affects latency because it controls how many asynchronous operations may be queued before synchronization. To remove this component of the latency, set AsyncDepth to 1 (also shown in the decoder-side sketch after this list).
  • Reordering: according to the Advanced Video Coding (AVC) specification, a decoder does not have to return a decoded surface for display immediately. This holds even in the absence of B-frames: the decoder does not know in advance that it will never meet a B-frame, so reordering may still be required, and by default the decoder returns the first decoded frame only once the DPB is full. It is possible to tune an encoder to produce a low-latency bitstream that avoids this delay; the rules are listed below, and the encoder-side sketch after this list shows one way to request such a stream:
    • SPS.pic_order_cnt_type = 2
    • or, if SPS.pic_order_cnt_type = 0, then
      • set VUI.max_dec_frame_buffering >= the number of reference frames used by the encoder (e.g. 1)
      • or SEI.pic_timing.dpb_output_delay = 0
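
The first two items translate directly into decoder-side code. Below is a minimal sketch, not a complete application: the function name and arguments are illustrative, and session creation, surface allocation and error checking are omitted.

#include "mfxvideo.h"

// Decode one complete frame with minimal latency. The work surface is assumed
// to be pre-allocated (normally sized via MFXVideoDECODE_QueryIOSurf).
void DecodeOneCompleteFrame(mfxSession session,
                            mfxU8* frameData, mfxU32 frameSize,
                            mfxFrameSurface1* workSurface)
{
    // Wrap the buffer and mark it as exactly one complete frame so the
    // decoder does not wait for the start of the next frame before decoding.
    mfxBitstream bs = {};
    bs.Data       = frameData;
    bs.DataLength = frameSize;
    bs.MaxLength  = frameSize;
    bs.DataFlag   = MFX_BITSTREAM_COMPLETE_FRAME;

    // Populate stream parameters from the headers, then limit the
    // asynchronous pipeline to a single in-flight operation.
    mfxVideoParam par = {};
    par.mfx.CodecId = MFX_CODEC_AVC;
    MFXVideoDECODE_DecodeHeader(session, &bs, &par);
    par.AsyncDepth = 1;
    par.IOPattern  = MFX_IOPATTERN_OUT_SYSTEM_MEMORY;
    MFXVideoDECODE_Init(session, &par);

    // Submit the frame and synchronize immediately.
    mfxFrameSurface1* outSurface = nullptr;
    mfxSyncPoint      syncPoint  = nullptr;
    MFXVideoDECODE_DecodeFrameAsync(session, &bs, workSurface,
                                    &outSurface, &syncPoint);
    if (syncPoint)
        MFXVideoCORE_SyncOperation(session, syncPoint, 60000 /* ms */);
}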

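For the reordering rules, one possible encoder-side approach is sketched below: disable B-frames, use a single reference frame, and attach an mfxExtCodingOption buffer whose MaxDecFrameBuffering matches NumRefFrame. The function name and the fixed bitrate and frame-rate values are placeholders, and whether the driver emits exactly the SPS/VUI/SEI syntax listed above (in particular pic_order_cnt_type) is implementation dependent, so the produced stream should be verified.

#include "mfxvideo.h"

// Initialize an AVC encoder aimed at a low-latency bitstream.
mfxStatus InitLowLatencyEncoder(mfxSession session, mfxU16 width, mfxU16 height)
{
    // No B-frames and a single reference frame, so no reordering is needed.
    mfxVideoParam par = {};
    par.mfx.CodecId                 = MFX_CODEC_AVC;
    par.mfx.TargetUsage             = MFX_TARGETUSAGE_BALANCED;
    par.mfx.GopRefDist              = 1;      // no B-frames
    par.mfx.NumRefFrame             = 1;
    par.mfx.RateControlMethod       = MFX_RATECONTROL_CBR;
    par.mfx.TargetKbps              = 4000;   // placeholder bitrate
    par.mfx.FrameInfo.FourCC        = MFX_FOURCC_NV12;
    par.mfx.FrameInfo.ChromaFormat  = MFX_CHROMAFORMAT_YUV420;
    par.mfx.FrameInfo.PicStruct     = MFX_PICSTRUCT_PROGRESSIVE;
    par.mfx.FrameInfo.Width         = width;  // assumed 16-aligned
    par.mfx.FrameInfo.Height        = height; // assumed 16-aligned
    par.mfx.FrameInfo.CropW         = width;
    par.mfx.FrameInfo.CropH         = height;
    par.mfx.FrameInfo.FrameRateExtN = 30;     // placeholder frame rate
    par.mfx.FrameInfo.FrameRateExtD = 1;
    par.AsyncDepth                  = 1;
    par.IOPattern                   = MFX_IOPATTERN_IN_SYSTEM_MEMORY;

    // Hint the DPB size so the advertised buffering can match NumRefFrame.
    mfxExtCodingOption co = {};
    co.Header.BufferId      = MFX_EXTBUFF_CODING_OPTION;
    co.Header.BufferSz      = sizeof(co);
    co.MaxDecFrameBuffering = par.mfx.NumRefFrame;

    mfxExtBuffer* extBuffers[] = { &co.Header };
    par.ExtParam    = extBuffers;
    par.NumExtParam = 1;

    return MFXVideoENCODE_Init(session, &par);
}
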
Project: Open Source Media Framework

"