Motion-energy-based unequal error protection of H.264/AVC video bitstreams

  • Huu Dung Pham

    Student thesis: Doctor of Philosophy (PhD) - CDU


    Video compression technology is one of the main features of current and next-generation broadband networks, and H.264 Advanced Video Coding (AVC) is the most widely applied and efficient compression standard. As human eyes are very sensitive to motion activity, an error occurring in a motion region significantly degrades video quality. In recently proposed techniques, the motion activity of frames has been evaluated from the pixel luminance difference or the number of blocks moved between consecutive frames. These techniques have high complexity, as their calculations must be performed at the pixel level. In the H.264/AVC standard, a video frame is divided into sub-macroblocks, each containing a number of pixels arranged in a square or rectangular shape. This thesis proposes a new concept for evaluating the motion activity of video frames, called motion energy, which is derived from the distance a sub-macroblock moves between consecutive frames and its size. Because the motion energy concept uses parameters directly related to motion activity, it is more accurate than previously proposed techniques that rely on luminance to determine motion activity.
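    A minimal sketch of the idea, with assumed details: the abstract only states that motion energy depends on a sub-macroblock's displacement between consecutive frames and its size, so the energy-like weighting below (block area times squared motion-vector length, by analogy with mass times velocity squared) is an illustrative assumption, not the thesis's exact formula.

    ```python
    # Hypothetical sketch: a per-frame "motion energy" score from H.264/AVC
    # motion vectors. Assumed form (not taken from the thesis):
    #   E = sum over sub-macroblocks of (block_area * displacement^2)

    def motion_energy(blocks):
        """blocks: list of (width, height, mv_x, mv_y) per sub-macroblock,
        with motion vectors in pixels relative to the previous frame."""
        energy = 0.0
        for w, h, mvx, mvy in blocks:
            area = w * h                     # sub-macroblock size (16x16 down to 4x4)
            dist_sq = mvx * mvx + mvy * mvy  # squared displacement between frames
            energy += area * dist_sq
        return energy

    # Example: one static 16x16 block and one moving 8x8 block.
    frame_blocks = [(16, 16, 0, 0), (8, 8, 3, 4)]
    print(motion_energy(frame_blocks))  # 64 * (9 + 16) = 1600.0
    ```

    Note that, unlike luminance-based metrics, this uses only motion vectors and block sizes already present in the compressed bitstream, so no pixel-level computation is needed.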

    The thesis also introduces new multi-level error protection techniques for video bitstreams organised as Groups of Pictures (GOPs). Frames of a GOP are unequally protected according to their motion energies, which are estimated from previous GOPs. The proposed techniques improve video performance while maintaining the same overhead as other conventional error protection techniques. Since the estimation methods introduce no delay in determining the importance of frames, the proposed techniques can be applied to real-time video transmission systems.
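    The assignment of protection levels could be sketched as follows. The number of levels and the rank-based grouping are illustrative assumptions; the thesis's actual mapping from estimated motion energy to protection strength is not given in the abstract.

    ```python
    # Hypothetical sketch of unequal error protection (UEP) level assignment:
    # frames whose corresponding frames in previous GOPs showed higher motion
    # energy receive a stronger protection class. Level 0 = strongest FEC.

    def assign_protection_levels(estimated_energies, num_levels=3):
        """Rank frames by estimated motion energy and map them onto
        num_levels protection classes (0 = strongest protection)."""
        order = sorted(range(len(estimated_energies)),
                       key=lambda i: estimated_energies[i], reverse=True)
        levels = [0] * len(estimated_energies)
        group = max(1, len(order) // num_levels)
        for rank, idx in enumerate(order):
            levels[idx] = min(rank // group, num_levels - 1)
        return levels

    # Example: a 6-frame GOP; the two highest-energy frames get class 0.
    print(assign_protection_levels([120.0, 30.0, 500.0, 80.0, 10.0, 250.0]))
    # -> [1, 2, 0, 1, 2, 0]
    ```

    Because the energies come from already-decoded previous GOPs, this assignment adds no estimation delay, which is what makes the scheme usable for real-time transmission.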
    Date of Award: Jan 2015
    Original language: English
    Supervisor: Sina Vafi (Supervisor)