
How to achieve ffmpeg audio and video synchronization with Qt


This article explains in detail how to achieve ffmpeg audio and video synchronization with Qt. I think it is very practical, so I am sharing it as a reference; I hope you get something out of it.

I. Preface

Using ffmpeg to synchronize audio and video is, in my opinion, the hardest part of basic ffmpeg processing; countless people get stuck here. I have tried all kinds of demos found on the Internet, and they are basically rubbish: they only support a very small number of video files (for example, ones that deliver exactly one frame of video followed by one frame of audio), or they cannot synchronize at all, or seeking to a new position makes the playback jump around. In fact, the most complete audio and video synchronization demo is ffplay itself. I have personally tested it with dozens of local audio and video files and dozens of video streams, and all of them played perfectly. Of course, that is ffplay's achievement rather than my own, and my own implementation is not perfect.

If you only play video streams (without an audio stream), you may not need audio and video synchronization at all. That is why, when I only played rtsp video streams at first, I never considered synchronization: I had not run into the problem and did not need it. The problem became much bigger later with rtmp, http and m3u8 streams. hls-format streams in particular arrive as a series of small segment files, each delivering a burst of packets at once; without synchronization, a whole batch of frames is drawn in an instant, and then another batch the next time. So you have to handle synchronization yourself: received packets are put into a queue and displayed only when they are due.

Common audio and video synchronization methods:

Control by fps. fps is the number of frames played per second, for example 25. Measure the time spent decoding each frame yourself and pad out the rest of the frame interval with a delay (1000 ms / 25 fps = 40 ms per frame). This is really the crudest approach, relying purely on delays.
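A rough sketch of this fps-based pacing, purely for illustration; playByFps, decodeOneFrame and playing are placeholders of mine, not names from the article's code:

// Sketch: fixed-interval pacing by fps (illustration only, placeholder names).
#include <QElapsedTimer>
#include <QThread>
#include <functional>

static void playByFps(int fps, const std::function<void()> &decodeOneFrame, const bool &playing)
{
    const int interval = 1000 / fps;              // e.g. 1000 ms / 25 fps = 40 ms per frame
    QElapsedTimer timer;
    while (playing) {
        timer.start();
        decodeOneFrame();                         // decode and display one frame
        const int elapsed = int(timer.elapsed()); // time actually spent decoding
        if (elapsed < interval) {
            QThread::msleep(interval - elapsed);  // pad out the rest of the frame interval
        }
    }
}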

Record the time decoding starts as startTime, compute the pts time of each packet with av_rescale_q, and treat the difference between the two as the time to wait, calling av_usleep to delay. This works for only some files; very often it misbehaves.
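A minimal sketch of this second approach, for illustration only; it assumes stream is the AVStream the packet belongs to and startTime was captured once with av_gettime() when decoding began. This is not the article's own code:

// Sketch: pts-based delay using av_rescale_q and av_usleep (illustration only).
extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>   // av_rescale_q
#include <libavutil/time.h>          // av_gettime, av_usleep
}

static void delayToPts(const AVPacket *packet, const AVStream *stream, int64_t startTime)
{
    if (packet->pts == AV_NOPTS_VALUE) {
        return;                                       // some raw streams carry no pts at all
    }
    AVRational usBase = {1, AV_TIME_BASE};            // microsecond time base
    int64_t ptsTime = av_rescale_q(packet->pts, stream->time_base, usBase);
    int64_t elapsed = av_gettime() - startTime;       // wall-clock time since decoding started
    if (ptsTime > elapsed) {
        av_usleep(ptsTime - elapsed);                 // wait until the frame is due
    }
}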

Synchronize audio to video, with the video clock as the master clock. I have never tried this; many people online say this method is not good.

Synchronize video to audio, with the audio clock as the master clock. I have never tried this either; it is said that this is the method most people use.

Synchronize both audio and video to an external clock, with the external clock as the master clock. This is the method I ended up using: it is easy to understand, the two streams do not interfere with each other, and each synchronizes itself against the external clock.

ffplay has all three synchronization strategies built in, selectable with a command-line parameter; the default is to synchronize video to audio.
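For reference, ffplay exposes this choice on the command line through its -sync option (input.mp4 is just a placeholder file name):

ffplay -sync audio input.mp4   # default: video is synchronized to the audio clock
ffplay -sync video input.mp4   # audio is synchronized to the video clock
ffplay -sync ext   input.mp4   # both are synchronized to an external clock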

II. Functional features

Multi-threaded real-time playback of video streams, local videos, USB cameras, and so on.

Supports Windows, Linux and macOS; supports ffmpeg3 and ffmpeg4; supports 32-bit and 64-bit.

Images are drawn in a separate thread so the main interface does not freeze.

Automatically reconnects to the network camera.

The border size (i.e. the offset) and the border color can be set.

You can set whether to draw OSD labels, i.e. overlay text or pictures, and where to place them.

Two OSD positions and styles can be configured.

You can set whether to save to a file and the file name.

You can drag a file directly onto the ffmpegwidget control to play it.

Supports h265 video streams, rtmp and other common video streams.

Playback can be paused and resumed.

Supports saving a single video file as well as saving video files at timed intervals.

A customizable floating bar at the top emits a click signal when clicked; it can be enabled or disabled.

The picture can be set to stretch to fill or to scale proportionally.

The decoding strategy can be set to speed priority, quality priority, or balanced.

You can capture snapshots (the original picture) as well as screenshots of the video.

Video files can be stored as raw (elementary) streams or as MP4 files.

Perfect audio and video synchronization, using external clock synchronization strategy.

Seek is supported to locate the playback position.

Hardware decoding such as qsv, dxva2 and d3d11va is supported.

Supports drawing the video data with opengl, with very low CPU usage.

Supports Android and embedded Linux; just cross-compile.

III. Effect picture

IV. Core code

void FFmpegSync::run()
{
    reset();
    while (!stopped) {
        // do not process while paused or while the queue is empty
        if (!thread->isPause && packets.count() > 0) {
            mutex.lock();
            AVPacket *packet = packets.first();
            mutex.unlock();

            // raw h264 stream files cannot be synchronized properly because pts and dts
            // are unavailable; for now use the crudest method, a fixed delay, to get by
            if (thread->formatName == "h264") {
                int sleepTime = (1000 / thread->videoFps) - 5;
                msleep(sleepTime);
            }

            // calculate the display time of the current frame (external clock synchronization)
            ptsTime = getPtsTime(thread->formatCtx, packet);
            if (!this->checkPtsTime()) {
                msleep(1);
                continue;
            }

            // show the current playback progress
            checkShowTime();

            // type: 0 = audio, 1 = video
            if (type == 0) {
                thread->decodeAudio(packet);
            } else if (type == 1) {
                thread->decodeVideo(packet);
            }

            // release the packet and remove it from the queue
            thread->free(packet);
            mutex.lock();
            packets.removeFirst();
            mutex.unlock();
        }
        msleep(1);
    }

    clear();
    stopped = false;
}

bool FFmpegSync::checkPtsTime()
{
    bool ok = false;
    if (ptsTime > 0) {
        if (ptsTime > offsetTime + 100000) {
            bufferTime = ptsTime - offsetTime + 1000000;
        }

        int offset = (type == 0 ? 1000 : 5000);
        offsetTime = av_gettime() - startTime + bufferTime;
        // the article's text is cut off at this point; presumably the frame is considered
        // due once the external clock has caught up to within 'offset' microseconds of ptsTime
        if (offsetTime >= ptsTime - offset) {
            ok = true;
        }
    } else {
        ok = true;
    }
    return ok;
}
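The getPtsTime() helper used above is not shown in the article. For reference only, here is a minimal sketch of what such a helper might do, assuming it simply rescales the packet timestamp from the stream time base to microseconds so it can be compared against the external clock; this is an assumption, not the author's original code:

// Hypothetical sketch of a getPtsTime() helper (assumption, not the article's code):
// convert the packet timestamp to microseconds so checkPtsTime() can compare it
// with av_gettime() - startTime.
#include <QtGlobal>
extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>
}

static qint64 getPtsTime(AVFormatContext *formatCtx, AVPacket *packet)
{
    AVStream *stream = formatCtx->streams[packet->stream_index];
    qint64 pts = (packet->pts == AV_NOPTS_VALUE) ? packet->dts : packet->pts;
    if (pts == AV_NOPTS_VALUE) {
        return 0;                                     // raw streams may carry no timestamp
    }
    AVRational usBase = {1, AV_TIME_BASE};            // microsecond time base
    return av_rescale_q(pts, stream->time_base, usBase);
}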
