Periodic sync issue when transcoding

Denis Vasiliev

New Member
Hello,

We are facing audio/video sync issues after permanently enabling transcoding for all streams.

A brief overview of our architecture:
We use the Android and iOS SDKs; during a stream, publishing takes place between two mobile devices. Before publishing the video, a command is sent to force transcoding; the transcoding itself takes place on the origin server.
JDK 12
Flashphoner ver. 5.2.1054

Applying the following settings did not help:
av_paced_sender=true
av_paced_sender_max_buffer_size=5000

We could not find a clear pattern, but the problem seems more likely to occur when the video quality is very low.

Additionally, the Flashphoner servers are located in America, while the tests are carried out in Europe (perhaps this is useful for the initial analysis).

Any advice would be appreciated!
 

Max

Administrator
Staff member
Hello

1. Try to update.

Build 5.2.1054 is from October 2021, which is quite old.
Try updating to the latest available release: https://docs.flashphoner.com/display/WCS52EN/Release+notes
Make sure you have a backup before updating and know how to roll back all changes quickly.

2. Switch to TCP.

Out-of-sync issues can be caused by network packet loss.
To avoid the losses, you can switch streaming to TCP: https://docs.flashphoner.com/display/WCS52EN/Publishing+and+playing+stream+via+WebRTC+over+TCP

3. Check lipsync.

Open the URL http://host:8081?action=stat
See the streams_synchronization parameter.
This metric shows the current synchronization between audio and video in milliseconds.
If it is greater than 100 ms, the stream is out of sync.
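If you want to monitor this continuously, the check above can be scripted. A minimal sketch, assuming the stat page returns plain `key=value` lines and that `streams_synchronization` holds a millisecond value (the host name and exact output format are assumptions, check them against your server's actual stat output):

```python
import urllib.request

# Per the advice above: a value greater than 100 ms means out of sync
SYNC_THRESHOLD_MS = 100

def parse_stat(text: str) -> dict:
    """Parse plain key=value lines, as assumed for the WCS stat page."""
    stats = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            stats[key.strip()] = value.strip()
    return stats

def is_out_of_sync(stats: dict, threshold_ms: int = SYNC_THRESHOLD_MS) -> bool:
    """Return True if streams_synchronization exceeds the threshold."""
    raw = stats.get("streams_synchronization")
    if raw is None:
        return False  # metric absent: nothing to report
    try:
        return abs(int(raw)) > threshold_ms
    except ValueError:
        return False  # non-numeric value: skip rather than crash

def check_server(url: str = "http://host:8081?action=stat") -> bool:
    """Fetch the stat page and report whether the stream is out of sync."""
    with urllib.request.urlopen(url) as resp:
        return is_out_of_sync(parse_stat(resp.read().decode()))
```

Replace `host` with your WCS server address; calling `check_server()` periodically (e.g. from cron) would let you correlate desync events with low-quality periods.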

4. Transcoding.

How exactly do you transcode? What codecs are used for transcoding?
We have recently fixed an issue with transcoding to AAC under packet loss, in build 5.2.1329.
 

Denis Vasiliev

New Member
1. Try to update.
Thanks for the recommendation, we will look into upgrading to a newer version.

2. Switch to TCP.
We publish in real time. Based on your experience, which transport would be optimal, UDP or TCP, given that the Internet connection may be unstable but synchronization between the two streams on the mobile devices should be as accurate as possible?

As we understand it, with an unstable Internet connection we will get artifacts.

4. Transcoding.
About our transcoder:
"videoCodec": "H264",
"width": 480,
"height": 640

We take the audio from the source without transcoding.
 

Max

Administrator
Staff member
We publish in real time. Based on your experience, which transport would be optimal, UDP or TCP, given that the Internet connection may be unstable
WebRTC over UDP is faster than TCP, but UDP is less resistant to packet loss. So on good, stable Internet connections UDP is recommended, but on unstable ones TCP is preferable. You can switch the transport on the client side in the Android and iOS SDKs.
but synchronization between the two streams on the mobile devices should be as accurate as possible.
You cannot synchronize two different streams, because WebRTC sets a unique synchronization value for every stream. It is bad practice to publish audio and video as separate streams from the same device; we recommend publishing one stream with both audio and video tracks. As for screen sharing, the Android SDK allows capturing the microphone, and the iOS SDK allows capturing either the microphone or system sound.
As we understand it, with an unstable Internet connection we will get artifacts.
Yes, those are caused by packet loss. In this case, TCP is the preferable transport.
 