Transcoder on Edge servers

Hello Max,

Currently we have:

1 WCS -> Origin USA - Receives all RTMP streams and sends the output MIXER to end users
1 WCS -> Origin BR - Receives all webcam streams (near the publishers)
The propagation route between origins is set to true.

1 WCS -> StandAlone USA - Receives all streams PUSHED from Origin BR to create an MCU

Edges are behind a load balancer (LB) at Google.


How can we implement transcoding on the Edge servers?
We attempted to play a stream like streamname-720p as set in cdn_profiles.yml, but it doesn't work. We can play streamname normally, without the suffix.

Thanks!
 

Max

Administrator
Staff member
Good day.
The propagation route between origins is set to true.
This should be set to false:
Code:
cdn_origin_to_origin_route_propagation=false
and origin-to-origin route propagation should not be used in production if possible.
How can we implement transcoding on the Edge servers?
We attempted to play a stream like streamname-720p as set in cdn_profiles.yml, but it doesn't work. We can play streamname normally, without the suffix.
You should either allow transcoding on Origins (but this is not recommended):
Code:
cdn_origin_allowed_to_transcode=true
or deploy Transcoder nodes to the CDN to use transcoding profiles.
Transcoding can be performed on Edge servers only if constraints are set for the subscriber, e.g. if video height or bitrate is specified:
Code:
session.createStream({name:"stream1", constraints:{audio:true, video:{width:640,height:480}}}).play();
For the HLS playback case, dedicated Transcoder nodes should be preferred; they allow using transcoding profiles as well as ABR.
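For reference, a -720p profile entry in cdn_profiles.yml could look roughly like this. This is only a sketch: the codec and bitrate values are placeholders, not taken from the thread, so check the CDN transcoding documentation for your WCS version.

```yaml
profiles:
  -720p:             # suffix appended to the stream name, e.g. streamname-720p
    audio:
      codec: opus
      rate: 48000
    video:
      height: 720    # target height; width is scaled to keep the aspect ratio
      bitrate: 2000  # target video bitrate in kbps (placeholder value)
      codec: h264
```

With such a profile deployed on a Transcoder node, playing streamname-720p should request the transcoded rendition.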
 
Hello Max,

cdn_origin_to_origin_route_propagation=false

We need this to be set to true.

About our system:

ADMIN AREA:

Has a monitor page, where we can watch all streams.
Can publish all participants to, and remove them from, the Mixer and MCU, using the API.

SPEAKER AREA:

Has an MCU to talk with the other speakers.
Has a Mixer -> MONITORS what is being sent out to viewers (it includes the current speaker plus the speaker's desktop).

To make this work, we need two origins and a dedicated server for the MCU.

The speaker goes online -> opens his webcam on server A.
10 seconds later, a JS API call PUSHes his stream to the MCU server and logs him into a speakers' room.

Using routes between origin servers, we can publish the routed stream via the API (admin panel) to the output mixer (viewers' side).
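The "publish via API to the output mixer" step described above could be sketched as follows, assuming the WCS /mixer/add REST method. The host, REST port, mixer name, and stream name are placeholders, not values from this thread; verify the method and field names against the REST API reference for your WCS version.

```javascript
// Hedged sketch: build the REST request that feeds a routed stream into
// the output mixer via /rest-api/mixer/add (assumed method and fields).
function buildMixerAddRequest(host, mixerName, streamName) {
  return {
    url: `http://${host}:8081/rest-api/mixer/add`,  // REST port 8081 is an assumption
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        uri: `mixer://${mixerName}`,  // target mixer
        streamName: streamName        // routed stream to add to it
      })
    }
  };
}

// usage: fetch(req.url, req.options) from the admin panel backend
const req = buildMixerAddRequest("origin-usa.example.com", "mixer1", "speaker1");
console.log(req.url);
```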

Why do we do this?
Because if we attach the stream to the MCU and then to the Mixer (on the same server that receives the webcam), the MCU video from this speaker freezes and he loses the background communication with staff.
 

Max

Administrator
Staff member
Ok, we understand.
We raised ticket WCS-2825 to implement the possibility of adding one stream to two mixers simultaneously, and we will let you know about the ticket state in this topic.
Please clarify what kind of stream republishing you use: RTMP (/push/startup) or WebRTC (/pull/push)?
Also, how do you move streams from Origin BR to Origin USA?
Please note that you can use the WebRTC pulling and pushing REST API without a CDN setup. If you just have to push all webcam streams from Origin BR to Origin USA, you can use a /pull/push query without route propagation.
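The /pull/push call suggested above could be sketched like this. The host names, WebSocket port, REST port, and JSON field names are assumptions for illustration; confirm them against the REST API documentation for your WCS version before using.

```javascript
// Hedged sketch: build the REST request that asks Origin USA to pull a
// WebRTC stream from Origin BR and republish it locally via /rest-api/pull/push.
function buildPullPushRequest(originUsa, originBr, streamName) {
  return {
    url: `http://${originUsa}:8081/rest-api/pull/push`,  // REST port 8081 is an assumption
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        uri: `ws://${originBr}:8080`,   // WebSocket endpoint of the remote origin (assumed)
        remoteStreamName: streamName,   // stream name on Origin BR
        localStreamName: streamName     // name to expose on Origin USA
      })
    }
  };
}

// usage: fetch(req.url, req.options) once the hosts are reachable
const req = buildPullPushRequest("origin-usa.example.com", "origin-br.example.com", "webcam1");
console.log(req.url);
```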
 