How to achieve canvas streaming at 1080p

auronsan14

New Member
Currently we use canvas streaming:

Code:
const constraints = {
  audio: false,
  video: false,
  customStream: canvStream.current,
};

session
  .createStream({
    name: streamName,
    display: localVideo.current,
    constraints,
    cacheLocalResources: true
  })
  .on(STREAM_STATUS.PUBLISHING, function (stream: any) {
    setStatus('#publishStatus', STREAM_STATUS.PUBLISHING);
    onPublishing(stream);
  })
  .on(STREAM_STATUS.UNPUBLISHED, function (e: any) {
    setStatus('#publishStatus', e.getErrorInfo());
    disconnect();
  })
  .on(STREAM_STATUS.FAILED, function (e: any) {
    setStatus('#publishStatus', e.getErrorInfo());
    disconnect();
  })
  .publish();
How can we achieve 1080p video?
 

Max

Administrator
Staff member
Good day.
This is a browser limit: the picture size of a stream published from a canvas may only be equal to or less than the canvas element size on the page.
So, to publish 1080p from a canvas, you need to place a canvas element of 1920x1080 pixels on the web page.
Another option is to transcode the stream picture to the desired size on the server side, but this requires a lot of CPU and RAM on the server.
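A minimal sketch of the first approach (the canvas here is created and captured in code; variable names are illustrative, and in the React snippet above the captured stream would be stored in canvStream.current):
Code:
// Create a 1920x1080 canvas so the captured picture can be 1080p
const canvas = document.createElement('canvas');
canvas.width = 1920;
canvas.height = 1080;
document.body.appendChild(canvas);

// Capture the canvas at 30 fps and pass it as customStream
const canvStream = canvas.captureStream(30);

const constraints = {
  audio: false,
  video: false,
  customStream: canvStream,
};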
 

auronsan14

New Member
How do we transcode a canvas stream, since there is no video constraint? Do we just use the API with the stream name?
 

Max

Administrator
Staff member
Transcoding means server-side transcoding and rescaling.
For example, if the initial canvas size is 320x240, the stream will be decoded on the server, then upscaled to 1920x1080, then encoded to an H.264 stream.

1. You will get a blurry 1920x1080 image, because upscaling affects image quality.
2. You will utilize 1-2 physical CPU cores on the server, because Full HD transcoding is a CPU-intensive operation.

Yes, you have to use the transcoding API.
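For reference, a rough sketch of starting such a transcoder via REST; the host, stream names and field layout are assumptions based on the WCS Transcoder REST API, so check the server documentation for the exact parameters:
Code:
// Assumed sketch: ask the server to rescale the published canvas stream
// to 1920x1080 and republish it under a new name.
fetch('https://wcs.example.com:8444/rest-api/transcoder/startup', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    uri: 'transcoder://tcode1',            // transcoder id (assumed)
    remoteStreamName: 'canvasStream',      // source stream published from canvas
    localStreamName: 'canvasStream1080p',  // name of the transcoded stream
    encoder: { width: 1920, height: 1080 }
  })
});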
 

Max

Administrator
Staff member
Is it possible to use headless Chrome to achieve it?
Probably not. Headless Chrome does not use the GPU, so transcoding will require a lot of client CPU. Also, canvas streaming requires the browser tab where the canvas is rendering to be the active tab. So headless Chrome does not seem to be a good solution for this case.
 

auronsan14

New Member
Yeah, but we did manage to run the canvas in headless Chrome + WebRTC to achieve 1080p video.

Now we have a problem when trying to initialize two Flashphoner roomApi instances and Flashphoner createStream.
 

auronsan14

New Member
Yeah, it works now with room.publish.

Now I am trying to get audio input to push along with customStream.

Is there a way to get the audio input from each participant's object?
var participants = room.getParticipants();
 

Max

Administrator
Staff member
Is there a way to get the audio input from each participant's object?
Each participant's stream is playing in a video tag (because it's a usual stream). See the Conference example source code:
Code:
participant.getStreams()[0].play(document.getElementById(pDisplay)).on(STREAM_STATUS.PLAYING, function (playingStream) {
    document.getElementById(playingStream.id()).addEventListener('resize', function (event) {
        resizeVideo(event.target);
    });
});
The div tag to be used as a container for the video tag is passed to the participant.getStreams()[0].play() function. This function returns a Web SDK Stream object. So you can get the video tag in two ways:
1. Get the stream object and use the Stream.id() method:
Code:
var playingStream = participant.getStreams()[0].play(document.getElementById("participant1Display"));
var video = document.getElementById(playingStream.id());
2. Use the STREAM_STATUS.PLAYING event:
Code:
participant.getStreams()[0].play(document.getElementById("participant1Display")).on(STREAM_STATUS.PLAYING, function (playingStream) {
    var video = document.getElementById(playingStream.id());
});
Then you can add the audio track from the video tag to the canvas stream:
Code:
canvasStream.addTrack(video.srcObject.getAudioTracks()[0]);
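Putting it together, a rough sketch (stream and element names are illustrative) of mixing a participant's audio into the canvas stream and then publishing it as customStream:
Code:
participant.getStreams()[0]
  .play(document.getElementById("participant1Display"))
  .on(STREAM_STATUS.PLAYING, function (playingStream) {
    // play() renders into a video tag whose id equals the stream id
    var video = document.getElementById(playingStream.id());
    // Add the participant's audio track to the canvas stream
    canvasStream.addTrack(video.srcObject.getAudioTracks()[0]);
  });

// Publish the canvas stream (now carrying the audio track) as customStream
session.createStream({
  name: streamName,
  display: localVideo.current,
  constraints: { audio: false, video: false, customStream: canvasStream }
}).publish();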
 