I'm trying to publish a screen capture stream from an Android device to a Flashphoner server, using this example: https://github.com/flashphoner/wcs-...ple/screen_sharing/ScreenSharingActivity.java. I'm using Xamarin, though hopefully that doesn't make any difference.
I've successfully streamed my phone camera, but with screen sharing there is a problem: only sound reaches the server, no video. I've also inspected the record files produced by Flashphoner (with ffprobe), and they likewise contain only an audio stream. I've requested screen capture permission and did not forget to set AndroidScreenCapturer via `WebRTCMediaProvider.getInstance().setVideoCapturer(videoCapturer);` as described in your example, and I set video constraints as well. The only difference I see is that I don't use local or remote renderers (no renderers at all), since I only need to publish the stream; displaying it to the user is not necessary. Might that be the problem? Any other clues as to why the video is ignored?
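For reference, here is roughly what my setup looks like, condensed from my Xamarin bindings back into Java to match your sample. This is a minimal sketch, not my exact code: I'm assuming the AndroidScreenCapturer constructor mirrors org.webrtc's ScreenCapturerAndroid(Intent, MediaProjection.Callback), and the Flashphoner SDK import paths are omitted rather than guessed at.

```java
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;
// plus the Flashphoner SDK imports for WebRTCMediaProvider and AndroidScreenCapturer

public class ScreenPublishActivity extends Activity {
    private static final int SCREEN_CAPTURE_REQUEST_CODE = 1; // arbitrary request code

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Ask the user for screen capture permission first.
        MediaProjectionManager projectionManager =
                (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(projectionManager.createScreenCaptureIntent(),
                SCREEN_CAPTURE_REQUEST_CODE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode != SCREEN_CAPTURE_REQUEST_CODE || resultCode != RESULT_OK) {
            return;
        }
        // Hand the permission result to the screen capturer and register it with the
        // SDK before publishing, as in the ScreenSharingActivity example.
        // (Constructor signature assumed from org.webrtc's ScreenCapturerAndroid.)
        AndroidScreenCapturer capturer = new AndroidScreenCapturer(data,
                new MediaProjection.Callback() {
                    @Override
                    public void onStop() {
                        // Projection was stopped by the user or the system.
                    }
                });
        WebRTCMediaProvider.getInstance().setVideoCapturer(capturer);
        // ...then I create the session, set video constraints on the stream,
        // and publish. No local or remote renderers are created anywhere.
    }
}
```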