Android screen capture problem

Evgenii

New Member
I'm trying to publish a screen capture stream from an Android device to a Flashphoner server, using this example: https://github.com/flashphoner/wcs-...ple/screen_sharing/ScreenSharingActivity.java. I'm using Xamarin, though hopefully that doesn't make any difference.

I've successfully streamed my phone camera, but with screen sharing there is a problem: only sound reaches the server, no video. I've also examined the record files produced by Flashphoner (with ffprobe), and they likewise contain only an audio stream. I've requested screen capture permissions and did not forget to set AndroidScreenCapturer via ` WebRTCMediaProvider.getInstance().setVideoCapturer(videoCapturer); ` as described in your example. I also set video constraints. The only difference I see is that I do not use local or remote renderers (no renderers at all), since I only need to publish the stream; displaying it to the user is not necessary. Might that be the problem? Any other clues why the video is ignored?
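For reference, my setup looks roughly like this (a minimal sketch using standard Android APIs; only the setVideoCapturer call is taken from your example — the activity name, the request code, and the way videoCapturer is obtained are my own placeholders):

```java
// PublishScreenActivity is a hypothetical name for this sketch.
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjectionManager;

public class PublishScreenActivity extends Activity {
    // Arbitrary request code chosen for this example.
    private static final int REQUEST_SCREEN_CAPTURE = 1001;

    private void requestScreenCapture() {
        // Ask the system for the screen capture grant dialog.
        MediaProjectionManager projectionManager = (MediaProjectionManager)
                getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(projectionManager.createScreenCaptureIntent(),
                REQUEST_SCREEN_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SCREEN_CAPTURE && resultCode == RESULT_OK) {
            // videoCapturer is created from the grant exactly as in the linked
            // ScreenSharingActivity example; the constructor is SDK-specific,
            // so it is elided here.
            // WebRTCMediaProvider.getInstance().setVideoCapturer(videoCapturer);
        }
    }
}
```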
 

Evgenii

New Member
Also, in the client (Android) logs I see this from MediaConnection when publishing the screen:
a=group:BUNDLE audio

While when publishing the camera I see this:
a=group:BUNDLE audio video

This suggests that indeed only audio is published via WebRTC for some reason.
 

Evgenii

New Member
I found the reason: it turns out that on Android Q (and possibly earlier versions) you are required to start a foreground service before you can capture screen contents; just requesting permission via MediaProjectionManager is not enough. After I registered and started a foreground service (with the "mediaProjection" foreground service type) before requesting the screen share permission, it started to work. So it is probably worth updating the documentation on that.
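For anyone else hitting this, a minimal sketch of such a service (ScreenCaptureService, the channel id, and the notification text are my own names, not from the SDK):

```java
// Minimal foreground service to satisfy Android Q's MediaProjection requirement.
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.Service;
import android.content.Intent;
import android.content.pm.ServiceInfo;
import android.os.Build;
import android.os.IBinder;

public class ScreenCaptureService extends Service {
    private static final String CHANNEL_ID = "screen_capture";

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // A notification channel is mandatory on API 26+.
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            getSystemService(NotificationManager.class).createNotificationChannel(
                    new NotificationChannel(CHANNEL_ID, "Screen capture",
                            NotificationManager.IMPORTANCE_LOW));
        }
        Notification notification = new Notification.Builder(this, CHANNEL_ID)
                .setContentTitle("Publishing screen")
                .setSmallIcon(android.R.drawable.ic_menu_camera)
                .build();
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            // Declare the mediaProjection type so screen capture is allowed.
            startForeground(1, notification,
                    ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION);
        } else {
            startForeground(1, notification);
        }
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```

The service also has to be declared in AndroidManifest.xml with android:foregroundServiceType="mediaProjection", and the app needs the FOREGROUND_SERVICE permission. Start it (via startForegroundService) before calling createScreenCaptureIntent.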
 

Max

Administrator
Staff member
Good day.
Thank you for the good job. We raised ticket WCS-2880 to update the Android SDK documentation.
 