Android SDK 1.0 has a problem with audio-only streaming

ett

Member
Hello. I have a problem using Android SDK 1.0.

- related post: https://forum.flashphoner.com/threads/roomapi-does-not-make-hasvideo-false.12835/
- WCS: 5.2.629-dd8778ba58690d19a44ed583cd116650fe511539
- Android SDK: wcs-android-sdk-1.0.1.70-30acadd1f221211d5867d13ad7ee97baff924d5f

PROBLEM:
Code:
StreamOptions options = new StreamOptions();
options.setConstraints(new Constraints(true, false)); // audio: true, video: false
room.publish(null, options);
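For context, the status dump below arrives through the stream status listener; a minimal sketch of how I watch it (assuming publish() returns the Stream, with the on(StreamStatusEvent) pattern from the SDK examples):
Code:
Stream stream = room.publish(null, options);
stream.on(new StreamStatusEvent() {
    @Override
    public void onStreamStatus(Stream stream, StreamStatus streamStatus) {
        // fires with PUBLISHING while the payload below still reports hasVideo: true
        Log.d("RoomPublish", "status: " + streamStatus); // android.util.Log
    }
});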
Code:
StreamStatusEvent {
  "appKey" : "roomApp",
  "name" : "mystream",
  "published" : true,
  "hasVideo" : true,
  "hasAudio" : true,
  "status" : "PUBLISHING",
  "audioCodec" : "opus",
  "videoCodec" : "H264",
  "record" : false,
}
The "mystream" can play but there is no audio.
And the "mystream" will be forced to automatically UNPUBLISHED after some seconds with "Failed by RTP activity".



TEMPORARY WORKAROUNDS (2 WAYS):
a. Use Android SDK 1.1.
b. Change the codecs property in flashphoner.properties so that opus is the only codec (see the example after the status dump below).
Either way, the result looks like this:
Code:
StreamStatusEvent {
  "appKey" : "roomApp",
  "name" : "mystream",
  "published" : true,
  "hasVideo" : true,
  "hasAudio" : true,
  "status" : "PUBLISHING",
  "audioCodec" : "opus",

  "record" : false,
}
(missing "videoCodec". It is correct I expected)
The "mystream" can play with audio and "mystream" will not be UNPUBLISHED.



WHAT I WANT YOU TO DO:
Regarding a.: I need Android SDK 1.0 to support Android 5.0, so I cannot use 1.1.
Regarding b.: I need video in other cases, so I cannot remove H264 from the codecs list.

Could you please make this work correctly in Android SDK 1.0?
 

Max

Administrator
Staff member
Good day.
We reproduced the issue in the Media Devices example (so it is not specific to RoomApi) and raised ticket WCS-2743.
However, if the fix requires updating the WebRTC library, it will not be fixed, because new WebRTC library versions are supported in Android SDK 1.1 only.
We suggest the following workaround: publish the stream with video muted. In this case there will be no video (black screen), you will not need to restrict audio codecs on the server side, and audio will play with the default server settings.
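A minimal sketch of the muted-video publish (assuming room.publish() returns the published Stream as in your snippet, and that Stream exposes muteVideo() as in the SDK examples):
Code:
StreamOptions options = new StreamOptions();
options.setConstraints(new Constraints(true, true)); // keep video in the offer
Stream stream = room.publish(null, options);
stream.muteVideo(); // subscribers receive a black screen; audio plays normally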
 

ett

Member
However, if the fix requires updating the WebRTC library, it will not be fixed
I hope it does not come to that.

publish the stream with video muted. In this case there will be no video (black screen)
I understand.
However, that would require removing the video track from the recorded file, so I strongly hope that Android SDK 1.0 will be fixed.

I look forward to your progress report.
Best regards.
 

ett

Member
Hi, Max.
There is a new problem related to this.
In WCS-iOS-SDK-2.5.2, it appears that the camera device is being used even though the video constraint is false.
When I publish an audio-only stream (FPWCSApi2MediaConstraints.alloc().initWithAudioVideo(true, false)) while also capturing from the camera, I get this error:
Code:
AVCaptureSession was interrupted for VideoDeviceInUseByAnotherClient!
Is the SDK calling AVCaptureSession#addInput: with a video input even though the constraints are false?
 

Max

Administrator
Staff member
Is the SDK calling AVCaptureSession#addInput: with a video input even though the constraints are false?
The iOS SDK does not make this call directly. The WebRTC library probably does, and in that case we cannot handle it.
Please clarify: are you trying to publish two separate streams (audio and video) from the same application? If so, this will probably work in the Web SDK only.
 

ett

Member
Hi, Max.
Please clarify: are you trying to publish two separate streams (audio and video) from the same application? If so, this will probably work in the Web SDK only.
The answer is no. I'm publishing only one audio stream.
At the same time, I take a picture using the camera and upload it somewhere.
This worked on Android.
 

venkypokala

New Member
I am using the Android SDK SIP functions example -- Phone (wcs-android-sdk-1.1.0.13-release). It is not working in a background service; can you give me any suggestions?

When we stop the service, we get this crash:
Code:
java.lang.IllegalArgumentException: Receiver not registered: org.webrtc.NetworkMonitorAutoDetect@817ea71

2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at android.app.LoadedApk.forgetReceiverDispatcher(LoadedApk.java:1429)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at android.app.ContextImpl.unregisterReceiver(ContextImpl.java:1577)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at android.content.ContextWrapper.unregisterReceiver(ContextWrapper.java:664)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at org.webrtc.NetworkMonitorAutoDetect.unregisterReceiver(NetworkMonitorAutoDetect.java:751)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at org.webrtc.NetworkMonitorAutoDetect.destroy(NetworkMonitorAutoDetect.java:729)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at org.webrtc.NetworkMonitor.stopMonitoring(NetworkMonitor.java:129)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on W/System.err: at org.webrtc.NetworkMonitor.stopMonitoring(NetworkMonitor.java:138)
2020-06-21 20:28:46.449 5799-6024/com.evolgence.on E/rtc: #

# Fatal error in: gen/sdk/android/generated_base_jni/jni/../../../../../../../../../usr/local/google/home/sakal/code/webrtc-aar-release/src/sdk/android/src/jni/jni_generator_helper.h, line 85
# last system error: 0
# Check failed: !env->ExceptionCheck()

#
 

Max

Administrator
Staff member
The answer is no. I'm publishing only one audio stream.
At the same time, I take a picture using the camera and upload it somewhere.
This worked on Android.
Anyway, the iOS SDK just passes constraints to the WebRTC library and does not call AVCaptureSession methods directly. Unfortunately, we cannot affect the library code.
 

Max

Administrator
Staff member
It is not working in a background service; can you give me any suggestions?
First, please create a new topic for a new question; Android background execution is unrelated to the audio-only streaming issue discussed here.
Regarding background services, please read this page. Please ask further questions in a new topic.
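As a generic illustration only (not Flashphoner-specific; the class and channel names here are assumptions), long-running streaming on Android is usually kept alive with a foreground service, so the OS does not tear down network receivers mid-call:
Code:
import android.app.Notification;
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import androidx.core.app.NotificationCompat;

public class CallService extends Service {
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // the "calls" notification channel must be created elsewhere on API 26+
        Notification n = new NotificationCompat.Builder(this, "calls")
                .setContentTitle("Call in progress")
                .setSmallIcon(android.R.drawable.sym_action_call)
                .build();
        startForeground(1, n); // promote the service so it survives backgrounding
        return START_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started service only, no binding
    }
}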
 

ett

Member
Anyway, the iOS SDK just passes constraints to the WebRTC library and does not call AVCaptureSession methods directly. Unfortunately, we cannot affect the library code.
I understand that you do not call it directly.
So, for example, even though the video constraint in the options is false, when I call
Code:
/* FPWCSApi2Room in FPWCSApi2 */
- (FPWCSApi2Stream *)publish:(RTCEAGLVideoView *)display withOptions:(FPWCSApi2StreamOptions *)options;
isn't something like this being called underneath? Perhaps it is?
Code:
/* RTCPeerConnectionFactory in WebRTC */
- (RTCAVFoundationVideoSource *)avFoundationVideoSourceWithConstraints:(nullable RTCMediaConstraints *)constraints;
 

ett

Member
My sincere apologies.
The problem was elsewhere; there was nothing wrong with the WCS iOS SDK.
Thank you.
 

Max

Administrator
Staff member
We fixed the issue with audio-only publishing from Android SDK 1.0 applications in build 1.0.1.75.
 

ett

Member
That is very good news.
However, I must apologize despite your great work:
we have already started to move away from supporting Android 5.0, and it is unclear whether we will support older Android versions in the future.
Anyway, thank you very much.
 