How to get payload.streamName using the Android SDK?

ett

Member
The test result is the same, sure. I know "we see Audio muted: true", as I said.
But is playStream?.onStreamEvent called at that time?


I am stuck because playStream?.onStreamEvent is not called just by pressing PLAY. Why won't you answer whether playStream?.onStreamEvent is called?
First, please check it and let me know, with logs, whether playStream?.onStreamEvent is called or not.

Code:
             playStream?.onStreamEvent({streamEvent in
+               print("onStreamEvent is called")
                if (streamEvent?.type == FPWCSApi2Model.streamEventType(toString: .fpwcsStreamEventTypeAudioMuted)) {
 

Max

Administrator
Staff member
I am stuck because playStream?.onStreamEvent is not called just by pressing PLAY.
When you press PLAY, the following happens:
1. playStream request is sent via websocket
2. notifyAudioCodec and setRemoteSDP are received from WCS
3. notifyStreamStatusEvent is received from WCS. It contains muted/unmuted status (see below)
4. playStream?.onStreamEvent is called
(Two screenshots: Chrome browser console logs of the websocket signaling, including the notifyStreamStatusEvent message with the muted/unmuted status.)

So the answer is yes: playStream?.onStreamEvent will be called, but not right when PLAY is pressed. The event is fired asynchronously, after the WebRTC connection is established.
You just have to handle this event to detect whether the stream is muted.
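For illustration, here is a minimal sketch of such a handler in the Swift sample's style. The audioUnmuted case is an assumption based on the event names visible in the websocket messages; adjust it to the actual enum cases in your SDK version.
Code:
// Sketch: react to mute/unmute StreamEvents on the playing stream.
playStream?.onStreamEvent({ streamEvent in
    if (streamEvent?.type == FPWCSApi2Model.streamEventType(toString: .fpwcsStreamEventTypeAudioMuted)) {
        // The publisher called muteAudio()
        print("Remote audio muted")
    } else if (streamEvent?.type == FPWCSApi2Model.streamEventType(toString: .fpwcsStreamEventTypeAudioUnmuted)) {
        // The publisher called unmuteAudio() (enum case assumed)
        print("Remote audio unmuted")
    }
})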
 

ett

Member
4. playStream?.onStreamEvent is called
Are those really the MediaDevicesSwift example's logs? They don't look like Xcode logs.
Please show me the Xcode logs after you add the following and test.
Code:
// wcs-ios-sdk-samples/Swift/MediaDevices/ViewController.swift
             playStream?.onStreamEvent({streamEvent in
+               print("playStream?.onStreamEvent is called")
                if (streamEvent?.type == FPWCSApi2Model.streamEventType(toString: .fpwcsStreamEventTypeAudioMuted)) {
If the answer is yes, as you said, then "playStream?.onStreamEvent is called" should appear in the Xcode logs. Please pay attention: do not touch the "Mute Audio" switch.

Is it really true that the MediaDevicesSwift example's playStream?.onStreamEvent is called with a payload after tapping "PLAY"?
The log image you posted shows a StreamStatusEvent, not a StreamEvent.

According to my test results, the MediaDevicesSwift example's playStream?.onStreamEvent is not called after tapping "PLAY".
My question is: in your test, after tapping "PLAY", is the MediaDevicesSwift example's playStream?.onStreamEvent called or not?
 

Max

Administrator
Staff member
Are those really the MediaDevicesSwift example's logs? They don't look like Xcode logs.
These are Chrome browser console logs, shown to illustrate the websocket signaling. The signaling is the same for the Web SDK, the Android SDK, and the iOS SDK.
We raised ticket WCS-3272 to add logging to the onStreamEvent handler. We will let you know the results here.
 

ett

Member
Just run it in Xcode; you can test it right away.
First, please show me the Xcode console output.

I am not talking about WebSocket signaling.
I am simply asking whether the callback playStream?.onStreamEvent has been called or not.
And you said "4. playStream?.onStreamEvent is called".
So print("playStream?.onStreamEvent is called") should be shown, right?
 

ett

Member
The stream state is passed via websocket, so that is what we explained.
I know, I know, sure.
But I said that in the MediaDevicesSwift example the callback playStream?.onStreamEvent is NOT CALLED after tapping "PLAY".
And you said that in the MediaDevicesSwift example the callback playStream?.onStreamEvent IS CALLED after tapping "PLAY".
Why the difference? I just added print(...) to wcs-ios-sdk-samples/Swift/MediaDevices/ViewController.swift#L278, built it in Xcode, and followed the same steps as you.

Even looking at your WebSocket logs, I cannot tell whether the MediaDevicesSwift example's callback playStream?.onStreamEvent is called after tapping "PLAY". So I want your application's output as evidence.
 

ett

Member
Additional information for you.
I tried wcs-android-sdk-samples/media-devices (commit fd3946c4c196fbb7896aa96fc2788771b4bfc929 (HEAD -> 1.1, tag: 628b018, origin/HEAD, origin/1.1)).
The result: onStreamEvent IS called after PLAY, correctly.

The two changes from the original are here.
Code:
--- media-devices/src/main/res/values/strings.xml
+++ media-devices/src/main/res/values/strings.xml
@@ -21,7 +21,7 @@
     <string name="turn_on_flashlight">Turn on flashlight</string>
     <string name="turn_off_flashlight">Turn off flashlight</string>
     <string name="action_switch_renderer">Switch Renderer</string>
-    <string name="wcs_url">ws://192.168.0.1:8080/</string>
+    <string name="wcs_url">wss://demo.flashphoner.com:8443/</string>
     <string name="camera_fps">25</string>
     <string name="camera_width">640</string>
     <string name="camera_height">480</string>
Code:
--- media-devices/src/main/java/com/flashphoner/wcsexample/mediadevices/MediaDevicesActivity.java
+++ media-devices/src/main/java/com/flashphoner/wcsexample/mediadevices/MediaDevicesActivity.java
@@ -808,6 +807,7 @@ public void run() {

                 @Override
                 public void onStreamEvent(StreamEvent streamEvent) {
+                    Log.e(TAG, "======================================= onStreamEvent: " + streamEvent.getType());
                     runOnUiThread(new Runnable() {
                         @Override
                         public void run() {
The result after tapping "PLAY" is below.
Code:
(((...OMITTED...)))
07-29 09:51:30.953 26639 26787 I agc_manager_direct.cc: (line 457): AgcManagerDirect::Initialize
07-29 09:51:30.954 26639 26802 I agc_manager_direct.cc: (line 294): [agc] Initial GetMicVolume()=12
07-29 09:51:30.963 26639 26682 I Session : Invoker class method: notifyStreamStatusEvent
07-29 09:51:30.964 26639 26682 E com.flashphoner.wcsexample.mediadevices.MediaDevicesActivity: ======================================= onStreamEvent: audioUnmuted
07-29 09:51:30.965 26639 26682 E com.flashphoner.wcsexample.mediadevices.MediaDevicesActivity: ======================================= onStreamEvent: videoUnmuted
07-29 09:51:31.007 26639 26817 I probe_controller.cc: (line 378): kWaitingForProbingResult: timeout
07-29 09:51:31.035 26639 26811 I HardwareVideoEncoder: Sync frame generated
07-29 09:51:31.036 26639 26811 I HardwareVideoEncoder: Prepending config frame of size 30 to output buffer with offset 0, size 4608
(((...OMITTED...)))
Again:
in wcs-ios-sdk-samples/WCSExample/MediaDevices and wcs-ios-sdk-samples/Swift/MediaDevices,
the playing stream's onStreamEvent callback is not called after tapping "PLAY" for me.
What is different between your setup and mine? I used an iPad Air 2 with iOS 14.6 for testing.
 

Max

Administrator
Staff member
Again:
in wcs-ios-sdk-samples/WCSExample/MediaDevices and wcs-ios-sdk-samples/Swift/MediaDevices,
the playing stream's onStreamEvent callback is not called after tapping "PLAY" for me.
What is different between your setup and mine? I used an iPad Air 2 with iOS 14.6 for testing.
Please be patient. We will let you know about the progress here.
 

Max

Administrator
Staff member
We added logging of StreamEvent receiving to both the SDK and the samples in build 2.6.55.
And you're right: StreamEvent is not fired when a subscriber connects to the stream. In this case, the stream status can be obtained in the StreamStatusPlaying event handler using the Stream.getAudioState() and Stream.getVideoState() methods. Please see the code examples here.
This is implemented in both the Web SDK and the iOS SDK; in the Android SDK the stream status is wrapped into a StreamEvent on the client side. We will fix this behaviour later, in ticket WCS-3275, so that it is the same in all three SDKs.
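For illustration, a minimal Swift sketch of this approach, assuming the Objective-C API quoted later in this thread (on:callback:, kFPWCSStreamStatusPlaying, getAudioState, getVideoState) bridges to Swift as written:
Code:
// Sketch: query the initial mute state once the stream reaches PLAYING.
// StreamEvent is not fired on subscribe, so read the state explicitly here.
playStream?.on(kFPWCSStreamStatusPlaying) { stream in
    let audioMuted = stream?.getAudioState()?.muted ?? false
    let videoMuted = stream?.getVideoState()?.muted ?? false
    print("On PLAYING: audioMuted=\(audioMuted), videoMuted=\(videoMuted)")
}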
 

ett

Member
1.
And you're right: StreamEvent is not fired when a subscriber connects to the stream.
But I said that the behavior "StreamEvent is not fired when the subscriber connects to the stream" happens on iOS, BUT NOT on Android.
What do you think about that point?

No, a StreamEvent is received only on publishStream.muteAudio()/publishStream.unmuteAudio(), not on playStream.play().

<UPDATED>Now I have discovered the following:
- NG: androidPublishStream.publish() / iosPlayStream.onStreamEvent().play() does NOT get a StreamEvent after PLAYING.
- NG: iosPublishStream.publish() / iosPlayStream.onStreamEvent().play() does NOT get a StreamEvent after PLAYING.
- OK: iosPublishStream.publish() / androidPlayStream.on(streamEventHandler).play() DOES get a StreamEvent after PLAYING.
In all cases, a StreamEvent arrives if the publishing side calls muteAudio()/unmuteAudio().

2.
the StreamStatusPlaying event handler using the Stream.getAudioState() and Stream.getVideoState() methods.
I haven't gotten to that point yet. Because you said "onStreamEvent is called" many times, the conversation has not moved forward.

Do you remember what I said to you?
Hi Max, how about this?

That is not because of onStreamEvent. It is because of the code below, right?

Code:
    [_remoteStream on:kFPWCSStreamStatusPlaying callback:^(FPWCSApi2Stream *stream){
        [self changeStreamStatus:stream];
        [self onStarted];
        _useLoudSpeaker.control.userInteractionEnabled = YES;
        [_remoteControl onAudioMute:[stream getAudioState].muted]; // <----- HERE
        [_remoteControl onVideoMute:[stream getVideoState].muted];
    }];
And it will return incorrect results for a mixer stream,
because we need the mute state of each stream participating in the mixer stream, not of the mixer stream itself.
This is the main issue.

- Origin :: publishing mixer://mixer1
- - User1 :: publishing stream1 to mixer1
- - User2 :: publishing stream2 with muteAudio() to mixer1
- - User3 :: publishing stream3 to mixer1
- User4 :: playing mixer://mixer1

When the viewer User4 is playing mixer://mixer1, how can we find out whether User2's stream2 is muted using Stream.getAudioState()?
User4 has only one stream, which is playing mixer://mixer1, so Stream.getAudioState() returns the state of mixer://mixer1.
 

Max

Administrator
Staff member
But I said that the behavior "StreamEvent is not fired when the subscriber connects to the stream" happens on iOS, BUT NOT on Android.
What do you think about that point?
This will be fixed in ticket WCS-3275, as we wrote above.
When the viewer User4 is playing mixer://mixer1, how can we find out whether User2's stream2 is muted using Stream.getAudioState()?
User4 has only one stream, which is playing mixer://mixer1, so Stream.getAudioState() returns the state of mixer://mixer1.
This will be fixed in ticket WCS-3266.
 

ett

Member
This is implemented in both the Web SDK and the iOS SDK....
We will fix this behaviour later, in ticket WCS-3275, so that it is the same in all three SDKs.
I could not quite get your point. What exactly will you do in WCS-3275?
I feel that "This is implemented ... We will fix ..." refers to Stream.getAudioState() and Stream.getVideoState().
That is great, but I have been explaining that onStreamEvent is not called when a subscriber connects to the stream in the iOS SDK (and the Web SDK?).

1. WCS-3275 makes the Android SDK (and, as a result, all SDKs) support Stream.getAudioState() and Stream.getVideoState().
2. WCS-3275 makes the iOS SDK and the Web SDK (and, as a result, all SDKs) call onStreamEvent when a subscriber connects to the stream.
3. WCS-3275 makes the Android SDK (and, as a result, all SDKs) NOT call onStreamEvent when a subscriber connects to the stream.

Which of these is it?
There is the handling of the muted/unmuted state on connect, and the handling of mute/unmute toggled in the middle of the stream.
To make these two paths share common processing, option 2 would be preferable.



This will be fixed in ticket WCS-3266.
Fantastic!
I take it that we will be able to get the mute/unmute state of each stream participating in a mixer stream when a subscriber connects to the mixer stream.
Is ticket WCS-3266 a realization of this behavior? If so, it is a great ticket.
 

Max

Administrator
Staff member
1. WCS-3275 makes the Android SDK (and, as a result, all SDKs) support Stream.getAudioState() and Stream.getVideoState().
This option.
In all the SDKs you should (see the sketch below):
- to get the stream status right after subscribing to the stream, handle the STREAM_STATUS.PLAYING event and get the state using Stream.getAudioState() and Stream.getVideoState();
- to get the stream status when the publishing client calls muteAudio or muteVideo, use the StreamEvent handler.
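A minimal Swift sketch combining both handlers. Here updateAudioMuteIndicator is a hypothetical app-side helper, and the audioUnmuted enum case is an assumption; the other names follow the samples quoted in this thread.
Code:
// Hypothetical app-side helper, not part of the SDK.
func updateAudioMuteIndicator(_ muted: Bool) {
    print("audio muted: \(muted)")
}

// 1. Initial state: query it once the stream reaches PLAYING.
playStream?.on(kFPWCSStreamStatusPlaying) { stream in
    updateAudioMuteIndicator(stream?.getAudioState()?.muted ?? false)
}

// 2. Later changes: the publisher called muteAudio()/unmuteAudio().
playStream?.onStreamEvent({ streamEvent in
    if (streamEvent?.type == FPWCSApi2Model.streamEventType(toString: .fpwcsStreamEventTypeAudioMuted)) {
        updateAudioMuteIndicator(true)
    } else if (streamEvent?.type == FPWCSApi2Model.streamEventType(toString: .fpwcsStreamEventTypeAudioUnmuted)) {
        updateAudioMuteIndicator(false)
    }
})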
I take it that we will be able to get the mute/unmute state of each stream participating in a mixer stream when a subscriber connects to the mixer stream.
Is ticket WCS-3266 a realization of this behavior?
Yes. A mixer stream subscriber already receives a StreamEvent when an incoming stream publisher mutes/unmutes audio or video. We will add this event to also be received when a subscriber has just connected to the mixer stream and one or more streams in the mixer are already muted.
 

ett

Member
I understand everything. Thank you for the reply.

Let me advise you of something related to this.
I am having a conversation with multiple people in RoomApp by doing Participant->play() with each other (not using MCU).
There, I am experiencing a phenomenon where, if I call publishStream->muteAudio() multiple times or FPWCSApi2->getAudioManager()->muteAudio()/unmuteAudio() multiple times, the sent audio and received audio are not always muted as intended. This is very troubling.

Please pay close attention to this point when testing your implementation.
 

Max

Administrator
Staff member
I am having a conversation with multiple people in RoomApp by doing Participant->play() with each other (not using MCU).
There, I am experiencing a phenomenon where, if I call publishStream->muteAudio() multiple times or FPWCSApi2->getAudioManager()->muteAudio()/unmuteAudio() multiple times, the sent audio and received audio are not always muted as intended. This is very troubling.
We will check this.
 

ett

Member
I'm sorry, there is something I forgot to tell you.
It happens when three or more people are in a room.
Here is what happens:
0. User1 calls (the first time) publishStream.muteAudio() => publishStream.unmuteAudio(). So far, no problems.
1. User1 calls (the second time) publishStream.muteAudio().
2. User2 receives StreamEvent{audioMuted,payload=User1}.
3. User3 receives StreamEvent{audioMuted,payload=User1}.
4. As a result, User2 cannot hear User1's voice. However, User3 can still hear User1's voice.

Unfortunately, it is difficult to prepare a reliable way to reproduce it. (I don't have time to create a sample.)

We will check this.
I really appreciate your cooperation.
 

Max

Administrator
Staff member
2. User2 receives StreamEvent{audioMuted,payload=User1}.
3. User3 receives StreamEvent{audioMuted,payload=User1}.
Please clarify: do you use RoomApi or the MCU mixer? The payload member is filled only when a user receives an outgoing mixer stream (MCU or not) and an incoming stream publisher calls muteAudio().
Also, please reproduce the issue in the WebSDK MCU Client example; it will be easier to debug.
4. As a result, User2 cannot hear User1's voice. However, User3 can still hear User1's voice.
The stream is always muted on the publisher side (it sends silence), so no subscriber should be able to hear a user who has successfully called muteAudio(). You can check this via User1's stream audio bitrate; it should be about 14000 or lower.
 

ett

Member
Please clarify: do you use RoomApi or the MCU mixer? The payload member is filled only when a user receives an outgoing mixer stream (MCU or not) and an incoming stream publisher calls muteAudio().
I am so sorry; I gave you wrong information, and I was also wrong about the results.
My application also contains code that uses a mixer, and because of that I made a mistake. The mixer is not used in this issue.

Again, I will explain what I confirmed:
- WCS 5.2.996, iOS SDK 2.6.48, Android SDK 1.1.0.32.
- I am using RoomApi, no MCU, and no mixer.
- User1 (iOS), User2 (iOS), and User3 (Android) are in the room, and all of them are publishing.
- User1 is playing two streams, User2's and User3's. User2 is playing two streams, User1's and User3's. User3 is playing two streams, User1's and User2's.
- User1 calls (the first time) publishStream.muteAudio() => publishStream.unmuteAudio(). So far, no problems.


1. User1 calls (the second time) publishStream.muteAudio().
2. User2 receives StreamEvent{audioMuted} via participantStream->onStreamEvent. The participantStream is created by room->getParticipants()->find(User1,User3)->play() in room->onStateCallback.
3. User3 receives StreamEvent{audioMuted} via participantStream->onStreamEvent (almost the same as 2. above, except ->find(User1,User2)).
4. As a result, User2 and User3 can still hear User1's voice even though User1 muted.

In this case, it seems that User1's microphone mute is not working.
I tried RTCAudioSession->inputGain=0 in addition to publishStream.muteAudio(), but User1 still was not muted.
I hope you can help me.
 