How to get payload.streamName using the Android SDK?

Max

Administrator
Staff member
Good day.
In the iOS SDK you have direct access to the StreamEvent.payload members, but in the Android SDK they are accessible only through public methods. We raised the ticket WCS-3251 to add an access method for the incoming stream name. We will let you know here.
 

ett

Member
I understand.
You mean the Android SDK needs a getPayload() method, but it does not have one yet.
Please do it as soon as possible.
 

Max

Administrator
Staff member
Since Android SDK build 1.1.0.32 it is possible to detect the status of a mixer's incoming streams while playing the mixed stream. Please see the example here.
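For readers following along, the per-participant bookkeeping this enables could look roughly like the sketch below. This is a hypothetical, self-contained illustration, not SDK code: `StreamEvent`, its `audioMuted`/`audioUnmuted` types, and the `streamName` field mirror what this thread describes, but the real classes live in com.flashphoner.fpwcsapi.session and their exact signatures may differ.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-ins for the SDK's StreamEvent and its payload.
enum StreamEventType { audioMuted, audioUnmuted, videoMuted, videoUnmuted }

class StreamEvent {
    final StreamEventType type;
    final String streamName; // corresponds to payload.streamName in the thread

    StreamEvent(StreamEventType type, String streamName) {
        this.type = type;
        this.streamName = streamName;
    }
}

// Tracks which incoming mixer streams are audio-muted, based on the
// StreamEvents received while playing the mixed stream.
class MixerMuteTracker {
    private final Map<String, Boolean> audioMuted = new HashMap<>();

    void onStreamEvent(StreamEvent event) {
        switch (event.type) {
            case audioMuted:   audioMuted.put(event.streamName, true);  break;
            case audioUnmuted: audioMuted.put(event.streamName, false); break;
            default: break; // video events would be tracked the same way
        }
    }

    // Unknown streams are treated as unmuted by default.
    boolean isAudioMuted(String streamName) {
        return audioMuted.getOrDefault(streamName, false);
    }
}
```

Wiring `MixerMuteTracker.onStreamEvent` into the real handler registered on the playing stream would keep a current mute map for every participant of the mix.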
 

ett

Member
Thank you very much. I will try it now!

In addition, on Android there seems to be no way to get the mute status when the stream is PLAYING. How can I get it?
Code:
// iOS SDK can do it.
stream.onCallback(kFPWCSStreamStatus.Playing, (stream) => {
  console.log(stream.getName(), " is ", stream.getAudioState().muted ? "muted" : "unmuted")
});
Code:
// Android SDK can NOT do it.
ignitionStream.on(new com.flashphoner.fpwcsapi.session.StreamEventHandler({
  onStreamStatus: (stream, status) => {
    console.log(stream.getName(), " is ", ???????? ? "muted" : "unmuted");
  }
}));
StreamEvent fires only when the mute state changes in the middle of the stream, but I need to know the status immediately after playback starts.
It would be simpler to get the mute status directly from com.flashphoner.fpwcsapi.session.Stream.
 

ett

Member
One more thing: session.getStreams() is also not available in the Android SDK, right? The iOS SDK has it.
 

Max

Administrator
Staff member
In addition, on Android there seems to be no way to get the mute status when the stream is PLAYING. How can I get it?
You can get mute/unmute status as shown here
Code:
@Override
public void onStreamEvent(StreamEvent streamEvent) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            switch (streamEvent.getType()) {
                case audioMuted: mAudioMuteStatus.setText(getString(R.string.audio_mute_status)+"true"); break;
                case audioUnmuted: mAudioMuteStatus.setText(getString(R.string.audio_mute_status)+"false"); break;
                case videoMuted: mVideoMuteStatus.setText(getString(R.string.video_mute_status)+"true"); break;
                case videoUnmuted: mVideoMuteStatus.setText(getString(R.string.video_mute_status)+"false");
            }
        }
    });
}
That is enough for all cases: a StreamEvent is received right after the stream becomes PLAYING.
One more thing, session.getStreams() is also not available in Android SDK, right? iOS SDK can do it.
Yes, the object models of the Android and iOS SDKs differ slightly. In the Android SDK (and in the iOS Swift SDK too) the developer should maintain the session's stream list. You always call the Session.createStream() method, so you always know all the streams you are publishing and playing.
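The bookkeeping Max describes can be as small as a map keyed by stream name. The sketch below is a minimal, hypothetical illustration of that pattern; `Stream` here is only a stand-in for `com.flashphoner.fpwcsapi.session.Stream`, and the registry class itself is not part of the SDK.

```java
import java.util.Collection;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Stand-in for com.flashphoner.fpwcsapi.session.Stream (hypothetical).
interface Stream {
    String getName();
}

// Keeps track of every stream the app created via Session.createStream(),
// emulating the session.getStreams() call that the iOS SDK offers.
class StreamRegistry {
    private final Map<String, Stream> streams = new ConcurrentHashMap<>();

    // Call right after Session.createStream(), for both publish and play.
    void register(Stream stream) {
        streams.put(stream.getName(), stream);
    }

    // Call from the STOPPED / FAILED status handlers.
    void unregister(Stream stream) {
        streams.remove(stream.getName());
    }

    Collection<Stream> getStreams() {
        return streams.values();
    }
}
```

With this in place, the app can enumerate its live streams at any time, which is the piece the Android SDK leaves to the application.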
 

ett

Member
You can get mute/unmute status as shown here
Yes, I can get mute/unmute status.
That is enough for all cases: a StreamEvent is received right after the stream becomes PLAYING.
No, a StreamEvent is received only on publishStream.muteAudio()/publishStream.unmuteAudio(), not when playStream.play() is called.

UPDATE: Here is what I have found so far.
- NG: androidPublishStream.publish() / iosPlayStream.onStreamEvent().play() does NOT receive a StreamEvent after PLAYING.
- NG: iosPublishStream.publish() / iosPlayStream.onStreamEvent().play() does NOT receive a StreamEvent after PLAYING.
- OK: iosPublishStream.publish() / androidPlayStream.on(streamEventHandler).play() receives a StreamEvent after PLAYING.
In all cases, a StreamEvent arrives when the publishing side calls muteAudio()/unmuteAudio().

- WCS 5.2.992
- Android SDK 1.1.0.32
- iOS SDK 2.6.48

I will continue to investigate some more.

----------

In Android SDK, StreamEvent.getPayload() works fine. Thank you.

----------

the developer should maintain the session's stream list
Okay, I see. It seems a bit tedious.
 

Max

Administrator
Staff member
No, a StreamEvent is received only on publishStream.muteAudio()/publishStream.unmuteAudio(), not when playStream.play() is called.
We cannot reproduce the problem:
1. Publish a stream on WCS using MediaDevices example (WebSDK or Android SDK)
2. Mute the stream audio
3. After that, play the stream in Android SDK example MediaDevices
4. It shows Audio muted: true
Try to reproduce the problem in the MediaDevices example (source on GitHub). If the problem does not reproduce there, the possible reason is in your code. Please modify the MediaDevices example minimally to reproduce the problem and send the code using this form.
 

Max

Administrator
Staff member
Please clarify: What application do you use for testing and what SDK?
If the problem reproduces in the Media Devices example application, please describe the steps to reproduce it. If not, please modify the MediaDevices example minimally to reproduce the problem and send the code using this form.
I am using RoomApp. When using RoomApp, do I also get a StreamEvent (with Payload.streamName) right after status=PLAYING?
Yes. You can get access to the Stream object and then define an onStreamEvent handler function for this object.
 

Max

Administrator
Staff member
1. publish stream on an origin server.
2. play stream using iOS SDK on an edge server.
We cannot reproduce the issue in MediaDevicesSwift example (code on GitHub):
1. Publish a stream named test on the Origin server from the WebSDK Media Devices example
2. Mute the stream audio in the WebSDK Media Devices example
3. Open the Media Devices Swift application on an iPhone and play the stream named test from the Edge server
4. The Remote settings view shows Audio muted: true
Note that you cannot test like this in the Objective-C Media Devices example, because it cannot play a stream with a given name; it plays only the same stream that was published from it before. So you probably modified the example code.
Please test the MediaDevicesSwift example out of the box. If the problem does not reproduce, please modify the MediaDevices example minimally to reproduce the problem and send the code using this form.
Also note that the WCS build should be 5.2.966 or newer on both servers to send stream events between Origin and Edge and through a mixer.
 

ett

Member
I do not understand what you are saying.
I tried the MediaDevices example.

1.
Code:
    [_remoteStream onStreamEvent:^(FPWCSApi2StreamEvent *streamEvent){
        NSLog(@"No remote stream, %@", streamEvent.type);
    }];
When onStreamEvent is called, the "No remote stream, ..." output should be shown.

2.
You said
That's enough for all the cases. StreamEvent is receiving right after stream became PLAYING.
But "No remote stream, ..." is not shown in the logs immediately after PLAYING, which is unexpected.
(On the other hand, Android SDK shows onStreamStatus=PLAYING and onStreamEvent=audioUnmuted almost at the same time.)
As I've said before, when I turn the "Mute Audio" switch on or off, "No remote stream, ..." is shown as expected.

3.
I want a StreamEvent immediately after PLAYING,
because when a user plays the mixer stream, the user needs to know who is muted.
This behavior works correctly in the Android SDK.




These are the logs.
// immediately after PLAYING
// onStreamEvent is NOT called, but it should be, so that the mute state of each stream can be determined.
Code:
2021-07-23 19:16:01.737015+0900 MediaDevices[2096:1172569] [FPWCSApi2MediaConnection] PeerConnectionState change 0
2021-07-23 19:16:01.739540+0900 MediaDevices[2096:1172569] [FPWCSApi2MediaConnection] didChangeIceConnectionState 1
2021-07-23 19:16:03.072450+0900 MediaDevices[2096:1172569] [FPWCSApi2MediaConnection] didChangeIceConnectionState 2
2021-07-23 19:16:03.072913+0900 MediaDevices[2096:1172569] [FPWCSApi2MediaConnection] didChangeIceConnectionState 3
2021-07-23 19:16:03.338396+0900 MediaDevices[2096:1172695] [FPWCSApi2Stream] Update stream state 3, media session id is 4B81CE6F-B144-449D-A351-898F2F8FE92D
2021-07-23 19:16:03.339452+0900 MediaDevices[2096:1172550] muted 0
2021-07-23 19:16:03.608356+0900 MediaDevices[2096:1172717] [FPWCSApi2Session] No such callback method notifyVideoFormat with data (
// when turn off the "Mute Audio" switch
// then onStreamEvent is called
Code:
2021-07-23 19:16:45.520988+0900 MediaDevices[2096:1172550] No remote stream, audioUnmuted
2021-07-23 19:16:45.521189+0900 MediaDevices[2096:1172550] muted 0
 

ett

Member
Hi Max, how about this?

4. The Remote settings view shows Audio muted: true
That is not because of onStreamEvent. It is because of the code below, right?

Code:
    [_remoteStream on:kFPWCSStreamStatusPlaying callback:^(FPWCSApi2Stream *stream){
        [self changeStreamStatus:stream];
        [self onStarted];
        _useLoudSpeaker.control.userInteractionEnabled = YES;
        [_remoteControl onAudioMute:[stream getAudioState].muted]; // <----- HERE
        [_remoteControl onVideoMute:[stream getVideoState].muted];
    }];
And it will return incorrect results for a mixer stream,
because we need the mute state of each stream participating in the mixer, not of the mixer stream itself.
 

ett

Member
Of course, it has the same problem.
Even when the "PLAY" button is tapped, playStream?.onStreamEvent is NOT called, the same as in the Objective-C version.

Have you done any testing in Media Devices Swift?
Did you see playStream?.onStreamEvent being called when you tapped the "PLAY" button? YES or NO?

----

You said before:
4. It shows Audio muted: true
That is caused by self.remoteViewController!.onAudioMute(stream.getAudioState()?.muted ?? false),
and NOT by playStream?.onStreamEvent, right?

This is because playStream?.onStreamEvent is not called when "PLAY" is tapped.
playStream?.onStreamEvent is called only when you flip the "Mute Audio" switch.
 

Max

Administrator
Staff member
We've done the following test with the latest MediaDevicesSwift example build:
1. Connect to our demo server, publish stream streamName
1627376917184.png

2. Tap Local Settings. In this view, switch Mute Audio on, then tap Back
1627376999597.png

3. Play the stream streamName
1627377248319.png

4. Tap Remote Settings. In this view, we see Audio muted: true
1627377362711.png

Does the example work differently in your test?
 