CallKit and Flashphoner

andrew.n

Member
Context: We have to add a new feature to our app to support video streaming calls between 2 users. Same as Skype/Messenger but only for 2 users (no group support yet).

As I understand it, both Skype and Messenger use CallKit to properly handle communication between apps that support VoIP calling.
I ran the TwoWayStreaming demo project, but it only handles the case when the app is in the foreground.

My questions:
1) I want to be sure that "TwoWayStreaming" is the best approach for what we need right now.
2) How can I use Flashphoner with CallKit? If the phone is locked and the user receives a call, how can I handle that the same way I can with VoIP?
 

Max

Administrator
Staff member
Good day.
1) I want to be sure that "TwoWayStreaming" is the best approach for what we need right now.
Yes
2) How can I use Flashphoner with CallKit? If the phone is locked and the user receives a call, how can I handle that the same way I can with VoIP?
The Flashphoner SDK provides WebRTC streaming tools only. Even for SIP calls it is WebRTC under the hood; the SIP leg works on the server side.
So you can try to integrate the TwoWayStreaming example with CallKit, the same way Skype's developers do with their own code.
 

andrew.n

Member
Thank you Max for your response. We managed to integrate CallKit using PushKit as well, but I want to know if there are any guidelines for integrating Flashphoner with the flow we have with CallKit.
First, we have a small issue: if the phone is locked and the user answers, we have to establish the connection for a voice call only, and if the user then taps "Video" in the native UI, we can open the app and launch the video streaming.
Any advice is welcome :)
 

Max

Administrator
Staff member
Hello

You would need to use push notifications.
This is how it works (a rough client-side sketch follows the steps):

1. The user has an established WebSocket connection to the server.
2. The user locks the device; the connection is closed.
3. The user receives a push notification.
4. The user presses 'Answer'.
5. The user establishes a new connection and starts publishing / playing the stream.
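A rough Swift sketch of the client side of this flow, assuming PushKit delivers the VoIP push and CallKit shows the incoming-call UI. The reconnectAndPublish() helper is hypothetical; the actual session/stream setup is whatever the Two Way Streaming example already does in the foreground:
Code:
import Foundation
import PushKit
import CallKit

final class CallCoordinator: NSObject, PKPushRegistryDelegate, CXProviderDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration(localizedName: "MyApp"))
    private let voipRegistry = PKPushRegistry(queue: DispatchQueue.main)

    override init() {
        super.init()
        provider.setDelegate(self, queue: nil)
        voipRegistry.delegate = self
        voipRegistry.desiredPushTypes = [.voIP] // required to receive pushes while the device is locked
    }

    // MARK: PKPushRegistryDelegate

    func pushRegistry(_ registry: PKPushRegistry, didUpdate pushCredentials: PKPushCredentials, for type: PKPushType) {
        // Steps 1-2: send pushCredentials.token to your push backend so the server
        // can notify this device after the WebSocket connection is closed.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // Step 3: the push arrived; report the incoming call to CallKit right away
        // (mandatory for VoIP pushes on iOS 13+).
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "Caller")
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in completion() }
    }

    // MARK: CXProviderDelegate

    func providerDidReset(_ provider: CXProvider) {}

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Steps 4-5: the user pressed 'Answer'; open a new connection
        // and start publishing / playing the stream.
        reconnectAndPublish()
        action.fulfill()
    }

    private func reconnectAndPublish() {
        // Placeholder: create the Flashphoner session and streams here,
        // the same way the Two Way Streaming example does.
    }
}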

Regarding CallKit: we have never tested this.
 

andrew.n

Member
@Max
So, when the app is in the foreground, I start a call from the web app to the iOS app; on the iOS app I accept from the CallKit interface, show the video call screen, and everything works perfectly. (Also, when the call ends, I send the end-call transaction request to CallKit to remove the system call; a rough sketch is below.)
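For reference, the end-call request on our side is roughly this (callUUID is just the UUID we passed to reportNewIncomingCall when the call came in):
Code:
import Foundation
import CallKit

let callController = CXCallController()

/// Ask CallKit to remove the system call once the Flashphoner stream has stopped.
func endSystemCall(callUUID: UUID) {
    let endAction = CXEndCallAction(call: callUUID)
    callController.request(CXTransaction(action: endAction)) { error in
        if let error = error {
            print("Failed to end the system call: \(error)")
        }
    }
}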

If I move the app to the background (the phone is unlocked) and start a call from the web app, I can see the CallKit interface on the home screen (or wherever I am, maybe in another app). I accept the call, the OS opens my app, the video call screen is presented, and the streams work (the video part), but I cannot hear anything in the app or in the web app. I did some debugging with the backend guy and everything looks set up properly (basically I run the same code), but after 1 minute the streaming is stopped by the iOS app with the following message reported on the server: "Stopped by publisher stop".

Still investigating... any idea on this issue?

Later edit 1:
I updated from SDK version 2.6.36 to 2.6.73 (latest) and it looks like it is working when the app is in the background. The issue still happens if the app is closed (force closed) and I show the video call after:
1) I create the WebSocket connection
2) I present the first screen in the app (the key window is set up)
3) I do all the setup with Flashphoner (as I do normally when you are in the app...)
 
Last edited:

Max

Administrator
Staff member
I updated from SDK version 2.6.36 to 2.6.73 (latest) and it looks like it is working when the app is in the background. The issue still happens if the app is closed (force closed) and I show the video call after:
1) I create the WebSocket connection
2) I present the first screen in the app (the key window is set up)
3) I do all the setup with Flashphoner (as I do normally when you are in the app...)
Please clarify: do you use a WebRTC-SIP call, or just WebRTC two-way video chat?
For WebRTC-SIP calls received in the background, please wait for the WCS-3393 implementation.
For WebRTC two-way video chat, please check whether there is audio/video traffic between the two sides (you can test it in the WebSDK Media Devices example, which shows WebRTC publishing/playback statistics).
Also please check whether the problem reproduces when the application is started manually after force closing. If yes, please try to reproduce the issue in the Two Way Streaming example.
 

andrew.n

Member
Hey @Max
We are using TwoWayStreaming. There is no audio received on the backend after I accept the call.
I tried this example with audio: true, video: false to see if it works at least like that... but no.
If I use audio: true, video: true (or with videoWidth, videoHeight), we can see each other, but we can't hear each other, and after 1 minute the streaming stops.
 

Attachments

Max

Administrator
Staff member
Please reproduce the issue and collect a report on the server side as described here using the report.sh script. Also collect a traffic dump (the collection must be started before stream publishing). Send the report using this form.
 

Max

Administrator
Staff member
We checked the report. Unfortunately, the test stream (streamtest, we suppose) is published from Chrome 94 on Windows 10, but you mentioned the problem is in the iOS SDK application. Also, the server where the report was collected is the Origin, but the test stream seems to be played from some Edge server, so we cannot even guess which SDK is used to play it. Publishing debug logs were not collected, so we cannot confirm the presence of audio/video traffic.
So we recommend the following:
1. Explain the test clearly: what steps are performed, and exactly which SDK and example applications are used.
2. (We still suppose you are experiencing a problem with the iOS SDK.) Build the example application (Two Way Streaming, for example) from the GitHub sources and try to reproduce the issue in this application: force close it, then manually open it and publish a stream. Do not use the production server, deploy a staging one. Also, play the test stream from the same server where it is published.
3. If the problem does not reproduce, modify the example application code minimally to reproduce it and send this code using this form.
Please also note that when you forcefully close the application by swiping it away from the recent applications list, there should be no difference from a manual start when you receive a push notification, because the application is unloaded from memory at this point. Please check whether the application has all the necessary permissions for the microphone and camera (a quick check is sketched below). If no permission is granted for the microphone, the stream will be published without audio.
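A quick way to check this on the device is to log the AVFoundation authorization status, for example (Swift sketch):
Code:
import AVFoundation

// Log and, if necessary, request microphone/camera permissions.
// Without microphone permission the stream is published without audio.
func checkCapturePermissions() {
    let audio = AVCaptureDevice.authorizationStatus(for: .audio)
    let video = AVCaptureDevice.authorizationStatus(for: .video)
    print("microphone status: \(audio.rawValue), camera status: \(video.rawValue)")

    if audio != .authorized {
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            print("microphone access granted: \(granted)")
        }
    }
}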
 

wyvasi

Member
We reduced the app code to just publishing from iOS, and we play from the 'Media Devices' demo. We receive video but not audio.
We sent both the report and the pcap separately because of the file size upload limit.
Thanks!
 

Max

Administrator
Staff member
We reduced the app code to just publishing from iOS, and we play from the 'Media Devices' demo. We receive video but not audio.
Unfortunately, we have not received your report.
Please modify the iOS Two Way Streaming example application code minimally to reproduce the issue and send it using this form.
 

Max

Administrator
Staff member
About the report: there are no client debug logs provided, so we cannot parse the traffic dump. But in the publisher client logs we see that the audio RTP activity timeout is reached, so we suppose there is no audio traffic from the publisher.
We tried to reproduce the issue using the latest WCS build 5.2.1126 and the iOS SDK Two Way Streaming example built with 2.6.73. We cannot reproduce it.
Please try to disable RTP bundles on the server side:
Code:
rtp_bundle=false
Also, remove the following settings from flashphoner.properties.
Code:
rtp_paced_sender=true
rtp_paced_sender_initial_rate=200000
rtp_paced_sender_increase_interval=50
rtp_paced_sender_k_up=0.9
This does not concern the issue, but these settings must not be applied simultaneously with the streaming_distributor_subgroup_enabled=true option (as we pointed out in another topic earlier). Also, they may lead to excessive client logging.
If disabling RTP bundles does not help, we need a reproduction, so modify the iOS Two Way Streaming example application code minimally to reproduce the issue and send it using this form.
 

wyvasi

Member
Code:
rtp_bundle=false
didn't help.
We tried to publish to our server and to demo.flashphoner.com, but it still fails because of RTP audio activity.
When the app is open it is fine; the audio is missing when the app is closed and we start publishing via the push/CallKit flow.
I can't see how to make a demo for you to test. In order for you to test, you would need to:
receive a PushKit notification (app closed, and it requires certificates) -> accept the VoIP call (CallKit) -> start publishing. This is the case with the missing audio for which we can't find a solution. I don't think it is server related, because we used both demo.flashphoner.com and our own server.
I hope we can solve this issue somehow; I am open to suggestions on how to approach it.
 

Max

Administrator
Staff member
I can't see how to make a demo for you to test.
In this case, you have to wait for our CallKit usage example release (ticket WCS-3420). To resolve the issue we need a reproduction, so we will implement the example and check whether the issue reproduces.
We will let you know the results here.
 

Max

Administrator
Staff member
Good day.
We added the iOS Call Kit Demo Swift example, which uses CallKit and push notifications to receive incoming SIP calls. Note that this example works with iOS SDK build 2.6.80 and WCS build 5.2.1164. Also, it cannot be tested with our demo server, because APNs credentials should be set on the server side (see Server configuration), so you should use your own server instance to test.
 

andrew.n

Member
@Max
Hey Max, we added the new SDK and it works pretty well, except for one issue.
So we have the following scenarios:
✅ If the app is in the foreground, everything works fine.
✅ If the phone is unlocked and the app is in the background, everything works fine.
✅ If the phone is unlocked and the app is killed, everything works fine: the call is set up, and when the app launches it shows the video call screen.
❌ If the phone is locked and the app is killed, when you answer you can hear the other user, but the input from my device does not reach the web. If I unlock the phone, it opens the app and everything works fine, BUT while the phone is locked, the user who is calling me cannot hear me. From the phone I can hear the user from the web. I have checked that the stream.publish() method is called. On the phone (top left corner) I can see that the app is using the camera input... but the audio is not working somehow :|

* Everything works without the SIP setup. Is the SIP setup mandatory for this issue?
 