Call Kit and Flashphoner

Max

Administrator
Staff member
Everything works without SIP setup. Is SIP setup mandatory for this issue?
Please clarify the following:
1. Did you update both SDK and WCS?
2. Does the issue reproduce in the Call Kit Demo example built from the source? If not, please modify the example source code to reproduce it and send the code using this form
 

wyvasi

Member
Can we publish audio without using SIP when we receive a CallKit notification?
We don't need SIP, and I don't see why I would complicate things by integrating it when we can send/receive push notifications and publish normally.
We can't modify your demo because it contains a lot of SIP logic. Maybe I wasn't clear before: we want to publish audio (WebRTC). It looks like it connects and publishes, but the device is not sending any audio to the server (this behavior only happens when the phone is locked).
 

Max

Administrator
Staff member
Can we publish audio without using SIP when we receive a CallKit notification?
As mentioned in the official docs, CallKit is intended for handling VoIP calls, so SIP logic seems to be required. We raised ticket WCS-3473 to investigate this.
Please also clarify how you are sending the push notification. We implemented it on the WCS side, triggered by an incoming SIP call. What is your flow?
 

wyvasi

Member
I am sending notifications using this package: https://www.npmjs.com/package/@parse/node-apn
This video call was built for browsers, and we wanted to add it to mobile. All events are sent over WebSocket, and for mobile we send notifications using this library to let the iOS app know when to publish, play, etc. Now, if we start a call from the web browser to the phone (while it is locked), audio plays on the phone, but not in the web browser.
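For reference, the iOS side of such a flow typically uses PushKit to receive the VoIP push and CallKit to report the incoming call. Below is a minimal sketch of that pattern; all names here (`VoIPPushHandler`, the "web-caller" handle) are illustrative and not part of the Flashphoner SDK:

```swift
import PushKit
import CallKit

// Minimal sketch: register for VoIP pushes and report each one to CallKit.
// Identifiers are illustrative, not Flashphoner SDK API.
final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    private let registry = PKPushRegistry(queue: .main)
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func start() {
        registry.delegate = self
        registry.desiredPushTypes = [.voIP] // ask PushKit for VoIP pushes
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send this token to the backend (e.g. the node-apn sender).
        let token = pushCredentials.token.map { String(format: "%02x", $0) }.joined()
        NSLog("VoIP push token: \(token)")
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // iOS requires reporting an incoming call to CallKit
        // for every received VoIP push.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "web-caller")
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}
```

Note that since iOS 13, failing to report an incoming call after a VoIP push causes the system to terminate the app, so the `reportNewIncomingCall` step is not optional.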
 

Max

Administrator
Staff member
Thanks for the clarification. It seems possible to implement this without SIP, but it requires changes on the WCS server side. We'll let you know in this topic when the feature is released.
 

Max

Administrator
Staff member
We investigated the issue. It seems no changes are needed on the server side or in the iOS SDK.
There is a method in the CXProviderDelegate protocol:
Code:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession)
We implemented it to answer the SIP call; see the ProviderDelegate.swift module:
Code:
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        NSLog("CKD - didActivate \(#function)")
        currentCall?.answer()
    }
You should also implement this method and publish the WebRTC stream there, for example:
Code:
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        NSLog("CKD - didActivate \(#function)")
        stream?.publish()
    }
In this case, audio should be published correctly.
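The companion callback, `didDeactivate`, can be used to stop publishing when the system tears down the audio session. A hedged sketch, assuming the `stream` object exposes a `stop()` method (verify against the actual Flashphoner iOS SDK API):

```swift
func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
    NSLog("CKD - didDeactivate \(#function)")
    // Assumption: stop() unpublishes the stream; the actual SDK call may differ.
    stream?.stop()
}
```

Handling deactivation symmetrically avoids leaving a stale publish session on the server after the call ends.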
 