Arpit Parekh
New Member
We are working on an iOS app using Swift.
We are following the VideoChat example application for a two-person audio/video call.
We have two classes:
RoomManager - holds the FPWCS library instances, including WebRTCView.
RoomViewController - handles the UI for the remote stream and the local stream.
Scenario: a two-person audio call
1. User1 connects, joins the room, and publishes the stream.
2. User2 receives the call while the app is in the background and the device is locked, and answers it via CallKit in the CXAnswerCallAction handler shown below. We then connect and join the room in the background.
3. While publishing the stream, we cannot hear any audio at the remote end.
The code for step 1 follows, then the CallKit answer handler for step 2.
Swift:
public func connect(serverUrl: String,
                    userName: String,
                    roomName: String,
                    isAudioCall: Bool,
                    isOutgoingCall: Bool = false) {
    print("\(#function)")
    self.isAudioOnly = isAudioCall
    self.isOutgoingCall = isOutgoingCall
    // Note: the event sent here is "disconnected", but the flag that is actually used is isConnecting
    self.updateStatus(event: .fpwcsRoomManagerEventDisconnected, isConnecting: true)
    let options = FPWCSApi2RoomManagerOptions()
    options.urlServer = serverUrl
    options.username = userName
    do {
        wcsRoomManager = try FPWCSApi2.createRoomManager(options)
        wcsRoomManager?.on(.fpwcsRoomManagerEventConnected, callback: { roomManager in
            print("callback fpwcsRoomManagerEventConnected")
            self.joinRoom(roomName: roomName)
        })
    } catch {
        print("wcsRoomManager error while creating WCS session: \(error.localizedDescription)")
    }
}
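Since for User2 the connection is made from the background, we are also considering registering the disconnected event next to the connected one, so a failed background connect is visible. A small sketch reusing the same .on(_:callback:) API and the updateStatus helper from connect():
Swift:
// Sketch only: register the disconnected event as well, so a connect attempt
// made from the background (User2 answering via CallKit) can be detected.
wcsRoomManager?.on(.fpwcsRoomManagerEventDisconnected, callback: { roomManager in
    print("callback fpwcsRoomManagerEventDisconnected")
    // Same status helper used in connect(serverUrl:...) above.
    self.updateStatus(event: .fpwcsRoomManagerEventDisconnected, isConnecting: false)
})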
Swift:
private func joinRoom(roomName: String) {
    print(#function)
    let roomOptions = FPWCSApi2RoomOptions()
    roomOptions.name = roomName
    room = wcsRoomManager?.join(roomOptions)
    room?.onStateCallback({ room in
        print("callback room joined")
        self.delegate?.onUserJoined()
        self.publishStream(localDisplay: self.localParticipantVideoVW,
                           shouldRecord: false)
    })
}
Swift:
public func publishStream(localDisplay: WebRTCView,
                          shouldRecord: Bool) {
    print("#localStream publishing stream")
    let options = FPWCSApi2StreamOptions()
    options.record = shouldRecord
    localStream = room?.publish(localDisplay.videoView,
                                with: options)
    localStream?.on(.fpwcsStreamStatusPublishing, callback: { [weak self] stream in
        guard let self = self else { return }
        print("#localStream participant stream published successfully.")
        if self.isAudioOnly {
            print("#localStream isAudioOnly")
            stream?.muteVideo()
        }
        self.delegate?.onStreamPublished()
    })
}
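For the audio-only case, instead of publishing audio+video and then calling muteVideo(), we are wondering whether the stream can be published audio-only from the start. A sketch of what we have in mind inside RoomManager; it assumes the FPWCSApi2MediaConstraints(audio:video:) initializer and the constraints property on FPWCSApi2StreamOptions used in the Flashphoner examples are available in our SDK version:
Swift:
// Sketch: publish audio only for an audio call (assumes FPWCSApi2MediaConstraints
// and FPWCSApi2StreamOptions.constraints exist in our FPWCS SDK version).
public func publishAudioOnlyStream(localDisplay: WebRTCView,
                                   shouldRecord: Bool) {
    let options = FPWCSApi2StreamOptions()
    options.record = shouldRecord
    // Ask for an audio-only stream up front instead of muting video after publish.
    options.constraints = FPWCSApi2MediaConstraints(audio: true, video: false)
    // The publish API still takes a display view, so the same view is passed here.
    localStream = room?.publish(localDisplay.videoView, with: options)
    localStream?.on(.fpwcsStreamStatusPublishing, callback: { [weak self] stream in
        guard let self = self else { return }
        print("#localStream audio-only stream published")
        self.delegate?.onStreamPublished()
    })
}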
User2 answers the call in this handler, and from it we connect and join the room while still in the background:
Swift:
func provider(_ provider: CXProvider, perform action: CXAnswerCallAction)
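A simplified sketch of how the answer action is handled; the server/user/room values are placeholders, and the AVAudioSession configuration is the generic CallKit recommendation (configure here, let the system activate it), not anything FPWCS-specific:
Swift:
import AVFoundation
import CallKit

final class CallController: NSObject, CXProviderDelegate {
    // Placeholders standing in for our real objects and the VoIP push payload.
    var roomManager: RoomManager!                       // the class shown above
    var serverUrl = "wss://wcs-server.example:8443/ws"  // placeholder
    var roomName = "room1"                              // placeholder
    var userName = "user2"                              // placeholder

    func providerDidReset(_ provider: CXProvider) { }

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Configure (but do not activate) the audio session; CallKit activates it
        // and then calls provider(_:didActivate:).
        do {
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                            mode: .voiceChat,
                                                            options: [.allowBluetooth])
        } catch {
            print("audio session configuration failed: \(error.localizedDescription)")
        }

        // Connect and join from the background; audio-only call for User2.
        roomManager.connect(serverUrl: serverUrl,
                            userName: userName,
                            roomName: roomName,
                            isAudioCall: true)

        action.fulfill()
    }
}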
Is there any way to make WebRTCView work in the background, or to publish the stream audio-only, so that the remote end can hear us?
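On the background-audio part of the question: our understanding is that with an active CallKit call, audio can keep running while the app is in the background or the device is locked, provided the Audio and Voice over IP background modes are enabled for the target, and that the microphone and speaker only become usable after provider(_:didActivate:) fires. A sketch of the activation callbacks we could hook (the commented line is an assumption about the FPWCS stream API, by analogy with the muteVideo() used above):
Swift:
// Same file as the CallController sketch above.
extension CallController {
    // CallKit activates the audio session on our behalf; audio capture/playback
    // should only be expected to work from this point on.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        print("CallKit did activate audio session")
        // Assumption: FPWCS streams expose unmuteAudio() alongside muteVideo();
        // if so, the published audio could be (re)enabled here.
        // roomManager.localStream?.unmuteAudio()
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        print("CallKit did deactivate audio session")
    }
}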