Search results

  1. andrew.n

    App crash from SDK: com.flashphoner.FPWCSApi2SessionQueue FBSSerialQueue assertBarrierOnQueue

    @Max We had a crash from the SDK a few days ago: Device information: Model: iPhone 14 Pro Max Orientation: Portrait RAM free: 156.61 MB Disk free: 137.65 GB Operating System Version: 16.0.0 Jailbroken: No Crash Date: Nov 17, 2022, 2:29:48 AM Crashlytics - Stack trace check...
  2. andrew.n

    Enable camera after app starts (CallKit flow)

    @Max should we use the same FPWCSApi2Session object for both the video stream and the audio stream, or should I have 2 different session objects?
  3. andrew.n

    Enable camera after app starts (CallKit flow)

    @Max Yes, I have supportsVideo = true, and also includesCallsInRecents = true, but I don't think that can affect this. We couldn't build the CallKitDemo example because we have to prepare the certificates and so on... It might take a little while to make CallKitDemo reproduce the issue. I have to...
  4. andrew.n

    Enable camera after app starts (CallKit flow)

    @Max the first problem was solved because I commented out a few lines of code: Do you have any idea why this can happen? I don't understand how those 4 parameters can affect the execution. func reportIncomingCall(uuid: UUID, handle: String, user: User, completion: ((NSError?) -> Void)? = nil) {...
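To make the question concrete, here is a minimal sketch of how such a report function is typically structured. The `CallCapabilities` struct and `makeCallUpdate` helper are hypothetical stand-ins (the poster's exact four parameters are not shown in the snippet); only the `CXCallUpdate` fields themselves are real CallKit API:

```swift
import Foundation

// Hypothetical stand-in for the kind of capability flags the poster
// experimented with commenting out; not part of any SDK.
struct CallCapabilities {
    var supportsHolding = false
    var supportsGrouping = false
    var supportsUngrouping = false
    var supportsDTMF = false
}

#if canImport(CallKit)
import CallKit

// Sketch: build the CXCallUpdate that reportIncomingCall would pass to
// provider.reportNewIncomingCall(with:update:completion:).
func makeCallUpdate(handle: String, hasVideo: Bool,
                    caps: CallCapabilities) -> CXCallUpdate {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: handle)
    update.hasVideo = hasVideo
    update.supportsHolding = caps.supportsHolding
    update.supportsGrouping = caps.supportsGrouping
    update.supportsUngrouping = caps.supportsUngrouping
    update.supportsDTMF = caps.supportsDTMF
    return update
}
#endif
```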
  5. andrew.n

    Enable camera after app starts (CallKit flow)

    @Max I did some debugging and managed to find the possible issue. When the app is already running, everything works perfectly, BUT when the app is closed and the device is locked, the function "func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {" is not called. I...
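The ordering constraint behind this finding can be sketched as follows. `CallAudioGate` is a hypothetical helper (not part of CallKit or FPWCSApi2) that makes the rule explicit: audio should only start once CallKit reports the session as active via `provider(_:didActivate:)`:

```swift
// Platform-neutral sketch of the ordering constraint: the SDK's audio
// must not start until CallKit activates the audio session.
// CallAudioGate is a hypothetical helper, not part of either SDK.
enum AudioSessionState { case inactive, active }

struct CallAudioGate {
    private(set) var state: AudioSessionState = .inactive
    mutating func didActivate() { state = .active }
    mutating func didDeactivate() { state = .inactive }
    var mayStartAudio: Bool { state == .active }
}

#if canImport(CallKit)
import CallKit
import AVFoundation

// The delegate callback the poster found was skipped when the app was
// launched from a locked device; streaming audio should only start here.
final class CallProviderDelegate: NSObject, CXProviderDelegate {
    var gate = CallAudioGate()

    func providerDidReset(_ provider: CXProvider) {
        gate.didDeactivate()
    }

    func provider(_ provider: CXProvider,
                  didActivate audioSession: AVAudioSession) {
        gate.didActivate()
        // resume/start the stream's audio units here, not earlier
    }

    func provider(_ provider: CXProvider,
                  didDeactivate audioSession: AVAudioSession) {
        gate.didDeactivate()
        // stop audio units here
    }
}
#endif
```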
  6. andrew.n

    Enable camera after app starts (CallKit flow)

    @Max regarding the CallKit integration on iOS: right now, after reporting the incoming call to CallKit, we use the following setup: FPWCSApi2MediaConstraints(audio: true, videoWidth: videoWidth, videoHeight: videoHeight) and the broadcastURLStream?.muteVideo() method. After the user opens the...
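The flow described in that post (publish with video constraints but muted, then enable the camera once the user opens the app) can be sketched with a stand-in. `StreamStub` only mirrors the `muteVideo()` method named in the post plus an assumed `unmuteVideo()` counterpart; it is not the FPWCSApi2 API:

```swift
// Stand-in for the flow described above: video is in the constraints
// but muted while the call is answered from the lock screen, then the
// camera is enabled when the app comes to the foreground.
// StreamStub is hypothetical; it only tracks state.
final class StreamStub {
    private(set) var videoMuted = false
    func muteVideo() { videoMuted = true }
    func unmuteVideo() { videoMuted = false }
}

// After reporting the incoming call: audio-only on the lock screen.
// (The real constraints would be FPWCSApi2MediaConstraints(audio: true,
// videoWidth: videoWidth, videoHeight: videoHeight).)
func startCallInBackground(_ stream: StreamStub) {
    stream.muteVideo()
}

// When the user opens the app, switch the published stream to video.
func enableCameraOnForeground(_ stream: StreamStub) {
    stream.unmuteVideo()
}
```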
  7. andrew.n

    Swift Package Manager support

    https://github.com/facebookincubator/SocketRocket/issues/643
  8. andrew.n

    Swift Package Manager support

    @Max https://github.com/jsonmodel/jsonmodel/pull/649 this? still open...
  9. andrew.n

    Swift Package Manager support

    @Max https://github.com/livekit/WebRTC-swift with Swift Package Manager support :-? Could this help you?
  10. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max Any ETA on this?
  11. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max Great, thanks. It looks like the stream is not switching from "Publishing" to "Published". Let me know when it's solved.
  12. andrew.n

    Filters (beautify, AR, etc) on live streaming

    I tried both options; initially I applied it only to the publish stream :)
  13. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max The limit on the form is 30 MB, so I added the zip to a Google Drive and attached the link to the ticket :)
  14. andrew.n

    Filters (beautify, AR, etc) on live streaming

    A small note on this: I managed to include GPUImage v3 in the project. It had to be added manually because it's not available via CocoaPods, so first you have to remove v1 from CocoaPods and then add GPUImage v3 as a manual subproject.
  15. andrew.n

    Filters (beautify, AR, etc) on live streaming

    If I press again, the first line in the debugger changes: 2022-08-29 12:21:51.848065+0300 GPUImageDemoSwift[66565:4778931] [FPWCSApi2WebRTCMediaProvider] Finded device: Front Camera 2022-08-29 12:21:51.848145+0300 GPUImageDemoSwift[66565:4778931] [FPWCSApi2WebRTCMediaProvider] Try to find...
  16. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max I'm playing around with the GPUImageDemoViewController. I managed to start the stream and play it (it uses the back camera), and I added another button to switch the camera: @IBAction func didTapSwitchButton(_ sender: UIButton) { publishStream!.switchCamera() } The camera is not...
  17. andrew.n

    Filters (beautify, AR, etc) on live streaming

    Also, I noticed that you use GPUImage v1 (last updated 6 years ago); there is a 3rd version of this framework: https://github.com/BradLarson/GPUImage3 Did you consider switching from v1 to v3?
  18. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max What about face detection? To apply some filters only to the user's face?
  19. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max I started to look into the project, but I want to ask: is there any documentation covering all the filters? Like a preview so we can see how each of them looks without having to test them one by one? Also, is there any guide on how to create your own filter using your SDK?
  20. andrew.n

    Filters (beautify, AR, etc) on live streaming

    @Max Everything works great with the SDK, and of course new requests came up, such as using filters during the live stream. I want to ask if you have had that kind of request before, or if there is another SDK we can add on top of the current FlashPhoner SDK to handle this request. It will be useful if...