[URGENT] Support external camera

GlebPrischepa

New Member
Hi Flashphoner,
I really like your SDK. Thanks for your solution!

I have an app with a custom camera implementation, and I would like to use my camera as the local source.
What is the best way to connect my camera implementation with your SDK?

Thanks in advance.
Looking forward to your reply.
 

Max

Administrator
Staff member
Good day.
GlebPrischepa said:
I have an app with a custom camera implementation, and I would like to use my camera as the local source.
What is the best way to connect my camera implementation with your SDK?
You should expose your custom camera to the browser (actually, to the operating system, by installing the appropriate device drivers you have presumably written). Then you can choose the camera from the input devices list:
[screenshot: the camera appears in the browser's input devices list]

Please read here how to select a camera using the Web SDK. Also, please inspect the Media Devices example.
 

GlebPrischepa

New Member
It looks like you did not understand me.
I am developing an Android app that uses my own Camera2 implementation.

I downloaded your samples from https://github.com/flashphoner/wcs-android-sdk-samples and the SDK from https://docs.flashphoner.com/display/ANDROIDSDK11EN/Android+SDK+release+notes. There was an issue when I ran the video_chat sample, but the generated video-chat-release.apk worked perfectly fine.

I would like to know the best way to use my custom Android Camera2 implementation with your SDK to transfer frames to your server.

Thanks
 

Max

Administrator
Staff member
GlebPrischepa said:
It looks like you did not understand me.
Unfortunately, you didn't specify a platform. By default, we assume questions are about the Web SDK when no platform is mentioned.
GlebPrischepa said:
I would like to know the best way to use my custom Android Camera2 implementation with your SDK to transfer frames to your server.
Please check whether your custom camera implementation is available to org.webrtc.Camera2Capturer and org.webrtc.Camera2Enumerator. If it is, please let us know and we'll raise a ticket to support it (currently, the Android SDK supports Camera1 only). If it is not, we probably can't help you, because any input device must be accessible through the org.webrtc interfaces in order to capture anything.
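The compatibility check described here can be sketched as follows. The real org.webrtc.Camera2Enumerator is constructed with an Android Context and lists the device ids that the Camera2 service exposes; the `CameraEnumerator` interface below is a minimal stand-in mirroring its shape (everything except the org.webrtc names mentioned above is hypothetical), so that the check logic itself is visible:

```java
import java.util.Arrays;

public class CameraCheck {
    // Minimal stand-in for org.webrtc.CameraEnumerator, which the real
    // Camera2Enumerator implements (it is constructed with an Android Context).
    interface CameraEnumerator {
        String[] getDeviceNames();
    }

    // A camera is usable by org.webrtc.Camera2Capturer only if the enumerator
    // reports it among the device names exposed by the Camera2 API.
    static boolean isExposedToCamera2(CameraEnumerator enumerator, String cameraId) {
        return Arrays.asList(enumerator.getDeviceNames()).contains(cameraId);
    }

    public static void main(String[] args) {
        // Hypothetical device list, as a Camera2Enumerator might report it.
        CameraEnumerator enumerator = () -> new String[] {"0", "1"};
        System.out.println(isExposedToCamera2(enumerator, "0"));       // prints true
        System.out.println(isExposedToCamera2(enumerator, "virtual")); // prints false
    }
}
```

A purely software camera, whose frames are produced in app code rather than backed by a Camera2 device id, would fail this check, which is why the thread later moves toward a custom capturer.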
 

GlebPrischepa

New Member
Max, thanks for your response.
I have just checked Camera2Capturer, but it looks like this is not what I am looking for in your SDK.
Please let me explain my use case again:

My app uses a custom Android Camera2 implementation and contains a lot of useful features for our users.
Now we want to integrate WebRTC to our app.

We want to use our camera with our custom rendering, and your SDK for transferring and receiving frames.

I am asking whether there is a way to disable your camera implementation.
Can we set a frame listener to forward frames from my camera to your server?

Your SDK looks very attractive to us, except for the camera integration pipeline.
Let me know if we can schedule a call to discuss these issues.

Looking forward to your reply.
 

GlebPrischepa

New Member
I am using your Android SDK downloaded from https://docs.flashphoner.com/display/ANDROIDSDK11EN/Android+SDK+release+notes.
I see that the Stream.java class and its publish method instantiate a hardcoded Camera1 or Camera2 implementation to open the camera and transfer frames.
There is no clear way to inject my own solution into Stream.java, and this is the biggest blocker I see.

I would like to schedule a call with the tech team regarding my issue and an upcoming SDK purchase.
 

Max

Administrator
Staff member
GlebPrischepa said:
I see that the Stream.java class and its publish method instantiate a hardcoded Camera1 or Camera2 implementation to open the camera and transfer frames.
Under the hood, the WCS Android SDK uses org.webrtc.Camera1Capturer to capture frames from the selected camera (and org.webrtc.Camera1Enumerator to select a camera), then uses the networking part of org.webrtc to send data to the server.
If you implement your own custom CameraCapturer, and we allow custom capturer and Camera2Enumerator usage in the Android SDK, the goal seems achievable.
If this approach suits you, we'll raise a ticket.
Integrating the WebRTC library and implementing custom capture also seems to be the only approach if you prefer to add WebRTC support without our SDK. In that case, you can implement the WebRTC connection part yourself and use the raw WebSocket API to publish a stream to WCS.
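A sketch of what such a custom capturer could look like. The real org.webrtc.VideoCapturer interface exposes initialize/startCapture/stopCapture and delivers frames to a CapturerObserver; the `FrameSink` interface and all class names below are my own stand-ins so the sketch stays self-contained. The idea is that the app's existing Camera2 pipeline pushes frames in, and the capturer forwards them only while capture is running:

```java
public class CustomCameraCapturer {
    // Stand-in for org.webrtc.CapturerObserver, which receives captured frames.
    public interface FrameSink {
        void onFrameCaptured(byte[] nv21, int width, int height, long timestampNs);
    }

    private final FrameSink sink;
    private volatile boolean capturing = false;

    public CustomCameraCapturer(FrameSink sink) {
        this.sink = sink;
    }

    // Mirrors VideoCapturer.startCapture(width, height, framerate); the actual
    // capture format is left to the app's own Camera2 pipeline in this sketch.
    public void startCapture() {
        capturing = true;
    }

    // Mirrors VideoCapturer.stopCapture().
    public void stopCapture() {
        capturing = false;
    }

    // Called by the app's Camera2 pipeline for every frame it produces;
    // frames are forwarded to the sink only while capture is running.
    public void pushFrame(byte[] nv21, int width, int height, long timestampNs) {
        if (capturing) {
            sink.onFrameCaptured(nv21, width, height, timestampNs);
        }
    }
}
```

In a real integration the sink's implementation would wrap the bytes into an org.webrtc video frame and hand it to the peer connection's video track; here it is kept abstract on purpose.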
 

GlebPrischepa

New Member
Implementing a custom CameraCapturer sounds good to me.
How long will it take to implement your part of the feature?

And I need the same implementation for iOS as well.
How can I achieve the same behavior on iOS?
 

Max

Administrator
Staff member
GlebPrischepa said:
Implementing a custom CameraCapturer sounds good to me.
How long will it take to implement your part of the feature?
We have raised internal ticket WCS-2805 and will let you know about the results.
We do not provide an ETA for new feature requests, because priority depends on user votes for the feature.
GlebPrischepa said:
And I need the same implementation for iOS as well.
How can I achieve the same behavior on iOS?
Please read this thread for an example. In the iOS case, however, you should link against libwebrtc, use hints from Stack Overflow, and implement the WebRTC networking part (RTCPeerConnection) plus the raw WebSocket API for WCS WebSocket connections.
 

Max

Administrator
Staff member
Good day.
Since Android SDK build 1.1.0.26, it is possible to use a custom software camera implementation to capture video. So you can implement a custom Java class to capture an image and apply a filter. Please read the details here.
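As an illustration of the kind of per-frame filter such a custom capture class can apply, here is a grayscale filter for an NV21 frame (the class and method names are mine, not the SDK's). NV21 stores the full-resolution Y (luma) plane first, followed by an interleaved VU chroma plane at quarter resolution, so setting every chroma byte to the neutral value 128 drops the color while leaving brightness intact:

```java
public class Nv21Filters {
    // Grayscale an NV21 frame in place: keep the Y (luma) plane, neutralize
    // the interleaved VU (chroma) plane that follows it.
    public static void grayscale(byte[] nv21, int width, int height) {
        int lumaSize = width * height;              // Y plane: one byte per pixel
        for (int i = lumaSize; i < nv21.length; i++) {
            nv21[i] = (byte) 128;                   // 128 = no chroma offset
        }
    }

    public static void main(String[] args) {
        int w = 4, h = 2;
        byte[] frame = new byte[w * h * 3 / 2];     // NV21 is 1.5 bytes per pixel
        java.util.Arrays.fill(frame, (byte) 7);     // fake camera data
        grayscale(frame, w, h);
        System.out.println(frame[0]);               // luma untouched: prints 7
        System.out.println(frame[w * h]);           // chroma neutralized: prints -128
    }
}
```

The same shape works for any per-pixel effect: the custom capture class produces or receives the frame bytes, runs a filter like this, and only then hands the frame on to the SDK for encoding and publishing.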
 