Hi friends.
We have a commercial Android app that uses a SIP stack for voice. We want to migrate that technology to WebRTC and WCS, and in fact we have prepared a testbed with these characteristics:
- Android app using the Flashphoner Android SDK
- WCS as a WebRTC-to-SIP gateway
- Asterisk
- External SBC (acting as a SIP-to-TDM gateway)
When we make a call from the Android app to a public telephone number, the flow is as follows:
The Android app (on symmetric 300 Mbps Wi-Fi) places a WebRTC call to WCS. On the same machine, WCS sends the call via SIP to the local Asterisk, and Asterisk sends the call to the SBC, which delivers it to a fixed-line phone. The problem is that the audio level on the fixed-line side is very, very low. We are using G.729, and I don't know whether we can configure that level in WCS or in the Android SDK.
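In the meantime, we are thinking of compensating with a transmit-gain boost in the Asterisk dialplan. A sketch only; the extension pattern, gain value, and the peer name "sbc-trunk" are just examples for our setup:

```
; Boost the audio level sent towards the SBC/PSTN leg.
; VOLUME(TX) applies a gain multiplier to the transmitted audio.
; Note: applying VOLUME forces transcoding, so the G.729 codec path
; (licences/translators) must allow it.
exten => _X.,1,Set(VOLUME(TX)=4)
 same => n,Dial(SIP/sbc-trunk/${EXTEN})   ; "sbc-trunk" is a placeholder peer name
```

Would this be the right place to fix it, or is there a level setting on the WCS/SDK side we should use instead?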
Thanks in advance.
Mario