Audio devices on Android

+1 vote
335 views

I'm a newbie in Android development and I must say that I'm a little confused by its audio framework. I'm working on porting Android to our custom board and have almost everything working except GSM audio.
I suspect it should be relatively easy to mark one stream as output and another as input for voice calls, but I can't figure out how. Our SoC has two PCM interfaces. The first drives the audio codec and provides the speaker and microphone; it works as audio.primary and Android applications can use it. The second is the GSM audio interface, and it works at the kernel ALSA level (I can use the command-line aplay to play and capture sound). Now I need to connect the two together: to get sound from GSM into the speaker, and in the other direction from the microphone to GSM.

I think the audio_policy.conf file has something to do with this, but after checking the audio.h header I still don't know whether AUDIO_DEVICE_IN_COMMUNICATION or AUDIO_DEVICE_IN_VOICE_CALL should be used for GSM, which output device to pick, and so on.
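
For reference, a minimal sketch of the stock audio_policy.conf layout, with the codec declared as the primary module. The sampling rates, channel masks, and device lists below are illustrative assumptions for a hypothetical board, not values for any particular one:

    # Sketch of audio_policy.conf (stock AOSP syntax). All rates, masks, and
    # device lists here are illustrative assumptions, not real board values.
    audio_hw_modules {
      primary {
        outputs {
          primary {
            sampling_rates 44100
            channel_masks AUDIO_CHANNEL_OUT_STEREO
            formats AUDIO_FORMAT_PCM_16_BIT
            devices AUDIO_DEVICE_OUT_SPEAKER|AUDIO_DEVICE_OUT_EARPIECE
            flags AUDIO_OUTPUT_FLAG_PRIMARY
          }
        }
        inputs {
          primary {
            sampling_rates 8000|16000|44100
            channel_masks AUDIO_CHANNEL_IN_MONO
            formats AUDIO_FORMAT_PCM_16_BIT
            devices AUDIO_DEVICE_IN_BUILTIN_MIC|AUDIO_DEVICE_IN_VOICE_CALL
          }
        }
      }
    }

Note that on a typical AOSP port the in-call routing between the modem PCM and the codec is set up inside the primary audio HAL when the policy manager switches to AUDIO_MODE_IN_CALL, rather than by declaring the GSM PCM as a separate stream in this file.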

I know this description is a bit messy, but I hope someone can point me in the right direction; my question will be clearer then.

posted May 14, 2013 by anonymous


Similar Questions
+1 vote

I'm looking for a way to route different audio streams to different output devices, for example a phone call going to Bluetooth while my MP3 player plays through the speaker. Is that possible in Android?

Is there any hack in any of the Android/ALSA layers that can do the trick?
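
A minimal sketch of the per-stream knobs the public API does offer; the class name SplitRouting is made up, the Context is assumed to be a normal app Context, and whether voice and media can actually diverge this way is ultimately up to the device's audio policy:

    import android.content.Context;
    import android.media.AudioManager;
    import android.media.MediaPlayer;

    public class SplitRouting {
        // Sketch: request the voice path over Bluetooth SCO while media stays
        // on the STREAM_MUSIC route; the audio policy has the final say.
        public static MediaPlayer routeCallToBtKeepMusicLocal(Context context) {
            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            am.setMode(AudioManager.MODE_IN_COMMUNICATION);
            am.startBluetoothSco();       // bring up the SCO link for the voice stream
            am.setBluetoothScoOn(true);   // voice goes over Bluetooth

            MediaPlayer player = new MediaPlayer();
            player.setAudioStreamType(AudioManager.STREAM_MUSIC); // media keeps its own route
            return player;
        }
    }

Below the API, per-stream device selection lives in the audio policy manager, so a deeper hack would likely mean patching the policy layer rather than ALSA itself.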

+4 votes

I am trying to record audio on an Android device, but I want to mute the device in code when no gain is received. If anybody has an idea about this, please share it. With earphones connected I can mute by pressing the mute button, but I want to edit the code and mute by default.
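
A rough sketch of one way to do this in app code, assuming "no gain" means the captured level stays under some threshold. The class name GainGate and the SILENCE_THRESHOLD value are made up for illustration; RECORD_AUDIO and MODIFY_AUDIO_SETTINGS permissions are assumed:

    import android.content.Context;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class GainGate {
        // Hypothetical gate: anything quieter counts as "no gain received".
        private static final int SILENCE_THRESHOLD = 500;

        public static void recordWithAutoMute(Context ctx) {
            int rate = 44100;
            int bufSize = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16_BIT);
            AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    rate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16_BIT, bufSize);
            AudioManager am = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);

            short[] buf = new short[bufSize / 2];
            rec.startRecording();
            int n = rec.read(buf, 0, buf.length);
            int peak = 0;
            for (int i = 0; i < n; i++) peak = Math.max(peak, Math.abs(buf[i]));
            // Mute the capture path when the measured level stays under the gate.
            am.setMicrophoneMute(peak < SILENCE_THRESHOLD);
            rec.stop();
            rec.release();
        }
    }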

+1 vote

The current AudioFlinger format of stereo 16-bit PCM at 44.1 kHz (−32,768 to +32,767) is a little outdated. Audio quality is degraded, especially when lossless sources have to be downsampled. In addition, more and more high-quality recordings are being released, not only at CD level. So it is necessary to move to 24-bit 96 kHz (−8,388,608 to +8,388,607) and take advantage of the higher SNR as well (each extra bit buys roughly 6 dB of dynamic range, so 24-bit raises the theoretical ceiling from about 96 dB to about 144 dB).

I don't know the major obstacle to upgrading to hi-res audio. I guess hardware limits, bad AD/DA converters, difficulty of implementing it on the software side, etc.
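
For what it's worth, the app-facing API already accepts more than 16 bits on API 21 (Lollipop) and later via float PCM. A sketch, with the hypothetical class name HiResTrack; whether the mixer and DAC keep that precision end to end is device dependent:

    import android.media.AudioAttributes;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class HiResTrack {
        // Sketch, assuming API 21+: float samples at 96 kHz sidestep the
        // 16-bit limit at the app layer.
        public static AudioTrack create() {
            AudioFormat fmt = new AudioFormat.Builder()
                    .setEncoding(AudioFormat.ENCODING_PCM_FLOAT)
                    .setSampleRate(96000)
                    .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                    .build();
            int bufSize = AudioTrack.getMinBufferSize(96000,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_FLOAT);
            return new AudioTrack(new AudioAttributes.Builder()
                    .setUsage(AudioAttributes.USAGE_MEDIA)
                    .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                    .build(), fmt, bufSize, AudioTrack.MODE_STREAM,
                    AudioManager.AUDIO_SESSION_ID_GENERATE);
        }
    }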

+2 votes

I am using the AOA 2.0 protocol to transfer the mobile's media audio to my head unit. If the mobile is not also connected to the HU via BT, the audio transfer over AOA 2.0 works perfectly fine. But if the mobile is paired with the HU over BT, the media audio is routed through A2DP instead. Is this generic behavior?

Is there a specific rule about which protocol is used when multiple protocols are available for audio routing, or is it device dependent?

+3 votes

I found a config value in packages/apps/Bluetooth/res/values/config.xml and a hidden class:

  • profile_supported_hfpclient, which defaults to false
  • android.bluetooth.BluetoothHeadsetClient, which provides the HFP client API, such as dial and accept call

I set profile_supported_hfpclient to true, then rebuilt and flashed the system. I wrote a test app using the BluetoothHeadsetClient API, and it can dial and accept calls. However, there is no voice during the call: the speaker makes no sound, and the voice picked up by the mic isn't transferred over Bluetooth SCO.

So does Android fully support the HFP client role? How can I make the voice path work?
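
Since BluetoothHeadsetClient is hidden, a test app has to reach it via reflection or be built against the platform. A sketch of bringing up the SCO audio link that way; the profile id value (16 in AOSP sources) and the connectAudio() signature are hidden details that vary between releases, so treat both as assumptions:

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import android.bluetooth.BluetoothProfile;
    import android.content.Context;
    import java.lang.reflect.Method;

    public class HfpClientAudio {
        // BluetoothProfile.HEADSET_CLIENT is @hide; 16 is its value in AOSP
        // sources at the time of writing -- an assumption, not public API.
        private static final int HEADSET_CLIENT = 16;

        public static void connectScoAudio(Context ctx, final BluetoothDevice ag) {
            BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
            adapter.getProfileProxy(ctx, new BluetoothProfile.ServiceListener() {
                @Override
                public void onServiceConnected(int profile, BluetoothProfile proxy) {
                    try {
                        // connectAudio() asks the stack to open the SCO link;
                        // with no SCO link, no call audio flows either way.
                        Method connectAudio = proxy.getClass()
                                .getMethod("connectAudio", BluetoothDevice.class);
                        connectAudio.invoke(proxy, ag);
                    } catch (ReflectiveOperationException e) {
                        e.printStackTrace();
                    }
                }
                @Override
                public void onServiceDisconnected(int profile) { }
            }, HEADSET_CLIENT);
        }
    }

If connectAudio() succeeds but the call is still silent, the remaining gap is usually on the platform side: the audio HAL has to route the SCO PCM when the stack opens it, which is device bring-up work rather than app work.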

...