A Detailed Guide to Implementing Two-Way Call Auto-Recording on Android 6.0

This article walks through a working example of implementing two-way call auto-recording on Android 6.0. It is shared here for your reference; the details are as follows:

A project required two-way call auto-recording based on Android 6.0. After reading articles on monitoring the Android phone state and studying open-source recording projects on Git, I put this write-up together.

First, a quick introduction to monitoring the Android phone state (incoming and outgoing calls); see also:
http://www.jb51.net/article/32433.htm

Monitoring the phone's call state relies mainly on two classes:

TelephonyManager and PhoneStateListener

TelephonyManager provides access to information about the telephony services on the device. An application can use TelephonyManager to query the state of those services, and it can register a listener to be notified when the call state changes.

We cannot instantiate TelephonyManager ourselves; it can only be obtained as a system service:

Context.getSystemService(Context.TELEPHONY_SERVICE);

Note: reading certain phone information requires the corresponding permissions.
The main static constants are listed below (they correspond to the states reported when listening with PhoneStateListener.LISTEN_CALL_STATE):

int CALL_STATE_IDLE  //Idle: no call activity at all.
int CALL_STATE_OFFHOOK //Off-hook: at least one call is active (dialing, connected, or on hold) and no call is ringing or waiting.
int CALL_STATE_RINGING //Ringing: a new incoming call is ringing, or a new call is waiting while an existing call is in progress.
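
For reference, here is a minimal sketch (not part of the original project) of obtaining a TelephonyManager and reacting to these three states via PhoneStateListener.LISTEN_CALL_STATE; the class and method names are placeholders:

import android.content.Context;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
/**
 * Minimal example: branch on the call-state transitions reported by the system.
 */
public class CallStateMonitor {
  public static void register(Context context) {
    TelephonyManager tm = (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
    tm.listen(new PhoneStateListener() {
      @Override
      public void onCallStateChanged(int state, String incomingNumber) {
        switch (state) {
          case TelephonyManager.CALL_STATE_IDLE:    //no call activity
            break;
          case TelephonyManager.CALL_STATE_OFFHOOK: //dialing or in a call
            break;
          case TelephonyManager.CALL_STATE_RINGING: //incoming call ringing, number in incomingNumber
            break;
        }
      }
    }, PhoneStateListener.LISTEN_CALL_STATE);
  }
}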

The project monitors the call state from a service via broadcasts, so we also need the values that represent the call state in the broadcast intent:

EXTRA_STATE_IDLE //In the phone-state-changed broadcast, represents CALL_STATE_IDLE (idle).
EXTRA_STATE_OFFHOOK //In the phone-state-changed broadcast, represents CALL_STATE_OFFHOOK (off-hook).
EXTRA_STATE_RINGING //In the phone-state-changed broadcast, represents CALL_STATE_RINGING (ringing).
ACTION_PHONE_STATE_CHANGED //The action that identifies the phone-state-changed broadcast (intent).
//Note: requires the READ_PHONE_STATE permission (a runtime-permission sketch follows this list).
String EXTRA_INCOMING_NUMBER //Extra key for reading the incoming phone number from the broadcast.
String EXTRA_STATE //Extra key for reading the call state from the broadcast.
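
On Android 6.0 (API 23), READ_PHONE_STATE, PROCESS_OUTGOING_CALLS, RECORD_AUDIO and WRITE_EXTERNAL_STORAGE are "dangerous" permissions: declaring them in the manifest is no longer enough, they must also be granted at runtime. A minimal sketch of the runtime request (assuming it is called from an Activity; the helper class and request code are placeholders, not the author's code):

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
/**
 * Requests the dangerous permissions this project relies on, if they are not yet granted.
 */
public class PermissionHelper {
  private static final int REQUEST_CODE = 100; //arbitrary request code
  public static void requestIfNeeded(Activity activity) {
    String[] permissions = {
        Manifest.permission.READ_PHONE_STATE,
        Manifest.permission.PROCESS_OUTGOING_CALLS,
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
    };
    for (String p : permissions) {
      if (ContextCompat.checkSelfPermission(activity, p) != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(activity, permissions, REQUEST_CODE);
        return;
      }
    }
  }
}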

So how do we implement the call listening?

When the phone state changes, Android sends a broadcast with the action android.intent.action.PHONE_STATE, and when an outgoing call is placed it sends a broadcast with the action

public static final String ACTION_NEW_OUTGOING_CALL =
      "android.intent.action.NEW_OUTGOING_CALL";

Handling these two broadcasts in a custom broadcast receiver is all we need.

The Java code is given below (the Toast calls are only there to make testing easier):

package com.example.hgx.phoneinfo60.Recording;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.telephony.TelephonyManager;
import android.widget.Toast;
/**
 * Created by hgx on 2016/6/13.
 */
public class PhoneCallReceiver extends BroadcastReceiver {
  private int lastCallState = TelephonyManager.CALL_STATE_IDLE;
  private boolean isIncoming = false;
  private static String contactNum;
  Intent audioRecorderService;
  public PhoneCallReceiver() {
  }
  @Override
  public void onReceive(Context context, Intent intent) {
    //If this is an outgoing call
    if (intent.getAction().equals(Intent.ACTION_NEW_OUTGOING_CALL)){
      contactNum = intent.getExtras().getString(Intent.EXTRA_PHONE_NUMBER);
    }else //android.intent.action.PHONE_STATE. The Android docs define no dedicated action just for incoming calls, so anything that is not the outgoing-call broadcast is handled here as a phone-state change.
    {
      String state = intent.getExtras().getString(TelephonyManager.EXTRA_STATE);
      String phoneNumber = intent.getExtras().getString(TelephonyManager.EXTRA_INCOMING_NUMBER);
      int stateChange = 0;
      if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)){
        //Idle state
        stateChange = TelephonyManager.CALL_STATE_IDLE;
        if (isIncoming){
          onIncomingCallEnded(context, phoneNumber);
        }else {
          onOutgoingCallEnded(context, phoneNumber);
        }
      }else if (state.equals(TelephonyManager.EXTRA_STATE_OFFHOOK)){
        //Off-hook state
        stateChange = TelephonyManager.CALL_STATE_OFFHOOK;
        if (lastCallState != TelephonyManager.CALL_STATE_RINGING){
          //If the previous state was not "ringing", this call is an outgoing call
          isIncoming = false;
          onOutgoingCallStarted(context, phoneNumber);
        }else {
          //Otherwise this call is an incoming call
          isIncoming = true;
          onIncomingCallAnswered(context, phoneNumber);
        }
      }else if (state.equals(TelephonyManager.EXTRA_STATE_RINGING)){
        //Ringing state; the caller's number arrives in EXTRA_INCOMING_NUMBER
        stateChange = TelephonyManager.CALL_STATE_RINGING;
        onIncomingCallReceived(context, phoneNumber);
      }
      //Remember the latest state so the next off-hook transition is classified
      //correctly as incoming or outgoing
      lastCallState = stateChange;
    }
  }
  protected void onIncomingCallStarted(Context context,String number){
    Toast.makeText(context,"Incoming call is started",Toast.LENGTH_LONG).show();
    context.startService(new Intent(context,AudioRecorderService.class));
  }
  protected void onOutgoingCallStarted(Context context,String number){
    Toast.makeText(context, "Outgoing call is started", Toast.LENGTH_LONG).show();
    context.startService(new Intent(context, AudioRecorderService.class));
  }
  protected void onIncomingCallEnded(Context context,String number){
    Toast.makeText(context, "Incoming call is ended", Toast.LENGTH_LONG).show();
    //Stopping the service triggers onDestroy(), which stops and saves the recording
    context.stopService(new Intent(context, AudioRecorderService.class));
  }
  protected void onOutgoingCallEnded(Context context,String number){
    Toast.makeText(context, "Outgoing call is ended", Toast.LENGTH_LONG).show();
    context.stopService(new Intent(context, AudioRecorderService.class));
  }
  protected void onIncomingCallReceived(Context context,String number){
    Toast.makeText(context, "Incoming call is received", Toast.LENGTH_LONG).show();
  }
  protected void onIncomingCallAnswered(Context context, String number) {
    Toast.makeText(context, "Incoming call is answered", Toast.LENGTH_LONG).show();
  }
}
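
The original article does not show how PhoneCallReceiver is registered. One option (a sketch, not the author's code) is to register it dynamically from a long-running component; declaring it in AndroidManifest.xml with the same two actions, plus the READ_PHONE_STATE and PROCESS_OUTGOING_CALLS permissions, also works and lets the receiver fire even when no component of the app is running:

import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.telephony.TelephonyManager;
/**
 * Registers PhoneCallReceiver for the phone-state and outgoing-call broadcasts.
 */
public class ReceiverRegistration {
  public static PhoneCallReceiver register(Context context) {
    PhoneCallReceiver receiver = new PhoneCallReceiver();
    IntentFilter filter = new IntentFilter();
    filter.addAction(TelephonyManager.ACTION_PHONE_STATE_CHANGED); //android.intent.action.PHONE_STATE
    filter.addAction(Intent.ACTION_NEW_OUTGOING_CALL);             //android.intent.action.NEW_OUTGOING_CALL
    context.registerReceiver(receiver, filter);
    return receiver;
  }
}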

Below is the Java implementation of AudioRecorderService:

package com.example.hgx.phoneinfo60.Recording;
import android.app.Service;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.AsyncTask;
import android.os.Environment;
import android.os.IBinder;
import android.provider.MediaStore;
import android.util.Log;
import android.widget.Toast;
import com.example.hgx.phoneinfo60.MyApplication;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
/**
 * Created by hgx on 2016/6/13.
 */
public class AudioRecorderService extends Service {
  private static int RECORD_RATE = 0;
  private static int RECORD_BPP = 16; //bits per sample, matching the 16-bit PCM data described in the WAV header
  private static int RECORD_CHANNEL = AudioFormat.CHANNEL_IN_MONO;
  private static int RECORD_ENCODER = AudioFormat.ENCODING_PCM_16BIT;
  private AudioRecord audioRecorder = null;
  private Thread recordT = null;
  private volatile boolean isRecording = false; //written from the service callbacks, read by the recording thread
  private int bufferEle = 1024, bytesPerEle = 2; //read 1024 elements per pass, 2 bytes per element in 16-bit format
  private static int[] recordRate ={44100 , 22050 , 11025 , 8000};
  int bufferSize = 0;
  File uploadFile;
  @Override
  public IBinder onBind(Intent intent) {
    // TODO: Return the communication channel to the service.
    //maintain the relationship between the caller activity and the callee service, currently useless here
    return null;
  }
  @Override
  public void onDestroy() {
    if (isRecording){
      stopRecord();
    }else{
      Toast.makeText(MyApplication.getContext(), "Recording is already stopped",Toast.LENGTH_SHORT).show();
    }
    super.onDestroy();
  }
  @Override
  public int onStartCommand(Intent intent, int flags, int startId) {
    if (!isRecording){
      startRecord();
    }else {
      Toast.makeText(MyApplication.getContext(), "Recording is already started",Toast.LENGTH_SHORT).show();
    }
    return START_STICKY; //keep the service running until it is explicitly stopped
  }
  private void startRecord(){
    audioRecorder = initializeRecord();
    if (audioRecorder != null){
      Toast.makeText(MyApplication.getContext(), "Recording is started",Toast.LENGTH_SHORT).show();
      audioRecorder.startRecording();
    }else
      return;
    isRecording = true;
    recordT = new Thread(new Runnable() {
      @Override
      public void run() {
        writeToFile();
      }
    },"Recording Thread");
    recordT.start();
  }
  private void writeToFile(){
    byte[] bData = new byte[bufferEle];
    FileOutputStream fos = null;
    File recordFile = createTempFile();
    try {
      fos = new FileOutputStream(recordFile);
    } catch (FileNotFoundException e) {
      e.printStackTrace();
      return;
    }
    //Keep reading PCM data from the recorder and append each chunk to the temp file
    try {
      while (isRecording){
        int read = audioRecorder.read(bData, 0, bufferEle);
        if (read > 0){
          fos.write(bData, 0, read);
        }
      }
    } catch (IOException e) {
      e.printStackTrace();
    } finally {
      try {
        fos.close();
      } catch (IOException e) {
        e.printStackTrace();
      }
    }
  }
  //Following function converts short data to byte data
  private byte[] writeShortToByte(short[] sData) {
    int size = sData.length;
    byte[] byteArrayData = new byte[size * 2];
    for (int i = 0; i < size; i++) {
      byteArrayData[i * 2] = (byte) (sData[i] & 0x00FF);
      byteArrayData[(i * 2) + 1] = (byte) (sData[i] >> 8);
      sData[i] = 0;
    }
    return byteArrayData;
  }
  //Creates temporary .raw file for recording
  private File createTempFile() {
    File tempFile = new File(Environment.getExternalStorageDirectory(), "aditi.raw");
    return tempFile;
  }
  //Create file to convert to .wav format
  private File createWavFile() {
    File wavFile = new File(Environment.getExternalStorageDirectory(), "aditi_" + System.currentTimeMillis() + ".wav");
    return wavFile;
  }
  /*
   * Convert raw to wav file
   * @param java.io.File temporary raw file
   * @param java.io.File destination wav file
   * @return void
   *
   * */
  private void convertRawToWavFile(File tempFile, File wavFile) {
    FileInputStream fin = null;
    FileOutputStream fos = null;
    long audioLength = 0;
    long dataLength = audioLength + 36;
    long sampleRate = RECORD_RATE;
    int channel = 1;
    long byteRate = RECORD_BPP * RECORD_RATE * channel / 8;
    byte[] data = new byte[bufferSize];
    try {
      fin = new FileInputStream(tempFile);
      fos = new FileOutputStream(wavFile);
      audioLength = fin.getChannel().size();
      dataLength = audioLength + 36;
      createWaveFileHeader(fos, audioLength, dataLength, sampleRate, channel, byteRate);
      //Copy the raw PCM data after the 44-byte header, writing only the bytes actually read
      int read;
      while ((read = fin.read(data)) != -1) {
        fos.write(data, 0, read);
      }
      uploadFile = wavFile.getAbsoluteFile();
    } catch (IOException e) {
      //Log.e("AudioRecorderService:convertRawToWavFile",e.getMessage());
      e.printStackTrace();
    } finally {
      try {
        if (fin != null) fin.close();
        if (fos != null) fos.close();
      } catch (IOException e) {
        e.printStackTrace();
      }
    }
  }
  /*
  * To create wav file need to create header for the same
  *
  * @param java.io.FileOutputStream
  * @param long
  * @param long
  * @param long
  * @param int
  * @param long
  * @return void
  */
  private void createWaveFileHeader(FileOutputStream fos, long audioLength, long dataLength, long sampleRate, int channel, long byteRate) {
    byte[] header = new byte[44];
    header[0] = 'R'; // RIFF/WAVE header
    header[1] = 'I';
    header[2] = 'F';
    header[3] = 'F';
    header[4] = (byte) (dataLength & 0xff);
    header[5] = (byte) ((dataLength >> 8) & 0xff);
    header[6] = (byte) ((dataLength >> 16) & 0xff);
    header[7] = (byte) ((dataLength >> 24) & 0xff);
    header[8] = 'W';
    header[9] = 'A';
    header[10] = 'V';
    header[11] = 'E';
    header[12] = 'f'; // 'fmt ' chunk
    header[13] = 'm';
    header[14] = 't';
    header[15] = ' ';
    header[16] = 16; // 4 bytes: size of 'fmt ' chunk
    header[17] = 0;
    header[18] = 0;
    header[19] = 0;
    header[20] = 1; // format = 1
    header[21] = 0;
    header[22] = (byte) channel;
    header[23] = 0;
    header[24] = (byte) (sampleRate & 0xff);
    header[25] = (byte) ((sampleRate >> 8) & 0xff);
    header[26] = (byte) ((sampleRate >> 16) & 0xff);
    header[27] = (byte) ((sampleRate >> 24) & 0xff);
    header[28] = (byte) (byteRate & 0xff);
    header[29] = (byte) ((byteRate >> 8) & 0xff);
    header[30] = (byte) ((byteRate >> 16) & 0xff);
    header[31] = (byte) ((byteRate >> 24) & 0xff);
    header[32] = (byte) (channel * 16 / 8); // block align = channels * bitsPerSample / 8
    header[33] = 0;
    header[34] = 16; // bits per sample
    header[35] = 0;
    header[36] = 'd';
    header[37] = 'a';
    header[38] = 't';
    header[39] = 'a';
    header[40] = (byte) (audioLength & 0xff);
    header[41] = (byte) ((audioLength >> 8) & 0xff);
    header[42] = (byte) ((audioLength >> 16) & 0xff);
    header[43] = (byte) ((audioLength >> 24) & 0xff);
    try {
      fos.write(header, 0, 44);
    } catch (IOException e) {
      //Log.e("MainActivity:createWavFileHeader()",e.getMessage());
      e.printStackTrace();
    }
  }
  /*
  * Delete the created temporary file
  * @param
  * @return void
  */
  private void deletTempFile() {
    File file = createTempFile();
    file.delete();
  }
  /*
   * Initialize audio record
   *
   * @param
   * @return android.media.AudioRecord
   */
  private AudioRecord initializeRecord() {
    short[] audioFormat = new short[]{AudioFormat.ENCODING_PCM_16BIT, AudioFormat.ENCODING_PCM_8BIT};
    short[] channelConfiguration = new short[]{AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO};
    for (int rate : recordRate) {
      for (short aFormat : audioFormat) {
        for (short cConf : channelConfiguration) {
          //Log.d("MainActivity:initializeRecord()","Rate"+rate+"AudioFormat"+aFormat+"Channel Configuration"+cConf);
          try {
            int buffSize = AudioRecord.getMinBufferSize(rate, cConf, aFormat);
            bufferSize = buffSize;
            if (buffSize != AudioRecord.ERROR_BAD_VALUE) {
              AudioRecord aRecorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, rate, cConf, aFormat, buffSize);
              if (aRecorder.getState() == AudioRecord.STATE_INITIALIZED) {
                RECORD_RATE = rate;
                //Log.d("MainActivity:InitializeRecord - AudioFormat",String.valueOf(aFormat));
                //Log.d("MainActivity:InitializeRecord - Channel",String.valueOf(cConf));
                //Log.d("MainActivity:InitialoizeRecord - rceordRate", String.valueOf(rate));
                return aRecorder;
              }
            }
          } catch (Exception e) {
            //Log.e("MainActivity:initializeRecord()",e.getMessage());
          }
        }
      }
    }
    return null;
  }
  /*
  * Method to stop and release audio record
  *
  * @param
  * @return void
  */
  private void stopRecord() {
    if (null != audioRecorder) {
      isRecording = false;
      //Let the recording thread leave its read loop before the recorder is released
      if (recordT != null) {
        try {
          recordT.join();
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();
        }
        recordT = null;
      }
      audioRecorder.stop();
      audioRecorder.release();
      audioRecorder = null;
      Toast.makeText(getApplicationContext(), "Recording is stopped", Toast.LENGTH_LONG).show();
    }
    convertRawToWavFile(createTempFile(), createWavFile());
    if (uploadFile != null && uploadFile.exists()) {
      //Log.d("AudioRecorderService:stopRecord()", "UploadFile exists");
      new UploadFile().execute(uploadFile);
    }
    deletTempFile();
  }
}
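
The UploadFile task invoked in stopRecord() is not included in the article; judging by the otherwise unused DataOutputStream/HttpURLConnection imports, it presumably posts the WAV file to a server. So that the service compiles, a placeholder along these lines can be used; the class name matches the call above, but everything inside it is an assumption:

package com.example.hgx.phoneinfo60.Recording;
import android.os.AsyncTask;
import android.util.Log;
import java.io.File;
/**
 * Placeholder for the upload task referenced in AudioRecorderService.stopRecord().
 * The real upload logic is not part of the original article; this stub only logs the file.
 */
public class UploadFile extends AsyncTask<File, Void, Void> {
  @Override
  protected Void doInBackground(File... files) {
    if (files.length > 0 && files[0] != null) {
      Log.d("UploadFile", "Would upload: " + files[0].getAbsolutePath());
    }
    return null;
  }
}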


I hope this article is helpful to readers working on Android development.
