2014-09-29

I am currently working on an Android Wear app and looking into audio recording. I followed the tutorial on the Android developer site, and it works on my Nexus 7, but not on the Samsung Gear Live I use for testing: the app keeps crashing.

Digging into the problem a bit, I found that the trouble probably comes from two parameters the recorder works with: OutputFormat and AudioEncoder. I tried pairing up and testing every available OutputFormat and AudioEncoder, but without any luck.

So here is my question: has anyone run into the same problem? If so, did you find the right combination of format and encoder?

I am not pasting my code because it is exactly the same as in the documentation. Here is the link if you want to take a look: http://developer.android.com/guide/topics/media/audio-capture.html
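For reference, the capture setup on that page boils down to roughly the following (a sketch of the documented steps, not my exact code; outputPath is just a placeholder for a writable file path). The setOutputFormat() and setAudioEncoder() calls are the two parameters I have been swapping out:

// Sketch of the MediaRecorder setup from the linked documentation page.
private void startDocStyleRecording(String outputPath) throws java.io.IOException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setOutputFile(outputPath);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    recorder.prepare();
    recorder.start();   // this is where things fall over on the Gear Live
}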

Thanks in advance for your answers and your time :)


I have been digging some more and found some... interesting/disturbing information... The app does not crash when using the AAC, AMR_NB or AMR_WB encoders. Instead, I get a 'mediaserver died' error when the start() method is called on my MediaRecorder. – Snow 2014-10-01 08:40:25


Have you tried the default codec? My understanding is that there are no compression codecs on Android Wear, so you need to capture the audio data without compression, and then it should work for you. I have not tested this, though, so I do not have an example. – 2014-10-17 05:19:01


Yes, I tried the default too, but unfortunately it does not work. I will try without compression, thanks :) – Snow 2014-10-17 08:06:00
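For clarity, trying "the default" codec above means roughly the following configuration (a sketch; DEFAULT is a real constant on both OutputFormat and AudioEncoder, but whether the device accepts it is exactly what is in question here):

// Sketch: let the platform pick the container format and the encoder.
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);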

Answer


The root problem is that you cannot use MediaRecorder, even though the Android audio capture example does; instead you need to use the AudioRecord class.

In addition, I recommend streaming the raw data back to your phone and assembling it into an audio file there, because doing that on the wearable itself is very tricky; there is a sketch of this after the sample below.

For more information, see this answer.

Below is a sample that I got working.

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.support.wearable.view.WatchViewStub;
import android.util.Log;
import android.widget.TextView;
import android.view.View;

public class MainActivity extends Activity {
    private static final String TAG = MainActivity.class.getName();

    private static final int RECORDER_SAMPLERATE = 44100; 
    private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_STEREO; 
    private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT; 

    private TextView mTextView; 
    private AudioRecord recorder; 
    private int bufferSize = 0; 
    private Thread recordingThread = null; 
    private volatile boolean isRecording; 

    @Override 
    protected void onCreate(Bundle savedInstanceState) { 
     Log.v(TAG, "Creating MainActivity"); 
     super.onCreate(savedInstanceState); 
     setContentView(R.layout.activity_main); 
     final WatchViewStub stub = (WatchViewStub) findViewById(R.id.watch_view_stub); 
     stub.setOnLayoutInflatedListener(new WatchViewStub.OnLayoutInflatedListener() { 
      @Override 
      public void onLayoutInflated(WatchViewStub stub) { 
       mTextView = (TextView) stub.findViewById(R.id.text); 
      } 
     }); 

     bufferSize = 
       AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE, 
         RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING); 
    } 

    public void handleRecordButtonClick(View view) { 
     startAudioCapture(); 
    } 

    public void handleStopButtonClick(View view) { 
     stopAudioCapture(); 
    } 

    private void startAudioCapture() { 
     Log.v(TAG, "Starting audio capture"); 
     recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 
       RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING,  bufferSize); 
     if (recorder.getState() == AudioRecord.STATE_INITIALIZED) { 
      recorder.startRecording(); 
      isRecording = true; 
      Log.v(TAG, "Successfully started recording"); 

      recordingThread = new Thread(new Runnable() { 

       @Override 
       public void run() { 
        processRawAudioData(); 
       } 
      }, "AudioRecorder Thread"); 

      recordingThread.start(); 
     } else { 
      Log.v(TAG, "Failed to started recording"); 
     } 
    } 

    private void stopAudioCapture() {
     Log.v(TAG, "Stop audio capture");
     // Signal the recording thread to exit before the recorder is released,
     // so it never reads from a released AudioRecord.
     isRecording = false;
     recorder.stop();
     recorder.release();
    }

    private void processRawAudioData() {
     byte[] data = new byte[bufferSize];
     while (isRecording) {
      int read = recorder.read(data, 0, bufferSize);

      if (read > 0) {
       // data[0..read) now holds a chunk of raw PCM; this is where you would
       // buffer it, write it to storage, or stream it to the handheld.
       Log.v(TAG, "Successfully read " + read + " bytes of audio");
      }
     }
    }
}
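Regarding the suggestion above to send the raw data to the handheld, here is a minimal sketch of one way to stream the PCM chunks to the phone, assuming the Wearable ChannelApi from Google Play services and a GoogleApiClient already connected with Wearable.API; the "/audio" path, the method name and the blocking await() calls are illustrative assumptions, and this must run off the main thread (the recording thread is a natural place).

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.Channel;
import com.google.android.gms.wearable.NodeApi;
import com.google.android.gms.wearable.Wearable;

import java.io.OutputStream;

// Opens a channel to the first connected node (the phone) and returns a stream
// the recording loop can write its raw PCM chunks into.
private OutputStream openAudioStreamToPhone(GoogleApiClient client) {
    NodeApi.GetConnectedNodesResult nodes =
            Wearable.NodeApi.getConnectedNodes(client).await();
    if (nodes.getNodes().isEmpty()) {
        return null;                                   // no handheld connected
    }
    String nodeId = nodes.getNodes().get(0).getId();
    Channel channel = Wearable.ChannelApi
            .openChannel(client, nodeId, "/audio")     // "/audio" is an arbitrary example path
            .await()
            .getChannel();
    return channel.getOutputStream(client).await().getOutputStream();
}

In the read loop above you would then write(data, 0, read) to this stream, and on the phone side a WearableListenerService can accept the channel in onChannelOpened() and copy the bytes into a file before turning them into a playable audio format.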