Voice recognition on Apple's iPhone made use of Google's technology, so as the company driving Android, Google naturally built that core technology into the Android platform itself and, combined with its cloud services, took it even further.
As a result, implementing Google voice recognition on Android is very straightforward.
Speech recognition relies on the cloud to interpret the user's spoken input, which also enables features such as voice control. Below we use the API that Google provides to implement this feature.
The goal: recognize what the user says and print the recognized text in a list.
The UI flow is as follows:
The user taps the Speak button and the recognition dialog appears:
When the user finishes speaking, the audio is submitted to the cloud for recognition:
Once the cloud recognition completes, the returned results are printed to the list:
The complete code is as follows:
/*
 * Copyright (C) 2008 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.example.android.apis.app;

import com.example.android.apis.R;

import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;

import java.util.ArrayList;
import java.util.List;

/**
 * Sample code that invokes the speech recognition intent API.
 */
public class VoiceRecognition extends Activity implements OnClickListener {

    private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;

    private ListView mList;

    /**
     * Called when the activity is first created.
     */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Inflate our UI from its XML layout description.
        setContentView(R.layout.voice_recognition);

        // Get display items for later interaction
        Button speakButton = (Button) findViewById(R.id.btn_speak);
        mList = (ListView) findViewById(R.id.list);

        // Check to see if a recognition activity is present
        PackageManager pm = getPackageManager();
        List<ResolveInfo> activities = pm.queryIntentActivities(
                new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
        if (activities.size() != 0) {
            // A recognizer is installed: let the button trigger recognition
            speakButton.setOnClickListener(this);
        } else {
            // No recognizer available: disable the button and tell the user
            speakButton.setEnabled(false);
            speakButton.setText("Recognizer not present");
        }
    }

    /**
     * Handle the click on the start recognition button.
     */
    public void onClick(View v) {
        if (v.getId() == R.id.btn_speak) {
            startVoiceRecognitionActivity();
        }
    }

    /**
     * Fire an intent to start the speech recognition activity.
     */
    private void startVoiceRecognitionActivity() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
        startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
    }

    /**
     * Handle the results from the recognition activity.
     */
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
            // Fill the list view with the strings the recognizer thought it could have heard
            ArrayList<String> matches = data.getStringArrayListExtra(
                    RecognizerIntent.EXTRA_RESULTS);
            mList.setAdapter(new ArrayAdapter<String>(this,
                    android.R.layout.simple_list_item_1, matches));
        }

        super.onActivityResult(requestCode, resultCode, data);
    }
}
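Note that the sample assumes a layout resource (R.layout.voice_recognition above) containing a Button with id btn_speak and a ListView with id list; the recognition itself is performed by the recognizer app, so your own app does not need the RECORD_AUDIO permission for this intent-based approach.

The recognition intent also accepts a few more optional extras beyond the two used above. The snippet below is a minimal sketch of how you might cap the number of returned matches and hint the recognition language; EXTRA_MAX_RESULTS and EXTRA_LANGUAGE are standard RecognizerIntent constants, but the concrete values (5 results, "zh-CN") are just example choices.

    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
    // Ask the recognizer for at most five candidate transcriptions.
    intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 5);
    // Hint the recognition language as an IETF tag; otherwise the device default is used.
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "zh-CN");
    startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);

The candidate list returned in EXTRA_RESULTS is ordered by the recognizer's confidence, so showing all matches in the list (as the sample does) lets the user pick the transcription that is actually correct.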