Last reviewed: 3/23/2024 10:20:51 AM
Delphi Android Applications
Develop Android applications that speak and listen using your favorite version of Delphi.
The following sections describe the steps for integrating SpeechKit with Android Delphi applications.
SpeechKit Jars
SpeechKit includes Android-compatible jar libraries required by the SpeechKit Object Pascal classes.
Right-click the Libraries folder in your Android Target Platform and select the Add... option to add the following SpeechKit jars to your project:
- Program Files\Chant\SpeechKit 13\Android\lib\speechkit.jar and
- Program Files\Chant\SpeechKit 13\Android\lib\chant.shared.jar.
Azure Jar
For Azure Speech, right-click the Libraries folder in your Android Target Platform and select the Add... option to add the Azure Speech SDK classes to your project:
- Program Files\Chant\SpeechKit 13\Android\client-sdk-1.31.0\jars\classes.jar.
SpeechKit Units
SpeechKit includes an Object Pascal source file, Chant.SpeechKit.pas, that contains the Object Pascal classes that manage speech recognition and speech synthesis. It also includes an Object Pascal source file, Chant.Shared.pas, that contains the common Object Pascal classes used by all Chant libraries.
To access the SpeechKit Object Pascal classes within your application, first add a project reference to the SpeechKit Object Pascal source files:
- Within your Delphi project, select Project Options.
- Select the Delphi compiler options.
- Add a Search path reference to the SpeechKit unit source file directory: C:\Program Files\Chant\SpeechKit 13\Delphi\source.
- Set the Unit output directory to the local directory by entering a period '.' character.
To access the SpeechKit Object Pascal classes within your application, add references to the Chant.Shared and Chant.SpeechKit units in your uses clause. Additional Delphi Android units may be required, such as Androidapi.Helpers to access the application context, and Androidapi.JNI.Os and Androidapi.JNIBridge for permissions.
unit Unit1;

interface

uses
  ..., Chant.Shared, Chant.SpeechKit;
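For the runtime permission request shown later, additional RTL units are needed. A fuller uses clause might look like the following sketch; System.Permissions (which provides PermissionsService) is an assumption based on the permission call used below, and the remaining units are those noted above.
uses
  ..., System.Permissions, Androidapi.Helpers, Androidapi.JNIBridge,
  Androidapi.JNI.Os, Chant.Shared, Chant.SpeechKit;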
Add event handler declarations and object instance variables to the application declarations.
type
  TForm1 = class(TForm)
    ...
    procedure Recognition(Sender: TObject; Args: TRecognitionDictationEventArgs);
    procedure InitComplete(Sender: TObject; Args: TTTSEventArgs);
  private
    { Private declarations }
    _Recognizer: TChantRecognizer;   // Platform default - Android Speech
    //_Recognizer: TMCSRecognizer;   // Azure Speech
    _Synthesizer: TChantSynthesizer; // Platform default - Android Speech
    //_Synthesizer: TMCSSynthesizer; // Azure Speech
    procedure InitReco;
  public
    { Public declarations }
  end;

var
  Form1: TForm1;
  _SpeechKit: TSpeechKit;
  FPermissionRecordAudio: string;
Azure Deployment Libraries
Add libraries to deployment under Project->Deployment:
- Select All configurations 32-bit platform
- Add all the applicable SDK files (e.g., C:\Program Files\Chant\SpeechKit 13\Android\client-sdk-1.31.0\jni\armeabi-v7a\libMicrosoft.CognitiveServices.Speech.core.so)
- Set the Remote Path for each SDK File added (e.g., library\lib\armeabi-v7a\)
- Select All configurations 64-bit platform
- Add all the applicable SDK files (e.g., C:\Program Files\Chant\SpeechKit 13\Android\client-sdk-1.31.0\jni\arm64-v8a\libMicrosoft.CognitiveServices.Speech.core.so)
- Set the Remote Path for each SDK File added (e.g., library\lib\arm64-v8a\)
Object Instantiation
Instantiate SpeechKit and set the credentials. For speech recognition, verify permissions, instantiate the recognizer object, set the callback event handler, and set the application context. For speech synthesis, instantiate the synthesizer object, set the callback event handler, and set the application context.
// Instantiate SpeechKit object
_SpeechKit := TSpeechKit.Create();
if (_SpeechKit <> nil) then
begin
  // Set credentials
  _SpeechKit.SetCredentials('Credentials');
  // Get permission
  FPermissionRecordAudio := JStringToString(TJManifest_permission.JavaClass.RECORD_AUDIO);
  PermissionsService.RequestPermissions([FPermissionRecordAudio],
    {$IF CompilerVersion >= 33.0}
    procedure(const APermissions: TClassicStringDynArray; const AGrantResults: TClassicPermissionStatusDynArray)
    {$ELSE}
    procedure(const APermissions: TArray<string>; const AGrantResults: TArray<TPermissionStatus>)
    {$IFEND}
    begin
      // 1 permission involved: RECORD_AUDIO
      var RecordAudioPermissionGranted :=
        (Length(AGrantResults) = 1) and
        (AGrantResults[0] = TPermissionStatus.Granted);
      if RecordAudioPermissionGranted then
        InitReco;
    end
  );
  // Create synthesizer
  _Synthesizer := _SpeechKit.CreateChantSynthesizer();
  //_Synthesizer := _SpeechKit.CreateMCSSynthesizer('speechKey', 'speechRegion');
  if (_Synthesizer = nil) then
    // Unable to initialize, return
    Exit;
  // Register callback for init complete
  _Synthesizer.InitComplete := InitComplete;
end;
procedure TForm1.InitReco;
var
  chantEngine: TChantEngine;
  currentReco: string;
  i: Integer;
begin
  // Create recognizer
  _Recognizer := _SpeechKit.CreateChantRecognizer();
  //_Recognizer := _SpeechKit.CreateMCSRecognizer('speechKey', 'speechRegion');
  if (_Recognizer = nil) then
    // Unable to initialize, return
    Exit;
  // Register callback for recognized speech
  _Recognizer.RecognitionDictation := Recognition;
  ...
end;
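When the form is destroyed, release the objects that were created. The following OnDestroy sketch is an assumption about ownership: it treats the recognizer, synthesizer, and SpeechKit objects as ordinary Object Pascal instances that the application frees itself; consult the SpeechKit documentation for the disposal pattern recommended for your version.
procedure TForm1.FormDestroy(Sender: TObject);
begin
  // Assumption: the application frees the objects it created. If SpeechKit
  // releases the recognizer and synthesizer itself, freeing them here is unnecessary.
  FreeAndNil(_Recognizer);
  FreeAndNil(_Synthesizer);
  FreeAndNil(_SpeechKit);
end;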
Permissions
Speech recognition requires the user to grant RECORD_AUDIO permission. Add the appropriate permission in the manifest file by selecting Project->Options->Application->Uses Permission and setting Record audio to true.
Speech synthesis streamed to a file requires WRITE_EXTERNAL_STORAGE permission. Add the appropriate permission in the manifest file by selecting Project->Options->Application->Uses Permission and setting Write external storage to true.
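Depending on the target API level, WRITE_EXTERNAL_STORAGE may also need to be requested at runtime. The following minimal sketch mirrors the RECORD_AUDIO request shown above; RequestWriteStoragePermission is a hypothetical helper name, and the callback signature shown matches the newer branch of the conditional used earlier (on older compilers, use the TArray<string> form).
procedure TForm1.RequestWriteStoragePermission;
var
  PermissionWriteStorage: string;
begin
  // Hypothetical helper: request WRITE_EXTERNAL_STORAGE before streaming synthesis to a file
  PermissionWriteStorage :=
    JStringToString(TJManifest_permission.JavaClass.WRITE_EXTERNAL_STORAGE);
  PermissionsService.RequestPermissions([PermissionWriteStorage],
    procedure(const APermissions: TClassicStringDynArray;
      const AGrantResults: TClassicPermissionStatusDynArray)
    begin
      if (Length(AGrantResults) = 1) and
         (AGrantResults[0] = TPermissionStatus.Granted) then
      begin
        // Permission granted - safe to stream synthesized speech to a file
      end;
    end);
end;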
Apps Targeting Android 12
Speech recognition apps targeting Android 12 with API 30 require an additional manifest entry. Add the following queries section before the application element.
<queries>
  <intent>
    <action android:name="android.speech.RecognitionService" />
  </intent>
</queries>
Speech synthesis apps targeting Android 12 with API 30 require an additional manifest entry. Add the following queries section before the application element.
<queries>
  <intent>
    <action android:name="android.intent.action.TTS_SERVICE" />
  </intent>
</queries>
Event Callbacks
Event callbacks are the mechanism by which the class object sends information back to the application, such as notification that speech recognition occurred, audio playback finished, or an error occurred.
procedure TForm1.Recognition(Sender: TObject; Args: TRecognitionDictationEventArgs);
begin
  if ((Args <> nil) and (Length(Args.Text) > 0)) then
  begin
    ...
  end;
end;
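For example, the handler might simply display the recognized text. The following minimal sketch assumes a TMemo named Memo1 has been placed on the form; the control is a hypothetical addition, not part of SpeechKit.
procedure TForm1.Recognition(Sender: TObject; Args: TRecognitionDictationEventArgs);
begin
  if ((Args <> nil) and (Length(Args.Text) > 0)) then
  begin
    // Append the recognized phrase to a hypothetical TMemo on the form
    Memo1.Lines.Add(Args.Text);
  end;
end;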
procedure TForm1.InitComplete(Sender: TObject; Args: TTTSEventArgs);
var
  chantEngine: TChantEngine;
  currentVoice: string;
  i: Integer;
begin
  if (Form1._Synthesizer.ChantEngines <> nil) then
  begin
    for chantEngine in Form1._Synthesizer.ChantEngines do
      // Add name to list
      Form1.ListBox1.Items.Add(chantEngine.Name);
  end;
  ...
end;
Development and Deployment Checklist
When developing and deploying Delphi Android applications, ensure you have a valid license from Chant. See the section License for more information about licensing Chant software.
Sample Projects
Delphi Android sample projects are installed at the following location:
- Documents\Chant\SpeechKit 13\Delphi.