This is a server plugin for UniMRCP that allows Deepgram Brain to be used as a `speechrecog` resource in an MRCP server.
Download the `libdgmrcp.so` library from the releases page, and place it in the UniMRCP plugins directory (for example, `/opt/unimrcp/plugin/`).
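For example, assuming the library has been downloaded to the current directory, installation is a single copy (adjust the destination to match your installation):

```sh
# Copy the plugin into the UniMRCP plugins directory
# (the destination path here follows the example above).
cp libdgmrcp.so /opt/unimrcp/plugin/
```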
Then edit the `<plugin-factory>` section of the UniMRCP server configuration file (for example, `/opt/unimrcp/conf/unimrcpserver.xml`). General information about configuring the UniMRCP server can be found in its Server Configuration Manual.
A minimal configuration is as follows:

```xml
<plugin-factory>
  <engine id="Deepgram" name="libdgmrcp" enable="true">
    <param name="brain_url" value="wss://brain.deepgram.com/v2/"/>
    <param name="brain_username" value="USERNAME"/>
    <param name="brain_password" value="PASSWORD"/>
  </engine>
</plugin-factory>
```
The following options can be specified:
| name | type | description |
|---|---|---|
| `brain_url` | string (required) | The URL of the Deepgram ASR API. Try `wss://brain.deepgram.com/v2/` to use Deepgram's hosted API. Note the trailing slash, which is significant. |
| `brain_username` | string (required) | API username or API key. |
| `brain_password` | string (required) | API password or secret. |
| `model` | string | The default ASR model to use. |
| `language` | string | The default ASR language to use. |
| `sensitivity_level` | float | The default VAD sensitivity level, between 0.0 and 1.0. |
| `plaintext_results` | boolean | If `true`, results in a `RECOGNITION-COMPLETE` message are returned as plain text instead of the standard NLSML. This does not conform to the MRCP specification, but it can be convenient for testing and development. |
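As a sketch of how the optional parameters fit into the configuration above, an engine entry with all of them set might look like the following. The `model` and `language` values are illustrative placeholders, not recommendations; substitute values appropriate for your Deepgram account.

```xml
<plugin-factory>
  <engine id="Deepgram" name="libdgmrcp" enable="true">
    <param name="brain_url" value="wss://brain.deepgram.com/v2/"/>
    <param name="brain_username" value="USERNAME"/>
    <param name="brain_password" value="PASSWORD"/>
    <!-- Optional parameters; the values below are illustrative placeholders. -->
    <param name="model" value="general"/>
    <param name="language" value="en-US"/>
    <param name="sensitivity_level" value="0.5"/>
    <param name="plaintext_results" value="false"/>
  </engine>
</plugin-factory>
```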
A Dockerfile is provided that downloads and builds UniMRCP and its dependencies, as well as the server plugin itself:

```sh
$ docker build -t dgmrcp .
```
To extract the plugin, create a container from the image and then copy the shared object out of it:

```sh
CONTAINER=$(docker create dgmrcp) \
  && docker cp $CONTAINER:/dgmrcp/target/release/libdgmrcp.so ./ \
  && docker rm $CONTAINER
```
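If you want to sanity-check the result before installing it, the standard `file` utility will confirm that a Linux shared object was extracted:

```sh
# Should report an ELF shared object for your target architecture.
$ file ./libdgmrcp.so
```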
Licensed under either of
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.