Enabling DTMF input

Prerequisite: Your Mix project must have a channel that supports DTMF interaction. See Manage targets for more information.

For question and answer nodes that collect a list entity, you can set DTMF mappings directly in the node, if desired. VoiceXML Connector can interpret DTMF input based on the dtmf_mappings information in the ExecuteResponse payload, without the need for an external DTMF grammar reference. See RecognitionSettings in the Dialog as a Service gRPC API documentation for more information.
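
For illustration only, the DTMF mappings for a hypothetical ACCOUNT_TYPE list entity could be conveyed in the payload along these lines (the field names below are illustrative, not the literal message definition; see RecognitionSettings for the actual structure):

"dtmf_mappings": [
  { "value": "checking", "dtmf": "1" },
  { "value": "savings", "dtmf": "2" }
]

In this sketch, a caller pressing 1 resolves the entity to checking, and pressing 2 resolves it to savings.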

Alternatively, any question and answer node can reference external DTMF grammar files; see Specify grammars. The ability to reference DTMF grammar files in question and answer nodes is enabled by default in your project settings for channels that support DTMF input.
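
As an illustration of what such a file can contain, here is a minimal sketch of an external DTMF grammar (the file name, rule name, and values are hypothetical) that maps key 1 to checking and key 2 to savings:

<?xml version="1.0" encoding="UTF-8"?>
<!-- account_type_dtmf.grxml (hypothetical example): maps DTMF key 1 to "checking" and key 2 to "savings" -->
<grammar xmlns="http://www.w3.org/2001/06/grammar"
         version="1.0" mode="dtmf" root="account_type"
         tag-format="semantics/1.0-literals">
  <rule id="account_type" scope="public">
    <one-of>
      <item>1 <tag>checking</tag></item>
      <item>2 <tag>savings</tag></item>
    </one-of>
  </rule>
</grammar>

The mode="dtmf" attribute is what identifies this as a DTMF grammar rather than a speech grammar.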

A question and answer node that collects an entity for which a Nuance Recognizer built-in grammar exists can reference the built-in DTMF grammar in a similar fashion. In such cases, the grammar reference is a URI specifying the type and name of the built-in grammar, any desired parameters, and the name of the entity being collected. For example, a node that collects an entity called ACCOUNT_NUMBER (based on nuance_CARDINAL_NUMBER) as a 7-digit string can reference the built-in digits DTMF grammar as follows:

builtin:dtmf/digits?length=7;entity=ACCOUNT_NUMBER;

Where:

  • dtmf indicates that we’re using a DTMF grammar
  • digits is the name of the desired built-in grammar
  • length=7 specifies the expected number of digits (in this case, 7)
  • ACCOUNT_NUMBER is the name of the entity to collect

Refer to your Speech Suite documentation for more information on built-in grammars.

In your project settings, you can specify a global DTMF grammar to support DTMF interaction at confirmation turns; see Specify grammars for confirmation.
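
For example, assuming the built-in boolean DTMF grammar is available in your environment, a confirmation grammar reference along these lines could map key 1 to yes and key 2 to no:

builtin:dtmf/boolean?y=1;n=2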

Note that you can also specify DTMF grammar files for commands.
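
For instance, a hypothetical "operator" command could be backed by a DTMF grammar file like this sketch (file name and command value are examples only), which maps the 0 key to that command:

<?xml version="1.0" encoding="UTF-8"?>
<!-- operator_dtmf.grxml (hypothetical example): maps DTMF key 0 to the "operator" command -->
<grammar xmlns="http://www.w3.org/2001/06/grammar"
         version="1.0" mode="dtmf" root="operator"
         tag-format="semantics/1.0-literals">
  <rule id="operator" scope="public">
    <item>0 <tag>operator</tag></item>
  </rule>
</grammar>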