question


Full utterance is stored as slot value

I'm making a skill which identifies injuries, and I need the 'identify' intent to collect data step by step, as shown below. However, none of the sample utterances are working. For example, the slot 'type' has the sample "I have a {type}", but when I say "I have a cut", the slot's value is stored as "I have a cut" instead of "cut". Any ideas why this is happening? I suspect my understanding of this is entirely wrong; if so, could someone tell me how to actually build such a skill?

"intents": [
                {
                    "name": "identify",
                    "confirmationRequired": false,
                    "prompts": {},
                    "slots": [
                        {
                            "name": "type",
                            "type": "AMAZON.LITERAL",
                            "confirmationRequired": false,
                            "elicitationRequired": true,
                            "prompts": {
                                "elicitation": "Elicit.Slot.508056705383.1216799245514"
                            }
                        },
                        {
                            "name": "painlvl",
                            "type": "AMAZON.NUMBER",
                            "confirmationRequired": false,
                            "elicitationRequired": true,
                            "prompts": {
                                "elicitation": "Elicit.Slot.508056705383.470905496378"
                            }
                        },
                        {
                            "name": "loc",
                            "type": "AMAZON.LITERAL",
                            "confirmationRequired": false,
                            "elicitationRequired": true,
                            "prompts": {
                                "elicitation": "Elicit.Slot.508056705383.793792906850"
                            }
                        },
                        {
                            "name": "symone",
                            "type": "AMAZON.LITERAL",
                            "confirmationRequired": false,
                            "elicitationRequired": true,
                            "prompts": {
                                "elicitation": "Elicit.Slot.508056705383.774745442725"
                            }
                        },
                        {
                            "name": "symtwo",
                            "type": "AMAZON.LITERAL",
                            "confirmationRequired": false,
                            "elicitationRequired": true,
                            "prompts": {
                                "elicitation": "Elicit.Slot.508056705383.816384198478"
                            }
                        },
                        {
                            "name": "symthree",
                            "type": "AMAZON.LITERAL",
                            "confirmationRequired": false,
                            "elicitationRequired": true,
                            "prompts": {
                                "elicitation": "Elicit.Slot.508056705383.1130719713141"
                            }
                        }
                    ]
                }
            ]
        }
Tags: alexa skills kit, skill, dialog model, utterances

newuser-bd4d7b73-c429-406f-88ea-ba14b3af9984 answered

Sorry, the samples are here:

"name": "identify",
                    "slots": [
                        {
                            "name": "type",
                            "type": "AMAZON.LITERAL",
                            "samples": [
                                "I don't know",
                                "I am {type}",
                                "I have a {type}"
                            ]
                        },
                        {
                            "name": "painlvl",
                            "type": "AMAZON.NUMBER",
                            "samples": [
                                "It is {painlvl}",
                                "{painlvl}"
                            ]
                        },
                        {
                            "name": "loc",
                            "type": "AMAZON.LITERAL",
                            "samples": [
                                "my {loc}",
                                "It is on my {loc}",
                                "It's on my {loc}",
                                "It's at my {loc}",
                                "It's on the {loc}",
                                "It is on the {loc}",
                                "It is at my {loc}"
                            ]
                        },
                        {
                            "name": "symone",
                            "type": "AMAZON.LITERAL",
                            "samples": [
                                "There's {symone}",
                                "I had {symone}",
                                "I have {symone}",
				"I am {symone}"
                            ]
                        },
                        {
                            "name": "symtwo",
                            "type": "AMAZON.LITERAL",
                            "samples": [
                                "There's {symtwo}",
                                "I had {symtwo}",
                                "I have {symtwo}",
                                "I am {symtwo}"
                            ]
                        },
                        {
                            "name": "symthree",
                            "type": "AMAZON.LITERAL",
                            "samples": [
                                "There's {symthree}",
                                "I had {symthree}",
                                "I have {symthree}",
                                "I am {symthree}"
                            ]
                        }
                    ],
                    "samples": [
                        "I am injured",
                        "I'm injured",
                        "Identify my injury",
                        "Tell me what injury I have"
                    ]
                }
	],



apeksha answered

Hi Developer,

We tested your skill and it is working fine. Here is the request we received:

{
  "version": "1.0",
  "session": {
    "new": false,
    "sessionId": "amzn1.echo-api.session.991f247a-c130-4ff1-9aa6-61a8d15c433c",
    "application": {
      "applicationId": "amzn1.ask.skill.f402161a-187c-487b-a1ae-0b60a7a3c14d"
    },
    "user": {
      "userId": "amzn1.ask.account.AHQVMF54OTCEXQ257ZFMES7OM42UWPD3XG7IP7GFK6RKAMH3VMCHK6R4JUZIU36KX5Q4TQLF6IKPFVYLJDFEZO5IK5IXBC7ZJDM2YXXEX2LACRL64RUPNE4ABBQKGWBIDCOUNVRZ26BYSRRZN336ACYXA4JGRJBZWR46F7G5VBEDVL5WID5R3DLFVJZQXRPGRJ3JEL2ESLPDLFQ"
    }
  },
  "context": {
    "System": {
      "application": {
        "applicationId": "amzn1.ask.skill.f402161a-187c-487b-a1ae-0b60a7a3c14d"
      },
      "user": {
        "userId": "amzn1.ask.account.AHQVMF54OTCEXQ257ZFMES7OM42UWPD3XG7IP7GFK6RKAMH3VMCHK6R4JUZIU36KX5Q4TQLF6IKPFVYLJDFEZO5IK5IXBC7ZJDM2YXXEX2LACRL64RUPNE4ABBQKGWBIDCOUNVRZ26BYSRRZN336ACYXA4JGRJBZWR46F7G5VBEDVL5WID5R3DLFVJZQXRPGRJ3JEL2ESLPDLFQ"
      },
      "device": {
        "deviceId": "amzn1.ask.device.AHNUEWTIWXOJHGL7VRR5ABY6OWT2ZJ4AFZ2RL6N6672RAHWXJWL4S3U6IF3C2MQUF6PE5775O6WUX36PFRWGEKNOXKHNPINRMSQEDSKCRPBMUIKPC7WT5HLKRZO2PFZ23J2FKCFWVYJPNHOBRQYOYOQI55PA",
        "supportedInterfaces": {}
      },
      "apiEndpoint": "https://api.amazonalexa.com",
      "apiAccessToken": "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsImtpZCI6IjEifQ.eyJhdWQiOiJodHRwczovL2FwaS5hbWF6b25hbGV4YS5jb20iLCJpc3MiOiJBbGV4YVNraWxsS2l0Iiwic3ViIjoiYW16bjEuYXNrLnNraWxsLmY0MDIxNjFhLTE4N2MtNDg3Yi1hMWFlLTBiNjBhN2EzYzE0ZCIsImV4cCI6MTUzNDgzOTQ5MiwiaWF0IjoxNTM0ODM1ODkyLCJuYmYiOjE1MzQ4MzU4OTIsInByaXZhdGVDbGFpbXMiOnsiY29uc2VudFRva2VuIjpudWxsLCJkZXZpY2VJZCI6ImFtem4xLmFzay5kZXZpY2UuQUhOVUVXVElXWE9KSEdMN1ZSUjVBQlk2T1dUMlpKNEFGWjJSTDZONjY3MlJBSFdYSldMNFMzVTZJRjNDMk1RVUY2UEU1Nzc1TzZXVVgzNlBGUldHRUtOT1hLSE5QSU5STVNRRURTS0NSUEJNVUlLUEM3V1Q1SExLUlpPMlBGWjIzSjJGS0NGV1ZZSlBOSE9CUlFZT1lPUUk1NVBBIiwidXNlcklkIjoiYW16bjEuYXNrLmFjY291bnQuQUhRVk1GNTRPVENFWFEyNTdaRk1FUzdPTTQyVVdQRDNYRzdJUDdHRks2UktBTUgzVk1DSEs2UjRKVVpJVTM2S1g1UTRUUUxGNklLUEZWWUxKREZFWk81SUs1SVhCQzdaSkRNMllYWEVYMkxBQ1JMNjRSVVBORTRBQkJRS0dXQklEQ09VTlZSWjI2QllTUlJaTjMzNkFDWVhBNEpHUkpCWldSNDZGN0c1VkJFRFZMNVdJRDVSM0RMRlZKWlFYUlBHUkozSkVMMkVTTFBETEZRIn19.VwB2ZdTy5J_5KKxx67PrGIYDaNA5PVvLzZH9t5cMo3_2u1aPKhxhMM_tGjTmu28pplrVjfow94sEK5GZ2qqqImCHo042l6ZgMEpi9ouNkJMqrxZBVS1tD-buwF4E2_M6Up-tBnj9bjvnqMBedbqvcltrPi8WUtQAUVmJMtwXDSN09sA91XlmvBT81d1s765cPuBc27cORhpKBbzZEHV2L0sNMFXKwUEmGJ-jwM97NZKJ8lIfWHghPe92RvRnObAeR05SyT_iyEykJg-KKD7UPrVQnpp6DcswmJcnKRzOP7e5TTJSh3oOfe6uBucJ_Jw_GY8pskjDXoMdOWMqKgkIUw"
    }
  },
  "request": {
    "type": "IntentRequest",
    "requestId": "amzn1.echo-api.request.bbd1b24e-0587-4f25-a60e-abef3db00e0f",
    "timestamp": "2018-08-21T07:18:12Z",
    "locale": "en-US",
    "intent": {
      "confirmationStatus": "NONE",
      "slots": {
        "loc": {
          "name": "loc",
          "confirmationStatus": "NONE"
        },
        "symone": {
          "name": "symone",
          "confirmationStatus": "NONE"
        },
        "painlvl": {
          "name": "painlvl",
          "confirmationStatus": "NONE"
        },
        "symthree": {
          "name": "symthree",
          "confirmationStatus": "NONE"
        },
        "symtwo": {
          "name": "symtwo",
          "confirmationStatus": "NONE"
        },
        "type": {
          "name": "type",
          "value": "cut",
          "resolutions": {
            "resolutionsPerAuthority": [
              {
                "authority": "amzn1.er-authority.echo-sdk.amzn1.ask.skill.f402161a-187c-487b-a1ae-0b60a7a3c14d.TYPE",
                "status": {
                  "code": "ER_SUCCESS_MATCH"
                },
                "values": [
                  {
                    "value": {
                      "name": "cut",
                      "id": "fe47aa7c733c490d36e80508d5dc4019"
                    }
                  }
                ]
              }
            ]
          },
          "confirmationStatus": "NONE"
        }
      }
    },
    "dialogState": "IN_PROGRESS"
  }
}

Please find attached the screenshot from the Developer Console.



Paul answered

Hey there,

I think this could be a good case for custom slot types instead of AMAZON.LITERAL.

A custom type is a list of things you want to match. For example, in one of my apps I have one called TimeCategory, and it breaks down to the following:

{
    "name": "TimeCategory",
    "values": [
        { "name": { "value": "sunset" } },
        { "name": { "value": "sunrise" } },
        { "name": { "value": "evening" } },
        { "name": { "value": "afternoon" } },
        { "name": { "value": "morning" } },
        { "name": { "value": "month" } },
        { "name": { "value": "year" } },
        { "name": { "value": "week" } },
        { "name": { "value": "day" } }
    ]
}

A sample utterance in my case is "What did the {animal} do this {TimeCategory}", in other words "what did the monkey do this afternoon".

So in your case I think you'd have a custom type called InjuryType, filled with values like: cut, scrape, burn, bruise, headache, broken, etc.

Then your sample utterance would be "I have a {type}". This would match anything like "I have a burn".
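For reference, here's a sketch of how that could look in the asker's interaction model: the slot keeps its name "type" but points at the custom type, and the type lists its values under "types". The value list here is illustrative, not exhaustive:

```json
{
    "intents": [
        {
            "name": "identify",
            "slots": [
                {
                    "name": "type",
                    "type": "InjuryType"
                }
            ],
            "samples": [
                "I have a {type}"
            ]
        }
    ],
    "types": [
        {
            "name": "InjuryType",
            "values": [
                { "name": { "value": "cut" } },
                { "name": { "value": "scrape" } },
                { "name": { "value": "burn" } },
                { "name": { "value": "bruise" } },
                { "name": { "value": "headache" } }
            ]
        }
    ]
}
```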

Your matched custom slot value will then appear in your request, a little something like this (note you access it by the slot's name, "type", not by the type's name):

request.intent.slots.type.value = "burn"
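A minimal sketch of pulling that value out in a Node.js handler could look like the following. This is plain JavaScript with no ASK SDK dependency; the envelope is a trimmed-down, hypothetical request shaped like the one apeksha posted above:

```javascript
// Hypothetical helper: read a slot's matched value from an IntentRequest
// envelope, returning null when the dialog hasn't filled the slot yet.
function getSlotValue(requestEnvelope, slotName) {
  const slots = requestEnvelope.request.intent.slots || {};
  const slot = slots[slotName];
  return slot && slot.value ? slot.value : null;
}

// Trimmed example request, following the shape shown earlier in this thread:
const envelope = {
  request: {
    type: "IntentRequest",
    intent: {
      name: "identify",
      slots: {
        type: { name: "type", value: "burn" },
        painlvl: { name: "painlvl" } // not yet filled by the dialog
      }
    }
  }
};

console.log(getSlotValue(envelope, "type"));    // "burn"
console.log(getSlotValue(envelope, "painlvl")); // null
```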

Hope that helps,

Paul
