Last updated 2025-07-25 UTC.

Key points:

- Interactive Canvas Actions support `onUpdate()`, `onTtsMark()`, and `onInputStatusChanged()` callbacks to enhance user interactions.
- The `onUpdate()` callback facilitates data exchange between your webhook and web app for dynamic updates, and is primarily used in server-side fulfillment.
- `onTtsMark()` synchronizes web app behavior with spoken prompts by triggering actions based on custom SSML marks, and applies to both server-side and client-side fulfillment.
- Currently in Developer Preview, `onInputStatusChanged()` lets your web app respond to microphone and Assistant processing states for a more integrated user experience.

# Callbacks

You can implement the following callbacks in your Interactive Canvas Action:

`onUpdate()`
------------

The `onUpdate()` callback passes data from your webhook to your web app so that
the web app can update itself appropriately.
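For example, a web app can route incoming data to its rendering logic in its `onUpdate()` handler. The following is a minimal sketch; the `SceneManager` class and the `displayedWord` field are hypothetical, so substitute whatever data shape your own webhook sends:

```javascript
// Sketch of a web-app-side onUpdate() handler. `displayedWord` is a
// hypothetical field sent by the webhook via `new Canvas({ data })`.
class SceneManager {
  constructor() {
    this.displayedWord = '';
  }
  onUpdate(data) {
    // Updates may be delivered in an array; use the most recent one.
    const update = Array.isArray(data) ? data[data.length - 1] : data;
    if (update && update.displayedWord !== undefined) {
      this.displayedWord = update.displayedWord; // Redraw the scene here.
    }
  }
}
```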
You should only use this callback with the server-side
fulfillment model of Interactive Canvas development.

For more information about `onUpdate()`, see
[Pass data to update the web app](/assistant/interactivecanvas/prompts#pass_data_to_update_the_web_app).

`onTtsMark()`
-------------

The `onTtsMark()` callback is called when custom `<mark>` tags included in the
Speech Synthesis Markup Language ([SSML](/assistant/conversational/ssml))
of your response are read out to the user during text-to-speech (TTS). You can
use `onTtsMark()` in both the server-side and client-side fulfillment
development models.

In the following snippets, `onTtsMark()` synchronizes the web app's animation
with the corresponding TTS output. When the Action has told the user, "Sorry,
you lost," the web app spells out the correct word and displays the letters to
the user.

| **Note:** At this time, timepoints don't work with the SSML `<break>` tag.

In the following example, the webhook handler `revealWord` includes a custom
mark in the response to the user when they've lost the game:

### JavaScript

```javascript
...
app.handle('revealWord', conv => {
  conv.add(new Simple(`<speak>Sorry, you lost.<mark name="REVEAL_WORD"/> The word is ${conv.session.params.word}.</speak>`));
  conv.add(new Canvas());
});
...
```

The following code snippet then registers the `onTtsMark()` callback, checks the
name of the mark, and executes the `revealCorrectWord()` function, which updates
the web app:

### JavaScript

```javascript
...
setCallbacks() {
  const that = this;
  // Declare the Assistant Canvas Action callbacks.
  const callbacks = {
    onTtsMark(markName) {
      if (markName === 'REVEAL_WORD') {
        // Display the correct word to the user.
        that.revealCorrectWord();
      }
    },
  };
  // Called by the Interactive Canvas web app once it has loaded,
  // to register the callbacks.
  this.canvas.ready(callbacks);
}
...
```

`onInputStatusChanged()`
------------------------

| **Warning**:
This API is currently in Developer Preview. You can test this API in the
simulator, but do not deploy an Action that uses this feature to the alpha,
beta, or production channels. Actions deployed with these features will not
function on end-user devices.

The `onInputStatusChanged()` callback notifies you when the input status changes
in your Interactive Canvas Action. Input status changes indicate when the
microphone opens and closes or when Assistant is processing a query. The
following events can cause the input status to change:

- The user speaking to your Action
- The user entering text in the Android Google Search App (AGSA)
- The web app using the `sendTextQuery()` API to send a text query to the Action
- The Action writing to home storage, and other Assistant events

The primary use case for this callback is synchronizing your Action with the
user's voice interactions. For example, if a user is playing an Interactive
Canvas game and opens the microphone, you can pause the game while the user
speaks.
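One way to do this is to map the reported statuses onto pause and resume logic in your web app. The following is an illustrative sketch; `GameLoop` and `makeInputStatusHandler()` are hypothetical and not part of the Interactive Canvas API:

```javascript
// Sketch: pause a hypothetical game loop while the user interacts by voice.
class GameLoop {
  constructor() {
    this.paused = false;
  }
  pause() { this.paused = true; }
  resume() { this.paused = false; }
}

function makeInputStatusHandler(game) {
  return function onInputStatusChanged(inputStatus) {
    if (inputStatus === 'LISTENING' || inputStatus === 'PROCESSING') {
      game.pause(); // Freeze gameplay while the mic is open or a query runs.
    } else {
      game.resume(); // 'IDLE': resume once Assistant is done.
    }
  };
}
```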
You can also wait until the microphone is open to send a text query to
Assistant to ensure it's received.

This API reports the following statuses:

- `LISTENING` - Indicates that the microphone is open.
- `IDLE` - Indicates that the microphone is closed.
- `PROCESSING` - Indicates that Assistant is currently executing a query and that the microphone is closed.

The API reports the input status to your Action each time the status changes.

While any transition between states is possible, the following flows are common:

- `IDLE` > `LISTENING` > `PROCESSING` > `IDLE` - The user says a query, the query is processed, and the microphone closes.
- `IDLE` > `PROCESSING` > `IDLE` - The web app uses the `sendTextQuery()` API to send a text query to the Action.
- `IDLE` > `LISTENING` > `IDLE` - The user opens the microphone but doesn't say a query.

To use this feature in your Action, add `onInputStatusChanged()` to your web app
code, as shown in the following snippet:

### JavaScript

```javascript
onInputStatusChanged(inputStatus) {
  console.log('The new input status is: ', inputStatus);
}
```

The `onInputStatusChanged()` callback passes back a single enum parameter,
`inputStatus`. You can check this value to see the current input status.
The `inputStatus` value can be `LISTENING`, `PROCESSING`, or `IDLE`.

Next, add `onInputStatusChanged()` to the `callbacks` object to register it, as
shown in the following snippet:

### JavaScript

```javascript
/**
 * Registers all callbacks used by the Interactive Canvas Action;
 * executed at game creation time.
 */
setCallbacks() {
  const that = this;
  // Declare the Interactive Canvas Action callbacks.
  const callbacks = {
    onUpdate(data) {
      console.log('Received data', data);
    },
    onInputStatusChanged(inputStatus) {
      console.log('The new input status is: ', inputStatus);
    },
  };
  // Called by the Interactive Canvas web app once the web app has loaded,
  // to register the callbacks.
  this.canvas.ready(callbacks);
}
```
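The status reports can also coordinate outgoing `sendTextQuery()` calls. Since the common flows above show text queries starting from `IDLE`, one approach is to defer queries while Assistant is busy and flush them when the status returns to `IDLE`. The queueing wrapper below is an illustrative sketch; only `sendTextQuery()` and the status values come from the Interactive Canvas API:

```javascript
// Sketch: hold sendTextQuery() calls until the input status is IDLE.
// `canvas` stands in for the Interactive Canvas API surface.
class QueryQueue {
  constructor(canvas) {
    this.canvas = canvas;
    this.pending = [];
    this.status = 'IDLE';
  }
  onInputStatusChanged(inputStatus) {
    this.status = inputStatus;
    if (inputStatus === 'IDLE') {
      // Flush any queries deferred while Assistant was busy.
      this.pending.splice(0).forEach(q => this.canvas.sendTextQuery(q));
    }
  }
  send(query) {
    if (this.status === 'IDLE') {
      this.canvas.sendTextQuery(query);
    } else {
      this.pending.push(query); // Defer until the status returns to IDLE.
    }
  }
}
```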