ML Kit can generate short replies to messages using an on-device model.
To generate smart replies, you pass ML Kit a log of recent messages in a conversation. If ML Kit determines the conversation is in English, and that the conversation doesn't have potentially sensitive subject matter, ML Kit generates up to three replies, which you can suggest to your user.
Try it out
- Play around with the sample app to see an example usage of this API.
Before you begin
- Include the following ML Kit pods in your Podfile:
```ruby
pod 'GoogleMLKit/SmartReply', '7.0.0'
```
- After you install or update your project's Pods, open your Xcode project using its `.xcworkspace` file. ML Kit is supported in Xcode version 12.4 or greater.
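For context, a complete Podfile containing the pod above might look like the following sketch. The target name and iOS platform version are placeholders for your own project, not requirements from ML Kit:

```ruby
# Podfile (sketch) -- target name and platform version are placeholders.
platform :ios, '15.5'

target 'YourApp' do
  use_frameworks!

  # Pulls in the on-device Smart Reply model and API.
  pod 'GoogleMLKit/SmartReply', '7.0.0'
end
```

After editing the Podfile, run `pod install` and reopen the project through the generated `.xcworkspace`.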
1. Create a conversation history object
To generate smart replies, you pass ML Kit a chronologically ordered array of `TextMessage` objects, with the earliest timestamp first. Whenever the user sends or receives a message, add the message, its timestamp, and the message sender's user ID to the conversation history.
The user ID can be any string that uniquely identifies the sender within the conversation. The user ID doesn't need to correspond to any user data, and the user ID doesn't need to be consistent between conversations or invocations of the smart reply generator.
If the message was sent by the user you want to suggest replies to, set `isLocalUser` to `true`.
Swift
```swift
var conversation: [TextMessage] = []

// Then, for each message sent and received:
let message = TextMessage(
    text: "How are you?",
    timestamp: Date().timeIntervalSince1970,
    userID: "userId",
    isLocalUser: false)
conversation.append(message)
```
Objective-C
```objective-c
NSMutableArray *conversation = [NSMutableArray array];

// Then, for each message sent and received:
MLKTextMessage *message = [[MLKTextMessage alloc]
    initWithText:@"How are you?"
       timestamp:[NSDate date].timeIntervalSince1970
          userID:userId
     isLocalUser:NO];
[conversation addObject:message];
```
A conversation history object looks like the following example:
| Timestamp | userID | isLocalUser | Message |
|---|---|---|---|
| Thu Feb 21 13:13:39 PST 2019 | | true | are you on your way? |
| Thu Feb 21 13:15:03 PST 2019 | FRIEND0 | false | Running late, sorry! |
ML Kit suggests replies to the last message in a conversation history. The last message should be from a non-local user. In the example above, the last message in the conversation is from the non-local user FRIEND0. When you pass ML Kit this log, it suggests replies to FRIEND0's message: "Running late, sorry!"
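One way to enforce the last-message rule above is a small guard before requesting suggestions. This is a sketch, not part of the ML Kit API; the `Message` struct below is a minimal stand-in for ML Kit's `TextMessage` so the rule is easy to see in isolation:

```swift
import Foundation

// Sketch: minimal stand-in for ML Kit's TextMessage. In your app,
// use the real TextMessage type from GoogleMLKit/SmartReply.
struct Message {
    let text: String
    let timestamp: TimeInterval
    let userID: String
    let isLocalUser: Bool
}

// Request suggestions only when the conversation ends with a
// message from a non-local user.
func shouldSuggestReplies(for conversation: [Message]) -> Bool {
    guard let last = conversation.last else { return false }
    return !last.isLocalUser
}
```

If this check fails, skip the Smart Reply call entirely rather than showing stale or irrelevant suggestions.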
2. Get message replies
To generate smart replies to a message, get an instance of `SmartReply` and pass the conversation history to its `suggestReplies(for:completion:)` method:
Swift
```swift
SmartReply.smartReply().suggestReplies(for: conversation) { result, error in
    guard error == nil, let result = result else {
        return
    }
    if (result.status == .notSupportedLanguage) {
        // The conversation's language isn't supported, so
        // the result doesn't contain any suggestions.
    } else if (result.status == .success) {
        // Successfully suggested smart replies.
        // ...
    }
}
```
Objective-C
```objective-c
MLKSmartReply *smartReply = [MLKSmartReply smartReply];
[smartReply suggestRepliesForMessages:conversation
                           completion:^(MLKSmartReplySuggestionResult * _Nullable result,
                                        NSError * _Nullable error) {
    if (error || !result) {
        return;
    }
    if (result.status == MLKSmartReplyResultStatusNotSupportedLanguage) {
        // The conversation's language isn't supported, so
        // the result doesn't contain any suggestions.
    } else if (result.status == MLKSmartReplyResultStatusSuccess) {
        // Successfully suggested smart replies.
        // ...
    }
}];
```
If the operation succeeds, a `SmartReplySuggestionResult` object is passed to the completion handler. This object contains a list of up to three suggested replies, which you can present to your user:
Swift
```swift
for suggestion in result.suggestions {
    print("Suggested reply: \(suggestion.text)")
}
```
Objective-C
```objective-c
for (MLKSmartReplySuggestion *suggestion in result.suggestions) {
    NSLog(@"Suggested reply: %@", suggestion.text);
}
```
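Rather than printing the suggestions, a typical app surfaces them as tappable chips above the compose field. The following UIKit sketch is one possible presentation, not part of the ML Kit API; the stack-view layout and the `onTap` callback are assumptions:

```swift
import UIKit

// Sketch: render each suggested reply as a tappable button in a stack view.
// `suggestions` would be result.suggestions.map { $0.text } in practice.
func makeSuggestionBar(for suggestions: [String],
                       onTap: @escaping (String) -> Void) -> UIStackView {
    let stack = UIStackView()
    stack.axis = .horizontal
    stack.spacing = 8
    for text in suggestions.prefix(3) {  // ML Kit returns at most three.
        let button = UIButton(type: .system)
        button.setTitle(text, for: .normal)
        button.addAction(UIAction { _ in onTap(text) }, for: .touchUpInside)
        stack.addArrangedSubview(button)
    }
    return stack
}
```

On tap, you might insert the chosen text into the compose field or send it directly, depending on your app's conventions.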
Note that ML Kit might not return results if the model isn't confident in the relevance of the suggested replies, if the input conversation isn't in English, or if the model detects sensitive subject matter.
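Because an empty result is a normal outcome rather than an error, a defensive pattern is to clear any previously shown suggestions whenever the status isn't `.success` or the list is empty. This sketch assumes a hypothetical `updateSuggestionBar(_:)` method in your UI layer:

```swift
// Sketch: treat "no suggestions" as a normal outcome, not an error.
// updateSuggestionBar(_:) is a hypothetical method in your view controller.
SmartReply.smartReply().suggestReplies(for: conversation) { result, error in
    guard error == nil, let result = result,
          result.status == .success,
          !result.suggestions.isEmpty else {
        // Hide stale suggestions when nothing relevant came back.
        updateSuggestionBar([])
        return
    }
    updateSuggestionBar(result.suggestions.map { $0.text })
}
```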