FirebaseCustomLocalModel Describes a local model created from local or asset files.
FirebaseCustomLocalModel.Builder Builder class for FirebaseCustomLocalModel.
FirebaseCustomRemoteModel Describes a remote model to be downloaded to the device.
FirebaseCustomRemoteModel.Builder Builder class for FirebaseCustomRemoteModel.
FirebaseModelDataType Data types supported by FirebaseModelInputs.
FirebaseModelInputOutputOptions Configures the data types and dimensions of input and output data.
FirebaseModelInputOutputOptions.Builder Builder class for FirebaseModelInputOutputOptions.
FirebaseModelInputs Input data for FirebaseModelInterpreter.
FirebaseModelInputs.Builder Builder class for FirebaseModelInputs.
FirebaseModelInterpreter Interpreter to run custom models with TensorFlow Lite (requires API level 16+).

A model interpreter is created via getInstance(FirebaseModelInterpreterOptions). To run an inference, follow these steps: specify the FirebaseCustomRemoteModel or FirebaseCustomLocalModel to use, create a FirebaseModelInterpreterOptions from it, create a FirebaseModelInterpreter, and then run the inference with the model.
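The steps above can be sketched as follows. This is a minimal sketch, not a complete implementation: it assumes a TensorFlow Lite classifier bundled at the asset path "model.tflite" with a 1x224x224x3 FLOAT32 input and a 1x1000 FLOAT32 output; the asset path, shapes, and index names are illustrative, and production code should also catch FirebaseMLException when building the I/O options.

```java
// Step 1: specify the model (a local model bundled in the app's assets here;
// a FirebaseCustomRemoteModel could be used instead).
FirebaseCustomLocalModel localModel = new FirebaseCustomLocalModel.Builder()
        .setAssetFilePath("model.tflite")   // illustrative asset path
        .build();

// Step 2: create interpreter options from the model, then the interpreter.
FirebaseModelInterpreterOptions options =
        new FirebaseModelInterpreterOptions.Builder(localModel).build();
FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(options);

// Step 3: describe the data types and dimensions of the model's
// input (index 0) and output (index 0). Shapes are illustrative.
FirebaseModelInputOutputOptions ioOptions = new FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, FirebaseModelDataType.FLOAT32, new int[]{1, 224, 224, 3})
        .setOutputFormat(0, FirebaseModelDataType.FLOAT32, new int[]{1, 1000})
        .build();

// Step 4: build the input and run the inference asynchronously.
float[][][][] input = new float[1][224][224][3];  // populate with image data
FirebaseModelInputs inputs = new FirebaseModelInputs.Builder()
        .add(input)
        .build();

interpreter.run(inputs, ioOptions)
        .addOnSuccessListener(result -> {
            // FirebaseModelOutputs stores the inference results.
            float[][] probabilities = result.getOutput(0);
            // ... use probabilities[0] ...
        })
        .addOnFailureListener(e -> {
            // Handle the inference failure.
        });
```

run returns a Task, so the result is delivered to the success listener off the calling code path; the output array shape matches the one declared in setOutputFormat.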

FirebaseModelInterpreterOptions Immutable options used to configure a FirebaseModelInterpreter.
FirebaseModelInterpreterOptions.Builder Builder class for FirebaseModelInterpreterOptions.
FirebaseModelOutputs Stores inference results.
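When using a FirebaseCustomRemoteModel, the model must be downloaded before the interpreter can run it. A minimal sketch follows, assuming a model published under the name "my_model" in the Firebase console (the name is illustrative); FirebaseModelManager and FirebaseModelDownloadConditions belong to the same legacy Firebase ML SDK, although they are not listed in this index.

```java
// Identify the remote model by the name it was published under
// in the Firebase console ("my_model" is a placeholder).
FirebaseCustomRemoteModel remoteModel =
        new FirebaseCustomRemoteModel.Builder("my_model").build();

// Optional download conditions, e.g. only download over Wi-Fi.
FirebaseModelDownloadConditions conditions =
        new FirebaseModelDownloadConditions.Builder()
                .requireWifi()
                .build();

FirebaseModelManager.getInstance().download(remoteModel, conditions)
        .addOnSuccessListener(unused -> {
            // The model is now available on the device and can be passed to
            // new FirebaseModelInterpreterOptions.Builder(remoteModel).build().
        })
        .addOnFailureListener(e -> {
            // Handle the download failure, e.g. fall back to a local model.
        });
```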