The Mobile Vision API is deprecated

Get Started with the Mobile Vision iOS API

The Mobile Vision API for iOS has detectors that let you find faces, barcodes and text in photos and video.

A note on CocoaPods

The Google Mobile Vision iOS SDK and related samples are distributed through CocoaPods. Set up CocoaPods by going to cocoapods.org and following the directions.
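For a typical setup, the directions on cocoapods.org amount to installing the CocoaPods Ruby gem; a minimal sketch (sudo may not be needed, depending on your Ruby installation):

```shell
# Install the CocoaPods gem (may require sudo with the system Ruby).
sudo gem install cocoapods

# Verify the installation.
pod --version
```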

Try the sample apps

After installing CocoaPods, run the command pod try GoogleMobileVision from Terminal to open up example projects for the library. There are three sample apps available:

  • FaceDetectorDemo: Demonstrates basic face detection and integration with AVFoundation. The app highlights the eyes, nose, mouth, cheeks, and ears within each detected face.

  • GooglyEyesDemo: Demonstrates how the GoogleMVDataOutput pod simplifies integration with video pipelines. The app draws cartoon eyes on top of detected faces.

  • MultiDetectorDemo: Demonstrates how to use the GoogleMVDataOutput pod to run multiple detector types at once. The app draws red rectangles around detected faces and blue rectangles around detected barcodes.

Add the Mobile Vision API to your existing iOS app

To create apps using the Mobile Vision API, add the GoogleMobileVision CocoaPod as follows:

  • Add a file named Podfile to your Xcode project folder, if you don't have one already.

  • Add pod 'GoogleMobileVision/FaceDetector' to your Podfile.

  • Run the command pod update from Terminal in the Xcode project folder. This downloads the FaceDetector CocoaPod and adds it to your project.
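Putting the steps above together, a minimal Podfile might look like the following sketch; the target name MyApp and the platform version are placeholders for your own project's values:

```ruby
# Podfile in the Xcode project folder.
platform :ios, '9.0'

# Replace 'MyApp' with the name of your app target.
target 'MyApp' do
  # Pulls in the face detector subspec of GoogleMobileVision.
  pod 'GoogleMobileVision/FaceDetector'
end
```

After saving the Podfile, running pod update in the same folder fetches the pod and generates an .xcworkspace to open instead of the original .xcodeproj.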

You can find class and method descriptions for GoogleMobileVision in the API reference.
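As a rough sketch of what face detection looks like with this pod, the snippet below creates a GMVDetector and runs it on a still image from Swift. The image name "group-photo" is a placeholder, and the options shown are one reasonable configuration, not the only one:

```swift
import UIKit
import GoogleMobileVision  // Provides GMVDetector and GMVFaceFeature.

// Configure the detector to compute all facial landmarks
// (eyes, nose, mouth, cheeks, ears).
let options: [AnyHashable: Any] = [
    GMVDetectorFaceLandmarkType: GMVDetectorFaceLandmark.all.rawValue
]
let faceDetector = GMVDetector(ofType: GMVDetectorTypeFace, options: options)

// Run detection on a still image and inspect the results.
if let image = UIImage(named: "group-photo"),
   let faces = faceDetector.features(in: image, options: nil) as? [GMVFaceFeature] {
    for face in faces {
        // Each feature reports a bounding box in image coordinates.
        print("Face at \(face.bounds)")
    }
}
```

For live video, the GoogleMVDataOutput pod used by the GooglyEyesDemo and MultiDetectorDemo samples wraps this same detector behind an AVFoundation data output, so you rarely call the detector directly on each frame yourself.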

Next Steps

Now that your environment is set up for the Mobile Vision API, there are a few things you can do next: