Put your AI models to work
Once you have trained and validated your models, you are ready to put them to work in production and start getting value out of their capabilities. In Criterion AI, going from training and validation to production takes just a few clicks, and you even get to decide whether your models should be hosted online or offline. Online deployments are hosted in the cloud and exposed via secure RESTful interfaces, which are easy to call from other systems (such as mobile apps, other SaaS products, and web pages). 🔌
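To give a feel for what calling a deployed model over REST might look like, here is a minimal Python sketch using only the standard library. The endpoint URL, authorization header, and payload field names are illustrative assumptions, not Criterion AI's actual API contract; check your deployment's details in the platform.

```python
import base64
import json
import urllib.request

# NOTE: the URL, token, and field names below are illustrative assumptions --
# consult your deployment's API details in Criterion AI for the real contract.
API_URL = "https://api.example.com/v1/deployments/my-model:predict"
API_TOKEN = "your-api-token"

def build_request(image_bytes: bytes) -> urllib.request.Request:
    """Package an image as a JSON POST request for a deployed model."""
    body = json.dumps({"image": base64.b64encode(image_bytes).decode("ascii")})
    return urllib.request.Request(
        API_URL,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )

# To actually send the request (requires network access and a live deployment):
# with urllib.request.urlopen(build_request(open("part.jpg", "rb").read())) as resp:
#     result = json.load(resp)
```

Because the interface is plain HTTPS and JSON, the same pattern works from virtually any language or platform.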
We also offer our free Criterion AI app (for both iPhone and Android), which uses your deployed model's RESTful interface to analyze images taken with your phone in real time. It requires no integration whatsoever; with our mobile app, you can start leveraging the power of your model in a matter of minutes.
Use your models on your phone
Using the camera on either your iPhone or Android phone, you can run your model directly from the palm of your hand. Download the app from Apple's App Store or the Google Play Store and log in with your Criterion AI account. You will see a list of all the online models you have access to and can use from your phone. 📱
Pick one of the deployments, select an image format that matches your model (e.g., JPEG or PNG), take the photo, and run it through your model. In just a few seconds, the model's result appears right on your screen. Act on the result and move on with your work. All actions (including the image and the model's output) are automatically stored in Criterion AI to help you comply with regulatory requirements for record retention.
No Internet? Run your models offline
Many use cases require models to run offline, either because there is no Internet connection at the place of execution or because the volume of data to be processed per second is so large that sending requests outside the on-premise environment would be far too slow. For that reason, we have made it dead easy to export your models from Criterion AI and run them on your local hardware. All models trained in Criterion AI are converted to TensorFlow's SavedModel format, an industry standard for serializing and storing trained AI models. 💾
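As a sketch of what running an exported SavedModel locally can look like: the snippet below loads a model directory and runs a batch of inputs through its standard serving signature. It assumes TensorFlow is installed (`pip install tensorflow`) and that the export uses the default `serving_default` signature; the `top_label` helper and its label names are hypothetical illustrations.

```python
def load_and_predict(model_dir: str, images):
    """Run a batch of inputs through a locally exported SavedModel.

    Requires TensorFlow; imported lazily so the rest of this sketch can be
    read and run without it.
    """
    import tensorflow as tf

    model = tf.saved_model.load(model_dir)
    # SavedModels exported for serving expose a "serving_default" signature.
    infer = model.signatures["serving_default"]
    outputs = infer(tf.constant(images, dtype=tf.float32))
    # The signature returns a dict of named output tensors; take the first.
    return next(iter(outputs.values())).numpy()

def top_label(scores, labels):
    """Pick the label with the highest score from one model output row."""
    best = max(range(len(labels)), key=lambda i: scores[i])
    return labels[best]
```

Because the SavedModel format is framework-standard, the same exported directory also works with TensorFlow Serving, described next.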
With a model exported from Criterion AI, you can use TensorFlow Serving: a flexible, high-performance serving system for machine learning models, designed for production environments. It is built and maintained by Google and released as open source, so it is 100% free to use (under the Apache License 2.0) and simple to integrate into other systems.
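TensorFlow Serving exposes a documented REST API at `/v1/models/<name>:predict`, taking a JSON body with an `instances` field and returning `predictions`. The sketch below shows that request shape from Python; the model name `my_model`, the host, and the Docker invocation in the comment are assumptions you would adapt to your own setup.

```python
import json
import urllib.request

# TensorFlow Serving is typically started with Docker, e.g.:
#   docker run -p 8501:8501 \
#     -v /path/to/exported_model:/models/my_model \
#     -e MODEL_NAME=my_model tensorflow/serving
# It then serves a REST endpoint at /v1/models/<name>:predict.
SERVING_URL = "http://localhost:8501/v1/models/my_model:predict"

def build_predict_body(instances) -> bytes:
    """Encode a batch of inputs in TensorFlow Serving's REST request format."""
    return json.dumps({"instances": instances}).encode("utf-8")

def predict(instances):
    """Send a predict request to a running TensorFlow Serving instance."""
    req = urllib.request.Request(
        SERVING_URL,
        data=build_predict_body(instances),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["predictions"]
```

Since the on-premise interface is the same JSON-over-HTTP pattern as the cloud deployment, client code written against one can usually be pointed at the other with little more than a URL change.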
At Criterion AI, we have helped several of our Enterprise customers integrate TensorFlow Serving into systems built by third-party OEMs. If you want to hear more about these cases, or if you have an integration case at hand that you wish to discuss, please reach out to us. We would love to talk to you about how to leverage TensorFlow Serving and how you can run your AI models in conjunction with your existing on-premise systems. 👍