Deploy the Model
Once you have validated your trained model, you can deploy it across different platforms. The table below lists the supported devices and indicates, for each one, whether on-target validation and live video inference are supported. Certain platforms are still under development.
| Platform | On-Target Validation | Live Video | In Development |
|---|---|---|---|
| PC / Linux | ✓ | | |
| Mac / macOS | ✓ | | |
| i.MX 8M Plus EVK | ✓ | ✓ | |
| NVIDIA Orin | ✓ | | |
| Kinara ARA-2 | ✓ | | |
| Raivin Radar Fusion | ✓ | ✓ | ✓ |
| i.MX 95 EVK | ✓ | ✓ | |
If you wish to run validation on a device, please follow the instructions below.
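The exact steps depend on your target and the export format you chose. As a rough illustration only, the sketch below shows how a TensorFlow Lite export might be run on the i.MX 8M Plus EVK NPU using the standard `tflite_runtime` API with the NXP VX delegate. The model filename, export format, and delegate path are assumptions for this sketch and are not prescribed by EdgeFirst Studio; adjust them for your own deployment.

```python
# Hypothetical sketch: running an exported TensorFlow Lite model on the
# i.MX 8M Plus EVK NPU via the NXP VX delegate. Paths and filenames are
# assumptions; check your BSP and the export you downloaded from Studio.
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the VX delegate so inference is offloaded to the NPU instead of the CPU.
delegate = tflite.load_delegate("/usr/lib/libvx_delegate.so")

interpreter = tflite.Interpreter(
    model_path="model.tflite",          # assumed export filename
    experimental_delegates=[delegate],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy frame shaped like the model input (e.g. 1 x H x W x 3) to
# confirm the model loads and runs on the target before wiring up a camera.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

outputs = interpreter.get_tensor(output_details[0]["index"])
print("Output shape:", outputs.shape)
```

A successful run of this sketch only confirms the runtime and delegate are working; the platform-specific instructions referenced above cover full validation and live video inference.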
Additional Platforms
Support for additional platforms beyond those listed above will be available soon. Let us know which platform you'd like to see supported next!
In this Quickstart guide, you created an EdgeFirst Studio account, logged in to EdgeFirst Studio, created your first project, and ran your first experiment: capturing images and videos, annotating datasets, training a Vision model, validating the trained model, and deploying the model back to a PC, the EdgeFirst Platform, or the i.MX 8M Plus EVK.
Next Steps
Before moving on, it is recommended that you be comfortable navigating EdgeFirst Studio. From here, users are invited to follow the other User Workflows, which are tailored to different hardware requirements and the resources available to them.