Validating Fusion Models
This tutorial describes the steps to validate the performance of Fusion models in EdgeFirst Studio that were trained through the end-to-end workflows or the Training Fusion tutorial. For a tutorial on validating ModelPack Vision models, see Validating ModelPack.
Check out the video tutorial above, part of the EdgeFirst Studio Series, which showcases the steps for running Fusion validation in EdgeFirst Studio. Otherwise, follow the steps below, which include section-specific timestamps into the video.
Specify Project Experiments
From the projects page, choose the project that contains the training session with the models you want to validate. In this example, the "Spatial Perception" project is chosen. Next, click the "Model Experiments" button, as indicated in red.

Create Validation Session
In the experiment card, click the "Validate Sessions" button as indicated in red below.

You will be greeted with the "Validate Sessions" page, as shown below.

Start a validation session by clicking on the "New Session" button on the top right corner of the page.

You will be greeted with the validation session configuration window. In this window, specify the name of the validation session, the model to validate, and the dataset to use. In this example, the TFLite model will be validated against the validation partition of the "Raivin Ultra Short 25.03" dataset. Next, specify the validation parameters, such as the detection window size and the detection threshold. Additional information on these parameters is provided by hovering over the info buttons, as indicated in red below.
The only augmentation available for this type of validation is blur. See Vision Augmentations for further details.

Once the configurations have been made, click the "Start Session" button at the bottom right of the window. This starts the validation session, which validates the model using the validation partition of the dataset.
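For intuition about the detection threshold set above, it acts as a confidence cutoff applied to the model's detections before they are matched against the ground truth. The sketch below is illustrative only and is not Studio's implementation; the Detection structure, its field names, and the 0.5 threshold value are assumptions for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical structure for a single model detection; the field names are
# assumptions for this illustration, not Studio's internal format.
@dataclass
class Detection:
    label: str                            # predicted class
    score: float                          # model confidence in [0, 1]
    box: Tuple[float, float, float, float]  # (x1, y1, x2, y2) in image coordinates

def apply_detection_threshold(detections: List[Detection],
                              threshold: float = 0.5) -> List[Detection]:
    """Keep only detections whose confidence meets the threshold.

    Raising the threshold trades recall for precision: fewer, more
    confident detections survive to be matched against ground truth.
    """
    return [d for d in detections if d.score >= threshold]

# Example: with a 0.5 threshold only the first detection is kept.
kept = apply_detection_threshold(
    [Detection("person", 0.82, (10, 20, 50, 90)),
     Detection("person", 0.31, (60, 15, 95, 88))],
    threshold=0.5,
)
```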
Session Progress
Once the validation session has started, the progress through the stages is shown on the left, and additional information and status are shown on the right.

Completed Session
The completed session will look as follows, with the status set to "Complete".

The attributes of the validation session are labeled below.

Validation Metrics
Once the validation session completes, you can view the validation metrics by clicking the "View Validation Charts" button at the top of the session card.

The metrics provide the precision, recall, F1, and IoU scores of the model at the specified window sizes. Additional charts include precision vs. recall curves and bird's eye view heatmaps describing where the model performs well and where it makes errors.
See Validation Metrics for further details.
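For a rough sense of how these scores relate, the sketch below shows the standard formulas for IoU, precision, recall, and F1; the counts and boxes are made-up examples, and the exact matching and windowing rules Studio applies are the ones described in Validation Metrics.

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Made-up example counts: 80 true positives, 10 false positives, 20 false negatives.
print(precision_recall_f1(80, 10, 20))       # -> (0.888..., 0.8, 0.842...)
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))   # -> 0.1428...
```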
You can return to the validation session card by pressing the "Back" button at the top left corner of the page, as indicated in red below.

Comparing Metrics
It is also possible to compare validation metrics for multiple sessions. See Validation Sessions in the EdgeFirst Studio Overview for further details.
Next Steps
Now that you have validated your Fusion model, follow these next steps to deploy it on a Raivin Platform.