Model Experiments Dashboard
A model experiment is a container for training and validation sessions. The following figure shows the "Model Experiments" page, which lists all the experiments started by the user.

The following figure breaks down the elements of an "Experiment" card.

An experiment contains child training and validation sessions, each of which is described in more detail in the sections below.
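Conceptually, the dashboard reflects a simple one-to-many hierarchy: an experiment owns training sessions, and each training session owns its validation sessions. The following sketch illustrates that hierarchy only; the class names, field names, and values are hypothetical and are not part of the platform's API.

```python
from dataclasses import dataclass, field

@dataclass
class ValidationSession:
    """Assesses the model produced by one training session."""
    name: str
    metrics: dict = field(default_factory=dict)

@dataclass
class TrainingSession:
    """Trains a model from a dataset; may own any number of validation sessions."""
    name: str
    dataset: str
    validation_sessions: list = field(default_factory=list)

@dataclass
class Experiment:
    """Container for the training and validation sessions started by a user."""
    name: str
    training_sessions: list = field(default_factory=list)

# Hypothetical example: one experiment, one training session, two validations.
experiment = Experiment(
    name="demo-experiment",
    training_sessions=[
        TrainingSession(
            name="train-1",
            dataset="vision-dataset",
            validation_sessions=[
                ValidationSession("val-1a", {"mAP": 0.71}),
                ValidationSession("val-1b", {"mAP": 0.74}),
            ],
        )
    ],
)

# Walking the hierarchy mirrors drilling down through the dashboard pages.
for training in experiment.training_sessions:
    for validation in training.validation_sessions:
        print(training.name, "->", validation.name, validation.metrics)
```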
Training Sessions
Training sessions take in datasets and synthesize new AI models from them for object recognition (Vision) or object perception (Fusion).
From the "Model Experiments" page, we can click on the "Training Sessions" button with the icon to see the training sessions in the experiment. The figure below shows the layout of the training session cards under the "Training Sessions" page.

The following figure describes the attributes of any given training session.

To compare the training charts across sessions, click "All Charts" at the top-right corner of the "Training Sessions" page. This overlays the charts from each session for a quick comparison.

All the training charts are displayed with a legend that identifies each training session.
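To picture what the overlaid view conveys, the sketch below plots synthetic loss curves from three sessions on one set of axes with a per-session legend. It is an illustration only; the session names and data are made up, not values produced by the platform.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic loss curves standing in for three training sessions.
epochs = np.arange(1, 51)
rng = np.random.default_rng(0)
sessions = {
    "session-a": np.exp(-epochs / 15) + rng.normal(0, 0.01, epochs.size),
    "session-b": np.exp(-epochs / 20) + rng.normal(0, 0.01, epochs.size),
    "session-c": np.exp(-epochs / 10) + rng.normal(0, 0.01, epochs.size),
}

fig, ax = plt.subplots()
for name, loss in sessions.items():
    ax.plot(epochs, loss, label=name)  # one curve per training session

ax.set_xlabel("Epoch")
ax.set_ylabel("Loss")
ax.set_title("Overlaid training charts")
ax.legend(title="Training session")  # the legend identifies each session
plt.show()
```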

For more details on running training sessions, please see Training ModelPack for training Vision models and Training Fusion for training Fusion models.
Validation Sessions
Validation sessions assess the performance of the models produced by training sessions. A training session can have any number of validation sessions. From the "Model Experiments" page, we can click the "Validation Sessions" button to see the validation sessions in the experiment. The figure below shows the layout of the validation session cards on the "Validation Sessions" page.

The following figure describes the attributes of any given validation session.

To compare the validation charts of each session, click "Compare" at the top-right corner of the "Validation Sessions" page. This shows the charts of each validation session side-by-side for a quick comparison.

Next, select the validation session results you wish to compare. Once selected, click "Compare" to show the validation charts side-by-side.

The charts for each session are now displayed side-by-side. All the charts for a single validation session appear in one column; each additional column represents another session.
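As a rough illustration of this column layout, the sketch below arranges synthetic charts so that each column holds all the charts for one validation session. The session names, metric names, and values are placeholders, not platform output.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
sessions = ["validation-a", "validation-b"]  # one column per session
metrics = ["precision", "recall"]            # one row per chart type

fig, axes = plt.subplots(nrows=len(metrics), ncols=len(sessions),
                         sharey="row", figsize=(8, 6))
for col, session in enumerate(sessions):
    for row, metric in enumerate(metrics):
        ax = axes[row][col]
        ax.bar(["class-0", "class-1"], rng.uniform(0.5, 1.0, 2))
        ax.set_title(f"{session}: {metric}")
        ax.set_ylim(0, 1)

fig.tight_layout()
plt.show()
```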

For more details on running validation sessions, please see Validating ModelPack for validating Vision models and Validating Fusion for validating Fusion models.
Next Steps
Now that you are familiar with the layout of the Model Experiments Dashboard, proceed to the next section to learn more about the Cloud Instances Dashboard.