This section describes what is included in the auto-generated model documentation in Model Review.

The table below describes each item under Model Documentation:



Model Documentation

Base Model

  • Model name, version, the project the model belongs to, and the Unified Modelling Identifier (the model's unique ID).

Computed on

  • Start and end date and time of the model computation.

Raw Dataset

Raw Dataset

  • Name and description of the raw data used in creating the model, including the number of columns (features).

Columns in feature set - Number of Features

  • A list of all columns in the raw data. 
  • Click to select each column and generate a distribution chart and descriptive statistics. 
  • Also, click View Raw Dataset to view the raw data used for creating the model.

Feature Distribution Raw Dataset

  • Auto-generated chart for any of the columns in the raw data. 
  • The chart shown is based on the column selected in the “columns in feature set”.

Feature Statistics - Raw Dataset

  • Auto-generated descriptive statistics for the column selected in the feature list, including the number of missing values, the count of observations (rows), mean, standard deviation, and more.
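To illustrate the kind of statistics this panel reports, here is a minimal sketch that computes a missing-value count, row count, mean, and standard deviation for a column. The function name and output keys are hypothetical; the platform's own computation may differ.

```python
import math
import statistics

def feature_statistics(values):
    """Descriptive statistics like those auto-generated for a selected
    column (illustrative sketch only, not the platform's implementation)."""
    # Treat None and NaN as missing values.
    present = [v for v in values
               if v is not None and not (isinstance(v, float) and math.isnan(v))]
    return {
        "count": len(values),                    # total observations (rows)
        "missing": len(values) - len(present),   # missing-value count
        "mean": statistics.fmean(present),
        "std": statistics.stdev(present),        # sample standard deviation
        "min": min(present),
        "max": max(present),
    }

column = [4.0, 7.0, None, 5.0, 8.0, float("nan"), 6.0]
stats = feature_statistics(column)
```

Here `stats["count"]` is 7 while `stats["missing"]` is 2, matching the distinction the panel draws between total rows and missing values.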

Training and Validation Dataset

Training and Validation Dataset

  • Description of the training and validation data used in creating the model.

Columns in Feature Set - “num of input features”

  • List of input features.

Feature Statistics - Training Set

  • Distribution chart of the selected feature in the training dataset. 
  • Click View Training Set to view the training dataset.

Feature Statistics - Validation Set

  • Distribution chart of the selected feature in the validation dataset.
  • Click View Validation Set to view the validation dataset.

Unused Features

  • List of features in the raw data that were not used in creating the model.

Feature Importance

  • A chart showing the importance of each input feature used in creating the model.
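As a rough illustration of what such a chart plots, one common derivation for a linear model weights each coefficient by its feature's spread and normalises the result. This is an assumed, simplified scheme, not necessarily how the platform computes importance.

```python
import statistics

def feature_importance(coefficients, feature_columns):
    """Illustrative importance for a linear model: |coefficient| scaled by
    the feature's standard deviation, normalised to sum to 1.
    (Hypothetical sketch; the platform may use a different method.)"""
    raw = [abs(c) * statistics.pstdev(col)
           for c, col in zip(coefficients, feature_columns)]
    total = sum(raw)
    return [r / total for r in raw]

coefs = [2.0, -1.0, 0.0]
cols = [[1, 2, 3], [10, 20, 30], [5, 5, 5]]  # constant column -> zero importance
importances = feature_importance(coefs, cols)
```

A constant feature contributes nothing to predictions, so its importance comes out as zero regardless of its coefficient.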

Model Details

Data Preprocessing

  • A list of features that were preprocessed and the preprocessing technique applied to each.

Algorithm

  • The algorithm used in creating the model and its active parameters (see the “How Parameters in general work” section if active parameters are unclear).

Performance Metrics

  • A list of the model’s performance metrics and their corresponding values (see Create Model -> step 5 for metrics explanation).
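As an example of how two common regression metrics in such a list are computed, here is a short sketch of mean absolute error (MAE) and root mean squared error (RMSE). The metric names and function are illustrative; the platform's metric set is described in Create Model -> step 5.

```python
import math

def regression_metrics(actual, predicted):
    """Two example performance metrics for a regression model
    (illustrative only; the platform's metric set may differ)."""
    errors = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errors) / len(errors)            # mean absolute error
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors)) # root mean squared error
    return {"MAE": mae, "RMSE": rmse}

metrics = regression_metrics([3.0, 5.0, 7.0], [2.0, 5.0, 9.0])
```

RMSE penalises large errors more heavily than MAE, which is why the two values differ even on the same predictions.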

Performance Analysis

Model Accuracy

  • A chart showing the model’s performance (see Create Model -> step 5 for metrics explanation).

Sample Accuracy

  • A chart plotting the predicted values, the original values, and the difference between them.
  • Click "view original vs predicted" to view the actual predicted and original values of the target variable.

Receiver Operating Characteristic (ROC) Curve (shown only for classification models)

  • A ROC curve for the model.
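To show what the curve plots, here is a minimal sketch that computes the (false-positive-rate, true-positive-rate) points of a ROC curve by sweeping a classification threshold over the model's scores. It assumes untied scores for simplicity; the platform computes its own curve.

```python
def roc_points(labels, scores):
    """Compute ROC curve points (illustrative sketch, assuming binary
    labels and no tied scores)."""
    pos = sum(labels)             # number of positive examples
    neg = len(labels) - pos       # number of negative examples
    # Sweep the decision threshold from the highest score downward.
    pairs = sorted(zip(scores, labels), reverse=True)
    points, tp, fp = [(0.0, 0.0)], 0, 0
    for score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))  # (FPR, TPR)
    return points

labels = [1, 1, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.3]
curve = roc_points(labels, scores)
```

The curve always starts at (0, 0) and ends at (1, 1); the closer it hugs the top-left corner, the better the classifier separates the two classes.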



Comments by Model Creator

  • All comments added by the creator of the model.

Comments by Model Manager

  • All comments added by the reviewer of the model.

Attached Document

  • A list of all documents attached to the model.