docs.validmind.ai/training
“As a validator who is new to ValidMind, I want to learn how to review model documentation, prepare my validation report, track issues, and submit my report for approval.”
First, let’s make sure you can log in to ValidMind.
Training is interactive — you explore ValidMind live. Try it!
To try out this course, you need to have been onboarded onto ValidMind Academy with the Validator role.
Log in to check your access:
Be sure to return to this page afterwards.
This is the ValidMind Platform.
From here, you have access to:
Try it live on the next page.
Evaluate the conceptual soundness, data preparation, model development, and ongoing monitoring and governance plans for the model.
The Document Overview shows a section-by-section outline of your model’s documentation, as well as summaries of:
To locate your document overview for a model:
In the left sidebar, click Model Inventory.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Documentation.
Try it live on the next page.
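For context, the outline you see in the Document Overview comes from the documentation template assigned to the model. If you also have developer access to the ValidMind Library, a minimal sketch like the one below previews the same section structure from the developer side; the connection details and model identifier are placeholders, and the Platform provides the real values for each registered model.

```python
import validmind as vm

# Connect the ValidMind Library to a model registered in the Platform.
# The host, key, secret, and model identifier below are placeholder assumptions.
vm.init(
    api_host="https://api.prod.validmind.ai/api/v1/tracking",
    api_key="YOUR_API_KEY",
    api_secret="YOUR_API_SECRET",
    model="YOUR_MODEL_IDENTIFIER",
)

# Render the section-by-section outline of the model's documentation template,
# the same structure shown in the Document Overview.
vm.preview_template()
```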
Have a question about the model? Collaborate with your developer right in the model documentation.
Try it live on the next page.
While working with content blocks in documentation, you can comment directly on specific portions of the text.
All users associated with a model, such as model developers and model validators, see a notification in their Recent Activity feed, accessible via the Dashboard, whenever a comment is posted.
In any section of the model documentation, select the portion of text you want to comment on, then click the comment button that appears.
Enter your comment and click Comment.
You can view the comment by clicking the highlighted text. Comments will also appear in the right sidebar.
Click the highlighted text to view the comment thread.
Enter your comment and click Reply.
You can view the comment thread by clicking the highlighted text.
Click the highlighted text portion to view the thread, then click the resolve button to resolve the thread.
To view the resolved comment thread, click the Comments archive button in the toolbar.
You can view a history of all archived comments in the Comment archive.
To reopen a comment thread, reply to the comment thread in the Comment archive or click the Reopen button that appears next to the highlighted text portion.
Locate the test results in the documentation, review the data, and identify issues with the model.
To locate your document overview for a model:
In the left sidebar, click Model Inventory.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Documentation.
Review the sections:
Try it live on the next page.
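The test results you review in these sections were logged by the model developer with the ValidMind Library. As background only, here is a minimal sketch of how a single result might be produced and logged; the dataset, target column, and choice of test are assumptions for illustration and are not part of this course.

```python
import pandas as pd
import validmind as vm

# Placeholder connection details; in practice the Platform provides the real
# values for the registered model.
vm.init(
    api_host="https://api.prod.validmind.ai/api/v1/tracking",
    api_key="YOUR_API_KEY",
    api_secret="YOUR_API_SECRET",
    model="YOUR_MODEL_IDENTIFIER",
)

# A hypothetical training dataset with a binary target column named "default".
df = pd.read_csv("training_data.csv")
vm_dataset = vm.init_dataset(
    dataset=df,
    input_id="training_dataset",
    target_column="default",
)

# Run one data-quality test and log the result to the model's documentation,
# where it appears as evidence a validator can review and link.
result = vm.tests.run_test(
    "validmind.data_validation.ClassImbalance",
    inputs={"dataset": vm_dataset},
)
result.log()
```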
Based on your review of the documentation, add some findings for your validation report.
Try it live on the next page.
As part of the validation process, you may find issues with the model documentation that must be resolved. To indicate that there is an issue and to track the resolution later on, you add a new finding.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Documentation.
You can now log a finding either on this overview page or from a specific documentation section. Both methods let you associate the finding with a documentation section.
Link your findings and the evidence from the test results you analyzed to the validation report.
(Scroll down for the full instructions.)
Try it live on the next page.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Validation Report and then locate 2. Validation.
You can now expand any subsection of the validation report you would like to work with.
For example, select 2.1.1. Assumptions.
In any section of the report where the button is available, click Link Finding to Report.
On the Link Finding to Report page that opens, select from the list of available findings, or create a new finding.
Click Update Linked Findings.
Confirm that the newly linked finding shown under Findings is accurate.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Validation Report and then locate 2. Validation.
You can now expand any subsection of the validation report you would like to work with.
For example, select 2.1.1. Assumptions.
In any section of the report where the button is available, click Link Evidence to Report.
On the Link Evidence to Validation Report page that opens, select the evidence that is related to your assessment.
If you are not sure if something is relevant, click to expand the section for more details.
Click Update Linked Evidence.
Confirm that the newly linked evidence shown under Developer Evidence is accurate.
Based on the evidence you analyzed and the findings you logged, assess your model's compliance with your guidelines.
(Scroll down for the full instructions.)
Try it live on the next page.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Validation Report and then locate 2. Validation.
You can now expand any subsection of the validation report you would like to work with.
For example, select 2.1.1. Assumptions.
In any section of the report where the assessment dropdown menu is available, select one of the options:
For example, to indicate that there is some compliance based on the evidence or findings you linked to:
Confirm that the compliance summary shown for each subsection under 2. Validation provides an accurate overview of the current qualitative and quantitative risk assessments.
As you prepare your report, review open or past due findings, close resolved ones, or add a mitigation plan.
(Scroll down for the full instructions.)
Try it live on the next page.
Select a model or find your model by applying a filter or searching for it.
In the left sidebar that appears for your model, click Model Findings.
Go through the open findings one by one:
If the finding has been addressed, update its status to Closed.
If part or all of the finding remains to be addressed:
Assess if the issue identified by the finding prevents the model from being approved.
Criteria include:
High-severity findings — Must be resolved before approval.
Medium-severity findings — May allow conditional approval with a mitigation plan.
Low-severity findings — Don’t prevent approval and are typically resolved post-approval with a mitigation plan.
Track open issues until all findings are resolved or a remediation plan is in place for post-approval issues.
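As an illustration only, the criteria above can be read as a simple decision rule. The sketch below is hypothetical and not part of ValidMind; the severity labels and return values are assumptions used to make the logic explicit.

```python
from typing import Iterable


def approval_recommendation(open_finding_severities: Iterable[str]) -> str:
    """Mirror the severity criteria above as a decision rule (illustrative only)."""
    severities = {s.lower() for s in open_finding_severities}
    if "high" in severities:
        # High-severity findings must be resolved before approval.
        return "blocked: resolve high-severity findings first"
    if "medium" in severities:
        # Medium-severity findings may allow conditional approval with a mitigation plan.
        return "conditional: approval possible with a mitigation plan"
    # Low-severity (or no) open findings do not prevent approval.
    return "approve: resolve remaining low-severity findings post-approval"


# Example: one medium-severity and two low-severity findings remain open.
print(approval_recommendation(["Medium", "Low", "Low"]))
# conditional: approval possible with a mitigation plan
```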
When you’re ready, verify the approval workflow, and then submit your validation report for approval.
(Scroll down for the full instructions.)
Try it live on the next page.
Workflow states and transitions are configured by an administrator in advance, but you should verify that the expected people are included in the approval process.
Select a model or find your model by applying a filter or searching for it.
On the landing page of your model, locate the model status section:
While your lifecycle statuses and workflows are custom to your organization, some examples are:
To transition through the approval workflow, all required workflow steps must be completed. For example, you cannot submit a validation report for review until the model documentation itself has been submitted.
Select a model or find your model by applying a filter or searching for it.
If an action is available to your role, you’ll see it listed under your model status on the model’s landing page.
While your lifecycle statuses and workflows are custom to your organization, some examples are:
There is more that ValidMind can do to help you prepare validation reports, from using your own template to configuring the full approval workflow.
Or, find your next learning resource on ValidMind Academy.