[Figure: Goal Structuring Notation (GSN) pattern for the interpretability safety case. Braced terms are pattern placeholders to be instantiated for a particular system.]

Top-level goal: The {required element} of the system is {interpretable}, in line with the safety case requirements, to the {intended audience} in the {intended context}. Context: {interpretability requirements}, {intended context}, {intended audience}, {required element}, {interpretable}.

Strategy ("How?" — Section 3): Argument over {interpretability requirements}, decomposed into four sub-goals:

1. "What?" (Section 4.1.2): The implemented {interpretability method} is appropriate for the {requirements}, i.e. the correct thing is being explained. Context: {interpretability method}. Strategy: argument over {interpretability methods}, supported by {explainability method evidence}.

2. "Why?" (Section 4.1.1): The {interpretability method} is faithful to the {model} process. Context: {ML model}. Strategy: argument over the faithfulness of the {interpretability method}, supported by {explainability method evidence}.

3. "When?" (Section 4.1.3): {Explanations} are produced at appropriate {times}. Context: {times that explanations are provided}. Strategy: argument over the appropriateness of {explanation} {times}, supported by {time evidence}.

4. "Who?" (Section 4.1.4): {Explanations} are appropriate for the {audience}. Context: {explanations}, {audience}. Strategy: argument over the appropriateness of {explanations} to the {audience}, supported by {audience evidence}.