Once you have uploaded cases (i.e., images) to your reader study, it is time to define the questions that participants will need to answer for each of those cases.
Go to Questions → Add a Question.
Question text: The question that will be presented to the reader. Keep it short, e.g. 'Is there pathology present in these images?'
Help text: This can be used to provide extra information or clarification to the reader about this question.
Answer type: The answer type defines the type of input for your question. Grand Challenge offers the following answer types:
Plain answer types:
- multiple choice
Annotation answer types:
- bounding box
- distance measurement
The annotation answer types are discussed in more detail here.
Required: A question can be mandatory or optional; check the "Required" box accordingly.
Image port: If the answer type is an annotation, you need to define on which image port the annotation should be created. The same image port will be used for every case.
Direction: The layout of the question. Vertical means that the question text goes above the answer box; horizontal means that the question text will be on the same row as the answer box.
Order: Determines the position of this question relative to the other questions.
Interface: If a default answer will be provided in the Display Sets, select the relevant Interface here. How to set default answers is discussed in more detail here.
💡Once a question has been answered by any participant, it can no longer be edited or deleted. You can make the questions editable again by removing all answers (Users progress → Remove Answers).
If you are setting up an Educational Reader Study, you also need to add the ground truth to monitor your users' performance. Go to Ground Truth and follow the instructions there. If ground truth has been added to a Reader Study, any answer given by a reader will be evaluated against the question-specific ground truth. The scores can then be compared on the leaderboard. Statistics are also available based on these scores: the average and total scores for each question, as well as for each case, are displayed in the statistics view.
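To illustrate the statistics described above, here is a minimal sketch of how per-question and per-case scores could be computed from readers' answers and the ground truth. The data layout and the scoring rule (1 point for an exact match, 0 otherwise) are assumptions for illustration only, not Grand Challenge's actual implementation.

```python
# Illustrative sketch: aggregate scores per question and per case.
# The scoring rule (exact match -> 1.0) is an assumption, not the
# platform's real scoring code.
from collections import defaultdict

def score_answers(answers, ground_truth):
    """answers: list of (case_id, question, reader_answer) tuples.
    ground_truth: dict mapping (case_id, question) -> correct answer."""
    per_question = defaultdict(list)
    per_case = defaultdict(list)
    for case_id, question, given in answers:
        score = 1.0 if given == ground_truth[(case_id, question)] else 0.0
        per_question[question].append(score)
        per_case[case_id].append(score)

    def summarize(groups):
        return {k: {"total": sum(s), "average": sum(s) / len(s)}
                for k, s in groups.items()}

    return summarize(per_question), summarize(per_case)

answers = [
    ("case-1", "Pathology present?", "Yes"),
    ("case-2", "Pathology present?", "No"),
]
truth = {
    ("case-1", "Pathology present?"): "Yes",
    ("case-2", "Pathology present?"): "Yes",
}
by_question, by_case = score_answers(answers, truth)
print(by_question["Pathology present?"]["average"])  # 0.5
```

In this toy run, the reader answered case-1 correctly and case-2 incorrectly, giving an average of 0.5 for the question and per-case averages of 1.0 and 0.0.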
💡It is not (yet) possible to include ground truth for annotation questions.