Commit 177731a6 by Diana Huang

Replace "student" with "learner".

parent 0cd40194
.. _PA Accessing Assignment Information:
##########################################
Accessing Assignment and Learner Metrics
##########################################
After your open response assessment assignment has been released, you can access information about the number of learners in each step of the assignment or the performance of individual learners. This information is available in the **Course Staff Information** section at the end of each assignment. To access it, open the assignment in the courseware, scroll to the bottom of the assignment, and then click the black **Course Staff Information** banner.
.. image:: /Images/PA_CourseStaffInfo_Collapsed.png
:alt: The Course Staff Information banner at the bottom of the peer assessment
View Metrics for Individual Steps
************************************************
You can check the number of learners who have completed, or are currently working through, the following steps:
* Submitted responses.
* Completed peer assessments.
To find this information, open the assignment in the courseware, scroll to the bottom of the assignment, and then click **Course Staff Information**.
The **Course Staff Information** section expands, and you can see the number of learners who are currently working through (but have not completed) each step of the problem.
.. image:: /Images/PA_CourseStaffInfo_Expanded.png
:alt: The Course Staff Information box expanded, showing problem status
.. _Access Information for a Specific Learner:
***********************************************
Access Information for a Specific Learner
***********************************************
You can access information about an individual learner's performance on a peer assessment assignment, including:
* The learner's response.
* The peer assessments that other learners performed on the learner's response, including feedback on individual criteria and on the overall response.
* The peer assessments that the learner performed on other learners' responses, including feedback on individual criteria and on the overall responses.
* The learner's self assessment.
In the following example, you can see the learner's response. The response received one peer assessment, and the learner completed a peer assessment on one other learner's response. The learner also completed a self assessment.
.. image:: /Images/PA_SpecificStudent.png
:width: 500
:alt: Report showing information about a learner's response
For an example that shows a learner's response with more assessments, see :ref:`Access Learner Information`.
Accessing information about a specific learner has two steps:
#. Determine the learner's course-specific anonymized ID.
#. Access information for that learner.
=====================================================
Determine the Learner's Course-Specific Anonymized ID
=====================================================
To determine a learner's course-specific anonymized ID, you'll need two .csv spreadsheets from the Instructor Dashboard: the grade report (**<course name>_grade_report_<datetime>.csv**) and the list of course-specific anonymized learner IDs (**<course name>-anon-ids.csv**).
#. In the LMS, click the **Instructor** tab.
#. On the Instructor Dashboard, click **Data Download**.
.. note:: Generating a grade report for a large class may take several hours.
5. When the link to the grade report appears in the **Reports Available for Download** list, click the link to open the spreadsheet.
#. When you have both spreadsheets open, view the **<course name>_grade_report_<datetime>.csv** spreadsheet. Locate the learner that you want by username or e-mail address. Make a note of the number in the ID column (column A) for that learner. In the following example, the learner ID for e-mail address ``amydorrit@example.com`` (username ``lildorrit``) is ``18557``.
.. image:: /Images/PA_grade_report.png
:width: 500
:alt: Spreadsheet listing enrolled learners and grades
7. Go to the **<course name>-anon-ids.csv** spreadsheet, locate the user ID that you noted in step 6, and then copy the value in the "Course Specific Anonymized user ID" column (**column C**) for the user. The value in column C is the student's anonymized user ID for the course. In the following example, the anonymized user ID for student ID ``18557`` is ``ofouw6265242gedud8w82g16qshsid87``.
7. Go to the **<course name>-anon-ids.csv** spreadsheet, locate the user ID that you noted in step 6, and then copy the value in the "Course Specific Anonymized user ID" column (**column C**) for the user. The value in column C is the learner's anonymized user ID for the course. In the following example, the anonymized user ID for learner ID ``18557`` is ``ofouw6265242gedud8w82g16qshsid87``.
.. image:: /Images/PA_anon_ids.png
:width: 500
:alt: Spreadsheet listing learners' anonymous user IDs
.. note:: Make sure that you don't copy the value in column B. You need the *course-specific* anonymized user ID from **column C**.
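If you need to look up IDs for more than a few learners, steps 6 and 7 can also be scripted. The sketch below relies only on the column positions described above (the user ID in column A of the grade report, the course-specific anonymized user ID in column C of the anon-ids file); the file names and the ``find_course_anon_id`` helper are examples, so substitute the names of your own downloaded reports.

.. code-block:: python

    import csv

    def find_course_anon_id(email, grade_report_path, anon_ids_path):
        """Return a learner's course-specific anonymized user ID."""
        # Step 6: find the learner's user ID (column A) in the grade report.
        with open(grade_report_path, newline='') as grade_report:
            rows = csv.reader(grade_report)
            next(rows)  # skip the header row
            user_id = next(row[0] for row in rows if email in row)

        # Step 7: look up that user ID in the anon-ids file and return the
        # course-specific anonymized user ID (column C).
        with open(anon_ids_path, newline='') as anon_ids:
            rows = csv.reader(anon_ids)
            next(rows)  # skip the header row
            return next(row[2] for row in rows if row[0] == user_id)

    # Using the example values above:
    # find_course_anon_id('amydorrit@example.com',
    #                     'MyCourse_grade_report_2014-02-01.csv',
    #                     'MyCourse-anon-ids.csv')
    # returns 'ofouw6265242gedud8w82g16qshsid87'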
.. _Access Learner Information:
=======================================
Access the Learner's Information
=======================================
#. In the LMS, go to the peer assessment assignment that you want to see.
#. Scroll to the bottom of the problem, and then click the black **Course Staff Information** banner.
#. Scroll down to the **Get Learner Info** box, paste the learner's course-specific anonymized user ID in the box, and then click **Submit**.
The learner's information appears below the **Get Learner Info** box.
The following example shows:
* The learner's response.
* The two peer assessments for the response.
* The two peer assessments the learner completed.
* The learner's self assessment.
For a larger view, click the image so that it opens by itself in the browser window, and then click anywhere on the image that opens.
.. image:: /Images/PA_SpecificStudent_long.png
:width: 250
:alt: Report showing information about a learner's response
.. _PA Allow Images:
============================================
Allow Learners to Submit Images (optional)
============================================
To allow learners to submit an image with a response:
#. In the component editor, click the **Settings** tab.
#. Next to **Allow Image Responses**, select **True**.
.. note::
* The image file must be a .jpg or .png file, and it must be smaller than 5 MB in size.
* Currently, course teams cannot see any of the images that learners submit. Images are not visible in the body of the assignment in the courseware, and they are not included in the course data package.
* You can allow learners to upload an image, but you cannot require it.
* Learners can only submit one image with a response.
* All responses must contain text. Learners cannot submit a response that contains only an image.
.. _PA Add Rubric:
Step 3. Add the Rubric
******************************
In this step, you'll add your rubric and provide your learners with feedback options.
For each step below, replace any default text with your own text.
.. note:: All open response assessments include a feedback field below the rubric so that learners can provide written feedback on a peer's overall response. You can also allow or require learners to provide feedback for individual criteria. See step 2.4 below for instructions. For more information, see :ref:`Feedback Options`.
To add the rubric:
#. Repeat step 4 for each option. If you need to add more options, click **Add Option**.
#. Next to **Feedback for This Criterion**, select a value in the drop-down list.
* If you don't want learners to provide feedback for this individual criterion, select **None**.
* If you want to require learners to provide feedback, select **Required**.
* If you want to allow learners to provide feedback, but not require it, select **Optional**.
7. Follow the instructions in steps 2-6 to add your remaining criteria. If you need to add more criteria, click **Add Criterion** at the end of the list of criteria.
#. Include instructions for learners to provide overall written feedback on their peers' responses. You can leave the default text in the **Feedback Instructions** field or replace it with your own text.
.. _PA Criteria Comment Field Only:
Step 4. Specify the Assignment Name and Response Dates
************************************************************
To specify a name for the assignment as well as start and due dates for all learner responses:
#. In the component editor, click the **Settings** tab.
#. Next to **Display Name**, type the name you want to give the assignment.
#. Next to **Response Start Date** and **Response Start Time**, enter the date and time when you want learners to be able to begin submitting responses. Note that all times are in Coordinated Universal Time (UTC).
#. Next to **Response Due Date** and **Response Due Time**, enter the date and time by which all learner responses must be submitted. Note that all times are in Coordinated Universal Time (UTC).
.. note:: We recommend that you set the response due date and time at least two days before the peer assessment due date and time. If the response due time and peer assessment due time are close together, and a learner submits a response just before responses are due, other learners may not have time to perform peer assessments before peer assessments are due.
.. _PA Select Assignment Steps:
Step 5. Select Assignment Steps
****************************************
Open response assessment assignments can include learner training, peer assessment, and self assessment steps. You can include a peer assessment step before a self assessment step and vice versa.
If you include a learner training step, you **must** include a peer assessment step. You can also include a self assessment step. The learner training step must come before both the peer assessment and the self assessment step.
To add steps to the assignment:
#. Scroll down past the **Allow Image Responses** field.
#. Locate the following headings:
* **Step: Learner Training**
* **Step: Peer Assessment**
* **Step: Self Assessment**
Select the check boxes for the steps that you want the assignment to include.
#. (optional) If you want to change the order of the steps, drag the steps into the order that you want. If you include a learner training step, make sure it is the first step in the assignment.
.. _PA Specify Step Settings:
.. note:: If you make changes to a step, but then you clear the check box for that step, the step will no longer be part of the assignment and your changes will not be saved.
.. _PA Learner Training Step:
========================
Learner Training
========================
For the learner training step, you'll enter one or more responses that you have created, then select an option for each criterion in your rubric.
.. note:: You must enter your complete rubric on the **Rubric** tab before you can select options for the learner training responses. If you later change one of your criteria or any of its options, you'll also have to update the learner training step.
To add and score learner training responses:
#. Under **Step: Learner Training**, locate the first **Scored Response** section.
#. In the **Response** field, enter the text of your example response.
#. Under **Response Score**, select the option that you want for each criterion.
For more information, see :ref:`PA Learner Training Assessments`.
============================
Peer Assessment
============================
For the peer assessment step, you'll specify the number of responses that each learner must grade, the number of learners that must grade each response, and start and due dates. All fields are required.
To specify peer assessment settings:
#. Locate the **Step: Peer Assessment** heading.
#. Next to **Must Grade**, enter the number of responses that each learner must grade.
#. Next to **Graded By**, enter the number of learners that must grade each response.
#. Next to **Start Date** and **Start Time**, enter the date and time when learners can begin assessing their peers' responses. All times are in Coordinated Universal Time (UTC).
#. Next to **Due Date** and **Due Time**, enter the date and time by which all peer assessments must be complete. All times are in UTC.
============================
Self Assessment
============================
For the self assessment step, you'll specify when the step starts and ends.
#. Locate the **Step: Self Assessment** heading.
#. Next to **Start Date** and **Start Time**, enter the date and time when learners can begin assessing their own responses. All times are in Coordinated Universal Time (UTC).
#. Next to **Due Date** and **Due Time**, enter the date and time by which all self assessments must be complete. All times are in UTC.
.. _PA Show Top Responses:
Step 7. Show Top Responses
******************************
To allow learners to see the top-scoring responses for the assignment, you'll specify a number on the **Settings** tab.
#. In the component editor, click the **Settings** tab.
#. In the **Top Responses** field, specify the number of responses that you want to appear in the **Top Responses** section below the learner's final score. If you don't want this section to appear, set the number to 0. The maximum number is 100.
.. note:: Because each response can be up to 300 pixels in height, we recommend that you set this number to 20 or lower to prevent the page from becoming too long.
###########################
Open Response Assessments
###########################
Introduction to Open Response Assessments
*****************************************
Open response assessments allow instructors to assign questions that may not have definite answers. Learners submit a response to the question, and then that learner and the learner's peers compare the response to a rubric that you create. Usually learners will submit text responses. You can also allow your learners to upload an image to accompany the text.
Open response assessments include peer assessments and self assessments. In peer assessments, learners compare their peers' responses to a rubric that you create. In self assessments, learners compare their own responses to the rubric.
In open response assessments, learners usually only see their own responses and any peer responses they assess. You can also allow learners to see the top-scoring responses that their peers have submitted. For more information, see :ref:`PA Top Responses`.
For more information about creating open response assessments, including step-by-step instructions, see the following sections:
* The prompt, or question.
* The rubric.
* One or more assessment steps. Assignments can include a learner training step, a peer assessment step, and a self assessment step.
.. note:: If you include a learner training step, you must also add a peer assessment step. The learner training step must be the first step.
For step-by-step instructions for creating an open response assessment, see :ref:`PA Create a PA Assignment`.
Prompt
************************
The **prompt**, or question that you want your learners to answer, appears near the top of the page, followed by a field where the learner enters a response. You can require your learners to enter text as a response, or you can allow your learners to both enter text and upload an image.
.. image:: /Images/PA_QandRField.png
:width: 500
:alt: ORA question and blank response field
.. note:: If learners upload an image, the image file must be a .jpg or .png file, and it must be smaller than 5 MB in size.
When you write your question, you can include helpful information for your learners, such as what learners can expect after they submit responses and the approximate number of words or sentences that a learner's response should have. (A response cannot have more than 10,000 words.)
For more information, see :ref:`PA Add Prompt`.
==========================================
Asking Learners to Upload Images
==========================================
You can ask your learners to upload an image as part of their response. If you do this, however, keep the following in mind:
* Currently, you cannot require your learners to upload an image. You can only allow it.
* All responses must include some text. Learners cannot submit a response that only contains an image.
* Learners can only submit one image with their response.
.. note:: Currently, course teams cannot see any of the images that learners submit. Images are not visible in the body of the assignment in the courseware, and they are not included in the course data package.
.. _PA Rubric:
Rubric
************************
Your assignment must include a **rubric** that you design. The same rubric is used for peer and self assessments, and the rubric appears when learners begin grading. Learners compare their peers' responses to the rubric.
Rubrics are made of *criteria* and *options*.
* Each criterion has a *name*, a *prompt*, and one or more *options*.
* The name is a very short summary of the criterion, such as "Ideas" or "Content". Criterion names generally have just one word. Because the system uses criterion names for identification, **the name for each criterion must be unique.** Criterion names do not appear in the rubric that learners see when they are completing peer assessments, but they do appear on the page that shows the learner's final grade.
.. image:: /Images/PA_CriterionName.png
:alt: A final score page with call-outs for the criterion names
Different criteria in the same assignment can have different numbers of options. For example, in the image above, the first criterion has three options and the second criterion has four options.
.. note:: You can also include criteria that do not have options, but that do include a field where learners can enter feedback. For more information, see :ref:`PA Criteria Comment Field Only`.
You can see both criterion and option names when you access assignment information for an individual learner. For more information, see :ref:`PA Accessing Assignment Information`.
.. image:: /Images/PA_Crit_Option_Names.png
:width: 600
:alt: Learner-specific assignment information with call-outs for criterion and option names
When you create your rubric, decide how many points each option will receive, and make sure that the explanation for each option is as specific as possible. For example, one criterion and set of options may resemble the following.
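The criterion name, prompt, options, point values, and explanations in the sketch below are invented, but they show the shape that every criterion follows:

.. code-block:: python

    # An invented example of a single rubric criterion. Criterion names must be
    # unique; each option pairs an explanation with a point value.
    criterion = {
        "name": "Ideas",
        "prompt": "Does the response present a clear and original thesis?",
        "options": [
            {"name": "Poor", "points": 0,
             "explanation": "No recognizable thesis."},
            {"name": "Fair", "points": 3,
             "explanation": "A thesis is present, but it is vague or unoriginal."},
            {"name": "Good", "points": 5,
             "explanation": "A clear, specific, and original thesis."},
        ],
    }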
Assessment Steps
************************
In your assignment, you'll also specify the **assessment steps**. You can set the assignment to include a learner training step, a peer assessment step, and a self assessment step.
You can see the type and order of the assessments when you look at the assignment. In the following example, after learners submit a response, they complete a learner training step ("Learn to Assess Responses"), complete peer assessments on other learners' responses ("Assess Peers"), and then complete self assessments ("Assess Your Response").
.. image:: /Images/PA_AsmtWithResponse.png
:alt: Image of peer assessment with assessment steps and status labeled
:width: 600
.. note:: If you include a learner training step, you must also include a peer assessment step. The learner training step must come before peer and self assessment steps.
.. _PA Learner Training Assessments:
========================
Learner Training Step
========================
When you create a peer assessment assignment, you can include one or more learner training assessments to help learners learn to perform their own assessments. A learner training assessment contains one or more sample responses that you write, together with the scores that you would give the sample responses. Learners review these responses and try to score them the way that you scored them.
.. note:: If you include a learner training step, you must also include a peer assessment step. The learner training step must come before peer and self assessment steps.
In a learner training assessment, the **Learn to Assess Responses** step opens immediately after a learner submits a response. The learner sees one of the sample responses that you created, along with the rubric. The scores that you gave the response do not appear. The learner also sees the number of sample responses that they will assess.
.. image:: Images/PA_TrainingAssessment.png
:alt: Sample training response, unscored
:width: 500
The learner selects an option for each of the assignment's criteria, and then clicks **Compare your selections with the instructor's selections**. If all of the learner's selections match the instructor's selections, the next sample response opens automatically.
If any of the learner's selections differs from the instructor's selections, the learner sees the response again, and the following message appears above the response:
.. code-block:: xml
response and consider why the instructor may have assessed it differently. Then, try
the assessment again.
For each of the criteria, the learner sees one of the following two messages, depending on whether the learner's selections matched those of the instructor:
.. code-block:: xml
Selected Options Agree
The option you selected is the option that the instructor selected.
For example, the following learner chose one correct option and one incorrect option.
.. image:: /Images/PA_TrainingAssessment_Scored.png
:alt: Sample training response, scored
:width: 500
The learner continues to try scoring the sample response until the learner's scoring for all criteria matches the instructor's scoring.
For more information, see :ref:`PA Learner Training Step`.
=====================
Peer Assessment Step
=====================
In the peer assessment step, learners review other learners' responses and select an option for each criterion in your rubric based on the response. Learners can also provide text feedback, or comments, on the response.
Number of Responses and Assessments
************************************
When you specify a peer assessment step, you'll specify the **number of responses** each learner has to assess and the **number of peer assessments** each response has to receive.
.. note:: Because some learners may submit a response but not complete peer assessments, some responses may not receive the required number of assessments. To increase the chance that all responses will receive enough assessments, you must set the number of responses that learners have to assess to be higher than the number of assessments that each response must undergo. For example, if you require each response to receive three assessments, you could require each learner to assess five responses.
If all responses have received assessments, but some learners haven't completed the required number of peer assessments, those learners can assess responses that other learners have already assessed. The learner who submitted the response sees the additional peer assessments when viewing their score. However, the additional peer assessments do not count toward the score that the response receives.
.. _Feedback Options:
Feedback Options
****************
By default, learners see a single comment field below the entire rubric. You can also add a comment field to an individual criterion or to several individual criteria. This comment field can contain up to 300 characters.
The comment field appears below the options for the criterion. In the following image, both criteria have a comment field. There is also a field for overall comments on the response.
Peer Assessment Scoring
***********************
Peer assessments are scored by criteria. An individual criterion's score is the median of the scores that each peer assessor gave that criterion. For example, if the Ideas criterion in a peer assessment receives a 10 from one learner, a 7 from a second learner, and an 8 from a third learner, the Ideas criterion's score is 8.
A learner's final score for a peer assessment is the sum of the median scores for each individual criterion.
For example, a response may receive the following scores from peer assessors:
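The scores in the sketch below are invented, but they show how the per-criterion medians and the final score are computed:

.. code-block:: python

    from statistics import median

    # Invented example: each criterion maps to the scores given by three peers.
    peer_scores = {
        "Ideas": [10, 7, 8],
        "Content": [8, 9, 10],
        "Grammar": [5, 4, 4],
    }

    # Each criterion's score is the median of the scores it received ...
    criterion_scores = {name: median(scores) for name, scores in peer_scores.items()}
    # ... and the final score is the sum of those medians.
    final_score = sum(criterion_scores.values())

    print(criterion_scores)  # {'Ideas': 8, 'Content': 9, 'Grammar': 4}
    print(final_score)       # 21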
Assessing Additional Responses
********************************
Learners can assess more than the required number of responses. After a learner completes the peer assessment step, the step "collapses" so that just the **Assess Peers** heading is visible.
.. image:: /Images/PA_PAHeadingCollapsed.png
:width: 500
:alt: The peer assessment step with just the heading visible
If the learner clicks the **Assess Peers** heading, the step expands. The learner can then click **Continue Assessing Peers**.
.. image:: /Images/PA_ContinueGrading.png
:width: 500
Self Assessment Step
=====================
In self assessments, the learner sees their response followed by your rubric. As with peer assessments, the learner compares the rubric to their response and selects an option for each of the criteria.
If you include both peer and self assessments, we recommend that you include the peer assessment before the self assessment.
Top Responses
*****************************
You can include a **Top Responses** section that shows the top-scoring responses that learners have submitted for the assignment, along with the scores for those responses. The **Top Responses** section appears below the learner's score information after the learner finishes every step in the assignment.
.. image:: /Images/PA_TopResponses.png
:alt: Section that shows the text and scores of the top three responses for the assignment
:width: 500
You can allow the **Top Responses** section to show between 1 and 100 responses. Keep in mind, however, that each response may be up to 300 pixels in height in the list. (For longer responses, learners can scroll to see the entire response.) We recommend that you specify 20 or fewer responses to prevent the page from becoming too long.
.. note:: It may take up to an hour for a high-scoring response to appear in the **Top Responses** list.
.. _PA for Learners:
###########################################
Open Response Assessments for Learners
###########################################
You may want to let your learners know what to expect when they complete open response assessments. This guide walks learners through each step of the process.
**************************************************
Learner Introduction to Open Response Assessments
**************************************************
In an open response assessment, you'll provide a response to a question that may not have a simple or definitive answer. Some open response assessments have asked learners to submit written responses, videos of speeches, and computer code.
Open response assessments may include a peer assessment, a self assessment, or both. With a peer assessment, you'll assess, or grade, responses that several of your peers have submitted, and several of your peers will assess your response. With a self assessment, you'll assess your own response. To assess a response, you'll compare the response to a rubric that the instructor provides.
Some instructors create a **Top Responses** section that shows the top-scoring responses for the assignment and the scores that these responses received. If an instructor creates this section, you can see it below your score after you've completed each step of the assignment.
************************
Learner Instructions
************************
When you come to an open response assessment in the course, you'll see the question and a response field. After you submit your response, you'll assess some of your peers' responses, your own response, or both, depending on the assignment. You can see the steps that your assignment includes below the response field.
:alt: Open response assessment example with question, response field, and assessment types and status labeled
:width: 550
Here, we'll walk you through the process of completing an open response assessment that includes a learner training step, a peer assessment, and a self assessment:
#. Submit your response to a question.
#. Learn to assess responses.
#. Assess responses that other learners have submitted.
#. Assess your own response to the question.
#. Receive your score and provide feedback on the peer assessment.
After you compose a response, type it into the response field under **Your Response**, and then click **Submit your response and move to the next step**. If you can't finish your response all at once, you can click **Save Your Progress** to save a draft of your response, and then come back and submit it later.
After you submit your response, if other learners have already submitted responses, the peer assessment step starts immediately. However, you don't have to start grading right away. If you want to stop working and come back later, just refresh or reopen your browser when you come back. New peer responses will be available for you to grade.
If no other learners have submitted responses yet, you'll see the following message:
.. code-block:: xml
Waiting for Peer Responses
All submitted peer responses have been assessed. Check back later to see if more learners
have submitted responses. You'll receive your grade after you complete the peer assessment
and self assessment steps, and after your peers have assessed your response.
Selected Options Agree
The option you selected is the option that the instructor selected.
In the following example, the learner chose one correct option and one incorrect option.
.. image:: /Images/PA_TrainingAssessment_Scored.png
:alt: Sample training response, scored
Assess Peer Responses
=====================
When peer assessment starts, you'll see the original question, another learner's response, and the rubric for the assignment. Above the response you can see how many responses you'll assess and how many you've already assessed.
.. image:: /Images/PA_S_PeerAssmt.png
:alt: In-progress peer assessment
:width: 500
You'll assess these responses by selecting options in the rubric, the same way you assessed the sample responses in the "learn to assess responses" step. Additionally, this step has a field below the rubric where you can provide comments about the student's response.
You'll assess these responses by selecting options in the rubric, the same way you assessed the sample responses in the "learn to assess responses" step. Additionally, this step has a field below the rubric where you can provide comments about the learner's response.
.. note:: Some assessments may have an additional **Comments** field for one or more of the assessment's individual criteria. You can enter up to 300 characters in these fields. In the following image, both criteria have a **Comments** field. There is also a field for overall comments on the response.
After you've selected options in the rubric and provided additional comments about the response in this field, click **Submit your assessment and move to response #<number>**.
When you submit your assessment of the first learner's response, another response opens for you. Assess this response in the same way that you assessed the first response, and then submit your assessment. You'll repeat these steps until you've assessed the required number of responses. The number in the upper-right corner of the step is updated as you assess each response.
Assess Additional Peer Responses
********************************
Receive Your Score and Provide Feedback
==========================================
After you submit your self assessment, if other learners are still assessing your response, you'll see the following message under the **Assess Your Response** step.
.. code-block:: xml
When peer assessment is complete, you can see the scores you received from all of your peers, as well as your self assessment. You can also see any additional comments that your peers have provided.
.. image:: /Images/PA_AllScores.png
:alt: A learner's response with peer and self assessment scores
:width: 550
If you want to, you can provide feedback on the scores that you received under **Provide Feedback on Peer Assessments**.
Peer Assessment Scoring
***********************
Peer assessments are scored by criteria. An individual criterion's score is the *median*, not average, of the scores that each peer assessor gave that criterion. For example, if the Ideas criterion in a peer assessment receives a 10 from one learner, a 7 from a second learner, and an 8 from a third learner, the Ideas criterion's score is 8.
Your final score for a peer assessment is the sum of the median scores for each individual criterion.
Example-Based Assessment (AI)
*****************************
.. automodule:: openassessment.assessment.api.ai
:members:
Learner Training
****************
.. automodule:: openassessment.assessment.api.student_training
Overview
--------
In this document, we describe the architecture for:
* Training a classifier using a supervised machine learning algorithm.
* Grading learner essays using a trained classifier.
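As a rough illustration of these two operations, the sketch below trains a toy per-criterion classifier with scikit-learn and uses it to score a new essay. The data, features, and algorithm are placeholders, not the grading backend this document describes.

.. code-block:: python

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Training data: instructor-scored example essays for one rubric criterion.
    essays = [
        "A clear thesis supported by specific evidence and analysis.",
        "Some good points, but the argument wanders and repeats itself.",
        "No real argument is made; the response restates the prompt.",
    ]
    scores = [2, 1, 0]  # option points the instructor assigned to each essay

    # Training: fit a supervised model on the scored examples.
    classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
    classifier.fit(essays, scores)

    # Grading: predict a score for a new, unscored essay.
    new_essay = "The essay argues a focused thesis and supports it with detail."
    predicted_score = classifier.predict([new_essay])[0]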
Both training and grading require more time than is acceptable within the
request-response cycle of a web application. Therefore, both
Requirements
------------
* Grading tasks *must* be completed within hours after being scheduled.
Ideally, the delay would be within several minutes, but learners could
tolerate longer delays during periods of high usage or failure recovery.
The AI Grading API does not implement deadlines, so if a submission
is submitted for grading (allowed when the problem is open),
the learner will receive a grade for the AI assessment step.
* Grading task queues must tolerate periods of high usage,
as the number of submissions will likely increase when
* **AI Grading API**: An API that encapsulates all interactions with AI-grading database models and the task queue. All inputs and outputs are JSON-serializable, so the API calls can be made in-process (likely the initial implementation) or through the network.
* **Submission**: An essay submitted by a student to a problem in a course.
* **Submission**: An essay submitted by a learner to a problem in a course.
* **Assessment**: Specifies the scores a submission received for each criterion in a rubric.
......@@ -129,9 +129,9 @@ Parameter: AI Grading Workflow ID
Procedure:
1. A student submits an essay, creating a **submission** in the database.
1. A learner submits an essay, creating a **submission** in the database.
2. The student updates the workflow, and the **Workflow API** uses the **AI Grading API** to:
2. The learner updates the workflow, and the **Workflow API** uses the **AI Grading API** to:
a. Retrieve the most recent **ClassifierSet** for the current rubric definition (possibly none if training hasn't yet finished).
b. Create an **AI Grading Workflow** record in the database, associated with a Submission ID and **ClassifierSet**.
......@@ -141,8 +141,8 @@ Procedure:
a. Retrieve the submission and classifiers from persistent storage or a cache.
i. If the **ClassifierSet** is null, then the classifier wasn't available when the student created the submission.
ii. Since we cannot grade the student without a classifier, we create the **AI Grading Workflow** record but do not schedule the **Grading Task**. This means that the workflow will not be marked complete.
i. If the **ClassifierSet** is null, then the classifier wasn't available when the learner created the submission.
ii. Since we cannot grade the learner without a classifier, we create the **AI Grading Workflow** record but do not schedule the **Grading Task**. This means that the workflow will not be marked complete.
iii. When a **Training Task** completes, update incomplete **Grading Tasks** that have null **ClassifierSets** with the newly created **ClassifierSet**, then schedule the **Grading Tasks**.
b. **Optimization**: Check whether a completed **AI Grading Workflow** exists for this submission using the same **ClassifierSet**.
......@@ -156,7 +156,7 @@ Procedure:
e. Create an **AssessmentPart** for each rubric criterion, containing the score assigned by the classifier for that criterion.
f. Mark the **AI Grading Workflow** as complete by associating the **Assessment** with the workflow.
4. When a student checks the status of the submission, the **AI Grading API**:
4. When a learner checks the status of the submission, the **AI Grading API**:
a. Queries the database for the latest **AI Grading Workflow** matching the submission.
b. Reports whether the workflow is started or complete.
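The procedure above can be condensed into a short sketch. This is a hypothetical illustration of the flow, not the actual AI Grading API: every name in it (GradingWorkflow, start_grading, and so on) is assumed for the example, and the task queue is replaced by a direct call.

.. code-block:: python

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GradingWorkflow:
        submission_uuid: str
        classifier_set: Optional[dict]        # None if training has not finished yet
        assessment: Optional[dict] = None     # set when grading completes

    WORKFLOWS = {}                            # stand-in for the database

    def schedule_grading_task(workflow):
        """Stand-in for enqueueing the asynchronous Grading Task (step 3)."""
        scores = {"Ideas": 3}                          # pretend classifier output
        workflow.assessment = {"parts": scores}        # step 3.f: mark the workflow complete

    def start_grading(submission_uuid, latest_classifier_set):
        """Step 2: record the workflow; schedule grading only if classifiers exist."""
        workflow = GradingWorkflow(submission_uuid, latest_classifier_set)
        WORKFLOWS[submission_uuid] = workflow
        if workflow.classifier_set is not None:
            schedule_grading_task(workflow)
        return workflow

    def on_training_complete(new_classifier_set):
        """Step 3.a.iii: backfill workflows created before training finished."""
        for workflow in WORKFLOWS.values():
            if workflow.classifier_set is None and workflow.assessment is None:
                workflow.classifier_set = new_classifier_set
                schedule_grading_task(workflow)

    def grading_status(submission_uuid):
        """Step 4: report whether the workflow is started or complete."""
        workflow = WORKFLOWS.get(submission_uuid)
        if workflow is None:
            return "not started"
        return "complete" if workflow.assessment is not None else "started"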
......
......@@ -6,9 +6,9 @@ Understanding the Workflow
The `openassessment.workflow` application is tasked with managing the overall
life-cycle of a student's submission as it goes through various evaluation steps
life-cycle of a learner's submission as it goes through various evaluation steps
(e.g. peer assessment, self assessment). A new workflow entry is created as soon
as the student submits their response to a question, and it is initialized with
as the learner submits their response to a question, and it is initialized with
the steps (and step order) at that time.
Canonical Status
......@@ -16,7 +16,7 @@ Canonical Status
`AssessmentWorkflow` model is not the canonical status. This is because the
determination of what we need to do in order to be "done" is specified by the
OpenAssessmentBlock problem definition and can change. So every time we are
asked where the student is, we have to query the assessment APIs (peer, self,
asked where the learner is, we have to query the assessment APIs (peer, self,
AI, etc.) with the latest requirements (e.g. "number of submissions you have
to assess = 5"). The "status" field on this model is an after-the-fact
recording of the last known state of that information so we can search
......@@ -48,9 +48,9 @@ Isolation of Assessment types
`submission_uuid`, repeated calls to this function should return the same
thing.
`on_init(submission_uuid)`
Notification to the API that the student has submitted a response.
Notification to the API that the learner has submitted a response.
`on_start(submission_uuid)`
Notification to the API that the student has started the assessment step.
Notification to the API that the learner has started the assessment step.
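A hypothetical assessment API module exposing these hooks might look like the sketch below; only the hook names come from the text, and the bookkeeping is assumed for illustration.

.. code-block:: python

    # Hypothetical module satisfying the notification hooks described above.
    _submitted = set()
    _started = set()

    def on_init(submission_uuid):
        """The learner has submitted a response for this problem."""
        _submitted.add(submission_uuid)

    def on_start(submission_uuid):
        """The learner has started working on this assessment step."""
        _started.add(submission_uuid)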
In the long run, it could be that `OpenAssessmentBlock` becomes a wrapper
that talks to child XBlocks via this kind of API, and that each child would
......@@ -80,7 +80,7 @@ Simple Order/Dependency Assumptions
for the next six months or so.
Steps Stay Completed
In the interests of not surprising/enraging students, once a step is complete,
In the interests of not surprising/enraging learners, once a step is complete,
it stays complete. So if peer grading requires two assessors and a particular
submission meets that threshold, it will be considered complete at that point
in time. Raising the threshold to three required assessors in the future will
......@@ -108,5 +108,5 @@ Django settings
* `ORA2_ASSESSMENTS`: a `dict` mapping assessment names to the Python module path
of the corresponding assessment API.
* `ORA2_ASSESSMENT_SCORE_PRIORITY`: a `list` of assessment names that determine
which assessment type is used to generate a student's score.
which assessment type is used to generate a learner's score.
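For instance, a Django settings module might define these two settings as in the sketch below; the module paths are illustrative and may not match the exact paths in your ORA2 version.

.. code-block:: python

    # Illustrative values only; consult your ORA2 release for the real module paths.
    ORA2_ASSESSMENTS = {
        'peer': 'openassessment.assessment.api.peer',
        'self': 'openassessment.assessment.api.self',
        'training': 'openassessment.assessment.api.student_training',
    }

    # Names earlier in the list win when several assessment types could supply the score.
    ORA2_ASSESSMENT_SCORE_PRIORITY = ['peer', 'self']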
......@@ -4,7 +4,7 @@
Migrating AI Problems
---------------------
ORA2 supports AI assessment for student responses, but currently does not support authoring of AI problems. In order to migrate an existing AI assessment problem into ORA2, you will need to:
ORA2 supports AI assessment for learner responses, but currently does not support authoring of AI problems. In order to migrate an existing AI assessment problem into ORA2, you will need to:
1. Create a problem with example-based assessment enabled.
......@@ -53,4 +53,4 @@ ORA2 supports AI assessment for student responses, but currently does not suppor
.. image:: course_staff_ai.png
3. At this point, students can submit essays and receive grades.
3. At this point, learners can submit essays and receive grades.
......@@ -135,9 +135,9 @@ def create_assessment(
if submission['student_item']['student_id'] != user_id:
msg = (
u"Cannot submit a self-assessment for the submission {uuid} "
u"because it was created by another student "
u"(submission student ID {student_id} does not match your "
u"student id {other_id})"
u"because it was created by another learner "
u"(submission learner ID {student_id} does not match your "
u"learner id {other_id})"
).format(
uuid=submission_uuid,
student_id=submission['student_item']['student_id'],
......@@ -267,7 +267,6 @@ def get_assessment(submission_uuid):
return serialized_assessment
def get_assessment_scores_by_criteria(submission_uuid):
"""Get the median score for each rubric criterion
......@@ -313,7 +312,7 @@ def _log_assessment(assessment, submission):
"""
logger.info(
u"Created self-assessment {assessment_id} for student {user} on "
u"Created self-assessment {assessment_id} for learner {user} on "
u"submission {submission_uuid}, course {course_id}, item {item_id} "
u"with rubric {rubric_content_hash}"
.format(
......
......@@ -81,7 +81,7 @@ def on_start(submission_uuid):
StudentTrainingWorkflow.create_workflow(submission_uuid)
except Exception:
msg = (
u"An internal error has occurred while creating the student "
u"An internal error has occurred while creating the learner "
u"training workflow for submission UUID {}".format(submission_uuid)
)
logger.exception(msg)
......@@ -189,7 +189,7 @@ def validate_training_examples(rubric, examples):
]
if len(set(criteria_options) - set(criteria_without_options)) == 0:
return [_(
"If your assignment includes a student training step, "
"If your assignment includes a learner training step, "
"the rubric must have at least one criterion, "
"and that criterion must have at least one option."
)]
......@@ -277,7 +277,7 @@ def get_num_completed(submission_uuid):
except DatabaseError:
msg = (
u"An unexpected error occurred while "
u"retrieving the student training workflow status for submission UUID {}"
u"retrieving the learner training workflow status for submission UUID {}"
).format(submission_uuid)
logger.exception(msg)
raise StudentTrainingInternalError(msg)
......@@ -380,7 +380,7 @@ def get_training_example(submission_uuid, rubric, examples):
workflow = StudentTrainingWorkflow.get_workflow(submission_uuid=submission_uuid)
if not workflow:
raise StudentTrainingRequestError(
u"No student training workflow found for submission {}".format(submission_uuid)
u"No learner training workflow found for submission {}".format(submission_uuid)
)
# Get or create the training examples
......@@ -402,7 +402,7 @@ def get_training_example(submission_uuid, rubric, examples):
except DatabaseError:
msg = (
u"Could not retrieve a training example "
u"for the student with submission UUID {}"
u"for the learner with submission UUID {}"
).format(submission_uuid)
logger.exception(msg)
raise StudentTrainingInternalError(msg)
......@@ -448,7 +448,7 @@ def assess_training_example(submission_uuid, options_selected, update_workflow=T
item = workflow.current_item
if item is None:
msg = (
u"No items are available in the student training workflow associated with "
u"No items are available in the learner training workflow associated with "
u"submission UUID {}"
).format(submission_uuid)
raise StudentTrainingRequestError(msg)
......@@ -462,12 +462,12 @@ def assess_training_example(submission_uuid, options_selected, update_workflow=T
item.mark_complete()
return corrections
except StudentTrainingWorkflow.DoesNotExist:
msg = u"Could not find student training workflow for submission UUID {}".format(submission_uuid)
msg = u"Could not find learner training workflow for submission UUID {}".format(submission_uuid)
raise StudentTrainingRequestError(msg)
except DatabaseError:
msg = (
u"An error occurred while comparing the student's assessment "
u"to the training example. The submission UUID for the student is {}"
u"An error occurred while comparing the learner's assessment "
u"to the training example. The submission UUID for the learner is {}"
).format(submission_uuid)
logger.exception(msg)
raise StudentTrainingInternalError(msg)
......@@ -317,7 +317,7 @@ class PeerWorkflow(models.Model):
except DatabaseError:
error_message = (
u"An internal error occurred while retrieving a peer submission "
u"for student {}"
u"for learner {}"
).format(self)
logger.exception(error_message)
raise PeerAssessmentInternalError(error_message)
......@@ -357,7 +357,7 @@ class PeerWorkflow(models.Model):
except DatabaseError:
error_message = (
u"An internal error occurred while retrieving a peer submission "
u"for student {}"
u"for learner {}"
).format(self)
logger.exception(error_message)
raise PeerAssessmentInternalError(error_message)
......@@ -383,7 +383,7 @@ class PeerWorkflow(models.Model):
items = list(item_query[:1])
if not items:
msg = (
u"No open assessment was found for student {} while assessing "
u"No open assessment was found for learner {} while assessing "
u"submission UUID {}."
).format(self.student_id, submission_uuid)
raise PeerAssessmentWorkflowError(msg)
......@@ -398,7 +398,7 @@ class PeerWorkflow(models.Model):
except (DatabaseError, PeerWorkflowItem.DoesNotExist):
error_message = (
u"An internal error occurred while retrieving a workflow item for "
u"student {}. Workflow Items are created when submissions are "
u"learner {}. Workflow Items are created when submissions are "
u"pulled for assessment."
).format(self.student_id)
logger.exception(error_message)
......
......@@ -577,7 +577,7 @@
"options_selected": {}
}
],
"errors": ["If your assignment includes a student training step, the rubric must have at least one criterion, and that criterion must have at least one option."]
"errors": ["If your assignment includes a learner training step, the rubric must have at least one criterion, and that criterion must have at least one option."]
}
}
......@@ -70,7 +70,7 @@
value="{{ submission_start|utc|date:"H:i" }}"
>
</div>
<p class="setting-help">{% trans "The date and time when students can begin submitting responses." %}</p>
<p class="setting-help">{% trans "The date and time when learners can begin submitting responses." %}</p>
</li>
<li class="openassessment_date_editor field comp-setting-entry">
<div class="wrapper-comp-setting">
......@@ -99,7 +99,7 @@
value="{{ submission_due|utc|date:"H:i" }}"
>
</div>
<p class="setting-help">{% trans "The date and time when students can no longer submit responses." %}</p>
<p class="setting-help">{% trans "The date and time when learners can no longer submit responses." %}</p>
</li>
<li id="openassessment_submission_image_wrapper" class="field comp-setting-entry">
<div class="wrapper-comp-setting">
......@@ -109,7 +109,7 @@
<option value="1" {% if allow_file_upload %} selected="true" {% endif %}>{% trans "True"%}</option>
</select>
</div>
<p class="setting-help">{% trans "Specify whether students can submit an image file along with their text response." %}</p>
<p class="setting-help">{% trans "Specify whether learners can submit an image file along with their text response." %}</p>
</li>
<li id="openassessment_submission_latex_wrapper" class="field comp-setting-entry">
<div class="wrapper-comp-setting">
......@@ -119,7 +119,7 @@
<option value="1" {% if allow_latex %} selected="true" {% endif %}>{% trans "True"%}</option>
</select>
</div>
<p class="setting-help">{% trans "Specify whether students can write Latex formatted strings"%}</p>
<p class="setting-help">{% trans "Specify whether learners can write Latex formatted strings"%}</p>
</li>
<li id="openassessment_leaderboard_wrapper" class="field comp-setting-entry">
<div class="wrapper-comp-setting">
......@@ -139,9 +139,9 @@
<p class="openassessment_description" id="openassessment_step_select_description">
{% if 'example_based_assessment' in editor_assessments_order %}
{% trans "In this assignment, you can include steps for student training, peer assessment, self assessment, and example based assessment. Select the steps that you want below, and then drag them into the order that you want. If you include an example based assessment step, it must precede all other steps. If you include a student training training step, it must precede peer and self assessment steps." %}
{% trans "In this assignment, you can include steps for learner training, peer assessment, self assessment, and example based assessment. Select the steps that you want below, and then drag them into the order that you want. If you include an example based assessment step, it must precede all other steps. If you include a learner training training step, it must precede peer and self assessment steps." %}
{% else %}
{% trans "In this assignment, you can include steps for student training, peer assessment, and self assessment. Select the steps that you want below, and then drag them into the order that you want. If you include a student training step, it must precede all other steps." %}
{% trans "In this assignment, you can include steps for learner training, peer assessment, and self assessment. Select the steps that you want below, and then drag them into the order that you want. If you include a learner training step, it must precede all other steps." %}
{% endif %}
</p>
<ol id="openassessment_assessment_module_settings_editors">
......
......@@ -57,7 +57,7 @@
</label>
</div>
<p class="setting-help">
{% trans "Select one of the options above. This describes whether or not the student will have to provide criterion feedback." %}
{% trans "Select one of the options above. This describes whether or not the learner will have to provide criterion feedback." %}
</p>
</li>
</ul>
......
......@@ -9,7 +9,7 @@
</div>
<div class="openassessment_assessment_module_editor">
<p id="ai_assessment_description_closed" class="openassessment_description_closed {% if assessments.example_based_assessment %} is--hidden {% endif %}">
{% trans "An algorithm assesses students' responses by comparing the responses to pre-assessed sample responses that the instructor provides."%}
{% trans "An algorithm assesses learners' responses by comparing the responses to pre-assessed sample responses that the instructor provides."%}
</p>
<div id="ai_assessment_settings_editor" class="assessment_settings_wrapper {% if not assessments.example_based_assessment %} is--hidden {% endif %}">
<p class="openassessment_description">
......
......@@ -10,7 +10,7 @@
</div>
<div class="openassessment_assessment_module_editor">
<p id="peer_assessment_description_closed" class="openassessment_description_closed {% if assessments.peer_assessment %} is--hidden {% endif %}">
{% trans "Students assess a specified number of other students' responses using the rubric for the assignment." %}
{% trans "Learners assess a specified number of other learners' responses using the rubric for the assignment." %}
</p>
<div id="peer_assessment_settings_editor" class="assessment_settings_wrapper {% if not assessments.peer_assessment %} is--hidden {% endif %}">
<p class="openassessment_description">
......@@ -50,7 +50,7 @@
value="{{ assessments.peer_assessment.start|utc|date:"H:i" }}"
>
</div>
<p class="setting-help">{% trans "Enter the date and time when students can begin assessing peer responses." %}</p>
<p class="setting-help">{% trans "Enter the date and time when learners can begin assessing peer responses." %}</p>
</li>
<li class="field comp-setting-entry">
<div class="wrapper-comp-setting">
......
......@@ -38,7 +38,7 @@
<textarea id="openassessment_rubric_feedback" class="input setting-input">{{ feedbackprompt }}</textarea>
</div>
<p class="setting-help">
{% trans "Encourage your students to provide feedback on the response they've graded. You can replace the sample text with your own." %}
{% trans "Encourage your learners to provide feedback on the response they've graded. You can replace the sample text with your own." %}
</p>
</li>
<li class="field comp-setting-entry">
......@@ -47,7 +47,7 @@
<textarea id="openassessment_rubric_feedback_default_text" class="input setting-input">{{ feedback_default_text }}</textarea>
</div>
<p class="setting-help">
{% trans "Enter feedback text that students will see before they enter their own feedback. Use this text to show students a good example peer assessment." %}
{% trans "Enter feedback text that learners will see before they enter their own feedback. Use this text to show learners a good example peer assessment." %}
</p>
</li>
......
......@@ -10,7 +10,7 @@
</div>
<div class="openassessment_assessment_module_editor">
<p id="self_assessment_description_closed" class="openassessment_description_closed {% if assessments.self_assessment %} is--hidden {% endif %}">
{% trans "Students assess their own responses using the rubric for the assignment." %}
{% trans "Learners assess their own responses using the rubric for the assignment." %}
</p>
<div id="self_assessment_settings_editor" class="assessment_settings_wrapper {% if not assessments.self_assessment %} is--hidden {% endif %}">
<p class="openassessment_description">
......@@ -36,7 +36,7 @@
value="{{ assessments.self_assessment.start|utc|date:"H:i" }}"
>
</div>
<p class="setting-help">{% trans "Enter the date and time when students can begin assessing their responses." %}</p>
<p class="setting-help">{% trans "Enter the date and time when learners can begin assessing their responses." %}</p>
</li>
<li class="field comp-setting-entry">
<div class="wrapper-comp-setting">
......
......@@ -6,12 +6,12 @@
<div class="openassessment_inclusion_wrapper">
<input type="checkbox" id="include_student_training"
{% if assessments.student_training %} checked="true" {% endif %}>
<label for="include_student_training">{% trans "Step: Student Training" %}</label>
<label for="include_student_training">{% trans "Step: Learner Training" %}</label>
</div>
<div class="openassessment_assessment_module_editor">
<p id="student_training_description_closed" class="openassessment_description_closed {% if assessments.student_training %} is--hidden {% endif %}">
{% trans "Students learn to assess responses by scoring pre-assessed sample responses that you provide. Students move to the next step when the scores they give match your scores. Note that if you add this step, you must also add a peer assessment step. This step must come before the peer assessment step." %}
{% trans "Learners learn to assess responses by scoring pre-assessed sample responses that you provide. Learners move to the next step when the scores they give match your scores. Note that if you add this step, you must also add a peer assessment step. This step must come before the peer assessment step." %}
</p>
<div id="student_training_settings_editor" class="assessment_settings_wrapper {% if not assessments.student_training %} is--hidden {% endif %}">
<p class="openassessment_description">
......
......@@ -19,7 +19,7 @@
<strong> {% trans "Peer evaluation of this assignment will close soon. " %} </strong>
{% endif %}
{% if waiting %}
{% trans "All submitted peer responses have been assessed. Check back later to see if more students have submitted responses. " %}
{% trans "All submitted peer responses have been assessed. Check back later to see if more learners have submitted responses. " %}
{% endif %}
{% if has_self %}
{% blocktrans with peer_start_tag='<a data-behavior="ui-scroll" href="#openassessment__peer-assessment">'|safe self_start_tag='<a data-behavior="ui-scroll" href="#openassessment__self-assessment">'|safe end_tag='</a>'|safe %}
......
......@@ -4,9 +4,9 @@
<div class="message__content">
<p>
{% if approaching %}
{% trans "Student training for peer assessment will close soon. " %}
{% trans "Learner training for peer assessment will close soon. " %}
{% endif %}
{% trans "Complete the student training section to move on to peer assessment." %}
{% trans "Complete the learner training section to move on to peer assessment." %}
</p>
</div>
</div>
......
......@@ -21,14 +21,14 @@
</div>
<div class="staff-info__status ui-staff__content__section">
<table class="staff-info__status__table" summary="{% trans "Where are your students currently in this problem" %}">
<table class="staff-info__status__table" summary="{% trans "Where are your learners currently in this problem" %}">
<caption class="title">{% trans "Student Progress" %}</caption>
<caption class="title">{% trans "Learner Progress" %}</caption>
<thead>
<tr>
<th abbr="Step" scope="col">{% trans "Problem Step" %}</th>
<th abbr="# of Students" scope="col">{% trans "Active Students in Step" %}</th>
<th abbr="# of Learners" scope="col">{% trans "Active Learners in Step" %}</th>
</tr>
</thead>
......@@ -129,7 +129,7 @@
<form id="openassessment_student_info_form">
<ul>
<li class="openassessment__student-info_list">
<label for="openassessment__student_username" class="label">{% trans "Get Student Info" %}</label>
<label for="openassessment__student_username" class="label">{% trans "Get Learner Info" %}</label>
</li>
<li class="openassessment__student-info_list">
<input id="openassessment__student_username" type="text" class="value" maxlength="255">
......
......@@ -3,17 +3,17 @@
<div id="openassessment__student-info" class="staff-info__student__report">
{% if submission %}
<h2 class="title">
<span class="label">{% trans "Student Information" %}</span>
<span class="label">{% trans "Learner Information" %}</span>
</h2>
<div class="staff-info__content ui-staff__content">
<div class="wrapper--step__content">
<div class="step__content">
<h3 class="title">{% trans "Student Response" %}</h3>
<h3 class="title">{% trans "Learner Response" %}</h3>
<div class="student__answer__display__content">
{% if workflow_cancellation %}
{% blocktrans with removed_by_username=workflow_cancellation.cancelled_by removed_datetime=workflow_cancellation.created_at|utc|date:"N j, Y H:i e" %}
Student submission removed by {{ removed_by_username }} on {{ removed_datetime }}
Learner submission removed by {{ removed_by_username }} on {{ removed_datetime }}
{% endblocktrans %}
<br>
<!-- Comments: Reason for Cancellation-->
......@@ -21,7 +21,7 @@
Comments: {{ comments }}
{% endblocktrans %}
{% else %}
{% include "openassessmentblock/oa_submission_answer.html" with answer=submission.answer answer_text_label="The student's response to the question above:" %}
{% include "openassessmentblock/oa_submission_answer.html" with answer=submission.answer answer_text_label="The learner's response to the question above:" %}
{% endif %}
</div>
......@@ -55,7 +55,7 @@
<li>
<div class="has--warnings">
<div class="warning">
{% trans "Caution: Removing a student's submission cannot be undone." %}
{% trans "Caution: Removing a learner's submission cannot be undone." %}
</div>
</div>
</li>
......@@ -91,7 +91,7 @@
{% if peer_assessments %}
<div class="staff-info__status ui-staff__content__section">
<h3 class="title">{% trans "Peer Assessments for This Student" %}</h3>
<h3 class="title">{% trans "Peer Assessments for This Learner" %}</h3>
{% for assessment in peer_assessments %}
{% with peer_num=forloop.counter %}
<h4 class="title--sub"> {% trans "Peer" %} {{ peer_num }}: </h4>
......@@ -133,7 +133,7 @@
{% if submitted_assessments %}
<div class="staff-info__status ui-staff__content__section">
<h3 class="title">{% trans "Peer Assessments Completed by This Student" %}</h3>
<h3 class="title">{% trans "Peer Assessments Completed by This Learner" %}</h3>
{% for assessment in submitted_assessments %}
{% with peer_num=forloop.counter %}
<h4 class="title--sub">{% trans "Assessment" %} {{ peer_num }}:</h4>
......@@ -175,7 +175,7 @@
{% if self_assessment %}
<div class="staff-info__status ui-staff__content__section">
<h3 class="title">{% trans "Student's Self Assessment" %}</h3>
<h3 class="title">{% trans "Learner's Self Assessment" %}</h3>
<table class="staff-info__status__table" summary="{% trans "Self Assessment" %}">
<thead>
<tr>
......@@ -236,6 +236,6 @@
{% endif %}
</div>
{% else %}
{% trans "Couldn't find a response for this student." %}
{% trans "Couldn't find a response for this learner." %}
{% endif %}
</div>
......@@ -6,7 +6,7 @@
<div class="wrapper--step__content">
<div class="step__message message message--incomplete">
<h3 class="message__title">{% trans "Error Loading Student Training Examples" %}</h3>
<h3 class="message__title">{% trans "Error Loading Learner Training Examples" %}</h3>
<div class="message__content">
<p>{% trans "We couldn't load the student training step of this assignment." %}</p>
......
......@@ -75,7 +75,7 @@ I noticed that this response...
DEFAULT_EXAMPLE_ANSWER = (
"Replace this text with your own sample response for this assignment. "
"Then, under Response Score to the right, select an option for each criterion. "
"Students learn to assess responses by assessing this response and comparing "
"Learners learn to assess responses by assessing this response and comparing "
"the options that they select in the rubric with the options that you specified."
)
......
......@@ -76,7 +76,7 @@ def require_course_staff(error_key, with_json_handler=False):
def _wrapped(xblock, *args, **kwargs): # pylint: disable=C0111
permission_errors = {
"STAFF_INFO": xblock._(u"You do not have permission to access staff information"),
"STUDENT_INFO": xblock._(u"You do not have permission to access student information."),
"STUDENT_INFO": xblock._(u"You do not have permission to access learner information."),
}
......@@ -262,7 +262,7 @@ class StaffInfoMixin(object):
# from being displayed.
msg = (
u"Could not retrieve image URL for staff debug page. "
u"The student username is '{student_username}', and the file key is {file_key}"
u"The learner username is '{student_username}', and the file key is {file_key}"
).format(student_username=student_username, file_key=file_key)
logger.exception(msg)
......@@ -376,9 +376,9 @@ class StaffInfoMixin(object):
cancelled_by_id=student_item_dict['student_id'],
assessment_requirements=assessment_requirements
)
return {"success": True, 'msg': self._(u"The student submission has been removed from peer assessment. "
u"The student receives a grade of zero unless you reset "
u"the student's attempts for the problem to allow them to "
return {"success": True, 'msg': self._(u"The learner submission has been removed from peer assessment. "
u"The learner receives a grade of zero unless you reset "
u"the learner's attempts for the problem to allow them to "
u"resubmit a response.")}
except (
AssessmentWorkflowError,
......
......@@ -95,7 +95,7 @@ OpenAssessment.StudentTrainingListener.prototype = {
if (criterionAdded) {
this.displayAlertMsg(
gettext("Criterion Added"),
gettext("You've added a criterion. You'll need to select an option for the criterion in the Student Training step. To do this, click the Settings tab.")
gettext("You've added a criterion. You'll need to select an option for the criterion in the Learner Training step. To do this, click the Settings tab.")
);
}
},
......@@ -142,7 +142,7 @@ OpenAssessment.StudentTrainingListener.prototype = {
if (invalidated) {
this.displayAlertMsg(
gettext("Option Deleted"),
gettext("You've deleted an option. The system has removed that option from its criterion in the sample responses in the Student Training step. You may have to select a new option for the criterion.")
gettext("You've deleted an option. The system has removed that option from its criterion in the sample responses in the Learner Training step. You may have to select a new option for the criterion.")
);
}
},
......@@ -174,7 +174,7 @@ OpenAssessment.StudentTrainingListener.prototype = {
if (changed) {
this.displayAlertMsg(
gettext("Option Deleted"),
gettext("You've deleted all the options for this criterion. The system has removed the criterion from the sample responses in the Student Training step.")
gettext("You've deleted all the options for this criterion. The system has removed the criterion from the sample responses in the Learner Training step.")
);
}
},
......@@ -201,7 +201,7 @@ OpenAssessment.StudentTrainingListener.prototype = {
if (changed) {
this.displayAlertMsg(
gettext("Criterion Deleted"),
gettext("You've deleted a criterion. The system has removed the criterion from the sample responses in the Student Training step.")
gettext("You've deleted a criterion. The system has removed the criterion from the sample responses in the Learner Training step.")
);
}
},
......
......@@ -50,7 +50,7 @@ class StudentTrainingMixin(object):
try:
path, context = self.training_path_and_context()
except: # pylint:disable=W0702
msg = u"Could not render student training step for submission {}".format(self.submission_uuid)
msg = u"Could not render learner training step for submission {}".format(self.submission_uuid)
logger.exception(msg)
return self.render_error(self._(u"An unexpected error occurred."))
else:
......@@ -185,8 +185,8 @@ class StudentTrainingMixin(object):
)
except student_training.StudentTrainingRequestError:
msg = (
u"Could not check student training scores for "
u"the student with submission UUID {uuid}"
u"Could not check learner training scores for "
u"the learner with submission UUID {uuid}"
).format(uuid=self.submission_uuid)
logger.warning(msg, exc_info=True)
return {
......
......@@ -959,7 +959,7 @@
}
],
"editor_assessments_order": ["student-training", "peer-assessment", "self-assessment"],
"expected_error": "you must provide at least one example response for student training"
"expected_error": "you must provide at least one example response for learner training"
},
"student_training_example_does_not_match_rubric": {
......
......@@ -102,7 +102,7 @@ class TestCourseStaff(XBlockHandlerTestCase):
# If we ARE course staff, then we should see the debug info
xblock.xmodule_runtime.user_is_staff = True
resp = self.request(xblock, 'render_student_info', json.dumps({}))
self.assertIn("couldn\'t find a response for this student.", resp.decode('utf-8').lower())
self.assertIn("couldn\'t find a response for this learner.", resp.decode('utf-8').lower())
@scenario('data/basic_scenario.xml')
def test_hide_course_staff_debug_info_in_studio_preview(self, xblock):
......@@ -169,7 +169,7 @@ class TestCourseStaff(XBlockHandlerTestCase):
request.params = {"student_id": "test_student"}
# Verify that we can render without error
resp = xblock.render_student_info(request)
self.assertIn("couldn\'t find a response for this student.", resp.body.lower())
self.assertIn("couldn\'t find a response for this learner.", resp.body.lower())
@scenario('data/peer_only_scenario.xml', user_id='Bob')
def test_staff_debug_student_info_peer_only(self, xblock):
......@@ -623,7 +623,7 @@ class TestCourseStaff(XBlockHandlerTestCase):
# Verify that we can render without error
params = {"submission_uuid": submission["uuid"], "comments": "Inappropriate language."}
resp = self.request(xblock, 'cancel_submission', json.dumps(params), response_format='json')
self.assertIn("The student submission has been removed from peer", resp['msg'])
self.assertIn("The learner submission has been removed from peer", resp['msg'])
self.assertEqual(True, resp['success'])
def _create_mock_runtime(
......
......@@ -133,10 +133,10 @@ def validate_assessments(assessments, current_assessments, is_released, _):
answers = []
examples = assessment_dict.get('examples')
if not examples:
return False, _('You must provide at least one example response for student training.')
return False, _('You must provide at least one example response for learner training.')
for example in examples:
if example.get('answer') in answers:
return False, _('Each example response for student training must be unique.')
return False, _('Each example response for learner training must be unique.')
answers.append(example.get('answer'))
# Example-based assessment MUST specify 'ease' or 'fake' as the algorithm ID,
......@@ -293,7 +293,7 @@ def validate_assessment_examples(rubric_dict, assessments, _):
# Must have at least one training example
if len(examples) == 0:
return False, _(u"Student training and example-based assessments must have at least one training example")
return False, _(u"Learner training and example-based assessments must have at least one training example")
# Delegate to the student training API to validate the
# examples against the rubric.
......