edx / edx-ora2 / Commits / e55fbbae

Commit e55fbbae, authored Jun 20, 2014 by gradyward
Parent: a73a0638

    Nearing the end of code review and testing.

Showing 10 changed files with 332 additions and 190 deletions (+332 / -190)
Changed files:
  openassessment/templates/openassessmentblock/oa_edit.html        +0   -0
  openassessment/xblock/static/css/openassessment.css              +55  -40
  openassessment/xblock/static/js/openassessment.min.js            +0   -0
  openassessment/xblock/static/js/spec/oa_edit.js                  +105 -15
  openassessment/xblock/static/js/src/oa_edit.js                   +0   -0
  openassessment/xblock/static/sass/oa/utilities/_developer.scss   +73  -50
  openassessment/xblock/studio_mixin.py                            +76  -5
  openassessment/xblock/test/test_studio.py                        +1   -1
  openassessment/xblock/test/test_xml.py                           +8   -3
  openassessment/xblock/xml.py                                     +14  -76
openassessment/templates/openassessmentblock/oa_edit.html
(diff collapsed)
openassessment/xblock/static/css/openassessment.css

@@ -2135,73 +2135,99 @@ hr.divider,
 #openassessment-editor {
   margin-bottom: 0;
 }
-#openassessment-editor .openassessment-editor-content-and-tabs {
+#openassessment-editor .openassessment_editor_content_and_tabs {
   width: 100%;
   height: 370px;
 }
-#openassessment-editor .openassessment-editor-header {
+#openassessment-editor #openassessment_editor_header {
   background-color: #e5e5e5;
   width: 100%;
   top: 0;
 }
-#openassessment-editor #oa-editor-window-title {
+#openassessment-editor #oa_editor_window_title {
   float: left;
 }
-#openassessment-editor .oa-editor-tab {
+#openassessment-editor .oa_editor_tab {
   float: right;
   padding: 2.5px 5px;
   margin: 2.5px 5px;
-  border-radius: 2.5px;
+  border-radius: 5px;
   box-shadow: none;
   border: 0;
 }
-#openassessment-editor .oa-editor-content-wrapper {
+#openassessment-editor .oa_editor_content_wrapper {
   height: 100%;
   width: 100%;
   padding: 5px 10px;
 }
-#openassessment-editor .openassessment-prompt-editor {
+#openassessment-editor #openassessment_prompt_editor {
   width: 100%;
   height: 100%;
   resize: none;
 }
-#openassessment-editor .openassessment-rubric-editor {
-  resize: none;
-  border: none;
-}
+#openassessment-editor #openassessment_rubric_editor {
+  width: 100%;
+  height: 100%;
+}
 #openassessment-editor .openassessment-assessments-editor {
   width: 100%;
 }
-#openassessment-editor #oa-settings-editor-wrapper {
+#openassessment-editor #oa_basic_settings_editor {
   padding: 20px 20px;
   border-bottom: 1px solid #414243;
 }
+#openassessment-editor #oa_basic_settings_editor #openassessment_title_editor_wrapper label {
+  width: 25%;
+  text-align: left;
+}
+#openassessment-editor #oa_basic_settings_editor #openassessment_title_editor_wrapper input {
+  width: 45%;
+  min-width: 100px;
+}
+#openassessment-editor #openassessment_step_select_description {
+  margin: 10px 0;
+}
+#openassessment-editor .openassessment_assessment_module_settings_editor {
+  margin-bottom: 10px;
+  padding-bottom: 10px;
+  border-bottom: 1px solid #dadbdc;
+}
+#openassessment-editor .openassessment_indent_line_input {
+  padding: 5px 20px;
+}
+#openassessment-editor #oa_settings_editor_wrapper {
+  overflow-y: scroll;
+}
-#openassessment-editor #openassessment-title-editor {
+#openassessment-editor #openassessment_title_editor {
   width: 300px;
   margin-left: 50px;
 }
-#openassessment-editor .openassessment-number-field {
-  width: 25px;
-}
-#openassessment-editor .openassessment-date-field {
+#openassessment-editor .openassessment_description,
+#openassessment-editor .openassessment_description_closed {
+  font-size: 75%;
+  margin: 0;
+}
+#openassessment-editor .openassessment_date_field {
   width: 130px;
 }
-#openassessment-editor .openassessment-description {
-  font-size: 75%;
-}
-#openassessment-editor .openassessment-text-field-wrapper {
+#openassessment-editor .openassessment_number_field {
+  width: 25px;
+}
+#openassessment-editor .openassessment_text_field_wrapper,
+#openassessment-editor .openassessment_right_text_field_wrapper,
+#openassessment-editor .openassessment_left_text_field_wrapper {
   width: 50%;
   text-align: center;
 }
-#openassessment-editor .right-text-field-wrapper {
+#openassessment-editor .openassessment_right_text_field_wrapper {
   float: right;
 }
-#openassessment-editor .left-text-field-wrapper {
+#openassessment-editor .openassessment_left_text_field_wrapper {
   float: left;
 }
-#openassessment-editor .openassessment-due-date-editor {
+#openassessment-editor .openassessment_due_date_editor {
   height: 30px;
 }
-#openassessment-editor .openassessment-inclusion-wrapper {
+#openassessment-editor .openassessment_inclusion_wrapper {
   background-color: #dadbdc;
   padding: 2.5px 5px;
   margin: 2.5px 5px;
   border-radius: 2.5px;
 }
+#openassessment-editor .openassessment_inclusion_wrapper input[type="checkbox"] {
+  display: none;
+}
+#openassessment-editor .openassessment_inclusion_wrapper input[type="checkbox"] + label:before {
+  font-family: "FontAwesome";
+  display: inline-block;
+  margin-right: 10px;
+  width: auto;
+  height: auto;
+  content: "\f096";
+}
+#openassessment-editor .openassessment_inclusion_wrapper input[type="checkbox"]:checked + label:before {
+  content: "\f046";
+}
 #openassessment-editor label {
   padding-right: 10px;
 }
-#openassessment-editor .xblock-actions {
-  background-color: #e5e5e5;
+#openassessment-editor .xblock_actions {
+  background-color: #c8c9ca;
   position: absolute;
   width: 100%;
   bottom: 0;
 }
 #openassessment-editor .peer-number-constraints {
   margin-bottom: 10px;
 }
 #openassessment-editor .ui-widget-header .ui-state-default {
   background: #e5e5e5;
 }
 #openassessment-editor .ui-widget-header .ui-state-default a {
-  color: #202021;
+  color: #414243;
   text-transform: uppercase;
   outline-color: transparent;
 }
 #openassessment-editor .ui-widget-header .ui-state-active {
...
@@ -2211,19 +2237,8 @@ hr.divider,
   color: whitesmoke;
   text-transform: uppercase;
   outline-color: transparent;
 }
-#openassessment-editor input[type="checkbox"] {
-  display: none;
-}
-#openassessment-editor input[type="checkbox"] + label:before {
-  font-family: "FontAwesome";
-  display: inline-block;
-  margin-right: 10px;
-  width: auto;
-  height: auto;
-  content: "\f096";
-}
-#openassessment-editor input[type="checkbox"]:checked + label:before {
-  content: "\f046";
-}
 #openassessment-editor hr {
-  background-color: #d4d4d4;
+  background-color: transparent;
   color: #414243;
   height: 1px;
   border: 0px;
...
openassessment/xblock/static/js/openassessment.min.js
(diff collapsed)
openassessment/xblock/static/js/spec/oa_edit.js

@@ -17,9 +17,10 @@ describe("OpenAssessment.StudioView", function() {
         this.titleField = "";
         this.submissionStartField = "";
         this.submissionDueField = "";
         this.hasPeer = true;
         this.hasSelf = true;
-        this.hasTraining = true;
+        this.hasTraining = false;
+        this.hasAI = false;
         this.peerMustGrade = 2;
...
@@ -45,20 +46,35 @@ describe("OpenAssessment.StudioView", function() {
         var title = this.titleField;
         var submission_start = this.submissionStartField;
         var submission_due = this.submissionDueField;
-        var assessments = [
-            {
-                name: "peer",
-                must_grade: this.peerMustGrade,
-                must_be_graded_by: this.peerGradedBy,
-                start: this.peerStart,
-                due: this.peerDue
-            },
-            {
-                name: "self",
-                start: this.selfStart,
-                due: this.selfDue
-            }
-        ];
+        var assessments = [];
+        if (this.hasTraining){
+            assessments = assessments.concat({
+                "name": "student-training",
+                "examples": this.studentTrainingExamplesCodeBox
+            });
+        }
+        if (this.hasPeer){
+            assessments = assessments.concat({
+                "name": "peer-assessment",
+                "start": this.peerStart,
+                "due": this.peerDue,
+                "must_grade": this.peerMustGrade,
+                "must_be_graded_by": this.peerGradedBy
+            });
+        }
+        if (this.hasSelf){
+            assessments = assessments.concat({
+                "name": "self-assessment",
+                "start": this.selfStart,
+                "due": this.selfDue
+            });
+        }
+        if (this.hasAI){
+            assessments = assessments.concat({
+                "name": "example-based-assessment",
+                "examples": this.aiTrainingExamplesCodeBox
+            });
+        }
         if (!this.loadError) {
             return $.Deferred(function(defer) {
...
@@ -123,6 +139,52 @@ describe("OpenAssessment.StudioView", function() {
     var server = null;
     var view = null;
     var prompt = "How much do you like waffles?";
+    var rubric =
+        "<rubric>" +
+        "<criterion>" +
+        "<name>Proper Appreciation of Gravity</name>" +
+        "<prompt>How much respect did the person give waffles?</prompt>" +
+        "<option points=\"0\"><name>No</name><explanation>Not enough</explanation></option>" +
+        "<option points=\"2\"><name>Yes</name><explanation>An appropriate Amount</explanation></option>" +
+        "</criterion>" +
+        "</rubric>";
+    var title = "The most important of all questions.";
+    var subStart = "";
+    var subDue = "2014-10-1T10:00:00";
+    var assessments = [
+        {
+            "name": "student-training",
+            "examples":
+                "<examples>" +
+                "<example>" +
+                "<answer>ẗëṡẗ äṅṡẅëṛ</answer>" +
+                "<select criterion=\"Test criterion\" option=\"Yes\" />" +
+                "<select criterion=\"Another test criterion\" option=\"No\" />" +
+                "</example>" +
+                "<example>" +
+                "<answer>äṅöẗḧëṛ ẗëṡẗ äṅṡẅëṛ</answer>" +
+                "<select criterion=\"Another test criterion\" option=\"Yes\" />" +
+                "<select criterion=\"Test criterion\" option=\"No\" />" +
+                "</example>" +
+                "</examples>",
+            "start": "",
+            "due": ""
+        },
+        {
+            "name": "peer-assessment",
+            "must_grade": 5,
+            "must_be_graded_by": 3,
+            "start": "2014-10-04T00:00:00",
+            "due": ""
+        },
+        {
+            "name": "self-assessment",
+            "start": "",
+            "due": ""
+        }
+    ];
     beforeEach(function() {
         // Load the DOM fixture
...
@@ -183,6 +245,34 @@ describe("OpenAssessment.StudioView", function() {
         expect(view.confirmPostReleaseUpdate).toHaveBeenCalled();
     });
+    it("full integration test for load and update_editor_context", function() {
+        server.updateEditorContext(prompt, rubric, title, subStart, subDue, assessments);
+        view.load();
+        expect(view.promptBox.value).toEqual(prompt);
+        expect(view.rubricXmlBox.getValue()).toEqual(rubric);
+        expect(view.titleField.value).toEqual(title);
+        expect(view.submissionStartField.value).toEqual(subStart);
+        expect(view.submissionDueField.value).toEqual(subDue);
+        expect(view.hasPeer.prop('checked')).toEqual(true);
+        expect(view.hasSelf.prop('checked')).toEqual(true);
+        expect(view.hasAI.prop('checked')).toEqual(false);
+        expect(view.hasTraining.prop('checked')).toEqual(true);
+        expect(view.peerMustGrade.prop('value')).toEqual('5');
+        expect(view.peerGradedBy.prop('value')).toEqual('3');
+        expect(view.peerDue.prop('value')).toEqual("");
+        expect(view.selfStart.prop('value')).toEqual("");
+        expect(view.selfDue.prop('value')).toEqual("");
+        expect(view.aiTrainingExamplesCodeBox.getValue()).toEqual("");
+        expect(view.studentTrainingExamplesCodeBox.getValue()).toEqual(assessments[0].examples);
+        expect(view.peerStart.prop('value')).toEqual("2014-10-04T00:00:00");
+        view.titleField.value = "This is the new title.";
+        view.updateEditorContext();
+        expect(server.titleField).toEqual("This is the new title.");
+    });
     it("cancels editing", function() {
         view.cancel();
         expect(runtime.notify).toHaveBeenCalledWith('cancel', {});
...
openassessment/xblock/static/js/src/oa_edit.js
(diff collapsed)
openassessment/xblock/static/sass/oa/utilities/_developer.scss

@@ -173,120 +173,160 @@
 #openassessment-editor {
   margin-bottom: 0;

-  .openassessment-editor-content-and-tabs {
+  .openassessment_editor_content_and_tabs {
     width: 100%;
     height: 370px;
   }

-  .openassessment-editor-header {
+  #openassessment_editor_header {
     background-color: #e5e5e5;
     width: 100%;
     top: 0;
   }

-  #oa-editor-window-title {
+  #oa_editor_window_title {
     float: left;
   }

-  .oa-editor-tab {
+  .oa_editor_tab {
     float: right;
     padding: ($baseline-v/8) ($baseline-h/8);
     margin: ($baseline-v/8) ($baseline-h/8);
-    border-radius: ($baseline-v/8);
+    border-radius: ($baseline-v/4);
     box-shadow: none;
     border: 0;
   }

-  .oa-editor-content-wrapper {
+  .oa_editor_content_wrapper {
     height: 100%;
     width: 100%;
     padding: ($baseline-v/4) ($baseline-h/4);
   }

-  .openassessment-prompt-editor {
+  #openassessment_prompt_editor {
     width: 100%;
     height: 100%;
     resize: none;
     border: none;
   }

-  .openassessment-rubric-editor {
+  #openassessment_rubric_editor {
     width: 100%;
     height: 100%;
   }

-  .openassessment-assessments-editor {
-    width: 100%;
-  }
+  #oa_basic_settings_editor {
+    padding: 20px 20px;
+    border-bottom: 1px solid $edx-gray-d3;
+    #openassessment_title_editor_wrapper {
+      label {
+        width: 25%;
+        text-align: left;
+      }
+      input {
+        width: 45%;
+        min-width: 100px;
+      }
+    }
+  }

-  #oa-settings-editor-text-fields {
+  #openassessment_step_select_description {
+    margin: 10px 0;
+  }

-  #oa-settings-editor-wrapper {
+  .openassessment_assessment_module_settings_editor {
+    margin-bottom: 10px;
+    padding-bottom: 10px;
+    border-bottom: 1px solid $edx-gray-l3;
+  }
+  .openassessment_indent_line_input {
+    padding: 5px 20px;
+  }
+  #oa_settings_editor_wrapper {
+    overflow-y: scroll;
+  }

-  #openassessment-title-editor {
+  #openassessment_title_editor {
     width: 300px;
     margin-left: 50px;
   }

-  .openassessment-number-field {
-    width: 25px;
-  }
+  .openassessment_description {
+    font-size: 75%;
+    margin: 0;
+  }

-  .openassessment-date-field {
+  .openassessment_date_field {
     width: 130px;
   }

+  .openassessment_number_field {
+    width: 25px;
+  }

-  .openassessment-description {
-    font-size: 75%;
-  }
+  .openassessment_description_closed {
+    @extend .openassessment_description;
+  }

-  .openassessment-text-field-wrapper {
+  .openassessment_text_field_wrapper {
     width: 50%;
     text-align: center;
   }

-  .right-text-field-wrapper {
+  .openassessment_right_text_field_wrapper {
+    @extend .openassessment_text_field_wrapper;
     float: right;
   }

-  .left-text-field-wrapper {
+  .openassessment_left_text_field_wrapper {
+    @extend .openassessment_text_field_wrapper;
     float: left;
   }

-  .openassessment-due-date-editor {
+  .openassessment_due_date_editor {
     height: 30px;
   }

-  .openassessment-inclusion-wrapper {
+  .openassessment_inclusion_wrapper {
     background-color: $edx-gray-l3;
     padding: ($baseline-v/8) ($baseline-h/8);
     margin: ($baseline-v/8) ($baseline-h/8);
     border-radius: ($baseline-v)/8;

+    input[type="checkbox"] {
+      display: none;
+    }
+    input[type="checkbox"] + label:before {
+      font-family: "FontAwesome";
+      display: inline-block;
+      margin-right: ($baseline-h/4);
+      width: auto;
+      height: auto;
+      content: "\f096";
+    }
+    input[type="checkbox"]:checked + label:before {
+      content: "\f046";
+    }
   }

   label {
     padding-right: 10px;
   }

-  .xblock-actions {
-    background-color: #e5e5e5;
+  .xblock_actions {
+    background-color: $edx-gray-l2;
     position: absolute;
     width: 100%;
     bottom: 0;
   }

   .peer-number-constraints {
     margin-bottom: 10px;
   }

   .ui-widget-header .ui-state-default {
     background: #e5e5e5;
     a {
-      color: $edx-gray-d4;
+      color: $edx-gray-d3;
       text-transform: uppercase;
       outline-color: transparent;
     }
...
@@ -302,25 +342,8 @@
   }
 }
-  input[type="checkbox"] {
-    display: none;
-  }
-  input[type="checkbox"] + label:before {
-    font-family: "FontAwesome";
-    display: inline-block;
-    margin-right: ($baseline-h/4);
-    width: auto;
-    height: auto;
-    content: "\f096";
-  }
-  input[type="checkbox"]:checked + label:before {
-    content: "\f046";
-  }
   hr {
-    background-color: #d4d4d4;
+    background-color: transparent;
     color: $edx-gray-d3;
     height: 1px;
     border: 0px;
...
openassessment/xblock/studio_mixin.py

@@ -6,11 +6,12 @@ import copy
 import logging
 from django.template.context import Context
 from django.template.loader import get_template
-from django.utils.translation import ugettext as _
+from django.utils.translation import ugettext as _, ugettext
 from xblock.core import XBlock
 from xblock.fragment import Fragment
 from openassessment.xblock import xml
 from openassessment.xblock.validation import validator
+from openassessment.xblock.xml import UpdateFromXmlError, parse_date, parse_examples_xml_str

 logger = logging.getLogger(__name__)
...
@@ -48,7 +49,7 @@ class StudioMixin(object):
         -- The 'rubric' should be an XML representation of the new rubric.
         -- The 'prompt' and 'title' should be plain text.
         -- The dates 'submission_start' and 'submission_due' are both ISO strings
-        -- The 'assessments' is a list of asessment dictionaries (much like self.rubric_assessments)
+        -- The 'assessments' is a list of assessment dictionaries (much like self.rubric_assessments)
            with the notable exception that all examples (for Student Training and eventually AI)
            are in XML string format and need to be parsed into dictionaries.
...
@@ -69,9 +70,9 @@ class StudioMixin(object):
         try:
             rubric = xml.parse_rubric_xml_str(data["rubric"])
-            submission_due = xml.parse_date(data["submission_due"])
-            submission_start = xml.parse_date(data["submission_start"])
-            assessments = xml.parse_assessment_dictionaries(data["assessments"])
+            submission_due = xml.parse_date(data["submission_due"], name="submission due date")
+            submission_start = xml.parse_date(data["submission_start"], name="submission start date")
+            assessments = parse_assessment_dictionaries(data["assessments"])
         except xml.UpdateFromXmlError as ex:
             return {'success': False, 'msg': _('An error occurred while saving: {error}').format(error=ex)}
...
@@ -171,3 +172,72 @@ class StudioMixin(object):
             'is_released': self.is_released()
         }
+
+
+def parse_assessment_dictionaries(input_assessments):
+    """
+    Parses the elements of assessment dictionaries returned by the Studio UI into storable rubric_assessments
+
+    Args:
+        input_assessments (list of dict): A list of the dictionaries that are assembled in Javascript to
+            represent their modules. Some changes need to be made between this and the result:
+                -- Parse the XML examples from the Student Training and/or AI
+                -- Parse all dates (including the assessment dates) correctly
+
+    Returns:
+        (list of dict): Can be directly assigned/stored in an openassessmentblock.rubric_assessments
+    """
+    assessments_list = []
+    for assessment in input_assessments:
+        assessment_dict = dict()
+
+        # Assessment name
+        if 'name' in assessment:
+            assessment_dict['name'] = assessment.get('name')
+        else:
+            raise UpdateFromXmlError(_('All "assessment" elements must contain a "name" element.'))
+
+        # Assessment start
+        if 'start' in assessment:
+            parsed_start = parse_date(
+                assessment.get('start'), name="{} start date".format(assessment.get('name'))
+            )
+            assessment_dict['start'] = parsed_start
+        else:
+            assessment_dict['start'] = None
+
+        # Assessment due
+        if 'due' in assessment:
+            parsed_due = parse_date(
+                assessment.get('due'), name="{} due date".format(assessment.get('name'))
+            )
+            assessment_dict['due'] = parsed_due
+        else:
+            assessment_dict['due'] = None
+
+        # Assessment must_grade
+        if 'must_grade' in assessment:
+            try:
+                assessment_dict['must_grade'] = int(assessment.get('must_grade'))
+            except (ValueError, TypeError):
+                raise UpdateFromXmlError(_('The "must_grade" value must be a positive integer.'))
+
+        # Assessment must_be_graded_by
+        if 'must_be_graded_by' in assessment:
+            try:
+                assessment_dict['must_be_graded_by'] = int(assessment.get('must_be_graded_by'))
+            except (ValueError, TypeError):
+                raise UpdateFromXmlError(_('The "must_be_graded_by" value must be a positive integer.'))
+
+        # Training examples (can be for AI OR for Student Training)
+        if 'examples' in assessment:
+            try:
+                assessment_dict['examples'] = parse_examples_xml_str(assessment.get('examples'))
+            except UpdateFromXmlError as ex:
+                raise UpdateFromXmlError(
+                    _("There was an error in parsing the {name} examples: {ex}").format(
+                        name=assessment_dict['name'], ex=ex
+                    )
+                )
+
+        # Update the list of assessments
+        assessments_list.append(assessment_dict)
+
+    return assessments_list
\ No newline at end of file
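The integer-coercion pattern used above for `must_grade` and `must_be_graded_by` can be sketched in isolation. This is a hedged, standalone illustration, not the project's code: `parse_int_field` is a hypothetical helper, and a plain `ValueError` stands in for the mixin's `UpdateFromXmlError`.

```python
def parse_int_field(assessment, key):
    # Hypothetical stand-in for the mixin's inline handling: absent keys are
    # skipped (return None), present values must coerce to int, and bad input
    # is rejected with a uniform message.
    if key not in assessment:
        return None
    try:
        return int(assessment.get(key))
    except (ValueError, TypeError):
        raise ValueError('The "{}" value must be a positive integer.'.format(key))

parse_int_field({"must_grade": "5"}, "must_grade")  # → 5
```

Catching `TypeError` as well as `ValueError` matters here because the Studio UI may send `None` for an untouched field, and `int(None)` raises `TypeError`, not `ValueError`.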
openassessment/xblock/test/test_studio.py

@@ -36,7 +36,7 @@ class StudioViewTest(XBlockHandlerTestCase):
         # Verify that every assessment in the list of assessments has a name.
         for assessment_dict in resp['assessments']:
             self.assertTrue(assessment_dict.get('name', False))
-            if assessment_dict.get('name') == 'studnet-training':
+            if assessment_dict.get('name') == 'student-training':
                 examples = etree.fromstring(assessment_dict['examples'])
                 self.assertEqual(examples.tag, 'examples')
...
openassessment/xblock/test/test_xml.py

@@ -11,11 +11,12 @@ import dateutil.parser
 from django.test import TestCase
 import ddt
 from openassessment.xblock.openassessmentblock import OpenAssessmentBlock
+from openassessment.xblock.studio_mixin import parse_assessment_dictionaries
 from openassessment.xblock.xml import (
     serialize_content, parse_from_xml_str, parse_rubric_xml_str,
     parse_examples_xml_str, parse_assessments_xml_str,
     serialize_rubric_to_xml_str, serialize_examples_to_xml_str,
-    serialize_assessments_to_xml_str, UpdateFromXmlError, parse_assessment_dictionaries
+    serialize_assessments_to_xml_str, UpdateFromXmlError
 )
...
@@ -366,8 +367,12 @@ class TestParseAssessmentsFromDictionaries(TestCase):
         config = parse_assessment_dictionaries(data['assessments_list'])
-        for i in range(0, len(config)):
-            self.assertEqual(config[i], data['results'][i])
+        if len(config) == 0:
+            # Prevents this test from passing benignly if parse_assessment_dictionaries returns []
+            self.assertTrue(False)
+        for config_assessment, correct_assessment in zip(config, data['results']):
+            self.assertEqual(config_assessment, correct_assessment)

     @ddt.file_data('data/parse_assessment_dicts_error.json')
     def test_parse_assessments_dictionary_error(self, data):
...
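The length guard added in the hunk above exists because of how `zip` behaves on unequal-length inputs; a minimal, self-contained illustration (the variable names are ours, not the test's):

```python
# zip() truncates to the shorter input, so an equality loop written with zip
# over an empty parser result performs zero comparisons and "passes".
# The test's explicit emptiness check makes that failure mode visible.
config = []                               # pretend the parser returned nothing
expected = [{"name": "self-assessment"}]  # but a result was expected

comparisons = list(zip(config, expected))
assert comparisons == []                  # the assertion loop body never runs
```

The same caution applies to the original indexed loop, which iterated over `range(len(config))` and so also did nothing when `config` was empty.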
openassessment/xblock/xml.py

@@ -160,7 +160,7 @@ def serialize_rubric(rubric_root, oa_block, include_prompt=True):
         feedback_prompt.text = unicode(oa_block.rubric_feedback_prompt)

-def parse_date(date_str):
+def parse_date(date_str, name=""):
     """
     Attempt to parse a date string into ISO format (without milliseconds)
     Returns `None` if this cannot be done.
...
@@ -168,6 +168,9 @@ def parse_date(date_str):
     Args:
         date_str (str): The date string to parse.

+    Kwargs:
+        name (str): the name to return in an error to the origin of the call if an error occurs.
+
     Returns:
         unicode in ISO format (without milliseconds) if the date string is
         parse-able. None if parsing fails.
...
@@ -184,8 +187,9 @@ def parse_date(date_str):
         return unicode(formatted_date)
     except (ValueError, TypeError):
         msg = (
-            'The format for the given date ({}) is invalid. Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'
-        ).format(date_str)
+            'The format of the given date ({date}) for the {name} is invalid. '
+            'Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'
+        ).format(date=date_str, name=name)
         raise UpdateFromXmlError(_(msg))
...
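The effect of the new `name` kwarg can be sketched with a simplified, Python 3 version of `parse_date`. This is an assumption-laden sketch, not the repository's function: the real implementation uses `dateutil` and raises `UpdateFromXmlError`, while here parsing is reduced to one `strptime` format and a plain `ValueError`.

```python
from datetime import datetime

def parse_date(date_str, name=""):
    # Sketch: `name` identifies which date field was malformed, so the error
    # surfaced in Studio can say "peer start date" rather than just echoing
    # the bad string. Empty/missing input still maps to None.
    if not date_str:
        return None
    try:
        return datetime.strptime(date_str, "%Y-%m-%dT%H:%M:%S").isoformat()
    except (ValueError, TypeError):
        raise ValueError(
            'The format of the given date ({date}) for the {name} is invalid. '
            'Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'.format(
                date=date_str, name=name
            )
        )

parse_date("2014-10-04T00:00:00", name="peer start date")  # → '2014-10-04T00:00:00'
```

Callers in this commit pass names such as `"submission due date"` and `"{} start date".format(...)`, which is what makes the single shared message template informative.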
@@ -374,68 +378,6 @@ def parse_examples_xml(examples):
     return examples_list

-def parse_assessment_dictionaries(input_assessments):
-    assessments_list = []
-    for assessment in input_assessments:
-        assessment_dict = dict()
-        # Assessment name
-        if assessment.get('name'):
-            assessment_dict['name'] = unicode(assessment.get('name'))
-        else:
-            raise UpdateFromXmlError(_('All "assessment" elements must contain a "name" element.'))
-        # Assessment start
-        if assessment.get('start'):
-            try:
-                parsed_start = parse_date(assessment.get('start'))
-                assessment_dict['start'] = parsed_start
-            except UpdateFromXmlError:
-                raise UpdateFromXmlError(_('The date format in the "start" attribute is invalid. Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'))
-        else:
-            assessment_dict['start'] = None
-        # Assessment due
-        if assessment.get('due'):
-            try:
-                parsed_due = parse_date(assessment.get('due'))
-                assessment_dict['due'] = parsed_due
-            except UpdateFromXmlError:
-                raise UpdateFromXmlError(_('The date format in the "due" attribute is invalid. Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'))
-        else:
-            assessment_dict['due'] = None
-        # Assessment must_grade
-        if assessment.get('must_grade'):
-            try:
-                assessment_dict['must_grade'] = int(assessment.get('must_grade'))
-            except ValueError:
-                raise UpdateFromXmlError(_('The "must_grade" value must be a positive integer.'))
-        # Assessment must_be_graded_by
-        if assessment.get('must_be_graded_by'):
-            try:
-                assessment_dict['must_be_graded_by'] = int(assessment.get('must_be_graded_by'))
-            except ValueError:
-                raise UpdateFromXmlError(_('The "must_be_graded_by" value must be a positive integer.'))
-        # Training examples (can be for AI or for Student Training)
-        if assessment.get('examples'):
-            try:
-                assessment_dict['examples'] = parse_examples_xml_str(assessment.get('examples'))
-            except (UpdateFromXmlError, UnicodeError) as ex:
-                raise UpdateFromXmlError(_("There was an error in parsing the {0} examples: {1}".format(assessment.get('name'), ex)))
-        # Update the list of assessments
-        assessments_list.append(assessment_dict)
-    return assessments_list

 def parse_assessments_xml(assessments_root):
     """
     Parse the <assessments> element in the OpenAssessment XBlock's content XML.
...
@@ -464,21 +406,17 @@ def parse_assessments_xml(assessments_root):
         # Assessment start
         if 'start' in assessment.attrib:
-            parsed_start = parse_date(assessment.get('start'))
+            parsed_start = parse_date(
+                assessment.get('start'), name="{} start date".format(assessment_dict['name'])
+            )
             if parsed_start is not None:
                 assessment_dict['start'] = parsed_start
-            else:
-                raise UpdateFromXmlError(_('The date format in the "start" attribute is invalid. Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'))
         else:
             assessment_dict['start'] = None

         # Assessment due
         if 'due' in assessment.attrib:
-            parsed_start = parse_date(assessment.get('due'))
+            parsed_start = parse_date(
+                assessment.get('due'), name="{} due date".format(assessment_dict['name'])
+            )
             if parsed_start is not None:
                 assessment_dict['due'] = parsed_start
-            else:
-                raise UpdateFromXmlError(_('The date format in the "due" attribute is invalid. Make sure the date is formatted as YYYY-MM-DDTHH:MM:SS.'))
         else:
             assessment_dict['due'] = None
...
@@ -711,13 +649,13 @@ def parse_from_xml(root):
     # Set it to None by default; we will update it to the latest start date later on
     submission_start = None
     if 'submission_start' in root.attrib:
-        submission_start = parse_date(unicode(root.attrib['submission_start']))
+        submission_start = parse_date(unicode(root.attrib['submission_start']), name="submission start date")

     # Retrieve the due date for the submission
     # Set it to None by default; we will update it to the earliest deadline later on
     submission_due = None
     if 'submission_due' in root.attrib:
-        submission_due = parse_date(unicode(root.attrib['submission_due']))
+        submission_due = parse_date(unicode(root.attrib['submission_due']), name="submission due date")

     # Retrieve the title
     title_el = root.find('title')
...
@@ -827,10 +765,10 @@ def parse_examples_xml_str(xml):
     """
     # This should work for both wrapped and unwrapped examples. Based on our final configuration (and tests)
     # we should handle both cases gracefully.
-    if "<examples>" not in xml:
-        xml = u"<data>" + xml + u"</data>"
-    else:
-        xml = unicode(xml)
+    xml = unicode(xml)
+    xml = u"<examples>" + xml + u"</examples>"
     return parse_examples_xml(list(_unicode_to_xml(xml).findall('example')))
...
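The idea behind wrapping the examples string before parsing can be shown with a small Python 3 sketch. This is an illustration under assumptions, not the repository's code: `wrap_and_find_examples` is a hypothetical helper, and `xml.etree.ElementTree` stands in for the module's `_unicode_to_xml`.

```python
import xml.etree.ElementTree as ET

def wrap_and_find_examples(fragment):
    # A root-less fragment with several sibling <example> elements is not
    # well-formed XML on its own; giving it a single synthetic <examples>
    # root lets a standard parser accept it, after which the individual
    # <example> children can be collected.
    wrapped = "<examples>" + fragment + "</examples>"
    return ET.fromstring(wrapped).findall("example")

found = wrap_and_find_examples(
    "<example><answer>a</answer></example>"
    "<example><answer>b</answer></example>"
)
# two <example> elements recovered from a root-less fragment
```

Note that blindly wrapping input that already carries a root element would nest it one level deeper, which is presumably why the original code branched on whether `<examples>` was present.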