Commit 332a4405, authored Jun 19, 2013 by Brian Wilson
Enable per-student background tasks.
parent 4d2183d7
Showing 7 changed files with 87 additions and 27 deletions (+87 -27)
lms/djangoapps/instructor_task/api_helper.py                +2   -5
lms/djangoapps/instructor_task/models.py                    +20  -1
lms/djangoapps/instructor_task/tests/test_api.py            +1   -1
lms/djangoapps/instructor_task/tests/test_integration.py    +1   -1
lms/djangoapps/instructor_task/tests/test_tasks.py          +58  -13
lms/djangoapps/instructor_task/tests/test_views.py          +2   -3
lms/templates/courseware/instructor_dashboard.html          +3   -3
lms/djangoapps/instructor_task/api_helper.py

@@ -2,8 +2,6 @@ import hashlib
 import json
 import logging
-from django.db import transaction
 from celery.result import AsyncResult
 from celery.states import READY_STATES, SUCCESS, FAILURE, REVOKED
@@ -30,7 +28,6 @@ def _task_is_running(course_id, task_type, task_key):
     return len(runningTasks) > 0


-@transaction.autocommit
 def _reserve_task(course_id, task_type, task_key, task_input, requester):
     """
     Creates a database entry to indicate that a task is in progress.
@@ -39,9 +36,9 @@ def _reserve_task(course_id, task_type, task_key, task_input, requester):
     Includes the creation of an arbitrary value for task_id, to be
     submitted with the task call to celery.

-    Autocommit annotation makes sure the database entry is committed.
+    The InstructorTask.create method makes sure the InstructorTask entry is committed.
     When called from any view that is wrapped by TransactionMiddleware,
-    and thus in a "commit-on-success" transaction, this autocommit here
+    and thus in a "commit-on-success" transaction, an autocommit buried within here
     will cause any pending transaction to be committed by a successful
     save here. Any future database operations will take place in a
     separate transaction.
...
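Note on the change above: the commit drops the @transaction.autocommit decorator from _reserve_task and relies on InstructorTask.create / save_now to commit the row. A minimal sketch of the pre-Django-1.6 transaction semantics the docstring describes; the decorator names come from this diff, while the view-level wrapper and the model instances are assumptions for illustration only:

    from django.db import transaction


    @transaction.autocommit
    def save_entry(entry):
        # Under the old (pre-Django-1.6) autocommit decorator, a successful
        # save() is committed immediately -- which also commits any writes
        # still pending in an enclosing transaction.
        entry.save()


    @transaction.commit_on_success   # roughly what TransactionMiddleware wraps a view in
    def hypothetical_view_logic(entry, other):
        other.save()       # pending until the view's transaction commits...
        save_entry(entry)  # ...but the buried autocommit commits it here
        # anything written after this point runs in a separate transaction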
lms/djangoapps/instructor_task/models.py

@@ -72,6 +72,16 @@ class InstructorTask(models.Model):
     @classmethod
     def create(cls, course_id, task_type, task_key, task_input, requester):
+        """
+        Create an instance of InstructorTask.
+
+        The InstructorTask.save_now method makes sure the InstructorTask entry is committed.
+        When called from any view that is wrapped by TransactionMiddleware,
+        and thus in a "commit-on-success" transaction, an autocommit buried within here
+        will cause any pending transaction to be committed by a successful
+        save here. Any future database operations will take place in a
+        separate transaction.
+        """
         # create the task_id here, and pass it into celery:
         task_id = str(uuid4())
@@ -99,7 +109,16 @@ class InstructorTask(models.Model):
     @transaction.autocommit
     def save_now(self):
-        """Writes InstructorTask immediately, ensuring the transaction is committed."""
+        """
+        Writes InstructorTask immediately, ensuring the transaction is committed.
+
+        Autocommit annotation makes sure the database entry is committed.
+        When called from any view that is wrapped by TransactionMiddleware,
+        and thus in a "commit-on-success" transaction, this autocommit here
+        will cause any pending transaction to be committed by a successful
+        save here. Any future database operations will take place in a
+        separate transaction.
+        """
         self.save()

     @staticmethod
...
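The new create() docstring documents the pattern of generating the celery task_id up front and committing the row before the task is queued. A hedged sketch of a caller following that pattern; submit_task and task_class are illustrative names, not part of this commit:

    from instructor_task.models import InstructorTask


    def submit_task(course_id, task_type, task_key, task_input, requester, task_class):
        # create() generates the task_id (str(uuid4())) and commits the row via
        # save_now(), so a worker that starts immediately can already look it up.
        entry = InstructorTask.create(course_id, task_type, task_key, task_input, requester)
        # Hand celery both the database id and the pre-generated task_id.
        task_class.apply_async((entry.id,), task_id=entry.task_id)
        return entry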
lms/djangoapps/instructor_task/tests/test_api.py

@@ -22,7 +22,7 @@ from instructor_task.tests.test_base import (InstructorTaskTestCase,
 class InstructorTaskReportTest(InstructorTaskTestCase):
     """
-    Tests API and view methods that involve the reporting of status for background tasks.
+    Tests API methods that involve the reporting of status for background tasks.
     """
     def test_get_running_instructor_tasks(self):
...
lms/djangoapps/instructor_task/tests/test_integration.py

 """
-Integration Tests for LMS instructor-initiated background tasks
+Integration Tests for LMS instructor-initiated background tasks.

 Runs tasks on answers to course problems to validate that code
 paths actually work.
...
lms/djangoapps/instructor_task/tests/test_tasks.py

 """
-Unit tests for LMS instructor-initiated background tasks,
+Unit tests for LMS instructor-initiated background tasks.

 Runs tasks on answers to course problems to validate that code
 paths actually work.
@@ -7,6 +7,7 @@ paths actually work.
 """
 import json
 from uuid import uuid4
+from unittest import skip

 from mock import Mock, patch
@@ -62,6 +63,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         }

     def _run_task_with_mock_celery(self, task_function, entry_id, task_id, expected_failure_message=None):
+        """Submit a task and mock how celery provides a current_task."""
         self.current_task = Mock()
         self.current_task.request = Mock()
         self.current_task.request.id = task_id
@@ -73,7 +75,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
             return task_function(entry_id, self._get_xmodule_instance_args())

     def _test_missing_current_task(self, task_function):
-        # run without (mock) Celery running
+        """Check that a task_function fails when celery doesn't provide a current_task."""
         task_entry = self._create_input_entry()
         with self.assertRaises(UpdateProblemModuleStateError):
             task_function(task_entry.id, self._get_xmodule_instance_args())
@@ -88,7 +90,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self._test_missing_current_task(delete_problem_state)

     def _test_undefined_problem(self, task_function):
-        # run with celery, but no problem defined
+        """Run with celery, but no problem defined."""
         task_entry = self._create_input_entry()
         with self.assertRaises(ItemNotFoundError):
             self._run_task_with_mock_celery(task_function, task_entry.id, task_entry.task_id)
@@ -103,7 +105,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self._test_undefined_problem(delete_problem_state)

     def _test_run_with_task(self, task_function, action_name, expected_num_updated):
-        # run with some StudentModules for the problem
+        """Run a task and check the number of StudentModules processed."""
         task_entry = self._create_input_entry()
         status = self._run_task_with_mock_celery(task_function, task_entry.id, task_entry.task_id)
         # check return value
@@ -118,7 +120,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self.assertEquals(entry.task_state, SUCCESS)

     def _test_run_with_no_state(self, task_function, action_name):
-        # run with no StudentModules for the problem
+        """Run with no StudentModules defined for the current problem."""
         self.define_option_problem(PROBLEM_URL_NAME)
         self._test_run_with_task(task_function, action_name, 0)
@@ -185,7 +187,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
                                              module_state_key=self.problem_url)

     def _test_reset_with_student(self, use_email):
-        # run with some StudentModules for the problem
+        """Run a reset task for one student, with several StudentModules for the problem defined."""
         num_students = 10
         initial_attempts = 3
         input_state = json.dumps({'attempts': initial_attempts})
@@ -233,8 +235,7 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self._test_reset_with_student(True)

     def _test_run_with_failure(self, task_function, expected_message):
-        # run with no StudentModules for the problem,
-        # because we will fail before entering the loop.
+        """Run a task and trigger an artificial failure with give message."""
         task_entry = self._create_input_entry()
         self.define_option_problem(PROBLEM_URL_NAME)
         with self.assertRaises(TestTaskFailure):
@@ -256,8 +257,10 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self._test_run_with_failure(delete_problem_state, 'We expected this to fail')

     def _test_run_with_long_error_msg(self, task_function):
-        # run with an error message that is so long it will require
-        # truncation (as well as the jettisoning of the traceback).
+        """
+        Run with an error message that is so long it will require
+        truncation (as well as the jettisoning of the traceback).
+        """
         task_entry = self._create_input_entry()
         self.define_option_problem(PROBLEM_URL_NAME)
         expected_message = "x" * 1500
@@ -282,9 +285,11 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self._test_run_with_long_error_msg(delete_problem_state)

     def _test_run_with_short_error_msg(self, task_function):
-        # run with an error message that is short enough to fit
-        # in the output, but long enough that the traceback won't.
-        # Confirm that the traceback is truncated.
+        """
+        Run with an error message that is short enough to fit
+        in the output, but long enough that the traceback won't.
+        Confirm that the traceback is truncated.
+        """
         task_entry = self._create_input_entry()
         self.define_option_problem(PROBLEM_URL_NAME)
         expected_message = "x" * 900
@@ -330,3 +335,43 @@ class TestInstructorTasks(InstructorTaskModuleTestCase):
         self.assertEquals(output['exception'], 'ValueError')
         self.assertTrue("Length of task output is too long" in output['message'])
         self.assertTrue('traceback' not in output)
+
+    @skip
+    def test_rescoring_unrescorable(self):
+        # TODO: this test needs to have Mako templates initialized
+        #       to make sure that the creation of an XModule works.
+        input_state = json.dumps({'done': True})
+        num_students = 1
+        self._create_students_with_state(num_students, input_state)
+        task_entry = self._create_input_entry()
+        with self.assertRaises(UpdateProblemModuleStateError):
+            self._run_task_with_mock_celery(rescore_problem, task_entry.id, task_entry.task_id)
+        # check values stored in table:
+        entry = InstructorTask.objects.get(id=task_entry.id)
+        output = json.loads(entry.task_output)
+        self.assertEquals(output['exception'], "UpdateProblemModuleStateError")
+        self.assertEquals(output['message'], "Specified problem does not support rescoring.")
+        self.assertGreater(len(output['traceback']), 0)
+
+    @skip
+    def test_rescoring_success(self):
+        # TODO: this test needs to have Mako templates initialized
+        #       to make sure that the creation of an XModule works.
+        input_state = json.dumps({'done': True})
+        num_students = 10
+        self._create_students_with_state(num_students, input_state)
+        task_entry = self._create_input_entry()
+        mock_instance = Mock()
+        mock_instance.rescore_problem = Mock({'success': 'correct'})
+        # TODO: figure out why this mock is not working....
+        with patch('courseware.module_render.get_module_for_descriptor_internal') as mock_get_module:
+            mock_get_module.return_value = mock_instance
+            self._run_task_with_mock_celery(rescore_problem, task_entry.id, task_entry.task_id)
+        # check return value
+        entry = InstructorTask.objects.get(id=task_entry.id)
+        output = json.loads(entry.task_output)
+        self.assertEquals(output.get('attempted'), num_students)
+        self.assertEquals(output.get('updated'), num_students)
+        self.assertEquals(output.get('total'), num_students)
+        self.assertEquals(output.get('action_name'), 'rescored')
+        self.assertGreater('duration_ms', 0)
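The _run_task_with_mock_celery helper above substitutes a Mock for celery's current_task so the task functions can be called synchronously. A standalone sketch of that idea; the patch target and the empty xmodule-args dict are assumptions for illustration, not the project's actual wiring, which goes through the test base class:

    from mock import Mock, patch


    def run_with_fake_current_task(task_function, entry_id, task_id):
        fake_task = Mock()
        fake_task.request = Mock()
        fake_task.request.id = task_id          # must match InstructorTask.task_id
        # Assumed patch target, for illustration only.
        with patch('instructor_task.tasks.current_task', fake_task):
            return task_function(entry_id, {})  # {} stands in for xmodule_instance_args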
lms/djangoapps/instructor_task/tests/test_views.py

 """
-Test for LMS instructor background task queue management
+Test for LMS instructor background task views.
 """
 import json

 from celery.states import SUCCESS, FAILURE, REVOKED, PENDING
@@ -18,7 +18,7 @@ from instructor_task.views import instructor_task_status, get_task_completion_in
 class InstructorTaskReportTest(InstructorTaskTestCase):
     """
-    Tests API and view methods that involve the reporting of status for background tasks.
+    Tests view methods that involve the reporting of status for background tasks.
     """
     def _get_instructor_task_status(self, task_id):
@@ -263,4 +263,3 @@ class InstructorTaskReportTest(InstructorTaskTestCase):
         succeeded, message = get_task_completion_info(instructor_task)
         self.assertFalse(succeeded)
         self.assertEquals(message, "Problem rescored for 2 of 3 students (out of 5)")
lms/templates/courseware/instructor_dashboard.html

@@ -249,7 +249,7 @@ function goto( mode)
       <p>
       Then select an action:
       <input type="submit" name="action" value="Reset student's attempts">
-      %if settings.MITX_FEATURES.get('ENABLE_COURSE_BACKGROUND_TASKS'):
+      %if settings.MITX_FEATURES.get('ENABLE_INSTRUCTOR_BACKGROUND_TASKS'):
       <input type="submit" name="action" value="Rescore student's problem submission">
       %endif
       </p>
@@ -260,9 +260,9 @@ function goto( mode)
       <input type="submit" name="action" value="Delete student state for module">
       </p>
     %endif
-    %if settings.MITX_FEATURES.get('ENABLE_COURSE_BACKGROUND_TASKS'):
+    %if settings.MITX_FEATURES.get('ENABLE_INSTRUCTOR_BACKGROUND_TASKS'):
       <p>Rescoring runs in the background, and status for active tasks will appear in a table below.
-      To see status for all tasks submitted for this course and student, click on this button:
+      To see status for all tasks submitted for this problem and student, click on this button:
       </p>
       <p>
       <input type="submit" name="action" value="Show Background Task History for Student">
...
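The template changes rename the gating flag from ENABLE_COURSE_BACKGROUND_TASKS to ENABLE_INSTRUCTOR_BACKGROUND_TASKS. A hedged sketch of checking that flag from Python; only the flag name and the MITX_FEATURES dict come from this diff, the helper function is illustrative:

    from django.conf import settings


    def instructor_background_tasks_enabled():
        # Mirrors the Mako check:
        #   %if settings.MITX_FEATURES.get('ENABLE_INSTRUCTOR_BACKGROUND_TASKS'):
        return bool(settings.MITX_FEATURES.get('ENABLE_INSTRUCTOR_BACKGROUND_TASKS', False))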