- 06 Dec, 2017 (3 commits)

The order import query now retrieves the UUID of the course (not to be confused with the course run) for which an entitlement was purchased. LEARNER-1745
Clinton Blackburn committed

Fix source for course_subjects in BigQuery load.
brianhw committed

Brian Wilson committed
- 05 Dec, 2017 (11 commits)

Remove traceback info.
Hassan committed

Hassan Javeed committed

Updated analyze script to support newer releases of EMR.
Hassan committed

Hassan Javeed committed

pycodestyle reads from setup.cfg, so there is no need for an additional configuration file. LEARNER-1745
Clinton Blackburn committed

All imports have been sorted using isort. Unused imports have been removed. isort is now run as part of code quality checks. LEARNER-1745
Clinton Blackburn committed

A Dockerfile is now included so that developers can use a Docker image/container for local testing. Travis now builds the Docker image and runs tests on the container. The image is pushed to Docker Hub for successful master builds. LEARNER-1745
Clinton Blackburn committed

LEARNER-1745
Clinton Blackburn committed

Incrementalization of user-activity.
Hassan committed

Hassan Javeed committed

Update pep8 to pycodestyle.
brianhw committed
- 04 Dec, 2017 (2 commits)

Brian Wilson committed

Move course-subjects to use the discovery API.
brianhw committed
- 02 Dec, 2017 (1 commit)

Brian Wilson committed
- 27 Nov, 2017 (1 commit)

Pin some google-related packages.
brianhw committed
- 25 Nov, 2017 (1 commit)

Handle null-duration video events.
brianhw committed
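As an illustration of the kind of guard this fix implies, here is a minimal, hypothetical sketch (the function and field names are assumptions, not the pipeline's actual code) of treating a missing or null video duration as unknown instead of failing:

```python
# Hypothetical sketch: tolerate events whose "duration" field is null or missing.
UNKNOWN_DURATION = -1


def extract_duration(event_payload):
    """Return the video duration in seconds, or UNKNOWN_DURATION if absent."""
    duration = event_payload.get('duration')
    if duration is None:
        # Some players emit events before the duration is known; don't fail on them.
        return UNKNOWN_DURATION
    try:
        return float(duration)
    except (TypeError, ValueError):
        return UNKNOWN_DURATION
```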
- 24 Nov, 2017 (1 commit)

Brian Wilson committed
- 22 Nov, 2017 (2 commits)

Handle failures in _finish for Vertica.
brianhw committed
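A minimal sketch of the idea behind this change, assuming a task whose _finish step copies staged data into Vertica; the names below are illustrative, not the project's actual classes, and simply show catching errors raised during that final step and cleaning up the connection instead of swallowing them:

```python
# Illustrative only: surface errors raised while finalizing a Vertica copy
# and make sure the connection is cleaned up either way.
import logging

log = logging.getLogger(__name__)


def finish_copy(connection, run_copy):
    """Run the final copy/commit step, rolling back and closing on failure."""
    try:
        run_copy(connection)
        connection.commit()
    except Exception:
        log.exception("Vertica _finish step failed; rolling back.")
        connection.rollback()
        raise
    finally:
        connection.close()
```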
Brian Wilson committed
- 21 Nov, 2017 (2 commits)

Update boto version.
brianhw committed

Add host argument to s3_connect, and update the unit test patches to match. The host argument is now required by boto. The latest version of boto provides a hook to read from a .boto file, and there is a way for remote-task to set such a file up and then read from it. However, it is not clear how to get the .boto file to the task instances in a multi-instance cluster so that the file would be read by boto when reducers open their own streams to output to an S3 file.
Brian Wilson committed
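For context, a minimal sketch of passing an explicit host when opening a boto (2.x) S3 connection; the endpoint and bucket name here are examples, not project settings:

```python
# Sketch: the legacy boto 2.x library accepts the S3 endpoint as a host argument.
import boto

# Example endpoint only; credentials are picked up from the environment or config.
connection = boto.connect_s3(host='s3.amazonaws.com')
bucket = connection.get_bucket('some-bucket-name')  # hypothetical bucket name
```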
- 17 Nov, 2017 (1 commit)

Like an earlier fix already applied to MySQL.
Brian Wilson committed
- 15 Nov, 2017 (1 commit)

Use video duration from events.
Muhammad Ammar committed
- 14 Nov, 2017 (2 commits)

EDUCATOR-1411
muhammad-ammar committed

Added parameter import_credentials to HistogramFromSqoopToMySQLWorkflowBase.
Jillian Vogel committed
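For readers unfamiliar with how such workflow parameters are declared, a minimal sketch using Luigi's parameter mechanism; the class below is an illustrative stand-in, not the actual workflow definition:

```python
# Illustrative sketch of declaring a credentials-file parameter on a Luigi task.
import luigi


class HistogramWorkflowExample(luigi.Task):  # hypothetical stand-in class
    # Path to the credentials file used when importing from the source database.
    import_credentials = luigi.Parameter(
        description='Path to the file containing database import credentials.'
    )

    def run(self):
        # The parameter is available as a plain attribute at run time.
        print('Using credentials file:', self.import_credentials)
```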
- 13 Nov, 2017 (2 commits)

Allow empty inserts for ModuleEngagementSummaryMetricRangesMysqlTask.
Jillian Vogel committed

For sites with periods of low activity.
Jillian Vogel committed
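A minimal sketch of the idea, with hypothetical names: a MySQL insert step that treats an empty result set as a valid, if trivial, outcome rather than an error, which matters for small sites that can have days with no module engagement activity:

```python
# Hypothetical sketch: permit an insert step to succeed when there are no rows.
def insert_rows(cursor, table, rows, allow_empty_insert=True):
    """Insert rows into `table`; an empty `rows` list is not an error if allowed."""
    if not rows:
        if allow_empty_insert:
            return 0  # Nothing to do: quiet periods simply produce no ranges.
        raise ValueError('Expected at least one row to insert into %s' % table)
    cursor.executemany(
        'INSERT INTO {table} VALUES ({placeholders})'.format(
            table=table,
            placeholders=', '.join(['%s'] * len(rows[0])),
        ),
        rows,
    )
    return len(rows)
```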
- 06 Nov, 2017 (1 commit)

Inspect traceback for handling end of output error.
Hassan committed
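A minimal sketch of the technique this title describes, with an assumed error string; it simply shows inspecting the formatted traceback to decide whether a failure is the known "end of output" case that should be tolerated:

```python
# Illustrative sketch: decide how to handle a failure by inspecting its traceback.
import traceback

# Assumed marker text; the real error string may differ.
END_OF_OUTPUT_MARKER = 'end of output'


def run_with_tolerance(step):
    """Run `step`, swallowing only the known end-of-output failure."""
    try:
        step()
    except Exception:
        details = traceback.format_exc()
        if END_OF_OUTPUT_MARKER in details:
            # Known benign condition: log and continue rather than fail the job.
            print('Ignoring end-of-output error:\n%s' % details)
            return
        raise
```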
- 02 Nov, 2017 (1 commit)

Hassan Javeed committed
- 01 Nov, 2017 (1 commit)

Add missing backslash for null ASCII.
brianhw committed
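To illustrate the class of bug this fixes (the exact file and escape convention are assumptions): in Python source, '\0' is a single NUL byte, while '\\0' is the two-character sequence backslash-zero that a text loader typically expects as the escaped representation:

```python
# Sketch of the distinction behind "missing backslash" bugs with NUL characters.
nul_byte = '\0'        # one character: the actual NUL byte (ASCII 0)
escaped_nul = '\\0'    # two characters: a backslash followed by the digit zero

assert len(nul_byte) == 1
assert len(escaped_nul) == 2

# When writing delimited text for a bulk loader, the escaped form is usually
# what must appear in the file; emitting the raw NUL byte corrupts the row.
row_value = 'field-with-nul\0'.replace('\0', escaped_nul)
```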
- 31 Oct, 2017 (1 commit)

Brian Wilson committed
- 26 Oct, 2017 (1 commit)

Brian/also load json events
brianhw committed
- 25 Oct, 2017 (1 commit)

Translation of MySQL's LONGBLOB to Vertica's LONG VARBINARY.
Andrew Zafft committed
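A minimal sketch of how such a source-to-target type translation is typically expressed when generating Vertica DDL from MySQL column types; only the LONGBLOB entry is taken from this commit, the other entries are plausible examples of the same pattern:

```python
# Illustrative mapping from MySQL column types to Vertica column types.
MYSQL_TO_VERTICA_TYPE = {
    'LONGBLOB': 'LONG VARBINARY',  # the translation added in this change
    # Other entries shown only as plausible examples of the same pattern:
    'LONGTEXT': 'LONG VARCHAR',
    'DATETIME': 'TIMESTAMP',
}


def vertica_type(mysql_type):
    """Return the Vertica type for a MySQL type, defaulting to VARCHAR."""
    return MYSQL_TO_VERTICA_TYPE.get(mysql_type.upper(), 'VARCHAR')
```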
- 24 Oct, 2017 (1 commit)

JSON event record output coexists with regular event output, and is controlled by an optional parameter. By default the job runs with event_record_type equal to 'EventRecord', but this can be overridden by running with --event-record-type 'JsonEventRecord'. Includes a bug fix to timestamp handling: add validation screening out dates earlier than 1900. Also includes support for event loading to BigQuery, by adding partitioning support to bigquery_load.
* Use records for warehouse loading where defined.
* Check BigQuery availability in load code.
* Add support for loading to S3 by interval or by date. PerDate loading checks whether the data already exists, which is good for incremental runs. Bulk loading just runs over an interval and assumes the data is not already present on S3; this is more efficient for processing many days.
* To address an issue with loading into BigQuery, null characters in column values are encoded as the string '\0'.
Brian Wilson committed
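Two of the details above lend themselves to a short illustration; the function names are hypothetical, but the behaviors match the description: column values destined for BigQuery have NUL characters replaced with the literal two-character string '\0', and timestamps before 1900 are rejected as invalid:

```python
# Hypothetical helpers illustrating the two fixes described above.
import datetime

MIN_VALID_YEAR = 1900


def encode_for_bigquery(value):
    """Replace raw NUL characters with the two-character string backslash-zero."""
    return value.replace('\x00', '\\0')


def screen_timestamp(timestamp):
    """Return the timestamp, or None if it predates 1900 and should be dropped."""
    if timestamp.year < MIN_VALID_YEAR:
        return None
    return timestamp


assert encode_for_bigquery('a\x00b') == 'a\\0b'
assert screen_timestamp(datetime.datetime(1899, 12, 31)) is None
```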
- 23 Oct, 2017 (3 commits)

Check Sqoop job completion before running.
brianhw committed
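A minimal sketch of the idea behind this check, using an assumed marker-file convention (the marker name is an assumption, not necessarily what the pipeline writes): before launching a Sqoop import, look for evidence that a previous run already completed successfully and skip the work if so:

```python
# Illustrative sketch: skip a Sqoop import if its output already looks complete.
import os

# Assumed convention: a marker file written when the import finishes successfully.
COMPLETION_MARKER = '_SUCCESS'


def sqoop_import_if_needed(output_dir, run_sqoop_import):
    """Run the import only when no completion marker is present."""
    marker_path = os.path.join(output_dir, COMPLETION_MARKER)
    if os.path.exists(marker_path):
        print('Sqoop output already complete at %s; skipping.' % output_dir)
        return
    run_sqoop_import(output_dir)
```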
Brian Wilson committed

Make sure enrollment output is encoded.
brianhw committed