From 743034b158ba25640b02845a6cccc7502dc82755 Mon Sep 17 00:00:00 2001
From: Mary Gwozdz
Date: Wed, 4 Feb 2026 13:08:41 -0700
Subject: [PATCH 1/5] docs: add ADR 0020 assessment criteria location

---
 .../0020-assessment-criteria-location.rst | 61 +++++++++++++++++++
 1 file changed, 61 insertions(+)
 create mode 100644 docs/decisions/0020-assessment-criteria-location.rst

diff --git a/docs/decisions/0020-assessment-criteria-location.rst b/docs/decisions/0020-assessment-criteria-location.rst
new file mode 100644
index 000000000..25802544f
--- /dev/null
+++ b/docs/decisions/0020-assessment-criteria-location.rst
@@ -0,0 +1,61 @@
20. Where in the codebase should CBE assessment criteria go?
============================================================

Context
-------
Competency Based Education (CBE) requires that the LMS be able to track learners' mastery of competencies through assessment criteria. For example, in order to demonstrate that I have mastered the Multiplication competency, I need to have earned 75% or higher on Assignment 1 or Assignment 2. Together, the competency, the threshold, the assignments, and the logical OR operator make up the assessment criteria for the competency. Course Authors and Platform Administrators need a way to set up these associations in Studio so that outcomes can be calculated as learners complete their materials. This is an important prerequisite for displaying competency progress dashboards to learners and staff, and for making Open edX the platform of choice for institutions using the CBE model.

Decisions
---------
CBE Assessment Criteria, Student Assessment Criteria Status, and Student Competency Status values should go in the openedx-learning repository. This aligns with the broader architectural goal of refactoring as much code as possible out of the edx-platform repository and into openedx-learning, where it can be designed to be easy for plugin developers to use.

More specifically, all code related to adding Assessment Criteria to Open edX will live in ``openedx-learning/openedx_learning/apps/assessment_criteria``. The exception is a small app in edx-platform that receives grading signals, invokes the openedx-learning evaluation logic to perform calculations, and persists the results in openedx-learning (sketched below). This is a pragmatic integration until grading events move out of edx-platform and into openedx-events; keeping grading signal access in edx-platform for now is acknowledged technical debt.

This keeps a single cohesive Django app for authoring the criteria and for storing learner status derived from those criteria, which reduces cross-app dependencies and simplifies migrations and APIs. It also keeps Open edX-specific models (users, course identifiers, LMS/Studio workflows) out of the standalone ``openedx_tagging`` package and avoids forcing the authoring app to depend on learner runtime data. The tradeoff is that authoring and runtime concerns live in the same app; if learner status needs to scale differently or be owned separately in the future, a split into a dedicated status app can be revisited. Alternatives that externalize runtime status to analytics/services or split repos introduce operational and coordination overhead that is not justified at this stage.

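As a rough illustration of the shim described above, the edx-platform app could be little more than a signal receiver. This is a minimal sketch, not a final design: it assumes edx-platform's ``COURSE_GRADE_CHANGED`` signal, and the ``evaluate_criteria_for_user`` entry point and its module path are hypothetical names for the openedx-learning evaluation logic.

.. code-block:: python

    # edx-platform shim app: receivers.py (sketch)
    from django.dispatch import receiver

    from openedx.core.djangoapps.signals.signals import COURSE_GRADE_CHANGED

    # Hypothetical API; the real module path and signature are not yet defined.
    from openedx_learning.apps.assessment_criteria.api import evaluate_criteria_for_user


    @receiver(COURSE_GRADE_CHANGED)
    def handle_course_grade_changed(sender, user, course_grade, course_key, **kwargs):
        """On any grade change, recompute and persist criteria/competency status.

        All evaluation logic and persistence live in openedx-learning; this
        receiver only bridges the grading signal until grading events move
        to openedx-events.
        """
        evaluate_criteria_for_user(
            user_id=user.id,
            course_key=str(course_key),
            course_grade=course_grade,
        )

The app stays deliberately thin so that, once grading events are available in openedx-events, it can be removed without touching the evaluation logic.
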
Rejected Alternatives
---------------------

1. edx-platform repository

   - Pros: This is where all data currently associated with students is stored, so it would match the existing pattern and reduce integration work for the LMS.
   - Cons: The intention is to move core learning concepts out of edx-platform (see `0001-purpose-of-this-repo.rst <0001-purpose-of-this-repo.rst>`_), and keeping it there makes reuse and pluggability harder.

2. All code related to adding Assessment Criteria to Open edX goes in ``openedx-learning/openedx_learning/apps/authoring/assessment_criteria``

   - Pros:

     - Tagging and assessment criteria are part of content authoring workflows, as is all of the other code in this directory.
     - All other elements using the Publishable Framework are in this directory.

   - Cons:

     - We want each package of code to be independent, and this would separate assessment criteria from the tags that they depend on.
     - Assessment criteria also include learner status and runtime evaluation, which do not fit cleanly in the authoring app.
     - The learner status models in this feature would have a ForeignKey to ``settings.AUTH_USER_MODEL``, which is a runtime/learner concern. If those models lived under the authoring app, the authoring app would have to import and depend on the user model, forcing an authoring-only package to carry learner/runtime dependencies and creating unwanted coupling (see the sketch after this list).

3. New Assessment Criteria Content tables will go in ``openedx-learning/openedx_learning/openedx_tagging/core/assessment_criteria``; new Student Status tables will go in ``openedx-learning/student_status``

   - Pros:

     - Keeps assessment criteria in the same package as the tags that they depend on.

   - Cons:

     - ``openedx_tagging`` is intended to be a standalone library without Open edX-specific dependencies (see `0007-tagging-app.rst <0007-tagging-app.rst>`_); placing assessment criteria there would violate that boundary.
     - Splitting Assessment Criteria and Student Statuses into two apps would require cross-app foreign keys (e.g., status rows pointing at criteria/tag rows in another app), careful migration ordering and dependency declarations so that tables are created in the right order, and shared business logic or APIs for computing/updating status that must live in one app while referencing models in the other.

4. Split assessment criteria and learner statuses into two apps inside ``openedx-learning/openedx_learning/apps`` (e.g., ``assessment_criteria`` and ``learner_status``)

   - Pros:

     - Clear separation between authoring configuration and computed learner state.
     - Could allow different storage or scaling strategies for status data.

   - Cons:

     - Still introduces cross-app dependency and coordination for a single feature set.
     - May be premature for the POC; adds overhead without proven need.

5. Store learner status in a separate service

   - Pros:

     - Scales independently and avoids write-heavy tables in the core app database.
     - Could potentially reuse existing infrastructure for grades.

   - Cons:

     - Introduces eventual consistency and more integration complexity for LMS/Studio views.
     - Requires additional infrastructure and operational ownership.

6. Split authoring and runtime into separate repos/packages

   - Pros:

     - Clear ownership boundaries and independent release cycles.

   - Cons:

     - Adds packaging and versioning overhead for a tightly coupled domain.
     - Increases coordination cost for migrations and API changes.

7. Migrate grading signals to openedx-events now and have openedx-learning consume events directly

   - Pros:

     - Aligns with the long-term direction of moving events out of edx-platform.
     - Avoids a shim app in edx-platform and reduces tech debt.

   - Cons:

     - Requires cross-repo coordination and work beyond the current scope.
     - Depends on changes to openedx-events that are not yet scheduled or ready.

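To make the coupling concern in alternative 2 concrete, here is a minimal sketch of the two kinds of models involved. The model and field names are illustrative assumptions, not a proposed schema; the point is that learner status necessarily references the user model while the authored criteria do not.

.. code-block:: python

    # openedx_learning/apps/assessment_criteria/models.py (illustrative only)
    from django.conf import settings
    from django.db import models


    class AssessmentCriterion(models.Model):
        """Authoring-side configuration: pure content, no learner dependency."""

        competency_ref = models.CharField(max_length=255)
        threshold = models.DecimalField(max_digits=5, decimal_places=2)


    class StudentCriterionStatus(models.Model):
        """Runtime-side state: inherently tied to the user model.

        This ForeignKey is why learner status does not belong in an
        authoring-only package.
        """

        user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
        criterion = models.ForeignKey(AssessmentCriterion, on_delete=models.CASCADE)
        is_met = models.BooleanField(default=False)
        updated_at = models.DateTimeField(auto_now=True)

Keeping both models in one app, as decided above, avoids a cross-app ForeignKey while still keeping the authoring model itself free of user dependencies.
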
From cc0ec47937ef1c7fa6f655342e73d0fdc5105b11 Mon Sep 17 00:00:00 2001
From: Mary Gwozdz
Date: Wed, 4 Feb 2026 13:11:58 -0700
Subject: [PATCH 2/5] docs: Increment ADR number

---
 ...teria-location.rst => 0021-assessment-criteria-location.rst} | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
 rename docs/decisions/{0020-assessment-criteria-location.rst => 0021-assessment-criteria-location.rst} (99%)

diff --git a/docs/decisions/0020-assessment-criteria-location.rst b/docs/decisions/0021-assessment-criteria-location.rst
similarity index 99%
rename from docs/decisions/0020-assessment-criteria-location.rst
rename to docs/decisions/0021-assessment-criteria-location.rst
index 25802544f..66c043c46 100644
--- a/docs/decisions/0020-assessment-criteria-location.rst
+++ b/docs/decisions/0021-assessment-criteria-location.rst
@@ -1,4 +1,4 @@
-20. Where in the codebase should CBE assessment criteria go?
+21. Where in the codebase should CBE assessment criteria go?
 ============================================================
 
 Context

From b8440490ef8d3bfc7f63d814ee1681b46f9751e7 Mon Sep 17 00:00:00 2001
From: Mary Gwozdz
Date: Wed, 4 Feb 2026 13:08:50 -0700
Subject: [PATCH 3/5] docs: add ADR 0021 assessment criteria versioning

---
 .../0021-assessment-criteria-versioning.rst | 47 +++++++++++++++++++
 1 file changed, 47 insertions(+)
 create mode 100644 docs/decisions/0021-assessment-criteria-versioning.rst

diff --git a/docs/decisions/0021-assessment-criteria-versioning.rst b/docs/decisions/0021-assessment-criteria-versioning.rst
new file mode 100644
index 000000000..8cba4bd3b
--- /dev/null
+++ b/docs/decisions/0021-assessment-criteria-versioning.rst
@@ -0,0 +1,47 @@
21. How should versioning be handled for CBE assessment criteria?
=================================================================

Context
-------
Course Authors and/or Platform Administrators will enter the assessment criteria rules in Studio that learners must meet in order to demonstrate competencies. Depending on the institution, these Course Authors or Platform Administrators may hold a variety of job titles, including Instructional Designer, Curriculum Designer, Instructor, LMS Administrator, Faculty, or other Staff.

Typically, only one person is responsible for entering assessment criteria rules in Studio for each course, though this person may change over time. However, entire programs could have many different Course Authors or Platform Administrators with this responsibility.

Typically, institutions and instructional designers do not change the mastery requirements (assessment criteria) for their competencies frequently. However, historical audit logging of changes within Studio can be a valuable feature for those who have mistakenly made changes and want to revert, or for those who want to experiment with new approaches.

Currently, Open edX always displays the latest edited version of content in the Studio UI and always shows the latest published version of content in the LMS UI, despite having more robust version tracking on the backend (Publishable Entities). Publishable Entities for Libraries is currently inefficient for large nested structures because all children are copied any time an update is made to a parent.

Authoring data (criteria definitions) and runtime learner data (status) have different governance needs: the former is long-lived and typically non-PII, while the latter is user-specific, can be large (learners × criteria/competencies × time), and may require stricter retention and access controls. These differing lifecycles can make deep coupling of authoring and runtime data harder to manage at scale. Performance is also a consideration: computing or resolving versioned criteria for large courses could add overhead in Studio authoring screens or LMS views.

Decision
--------
Defer assessment criteria versioning for the initial implementation. Store only the latest authored criteria and expose the latest published state in the LMS, consistent with current Studio/LMS behavior. This keeps the initial implementation lightweight and avoids the publishable framework's known inefficiencies for large nested structures. The tradeoff is that there is no built-in rollback or audit history; adding versioning later will require data migration and careful choices about draft vs. published defaults.

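To make the decision concrete, the following minimal sketch shows what "latest only" means in practice. The model import path and helper function are hypothetical names used for illustration.

.. code-block:: python

    # Illustrative only: latest-only storage, no version rows.
    from decimal import Decimal

    # Hypothetical model from this feature; the path is an assumption.
    from openedx_learning.apps.assessment_criteria.models import AssessmentCriterion


    def update_threshold(criterion_id: int, new_threshold: Decimal) -> None:
        """Edit criteria in place; prior values are not retained.

        There is no draft/published pair and no immutable version row to
        roll back to, which is the accepted tradeoff of deferring versioning.
        """
        criterion = AssessmentCriterion.objects.get(pk=criterion_id)
        criterion.threshold = new_threshold
        criterion.save(update_fields=["threshold"])
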
Rejected Alternatives
---------------------

1. Each model carries version, status, and audit fields (see the sketch after this list)

   - Pros:

     - Simple and familiar pattern (version + status + created/updated metadata).
     - Straightforward queries for the current published state.
     - Can support rollback by marking an earlier version as published.
     - Stable identifiers (``original_id`` values) can anchor versions and ease potential future migrations.

   - Cons:

     - Requires custom conventions for versioning across related tables and nested groups.
     - Lacks the shared draft/publish APIs and immutable version objects that other authoring apps can reuse.
     - Not necessarily consistent with existing patterns in the codebase (though these are already not overly consistent).

2. Publishable framework in openedx-learning

   - Pros:

     - First-class draft/published semantics with immutable historical versions.
     - Consistent APIs and patterns shared across other authoring apps.

   - Cons:

     - Inefficient for large nested structures because all children are copied for each new parent version.
     - Requires modeling criteria/groups as publishable entities and wiring Studio/LMS workflows to versioning APIs.
     - Adds schema and migration complexity for a feature that does not yet require full versioning.

3. Append-only audit log table (event history)

   - Pros:

     - Lightweight way to capture who changed what and when.
     - Enables basic rollback by replaying or reversing events.

   - Cons:

     - Requires custom tooling to reconstruct past versions.
     - Does not align with existing publishable versioning patterns.

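For reference, the hand-rolled pattern described in alternative 1 might look like the following minimal sketch. The model and field names are illustrative assumptions, not a proposed schema.

.. code-block:: python

    # Illustrative only: the "version + status + audit fields" pattern
    # rejected in alternative 1.
    from django.conf import settings
    from django.db import models


    class AssessmentCriterionVersion(models.Model):
        """Each edit creates a new row; one row per original_id is published."""

        DRAFT, PUBLISHED, RETIRED = "draft", "published", "retired"

        original_id = models.IntegerField()  # stable identifier across versions
        version_num = models.PositiveIntegerField()
        status = models.CharField(
            max_length=16,
            choices=[(DRAFT, DRAFT), (PUBLISHED, PUBLISHED), (RETIRED, RETIRED)],
            default=DRAFT,
        )
        threshold = models.DecimalField(max_digits=5, decimal_places=2)
        created_at = models.DateTimeField(auto_now_add=True)
        created_by = models.ForeignKey(
            settings.AUTH_USER_MODEL, null=True, on_delete=models.SET_NULL
        )

        class Meta:
            unique_together = [("original_id", "version_num")]

Rollback under this pattern is simply re-marking an earlier row as published; the cons above come from having to repeat this convention across every related table and nested group.
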
From 236ef76709c5dc1d7de399dff6d831b57be4b29a Mon Sep 17 00:00:00 2001
From: Mary Gwozdz
Date: Wed, 4 Feb 2026 13:13:39 -0700
Subject: [PATCH 4/5] docs: Increment ADR number

---
 ...a-versioning.rst => 0023-assessment-criteria-versioning.rst} | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
 rename docs/decisions/{0021-assessment-criteria-versioning.rst => 0023-assessment-criteria-versioning.rst} (98%)

diff --git a/docs/decisions/0021-assessment-criteria-versioning.rst b/docs/decisions/0023-assessment-criteria-versioning.rst
similarity index 98%
rename from docs/decisions/0021-assessment-criteria-versioning.rst
rename to docs/decisions/0023-assessment-criteria-versioning.rst
index 8cba4bd3b..a7491edc0 100644
--- a/docs/decisions/0021-assessment-criteria-versioning.rst
+++ b/docs/decisions/0023-assessment-criteria-versioning.rst
@@ -1,4 +1,4 @@
-21. How should versioning be handled for CBE assessment criteria?
+23. How should versioning be handled for CBE assessment criteria?
 =================================================================
 
 Context

From 1100d8aa0021a7e7f0d42550ea4df0c47ec538e3 Mon Sep 17 00:00:00 2001
From: Mary Gwozdz
Date: Wed, 4 Feb 2026 13:21:52 -0700
Subject: [PATCH 5/5] docs: remove location adr from versioning adr branch

---
 .../0021-assessment-criteria-location.rst | 61 ------------------
 1 file changed, 61 deletions(-)
 delete mode 100644 docs/decisions/0021-assessment-criteria-location.rst

diff --git a/docs/decisions/0021-assessment-criteria-location.rst b/docs/decisions/0021-assessment-criteria-location.rst
deleted file mode 100644
index 66c043c46..000000000
--- a/docs/decisions/0021-assessment-criteria-location.rst
+++ /dev/null
@@ -1,61 +0,0 @@
21. Where in the codebase should CBE assessment criteria go?
============================================================

Context
-------
Competency Based Education (CBE) requires that the LMS be able to track learners' mastery of competencies through assessment criteria. For example, in order to demonstrate that I have mastered the Multiplication competency, I need to have earned 75% or higher on Assignment 1 or Assignment 2. Together, the competency, the threshold, the assignments, and the logical OR operator make up the assessment criteria for the competency. Course Authors and Platform Administrators need a way to set up these associations in Studio so that outcomes can be calculated as learners complete their materials. This is an important prerequisite for displaying competency progress dashboards to learners and staff, and for making Open edX the platform of choice for institutions using the CBE model.

Decisions
---------
CBE Assessment Criteria, Student Assessment Criteria Status, and Student Competency Status values should go in the openedx-learning repository. This aligns with the broader architectural goal of refactoring as much code as possible out of the edx-platform repository and into openedx-learning, where it can be designed to be easy for plugin developers to use.

More specifically, all code related to adding Assessment Criteria to Open edX will live in ``openedx-learning/openedx_learning/apps/assessment_criteria``.
The exception is a small app in edx-platform that receives grading signals, invokes the openedx-learning evaluation logic to perform calculations, and persists the results in openedx-learning. This is a pragmatic integration until grading events move out of edx-platform and into openedx-events; keeping grading signal access in edx-platform for now is acknowledged technical debt.

This keeps a single cohesive Django app for authoring the criteria and for storing learner status derived from those criteria, which reduces cross-app dependencies and simplifies migrations and APIs. It also keeps Open edX-specific models (users, course identifiers, LMS/Studio workflows) out of the standalone ``openedx_tagging`` package and avoids forcing the authoring app to depend on learner runtime data. The tradeoff is that authoring and runtime concerns live in the same app; if learner status needs to scale differently or be owned separately in the future, a split into a dedicated status app can be revisited. Alternatives that externalize runtime status to analytics/services or split repos introduce operational and coordination overhead that is not justified at this stage.

Rejected Alternatives
---------------------

1. edx-platform repository

   - Pros: This is where all data currently associated with students is stored, so it would match the existing pattern and reduce integration work for the LMS.
   - Cons: The intention is to move core learning concepts out of edx-platform (see `0001-purpose-of-this-repo.rst <0001-purpose-of-this-repo.rst>`_), and keeping it there makes reuse and pluggability harder.

2. All code related to adding Assessment Criteria to Open edX goes in ``openedx-learning/openedx_learning/apps/authoring/assessment_criteria``

   - Pros:

     - Tagging and assessment criteria are part of content authoring workflows, as is all of the other code in this directory.
     - All other elements using the Publishable Framework are in this directory.

   - Cons:

     - We want each package of code to be independent, and this would separate assessment criteria from the tags that they depend on.
     - Assessment criteria also include learner status and runtime evaluation, which do not fit cleanly in the authoring app.
     - The learner status models in this feature would have a ForeignKey to ``settings.AUTH_USER_MODEL``, which is a runtime/learner concern. If those models lived under the authoring app, the authoring app would have to import and depend on the user model, forcing an authoring-only package to carry learner/runtime dependencies and creating unwanted coupling.

3. New Assessment Criteria Content tables will go in ``openedx-learning/openedx_learning/openedx_tagging/core/assessment_criteria``; new Student Status tables will go in ``openedx-learning/student_status``

   - Pros:

     - Keeps assessment criteria in the same package as the tags that they depend on.

   - Cons:

     - ``openedx_tagging`` is intended to be a standalone library without Open edX-specific dependencies (see `0007-tagging-app.rst <0007-tagging-app.rst>`_); placing assessment criteria there would violate that boundary.
     - Splitting Assessment Criteria and Student Statuses into two apps would require cross-app foreign keys (e.g., status rows pointing at criteria/tag rows in another app), careful migration ordering and dependency declarations so that tables are created in the right order, and shared business logic or APIs for computing/updating status that must live in one app while referencing models in the other.

4. Split assessment criteria and learner statuses into two apps inside ``openedx-learning/openedx_learning/apps`` (e.g., ``assessment_criteria`` and ``learner_status``)

   - Pros:

     - Clear separation between authoring configuration and computed learner state.
     - Could allow different storage or scaling strategies for status data.

   - Cons:

     - Still introduces cross-app dependency and coordination for a single feature set.
     - May be premature for the POC; adds overhead without proven need.

5. Store learner status in a separate service

   - Pros:

     - Scales independently and avoids write-heavy tables in the core app database.
     - Could potentially reuse existing infrastructure for grades.

   - Cons:

     - Introduces eventual consistency and more integration complexity for LMS/Studio views.
     - Requires additional infrastructure and operational ownership.

6. Split authoring and runtime into separate repos/packages

   - Pros:

     - Clear ownership boundaries and independent release cycles.

   - Cons:

     - Adds packaging and versioning overhead for a tightly coupled domain.
     - Increases coordination cost for migrations and API changes.

7. Migrate grading signals to openedx-events now and have openedx-learning consume events directly

   - Pros:

     - Aligns with the long-term direction of moving events out of edx-platform.
     - Avoids a shim app in edx-platform and reduces tech debt.

   - Cons:

     - Requires cross-repo coordination and work beyond the current scope.
     - Depends on changes to openedx-events that are not yet scheduled or ready.