Organizations will need to map their CBE curriculum before beginning to use additional CBE tools. Typically central administrative staff will create the required curriculum framework(s). Following that, curriculum can be uploaded at the organization or course level, depending on the needs of an organization.
Due to some legacy code, organizations will have to run some auto-setup tools and populate certain tag sets before they can build their own frameworks.
In the context of Royal College programs, after your CBME migrations are complete, any new program onboarding to CBE will have to map Entrustable Professional Activities, Key and Enabling Competencies, Milestones, Contextual Variable Responses, and Procedure Attributes (if using the procedure form template).
The CBE tools allow an organization to build a curriculum framework to define the structure of their curriculum (e.g., exit competencies, phase objectives, course objectives), and leverage that framework to build objectives trees that house the specific curriculum.
A Curriculum Framework defines a list of tag sets and sets relationships between those tag sets. Elentra will look for that information when an objective tree is built. Objective trees consist of the curriculum tags that users will work with in Elentra.
Objective Trees are built based on the structure defined in a Curriculum Framework. They define the hierarchical relationships between all of the curricular objectives (tags) included.
A Curriculum Framework may be used to generate multiple Objective Trees and Objective Trees are versioned each time there is a change to them.
There are three types of objective trees:
Organization Trees - Use an Organization Tree when multiple courses share identical curriculum. You will be able to apply an organization tree to courses when you set them up. You create an Organization Tree from Admin > Manage Curriculum > Curriculum Framework Builder. It can then be leveraged from Admin > Manage Courses > CBME > Configure CBME.
Building an organisation tree is most appropriate when a single curriculum is applied across multiple courses.
Course Trees - A Course Tree is required for every course, however there are several ways you can create a Course Tree. You can upload curriculum tags per course (appropriate when courses/programs have unique curriculum) or copy an existing Organisation Tree to multiple courses (appropriate when multiple courses share the same curriculum).
User Trees - Elentra creates unique user trees for learners enrolled in courses that have course trees. Updating a user tree is what allows learners to have different versions of organisation trees applied to them as the curriculum changes. In a typical setup, administrators don’t have to modify user trees that often and there is no initial work to build a user tree. A user tree is automatically created for a user when they are enrolled in a course.
After a curriculum framework, organisation trees and course trees are built, you can use the existing competency-based assessment tools like form templates, assessment plans, learner dashboards and program dashboards.
Before you begin to build a curriculum framework, you should create any dashboard rating scales that will be used with a tag set in your framework. Dashboard rating scales allow faculty and learners to track overall progress on specific curriculum tags shown on the dashboard. The dashboard rating scale can be different from the milestone or global assessment scales.
If you are a Canadian Postgraduate organization, many of the prerequisites for using CBE are already in place. If the CBE module is new to your organization, there will be some developer tasks required before you begin to explore CBE.
Enable CBE for the relevant organization (database setting: cbme_enabled)
Disable CBE for all courses that don’t need it (make entries per course for cbme_enabled on the course_settings table)
Add form types to the organization
Almost all organizations will require the rubric form type to be active to use CBE
You can optionally add other form types (e.g. Supervisor Form Template, Smart Tag Form)
Configure workflows for the organization (at a minimum you will likely need rubric forms available to the EPA or Other Assessment workflow)
Create Assessment & Evaluation rating scales as needed (this can be done by a medtech:admin user)
If a rating scale will be used for global entrustment or milestone ratings, additional developer work can allow users to have automated question prompts based on the scale selected
The following are required only if you plan to version your curriculum and automatically apply new versions to learners when they transition to a new stage within your curriculum framework:
Configure learner levels as required by your organization (e.g., PGY 1, PGY 2). Store these in the global_lu_learner_levels table.
Enable the enhanced enrolment tab to allow administrative staff to enter learner level and CBE status information (database setting: learner_levels_enabled)
NOTE: Some schools choose to just store this information in the database and not have administrative staff input it (they are presumably getting the data from another system).
Set learners up to land on their My Event Calendar tab on the dashboard instead of their CBE Progress Learner Dashboard. There is no database setting for this; a developer just has to make code changes to support it if that is an organization's preference.
Enable learner self-assessment options for curriculum tags displayed on the learner dashboard (database setting: cbe_dashboard_allow_learner_self_assessment and/or learner_access_to_status_history)
Enable visual summary dashboards (database setting: cbme_enable_visual_summary)
Provides access to additional reporting on EPA assessments
This step is optional based on your curriculum structure and needs. An Organization Tree makes the most sense to use when multiple courses share identical curriculum. You will be able to apply an organisation tree to multiple courses when you set them up and build their course trees.
Click on the framework title in the Curriculum Builder table for which you want to create or edit an organization tree.
Click the Organization Tree tab.
On the Create New Organization Tree page, input the Title name.
Click the Save and Proceed to Uploader button.
At this point you'll be prompted to upload the required files to populate your tree.
For each tag set select Default Upload or Upload referencing single parent code.
Click Choose a file or drag and drop to add a .csv file.
Once the .csv file is selected, the upload area will display the file name.
Click the Save and upload tag set name button. A confirmation highlighted in green will display when the file is uploaded successfully.
Scroll down to click the Next Step button.
Continue until all tag sets are populated.
On the Organization Tree tab, click the trash icon associated with the tree you want to delete.
On the Delete an Organization Tree page, click the Delete this tree button.
You will receive a notification that the tree was deleted, and the page will return to the Organization Trees table.
If you plan to use curriculum versioning and have residents automatically move to a new curriculum version when they move to their next stage, you must define the learners as CBME enabled. A developer can do this directly in the database, or you can use the database setting (learner_levels_enabled) to allow course administrators to set learner levels and CBE status on the course enrolment page.
To ensure CBME works correctly for RC programs, each organisation that has CBME enabled must have a developer or technical administrator set the default_stage_objective setting value in the elentra_me.settings table to the global_lu_objectives.objective_id of the first stage of competence (e.g., Transition to Discipline).
After the provided software migrations run with the upgrade to ME 1.24 or higher, some additional configuration is required for your users to move forward using Elentra as they currently do. Please check all tag set settings for the tags included in your Royal College Framework (Milestones). This is important to ensure that the learner dashboard, program dashboard and assessment plan builder behave as expected.
Before you build a Curriculum Framework, please make sure that the database setting cbme_enabled is enabled.
A Curriculum Framework defines tag sets and the relationships between those tag sets.
To use the CBE dashboards, you must create a curriculum framework and define the tag set options you'd like to use for each tag set included in the framework. The tag set options will determine things like whether a tag set appears on the assessment trigger interface or whether a tag set is shown on a learner dashboard.
Once a Curriculum Framework is created, you can optionally create an organization tree or multiple course trees.
Creating an organisational tree is optional and is required only when you have multiple courses that will rely on the same curriculum. Otherwise, once you have built the required framework(s), you can work within a course page to build course trees.
If you plan to use the CBE learner dashboard, you should set up dashboard rating scales before building your curriculum framework and tag sets so that you can apply the dashboard scales as needed on your tag sets.
Log in as a MedTech or Admin user.
Open the Admin drop-down on the top menu and click Manage Curriculum.
Click Curriculum Framework Builder in the left-side menu.
Click the Add Framework button.
In the Adding New Framework box, input the Title of the framework and click the Add Framework button.
The new framework will appear in the Curriculum Frameworks table.
Rename a framework as needed by clicking the edit icon on the right side of the Curriculum Frameworks table.
Once a Curriculum Framework exists, you need to define the tag sets to include in that framework. You can create new tag sets or add existing tag sets. If you add existing tag sets, you'll still need to upload objectives for those tag sets when you build an objective tree, however you'll be able to manually map tags uploaded through an objective tree to other tags stored in Elentra.
From Admin > Manage Curriculum > Curriculum Frameworks, click on the framework to which you want to add tag sets.
On the Tag Sets page, click the Create Tag Set button.
On the Adding Tag Set page, under the Details section, input the Code, Name, and Shortname. These fields are required.
Scroll down to select which course(s) the tag set will be associated with.
Scroll down to select from the options under Framework Specific Options, if applicable. See Curriculum Framework Tag Set Options for more details.
Scroll down to select from the options under Form Options, if applicable. See Curriculum Framework Tag Set Options for more details.
Scroll down to select from the options under Advanced Options, if applicable. See Curriculum Framework Tag Set Options for more details.
Click the Save button to add the tag set.
Note that even if you add an existing tag set to a curriculum framework, Elentra will still require you to upload objectives to build an objective tree.
Click on the framework title in the Curriculum Builder table to which you want to add tag sets.
On the Tag Sets page, click the Use Existing Tag Set button.
In the Use an Existing Objective Set box, use the Existing Objective Set drop-down to select a tag set.
Click the Import button.
On the Edit Tag Set page, under the Details section, ensure the correct Code, Name, and Shortname appear.
Scroll down to ensure the proper course(s) are associated with the tag set.
Scroll down to ensure the necessary tag set options are selected, if applicable:
Click the Save button to add the tag set.
This step is necessary to define the relationships between tags. These relationships will help store information properly when you import curriculum tags to create the map of your curriculum.
Click on the framework title in the Curriculum Builder table for which you want to manage tag set relationships.
Click on the Relationships tab.
Click the Manage Relationships button for the tag set to which you want to add a relationship.
In the Adding Relationships to [insert tag set title] box, use the Relationship Type and Tag Set drop-downs to add the proper relationship.
Repeat the above steps for all necessary tag sets. This will create the framework for your CBE Course (program).
On the Curriculum Builder page, select the tag set on the Tag Set table.
Click the Remove Tag Set button.
On the Remove Objective Sets box, review the instructions:
Click the Remove button to delete.
Although Elentra allows individual programs using a Royal College framework to upload specific Key and Enabling Competencies, organizations are still required to upload a standard list of Key and Enabling Competencies at the organization level.
If you see a warning like the one below, it means your organization still needs to upload standard key and enabling competencies through Manage Curriculum.
For Consortium schools who are exploring CBE in Elentra, although you are never going to use the Key and Enabling Competency tag sets, Elentra still requires you to populate these tag sets with some data.
Download these files to have available for the curriculum tag upload.
Upload Key and Enabling Competency Tags
Navigate to Admin > Manage Curriculum.
Click 'Curriculum Tags' from the Manage Curriculum card on the left sidebar.
Click on a curriculum tag set name (e.g. CanMEDS Key Competencies) and then click 'Import from CSV' in the top right.
Drag and drop or select the appropriate file from your computer and click 'Upload'.
You will get a success message and the curriculum tags you've added will appear on the screen.
Now that you have fulfilled the requirements of the auto-setup tool, you can begin to create your own curriculum framework as needed.
Most organizations will have already completed this step while they were using CBME. If you are testing things or building a new organization for some reason, these files can be used to populate the standard key and enabling competencies. When you build objective trees for individual programs you'll need to prepare KC and EC templates to upload for that program.
Upload Standard Key and Enabling Competency Tags
Navigate to Admin > Manage Curriculum.
Click 'Curriculum Tags' from the Manage Curriculum card on the left sidebar.
Click on a curriculum tag set name (e.g. CanMEDS Key Competencies) and then click 'Import from CSV' in the top right.
Drag and drop or select the appropriate file from your computer and click 'Upload'.
You will get a success message and the curriculum tags you've added will appear on the screen.
Now that you have fulfilled the requirements of the auto-setup tool, you can begin to create your own curriculum framework as needed.
Even if you are not using the Royal College of Physicians and Surgeons curriculum structure, Elentra still has some RC prerequisites in the system. You can upload dummy data to meet the requirement to populate certain curriculum tag sets.
To use Elentra's CBE tools, nine required curriculum tag sets must be in place. An auto-setup tool will create the tag sets.
Navigate to Admin > Manage Courses > Course (new or existing), and click the CBME tab.
If the required curriculum tag sets aren't yet configured, Elentra will prompt users to auto-setup the required curriculum tag sets and will populate the CanMEDS Roles, Royal College Stages, and Contextual Variables tags. You just do this once for an organization.
Users without permission to use the auto-setup feature will be directed to contact a system administrator with the message below.
After building the required tag sets, Elentra will prompt you to populate a standard list of key and enabling competencies. This is also a relic of CBME supporting Royal College programs but schools exploring CBE can upload dummy data to fulfil this requirement. An organization only has to do this once. Go to the next page to learn how to upload information for the standard key and enabling competency tag sets.
Unless you are working with a developer, do not change the tags in the pre-built tag sets; if you do, you will be prompted to run the auto-setup feature again.
Use .csv format to upload objectives
A Course Tree is required for every course, however there are several ways you can create a Course Tree. You can upload curriculum tags per course (appropriate when courses/programs have unique curriculum) or copy an existing Organisation Tree to multiple courses (appropriate when multiple courses share the same curriculum).
Upload objectives within a course under Admin > Manage Courses.
Within a course, select the CBME tab and then the Configure CBME tab to begin Step 1: Configuration. Select Use a curriculum framework as a base and click Next Step.
Select a framework in the drop-down and click Next Step.
Ensure the objective .csv files are formatted as indicated in the highlighted instructions.
For each curriculum tag set,
Select Default upload or Upload referencing a single parent code
Select the appropriate file
Click 'Save and upload (tag set name)'
Choose a file or drag and drop to add a .csv file.
A confirmation highlighted in green will display when the file is uploaded successfully.
Repeat for each curriculum tag set as needed.
If you forget to select the correct Upload radio button (ex. Upload referencing a single parent code), you must reset the upload by going back to Step 1: Configuration in the CBME tab of the course.
You can review the new proposed tree structure with the uploaded objectives added. Click the Publish button to save.
Note: if there are errors in the upload, the Publish button will not appear and the objectives will not be published. You will need to reset the objective upload. See the Notes section in Step 4 of this documentation.
A behind-the-scenes task will run to publish your objective tree and make it visible to you. At most schools, this happens once a day. After sufficient time has passed, you can refresh the Configure CBME tab screen to review and search within the new tree mappings.
Use this path to review CBME data migrated to CBE
Pre-requisite: A developer has already migrated CBME data to CBE.
After migrations have run and you log back into your upgraded Elentra ME 1.24 or higher environment, you can expect to see some automatically generated Curriculum Frameworks.
Since most schools have mapped to Milestones, we'll focus on that framework for demonstrative purposes.
The Royal College Framework (Milestones) framework should look like this:
Where tag set relationships are defined as follows:
You should not need to change anything about this setup.
Please check all tag set settings for the tags included in your Royal College Framework (Milestones). This is important to ensure that the learner dashboard, program dashboard and assessment plan builder behave as expected. (See recommended tag set settings in Step 8 on the next page.)
Use .csv format to upload objectives
In the context of Royal College programs, each program will upload its own curriculum via Admin > Manage Courses/Programs > CBME > Configure CBME.
.csv files prepared for RC program curriculum imports should include the following column headings:
Parent(s)
Code
Name
Description
Detailed Description
While all tag sets should include code and name information, the full parent path is only required on the milestone template. The other curriculum tag sets can be uploaded with no parents at all.
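As a sketch, a file with the headings above can be produced with Python's csv module. The codes and competency text below are hypothetical placeholders, not a real program's curriculum:

```python
import csv

# Column headings required for RC program curriculum imports.
HEADINGS = ["Parent(s)", "Code", "Name", "Description", "Detailed Description"]

# Hypothetical rows: only the milestone template needs the full parent path;
# other tag sets may be uploaded with the Parent(s) field left empty.
rows = [
    {"Parent(s)": "", "Code": "C1",
     "Name": "Sample EPA competency text",
     "Description": "", "Detailed Description": ""},
    {"Parent(s)": "C-C1-ME-ME1-ME1.1", "Code": "C1-M1",
     "Name": "Sample milestone text",
     "Description": "", "Detailed Description": ""},
]

with open("milestones.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=HEADINGS)
    writer.writeheader()
    writer.writerows(rows)
```

The resulting file can be uploaded through Configure CBME as described in the steps below.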
Navigate to the Configure CBME tab. Select Use a curriculum framework as a base and click Next Step.
Select a framework in the drop-down and click Next Step.
Complete the required upload for each tag set.
Reminder: While all tag sets should include code and name information, when using the default upload method, the full parent path is only required on the milestone template. The other curriculum tag sets can be uploaded with no parents at all.
Select an upload method.
For the Stages upload, ensure the Default upload option is selected.
For the Entrustable Professional Activities upload, ensure the Default upload option is selected.
For the Roles upload, ensure the Default upload option is selected.
For the Key Competency upload, ensure the Default upload option is selected.
For the Enabling Competency upload, ensure the Default upload option is selected.
For the Milestones upload, ensure the Default upload option is selected.
Drag and drop a file into the appropriate space. You will see the file name when the file is there.
Click Save and upload and you'll see a green success message.
Notes:
If you forget to select the correct Upload radio button (e.g., Upload referencing a single parent code), you must reset the upload by going back to Step 1: Configuration in the CBME tab of the course or in the Organization Trees section of the Curriculum Framework Builder.
You can review the new proposed tree structure with the uploaded objectives added. Click the Publish button to save.
Note: If there are errors in the upload, the green Publish button will not display and you will be unable to publish. You will need to reset the objective upload. See Notes section in step 3 of this documentation.
After you publish your course tree, a behind-the-scenes task must run for the curriculum to publish. At many schools this happens once a night. After sufficient time has passed, you can refresh the Configure CBME tab screen to view the imported curriculum.
Please reach out to the Consortium Core Team on Slack if you are a Consortium school and are looking for some sample data to test CBE with.
Use .csv format to upload objectives
Depending on your curriculum framework and organization's needs, you may upload your tag sets within the organizational tree (and then leverage that tree in multiple courses), or you may upload your tag sets through individual courses.
The titles of the column headings in the uploaded spreadsheets must be the following:
Note: Make sure you list all parents for a curriculum tag in one row, with commas separating the parents. There should be no duplicates in the Code column; if there are, Elentra will only pick up the first parent relationship for that code/tag and ignore the remaining entries.
The “Parent(s)” field must be specific as to where the objective is to be located in the Curriculum Framework structure:
Example: for objective PTYPE1, the Parent(s) field may be "DOCC1-EPA1-POE1, DOCC1-EPA1-POE2". This will attach objective PTYPE1 to Curriculum Tag POE1 (under EPA1, which is under DOCC1) as well as to POE2 (under EPA1, which is under DOCC1).
Parent(s) field can be empty until used in another upload. For example, objectives at the highest level would be uploaded with the “Parent(s)” field empty.
Sample .csv format:
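The default-upload rules above can be sanity-checked before uploading. This is a hypothetical sketch (the codes are illustrative); the duplicate check mirrors the note that Elentra only honours the first row for a repeated code:

```python
from collections import Counter

# Hypothetical default-upload rows. Multiple parents for one tag go in a
# single row, comma-separated, each given as a full path from the top level.
rows = [
    {"Parent(s)": "", "Code": "DOCC1", "Name": "Domain of Clinical Care 1"},
    {"Parent(s)": "DOCC1", "Code": "EPA1", "Name": "EPA 1"},
    {"Parent(s)": "DOCC1-EPA1-POE1, DOCC1-EPA1-POE2",
     "Code": "PTYPE1", "Name": "Procedure Type 1"},
]

# Flag repeated codes before uploading, since Elentra ignores later rows
# that reuse a code already seen in the file.
duplicates = [code for code, n in Counter(r["Code"] for r in rows).items() if n > 1]
```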
Upload objectives referencing their immediate parent.
Example: when uploading objective PTYPE1, the Parent(s) field need only specify "POE1, POE2". This attaches PTYPE1 wherever Curriculum Tags POE1 and POE2 appear, no matter their parents or position in the hierarchy.
Sample .csv format:
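By contrast, a single-parent-code upload references only the immediate parent's code. Again a hypothetical sketch with illustrative codes:

```python
# Hypothetical single-parent-code row: each parent is referenced by its
# immediate code only, so PTYPE1 attaches wherever POE1 or POE2 appears,
# regardless of where those tags sit in the hierarchy.
rows = [
    {"Parent(s)": "POE1, POE2", "Code": "PTYPE1", "Name": "Procedure Type 1"},
]

# Splitting the comma-separated field yields the individual parent codes.
parent_codes = [p.strip() for p in rows[0]["Parent(s)"].split(",")]
```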
The Core Team recorded a demo session with a complete walk-through of the creation and versioning of Objective Trees:
Given Curriculum Framework with structure:
DOCC - Domains of Clinical Care
EPA - Entrustable Professional Activity
POE - Phase of Encounter
PTYPE - Procedure Type
P - Procedure
SD/FMR - Skill Dimensions & FM CanMEDs Role
PT - Priority Topic
P&P - Progress & Promotion
When selecting to create a new Objective Tree Version and choosing to Upload, a page will be displayed to allow the user to upload objectives for each of the above Curriculum Tag Sets in the hierarchy, in order of hierarchy:
Domains of Clinical Care
Progress & Promotion
Entrustable Professional Activity
Phase of Encounter
Skill Dimensions & FM CanMEDs Role
Priority Topic
Procedure Type
Procedure
Note: For each of the above, the user can upload a CSV of objectives and select either Default upload or Upload referencing a single parent code.
For Curriculum Tag Sets with no parent in the Curriculum Framework structure, the selected method does not matter, since no Parent is specified for the Objectives.
For example, the following can be used to set objectives for Curriculum Tag Domains of Clinical Care, no matter the upload method since no Parent is specified.
In addition, for Curriculum Tag Sets that appear in the second level (directly below the top) of the Curriculum Framework, the upload method also does not matter, since each parent is identified by a single code:
In the above example, objective EPA2 has two parents, DOCC2 and DOCC3.
For Curriculum Tag Sets that appear in lower levels than the top in the Curriculum Framework, the selected upload method will matter.
Default Upload Method Example
The following is an example of how POE objectives can be uploaded using the Default Method, under EPAs:
The above will set objective POE1 under DOCC1-EPA1 in the hierarchy, for example. Also note the difference between POE4 and POE5.
POE4 is assigned to EPA2, but only EPA2 under DOCC2 in the hierarchy, whereas POE5 is assigned to EPA2 under DOCC3:
Single Parent Code Upload Example
The following is an example of how POE objectives can be uploaded using the Single Parent Code method, under EPAs:
In the above, EPA2, no matter where it appears in the hierarchy, is assigned the objectives POE2, POE3, POE4 and POE5. Therefore, both DOCC2-EPA2 and DOCC3-EPA2 are assigned those objectives.
Subsequent levels would be assigned objectives in a similar way.
This template organizes the information about EPAs and allows the information to be uploaded to Elentra.
Each program should complete an EPA template with the following information:
Parent(s): This can be left blank because you will provide the full parent path in the Milestones Template. If you prefer to complete this column, you can indicate the parent of the EPA which is the Stage (i.e., D, F, C, P)
Code: Indicate the EPA code (e.g., C1, C2, C3). Note that there is no space between the letter and number in the EPA Code.
EPA codes should be recorded in uppercase letters and include the learner stage of training and the EPA number. Use the learner stage letters outlined below to ensure that the EPAs correctly map to other curriculum tags. D: Transition to Discipline F: Foundations of Discipline C: Core Discipline P: Transition to Practice
Name: Provide the competency text
Description: Provide additional detail about the EPA as required.
Detailed Description: Not every program will use the ‘Detailed Description’ column.
Save your file as a CSV.
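The fields above can be sketched as a small script that writes an EPA template file. The EPA code and competency text are hypothetical:

```python
import csv

FIELDS = ["Parent(s)", "Code", "Name", "Description", "Detailed Description"]

# Hypothetical EPA row: Parent(s) is left blank (the full parent path is
# provided in the Milestones template); the code is the uppercase stage
# letter plus the EPA number, with no space between them.
epa_rows = [
    {"Parent(s)": "", "Code": "C1",
     "Name": "Sample competency text for a Core Discipline EPA",
     "Description": "", "Detailed Description": ""},
]

with open("epa_template.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(epa_rows)
```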
The tools to support competency-based education in Elentra were significantly updated in ME 1.24. Whereas users were previously restricted to a rigid curriculum structure built to fit the requirements of Canadian postgraduate medical programs, the new Curriculum Framework Builder allows organizations to build a curriculum structure that suits their needs. Once this curriculum structure is populated with curriculum objectives, organizations can take advantage of the learner and program dashboards that were built to accompany CBME.
You can watch recordings about CBE at (login required).
The CB(M)E module in Elentra is optional and is controlled through a database setting (setting: cbme_enabled).
With CBE enabled, an organization will be able to:
create one or more curriculum frameworks and populate organization and course trees (i.e., map a curriculum that includes objectives at multiple levels on multiple paths)
use supervisor, procedure, smart tag and field note form templates to create assessment forms users can initiate on demand (requires some developer assistance to configure workflows),
create rubric and periodic performance assessment (PPA) forms that are linked to the curriculum (can be used on-demand or sent via distributions),
monitor learner progress on a configurable learner dashboard (i.e., determine which curriculum tag sets display on dashboards),
create assessment plans to set assessment requirements for individual curriculum tags,
monitor learner progress towards assessment plans through a program dashboard, and
monitor learner performance and EPA task completion rates through optional visual summary dashboards.
In addition to these core tasks, users of CBE tools may opt to:
map CBE curriculum tags to specific rotations (requires that a program is using the Clinical Experience Rotation Schedule)
assign faculty members as competence committee members and academic advisors to provide them access to view specific learners,
and log meeting information for each learner.
Be aware that if you enable CBE in an organization, learners will land on their CBE dashboard when they log into Elentra. They still have access to the My Event Calendar but will have to click to view the calendar.
If your organization prefers that learners still land on their My Event Calendar tab when logging in, you'll need a developer to make that change for you.
Faculty and staff users with access to My Learners will also experience a difference because by default they will see a Program Dashboard view when accessing My Learners. If a CBE course does not have a CBE assessment plan set up, most of the Program Dashboard will be empty, however faculty and staff can still access a learner's assessments and logbook from their name card.
This template organizes the information about Stages and allows the information to be uploaded to Elentra.
Every program can use the same Stages template when they upload their program curriculum.
Parent(s): This column can be left blank.
Code: Provide the code (i.e., D, F, C, P).
Name: Provide the full stage name text
D: Transition to Discipline F: Foundations of Discipline C: Core Discipline P: Transition to Practice
Description: Not required.
Detailed Description: Not required.
Save your file as a CSV.
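Since every program can reuse the same four stages, the template can be generated once. A minimal sketch using the stage codes and names listed above:

```python
import csv

# The four Royal College stage codes and names; Parent(s), Description and
# Detailed Description stay empty per the field notes above.
STAGES = [
    ("D", "Transition to Discipline"),
    ("F", "Foundations of Discipline"),
    ("C", "Core Discipline"),
    ("P", "Transition to Practice"),
]

with open("stages.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Parent(s)", "Code", "Name", "Description", "Detailed Description"])
    for code, name in STAGES:
        writer.writerow(["", code, name, "", ""])
```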
When working with tag sets in the context of the Curriculum Framework Builder, an administrator must configure the settings for each tag set. These settings tell Elentra which tag sets to show on the learner and program dashboards, which tag sets should be accessible as filters and in the Assessment Plan, etc.
Select this to have a tag set display on the learner dashboard.
When this option is selected, a dashboard scale can also be selected. The dashboard scale will display and allow progress reviewers with permission to set a status for the objective (when clicking the status circle). If a user doesn't have permission to set the status for an objective, they will only be able to see the status.
In this example, the tag set "Direct Observations" was set to show in dashboard.
This tag set option will display the objectives under their parents as categories on the dashboard.
"EPAs" are displayed under their parent, "Stages", on the Learner Dashboard
This option will add the tag set as a filter on the learner dashboard.
This option should be selected if the tags will be used when triggering on-demand assessments.
This option allows the tag set to be mapped to clinical experiences (i.e., rotations).
This option should be selected if the tag set is to be used in the assessment plan builder.
This option will allow the tag set to be displayed on the program dashboard.
This option will group children of this tag set on the program dashboard.
This option will use the tag set as a filter on the program dashboard.
This option does not currently impact the user interface in any way.
The intention is to allow users to edit curriculum tags at the Course level to accommodate unique course/program needs. This has not yet been implemented.
This option will add this tag set as a smart tag item on the smart tag form templates. Smart tag items will display the mapping of the triggered branch after the form has been generated.
This option will allow this tag set to be used on CBE form templates (Supervisor Form, Procedure Form, Field Note Form).
This option will add this tag set as a mapped objective on Supervisor form templates. You would primarily use this option when you have children of a parent and you want to add the parent tag to a Supervisor form and have the child tags automatically pulled into the form for use.
This option identifies the tag set as a primary competency, such as an EPA; wherever Elentra expects to show EPA-related information, it allows this objective set to be used in its place. It is also important to select this option on the child of a primary objective (e.g., Parent: Stages, Child: EPAs).
You will want to select this to ensure all children and grandchildren of this tag set/objective set will be displayed in the objective tree mapping for your CBE course.
This option sets the tag set as a filter when initiating tasks on-demand.
If the objective is a primary competency, this filter will display the set of objectives the learner is currently in progress on (e.g., display EPAs by the current stage). When this option is selected, name the filter.
This option may be used if your curriculum uses stages. For example, your learners must complete a foundations stage before they move into a core stage.
This option determines that ordered objectives must start at the depth of this tag set, which is relevant when multiple types appear at the same depth. The tree in the example below shows the framework, not actual branches of objectives. If the order for B is set to 2, the order for C is set to 1, and D is left unset, then the relevant interfaces (currently, only Smart Tag Triggering) would render them as: C, B, D
Use Case
Use this when two objectives live on the same level, both are triggerable, and you would like to specify the order in which they should be displayed. Relevant to smart tag form triggering.
Clicking this option allows you to enter the order of the objective codes on each tag in this tag set, in the order that the learner will complete it. In the case of the Stages tag set, enter D,F,C,P. This indicates that the learner would complete the D, F, C, P objective codes in this tag set, in this order.
This applies to curriculum versioning.
Go to Admin > Manage Curriculum.
Click Curriculum Framework Builder in the left sidebar.
Click Add Framework on the right.
Provide a title and click Add Framework.
The framework will be created and you can scroll to the bottom of the list of frameworks to find it.
Click on the framework name to open it and configure its tag sets.
Because the CBME auto-setup tool has run, all tag sets required for an RC program should already exist. Click Use Existing Tag Set, then select the appropriate existing objective set and click Import. Build your curriculum framework as follows:
Next, from the Tag Sets tab, click on each tag set to configure its tag set options.
For Royal College Stages:
Set as applicable to All Courses
Framework Specific Options
Allow tag set to be used as a filter on the dashboard
Group Program Dashboard progress by this tag
Allow tag set to be used as a filter on Program Dashboard
Form Options
None
Advanced Options
This objective set is a container of its children
This objective set uses objective ordinality (set as D,F,C,P)
For Entrustable Professional Activities:
Set as applicable to All Courses
Framework Specific Options
Check all EXCEPT 'Group Program Dashboard progress by this tag'
Form Options
Check 'Allow tag set to be added to form templates'
Advanced Options
Check 'This objective set is a primary competency'
Check 'Allow tag set to be used as a filter in the triggering interface'
Set Label as "Current Stage EPAs"
For Roles:
Set as applicable to All Courses
Make no additional selections
For Key Competencies:
Set as applicable to All Courses
For Enabling Competencies:
Set as applicable to All Courses
For Milestones:
Set as applicable to All Courses
Framework Specific Options
Check 'Allow tag set to be used as a filter on the dashboard'
Form Options
Check 'Make available when any parent tag set is added to a supervisor form template'
Advanced Options
None
After all required tag sets are configured, click the Relationships tab. Click on each tag set to define its relationship. Set the relationships between tag sets as follows:
Create a course in Manage Courses.
Follow the tab flow under the Configure CBME tab in the course under Manage Courses to upload curriculum tags.
| Code | Name |
| --- | --- |
| DOCC1 | Care of Adults |
| DOCC2 | Maternity & Newborn |
| DOCC3 | Children & Adolescents |
| DOCC4 | Care of the Elderly |
| Parent(s) | Code | Name |
| --- | --- | --- |
| DOCC1-EPA1-POE6 | PTYPE1 | Integumentary Procedure |
| DOCC1-EPA2-POE6, DOCC1-EPA3-POE6 | PTYPE2 | Local Anaesthetic Procedures |
| DOCC1-EPA3-POE6, DOCC1-EPA4-POE6 | PTYPE3 | Eye Procedures |
| DOCC1-EPA4-POE6 | PTYPE4 | Ear Procedures |
| DOCC1-EPA2-POE6, DOCC1-EPA3-POE6 | PTYPE5 | Nose and Throat |
| Parent(s) | Code | Name |
| --- | --- | --- |
| DOCC1 | EPA1 | Care of the Adult with a Chronic Condition |
| DOCC1 | EPA2 | Care of the Adult with Minor Episodic Problem |
| DOCC1 | EPA3 | Care of the Adult with Acute Serious Presentation |
| DOCC1 | EPA4 | Care of the Adult with Multiple Medical Problems |
| DOCC1 | EPA5 | Performing a Periodic Health Review of an Adult |
| Parent(s) | Code | Name |
| --- | --- | --- |
| EPA1, EPA2, EPA3, EPA4, EPA5 | POE1 | Hypothesis Generation |
| EPA1, EPA2, EPA3, EPA4, EPA5 | POE2 | History |
| EPA1, EPA2, EPA3, EPA4, EPA5 | POE3 | Physical Examination |
| EPA1, EPA2, EPA3, EPA4, EPA5 | POE4 | Diagnosis |
| EPA1, EPA2, EPA3, EPA4, EPA5 | POE5 | Investigation |
| EPA1, EPA2, EPA3, EPA4, EPA5 | POE6 | Procedural Skills |
| Code | Name |
| --- | --- |
| DOCC1 | Care of Adults |
| DOCC2 | Maternity & Newborn |
| DOCC3 | Children and Adolescents |
| Parent(s) | Code | Name |
| --- | --- | --- |
| DOCC1 | EPA1 | Care of the Adult with Chronic Condition |
| DOCC2, DOCC3 | EPA2 | Care of the Adult with Minor Episodic Problem |
| DOCC3 | EPA3 | Care of the Adult with Acute Serious Presentation |
| DOCC3 | EPA4 | Care of the Adult with Multiple Medical Problems |
| Parent(s) | Code | Name |
| --- | --- | --- |
| DOCC1-EPA1 | POE1 | ... |
| DOCC1-EPA1, DOCC2-EPA2 | POE2 | ... |
| DOCC2-EPA2 | POE3 | ... |
| DOCC2-EPA2 | POE4 | ... |
| DOCC3-EPA2 | POE5 | ... |
| DOCC3-EPA3 | POE6 | ... |
| DOCC3-EPA4 | POE7 | ... |
| DOCC3-EPA4 | POE8 | ... |
| Setting | Description |
| --- | --- |
| `cbme_enabled` | Controls whether an organization has CBME enabled or not. |
| `cbme_standard_kc_ec_objectives` | Controls whether you give courses/programs the option to upload program-specific key and enabling competencies. |
| `allow_course_director_manage_stage` | Enable to allow course directors to manage learner stages (in addition to Competence Committee members). |
| `allow_program_coordinator_manage_stage` | Enable to allow program coordinators to manage learner stages (in addition to Competence Committee members). |
| `cbme_memory_table_publish` | |
| `cbme_enable_tree_aggregates` | |
| `cbme_enable_tree_aggregate_build` | |
| `cbme_enable_tree_caching` | |
| `default_stage_objective` | Must have information entered for CBME to work. Set to the `global_lu_objectives.objective_id` of the first stage of competence (e.g., Transition to Discipline). |
| `cbme_enable_visual_summary` | Enables additional views from the program dashboard. |
| `cbe_versionable_root_default` | Default versionable root for trees. |
| `cbe_tree_publish_values_per_query` | Number of nodes to be inserted in bulk into a tree table. |
| `cbe_smart_tags_items_autoselect` | Determines whether smart tag items should be pre-filled on trigger. |
| `cbe_smart_tags_add_associated_objectives` | Adds the associated objectives item to smart tag forms on trigger. |
| `cbe_smart_tags_items_mandatory` | Determines whether smart tag items are required. |
| `cbe_smart_tags_items_assessor_all_objectives_selectable` | Determines whether assessors can view all smart tag objectives in the learner tree, rather than only those applied via the Smart Tag form trigger process. |
| `cbe_smart_tags_items_target_all_objectives_selectable` | Determines whether targets can also see all selectable objectives, not just the ones applied via the Smart Tag form trigger process. |
| `cbe_smart_tags_items_all_objectives_selectable_override` | Overrides the target and assessor `*_all_objectives_selectable` settings, if set. |
| `cbe_dashboard_allow_learner_self_assessment` | Determines whether learners can provide a self-assessment on dashboard objectives. |
| `learner_access_to_status_history` | Determines whether learners can see their status history (i.e., the "epastatushistory" page where rating scale updates and advisor comments/files are displayed). |
All templates you upload to the system must be in .csv format.
If you are working in Excel or another spreadsheet manager, use the "Save As" function to select and save your file as a .csv.
To upload your files, either search for or drag and drop the file into place. You'll often see a green checkmark when your file is successfully uploaded.
If you receive an error message make sure that you have:
Deleted any unused lines. This is especially relevant in templates with pre-populated columns including the contextual variables template, and enabling competencies mapping template.
Completed all required columns. If a column is empty for a specific line the file may fail to upload.
If you've imported all CBME data for a program but you are not able to see EPA maps in the EPA Encyclopedia or in the Configure CBME tab, double check that you've uploaded your files without spaces between the EPA Code letter and number, nor between the CanMEDS Role and Key Competency number. Having spaces in the incorrect places will prevent the system from retrieving the required information to produce maps.
While you can correct minor typos and add additional mapping information through the user interface, you cannot delete or undo mapping between tag sets.
The option to reset all CBME data for a program was removed in a previous Elentra version.
The process to build new assessment tools after curriculum versioning is relatively straightforward. Administrators need to build new assessment tools for new EPAs or EPAs that have changed. Assessment tools will be carried over for all EPAs that were marked as "not changing."
Navigate to Admin > Assessment and Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form name and select the form type from the dropdown menu. Select the appropriate form template and course (e.g., ‘Supervisor Form’, Pediatrics).
Now that you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version.
If you want to build new forms for learners using a previous version, change the EPA Version to what you want, click 'Save', and Elentra will load the appropriate EPAs.
Complete the sections of the form builder. This is the same as any previous form building but see details below as needed.
Publish Form Template
Click 'Publish' to make your template available for use. The forms will be available within the hour.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed.
Once a form template has been published, you can rearrange the template components for each form; however, you cannot make changes to the scales or contextual variables. To make these changes, copy the form template and create a new version.
Within a given form, you can only tag curricular objectives (e.g., EPAs or milestones) from the same curriculum version. To ensure that you do not accidentally add an EPA from a different version, you must create the form first and then "Create & Attach" new items to the form.
Click Admin > Assessment & Evaluation.
Click on Forms from the subtab menu
Click Add Form.
Provide a form name and select PPA Form or Rubric/Flex Form from the Form Type dropdown menu; then click Add Form.
Now that you have two (or more) curriculum versions in the system, the Form Editor will default to loading the most recent version. Under "EPA Version", simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, simply change the EPA Version to Version 1 and it will load the appropriate EPAs.
To use the "Programs" filter in the form bank, you need to add Program-level permissions to each form, so it is recommended that you do so.
Click "Individual", change to "Program"
Begin typing in your program name in the "Permissions" box.
Click on your program to add it.
Complete the sections of the form builder. This is the same as any previous form building but see details below as needed.
Publish Form Template
Click 'Publish' to make your template available for use. The forms will be available within the hour.
The default Feedback and Concerns sections will be added when the form is published.
Rubrics are assessment tools that describe levels of performance in terms of increasing complexity with behaviourally anchored scales. In effect, performance standards are embedded in the assessment form to support assessors in interpreting their observations of learner performance.
If you create a rubric form and at least one item on the form is linked to an EPA the form will be triggerable by faculty and learners once published. Results of a completed rubric form are included on a learner's CBME dashboard information.
Within a given form, you can only tag curricular objectives (e.g., EPAs or milestones) from the same curriculum version. To ensure that you do not accidentally add an EPA from a different version, we recommend you create the form first and then "Create & Attach" new items to the form.
You need to be a staff:admin, staff:prcoordinator, or faculty:director to access Admin > Assessment & Evaluation to create a form.
Click Admin > Assessment & Evaluation.
Click on Forms from the subtab menu.
Click Add Form.
Provide a form name and select Rubric Form from the Form Type dropdown menu; then click Add Form.
Form Title: Form titles are visible to end-users (learners, assessors, etc.) when being initiated on-demand. Use something easily distinguishable from other forms.
Form Description: The form description can be used to store information for administrative purposes, but it is not seen by users completing the form.
Form Type: You cannot change this once you have created a form.
On-Demand Workflow: On-demand workflows enable users to initiate a form on-demand using different workflow options (EPA, Other Assessment Form, Faculty Evaluation and Rotation Evaluation).
EPA: Use for forms tagged to EPAs that you want users to be able to initiate. Contributes to CBME dashboards.
Other Assessment Form: Use for forms that you want users to be able to initiate and complete on demand without tagging to EPAs; or, for tagged forms that you don't want to appear in the EPA list. Only forms with EPAs tagged will contribute to CBME dashboards.
Course: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: Select the CBME Version that this form will be used for. After setting the version, you will only be able to tag EPAs from that version to this form.
By default, the Form Editor will load the most recent CBME version. Under "EPA Version", simply select the appropriate version. Click Save. If you want to build new forms for learners using Version 1, simply change the EPA Version to Version 1 and it will load the appropriate EPAs.
Permissions: It is highly recommended that you assign course/program-level permissions to all of your forms, as some filters rely on this setting. Additionally, using a form in a workflow requires that it be permissioned to a course.
Authorship permissions give other users access to edit the form. You can add individual authors or give permission to all administrators in a selected program or organization.
To add individual, program, or organization permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organizations to the permissions list as needed.
In order to use the "Programs" filter in the form bank and when learners initiate assessments/evaluations, you need to add Program-level permissions to each form.
Select the relevant contextual variables for this form by clicking on the checkbox. Adjust which contextual variable responses should be included by clicking on the gray badge. This allows you to deselect unneeded contextual variable responses which can make the form faster to complete for assessors.
If you want to include an Entrustment Rating on the form, click the checkbox. Select an entrustment rating scale from the dropdown menu. Note that the responses will be configured based on the scale you select. The Item Text may also be autopopulated based on the scale you select.
For the optional Entrustment Rating, set requirements for comments. If you select Prompted comments, also check off which responses are prompted in the Prompt column. If you use this option and a person completing the form selects one of the checked responses, they will be required to enter a comment. Additionally, if the form is part of a distribution, you'll be able to define how prompted responses should be addressed (e.g., send an email to the program coordinator whenever anyone chooses one of those response options).
The default Feedback and Concerns items will be added when the form is published.
Add form items by clicking 'Add Items', or click the down arrow for more options.
'Add Free Text' will allow you to add an instruction box.
If you add free text, remember to click Save in the top right of the free text entry area. Any free text entered will display to people using the form.
'Add Curriculum Tag Set' should not be used.
To create and add a new item, click the appropriate button.
Select the Item Type and add any item responses, if applicable.
Tag Curriculum Tags to your newly created item.
In the example below, because you are using a form that is mapped to "Version 2", the curriculum tag sets will be locked to "Version 2". This will ensure that you do not accidentally tag an EPA from a different version.
After you have added items to your form you may download a PDF, and preview or copy the form as needed.
Save your form to return to it later, or if the form is complete, click Publish. You will see a blue message confirming that a form is published. Unlike form templates which require a behind the scenes task to be published, a rubric form will be available immediately.
Rubric forms can also be scheduled for distribution through the Assessment and Evaluation module.
Users can't access the form when initiating an assessment on demand. Why is this happening?
Check that your form is permissioned to a course and has a workflow (e.g. Other Assessment) defined.
My PPA or Rubric Form is not displaying a publish button. Why is this happening?
In order for a PPA or Rubric form to be published, you must have at least one item that is mapped to part of your program's "EPA tree". The "Publish" button will only appear after you have tagged an item to one or more EPAs, or to one or more milestones within an EPA, and saved the item.
Is it a requirement to publish PPA and Rubric forms?
You only need to publish PPA and Rubric forms if you wish to leverage the EPA/milestone tagging functionality in the various CBME dashboards and reporting. You can still use PPA and Rubric forms without tagging EPAs or milestones if you only need to distribute them, or select them using the "Other Assessment" trigger workflow. If you want any of the standard CBME items, such as the Entrustment Item, Contextual Variables, or the CBME Concerns rubric, you must tag and publish the form. Keep in mind that the assessment plan builder only supports forms that have the standard entrustment item on them - meaning, only published PPA/Rubric forms.
This template organizes the information about Key Competencies and allows the information to be uploaded to Elentra.
Parent(s): This column is optional because you will provide the full parent path in the Milestones template.
If you do choose to populate this column, indicate the parent(s) of the key competency by providing the stage, EPA, and role (e.g., C-C1-PR).
Code: Indicate the key competency code (e.g., ME1, CL2, HA3)
Name: Provide the key competency text
Description: Not required but complete with CanMEDS information as you wish.
Detailed Description: Not required.
Save your file as a CSV.
When coding the Key Competencies remember these required codes for the CanMEDS stages and roles:
Transition to Discipline: D
Foundations of Discipline: F
Core Discipline: C
Transition to Practice: P
Professional: PR
Communicator: CM
Collaborator: CL
Scholar: SC
Leader: LD
Health Advocate: HA
Medical Expert: ME
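As a sketch, a Key Competencies file built from the columns above might look like this (the column headings and competency text are placeholders; use the official CanMEDS wording for your program):

```csv
Parent(s),Code,Name,Description,Detailed Description
,ME1,Full key competency text for ME1,,
,CL2,Full key competency text for CL2,,
,HA3,Full key competency text for HA3,,
```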
This template organizes the information about Roles and allows the information to be uploaded to Elentra.
Every program can use the same Roles template when they upload their program curriculum.
Parent(s): This can be left blank because you will provide the full parent path in the Milestones Template.
Code: Provide the code for the role (ME, CL, CM, LD, HA, SC, PR).
Name: Provide the full role text:
ME: Medical Expert
CM: Communicator
CL: Collaborator
LD: Leader
HA: Health Advocate
SC: Scholar
PR: Professional
Description: Not required but complete with CanMEDS information as you wish.
Detailed Description: Not required.
Save your file as a CSV.
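A sample Roles file following the columns above (the column headings are an assumption; match them to the provided template):

```csv
Parent(s),Code,Name,Description,Detailed Description
,ME,Medical Expert,,
,CM,Communicator,,
,CL,Collaborator,,
,LD,Leader,,
,HA,Health Advocate,,
,SC,Scholar,,
,PR,Professional,,
```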
Within a given contextual variable, you are able to create groups of responses. This is leveraged in the assessment plan builder to allow you to specify exactly what requirements learners must meet.
You must be in an administrative role to manage contextual variables.
Click Admin > Manage Courses
Beside the appropriate program name, click the small gear icon, then click "CBME"
Click on the "CV Responses" tab
Click on any contextual variable category to expand the card. To add a new group within that contextual variable, click on the green "Manage Groups" button.
To add a new group, click "Add Group". To edit an existing group, click on the group title.
Add a title for the contextual variable group and an optional description. Check off all responses that you wish to add to the group and then click "Save Group".
After saving, you can now use the Group contextual variable types within the assessment plan builder.
When importing data to create EPA maps, programs are required to upload a contextual variable template to provide contextual variable response options (e.g. case complexity: low, medium, high). In addition to this information, programs will need to add specific details about any procedures included in their contextual variable response options. This additional information, called procedure attributes, plays an important part in building procedure assessment forms. Think of the procedure attributes as assessment criteria (e.g. obtained informed consent, appropriately directed assistants, etc.) for each procedure .
For each procedure in your program, you must define headings, rubrics, and list items that may be used to assess a learner completing the procedure. Headings appear at the top of a form, a rubric becomes the subtitle of a rubric, and list items are the things that will actually be assessed on the rubric.
All this information must be stored and uploaded in a CSV spreadsheet that uses two columns: Type and Title.
Within ‘Type’ there are three things you can include:
H to represent a heading (e.g., Participate in Surgical Procedures, NG Insertion)
R to represent a rubric (e.g., Procedure, Pre-Procedure)
L to represent list items (e.g., Apply knowledge of anatomy)
The three procedure criteria form a hierarchy. A heading can have multiple rubrics, and a rubric can have multiple list items. Arrange the criteria in your spreadsheet to reflect their nesting hierarchies.
In the 'Title' column enter the information to be uploaded.
When complete, save the file in a CSV format.
Sample Procedure Form
On this sample form, the following information was uploaded as procedure attributes:
H: Participate in Surgical Procedures
R: Procedure
L: (ME1.3.2) Apply knowledge...
L: (ME3.4.2) Perform procedural tasks...
L: Use common surgical instruments...
You can use the same file for multiple procedures if they share the same heading, rubric, and list information.
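Based on the sample above, a procedure attributes file might be laid out like this (the ellipses stand for the full attribute text, which is abbreviated in the sample):

```csv
Type,Title
H,Participate in Surgical Procedures
R,Procedure
L,(ME1.3.2) Apply knowledge...
L,(ME3.4.2) Perform procedural tasks...
L,Use common surgical instruments...
```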
Navigate to Admin > Manage Courses.
From the cog menu beside the course name, select 'CBME.'
From the tab menu below the Competency-Based Medical Education header, click 'CV Responses'.
From the list of contextual variables, click on ‘Procedure’.
Before uploading procedure attributes, make sure you are working with the correct CBME curriculum version and change the version as needed.
Beside each procedure you will see a small black up arrow under the Criteria column.
Click on the arrow to upload a .csv file of information for each procedure.
You must indicate an EPA to associate the procedure attributes with (note that you can select all as needed).
Either choose your file name from a list or drag and drop a file into place.
Click ‘Save and upload criteria’.
You will get a success message that your file has been successfully uploaded. Click 'Close'.
After procedure attributes have been uploaded for at least one EPA you will see a green checkmark in the Criteria column.
Click the green disk in the Save column to save your work. You will get a green success message at the bottom of the screen.
Repeat this process for each procedure and each relevant EPA. Remember, a program can use the same procedure attributes for multiple procedures and EPAs if appropriate.
You can view the procedure attributes already uploaded to a procedure by clicking on the green check mark in the criteria column.
You will see which EPAs have had procedure attributes added and can expand an EPA to see specific details by clicking on the chevron to the right of the EPA.
Navigate to Admin > Manage Courses/Programs.
Search for the appropriate course as needed.
From the cog menu on the right, select 'CBME'.
Click on 'CV Responses' from the tab menu below the Competency-Based Medical Education heading.
Click on a contextual variable to show its existing responses.
Click on the double arrow in the Order column to drag and drop a response to a new location.
You'll get a green success message that your change has been saved.
If users want to reorder the overall list of contextual variables they can do so from Admin > Manage Curriculum. To change a tag's display order you need to edit the tag by clicking on the pencil icon beside a tag name. Then you need to adjust the display order for each individual tag.
After a program has uploaded an initial set of contextual variable responses, they can be managed through the CBME tab.
Navigate to Admin > Manage Courses/Programs.
Search for the appropriate course as needed and from the cog menu on the right, select 'CBME'.
Click on 'CV Responses' from the tab menu below the Competency-Based Medical Education heading.
To modify the contextual variable responses, click the title of the contextual variable or click 'Show' to the right of the one you want to edit.
To edit an existing response, make the required change and click the green disk icon in the Save column.
To delete an existing response, click the red minus button beside the response.
To add a new contextual variable response, click 'Add Response' at the bottom of the category window. This will open a blank response space at the end of the list. Fill in the required content and save.
If you get a yellow bar across the screen when you try to modify contextual variables it means none have been uploaded for the program you are working in. Return to the Import CBME tab and complete Step 4.
This template organizes the information about Enabling Competencies and allows the information to be uploaded to Elentra.
Parent(s): This column is optional because you will provide the full parent path in the Milestones template.
If you do choose to populate this column, indicate the parent(s) of the enabling competency by providing the stage, EPA, role, and key competency (e.g., C-C1-ME-ME1).
Code: Indicate the enabling competency code (e.g., ME1.1, CL2.3, HA3.4)
Name: Provide the enabling competency text.
Description: Not required but complete with CanMEDS information as you wish.
Detailed Description: Not required.
Save your file as a CSV.
When coding the Enabling Competencies remember these required codes for the CanMEDS stages and roles:
Transition to Discipline: D
Foundations of Discipline: F
Core Discipline: C
Transition to Practice: P
Professional: PR
Communicator: CM
Collaborator: CL
Scholar: SC
Leader: LD
Health Advocate: HA
Medical Expert: ME
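A sketch of an Enabling Competencies file using the optional parent path described above (the column headings and competency text are placeholders):

```csv
Parent(s),Code,Name,Description,Detailed Description
C-C1-ME-ME1,ME1.1,Full enabling competency text for ME1.1,,
C-C1-ME-ME1,ME1.2,Full enabling competency text for ME1.2,,
```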
This template organizes the information about Milestones and allows the information to be uploaded to Elentra.
Parent(s): In the parent column you must indicate the full parent path for all milestones. This should include stage, EPA, role, key competency and enabling competency (e.g., C-C1-ME-ME1-ME1.1)
Code: Codes should be recorded in uppercase letters. The format for the milestone code is: Learner Stage letter, followed by a space, CanMEDS Role letters and Key Competency number, followed by a period, Enabling Competency number, followed by a period, Milestone number (e.g., C ME1.1.1).
Note that there should be no space between the CanMEDS Role letter and the Key Competency number.
Creating and confirming your Milestones codes takes patience. You'll likely notice that in some Royal College (RC) documents there is only a two-digit milestone code. For the purposes of mapping your curriculum in Elentra, you must have three-digit milestone codes. We recommend you add the third digit in the order the milestones appear in the RC documents you're using. Make sure you check for duplication as you go, so that unique milestones have their own codes but different codes aren't applied to repeating milestones (within one stage). Using the data organization tools to reorder the milestone code columns and title columns can help you identify unneeded duplication.
You may also notice that the RC allows programs to use milestones coded from one stage in another stage (so you may see an F milestone in a C stage EPA). How programs and organizations handle this is ultimately up to them but we recommend that you align the milestone code with the stage that it is actually being assessed in/mapped to. For example, if it’s a “D” milestone being mapped to an F EPA, rename it as an “F” milestone. This is because you likely have a different expectation of performance on the milestone in a different stage, even if it’s the same task.
Name: Provide the text of the milestone as provided by the Royal College.
Description: Not required but complete with CanMEDS information as you wish.
Detailed Description: Not required.
Save your file as a CSV.
When coding the Milestones remember these required codes for the CanMEDS stages and roles:
Transition to Discipline: D
Foundations of Discipline: F
Core Discipline: C
Transition to Practice: P
Professional: PR
Communicator: CM
Collaborator: CL
Scholar: SC
Leader: LD
Health Advocate: HA
Medical Expert: ME
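Putting the parent path and code format together, a Milestones file might look like this (the column headings and milestone text are placeholders; use the wording provided by the Royal College):

```csv
Parent(s),Code,Name,Description,Detailed Description
C-C1-ME-ME1-ME1.1,C ME1.1.1,Full milestone text as provided by the Royal College,,
C-C1-ME-ME1-ME1.1,C ME1.1.2,Full milestone text as provided by the Royal College,,
```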
Warning: Do not delete any contextual variables from the original list within Manage Curriculum or the CBE auto-setup feature will prompt you to re-add them the next time you use CBE.
Contextual Variable Responses are used to describe the context in which a learner completes something. Examples of contextual variables include diagnosis, clinical presentation, clinical setting, case complexity, patient demographics, etc. For each variable, courses must define a list of response options. For example, under clinical presentation, a course might include cough, dyspnea, hemoptysis, etc.
Elentra will auto-setup a list of contextual variables in the Manage Curriculum interface. Institutions can also add contextual variables (e.g., assessor's role) to this list before uploading contextual variable responses (e.g., nurse, physician, senior resident) via each course CBE tab. Elentra provides access to the same Contextual Variables for all courses. Courses can customize their list of contextual variable responses, but all courses will see the same list of contextual variables. For this reason we strongly recommend that you work with your courses to standardize the list of contextual variables loaded into Elentra. If you do not, you risk eventually having hundreds of contextual variables that all course administrative staff have to sort through, whether they apply to their course or not.
Elentra will auto-create the following Contextual Variables for a CBE-enabled organization. Do not delete any of these or your users will be prompted to run the auto-setup tool again.
Institutions can add additional contextual variables to make available to their programs through Admin > Manage Curriculum. Group and role permissions of medtech:admin or staff:admin are required to do this. If you add a contextual variable to the organization wide tag set, courses will be able to upload their own contextual variable responses for the tag.
Navigate to Admin > Manage Curriculum.
Click on 'Curriculum Tags' from the Manage Curriculum Card on the left sidebar.
Click on the Contextual Variables tag set.
Click 'Add Tag' or 'Import from CSV' to add additional tags (more detail regarding managing curriculum tags is available here).
Any new tags you add to the Contextual Variable tag set (e.g. observer role, consult type, domain) will be visible to and usable by all programs within your organization.
When you add new tags to the tag set, you'll be required to provide a code and title. It is recommended that you make the code and title the same, but separate the words in the code with underscores. For example:
Title: Observer Role
Code: observer_role
Note that there is a 24 character limit for curriculum tag codes and a 240 character limit for curriculum tag titles. If you enter more characters than the limit, the system will automatically truncate your entry at the maximum character count.
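The recommended title-to-code convention can be sketched as a small helper. This is an illustrative assumption, not part of Elentra (the function name `tag_code_from_title` is hypothetical); it lowercases the title, joins the words with underscores, and enforces the 24-character code limit:

```python
def tag_code_from_title(title: str, max_len: int = 24) -> str:
    """Derive a curriculum tag code from its title by lowercasing,
    joining words with underscores, and truncating to the code limit."""
    code = "_".join(title.lower().split())
    return code[:max_len]

print(tag_code_from_title("Observer Role"))  # observer_role
```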
After you have added Contextual Variables to the existing tag set, programs will be able to add the new response codes (and responses) to the contextual variable response template and successfully import them.
Once you have built out the list of all contextual variables you want included in the system you can import specific contextual variable responses for each course.
First, build a spreadsheet that includes the CV response category and individual CV responses. You can use this file as a starting place:
Note that responses will display in the order they appear in the CSV file you upload. You can reorder the responses later, but you may save time by inputting them in the order you want users to see them.
The contextual variable response spreadsheet should have the following information:
Response Code: This column should include all contextual variables applicable to the course/program (e.g., assessor's role, basis of assessment, case complexity, case type). Make sure to use the exact response code for all contextual variable tags in the response_code column.
Response: In this column, you can add the response variables required for a course (e.g. Case complexity response variables: low, medium, high). To add more than one response variable per response code, simply insert another line and fill in both the response code and response columns.
Description (optional): This field is not seen by users at this point and simply serves as metadata.
Save as a CSV file.
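The spreadsheet described above can also be generated programmatically. This is a minimal sketch only: the exact header names should be confirmed against the template file Elentra provides; `response_code`, `response` and `description` are assumptions here, based on the column descriptions above:

```python
import csv

# One row per response; repeat the response code for each response,
# as described above. Responses will display in the order written.
rows = [
    {"response_code": "case_complexity", "response": "Low",    "description": ""},
    {"response_code": "case_complexity", "response": "Medium", "description": ""},
    {"response_code": "case_complexity", "response": "High",   "description": ""},
]

with open("cv_responses.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["response_code", "response", "description"])
    writer.writeheader()
    writer.writerows(rows)
```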
Navigate to Admin > Manage Courses.
From the cog menu beside the course name, select 'CBE.'
From the tab menu below the Competency-Based Education header, click 'CV Responses'.
Click 'Upload CV Responses'.
Drag and drop or search your computer for the appropriate file.
Click 'Save and upload Contextual Variable Responses'.
When the upload is complete you'll be able to click on any contextual variable and view the responses you've added.
Periodically an organization or course's CBE curriculum may be updated. Elentra can store multiple versions of a CBE curriculum and associate different learners with different versions as needed.
When a new curriculum version is introduced, administrators have the option of versioning an organization tree or a course tree.
For organizations using CBE outside the context of a Royal College curriculum framework, if a new curriculum version is added, you'll need to use the option to assign the new version to any existing user whose user-tree you want to update to point to the new curriculum version.
The remainder of this information is for programs using a Royal College (Milestones) curriculum framework.
For Royal College Programs, new curriculum versions will be applied to each CBME-enabled learner at the stage level. This means that if a learner is currently in Foundations, they will retain the previous curriculum version in Transition to Discipline and Foundations of Discipline, but will be given the new curriculum version in Core and Transition to Practice. Please ensure that all of your learners are in the correct stage (on their CBE Dashboards) prior to publishing a new curriculum version. Remember: a green checkmark on the stage indicates that the learner has completed that stage. A learner is currently in the stage that comes after the last green stage checkmark.
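The stage-level rule above can be sketched as follows. This is an illustrative model only (not Elentra's actual code); the single-letter stage codes are the Royal College codes, and completed/current stages keep the old version while future stages receive the new one:

```python
STAGE_ORDER = ["D", "F", "C", "P"]  # TD, Foundations, Core, Transition to Practice

def stage_versions(current_stage: str, old_version: int, new_version: int) -> dict:
    """Toy model: stages up to and including the learner's current stage
    keep the old curriculum version; later stages get the new version."""
    idx = STAGE_ORDER.index(current_stage)
    return {
        stage: (old_version if i <= idx else new_version)
        for i, stage in enumerate(STAGE_ORDER)
    }

# A learner currently in Foundations keeps version 1 for D and F,
# and receives version 2 for Core and Transition to Practice.
print(stage_versions("F", 1, 2))  # {'D': 1, 'F': 1, 'C': 2, 'P': 2}
```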
After an administrator has published a new version, they will need to build tools for all of the new or changing EPAs before learners can be assessed on them.
If you are a Royal College program, make sure learners are flagged as CBE learners before you complete the versioning process. A developer can do this directly in the database, or you can use a database setting (learner_levels_enabled) to allow course administrators to do this on the course enrolment page.
If learners are not flagged as CB(M)E learners, they will be skipped in the versioning process.
Prepare required CSV files
Add new curriculum version
Create any new Contextual Variable responses as needed and update Procedure criteria (per EPA) if applicable
Contextual variable responses are not specific to curriculum versions, with the exception of procedure criteria uploaded per specific EPA. When a new curriculum version is added you’ll see a version picker display on the CV responses tab, but edits made to CV responses in any version will be reflected across all versions.
Build new assessment tools for changed or added curriculum tags
Build an assessment plan for the new curriculum version
Update mappings, and priority and likelihood ratings on rotations if applicable
If a course uses the Clinical Experience Rotation Scheduler and has mapped curriculum tags to specific rotations and assigned priority and likelihood ratings, these will be copied forward for all unchanging curriculum tags. If a new curriculum tag has been added in a version, you’ll need to map it and provide a priority and likelihood rating if required.
Reset any user-trees as needed
For Royal College programs, learners will be assigned the new curriculum version for any upcoming stages they still need to complete.
Learner Dashboard
Completed and current stages for a learner will retain their previous curriculum version(s).
Future stages for a learner will be updated to the new curriculum version.
The version applied to each stage, viewed to the right of the stage name (small grey badge), should be updated to reflect the new curriculum version for future stages.
Any assessment tasks completed on a learner in a future stage for EPAs that have been updated in the new curriculum version should move to "Archived Assessments".
The "What's Left" modal showing the Assessment Plan requirements per EPA will only update for a learner after a new assessment plan is published.
Program Dashboard
Each learner's list of EPAs will reflect their curriculum version (so, one user may show D1, D2, and D3 and another user may show D1, D2, D3 and D4 if they are on version 2 for stage D and D4 was added in version 2).
Visual Summary Dashboards
More information coming soon.
EPA Encyclopedia
Updates in Elentra ME 1.26 mean that learners should see their unique user-trees when viewing the EPA Encyclopedia (i.e., they'll see different curriculum versions applied to different stages).
Completing Assessments
When initiating an assessment, only the EPAs and associated forms relevant to a specific learner will be displayed. Faculty or admin will not have to choose between multiple versions of on-demand forms for a given learner.
Setting learner CB(M)E status (via the course enrolment page or directly in the database) is a prerequisite for Royal College programs using curriculum versioning who want residents to move to a new curriculum version when they begin a new stage of training.
REMINDER: Please ensure that all of your learners are in the correct stage (on their CBME Dashboards) prior to publishing the new curriculum version. The green checkmark on the stage indicates that the learner has completed that stage. A learner is currently in the stage that comes after the last green stage checkmark. Failing to ensure that learners are in the correct stages will result in those learners receiving (or not receiving) new curriculum in stages when they shouldn't have.
Please ensure that you have documented any changes that will need to be made, in addition to creating (or modifying) new CSV templates for upload. From the user interface, you will be able to indicate:
Whether an EPA is changing, not changing, or being retired
Within each EPA that has been marked as changing, indicate if any KC, EC, or Milestones will be changing, not changing, or retired
If you are adding milestones to an EPA, mark it as changing so that the uploader will add the new milestones
Entirely new EPAs will be detected by the uploader without you needing to indicate anything. For example, if you currently have D1, D2, and D3, but will be adding a D4 and D5, the system will detect these in your spreadsheet and add them to the new curriculum.
Please ensure that all of your learners are in the correct stage (on their CBE Dashboards) prior to publishing a new curriculum version. Remember: a green checkmark on the stage indicates that the learner has completed that stage. A learner is currently in the stage that comes after the last green stage checkmark.
Assuming that new EPAs are being introduced or EPAs are changing, you'll need the following:
Stages Template (if adding new EPAs)
EPA Template
Milestone Template
Any new program-specific Key and Enabling Competencies, if applicable
SPREADSHEET TIP:
To ensure that you do not have any duplicated milestone text or duplicated milestone codes, use the "Duplicate Values" feature in Microsoft Excel. This will highlight any duplicated cells for you to ensure that you have only entered each milestone once. Caveat: this method is not perfect and will not catch the same milestone text if one instance of it has an additional space or missing period.
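To close the gap the caveat above describes, near-duplicates can be found by normalizing whitespace, case and trailing periods before comparing. A minimal sketch (the function name `find_fuzzy_duplicates` is hypothetical, not an Elentra or Excel feature):

```python
from collections import defaultdict

def find_fuzzy_duplicates(milestones):
    """Group milestone strings that are identical after normalizing
    whitespace, case, and trailing periods -- the near-duplicates that
    Excel's "Duplicate Values" highlighting will miss."""
    groups = defaultdict(list)
    for text in milestones:
        key = " ".join(text.split()).rstrip(".").lower()
        groups[key].append(text)
    return [group for group in groups.values() if len(group) > 1]

milestones = [
    "Performs a focused history.",
    "Performs a  focused history",   # extra space, missing period
    "Synthesizes findings",
]
print(find_fuzzy_duplicates(milestones))
# [['Performs a focused history.', 'Performs a  focused history']]
```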
Log in to Elentra.
Click "Admin" at top right
Click "Manage Programs"
Select your program.
Click "CBME"
Click "Configure CBME".
Click "Actions"
Click "Add New Version"
You can only have one draft curriculum version in progress at a time. If a draft already exists, you can continue working on it or use the reset option presented to you.
Indicate which versioning method to use.
For existing Royal College Programs, select "Use an existing tree version as a base."
Click "Next Step"
Choose a tree version from the dropdown selector.
After you make a choice, you'll be presented with a preview of the existing tree.
Click "Next Step"
Note: Each program can only have one version "in progress" at a time. You can continue editing that new version, or reset the progress.
Set the status of curriculum tags as needed. Note the option to apply a parent objective's status to its children.
If you are adding milestones to an EPA, mark the EPA as changing so that the uploader will add the new milestones
Entirely new EPAs will be detected by the uploader without you needing to indicate anything. For example, if you currently have D1, D2, and D3, but will be adding a D4 and D5, the system will detect these in your spreadsheet and add them to the new curriculum.
Click "Next Step"
Not Changing
The objective will remain unchanged. This exact version of the objective will be carried forward into the new curriculum version. Note that selecting this option will stop any objectives with the same code from being uploaded.
Changing
The objective will be changing. This will allow you to upload new objective text. For a child objective (e.g. milestones), if you select “Changing” for that objective under one EPA, the system will update all other instances of that objective to “Changing”. This will ensure that you only have one version of that objective.
If there are any children in a Changing parent objective that you want removed, you MUST mark them as Retired. (For Royal College programs, this is with respect to the Milestone code - if "C PR4.3.2" will no longer be in an EPA, mark it as retired.)
Retired
The selected objective and all of its children will not be included in the next version. Any status changes for any of this objective's children will be ignored.
If you include an objective with the retired code in the spreadsheet, it will not get uploaded.
For Royal College programs, you are unlikely to use this status for EPAs unless there is an EPA at the end of your numbering that is being removed. For example, you have D1, D2, D3, D4 currently. You are retiring D4 from the next version and no new D4 will be uploaded.
For child objectives, a “Retired” status will remove that objective from only its parent in the new version. This means that there will be no new version of that child for that parent. Elentra will not retire the child from other parents unless you explicitly tell it to.
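The per-parent behaviour of the Retired status can be sketched as a toy model. This is illustrative only; `apply_statuses`, the tree shape, and the status map are assumptions for the example, not Elentra's data model:

```python
def apply_statuses(tree, statuses):
    """Toy model: a child marked Retired under one EPA is dropped from
    that EPA only; other parents keep it unless it is also marked there."""
    return {
        epa: [c for c in children if statuses.get((epa, c)) != "Retired"]
        for epa, children in tree.items()
    }

tree = {"C EPA1": ["C PR4.3.2", "C ME1.1.1"], "C EPA2": ["C PR4.3.2"]}
statuses = {("C EPA1", "C PR4.3.2"): "Retired"}

# C PR4.3.2 is removed from C EPA1 only; C EPA2 still carries it forward.
print(apply_statuses(tree, statuses))
# {'C EPA1': ['C ME1.1.1'], 'C EPA2': ['C PR4.3.2']}
```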
Upload files for any tag sets with changing or retired status or where new tags will be added.
You typically do not need to re-upload files for tag sets that aren't changing (e.g., stages, roles, key and enabling competencies).
If your files are missing information, Elentra will present you with a warning and ask you to re-upload your information.
If your file upload is successful, you will see a preview of your new course tree.
Click "Next Step"
You will be presented with a preview of the course tree.
After reviewing all information is complete, click "Publish"
After you publish a new curriculum version, you will need to build assessment tools for all of the new or changing objectives before learners can be assessed on them. Even after the published curriculum version is live, learners will not be able to be assessed on any new/changed objectives until the new assessment tools have been built.
For Royal College programs, remember that in most cases this will have minimal impact, since the learners only get the new version for the stages beyond the one they are currently in.
Please do not delete forms or form templates from previous versions unless you are certain that there are no longer any learners using that version.
After sufficient information has been added to the system, a user can generate a map showing the connections between EPAs, roles, key competencies, enabling competencies, and milestones.
Some schools opt to include a link to the EPA Encyclopedia from Helpful Links. Otherwise users can access the EPA Encyclopedia when initiating an assessment.
There are several filter options when viewing the EPA Encyclopedia.
Search for a specific learner. After you select a learner the page will refresh to show that learner's program(s).
Switch between multiple programs as needed.
Search for a specific EPA by typing in an EPA code or key word. Elentra will search the EPA title, detailed description and entrustment for a match.
Switch between CBME Curriculum versions as needed.
After clicking on an EPA, click on Detailed Description, Entrustment or Program Map to view details about the EPA.
This option is only available to users with Admin > Manage Courses access (e.g. program directors, curriculum coordinators, program coordinators).
Navigate to Admin > Manage course/program and select a course.
Click CBME and the Map Curriculum tab.
Scroll down to click on an EPA or type the beginning of the EPA code and title into the dialogue box and click on the relevant EPA when it appears below.
Hover the mouse over a point on the map to view the complete text for the mapped item.
Elentra allows you to map EPAs to specific rotations and indicate for each rotation whether an EPA is a priority and how likely a learner is to encounter the EPA. For this information to be useful for learners, you must be using the Elentra Rotation Scheduler.
From the CBME dashboard, EPAs can display their priority and the likelihood of being encountered within a learner's rotation. Additionally, EPAs relevant to the learner's current rotation are outlined.
When learners and faculty trigger assessments they will also have the option to apply pre-set filters (e.g. priority EPA, current stage EPAs). All of these tools can help learners quickly identify EPAs to focus on in their clinical work.
Navigate to Admin > Clinical Experience.
Click on the Rotation Schedule tab.
Create a draft schedule or open a published schedule. Note that if you are creating a new rotation schedule, you will also need to add rotations to the schedule.
Once rotations exist within a schedule a small Objectives badge will appear beside the rotation name.
Click on the CBME Objectives badge to open a list of EPAs tagged to the rotation.
Click the plus sign beside any EPA you'd like to label with priority or likelihood.
Indicate the likelihood by clicking on the appropriate button in the Likelihood column.
Set an EPA as a priority by clicking the exclamation mark in the priority column.
Repeat as necessary.
Return to the rotations list using the Back to Rotations button in the top right.
On-demand workflows enable learners to initiate forms on-demand using different workflow options. Administrators are able to select the on-demand workflow options when building a form.
There is developer work in the database required to set up workflows. (Please see file at bottom of the page.)
Please see additional details.
To use the learner dashboard and various form templates, an organization must have rating scales built. Elentra includes several default rating scales but organizations can also add their own.
There are four types of rating scales: default, dashboard, Global Assessment, and MS/EC, which stands for Milestone Scale/Enabling Competency. There is no user interface to modify the scale types, and both Global Assessment and MS/EC are used on form templates, including supervisor and procedure form templates.
There is a user interface to manage rating scales but only users with Medtech>Admin group and role permissions can modify scales. Note that scales are applied to an entire organization and can be accessed by multiple courses/programs.
Managing rating scales is part of the Assessment and Evaluation module; please see the additional help resources.
Note that as of ME 1.14, when a new rating scale is added by an organization there is some developer work that must be completed to make the rating scale available on form templates. If you are a developer, please see the resource below.
Elentra is designed to automatically populate the item text when you select a global rating scale. Using different scales will result in different questions populating the form. See examples below.
A supervisor form is used to give a learner feedback on a specific EPA and can be triggered by a learner or supervisor. Once an EPA is selected, the form displays the relevant milestones to be assessed. A supervisor can indicate a learner’s progress for each milestone that was observed and can provide a global entrustment rating. Comments can be made optional, prompted or mandatory in each section of the form.
When you create a supervisor form template and publish it, the system automatically looks at the EPAs, milestones, and contextual variables selected and generates the appropriate number of forms. If you kept 3 EPAs on the supervisor form template, the system will generate 3 unique forms (one per EPA) that are available to be triggered by a user.
You need to be logged in as a Program Coordinator, Program Director or staff:admin role to access Admin > Assessment & Evaluation.
Navigate to Admin > Assessment & Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form name and select the appropriate form type from the dropdown menu. Select ‘Supervisor Form’.
If you have permission to access multiple programs, use the dropdown menu to select the appropriate program for this form. This option will only show up if you have access to multiple programs.
Click 'Add Form'.
Add additional form template information as required:
Template Title: Edit the form template title/name if needed.
Description: The form description can be used to store information for administrative purposes, but it is not seen by users completing the form.
Form Type: The form type was set when you created the form and cannot be changed here.
Course/Program: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Permissions: Anyone added under permissions will have access to edit and copy the form. You may wish to include a program in the permissions field so that you can filter by this form type later on.
To add individual, program, or organisation permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organisations to the permissions list as needed.
Include Instructions: Check this to open a rich text editor where you can provide instructions about the form to users. The instructions will display at the top of forms built from this template. The same instructions will apply to all forms published from this form template.
Select which EPAs can be assessed using forms generated from this template.
All EPAs assigned to a course are included on the template by default.
To remove EPAs, click the 'x' to the left of the EPA code. You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click the grey badge beside an EPA to display a list of all the milestones mapped to that EPA (all are selected by default). Remove milestones as needed and then click 'Save and Close’.
Deleting unnecessary milestones is one way to reduce the length of the form and reduce the time required to complete it.
Modify the milestones for each EPA as needed. Elentra does not enforce a maximum number of selected milestones per EPA.
Click 'Save'.
If you want all EPAs to have the same available contextual variables, leave all EPAs checked off.
If you’d rather specify which contextual variables apply to which EPAs simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
Select which contextual variables you want to include with which EPAs by checking and unchecking the tick boxes.
You may only select between 1 and 6 contextual variables per EPA per supervisor form.
By default, all of the options within a contextual variable are included on any forms made from the template.
Click the grey button beside a contextual variable to view the available contextual variable responses.
To remove specific responses from this template, deselect them. For convenience, you can also use ‘Check All’ and ‘Uncheck All’.
To allow users to select multiple CV responses when completing a form, check the multi-select box. If selected, the item created on the form will be a dropdown multiple-response type instead of a single-response type.
When you have made the required changes, click the blue ‘Save and Close’ button.
If you modify which contextual variable response options will be available on a template, the number in the grey badge will show how many responses have been included out of the total possible responses.
Click 'Save'.
All contextual variables will display on this list, even if a program doesn't have contextual variable responses set for that variable. If you attempt to select a contextual variable for which there are no responses set, you will get an error message that reads "No objectives found to display." Click the X on the red message to remove it and then select a different contextual variable to use.
Use the first dropdown menu to select the scale you want to use to assess enabling competencies or milestones. (Scales can be configured by a medtech:admin user via Admin > Assessment and Evaluation.)
Indicate whether comments are disabled, optional, mandatory, or prompted.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
The default response feature allows you to prepopulate a form with the selected response. This can reduce the time required to complete the form.
The Responses fields will be automatically populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
From the first dropdown menu, select a Global Rating Scale.
Enter Item Text if needed.
Elentra allows organizations to optionally automatically populate the Item Text based on the selected scale. If you do not see Item Text, your organization may require some additional configuration by a developer.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted.
The Responses fields will be auto-populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed except by a developer. These items will be added to all forms published from this template.
Click 'Publish' to make forms generated by this template available for use. Remember that the number of forms that will be created from a template depends on the number of EPAs assigned to the template.
Once a form template has been published, you can rearrange the template components for each form; however, you cannot make changes to the scales or contextual variables. To make these changes, copy the form template and create a new version.
Please note that a behind-the-scenes task needs to run before your forms will be published. At some schools this may take up to an hour, so expect a slight delay between when you publish a form and when it is available to be triggered by users.
Before staff can build forms for use with CBE, you should confirm you have the following configured in Elentra:
any rating scales that will be used with form templates or forms
this includes global rating scales, which will be used on any CBE forms, and
a milestone rating scale, which will be used on form templates if you use them
while some organizations may already have Other Assessment workflows configured for generic forms, a developer will need to configure additional workflows to support the new form types to be used with CBE (e.g., rubric or Supervisor forms)
Building assessment forms for learners and faculty to access is an important step in using the CBE tools.
Two options for building forms exist:
Building forms with a template (e.g. supervisor, procedure, smart tag and field note forms)
Building forms without a template (e.g. periodic performance assessment (PPA) or rubric form)
Using a form template allows Elentra to quickly generate forms pulling in the relevant EPAs, milestones, contextual variables and rating scales defined by administrative staff. After form templates are published, they can be triggered by faculty and learners at any point. Form templates have defined items included, and administrators cannot add additional items to a form template. Examples of form templates are the supervisor form, procedure form, and field note form.
You can also build forms without a form template which gives you increased flexibility to add items of your own design. The rubric and PPA form have some required elements but you can also add additional items to these forms. As long as one item added to a rubric or PPA form is linked to an EPA the form will be triggerable by faculty and learners.
If a form is not something that a faculty member or learner will trigger themselves, program administrators can create forms and send them out to be completed via distributions. A distribution defines who or what is being assessed/evaluated, who is completing the form, and when it is to be completed. Building items and forms to use with distributions is not a requirement if you only want to use on-demand forms. For additional information on building and distributing forms, please see the Assessment and Evaluation help resources.
For Staff and Faculty Groups to have access to the CBME features in the Distribution wizard, a developer will need to add an additional resource_type called CBME into the elentra_auth.acl_permissions table to include the organisation that has enabled CBME (if your instance of Elentra has more than one organisation) and the two groups of staff and faculty with an app_id of 1 and only a read permission.
An example of this entry for an instance where CBME is enabled in organisation_id 1 and 3:
Before beginning to create items and build forms, confirm the following with your stakeholders:
whether to include form instructions (i.e., information displayed at the beginning of a form)
what milestone rating scale to use
what global entrustment rating scale to use and what the prompt for a global entrustment rating should read (note that you can have a developer configure this to be automatic if you wish)
whether to default any response options on items
whether to include prompted responses (note that as of Elentra ME 1.25 prompted response options on on-demand forms will NOT send notification to course staff or directors; only the default Concern Items appended to all CBE forms will send a notification)
whether to require comments for any prompted responses
CBME includes specific form templates to reduce the amount of work administrative staff have to do to create the assessment forms courses/programs will use. Elentra also supports generic forms (including rubrics) which can also be leveraged by the CBME tools including the individual dashboard, program dashboard and assessment plan builder. Specific details for each form type are included in other lessons but here is some general information about creating forms.
Form Permissions: These permissions dictate who has access to and can edit a form or form template while it is still a draft. To quickly make a form available to multiple users, consider adding a program permission to the form; this gives access to anyone affiliated with that program (e.g., PA, PD) who has access to the administration of forms.
Note that a form must be permissioned to a course to be used in an on-demand workflow.
Form Instructions: To include instructions on a form template, look in the Form Template Information section and click the box beside 'Include Instructions.' This will open a rich text editor where you can type. What you type here will be included on all forms generated from this template. You could include specifics about the use of a form, the number of assessments a learner must complete, or even the key features of an EPA. Remember that each form produced will have the same instructions so if you include key features make sure they only relate to the applicable EPAs for the template.
Publish a Form: To generate forms that learners or faculty can trigger, you must publish the form template. Most installations of Elentra run a behind-the-scenes action to publish form templates every hour or so.
Editing a Form: Once a form template is published you cannot change the content of the resulting forms (you can change permissions or rearrange item order on published forms). If there is an error on a form you will need to copy the form, correct the error, and publish your new form. You can delete old forms, but note that any already-completed assessments will remain on the learner dashboard.
Form embargo is an optional Elentra feature and must be enabled through a database setting (cbme_assessment_form_embargo).
When creating and editing forms the embargo option will only appear if the above setting is enabled.
If embargoed forms exist and the setting is later turned off, any new assessments created with those forms will not be embargoed.
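As a hypothetical sketch, a developer might enable the flag with something like the following. Only the setting name (cbme_assessment_form_embargo) comes from this article; the settings table and column names are assumptions and vary by Elentra version, so confirm the actual schema first.

```sql
-- Hypothetical sketch: table/column names are assumptions; only the
-- setting name comes from this article. Verify the schema first.
INSERT INTO `settings` (`organisation_id`, `shortname`, `value`)
VALUES (1, 'cbme_assessment_form_embargo', '1');
```

Other optional flags mentioned in this article (e.g., the cbe_smart_tags_* settings and cbme_assessment_plan_form_dependencies) are toggled in the same way.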
An embargo can be applied to CBME forms that are permissioned to a course (e.g. Rubric Form). It does not currently apply to Form Templates.
If an embargo is applied to a form, tasks using that form will not be released to the target until specific release conditions as set by the form creator have been met. (Note that the distribution wizard provides a similar option for peer-assessment tasks within a specific distribution.)
There are three options available when setting an embargo.
A Course Director or Program Director must release the assessment
To set up manual release, just check the "Embargo" checkbox. The assessment responses will be hidden from the learner until a program coordinator or course director for the course the assessment belongs to goes to the task and releases it to the learner.
You can optionally choose to send email notifications for tasks completed using this form. Check off the appropriate box(es) to enable email notifications.
When tasks with a manual release requirement are completed, Program Coordinators and Program Directors will be able to view the task and optionally release it to the target or the Competency Committee Member.
If tasks are released to learners, they will display under Tasks Completed on Me (information re: CBME dashboard pending).
If tasks are released to CC members, they will be visible when the CC member views the learner's Assessments tab (information re: CBME dashboard pending).
Tasks generated using this form will never be available to the target, only program coordinators and program directors will be able to view the completed tasks.
To set up a permanent embargo, check off "Embargo", AND check off "Permanent Embargo?"
Tasks generated using this form will not be available to the target unless the target has completed at least one task using one or more other specific forms.
To set up this option, click on "add completion requirement". That will make the responses become visible to the learner as soon as the other form (selected from the dropdown) is complete.
Tasks that are embargoed will display an embargoed label on the target card.
Embargoed forms can be used in distributions or on-demand assessments.
All tasks generated using an embargoed form will have these embargo release conditions applied, regardless of on-demand completion method (e.g. complete and confirm via PIN, email blank form, etc.)
It is possible for an embargo to be applied even if a task related to the completed assessment would normally require action from the assessee.
Since the embargo release function is carried out by course contacts, this feature is only available to forms which have a course relation.
Embargo form behaviour conflicts with another Elentra tool - the ability for evaluators to immediately release their completed task to the target. Because of this, a check was added to ensure that this question does not appear when a task uses an embargoed form.
There is no user interface to configure the item text used with each scale. This is controlled through the database; if you need to create a scale and will be using it as a global rating scale please speak to your project manager or a developer.
To review feedback provided about the assessment tools themselves, go to Admin>Assessment and Evaluation, and click Reports from the second tab menu (look under the Assessment & Evaluation header). Under Assessments, click on Assessment Tools Feedback Report. CBME form feedback will show up there. You can filter by course and form to refine the list of feedback. If you are logged in as a PA you will only have access to forms associated with your program.
After creating forms using a form template, you can reorder form items on individual forms. This applies to published Supervisor Forms, Field Notes, and Procedure Forms.
Open the Form Template that you used to build the form of interest.
Click on the title of the form on which you wish to re-order items.
Click and drag on the “cross” icon to re-order the assessment items.
The changes will be saved automatically. This must be done for each individual form that you wish to re-order.
Smart tag forms are template-based forms which are built and published without curriculum tags attached to them.
Curriculum tags are attached based on the selections made when triggering assessments using a smart tag form.
Smart tag forms can be assigned ‘Associated Curriculum Tags’, which will be attached automatically when curriculum tags are selected.
There are multiple database settings that are optional when using Smart Tag Forms.
cbe_smart_tags_items_autoselect
Determines whether smart tag items should be pre-filled on trigger
cbe_smart_tags_add_associated_objectives
Adds the associated objectives item to smart tag forms on trigger
cbe_smart_tags_items_mandatory
Determines whether smart tag items are required
cbe_smart_tags_items_assessor_all_objectives_selectable
Determines whether assessors can view all selectable smart tag objectives in the learner tree, rather than only the objectives applied via the Smart Tag form trigger process
cbe_smart_tags_items_target_all_objectives_selectable
Determines whether targets can also see all selectable objectives, not just the ones applied via the Smart Tag form trigger process
cbe_smart_tags_items_all_objectives_selectable_override
Overrides the target and assessor *_all_objectives_selectable settings, if set
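Assuming these flags live in a key/value settings table (an assumption about your schema; verify with a developer), their current values could be reviewed with a query along these lines:

```sql
-- Hypothetical sketch: table/column names are assumptions; only the
-- cbe_smart_tags_* setting names come from this article.
SELECT `shortname`, `value`
FROM `settings`
WHERE `shortname` LIKE 'cbe_smart_tags_%';
```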
Open the Admin drop-down on the top menu and click Assessment & Evaluation.
Click the Form Templates tab.
Click the Add Form Template button.
Input the following:
Form Title (required)
Form Type: Smart Tag Form
Program (required): [select course name]
Click the Add Form button to create the form.
After you create the form template, you will be taken to the Edit Form Template page.
Scroll down to Template Components.
In the Template Components, under the Associated Objectives to display on form drop-down, select the desired tag set objective.
Note: The tag set options of Allow tag set to be shown on a smart tag form and Allow tag set to be added to form templates must be selected in the Curriculum Framework Builder.
Click the Save and Next button to continue.
Select the desired Contextual Variables in the list of Contextual Variables.
Click the Save and Next button to continue.
Complete the remaining sections of the form as outlined in Add a Supervisor Form Template.
Ideally your users will be loaded into Elentra from a central, authoritative source like a student information system. Manually adding users through Admin>Manage Users is a possibility and more detailed instructions for using this tool are available here.
If you are manually setting up CBE users, please note the following:
Resident learners should be assigned group: student, role: student.
Program Administrators should be assigned group: staff, role: Pcoordinator. Further, they must be assigned to a specific program in order to access the relevant CBE and Assessment and Evaluation features. To give specific staff more ability to act within Elentra set them up as Staff>Admin. Note that Staff>Admin have access to almost all features of Elentra and can access System Settings.
Faculty should be assigned group: faculty, role: faculty, lecturer, or director depending on their role.
Faculty directors will have access to all faculty evaluations within their program (except their own).
Someone with faculty director permissions will still have to be added to a specific program in order to access the relevant CBE and Assessment & Evaluation features.
Any faculty, regardless of their role, can be added to Competence Committees through a course/program setup page or as Academic Advisors through the program Groups tab.
Setting up Academic Advisor Groups will allow the Academic Advisors to access the CBME Dashboard and assessments of their affiliated learners. Academic Advisor Groups rely on the Elentra Course Groups feature and are not specific to CBME-enabled organizations.
Please see here for information on configuring Course Groups.
You need to be logged in as a program coordinator or program director to create or modify competence committees.
Navigate to Admin>Manage course and select a course.
Under the program title, click on Setup.
Scroll down to the Course Contacts section and find the 'Competency Committee Members' section.
Begin to type a name to retrieve a list of potential committee members.
Click on the name you want to add and repeat as necessary.
When a name has been added it will appear below the search box.
After you have added the required names, scroll down to the bottom of the page and click ‘Save’.
Note that all Competency Committee Members for a program will be able to access all resident profiles in that program and will have the ability to promote them.
Members of Competency Committees can promote learners through stages from the learner CBME dashboard. For more information please see the Reviewing Learner Progress>Promoting Learners Through Stages lesson.
Yes, with developer assistance you can disable the program dashboard for specific courses. Developers, the program dashboard can be disabled for a course by adding an entry cbme_progress_dashboard with a value of 0 in the course_settings table.
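The article specifies only the table (course_settings), the setting name, and the value; the column names below are assumptions, so treat this as a sketch for a developer to adapt rather than a ready-to-run statement:

```sql
-- Hypothetical sketch: only the table name, setting name, and value come
-- from this article; column names are assumptions. 123 is a placeholder
-- for the course whose program dashboard should be disabled.
INSERT INTO `course_settings` (`course_id`, `shortname`, `value`)
VALUES (123, 'cbme_progress_dashboard', '0');
```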
Currently, the assessment plan builder and CBME Program Dashboard support Supervisor Forms, Field Notes, PPAs (with global entrustment item), Procedure Forms, and Rubric/Flex Forms (with global entrustment item). They do not currently support PPAs/Rubrics that lack the global entrustment item, nor do they support Smart Tag forms.
The CBME Program dashboard and the assessment plan builder take both form versions and EPA versions into consideration. This means that in order for the dashboard to generate correct assessment counts, you need to enter assessment plans for all active EPA versions, and in some cases, all form versions. Remember, your competence committee still has access to all learner data and can 'overrule' the system by marking an EPA as Approved in these (and other) cases. The dashboard should not be the sole source of information for competence committees.
The CBME Program Dashboard only counts assessments that have published assessment plans linked to them. In some cases, you may have assessments that were completed on older form versions that do not have a plan, or you have not yet built assessment plans for your new forms, so the dashboard does not count these assessments. Additionally, some form types are currently not supported such as PPAs and Rubrics that are not tagged to any EPAs. If a learner has gathered assessments on a previous EPA version and is now on a new version (e.g., assessed initially on F3-Version 1, but was given F3-Version 2 midstream) these archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
At most schools, the CBME Program Dashboard is updated on a once-nightly basis.
No, archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
Yes, the program dashboard does include assessments completed by external assessors.
Residents can highlight assessments that they found helpful to their learning. From the “Assessments” tab of the CBME Dashboard, a learner can click on the “thumbs up” icon to indicate to an assessor that their feedback was helpful to the resident's learning. Learners can also include a comment on why they found it helpful. This feedback is important to help assessors identify the types of feedback that residents find beneficial for their learning.
Accessible by PDs and PAs from the CBME Dashboard Stages tab, the Milestone Report allows faculty and PAs to generate a milestone report for a specific learner. This report is a breakdown of completed assessments that have been tagged to milestones and EPAs. Assessments are tallied based on the milestone and EPA that they are tagged to, and the tool generates one report per unique scale. The number of unique scales is determined by the tools used to complete the included assessments. The reports are generated as CSVs and are zipped into a folder.
From a learner's CBME Dashboard Stages tab, click Milestone Report in the top right (beside Log Meeting).
1. Select a date range for the report.
2. Click “Get Tools”.
3. Select the tools that you wish to view the aggregated milestone report for, or click “Select All Tools”.
4. Click “Generate Report”. This will open a download modal for you to select where to save the zip file.
5. Unzip the file.
6. If multiple rating scales were used to assess the milestones, there will be one CSV file for each rating scale.
7. Open a CSV file.
8. Each row represents a milestone, and each column represents a rating scale option. These rating scale options are grouped by EPA (e.g., for the Queen’s Three-Point Scale, you will see 4 columns for each EPA: Not Observed, Needs Attention, Developing, and Achieving).
9. Each cell displays the total number of times that milestone was assessed for that EPA, including how many times it was rated at that level on the rating scale (e.g., 3 of 6 completed assessments were rated as “Achieving”).
NOTE: Even though you may have selected only one of many Supervisor Tools (or other tools), if you used the same scale on all tools, the report will display all data for all tools that used that rating scale. We will be enhancing this in future iterations to only report on tool(s) selected.
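To illustrate the layout, a fragment of one rating-scale CSV might look like the following (the milestone labels, EPA code, and counts are invented for illustration):

```csv
Milestone,F3: Not Observed,F3: Needs Attention,F3: Developing,F3: Achieving
Milestone 1.1,0,1,2,3
Milestone 2.4,1,0,2,1
```

Reading the first data row: this milestone was assessed six times against EPA F3, and three of those assessments were rated as Achieving.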
A procedure form is an assessment tool that can be used to provide feedback on a learner’s completion of a specific procedural skill. Once a procedure is selected, specific criteria will be displayed. A procedure form can be initiated by a learner or faculty.
When you create a procedure template and publish it, the system looks at the number of EPAs and procedure contextual variable responses selected and generates the appropriate number of forms. If you keep 3 EPAs and indicate 10 procedures on the form template, the system will publish 30 forms that are available to be triggered by a user (one form per EPA per procedure).
To use the Procedure Form Template, a program must first:
Define contextual variable responses for the Procedure variable
Upload assessment criteria CSV files for each procedure. This provides the actual assessment criteria for each procedure.
You can upload different criteria (i.e., different assessment forms/items) for each procedure.
You can optionally use the same criteria for every EPA that will assess that procedure, or you can upload different criteria for every EPA that will assess that procedure.
You need to be logged in as a Program Coordinator, Program Director or staff:admin role to access Admin > Assessment & Evaluation.
Navigate to Admin > Assessment & Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form template name and select the form type (Procedure Form) from the dropdown menu.
If you have permission to access multiple programs, a dropdown menu will appear; use it to select the appropriate program for this form.
Click 'Add Form'.
You will be taken to the procedure form template build page.
Template Title: Enter the title of the form. This will be seen by users.
Description: The form description can be used to store information for administrative purposes, but is not seen by users completing the form.
Form Type: This was set in the previous step and cannot be edited here.
Course/Program: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Permissions: Anyone added under permissions will have access to edit the form before it is in use and use the form if they are setting up a distribution. You may wish to include a program in the permissions field so that you can filter by this form type later on.
To add individual, program, or organisation permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organisations to the permissions list as needed.
Include Instructions: Check this to open a rich text editor where you can provide instructions about the form to users (instructions will display at the top of forms built from this template). The same instructions will apply to all forms published from this form template.
Select which EPAs can be assessed using forms generated from this template.
All EPAs assigned to a course are included on the template by default.
To remove EPAs, click on the small 'x' to the left of the EPA code. You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click 'Save and Next'.
Note: You do not specify milestones for use on a Procedure Form.
By default, ‘Procedure’ will be selected as a contextual variable. This will require some additional information to be added to the system if the program you are working in hasn’t input procedure response options.
If you want all EPAs to have the same available contextual variables leave all EPAs checked off.
If you’d rather specify which contextual variables apply to which EPAs simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
In addition to 'Procedure', you may select between 1 and 6 contextual variables per EPA.
By default, all of the response options within a contextual variable are included on any forms made from the template.
Click the grey button beside a contextual variable to view the available contextual variable responses.
To remove specific responses from this template, deselect them. For convenience, you can also use ‘Check All’ and ‘Uncheck All’.
To allow users to select multiple CV responses when completing a form, check the multi-select box. If selected, the item created on the form will be a drop-down multiple-response type instead of a single-response type.
When you have made the required changes, click the blue ‘Save and Close’ button.
If you modify which contextual variable response options will be available on a template, the number in the grey badge will show how many responses have been included out of the total possible responses.
Click 'Save'.
Setting a Scale
Use the first dropdown menu to select the scale you want to use to assess enabling competencies or milestones. (Scales can be configured by a medtech:admin user via Admin > Assessment and Evaluation.)
Indicate whether comments are disabled, optional, mandatory, or prompted.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
The default response feature allows you to prepopulate a form with the selected response. This can reduce the time required to complete the form.
The Responses fields will be automatically populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
From the first dropdown menu, select a Global Rating Scale.
Enter Item Text if needed.
Elentra can optionally auto-populate the Item Text based on the selected scale. If you do not see Item Text, your organization may require some additional configuration by a developer.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted.
The Responses fields will be auto-populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed except by a developer.
When the form is complete, a green bar will tell you the form can be published.
Click 'Publish' to make your template available for use.
Once a form template has been published, you can rearrange the template components for each form; however, you cannot make changes to the scales or contextual variables. To make these changes, copy the form template and create a new version.
Please note that a behind-the-scenes task needs to run before your forms will be published. At some schools this may take up to an hour, so expect a slight delay between when you publish a form and when it is available to be triggered by users.
In some programs residents may be required to log multiple procedures or encounters and only have a subset of those logged entries be assessed. Elentra does support a logbook outside the CBME module and some programs have opted to have residents use both tools to capture the full picture of residents' progress. For more detail on Elentra's logbook, please see here.
A Periodic Performance Assessment (PPA) Form is designed to capture longitudinal, holistic performance trends. At least one item on a PPA form must be linked to an EPA for the form to be initiated on demand by users.
Within a given form, you can only tag curricular objectives (e.g., EPAs or milestones) from the same curriculum version. To ensure that you do not accidentally add an EPA from a different version, we recommend you create the form first and then "Create & Attach" new items to the form.
You need to be a staff:admin, staff:pcoordinator, or faculty:director to access Admin > Assessment & Evaluation to create a form.
Click Admin > Assessment & Evaluation.
Click on Forms from the subtab menu.
Click Add Form.
Provide a form name and select PPA Form from the Form Type dropdown menu; then click Add Form.
Form Title: Form titles are visible to end-users (learners, assessors, etc.) when being initiated on-demand. Use something easily distinguishable from other forms.
Form Description: The form description can be used to store information for administrative purposes, but it is not seen by users completing the form.
Form Type: You cannot change this once you have created a form.
On-Demand Workflow: On-demand workflows enable users to initiate a form on-demand using different workflow options (EPA, Other Assessment Form, Faculty Evaluation and Rotation Evaluation).
EPA: Use for forms tagged to EPAs that you want users to be able to initiate. Contributes to CBME dashboards.
Other Assessment Form: Use for forms that you want users to be able to initiate and complete on demand without tagging to EPAs; or, for tagged forms that you don't want to appear in the EPA list. Only forms with EPAs tagged will contribute to CBME dashboards.
Course: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: Select the CBME Version that this form will be used for. After setting the version, you will only be able to tag EPAs from that version to this form.
By default, the Form Editor will load the most recent CBME version. Under "EPA Version", simply select the appropriate version. Click Save. If you want to build new forms for learners using Version 1, simply change the EPA Version to Version 1 and it will load the appropriate EPAs.
Permissions: Authorship permissions give other users access to edit the form. You can add individual authors or give permission to all administrators in a selected program or organization. It is highly recommended that you assign course/program-level permissions to all of your forms, as some filters rely on this setting.
To add individual, program, or organization permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organizations to the permissions list as needed.
In order to use the "Programs" filter in the form bank and when learners initiate assessments/evaluations, you need to add Program-level permissions to each form.
Select the relevant contextual variables for this form by clicking on the checkbox. Adjust which contextual variable responses should be included by clicking on the gray badge. This allows you to deselect unneeded contextual variable responses which can make the form faster to complete for assessors.
If you want to include an Entrustment Rating on the form, click the checkbox. Select an entrustment rating scale from the dropdown menu. Note that the responses will be configured based on the scale you select. It is also possible that the Item Text will be auto-populated based on the scale you select.
For the optional Entrustment Rating, set requirements for comments, noting that if you select Prompted comments you should also check off which responses are prompted in the Prompt column. If you use this option and a person completing the form selects one of the checked responses, they will be required to enter a comment. Additionally, if the form is part of a distribution you'll be able to define how prompted responses should be addressed (e.g. send an email to the program coordinator whenever anyone chooses one of those response options).
The default Feedback and Concerns items will be added when the form is published.
Add form items by clicking 'Add Items', or click the down arrow for more options.
'Add Free Text' will allow you to add an instruction box.
If you add free text, remember to click Save in the top right of the free text entry area. Any free text entered will display to people using the form.
'Add Curriculum Tag Set' should not be used.
To create and add a new item, click the appropriate button.
Select the Item Type and add any item responses, if applicable.
Tag Curriculum Tags to your newly created item.
In the example below, because you are using a form that is mapped to "Version 2", the curriculum tag sets will be locked to "Version 2". This will ensure that you do not accidentally tag an EPA from a different version.
At least one item added to the PPA form must be linked to an EPA in order for the form to be initiated on-demand by users.
After you have added items to your form you may download a PDF, and preview or copy the form as needed.
Save your form to return to it later, or if the form is complete, click Publish. You will see a blue message confirming that the form is published. Unlike form templates, which require a behind-the-scenes task to be published, the form will be available immediately.
PPA forms can also be scheduled for distribution through the Assessment and Evaluation module.
My PPA or Rubric Form is not displaying a publish button. Why is this happening?
In order for a PPA or Rubric form to be published, you must have at least one item that is mapped to part of your program's "EPA tree". The "Publish" button will only appear after you have saved an item tagged to one or more EPAs, or to one or more milestones within an EPA.
Is it a requirement to publish PPA and Rubric forms?
You only need to publish PPAs and Rubric forms if you wish to leverage the EPA/Milestones tagging functionality in the various CBME dashboards and reporting. You are still able to use PPAs and Rubric forms without tagging EPAs or milestones if you only need to distribute them, or select them using the "Other Assessment" trigger workflow. If you want any of the standard CBME items, such as the Entrustment Item, Contextual Variables, or the CBME Concerns rubric, you must tag and publish the form. Keep in mind that the assessment plan builder only supports forms that have the standard entrustment item on them; that is, only published PPA/Rubric forms.
Do I need to publish a PPA before being able to attach it to a distribution?
No, the distribution wizard does not require you to publish the form before being able to attach it to a distribution. As long as you do not want to tag EPAs or Milestones, or have the form be reported in the CBME Dashboards, then you do not need to publish it.
Updated in ME 1.25!
The Assessment Plan Builder now supports more complicated CV requirements by allowing you to define dependencies between CV responses. This feature enhancement must be enabled by a database setting (cbme_assessment_plan_form_dependencies).
The Assessment Plan Builder does not currently support Smart Tag Forms. If a course/program relies heavily on Smart Tag Forms, the assessment plan builder (and hence the program dashboard) may be of limited use.
The assessment plan builder allows you to specify minimum requirements for assessment forms on a per-EPA basis. These plans are leveraged to generate resident progress reports in the CBME Program Dashboard.
Currently, the assessment plan builder supports Supervisor Forms, Field Notes, Procedure Forms, PPAs (with global entrustment item), and Rubric Forms (with global entrustment item). These forms must have an item mapped to an EPA for them to show up on the Assessment Plan. As of ME 1.18, you are now able to build plans for forms that were deleted or retired.
The assessment plan builder does not currently support PPA or Rubric forms that do not have a global entrustment item added.
You must be in an administrative role and have access to a specific program to use the Assessment Plan Builder.
Navigate to Admin > Manage Programs/Courses.
Beside the appropriate program/course name, click the gear icon and click "CBME".
Click the Assessment Plans tab.
Click "Add Assessment Plan".
When you click “Add Assessment Plan” you are creating a container for all EPA-specific assessment plans in your curriculum. Each assessment plan container is scoped to a single curriculum version (i.e., if you have multiple versions of your EPAs, you can create different assessment plans for different EPA versions).
To add an assessment plan, select the “Version” and create a title for your assessment plan container. Adding a description is optional.
Click “Save Plan.” You will now be redirected to the new assessment plan container. You can use one container for all of your EPAs within a version.
The assessment plan container will load all EPAs for the selected curriculum version. You can use a free-text search to find a particular EPA, or scroll down to the EPA of interest.
Each EPA has a circular icon that indicates the EPA plan status.
A green checkmark indicates that an assessment plan has been published for this EPA.
A grey circle indicates that no assessment plans have been started for this EPA.
An orange exclamation mark indicates that an assessment plan has been saved in draft mode for this EPA. Changes may need to be made before publishing.
Click on the EPA you wish to add an assessment plan for.
Add a title and an optional description. These are not visible to other users at this time and are for admin purposes only.
Click on "Assessment Tools" to load all tools that have been tagged to this EPA.
Date ranges are listed for tools that have been deleted or retired.
A single creation date is listed for active tools.
Select the tool(s) you wish to include in this assessment plan. You can add one plan for each tool, or optionally combine the requirements across tools.
Note that deleted and retired tools are only listed when at least one assessment has been completed with them.
The combine tools feature allows you to combine multiple tools/forms within the same assessment plan, as long as there are shared contextual variables and the entrustment question is the same. You can then set the plan requirements for the shared variables & scale, and the system will use assessments from all of the selected tools to feed into the dashboard. For example, if you require 4 assessments to be completed at "meets expectations" or above, and it can be either a Field Note or a Supervisor form, the combine tools feature is an easy way to do this.
To combine tools:
Click "Assessment Tools"
Select each tool that you wish to combine by clicking the checkbox at the top right of its card
Click "Combine Tools"
Enter the plan requirements, as outlined below
Please note: For procedure forms, a completion of any of the procedures related to the selected form (built from the same template) will count towards meeting the "minimum number of assessments" and "minimum number of assessors" requirements. You may define additional requirements using the "Procedure" variable when your assessment plans require it. Otherwise, you do not need to select any specific procedures.
Minimum number of assessments: Enter the minimum number of assessments required for this EPA using this tool. This number is linked to the global entrustment rating, which means that completed assessments must be equal to or higher than the selected rating scale response in order to fulfill the plan. This is a mandatory field.
Rating scale responses: Select the minimum level of entrustment or supervision required for this EPA. The number entered in the previous field is linked to this response; for example, an EPA might require 3 assessments scored at "Almost" or above. This is a mandatory field.
Minimum number of assessors: Enter the minimum number of assessors required to complete assessments on this EPA. This is a mandatory field.
Contextual Variable Types: Choose how you wish to track contextual variable responses for this form. You can select multiple contextual variable types as needed (e.g., ‘spread’ and ‘specific’ requirements) within the same assessment plan, whether a single-form or combined-form plan. For example, on Form A you might use "Specific" and "Group - Spread" together: the resident must see "cystic fibrosis" (specific) and any 4 of the other diagnoses (group - spread).
See below for detailed descriptions of each contextual variable type.
Contextual Variables: Select which contextual variables you need to track for this form. Only the contextual variable categories that are on the selected form will be loaded in this dropdown. Depending on the contextual variable type you selected, you will have different options appear to track the responses.
Combine Tools: See notes above.
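To make these mandatory fields concrete, the check performed against a plan might be sketched as follows; the data shape and function here are illustrative assumptions for explanation, not Elentra's actual implementation:

```python
# Illustrative sketch only -- not Elentra's actual code.
# Each completed assessment is modelled as (assessor, rating index),
# where a higher index on the global entrustment scale means more entrusted.
assessments = [
    ("Dr. A", 4),
    ("Dr. A", 3),
    ("Dr. B", 4),
    ("Dr. C", 5),
]

def meets_plan(assessments, min_assessments, min_rating, min_assessors):
    """Check the three mandatory requirements for one EPA/tool plan."""
    # Only assessments at or above the selected rating scale response count.
    qualifying = [(who, r) for who, r in assessments if r >= min_rating]
    assessors = {who for who, _ in qualifying}
    return len(qualifying) >= min_assessments and len(assessors) >= min_assessors

# e.g. "3 assessments at rating 4 ('Almost') or above, from at least 2 assessors"
print(meets_plan(assessments, min_assessments=3, min_rating=4, min_assessors=2))  # True
```

With the combine-tools feature, the same check would simply run over the pooled assessments from all of the combined forms.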
The assessment plan builder leverages the grouping functionality within contextual variables. Within a given contextual variable, you can create groups of responses. There are four different ways to track contextual variable responses with the assessment plan builder: spread, specific, group (spread), and group (specific).
To create Contextual Variable Groups you need to be in Admin > Manage Courses > CBME > CV Responses.
The Spread function allows you to check off a selection of contextual variable responses and indicate how many unique responses are required from that list. For example, you might require that any 4 unique responses from the selected list be assessed at least once to meet the plan.
The Specific function allows you to check off a selection of contextual variable responses and indicate how many times each of the selected responses need to be assessed to meet the plan.
The Group: Specific function allows you to check off a selection of contextual variable responses within a contextual variable group and indicate how many times each of the selected responses need to be assessed to meet the plan. You can select responses from multiple groups within the same contextual variable.
The Group: Spread function allows you to select contextual variable groups and indicate how many discrete/unique responses from the group need to be assessed to meet the plan. You can add each response from the group individually as needed. The number of responses required must be equal to or less than the number of CV responses in the group. The learner needs to have a CV response present in their assessments just once to start meeting the requirement.
For example, if you set the requirement to 4, the learner would be required to collect 4 unique contextual variable responses from the "Complex" group (for example: genetic syndrome, developmental delay, neurologic problem, AND hematology/oncology problem).
You can add multiple groups from within the same contextual variable.
As noted above, you can also use multiple contextual variable types (e.g., ‘spread’ and ‘specific’ requirements) within the same assessment plan, whether a single-form or combined-form plan.
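The tracking types above can be sketched in the same spirit; the response names, group contents, and data shapes below are hypothetical illustrations, not Elentra's implementation:

```python
# Illustrative sketch only -- not Elentra's actual code.
# CV responses collected across a learner's completed assessments.
collected = ["cystic fibrosis", "asthma", "pneumonia", "asthma", "croup", "bronchiolitis"]

def meets_spread(collected, selected_responses, required_unique):
    """Spread (and Group: Spread): at least N unique responses from the
    selected list, each assessed at least once."""
    return len(set(collected) & set(selected_responses)) >= required_unique

def meets_specific(collected, required_counts):
    """Specific (and Group: Specific): each selected response assessed at
    least its required number of times."""
    return all(collected.count(resp) >= n for resp, n in required_counts.items())

# Hypothetical group of "other diagnoses"
group = ["asthma", "pneumonia", "croup", "bronchiolitis", "otitis media"]
print(meets_spread(collected, group, required_unique=4))   # True: 4 unique seen
print(meets_specific(collected, {"cystic fibrosis": 1}))   # True: seen once
print(meets_specific(collected, {"asthma": 3}))            # False: seen only twice
```

Mixing types within one plan simply means applying both checks, e.g. a specific requirement for "cystic fibrosis" plus a spread requirement over the group.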
You can optionally define dependencies between contextual variables in an assessment plan. This requires a database setting to be enabled (cbme_assessment_plan_form_dependencies).
You can now specify dependencies between contextual variables so that requirements must be collected as a combination.
Add a contextual variable to a plan.
Add a second contextual variable.
You will see both contextual variables display on each other's cards.
Use the "Dependencies" checkbox to have one CV (the one where "Dependencies" is checked) require the other.
After you've checked off one CV to make it a dependency, the corresponding CV box will disappear from the other CV card.
If contextual variables are used as dependencies, the CBME progress dashboard will display their assessment plan requirements grouped together.
Once you have entered all of the contextual variable requirements, you can click "Save Draft" to return to the plan later or "Publish Plan" to finish. Only a published plan will be reflected on the CBE program dashboard.
The curriculum tags connected to an individual learner are stored in a user tree. Especially in the context of programs using a Royal College structure, there can be circumstances where an administrator needs to reset a user tree. For example,
if learner stages weren't correctly set before a curriculum was versioned,
if learners were not set as CBME enabled on a course enrolment tab before a curriculum was versioned,
if a competency committee decides to modify the version of curriculum a learner needs to complete for a specific stage.
For existing schools using CBE and a Royal College curriculum framework, the option to modify a User Tree through the user interface replaces the manual adjustments developers previously had to make to a learner's curriculum version if a mistake was made in promoting them or otherwise.
For schools using CBE who do not use an RC curriculum framework: if you version your curriculum, you will need to manually reset the user trees for all learners to move them forward to the new curriculum version.
To effectively modify a user tree, an administrator needs to know:
which learner(s) to modify
which curriculum version is applicable to the learner
in the context of Royal College programs, which curriculum version is applicable to which stage of training
After an administrator has reset a user tree, the version badge on a learner dashboard may take some time to update. It relies on a behind-the-scenes task which at most schools runs once a night; if you wish to see immediate changes to the version badge on the learner dashboard, you'll need a developer to run the cbme-learner-dashboard-statistics-generator cron job for you.
To reset a user tree:
Navigate to the CBME tab of a program.
Click Configure CBME.
From the Actions dropdown, select Modify User-Tree Versions.
Select a learner (or multiple if applicable).
The Tree ID will be pre-populated based on the course/program you are working in. This is presented for information only.
Under Reset Mode, make a selection.
Override user tree(s) with a specific version
Use this option if your curriculum framework does not include stages of progress and you need to assign a complete new curriculum version to a user.
For most organizations using their own curriculum framework (not a Royal College structure), learners are locked onto a full version of a curriculum and DO NOT automatically get pushed to a new version once one is published.
Click Next.
Select a curriculum version.
Click Set user tree version.
You will get a success message and see a list of users who were migrated.
Click Reset another tree or click on another tab to move to another task.
Build useable tree from versionable root
Use this option if you use a Royal College curriculum framework and need to assign different curriculum versions to different learner stages.
For each version and each stage, indicate if the version should be included or excluded in the user tree. For example, in a Royal College framework you could indicate that stages D and F should be on Version 1 and stages C and P should be on Version 2.
Click Next.
Click to include or exclude each version within each stage (for RC programs).
Click Submit branch modifiers.
You will get a success message and a view of the resulting user tree.
Click Reset another tree or click on another tab to move to another task.
These are instructions to have a developer set a learner stage.
When publishing a new curriculum version within the CBME module, learners will automatically be updated to the new curriculum version (i.e., EPAs marked as “replaced” or “changing”) for all of their upcoming stages of training. There may be times when you would like certain learners to have stages from a specific version. This guide will instruct you on how to properly update a learner or series of learners to have stages from a specific version.
In order to update the learner(s) objectives in a timely manner there are a few things to gather before you begin the process:
1. Assemble a list of proxy_ids for all of the learners that you wish to update. This process is done on a per-course basis, so make sure that all of the learner proxy_ids that you compile belong to the same course.
2. Make note of the stages that you wish to be updated. The script requires the stage letter in order to know which stages to update, so make a list of the stage letters. For example, if you are updating a learner to have an old version of Transition to Discipline (D) then you will need to note D as the stage you are updating.
3. The final thing you will need is the cbme_objective_tree_version_id for the version that you wish to update to. For example, if you are updating course_id 123 to version 2 for a stage, look up the cbme_objective_tree_version_id for course 123, version 2 in the cbme_objective_tree_versions table. You will also need the cbme_objective_tree_version_id for the version the learner is currently on, because the script requires you to set the version for every stage available to the learner.
4. Access to the database
5. SSH access to your production environment
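The database work described in this guide — looking up a cbme_objective_tree_version_id, and deleting aggregate rows for each learner — can be sketched as follows. This uses an in-memory SQLite mock with only the columns named in the text; the production Elentra schema (MySQL) has additional columns and may differ:

```python
import sqlite3

# Mock only the columns this guide mentions -- not the full Elentra schema.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE cbme_objective_tree_versions (
    cbme_objective_tree_version_id INTEGER PRIMARY KEY,
    course_id INTEGER,
    version INTEGER
);
CREATE TABLE cbme_objective_tree_aggregates (
    tree_viewer_value INTEGER,
    tree_viewer_type TEXT,
    course_id INTEGER
);
INSERT INTO cbme_objective_tree_versions VALUES (10, 123, 1), (20, 123, 2);
INSERT INTO cbme_objective_tree_aggregates VALUES
    (1111, 'proxy_id', 123), (2222, 'proxy_id', 123), (9999, 'proxy_id', 456);
""")

# Look up the cbme_objective_tree_version_id for course 123, version 2.
(new_version_id,) = db.execute(
    "SELECT cbme_objective_tree_version_id FROM cbme_objective_tree_versions"
    " WHERE course_id = ? AND version = ?", (123, 2)
).fetchone()
print(new_version_id)  # 20

# Delete each compiled learner's aggregate rows for the course being updated.
proxy_ids = [1111, 2222, 3333, 4444]
db.executemany(
    "DELETE FROM cbme_objective_tree_aggregates"
    " WHERE tree_viewer_value = ? AND tree_viewer_type = 'proxy_id' AND course_id = ?",
    [(p, 123) for p in proxy_ids],
)
remaining = db.execute("SELECT COUNT(*) FROM cbme_objective_tree_aggregates").fetchone()[0]
print(remaining)  # 1 -- the row for the other course (456) is untouched
```

A proxy_id with no aggregate rows (e.g. 3333 above) simply matches nothing, so running the delete for every compiled learner is safe.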
Updating the learner stages requires a developer tool to be run from developers/tools/cbme_migration. You must have SSH access to your production server in order to complete these steps. Please Note: It’s recommended that you go through the following steps in a test/staging/development environment first so that you can ensure that the script updated the learners properly.
Steps:
1. Open up your database client and open the cbme_objective_tree_aggregates table.
2. For every learner proxy_id that you compiled ahead of time, delete their cbme_objective_tree_aggregates records for the course being updated. Select all of the rows where tree_viewer_value is the learner's proxy_id, tree_viewer_type is “proxy_id”, and course_id matches the course you are using, then DELETE them from the table. Repeat until the aggregates for every learner in the list have been deleted.
3. Now that the aggregates are deleted, we can update the learners’ stages using the script. SSH into your server and navigate to the following directory: /var/www/vhosts/your installation name here/developers/tools/cbme_migration
4. Once in that directory, execute the reset-learner-tree-version.php script. Tip: running php reset-learner-tree-version.php --usage brings up a help dialogue describing all of the available options, including the multiple modes the script can run in. For this scenario, use “specify” mode, since we want to specify which version of stages the learners will receive. All stages must be listed in the --stages parameter so that the script updates each of them to the correct version.
As an example, if your data is this:
organisation_id = 1
course_id = 123
proxy_ids = 1111,2222,3333,4444
stages to update = C,P
current version id = 10
new version id = 20
Run the script in “specify” mode with those values. (The --exclude parameter is not needed in this case.)
Note that all four stages are listed in the --stages parameter even though only C and P are being updated: the script requires every stage to be specified so that each can be set to the correct version. Since D and F are not changing, they are set to the original version (10) in the script parameters. Each stage corresponds positionally with a version in the --versions parameter, so in this case D will get version 10, F version 10, C version 20, and so on.
5. Once the script has run, the last thing to do is to clear the cbme_objective_tree cache for the course. The easiest way to do this is through the interface:
Login to your install as an administrator who has admin access to the course that we are updating
Navigate to Admin > Manage Courses (Programs) > Select the course that you are using > CBME tab > Import CBME Data
Click on the Actions button on the left side above the EPA search bar and select the Edit EPAs option.
Whenever one of these EPAs is updated, the cache is cleared for the course, so all that is required is to click Save on the first EPA listed. You do not need to change any of the text in the EPA; simply saving what is already there will clear the cache.
Once you have cleared the cache for the course then the learners should see the updates on their dashboard. As mentioned before, it's recommended that you do this process in a test environment first so that you can verify the data is the way you would like it before updating production. If you do run into the scenario where you updated a learner to the wrong version then you can always repeat this process and update them to the correct version.
The easiest way to verify that the learners are on the correct version is to log in as some of the learners that were updated and compare their dashboards to the version they were set to. There is usually a visible difference between one version and the next, whether in the EPA titles or the number of EPAs.
A field note form template is used to give learners narrative feedback about their performance.
When you create a field note form template and publish it, the system automatically looks at the EPAs and contextual variables selected and generates the appropriate number of forms.
Ensure you are logged in as a staff:admin user, or as a Program Coordinator or Program Director affiliated with a program.
Navigate to Admin > Assessment & Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form template name and select the form type from the dropdown menu. Select ‘Field Note Form.’
If you have permission to access multiple programs, use the dropdown menu to select the appropriate program for this form. This option will only show up if you have access to multiple programs.
Click 'Add Form'.
You will be taken to the field note form template build page.
Template Title: This is the title of the form and will be seen by users.
Description: The form description can be used to store information for administrative purposes, but is not seen by users completing the form.
Form Type: This was set in the previous step and cannot be edited here.
Course/Program: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Permissions: Anyone added under permissions will have access to edit the form before it is in use and use the form if they are setting up a distribution. You may wish to include a program in the permissions field so that you can filter by this form type later on. To add individual, program, or organisation permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list. You can add multiple individuals, programs, and organisations to the permissions list as needed.
Include Instructions: Add additional text at the beginning of the form by clicking the small tick box beside ‘Include Instructions.’ This will open a rich text editor where you can enter text, images, hyperlinks, etc. This information will display to users when they complete forms published from this blueprint.
Specify which EPAs can be assessed using forms generated from this template.
All EPAs assigned to a course are included on the template by default.
To remove EPAs, click on the small 'x' to the left of the EPA code.
You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click the grey badge beside an EPA to select or remove specific milestones for forms built from this template.
Click 'Save'.
If you want all EPAs to have the same available contextual variables, leave all EPAs checked off. If you’d rather specify which contextual variables apply to which EPAs, simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
Select which contextual variables you want to include with which EPAs by checking and unchecking the tick boxes.
You can remove specific contextual variable responses by clicking on the grey button beside a contextual variable.
For convenience, you can also use ‘Check All’ and ‘Uncheck All’.
When you modify which contextual variable response options will be available on a template, the number in the grey badge will show how many responses have been included out of the total possible responses.
To allow users to select multiple CV responses when completing a form, check the multi-select box. If selected, the item created on the form will be a drop-down multiple-response type instead of a single-response type.
When you have made the required changes, click the blue ‘Save and Next’ button.
You may only select between 1 and 6 contextual variables per EPA per form.
All field note form templates include a Continue and Consider section in which faculty can record comments to provide feedback to learners. These sections cannot be edited in the Field Note Form Template.
From the first dropdown menu, select a Global Rating Scale.
Enter Item Text if needed.
Elentra allows organizations to optionally automatically populate the Item Text based on the selected scale. If you do not see Item Text prepopulated and you would like to, you'll need to speak to a developer about making that change.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
The Responses fields will be auto-populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed except by a developer.
Click 'Publish' to make your template available for use.
Once a form template has been published, forms created from it will live on the resident dashboard and can no longer be edited. The number of forms that will be created from a template depends on the number of EPAs assigned to the template.
Learners and their affiliated faculty and program administrators can track assessment form completion by navigating to the learner's CBME dashboard.
Click on the Assessment and Evaluation badge at the top of the page. Click the My Learners tab and click on CBME Dashboard below the relevant learner name.
From the learner's CBME dashboard click on Assessments. Scroll down past the Filter Options until you see a set of tabs including Completed, In Progress, Pending, and Deleted. Choose the appropriate tab to review assessment form completion.
Click on the Assessment and Evaluation badge at the top of the page. Click the My Learners tab and click on Assessments below the relevant learner name.
These pages provide information on tasks triggered by faculty and learners as well as tasks assigned through distributions, and provide access to some reporting. For more information see the Reviewing Progress > Assessments Page help section.
PAs can view a faculty member's tasks from the Assessment and Evaluation tab. It works almost the same as the learner's assessment page but is accessed from the Faculty tab.
When a PA sets up assessment and evaluation tasks to be completed via distributions, progress can quickly be viewed via the Assessment and Evaluation module.
Setting CBE status (via the enrolment tab or directly in the database) is a prerequisite for Royal College programs that use curriculum versioning and want residents to move to a new curriculum version when they begin a new stage of training.
Managing learner levels and CBE status on the Enrolment tab of a course is optional. It can be turned on or off by a developer through a setting in the database (learner_levels_enabled).
Learner level options (e.g., PGY 1, PGY 2, etc.) can also be adjusted in the database by a developer.
Users who have access to Admin > Manage Courses will be able to manage learner levels and CBE status through the Course Enrolment tab.
Learner Levels and CBE status can optionally be used when configuring distributions in the Assessment & Evaluation module.
Learner levels do not automatically adjust when a new curriculum period is added. For each new course enrolment, learner levels and CBE status should be defined (if being used).
Navigate to Admin > Manage Courses.
Select the appropriate course.
Click the Enrolment tab.
You will see a list of learners. Use the curriculum period switcher on the right if needed.
For each learner, select a learner level from the drop down.
After a learner level has been defined you can set CBE status to Yes or No.
Please note that if you store your learner levels in another system and pull them into Elentra, making a change in Elentra will not automatically update the data in your other system.
General information about cohorts/course lists can be found here.
For (P)GME programs using CBME, residents should be stored in course lists in order to easily add them as the audience of a course. Course lists are managed through Admin > Manage Cohorts. We recommend that you create a course list for each academic year and populate the course list with all learners, CBME and non-CBME, who will participate in that program for the specified time.
If you are syncing your users with another source, your course lists may be automatically populated. It will depend on the script written by developers to connect Elentra to your other source of user information.
When you create courses, you can set the Course List as the audience for the course. Learners must be enrolled in a current curriculum period in order to trigger forms and have forms triggered on them.
From the Dashboard, faculty click the green 'Start Assessment/Evaluation' button on the right side of the screen.
First, faculty select an On-Demand Workflow which will dictate which forms become available to select. These options will only display if an organisation has form work-flows configured. If None is the only available option, select None and continue.
After selecting an on-demand workflow, the choices a user has will depend on the workflow they are completing. In this example, we'll complete an EPA form.
Next faculty select a learner. They can begin to type a name to limit the list of options. When they mouse over a name, they can see the learner's name and photo (if uploaded).
Set a date of encounter.
If the learner is enrolled in two programs the faculty will have to specify a program.
Next faculty select an EPA.
For a reminder on what is included in a specific EPA the black question mark provides a quick link to the EPA Encyclopedia (see above).
Users can easily navigate the list of EPAs by applying preset filters including Current Stage EPAs, Current Rotation EPAs, and Priority EPAs. Click on a filter title to apply it.
After an EPA is selected, the available assessment tools will be displayed. Users can search the tools by beginning to type in the search bar. Note the small clock in the top right of each tool card: this is an estimate of how long the form will take to complete, based on the experience of other users.
Faculty can click 'Preview This Form' to view the form and ensure it is the one they want or they can click 'Begin Assessment' on any tool to begin.
This feature allows residents to cognitively situate their assessors by adding an optional note (a "cue") to any assessment that will be emailed (complete and confirm via email; email blank form; self-assessment, then email blank form). It can be used to remind the assessor about specific case details, provide a focus for assessment, or note anything else the resident feels the assessor should know before completing the assessment.
The cue is optional; if provided, it will appear at the top of the assessment for the assessor, and it will stay attached to the assessment and be visible from the CBME dashboard for reference. The cue can also be seen on the completed assessment by both the resident and the faculty member.
For “Email blank form,” the cue modal will pop up from the Trigger Assessment page after “Send Assessment” has been clicked.
For “Complete and Confirm via Email” and “Self-assessment, then email blank form”, the cue modal will pop up after clicking “Submit and notify attending by email” at the bottom of the form.
PAs can enter completed assessment forms on behalf of assessors. This allows faculty to complete a pen and paper version of a form and have the data entered into Elentra.
To enter a completed form on someone's behalf:
Navigate to Admin > Assessment & Evaluation.
Click on the green 'Record Assessment' button below the orange Assessment Tasks heading.
Select a resident (you will need to know the curriculum period the learner is in) and assessor from the searchable dropdown menus.
Select a CBME Version if necessary.
Select a Date of Encounter (i.e., the day the form was completed).
Select an EPA as you would to initiate a form. Filters and search are available.
Search for the appropriate form. You can preview the form to make sure it is the one you want or click 'Begin Assessment' to start a form.
You will be submitting the assessment on behalf of the selected assessor. There is a reminder of the selected assessor displayed at the top of the form in a yellow information bar.
Complete the form and click Submit.
You will get a green success message and be returned to the assessment entry screen to complete another form if needed.
Program Administrators and Faculty Directors have the ability to get an overview of others' task completion status from Admin > Assessment and Evaluation.
Navigate to Admin > Assessment and Evaluation.
Filter by task delivery type as needed.
View Outstanding, Upcoming, or Deleted tasks.
Search any list for specific forms or owners. Note: The 'Owner' is the person who is responsible for completing the form.
Hover over the Targets column to view the targets of the form.
Check the appropriate boxes and then send reminders as needed.
Check the appropriate boxes and then 'Delete Task(s)' as needed.
Click on any form to complete it on behalf of another user.
The users included in the Outstanding task list are the faculty listed as "Associated Faculty" on the program setup tab, faculty who have been included in a distribution in the course, and learners in the audience of a program enrolment (also configured on the program setup tab).
The most commonly used tools for reviewing resident progress in CBME are the CBME Program Dashboard and individual learners' CBME dashboards. Remember that to use the CBME Program Dashboard a program must have built an assessment plan for its EPAs. At this time the CBME Program Dashboard view is only available to staff and faculty and is not visible to learners. Learners continue to use their individual CBME dashboards.
Although the CBME Program Dashboard is enabled by default, it can be disabled for specific programs if they prefer to use only individual CBME Dashboards or are not building assessment plans at this time. You will need a developer's assistance to disable the CBME Program Dashboard for a specific program/course.
The program dashboard leverages information stored in the assessment plan builder to provide an overview of learner progress towards meeting the plan. From within one interface, Course/Program Directors, Program Administrators, and Academic Advisors are able to see the learners in their program to whom they have access and view their progress towards meeting the assessment plan requirements. There is currently no learner-facing side of this dashboard.
You can watch a recording about the Program Dashboard at (login required).
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the program-level dashboard, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
Once you have entered your assessment plan requirements into the Assessment Plan Builder, the dashboard will populate the EPA counters.
Reminder: If you have multiple curriculum versions, the Program Dashboard will only show the most recent version. View the learner dashboard to view archived assessments collected by a learner under a different curriculum version.
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" icon beside your name.
Click on the "My Learners" tab to view the CBME Program Dashboard.
Multiple tabs provide different progress information (EPAs, Stages, Program Stats). An advanced filter set allows programs to filter the information on each page. These filters persist across tabs.
There are currently three tabs within the Program Dashboard.
Assessments By EPA: Visualizes each learner's progress towards meeting the assessment plans organized by EPA. You can view all learners in one interface.
Stage Completion Status: Visualizes each learner's progress towards meeting all EPAs within a stage. You can view all learners in one interface.
Program Stats: Currently includes a bar graph depicting how many assessments have been completed on each resident, highlighting the proportion that meet the plans.
Click on the information icon in the top right of an EPA tile to view a resident's progress to date in terms of fulfilling the contextual variable and other requirements as defined by the Assessment Plan. Some sample views are posted below.
Note that while requirements are incomplete, you can toggle between viewing all requirements or remaining requirements only.
Select the curriculum period that you wish to view enrolment for. This is typically the current academic year.
If you have access to more than one program, you can toggle between them using this dropdown menu.
Search learner names using free-text search.
You are able to sort the learner list by:
Learner name ("Sort by Name")
Progress towards meeting the plan ("Sort by Progress")
Total number of assessments ("Sort by Total Assessments")
Choose to sort the learner list in ascending or descending order.
Filter the learner list by Post-Graduate Year (PGY) level. You may select more than one PGY.
Filter the EPA list by Royal College Stages. You may select more than one stage.
Filter the EPA list. You may select more than one EPA.
Overall Total: Total number of assessments completed on the learner for EPAs that have an assessment plan entered.
EPA Total: The number directly beneath the EPA Code is the total number of assessments that have been completed on that EPA for that learner, regardless of whether or not they met the assessment plan requirements.
Requirements Total: The fraction indicates the number of completed assessments that met the assessment plan requirements over the number of assessments required.
The resident progress dashboard is meant to give a high-level overview of your learners' progress towards meeting your assessment plans. The decision to mark EPA progress as "Approved" is made solely at the discretion of the Competence Committee.
Red: No Progress. Indicates that the learner is in that stage, but:
has not been assessed on that EPA, OR
has been assessed but none of the assessments meet plan requirements
Yellow: In Progress < 50%. Indicates that the learner has been assessed on the EPA, but is currently meeting less than 50% of the requirements
Blue: In Progress > 50%. Indicates that the learner has been assessed on the EPA and is meeting more than 50% of the requirements
Green: Achieved. Indicates that the learner has been assessed on the EPA and is currently meeting the defined assessment plan numbers; however, the progress still needs to be reviewed and approved by the competence committee.
Green: Approved (with checkmark). Indicates that the EPA has been reviewed and approved at the competence committee level.
Grey: All EPAs that are not in the learner's current stage will appear grey, even if they have assessments that count towards the assessment plan.
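The colour rules above can be summarized as a small decision function. This is only an illustrative sketch of the logic as documented here, not Elentra's actual code; the function name, signature, and the behaviour at exactly 50% are assumptions.

```python
def epa_status_colour(in_current_stage, met, required, committee_approved=False):
    """Map a learner's progress on one EPA to a dashboard colour.

    Illustrative sketch only; all names and the exact 50% boundary
    behaviour are assumptions, not Elentra's implementation.
    """
    if not in_current_stage:
        return "grey"         # EPA outside the learner's current stage
    if committee_approved:
        return "green-check"  # reviewed and approved by the committee
    if required and met >= required:
        return "green"        # achieved, pending committee review
    fraction = met / required if required else 0
    if fraction == 0:
        return "red"          # no assessments meet plan requirements
    if fraction < 0.5:
        return "yellow"       # in progress, under 50% of requirements
    return "blue"             # in progress, at or above 50%
```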
The assessment plan builder allows you to specify minimum requirements for assessment forms on a per-EPA basis. When you enter an assessment plan, you enter the following information for each form or group of combined forms:
Minimum number of assessments, with a global assessment rating equal to or higher than an indicated value
Minimum number of unique assessors
Contextual variable requirements, including a defined number of required responses (or a range of responses), such as a certain number of presentations or complexities
These values are then combined in the system to create the total number of required assessments. It is possible for a learner to have the correct number of required assessments for the global assessment rating without achieving the plan due to not meeting the contextual variable or unique assessor requirements.
For example, if the learner needs 5 assessments at "meets expectations" or above, in addition to being assessed on 5 different clinical presentations, the dashboard will only count the first instance of "acute injury" that "meets expectations", and will only count other clinical presentations towards the total after that. Any additional 'acute injuries' that 'meet expectations' will not be counted, since the learner still needs to be assessed on 4 more unique clinical presentations.
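The counting behaviour in this example can be sketched as follows. This is a simplified, hypothetical illustration of the rule, not Elentra's implementation; it assumes each assessment is a (rating, clinical presentation) pair and ignores the unique assessor requirement.

```python
def count_toward_plan(assessments, min_rating):
    """Count assessments that advance the plan: each must meet the minimum
    global rating AND cover a clinical presentation not yet counted.

    Simplified illustration of the dashboard's counting rule; real plans
    also track unique assessors and other contextual variables.
    """
    seen_presentations = set()
    counted = 0
    for rating, presentation in assessments:
        if rating < min_rating:
            continue  # below the required global assessment rating
        if presentation in seen_presentations:
            continue  # repeated presentation does not advance the plan
        seen_presentations.add(presentation)
        counted += 1
    return counted

# Two "acute injury" cases at the required rating still count only once:
records = [(4, "acute injury"), (5, "acute injury"), (4, "chest pain")]
print(count_toward_plan(records, min_rating=4))  # 2
```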
If a program does not want to use the CBME Program Dashboard a developer can disable it for specific programs. (Developers, the program dashboard can be disabled for a course by adding an entry cbme_progress_dashboard with a value of 0 in the course_settings table.)
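For developers, the change described above might look like the following SQL. This is illustrative only: the exact column names in `course_settings` may differ between Elentra versions, and the course ID shown is hypothetical.

```sql
-- Disable the CBME Program Dashboard for a hypothetical course (ID 42).
-- Column names are assumptions; check your Elentra schema before running.
INSERT INTO course_settings (course_id, setting_name, setting_value)
VALUES (42, 'cbme_progress_dashboard', '0');
```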
The CBME visual summary is optional and is controlled by a database setting (cbme_enable_visual_summary). To enable this for your organization, please speak to a developer.
You can watch a recording about the Visual Summary Dashboards at (login required).
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the visual summary, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
To access the visual summary:
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" task list icon.
Click on the "My Learners" tab.
You will land on the Program Dashboard.
From the tab menu below the filter option, click Visual Summary.
You will be directed to the Visual Summary dashboard.
Toggle between the different dashboards, and/or programs as applicable.
The Normative Assessment Dashboard shows the performance of all residents in a program relative to each other and their training phases, and is meant to be viewed only by Competency Committee members.
The normative dashboard presents summarized data metrics of all the residents in a program relative to each other. The data is presented as both a visual representation (left) and a tabular representation. Users are provided with an option to either view all the residents in a program or selectively view the metrics of residents in one training stage by using the dropdown at the top of the dashboard. This can be useful during Competency Committee meetings when residents are judged on their performance relative to others in their training group.
By default, the normative dashboard filters out residents without any completed EPAs. However, this behavior can be turned off by toggling the checkbox at the top of the dashboard.
The bar chart visualizes the following four metrics individually; to switch between them, select the corresponding radio button above the chart. Each bar represents one resident, and hovering over a bar shows the name of the resident it represents along with the value of the current metric. Clicking on a bar switches to the resident dashboard for a detailed breakdown of all of that resident's assessments.
Total EPAs - This is a count of the total number of EPAs filled out by a resident.
Currently the total EPAs count only considers EPAs that have been collected on valid assessment forms. However, in a future release this count will be updated to also include EPAs completed on archived, expired, or deleted assessment forms.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements in an EPA, or meeting diverse assessor role requirements.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to complete in all the valid EPA forms in a program across the different training phases.
While the achievement rate is a personal indicator showing what proportion of the EPAs a resident attempts are achieved, the progress rate is a global indicator that shows where they are in their residency training program.
Total EPAs vs Achieved EPAs - This chart visualizes two bars for each resident, showing their total EPAs and achieved EPAs next to each other. While this metric is similar to achievement rate, it can offer a better picture of a resident's overall performance, as high achievement rates can occasionally be misleading for residents with a very low number of completed EPAs, all of which are achieved.
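Putting the two rates side by side makes the distinction concrete. This is a hypothetical sketch of the formulas described above; the function and parameter names are illustrative.

```python
def dashboard_rates(achieved, completed, required_total):
    """Compute the two rates described above (illustrative sketch).

    achieved:       EPAs meeting the assessment plan requirements
    completed:      total EPA assessments filled out by the resident
    required_total: EPAs required across all training phases in the program
    """
    achievement_rate = achieved / completed if completed else 0.0
    progress_rate = achieved / required_total if required_total else 0.0
    return achievement_rate, progress_rate

# A resident with only two completed EPAs, both achieved, shows a
# perfect achievement rate but minimal overall progress:
ach, prog = dashboard_rates(achieved=2, completed=2, required_total=200)
print(ach, prog)  # 1.0 0.01
```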
Finally, there is a tabular representation of the same metrics, with the ability to sort the residents by clicking on a column header. By default, residents are sorted by name in ascending order; clicking the "Name" header switches to descending order. Similarly, clicking any other column header sorts the table in ascending order by that metric, and clicking it again reverses the order.
The normative dashboard is linked to the resident dashboard, so to view a particular resident's assessments in detail, users can simply click on the bar corresponding to that resident in the bar chart, or on their row in the table. This will automatically switch the user to the resident dashboard with that resident preselected.
The Resident Metrics Dashboard focuses on individual residents and is designed to be used by Residents and Competency Committee members.
The resident dashboard has a wealth of information that is grouped into different categories for easier comprehension. If you arrived at the resident dashboard by selecting a resident on the normative dashboard, their data is automatically fetched for you. However, if you switched to the resident dashboard manually by clicking on the navigation tabs above, you will need to select a resident from the dropdown in the filter panel at the top of the dashboard. The dropdown contains the list of all the residents in the program with their names and their corresponding progress rates. The names of residents are further grouped by their training stage and then sorted alphabetically for easier access.
The dropdown is also an editable text box and so you can type a resident’s name partially to automatically filter the available options in the dropdown. This makes it easier to search for a particular resident in a program with many residents.
After selecting a resident, users can click on the "GET RECORDS" button to visualize their assessment data. You might notice the small button with the calendar icon on it; this is used to highlight assessment data gained by the resident in a particular time period. For now, ignore it; we will learn more about it further down. The resident dashboard consists of four main subsections. Let us look at each one individually.
This section provides the following summarized metrics of the resident:
Total EPAs observed - Count of all EPAs filled by a resident.
This number may vary from the total EPAs count for the same resident on the normative dashboard as this number also includes assessments filled on expired/archived assessment forms and not just currently valid assessment forms.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to complete in all the valid EPA forms in a program across the different training phases.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements, such as acquiring a rating of 4 or above on a 5-point scale, satisfying specific contextual variable requirements, or meeting diverse assessor role requirements.
To the right of the acquisition metrics is a line chart that shows the resident's weekly EPA acquisition rate for the last six months. This is meant to give a quick, high-level overview of the resident's recent assessment gathering.
This section is meant for quickly looking up a resident's recent performance, with the option to view records in the following ranges: last 10 days, last 25 days, last month, and last 3 months. The chart does not visually distinguish the different EPA types (i.e., EPA-F1 vs EPA-C2); instead, it provides this and other additional information in a pop-up menu that appears when hovering the mouse over a point.
The line chart provides a simple representation of the last "N" assessments filled by the resident, where every EPA is represented as a point, with the oldest record starting on the left. The points are arranged vertically using the 5-point O-Score entrustability scale, with 5 being the highest ("I did not need to be there") and 1 being the lowest ("I had to do"). The better a resident performs on an EPA, the higher the vertical position of the point in the chart. Background lines show the 5 levels instead of labelling the points, to reduce visual clutter, as the levels are easy to understand without additional context.
The final section provides a detailed overview of every single EPA completed by the resident. The entire list of EPAs that residents are required to complete are broken down into four groups based on the training phase during which a resident is supposed to complete them and are numbered accordingly.
Each training phase is presented as a block with the title of the training phase and a label indicating whether the resident has completed the training phase. If a training phase is in progress a completion rate is shown to indicate the number of assessments the resident has achieved in that training phase relative to the total number of required assessments for every EPA in that phase. Each training phase block acts as an accordion and can be expanded or collapsed to view the list of all EPAs in that block.
Although residents generally complete the EPAs of their current training phase before they pick up EPAs of later phases, there are exceptions. Due to various external factors such as their rotation schedules and the nature of medical cases of the patients they attend to, residents can occasionally end up completing EPAs which are not in their current training phase. This means residents can have a non-zero completion rate for training phases that they have not yet started officially. When a training block is open, all the EPAs in that block are arranged sequentially based on the numbering order in a 3-column layout as shown in the following figure.
First Column: EPA ID and textual description of the corresponding medical scenario that the EPA targets.
Second Column: The resident's acquisition metrics for each EPA are provided as four numbers, along with two bullet charts that visualize how far along the resident is in completing that EPA. The first bullet chart (blue) visualizes the observed EPA count relative to the required count, while the second visualizes the achieved EPA count relative to the required count. If an EPA is complete (the required number of EPAs is achieved), the "TO GO" metric changes into a completed checkmark icon.
Third Column: This is a visualization of all assessments filled by the resident for that particular EPA. The information is visualized similarly to the recent EPA chart discussed above. Assessments are arranged chronologically on the X axis, with the oldest to the left, and vertically based on the EPA rating (5-point O-Score entrustability scale), with 5 being the highest (resident managed the situation independently) and 1 being the lowest (assessor had to completely take over). Hover over any point to view additional information about that EPA, such as narrative feedback, in an onscreen popup window.
Finally, two buttons are provided as seen in the bottom left corner of the chart. The first one (book icon) can be clicked to see all the records in a table that can be sorted and filtered by columns. To filter the table start typing in the input box at the top of each column in the table. This will dynamically filter the table as you type. To sort the table by a column simply click on the column header.
The second button (sliders icon) brings up dropdown filter lists that can be used to visually identify a particular record based on patient demographics or other contextual variables such as "Case Complexity" or "Clinical Presentation". For example, if a user wanted to see which records were for senior patients, they could select the 'Senior' option from the dropdown list, and the corresponding points (observation scores) would turn light red.
This is a common feature across the dashboard that highlights all assessments that were filled in a particular time period. To enable it, head over to the filter panel at the top of the dashboard and click on the small button with the calendar icon. This will open a panel where you can set the start date and end date for the time period. You can either type directly into the input box or use the date selector on the calendar below.
Once the start date and end date are set, all assessments that fall within that time period are rendered as diamonds across the dashboard. This provides a way to visually distinguish these EPAs while still viewing them relative to other EPAs filled outside of the selected time period. This feature can be particularly useful during competence committee meetings, which typically happen every three months; the time period can be set to highlight only the EPAs filled by the resident since the last meeting.
Another way to set the time period on the dashboard is by simply clicking on a rotation block in the rotation schedule. This automatically sets the start date and end date of the time period to the dates of the rotation block, and all assessments filled in that rotation block are highlighted.
When enabled, the checkbox provided in the date filter panel hides all EPA levels that do not have any assessments filled in the selected time period. If an entire training phase has no EPAs filled in that period, the whole training phase block is hidden as well. This can be useful to reduce visual clutter on the dashboard and focus on a small subset of EPAs.
From this dashboard, program directors and coordinators can see a breakdown of all assessments completed in a program by a faculty member for informing faculty development.
To promote a learner to a new stage, log in as a Competence Committee member, and click on the Assessment and Evaluation badge at the top of the page (beside the green logout button).
Select the 'My Learners' tab and then click on the CBME Dashboard tab of the learner you want to promote.
On the right hand side of each row you'll see a small grey circle; click on the circle to open a dialogue box where you can enter text and mark an EPA or stage as complete.
You can also click 'View History' in the dialogue box and see any previous comments or decisions regarding learner promotion (or demotion).
Note that there is a database setting you can enable to allow course directors and program coordinators to manage stages (settings: allow_course_director_manage_stage and allow_program_coordinator_manage_stage). Both these settings are off by default and need to be enabled by a developer if you want to use them.
Updated to allow for faculty and learner self-assessment on curriculum tags (Stages tab)
Faculty, program administrators and learners can easily review a learner's progress from the learner's individual CBME dashboard.
Note that the CBME dashboard includes several tabs: Stages, Assessments, Assessments Items, Trends, Comments, and Pins. (Reports can be generated from the Stages page.)
There is another Assessments Dashboard that pulls in additional form information if your organization is using the Assessment and Evaluation module for other forms (e.g. program and faculty evaluations). This page discusses just the CBME Dashboard.
When logged in as a faculty member or program coordinator, click the Assessment & Evaluation badge that appears between the user's name and the logout button in the top right.
Click on 'My Learners' from the tab menu.
Search for a learner as needed, and if you can't find someone, ensure that you are in the correct curriculum period using the dropdown menu on the right.
After finding the correct learner, click on Dashboard under the learner's name to view the learner's progress. Residents automatically land on their CBME Dashboard when they log into Elentra.
From a learner's CBME Dashboard, click through the different tabs to view a range of assessment information. On most tabs, you can apply filters to refine the list of assessments visible. To use this option, select all the appropriate filters (e.g., date, curriculum tag, contextual variable, assessment tool) and then click 'Apply Filters'. Note that the filters a PA or PD applies while viewing one learner will be maintained as they move through different pages.
From the Overview tab, you can see a summary of a learner's progress across curriculum objectives. Depending on the tag set settings in a curriculum framework, tag sets may display as containers of other tags, as a flat list, etc.
Click the down arrow on the right side of a curriculum tag card to see a list of completed assessments for that curriculum tag. Depending on the form, there may be a count of the learner's global assessment ratings, which you can click to access an aggregated report of the learner's performance on that form.
From a learner dashboard, you can optionally allow faculty and learners to indicate progress towards curriculum tags through the use of dashboard scales.
For each tag set, faculty or administrators have the ability to mark a learner's status (e.g., In Progress, Complete). The options available depend on the dashboard scale defined for the curriculum tag set.
Additionally, users, including learners, can set status updates for individual tags listed on the dashboard. These also depend on the dashboard scale defined for the curriculum tag set.
To let learners use this feature, the database settings cbe_dashboard_allow_learner_self_assessment and learner_access_to_status_history are required. Additionally, a developer needs to add cbe_dashboard_allow_learner_self_assessment to the course settings table for each course that wishes to use this feature.
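A developer enabling this might make changes along these lines. This is an illustrative sketch only: the global settings table name, the `course_settings` column names, and the course ID are all assumptions that vary by Elentra version.

```sql
-- Assumed schema: a global settings table plus a per-course
-- course_settings table. Verify names against your Elentra install.
INSERT INTO settings (shortname, value)
VALUES ('cbe_dashboard_allow_learner_self_assessment', '1'),
       ('learner_access_to_status_history', '1');

INSERT INTO course_settings (course_id, setting_name, setting_value)
VALUES (42, 'cbe_dashboard_allow_learner_self_assessment', '1');
```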
By clicking on the status circle, a user can update the status as needed. For example, a learner can indicate that they are "In Progress" for a specific tag.
When a learner self-assesses they will be asked to provide a comment to justify their self-assessment. They can also optionally upload a supporting file.
When a faculty or administrator updates a status they can also provide comments and a file. In both cases they can optionally allow the comment or file to also be viewed by the learner.
Users can view the history of status changes to see previously made comments and uploaded files.
A badge on curriculum tag cards displays the learner’s progress towards meeting the uploaded assessment plans (Achieved/Required) and is visible to all who have access to the dashboard including the learners themselves. These numbers align with those on the CBME Program Dashboard and are updated on the same nightly schedule. Note that you can toggle between viewing all requirements and remaining requirements.
If you are using the rotation scheduler, curriculum tags mapped to a learner's current rotation are outlined in the stage colour and the priority and likelihood of a tag in the learner's specific rotation is shown through the exclamation mark and bar chart (unlikely, likely, very likely).
From the “Overview” tab of the CBME Dashboard, click on the grey down arrow on the right side of the curriculum tag card (“View Assessment Details” tooltip will appear on hover).
This will display the titles and total counts of all forms that have been completed on a resident for that curriculum tag. Simply click on the form title that you wish to view aggregated data for.
This will open a new tab with the aggregated report as well as a trends chart. Within this tab, click on the form title to expand the aggregated data. If there have been multiple versions of the same form, these will aggregate separately, so you will need to click on each form version to view the data. You are also able to navigate directly to individual assessments by clicking on them within the trends chart.
Additionally, from ‘View Assessment Details’, you are able to generate an aggregated report by clicking on the entrustment rating responses. This will generate a report for only those assessments with that specific level of entrustment (e.g., to view aggregated report of all assessments where the resident was entrusted with “Supervision Only” on that particular form).
See a list of all completed assessments and filter as desired.
Toggle between completed, in progress, pending and deleted assessments below the filter settings.
In Progress tasks will include any task initiated using 'Complete and confirm via email', even if the assessor has yet to independently open the task.
Pending tasks here includes all assessments delivered to assessors, whether or not they have expired.
On each individual assessment card, note that the form type and relevant EPA are displayed. You can also click the down arrow on the right to show some form information (e.g., global rating, comments, assessor and assessment method), and click 'View Details' on the left to see the assessment form.
Users can quickly filter for read/unread assessments.
The small grey number beside 'Assessments' in the tab menu represents all unread assessments.
From the regular list view an eye with a slash means an assessment is unread.
There is an option to mark all assessments as read on the right side above the assessment cards.
Marking assessments as read or unread is user specific so learners and faculty will see their own results.
Users can pin an assessment from this screen and learners can give an assessment a "thumbs up" to provide feedback to an assessor about their feedback.
Quickly see a list of all completed assessment items and filter as desired. Click the down arrow on the right to see comments, response descriptor and the name of the assessor (as applicable). Click View Details at the bottom left of each card to access the completed assessment form.
Users can pin an assessment item from this screen.
View trends in learner performance across global assessment ratings. Note the overall tally of ratings in the top right corner of each card. Hover over on a node on the chart to view information about the form name, date and rating; you can also click through to access the actual assessment form.
Quickly access a list of all narrative comments provided to the learner on any complete CBME assessment tool. The tool type and relevant EPA will be displayed below the form name on each comment card.
Users can pin comments from this tab.
Quickly view all assessments, items, or comments that have been pinned. Apply filters as desired, or just scroll down to find the toggle to switch between assessments, items, and comments.
An archived assessment is a CBME assessment that was completed on a resident using an EPA from a previous curriculum version. Assessments are archived only when a program uploads new versions of their EPAs but a resident has already collected assessments in the stage beyond the one they are currently in. In this case, the resident still receives the new EPAs for all future stages; however, all completed assessments from the old versions of those stages/EPAs are “archived”.
From the "Stages" tab, each EPA card displays how many assessments have been archived for that EPA. Expand the card for more detail.
In the example above the learner has 3 "current" assessments, and 3 "archived" assessments from a previous EPA version.
When on the "Assessments" tab, archived assessments are identifiable by locating the grey square beside the form title. Current assessments will not have the grey square beside the form title. The image below shows 3 archived assessments.
Learners, advisors, and competence committee members may wish to record notes about meetings held with or about learners. Elentra's My Meetings tool supports this.
Meeting logs can be accessed from the CBE Learner Dashboard.
More information on Meetings.
Learners or faculty can pin assessments or individual comments to keep them easily accessible for review during meetings. This can help to keep an important piece of feedback or other information front and centre until it has been discussed.
How to pin something
To pin an assessment, simply click on the pushpin icon that appears to the right of the assessment title and details. You'll get a success message that you've pinned the item. In the example to the left, the second and third assessments have been pinned.
To pin a comment, click on the Comments tab from the CBME Dashboard and then click the pushpin beside the desired comment. In the example to the left, the second comment has been pinned.
From the CBME Dashboard, click on Pins at the end of the tabs menu. This will open a screen showing all pinned items. To unpin something, just click the pushpin. You'll see a success message that you've unpinned the item.
The Resident Metrics Dashboard focuses on individual residents and is designed to be used by Residents and Competency Committee members.
The resident dashboard contains a wealth of information, grouped into categories for easier comprehension. If you arrived at the resident dashboard by selecting a resident on the normative dashboard, their data is fetched automatically. However, if you switched to the resident dashboard manually via the navigation tabs above, you will need to select a resident from the drop-down in the filter panel at the top of the dashboard. The drop-down lists all the residents in the program with their names and corresponding progress rates. Residents are grouped by training stage and then sorted alphabetically for easier access.
The drop-down is also an editable text box, so you can type part of a resident's name to automatically filter the available options. This makes it easier to find a particular resident in a program with many residents.
After selecting a resident, click the "GET RECORDS" button to visualize their assessment data. You might notice the small button with the calendar icon: it is used to highlight assessment data collected by the resident in a particular time period, and is covered further down. The resident dashboard consists of several main subsections; let us look at each one individually.
This section provides the following summarized metrics of the resident:
Total EPAs observed - This is a count of the total number of EPAs filled out by a resident.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to achieve for all the valid EPA forms in a program across the different training phases.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements set in the assessment plan, such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements, or meeting diverse assessor role requirements.
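The three metrics above can be sketched in a few lines of code. This is an illustrative sketch only, assuming a simple list of assessments and a 4-or-above achievement threshold; the field names and criteria are assumptions, not Elentra's actual data model (real achievement rules may also involve contextual variables and assessor roles, as described above).

```python
def summarize(assessments, required_total, is_achieved):
    """Compute total observed, progress rate, and achievement rate.

    assessments: completed EPA assessments for one resident (assumed shape).
    required_total: EPAs the resident must achieve across all training phases.
    is_achieved: predicate applying the assessment plan's achievement criteria.
    """
    total_observed = len(assessments)
    achieved = sum(1 for a in assessments if is_achieved(a))
    progress_rate = achieved / required_total if required_total else 0.0
    achievement_rate = achieved / total_observed if total_observed else 0.0
    return total_observed, progress_rate, achievement_rate


# Example: 12 completed assessments, 8 of which were rated 4+ on a 5-point
# scale, against a hypothetical requirement of 40 achieved EPAs overall.
assessments = [{"rating": r} for r in [5, 4, 4, 3, 5, 4, 2, 4, 4, 3, 4, 1]]
total, progress, achievement = summarize(
    assessments,
    required_total=40,
    is_achieved=lambda a: a["rating"] >= 4,
)
print(total, progress, round(achievement, 3))  # 12 0.2 0.667
```

Note how the two rates use different denominators: progress is measured against the program's requirement, while achievement is measured against what the resident has actually completed.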
To the right of the acquisition metrics is a line chart showing the resident's weekly EPA acquisition rate for the last six months. It is meant to give a quick, high-level overview of the resident's recent assessment gathering.
This section is meant for quickly looking up a resident's recent performance, with the option to view records in the following ranges: last 10 days, last 25 days, last month, and last 3 months. The chart does not visually distinguish the different EPA types (i.e., EPA-F1 vs EPA-C2); instead, it provides this and other additional information in a pop-up menu invoked by hovering the mouse over a point.
The line chart provides a simple representation of the last "N" assessments filled for the resident, where every EPA is represented as a point, with the oldest record starting on the left. The points are arranged vertically using the O-Score entrustability scale, with 5 being the highest ("I did not need to be there") and 1 being the lowest ("I had to do"). The better a resident performs on an EPA, the higher the point sits in the chart.
If a resident has assessments that were filled on EPA forms across several different rating scales then each rating scale is provided with its own recent EPA chart as shown in the above image.
The final section provides a detailed overview of every single EPA completed by the resident. The entire list of EPAs that residents are required to complete is broken down into groups based on the training phase during which a resident is supposed to complete them, and the groups are numbered accordingly. With the addition of support for dynamic CBE, the number of training phases can be higher or lower than four; some programs, such as Surgical Foundations, have only two training phases.
Each training phase is presented as a block with the title of the training phase and a label indicating whether the resident has completed the training phase or not. If a training phase is in progress a completion rate is shown to indicate the number of assessments the resident has achieved in that training phase relative to the total number of required assessments for every EPA in that phase. Each training phase block acts as an accordion and can be expanded or collapsed to view the list of all EPAs in that block.
Although residents generally complete the EPAs of their current training phase before picking up EPAs of later phases, there are exceptions. Due to external factors such as rotation schedules and the nature of the medical cases they attend to, residents can occasionally complete EPAs outside their current training phase. This means residents can have a non-zero completion rate for training phases they have not yet officially started.
When a training block is open, all the EPAs in that block are arranged sequentially based on the numbering order in a 3-column layout as shown above.
EPA ID and a textual description of the corresponding medical scenario that the EPA targets.
The resident's acquisition metrics for each EPA are provided as three numbers along with two bullet charts that visualize how far along the resident is in completing that EPA. If an assessment plan is not available for an EPA, the required and achieved numbers default to "N/A" (not available). The first bullet chart (blue) visualizes the observed EPA count relative to the required count, while the second visualizes the achieved EPA count relative to the required count. A green check mark icon indicates that the EPA is complete, either because the resident has achieved the required number of EPAs or because the competence committee has marked the EPA as complete (even if the achieved count is not met); in the scenario shown in the image above, the latter is true. However, if an EPA has not been marked complete and the resident has not met the required achieved EPA count, a "TO GO" metric is shown in place of the check mark icon, as shown above.
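The "TO GO" figure on each EPA card can be thought of as a simple derived value. This is a hedged sketch of that logic under the assumptions described above (the actual implementation is Elentra's own); the function name and parameters are illustrative:

```python
def to_go(required, achieved, marked_complete=False):
    """EPAs still needed for this EPA card.

    Returns 0 when the resident has met the required achieved count, or when
    the competence committee has marked the EPA complete regardless of count.
    """
    if marked_complete:
        return 0
    return max(required - achieved, 0)


print(to_go(10, 7))              # 3 to go
print(to_go(10, 7, True))        # 0 -- committee marked complete early
print(to_go(5, 8))               # 0 -- requirement exceeded
```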
This is a visualization of all assessments filled by the resident for that EPA. The information is visualized like the recent EPA chart discussed above. Assessments are first grouped by the EPA form type and version number. This ensures that assessments of a similar variety are visualized together. Within each chart, assessments are arranged chronologically on the X axis with the oldest to the left and are arranged vertically based on the EPA rating (5-point O-Score Entrustability scale) with 5 being the highest (resident managed the situation independently) and 1 being the lowest (Assessor had to completely take over the situation). However, the rating scale is not always a standard 5-point scale and can change depending on the type of scale used in the assessment plan for a given EPA. For example, in the image shown below the EPA form has a 2-point (yes/no) rating scale.
Further, each point in this chart can be hovered over to view additional information about that assessment, such as narrative feedback, situational context, and the assessor's name and role, in an onscreen popup window as shown above.
Finally, three buttons are provided as seen in the bottom left corner of each chart. The first button (sliders icon) brings up a set of drop-down filter lists that can be used to visually identify a particular record based on patient demographics or other contextual variables such as “Case Complexity’’ or “Clinical Presentation’’. For example, if a user wanted to see which of the records were for “respiratory distress”, they could select that option from the Clinical presentation drop-down list and the corresponding points (observation scores) would turn light red.
The second button (book icon) can be clicked to see all the records in a table, which can be sorted and filtered. To filter the table, start typing in the input box at the top of each column; the table filters dynamically as you type. To sort the table by a column, simply click the column header. For example, in the image below the table is sorted in ascending order by the first column (date).
The third button brings up a popup screen that shows the achievement criteria breakdown for the EPA. This feature has been duplicated from the main CBME dashboard where hovering over the “i” icon in an EPA gives a breakdown of the assessment criteria as shown above.
If a school has enabled the ability to track expired assessments, an optional section is visible at the end of the dashboard which shows a tabular breakdown of all the expired assessments filled against a selected resident.
This is a common feature across all sections of the resident dashboard that highlights all assessments filled in a particular period. To enable it, head to the filter panel at the top of the dashboard and click the small button with the calendar icon. This opens a panel where you can set the start and end dates for the period; you can either type directly into the input box or use the date selector on the calendar above.
Once the start and end dates are set, all assessments that fall in that period are rendered as diamonds across the dashboard. This provides a way to visually distinguish these EPAs while still viewing them relative to other EPAs filled outside the selected period. The feature can be particularly useful for competence committee meetings, which typically happen every three months: the period can be set to highlight only the EPAs filled by the resident since the last meeting.
When enabled, the checkbox in the date filter panel hides all EPA levels that have no assessments filled in the selected period. If an entire training phase has no EPAs filled in the period, the whole training phase block is also hidden. This can be useful to reduce visual clutter on the dashboard and focus on a small subset of EPAs.
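Conceptually, the date-range highlighting amounts to tagging each assessment by whether its date falls inside the selected period. A minimal sketch, assuming a hypothetical record structure (the real dashboard renders matching points as diamonds):

```python
from datetime import date


def highlight_period(assessments, start, end):
    """Tag each assessment with whether it falls within [start, end]."""
    return [
        {**a, "highlighted": start <= a["date"] <= end}
        for a in assessments
    ]


# Illustrative records; e.g., a period covering the months since the
# last competence committee meeting.
records = [
    {"epa": "EPA-F1", "date": date(2024, 1, 10)},
    {"epa": "EPA-C2", "date": date(2024, 4, 2)},
]
tagged = highlight_period(records, date(2024, 3, 1), date(2024, 5, 31))
print([r["highlighted"] for r in tagged])  # [False, True]
```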
Navigate to Admin>Assessment and Evaluation. PAs can switch between viewing Assessments and Evaluations and can navigate through outstanding, upcoming, and deleted tasks. PAs can also filter by delivery type and can send a reminder or delete a task from the A+E dashboard.
To allow faculty other than course/program directors to adjust the status for a learner on an individual curriculum tag, make sure they have been given permission to do so in the relevant .
For users with a Royal College framework, when in a faculty or PA role you can also access a Milestone Report from the Overview tab. More details about the Milestone Report.
To pin an individual assessment item, navigate to the Assessment Items tab of the CBME dashboard. Apply filters as needed and click the pushpin icon beside an assessment item to pin it.
Please note: if you are running PHP 8.0 or higher, you may get an argument error (specifically regarding the implode() function) when running the visual summary cron job. This should be fixed soon. Let us know if you encounter this error and we can provide you with the fix.
The CBME visual summary dashboards are optional and must be enabled through a database setting (cbme_enable_visual_summary).
The Visual Summary Dashboards available in Elentra provide another way to view completed assessment tasks and filter that data to monitor learner progress. The views included are:
Normative Assessment Dashboard
Resident Dashboard
Faculty Development Dashboard
Program Evaluation Dashboard
Program Oversight Dashboard
To leverage the visual summary, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" task list icon.
Click on the "My Learners" tab.
You will land on the Program Dashboard.
From the tab menu below the filter option, click Visual Summary.
You will be directed to the Visual Summary dashboard.
Toggle between the different dashboards, and/or programs as applicable.
Residents have access to the visual summary page only through their Learner Dashboard, where they can click "Visual Summary" from the tab menu.
The permission system within the visual summary page shows most users only a small subset of the dashboards, based on their user role; the remaining dashboards are hidden completely. Only a small set of elevated user types have access to all the dashboards.
Residents: Resident Dashboard (only their own data)
Competency Committee Members/Chairs: Resident Dashboard and Normative Assessment
Course Director: Resident Dashboard, Normative Assessment, Faculty Development, and Program Evaluation
Program Coordinator: Resident Dashboard, Normative Assessment, Faculty Development, and Program Evaluation
Course Directors/Program Coordinators with access to multiple programs: Resident Dashboard, Normative Assessment, Faculty Development, Program Evaluation, and Program Oversight (limited to the programs they have access to)
Medtech Admin: Resident Dashboard, Normative Assessment, Faculty Development, Program Evaluation, and Program Oversight (access to all programs)
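The access rules above can be summarized as a simple role-to-dashboards mapping. The keys and dashboard names below are illustrative shorthand, not actual Elentra identifiers:

```python
# Hypothetical encoding of the visual summary access rules described above.
ROLE_DASHBOARDS = {
    "resident": {"resident"},  # their own data only
    "competency_committee": {"resident", "normative"},
    "course_director": {"resident", "normative", "faculty_dev", "program_eval"},
    "program_coordinator": {"resident", "normative", "faculty_dev", "program_eval"},
    # Multi-program directors/coordinators: oversight limited to their programs.
    "multi_program_admin": {"resident", "normative", "faculty_dev",
                            "program_eval", "program_oversight"},
    # Medtech Admin: oversight across all programs.
    "medtech_admin": {"resident", "normative", "faculty_dev",
                      "program_eval", "program_oversight"},
}


def can_view(role, dashboard):
    """True if the given role may see the given dashboard; unknown roles see nothing."""
    return dashboard in ROLE_DASHBOARDS.get(role, set())


print(can_view("resident", "resident"))   # True
print(can_view("resident", "normative"))  # False
```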
After a course tree is created, learners enrolled in that course will have individual user trees created for them.
At some point you may need to modify a user-tree version so that you can reset a learner to the curriculum tags most relevant to them at a particular time.
From a course/program's CBME tab, click 'Configure CBME'
Under 'Actions,' click 'Modify User-Tree Version'
Select a user
Tree ID will display the user's existing tree id
From the Reset Mode option, pick
Override user tree(s) with a specific version
Build user tree from versionable root
In addition to the CBME Dashboard, users can access a learner's Assessments page to view additional tasks completed on the learner and assigned to the learner. The Assessments page includes tasks completed via distributions and CBME forms initiated on demand.
There is a tool to allow admin users to reopen a completed assessment and edit it as needed. This is a feature that can be turned on or off in the database settings file depending on how your organization wants to use it.
If enabled, staff:admin can access this feature.
Navigate to your Assessment & Evaluation badge (beside own name in the top left).
Click on My Learners and then Assessments under a specific learner's name.
From the list of Tasks Completed on Learner click on a task to view it.
Click the "Reopen Task" button just below the form title. This will set the task to in-progress so that the staff:admin or faculty user can adjust it.
Provide a reason to reopen the task (e.g. was accidentally deleted, was missing data, other).
Click 'Reopen Task'.
Once reopened, a user can complete the task and submit it on behalf of the original assessor, or they can forward the task to the original assessor to update.
Use with caution!
This tool should be used judiciously to ensure that residents are not "gaming" the system or pressuring assessors into changing their assessments to be more favourable.
This dashboard organizes all the EPAs that have been completed by learners in the program by the year they were completed in with the goal of informing program evaluation. There are two main sections in this dashboard. The first section of the dashboard visualizes key metrics across several academic years as shown below.
This graph displays the number of EPAs that have been completed and expired per learner in each year in the program. Users can mouse over the bars to see the number of active residents with assessments in each year.
This stack chart displays the proportion of EPAs in each year that have been rated at each level of entrustment ('I had to do' to 'I didn't need to be there'). Users can mouse over each row for additional details. This chart only considers assessments that have been completed on "Supervisor Forms" that have a standard 5-point "O score" scale.
This graph displays the average number of words contained within the completed EPAs of each year. The length of the feedback has been found to correlate with the quality of feedback and so a higher word count is preferable.
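The word-count metric is a straightforward average over narrative comments. A minimal sketch with illustrative sample feedback (the real dashboard computes this per year from completed EPAs):

```python
def average_words(comments):
    """Average word count across a list of narrative feedback strings."""
    if not comments:
        return 0.0
    return sum(len(c.split()) for c in comments) / len(comments)


feedback = [
    "Managed the airway independently and communicated clearly with the team.",
    "Needed prompting for the differential.",
]
print(average_words(feedback))  # 7.5
```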
This graph visualizes the number of EPA observations submitted per month over multiple years. It is intended to identify increases and decreases in EPA assessments over seasons and years. The X axis of this chart spans across a given academic year from July to June.
The second section of the dashboard shows EPA metrics that contextualize the EPAs that have been completed within a program over a selected academic year.
The summarized metrics are followed by three pie charts that group the assessors by their type, group, and role. Assessor role and group distributions are only available for "internal" assessors. These charts let program evaluators get a better picture of which user groups contribute the largest share of EPA assessments. For example, in programs like Surgical Foundations, where senior residents often assess junior residents, the role "Trainee" might have a higher share compared to "Lecturers" or "Directors".
The “export program data” button available at the top of the dashboard lets users download a detailed CSV export of all assessments completed or expired in the program. This can be used for additional downstream analysis of metrics that are not shown in the dashboard.
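As a starting point for such downstream analysis, the export can be read with any CSV tooling. The sketch below uses Python's standard library on an inline sample; the column names ("assessor", "rating") are assumptions, so check the header row of your actual export:

```python
import csv
import io
from collections import Counter

# Stand-in for a downloaded export file; in practice use open("export.csv").
sample = io.StringIO(
    "assessor,rating\n"
    "Dr. Smith,4\n"
    "Dr. Smith,5\n"
    "Dr. Jones,3\n"
)

# Count completed assessments per assessor.
counts = Counter(row["assessor"] for row in csv.DictReader(sample))
print(counts.most_common())  # [('Dr. Smith', 2), ('Dr. Jones', 1)]
```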
The Normative Assessment Dashboard presents summarized data metrics of all the residents in a program relative to each other. It is meant to be viewed by Competency Committee members to assess a resident against their peers.
The data is presented as both a visual representation (left) and a tabular representation (right). Users have the option to either view all the residents in a program or selectively view the metrics of residents in a particular training stage using the drop-down at the top of the dashboard. This can be useful during Competency Committee meetings when residents are judged on their performance relative to others in their training group.
By default, the normative dashboard filters out residents without any completed EPAs. However, this behaviour can be turned off by toggling the checkbox at the top of the dashboard.
The bar chart visualizes the following four metrics individually; to switch between them, select the corresponding radio button above the chart. Each bar represents one resident, and users can hover over a bar to see the name of the resident it represents and the value of the current metric. Clicking on a bar switches the user to the resident dashboard for a detailed breakdown of all that resident's assessments.
Total EPAs- This is a count of the total number of EPAs filled out by a resident.
Achievement Rate- This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements set in the assessment plan, such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements, or meeting diverse assessor role requirements.
Progress Rate- This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to achieve for all the valid EPA forms in a program across the different training phases.
Total EPAs vs Achieved EPAs- This chart visualizes two bars for each resident showing their total EPAs and achieved EPAs next to each other. While this metric is like achievement rate, it can offer a better picture of a resident’s overall performance as high achievement rates alone can be occasionally misleading for residents with a very low number of completed EPAs when all of them are achieved.
Finally, there is a tabular representation of the same metrics, with the ability to sort the residents by clicking on a column header. By default, residents are sorted by name in ascending order; clicking the "Name" column header switches to descending order. Similarly, clicking any other column header sorts the table in ascending order by that metric, and clicking it again changes the order to descending.
The normative dashboard is linked to the resident dashboard, so to view a particular resident's assessments in detail, users can simply click on the bar corresponding to that resident in the bar chart, or on their row in the table. This automatically switches the user to the resident dashboard with that resident pre-selected.
This dashboard lets you compare metrics among different programs for the purpose of program oversight. The dashboard is mainly intended for users that oversee several programs or other higher-level users like Medical Administrators and Deans.
Users need to first select an academic year. This updates the Program dropdown and shows an alphabetically sorted list of programs with the “EPA count” next to the name of each program. The list only contains programs that have assessments that were completed in the selected year. The user can then multi-select all the programs they need to compare and then click “GET RECORDS”.
This graph displays the number of EPAs that have been completed and expired in each year by program. If your school doesn't track expired assessments the values default to zero for all programs.
This graph displays the number of EPAs that have been completed and expired per learner in each year by program. The dotted line represents the average completed EPA count per learner for all the selected programs.
This stack chart displays the proportion of EPAs in each year that have been rated at each level of entrustment ('I had to do' to 'I didn't need to be there'). Users can mouse over each row for additional details. This chart only considers assessments that have been completed on "Supervisor Forms" that have a standard 5-point "O score" scale.
This graph displays the average number of words contained within the completed EPAs of each year. The dotted line represents the mean value of all the selected programs. The length of the feedback has been found to correlate with the quality of feedback and so a higher word count is preferable.
These are a collection of graphs that visualize the completed and expired count by month through the academic year (spanning from July to June) for each program individually.
This button lets you download the metrics shown in this dashboard as a CSV file for other types of downstream analysis beyond the scope of this dashboard.
This dashboard organizes all of the EPAs that have been completed in a program by the assessor that completed them with the goal of informing faculty development. To get started, users need to first select the academic year in which the assessments were collected and click “GET RECORDS”.
There are three filters available in this dashboard. The first filter lets users select an assessor group to only look at assessments completed by users in that group. This filter can be used to remove or include student assessors. The second filter lets users select a department to only look at assessments completed by assessors from that department. This can be used to remove assessors from external programs. The final filter lets users select a specific assessor for their metrics to be highlighted. Alternatively, users can click on any of the bars in the charts below to select and highlight that assessor.
There are two sections, each showing summarized metrics related to faculty assessments. The first shows the amalgamated metrics for EPAs completed by all assessors in a given academic year; the second shows the metrics for EPAs completed by the selected assessor. In both sections, users can mouse over the EPA Rating visual to see the proportion of EPAs rated at each level of entrustment (only EPAs completed on Supervisor Forms with 5-point O score scales are considered for this metric), and mouse over the Training Stage visual to see the proportion of EPAs completed in each stage of training.
The acquisition metrics panel is followed by four charts, each visualizing a specific metric of an assessor relative to others in the program. The red highlighted bar represents the currently selected assessor/faculty member.
This chart displays the number of EPAs observed by each assessor. Users can mouse-over for each assessor's name and click to highlight that assessor's data. If an assessor is selected, their EPA count is shown in the chart title in red.
This chart displays the percentage of EPAs sent to each assessor that expired before completion. If your school doesn't track assessment expiry metrics, the values in this chart default to zero for all faculty members.
This chart displays the average entrustment score of EPAs completed by each assessor. If an assessor is selected, their average entrustment score is shown in the chart title in red. This chart only considers assessments that have been completed on "Supervisor Forms" that have a standard 5-point "O score" scale.
This chart displays the average number of words per comment included with the EPAs completed by each assessor. If an assessor is selected, their average words per comment metric is shown in the chart title in red.
The charts are followed by two tables that are only visible when a faculty member is selected. The first table displays all the EPAs completed by the selected assessor; it is searchable (click the white box) and sortable (click the column header). The second table displays expired EPAs that were not completed by the selected assessor; it is also searchable and sortable. Both tables can be exported as CSV files.
The dashboard has two different types of exports that can be triggered by using the two buttons at the top of the page.
This provides a CSV export with a list of all faculty members and their related metrics, which can be used for downstream analysis beyond the scope of the dashboard. If a faculty member is selected, the report contains their data alone; to export data for all faculty members, select "All" in the "Assessor" filter.
This button is only visible when a faculty member has been selected. It lets the user export the entire dashboard as a PDF file that can be shared with users who don't have access to the dashboard, prompting a "Save as PDF" popup as shown below.
This is an overview only. For more detailed information on each step of setting up CBE, please go to the sub-topics.
The CB(M)E module in Elentra is optional and is controlled through a database setting (setting: cbme_enabled).
If you are an existing user of CBME (i.e., a Canadian post-graduate program using Royal College curriculum), the migrations included in the software upgrade to ME 1.24 will create your curriculum framework for you, and your existing information (e.g., CV responses, assessment plans) remains unchanged.
You will need to make some small changes to the tag set settings and complete some information for the tag sets in the Royal College Framework (Milestones). More information here.
Using Elentra with learners and faculty requires initial setup including the creation of curriculum layouts and periods, loading of user information, and creation of courses. These tasks are not CBE-specific and are covered elsewhere in our documentation.
Using Elentra specifically for CBE can be broken down into several tasks:
mapping your curriculum,
building forms to use in assessment,
building an assessment plan to facilitate tracking learner progress (optional),
setting up learners and faculty to access forms and complete tasks, and
reviewing and reporting on learner progress.
The following is a high-level overview. Please read the additional pages to learn more about prerequisites for each step.
In most cases central administrative staff will create curriculum frameworks as needed for an organization. Following that, curriculum tags can be uploaded at the organization or course level, depending on the needs of an organization.
In addition to curriculum tags, Elentra also allows you to define contextual variable responses for courses/programs. Contextual variables are additional fields you can add to forms to collect information (e.g., patient age, learning environment, case complexity). Courses can optionally upload this information for use on CBE form templates.
If you plan to use the Elentra procedure form template, per course uploading of procedure attributes will also be necessary.
Whether or not you are operating in the context of Canadian post-graduate medicine, there is a CBME auto-setup tool that must be configured before you can build your own curriculum structure. More information is in the Mapping Curriculum section.
Once the required curriculum tag sets are in place and populated, an institution can begin to use assessment features linked to CBE. A series of form templates can be completed at the course level to facilitate assessment of learners. The existing form templates are:
Supervisor Form Template
Procedure Form Template
Smart Tag Form Template
Field Note Form Template
Using form templates allows Elentra to rely on mapped curriculum to generate and publish multiple forms based on the parameters defined on each form template.
Additional assessment forms can also be used for CBE. These forms are more flexible and allow you to create your own items. These include:
Rubric Forms
Periodic Performance Assessment Forms
Elentra's Assessment Plan Builder allows you to define minimum numbers of assessments learners must collect to demonstrate progress towards their competence. Per curriculum tag (e.g., EPA), administrators can define which tools a learner should collect assessments on, and within those tools, which criteria must be met to determine success (e.g., what global entrustment rating must be met, how many instances of a contextual variable are required, etc.)
Through the Course Contacts section of a Course Setup page, add faculty to the Competence Committee Members list to allow them to review learner performance and make decisions regarding learner progress for all learners in the course/program.
Through the Course Groups tab, build groups to link faculty members to a subset of learners. Faculty assigned as tutors in the course groups will be able to view the CBE program and learner dashboards for their associated learners.
Set up learners as CBE-enabled. This can be done directly in the database with a developer's help, or you can turn on a database setting (learner_levels_enabled) to allow course administrators to define CBE status per learner on the course enrolment tab.
Several views allow for the review of learner progress:
The learner dashboard
Review assessment tasks completed on the learner, with the option to view them individually, in aggregate, or in a trends chart
Note learner progress through dashboard scales (e.g., In Progress, Complete)
Pin comments, items, and forms for quick review
The program dashboard
By default, with CBE enabled, faculty and administrative users will see a CBE program dashboard that displays the progress of multiple learners in a program under the My Learners tab. A course/program must have an assessment plan built to populate this page with data.
The program dashboard can be disabled by a developer on a course-by-course basis if needed.
Visual Summaries
These optional views offer another way to review data collected through CBE assessment tasks.
The Normative Assessment Dashboard shows the performance of all residents in a program relative to one another and to their training phases; it is intended to be viewed only by Competency Committee members.
The Resident Metrics Dashboard focuses on individual residents and is designed to be used by Residents and Competency Committee members.
The Faculty Development Dashboard allows program directors and coordinators to see a breakdown of all assessments completed in a program by a faculty member to inform faculty development.
If your organization will use CBE tools in only one of the many courses learners may be enrolled in, you will likely want a developer to disable CBE for the other courses. This reduces the number of courses that display in the course picker on CBE screens. Developers will need to add an entry per course for cbme_enabled to the course_settings table.
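Because table and column layouts vary between Elentra versions, treat the following as an illustrative sketch only: the course_settings table is named above, but the column names and the example course ID here are assumptions, so verify against your own schema (with a developer) before running anything like this.

```
-- Hypothetical sketch: disable CBE for course 12.
-- 'course_settings' is named in the text above; the column names are assumptions.
INSERT INTO course_settings (course_id, shortname, value)
VALUES (12, 'cbme_enabled', '0');
```

A developer would repeat an entry like this per course that should have CBE turned off.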
There is some configuration required to effectively use the CBE tools.
You may require a developer's help to set up appropriate learner levels for your organization (e.g., PGY 1, PGY 2, PGY 3). These will be used on the Course Enrolment tab to allow you to set learners as CBE-enabled.
Additionally, you should set up course groups to connect faculty with learners if you use academic advisors, coaches, mentors, etc.
Please see additional detail on the following pages.
From the Dashboard, learners have access to a green 'Start Assessment/Evaluation' button on the right side of the screen.
First, learners select an On-Demand Workflow, which dictates which forms become available to select. These options will only display if an organization has form workflows configured.
After selecting an on-demand workflow, the choices a learner has will depend on the workflow they are completing. In this example, we'll complete an EPA form.
Next learners select an assessor. They can begin to type a name to limit the list of options. When they mouse over a faculty name, learners can see the faculty card including their name and photo (if uploaded).
If learners need to add an external assessor (someone who doesn't have an Elentra account), they can click 'Add External Assessor'.
Next, learners set a date of encounter.
Next, learners select an assessment method. Details about each assessment method are provided inline.
Note that an assessor must have a PIN set up for learners to select the first option. For more detail on setting user PINs, see here.
In our example, completing an EPA form, learners next select an EPA to be assessed on.
For a reminder of what is included in a specific EPA, the black question mark provides a quick link to the EPA Encyclopedia.
Users can easily navigate the list of EPAs by applying preset filters including Current Stage EPAs, Current Rotation EPAs, and Priority EPAs. Click on a filter title to apply it. In this example the Priority EPAs filter is being applied.
After an EPA is selected, the available assessment tools will be displayed. Learners can search the tools by beginning to type in the search bar. Note the small clock in the top right of each tool card. This is an estimate of how long the form will take to complete based on the experience of other users.
Learners can click 'Preview This Form' to view the form and ensure it is the one they want or they can click 'Begin Assessment' on any tool to begin.
Depending on the selected method of assessment, learners will either be directed to the assessment form to start it, or the form will be sent directly to the assessor.
Organizations can optionally enable a database setting to add a shortcut icon to learners' CBE dashboards beside each corresponding EPA (setting = cbme_ondemand_start_assessment_shortcut_button). If you'd like this option to be available to your learners, please speak to a developer.
The shortcut icon displays as a small play sign on each EPA card.
Using the shortcut will take the learner to the Start Assessment/Evaluation page with some information already completed.
This page summarizes some of the differences between using Elentra with and without CBE enabled. Features not included in this list (e.g., Exams, Gradebook) are not impacted by the use of CBE.
My Events Calendar (Main dashboard)
Without CBE: Supported; this is the page learners land on when logging in.
With CBE: Supported; learners will have to click to move to their My Events Calendar tab.
On Demand Form Workflows
Without CBE: Supported.
With CBE: Supported, with access to additional form types in the workflows.
Distributions
Without CBE: Supported.
With CBE: Supported, with access to additional form types in distributions.
Assigning Curriculum Tags to Assessment Items
Without CBE: Supported, but lacks longitudinal reporting on learner progress on specific curriculum tags.
With CBE: Supported; dashboards allow for easier reporting on learner progress.
Form Types
Without CBE: Generic Forms, Standard Rotation Evaluation Form, Standard Faculty Evaluation Form.
With CBE: Generic Forms, Standard Rotation Evaluation Form, Standard Faculty Evaluation Form, Rubric Forms, PPA Forms, and Form Templates (Supervisor Form, Procedure Form, Smart Tag Form, Field Note Form). The additional form types can have their data displayed on CBE dashboards.
Reporting on Learner Progress
Without CBE: Learner Reports (can aggregate per-form data).
With CBE: CBE Learner Dashboard, CBE Program Dashboard, Visual Summary Dashboards.
CBE Learner Dashboard
Without CBE: Not supported.
With CBE: Supported; displays assessment data about learner performance on specific curriculum tags. This is the page learners land on when logging in.
CBE Program Dashboard
Without CBE: Not supported.
With CBE: Supported; requires that an assessment plan be created for a course, then displays assessment data about all learners within that course and uses color coding to chart learner progress towards assessment plan requirements.
Visual Summary Dashboards
Without CBE: Not supported.
With CBE: Supported; requires that an assessment plan be created for a course, then provides multiple visual summaries including learner performance, faculty task completion rates, etc.
Assessment Plan Builder
Without CBE: Not supported.
With CBE: Supported; allows administrators to define, per assessment form, the minimum requirements to demonstrate competence for a specific curriculum tag.
Meetings
Without CBE: Supported; learners access meetings from the user icon, faculty from the My Learners view.
With CBE: Supported; learners retain access from the user icon, while faculty must access meetings from the CBE learner dashboard.
Clinical Experience > Rotation Scheduler
Without CBE: Supported.
With CBE: Supported, with the ability to map curriculum tags uploaded through a curriculum framework to rotations and to indicate likelihood and priority of completion on a rotation.
Clinical Experience > Logbook
Without CBE: Supported.
With CBE: Supported.
Portfolio
Without CBE: Supported.
With CBE: Supported.
Curriculum Mapping
Without CBE: Supported; tags added to Elentra outside a curriculum framework cannot be used to generate data on the CBE dashboards.
With CBE: Supported, but tags added through curriculum frameworks and objective trees, while available for most tagging, do not have their relationships reflected in most reporting outside the CBE dashboards.