This is an overview only. For more detailed information on each step of setting up CBME, please go to the sub-topics.
To ensure CBME works correctly, each organisation that has CBME enabled must have a developer or technical administrator set the default_stage_objective setting value in the elentra_me.settings table to the global_lu_objectives.objective_id of the first stage of competence (e.g., Transition to Discipline).
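For developers, a minimal SQL sketch of this change is shown below. The column names in elentra_me.settings (shortname, value) and in global_lu_objectives (objective_name) are assumptions; verify them against your installation's schema before running anything.

```sql
-- Sketch only: column names are assumptions; confirm them against your schema first.

-- 1. Look up the objective_id of the first stage of competence.
SELECT objective_id
FROM global_lu_objectives
WHERE objective_name = 'Transition to Discipline';

-- 2. Store that objective_id as the default_stage_objective setting value
--    (insert the row instead if it does not already exist).
UPDATE elentra_me.settings
SET value = '123'  -- replace 123 with the objective_id returned above
WHERE shortname = 'default_stage_objective';
```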
Using Elentra with learners and faculty requires some general basic setup including the creation of curriculum layouts and periods, loading of user information, and creation of courses. Those tasks are not CBME specific and are covered elsewhere in our documentation.
Beginning to use Elentra specifically for CBME can be broken down into several tasks:
mapping your EPAs,
building form templates,
setting up learners and faculty to access the forms, and,
reviewing and reporting on learner progress.
Mapping your EPAs
Some curriculum mapping tasks will be completed at the organisation level by CBME implementation staff. This includes using Elentra's auto-setup tool and adding required tag sets like the CanMEDS key and enabling competencies.
Next, individual programs import their specific program curriculum, including EPAs, contextual variable responses, and procedure attributes.
Building Form Templates
Once the required curriculum tag sets are in place and populated, an institution can begin to use assessment features linked to CBME. A series of form templates can be completed at the program level to facilitate assessment of learners. The existing form templates are:
Supervisor Form Template
Procedure Form Template
Field Note Form Template
Using these form templates allows Elentra to rely on a program's EPAs, competencies, milestones and contextual variables to generate and publish multiple forms based on the parameters established on each form template.
Additional assessment forms can also be used for CBME. These include:
The Periodic Performance Assessment Form
The Rubric Assessment Form
As long as items added to these forms are linked to an EPA, the form will be triggerable once it is published. Note that these forms allow you to add customized questions or prompts. You can build those items either in advance or during form creation using the Assessment and Evaluation module.
If you want to be able to create additional customized forms and distribute them to learners and/or faculty based on rotation schedules, specific learning events, etc. you must be using additional features of Elentra. For additional information about creating forms, please see information for the Assessment and Evaluation module here.
Managing Faculty and Learners
Through the Groups tab of a course/program page, set up optional Academic Advisor Groups to link faculty members to groups of learners. Academic Advisors (listed as tutors in the course groups) will be able to view the CBME dashboards for all of their associated learners. The labels next to each tutor (Teacher, Tutor, Teacher's Assistant, and Auditor) are only used to differentiate between the different tutors in the same group and provide additional information; they do not change any of the user's permissions.
Through the Course Contacts section of a Course Setup page, add faculty to the Competence Committee Members list to allow them to review learner performance and make decisions regarding learner progress.
Reviewing and Reporting on Learner Progress
The CBME Dashboard is one place to review learner progress and the different forms completed on a learner. An Admin can access the CBME Dashboard of their learners by clicking on My Assessment & Evaluation (located in the top right hand corner between your profile and the Logout button), then by navigating to My Learners. From the CBME Dashboard Stages tab, users can quickly see learner progress in different EPAs and stages and see an overview of completed forms and performance trends on the global assessment ratings on those forms. An aggregated report of learner performance in an EPA is available directly from the Stages tab of the CBME dashboard.
From the CBME dashboard you can also access an Assessments tab, which allows you to apply a variety of filters, search in forms, and view completed, in progress, pending, and deleted forms where the learner is the target of the form.
As learners and faculty review and discuss forms, a variety of additional tools are available including the ability to pin comments, items, and forms, a way to log meeting notes, and the ability for learners to give a "thumbs up" to a completed form.
Additional tabs accessible from the CBME dashboard include Assessment Items (a view of individual items completed on the learner), Trends (visualizations of learner performance), Comments (access to all comments in one place) and Pins (view of anything residents or faculty have pinned for review).
In addition to the CBME dashboard, learner progress can be reviewed and reported from the learner's Assessments page by clicking on Assessments, in the My Learners tab of My Assessment & Evaluation.
This displays the tasks where the learner is the target or the assessor/evaluator. Reports are accessible from this page as well; however, many of these reports only apply to forms managed via distributions, not CBME forms triggered by faculty or learners.
For institutions that have built assessment plans in the Elentra Assessment Plan Builder, faculty and administrative users will also be able to access a CBME program dashboard that displays the progress of all learners in a program from one user interface. (Please note that there is no learner view of the program dashboard at this point.)
To use the CBME Module, nine required curriculum tag sets must be in place. An auto-setup tool will help users create these. Only certain users have permission to complete this step.
Note: If you are a UGME organization using CBME, some of these may be changed; please speak to a developer or the Elentra Core Team as needed.
Navigate to Admin > Manage Course > create a new program (course), and click the CBME tab.
If the required curriculum tag sets aren't yet configured, the system will prompt users to auto-setup the required curriculum tag sets and will populate the CanMEDS Roles, Royal College Stages, and Contextual Variables tags.
Users without permission to use the auto-setup feature will be directed to contact a system administrator with the message below.
After an administrative user has run the auto-setup tool, several tag sets will be created and three of them populated (users can view these via Admin > Manage Curriculum). Unless you are working with a developer, do not change the tags in these tag sets; if you do, you will be prompted to run the auto-setup feature again.
If an administrative user has completed the auto-setup step, but not imported a standard list of key and enabling competencies, they will be prompted to do so if they try to access a course CBME page.
After the required standard key and enabling competency tag sets have been populated, all programs must also complete and upload multiple templates through the CBME tab for a program. Note that program directors and program coordinators will have access to the CBME tab for their affiliated programs only. The templates to complete are:
The EPA Template
The Program Specific Key Competency Template (optional)
The Program Specific Enabling Competency Template (optional)
The Milestones Template (or the Enabling Competency Map Template)
The Contextual Variables Template
Navigate to Admin > Manage Courses.
Search for the required program (you may only have access to one depending on your role in your organization).
Using the gear icon beside the program name, click 'CBME'.
Click 'Import CBME Data' from the tab menu below the Competency-Based Medical Education heading.
With the introduction of EPA Versioning, you will now be prompted to indicate which EPAs you are changing. Note that if you are uploading a program's CBME data for the first time, this does not apply and you can click 'Next' on Steps 1 and 2 to move forward.
After skipping Steps 1 and 2, you'll arrive at a screen showing Steps 1 - 3 for importing CBME data.
On the right you will see a dropdown menu titled 'Download Example CSV Templates'.
Click on the required template and it will download to your computer. Look for it in your downloads folder if it doesn't immediately open.
Why are there two enabling competency templates?
Use the Enabling Competency Template to import program-specific enabling competencies.
Use the Enabling Competency Map Template only if a program chooses to map EPAs to enabling competencies instead of milestones.
There are two similarly named templates because of two options available: the option to import program-specific enabling competencies, and the option to map EPAs to enabling competencies instead of milestones. Most programs will map EPAs to milestones and will never use the 'Enabling Competency Map Template.' However, a program piloting CBME before its national launch may choose to map to enabling competencies if milestones aren't finalized yet, so we have left that option available.
Note that you'll only have this step available if you decided to map program-specific key and enabling competencies in import Step 2.
Use the Key Competencies Template if a program wants to import program-specific key competencies. This template organizes the information about Key Competencies decided by each national specialty committee and allows the information to be uploaded to the system.
Note that the first column, Key Competency Code, is completed for you with alphanumeric codes based on the CanMEDS roles. Adjust this list as needed to match the information from the national specialty committee.
When coding the Key Competencies, remember these required codes for the CanMEDS roles: Professional = PR, Communicator = CM, Collaborator = CL, Scholar = SC, Leader = LD, Advocate = HA, Medical Expert = ME. (The exception to this is if you have a unique CBME setup, e.g., one using different roles.)
Complete the second column, Title, by providing the key competency text.
The third column, Description, is optional. This will store information but will not be visible to users.
Save your file as a CSV.
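For illustration only, a completed Program Specific Key Competency Template might contain rows like the ones below. The header labels and competency text here are placeholders; always start from the template downloaded from your installation, since the Key Competency Code column is pre-populated for you.

```csv
Key Competency Code,Title,Description
ME1,Practise medicine within their defined scope of practice and expertise,
CM1,Establish professional therapeutic relationships with patients and their families,Optional internal note
```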
Remember that you'll only see this option if you have elected to map program specific key competencies.
After saving your file as a .csv, scroll to step 4 on the Import CBME screen and either search for or drag and drop the file into place. You'll see a green checkmark when your file is successfully uploaded.
Although Elentra allows individual programs to upload specific Key and Enabling Competencies, organizations are still required to upload a standard list of Key and Enabling Competencies at the organization level. For Canadian schools using CBME, we recommend making the CanMEDS Key and Enabling Competencies your standard list. The CanMEDS documents are available online.
If you see a warning like the below, it means your organization still needs to upload key and enabling competencies through Manage Curriculum.
To download the required templates, complete them, and upload standard Key and Enabling Competencies information:
Navigate to Admin > Manage Curriculum.
Click 'Curriculum Tags' from the Manage Curriculum card on the left sidebar.
Click on a curriculum tag set name (e.g. CanMEDS Key Competencies) and then click 'Import from CSV' in the top right to access and download the template to complete.
After you complete each required template, save it in a .csv format and upload it to the corresponding tag set page.
You can access the CanMEDS Key and Enabling Competencies at http://canmeds.royalcollege.ca/guide. The Key Competencies are preceded by one number like 1 or 2, and are stored under CanMEDS roles. The Enabling Competencies are preceded by a number like 1.1 or 2.2 and are also stored under CanMEDS roles.
Sample Key Competency: 1 Practise medicine within their defined scope of practice and expertise
Sample Enabling Competency: 1.1 Demonstrate a commitment to high-quality care of their patients
Both of these are from the Medical Expert role so would be coded with the prefix ME. Alphanumeric codes are provided in the templates, and you will need to input the key and enabling competency text. Note that you can export the text from the CanMEDS guide.
Download the template through Manage Curriculum > Curriculum Tags > CanMEDS Key Competencies.
Note that the first column, objective_code, is completed for you with alphanumeric codes based on the CanMEDS roles.
Complete the second column, objective_name, by providing the key competency text. You can access the key competencies at http://canmeds.royalcollege.ca/guide or request them from another institution.
Save your file as a CSV.
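Using the sample Key Competency shown above, a completed row would look something like this (the objective_code column comes pre-filled in the downloaded template):

```csv
objective_code,objective_name
ME1,Practise medicine within their defined scope of practice and expertise
```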
Download the template through Manage Curriculum > Curriculum Tags > CanMEDS Enabling Competencies.
Note that the first column, objective_code, is completed for you with alphanumeric codes based on the CanMEDS roles.
Complete the second column, objective_name, by providing the enabling competency text. You can access the enabling competencies at http://canmeds.royalcollege.ca/guide or request them from another institution.
Save your file as a CSV.
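Likewise, using the sample Enabling Competency shown above, a completed row would look something like this:

```csv
objective_code,objective_name
ME1.1,Demonstrate a commitment to high-quality care of their patients
```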
To upload completed templates:
Navigate to Admin > Manage Curriculum.
Click 'Curriculum Tags' from the Manage Curriculum card on the left sidebar.
Click on a curriculum tag set name (e.g. CanMEDS Key Competencies) and then click 'Import from CSV' in the top right.
Drag and drop or select the appropriate file from your computer and click 'Upload'.
You will get a success message and the curriculum tags you've added will appear on the screen.
A program needs to complete this template OR the Enabling Competencies Template, not both. Most programs will complete the Milestones Template. Users define which option they will use in Step 2 of the import CBME process.
2) Milestone code: Codes should be recorded in uppercase letters. The format for the milestone code is: Learner Stage letter, followed by a space, CanMEDS Role letters, Key Competency number, a period, Enabling Competency number, a period, and the Milestone number (e.g., C ME1.2.3).
Note that there should be no space between the CanMEDS Role letters and the Key Competency number.
CanMEDS Role refers to the CanMEDS roles and should be coded as follows:
Professional = PR, Communicator = CM, Collaborator = CL, Scholar = SC, Leader = LD, Advocate = HA, Medical Expert = ME
(The exception to this is if you have a unique CBME setup, e.g., different roles.)
Creating and confirming your Milestone codes takes patience. You'll likely notice that in some Royal College (RC) documents there is only a two-part milestone code (e.g., ME1.2). For the purposes of mapping your curriculum in Elentra, you must have three-part milestone codes (e.g., ME1.2.3). We recommend you add the third number in the order the milestones appear in the RC documents you're using. Make sure you check for duplication as you go so that unique milestones have their own codes but different codes aren't applied to repeating milestones (within one stage). Using your spreadsheet's sorting tools to reorder the milestone code and title columns can help you identify unneeded duplication.
You may also notice that the RC allows programs to use milestones coded from one stage in another stage (so you may see an F milestone in a C stage EPA). How programs and organizations handle this is ultimately up to them but we recommend that you align the milestone code with the stage that it is actually being assessed in/mapped to. For example, if it’s a “D” milestone being mapped to an F EPA, rename it as an “F” milestone. This is because you likely have a different expectation of performance on the milestone in a different stage, even if it’s the same task.
3) Title: The text of the milestone should be copied verbatim from the national specialty committee EPAs.
Mapping multiple EPAs to one milestone
The system is designed to allow multiple EPAs to map to one milestone. To do so, simply insert a comma between the EPA code entries in column one. e.g. D1,D2,F1
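As a sketch only (the header labels and milestone text below are illustrative; use the headers from the downloaded template), a few rows might look like this. Note that if you edit the raw .csv directly, a multi-EPA cell must be wrapped in quotes so its commas are not read as column separators; spreadsheet programs handle this automatically when you save as CSV.

```csv
EPA Code,Milestone Code,Title
D1,D ME1.1.1,Elicit an accurate focused history for a common presentation
"D1,D2",D CM1.1.2,Communicate using a patient-centred approach
```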
After saving your file as a .csv, scroll to step 3/6 on the Import CBME screen and either search for or drag and drop the file into place. You'll see a green checkmark when your file is successfully uploaded.
After you upload an EPA Template, Step 2 is to define how you'd like to continue mapping your EPAs. Each program has two decisions to make:
1. Competency Options: Decide whether or not to upload course/program-specific key and enabling competencies. If you select to use the standard key and enabling competencies (e.g., CanMEDS key and enabling competencies), the system will copy the items previously uploaded via Manage Curriculum and make them the program's default key and enabling competencies.
2. Curriculum Tag Options: Decide whether to map EPAs to Milestones or Enabling Competencies. Most programs will map their EPAs to Milestones.
The ability to choose whether to use course/program specific competencies vs. the standard list is configurable in the Elentra settings table. This means that with a developer's help it is possible to prevent users from having the Competency Options choice. If you get to an option that ONLY asks about Curriculum Tag Options, consult with your CBME implementation lead as to whether your organization should enable the option to provide program specific KCs and ECs.
Make your selections using the radio buttons, then click 'Save Competency and Curriculum Tag Options'.
The EPA Template needs to be completed by all programs. For Canadian PG programs, this template organizes the information about EPAs decided by each national specialty committee and allows the information to be uploaded to Elentra.
Each program should complete an EPA template with the following information:
Code: This is the EPA code. It should be recorded in uppercase letters and include the learner stage of training and the EPA number. Use the learner stage letters outlined below to ensure that the EPAs correctly map to other curriculum tags: D = Transition to Discipline, F = Foundations of Discipline, C = Core Discipline, P = Transition to Practice. Sample EPA codes include D1, D2, D3, F1, F2, F3, C1, C2, etc. Note that there is no space between the letter and number in the EPA code.
Title: This should be the EPA text. This can be copied from the EPAs as decided by each national specialty committee.
Detailed Description: In some cases, these may be the same as the EPA title; in other cases, an EPA has an additional description. Not every program will use the 'Detailed Description' column. Note that this information is displayed to users in the EPA Encyclopedia.
Entrustment required (optional): This field is optional but will populate the EPA Encyclopedia, which is visible to all users.
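For illustration, a completed EPA Template might contain rows like the following. The header labels, EPA text, and entrustment values here are placeholders; copy the real wording from your national specialty committee documents and use the headers from the downloaded template.

```csv
Code,Title,Detailed Description,Entrustment required
D1,Performing histories and physical exams for common presentations,Includes generating an initial differential diagnosis,
F2,Developing and implementing management plans for patients with common problems,,Direct supervision
```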
Remember to use these codes for the learner stage on all templates:
D = Transition to Discipline, F = Foundations of Discipline, C = Core Discipline, P = Transition to Practice
(The exception to this is if you have a unique CBME setup, e.g. for a UG program.)
Save your completed file as a .csv
Navigate to Admin > Manage Course.
Search for a course as needed and from the menu cog, select CBME.
Click Import CBME Data.
When importing EPAs for the first time, you will see Step 1: Flag EPAs, click 'Next' to move on. (This step is to support EPA versioning which is unnecessary when importing the first version of EPAs.)
You will see Step 2: Set the Status of All Objectives. Click 'Next' to move on. (This step is to support EPA versioning, which is unnecessary when importing the first version of EPAs.)
The next screen you access will include 3 new steps allowing you to first Import EPAs.
After saving your file as a .csv, scroll to Step 1 on the Import CBME screen and either search for or drag and drop the file into place. You'll see a green checkmark when your file is successfully uploaded.
If you need to add new information to your EPAs, competencies, or milestones templates, add new information to the existing template and upload it again in the appropriate spot. The system will identify any NEWLY ADDED EPAs/competencies/milestones and append them to the existing information.
Because of this feature, it can be valuable to keep a copy of the most recent file you’ve uploaded at all times.
If you need to correct a typo or minor error in an EPA, milestone, etc., navigate to the Import CBME Data screen and follow the instructions on the screen to edit existing EPAs, milestones, etc.
Note that you'll only have this step available if you decided to map program specific key and enabling competencies in import Step 2.
Use the Enabling Competencies Template if a program wants to import program specific enabling competencies. This template organizes the information about Enabling Competencies decided by each national specialty committee and allows the information to be uploaded to the system.
Note that the first column, Enabling Competency Code, is completed for you with alphanumeric codes based on the CanMEDS roles. Adjust this list as needed to match the information from the national specialty committee.
When coding the Enabling Competencies, remember these required codes for the CanMEDS roles: Professional = PR, Communicator = CM, Collaborator = CL, Scholar = SC, Leader = LD, Advocate = HA, Medical Expert = ME.
(The exception to this is if you have a unique CBME setup, e.g., different roles.)
Complete the second column, Title, by providing the enabling competency text.
The third column, Description, is optional. This will store information but will not be visible to users.
Save your file as a CSV.
Remember that you'll only see this option if you have elected to map program-specific enabling competencies.
After saving your file as a .csv, scroll to step 5 on the Import CBME screen and either search for or drag and drop the file into place. You'll see a green checkmark when your file is successfully uploaded.
1) EPA code: Codes should be recorded in uppercase letters. Codes should refer to the stage of training and the EPA number, e.g. D1, D2, D3, F1, C2. Using consistent codes across your program and school is important. Unless you have made modifications for a unique CBME setup, Elentra looks for the following codes for learner stages: D = Transition to Discipline, F = Foundations of Discipline, C = Core Discipline, P = Transition to Practice.
To ensure CBME works correctly, each organisation that has CBME enabled must have a developer or technical administrator set the default_stage_objective setting value in the elentra_me.settings table to the global_lu_objectives.objective_id of the first stage of competence (e.g., Transition to Discipline).
Mapping Entrustable Professional Activities, Key and Enabling Competencies, Milestones, Contextual Variable Responses and Procedure Attributes for each program is what allows administrators to use the CBME form templates to quickly create assessment tools. There are two stages to mapping curriculum: in the first stage, a central administrator (e.g., the PGME office) configures Elentra; in the second stage, the curriculum information specific to each program is uploaded.
Steps taken once, at the organisational level, include:
Auto-setup CBME features,
Import standard list of Key and Enabling Competencies,
Import custom list of contextual variables as needed, and
Build assessment rating scales as needed.
Then individual programs can upload Entrustable Professional Activities, program-specific Key and Enabling Competencies (if desired), Milestones, Contextual Variable Responses and Procedure Attributes.
Whether you allow courses/programs to upload their own specific key and enabling competencies is controlled by an optional database setting (setting = cbme_standard_kc_ec_objectives). If you need to change this setting, speak to a developer.
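If a developer wants to confirm how this is currently configured, a query along these lines could be used. This is a sketch only, assuming the same elentra_me.settings table referenced earlier in this guide; the shortname and value column names, and the value semantics, are assumptions to verify.

```sql
-- Sketch only: confirm column names and value semantics before changing anything.
SELECT shortname, value
FROM elentra_me.settings
WHERE shortname = 'cbme_standard_kc_ec_objectives';
```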
All templates you upload to the system must be in .csv format.
If you are working in Excel or another spreadsheet manager, use the "Save As" function to select and save your file as a .csv.
To upload your files, either search for or drag and drop the file into place. You'll often see a green checkmark when your file is successfully uploaded.
If you receive an error message make sure that you have:
Deleted any unused lines. This is especially relevant in templates with pre-populated columns, including the contextual variables template and the enabling competency map template.
Completed all required columns. If a column is empty for a specific line the file may fail to upload.
If you've imported all CBME data for a program but you are not able to see EPA maps in the EPA Encyclopedia or in the Map Curriculum Tags tab of CBME, double check that you've uploaded your files without spaces between the EPA Code letter and number, nor between the CanMEDS Role and Key Competency number. Having spaces in the incorrect places will prevent the system from retrieving the required information to produce maps.
While you can correct minor typos and add additional mapping information through the user interface, you cannot delete or undo mapping between EPAs and milestones or enabling competencies.
The option to reset all CBME data for a program was removed in a previous Elentra version.
A program needs to complete this template OR the Milestones Template, not both. Most programs will map to milestones and will not need to complete this template. Users define which option they will use in Step 2 of the import CBME process.
EPA code: Codes should be recorded in uppercase letters. Codes should refer to the stage of training and the EPA number, e.g. D1, D2, D3, F1, C2. Using consistent codes across your program and school is important. The system is set up to accept the following codes for learner stage: D = Transition to Discipline, F = Foundations of Discipline, C = Core Discipline, P = Transition to Practice.
Enabling Competency Code: Note that the template already includes a list of Enabling Competency Codes using the CanMEDS role prefixes. In the first column, simply add the EPA codes you wish to map to each enabling competency.
If there are unmapped Enabling Competency Codes left over, you must delete those rows before you upload your .csv file.
Mapping multiple EPAs to one enabling competency
The system is designed to allow multiple EPAs to map to one enabling competency. To do so, simply insert a comma between the EPA code entries in the first column.
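A hedged example of what the first few rows might look like is below. The header labels are illustrative; the Enabling Competency Code column is pre-populated in the downloaded template, and a multi-EPA cell needs quoting if you edit the raw .csv directly.

```csv
EPA Code,Enabling Competency Code
D1,ME1.1
"D1,F2",CM1.1
```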
After saving your file as a .csv, scroll to step 5 on the Import CBME screen and either search for or drag and drop the file into place. You'll see a green checkmark when your file is successfully uploaded.
Navigate to Admin > Manage Courses/Programs.
Search for the appropriate course as needed.
From the cog menu on the right, select 'CBME'.
Click on 'CV Responses' from the tab menu below the Competency-Based Medical Education heading.
Click on a contextual variable to show its existing responses.
Click on the double arrow in the Order column to drag and drop a response to a new location.
You'll get a green success message that your change has been saved.
If users want to reorder the overall list of contextual variables, they can do so from Admin > Manage Curriculum. To change a tag's display order, edit the tag by clicking on the pencil icon beside its name and adjust the display order; repeat for each individual tag.
Within a given contextual variable, you are able to create groups of responses. This is leveraged in the assessment plan builder to allow you to specify exactly what requirements learners must meet.
You must be in an administrative role to manage contextual variables.
Click Admin > Manage Courses
Beside the appropriate program name, click the small gear icon, then click "CBME"
Click on the "CV Responses" tab
Click on any contextual variable category to expand the card. To add a new group within that contextual variable, click on the green "Manage Groups" button.
To add a new group, click "Add Group". To edit an existing group, click on the group title.
Add a title for the contextual variable group and an optional description. Check off all responses that you wish to add to the group and then click "Save Group".
After saving, you can now use the Group contextual variable types within the assessment plan builder.
Ideally your users will be loaded into Elentra from a central, authoritative source like a student information system. Manually adding users through Admin > Manage Users is also possible, and more detailed instructions for using this tool are available here.
If you are manually setting up CBME users, please note the following:
Resident learners should be assigned group: student, role: student.
Program Administrators should be assigned group: staff, role: Pcoordinator. Further, they must be assigned to a specific program in order to access the relevant CBME and Assessment and Evaluation features. To give specific staff more ability to act within Elentra, set them up as Staff > Admin. Note that Staff > Admin users have access to almost all features of Elentra and can access System Settings.
Faculty should be assigned group: faculty, role: faculty, lecturer, or director depending on their role. Someone with faculty director permissions will still have to be added to a specific program in order to access the relevant CBME and Assessment and Evaluation features. Faculty directors will have access to all faculty evaluations within their department (except their own).
Any faculty, regardless of their role, can be added to Competence Committees through a course/program setup page or as Academic Advisors through the program Groups tab. (Additional instructions for each are in additional lessons.)
You need to be logged in as a program coordinator or program director to create or modify competence committees.
Navigate to Admin > Manage Courses and select a course.
Under the program title, click on Setup.
Scroll down to the Course Contacts section and find the 'Competency Committee Members' section.
Begin to type a name to retrieve a list of potential committee members.
Click on the name you want to add and repeat as necessary.
When a name has been added it will appear below the search box.
After you have added the required names, scroll down to the bottom of the page and click ‘Save’.
Note that all Competency Committee Members for a program will be able to access all resident profiles in that program and will have the ability to promote them.
Members of Competency Committees can promote learners through stages from the learner CBME dashboard. For more information please see the Reviewing Learner Progress>Promoting Learners Through Stages lesson.
Navigate to Admin > Assessment and Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form name and select 'Supervisor Form' as the form type from the dropdown menu. If you have permission to access multiple programs, use the dropdown menu to select the appropriate program for this form. This option will only show up if you have access to multiple programs.
If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Select which EPAs can be assessed using forms generated from this template.
To remove EPAs, click on the small 'x' to the left of the EPA code. You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click the grey badge beside an EPA to select or remove specific milestones for forms built from this template.
Click Save.
If you want all EPAs to have the contextual variables, leave all EPAs checked off. If you'd rather specify which contextual variables apply to which EPAs, simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
You can remove specific contextual variable responses by clicking on the grey button beside a contextual variable.
You may only select between 1 and 6 contextual variables per EPA per supervisor form.
Click Save.
Use the first dropdown menu to select the scale you want to use to assess enabling competencies or milestones.
Indicate whether comments are disabled, optional, mandatory, or prompted. By selecting ‘Prompted’, you can set the system to prompt and require a comment when any flagged response is selected by an assessor.
The default response feature allows you to pre-populate a form with the selected response.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
From the first dropdown menu, select a Global Rating Scale. This will populate the Item Text and the Responses sections.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted. By selecting ‘Prompted’ from the Comments dropdown menu, you can set the system to prompt and require a comment when any flagged response is selected by an assessor.
Click 'Publish' to make your template available for use. The forms will be available within the hour.
Once a form template has been published, you can rearrange the template components for each form; however, you cannot make changes to the scales or contextual variables. To make these changes, copy the form template and create a new version.
On each form template you create you'll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed.
Within a given form, you can only tag curricular objectives (e.g., EPAs or milestones) from the same curriculum version. To ensure that you do not accidentally add an EPA from a different version, you must create the form first and then "Create & Attach" new items to the form.
Click Admin > Assessment & Evaluation.
Click on Forms from the subtab menu
Click Add Form.
Provide a form name and select PPA Form or Rubric/Flex Form from the Form Type dropdown menu; then click Add Form.
If you have two (or more) curriculum versions in the system, the Form Editor will default to loading the most recent version. Under "EPA Version", simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, simply change the EPA Version to Version 1 and it will load the appropriate EPAs.
Note: In order to use the "Programs" filter in the form bank, you need to add Program-level permissions to each form.
Click "Individual", change to "Program"
Begin typing in your program name in the "Permissions" box.
Click on your program to add it.
If applicable, select the relevant contextual variables. Click the grey badge beside a variable to select or remove specific responses.
Indicate whether comments are disabled, optional, mandatory, or prompted. By selecting ‘Prompted’, you can set the system to prompt and require a comment when any flagged response is selected by an assessor.
Feedback and Concerns items will be added when the form is published.
Click "Add Item(s)" to add an item, or click the down arrow for more options. "Add Free Text" will add an instruction box. Do not use the "Add Curriculum Tag Set" at this time.
Create & Attach a New Item
Click "Create & Attach a New Item" to add an item.
Create New Item
Click "Create & Attach a New Item" to add an item. Select the Item Type and add any item responses, if applicable.
Tag Curriculum Objectives
Because you are using a form that is mapped to "Version 2", the curriculum tag sets will be locked to "Version 2". This will ensure that you do not accidentally tag an EPA from a different version.
Continue adding items as desired. When you have finished creating and attaching new items, click "Publish" to publish your new form.
Navigate to the CBME Sub-Module
Log in to Elentra.
Click "Admin" at top right
Click "Manage Programs"
Select your program.
Click "CBME"
Click "Import CBME Data".
Add New Version
Click "Actions"
Click "Add New Version"
In this step, simply indicate which EPAs will be:
Replaced
Changing
Not Changing
Retired
Note: Each program can only have one version "in progress" at a time. You can continue editing that new version, or reset the progress.
Replaced
Select this option if you wish to replace the EPA entirely, including all milestones. This option will require you to have a complete set of CSV files for the objectives that are being replaced (all EPAs, all Milestones), rather than just the changes. Use this option if you have many changes to implement across EPAs and milestones.
Retired
Select this option to retire an EPA (remember, think in terms of EPA Code, e.g. "C11") from the next version. This means that there will be no new version of that EPA Code (C11) in the next curriculum version. If you include an EPA with the retired code in the spreadsheet, it will not get uploaded. You are unlikely to use this status for EPAs unless there is an EPA at the end of your numbering that is being removed. For example, you have D1, D2, D3, D4 currently. You are retiring D4 from the next version and no new D4 will be uploaded.
For milestones, a “retired” status will remove that milestone from that EPA in the new version. This means that there will be no new version of that milestone for that EPA. It will not retire it from other EPAs unless you explicitly tell it to.
Changing
The objective will be changing. This will allow you to upload new EPA and Milestone text. For a milestone (or competency), if you select “changing” for that milestone under one EPA, the system will update all other instances of that milestone to “changing”. This will ensure that you only have one version of that milestone.
Not changing
The objectives will remain unchanged. This exact version of the objective will be carried forward into the new curriculum version. Note that selecting this option will stop any objectives with the same code from being uploaded.
In this step, indicate which KCs, ECs, and Milestones will be: Changing, Not changing, or Retired.
If you marked all of your EPAs as "Replaced," you can click next without needing to go through each EPA.
PLEASE NOTE: If there are any milestones in a Changing EPA that you want removed, you MUST mark them as RETIRED. This is with respect to the Milestone code - if "C PR4.3.2" will no longer be in this EPA, mark it as retired.
If you would only like to add a new Milestone to an EPA, just set the EPA to changing and nothing else.
Note that "retiring" a milestone will only retire it from that EPA and no others unless otherwise indicated.
Upload CSV file with new and/or changing EPAs
You will notice that the yellow box at the top will list all objectives that have been set to changing. You can still upload new EPAs and Milestones that were not already in the system.
Choose the file and click "Save and upload EPAs"
IF APPLICABLE: Upload Key and Enabling Competencies
Moving forward, we are recommending that programs upload the specialty-specific EC and KC files. This is especially important for programs that have added or removed KCs or ECs (e.g., Otolaryngology).
Upload CSV file with new and/or changing Milestones
Choose the file and click "Save and upload Milestones".
Some programs at Queen's only mapped to the Enabling competency level. If you will continue mapping to enabling competencies, and are not converting to milestones, skip the milestone upload and use Step 6 to upload your EC map. If you are converting to milestones, use the milestone upload.
When the upload is complete, click "Next".
Confirm the Version
This overview lists all of the EPAs and their version status. When you are ready to publish the new version of the curriculum, click the blue "Publish" button.
REMINDER: Please ensure that all of your learners are in the correct stage (on their CBME Dashboards) prior to publishing the new curriculum version. The green checkmark on the stage indicates that the learner has completed that stage. A learner is currently in the stage that comes after the last green stage checkmark. Failing to ensure that learners are in the correct stages will result in those learners receiving (or not receiving) new stages when they shouldn't have. If you do this by accident, there is a developer tool to reset the learners' stages - please contact the support team if you need help with this.
As soon as you have published your new version, you will need to build tools for all of the new or changing EPAs before residents can be assessed on them.
The new curriculum version will be available the next day. For most installations, new versions are published nightly.
Now that you have finished creating your new curriculum version, please ensure that you create new assessment tools for any changed/replaced/new EPAs. After the published version is live, learners will not be able to be assessed on any new/changed EPAs until the new tools have been built. Remember that in most cases this will have minimal impact, since the learners only get the new version for the stages beyond the one they are currently in.
Please do not delete forms or form templates from previous versions unless you are certain that there are no longer any learners using that version.
The new curriculum version will be applied to each learner at the stage level. This means that if a learner is currently in Foundations, they will retain the previous version of Transition to Discipline and Foundations but will be given the new version of Core and Transition to Practice. Please ensure that all of your learners are in the correct stage (on their CBME Dashboards) prior to publishing the new curriculum version. Remember: a green checkmark on the stage indicates that the learner has completed that stage. A learner is currently in the stage that comes after the last green stage checkmark.
As soon as you have published your new version, you will need to build tools for all of the new or changing EPAs before residents can be assessed on them.
Think in terms of the EPA or Milestone code. A new version of an EPA is when:
The EPA title/description has changed sufficiently to change the context of the EPA, AND/OR
You are changing, adding, or retiring any milestones within the EPA, AND/OR
You are changing the EPA code association (e.g., you have retired an EPA in the middle of your numbering, such as D2, and now need to shift D3, D4, D5, into D2, D3, D4).
A new version of a Milestone is when:
The Milestone title has changed sufficiently to change the context of the Milestone.
Replaced
Select this option if you wish to replace the EPA entirely, including all milestones. This is the easiest option to work through the wizard. This option will require you to have a complete set of CSV files, rather than just the changes. Use this option if you have many changes to implement across EPAs and milestones.
Not changing
The objectives will remain unchanged. This exact version of the objective will be carried forward into the new curriculum version. Note that selecting this option will stop any objectives with the same code from being uploaded. Any EPAs marked as not changing will have all existing tools carried over.
Changing
The objective will be changing. A new version of this EPA will be created, which may include objectives within it that you mark as changing.
For a milestone (or competency), if you select “changing” on one EPA, the system will update all other instances of that milestone to “changing”. This will ensure that you only have one version of that milestone.
Retire
The EPA (remember, think in terms of EPA Code, e.g. "C1") is being retired from the next version. This means that there will be no new version of that EPA Code in the next curriculum version. If you include an EPA with the retired code in the spreadsheet, it will not get uploaded. You are unlikely to use this status for EPAs unless there is an EPA at the end of your numbering that is being removed. For example, you have D1, D2, D3, D4 currently. You are retiring D4 from the next version and no new D4 will be uploaded.
For milestones, a “retired” status will remove that milestone from that EPA in the new version. This means that there will be no new version of that milestone for that EPA. It will not retire it from other EPAs unless you explicitly tell it to.
Please ensure that you have documented any changes that will need to be made, in addition to creating (or modifying) new CSV templates for upload. From the user interface, you will be able to indicate:
Whether an EPA is being replaced, changing, not changing, or being retired
Within each EPA that has been marked as changing, indicate if any KC, EC, or Milestones will be changing, not changing, or retired
If you are adding milestones to an EPA, mark it as changing so that the uploader will add the new milestones
Entirely new EPAs will be detected by the uploader without you needing to indicate anything. For example, if you currently have D1, D2, and D3, but will be adding a D4 and D5, the system will detect these in your spreadsheet and add them to the new curriculum.
The new curriculum version will be applied to each learner at the stage level. This means that if a learner is currently in Foundations, they will retain the previous version of Transition to Discipline and Foundations but will be given the new version of Core and Transition to Practice. Please ensure that all of your learners are in the correct stage (on their CBME Dashboards) prior to publishing the new curriculum version. Remember: a green checkmark on the stage indicates that the learner has completed that stage.
EPA Template
Milestone Template
If applicable, any new program-specific Key and Enabling Competencies
SPREADSHEET TIP:
To ensure that you do not have any duplicated milestone text or duplicated milestone codes, use the "Duplicate Values" conditional formatting rule in Microsoft Excel. This will highlight any duplicated cells so you can confirm that you have only entered each milestone once. Caveat: this method is not perfect and will not catch the same milestone text if one instance of it has an additional space or a missing period.
Great news: When triggering an assessment, only the EPAs and associated forms relevant to that learner will be displayed. You will not have to choose between multiple versions for a given learner.
You only need to build new assessment tools for new EPAs or EPAs that have changed. Assessment tools will be carried over for all EPAs that were marked as "not changing." As noted above, as soon as you have published your new version, you will need to build tools for all of the new or changing EPAs before residents can be assessed on them.
CBME includes specific form templates to reduce the amount of work administrative staff have to do to create the assessment forms courses/programs will use. Elentra also supports generic forms (including rubrics) which can also be leveraged by the CBME tools including the individual dashboard, program dashboard and assessment plan builder. Specific details for each form type are included in other lessons but here is some general information about creating forms.
Form Permissions: These permissions dictate who has access to and can edit a form or form template while it is still a draft. To quickly make a form available to multiple users, consider adding a program permission to the form; this gives access to anyone affiliated with that program (e.g., PA, PD) who has access to the administration of forms.
Note that a form must be permissioned to a course to be used in an on-demand workflow.
Form Instructions: To include instructions on a form template, look in the Form Template Information section and click the box beside 'Include Instructions.' This will open a rich text editor where you can type. What you type here will be included on all forms generated from this template. You could include specifics about the use of a form, the number of assessments a learner must complete, or even the key features of an EPA. Remember that each form produced will have the same instructions so if you include key features make sure they only relate to the applicable EPAs for the template.
Publish a Form: To make a form template generate forms that are available for learners or faculty to trigger, you must publish the form template. Most installations of Elentra run a behind-the-scenes task to publish form templates every hour or so.
Editing a Form: Once a form template is published you cannot change the content of the resulting forms (you can change permissions or rearrange item order on published forms). If there is an error on a form you will need to copy the form, correct the error, and publish your new form. You can delete old forms, but note that any already completed assessments will remain on the learner dashboard.
The assessment plan builder allows you to specify minimum requirements for assessment forms on a per-EPA basis. These plans are leveraged to generate resident progress reports in the CBME Program Dashboard.
Currently, the assessment plan builder supports Supervisor Forms, Field Notes, Procedure Forms, PPAs (with a global entrustment item), and Rubric Forms (with a global entrustment item). These forms must have an item mapped to an EPA for them to show up on the Assessment Plan. As of ME 1.18, you are able to build plans for forms that were deleted or retired.
The assessment plan builder does not currently support PPA or Rubric forms that do not have a global entrustment item added.
You must be in an administrative role and have access to a specific program to use the Assessment Plan Builder.
Navigate to Admin > Manage Programs/Courses.
Beside the appropriate program/course name, click the gear icon and click "CBME".
Click the Assessment Plans tab.
Click "Add Assessment Plan".
When you click “Add Assessment Plan” you are creating a container for all EPA-specific assessment plans in your curriculum. Each assessment plan container is scoped to a single curriculum version (i.e., if you have multiple versions of your EPAs, you can create different assessment plans for different EPA versions).
To add an assessment plan, select the “Version” and create a title for your assessment plan container. Adding a description is optional.
Click “Save Plan.” You will now be redirected to the new assessment plan container. You can use one container for all of your EPAs within a version.
The assessment plan container will load all EPAs for the selected curriculum version. You can use a free-text search to find a particular EPA, or scroll down to the EPA of interest.
Each EPA has a circular icon that indicates the EPA plan status.
A green checkmark indicates that an assessment plan has been published for this EPA.
A grey circle indicates that no assessment plans have been started for this EPA.
An orange exclamation mark indicates that an assessment plan has been saved in draft mode for this EPA. Changes may need to be made before publishing.
Click on the EPA you wish to add an assessment plan for.
Add a title and an optional description. These are not visible to other users at this time and are for admin purposes only.
Click on "Assessment Tools" to load all tools that have been tagged to this EPA.
Date ranges are listed for tools that have been deleted or retired.
A single creation date is listed for active tools.
Select the tool(s) you wish to include in this assessment plan. You can add one plan for each tool, or optionally combine the requirements across tools.
Note that deleted and retired tools are only listed when at least one assessment has been completed with them.
The combine tools feature allows you to combine multiple tools/forms within the same assessment plan, as long as there are shared contextual variables and the entrustment question is the same. You can then set the plan requirements for the shared variables & scale, and the system will use assessments from all of the selected tools to feed into the dashboard. For example, if you require 4 assessments to be completed at "meets expectations" or above, and it can be either a Field Note or a Supervisor form, the combine tools feature is an easy way to do this.
To combine tools:
Click "Assessment Tools"
Select all of the tools that you wish to combine
At the top of each tool card, click the checkbox at the top right
Click "Combine Tools"
Enter the plan requirements, as outlined below
Please note: For procedure forms, a completion of any of the procedures related to the selected form (built from the same template) will count towards meeting the "minimum number of assessments" and "minimum number of assessors" requirements. You may define additional requirements using the "Procedure" variable when your assessment plans require it. Otherwise, you do not need to select any specific procedures.
Minimum number of assessments: Enter the minimum number of assessments required for this EPA using this tool. This number is linked to the global entrustment rating, which means that completed assessments must be equal or higher than the selected rating scale response in order to fulfill the plan. This is a mandatory field.
Rating scale responses: Select the minimum level of entrustment or supervision that is required for this EPA. The number entered in the previous field is linked to this response. For example, an EPA might require 3 assessments scored at "Almost" or above. This is a mandatory field.
Minimum number of assessors: Enter the minimum number of assessors required to complete assessments on this EPA. This is a mandatory field.
Contextual Variable Types: Choose how you wish to track your contextual variable responses for this form. You can select multiple contextual variable types as needed (e.g., 'spread' and 'specific' requirements) within the same assessment plan (either a single-form or 'combined' form plan). For example, for Form A you might use "Specific" and "Group - Spread": the resident must see "cystic fibrosis" (specific) and any 4 of the other diagnoses (Group - Spread).
See below for detailed descriptions of each contextual variable type.
Contextual Variables: Select which contextual variables you need to track for this form. Only the contextual variable categories that are on the selected form will be loaded in this dropdown. Depending on the contextual variable type you selected, you will have different options appear to track the responses.
Combine Tools: See notes above.
The assessment plan builder leverages the grouping functionality within contextual variables. Within a given contextual variable, you can create groups of responses. There are four different ways to track contextual variable responses with the assessment plan builder: spread, specific, group (spread), and group (specific).
To create Contextual Variable Groups you need to be in Admin > Manage Courses > CBME > CV Responses.
The Spread function allows you to check off a selection of contextual variable responses and indicate how many unique responses are required from that list. In the above image, any 4 unique responses must be assessed at least once to meet the plan.
The Specific function allows you to check off a selection of contextual variable responses and indicate how many times each of the selected responses need to be assessed to meet the plan.
The Group: Specific function allows you to check off a selection of contextual variable responses within a contextual variable group and indicate how many times each of the selected responses need to be assessed to meet the plan. You can select responses from multiple groups within the same contextual variable.
The Group: Spread function allows you to select contextual variable groups and indicate how many discrete/unique responses from the group need to be assessed to meet the plan. You can add each response from the group individually as needed. The number of responses required must be equal to or less than the number of CV responses in the group. A CV response only needs to be present in the learner's assessments once to count towards the requirement.
In the example above, if you set the requirement to 4, the learner would be required to collect 4 unique contextual variable responses from the "Complex" group list displayed (for example: genetic syndrome, developmental delay, neurologic problem, AND hematology/oncology problem).
You can add multiple groups from within the same contextual variable.
You can also use multiple contextual variable types (e.g., 'spread' and 'specific' requirements) within the same assessment plan, whether it is a single-form or combined-form plan. For example, on Form A you might use "Specific" and "Group - Spread" together: the resident must see "cystic fibrosis" (specific) and any 4 of the other diagnoses (Group - Spread).
Once you have entered all of the contextual variable requirements, click "Save Draft" or "Publish Plan". The assessment plans are currently leveraged in the CBME Program Dashboard.
If your institution is new to competency-based medical education (CBME), especially in the context of Canadian PG programs, please read this additional learning module to prepare to launch CBME with Elentra.
The CBME module in Elentra is optional and is controlled through a database setting (setting: cbme_enabled). If you don't see CBME options like form templates and the CBME dashboard in your organization and you'd like to, you'll need to speak to a developer.
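If a developer does need to flip this setting, it is typically a one-line change in the database. The sketch below is illustrative only: the column names in the `elentra_me.settings` table (shown here as `shortname` and `value`) and whether the setting is scoped per organisation are assumptions to confirm against your own schema before running anything.

```sql
-- Illustrative sketch only; confirm column names against your settings table.
UPDATE elentra_me.settings
SET `value` = '1'                      -- '1' enables the CBME module, '0' disables it
WHERE `shortname` = 'cbme_enabled';    -- scope to the organisation if your schema requires it
```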
The Competency-Based Medical Education module allows institutions to import Entrustable Professional Activities (EPAs) and map them to Key Competencies, Enabling Competencies and Milestones. In turn, this allows the creation of assessment tools and the collection of learner performance data, all tied to EPAs.
Different institutions will use Elentra differently as they transition to CBME. Some CBME features rely on the use of other parts of Elentra; however, even when using CBME without any additional features (e.g. rotation scheduling), you will be able to:
map EPAs, competencies and milestones,
use supervisor, procedure and field note form templates to create assessment forms users can initiate on demand,
create periodic performance assessment (PPA) and rubric forms that are linked to EPAs (and also available on demand),
allow faculty, learners and staff to initiate assessment forms on demand,
store and report on learner performance data using the CBME individual dashboard,
create EPA-based assessment plans for each course/program which allows use of the CBME program dashboard,
assign faculty as competence committee members and academic advisors,
log meeting information for each resident,
promote learners through stages, and,
collect feedback from learners on completed assessment forms.
If you wish to create additional forms for assessment or evaluation purposes, use the clinical rotation scheduler, or schedule individual learning events, resources for those actions are also available on this site at the following links:
Assessment and Evaluation (used to create and distribute additional forms like course or faculty evaluations)
Clinical Rotation Schedule (used to schedule learners into clinical rotations; note this will allow use of some additional learner tools in the CBME module)
Scheduling Learning Events (used to schedule events like academic half days)
| Setting | Description |
| --- | --- |
| `cbme_enabled` | Controls whether an organisation has CBME enabled or not. |
| `cbme_standard_kc_ec_objectives` | Controls whether you give courses/programs the option to upload program-specific key and enabling competencies. |
| `allow_course_director_manage_stage` | Enable to allow course directors to manage learner stages (in addition to Competence Committee members). |
| `allow_program_coordinator_manage_stage` | Enable to allow program coordinators to manage learner stages (in addition to Competence Committee members). |
| `cbme_memory_table_publish` | |
| `cbme_enable_tree_aggregates` | |
| `cbme_enable_tree_aggregate_build` | |
| `cbme_enable_tree_caching` | |
| `default_stage_objective` | Must have information entered for CBME to work. Set to the `global_lu_objectives.objective_id` of the first stage of competence (e.g., Transition to Discipline). |
| `cbme_enable_visual_summary` | Enables additional views from the program dashboard. |
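As a worked sketch of the mandatory `default_stage_objective` entry described in the table above: a developer would first look up the `objective_id` of the first stage of competence in `global_lu_objectives` and then point the setting at it. The objective title column and the settings-table column names below are assumptions; adjust them to your schema.

```sql
-- Find the objective_id of the first stage of competence
-- (the objective title column name here is an assumption).
SELECT objective_id, objective_name
FROM global_lu_objectives
WHERE objective_name LIKE '%Transition to Discipline%';

-- Point default_stage_objective at that id (replace 1234 with the id returned above).
UPDATE elentra_me.settings
SET `value` = '1234'
WHERE `shortname` = 'default_stage_objective';
```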
After sufficient information has been added to the system, a user can generate a map showing the connections between EPAs, roles, key competencies, enabling competencies, and milestones. This information can be accessed through the EPA Encyclopedia or a program's CBME page.
From the dashboard, access the EPA Encyclopedia from the left sidebar on the Helpful Links card.
There are several filter options when viewing the EPA Encyclopedia.
Search for a specific learner. After you select a learner the page will refresh to show that learner's program(s).
Switch between multiple programs as needed.
Search for a specific EPA by typing in an EPA code or key word. Elentra will search the EPA title, detailed description and entrustment for a match.
Switch between CBME Curriculum versions as needed.
After clicking on an EPA, click on Detailed Description, Entrustment or Program Map to view details about the EPA.
This option is only available to users with Admin > Manage Courses access (e.g. program directors, curriculum coordinators, program coordinators).
Navigate to Admin > Manage course/program and select a course.
Click CBME and the Map Curriculum tab.
Scroll down to click on an EPA or type the beginning of the EPA code and title into the dialogue box and click on the relevant EPA when it appears below.
Hover the mouse over a point on the map to view the complete text for the mapped item.
If you want to map a connection between an EPA and a milestone, you can do so through the Map Curriculum Tags tab. This allows you to map additional milestones to EPAs after you have uploaded your templates.
Navigate to Admin > Manage Courses and select a course.
Click CBME and then the Map Curriculum Tags tab.
Scroll down to the Map Curriculum Tags area.
Either type in and select the tags required to sort the displayed results OR click through the displayed options to create the mapping structure you require.
You can display the results for more than one role at a time by clicking on multiple roles. Note that any already mapped milestones will still show up in the milestone list.
Click to highlight the appropriate tags and then click ‘Save Curriculum Map’.
Do not delete any contextual variables from the original list within Manage Curriculum or the CBME auto-setup feature will prompt you to re-add them the next time you use CBME.
Contextual Variable Responses are used to describe the context in which a learner completes something. Examples of contextual variables include diagnosis, clinical presentation, clinical setting, case complexity, patient demographics, etc. For each variable, programs must define a list of response options. For example, under clinical presentation, a program might include cough, dyspnea, hemoptysis, etc.
Elentra will auto-setup a list of contextual variables in the Manage Curriculum interface. Institutions can also add contextual variables (e.g., assessor's role) to this list before uploading contextual variable responses (e.g., nurse, physician, senior resident) via each program's CBME tab. Elentra provides access to the same contextual variables for all programs. Programs customize the list of contextual variable responses, but they will all see the same list of contextual variables. For this reason we strongly recommend that you work with your programs to standardize the list of contextual variables you will load into Elentra. If you do not, you risk eventually having hundreds of contextual variables that all program administrative staff have to sort through, whether they apply to their program or not.
Elentra will auto-create the following Contextual Variables for a CBME enabled organization. Do not delete any of these or your users will be prompted to run the auto-setup tool again.
Institutions can add additional contextual variables to make available to their programs through Admin > Manage Curriculum. You'll need to be a medtech:admin or staff:admin user to do this. If you add a contextual variable to the organization wide tag set, programs will be able to upload their own contextual variable responses for the tag.
Navigate to Admin > Manage Curriculum.
Click on 'Curriculum Tags' from the Manage Curriculum Card on the left sidebar.
Click on the Contextual Variables tag set.
Click 'Add Tag' or 'Import from CSV' to add additional tags (more detail regarding managing curriculum tags is available here).
Any new tags you add to the Contextual Variable tag set (e.g. observer role, consult type, domain) will be visible to and useable by all programs within your organization.
When you add new tags to the tag set, you'll be required to provide a code and title. It is recommended that you make the code and title the same, but separate the words in the code with underscores. For example: Title: Observer Role; Code: observer_role
Please note that there is by default a 24 character limit for curriculum tag codes and a 240 character limit for curriculum tag titles. If you enter more characters than the limit, the system will automatically cut off your entry at the maximum characters.
After you have added Contextual Variables to the existing tag set, programs will be able to add the new response codes (and responses) to the contextual variable response template and successfully import them.
Once you have built out the list of all contextual variables you want included in the system you can import specific contextual variable responses for each program.
First complete the Contextual Variable Response template. (See here for more information on downloading templates if needed.)
Note that responses will display in the order they appear in the CSV file you upload. You can reorder the responses later, but you may save time by inputting them in the order you want users to see them.
Response Code: This column is pre-populated with fields including assessor's role, basis of assessment, case complexity, case type, etc. You can add to this list if you have built additional contextual variables through Manage Curriculum (see above). Make sure to use the exact response code from the newly created contextual variable tag in the response_code column.
Response: In this column, you can add the response variables required (e.g. Case complexity response variables: low, medium, high). To add more than one response variable per response code, simply insert another line and fill in both the response code and response columns.
If there are items in the response code column that you don’t use, you must delete them before you upload the file.
Description (optional): This field is not seen by users at this point and simply serves as metadata. A sample completed file is sketched below.
Save as a CSV file.
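For illustration only, a completed Contextual Variable Response file might look like the sketch below. The header row and the specific codes and responses are examples based on the variables discussed above; always keep the exact headers and response codes from the template you downloaded.

```csv
response_code,response,description
case_complexity,Low,
case_complexity,Medium,
case_complexity,High,
observer_role,Nurse,
observer_role,Physician,
observer_role,Senior resident,
```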
Navigate to Admin > Manage Courses.
From the cog menu beside the course name, select 'CBME.'
From the tab menu below the Competency-Based Medical Education header, click CV Responses.
Click Upload CV Responses.
Drag and drop or search your computer for the appropriate file.
Click 'Save and upload Contextual Variable Responses'.
When the upload is complete you'll be able to click on any contextual variable and view the responses you've added.
After a program has uploaded an initial set of contextual variable responses, they can be managed through the CBME tab.
Navigate to Admin > Manage Courses/Programs.
Search for the appropriate course as needed and from the cog menu on the right, select 'CBME'.
Click on 'CV Responses' from the tab menu below the Competency-Based Medical Education heading.
To modify the contextual variable responses, click the title of the contextual variable or click 'Show' to the right of the one you want to edit.
To edit an existing response, make the required change and click the green disk icon in the Save column.
To delete an existing response, click the red minus button beside the response.
To add a new contextual variable response, click 'Add Response' at the bottom of the category window. This will open a blank response space at the end of the list. Fill in the required content and save.
If you get a yellow bar across the screen when you try to modify contextual variables it means none have been uploaded for the program you are working in. Return to the Import CBME tab and complete Step 4.
Elentra allows you to map EPAs to specific rotations and indicate for each rotation whether an EPA is a priority and how likely a learner is to encounter the EPA. For this information to be useful for learners, you must be using the Elentra Rotation Scheduler.
From the CBME dashboard, EPAs can display their priority and the likelihood of being encountered within a learner's rotation. Additionally, EPAs relevant to the learner's current rotation are outlined.
When learners and faculty trigger assessments they will also have the option to apply pre-set filters (e.g. priority EPA, current stage EPAs). All of these tools can help learners quickly identify EPAs to focus on in their clinical work.
Navigate to Admin > Clinical Experience.
Click on the Rotation Schedule tab.
Create a draft schedule or open a published schedule. Note that if you are creating a new rotation schedule, you will also need to add rotations to the schedule.
Once rotations exist within a schedule a small Objectives badge will appear beside the rotation name.
Click on the CBME Objectives badge to open a list of EPAs tagged to the rotation.
Click the plus sign beside any EPA you'd like to label with priority or likelihood.
Indicate the likelihood by clicking on the appropriate button in the Likelihood column.
Set an EPA as a priority by clicking the exclamation mark in the priority column.
Repeat as necessary.
Return to the rotations list using the Back to Rotations button in the top right.
When importing data to create EPA maps, programs are required to upload a contextual variable template to provide contextual variable response options (e.g. case complexity: low, medium, high). In addition to this information, programs will need to add specific details about any procedures included in their contextual variable response options. This additional information, called procedure attributes, plays an important part in building procedure assessment forms. Think of the procedure attributes as assessment criteria (e.g. obtained informed consent, appropriately directed assistants, etc.) for each procedure.
For each procedure in your program, you must define headings, rubrics and list items that may be used to assess a learner completing the procedure. A heading appears at the top of a form, a rubric becomes the subtitle of a rubric section, and list items are the things that will actually be assessed on the rubric.
All this information must be stored and uploaded in a CSV spreadsheet that uses two columns: Type and Title.
Within ‘Type’ there are three things you can include:
H to represent a heading (e.g. Participate in Surgical Procedures, NG Insertion)
R to represent a rubric (e.g. Procedure, Pre-Procedure)
L to represent list items (e.g. Apply knowledge of anatomy)
The three procedure criteria form a hierarchy. A heading can have multiple rubrics, and a rubric can have multiple list items. Arrange the criteria in your spreadsheet to reflect their nesting hierarchies.
In the 'Title' column enter the information to be uploaded.
When complete, save the file in a CSV format.
Sample Procedure Form
On this sample form, the following information was uploaded as procedure attributes:
H: Participate in Surgical Procedures
R: Procedure
L: (ME1.3.2) Apply knowledge...
L: (ME3.4.2) Perform procedural tasks...
L: Use common surgical instruments...
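As a sketch, the CSV file behind a sample like the one above could look like the following. The header row is shown for clarity only (check a downloaded sample to confirm whether your template expects one), and the list item titles are abbreviated exactly as in the sample above.

```csv
Type,Title
H,Participate in Surgical Procedures
R,Procedure
L,(ME1.3.2) Apply knowledge...
L,(ME3.4.2) Perform procedural tasks...
L,Use common surgical instruments...
```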
You can use the same file for multiple procedures if they share the same heading, rubric, and list information.
Navigate to Admin > Manage Courses.
From the cog menu beside the course name, select 'CBME.'
From the tab menu below the Competency-Based Medical Education header, click 'CV Responses'.
From the list of contextual variables, click on ‘Procedure’.
Before uploading procedure attributes, make sure you are working with the correct CBME curriculum version and change the version as needed.
Beside each procedure you will see a small black up arrow under the Criteria column.
Click on the arrow to upload a .csv file of information for each procedure.
You must indicate an EPA to associate the procedure attributes with (note that you can select all as needed).
Either choose your file name from a list or drag and drop a file into place.
Click ‘Save and upload criteria’.
You will get a success message that your file has been successfully uploaded. Click 'Close'.
After procedure attributes have been uploaded for at least one EPA you will see a green checkmark in the Criteria column.
Click the green disk in the Save column to save your work. You will get a green success message at the bottom of the screen.
Repeat this process for each procedure and each relevant EPA. Remember, a program can use the same procedure attributes for multiple procedures and EPAs if appropriate.
You can view the procedure attributes already uploaded to a procedure by clicking on the green check mark in the criteria column.
You will see which EPAs have had procedure attributes added and can expand an EPA to see specific details by clicking on the chevron to the right of the EPA.
General information about cohorts/course lists can be found here.
For (P)GME programs using CBME, residents should be stored in course lists in order to easily add them as the audience of a course. Course lists are managed through Admin > Manage Cohorts. We recommend that you create a course list for each academic year and populate the course list with all learners, CBME and non-CBME, who will participate in that program for the specified time.
If you are syncing your users with another source, your course lists may be automatically populated. It will depend on the script written by developers to connect Elentra to your other source of user information.
When you create courses, you can set the Course List as the audience for the course. It is important that learners be enrolled in a current curriculum period in order for them to be able to trigger forms and have forms triggered on them.
Note that including the learner level and being able to manage it on the Enrolment tab is optional. It can be turned on or off by a developer through a setting in the database (learner_levels_enabled).
Users who have access to Admin > Manage Courses will be able to manage learner levels through the Course Enrolment tab.
Navigate to Admin > Manage Courses/Programs.
Select the appropriate course/program (if applicable).
Click the Enrolment tab.
You will see a list of learners. Use the curriculum period switcher on the right if needed.
Please note that if you store your learner levels in another system and pull them into Elentra, making a change in Elentra will not automatically update data in your other system.
To use supervisor and procedure forms and certain item types, an organization needs to have rating scales in place. Elentra includes several default rating scales but organizations can also add their own.
There are three types of rating scales: default, Global Assessment, and MS/EC (Milestone Scale/Enabling Competency). There is no user interface to modify the scale types, and both Global Assessment and MS/EC are used on form templates including supervisor and procedure form templates.
There is a user interface to manage rating scales, but only System Administrators and users with the Medtech > Admin group and role can modify the available scales. Note that scales are applied to an entire organization and can be accessed by multiple programs. Managing rating scales is part of the Assessment and Evaluation module; please see the additional help resources.
Note that as of ME 1.14, when a new rating scale is added by an organization there is some developer work that must be completed to make the rating scale available on form templates. If you are a developer, please see the resource below.
Elentra is designed to automatically populate the item text when you select a global rating scale. Using different scales will result in different questions populating the form. See examples below.
You can make minor edits to existing EPAs and milestones through the user interface.
Navigate to Admin > Manage Courses.
From the cog menu beside the course name, select 'CBME.'
From the tab menu below the Competency-Based Medical Education header, click on Import CBME Data.
To make minor edits to EPA titles, detailed descriptions, or entrustment statements:
Click "Actions"
Click "Edit EPAs"
To make minor edits to Milestones:
Click "Actions"
Click "Edit Milestones"
Edit the EPA text, detailed descriptions, or Entrustment as required for each EPA. Then click "Save Changes" for each EPA that you have changed.
Note that this should only be used to make small modifications to this information and should not change the meaning of the text.
If you edit an EPA, it will be updated automatically across the system including on all completed forms. If you edit any milestones, you will have to republish forms on which they are used. Edits made to milestones will not be made on already published forms.
A supervisor form is used to give a learner feedback on a specific EPA and can be triggered by a learner or supervisor. Once an EPA is selected, the form displays the relevant milestones to be assessed. A supervisor can indicate a learner’s progress for each milestone that was observed and can provide a global entrustment rating. Comments can be made optional, prompted or mandatory in each section of the form.
When you create a supervisor form template and publish it, the system automatically looks at the EPAs, milestones, and contextual variables selected and generates the appropriate number of forms. If you keep 3 EPAs on the supervisor form template, the system will generate 3 unique forms (one per EPA) that are available to be triggered by a user.
You need to be logged in with a Program Coordinator, Program Director, or staff:admin role to access Admin > Assessment & Evaluation.
Navigate to Admin > Assessment & Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form name and select the appropriate form type from the dropdown menu. Select ‘Supervisor Form’.
If you have permission to access multiple programs, use the dropdown menu to select the appropriate program for this form. This option will only show up if you have access to multiple programs.
Click 'Add Form'.
Add additional form template information as required:
Template Title: Edit the form template title/name if needed.
Description: The form description can be used to store information for administrative purposes, but it is not seen by users completing the form.
Form Type: The form type was set when you created the form and cannot be changed here.
Course/Program: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Permissions: Anyone added under permissions will have access to edit and copy the form. You may wish to include a program in the permissions field so that you can filter by this form type later on.
To add individual, program, or organisation permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organisations to the permissions list as needed.
Include Instructions: Check this to open a rich text editor where you can provide instructions about the form to users. The instructions will display at the top of forms built from this template. The same instructions will apply to all forms published from this form template.
Select which EPAs can be assessed using forms generated from this template.
All EPAs assigned to a course are included on the template by default.
To remove EPAs, click the 'x' to the left of the EPA code. You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click the grey badge beside an EPA to display a list of all the milestones mapped to that EPA (all are selected by default). Remove milestones as needed and then click 'Save and Close’.
Deleting unnecessary milestones is one way to reduce the length of the form and reduce the time required to complete it.
Modify the milestones for each EPA as needed. Elentra does not enforce a maximum number of selected milestones per EPA.
Click 'Save'.
If you want all EPAs to have the same available contextual variables, leave all EPAs checked off.
If you’d rather specify which contextual variables apply to which EPAs simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
Select which contextual variables you want to include with which EPAs by checking and unchecking the tick boxes.
You may only select between 1 and 6 contextual variables per EPA per supervisor form.
By default, all of the options within a contextual variable are included on any forms made from the template.
Click the grey button beside a contextual variable to view the available contextual variable responses.
To remove specific responses from this template, deselect them. For convenience, you can also use ‘Check All’ and ‘Uncheck All’.
To allow users to select multiple CV responses when completing a form, check the multi-select box. If selected, the item created on the form will be a dropdown that accepts multiple responses instead of a single response.
When you have made the required changes, click the blue ‘Save and Close’ button.
If you modify which contextual variable response options will be available on a template, the number in the grey badge will show how many responses have been included out of the total possible responses.
Click 'Save'.
All contextual variables will display on this list, even if a program doesn't have contextual variable responses set for that variable. If you attempt to select a contextual variable for which there are no responses set, you will get an error message that reads "No objectives found to display." Click the X on the red message to remove it and then select a different contextual variable to use.
Use the first dropdown menu to select the scale you want to use to assess enabling competencies or milestones. (Scales can be configured by a medtech:admin user via Admin > Assessment and Evaluation.)
Indicate whether comments are disabled, optional, mandatory, or prompted.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
The default response feature allows you to prepopulate a form with the selected response. This can reduce the time required to complete the form.
The Responses fields will be automatically populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
From the first dropdown menu, select a Global Rating Scale.
Enter Item Text if needed.
Elentra allows organizations to optionally auto-populate the Item Text based on the selected scale. If you do not see Item Text, your organization may require some additional configuration by a developer.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted.
The Responses fields will be auto-populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed except by a developer. These items will be added to all forms published from this template.
Click 'Publish' to make forms generated by this template available for use. Remember that the number of forms that will be created from a template depends on the number of EPAs assigned to the template.
Once a form template has been published, you can rearrange the template components for each form; however, you cannot make changes to the scales or contextual variables. To make these changes, copy the form template and create a new version.
Please note that a behind the scenes task needs to run before your forms will be published. At some schools this may take up to an hour so expect a slight delay between when you publish a form and when it is available to be triggered by users.
Building assessment forms for learners and faculty to access is an important step in using the CBME tools.
Two options for building forms exist:
Building forms with a template (e.g. supervisor, procedure, and field note forms)
Building forms without a template (e.g. generic, periodic performance assessment (PPA) or rubric form)
Using a form template allows Elentra to quickly generate forms pulling in the relevant milestones, contextual variables and rating scales defined by administrative staff. After form templates are published, they can be triggered by faculty and learners at any point. Form templates have defined items included, and administrators cannot add additional items to a form template. Examples of form templates are the supervisor form, procedure form, and field note form.
You can also build forms without a template which gives you increased flexibility to add items of your own design. Examples include the generic form, rubric form, and periodic performance assessment form. The rubric and PPA have some required elements but you can also add additional items to these forms. As long as one item added to a rubric or PPA form is linked to an EPA the form will be triggerable by faculty and learners.
If a form is not something that faculty or learners will trigger themselves, program administrators can create forms and send them out to be completed via distributions. A distribution defines who or what is being assessed/evaluated, who is completing the form, and when it is to be completed. Building items and forms to use with distributions is not a requirement if you only want to use triggerable forms. For additional information on building and distributing forms that are not triggerable, please see the Assessment and Evaluation help resources.
For Staff and Faculty groups to have access to the CBME features in the Distribution wizard, a developer will need to add an additional `resource_type` called `CBME` to the `elentra_auth.acl_permissions` table, covering the organisation that has enabled CBME (if your instance of Elentra has more than one organisation) and the two groups of staff and faculty, with an `app_id` of 1 and only a read permission.
An example of this entry for an instance where CBME is enabled in organisation_id 1 and 3:
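No exact statement is reproduced here, so the following is only a hypothetical sketch of what such an entry could look like: the `resource_type` value, `app_id`, and read-only permission come from the description above, while every other column name and value is an assumption that a developer must verify against the actual `elentra_auth.acl_permissions` schema before making any change.

```sql
-- Hypothetical sketch only. Column names other than resource_type and app_id are
-- assumptions; verify against your elentra_auth.acl_permissions table structure.
INSERT INTO elentra_auth.acl_permissions
    (resource_type, entity_type, entity_value, app_id, `read`, `create`, `update`, `delete`)
VALUES
    ('CBME', 'group:role', 'staff',   1, 1, 0, 0, 0),   -- staff group, read only
    ('CBME', 'group:role', 'faculty', 1, 1, 0, 0, 0);   -- faculty group, read only
-- If your instance has multiple organisations (e.g. organisation_id 1 and 3),
-- scope or repeat these rows however your schema represents organisation-level permissions.
```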
Setting up Academic Advisor Groups will allow the Academic Advisors to access the CBME Dashboard and assessments of their affiliated learners. Academic Advisor Groups rely on the Elentra Course Groups feature and are not specific to CBME-enabled organizations.
Please see the Course Groups help resources for information on configuring Course Groups.
After clicking on an EPA you will see something like this:
There is no user interface to configure the item text used with each scale. This is controlled through the database; if you need to create a scale and will be using it as a global rating scale please speak to your project manager or a developer.
After creating forms using a form template, you can reorder form items on individual forms. This applies to published Supervisor Forms, Field Notes, and Procedure Forms.
Open the Form Template that you used to build the form of interest.
Click on the title of the form on which you wish to re-order items.
Click and drag on the “cross” icon to re-order the assessment items.
The changes will be saved automatically. This must be done for each individual form that you wish to re-order.
To review feedback provided about the assessment tools themselves, go to Admin > Assessment and Evaluation, and click Reports from the second tab menu (look under the Assessment & Evaluation header). Under Assessments, click on Assessment Tools Feedback Report. CBME form feedback will show up there. You can filter by course and form to refine the list of feedback. If you are logged in as a PA you will only have access to forms associated with your program.
Rubrics are assessment tools that describe levels of performance in terms of increasing complexity with behaviourally anchored scales. In effect, performance standards are embedded in the assessment form to support assessors in interpreting their observations of learner performance.
If you create a rubric form and at least one item on the form is linked to an EPA the form will be triggerable by faculty and learners once published. Results of a completed rubric form are included on a learner's CBME dashboard information.
Within a given form, you can only tag curricular objectives (e.g., EPAs or milestones) from the same curriculum version. To ensure that you do not accidentally add an EPA from a different version, we recommend you create the form first and then "Create & Attach" new items to the form.
You need to be a staff:admin, staff:prcoordinator, or faculty:director to access Admin > Assessment & Evaluation to create a form.
Click Admin > Assessment & Evaluation.
Click on Forms from the subtab menu.
Click Add Form.
Provide a form name and select Rubric Form from the Form Type dropdown menu; then click Add Form.
Form Title: Form titles are visible to end-users (learners, assessors, etc.) when being initiated on-demand. Use something easily distinguishable from other forms.
Form Description: The form description can be used to store information for administrative purposes, but it is not seen by users completing the form.
Form Type: You cannot change this once you have created a form.
On-Demand Workflow: On-demand workflows enable users to initiate a form on-demand using different workflow options (EPA, Other Assessment Form, Faculty Evaluation and Rotation Evaluation).
EPA: Use for forms tagged to EPAs that you want users to be able to initiate. Contributes to CBME dashboards.
Other Assessment Form: Use for forms that you want users to be able to initiate and complete on demand without tagging to EPAs; or, for tagged forms that you don't want to appear in the EPA list. Only forms with EPAs tagged will contribute to CBME dashboards.
Course: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: Select the CBME Version that this form will be used for. After setting the version, you will only be able to tag EPAs from that version to this form.
By default, the Form Editor will load the most recent CBME version. Under "EPA Version", simply select the appropriate version. Click Save. If you want to build new forms for learners using Version 1, simply change the EPA Version to Version 1 and it will load the appropriate EPAs.
Permissions: It is highly recommended that you assign course/program-level permissions to all of your forms, as some filters rely on this setting. Additionally, using a form in a workflow requires that it be permissioned to a course.
Authorship permissions give other users access to edit the form. You can add individual authors or give permission to all administrators in a selected program or organization.
To add individual, program, or organization permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organizations to the permissions list as needed.
In order to use the "Programs" filter in the form bank and when learners initiate assessments/evaluations, you need to add Program-level permissions to each form.
Select the relevant contextual variables for this form by clicking on the checkbox. Adjust which contextual variable responses should be included by clicking on the gray badge. This allows you to deselect unneeded contextual variable responses which can make the form faster to complete for assessors.
If you want to include an Entrustment Rating on the form, click the checkbox. Select an entrustment rating scale from the dropdown menu. Note that the responses will be configured based on the scale you select. It is also possible that the Item Text will be auto-populated based on the scale you select.
For the optional Entrustment Rating, set the requirements for comments. If you select Prompted comments, you should also check off which responses are prompted in the Prompt column; if a person completing the form selects one of the checked responses, they will be required to enter a comment. Additionally, if the form is part of a distribution you'll be able to define how prompted responses should be addressed (e.g. send an email to the program coordinator whenever anyone chooses one of those response options).
The default Feedback and Concerns items will be added when the form is published.
Add form items by clicking 'Add Items', or click the down arrow for more options.
'Add Free Text' will allow you to add an instruction box.
If you add free text, remember to click Save in the top right of the free text entry area. Any free text entered will display to people using the form.
'Add Curriculum Tag Set' should not be used.
To create and add a new item, click the appropriate button.
Select the Item Type and add any item responses, if applicable.
Tag Curriculum Tags to your newly created item.
In the example below, because you are using a form that is mapped to "Version 2", the curriculum tag sets will be locked to "Version 2". This will ensure that you do not accidentally tag an EPA from a different version.
After you have added items to your form you may download a PDF, and preview or copy the form as needed.
Save your form to return to it later, or if the form is complete, click Publish. You will see a blue message confirming that a form is published. Unlike form templates which require a behind the scenes task to be published, a rubric form will be available immediately.
Rubric forms can also be scheduled for distribution through the Assessment and Evaluation module.
Users can't access the form when initiating an assessment on demand. Why is this happening?
Check that your form is permissioned to a course and has a workflow (e.g. Other Assessment) defined.
My PPA or Rubric Form is not displaying a publish button. Why is this happening?
In order for a PPA or Rubric form to be published, you must have at least one item that is mapped to part of your program's "EPA tree". The "Publish" button will only appear after you have tagged an item to an EPA, or to a milestone within an EPA, and saved the item.
Is it a requirement to publish PPA and Rubric forms?
You only need to publish PPA and Rubric forms if you wish to leverage the EPA/milestone tagging functionality in the various CBME dashboards and reporting. You are still able to use PPA and Rubric forms without tagging EPAs or milestones if you only need to distribute them, or select them using the "Other Assessment" trigger workflow. If you want any of the standard CBME items, such as the Entrustment Item, Contextual Variables, or the CBME Concerns rubric, you must tag and publish the form. Keep in mind that the assessment plan builder only supports forms that have the standard entrustment item on them - meaning only published PPA/Rubric forms.
A procedure form is an assessment tool that can be used to provide feedback on a learner’s completion of a specific procedural skill. Once a procedure is selected, specific criteria will be displayed. A procedure form can be initiated by a learner or faculty.
When you create a procedure template and publish it, the system looks at the number of EPAs and procedure contextual variable responses selected and generates the appropriate number of forms. If you keep 3 EPAs and indicate 10 procedures on the form template, the system will publish 30 forms that are available to be triggered by a user (one form per EPA per procedure).
To use the Procedure Form Template, a program must first:
Define contextual variable responses for the Procedure variable
Upload assessment criteria CSV files for each procedure. This provides the actual assessment criteria for each procedure.
You can upload different criteria (i.e., different assessment forms/items) for each procedure.
You can use the same criteria for every EPA that will assess that procedure, or you can upload different criteria for each EPA that will assess that procedure.
You need to be logged in with a Program Coordinator, Program Director, or staff:admin role to access Admin > Assessment & Evaluation.
Navigate to Admin > Assessment & Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form template name and select the form type (Procedure Form) from the dropdown menu.
If you have permission to access multiple programs, use the dropdown menu to select the appropriate program for this form. This option will only show up if you have access to multiple programs.
Click 'Add Form'.
You will be taken to the procedure form template build page.
Template Title: Enter the title of the form. This will be seen by users.
Description: The form description can be used to store information for administrative purposes, but is not seen by users completing the form.
Form Type: This was set in the previous step and cannot be edited here.
Course/Program: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Permissions: Anyone added under permissions will have access to edit the form before it is in use and use the form if they are setting up a distribution. You may wish to include a program in the permissions field so that you can filter by this form type later on.
To add individual, program, or organisation permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organisations to the permissions list as needed.
Include Instructions: Check this to open a rich text editor where you can provide instructions about the form to users (instructions will display at the top of forms built from this template). The same instructions will apply to all forms published from this form template.
Select which EPAs can be assessed using forms generated from this template.
All EPAs assigned to a course are included on the template by default.
To remove EPAs, click on the small 'x' to the left of the EPA code. You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click 'Save and Next'.
Note: You do not specify milestones for use on a Procedure Form.
By default, ‘Procedure’ will be selected as a contextual variable. This will require some additional information to be added to the system if the program you are working in hasn’t input procedure response options.
If you want all EPAs to have the same available contextual variables leave all EPAs checked off.
If you’d rather specify which contextual variables apply to which EPAs simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
In addition to 'Procedure', you may select between 1 and 6 contextual variables per EPA.
By default, all of the response options within a contextual variable are included on any forms made from the template.
Click the grey button beside a contextual variable to view the available contextual variable responses.
To remove specific responses from this template, deselect them. For convenience, you can also use ‘Check All’ and ‘Uncheck All’.
To allow users to select multiple CV responses when completing a form, check the multi-select box. If selected, the item created on the form will be a dropdown that accepts multiple responses instead of a single response.
When you have made the required changes, click the blue ‘Save and Close’ button.
If you modify which contextual variable response options will be available on a template, the number in the grey badge will show how many responses have been included out of the total possible responses.
Click 'Save'.
Setting a Scale
Use the first dropdown menu to select the scale you want to use to assess enabling competencies or milestones. (Scales can be configured by a medtech:admin user via Admin > Assessment and Evaluation.)
Indicate whether comments are disabled, optional, mandatory, or prompted.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
The default response feature allows you to prepopulate a form with the selected response. This can reduce the time required to complete the form.
The Responses fields will be automatically populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
From the first dropdown menu, select a Global Rating Scale.
Enter Item Text if needed.
Elentra allows organizations to optionally auto-populate the Item Text based on the selected scale. If you do not see Item Text, your organization may require some additional configuration by a developer.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted.
The Responses fields will be auto-populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed except by a developer.
When the form is complete, a green bar will tell you the form can be published.
Click 'Publish' to make your template available for use.
Once a form template has been published, you can rearrange the template components for each form; however, you cannot make changes to the scales or contextual variables. To make these changes, copy the form template and create a new version.
Please note that a behind the scenes task needs to run before your forms will be published. At some schools this may take up to an hour so expect a slight delay between when you publish a form and when it is available to be triggered by users.
In some programs residents may be required to log multiple procedures or encounters and only have a subset of those logged entries be assessed. Elentra does support a logbook outside the CBME module and some programs have opted to have residents use both tools to capture the full picture of residents' progress. For more detail on Elentra's logbook, please see here.
A Periodic Performance Assessment (PPA) Form is designed to capture longitudinal, holistic performance trends. At least one item on a PPA form must be linked to an EPA for the form to be initiated on demand by users.
Within a given form, you can only tag curricular objectives (e.g., EPAs or milestones) from the same curriculum version. To ensure that you do not accidentally add an EPA from a different version, we recommend you create the form first and then "Create & Attach" new items to the form.
You need to be a staff:admin, staff:prcoordinator, or faculty:director to access Admin > Assessment & Evaluation to create a form.
Click Admin > Assessment & Evaluation.
Click on Forms from the subtab menu.
Click Add Form.
Provide a form name and select PPA Form from the Form Type dropdown menu; then click Add Form.
Form Title: Form titles are visible to end-users (learners, assessors, etc.) when being initiated on-demand. Use something easily distinguishable from other forms.
Form Description: The form description can be used to store information for administrative purposes, but it is not seen by users completing the form.
Form Type: You cannot change this once you have created a form.
On-Demand Workflow: On-demand workflows enable users to initiate a form on-demand using different workflow options (EPA, Other Assessment Form, Faculty Evaluation and Rotation Evaluation).
EPA: Use for forms tagged to EPAs that you want users to be able to initiate. Contributes to CBME dashboards.
Other Assessment Form: Use for forms that you want users to be able to initiate and complete on demand without tagging to EPAs; or, for tagged forms that you don't want to appear in the EPA list. Only forms with EPAs tagged will contribute to CBME dashboards.
Course: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: Select the CBME Version that this form will be used for. After setting the version, you will only be able to tag EPAs from that version to this form.
By default, the Form Editor will load the most recent CBME version. Under "EPA Version", simply select the appropriate version. Click Save. If you want to build new forms for learners using Version 1, simply change the EPA Version to Version 1 and it will load the appropriate EPAs.
Permissions: Authorship permissions give other users access to edit the form. You can add individual authors or give permission to all administrators in a selected program or organization. It is highly recommended that you assign course/program-level permissions to all of your forms, as some filters rely on this setting.
To add individual, program, or organization permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list.
You can add multiple individuals, programs, and organizations to the permissions list as needed.
In order to use the "Programs" filter in the form bank and when learners initiate assessments/evaluations, you need to add Program-level permissions to each form.
Select the relevant contextual variables for this form by clicking on the checkbox. Adjust which contextual variable responses should be included by clicking on the gray badge. This allows you to deselect unneeded contextual variable responses which can make the form faster to complete for assessors.
If you want to include an Entrustment Rating on the form, click the checkbox. Select an entrustment rating scaled from the dropdown menu. Note that the responses will be configured based on the scale you select. It is also possible that the Item Text will be auto-populated based on the scale you select.
For the optional Entrustment Rating, set the requirements for comments. If you select Prompted comments, you should also check off which responses are prompted in the Prompt column; if a person completing the form selects one of the checked responses, they will be required to enter a comment. Additionally, if the form is part of a distribution you'll be able to define how prompted responses should be addressed (e.g. send an email to the program coordinator whenever anyone chooses one of those response options).
The default Feedback and Concerns items will be added when the form is published.
Add form items by clicking 'Add Items', or click the down arrow for more options.
'Add Free Text' will allow you to add an instruction box.
If you add free text, remember to click Save in the top right of the free text entry area. Any free text entered will display to people using the form.
'Add Curriculum Tag Set' should not be used.
To create and add a new item, click the appropriate button.
Select the Item Type and add any item responses, if applicable.
Tag Curriculum Tags to your newly created item.
In the example below, because you are using a form that is mapped to "Version 2", the curriculum tag sets will be locked to "Version 2". This will ensure that you do not accidentally tag an EPA from a different version.
At least one item added to the PPA form must be linked to an EPA in order for the form to be initiated on-demand by users.
After you have added items to your form you may download a PDF, and preview or copy the form as needed.
Save your form to return to it later, or if the form is complete, click Publish. You will see a blue message confirming that a form is published. Unlike form templates, which require a behind the scenes task to run before they are published, a PPA form will be available immediately.
PPA forms can also be scheduled for distribution through the Assessment and Evaluation module.
My PPA or Rubric Form is not displaying a publish button. Why is this happening?
In order for a PPA or Rubric form to be published, you must have at least one item that is mapped to part of your program's "EPA tree". The "Publish" button will only appear after you have tagged an item to an EPA, or to a milestone within an EPA, and saved the item.
Is it a requirement to publish PPA and Rubric forms?
You only need to publish PPA and Rubric forms if you wish to leverage the EPA/milestone tagging functionality in the various CBME dashboards and reporting. You are still able to use PPA and Rubric forms without tagging EPAs or milestones if you only need to distribute them, or select them using the "Other Assessment" trigger workflow. If you want any of the standard CBME items, such as the Entrustment Item, Contextual Variables, or the CBME Concerns rubric, you must tag and publish the form. Keep in mind that the assessment plan builder only supports forms that have the standard entrustment item on them - meaning only published PPA/Rubric forms.
Do I need to publish a PPA before being able to attach it to a distribution?
No, the distribution wizard does not require you to publish the form before being able to attach it to a distribution. As long as you do not want to tag EPAs or Milestones, or have the form be reported in the CBME Dashboards, then you do not need to publish it.
Form embargo is an optional Elentra feature and must be enabled through a database setting (cbme_assessment_form_embargo).
When creating and editing forms the embargo option will only appear if the above setting is enabled.
If embargoed forms exist and the setting is later turned off, any new assessments created with the form will not be embargoed.
An embargo can be applied to CBME forms that are permissioned to a course (e.g. Rubric Form). It does not currently apply to Form Templates.
If an embargo is applied to a form, tasks using that form will not be released to the target until specific release conditions as set by the form creator have been met. (Note that the distribution wizard provides a similar option for peer-assessment tasks within a specific distribution.)
There are three options available when setting an embargo.
A Course Director or Program Director must release the assessment
To set up manual release, just check the "Embargo" checkbox. The assessment responses will be hidden from the learner until a program coordinator or course director for the course the assessment belongs to goes to the task and releases it to the learner.
You can optionally choose to send email notifications for tasks completed using this form. Check off the appropriate box(es) to enable email notifications.
When tasks with a manual release requirement are completed, Program Coordinators and Program Directors will be able to view the task and optionally release it to the target or the Competency Committee Member.
If tasks are released to learners, they will display under Tasks Completed on Me (information re: CBME dashboard pending).
If tasks are released to CC members, they will be visible when the CC member views the learner's Assessments tab (information re: CBME dashboard pending).
Tasks generated using this form will never be available to the target; only program coordinators and program directors will be able to view the completed tasks.
To set up a permanent embargo, check off "Embargo", AND check off "Permanent Embargo?"
Tasks generated using this form will not be available to the target unless the target has completed at least one task using one or more other specific forms.
To set up this option, click on "add completion requirement". That will make the responses become visible to the learner as soon as the other form (selected from the dropdown) is complete.
Tasks that are embargoed will display an embargoed label on the target card.
Embargoed forms can be used in distributions or on-demand assessments.
All tasks generated using an embargoed form will have these embargo release conditions applied, regardless of on-demand completion method (e.g. complete and confirm via PIN, email blank form, etc.)
It is possible for an embargo to be applied even if a task related to the completed assessment would normally require action from the assessee.
Since the embargo release function is carried out by course contacts, this feature is only available to forms which have a course relation.
Embargoed form behaviour conflicts with another Elentra feature: the ability for evaluators to immediately release their completed task to the target. Because of this, a check was added to ensure that this question does not appear if the task uses an embargoed form.
PAs can enter completed assessment forms on behalf of assessors. This allows faculty to complete a pen and paper version of a form and have the data entered into Elentra.
To enter a completed form on someone's behalf:
Navigate to Admin>Assessment & Evaluation.
Click on the green 'Record Assessment' button below the orange Assessment Tasks heading.
Select a resident (you will need to know the curriculum period the learner is in) and assessor from the searchable dropdown menus.
Select a CBME Version if necessary.
Select a Date of Encounter (i.e., the day the form was completed).
Select an EPA as you would to initiate a form. Filters and search are available.
Search for the appropriate form. You can preview the form to make sure it is the one you want or click 'Begin Assessment' to start a form.
You will be submitting the assessment on behalf of the selected assessor. There is a reminder of the selected assessor displayed at the top of the form in a yellow information bar.
Complete the form and click Submit.
You will get a green success message and be returned to the assessment entry screen to complete another form if needed.
This feature allows residents to cognitively situate their assessors by adding an optional note to any assessment that will be emailed (complete and confirm via email; email blank form; self-assessment, then email blank form). It can be used to remind the assessor about specific case details, provide a focus for assessment, or anything else that the resident feels the assessor should know before completing the assessment.
The cue is optional; if provided, it will appear at the top of the assessment for the assessor. The cue stays attached to the assessment and is visible from the CBME dashboard for reference, and it can also be seen on the completed assessment by both the resident and the faculty member.
For “Email blank form,” the cue modal will pop up from the Trigger Assessment page after “Send Assessment” has been clicked.
For “Complete and Confirm via Email” and “Self-assessment, then email blank form”, the cue modal will pop up after clicking “Submit and notify attending by email” at the bottom of the form.
A field note form template is used to give learners narrative feedback about their performance.
When you create a field note form template and publish it, the system automatically looks at the EPAs and contextual variables selected and generates the appropriate number of forms.
Ensure you are logged in as a staff:admin user, or as a Program Coordinator or Program Director affiliated with a program.
Navigate to Admin > Assessment & Evaluation.
Click ‘Form Templates’ on the tab menu.
Click the green ‘Add Form Template’ button in the top right and a pop-up window will appear.
Type in a form template name and select the form type from the dropdown menu. Select ‘Field Note Form.’
If you have permission to access multiple programs, use the dropdown menu to select the appropriate program for this form (this option only appears for users with access to more than one program).
Click 'Add Form'.
You will be taken to the field note form template build page.
Template Title: This is the title of the form and will be seen by users.
Description: The form description can be used to store information for administrative purposes, but is not seen by users completing the form.
Form Type: This was set in the previous step and cannot be edited here.
Course/Program: Program coordinators and faculty directors may not have access to multiple courses, while staff:admin users are likely to. If you have access to multiple courses, make sure you've selected the correct course to affiliate the form with.
EPA Version: If you have two (or more) curriculum versions in the system, the Form Template Builder will default to loading the most recent version. In the "EPA Version" tab, simply select the appropriate version. Click Save.
If you want to build new forms for learners using Version 1, change the EPA Version to Version 1, click Save, and it will load the appropriate EPAs.
Permissions: Anyone added under permissions will have access to edit the form before it is in use and use the form if they are setting up a distribution. You may wish to include a program in the permissions field so that you can filter by this form type later on. To add individual, program, or organisation permissions, click the dropdown selector to select a category, and then begin to type in the appropriate name, clicking on it to add to the list. You can add multiple individuals, programs, and organisations to the permissions list as needed.
Include Instructions: Add additional text at the beginning of the form by clicking the small tick box beside ‘Include Instructions.’ This will open a rich text editor where you can enter text, images, hyperlinks, etc. This information will display to users when they complete forms published from this blueprint.
Specify which EPAs can be assessed using forms generated from this template.
All EPAs assigned to a course are included on the template by default.
To remove EPAs, click on the small 'x' to the left of the EPA code.
You can add back any required EPAs by clicking on the dropdown menu and checking off the tick box for a desired EPA.
Click the grey badge beside an EPA to select or remove specific milestones for forms built from this template.
Click 'Save'.
If you want all EPAs to have the same available contextual variables leave all EPAs checked off. If you’d rather specify which contextual variables apply to which EPAs simply uncheck an EPA and it will appear below with its own customizable list of contextual variables.
Select which contextual variables you want to include with which EPAs by checking and unchecking the tick boxes.
You can remove specific contextual variable responses by clicking on the grey button beside a contextual variable.
For convenience, you can also use ‘Check All’ and ‘Uncheck All’.
When you modify which contextual variable response options will be available on a template, the number in the grey badge will show how many responses have been included out of the total possible responses.
To allow users to select multiple CV responses when completing a form, check the multi-select box. If selected, the item created on the form will be a drop down multiple responses type, instead of single response type.
When you have made the required changes, click the blue ‘Save and Next’ button.
You may only select between 1 and 6 contextual variables per EPA per form.
All field note form templates include a Continue and Consider section in which faculty can record comments to provide feedback to learners. These sections cannot be edited in the Field Note Form Template.
From the first dropdown menu, select a Global Rating Scale.
Enter Item Text if needed.
Elentra allows organizations to optionally automatically populate the Item Text based on the selected scale. If you do not see Item Text prepopulated and you would like to, you'll need to speak to a developer about making that change.
From the second dropdown menu, indicate whether comments are disabled, optional, mandatory, or prompted.
Disabled - Comments are disabled at the milestone level.
Optional - An optional comment box will appear for each milestone. Comments are not required for a form to be submitted.
Mandatory - A mandatory comment box will appear for each milestone. Comments are required for all milestones before a form can be submitted.
Prompted - A mandatory comment box will appear only for the prompted responses indicated. This is a popular option for responses at the lower end of a scale.
The Responses fields will be auto-populated depending on the scale selected in the first dropdown menu.
Click 'Save'.
On each form template you create you’ll notice a greyed out area at the bottom including Next Steps, Concerns, and a place for feedback. This is default form content that cannot be changed except by a developer.
Click 'Publish' to make your template available for use.
Once a form template has been published, forms created from it will live on the resident dashboard and can no longer be edited. The number of forms that will be created from a template depends on the number of EPAs assigned to the template.
From the Dashboard, faculty click the green 'Start Assessment/Evaluation' button on the right side of the screen.
First, faculty select an On-Demand Workflow, which will dictate which forms become available to select. These options will only display if an organisation has form workflows configured. If None is the only available option, select None and continue.
After selecting an on-demand workflow, the choices a user has will depend on the workflow they are completing. In this example, we'll complete an EPA form.
Next faculty select a learner. They can begin to type a name to limit the list of options. When they mouse over a name, they can see the learner's name and photo (if uploaded).
Set a date of encounter.
If the learner is enrolled in two programs the faculty will have to specify a program.
Next faculty select an EPA.
For a reminder on what is included in a specific EPA the black question mark provides a quick link to the EPA Encyclopedia (see above).
Users can easily navigate the list of EPAs by applying preset filters including Current Stage EPAs, Current Rotation EPAs, and Priority EPAs. Click on a filter title to apply it. In the example above the Priority EPAs filter is being applied.
After an EPA is selected, the available assessment tools will be displayed. Users can search the tools by beginning to type in the search bar. Note the small clock in the top right of each tool card; this is an estimate of how long the form will take to complete based on the experience of other users.
Faculty can click 'Preview This Form' to view the form and ensure it is the one they want or they can click 'Begin Assessment' on any tool to begin.
Learners and their affiliated faculty and program administrators can track assessment form completion by navigating to the learner's CBME dashboard.
Click on the Assessment and Evaluation badge at the top of the page. Click the My Learners tab and click on CBME Dashboard below the relevant learner name.
From the learner's CBME dashboard click on Assessments. Scroll down past the Filter Options until you see a set of tabs including Completed, In Progress, Pending, and Deleted. Choose the appropriate tab to review assessment form completion.
Click on the Assessment and Evaluation badge at the top of the page. Click the My Learners tab and click on Assessments below the relevant learner name.
These pages provide information on tasks triggered by faculty and learners as well as tasks assigned through distributions, and provide access to some reporting. For more information see the Reviewing Progress > Assessments Page help section.
PAs can view a faculty member's tasks from the Assessment and Evaluation tab. It works almost the same as the learner's assessment page but is accessed from the Faculty tab.
When a PA sets up assessment and evaluation tasks to be completed via distributions, progress can quickly be viewed via the Assessment and Evaluation module.
From the Dashboard, learners have access to a green 'Start Assessment/Evaluation' button on the right side of the screen.
First, learners select an On-Demand Workflow, which will dictate which forms become available to select. These options will only display if an organisation has form workflows configured.
After selecting an on-demand workflow, the choices a learner has will depend on the workflow they are completing. In this example, we'll complete an EPA form.
Next learners select an assessor. They can begin to type a name to limit the list of options. When they mouse over a faculty name, learners can see the faculty card including their name and photo (if uploaded).
If learners need to add an external assessor (someone who doesn't have an Elentra account), they can click 'Add External Assessor'.
Next, learners set a date of encounter.
Next, learners select an assessment method. Details about each assessment method are provided inline.
In our example, completing an EPA form, learners next select an EPA to be assessed on.
For a reminder on what is included in a specific EPA the black question mark provides a quick link to the EPA Encyclopedia.
Users can easily navigate the list of EPAs by applying preset filters including Current Stage EPAs, Current Rotation EPAs, and Priority EPAs. Click on a filter title to apply it. In this example the Priority EPAs filter is being applied.
After an EPA is selected, the available assessment tools will be displayed. Learners can search the tools by beginning to type in the search bar. Note the small clock in the top right of each tool card. This is an estimate of how long the form will take to complete based on the experience of other users.
Learners can click 'Preview This Form' to view the form and ensure it is the one they want or they can click 'Begin Assessment' on any tool to begin.
Depending on the selected method of assessment, learners will either be directed to the assessment form to start it, or the form will be sent directly to the assessor.
Organizations can optionally enable a database setting to add a shortcut icon to learners' CBME dashboards beside each corresponding EPA (setting = cbme_ondemand_start_assessment_shortcut_button). If you'd like this option to be available to your learners, please speak to a developer.
The shortcut icon displays as a small play sign on each EPA card.
Using the shortcut will take the learner to the Start Assessment/Evaluation page with some information already completed.
There is a tool to allow admin users to reopen a completed assessment and edit it as needed. This is a feature that can be turned on or off in the database settings file depending on how your organization wants to use it.
If enabled, staff:admin can access this feature.
Navigate to your Assessment & Evaluation badge (beside own name in the top left).
Click on My Learners and then Assessments under a specific learner's name.
From the list of Tasks Completed on Learner click on a task to view it.
Click the "Reopen Task" button just below the form title. This will set the task to in-progress and the staff>admin or faculty will be able to adjust it.
Provide a reason to reopen the task (e.g. was accidentally deleted, was missing data, other).
Click 'Reopen Task'.
Once reopened, a user can complete the task and submit it on behalf of the original assessor, or they can forward the task to the original assessor to update.
Use with caution!
This tool should be used judiciously to ensure that residents are not "gaming" the system or pressuring anyone into changing their assessments to be more favourable.
The most commonly used tools for reviewing resident progress in CBME are the CBME Program Dashboard and individual learners' CBME dashboards. Remember that to use the CBME Program Dashboard a program must have built an assessment plan for its EPAs. At this time the CBME Program Dashboard view is only available to staff and faculty and is not visible to learners. Learners continue to use their individual CBME dashboards.
Although the CBME Program Dashboard is enabled by default, it can be disabled for specific programs if they prefer to use only individual CBME Dashboards or are not building assessment plans at this time. You will need a developer's assistance to disable the CBME Program Dashboard for a specific program/course.
Updated in ME 1.22! An additional view (the Faculty Evaluation Dashboard) was introduced, and residents are now able to access their own dashboard.
The CBME visual summary is optional and is controlled by a database setting (cbme_enable_visual_summary). To enable this for your organization, please speak to a developer.
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the visual summary, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
To access the visual summary:
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" task list icon.
Click on the "My Learners" tab
You will land on the Program Dashboard.
From the tab menu below the filter option, click Visual Summary.
You will be directed to the Visual Summary dashboard.
Toggle between the different dashboards, and/or programs as applicable.
The Normative Assessment Dashboard shows the performance of all residents in a program relative to each other and their training phases and is meant to be only viewed by Competency Committee members.
The normative dashboard presents summarized data metrics of all the residents in a program relative to each other. The data is presented as both a visual representation (left) and a tabular representation. Users are provided with an option to either view all the residents in a program or selectively view the metrics of residents in one training stage by using the dropdown at the top of the dashboard. This can be useful during Competency Committee meetings when residents are judged on their performance relative to others in their training group.
By default, the normative dashboard filters out residents without any completed EPAs. However, this behavior can be turned off by toggling the checkbox at the top of the dashboard.
The bar chart visualizes the following four metrics individually; to switch between metrics, select the corresponding radio button above the chart. Each bar represents one resident, and users can hover over a bar to see the name of the resident it represents and the value of the metric currently shown. Clicking on a bar switches the user to the resident dashboard to view a detailed breakdown of all assessments of that resident.
Total EPAs - This is a count of the total number of EPAs filled out by a resident.
Currently, the Total EPAs count only considers EPAs that have been collected on valid assessment forms. However, in a future release this count will be updated to also include EPAs completed on archived, expired, or deleted assessment forms.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements in an EPA, or meeting diverse assessor role requirements.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to complete in all the valid EPA forms in a program across the different training phases.
While the achievement rate is a personal indicator showing what proportion of the EPAs a resident attempts are achieved, the progress rate is a global indicator that shows where they are in their residency training program. A worked example follows this list.
Total EPAs vs Achieved EPAs - This chart visualizes two bars for each resident showing their total EPAs and achieved EPAs next to each other. While this metric is similar to achievement rate, it can offer a better picture of a resident’s overall performance as high achievements rates can be occasionally misleading for residents with a very low number of completed EPAs all of which are achieved.
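As a purely hypothetical illustration of the difference between the two rates: a resident who has completed 20 EPA observations, 12 of which were achieved, in a program whose assessment plans require 150 achieved EPAs in total, would have an achievement rate of 60% (12 ÷ 20) but a progress rate of only 8% (12 ÷ 150).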
Finally, there is a tabular representation of the same metrics with the ability to sort the residents in the table by clicking on the column header. By default, the residents are sorted in the table by name in an ascending order. This can be changed to a descending order by simply clicking on the column header “Name”. Similarly clicking on each column header sorts the table in an ascending order based on that particular metric and clicking the same column header again changes the order to descending.
The normative dashboard is linked to the resident dashboard, so to view a particular resident's assessments in detail, users can simply click on the bar corresponding to that resident in the bar chart or on their row in the table. This will automatically switch the user to the resident dashboard with that particular resident preselected.
The Resident Metrics Dashboard focuses on individual residents and is designed to be used by Residents and Competency Committee members.
The resident dashboard has a wealth of information that is grouped into different categories for easier comprehension. First, if you arrived at the resident dashboard by selecting a resident on the normative dashboard, their data is automatically fetched for you. However, if you manually switched over to the resident dashboard by clicking on the navigation tabs above, you will need to select a resident from the dropdown in the filter panel situated at the top of the dashboard. The dropdown contains the list of all the residents in the program with their names and their corresponding progress rates. The names of residents are further grouped by their training stage and then sorted alphabetically for easier access.
The dropdown is also an editable text box and so you can type a resident’s name partially to automatically filter the available options in the dropdown. This makes it easier to search for a particular resident in a program with many residents.
After selecting a resident, users can then click on the "GET RECORDS" button to visualize their assessment data. You might notice the small button with the calendar icon on it; this is used to highlight assessment data gained by the resident in a particular time period. For now, ignore it; we will learn more about it further down. The resident dashboard consists of four main subsections. Let us look at each one individually.
This section provides the following summarized metrics of the resident:
Total EPAs observed - Count of all EPAs filled by a resident.
This number may vary from the total EPAs count for the same resident on the normative dashboard as this number also includes assessments filled on expired/archived assessment forms and not just currently valid assessment forms.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to complete in all the valid EPA forms in a program across the different training phases.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements such as acquiring a rating of 4 or above on a 5 point scale, or satisfying specific contextual variable requirements in an EPA, or meeting diverse assessor role requirements.
To the right of the acquisition metrics is a line chart that shows the resident's weekly EPA acquisition rate for the last six months. This is meant to give a quick, high-level overview of the resident's assessment gathering in the recent past.
This section is meant for quickly looking up a resident's recent performance, with the option to view records in the following ranges: last 10 days, last 25 days, last month, and last 3 months. The chart does not visually distinguish the different EPA types (i.e., EPA-F1 vs EPA-C2); instead, it provides this and other additional information in a pop-up menu that can be invoked by hovering the mouse over a point.
The line chart provides a simple representation of the last "N" assessments filled by the resident, where every EPA is represented as a point, with the oldest record starting on the left. The points are arranged vertically using the O-Score Entrustability scale, with 5 being the highest (I did not need to be there) and 1 being the lowest (I had to do). The better a resident performs in an EPA, the higher the vertical position of the point in the chart. Background lines are used to show the 5 levels, instead of labelling the points, to reduce visual clutter, as the levels are easy to understand without additional context.
The final section provides a detailed overview of every single EPA completed by the resident. The entire list of EPAs that residents are required to complete is broken down into four groups based on the training phase during which a resident is supposed to complete them, and the groups are numbered accordingly.
Each training phase is presented as a block with the title of the training phase and a label indicating whether the resident has completed the training phase. If a training phase is in progress a completion rate is shown to indicate the number of assessments the resident has achieved in that training phase relative to the total number of required assessments for every EPA in that phase. Each training phase block acts as an accordion and can be expanded or collapsed to view the list of all EPAs in that block.
Although residents generally complete the EPAs of their current training phase before they pick up EPAs of later phases, there are exceptions. Due to various external factors such as their rotation schedules and the nature of medical cases of the patients they attend to, residents can occasionally end up completing EPAs which are not in their current training phase. This means residents can have a non-zero completion rate for training phases that they have not yet started officially. When a training block is open, all the EPAs in that block are arranged sequentially based on the numbering order in a 3-column layout as shown in the following figure.
First Column: EPA ID and textual description of the corresponding medical scenario that the EPA targets.
Second Column: The resident's acquisition metrics for each EPA are provided as four numbers along with two bullet charts that visualize how far along the resident is in completing that EPA. The first bullet chart (blue) visualizes the observed EPA count relative to the required count, while the second bullet chart visualizes the achieved EPA count relative to the required count. If an EPA is complete (the required number of EPAs has been achieved), the "TO GO" metric changes into a completed check mark icon.
Third Column: This is a visualization of all assessments filled by the resident for that particular EPA. The information is visualized similarly to the recent EPA chart discussed above. Assessments are arranged chronologically on the X axis with the oldest to the left, and are arranged vertically based on the EPA rating (5-point O-Score Entrustability scale), with 5 being the highest (resident managed the situation independently) and 1 being the lowest (assessor had to completely take over the situation). Each point in this chart can be hovered upon to view additional information about that EPA, such as narrative feedback, in an onscreen popup window.
Finally, two buttons are provided as seen in the bottom left corner of the chart. The first one (book icon) can be clicked to see all the records in a table that can be sorted and filtered by columns. To filter the table start typing in the input box at the top of each column in the table. This will dynamically filter the table as you type. To sort the table by a column simply click on the column header.
The second button (sliders icon) brings up dropdown filter lists that can be used to visually identify a particular record based on patient demographics or other contextual variables such as "Case Complexity" or "Clinical Presentation". For example, if a user wanted to see which of the records were for senior patients, they could select the 'Senior' option from the drop-down list and the corresponding points (observation scores) would turn light red.
This is a common feature across the dashboard that highlights all assessments filled in a particular time period. To enable it, head over to the filter panel at the top of the dashboard and click on the small button with the calendar icon on it. This will open a panel where you can set the start date and end date for the time period. You can either type directly into the input box or use the date selector on the calendar below.
Once the start date and end date are set, all assessments that fall in that time period are converted into diamonds across the dashboard. This provides a way to visually distinguish these EPAs while still viewing them relative to other EPAs filled outside of the selected time period. This feature can be particularly useful during competence committee meetings that happen once every three months, since the time period can be set to highlight only the EPAs filled by the resident since the last meeting.
Another way to set the time period on the dashboard is by simply clicking on rotation block in the rotation schedule. This will automatically set the start date and end date of the time period to the dates of the rotation block and all assessments filled in that rotation block are automatically highlighted.
When enabled, the checkbox provided in the date filter panel hides all EPA levels that do not have any assessments filled in the selected time period. If an entire training phase has no EPAs filled in that period, the whole training phase block is hidden as well. This can be useful to reduce the visual clutter on the dashboard and focus on only a small subset of EPAs.
From this dashboard, program directors and coordinators can see a breakdown of all assessments completed in a program by a faculty member for informing faculty development.
Updated in ME 1.21 to include additional details about assessment plan requirements.
Faculty, program administrators and residents can easily review a learner's progress from the learner's individual CBME dashboard.
Note that the CBME dashboard includes several tabs: Stages, Assessments, Assessment Items, Trends, Comments, and Pins. (Reports can be generated from the Stages page.)
There is another Assessments Dashboard that pulls in additional form information if your organization is using the Assessment and Evaluation module for other forms (e.g. program and faculty evaluations). This page discusses just the CBME Dashboard.
When logged in as a faculty member or program coordinator, click the Assessment & Evaluation badge that appears between the user's name and the logout button in the top right.
Click on 'My Learners' from the tab menu.
Search for a learner as needed, and if you can't find someone, ensure that you are in the correct curriculum period using the dropdown menu on the right.
After finding the correct learner, click on Dashboard under the learner's name to view the learner's progress. Residents automatically land on their CBME Dashboard when they log into Elentra.
From a learner's CBME Dashboard, click through the different tabs to view a range of assessment information. On most tabs, you can apply filters in order to refine the list of assessments visible. To use this option, select all the appropriate filters (e.g. date, curriculum tag, contextual variable, assessment tool, etc.) and then click 'Apply Filters'. Note that the filters a PA or PD applies on one learner will be maintained as you move through different pages.
From the stages tab you can see a summary of a learner's progress across EPAs and stages.
Under each stage is the curriculum version the learner was on for that stage.
EPAs are displayed in order of learner stage and completed stages can be collapsed.
A badge on EPA cards displays the resident’s progress towards meeting the uploaded assessment plans (Achieved/Required) and is visible to all who have access to the dashboard including the resident themselves. These numbers align with those on the CBME Program Dashboard and are updated on the same nightly schedule. Note that you can toggle between viewing all requirements and remaining requirements.
If you are using the rotation scheduler, EPAs specific to a learner's current rotation are outlined in the stage colour and the priority and likelihood of an EPA in the learner's specific rotation is shown through the exclamation mark and bar chart (unlikely, likely, very likely). Whether or not the rotation scheduler is in use, green checkmarks indicate that a stage or EPA is complete (this is set by the Competency Committee).
Click the down arrow on the right side of an EPA card to see a list of completed assessments for that EPA. Depending on the form there may be a count of the learner's global assessment rating, which you can click on to access an aggregated report of the learner's performance on this form.
From the “Stages” tab of the CBME Dashboard, click on the grey down arrow on the right side of the EPA card (“View Assessment Details” tooltip will appear on hover).
This will display the titles and total counts of all forms that have been completed on a resident for that EPA. Simply click on the form title that you wish to view aggregated data for.
This will open a new tab with the aggregated report as well as a trends chart. Within this tab, click on the form title to expand the aggregated data. If there have been multiple versions of the same form, these will aggregate separately, so you will need to click on each form version to view the data. You are also able to navigate directly to individual assessments by clicking on them within the trends chart.
Additionally, from ‘View Assessment Details’, you are able to generate an aggregated report by clicking on the entrustment rating responses. This will generate a report for only those assessments with that specific level of entrustment (e.g., to view an aggregated report of all assessments where the resident was entrusted with "Supervision Only" on that particular form).
See a list of all completed assessments and filter as desired.
Toggle between completed, in progress, pending and deleted assessments below the filter settings.
Note that Pending tasks here includes all assessments, whether or not they have expired.
On each individual assessment card, note that the form type and relevant EPA are displayed. You can also click the down arrow on the right to show some form information (e.g., global rating, comments, assessor and assessment method), and click 'View Details' on the left to see the assessment form.
Users can quickly filter for read/unread assessments.
The small grey number beside 'Assessments' in the tab menu represents all unread assessments.
From the regular list view an eye with a slash means an assessment is unread.
There is an option to mark all assessments as read on the right side above the assessment cards.
Marking assessments as read or unread is user specific so learners and faculty will see their own results.
Users can pin an assessment from this screen and learners can give an assessment a "thumbs up" to provide feedback to an assessor about their feedback.
Quickly see a list of all completed assessment items and filter as desired. Click the down arrow on the right to see comments, response descriptor and the name of the assessor (as applicable). Click View Details at the bottom left of each card to access the completed assessment form.
Users can pin an assessment item from this screen.
View trends in learner performance across global assessment ratings. Note the overall tally of ratings in the top right corner of each card. Hover over on a node on the chart to view information about the form name, date and rating; you can also click through to access the actual assessment form.
Quickly access a list of all narrative comments provided to the learner on any complete CBME assessment tool. The tool type and relevant EPA will be displayed below the form name on each comment card.
Users can pin comments from this tab.
Quickly view all assessments, items, or comments that have been pinned. Apply filters as desired, or just scroll down to find the toggle to switch between assessments, items, and comments.
Updated in ME 1.21 to include additional details about assessment plan requirements.
The program level dashboard leverages the updated assessment plan builder to provide an overview of resident progress towards meeting the plan. From within one interface, Program Directors, Program Administrators, and Academic Advisors (only assigned learners) are able to see all of the learners in their program and their progress towards meeting the plan requirements. There is currently no learner-facing side of this dashboard.
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the program-level dashboard, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
Once you have entered your assessment plan requirements into the Assessment Plan Builder, the dashboard will populate the EPA counters.
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" icon beside your name.
Click on the "My Learners" tab to view the CBME Program Dashboard.
Multiple tabs provide different progress information (EPAs, Stages, Program Stats). An advanced filter set allows programs to filter the information on each page. These filters persist across tabs.
There are currently three tabs within the Program Dashboard. See the screenshots below for examples.
Assessments By EPA: Visualizes each learner's progress towards meeting the assessment plans organized by EPA. You can view all learners in one interface.
Stage Completion Status: Visualizes each learner's progress towards meeting all EPAs within a stage. You can view all learners in one interface.
Program Stats: Currently includes a bar graph depicting how many assessments have been completed on each resident, highlighting the proportion that meet the plans.
Click on the information icon in the top right of an EPA tile to view a resident's progress to date in terms of fulfilling the contextual variable and other requirements as defined by the assessment plan. Some sample views are posted below.
Note that while requirements are incomplete, you can toggle between viewing all requirements or remaining requirements only.
Select the curriculum period that you wish to view enrolment for. This is typically the current academic year.
If you have access to more than one program, you can toggle between them using this dropdown menu.
Search learner names using free-text search.
You are able to sort the learner list by:
Learner name ("Sort by Name")
Progress towards meeting the plan ("Sort by Progress")
Total number of assessments ("Sort by Total Assessments")
Choose to sort the learner list in ascending or descending order.
Filter the learner list by Post-Graduate Year (PGY) level. You may select more than one PGY.
Filter the EPA list by Royal College Stages. You may select more than one stage.
Filter the EPA list. You may select more than one EPA.
Overall Total: Total number of assessments completed on the learner for EPAs that have an assessment plan entered.
EPA Total: The number directly beneath the EPA Code is the total number of assessments that have been completed on that EPA for that learner, regardless of whether or not they met the assessment plan requirements.
Requirements Total: The fraction indicates how many completed assessments met the assessment plan over how many assessments are required.
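For example, a Requirements Total of 3/5 would indicate that three of the learner's completed assessments count toward an assessment plan that requires five qualifying assessments (the numbers here are illustrative only).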
The resident progress dashboard is meant to give a high-level overview of your learners' progress towards meeting your assessment plans. The decision to mark EPA progress as "Approved" is made solely at the discretion of the Competence Committee.
Red: No Progress. Indicates that the learner is in that stage, but:
has not been assessed on that EPA, OR
has been assessed but none of the assessments meet plan requirements
Yellow: In Progress < 50%. Indicates that the learner has been assessed on the EPA, but is currently meeting less than 50% of the requirements
Blue: In Progress > 50%. Indicates that the learner has been assessed on the EPA and is meeting more than 50% of the requirements
Green: Achieved. Indicates that the learner has been assessed on the EPA and is currently meeting the defined assessment plan numbers; however, the progress still needs to be reviewed and approved by the competence committee.
Green: Approved (with checkmark). Indicates that the EPA has been reviewed and approved at the competence committee level.
Grey: All EPAs that are not in the learner's current stage will appear grey, even if they have assessments that count towards the assessment plan.
The assessment plan builder allows you to specify minimum requirements for assessment forms on a per-EPA basis. When you enter an assessment plan, you enter the following information for each form or group of combined forms:
Minimum number of assessments, with a global assessment rating equal to or higher than an indicated value
Minimum number of unique assessors
Contextual variable requirements, including a defined number of required responses (or a range of responses), such as a certain number of presentations or complexities
These values are then combined in the system to create the total number of required assessments. It is possible for a learner to have the correct number of required assessments for the global assessment rating without achieving the plan due to not meeting the contextual variable or unique assessor requirements.
For example, if the learner needs 5 assessments at "meets expectations" or above, in addition to being assessed on 5 different clinical presentations, the dashboard will only count the first instance of "acute injury" that "meets expectations", and will only count other clinical presentations towards the total after that. Any additional 'acute injuries' that 'meet expectations' will not be counted, since the learner still needs to be assessed on 4 more unique clinical presentations.
If a program does not want to use the CBME Program Dashboard a developer can disable it for specific programs. (Developers, the program dashboard can be disabled for a course by adding an entry cbme_progress_dashboard with a value of 0 in the course_settings table.)
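As a minimal sketch only (the course_settings column names are assumptions and should be verified against your installation), a developer might disable the dashboard for a single course along these lines:

```sql
-- Sketch only: hide the CBME Program Dashboard for one course (course_id 123).
-- Confirm the actual course_settings column names before running.
INSERT INTO `course_settings` (`course_id`, `shortname`, `value`)
VALUES (123, 'cbme_progress_dashboard', '0');
```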
Navigate to Admin>Assessment and Evaluation. PAs can switch between viewing Assessments and Evaluations and can navigate through outstanding, upcoming, and deleted tasks. PAs can also filter by delivery type and can send a reminder or delete a task from the A+E dashboard.
Note that an assessor must have a PIN set up for learners to select the first option. For more detail on setting user PINs, see the related help documentation.
When in a faculty or PA role you can also access a Milestone Report from the Stages tab. More details about the Milestone Report are provided in the Milestone Report section below.
Currently, the assessment plan builder and CBME Program Dashboard only support: Supervisor Forms, Field Notes, PPAs (with global entrustment item), Procedure Forms, and Rubric/Flex Forms (with global entrustment item). It does not currently support PPAs/Rubrics that do not have the global entrustment item added.
The CBME Program dashboard and the assessment plan builder take both form versions and EPA versions into consideration. This means that in order for the dashboard to generate correct assessment counts, you need to enter assessment plans for all active EPA versions, and in some cases, all form versions. Remember, your competence committee still has access to all learner data and can 'overrule' the system by marking an EPA as Approved in these (and other) cases. The dashboard should not be the sole source of information for competence committees.
The CBME Program Dashboard only counts assessments that have published assessment plans linked to them. In some cases, you may have assessments that were completed on older form versions that do not have a plan, or you have not yet built assessment plans for your new forms, so the dashboard does not count these assessments. Additionally, some form types are currently not supported such as PPAs and Rubrics that are not tagged to any EPAs. If a learner has gathered assessments on a previous EPA version and is now on a new version (e.g., assessed initially on F3-Version 1, but was given F3-Version 2 midstream) these archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
At most schools, the CBME Program Dashboard is updated on a once-nightly basis.
No, archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
Yes, the program dashboard does include assessments completed by external assessors.
In addition to the CBME Dashboard, users can access a learner's Assessments page to view additional tasks completed on the learner and assigned to the learner. The Assessments page reflects tasks completed via distributions and CBME forms initiated on demand by faculty and learners; however, the reporting tool accessible from Tasks Completed on Learner applies only to forms managed via distributions (for reporting on on-demand CBME forms please see the CBME Dashboard page).
Accessible by PDs and PAs from the CBME Dashboard Stages tab, the Milestone Report allows faculty and PAs to generate a milestone report for a specific learner. This report is a breakdown of completed assessments that have been tagged to milestones and EPAs. Assessments are tallied based on the milestone and EPA that they are tagged to, and the tool generates one report per unique scale. The number of unique scales is determined by the tools used to complete the included assessments. The reports are generated as CSVs and are zipped into a folder.
From a learner's CBME Dashboard Stages tab, click Milestone Report in the top right (beside Log Meeting).
1. Select a date range for the report.
2. Click "Get Tools".
3. Select the tools that you wish to view the aggregated milestone report for, or click "Select All Tools".
4. Click "Generate Report". This will open a download modal for you to select where to save the zip file.
5. Unzip the file.
6. If multiple rating scales were used to assess the milestones, there will be one CSV file for each rating scale.
7. Open a CSV file.
8. Each row represents a milestone, and each column represents the rating scale options. These rating scale options are grouped by EPA (e.g., for the Queen's Three-Point Scale, you will see 4 columns for each EPA: Not Observed, Needs Attention, Developing, and Achieving).
9. Each cell displays the total number of times that milestone was assessed for that EPA, including how many times it was rated at that level on the rating scale (e.g., 3 of 6 completed assessments were rated as "Achieving").
NOTE: Even though you may have selected only one of many Supervisor Tools (or other tools), if you used the same scale on all tools, the report will display all data for all tools that used that rating scale. We will be enhancing this in future iterations to only report on tool(s) selected.
Residents can highlight assessments that they found helpful to their learning. From the “Assessments” tab of the CBME Dashboard, a learner can click on the “thumbs up” icon to indicate to an assessor that their feedback was helpful to the resident's learning. Learners can also include a comment on why they found it helpful. This feedback is important to help assessors identify the types of feedback that residents find beneficial for their learning.
When publishing a new curriculum version within the CBME module, learners will automatically be updated to have the new curriculum version (i.e., EPAs marked as “replaced” or “changing”) for all of their upcoming stages of training. There may be a time where you would like certain learners to have stages from a specific version. This guide will instruct you on how to properly update a learner or series of learners to have stages from a specific version.
In order to update the learner(s) objectives in a timely manner there are a few things to gather before you begin the process:
1. Assemble a list of proxy_ids for all of the learners that you wish to update. This process is done on a per course basis so make sure that all of the learner proxy_ids that you compile belong to the same course.
2. Make note of the stages that you wish to be updated. The script requires the stage letter in order to know which stages to update, so make a list of the stage letters. For example, if you are updating a learner to have an old version of Transition to Discipline (D) then you will need to note D as the stage you are updating.
3. The final thing you will need is the cbme_objective_tree_version_id for the version that you wish to update to. For example, if the course you are updating is course_id 123 and you would like to update a learner to version 2 for a stage, look up the cbme_objective_tree_version_id for course 123, version 2 in the cbme_objective_tree_versions table. You will also need the cbme_objective_tree_version_id for the version that the learner is already on; the script requires that you set the version for every stage available to the learner, which is why the current versions are needed. (A sample lookup query is sketched after this list.)
4. Access to the database
5. SSH access to your production environment
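As referenced in item 3 above, the version IDs can be looked up directly in the database. This is an illustrative query only (using the course_id 123 from the example further below); inspect the cbme_objective_tree_versions table in your own install to confirm its columns before relying on it.

```sql
-- Illustrative lookup: list all curriculum tree versions for course_id 123 so the
-- correct cbme_objective_tree_version_id values can be noted for the script.
SELECT *
FROM `cbme_objective_tree_versions`
WHERE `course_id` = 123;
```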
Updating the learner stages requires a developer tool to be run from developers/tools/cbme_migration. You must have SSH access to your production server in order to complete these steps. Please Note: It’s recommended that you go through the following steps in a test/staging/development environment first so that you can ensure that the script updated the learners properly.
Steps:
1. Open up your database client and open the cbme_objective_tree_aggregates table.
2. For every learner proxy_id that you compiled ahead of time we will be deleting their cbme_objective_tree_aggregate records for the course that we are updating. Select all of the rows where tree_viewer_value is the proxy_id that we are dealing with, the tree_viewer_type is “proxy_id” and the course_id matches the course that we are using. Once you have all of that data, we are going to DELETE it from the table. Repeat this until we have deleted the aggregates for every learner in the list.
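The following is a sketch of the deletion described in step 2 for a single learner (proxy_id 1111 in course_id 123, matching the example data further below); repeat it for each proxy_id on your list, and review the matching rows with a SELECT (and back up the table) before deleting anything.

```sql
-- Review the aggregate rows for this learner/course before removing them.
SELECT * FROM `cbme_objective_tree_aggregates`
WHERE `tree_viewer_value` = '1111'
  AND `tree_viewer_type` = 'proxy_id'
  AND `course_id` = 123;

-- Delete the same rows once you have confirmed they are the right ones.
DELETE FROM `cbme_objective_tree_aggregates`
WHERE `tree_viewer_value` = '1111'
  AND `tree_viewer_type` = 'proxy_id'
  AND `course_id` = 123;
```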
3. Now that the aggregates are deleted, we can update the learners’ stages using the script. SSH into your server and navigate to the following directory: /var/www/vhosts/your installation name here/developers/tools/cbme_migration
4. Once in that directory we are going to be executing the reset-learner-tree-version.php script. Tip: if you run php reset-learner-tree-version.php --usage it will bring up the help dialogue to describe all of the options that are available. Once you have read through all of the available options you will notice that there are multiple modes that this script can be run in. For this scenario we will be using “specify” mode since we want to specify which version of stages the learners will be receiving. We must specify all stages in the --stages parameter so that the script updates them to the correct version.
As an example, if your data is this:
organisation_id = 1
course_id = 123
proxy_ids = 1111,2222,3333,4444
stages to update = C,P
current version id = 10
new version id = 20
Based on that data, the command will look something like the sketch below. Apart from --stages, --versions, and --exclude, the exact flag names may differ in your version of the tool; run the script with --usage to confirm them.
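```bash
# Sketch only: apart from --stages, --versions, and --exclude, these flag names are
# assumptions; run `php reset-learner-tree-version.php --usage` to confirm them.
php reset-learner-tree-version.php \
    --mode=specify \
    --organisation=1 \
    --course=123 \
    --learners=1111,2222,3333,4444 \
    --stages=D,F,C,P \
    --versions=10,10,20,20
```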
We do not need to provide the --exclude parameter in this case
You will notice in the command above that we have listed all 4 stages in the --stages parameter even though we are only updating C and P. This is because the script requires that all stages be specified in order to update them to the correct version. In this case we are not changing D and F, so we set them to the original version (10) in the script parameters. Each stage corresponds with a version in the --versions parameter, so in this case D will get version 10, F will get version 10, C will get version 20, and so on.
5. Once that script runs the last thing to do is to clear the cbme_objective_tree cache for the course that we are dealing with. The easiest way to do this is through the interface:
Login to your install as an administrator who has admin access to the course that we are updating
Navigate to Admin > Manage Courses (Programs) > Select the course that you are using > CBME tab > Import CBME Data
Click on the Actions button on the left side above the EPA search bar and select the Edit EPAs option.
Whenever one of these EPAs is updated, the cache is cleared for the course. So all that is required is to click save on the first EPA that is listed and the cache will be cleared. You do not need to change any of the text in the EPA that you just saved; simply saving what is already there will clear the cache.
Once you have cleared the cache for the course then the learners should see the updates on their dashboard. As mentioned before, it's recommended that you do this process in a test environment first so that you can verify the data is the way you would like it before updating production. If you do run into the scenario where you updated a learner to the wrong version then you can always repeat this process and update them to the correct version.
The easiest way to verify that the learners are on the correct version is to log in as some of the learners that were updated and compare their dashboards to the version they were set to. Usually there is a difference from one version to the next, whether it be the EPA titles or the number of EPAs.
To promote a learner to a new stage, log in as a Competence Committee member, and click on the Assessment and Evaluation badge at the top of the page (beside the green logout button).
Select the 'My Learners' tab and then click on the CBME Dashboard tab of the learner you want to promote.
On the right hand side of each row you'll see a small grey circle; click on the circle to open a dialogue box where you can enter text and mark an EPA or stage as complete.
You can also click 'View History' in the dialogue box and see any previous comments or decisions regarding learner promotion (or demotion).
Note that there are database settings you can enable to allow course directors and program coordinators to manage stages (settings: allow_course_director_manage_stage and allow_program_coordinator_manage_stage). Both settings are off by default and need to be enabled by a developer if you want to use them.
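For developers, a minimal sketch of enabling these settings follows; it assumes the settings table stores one setting name and value per row, so the column names, expected value, and any organisation scoping are assumptions to verify against your own schema before running anything.

# Hypothetical sketch only -- column names and the expected value are assumptions.
mysql elentra_me -e "UPDATE settings SET value = '1' WHERE shortname IN ('allow_course_director_manage_stage', 'allow_program_coordinator_manage_stage');"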
An archived assessment is a CBME assessment that was completed on a resident using an EPA from a previous curriculum version. Assessments are archived only when a program uploads new versions of its EPAs and a resident has already collected assessments in a stage beyond the one they are currently in. In this case, the resident still receives the new EPAs for all future stages; however, all completed assessments from the old versions of those stages/EPAs are “archived”.
From the "Stages" tab, each EPA card displays how many assessments have been archived for that EPA. Expand the card for more detail.
In the example above the learner has 3 "current" assessments, and 3 "archived" assessments from a previous EPA version.
When on the "Assessments" tab, archived assessments are identified by a grey square beside the form title; current assessments do not have the grey square. The image below shows 3 archived assessments.
Program Administrators and Faculty Directors can get an overview of others' task completion status from Admin > Assessment and Evaluation.
Navigate to Admin > Assessment and Evaluation.
Filter by task delivery type as needed.
View Outstanding, Upcoming, or Deleted tasks.
Search any list for specific forms or owners. Note: The 'Owner' is the person who is responsible for completing the form.
Hover over the Targets column to view the targets of the form.
Check the appropriate boxes and then send reminders as needed.
Check the appropriate boxes and then 'Delete Task(s)' as needed.
Click on any form to complete it on behalf of another user.
The users included in the Outstanding task list are faculty listed as "Associated Faculty" on the program setup tab, faculty who have been included in a distribution in the course, and learners in the audience of a program enrolment (also configured on the program setup tab).
New in ME 1.17! On-demand workflows enable learners to trigger forms on demand using different workflow options. Administrators can select the on-demand workflow options when building a form.
There is developer work in the database required to set up workflows. (Please see file at bottom of the page.)
Please see additional details here.
New in ME 1.20
Learners can now create meetings and upload files.
Faculty and program administrators can log meetings to maintain a record of conversations about learner progress.
In CBME-enabled organisations, users can access the Log Meeting button from the learner's individual dashboard.
Additionally, learners can access Meetings from the user icon in the top right, and faculty can access their learners' meetings from the user icon and My Learners (which will take them to the Assessment and Evaluation My Learners view).
To enter a record, click the Log Meeting button.
Click Log New Meeting.
Provide a date, enter any comments and click Create Meeting. The author is automatically recorded.
Logged meetings can have files uploaded to them, and can be edited or deleted using the tools in the Actions column. A program administrator, academic advisor, or competence committee member can create entries and add to any entry they have made (but not those made by other users).
Meeting logs created by faculty and staff are visible to the learner.
Learners have quick access to view their own meeting logs from the CBME dashboard My Meetings button.
In a non-CBME enabled organization, learners can access My Meetings from the user icon in the top right.
From the user icon, click Meetings.
Click Log New Meeting.
Learners will be prompted to identify an advisor they met with. Click Browse Advisors, select a Curriculum Period (e.g. Sept. 1, 2020 - July 15, 2021) and then search for or select an advisor.
The list of advisors available to a learner is based on the course group tutors assigned to them.
Enter the date of the meeting.
Add any comments from the meeting. (This is optional.)
Click Create Meeting.
After the meeting is created, learners can optionally upload supporting files by clicking the upload icon in the Actions column.
Learners can upload files to meetings logged by other people (e.g., their advisor); however, learners can only edit or delete the meetings they created.
When learners download a file they will be prompted with:
By downloading this file, you are agreeing that you will review its contents, and your review of this file will be indicated in the My Meetings interface. Would you like to continue? Yes or No.
If they click yes, an additional column on the My Meetings interface will record the date and time the file was downloaded.
Learners or faculty can pin assessments or individual comments to keep them easily accessible for review during meetings. This can help to keep an important piece of feedback or other information front and centre until it has been discussed.
How to pin something
To pin an assessment, simply click on the pushpin icon that appears to the right of the assessment title and details. You'll get a success message that you've pinned the item. In the example to the left, the second and third assessments have been pinned.
To pin a comment, click on the Comments tab from the CBME Dashboard and then click the pushpin beside the desired comment. In the example to the left, the second comment has been pinned.
From the CBME Dashboard, click on Pins at the end of the tabs menu. This will open a screen showing all pinned items. To unpin something, just click the pushpin. You'll see a success message that you've unpinned the item.
To pin an individual assessment item, navigate to the Assessment Items tab of the CBME dashboard. Apply filters as needed and click the pushpin icon beside an assessment item to pin it.