Currently, the assessment plan builder and CBME Program Dashboard only support Supervisor Forms, Field Notes, PPAs (with the global entrustment item), Procedure Forms, and Rubric/Flex Forms (with the global entrustment item). They do not currently support PPAs or Rubrics that do not have the global entrustment item added.
The CBME Program dashboard and the assessment plan builder take both form versions and EPA versions into consideration. This means that in order for the dashboard to generate correct assessment counts, you need to enter assessment plans for all active EPA versions, and in some cases, all form versions. Remember, your competence committee still has access to all learner data and can 'overrule' the system by marking an EPA as Approved in these (and other) cases. The dashboard should not be the sole source of information for competence committees.
The CBME Program Dashboard only counts assessments that have published assessment plans linked to them. In some cases, you may have assessments that were completed on older form versions that do not have a plan, or you have not yet built assessment plans for your new forms, so the dashboard does not count these assessments. Additionally, some form types are currently not supported, such as PPAs and Rubrics that are not tagged to any EPAs. If a learner has gathered assessments on a previous EPA version and is now on a new version (e.g., assessed initially on F3-Version 1, but given F3-Version 2 midstream), these archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
At most schools, the CBME Program Dashboard is updated on a once-nightly basis.
Yes, the program dashboard does include assessments completed by external assessors.
The most commonly used tools for reviewing resident progress in CBME are the CBME Program Dashboard and individual learners' CBME dashboards. Remember that to use the CBME Program Dashboard a program must have built an assessment plan for its EPAs. At this time the CBME Program Dashboard view is only available to staff and faculty and is not visible to learners. Learners continue to use their individual CBME dashboards.
Although the CBME Program Dashboard is enabled by default, it can be disabled for specific programs if they prefer to use only individual CBME Dashboards or are not building assessment plans at this time. You will need a developer's assistance to disable the CBME Program Dashboard for a specific program/course.
New in ME 1.20!
See a curriculum version on the individual CBME dashboard.
Faculty, program administrators and residents can easily review a learner's progress from the learner's individual CBME dashboard.
Note that the CBME dashboard includes several tabs: Stages, Assessments, Assessments Items, Trends, Comments, and Pins. (Reports can be generated from the Stages page.)
There is another Assessments Dashboard that pulls in additional form information if your organization is using the Assessment and Evaluation module for other forms (e.g. program and faculty evaluations). This page discusses just the CBME Dashboard.
When logged in as a faculty member or program coordinator, click the Assessment & Evaluation badge that appears between the user's name and the logout button in the top right.
Click on 'My Learners' from the tab menu.
Search for a learner as needed, and if you can't find someone, ensure that you are in the correct curriculum period using the dropdown menu on the right.
After finding the correct learner, click on Dashboard under the learner's name to view the learner's progress. Residents automatically land on their CBME Dashboard when they log into Elentra.
From a learner's CBME Dashboard, click through the different tabs to view a range of assessment information. On most tabs, you can apply filters to refine the list of assessments visible. To use this option, select all the appropriate filters (e.g. date, curriculum tag, contextual variable, assessment tool, etc.) and then click 'Apply Filters'. Note that, as a PA or PD, the filters you apply for one learner will be maintained as you move through different pages.
From the stages tab you can see a summary of a learner's progress across EPAs and stages.
Under each stage is the curriculum version the learner was on for that stage.
EPAs are displayed in order of learner stage and completed stages can be collapsed.
A badge on EPA cards displays the resident’s progress towards meeting the uploaded assessment plans (Achieved/Required) and is visible to all who have access to the dashboard including the resident themselves. These numbers align with those on the CBME Program Dashboard and are updated on the same nightly schedule.
If you are using the rotation scheduler, EPAs specific to a learner's current rotation are outlined in the stage colour, and the priority and likelihood of an EPA in the learner's specific rotation are shown through the exclamation mark and bar chart (unlikely, likely, very likely). Whether or not the rotation scheduler is in use, green checkmarks indicate that a stage or EPA is complete (this is set by the Competence Committee).
Click the down arrow on the right side of an EPA card to see a list of completed assessments for that EPA. Depending on the form there may be a count of the learner's global assessment rating, which you can click on to access an aggregated report of the learner's performance on this form.
When in a faculty or PA role you can also access a Milestone Report from the Stages tab. More details about the Milestone Report are provided below.
From the “Stages” tab of the CBME Dashboard, click on the grey down arrow on the right side of the EPA card (“View Assessment Details” tooltip will appear on hover).
This will display the titles and total counts of all forms that have been completed on a resident for that EPA. Simply click on the form title that you wish to view aggregated data for.
This will open a new tab with the aggregated report as well as a trends chart. Within this tab, click on the form title to expand the aggregated data. If there have been multiple versions of the same form, these will aggregate separately, so you will need to click on each form version to view the data. You are also able to navigate directly to individual assessments by clicking on them within the trends chart.
Additionally, from 'View Assessment Details', you are able to generate an aggregated report by clicking on the entrustment rating responses. This will generate a report for only those assessments with that specific level of entrustment (e.g., to view an aggregated report of all assessments where the resident was entrusted with "Supervision Only" on that particular form).
See a list of all completed assessments and filter as desired.
Toggle between completed, in progress, pending and deleted assessments below the filter settings.
Note that Pending tasks here include all assessments, whether or not they have expired.
On each individual assessment card, note that the form type and relevant EPA are displayed. You can also click the down arrow on the right to show some form information (e.g., global rating, comments, assessor and assessment method), and click 'View Details' on the left to see the assessment form.
Users can quickly filter for read/unread assessments.
The small grey number beside 'Assessments' in the tab menu represents all unread assessments.
From the regular list view an eye with a slash means an assessment is unread.
There is an option to mark all assessments as read on the right side above the assessment cards.
Marking assessments as read or unread is user-specific, so learners and faculty will each see their own read/unread status.
Users can pin an assessment from this screen, and learners can give an assessment a "thumbs up" to let an assessor know that their feedback was helpful.
Quickly see a list of all completed assessment items and filter as desired. Click the down arrow on the right to see comments, response descriptor and the name of the assessor (as applicable). Click View Details at the bottom left of each card to access the completed assessment form.
Users can pin an assessment item from this screen.
View trends in learner performance across global assessment ratings. Note the overall tally of ratings in the top right corner of each card. Hover over a node on the chart to view information about the form name, date and rating; you can also click through to access the actual assessment form.
Quickly access a list of all narrative comments provided to the learner on any complete CBME assessment tool. The tool type and relevant EPA will be displayed below the form name on each comment card.
Users can pin comments from this tab.
Quickly view all assessments, items, or comments that have been pinned. Apply filters as desired, or just scroll down to find the toggle to switch between assessments, items, and comments.
In addition to the CBME Dashboard, users can access a learner's Assessments page to view additional tasks completed on the learner and assigned to the learner. The Assessments page reflects tasks completed via distributions as well as CBME forms initiated on demand by faculty and learners; however, the reporting tool accessible from Tasks Completed on Learner applies only to forms managed via distributions (for reporting on on-demand CBME forms, please see the CBME Dashboard page).
To promote a learner to a new stage, log in as a Competence Committee member, and click on the Assessment and Evaluation badge at the top of the page (beside the green logout button).
Select the 'My Learners' tab and then click on the CBME Dashboard tab of the learner you want to promote.
On the right hand side of each row you'll see a small grey circle; click on the circle to open a dialogue box where you can enter text and mark an EPA or stage as complete.
You can also click 'View History' in the dialogue box and see any previous comments or decisions regarding learner promotion (or demotion).
Note that there is a database setting you can enable to allow course directors and program coordinators to manage stages (settings: allow_course_director_manage_stage and allow_program_coordinator_manage_stage). Both these settings are off by default and need to be enabled by a developer if you want to use them.
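For developers, a rough sketch of what enabling these settings might look like at the database level is below. It assumes a MySQL backend and a global settings table keyed by shortname with a value column; the table name, column names, and the value used to enable the settings are all assumptions, not confirmed, so verify the schema for your install before running anything like this.

# Hypothetical sketch only: table name, column names and the enabling value are assumptions.
# Replace the credentials and database name with your own.
mysql -u elentra_user -p elentra_db -e "
UPDATE settings
   SET value = '1'
 WHERE shortname IN ('allow_course_director_manage_stage',
                     'allow_program_coordinator_manage_stage');
"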
New in ME 1.20
Learners can now create meetings and upload files.
Faculty and program administrators can log meetings to maintain a record of conversations about learner progress.
In CBME-enabled organizations, users can access the Log Meeting button from the learner's individual dashboard.
Additionally, learners can access Meetings from the user icon in the top right, and faculty can access their learners' meetings from the user icon and My Learners (which will take them to the Assessment and Evaluation My Learners view).
To enter a record, click the Log Meeting button.
Click Log New Meeting.
Provide a date, enter any comments and click Create Meeting. The author is automatically recorded.
Logged meetings can have files uploaded to them, be edited, or be deleted using the tools in the Actions column. A program administrator, academic advisor, or competence committee member can create entries and add to any entry they have made (but not entries made by other users).
Meeting logs created by faculty and staff are visible to the learner.
Learners have quick access to view their own meeting logs from the CBME dashboard My Meetings button.
In a non-CBME enabled organization, learners can access My Meetings from the user icon in the top right.
From the user icon, click Meetings.
Click Log New Meeting.
Learners will be prompted to identify an advisor they met with. Click Browse Advisors, select a Curriculum Period (e.g. Sept. 1, 2020 - July 15, 2021) and then search for or select an advisor.
The list of advisors available to a learner is based on the course group tutors assigned to them.
Enter the date of the meeting.
Add any comments from the meeting. (This is optional.)
Click Create Meeting.
After the meeting is created, learners can optionally upload supporting files by clicking the upload icon in the Actions column.
Learners can upload files to meetings logged by other people (e.g., their advisor), however, learners can only edit or delete the meetings they created.
When learners download a file they will be prompted with:
By downloading this file, you are agreeing that you will review its contents and your review of this file will be indicated in the My Meetings interface. Would you like to continue? Yes or No.
If they click yes, an additional column on the My Meetings interface will record the date and time the file was downloaded.
Learners or faculty can pin assessments or individual comments to keep them easily accessible for review during meetings. This can help to keep an important piece of feedback or other information front and centre until it has been discussed.
How to pin something
To pin an assessment, simply click on the pushpin icon that appears to the right of the assessment title and details. You'll get a success message that you've pinned the item. In the example to the left, the second and third assessments have been pinned.
To pin a comment, click on the Comments tab from the CBME Dashboard and then click the pushpin beside the desired comment. In the example to the left, the second comment has been pinned.
From the CBME Dashboard, click on Pins at the end of the tabs menu. This will open a screen showing all pinned items. To unpin something, just click the pushpin. You'll see a success message that you've unpinned the item.
To pin an individual assessment item, navigate to the Assessments Items tab of the CBME dashboard. Apply filters as needed and click the pushpin icon beside an assessment item to pin it.
Residents can highlight assessments that they found helpful to their learning. From the “Assessments” tab of the CBME Dashboard, a learner can click on the “thumbs up” icon to indicate to an assessor that their feedback was helpful to the resident's learning. Learners can also include a comment on why they found it helpful. This feedback is important to help assessors identify the types of feedback that residents find beneficial for their learning.
Accessible to PDs and PAs from the CBME Dashboard Stages tab, the Milestone Report allows faculty and PAs to generate a milestone report for a specific learner. This report is a breakdown of completed assessments that have been tagged to milestones and EPAs. Assessments are tallied based on the milestone and EPA they are tagged to, and the tool generates one report per unique scale. The number of unique scales is determined by which tools were used to complete the included assessments. The reports are generated as CSVs and are zipped into a folder.
From a learner's CBME Dashboard Stages tab, click Milestone Report in the top right (beside Log Meeting).
1. Select a date range for the report.
2. Click "Get Tools".
3. Select the tools that you wish to view the aggregated milestone report for, or click "Select All Tools".
4. Click "Generate Report". This will open a download modal for you to select where to save the zip file.
5. Unzip the file.
6. If multiple rating scales were used to assess the milestones, there will be one CSV file for each rating scale.
7. Open a CSV file.
8. Each row represents a milestone, and each column represents a rating scale option. These rating scale options are grouped by EPA (e.g., for the Queen's Three-Point Scale, you will see 4 columns for each EPA: Not Observed, Needs Attention, Developing, and Achieving).
9. Each cell displays the total number of times that milestone was assessed for that EPA, including how many times it was rated at that level on the rating scale (e.g., 3 of 6 completed assessments were rated as "Achieving").
NOTE: Even though you may have selected only one of many Supervisor Tools (or other tools), if you used the same scale on all tools, the report will display all data for all tools that used that rating scale. We will be enhancing this in future iterations to only report on tool(s) selected.
When publishing a new curriculum version within the CBME module, learners will automatically be updated to the new curriculum version (i.e., EPAs marked as "replaced" or "changing") for all of their upcoming stages of training. There may be times when you would like certain learners to have stages from a specific version. This guide will instruct you on how to properly update a learner, or series of learners, to have stages from a specific version.
In order to update the learners' objectives in a timely manner, there are a few things to gather before you begin the process:
1. Assemble a list of proxy_ids for all of the learners that you wish to update. This process is done on a per course basis so make sure that all of the learner proxy_ids that you compile belong to the same course.
2. Make note of the stages that you wish to be updated. The script requires the stage letter in order to know which stages to update, so make a list of the stage letters. For example, if you are updating a learner to have an old version of Transition to Discipline (D) then you will need to note D as the stage you are updating.
3. The final thing you will need is the cbme_objective_tree_version_id for the version that you wish to update to. For example, if the course you are updating is course_id 123 and you would like to update a learner to version 2 for a stage, you need to look up the cbme_objective_tree_version_id for course 123, version 2 in the cbme_objective_tree_versions table (see the example query after this list). You will also need the cbme_objective_tree_version_id for the version that the learner is currently on, because the script requires that you set the version for every stage available to the learner.
4. Access to the database
5. SSH access to your production environment
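As an illustration of item 3, the version IDs can be looked up with a query along the following lines. This is a sketch only: it assumes a MySQL backend, placeholder credentials, and that the course and version number are stored in columns named course_id and version, so confirm the actual column names against your schema.

# Sketch only: adjust credentials, database name and column names to your install.
mysql -u elentra_user -p elentra_db -e "
SELECT cbme_objective_tree_version_id, course_id, version
  FROM cbme_objective_tree_versions
 WHERE course_id = 123;
"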
Updating the learner stages requires a developer tool to be run from developers/tools/cbme_migration. You must have SSH access to your production server in order to complete these steps. Please Note: It’s recommended that you go through the following steps in a test/staging/development environment first so that you can ensure that the script updated the learners properly.
Steps:
1. Open up your database client and open the cbme_objective_tree_aggregates table.
2. For every learner proxy_id that you compiled ahead of time, delete their cbme_objective_tree_aggregates records for the course that you are updating. Select all of the rows where tree_viewer_value is the learner's proxy_id, tree_viewer_type is "proxy_id", and course_id matches the course you are using, then DELETE those rows from the table (see the example below). Repeat this until the aggregates have been deleted for every learner in the list.
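For example, the deletion for the learners in a course might look like the sketch below (assuming a MySQL backend and placeholder credentials). Back up the table, or run the equivalent SELECT first, so you can confirm exactly which rows will be removed.

# Sketch only: substitute your own credentials, database name, proxy_ids and course_id.
mysql -u elentra_user -p elentra_db -e "
DELETE FROM cbme_objective_tree_aggregates
 WHERE tree_viewer_type = 'proxy_id'
   AND tree_viewer_value IN (1111, 2222, 3333, 4444)
   AND course_id = 123;
"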
3. Now that the aggregates are deleted, we can update the learners’ stages using the script. SSH into your server and navigate to the following directory: /var/www/vhosts/your installation name here/developers/tools/cbme_migration
4. Once in that directory, we will execute the reset-learner-tree-version.php script. Tip: running php reset-learner-tree-version.php --usage will bring up the help dialogue describing all of the available options. You will notice that the script can be run in multiple modes. For this scenario we will use "specify" mode, since we want to specify which version of stages the learners will receive. We must specify all stages in the --stages parameter so that the script updates them to the correct versions.
As an example, if your data is this:
organisation_id = 1
course_id = 123
proxy_ids = 1111,2222,3333,4444
stages to update = C,P
current version id = 10
new version id = 20
The command will look like this:
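The sketch below shows the general shape of the command for the example data above. Only --stages, --versions and --exclude are documented in this guide; the other option names (--mode, --org, --course, --learners) are assumptions for illustration, so run the script with --usage first and substitute the option names it reports.

# Illustrative sketch only: confirm the exact option names with --usage before running.
php reset-learner-tree-version.php \
    --mode specify \
    --org 1 \
    --course 123 \
    --learners 1111,2222,3333,4444 \
    --stages D,F,C,P \
    --versions 10,10,20,20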
We do not need to provide the --exclude parameter in this case.
You will notice that the command above lists all four stages in the --stages parameter even though we are only updating C and P. This is because the script requires that all stages be specified in order to update them to the correct versions. In this case we are not changing D and F, so we set them to the original version (10) in the script parameters. Each stage corresponds with a version in the --versions parameter, so in this case D will get version 10, F will get version 10, C will get version 20, and so on.
5. Once that script runs, the last thing to do is to clear the cbme_objective_tree cache for the course that we are dealing with. The easiest way to do this is through the interface:
Log in to your install as an administrator who has admin access to the course that we are updating.
Navigate to Admin > Manage Courses (Programs) > Select the course that you are using > CBME tab > Import CBME Data
Click on the Actions button on the left side above the EPA search bar and select the Edit EPAs option.
Whenever one of these EPAs is updated, the cache is cleared for the course. So all that is required is to click save on the first EPA that is listed, and the cache will be cleared. You do not need to change any of the text in the EPA that you just saved; simply saving what is already there will clear the cache.
Once you have cleared the cache for the course, the learners should see the updates on their dashboards. As mentioned before, it's recommended that you do this process in a test environment first so that you can verify the data is the way you would like it before updating production. If you do run into the scenario where you updated a learner to the wrong version, you can always repeat this process and update them to the correct version.
The easiest way to verify that the learners are on the correct version is to log in as some of the learners that were updated and compare their dashboards to the version they were set to. There is usually a difference from one version to the next, whether it be the EPA titles or the number of EPAs.
An archived assessment is a CBME assessment that was completed on a resident using an EPA from a previous curriculum version. Assessments are archived only when a program uploads new versions of their EPAs but a resident has already collected assessments in the stage beyond the one they are currently in. In this case, the resident still receives the new EPAs for all future stages; however, all completed assessments from the old versions of those stages/EPAs are “archived”.
From the "Stages" tab, each EPA card displays how many assessments have been archived for that EPA. Expand the card for more detail.
In the example above the learner has 3 "current" assessments, and 3 "archived" assessments from a previous EPA version.
When on the "Assessments" tab, archived assessments are identifiable by locating the grey square beside the form title. Current assessments will not have the grey square beside the form title. The image below shows 3 archived assessments.
The program level dashboard leverages the updated assessment plan builder to provide an overview of resident progress towards meeting the plan. From within one interface, Program Directors, Program Administrators, and Academic Advisors (only assigned learners) are able to see all of the learners in their program and their progress towards meeting the plan requirements. There is currently no learner-facing side of this dashboard.
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the program-level dashboard, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
Once you have entered your assessment plan requirements into the Assessment Plan Builder, the dashboard will populate the EPA counters.
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" icon beside your name.
Click on the "My Learners" tab to view the CBME Program Dashboard.
Multiple tabs provide different progress information (EPAs, Stages, Program Stats). An advanced filter set allows programs to filter the information on each page. These filters persist across tabs.
There are currently three tabs within the Program Dashboard. See the screenshots below for examples.
Assessments By EPA: Visualizes each learner's progress towards meeting the assessment plans organized by EPA. You can view all learners in one interface.
Stage Completion Status: Visualizes each learner's progress towards meeting all EPAs within a stage. You can view all learners in one interface.
Program Stats: Currently includes a bar graph depicting how many assessments have been completed on each resident, highlighting the proportion that meet the plans.
Select the curriculum period that you wish to view enrolment for. This is typically the current academic year.
If you have access to more than one program, you can toggle between them using this dropdown menu.
Search learner names using free-text search.
You are able to sort the learner list by:
Learner name ("Sort by Name")
Progress towards meeting the plan ("Sort by Progress")
Total number of assessments ("Sort by Total Assessments")
Choose to sort the learner list in ascending or descending order.
Filter the learner list by Post-Graduate Year (PGY) level. You may select more than one PGY.
Filter the EPA list by Royal College Stages. You may select more than one stage.
Filter the EPA list. You may select more than one EPA.
Overall Total: Total number of assessments completed on the learner for EPAs that have an assessment plan entered.
EPA Total: The number directly beneath the EPA Code is the total number of assessments that have been completed on that EPA for that learner, regardless of whether or not they met the assessment plan requirements.
Requirements Total: The fraction indicates how many completed assessments met the assessment plan over how many assessments are required.
The resident progress dashboard is meant to give a high-level overview of your learners' progress towards meeting your assessment plans. The decision to mark EPA progress as "Approved" is made solely at the discretion of the Competence Committee.
Red: No Progress. Indicates that the learner is in that stage, but:
has not been assessed on that EPA, OR
has been assessed but none of the assessments meet plan requirements
Yellow: In Progress < 50%. Indicates that the learner has been assessed on the EPA, but is currently meeting less than 50% of the requirements
Blue: In Progress > 50%. Indicates that the learner has been assessed on the EPA and is meeting more than 50% of the requirements
Green: Achieved. Indicates that the learner has been assessed on the EPA and is currently meeting the defined assessment plan numbers; however, the progress still needs to be reviewed and approved by the competence committee.
Green: Approved (with checkmark). Indicates that the EPA has been reviewed and approved at the competence committee level.
Grey: All EPAs that are not in the learner's current stage will appear grey, even if they have assessments that count towards the assessment plan.
The assessment plan builder allows you to specify minimum requirements for assessment forms on a per-EPA basis. When you enter an assessment plan, you enter the following information for each form or group of combined forms:
Minimum number of assessments, with a global assessment rating equal to or higher than an indicated value
Minimum number of unique assessors
Contextual variable requirements, including a defined number of required responses (or a range of responses), such as a certain number of presentations or complexities
These values are then combined in the system to create the total number of required assessments. It is possible for a learner to have the correct number of required assessments for the global assessment rating without achieving the plan due to not meeting the contextual variable or unique assessor requirements.
For example, if the learner needs 5 assessments at "meets expectations" or above, in addition to being assessed on 5 different clinical presentations, the dashboard will only count the first instance of "acute injury" that "meets expectations", and will only count other clinical presentations towards the total after that. Any additional 'acute injuries' that 'meet expectations' will not be counted, since the learner still needs to be assessed on 4 more unique clinical presentations.
If a program does not want to use the CBME Program Dashboard a developer can disable it for specific programs. (Developers, the program dashboard can be disabled for a course by adding an entry cbme_progress_dashboard with a value of 0 in the course_settings table.)
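For developers, a minimal sketch of that change is below. The course_settings table and the cbme_progress_dashboard setting come from this guide, but the column names (course_id, shortname, value) are assumptions; confirm them against your schema before applying this in production.

# Sketch only: column names are assumptions; substitute your own credentials, database name and course_id.
mysql -u elentra_user -p elentra_db -e "
INSERT INTO course_settings (course_id, shortname, value)
VALUES (123, 'cbme_progress_dashboard', '0');
"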