Currently, the assessment plan builder and CBME Program Dashboard only support: Supervisor Forms, Field Notes, PPAs (with global entrustment item), Procedure Forms, and Rubric/Flex Forms (with global entrustment item). They do not currently support PPAs or Rubrics that do not have the global entrustment item added.
The CBME Program Dashboard and the assessment plan builder take both form versions and EPA versions into consideration. This means that in order for the dashboard to generate correct assessment counts, you need to enter assessment plans for all active EPA versions and, in some cases, all form versions. Remember, your competence committee still has access to all learner data and can 'overrule' the system by marking an EPA as Approved in these (and other) cases. The dashboard should not be the sole source of information for competence committees.
The CBME Program Dashboard only counts assessments that have published assessment plans linked to them. In some cases, you may have assessments that were completed on older form versions that do not have a plan, or you have not yet built assessment plans for your new forms, so the dashboard does not count these assessments. Additionally, some form types are currently not supported such as PPAs and Rubrics that are not tagged to any EPAs. If a learner has gathered assessments on a previous EPA version and is now on a new version (e.g., assessed initially on F3-Version 1, but was given F3-Version 2 midstream) these archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
At most schools, the CBME Program Dashboard is updated on a once-nightly basis.
No, archived assessments will not display on the program dashboard since it only displays the learner's current EPA versions. To view archived assessment data, navigate to the learner's CBME dashboard.
Yes, the program dashboard does include assessments completed by external assessors.
When publishing a new curriculum version within the CBME module, learners are automatically updated to the new curriculum version (i.e., EPAs marked as “replaced” or “changing”) for all of their upcoming stages of training. There may be times, however, when you would like certain learners to have stages from a specific version. This guide explains how to properly update a learner or group of learners to have stages from a specific version.
In order to update the learners' objectives in a timely manner, there are a few things to gather before you begin the process:
1. Assemble a list of proxy_ids for all of the learners that you wish to update. This process is done on a per-course basis, so make sure that all of the learner proxy_ids you compile belong to the same course.
2. Make note of the stages that you wish to be updated. The script requires the stage letter in order to know which stages to update, so make a list of the stage letters. For example, if you are updating a learner to have an old version of Transition to Discipline (D) then you will need to note D as the stage you are updating.
3. The final thing you will need is the cbme_objective_tree_version_id for the version that you wish to update to. For example, if the course you are updating is course_id 123 and you would like to update a learner to version 2 for a stage, look up the cbme_objective_tree_version_id for course 123, version 2 in the cbme_objective_tree_versions table. You will also need the cbme_objective_tree_version_id for the version that the learner is already on. The script requires that you set the version for every stage available to the learner, which is why the current versions are needed.
4. Access to the database
5. SSH access to your production environment
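For step 3, the version lookup might look something like the following SQL sketch. The version column name is an assumption, so confirm it against the actual structure of the cbme_objective_tree_versions table on your install:

```sql
-- Hypothetical lookup: find the tree version id for course 123, version 2.
-- The 'version' column name is assumed; confirm it against your schema.
SELECT cbme_objective_tree_version_id
FROM cbme_objective_tree_versions
WHERE course_id = 123
  AND version = 2;
```

Run the same query for the learner's current version so that you have both ids on hand before starting.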
Updating the learner stages requires a developer tool to be run from developers/tools/cbme_migration. You must have SSH access to your production server in order to complete these steps. Please Note: It’s recommended that you go through the following steps in a test/staging/development environment first so that you can ensure that the script updated the learners properly.
Steps:
1. Open up your database client and open the cbme_objective_tree_aggregates table.
2. For every learner proxy_id on your list, delete the learner's cbme_objective_tree_aggregates records for the course being updated. Select all of the rows where tree_viewer_value is the learner's proxy_id, tree_viewer_type is “proxy_id”, and course_id matches the course you are updating, then DELETE those rows from the table. Repeat this until the aggregates have been deleted for every learner on the list.
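For a single learner, the review-then-delete in step 2 might look like the following sketch (the column names come from the description above; back up the table and double-check the rows returned by the SELECT before deleting anything):

```sql
-- Review the rows for proxy_id 1111 in course 123 before deleting.
SELECT *
FROM cbme_objective_tree_aggregates
WHERE tree_viewer_value = '1111'
  AND tree_viewer_type = 'proxy_id'
  AND course_id = 123;

-- Once confirmed, delete them; repeat for each proxy_id on your list.
DELETE FROM cbme_objective_tree_aggregates
WHERE tree_viewer_value = '1111'
  AND tree_viewer_type = 'proxy_id'
  AND course_id = 123;
```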
3. Now that the aggregates are deleted, we can update the learners’ stages using the script. SSH into your server and navigate to the following directory: /var/www/vhosts/your installation name here/developers/tools/cbme_migration
4. Once in that directory, you will execute the reset-learner-tree-version.php script. Tip: running php reset-learner-tree-version.php --usage brings up the help dialogue describing all of the available options. Reading through the options, you will notice that the script can be run in multiple modes. For this scenario we will use “specify” mode, since we want to specify which version of stages the learners will receive. We must specify all stages in the --stages parameter so that the script updates them to the correct versions.
As an example, if your data is this:
organisation_id = 1
course_id = 123
proxy_ids = 1111,2222,3333,4444
stages to update = C,P
current version id = 10
new version id = 20
The command will look like this:
We do not need to provide the --exclude parameter in this case
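With the data above, the invocation might look something like the following sketch. Only the --stages, --versions, and --usage options are named in this guide; the remaining option names (--mode, --org-id, --course-id, --proxy-ids) are assumptions, so run the --usage command first to confirm the exact flags on your install:

```shell
# Hypothetical invocation; confirm all option names with --usage first.
php reset-learner-tree-version.php \
    --mode specify \
    --org-id 1 \
    --course-id 123 \
    --proxy-ids 1111,2222,3333,4444 \
    --stages D,F,C,P \
    --versions 10,10,20,20
```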
You will notice in the command above that all 4 stages are listed in the --stages parameter even though we are only updating C and P. This is because the script requires that all stages be specified in order to update them to the correct versions. In this case we are not changing D and F, so we set them to the original version (10) in the script parameters. Each stage corresponds with a version in the --versions parameter, so in this case D will get version 10, F will get version 10, C will get version 20, and so on.
5. Once that script runs, the last thing to do is to clear the cbme_objective_tree cache for the course that we are dealing with. The easiest way to do this is through the interface:
Login to your install as an administrator who has admin access to the course that we are updating
Navigate to Admin > Manage Courses (Programs) > Select the course that you are using > CBME tab > Import CBME Data
Click on the Actions button on the left side above the EPA search bar and select the Edit EPAs option.
Whenever one of these EPAs is updated, the cache is cleared for the course, so all that is required is to click save on the first EPA that is listed. You do not need to change any of the text in the EPA; simply saving what is already there will sufficiently clear the cache.
Once you have cleared the cache for the course then the learners should see the updates on their dashboard. As mentioned before, it's recommended that you do this process in a test environment first so that you can verify the data is the way you would like it before updating production. If you do run into the scenario where you updated a learner to the wrong version then you can always repeat this process and update them to the correct version.
The easiest way to verify that the learners are on the correct version is to log in as some of the updated learners and compare their dashboards to the version they were set to. There is usually a difference between one version and the next, whether in the EPA titles or the number of EPAs.
Updated in ME 1.21 to include additional details about assessment plan requirements.
Faculty, program administrators and residents can easily review a learner's progress from the learner's individual CBME dashboard.
Note that the CBME dashboard includes several tabs: Stages, Assessments, Assessments Items, Trends, Comments, and Pins. (Reports can be generated from the Stages page.)
There is another Assessments Dashboard that pulls in additional form information if your organization is using the Assessment and Evaluation module for other forms (e.g. program and faculty evaluations). This page discusses just the CBME Dashboard.
When logged in as a faculty member or program coordinator, click the Assessment & Evaluation badge that appears between the user's name and the logout button in the top right.
Click on 'My Learners' from the tab menu.
Search for a learner as needed, and if you can't find someone, ensure that you are in the correct curriculum period using the dropdown menu on the right.
After finding the correct learner, click on Dashboard under the learner's name to view the learner's progress. Residents automatically land on their CBME Dashboard when they log into Elentra.
From a learner's CBME Dashboard, click through the different tabs to view a range of assessment information. On most tabs, you can apply filters in order to refine the list of assessments visible. To use this option, select all the appropriate filters (e.g. date, curriculum tag, contextual variable, assessment tool, etc.) and then click 'Apply Filters'. Note that the filters a PA or PD applies to one learner will persist as they move through different pages.
From the stages tab you can see a summary of a learner's progress across EPAs and stages.
Under each stage is the curriculum version the learner was on for that stage.
EPAs are displayed in order of learner stage and completed stages can be collapsed.
A badge on EPA cards displays the resident’s progress towards meeting the uploaded assessment plans (Achieved/Required) and is visible to all who have access to the dashboard including the resident themselves. These numbers align with those on the CBME Program Dashboard and are updated on the same nightly schedule. Note that you can toggle between viewing all requirements and remaining requirements.
If you are using the rotation scheduler, EPAs specific to a learner's current rotation are outlined in the stage colour and the priority and likelihood of an EPA in the learner's specific rotation is shown through the exclamation mark and bar chart (unlikely, likely, very likely). Whether or not the rotation scheduler is in use, green checkmarks indicate that a stage or EPA is complete (this is set by the Competency Committee).
Click the down arrow on the right side of an EPA card to see a list of completed assessments for that EPA. Depending on the form there may be a count of the learner's global assessment rating, which you can click on to access an aggregated report of the learner's performance on this form.
When in a faculty or PA role you can also access a Milestone Report from the Stages tab. More details about the Milestone Report here.
From the “Stages” tab of the CBME Dashboard, click on the grey down arrow on the right side of the EPA card (“View Assessment Details” tooltip will appear on hover).
This will display the titles and total counts of all forms that have been completed on a resident for that EPA. Simply click on the form title that you wish to view aggregated data for.
This will open a new tab with the aggregated report as well as a trends chart. Within this tab, click on the form title to expand the aggregated data. If there have been multiple versions of the same form, these will aggregate separately, so you will need to click on each form version to view the data. You are also able to navigate directly to individual assessments by clicking on them within the trends chart.
Additionally, from ‘View Assessment Details’, you are able to generate an aggregated report by clicking on the entrustment rating responses. This will generate a report for only those assessments with that specific level of entrustment (e.g., to view an aggregated report of all assessments where the resident was entrusted with “Supervision Only” on that particular form).
See a list of all completed assessments and filter as desired.
Toggle between completed, in progress, pending and deleted assessments below the filter settings.
Note that Pending tasks here include all assessments, whether or not they have expired.
On each individual assessment card, note that the form type and relevant EPA are displayed. You can also click the down arrow on the right to show some form information (e.g., global rating, comments, assessor and assessment method), and click 'View Details' on the left to see the assessment form.
Users can quickly filter for read/unread assessments.
The small grey number beside 'Assessments' in the tab menu represents all unread assessments.
From the regular list view an eye with a slash means an assessment is unread.
There is an option to mark all assessments as read on the right side above the assessment cards.
Marking assessments as read or unread is user specific so learners and faculty will see their own results.
Users can pin an assessment from this screen and learners can give an assessment a "thumbs up" to provide feedback to an assessor about their feedback.
Quickly see a list of all completed assessment items and filter as desired. Click the down arrow on the right to see comments, response descriptor and the name of the assessor (as applicable). Click View Details at the bottom left of each card to access the completed assessment form.
Users can pin an assessment item from this screen.
View trends in learner performance across global assessment ratings. Note the overall tally of ratings in the top right corner of each card. Hover over a node on the chart to view information about the form name, date, and rating; you can also click through to access the actual assessment form.
Quickly access a list of all narrative comments provided to the learner on any complete CBME assessment tool. The tool type and relevant EPA will be displayed below the form name on each comment card.
Users can pin comments from this tab.
Quickly view all assessments, items, or comments that have been pinned. Apply filters as desired, or just scroll down to find the toggle to switch between assessments, items, and comments.
An archived assessment is a CBME assessment that was completed on a resident using an EPA from a previous curriculum version. Assessments are archived only when a program uploads new versions of their EPAs but a resident has already collected assessments in the stage beyond the one they are currently in. In this case, the resident still receives the new EPAs for all future stages; however, all completed assessments from the old versions of those stages/EPAs are “archived”.
From the "Stages" tab, each EPA card displays how many assessments have been archived for that EPA. Expand the card for more detail.
In the example above the learner has 3 "current" assessments, and 3 "archived" assessments from a previous EPA version.
When on the "Assessments" tab, archived assessments are identifiable by locating the grey square beside the form title. Current assessments will not have the grey square beside the form title. The image below shows 3 archived assessments.
The most commonly used tools for reviewing resident progress in CBME are the CBME Program Dashboard and individual learners' CBME dashboards. Remember that to use the CBME Program Dashboard a program must have built an assessment plan for its EPAs. At this time the CBME Program Dashboard view is only available to staff and faculty and is not visible to learners. Learners continue to use their individual CBME dashboards.
Although the CBME Program Dashboard is enabled by default, it can be disabled for specific programs if they prefer to use only individual CBME Dashboards or are not building assessment plans at this time. You will need a developer's assistance to disable the CBME Program Dashboard for a specific program/course.
In addition to the CBME Dashboard, users can access a learner's Assessments page to view additional tasks completed on the learner and assigned to the learner. The Assessments page reflects tasks completed via distributions as well as CBME forms initiated on demand by faculty and learners; however, the reporting tool accessible from Tasks Completed on Learner applies only to forms managed via distributions (for reporting on on-demand CBME forms, please see the CBME Dashboard page).
Updated in ME 1.21 to include additional details about assessment plan requirements.
The program level dashboard leverages the updated assessment plan builder to provide an overview of resident progress towards meeting the plan. From within one interface, Program Directors, Program Administrators, and Academic Advisors (only assigned learners) are able to see all of the learners in their program and their progress towards meeting the plan requirements. There is currently no learner-facing side of this dashboard.
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the program-level dashboard, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
Once you have entered your assessment plan requirements into the Assessment Plan Builder, the dashboard will populate the EPA counters.
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" icon beside your name.
Click on the "My Learners" tab to view the CBME Program Dashboard.
Multiple tabs provide different progress information (EPAs, Stages, Program Stats). An advanced filter set allows programs to filter the information on each page. These filters persist across tabs.
There are currently three tabs within the Program Dashboard. See the screenshots below for examples.
Assessments By EPA: Visualizes each learner's progress towards meeting the assessment plans organized by EPA. You can view all learners in one interface.
Stage Completion Status: Visualizes each learner's progress towards meeting all EPAs within a stage. You can view all learners in one interface.
Program Stats: Currently includes a bar graph depicting how many assessments have been completed on each resident, highlighting the proportion that meet the plans.
Click on the information icon in the top right of an EPA tile to view a resident's progress to date in terms of fulfilling the contextual variable and other requirements as defined by the Assessment Plan. Some sample views are posted below.
Note that while requirements are incomplete, you can toggle between viewing all requirements or remaining requirements only.
Select the curriculum period that you wish to view enrolment for. This is typically the current academic year.
If you have access to more than one program, you can toggle between them using this dropdown menu.
Search learner names using free-text search.
You are able to sort the learner list by:
Learner name ("Sort by Name")
Progress towards meeting the plan ("Sort by Progress")
Total number of assessments ("Sort by Total Assessments")
Choose to sort the learner list in ascending or descending order.
Filter the learner list by Post-Graduate Year (PGY) level. You may select more than one PGY.
Filter the EPA list by Royal College Stages. You may select more than one stage.
Filter the EPA list. You may select more than one EPA.
Overall Total: Total number of assessments completed on the learner for EPAs that have an assessment plan entered.
EPA Total: The number directly beneath the EPA Code is the total number of assessments that have been completed on that EPA for that learner, regardless of whether or not they met the assessment plan requirements.
Requirements Total: The fraction indicates how many completed assessments met the assessment plan over how many assessments are required.
The resident progress dashboard is meant to give a high-level overview of your learners' progress towards meeting your assessment plans. The decision to mark EPA progress as "Approved" is made solely at the discretion of the Competence Committee.
Red: No Progress. Indicates that the learner is in that stage, but:
has not been assessed on that EPA, OR
has been assessed but none of the assessments meet plan requirements
Yellow: In Progress < 50%. Indicates that the learner has been assessed on the EPA, but is currently meeting less than 50% of the requirements
Blue: In Progress > 50%. Indicates that the learner has been assessed on the EPA and is meeting more than 50% of the requirements
Green: Achieved. Indicates that the learner has been assessed on the EPA and is currently meeting the defined assessment plan numbers; however, the progress still needs to be reviewed and approved by the competence committee.
Green: Approved (with checkmark). Indicates that the EPA has been reviewed and approved at the competence committee level.
Grey: All EPAs that are not in the learner's current stage will appear grey, even if they have assessments that count towards the assessment plan.
The assessment plan builder allows you to specify minimum requirements for assessment forms on a per-EPA basis. When you enter an assessment plan, you enter the following information for each form or group of combined forms:
Minimum number of assessments, with a global assessment rating equal to or higher than an indicated value
Minimum number of unique assessors
Contextual variable requirements, including a defined number of required responses (or a range of responses), such as a certain number of presentations or complexities
These values are then combined in the system to create the total number of required assessments. It is possible for a learner to have the correct number of required assessments for the global assessment rating without achieving the plan due to not meeting the contextual variable or unique assessor requirements.
For example, if the learner needs 5 assessments at "meets expectations" or above, in addition to being assessed on 5 different clinical presentations, the dashboard will only count the first instance of "acute injury" that "meets expectations", and will only count other clinical presentations towards the total after that. Any additional 'acute injuries' that 'meet expectations' will not be counted, since the learner still needs to be assessed on 4 more unique clinical presentations.
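The counting behaviour described above can be sketched in a few lines of Python. This is only an illustration of the rule, not Elentra's actual implementation; the data structure and field names are invented for the example:

```python
# Simplified sketch of how a dashboard might credit assessments toward a
# plan requiring unique clinical presentations. NOT Elentra's actual code;
# field names, ratings, and rules are illustrative only.

def count_toward_plan(assessments, min_rating, required_presentations):
    """Credit at most one qualifying assessment per unique clinical
    presentation, mirroring the 'acute injury' example above."""
    seen_presentations = set()
    counted = 0
    for a in assessments:
        if a["rating"] < min_rating:
            continue  # below the required global rating: never counted
        if a["presentation"] in seen_presentations:
            continue  # duplicate presentation: does not add to the total
        seen_presentations.add(a["presentation"])
        counted += 1
        if counted == required_presentations:
            break
    return counted

# The learner below has two 'acute injury' assessments at or above the
# threshold, but only the first one counts toward the five required.
assessments = [
    {"rating": 4, "presentation": "acute injury"},
    {"rating": 5, "presentation": "acute injury"},  # duplicate: skipped
    {"rating": 3, "presentation": "chest pain"},    # below threshold
    {"rating": 4, "presentation": "chest pain"},
    {"rating": 4, "presentation": "fracture"},
]
print(count_toward_plan(assessments, min_rating=4, required_presentations=5))
```

With five unique presentations required, the sample data yields a count of 3: the duplicate 'acute injury' and the below-threshold 'chest pain' are skipped, so the learner still needs two more unique presentations.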
If a program does not want to use the CBME Program Dashboard a developer can disable it for specific programs. (Developers, the program dashboard can be disabled for a course by adding an entry cbme_progress_dashboard with a value of 0 in the course_settings table.)
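For developers, the change described above might look like the following SQL sketch. The column names in course_settings are assumptions, so confirm them against your schema, and replace 123 with the relevant course_id:

```sql
-- Hypothetical column names; only the setting name 'cbme_progress_dashboard'
-- and the value 0 come from the note above. Verify before running.
INSERT INTO course_settings (course_id, shortname, value)
VALUES (123, 'cbme_progress_dashboard', '0');
```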
New in ME 1.20
Learners can now create meetings and upload files.
Faculty and program administrators can log meetings to maintain a record of conversations about learner progress.
In CBME-enabled organizations, users can access the Log Meeting button from the learner's individual dashboard.
Additionally, learners can access Meetings from the user icon in the top right, and faculty can access their learners' meetings from the user icon and My Learners (which will take them to the Assessment and Evaluation My Learners view).
To enter a record, click the Log Meeting button.
Click Log New Meeting.
Provide a date, enter any comments and click Create Meeting. The author is automatically recorded.
Logged meetings can have files uploaded to them, and can be edited or deleted using the tools in the Actions column. A program administrator, academic advisor, or competence committee member can create entries or add to any entry they have made (but not to those made by other users).
Meeting logs created by faculty and staff are visible to the learner.
Learners have quick access to view their own meeting logs from the CBME dashboard My Meetings button.
In a non-CBME enabled organization, learners can access My Meetings from the user icon in the top right.
From the user icon, click Meetings.
Click Log New Meeting.
Learners will be prompted to identify an advisor they met with. Click Browse Advisors, select a Curriculum Period (e.g. Sept. 1, 2020 - July 15, 2021) and then search for or select an advisor.
The list of advisors available to a learner is based on the course group tutors assigned to them.
Enter the date of the meeting.
Add any comments from the meeting. (This is optional.)
Click Create Meeting.
After the meeting is created, learners can optionally upload supporting files by clicking the upload icon in the Actions column.
Learners can upload files to meetings logged by other people (e.g., their advisor), however, learners can only edit or delete the meetings they created.
When learners download a file they will be prompted with:
By downloading this file, you are agreeing that you will review its contents, and your review of this file will be indicated in the My Meetings interface. Would you like to continue? Yes or No.
If they click yes, an additional column on the My Meetings interface will record the date and time the file was downloaded.
Learners or faculty can pin assessments or individual comments to keep them easily accessible for review during meetings. This can help to keep an important piece of feedback or other information front and centre until it has been discussed.
How to pin something
To pin an assessment, simply click on the pushpin icon that appears to the right of the assessment title and details. You'll get a success message that you've pinned the item. In the example to the left, the second and third assessments have been pinned.
To pin a comment, click on the Comments tab from the CBME Dashboard and then click the pushpin beside the desired comment. In the example to the left, the second comment has been pinned.
From the CBME Dashboard, click on Pins at the end of the tabs menu. This will open a screen showing all pinned items. To unpin something, just click the pushpin. You'll see a success message that you've unpinned the item.
To pin an individual assessment item, navigate to the Assessments Items tab of the CBME dashboard. Apply filters as needed and click the pushpin icon beside an assessment item to pin it.
Residents can highlight assessments that they found helpful to their learning. From the “Assessments” tab of the CBME Dashboard, a learner can click on the “thumbs up” icon to indicate to an assessor that their feedback was helpful to the resident's learning. Learners can also include a comment on why they found it helpful. This feedback is important to help assessors identify the types of feedback that residents find beneficial for their learning.
Accessible by PDs and PAs from the CBME Dashboard Stages tab, the Milestone Report allows faculty and PAs to generate a milestone report for a specific learner. This report is a breakdown of completed assessments that have been tagged to milestones and EPAs. Assessments are tallied based on the milestone and EPA that they are tagged to, and the tool generates one report per unique scale. The number of unique scales is determined by which tools were used to complete the included assessments. The reports are generated as CSVs and zipped into a folder.
From a learner's CBME Dashboard Stages tab, click Milestone Report in the top right (beside Log Meeting).
1. Select a date range for the report.
2. Click “Get Tools".
3. Select the tools that you wish to view the aggregated milestone report for, or click “Select All Tools".
4. Click “Generate Report”. This will open a download modal for you to select where to save the zip file.
5. Unzip the file.
6. If multiple rating scales were used to assess the milestones, there will be one CSV file for each rating scale.
7. Open a CSV file.
8. Each row represents a milestone, and each column represents the rating scale options. These rating scale options are grouped by EPA (e.g., for the Queen’s Three-Point Scale, you will see 4 columns for each EPA: Not Observed, Needs Attention, Developing, and Achieving).
9. Each cell displays the total number of times that milestone was assessed for that EPA, including how many times it was rated at that level on the rating scale (e.g., 3 of 6 completed assessments were rated as “Achieving”).
NOTE: Even though you may have selected only one of many Supervisor Tools (or other tools), if you used the same scale on all tools, the report will display all data for all tools that used that rating scale. We will be enhancing this in future iterations to only report on tool(s) selected.
To promote a learner to a new stage, log in as a Competence Committee member, and click on the Assessment and Evaluation badge at the top of the page (beside the green logout button).
Select the 'My Learners' tab and then click on the CBME Dashboard tab of the learner you want to promote.
On the right hand side of each row you'll see a small grey circle; click on the circle to open a dialogue box where you can enter text and mark an EPA or stage as complete.
You can also click 'View History' in the dialogue box and see any previous comments or decisions regarding learner promotion (or demotion).
Note that there is a database setting you can enable to allow course directors and program coordinators to manage stages (settings: allow_course_director_manage_stage and allow_program_coordinator_manage_stage). Both these settings are off by default and need to be enabled by a developer if you want to use them.
Updated in ME 1.22! An additional view was introduced (the Faculty Evaluation Dashboard) and residents are now able to access their own dashboard.
The CBME visual summary is optional and is controlled by a database setting (cbme_enable_visual_summary). To enable this for your organization, please speak to a developer.
IMPORTANT PREREQUISITE: Assessment Plan Builder
In order to leverage the visual summary, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
To access the visual summary:
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" task list icon.
Click on the "My Learners" tab
You will land on the Program Dashboard.
From the tab menu below the filter option, click Visual Summary.
You will be directed to the Visual Summary dashboard.
Toggle between the different dashboards, and/or programs as applicable.
The Normative Assessment Dashboard shows the performance of all residents in a program relative to each other and their training phases and is meant to be viewed only by Competency Committee members.
The normative dashboard presents summarized data metrics of all the residents in a program relative to each other. The data is presented as both a visual representation (left) and a tabular representation. Users are provided with an option to either view all the residents in a program or selectively view the metrics of residents in one training stage by using the dropdown at the top of the dashboard. This can be useful during Competency Committee meetings when residents are judged on their performance relative to others in their training group.
By default, the normative dashboard filters out residents without any completed EPAs. However, this behavior can be turned off by toggling the checkbox at the top of the dashboard.
The bar chart visualizes the following four metrics individually; to switch between metrics, select the corresponding radio button above the chart. Each bar represents one resident, and hovering over a bar shows the resident's name and the value of the current metric. Clicking a bar switches to that resident's dashboard for a detailed breakdown of all of their assessments.
Total EPAs - This is a count of the total number of EPAs filled out by a resident.
Currently, the Total EPAs count only considers EPAs that have been collected on valid assessment forms. However, in a future release this count will be updated to also include EPAs completed on archived, expired, or deleted assessment forms.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements in an EPA, or meeting diverse assessor role requirements.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to complete in all the valid EPA forms in a program across the different training phases.
While the achievement rate is a personal indicator for each resident to see what number of EPAs they attempt are achieved, the progress rate is a global indicator that shows where they are in their residency training program.
Total EPAs vs Achieved EPAs - This chart visualizes two bars for each resident, showing their total EPAs and achieved EPAs next to each other. While this metric is similar to achievement rate, it can offer a better picture of a resident’s overall performance, as high achievement rates can occasionally be misleading for residents with a very low number of completed EPAs, all of which are achieved.
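The metric definitions above can be summarized as simple ratios. The following is a minimal sketch, not Elentra's actual implementation; the function names and example counts are hypothetical and purely illustrative.

```python
# Hypothetical sketch of the normative dashboard metrics (not Elentra's code).

def achievement_rate(achieved: int, completed: int) -> float:
    """Achieved EPAs divided by the total EPAs completed by the resident."""
    return achieved / completed if completed else 0.0

def progress_rate(achieved: int, required: int) -> float:
    """Achieved EPAs divided by the total EPAs required across all
    valid EPA forms in the program, over every training phase."""
    return achieved / required if required else 0.0

# Example: a resident has completed 40 EPAs, 30 of which were achieved,
# out of 120 required across the whole program.
print(achievement_rate(30, 40))   # 0.75 -> personal indicator
print(progress_rate(30, 120))     # 0.25 -> global indicator
```

This also illustrates why the Total EPAs vs Achieved EPAs chart is useful: a resident with 2 completed and 2 achieved EPAs has a perfect achievement rate but very little overall progress.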
Finally, there is a tabular representation of the same metrics, with the ability to sort the residents by clicking on a column header. By default, residents are sorted by name in ascending order; clicking the "Name" column header changes this to descending order. Similarly, clicking any other column header sorts the table in ascending order by that metric, and clicking the same header again switches the order to descending.
The normative dashboard is linked to the resident dashboard, so to view a particular resident's assessments in detail, users can simply click on the bar corresponding to that resident in the bar chart or on their row in the table. This automatically switches the user to the resident dashboard with that resident preselected.
The Resident Metrics Dashboard focuses on individual residents and is designed to be used by residents and Competency Committee members.
The resident dashboard contains a wealth of information that is grouped into different categories for easier comprehension. First, if you arrived at the resident dashboard by selecting a resident on the normative dashboard, their data is automatically fetched for you. However, if you switched to the resident dashboard manually by clicking on the navigation tabs above, you will need to select a resident from the dropdown in the filter panel at the top of the dashboard. The dropdown lists all the residents in the program along with their corresponding progress rate. The names are grouped by training stage and then sorted alphabetically for easier access.
The dropdown is also an editable text box, so you can partially type a resident’s name to automatically filter the available options. This makes it easier to find a particular resident in a program with many residents.
After selecting a resident, users can click the "GET RECORDS" button to visualize their assessment data. You might notice the small button with the calendar icon on it; this is used to highlight assessment data gathered by the resident in a particular time period. For now, ignore it; we will learn more about it further down. The resident dashboard consists of four main subsections. Let us look at each one individually.
This section provides the following summarized metrics of the resident:
Total EPAs observed - Count of all EPAs filled by a resident.
This number may differ from the Total EPAs count for the same resident on the normative dashboard, as it also includes assessments filled on expired/archived assessment forms, not just currently valid ones.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to complete in all the valid EPA forms in a program across the different training phases.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements such as acquiring a rating of 4 or above on a 5 point scale, or satisfying specific contextual variable requirements in an EPA, or meeting diverse assessor role requirements.
To the right of the acquisition metrics is a line chart that shows the resident's weekly EPA acquisition rate for the last six months. This is meant to give a quick, high-level overview of the resident's assessment gathering in the recent past.
This section lets users quickly look up a resident's recent performance, with the option to view records in the following ranges: last 10 days, last 25 days, last month, and last 3 months. The chart does not visually distinguish the different EPA types (e.g., EPA-F1 vs. EPA-C2); instead, it provides this and other additional information in a pop-up menu that can be invoked by hovering the mouse over a point.
The line chart provides a simple representation of the last "N" assessments filled by the resident, where every EPA is represented as a point, with the oldest record on the left. The points are arranged vertically using the O-Score entrustability scale, with 5 being the highest ("I did not need to be there") and 1 being the lowest ("I had to do"). The better a resident performs on an EPA, the higher the vertical position of the point in the chart. Background lines show the five levels instead of labelling individual points, to reduce visual clutter, as the levels are easy to understand without additional context.
The final section provides a detailed overview of every single EPA completed by the resident. The entire list of EPAs that residents are required to complete is broken down into four groups based on the training phase during which residents are expected to complete them, and numbered accordingly.
Each training phase is presented as a block with the title of the training phase and a label indicating whether the resident has completed the training phase. If a training phase is in progress a completion rate is shown to indicate the number of assessments the resident has achieved in that training phase relative to the total number of required assessments for every EPA in that phase. Each training phase block acts as an accordion and can be expanded or collapsed to view the list of all EPAs in that block.
Although residents generally complete the EPAs of their current training phase before they pick up EPAs of later phases, there are exceptions. Due to various external factors such as their rotation schedules and the nature of medical cases of the patients they attend to, residents can occasionally end up completing EPAs which are not in their current training phase. This means residents can have a non-zero completion rate for training phases that they have not yet started officially. When a training block is open, all the EPAs in that block are arranged sequentially based on the numbering order in a 3-column layout as shown in the following figure.
First Column: EPA ID and textual description of the corresponding medical scenario that the EPA targets.
Second Column: The resident's acquisition metrics for each EPA are provided as four numbers along with two bullet charts that visualize how far along the resident is in completing that EPA. The first bullet chart (blue) visualizes the observed EPA count relative to the required count, while the second visualizes the achieved EPA count relative to the required count. If an EPA is complete (the required number of EPAs are achieved), the "TO GO" metric changes into a completed check mark icon.
Third Column: This is a visualization of all assessments filled by the resident for that particular EPA. The information is visualized similarly to the recent EPA chart discussed above. Assessments are arranged chronologically on the X-axis, with the oldest on the left, and vertically based on the EPA rating (5-point O-Score entrustability scale), with 5 being the highest (resident managed the situation independently) and 1 being the lowest (assessor had to completely take over). Each point can be hovered over to view additional information about that EPA, such as narrative feedback, in an onscreen popup window.
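The second-column numbers relate to each other in a straightforward way. The sketch below is a hypothetical illustration, assuming the counts described above; the function name and field names are not from Elentra.

```python
# Hypothetical sketch of the per-EPA metrics shown in the second column
# (observed, achieved, required, and the remaining "TO GO" count).

def epa_summary(observed: int, achieved: int, required: int) -> dict:
    """Summarize one EPA's progress; it is complete once the
    required number of EPAs have been achieved."""
    to_go = max(required - achieved, 0)
    return {
        "observed": observed,    # blue bullet chart: observed vs required
        "achieved": achieved,    # second bullet chart: achieved vs required
        "required": required,
        "to_go": to_go,          # replaced by a check mark when zero
        "complete": to_go == 0,
    }

# Example: 7 assessments observed, 4 achieved, 5 required -> 1 to go.
print(epa_summary(observed=7, achieved=4, required=5))
```

Note that observed can exceed required without the EPA being complete, since only achieved assessments count toward completion.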
Finally, two buttons are provided in the bottom left corner of the chart. The first (book icon) can be clicked to see all the records in a table that can be sorted and filtered by column. To filter the table, start typing in the input box at the top of any column; the table filters dynamically as you type. To sort the table by a column, simply click on the column header.
The second button (sliders icon) brings up dropdown filter lists that can be used to visually identify particular records based on patient demographics or other contextual variables such as "Case Complexity" or "Clinical Presentation". For example, if a user wanted to see which records were for senior patients, they could select the "Senior" option from the dropdown list and the corresponding points (observation scores) would turn light red.
This is a common feature across the dashboard that highlights all assessments filled in a particular time period. To enable it, head over to the filter panel at the top of the dashboard and click the small button with the calendar icon. This opens a panel where you can set the start and end dates for the time period. You can either type directly into the input box or use the date selector on the calendar below.
Once the start and end dates are set, all assessments that fall within that time period are rendered as diamonds across the dashboard. This provides a way to visually distinguish these EPAs while still viewing them relative to EPAs filled outside the selected time period. This feature can be particularly useful for competence committee meetings, which typically happen once every three months: the time period can be set to highlight only the EPAs the resident has filled since the last meeting.
Another way to set the time period on the dashboard is by simply clicking on a rotation block in the rotation schedule. This automatically sets the start and end dates of the time period to the dates of that rotation block, and all assessments filled during it are automatically highlighted.
When enabled, the checkbox in the date filter panel hides all EPA levels that do not have any assessments filled in the selected time period. If an entire training phase has no EPAs filled in that period, the whole training phase block is hidden as well. This can be useful for reducing visual clutter on the dashboard and focusing on a small subset of EPAs.
From this dashboard, program directors and coordinators can see a breakdown of all assessments completed in a program by a faculty member, to inform faculty development.