Two additional views were introduced in 1.24:
Program Evaluation
Program Oversight
The CBME visual summary dashboard suite is optional and is controlled by a database setting (cbme_enable_visual_summary). To enable this for your organization, please speak to a developer. There is also a corresponding cron job that needs to be run nightly to update the data shown in the visual summary dashboards. More information about this can be found in the upgrading to ME 1.24 guide.
To leverage the visual summary, your program must have assessment plans entered in Elentra. Please visit the Assessment Plan Builder lesson for more information.
Log in to Elentra as an administrator (e.g. program administrator, course director).
At the top right, click on the "Assessment & Evaluation" task list icon.
Click on the "My Learners" tab.
You will land on the Program Dashboard.
From the tab menu below the filter option, click Visual Summary.
You will be directed to the Visual Summary dashboard.
Toggle between the different dashboards, and/or programs as applicable.
Residents have access to the visual summary page only through their Learner Dashboard, where they can click on "Visual Summary" from the tab menu.
The permission system within the visual summary page restricts access by user role: most users are shown only a small subset of the dashboards, and the rest are hidden completely. Only a small set of elevated user types has access to all the dashboards.
Residents: Resident Dashboard (only their data)
Competency Committee Members/Chairs: Resident Dashboard and Normative Assessment
Course Director: Resident Dashboard, Normative Assessment, Faculty Development, and Program Evaluation
Program Coordinator: Resident Dashboard, Normative Assessment, Faculty Development, and Program Evaluation
Course Directors/Program Coordinators with access to multiple programs: Resident Dashboard, Normative Assessment, Faculty Development, Program Evaluation, and Program Oversight (limited to the programs they have access to)
Medtech Admin: Resident Dashboard, Normative Assessment, Faculty Development, Program Evaluation, and Program Oversight (access to all programs)
The Resident Metrics Dashboard focuses on individual residents and is designed to be used by Residents and Competency Committee members.
The resident dashboard contains a wealth of information grouped into categories for easier comprehension. First, if you arrived at the resident dashboard by selecting a resident on the normative dashboard, their data is automatically fetched for you. However, if you switched to the resident dashboard manually by clicking on the navigation tabs above, you will need to select a resident from the drop-down in the filter panel at the top of the dashboard. The drop-down contains the list of all the residents in the program with their name and their corresponding progress rate. The names are further grouped by training stage and then sorted alphabetically for easier access.
The drop-down is also an editable text box, so you can type part of a resident's name to automatically filter the available options. This makes it easier to search for a particular resident in a program with many residents.
After selecting a resident, users can click the "GET RECORDS" button to visualize their assessment data. You might notice the small button with the calendar icon on it. This is used to highlight assessment data gained by the resident in a particular time period; for now, ignore it, as we will cover it further down. The resident dashboard consists of several main subsections. Let us look at each one individually.
This section provides the following summarized metrics of the resident:
Total EPAs observed - This is a count of the total number of EPAs filled out by a resident.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to achieve for all the valid EPA forms in a program across the different training phases.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements set in the assessment plan, such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements, or meeting diverse assessor role requirements.
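As a concrete illustration, the three metrics above can be sketched as follows. This is a minimal, hypothetical sketch; the record structure and function name are assumptions for illustration, not Elentra's actual schema or API.

```python
# Illustrative sketch of the acquisition metrics described above.
# The record structure ("achieved" flag) is a hypothetical assumption.

def acquisition_metrics(assessments, required_total):
    """assessments: list of dicts, each with an 'achieved' boolean;
    required_total: total EPAs the resident must achieve across all phases."""
    total_observed = len(assessments)
    achieved = sum(1 for a in assessments if a["achieved"])
    # Progress rate: achieved relative to the program-wide requirement.
    progress_rate = achieved / required_total if required_total else 0.0
    # Achievement rate: achieved relative to what was actually completed.
    achievement_rate = achieved / total_observed if total_observed else 0.0
    return total_observed, progress_rate, achievement_rate

# Example: 40 observed EPAs, 25 of them achieved, 100 required overall.
records = [{"achieved": i < 25} for i in range(40)]
print(acquisition_metrics(records, 100))  # (40, 0.25, 0.625)
```

Note how the two rates answer different questions: progress rate measures distance to the finish line, while achievement rate measures the quality of what has been submitted so far.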
To the right of the acquisition metrics is a line chart that shows the resident's weekly EPA acquisition rate for the last six months. This is meant to give a quick, high-level overview of the resident's assessment gathering in the recent past.
This section is meant for quickly looking up a resident's recent performance, with the option to view records in the following ranges: last 10 days, last 25 days, last month, and last 3 months. The chart does not visually distinguish the different EPA types (e.g., EPA-F1 vs. EPA-C2); instead, it provides this and other additional information in a pop-up menu that can be invoked by hovering the mouse over a point.
The line chart provides a simple representation of the last "N" assessments filled by the resident, where every EPA is represented as a point with the oldest record starting on the left. The points are arranged vertically using the O-Score Entrustability scale, with 5 being the highest ("I did not need to be there") and 1 being the lowest ("I had to do"). The better a resident performs on an EPA, the higher the point sits in the chart.
If a resident has assessments that were filled on EPA forms across several different rating scales then each rating scale is provided with its own recent EPA chart as shown in the above image.
The final section provides a detailed overview of every single EPA completed by the resident. The entire list of EPAs that residents are required to complete is broken down into four groups based on the training phase during which a resident is supposed to complete them, and the groups are numbered accordingly. With the addition of support for dynamic CBE, the number of training phases can be higher or lower than 4; some programs, such as Surgical Foundations, have only two training phases.
Each training phase is presented as a block with the title of the training phase and a label indicating whether the resident has completed the training phase or not. If a training phase is in progress a completion rate is shown to indicate the number of assessments the resident has achieved in that training phase relative to the total number of required assessments for every EPA in that phase. Each training phase block acts as an accordion and can be expanded or collapsed to view the list of all EPAs in that block.
Although residents generally complete the EPAs of their current training phase before they pick up EPAs of later phases, there are exceptions. Due to various external factors such as their rotation schedules and the nature of medical cases of the patients they attend to, residents can occasionally end up completing EPAs which are not in their current training phase. This means residents can have a non-zero completion rate for training phases that they have not yet started officially.
When a training block is open, all the EPAs in that block are arranged sequentially based on the numbering order in a 3-column layout as shown above.
EPA ID and a textual description of the corresponding medical scenario that the EPA targets.
The resident's acquisition metrics for each EPA are provided as 3 numbers along with two bullet charts that visualize how far along the resident is in completing that EPA. If an assessment plan is not available for an EPA, the required and achieved numbers default to "N/A" (not available). The first bullet chart (blue) visualizes the observed EPA count relative to the required count, while the second bullet chart visualizes the achieved EPA count relative to the required count. A green check mark icon indicates the completed status of the EPA. It can show up either because the resident has achieved the required number of EPAs or because the competence committee has marked the EPA as complete (even if the achieved count is not met). In the scenario shown in the image above, the latter is true. However, if an EPA has not been marked complete and the resident has not met the required achieved EPA count, then a "TO GO" metric is shown in place of the check mark icon as shown above.
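The status logic described above can be summarized in a short sketch. The function name, parameters, and return labels are hypothetical illustrations of the behaviour, not Elentra's actual code.

```python
# Hedged sketch of the per-EPA status logic described above.
# Names and labels are hypothetical assumptions for illustration.

def epa_status(observed, achieved, required, marked_complete=False):
    """Return the label shown next to an EPA's bullet charts."""
    if required is None:
        # No assessment plan available for this EPA.
        return "N/A"
    if marked_complete or achieved >= required:
        # Green check mark: target met, or committee marked it complete.
        return "COMPLETE"
    # Otherwise show how many achieved EPAs remain.
    return f"{required - achieved} TO GO"

print(epa_status(12, 5, 8))        # "3 TO GO"
print(epa_status(12, 8, 8))        # "COMPLETE"
print(epa_status(3, 1, 8, True))   # "COMPLETE" (committee override)
```

The third call mirrors the scenario in the image: the committee has marked the EPA complete even though the achieved count falls short of the requirement.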
This is a visualization of all assessments filled by the resident for that EPA. The information is visualized like the recent EPA chart discussed above. Assessments are first grouped by the EPA form type and version number. This ensures that assessments of a similar variety are visualized together. Within each chart, assessments are arranged chronologically on the X axis with the oldest to the left and are arranged vertically based on the EPA rating (5-point O-Score Entrustability scale) with 5 being the highest (resident managed the situation independently) and 1 being the lowest (Assessor had to completely take over the situation). However, the rating scale is not always a standard 5-point scale and can change depending on the type of scale used in the assessment plan for a given EPA. For example, in the image shown below the EPA form has a 2-point (yes/no) rating scale.
Further, each point in this chart can be hovered over to view additional information about that assessment, such as narrative feedback, situational context, and the assessor's name and role, in an onscreen popup window as shown above.
Finally, three buttons are provided in the bottom left corner of each chart. The first button (sliders icon) brings up a set of drop-down filter lists that can be used to visually identify a particular record based on patient demographics or other contextual variables such as "Case Complexity" or "Clinical Presentation". For example, if a user wanted to see which of the records were for "respiratory distress", they could select that option from the "Clinical Presentation" drop-down list and the corresponding points (observation scores) would turn light red.
The second button (book icon) can be clicked to see all the records in a table that can be sorted and filtered. To filter the table, start typing in the input box at the top of each column; the table filters dynamically as you type. To sort the table by a column, simply click on the column header. For example, in the image below the table is sorted in ascending order by the first column (date).
The third button brings up a popup screen that shows the achievement criteria breakdown for the EPA. This feature has been duplicated from the main CBME dashboard where hovering over the “i” icon in an EPA gives a breakdown of the assessment criteria as shown above.
If a school has enabled the ability to track expired assessments, an optional section is visible at the end of the dashboard which shows a tabular breakdown of all the expired assessments filled against a selected resident.
This is a common feature across all the sections of the resident dashboard that highlights all assessments that were filled in a particular period. To enable it, head over to the filter panel at the top of the dashboard and click on the small button with the calendar icon. This will open a panel where you can set the start date and end date for the period. You can either type directly into the input box or use the date selector on the calendar above.
Once the start date and end date are set, all assessments that fall in that period are converted into diamonds across the dashboard. This provides a way to visually distinguish these EPAs while still viewing them relative to other EPAs filled outside of the selected period. This feature can be particularly useful during competence committee meetings, which typically happen every three months; the period can be set to highlight only the EPAs filled by the resident since the last meeting.
When enabled, the checkbox provided in the date filter panel hides all EPA levels that do not have any assessments filled in the selected period. If an entire training phase has no EPAs filled in that period, the whole training phase block is hidden as well. This can be useful for reducing visual clutter on the dashboard and focusing on a small subset of EPAs.
The Normative Assessment Dashboard presents summarized data metrics of all the residents in a program relative to each other. It is meant to be viewed by Competency Committee members to assess a resident against their peers.
The data is presented as both a visual representation (left) and a tabular representation (right). Users are provided with an option to either view all the residents in a program or selectively view the metrics of residents in a particular training stage by using the drop-down at the top of the dashboard. This can be useful during Competency Committee meetings when residents are judged on their performance relative to others in their training group.
By default, the normative dashboard filters out residents without any completed EPAs. However, this behaviour can be turned off by toggling the checkbox at the top of the dashboard.
The bar chart visualizes the following four metrics individually; to switch between them, users can select the corresponding radio buttons above the chart. Each bar represents one resident, and users can hover their mouse over a bar to see the name of the resident it represents and the value of the current metric being shown. Clicking on a bar in the bar chart switches the user to the resident dashboard to view a detailed breakdown of all assessments of that resident.
Total EPAs - This is a count of the total number of EPAs filled out by a resident.
Achievement Rate - This is the total number of EPAs a resident has achieved divided by the total number of EPAs completed by that resident. An achieved EPA is one where the EPA meets certain requirements set in the assessment plan, such as acquiring a rating of 4 or above on a 5-point scale, or satisfying specific contextual variable requirements, or meeting diverse assessor role requirements.
Progress Rate - This is the number of EPAs a resident has achieved divided by the total number of EPAs they are required to achieve for all the valid EPA forms in a program across the different training phases.
Total EPAs vs Achieved EPAs - This chart visualizes two bars for each resident showing their total EPAs and achieved EPAs next to each other. While this metric is similar to the achievement rate, it can offer a better picture of a resident's overall performance, as a high achievement rate alone can occasionally be misleading for a resident who has completed very few EPAs but achieved all of them.
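A tiny worked example of that caveat, using made-up numbers for two hypothetical residents:

```python
# Two hypothetical residents illustrating why achievement rate alone can mislead.
resident_a = {"total": 2,  "achieved": 2}    # 100% achievement rate, very little data
resident_b = {"total": 50, "achieved": 40}   # 80% rate, far more evidence of progress

for name, r in [("A", resident_a), ("B", resident_b)]:
    rate = r["achieved"] / r["total"]
    print(f"Resident {name}: {r['achieved']}/{r['total']} achieved "
          f"({rate:.0%} achievement rate)")

# Resident A's 100% rate outpaces B's 80%, yet B has 20x the achieved EPAs.
# The side-by-side bars make this visible, while the rate alone hides it.
```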
Finally, there is a tabular representation of the same metrics with the ability to sort the residents in the table by clicking on a column header. By default, the residents are sorted by name in ascending order. This can be changed to descending order by simply clicking on the column header "Name". Similarly, clicking on each column header sorts the table in ascending order based on that metric, and clicking the same column header again changes the order to descending.
The normative dashboard is linked to the resident dashboard, so to view a particular resident's assessments in detail, users can simply click on the bar corresponding to that resident in the bar chart or their corresponding row in the table. This will automatically switch the user to the resident dashboard with that resident pre-selected.
This dashboard organizes all the EPAs that have been completed by learners in the program by the year they were completed in with the goal of informing program evaluation. There are two main sections in this dashboard. The first section of the dashboard visualizes key metrics across several academic years as shown below.
This graph displays the number of EPAs that have been completed and expired per learner in each year in the program. Users can mouse over the bars to see the number of active residents with assessments in each year.
This stack chart displays the proportion of EPAs in each year that have been rated at each level of entrustment ("I had to do" to "I didn't need to be there"). Users can mouse over each row for additional details. This chart only considers assessments that have been completed on "Supervisor Forms" with a standard 5-point "O-Score" scale.
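The per-year proportions behind a stacked chart like this can be sketched as follows. The data shape (year, rating) is a hypothetical assumption for illustration; only standard 5-point ratings are assumed present.

```python
from collections import Counter

# Sketch of the per-year entrustment distribution behind a stacked chart.
# The (year, rating) pair format is a hypothetical assumption.

def rating_proportions(assessments):
    """assessments: list of (year, rating) pairs, rating in 1..5.
    Returns {year: {rating: proportion}}."""
    by_year = {}
    for year, rating in assessments:
        by_year.setdefault(year, Counter())[rating] += 1
    return {
        year: {r: counts[r] / sum(counts.values()) for r in sorted(counts)}
        for year, counts in by_year.items()
    }

data = [(2023, 5), (2023, 5), (2023, 4), (2023, 3), (2024, 5), (2024, 4)]
print(rating_proportions(data))
# {2023: {3: 0.25, 4: 0.25, 5: 0.5}, 2024: {4: 0.5, 5: 0.5}}
```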
This graph displays the average number of words contained within the completed EPAs of each year. Feedback length has been found to correlate with feedback quality, so a higher word count is preferable.
This graph visualizes the number of EPA observations submitted per month over multiple years. It is intended to identify increases and decreases in EPA assessments over seasons and years. The X axis of this chart spans across a given academic year from July to June.
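Since the X axis spans an academic year rather than a calendar year, observations from July to December fall in the year they start while January to June belong to the previous July. A small sketch of that bucketing, with a hypothetical function name:

```python
from datetime import date

# Sketch of bucketing observation dates into a July-June academic year,
# as the monthly chart's X axis does. The function name is an assumption.

def academic_year(d: date) -> str:
    """July-December belong to the year they start; January-June to the prior July."""
    start = d.year if d.month >= 7 else d.year - 1
    return f"{start}-{start + 1}"

print(academic_year(date(2024, 7, 1)))   # 2024-2025
print(academic_year(date(2025, 3, 15)))  # 2024-2025
```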
The second section of the dashboard shows EPA metrics that contextualize the EPAs that have been completed within a program over a selected academic year.
The summarized metrics are followed by three pie charts that group the assessors by their type, group, and role. Assessor role and group distributions are only available for "internal" assessors. These charts let program evaluators get a better picture of which user group gives them a higher share of EPA assessments. For example, in programs like Surgical Foundations, where senior residents often assess junior residents, the role "Trainee" might have a higher share compared to "Lecturers" or "Directors".
The “export program data” button available at the top of the dashboard lets users download a detailed CSV export of all assessments completed or expired in the program. This can be used for additional downstream analysis of metrics that are not shown in the dashboard.
This dashboard organizes all of the EPAs that have been completed in a program by the assessor that completed them with the goal of informing faculty development. To get started, users need to first select the academic year in which the assessments were collected and click “GET RECORDS”.
There are three filters available in this dashboard. The first filter lets users select an assessor group to only look at assessments completed by users in that group. This filter can be used to remove or include student assessors. The second filter lets users select a department to only look at assessments completed by assessors from that department. This can be used to remove assessors from external programs. The final filter lets users select a specific assessor for their metrics to be highlighted. Alternatively, users can click on any of the bars in the charts below to select and highlight that assessor.
There are two sections, each showing summarized metrics related to faculty assessments. The first section shows the amalgamated metrics for EPAs completed by all assessors in a given academic year. Users can mouse over the EPA Rating visual to see the proportion of EPAs rated at each level of entrustment (only EPAs completed on Supervisor Forms with a standard 5-point O-Score scale are considered for this metric). Users can also mouse over the Training Stage visual to see the proportion of EPAs completed in each stage of training. The second section shows the metrics for EPAs completed by the selected assessor. Like the section above, users can mouse over the EPA Rating visual to see the proportion of EPAs rated at each level of entrustment, and mouse over the Training Stage visual to see the proportion of EPAs completed in each stage of training.
The acquisition metrics panel is followed by four charts, each visualizing a specific metric of an assessor relative to others in the program. The red highlighted bar represents the currently selected assessor/faculty member.
This chart displays the number of EPAs observed by each assessor. Users can mouse-over for each assessor's name and click to highlight that assessor's data. If an assessor is selected, their EPA count is shown in the chart title in red.
This chart displays the percentage of EPAs sent to each assessor that expired before completion. If your school doesn't track assessment expiry metrics, the values in this chart default to zero for all assessors.
This chart displays the average entrustment score of EPAs completed by each assessor. If an assessor is selected, their average entrustment score is shown in the chart title in red. This chart only considers assessments that have been completed on "Supervisor Forms" with a standard 5-point "O-Score" scale.
This chart displays the average number of words per comment included with the EPAs completed by each assessor. If an assessor is selected, their average words per comment metric is shown in the chart title in red.
The charts are followed by two tables that are only visible when a faculty member is selected. The first table displays all the EPAs completed by the selected assessor. It is searchable (click the white box) and sortable (click the column header). The second table displays expired EPAs that were not completed by the selected assessor. It is also searchable and sortable. Both tables can be exported as a CSV file.
The dashboard has two different types of exports that can be triggered by using the two buttons at the top of the page.
This provides a CSV export with a list of all faculty members and their related metrics. This can be used for downstream analysis that is beyond the scope of the dashboard. If a faculty member is selected, the report contains their data alone. To export the data of all faculty members, select "All" in the "Assessor" filter.
This button is only visible when a faculty member has been selected; it lets the user export the entire dashboard as a PDF file that can be shared with users who don't have access to the dashboard. It prompts a "Save as PDF" popup as shown below.
This dashboard lets you compare metrics among different programs for the purpose of program oversight. It is mainly intended for users who oversee several programs, or for other higher-level users like Medical Administrators and Deans.
Users need to first select an academic year. This updates the Program dropdown and shows an alphabetically sorted list of programs with the “EPA count” next to the name of each program. The list only contains programs that have assessments that were completed in the selected year. The user can then multi-select all the programs they need to compare and then click “GET RECORDS”.
This graph displays the number of EPAs that have been completed and expired in each year by program. If your school doesn't track expired assessments the values default to zero for all programs.
This graph displays the number of EPAs that have been completed and expired per learner in each year by program. The dotted line represents the average completed EPA count per learner for all the selected programs.
This stack chart displays the proportion of EPAs in each year that have been rated at each level of entrustment ("I had to do" to "I didn't need to be there"). Users can mouse over each row for additional details. This chart only considers assessments that have been completed on "Supervisor Forms" with a standard 5-point "O-Score" scale.
This graph displays the average number of words contained within the completed EPAs of each year. The dotted line represents the mean value for all the selected programs. Feedback length has been found to correlate with feedback quality, so a higher word count is preferable.
These are a collection of graphs that visualize the completed and expired count by month through the academic year (spanning from July to June) for each program individually.
This button lets you download the metrics shown in this dashboard as a CSV file for other types of downstream analysis beyond the scope of this dashboard.