There are multiple tools available to facilitate assessment and evaluation through Elentra.
To create items and forms to assess learner performance on tasks (e.g., clinical skills), use the Assessment and Evaluation Module.
To create items and forms to have learners evaluate courses, faculty, themselves, and other activities, use the Assessment and Evaluation Module.
To create assessment items to use in small, low-stakes assessments use the Quiz Module.
To create assessment items to use for tests and exams use the Exam Module.
To store learner grades and facilitate faculty grading of assignments use the Gradebook feature in each course.
Elentra ME also includes a portfolio, which some organizations may choose to use for student assessment purposes. For more detail about the portfolio please see the Portfolio help section.
The Assessment & Evaluation Module in Elentra is predominantly used to assess learner performance in a clinical environment, and provide multiple user groups with the ability to evaluate faculty, courses/programs, and other activities within the context of your organization. Since the Assessment and Evaluation Module essentially allows you to create forms for people to complete you could even use it to do a pre-course survey, or as a way to collect information from a group about their work plans in a collaborative activity.
To use the Assessment & Evaluation Module users must create items (e.g., questions, prompts, etc.), create a form (a collection of items), and then create a distribution. A distribution defines who will complete a form, when the form will be completed, and who or what the form is about. Through the Assessment and Evaluation Module users also have a quick way to view their assigned tasks and administrators can monitor the completion of tasks.
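The item → form → distribution relationship described above can be sketched as a simple data model. This is a hypothetical illustration only; the class and field names are assumptions for clarity, not Elentra's actual schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Item:
    """A single question or prompt, reusable across forms."""
    text: str
    item_type: str            # e.g. "vertical_mc", "scale", "free_text"
    mandatory: bool = False

@dataclass
class Form:
    """An ordered collection of items."""
    title: str
    items: list[Item] = field(default_factory=list)

@dataclass
class Distribution:
    """Defines who completes a form, who or what it is about, and when."""
    title: str
    form: Form
    assessors: list[str]      # the people who will complete the form
    targets: list[str]        # who or what the form is about
    release_date: date        # how far back to go when creating tasks

def generate_tasks(dist: Distribution) -> list[tuple[str, str]]:
    """One task is created per (assessor, target) pairing."""
    return [(a, t) for a in dist.assessors for t in dist.targets]
```

For example, a distribution with two assessors and three targets yields six tasks, which is the list an administrator would then monitor for completion.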
Reporting exists to export the results of various assessments and evaluations.
Forms created through the Assessment & Evaluation module can also be attached to gradebook items to facilitate inline grading (note that only specific item types are supported on forms attached to gradebook entries).
The Competency-Based Medical Education module also includes a variety of form templates for use. There is no user interface to configure form templates at this point.
For instructions specific to Competency-Based Medical Education, please see the inline help resources available in your Elentra installation on each program's CBME tab.
A form is a collection of items used to assess or evaluate a learner, faculty member, course, service, event, or anything else in your organisation. Form templates are organisation-specific and allow you to rely on templated items to make forms consistent. In the case of the Competency-Based Medical Education module used in graduate and postgraduate medical education, form templates also allow you to set parameters regarding EPAs, contextual variables and rating scales to automatically build and publish a set of triggerable assessment forms for clinical learning.
If you are using CBME please refer to the inline help in your Elentra installation to learn more about using form templates.
If your organisation isn't using CBME, you are unlikely to use form templates.
Form templates are built off of something called a form blueprint and there is currently no way through the user interface to configure a form blueprint. If you want different form templates from those in a default installation of Elentra, you'll need a developer's help.
A distribution based on a learning event schedule allows you to schedule forms to be sent out where the targets of the distribution are the attendees of events of the selected event type(s), the faculty who taught those events, or the events themselves.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type. The task type will also dictate other fields available throughout the wizard.
Assessment Mandatory: This will be checked off by default.
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Learning Event Schedule' from the dropdown menu.
Select Event Type: Select the appropriate event type(s) from the dropdown menu. You can select multiple event types as needed.
Release Date: This tells the system how far back on the calendar to go when creating tasks. Hover over the question mark for more detail.
Task Expiry: If you check this box, the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and no longer be available to complete). You can customize when the task will expire in terms of days and hours after delivery.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to turn on a warning notification if desired. This can be set up to send an email a specific number of days and hours before the task expires.
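The expiry and warning settings above amount to simple date arithmetic: the task expires a set number of days and hours after delivery, and the warning fires a set number of days and hours before that expiry. A sketch, with function and parameter names assumed for illustration:

```python
from datetime import datetime, timedelta

def task_deadlines(delivery: datetime,
                   expiry_days: int, expiry_hours: int,
                   warn_days: int = 0, warn_hours: int = 0):
    """Return (expiry_time, warning_time) for one delivered task.

    The task expires days/hours after delivery; the optional warning
    email is sent days/hours *before* that expiry.
    """
    expiry = delivery + timedelta(days=expiry_days, hours=expiry_hours)
    warning = expiry - timedelta(days=warn_days, hours=warn_hours)
    return expiry, warning

# Delivered Jan 1, expires 14 days later, warning 2 days before expiry:
exp, warn = task_deadlines(datetime(2024, 1, 1), 14, 0, 2, 0)
```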
The target is who or what the form is about.
If you are creating an Evaluation:
Evaluations delivered for:
Faculty who taught events with the selected event types (this will generate an evaluation for any faculty member who taught the event type specified in the Method section, e.g. lecture, lab, etc.)
Events with the selected event types (this will generate an evaluation for any event of the event type specified in the Method section)
Target Attempt Options: Specify how many times an evaluator can evaluate each target, OR whether the evaluator can select which targets to assess and complete a specific number (e.g. evaluator will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 evaluations but can select which targets they evaluate).
If you select the latter, you can define whether the evaluator can evaluate the same target multiple times. Check off the box if they can.
Click 'Next Step'
If you are creating an Assessment:
Assessments delivered for: Currently, the only option here is attendees who are enrolled in events with the specified event type.
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
Click 'Next Step'
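The target attempt options described above (for both assessments and evaluations) reduce to a completion check: either a fixed number of attempts per target, or a minimum and maximum total across targets the assessor selects themselves. A hypothetical sketch of that rule, not Elentra code:

```python
def attempts_satisfied(completed: dict[str, int],
                       min_total: int, max_total: int,
                       allow_repeat_target: bool) -> bool:
    """Check an assessor's progress against target attempt options.

    completed maps target -> number of forms completed on that target,
    e.g. {"learner_a": 2, "learner_b": 1}.
    """
    if not allow_repeat_target and any(n > 1 for n in completed.values()):
        return False                  # same target assessed twice
    total = sum(completed.values())
    return min_total <= total <= max_total

# 20 targets offered; at least 10 and no more than 15 must be completed,
# each target at most once:
attempts_satisfied({f"t{i}": 1 for i in range(12)}, 10, 15, False)
```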
The assessors/evaluators are the people who will complete the form.
There are three options:
Assessors/evaluators are attendees enrolled in the event
Attendee Options (note that you must be using Elentra's attendance module within events to use this feature)
Send to all enrolled audience attendees that attended the event
Send to all enrolled attendees, even if they did not attend
Assessors/evaluators are faculty members associated with the event
This will send the distribution to the faculty listed on the event (e.g. teacher, tutor, etc.)
Assessors/evaluators are external to the installation of Elentra
This allows you to add external assessors/evaluators to a distribution
Begin to type a name or email; if the user already exists you'll see them displayed in the dropdown menu. To create a new external user, scroll to the bottom of the list and click 'Add External Assessor/Evaluator'
Provide first and last name, and email address for the external assessor and click 'Add Assessor/Evaluator'
Click 'Next Step'
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired. (Not all options will display depending on the other parameters of the distribution.)
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A+E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
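The release criterion above ("targets can view tasks completed on them after meeting the following criteria") is a percentage threshold within a single distribution: a target sees their own feedback only once they have completed enough of their own assigned tasks. A minimal sketch, with assumed names:

```python
def results_released(tasks_completed: int, tasks_assigned: int,
                     min_percent: float) -> bool:
    """Within one distribution, a target can view feedback on
    themselves only after completing at least min_percent of their
    own assigned tasks (often used for peer assessment)."""
    if tasks_assigned == 0:
        return True                   # nothing required of this target
    done = 100.0 * tasks_completed / tasks_assigned
    return done >= min_percent

results_released(8, 10, 75.0)   # 80% complete: feedback released
results_released(5, 10, 75.0)   # 50% complete: feedback withheld
```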
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Prompted Response Notifications: This allows you to decide what action to take for any answers on the form that are designated as prompted or flagged response options in items included on the form. (For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response.) You can optionally select to email the Assessment Reviewers, Program Coordinators, Program/Course Directors, or Distribution Authors.
Click 'Save Distribution'.
The assessment and evaluation module provides a way for learners to be assessed, especially in clinical learning environments, and a way for all users to assess courses/programs, faculty, learning events, etc. Any forms used for assessment and evaluation require form items (e.g., questions, prompts, etc.).
When a user creates an item, they automatically have permission to access and use that item. However, additional permissions can be added to items after they have been created. This is important to note because if you want another user to be able to edit an item or add it to a form, you should give that user permission to access the item.
If you are creating items for a form that will be attached to a gradebook post for online grading (using a dropbox and an Assessment and Evaluation form), please note that not all item types are currently supported, because there is no structure to weight them on the form posted to the gradebook. When creating items for a form to use with a gradebook dropbox, it is recommended that you use only multiple choice, dropdown selector, rubric, and scale items. If your form requires narrative comments, do not use the free text comment item type, as the grader will not be able to save their comments; instead, allow or require comments on your scale or rubric items and encourage graders to provide feedback there.
Note that you can copy existing items which may save time. To copy an existing item, click on the item and click 'Copy Item' which is beside the Save button.
Navigate to Admin>Assessment & Evaluation.
Click 'Items'.
Click 'Add New Item'.
Complete the required information, noting that different item types will require different information. Common item details are below:
Item Type: This list shows the item types supported by Elentra. A complete list of item types is provided below.
Item Text: This is what will show up on any form this item is added to. When you view items in the detail view you'll also see the item text.
Item Code: This is an optional field. Item codes display when you view items in a list and they are searchable. Some organizations apply their own coding system to their items; another use case might be importing items from another tool or vendor whose coding system you want to match.
Rating Scale: Rating scales can be configured through the Scales tab within Assessment & Evaluation. First select a scale type and then select the specific scale. Selecting a rating scale will prepopulate the response categories for this item. In some item types (e.g. multiple choice items) you will also be required to add response text, and that text will show up on the actual form. In other item types you may rely on just the response categories.
Mandatory: Click this checkbox if this item should be mandatory on any forms it is added to.
Allow comments: Click this checkbox to enable comments to be added when a user responds to this item. If enabled, you have several options to control commenting.
Comments are optional will allow optional commenting for any response given on this item.
Require comments for any response will require a comment for any response given on this item.
Require comments for a prompted response means that for any response where you check off the box in the Prompt column, a user will be required to comment if they select that response.
Allow for a default value: If you check this box you will be able to select a default response that will prepopulate a response when this item is used on any form. Set a default response by clicking on the appropriate response line in the Default column.
Depending on the question type, add or remove answer response options using the plus and minus icons.
Depending on the question type, reorder the answer response options by clicking on the crossed arrows and dragging the answer response option into the desired order.
Add curriculum tags to this item as needed.
If you have access to multiple courses/programs, first use the course/program selector to choose the appropriate course/program, which will limit the available curriculum tags to those assigned to the course/program. Click the down arrow beside the course selector and search for the course by beginning to type the course name. Click the circle beside the course name.
Click through the hierarchy of tags as needed until you can select the one(s) appropriate for the item.
As you add curriculum tags, what you select will be listed under the Associated Curriculum Tags section.
Scroll back up and click 'Save'.
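The mandatory, default value, and comment settings above interact when a learner or faculty member submits a response. A hypothetical sketch of that validation logic (names and structure are assumptions for illustration only):

```python
def validate_response(response, comment, mandatory,
                      default, prompted, comment_mode):
    """Return a list of validation errors for one item response.

    comment_mode is "optional", "always", or "prompted" (a comment is
    required only when the chosen response is flagged in the Prompt
    column); prompted is the set of flagged response texts.
    """
    errors = []
    if response is None:
        response = default            # default value prepopulates
    if mandatory and response is None:
        errors.append("response required")
    if comment is None:
        if comment_mode == "always" and response is not None:
            errors.append("comment required")
        elif comment_mode == "prompted" and response in prompted:
            errors.append("comment required for this response")
    return errors

# "I had to do it" is a flagged response, so a comment is required:
validate_response("I had to do it", None, True, None,
                  {"I had to do it"}, "prompted")
```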
Horizontal Multiple Choice (single response): Answer options will display horizontally on the form and the user can select one answer. Response text required; response category optional. Response descriptors provide another data point so you can potentially report on them in the future. They are metadata in MC questions whereas in a rubric they are displayed. Horizontal MC will let you assign the same response descriptors to multiple responses.
Vertical Multiple Choice (single response): Answer options will display in a vertical list on the form and the user can select one answer. Response text required; response category optional.
Drop Down (single response): answer options will display in a dropdown menu. Response text required; response category optional.
Horizontal Multiple Choice (multiple responses): Answer options will display horizontally on the form and the user can select two or more answers. Response text required; response category optional.
Vertical Multiple Choice (multiple responses): Answer options will display in a vertical list on the form and the user can select two or more answers. Response text required; response category optional.
Drop Down (multiple responses): Answer options will display in a dropdown list that remains open and allows users to select multiple responses using the control or command and enter/return keys.
Free Text Comments: Use this item type to ask an open-ended question requiring a written response. (In ME 1.11 and lower you cannot map a free text comment to a curriculum tag set.)
Date Selector: Use this item type to ask a question to which the response is a specific date (e.g. What was the date of this encounter?)
Numeric Field: Use this item type to ask a question to which the response is a numeric value (e.g. How tall are you?)
Rubric Attribute (single response): Use this to create an item that relies on response categories as answer options. If you enter text in the response text area it will not show up to the user unless you create a grouped item. If you create a grouped item remember you need to use the same scale across all items to be grouped together. If you want a rubric item to display response text, create a grouped item with just one item included.
Scale Item (single response): Use this to create an item that relies on response categories as answer options. If you enter text in the response text area it will not show up to the user unless you create a grouped item. If you create a grouped item remember you need to use the same scale across all items to be grouped together.
Creating a grouped item allows you to group items and guarantee that they appear together on forms. If you use the Rubric Attribute or Scale Item item types, creating a grouped item will create a rubric with common response categories (e.g. developing, achieved) and specific response text for each field (e.g. performed a physical exam with 1-2 prompts from supervisor, independently performed a physical exam).
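A grouped item is essentially a rubric: the rating scale supplies shared response categories across every row, and each item supplies its own response text per category. A hypothetical sketch of that structure (the data shapes are assumptions, not Elentra's schema):

```python
# All items in a group must use the same rating scale, so the
# response categories are shared across every row.
scale = ["Developing", "Achieved"]

grouped_item = {
    "title": "Physical exam rubric",
    "categories": scale,
    "rows": [
        ("Physical exam",
         ["Performed a physical exam with 1-2 prompts from supervisor",
          "Independently performed a physical exam"]),
    ],
}

def render(group: dict) -> str:
    """Render the rubric as a plain-text table: a header row of the
    shared categories, then one row of response text per item."""
    lines = ["\t".join([""] + group["categories"])]
    for item_text, cells in group["rows"]:
        # every row must supply text for every shared category
        assert len(cells) == len(group["categories"])
        lines.append("\t".join([item_text] + cells))
    return "\n".join(lines)
```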
Navigate to Admin>Assessment & Evaluation.
Click on the Items tab.
Click on the Grouped Items sub-tab.
Click 'Add A New Grouped Item'.
Provide a grouped item name and select a rating scale type and then a rating scale. All items in the group will have the same response categories assigned to them, as configured through the rating scale. (Rating scales can be set up through the Scales tab in Admin>Assessment and Evaluation.)
Click 'Add Grouped Item'.
Complete the required information, noting the following:
Title: This will display when you view a list of grouped items.
Description: This field is optional; note that the grouped item description will display below the grouped item title on forms produced for users (see image below).
Permissions: Adding a group, course, or individual here will give those users access to the grouped item.
To add items to a grouped item you can either create and attach a new item or add existing items (click the appropriate button).
If attaching existing items, use the search bar and filters to find items. You will only be shown items that match the rating scale parameters you've selected. Click the checkbox beside a question (in list view) or beside the pencil icon (in detail view) and click 'Attach Selected'. Because an existing item may already be in use on another form, in some cases you will not be able to modify the response descriptors for that item.
If creating and attaching items, follow the instructions above for creating items. The rating scale for your new items will be set to match the rating scale of the grouped item. After creating one item, you can repeat the steps to create and attach as many items as needed.
Click 'Save'.
To edit an item click on the pencil icon. Bear in mind that an existing item may already be in use on another form.
To delete an item from a grouped item, click on the trashcan icon.
To reorder the items in the grouped item, click on the crossed arrows and drag the item into the appropriate location.
When you have added all required items to the grouped item, click 'Save'.
Click 'Grouped Items' at the top of the screen to return to the list of grouped items.
Navigate to Admin>Assessment and Evaluation.
Click 'Items'.
From list view, click on any item to open it. From grid view, click the pencil icon to edit an item.
Give permission to an individual, organisation, or course by first selecting the appropriate title from the dropdown menu, and then beginning to type in the search bar. Click on the desired name from the list that appears below the search bar. Giving permission to an entire organisation will allow anyone affiliated with the organisation through their user profile, AND with access to the Assessment and Evaluation module, to use the item. If you give permission to an entire course, anyone listed as a course contact on the course setup page AND with access to the Assessment and Evaluation module will have access to the item.
After you've added all permissions, you can return to the list of all items by clicking 'Items'.
Toggle between list view and detail view using the icons beside the search bar.
In detail view, see the details of an existing item by clicking on the eye icon.
In detail view, edit an existing question by clicking on the pencil.
To delete items, check off the tick box beside a question (list view) or beside the pencil icon (detail view) and click 'Delete Items'.
From an Edit Item page you can click on a link to view the forms that use an item or the grouped item an item is included in.
When viewing items in list view, the third column shows the number of answer options the item has. Clicking on it takes you to the item, and by clicking again you can see all the forms that use this item.
From the Items tab type into the search box to begin to find questions.
You can apply a variety of filters to refine your search for existing items.
To select a filter, click on the down arrow beside the search box. Select the filter type you want to use, click on it, and then begin to type what you want to find or continue clicking to drill down and find the required filter field. Filter options will pop up based on your search terms or what you’ve clicked through and you can check off the filters you want to apply. Apply multiple filters to further refine your search.
If you're working with a filter with multiple hierarchies, use the breadcrumbs in the left corner of the filter list to go back and add additional filters.
When you’ve added sufficient filters, scroll down and click Apply Filters to see your results.
To remove individual filters from your search, click on the down arrow beside the search field, click a filter type and click on the small x beside each filter you want to remove. Scroll down and click ‘Apply Filters’ to apply your revised selections.
To remove all filters from your search, click on the down arrow beside the search field, click a filter type, scroll down, and click on ‘Clear All’ at the bottom of the filter search window.
When you create assessment and evaluation items you will have the option of applying rating scales to certain item types; creating rating scales promotes consistency across items and can be a time saver for the administrative staff creating items and forms.
You must be a medtech:admin user to manage rating scales. Scales can have permissions configured if needed (see end of page).
For additional detail about rating scales in CBME, please see here.
Navigate to Admin>Assessment and Evaluation.
Click 'Scales' from the A&E tabs list. Any existing rating scales will be displayed.
Click the green 'Add Rating Scale' button.
Complete the required information, noting the following:
Title: Title is required and is what users will see when they build items and add scales, so make it clear.
Description: This is optional and is not often seen through the platform.
Rating Scale Type: This defines the type of rating scale you are creating. Later, if you add rating scales to items, or add standard scales to form templates, you will first have to select a scale type. There is no user interface to configure rating scale types. Sample scale types include generic, global rating scale, milestone scale, etc. In a default Elentra installation you'll likely just see a default scale type. (In installations with CBME enabled you'll see global rating and milestone/enabling competency scales.)
Add or remove response categories by clicking the plus and minus icons.
For each response category, select a descriptor (these are configured through the assessment response categories). Note that you can search for descriptors by beginning to type the descriptor in the search box.
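A rating scale, as described above, is just a titled, ordered list of response descriptors belonging to a scale type, which items then reuse as prepopulated response categories. A minimal sketch, with class names and the sample descriptors assumed for illustration:

```python
from dataclasses import dataclass

@dataclass
class RatingScale:
    title: str
    scale_type: str           # e.g. "default", "global_rating", "milestone"
    descriptors: list[str]    # ordered response categories

    def as_responses(self) -> list[tuple[int, str]]:
        """Prepopulate an item's response categories from the scale."""
        return list(enumerate(self.descriptors, start=1))

# An illustrative entrustment-style scale (wording is an example only):
entrustment = RatingScale(
    "Supervision scale", "global_rating",
    ["I had to do", "I had to talk them through", "I had to prompt",
     "I needed to be there just in case", "I didn't need to be there"])
```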
To edit an existing rating scale click on the scale title, make changes as needed, and click 'Save'.
To delete a rating scale click the checkbox beside the rating scale and click 'Delete Rating Scale'.
Once scales are created, they will become visible options when creating items and using some form templates.
Setting permissions for a scale dictates which users will be able to access a scale when they create assessment and evaluation items. For example, if you set a scale's permissions to Undergraduate Medicine, all users with access to Admin > A & E in the undergraduate organisation will be able to use the scale when creating items. If you set a scale's permissions to several individual users, only those users will be able to access the scale when creating items.
You must create a scale before you can edit the permissions for it. After a scale is created you will automatically be redirected to the edit page.
In the Rating Scale Information section, look for the Scale Permissions heading.
Select Individual, Organisation, or Course from the dropdown options.
Type in the search bar to find the appropriate entity.
Click on the entity name to add it to the permission list.
Add as many permissions as required.
Scroll down and click 'Save'.
You can create a distribution to send to a staff person who can later forward the task to the appropriate person. The use case for this might be setting up a distribution at the beginning of the year and not knowing exactly which preceptors will be working in a specific environment. You can send the distribution to a staff member who can forward the tasks once a clinic schedule is set.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type.
Assessment Mandatory: This will be checked off by default. Leaving it checked creates a record that the tasks are mandatory but it doesn't impact the user experience.
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Delegation' from the dropdown menu.
Delegator: Click Browse Delegators and select the appropriate name from the dropdown menu. You can only select one delegator.
Delegation Options:
Delegation based on date range
Delegation based on rotation schedule
From this point forward the distribution wizard will be configured based on whether you selected a date range or rotation-based delegation. Please refer to the help pages for those two methods to continue through the wizard.
Please see here for more information about how to complete a delegation task.
To effectively use the various assessment tools in Elentra ME some initial setup is required.
For the Assessment & Evaluation module specifically, a staff:admin or medtech:admin will need to configure assessment response categories in System Settings. For information about configuring assessment response categories (required to build scales), please see the System Setup help section. A medtech:admin will have to configure scales through Admin > Assessment & Evaluation (see next section).
Note that assessment characteristics, which is also part of system settings, is used in the Gradebook module to define assessment types like written exam, test, quiz, project, etc. You do not need to build a list of assessment characteristics to use the Assessment & Evaluation module.
When you create a distribution, one of the options available via the Results tab is Target Release and Target Self-Reporting Options. This allows you to require that users complete a specific number of tasks before they can review their own results. This is typically used for peer assessments so that learners are required to provide feedback to others before they can see their own feedback. Please note that these settings are specific to a single distribution only. This tool does not require users to have completed a percentage of tasks across all distributions.
Rotation Based Distributions allow you to set up a distribution based on a rotation schedule. This means you can easily send a form to all enrolled learners to be delivered when they are actively in the rotation. Note that you must have rotations built using the Clinical Experience Rotation Scheduler to use this distribution method.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type.
Assessment Mandatory: This will be checked off by default.
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Rotation Distribution' from the dropdown menu.
Rotation Schedule: Select the appropriate rotation schedule from the dropdown menu.
Release Date: This tells the system how far back on the calendar to go when creating tasks. Hover over the question mark for more detail.
Delivery Period: Choose between delivering tasks repeatedly, once per block, or once per rotation. For each option additional customization allows you to control the timing of the distribution (e.g. 1 day after the start of the block, or 3 days before the end of the rotation). Note that in this context a learner might participate in multiple blocks within one rotation.
Task Expiry: If you check this box, the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and no longer be available to complete). You can customize when the task expires in terms of days and hours after delivery.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to turn on a warning notification if desired. This can be set up to send an email a specific number of days and hours before the task expires.
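The delivery, expiry, and warning times described above are simple date arithmetic relative to a block or rotation boundary. The sketch below is purely illustrative (the function name and parameters are hypothetical, not Elentra's implementation), but it shows how the three timestamps relate:

```python
from datetime import datetime, timedelta

def schedule_task(block_start, block_end, offset_days, relative_to="start",
                  expiry_days=None, warn_days=0):
    """Illustrative sketch: compute delivery, expiry, and warning times
    for one task generated by a rotation-based distribution.

    relative_to="start": deliver offset_days after the block starts.
    relative_to="end":   deliver offset_days before the block ends.
    """
    if relative_to == "start":
        delivery = block_start + timedelta(days=offset_days)
    else:
        delivery = block_end - timedelta(days=offset_days)
    # Task Expiry: tasks disappear a set time after delivery (if enabled).
    expiry = delivery + timedelta(days=expiry_days) if expiry_days is not None else None
    # Warning Notification: email sent a set time before expiry (if enabled).
    warning = expiry - timedelta(days=warn_days) if expiry and warn_days else None
    return delivery, expiry, warning
```

For example, a task delivered 1 day after a block starting January 1, with a 5-day expiry and a 2-day warning, is delivered January 2, warns January 5, and expires January 7.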
The target is who or what the form is about.
Assessments delivered for: Use this area to specify the target of the form.
If you choose "The targets for this Distribution are learners," you'll see the following options:
Learner Options:
All learners in this rotation: Additionally specify whether to include learners from your program, and/or outside your program
Additional Learners: Check this off and then use the drop down selector to add the required learners. (Hover over a learner name to see their profile information.)
Specific learners in this rotation: Use the drop down selector to add the required learners. (Hover over a learner name to see their profile information.)
CBME Options: This option applies only to schools using Elentra for CBME. If you are not using CBME, ignore it and leave it set to non-CBME learners. If you are a CBME school, this allows you to apply the distribution to all learners, non-CBME learners, or only CBME learners as required.
Target Attempt Options: Specify how many times an assessor can assess each target, OR allow the assessor to select which targets to assess and complete a specific number (e.g., an assessor is sent a list of 20 targets and must complete at least 10 and no more than 15 assessments, but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times; check off the box if they can.
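The target attempt rules above amount to three checks on an assessor's selection: repeats allowed or not, targets drawn from the assigned list, and a count between the minimum and maximum. This is a hypothetical sketch of that validation logic (the function and its parameters are illustrative, not part of Elentra):

```python
def validate_selection(selected, available, min_required, max_allowed,
                       allow_repeat_target=False):
    """Illustrative check of an assessor's chosen targets against the
    distribution's target attempt options. Returns (ok, reason)."""
    # Rule: same target more than once only if the distribution allows it.
    if not allow_repeat_target and len(set(selected)) != len(selected):
        return False, "same target assessed more than once"
    # Rule: every chosen target must be on the assessor's assigned list.
    if not set(selected) <= set(available):
        return False, "target not in the assigned list"
    # Rule: total completions must fall within the configured range.
    if not (min_required <= len(selected) <= max_allowed):
        return False, f"must complete between {min_required} and {max_allowed} assessments"
    return True, "ok"
```

In the example from the text (20 targets, at least 10 and no more than 15), a selection of 12 distinct targets from the list would pass, while 8 targets or a duplicated target (with repeats disabled) would fail.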
Click 'Next Step'
The assessors are the people who will complete the form.
There are three options:
Assessors are learners
Learner Options: All Learners or Specific Learners
All Learners: Select all learners in the rotation or specific learners in the rotation
All learners: Select from My Program and/or Outside of my program, use the drop down selector to add additional learners
Specific learners in the rotation: Use the drop down selector to add required learners
Assessors are faculty members
Browse faculty and click on the required names to add them as assessors
Select Associated Faculty: This tool will pull the names of faculty listed on the course setup page as associated faculty
Feedback Options: This will add a default item to the distribution asking if the faculty member met with the trainee to discuss their assessment.
Assessors are external to the installation of Elentra
This allows you to add external assessors to a distribution
Begin to type an email address; if the user already exists you'll see them displayed in the dropdown menu. To create a new external assessor, scroll to the bottom of the list and click 'Add External Assessor'
Provide first and last name, and email address for the external assessor and click 'Add Assessor'
Feedback Options: This will add a default item to the distribution asking if the external assessor met with the trainee to discuss their assessment.
Click 'Next Step'
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired.
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A+E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
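The release criterion above is a simple percentage gate: a target sees tasks completed on them only once they have finished enough of their own tasks. A minimal sketch of that gating logic, assuming a configurable threshold (names are illustrative, not Elentra's):

```python
def can_view_completed_on_me(completed_by_me, assigned_to_me, threshold_pct):
    """Illustrative sketch of the task list release gate: the target sees
    tasks completed on them only after completing at least threshold_pct
    of their own assigned tasks."""
    if not assigned_to_me:
        # Assumption for the sketch: nothing assigned means nothing to gate on.
        return True
    done_pct = 100.0 * completed_by_me / assigned_to_me
    return done_pct >= threshold_pct
```

With an 80% threshold, a learner who has completed 8 of 10 assigned tasks would see tasks completed on them; one who has completed 7 of 10 would not yet.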
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Prompted Response Notifications: This allows you to decide what action to take for any answers on the form that are designated as prompted or flagged response options in items included on the form. (For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response.) You can optionally select to email the Assessment Reviewers, Program Coordinators, Program/Course Directors, or Distribution Authors.
Click 'Save Distribution'.
ME 1.14 includes a new feature relevant to evaluations. It allows learners to optionally release completed tasks to the targets; in use, this means that evaluations, which do not traditionally have the option to be released as individual tasks to targets, can now be released for those targets to view.
See more information in the Learner Use of A&E section.
It has been brought to our attention that this option does not align with policy at many institutions. We are working on an enhancement to allow schools to optionally enable or disable this feature and hope to include it in ME 1.15.
Faculty have quick access to review their assessment and evaluation tasks and those of their affiliated learners through the Assessment and Evaluation badge.
Click the Assessment & Evaluation badge in the top right beside the logout button.
A series of tabs will open under the Assessment & Evaluation header.
When viewing results on any of the tabs in A&E, use the search bar to look for a specific task by name and click the down arrow to limit your results by applying search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed). Apply start and end dates to limit your results to a specific time frame. Remember to click 'Apply Filters' to apply your choices. Remember to click 'Remove Filters' to clear filters and view all results. Note that your previous filter settings may still be applied so if you are seeing no results, or fewer results than you expected, try 'Remove Filters'.
Assessment Tasks This tab shows users all pending tasks for which they are responsible. Faculty can view and complete a task by clicking on it. Faculty can remove a task by clicking 'Remove Task' and providing a reason for its removal. Faculty can download individual forms, or download multiple forms by clicking the download checkbox on each card and then clicking 'Download PDF(s)' at the top right. When users do this they will be able to choose whether to download all tasks as one file or as multiple files.
Tasks Completed on Me This tab displays all tasks completed on the user that they have been given access to view. This can be controlled in the creation of a distribution.
My Reports: To view information from forms completed on oneself, the user can click My Reports on the right hand side.
Set the date range.
Remember that the creator of a distribution can set whether a user can view reports on a distribution, so not every distribution completed on a target is available for the target to view.
My Completed Tasks:
The user can view all completed tasks and again, download an individual file or download multiple files at once.
Faculty: Faculty who are listed as course/program directors have access to view the assessment and evaluation pages of faculty associated with their course/program (including external assessors). Faculty appear on this list if they have been the assessor or target in a distribution tied to the course.
Hide cards of external faculty by clicking Hide Card. Bring hidden cards back into view by clicking 'Show Hidden Faculty'.
Update external faculty emails by clicking on Update Email and providing revised information.
Program directors will be able to view the Current Tasks, Completed Tasks, Upcoming Tasks, and Tasks Completed on Faculty tabs for their faculty.
Send reminders, remove tasks or download tasks. (There is an option to select all and send reminders.)
From the Tasks Completed on Faculty tab the course/program director can also view Reports for this Faculty.
My Learners The My Learners tab will allow faculty to access a variety of information for learners associated with them. This list of available options will vary depending on which modules are in use in Elentra but could include CBME Dashboard, Assessments, and Log Book.
Download a list of all learners associated with a faculty member by clicking 'Download Enrolment'. A PDF titled "learners-for-faculty-name" will download and list all learners, including their primary email and learner level.
Search for an individual by beginning to type their name in the search learners area; the learner cards will automatically update to reflect the searched name.
Refine the list of learners by switching the curriculum period using the dropdown menu on the far right.
Click on the appropriate tab to view the desired information.
My Learners - Assessments For each of the tabs described below you can search by a task name; apply advanced search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed); set a date range; and download a pdf of an individual task or multiple tasks (note that you can select all if required).
Reports on the Learner:
This tool allows you to view and download a pdf report (with or without comments) that aggregates learner performance on the selected form in one report. You can also use the Options dropdown menu to quickly view the form and the individual assessments.
Click group by distribution as desired to sort the forms by distribution. If the same form has been used across multiple distributions this will tease apart each distribution and you can report on forms completed in a single distribution.
Tasks Completed on Learner: View tasks completed on the learner. Cards display the form title and type, task delivery and completion dates, form triggerer name (if applicable), and the name and role of the assessor.
Pending Tasks on Learner: These task cards look similar to the completed tasks but offer the ability to send a reminder about the task. To do so, click the checkbox beside the bell and then click 'Send Reminders' at the top of the page. You also have the option to remove a task on this screen. Be aware that if you remove a task you will be required to give a reason why.
Upcoming Tasks on Learner: This displays scheduled tasks for the learner that are not yet active.
Learner's Current Tasks: This displays tasks the learner is currently responsible for completing (e.g. faculty evaluation, service evaluation, etc.). Task cards may display a rotation or block name depending on how they were scheduled. Faculty can send a reminder or remove a task here as well.
Learner's Upcoming Tasks: This displays a learner's upcoming tasks scheduled through a distribution.
Program coordinators have quick access to review their assessment and evaluation tasks and those of their affiliated learners and faculty through the Assessment and Evaluation badge.
Click the Assessment & Evaluation badge in the top right beside the logout button.
A series of tabs will open under the Assessment & Evaluation header.
When viewing results on any of the tabs in A&E, use the search bar to look for a specific task by name and click the down arrow to limit your results by applying search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed). Apply start and end dates to limit your results to a specific time frame. Remember to click 'Apply Filters' to apply your choices. Remember to click 'Remove Filters' to clear filters and view all results.
Assessment Tasks This tab shows users all pending tasks for which they are responsible. This is particularly important for program coordinators because of the delegation type distributions. If a distribution was created and set as a delegation to be sent to a program admin, they can access that delegation here and assign the task to the appropriate faculty (or other user). Users can view and complete a task by clicking on it. Users can remove a task by clicking 'Remove Task' and providing a reason for its removal. Users can download individual forms, or download multiple forms by clicking the download checkbox on each card and then clicking 'Download PDF(s)' at the top right. When users do this they will be able to choose whether to download all tasks as one file or as multiple files.
Tasks Completed on Me This tab displays all tasks completed on the user that they have been given access to view. Whether a user can view tasks completed on them is controlled in the creation of a distribution.
My Reports: To view information from forms completed on oneself, the user can click My Reports on the right hand side.
Set the date range.
Remember that the creator of a distribution can set whether a user can view reports on a distribution, so not every distribution completed on a target is available for the target to view.
My Completed Tasks: The user can view all completed tasks and again, download an individual file or download multiple files at once.
My Learners The My Learners tab will allow users to access a variety of information for learners associated with them. The list of available options will vary depending on which modules are in use in Elentra but could include CBME Dashboard, Assessments, and Log Book.
Download a list of all learners associated with a faculty member by clicking 'Download Enrolment'. A PDF titled "learners-for-faculty-name" will download and list all learners, including their primary email and learner level.
Search for an individual by beginning to type their name in the search learners area; the learner cards will automatically update to reflect the searched name.
Refine the list of learners by switching the curriculum period using the dropdown menu on the far right.
Click on the appropriate tab to view the desired information.
My Learners - Assessments For each of the tabs described below you can search by a task name; apply advanced search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed); set a date range; and download a pdf of an individual task or multiple tasks (note that you can select all if required).
Reports on the Learner:
This tool allows you to view and download a pdf report (with or without comments) that aggregates learner performance on the selected form in one report. You can also use the Options dropdown menu to quickly view the form and the individual assessments.
Click group by distribution as desired to sort the forms by distribution.
Tasks Completed on Learner: View tasks completed on the learner. Cards display the form title and type, task delivery and completion dates, form triggerer name (if applicable), and the name and role of the assessor.
Pending Tasks on Learner: These task cards look similar to the completed tasks but offer the ability to send a reminder about the task. To do so, click the checkbox beside the bell and then click 'Send Reminders' at the top of the page. You also have the option to remove a task on this screen. Be aware that if you remove a task you will be required to give a reason why.
Upcoming Tasks on Learner: This displays scheduled tasks for the learner that are not yet active.
Learner's Current Tasks: This displays tasks the learner is currently responsible for completing (e.g. faculty evaluation, service evaluation, etc.). Task cards may display a rotation or block name depending on how they were scheduled. You can send a reminder or remove a task here as well.
Learner's Upcoming Tasks: This displays a learner's upcoming tasks scheduled through a distribution.
Faculty: Program coordinators associated with a course will be able to access a tab listing faculty. Faculty will appear on this list if they have been the assessor in a distribution or the target or assessor of a triggered form tied to the same course as the program coordinator.
Hide cards of external faculty by clicking Hide Card. Bring hidden cards back into view by clicking 'Show Hidden Faculty'.
Update external faculty emails by clicking on Update Email and providing revised information.
Program coordinators will be able to view the Current Tasks, Completed Tasks, Upcoming Tasks, and Tasks Completed on Faculty tabs for the available faculty.
Program coordinators can send reminders, remove tasks or download tasks. (From this screen there is an option to select all and send reminders.)
From the Tasks Completed on Faculty tab the program coordinator can also view Reports for this Faculty.
This tool allows the program coordinator to view and download a pdf report (with or without comments) that aggregates evaluations of the selected faculty in one report. Program coordinators can also use the Options dropdown menu to quickly view the form and the individual evaluations.
Responses and comments aggregated in these reports are de-identified.
Click group by distribution as desired to sort the forms by distribution. If the same form has been used across multiple distributions this will tease apart each distribution and you can report on forms completed in a single distribution.
A form is a collection of items used to assess or evaluate a learner, faculty member, course, service, event, or anything else in your organisation.
Some form templates are available in the CBME module but there is not a user interface to build additional form templates at this point. This section is about creating forms without templates.
If you are creating a form to be attached to a gradebook post for the purpose of online grading using a dropbox and Assessment and Evaluation form please note that not all item types are currently supported because there is no structure to weight them on the form posted to the gradebook. When creating a form to use with a gradebook dropbox it is recommended that you only use multiple choice, dropdown selector, rubric and scale items. If your form requires narrative comments do not use the free text comment item type as the grader will not be able to save their comments; instead, allow or require comments on your scale or rubric items and encourage graders to provide feedback within the rubric or scale item.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Click 'Add Form'.
Provide a form name and select a type (if applicable).
Click 'Add Form'.
Provide a form description if desired and set form permissions to give access to other users. Anyone given permission to the form will be able to edit it until it is used in a distribution.
Click 'Add Item(s)' to add existing items.
Search for existing items and tick off the check boxes, then click 'Attach Selected' to apply your choices.
You can also add grouped items, free text (e.g., for instructions), or a curriculum tag set to your form. To add any of these, click on the down arrow beside Add Items. If you choose to add Free Text or a Curriculum Tag Set, please note that you must save your choices within the item box using the small 'Save Required' button.
Adding a Curriculum Tag Set is a very specific tool that supports field notes for use in family medicine. Most users should ignore this option.
To create new items while creating your form, click Add Items and then click Create & Attach a New Item. When you complete your new item and save it, you will be returned to the form you were in the process of building.
Preview your form by clicking on the eye icon/Preview Form button.
Download a copy of the form using the Download PDF button.
Save the form when you have added all the relevant items.
To delete items on a form, tick off the box on each item card and then click the red Delete button.
To rearrange items on a form, click the crossed arrow icon on the item card and drag and drop the item where you want it to be.
To edit an item, click on the pencil icon. Note that an item already in use on a form that has been distributed cannot be edited. Instead, you must copy the item and attach the new version to edit and use it.
To quickly view the details of an item, click on the eye icon on the question card.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Use the search bar to look for the form you want to copy. Click the down arrow beside the search bar to apply filters to refine your search results.
Click on the name of the form you want to copy.
Click 'Copy Form' and provide a new name for the copied form.
Click 'Copy Form'.
Edit the form as needed and click 'Save'.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Use the search bar to look for the form you want to delete. Click the down arrow beside the search bar to apply filters to refine your search results.
Tick off the box beside the form name (you can select multiple forms to delete at once), and then click the red Delete Form button.
Elentra includes a variety of reporting options through the Admin > Assessment & Evaluation tab. These reports reflect data collected through forms managed through distributions. To view reports for triggered forms when using the Competency Based Medical Education module, use the learner CBME dashboard. (More information about CBME reporting is available in the CBME section.) One exception is if you are trying to report on form feedback from forms built with form templates in CBME; that feedback is available through the Assessment Tools Feedback Report in the Assessments section.
A&E Reporting is an administrative reporting tool. Note that individual users may have access to their own reports via their Assessment & Evaluation button, but the availability of such reports depends on how distributions were set up.
When you create most reports you will have some additional options after selecting the appropriate course/faculty/learner/form, etc. These options allow you to customize the reports you run for different audiences.
As of ME 1.13 not all these options will actually display on all reports, despite the fact that you have the option to select them through the user interface. Please see each specific report for additional detail about what will or will not be visible.
Include Comments: Enable this option if you'd like the report to include narrative comments made on the selected form.
Unique Commenter ID: If you select to include comments you'll see this option. It allows you to apply a masked ID number to each assessor/evaluator. This can be useful to identify patterns in comments (e.g., multiple negative comments that come from one person) while protecting the identity of those who completed the form.
Include Commenter Name: If you would like to display the names of commenters, click on the checkbox.
Include Description: If you click this checkbox you can give a description to the report. The text you enter will be displayed at the top of the generated report.
Include Average: Click this checkbox to include a column showing the average score.
Include Aggregate Scoring: If you enable the average, you'll have the option to also include a column with aggregate positive and negative scoring in some reports. This gives a dichotomous overview of positive and negative ratings.
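The Unique Commenter ID option replaces each commenter's name with a stable masked number, so patterns across comments remain visible while identities stay hidden. A minimal sketch of that idea (the function and data shape are hypothetical, not Elentra's implementation):

```python
def mask_commenters(comments):
    """Illustrative sketch: replace assessor names with stable numeric IDs.
    The same person always maps to the same ID, so repeated comments from
    one individual can be spotted without revealing who they are.

    comments: list of (commenter_name, comment_text) pairs.
    """
    ids = {}      # name -> masked ID, assigned in order of first appearance
    masked = []
    for name, text in comments:
        key = ids.setdefault(name, len(ids) + 1)
        masked.append((f"Commenter {key}", text))
    return masked
```

Because the mapping is stable, two negative comments from the same assessor both appear under, say, "Commenter 1", which is exactly the pattern-spotting use case the option is meant to support.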
For use when you have clinical learning courses with block schedules and want an overview of those learners who had an approved leave during a specific block.
Select a curriculum period.
Select a block.
Click 'Download PDF'.
For use when you have clinical learning courses with rotation schedules and want an overview of those learners who had an approved leave during a specific rotation.
Set a date range.
Select one or more learners.
Set the report parameters regarding displaying description and comments.
Click 'Generate Report'.
Note that once generated, this report is available to download by clicking 'Download PDF'.
A distribution defines who will complete a form, who/what the form is about, and when the form will be completed. Usually, program coordinators manage distributions of forms to cohorts of learners, faculty, event participants, etc. Generally, the following language is used in the A&E module:
Assessor/Evaluator: The person completing a form/task
Target: The person, course, experience, or other entity that the form/task is about
Distribution method: How the form/task is assigned to people (e.g. based on a date range, rotation schedule, event schedule, etc.)
The available distribution methods and some of their use cases are listed below:
Rotation Schedule: Use a rotation schedule distribution to send forms based on a rotation schedule. This could include assessments of learners participating in a rotation, evaluation by learners of a specific rotation site or experience, etc.
Delegation: Use a delegation to schedule a distribution to be sent to an intermediate person who will later forward the tasks to the appropriate assessor/evaluator. For example, if you want to set up distributions in September, but it is unknown who learners will work with in January because a clinic schedule hasn't been set yet, you could send the distribution to a delegator to forward once the clinic schedule is known.
Learning Event Schedule: Use a learning event schedule distribution when you want to have participants evaluate an event, when you want participants to evaluate the faculty teaching particular types of events, or when you want faculty to assess the learners participating in certain types of events.
Date Range: Use a date range to create the most generic type. You could use this for faculty or course evaluations, or for assessment of learners at any point.
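The terminology and methods above boil down to a simple conceptual model: a distribution ties a form to a set of assessors, a set of targets, and a method that controls timing. The sketch below is a hypothetical data model for illustration only (class and field names are invented, not Elentra's):

```python
from dataclasses import dataclass, field

@dataclass
class Distribution:
    """Illustrative model of the three questions a distribution answers:
    who completes the form (assessors), who or what it is about (targets),
    and how tasks are assigned (method)."""
    title: str
    form: str
    method: str  # e.g. "rotation_schedule", "delegation", "learning_event", "date_range"
    assessors: list = field(default_factory=list)
    targets: list = field(default_factory=list)

    def tasks(self):
        # Conceptually, one task per (assessor, target) pair; in practice
        # the chosen method also controls when each task is delivered.
        return [(a, t) for a in self.assessors for t in self.targets]
```

For example, a date range distribution of a faculty evaluation form from two learners (assessors) to one faculty member (target) conceptually generates two tasks.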
A distribution wizard walks users through creating any distribution. Note that a distribution is only saved once all five steps of the wizard are complete. You can navigate forward and backward in a distribution using the Previous Step and Next Step buttons. You can cancel a distribution at any time by clicking 'Cancel'.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Click 'Add New Distribution'.
This will open a five-step wizard that walks you through creating a distribution. On the following pages you can read about the specifics of each distribution method.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Use the search tool to look for the distribution you want to copy. Click the down arrow beside the search bar to apply filters to refine your search results.
Click on the cog icon to the right of the distribution you want to copy. Select 'Copy Distribution'.
Edit the information in each step of the distribution wizard as needed and save your work.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Use the search bar to look for the distribution you want to delete. Click the down arrow beside the search bar to apply filters to refine your search results.
Click the checkbox beside the name of the distributions you want to delete.
Click the red Delete Distributions button.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Use the search bar to look for the distribution you want to manage. Click the down arrow beside the search bar to apply filters to refine your search results. You will only see distributions to which you have access.
Click on the cog icon to the right of the distribution you want to review. Select 'View Distribution Report'.
Review progress. Click on any category (Pending, In Progress, Completed) to view more details about specific targets and assessors.
Pending tasks have not yet been started. In Progress tasks have been started but are not yet complete. Completed tasks are done and have been submitted.
To delete tasks tick off the box below the garbage icon on each task card and then click the red Delete Task(s) button.
To send reminders to those with incomplete forms, tick off the box below the bell icon on each task card and then click the blue Manage Distribution button and select Send Reminders from the dropdown list. To select all tasks for reminders click on the bell icon.
Review your choices and, if correct, click Confirm Reminders. You will get a green success message.
To add a task to a distribution, click the blue Manage Distribution button and select 'Add a Task' from the dropdown list. Complete the required information and click 'Confirm Task'.
Viewing progress results for learning event-based distributions: Program coordinators who set up such distributions should view progress from their My Assessments page.
Many of the functions described above can also be completed at an individual task level when logged into the system as a program coordinator or faculty director.
Click the Assessment and Evaluation badge at the top right (beside the logout button).
Click the My Learners tab.
Click on the appropriate tab for a learner (e.g. CBME, Assessments, Logbook).
When looking at a user's Assessment & Evaluation dashboard, some users will be able to send reminders, remove tasks, and download PDFs of selected tasks from their assigned learners or faculty.
New in ME 1.14! Ad hoc distributions create a form available to be triggered on demand by learners or faculty. Please note that a form can only be triggered once during the distribution (this differs from the CBME module where a form can be triggered multiple times).
Currently, this feature is designed to support learner assessments only. A distribution set up as an evaluation will not work within the context of the new ad hoc feature.
WARNING: The ad hoc distribution method is not intended to be used in an organization that has CBME enabled (e.g. most postgraduate medical programs in Canada). Additionally, it should only be used for assessments, not evaluations.
An ad hoc distribution allows an administrator to set up a form that is available to a specific audience, during a specific time frame, and can be triggered by learners or faculty. A potential use case is a clinical environment where it is unknown which learners will work with which assessors. Using an ad hoc distribution, administrative staff can set up the distribution then allow learners or faculty to trigger the assessment and provide details about who the target or assessor in the situation was.
As ME 1.14 marks the first version of the ad hoc distribution, please be aware of the following:
Users can only select 'complete and confirm via PIN' or 'send blank form' as form completion methods (you can't currently begin a form and send it via email or use the self-assess and send blank form options).
When setting the targets of a distribution you can only use cohort and individuals (you can't currently set the distribution up with a course audience or course group as the target).
The assessors list will be based on course contacts from the course setup page.
Use the ad hoc distribution for assessments only.
There are some database settings that can be used to tailor the use of the ad hoc distribution. If you want to change these you will need help from a developer.
You can restrict the ability to trigger the form to only targets or only assessors (by default both can trigger forms).
You can turn on or off the ability to complete and confirm forms via PIN or to send a blank form (by default both are allowed).
Permissions: Anyone with access to Admin > Assessment & Evaluation will be able to create an ad hoc distribution. For default installations this will include Medtech: Admin, Staff:Admin, Faculty:Admin, Staff:PCoordinator and Faculty:Directors.
Navigate to Admin > Assessment & Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view. In addition, this distribution title will be used by learners and faculty to access the form. For this reason, a clear title including a course or use is recommended (e.g. Clinical Skills Week 1 History).
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra distinguishes assessments from evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc., it is an evaluation. Notice that the language on Step 4, as well as other steps of the wizard, will change if you switch your task type.
Assessment Mandatory: This will be checked off by default. Currently this information is recorded but does not impact how a task displays to a user.
Select Form: The form you want to distribute must already exist and you must have permission to access the form; pick the appropriate form from the dropdown menu.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'.
Distribution Method: Select 'Adhoc' from the dropdown menu.
Start Date: This is the beginning of the period the form is meant to reflect.
End Date: This is the end of the period the form is meant to reflect.
Task Expiry: Optional. Set the date on which the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and no longer be available to complete). Tasks will expire at 12:00 AM on the day that you select, so it is best to select the day after your intended expiry date.
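Since tasks expire at 12:00 AM on the selected date, the practical cutoff is the end of the previous day. A minimal sketch of that date arithmetic (the helper function is illustrative only, not part of Elentra):

```python
from datetime import date, timedelta

def expiry_date_to_select(last_day_available: date) -> date:
    """Tasks expire at 12:00 AM on the selected date, so to keep tasks
    available through a given day, select the following day."""
    return last_day_available + timedelta(days=1)

# To let assessors complete tasks through June 30, select July 1.
print(expiry_date_to_select(date(2024, 6, 30)))  # 2024-07-01
```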
Click 'Next Step'.
The target is who or what the form is about. You will only set specific targets if you are creating an assessment; if you are creating an evaluation you will not set a target list.
Select Targets: Use this area to specify the targets of the form.
When setting the targets of an ad hoc distribution for an assessment you can only use cohort and individuals (you can't currently set the distribution up with a course audience or course group as the target).
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
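As a worked example of the second option (a list of 20 targets with a minimum of 10 and a maximum of 15 assessments), the bounds check amounts to the following (illustrative Python only, not Elentra's code):

```python
def attempts_satisfied(completed: int, minimum: int, maximum: int) -> bool:
    """True when the number of completed assessments falls within the
    configured bounds, regardless of which targets were chosen."""
    return minimum <= completed <= maximum

print(attempts_satisfied(12, 10, 15))  # True: within the 10-15 range
print(attempts_satisfied(9, 10, 15))   # False: below the minimum
```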
Click 'Next Step'.
The assessors are the people who will complete the form. Currently, ad hoc distributions only support making faculty who are course contacts assessors.
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired.
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A&E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Prompted Response Notifications: This allows you to decide what action to take for any answers on the form that are designated as prompted or flagged response options in items included on the form. (For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response.) You can optionally select to email the Assessment Reviewers, Program Coordinators, Program/Course Directors, or Distribution Authors.
Click 'Save Distribution'.
Once an ad hoc distribution has been created you can monitor its progress from Admin > Assessment & Evaluation. Click on Distributions, and then the name of the distribution.
The Show Progress screen shows you:
Not Started (forms that have been triggered but not started)
In Progress (forms that have been triggered, started, and saved in draft mode)
Completed (forms that have been triggered and completed).
Click on any of the labels to view the names of the targets and assessors and delivery dates.
Administrative staff can also send reminders, and add and delete tasks from here.
After an ad hoc distribution has been set up (remember it may take up to a day for your distribution to become active depending on when behind the scenes cron jobs happen), learners and faculty who have a form available to them will see a "Trigger Assessment" button on their dashboard.
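The activation delay depends on when the installation's scheduled jobs run. As a sketch only, a nightly crontab entry might look like the following; the script path and schedule here are hypothetical and vary by installation, so check with your Elentra system administrator:

```shell
# Hypothetical crontab entry (path and timing are installation-specific).
# Runs the assessment task-generation job nightly at 02:00, which is when
# a newly saved distribution would become active.
0 2 * * * /usr/bin/php /var/www/elentra/core/cron/assessment-tasks.php
```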
From the dashboard, click 'Trigger Assessment'.
Select a distribution (begin to type a distribution name to quickly filter the list).
Select an assessor. Hover over an assessor name to see their photo (if uploaded) and details about them including email and course affiliation.
Select the assessment method. Email blank form will send a copy of the form to the selected assessor. Complete and confirm via PIN will allow the learner to immediately view and start the form, then have the assessor sign off on the form using his/her PIN. (For information on setting user PINs, please see here.)
If the assessor has a PIN set the system will automatically default to Complete and confirm via PIN as the assessment method.
Select a date of encounter.
Click 'Submit'.
If the learner selected Email blank form, the assessor will receive an email alerting them to the task; additionally, they will see the task added to their Assessment Tasks list accessible from their A&E badge in the dashboard header.
If the learner selected Complete and confirm via PIN, the form will display on the screen. The learner can begin to complete the form, then pass the device to the assessor. When the form is complete, the assessor can enter their PIN to confirm and complete the form.
Note that if a form triggered using Complete and confirm via PIN is saved as a draft, the learner will need to reopen the form when in the company of the assessor to have the form completed.
After a form is completed, the learner's ability to view it will depend on the distribution settings. If the distribution allows the learners to view the tasks, they will be able to view them from their Tasks Completed on Me list accessed from their A&E badge in the dashboard header.
From the dashboard, click 'Trigger Assessment'.
Select a distribution (begin to type a distribution name to quickly filter the list).
Select a target. Hover over a target name to see their photo (if uploaded) and details about them including email, group and role (e.g. student, 2022) and enrolled courses.
Select a date of encounter.
Click 'Begin Assessment'.
The form will display on the screen.
The faculty member can complete the form and Save as Draft or Submit. Note that if the user saves the form as draft, they will have to reopen and complete it at a later date.
The faculty member can delete the task if they triggered it in error.
The faculty member can forward the task to another faculty member if needed.
Faculty can view their draft forms, or forms that leaners have triggered to them from their Assessment & Evaluation badge located in the dashboard header. The tasks to be completed will display in the Assessment Tasks list.
The Evaluation Reports section will allow you to generate reports based on evaluations completed via distributions. Evaluation reports will not include commenter names, even if you check off the commenter name option when setting the report options.
This report is relevant only if your organization uses the Clinical Experiences rotation schedule. If you distribute course evaluations or other forms through a rotation-based distribution, you can use this report to view results. The exact format of the report will depend on the form it is reporting on.
Select a course, date range, rotation, curriculum period and form.
Set the report parameters regarding displaying comments and averages.
Click 'Generate Report'.
This report will not include learner names, even if you check off the commenter name option when setting the report options.
For use with distributions completed by event type.
Set a date range.
Select Individual Events: Check this off if you want the ability to select individual events (otherwise you will have to report on all events).
Select the event type, distribution by a curriculum period, learning event and form.
This report will not include learner names, even if you check off the commenter name option when setting the report options.
Use this to report on feedback provided by participants when a feedback form is attached to an event as a resource.
Select a course, date range, event type, form and learning event (optional).
You will notice some extra report options for this type of report.
Separate Report for Each Event: This will provide separate files for each report if multiple events are selected to include.
Include Event Info Subheader: This will provide a bit of detail about the event being evaluated (title, date, and teacher).
This report can include an average and an aggregate positive/negative score. This report will not include learner names, even if you check off the commenter name option when setting the report options.
For use in viewing a summary report of learner evaluation of an instructor.
Select a course and set a date range.
Select a faculty member from the dropdown menu by clicking on his/her name. Note that only faculty associated with the selected course in the given time period will show up on the list. Additionally, they must have been assigned as an assessor in another distribution in the organization. Please see additional information below.
Select a form and distribution (optional).
Set the report parameters regarding displaying comments and averages.
Click 'Download PDF(s)'.
This report will not include an average, even if you check off Include Average when setting the report options.
Faculty names become available to select for this report only when the faculty member is also an assessor on a distribution in the organization. This is designed in part to protect the confidential nature of faculty evaluations and prevent staff from being able to generate reports on any faculty at any time. If your organization does not use assessments or you require reports on faculty whose names aren't available, please reach out to us and we can help you put a workaround in place.
You may also be able to report on faculty evaluations by accessing an aggregated report from a specific distribution. Please see more detail in the Weighted CSV Report section in Distribution.
For use in viewing a summary report of learner evaluation of a course.
Select a course and set a date range.
Select a form and distribution (optional).
Set the report parameters regarding displaying comments and averages.
Click 'Download PDF(s)'.
This report will not include learner names, even if you check off the commenter name option when setting the report options. This report can include an average and an aggregate positive/negative score.
Program or Curriculum Coordinators have two methods to monitor completion of forms and tasks associated with their program and learners.
Coordinators can access their own tasks (including delegations) and monitor the progress of individual faculty and learners through the Assessment & Evaluation Badge located between the user name and the logout button.
Coordinators can monitor the progress of any distributions they have access to and view a complete inventory of outstanding, in progress, and deleted tasks for users associated with their course or program through the Admin>Assessment & Evaluation tab.
One of the distribution options is to create a delegation. This sends tasks to a delegator to forward to the appropriate person at a later date. If there are multiple targets for a delegation, users will be able to forward some tasks to one assessor, and other tasks to another assessor.
If a distribution was created using a delegation, the designated user needs to complete the delegation by assigning assessors.
Log in as the delegator.
From the Assessment Tasks tab, click on the relevant assessment.
You'll see a list of targets in a table.
Click the checkbox beside a target name.
At the bottom right click 'Select Assessors'.
Search for an assessor and click the checkbox beside the assessor name.
If you need to add an additional internal and/or external assessor you may do so by clicking the 'Add Additional Assessor' button. This will allow you to enter the name of an internal user and/or email address of an existing external assessor. You can even add a new external assessor if necessary.
After adding the required assessors, click 'Proceed to Confirm Assessments'.
If you wish to mark the delegation as complete, click the checkbox. This will move the task to the delegator's My Completed Tasks list. If you have targets without assessors/evaluators, you will receive a warning. If you'd rather the task stay on the Assessment Tasks tab, do not click the checkbox.
Confirm your choice by clicking 'Create Assessment Tasks'.
The list of targets and assessors/evaluators will be updated with the newly entered information.
In the date range based delegation above, the delegator has assigned 2 targets to Alex Adams and 2 to Bennett Adkins. The rest of the tasks will be assigned at a later date.
When building a distribution, a reviewer can be set in Step 5 of the Distribution Wizard. This person will need to review all completed tasks and approve, reopen, or hide each one before it is released. The use case for this might be providing an intermediary who checks the appropriateness of narrative comments on course and faculty evaluations or peer assessments.
For more information on setting up a reviewer on a distribution, please see the page for the distribution type you are working with.
If you have been set as the reviewer for a distribution, click your Assessment and Evaluation badge and you'll see tasks with a Reviewer label under your Assessment Tasks tab.
Click View Task.
Review the form contents.
Choose one of three options:
Reopen the task: This will send the task back to the assessor/evaluator and it will remain in their pending/in progress task list until they complete it. If you reopen a task you will get a message indicating that the task has successfully been reopened.
Approve the task: This will essentially mark the task as finalized and release the task to the target (if allowed by the parameters of the distribution).
Hide the task: This will hide the task from being viewed by the target. The person who completed the task gets no feedback. If you hide a form, you will be prompted to enter a reason for why you hid the form. Note that hiding a form does not remove its data from reports generated about this form and distribution.
Note that currently, there is no user interface to retrieve all hidden forms or to view the comments about why the form was hidden.
Learners have quick access to review their assessment and evaluation tasks through the Assessment and Evaluation badge.
Click the Assessment & Evaluation badge in the top right beside the logout button.
Three tabs will be visible to learners: Assessment Tasks, Tasks Completed on Me, and My Completed Tasks.
When viewing results on any of the tabs in A&E, use the search bar to look for a specific task by name, and click the down arrow to limit your results by applying search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed). Apply start and end dates to limit your results to a specific time frame. Click 'Apply Filters' to apply your choices, or 'Remove Filters' to clear filters and view all results. Note that your previous filter settings may still be applied, so if you are seeing no results, or fewer results than you expected, try 'Remove Filters'.
Assessment Tasks: This tab displays tasks the learner is currently responsible for completing (e.g. faculty evaluation, service evaluation, etc.). Task cards may display a rotation or block name depending on how they were scheduled. Learners can view a task, download individual tasks or multiple tasks at once, and can remove tasks. If the learner removes a task, they will have to provide a reason.
Tasks Completed on Me: This tab displays all the tasks that have been completed on the learner. Task cards display the form title and type, task delivery and completion dates, form triggerer name (if applicable), and name and role of the assessor. Learners can view a task and download individual tasks or multiple tasks at once.
My Reports: From this button on the right-hand side, learners can access reports on forms completed on them (if the distribution allowed learners to view results). Provide the appropriate dates. Click 'Report'.
My Completed Tasks: This tab displays forms the learner has completed. Task cards display the form title and type, rotation/block name, task delivery and completion dates, and progress. Learners can view a task and download individual tasks or multiple tasks at once.
When you toggle to card view of a list of users (e.g. within a distribution) you'll be able to see their user photo (if they have provided one).
New in ME 1.14! Learners can now optionally choose to release an evaluation task to a faculty member.
When administrators create distributions they can optionally decide whether or not to release the tasks and associated reports to the targets of the forms. Choosing to release tasks allows the target to view forms completed on them. In the case of evaluations, administrators only have the option to release task reporting to targets, not release individual tasks. As of ME 1.14, we've introduced an option for learners to override this and allow a faculty member to view forms completed on them. This applies in the case of evaluations only.
At the bottom of an assigned evaluation form, learners will now see the option to release the evaluation to the selected target. If the learner picks Yes from the dropdown selector, the task will become available for the target to view through their A&E tab.
The faculty member can now view that completed form.
Note that a learner opting to individually release a task for faculty to view does not change the faculty member's ability to create a report (that is still dictated by the distribution).
In addition to being able to access their personalized Assessment & Evaluation badge where they can view their own tasks and access My Learners and My Faculty lists, coordinators and some faculty will also have access to Admin>Assessment & Evaluation. From this tab, coordinators can filter results and easily view outstanding, upcoming, and deleted tasks, as long as they have been added as a Course Contact. Coordinators can send reminders and remove tasks from this screen as well.
Faculty who are the assessors in a distribution or the target or assessor on a triggered form should show up here. Learners enrolled in PA or Course Director's program will also display on this list.
From the A&E Dashboard users can search for individual tasks or users to view their progress. Hovering over the target number will reveal a list of targets. Reminders can be sent by clicking the box below the bell icon and then clicking Send Reminders.
If you are at an institution using the CBME features of Elentra, note that you can filter by delivery method to view triggered assessments, or assessments and evaluations sent out via delegation.
Note that the Owner of a task is the person who has responsibility for completing the next step of a task. As such, if a user has stored a task in draft mode, they may be the owner of the task, even if it will eventually be sent to a faculty/preceptor.
This report can be used by administrative staff to keep an inventory of distributions.
Select a report type. You can see an overview of distributions or individual tasks.
Select a course.
Set a date range.
Select a task type.
Select a distribution or an individual task (your option will depend on the first selection you made on the page).
Click 'Generate Report'.
From here you can search within the results or click on any distribution to see its progress.
Use to see an overview of who is set as a reviewer for distributed tasks. (When you create a distribution you can assign a reviewer who serves as a gatekeeper of completed tasks before they are released to be seen by their target. This is completed during the final step of a distribution. For additional information please see the Assessment and Evaluation>Distributions help section.)
More information coming soon.
This report functions like the report above but offers users a view of the report in the interface without requiring them to open a PDF.
Please see the next page for information on the Weighted CSV Report.
Users with access to Admin>Assessment and Evaluation will be able to view A&E reports. Generally this will include staff:admin, staff:pcoordinator, and faculty:director users, assuming the staff:pcoordinators and faculty:directors are assigned to a specific course or program.
To view A&E Reports:
Click Admin>Assessment & Evaluation.
From the second tab menu, click on 'Reports'.
Reports are divided between evaluations, assessments (both based on the type of distribution selected), and distributions. As of ME 1.13 these reports include data from distributed forms. If you also use triggered forms through the Competency Based Medical Education module, note that these forms can be reported on through different tools, including the Milestone Report and other aggregated reports available from the learner CBME dashboard.
A date range distribution allows you to send a form to the appropriate assessors/evaluators in a specific date range.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra distinguishes assessments from evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc., it is an evaluation. Notice that the language on Step 4, as well as other steps of the wizard, will change if you switch your task type.
Assessment Mandatory: This will be checked off by default.
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Date Range Distribution' from the dropdown menu.
Start Date: This is the beginning of the period the form is meant to reflect.
End Date: This is the end of the period the form is meant to reflect.
Delivery Date: This is the date the task will be generated and delivered to the assessors/evaluators. (The delivery date will default to be the same as the start date.)
Task Expiry: Optional. Set the date on which the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and no longer be available to complete). Tasks will expire at 12:00 AM on the day that you select, so it is best to select the day after your intended expiry date.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to set an automatic warning notification if desired. This will send an email to assessors a specific number of days and hours before the task expires.
The target is who or what the form is about.
Assessments delivered for: Use this area to specify the targets of the form.
If you choose "Select learners," you'll see the following options:
Select Targets:
Use the dropdown menu to select the appropriate learners
CBME Options: This option applies only to schools using Elentra for CBME. If you are not using CBME, ignore it and leave it set to non-CBME learners. If you are a CBME school, this allows you to apply the distribution to all learners, non-CBME learners, or only CBME learners as required.
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
Click 'Next Step'
The assessors are the people who will complete the form.
There are three options:
Select faculty members
Browse faculty and click on the required names to add them as assessors
Select Associated Faculty will add the names of all faculty listed as course contacts on the course setup page
Exclude self-assessments: If checked, this will prevent the assessor from completing a self-assessment
Feedback Options: This will add a default item to the distribution asking if the faculty member met with the trainee to discuss their assessment.
Select learners
Click Browse Assessors and click on the appropriate option to add learners as assessors
Exclude self-assessments: If checked, this will prevent the assessor from completing a self-assessment
Select individuals external to the installation of Elentra
This allows you to add external assessors to a distribution
Begin to type an email, if the user already exists you'll see them displayed in the dropdown menu. To create a new external assessor, scroll to the bottom of the list and click 'Add External Assessor'
Provide first and last name, and email address for the external assessor and click 'Add Assessor'
Feedback Options: This will add a default item to the distribution asking if the faculty member met with the trainee to discuss their assessment.
Click 'Next Step'
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired.
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A+E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Prompted Response Notifications: This allows you to decide what action to take for any answers on the form that are designated as prompted or flagged response options in items included on the form. (For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response.) You can optionally select to email the Assessment Reviewers, Program Coordinators, Program/Course Directors, or Distribution Authors.
Click 'Save Distribution'.
The Assessment Reports section mostly allows you to generate reports based on assessments completed via distributions. There are some exceptions; however, most reports are for distributed forms.
New in ME 1.14! The Completed Assessments Report allows you to view how many tasks have been sent to an assessor and how many they have completed (for triggered tasks).
This report allows you to compile all assessments completed on a target in one or more courses into one or more files. It does not aggregate results; it simply compiles multiple forms.
Select a course, set a date range and select a course group (optional).
Select a learner.
Select a form (optional).
Click 'Download PDF(s)'.
Choose whether to download as one file (all forms will be stored in a single file) or not (you'll download a separate file for each form).
A file will download to your computer with the appropriate forms included. Each form will include the target and assessor, delivery and completion date, form responses and comments, etc. The file names will be: learnerfirstname-learnerlastname-assessment-datereportrun-#.pdf. For example: earnest-acosta-assessment-20181005-1.pdf
Use this report to create an aggregated report on learner performance on a single form that may have been used multiple times and completed by multiple assessors. For this report, the list of learners available to staff:pcoordinator and faculty:director users will depend on their affiliation with a course/program.
Set the date range.
Select a learner.
Select a form.
Set the report parameters regarding displaying comments and averages.
Click 'Download PDF(s)'.
An average and an aggregate positive and negative score are available with this report. This report will not include learner names beside comments, even if you check the commenter name option when setting the report options.
Please note that if, in the distribution for a form, you set comments to be identifiable, learners will be able to view commenter names when they self-report the form results, as seen below.
For use in reporting on tasks delivered to and completed by faculty. Report columns include the number of tasks delivered and completed as well as the average time to completion from delivery date and average time to completion from the end of the experience (e.g., a block) per user. It also provides an overall average across all users.
Select a course.
Set a date range.
If external assessors were used, you will have the option to include them in the report.
Select one or more users by clicking the checkbox beside each required name. Please note that if you select all faculty it can take some time for all names to appear, so please be patient. To remove a user from the report, click the 'x' beside the user's name.
Include Average Delivery Date: Enable this if desired.
Click 'Download PDF(s)'.
For use in collating responses provided by faculty completing forms produced through form templates in the competency-based medical education module. This report aggregates comments from forms that are sent out using a distribution. When logged in as an admin, you'll see a full list of all form feedback provided thus far displayed on the screen.
Set a date range.
Select a course from the dropdown options.
Select a tool from the dropdown options.
Click 'Apply Filters'.
Results will display on the screen and you can click 'Download PDF(s)' if you need to download a copy.
To begin a new search be sure to click 'Reset Filters' and select a new course and/or tool as appropriate.
You can click on the page icon to the right of the feedback column to view the form being referenced.
For use in monitoring the progress of faculty in completing the tasks assigned to them through triggered forms.
Select a course.
Set a date range.
Decide whether or not to include external assessors. (For many schools this will be irrelevant, as you do not yet have the ability to trigger tasks to external assessors.)
Select the relevant faculty.
Click 'Download CSV(s)'.
A csv file will download to your computer.
The Weighted CSV Report provides a CSV file that includes the data collected through completed forms. It lists users who completed the form down the side and form items across the top. Inside each cell will be data from the form representing the scale rating selected by those who completed the forms. If your response descriptors include numbers (e.g., "1 - Overall, this instructor is an effective teacher."), note that those numbers will not necessarily be reflected in the CSV.
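As a rough sketch of this layout (the assessor names, item labels, and cell values here are invented for illustration, not drawn from Elentra), the downloaded file resembles:

```
Assessor,Item 1,Item 2,Item 3
jane-doe,3,4,2
john-smith,4,4,3
```

The numbers in each cell are the weighted values for the scale responses each assessor selected, as described in the following paragraphs.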
It is important to note that the Weighted CSV Report was specifically designed to be used in conjunction with items using a rating scale (e.g., grouped items using a rubric), and allows you to create custom weights for scale response descriptors, which get reflected in the report. There is currently no way to configure these weights through the user interface, and you will need a developer's help to assign weights to scale response descriptors in the database. (Developers, you'll need to use the cbl_assessment_rating_scale_responses table.)
If no weights are applied to the scale responses, the report defaults to assigning values of 0, 1, 2, 3, and 4 to the responses in left-to-right order. In effect, the Weighted CSV Report will work best if the rating scale you apply to the items mimics 0-4 values (e.g., Not Applicable, Strongly Disagree, Disagree, Agree, Strongly Agree).
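To make the default behaviour concrete, here is a minimal Python sketch of the left-to-right weighting described above. It is illustrative only: the function name and scale are invented for this example and do not reflect Elentra's actual code or database schema.

```python
def default_weights(scale_responses):
    """Assign 0, 1, 2, ... to scale response descriptors in
    left-to-right order, mirroring the report's default when no
    custom weights exist in the database.
    (Hypothetical helper for illustration only.)"""
    return {response: weight for weight, response in enumerate(scale_responses)}

# An illustrative five-point scale that lines up with the 0-4 default.
scale = ["Not Applicable", "Strongly Disagree", "Disagree", "Agree", "Strongly Agree"]
weights = default_weights(scale)

# Each completed form response would appear as its weight in the CSV cell.
print(weights["Strongly Agree"])  # 4
print(weights["Not Applicable"])  # 0
```

This is why a scale whose descriptors already imply a 0-4 progression produces the most meaningful default output; any other scale will still be numbered left to right, whether or not that ordering makes sense for the item.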
You can access the Weighted CSV Report from two places: the Assessment & Evaluation Reports tab or an individual distribution.
To access the Weighted CSV Report from Admin>Assessment & Evaluation you must have general access to the Admin>A&E tools. Such access will usually apply to staff:admin users, and to staff:pcoordinator and faculty:director users when they are affiliated with a course/program.
The weighted CSV report is accessible from the Admin>Assessment & Evaluation Reports tab.
Click Admin>Assessment & Evaluation.
From the second tab menu, click on 'Reports'.
Scroll to the bottom of the list and, in the Distribution section, click on 'Weighted CSV Report'.
To access the Weighted CSV Report from an individual distribution you must have access to that distribution.
Click on Admin>Assessment & Evaluation.
From the first tab menu, click on 'Distributions'.
Search for or click on the title of the relevant distribution.
Click on the Completed Assessments card (far right).
Click on the Weighted CSV button under the Assessments Completed heading.
A file should download to your computer.
When building a distribution you will notice a "Feedback Options" section in Step 4 if the assessor is set to faculty members (it will not appear when the assessor/evaluator is set to learners). If you check this off, an item will be added to the form for this distribution asking the assessor whether or not they met with the trainee to discuss their performance (see sample text below).
You might choose to use this option and add the item if you want to collect data on how often preceptors are meeting with learners.
Most reports are currently available as PDFs.
Some screenshots of sample reports are posted below, but remember that your reports will contain the items relevant to the forms you've designed and used. In some cases information has been redacted to protect users' privacy.