The PDF below provides a sample view of all item types that can be included on an assessment and evaluation form (please note this does not include some items specific to forms built from CBME templates). Some items are designed to be used online (e.g. autocomplete multiselect). To view sample items in use in an online environment, check out your Elentra demo site.
The assessment and evaluation module provides a way for learners to be assessed, especially in clinical learning environments, and a way for all users to evaluate courses/programs, faculty, learning events, etc. Any forms used for assessment and evaluation require form items (e.g., questions, prompts, etc.).
If you are creating a form to be attached to a gradebook assessment, please note that not all item types are supported because there is no structure to weight them on the form posted to the gradebook. When creating a form to use with a gradebook assessment it is recommended that you only use multiple choice, dropdown selector, rubric (grouped item only), and free text items. Do not use date selector, numeric, or autocomplete (multiple responses) items.
Note that you can copy existing items which may save time. To copy an existing item, click on the item and click 'Copy Item' which is beside the Save button.
Navigate to Admin>Assessment & Evaluation.
Click 'Items'.
Click 'Add A New Item'.
Complete the required information, noting that different item types will require different information. Common item details are below:
Item Type: This list shows the item types supported by Elentra. A complete list of item types is provided below.
Item Text: This is what will show up on a form this item is added to. When you view items in the detail view you'll also see the item text.
Item Code: This is an optional field. Item codes display when you view items in a list and they are searchable. Some organizations apply their own coding system to their items; another use case is importing items from another tool or vendor whose coding system you want to match.
Rating Scale: Rating scales can be configured through the Scales tab within Assessment & Evaluation. First select a scale type and then select the specific scale. Selecting a rating scale will prepopulate the response categories for this item. In some item types you will also be required to add response text (e.g. multiple choice items) and that text will show up on the actual form. In other question types you may rely on just the response categories.
Mandatory: Click this checkbox if this item should be mandatory on any forms it is added to.
Allow comments: Click this checkbox to enable comments to be added when a user responds to this item. If enabled, you have several options to control commenting.
Comments are optional will allow optional commenting for any response given on this item.
Require comments for any response will require a comment for any response given on this item.
Require comments for a prompted response means that for any response where you check off the box in the Prompt column, a user will be required to comment if they select that response.
Allow for a default value: If you check this box you will be able to select a default response that will prepopulate a response when this item is used on any form. Set a default response by clicking on the appropriate response line in the Default column.
Depending on the question type, add or remove answer response options using the plus and minus icons.
Depending on the question type, reorder the answer response options by clicking on the crossed arrows and dragging the answer response option into the desired order.
Add curriculum tags to this item as needed.
If you have access to multiple courses/programs, first use the course/program selector to choose the appropriate course/program, which will limit the available curriculum tags to those assigned to the course/program. Click the down arrow beside the course selector and search for the course by beginning to type the course name. Click the circle beside the course name.
Click through the hierarchy of tags as needed until you can select the one(s) appropriate for the item.
As you add curriculum tags, what you select will be listed under the Associated Curriculum Tags section.
Scroll back up and click 'Save'.
Horizontal Multiple Choice (single response): Answer options will display horizontally on the form and the user can select one answer. Response text required; response category optional. Response descriptors provide another data point so you can potentially report on them in the future. They are metadata in MC questions whereas in a rubric they are displayed. Horizontal MC will let you assign the same response descriptors to multiple responses.
Vertical Multiple Choice (single response): Answer options will display in a vertical list on the form and the user can select one answer. Response text required; response category optional.
Drop Down (single response): Answer options will display in a dropdown menu. Response text required; response category optional.
Horizontal Multiple Choice (multiple responses): Answer options will display horizontally on the form and the user can select two or more answers. Response text required; response category optional.
Vertical Multiple Choice (multiple responses): Answer options will display in a vertical list on the form and the user can select two or more answers. Response text required; response category optional.
Drop Down (multiple responses): Answer options will display in a dropdown list that remains open and allows users to select multiple responses using the control or command and enter/return keys.
Free Text Comments: Use this item type to ask an open ended question requiring a written response. (In ME 1.11 and lower you cannot map a free text comment to a curriculum tag set.)
Date Selector: Use this item type to ask a question to which the response is a specific date (e.g. What was the date of this encounter?)
Numeric Field: Use this item type to ask a question to which the response is a numeric value (e.g. How tall are you?)
Rubric Attribute (single response): Use this to create an item that relies on response categories as answer options. If you enter text in the response text area it will not show up to the user unless you create a grouped item. If you create a grouped item remember you need to use the same scale across all items to be grouped together. If you want a rubric item to display response text, create a grouped item with just one item included.
Scale Item (single response): Use this to create an item that relies on response categories as answer options. If you enter text in the response text area it will not show up to the user unless you create a grouped item. If you create a grouped item remember you need to use the same scale across all items to be grouped together.
Autocomplete (multiple responses): Use this item to allow users to search the option list in an autocomplete fashion and then make multiple selections. The response options will display in the order they were added to the item and when a user begins to type a response, the list of options will be filtered. The user can select more than one response as needed.
Field Note: This item type was developed specifically for post-graduate family medicine prior to the current work on the dynamic competency-based education tools.
To begin you select a curriculum tag set.
For a tag you want to be available to be assessed on a form, you add 4 response descriptors (e.g., Excellent, Needs Improvement) and corresponding level of competency descriptions.
If you have a hierarchical curriculum tag set you can develop items for the bottom/most granular tags in the set.
When you create an assessment form, you can add the relevant curriculum tag set to the form. When individuals use the form they can navigate through the tag set and select the appropriate tag to assess.
Rubric Numeric Validator
This item helps support validated numeric grouped items. You do not need to create any specific items using this item type; rubric numeric validator items can be configured using a checkbox on the grouped item screen (more detail below).
Dynamic Item Placeholder
Do not use this item type when building your own forms. It is a placeholder for form items that are generated and replace this one when forms are published. This type does not render as anything and carries no other information.
Confidential Free Text Comment
Introduced in Elentra ME 1.26, this item type is designed to collect narrative comments that will not be visible to the target of an assessment/evaluation even if the form or a self-report is released to the target for viewing. Please note that in some reports, the target will see the item prompt, but will not be able to see the item response. In other words, the target will know that information was collected, just not what the specific response was.
Confidential item responses will display in some Assessment & Evaluation Administrator reports (e.g. Form Responses Report, Assessment Data Extract)
Creating a grouped item allows you to group items and guarantee that they appear together on forms. If you use the rubric attribute or scale item item types, creating a grouped item will create a rubric with common response categories (e.g. developing, achieved) and specific response text for each field (e.g. performed a physical exam with 1-2 prompts from supervisor, independently performed a physical exam). There is also the ability to Copy a Grouped Item which is next to the Create & Attach a New Item and Attach Existing Item(s) buttons. You can choose to create a new item linkage to keep all items as grouped or to create new individual items from the original grouped item.
Navigate to Admin>Assessment & Evaluation.
Click on the Items tab.
Click on the Grouped Items sub-tab.
Click 'Add A New Grouped Item'.
Provide a grouped item name and select a rating scale type and then a rating scale. All items in the group will have the same response categories assigned to them, as configured through the rating scale. (Rating scales can be set up through the Scales tab in Admin>Assessment and Evaluation.)
Click 'Add Grouped Item'.
Complete the required information, noting the following:
Title: This will display when you view a list of grouped items.
Description: This field is optional; note that the grouped item description will display below the grouped item title on forms produced for users (see image below).
Grouped Item Code: Optional.
Validate numeric items: Check this box if you plan to use multiple numeric items that add up to a total you want to validate. (See more detail below.)
Rating Scale: This may already be set based on creating the grouped question.
Permissions: Adding a group, course, or individual here will give those users access to the grouped item.
To add items to a grouped item you can either create and attach a new item or add existing items (click the appropriate button).
If attaching existing items, use the search bar and filters to find items. You will only be shown items that match the rating scale parameters you've selected. Click the checkbox beside a question (in list view) or beside the pencil icon (in detail view) and click 'Attach Selected'. Because an existing item may already be in use on another form, in some cases you will not be able to modify the response descriptors for that item.
If creating and attaching items, follow the instructions above for creating items. The rating scale for your new items will be set to match the rating scale of the grouped item. After creating one item, you can repeat the steps to create and attach as many items as needed.
Click 'Save'.
To edit an item click on the pencil icon. Bear in mind that an existing item may already be in use on another form.
To delete an item from a grouped item, click on the trashcan icon.
To reorder the items in the grouped item, click on the crossed arrows and drag the item into the appropriate location.
When you have added all required items to the grouped item, click 'Save'.
Click 'Grouped Items' at the top of the screen to return to the list of grouped items.
If you want to include a grouped item on a form that allows the form user to distribute points/minutes/other to different items, and have a sum that doesn't exceed a total value (dictated by an administrator or by the user), you can check off the Validate numeric items box.
In the example below the user enters their total encounter time (e.g., 180 minutes) and then divides those minutes between the different categories (e.g., 60 MSK, 30 Cardio Resp, 90 Neuro). The item is validated meaning that it won't let the user enter numbers for each item that don't total the first value entered.
When creating a grouped item, check off the validate numeric items box.
Numeric validator title: Give the validator (i.e., the total value Elentra will check items against) a title. This will display as the first item in the grouped item.
Numeric validation type:
I will define the total: This lets the item creator define the total for every time this item is used.
Numeric total: Enter the total to be used with this item.
The user completing the form will define the total: This will allow the user to create their own total each time they use this item.
Add items to the grouped item. Note, the items you add should be numeric items.
Use the item on a form as you normally would. Remember you may need to provide instruction to your users on how to complete the item.
When users create items, they automatically have permission to access and use those items. Users can optionally give permission for other individuals and courses to access items. As much as possible, we recommend permissioning items to courses to reduce the future workload of reassigning items to individuals as you experience staff changes.
Navigate to Admin>Assessment and Evaluation.
Click 'Items'.
From list view, click on any item to open it. From grid view, click the pencil icon to edit an item.
Give permission to an individual or course by first selecting the appropriate title from the dropdown menu, and then beginning to type in the search bar. Click on the desired name from the list that appears below the search bar.
If you give permission to a course, anyone listed as a course contact on the setup page AND with access to the Assessment and Evaluation module will have access to the item.
Currently, permissioning an item to an organization only allows medtech:admin users to access it. For that reason, permissioning items to courses is likely more useful.
After you've added all permissions, you can return to the list of all items by clicking 'Items'.
Toggle between list view and detail view using the icons beside the search bar.
In detail view, see the details of an existing item by clicking on the eye icon.
In detail view, edit an existing question by clicking on the pencil.
To delete items, check off the tick box beside a question (list view) or beside the pencil icon (detail view) and click 'Delete Items'.
From an Edit Item page you can click on a link to view the forms that use an item or the grouped item an item is included in.
When viewing items in list view, the third column shows the number of answer options the item has. Clicking on it takes you to the item, and by clicking again you can see all the forms that use this item.
From the Items tab type into the search box to begin to find questions.
You can apply a variety of filters to refine your search for existing items.
To select a filter, click on the down arrow beside the search box. Select the filter type you want to use, click on it, and then begin to type what you want to find or continue clicking to drill down and find the required filter field. Filter options will pop up based on your search terms or what you’ve clicked through and you can check off the filters you want to apply. Apply multiple filters to further refine your search.
If you're working with a filter with multiple hierarchies, use the breadcrumbs in the left corner of the filter list to go back and add additional filters.
When you’ve added sufficient filters, scroll down and click Apply Filters to see your results.
To remove individual filters from your search, click on the down arrow beside the search field, click a filter type and click on the small x beside each filter you want to remove. Scroll down and click ‘Apply Filters’ to apply your revised selections.
To remove all filters from your search, click on the down arrow beside the search field, click a filter type, scroll down, and click on ‘Clear All’ at the bottom of the filter search window.
Retiring items allows a user to manage archived and future or currently active items effectively. When users create forms they will not see retired items; however, retired items will still be included in all relevant reports. Retired items will continue to show in the item bank, but will have their item type highlighted red.
Click the tick box on an item card.
Click the orange Retire Items button.
Confirm your action by clicking 'Retire'.
The recently retired item will have the item type bar highlighted in red.
Deleting an item will remove it from the list of Assessment and Evaluation Items.
Click the tick box on an item card.
Click the red Delete Items button.
Confirm your action by clicking 'Delete'.
The deleted item will no longer show on the list of items.
Please note that there is no way through the user interface to recover an accidentally deleted item. If you have deleted something by mistake you will need help from a developer to correct the mistake.
Deleted items will remain on forms that already include that item.
New in ME 1.26!
A new item type for narrative feedback that you want to keep the target from seeing. The Confidential Item can be included on forms with other items that are visible to learners.
The option for administrative users to disable initial task creation email notifications on distributions.
A database setting that forces A&E task expiry to 23:59 (configured per distribution type) (assessment_tasks_expiry_end_of_day)
Three database settings that allow a developer to configure how tasks sort on user's A&E tabs, e.g., by event date, then delivery date, then expiration date instead of just by delivery date (assessment_sort_assessor_pending, assessment_sort_assessor_completed, assessment_sort_target_completed)
There are multiple tools available to facilitate assessment and evaluation through Elentra. See here for an overview of the multiple tools available and determine which is most appropriate for your needs. The Assessment & Evaluation module can be used to:
Create items and forms to assess learner performance on tasks (e.g., clinical skills and workplace based assessments)
Create items and forms to have learners evaluate courses, faculty, themselves, and other activities
Create items and forms to assess learner performance on gradebook assessments
Note that to allow inline faculty grading using a rubric form or similar, you build a form in Assessment & Evaluation, then attach it to a Gradebook Assessment
The Assessment & Evaluation Module in Elentra is predominantly used to assess learner performance in a clinical environment, and provide multiple user groups with the ability to evaluate faculty, courses/programs, and other activities within the context of your organization. Since the Assessment & Evaluation Module essentially allows you to create forms for people to complete you could even use it to do a pre-course survey, or as a way to collect information from a group about their work plans in a collaborative activity.
To use the Assessment & Evaluation Module users must create items (e.g., questions, prompts, etc.), and create forms (a collection of items). Organizations can optionally allow users to initiate specific forms based on a form workflow (e.g. to allow learners to initiate an assessment on themselves by a faculty member), or organizations can create tasks for individuals to complete using a distribution. A distribution defines who will complete a form, when the form will be completed, and who or what the form is about.
Assessment and Evaluation module users also have a quick way to view their assigned tasks and administrators can monitor the completion of tasks assigned to others.
Reporting exists to view and in some cases export the results of various assessments and evaluations.
The Competency-Based Education module also includes a variety of form templates for use. There is no user interface to configure form templates at this point. For instructions specific to Competency-Based Education, please see the CBE tab.
Database Setting
Use
disable_distribution_assessor_notifications
This goes hand in hand with unauthenticated_internal_assessments and enable_distribution_assessor_summary_notifications. When using unauthenticated distribution assessments, you probably want a single summary email with a link to each task rather than an email for each new task. This disables the per-task reminders.
enable_distribution_assessor_summary_notifications
This adjusts the emails for distribution assessments to have a nightly summary of all new tasks, and then a weekly summary of tasks that are still incomplete.
enable_prompted_responses_comments_and_reviews
Controls the visibility of a Prompted Responses tab for staff and faculty users who receive prompted response notifications.
flagging_notifications
include_name_in_flagged_notifications
Defines whether or not to include the names of users in prompted response notifications for Assessment and Evaluation items.
unauthenticated_internal_assessments
Impacts email notifications sent to assessors. If enabled, allows distributed assessments to be completed by internal assessors without logging in, via a unique hash provided by email.
assessment_tasks_show_all_multiphase_assessments
assessment_delegation_auto_submit_single_target
On by default, controls whether or not a page will auto submit when there is a single target in a delegation task
assessment_display_tasks_from_removed_assessors
When enabled, displays assessment tasks from assessors who have been removed from the distribution.
cbme_ondemand_start_assessment
cbme_ondemand_start_assessment_shortcut_button
Optional; if enabled, a shortcut icon will be displayed to learners on their CBME dashboard beside the corresponding EPAs.
cbme_ondemand_start_assessment_add_assessor
Optional; if enabled, you will have the ability to add new assessors. By default, added assessors are stored in the cbl_external_assessors table. If you want added assessors to be internal users, you must also merge ME-1434 to complete this feature.
cbme_ondemand_start_assessment_director_as_staff_workflow
Toggles whether faculty directors get the staff workflow view. Default is off (0): all faculty get the faculty workflow view.
cbme_ondemand_start_assessment_replace_admin_trigger
Toggles whether the “Trigger Assessment” button on /admin/assessments gets switched over to use the workflow view. Default is off (0).
cbme_ondemand_expiry
Enable to apply an expiry date for tasks generated from on demand workflows.
cbme_ondemand_expiry_offset
If cbme_ondemand_expiry is in use, define the period of time after which an on demand task will expire.
cbme_ondemand_expiry_workflow_shortnames
If cbme_ondemand_expiry is in use, define which workflow types it applies to.
cbme_assessment_form_embargo
For organizations with CBME enabled, the option allows you to hide rubric forms until certain conditions are met.
assessment_triggered_by_target
For adhoc distributions only, controls whether or not a target can initiate a form. Enabled by default.
assessment_triggered_by_assessor
For adhoc distributions only, controls whether or not an assessor can initiate a form. Enabled by default.
assessment_method_complete_and_confirm_by_pin
For adhoc distributions only, controls the available form completion method. Enabled by default.
assessment_method_send_blank_form
For adhoc distributions only, controls whether or not the send-blank-form completion method is available. Enabled by default.
housing_distributions_enabled
clerkship_housing_department_id
evaluation_data_visible
Can be used to restrict users from opening individual, completed evaluation tasks
show_evaluator_data
Can be used to hide names of evaluators from distribution progress reports, the Admin > A&E Dashboard, and reports
disable_target_viewable_release_override
Can be added to the course_settings table or the settings table to turn off the ability for evaluators to optionally release completed evaluations to a target (the release option can also be restricted through distributions if this setting is not applied)
assessment_tasks_expiry_end_of_day
Use this to store a list of distribution types where you want the task expiry to be 23:59 on the day selected (inputs: 'date_range,rotation_schedule,delegation,eventtype,adhoc,reciprocal') (ME 1.26)
assessment_sort_assessor_pending
Optionally configure how tasks display on a user's My Assessment Tasks tab. Available input: a JSON encoded array, e.g., [{"sort_order":"desc","sort_column":"`event_start`"},{"sort_order":"desc","sort_column":"`task_title`"},{"sort_order":"desc","sort_column":"`task_expiry_date`"}] (ME 1.26)
assessment_sort_assessor_completed
Optionally configure how tasks display on a user's My Completed Tasks tab (see above for input details) (ME 1.26)
assessment_sort_target_completed
Optionally configure how tasks display on a user's Tasks Completed on Me tab (see above for input details) (ME 1.26)
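As a point of reference, the sketch below shows how a developer might add two of these settings directly to the database. It is a sketch only, assuming the settings table columns described elsewhere on this page (shortname, organisation_id, value); the organisation_id of 1 is a placeholder for your own organisation's ID, and your installation's schema may include additional columns, so confirm it before running anything.
-- Force task expiry to 23:59 for the listed distribution types
INSERT INTO settings (shortname, organisation_id, value)
VALUES ('assessment_tasks_expiry_end_of_day', 1, 'date_range,rotation_schedule,delegation,eventtype,adhoc,reciprocal');
-- Sort a user's pending assessor tasks by event date, then task title, then expiry date
INSERT INTO settings (shortname, organisation_id, value)
VALUES ('assessment_sort_assessor_pending', 1, '[{"sort_order":"desc","sort_column":"`event_start`"},{"sort_order":"desc","sort_column":"`task_title`"},{"sort_order":"desc","sort_column":"`task_expiry_date`"}]');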
A form is a collection of items used to assess a learner, or to evaluate a faculty member, event, rotation, course or anything else in your organisation. Forms can be created for specific courses, or for use across an entire organization. When building a form administrators can optionally indicate a form workflow which allows Elentra users to initiate a form on-demand (e.g. for the purposes of clinical assessment).
This help section is primarily about creating and managing forms outside the context of Elentra's Competency-Based Education tools. See here for information about creating CBE-specific forms.
If you are creating a form to be attached to a gradebook assessment please note that not all item types are supported because there is no structure to weight them on the form posted to the gradebook. Do not use date selector, numeric, or autocomplete (multiple responses) items. When creating a form to use with a gradebook assessment it is recommended that you only use multiple choice, dropdown selector, rubric (grouped item only), and free text items. Please see additional details about form behavior in gradebook in the Gradebook section.
Elentra supports several types of forms.
Generic Form
Users can add any items to this form type and it will be immediately available for use.
Standard Rotation Evaluation Form
You will require developer assistance to add this form to your organization.
Only use this form type if you are also using the Clinical Experience rotation scheduler and you use an identical form to evaluate all rotations within a course.
This form can have preset items that appear on the form each time one is created. There is no user interface to build the templated items and a developer needs to do so.
When a new form is made, the templated items (if any exist) will be automatically added to it. Users can then optionally add additional items. After adding items, the form must be published before it can be used.
Forms must be permissioned to a course to be used, and each course can only have one active form at a time. If a new form is created it will overwrite the existing form.
Each course must have its own form permissioned to it. (If forms are identical for all courses, you can create one form, copy it multiple times and adjust the course permissions.)
This form can be automatically distributed based on a rotation schedule (set by administrators when building rotations) OR can be made available to be accessed by learners on demand (use the on-demand workflow for this option).
Standard Faculty Evaluation Form
You will require developer assistance to add this form to your organization.
Only use this form type if you use an identical form to evaluate all faculty within a course.
This form can have preset items that appear on the form each time one is created. There is no user interface to build the templated items and a developer needs to do so.
When a new form is made, the templated items (if any exist) will be automatically added to it. Users can then optionally add additional items. After adding items, the form must be published before it can be used.
Forms must be permissioned to a course to be used, and each course can only have one active form at a time. If a new form is created it will overwrite the existing form.
This form can be made available for learners to initiate on demand using on-demand workflows.
Rubric Forms (available only with CBE enabled)
Rubric forms can be used to assess learners and provide lots of flexibility to administrators to build the form with the items desired.
As long as at least one item on a rubric form is mapped to a curriculum tag, and the corresponding tag set is configured to be triggerable, learners and faculty will be able to access the form on demand.
Results of completed rubric forms are included on a learner's CBE dashboard assuming the mapped tag set is configured to show on the dashboard.
Periodic Performance Assessment Forms (available only with CBE enabled)
A Periodic Performance Assessment (PPA) Form is designed to capture longitudinal, holistic performance trends. It functions very similarly to a rubric form and offers flexibility in terms of the items that can be added to it.
As long as at least one item on a PPA form is mapped to a curriculum tag, and the corresponding tag set is configured to be triggerable, learners and faculty will be able to access the form on demand.
Field Note Form (available only with CBE enabled)
A field note form template is used to give learners narrative feedback about their performance.
Supervisor Form Template (available only with CBE enabled)
A supervisor form is used to give a learner feedback on a specific curriculum tag (e.g., an EPA) and can be initiated by a learner or supervisor. Once a curriculum tag is selected, the form displays the relevant child tags (e.g., milestones) to be assessed on an automatically generated rubric. A supervisor can indicate a learner’s progress for each curriculum tag that was observed and can provide a global entrustment rating. Comments can be made optional, prompted or mandatory in each section of the form.
Procedure Form Template (available only with CBE enabled)
A procedure form is an assessment tool that can be used to provide feedback on a learner’s completion of a specific procedural skill. Once a procedure is selected, specific criteria will be displayed. A procedure form can be initiated by a learner or faculty.
There are prerequisites to using a procedure form. A course must have uploaded procedures in their Contextual Variables AND have uploaded procedure criteria for each procedure.
Smart Tag Form (available only with CBE enabled)
Smart tag forms are template-based forms which are built and published without curriculum tags attached to them.
Curriculum tags are attached based on the selections made when triggering assessments using a smart tag form.
To effectively use the Assessment & Evaluation module, a staff:admin or medtech:admin user will need to configure assessment response categories in System Settings. Assessment response categories are the terms available when users build items or create scales to apply to rubric items in the Assessment & Evaluation module. Some examples are low performance, medium performance, and high performance, or needs improvement, meets expectations, and exceeds expectations.
Note that assessment characteristics (also in System Settings) are used in the Gradebook module to define assessment types like written exam, test, quiz, project, etc. You do not need to build a list of assessment characteristics to use the Assessment & Evaluation module.
When you create assessment and evaluation items you will have the option of applying rating scales to certain item types; creating rating scales promotes consistency across items and can be a time saver for the administrative staff creating items and forms.
You must be a medtech:admin user to manage rating scales.
For additional detail about rating scales in CBE, including how to add default text to appear with global entrustment items, please see here.
Default
Used on Assessment and Evaluation items
Dashboard
Used with Competency-Based Education when displaying assessment information on the Learner Dashboard
Global Assessment
Used on Competency-Based Education forms and form templates
MS/EC (Milestone/Enabling Competency)
Used on Competency-Based Education Supervisor Form Templates
Please note that if you have created a second organization within your Elentra installation, you will need a developer's help to add the rating scale types to your organization.
Navigate to Admin>Assessment and Evaluation.
Click 'Scales' from the A&E tabs list. Any existing rating scales will be displayed.
Click the green 'Add Rating Scale' button.
Complete the required information, noting the following:
Title: Title is required and is what users will see when they build items and add scales so make it clear.
Description: This is optional and is not often seen throughout the platform.
Rating Scale Type: This defines the type of rating scale you are creating. Later, if you add rating scales to items, or add standard scales to form templates, you will first have to select a scale type. There is no user interface to configure rating scale types.
In a default Elentra installation you'll likely just see a default scale type. In installations with CBE enabled you'll see global rating and milestone/enabling competency scales.
Response Categories
Add or remove response categories by clicking the plus and minus icons.
For each response category, select a descriptor (these are configured through the assessment response categories). Note that you can search for descriptors by beginning to type the descriptor in the search box.
Response Colour - use this when you build a Dashboard scale
Response Character - use this when you build a Dashboard scale
Response Preview - this displays what the icons will look like on the Learner Dashboard
To edit an existing rating scale click on the scale title, make changes as needed, and click 'Save'.
To delete a rating scale click the checkbox beside the rating scale and click 'Delete Rating Scale'.
Note that once a rating scale is in use, you are unable to delete it through the user interface. This is to prevent any changes to previously obtained assessment and evaluation results.
Once scales are created, they will become visible options when creating items and using some form templates.
Setting permissions for a scale dictates which users will be able to access a scale when they create assessment and evaluation items. For example, if you set a scale's permissions to Undergraduate Medicine, all users with access to Admin > A & E in the undergraduate organisation will be able to use the scale when creating items. If you set a scale's permissions to several individual users, only those users will be able to access the scale when creating items.
You must create a scale before you can edit the permissions for it. After a scale is created you will automatically be redirected to the edit page.
In the Rating Scale Information section, look for the Scale Permissions heading.
Select Individual, Organisation, or Course from the dropdown options.
Type in the search bar to find the appropriate entity.
Click on the entity name to add it to the permission list.
Add as many permissions as required.
Scroll down and click 'Save'.
Please note that some rating scale values will be ignored in the Weighted CSV Report. Values that will be ignored are:
n/a
not applicable
not observed
did not attend
please select
The Distribution Wizard is a five-step tool that takes administrators through the process of building a distribution. The specific information entered on each step of the Distribution Wizard will depend on whether you are creating an assessment or evaluation and the distribution method you select. Below is some general information about the options on each step.
Note that a distribution will not save until all 5 steps of the wizard are complete.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Click 'Add New Distribution'.
This will open a five-step wizard that walks you through creating a distribution. The following is a high level overview of each step. The exact options on each step will depend on the type of distribution you are building. Additional details for each distribution method are included on separate pages.
On this step you define some basics like which form you are going to send out, and the relevant course and curriculum period (cperiod).
Assessment Mandatory: If this is checked off, assessors/evaluators who receive a task from this distribution will not be able to delete it. The task will remain on their A&E task list until it expires or is deleted by a distribution administrator.
Disable Initial Task Email: Introduced in ME 1.26, this option allows administrative users to opt out of having Elentra send initial task email notifications to assessors/evaluators. (Applies only to non-delegation-based distributions.)
On this step you define the type of distribution you'd like to build (e.g., date-based, rotation-based).
Task Expiry: Check this to apply an expiry date to tasks. After the expiry date has passed, users will no longer be able to complete the task.
Delay Task Creation
This option refers to the Summary Assessment Task options supported by Elentra.
Other options will display depending on the type of distribution being built.
There is a database setting a developer can configure if you want to force the expiry date of certain distribution methods to always be 23:59 on a given day (assessment_tasks_expiry_end_of_day).
On this step you define the target of the task. This could be learners, faculty, events, or courses depending on the type of distribution being used.
On this step you define the assessors or evaluators who will complete the task. This could be learners or faculty, depending on the type of distribution being used.
When completing a distribution you will notice a "Feedback Options" section in Step 4 if the assessor is set to faculty members (it will not appear when the assessor/evaluator is set to learners). If you check this off it will add an item to the form for this distribution asking both the assessor and the target whether or not they met to discuss the target's performance (see sample text below).
You might choose to use this option and add the item if you want to collect data on how often preceptors are meeting with learners.
In effect, this tool adds two items to a form. The first item will appear as above for the assessor to complete. Once the form is completed it will be stored in the learner's "Tasks Completed On Me" tab. The learner can access the form from there, and the learner will also receive an email notification that they can take action on the form.
The form is available for the learner to complete and answer the same question from their perspective. The comment box is available for learners to record any additional details.
Currently there is no reporting tool to compare learner and assessor responses, although they can be viewed on any completed forms. There is also no visual cue to an admin user that the response from the learner is pending; if there is no learner response displaying it means the learner hasn't answered the question.
On Step 5 you can immediately save your work, or make some adjustments to the distribution options.
This allows you to add individual authors, or set a course or organization as the author. This may be useful if you have multiple users who manage distributions or frequent staffing changes. Adding someone as an author will allow them to more quickly access the distribution from their distribution list.
Distributions are automatically accessible to all users with staff:admin group and role permissions.
Adding a course permission will make the distribution show, without filters applied, to program coordinators and faculty directors associated with the course.
Adding an org. permission will make the distribution accessible to anyone with administrative access to Assessment & Evaluation. (Note that most users will need to apply filters to access the distribution.)
Task List Release
This section lets you decide whether and how to allow targets of tasks to view tasks completed on them.
"Targets can view tasks completed on them after meeting the following criteria" can be used to motivate learners to complete their assigned tasks. Learners will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release
This controls whether targets can run reports on themselves for this distribution.
If given access, users will be able to generate an aggregated report of all responses on tasks in a distribution from their A+E badge and the My Reports button.
"Targets can view reports for tasks completed on them after meeting the following criteria" can be used to motivate learners to complete their own assigned tasks before seeing the results of tasks completed on them. Targets will only be able to run reports on themselves after they have completed the minimum percentage of their tasks set by you.
The "Targets can view tasks/reports after meeting the following creiteria" options are typically used for peer assessments. In the example above, if a learner were working with several peers in a small group and they were required to assess each other, the learner would have to complete 75% of their assigned peer assessments before they would be able to view the feedback provided to them by their peers.
Please note that these settings are specific to a single distribution only. This tool does not require users to have completed a percentage of tasks across all distributions.
This controls whether users can view the names of their assessors when reviewing comments left on completed tasks.
If you have set your distribution to release individual tasks, targets will be able to view individual results and the names of assessors, so the option to make comments anonymous makes the most sense when you are never going to release individual assessments, but are allowing self-reporting.
This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Prompted Responses: This allows you to define whom to send an email to whenever a prompted response is selected on a form used in the distribution. For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response, any time "I had to do it" is picked as an answer an email notification will be sent.
You can optionally select to email Program Coordinators, Program/Course Directors, Academic Advisors, Curricular Coordinators, or you can add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Sender Details: Define the email address that notifications and reminders will be sent from for a distribution.
Options are the distribution author, an existing user, an existing generic email, or a new generic email.
To create a new generic email provide a name and email address. This will be stored in the system and available to other users to use as needed.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Click 'Add Form'.
Provide a form title and select a form type (if applicable). See list of form types above.
Depending on the form type selected, you may be required to identify a course/program.
Click 'Add Form'.
Form Title: This will be set based on the previous step although you can edit the title if needed.
Form Description: Optional. This will display to users when the form is accessed.
Form Type: This will be set based on the previous step.
On Demand Workflow: If your organization uses workflows like EPA or Other Assessment (i.e. allows users to initiate specific forms on-demand), make the appropriate selection. Set to None if you do not want users to access this form on-demand.
Set form permissions to give other users access to this form. You can optionally give permissions to individuals, a course or an organisation.
Form Permissions Tips:
Any individual given permission to the form will be able to edit it until it is used in a distribution.
We strongly recommend permissioning forms to at least a course so that staffing changes are simplified. If you do this, any user who is a course contact for that course and who also has permission to access Admin > Assessment & Evaluation, will be able to access the form and include it in a distribution if needed.
If Jane Doe is the only person with access to a form and she retires, you'll need to manually reassign all her forms to a new user. If the form is permissioned to a course, any course contact with access to Assessment & Evaluation will be able to access the form.
Currently, permissioning a form to an organization only allows medtech:admin users to access it. As such we recommend relying mostly on course permissions.
Please note that forms using a workflow (e.g. EPA, Other Assessment, Rotation Evaluation or Faculty Evaluation) must be permissioned to the appropriate course.
Standard Rotation Evaluation and Standard Faculty Evaluation forms must also be permissioned to a course.
Confidentiality
Check this box if you'd like completed tasks using this form to replace the assessor/evaluator name with "Confidential." This can be useful for things like course or faculty evaluations.
See more detail here.
One thing to be aware of if using the Confidentiality option on forms is that once the name of the assessor/evaluator is changed to Confidential, it will be impossible for an admin. to monitor a distribution and see who has/has not completed what. If you typically monitor distributions or use task completion to populate some aspect of a course gradebook (e.g. a professionalism score), you may not want to use the Confidentiality option.
Click 'Add Item(s)' to add existing items.
Note that you can also add grouped items, free text (e.g., to provide instructions), or a curriculum tag set to your form. To add any of these, click on the down arrow beside 'Add Items'. If you choose to add Free Text or a Curriculum Tag Set, please note that you must save your choices within the item box using the small 'Save Required' button.
Note: Adding a Curriculum Tag Set is a very specific tool that supports field notes for use in family medicine. Most users should ignore this option.
Search for existing items and tick off the check boxes, then click 'Attach Selected' to apply your choices.
To create new items while creating your form, click Add Items and then click Create & Attach a New Item. When you complete your new item and save it, you will be returned to the form you are in the process of building.
Save the form when you have added all the relevant items.
To preview your form, click on the eye icon/Preview Form button.
To download a copy of the form, use the Download PDF button.
To delete items on a form, tick off the box on the item card and then click the red Delete button on the left.
To rearrange items on a form, click the crossed arrow icon on the item card and drag the item to where you want it to be.
To edit an item, click on the pencil icon on the item card. Note that an item already in use on a form that has been distributed will not be able to be edited. Instead you must copy and attach a new version of the item to edit and use it.
To quickly view the details of an item, click on the eye icon on the item card.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Use the search bar to look for the form you want to copy. Click the down arrow beside the search bar to apply filters to refine your search results.
Click on the name of the form you want to copy.
Click 'Copy Form' and provide a new name for the copied form.
Click 'Copy Form'.
Edit the form as needed (e.g., add additional items, change permission, etc.).
If you edit an item on a form and that item is in use on other forms, you will affect all of the associated forms. You can optionally view all forms that include the item.
For grouped items you can optionally copy and attach the grouped item to the form allowing you to change it as needed.
Create new item linkage
Create new items
For single items you can optionally copy the item to edit it. This will create a brand new item with no connection/link to the item it is copied from.
Click 'Save'.
Retiring a form means it will remain available in existing distributions, and reports, but will not be available for any new distribution.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Use the search bar to look for the form you want to retire. Click the down arrow beside the search bar to apply filters to refine your search results.
Tick off the box beside the form name (you can select multiple forms to retire at once), and then click the orange Retire Form button.
You will be prompted to confirm your action. Click 'Retire'.
Retired forms will display with a red highlight around them.
Deleting a form means that all pending and in-progress tasks that used that form will not have a form associated with them and will display an error message stating that the form has been deleted when an assessor/evaluator tries to access the form.
Navigate to Admin>Assessment & Evaluation.
Click 'Forms'.
Use the search bar to look for the form you want to delete. Click the down arrow beside the search bar to apply filters to refine your search results.
Tick off the box beside the form name (you can select multiple forms to delete at once), and then click the red Delete Form button.
If you are in an organisation that is not using Elentra's Competency-Based Education tools, the form templates tab is unlikely to be something you will ever use.
Form templates allow users to make forms more consistent and, in the case of the Competency-Based Education module, allow users to configure the requirements for multiple forms through one user interface. Administrative staff specify EPAs and milestones, contextual variables, and rating scales to be used on assessment forms for clinical environments, and these forms get created after a form template is published. See the Competency-Based Education section for more information.
Form templates are built off of something called a form blueprint and there is currently no way through the user interface to configure a form blueprint. If you want different form templates from those in a default installation of Elentra, you'll need a developer's help.
An overview of Distributions and their management
New in ME 1.26!
The option for administrative users to disable initial task creation email notifications on distributions.
A database setting to make task expiry 23:59 for specified distribution types (assessment_tasks_expiry_end_of_day)
A distribution defines who will complete a task using what form, who or what the task is about, and when the task will be delivered and completed. Usually, administrative staff manage distributions to send tasks to learners, faculty, event participants, etc.
In most cases, when a distribution sends a task, users receive an email notification that a task has been created for them. They can also access tasks from their Assessment and Evaluation badge.
You can watch a recording about Distributions at (login required).
You must have one of the following permission levels to access this feature:
Medtech:Admin
Staff:Admin
Staff:PCoor (assigned to a course/program)
Faculty:Director (assigned to a course/program)
Distribution: defines who will complete a task using what form, who/what the task is about, and when the task should be delivered and completed
Assessor/Evaluator: The person completing a form/task
Target: The person, course, experience or other that the form/task is about
Distribution method: How the form/task is assigned to people (e.g. based on a date range, rotation schedule, event schedule, etc.)
Distribution wizard: Walks users through creating any distribution.
Rotation Schedule: Use a rotation schedule distribution to send forms based on a rotation schedule. This could include assessments of learners participating in a rotation, evaluation by learners of a specific rotation site or experience, etc.
Delegation: Use a delegation to schedule a distribution to be sent to an intermediate person who will later forward the tasks to the appropriate assessor/evaluator. For example, if you want to set up distributions in September, but it is unknown who learners will work with in January because a clinic schedule hasn't been set yet, you could send the distribution to a delegator to forward once the clinic schedule is known.
Learning Event Schedule: Use a learning event schedule distribution when you want to have participants evaluate an event, when you want participants to evaluate the faculty teaching particular types of events, or when you want faculty to assess the learners participating in certain types of events.
Date Range: Use a date range to create the most generic type of distribution. You could use this for faculty or course evaluations, or for assessment of learners at any point.
Ad Hoc: Used for assessments only; allows users to trigger an assessment when needed (differs from on-demand workflows in that the distribution controls some things like the potential list of assessors).
Reciprocal: Allows you to link two distributions so that one generates tasks based on the other (e.g., distribution 1 asks faculty to assess learners; once Faculty A has assessed Learner B, distribution 2 can be configured to get Learner B to evaluate Faculty A).
For use cases where certain evaluations and/or assessments need to be confidential in the sense that the assessor should be hidden, Elentra includes a feature when building forms to hide the names of assessors/evaluators. This could apply to things like faculty evaluations and especially rotation evaluations.
One thing to be aware of if using the Confidentiality option on forms is that once the name of the assessor/evaluator is changed to Confidential, it will be impossible for an administrator to monitor a distribution and see who has/has not completed what. If you typically monitor distributions or use task completion to populate some aspect of a course gradebook (e.g. a professionalism score), you may not want to use the Confidentiality option.
At the bottom of the Form Information section you'll see a checkbox to enable form confidentiality.
Enabling this feature changes the way the assessor is displayed in many areas across the platform. For example, on the Admin Assessment and Evaluation page, any assessment/evaluation that is tied to a form with the setting enabled will have the assessor marked as confidential:
Clicking on one of these assessments will also have the assessor hidden within the assessor card:
Assessment cards will also hide the name, even if it is released to the target in the case of faculty evaluations:
Reports have also been updated so that whenever the assessor name is returned for a form with the setting enabled, “Confidential” will be shown instead. The same occurs if you try to download an assessment as a PDF.
The wording used throughout the feature can easily be replaced in the translation file for customization.
Disable Target Release option on Evaluations
In ME 1.14 Elentra introduced a way for learners to optionally release completed evaluation tasks to the targets of the task. In effect, a learner can complete an evaluation of a faculty member and choose to immediately release the completed task to said faculty member. The faculty member will see who the feedback came from (in contrast, if you allow faculty to run reports on a distribution of which they are a target, they do not see the names of the evaluators).
While advantageous in some circumstances, allowing learners to optionally release completed evaluation tasks to faculty is not desirable to all organisations. To allow organisations to customize how they want to treat individually completed evaluations, some database settings can be used.
To disable the target release option at an organisation level, a record must be added to the database settings table for each organisation with the following values:
shortname: disable_target_viewable_release_override
organisation_id: <the_intended_org_id>
value: 1
To disable the target release option at a course level, an entry must be created in the course_settings table with the following values:
course_id: <the_intended_course_id>
shortname: disable_target_viewable_release_override
value: 1
An entry should be created for each course/organisation that must have the release option disabled; a sample of what these records might look like is sketched below.
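The following SQL sketch illustrates what such records could look like. It assumes a MySQL-style schema in which the organisation-level settings table is named settings and uses the column names listed above; the table names, column names, and ID values (organisation 1 and course 42 below are placeholders) are assumptions and should be verified against your own installation before running anything.

-- Hypothetical sketch only: confirm table/column names against your schema.
-- Disable the target release option for organisation 1 (placeholder ID).
INSERT INTO settings (shortname, organisation_id, value)
VALUES ('disable_target_viewable_release_override', 1, 1);

-- Disable the target release option for course 42 (placeholder ID).
INSERT INTO course_settings (course_id, shortname, value)
VALUES (42, 'disable_target_viewable_release_override', 1);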
Finally, if you wish to provide flexibility to use the target release option per distribution, do not apply the database settings and instead teach your administrative staff to toggle the checkbox that says “Disable the assessor's ability to release the evaluation to the target once complete” on Step 1 of the distribution wizard.
The Disable Target Release option will only be visible on Evaluation tasks and it does not impact a faculty member's ability to create a report on a distribution (that is still controlled by the distribution in Step 5 of the distribution wizard).
When creating assessment and evaluation items, users will see the option to check a box designating a response option as a Prompt. This sets a flag on the response option that allows Elentra to send email notifications to users when that response option is selected when someone answers that item on a task. Examples of prompted responses could include unsatisfactory performance, patient safety issues, etc.
Prompted response notifications are available in the following scenarios:
For items used on forms that are delivered through distributed tasks
Administrators define the staff and faculty to notify of a prompted response in the Distribution Wizard, Step 5: Results > Notifications and Reminders
Staff and faculty will receive email notifications and, if enabled, can also view Prompted Response information within Elentra.
Administrators define, in the Course Setup tab, whether to notify learners when they are the target (the form must be associated with the course).
Note that if a distribution is configured to have a Reviewer, prompted response email notifications will not be delivered to learners until the Reviewer has completed their review and released a task
When Form Templates are used, they include default items in a Concerns section and a place for Form Feedback. These items are prompted responses by default; however, learners do not have access to view these items, nor will they be notified if one of these items is selected with them as the target. Instead, faculty directors and program coordinators will be notified of this information via email or, if enabled, within Elentra.
Automated notifications for prompted responses on items other than the Concerns and Feedback sections of Form Templates for on-demand forms are not yet supported in Elentra.
If a form is not in use, this checkbox can be enabled/disabled at will. For forms that are in use, this setting can ONLY be enabled (a developer can disable it in the DB). For this reason, admins will be prompted by a warning box confirming their choice:
In the case of tasks with multiple targets, if someone who is not the assessor attempts to view any of the tasks while at least one is still pending, none of them will be shown. This is because, when there are multiple targets, you can switch between them on the assessment page.
To see what the learner experiences when allowed to optionally release completed evaluation tasks, please see the related learner documentation.
In this context, we use the term off-service to mean learners who are completing rotations outside their home program (e.g. a family medicine resident on a pediatrics rotation).
Clinical Experiences > Rotation Scheduler includes support for off-service rotations or slots. When an administrator builds rotations she can optionally create off-service slots and make them available to all other programs or specific programs. The following information assumes you are an organization using off-service slots and refers to rotation-based distributions. (This page does not discuss automated rotation evaluations nor rotation evaluations completed on demand via workflows.)
As the home program you can:
Create assessment tasks targeting your learners while they are on an off-service rotation.
Report on assessment tasks targeting your learners while they were on an off-service rotation.
For distributions you created, you can view results via the Learner Assessments report or the Learner Reports (Aggregated).
For distributions created by host programs, but targeting your learners, you can only view results via the Learner Reports (Aggregated).
Create evaluation tasks asking your learners to evaluate their experience on an off-service rotation.
View individual responses and download a Weighted CSV of your learners' responses to an evaluation of their off-service experience (do this via the Completed Distribution Report); up to and including ME 1.17 a home program cannot use the Rotation Evaluation (Aggregated) Report to see an overview of their learners' experience on an off-service rotation (whether the distribution was set up by the home or host program).
As the host program you can:
Create assessment tasks targeting visiting, off-service learners while they are on your rotation.
Report on assessment tasks targeting visiting, off-service learners while they were on your rotation.
Create evaluation tasks asking visiting, off-service learners to evaluate their experience on your rotation.
Report on evaluation tasks completed by visiting learners about your rotation, as long as you created the distribution. If the home program created an evaluation asking their learners to evaluate your rotation or faculty, you will not be able to access the results.
In effect, up to and including ME 1.17, neither the home program nor the host program can create a Rotation Evaluation (Aggregated) Report to view responses of off-service learners to off-service rotation evaluations created by the home program.
By default, all assessment distributions have the task release set to immediately release tasks to targets when they are completed. One case where an administrator may not want this is in the case of Peer Assessment, where the learners should not be able to see individually completed and identified tasks.
To help institutions prevent this from happening through administrative error (i.e., not applying the appropriate settings), a database setting was introduced in Elentra ME 1.21 (peer_assessment_target_release_enabled). This setting is disabled by default.
With the setting enabled, if an assessment distribution’s targets and assessors are both LEARNERS, the Task List Release will be set to Targets cannot view tasks completed on them, the Target Self-Reporting Release will be set to Targets can immediately view reports for tasks completed on them, and the Target Self-Reporting Options will be set to Comments are anonymous.
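If your institution enables this protection directly in the database, the record could resemble the sketch below. This is only an illustration that assumes the setting lives in the same organisation-level settings table described earlier for the target release override; the table name, column names, and organisation ID are assumptions to verify against your schema.

-- Hypothetical sketch only: confirm table/column names against your schema.
-- Enable peer assessment target release protection for organisation 1 (placeholder ID).
INSERT INTO settings (shortname, organisation_id, value)
VALUES ('peer_assessment_target_release_enabled', 1, 1);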
Conditions that need to be met in the distribution wizard for a distribution to be considered a peer assessment:
The target is considered to be a learner when a distribution uses one of the following:
Date range based distribution, pick the “Select learners” option
Rotation based distributions, pick the “The targets for this Distribution are learners” or “The targets for this Distribution are peers” options
Date range delegation based distribution, pick the “Select learners” option
Rotation based delegation distributions, pick the “The targets for this Distribution are learners” or “The targets for this Distribution are peers” options
Event based distribution, pick the “Attendees who are enrolled in events with the selected event types” option
The distribution delivery type is ad hoc
The assessor is considered to be a learner when a distribution uses one of the following:
Date range based distribution, pick the “Select learners” option
Rotation based distribution, pick the “The assessors for this Distribution are learners” option
Date range delegation based distribution, pick the “Select learners” option
Rotation based delegation distribution, pick the “The assessors for this Distribution are learners” option
Event based distribution, pick the “The assessors for this Distribution are attendees enrolled in the event” option
Ad hoc distribution, pick the “Select learners” option
With the above considered, if both the target and assessor are determined to be learners, the distribution is a peer assessment and will use the new rules, assuming the database setting is enabled.
Sample use cases for copying a distribution include completing assessments/evaluations in the context of small groups (i.e., copy the distribution so you have one for each group, then change the targets and assessors/evaluators as needed), reusing distributions in a subsequent academic year (i.e., copy the distribution, then change the curriculum period as required), etc.
Note that Elentra does not support copying a distribution and changing the distribution method. If you change the method of a distribution you will need to reenter the required information in the subsequent steps.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Use the search tool to look for the distribution you want to copy. Click the down arrow beside the search bar to apply filters to refine your search results.
Click on the cog icon to the right of the distribution you want to copy. Select 'Copy Distribution'.
Edit the information in each step of the distribution wizard as needed and save your work.
Editing a distribution can have unanticipated effects and it is recommended that whenever possible, you test the impact of your edits in a staging environment.
Editing the Expiry Date: If an expiry date has been applied to the active distribution, you can edit the expiration date to 're-open' tasks to the targets who haven't completed them yet. Those tasks will be visible to the targets the day after the update is made to the expiry date (because the job that runs behind the scenes to deliver tasks runs at night). A developer can run the job immediately if it is time sensitive.
Changing the Targets: If you add new targets to a distribution, the system will reopen the task for the assessor/evaluator to allow them to complete the task on the new target.
Editing the Rotation Schedule of Targets Associated with a Distribution: Will cause new tasks to be delivered to the assessor.
Changing the Reviewer of a Distribution: Will cause the distribution to redeliver previously delivered tasks; be sure to apply a Release Date when making this change.
Changing the Target Release Options in Step 5: Changes the visibility of tasks/reports retroactively for all completed tasks.
If you want to change a distribution partway through its use but you want to leave all existing tasks active, you can copy the distribution, alter it as needed, and then retire the original distribution on a defined date. When you retire a distribution, you preserve any existing data and pending tasks but prevent future tasks from generating.
To retire one distribution, click the menu cog to the right of the distribution name and select 'Retire Distribution'.
To retire multiple distributions at once, click the checkbox beside distribution names and click the 'Retire Distribution' button.
Set a Retirement Date for the distribution(s). As of this date, no more tasks will generate for this distribution.
Click 'Retire'
You will get a success message and can click to return to the Distribution index.
Retired distributions will still display on the list of Distributions, but they will have a small orange clock beside the name to show they have been retired.
An informational notice will be included on distributions that are set to retire soon. The notice will be seen when you access the Progress Report for a distribution.
If you delete a distribution you will no longer see it in the list of Distributions and it can't be recovered through the user interface.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Use the search bar to look for the distribution you want to delete. Click the down arrow beside the search bar to apply filters to refine your search results.
Click the checkbox beside the name of each distribution you want to delete.
Click the red Delete Distributions button.
Confirm your decision.
Navigate to Admin>Assessment & Evaluation.
Click 'Distributions'.
Use the search bar to look for the distribution you want to manage. Click the down arrow beside the search bar to apply filters to refine your search results. You will only see distributions to which you have access.
Click on the cog icon to the right of the distribution you want to manage. Select 'View Distribution Report'.
Review progress. Click on any category (Pending, In Progress, Completed) to view more details about specific targets and assessors.
Pending tasks have not yet been started. In Progress tasks have been started but not completed. Completed tasks are done and have been submitted.
To delete tasks tick off the box below the garbage icon on each task card and then click the red Delete Task(s) button.
To send reminders to those with incomplete forms, tick off the box below the bell icon on each task card and then click the blue Manage Distribution button and select Send Reminders from the dropdown list. To select all tasks for reminders click on the bell icon.
Review your choices and, if correct, click Confirm Reminders. You will get a green success message.
To add a task to a distribution, click the blue Manage Distribution button and select 'Add a Task' from the dropdown list. Complete the required information and click 'Confirm Task'.
Viewing progress results for learning event-based distributions: Program coordinators who set up such distributions should view progress from their My Assessments page.
Many of the functions described above can also be completed at an individual task level when logged into the system as a program coordinator or faculty director.
Click the Assessment and Evaluation badge at the top right (beside the logout button).
Click the My Learners tab.
Click on the appropriate tab for a learner (e.g. CBME, Assessments, Logbook).
When looking at a user's Assessment & Evaluation dashboard, some users will be able to send reminders, remove tasks, and download PDFs of selected tasks from their assigned learners or faculty.
A date range distribution allows you to send a form to the appropriate assessors/evaluators in a specific date range.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type, as will other steps of the wizard.
Disable Target Release: This will only display for Evaluations. If you do not want evaluators to be able to release the form to the target upon completion, check this box. If left unchecked, evaluators will be able to release the form to the target upon completion.
Assessment/Evaluation Mandatory: This will be checked off by default. Mandatory tasks can't be deleted by assessors/evaluators.
Disable Reminders: Check this box to exclude tasks from this distribution from reminder notification emails. This option will only display if specific database settings are enabled. If you check this box, assessors/evaluators will still be sent an initial task creation notification, but weekly reminder summary emails will be disabled.
Select Form: The form you want to distribute must already exist and you must have permission to access it; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Date Range Distribution' from the dropdown menu.
Start Date: This is the beginning of the period the form is meant to reflect.
End Date: This is the end of the period the form is meant to reflect.
Delivery Date: This is the date the task will be generated and delivered to the assessors/evaluators. (The delivery date will default to be the same as the start date.)
Task Expiry Date: Optional. Set the date on which the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and will no longer be available to complete). This tool was updated in ME 1.22 so that tasks expire at 11:59 pm on the day you select.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to set an automatic warning notification if desired. This will send an email to assessors a specific number of days and hours before the task expires.
Delay Task Creation: This option relates specifically to creating a summary assessment task distribution. This allows you to create a distribution that will pull in and display completed tasks from other distributions. In effect, the assessor of the distribution you create will be able to see items from previously completed tasks and take them into account when completing their own task.
Select a distribution (note that you will only see distributions to which you have access).
Define how many tasks must be completed in the linked distribution(s) before the new distribution will take effect.
Enter a fallback date. This date represents when your distribution will create tasks whether or not the minimum number of tasks have been completed in the linked distribution.
You can adjust the fallback date on the fly and the distribution will adjust accordingly to only generate tasks as appropriate (previously created tasks will still remain).
If you set a fallback date that falls before the release date, the fallback date will be ignored.
The target is who or what the form is about.
Assessments/Evaluations delivered for: Use this area to specify the targets of the form.
For Assessments your options include: targets are the assessors (self-assessment), and learners.
For Evaluations your options include: faculty members, courses, course units (if in use), individuals regardless of role, external targets, and rotation schedules.
Select Targets: Here you can specify your targets using the dropdown selector.
CBME Options: This option applies only to schools using Elentra for CBME. Ignore it and leave it set to non CBME learners if you are not using CBME. If you are a CBME school, this allows you to apply the distribution to all learners, non CBME learners, or only CBME learners as required.
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
Click 'Next Step'
The assessors are the people who will complete the form.
There are three options:
Select faculty members
Browse faculty and click on the required names to add them as assessors
Select Associated Faculty will add the names of all faculty listed as course contacts on the course setup page
Exclude self-assessments: If checked, this will prevent the assessor from completing a self-assessment
Feedback Options: This will add a default item to the distribution asking if the faculty member met with the trainee to discuss their assessment.
Select learners
Click Browse Assessors and click on the appropriate option to add learners as assessors
Exclude self-assessments: If checked, this will prevent the assessor from completing a self-assessment
Select individuals external to the installation of Elentra
This allows you to add external assessors to a distribution
Begin to type an email; if the user already exists you'll see them displayed in the dropdown menu. To create a new external assessor, scroll to the bottom of the list and click 'Add External Assessor'
Provide first and last name, and email address for the external assessor and click 'Add Assessor'
Feedback Options: This will add a default item to the distribution asking if the faculty member met with the trainee to discuss their assessment.
Exclude Self Assessments: Check this to stop learners from completing a self-assessment
Feedback Options: Check this to add an item to the form requiring the target and assessor to confirm they spoke. More detail here.
Give access to the results of previous assessments: This relates to Elentra's ability to provide a summary assessment task to users. If enabled, tasks generated by this distribution will link to tasks completed in the listed distributions. When users complete the summary assessment task they will be able to view tasks completed in the other distributions. For any items that are used on both forms, results from previously completed tasks will be aggregated for the assessor to view.
Click to add the relevant distribution(s).
Click 'Next Step'
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired.
Authorship: This allows you to add individual authors, or set a course or organization as the author. This may be useful if you have multiple users who manage distributions or frequent staffing changes. Adding someone as an author will allow them to more quickly access the distribution from their distribution list.
Distributions are automatically accessible to all users with staff:admin group and role permissions.
Adding a course permission will make the distribution show, without filters applied, to program coordinators and faculty directors associated with the course.
Adding an org. permission will make the distribution accessible to anyone with administrative access to Assessment & Evaluation. (Note that most users will need to apply filters to access the distribution.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Notifications and Reminders:
Prompted Responses: This allows you to define whom to send an email to whenever a prompted response is selected on a form used in the distribution. For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response, any time "I had to do it" is picked as an answer an email notification will be sent.
You can optionally select to email Program Coordinators, Program/Course Directors, Academic Advisors, Curricular Coordinators, or you can add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Sender Details: Define the email address that notifications and reminders will be sent from for a distribution.
Options are the distribution author, an existing user, an existing generic email, or a new generic email.
To create a new generic email provide a name and email address. This will be stored in the system and available to other users to use as needed.
Click 'Save Distribution'.
Rotation Based Distributions allow you to set up a distribution based on a rotation schedule. This means you can easily send a form to all enrolled learners to be delivered when they are actively in the rotation. Note that you must have rotations built using the Clinical Experience Rotation Scheduler to use this distribution method.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type.
Disable Target Release: If you do not want assessors/evaluators to be able to release the form to the target upon completion, check this box. If left unchecked, assessors/evaluators will be able to release the form to the target upon completion.
Assessment/Evaluation Mandatory: This will be checked off by default.
Disable Initial Task Email: Introduced in ME 1.26, this option allows administrative users to opt out of having Elentra send initial task email notifications to assessors/evaluators. (Applies to non-delegation based distributions.)
Disable Reminders: Check this box to exclude this distribution from reminder notification emails.
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Rotation Distribution' from the dropdown menu.
Rotation Schedule: Select the appropriate rotation schedule from the dropdown menu.
Specific Sites: Select a site if you wish to include only targets from a specific site on this distribution.
Release Date: This tells the system how far back on the calendar to go when creating tasks. Hover over the question mark for more detail.
Delivery Type:
Basic delivery (single): allows you to send a single evaluation task repeatedly, once per block, or once per rotation.
Dynamic delivery rules (multiple): allows you to deliver multiple evaluation tasks at specific intervals during a rotation. For example, if you have a 5-month rotation, and you’d like to deliver an interim evaluation at the 2-month mark and the 4-month mark, you can set a rule for the distribution to do so.
If you select Basic delivery (single), then select your Delivery Period:
Choose between delivering tasks repeatedly, once per block, or once per rotation.
Repeatedly means you set how often during the rotation the task sends (e.g. every 5 days)
Once per block means that for each booking the learner has in a rotation they will be sent a task
Once per rotation means that consecutive bookings (i.e., two or more back-to-back blocks) will be treated as one unit of time
For each delivery period, additional customization allows you to control the timing of the distribution (e.g. 1 day after the start of the block, or 3 days before the end of the rotation).
Options are before/after the start, before/after the middle, and before/after the end.
In this sample schedule, a 'Once per block' delivery for the IM rotation would send learner 012022 three tasks - one each for Blocks 1, 2, and 3. A 'Once per rotation' delivery for the IM rotation would send learner 012022 two tasks - one for Blocks 1+2, and one for Block 4.
Note that if learners have consecutive bookings at different sites they will not be treated as one rotation. For example a 'Once per rotation' delivery for the IM rotation for Tibor would result in two assessment tasks because he is at Hospital 1 and then Hospital 2.
If you select Dynamic delivery rules (multiple), then set your delivery rules.
Identify the specific delivery rules depending on the length of a rotation (you identify the length of the rotation in months, weeks, or days).
If you enter an integer larger than 1 for the number of tasks, you will be able to define when each task will be delivered (percent of the way through the rotation) and a visualization will be supplied by the Task bar below.
Clicking on Add new delivery will allow you to add additional delivery schedules that depend on the length of the rotation, which allows you to set different parameters for shorter rotations and longer rotations.
As noted in the interface, “larger” rules take precedence over smaller ones. So if you had a 0-1 month rule and a 0-2 month rule, the 0-2 month would supersede the other.
Remember: if a learner's schedule has two or more of the same rotation scheduled in a row, the system will treat them as a single rotation.
You can have as many rules as you want, but they must be contiguous (e.g., 0-2 months, 2-4 months, 4-6 months) to avoid leaving learners without an assessment.
Examples of Dynamic Delivery Scenarios:
If you would like two interim assessment tasks delivered for any rotation 5 to 8 months in length, you could identify that for a 5 – 8-month rotation, task one should be delivered at 40% of the way through the rotation and task 2 should be delivered at 80% of the way through the rotation. Then, for a learner on a 5-month rotation, the interim task would be delivered at 2 months into the rotation and again at 4 months into the rotation, for example.
If you would like three tasks delivered for any rotation 3 – 6 months in length, you could set task one at 25% through the rotation, task two at 50% through the rotation, and task three at 75% through the rotation, which would represent delivery at ¼, ½, and ¾ of the way through each 3-month, 4-month, 5-month, or 6-month rotation that a learner might be scheduled on. A worked version of this calculation is shown below.
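In general, for a delivery rule set to p% of the way through a rotation of length L, the task's delivery point works out to:

delivery point = rotation start + (p ÷ 100) × L

For example, the first scenario above (tasks at 40% and 80% of a 5-month rotation) gives deliveries at 0.40 × 5 = 2 months and 0.80 × 5 = 4 months into the rotation.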
Task Expiry: If you check this box, the tasks generated by the distribution will automatically expire (i.e., disappear from the assessor's task list and no longer be available to complete). You can customize when the task will expire in terms of days and hours after the delivery.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to turn on a warning notification if desired. This can be set up to send an email a specific number of days and hours before the task expires.
The target is who or what the form is about (e.g., learners, faculty members, a rotation). Note that you'll only see the option to set the rotation or faculty member as a target if you are creating an evaluation.
Assessments will be delivered for: Use this area to specify the target of the form.
Targets are the assessors (self assessment): use this for self-assessments. Learners will be delivered a task where they are both the target and assessor.
Targets are learners
Learner Options:
All learners in this rotation
Learners from My Program/Outside of my Program: This refers to organizations that use on and off-service rotation slots (usually PGME programs).
Checking My Program will target learners who are scheduled in the rotation AND enrolled in the course.
Checking Outside of my program will target learners who are scheduled in the rotation but enrolled in a different course than that in which the rotation exists.
Learner Levels: This refers to learner levels usually used in CBME enabled organizations. Checking a learner level will restrict the distribution to target only learners at that learner level. Leaving all boxes unchecked will target all learners, regardless of their learner level.
Additional Learners
Click Browse Additional Learners. Search for a learner and check the box beside the learner name. To delete a learner, click the x beside their name.
Specific learners in this rotation: Use the drop down selector to add the required learners. (Hover over a learner name to see their profile information.)
CBME Options: This will display whether you select All Learners or Specific Learners. This option applies only to schools using Elentra for CBME. Ignore it and leave it set to non CBME learners if you are not using CBME. If you are a CBME school, this allows you to apply the distribution to all learners, non CBME learners, or only CBME learners as required.
Targets are peers
This option allows learners who have completed the rotation to assess all of their peers in the rotation block.
CBME Options: This option applies only to schools using Elentra for CBME. Ignore it and leave it set to non CBME learners if you are not using CBME. If you are a CBME school, this allows you to apply the distribution to all learners, non CBME learners, or only CBME learners as required.
Targets for this Distribution will be determined by the Assessor
With this option, the assessor/evaluator selects the target based on their role and distribution assessor options when they are completing the task.
In the Distribution Progress Report, administrators will see the target listed as "Assessor will choose before completing task."
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
Click 'Next Step'
The assessors are the people who will complete the form.
Assessor Options:
Assessors are learners
Learner Options: All Learners or Specific Learners
All Learners: Select all learners in the rotation or specific learners in the rotation
All learners: Select from My Program and/or Outside of my program, use the drop down selector to add additional learners
Specific learners in the rotation: Use the drop down selector to add required learners
Additional Learners: Check this option to add additional learners from outside the program.
Assessors are faculty members
Browse faculty and click on the required names to add them as assessors
Select Associated Faculty: This tool will pull the names of faculty listed on the course setup page as associated faculty
Assessors are preceptors associated with a rotation schedule
This allows you to associate this distribution with all preceptors affiliated with the rotation based on the slot bookings made for learners. The distribution will dynamically update based on changes made to the rotation schedule.
Select individuals regardless of role (Individuals or external assessors)
This allows you to add external assessors to a distribution
Begin to type an email; if the user already exists you'll see them displayed in the dropdown menu. To create a new external assessor, scroll to the bottom of the list and click 'Add External Assessor'
Provide first and last name, and email address for the external assessor and click 'Add Assessor'
Exclude Self Assessments: Check this to stop learners from completing a self-assessment.
Feedback Options: This will only display when the assessors are faculty. Checking the box will add a default item to the distribution asking if the faculty member met with the trainee to discuss their assessment.
Give access to the results of previous assessments: This relates to Elentra's ability to provide a summary assessment task to users. If enabled, tasks generated by this distribution will link to tasks completed in the listed distributions. When users complete the summary assessment task they will be able to view tasks completed in the other distributions. For any items that are used on both forms, results from previously completed tasks will be aggregated for the assessor to view.
Click to add the relevant distribution(s).
Currently for rotation-based distributions there is no option to configure minimum tasks completed on linked distributions nor a fallback date for the summary assessment task.
Click 'Next Step'
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired.
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A+E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Notifications and Reminders:
Prompted Responses: This allows you to define whom to send an email to whenever a prompted response is selected on a form used in the distribution. For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response, any time "I had to do it" is picked as an answer an email notification will be sent.
You can optionally select to email Program Coordinators, Program/Course Directors, Academic Advisors, Curricular Coordinators, or you can add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Sender Details: Define the email address that notifications and reminders will be sent from for a distribution.
Options are the distribution author, an existing user, an existing generic email, or a new generic email.
To create a new generic email provide a name and email address. This will be stored in the system and available to other users to use as needed.
Click 'Save Distribution'.
Currently this feature only supports completing one task per distribution. Even if you configure Step 3: Targets, Target Attempt Options to be more than one (which the user interface does allow), the system will not be able to support assessors/evaluators choosing more than one target per distribution.
After the distribution is created, the distribution progress page will show targets as "Assessor will choose before completing task."
When an assessor/evaluator accesses the task they will see that the target is to be defined by them.
When completing the task, the assessor/evaluator will define their target.
If an existing distribution needs to be changed we recommend retiring the existing distribution and creating a new one. Additional details here.
For rotation-based distributions users might also edit a rotation schedule and individual learner bookings and that can impact any existing distributions.
If you are using a rotation-based distribution that relies on the clinical experience rotation schedule to determine the assessor (based on scheduled preceptors), updating a booking by changing the preceptor AFTER tasks have already been generated and delivered will result in the existing task being removed from the distribution progress report and a new one being generated for the newly assigned preceptor. However, the original task will not automatically be removed from the original assessor's A&E task list. If the task was optional, they will be able to delete it themselves; if the task was mandatory, the best option at this point is for an administrator to find the task via the Admin > A & E Dashboard and delete it there.
A distribution based on a learning event schedule allows you to schedule forms to be sent out where the targets of the distribution are the attendees of the event OR faculty who taught the event OR a specific event within the selected event type.
Navigate to Admin>Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type. The task type will also dictate other fields available throughout the wizard.
Disable Target Release: This will only display for Evaluations. If you do not want evaluators to be able to release the form to the target upon completion, check this box. If left unchecked, evaluators will be able to release the form to the target upon completion.
Assessment/Evaluation Mandatory: This will be checked off by default. Mandatory tasks can't be deleted by assessors/evaluators.
Disable Reminders: Check this box to exclude this distribution from reminder notification emails.
Disable Initial Task Email: Introduced in ME 1.26, this option allows administrative users to opt out of having Elentra send initial task email notifications to assessors/evaluators. (Applies to non-delegation based distributions.)
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'
Distribution Method: Select 'Learning Event Schedule' from the dropdown menu.
Select Event Type: Select the appropriate event type(s) from the dropdown menu. You can select multiple event types as needed.
Release Date: This tells the system how far back on the calendar to go when creating tasks. Hover over the question mark for more detail.
Task Expiry: If you check this box, the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and no longer be available to complete). You can customize when the task will expire in terms of days and hours after the delivery.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to turn on a warning notification if desired. This can be set up to send an email a specific number of days and hours before the task expires.
Delay Task Creation: This option relates specifically to creating a summary assessment task distribution. This allows you to create a distribution that will pull in and display completed tasks from other distributions. In effect, the assessor of the distribution you create will be able to see items from previously completed tasks and take them into account when completing their own task.
Select a distribution (note that you will only see distributions to which you have access).
Define how many tasks must be completed in the linked distribution(s) before the new distribution will take effect.
Enter a fallback date. This date represents when your distribution will create tasks whether or not the minimum number of tasks have been completed in the linked distribution.
You can adjust the fallback date on the fly and the distribution will adjust accordingly to only generate tasks as appropriate (previously created tasks will still remain).
If you set a fallback date that falls before the release date, the fallback date will be ignored.
The target is who or what the form is about.
If you are creating an Evaluation:
Evaluations delivered for:
Faculty who taught events with the selected event types (this will generate an evaluation for any faculty member who taught the event type specified in the Method section, e.g. lecture, lab, etc.)
Events with the selected event types (this will generate an evaluation for any event of the event type specified in the Method section)
Target Attempt Options: Specify how many times an evaluator can evaluate each target, OR whether the evaluator can select which targets to assess and complete a specific number (e.g. evaluator will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 evaluations but can select which targets they evaluate).
If you select the latter, you can define whether the evaluator can evaluate the same target multiple times. Check off the box if they can.
Click 'Next Step'
If you are creating an Assessment:
Assessments delivered for: Currently, the only option here is attendees who are enrolled in events with the specified event type.
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
Click 'Next Step'
The assessors/evaluators are the people who will complete the form.
Assessor Options
Assessors/evaluators are attendees enrolled in the event
Attendee Options (note that you must be using Elentra's attendance module within events to use this feature)
Send to all enrolled audience attendees that attended the event
Send to all enrolled attendees, even if they did not attend
Send this assessment to a percentage of enrolled attendees that attended the event
You supply the percentage; this option will randomly select the appropriate number of attendees and send them tasks. You'll be able to see which users have been sent the task after the distribution is saved.
Note: If you use this option and individual tasks are deleted (by a user or by an administrator), new tasks will generate in order to meet the required percentage. This can occur even if the task is deleted long after the relevant event. As such, you may consider using an expiry date when selecting to send tasks to a percentage of attendees.
Assessors/evaluators are faculty members associated with the event
This will send the distribution to the faculty listed on the event (e.g. teacher, tutor, etc.)
Assessors/evaluators are external to the installation of Elentra
This allows you to add external assessors/evaluators to a distribution
Begin to type a name or email; if the user already exists you'll see them displayed in the dropdown menu. To create a new external user, scroll to the bottom of the list and click 'Add External Assessor/Evaluator'
Provide first and last name, and email address for the external assessor and click 'Add Assessor/Evaluator'
Give access to the results of previous assessments: This relates to Elentra's ability to provide a summary assessment task to users. If enabled, tasks generated by this distribution will link to tasks completed in the listed distributions. When users complete the summary assessment task they will be able to view tasks completed in the other distributions. For any items that are used on both forms, results from previously completed tasks will be aggregated for the assessor to view.
Click to add the relevant distribution(s).
Currently for learning event based distributions there is no option to configure minimum tasks completed on linked distributions nor a fallback date for the summary assessment task.
Click 'Next Step'
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired. (Not all options will display depending on the other parameters of the distribution.)
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A+E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A+E they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Notifications and Reminders:
Prompted Responses: This allows you to define whom to send an email to whenever a prompted response is selected on a form used in the distribution. For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response, any time "I had to do it" is picked as an answer an email notification will be sent.
You can optionally select to email Program Coordinators, Program/Course Directors, Academic Advisors, Curricular Coordinators, or you can add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Sender Details: Define the email address that notifications and reminders will be sent from for a distribution.
Options are the distribution author, an existing user, an existing generic email, or a new generic email.
To create a new generic email provide a name and email address. This will be stored in the system and available to other users to use as needed.
Click 'Save Distribution'.
New in ME 1.20!
Reciprocal distributions can now be created for delegation-based distributions.
The Reciprocal Distribution method allows an administrator to set up a distribution that delivers tasks based on the status of tasks delivered by another distribution. Reciprocal tasks can be delivered either when the corresponding distribution's tasks are delivered (i.e., the corresponding tasks don't have to be completed in order for tasks to be delivered by the reciprocal distribution) or when the corresponding distribution's tasks are completed (i.e., the corresponding tasks must be completed in order for the reciprocal distribution's tasks to be delivered).
The distribution on which the reciprocal distribution is based must be a date range, rotation-based, or delegation distribution.
An example of a reciprocal distribution might be that you set up a Learner Assessment distribution that delivers tasks to an assessor to assess learner performance; when those are delivered, you'd also like the learners being assessed by the assessors identified in the Learner Assessment distribution to automatically receive an Evaluation task to complete on their assessor.
Build the distribution off of which you would like the reciprocal distribution to run; note that this distribution must use a date range, rotation-based, or delegation distribution method.
E.g., if an admin wants a faculty evaluation task sent to the target of an assessment so that the learner can evaluate their assessor, the first step is to set up the assessment distribution.
Build a second distribution that will look at the first distribution in order to deliver tasks
E.g., in the example given above, the next step would be to set up the evaluation distribution.
When building the second distribution, select the distribution method of Reciprocal Distribution on Step 2: Method
When Reciprocal Distribution is chosen, the admin can then select the corresponding distribution they want the current distribution to be based on.
The next step allows the admin to select either "I want this distribution to deliver a task each time the corresponding distribution delivers a task" or "I want this distribution to deliver a task each time a task from corresponding distribution is completed".
Step 3: Targets: allows the admin to select "The targets of this distribution will be determined by the corresponding distribution."
Step 4: Assessors/Evaluators: allows the admin to select "The assessors (or evaluators) of this distribution will be determined by the corresponding distribution."
Step 5: the selections available on Step 5 of the distribution will be determined by the type of distribution (assessment or evaluation) that was selected in Step 1.
After you have created your first distribution, based on which you would like tasks delivered by a reciprocal distribution, create a corresponding reciprocal distribution.
In this particular example, a faculty evaluation distribution is being created to deliver tasks based on a previous learner assessment distribution; i.e., the first distribution will ask faculty to assess a learner; based on that distribution, this reciprocal distribution will ask the learner to complete an evaluation of the faculty who assessed them.
Enter a Distribution Title and make it as descriptive as possible.
The optional Distribution Description allows you to clarify the intention of the task and will appear on the task card on the assessor/evaluator’s My Assessments page.
Select Assessment or Evaluation from the Task Type dropdown (depending on the type of distribution you want to create). This selection is important; it will determine available settings in the Distribution Wizard going forward.
In this case, Evaluation is selected.
Leave the Disable Target Release checkbox unticked if you would like the evaluator to be able to choose to release their completed evaluation to the target.
Leave the Assessment/Evaluation Mandatory checkbox ticked if the assessment/evaluation is considered mandatory.
Disable Initial Task Email: Introduced in ME 1.26, this option allows administrative users to opt out of having Elentra send initial task email notifications to assessors/evaluators. (Applies to non-delegation-based distributions.)
Leave the Disable Reminders checkbox unticked if you would like the tasks in this distribution to be included in the weekly reminder summary email notification. If you place a tick in this checkbox, initial task creation email notifications will be sent, but weekly reminder summary emails will be disabled.
Select a form using the Browse Forms dropdown.
Select a curriculum period from the Browse Curriculum Periods dropdown.
Select a course/program from the Browse Programs dropdown.
Click Next Step to move on to Step 2: Method.
Note: the ‘Previous Step’ button allows you to navigate back to Step 1. You can cancel out of the distribution on any page; it will only be saved once you complete all 5 steps.
Select a Distribution Method from the dropdown: choose Reciprocal.
Select the Corresponding Distribution: from the list of distributions that you have access to, select the distribution whose tasks you want your reciprocal distribution's task delivery to be based on.
Specify when the new task should be delivered: identify when you would like the tasks generated by your reciprocal distribution to be delivered based on the corresponding distribution.
On previous task delivery: tasks will be delivered by the reciprocal distribution when the corresponding tasks are delivered, regardless of whether or not they are actually completed.
On previous task completion: tasks will be delivered by the reciprocal distribution only when the associated tasks on the corresponding distribution are completed.
Task Expiry: If you check this box, the tasks generated by the distribution will automatically expire; i.e., they will disappear from the evaluator’s task list and no longer be available to complete. You can customize when the task will expire in terms of days and hours after the delivery. When in doubt, leave this field blank.
Warning Notification: If you choose to use the Task Expiry option, you'll also be able to turn on a warning notification if desired. This can be set up to send an email a specific number of days and hours before the task expires.
Click Next Step to move on to Step 3: Targets.
Targets are the object of the assessment: those learners whose academic performance is being assessed.
Target option: select "The target will be the assessor/evaluator of the previous task."
Target attempt options: always select the first choice (the second choice is for peer assessments). Leave the default setting of: Assessors can assess each target a minimum of (1) times and a maximum of (1) times.
Click Next Step to move on to Step 4: Assessors/Evaluators.
Select an Assessor/Evaluator Option: select "The assessor (or evaluator) of this distribution will be the target of the previous task."
Click Next Step to move on to Step 5: Results.
The Results settings are optional. If you do not wish to edit them, simply click the Save Distribution button. Otherwise, click on a heading to toggle a section open.
**Authorship:** This allows you to add individual authors or set the distribution to be accessible and editable by everyone with A&E access in a program/course or organization. This may be useful if you have multiple users who manage distributions or frequent staffing changes.
Choose an Author Type from the dropdown: Individual, Organization or Course (Program)
Individual Author names can be added using the Distributions Authors search box.
Target Release: This option was created for Undergraduate Medicine and can be skipped over by Postgraduate Medicine programs. This option allows you to specify whether the targets of the distribution can see the aggregated results of completed forms. Note: completed evaluation tasks are automatically hidden from target view by the system.
Task List Release: “Targets can view tasks completed on them after meeting the following criteria” can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their own tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution; i.e., to generate an aggregated report of all responses. When users access their own A&E badge, they will see a ‘My Reports’ button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. This will only be applied if you have set reports to be accessible to the targets.
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target; e.g., a staff person might review peer feedback before it is shared with the learner. They can decide to forward the form to the target or to hide the form from the target.
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note: This list will be generated based on the course/program’s contacts (e.g., director, curriculum coordinator) stored on the course Setup page.
Prompted Response Notifications can be enabled to send selected individuals an email message when evaluators choose a response that has been flagged on the form as a ‘prompted response’. E.g., if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response, a notification can be delivered to the individual(s) you identify in this step. You can optionally select to email the Evaluation Reviewers, Program Coordinators, Program/Course Directors, or Distribution Authors.
Click Save Distribution. The distribution will be ‘live’ as of the next day. It will be triggered when a learner is on the rotation and forms will be sent out on the date defined by the rules you set up.
When tasks are created, if the criteria for delivering a task have not yet been met, you'll see that detail in the distribution progress report.
You can use a delegation distribution to send tasks to a delegator who can then forward the tasks to the appropriate assessor/evaluator. The use case for this might be setting up a distribution at the beginning of the year and not knowing exactly which preceptors will be working in a specific environment. You can send the distribution to a staff member who can forward the tasks once a clinic schedule is set or send the distribution to a user who is actually the target of the task and can then delegate the task to the appropriate assessor/evaluator.
Navigate to Admin > Assessment and Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view.
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type.
Disable Target Release: This will only display for Evaluations. If you do not want evaluators to be able to release the form to the target upon completion, check this box. If left unchecked, evaluators will be able to release the form to the target upon completion.
Assessment/Evaluation Mandatory: This will be checked off by default.
Disable Initial Task Email: This option will not be enforced on Delegation Distributions so should be ignored.
Disable Reminders: Check this box to exclude this distribution from reminder notification emails.
Select Form: The form you want to distribute must already exist; pick the appropriate form.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'.
Distribution Method: Select 'Delegation' from the dropdown menu.
Delegator: Click Browse Delegators and select the appropriate name from the dropdown menu. You can only select one delegator.
Delegation Options:
Delegation based on date range
Delegation based on rotation schedule
Delegator Type:
The delegator is an individual
The delegator is the target
From this point forward the distribution wizard will be configured based on whether you selected a date range or rotation-based distribution. Please refer to the help pages for those two methods for additional details.
You can optionally add Assessors for a delegation distribution. The users you add to this list will be made available to the delegator to select from a quick pick list. The delegator will optionally be able to search for and add additional assessors/evaluators as needed.
If you don't want to define any assessors, check "The assessors for this Delegation Distribution are unknown."
Currently, this feature is designed to support learner assessments only. A distribution set up as an evaluation will not work within the context of the ad hoc feature.
An ad hoc distribution allows an administrator to set up a form that is available to a specific audience, during a specific time frame, and can be initiated on demand by learners or faculty. A potential use case is a clinical environment where it is unknown which learners will work with which assessors. Using an ad hoc distribution, administrative staff can set up the distribution then allow learners or faculty to trigger the assessment and provide details about who the target or assessor in the situation was.
One advantage of using an ad hoc distribution over a workflow is that you will have a distribution report to review and monitor task completion for the specific distribution.
Please be aware of the following when using ad hoc distributions:
Users can only select 'complete and confirm via PIN' or 'send blank form' as form completion methods (you can't currently begin a form and send it via email or use the self-assess and send blank form options).
When setting the targets of a distribution you can only use cohort and individuals (you can't currently set the distribution up with a course audience or course group as the target).
The available assessors list will be based on course contacts from the course setup page.
Use the ad hoc distribution for assessments only.
There are some database settings that can be used to tailor the use of the ad hoc distributions. If you want to change these you will need help from a developer.
You can restrict the ability to initiate the form to only targets or only assessors (by default it is set to allow both to initiate forms). (settings: assessment_triggered_by_assessor and assessment_triggered_by_target)
You can turn on or off the ability to allow forms to be completed and confirmed via PIN or to send a blank form (by default the system will allow both). (settings: assessment_method_complete_and_confirm_by_pin and assessment_method_send_blank_form)
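For developers, the sketch below shows roughly how these settings might be adjusted directly in the database. It is an illustration only: it assumes the settings table is keyed by a `shortname` column with a `value` column and that `1`/`0` toggle the behaviour on/off, which you should confirm against your installation's schema before running anything.

```sql
-- Hypothetical sketch only: table/column names and value formats are assumptions.
-- Restrict ad hoc form initiation to targets only (disable assessor-initiated forms).
UPDATE settings SET value = '0' WHERE shortname = 'assessment_triggered_by_assessor';
UPDATE settings SET value = '1' WHERE shortname = 'assessment_triggered_by_target';

-- Allow both completion methods (the default behaviour).
UPDATE settings SET value = '1' WHERE shortname = 'assessment_method_complete_and_confirm_by_pin';
UPDATE settings SET value = '1' WHERE shortname = 'assessment_method_send_blank_form';
```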
Permissions: Anyone with access to Admin > Assessment & Evaluation will be able to create an ad hoc distribution. For default installations this will include Medtech:Admin, Staff:Admin, Faculty:Admin, Staff:PCoordinator, and Faculty:Director.
Navigate to Admin > Assessment & Evaluation.
Click 'Distributions' above the Assessment and Evaluation heading.
Click 'Add New Distribution'.
Distribution Title: Provide a title. This will display on the list of Distributions that curriculum coordinators and program coordinators can view. In addition, this distribution title will be used by learners and faculty to access the form. For this reason, a clear title including a course or use is recommended (e.g. Clinical Skills Week 1 History).
Distribution Description: Description is optional.
Task Type: Hover over the question mark for more detail about how Elentra qualifies assessments versus evaluations. If a distribution is to assess learners, it's an assessment. If it is to evaluate courses, faculty, learning events, etc. it is an evaluation. Notice that the language on Step 4 will change if you switch your task type as will other steps of the wizard.
Assessment Mandatory: This will be checked off by default. Currently this information is recorded but does not impact how a task displays to a user.
Disable Initial Task Email: Introduced in ME 1.26, this option allows administrative users to opt out of having Elentra send initial task email notifications to assessors/evaluators. (Applies to non-delegation-based distributions.)
Select Form: The form you want to distribute must already exist and you must have permission to access the form; pick the appropriate form from the dropdown menu.
Select a Curriculum Period: The curriculum period you select will impact the list of available learners and associated faculty.
Select a Course: The course you select will impact the list of available learners and associated faculty.
Click 'Next Step'.
Distribution Method: Select 'Adhoc' from the dropdown menu.
Start Date: This is the beginning of the period the form is meant to reflect.
End Date: This is the end of the period the form is meant to reflect.
Task Expiry: Optional. Set the date on which the tasks generated by the distribution will automatically expire (i.e. disappear from the assessor's task list and no longer be available to complete). Tasks will expire at 12:00 AM on the day that you select, so it is best to select the day after your intended expiry date.
Delay Task Creation: This option relates specifically to creating a summary assessment task distribution. This allows you to create a distribution that will pull in and display completed tasks from other distributions. In effect, the assessor of the distribution you create will be able to see items from previously completed tasks and take them into account when completing their own task.
Select a distribution (note that you will only see distributions to which you have access).
Define how many tasks must be completed in the linked distribution(s) before the new distribution will take effect.
Enter a fallback date. This date represents when your distribution will create tasks whether or not the minimum number of tasks have been completed in the linked distribution.
You can adjust the fallback date on the fly and the distribution will adjust accordingly to only generate tasks as appropriate (previously created tasks will still remain).
If you set a fallback date that falls before the release date, the fallback date will be ignored.
Click 'Next Step'.
The target is who or what the form is about. You will only set specific targets if you are creating an assessment; if you are creating an evaluation you will not set a target list.
Select Targets: Use this area to specify the targets of the form.
When setting the targets of an ad hoc distribution for an assessment you can only use cohort and individuals (you can't currently set the distribution up with a course audience or course group as the target).
Target Attempt Options: Specify how many times an assessor can assess each target, OR whether the assessor can select which targets to assess and complete a specific number (e.g. assessor will be sent a list of 20 targets, they have to complete at least 10 and no more than 15 assessments but can select which targets they assess).
If you select the latter, you can define whether the assessor can assess the same target multiple times. Check off the box if they can.
Click 'Next Step'.
The assessors are the people who will complete the form. Currently, ad hoc distributions only support making faculty who are course contacts assessors. That means the faculty must be listed on the Course Setup tab as Associated Faculty or as Course Director.
There are three options:
Select faculty members
Browse faculty and click on the required names to add them as assessors
Select Associated Faculty to add the names of all faculty listed on the course setup page
Select learners
Click Browse Assessors and click on the appropriate option to add learners as assessors
Select individuals external to the installation of Elentra
This allows you to add external assessors to a distribution
Begin to type an email address; if the user already exists you'll see them displayed in the dropdown menu. To create a new external assessor, scroll to the bottom of the list and click 'Add External Assessor'.
Provide a first and last name and email address for the external assessor and click 'Add Assessor'.
Exclude Self Assessments: Check this to stop learners from completing a self-assessment
Give access to the results of previous assessments: This relates to Elentra's ability to provide a summary assessment task to users. If enabled, tasks generated by this distribution will link to tasks completed in the listed distributions. When users complete the summary assessment task they will be able to view tasks completed in the other distributions. For any items that are used on both forms, results from previously completed tasks will be aggregated for the assessor to view.
Click to add the relevant distribution(s).
Click 'Next Step'.
You can immediately save your distribution at this point and it will generate the required tasks, but there is additional setup you can configure if desired.
Authorship: This allows you to add individual authors, or set the distribution to be accessible to everyone with A&E access in a course or organization. (This may be useful if you have multiple users who manage distributions or frequent staffing changes.)
Target Release: These options allow you to specify whether the targets of the distribution can see the results of completed forms.
Task List Release:
"Targets can view tasks completed on them after meeting the following criteria" can be useful to promote completion of tasks and is often used in the context of peer assessments. Targets will only see tasks completed on them after they have completed the minimum percentage of their tasks set by you.
Target Self-Reporting Release: This controls whether targets can run reports for this distribution (i.e. to generate an aggregated report of all responses). When users access their own A&E badge they will see a My Reports button. This will allow them to access any reports available to them.
Target Self-Reporting Options: This allows you to specify whether or not comments included in reports are anonymous or identifiable. (This will only be applied if you have set reports to be accessible to the targets.)
Reviewers: This allows you to set up a reviewer to view completed tasks before they are released to the target (e.g. a staff person might review peer feedback before it is shared with the learner).
Check off the box to enable a reviewer.
Click Browse Reviewers and select a name from the list. Note that this list will be generated based on the course contacts (e.g. director, curriculum coordinator) stored on the course setup page.
Notifications and Reminders:
Prompted Responses: This allows you to define whom to send an email to whenever a prompted response is selected on a form used in the distribution. For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response, any time "I had to do it" is picked as an answer an email notification will be sent.
You can optionally select to email Program Coordinators, Program/Course Directors, Academic Advisors, Curricular Coordinators, or you can add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Sender Details: Define the email address that notifications and reminders will be sent from for a distribution.
Options are the distribution author, an existing user, an existing generic email, or a new generic email.
To create a new generic email provide a name and email address. This will be stored in the system and available to other users to use as needed.
Click 'Save Distribution'.
Once an ad hoc distribution has been created you can monitor its progress from Admin > Assessment & Evaluation. Click on Distributions, and then the name of the distribution.
The Show Progress screen shows you:
Not Started (forms that have been triggered but not started)
In Progress (forms that have been triggered, started, and saved in draft mode)
Completed (forms that have been triggered and completed).
Click on any of the labels to view the names of the targets and assessors and delivery dates.
Administrative staff can also send reminders, and add and delete tasks from here.
After an ad hoc distribution has been set up (remember it may take up to a day for your distribution to become active depending on when behind the scenes cron jobs happen), learners and faculty can initiate a task from their main dashboard.
From the dashboard, click 'Start Assessment/Evaluation'.
Click the Adhoc Distributions tab to move to the correct menu to initiate a task.
Select a distribution (begin to type a distribution name to quickly filter the list).
Select an assessor. Hover over an assessor name to see their photo (if uploaded) and details about them including email and course affiliation.
Select the assessment method. (The methods of completion available to users can be configured in the database for an organization. Speak to a developer if you want to change which assessment methods are available to users.)
Email blank form will send a copy of the form to the selected assessor.
If the assessor has a PIN set the system will automatically default to Complete and confirm via PIN as the assessment method.
Select a date of encounter.
Click 'Submit'.
If the learner selected 'Email blank form', the assessor will receive an email alerting them to the task; additionally, they will see the task added to their Assessment Tasks list accessible from their A&E badge in the dashboard header.
If the learner selected Complete and confirm via PIN, the form will display on the screen. The learner can begin to complete the form, then pass the device to the assessor. When the form is complete, the assessor can enter their PIN to confirm and complete the form.
Note that if a form triggered using Complete and confirm via PIN is saved as a draft, the learner will need to reopen the form when in the company of the assessor to have the form completed.
After a form is completed, the learner's ability to view it will depend on the distribution settings. If the distribution allows the learners to view the tasks, they will be able to view them from their Tasks Completed on Me list accessed from their A&E badge in the dashboard header.
From the dashboard, click 'Start Assessment/Evaluation'.
Click the Adhoc Distributions tab to move to the correct menu to initiate a task.
Select a distribution (begin to type a distribution name to quickly filter the list).
Select a target. Hover over a target name to see their photo (if uploaded) and details about them including email, group and role (e.g. student, 2022) and enrolled courses.
Select a date of encounter.
Click 'Begin Assessment'.
The form will display on the screen.
The faculty member can complete the form and Save as Draft or Submit. Note that if the user saves the form as draft, they will have to reopen and complete it at a later date.
The faculty member can delete the task if they triggered it in error.
The faculty member can forward the task to another faculty member if needed.
Faculty can view their draft forms, or forms that learners have triggered to them, from their Assessment & Evaluation badge located in the dashboard header. The tasks to be completed will display in the Assessment Tasks list.
Elentra's Summary Assessment Task option allows you to create a distribution whose assessor can view the results of previously completed tasks (on a per-distribution basis).
Please note that this feature is currently for distributed assessment tasks only; it can't be used with form templates or tasks triggered on demand (i.e., CBME tools).
Summary assessment tasks are configured through regular distributions. Within the instructions for each distribution, you'll see additional notes about the options to enable for summary assessment tasks (i.e., the Delay Distribution Availability option on Step 2: Method, and the Give access to the results of previous assessments option on Step 4: Assessors/Evaluators).
Users completing a summary assessment task can:
View aggregate data (but not average scores) for any items that were included on tasks in associated distributions and the summary assessment task (note: the assessment item IDs on each form must match for this feature to work)
View comments from tasks from associated distributions and see the date, time, assessor, and distribution
Search for and view completed tasks from associated distributions, even if not all items are included on the summary assessment task
While reviewing previous tasks, easily navigate between distributions, tasks, or learners as needed.
When administrators configure a distribution to create summary assessment tasks, they can:
Define which previous distributions to link to (associated distributions)
For date based distributions and adhoc distributions only:
Optionally define a minimum number of tasks to be completed in the associated distribution(s) before summary assessment tasks are generated
Optionally define a failsafe date on which to send the summary assessment task, even if the minimum number of tasks in the associated distributions has not been completed
If a summary assessment task has been completed and then more tasks are completed in the associated/evidence distributions, will the summary assessment task reopen to be edited?
Not automatically. If an admin user wants to manually reopen the summary assessment task so that the assessor can edit their responses, they can do so. Note that if the task was already made visible to learners, editing it may cause confusion.
What happens if you set the delivery date for the distribution before the fallback date? The fallback date will be ignored.
How soon after the tasks in the associated distributions are completed will the summary assessment task be created? The summary assessment task will be eligible to be created as soon as the tasks in the associated distributions are completed. When the tasks actually get delivered to the assessor will depend on the schedule of some behind-the-scenes tasks (cron jobs) that your installation of Elentra performs. At many schools, the tasks will be delivered the next day.
Note that in a rotation based distribution you are able to toggle between a block view and date range view of the distribution report. When using Block view you can view completion per block and switch blocks as needed using the forward and back arrows.
Additionally, if the target option for a distribution was that the assessor/evaluator would define their target when completing a task, pending tasks will display the target as to be defined.
New in ME 1.20!
Updates to the way that Forwarded Tasks display in distribution progress reports.
After a distribution is created you can view a distribution progress report. Reports include an overview of the tasks generated and not started, the tasks in progress and completed tasks. Click on any of the status cards to display the individual tasks in each status category.
From the Distribution Report you can also complete a variety of actions including: deleting a task, recovering a deleted task, sending a reminder, adding a task, downloading completed tasks, and completing a task on someone's behalf. For some distribution types you can also generate a weighted csv report from the Distribution Report.
As of Elentra ME 1.20 we have updated how distribution progress reports show forwarded tasks. Previously, forwarded tasks were no longer visible on the relevant distribution progress page. Now, after a task has been forwarded, it will still display on the distribution progress report, listed under a new assessor/evaluator, and will display a 'forwarded' label.
Please note that a forwarded task will only show on the distribution progress report when it has been forwarded one time. If it is forwarded a second time it will be gone from the distribution progress report view.
Feedback Options: Check this to add an item to the form requiring the target and assessor to confirm they spoke.
Complete and confirm via PIN will allow the learner to immediately view and start the form, then have the assessor sign off on the form using their PIN.
Please note - Currently, when an A&E task reminder email is sent, the “From” field and valediction are not displaying the name of the individual who triggered the reminder, but instead displaying the name of the individual who triggered the task. This is currently being addressed with ticket: https://elentra.atlassian.net/browse/ME-7106
Elentra automatically sends email notifications to users when they have an Assessment and Evaluation task to complete or review.
Assessors/evaluators will receive one email notification overnight with all new tasks linked in the body of the email.
For tasks scheduled via a distribution, the timing of when specific tasks will trigger an email will depend on how the distribution is configured (e.g., the task delivery date set, the timing of learning events for event-based distributions, dynamic delivery settings for rotation-based distributions).
Note that if a delegation-based distribution has been created, when a delegator forwards a task to a user, a distinct email will immediately be sent for that task.
For user-initiated, on-demand tasks, email notifications will be created when the initiating user submits the form and the task moves to a new owner (i.e., assessor).
Other circumstances that can generate email notifications for a task include:
A program administrator sends a reminder for a specific task;
A task is forwarded from one user to another user.
Course directors, program administrators, and users set to receive prompted response notifications for specific distributions will also receive email notifications when they have tasks to review. This could include:
In a CBME context, when any user completing a form has answered "Yes" to a concern item on a CBME form generated from a form template (i.e. Supervisor Form, Field Note Form, Procedure Form)
When a user has indicated an answer configured as a prompted response on an item, when the form containing that item is included in a distribution configured to have prompted response notifications.
There are some configurable options for email notifications which can be controlled through the database (not through the user interface). This includes the ability to adjust emails for tasks scheduled via distribution to send a nightly summary of all new tasks, and then a weekly summary of tasks that are still incomplete.
include_name_in_flagged_notifications defines whether or not to include names of users in prompted response email notifications.
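As a hedged sketch (assuming the same settings table layout as elsewhere in this guide, with `shortname` and `value` columns, and `1`/`0` as on/off values), a developer might review or change this option like so:

```sql
-- Sketch only: verify table and column names against your Elentra schema.
-- Review the current value.
SELECT shortname, value
FROM settings
WHERE shortname = 'include_name_in_flagged_notifications';

-- Include user names in prompted response notification emails (assumes '1' = on).
UPDATE settings SET value = '1' WHERE shortname = 'include_name_in_flagged_notifications';
```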
Navigate to Admin > Assessment & Evaluation.
Filter the task list as necessary to find the task you want.
Check the box under the trash can icon on the right side of the task table.
Click 'Delete Task(s)'.
You are required to provide a reason for deleting the task and can optionally provide additional notes.
The reason entered will be displayed when tasks are viewed on the Deleted tab.
Click 'Remove Task(s)'.
You will get a success message. Close the success message to return to the A & E Dashboard.
Navigate to Admin > Assessment & Evaluation.
Click the Distributions tab.
Find the appropriate distribution and click on it or use the cog icon and select 'View Distribution Report'.
Click on a tab (e.g., Not Started) to view tasks.
Check the box under the trash can icon on the right side of the task table.
Click 'Delete Task(s)'.
You are required to provide a reason for deleting the task and can optionally provide additional notes.
The reason entered will be displayed when tasks are viewed on the Deleted tab.
Click 'Remove Task(s)'.
You will get a success message.
Navigate to Admin > Assessment & Evaluation.
Toggle to the Assessment or Evaluations view as needed.
Click the Deleted tab.
Filter the task list as necessary to find the task you want.
Click on a task to view it.
Click 'Reopen Task'.
Provide a reason for reopening the task.
Click 'Reopen Task'.
The task will now display under the Outstanding tab of the admin dashboard and will be set to in-progress for the task owner. New notifications will not automatically be sent.
Navigate to Admin > Assessment & Evaluation.
Click the Distributions tab.
Find the appropriate distribution and click on it or use the cog icon and select 'View Distribution Report'.
Click on a tab (e.g., Not Started) to view tasks.
Check the box under the trash can icon on the right side of the task table.
Click 'Recover Task(s)'.
You will get a success message and the task will appear as active on the task list.
There is developer work in the database required to set up workflows. Developers can see additional details here: https://elentra.atlassian.net/browse/ME-1554 and https://elentra.atlassian.net/browse/ME-1935
Setting up on-demand workflows allows you to configure forms so that Elentra users can initiate a form when needed without administrative staff creating a distribution. You can use workflows to:
allow users to initiate assessment tasks (EPA-based and other), and
allow learners to initiate rotation and faculty evaluations.
If there are forms you do not want users to be able to initiate on-demand, simply set the workflow to None.
You will need to decide as an organization how you want to use workflows and provide the relevant information to your developers for them to appropriately set up the database.
Sample uses include:
Allow learners or faculty to initiate CBME assessments (e.g. Supervisor form) on demand
Allow learners or faculty to initiate any generic assessment form on demand
Allow learners to initiate faculty evaluations on demand
Allow learners to initiate rotation evaluations on demand
You can watch a recording about Workflows at collaborate.elentra.org (login required).
There are some things to note about using on-demand workflows:
Forms will only show on the Start Assessment screen if they have an affiliated workflow. If a form's workflow is set to None it will not be available on demand and should be used only in distributions.
To use on-demand workflows, all forms associated with a workflow must be permissioned to a course.
Only learners should be set up to initiate faculty and rotation evaluations on demand. This is because Elentra uses learners' schedules to determine which forms they should be able to access. There is no user interface to support faculty completing standard course and faculty evaluation forms.
Please note that in addition to the types of workflows that an organization may enable, additional form types can also be added to an organization. These include Standard Faculty Evaluation and Standard Rotation Evaluation forms. These form types are not prerequisites for using workflows; however, if an organization wants to use automated rotation evaluations (based on a rotation schedule and configured when building a rotation), it must use a Standard Rotation Evaluation form.
EPA - This is designed specifically for organizations using CBME. Use for forms tagged to EPAs that you want learners to be able to trigger. Forms using the EPA workflow contribute to CBME dashboards. Please note that forms generated from CBME form templates (e.g., supervisor form) will automatically have an EPA workflow added to them.
Other Assessment Form - Use for forms that you want learners to be able to trigger and complete on demand without tagging to EPAs, or forms that you don't want to appear in the EPA list when triggering. Can be applied to both forms tagged with EPAs/milestones and forms that are not tagged with EPAs/milestones. Only forms with EPAs/milestones tagged will contribute to CBME Learner and Program dashboards.
Faculty Evaluation - Use for faculty evaluations. Learners can initiate on demand. A date range is included when triggered. Optionally use with standard faculty evaluation form types.
Rotation Evaluation - Use only for rotation evaluations and when scheduling learners in rotations using the Clinical Experience Rotation Scheduler. Learners can initiate on demand. Optionally use with standard rotation evaluation form types.
Note: Learners can only evaluate rotations that they have completed or are currently in. They cannot initiate an evaluation for a rotation scheduled in the future.
None - Use for forms that you do not want users to be able to initiate on demand. You can still attach these forms to distributions. You can still tag EPAs or milestones as needed.
An organization can apply expiry dates to on-demand tasks for specific workflows using a database setting (cbme_ondemand_expiry).
Organizations must additionally define which workflows will have the automatic expiry dates applied to them (cbme_ondemand_expiry_workflow_shortnames) and when tasks will expire. By default, Elentra will apply an expiry date 7 days after the task is generated. Organizations can modify this via a database setting (cbme_ondemand_expiry_offset).
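The sketch below illustrates how a developer might configure these three settings. The settings table layout, the on/off values, the format of the workflow shortname list, and the offset units are all assumptions to confirm with your development team.

```sql
-- Sketch only: table/column names and value formats are assumptions.
-- Turn on automatic expiry for on-demand tasks.
UPDATE settings SET value = '1' WHERE shortname = 'cbme_ondemand_expiry';

-- Which workflows get automatic expiry (the list format and shortnames are assumptions).
UPDATE settings SET value = 'epa' WHERE shortname = 'cbme_ondemand_expiry_workflow_shortnames';

-- Expire tasks 7 days after they are generated (the unit/format is an assumption).
UPDATE settings SET value = '7' WHERE shortname = 'cbme_ondemand_expiry_offset';
```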
This tool also includes support for administrative users to reset expiry dates manually from the assessments interface. Program coordinators and course directors can view expired assessments, which have a visual indicator that they have expired. A Reset button can optionally be used to reset the expiry date for the task.
Email blank form
Elentra will send the assessment task to the assessor to be completed at a later time.
If a learner has initiated a task to send to an assessor, the timing of its delivery is controlled by a behind the scenes task called a cron job. Typically the default setup is for the cron job (send-queued-notifications) to run every 10 minutes. This means if a learner initiates a task using the complete and confirm via email method at 8:30, the assessor might not receive the task until 8:40. If this timing should be adjusted, you must speak with a developer at your institution.
Complete and confirm via email
The learner can begin the assessment (i.e., fill in some of the form information) and then send it via email for the assessor to complete and confirm at a later time.
Complete and confirm via PIN
Assessors must have a PIN configured in their user profile for a learner to select this option.
Self Assessment, then email blank form
The learner completes the assessment and then the assessor gets a blank copy to also complete.
If a course is using a learner or program dashboard, the learner's self-assessment will not count towards their form completion totals or assessment plan requirements.
There is currently no report or view in the UI that allows for an easy side-by-side review of the learner's self-assessment and the assessor's completed form. For organizations that want to complete research on these tasks (e.g., compare a learner self-assessment and a faculty assessment), you'll need a developer's help to pull the relevant data from the Elentra database.
Faculty and learners click the green Start Assessment/Evaluation button on their dashboard to access a form using a workflow. Learners may see two tab menus: one for Start Assessment/Evaluation and one for Adhoc Distributions. To use a workflow they should click 'Start Assessment/Evaluation'.
Select a workflow (options available will depend on how you have configured workflows in your database).
Remaining steps will depend on the workflow selected.
Note that there is a database setting to control whether learners can add an external assessor when initiating an on-demand task (setting = cbme_ondemand_start_assessment_add_assessor). If you don't see an option for learners to add an external assessor and you'd like one, you'll need help from a developer.
Developers can see additional details here: https://elentra.atlassian.net/browse/ME-1554 and https://elentra.atlassian.net/browse/ME-1935
To allow the use of form workflows in an organization, some setup is required. You need to associate form types with workflows, and, if necessary, restrict the user groups who can access specific workflows.
You can review the available form types in cbl_assessments_lu_form_types
If your organization wants to use standard rotation evaluation or standard faculty evaluation forms, you will need to add those to the database and associate them with the correct audience.
| ID | Form Type |
| --- | --- |
| 1 | Generic Form |
| 2 | Supervisor Form |
| 3 | Field Note Form |
| 4 | Procedure Form |
| 5 | PPA Form |
| 6 | Rubric Form |
| 7 | Standard Rotation Evaluation Form |
| 8 | Standard Faculty Evaluation Form |
| 9 | Housing Form |
| 10 | Smart Tag Form |
Specific workflows are listed in cbl_assessments_lu_form_workflows
| ID | Workflow |
| --- | --- |
| 1 | Entrustable Professional Activity (EPA) |
| 2 | Other Assessment Form |
| 3 | Faculty Evaluation |
| 4 | Rotation Evaluation |
| 5 | None |
| 6 | Smart Tag |

cbl_assessments_lu_form_type_workflow_link acts as a junction table to associate form workflows with form types.
The migration that runs will assign a workflow of None to all form types. How you configure things depends on how your organization will use workflows. For example, in a CBME setup you could associate Supervisor Forms with the EPA workflow. You could associate Generic Forms with Other Assessment Form workflow. How you set this up depends on how you want to allow users to initiate forms on demand.
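As an illustration only, a developer might link form types to workflows with inserts similar to the following. The column names (`form_type_id`, `workflow_id`) are assumptions; the IDs come from the lookup tables shown above, but should still be verified against your own installation.

```sql
-- Sketch only: column names are assumptions; confirm IDs in the lookup tables first.
-- Associate the Supervisor Form type (2) with the EPA workflow (1).
INSERT INTO cbl_assessments_lu_form_type_workflow_link (form_type_id, workflow_id)
VALUES (2, 1);

-- Associate the Generic Form type (1) with the Other Assessment Form workflow (2).
INSERT INTO cbl_assessments_lu_form_type_workflow_link (form_type_id, workflow_id)
VALUES (1, 2);
```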
Finally, cbl_assessments_lu_form_workflow_groups is responsible for restricting access to a specific workflow to specific user groups. The migration that runs will populate this table so that, by default, only students can initiate Faculty Evaluation or Rotation Evaluation workflows on demand.
Any form and user group combination added to this table creates exclusive rights for those user groups to access a form workflow.
If a workflow has no user groups associated with it in this table, all user groups will be able to initiate that workflow on demand.
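For reference, a minimal sketch of what the default rows in this table might look like follows. The column names (`workflow_id`, `group_name`) and the stored group value (`student`) are assumptions; only the workflow IDs are taken from the lookup table above.

```sql
-- Sketch only: column names and the stored group value are assumptions.
-- Restrict the Faculty Evaluation (3) and Rotation Evaluation (4) workflows to students.
INSERT INTO cbl_assessments_lu_form_workflow_groups (workflow_id, group_name)
VALUES (3, 'student'), (4, 'student');
```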
There are several additional database setting options available when using workflows. These include:
cbme_ondemand_start_assessment (enabled by default)
cbme_ondemand_start_assessment_director_as_staff_workflow (disabled by default, if enabled allows faculty directors to get staff workflow view)
cbme_ondemand_start_assessment_replace_admin_trigger (disabled by default, if enabled changes the Trigger Assessment button on /admin/assessments to use the workflow view)
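To review how these options are currently configured, a developer could run a query like the one below (assuming the settings table layout used in the earlier sketches):

```sql
-- Sketch only: verify table/column names against your schema.
SELECT shortname, value
FROM settings
WHERE shortname IN (
  'cbme_ondemand_start_assessment',
  'cbme_ondemand_start_assessment_director_as_staff_workflow',
  'cbme_ondemand_start_assessment_replace_admin_trigger'
);
```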
There are several options when it comes to administering rotation evaluations in Elentra.
Use Automatic Standard Rotation Evaluations
This option relies on the use of a Standard Rotation Evaluation form type and requires that each course with rotations to be evaluated have its own form associated with it.
By using the Standard Rotation Evaluation form type you can optionally have a developer build standard templated items to be included on each form.
Once forms are built administrators can set parameters for automatic distribution of rotation evaluation tasks when they build each rotation (i.e., send 2 days before the end of a rotation).
There is no distribution progress report for this approach so reporting on learner completion of tasks must be done via the Form Responses Report.
Results of automated rotation evaluations are available in a canned report.
Allow learners to initiate evaluations on their own using on-demand workflows.
A benefit of this approach is that it requires very little administrative effort to set up. Once forms exist, learners can access them as required.
This approach can be supported using a generic form or a standard rotation evaluation form type, as long as the workflow is set as "Rotation Evaluation" and the form is permissioned to the relevant course(s).
This approach requires that learners take responsibility for generating their own tasks and does not guarantee that they will evaluate every rotation, nor every faculty member you might want to be evaluated.
To report on learner completion of tasks you will have to rely on the Form Responses Report (there will be no distribution progress report).
It can be more time-consuming to monitor learner completion of their responsibilities using this approach because you will rely on the Admin > A & E dashboard.
Use Distributions to send tasks to users to complete.
This approach requires more administrative effort as you will need to configure a distribution for every rotation you want to be evaluated (for some schools this may be hundreds).
Tip: Use the distribution copy tool to reduce your workload.
Benefits of this approach are that tasks are clearly defined for learners and administrators can use a distribution progress report to quickly see who has and hasn't completed their tasks.
If your organization uses the Clinical Experience rotation scheduler and a Standard Rotation Evaluation form type to evaluate rotations within courses/programs, you can use the automatic standard rotation evaluation option. This feature is enabled when you build rotations, and removes the need to have distributions or use workflows for rotation evaluations. Instead, each course/program can create a standard rotation evaluation form (which can have preset standard items shared across courses), and the form will be auto-distributed to learners based on their rotation schedules. This option requires developer assistance. A developer must add the appropriate form type in the database and build the templated items if desired (there is no user interface to build the templated items).
Please note that the automatic rotation evaluation option is specifically to support rotation evaluations, and that course evaluations should still be sent via distributions.
Each program/course can only have one standard rotation evaluation form active at a time. If you create a second form of this type it will override the first form you made.
Additionally, each course must have its own form created and permissioned to it.
To use the automated standard rotation evaluation feature, check off the "Automatically trigger rotation evaluations" checkbox when creating a new rotation, then set the preferred delivery schedule for these automated tasks.
To report on forms completed this way, please use the "Automatic Rotation Evaluations (Aggregated)" Report.
If your organization wants to use a standard faculty or rotation evaluation form a developer will need to add those two form types to the database and associate them with the correct organisation (relevant table: cbl_assessments_form_type_organisation).
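As a rough sketch, that association might be made with inserts like the ones below. The column names (`form_type_id`, `organisation_id`) and the organisation ID are assumptions; the form type IDs (7 and 8) come from the form types table earlier on this page.

```sql
-- Sketch only: column names and the organisation ID are assumptions.
-- Make the Standard Rotation Evaluation (7) and Standard Faculty Evaluation (8)
-- form types available to organisation 1.
INSERT INTO cbl_assessments_form_type_organisation (form_type_id, organisation_id)
VALUES (7, 1), (8, 1);
```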
Additionally, a developer will need to create the standard rotation evaluation items. These items will be included on every standard rotation evaluation form (courses/programs will be able to add additional items as needed).
Once the form types and standard items are added, administrative users can build standard rotation evaluation forms and publish them as needed.
If you want to allow learners to initiate evaluations of rotations and faculty on an as needed basis you can use on-demand workflows.
Faculty Evaluation and Rotation Evaluation workflows can be applied to any type of form permissioned to a course to allow learners to initiate the form on-demand.
Note that you cannot set up faculty evaluation or rotation evaluation workflows to allow faculty or staff to evaluate faculty or rotations.
Note:
If a learner initiates a form using a Rotation Evaluation workflow, Elentra checks the learner's rotation schedule. The learner will be able to complete a rotation evaluation for any past or current rotation they are scheduled into. A learner will not be able to initiate an evaluation for a rotation they are scheduled into in the future.
If a learner initiates a form using a Faculty Evaluation workflow, they will be able to access a complete list of all faculty stored in Elentra.
Program or Curriculum Coordinators have two methods to monitor completion of forms and tasks associated with their program, faculty and learners.
Coordinators can use Admin > Assessment & Evaluation to access distributions they are permissioned to, view A & E Reports, and see outstanding, upcoming, deleted, and completed tasks for users associated with their course/program.
Coordinators can access their own tasks (including delegations sent to them) and monitor the progress of individual faculty and learners through the Assessment & Evaluation Badge located at the top right of the screen.
Updated in ME 1.21!
Added more information to the Manage External Assessors Screen including a link to their tasks
External Assessors are members of a learning community who do not have Elentra accounts. Examples could include social workers, patients, allied health professionals, etc. External assessors can be included in distributions and learners can initiate tasks to external assessors.
When an external assessor is created their name and email are stored but they do not have a complete user record in Elentra. We avoid duplication of external assessor records by having the system check email addresses.
Existing external assessors will display at the end of a list of users when searching for a user when creating a distribution.
If a learner searches for an existing external assessor to initiate a task to, the name will display with other returned results and the assessor will be labelled as external on the user card.
Note that allowing learners to add external assessors when initiating a task is an option controlled through a database setting. If you don't see the option for learners to add an external assessor and you'd like them to be able to, ask a developer to enable cbme_ondemand_start_assessment_add_assessor in the settings table.
Updated in ME 1.21!
Additional information added to the page, the user that created and last updated the external assessor record
External assessors' assessment count now links to their assessments
Users with medtech:admin, staff:admin, and faculty:admin group and role permissions will be able to access Manage External Assessors.
Please note that this tool does not currently include a way to merge completed assessments or evaluations from two accounts into one; nor does it include the ability to automatically turn an external assessor record into a full Elentra user account.
Navigate to Admin > Manage Users.
From the User Management card in the left sidebar, click Manage External Assessors.
You will see a list of recently added external assessors and additional details including who created the assessor and who last modified the user record.
Optionally increase how many assessors are displayed on the page.
Click on a name to edit that person's name or email address.
Click on the number in the assessments column to view assessments completed by that assessor.
If an assessor has been deleted, there will be no link to their assessments.
Switch to the Assessor Search tab to search for a specific external assessor.
Click Add External Assessor to add additional assessors.
Provide first and last name and email address, then click Add External Assessor.
You will see a green success message and be redirected to the page you were previously on.
Curriculum and program coordinators as well as some faculty directors can access Admin > Assessment & Evaluation to monitor the status of assessment and evaluation tasks.
Typically, coordinators and faculty directors will access this page to see task progress for learners and faculty associated with their course. For example, if Jane Doe is set as the program coordinator for Pediatrics, she will see the tasks of all learners enrolled in Pediatrics and all faculty who are associated with Pediatrics because they are an assessor on a distributed form associated with Pediatrics, or they are the target or assessor on a user-initiated on-demand form associated with Pediatrics.
Admin > Assessment and Evaluation is a useful access point to send reminders for specific tasks, record assessments completed outside Elentra, trigger bulk assessments, etc. More details about those actions are included in the following pages.
On the A & E dashboard tab, tasks can be filtered to easily view outstanding, upcoming, deleted, and completed tasks, as well as prompted responses.
Click on the task status you want to review (e.g. Outstanding, Deleted, etc.)
Enter a search term (e.g. a user name or form title).
Apply a delivery type filter if desired (e.g. distribution based).
Set a date range if desired.
Click on a task to view it in a new tab.
Click the Field Display Option button to optionally add additional fields to your dashboard display.
Start and End Date - represents the start and end date of the relevant task (if defined, e.g., in a distribution or when a task is initiated on-demand)
Course - the course a task is tied to
Event - the event a task is tied to (if applicable)
Choosing a field will add the appropriate labels to the tasks in the Tasks column.
Note that the Owner of a task is the person who has responsibility for completing the next step of a task. As such, if a user has stored a task in draft mode, they may be the owner and target of the task, even if it will eventually be sent to a faculty/preceptor.
When viewing a task a coordinator can optionally complete the task on behalf of another user, forward the task to be completed by someone else, or delete the task.
Click 'Assessments' or 'Evaluations' to toggle between the types of tasks you are reviewing. Remember, in Elentra Assessments are about learners, and Evaluations are about other things (e.g., faculty, courses, rotations, events).
Outstanding tasks include those that have been delivered but are not yet complete.
Hover over the grey number in the targets column to view all targets included in the task.
Quickly view expiry dates if applicable, and optionally send reminders or delete tasks from this tab.
Upcoming tasks include those that are scheduled for delivery in the future.
Deleted tasks have been deleted by a user.
Optionally hide tasks you no longer need to view by checking the box in the Hide column and clicking 'Hide Selected Tasks'.
Please note that tasks will show up under this tab only when they have had their terminal phase completed. This means that tasks that are initiated on demand and might be started by a learner, then completed by a preceptor will only show on this tab when the preceptor has completed/confirmed the task (i.e., complete and confirm via PIN and complete and confirm via email tasks). If a task has been submitted by a learner but not completed by a preceptor it will not be stored here yet (it will be in the Outstanding tab).
A potential privacy concern exists for this tab, especially in the case of Evaluations. Coordinators and faculty with access to this tab can view completed evaluations. Until Elentra introduces some setting options to control who accesses this tab, some schools may wish to comment it out and hide it (requires developer assistance).
The Prompted Responses tab will display all completed tasks that include items where users selected prompted responses. (Recall that a prompted response is set per item when items are created.)
Users can view the completed task by clicking on the task title. The prompted response will be emphasized with a red exclamation point.
If a user has responsibility to act on prompted responses, they should use the personalized Prompted Responses tab accessible from their Assessment and Evaluation badge. From that tab users can review prompted responses, leave comments, and see who else has reviewed them.
Reopening completed forms is an optional feature that can be turned on or off in the database settings file depending on how your organization wants to use it.
There is a tool to allow staff:admin users to reopen a completed assessment and edit it as needed.
Navigate to your Assessment & Evaluation badge (beside your own name in the top right).
Click on My Learners and then Assessments under a specific learner's name.
From the list of Tasks Completed on Learner click on a task to view it.
Click the "Reopen Task" button just below the form title. This will set the task to in-progress and the staff>admin or faculty will be able to adjust it.
Provide a reason to reopen the task (e.g. was accidentally deleted, was missing data, other).
Click 'Reopen Task'.
Once reopened, a user can complete the task and submit it on behalf of the original assessor, or they can forward the task to the original assessor to update.
Use with caution!
This tool should be used judiciously to ensure that residents are not "gaming" the system or bullying anyone into changing their assessments to be more favourable.
Navigate to Admin > Assessment & Evaluation.
Filter the task list as necessary to find the task you want.
Check the box under the bell icon on the right side of the task table.
Click 'Send Reminders'.
You will see a summary of the reminders that will be sent (e.g. assessor name and number of notifications).
Click 'Confirm Reminders'.
You will get a success message. Close the success message to return to the A & E Dashboard.
Navigate to Admin > Assessment & Evaluation.
Click the Distributions tab.
Find the appropriate distribution and click on it or use the cog icon and select 'View Distribution Report'.
Click 'Manage Distribution' and select 'Send Reminders'.
You will get a confirmation message and see a summary of the reminders to be sent (e.g. assessor name).
Click 'Confirm Reminders'.
You will get a success message. Close the success message to return to the Distribution Report.
Elentra includes a variety of report options through the Admin > Assessment & Evaluation tab. Please read report descriptions carefully as not all reports necessarily include data from both distributed forms and user-initiated on-demand forms.
Additional reporting tools for user-initiated on-demand forms used with the CBME tools can be viewed from the CBME dashboard.
A&E Reporting is an administrative reporting tool. Note that individual users may have access to their own reports via their Assessment & Evaluation button, but the availability of such reports depends on how distributions were set up.
When you create most reports you will have some additional options after selecting the appropriate course/faculty/learner/form, etc. These options allow you to customize the reports you run for different audiences.
Please note that not all these options will actually display on all reports, despite the fact that you have the option to select them through the user interface. Please see each specific report for additional detail about what will or will not be visible.
Include Comments: Enable this option if you'd like the report to include narrative comments made on the selected form. Unique Commenter ID: If you select to include comments you'll see this option. It allows you to apply a masked id number to each assessor/evaluator. This can be useful to identify patterns in comments (e.g., multiple negative comments that come from one person) while protecting the identity of those who completed the form. Include Commenter Name: If you would like to display the names of commenters click on the checkbox. Include Description: If you click this checkbox you can give a description to the report. The text you enter will be displayed at the top of the report generated. Include Average: Click this checkbox to include a column showing the average score. Include Aggregate Scoring: If you enable the average, you'll have the option to also include a column with aggregate positive and negative scoring in some reports. This gives a dichotomous overview of positive and negative ratings.
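As a rough illustration of how the Unique Commenter ID, Include Average, and Include Aggregate Scoring options behave, here is a minimal Python sketch (not Elentra code). The sample responses, the masking scheme, and the choice of where the scale splits into negative and positive ratings are assumptions made for this example only.

```python
from statistics import mean

# Hypothetical sample data: (assessor name, rating on a 1-5 scale, comment).
responses = [
    ("Dr. Lee",   5, "Clear expectations."),
    ("Dr. Patel", 2, "Feedback was often late."),
    ("Dr. Lee",   4, "Good supervision."),
]

# Unique Commenter ID: replace each assessor's name with a masked ID so that
# patterns (e.g., repeated negative comments from one person) remain visible
# without identifying the commenter.
masked_ids = {}
for name, _, _ in responses:
    masked_ids.setdefault(name, f"Commenter {len(masked_ids) + 1}")

for name, rating, comment in responses:
    print(masked_ids[name], rating, comment)

# Include Average / Include Aggregate Scoring: an overall mean plus a
# dichotomous count of positive vs. negative ratings. Treating 1-2 as
# negative and 3-5 as positive is an assumption for this example only.
ratings = [rating for _, rating, _ in responses]
positive = sum(1 for rating in ratings if rating >= 3)
negative = len(ratings) - positive
print(f"Average: {mean(ratings):.2f}, positive: {positive}, negative: {negative}")
```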
Users with access to Admin > Assessment and Evaluation will be able to view Assessment & Evaluation tasks and reports. Generally this will include medtech:admins, staff:admins, and the staff:pcoordinator and faculty:director users assigned to a specific course or program.
To view A&E Reports:
Click Admin > Assessment & Evaluation.
From the tab menu, click 'Reports'.
Reports are categorized under evaluations, assessments, leave and distributions. Please read report descriptions carefully as not all reports necessarily include data from both distributed forms and user-initiated on-demand forms.
Additional reporting tools for user-initiated on-demand forms used with the CBME tools can be viewed from the CBME dashboard.
Due to privacy requirements, some organizations choose to restrict access to completed evaluations in some way. The existing options are to:
Set specific forms as Confidential. If you enable this at a form level, any time that form is used the name of the assessor/evaluator will be shown as Confidential. This can help protect learner privacy, however it also restricts the ability to view who has completed their assigned tasks. If you monitor learner completion of evaluation tasks for the purposes of professionalism grading or similar, you may not want to set forms as Confidential.
Use a database setting (show_evaluator_data) to hide the names of any evaluator on a distributed or on-demand task. This means when an administrator or faculty member with access to a distribution progress report or the Admin > A&E Dashboard views tasks they will never see the names of evaluators but can see the contents on completed evaluation tasks.
Use a database setting (evaluation_data_visible) to restrict users' access to completed, individual evaluation tasks. Users can view a list of completed tasks but if they click on a task they will be denied access to view the task contents. Instead of seeing the task, they will receive an error message stating that they do not have access.
Warnings:
Note that medtech:admin users will still have access to all Evaluation data even with show_evaluator_data and evaluation_data_visible enabled. If this is not desired, set any Medtech Admins as "Assessment Report Admin" and leave these flags turned off in the database instead.
The show_evaluator_data and evaluation_data_visible settings do not cover Prompted Responses.
When the evaluation_data_visible setting is applied, admin and faculty users with access to Admin > Assessment & Evaluation will no longer be able to see the results of individually completed evaluation tasks or generate evaluation reports where evaluators could be identified.
On the Admin > A&E Dashboard, users will still see the Evaluations tab, but they cannot click on individual tasks in any of the tables; the hyperlinks to those tasks are removed. (Maintaining visibility of the tasks themselves allows users to send reminders as needed.)
Under Assessment & Evaluation Reports users will be prevented from generating Individual Learning Event Evaluations since evaluators are identifiable in those reports.
For any distribution with a task type of Evaluation, users will be prevented from being able to open individual, in-progress or completed tasks on the Pending/In Progress and Completed tabs of the Distribution Progress page. Users will also be prevented from downloading PDFs of individual tasks on the distribution progress page.
Under the Assessment & Evaluation Tasks Icon, if the user can access Learner Assessments via the My Learners tab, they will be prevented from viewing or downloading individual evaluation tasks from the Learner's Current Tasks tab.
If the user can access Faculty Assessments via the Faculty tab, they will be prevented from viewing or downloading individual evaluation tasks from the Tasks Completed on Faculty tab.
When a distribution is set up as an Evaluation, the Reviewer option from Step 5 is no longer available to prevent accidentally allowing a user to see individual, identified, completed tasks.
If evaluation_data_visible is in use, an organization may wish to designate a small number of users to view completed evaluations. For this purpose, individual proxy ids can be stored in the database (with show_evaluator_data enabled and the ACL rule 'evaluationadmin' applied).
Even when show_evaluator_data is enabled, the Progress pages for Evaluation distributions and the admin dashboard will not identify the evaluators. Tasks will show Confidential Evaluator and no email address.
New in ME 1.25!
Staff:pcoordinator users now have the ability to set a Reviewer for a task they are initiating.
Forms must be permissioned to specific courses to be triggered using this tool.
Administrative users have the ability to trigger an assessment to send tasks on an as needed basis. This might be useful if a specific learner does not have enough assessments in a specific area or there is a reason to prompt a specific assessor to complete a task on a specific learner.
Navigate to Admin > Assessment & Evaluation.
Click 'Trigger Assessments'.
Select a learner.
Select an assessor.
Define a form type (e.g. CBME or Generic).
Set a Start and End Date for the Assessment (this defines the period of time the assessment addresses).
Optionally set the assessment as mandatory. (This means the assessor will not be able to delete the task.)
Optionally indicate if feedback is required for the assessments. If you check this off it will add an item to the form asking the assessor whether or not they met with the trainee to discuss their performance (see sample text below). The target will also be asked to answer this question.
Optionally set prompted response notifications. This allows you to decide whom to contact if any answer on the form is designated as a prompted or flagged response option. (For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response.) You can optionally select to email the Program Coordinators, Program/Course Directors, or add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Click 'Get Tools'.
Search for the appropriate tool as needed.
Click 'Preview This Form' to confirm you have the form you want or click 'Send Assessment' to send the assessment to the selected assessors.
You will be prompted to optionally provide an Assessment Cue. This can be a short note to explain the context of the assessment. The text entered will display at the top of a form when the assessor is completing it.
Click 'Send Assessment.'
A green success message will display at the top of the page and you will be returned to the beginning of the bulk trigger assessment process to repeat if needed.
The administrator Record Assessment tool is to be used for forms that a user could initiate on demand (e.g. CBME forms or other assessment workflows). A user cannot record a task using a form that is configured to be sent only via a distribution.
A staff:admin or program coordinator can enter completed assessment forms on behalf of assessors. This allows faculty to complete a pen and paper version of a form and still have the data entered into Elentra.
To enter a completed form on someone's behalf:
Navigate to Admin > Assessment & Evaluation.
Click on the green 'Record Assessment' button below the orange Assessment Tasks heading.
Select a learner from the searchable dropdown menu (you will need to know the curriculum period the learner is in).
Select an assessor (this should be the person who completed the pen and paper form).
Select a course/program (this will only be necessary if the learner is enrolled in multiple programs).
Select a CBME Version if necessary. (This helps limit the tools returned to the appropriate curriculum version.)
Select a Date of Encounter (i.e., the day the form was completed).
Select an EPA as you would to initiate a form. Filters and search are available.
Search for the appropriate form. You can preview the form to make sure it is the one you want or click 'Begin Assessment' to start a form.
You will be submitting the assessment on behalf of the selected assessor. There is a reminder of the selected assessor displayed at the top of the form in a yellow information bar.
Complete the form and click 'Submit'.
You will get a green success message and be returned to the assessment entry screen to complete another form if needed.
Forms must be permissioned to specific courses to be triggered using this tool.
Administrative users have the ability to trigger bulk assessments to send tasks to multiple assessors, for multiple targets, in the combinations required.
Navigate to Admin > Assessment & Evaluation.
Click 'Trigger Bulk Assessments'.
Select a course/program.
Select an assessor and target(s) as required. Note that to search for targets you will need to filter by curriculum period.
For each target you can set a start and end date to reflect what the assessment period should be.
To add additional assessors (and targets) click the 'Add Assessor' button.
Optionally set the assessments as mandatory.
Optionally indicate if feedback is required for the assessments. If you check this off it will add an item to the form asking the assessor whether or not they met with the trainee to discuss their performance (see sample text below). The target will also be asked this question.
Optionally set prompted response notifications. This allows you to decide whom to contact if any answer on the form is designated as a prompted or flagged response option. (For example, if you have an item asking about student completion of a procedure and "I had to do it" was set as a prompted/flagged response.) You can optionally select to email the Program Coordinators, Program/Course Directors, or add a Custom Reviewer. If you select to add a Custom Reviewer you can select their name from a searchable list of users.
Click 'Tools'. The tools returned will be those that are permissioned to the course/program you selected at the top of the page.
Click 'Preview This Form' to confirm you have the form you want or click 'Send Assessment' to send the assessment to the selected assessors.
You will be prompted to optionally provide an Assessment Cue. This can be a short note to explain the context of the assessment. The text entered will display at the top of a form when the assessor is completing it.
Click 'Send Assessment.'
A green success message will display at the top of the page and you will be returned to the beginning of the bulk trigger assessment process to repeat if needed.
If an administrator is looking at a task with an expiry date, she can extend that date if needed. (This option will display when the task was initiated on-demand by a user AND an organization has their database setting to apply automatic expiry dates to on-demand forms enabled.)
Click 'Reset Expiry' in the top right.
Check the date indicated as the new expiry date.
Confirm by clicking 'Reset Expiry'.
You will return to the view of the task.
If users view the task from their Assessment & Evaluation badge, they'll see the updated expiry date on their task card.
Administrators can forward tasks on behalf of assessors/evaluators.
Navigate to Admin > Assessment & Evaluation.
Filter the task list as necessary to find the task you want.
Open the task.
Click 'Forward Task'.
Choose a user to forward the task to.
Note that emails will be sent to the old and new assessor/evaluator.
Click 'Forward Task'.
You will get a success message. Close the success message to return to the A & E Dashboard.
When a distributed task is forwarded, the newly generated task is not actually included in the original distribution. Enhancements in ME 1.20 allow administrative users to see, from a distribution progress page, a record of tasks that have been forwarded once.
The task will be removed from the original assessor's task list. The newly generated task will display for the new assessor and will be labelled "Forwarded."
Because the newly generated task isn't actually a part of the distribution, an administrator can't send reminders for the new task or delete it from the distribution progress page. Instead you would need to do so from the Admin > A & E Dashboard.
Completed forwarded tasks are included in a Weighted CSV report generated for a distribution.
Note that if a task is forwarded more than once, anything beyond the second task will not display on the distribution progress report at all.
For use when you have clinical learning courses with block schedules and want an overview of those learners who had an approved leave during a specific block.
Select a curriculum period.
Select a block.
Click 'Download PDF'.
For use when you have clinical learning courses with rotation schedules and want an overview of those learners who had an approved leave during a specific rotation.
Set a date range.
Select one or more learners.
Set the report parameters regarding displaying description and comments.
Click 'Generate Report'.
Note that once generated, this report is available to download by clicking 'Download PDF'.
The Evaluation Reports section will allow you to generate reports based on evaluations completed via distributions. Evaluation reports will not include commenter names, even if you check off the commenter name option when setting the report options.
This report is relevant only if your organization uses the Clinical Experiences rotation schedule. If you distribute rotation evaluations through a rotation-based distribution, you can use this report to view results. The exact format of the report will depend on the form it is reporting on.
Select a course, date range, rotation, curriculum period and form.
Set the report parameters regarding displaying comments and averages.
Click 'Generate Report'.
This report will not include learner names, even if you check off the commenter name option when setting the report options.
For use with distributions completed by event type.
Set a date range.
Select Individual Events: Check this off if you want the ability to select individual events (otherwise you will have to report on all events).
Select the event type, distribution by a curriculum period, learning event and form.
This report will not include learner names, even if you check off the commenter name option when setting the report options.
Use this to report on feedback provided by participants when a feedback form is attached to an event as a resource.
Select a course, date range, event type, form and learning event (optional).
You will notice some extra report options for this type of report.
Separate Report for Each Event: This will provide separate files for each report if multiple events are selected to include.
Include Event Info Subheader: This will provide a bit of detail about the event being evaluated (title, date, and teacher).
This report can include an average and an aggregate positive/negative score. This report will not include learner names, even if you check off the commenter name option when setting the report options.
For use in viewing a summary report of learner evaluation of an instructor.
Select a course and set a date range.
Select a faculty from the dropdown menu by clicking on his/her name. Note that only faculty associated with the selected course in the given time period will show up on the list. Additionally, they must have been assigned as an assessor in another distribution in the organization. Please see additional information below.
Select a form and distribution (optional).
Set the report parameters regarding displaying comments and averages.
Click 'Download PDF(s)'.
This report will not include an average, even if you check off Include Average when setting the report options.
Faculty names become available to select for this report only when the faculty member is also an assessor on a distribution in the organization. This is designed in part to protect the confidential nature of faculty evaluations and prevent staff from being able to generate reports on any faculty at any time. If your organization does not use assessments or you require reports on faculty whose names aren't available, please reach out to us and we can help you put a workaround in place.
You may also be able to report on faculty evaluations by accessing an aggregated report from a specific distribution. Please see more detail in the Weighted CSV Report section in Distributions.
Displays the average of each selection that was made by evaluators for each item in the form for each target on whom the form was completed. Commentary for each target is listed below the table of total selections for each form item.
Select one or more courses.
Set a date range.
Optionally include external assessors by checking off the box.
Select the relevant faculty.
Select the relevant form.
Select the relevant distribution(s).
Click Download CSV.
Takes a total of all evaluations and splits the rating scale into low and high responses. For each faculty target, the report indicates the highest score that they received for each question and also shows the lowest low score and the highest high score across all faculty for each question.
Select one or more courses.
Set a date range.
Select the relevant faculty.
Select the relevant form(s).
Select the relevant distribution(s).
Include Target Names - Check this box if you want to include the target names in the report.
Include Question Text - Check this box if you want to include the question text in the report.
Click Download CSV.
Aggregates, on a per-faculty basis, the results of the standard faculty evaluation form type delivered via distribution only. Optionally include any additional program-specific questions.
Select one or more courses.
Set a date range.
Select the relevant faculty.
Include Comments - check this box to include comments.
Unique Commenter ID - check this box to apply a commenter id to comments so you can look for patterns from one evaluator.
Include Description - check this box and enter text that will be included at the top of the PDF report.
Include Average - check this box to include average ratings.
Click Download PDF.
For use in viewing a summary report of learner evaluation of a course.
Select a course and set a date range.
Select a form and distribution (optional).
Set the report parameters regarding displaying comments and averages.
Click 'Download PDF(s)'.
This report will not include learner names, even if you check off the commenter name option when setting the report options. This report can include an average and an aggregate positive/negative score.
Select a Course
Set a date range for the report.
Select the event types.
Select the appropriate distributions.
Select the relevant learning event (optional).
Optionally select to download as one file.
This report will only be populated if you are using automated rotation evaluations enabled via the Clinical Experiences Rotation Schedule. To use automated rotation evaluations you must also be using a standard rotation evaluation form type.
Select a course/program.
Set the date range.
Select a form (if you have multiple published forms you may be able to pick from them).
Select a rotation schedule (the options available will be based on the set date range).
If you select multiple rotations, results will be aggregated (this can function a bit like a program evaluation if desired).
Choose whether to view results as a CSV or PDF.
This report will not include commenter names.
When building a distribution a reviewer can be set in Step 5 of the Distribution Wizard. This person will need to review each completed task and approve, reopen, or hide it as required before the task is finalized. The use case for this might be providing an intermediary reviewer who checks the appropriateness of narrative comments on course and faculty evaluations or peer assessments.
For more information on setting up a reviewer on a distribution, please see the page for the distribution type you are working with.
If you have been set as the reviewer for a distribution, click your Assessment and Evaluation badge and you'll see tasks with a Reviewer label under your Assessment Tasks tab.
Click View Task.
Review the form contents.
Choose one of three options:
Reopen the task: This will send the task back to the assessor/evaluator and it will remain in their pending/in progress task list until they complete it. If you reopen a task you will get a message indicating that the task has successfully been reopened.
Approve the task: This will essentially mark the task as finalized and release the task to the target (if allowed by the parameters of the distribution).
Hide the task: This will hide the task from being viewed by the target. The person who completed the task gets no feedback. If you hide a form, you will be prompted to enter a reason for why you hid the form. Note that hiding a form does not remove its data from reports generated about this form and distribution.
Note that currently, there is no user interface to retrieve all hidden forms or to view the comments about why the form was hidden.
This report can be used by administrative staff to keep an inventory of distributions.
Select a report type. You can see an overview of distributions or individual tasks.
Select a course.
Set a date range.
Select a task type.
Select a distribution or an individual task (your option will depend on the first selection you made on the page).
Click 'Generate Report'.
From here you can search within the results or click on any distribution to see its progress.
Use to see an overview of who is set as a reviewer for distributed tasks. (When you create a distribution you can assign a reviewer who serves as a gatekeeper of completed tasks before they are released to be seen by their target. This is completed during the final step of a distribution. For additional information please see the Assessment and Evaluation>Distributions help section.)
More information coming soon.
This report functions like the report above but offers users a view of the report in the interface without requiring them to open a PDF.
Please see the next page for information on the Weighted CSV Report.
Distributions in Elentra can be configured to allow assessors/evaluators to view the results of previously completed tasks when completing their own tasks.
Say you have a learner on a clinical rotation. Every three days they may get assessed by a preceptor using a daily encounter form. At the end of the rotation, a director is expected to complete a final assessment form on the learner. An administrator can configure a distribution to allow the director to view the previously completed daily encounter tasks and use aggregated data to help them make their ratings on the final assessment form.
For any items that were used on both forms, the summary assessment task assessor will be able to see an aggregate count of ratings and comments made on previously completed tasks.
The summary assessment task assessor can also view any completed task in full (e.g. see all additional items included on form, any comments given, etc.).
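For illustration only, the following Python sketch shows one way the aggregate counts and comments described above could be assembled from previously completed tasks. The item names, response labels, and data structures are assumptions for the example, not Elentra's actual implementation.

```python
from collections import Counter, defaultdict

# Hypothetical previously completed daily encounter tasks. Item names,
# response labels, and comments are invented for this example.
completed_tasks = [
    {"Professionalism": ("Meets expectations", "Arrived prepared."),
     "Communication": ("Exceeds expectations", None)},
    {"Professionalism": ("Exceeds expectations", None),
     "Communication": ("Meets expectations", "Clear with patients.")},
]

rating_counts = defaultdict(Counter)
comments = defaultdict(list)
for task in completed_tasks:
    for item, (rating, comment) in task.items():
        rating_counts[item][rating] += 1
        if comment:
            comments[item].append(comment)

# The summary assessor would see, for each shared item, an aggregate count
# of the ratings plus the collected comments.
for item, counts in rating_counts.items():
    print(item, dict(counts), comments[item])
```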
Summary assessment tasks currently rely on distributed tasks (not user initiated on demand tasks using workflows). If you have a scenario where you'd like learners to be able to initiate tasks on demand but still include those tasks in a head form distribution you should set up an adhoc or delegation distribution for the learners. Then you can link that distribution to the head form distribution.
Staff have quick access to review their assessment and evaluation tasks and those of their affiliated learners and faculty through the Assessment and Evaluation badge accessed in the top right.
When viewing results on any of the tabs in A&E, you can:
Use the search bar to look for a specific task by name, and click the down arrow to limit your results by applying search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, task status (pending, in progress, completed), and task type (mandatory/optional).
Sort by Delivery Date or Expiry Date (select one from the drop down selector and then enter the desired start and end dates).
Remember to click 'Apply Filters' to apply your choices.
Remember to click 'Remove Filters' to clear filters and view all results.
This tab shows users all pending tasks for which they are responsible. Mandatory tasks are indicated with a red highlight and a bold red "Mandatory" title on the card.
This is particularly important for curriculum and program coordinators because of delegation-type distributions. If a distribution was created and set as a delegation to be sent to a program administrator, s/he can access that delegation here and assign the task to the appropriate faculty (or other user).
View and complete a task by clicking on it.
Remove a task by clicking 'Remove Task' and providing a reason for its removal.
Download individual forms, or download multiple forms by clicking the download checkbox on each card and then clicking 'Download PDF(s)' at the top right. When users do this they will be able to choose whether to download all tasks as one file or as multiple files.
This tab displays all tasks completed on the user and which s/he has been given access to view. This can be controlled in the creation of a distribution.
The user can view all completed tasks and again, download an individual file or download multiple files at once.
The My Learners tab will allow administrative staff to access a variety of information for learners associated with them. This list of available options will vary depending on which modules are in use in Elentra but could include CBME Dashboard, Assessments (refers to tasks managed through Assessment and Evaluation), Learner Explorer (refers to course gradebooks), Schedule (i.e., rotation schedule) and Log Book.
Note that in a CBME-enabled organization, program coordinators can optionally see the CBME Program Dashboard when they click on the My Learners tab.
From the My Learners tab, users can:
Download a list of all learners associated with them by clicking 'Download Enrolment'. A pdf titled "learners-for-name" will download and list all learners, including their primary email and learner level (if learner levels are in use).
Search for an individual by beginning to type his/her name in the search learners area; the learner cards displayed will automatically update to reflect the searched name.
Refine the list of learners by switching the curriculum period using the dropdown menu on the far right.
Click on the appropriate tab to view the desired information.
Curriculum/program coordinators associated with a course will be able to access a tab listing faculty. Faculty will appear on this list if they have been the assessor in a distribution or the target or assessor of a triggered form associated with the same course as the program coordinator.
Hide cards of external faculty by clicking Hide Card. Bring hidden cards back into view by clicking 'Show Hidden Faculty'.
Update external faculty emails by clicking on Update Email and providing revised information.
Program coordinators will be able to view the Current Tasks, Completed Tasks, Upcoming Tasks, and Tasks Completed on Faculty tabs for the available faculty.
Program coordinators can send reminders, remove tasks or download tasks. (From this screen there is an option to select all and send reminders.)
From the Tasks Completed on Faculty tab the program coordinator can also view Reports for this Faculty.
This tool allows the program coordinator to view and download a pdf report (with or without comments) that aggregates evaluations of the selected faculty in one report. Program coordinators can also use the Options dropdown menu to quickly view the form and the individual evaluations.
Responses and comments aggregated in these reports are de-identified.
Click group by distribution as desired to sort the forms by distribution. If the same form has been used across multiple distributions this will tease apart each distribution and you can report on forms completed in a single distribution.
This tab will only display if the user has Prompted Responses to manage. Prompted Responses is also an optional setting in Elentra so may not be on for your organization.
When building items for use on assessment or evaluation forms, administrators have the option to identify response selections as prompted responses; then, they can identify on a distribution who should be notified by email if a prompted response is selected. The Prompted Response Notifications tab allows recipients of prompted response email notifications to review and interact with the form that includes the prompted response.
When you click on your Assessment & Evaluation badge, you will see a Prompted Responses tab.
Once you have clicked the Prompted Responses tab, you will see a table displaying a list of your prompted response notifications, the course each belongs to, and the owner, target, and review status of each task.
You can sort the table by clicking on any of the headers.
Search for a specific prompted response by using the search field.
Clicking on the title of the task will allow you to interact with the form.
Once you are viewing the form:
Click in the comment box to add in your comments about the assessment, if you have any.
Comments can only be viewed by other prompted response recipients and administrators associated with the course/program.
If you do not wish to add comments to a form, you can simply click the Submit Review/Comment button to mark it as reviewed.
Prompted Responses on forms can also be configured to send email notifications. If users are the recipients of Prompted Response notifications, they will see the prompted response indicated in the email notification.
To view information from forms completed on oneself, the user can click My Reports on the right hand side.
Set the date range.
Remember that the creator of a distribution can set whether a user can view reports on a distribution, so not every distribution completed on a target is available for the target to view.
The Weighted CSV Report provides a csv file that includes the data collected through completed forms. It lists users who completed the form down the side and form items across the top. Inside each cell will be data from the form representing the scale ratings made by those who completed the forms. If your response descriptors include numbers (e.g. 1 - Overall, this instructor is an effective teacher.) note that those numbers will not necessarily be reflected in the csv.
It is important to note that the weighted CSV report was specifically designed to be used in conjunction with items using a rating scale (e.g. grouped items using a rubric), and allows you to create custom weights for scale response descriptors which get reflected in the report. There is currently no way to configure these weights through the user interface, and you will need a developer's help to assign weights to scale response descriptors in the database. (Developers, you'll need to use the cbl_assessment_rating_scale_responses table.)
If no weights are applied to the scale responses, the report defaults to assigning values of 0, 1, 2, 3, and 4 to the responses in left-to-right order. In effect, the Weighted CSV Report will work best if the rating scale you apply to the items mimics a 0-4 value (e.g. Not Applicable, Strongly Disagree, Disagree, Agree, Strongly Agree). A brief illustrative sketch of this default weighting follows the list of ignored values below.
Please note that some rating scale values will be ignored in the Weighted CSV Report. Values that will be ignored are:
n/a
not applicable
not observed
did not attend
please select
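As a rough illustration of the default behaviour described above, this Python sketch (not Elentra code) applies the default 0-4 left-to-right weighting, skips the ignored values from the list above, and computes a weighted average for a set of responses. The response data is invented for the example.

```python
# Hypothetical scale, listed in the left-to-right order it appears on the form.
scale = ["Not Applicable", "Strongly Disagree", "Disagree", "Agree", "Strongly Agree"]

# Response values ignored by the Weighted CSV Report (see the list above).
IGNORED = {"n/a", "not applicable", "not observed", "did not attend", "please select"}

# Default weighting: 0, 1, 2, 3, 4 applied left to right when no custom
# weights have been stored in the database.
default_weights = {label: position for position, label in enumerate(scale)}

# Hypothetical responses collected for one form item.
responses = ["Agree", "Strongly Agree", "Not Applicable", "Disagree"]

scored = [default_weights[r] for r in responses if r.lower() not in IGNORED]
weighted_average = sum(scored) / len(scored) if scored else None
print(scored)            # [3, 4, 2]
print(weighted_average)  # 3.0
```

If custom weights were assigned in the cbl_assessment_rating_scale_responses table, they would simply take the place of the default_weights mapping in this sketch.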
You can access the Weighted CSV Report from two places: from the Assessment & Evaluation Reports tab or from an individual distribution.
To access the Weighted CSV Report from Admin>Assessment & Evaluation you must have general access to the Admin>A&E tools. Such access will usually apply to staff:admin, staff:pcoordinator, and faculty:director users when the staff:pcoordinators and faculty:directors are affiliated with a course/program.
The weighted CSV report is accessible from the Admin>Assessment & Evaluation Reports tab.
Click Admin>Assessment & Evaluation.
From the second tab menu, click on 'Reports'.
Scroll to the bottom of the list and, in the Distribution section, click on 'Weighted CSV Report'.
To access the Weighted CSV Report from an individual distribution you must have access to that distribution.
Click on Admin>Assessment & Evaluation.
From the first tab menu, click on 'Distributions'.
Search for or click on the title of the relevant distribution.
Click on the Completed Assessments card (far right).
Click on the Weighted CSV button under the Assessments Completed heading.
A file should download to your computer.
Most reports are currently available as PDFs.
Some screen shots of sample reports are posted below but remember that your reports will contain the items relevant to the forms you've designed and used. In some cases information has been redacted to protect users' privacy.
One of the distribution methods supported by Elentra is a delegation. This can be used with a date-based or rotation-based distribution to send tasks to a delegator to send to the appropriate assessor/evaluator at a later date. If there are multiple targets for a delegation, users will be able to forward some tasks to one assessor, and other tasks to another assessor.
If a distribution was created using a delegation, the designated user needs to complete the delegation by assigning assessors.
Log in as the delegator.
From the Assessment Tasks tab, click on the relevant assessment.
You'll see a list of targets in a table.
Click the checkbox beside a target name.
At the bottom right click 'Select Assessors'.
Search for an assessor and click the checkbox beside the assessor name.
If you need to add an additional internal and/or external assessor you may do so by clicking the 'Add Additional Assessor' button. This will allow you to enter the name of an internal user and/or email address of an existing external assessor. You can even add a new external assessor if necessary.
After adding the required assessors, click 'Proceed to Confirm Assessments'.
If you wish to mark the delegation as complete, click the checkbox. This will move the task to the delegator's My Completed Tasks list. If you have targets without assessors/evaluators, you will receive a warning. If you'd rather the task stay on the Assessment Tasks tab, do not click the checkbox.
Confirm your choice by clicking 'Create Assessment Tasks'.
The list of targets and assessors/evaluators will be updated with the newly entered information.
In the date range based delegation above, the delegator has assigned 2 targets to Alex Adams and 2 to Bennett Adkins. The rest of the tasks will be assigned at a later date.
Improved in ME 1.20!
Updated Learner Assessments (Aggregated) Report so that administrative staff running the report can optionally see commenter names (previously ticking that checkbox had no effect).
The Assessments Reports section mostly allows you to generate reports based on assessments completed via distributions. There are some exceptions, however most reports are for distributed forms.
This report allows you to compile all assessments completed on a target in one or more courses into one or many files. It does not aggregate results, just compiles multiple forms.
Select a course, set a date range and select a course group (optional).
Select a learner.
Select a form (optional).
Click 'Download PDF(s)'.
Choose whether to download as one file (all forms will be stored in one file) or not (you'll download a file for each form).
A file will download to your computer with the appropriate forms included. Each form will include the target and assessor, delivery and completion date, form responses and comments, etc. The file names will be: learnerfirstname-learnerlastname-assessment-datereportrun-#.pdf. For example: earnest-acosta-assessment-20181005-1.pdf
Use this report to create an aggregated report on learner performance on a single form that may have been used multiple times and completed by multiple assessors. For this report, the list of learners available will depend on someone's affiliation with a course/program if they are a staff:pcoordinator or faculty:director.
Set the date range.
Select a learner.
Select a form.
Set the report parameters regarding displaying comments and averages.
Include Comments - Check this to include any narrative comments from tasks included in the report.
Unique Commenter ID - Check this to identify the authors of comments using a unique code instead of the name.
Include Commenter Name - Check this to include the name of comment authors.
Include Description - Check this to include a narrative description that will display at the top of the report.
Include Average - Check this to include averages across each item. Note: This option assigns a numeric value to the item response options.
An average and aggregate positive and negative score are available with this report.
Click 'Download PDF(s)'.
For use in reporting on tasks delivered to and completed by faculty. Report columns include the number of tasks delivered and completed as well as the average time to completion from delivery date and average time to completion from the end of the experience (e.g., a block) per user. It also provides an overall average across all users. Available as a PDF or CSV.
Additional details on this report (a brief illustrative sketch of how these counting rules combine follows this list):
shows all tasks delivered to an assessor based on the parameters set when requesting the report (i.e., course, date range, faculty member)
importantly, the report displays progress records per target not just a single assessment task. So if a distribution generates 15 tasks for a faculty member to complete, that might show as ‘1’ on the A&E badge, but in the Timeliness of Completion report it will show as 15 delivered tasks
tasks that are delegations will not display in the report (i.e., when the faculty member is the delegator)
tasks that are the result of delegations will show in the report (i.e., if a clinical secretary is the delegator, once she sends a task to a faculty member, it will show in the report on that faculty member)
if a task has expired it is included in the delivered column but not in the completed column
if a task has been deleted it is not included in the report
if a faculty member forwards a task, it is removed from their delivered count
if a task has been forwarded to the faculty member, it is included in the delivered column and the completed column (once applicable)
if a task allows for multiple assessments on the same target, the initial task will register a count of 1 in the delivered column. If 1 task is completed, the completed column will increase by 1. If additional tasks are completed beyond the first one, the delivered and completed columns will increase by 1.
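Several of the counting rules above (expired, deleted, and forwarded tasks) can be summarized with a small illustrative Python sketch; the status labels and data structure are assumptions for the example, not actual Elentra values.

```python
# Hypothetical progress records for one faculty member. The status labels
# below are invented for this example and are not actual Elentra values.
tasks = [
    {"status": "completed"},
    {"status": "expired"},                    # delivered, but not completed
    {"status": "deleted"},                    # excluded from the report entirely
    {"status": "forwarded_away"},             # removed from this assessor's delivered count
    {"status": "forwarded_to_me_completed"},  # delivered and completed for this assessor
]

delivered = 0
completed = 0
for task in tasks:
    status = task["status"]
    if status in ("deleted", "forwarded_away"):
        continue  # not counted for this assessor
    delivered += 1
    if status in ("completed", "forwarded_to_me_completed"):
        completed += 1

print(f"Delivered: {delivered}, Completed: {completed}")  # Delivered: 3, Completed: 2
```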
To generate the report:
Select a course.
This restricts the results of the report to assessments associated with this course. If a faculty member is an assessor across multiple courses, you'd need to run multiple reports (one per course) to get an overall view of their total assessments delivered and completed.
Set a date range.
If there were external assessors used you will have the option to include externals in the report or not.
Select one or more users by clicking the checkbox beside each required name. Please note that if you select all faculty it can take some time for all names to appear. Please be patient! To delete a user from the report click the 'x' beside the user's name.
Include Average Delivery Date: Enable this if desired.
Click 'Download PDF(s)' or 'Download CSV(s)'.
The report should download to your computer.
For use in collating responses provided by faculty completing forms produced through form templates in the competency-based medical education module. This report aggregates comments from forms that are sent out using a distribution. When logged in as an admin you'll see a full list of all form feedback provided thus far displayed on the screen.
Set a date range.
Select a course from the dropdown options.
Select a tool from the dropdown options.
Click 'Apply Filters'.
Results will display on the screen and you can click 'Download PDF(s)' if you need to download a copy.
To begin a new search be sure to click 'Reset Filters' and select a new course and/or tool as appropriate.
You can click on the page icon to the right of the feedback column to view the form being referenced.
An organization must be using the rotation scheduler to use this report. This report displays the number of completed assessments over the number of triggered assessments per block for all students in a course during the selected curriculum period. Data is grouped by block, based on the selected block type. The learner is considered the target of the assessments.
Select a course (you can only select one).
Select a curriculum period.
Select a block template type (e.g. 1 week, 2 week). The block templates available will depend on the setup of the rotation schedule for the course.
Click 'Download CSV(s)'
For use in monitoring the progress of faculty in completing the tasks assigned to them.
Select a course.
Set a date range.
Decide whether or not to include External assessors. (This is only relevant if you allow users to send tasks to external assessors, e.g., users without an Elentra account.)
Select the relevant faculty (you can select more than one).
Click 'Download CSV(s)'.
A csv file will download to your computer.
Generates one CSV file per tool (you can select multiple tools at the same time) and lists all completed instances of that tool across all learners within a custom date range. The report displays the encounter date, delivery date, and completed date for each form, as well as each form item and its associated response scale in the column header. If comments were entered for a selected response, they are included in a subsequent column. Users can opt to view only completed instances of a form or to view all instances of a form (pending, in progress, completed, and deleted).
Provides an inline overview or allows you to generate a CSV that displays all/selected assessment responses for a specified course, form, assessment status and date range.
Note that assessor and evaluator names will display in this report unless a form has been set as confidential.
Upon logging in to Elentra, click on the Assessment and Evaluation Task Icon in the top right.
From the Assessment Tasks tab, click on 'View Task' on a card to access tasks.
If there is only one target for a task, you will see a form to complete.
Fill in the required sections and click 'Submit'.
You can optionally download the task as a PDF, forward it to another user, or, if the task is not mandatory, delete the task.
If there are multiple targets for a task, you'll see a view of your task progress and a list of all learners beneath the relevant task status card (e.g., Not Started, In Progress, Completed).
To complete tasks from the Not Started list, click on a learner name.
You will be taken to a form to complete. Fill in the required sections and click 'Submit'.
You will automatically be taken to the next target to complete another task.
If you want to jump to a different target, click the Choose a Target button and then click a learner name.
For distributions that allow you to complete multiple tasks on a single target, click on the learner name and look for the Begin New Attempt option. This will open a blank form for you to complete.
Faculty users can forward tasks to other users to complete as necessary.
While viewing a task, click 'Forward Task'.
Select a faculty member to forward the task to.
Confirm your choice by clicking 'Forward Task'.
You will get a success message and can click 'Close' to return to your list of Assessment Tasks.
The user the task was forwarded to will immediately see it in their Assessment Tasks list.
After a faculty member has forwarded a task, they will still see the target in their list, but the target will be marked with a forwarded label, will not count towards their outstanding tasks, and cannot be clicked on. If the faculty member hovers over the forwarded label, the name of the person they forwarded the task to will display.
Note that if a task is forwarded more than once, the original assessor will not see that. Additionally, the third task will not display on the distribution progress report for administrators to review.
For faculty who are Course Directors and may have access to Admin > Assessment and Evaluation please refer to the Administrator Use of A & E section.
Faculty have quick access to review their assessment and evaluation tasks and those of their affiliated learners through the Assessment and Evaluation badge located at the top right.
Click the Assessment & Evaluation badge in the top right.
A series of tabs will open under the Assessment & Evaluation header.
When viewing results on any of the tabs in A&E, use the search bar to look for a specific task by name and click the down arrow to limit your results by applying search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed).
Apply start and end dates to limit your results to a specific time frame.
Remember to click 'Apply Filters' to apply your choices.
Remember to click 'Remove Filters' to clear filters and view all results. Note that your previous filter settings may still be applied so if you are seeing no results, or fewer results than you expected, try 'Remove Filters'.
There is a database setting to allow organizations to configure the order in which tasks display on the Assessment tasks tabs (assessment_sort_assessor_pending, assessment_sort_assessor_completed, assessment_sort_target_completed).
This tab shows users all pending tasks for which they are responsible. Mandatory tasks are indicated with a red highlight and a bold red "Mandatory" title on the card.
Faculty can view and complete a task by clicking on it.
Faculty can remove a task by clicking 'Remove Task' and providing a reason for its removal.
Faculty can download individual forms, or download multiple forms by clicking the download checkbox on each card and then clicking 'Download PDF(s)' at the top right. When users do this they will be able to choose whether to download all tasks as one file or as multiple files.
If a faculty member has profiles in multiple organizations within one installation of Elentra, they will be prompted to switch organizations if they have assessment tasks to complete in another organization.
This tab displays all tasks completed on the user and which s/he has been given access to view. This can be controlled in the creation of a distribution.
The user can view all completed tasks and again, download an individual file or download multiple files at once.
This tab will only display if the user is working in a CBME-enabled organization.
This tab shows feedback received from learners who have given a thumbs up and optionally commented on an assessment form completed by the faculty user. It is learner feedback to the faculty member on how comments/information provided on a completed assessment task helped the learner.
Note that in a CBME-enabled organization, program/course directors can optionally see the CBME Program Dashboard when they click on the My Learners tab.
From the My Learners tab, faculty can:
Download a list of all learners associated with a faculty member by clicking 'Download Enrolment'. A pdf titled "learners-for-faculty-name" will download and list all learners, including their primary email and learner level.
Search for an individual by beginning to type his/her name in the search learners area; the learner cards displayed will automatically update to reflect the searched name.
Refine the list of learners by switching the curriculum period using the dropdown menu on the far right.
Click on the appropriate tab to view the desired information.
This tab will only display if the user is a course/program director.
Faculty who are course/program directors have access to view the assessment and evaluation pages of faculty associated with their course/program (including external assessors). Faculty appear on this list if they have been the assessor or target in a distribution tied to the course.
Hide cards of external faculty by clicking Hide Card. Bring hidden cards back into view by clicking 'Show Hidden Faculty'.
Update external faculty emails by clicking on Update Email and providing revised information.
Program directors will be able to view the Current Tasks, Completed Tasks, Upcoming Tasks, and Tasks Completed on Faculty tabs for their faculty.
Send reminders, remove tasks or download tasks. (There is an option to select all and send reminders.)
From the Tasks Completed on Faculty tab the course/program director can also view Reports for this Faculty.
This tab will only display if the user has Prompted Responses to manage. Prompted Responses is also an optional setting in Elentra so may not be on for your organization.
When building items for use on assessment or evaluation forms, administrators have the option to identify response selections as prompted responses; then, they can identify on a distribution who should be notified by email if a prompted response is selected. The Prompted Response Notifications tab allows recipients of prompted response email notifications to review and interact with the form that includes the prompted response.
When you click on your Assessment & Evaluation badge, you will see a Prompted Responses tab.
Once you have clicked the Prompted Responses tab, you will see a table displaying a list of your prompted response notifications, the course each belongs to, and the owner, target, and review status of each task.
You can sort the table by clicking on any of the headers.
Search for a specific prompted response by using the search field.
Clicking on the title of the task will allow you to interact with the form.
Once you are viewing the form:
Click in the comment box to add in your comments about the assessment, if you have any.
Comments can only be viewed by other prompted response recipients and administrators associated with the course/program.
If you do not wish to add comments to a form, you can simply click the Submit Review/Comment button to mark it as reviewed.
To view information from forms completed on oneself, the user can click My Reports on the right hand side.
Set the date range.
Remember that the creator of a distribution can set whether a user can view reports on a distribution, so not every distribution completed on a target is available for the target to view.
A Faculty user can delete a task when viewing it as long as the task is not set as mandatory.
In the top right hand corner, click 'Delete Task'.
You are required to provide a reason for deleting the task and can optionally provide additional notes.
The reason entered will be displayed to administrative users viewing a list of deleted tasks.
If a task is set as mandatory, faculty will see an explanation when they hover over the Delete Task button.
The My Learners tab will allow faculty to access a variety of information for learners associated with them. This list of available options will vary depending on which modules are in use in Elentra but could include CBME Dashboard, Assessments (refers to tasks managed through Assessment and Evaluation), Learner Explorer (refers to course gradebooks), Schedule (i.e., rotation schedule) and Log Book.
Administering faculty evaluations can be completed in Elentra a couple of different ways.
Allow learners to initiate evaluations on their own using on-demand workflows
If you want to allow learners to initiate evaluations of faculty on an as-needed basis you can use on-demand workflows.
A benefit of this approach is that it requires very little administrative effort to set up. Once forms exist, learners can access them as required.
The on-demand approach can use any type of form that is permissioned to a course as long as the workflow is set as "Faculty Evaluation." A generic form or a standard faculty evaluation form type are the most commonly used.
The on-demand approach requires that learners take responsibility for generating their own tasks and does not guarantee that they will evaluate every faculty member you might want to be evaluated.
If a learner initiates a form using a Faculty Evaluation workflow, they will be able to access a complete list of all faculty stored in Elentra.
Additionally, you can allow learners to initiate a task on an external target (i.e., an instructor who does not have a full Elentra account).
To report on learner completion of tasks you will have to rely on the Admin > A & E Dashboard or the Form Responses Report (there will be no distribution progress report), which can make monitoring learner completion of their responsibilities more time-consuming.
Note that you cannot set up faculty evaluation workflows to allow faculty or staff to evaluate faculty.
Use Distributions to send tasks to users to complete
This approach requires more administrative effort as you will need to configure distributions to generate tasks for users.
Tip: Use the distribution copy tool to reduce your workload.
Depending on how your organization requires faculty evaluations to be completed, you might rely on different distribution methods. For example:
An event-based distribution that asks all learners in attendance to evaluate the instructor of the event
A date-based distribution where learners are asked to evaluate a number of targets across a pool of options or a list of specific targets
For organizations that use the Clinical Experience Rotation Schedule, rotation-based distributions can support the evaluator defining their target while completing their task (i.e., Learner A spent four weeks on a pediatrics rotation; at the end of their rotation they pick one physician to evaluate). This is currently limited to one target per task, so learners cannot use this option to complete evaluations of multiple preceptors.
Use a delegation (date-range or rotation-based) to send tasks to an intermediary who will send them to the appropriate users when that information is known.
Benefits of using distributions are that tasks are clearly defined for learners and administrators can use a distribution progress report to quickly see who has and hasn't completed their evaluation tasks.
For information about the option learners may see to immediately release completed faculty evaluations to the target, see here.
Learners in organizations using CBE tools will also have access to their assessments from their CBE dashboard. For more information about using the dashboard, please see the CBE documentation (Reviewing Learner Progress).
Learners have quick access to review their assessment and evaluation tasks through the Assessment and Evaluation badge.
Click the Assessment & Evaluation badge in the top right beside the logout button.
Four tabs will be visible to learners: Assessment Tasks, Tasks Completed on Me, My Complete Tasks, and Tasks Waiting for Assessor (if applicable).
When viewing results on any of the tabs in A&E, use the search bar to look for a specific task by name and click the down arrow to limit your results by applying search filters like distribution method (date range, delegation, learning event, rotation schedule), curriculum period, course, and task status (pending, in progress, completed).
Apply start and end dates to limit your results to a specific time frame.
Remember to click 'Apply Filters' to apply your choices, and 'Remove Filters' to clear filters and view all results.
Note that your previous filter settings may still be applied so if you are seeing no results, or fewer results than you expected, try 'Remove Filters'.
There is a database setting to allow organizations to configure the order in which tasks display on the Assessment tasks tabs (assessment_sort_assessor_pending, assessment_sort_assessor_completed, assessment_sort_target_completed).
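These settings live in the Elentra database rather than the admin interface, so changes are typically made by a developer. As a rough, hypothetical sketch only, a developer might check the current values with something like the Python snippet below; the settings table name, its shortname and value columns, and the connection details are assumptions rather than confirmed Elentra schema, so verify them against your installation first. The same general pattern applies to the other database settings mentioned in this documentation.

```python
# Hypothetical sketch only: the "settings" table and its "shortname"/"value"
# columns are assumptions about the Elentra schema, as are the credentials.
import pymysql

SORT_SETTINGS = (
    "assessment_sort_assessor_pending",
    "assessment_sort_assessor_completed",
    "assessment_sort_target_completed",
)

conn = pymysql.connect(host="localhost", user="elentra",
                       password="********", database="elentra")
try:
    with conn.cursor() as cur:
        # Look up the current value of each sort-order setting by its shortname.
        placeholders = ", ".join(["%s"] * len(SORT_SETTINGS))
        cur.execute(
            f"SELECT shortname, value FROM settings WHERE shortname IN ({placeholders})",
            SORT_SETTINGS,
        )
        for shortname, value in cur.fetchall():
            print(f"{shortname} = {value}")
finally:
    conn.close()
```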
This tab displays tasks the learner is currently responsible for completing (e.g. faculty evaluation, service evaluation, etc.).
Mandatory tasks are indicated with a red highlight and a bold red "Mandatory" title on the card.
Task cards may display a rotation or block name depending on how they were scheduled.
Learners can view a task, download individual tasks or multiple tasks at once, and remove a task if it is not required. If the learner removes a task, they will have to provide a reason.
This tab displays all the tasks that have been completed on the learner.
Task cards display the form title and type, task delivery and completion dates, the name of the person who triggered the form (if applicable), and the name and role of the assessor.
Learners can view a task and download individual tasks or multiple tasks at once.
A Seen by Target icon appears in the top right hand corner so that the learner, faculty and staff can confirm the learner has seen the completed form.
This tab displays forms the learner has completed.
Task cards display the form title and type, rotation/block name, task delivery and completion dates, and progress.
Learners can view a task and download individual tasks or multiple tasks at once.
Introduced in ME 1.22, this tab shows a learner the tasks where they are the target but an assessor has yet to complete the task. This tab can be useful for helping learners know that a task they initiated, or that was initiated on them, has yet to be completed.
The tab displays all on-demand workflow assessments, adhoc-initiated assessments, and regularly started assessments (i.e., all manual assessments initiated by the learner themselves). The tasks will appear on this tab until they are completed.
Deleted tasks will also appear here so the student is aware that they did send a task and the assessor (or a staff member) deleted it. This allows them to reach out to the assessor in order to find out what happened.
By default, only 9 cards are loaded, and the Load More button will load an additional 9. Since all deleted tasks are returned, limiting the number of cards that load prevents some clutter. The cards are sorted by delivery date, so the most recently delivered cards appear on top. Users are encouraged to use the filter options or the search bar to help find a specific task.
If a filter returns nothing you get this message: Your search did not return any tasks pending on you.
If the student has never initiated any tasks, or every single one of their tasks has been completed, they get this message instead: You have no tasks waiting on an assessor.
From the My Reports button on the right hand side, learners can access reports on tasks completed on them via distributions, as long as the distribution is configured in Step 5 Results to allow for target self-reporting.
Provide the appropriate dates.
Click 'Report'.
When an administrator or faculty member accesses Reports on a Learner, they are able to see reports for on-demand and distributed tasks.
When administrators create distributions they can optionally decide whether or not to release tasks and associated reports to the targets of the forms. Choosing to release tasks allows the target to view forms completed on them. In the case of evaluations, administrators only have the option to release task reporting to targets, not individual tasks; however, a distribution can be set up to allow learners to optionally release their completed evaluation tasks to the target. This option applies to evaluations only.
If this optional target release feature is used, at the bottom of an assigned evaluation task, learners will see the option to release the evaluation to the selected target. If the learner picks 'Yes' from the dropdown selector, the task will become available for the target to view through their Assessment & Evaluation tab.
The faculty member can now view that completed form.
Note that learners opting to individually release tasks for faculty to view does not change the faculty member's ability to create a report on a distribution (that is still dictated by the distribution).
Learners can initiate assessments on demand from their dashboard. A learner's specific user interface will depend on whether or not they are a part of an organization with CBME enabled and/or workflows enabled, and whether they are responsible for any adhoc distributions.
From the main dashboard, click 'Start Assessment/Evaluation'.
Toggle between 'Start Assessment/Evaluation' and 'Adhoc Distributions' as needed.
Start Assessment/Evaluation should be used for any CBME or workflow forms.
Adhoc Distributions should be used if a distribution was set up that requires the learner to initiate a task to send to faculty.
From the Dashboard, learners have access to a green 'Start Assessment/Evaluation' button on the right side of the screen.
If workflows are in use a learner will select an On-Demand Workflow. This will dictate which forms become available to select. If None is the only available workflow option, select None and continue.
In this example, we'll select an EPA form to complete.
After selecting an on-demand workflow, the choices a learner has will depend on the workflow they are completing.
Next, learners select an assessor. They can begin to type a name to limit the list of options. When they mouse over a faculty name, learners can see the faculty card, including the faculty member's name and photo (if uploaded).
If learners need to add an external assessor (someone who doesn't have an Elentra account), they can click 'Add External Assessor'. (Please note that allowing learners to add external assessors is a database setting you can optionally enable or disable for an organization. If you don't see the option for learners to add an external assessor and you'd like them to be able to, ask a developer to enable cbme_ondemand_start_assessment_add_assessor in the settings table.)
Next, learners set a date of encounter.
Next, learners select an assessment method. Details about each assessment method are provided inline.
Note that an assessor must have a PIN set up for learners to select the first option. For more detail on setting user PINs, see here.
In our example, completing an EPA form, learners next select an EPA to be assessed on.
For a reminder of what is included in a specific EPA, the black question mark provides a quick link to the EPA Encyclopedia.
Users can easily navigate the list of EPAs by applying preset filters including Current Stage EPAs, Current Rotation EPAs, and Priority EPAs. Click on a filter title to apply it. In this example the Priority EPAs filter is being applied.
After an EPA is selected, the available assessment tools will be displayed. Learners can search the tools by beginning to type in the search bar. Note the small clock in the top right of each tool card. This is an estimate of how long the form will take to complete based on the experience of other users.
Learners can click 'Preview This Form' to view the form and ensure it is the one they want or they can click 'Begin Assessment' on any tool to begin.
Depending on the selected method of assessment, learners will either be directed to the assessment form to start it, or the form will be sent directly to the assessor.
Select a distribution (learners will only see distributions for which they are a target).
Select an assessor.
Select an assessment method.
The assessment methods available to the learner can be controlled in the database settings table to offer maximum flexibility.
If an adhoc distribution has been set for a summary assessment task and there is not yet enough data to complete the task, the learner will not be able to initiate the task.
If an adhoc distribution is completed via PIN and is a summary assessment task, the assessor will have to enter their PIN before the learner can view the form (this ensures the learner is with a faculty member when they view feedback from others).
Enter a date of encounter.
Click 'Submit'.
Depending on the selected method of assessment, learners will either be directed to the assessment form to start it, or the form will be sent directly to the assessor.
If a learner has initiated a task to send to an assessor, the timing of its delivery is controlled by a behind-the-scenes task called a cron job. Typically the default setup is for the cron job (send-queued-notifications) to run every 10 minutes. This means that if a learner initiates a task using the complete and confirm via email method at 8:30, the assessor might not receive the task until 8:40. If this timing should be adjusted, you must speak with a developer at your institution.
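For reference, a 10-minute interval corresponds to a crontab schedule like the sketch below. The command and path shown are placeholders rather than the actual Elentra invocation, which varies by installation; only the */10 schedule reflects the default timing described above.

```
# Hypothetical crontab entry -- the command path is a placeholder; only the
# "*/10" schedule (run at every 10th minute) reflects the default described above.
*/10 * * * * /path/to/elentra/send-queued-notifications
```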
When logged in as a faculty member or program administrator, click the Assessment & Evaluation badge that appears in the top right.
Click on 'My Learners' from the tab menu.
Search for a learner as needed, and if you can't find someone, ensure that you are in the correct curriculum period using the dropdown menu on the right.
After finding the correct learner, click on Assessments on the learner card to view the learner's assessments page.
Multiple tabs are available on the Assessments page. Each tab displays a collection of assessment tasks. CBME specific forms are included on this page and so are any forms created through the Assessment and Evaluation module and sent to users through a distribution.
On each assessment card you will see a range of information which may include the form title, form type, delivery and completion dates, progress (e.g., 3 of 14 complete), who triggered the form, the target, and the assessor/evaluator (including their group and role). Not all of this information is included on every task card.
On each tab you can search for tasks, set a date range and apply filters. Click the down arrow beside the search tasks bar to select a filter type and drill down to the necessary filter. Select a filter and click the green Apply Filters button. Your applied filters will display on an Active Filters list.
These are tasks completed by assessors on the learner.
Each task can be downloaded or viewed. To download tasks, click the appropriate box at the bottom of the task card and then click Download PDF(s) at the top of the page. Note that there is a Select All tool available. To view a task click View Task at the bottom of the task card.
More tasks can be displayed by clicking Load More Tasks at the bottom of the page.
From this tab you can view all pending tasks on the learner. This includes tasks that have been distributed but are not yet started and tasks that are in progress.
Tasks that have expired will not display on this tab.
To remove a task from this list click Remove Task and provide a reason for the removal.
To send reminders click the applicable tasks (or select all) and click Send Reminders in the top right.
To download a task select the appropriate tasks (or select all) and click Download PDF(s) in the top right.
To view a task click View Task at the bottom of the task card.
This tab displays scheduled distributions where the learner is the target.
To remove a task from this list click Remove Task and provide a reason for the removal.
To download a task select the appropriate tasks (or select all) and click Download PDF(s) in the top right.
To view a task click View Task at the bottom of the task card.
This tab displays active distributions where the learner is an assessor/evaluator.
To remove a task from this list click Remove Task and provide a reason for the removal.
To send reminders click the applicable tasks (or select all) and click Send Reminders in the top right.
To download a task select the appropriate tasks (or select all) and click Download PDF(s) in the top right.
To view a task click View Task at the bottom of the task card.
This tab shows tasks scheduled through a distribution where the learner is an assessor/evaluator. Tasks can be removed from this list by clicking Remove Task and providing a reason for the removal.
From the Tasks Completed on Learner tab, users can access Reports for this Learner. This gives them access to aggregated reports on a learner for each form used and includes on-demand and distributed tasks.
Set a report date range if desired.
From the Options dropdown menu you can:
Choose to generate the report with or without comments (both can then be downloaded as a PDF).
Click View This Form to open a new tab that shows the form.
Click View Assessments to open a tab that lists all completed assessments for this form and provides links to view each one.
Clicking the 'Group by distribution' checkbox will reorganize the list of forms to reflect completion based on distribution and add a column to the display to show distribution description information.
Retiring a PPA, Rubric or generic form that has been TRIGGERED:
We recommend that, if the form was in use at all, you simply retire it instead of deleting it so that any pending or in-progress assessments/evaluations can still be completed by their assessor/evaluator.
Deleting a PPA, Rubric or generic form that has been TRIGGERED:
All pending and in-progress tasks that used that form will no longer have a form associated with them and will display an error message stating that the form has been deleted when the assessor/evaluator tries to access them (retiring is the better choice!).
If needed, a developer can restore the old form to allow assessors/evaluators to complete the assessment/evaluation task that uses the form.
If you delete a form that has been used to trigger an assessment, then any pending assessments can no longer be completed.
Retiring a PPA, Rubric or generic form used in a DISTRIBUTION:
All pending and in-progress tasks that used the form will still be accessible to the assessors, but the form will not be available for selection on any new distributions.
You can go ahead and replace the form on the distribution whose form has been retired; all future tasks will use the new form.
If you do not replace the form on the distribution, future tasks scheduled to be delivered will send the retired form.
Deleting a PPA, Rubric or generic form used in a DISTRIBUTION:
All pending and in-progress tasks that used that form will no longer have a form associated with them and will display an error message stating that the form has been deleted when the assessor/evaluator tries to access them (retiring is the better choice!).
If you replace that deleted form with a new form on the distribution, all future tasks will use the new form.
If there are pending or in-progress tasks where users need to have the deleted form replaced with the new form, a developer can replace the old form with the new one for pending tasks.
EXCEPT: A developer cannot switch assessment forms for tasks where the assessor/evaluator has already completed the assessment/evaluation form for one or more targets, but not all of them. In this case, the developer would have to temporarily restore the old forms so that the assessor/evaluator can finish their tasks. Forms can only be switched for completely pending assessments/evaluations (meaning they don't have any progress).
Deleting a FORM TEMPLATE:
Deleting a form template does NOT delete the forms that were previously generated by the deleted template. They remain to allow assessors/evaluators to complete them; however, the old forms will not be available going forward.
You can create your distribution using one form, and then switch the form on the distribution to the new form when it is ready. All future tasks created by the distribution will use the new form.
It is recommended that, if you replace a form that you no longer wish to use, you retire the form so that previously delivered tasks using that form can still be completed.
As long as you do not delete the old form altogether, any pending or in progress tasks that have been delivered using the old form can still be completed by the assessors/evaluators.
If you need to replace the old form with the new form for any pending tasks (pending tasks are tasks that have not yet been started by the assessor/evaluator) you will need to get help from a developer.
Once all in-progress and/or pending tasks using the old form have been completed, you can delete the old form if you wish. You can also delete the old form while there are still in-progress and/or pending tasks using it, but the assessor/evaluator will receive an error message stating that the form no longer exists when they attempt to complete it. In that case, it is better to retire the form: previously delivered tasks can still be completed using the old form, but the form will no longer be available to select when creating future distributions.
Editing the Expiry Date: If an expiry date has been applied to an active distribution, you can edit the expiry date to 're-open' tasks to the targets who haven't completed them yet. Those tasks will be visible to the targets the day after the expiry date is updated (because the job that runs behind the scenes to deliver tasks runs at night). A developer can run the job immediately if it is time-sensitive.
Changing the Targets: If you add a new target to a distribution, the system will reopen the task for the assessor/evaluator to allow them to complete the task on the new target.
Editing the Rotation Schedule of Targets Associated with a Distribution: This will cause new tasks to be delivered to the assessor.
Changing the Reviewer of a Distribution: This will cause the distribution to redeliver previously delivered tasks; be sure to apply a Release Date when making this change.
Changing the Target Release Options in Step 5: This changes the visibility of tasks/reports retroactively for all completed tasks.
Navigate to Admin > Assessment & Evaluation.
Click on the Distributions tab.
Edit the distribution that had the incorrect target selected (remove the incorrect target, add the new target).
Ask a developer to push the distribution through to reflect the changes, or wait until the next day (after the system has had a chance to process the changes to the distribution overnight), then navigate to the distribution and click on its title to view its Progress page.
Find the task in the Available assessments tab (the assessment is available because it is new).
Click on the name of the correct target.
Use the Choose a Target dropdown menu to open the completed assessment, which was completed on the incorrect target.
Click the Choose a Target dropdown menu and click the circular double arrow to the right of the correct target under Forms Not Started to assign the completed assessment to the correct target; you will see the page refresh with the correct target and the completed form.
Now the incorrect target will be listed in the Forms Not Started list, and you can delete that target for the assessor/evaluator: click the Choose a Target dropdown menu, click on the incorrect target in the Forms Not Started list, click Delete Task, and complete the required fields to remove the task.