
Evaluation is a key component of instructional technology because it determines whether what has been created and implemented has had a real impact on the learners. This is an area that I feel is often missed in practice. In my professional setting, unfortunately, time and resources often don't allow for a true evaluation beyond Level 1 (i.e., how the learners react to the training). This provides insight into how the learners perceived the instruction, which is important for ensuring we reach the audience appropriately. Even more important, however, are the higher levels of evaluation, which determine whether learning actually occurred, whether that learning transfers to the performance environment, and whether a real impact is made on the organization.

While I already knew some of the benefits of true evaluation prior to the ITMA program, my experiences in the program have cast its importance in a new light: I have learned to think more deeply about the impact that my instructional solutions need to have on my learners and on the organization. This deeper focus early in the design process leads to a more informed development process and sets the stage for what to look for in the evaluation process to determine how well I met the goals.

The AECT breaks down Evaluation into the following sub-domains:

  • Problem Analysis
  • Criterion-Referenced Measurement
  • Formative Evaluation
  • Summative Evaluation
  • Long-Range Planning

For complete details of the AECT standards, view the Initial and Advanced Standards.

The following content provides evidence of my competencies in these areas.

5.1 - Problem Analysis

Problem analysis deals with gathering information and developing strategies for making decisions. This area involves collecting, analyzing, and interpreting data in order to improve instruction.

Needs Assessment (Instructional Design course)
In the Instructional Design course, my main project was to design classroom training to teach Customer Service associates how to explain and offer online self-service functions to our customers when they call. This document outlines the needs assessment that I conducted to determine the performance gap and to establish the case for solving the performance gap through instruction.

Impediments Preventing Use of the Web to Address Your Needs (Education & the Web course)
In this document, I present a number of problems that can prevent the Web from addressing educational needs. For each potential impediment, I provide information about how the barrier could impact me, those for whom I am responsible, and those to whom I am responsible. The document then discusses strategies that I could employ to try to minimize the impact of these barriers.

Teaching: Fun & Learning II - Observation of Instructional Setting (Learning Theories course)
For part of this particular assignment, I was asked to observe a live instructional setting, gather data, analyze the data to produce charts, and interpret the data to answer specific questions about what I experienced. In the Fun and Learning II section (beginning on page 4), I present the results of my data collecting in a table, followed by an analysis and interpretation of the results in the form of commentary and visuals.

5.2 - Criterion-Referenced Measurement

Criterion-referenced measurement is used to determine whether learners have mastered a specific objective or set of objectives. It measures the effectiveness of instruction in terms of the learning impact it has (or does not have) on the learners.

Multimedia Design Document: Sections 6 & 7 - Objectives & Assessment Items (Multimedia Authoring course)
For the Multimedia Authoring course, my main project was computer-based training on how to use an online time-keeping and vacation management application used at my company. In Section 6 of this document, I provide a table that describes all the performance objectives for the self-paced lesson. For each objective, I provide an assessment item that serves as criterion-referenced measurement of the level of mastery of that particular objective. Many of these assessment items are checklists that are automatically assessed as the learner progresses through the simulated environment within the instructional program. Each assessment is tied to specific criteria outlined in the associated instructional objective.

Quality and Credibility of Web Sites (Education & the Web course)
In this document, I present a list of criteria that I feel are crucial in determining the credibility of a website. I then use those criteria to evaluate two websites, one that meets the criteria and one that does not. This document provides evidence of my ability to develop measurement criteria based upon specific standards and to apply those criteria in an evaluation.

Virtual Store Design Report: Sections 6 & 7- Objectives & Assessment Items (Project & Report course)
Section 6 of this document provides the learning objectives for the Virtual Store instructional program. For each objective, an assessment item is provided for measuring successful mastery of that objective. The program constantly assesses the learner's performance in three key areas by presenting virtual situations and activities that require the learner to make decisions or perform actions.

5.3 - Formative Evaluation

Formative evaluation deals with determining how to improve an instructional solution through further development (again illustrating the iterative, non-linear nature of Instructional Systems Design).

E-Learning Evaluation Form (Project & Report course)
I created this form as a tool for formative evaluation of my Virtual Store program in the Project & Report course. The form could also be used for summative evaluation, although my original intent was to use it for making improvements to the project in further development before finalizing the program. I based the form on the evaluation criteria used in the Multimedia Authoring course and concepts that I learned in the Software Evaluation course.

Evaluation Summary (Project & Report course)
This form summarizes the feedback I received as a result of formative evaluation of my Virtual Store program. It also provides my response to the feedback and any actions taken to improve the final instructional product. This was an extremely helpful part of the overall process because it allowed me to gain objective feedback from outside parties to ensure the effectiveness of the final program.

Multimedia Formative Evaluation (Multimedia Authoring course)
This report documents the formative evaluation conducted on my multimedia project. The document first provides a summary of the feedback received from three evaluators and my responses to the feedback. I then summarize the strengths and weaknesses of the multimedia program, followed by a summary of the revisions that I would make to the program as a result of the evaluation process.

5.4 - Summative Evaluation

Summative evaluation deals with determining the adequacy of an instructional solution for actual utilization.

Evaluation of Virtual OfficeMax (Software Evaluation course)
This final report details the results of a summative evaluation that I performed on the Virtual OfficeMax multimedia program. This software program is currently used as a self-guided instructional program for new hires within their first 30 days. The program itself became the inspiration for my own virtual retail store program for my Applied ID Project. This document provides an exhaustive review of the Virtual OfficeMax program, along with a summary of final recommendations.

Software Evaluation Checklist (Software Evaluation course)
On page 3 of this document, I provide a checklist that I created to assist with performing a summative evaluation of a software program. The intent of the checklist is to help determine the adequacy of a particular instructional program (e.g. software, online instruction, etc.) for utilization in instruction. Also included in the document is a summary of how I arrived at the particular items on my checklist.

5.5 - Long-Range Planning

Long-range planning focuses on a holistic, strategic plan, typically covering a future period of three to five years or longer.

Portfolio Proposal (Portfolio Evaluation course)
One of the key deliverables in the Portfolio Evaluation course is a proposal for a potential portfolio program in my own organization. In this document, I present a proposal for implementing portfolios as a complement to the organization's existing performance management process. While the document does not call out specific timelines, it outlines key strategies, such as a pilot launch within the training organization followed by a rolling implementation across additional areas of the organization.

Seels, B. B., & Richey, R. C. (1994). Instructional technology: The definition and domains of the field. Washington, DC: Association for Educational Communications and Technology.