Evaluation Plan

The E-Comm: Director Development Program Evaluation Plan was completed as part of the EDIT 7350 course, Evaluation and Analytics in Instructional Design, taught by Dr. Lauren Bagdy. Dr. Bagdy provided this case study and assigned our team of Stephanie Denny, Michaela Hyland, Julie Wyatt, and me the task of developing an evaluation plan for a Director Development Program at an online clothing and lifestyle brand. The program was designed to prepare directors across various functional areas for leadership roles such as vice president.

The Director Development Program consisted of seven learning modules, beginning with an orientation module that outlined program expectations and objectives, followed by the core lessons. Our evaluation was primarily summative, aimed at defining the program's goals and objectives and creating evaluation tools to assess whether those goals were being met. The purpose of the evaluation plan was to determine whether the training was likely to help director-level employees strengthen their skills in time management, team-building, brand influence, and cross-functional collaboration.

To structure the evaluation, we used the Kirkpatrick model, gathering participant feedback at multiple levels. Each module included an end-of-module survey to capture participants' initial reactions (Level 1 of the Kirkpatrick model), providing insight into whether participants found the training enjoyable and valuable at first impression. We also recommended interviewing the Client Success Manager leading the training to gather qualitative insights on pacing, participant engagement, and common questions that arose during the sessions. Together, these data collection methods were intended to yield both quantitative and qualitative information for a comprehensive evaluation.

Although the evaluation was primarily summative, formative elements were embedded in the program, enabling ongoing feedback and continuous improvement. While our plan did not include actual data collection, our findings and recommendations highlighted areas where the instructional components could be enhanced based on anticipated learner and facilitator feedback.

For this project, I designed the evaluation framework and developed an open-ended questionnaire to capture participant feedback. I collaborated with my team to create an analysis plan to guide the interpretation of potential data. I selected this project for my portfolio because it demonstrates my skills in designing evaluations, developing data collection tools, and assessing the quality of instructional products. It highlights my ability to recommend effective evaluation methods and to judge whether instructional interventions are likely to achieve their intended outcomes based on anticipated feedback.