Evaluation should not be done as an afterthought, but rather built right into the instructional design (ID) process (Smith & Ragan, 1999). If I am thinking about evaluation from the beginning, it helps me design the rest of the components; knowing what I am going to evaluate provides context for the rest of the development process.
Evaluation differs from assessment in its focus more than in its function. Assessment provides insights into what impact the instructional event had upon the student. Evaluation provides insights into how this impact was achieved and how the process of instruction could be more effective. Both are an important part of improving the program as “learner’s opinions of the quality of the instruction are important” but “the key is their learning as a result of the instruction” (Smith & Ragan, 1999, p. 338). If nobody learned anything, the teaching approach might be to blame.
The course in learning fluency under evaluation is designed to equip students with a process that they can duplicate outside of the learning environment where they acquire it. This means that evaluation and assessment must take place during the event as well as after the event is completed. Summative evaluation gives the instructional designer a chance to analyze the overall output of the learning experience to see whether it produced the intended results (i.e., did the proposed solution actually solve the problem?) (Smith & Ragan, 1999). This is done by assessing the progress of the learners and asking for their input to evaluate the experience.
Unfortunately, it can be difficult to tell why students learned or did not learn. Those who seemed to perform well might have done so without instruction, and those who seemed to perform poorly might have thrived in a different sort of learning environment. Thus, it is almost impossible to tell from a summative evaluation alone whether the instructional event has been designed as effectively as possible. It is reminiscent of the statistical fallacy that correlation implies causation. There will always be some students who struggle to excel within the framework provided by the instructional event. Do we change the experience to meet their needs at the risk of preventing the majority of other students from learning quite as well?
The purpose of teaching students the five elements of learning fluency is to help all students overcome this obstacle and equip them to thrive in any learning environment by putting the responsibility and tools for learning back into their hands. However, the way in which they acquire these elements needs to be adaptable to their individual interests and strengths. Tools must be designed with the end-user in mind [cite the model of human-centered design].
Formative evaluation is an ongoing process that analyzes multiple elements of the instructional event to determine where it can be more effectively designed. It is meant to take place largely before the implementation (Smith & Ragan, 1999), but can also be conducted every time the learning event takes place.
The elements considered in the formative evaluation for this workshop include a look at the design process used to create the workshop, the delivery of the experience, and the outcomes as they compare to the objectives (Smith & Ragan, 1999). Using these three approaches provides the course designer with insights into how well it is working and how it might be improved.
Formative Evaluation Strategy and Questions
Every student comes to the learning experience with a different set of attitudes, assumptions, and prerequisites. It is not possible to screen for all of these, but it is possible to discover which ones apply to a majority of learners within a course and then provide adequate supports where students need them. Since one of the goals of evaluation is to identify these points, Smith & Ragan (1999) suggest only providing support if students are unable to proceed with instruction.
This course makes a growth mindset assumption that all students have the potential to improve their learning ability. Some will be further ahead of others in this process, and their contribution will be valuable to the learning community. However, it is important to make sure that those who are less equipped to participate have opportunities to engage. One of the objectives of the Learn By Doing model applied here (see http://www.humancenteredlearning.com) is that students are not passive recipients of information, but actively involved in the formation process of the ideas they will take away.
Preliminary Evaluation Questions (gathered before instruction through reflection, expert review, and one-on-one feedback)
- Accessibility for a diverse range of cultures, intelligence types, and education levels?
- Adequate level of detail for activity descriptions, objectives, etc.?
- Ease of facilitation?
- Additional tools needed for student engagement?
- Potential areas for confusion?
- Alignment of objectives with activities and outcomes?
Ongoing Evaluation Questions (throughout instruction)
- How much time is needed for each activity?
- Which objectives or activities do students seem to struggle with?
- Where is more support or additional practice needed?
- Who needs additional help to engage in the learning event?
- Are there prerequisites that ensure a more delightful experience?
- What kinds of students are struggling?
- What does the feedback given by students about their different strategies say about the strengths and weaknesses of the instructional process?
- Can the processes designed by the students be duplicated outside the learning environment? (learning transfer)
- Is there another objective that needs to be put in place for the learning goal to be achieved?
End of Instruction Questions (administered through survey)
- What will you do with what you have learned tonight? (reveals what problem students think the course solves)
- What elements of the course were helpful or confusing for you? (identifies potential for improvement or additional focus)
- Name and email/twitter handle of a friend who should take this course. (identifies whether students find the course valuable enough to share)
After Instruction (collected via email survey 3-4 months later)
- Demographic questions
- What is your current GPA?
- How have you used what you learned?
After Instruction (collected via reflection by instructor)
- To what level was the instruction implemented as designed?
- Time taken for preparation? Time for the event?
- What additional supports are needed for facilitation?
- Overall impression of student attitudes and engagement?
- Perception of student progress?
- Do we need additional training sessions or more time for the event?
- Did we try to do too many things?
- What parts of instruction may be redundant?
- Do we need to consider another element of learning fluency?
Summative Evaluation
The two sets of after-instruction questions above may be considered part of the summative evaluation, since they are administered after the instructional event has been completed. These questions are focused on measuring the overall value of the event for participants and providing concrete suggestions for whether and how it should continue.
Based on the information collected, the summative evaluation process will include the following questions:
- What do we need to measure next time in our evaluation?
- What actually affected learner performance? (community, exposure to ideas, engagement in process, etc.)
- What is the value of this program? (time saved, improved grades, attitude change, etc.)
- How can we improve the ROI? (cost per student, course effectiveness, or benefit focus)
- Who will be interested in these results?
Reports of the event should include the following information for each audience:
Students
- How other students use what they learned
- Effectiveness of the strategy for various problems
- Perceptions of the learning experience
- Ease of participation
Parents
- Change in student performance (e.g., GPA)
- Student attitudes toward learning
- Ongoing benefits of learning fluency
Teachers
- Benefits of teaching students who know how to learn
- Reduction in ID considerations
- Quality of learning community
Administration
- Quality of learning community
- Student satisfaction with the learning environment
HR Directors
- ROI for equipping self-directed learners
- Transferability of skills
Click here to register your interest, or continue scrolling to learn more about how this course could transform the experience of learning for your student, school, or professional organization.
What Is Part of the Course?
This training will be designed for teachers and instructors who want to see an increase in student engagement and performance, but will also be open to students who want to maximize their own learning experience. In addition to raising awareness of how these factors influence the success of students, the training session will include the development of tools and strategies that can be used by the individuals and classrooms that will implement this solution.
The specific learning goal for the class is for students to create a personalized strategy for mastering one relevant aspect of learning fluency. A secondary learning goal is that students are equipped with the knowledge and resources needed to continue developing learning fluency on their own.
Questions or Thoughts? I look forward to hearing from you!
*All references can be viewed in the site reference list here
Appendix
To print evaluation forms, please download the following document. It includes evaluation forms for the instructor to use during facilitation and for reflection afterward. Student evaluation forms are included in the content outline handout.