Assessment-Driven Course Design

| Contributors | Joseph Bergin, Christian Kohls, Christian Köppe, Yishay Mor, Michel Portier, Till Schümmer, Steven Warburton |
| --- | --- |
| Last modification | June 6, 2017 |
| Source | Bergin et al. (2015)[1]; Warburton et al. (2016)[2][3] |
| Pattern formats | OPR Alexandrian |
| Usability | |
| Learning domain | |
| Stakeholders | |
Use assessments as drivers for developing your course to ensure that the course content, Learning Outcomes, and the way the outcomes are tested match.
Context
You are about to start developing a new course, and the main topic of the course is clear.
Problem
There is often a mismatch between what is taught in the course (and how) and the way the students' achievement of the learning goals is assessed. In that case, one does not really know whether the students have indeed learnt the topics as intended.
Forces
When there is little time, people tend to start by developing the material they need soonest. In doing so, they focus mainly on the topics themselves and less on how these topics can best be taught. Furthermore, students are often assessed on certain skills and knowledge in a way that differs from how they have learnt and applied them before.
Solution
Therefore: Use assessments as drivers for developing your course to ensure that the course content, learning outcomes, and the way the outcomes are tested match.
Solution Details
By knowing what will eventually be tested, we can align our course development with these goals. Rather than being content-focused, Constructive Alignment suggests designing the course by addressing the Learning Outcomes first and then developing the assessment criteria in direct relation to them. An Assessment Criteria List along with Criteria Refinement can help in this process. A Performance Sheet (or rubric) is a tool to communicate the criteria and to actually evaluate the performance. The Performance Sheet can also be used to let students know what is expected of them and to justify grading.
Positive Consequences
By applying this pattern, one increases the coherence of all course parts: content, learning objectives, and assessments match. Starting with this approach, one can identify at an early stage the learning objectives that are hard or even impossible to assess. In that case, the learning objectives should be revisited. The same goes if the assessment method (e.g. a written exam) does not match the skills to be assessed (e.g. programming a larger piece of software). Having the assessment criteria defined upfront also helps with communicating them to the students, making the assessments fair and open (as discussed in Bergin et al.[4]).
Negative Consequences
Applying this pattern requires more time at the beginning of course design, as more things need to be prepared before the actual teaching material can be developed. If a course is about to start and one has no material at all, it is likely more appropriate to begin with the development of slides, readers, etc. However, this increases the chance of running into the problem described above.
Example
At HAN University of Applied Sciences, all teachers who are responsible for developing a course must first deliver a course description. This course description contains the main topics of the course, but also the concrete learning objectives (between six and eight), the assessment criteria (per learning objective), and the way these are assessed (e.g. written exam, performance assessment, etc.). The course description is reviewed by colleagues, which ensures that all its parts are described clearly enough and match each other. It then forms the basis for developing the further material of the course.
At TH Köln, all courses should start with Learning Outcomes and an overview of what will be assessed. This was implemented in a course on programming languages and led to a deeper reflection about the methods and examples provided during the lectures and practical assignments.
Related patterns
Olson presents the pattern Clear and Appropriate Assessments[5], which is similar to Assessment-Driven Course Design.
References
- ↑ Pattern published in Bergin, J., Kohls, C., Köppe, C., Mor, Y., Portier, M., Schümmer, T., & Warburton, S. (2015). Assessment-driven course design foundational patterns. In Proceedings of the 20th European Conference on Pattern Languages of Programs (EuroPLoP 2015) (p. 31). New York: ACM.
- ↑ Patlet published in Warburton, S., Mor, Y., Kohls, C., Köppe, C., & Bergin, J. (2016). Assessment driven course design: a pattern validation workshop. Presented at 8th Biennial Conference of EARLI SIG 1: Assessment & Evaluation. Munich, Germany.
- ↑ Patlet also published in Warburton, S., Bergin, J., Kohls, C., Köppe, C., & Mor, Y. (2016). Dialogical Assessment Patterns for Learning from Others. In Proceedings of the 10th Travelling Conference on Pattern Languages of Programs (VikingPLoP 2016). New York: ACM.
- ↑ Bergin, J., Kohls, C., Köppe, C., Mor, Y., Portier, M., Schümmer, T., & Warburton, S. (in press 2015). Assessment-Driven Course Design - Fair Play Patterns (http://www.koeppe.nl/publications/Koppe-PLoP15-FairPlayPatterns.pdf). In Proceedings of the 22nd Conference on Pattern Languages of Programs (PLoP 2015). New York: ACM.
- ↑ Olson, D. (2008). Teaching Patterns: A Pattern Language for Improving the Quality of Instruction in Higher Education Settings. ProQuest.