Why is the alignment of learning outcomes with learning competencies important in administering the teaching and learning process?

When you are creating a course, strive to design with the end in mind. After you have established a set of measurable learning objectives for your course, develop assessments that are aligned with those stated objectives. Think of the learning objectives as the skills, knowledge, and abilities your students will be able to demonstrate mastery of by the end of the course. Then consider the assessments as the way students prove they are capable of that mastery.

Let’s consider the following learning objective: Describe the seven steps of the research process when writing a paper. First, consider the action verb. The verb “describe” falls within the knowledge and comprehension levels of Bloom’s Taxonomy. Therefore, your learning assessment should have students demonstrate that they can describe the seven steps of the research process when writing a paper. The following assessment examples might be considered for this objective:

[Image: Bloom’s Taxonomy levels]

Complete a quiz consisting of multiple choice questions, where the student chooses the correct steps.
Create a poster highlighting the steps.
Ask the students to recite the steps.

Note that asking the students to write a paper using the seven steps of the research process is not appropriate because this asks the student to perform at a higher level of Bloom’s Taxonomy. Doing so would not be in alignment with the stated learning objective.
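The verb-level check described above can be sketched as a small lookup table. The verb-to-level mapping below is illustrative only, not an authoritative taxonomy source:

```python
# Illustrative mapping of action verbs to Bloom's Taxonomy levels
# (verb lists and level numbers are examples, not an official source).
BLOOM_LEVEL = {
    "describe": 1, "list": 1, "recite": 1,   # knowledge/comprehension
    "use": 3, "apply": 3, "demonstrate": 3,  # application
    "analyze": 4, "evaluate": 5, "create": 6,
}

def assessment_aligned(objective_verb: str, assessment_verb: str) -> bool:
    """An assessment is misaligned if it demands a higher Bloom level
    than the objective's verb promises."""
    return BLOOM_LEVEL[assessment_verb] <= BLOOM_LEVEL[objective_verb]

# "Describe the seven steps" assessed by a recitation task: aligned.
print(assessment_aligned("describe", "recite"))  # True
# "Describe ..." assessed by writing a paper (application level): misaligned.
print(assessment_aligned("describe", "use"))     # False
```

The same lookup works in the other direction: if the objective's verb sits at the application level, a remember-level quiz alone would under-assess it.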

What if the objective were rewritten? For example: Use the seven steps of the research process when writing a paper. This new objective uses an action verb that falls in the application level of Bloom’s Taxonomy, and the student is expected to use the research process to write a paper. Grading would also primarily focus on how the student used the process.

If your student had never been exposed to the process before, it would be appropriate to have a low-stakes quiz to assess the students’ knowledge of the research process prior to writing the paper.

For a course to meet the Quality Matters standards, it must have assessments that are in alignment with the stated learning objectives. Remember, when creating assessments, look at the action verb used in your learning objective and the level of learning it targets.

Note: Since this article was published Bloom’s taxonomy has been updated, so the levels represented in the image are from a previous version.


Am J Pharm Educ. 2015 Feb 17; 79(1): 10.

Abstract

Objective. To determine whether national educational outcomes, course objectives, and classroom assessments for 2 therapeutics courses were aligned for curricular content and cognitive processes, and if they included higher-order thinking.

Method. Document analysis and student focus groups were used. Outcomes, objectives, and assessment tasks were matched for specific therapeutics content and cognitive processes. Anderson and Krathwohl’s Taxonomy was used to define higher-order thinking. Students discussed whether assessments tested objectives and described their thinking when responding to assessments.

Results. There were 7 outcomes, 31 objectives, and 412 assessment tasks. The alignment for content and cognitive processes was not satisfactory. Twelve students participated in the focus groups. Students thought more short-answer questions than multiple choice questions matched the objectives for content and required higher-order thinking.

Conclusion. The alignment analysis provided data that could be used to reveal and strengthen the enacted curriculum and improve student learning.

Keywords: Alignment, classroom assessment, critical thinking, higher order thinking

INTRODUCTION

Institutions of higher education, including pharmacy institutions, strive to deliver effective instruction. Instructional objectives and assessments should be aligned and work seamlessly to promote high quality education and to support student learning.1-4 We describe alignment as “the degree to which expectations and assessments are in agreement and serve in conjunction with one another to guide the system toward students learning what is expected.”5 Course objectives set the expectations for students. If objectives and assessments are aligned, then students have the opportunity to learn and meet the expectations of the program.

Critical and higher-order thinking has been identified as an important goal of pharmacy education,6-8 and should be represented in course objectives. Critical thinking has been ranked among the top 5 attributes that pharmacists should demonstrate.9 Educational outcomes for entry-to-practice pharmacy programs in Canada indicate that critical thinking is important.3 Moreover, the professional practice of pharmacists involves problem solving and decision making, both of which require critical thinking. However, concerns exist that Canadian pharmacy students are not becoming critical thinkers.9-11 As a result, pharmacy schools identify the need to promote higher-order thinking as an essential part of their programs.7,12 For the purpose of this study, we considered higher-order and critical thinking to represent similar cognitive processes.

Evaluating alignment is one way to demonstrate the connection between outcomes and assessment and can improve the quality of the educational system.2,3,13 Opportunity to learn is increased for students when there is a match between the content and cognitive processes of course objectives and classroom assessments.2 To make valid inferences from assessment results, examinations must assess the achievement of the course objectives to indicate content and construct validity.14-17 Strong alignment neither causes nor guarantees improved achievement; however, it can provide evidence of content validity and contribute to accountability.1,18

According to Roach et al, “there is little clarity in terms of ‘how much’ alignment is adequate to facilitate learning and improved outcomes for students.”3 There is also lack of consistency in how alignment is measured, according to Webb et al.4 Choosing an alignment approach depends on the purpose for conducting the alignment and the feasibility of undertaking such a time-consuming task.2 Minimal research has been conducted on alignment methods for classroom assessments used in higher education. Wittstrom et al used subject matter experts and Webb’s method of alignment to examine the alignment of 2 pharmacotherapy courses. Results indicated that the level of alignment was acceptable, but that improvement was needed.1

Alignment should be conducted for cognitive demand as well as content.19 Many frameworks categorize various aspects of thinking into lower and higher levels.20,21 To analyze cognitive alignment, we chose Anderson and Krathwohl’s revision of Bloom’s Taxonomy,22 which is recommended as suitable for instruction and assessment with learners over the age of 16.20,21 The taxonomy has 6 cognitive levels, organized hierarchically from least to most complex: remember, understand, apply, analyze, evaluate, and create. The processes of apply, analyze, evaluate, and create are considered higher-order thinking.23,24
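The taxonomy’s ordering and the higher-order cutoff used in this study can be expressed directly:

```python
# The six cognitive levels of Anderson and Krathwohl's revised taxonomy,
# in hierarchical order from least to most complex.
LEVELS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

def is_higher_order(level: str) -> bool:
    """Apply, analyze, evaluate, and create count as higher-order thinking."""
    return LEVELS.index(level) >= LEVELS.index("apply")

print([lvl for lvl in LEVELS if is_higher_order(lvl)])
# ['apply', 'analyze', 'evaluate', 'create']
```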

The purpose of this study was to analyze the objectives and assessment tasks of 2 therapeutics courses in Newfoundland and Labrador to determine if they were aligned with (1) the Associations of Faculties of Pharmacies in Canada (AFPC) Educational Outcomes for First Professional Degree Programs in Pharmacy,6 (2) subject area content, and (3) cognitive processes, particularly higher-order thinking.

METHODS

To analyze the objectives and assessments for alignment, we used quantitative and qualitative methods (document analysis and focus groups, respectively). This research was reviewed and approved by Memorial University’s Health Research Ethics Authority Board.

We analyzed objectives and assessments for 2 therapeutics courses. Each therapeutics course was 13 weeks in duration, with four 2-hour classes per week. Each course had a coordinator, multiple instructors, and was taught in sections. All assessments consisted of examinations with short-answer and multiple-choice tasks. Student responses to the examinations were not part of the data. The first author was a nonparticipant observer in the therapeutics courses, observing the instruction and interaction among the students.25,26 This author took field notes, recording the cognitive level of classroom activities and describing student involvement. These data helped to categorize the level of thinking required for objectives and assessments.

We used expert review and document analysis to compare breadth of content of the examination tasks to the objectives and to judge depth of thinking of the objectives and examination tasks.2,3,27 There is no recommended number of experts for conducting alignment analyses; however, as few as 2 experts have been used.28,29 Different types of experts have also been used.28 We used an assessment specialist, an experienced pharmacy professor, and a student who had specific knowledge of the objectives and assessment tasks after having recently completed the courses. Using 3 different types of experts strengthens validity evidence based on test content.14

The alignment process consisted of 4 steps. First, we matched AFPC outcomes to the course objectives for content. We established which outcomes were represented in the course objectives and determined categorical concurrence, which is the extent to which the examinations included tasks that assessed the content of each objective. For example, we checked for assessment tasks that did not assess any objectives. Then, we determined range of knowledge correspondence, or the extent to which each course objective was assessed for content. In other words, we looked to see if any objectives were not assessed.
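Under hypothetical task-to-objective matches, the two checks in this step — categorical concurrence and range-of-knowledge correspondence — can be sketched as simple set operations. All names and counts below are invented for illustration:

```python
# Hypothetical data: each assessment task maps to the objectives it assesses
# (an empty set means the task assesses no stated objective).
task_to_objectives = {
    "Q1": {"obj1"}, "Q2": {"obj1", "obj2"}, "Q3": set(), "Q4": {"obj2"},
}
objectives = {"obj1", "obj2", "obj3"}

# Categorical concurrence: what share of tasks assesses at least one objective?
matched_tasks = [t for t, objs in task_to_objectives.items() if objs]
print(len(matched_tasks) / len(task_to_objectives))  # 0.75

# Range-of-knowledge correspondence: which objectives are never assessed?
assessed = set().union(*task_to_objectives.values())
print(objectives - assessed)  # {'obj3'}
```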

We then conducted depth of knowledge alignment to determine how well the cognitive levels of the assessment tasks matched the cognitive levels of the corresponding course objectives. The first author trained the other authors on how to complete the cognitive analysis. They each read the Anderson and Krathwohl Taxonomy,22 and then each met with the first author to discuss the cognitive processes before analyzing the objectives and assessments. This training served to strengthen validity evidence based on response processes.14

Course objectives were then independently analyzed to determine their cognitive levels. The authors compared similarities and discrepancies and, based on their discussion, made a final determination of the cognitive level of each objective. Assessment tasks were similarly analyzed. Finally, an analysis was conducted to determine if objectives and assessments that corresponded for content were matched for cognitive levels.

For the focus groups, we used purposeful sampling to recruit representative students in our pharmacy program to discuss the objectives and assessments.26 These students had all completed the first course in the previous semester and were near completion of the second course. We conducted semi-structured, 30-minute to 45-minute interviews with 12 volunteer students divided into focus groups of 3-4. The purpose was to determine whether the students thought the assessments matched the objectives for content, and to determine the thinking processes that students used to answer the assessment tasks (see focus group questions in Table 1). The focus groups were audio-recorded and reviewed several times.30,31 The discussions were transcribed and summarized, and copies of the transcripts and summaries were provided to the students to review, modify, and make additional comments. We coded the data and developed themes.

Table 1.

Focus Group Questions and Tasks for Therapeutics Courses 1 and 2


The level of thinking required to achieve an outcome or respond to an assessment task may depend upon classroom instruction. Thus, multiple sources of information can result in the most valid, defensible classification of objectives.22 Hence, interviewing students to ascertain their views of whether the objectives and assessments were aligned, how they were taught, and the types of thinking they used to respond to the assessment tasks provided complementary qualitative data to strengthen our quantitative analysis.

RESULTS

The authors agreed 100% that stated course objectives matched AFPC Outcomes. There was 98.7% agreement among the raters with their analyses of the level of categorical concurrence and range of knowledge between course objectives and assessments.

The objectives were categorized for cognitive processes and the ratings were compared. We used Kendall’s tau-b rank correlation because, even though the cognitive levels are hierarchical, the intervals between ratings lack quantitative meaning. The correlation between the raters was greater for the general categories of higher- and lower-order thinking than for the specific categories of the taxonomy, but was significant for both: τ=0.647 (p=0.01) for the specific categories, considered good, and τ=0.793 (p=0.01) for the general categories of higher-order and lower-order thinking, considered strong. These results indicated high reliability for the rating of the cognitive levels of the objectives.

For the first course, data included 16 course objectives and 201 tasks from 8 examinations. For the second course, data included 15 course objectives and 211 tasks from 4 examinations. The majority of course objectives were matched for content to the AFPC Outcomes of Care Provider and Communicator (Table 2). For categorical concurrence, almost half of the assessment tasks (47.1%) were aligned with the Care Provider course objectives (41.9%), indicating strong alignment. However, overall categorical concurrence was not acceptable, in that 50% of the assessment tasks were not matched to any course objective. These assessment tasks were related to Care Provider and Scholar ideas (Table 2), but they did not correspond to objectives in the course syllabus.
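A hand-rolled version of the tau-b statistic is a sketch only (the study presumably used a statistics package), but it makes the tie correction concrete. The rater data below are hypothetical:

```python
from itertools import combinations
from math import sqrt

def kendall_tau_b(x, y):
    """Kendall's tau-b rank correlation with the standard tie correction."""
    pairs = list(combinations(zip(x, y), 2))
    n0 = len(pairs)                  # all n*(n-1)/2 pairs
    concordant = discordant = 0
    tied_x = tied_y = 0              # pairs tied on rater 1 / rater 2
    for (xi, yi), (xj, yj) in pairs:
        dx, dy = xi - xj, yi - yj
        if dx == 0:
            tied_x += 1
        if dy == 0:
            tied_y += 1
        if dx != 0 and dy != 0:
            if dx * dy > 0:
                concordant += 1
            else:
                discordant += 1
    return (concordant - discordant) / sqrt((n0 - tied_x) * (n0 - tied_y))

# Hypothetical cognitive-level ratings (1=remember ... 6=create) by 2 raters.
rater1 = [1, 2, 2, 4, 5, 6]
rater2 = [1, 2, 3, 4, 4, 6]
print(round(kendall_tau_b(rater1, rater2), 3))
```

Tau-b was a reasonable choice here because the cognitive levels are ordinal: pairs of ratings are only compared for direction of difference, never for magnitude.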

Table 2.

Content Alignment of AFPC Outcomes and Course Objectives with Assessment Tasks in Therapeutics Courses 1 and 2


As indicated above, Care Provider course objectives were assessed at an acceptable level. The AFPC Scholar category accounted for 6.5% of the objectives, which were assessed by 11.7% of the assessment tasks (Table 2). However, none of the remaining objectives had any matching assessment tasks. Thus, they were not formally assessed in the course examinations, and we did not conduct any further analyses with these objectives. We then matched cognitive levels of the assessment tasks to cognitive levels of the objectives. Care Provider and Scholar Objectives were selectively used for the cognitive alignment, as these were the only objectives that were formally assessed. Figure 1 indicates the cognitive levels of the objectives, most of which should have required higher-order thinking (86.7%). Table 3 indicates the cognitive levels of the examination tasks, with 33.7% of the tasks requiring higher-order thinking (139/412 tasks). This discrepancy in the results indicated that more assessment tasks requiring higher-order thinking were needed to assess the objectives.


Figure 1. Numbers of Care Provider and Scholar Objectives at the Cognitive Levels of the Anderson and Krathwohl Taxonomy

Table 3.

Numbers of Examinations and Cognitive Levels of Questions for the 2 Therapeutics Courses


Table 4 demonstrates the cognitive alignment of course objectives and their matching assessment tasks. The category of apply was the only level where the alignment was reasonably proportioned, with 28.2% of the assessment tasks and 20% of the objectives categorized at this level. There were no objectives at the remember level, yet more than half of the assessment tasks were categorized at this low cognitive level. The reverse was true for the highest cognitive levels: two-thirds of the objectives were categorized at the highest levels of the taxonomy, namely analyze, evaluate, and create, yet only 5.6% of the questions required this level of thinking. In general, the cognitive alignment of objectives and corresponding assessment tasks was less than satisfactory.

The 12 students, 6 female and 6 male, were in their third year of the program and were 22-29 years of age. They were representative of students in the program, and of the therapeutics classes in particular, in terms of age and academic performance.
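A level-by-level comparison like Table 4’s can be sketched as follows. The label counts and the 25-percentage-point divergence threshold are invented purely for illustration:

```python
from collections import Counter

LEVELS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

# Hypothetical cognitive-level labels for objectives and assessment tasks.
objective_levels = ["apply", "analyze", "analyze", "evaluate", "create"]
task_levels = ["remember"] * 6 + ["understand", "apply", "apply", "analyze"]

def proportions(labels):
    """Share of labels at each taxonomy level (0.0 for absent levels)."""
    counts = Counter(labels)
    return {lvl: counts[lvl] / len(labels) for lvl in LEVELS}

obj_p, task_p = proportions(objective_levels), proportions(task_levels)

# Flag levels where the share of tasks diverges sharply from objectives,
# e.g. many remember-level tasks but no remember-level objectives.
for lvl in LEVELS:
    if abs(obj_p[lvl] - task_p[lvl]) > 0.25:
        print(f"misaligned at '{lvl}': "
              f"objectives {obj_p[lvl]:.0%}, tasks {task_p[lvl]:.0%}")
```

With these invented counts, the check flags the remember level (no objectives, many tasks) and the analyze level (many objectives, few tasks) — the same shape of mismatch the study reports.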

Table 4.

Cognitive Alignment of Corresponding Objectives and Assessment Tasks for Therapeutics Courses 1 and 2


Several themes emerged, but 2 were particularly connected to the alignment between objectives and assessments. The first theme was related to content alignment. Students thought the short-answer tasks on the assessments reflected some of the objectives, but felt that many of the multiple-choice questions did not reflect objectives. All students agreed that most of the long answers were based on the objectives, but that multiple-choice questions could be vague and cover anything from the lecture. Students also thought the assessment tasks should have reflected the “important” concepts from the objectives, but thought that did not always happen.

The second theme represented the students’ perceptions of the assessment tasks’ cognitive levels. All students thought that some of the short-answer tasks required them to think at a higher level and apply what they learned, but that many of the multiple-choice questions were at the remember level. For example, one student commented, “I kind of like the short answer … style of questioning because it does allow you to apply what you’ve learned, you kind of apply it a little better than you would in a multiple choice, especially when you have to justify it, you’re using the information you learned and then you get to explain the reason why you chose something.” On the other hand, many students emphasized that a lot of the multiple choice questions were “pure memorization.”

In general, the student comments indicated that they thought objectives and assessment tasks should match for content, but they felt that this did not happen often, particularly for the multiple-choice questions. They also thought that while short-answer tasks sometimes required higher-order thinking, most multiple-choice questions required lower-order thinking. Students’ qualitative thoughts strengthened the quantitative alignment analysis by providing deeper understanding of the thinking they used to respond to the assessment tasks.

DISCUSSION

The alignment results did not support a strong claim of content validity for the assessments. Approximately half of the objectives had multiple matching assessment tasks; the rest were not assessed. Objectives should have at least 5-6 tasks to be sufficiently assessed.32,34 Among the objectives with a sufficient number of tasks, only those matched to the Care Provider and Scholar AFPC Outcomes were represented, so few AFPC Outcomes were actually assessed in the courses. As indicated in Table 2, the AFPC Outcomes of communicator, collaborator, manager, and advocate were not addressed in the assessments. This has implications for what students perceive as important to learn, since students are known to treat what is assessed as what is important.23 Consequently, when course objectives reflect AFPC Outcomes that are neither taught nor assessed, students may conclude that those outcomes are unimportant. Thus, changes were made to strengthen the alignment between course objectives and assessments.
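A minimal coverage check along these lines could look as follows; the objective names are hypothetical, and using 5 as the cutoff is an assumption taken from the 5-6-task guideline cited above:

```python
# Hypothetical counts of assessment tasks matched to each course objective.
tasks_per_objective = {"obj1": 8, "obj2": 5, "obj3": 2, "obj4": 0}

# Assumed threshold: the cited guideline suggests at least 5-6 tasks
# per objective for sufficient assessment.
MIN_TASKS = 5
under_assessed = [o for o, n in tasks_per_objective.items() if n < MIN_TASKS]
print(under_assessed)  # ['obj3', 'obj4']
```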

When objectives with no matching assessment tasks were identified, class instruction was examined through discussion between the first and second authors. As a result, modifications were made to the courses for the next cohort of students. When it was agreed that an objective was neither taught nor assessed, it was deleted if judged unnecessary; if judged essential, instruction and assessments were modified to address it. For example, one objective, “discussing the advocacy role of the pharmacist,” was neither part of instruction nor assessed, and it was agreed that this objective was not part of the intent of the course, so it was deleted. However, a communicator objective, “presenting care plans,” was retained and a corresponding assessment task was developed for it.

Half of the assessment tasks did not assess any of the course objectives. How to address assessment tasks not aligned with content of the objectives was identified as a problem.28 We decided that if concepts were important enough to assess, then they should be included in course objectives. Thus, we modified the objectives for the next cohort of students. For example, many of the tasks that were about foundational knowledge at the remember level were identified as important by both the second author and the students. One objective that was added to match this expectation was that students should be able to “identify the causes, risk factors, signs, and symptoms of infectious, dermatological, and oncology disorders.”

Cognitive complexity is difficult to determine.10 Instructors develop assessment tasks, but not all students may think like the instructor when responding to the tasks. Students who have previous knowledge of concepts may not have to think as deeply as other students who are not as familiar with the concepts. Also, students previously exposed to tasks that require complex thinking may start to find these types of tasks less demanding as they do more of them. Students responding to these complex tasks for the first time may have to think differently and use more analytical skills than students who are familiar with complex tasks. Thus, the cognitive complexity of tasks depends upon more than the wording of the task; it may also depend upon the students’ prior knowledge and background experience with completing these types of tasks.34 Listening to students in the focus groups describe the thinking they used when answering assessment tasks helped to determine the cognitive levels of some of the tasks and to increase the validity of our interpretations. The focus groups and discussions among the authors about cognitive complexity allowed us to arrive at a better understanding of the thinking required, and consensus was reached.

There are currently no criteria for determining what level of alignment is acceptable, and a perfect match between objectives and assessments is not necessary because assessments typically sample from the domain of interest.35 Complexity plays a role when assessment tasks match part of an objective and a decision has to be made about a match. For example, an objective in one of the courses required students to establish goals of therapy, assess alternatives, recommend therapeutic options, and determine monitoring parameters. We determined that an assessment task that asked students to decide on the goals of therapy for a case study would align with this objective for content, even though not all parts of the objective would be assessed.

Our conclusion about the cognitive alignment was similar to the content alignment: it was insufficient. There were no objectives at the lower level of remember, but 63.6% of the assessment tasks required students to remember with no higher-order thinking involved. For higher-order thinking, 86.7% of the objectives met this level of thinking, but only 33.8% of the assessment tasks required higher-order thinking. As a result of our conclusions, we modified some of the assessment tasks to require higher-order thinking.

Critical thinking may not be a natural behavior, and students may need to be taught to think critically.36,37 If course objectives emphasize higher-order thinking and there is a strong alignment between objectives and assessments, students will have opportunities to develop these critical-thinking and higher-order thinking skills, which are considered essential for pharmacists.

Roach et al noted that there is little literature on the use of alignment research in the classroom, and no criterion for satisfactory alignment of objectives and assessments.3 However, our alignment procedures revealed a wide disparity between objectives and assessments for content and cognitive processes—a gap that concerned both students and instructors. Our work allowed us to conclude that changes were necessary to increase opportunities for students to become critical thinkers. We recommend that institutions use assessment data to make changes that will improve student learning.38 This study provides information that will enable instructors to make necessary changes in their objectives or assessments to provide students with increased opportunity to learn and to demonstrate their achievement. The instructors of the 2 therapeutics courses in this study indicated their willingness to use the results to help them examine what is important to teach and assess, and to implement changes in their objectives and assessments to improve student learning. As previously discussed, this has already begun.

One limitation of the study was that data came from only 2 courses at 1 school of pharmacy, and we could not generalize the results to the entire curriculum. However, by replicating the methods in other courses and programs, we could determine in those courses how often and to what degree alignment occurs. Moreover, this method could be used by other pharmacy schools regardless of curricular design or course content.

Another limitation might be that the qualitative component consisted of a small sample size. In qualitative research, sample size is dependent upon the purpose of the research and the questions asked.39,40 It is typical to include only a few individuals.26 Lincoln and Guba recommended that researchers should sample until saturation is reached.41 By the end of the focus groups with the 12 students, no new ideas were emerging, so we concluded that saturation had been reached. These students were representative of other pharmacy students in the program, so we did not think increasing the number of students would yield new information. Moreover, McLaughlin et al noted that in qualitative educational research in pharmacy, the participants are selected for “specific characteristics of interest,” and that generalization is not the responsibility of the researcher.42

In this study, we supplemented document analysis with listening to students describe the thinking processes they used when responding to assessment tasks. Listening to students helped us categorize the cognitive levels. Alignment studies do not generally include student input.35 This qualitative component, therefore, added a new perspective to the literature on the cognitive aspect of alignment.

This study could be replicated in other courses in pharmacy education and other curricular areas of higher education institutions. Instructors and those involved in curriculum development could use the results of the study and the process of alignment for professional development in higher education. Moreover, because curriculum for all pharmacy schools in Canada is modeled on the AFPC Educational Outcomes and critical thinking is one of the overarching competencies for entry-to-practice pharmacy students in Canada,43 the United States,44 and elsewhere, this study has the potential to reach beyond our school in Newfoundland and Labrador to national and international educational systems.

CONCLUSION

In order for the consequences of assessment to be valid, assessment tasks must be aligned with course objectives for content and cognitive processes. If pharmacy programs purport to teach national outcomes then course objectives need to reflect the national outcomes, and these outcomes must be assessed for both the content and the cognitive complexity of the national outcomes. Both of these conditions are necessary to make valid decisions about whether students have met the expectations of the national outcomes and whether they are prepared to be successful pharmacists.

Furthermore, if the expectation of national outcomes is that pharmacy students need to acquire critical-thinking skills, then instruction and assessment must work together to promote these higher-order thinking skills.

REFERENCES

1. Wittstrom K, Cone C, Salazar K, Bond R, Dominguez K. Alignment of pharmacotherapy course assessments with course objectives. Am J Pharm Educ. 2010;74(5):Article 76.

2. Martone A, Sireci SG. Evaluating alignment between curriculum, assessment, and instruction. Rev Educ Res. 2009;79(4):1332–1361.

3. Roach AT, Niebling BC, Kurz A. Evaluating the alignment among curriculum, instruction, and assessments: implications and applications for research and practice. Psychol Sch. 2008;45(2):158–176.

4. Webb NM, Herman JL, Webb NL. Alignment of mathematics state-level standards and assessments: the role of reviewer agreement. Educ Meas Issues Pract. 2007;26(2):17–29.

5. Webb NL. Alignment Study in Language Arts, Mathematics, Science, and Social Studies of State Standards and Assessments for Four States. Washington, DC: Council of Chief State School Officers; 2002.

6. Association of Faculties of Pharmacy of Canada. Educational Outcomes for First Professional Degree Programs in Pharmacy (Entry-to-Practice Pharmacy Programs) in Canada; 2010.

7. Cisneros RM. Assessment of critical thinking in pharmacy students. Am J Pharm Educ. 2009;73(4):Article 66.

8. Austin Z, Gregory PAM, Chiu S. Use of reflection-in-action and self-assessment to promote critical thinking among pharmacy students. Am J Pharm Educ. 2008;72(3):Article 48.

9. Thompson DC, Nuffer W, Brown K. Characteristics valued by the pharmacy practice community when hiring a recently graduated pharmacist. Am J Pharm Educ. 2012;76(9):Article 170.

10. Miller DR. An assessment of critical thinking: can pharmacy students evaluate clinical studies like experts? Am J Pharm Educ. 2004;68(1):Article 5.

11. Peeters MJ. Cognitive development of learners in pharmacy education. Curr Pharm Teach Learn. 2011;3(3):224–229.

12. Tiemeier AM, Stacy ZA, Burke JA. Using multiple choice tasks written at various Bloom’s Taxonomy levels to evaluate student performance across a therapeutics sequence. Innov Pharm. 2011;2(2):1–11.

13. Resnick LB, Rothman R, et al. Benchmarking and alignment of standards and testing. Educ Assess. 2004;(1&2):1–27.

14. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 1999.

15. Kane MT. Validation. In: Brennan RL, editor. Educational Measurement. 4th ed. Westport, CT: American Council on Education and Praeger Publishers; 2006. pp. 17–64.

16. Messick S. Validity. In: Linn RL, editor. Educational Measurement. 3rd ed. New York: Macmillan; 1989. pp. 13–103.

17. Sireci SG. Packing and unpacking sources of validity evidence: history repeats itself again. In: Lissitz RW, editor. The Concept of Validity: Revisions, New Directions, and Applications. Charlotte, NC: Information Age Publishing; 2009. pp. 19–37.

18. Beck MD. Review and other views: “alignment” as a psychometric issue. Appl Meas Educ. 2007;20(1):127–135.

19. Shepard LA. The role of assessment in a learning culture. Educ Res. 2000;29(7):4–14.

20. Moseley D, Baumfield V, Higgins S, et al. Thinking Skill Frameworks for Post-16 Learners: An Evaluation. A research report for the Learning and Skills Research Center. Wiltshire: Cromwell Press; 2004.

21. Moseley D, Baumfield V, Elliott J, et al. Frameworks for Thinking: A Handbook for Teaching and Learning. Cambridge: Cambridge University Press; 2005.

22. Anderson LW, Krathwohl DR. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman; 2001.

23. Nitko AJ, Brookhart SM. Educational Assessment of Students. 6th ed. New Jersey: Pearson; 2011.

24. Paul R, Elder L. Critical Thinking: Tools for Taking Charge of Your Learning and Your Life. 2nd ed. New Jersey: Pearson Prentice Hall; 2006.

25. Gay LR, Mills GE, Airasian P. Educational Research: Competencies for Analysis and Applications. Toronto: Pearson; 2012.

26. Creswell JW. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research. 4th ed. New Jersey: Pearson; 2012.

27. Webb NL. Criteria for Alignment of Expectations and Assessments in Mathematics and Science Education. Council of Chief State School Officers and National Institute for Science Education. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin; 1997.

28. Davis-Becker SL, Buckendahl CW. A proposed framework for evaluating alignment studies. Educ Meas: Issue Pract.2013;32(1):23–33. [Google Scholar]

29. Porter AC. Measuring the content of instruction: uses in research and practice. Educ Res.2002;31(7):3–14. [Google Scholar]

30. Rubin HJ, Rubin IS. Qualitative Interviewing: The Art of Hearing Data. London: Sage; 2005. [Google Scholar]

31. Kvale S, Brinkman S. Interviews: Learning the Craft of Qualitative Research Interviewing. 2nd ed. Los Angeles: Sage; 2009. [Google Scholar]

32. Frisbie DA. Checking the alignment of an assessment tool and a set of content standards. Iowa Technical Adequacy Project (ITAP). Iowa City, IA: University of Iowa; 2003. [Google Scholar]

33. Webb NL. Issues related to judging the alignment of curriculum standards and assessments. Appl Meas Educ.2007;20(1):7–25. [Google Scholar]

34. Herman JL, Webb MW, Zuniga SA. Measurement issues in the alignment of standards and assessments: a case study. Appl Meas Educ. 2007;20(1):101–126. [Google Scholar]

35. Koretz M, Hamilton LS. Testing for accountability in K-12. In: Brennan RL, editor. Educational Measurement. 4th ed. Westport, CT: American Council on Education and Praeger; 2006. pp. 531–578. [Google Scholar]

36. Costa AL, Kallick B. Discovering and Exploring Habits of Mind. Virginia: Association for Supervision and Curriculum Development; 2000.

37. Lipman M. Thinking in Education. 2nd ed. Cambridge: Cambridge University Press; 2003. [Google Scholar]

38. Abate MA, Stamatakis MK, Haggett RR. Excellence in curriculum development and assessment. Amer J Pharm Educ.2003;67(3):Article 89. [Google Scholar]

39. Merriam S. Qualitative Research: A Guide to Design and Implementation. San Francisco, CA: Jossey-Bass; 2009. [Google Scholar]

40. Yin RK. Qualitative Research from Start to Finish. New York: Guilford Press; 2011. [Google Scholar]

41. Lincoln YS, Guba EG. Naturalistic Inquiry. Thousand Oaks, CA: Sage; 1985. [Google Scholar]

42. McLaughlin JE, Dean MJ, Mumper RJ, Blouin RA, Roth MT. A roadmap for educational research in pharmacy. Amer J Pharm Educ.2013;77(10):Article 218. [PMC free article] [PubMed] [Google Scholar]

43. The National Association of Pharmacy Regulatory Authorities (NAPRA) Professional Competencies for Canadian Pharmacists at Entry to Practice. Ottawa, ON: Author; 2014. [Google Scholar]

44. Accreditation Council for Pharmacy Education. Accreditation standards and guidelines for the professional program in pharmacy leading to the doctor of pharmacy degree. Adopted January 15, 2006. http://www.acpe-accredit.org Accessed May 3, 2013.


Articles from American Journal of Pharmaceutical Education are provided here courtesy of American Association of Colleges of Pharmacy


Why is it important to align learning outcomes with learning competencies?

Building alignment between assessment and learning outcomes also allows you to map and communicate the pathway of students' learning progression: you can explain what knowledge and skills students are expected to have on entry to the course and what knowledge and skills they will develop throughout it.

What are learning outcomes aligned with learning competencies?

A learning outcome is a clear and specific statement of what students must demonstrate, at the required level and standard, to successfully complete their study at the program and course levels.

Why is it important to align the assessment method to the learning outcomes or objectives?

Assessments should reveal how well students have learned what we want them to learn while instruction ensures that they learn it. For this to occur, assessments, learning objectives, and instructional strategies need to be closely aligned so that they reinforce one another.

Why is alignment important in education?

Clear alignment helps students understand how the various parts of a course fit together, which in turn helps them learn. It also supports practice with feedback: students need multiple opportunities to apply the knowledge and skills they are learning, along with timely feedback on that practice.