
Classroom Assessment Techniques


Theory and Research
Today, it is generally recognized that the commonly used series of 60-minute examinations provides an instructor with only a quick and limited view of the knowledge a student has actually acquired during a semester course (Slater, 1997). Conventional multiple-choice tests do not give the instructor enough information to ascertain why a student gave a particular response. Unfortunately, even student-supplied responses, in-class essays, and quantitative problem-oriented test items are severely limited in scope and complexity by unavoidable time constraints. These deficiencies and others have been thoroughly described and documented elsewhere (Berlak et al., 1992, p. 8).

Portfolio assessment strategies, such as those long used in fields like photography, architecture, and writing, may hold the most promise for science instruction. In an introductory-level science course, portfolios provide a forum for extended and complex learning activities and observations (Slater, 1994; Collins, 1992, 1993). For example, an introductory geology portfolio might contain maps and cross-sections drawn by the student, along with interpretations based on the student's own observations. The student can also describe some of the difficulties encountered in obtaining information and justify any assumptions employed. In such a procedure, much of the responsibility for both learning and assessment is transferred to the student.

In terms of effectiveness, Slater (1997) reports how different types of portfolios were used in three separate classroom contexts to explore the effectiveness of portfolio assessment strategies. In each study, a two-group comparison strategy was used, and the groups were compared on several measures, including a common final examination and a pretest/posttest self-report survey. Additionally, each group that used portfolios completed open-ended surveys and participated in focus group interviews. The three classroom contexts were: (1) college physics at an urban community college; (2) physical science for elementary education majors at a medium-sized university; and (3) introductory environmental science for non-science majors in a large-enrollment lecture course (n > 280) at a major university.

For each study, one of two course sections was randomly selected to be assessed primarily by portfolios while the other was assessed traditionally using quizzes and tests. With the exception of the final examination, students who were primarily assessed using portfolios were not administered any of the quizzes or tests that the traditional students took. Student portfolios were evaluated at regular intervals throughout the semester using a holistic scoring rubric (described thoroughly by Rischbieter, Ryan, & Carpenter, 1993; Astwood & Slater, 1996; Kuhs, 1994). At the end of the semester, all students took the same multiple-choice final examination of 24 to 50 items directly keyed to the course learning objectives.

In each study, the results were essentially identical. Students assessed by portfolios scored just as well on a traditional multiple-choice final examination as their traditionally assessed counterparts. However, an analysis of the qualitative data suggests that, from the students' perspectives, there may be major advantages to the portfolio assessment strategy.

All students completed open-ended surveys, and representatives from each portfolio class participated in focus group interviews. Overall, the students reported that they liked this alternative assessment procedure. Probably most important to the students, the portfolios significantly reduced their level of "test anxiety" (Slater, Ryan, & Samson, 1997). This reduction in anxiety shows up clearly in how students attend to class discussions. Students suggest that they feel relieved of their traditional vigorous note-taking duties and free to consider the science of a given situation as a whole - not just the formulas. They state that they enjoy class discussion more because of the atmosphere promoted by the assessment strategy.

Students assessed by portfolios also report spending considerable time going over the textbook or required readings to be sure that they fully comprehend each learning objective. Although it is unclear exactly how much time students devote to creating their portfolios, they do report that they contemplate the concepts outside the classroom - always looking for that "neat thing" to include in their portfolio. Students reported that they thought they would remember what they were learning better and longer than the material from other classes they took. Students suggest that this is because they have internalized the material while working with it, thought about its principles, and applied its concepts creatively and extensively over the duration of the course.


  • Timothy F. Slater, Research Assistant Professor of Physics, Montana State University. Interests: authentic assessment in support of student-centered instruction; physics/astronomy education at the K-16 level and in public awareness; Internet-based scientific investigations involving real-time earth/space science data; and teacher enhancement. solar.physics.montana.edu/tslater

Astwood, P.M. & Slater, T.F. (1996). Portfolio assessment in large-enrollment courses: Effectiveness and management. Journal of Geological Education, 45(3).

Berlak, H., Newmann, F.M., Adams, E., Archbald, D.A., Burgess, T., Raven, J., & Romberg, T.A. (1992). Toward a new science of educational testing and assessment. Albany, NY: State University of New York Press.

Collins, A. (1993). Performance-based assessment of biology teachers. Journal of College Science Teaching, 30(9): 1103-1120.

Collins, A. (1992). Portfolios for science education: Issues in purpose, structure, and authenticity. Science Education, 76(4): 451-463.

Guba, E.G. & Lincoln, Y.S. (1989). Fourth generation evaluation. Newbury Park, CA: Sage Publications.

Kuhs, T.M. (1994). Portfolio assessment: Making it work for the first time. The Mathematics Teacher, 87(5): 332-335.

Rischbieter, M.O., Ryan, J.M., & Carpenter, J.R. (1993). Use of microethnographic strategies to analyze some affective aspects of learning-cycle-based minicourses in paleontology for teachers. Journal of Geological Education, 41(3): 208-218.

Slater, T.F. (1994). Portfolio assessment strategies for introductory physics. The Physics Teacher, 32(6): 415-417.

Slater, T.F. (1997). The effectiveness of portfolio assessments in science. Journal of College Science Teaching, 26(5).

Slater, T.F. & Astwood, P.M. (1995). Strategies for grading and using student assessment portfolios. Journal of Geological Education, 45(3): 216-220.

Slater, T.F., Ryan, J.M., & Samson, S.L. (1997). The impact and dynamics of portfolio assessment and traditional assessment in college physics. Journal of Research in Science Teaching, 34(3).

Tobias, S. & Raphael, J. (1995). In-class examinations in college science - new theory, new practice. Journal of College Science Teaching, 24(4): 240-244.

Wiggins, G. (1989, May). A true test: Toward more authentic and equitable assessment. Phi Delta Kappan, 70(9): 703-713.

Wolf, D. (1989). Portfolio assessment: Sampling student work. Educational Leadership, 46(7): 35-37.
