Analysis
Student work can be measured against the following criteria:
- can students select appropriate variables for sorting a data set?
- can students select appropriate methods for analyzing a data set?
- can students construct, read, and interpret graphical representations of a data set?
- can students draw sensible conclusions from a data set?
This generic scoring rubric may be modified and adapted for specific tasks.
| Category of performance | Typical response |
| --- | --- |
| The student needs significant instruction | Student can begin to organize the data and makes a limited analysis using a single statistic. The student may not have attempted to represent the data in tables or graphs. Only one variable is typically considered. |
| The student needs some instruction | Student has made an attempt to organize the data and has attempted to represent it and draw conclusions from it. Again, the response may show that only one variable has been considered. The representation used may be inappropriate and the conclusions invalid. |
| The student's work needs to be revised | Student has selected appropriate variables and methods for sorting, analyzing and representing the data. There may be errors in the calculations and graphs. The student attempts to draw conclusions from the data, but these may be flawed. |
| The student's work meets the essential demands of the task | Student has selected appropriate variables and methods for sorting, analyzing and representing the data. The student has used a variety of analytic tools to interrogate the data set. The conclusions/recommendations follow from and are supported by their analysis of the data. |
The example below shows how the generic rubric can be modified to fit the 'Emergency 911! Bay City' task:
| Category of performance | Typical response |
| --- | --- |
| The student needs significant instruction | Students calculate a single statistic (e.g., mean or median response time). They recommend one ambulance service over the other on the basis of a comparison of this single statistic, even though the mean difference is only 0.2 minutes - not significant enough to support a policy recommendation. The analysis of the data ignores all other variables except response time. |
| The student needs some instruction | Students may calculate measures of center and explore the data with other kinds of analysis (e.g., box plots, stem-and-leaf plots), but they consider only a single variable: the response times of the two ambulance services. They demonstrate some ability to use their statistical "toolkit", but the analysis is not connected to the real-world context of the problem and the argument is weak. |
| The student's work needs to be revised | Students select appropriate variables for analyzing the data (e.g., response time in relation to time of call), make appropriate calculations, use appropriate graphical representations, and make a reasonable recommendation based on their analysis. There may be errors in the calculations and in the graphs. However, students do not fully interrogate the data set, thereby not ruling out other possible salient relationships (e.g., response time in relation to day of the call). The recommendations follow from the analysis, but the report may lack clarity and thoroughness. |
| The student's work meets the essential demands of the task | Students select appropriate variables for sorting, analyzing and representing the data. Students consider a number of relationships and use a variety of analytic tools to fully interrogate the data set. Their recommendations follow from and are supported by their analysis of the data. |
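To make the contrast between the lowest and highest rubric categories concrete, the sketch below analyzes a small invented data set. The service names, call hours, and response times are all hypothetical (not the actual Bay City data), chosen so that a single-statistic comparison shows no difference while a second variable - time of call - reveals one.

```python
# A minimal sketch of multi-variable analysis, using invented data.
from statistics import mean, median

# (service, hour_of_call, response_time_minutes) -- hypothetical values
calls = [
    ("Arrow", 9, 5.0), ("Arrow", 14, 5.2), ("Arrow", 22, 8.0), ("Arrow", 23, 8.2),
    ("Metro", 10, 6.4), ("Metro", 13, 6.5), ("Metro", 21, 6.9), ("Metro", 23, 6.6),
]

def summarize(rows):
    times = [t for _, _, t in rows]
    return {"mean": round(mean(times), 2), "median": round(median(times), 2)}

# Single-statistic comparison (the weakest category of response):
by_service = {s: summarize([r for r in calls if r[0] == s]) for s in ("Arrow", "Metro")}

# Interrogating a second variable: response time in relation to time of call.
def by_period(service):
    day = [r for r in calls if r[0] == service and 6 <= r[1] < 18]
    night = [r for r in calls if r[0] == service and not (6 <= r[1] < 18)]
    return {"day": summarize(day), "night": summarize(night)}

for s in ("Arrow", "Metro"):
    print(s, by_service[s], by_period(s))
```

With these invented numbers, both services have the same overall mean, yet one is markedly slower at night - the kind of salient relationship that only a fuller interrogation of the data set would surface.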
Malcolm Swan
Mathematics Education
University of Nottingham
Malcolm.Swan@nottingham.ac.uk
Most assessment practices seem to emphasise the reproduction of imitative, standardised techniques. I want something different for my students. I want them to become mathematicians - not rehearse and reproduce bits of mathematics.
I use the five 'mathematical thinking' tasks to stimulate discussion between students. They share solutions, argue in more logical, reasoned ways, and begin to see mathematics as a powerful, creative subject to which they can contribute. It's much more fun to try to think and reach solutions collaboratively. Assessment doesn't have to be an isolated, threatening business.
Not just answers, but approaches.
Malcolm Swan is a lecturer in Mathematics Education at the University of Nottingham and is a leading designer on the MARS team. His research interests lie in the design of teaching and assessment. He has worked for many years on research and development projects concerning diagnostic teaching (including ways of using misconceptions to promote long-term learning), reflection and metacognition, and the assessment of problem solving. For five years he was Chief Examiner for one of the largest examination boards in England. He is also interested in teacher development and has produced many courses and resources for the in-service training of teachers.
Jim Ridgway
School of Education
University of Durham
Jim.Ridgway@durham.ac.uk
Thinking mathematically is about developing habits of mind that are always there when you need them - not in a book you can look up later.
For me, a big part of education is about helping students develop uncommon common sense. I want students to develop ways of thinking that cross boundaries - between courses, and between mathematics and daily life.
People should be able to tackle new problems with some confidence - not with a sinking feeling of 'we didn't do that yet'. I wanted to share a range of big ideas concerned with understanding complex situations, reasoning from evidence, and judging the likely success of possible solutions before they were tried out. One problem I had is that my students seemed to learn things in 'boxes' that were only opened at exam time.
You can tell the teaching is working when mathematical thinking becomes part of everyday thinking. Sometimes the evidence is that the ideas have become part of the mental toolkit used in class - 'let's do a Fermi [make a plausible estimate] on it'. Sometimes it comes out as an anecdote. One graduate told me a story of how my course got him into trouble. He was talking with a senior clinician about the incidence of a problem in child development, and the need to employ more psychologists to address it. He 'did a Fermi' on the number of cases (wildly overestimated) and the resource implications (impossible in the circumstances). He said there was a silence in the group... you just don't teach the boss how to suck eggs, even when he isn't very good at it. He laughed.
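A Fermi estimate of the kind described multiplies a few rough, order-of-magnitude assumptions to reach a ballpark figure. A minimal sketch, with every number invented purely for illustration (not real incidence data):

```python
# A hypothetical Fermi estimate: roughly how many child-development cases
# might a region see? All figures are invented order-of-magnitude assumptions.
population = 1_000_000      # assumed size of the region
fraction_children = 0.2     # rough share of the population who are children
incidence = 0.01            # assumed 1-in-100 incidence among children

estimated_cases = population * fraction_children * incidence
print(f"~{estimated_cases:,.0f} cases")  # an order-of-magnitude figure, not a measurement
```

The point of the exercise is not the exact answer but whether the result - and its resource implications - is plausible at all.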
Jim Ridgway is Professor of Education at the University of Durham, and leads the MARS team there. Jim's background is in applied cognitive psychology. As well as kindergarten to college level one assessment, his interests include the uses of computers in schools, fostering and testing higher-order skills, and the study of change. His work on assessment is diverse, and includes the selection of fast-jet pilots and cognitive analyses of the processes of task design. In MARS he has special responsibility for data analysis and psychometric issues, and for the CL-1 work.
About MARS
The Mathematics Assessment Resource Service, MARS, offers a range of services and materials in support of the implementation of balanced performance assessment in mathematics across the age range K to CL-1. MARS is funded by the US National Science Foundation, and builds on earlier funding which began in 1992 for the Balanced Assessment Project (BA) from which MARS grew.
MARS offers effective support in:
The Design of Assessment Systems: assessment systems are tailored to the needs of specific clients. Design ranges from the contribution of individual tasks through to full-scale collaborative work on test development, scoring, and reporting. Clients include cities, states, and groups concerned with educational effectiveness, such as curriculum projects and professional development initiatives.
Professional Development for Teachers: most teachers need help in preparing their students for the much wider range of task types that balanced performance assessment involves. MARS offers professional development workshops for district leadership and 'mentor teachers', built on materials that are effective when used later by such leaders with their colleagues in school.
Developing Design Skills: many clients have good reasons to develop their own assessment, either for individual student assessment or for system monitoring. Doing this well is a challenge. MARS works with design teams in both design consultancy and the further development of the team's own design skills.
To support its design team, MARS has developed a database, now with around 1000 interesting tasks across the age range, on which designers can draw, modify or build, to fit any particular design challenge.
Tell me more about this technique:
- Mathematical Thinking CATs
- Fault Finding and Fixing
- Plausible Estimation
- Creating Measures
- Convincing and Proving
- Reasoning from Evidence