As a partial requirement for my Assessment course, I developed an Assessment Project which includes a summative Unit Test designed to measure students' mastery of Science SOL 2.6 (Weather). The purpose of this project was to create an assessment with high reliability and validity and to gain an understanding of what is truly involved in creating meaningful assessment opportunities. The process by which I developed this assessment is outlined below, and it has proved to be extremely helpful in planning instruction.

The first step in creating this test was to unpack the SOL, delving into each of the learning objectives described in the Virginia Department of Education's Curriculum Framework. By examining the explicit, implied, and conditional knowledge and skills in each objective and considering the cognitive level at which each fell, I was able to break a large topic into component parts that could be taught and tested. The hardest part of unpacking the objectives was thinking through the implied content. As adults, we have integrated the relevant knowledge and skills into a single competency, and it can be difficult to step back and identify the underlying assumptions and knowledge we are drawing on. To teach students effectively, we need to provide instruction on these underpinnings. That is where the process of unpacking becomes essential: we have to step backwards through concepts we take for granted and think like a second grader. What was obvious to me as I read these standards and skills is new to a child, and I needed to break my thinking into discrete chunks that I could explain and demonstrate. It is easy to overlook some of these prerequisite understandings, and doing so could leave students quite confused. This process was harder than it appeared at first glance, but it is essential to teaching all of the skills students need to meet the requirements outlined.

The next step was to create a Table of Specifications, which provided the framework for mapping assessment questions back to the learning objectives unpacked above. As I created test questions, I charted where each fell on this Table. This allowed me to see at a glance which objectives had been tested and which needed more questions. By ensuring I had a way to measure student competency at each of the required cognitive levels and in each specific skill, I developed a test that provides a valid, reliable measurement of student learning and knowledge.

This tight alignment among the skills referenced in the SOL documentation, the number of questions at each intersection in the Table, and the skill assessed by each question is evidence that my assessment has content validity. As such, it provides a good example of my "ability to create assessments that provide both a valid and reliable representation of student learning," which is Competency #16 of the William & Mary Student Teacher Competencies.

The ability to create valid and reliable assessments that map back to the Standards of Learning and to the specific learning objectives identified in the Curriculum Framework means that I can not only give my students fair opportunities to demonstrate what they have learned but also administer tests with predictive validity. Because my tests align with the same skills and competencies that the SOL tests measure, they should be good predictors of student performance on those exams.

The process of developing this assessment has helped me gain greater insight into the Virginia Standards of Learning and has given me the tools to plan instruction that will help my students meet and exceed the standards set by the Virginia Department of Education. These are valuable tools to carry with me as I begin my teaching career, and learning to use them has made me more confident that I can provide meaningful instruction in any area that my students need. Learning to create the assessments for which I prepare my students has made me a better teacher, and I look forward to using these skills in the classroom.


During my student teaching experience, I worked with struggling students to remediate their skills where needed. Below is an example of one student's work showing difficulty with rounding to the nearest inch after being introduced to the concept of half inches. After remediation, the student was able to round properly on 19 out of 20 problems!
[Image: Original Quiz]
[Image: After Remediation]

I also used assessments of student performance to shape Daily Edit activities. The mistakes in these teacher-created sentences, which students corrected each morning, were often mistakes that students had recently been making in their own writing. For example, this was the Daily Edit one morning soon after students turned in a paragraph about their favorite stuffed animal; students were using run-on sentences both in their paragraphs and in their biweekly writing assessments and needed reinforcement on where to end sentences:
[Images: Daily Edit examples]

By incorporating the trends I saw in student work into ongoing instruction, I was able to better tailor my teaching to meet my students' needs. This, above all, should be the main goal of assessment.