
Gateway Superintendent’s Corner

Dr. David Hopson


Anyone who’s been following educational trends in the media has seen the arguments for and against standardized testing and the adoption of the Common Core State Standards. The debate seems to be intensifying, particularly around the ‘new’ national tests being developed by two groups: PARCC (the Partnership for Assessment of Readiness for College and Careers, the testing consortium to which Massachusetts belongs) and the Smarter Balanced Assessment Consortium. Both are creating computerized tests based on the Common Core standards, meaning they focus on students’ ability to combine information from multiple source materials and formulate an appropriate response. Ideally, because grading is computerized, results will be available to schools more quickly than those of the paper-based MCAS tests.
These ‘new’ tests are causing considerable controversy and have prompted many parents, staff members, and even entire schools to opt out of participating. The PARCC test has not yet been fully ‘adopted’ in Massachusetts and, as part of that process, many students and schools (including two classes at Gateway) are participating in a ‘pilot test’ of the assessment system. Setting up for this trial run has turned into a major burden for many schools, including ours. The staff time (technology, administration, proctors) needed to plan for a trial in which the requirements seem to change regularly, and the effort of setting up computers to run the tests (effectively taking them out of classroom use for as long as it takes to configure them, run the training, and have students take the test), seem to grow exponentially. We’ve also found that Pearson (the test development company) has set the hardware requirements so high that many of our computers cannot be used for the test at all, and others need significant updates to meet them. I can barely imagine scaling this test up from the 34 students taking it this year to all students in grades 3-8 in the near future.
While there has been much talk about finding ways to help districts pay to upgrade their technology infrastructure and purchase computers for the tests, I have yet to see any guarantee that this will happen through the state budget. In fact, if what I have read and heard is correct, administering these new tests costs nearly $30 per student, significantly more than the current MCAS, and that money will have to be found in the state’s budget for educational assessment. This could mean smaller increases in future years for state aid to education or unrestricted local aid to towns. Given this increase in state costs, and the potential for significant additional spending by schools as they upgrade their technology to meet the new testing requirements, one might almost consider this yet another ‘unfunded’ mandate for local cities and towns.
Despite all of the additional time and effort being expended this year to trial the PARCC test, no data from these tests will be shared with students, teachers, or even the district. Yet once the test is in place, it will be used to measure student, teacher, and district growth. That poses an interesting statistical problem for the state, or for Pearson: there is no historical data for these tests, they require a whole new level of student interaction with computers, and they measure what are essentially ‘new’ Common Core standards still in the process of being implemented in our schools. It would have been nice if the federal government had spent the time, money, and energy to develop assessments that actually measure student growth across the entire spectrum of school activities: project-based rubrics, hands-on activities, projects in which students demonstrate applied skills, collaboration and creativity, and written work, all of which make up the ‘grades’ teachers provide. Instead we get an annual measure of proficiency in English Language Arts and Mathematics in the form of a high-stakes, multiple-choice/short-answer/short-essay test (written, or in this case taken on a computer) that seems to have little bearing on what adults actually have to do to be productive in society and at work.
