**ITTS — It’s The Test, Stupid!**

Results are in on New Hampshire’s Smarter Balanced Assessment, which, without tweaks, will be hitting our students in less than 3 months! Smarter Balanced is one of the two Common Core assessment consortia. The other is PARCC. We have been tearing apart the PARCC test for almost 9 months, since issues with it became apparent in New York. Until now, we have had no data on the taking of the Smarter Balanced one. As with primaries, New Hampshire decided to go first….

Here is what the post-test debriefing group came up with….

*I feel sad for the students who have to take this test — not many will be successful.*

*Much is said about “depersonalizing” information as part of a learning strategy. This is not how students learn.*

*There is too much “stuff” going on the screen at once. It is difficult to move the icons where you want them. Students don’t know how to use the “mouse”; everything for them is “touch screen.”*

*If you leave the screen for a short period of time, the information on the screen will be gone when you return. “I tried the sixth-grade math — it was humbling. It was scary.”*

*I had technology problems. If kids have these problems they’ll just quit.*

*Double-wide monitors would help. I am a huge fan of concept maps, but the notepad does not let you do that on Smarter Balanced. You can’t even copy and paste from the notepad into the test…* (Good point; I had that issue too when I took it.)

*This was more a test of computer skills than of math concepts. If I were a student I would just pick out an answer and move on.*

*Too tedious — kids will get sick of it and just guess to move on. Kids won’t even get past the computer directions.*

======

These are just a sample of the concerns that were raised at this meeting. We did shift to “What do we have to do from now until the spring of 2015 to prepare students?” Sample answers include:

*Pay attention to the directions. Provide students with many opportunities to read directions for their assignments.

*You can’t just read this test and then respond. Students need to highlight and take notes—especially during the audio questions.

*Students need to learn to “read the question first”.

*Students need to be able to go back into the text passages to pull out data that will support their answers.

*Students need to read through the questions and all possible answers. Sometimes questions give the answers to other questions in the test.

*Kids need to know how to do “note taking”.

*We need to teach students “how to draw an inference”.

*Students need to learn how to write a transition sentence between two paragraphs.

*Students need to learn how to write using the “speaker’s voice.”

*Students need to memorize formulas in this test.

*Students will have difficulty writing in the expanding boxes because of the way the box expands.

*Students will have trouble reading and understanding the directions and what is being asked by the question. Is this test closely aligned to the Common Core? It is important that teachers know what the test will be assessing.

**I am concerned that the math test is not necessarily testing students’ math abilities since there is so much reading. This test seems to assess how well the students read the math questions more than their math skills. Thus, because of the amount of reading, I question the validity of our receiving a math ability score.**

*When Measured Progress developed the NECAP, there was a committee on bias to check for testing bias. Does Smarter Balanced do the same? Also, math teachers were asked to evaluate the questions to eliminate unnecessary verbiage so that the math itself was being tested.

*The opening pages of directions and computer information were **ridiculous**. **I didn’t read them — I’m sure my students won’t.** Suggestions: We should have posters made of the most important and often-used keys to post in each math classroom. Students need to practice making equations in Word, including the fraction symbol. We need to teach students to distinguish between one correct answer and many correct answers. There are questions that tell the students to choose all the correct answers.

*The test is difficult to navigate with so many keystrokes to juggle.

*The page layout is eye-wearying, even though you can expand the screen and zoom in and out.

*The passages are lengthy and time consuming and made me consider just choosing “B” so I could move on. Some terms in the reading seemed outdated — “plumb crazy” and “millwright,” for example.

*I had to use multiple skills and multitask at the same time — i.e., the audio portions required me to listen while reading possible answers and constructing a well-written paragraph in my head.

*The test assumes students are skilled in areas such as pre-reading the questions, and if they are not, it assumes they will learn, while taking the test, to read the questions in advance of the reading.

*There wasn’t a flow or cadence to the questions. The type or style of question changed from one to the next. **The answers were not straightforward** — for example, on the math test they did not want the answer to the equation; they wanted to know if the answer was 2/3 greater than what you started with. I understand this is important, but this test will be exhausting for the kids.

*The idea of choosing the “best answer” when there are 2 or more good and appropriate answers felt like a trick. We’re going to look bad for a few years.

*I did 30 questions in an hour and then had to take a break. My eyes hurt and my shoulders felt strained. When I returned 5 minutes later the work was gone.

*Each question is totally different from the one before it, creating more and more confusion for the test takers.

*Frustration builds as you take the test, creating mental despair — students will shut down.

**Many of the math questions seemed to have no basis in the real world and test skills that will never be used in life.** Students will need to be taught the technology skills for the test: scrolling through screens, highlighting, scanning the questions, touch typing, and more.

*The test does not encourage students to use writing webs, brain maps, or organizers to assist with writing. Summary: **In my opinion, this test is a sad indictment of how disconnected the people who design the test are from the typical students in the classroom.** Assessment is necessary, but it should be designed to be developmentally appropriate for the students being tested. Assessment should also allow for different methods to demonstrate competency rather than one computer model. **This test is designed for one type of student — the verbal learner with exceptional executive functioning skills.**

*I took the Grade 7 Language Arts test, which I believe is developmentally designed for adults, not seventh-grade students. The questions were tedious and punitive. I’m not sure that any seventh grader in the State would be able to score well on this test. The worst part of this test was the directions. They were numerous and multifaceted. After observing middle school students take tests for over a decade, it is my firm belief that most kids will stop reading the directions because there are too many and they are far too complex. Students will fail this test, and the test will destroy their confidence at an important stage of their development. Future careers will be destroyed by this test. In addition, the results of this test will become a public relations nightmare for the schools and the school districts as children fail in large numbers.

======

As you can see from a round table of adults taking these tests, **there are considerable problems with the Smarter Balanced Assessments**. Forget the standards. They are a distraction. ITTS — It’s The Test, Stupid! The scores will be low because the test forces students to become disengaged. The ridiculousness of the math problems, which also came up when we published the questions, lies not in the math skills required, but in how the questions are asked… These tests don’t test math. They test how well one can decipher the confused mind of the test maker…

This is a preview of what is to come… And we paid over $8 million for this debacle…