Have we talked about Standardized School Tests?

Standardised testing in that form can be used to guide the curriculum, but I still think it's only a snapshot.

What happens if the child blitzes the year-level assessment? Do they do the next year up and level them that way?

No, the school curriculum is very wide, so they'd either be given fewer concepts taught more repetitively, or more concepts taught in greater depth, but still in the same classes. The school uses a basically "branching off" kind of curriculum: the better you do, the more branches you are allowed to explore. My kids call them "side quests."

Sounds good in theory. Do they have another teacher who oversees that? I am just thinking of how difficult it is to cater for different levels within the classroom as it is, without trying to teach extra curriculum on top of it.
 
No, they do it on their own. One teacher (Language Arts) has separate vocab books available for the high achievers. One teacher (math) does video lectures with homework that the kids can work through on their own. One teacher (science) does extra projects. But the class size is small, which allows it: 14-17 kids per class.
 
14 - 17 kids per class? Can I come and work there? PLEASE!!!!!!???????
 
^^ That. In spades. It goes beyond students not really understanding the material upon which they test well; "teaching to the test" totally devalues the development of the ability to learn, period. IMHO it is far more important to teach critical thinking, how to approach problems, etc., than it is to teach how to solve a particular problem or set of problems.

I agree - so how do we discover whether schools are delivering this, so that states can't leave certain underfunded schools to fail at it?

Great question! How do we test a student's ability to develop solutions, vs. their ability to answer questions? I think one could come up with some open-ended questions that beg for inventive solutions, but then it gets difficult to assign a hard value to something as soft as "inventiveness" or whatever. I like to do that when interviewing candidates for positions like marketing... I'll ask stupid questions like "Why are manhole covers round?". I am not as impressed with "because manholes are round" as I am with "If they were rectangular they could fall through the hole" or even "they're very heavy and since they're round, they can be rolled". But still, I have no weighting system for any of the answers that gives me actionable information. I have to take it in the context of the rest of the interview.
Maybe some more subjective evaluation of students would be a good thing? I don't know.
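
(As an aside, the geometry behind the "it could fall through the hole" answer is easy to make concrete. This is just an illustrative sketch; the symbols $s$ and $d$ below are mine, not anything from the post.)

A square hole of side $s$ has a diagonal of length
$$ s\sqrt{2} \approx 1.41\,s, $$
so a square cover only slightly larger than the hole can be tipped on edge and dropped through along that diagonal. A circle of diameter $d$ has the same width $d$ in every direction, so a round cover can never be angled to fit through its own slightly smaller round hole.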
 
Standardised testing in that form can be used to guide the curriculum, but I still think it's only a snapshot.

What happens if the child blitzes the year-level assessment? Do they do the next year up and level them that way?
And not a very good one at that. A kid can have an off day. Another could have test anxiety (knows the material, but freaks out on tests).
 

The problem with that kind of question is that it relies on the assumption that it is unfamiliar to the interviewee. There seem to be a small number of 'old chestnuts', such as 'Why are manhole covers round?', that come up all the time, so it's not hard for students or interviewees to simply memorize the 'right' answer without having had to think very much about the problem at all, and with no need to be able to reason out the answer for themselves.

I first encountered the question about manhole covers in about 1984, and it made me think for a minute. But every time since, I have needed no thinking time at all; nor would any of my classmates who couldn't answer the question in 1984 but can now, if their memory stretches back the 32 years to the time some smart-alec know-it-all git was trying to impress their teacher.
 

I pulled that example because it was familiar to me and it has numerous known answers. The "right answer" isn't always - or even usually - the right answer; it depends on the rest of the interview. I never did a PhD thesis, but I gather that much of that evaluation is done interview-style, and I think there's merit to that. The manhole question is a classic, and even if someone has heard all the answers, they still have to pick one, and you get to ask why.
 
I generally had to interview people to fill a technician position. I wanted to see if they understood the bare basics needed to do the job of assembling test setups and running tests without my having to constantly direct them. So I preferred to ask questions to see if they could think and understood the very basics of what they should have learned in school - no problem solving, just checking for understanding of basic laws. I made up a very simple five-question test that I had the secretary give them, to see if I wanted to waste the time interviewing them. My boss saw the questions and told me that I couldn't expect them to know all that, that even he couldn't answer those questions. I told him that I wouldn't expect him to, since he was a business major, but anyone who had taken and understood freshman physics should find the questions so simple that they would get a good laugh out of them; and that while he wouldn't hire someone as an accountant who didn't know the difference between a debit and a credit, I wouldn't hire someone as a lab tech who didn't know that when a gas is compressed it gets hotter. He grudgingly let me continue using it.
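
(The gas-compression point has a simple worked form, for anyone curious. This is just a back-of-the-envelope illustration; the numbers - air, a 2:1 compression, room temperature - are my own assumptions, not anything from the post.)

For a quick (adiabatic) compression of an ideal gas, $T V^{\gamma-1}$ is constant, so
$$ T_2 = T_1 \left(\frac{V_1}{V_2}\right)^{\gamma-1}. $$
For air ($\gamma \approx 1.4$) starting at $T_1 = 300\ \mathrm{K}$ and squeezed to half its volume, $T_2 \approx 300 \times 2^{0.4} \approx 396\ \mathrm{K}$ - about a 96 K rise, which is why a bicycle pump barrel gets warm.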
 

I am constantly astonished by the low expectations for science in the field of 'general knowledge'. If you watch a TV quiz show - even a 'highbrow' contest like Mastermind - they will pitch a question like 'Which is the closest planet to the sun in our solar system?' or 'What chemical element is represented by the symbol "Pb"?' as though these were arcane and difficult questions to answer. If they pitched History, Art or Geography questions at the same elementary-school level, everyone would roll their eyes and say 'This is far too easy'; but with science, not only are contestants apparently expected to have forgotten the stuff they should have learned before they got to High School, the contestants themselves frequently fail to answer even these incredibly basic questions correctly. I don't know which upsets me more.
 
In Massachusetts, when I was a kid, the standardized test seemed to be a basement test to ensure that remedial education was reaching everyone. This seems like a decent path to go with. Ensure the baseline is being taught (pass/fail). If you want to determine the height of knowledge, do APs.
 
And then there's this...

Parents from all over are praising a message from an area superintendent about his thoughts on the PSSAs.
Read more: http://y102reading.iheart.com/onair...-message-on-pssas-has-14587366/#ixzz45iVijuet

in which he says,
I ask for you to trust that our school district and our fine educators are working hard to do what is right.

I look at that and think... why would we use "trust" instead of a measure? I want, as Jimmy writes above, "Ensure the baseline is being taught (pass/fail)."
 