Premed 101 Forums

Did You Go Into Medicine For The Money


RGK


I don't have statistically significant data - only from one school that keeps track. But there is plenty of anecdotal evidence: grades dropping from the 90s to the 60s in first year for some kids but not others (not everything can be explained by drinking and partying) is quite a common occurrence. Next, TAs from UofT whom I know, and who teach 100-level courses, complain about incredible gaps in the knowledge of some first-year students, who must have had an 85+ average to get into UofT Science in the first place. Most importantly to me, my own observations.

 

I concede it might not be enough to influence your opinion, but it's good enough for me to form my own.

 

The UK is just an example, but generally the EU system of education is more uniform, and everybody has some kind of standardized assessment. Some do better than others, but at least you know where students stand when they enter university. Here, it is all over the place.

 

I'll state the common refrain: the plural of anecdote is not data.

 

More importantly, conclusions on contrast between two systems shouldn't be drawn with experiences from only one side of the comparison. What your experiences and stories from others indicate is that there could be significant inconsistencies between high school training in Canada. I'd prefer data on that front, but I don't deny that this is likely a reality. You've taken it a step further and claimed that these inconsistencies are greater than those in the EU. Even if your anecdotes were representative (which I would argue that they aren't), they can't speak to that comparison because they're all from this side of the Atlantic.

 

Where we do have data (rather than anecdotes) which compares the two systems (rather than looking at just Canada's), Canada appears to come out slightly ahead. More, better tailored data would be preferable, but there's no way I'm going to believe one-sided anecdotes over data which indicates the opposite - and neither should you.

 

Everyone's entitled to their own opinion, but those opinions should be informed. You can believe whatever you'd like if you ignore contradictory data, but that's not an approach that promotes productive decision-making.



No, because if people continue to support the public system, it's no sacrifice at all to have your kids in public school. :)

 

Honestly though, the public schools would have to be dangerous (like in some places in the states) before I'd take my kids out. Barring that, I'd make up any deficits (and I really don't think there are any at this point in time) with tutors and such.

 

What's the difference between private school and tutoring, then? People who send their kids to private school are still paying into the public system, so it isn't money. Is it having those kids in the public system? But by hiring tutors to make up for deficiencies, aren't you still, effectively, covering up deficits in the system instead of letting them be visible?


What's the difference between private school and tutoring, then? People who send their kids to private school are still paying into the public system, so it isn't money. Is it having those kids in the public system? But by hiring tutors to make up for deficiencies, aren't you still, effectively, covering up deficits in the system instead of letting them be visible?

No no no. Public schools are funded PER child in that school. So as enrollment drops, funding drops, services get cut, programs are eliminated. (because obviously all the money given to the school for each student is not spent only on that student)

 

My point about tutoring was mostly just that you could buy a lot of tutoring for much less than private school tuition. Besides, major deficiencies won't occur unless people stop supporting the public system en masse. The post you quoted was me saying what I would do if the public schools got really, really bad.

 

What I would actually spend the extra money on would be art, music, sport and travel for the kids.


I'll state the common refrain: the plural of anecdote is not data.

 

More importantly, conclusions on contrast between two systems shouldn't be drawn with experiences from only one side of the comparison. What your experiences and stories from others indicate is that there could be significant inconsistencies between high school training in Canada. I'd prefer data on that front, but I don't deny that this is likely a reality. You've taken it a step further and claimed that these inconsistencies are greater than those in the EU. Even if your anecdotes were representative (which I would argue that they aren't), they can't speak to that comparison because they're all from this side of the Atlantic.

 

Where we do have data (rather than anecdotes) which compares the two systems (rather than looking at just Canada's), Canada appears to come out slightly ahead. More, better tailored data would be preferable, but there's no way I'm going to believe one-sided anecdotes over data which indicates the opposite - and neither should you.

 

Everyone's entitled to their own opinion, but those opinions should be informed. You can believe whatever you'd like if you ignore contradictory data, but that's not an approach that promotes productive decision-making.

 

I don't think we are far apart on the principle. One builds an opinion on the best information available. While ignoring data is counterproductive, interpreting data, acknowledging its limitations, weighting other factors (including anecdotal evidence and one's own experience), and listening to the opinions and arguments of others are all part of the normal process that intelligent individuals use to form opinions and make decisions.

 

Thank you for contributing your opinion. While you did not offer any new information (I already quoted your data, while recognizing its context and limitations), you made several good points, and your point of view is certainly appreciated.


No no no. Public schools are funded PER child in that school. So as enrollment drops, funding drops, services get cut, programs are eliminated. (because obviously all the money given to the school for each student is not spent only on that student)

 

My point about tutoring was mostly just that you could buy a lot of tutoring for much less than private school tuition. Besides, major deficiencies won't occur unless people stop supporting the public system en masse. The post you quoted was me saying what I would do if the public schools got really, really bad.

 

What I would actually spend the extra money on would be art, music, sport and travel for the kids.

 

If you were to rely on tutors, you would have to have a good base of tutors in the first place. Trust me, that's not easy. While extensive tutoring exists, it is mostly geared towards kids who are failing their classes; tutoring is aimed at closing evident gaps in school material. Tutoring to expand knowledge beyond the minimum school program does not exist, except for a few "math schools". Harder still to find a tutor for kids who learn differently (i.e., those with learning disabilities).

 

A good private school can meet these objectives (there is less opportunity for this in public schools), but not all private schools are good in this respect.

 

In any case, people have various reasons to send kids to private schools. Snobbery could be one of them, but I would not venture the statement that this is the prevailing reason, as some posters have tried to suggest.


I don't think we are far apart on the principle. One builds an opinion on the best information available. While ignoring data is counterproductive, interpreting data, acknowledging its limitations, weighting other factors (including anecdotal evidence and one's own experience), and listening to the opinions and arguments of others are all part of the normal process that intelligent individuals use to form opinions and make decisions.

 

Thank you for contributing your opinion. While you did not offer any new information (I already quoted your data, while recognizing its context and limitations), you made several good points, and your point of view is certainly appreciated.

 

I believe we're further apart than you let on. 

 

You quoted the OECD data, but ignored its key findings, all of which contradicted your stated opinions. You even implied it supported your position, which is a complete misrepresentation of the data.

 

The OECD data does have limitations, which is why making affirmative conclusions about our education system based on that data is likely not justified. Yet I never made any affirmative conclusions. However, you have argued for several affirmative conclusions, particularly your assertion that our education system was inferior and/or more variable than those in the EU. All I've put forth so far is to state that such a conclusion is unsupported, at most asserting a negative conclusion - a null hypothesis, if you will - that our education system is not inferior or more variable than those in the EU. I'm not arguing that Canada's system is better - the OECD data isn't robust enough to support that conclusion - only that it's not worse.

 

In the end, it's not even your opinion I take objection to. It's completely plausible that Canadian secondary education could be inferior to EU secondary education, and I would welcome stronger data that supported such a conclusion - I've asked for such data. No, it's the logic - or lack thereof - you use to reach your opinion, and the confidence with which you state it, that I object to. You rely on anecdotes that, at best, explain half the story. You present contradictory data as evidence of your position, then dismiss it when the contradiction is pointed out. You state your opinion as fact and then employ it as evidence for further conclusions.

 

I haven't been weighing in on the private school vs public school side of this thread nearly as much, but you apply those same logical missteps to that debate: relying on anecdotal evidence to support a positive conclusion, ignoring contradictory data, presenting your opinion as fact, and drawing further conclusions based primarily on that unsupported opinion.

 

I realize that you're trying to be conciliatory, but I read your post and find it to be dismissive. I enjoy having discussions with those who hold differing viewpoints, but it's fruitless if those viewpoints are set in stone, unresponsive to evidence or logic. As they say, you're entitled to your own opinion, not your own facts. I've already provided the strongest available evidence I can find on the subject, it contradicts your viewpoint, and your opinion is unchanged and untempered. I've asked for data which supports your opinion and you've admitted that you have none. That overall approach is quite far from where I stand on principle: opinion should be informed by data.


I believe we're further apart than you let on. 

 

You quoted the OECD data, but ignored its key findings, all of which contradicted your stated opinions. You even implied it supported your position, which is a complete misrepresentation of the data.

 

The OECD data does have limitations, which is why making affirmative conclusions about our education system based on that data is likely not justified. Yet I never made any affirmative conclusions. However, you have argued for several affirmative conclusions, particularly your assertion that our education system was inferior and/or more variable than those in the EU. All I've put forth so far is to state that such a conclusion is unsupported, at most asserting a negative conclusion - a null hypothesis, if you will - that our education system is not inferior or more variable than those in the EU. I'm not arguing that Canada's system is better - the OECD data isn't robust enough to support that conclusion - only that it's not worse.

 

In the end, it's not even your opinion I take objection to. It's completely plausible that Canadian secondary education could be inferior to EU secondary education, and I would welcome stronger data that supported such a conclusion - I've asked for such data. No, it's the logic - or lack thereof - you use to reach your opinion, and the confidence with which you state it, that I object to. You rely on anecdotes that, at best, explain half the story. You present contradictory data as evidence of your position, then dismiss it when the contradiction is pointed out. You state your opinion as fact and then employ it as evidence for further conclusions.

 

I haven't been weighing in on the private school vs public school side of this thread nearly as much, but you apply those same logical missteps to that debate: relying on anecdotal evidence to support a positive conclusion, ignoring contradictory data, presenting your opinion as fact, and drawing further conclusions based primarily on that unsupported opinion.

 

I realize that you're trying to be conciliatory, but I read your post and find it to be dismissive. I enjoy having discussions with those who hold differing viewpoints, but it's fruitless if those viewpoints are set in stone, unresponsive to evidence or logic. As they say, you're entitled to your own opinion, not your own facts. I've already provided the strongest available evidence I can find on the subject, it contradicts your viewpoint, and your opinion is unchanged and untempered. I've asked for data which supports your opinion and you've admitted that you have none. That overall approach is quite far from where I stand on principle: opinion should be informed by data.

 

You are running in circles here. Yes, opinion should be informed as much as possible, and statistical data are not everything, especially if they are limited and have an insufficient level of granularity (e.g., all of Canada vs. individual provinces). I referenced the OECD data before you even mentioned them, and they do not contradict my point, which is, I repeat, inconsistency in secondary education. The OECD statistics are inconclusive in this respect, as you kindly agreed. Other evidence - not statistical, but more compelling because it is empirical and first-hand - you dismiss as insubstantial. You never even considered it, simply dismissed it, while accusing others of being unresponsive or illogical.

 

But anyway, nothing new is coming here in terms of information or contribution to the topic, so there is no point in taking it further.


You are running in circles here. Yes, opinion should be informed as much as possible, and statistical data are not everything, especially if they are limited and have an insufficient level of granularity (e.g., all of Canada vs. individual provinces). I referenced the OECD data before you even mentioned them, and they do not contradict my point, which is, I repeat, inconsistency in secondary education. The OECD statistics are inconclusive in this respect, as you kindly agreed. Other evidence - not statistical, but more compelling because it is empirical and first-hand - you dismiss as insubstantial. You never even considered it, simply dismissed it, while accusing others of being unresponsive or illogical.

 

But anyway, nothing new is coming here in terms of information or contribution to the topic, so there is no point in taking it further.

 

 

The OECD data is not necessarily well tailored to the question at hand, but it's hardly inconclusive. It includes several metrics which clearly show that inconsistency of secondary school student outcomes in math, reading, and writing is lower in Canada than in most major EU countries, including the UK. It's a standardized metric, so it's about as conclusive as you can get. Student performance in secondary school isn't the same as student performance immediately after secondary school - which is why I wouldn't use this data to claim that Canada does better in early post-secondary education than countries like the UK - but they are related metrics, so it certainly contradicts the notion that Canada does worse than countries like the UK in early post-secondary education. More tailored information would be better, and I've asked for better information, but you have offered none.

 

Granularity is encapsulated within those statistics. Having region-specific data would provide more insight into where the issues are, but it doesn't factor into overall measures of inconsistency. Inconsistency is inconsistency, whether it's the difference between a school in Alberta and one in PEI, or the difference between two distinct schools in Toronto. Places like the UK also have differences in quality of education between regions, and better region-specific data would similarly demonstrate issues in specific regions of that country.

 

I really, really shouldn't have to explain why anecdotal evidence has little value. But, in case it does, I'll explain by way of a counter-example. As I said, I went to a high school near the bottom end of quality in my city (and province) - my graduating class has multiple professionals in several disciplines, including two in medical school. I also had a chance to speak with a former high school teacher who taught at many schools, including one of the lowest-rated in London. They deplored the stratification of students (poorer students shunted away from the high-rated schools towards the lower-rated schools), but specifically said that the quality of instruction was roughly the same. Do these stories - all completely true - change your opinion? They shouldn't, they're anecdotes, subject to virtually every bias possible and in no way standardized or comparable to other experiences. You speak of the flaws in the OECD data, yet anecdotal evidence has all those flaws and many, many more.

 

You're right, we're going in circles, so the point of further discussion is fairly minimal. My main point remains - relying on anecdotal evidence over contradictory data is poor logic, and that's exactly what you've done.


The OECD data is not necessarily well tailored to the question at hand, but it's hardly inconclusive. It includes several metrics which clearly show that inconsistency of secondary school student outcomes in math, reading, and writing is lower in Canada than in most major EU countries, including the UK. It's a standardized metric, so it's about as conclusive as you can get. Student performance in secondary school isn't the same as student performance immediately after secondary school - which is why I wouldn't use this data to claim that Canada does better in early post-secondary education than countries like the UK - but they are related metrics, so it certainly contradicts the notion that Canada does worse than countries like the UK in early post-secondary education. More tailored information would be better, and I've asked for better information, but you have offered none.

 

Granularity is encapsulated within those statistics. Having region-specific data would provide more insight into where the issues are, but it doesn't factor into overall measures of inconsistency. Inconsistency is inconsistency, whether it's the difference between a school in Alberta and one in PEI, or the difference between two distinct schools in Toronto. Places like the UK also have differences in quality of education between regions, and better region-specific data would similarly demonstrate issues in specific regions of that country.

 

I really, really shouldn't have to explain why anecdotal evidence has little value. But, in case it does, I'll explain by way of a counter-example. As I said, I went to a high school near the bottom end of quality in my city (and province) - my graduating class has multiple professionals in several disciplines, including two in medical school. I also had a chance to speak with a former high school teacher who taught at many schools, including one of the lowest-rated in London. They deplored the stratification of students (poorer students shunted away from the high-rated schools towards the lower-rated schools), but specifically said that the quality of instruction was roughly the same. Do these stories - all completely true - change your opinion? They shouldn't, they're anecdotes, subject to virtually every bias possible and in no way standardized or comparable to other experiences. You speak of the flaws in the OECD data, yet anecdotal evidence has all those flaws and many, many more.

 

You're right, we're going in circles, so the point of further discussion is fairly minimal. My main point remains - relying on anecdotal evidence over contradictory data is poor logic, and that's exactly what you've done.

 

 

I don't give much weight to your Wikipedia reference, but even there it specifically states that "[anecdotal evidence] is, however, within the scope of scientific method for claims regarding particular instances". With lots of caveats, of course.

 

With regard to your own experience, what point are you trying to make by saying "my graduating class has multiple professionals in several disciplines, including two in medical school"? That the educational system is consistent? And what criteria did you use in calling your school "near the bottom end of quality in my city (and province)", when you indicate results that contradict that statement? I value your experience as much as mine, but I expect at least basic logic when using it in discussion.

 

In the instance we discussed, the OECD data were not contradictory; their limitations rendered them irrelevant to the inconsistency issue (they had other limitations that I didn't bother to mention before, as they are obvious: the OECD administers standardized testing, not standardized credentialing examinations, and those are essentially different).

 

You state above that "Inconsistency is inconsistency, whether it's the difference between a school in Alberta and one in PEI, or the difference between two distinct schools in Toronto. Places like the UK also have differences in quality of education between regions, and better region-specific data would similarly demonstrate issues in specific regions of that country".

 

I agree with this entirely. But you forget one significant factor which I already pointed out and you ignored: most educational systems, e.g., the UK or our own Alberta (the nation's top performer), or BC, have standardized exams that objectively demonstrate a student's achievement regardless of the school attended. AAA is AAA whether you studied for your A-levels in Hampshire or Sussex or anywhere. But the underlying knowledge needed to obtain 95% in science in one school in Toronto may be barely sufficient to achieve 75% in another. So yes, you can compare regions or schools with regard to their achievements, such as averages or numbers of top students, but there is no doubt that an A student is truly an A student, entering post-secondary education as such. That's what I call consistent.

 

Ontario does not have a standardized exam system in its schools (EQAO testing is an entirely different matter), and neither does Canada as a whole.


I don't give much weight to your Wikipedia reference, but even there it specifically states that "[anecdotal evidence] is, however, within the scope of scientific method for claims regarding particular instances". With lots of caveats, of course.

 

With regard to your own experience, what point are you trying to make by saying "my graduating class has multiple professionals in several disciplines, including two in medical school"? That the educational system is consistent? And what criteria did you use in calling your school "near the bottom end of quality in my city (and province)", when you indicate results that contradict that statement? I value your experience as much as mine, but I expect at least basic logic when using it in discussion.

 

In the instance we discussed, the OECD data were not contradictory; their limitations rendered them irrelevant to the inconsistency issue (they had other limitations that I didn't bother to mention before, as they are obvious: the OECD administers standardized testing, not standardized credentialing examinations, and those are essentially different).

 

You state above that "Inconsistency is inconsistency, whether it's the difference between a school in Alberta and one in PEI, or the difference between two distinct schools in Toronto. Places like the UK also have differences in quality of education between regions, and better region-specific data would similarly demonstrate issues in specific regions of that country".

 

I agree with this entirely. But you forget one significant factor which I already pointed out and you ignored: most educational systems, e.g., the UK or our own Alberta (the nation's top performer), or BC, have standardized exams that objectively demonstrate a student's achievement regardless of the school attended. AAA is AAA whether you studied for your A-levels in Hampshire or Sussex or anywhere. But the underlying knowledge needed to obtain 95% in science in one school in Toronto may be barely sufficient to achieve 75% in another. So yes, you can compare regions or schools with regard to their achievements, such as averages or numbers of top students, but there is no doubt that an A student is truly an A student, entering post-secondary education as such. That's what I call consistent.

 

Ontario does not have a standardized exam system in its schools (EQAO testing is an entirely different matter), and neither does Canada as a whole.

 

Anecdotal evidence has some value, but its value is limited and it has lower value than other, more rigorous forms of data (like cross-sectional analysis, which is essentially what the OECD data provides). Perhaps the greatest weakness in anecdotal evidence is selection bias, whereby those presenting the anecdotal evidence provide examples that match their viewpoints, while ignoring counter-examples that may be more representative of reality. I doubt this was your intention, but you displayed that quite well in quoting the Wikipedia article - the complete sentence you quoted said: "Anecdotal evidence is considered dubious support of a generalized claim; it is, however, within the scope of scientific method for claims regarding particular instances." Yet, you conveniently left out the start of that sentence which works against your position (not to mention the rest of the article which further details the limitations of anecdotal evidence). I quoted Wikipedia because this is a very basic concept, at the level of an encyclopedia. (Here's a chapter in a medically-related book saying much the same thing, if you'd prefer a more prestigious source)

 

Anecdotal evidence has a role in exploring new concepts - that's why we do case studies in medicine - but its value is minimal beyond that point. In health care, there's the classic "levels of evidence" pyramid, but it applies equally well to education. Anecdotal evidence isn't completely useless, but it's unreliable and has far less value than more discerning levels of evidence. That's the main concern I have - we have data on this subject; we don't need to rely on anecdotes. The data's not perfect, but it's much more meaningful than anecdotes.

 

As for my own experiences, you cite examples of individuals doing poorly after going to lower-end high schools as evidence of inconsistency in high school education, yet I have many examples of individuals doing quite well after going to a lower-end high school. As for how I know I went to a lower-end high school, reputation is part of that, but since I'm arguing against anecdotes, I have data too. The Fraser Institute ranks schools in Ontario, and mine is on the lower end - again, hardly perfect data, but it's based on standardized testing, so it serves as a decent proxy. You also conveniently ignored my other anecdote, which was from a former high school teacher. I'll repeat - I'm not trying to prove anything with these stories. They're anecdotes; I CAN'T prove anything with these examples. The only point I'm trying to make is that anecdotes are fickle, unreliable, and say as much about the speaker as they do about what the speaker has observed. You focus on examples of inconsistency because you believe it exists at abnormally high levels, which only reinforces that belief. It's classic confirmation bias.

 

Inconsistency in performance and inconsistency in evaluation are certainly two completely different things. If your argument had been only about how we evaluate students, there'd be no issue. However, your arguments have conflated the two definitions over the course of this thread. At some points you speak of standardization of evaluation, but in more than a few others, particularly the ones I've responded to, you speak about quality. Take this one in particular:

 

 

Doesn't this tell you something? Universities have to teach material from the last year of school because most of the kids did not master it at school, public or private. A repeat for you, a new thing for many others. Lots of under-prepared kids go to university and then drop out. Not to mention that elsewhere in the world this material is taught in grade 9 or 10.

 

You are right, some private schools are not strong in academics, and some public schools are relatively good. That doesn't change my point - I commented on the average level of secondary education in Ontario, as well as on wide inconsistencies in programs, teaching, and assessment. It is below the standards of developed countries. Only the IB program can be compared to world standards, and it is way more advanced than the Ontario curriculum.

 

 

You asserted in that post that the average level of secondary education was lower than in other countries, notably those in the EU, as well as less consistent. When I pointed out that average levels of performance were actually higher, you claimed it was more about inconsistency in results. I pointed out that consistency in results was actually higher in Canada, so you now claim it's all about inconsistency in evaluation. That's called moving the goalposts. If you agree that the overall quality and consistency of secondary education is, at a minimum, no worse here than in countries like the UK, then we are no longer in disagreement. You'll get no quarrel from me about inconsistency in how we evaluate high school students for the purposes of post-secondary education - I don't know how it compares to other countries, but it's not good enough here, and I would welcome efforts to make it more consistent. But that wasn't your original argument.


Anecdotal evidence has some value, but its value is limited and it has lower value than other, more rigorous forms of data (like cross-sectional analysis, which is essentially what the OECD data provides). Perhaps the greatest weakness in anecdotal evidence is selection bias, whereby those presenting the anecdotal evidence provide examples that match their viewpoints, while ignoring counter-examples that may be more representative of reality. I doubt this was your intention, but you displayed that quite well in quoting the Wikipedia article - the complete sentence you quoted said: "Anecdotal evidence is considered dubious support of a generalized claim; it is, however, within the scope of scientific method for claims regarding particular instances." Yet, you conveniently left out the start of that sentence which works against your position (not to mention the rest of the article which further details the limitations of anecdotal evidence). I quoted Wikipedia because this is a very basic concept, at the level of an encyclopedia. (Here's a chapter in a medically-related book saying much the same thing, if you'd prefer a more prestigious source)

 

Anecdotal evidence has a role in exploring new concepts - that's why we do case studies in medicine - but its value is minimal beyond that point. In health care, there's the classic "levels of evidence" pyramid, but it applies equally well to education. Anecdotal evidence isn't completely useless, but it's unreliable and has far less value than more discerning levels of evidence. That's the main concern I have - we have data on this subject, we don't need to rely on anecdotes. The data's not perfect, but it's much more meaningful than anecdotes.

 

As for my own experiences, you cite examples of individuals doing poorly after going to lower-end high schools as evidence of inconsistency in high school education, yet I have many examples of individuals doing quite well after going to a lower-end high school. As for how I know I went to a lower-end high school, reputation is part of that, but since I'm arguing against anecdotes, I have data too. The Fraser Institute ranks schools in Ontario, and mine is on the lower end - again, hardly perfect data, but it's based on standardized testing, so it serves as a decent proxy. You also conveniently ignored my other anecdote, which was from a former high school teacher. I'll repeat - I'm not trying to prove anything with these stories. They're anecdotes, I CAN'T prove anything with these examples. The only point I'm trying to make is that anecdotes are fickle, unreliable, and say as much about the speaker as they do about what the speaker has observed. You focus on the examples of inconsistency because you believe it exists at abnormally high levels, which only reinforces that belief. It's classic confirmation bias.

 

Inconsistency in performance and inconsistency in evaluation are certainly two completely different things. If your argument had been only about how we evaluate students, there'd be no issue. However, your arguments have conflated the two definitions over the course of this thread. At some points you speak of standardization of evaluation, but more than a few others, particularly the ones I've responded to, you speak about quality. Take this one in particular:

 

 

 

You asserted average level of secondary education was lower than other countries, notably those in the EU, as well as being less consistent, in this post. When I pointed out average levels of performance were actually higher, you claimed it was more about inconsistency in results. I pointed out that consistency in results was actually higher in Canada, so you now claim it's all about inconsistency in evaluation. That's called moving the goalposts. If you agree that overall quality and consistency of secondary education is, at a minimum, no worse here than in countries like the UK, then we are no longer in disagreement. You'll get no quarrel from me about inconsistency in how we evaluate high school students for the purposes of post-secondary education - I don't know how it compares to other countries, but it's not good enough here and I would welcome efforts to make it more consistent. But that wasn't your original argument.

 

I'm totally on your side here, but please don't use the Fraser Institute to prove anything when we're talking about education. Their main agenda in that sector is to promote private education.

 

They got into trouble a few years back because by their ridiculous metrics, the elementary school in Bountiful, BC (the polygamist community) was one of the best in the province. They drop out of school by grade 8 to become child brides, but in grade 4 they're top notch. It was pretty funny.


I'm totally on your side here, but please don't use the Fraser Institute to prove anything when we're talking about education. Their main agenda in that sector is to promote private education.

 

They got into trouble a few years back because by their ridiculous metrics, the elementary school in Bountiful, BC (the polygamist community) was one of the best in the province. They drop out of school by grade 8 to become child brides, but in grade 4 they're top notch. It was pretty funny.

 

I tend to agree about the Fraser Institute, most of their analyses are not worth much and are heavily biased towards whatever goal they have going into a study.

 

Ignore their rankings then - the underlying data that they draw on is independent of their skew and makes my point equally well. My school had poor standardized test scores compared to others in the city and province.


I tend to agree about the Fraser Institute, most of their analyses are not worth much and are heavily biased towards whatever goal they have going into a study.

 

Ignore their rankings then - the underlying data that they draw on is independent of their skew and makes my point equally well. My school had poor standardized test scores compared to others in the city and province.

Fair enough. I would only add that standardized test scores are okay at the high school level but any lower than that they aren't really indicative of much other than what schools forced teachers to teach to the test (which is often not really conducive to good educational practice).

 

But since we are talking about high school and beyond, carry on. :)


 

You asserted average level of secondary education was lower than other countries, notably those in the EU, as well as being less consistent, in this post. When I pointed out average levels of performance were actually higher, you claimed it was more about inconsistency in results. I pointed out that consistency in results was actually higher in Canada, so you now claim it's all about inconsistency in evaluation. That's called moving the goalposts. If you agree that overall quality and consistency of secondary education is, at a minimum, no worse here than in countries like the UK, then we are no longer in disagreement. You'll get no quarrel from me about inconsistency in how we evaluate high school students for the purposes of post-secondary education - I don't know how it compares to other countries, but it's not good enough here and I would welcome efforts to make it more consistent. But that wasn't your original argument.

 

 

It's not a moving target, I actually mean both: inconsistency in secondary education AND inconsistency in evaluation.

 

The example I mentioned was brought to my attention by several TAs. The population discussed was a group of 1st year Science students. For simplicity, let's say two students: both got into the competitive Science course based on an 87% HS average. One student demonstrates knowledge allowing him to easily continue with a 101-level university course. The other student lacks fundamental knowledge in math and sciences to the point that he should not be allowed into the lab.

 

Obviously there is inconsistency in evaluation (one student deserves 85% whereas the other maybe 65% or less), but more importantly there is a big difference in the level of knowledge of secondary school graduates deemed ready for university. In this sample, the population with insufficient knowledge was significantly larger than the population with knowledge that could justify an 85% mark (12:3, one of the 3 had IB).

 

A lack of consistent evaluation does not cause inconsistent education, but it facilitates it and prevents improvement. It is also one of the reasons why we don't have comparable and meaningful data on how students are really doing.

 

I agree with the other poster re. the Fraser Institute: not worth quoting. They had to pull some of their reports, including one about private schools.


Fair enough. I would only add that standardized test scores are okay at the high school level but any lower than that they aren't really indicative of much other than what schools forced teachers to teach to the test (which is often not really conducive to good educational practice).

 

But since we are talking about high school and beyond, carry on. :)

 

 

You touch on an important point that some seem not to understand. You are absolutely right that standardized tests "aren't really indicative of much other than what schools forced teachers to teach to the test". That's why the value of OECD data or Ontario's EQAO testing is limited, and whilst it can be indicative of meeting certain criteria in one or two basic subjects, it is not evidence of the performance and general knowledge at the secondary school level that was discussed here.

 

In contrast, credentialing exams (such as the GCSE and A-levels in the UK and Ireland, the Brevet des collèges and baccalauréat in France, the Matura in Poland, the Abitur in Germany, the IB internationally, and also provincial exams in British Columbia) are given to upper-level students in specific subject areas to determine whether course material is mastered, whether the student is ready to move on to the next level or graduate, and to determine standing in the course. These are not 'standardized tests'.

 

In BC, the results from credentialing exams are gathered and summarized at three levels: the school, school district, and province. This allows for comparison among the three levels as well as among different schools (public, private, independent) and different student populations, for example, urban and rural, or higher and lower socio-economic groups.

 

The lack of such a system in Ontario makes comparison between schools and students very distorted, if not meaningless. And that's where we end up with 1st year university students: all over the place.


It's not a moving target, I actually mean both: inconsistency in secondary education AND inconsistency in evaluation.

 

The example I mentioned was brought to my attention by several TAs. The population discussed was a group of 1st year Science students. For simplicity, let's say two students: both got into the competitive Science course based on an 87% HS average. One student demonstrates knowledge allowing him to continue with a 101-level university course. The other student lacks fundamental knowledge in math and sciences to the point that he should not be allowed into the lab.

 

Obviously there is inconsistency in evaluation (one deserves 85% whereas the other maybe 65% or less), but more importantly there is a big difference in the level of knowledge of secondary school graduates deemed ready for university. In this sample, the population with insufficient knowledge was significantly larger than the population with knowledge that could justify an 85% mark (12:3, one of the 3 had IB).

 

A lack of consistent evaluation does not cause inconsistent education, but it facilitates it and prevents improvement. It is also one of the reasons why we don't have comparable and meaningful data on how students are really doing.

 

I agree with the other poster re. the Fraser Institute: not worth quoting. They had to pull some of their reports, including one about private schools.

 

It's still moving the goalposts when you initially make claims about average performance, then shift to inconsistency. More importantly, if you'd like to make the case for inconsistency of education, please, make that case! Again, the two definitions are getting conflated. I agree with much of what you're saying in this post, but it's irrelevant when looking at inconsistency in educational outcomes. While having better measurements of student performance would likely make it easier to improve that performance, not having that information doesn't mean students are doing badly or that there are greater inconsistencies in outcomes than in other countries.

 

The OECD data focuses on the absolute fundamentals: reading, math, science. Those aren't just "basic subjects", they're the fundamental building blocks for everything taught in university and beyond (performance in these areas correlates well with both performance in university and longer-term success). Yet, despite having more standardized evaluation schemes, UK performance is worse, both overall and in terms of consistency, when compared head-to-head with Canada. Standardized evaluations don't necessarily lead to overall improved outcomes; there are many other factors at play.
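The distinction being argued here - overall performance (the average) versus consistency (the spread) - can be made concrete with a toy calculation. The numbers below are invented purely for illustration, not actual OECD results:

```python
from statistics import mean, stdev

# Hypothetical score samples, invented for illustration only (NOT real PISA data).
# Two systems can share the same average yet differ sharply in consistency.
country_a = [510, 515, 520, 525, 530]   # tight spread: consistent outcomes
country_b = [440, 480, 520, 560, 600]   # same mean, far wider spread

print(f"A: mean={mean(country_a)}, spread={stdev(country_a):.1f}")
print(f"B: mean={mean(country_b)}, spread={stdev(country_b):.1f}")
```

Both samples average 520, but the standard deviation differs by roughly a factor of eight - which is why a comparison of averages alone can't settle a claim about inconsistency, and vice versa.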

 

I know, I know, you don't think the OECD data is good enough, but I'll repeat - you can make any argument you want seem justified when you dismiss contradictory data.

 

More importantly, you're drawing a very odd line in distinguishing what you consider highly desirable testing from useless testing. Ultimately, the GCSE and other similar schemes are standardized tests, contrary to your assertion. You can argue they are better standardized tests (though I'd want to see some evidence to support that), but they are still standardized ways of evaluating students. On one hand you're arguing for more standardized evaluation, while on the other dismissing data from standardized evaluations.

 

Teaching to the test is a major concern, but hardly one unique to Canada. It happens in the UK as well, a lot - and don't take my word on it, just Google "teaching to the test UK" and you'll get a whole mess of results stretching back over the past decade. This is why OECD data can be helpful - individual school results never get reported back, so there's little incentive to teach to the test.

 

Again, if you've got data or studies to back your claims, show it. I would love to see information that contradicts what I'm saying - it's how I learn. But unsubstantiated assertions just aren't good enough, not for me, and I hope not for you.


It's still moving the goalposts when you initially make claims about average performance, then shift to inconsistency. More importantly, if you'd like to make the case for inconsistency of education, please, make that case! Again, the two definitions are getting conflated. I agree with much of what you're saying in this post, but it's irrelevant when looking at inconsistency in educational outcomes. While having better measurements of student performance would likely make it easier to improve that performance, not having that information doesn't mean students are doing badly or that there are greater inconsistencies in outcomes than in other countries.

 

The OECD data focuses on the absolute fundamentals: reading, math, science. Those aren't just "basic subjects", they're the fundamental building blocks for everything taught in university and beyond (performance in these areas correlates well with both performance in university and longer-term success). Yet, despite having more standardized evaluation schemes, UK performance is worse, both overall and in terms of consistency, when compared head-to-head with Canada. Standardized evaluations don't necessarily lead to overall improved outcomes; there are many other factors at play.

 

I know, I know, you don't think the OECD data is good enough, but I'll repeat - you can make any argument you want seem justified when you dismiss contradictory data.

 

More importantly, you're drawing a very odd line in distinguishing what you consider highly desirable testing from useless testing. Ultimately, the GCSE and other similar schemes are standardized tests, contrary to your assertion. You can argue they are better standardized tests (though I'd want to see some evidence to support that), but they are still standardized ways of evaluating students. On one hand you're arguing for more standardized evaluation, while on the other dismissing data from standardized evaluations.

 

Teaching to the test is a major concern, but hardly one unique to Canada. It happens in the UK as well, a lot - and don't take my word on it, just Google "teaching to the test UK" and you'll get a whole mess of results stretching back over the past decade. This is why OECD data can be helpful - individual school results never get reported back, so there's little incentive to teach to the test.

 

Again, if you've got data or studies to back your claims, show it. I would love to see information that contradicts what I'm saying - it's how I learn. But unsubstantiated assertions just aren't good enough, not for me, and I hope not for you.

 

 

Inconsistency speaks to a lack of quality. And what you cannot measure, you cannot manage or improve [Deming].

 

You seem not to distinguish between standardized tests (which indeed measure "building blocks") and comprehensive credentialing exams. Although the exams are standardized, they are designed differently and have an entirely different objective, which I described in one of the posts above (no need to repeat). Teaching to the test, a major concern with standardized testing, is a non-issue in credentialing exams that measure, in depth, a student's absorption of course material. "Teaching to" credentialing exams is nothing less than thorough teaching of the course material.

 

I don't consider OECD data useless, but it is not useful for this discussion. If I had data about Ontario other than EQAO, then I would use it, but part of the problem is the lack of it. So, no point in challenging me about it; I already said I have no data. How can Ontario assess and compare the knowledge of HS grads if we don't have credentialing exams - on the basis of OECD data? Give me a break. We have very little idea what HS graduates know. If we had exams, we would have the results, and then we could slice and dice and draw conclusions as BC does. But in the absence of credentialing exams: 1/ the exit knowledge of HS students varies widely; 2/ in many cases it is grossly insufficient as preparation for university; and 3/ comparisons are very difficult if not impossible - hence referring to anecdotal evidence. Not because it is better, but because there are no data.


Inconsistency speaks to a lack of quality. And what you cannot measure, you cannot manage or improve [Deming].

 

You seem not to distinguish between standardized tests (which indeed measure "building blocks") and comprehensive credentialing exams. Although the exams are standardized, they are designed differently and have an entirely different objective, which I described in one of the posts above (no need to repeat). Teaching to the test, a major concern with standardized testing, is a non-issue in credentialing exams that measure, in depth, a student's absorption of course material. "Teaching to" credentialing exams is nothing less than thorough teaching of the course material.

 

I don't consider OECD data useless, but it is not useful for this discussion. If I had data about Ontario other than EQAO, then I would use it, but part of the problem is the lack of it. So, no point in challenging me about it; I already said I have no data. How can Ontario assess and compare the knowledge of HS grads if we don't have credentialing exams - on the basis of OECD data? Give me a break. We have very little idea what HS graduates know. If we had exams, we would have the results, and then we could slice and dice and draw conclusions as BC does. But in the absence of credentialing exams: 1/ the exit knowledge of HS students varies widely; 2/ in many cases it is grossly insufficient as preparation for university; and 3/ comparisons are very difficult if not impossible - hence referring to anecdotal evidence. Not because it is better, but because there are no data.

 

The Brits don't consider "teaching to the test" a non-issue. Again, in an ironic twist, you've highlighted one of the few instances where anecdotal evidence is actually relevant - disproving an absolute. You say "Teaching to the test, a major concern with standardized testing, is a non-issue in credentialing exams", yet it takes a quick Google search to find more than a few individuals for whom it is an issue.

 

I said it before - you state opinions as though they were a certainty, without evidence to justify that certainty. If you had said "teaching to the test" was a smaller issue with credentialing exams, there's no data to show otherwise that I'm aware of, and the anecdotes I've mentioned wouldn't hold much weight. Yet, you went for the absolutist statement.

 

Having written both "standardized tests" and "credentialing exams" so far in my career, with more than a few of each to come, I again argue that the difference is semantics, two phrases for largely the same thing. They are both evaluations that are consistent across institutions and jurisdictions that everyone is expected to satisfactorily complete to move on.

 

As for the last bit, again, data > anecdotes. The data I have is far from perfect, but the anecdotes you have are even further from perfect. I would welcome better data. In the absence of it, I refrain from making sweeping claims about the relative quality of our educational system, because I would have nothing to back up those assertions. The main point I've tried to make throughout this overlong discussion is that you should likewise refrain from making such claims, since you have nothing to back up your assertions.


The Brits don't consider "teaching to the test" a non-issue. Again, in an ironic twist, you've highlighted one of the few instances where anecdotal evidence is actually relevant - disproving an absolute. You say "Teaching to the test, a major concern with standardized testing, is a non-issue in credentialing exams", yet it takes a quick Google search to find more than a few individuals for whom it is an issue.

 

I said it before - you state opinions as though they were a certainty, without evidence to justify that certainty. If you had said "teaching to the test" was a smaller issue with credentialing exams, there's no data to show otherwise that I'm aware of, and the anecdotes I've mentioned wouldn't hold much weight. Yet, you went for the absolutist statement.

 

Having written both "standardized tests" and "credentialing exams" so far in my career, with more than a few of each to come, I again argue that the difference is semantics, two phrases for largely the same thing. They are both evaluations that are consistent across institutions and jurisdictions that everyone is expected to satisfactorily complete to move on.

 

As for the last bit, again, data > anecdotes. The data I have is far from perfect, but the anecdotes you have are even further from perfect. I would welcome better data. In the absence of it, I refrain from making sweeping claims about the relative quality of our educational system, because I would have nothing to back up those assertions. The main point I've tried to make throughout this overlong discussion is that you should likewise refrain from making such claims, since you have nothing to back up your assertions.

 

To the best of my knowledge, the concerns with "teaching to the test" in the UK relate to SATs, not the GCSE and A-levels (judging by The Guardian article and several others). The UK Opposition's education secretary is entitled to his opinion too (not based on data, ha ha, his speech did not quote any).

 

Yet you are right that even credentialing exams may raise concerns about how well they represent mastery of the course materials. There are always other factors, such as some students' ability to cope with exams. To deal with this, some jurisdictions opt for a final course mark that takes course work into consideration along with the credentialing exam - similar to assessment at universities. A hybrid system could have advantages if well designed. In any case, systematic assessment is one of the drivers of quality education, and it is sadly lacking here.


To the best of my knowledge, the concerns with "teaching to the test" in the UK relate to SATs, not the GCSE and A-levels (judging by The Guardian article and several others). The UK Opposition's education secretary is entitled to his opinion too (not based on data, ha ha, his speech did not quote any).

 

Yet you are right that even credentialing exams may raise concerns about how well they represent mastery of the course materials. There are always other factors, such as some students' ability to cope with exams. To deal with this, some jurisdictions opt for a final course mark that takes course work into consideration along with the credentialing exam - similar to assessment at universities. A hybrid system could have advantages if well designed. In any case, systematic assessment is one of the drivers of quality education, and it is sadly lacking here.

 

Here are a few examples of complaints about "teaching to the test" related to the GCSE and A-levels, in part or in whole.

 

I think we're coming to a loose consensus here though, which is gratifying, if surprising. I agree that having a more comprehensive assessment plan might have better educational value, though whenever you test for something, there's an incentive to teach to it; by the same token, having higher-quality evaluations can lead to higher-quality education. We'll see how things change in Canada moving forward - hopefully, for the better.


I have a sinking feeling that if ralk's public school kid ever meets older's private school kid, there would be a serious Romeo and Juliet type situation.

 

I hope you're talking about the whole at-odds falling-in-love part of Romeo and Juliet, not the teenage suicide part of Romeo and Juliet  :eek:


I hope you're talking about the whole at-odds falling-in-love part of Romeo and Juliet, not the teenage suicide part of Romeo and Juliet  :eek:

I'm talking about the whole part!

 

Don't let this ancient grudge break to new mutiny!

 

Remember, the two households where we lay our scene are both alike in dignity. Even when they send the kids to public school.


I have a sinking feeling that if ralk's public school kid ever meets older's private school kid, there would be a serious Romeo and Juliet type situation.

 

:lol:  :D  :)

 

Right!

Finally a lighter note.

By the way, the discussion started about some people (not me) having 4 kids in private schools.  I would still not criticise those people.


:lol:  :D  :)

 

Right!

Finally a lighter note.

By the way, the discussion started about some people (not me) having 4 kids in private schools.  I would still not criticise those people.

Not sure how old you are to name yourself "older", but if you are early 30s like me, you will remember that movie being EVERYTHING for a while (the soundtrack too, obvi).

 

If people want to send their kids to private school, I think that is fine, even though I do not think it is necessary.

 

When I have kids, I would really want them to work once they are 16. I always worked in customer service, starting at McDonald's, and to this day, I feel like this had a huge impact on me.

 

I also expect that you need to teach your kids a lot of things at home no matter how good their school is. I would obviously want to make sure that they learn their letters and numbers, are surrounded by books, and develop a love of reading.

 

But I also expect that a lot of the things you think would be taught in school may not be. For example, I fully expect that spelling quizzes, memorizing multiplication tables, and touch typing will be learned at home, along with many other skills.


Archived

This topic is now archived and is closed to further replies.

