Premed 101 Forums

Interesting Program Directors Take on CARMS



Happens to be from my school - but it is a response to others who have raised similar concerns.

http://www.cmaj.ca/content/canadian-program-directors-have-zero-data-select-residency-candidates

Canadian program directors have zero data to select residency candidates

  • Matthew D. McInnes, Associate professor of Radiology and Epidemiology, University of Ottawa, The Ottawa Hospital Research Institute, Ottawa, Ont.
11 April 2018

I share Dr. Persad's concern. I supervised nine CaRMS matches as diagnostic radiology residency program director in Ottawa, and it was plainly obvious that the lack of objective data with which to evaluate candidates is at the crux of a deeply flawed system. Good research(1) has identified that objective data (medical school marks or examinations) are the only reliable indicators of success in residency. Canadian program directors presently have zero objective data to use to select candidates. We are, as far as I am aware, the only system in the world that has both an entirely pass/fail system and a lack of standardized examinations (the LMCC examination is done after CaRMS).

This frustration from students (subjective/vague criteria) and program directors (lack of useful data points) comes up year after year(2). Until Canadian medical schools and the LMCC rectify this by moving the LMCC to third year (for a four-year program), or return to an objective, marks-based evaluation system, these frustrations will not subside.

Replying to rmorelan:

I am not sure that moving the LMCC to third year would be a good idea, as most CMGs will be setting up electives, studying hard for their dream specialty, and trying to impress preceptors at different programs.

In other words, if you are gunning for a specialty other than family medicine, your studying will be tailored mostly to one specific discipline, and you may end up not doing as well on the LMCC as a colleague who wants to pursue family medicine and has done broad electives.

Medical school admissions in Canada are quite competitive and select academically gifted individuals; the vast majority of CMGs (>95%) pass LMCC Part I with no difficulty. I don't know that scoring 20 points higher on LMCC Part I demonstrates that you will be a strong resident.

The French-language medical schools in Quebec still have a grading system (I believe?) for preclerkship, but having grades in preclerkship adds a certain amount of stress, as medical students sacrifice the majority of their time studying in the library rather than enriching their ECs, research, and quality time with friends and family.


Replying to LittleDaisy:

How about using the USMLE? It is a well-established exam used in the US, and it is graded on a curve... I highly doubt all CMGs are academically gifted; there are IP advantages, students who scored high on CARS but lower in the sciences, and students who had easy undergrads. I think the USMLE would definitely be a well-recognized, well-established exam to use.


I mean, one middle-ground option might be to go from P/F to Honours/P/F or Honours/High Pass/P/F - you wouldn't have to obsess about every single point, but it would give some indication of where people fall, and of their strengths and weaknesses.

The thing about using something like the USMLE is: does it give any indication of aptitude for a specific specialty? Because you can be really smart and not have the hands for surgery or the interviewing skills for psychiatry. It's also a US exam, which means learning a bunch of US-specific material that is not relevant to our system.


Replying to ellorie:

Oh, I think people would definitely obsess over every single point if there were honours and high pass - gunners gunna gun lol

I support standardized exams à la USMLE. They are at once more objective than school grades (which are pseudo-objective, as they are not actually standardized across schools) and actually easier for the student to prepare for, given the abundance of official prep resources. We always complain about how school exams change year to year and don't test us on the material we're taught, so I am hesitant to think that numerical grading would be the clean solution to the objectivity issue.

As for the aptitude issue you mentioned, I'd say let's address it with skills-based questions, or maybe some testing stations, in CaRMS interviews. This isn't going to be a popular opinion with med students. For full disclosure, I actually avoided applying to some programs that I'd heard were pimping their applicants. It creates a lot of stress for applicants, but unfortunately it is also just about the only way programs can directly assess a semblance of aptitude. Yes, excess stress can affect performance on these things, but it stresses everyone equally, so theoretically it's still fair. If every program had it as a mandatory part of their interviews, and this was transparently communicated to med students in advance, I'm sure I would have (with a lot of swearing and complaining) owned up to it eventually and prepared/applied more broadly in the end.


I am not saying there is a particularly good solution to the problem - I guess my main point is simply to reinforce the idea that there is a problem. Our choice of a pass/fail system has its advantages - but we have to at least acknowledge and, where possible, minimize the drawbacks. Right now people don't really know what they have to do to get ready for CaRMS (in an environment where not matching has increasingly worse outcomes) - and they spend a lot of time doing things not actually related to learning medicine, hoping to add the features they need to match. Downstream, program directors have no real idea how to select people - often generating some pretty questionable heuristics that in turn make it impossible to be really competitive for all schools (say, having to do an elective there - well, you only have so many electives). These are the prices we pay for not having something objective at any point in the system, and there are questions about how stress-free the system really is overall (preclerkship is easier, but what about the CaRMS run-up?). Also, whatever methods we are using now to assess for the "hands for surgery, or interviewing skills for psychiatry" would still be there for us to use - no objective testing has ever been shown to solve that aspect.

The LMCC, I would think, would be the most logical test if a test were to be used. For one thing, if it were a major contributor to the process, it would be taken more seriously - some would argue that would force people to know what we want them to know (although the flip side is that the test - which in my opinion has some stark weaknesses - would also need an overhaul; it is easy to brush off testing issues when there is a really high pass rate and no one really cares beyond passing. Without that pressure, the test isn't examined with the same focus as, say, the MCAT or USMLE - those tests are taken seriously, and I mean very seriously, by their creators). The LMCC is our test, supposed to be our measure of ability, and right now it frankly holds questionable utility in the overall accreditation process. People have been annoyed by that aspect of it for two decades now.

Adding some form of test would play games with the current structure of medical school, for sure - ah, every choice has a drawback. It might be that CaRMS is pushed back a month, the LMCC is done just after electives, and the results are available shortly thereafter. That is just an example. It would put more pressure backwards on studying, electives, and even clerkship. If it were before electives in the fall - another possible time slot - then people would know what scores they are entering CaRMS with.

Ultimately, we cannot complain about a subjective and error-prone CaRMS process beyond a point if we helped create the limitations of that system.


Coming from a 3-year program, I would be in favour of a standardized exam. I'm going for a fairly competitive surgical specialty, and the inability to show programs any kind of objective measure of my aptitude makes it really hard to gauge my chances. The only viable way they have of testing us is by seeing us on elective, but we can't do electives everywhere. I would like it to be more like the US, where a standardized test gives an objective score to work from and lets applicants measure for themselves how competitive they'd be for a certain specialty.

If there were an objective measure to work from, stronger candidates could even start using some of their precious electives for what they are meant for: exploring different areas of medicine that they may not get further training in but that might come in handy in the future (for example, as a - hopefully - future surgeon, I'd like to do an elective in family medicine to see what things look like for patients and from the GP's perspective on referral, which I could use to write better dictations, knowing what GPs like to see).

I'd argue this could even bring more well-rounded residents into a program and help people make more informed decisions about where they would be happy specialty-wise. Lots of advantages here, but of course the question returns to the validity of the test being used. I agree the USMLE is not a great fit for our model; something that suits our medical school curriculum and is more clinically oriented would be better. And of course, 3-year schools would have to make adjustments as necessary, but I don't think that's impossible, nor is it a bad thing.


Replying to TechToMD:

That is a valid point, I think - McMaster has always claimed their program is equivalent, and this would be another way to show that in practice. It would actually be a powerful recruiting tool as well, in a sense. In the US, for instance, a school's average USMLE score is used to help attract people to the school.

As you point out, it would also partially remove some of the pressure of selecting a specialty so early - a standard test would be a universal currency in the matching process. If you change your mind, at least you have that to stand on. Right now you can spend a lot of time, before you are ready, doing things that may not have any positive impact on your application in the end. It also means you won't be constantly trying to figure out what a particular school's program in X is actually interested in - which honestly no one can clearly answer, as everyone has just a piece of the puzzle, and it changes quite frequently (PDs, for instance, last about five years on average, give or take, and there are often shake-ups when they change).

 


I think it's clear that our system is starving for an objective measure of knowledge and ability. Some programs this year (none that I applied to) were apparently asking for undergrad transcripts. Undergrad GPA is probably not a great indicator of subsequent medical school performance, but they were desperate to have something with which to compare applicants objectively. It's actually reassuring to hear that even PDs, insiders in this process, feel like they're grasping at straws. It means both sides recognize there's a serious problem: the system is flawed, and we need to fix it. When good applicants go unmatched every year in a system that seems to select residents capriciously... we finally have an impetus for action.


I don't like the idea of grades, because they wouldn't be standardized across schools.

I could accept using the NBME, or whatever the end-of-clerkship rotation exam is. That way PDs could see how you were doing by subject. Obviously the issue here is that the second anything numerical is introduced, it will become the entire focus of med students, and they will obsess over it. My life was pretty chill with the pass/fail system, haha - it would be so much more stressful without it.


I don't know what's with all the obsession to have something objective. The way I see it, three factors get you an interview with a program: 1) demonstration of interest, which will mean different things to different specialties regardless of whether we have grades or a standardized exam; 2) some sort of accomplishment related to that specialty (i.e., it is not sufficient to simply do electives to get an interview), which again is independent of grades or standardized exams; and 3) an absence of red flags, plus a few colleagues from your specialty highlighting some positive attributes in your reference letters - again, this won't change with grades or a standardized exam. None of these three things is really subjective - either you demonstrated interest or you didn't, either you took on research or other projects or you didn't, and either you were half decent on electives or you weren't. Maybe it's subjective to the applicant because they don't know the criteria, but frankly, that's the liberty the employer has - they can evaluate on and prioritize whatever they want.

Once you have an interview, it is absolutely subjective, and I don't see why it shouldn't be. The pre-interview stage screens for applicant quality on paper (your CV); the interview screens for people the program wants to have around for five years, and this will differ from one program to the next. Every job on Earth uses this model - impress with a CV to get the interview, then demonstrate your fit with the team and the job at the interview to secure the position. No Google exec is looking at applicants' grades. To me, a program director saying they need grades to discriminate between applicants is lazy. There's probably something they can discern between two people on paper, and if not, just interview both.

Relying on grades and exam scores alone would not, and should not, change this - it's a crutch for people who don't have anything else going on, and thinking that good grades and a good exam score mean you deserve a match is wrong. They will show which students are at the top of the class, but how does that translate into someone who is going to be a good resident? Anyone who uses every hour of the day to focus on med school grades is doing it wrong - at some point there are diminishing returns on how useful that extra 5% of knowledge is in daily practice, so you are better off using that time to get to know the faculty and residents and do other things that will help your career.


Out of curiosity, would coming from a school with grades be considered an advantage when applying to P/F schools? I'm guessing it obviously would for someone with a perfect GPA, but what about someone with an average around A- (slightly above average at my school, for instance)?


Replying to ZBL:

The thing is that all the criteria you list are subjective. How many electives is enough to demonstrate interest? Is it 2 weeks, 4 weeks, or should it be all your electives? Is someone who did 6 weeks likely to be a better resident than someone who did 4? What counts as an accomplishment in that field? And reference letters are by definition subjective - they're one person's opinion of the applicant, worth only as much as you trust that person's judgment. You also don't know whether the reference writer has some kind of vested interest in the applicant.

On top of that, how many of these factors can be scientifically validated as significant in choosing residents who are likely to perform well? If we're going to justify a selection method for residents, we should at least do so based on evidence - but I don't think we have any in this case. Whereas if we had a more objective measure, then at least you would know that, on top of having great references and "accomplishments" in the field, the person has a solid bank of medical knowledge.

Of course there's always going to be a subjective component; the problem is that right now we have no real objective component whatsoever, apart from some qualitative and categorical data.


Replying to shematoma:

Like I said, subjective to you but not to the program - they know exactly how many electives, or what research, demonstrates interest to them. And rightly so: it will differ across specialties, and within specialties at different programs, as the needs and desires of their programs and communities differ. What's to say neurology in Toronto really cares about grades, publications, or doing >6 weeks of electives while neurology in Edmonton really doesn't? It will still be subjective to you, and there's nothing to scientifically validate. That's just how job hiring works. Unfortunately, not everything in life has nicely weighted criteria applied across the board like med school admissions, which underscores the need to find a specialty early, learn what they're looking for, and zero in.


Replying to ZBL:

Well, the whole premise of this thread is that even programs themselves are frustrated with not having much to work with. So it is a problem, even if you don't think it is.


Replying to LittleDaisy:

At my school, pre-clerkship grades have also been ended for the new curriculum - partly for the reasons you mention, but also to favour collaboration. Study notes are often shared or exchanged, and the same goes for question banks (not necessarily equally) - of course this is only a semi-officially sanctioned process, but it can obviously be a major determinant of the final grade. Problems with "note hoarding" have been reported at graded US med schools.

No easy answers to the above questions - I tend to favour the "job"-based point of view on hiring, since residency seems more like a role than being a full-time student, and hiring should reflect that outlook. It is, however, a valid question to wonder about academic record or ability, although residency seems like less "learning" and more "doing" compared to medical school. One school I attended had "optional" grading, where grades could either be S(atisfactory)/U or reported.

One alternative, I suppose, is something more like the US, where top USMLE scorers do have quite a few more options. The criticism in the US has been precisely that scoring well on the USMLE doesn't necessarily indicate aptitude or interest for specialty X, and that specialties like FM become disfavoured.


The idea that residency is a job is fine. That doesn't mean there should never be the option of objective measures for hiring. It's not uncommon for private-sector employers hiring new graduates to ask for transcripts and test scores. It's also quite common for your knowledge and skills to be tested at job interviews - the short timeline and constraints of the CaRMS process make this a little harder for programs to do.

We're never going to take the subjective element out of residency selection (or medical school selection, for that matter), and no one is suggesting it become based purely on exam scores. Even in the US, a good Step score doesn't guarantee that anyone is a shoo-in for any program. But scores are another tool programs can use to compare applicants. And there is research backing up their use.

Even in Canada, the vast majority of medical schools use GPA and the MCAT in admissions because these are validated tools for selecting students. But you still need to interview well and have other subjective qualities to get in.

We've seen that residency programs are trying hard to get objective data on their applicants - some now ask for undergraduate transcripts. CASPer has also recently been adopted at some programs. Why? Because this kind of data is useful. The match process is a very regimented one - programs get dumped hundreds, if not more than a thousand, applications to review in the span of two to three weeks. They then have a few days in January to interview and rank applicants. They work within a lot of constraints. We've already seen in this thread evidence of program directors feeling they can't do a good job of selecting applicants given the data and constraints they have.

All we're saying here is that maybe we need some kind of tool to add objectivity to the process - and there's support from stakeholders on all sides for this.


1 hour ago, ZBL said:

I don’t know what’s with all the obsession to have something objective. The way I see it, there are three factors that get you an interview to a program: 1) demonstration of interest, which will mean different things to different specialties regardless of whether we have grades or a standardized exam, and 2) some sort of accomplishment related to that specialty (i.e. it is not sufficient to simply do electives and get an interview), again this will be independent of grades or standardized exams and 3) absence of red flags and a few colleagues from your specialty highlighting some positive attributes in your reference letters, again this won’t change even with grades or a standardized exam. None of these three things are really subjective - you either demonstrated interest or you didn’t, you either took on research or other projects or you didn’t, and either you were half decent on electives or you weren’t. Maybe it’s subjective to the applicant because they don’t know the criteria, but frankly that’s the liberty the employer has - they can evaluate on and prioritize whatever they want.

 

Once you you have an interview, it is absolutely subjective and I don’t see why it shouldn’t be. Pre-interview screens for applicant quality on paper (your CV), the interview screens for people the program wants to have around for 5 years, and this will differ from one program to the next. Every job on Earth uses this model - impress with a CV to get the interview, then demonstrate your fit with the team and job needed at the interview to secure the position. No google exec is looking at the applicants grades. To me, a program director saying they need grades to discriminate applicants is lazy. There’s probably something they can discern between two people on paper, and if not just interview both.


Relying on grades and exam scores alone would not, and should not, change this - it’s a crutch for people who don’t have anything else going on, and thinking that good grades and a good exam score mean you deserve a match is wrong. They will show which students are at the top of the class, but how does that translate into someone who is going to be a good resident? Anyone who uses every hour of the day to focus on med school grades is doing it wrong - at some point there are diminishing returns on how useful that extra 5% of knowledge is in your daily practice, so you are better off using that extra time to get to know the faculty and residents and do other things that will help your career.

As mentioned, we don't have any evidence that any of that stuff works as far as selecting for strong/successful residents. In fact, in many jobs, there is evidence that standard interviews select for employees who are more likely to later get fired - clearly not an indicator that they pick applicants whom employers like having on the job.

Further, to use your example, Google has extensive measurements to assess applicants, including multiple rounds of "interviews" that are better described as objective assessments of applicants' abilities to perform a variety of tasks. That process is not the same as what we have for CaRMS.


7 minutes ago, GrouchoMarx said:

I would've loved an objective measurement of skill, because then maybe the programs I've tried to transfer to would believe I'm capable of handling their curricula instead of just assuming that because I'm in path I'm the medical school equivalent of an incel

If you are a CMG and are currently a pathology resident, I don't think that any PDs will look at you negatively. I would suggest that you do electives in your field of interest and get strong LORs. In PGY1 pathology, you mainly do off-service rotations, so you could easily get some of that training credited.

If you want to transfer into family medicine, it is relatively easy. I haven't heard of anyone being turned down (even with no electives and no LOR from a family preceptor).


54 minutes ago, TheBoss said:

As mentioned, we don't have any evidence that any of that stuff works as far as selecting for strong/successful residents. In fact, in many jobs, there is evidence that standard interviews select for employees who are more likely to later get fired - clearly not an indicator that they pick applicants whom employers like having on the job.

Further, to use your example, Google has extensive measurements to assess applicants, including multiple rounds of "interviews" that are better described as objective assessments of applicants' abilities to perform a variety of tasks. That process is not the same as what we have for CaRMS.

And as I understand it, there actually is evidence that the USMLE does correlate with the performance of a resident, including passing the licensing exams. That isn't too surprising on the exam front - speaking as someone doing the exam right now, with the multiple choice part in 2 days, there are a million pieces of data they are looking for - basically they ARE looking for people who have studied so much they are past the point of diminishing returns :) They are doing that (and I asked that exact question) because they believe it does equate to competence as a physician.

You can argue with the conclusions or some other aspect of these studies, but at least there is some form of validation there - a lot more than for all the other methods. That is why the USMLE has the strength it does in the US.



2 minutes ago, LittleDaisy said:

If you are a CMG and are currently a pathology resident, I don't think that any PDs will look at you negatively. I would suggest that you do electives in your field of interest and get strong LORs.

If you want to transfer into family medicine, it is relatively easy. I haven't heard of anyone being turned down (even with no electives and no LOR from a family preceptor).

Out of curiosity, how would you get electives in another program while you are in the middle of doing the first one? Is there some method for that? I haven't heard of anyone doing that :)


Just now, rmorelan said:

Out of curiosity, how would you get electives in another program while you are in the middle of doing the first one? Is there some method for that? I haven't heard of anyone doing that :)

I heard of someone from dermatology transferring into family medicine as a PGY1 (a rare case)! The person specifically talked to their PD and asked for electives in family medicine, even though they were not part of their residency curriculum.

The downside is that if you don't successfully transfer, having expressed your interest in off-service electives to your current PD might hurt you in the end. It's a fine line :P


1 minute ago, LittleDaisy said:

I heard of someone from dermatology transferring into family medicine as a PGY1 (a rare case)! The person specifically talked to their PD and asked for electives in family medicine, even though they were not part of their residency curriculum.

The downside is that if you don't successfully transfer, having expressed your interest in off-service electives to your current PD might hurt you in the end. It's a fine line :P

Ha :) very fine I would say - I mean, once you say you are leaving, there is blood in the water. I wonder how willing the average path PD would be to allow that. I guess there also has to be some elective time in the program overall (otherwise, if you don't switch, you are now short some blocks, and I am not sure how those would be made up).


Archived

This topic is now archived and is closed to further replies.

