Saturday, March 31, 2007
I am going to be out of the country next week, so don't expect anything new until Easter or later. Best of luck with all of your endeavors.
Wednesday, March 28, 2007
The Economics of Residency Part III: Payment
Many people don't know this, but Medicare, when it isn't collecting money from residents' paychecks as employees, actually gives every residency program a stipend for every resident that it takes. This can be a six-figure stipend. In that same vein, hospitals are not allowed to bill for the work that residents do. If my understanding of this process is correct, this stipend is actually modified as a ratio of the Medicare work done at the hospital (someone please correct me on this one point if I am mistaken). Hospitals pay residents and provide all resident benefits out of this stipend. The astute observer who read my previous post will notice that this leaves the hospital with a hefty payment in exchange for training the resident. This, of course, leads to another contradiction.
Hospitals receive a significant benefit for having residents available. They cover the floors, they operate on low-level cases with minimal supervision, they "move the meat" so to speak in the ED, and they provide 24-hour call coverage that often prevents attendings with hospital privileges from having to come in at 2:00 AM. Hospitals cannot bill for residents' work directly, but they can bill for hospital services, and because residents often perform these services, the hospital bills for them indirectly.
Moreover, because residents cannot bill when they are actually performing higher-level physician functions, there is a perverse incentive to engage residents in scut work. A resident costs the same whether he does 100 blood draws or scrubs in on an interesting case beyond his current skill level. However, when he does 100 blood draws, the hospital doesn't have to hire a phlebotomist. This saves money. If he scrubs in on an interesting case beyond his skill level (where he might learn something), the hospital not only cannot charge for his presence in the room, but he will actually slow down the attending physician who CAN bill. Thus, the incentive is exactly the opposite of what you would expect from a residency program.
Flying in the face of many years of tradition, I hereby move that the Medicare stipend be removed and residents be allowed to bill for the work that they do. This would accomplish two things:
1. Hospitals would have an economic incentive to use residents efficiently. Having a resident actually engaged in productive activity is probably better for educational purposes than having him engaged in scut. Also, this would put residents on at least the same billing level as the hospital PAs and NPs, diminishing the backwards incentive for hospitals not to hire necessary coverage. Higher resident billing rates would reduce the incentive to have them do the work of ancillary staff.
2. Programs would have an economic incentive to teach residents skills early, as the program could actually benefit economically from having a resident who could bill for those cases. The resident should also be able to bill as an assistant. As unfond as I am of Medicare, if it absolutely must be involved, paying the resident as an assistant rather than giving the program a lump sum would be a much better incentive.
Also, as long as residency is required for certification, hospitals should be required to reimburse residents at least a portion of what they actually generate. I'd love to let the market sort out this mess, but that can't happen within the controlled licensing system. Until the system changes, residents cannot fairly negotiate these rates themselves, and there has to be some sort of legitimate recompense for work completed.
There is, however, a significant barrier to the implementation of any change towards autonomy. Like most things these days, it lies in liability. Stick around for my next post on malpractice.
Sunday, March 25, 2007
The Economics of Residency Part II: $5-$20/Hour
There is relatively minor variation in pay between US residency programs and virtually no variation between specialties within the same program. This creates a rather odd pay scenario. After completing four years of medical school, all graduates who enter residency will be paid between $38k and $55k. This varies a little with regard to the military, which pays its residents more (you get to pay it back later, trust me). Most programs are close to $40k. This leaves a pathology resident who is assigned a 40-hour work week with a pay rate of about $20/hour. For our surgeon in a program that adheres to the 88-hour work week maximum, the rate is closer to $7/hour if overtime is calculated. In a non-adherent program, this can be worse. Thus, a surgery intern may actually make less than the service worker in the cafeteria of the same hospital in which he works. This may actually put some residents below the minimum wage of the state in which they work. Most programs and specialties are in between, with pay hovering in the $10/hour range. One might ask why this happens and how it is justified. Why do physicians put up with it?
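For anyone who wants to check the arithmetic, here is a quick sketch. It assumes a $40k salary, a 52-week year, and the standard 1.5x overtime multiplier for hours past 40; the function names are mine, not anything official:

```python
def effective_hourly_rate(annual_salary, hours_per_week, weeks=52):
    """Straight division of salary over total hours worked."""
    return annual_salary / (hours_per_week * weeks)

def overtime_base_rate(annual_salary, hours_per_week, weeks=52, ot_multiplier=1.5):
    """Base rate implied if hours past 40/week had been paid at the overtime multiplier."""
    regular = min(hours_per_week, 40)
    overtime = max(hours_per_week - 40, 0)
    paid_hour_equivalents = regular + ot_multiplier * overtime
    return annual_salary / (paid_hour_equivalents * weeks)

# Pathology resident: ~$40k over a 40-hour week
print(round(effective_hourly_rate(40_000, 40), 2))   # -> 19.23
# Surgery resident at the 88-hour cap, overtime-adjusted
print(round(overtime_base_rate(40_000, 88), 2))      # -> 6.87
```

That $6.87 base rate is where the "closer to $7/hour" figure comes from; without the overtime adjustment, straight division of $40k over an 88-hour week still only gives about $8.74/hour.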
Medical students owe a lot of money. Official numbers are about a $130,000 average per student, but anyone in medical school will tell you that this is misleading. For the most part, students who have to borrow money owe closer to $200,000, with some owing $300,000 plus. Some students are supported, at least in part, by their families, and this skews the numbers down to a less frightening statistic. Upon completion of school, with this crushing debt burden, the only way in which a student can turn this hideously expensive degree into earning potential is with a medical license. The only way to attain a medical license is to enter residency. Thus, most students have no other viable economic option. Most students will then trudge through residency with these debts accruing interest in some form of deferment or forbearance.
Residencies are mostly accredited by the ACGME (Accreditation Council for Graduate Medical Education), with a few that exist for DO graduates accredited by the AOA. The only way that a residency is considered adequate for licensing purposes is for it to receive accreditation from one of these two entities. This stifles a competitive market in post-graduate medical training. As a new physician, I cannot go apprentice with an internist until I am comfortable with internal medicine, as the internist isn't accredited by the ACGME as a residency program. Oddly enough, nurse practitioners (NPs) and physician assistants (PAs) are allowed to do this. Independent NPs often do go the route of working for a few years under a physician and then striking out on their own. This is illegal for a physician to do. A newly minted NP can find a job, usually in the $60-$80k range, work far fewer hours than the resident, and now, in many states, strike out on his own.
Now, one might think it strange for so many highly educated people to allow themselves to be pushed into such low-paying jobs for such a long period of time. The fact is that, until recently, residency was almost universally considered to be training, an extension of schooling. The slow evolution of residency requirements meant that most physicians began to view it as a natural extension of the medical school training process. The idea of doing it a different way (which is now done by NPs, as it once was by the MDs of old) was just not on the radar screen at that time. NPs and PAs are a relatively new invention, and MDs were all being forced into residency. Most people viewed residents as highly autonomous students. Today, however, the scenario looks a little bit more like this:
If it is in the best interest of the program to call the resident a student, he is a student. If it is in the best interest of the program to call him an employee, he is an employee. Similarly, residents are often perceived to be exempt from most federal labor protections because they are considered students. The IRS, however, will happily collect FICA from the resident as an employee, without the exemption given to students. Similarly, the legal system will view the resident as a liable practitioner in the realm of malpractice. The program, however, will usually not allow the resident final judgment over actions for which he is liable. This is, of course, because the legal system, which sees the resident as liable, also sees the attending physician as liable. Basically, whichever term is worse for the resident will be the one applied in any given situation. Oddly, the resident now has far less autonomy than he did when everyone thought he was a student.
To add insult to injury, government payment programs, which have now taken over nearly 50% of medical payments and set all sorts of arbitrary standards that have been adopted by almost all private third-party payers, will not usually reimburse physicians who have not completed a residency. Furthermore, they usually have to become boarded in a specialty that is considered to be related to any particular medical activity for which they hope to receive compensation. Thus, the option of completing only the internship (or first year of residency) becomes a practical impossibility for most students, forcing them to complete the training. Furthermore, current malpractice law holds most physicians to the "standards of the community," a doctrine that often demands the same competence from non-specialists as from specialists. This essentially prevents a non-boarded physician from trying to sell his somewhat lower level of training for a lower price, because his risk is too high, and malpractice insurers will often not cover his performance of most medical activities for which a specialty exists in the area.
So you might be asking, "do you think that residency should exist at all?" You'd be surprised to hear that my answer is yes. This is an issue of force and of supply and demand. My problem is with the former interfering with the latter. I'll address the impact of government payers and board certification on the supply and demand associated with post-graduate medical training in my next post.
Saturday, March 24, 2007
The Economics of Residency Part I: The Basics of Residency
One of the first things that you hear about in medical school is the match. This life-altering event, which the majority of incoming medical students have never even heard of, is the catalyst for thousands of newly minted physicians relocating to various corners of the country in order to train in their respective specialties. For those of you who are unfamiliar with the modern medical training system, candidates will interview at a number of different training programs across the country. After hearing about this process constantly for the first three years of medical school, every student will actively participate in it during the fourth and final year. After all of the interviews, the programs and candidates rank each other in order of overall preference, and the information is fed into a computer. The computer then determines who goes where. These training programs are called residencies, and the new physicians, soon to be called residents, will find themselves obligated to work under an annual contract.
Before I continue this post, I am going to add a disclaimer. I am not a resident. I probably will be one day, but I am currently just a medical student. After speaking to many people who are in various stages of training, I feel that I have gotten a pretty significant grasp on the whole process. I have friends who are going through the different stages as we speak. However, I am fully willing to accept any criticisms over my perception. This first post is primarily for background, and we will get to money in part II.
Residency training can be a vastly different experience in different specialties. The internship year can mean anything from a 40-45 hour week with most weekends off to brutal weeks of 80+ hours with persistent sleep deprivation and 30+ hour shifts. Surgical specialties tend to have the worst hours of all. Currently, residencies are restricted from working their residents more than 80 hours a week, though some programs have managed to attain an exemption that carries this out to 88. Not everyone plays by the rules, and different programs have different degrees of compliance with this requirement. There is also a 30-hour shift limit, which is likewise followed to varying degrees. Depending on specialty, residency training can also vary from 3-7 years in length.
The concept of residency originally came from an academic program at Johns Hopkins University. A few bright, single, academically ambitious medical graduates would actually live in the hospital in exchange for room and board. They were then exposed to highly varied pathology, and they covered the floors of the hospital as physicians. This old-world hospital was essentially a boarding house for the sick who had no family to take care of them. The hours were long, but the pace was slow. The residents were given a half day off each week, and even with a 156-hour work week, they still managed to get enough sleep to be compatible with life. They had no family to speak of, due to the requirement of being single, and the length of training was only 1-2 years. Residency had NOTHING to do with medical licensing, there were no board certifications, and this relatively short sacrifice was almost a sure ticket to a prestigious career.
In time, an internship became required to even qualify for a state medical license. This was 1-2 years, which was not long enough to qualify as a residency. In many states, this is still all that is required for licensure, though some states now require a full three years, consistent with the shortest of modern residency programs. That being said, just doing an internship and then practicing is almost unheard of these days. Most such physicians are relegated to low-level positions, with minimal chances for better pay, promotions, status positions, or partnerships.
Of course, this whole process is strange. What other occupation requires years of formal training after the schooling process in order to procure a license? Even in professions in which such training is possible, it is certainly not required for even the highest levels of private practice achievement. Lawyers finish three years of law school and learn on the job or engage in a trial by fire by striking out on their own. There is a similar process for engineers, architects, journalists, and everyone else. Sometimes a one year internship is built into the actual schooling process, but there is certainly nothing even remotely close to what exists in medicine. Why is medicine different? Stay tuned to find out.
Sunday, March 18, 2007
Organ Transplantation: How to Bankrupt the Medical System
Perhaps one of the most amazing things that we have accomplished in modern medicine is being able to remove an organ from a living or recently deceased human being and put it into another human being while allowing it to retain its essential level of function. Organ transplantation is a tribute to the genius of many hardworking men and women whose understanding of human anatomy, physiology, and function is so vast, that they have managed to save countless lives from the supposedly inevitable conclusion of poor lifestyle choices, infectious disease, or congenital defect. However, there is a dark side. I feel that I would be remiss if I didn't talk about the economic consequences of organ transplantation.
In the US, kidney transplantation costs about $100,000. Heart transplantation can approach $1,000,000. These are astronomical costs. Also, many people with fatal conditions sit in highly expensive ICUs in sedated or non-functional states, only to die after using huge amounts of resources when no organ emerges in time. Waiting lists continue to grow, and donation is relatively flat.
I guess the first question that should be asked is, "How can we procure enough organs from donors in order to prevent these long ICU stays?" I think that the answer is simple. Let people pay for organs. Far from the astronomical prices that often come up on the black market, I'll bet that you could solve the entire organ shortage by letting people cover the funeral costs of an already deceased individual in exchange for their organs. In the face of the costs that I mentioned before, this would barely be a blip on the economic radar. It would also overturn this bizarre notion that because organ transplantation saves lives, we should ignore all economic laws of supply and demand when trying to procure organs. Anyone with a decent high school education who has seen a supply and demand curve can tell you that a shortage of a product on the market is probably the result of the price being set too low (in this case $0). At higher prices, there would be more donors.
Now, on the flip side of this, what would all of these donations cost? Because of the socialized, cost-spreading nature of modern medicine, these costs would be directly (or indirectly) borne by everyone. We'll start with kidneys. I recall reading that there are about 90,000 people on the renal transplant waiting list. At ~$100,000 a pop, this would be a cost of about $9 billion in renal transplants alone, neglecting the cost of rejection, medication, lifelong immunosuppression, future hospitalizations, and the cost of training enough extra surgeons to cover 90,000 transplants. An ever-aging population would ensure a steady supply of need, and the list would grow again in no time, making the expenditure perpetual. I realize that there would be savings in dialysis costs, but the longer life expectancy of patients with transplanted organs may offset those savings with an increased need for medical care. Now, apply this to everything from cornea transplants to heart transplants, and the costs will easily soar. If we gave a conservative estimate of close to $50 billion for ALL additional organ transplants, while still ignoring the costs that come afterward, we would increase total costs, almost perpetually, by an amount equivalent to almost 2% of the ENTIRE FEDERAL BUDGET. It would be a MUCH HIGHER percentage of medical expenditure, going to relatively few people at a very high cost per person.
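The back-of-the-envelope math above can be laid out explicitly. The waiting-list size and per-transplant costs are the rough figures from this post; the ~$2.7 trillion figure is my own assumption for the FY2007 federal budget, so adjust it if you have better numbers:

```python
waiting_list = 90_000           # approximate renal transplant waiting list
kidney_cost = 100_000           # rough US cost per kidney transplant ($)
kidney_total = waiting_list * kidney_cost
print(f"Kidney transplants alone: ${kidney_total / 1e9:.0f} billion")
# -> Kidney transplants alone: $9 billion

all_organs_estimate = 50e9      # conservative guess for ALL additional transplants
federal_budget = 2.7e12         # assumed FY2007 federal budget (~$2.7 trillion)
share = all_organs_estimate / federal_budget
print(f"Share of federal budget: {share:.1%}")
# -> Share of federal budget: 1.9%
```

Note that these are one-time clearing costs for the current list; as the paragraph above argues, the list would refill, so the expenditure would recur.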
Now, I am not opposed to organ transplants. I believe that people should be able to pay for them like anything else. I believe that people should be able to get insurance to cover them like anything else, though I have no problem with different policies for those who want to be covered and those who don't want to pay the price of the expensive risk coverage. However, we shouldn't turn a blind eye to the fact that the world of unreciprocated giving that so many see as the ideal is the reason for our shortages. We also shouldn't be afraid to point out that, within the current system, the shortages are the only reason we haven't gone bankrupt.