This is, I believe, one of the downsides of empiricism and the fixation on citation rather than observation and reason. The idea that a whole profession came to reject the idea that infants are able to feel pain is astonishing, especially considering anyone who has ever taken care of infants knows that these little human beings react to things which are known to be painful. Babies feeling pain is something that should be self-evident, shouldn't it? It may not be empirically evident, but decisions can still be made without such metrics.
Don't get me wrong, empiricism and citation are valuable. However, a myopic focus on them enables some very twisted conclusions.
We should find sympathy for the doctors here. Administering general anesthesia to a newborn was a terrifying and deadly prospect at the time, so it was much easier to find a way to pretend it was unnecessary - certainly many of these surgeries had to happen one way or another for the baby to survive.
I also suspect that the actual strand of opinion is being fairly uncharitably represented in that article, and the original perception was more along the lines of "they won't feel and process pain quite like adults, and they won't remember it, and they won't develop any pathological fears as a result of it, so anaesthetic is less important as well as riskier", not an assumption babies didn't have senses.
Let's not forget the profession used to perform operations on adults without anaesthetic before they understood safe ways of administering it too.
> they won't feel and process pain quite like adults, and they won't remember it
Mostly it's that second point. Nobody can complain about their treatment if they can't remember having it. That's how childbirth was done in the mid-20th century - you just give the woman memory blockers and don't worry about whether she's comfortable.
That's also how procedures are still done now, even though consensus has moved to the view that the mid-century treatment of women giving birth was an atrocity. When I was getting a stomach biopsy (involving a large tube being shoved down your throat), they told me to take a sedative. I asked if the sedative would help with undergoing the procedure. And they told me no, people are just as unhappy either way, but if I took the sedative I wouldn't remember the procedure afterwards.
There are numerous stories (maybe urban legends) about people suffering psychiatric issues after waking up mid-surgery while being unable to remember it afterwards. I'm curious whether that's true, because it would cast a shadow on the “no memories, no problem” idea.
I couldn't find any stories about people that couldn't remember waking up, but still had psychological issues. How would they even know they had woken up?
The cases I could find were either people under general anesthesia (unconscious) that woke up and can remember at least parts of the procedure, or people under conscious sedation who probably didn't get a high enough dose of memory blockers.
I don't know enough about memory to say for sure, but I suppose it seems plausible to internalize a fear at an unconscious level even if the conscious memories are blocked. Similar to how some people have arachnophobia without ever having a bad experience with spiders.
I'm afraid it's the case that most all surgeries on infants (circumcision) are elective and performed for aesthetic reasons. For males it's likely the only time they will go into shock from pain during their lives.
I find it sad how people universally agree that female genital mutilation is bad, because cutting off parts for no medical reason is bad... but then do the same thing to boys, "just because".
Most people don't have anything remotely close to what you'd call "beliefs", so no consistency is required between one pseudo-belief they might hold and another.
If you want something even more shocking, ask them how they feel about docking a dog's ears or its tail, then ask them how they feel about routine infant circumcision. For maximum cognitive dissonance, inform them that docking ears and tails is done for both aesthetic and purported hygiene reasons.
Well, people don't universally agree that FGM is bad because there are people who practice FGM. Also, most people think MGM is… if not barbaric, at least weird, but are somewhat lenient in the name of religious tolerance. The US is the extremely weird outlier.
Yeah, I've heard that about the US: hospitals in some places present circumcision after birth as a "default option", and parents have to opt out of it. Really sad and barbaric.
Where I live (a small EU country), there were some movements to make it illegal (unless medically needed), but they didn't pass, so as not to offend certain minorities.
> I'm afraid it's the case that most all surgeries on infants (circumcision) are elective and performed for aesthetic reasons
Circumcision is awful, for sure. However am I reading your comment correctly in that you think that "most all surgeries on infants are elective and performed for aesthetic reasons"? Because with my little one, he's had to undergo some pretty invasive procedures in his short life so far that are on the extreme opposite of aesthetic reasoning.
Circumcisions are "most all" surgeries on infants (in the US), since they are sadly done routinely on something like a quarter of them (half of anatomic boys).
In the US, yes… although I wonder whether non-MGM surgeries on infants are rare enough that the large number of circumcisions performed in the US (and a bunch of smaller countries) actually makes them the most common infant surgery worldwide.
True. Lobotomies used to be common practice as well. Even a surgery performed recently, UPPP, for treating sleep apnea is "no longer recommended"... It was only five or ten years ago when they would perform this incredibly invasive surgery, but enough research has shown it's not that effective.
I get the impression that a lot of gynecological care is in the same realm, but I don't have the equipment to have first hand anecdotal evidence. Heard many a horror story about IUD placement / removal, where folks were told that a Tylenol would be enough.
I’ve traded chronic pain stories with women who have endometriosis and I am convinced that it would just be beyond my capacity to cope.
Many women (I’m not sure how many because, of course, it’s not well studied) with endometriosis are ignored for years because it’s assumed that they are experiencing period pain and period pain doesn’t “count” for some reason.
Common layman prejudice aside, it's more likely experience from people who place IV lines regularly. IV placement on Black skin is more difficult for many reasons, and I can easily believe that many get the impression of thicker skin because of other associated factors.
Black people really do have black skin, you know? It's not all racism...
From a brief skim of the literature, the main difference seems to be that darker skin is better protected against aging effects from sun exposure, and better maintains its elasticity (etc.) when exposed to ultraviolet radiation over decades. But less research than might be expected has been done about other differences between the skin of different groups.
* * *
> There exists substantial evidence to support that Black skin has a higher transepidermal water loss, variable blood vessel reactivity, decreased skin surface pH, and larger mast cell granules compared with White skin. Although some deductions have been made about Asian and Hispanic skin, further evaluation needs to be done. Differences in water content, corneocyte desquamation, elastic recovery/extensibility, lipid content and skin microflora, although statistically significant, are inconclusive.
> The thickness of the skin is higher on the cheek compared with the dorsal and ventral forearm, with no ethnic or age-related specificity. We confirm that the sub-epidermal non-echogenic band is a sensitive marker of skin aging, and reveal for the first time that it is less pronounced in African Americans. From OCT images, we bring out evidence that the thickness of the dermal–epidermal junction (DEJ) decreased with age, and was higher in African Americans than in Caucasians. Finally, by comparing US images at 150 MHz with OCT images, we show that papillary dermis thickness can be measured and appears to be quite constant irrespective of age or ethnic group.
> Skin roughness, scaliness and stratum corneum hydration varied significantly in different anatomic areas and age groups. There was no racial variation in skin hydration between any anatomic site, nor significant differences in roughness and scaliness between races, except for the preauricular area. Skin roughness was significantly increased in the aged, compared to the young at the preauricle, volar forearm, lower back, thigh and lower leg. Older women demonstrated significantly more scaling at the preauricle than younger women. Stratum corneum hydration correlated with scaliness. No significant correlation between stratum corneum hydration and skin roughness was observed.
> the thickness of the dermal–epidermal junction (DEJ) [...] was higher in African Americans than in Caucasians.
There are traits that tend to travel next to each other for historical reasons, as well as because they physically are controlled by the same genes. It's important to pay attention to them; one of the facts that really wrecked the reputation of Kim's Convenience (an anodyne Canadian show I've never watched) was that they gave one of the Korean main characters multiple sclerosis, and Koreans almost never get multiple sclerosis. It made the actress playing the character hate the show.
I am pretty sure this is literally about skin thickness. Here is the list of false beliefs polled:
Blacks age more slowly than whites; Blacks’ nerve endings are less sensitive than whites’; Black people’s blood coagulates more quickly than whites’; Whites have larger brains than blacks; Whites are less susceptible to heart disease than blacks; Blacks are less likely to contract spinal cord diseases; Whites have a better sense of hearing compared with blacks; Blacks’ skin is thicker than whites’; Blacks have denser, stronger bones than whites; Blacks have a more sensitive sense of smell than whites; Whites have a more efficient respiratory system than blacks; Black couples are significantly more fertile than white couples; Whites are less likely to have a stroke than blacks; Blacks are better at detecting movement than whites; Blacks have stronger immune systems than whites
Apparently a majority of white laypeople think that black people literally have thicker skin. Among white first-year medical students, 40% believe this, dropping to 25% of white residents.
Though in the narrow context of skin condition, it is true that people with darker skin “age more slowly”, i.e. have less age-related deterioration in skin quality because they are less susceptible to UV radiation. Minority groups in the USA are also more susceptible to cardiovascular disease, though that’s largely if not entirely attributable to environmental factors.
I mean, as you hinted there are at least a few things on that list that are true. We’re in a weird period of time where people are desperate to ignore any differences between races but they absolutely exist as one would expect due to different evolutionary pressures.
I don’t know how you’d go about testing some of those and many shouldn’t matter to doctors, but I personally think the differences between us are fascinating.
> We’re in a weird period of time where people are desperate to ignore any differences between races but they absolutely exist as one would expect due to different evolutionary pressures.
This isn't really the case because 'races' aren't biologically definable categories, and hence have no direct relevance to evolution. There can be correlations between a person's assigned racial category in a given culture and certain biological features, such as e.g. propensity for sickle cell anaemia. However, this is just because there can be loose correlations between this category (which is usually assigned based on superficial features of appearance and cultural characteristics) and the areas of the world where the person's ancestors hail from. No serious scientific study of human difference pays any attention to race, as it is a largely arbitrary and culturally specific construct.
Grateful I turned that shit down in 2012. Even then the evidence against it was overwhelming (basically, it makes sleep apnea worse because scar tissue comes back fierce).
There are some pretty interesting procedures like MSE that aim to expand your skull internally by putting lateral outward pressure on your palate. The immediate effects are a new gap between your two front teeth and greatly expanded breathing capacity. Hopefully. People have various ways of describing the moment the two halves of your palate separate; one described it as "radiant pleasure without actual pleasure".
It was certainly not easy. I took 5 weeks off of work! It’s not in my nature to stop working for so long, so that was a strange experience. Recovery has been good. I look better now, so that’s awesome...a good side effect! I have some remaining “weird feeling” in my lower left chin area, but it doesn’t really bother me. I am 11 months post-op now and get my braces off next month. Overall I’d definitely recommend the surgery if it’s medically warranted. It absolutely was for me.
Glad the procedure was successful for you. We're still trying to find what the best solution might be for my wife, but it gives me hope to hear success stories like this. Thanks!
Congrats I guess? Your wife is an exception to a very well established norm. Losing weight is a treatment so good (as in, effective, safe, cheap) that most doctors will recommend it as the number 1 step in treating sleep apnea.
"What is becoming increasingly clear is that we need to continue to strongly advocate weight loss for all our patients, regardless of the severity of their OSA or adherence to our other therapies. The benefits of weight loss are, to a degree, unquestionable. This study highlights that tangible benefits can be obtained with weight-loss interventions. The challenge, as always, lies in the implementation of our lofty goals."
I don't see your point. Obviously, some of those folks are just technically overweight - BMI at 25 - and aren't going to get as much help even though most of us picture obesity when looking at this. And that still leaves 10-40% that aren't and still have OSA.
> some of those folks are just technically overweight - BMI at 25 - and aren't going to get as much help
Citation needed. They would absolutely benefit from weight loss.
"What is becoming increasingly clear is that we need to continue to strongly advocate weight loss for all our patients, regardless of the severity of their OSA or adherence to our other therapies. The benefits of weight loss are, to a degree, unquestionable. This study highlights that tangible benefits can be obtained with weight-loss interventions. The challenge, as always, lies in the implementation of our lofty goals."
"Results: Relative to stable weight, a 10% weight gain predicted an approximate 32% (95% confidence interval [CI], 20%-45%) increase in the AHI. A 10% weight loss predicted a 26% (95% CI, 18%-34%) decrease in the AHI. A 10% increase in weight predicted a 6-fold (95% CI, 2.2-17.0) increase in the odds of developing moderate-to-severe SDB. "
You are proving the OP's point. Cutting off your uvula? It sounds insane to me, and I wouldn't have the procedure even if a doctor recommended it to me.
There are so many totally insane medical practices that it's shocking we don't question the medical industry as a rule of thumb.
No, the post was attempting to demonstrate how the medical system is effective and works as intended by correcting the use of UPPP, when any sane rational person would never have suggested it in the first place.
I feel like it's the opposite - that, historically, many people (regardless of profession) always generally assumed that certain living things did not feel pain or otherwise suffer for some definition of suffering. This status quo naturally persisted into the era of science until science demonstrated it to be wrong in this case. Should they have known better, sooner? Obviously - I don't fully comprehend what could lead to such a bizarre and terrible assumption. But the problem seems to predate the age of empiricism and medical science.
You can see this all the time right now. People always talk about dogs as though they are philosophical zombies that merely learned to imitate the emotional displays we see from them. I see people talking about animals frequently caveat their intuition by saying things like "I'm sure he's just learned that if he does that he'll get food" or the like.
But the notion that fellow mammals which evolved to live in social groups don't experience any of the same emotions we do despite using the same brain structures in the same contexts is absolutely insane to me. They clearly experience even fairly complex emotions like jealousy of their peers. I can give my child a snack that my dog likes and it's no big deal. But if I were to give the cat that same treat and not my dog, she will flip her shit. Another dog would react similarly. In fact, my dog will jealously guard a treat she doesn't even like from the cat or another dog. Like she won't even eat it in a neutral context. But in the jealousy eliciting context, she will take it and bring it somewhere and guard it.
Like it's insane to me that people think the entire range of our emotions and thoughts evolved solely in humans. There are other social mammals and have been for a long time.
It seems probable that (1) is a precondition of (2), such that we should not be surprised if there are sentient creatures (perhaps including dogs) that are capable of feeling pain, are intelligent, and able to produce apparently emotional displays without having evolved the underlying experience of emotion.
Human emotions are complex and came about as the result of a long period of evolution with an intricate social system built on top of it. The emotions that e.g. dogs show are the result of a few-thousand year history of humans selectively rewarding a particular animal for displaying a certain behavior.
It is, IMO, beyond the pale to suggest that animals like dogs don't experience pain or have subjective experience. (And the same goes for pigs, cows, and most other vertebrates.) That would come as a surprise to some rather intelligent historical figures like Descartes. I don't think it's similarly strange to think that these animals don't experience emotions in quite the same way we do, or that an apparent display of "sorrow" doesn't map 1:1 to an internal experience of sorrow in the way we might expect.
> You can see this all the time right now. People always talk about dogs as though they are philosophical zombies that merely learned to imitate the emotional displays we see from them. I see people talking about animals frequently caveat their intuition by saying things like "I'm sure he's just learned that if he does that he'll get food" or the like.
Dogs are a little bit different from other animals, though. I've always kept dogs and cats (currently have 3x dogs, 3x cats) and every single dog I've had over the years has been able to look at my face to see what I am looking at.
In other words, by looking at my face, they know where to turn their heads to look at what I am looking at, and by my expression, they know when to do it.
This is obviously not a "do action = get treat" type of scenario, and I don't know of any other animal that is in tune enough with humans to do that. To me, that's a very strong indication of higher-order thinking.
> Like she won't even eat it in a neutral context. But in the jealousy eliciting context, she will take it and bring it somewhere and guard it.
All my dogs have done the same; many of them will just bury it somewhere on the property and then forget about it.
There’s a huge difference in brainpower between an octopus and a lobster.
I don’t personally boil lobster alive, but I only place them slightly above plants and bivalves on my personal scale of how bad I feel for killing/eating them.
That's a great example, because I disagreed with them and have no problem consuming anything based on its nervous system or perceptive capabilities,
but then these same people will suddenly have enough cognitive maturity to notice things the same way I could my whole life, and then freak out about consuming them!?
I'm dumbfounded! When was that the line!? Was it only the line because in some cultures the humans were cognitively stunted the whole time!? Can I trust a single thing they say and perceive? Can they even pass the Turing test?
If someone is making it up, that's terrible, but if they're basing it on some evidence it's not crazy. Not everything, not even all humans, have the same types of pain.
> I feel like it's the opposite - that, historically, many people (regardless of profession) always generally assumed that certain living things did not feel pain or otherwise suffer for some definition of suffering.
I'm not sure that's the case. Most people believe dogs feel pain by observing them, for example. Indeed, it is the most immediate common-sense conclusion one can make based on observation. It wasn't until modern philosophy, specifically Descartes, that animals were characterized as zombie meat machines (pure res extensa). Aristotle, by contrast, did not deny non-human animals sensation or pain, as is evident in De Anima, in which he reasons that sensation is in fact necessary for animals (this occurs in the context of determining the necessary faculties entailed by the nutritive soul, the sensitive soul, and the intellectual soul).
I believe it's the same with animal consciousness and intelligence.
It should be self-evident.
Not that it mirrors ours, but that they have something of a similar nature.
You have to remember that centuries ago, some humans assumed black people were, in fact, not human. Not having a soul. Not feeling pain as we do.
I remember the testimony of a person stating he once saw, in Africa, a white mistress asking her black servant to check if the electricity was on... by touching the wire. To her, they were not sensitive enough to care.
I absolutely agree. I think there are many factors that make people look aside. We “believe” what’s convenient to us. What immediately comes to mind - and I’m a meat-eater - is eating meat and harming animals. For animals, we tell ourselves what we need so that we can continue eating meat.
“Babies feeling pain is something that should be self-evident, shouldn't it?”
I have yet to understand the historical underpinnings of this phenomenon, but as someone who deals with chronic pain, doctors don’t seem to be aware that people feel pain _in general_.
I’m being sort of facetious, but any chronic pain community is full to the brim with stories of various medical professionals being skeptical of patients’ pain reports or treating pain like it’s not a particularly important quality of life concern.
I have my own stories about this, of course, and some of them are pretty horrible.
It doesn’t surprise me, then, that we assumed at some point that infants don’t feel pain: after all, if we don’t take the pain of adult humans seriously, then why would we consider the pain of a creature who cannot even directly complain about it?
Even worse for you is how concerns about fentanyl are now a de facto war on pain relief. States have demonized long lists of pain meds, driving people in chronic pain to buy much riskier drugs on the street (something I witness fairly regularly).
States used to blame today's bans on the pill mill problem that was addressed over 20 years ago - but now don't even bother to do that. Now the argument reads like 'because fentanyl exists, you must live the rest of your life in pain'.
To make matters worse, the opioid epidemic is real; regardless of the cause, it’s not just war on drugs hysteria.
It turns out that opioids are terrible drugs. Aside from the obvious habit-forming stuff, there are things like hyperalgesia and central sensitization: long-term opioid use can screw up how your brain processes pain.
I witnessed the decline of opioids in the latter-half of the 2010s firsthand from the patient’s perspective. The response to these problems from the medical community was _sheer panic_. No one was concerned with “what are we going to do next”, it was just “we have to get these patients off of opioids”. Lots of people suffered. There were suicides. There were, as you mentioned, people who moved to street opioids. The whole thing was a fucking travesty.
Rolling up on a decade we’re just finally _starting_ to get a handle on this thing with responsible alternatives for pain management. The stench lingers; I think we will look back on this era with a similar lens to how we see surgeries before the work of Joseph Lister.
Opioids ruin lives. Opioids restore lives. People die from taking them. People are functionally dead when they're unobtainable.
The most false thing about opioids is that they are any one thing.
I've used hydrocodone for months at a time. It's addictive so there are protocols that need to be followed. Stick with the dose and taper when done. However, that was in the past. For severe pain, legal pain relief is no longer available - to me and to millions of others.
My state now has a hard 3-day limit and few doctors are willing to write even that. What has become routine everywhere is being Rx'd Tylenol for moderate to severe post-op surgical pain. I personally had to beg for tramadol following abdominal surgery. I recently had to enter into a protracted negotiation with my doctor, again for tramadol, so I could exercise and get a little more life out of my failing knees.
The pendulum has swung way too far and the number of lives that are being ruined by that is nearly incalculable.
“The most false thing about opioids is that they are any one thing.”
That’s fair. A better way to say what I mean might be “our opioid therapy protocols aren’t very good” or “we’ve relied on opioids in ways that cause more harm than good”.
There are situations where opioids are the obvious right answer. I have a similar anecdote, having to beg for an opioid refill when I was suffering from post-op pain.
While I admit I don’t know the details, I wonder if your knee pain is one of those situations where opioids aren’t close to an ideal solution, but we never actually developed an ideal solution because it was easier to throw opioids at every pain problem. Now opioids are no longer allowed and you get to replace opioids with nothing.
> My state now has a hard 3 day limit and few Dr.s are willing to write even that.
I don’t mean to de-anonymize you, but what state is this? I know state laws vary, but I haven’t heard of anything close to those sorts of restrictions.
Fully agree. Spent 5 years going on and off Tramadol.
If I found myself craving it I would stop for a month. I was basically in bed in pure agony during those weeks off.
I've seen others that literally kidnap their own mothers to get opioids.
It’s because people lie for many reasons. To get out of work, get disability, make a claim, to get opioids because they’re addicted to them etc. As a doctor, you’re probably going to become cynical after a while.
This is very different from a baby because they have little motivation to lie and you can see them reacting to simple painful situations like bumping their head.
So? The first response should always be to take folks seriously. Always. Even if they are really searching for opioids. No doctor should assume folks are lying upfront.
Even if that means that a couple people are getting off work when they don't need it. And I'd much rather have a disability system that occasionally gives the wrong people benefits than one that withholds benefits from someone and makes their life torturous. Much like theft and loss are costs of retail, this should be the cost of a disability system: a few folks take advantage.
Spitballing, but maybe the problem is that no doctor wants to be the guy in town who's the most generous with opioid prescriptions, as then he'll get swamped by people seeking opioids?
People who look at the statistics afterwards, finding this doctor prescribing a significant share of all opioids in the community, are liable to think that he's some sort of unscrupulous stooge of a pharmaceutical company.
That'd cause a race to the bottom of pain medication skepticism. To fix that you'd need some sort of liability reform (as just changing guidelines will never work if it's opposed to a self-sustaining incentive gradient), though I'm not sure what that would actually entail.
It's because of the chilling effect of lawsuits and the risk of losing their DEA license that doctors in the United States are skeptical of issuing pain medication. Americans with genuine pain problems that cannot be effectively treated by local doctors despite persistent good-faith efforts should instead find a doctor in a Mexican border town where pain medication is more readily available.
Wasn't the problem with opioids in the US that doctors already did prescribe them too freely? The fact that there's now a backlash after it caused a massive problem cannot be the reason that doctors have historically been sceptical about pain (and not just in the US but elsewhere where prescription opioid abuse has not been a problem!)
> Spitballing, but maybe the problem is that no doctor wants to be the guy in town who's the most generous with opioid prescriptions, as then he'll get swamped by people seeking opioids?
Nope. In a well-designed system, this would be minimal. Design a better system with better access to medical records throughout the system, lesser liability (in the US, mostly), and paid medical leave so that people can actually heal. The alternative is simply that we leave folks in different levels of pain. Heck, even just being able to see prescriptions nationally would be a bonus. (Right now, in Norway, when I get an electronic prescription, I can go to any pharmacy to fill it and they can see my prescription history.) Innocent people shouldn't have to go to the black market for pain relief because they can't get things figured out legally.
Again, I firmly believe that the first response should always be to believe the patient.
Was being given opioids for chronic pain (nerve pain from autoimmune), but not consistently, so I'd often run out.
I was constantly pushing doctors to try alternate treatments so I could deal with the actual issue rather than take pain meds. They had no interest.
Took me 3 years to finally get a prescription to LDN, which finally worked for me.
> As a doctor, you’re probably going to become cynical after a while.
As my go-to story on these sorts of issues, my partner had an ER doctor tell them that they hadn't been stabbed. The fact that there was an open wound oozing blood and multiple witnesses was not sufficient evidence. That wound was obviously a diabetic sore. That my partner is not diabetic was also a lie, and the fact that their medical records showed no history of diabetes was merely proof of the incompetence of the previous doctors. Stitching the wound had to wait for two blood sugar tests to be performed, because the first test could only have been a machine glitch. After a second negative test for diabetes, the doctor finally conceded that he couldn't rule out a stabbing.
How do some people end up making clearly counterproductive interpretations that ignore not only evidence in general, but clear evidence of harm to others?
I remember a Google employee changed a popular font for headings. It broke the layout on a very large number of websites.
Several people discussed this with the employee, suggesting the font should be forked rather than force-updated. The employee engaged, but refused. Eventually they said that if enough users contacted them with complaints, that would be evidence they would consider. When it was pointed out that it was very difficult to know how to contact them (they were not required to engage on the forum, or anywhere else, and no contact information for them was otherwise findable), they barely shrugged. It was as if part of the logic function in this human was managed internally by a 4-year-old child.
Writers should use the singular “they” in two main cases: (a) when referring to a generic person whose gender is unknown or irrelevant to the context and (b) when referring to a specific, known person who uses “they” as their pronoun.
– apastyle.apa.org
I do not know the gender of the person under discussion, nor do I know if that person has pronoun preferences. I simply followed the "generic person whose gender is unknown" standard.
I've been using "they" in this manner for 50+ years. Never had anyone comment on it before. Are you attempting some kind of meta virtue signalling, considering "pronoun preferences" is a very recently added option?
> As a doctor, you’re probably going to become cynical after a while.
We should probably focus more on finding ways to reduce cynicism and lack of empathy in this profession than fighting false positives in the system.
> It’s because people lie for many reasons.
You should err on the side of caution and default to "they're probably not lying".
Most of us can agree that applying the logic you're describing to the law would be wrong. Imagine that instead of having the presumption of innocence, we'd assume that certain types of people are probably guilty, so we'll punish them pre-emptively.
This already happens, of course, because of racism, xenophobia etc... but we do agree that it's morally wrong and at least we're trying to fix it.
Wondering if it might be possible to measure the pain people are feeling somehow. It’d be interesting to compare a person with a traumatic past vs one without
And if doctors are worried, the first step should be doing actual tests to see if people are experiencing pain. Not making some judgement based on their cynical bias whether someone is lying or being truthful.
It is amazing how much of the medical community states they believe in science, yet they seem entirely disconnected from it at times.
Sure, just like instead of prescribing anti-depressants, we should first do a depression test.
Unfortunately, neuroscience is not at a level where we can actually perform these tests. There is no objective measure to see if someone is in pain, or depressed, or suffering many other psychological symptoms.
Actually, there are tests now for depression. And yes, you can do objective tests like brain imaging to see people's responses to stimuli, among other things. What should not happen is for doctors to meet someone for 20 minutes, do nothing but talk to the person, and say, "oh, I don't trust them, they must be lying." Except that is what doctors do all the time, and there is no scientific basis for their decisions.
Can you point to some tests for depression? A quick Google only turned up some studies, nothing approved and usable in clinical settings (and one of the studies had a test that couldn't differentiate between depression and bipolar disorder).
Similarly, brain imaging for pain is at the research level, and may come with several other risks. I doubt there are such tests that could be used to conclusively prove that someone is or is not in pain.
There are blood and urine tests for serotonin, but studies are mixed on how reliable they are. Since depression can also be accompanied by other factors, though, there is a wide range of tests that can help provide objective evidence.
Sounds expensive, wouldn't it just be easier to trust them? I think the bigger problem is if a doctor doesn't know what's wrong or how to fix it, then they disengage.
How can you, with an objective test, determine whether or not I am experiencing chronic pain, or if I'm just pretending to be experiencing chronic pain?
Brain imaging would be a good example, but there are also cortisol screenings, among several other tests, that can help determine that someone is actually in pain.
The same goes for fatigue or anything else they can’t actually test for. I would guess it mostly stems from them not having a way to objectively judge it. My pain tolerance is way higher than my wife’s, in part because I suffer from cluster headaches. 8/10 for me might be a broken bone. For her it might be a relatively superficial cut.
So what’s a doctor supposed to do when someone says something really hurts? Hell if I know.
> as someone who deals with chronic pain, doctors don’t seem to be aware that people feel pain _in general_.
I don't know your situation, but I wonder if it seems this way because doctors don't have many options for treating chronic pain. There are medications for acute pain, but when you use them chronically over many years—well, that's how we ended up with the opioid epidemic.
That's not how we ended up with the opioid epidemic. We ended up with the opioid epidemic when doctors working in concert with drug companies engaged in criminal fraud to prescribe medication far beyond its normal dosages and in cases where it was not necessary.
It was all of the above. Some doctors were prescribing them at higher doses than necessary; others were prescribing them in normal doses for longer than is safe. I am sure that at least some of these doctors were driven by compassion for their patients.
I think you are correct, and that behavior is to the patients’ detriment.
Personally, I worked myself into the ground trying to both advance my career and to manage increasingly alarming health issues.
I would have benefitted from some “real talk” but that’s not baked into the contemporary western transactional model of medicine outside of those areas that routinely deal with terminal illness.
At least it's changing these days now that THC is increasingly legal. Boomers have proved you can take it for six decades straight and be fine. Pro sports players are using it after getting beat up in games or practices too.
I've been taking it for about 25 years and I am fine too... sort of.
It can still become addictive, but only after tremendous amounts consumed over a long time, amounts that don't make sense unless you just want to fuck yourself up as much as possible. Plenty of folks still end up there, though.
But it's one of the easiest addictions to shake, compared to, say, cigarettes, alcohol, or well, anything else (I would say even sugar is more difficult to wean off since it's everywhere, joints not so much for now).
It has positive effects - pain management I guess, especially with edibles, but for me it's the change of perspective/mindset on life matters, much higher creativity, michelin star level of taste experience even from relatively cheap food, and that sweet sweet high that is just so relaxing.
But it will make you overall less patient, which sucks. For parenting, for work, for life in general. Also, the morning after is much better than after an alcohol binge, but mental gears running at 20% isn't something that helps produce quality work. Plus most consumption methods mess up your lungs pretty badly, though quality vaping is, I think, good enough to not shorten your life considerably.
Individual doctors are rarely deserving of the implicit trust and praise they receive from proximity to the stereotypical hero doctor on this season's popular medical drama.
I went from a general naive faith in the profession to caveat emptor.
It is truly insane the things that doctors routinely recommend without doing any sort of real review. Get any of the asleep-at-the-wheel health organizations to support your finding and plenty of doctors will recommend even the laziest of new ideas with passion. General downward trend in SIDS over a decade? It's probably because babies that wear footie pajamas are less likely to get bad humors. You're a bad parent if you don't get Shekol brand sleepers.
I don’t think the AAP recommends footie pajamas to prevent SIDS. What you say may or may not happen, but your point would be a lot more poignant if you used a real example rather than contrived nonsense. If the issue you're talking about is widespread, then it should surely be easy to use a real example.
Criteria for c-section: rates among doctors are 1/3 vs the 8% among midwives. Baby in bassinet and baby on back: SIDS is linked to improper sleep hormone regulation, not sleeping on your stomach. Breastfeeding only once every 4 hours during the day and giving a bottle the rest of the time. Supplementing with formula when mothers are trying to exclusively breastfeed. Waking the baby up to breastfeed despite no weight concerns. Circumcision. Prescribing amphetamines to prepubescent children. Prescribing hormone blockers to prepubescent children. Prescribing thalidomide to pregnant women. Continuing to gamble with mass casualties by railroading experiments that have not been adequately tested relative to the subject size (like Tdap for pregnant women or the COVID jab you probably got).
Doctors as a whole are not given the allowances they need to think about what they're doing. Today, they are little more than billable entities that follow someone else's flow chart to serve you McHealthcare as "efficiently" as possible.
My stock joke is that if the doctor or nurse says it won’t hurt, it will hurt at least a little. If they say it will hurt a little, it will hurt a lot. If they say it will probably hurt, time for serious anesthesia.
Also, there’s pretty much no such thing as minor surgery except maybe skin tag removal.
Lucky you, if you don't have genetics where anesthesia doesn't work properly. Then you have to find a doctor who accepts that problem and gets it over with: painful but fast.
The hardest part is if you have a dentist who insists that anesthesia must work, but it won't, so you must act your ass off to pretend not to be in pain, so he won't push syringe after syringe, doing nothing and waiting for 15 minutes in between while your tooth is already open with the root canal drills in.
So you’re one too. No dental anesthetics work on me. I have to go under general anesthesia, where you’re knocked out so deep a machine has to take over your breathing. So every procedure starts at about $10,000 and that’s not counting the dental work. All out of pocket. No insurance coverage.
As someone with very low tolerance for pain, this seems like my worst nightmare. I’m curious though, did your tolerance for pain increase over the years as you had to endure it?
Decreased, if any. I think it’s probably because I know nothing has ever worked. FWIW I am old and get a major operation every two or three years: hernia, cancer, hydrocele, etc. I have no problem dealing with that kind of pain, even when it is considerable, which is often true post-op.
The likelihood of someone understanding something is inversely proportional to the impact on their ability to pay the mortgage.
Pain is recognized as “bad” innately. So if you can assert that no pain exists, you’re not accountable for causing, preventing or doing anything about it.
Doctors caring at all about pain is relatively new. When medicine started it was just about keeping you alive not keeping you out of pain.[1]
There is a fantastic book that is half about the history of pain medicine and half about a small region in mexico that distributes a lot of the heroin in the US. Dreamland[2]
The roots of the practice are not very clear, but it seems that when operating on infants, the risk of death from anesthetic is very high. And historically, anesthetics were given to induce muscle numbness (so they don't thrash around).
The argument is not whether they react to stimulus we consider painful, but if they feel it in a way that mature humans understand as pain. Babies take months just to understand that they have hands, they're incredibly simple minded, so the question was whether they even had a real concept of pain beyond "stimulus and instinctual reaction". Similar to how an insect will react to pain but it's just doing what it's programmed to do, there's no higher level thinking associated with it.
Completely agree. I suspect this wasn't just empiricism; it was also behaviorism, which purported to be empirically based but added a theoretically motivated (and ultimately empirically unsupported) skepticism about using minds to explain the observations of animals. If you don't think babies have minds, and pain is a mental state, then this insane position follows quite logically.
It’s the same kind of insane lack of critical thinking that led to people being buried alive because no one thought to check for a heartbeat, despite heartbeats being known about since… forever.
The thing that bothered me the most wasn't the awareness of pain in babies, it was the answer "they won't remember" when I mentioned the pain argument.
>little human beings react to things which are known to be painful. Babies feeling pain is something that should be self-evident, shouldn't it
Perhaps this is just semantics, but I think "feeling pain" implies a higher level of cognition than simply reacting to things which are known to be painful. Like the classic toddler move of painting with their feces. They're obviously physically capable of smell but it just doesn't seem to register the same way.
Smell being good/bad seems to me to be learned. Go to Taiwan and smell stinky tofu (臭豆腐). To most non-Taiwanese it smells like cat poop or sewer water. Eating it as a non-fan feels like I'm eating in a dirty public restroom where the food itself doesn't taste bad, but the smell from the overfull toilet next to me is off-putting. Similarly shrimp paste. Many cheeses also smell pretty bad.
All of them stop smelling so bad once you train yourself to enjoy them. Is the smell of poop any different?
And if you go to Taiwan and look around, empirically you can see that people try to avoid being immediately downwind of a stinky tofu cart. (It’s pretty tasty, mind you, but IMO not worth the hassle.)
Sure, but what about other examples. Tons of people like blue cheese. But it smells bad. It doesn't have "smells bad" in the name. Neither does "durian" 榴莲
But these aren't relevant examples. The claim was that people learn to think of certain smells as "good" or "bad". The example given was 臭豆腐, which makes no sense because it is thought of as smelling bad.
You don't even claim that your examples are thought of as smelling good. You don't have to have "smells bad" in your name in order to smell bad.
The lesson to draw here is that people are willing to eat things that both (1) taste good, and (2) smell bad. If you like blue cheese and it smells bad... how is that a counterexample?
(I'm open to the idea that people disagree on whether durians or blue cheese smell bad. But I'm less open to the idea that their opinion of the smell is learned - from what I've heard, some people think durian smells disgusting, and other people never saw anything wrong with the smell, but I haven't heard of people who initially thought the smell was disgusting but grew to like it.)
People tend to instinctively anthropomorphize, and this instinct is particularly strong when it comes to our offspring. Just because it seems conscious doesn't necessarily mean that it is.
That's a fair question. And I'm not saying it is wrong, only that it might be. Let me give a slightly less fraught example: we have an instinct to "honor our dead" and not to "desecrate their bodies." Does that make sense? The dead person doesn't care. Does it even make sense to talk about a dead "person"? One might argue that no dead thing can be a person, it's just a (dead) thing that was once a person and still happens to look like a person but isn't actually a person any more.
Likewise, a baby may look like a person, may even behave in some ways like a person, but not yet actually be one.
Again I have to emphasize: I am not saying this is the case, only that it is a possibility that needs to be taken into account when doing the moral calculus.
> Again I have to emphasize: I am not saying this is the case, only that it is a possibility that needs to be taken into account when doing the moral calculus.
An important thing to note about babies is that at some point in the past few months they objectively couldn't feel anything, despite having a human body.
So the question is when those various things switch over, not if.
> An important thing to note about babies is that at some point in the past few months they objectively couldn't feel anything, despite having a human body.
Is that actually true (if by "human body" you mean a humanoid body, rather than the vacuous-in-this-context sense of a body of a human)? I wouldn't be surprised if the ability to feel developed earlier than the human-like body.
Do people instinctively anthropomorphize? I bet the existence of that instinct has far less empirical support than the pain consciousness of newborn babies. Be that as it may, these comparisons to parents' "anthropomorphising" of their kids are not apples-to-apples.
Chatbots and machines programmed to show feeling responses are mimicry. People can be tricked. If you want to fool ships with a fake iceberg, you won't bother with more than the tip. But if you're a glaciologist, examples of sailors getting spooked by fake plastic iceberg tips would not lead you to question whether natural icy marine bodies (especially ones that usually develop into icebergs!) have any mass below sea-level. Or if you're studying carcinization, the fact that robot crab decoys can pass for real tells you exactly nothing about why unrelated phylogenetic lines independently develop crab-like features.
And the "people" in "dead people" refers to real people, who die. A corpse is a "dead person" like a puddle of water is "melted ice". Mourning, including respectful treatment of remains, is symbolic. None of that is anthropomorphism.
It is established that human babies develop into human adults. Their pain and other distinctly human responses resemble the adult forms profoundly. Are babies "a system [that] responds to painful stimulus" but feels no pain? Occam's Razor says no. I see zero downside to reckoning that possibility out of the moral calculus.
> I bet the existence of that instinct has far less empirical support than the pain consciousness of newborn babies.
I'll take that bet. Why do you think human children play with dolls?
> People can be tricked.
Indeed.
> A corpse is a "dead person" like a puddle of water is "melted ice".
You're kind of making my point for me here. A corpse is an inanimate object. An inanimate object cannot be a person. A corpse was a person, but when that person died it ceased to be a person just as when ice melts it ceases to be ice. When a person dies, when ice melts, they cease to exhibit any of the distinguishing properties that made them a person or that made them ice.
> It is established that human babies develop into human adults.
Sure, but that in and of itself does not make them people any more than the fact that people die makes live people corpses. A live person will some day become a corpse, but while it is alive it is not yet a corpse. A baby will some day become a person, but while it is a baby it may or may not be a person.
> Their pain and other distinctly human responses resemble the adult forms profoundly.
That's simply not true. A baby's response to pain is all but indistinguishable from its being, say, hungry, or just in a bad mood, whereas in adults these are pretty easily distinguished. But even if it were true, so what? You could make an automaton that mimicked adult responses to pain, but that automaton would not feel pain. You can't conclude anything about subjective experience from I/O behavior alone.
Look, I'm not saying that babies don't feel pain. I think they probably do. All I'm saying is that it's not the slam-dunk that many people here seem to think it is.
> Why do you think human children play with dolls?
Do children "play" with dolls? I don't think that is a slam-dunk. As far as we know, children may be mindless automata and what we call "doll-play" may be unfeeling mirroring of adult anthropomorphizing behavior absorbed through something like blindsight.
> You're kind of making my point for me here.
Not at all. My point was that calling a puddle "melted ice" has nothing to do with mistaking it for any kind of ice or ascribing characteristics of ice to it. Likewise with "dead people".
> A baby's response to pain is all but indistinguishable from its being, say, hungry, or just in a bad mood
That's simply untrue, even in neonates (not to mention babies several months older). Look into the abundant research applying the Facial Action Coding System and derived systems to infants and their pain vs. other distress responses.
> You can't conclude anything about subjective experience from I/O behavior alone.
This is a denkverbot, not productive skepticism. We can draw robust, valuable conclusions about subjective states without enjoying deductive certainty.
> Do children "play" with dolls? I don't think that is a slam-dunk.
Seriously?
> As far as we know, children may be mindless automata
Yes, that's possible. You can't prove that humans in general are not philosophical zombies. But I have memories of being self-aware around age 4 or so, so from my own personal experience I put that as an absolute upper bound on where humans might not be persons yet.
On the other hand, I also remember exhibiting I/O behavior that would cause an outside observer to conclude that I was experiencing much more intense pain than in retrospect I was actually experiencing.
> Facial Action Coding System
Of course it is possible to create a taxonomy of facial expressions and how those correlate to physical stimulus. What you can't do is know the subjective experiences that those expressions correspond to because no adult knows what it is like to be a baby. If anyone actually knew that, we would not be having this discussion at all.
> We can draw robust, valuable conclusions about subjective states without enjoying deductive certainty.
I don't deny that. What I'm saying is that you cannot draw those conclusions simply by observing a system's I/O behavior. You have to take the underlying mechanism into account. If you doubt this, look at this photo:
The subject of that photo appears to be in distress, but it is not in fact in distress. The way I know this is that the subject of that photo is actually a doll (look at the eyebrows), and dolls cannot be in distress. So there is an existence proof that merely because a physical system appears to be in distress that is not enough evidence to conclude that it is in fact in distress. You need something else.
The identity of someone is not just their body, and, while they may not be aware of how their body is being treated, an assault on their body can still be quite traumatic for those who do remember the person and in whom their identity still lives.
Yes, indeed, and I don't want to diminish in any way the value of mourning the dead. My point is just that that value accrues to the living, not to the dead.
So there may well be value in minimizing the pain of infants even if they don't actually experience pain. The policy decisions do not turn exclusively on the answer to the philosophical question.
But it's not a no-op either, because if you are going to (say) administer anesthetic to a baby you'd better be sure that the net benefits outweigh the costs to the baby.
This borders on solipsism. The same argument can be made about any human being. At the very least, we presume there is pain by observing the behavior of animals in response to stimuli and interpreting what we see analogically. And in light of certain reasonable assumptions (like "natura nihil frustra facit"), we can quickly rule out a host of preposterous interpretations and conclude that the sensation felt is painful.
We don't need to factor into the moral calculus the possibility that no pain is felt, since the most reasonable and rational conclusion is that infants do feel pain. What reason could there possibly be to believe that they don't that isn't some stretched exercise in special pleading and evasion? The baby is human!
I think you're misunderstanding what the OP is talking about. They aren't defending the practice, just that it's not necessarily an obvious answer to a question that should not be asked.
> the most reasonable and rational conclusion is that
In the context of science, this statement is a very dangerous one to make. There's many many many many things that people thought were "reasonable and rational conclusions" that turned out to be completely wrong after some actual rational experimentation and application of the scientific method.
It doesn't even make sense. What would it mean to anthropomorphize a human? It is a given and can't be fairly characterized like that. The only way to characterize it otherwise is to dehumanize.
I guess the word "anthropomorphise" here is conflating many understandings of "human" and humanity. In the same way: is a bunch of cells, an embryo, or just a tissue sample a human? Is an organ a human? Is a developing fetus a human? How many properties does each share with a typical example of a human?
There's a spectrum of biological complexity and other attributes we associate with the idea of "human". The question then is where does a baby fit on that spectrum.
The general idea of anthropomorphising, here, is to attribute characteristics and experience that we only know from our own experience as conscious, self-aware creatures capable of complex cognition and complex communication of that experience.
Babies get upset when you hurt them, and they communicate it bluntly. Conversely, they are unable to demonstrate “complex” communication. Behold, I have placed the babies ;)
You do know what the definition of anthropomorphize is right?
> attribute human characteristics or behavior to (a god, animal, or object).
As far as I'm aware, baby humans don't start off as non-humans and magically transform into a human at some age where they understand more. So it seems fitting to attribute human characteristics to humans. That is not anthropomorphizing.
If you ask someone to describe something generally anthropomorphic, the description would only tangentially fit a human newborn. “The weather is angry”, “trees strive to the light”, “this car is pretty demanding” are all examples of it. Not only can newborns not express clearly what they strive for or are angry about, they (likely or arguably, whichever) have no concepts of that in their minds, but we tend to attribute these to them either way.
Idk if commenters here are being nitpicky or not, but I recognized the sense of gp’s terminology without a doubt.
So because you couldn't find a single word to convey your meaning, you chose to shoehorn a semi-related word into the sentence in the hope that every reader would understand your tortured meaning? Why not use the more accurate multi-word construct? HN doesn't charge per word, last I checked.
You say "the more accurate multi-word construct" as if it were obvious that there is a single "accurate multi-word construct" that is already part of the English corpus that I could have employed here. If there is, I'm not aware of it, so I would have had to invent one. That would have taken considerable effort. Inventing new language constructs is not so easy, especially for subtle and emotionally fraught concepts like this.
And yes, I thought people would figure out what I meant because I provided three references, which were the actual substance of my comment. I thought people might read at least some of those and figure out what I meant.
Also, since you've decided to go all language-nazi on my ass, I looked up the dictionary definition of "anthropomorphize". There are two of them:
1. to attribute human form or personality to
2. to attribute human form or personality to things not human
So it is absolutely possible to anthropomorphize a human baby, and all the people who have criticized me for using that word can go eff themselves. Democracy and human civilization are crumbling and the best thing you can find to do with your time is criticize my choice of vocabulary, and get it wrong to boot? Bah to that. Life is too short.