Hacker News | greygoo222's comments


Different apples. Cessation vs never starting is completely different.

A lot of young people don't drink anymore anyway tho. It's not as extreme as it sounds.

As someone who's been sober for only 16 months, not sure how I feel about this. It is surely unrealistic.


Apples to oranges. It worked (and works) with the advent of Islam in Muslim lands.

Can confirm.

The people behind the website asked a voice agent to program it, and the STT parsed "agent" as "asian."



hahah wrong, I actually have a replacement rule "asian" → "agent" in my Wispr flow dict

was it “secret asian man”?

If someone wants to take Ozempic for cosmetic reasons, that's their business. I am almost certain you personally indulge in riskier activities than using Ozempic, or... modafinil? You know people still use research chemicals, testosterone, and modafinil, right?

Depends on how they work. Many genes that are active during early development are entirely silenced throughout adulthood, or otherwise have no effect.


You can debate what policies are the most fair without calling trans women "men."


> You can debate what policies are the most fair without calling trans women "men."

You're correct - man/woman are gender identities, male/female are biological facts. The more accurate version of that statement (which, btw, is not mine, I am just repeating what the complaints are) is:

"Females don't want to compete with males."

Happy?


Transition changes biology. We don't yet have the technology to fully reverse the effects of male puberty, so there can be reasonable debate about trans women who transitioned after puberty, but early transitioners have no meaningful advantage. Their bodies, in an athletic context, are female.

This is also true for many cisgender intersex women with XY chromosomes. Someone with androgen insensitivity can have XY chromosomes, yet be capable of giving birth. Drawing the line at having a Y chromosome makes no sense.


> Someone with androgen insensitivity can have XY chromosomes, yet be capable of giving birth

People with androgen insensitivity syndrome (AIS) have XY chromosomes but no uterus. So, no, they cannot give birth.


There are athletic sex differences even amongst prepubescent children, mostly caused by the testosterone surge in utero.

See https://womenssportspolicy.org/pre-puberty-male-female-child....


Quite a biased source, no? This doesn't provide evidence that these differences are biological. Boys are much more likely to exercise than girls due to social norms: https://pmc.ncbi.nlm.nih.gov/articles/PMC10478357/.


People with a diagnosis for that syndrome are specifically allowed by the new rules.


> Their bodies, in an athletic context, are female.

I'm sorry, but this is not true. "Puberty blockers" do not completely suppress the effects of male genetics. They only attempt to block certain hormonal effects.

It is not possible to completely block the effects of having male genes by simple hormone modulation.

> Someone with androgen insensitivity can have XY chromosomes, yet be capable of giving birth

We do not determine eligibility for sports classes based on ability to give birth for good reason. It's not a proxy for the genetic athletic differences being addressed by these classes.

Individuals with androgen insensitivity typically cannot give birth. It's an extremely rare possibility, not a typical feature of the condition.


ChatGPT did not make the treatment. ChatGPT summarized literature explaining how to make similar treatments, and then specialized tools were used to make the treatment. It is perfectly reasonable that ChatGPT can summarize literature yet not produce compliant 100-page legal documents.

The principle behind personalized mRNA vaccines is simple enough, and it's perfectly plausible that someone with money and lab access can create an effective treatment not offered through conventional means. It would be plausible even for a human patient right now, with mRNA vaccines still hung up in clinical trials. For a dog? Of course.


> It is perfectly reasonable that ChatGPT can summarize literature and not make compliant 100 page legal documents.

No it's not. Yes, ChatGPT can help summarize literature, but it can definitely also do the heavy lifting when it comes to writing red tape.

I know some people here, after spending too much time reading Ayn Rand, believe that governments are the greatest evil, but come on. Red tape is not some ARC-AGI level of challenge.


The LLM did not design the drug. The LLM summarized some papers on how to design similar drugs, and then a dozen specialized tools were used in an established pipeline to design the drug. You people need to read the article and read the background before writing nonsense based on your assumptions.

Here's a previous comment of mine talking about personalized mRNA vaccines with useful citations: https://news.ycombinator.com/item?id=47210284


Funny how you keep skipping the unbelievable part of the story in your replies: why would he spend three months hand-typing a document when an LLM can definitely produce at least 80% of it in one shot?


Have you ever tried writing a long, complicated document with an LLM? The last 20% takes 99% of the work.


True, and even more true in the case that you barely understand what you're doing. That's a feature rather than a bug of this sort of paperwork; the person who's simply pestered ChatGPT until it says "great idea" won't cross that threshold at all, whereas this guy [and the bioinformatics processing chain and experts in the loop he found] crossed it in his spare time. If it was just the "two hours a night typing" as quoted in the article, LLMs can do it in no time.

"ChatGPT better at finding expert advice than filling in compliance forms" and even "getting workable results from latest generation Open Source bioinformatics tools possible for smart laymen with minimal background reading; learning enough to prove they aren't dangerous only takes marginally longer" doesn't sound nearly as bad as "layman asks ChatGPT to cure his dog's cancer, only hard bit is writing enough words to convince gatekeepers" as rendered by news coverage of this (and not really elaborated on more by the TFA). A rendering which really should trigger people's bullshit filters.

Other fields crossed the "computers can find potential solutions easily" threshold a lot earlier (any idiot can put dimensions into pretty dumb civil engineering tools and get answers that are probably correct; don't ask me how I know!) and actually have higher barriers (no, even if you actually learn the relevant physics as well, you will still need to pass some elements of your home design via someone with the right professional liability insurance linked to their experience and formal qualifications).


> The last 20% takes 99% of the work.

Of course it does, since the first 80% takes literal minutes! But if you compare to doing it entirely manually, it's still 5x more efficient.

Why would you do it all by hand (spending 200 hours in the process…) when you're an “AI entrepreneur”…

In fairness, it pains me to see people as gullible as you are just because you like the idea of the story being true.


You don't understand how the technology in question works, and you're just making shit up because you don't want to admit to being wrong.

What are you alleging here anyways? That all the scientists quoted and photographed in the article discussing their part in making the vaccine are in on the game? That the Australian made the story up wholesale? Come on.


> You don't understand how the technology in question works

See my comment history. I do know very well how language models work. Thank you.

What I don't know is why you're claiming you disagree with the story reported here being bullshit (the story being almost literally “ChatGPT did the heavy lifting and the only reason we can't have nice things is because red tape is blocking humanity”: “he used AI to teach himself about how a personalized vaccine could work, designed much of the process himself. […] The red tape was actually harder than the vaccine creation”), when you know very well it's bullshit because your own comments describe what most likely happened (ChatGPT did nothing much besides explaining what could work and pointing towards which scientists to seek help from).

Again, literally no one questions the fact that mRNA-based medicine has incredible potential. The bullshit here is not about the medicine: it's about red tape being the only bottleneck in a fantasyland where AI solves all the hard challenges.


It's not a radical medical breakthrough, it's applying a technique already documented in the literature and years into human clinical trials. The LLM is just doing literature summary and planning. The most notable AI innovations here are in protein folding and binder prediction.


The LLM didn't oneshot the mRNA treatment, it merely suggested the idea. Most of the steps in the process were done with specialized tools. And no novel treatments were invented wholesale, it's more applying a documented process with existing open-source tools that's just too personalized and expensive to be offered by any vet.

I find this story perfectly plausible.


Why do you find it plausible that a man would spend three months painstakingly hand-typing a 100-page document while claiming he can use an LLM in a way pretty much no one has before him?

It's several orders of magnitude easier to get an LLM to fill out some kind of red tape than it could be to use it in the way he claims to have used it.


He is not using an LLM in some new and exciting way. The process of making a personalized mRNA vaccine looks something like this:

1. Collect and sequence the patient's normal and tumor genomes

2. Predict immunogenic neoantigens from the genomes

3. Generate an optimized mRNA sequence from the neoantigens

4. Create the vaccine from the sequence

modulo some variations, which I wrote off the top of my head because I understand this technology.

Steps 1 and 4 are done by contracted labs. Steps 2 and 3 are doable through open-source computational tools and a little engineering. What does ChatGPT do here? ChatGPT explains the process, finds labs that will do 1 and 4 for pay, finds published algorithms and data for steps 2 and 3. It's barely more complicated than what ChatGPT would do to help a student with their homework.
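To make steps 2 and 3 concrete, here's a toy sketch in Python. The scoring and sequences are invented for illustration; a real pipeline would use binding predictors like NetMHCpan and proper codon optimization, not a one-codon-per-amino-acid table.

```python
# Toy sketch of pipeline steps 2-3: find tumor-specific peptides
# ("neoantigens") and emit an mRNA open reading frame encoding them.
# Everything here is simplified for illustration.

CODON = {  # one codon per amino acid (real designs codon-optimize)
    'A': 'GCU', 'C': 'UGU', 'D': 'GAU', 'E': 'GAA', 'F': 'UUU',
    'G': 'GGU', 'H': 'CAU', 'I': 'AUU', 'K': 'AAA', 'L': 'CUU',
    'M': 'AUG', 'N': 'AAU', 'P': 'CCU', 'Q': 'CAA', 'R': 'CGU',
    'S': 'UCU', 'T': 'ACU', 'V': 'GUU', 'W': 'UGG', 'Y': 'UAU',
}

def neoantigens(normal: str, tumor: str, k: int = 9):
    """Yield k-mer peptides present in the tumor protein but not the normal one."""
    normal_kmers = {normal[i:i + k] for i in range(len(normal) - k + 1)}
    for i in range(len(tumor) - k + 1):
        pep = tumor[i:i + k]
        if pep not in normal_kmers:
            yield pep

def vaccine_mrna(peptides):
    """Concatenate peptides into one reading frame: start codon + codons + stop."""
    body = ''.join(CODON[aa] for pep in peptides for aa in pep)
    return 'AUG' + body + 'UAA'

# Made-up protein with one somatic point mutation (H -> Y).
normal = 'MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ'
tumor = normal.replace('SHF', 'SYF')

peps = list(neoantigens(normal, tumor))
print(peps)                      # the 9-mers overlapping the mutated residue
print(vaccine_mrna(peps)[:15])   # start of the candidate mRNA sequence
```

The real versions of these steps are of course far harder (sequencing error, HLA typing, immunogenicity prediction), but the point stands: they're documented algorithms with open tooling, not something the LLM invents.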

Legal documents, on the other hand? Have you ever tried to get an LLM to do your taxes? It's not easy.


> Legal documents, on the other hand? Have you ever tried to get an LLM to do your taxes? It's not easy.

Taxes are numerical work, which is where LLMs fuck up.

Legal documents are structured text, which is where LLMs shine. Should you blindly trust the outcome? Fuck no, but a good first pass is trivially achievable if you set the right parameters and make sure it's relevant to the right jurisdiction.

