
What they’re describing is a polymorphic virus. A great analogy for SV startups.

It works great in assembly, not so much for higher level languages.


Is all polymorphic code virii?

Not necessarily, but in practice no one has any use for the technique except to obfuscate viruses, with the exception of academic research.

The nonvirus equivalent is JITs, which are present in all major browsers and tons of other runtimes, but even they have no use for self-modifying polymorphic code beyond the theoretical level (they do use polymorphism extensively, but at the type level).
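For context, a minimal sketch (not from the thread, all names invented) of runtime code generation, the benign cousin of polymorphic code: a JIT emits specialized code at runtime rather than rewriting itself to evade detection.

```python
# Hypothetical illustration: generate a specialized function at runtime,
# the way a JIT specializes hot code paths.

def specialize_power(n: int):
    """Build a function computing x**n by emitting source at runtime."""
    # Unroll the exponent into a chain of multiplications, e.g. x * x * x.
    body = " * ".join(["x"] * n) if n > 0 else "1"
    src = f"def power(x):\n    return {body}\n"
    namespace = {}
    exec(compile(src, "<generated>", "exec"), namespace)
    return namespace["power"]

cube = specialize_power(3)
print(cube(2))  # 8
```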


That’s not even the complex part. Most of what you describe is a user interface issue, not a geometric kernel issue.

The hard part of three-corner fillets is tolerances. Each of those fillet operations carries its own compounding floating-point errors, and when they meet, the intersection is so messy that the surfaces often do not intersect at all. This breaks almost every downstream algorithm, because they all depend on point classification to determine whether an arbitrary point is inside the manifold, outside it, or sitting on an edge or vertex.
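A toy sketch (invented for illustration, not from any kernel) of why point classification needs tolerances: a chain of mathematically-neutral operations still accumulates rounding error, so exact "on the surface" tests stop working.

```python
# Hypothetical illustration: classify a point against a plane (reduced to
# 1D for simplicity) after a chain of operations. Each operation adds a
# little rounding error, so exact equality tests fail and an epsilon is
# required -- and every operation's epsilon compounds with the next.

def classify(point_z, plane_z, eps=0.0):
    d = point_z - plane_z
    if abs(d) <= eps:
        return "on"
    return "above" if d > 0 else "below"

z = 0.1
for _ in range(10):          # ten "modeling operations"
    z = (z * 3.0) / 3.0      # mathematically a no-op, numerically not
    z = (z + 0.2) - 0.2

print(classify(z, 0.1))            # exact test: likely not "on" anymore
print(classify(z, 0.1, eps=1e-9))  # tolerant test: "on"
```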

And that description of the problem is just scratching the surface. Three-corner fillets create a singularity in UV space at the common vertex, so even once you solve the tolerance problem you still have to deal with the math breaking down and a combinatorial explosion of special cases, almost every one of which has to be experimentally derived.


when i did openscad, i just did a minkowski hull with a 4-sided bipyramid (aka a rotated cube) to get chamfers for my cubes.

bonus: a minkowski hull with a round pyramid adds chamfers in the vertical and fillets in the horizontal, which is what i want for 3d printing most of the time. additionally it closes small overhangs, and it makes fonts smoother (i.e. fonts don't extrude at a 90-degree angle, they get 45-degree walls instead, and print better on vertical faces)
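A rough 2D sketch of the Minkowski trick described above (pure Python, geometry invented for illustration): summing a square with a small diamond, the 2D cross-section of a bipyramid, replaces each corner with a 45-degree chamfer.

```python
# Hypothetical 2D demo: the Minkowski sum of a square and a small diamond
# chamfers the square's corners, turning 4 vertices into 8.

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices (collinear removed)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def minkowski_sum(a, b):
    """For convex polygons: the hull of all pairwise vertex sums."""
    return convex_hull([(p[0]+q[0], p[1]+q[1]) for p in a for q in b])

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
diamond = [(1, 0), (0, 1), (-1, 0), (0, -1)]   # cross-section of a bipyramid

chamfered = minkowski_sum(square, diamond)
print(len(chamfered))  # 8: every square corner became a 45-degree chamfer
```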

disclaimer: I haven't used openscad for about a year and my memory may be fuzzy

edit: i am not saying a minkowski hull would directly solve your problem, but maybe the algorithm gives you inspiration for solving your numerical issues


OpenSCAD is mesh-based, so it's not even in the same universe as a proper B-rep geometric kernel. Everything is easier when you give up on the math entirely, but that's not good enough for real-world manufacturing and simulation.

All of the major commercial geometric kernels have been working on these problems for thirty years, and I'm sorry, but your five minutes of experience with a glorified tessellator isn't going to make progress on long-standing computational geometry problems.


Join SolveSpace development? ;-)

This is why geometric kernels are the gateway to madness. ;) Thanks for the clarification.

> I know there is research out there (can't dig it up at the moment), but the goal would probably be to generate a robust geometric query for a selected item, so that small changes in the model don't affect which edge gets selected after subsequent operations.

There is quite a bit of research showing that this is impossible in general. No matter what algorithm or heuristic you use, the moment symmetry is introduced, the query breaks down. The only way to resolve those cases is to surface them to the user as an underspecified constraint, and no geometric kernel is well designed to do that.
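A toy sketch (entirely invented, far simpler than any real kernel query) of why symmetry breaks such heuristics: a "nearest edge to this point" query has two equally good answers on a mirrored model, so whichever tie-break you pick flips with the tiniest perturbation.

```python
# Hypothetical illustration: a naive edge-selection heuristic that is
# unstable under symmetry. Two edges are mirrored about x = 0; the
# reference point sits exactly on the mirror plane.

def nearest_edge(edges, ref):
    """Pick the edge whose midpoint is closest to a reference point."""
    def dist2(e):
        mx = (e[0][0] + e[1][0]) / 2
        my = (e[0][1] + e[1][1]) / 2
        return (mx - ref[0])**2 + (my - ref[1])**2
    return min(edges, key=dist2)

left  = ((-2, 0), (-2, 2))
right = (( 2, 0), ( 2, 2))

print(nearest_edge([left, right], (0.0, 1.0)))   # tie: min() falls back to list order
print(nearest_edge([left, right], (1e-12, 1.0))) # a negligible nudge flips the answer
```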


What strategy are you using for tolerances, compounding errors, and the nuances of floating point math?

This was already common in tech for Series C+ fifteen years ago when I raised a round. Once you’re talking tens or hundreds of millions, almost everyone wants milestones and tranches instead of giving all the money up front.

That’s what a source map is. It’s included in debug builds so that browser debuggers (and other tools) can step through the original code, comments and all, instead of the compiled JavaScript (which, back in the day, could become an undecipherable mess of callbacks if you were transpiling async/await down to the legacy Promise API).
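A minimal sketch of why shipping a source map leaks source (file names and contents invented for illustration): the map's optional "sourcesContent" field embeds the original files verbatim, comments included.

```python
# Hypothetical source map (Source Map v3 format). The "sourcesContent"
# array carries the original, pre-compilation source text verbatim.

import json

source_map = json.loads("""
{
  "version": 3,
  "file": "app.min.js",
  "sources": ["src/app.ts"],
  "sourcesContent": ["// TODO: remove internal API key fallback\\nexport const x = 1;\\n"],
  "names": [],
  "mappings": "AAAA"
}
""")

# Anyone who downloads the .map file can simply read the original code back out:
for path, text in zip(source_map["sources"], source_map["sourcesContent"]):
    print(path)
    print(text)  # the original TypeScript, comments and all
```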

Unfortunately, in many bundlers making a mistake like this is as easy as an agent deleting "process.env['ENV'] === 'debug'", which they'll gladly do if you point them at a production or staging environment and ask them to debug the stripped/compiled/minified code.


I see. I had read that it was a source map that was leaked here specifically, but my vague understanding of the term was mostly that it might be a way to trace back JavaScript lines to the TypeScript it compiled from, since I don't have much of an understanding of all of the other various steps that are part of a JavaScript build nowadays.

I think I still disagree with the parent comment premise that "they probably thought minifying was enough", since it sounds likely they were doing all of those other steps. The issue seems like insufficient auditing of the build process (especially if agents were involved, which seems likely for Anthropic) rather than not doing all of the usual JS build stuff.


It learned by reading HackerNews, after all.

Called it!

> To be honest, after reading some of these microplastics papers I'm starting to suspect most of them are bullshit. Plastics are everywhere in a modern lab and rarely do these papers have proper controls, which I suspect would show that there is a baseline level of microplastic contamination in labs that is unavoidable. Petri dishes, pipettes, microplates, EVERYTHING is plastic, packaged in plastic, and cleaned using plastic tools, all by people wearing tons of synthetic fibers.

> We went through this same nonsense when genetic sequencers first became available until people got it into their heads that DNA contamination was everywhere and that we had to be really careful with sample collection and statistical methods. [1]

[1] https://news.ycombinator.com/item?id=40681390


I haven't really read the studies, but shouldn't they have negative controls to account for these effects? Wouldn't that let the authors correct for a baseline contamination level in the lab?

That was the difficulty with DNA: how do you construct that control when everything is contaminated and minor variations in protocol (like wafting your hands over the samples one too many times) change the baseline?

It took years to figure out proper methods and many subfields have their own adjusted procedures and sometimes even statistical models. At least with DNA you could denature it very effectively, I’m not sure how they’re going to figure out the contamination issue with microplastics.


I have worked at a sequencing center before. DNA contamination is easier to mitigate because the lab disposables aren't made out of what you are testing. Disposables are almost always plastic and tend to have minimal DNA contamination. Environmental DNA contamination is largely mitigated with PCR hoods and careful protocols/practices. However, these procedures don't mitigate DNA contamination at the collection level, which is likely where the statistical models you mentioned help.

I can't imagine wafting your hands over the tubes would change the plastic amounts substantially compared to whatever negative controls the papers used. But again, I am not an expert on this kind of analytical chemistry. I always worry more about batch effects. But it does seem like microplastics are becoming the new microbiome.


On the one hand, hundreds or perhaps thousands of studies might be wrong. On the other hand, this one might be wrong. Who's to say?

Not even that! This study doesn't even say contamination is causing overestimation. It says that it's possible.

But as mentioned elsewhere in the thread, everyone knows that it's possible and takes measures to mitigate it.

A paper that said those mitigations were insufficient or empirically found not to work would be interesting. A paper saying "you should mitigate this" is... not very interesting.


> Not even that! This study doesn't even say contamination is causing overestimation. It says that it's possible.

From the article:

> They found that on average, the gloves imparted about 2,000 false positives per millimeter squared area.

I dunno, that seems like a lot of false positives. Doesn’t that strongly imply that overestimation would be a pretty likely outcome here? It sounds like even a completely sterile 1 mm^2 area would raise a ton of false positives from the gloves alone.


The way you mitigate this is by using negative samples: blank swabs/tubes/whatever that don't contain the substance you're testing for, but that are handled the same way as the real samples.

Then the tested result is Actual Sample Result - Negative Sample Result.

So you'd expect a contaminated microplastic sample to measure 2,000 plus N per mm^2, where N is the actual result of your test.
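In back-of-the-envelope form (the measured value below is invented; only the 2,000/mm^2 baseline comes from the article):

```python
# Hypothetical blank subtraction: the negative control (a blank handled
# with the same gloves) sets the baseline, which is subtracted from the
# raw sample measurement.

glove_baseline = 2000      # particles/mm^2 from the blank, per the article
measured_sample = 2450     # particles/mm^2 from the real sample (invented)

corrected = measured_sample - glove_baseline
print(corrected)  # 450 particles/mm^2 attributable to the sample itself
```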


That has happened many times in scientific research. The aforementioned fad in DNA sequencing was one such case, where tons of papers published before proper methods were developed turned out to be entirely useless, essentially just garbage data. Another case is fMRI studies before the dead salmon experiment.

> Plastics are everywhere in a modern lab and rarely do these papers have proper controls, which I suspect would show that there is a baseline level of microplastic contamination in labs that is unavoidable. Petri dishes, pipettes, microplates, EVERYTHING is plastic, packaged in plastic, and cleaned using plastic tools, all by people wearing tons of synthetic fibers.

Maybe so, but plastics are also everywhere in our daily lives, including on the food we eat and in the clothes we wear. As we speak I just took some eggs out of a plastic carton, unwrapped some cheese from plastic wrap, and got oatmeal out of a plastic bag. The socks and pants I'm wearing are made of polyester.

If plastics cause contamination in a lab, would you not also expect similar contamination outside of the lab?


You would, but if you're doing studies that claim microplastics accumulate in our bodies or even in our brains, it makes a difference.

I think you underestimate how much plastic is consumed in a lab doing experiments or analysis. I suspect it's an order of magnitude or two more than people are regularly exposed to at home or other non-industrial settings.

When I was an automation engineer at a lab, each liquid handler alone could go through several pounds of plastic pipette tips in a single day. All of that is made out of plastic and coated in a different thin layer of plastic to change the wettability of the tip. Even the glassware often comes coated in plastic and all these coatings are the thin layers most likely to create microplastics from abrasion (like the force of the pipette picking up the tip!). Throw all the packaging on top of that and there is just an insane amount of plastic.

The only place I've seen more plastic consumed is industrial and food manufacturing where everything is sprayed and resprayed with plastic coatings to reduce fouling.


> That’s not to say that there is no microplastics pollution, the U-M researchers are quick to say.

> “We may be overestimating microplastics, but there should be none. There’s still a lot out there, and that’s the problem.”

shouldn't you be particularly attentive to your own bias then? an article came out that _seems_ to confirm a prior belief you arrived at without really testing it. like everyone ITT, this is starting to look like the comment section of a Steven Crowder video about climate change

Oh great. Another Hari Seldon.

A book along similar lines: https://www.amazon.com/Not-Nickel-Spare-Sally-Cohen/dp/04399...

(haven’t read it yet so I can’t vouch for it)

