Yeah, if we cut back a bit on the war crimes we could easily fund both more moon missions and cool science, as well as a shit ton of great programs to help people with the basics like food and rent and health care.
The US spends more on healthcare out of public funds, per capita and even as a share of GDP, than some advanced industrialized states that have universal systems, and on top of that it spends even more on healthcare out of private funds than out of public funds. If we didn't have a system that expends vast additional resources to ensure that a substantial subset of the population is denied needed healthcare, and instead just provided that care, we could fund all those other things without cutting back on the war crimes, crimes against humanity, and crimes against peace, whether committed directly or executed by other regimes we subsidize.
We still should cut down (ideally to zero) on war crimes, crimes against humanity, and crimes against peace, but the reason is that those things are unqualified evils in their own right, not that doing so is necessary to fund healthcare and other priorities, which it very much is not.
Yes. But to be fair to your specific point, symbolic solving of integrals used to be a huge skill in the engineering education. Nowadays, it is not a focus anymore, because numerical solutions are either sufficiently accurate or, more importantly, the only feasible approach anyway.
Sorry, I should have quoted properly in my reply.
My first sentence ("Yes.") was in general agreement with you, the second sentence was specifically about
> Mathematica has been able to do many integrals for decades and yet we still make students learn all the tricks to integrate by hand
But maybe integrating by hand is still as big as ever in other parts of academia. Or were you thinking about high school? I'm fairly sure that symbolic solving of integrals is treated as less important in education these days than it was before digital computers, but I could be wrong. Mathematica's symbolic solver sure is very useful, but numeric solutions are what really make the art of finding integrals much less relevant.
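As a toy illustration of that last point (the function and tolerance are my own choices, not anything from the thread): the integral of exp(-x^2) has no elementary antiderivative, so no amount of by-hand trickery yields a closed form over [0, 1], yet a few lines of plain Python nail it numerically.

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Integrate exp(-x**2) from 0 to 1 -- no elementary antiderivative exists.
approx = simpson(lambda x: math.exp(-x * x), 0.0, 1.0)
print(round(approx, 6))  # exact value is (sqrt(pi)/2)*erf(1) ~ 0.746824
```

For most engineering purposes, that accuracy is all that is ever needed, which is why the symbolic tricks matter less than they once did.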
Every PhD program I'm aware of has a final hurdle known as the defence. You have to present your thesis while standing in front of a committee, and often the local community and public. They will ask questions, and too many "I don't know"s or false answers will make you fail.
So, there is already a system in place that should stop Bob from graduating if he indeed learned much less than Alice.
A similar argument can be made for conference publications. If Bob publishes his first-year project at a conference but doesn't actually understand "his own work", it will show.
The difficulty of passing the defence varies wildly between universities, departments, and committees. Some are very serious affairs with a decent chance of failure while others are more of a show event for friends and family. Mine was more of the latter, but I doubt I would have passed that day if I had spent the previous years prompting instead of doing the grunt work.
That would be cheating.
If the exam is 'gate keeping', I will say that it is a gate worth keeping.
To be clear, I am not against alternative forms of education. Degrees are optional. But if you want a degree, there have to be exams and cheating has to be prevented.
Awesome!
Small feedback: the test should maybe auto-run. I solved the first level and was confused why I didn't proceed. The output was -1 (but the goal was z), and it took me a while to see the 'run test' button.
Guys, this is a well-known and underutilized effect of human psychophysiology. Visually focusing on a single point, small object, or just a small visual field (aka tunnel vision) increases mental focus.
AFAIK it’s also one of the reasons we all get “glued” to smartphone screens.
Ah, excellent! Some scientific evidence for my preferred setup: 2 x 9:16 27" monitors, one in front and one to the side. (Plus another display, of no specific kind. Laptop, landscape monitor, etc.)
I sit with my eyes about 1 metre from the screen, and a 27" portrait display is approx 33 cm wide. So I think that's tan(FOV/2) = 16.5/100 = 0.165; FOV/2 = atan 0.165; FOV/2 = 9.37 degrees; FOV = 2*9.37 = 18.74 degrees. It's almost perfect!
(But even if my maths is wrong: this has proven a good setup for me, which I've used for many years now, and I recommend it to anybody thinking of experimenting with their desk setup. Many monitors come with a stand that allows rotation, so it's not necessarily difficult to try. If you don't like it, you can always switch back.)
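For what it's worth, the angle arithmetic checks out; here it is as a couple of lines of Python (using the same 33 cm panel width and 1 m viewing distance from my setup):

```python
import math

# Horizontal field of view of a flat panel seen straight on:
# fov = 2 * atan(half_width / distance), distances in cm.
half_width = 33 / 2   # half the width of a 27" portrait (9:16) display
distance = 100        # eye-to-screen distance
fov = 2 * math.degrees(math.atan(half_width / distance))
print(round(fov, 2))  # ~ 18.74 degrees
```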
Of course, being an aging boomer, I find an 85" monitor isn't decreasing my focus at all. I just look at the part of the screen I'm using at the moment.
Personally I find it helpful to be able to spread windows out on that giant screen so that any one of them is instantly available at a glance (and I still use 8 "desktops"). Of course, I also don't reboot, well maybe once or twice a year after a kernel update. So setting all those windows up isn't something I do every time I sit at the computer.
I do feel sorry for the generations born into internet brain damage (seriously). My son is GenZ and (thankfully) struggles with the typical symptoms less than others, but is still affected.
This is clearly a consequence of growing up with constantly network connected hand held computers, and the maliciously crafted web platforms that exploited that constant connectedness.
He wasn't wrong in that claim: for the most part the bombers did get through, especially at night. The problem was that their effectiveness once "through" was far lower than the bombing proponents had claimed, due in particular to the lack of precision, but also the resilience of both targets and the enemy population.
> I think the propulsion system will be the easy part.
Really? I think rocket science is still not easy. Just look at how much nation-states spend on maintaining their liquid and/or solid fuel rocket programs, if they even have one, let alone both.
Quote: "All this sounds fairly academic and innocuous, but when it is translated into the problem of handling the stuff, the results are horrendous. It is, of course, extremely toxic, but that's the least of the problem. It is hypergolic with every known fuel, and so rapidly hypergolic that no ignition delay has ever been measured. It is also hypergolic with such things as cloth, wood, and test engineers, not to mention asbestos, sand, and water — with which it reacts explosively. It can be kept in some of the ordinary structural metals — steel, copper, aluminum, etc. — because of the formation of a thin film of insoluble metal fluoride which protects the bulk of the metal, just as the invisible coat of oxide on aluminum keeps it from burning up in the atmosphere. If, however, this coat is melted or scrubbed off, and has no chance to reform, the operator is confronted with the problem of coping with a metal-fluorine fire. For dealing with this situation, I have always recommended a good pair of running shoes."
Granted this is about a fuel that is AFAIK not used for MANPADs, but the joke about the running shoes could be made about most aspects of rocket propulsion.
With all due respect, I think you're overcomplicating the problem. It's not rocket science (no pun intended). It's essentially a hobby rocket that can be weaponized, and it's all DIY. That's the point: simple and off the shelf. Not meant to travel toward the stratosphere or even long range. A quick and dirty way to cause havoc in a localized area.
Not saying you are wrong or the option shouldn't exist, but what specifically makes 8 GB too little but 12 GB sufficient?
Planned obsolescence and software that is written with the idea that "8 GB is borderline in 2026" seem to be to blame.
But perhaps there are genuine limitations that 8GB RAM runs into. Certain AI models, rendering at certain resolutions maybe?
My 8GB M1 Air has been my daily driver for over 5 years now, and so far it has worked out well. Sometimes I have to replace badly optimised software with good alternatives. I hope that by the time macOS becomes unusable, Asahi Linux is mature enough to replace the OS rather than the hardware.
I'm still on Sequoia, and from what I've heard, going to Tahoe would be terrible for the usability of my Air. So, no idea how much longer I will be able to hold out and whether Asahi is ready now. It looks OK at first glance.
Because it might require time-consuming testing, iterations, documentation, etc.
If everything the maintainer wants can (hypothetically) be one-shotted, then there is no need to accept PRs at all. Just allow forks in the case of open source.