To be clear - I'm with you that these systems can absolutely be a force for vast good (at least, I think that was what you were getting at unless there was a missing '/s'). I use them daily to pretty astounding effect.
I'll admit to being a little put off by being labeled dogmatic - it's not something I consider myself to be.
It was a half sentence, and for that I apologize. I don't entirely remember what I meant.
However, I do see a lot of one-sentence "truisms" being thrown around, like "garbage in, garbage out" and the like.
These aren't correct. Just look at the current state of the art with LLMs, which have vast amounts of garbage going in - the value seems to lie in the vastness of the data rather than its quality.
> On the one hand, a tool is as good or bad as the person wielding it.
I see this as dogma: smart people make good LLMs, dumb people do not. But this is an open question. It seems like the biggest wallet will be the winner of the LLM game.
Ah, I see. What I meant by that was, "A tool is as good or evil as the person wielding it".
There are most definitely good and bad tools, in the sense of being more or less effective. Machine learning models are for sure outclassing a whole swath of tools in a number of domains and will more than likely continue to take over more purposes over time.
Whether this is a good thing for society is what I thought we were questioning - which is what I meant by steering. We can build tooling to do things like establish veracity, enable interrogation of models, and provide reasoning about internals (which we should do).
Open sourcing as much of this effort as possible will further lead to Good Things (why would people work on these for free if they weren't creating something of actual use?), whilst leaving all ML development to large corporations will inevitably ensure that the only thing you can trust an ML model to do is spy on you and try to get you to buy stuff, because money.
But I think you get the point.
I think the real reason is one-line dogmas like this.