As an analyst, it is your job to prepare valuable information for others. If you drop AI-generated output on people unreflected, uncorrected, and outdated, you will lose your job. I am starting to reject AI-created meeting minutes that the writer neither understood nor polished.
AI is a tool for you to create better results, not an opportunity to offload thinking onto others (as is now done so often).
You're correct, but I'm sure his boss feels differently. In my experience, businesses care very little for precision and excellence in your work product. I don't just mean they cannot recognize excellence (although this is true as well), but that they actively dislike precision and would lump a lot of that sort of effort into one bucket they might describe as "being too pedantic," or "not seeing the bigger picture," or some other epithet which _could_ be true, but in practice just means "I just want stuff done quickly and don't care if it's half-assed. We just need to report success to management, which will never know or care about the truth."
This. Most bosses are so obsessed with applying Pareto's 80/20 rule in all situations (even when it does not apply) that most would trade accuracy for velocity without thinking. Frankly, I doubt the average manager would recognize wrong data when confronted with it.
Previously, the output of office work was tightly associated with the accountability that the output implies. Because the output is visible and measurable while accountability isn't, once it became possible to generate plausible-looking professional output, most people assumed that output is all there is to the work.
We're about to discover that an LLM can't be chastened or fired for negligence.