Thanks! Sorry to have been away so much. The election was a temporal black hole.
I don't understand why people shrug this off, but the fact that LLMs like ChatGPT "hallucinate" data makes them untenable as useful software. An academic recently got in trouble for using ChatGPT to generate the list of references for a legal article.
From the Verge:
I used an ancient copy of LaTeX that could build a citation list without fucking it up. And yet GPT just makes up shit. The whole article is worth reading because it shows how people are getting into trouble by relying on LLMs. Essentially, you get the bot to do the work, and then you have to do the work again to make sure the bot didn't screw up? And for a vast cost in power and water?
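For what it's worth, the LaTeX workflow I mean is just BibTeX: citations resolve against a database file, and a missing entry produces an error, not an invented reference. A minimal sketch (the `smith2020` key and `refs.bib` filename here are made-up placeholders):

```latex
% Minimal BibTeX sketch: every \cite must match a real entry in refs.bib,
% so the bibliography is assembled mechanically rather than invented.
\documentclass{article}
\begin{document}
As argued in \cite{smith2020}, the point stands.

\bibliographystyle{plain}
\bibliography{refs} % only entries that exist in refs.bib appear here
\end{document}
```

If `smith2020` isn't in `refs.bib`, LaTeX emits an undefined-citation warning and prints `[?]` instead of quietly fabricating a source.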
Maybe LLMs will get better, but considering their inherent flaws, I'd bet against it. And don't we have enough shitty digital art already? Don't we have enough CGI in movies? At best, "AI" is about cutting costs, and I really can't back a business model built on paying artists, writers, and photographers less.