

I’ve explored a lot of patterns and details in how models abstract, and I don’t think I’ve ever seen a model hallucinate out of nowhere — every error had a reason and a context. General instructions with broad scope simply lose contextual relevance and usefulness in many situations; the model needs to be able to adapt and tailor itself to circumstances dynamically.
Musk is still free and has been openly doing this with self-driving junk for years. This is the USA, where we haven’t passed reasonable laws since the 1970s.