

Sometimes people from my old job post AI stuff and I just tell them “stop using the lie machine”
Automobile companies should be held accountable for destroying and lobbying against other modes of transit, so not really the best metaphor. Also destroying the environment is pretty bad.
Also there’s no cosmic law that says tech companies had to make LLMs and put them everywhere. They’re not even consistently useful.
These big companies have blood on their hands and it seems like no one is willing to do anything about it.
That’s a quote from Eco’s essay on ur-fascism, for the unfamiliar
https://theanarchistlibrary.org/library/umberto-eco-ur-fascism
Google should be broken up and its leadership fined into oblivion for anti competitive behavior
Make sure you speak clearly with minimal slang, or they might willfully misinterpret what you’re saying to deny your rights. Like claiming they thought you wanted a “lawyer dog.”
A lot of bosses think developers’ entire job is just churning out code when it’s actually like 50% coding and 50% listening to stakeholders, planning, collaborating with designers, etc.
A lot of leadership is incompetent. In a reasonable, just, world they would not be in these decision making positions.
Verbose blogger Ed Zitron wrote about this. He called them “Business Idiots”: https://www.wheresyoured.at/the-era-of-the-business-idiot/
It is absolutely stupid, stupid to the tune of “you shouldn’t be a decision maker”, to think an LLM is a better tool for “getting a quick intro to an unfamiliar topic” than reading an actual intro on an unfamiliar topic. For most topics, Wikipedia is right there, complete with sources. For obscure things, an LLM is just going to lie to you.
As for “looking up facts when you have trouble remembering them”, using the lie machine is a terrible idea. It’s going to say something plausible, and you tautologically are not in a position to verify it. And, as above, you’d be better off finding a reputable source. If I type in “how do i strip whitespace in python?” an LLM could very well say “it’s your_string.strip()”. That’s wrong: strip() only removes leading and trailing whitespace, not whitespace in the middle of the string. Just send me to the fucking official docs.
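To spell out the strip() thing (my own quick sketch, not anything from an LLM or the docs):

```python
s = "  hello world  "

# What the pat answer gets you: only leading/trailing whitespace removed.
print(s.strip())           # "hello world" -- the inner space survives

# If you actually meant "remove ALL whitespace", you need something like:
print("".join(s.split()))  # "helloworld"
```

Which one you want depends on what you meant by “strip whitespace”, which is exactly the kind of nuance the official docs spell out and a confident one-liner glosses over.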
There are probably edge or special cases, but for general search on the web? LLMs are worse than search.
All the leadership who made this mistake should be fired. They are clearly incompetent
But i guess it’s always labor that pays the price
Yeah my friend is dating a Google recruiter and he overhears some absurd offers. Like, a reasonable person could retire on a few years at that salary.
I have a hypothesis that rich people are bad at money
I’ve read that Windows 11 uses React (a JavaScript view framework) for parts of the UI, and that seems insane to me. JavaScript isn’t a great language. It’s popular because it runs in the browser. The Windows desktop is not a browser.
Fine needs to be much bigger. All the decision makers that approved it need to be removed and barred from working in the industry
You have to remember it’s not about facts, it’s about feelings. As I always say, we’re all susceptible to that to some extent, but the Republicans have it bad.
I don’t think it saves time on net if you have to read it and then go verify it anyway. Might as well go directly to the more trustworthy source in the first place! And if you don’t care if your answer is correct, why even search? Just make something up.
Well, yes, Google has been becoming shittier for years as they prioritize ads and fail to deal with SEO slop. You have to know what’s a good source, but that was true even when we were doing research in libraries.
The AI summary is making the problem worse. The information it provides is not trustworthy. It also deprives site owners from traffic. It’s really bad on like every metric.
Well, in this example, the information provided by the AI was simply wrong. If it had done the traditional search method of pointing to the organization’s website where they had the hours listed, it would have worked fine.
This idea that “we’re all entitled to our opinion” is nonsense. That’s for when you’re a child and the topic is what flavor Jelly Bean you like. It’s not for like policy or things that matter. You can’t just “it’s my opinion” your way through “this algorithm is O(n^2) but I like it better than O(n) so I’m going to use it for my big website”. Or more on topic, you can’t use it for “these results are wrong but I like them better”
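A concrete version of that algorithm example (a toy sketch I made up, not from anyone in this thread): membership checks against a list are linear per lookup, so doing n of them is O(n²), while a set makes each lookup constant time, so the same n lookups are O(n). Same answer either way; your “opinion” doesn’t change the cost.

```python
items = list(range(2000))
as_set = set(items)

# O(n^2) overall: each `x in items` scans the list.
slow_hits = sum(1 for x in items if x in items)

# O(n) overall: each `x in as_set` is a hash lookup.
fast_hits = sum(1 for x in items if x in as_set)

# Identical results, wildly different cost as n grows.
assert slow_hits == fast_hits == len(items)
```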
A world where Python ran in the browser instead of JavaScript would probably be a whole lot better.
I love this idea. Do you mind if I promote it with some queer folks I know?
Myself I’m pretty straight and don’t have a website, but maybe one day.
If a feature is useful people will use it, be it AI or not AI.
People will also use it if it’s not useful, if it’s the default.
A friend of mine did a search the other day to find the hours of something, and Google’s AI lied to her. Top of the page, just completely wrong.
Luckily I said, “That doesn’t sound right” and checked the official site, where we found the truth.
Google is definitely forcing this out, even when it’s inferior to other products. Hell, it’s inferior to their own, existing product.
But people will keep using AI, because it’s there, and it’s right most of the time.
Google sucks. They should be broken up, and their leadership barred from working in tech. We could have had a better future. Instead we have this hallucinatory hellhole.
Do they vet the people? Could someone hypothetically sign up for the app, case the rich person’s situation, and then do crimes? Sounds like a good way to find rich assholes.