Last week, Copilot made an unsolicited appearance in Microsoft 365. This week, Apple turned on Apple Intelligence by default in its upcoming operating system releases. And it isn’t easy to get through any of Google’s services without stumbling over Gemini.
Regulators worldwide are keen to ensure that marketing and similar services are opt-in. When dark patterns are used to steer users in one direction or another, lawmakers pay close attention.
But, for some reason, forcing AI on customers is acceptable. Rather than asking “we’re going to shovel a load of AI services into your apps that you never asked for, but our investors really need you to use, is this OK?” the assumption instead is that users will be delighted to see their formerly pristine applications cluttered with AI features.
Customers have not asked for any of this. There has been no clamoring for search summaries, no pent-up demand for the revival of a jumped-up Clippy. There is no desire to wreak further havoc on the environment to get an almost-correct recipe for tomato soup. And yet here we are, ready or not.
Without a choice to opt in, the beatings will continue until AI adoption improves or users find that pesky opt-out option.
I am convinced that everyone really has to make this work one way or another because so goddamn much money was - and still is - being spent on this garbage.
We’re being asked at work to “up our Copilot usage” to justify the license costs. Pretty sad when you need to be forced to use it.
Please refuse and at every opportunity let them know how stupid they are for wasting that money.
Use it and then explain how much of a waste of time it was to get the wrong results.
No, that just plays into their hands. If you complain that it sucks, you’re just “using it wrong”.
It’s better to not use it at all so they end up with dogshit engagement metrics and the exec who approved the spend has to explain to the board why they wasted so much money on something their employees clearly don’t want or need to use.
Remember, they won’t show the complaints, just the numbers, so those numbers have to suck if you really want the message to get through.
This! ☝️
Just because you brought up copilot, I think people need to see this
lmao my workplace encourages use / exploration of LLMs when useful, but that’s stupid
Correct. It’s about metrics. They’re making AI opt-out because they desperately need to pump user engagement numbers, even if those numbers don’t mean anything.
It’s all for the shareholders. Big tech has been, for a while now, chasing a new avenue for meteoric growth, because that’s what investors have come to expect. So they went all in on AI, to the tune of billions upon billions, and came crashing, hard, into the reality that consumers don’t need it and enterprise can’t use it.
Transformer models have two fatal flaws: the hallucination problem — to which there is still no solution — makes them unsuitable for enterprise applications, and their cost per operation makes them unaffordable for retail customer applications (i.e., a chatbot that gives you synonyms while you write is the sort of thing people will happily use, but won’t pay $40 a month for).
So now the C-suites are standing over the edge of the burning trash fire they pushed all that money into, knowing that at any moment their shareholders are going to wake up and shove them into it too. They’ve got to come up with some kind of proof that this investment is paying off. They can’t find that proof in sales, because no one is buying, so instead they’re going to use “engagement”; shove AI into everything, to the point where people basically wind up using it by accident, then use those metrics to claim that everyone loves it. And then pray to God that one of those two fatal flaws will be solved in time to make their investments pay off in the long run.
Yeah, it’s sunk cost fallacy all the way down. We’re just being harvested because…fuck us I guess.
It’s a combination of sunk cost and FOMO.
This is it. “We spent so damn much money on this, we gotta see some NUMBERS on the dashboard!”
They have to put it into everything and have people and apps depend on it before the AI bubble pops, so that after the pop it’s too difficult to remove or break the dependency. As long as it’s in there, they can charge subscriptions and fees for it.
“Copilot, what is the Sunk Cost Fallacy?”