I won’t belabor this point because I need to get ready for therapy, but … AI is coming, it’s already here, it’s somewhere lost on the interstate, whatever whatever. In some industries, like Human Resources, people think it’s a savior at the level of Jesus. One of the smartest things I’ve ever written (not on this site) is about all that.
If you know the history of AI, you know it has had a lot of stops and starts, periods often called “AI Winters.” Usually it was because funding dried up. That’s not likely to happen this time, although a global recession might pull the whole deal back a few years, sure.
Right now, the average rank-and-file, paycheck-collecting employee probably sees AI most in scheduling applications, and maybe in chatbots that answer questions about their benefits, etc. Most day-to-day human beings not working in labs don’t see it in all its glory at this moment in human history.
Or do they?
Here’s the irony part
For AI to work, humans have to feed it information, e.g. “That is a dog” or “That is a penis.” (The last one was a joke, but I assume someone somewhere is doing that work, because PornHub needs AI too, I’m certain of it.)
Here is The New York Times talking about how humans need to input data to make AI better:

> Before an A.I. system can learn, someone has to label the data supplied to it. Humans, for example, must pinpoint the polyps. The work is vital to the creation of artificial intelligence like self-driving cars, surveillance systems and automated health care.
>
> Tech companies keep quiet about this work. And they face growing concerns from privacy activists over the large amounts of personal data they are storing and sharing with outside businesses.
Ya know why “tech companies keep quiet about this work”? There are a few reasons, but one is the grand irony of what’s happening right now. In short, that would be …
… we are giving people crappy data-entry-type jobs so that they can train machines up to a level where those machines will take more and more of our jobs.
Ha! We’re doing the volleyball set-up move, and technology is about to spike it right in our nose, shattering it. Amen and praise be!
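If you’ve never seen what that data-entry work actually produces, here’s a minimal sketch in Python. The file names and labels are invented (this isn’t any real company’s pipeline); it’s just the shape of the job: a human looks at a thing and types what the thing is, over and over, so a model can learn from it later.

```python
# A minimal, hypothetical sketch of "labeling data": every file name and
# label below is invented, purely to show the shape of the work.

labeled_examples = [
    {"image": "photo_001.jpg", "label": "dog"},      # a human looked and typed "dog"
    {"image": "photo_002.jpg", "label": "not_dog"},  # a human looked and typed "not_dog"
    {"image": "photo_003.jpg", "label": "dog"},
]

# Whatever model gets trained downstream only "knows" what these
# hand-entered labels tell it. Tally them, like a labeling dashboard might:
label_counts = {}
for example in labeled_examples:
    label_counts[example["label"]] = label_counts.get(example["label"], 0) + 1

print(label_counts)  # {'dog': 2, 'not_dog': 1}
```

Multiply that by millions of rows and you’ve got the job the Times is describing.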
What’s “the tyranny of small decisions”?
It’s a mid-1960s economic theory (the economist Alfred E. Kahn coined the term in 1966) whereby a bunch of small, seemingly insignificant decisions add up to one massively poor outcome. This is sometimes also called “life,” i.e. “The Butterfly Effect,” and/or “Your 30s,” i.e. “Sometimes You Get Punched In The Taint.”
The modern rise of AI as a whole sometimes feels like a bad exercise in the tyranny of small decisions. It’s like …
- “Hey, we have a bunch of task work jobs where humans make mistakes, and…”
- “… we could use better calendars and stuff, like easier ways to schedule meetings ….”
- “… maybe some of this stuff could apply to medicine or law enforcement….”
- “… yea, but then we’d need like, I guess we’d need people to input the data that would eventually make it harder for those people to get jobs…”
- “… right, right, but like, if we do this right, we could make some money to feed our families…”
- “… true, true …”
So it begins with small decisions, and the small decisions eventually snowball and snowball some more into “Hey, now we’ve got this technology that’s going to maybe be a net-job-creator, or that’s what some people are saying, but we don’t really know…”
Tyranny of small decisions.
One thing I will say is that you commonly hear “Oh, this has happened before in human history,” and it has. But the scale right now is very different, and the psychology of what work is, and how it gets done, is also very different. Automation will work if we re-train, or “re-skill,” as the buzzword goes. If we don’t do that, we’re going to crater the middle of society. So that’s a nice thought for a Tuesday, right?
The problem right now is the pace at which the tech stack is advancing, the secretive nature of what the “tech industry” is doing, and the greed/gluttony of the people running companies in 2019-2020. Automation will save you money. If you think the goal of being a big boss is cutting costs and saving money, well, you will embrace automation. There is absolutely no way around that logic.
And ironically, who got you there? Humans, inputting data to make the technology better.
A grand irony, rooted in small decisions.