Computers can now talk like humans, but humans also talk more like computers
The most common formulation of the Turing test looks something like this: a human judge has conversations with two participants, one a person and one a computer, with their identities obscured. If the judge can’t reliably identify which one is the human, the computer (and its programmer) has passed the Turing test.
What Turing and almost every technological optimist predicted is that computers would eventually become adept enough to sound like humans. What would have been harder to predict is what has actually come to pass: humans would come to speak more like computers. In other words, it’s not that computers would merely catch up to human conversational facility, but that the two would converge.
The cause is legibility. When Google and others made the internet a marketplace of information, they set the criteria for getting discovered. It was all innocent enough when Google and others asked for metadata to help bespoke publishers get surfaced, but then content farms and other neat tricks figured out how to game the system. The gaming is a technical hack that eventually gets quashed; the demand for machine-legible content is what undermines the criteria of the Turing test.
Consider recipes. It is really weird that the only way to get a highly legible, structured recipe seen, from a website that doesn’t already have lots of adjacent SEO juice, is to prepend it with a seemingly illegible, unstructured backstory about the recipe. Examples, taken by following the search results for “BLT recipe” via Byrne Hobart’s Why Recipe Bloggers Make You Scroll So Far to Read the Recipe:
To be clear, those are Probably Not Untrue-type stories about BLTs that no person wants to publish and no one wants to read, but they’re the only way to vault a blog onto the first page of Google results. And they’re on the first page because people want authenticity, and Google knows that, so it scans the web for the most legible sources of authenticity. It’s like the Turing test, except that the obscured person is talking to the obscured computer, and the computer is talking to the person taking the test.
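For concreteness, the machine-facing half of such a page, the “metadata” mentioned earlier, is typically schema.org Recipe markup embedded as JSON-LD. The following is only a sketch, not anything a particular blog necessarily uses verbatim: the field names follow schema.org’s Recipe type, and the values are made up.

```typescript
// A minimal, hypothetical example of schema.org "Recipe" structured data:
// the kind of machine-legible metadata search engines encourage publishers to
// embed (usually as JSON-LD inside a <script type="application/ld+json"> tag).
// Field names follow schema.org's Recipe type; the values are invented.
const recipeStructuredData = {
  "@context": "https://schema.org",
  "@type": "Recipe",
  name: "Classic BLT Sandwich",
  author: { "@type": "Person", name: "Example Blogger" }, // hypothetical author
  description: "A simple bacon, lettuce, and tomato sandwich.",
  prepTime: "PT10M", // ISO 8601 durations
  cookTime: "PT10M",
  recipeYield: "2 sandwiches",
  recipeIngredient: [
    "8 slices bacon",
    "4 slices bread, toasted",
    "1 tomato, sliced",
    "Lettuce",
    "Mayonnaise",
  ],
  recipeInstructions: [
    { "@type": "HowToStep", text: "Cook the bacon until crisp and drain." },
    { "@type": "HowToStep", text: "Spread mayonnaise on the toast, then layer lettuce, tomato, and bacon." },
  ],
};

// The recipe itself is already perfectly legible to the crawler; the
// thousand-word backstory above it exists to read as "authentic" to the
// ranking algorithm.
export default recipeStructuredData;
```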
This isn’t just a Google phenomenon, nor even a computer-legibility phenomenon. It’s also a social-signaling phenomenon on social platforms. Inasmuch as fortune cookies are mechanical, you get humans converging on the mechanical once they’ve garnered enough followers:
The rhetorical style of any Twitter account that continues to gain followers converges on that of a fortune cookie.
— Eugene Wei (@eugenewei) May 21, 2018
While I adore the unrelenting optimism in web3 communities, I’ll be damned if the “gm”/“wagmi”/“maxi”/etc. phenomenon isn’t an exercise in making your signal so common and so legible that any real, authentic content has become almost illegible. Anecdotally, I think the below is less a Discord problem than a PEBKAC problem:
Opinion - Discord is unusable. Endless feed of stuff, all the time, everything gets lost, zero curation, cant track anything, like 1000 people shouting at each other in a bar, like Twitter without a basic algorithm. Urgh! We cant built web 3 communities on this...
— Raoul Pal (@RaoulGMI) February 16, 2022
@dkb868's “Google Search is Dying” laid out the case that the reason searches for Reddit have still, somehow, yet to peak on Google is that people are increasingly, and explicitly, seeking out more authentic experiences, where “authentic” means something like “places where humans still talk like humans”:
[There’s] some general sense that the authentic web is gone. The SEO marketers gaming their way to the top of every Google search result might as well be robots. Everything is commercialized. Someone’s always trying to sell you something. Whether they’re a bot or human, they are decidedly fake.
So how can we regain authenticity? What if you want to know what a genuine real life human being thinks about the latest Lenovo laptop?
You append “reddit” to your query (or hacker news, or stack overflow, or some other community you trust).
This is also the thrust of Ben Thompson’s Social Networking 2.0: “[The traditional social networks] still have value, but primarily as a tool for distribution and reach of content that will increasingly be created in one place, and discussed in another.”
One problem with computer-like human speech on the internet is the very nature of text: it’s easy to hide in text. You can paraphrase Wikipedia your way through a five-paragraph essay in school the same way you can publish a popular tweet where you and your audience only 70% know what you’re talking about. This isn’t a problem for Reddit, because pseudonymity makes performativity useless, which makes hiding moot. It’s also why even YouTubers, who are on camera and can’t hide the way text allows, feel more authentic than much of Twitter, Instagram, and Facebook, even though they explicitly treat YouTube-ing as a profession.
The Turing test, in a way, implies the elimination of a friction: the friction between what computers can tell humans and what humans want to hear. Humans have sort of pushed the ball over the goal line ourselves, with our recipes written for Google and our fortune-cookie tweets, but it’s a touchdown nevertheless. Fortunately, and optimistically, if Reddit’s continued ascent is any guide, it’s more evidence that technology doesn’t mostly displace what came before; it’s mostly additive. Humans still gonna human.