Post 31: “Crap,” a technical term.
Part 1 of the “LLMs, a solution in search of a problem?” series
Okay, so this is one of my favorite recent conversations with an LLM. And I swear it isn’t just because I ended up getting some praise from the model 😂 (we will certainly have to file that under being ‘emotionally’ manipulated by AI for a future discussion. . .), but also because the topic let me once again organically bring in one of my favorite books of all time: Anathem, by Neal Stephenson.
This discussion covers #4 on our list of Mostly Benign to Very Malignant uses of LLMs:
As noted above, this use is supposed to fall in the “mixed” category; by the end of the conversation, however, it seems pretty malignant to me, unfortunately. And that is because of the “crap” in this post’s title.
In the book Anathem, there is a section where a sort of internet expert of that world (I’m trying to minimize spoilers) explains that their internet has become useless: AI bots have mass-produced such a flood of generic, repetitive, and misleading content that search engines can no longer distinguish valuable material from AI-generated garbage. In the book he calls this garbage “Crap,” which he also says is a “technical term.” Amazing that the book was written in 2008. . . that’s some foresight!
And, well, if we aren’t living in that world right now, we are probably almost there. The internet already seems to be getting dominated by AI-generated crap (or ‘slop’. . . I guess that is the term our world uses, or so ChatGPT tells me. . . although I like crap better. . . ). As I mention in the conversation with the LLM below, the internet has become all but useless to me unless I know the exact site I want to visit or I am trying to buy something. . . and even then, distinguishing high quality from low is not easy.
Did I say this one was enjoyable? Well, I do get some validation at the end for how I use LLMs, so I guess that was some comfort for me. I don’t know that this conversation will comfort you, but I do think it is interesting: