
Ask HN: Is the word “bot” losing its meaning? - usui
From my understanding, the word "bot" is supposed to mean something that works as autonomously as believably possible.

Within the past few years, I have read a lot of comments on social media and forums where people respond with accusations of someone being a "Russian bot" or "Chinese bot". I can understand what they mean if they use "shill", but using the word "bot" means to me that the original poster is an autonomous web scraper or automated program built to vigorously defend or attack something. But the thing is, even if there is a nation-state-coordinated agenda to subvert online opinion, these campaigns are still being conducted by real people, whom I am fairly certain are just regular humans. You can argue that programs like GPT-3 are generating the posts, but I think it's obvious from the flow and the logical/emotional response chain between accusers and accused that these are real people, because GPT-3 simply cannot pull in theory-building knowledge in a coherent, substantive way (at least not yet). GPT-3 can only give the facade of having the complex, novel thoughts and responses typical of heated discussions; usually it is quite vapid. GPT-3 also doesn't typically make typos, reproduce the nonconformist "nicely formatted" writing of humans, or use new information very well.

My conclusion, then, is that real people are still pushing their viewpoints, not "bots", but I may be misunderstanding the way the term is used. Or has "bot" changed in meaning/use? Is it just a word used to effortlessly shut down opposing viewpoints? I'm trying to seriously understand how people use this word now.
======
slightwinder
A bot is something that looks like a human or pretends to be one, but is
actually a machine. That has always been the meaning, and nothing has changed
here. In a game, a bot is software controlling a player-character and doing
things a human player would do. On the web, a bot is software visiting sites
and scraping them, like a human user would with a browser. In social media, a
bot is software that creates and posts content and replies, like a human
would.

But, as everything is virtual, you have no direct way to prove whether some
virtual actor is human or software. Only the quality of their actions can give
you hints. So people go around accusing others left and right if their
arguments follow the script too closely and don't show signs of independent
thinking.

On the other side, this has an additional meaning in the form of human bots. A
bot technically does not need to be all virtual and digital; it could also be
a meatbag with wetware following a guidebook, just acting out pre-defined
routines. You usually see this in companies, call centers, and propaganda jobs.

This is, BTW, a funny round trip, as the word "robot" was originally coined
for humans doing mindless labor, just following the rules, not thinking on
their own. Only later did it move to machines.

------
davidschof
My son, 9, is a keen Fortnite player. He uses this word to criticize newer or
less capable players. "You're a bot"

~~~
usui
How interesting. Probably because playing against bots in a multiplayer game
usually sucks. If really high AI difficulty were the norm, that insult
wouldn't work.

~~~
muzani
They're not very skilled in a lot of games. They can be a threat to newbies,
but they're easily fooled by experienced players, and can even be less
accurate against more advanced moves involving motion.

------
muzani
It seems like "bot" and "drone" have flipped in usage. You don't really accuse
people of being corporate drones anymore; you accuse them of being mindless
bots.

------
coolassdude6942
No. It has the same meaning. The layman just wildly overestimates the current
capabilities of AI. The average Twitter user who sincerely accuses another
account of being a “russian bot” actually does think it’s some autonomous
entity regurgitating propaganda.

------
jermier
> I can understand what they mean if they use "shill"

You kind of answered your own question. Many people never differentiate
between bot and shill, instead lumping the two words together. A bot is
programmatic; a shill is a human agent who amplifies messages or spreads
disinformation manually. Although some shills may still use some level of
automation, for example running several accounts at the same time with some
bespoke software arrangement.

------
gypsyBelly
It means the bottom lane in League of legends.

------
ocbyc
Or people on reddit accusing you of being paid to post because you don't share
their uber-liberal religion.

------
rvz
Yes. It is indeed losing its meaning.

This word has been abused to the point of meaninglessness, much like the word
'literally', which I keep hearing in every sentence. Even worse, it's used
carelessly as an insult to criticise others, or thrown at anyone who
disagrees, rather than describing the actual quick, automatic actions of an
entity behaving more like a 'robot', a 'botnet', or an 'aimbot'.

> Within the past few years, I have read a lot of comments in social media and
> forums where people respond with accusations of people being a "Russian bot"
> or "Chinese bot". I can understand what they mean if they use "shill", but
> using the word "bot" means to me that the original poster is an autonomous
> web scraper or automatic program to vigorously defend or attack something.

I think shill is the better word to describe this. If people cannot
distinguish between a real user account and a bot account, then either the bot
account has passed the Turing test or Twitter is not doing enough to disclose
actual bot accounts. They should do what Keybase did: sign bots up through a
separate flow and tie them to the real account holder. Google reCAPTCHA plus
phone verification for sign-ups solves the mass automated sign-up issue;
distinguishing between real and bot accounts by linking them to 'owners' would
clear this mystery up.

> You can argue that programs like GPT-3 are generating the posts, but I think
> it's obvious by the flow and logic/emotional response chain between accusers
> and accusees that it is between real people because GPT-3 simply cannot
> handle pulling in knowledge of theory-building in a coherent, substantive
> way (at least not yet)

While it can generate very convincing sentences, it has the limitation of
accepting and responding to any input, no matter how nonsensical, which is
especially likely to surface in very divisive or heated discussions. Its
output is also limited to its training data, which only goes up to 2019.

> My conclusion then is that real people are pushing their viewpoints still,
> not "bots", but I may be misunderstanding the way the term is used. Or, has
> "bot" changed in meaning/use? Is it just a word used to effortlessly shut
> down opposing viewpoints?

Exactly. This is almost exactly what this Twitter blog post outlines about the
misunderstanding of the word "bot" [0]. However, my point still stands: if the
accuser cannot prove that a particular account has the properties of a
"robot" that issues replies very quickly in an automated fashion [0], then
perhaps the fault lies with Twitter for not finding a way to disclose whether
these accounts are a "bot or not". They should overhaul the sign-up process
the way Keybase dealt with bots: link the bot account to a real account and
disclose which real account owns it.

[0] https://blog.twitter.com/en_us/topics/company/2020/bot-or-not.html

~~~
usui
Wow, thanks for showing me that Twitter post. I think I did five different
Google searches on this topic and could not find a single link because Google
kept feeding me garbage results. (Off-topic: Google Search has gotten really,
REALLY frustrating lately when I'm trying to find what I want, and I'm not
sure if it's me or the search engine. It seems to get worse with every passing
month, like it's no longer specific and doesn't care what words or quotes I
put in the query.)

Anyway, half that Twitter post is about platform manipulation, which I agree
definitely relates to bots. However, I still have this gut feeling that
platform manipulation is done almost entirely by swarms of people, not
_bots_. I think the definition of the term I am familiar with is clashing
with how the word is being used.

> However, My point still stands: If the accuser cannot prove that a
> particular account has the properties of being a "robot" which issues
> replies very quickly in an automated fashion [0], then perhaps the fault is
> on Twitter not being able to find a way to disclose if these accounts are
> either a "bot or not".

I really don't think it takes a genius to tell which accounts are automatons
and which are not, because the former reply nonsensically. So I was going to
reply saying that it's not really necessary for Twitter to disclose which
accounts are bots. After typing out that sentence, though, I realized that
technologists (which I identify as) have a better understanding of what
state-of-the-art automation is capable of.

Do you really think people genuinely can't tell which social media accounts
are bots? My emerging understanding was that "bot" was mostly a label used to
maliciously insult others, but I had not considered that these people might
genuinely be unable to tell whether the accounts were bots.

I'm not saying I couldn't be fooled by a program; there is that recent
instance of someone from my university claiming to have fooled Hacker News
with GPT-3 [0]. Unlike a blog post or article, however, an interactive
discussion with a back-and-forth of talking points is much easier to see
through. This is why I have a hard time believing that anyone actually thinks
someone they are talking to is a "bot".

[0] https://news.ycombinator.com/item?id=24164470

