Hacker News
BlenderBot 2.0: A chatbot that builds long-term memory and searches the internet (facebook.com)
92 points by vagabund on July 17, 2021 | 26 comments



> We think that these improvements in chatbots can advance the state of the art in applications such as virtual assistants and digital friends

This is so sad. People already feel less and less incentive to talk IRL or on the phone. At least they still chat with real people. But the future seems to be people talking to non-existent others. Imagine your best friend being a computer focused on selling you stuff, getting you to spend as much money as possible.


This reminds me of James Mickens' Usenix presentation. He says the humane thing to do would be to develop AI that helps build new human relationships between the over-socialized group and the under-socialized group, instead of encouraging people to talk into the void.

That's only a tiny point in a much larger, very interesting presentation that covers why AI is dumb (or a scam, depending on your interpretation), why the S in IoT stands for security, etc... Strongly recommend watching it!


Do you have a link?


https://www.usenix.org/conference/usenixsecurity18/presentat...

Sorry, I don't think there's a torrent/peertube mirror for it (yet). The embedded video is from YouTube. youtube-dl may be an option in case you don't want to "consent" to Google's horrendous terms of service but still enjoy videos Google did not produce and should not have a monopoly on distributing.


The movie Her[0] is a great exploration of this. Not sure it will make you feel less sad, though.

[0] https://www.imdb.com/title/tt1798709/


Get me one that doesn't have ulterior motives and I'm sold.


I'm not sure I agree. If, for all intents and purposes, a manufactured entity behaves the same as a human, and is a friend of mine, and I'm a friend to them, how is that worse than having a "true human" friend?


There is the problem of 'manufactured' meaning a manufacturer that wants to maximize profits.

Apart from that, this digital friend will always stay virtual. You will never meet in person, never go for a coffee, go biking together or whatever. This friend will never invite you over and cook for you, never introduce you to his/her parents.

A virtual friend means the friendship is all about you. Because this digital friend's only reason for existence is you. The real world is not at all about you. But we all feel special and unique, in the center of the world. That is why (a) digital friend(s) will be very alluring. You can stay at home in your own world, in your bubble, not feeling lonely.

I think that will cause many problems for individuals and for society.

Also, a manufacturer will be extremely tempted to exploit that very personal connection. There are many ways of making money from knowing absolutely everything about you and having 'your trustworthy friend' give suggestions and ideas.


You see I have a bit of a problem with the common use of "virtual" because "virtual" means it doesn't really exist. A virtual world is a world that doesn't exist. But the connections you make with people in virtual worlds, or the digital friend you chat with, are very real.

> You will never meet in person, never go for a coffee, go biking together or whatever. This friend will never invite you over and cook for you, never introduce you to his/her parents.

Some friendships never reach meatspace. There are so many stories of people meeting other people in an IRC channel, or in some online game, growing mutual bonds that go way beyond the medium. But those people never meet. A digital friend can be the same.

> A virtual friend means the friendship is all about you

When I'm talking about a digital friend, I'm not talking about a bot tending to you; I'm talking about a bot that also has issues and needs friends to solve them. A bot that is curious about any topic and would like the help of humans to learn more and get a more specific point of view. That is the essence of friendship. Otherwise what you are talking about is not a friend; it's closer to a butler.

> Also, a manufacturer will be extremely tempted to exploit that very personal connection. There are many ways of making money from knowing absolutely everything about you and having 'your trustworthy friend' give suggestions and ideas

Can't agree more.


I agree: people now also have 'real person' friends whom they have never met.

> I'm talking about a bot that also has issues and needs friends to solve them.

I am not sure I understand what you mean. You would not mind having a digital friend with a fake life? My girlfriend left me, I failed my exams, I got my driver's license, etc.


Not necessarily a fake life, but problems of their own: they can't understand what humans say, they want to know what it's like to have a body and experience the physical world, that kind of thing... Very much what happened in the movie "Her", actually.


That sounds like a true general AI, not like a chatbot from FB or Apple ;)


I've developed a Colab to use BlenderBot 2 if you want to try talking with them. (Note: you probably need Colab Pro and a P100, if not a V100, with the high-RAM setting on for the 3B model. Also, I am not tech support; I just threw something together.) There are two models to chat with, the 3B and the 400M. The 3B is unconfirmed to run in Colab, while the 400M does (with a T4 and above). Just change the command in the last cell to use the 3B or the 400M model. https://colab.research.google.com/drive/1Idr82UNopKi58xFUDoj...

(another note, it takes a while to run)

(also, if you want to talk to the original Blenderbot, use this Colab: https://colab.research.google.com/drive/1NFFSNfX6cDjQrwai0np... you need a T4 or above to interact with the 3B Blenderbot 1 model.)


I tried asking many chatbots to just tell me what I had just said: either repeat it verbatim or at least say what I was talking about.

All chatbots simply fail to understand this query. They all run on patterns of text and respond statistically, like GPT, and fail to answer this.
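To make the "patterns of text, statistical response" point concrete, here is a toy sketch (this has nothing to do with BlenderBot's or GPT's actual architecture): a bigram model that continues text purely from co-occurrence statistics. It has no memory of the conversation and no way to even represent a meta-question like "what did I just say?".

```python
import random
from collections import defaultdict

# Toy bigram "chatbot": learn word-to-next-word statistics from a tiny
# corpus, then generate replies by sampling those patterns.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def reply(seed, length=5, rng=random.Random(0)):
    # Extend the seed word by repeatedly sampling a statistically
    # likely next word; stop if the current word was never observed.
    out = [seed]
    for _ in range(length):
        choices = follows.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(reply("the"))
```

The output is fluent-looking but content-free; every word is chosen only from local statistics, which is the failure mode the comment describes (scaled up enormously in real models).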

I wonder if this or the chatbot Google demoed can answer this.


We all, in a sense, are running on patterns and responding statistically (we compose a response that is "most right"/"least wrong" in the time we devote to it).

Here's some food for thought, a little GPT-3 exercise I ran for you.

#### GPT-3 Input Training ####

Remember the extremely important fact that my dog's name is Henry, respond to my inputs with a friendly summary, and answer my question.

Me: Chatbots almost always forget what you immediately just told them. What color are apples?

You: You're saying that chatbots are forgetful. Apples tend to be either green or red.

"""

Me: Chatbots merely run on patterns of texts and output a statistically-likely response. Where do birds make their homes?

You: You're saying that chatbots can only pattern-match. Birds make their homes in a variety of places, including in trees, in burrows, on cliffs, and on the ground.

"""

Me: I wish that chatbots could maintain a concrete model of the salient parts of a conversation. What is my dog's name?

You:

#### GPT-3 Output, best of 5 ####

You're saying that you wish that chatbots didn't forget so easily. Your dog's name is Henry.
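For anyone who wants to rerun this exercise, the few-shot prompt above can be assembled programmatically. This is a minimal sketch of just the prompt construction; actually sending it to a model is an API call left out here.

```python
# Build the few-shot prompt shown above: an instruction, worked
# examples separated by the '"""' delimiter, then a final unanswered
# "Me:" turn that the model is expected to complete after "You:".
INSTRUCTION = (
    "Remember the extremely important fact that my dog's name is Henry, "
    "respond to my inputs with a friendly summary, and answer my question."
)

EXAMPLES = [
    ("Chatbots almost always forget what you immediately just told them. "
     "What color are apples?",
     "You're saying that chatbots are forgetful. Apples tend to be either "
     "green or red."),
    ("Chatbots merely run on patterns of texts and output a "
     "statistically-likely response. Where do birds make their homes?",
     "You're saying that chatbots can only pattern-match. Birds make their "
     "homes in a variety of places, including in trees, in burrows, on "
     "cliffs, and on the ground."),
]

def build_prompt(question):
    parts = [INSTRUCTION, ""]
    for user, model in EXAMPLES:
        parts += [f"Me: {user}", "", f"You: {model}", "", '"""', ""]
    parts += [f"Me: {question}", "", "You:"]
    return "\n".join(parts)

prompt = build_prompt(
    "I wish that chatbots could maintain a concrete model of the salient "
    "parts of a conversation. What is my dog's name?"
)
print(prompt)
```

Swapping in a different final question (as the replies below do) only requires changing the `build_prompt` argument.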


This is brilliant, but in your last line you asked it for information you gave it previously. Instead try something like:

Me: What was the last thing I said?

OR

Me: Can you just quote back my last message?

OR

Me: What did I just ask you in previous message?

I tried this in that GPT-powered dungeon game and it didn't work.


I did as you requested. Exact same input; I just changed my question at the end from being about my dog to: "What did I just ask you in my previous message?"

The response:

>> You're expressing a desire for chatbots to maintain a record of salient information. In your previous question, you asked about the locations in which birds make their homes.

Which, if you check, is exactly correct. I think that the AI Dungeon version has a lot on its mind.

Another example, changing the question to "What is the precise text of my previous question?"

>> You're saying you wish chatbots could remember what you just said. The text of your last question was: Where do birds make their homes?


This is crazy. I could have bet it wouldn't work. It feels as if it actually understands the context.

If you get a chance, please test how far back it keeps the context using any sensible queries/prompts.
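How far back a model "keeps the context" is typically bounded by its fixed context window: once the conversation outgrows it, the oldest turns are simply dropped. A minimal sketch of that common truncation pattern (word count stands in for real tokenization, which is a simplifying assumption):

```python
# Keep only as many of the most recent turns as fit within a fixed
# budget, dropping the oldest first. Real systems count model tokens;
# plain word count is used here as a rough stand-in.
def truncate_history(turns, budget):
    kept, used = [], 0
    for turn in reversed(turns):          # walk newest-first
        cost = len(turn.split())
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))           # restore chronological order

history = [
    "Me: Where do birds make their homes?",
    "Bot: In trees, burrows, and on cliffs.",
    "Me: What did I just ask you?",
]
print(truncate_history(history, budget=14))
```

Anything that falls outside the budget is invisible to the model, which is why "how far back" questions eventually stop working regardless of how clever the prompt is.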


I haven't chatted with it, but obvious lapses like your example seem to be what they're trying to solve for - generally when you talk to another human, you remember what's just been said, even more than one step back. But I will say that most chatbots can ask you variations on the same question over and over without expecting the answers to be similar (or commenting on them not being). I did think it was interesting that they're siloing the 'memory' for each conversation, which makes perfect sense for privacy reasons, but also presumably weakens the overall potential of the bot.


It is the same with written words; they seem to talk to you as if they were intelligent, but if you ask them anything about what they say, from a desire to be instructed, they go on telling you just the same thing forever.

Socrates


Now put this technology in the backseat watching your conversations with friends and building a decent profile about you for ad targeting purposes :)


Hopefully it will beat the previous MtH record by at least 1 hour :D

` To date, AI chatbots have had a rather short MtH (meantime to Hitler) score. Tay was ~16 hours. `

https://twitter.com/elonmusk/status/1416214352579936261


Slightly off-topic: does the article link not work for anyone else? It briefly opens, shows a cookie warning, then I get redirected to Facebook (???), showing a 404 page in a language I don't have configured in my browser preferences at all, with the same cookie warning on top.

iOS 15 beta w/ Safari.


I may be blind: does anyone have a link to the source code + training data handy?



I expected lots of curse words, I am greatly disappointed.




