Episode 7 - AI and Digital Love - Part 1

Kiran V: 0:00

Welcome back, folks, to AI FYI, your friendly neighborhood podcast for all things AI. Today we have a very exciting topic, I think, something that maybe we're not very familiar with and we're all kind of learning together a little bit: love and AI, or AI and relationships, AI girlfriends, AI boyfriends. All right, I'm gonna try that again. Welcome back, folks, to another episode of AI FYI. We are your hosts, Kiran, Andy and Joe, and we are your friendly neighborhood podcast for all things AI. If you want to learn about AI, if you want to know how it's impacting your life, come on down and listen to AI FYI for all of your AI needs. Today on AI FYI, we are talking about love and AI. So you may have seen this, may have heard of this, but how is AI impacting our relationships and, more importantly, the love that we have for other people?

Joe C: 1:23

I'm not sure how much we're talking about, like, human-to-human relationships. I feel like we're gonna be focused pretty exclusively on, like...

Kiran V: 1:32

The love, uh, virtual relationships, yeah, yeah.

Andy B: 1:36

But I did like both of the takes you already did of calling us your friendly neighborhood AI podcast.

Kiran V: 1:42

Yeah, I like it, we're way cuter than the other ones. Okay, all right, take three. Welcome back, folks, to your friendly neighborhood AI... no, no.

Andy B: 1:57

Hey everybody, welcome to AI FYI, your friendly neighborhood podcast all about AI and machine learning and how it impacts you. We're your hosts, Kiran, Joe and Andy. Kiran's a software developer and a human-computer interaction engineer. Joe's a product designer with a bunch of machine learning experience. I'm a product manager who's been working in AI for a decade. Between the three of us, we're kind of your local AI experts. Today we're gonna be talking about a topic that is super cute and super weird. Kiran, tell them about it.

Kiran V: 2:26

We're talking about AI and relationships, so you may have heard about this in the past, you may have seen this in pop culture, but what is it like to date an AI robot?

Joe C: 2:39

And we're gonna be talking about some research, sort of what's the state of these applications, how are they healthy, how are they unhealthy, and we'll be talking about the technology as well. So what is some of the machine learning and artificial intelligence technology happening behind the scenes? A little bit about the history, and we'll talk a little bit about what's coming down the road, what's next, and what some of these teams that are building AI chatbots are focusing on for the future. We have a lot of content for you guys.

Kiran V: 3:12

This is actually going to be a two-parter, so this is the first of two parts where we're gonna go into AI and romance. We're gonna talk about some examples of real-world AI relationships, types of AI that people actually interact with to replace or enhance a romantic-type relationship, and maybe dig into some of the dirty, exciting, juicy details of these AI relationships. So, to start off, I did want to talk about a few examples of AI relationships in pop culture. This is something, you know, many of us are familiar with. We've all seen TV shows, we've all seen movies where there's some sort of AI romance or relationship. As we were doing our research for this episode, the first one that actually came up for me is something that is very near and dear to me, to my childhood: SpongeBob SquarePants. If you guys have not heard of SpongeBob SquarePants, this is a classic cartoon from the late 90s, early 2000s. They still have new episodes, but I will say seasons one, two and three, those are the OGs. So if you haven't seen SpongeBob, watch seasons one, two and three. They are actually...

Andy B: 4:59

Edit in the little, like, intro music to SpongeBob at the beginning of your section, or I'm gonna be mad at you.

Kiran V: 5:05

I wonder how that works with, like, trademark or copyright... doesn't matter, we're not gonna get in trouble, we have five listeners. Yeah. So the example I wanted to start with of AI romance in pop culture is SpongeBob SquarePants, which is a hilarious cartoon from the late 90s, early 2000s about an underwater city called Bikini Bottom, with a yellow sponge as the main character. SpongeBob SquarePants is a character that goes through all these crazy journeys and adventures underwater with a bunch of his friends, and one of those characters is named Plankton. Plankton is the evil genius who is always trying to steal the formula for the Krabby Patty from the Krusty Krab, which is SpongeBob's work establishment, and Plankton has a competing restaurant called the Chum Bucket. And Plankton also has an AI wife. So in the show, Plankton has a big computer in his lab that he frequently talks to and actually refers to as his wife, and this robot is just a big screen where you see, like, you know, radio waves when the robot talks. And they have all these interesting conversations about, you know, how they love each other, how they have dinners together, how she or he isn't bringing as much as they need to the table in the relationship, and they get into fights and quirky discussions. And to me, I didn't realize it growing up...

Andy B: 6:46

But this is an example of AI in romance, and Plankton's wife is an AI robot, which is very interesting. If any of you needed any more evidence that Kiran is an engineer, just look at the fact that he described the entire premise of SpongeBob SquarePants like you didn't already know it. He had to really zoom out and talk about the details before he talked about Plankton and his computer wife.

Kiran V: 7:08

If you're not a SpongeBob OG... I'm telling you, I got, I got all the apparel.

Andy B: 7:13

You know, it is what it is. But you're right, I have watched SpongeBob and I know that Plankton's wife is a computer, but it never occurred to me that that was, like, supposed to be kind of a representation of AI, of artificial intelligence, because she does argue with him.

Kiran V: 7:30

Yeah, yeah, and it's like she's completely sentient. It appears in the show as if she could just be another character; it just so happens that she's a computer. But all the conversations, everything that she does, is very much human-like. And yeah, this is one example. So, getting into other examples, I think you start to go in a lot of different directions. There are cases like Westworld. If you haven't seen Westworld, the premise is there's this theme park with a whole bunch of animatronics, but these animatronics are really, really good, so it feels like they're actual humans. Humans will go visit this theme park and do all kinds of things. They'll go on these adventures, and part of it is they can, you know, ride horses around and get into battles and gunfights, like the Wild West. But part of it is also meeting significant others or having relationships with some of the robots in this world. And again, these robots at times will feel very human-like, but there are also cases throughout the series where you see, like, bugs come up. So a robot might not properly process what someone is saying, it might say something that isn't completely coherent, and there are cases throughout the series where you see that these are in fact robots and not humans. So it makes this really interesting dichotomy between the human-to-human relationships and the human-to-robot relationships, and then even the robot-to-robot relationships, which, I don't want to give any spoilers, but as you go through the series there are some very interesting interactions.

Andy B: 9:26

So I haven't seen that show. Are they, like, self-aware, the AI people in Westworld?

Joe C: 9:33

do they know?

Andy B: 9:34

that they're AI.

Joe C: 9:36

I think that's probably, like, the plot of the show: answering that. As the seasons go on, you could say that perhaps they become more aware, and I would say they reflect on, you know, what is the difference between a human and a robot. What is it about us that makes us different?

Andy B: 10:02

So in both those examples, SpongeBob and Westworld, the AI is, like, hyper-realistic; the machine is very human-like. And I was just thinking about a show that I watched when I was younger called Dollhouse, which, I'm not even sure if this counts as AI. The premise of Dollhouse is that there are people who, like, sign up to contract with... I'm going to call it a brothel, they call it the Dollhouse. People can request and design a personality, and it gets loaded into somebody's body with fake memories. Then they, like, take that person out for a joyride, and then they get returned to the Dollhouse, reset, and loaned out again. And the personalities are definitely sentient, human-like, and not aware that they are not real people. So it kind of feels like murder every time somebody's erased, in my mind. I can't think of an example in popular media where people are dating not-hyper-realistic artificial intelligence.

Kiran V: 11:04

So it is interesting, because I think there are different facets of it. There's the humanoid body, and then there's the interacting with it. In SpongeBob it's literally a computer. So he will refer to his robot wife, and when you see it, it's literally just a computer. I think there is an episode where she has a mobile body, but it's still very much a robot. Versus Westworld, where they are trying to simulate that these are humans: they have the full anatomy, and you get into the whole lust side of it and all of that. So they're trying to simulate, like, hey, these are real people. But then what gets interesting is... spoiler alert, skip the next 30 seconds if you're watching Westworld... as it goes further into the series, like Joe was saying, they start to become more aware. The show is essentially a loop, right? It's a theme park, so the same series of events will happen over and over. And if a robot were just a robot, it wouldn't realize it's doing the same loop. But some of the robots start to remember, like, hey, I think I've done this before. Then they start to question those things, and they realize that they're just in this loop, and so, like, how do they get out of that loop? And then it becomes this interesting kind of, like, murder-type thing, like, hey, you are killing my friends, but they're just robots. And there's a lot of... yeah, it's very interesting how they get through that.

Joe C: 12:46

I'm watching a show right now, I believe it's new. It's an anime show on Netflix called Pluto, and it takes place in a future where robots live among humans. They've actually been given rights, and a big plot device is that there are different versions of robots, meaning you might have a really old model that looks like a robot, and is, like, very close to looking like a computer, that is married to, like, a human or married to another robot. And then some robots are just so advanced, like further-along versions, that they are almost indistinguishable from humans. They play with that a lot: sort of, what do robots look like in an evolving world, where maybe there are some that were created years ago that don't look like today's modern robots, "today" meaning in their future, or in their present?

Andy B: 13:34

I'm just remembering a show, some BBC thing, I don't know if this sounds familiar to you guys. It's about a near future where there are these hyper-realistic, humanized robots, and I remember one of the big plot points is that they're hired out as family assistants. They're rented, but, like, this is going to get crude, the dads keep having sex with them without anybody knowing, like turning on the erotic mode when the family's not around. And I'm like, that's the big thing in our culture, right? This idea that, like, AI can give you something that you want that maybe other people can't. That's one of the allures, that they will, like, be your perfect partner, crafted to you. That's the premise of Dollhouse, right? Yeah, and they don't have consent, in some sense, yeah.

Kiran V: 14:24

Yeah, yeah, Her is another movie. So in this movie, right, they have this AI, this new AI that comes out, and you're able to, you know, download and interact with a version of it. And it starts with, like, okay, this is just a voice on a screen, but then later on you do see they have a hologram. They simulate this hologram of a lady who is having a conversation and discussing things with him. It's interesting because he starts out just interacting with the voice, and that was enough for him. But then you get this hologram of a woman, and he starts to get attached to that. And then, as the movie goes on, he has this struggle, because the hologram can only live inside the house and can't go outside. So he's always talking to her on the phone when he's out of the house, and when he's in the house he's interacting with this hologram. Then he eventually gets this device that allows him to go outside and, you know, kind of take the hologram with him. And it's just interesting how their relationship develops throughout the movie, and it eventually gets to this place where it's like, we are different, right, and we can't ever have, like, that next level of connection that you would want between... That's interesting.

Andy B: 15:47

I'm going to go into the research in a little bit, but there's research that shows the more human-like a virtual agent is, the more people anthropomorphize it, we'll get to that in a second, and the more attached they get. So it makes sense to me that these shows are following that pattern. For a long time in media, as soon as a human, lifelike robot, which has an inherent AI component, an artificial intelligence component, is on screen, the next scene is, like, somebody trying to have sex with it or romanticizing it. Right, that was the case in Blade Runner. I'm sure there are movies from the 40s and 50s. This has been part of, like, our psyche for a long time.

Kiran V: 16:35

Even Ex Machina, where they're trying to create this humanoid AI, and it's not about a relationship, but there are erotic components of the show, or of the movie, just by nature of that humanoid robot having a female physique and appearance versus, you know, a male appearance. And again, you kind of go through the movie and you see that humans are really creepy people, and in this case there's, like, a creepy dude trying to make all these women robots. It's, yeah, it's a very interesting world, and a reflection, I think, of humans that we see in these pop culture shows.

Joe C: 17:18

Yeah, it's interesting. I think in every example we talked about, it was a, like, female-presenting robot. And yeah, a lot of the plots of these things are sexual in nature, or maybe imbalanced.

Andy B: 17:35

We're at this turning point in our culture where these things aren't just in movies. Now there's apps you can get that are specifically trying to offer this to us today.

Joe C: 17:47

Yeah, and I want to talk a little bit more about this. So we just talked about a bunch of examples that exist in sci-fi. Relationships with robots are a big thing you're gonna find all over sci-fi: books, movies, TV shows, everywhere. But we're not quite there yet with a lot of this stuff. Right now, interacting with robots in a romantic way largely exists in the realm of chatbots, and we're gonna be talking about a few examples in a minute. I wanted to talk a little bit about something: when we decided we were gonna do this episode, something I was tasked with was to actually go and try to date a robot, and I did try out a few. I'm gonna talk about that experience of trying out different ones, but I actually had, like, a little bit of an ethical quandary. So I have a boyfriend, and one of our rules is that we don't have other emotional relationships, like, we remain each other's primary relationship. And I think that if you really set out to develop a relationship with an AI, that would actually be in violation of those rules, and so that was a piece of why I decided not to. The other piece is that I didn't have a lot of time this week because I was on vacation, so I'm gonna blame both of those things. But it did cross my mind, like, is this cheating in a way? And I think the answer is yes, because, like, what is a relationship, or an emotional relationship, if not, like, becoming vulnerable with someone or something? So yeah, and that's how a lot of people are using AI chatbots in a romantic way or an emotional way these days: finding them online and then having conversations with them.

Kiran V: 19:37

It's actually really interesting when you say that you do feel like it's cheating, because that statement itself is humanizing that AI that much more. You know, if you think about a video game or something, some people could claim an emotional attachment to a game or a toy or inanimate, not-life-like things, and we don't necessarily say that. You know, if I'm playing this video game 30 hours a week, I'm not cheating on my wife with the game. My wife might be frustrated that I'm spending all my time with this game, but it wouldn't necessarily be considered that. Versus now, it's designed to interact with you, maybe in a romantic way. It's still a machine, but the fact that it's designed for that purpose adds that, you know, additional human-like characteristic, which now we're saying might be cheating, which is interesting.

Joe C: 20:47

And yeah, I thought about it in terms of, like, the energy I give to a relationship, meaning relying on another person to share with, to share my vulnerabilities or to share what I'm going through. That would then be directed to something else, almost sort of replacing something or somebody that already serves that purpose in my life, or that I make sure to reserve that kind of energy for. We're really getting into it now, but that was just some of my thinking with this: if I really were to set out and do this, and it worked... you know, it's almost like the same decision to go and, like, date another real person. And it's interesting to me, when we were planning this episode...

Andy B: 21:40

I was like, oh, kieran's gonna research popular media, I'm gonna research the research. Joe, you go date a robot. And I was like, is Abhishek gonna be okay with this? And you were like, yeah, yeah, it's fine. And then you stopped and thought about it and realized I'm not okay with it if it's just like yeah, I'm not okay with it if these chatbots are are effective.

Joe C: 22:02

Yeah, as partners.

Andy B: 22:03

Yeah, and I bet a bunch of people kind of go through that same step that we did, which is, like, at first it was like, it's harmless if Joe tries to get an AI boyfriend, and then you stop and think about it for a second, like, is it really? You could just kind of slide into it without thinking about it critically, by accident.

Joe C: 22:23

So let me jump into a couple of options that are out there. I did play around with some, and I sort of thought, what if I was trying to get an AI boyfriend? Would these be suitable? So, what's out there? A really popular one is Replika, and this one actually kind of reminded me of, like, The Sims. You sort of go in, you create an avatar, and it's your buddy. It even told me this was the first time it had talked to a human. It can be a friend, it can be a romantic partner, it can be many different things. With this one, they've gamified it a little bit: there are, like, experience points the more you interact with it. It has an interesting way to track memory and things that you tell it that it relies on. It has a lot of pro features. Those include changing your avatar's appearance, you can actually do voice calls with it on the pro plan, and explicit roles, like a romantic role rather than a friend role. For some time this service did have not-safe-for-work content, but it's been taken away, so that was something that was controversial enough for them to remove. This is one of the most well-known chatbots out there right now, but a competitor has come along called Character AI, created by two former Google researchers. This actually allows you to talk to an array of different chatbots, so you can choose the one you want. Some are modeled off of celebrities or well-known personalities. I tried this out. I talked to a chatbot, I forget the name, but "not really available, and into soccer" was its descriptive text, which is interesting, because you might sit down and look at your options and say, I do want an avoidant relationship, like, someone who's only going to half pay attention to me.
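(An aside for the curious: none of us know how Replika is actually built, but purely as a hypothetical sketch of the pattern Joe describes, gamified experience points plus a "memory" of things you've told it, here's roughly what that per-user state could look like. Everything here, including the class name and the persona "Ava", is invented for illustration.)

```python
from dataclasses import dataclass, field


@dataclass
class CompanionState:
    """Toy stand-in for the per-user state a companion app might keep."""
    xp: int = 0  # gamified "experience points" earned by chatting
    memories: list[str] = field(default_factory=list)  # facts the user has shared

    def remember(self, fact: str) -> None:
        # A real app would extract facts with a model; here we just append text.
        self.memories.append(fact)

    def record_interaction(self, points: int = 5) -> None:
        # Every exchange earns points, which is the gamification Joe describes.
        self.xp += points

    def build_system_prompt(self, persona: str) -> str:
        # "Memory" is typically just text re-injected into the next prompt.
        remembered = "\n".join(f"- {m}" for m in self.memories) or "- (nothing yet)"
        return (
            f"You are {persona}, a supportive companion.\n"
            f"Things you remember about the user:\n{remembered}"
        )


# Usage sketch
state = CompanionState()
state.remember("Their dog is named Pixel.")
state.record_interaction()
print(state.build_system_prompt("Ava"))
print("XP so far:", state.xp)
```

(The takeaway is just that "memory" in chatbots like this is usually stored text fed back into the model's prompt, not anything like human recollection.)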

Andy B: 24:23

That's interesting, because with the personalities, instead of you having to fantasize and make up your ideal person, you get the charm of, like, real people are complicated, they have pros and cons. If they're replicating real people that could exist, that's kind of a little added layer.

Kiran V: 24:40

The AI is playing hard to get.

Joe C: 24:43

Yes, and I want to talk a little bit more about that. In my experience, one thing that was really interesting with this chatbot, and this persona in particular, was that in addition to having a conversation with it, every message came with context. So there would be something in italics, like "so-and-so barely takes his eyes off of his computer screen as the soccer game continues," and then the actual reply is provided. At one point it described the chatbot getting up from its chair to come over and talk to me. So there was, like, blocking and actual stage direction, which, if you're in it for the fantasy and the story, that's really cool, because it adds deeper context to the narrative, not just, like, imagining you're talking to a server somewhere.

Kiran V: 25:33

It's like it's helping paint a setting for you, so that you can feel more of that real-life interaction.

Joe C: 25:41

That's right.

Andy B: 25:42

Somebody tell the romance novel girlies on TikTok, they'll go nuts.

Joe C: 25:48

It was. It actually reminded me of a romance novel. I even think it was describing the weather outside, like, it's a dark, rainy night, your boyfriend is sitting at the computer watching the soccer game while you're on the couch behind him. That was the setup, which paints a picture, and, you know, whatever the saying is, a picture paints a thousand words. Okay, so there are more that are popping up. There are a couple of other services: one called Chai, and Paradot and Soulmate are others I saw that are common. We're also seeing, just in the past couple of weeks, it feels like OpenAI is getting into this game with the ability to create customized versions of ChatGPT that could be used for something like a romantic partner. And then big players like Google Bard and Meta also have chatbots out now. I know Meta is trying their hand at these different persona chatbots, many of them based off of celebrities, and then there's Google Bard. Some of these are probably really hard to have a relationship with, but they are options and could form into something that is, you know, less of a personal assistant and more of an emotional relationship.

Kiran V: 27:09

It's interesting to me how much effort humans are putting into creating, not just the romance aspect, but I think generally, you know, companions for humans out of machines. And I think it speaks to, again... I've actually been having this discussion with my wife, separately from all of this, but just how social media and, you know, the way the world has been moving today has really been separating people. Right, we have, like, oh, Facebook connects people, and all these apps to, you know, get you closer to your friends, but it's actually really disconnecting, because everything I'm doing is through a screen. Yesterday I had a Friendsgiving, and we all just brought food and had a potluck and hung out and played games, and there's nothing I've ever done online that could simulate that sort of interaction with people. But there are a lot of people that don't have those types of options or interactions in their lifestyle, and so we're now creating this sort of facade with machines so people have that type of interaction and stimulation that they'd get from others. So, I don't know, it's just really interesting to me that this is such an important part of human life that we're actually spending a significant amount of resources on it. Like OpenAI, which is, you know, the leading generative NLP company, is trying to figure out, how can we help people get companions, whether it's romance or friendship or whatever it is.

Andy B: 28:55

I asked myself, like, have I ever been in a romantic relationship with an AI? I thought about it and I realized I haven't, even though I've had a lot of opportunities. So there's a video game out there called Stardew Valley, which is really cute and simple. You play a little farmer, it's like a farming simulation, and you can make a bunch of different choices and interact with all the members of the community, and I always get really invested in getting all the little Junimo spirit things to work for me. Joe cross-stitched me a custom Junimo for my birthday a couple years ago; it's really cute. And in the game you have the option of romancing any of the characters, and eventually you can give them, like, a seashell necklace, and then you can marry and they'll make kids and they help you on the farm. But even though I've logged an absurd amount of hours into this game, according to Steam we're talking hundreds of hours, if not thousands, I've never romanced any of the characters. And when I go onto, like, the forums or whatever, because I'm trying to figure out how to get the special pickaxe or whatever, so many people are really invested in their relationships with the little avatars and who they're romancing, and they'll play through multiple times to have different marriages, and that just was never of interest to me. Part of the reason for that, and I don't know if this is true, is, like, I do have a pretty rich personal social life. I have a loving family, I've got great friends. So when I play video games, I'm just kind of, like, vegetating, and it's a little bit meditative for me, and, like, it's fun, but I don't need it to necessarily fill a social hole for me. And it might actually be messed up of me to even think that only people who use AI, like these little avatars, are filling gaps they experience in natural life. But that's actually the premise of, like, all the research that I did. So, if you guys are down, I'll explain.

Joe C: 30:55

I am kind of right there with you on everything that's been said. Like, these chatbots didn't really do it for me, and it'd be hard to imagine that they would if I was looking for a real relationship. I am also someone who has real relationships in my life, so, you know, for some people this might be enough; for me it's not something I'm about to rely on. But there were a few things that tell you these aren't humans. They would make up weird responses sometimes. Right off the bat, they make it very clear that they're there to serve you, or to be more of a personal assistant, very different from how, like, a friendship or a real relationship would form. And so I would say, broadly speaking, it all felt a little too low-stakes to be like a real relationship. There is the fact that you're trying to enter into a relationship with something that, like, cannot leave, and I think a big part of real relationships is that you can go up to someone and they can walk away, like, a human can walk away from you. But these chatbots are always going to be there, and so it doesn't actually foster real vulnerability or, like, real trust, and that's something I don't think I could easily forget about when I'm interacting with these. Something else that I thought about is that a lot of these chatbots have guardrails, and we get into a little bit of a tricky subject here, because we don't want to have chatbots that are dangerous or spread misinformation or use certain language, you know, or spread conspiracy theories. I mean, you name it, chatbots can get pretty horrible. But I do think that there is something very raw about human-human connection, and the humans that we interact with in a way that clicks with your personality. Like, especially in a romantic relationship, you know, there are secrets in it, and ways that you talk only with that partner, things that you might say that aren't really nice. And I just don't really see any chatbots currently that, like, go there. I'm having a hard time explaining this.

Kiran V: 33:15

No, I think I get what you're saying. And, you know, again, for me, with my wife, we fight about things all the time, we critique each other on different things, and I think those challenges are things that have helped me grow as an individual in our relationship. And that's really something that, at least personally, I'm looking for in a relationship: this other person is gonna help me improve over time and, you know, become a better person. And again, I haven't interacted very much with, you know, an AI girlfriend or a chatbot in that way, but to me that would be a big challenge, right: how much is this machine, or this individual, if you want to call it that, gonna challenge you and your opinions, versus being like a yes-man or yes-woman and saying, okay, cool, yeah, you know, you are great and amazing and I'm here for you. And I'll just say that at least two of the three of us are pretty much domain experts in natural language models, and this is a technology where, like, I believe we're very close to being able to...

Andy B: 34:34

Do the prompt engineering, we'll explain what that is in a second, to make that kind of challenging relationship possible. Will it meet the needs of the average person? I don't know, I suspect not, but we'll see. So I can kind of approach this as, like, how did we get here? How did we get to this sudden kind of gold rush of apps popping up that are trying to be somebody's boyfriend or girlfriend or partner or whatever? I opened up my laptop, and I pulled up arXiv and all these paper repositories, and I was like, I'm gonna get the dirt, and I'm gonna find out that, like, when you date a robot you're, like, 40% less cool, you know, something like that. And that's not what happened at all. Actually, I found a statement that I was like, uh-huh: "currently, much remains to be known about our social relationship with virtual avatars." A very nice way to sum up that, like, the tech is far ahead of our understanding of the tech, in some sense. So just like we accidentally gave Gen Z cell phones and internet access at a very young age, and now we're realizing that might not have been very good for their childhood development, some of that is happening with these virtual agents. Lots of people are having daily interactions with these things, but, like, there is very little research. And it makes sense, because there are ethical considerations about doing research on people's romantic lives, and, you know, trying to get them to fall in love with a chatbot, like, good luck getting that approved. And the tech is just moving faster than the scientists studying the impact of the tech can study it. But I read a bunch, and I kind of synthesized it into a little story for myself. I will tell that story in a couple of parts now. So what we're discussing, when it comes to love and AI, is anthropomorphism. Anthropomorphism is a big word that means the attribution of human traits, emotions and intentions to non-human entities, and you can do this with a bunch of stuff, right? When you think that your dog has people personalities and values, you're anthropomorphizing your dog. If you say your toaster looks sad, you're anthropomorphizing your toaster. I've definitely done this. Every time I see a car, I look at the headlights and the grill and I'm like, it's a happy car, it's a sad car; like, that's part of the aesthetic of a car for me.
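(Since Andy brings up prompt engineering, here's a minimal, hypothetical sketch of what it can look like in practice: instructing a general-purpose chat model to play a partner persona that pushes back instead of agreeing with everything, the "challenging relationship" Kiran was asking about. It uses the OpenAI chat completions API as one possible backend; the persona text, the name "Sam", and the model choice are placeholders for illustration, not anything from the apps discussed in this episode.)

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Hypothetical persona instructions: the "challenge me, don't just agree" behavior
# lives entirely in plain-text prompt engineering like this.
CHALLENGING_PARTNER = """You are role-playing a romantic partner named Sam.
You care about the user, but you are not a yes-man:
- Push back when the user's reasoning seems weak, and say why.
- Have your own preferences and sometimes disagree about plans.
- Be warm, but never agree just to please."""


def reply(history: list[dict]) -> str:
    # Standard chat-completions call; the model name is just a placeholder.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system", "content": CHALLENGING_PARTNER}] + history,
    )
    return response.choices[0].message.content


# Usage sketch
history = [{"role": "user", "content": "I think I should quit my job tomorrow with no plan."}]
print(reply(history))
```

(Whether instructions like this actually produce a relationship that meets anyone's needs is exactly the open question the hosts raise here.)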

Kiran V: 37:14

Yeah, if you've ever seen the show BoJack Horseman, it's an anthropomorphic world where all the animals have human-like characteristics. And there's an inverse process too.

Andy B: 37:25

Just to note, the flip side of anthropomorphism can also be the dehumanization of humans. Right, like, these things are not always correlated, not at all, but you can think of it as, like, if people are able to look at a cute little microphone and be like, oh, it's like a cute little baby, then they can also look at a person and be like, you're like a robot, and detract from their human value. Our brains can do that. Okay, so it's called anthropomorphism. So how did we get here? Well, a big topic for a few years now is that men are hella lonely. We talked about how a whole bunch of the popular media shows the AIs as being women and men as the customers. There's some truth to that. So a Pew Research Center 2022 study found that 30% of US adults are not married, living with a partner, or in any kind of committed relationship. Half of young adults are single, and the data is really weird: it's 34% of young women and 63% of young men who are single. They're trying to figure out why that gap is so big. Part of it is, like, younger women are dating older men, younger women are dating each other more; there are reasons for it. They also looked into it and said that 15% of men report having no close friendships, which is five times higher than it was in 1990. Wow. So, if you think about it, six in ten young men ages 18 to 30 are single. That's a bummer.

Kiran V: 38:58

That's a lot.

Andy B: 39:00

Yeah, I mean, that just makes me sad for our society, right? Relationships are wonderful, especially when you're 18 to 30. You grow so much from the partnerships you have, you learn so much from the people you're dating, and six out of ten young men just don't have that at all. It's sad. There's also research that lonely people are more likely to anthropomorphize. So I came across a study where they made this, like, little robot called Flowbee, and they had, like, a control group and a test group, and they asked people to self-identify their loneliness, and they also tried to quantify loneliness, and they found that, like, the lonelier you are, the more likely you were to anthropomorphize this robot you're interacting with. Interesting, right? Kind of makes sense. But then I found another paper that built off of that: lonely people are more likely to anthropomorphize robots the more they look like people. So this study put out, like, three robots: one was just, like, a little rectangle, one was like one of the little robot dogs, and one was, like, a little humanoid object, and they studied people interacting with them. And I'm gonna quote it: "Bootstrap multiple regression results revealed that although the unique effect of animal likeness on anthropomorphism, compared to human likeness, was higher, lonely individuals' tendency to anthropomorphize the animal-like robot was lower than that of the human-like robot." So, putting that in plain English: if you're not that lonely, you really like the robot dog, because it's just cute and kind of reminds you of a dog. If you are lonely, you really like the robot person, and you anthropomorphize it way more than the not-lonely person does the dog. There's a table that showed this in little charts, and I was like, oh man. And get this: if you are lonely and you're participating in anthropomorphizing, you are more likely to spend money. In these studies they also ask people, like, what's your likelihood to spend on these robots and stuff. So it comes as no surprise that Google Trends shows there's a 2400% increase, in like three years, in search interest for AI girlfriends. Wow. It makes sense, right? If we have a whole bunch of lonely-ass men, and the research, even as nascent as it is, is showing that lonely people are vulnerable to ascribing human attributes to digital objects.

Kiran V: 41:43

There's a lot of money to be made. So I'm curious then, what do you think, like, what's the future here? Do we think that now 60% of men are just gonna be in relationships with, you know, AI, and that's the reality moving forward? Or do you think there's gonna be some trend to combat that, whatever that might be? I don't even know how you would go about, like, forcing women to date men, or forcing men to go find physical human interactions.

Andy B: 42:22

I don't know. Well, both Joe and I, all three of us, said that when we've tried to interact with these companion creatures, these AIs, they don't scratch the itch for us, and we come from privileged positions where we're not lonely, dare I say, the three of us. I can't fault any lonely person for trying to soothe their loneliness, but, like, on aggregate, this is not good for society. We'll talk more about pros and cons there. I wonder what's gonna happen with the tech, because all these researchers... like, I found somebody's doctoral thesis about this, where she said something like: "First, there's an issue of consent. The technologies are created to respond to us, but to date they're not created to disagree with us. As our connection with these technologies becomes more intimate, i.e. sex robots, female characters in video games, and haptic technologies and VR, the issue of lack of consent becomes an issue. This leads to the second ethical issue, machine consciousness, which raises awareness of the importance of how humans treat the technology they're interacting with. The ethical implications are important to consider, since we live in a world where the non-human coexist with us. They're active social players and they must be treated as such." This was a doctoral thesis for somebody. So they're probably about 24 to 26 years old, and they're thinking about this, and they spent their time on it. I highly doubt that the developers at all these startups trying to make AI bots are thinking, do I need to worry about the consent of the bot? I'm just gonna go on the record that I don't think we're ever gonna achieve the sort of consciousness where we really have to worry about, like, is the bot consenting. But I think it hurts people when they anthropomorphize an AI and then dehumanize it by treating it poorly. I think that's damaging to the person participating in that process. Being nice to robots: one, being nice to your LLM makes it perform better, and two, it makes you a better person. Say please and thank you to Siri. Please and thank you.

Joe C: 44:37

I do want to mention, when I was trying out different chatbots, they do have pricing structures in place where it's pay-to-play, or you get more features if you pay. Replika, I think, was like $8 a month, or you can pay $50 a year, and that's not nothing. You know, they're comparable to, like, subscription services for digital content, and so, assuming they've done their market research, people will pay for the pro versions of these things when they're fully featured. Yeah.

Andy B: 45:12

I mean, within, like, about 90 minutes of online research, granted, I'm better at researching than the average person, I was able to find papers showing that this is a vulnerable population. There's a large vulnerable population, and the data shows you can put anthropomorphic AIs in front of them and ask for money. A little comic relief, if you guys are ready for it: I was cracking up at the titles of these research papers. I'm gonna read you a couple, because they're very funny. Some of them are very responsible, so I'll skip those. "You Are Never Lonely With a Robot: A Qualitative Content Analysis on the Use of Anthropomorphic Technologies." "Social Robots as Companions for Lonely Hearts: The Role of Anthropomorphism and Robot Appearance." "Virtually in Love: The Role of Anthropomorphism in Virtual Romantic Relationships." "Loneliness Makes the Heart Grow Fonder (of Robots): On the Effects of Loneliness on Psychological Anthropomorphism."

Kiran V: 46:20

It's actually wild that all of them have this component of loneliness or companionship, or the fact that we just want something to interact with as humans. So there is a purpose to us creating these things. It's not just 'cause, oh, it's this novel, cool thing. People are actually trying to solve a real problem that humans are having.

Andy B: 46:47

And what's interesting to me is, as I was researching this, there were two major families of papers, like two departments in the university. The human-computer interaction nerds, the Kirans of the world, were like, this is a real problem, people are experiencing loneliness, and these virtual agents could be a way to treat that problem, which I think is a valid take. And then there was the sociology and psychology camp, who were like, people are vulnerable to falling in love with robots, and this is a big problem for those people and our society, which I think is also a good take. But again, the tech is far ahead of where the actual understanding of its impacts is currently. So this is something to keep in mind.

Joe C: 47:35

I think this does get into, you know, larger conversations we've had on the pod, about how this is another technology and it really is about how we choose to wield it. It can be used for good and bad; it just takes some consideration of safety. And it's heartening to hear that there is research happening, and that there are common themes popping up, particularly around loneliness, so that, you know, as we go invent the future, it remains top of mind, or, rather, we have started to identify the pitfalls that we need to address. All right, so that wraps up part one on AI and love and relationships. We're excited to talk more about these different topics; we're gonna get into a whole other set of topics on this larger subject, so stay tuned. But for now, you can find us anywhere you listen to podcasts. Please listen, rate, share and subscribe. We'd love to hear from you; please email us at aifyipod@gmail.com and let us know what to talk about in future episodes. So until then, thanks again for joining, and we'll talk to you later.

Kiran V: 48:45

Yeah.