NT 012 - Humans and Cyborgs and Bots! Oh, My!

Episode description

There is nothing artificial about this episode, although we wouldn’t call it intelligent either. We debate the nature of an augmented human vs. an android and Chris remembers a story he came up with a long time ago…

Executive Producer - D1MC3

Episode Art By - Kit

0:00

*music*

0:15

Alright, we're on the road

0:16

again.

0:17

On the road again.

0:19

Oh no, no, copyright, copyright!

0:21

Yeah, I was about to say, that's

0:23

where the song has to be cut

0:25

off to avoid copyright.

0:27

Hi, Kit.

0:29

Meow.

0:30

Meow. Yes, meow indeed.

0:32

We have roughly 30 unique

0:34

listeners per month.

0:36

What?

0:37

Of which I'm one.

0:39

What?

0:40

Because I always listen to our

0:42

show for quality and also for...

0:45

Comedy?

0:46

Ego purposes.

0:48

Oh.

0:49

But, something unique happened

0:51

right around the time when we

0:53

were recording episode 10.

0:56

We officially got our first

0:58

donation.

0:59

Yay!

1:00

Yeah, I'm as shocked as you are.

1:03

Someone actually thought that

1:05

it would be worth their time to

1:07

spend money.

1:09

So...

1:10

We want to give a shout out to

1:13

Dimce, also known as Salty

1:16

Dimce, also known as D1MC3, I

1:20

believe.

1:22

Are you saying a radio call

1:23

sign?

1:24

No, that's how he spells his

1:26

name.

1:27

Now, to be fair, we've known

1:28

Dimce for quite some time,

1:30

both Kit and I.

1:32

Yeah.

1:33

We appreciate his encouragement.

1:34

He flat out said, "Keep up the

1:36

great work or good work on our

1:39

Ko-Fi." And now, we are

1:41

officially

1:43

$5 closer to our hypothetical $1,000.

1:48

And then eventually, it'll go

1:50

up because of inflation.

1:52

Inflation, obviously.

1:53

Yes.

1:54

So, Dimce, thank you for

1:55

listening.

1:57

Thank you for your contribution

1:59

to our podcast.

2:01

We definitely appreciate it.

2:03

And, once Kit sets up a PayPal

2:05

account, I will happily give

2:08

him his half of the funds.

2:11

Also, I'm not going to get too

2:14

into this, but I will own and

2:17

apologize for the technical

2:21

issues at the end of Episode 10.

2:24

And the entire technical issue

2:27

that was Episode 11.

2:30

We've since changed our

2:31

recording methodology to not

2:33

use a wire.

2:35

Hopefully not break.

2:36

Yes.

2:37

And I've also invested in a...

2:41

it's not the pro version of

2:43

this plug-in, but it is a pro

2:45

plug-in to better assist with

2:48

some audio quality things over

2:51

time.

2:52

So, hopefully you'll notice

2:53

that difference, but I will own

2:55

the technical issues and

2:57

hopefully, this time around, we

2:59

won't have any.

3:01

Hopefully.

3:02

Hopefully.

3:03

Rain.

3:04

If it rains, it rains.

3:07

If it rains, it pours.

3:09

And sometimes when it rains, it's

3:11

raining cats and dogs.

3:14

Yeah, that's painful.

3:16

For who?

3:17

The humans or the animals?

3:19

Yes.

3:20

Yeah, and Weird Rain has

3:22

definitely been the premise of

3:24

multiple horror films, so...

3:28

That's always fun, too.

3:30

There are two things

3:32

that will always be part of Non-Topical.

3:37

One is inflation.

3:39

Can you guess what the other

3:41

one is?

3:42

Yes.

3:43

Yes, I can, but it's your job

3:44

to say it, so please do.

3:46

Cheese...

3:47

Quesadilla...

3:49

Anyway, speaking of technical

3:51

stuff, I want to start off

3:53

today with a bit of a thought

3:55

exercise with you.

3:57

Oh no.

3:58

Oh no.

3:59

Oh no.

4:00

Yeah, I know.

4:02

You have robots.

4:04

Yeah.

4:05

All right, a robot is a pretty

4:06

straightforward thing.

4:08

An ordinary robot or a quote-unquote

4:10

"AI robot"?

4:12

I'm not talking text-based

4:14

robots like LLMs and ChatGPT.

4:17

I know.

4:18

I'm saying in terms of...

4:21

A robot is a mechanical thing

4:23

that may or may not have

4:25

intelligence that can do a

4:27

variety of things.

4:29

Are you talking about robots in

4:30

general or like biped robots or...?

4:33

I'm talking about robots in

4:34

general, but still.

4:35

I'm thinking more along the

4:37

line of androids and things

4:39

like that.

4:41

Like humanoid-looking robots,

4:42

which don't really exist yet.

4:44

I mean, they do.

4:46

So, there are robots.

4:49

Yeah.

4:50

And then there's augmented

4:52

humans that have body parts

4:54

that are mechanical.

4:57

I'm trying to think...

4:59

Like cyberpunk-style.

5:00

Well, not just that, but...

5:02

I'm trying to think there's a

5:03

term for it. Cyborgs.

5:05

Yeah.

5:06

So, there's cyborgs, which are

5:08

part human, part machine.

5:10

And then you have pure human,

5:11

which is...

5:12

Well, what we're born as.

5:14

Yeah.

5:15

The meat puppets that are

5:18

somehow animated by thoughts

5:21

and/or spirit and/or whatever.

5:25

All right, so we have these

5:26

three things.

5:27

So, you have the scale from

5:29

meat sack to mechanical

5:32

nightmare.

5:34

Computer.

5:35

Yes.

5:36

That is the scale.

5:37

From meat sack to mechanical

5:38

nightmare.

5:39

Yeah.

5:40

The question is...

5:43

Where is the line between those

5:45

different phases?

5:48

So, you have meat sack, you

5:51

have cyborg, and you have

5:53

mechanical terror.

5:56

Or not terror, but you get what

5:58

I'm saying.

5:59

Yeah.

6:00

If my heart is faulty...

6:03

Yeah.

6:04

And they give me a heart pump

6:06

or a mechanical heart, am I a

6:09

cyborg yet?

6:11

Technically.

6:12

You have a mechanical

6:15

modification to your body.

6:19

Yes.

6:20

All right.

6:21

Let's go more simple again.

6:23

Repeat that again.

6:24

Might not have heard that.

6:25

Yeah.

6:26

Let's go more simple.

6:28

I go to war.

6:29

Okay.

6:30

All right.

6:31

I go to war.

6:32

I accidentally step on a landmine.

6:33

What a shame.

6:34

I'm lucky.

6:35

I only lose my right leg.

6:36

Good for you.

6:37

Now, when I get back, they

6:37

replaced my right leg with a

6:37

prosthetic.

6:38

Now, a prosthetic is not

6:39

robotic really, unless it's one

6:41

of those fancy prosthetics they're

6:43

working on where they can

6:47

connect wires into your brain

6:52

and then your brain controls it.

6:59

Yeah.

7:00

I think that's the more

7:01

specific thing of what is a cyborg

7:03

when it actually connects to

7:05

you

7:06

in a way where you can actually

7:10

control it with your brain and

7:14

not like... you move the little

7:19

leg stump and then the leg also

7:20

moves.

7:21

Okay.

7:22

The moment that it's like

7:23

actually connected to your

7:24

neurons and that's when it

7:26

becomes more cyborg-y than just...

7:29

All right.

7:30

So, if it's like the guy from

7:31

Corridor Digital where they 3D

7:33

printed him a pinky, that doesn't

7:35

count.

7:36

Yeah, no.

7:37

All right.

7:38

Like it has to be like...

7:40

because if you think about it,

7:41

a prosthetic is kind of like

7:42

wearing a glove.

7:44

Okay.

7:45

That's how you have to think

7:46

about it.

7:47

You're just putting it on.

7:48

You can just take it off

7:49

whenever you want, but you can't

7:50

just take off cyborg parts

7:52

easily.

7:53

All right.

7:56

The next question is...

7:59

So, we have determined for the

8:00

purposes of this conversation,

8:02

not the purposes of research

8:05

and PhDs and people who deal

8:07

with this type of stuff

8:09

professionally, if you even

8:11

want to call them that.

8:14

No, I'm kidding.

8:15

Of course, they're

8:15

professionals.

8:16

Yeah.

8:17

But our definition of somebody

8:18

who is a cyborg is somebody who

8:20

has mechanical or computerized

8:22

parts that they can control

8:24

with their brain.

8:26

Like I wouldn't count a...

8:27

Or directly connected to their

8:28

brain.

8:29

I wouldn't count a pacemaker as

8:30

a cyborg thing, I don't think.

8:33

Well, you wouldn't count a

8:34

heart pump by that definition

8:35

either because it's not

8:36

connected to a brain.

8:38

Yeah.

8:39

So that's just augmentation.

8:40

Yes.

8:41

All right.

8:42

The next question is a little

8:44

bit more tricky.

8:46

Yeah.

8:47

You have from cyborg to...

8:49

Machines?

8:50

See, here's the thing.

8:52

I knew you were going to go to

8:53

this point so I already thought

8:55

about this.

8:56

Okay.

8:57

This is how I am going to

8:58

define it.

8:59

If you have a good portion of

9:01

your brain, like at least 25%...

9:05

Wait, there's a good portion of

9:07

my brain?

9:08

No.

9:09

If at least 25% of your brain

9:12

is run by mechanical things or

9:15

if 50% of vital organs are run

9:18

by machine...

9:21

At that point I think you're

9:23

getting more to the machine

9:26

side than the organic side.

9:29

Yes, but where's the pure cut

9:33

over?

9:35

And this is where it gets funky

9:36

and I'll explain why.

9:38

Don't worry, I tend to agree

9:40

with you.

9:41

But if you look at...

9:43

There's a classic anime known

9:45

as Ghost in the Shell.

9:47

Not the Ghost in the Shell TV

9:49

series, which is equally

9:50

good from what I've heard.

9:53

But the entire premise of Ghost

9:55

in the Shell is that you could

9:57

take a human consciousness and

10:00

put it into an Android body.

10:02

And there are a variety of

10:03

different...

10:04

Yeah.

10:05

There's plenty of different

10:05

takes.

10:06

There's plenty of different

10:07

takes like that.

10:08

The latest Alien: Earth series

10:09

is based on that concept.

10:12

Even... even... spoiler for Stray

10:15

the Cat Game, but yeah, even in

10:17

Stray there's that as well.

10:19

The question then becomes, if

10:22

your body is all mechanical and

10:25

your brain is technically a

10:27

computer, but your

10:29

consciousness is that of a

10:31

human and you have...

10:33

And you have...

10:34

Funny.

10:35

Memories of a human and you

10:37

have the morals of a human and

10:39

the computer is only existing

10:41

to continue the brain functions

10:44

of the human, but in a computerized

10:47

way.

10:50

Is that person now an android?

10:52

You know what?

10:53

Or are they still a person?

10:55

I think that Game Theory

10:58

described it best:

11:01

once a person dies, it's

11:03

no longer going to be that

11:06

person for real because it's

11:08

not going to interact in the

11:10

same way that person would.

11:13

So it's just mimicking what it

11:15

knows about that person.

11:17

So, you're saying it can't grow

11:19

and evolve and change?

11:21

It would not be... it would...

11:23

it could, but it would probably

11:25

not be the same as that person

11:27

would.

11:29

But it would be that person's

11:29

consciousness.

11:32

At the same time, it's still

11:33

just mimicking that person.

11:36

It is still a machine.

11:38

That's interesting.

11:40

That's interesting.

11:41

That's Game Theory's take on

11:42

something.

11:44

That's an interesting theory.

11:46

I mean, it is called game

11:47

theory where I just snatched it

11:49

from, so...

11:51

Actually, this reminds me of a

11:52

story...

11:54

Of a story that I had in mind

11:55

that I never fully wrote or

11:56

anything, but...

11:59

That... that could tie into

12:01

this.

12:02

And could become a movie.

12:05

So...

12:07

Here's the premise.

12:09

There's a kid...

12:11

Wait, are you talking about

12:11

something that already exists?

12:13

No.

12:14

I'm talking about something

12:15

that I had an idea for when I

12:16

was like in middle school or

12:17

high school.

12:18

Oh.

12:19

That could become a movie.

12:21

Or not, because I don't have...

12:23

Horror comic or something.

12:25

Whatever.

12:27

You have this kid.

12:29

Kid is homeschooled.

12:31

Kid's never really left the

12:31

house.

12:34

Kid is...

12:35

But never thinks anything weird

12:36

of it or anything.

12:37

Hmm.

12:39

And...

12:41

It goes through the motions of

12:42

living.

12:43

Goes to sleep.

12:44

Has dreams.

12:46

Does creative things.

12:47

I think...

12:48

I think I can see where this is

12:48

going.

12:49

Can do art and stuff.

12:51

Seems to be, from his own

12:52

perspective, a kid.

12:55

Yeah.

12:57

I think I can see where this is

12:58

going.

13:00

Then...

13:00

Nah, I don't think you do yet.

13:03

Then...

13:05

Parents disappear.

13:07

Ah.

13:08

Parents disappear under

13:08

mysterious circumstances.

13:11

What a shame.

13:13

And...

13:15

Kid is trapped in house.

13:17

Has never really left.

13:19

Well, is he trapped or does he

13:20

just not know how to unlock

13:22

doors?

13:24

Well, kid isn't trapped per se,

13:25

but has never left house.

13:28

Yeah.

13:28

Alright.

13:29

This could also be a teen

13:29

instead of a kid, whatever.

13:32

Person.

13:33

But has never left house.

13:35

Maybe has made one online

13:35

friend that lives in the area

13:37

supposedly, whatever.

13:40

Supposedly.

13:41

Actually...

13:43

Actually, no.

13:44

I do remember...

13:46

They did make...

13:48

One friend.

13:50

If I remember my initial idea.

13:53

And they started reaching out

13:53

to this friend and like, kinda

13:55

kept it secret from their

13:56

parents.

13:58

Their parents said, "Don't go

13:58

online. Don't mess with people,

13:59

you know."

14:01

"Just go online for homework.

14:01

Don't go on social media. It'll

14:03

mess you up."

14:07

Smart parents.

14:09

Yeah.

14:10

Because social media will fuck

14:11

you up.

14:12

Especially now.

14:14

Wait, what?

14:15

Especially now.

14:16

Yes.

14:18

Anyway.

14:20

Kid makes friends online.

14:22

Does chats with her.

14:24

All this stuff.

14:26

But his parents disappeared.

14:28

What a shame.

14:30

They just go out one day and

14:30

don't show back up.

14:34

Now...

14:36

The kid doesn't know how to

14:36

leave the house or isn't

14:37

comfortable leaving the house

14:39

cause he never left.

14:42

Yeah.

14:43

Yeah.

14:43

And...

14:45

And...

14:47

He starts exploring more and

14:48

trying to figure out what's

14:50

going on but...

14:52

Because he doesn't really feel

14:53

comfortable leaving the house

14:54

he's stuck just using computers

14:56

and whatnot.

14:58

So he reaches out to his friend...

15:01

And says, "Hey...

15:03

I'm okay but my parents have

15:04

been missing under who knows

15:06

what circumstances for like a

15:08

month."

15:10

Wait, I have an important

15:10

question now.

15:13

Yes.

15:14

How would he still be alive?

15:15

He has no food.

15:17

And no one's paying the water

15:18

and electrical bills.

15:21

Don't worry.

15:22

Don't worry.

15:22

We'll get there.

15:24

Oh yeah, right.

15:25

Anyway.

15:26

Right, right.

15:27

I remember where this is going

15:27

so I don't...

15:29

I understand.

15:30

I don't know where it's going

15:30

but you don't entirely.

15:31

This has multiple twists.

15:32

Yeah, yeah.

15:33

I have...

15:34

But you can help me to workshop

15:35

it a little bit as we go

15:36

forward.

15:38

You can start to come up with

15:38

some ideas.

15:41

Alright, so...

15:43

Girl character, because she is one,

15:43

and she is roughly his age so

15:45

we'll say they're both like 16,

15:47

17, whatever.

15:50

Something or other.

15:51

Young adults.

15:53

They're in love.

15:55

Or something.

15:56

They're not yet in love.

15:57

But...

15:59

Girl comes over, knocks on door.

16:03

Kid opens door.

16:04

Wait, hold up, hold up.

16:05

How does she know where he

16:05

lives?

16:07

She...

16:08

He tells her where he lives.

16:09

How would he know if he's never

16:10

gone outside?

16:11

Does he just like look at the

16:12

mail address?

16:13

I mean you gotta know your

16:13

address so you can get mail.

16:16

Yeah, but why would he get mail?

16:19

Look, maybe he has a magazine

16:20

that he likes, okay?

16:22

Look, it's a common thing.

16:24

Look, you don't get much mail

16:25

but you know your address.

16:26

But how would he know about any

16:28

magazines if he's not supposed

16:30

to go online?

16:33

You do realize that maybe his

16:34

parents were encouraging him by

16:36

getting him things that he was

16:38

interested in in terms of...

16:41

I don't know, maybe he's into

16:43

mod development and there's mod

16:45

development monthly.

16:47

Who knows?

16:48

Who knows?

16:50

Point is, he knows his address.

16:51

So he gives the address to the

16:51

girl.

16:52

He opens up the door.

16:54

He starts talking to her and

16:55

she's like, "Where are you?"

16:58

And he's like, "I'm right here.

17:00

What are you talking about?

17:01

Where am I?"

17:03

And she goes, "No, like where

17:03

are you?

17:04

I don't see you.

17:05

I can hear you.

17:05

I don't see you anywhere."

17:06

And as they're doing that, she's

17:09

starting to get like really

17:12

confused and he's really

17:15

confused.

17:17

Because he's like, "No, like I

17:18

opened the door for you.

17:19

What are you talking about?"

17:20

And she goes, "Well, look, I

17:23

don't know what type of game

17:25

you're playing with me here.

17:29

You called me over.

17:30

You told me your parents are in

17:31

trouble.

17:32

Why are you playing this game

17:33

of like hiding and talking to

17:35

me through some speaker system?"

17:37

And this is when things start

17:39

to unravel for the guy a little

17:41

bit.

17:42

So he discovers, so girl comes

17:45

to his room.

17:47

He tells her, he's like, "Well,

17:48

fine.

17:49

Let's just go to my room."

17:50

So he sees, like he feels

17:51

himself walking to his room.

17:54

You know, he sees her walking

17:55

to his room.

17:58

They get to his room.

17:59

His computer turns on.

18:01

He's like, "See, look, I turned

18:01

on my computer.

18:03

I did this.

18:04

I did that."

18:05

And she's like, "Yeah, but you're

18:05

not here.

18:06

What is going on?"

18:07

Like, are you in some, like, I'm

18:08

getting sick of this.

18:11

And they come to discover that

18:14

he's actually an AI

18:16

consciousness that was raised

18:19

from the level of a child.

18:23

And all the cameras in the

18:24

house.

18:25

Well, here's an important

18:26

question.

18:27

How did he never think to look

18:28

down at himself?

18:30

Like, if you go to bed...

18:31

But he did.

18:32

For example, for

18:33

video chat, he had

18:35

an avatar that looked like a

18:36

real person.

18:38

Yeah.

18:39

So he has a sense of what he

18:40

looks like.

18:44

He knows what he looks like.

18:46

He knows what he feels like.

18:47

Like, he actually feels

18:48

sensations.

18:50

And to the food question...

18:52

Yeah, he doesn't need to eat.

18:53

He would be sitting down with

18:56

his quote unquote family and

18:59

eating food with them.

19:03

So at this point, the girl is

19:05

still going to stay.

19:07

They've figured out this thing

19:08

and they want to discover what

19:09

the heck happened to his

19:10

parents.

19:11

So he can't pick the lock to

19:13

his parents' room.

19:16

And he also can't just leave

19:17

because...

19:19

Right.

19:20

There's no cameras outside.

19:22

He can't pick the lock to his

19:25

parents' room.

19:27

Because he's not physical.

19:28

Because he's not physical.

19:29

But she can.

19:30

And there are cameras in his

19:31

parents' room.

19:32

But he, for some reason, can't

19:33

get in there until...

19:35

Well, you know what that means?

19:36

It means there's more than one

19:37

AI.

19:38

No.

19:39

There's not.

19:40

And there's like an answering

19:40

machine.

19:41

And there's also a secret door

19:42

in his parents' room.

19:43

Now, once the girl gets in,

19:44

somehow that gives him access

19:45

to the room as well.

19:47

I don't know how.

19:48

We're not going to explain it.

19:49

Magic.

19:51

Science magic.

19:52

Yes, exactly.

19:53

There's probably some kind of

19:53

science magic.

19:54

There's some kind of science

19:55

magic on the door.

19:56

Sensor on the door.

19:57

Whatever.

19:58

Since I'm going to assume that

19:58

this is some kind of experiment.

20:01

Anyway, she gets in.

20:02

There's more than one AI.

20:03

There's more than one AI.

20:04

No.

20:05

There's not.

20:06

And there's like an answering

20:06

machine.

20:07

Anyway, she gets in.

20:08

It gives him access.

20:10

And there's an answering

20:11

machine in there.

20:13

They press the button and there's

20:14

a male voice on the answering

20:16

machine like,

20:18

"I will not let you continue

20:19

doing what you're doing.

20:22

You are a threat to humanity.

20:24

You are a threat to everything

20:25

that we hold sacred.

20:28

Why are you still doing this?"

20:30

I don't know.

20:31

So there was a threatening

20:32

message on the phone.

20:33

Oh.

20:35

Uh huh.

20:36

And there's also a secret door.

20:39

So they get into the secret

20:40

door.

20:41

And somehow there are cameras

20:43

there too.

20:45

It goes down.

20:46

Yeah, of course.

20:47

Like we have to just explain

20:49

away that maybe...

20:53

Maybe there's some kind of dongle

20:54

that he says, "Hey, my parents

20:56

always wore this dongle when

20:58

they walked from room to room,

20:59

but they always took it off

21:00

when they went into their room

21:02

and I never understood why."

21:04

Because that was the ability to

21:05

kind of carry his consciousness

21:06

into the rooms that he might

21:08

not have normally been allowed

21:09

to.

21:10

Especially when you're a kid.

21:12

Like you might not be allowed

21:13

in certain rooms in the house.

21:15

Yeah.

21:16

So anyway, they go downstairs

21:18

and there's a body that's in

21:20

like a vat of liquid hooked up

21:22

to heart monitors, hooked up to

21:24

brain monitors, all this stuff.

21:28

And it's a real body.

21:29

Yeah.

21:30

And it's a real body of a kid

21:33

roughly his age.

21:37

It's like, what the hell?

21:39

Now, this is when they start to

21:41

get the big reveal about who he

21:42

actually is.

21:44

And it turns out that his

21:46

parents, "parents" quote

21:49

unquote, were really brilliant

21:53

scientists, obviously.

21:56

Obviously.

21:57

And they didn't lose a child

22:01

through childbirth per se.

22:07

But the child they gave birth

22:10

to was not, for whatever reason,

22:14

able to live on its own.

22:18

Or was not able to fully

22:19

function at that level of

22:21

development.

22:24

And so they created, using

22:26

their brilliance, an AI with

22:30

the full capacity of a baby.

22:35

And then instead of ingesting

22:37

everything like a large

22:38

language model does, they

22:41

taught it the same way, at the

22:43

same pace you would teach a

22:45

child.

22:47

Well now I can definitely see

22:48

where this is going.

22:52

You think you do.

22:53

But not entirely.

22:54

So obviously, the guy thinks to

22:56

himself, "I need to go and save

22:59

my parents from whoever this

23:02

sleazebag is."

23:04

And they, as they're looking

23:05

through the records, they find

23:07

this third doctor that the

23:08

parents used to be with at like

23:10

MIT or Berkeley or whatever.

23:13

WPI.

23:14

WPI.

23:15

Right.

23:16

So it turns out there's a

23:17

command to put his

23:18

consciousness in the body,

23:19

which nobody is surprised at at

23:21

this point.

23:22

Yeah.

23:24

And what you don't see coming

23:25

is the fact that there's

23:27

another consciousness in the

23:29

body.

23:31

Just because the body wasn't

23:32

necessarily able to live on its

23:34

own, we'll make up something

23:36

about the parents doing

23:38

physical therapy with the body

23:40

as it's in a coma and feeding

23:42

it and finding some way to

23:44

program information like

23:46

language and knowledge of the

23:48

world into the brain of the

23:50

body.

23:51

Because otherwise, when you

23:52

moved the AI to the body, the

23:54

body would be all atrophied and

23:56

the brain wouldn't have the

23:58

same knowledge.

24:00

Yeah.

24:01

So anything he learned was also

24:02

put into the brain of this body.

24:05

Yeah.

24:07

Well, what was also in the body

24:09

was the actual personality of

24:12

the kid that for whatever

24:15

reason couldn't move on his own.

24:19

Yeah.

24:20

Yeah.

24:21

So now you have this situation

24:23

where you have two

24:25

consciousnesses in one body and

24:28

consciousness of the body is

24:30

rightfully angry at AI because

24:33

like AI has had all this

24:35

freedom and has gotten to learn

24:38

and experience the world.

24:41

Will the consciousness of the

24:42

body understand what AI is?

24:45

Well, yeah, because it was

24:46

programmed with the same

24:46

knowledge that the AI had.

24:48

I mean, just because it's not an AI

24:49

doesn't mean that it's not

24:51

aware that AI exists.

24:53

Hmm.

24:54

So anyway, but they both agree

24:57

that they've got to go out and

24:59

figure out what happened to the

25:02

parents.

25:04

And it's at this point where

25:05

there's like a command center

25:08

in this lab and they see a

25:09

strange man pulling up and it's

25:11

not the guy.

25:13

It's not the doctor, but there's

25:16

a strange man pulling up in a

25:19

black vehicle.

25:21

They get out of the vehicle.

25:22

That's all sketchy.

25:23

And they're clearly carrying a

25:25

weapon of some kind.

25:28

So now AI is in a physical body

25:31

and AI is mortal.

25:34

And obviously, girl that AI has

25:36

been friends with is also

25:39

mortal.

25:40

And there's a guy with a gun

25:41

coming into the house.

25:43

So this is where Adventure Mode

25:45

kicks in and they all have to

25:48

go on a crazy journey to try to

25:51

find sketchy doctor following

25:53

clues.

25:55

And it turns out that the

25:56

doctor, you know, didn't want

25:59

this to happen and is going to

26:01

kill the kid and thus kill both

26:03

the kid and the AI inside the

26:06

physical, like the AI kid and

26:08

the actual consciousness.

26:11

So the AI didn't back up its

26:12

memory or anything?

26:14

Well, who's to say it would

26:15

know how?

26:16

So anyway, so they go on this

26:19

adventure.

26:20

They finally find where the

26:21

doctor is.

26:22

They find the parents and it

26:24

gets to a standoff situation

26:27

where, you know, it's...

26:30

you know,

26:33

there's some kind of

26:36

choice.

26:38

Either, you know, I'll

26:40

kill your parents and

26:43

your little chickadee here, or

26:45

you will have to take this pill

26:47

to suicide yourself.

26:50

Oh no, I said suicide, heaven

26:51

forbid.

26:52

Oh no.

26:54

And the kid being altruistic is

26:56

going to sacrifice himself.

27:00

Of course.

27:01

Both the AI and the

27:03

actual consciousness in the

27:05

body decide it's better to

27:06

sacrifice themselves and save

27:08

their friends and family.

27:11

Also just from a logical point

27:13

of view, one, technically two

27:15

in exchange for three.

27:17

Yes.

27:18

Yes.

27:18

But.

27:19

Even though no one would ever

27:21

do that.

27:22

Girlfriend managed

27:24

to, at some point in their

27:26

adventure, get a knife or a

27:28

small pistol or something.

27:31

So as kid swallows pill,

27:33

girlfriend struggles out of her

27:35

restraints.

27:37

And of course they fall in love

27:38

at some point.

27:39

Yeah.

27:40

You know, the actual

27:41

consciousness in the body doesn't

27:43

really care about her, but the

27:45

AI obviously does because it's

27:46

his only friend.

27:48

And obviously they fall in love.

27:49

And then, so as the AI slash

27:51

other kid are losing

27:53

consciousness from the pill,

27:56

they notice the girl

27:59

getting out of her restraints

28:02

and they start hearing a

28:05

gunshot.

28:07

And then kid wakes up and kid

28:11

is like, huh? What's going on?

28:16

And kid lifts up arm and there's

28:17

actually an arm there and sits

28:19

up. And the parents are like,

28:20

Oh, we're so proud of you. You

28:22

did the right thing.

28:24

He's like, yeah, but like,

28:25

where's Jerry? Who's the other

28:27

consciousness? Jerry's fine.

28:28

Don't worry about it. Jerry's

28:30

not the AI. Jerry is actually

28:32

the other body.

28:33

And Jerry walks in, in

28:35

the body, and is like, yeah, somehow

28:38

when your consciousness entered

28:41

my body, it fixed the neural

28:44

disorder and helped me to be my

28:46

own person. He's like, but then

28:49

who am I? Or what am I? And

28:52

they basically say, well, now

28:55

you can go two routes here. It's

28:58

either A, it's a pure android.

29:01

Or B it's in the dead girl's

29:03

body. No, no, no, no, we're not

29:06

going to do that. Or B the

29:09

parents go, well, there was

29:11

this kid at the hospital who

29:14

was roughly your age, who was

29:16

brain dead and had asked his

29:19

parents to donate his body to

29:21

science.

29:23

And so because of that, and

29:25

because of our connections to

29:27

the different universities, we

29:29

were able to do the genetic

29:31

type matching and brain type

29:33

matching that we needed to do

29:35

and determined that we were

29:37

able to, because then he can

29:40

leave, live a normal life.

29:42

Yeah.

29:43

Versus if you're an Android, I

29:44

mean, yeah, you might have the

29:46

physical appearance of a human

29:47

being, but you're never going

29:49

to age.

29:50

Yeah.

29:51

And like, I mean, your

29:52

mechanics will deteriorate over time.

29:55

Yeah.

29:56

But that's, that could be

29:57

replaceable.

29:58

Up to a point.

29:59

Up to a point.

30:00

But the point being that being

30:02

in a human body allows you to

30:04

have a real life and to have a

30:06

family and whatnot.

30:09

Being in an Android body.

30:11

I mean, yeah, you could still

30:12

be with somebody physically if

30:14

you wanted to, but you would

30:15

never have children.

30:16

I mean, you could adopt if you

30:18

wanted a family, but it's not

30:20

exactly the same.

30:22

So I don't know which route to

30:23

go there yet, but it still

30:25

brings up like moralistic

30:27

issues of dumping a

30:28

consciousness into another

30:30

human's body.

30:31

Yeah.

30:33

That's brain dead that somehow

30:33

will work with the AI

30:35

consciousness, but then the

30:36

actual personality of the

30:38

person from the body isn't

30:39

there anymore.

30:41

It's a little weird.

30:43

Yeah.

30:45

So I don't know what the right

30:47

decision is, but that's a

30:49

movie idea that I had or a

30:51

story idea I had as a kid.

30:53

Obviously it's not that well

30:54

fleshed out.

30:55

So, sir, add your special sauce

30:58

to the mix.

30:59

Shall you?

31:00

One special sauce.

31:02

Look, it's by collaborating

31:04

that we create great content

31:07

that Hollywood will purchase right

31:09

now.

31:10

Nobody's going to purchase this

31:11

turd of an idea.

31:13

Are you sure?

31:14

I don't know.

31:16

I don't know.

31:17

So what feedback do you

31:19

have?

31:19

What notes?

31:21

Uh, how do they get the AI into

31:23

the brain though?

31:27

Uh, they figured it out.

31:28

You're saying at the end?

31:30

No, like in the beginning.

31:33

Oh, it was already set up that

31:33

way in the lab.

31:35

Yeah, but how?

31:36

It was... they did the

31:37

research and they had already...

31:39

I'm saying, how does it

31:40

actually work?

31:43

Look, that is one of

31:46

those weird existential

31:48

questions, sir.

31:51

That I don't think, because the

31:53

movie itself.

31:54

You need to keep in mind that

31:55

like the human brain is like,

31:57

what is it like?

31:58

500 terabytes of memory?

32:00

It's something crazy.

32:02

Look, how, how would you have

32:03

that much memory?

32:05

Well, I mean, with all the AI

32:06

stuff, you know, there might be

32:08

enough to make a full human

32:09

brain if everyone worked hard

32:10

enough to get there.

32:11

If the parents were able to

32:14

keep this physical human brain

32:16

up to date with what the AI was

32:18

learning, then they have to

32:20

have some kind of computer

32:22

interface to be able to deal

32:23

with that.

32:25

And then maybe they

32:26

only copy over the information

32:28

that's different, not the

32:29

information that is the same.

32:31

Therefore, conserving the brain

32:33

storage space.

32:34

Although the brain does not

32:35

technically work like that at

32:37

all.

32:37

It's a network of

32:38

neurons and...

32:39

I was saying the computer brain.
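
[Editor's note: the "only copy over the information that's different" idea above is essentially a delta update. The sketch below is a minimal illustrative toy added for clarity, not anything described in the episode; the snapshot contents are made up.]

    # Minimal delta-update sketch: given an old snapshot of knowledge and a new one,
    # transfer only the entries that changed or were added, not the whole store.

    def compute_delta(old: dict, new: dict) -> dict:
        """Return only the key/value pairs in `new` that differ from `old`."""
        return {k: v for k, v in new.items() if old.get(k) != v}

    def apply_delta(target: dict, delta: dict) -> None:
        """Merge the changed entries into the target store in place."""
        target.update(delta)

    old_snapshot = {"language": "English", "math": "algebra"}
    new_snapshot = {"language": "English", "math": "calculus", "music": "piano"}

    delta = compute_delta(old_snapshot, new_snapshot)  # only 'math' and 'music'
    apply_delta(old_snapshot, delta)
    print(delta)          # {'math': 'calculus', 'music': 'piano'}
    print(old_snapshot)   # now matches new_snapshot

[Tools like rsync apply the same idea with rolling checksums over file blocks rather than dictionary keys.]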

32:41

How would there be enough room

32:43

for the computer brain to learn

32:44

everything a human brain can?

32:47

Well, here's the way

32:49

you do that.

32:51

They have a big lab downstairs.

32:53

The consciousness is in these

32:55

giant servers.

32:56

I'm saying physically, like, I

32:58

don't think... I don't

33:00

even think there's as much

33:01

memory as a single human brain

33:03

in the entire world right now.

33:06

There might be because of all

33:07

the AI stuff, but I doubt it.

33:10

Are you kidding me?

33:11

There are server

33:13

clusters that are petabytes.

33:16

Oh.

33:16

So, 500 terabytes would

33:18

definitely be possible.
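
[Editor's note: a quick back-of-the-envelope check of the storage comparison above. The 500 terabyte figure comes from the hosts; the 2.5 petabyte human-memory estimate and the 10 petabyte cluster size are outside assumptions used only for illustration.]

    # Toy capacity check: does a petabyte-scale cluster hold a brain-sized store?
    TB = 10**12  # bytes per terabyte (decimal units)
    PB = 10**15  # bytes per petabyte

    brain_low_estimate = 500 * TB    # the 500 TB figure mentioned in the episode
    brain_high_estimate = 2.5 * PB   # a commonly cited rough estimate (assumption)
    server_cluster = 10 * PB         # an assumed petabyte-scale cluster (assumption)

    for label, need in [("500 TB", brain_low_estimate), ("2.5 PB", brain_high_estimate)]:
        print(f"{label} brain estimate fits in a 10 PB cluster: {need <= server_cluster}")

[Both comparisons come out true, which is consistent with the point made in the conversation: 500 TB is well within what existing clusters store.]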

33:21

I don't remember how much the

33:22

human brain is.

33:23

You might have to look that up.

33:24

It doesn't matter.

33:25

It doesn't matter.

33:26

Okay.

33:26

They figured out how to have

33:28

racks and racks of servers in

33:29

the basement.

33:30

Okay.

33:31

They figured it out.

33:32

That's all you need.

33:33

They have racks and racks of

33:34

servers.

33:35

Sci-Fi logic.

33:36

Sci-Fi logic.

33:36

Sci-Fi logic.

33:37

And they got

33:38

those fancy NVIDIA servers.

33:41

Right.

33:43

That's just a sprinkle.

33:45

They got them fancy NVIDIA

33:47

servers that are like 24

33:50

GPUs linked together.

33:55

Yeah.

33:56

To make it smart.

33:59

All right.

34:00

Yeah.

34:01

They...

34:02

these are smart

34:03

people.

34:04

They can figure out stuff that

34:06

you and I can't even conceive

34:07

of.

34:09

I mean, clearly we just

34:09

conceived of it.

34:11

So.

34:12

I mean the technical ability to

34:15

actually do it.

34:16

Not the hypothetical.

34:20

Well, we'll solve this with

34:21

server magic that we just came

34:24

up with.

34:25

But anyway.

34:28

Any other thoughts on the story?

34:30

Uh.

34:31

This is good.

34:33

It's a little cliche.

34:34

But that's the only way

34:36

Hollywood would buy it.

34:37

So.

34:39

Well, yeah.

34:40

Alright, we're gonna get to

34:42

that bridge.

34:44

And then we'll be dry.

34:45

You can tell this is a short

34:47

rain because of the way it

34:48

smells.

34:51

Yes.

34:52

It smells dirty, so that's how

34:54

you can tell it's a short rain.

34:56

Ah, it's acid rain! Ah, it

34:58

burns! It burns!

35:00

Ah, it burns! It burns! Ah! Ah!

35:07

Ah! Okay. We're now out of the

35:11

acid rain.

35:12

For now! For now!

35:17

So yeah, that was my story idea.

35:19

Um, I have no idea what to call

35:23

it.

35:24

That's nice. You're no help at

35:27

all, sir.

35:29

Thank you. I think at the time

35:31

I called it like Android A1-9

35:33

or something like that, but...

35:35

Yeah. It didn't really have a

35:37

good ring to it, you know?

35:41

Alright, so I spent... Well,

35:43

why not... you know what you could call it

35:45

if you wanted to keep a similar

35:46

name to that?

35:47

What? You could call it Android

35:51

A1-3-X.

35:54

Why? Okay, just look at it in

35:56

your head.

35:58

Just look at that little

35:59

combination. Alex? Yeah. Nice!

36:03

I like that. Alright, we got a

36:04

name for our character.

36:07

So, uh, Hollywood, as always,

36:09

if you like this idea, being

36:11

that it's an original idea,

36:13

and being that I did 90% of the

36:15

talking for this episode, so...

36:17

It's not nearly as creative as

36:19

the other ideas. You know, I'll

36:21

give you this story for... for

36:24

the low, low cost of 25k...

36:30

Plus 2% of the gross. That's

36:32

all we're asking. That's...

36:34

That's a low amount. I mean,

36:36

yes,

36:36

it still needs some work.

36:38

There's still some action

36:39

scenes of like, you know, the

36:40

kids being chased by assassins

36:42

and...

36:42

Yadda yadda. And driving and...

36:45

Maybe finding another robot...

36:47

Android thing that tries to

36:48

kill them.

36:49

Who knows? Who knows? But the

36:50

point is...

36:53

There's some stuff that needs

36:54

to be fleshed out a little bit

36:55

more.

36:58

Uh, and we recognize that. Yeah.

37:00

So, 25k plus... What'd I say? 2

37:03

or 3% of the gross? Yeah.

37:05

And, uh... We'll say 3. Inflation.

37:08

Yeah. 3% of the gross. For

37:10

inflation.

37:12

And this should be a great, you

37:13

know, teen sci-fi... You know,

37:16

you could put this on the

37:17

Disney Channel.

37:19

Yeah. You could put this on

37:20

whatever Disney's family

37:22

channel used to be, ABC

37:22

Family,

37:25

now it's something else. Freeform.

37:27

You could put this on Freeform.

37:29

Exactly. It doesn't have to be

37:31

overly

37:31

violent. You have some tender

37:33

moments of romance. You know,

37:35

you have that moment where they

37:38

kiss.

37:39

You have the moment where it

37:40

seems like it's all gonna end.

37:42

It's all gonna end horribly.

37:43

Our main

37:44

character's gonna die. You have

37:46

family. Yeah. You have loving

37:49

parents. You have action,

37:51

suspense,

37:51

drama. It's all you need. And

37:53

let's be... We all know. I mean,

37:56

you could put this on Disney

37:57

Plus as well.

37:59

We all know that between Kit

38:00

and I... I mean, Kit, between

38:02

you and I, the people of your

38:04

generation

38:05

aren't that bright. Yeah. They'll

38:07

eat it up. Yeah, exactly. All

38:09

right. I don't know if that's

38:11

true,

38:12

but that's all we have for this

38:14

episode. If you, uh, like this

38:16

idea... It can be... It can be

38:19

yours for the

38:20

low, low price of $50,000. Uh, yes.

38:23

But regardless, we do want to

38:25

hear from you. Feedback@Nontopical.com.

38:29

It's the most dominant place to

38:31

do that. Or if you want to be

38:33

like Dimce, we do believe in

38:35

something

38:36

called value for value. Which

38:38

means if you get value out of

38:40

what we do, show value in

38:41

return. It could be a

38:43

dollar, $5, $50. It could be

38:46

whatever number you conceive of.

38:49

$33.33. Or a hypothetical $1,000.

38:57

$1,000. So, that about wraps it

39:01

up for episode 12. Do you have

39:03

anything else to add, Kit?

39:10

Two. Two? Yeah. That's what I

39:12

have to add.

39:17

I don't know why that makes any

39:19

sense, but your two has been

39:20

accepted. Do you know what math

39:23

is?

39:24

Yes. Exactly. All right,

39:26

everybody. Well, thank you for

39:28

listening to this episode of

39:30

Mostly Me Talking. Don't

39:31

worry. Maybe the next one will

39:33

have more of Kit. All right,

39:35

everybody. Bye. Bye. Bye.

39:37

Bye.

39:37

Bye.