NT 020 - Killer Android

Ep. 20

Episode description

What would you do if a killer android were after you? Kit and Chris share that they don’t really like their chances…

What do you think? How would you survive? Let us know via our feedback address or via the Fediverse!

Transcript
0:00

hello everybody yay we are

0:06

recording in my living room

0:12

normally we would be walking or

0:19

something but

0:22

there are well really without

0:24

complaining about the cold

0:25

which it's around 20 degrees

0:27

fahrenheit

0:28

which isn't terrible although

0:30

warm the the mornings the

0:31

mornings have been like

0:33

starting off at negative

0:34

one negative two negative seven

0:37

the primary reason we can't

0:39

walk is there was just this

0:41

small little

0:42

snowstorm last weekend and it

0:45

dumped just under two feet of

0:48

snow depending on area so

0:50

anywhere from 14

0:52

inches to 20 inches across the

0:54

state of connecticut and that

0:56

means that the paths that we

0:58

normally walk

0:59

down are a no-go and even better you

1:02

think well hey what about an

1:04

indoor place well the only

1:07

indoor place i could

1:08

think of where we could walk

1:10

and talk is the mall and

1:13

unfortunately at the mall if

1:14

you're walking and

1:16

talking with recording devices

1:18

uh security takes notice of

1:20

that and tends to reject you

1:22

from the

1:22

premises or rather eject you

1:24

from the premises because they

1:27

don't want patrons being

1:29

recorded

1:30

even if it's not on video like

1:32

they will so yeah we're

1:34

recording in the living room

1:36

and uh it it may

1:37

may or may not sound better or

1:39

worse whatever i i do know i

1:40

won't be breathing as heavily i

1:42

don't think

1:43

yay yay yay yay after we

1:45

recorded last episode kit

1:48

brought something up to me and

1:52

uh kit i'd like you to

1:54

go ahead and explain your

1:57

concern uh it's it's not non-topical

2:00

it's not non-topical and what

2:03

what he means is that

2:05

no don't get me wrong we're

2:07

good at what we do i still

2:09

think that that hollywood will

2:12

eventually

2:13

take us up on our deals that

2:16

said the point of the show was

2:19

not necessarily to focus on any

2:23

one thing

2:23

to really just be free to

2:25

discuss whatever and we fell

2:27

into a groove where coming up

2:29

with movie ideas just

2:30

sort of happened and it's great

2:33

and it's a lot of fun but if

2:35

that's what we're trying to do

2:37

all the time

2:38

then it's more of a focused

2:42

show versus a whatever the heck

2:44

we feel like talking about show

2:47

yeah

2:47

and because of that it's not to

2:52

say that we won't ever fall

2:53

into that groove again it's a

2:55

fun groove for us

2:56

we like being creative and

2:58

coming up with stuff like that

2:59

we're going to try to keep

3:02

things a little

3:03

bit more scattered and to base

3:06

things a little bit more on

3:08

different ideas and

3:10

conversations and that

3:12

may mean that there'll be some

3:14

shorter episodes rather than

3:15

the 30 minutes to an hour that

3:17

you

3:18

normally receive that doesn't

3:19

mean that there's going to be

3:21

any less work for us we're

3:22

still going

3:23

to probably be recording

3:24

several things at once over the

3:26

period of an hour or two and

3:27

then dividing it up differently

3:29

so

3:30

so what's been on your mind kit

3:32

what what what is it that i do

3:34

have anything to talk about

3:37

yeah it's just a lot of like

3:38

mostly what's on my mind is a

3:40

lot of life stuff going on that

3:42

is

3:44

probably not fit for podcasting

3:46

just because it's the same

3:48

things that adults typically

3:50

worry about

3:51

you know work relationships

3:53

money downsizing stuff planning

3:56

for the future all that type of

3:59

stuff

3:59

not exactly the the greatest

4:01

content for a podcast although

4:04

it is stuff that we could talk

4:06

about

4:07

but i thought that you had some

4:09

specific ideas in terms of uh

4:11

some some questions or

4:12

conversation starters

4:15

well i have questions

4:17

okay what would you do if an

4:20

android assassin were after you

4:23

okay we have a lot of things

4:25

that we have to pre-establish

4:27

for this

4:28

uh it it's not just so simple

4:32

as to say an android because

4:35

throughout the history of

4:37

science fiction

4:39

there are many different types

4:41

of androids i mean we have to

4:43

first of all i would presume

4:44

regardless of the form of the

4:47

android that it is physically

4:49

superior in terms of strength

4:52

at minimum

4:53

yeah intellectually superior is

4:56

different if it's smarter than

5:00

you if it's more intelligent

5:03

then it's going to be very

5:05

difficult of a scenario we also

5:08

have to figure out in terms of

5:11

android

5:12

what type of android are we

5:14

talking from a physical

5:15

perspective not just in

5:17

strength but are we talking

5:19

original terminator are we

5:21

talking terminator 2 where it's

5:25

made of liquid metal and can

5:27

change into

5:28

anything are we talking blade

5:30

runner where the only way that

5:32

you can tell that this thing is

5:34

an android

5:35

is by administering an obscure

5:37

test and observing the eyes

5:39

really closely and all of that

5:42

and those

5:42

androids actually look more

5:43

like humans and they only live

5:45

for if i remember correctly

5:46

three years

5:47

and they are superior strength

5:49

wise but they're not

5:50

necessarily superior

5:51

intellectual wise also are

5:53

these androids connected to a

5:54

hive mind of some sort are they

5:56

constantly connected to the

5:57

internet

5:58

or is their brain self-contained

5:59

because if they're constantly

6:01

connected to the internet then

6:03

they could

6:04

tap into government resources

6:05

they could tap into traffic

6:07

cameras they could tap into

6:08

rather

6:09

bank atm cameras ring doorbells

6:12

the whole nine yards so i need

6:15

you to better flesh out the

6:18

enemy here so uh

6:21

it is humanoid looking, but it's

6:25

still clearly an android

6:28

because humans are egotistical

6:30

and stuff.

6:31

So maybe its skin is like

6:32

slightly off color, kind of

6:34

like Data from Star Trek Next

6:36

Generation?

6:37

I mean, and you can see its

6:38

joints, like it's not...

6:40

Okay, so the skin isn't like

6:43

perfect, you can actually see

6:46

like the polymer plastic type

6:48

outer skin sort of.

6:50

So we're talking like kind of

6:51

like the I, Robot movie look

6:53

where it's like the white

6:54

plastic type thing?

6:56

I mean probably more metallic.

6:58

Okay, but does it have like a

7:00

humanoid face or is it just

7:02

completely blank? It does, but

7:04

it can't like express mouth

7:06

movement as much.

7:07

Okay, so it's basically... It

7:11

also doesn't have eyebrows.

7:13

So, alright, so you're talking

7:14

like a metal or plastic face

7:16

that has just kind of things

7:17

that look like eyes,

7:19

a facsimile of a nose, and a

7:20

facsimile of a mouth that may

7:22

or may not light up when it's

7:24

talking.

7:25

Yeah.

7:26

Okay.

7:27

It can like open to a square

7:29

shape probably, but like that's

7:31

about it.

7:32

So basically like C-3PO.

7:34

Yeah, it does kind of look like

7:36

C-3PO actually now that I think

7:38

about it.

7:39

I mean, but probably cleaner.

7:40

Like probably, you know. Okay.

7:43

So we've established the look

7:46

of this thing. So it cannot run

7:49

or move smoothly. It is a bit janky.

7:53

Uh, it's pretty smooth actually.

7:55

It's pretty smooth. You can see

7:57

its joints, but like they're

7:58

close to human joints.

7:59

So it does not move like C-3PO.

8:01

It moves like a human.

8:03

Yeah.

8:03

Okay.

8:04

Maybe even better than a human

8:05

because it can optimize that.

8:08

All right. So we've established

8:10

that. Intelligence.

8:12

Uh, I mean it's smarter than a

8:15

human in some ways, but it also

8:17

can't like feel any emotions or

8:20

anything because it's still a

8:22

robot.

8:23

So it doesn't really have a

8:25

sixth, well not even a sixth

8:26

sense, but you know how like

8:28

with humans we can sometimes

8:30

sense that something's off.

8:32

Like we could sense there's a

8:33

trap being set up for us.

8:34

Yeah. All it can do is like

8:36

scan around and base

8:37

practically and think

8:39

practically what you can do,

8:41

but it can't think emotionally

8:43

what you're going to do.

8:44

All right.

8:45

All the time. It can try. It

8:47

can like mimic, but it doesn't

8:49

understand it.

8:52

Okay. And in terms of

8:54

connection into hive mind slash

8:58

internet resources slash the

9:02

entire network of the world,

9:05

is it connected or does it have

9:07

to physically go to a computer

9:10

and it's limited to access that

9:12

a human would have?

9:14

Uh, it's self-contained, but it

9:16

probably has like a couple of

9:19

ports in its head or somewhere

9:21

on its body to connect to

9:23

certain things.

9:25

But it would have to physically

9:26

connect to them. Yeah. It's not

9:28

like Starlink. Yeah. And it

9:30

would also be like vulnerable

9:32

to a very specific virus. Like if

9:35

you built a virus specifically

9:38

to attack it, that could

9:41

possibly get past it.

9:43

All right. So we've established

9:46

the type of android. How about

9:48

the world that we're living in?

9:51

Is it just normal world or is

9:52

this some kind of future world?

9:55

Is it apocalypse?

9:56

Considering the character that

9:59

I'm basing this off of that I

10:02

came up with, uh, cyberpunk-ish

10:05

type world. Okay. So we're

10:08

technologically advanced. Okay.

10:10

I have to ask this. Do we or do

10:12

we not have flying cars?

10:13

Uh, probably not. Okay. But

10:16

like marginally more commercial

10:20

flying things. Okay. So

10:22

basically giant cityscapes,

10:25

lots of neon or LED lights,

10:28

lots of advertising, um, city

10:31

streets. Robotic modifications

10:34

of body. Okay. Stuff like that.

10:38

Okay. All right. And last but

10:40

not least, why?

10:42

Why is this android after... Oh,

10:44

oh wait, I didn't think about

10:46

this. Weapons. Uh, does the

10:49

android have any inbuilt

10:51

hardware? Uh, inbuilt? Or built

10:53

in whatever. Yeah. Does it?

10:55

Built in? Yeah. Does it have

10:58

any weapons or does it have to

11:00

acquire weapons?

11:02

I mean, it can probably replace

11:04

its hands if it wanted to. Well,

11:07

want in quotations. If it would

11:10

be efficient to. Uh, it might

11:13

have like a laser pointer or a

11:15

blacklight or something in its

11:17

finger. But aside from that, it's

11:20

just physically strong because

11:23

it's made of metals.

11:25

All right. So it's not like it

11:28

has, um, machine gun or laser

11:31

gun. It, it, it, it, it, yeah,

11:33

probably not.

11:34

Oh, all right. All right. That

11:36

would be a little ridiculous.

11:38

Okay. And it obviously can't

11:39

shape shift. It has specific

11:41

form. All right. So last but

11:42

not least, on top of the

11:43

weapons and all of that stuff.

11:45

So we've got our setting. We've

11:47

got the, the enemy. What have I

11:50

done?

11:52

That this thing is coming after

11:54

me. Who have I upset that they've

11:57

sicked this thing on me? Or

11:59

what have I done to, even

12:01

though it doesn't have emotions,

12:04

offend its personal, like, uh,

12:06

protocols as such that I

12:08

present a threat to it that it

12:10

need, that it can then go past

12:12

the whole not allowed to harm

12:15

humans rule and is coming after

12:17

me.

12:18

Uh, I mean, it's, it's an

12:19

assassin. So if, if you, if you

12:21

talk to the wrong people and

12:22

upset them, you never know. You

12:24

never really know.

12:26

All right. So this is a robot

12:28

that's specifically designed as

12:30

an assassin. And if it came

12:32

across a machine gun or a

12:33

sniper rifle, it would know how

12:35

to use it with deadly precision.

12:38

Yeah. And it probably already

12:40

has like a gun or two. All

12:41

right. So this is, this is like

12:43

a military robot or something

12:45

like that. Like that has been

12:47

repurposed.

12:48

This was like a soldier type

12:50

robot. And then they got reprogrammed

12:52

by somebody to become an

12:53

assassin.

12:55

Yeah, sort of. Okay. All right.

12:57

So we have an assassin robot

12:59

coming after me in a dystopian

13:01

city like world.

13:02

Run by big corporate. Run by

13:05

big corporate. So chances are,

13:08

knowing myself and knowing my,

13:10

my current attitude towards

13:12

corporate environments, chances

13:15

are that I said or did

13:17

something related to the

13:19

corporation I'm part of.

13:21

And somebody was monitoring

13:24

something and decided that I

13:26

pose a threat to the bottom

13:28

line in a significant way, or I

13:31

pose a threat to the corporate

13:33

workplace. Like I might be that,

13:36

that one bad apple, that one

13:38

guy that's going to cause

13:39

everyone else to rise up

13:41

against the corporate mentality.

13:44

Like I've been having meetings

13:46

where like, I've been getting

13:49

people to, to start to, to, you

13:50

know, say, Hey, there might be

13:52

a better way of doing things

13:54

where it's not just about

13:56

making money, but it's also

13:58

about bringing general

14:00

happiness to people as well.

14:02

And that we should all have a

14:03

good work-life balance

14:05

and all that. Okay. So

14:07

that is my presumption at this

14:08

point.

14:09

They've been monitoring my, my

14:11

next generation of Microsoft

14:13

Teams conversations. And the,

14:16

the CEO has taken a look at

14:18

this and been like, no, we can't

14:20

fire him because that'll just

14:22

be like a token.

14:24

And you know, everybody will

14:25

rise up. If we fire him, he's

14:27

got to be dead and it has to

14:29

look normal. So we've, we've

14:31

got the rationale behind

14:32

everything.

14:34

I am not a fighter. I am not a

14:35

superhero. I am not somebody

14:38

who is a soldier of any type. I

14:40

have not fired any weapons.

14:44

So what do I do once I, well,

14:46

first of all, I, I think there

14:48

has to be a moment where I

14:50

discover that this Android is

14:52

coming after me.

14:54

And I mean, it doesn't just

14:55

show up or maybe it does. So

14:57

maybe I'm sitting in the cubicle

15:00

at work, although I work from

15:02

home a lot.

15:03

So does it show up at my door

15:06

or does it show up at work?

15:11

Even so, I'm like kind of stuck.

15:14

Like I'm a normal human. I am

15:16

Joe Q human. I am not, I am not

15:19

military trained. I am not.

15:22

I mean, that depends on if you

15:23

have any cybernet modifications

15:25

or not.

15:26

Look, look, I, no, I am, I am

15:28

being myself here.

15:31

I'm going to be myself because

15:33

the question is, what would I

15:36

do if Assassin Android is

15:38

coming after me?

15:40

Yeah. Uh, there, there is one

15:42

detail about its appearance

15:44

that I didn't say and that's,

15:46

uh, that its eyes glow.

15:48

Okay. Well, I mean, it's an evil

15:49

robot. So, I mean, not

15:51

necessarily evil, but.

15:53

Just glow any color. Doesn't

15:55

have to be like stereotypically

15:57

evil.

15:58

So, I'm going to base this as

16:00

if I'm living in a similar

16:02

apartment style, but in a

16:04

building in a city.

16:07

With two doors. So, I have two

16:09

exits.

16:10

I have no weapons.

16:13

And I have a moderate amount of

16:15

computer skills.

16:18

And could pick up programming

16:20

if I needed to.

16:21

The building also would

16:23

probably have an alarm because

16:24

it's the future.

16:26

The future. Okay.

16:27

So, for the sake of argument,

16:29

let's say that I have the next

16:31

generation of ring doorbell or

16:32

whatever.

16:34

Robot shows up at my doors,

16:36

which both are normally locked.

16:40

Yeah.

16:40

I'm not expecting a package or

16:42

anything.

16:43

And I'm going to presume that I

16:44

know what package delivery

16:46

robots look like compared to

16:48

whatever the hell this thing is.

16:51

So, I don't know why this thing

16:53

is here.

16:54

But I can readily see that this

16:55

is not a robot that should be

16:57

coming to my location.

17:00

So, I'm not going to answer the

17:01

door.

17:03

But it's going to start banging

17:04

on the door.

17:05

And possibly take out a gun and

17:06

try to shoot out the locks.

17:08

So, at that point, I'm going to

17:10

be calling the authorities

17:12

while taking the other way out

17:15

to get to my car and to get

17:16

away.

17:19

So, I'm calling the authorities.

17:21

I'm going outside.

17:23

I'm giving a description of

17:24

this thing.

17:25

I'm forwarding them the footage.

17:28

Because that's basically all I

17:30

can do at this point.

17:34

Knowing 911 and the way it

17:35

operates today, I'm going to

17:38

presume that it would operate

17:40

similarly.

17:42

Unless, of course, it's owned

17:43

by the same corporation that I

17:45

work for.

17:46

Of course.

17:46

In this case, it's going to get

17:48

cut off.

17:49

Or it's not going to get

17:50

reported properly.

17:53

Regardless, I'm driving away.

17:57

Because by the time robot busts

18:00

in, I will be gone.

18:03

Even if you try to shoot out

18:05

the locks, even with robot

18:07

precision, that's going to take

18:09

a shot or two.

18:10

Because locks are not

18:11

necessarily as easy to blow out

18:13

as people think.

18:15

Not to mention the fact that my

18:16

neighbors are going to hear

18:18

this thing pounding on my door

18:19

and shooting.

18:21

And it's going to attract

18:22

attention to itself.

18:24

I mean, unless it doesn't shoot.

18:27

Well, even so.

18:28

You hear loud banging on a door

18:29

and hear the sound of splintering.

18:32

That's going to attract some

18:34

attention.

18:36

I'm sorry.

18:37

As much as I don't have a

18:39

direct, like, great rapport

18:41

with my neighbors and stuff.

18:44

I'm pretty sure if something

18:45

crazy was going down, everybody

18:47

would be poking their heads out

18:49

and being like, what the heck

18:51

is going on?

18:52

But meanwhile, I'm getting the

18:54

heck out of Dodge.

18:57

So I'm driving off.

18:59

Obviously, once this thing

19:00

discovers I'm gone, it's going

19:01

to try to follow me.

19:02

But logically, what I'm going

19:05

to do is I'm going to go to the

19:07

not corporate police.

19:11

I'm going to go to the police

19:12

that are not the corporate

19:15

influenced ones.

19:17

Because I'm going to presume

19:19

there is a standard police

19:20

force still in place.

19:21

Yeah.

19:23

So I'm going to go there.

19:25

Military police force combo.

19:26

Fine.

19:27

And I'm going to show them the

19:28

footage.

19:30

And my presumption is that they're

19:32

going to be like, well, this

19:34

thing is not going to stop.

19:37

So we're going to take you into

19:38

protective custody.

19:41

And I'm going to be like, I'm

19:42

totally fine with that.

19:45

And assassin robot being

19:47

assassin robot, it's not going

19:50

to be so dumb as to go into a

19:52

police station and start trying

19:54

to get me.

19:56

Because a good assassin is not

19:58

going to go out in the open and

20:01

do things.

20:02

You have to make it all clean-like.

20:04

It also is going to try to kill

20:06

as few people as possible

20:09

unless it is more efficient to

20:11

kill them.

20:13

So my best resource is to go to

20:15

the police to show them the

20:17

footage that I have in the

20:18

hopes that they'll be able to

20:21

track down where this thing is

20:23

coming from and whatnot.

20:25

I know that's the most boringest

20:28

of answers.

20:30

But as a normal Joe Q human

20:31

being, there's not much that I'm

20:34

going to be able to do on my

20:36

own without the assistance of

20:39

higher authorities.

20:42

I am not going to be able to

20:43

fight this thing on my own.

20:45

And I know I'm not going to be

20:46

able to fight this on my own.

20:49

I'm not stupid.

20:51

So in this scenario, that is

20:53

like my number one go-to.

20:56

Now here's the question.

20:58

What if the robot breaks the

20:59

camera?

21:01

Well, there would still be

21:02

enough footage before it breaks

21:04

the camera, you would presume.

21:06

But even so, I'm still going to

21:09

the cops and giving them this

21:11

description.

21:14

And knowing technology as I do,

21:16

that thing had to get to my

21:18

apartment somehow.

21:20

It had to either walk into the

21:23

complex or it had to arrive by

21:26

vehicle.

21:28

And given that this is a city

21:29

and this is a dystopian future,

21:31

there are cameras everywhere.

21:34

So there's going to be footage

21:36

somewhere that can be subpoenaed

21:38

with this thing.

21:40

Not to mention that while it

21:41

might not be connected to a

21:43

hive mind and be able to access

21:45

the internet all the time,

21:47

most robots, military, repurposed,

21:50

or otherwise, are going to have

21:52

some sort of tracking

21:54

associated with them.

21:56

They might try to mask it.

21:58

And given this is an assassin

21:59

robot, so chances are it would

22:00

be very difficult to determine.

22:02

But they're still going to give

22:04

off some kind of sensor

22:06

readings to something,

22:09

regardless.

22:10

Yeah.

22:10

And chances are it's going to

22:12

have some kind of unique

22:14

signature based upon the

22:15

processors or GPUs inside of

22:17

the thing from an

22:19

electromagnetic perspective.

22:22

Yeah.

22:23

So it's the boring ass answer.

22:26

But my best bet is the cops.

22:29

Now, if you presume all cops

22:31

are owned by corporate

22:33

influence, where do I go next?

22:37

So if all cops are owned by the

22:39

same corporate influence, the

22:42

next thing that I might

22:44

consider is I would not go to

22:46

government directly because

22:49

senators and such are probably,

22:52

again, owned by corporations.

22:57

So my next best bet is probably

23:00

the military.

23:03

Even though the military is

23:05

owned by the government, the

23:07

military might not be

23:09

completely co-opted by

23:10

corporation.

23:12

Now, if the military is

23:14

completely co-opted by

23:16

corporation, there are very few

23:18

avenues that I have.

23:21

If I were to be more optimal

23:24

about the military, it might be

23:27

like a collective military

23:31

between different groups of big

23:35

corporates and countries.

23:39

So there might be only partial

23:40

influence.

23:41

Yeah.

23:42

All right.

23:43

So in that case, if the cops

23:45

are owned by corporate and I

23:47

know that they are, I will go

23:50

to the nearest non-classified

23:52

military base and just be like,

23:55

look, this is a situation.

23:58

I can't go to the cops because

23:59

they're owned by, you know, I

24:02

fear like this robot's been

24:04

sent because of stuff that I've

24:07

done with my company.

24:09

They're not too happy that, you

24:11

know, I'm trying to increase

24:12

morale and help people to stand

24:14

up for themselves.

24:16

So I think they sent this robot

24:18

after me.

24:20

I can't go to the cops.

24:21

Please help me.

24:22

And I'm going to hope that the

24:24

military can help me at that

24:25

point.

24:27

Other than that, the only other

24:28

thing I can think of that would

24:29

be a stereotypical sci-fi plot.

24:31

And it's presuming that I

24:33

actually have friendships that

24:35

go in this circle is I have a

24:37

friend that's like, you know,

24:39

really conspiracy theorist,

24:41

really knows all the technology,

24:43

can like hack anything, etc.

24:46

But knowing people in real life,

24:49

I don't really have any person

24:50

in my life like that.

24:52

So I can't really fall back on

24:54

that, can I?

24:55

We're in a fictional situation

24:57

and a fictional world here, but

24:59

it's a world where I still want

25:01

to be rooted in myself.

25:03

So I don't want to make up

25:04

things that I don't have access

25:06

to.

25:08

Those are the only options that

25:10

I think that I have.

25:13

And in the due course of things,

25:14

yes, the robot will eventually

25:16

try to come after me.

25:18

But knowing that, like you said,

25:20

it's going to want to injure as

25:21

few people as possible and be

25:23

as least obvious as possible.

25:26

The safest place for me is

25:27

either with the police or with

25:30

the military.

25:31

Because there's a lot of bodies,

25:34

a lot of video, and a lot of

25:35

security there.

25:37

And if that thing then comes

25:39

for me, then chances are they're

25:41

going to be able to gun it down.

25:44

This is not Robocop where you're

25:46

going to have Ed whatever its

25:48

number is with the giant Gatling

25:51

guns.

25:51

Or Robocop who is basically indestructible

25:55

for the most part.

25:57

This is an android that can be

25:59

destroyed.

26:00

It is stronger than humans, but

26:02

when faced by humans that may

26:04

have electromagnetic pulse

26:06

weapons and other weapons,

26:09

chances are it's going to get

26:10

taken down.

26:11

And when it does get taken down,

26:14

presuming that its memory is

26:16

not going to get automatically

26:18

wiped,

26:18

presuming that blowing it up

26:21

stops its immediate memory wipe,

26:24

then they're going to be able

26:26

to get the memory out of the

26:27

thing and figure out who hired

26:29

it.

26:30

At which point, that person

26:31

will be taken into custody.

26:33

And I will have to presume that

26:35

they'll probably eventually

26:38

escape because corrupt

26:40

corporate society that may or

26:42

may not be able to grease the

26:43

palms of the courts.

26:45

But still, look, I'm going to

26:47

be just completely honest.

26:50

I will eventually die.

26:52

Because if it's that type of

26:53

society, even if I go through

26:55

appropriate channels, they'll

26:58

be able to protect me for a

26:59

while.

27:01

But eventually, there's just

27:02

going to be more and more

27:03

sophisticated assassin robots

27:06

getting sent after me.

27:08

The person that is committing

27:10

the crimes against me is going

27:12

to have too much power and

27:13

influence to be able to be held

27:15

to account for their crimes.

27:18

So regardless of how many times

27:20

I lean on the military or even

27:22

if I start working for the

27:24

military as a civilian

27:26

consultant in the long term,

27:28

eventually they're going to

27:29

come up with an assassin

27:31

android that is going to be

27:32

able to evade the cameras,

27:34

sneak through the air ducts,

27:35

and kill me with some kind of

27:37

fast acting poison that won't

27:39

leave behind too much residue.

27:42

So I know that eventually I am

27:44

going to die to these things.

27:48

And I know that that's not fun.

27:51

But I'm being a realist here.

27:54

I could go full sci-fi trope

27:55

and be like, oh yeah, I have

27:58

all of the John Wick and or

28:01

Bourne skills.

28:03

I am like an assassin myself.

28:06

I can hack anything.

28:09

I can, you know, I can slide

28:10

down the side of a building

28:12

with a pistol in my hand and be

28:14

shooting as I'm sliding down

28:16

the side of it and somehow

28:17

still nail the landing.

28:19

But realistically, that's not

28:21

who I am.

28:22

I've got a left knee that

28:23

sometimes acts up.

28:25

If I were to like even stick a

28:26

landing off of the second story

28:27

of a building, I'm still going

28:29

to mess up that knee.

28:30

I'm going to just be hobbling

28:32

away.

28:33

This is the real world, like

28:34

for me, but in a fictional

28:36

scenario.

28:38

Even in a fictional scenario

28:39

where I know customer service

28:41

systems inside and out, let's

28:43

say,

28:44

and they're the next generation

28:45

of customer service systems,

28:47

that's not going to help me in

28:48

this situation.

28:50

Even though I know a little bit

28:52

about how to run Python scripts

28:53

and things like that, that's

28:55

not going to help me in this

28:56

situation.

28:58

I am going to die unless I'm

28:59

protected by people who are

29:01

much better equipped to deal

29:03

with this than I am.

29:05

And unfortunately, even those

29:07

people, unless they figure out,

29:10

look, the guy who's siccing

29:12

people on this poor civilian is

29:14

not going to stop unless he's

29:17

actually assassinated himself,

29:20

which would be kind of against

29:22

the law in and of itself.

29:24

So that's my answer.

29:30

I'm going to go to the

29:31

competent authorities as best

29:33

as I can.

29:35

And chances are I'm eventually

29:36

going to die unless those

29:38

competent authorities can

29:41

actually bring this person to

29:43

justice who sicced these things

29:45

upon me.

29:47

Yeah, I mean, that's boring.

29:50

It's not as fun as what people

29:51

want.

29:52

Like, they want to hear me

29:53

saying, oh, I would take a

29:55

baseball bat to the thing's

29:57

head until its central

29:58

processor shut down.

30:00

And then I would download its

30:02

memory and figure out who did

30:04

this.

30:04

And then I would go online and,

30:06

like, leak it online with the

30:08

full, like, proof in 8K or

30:10

whatever it is at that point.

30:12

You know, caught in 16K from

30:15

the camera on the assassin bot

30:17

who it is that told everyone to

30:19

assassinate me.

30:21

But nobody would believe that

30:22

anyway because AI deepfakes

30:24

would be so good at that point

30:26

that it really doesn't matter.

30:28

You know, it would just fall

30:30

apart anyway.

30:31

Unless those are illegal now.

30:33

You never know.

30:33

You never know.

30:34

But I'm just saying,

30:35

realistically, I'm not that

30:36

person anyway.

30:37

So grounding myself in realism

30:40

as to who I am in a fictional

30:43

environment, that is how I

30:45

would have to react.

30:48

There's no getting around it.

30:49

Neat.

30:50

Like, maybe if I had a crowbar

30:53

and it broke in and I had

30:55

access to the crowbar, I could

30:57

try to bash it.

30:59

But chances are it has weapons

31:01

with it at that time anyway.

31:04

So I hate to say it, but crowbar

31:06

does not always beat gun.

31:09

I mean, if you're lucky.

31:11

Yeah.

31:12

And that's how I would approach

31:14

it.

31:14

I mean, is there anything you

31:15

would do differently?

31:18

I would just die.

31:19

Wow.

31:22

You're not even going to try to

31:23

evade.

31:23

You're just going to be like,

31:24

my death's at the door.

31:26

I resign myself to this fact.

31:30

Kit is at home.

31:31

Kit is at home and he sees the

31:34

robot show up.

31:36

Kit's below the age to drink or

31:37

whatever, but he's like, well,

31:39

I'm going to go.

31:41

So like the robot busts in and

31:42

he's just sitting there like

31:44

just drinking from the bottle.

31:46

The robot kind of looks at him

31:48

like, well, aren't you going to

31:50

run?

31:50

No.

31:51

Just let me finish this.

31:53

Let me get really drunk first.

31:55

And the assassin droid might

31:56

just sit there and be like, you

31:58

know, that's not going to stop

32:00

me from achieving my goals here.

32:03

I'll give you the extra 15

32:04

minutes to get drunk.

32:06

And then so it doesn't hurt as

32:08

much.

32:11

That's kind of sad, though.

32:12

So you wouldn't try to fight it

32:13

at all.

32:14

I might, but I don't think I

32:15

could.

32:16

So you wouldn't.

32:18

You wouldn't.

32:18

I'm kind of too slow to do

32:19

anything.

32:21

You wouldn't try to get to a

32:22

neighbor at least or try to

32:23

call the cops.

32:24

Nothing.

32:25

No, I'd probably be too slow to

32:27

do that.

32:28

Wow.

32:30

That is sad.

32:31

I don't know how else to

32:33

react.

32:34

I mean, that's just that.

32:35

So my reaction.

32:37

I'm going to at least try to go

32:38

to competent authorities to

32:40

help.

32:41

Kit's reaction.

32:42

I'm just going to sit there and

32:43

let it happen.

32:47

I'm going to hope that it doesn't

32:48

have to torture me first.

32:51

There's no information it needs

32:52

to extract from me.

32:54

I'm just going to sit there,

32:56

like, grab a bottle of alcohol

32:57

and just, like, let it come.

33:01

Wow.

33:02

All right, man.

33:03

To each their own.

33:07

I'm just saying what would be

33:07

most likely to happen.

33:09

Because it's faster, stronger,

33:11

and better than me.

33:13

True.

33:14

True.

33:15

Very true.

33:15

But, yeah, so, um, assassin,

33:18

android, coming after us.

33:20

That's, that's the way we would

33:21

deal with it.

33:22

You know what you would, you

33:24

know what you would do with

33:25

your alcohol?

33:26

You would have one final cheese

33:28

quesadilla.

33:29

Yes, sir.

33:30

Yeah, absolutely.

33:32

So, that would be, that would

33:34

be the ultimate end.

33:36

Like, Kit sitting there, cheese

33:37

quesadilla, bottle of wine,

33:39

just drinking from the bottle.

33:42

Maybe somehow, for some reason,

33:43

he has a cigarette.

33:45

Just like, like, I don't know

33:47

where you'd get that from.

33:49

But, yeah.

33:50

That's, that's how Kit wants to

33:52

go out to an assassin.

33:53

Or an assassin.

33:56

He went out enjoying what he

33:57

likes.

33:59

Cheese quesadillas.

34:01

Cheese quesadilla.

34:03

Anyway, if you like this

34:03

conversation and you like this

34:05

slightly different direction,

34:07

please let us know.

34:09

Feedback@nontopical.com.

34:11

We're still waiting to hear

34:12

from people as to what you want

34:14

to see or hear more of from the

34:15

show.

34:16

Or what you enjoy hearing from

34:18

the show.

34:19

Do you like the episode artwork?

34:20

That's another thing.

34:21

Kit works really hard on the

34:22

episode artwork.

34:23

I'm not going to lie.

34:24

Sometimes, some of the pieces

34:26

of art he puts together take,

34:28

like, a couple of hours to do.

34:30

He really puts a lot of time

34:32

and effort into it.

34:34

I feel like a couple of hours

34:35

isn't that much time.

34:37

That's your opinion.

34:39

Also, as always, we are what we

34:42

consider to be a value for

34:44

value show.

34:46

Which means that if you get any

34:48

value out of this.

34:50

Show some value in return.

34:52

And we often neglect to mention

34:54

this.

34:54

But the place to go to do that

34:58

is https://ko-fi.com/nontopical.

35:03

And there's links to that in

35:04

the show notes.

35:05

And in a modern podcast player,

35:08

such as Castamatic, Castopod,

35:10

Fountain, Podcast Addict, and

35:13

all these other ones.

35:15

There's actually usually a

35:16

convenient link right in the

35:18

app to be able to get to the

35:20

donation page.

35:21

And we really appreciate that

35:22

because it does encourage us to

35:24

keep going forward with this.

35:26

So we thank you for that as

35:28

well.

35:29

I think that's about it for

35:30

this episode.

35:31

Unless, Kit, do you have

35:32

anything else to add?

35:34

A hypothetical 1,000 credits.

35:38

Yes, we could use that

35:38

hypothetical 1,000 future

35:40

credits.

35:43

Because with inflation, that'll

35:44

be equal to about $10.

35:47

Yeah.

35:49

All right, everybody.

35:50

We'll catch you next time.

35:52

Bye.

35:52

Bye.