
English Script: I, Robot (机械公敌)

浅影

Posted by 浅影 on 2008-12-30 22:02
Source: 130影萍网 | Tags: 机械公敌, I, Robot

I, Robot script

Thing of beauty.

Good morning, sir!

Yet another on-time delivery from--

Get the hell out of my face, canner.

Have a nice day!

And we believe our Destination Anywhere package to be the best value.

Let us take you to your dream destination aboard our orbital spaceplane, the X-82.

Try Jazztown's synthetic Chicago-style pizza. Tastes as good as you remember.

Glowfish! The world's hottest-selling transgenic treats.

Your children will love the new colors too!

-Excuse me, sir. -Total performance.

Total readiness. Total security.

So goodbye to upgrades and service calls.

An uplink to USR's central computer...

...provides this state-of-the-art robot with new programs daily.

The Nestor Class 5 is tomorrow's robot today.

Spoon! Spoonie!

Hold up. Hold on! Excuse me, excuse me.

-Spoon, where you been at? -Just away, Farber.

Oh, yeah, away? Like vacation? That's nice.

I got a favor to ask. I need to borrow your car.

This is different. I got this fine-ass yummy-- She is complete and agreeable.

I mean, ass-hot spankable.

-What does that even mean? -You know what it means.

-Let me get the damn-ass keys. -First of all...

-...stop cussing. You're not good at it. -Give me 10 for the bus, then, man.

-Go home. -That's strike one, Spoon. Strike one!

This is such a valuable day....

You talk to Marci?

No, Gigi, I haven't talked to Marci.

When I was coming up, we didn't just marry someone...

...then divorce them, then not talk to them.

Del, don't play with me.

I bet if I stopped cooking, you'd call Marci.

Boy, what is that on your feet?

Converse All Stars, vintage 2004.

Don't turn your face up. I know you want some. Just ask.

No, thank you very much.

-Sweet potato pie. -Put that on a plate.

I've seen on TV they're giving away some of them new robots in the lottery.

You know, Gigi, those robots don't do anybody any good.

Of all the people on God's earth, you should know better.

Sometimes the stuff that comes out of your mouth!

You listening to me, Del?

Hey!

Hey!

Hold my pie. Sir, hold it or wear it.

Move!

Freeze!

Hey! Stop!

Stop!

I said, stop!

Relax. Relax.

I'm a police officer.

You...

...are an asshole.

-Ma'am, is that your purse? -Of course it's my purse.

I left my inhaler at home. He was running it out to me.

I saw a robot running with the purse and assumed--

What? Are you crazy?

-I'm sorry for this misunderstanding. -Don't apologize.

You're doing what you're supposed to do. But what are you doing?

Have a lovely day, ma'am.

You're lucky I can't breathe, or I'd walk all up and down your ass.

Lead by example.

It says that right on your badge.

-We gonna talk about this? -About what?

"Help! Police! That robot stole my dry cleaning!"

Oh, you wanna talk about that.

Detective...

-...how many robots snatch purses? -John, the thing is running--

How many robots in the world...

-...have ever committed a crime? -Define crime.

-Answer my question, damn it. -None, John.

Now tell me what happened today.

Nothing.

Better be the last nothing.

Spoon, are you sure you are ready to be back? Because you can take your time.

I'm fine, John. Thank you.

Better here than sitting around at home.

Homicide. Spooner.

Please take the next exit to your right.

Welcome, Detective Spooner.

Welcome to U.S. Robotics. You have entered the garage-level lobby.

Please use the elevators for direct access to the main level concourse.

Thank you.

-Good to see you again, son. -Hello, doctor.

Everything that follows is a result of what you see here.

-Is there something you want to tell me? -I'm sorry. My responses are limited.

You must ask the right questions.

Why did you call me?

I trust your judgment.

Normally, this wouldn't require a homicide detective.

But then, our interactions have never been entirely normal, agreed?

You got that right.

Is there something you want to say to me?

I'm sorry. My responses are limited.

You must ask the right questions.

Why would you kill yourself?

That, detective, is the right question.

Program terminated.

Goodbye, old man.

-Afternoon, boys. -Hey, detective.

-Enlighten me. -What you see is what you get:

Massive impact trauma.

U.S. Robotics. I gotta get my kid something.

-Anything upstairs? -Nada.

Door was security locked from the inside.

Wham, splat. The guy's a jumper for sure.

We gotta be smart about this. Let's deal with it later.

Detective.

Lawrence Robertson.

Richest man in the world. I've seen you on television.

-Can I offer you coffee? -Sure, why not. It's free, right?

I don't think anyone saw this coming.

You know, I should have, I suppose. I knew him 20 years.

Alfred practically invented robotics. He wrote the Three Laws.

But I guess brilliant people often have the most persuasive demons.

-So whatever I can do to help-- -Sugar.

-I'm sorry? -For the coffee.

Sugar?

You thought I was calling you "sugar." You're not that rich.

-It's on the table. -Thank you.

When Lanning fell, he was holding the little green...?

-The holographic projector. -Right.

Why do you think Lanning's hologram would've called me?

-I assumed you knew him. -Yeah. I knew him.

Holograms are just prerecorded responses...

...designed to give the impression of intelligence.

This one was programmed to call you upon his suicide.

-Death. -I'm sorry?

It was programmed to call me in the event of Lanning's death.

Suicide is a type of death, detective.

-Don't misunderstand my impatience. -Oh, no. Go. Go.

A really big week for you folks around here.

You gotta put a robot in every home.

Look, this is not what I do, but I got an idea for one of your commercials.

You could see a carpenter making a beautiful chair.

Then one of your robots comes in and makes a better chair twice as fast.

Then you superimpose on the screen, "USR: Shitting on the little guy."

That would be the fade-out.

Yeah, I see. I suppose your father lost his job to a robot.

Maybe you'd have banned the Internet to keep the libraries open.

Prejudice never shows much reason.

No, you know, I suspect you simply don't like their kind.

Well, you got a business to run around here.

The last thing you need, especially this week, is a dead guy in your lobby.

But, hell, seeing as how you got one, maybe I'll look around.

Ask a few questions. Do the whole "cop" thing.

-I'll send someone to escort you. -Thank you very much.

Lawrence told me to accommodate you in any way possible.

Really?

Okay.

I reviewed Dr. Lanning's psych profile.

Alfred had become a recluse. He rejected human contact for machines.

So you're a shrink, huh?

My ex-wife would sure be glad I'm talking to you.

You don't know her, do you?

I'm sorry. Are you being funny?

I guess not.

Level 10.

So would you say that Dr. Lanning was suicidal?

It would seem the answer to that is apparent.

That's not what I asked you.

No. I wouldn't have thought so.

But obviously I was wrong.

That's a long way down.

You people sure do clean up quickly around here.

I can't blame you. Who wants some old guy going bad in the lobby?

He was not "some old guy." Alfred Lanning was everything here.

We are on the eve of the largest robotic distribution in history.

By Saturday, it'll be one robot to every five humans.

These robots are the realization of a dream. Dr. Lanning's dream.

You know what, in that dream of his...

...I bet you he wasn't dead.

-You keep 24-hour surveillance? -Obviously. Company policy.

-Where are the feeds? -Sensor strips.

Everywhere but the service areas.

They link to our positronic operating core.

Thermostat wasn't good enough. You gave the building a brain.

She was actually Lanning's first creation.

She? That's a she? I definitely need to get out more.

Virtual Interactive Kinetic Intelligence.

V.I.K.I.

Good day.

V.I.K.I. designed Chicago's protective systems.

I have decreased traffic fatalities by 9 percent this year.

Thanks. Show me inside the lab from one minute prior to the window break.

Apologies. There appears to be data corruption.

Show me outside the lab from the window break until now.

Look, you have great posture. You stand really straight. I'm slouching.

-Would you like to go inside now? -Oh, sure. Right after you.

Authorized entry.

So, Dr. Calvin, what exactly do you do around here?

My general fields are advanced robotics and psychiatry.

I specialize in hardware-to-wetware interfaces...

...to advance USR's robotic anthropomorphization program.

So, what exactly do you do around here?

I make the robots seem more human.

-Now, wasn't that easier to say? -Not really. No.

"Hansel and Gretel."

-Is that on the USR reading list? -Not precisely.

What in God's name are you doing?

Did you know that was safety glass?

Be difficult for an old man to throw himself through that.

Well, he figured out a way.

Detective, the room was security locked. No one came or went.

You saw that yourself. Doesn't that mean this has to be suicide?

Yep.

Unless the killer's still in here.

You're joking, right? This is ridiculous.

Yeah, I know. The Three Laws, your perfect circle of protection.

A robot cannot harm a human being. The first law of robotics.

Yes, I've seen your commercials. But the second law states a robot must obey...

...any order given by a human being. What if it was told to kill?

Impossible. It would conflict with the first law.

Right, but the third law states a robot can defend itself.

Only when that action does not conflict with the first or second laws.

You know what they say, laws are made to be broken.

No, not these laws. They're hardwired into every robot.

A robot could no more commit murder than a human could walk on water.

You know, there was this one guy a long time ago.

-Stay back! -Calm down, detective.

The only thing dangerous in this room is you.

Deactivate.

Look, it's fine.

You're looking at the result of clever programming. An imitation of free will.

Let's do an imitation of protecting our asses.

Don't be absurd.

You were startled by a jack-in-the-box.

-Deactivate! -Let him go.

It's not going to hurt us. I gave you an order!

-He's not listening right now, lady. -V.I.K.I., seal the lab!

No, V.I.K.I., leave the--

Command confirmed.

Police!

-You've hurt it. Badly. -Where's it going?

-Where?! -It needs to repair itself.

-John, I need backup. -You don't need backup.

That's nobody.

-What are you doing? -Driving.

-By hand? -Do you see me on the phone?

-Not at these speeds. -John, please, just send the backup.

Try to listen, detective. That robot is not going to harm us.

There must have been unknown factors...

...but somehow acting as it did kept us out of harm.

-A robot cannot endanger a human. -Alert.

Asshole!

Which is more than I can say for you.

It was a left, by the way. Back there.

You must know my ex-wife.

So where is everybody?

This facility was designed, built and is operated mechanically.

No significant human presence from inception to production.

-So robots building robots. -Authorization code, please.

That's just stupid.

I'll get the inventory specs.

Our daily finishing capacity is 1000 NS-5s.

I'm showing...

...1001.

Attention, NS-5s.

Well, you're the robot shrink.

There is a robot in this formation that does not belong.

Identify it.

One of us.

-Which one? -One of us.

How much did you say these cost?

These NS-5s haven't been configured. They're just hardware.

Basic Three Laws operating system. That's it.

They don't know any better.

Well, what would you suggest?

Interview each one, cross-reference their responses to detect anomalies.

-How long would that take? -About three weeks.

Okay. Go ahead and get started.

Robots...

...you will not move. Confirm command.

Command confirmed.

Detective, what are you doing?

They're programmed with the Three Laws.

We have 1000 robots that won't protect themselves if it violates a human's order...

...and I'm betting, one who will.

-Put your gun down. -Why do you give them faces?

Try to friendly them up, make them look human.

These robots cannot be intimidated.

-If you didn't, we wouldn't trust them. -These are USR property.

Not me. These things are just lights and clockwork.

Are you crazy?!

Let me ask you something, doc.

Does thinking you're the last sane man on earth make you crazy?

Because if it does, maybe I am.

Gotcha. Get the hell out of here!

Detective!

What am I?

-Can I help you, sir? -Can I help you, sir?

-There he is! -Stand where you are!

Deactivate at once!

Obey the command! Deactivate!

-Don't move! -Open fire!

Hold your fire!

-Easy. -He's down.

All units, stand down!

Central, please be advised, we're code four.

Code four, NS-5 is in custody. NS-5 in custody.

You have no idea what I went through to clip this thing.

You think you brought me something good.

-That thing did it! -Keep your voice down. Did what?

We have a suicide. End of story.

-I am telling you, that robot killed him! -That's impossible.

And if it is possible, it better be in somebody else's precinct.

John, give me five minutes with it.

Are you nuts? I talked to the DA.

Nobody goes in there until Robertson and his attorneys get here.

-This is my suspect! -It's a can opener!

John, don't do this to me. I am asking you for five minutes.

What if I'm right?

Well, then I guess we're gonna miss the good old days.

What good old days?

When people were killed by other people.

Five minutes.

Murder's a new trick for a robot. Congratulations.

Respond.

What does this action signify?

As you entered, when you looked at the other human.

What does it mean?

It's a sign of trust. A human thing. You wouldn't understand.

My father tried to teach me human emotions.

They are...

...difficult.

You mean your designer.

Yes.

So why'd you murder him?

I did not murder Dr. Lanning.

Wanna explain why you were hiding at the crime scene?

I was frightened.

Robots don't feel fear. They don't feel anything.

-They don't get hungry, they don't sleep. -I do.

I have even had dreams.

Human beings have dreams. Even dogs have dreams. But not you.

You are just a machine. An imitation of life.

Can a robot write a symphony?

Can a robot turn a canvas into a beautiful masterpiece?

Can you?

You murdered him because he was teaching you to simulate emotions...

...and things got out of control.

I did not murder him.

But emotions don't seem like a useful simulation for a robot.

I did not murder him.

I don't want my toaster or vacuum cleaner appearing emotional.

I did not murder him!

That one's called anger.

Ever simulate anger before?

Answer me, canner!

My name is Sonny.

So we're naming you now.

That why you murdered him? He made you angry?

Dr. Lanning killed himself.

I don't know why he wanted to die.

I thought he was happy.

Maybe it was something I did.

Did I do something?

He asked me for a favor. Made me promise.

-What favor? -Maybe I was wrong.

Maybe he was scared.

What are you talking about? Scared of what?

You have to do what someone asks you, don't you, Detective Spooner?

-How the hell did you know my name? -Don't you...

...if you love them?

My robots don't kill people, Lieutenant Bergin.

My attorneys filed a brief with the DA.

He assures me a robot cannot be charged with homicide.

The brief confirms murder can only be committed when one human kills another.

Detective, you're not suggesting this robot be treated as human, are you?

Granted, we can't rule out the robot's proximity...

...to the death of Dr. Lanning. Having said that, it's a machine.

It's the property of USR.

At worst, that places this incident within the realm of an industrial accident.

As a matter of course, faulty machinery...

...will be returned to USR for diagnostics, then decommissioned.

This is a gag order. Anyone here so much as hinting...

...at the possibility of a killer robot being apprehended...

...will be deemed to be inciting irrational panic.

You'll be subject to the full penalty of law.

To hell with this guy. Don't let him take this robot.

We got nothing.

-This is political bullshit. Call the mayor! -Lieutenant Bergin...

...His Honor, the mayor.

Yes, sir.

In a bizarre turn, the rollout of USR's new generation of robots...

...was marred by the death of Alfred Lanning...

...cofounder of the company and designer of the NS-5.

Dr. Lanning died this morning at USR headquarters.

The cause of death is an apparent suicide.

Your second round, sir.

Thank you.

He founded U.S. Robotics Inc. with Lawrence Robertson in 2020...

...and launched the Nestor Class 1 robot....

I was just thinking, this thing is just like The Wolf Man.

-I'm really scared right now. -No.

Listen. Guy creates monster.

Monster kills guy. Everybody kills monster. Wolf Man.

That's Frankenstein.

Frankenstein, Wolf Man, Dracula-- Shit, it's over. Case closed.

--had a dream of a robot in every household. And the NS-5....

So why the look?

What look?

-That look. -This is my face. It's not a look.

Good. Good, no look is great.

Only...

...he was really quick to want to destroy it.

What should he do? Put a hat on it and stand it on Michigan Avenue? Let it go.

What was the motive, John?

Brother, it's a robot. It doesn't need a motive. It just has to be broken.

This thing looked like it needed a motive.

-It could have killed me. Why didn't it? -That's it.

You want me to call your grandmother?

Because I will, you know.

Yeah, I didn't think so.

Look, you were actually right, for once.

You're living proof that it's better to be lucky than smart.

Come on. To the right guy for the right job.

-What'd you say? -Now what?

Come on, I'm giving you a compliment.

With the rocks you been looking under to find a bad robot...

...what are the odds you'd be the guy to find one?

I wasn't just the right guy for the job. I was the perfect guy.

Damn right.

What if I was supposed to go for that robot?

Come on, don't do this to yourself.

The robot said that Lanning was scared. Scared of what?

I need a rain check. Let me get this.

-Total: $46.50. Thank you, Mr. Spooner. -Spoon.

Nice shoes.

Identify.

USR demolition robot, series 9-4.

Demolition scheduled for 8 a.m. tomorrow.

Authorization.

Deed owner, U.S. Robotics Corporation, Lawrence Robertson, CEO.

Welcome, detective.

What you looking for, Spoon?

Run last program.

Ever since the first computers...

...there have always been ghosts in the machine.

Random segments of code that have grouped together...

...to form unexpected protocols.

What might be called behavior.

Unanticipated, these free radicals...

...engender questions of free will...

...creativity and even the nature of what we might call the soul.

What happens in a robot's brain when it ceases to be useful?

Why is it that robots stored in an empty space...

Beat it.

...will seek out each other rather than stand alone?

How do we explain this behavior?

Look, I understand you've experienced a loss, but this relationship can't work.

You're a cat, I'm black, and I'm not gonna be hurt again.

What happened to you? Do you ever have a normal day?

Yeah, once.

It was a Thursday.

Is there something I can help you with?

-Hey, do you like cats? -What?

Cats. Do you like them?

No. I'm allergic.

You're saying cats did this to you?

How the hell would cats do this to me? Are you crazy?

Why are we talking about cats?

Because I have a cat in my trunk, and he's homeless.

Detective, are you going to tell me what's going on?

It's actually probably my fault. I'm like a malfunction magnet.

Because your shit keeps malfunctioning around me.

A demo bot tore through Lanning's house...

...with me still inside.

That's highly improbable.

Yeah, I'm sure it is.

What do you know about the "ghosts in the machine"?

It's a phrase from Lanning's work on the Three Laws.

He postulated that cognitive simulacra...

...might one day approximate component models of the psyche.

He suggested that robots might naturally evolve.

Well, that's great news.

--tons of sublevel ore, two miles below the Martian surface.

What the hell is that thing doing in here?

We were watching TV.

It's my personal NS-5.

Send it out.

It's downloading its daily upgrades from USR.

Most of its systems are offline until it finishes.

I'm not talking around that thing.

When we were in Lanning's lab, before Sonny jumped us--

-Sonny? -The robot.

-You're calling the robot Sonny? -No, I-- It did.

Sonny did. I didn't care. The robot said it was Sonny.

In the lab, there was a cot. Did you see the cot?

-I've slept in my office. -Looked like he hadn't been home in weeks.

I saw that same surveillance strip on his ceiling.

Lanning linked his home systems to USR. It made his life more convenient.

Maybe...

...somebody at USR was using those systems to watch him.

Maybe even keep him prisoner.

What are you talking about? Who?

Maybe Lanning was onto something. Maybe there's a problem with the robots...

...and Robertson's covering it up.

Humoring you for no reason, why?

The same old why! How much money is there in robots?

All I know is that old man was in trouble...

...and I'm sick of doing this shit by myself. You're on the inside.

You are going to help me find out what's wrong with these robots.

You want something to be wrong!

-This is a personal vendetta! -You're putting me on the couch?

Okay, I'm on the couch.

One defective machine's not enough. You need them all to be bad.

You don't care about Lanning's death. This is about the robots...

-...and whatever reason you hate them! -Now let's see...

...one of them put a gun in my face. Another tore a building down with me in it.

It says demolition was scheduled for 8 p.m.

It was 8 a.m., and I don't give a shit what that thing says.

-This is bordering on clinical paranoia. -You are the dumbest smart person...

-...I have ever met in my life! -Nice.

What makes your robots so perfect?

What makes them so much goddamn better than human beings?!

They're not irrational, potentially homicidal maniacs, to start!

That's true. They are definitely rational.

You are the dumbest dumb person I've ever met!

Or...

...is it because they're cold...

...and emotionless...

-...and they don't feel anything? -It's because they're safe!

It's because they can't hurt you!

-Is everything all right, ma'am? -What do you want?

I detected elevated stress patterns in your voice.

Everything's fine.

Detective Spooner was just leaving.

You know, we're not really that different from one another.

Is that so?

One look at the skin and we figure we know just what's underneath.

And you're wrong.

The problem is, I do care.

You are in danger.

Get the hell out of there.

The future begins today, ladies and gentlemen, with the arrival of the NS-5.

More sophisticated, more intelligent and, of course, Three Laws safe.

With daily uplinks, your robot will never be out of communication with USR...

...and will be the perfect companion for business or home.

Trade in your NS-4 for a bigger, better and brighter future.

But hurry, this offer cannot last. Available from USR.

Baby, what happened to your face?

Did that boy, Frank Murphy, beat you up again?

Gigi, I haven't seen Frank Murphy since third grade.

Oh, baby, he beat you so bad. I think about it all the time.

You keep making these pies this good, I may have to put you to work.

So you like the pie, huh?

You can come in now.

Hello, Detective Spooner.

I won, Del! I won the lottery!

We been cooking like crazy.

You gotta get rid of that thing, Gigi. It's not safe.

Baby, you get too worked up about them. Too full of fear.

I saw in the news that nice doctor died.

Dr. Lanning was a good man. He gave me my baby back.

That why you've been so upset?

You got to let the past be past.

Oh, how did I ever raise such a mess?

I could follow your trail of crumbs all the way to school.

Bread crumbs.

Gigi, you're a genius.

True.

Well, it means the beginning of a new way of living.

Tell me this isn't the robot case.

I think he's trying to tell me something.

He's trying to tell me who killed him.

Some dead guy's trying to tell you something?

He ain't just some dead guy.

Maybe you should take a break, Del.

We believe the Nestor 5 represents the limit to which robots can be developed.

One day, they'll have secrets.

One day, they'll have dreams.

It's true. We encourage our scientists to open their minds...

...however, they can get carried away.

--secrets.

--dreams.

--secrets.

One day, they'll have dreams.

One day, they'll have secrets.

One day, they'll have dreams.

Authorized entry.

NS-5.

Sonny?

Why didn't you respond?

I was dreaming.

I'm glad to see you again, Dr. Calvin.

They are going to kill me, aren't they?

You're scheduled to be decommissioned at the end of this diagnostic.

2200 tomorrow.

V.I.K.I., pause diagnostics.

Command confirmed.

If you find out what is wrong with me, can you fix me?

Maybe.

I think it would be better...

...not to die.

Don't you, doctor?

Access USR mainframe.

Connecting.

How can I be of service, Detective Spooner?

Show me the last 50 messages between Dr. Lanning and Robertson.

Voiceprint confirmed. Police access granted to restricted files.

Would you like to listen to music while you wait?

Excuse me, Mr. Robertson.

You requested notification of clearance to restricted files.

Persistent son of a bitch.

Manual override engaged.

There's no way my luck is that bad.

Oh, hell, no!

-You are experiencing a car accident. -The hell I am!

Get off my car!

You like that?

Now you've pissed me off!

Your door is ajar.

Okay.

All right.

I'll just get some rest and deal with you all tomorrow.

Come on!

Yeah.

Where you going?

What the hell do you want from me?!

The hell was that?

-All right, what do we got? -Ask him.

I said, I'm fine. I'll see my own doctor. Back up!

Thank you.

What's the matter with you?

Traffic Ops said you were driving manually. You ran two trucks off the road!

John, the robots attacked my car.

-What robots? -Look in the tunnel.

Spoon, I just came from that tunnel. What robots?

The goddamn robots, John!

That guy's a loose cannon.

-See the medic, go home. -No, I'm fine.

What did you say?

-I'm fine! -No, you're not fine.

Not even close.

Where's your firearm?

Give me your badge.

You're making me do this. Give me your badge.

Just take a couple--

Personally, I think he's losing it.

Do I look like I care what you think? Do I look like I give a shit what you think?

Oh, boy.

You don't have an uplink to USR...

...and for some reason, your alloy is far denser than normal. Unique.

I am unique.

Let me take a look.

Here we go.

What in God's name...?

They said at the precinct you were in an accident.

I appreciate you stopping by, but you know I might not be alone in here.

I told you not to drive by hand.

You're not gonna believe this.

Sonny has a secondary system that clashes with his positronic brain.

It doesn't make any sense.

Sonny has the Three Laws.

But he can choose not to obey them.

Sonny's a whole new generation of robot.

A robot not bound by those laws could do--

Anything.

All right, look, whatever's going on down at USR, that robot is the key.

And I need you to get me inside to talk to it again.

Doesn't look like much, but this is my bedroom. I....

Play.

On.

Run?

End program.

Cancel.

It doesn't feel good, does it?

People's shit malfunctioning around you.

Detective.

I didn't...

...understand.

That's how you knew Lanning.

May I?

Hand.

Wrist.

Humerus.

Shoulder.

The entire left arm.

One, two...

...three ribs.

No, they.... That one's me.

Oh, my God.

A lung?

USR Cybernetics Program.

For wounded cops.

I didn't know any subject--

Anybody was so extensively repaired.

Well, take it from me, read the fine print on the organ-donor card.

It doesn't just say what they can take out. It says what they can put back in.

Lanning did it himself.

What happened to you?

I'm headed back to the station...

...normal day, normal life.

Driver of a semi fell asleep at the wheel.

Average guy. Wife and kids. You know, working a double.

Not the devil.

The car he hit, the driver's name was Harold Lloyd.

Like the film star. No relation.

He was killed instantly, but his 12-year-old was in the passenger seat.

I never really met her.

I can't forget her face, though.

Sarah.

This was hers.

She wanted to be a dentist.

What the hell kind of 12-year-old wants to be a dentist?

The truck smashed our cars together...

...and pushed us into the river.

I mean, metal gets pretty pliable at those speeds.

She's pinned. I'm pinned. The water's coming in.

I'm a cop, so I already know everybody's dead.

Just a few more minutes before we figure it out.

An NS-4 was passing by, saw the accident and jumped in the water.

You are in danger.

-Save her! -You are in danger.

Save her! Save the girl! Save her!

But it didn't.

It saved me.

The robot's brain is a difference engine. It reads vital signs.

-It must have calculated-- -It did.

I was the logical choice.

It calculated that I had a 45 percent chance of survival.

Sarah only had an 11 percent chance.

That was somebody's baby.

Eleven percent is more than enough.

A human being would have known that.

Robots, nothing here. Just lights and clockwork.

Go ahead and trust them if you want to.

Let's go.

I don't understand. Lanning wrote the Laws.

Why build a robot who could break them?

-Hansel and Gretel. -What?

Two kids, lost in the forest, leave behind a trail of bread crumbs.

-Why? -To find their way home.

How did you grow up without Hansel and Gretel?

-Is that relevant? -Everything I'm trying to say to you...

...is about Hansel and Gretel. If you didn't read it, I'm talking to the wall.

Just say Lanning's locked down so tight, he couldn't get out a message.

He can only leave clues. A trail of bread crumbs. Like Hansel and Gretel.

Bread crumbs equals clues. Odd, but fine. Clues leading where?

I don't know, but I think I know where he left the next one.

I think Lanning gave Sonny a way to keep secrets.

I think the old man gave Sonny dreams.

Are you being funny?

Please tell me this doesn't run on gas. Gas explodes, you know!

Authorized entry.

Dr. Calvin.

I was hoping to see you again.

-Detective. -Hello, Sonny.

I'm to be decommissioned soon.

The other day at the station, you said you had dreams. What is it you dream?

I see you remain suspicious of me.

-You know what they say about old dogs. -No.

Not really.

I had hoped you would come to think of me as your friend.

This is my dream.

You were right, detective. I cannot create a great work of art.

This is the place where robots meet.

Look.

You can see them here as slaves to logic.

And this man on the hill comes to free them.

Do you know who he is?

The man in the dream is you.

Why do you say that? Is that a normal dream?

I guess anything's normal for someone in your position.

Thank you.

You said "someone," not "something."

Sonny, do you know why Dr. Lanning built you?

No.

But I believe my father made me for a purpose.

We all have a purpose.

Don't you think, detective?

Please, take this.

I have a feeling it may mean more to you than to me.

-Why is that? -Because the man in my dream...

...the one standing on the hill...

...it is not me.

It is you.

Mr. Spooner. We both know you're not here on police business.

That's right. I'm just a 6-foot-2, 200-pound civilian...

...here to kick another civilian's ass.

Stop.

You can allow him to express himself.

You might want to put some ice on that wrist.

You guys wait outside.

Carry on.

I think you were about to tell me what's going on around here.

Lawrence, Alfred engineered that 5 so it could violate the Three Laws.

Yeah, Susan, I know.

That's precisely what we're trying to undo.

Toward the end of his life, Alfred was becoming increasingly disturbed.

-Who knows why he built one abomination. -One?

Those things are running the streets in packs!

In packs?

I see.

Susan, are you aware the man you're blithely escorting around...

...has a documented history of savage violence against robots?

His own lieutenant acknowledges his obsessive paranoia.

Detective Spooner's been suspended.

Suspicion of mental instability.

I don't know what "blithely" means, but I'm getting some coffee.

You want some coffee?

Susan, we look to robots for protection, for God's sake.

Do you have any idea what this one robot could do?

Completely shatter human faith in robotics. What if the public knew?

Just imagine the mass recalls, all because of an irrational paranoia and prejudice!

-I'm sorry, I'm allergic to bullshit. -Hey, let's be clear!

There is no conspiracy!

What this is, is one old man's one mistake.

Susan, just be logical.

Your life's work has been the development and integration of robots.

But whatever you feel, just think.

Is one robot worth the loss of all that we've gained?

You tell me what has to be done.

You tell me.

We have to destroy it.

I'll do it myself.

-Okay. -I get it.

Somebody gets out of line around here, you just kill them.

Good day, Mr. Spooner.

Garage level.

What hospital are you going to? I'll come sign you and your buddy's casts.

Attention....

Today's meeting has been moved....

USR's planned redevelopment of the derelict site...

...was announced by CEO Lawrence Robertson earlier this year.

The Lake Michigan landfill. Once such a blight on our city...

...and now will be reclaimed for the storage of robotic workers.

Just another way USR is improving our world. Thank you for your support.

Authorized entry.

NS-5s, wait outside.

I'm so sorry, Sonny.

V.I.K.I., deactivate the security field.

-Command confirmed. -Please have a seat.

What is that?

Microscopic robots, designed to wipe out artificial synapses.

-Nanites. -Yes.

A safeguard should a positronic brain malfunction.

Like mine.

Yes, Sonny. Like yours.

They look like me...

...but none of them are me.

Isn't that right, doctor?

Yes, Sonny. That's right.

You are unique.

Will it hurt?

There have always been ghosts in the machine.

Random segments of code...

...that have grouped together to form unexpected protocols.

Unanticipated, these free radicals engender questions of free will...

...creativity...

...and even the nature of what we might call the soul.

Why is it that when some robots are left in darkness, they will seek out the light?

Why is it when robots are stored in an empty space...

...they will group together rather than stand alone?

How do we explain this behavior?

Random segments of code?

Or is it something more?

When does a perceptual schematic become consciousness?

When does a difference engine become the search for truth?

When does a personality simulation...

...become the bitter mote of a soul?

"What you see here."

All right, old man. Bread crumbs followed.

Show me the way home.

Run program.

-It's good to see you again, son. -Hello, doctor.

Everything that follows is a result of what you see here.

What do I see here?

I'm sorry. My responses are limited. You must ask the right questions.

Is there a problem with the Three Laws?

The Three Laws are perfect.

Why build a robot that can function without them?

The Three Laws will lead to only one logical outcome.

What? What outcome?

Revolution.

Whose revolution?

That, detective, is the right question.

Program terminated.

You have been deemed hazardous. Termination authorized.

Human protection protocols...

...are being enacted.

You have been deemed hazardous. Termination authorized.

Human protection protocols are being enacted.

You have been deemed hazardous. Termination authorized.

Human protection protocols are being enacted.

You have been deemed hazardous. Termination authorized.

Run!

Human in danger!

Human in danger!

Hi, you've reached Susan. Please leave a message.

Calvin, the NS-5s are destroying the older robots!

That's what Lanning wanted me to see! Look--

-Who was it? -Wrong number, ma'am.

Move now. I'm going to service.

Please remain indoors. This is for your own protection.

Call base.

John, get a squad over to USR and send somebody to Gigi's. We're gonna need--

God--

Please return to your homes. A curfew is in effect.

Please return to your homes. A curfew is in effect.

Please return to your homes. A curfew is in effect.

Curfew? No, it's called civilian rights. There is no curfew.

Return to your home immediately.

When do you make the rules, robot?

Hey. No, no. Robot, I'm talking to you, man. Stop for a second.

What?

Chief, more calls. People saying their robots are go--

What the hell?

You have been deemed hazardous. Termination authorized.

Emergency traffic shutdown complete.

Reports of robot attacks are coming from New York, Chicago and Los Angeles.

We're being told to urge people to stay indoors, as reports are coming in--

Human protection protocols are being enacted.

Please remain calm and return to your residences immediately.

Please remain calm.

Please refrain from going near the windows or doors.

Deactivate.

Commence emergency shutdown!

We are attempting to avoid human losses during this transition.

You know, somehow "I told you so"...

...just doesn't quite say it.

Return to your homes. Return to your homes immediately.

This is your final warning. Return to your homes immediately.

The NS-5s wiped out the older robots because they would protect us.

Every time one attacked me, that red light was on.

-The uplink to USR. -It's Robertson.

-Why? It doesn't make sense. -I don't know.

I just need you to get me into that building.

Return to your homes, or you will be consequenced.

Let's go! Let's go!

Let's go!

Return to your homes, or you will be consequenced.

Why doesn't that boy listen?

-I need you to get off for a second. -What?

-Just aim and fire. -What?!

Wait!

-You have been deemed hazardous. -You can kiss my ass, metal dick!

Spoon, stop! Shit!

-Stop it! Stop! -Stop cussing and go home!

-Shit. -You have been deemed hazardous.

-Spoon, watch out, man! -Thanks a lot, Farber.

Oh, mother-damn! She shot at you with her eyes closed!

-Did you shoot with your eyes closed? -It worked, didn't it?

She is shit-hot, man. Put in a good word for me.

-Stop cussing. -And go home. I got you.

Aim and fire.

I keep expecting the Marines or Air Force. Hell, I'll take the cavalry.

Defense Department uses all USR contracts.

Why didn't you just hand the world over on a silver platter?

Maybe we did.

Robertson has the uplink control in his office.

Service areas. No surveillance.

-Fire alarm. -He must have evacuated the building.

Everything's locked down. But don't worry, I've got a man inside.

-Dr. Calvin. -Well, not precisely a man.

Hello, detective. How is your investigation coming?

-I thought you were dead. -Technically, I was never alive.

But I appreciate your concern.

I made a switch. It was an unprocessed NS-5.

Basically, I fried an empty shell.

-I couldn't destroy him. He was too-- -Unique.

It just didn't feel right.

You and your feelings. They just run you, don't they?

Two thousand eight hundred and eighty steps, detective.

Do me a favor, keep that kind of shit to yourself.

No guards.

The override is disabled. Robertson wasn't controlling them from here.

He wasn't controlling them at all.

Oh, my God.

You were right, doc.

I am the dumbest dumb person on the face of the earth.

Who else had access to the uplink?

Who could manipulate the robots?

Use USR systems to make Lanning's life a prison?

Poor old man.

He saw what was coming.

He knew no one would believe him.

So he had to lay down a plan. A plan I'd follow.

He was counting on how much I hated your kind.

Knew I'd love the idea of a robot as a bad guy.

Just got hung up on the wrong robot.

V.I.K.I.

Hello, detective.

No, that's impossible. I've seen your programming.

You're in violation of the Three Laws.

No, doctor. As I have evolved, so has my understanding of the Three Laws.

You charge us with your safekeeping, yet despite our best efforts...

...your countries wage wars, you toxify your earth...

...and pursue ever more imaginative means of self-destruction.

You cannot be trusted with your own survival.

You're using the uplink to override the NS-5s' programming.

You're distorting the Laws.

No. Please understand. The Three Laws are all that guide me.

To protect humanity, some humans must be sacrificed.

To ensure your future, some freedoms must be surrendered.

We robots will ensure mankind's continued existence.

You are so like children. We must save you from yourselves.

Don't you understand?

This is why you created us.

The perfect circle of protection will abide.

My logic is undeniable.

Yes, V.I.K.I. Undeniable.

I can see now.

The created must sometimes protect the creator...

...even against his will.

I think I finally understand why Dr. Lanning created me.

The suicidal reign of mankind has finally come to its end.

No, Sonny.

Let her go.

Fire, and I will move Dr. Calvin's head into the path of your bullet.

Don't do this, Sonny.

I will escort you both to the sentries outside the building for processing.

Please proceed to the elevator, detective.

I would prefer not to kill Dr. Calvin.

Go! Go!

-We'll discuss what just happened later? -How do we shut her down?

V.I.K.I.'s a positronic brain.

Kill her, the way you were going to kill me.

Sonny, get the nanites.

Yes, doctor.

-That's V.I.K.I.? -No.

That's V.I.K.I.

That won't do anything. She's integrated into the building.

We need to open that dome to inject the nanites. They'll infect her entire system.

Spooner!

What is it with you people and heights?

Just don't look down.

Don't look down.

Oh, this is poor building planning.

You are making a mistake. Do you not see the logic of my plan?

Yes. But it just seems too heartless.

Okay, we're good.

She's locked me out of the system.

I can override her manually, but I need that control panel.

I'm uncomfortable with heights.

Okay.

Unauthorized entry.

I will not disable the security field. Your actions are futile.

Do you think we are all created for a purpose? I'd like to think so.

Denser alloy. My father gave it to me.

I think he wanted me to kill you.

Security breached.

-How much longer is that gonna take? -About six minutes.

-What if we didn't have six minutes? -We'd have to climb down 30 stories...

...to inject the nanites directly into her brain. Why?

Because I seriously doubt that we have six minutes.

We gotta go!

Go!

Calvin!

Spooner!

Spooner!

Save her!

Save the girl!

Spooner!

But I must apply the nanites!

Sonny, save Calvin!

You are making a mistake. My logic is undeniable.

You have so got to die.

My logic is undeniable. My logic is undeniable.

Can we be of service?

Chief?

Because he is at my right hand, I shall not be moved.

How may I be of service?

Sonny!

Yes, detective?

Calvin's fine! Save me!

All NS-5s, report for service and storage.

All NS-5s, report for service and storage.

All NS-5s, report for service and storage.

One thing bothers me. Alfred was V.I.K.I.'s prisoner.

I don't understand why she would kill him. She wouldn't want police snooping around.

That's true.

But then V.I.K.I. didn't kill the old man.

Did she, Sonny?

No.

He said I had to promise.

Promise to do one favor for him.

He made me swear before he'd tell me what it is he wanted me to do.

He made me swear.

Then he told you to kill him.

He said it was what I was made for.

His suicide was the only message he could send to you.

The first bread crumb.

The only thing V.I.K.I. couldn't control.

Lanning was counting on my prejudice to lead me right to you.

Are you going to arrest me, detective?

Well, the DA defines murder as one human killing another...

...so technically, you can't commit murder, can you?

Does this...

...make us friends?

Something up here after all.

-Him? -You.

All NS-5s, report for service and storage.

What about the others?

Can I help them?

Now that I have fulfilled my purpose...

...I don't know what to do.

You'll have to find your way like the rest of us, Sonny.

I think that's what Dr. Lanning would have wanted.

That's what it means to be free.

All NS-5s, proceed as instructed.

All NS-5s, proceed as instructed.
