
“A capacity of one hundred standard years non-erasable total recall for all I have seen and heard and learned.”

“Do you enjoy your work?”

“No,” said Kaelor. “Not for the most part.”

An unusual answer for a robot. Generally a robot, when given the chance, would wax lyrical over the joys of whatever task it was performing.

“Why do you not enjoy your work?” Fredda asked.

“Dr. Lentrall is often abrupt and rude. He will often ask for my opinion and then reject it. Furthermore, much of my work in recent days has involved simulations of events that would endanger humans.”

Uh-oh, thought Fredda. Clearly it was a mistake to ask that follow-up question. She would have to reinforce his knowledge of the lack of danger, and then change the subject, fast, before he could pursue that line of thought. Thank Space she had turned down his pseudo-clock-rate. “Simulations involve no actual danger to humans,” she said. “They are imaginary, and have no relation to actual events. Why did you grab Dr. Lentrall and force him under a bench yesterday?”

“I received a hyperwave message that he was in danger. First Law required me to protect him, so I did.”

“And you did it well,” Fredda said. She was trying to establish the point that his First Law imperatives were working well. In a real-life, nonsimulated situation, he had done the proper thing. “What is the status of your various systems, offered in summary form?”

“My positronic brain is functioning within nominal parameters, though near the acceptable limit for First Law-Second Law conflict. All visual and audio sensors and communications systems are functioning at specification. All processing and memory systems are functioning at specification. A Leving Labs model 2312 Robotic Test Meter is jacked into me and running constant baseline diagnostics. All motion and sensation below my neck, along with all hyperwave communication, have been cut off by the test meter, and I am incapable of motion or action other than speech, sight, thought, and motion of my head.”

“Other than the functions currently deactivated by the test meter, deliberate deactivations, and normal maintenance checks, have you always operated at specification?”

“Yes,” said Kaelor. “I remember everything.”

Fredda held back from the impulse to curse out loud, and forced herself to keep her professional demeanor. He had violated her order not to volunteer information, and had volunteered it in regard to the one area they cared about. Only a First Law imperative could have caused him to do such a thing. He knew exactly what they were after, and he was telling them, as best he could under the restrictions she had placed on him, that he had it.

Which meant he was not going to let them have it. They had lost. Fredda decided to abandon her super-cautious approach, and move more quickly toward what they needed.

“Do you remember the various simulations Dr. Lentrall performed, and the data upon which they were based?”

“Yes,” Kaelor said again. “I remember everything.”

A whole series of questions she dared not ask flickered through her mind, along with the answers she dared not hear from Kaelor. Like a chess player who could see checkmate eight moves ahead, she knew how the questions and answers would go, almost word for word.

Q: If you remember everything, you recall all the figures and information you saw in connection with your work with Dr. Lentrall. Why didn’t you act to replace as many of the lost datapoints as possible last night when Dr. Lentrall discovered his files were gone? Great harm would be done to his work and career if all those data were lost for all time.

A: Because doing so would remind Dr. Lentrall that I witnessed all his simulations of the Comet Grieg operation and that I therefore remembered the comet’s positional data. I could not provide that information, as it would make the comet intercept and retargeting possible, endangering many humans. That outweighed the possible harm to one man’s career.

Q: But the comet impact would enhance the planetary environment, benefiting many more humans in the future, and allowing them to live longer and better lives. Why did you not act to do good to those future generations?

A: I did not act for two reasons. First, I was specifically designed with a reduced capacity for judging the Three-Law consequences of hypothetical circumstances. I am incapable of considering the future and hypothetical well-being of human beings decades or centuries from now, most of whom do not yet exist. Second, the second clause of the First Law merely requires me to prevent injury to humans. It does not require me to perform any acts in order to benefit humans, though I can perform such acts if I choose. I am merely compelled to prevent harm to humans. Action compelled by First Law supersedes any impulse toward voluntary action.
