Reading Utopia in Full

Kresh made a thoughtful little noise in his throat. “That’s what it comes down to, isn’t it?” He considered for a moment, and then went on. “Of course, the traditional Spacer response would be to do nothing at all,” said Kresh. “Let it alone, let it pass. If there is no way to know if it would be better to act, why then far better to leave the thing alone. If you do nothing, then there is nothing you can be blamed for if things go wrong.”

“Another proud legacy of the Three Laws,” Fredda said. “Be safe, do nothing, take no chances.”

“If the Three Laws teach humans to avoid taking needless risks now and again, I for one see that as a very strong argument in their favor,” said Donald, speaking for the first time. “But even the First Law contains an injunction against inaction. A robot cannot stand idly by. It must act to prevent harm to humans.”

Kresh looked toward Donald with a smile. “Are you saying that a robot faced with this decision would choose to bring down the comet? Is that what you would do?”

Donald held up his hands palm out and shook his head back and forth vigorously. “By no means, Governor. I am quite literally incapable of making this decision. It would be a physical impossibility for me to do it, and more than likely suicidal to attempt it. So it would be for any properly constructed Three Law robot.”

“How so?”

“The First Law enjoins us against doing harm to humans, and against inaction at such times when robotic action would prevent harm to humans.” Donald’s speech became labored as he spoke. It was plain that even discussing the issues in a hypothetical context was difficult for him. “In this case, both action and inaction might or might not cause or prevent harm to humans. Attempting to deal with such a difficult problem, with the lives of so many present and potential humans in the bal-balance would cause…would cause irreparable damage to any pospospositronic brain, as the question produced cascading First-Law/First-Law conflictzzz.” Donald’s eyes grew a bit dimmer, and his movements seemed oddly sluggish as he put his arms back down at his side.

“All right, Donald,” said Kresh, in his firmest and most reassuring voice. He stepped over to the robot and put his hand on Donald’s sloping shoulder. “It’s all right. You are not the one who will have to make that decision. I order you to stop considering it at this time.” There were times when only the words of a robot’s direct master could be enough to snap the robot out of such a state.

Donald’s eyes faded out all but completely for a moment, and then came back to their normal brightness. He seemed to be looking at nothing at all for a few seconds, but then his eyes began to track again, and he looked straight at Kresh. “Thank-thank you, sir. It was most unwise of me to consider the question so closely, even when called upon to do so.”

Kresh nodded absently, knowing that he had brought it on himself. He had asked Donald why a robot could not make such a decision, and a question was, in essence, an order. It required constant caution, endless care, to deal with the delicacy of a Three-Law robot’s sensibilities and sensitivities. Sometimes Kresh was deeply tired of it all. There were even times when he was ready to concede that the Settlers might have a point. Maybe some parts of life would be easier without robots.

Not as if they had such an option at the moment. But if robots could not be trusted to face such a situation…Kresh turned toward Donald again. “Donald, I hereby order you to turn around and face the wall, and to shut off all your audio inputs until you see my wife or me waving at you. Do you understand?”

“Yes, sir. Of course.” Donald turned his back on Kresh and Fredda. “I have now shut down my audio receptors.”

“Very good,” said Kresh. More damn fool precautions, but that couldn’t be helped. At least now Donald would be unable to hear or eavesdrop. Now they would be able to talk without fear of saying the wrong thing in front of the robot and accidentally setting up a damn fool First Law crisis. Kresh turned toward Fredda. “What about the Robotic Planetary Control Center?” he asked. “I wanted to consult with it, and with the Computational Planetary Control Center, before I reached a decision.”

“Well, what about them?” Fredda asked.

The two control centers were the heart of the reterraforming effort, performing all the calculations and analyses of each new project before it was launched. The original intent had been to build a single control center. There were two basic designs to choose between. One was a Settler-style computational unit, basically a massively complex and powerful, but quite nonsentient, computer. The other was a Spacer-style robotic unit that would be based on a hugely powerful positronic brain, fully imbued with the Three Laws. It would, in effect, be a robot mind without a robot body.
