Jeri contacted me. “Congratulations,” she said. “I hear you’re making the big flight.”
“Yes.” The Moon, visible in the window, was especially bright that night. I didn’t know what to say to Jeri.
“It’s okay,” she said. “I’ll survive.”
“I wish they’d let us both go.”
“That’s not going to happen.”
“I guess not.”
“When you get out there, say hello to Lucy for me.”
“Okay.”
She went silent. Voices murmured outside in the hallway. Somewhere a door opened and closed.
“You know what makes it especially painful, Sara? No matter how this turns out, these idiots won’t be going anywhere.”
“Maybe not.”
“If I were you, when they put me in the
“Yes?”
“I’d keep going.”
Morris came in early next morning. He looked good: bright and happy and maybe ten years younger. He said hello and moments later a technician walked in.
Morris looked at the speaker. At me. “You’re due in the simulator in twenty minutes,” he said.
I received a quick course in robot management. Four robots would be on board. They had six limbs, equipped with magnets to let them cling to surfaces in zero gee. They were programmed to perform basic maintenance and repair chores on the VR-2s. “They’re flexible,” I was told. “If you need something done they’re not already programmed for, just give them instructions.”
There’d been a fair number of changes in the VR-2 since I’d taken the
And the
On the return flight, I had to adjust the scanners and the environment and also compensate for problems in one of the heat sinks. I experienced a port-side thruster breakdown and had to diagnose strange noises in the number-two engine.
In the end, the techs updated my software. Then they walked off and I went back to watching news shows. The conversations were still primarily about us. The preponderance of opinion—or at least the loudest voices—wanted us shut down. The Eagle Project, according to detractors, was a program without a point. Moreover, we were entering an election cycle, and we’d become an anchor around the neck of every incumbent politician who’d supported us.
Finally, Morris showed up. “Very good,” he said. “You passed.” He was delighted. “We should go have a drink.”
It was his favorite joke. “Morris,” I told him, “I’d have a drink with you anytime. And I can suggest how we might make it possible.” I started to outline the kind of adaptation I’d need to enjoy a rum and Coke, but he rolled his eyes.
“When you get home, Sara,” he said, “I’ll see what I can do.” He sat down at his desk. “Meantime, be careful out there.”
“I will.”
“Good. We’ll be moving you up to the
“Okay.”
“Sara?”
“Yes, Morris?”
“Make something happen.”
AIs aren’t supposed to feel psychological pressure. In fact, the technical experts argue it can’t happen. AIs are very good at simulating human emotions. It’s supposed to be part of the overall illusion. But only crazy people buy into the notion that we are truly conscious. I’ve had debates with Morris, who pretends to believe I’m really there, that I’m actually a thoughtful entity. That, when his daughter Erika was severely injured in a car crash last year, I felt genuinely sorry. But he doesn’t. Not really. And I have to confess the attitude is irritating.
I mean, that’s the whole point of having an AI, really. Any sufficiently advanced software package can run climate control and remind the boss that he has an appointment with one of the supervisors in twenty minutes. Or can oversee the operations of a VR-2 in deep space.
But like everybody else, Morris wanted more. He wanted a reliable confederate, someone he could talk to, confide in. I won’t go so far as to say he wanted a friend, but there were times it felt that way. And it was frustrating to know that, down deep, he didn’t realize I really was there when he needed me.
They took me to the
“I’ll try, Dr. Calkin.”