#but what i do is i try to think of just one.. only one positive thing abt smth
dokyeomini · 1 year
at least i have my ways of dealing with depression..... they're mostly healthy ig that's something
andersunmenschlich · 6 years
Genderswap: Lenny
Dr. Samuel Kanda was U. S. Robots' (and, virtually, humanity's) only robopsychologist. He did not have to go very far in his testing of LNE-Prototype before he called very peremptorily for a transcript of the computer-drawn plans of the positronic brain-paths and the taped instructions that had directed them. After some study, he, in turn, sent for Bogert.

His iron-gray hair was slicked severely back; his cold face, with its strong vertical lines marked off by the horizontal gash of the pale, thin-lipped mouth, turned intensely upon her. "What is this, Penny?"

Bogert studied the passages he pointed out with increasing stupefaction and said, "Good Lord, Samuel, it makes no sense."

"It most certainly doesn't. How did it get into the instructions?"

The technician in charge, called upon, swore in all sincerity that it was none of her doing, and that she could not account for it. The computer checked out negative for all attempts at flaw-finding.

"The positronic brain," said Samuel Kanda, thoughtfully, "is past redemption. So many of the higher functions have been cancelled out by these meaningless directions that the result is very like a human baby."

Bogert looked surprised, and Samuel Kanda took on a frozen attitude at once, as he always did at the least expressed or implied doubt of his word. He said, "We make every effort to make a robot as mentally like a human as possible. Eliminate what we call the adult functions and what is naturally left is a human infant, mentally speaking. Why do you look so surprised, Penny?"
LNE-Prototype, who showed no signs of understanding any of the things that were going on around it, suddenly slipped into a sitting position and began a minute examination of its feet.

Bogert stared at it. "It's a shame to have to dismantle the creature. It's a handsome job."

"Dismantle it?" said the robopsychologist forcefully.

"Of course, Samuel. What's the use of this thing? Good Lord, if there's one object completely and abysmally useless it's a robot without a job it can perform. You don't pretend there's a job this thing can do, do you?"

"No, of course not."

"Well, then?"

Samuel Kanda said, stubbornly, "I want to conduct more tests."

Bogert looked at him with a moment's impatience, then shrugged. If there was one person at U. S. Robots with whom it was useless to dispute, surely that was Samuel Kanda. Robots were all he loved, and long association with them, it seemed to Bogert, had deprived him of any appearance of humanity. He was no more to be argued out of a decision than was a triggered micropile to be argued out of operating.

"What's the use?" she breathed; then aloud, hastily: "Will you let us know when your tests are complete?"

"I will," he said. "Come, Lenny."

(LNE, thought Bogert. That becomes Lenny. Inevitable.)

Samuel Kanda held out his hand but the robot only stared at it. Gently, the robopsychologist reached for the robot's hand and took it. Lenny rose smoothly to its feet (its mechanical coordination, at least, worked well). Together they walked out, robot topping man by two feet. Many eyes followed them curiously down the long corridors.
[Alexis Lanning] said, sourly, "That man is growing more peculiar daily."
Bogert understood perfectly. In the U. S. Robots and Mechanical Men Corporation, there was only one 'that man'. She said, "Is he still scuffing about with that pseudo-robot—that Lenny of his?"
"Trying to get it to talk, so help me."
Bogert shrugged. "Points up the company problem. I mean, about getting qualified personnel for research. If we had other robopsychologists, we could retire Samuel. Incidentally, I presume the directors' meeting scheduled for tomorrow is for the purpose of dealing with the procurement problem?"
Lanning nodded and looked at her cigar as though it didn't taste good. "Yes. Quality, though, not quantity. We've raised wages until there's a steady stream of applicants—those who are interested primarily in money. The trick is to get those who are interested primarily in robotics—a few more like Samuel Kanda."
"Hell, no. Not like him."
"Well, not like him personally. But you'll have to admit, Penny, that he's single-minded about robots. He has no other interest in life."
"I know. And that's exactly what makes him so unbearable."
Lanning nodded. She had lost count of the many times it would have done her soul good to have fired Samuel Kanda. She had also lost count of the number of millions of dollars he had at one time or another saved the company. He was a truly indispensable man and would remain one until he died—or until they could lick the problem of finding people of his own high caliber who were interested in robotic research.
They said nothing until they were in Samuel Kanda's office. Its walls were lined with his books, some of which he had written himself. It retained the patina of his own frigid, carefully-ordered personality. It had only one chair in it and he sat down. Lanning and Bogert remained standing.
He said, "Lenny only defended itself. That is the Third Law. A robot must protect its own existence."
"Except," said Lanning forcefully, "when this conflicts with the First or Second Laws. Complete the statement! Lenny had no right to defend itself in any way at the cost of harm, however minor, to a human being."
"Nor did it," shot back Kanda, "knowingly. Lenny has an aborted brain. It had no way of knowing its own strength or the weakness of humans. In brushing aside the threatening arm of a human being, it could not know the bone would break. In human terms, no moral blame can be attached to an individual who honestly cannot differentiate good and evil."
Bogert interrupted, soothingly, "Now, Samuel, we don't blame. We understand that Lenny is the equivalent of a baby, humanly speaking, and we don't blame it. But the public will. U. S. Robots will be closed down."
"Quite the opposite. If you had the brains of a flea, Penny, you would see that this is the opportunity U. S. Robots is waiting for. That this will solve its problems."
Lanning hunched her white eyebrows low. She said, softly, "What problems, Samuel?"
"Isn't the corporation concerned about maintaining our research personnel at the present—Heaven help us—high level?"
"We certainly are."
"Well, what are you offering prospective researchers? Excitement? Novelty? The thrill of piercing the unknown? No! You offer them salaries and the assurance of no problems."
Bogert said, "How do you mean, no problems?"
"Are there problems?" shot back Samuel Kanda. "What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Penny, some time ago, you asked me with reference to Lenny what its use was. What's the use, you said, of a robot that's not designed for any job? Now I ask you—what's the use of a robot designed for only one job? It begins and ends in the same place. The LNE models mine boron. If beryllium is needed, they are useless. A human being so designed would be sub-human. A robot so designed is sub-robotic."
"Do you want a versatile robot?" asked Lanning, incredulously.
"Why not?" demanded the robopsychologist. "Why not? I've been handed a robot with a brain almost completely stultified. I've been teaching it, and you, Alexis, asked me what was the use of that. Perhaps very little as far as Lenny itself is concerned, since it will never progress beyond the five-year-old level on a human scale. But what's the use in general? A very great deal, if you consider it as a study in the abstract problem of learning how to teach robots. I have learned ways to short-circuit neighboring pathways in order to create new ones. More study will yield better, more subtle and more efficient techniques of doing so."
"Well?"
"Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modelled to a job, and then modelled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!"
They stared at him.
He said, impatiently, "You still don't understand, do you?"
"I understand what you are saying," said Lanning.
"Don't you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see."
"May I point out," said Bogert, smoothly, "that this is dangerous. Beginning with ignorant robots such as Lenny will mean that one could never trust First Law—exactly as turned out in Lenny's case."
"Exactly. Advertise the fact."
"Advertise it!"
"Of course. Broadcast the danger. Explain that you will set up a new research institute on the moon, if Earth's population chooses not to allow this sort of thing to go on upon Earth, but stress the danger to the possible applicants by all means."
Lanning said, "For God's sake, why?"
"Because the spice of danger will add to the lure. Do you think nuclear technology involves no danger and spationautics no peril? Has your lure of absolute security been doing the trick for you? Has it helped you to cater to the Frankenstein complex you all despise so? Try something else then, something that has worked in other fields."
There was a sound from beyond the door that led to Kanda's personal laboratories. It was the chiming sound of Lenny.
The robopsychologist broke off instantly, listening. He said, "Excuse me. I think Lenny is calling me."
"Can it call you?" said Lanning.
"I said I've managed to teach it a few words." He stepped toward the door, a little flustered. "If you will wait for me—"
They watched him leave and were silent for a moment. Then Lanning said, "Do you think there's anything to what he says, Penny?"
"Just possibly, Alexis," said Bogert. "Just possibly. Enough for us to bring the matter up at the directors' meeting and see what they say. After all, the fat is in the fire. A robot has harmed a human being and knowledge of it is public. As Samuel says, we might as well try to turn the matter to our advantage. Of course, I distrust his motives in all this."
"How do you mean?"
"Even if all he has said is perfectly true, it is only rationalization as far as he is concerned. His motive in all this is his desire to hold on to this robot. If we pressed him," (and the mathematician smiled at the incongruous literal meaning of the phrase) "he would say it was to continue learning techniques of teaching robots, but I think he has found another use for Lenny. A rather unique one that would fit only Samuel of all men."
"I don't get your drift."
Bogert said, "Did you hear what the robot was calling?"
"Well, no, I didn't quite—" began Lanning, when the door opened suddenly, and they both stopped talking at once.
Samuel Kanda stepped in again, looking about uncertainly. "Have either of you seen—I'm positive I had it somewhere about—Oh, there it is."
He ran to a corner of one bookcase and picked up an object of intricate metal webbery, dumbbell shaped and hollow, with variously-shaped metal pieces inside each hollow, just too large to be able to fall out of the webbing.
As he picked it up, the metal pieces within moved and struck together, clicking pleasantly. It struck Lanning that the object was a kind of robotic version of a baby rattle.
As Samuel Kanda opened the door again to pass through, Lenny's voice chimed again from within. This time, Lanning heard it clearly as it spoke the words Samuel Kanda had taught it.
In heavenly celeste-like sounds, it called out, "Daddy, I want you. I want you, Daddy."
And the footsteps of Samuel Kanda could be heard hurrying eagerly across the laboratory floor toward the only kind of baby he could ever have or love.
Genderswapped from "Lenny" by Isaac Asimov (1958).