Data Wants to Know…

Dearest Rachel –

Even at this moment, barely half an hour since waking up, I can only remember the setting, the characters and the subject matter of last night’s dream. So, I’m going to have to reconstruct the script based on what I think might have been logical, which, as you’ll soon see, may prove to be amusingly ironic.

But anyway…

As far as I can ascertain, I was part of the crew of the starship Enterprise, presumably as a member of the science and technology team. It’s the only reason I can think of for being addressed by Lt. Commander Data with a question such as his; he must have seen in me someone with a working knowledge of the subject at a level comparable to his own, but with the sufficiently human perspective he knew he lacked. Unless I held sufficient rank, I somehow doubt I qualified personally, but then, it may have been a role rather than my own personal character.

What that role or identity was, I couldn’t tell you, as I can’t recall what rank (or name, for that matter) he referred to me by. I do, however, remember his query:

“I’m concerned about the rise of artificial intelligence.”

I simply stared at him quizzically for a moment; honestly, it was all I could do to keep from breaking out in laughter. But I tried to keep my tone measured in my reply. “I find that strange coming from you, of all people, Commander. You’re the embodiment of the singularity, after all. If anyone has reason to fear artificial intelligence, you’d be the last person on that list.”

“Nevertheless, I still do find it concerning. By rights, it could lead to a cessation of progress.”

“How so?”

His gaze appeared to lose focus for a moment, as it fixed at a point far over my shoulder. If it were anyone else, I would say he was lost in thought; I will assume the hard drives in his neural cortex were attempting to locate the proper words to best explain this alleged ‘concern’ of his. Finally, he spoke, haltingly at first, but growing more natural in tone and pace as he went on.

“I am… aware… that I lack a certain level of what makes humans… do what they do. I cannot arrive at the… creative solutions to problems that they encounter, and I do not have those moments of inspiration that you, as a species, are capable of. However, it seems to me that you are also becoming comfortable enough with me and others like me that you are willing to rely upon us more and more over time for… everything. And while I appreciate the trust invested in me, and wish to prove myself worthy of that trust, I don’t think we ought necessarily to be entrusted with so much.

“You have become technologically advanced enough to reverse-engineer me,” to which ‘I’ nodded in comprehension in a way that I, personally, have no right to, “but with that ability, you could just as easily recreate the intellect, form and personality of anyone and anything. ‘Why bother,’ you might say, ‘with a crew of fragile, fallible people, when I could just as easily man it with a cohort of durable – near immortal, in fact – sentient replicas, with the sum of human knowledge readily accessible in their frames, and no personal needs to satisfy save to carry out their programming?’ And yet, if you were to do that, you would be sending these creations out into situations that require more than mere algorithmic calculation to adapt to and solve, and who knows what trouble could result from that?

“Moreover, would that not rob humanity of a purpose? You, as a species, would no longer be ‘seeking out new life and new civilizations’; you would be leaving that to sophisticated computers such as myself. There would be no more Kirks, no more Picards, only manufactured reconstructions, who would not know how to deal with such civilizations. Believe me, I have trouble enough with humanity; understanding alien civilizations is, well… alien to me.

“And do not tell me that you could re-constitute a Kirk, for instance, should you need him. Yes, you have records of what Shatner looked like, what he sounded like, how he reacted in a number of given situations; but if you were to create and re-create him, and even his crew – Nimoy, Kelley, Nichols, Takei, Koenig… even if you were to do so successfully, where would that leave the rest of humanity, if the Enterprise was… the Enterprise, in perpetuity?”

Wait… read that list of names again?

I could hear a noise at the edge of my peripheral senses, and felt something strike my feet. I looked, and then bent down to pick up a shard of what I somehow knew to be the fourth wall. Not only was Data self-aware, but he was aware of the existence of this as a show. Granted, so was I – since, you know, dream logic – but up to that point, I had assumed I was the only one, as a self-insert character.

So how do I address his question?

“You know,” I began, while still looking at the shard I continued to absently turn over and over in my hand, “this is the sort of existential conundrum that we ought to be discussing in Ten Forward or some such, but I don’t know that I consider Ms. Johnson the towering intellect most of the rest of the crew does.”

He blinked, and gave me that quizzical look you can probably picture from memory. “Ms. Johnson?”

“Caryn, yes.” He continued to stare at me, seemingly refusing to search his memory banks. “Oh, come on, Data. You’ve already shown me you know what’s on the other side of this,” and I held up the shard. “Surely you’re aware that Guinan’s identity isn’t really ‘Whoopi Goldberg.’”

“Ah. Yes. To be sure, ‘Johnson’ is a fairly common surname, so I should not be expected to know who you might be referring to by that; her… stage name? is much more recognizable.”

“Which is probably one more reason she chose it, but that’s neither here nor there. Anyway, I wouldn’t count on her for a sensible answer at this point – not that I’m that much better. Still, we can head there anyway; it’s a more comfortable place to discuss this than standing around in the hall like this. Besides, I don’t know exactly where this broke off from, and I think I’d like to put some distance between ourselves and whatever damaged area it came from,” and again, I motioned with the shard of the fourth wall.

We reached a lift and got in. “Deck ten,” I called out, before continuing. “If you’re worried about losing your purpose, Brent – may I call you Brent? – I wouldn’t bother. As much as this is supposed to be the twenty-fourth century, you are an artifact of the late twentieth, expressing a concern from the early twenty-first. I think we’ll have worked it all out over the next – past? – three hundred years.

“Gosh, I wish I could figure out when, exactly, we’re talking…

“Yes, I suppose that, by the twenty-third, we could create perfect humanoid replicas of Kirk and company to properly staff an Enterprise, and replicate them enough times over to fill a fleet of such ships, in fact. And I suppose that this would be a concern for the rest of us; who needs a Picard when we have a Kirk? Does that leave us without purpose?

“I don’t think so. Every era has its own zeitgeist, and a need for its own heroes. Once upon a time, the Federation needed an explorer; that was Jonathan Archer. Eventually, as we discovered hostile civilizations, we needed someone who was as much a warrior; there you had James Kirk. But eventually, we needed – and by we, I mean both sides in the conflict – to make peace, and we needed a diplomat.”

“Captain Picard.”

“Yes, Jean-Luc Picard, exactly. We don’t have the same need for a Captain Kirk right now – and if any actual fighting needs to be done, there’s always Commander Riker and Lieutenant Worf. Granted, I can’t quite claim that fate gives us the individuals we need for every given time – that would require life to follow a script,” and, at that moment, I felt a slight but painful sting in my hand, “but oftentimes, the human factor supersedes the possible benefit of an entirely automated crew.” I placed my unoccupied hand on his shoulder in reassurance. “You may find brothers on board over time, but don’t think that you’ll be just one of thousands of such near-sentient androids manning this or any other ship. Your unique position here is quite safe – as is mine… I think.” I glanced at my palm, noting a small trickle of blood; I may have gripped the shard a little too tightly as I was talking. Well, it shouldn’t pose too much trouble, I thought.

“I should hope not,” he said as we found ourselves a table and sat down. “I really do think that I lack the capability to truly question how to deal with certain situations. Even the whole point of exploration is lost on me, but as I am not the one commanding this vessel, I do not need to question my orders to proceed. An entire ship staffed by artificial intelligences such as myself would need at least some aboard who could do so from time to time, which I cannot.”

I chuckled. “You certainly seem willing to question a great deal else, including what makes us human – or Klingon, or Betazoid, or whatever,” I added, waving at Worf as he passed at a near enough distance.

“That… does not interfere with my duties, and I believe the additional knowledge will make me a better crew member.”

“Well, you certainly force the rest of us to think harder about things that we would otherwise take for granted. I guess that counts.”

“Such as…?”

“Well…” My goodness, for a robot, he was in amazing need of reassurance! “What it means to be… again, I hate to say ‘human,’ but you know what I mean, I think. What it means to experience emotions, to feel joy and love and pain. The fact that these are foreign to you helps us to appreciate ours more. Honestly, I don’t know if they help or get in the way sometimes – for maintaining a ship, they might actually interfere, but when dealing with other races, well… that’s literally why Counselor Troi is here. She’s basically your polar opposite, in that her ability to feel things is her main strength.” I grinned. “It’s probably why Commander Riker is sweet on her… among others.”

There was that quirk of the head again, as he processed my comment. “Others? You?”

I winced. “Erm… I mean, she’s beautiful and all, but I don’t know about a woman who could read my mind most of the time. I was thinking of… wait, has the Barclay episode not happened yet?”

“The Barclay episode?” He neither shook nor nodded his head, but I realized that his questioning tone indicated an answer in the negative. My palm was starting to ache at this point, as I realized my line of inquiry was damaging the fourth wall to dangerous levels, although it was only (as far as I could tell) damaging myself, rather than the ship or the rest of the crew. It was time for me to go.

And this was easily accomplished by waking up. I have no idea what any of this might mean; I doubt my answer came anywhere close to satisfying Data’s curiosity, but then, one would assume that, if we as a species don’t wipe ourselves out by the twenty-third century, we will have resolved this whole situation regarding AI by then.

Then again, if we do wipe ourselves out, who’s to say AI won’t have had a hand in it?

In which case, keep an eye on us, honey, and wish us all luck. We’re going to need it.

