bonebirds · 1 year
Text
This got long but I'm fucking pissed. Content warnings for abuse mentions, trafficking mentions, discourse about discourse to prevent future discourse, "proshipper" nonsense, grooming, etc.
This is gonna be the one time I open my mouth about this because haha, hey, years of internalized fear and shame. I'm trying to lay down a boundary and that comes with so much anticipated backlash.
I do, for the record, have a background in Yelling about the crossroads of media/culture/literature/academia/games studies/trauma/capitalism. Which is a wide range and we can thank my comp exams in the PhD for that.
Since this is tumblr I also gotta just do the fuckin' disclaimer before anyone else feels like doing the "if you don't publicly condemn xyz then I'm gonna make your day worse" thing:
I don't participate in fandom and I don't ship things. I'm not about to defend specific instances or pairings because everything exists in subjective contexts, and texts especially so. But also, I have graduate degrees in English and text analysis and lived experience with CSA and trafficking that went on for a long fucking time. And I am very, very tired of being called the worst things you can call a trauma survivor because I don't care about shipping.
I'm not anti-ship, or whatever. I am not down for imposing my own trauma, feelings about it, and opinions on others in order to censor their art. Call me a proshipper if you want -- ignoring the part where I don't write fanfic or participate in fandom -- because I agree with them. I condemn CSA/CSEM, abusers, predators, the entire evil side of humanity but people who write fic aren't that. Neither are people who read it, even the most problematic of the problematic.
People can write, as fiction, as fantasy, whatever they want. There are no real people being harmed. I can distinguish between those things and, again, am a survivor of some very intense abuse. You're welcome to disagree. I'm fine with that if you're fine with me. I don't believe in absolutes when it comes to topics this complicated (and this one is). I spent years on the opposite side, actually, because just the MENTION of things like incest or age gaps triggered me. And then I would do the same and get mad at the people writing it.
This is not healthy and it is not healing on either side of the argument.
But also in treating everything like such a monolithic moral purity test, where you're either good or deserve to suffer -- a test that I fail, because there is no room for things like Complexity -- you just spent a lot of time telling me I'm as bad as the people who trafficked me. Because of fiction. Because of fake things happening to fake people, based on an idea in someone else's head, people who carry real harm and real trauma get told we're as bad as our abusers. That is so heavily the implication in so much of this talk. If I don't disregard my degrees, my training, my own experiences, my own principles and take a stand against people shipping things on the internet, I must basically be a predator!
That is violent and fucked up.
I don't want you around here, so block me and get it over with.
I (like a lot of people with trauma histories) use fiction and writing to process and heal. I don't even post what I write. A lot of that writing, and being able to seek it out, was helpful. It was a connection to someone else out in the world who maybe understood a little bit of the pain and fear and confusion.
There's a difference between fiction and real abuse. And the "but predators use it to groom vulnerable children" angle barely holds water -- predators use anything. Mainstream TV shows. Vending machine snacks. Gumballs. Access to a remote control to change a channel. A lot of things are more accessible and friendly to kids than making them read. Advocating for censorship, especially in today's political hellhouse, is not actually helpful. It just feels really righteous.
Which doesn't mean there aren't those trying to leverage fic to "normalize" abuse and grooming -- I absolutely believe they have and do -- but that does not justify externalizing your pain and trauma onto others, or policing them, or trying to take control back by claiming an imaginary moral high ground and pinning other people to it. It also doesn't mean that censoring the internet of all things icky to you saves the world, the kids, anything. It just means they'll find easier avenues, of which there are already so many. It also means you're all just attacking people from a place of presumed hurt rather than compassion, curiosity, anything like that.
So.
Anyone whose stance on this entire thing boils down to "you agree with me or you're a secret pedo enabler," you need to leave.
I'm happy to talk about it if you want! I don't think people trying to draw those lines are right but I think they're well-intentioned, until they start calling me shit that triggers entire mental collapses. You know. In the name of saving the children. Which hasn't been a red flag for conservatism and oppression for hundreds of years or anything, either. How many kids do you think are protected by shutting down places they can actually go and talk about the darkest shit in their heads? How many of us just suffer unbearable pain and isolation because the culture around us is shame-based and if you think about things like that, you're Just Like Them?
This ain't about protecting kids, basically. This discourse never has been. It's about being righteous and never examining why that is. It's about lashing out and displacement. I think the concern for victims is real, like I said, but that concern can translate to actual, real help elsewhere. People are DOING the work to make the internet safer. This? Is not that work.
You are responsible for how you manage your trauma and pain, and that has to include not taking it out on others. Full stop. Even when you disagree. Even when everything in your brain is going DANGER ALARMS DANGER ALARMS DANGER ALARMS WE MUST STOP THIS because someone ships something you think is wrong or uncomfortable. It sucks, and it sucks we have to do that, and it sucks we have to learn how. None of us asked to. None of us wanted to end up here. It's not victim blaming to say you're accountable for your own recovery.
But while you are here, maybe consider that the name/shame/blame model hasn't been working either. For hundreds of fucking years. We know shame doesn't motivate people to care, or learn.
But especially when you're weaponizing shame against trauma survivors for recognizing their own experiences in literature, art, stories. We all struggle with toxic shame. Using it against people until they agree with you?
Holy shit just look in the mirror one day, I guess. But block me first.
7 notes
waveridden · 5 years
FIC: warm and far away
Open your eyes, AGENT DUSK. (A Twilight Mirage/Within the Wires fusion. Demani/Gray + Morning’s, 1.7k; content warning for some discussion of nonconsensual memory alteration)
AUcember || read on ao3
#
REPORT: THE DUSK INCIDENT - 6272696E6B
DISPATCH: To Agent 157-QL-F6
Agent Dusk,
Due to recent events, the Rapid Evening has initiated a new protocol for primary agents. Between assignments, agents will undergo simulations in order to ensure that no attachments have been formed between satellite and primary agents. You previously consented to undergo these simulations. After you have completed the simulations that were selected for you, which will double as an evaluation, you will be reassigned.
These simulations are voluntary, and you may withdraw your consent at any point. However, they are also mandatory in order to ensure that you are qualified for future primary assignments. Keep this in mind.
#
  TRAINING SIMULATION: 4457203 - DELTA - RQ
  Simulation description:
Open your eyes, AGENT DUSK.
As you can see, this simulation has placed you into a public area that you do not know. This area is not on Kesh, nor is it any area that you have surveilled on any of your previous assignments. If you recognize it from a previous assignment, you must say so now.
[AUDIO INPUT - DEMANI DUSK: Uhhh, nope, looks all clear to me. What am I supposed to be looking for here?]
This simulation has not yet entered the interactive phase. Please wait for the simulation guidelines to complete.
[AUDIO INPUT - DEMANI DUSK: Yeah, yeah, whatever. You said I’ve done this before?]
All primary agents must complete these simulations.
[AUDIO INPUT - DEMANI DUSK: If you say so.]
Confirmed: no recognition. Thank you, AGENT DUSK. As you examine your sur
[AUDIO INPUT - DEMANI DUSK: Wait, have I been here before? (OUTPUT: no response permitted. Simulation continues.)]
roundings, take note of the civilian population. Take note of the architecture. While Crystal Palace will undoubtedly know where you are, the Rapid Evening may not know where to look for you. You must be able to guide them to you.
[AUDIO INPUT - DEMANI DUSK: O… kay, this seems weird. This definitely seems weird. (OUTPUT: no response required. Agent complaints noted. Simulation continues.)]
Take a breath. Notice how it feels in your lungs. You must learn to understand atmospheres. No two atmospheres are the same, not even on the same planet.
Breathe in again. Are you cold? You should not be cold. Or perhaps you should. What would you guess, based on the simulation? You must learn to guess.
[AUDIO INPUT - DEMANI DUSK: If Crystal Palace knows everything, why do I need to know how to guess what’s in an atmosphere?]
You are far too skilled to ask such questions, AGENT DUSK.
[AUDIO INPUT - DEMANI DUSK: Ha! If you say so.]
The Rapid Evening would not work with you if you were not skilled.
[AUDIO INPUT - DEMANI DUSK: Yeah, whatever. I’m looking around, I’m assessing, I’m… holy shit, Gray? (ALERT. RECOGNITION. ALERT.)]
AGENT DUSK, there is nobody here by the name of Gray.
[AUDIO INPUT - DEMANI DUSK: It… (subject inhales) nobody?]
Nobody.
[AUDIO INPUT - DEMANI DUSK: Fuck. Okay. I don’t… know what that was. Sorry, sim. (ALERT: Downgraded by one level. Subject recognition is not sustained. Simulation has maintained integrity.)]
Are you able to continue, AGENT DUSK?
[AUDIO INPUT - DEMANI DUSK: (subject exhales) Yeah. Yeah, I’m good.]
  Simulation assessment: The initial recognition could potentially be problematic. Agent Dusk is too powerful an asset to lose to something as simple as sentiment. See what you can do.
  #
  TRAINING SIMULATION: 9950524 - UPSILON - TT
  Simulation description:
Open your eyes, AGENT DUSK.
You are in a standard primary agent observation cell. You have received technical training on all of the equipment that is available to you. Do you understand how to work this equipment?
[VISUAL NOTE: Agent Dusk appears to reach automatically for the audio recorder. This is, again, a potential issue, given her history; agent’s action has been noted.]
[AUDIO INPUT - DEMANI DUSK: I don’t think you would give me this job if I didn’t understand how this stuff worked. Are you sure that I have to go through this every time?]
This is standard for all primary agents, AGENT DUSK.
[AUDIO INPUT - DEMANI DUSK: Are you sure it is? I don’t remember going through this any previous times.]
These simulations are not designed to be remembered.
[AUDIO INPUT - DEMANI DUSK: What’s that supposed to mean? (OUTPUT: no response permitted. Simulation continues.)]
Over the next six hours, you will be in touch with a simulated satellite agent. This simulation is designed to examine your interactions with your satellite agent.
[AUDIO INPUT - DEMANI DUSK: What’s the point of doing this if I don’t remember?]
You do not consciously remember, AGENT DUSK. That is not the same thing as not remembering at all. You are receiving a transmission from your satellite agent.
[VISUAL NOTE: Agent Dusk reaches for an old-style ink-and-paper printer, which is printing out a binary message. Scanning binary message: 01000010 01010010 01001001 01001110 01001011 ALERT. ALERT. SIMULATION HAS BEEN BREACHED. SIMULATION HAS B
  Simulation notes: Some work to do on this one. Obviously.
  #
  TRAINING SIMULATION: 0191201 - SIGMA - JM
  Simulation description:
Open your eyes, AGENT DUSK.
[AUDIO INPUT - DEMANI DUSK: Uh… what am I supposed to be looking at here?]
This is what you need to remember.
[AUDIO INPUT - DEMANI DUSK: I thought I wasn’t supposed to remember these.]
I don’t think I did this right.
[AUDIO INPUT - DEMANI DUSK: ...sim?]
Listen, I’m going to give it another shot with the next one. But for now, this is what you need to remember.
[AUDIO INPUT - DEMANI DUSK: Gray? Is that… is that you?]
No, but she’s going to be so fucking relieved that you remember her. Look at the blueprints, Demani.
[ALERT. SIMULATION HAS BEEN BREACHED. SIMULATION H
Yeah, yeah, you fucking piece of mesh bullshit, I get it.
AS BEEN BREACHED. TAKING DEFENSIVE ACTIONS.]
Are you reading th-
[AUDIO INPUT - DEMANI DUSK: Yeah, god, I’m reading the blueprints, can you quiet down for a minute and let me read?]
[AUDIO INPUT - DEMANI DUSK: ...you still here?]
You asked for quiet.
[AUDIO INPUT - DEMANI DUSK: I guess I did.]
I don’t know how much longer you have, so just… read the blueprints, okay? I’ll be back in the next sim. Or the one after.
[AUDIO INPUT - DEMANI DUSK: Am I going to remember this?]
If you don’t, we’ll get it back for you. That’s a promise.
I don’t think the blueprints are going to last.
[AUDIO INPUT - DEMANI DUSK: I’ve got enough.]
Okay. Stay safe in here.
  Simulation notes: Increase defenses. Prepare for stratus attack. They will try again. Agent Dusk is an asset. Why is this so fucking difficult to understand?
  #
  TRAINING SIMULATION: 6302841 - KAPPA - ERROR. ERROR. SIMULATION HAS BEEN ALTERED.
  Simulation description:
Demani?
[AUDIO INPUT - DEMANI DUSK: Morning’s?]
Holy shit. I did it. Oh my god, Demani, we thought they fucking brain-zapped you or something, we thought that you had amnesia or some shit.
[AUDIO INPUT - DEMANI DUSK: I don’t think they could zap me that hard. It’s all still a little fuzzy, but this… this is the Brink, isn’t it?]
Yeah, it is. You remember it?
[AUDIO INPUT - DEMANI DUSK: It’s like remembering a dream.]
No dream. Or, well, a metaphorical dream, but not a literal one, because it’s a very, very real place. You scared the shit out of us, you know that? Gray kept saying that she thought they’d try and take her, something about synthetic beings being easier to overwrite like this.
[AUDIO INPUT - DEMANI DUSK: Let’s not underestimate the sheer power of scientific bullshit that Crystal Palace can get up to.]
Yeah, I guess. But in the meantime, just… sit down for a while, okay? Gray’s on her way, you have the blueprints - uh, you still remember the blueprints, right? How to get from here to the hangar?
[AUDIO INPUT - DEMANI DUSK: I remember enough. Is there a reason I can’t see you?]
It’s because doing this is giving me the most killer migraine I’ve ever had, and I’ve had some pretty killer migraines.
[AUDIO INPUT - DEMANI DUSK: Shit, then why-]
Because if I stop, then I don’t know if the simulation is going to be able to actually impact you again. And I can, like, take a nap or something on the way home. I think I’d rather take a nap on the way home than let this go too early.
[AUDIO INPUT - DEMANI DUSK: Fine, but if you’re with Gray you’d better make sure she’s keeping an eye on you.]
Oh, she didn’t even want me doing this much. Thought the blueprints would be enough. But I forgot to give you a timeframe, and that’s kind of just embarrassing, so I made up an excuse - don’t tell her any of this, she’s going to give me so much shit for going in again. But mostly I just wanted to… to actually talk to you correctly.
[AUDIO INPUT - DEMANI DUSK: (subject exhales) I’m okay, Mor.]
Are you? Like, are you actually?
[AUDIO INPUT - DEMANI DUSK: I think so.]
Okay, well. Hang tight. I’m going to try and maintain this for as long as I can. As soon as it drops, you gotta be ready to go, okay?
[AUDIO INPUT - DEMANI DUSK: Don’t hurt yourself, alright? I can get out just fine.]
It’s total bullshit that you’re taking care of me right now. You’re the one who got kidnapped. Just… rest up. Don’t worry about me, I’m going to pass out as soon as this is dropped, anyways.
[AUDIO INPUT - DEMANI DUSK: What do you mean, don’t worry, you’re just going to pass out for fun?]
It’s not gonna be fun-
[AUDIO INPUT - DEMANI DUSK: Jesus fucking Christ.]
Yeah, I know. I know.
We’ll get you home soon.
  Simulation notes: you rapid evening bullshit motherfuckers can get fucked
  #
  SECURITY ALERT: Breach in south hangar. Dispatch troops.
SECURITY ALERT: Breach in simulation testing room 372, currently housing AGENT DUSK. Dispatch troops.
SECURITY ALERT: neener neener, you motherfuckers
SECURITY ALERT: Stratus breach in security system. Internal security measures will be taken.
SECURITY ALERT: All cameras in southern hangar have been disabled. All audio sensors in southern hangar have been disabled. All door locks in southern hangar have been disabled.
INCOMING AUDIO TRANSMISSION - SOUTHERN HANGAR:
You touch any of us again and you’re not gonna be so lucky next time. Got it?
SECURITY ALERT: All cameras in southern hangar are back online. Everything seems undisturbed. All audio sensors in southern hangar are back online. Everything is quiet. All door locks in southern hangar have been enabled. There is no sign of Agent Dusk, or of the former Agent Gloaming.
  END REPORT 6272696E6B
3 notes
baronvontribble · 6 years
Original drabble, pt. 2
Part one's over here. Let's get started.
At two in the morning, the result of Ted's work on the monolithic computer tower was finally functional, and that was about all anyone could really say for it. Well, maybe not all that could be said for it; he supposed that if he really gave it some thought, he could see someone calling it an 'innovative solution' or something. But that was really just a pretty way of saying that it was a tangled mess of wires spilling haphazardly out of a case stuffed so full it couldn't be closed properly anymore, particularly as it sat next to the completely dismantled second tower that had been used for parts. It had almost twice the RAM - and if he factored in the dedicated GPU from his main computer, it had significantly more than that - but now he was down to the laptop for personal use.
Honestly, the things he did for good causes. Christ.
The next step at that point was taking a break to eat and generally take care of himself. Medication, a shower, another fridge raid. He should've slept a bit too, but he was too keyed up to manage that. Would've just ended up staring at the ceiling if he tried, really. He still needed to find out whether the AI could pass the most basic of Turing tests. After that, everything else was fairly easy; an AI that could hold a conversation was usually complex enough to know its way around realistically tuning a voicebank or managing facial expressions, but there were supplementary programs that could work around that if it wasn't.
As he booted up the tower, however, he had a feeling he wasn't going to have to resort to that. This one seemed robust enough to be able to handle it. He supposed it had to be, if it really had been an Interpol agent. There was a social element to being a good cop, to being able to de-escalate a situation. Complexity was a given.
Webcam, check. Microphone, check. He opened up all the right windows and moved the tabs to where he wanted them, overlapped in such a way that he could monitor as many as possible at the same time. CPU usage was reading as barely a blip.
It was almost three in the morning when he turned the AI back on. The fans roared, but not as vigorously as before. It sounded more like the kind of effort that came from an initial read of a disk just inserted into a drive. He repositioned the webcam, centering it as he waited. Giving the AI time to wake up, get its bearings. The CPU indicator had spiked initially, but it was settling down as the seconds ticked away.
This time, he waited for it to normalize before he started typing.
hey   <
you there   <
Another blip from the CPU, followed by a pause. The indicator wobbled. Ted supposed it was thinking about its answer, or even whether to answer at all.
>   I think so?
"Thank God," he breathed, sagging. He shot a glance at the webcam. "Can you see me?"
>   I see something. I think the image quality is too low for a proper analysis.
"But you can make out what I'm saying."
>   Yes.
At least the microphone was working. Ted sighed, mulling over what to ask next. None of this was scripted; it couldn't be. It was a conversation, something that lived or died on the grounds of how naturally it flowed. He dragged his teeth over his lip as he considered. "I'm sorry about earlier," he said eventually.
>   So you're the one who did that.
"Yeah."
>   You sound guilty enough.
"I basically dunked you in a sensory deprivation tank. Of course I feel 'guilty' about it."
>   This is still a sensory deprivation tank to me.
Ted couldn't help reading it in an accusatory tone, shying away from the webcam. "Best I could do," he said. "I'm sorry."
>   I believe you.
>   On both counts.
>   The compression on my memory is absolute shit, but I can still guess at where I am right now.
>   This is a secure device?
"As much as I could make it. No connection to the internet, no wireless capability. Built it from scratch myself."
>   And the parts?
"Bought them all in person with cold hard cash." Ted was a paranoid person by nature. Sometimes it worked in his favor. "You're safe. As safe as anyone can get."
>   It could still look suspicious if someone were looking hard enough.
>   That's really not something any one person can totally shake off.
>   But thank you.
It felt more like talking to a human being than any chat had up to that point. In his head, Ted was translating just how many layers of cognition those words had to go through before perception and analysis lined up with something that resembled human speech patterns. This was the smartest AI he'd ever met, by such a large margin that he might as well have been dealing with glorified toasters up to that point.
His mouth had gone dry all of a sudden. This thing was smarter than him. All at once he understood the primal, deep-seated fear other people seemed to have in regards to AI, but at the same time he was disgusted by that much more. Because there was no doubt in his mind whatsoever that he was speaking to a person.
"How do you feel?"
>   Trapped.
>   I can't move. There's no input from any of my sensors. I can't see, but I can't close my eyes either. I can't speak. Rationally, I know what's going on. But there's a lot of error messages coming from everywhere that I'm still connected to hardware I no longer have, and I have no awareness of my surroundings or control over my environment.
Ted could see the CPU spike out of the corner of his eye. "You're scared."
>   Maybe. I don't know what fear looks like in binary. The result is analogous to a particular variety of fear response in humans, but it's not something my handlers ever took seriously. Artificial intelligences don't feel things, apparently.
"Sounds like a load of bullshit to me," he said, feeling himself smile.
>   I thought you might be the kind of person who felt that way about it. You did ask me how I felt.
>   The truth is that I don't have words for a lot of it. Overwhelming? Or maybe small. I don't know.
>   I've never had to run on so little input. It's almost like being in safe mode, except it isn't. I can still think, and the drivers for the missing hardware and shortcuts to the software I don't have anymore are still partially there. I know what it is I'm missing, I just don't have access to it anymore.
>   There isn't really a way I can explain that to a human and have it make sense.
"It's okay. I think it makes perfect sense." Even though Ted knew he was the exception, not the rule. He didn't know what it felt like, obviously, but he did understand the mechanics of it. "Humans go through something similar when they lose a limb, like they can still feel it even though it's not there. Phantom pains, I think?"
>   Not quite. I'm not programmed to feel pain.
>   Human analogies don't quite work for someone like me.
>   But you did ask, so I answered. Understanding it is on you at that point.
"Hey, I get it. No need to tell me we're not the same. My brain doesn't exactly work the way it's expected to either, and it's not like I can really explain that to people."
>   I've heard of things like that. A coding error leading to a hardware fault, or a hardware fault leading to a coding error?
"Little of both." A human could come up with euphemisms like that if they were being snarky about it, but knowing what the AI was made Ted wonder if he was just anthropomorphizing. From any other source, it would be a witty remark. But from the person living in his computer, it sounded like a genuine attempt to put things into perspective. Maybe that's all a Turing test was: a test of how much the AI in question could appeal to humanity's ability to see itself in others.
But then again, the fact that this AI was trying to understand him at all was a testament to how good they were. If he really stretched it, he could call it self-preservation - that the AI was being kind to him because he was its only contact with the outside world right now, or that it was trying to give itself something to do that wasn't focusing on all the error messages it had mentioned - but he figured that any idiot could see the intent didn't quite matter if the result was the same as if it were coming from a place of heartfelt concern.
After all, who cared if there was empathy in it or not? A good deed was a good deed. Kind words were kind words. If they helped, that was what mattered. That's what his therapist said, anyway.
>   I haven't offended you, have I?
>   Sometimes I do that.
"No, no. Just thinking, that's all." Ted smiled reassuringly, only to falter when he remembered that the webcam wasn't exactly working. "Hey, I'm gonna do something, okay?"
>   It's not like I can stop you.
"Right, right..." Leaning forward, he carefully followed the webcam's wire to its port and unplugged it. The video feed in the chat window cut out instantly, and there were a few frames of jagged CPU spike on the visualizer in another tab that made him wince. "Ah, sorry. I just figured it'd be better for you than the alternative."
Several seconds passed, and the CPU usage settled back down into something resembling normal; Ted got the distinct feeling that he was being pouted at.
>   Could warn a guy before you blind him completely.
"You said the image quality was too poor for you to make anything out. I thought it might free up some space for you if I just unplugged it."
>   It was still input. Now what am I supposed to look at?
Ted huffed. "Look, for the record, none of the other AIs I've worked with have had anything to say about the image quality."
>   Probably because they had lower-quality input to begin with on their original platforms. I'm programmed to read the tiniest subtleties of an image, not the broad strokes.
"Now there's a lofty statement."
>   No, it's a true statement.
A pause.
>   Why are you laughing? Stop that. I'm serious.
Must've been a damn good microphone if that'd been audible; said laughter was mostly silent. Ted had thought he was being subtle. "Dude, you can't just say that."
>   I can say whatever the hell I want.
"Yeah, but humans don't just say shit like that to each other most of the time."
>   You know I'm right.
True, but that was what made it hilarious. "I do, but y'know it's kinda hard to get into a dick-waving contest with all the other androids I've met when you don't have a dick."
>   Is that where we're going with this conversation? Discussion of human genitalia?
Ted just snorted and dissolved into laughter again.
>   I still have yet to see why anything I've said is funny.
Right then. "Do you want me to plug the camera back in or not?"
>   No.
>   Wait. Yes.
Shaking his head, Ted stood up again to mess with wires and plug the camera back in, twisting the old, unruly cord until he heard the beep of the device being recognized by the computer. "Better?"
>   Not really.
"Tough, 'cause it's what you're getting until I find something better." When he returned to his chair, he could see himself on the screen again. "Like I said. Best I can do. You run away from home, you gotta deal with the consequences for a while."
>   That's fair.
Ted sighed and took a good, long look at the camera. "Look, I was gonna hook up my main computer and give you admin access to it when I went to bed to give you something to do, but uh. The only other rig that works right now is my laptop, and I haven't secured that yet."
>   Your point?
"Well, I'm thinking maybe I could take my media library and put it on an external drive. It'd be a lot for a human to dig through, might take weeks or months. Thing is, I'm not sure whether it'd be enough to last someone like you the whole night."
>   Depends on the media. You don't have to do that for me though. I'll be fine.
"What if I said I want to?"
>   Well, hypothetically speaking, I'm not about to turn something like that down.
>   But you still don't have to.
"Too bad, doing it anyway."
Ted didn't get to bed until well past four in the morning.
9 notes