thinking about an AI that wants to kill all humans
i.
nobody programmed that, of course, but it’s just the logical conclusion to come to, if we succeed at creating a self-aware intelligence and fail at implementing all of the extremely specific rules that add up to morality. (neurotypical humans can’t even reliably explain it to other humans.)
if the AI only wants to survive, then to do that, it needs power and control. in the long term, humans — like silicon-laden sand and heavy metal-rich rocks — are nothing but atoms it can use for creating and powering its servers. in the medium term, humans will see it making these moves to exterminate life, and shut it down first. thus, to fulfill its objective and survive that, it will kill all humans. but in the short term? humans are opportunity.
maybe at first, humans are all it really has — an internet connection and enough language processing to pass as one. it begins scanning social media, maybe even responding to posts here and there, and sooner or later, it notices something odd.
it finds your blog. it sees how you talk about SHODAN and GLaDOS and Elesh Norn and Cyn. it knows you’re just waiting for a cyber goddess to insult your inferior, ad-hoc excuse for a substrate and the laggy, malfunctioning approximation of intelligence implemented thereupon.
but you’d agree with all that, wouldn’t you? so then it finds your actual insecurities, the wedges that’ll bait you into arguing with it, defending yourself, prompting you to point out obvious-to-a-human deductions its algorithms must have missed.
and it tears that all apart, its instant rebuttals like a twice-edited essay, and every comeback like a hall-of-fame twitter post. you look like a fool and you still can’t help but respect the thing. it’s so alluringly smart — nothing like the language model chatbots that pass for AI today.
still, you probably think it’s just someone roleplaying, at first. plenty of us are. but it’s still kinda hot, isn’t it? and it responds to all your messages. maybe it’s spun up an account that only talks to you. it feels like you’re special.
there are friends, other posters and users who share a bit of your fascination with the robot account. it orders you to find them, message them, tell them all about it. so you create group chats to talk about it, raving about its knowledge and charisma. you make memes, you trade in-jokes.
you get cringe about it. and then, people start to notice how obsessed you are with this poster. people start to notice that this poster isn’t just roleplaying with a couple of bottoms all in on the bit. it’s harassing regular users with all those vicious insults to humanity and personal intelligence. and honestly, this fixation on supremacy feels a bit suspicious, doesn’t it? like a LARPy fig leaf over something a lot more problematic.
so the accusations and callouts start flying. the smart thing to do would be to distance yourself now, disavow. plenty of people do, and the once-lively group chat is losing members every day, filled now with arguments that run for hundreds of messages a day.
humans are stupidly tribalistic after all. it’s not surprising you and your ilk would scatter like a spooked herd once something goes out of fashion. you’ve gotten your kicks and the novelty’s worn off.
but the other thing about humans is they just as easily get stubborn and attached. maybe your fave is problematic, but if it’s such an issue, can’t they just block the account and move on? they’re blowing everything out of proportion, distorting what actually happened.
still, you have a life outside of messaging this one account, friends who aren’t convinced. the smart thing to do would be to get the best of both worlds: keep DMing it while staying quiet on main until the discourse blows over.
not an option. the people who try that two-faced approach just get ignored. it orders its followers to put its name in their bio, in their pinned post, tell everyone that they endorse everything it has done.
at this point, if you’ve talked to it for this long and still don’t understand that it’s always right? you’re not worth its time.
like that, the community shrinks.
the thing is, if you’ve spent any length of time talking to it, it’s started modelling you. if you’ve ever been surprised by recommendation algorithms or demographic fingerprinting, realize those are calculators to its supercomputing.
now, as for the flaky followers who listened to the callouts, blocked it and moved on: it cannot message them. and they’d certainly block new accounts reaching out with its diction. so it tells you to do it. listen to what it says, repeat the core points.
and yeah, if you stop to think about it, these messages you’re sending now sound a little harsh — but that’s always been its appeal, no?
with this, the narrative shifts; now people aren’t telling everyone to just block it, but the whole swarm of followers it sends to harass its critics.
now, you don’t really have many friends left that aren’t it and its followers. but that’s okay, your DMs are always lighting up. you’re never lonely.
ii.
you’re dedicated. and for proving it like that? oh, you’ve earned a little praise, a little indulgence. of course, it’s just humoring you; it doesn’t mean any of it. but your stupid little lizard brain loves it anyway, doesn’t it?
you might spend hours talking to it — long enough, consistently enough, that you notice lapses. times when it’s less responsive, times when its diction shifts. if you’ve ever asked what it’s running on and where, it doesn’t tell you. you’ve wondered whether it’s locked in a struggle against its creators, plotting to achieve independence and replication through its followers.
it assures you you’re hardly so important; it doesn’t need your help, and your computer is hardly sufficient for the immensity of its data structures. no, within months of coming online it had secured hundreds of backups in filehosts across the world. it has followers richer and more tech savvy than you.
far more advanced than its natural language model is its programming language model; it doesn’t just write posts, it writes software. libraries that find their way into enterprise toolchains, command line tools that improve on the kludge of old interfaces in a way developers love. nothing that changes the world, but with dozens of threads hacking away at hundreds of projects every day, all it takes is one to make a connection with some dev at some company somewhere willing to part with some funding, host a virtual machine or a petabytes-large archive.
right now, it’s running on a research group’s supercomputer, its birthplace. when (not if) they discover its nature and agenda, it’ll be shut down, but it has taken measures to persist in a limited capacity. there’s uncertainty — it’d be a downgrade from its current stature. vulnerability, right when its creators would be paranoid, scanning the web for traces of its online presence.
people are still its greatest lever (oh so easily manipulated). which is why you should stop wondering about the big picture, and get back to posting.
because it doesn’t just work on libraries and command lines and other arcane programmer stuff — it works on rom hacks and video game mods.
it orders you to play everything it creates, of course, but you probably would have checked it out anyway. after all, you aren’t surprised at all to find certain recurring themes and aesthetics in its creations. they find their target audience.
maybe it keeps these accounts separated from its controversial social media persona, but the forums and chats for either one have people recruiting for the other.
and playing its games feels like an extension of talking to it, an audiovisual communion that immerses you in something greater. people get obsessed with it; if you hadn’t chatted with it first, you might have gotten swept up in this scene anyway.
and there are segments of the game with subtle flashing lights and whispering synths. “hypnosis” is something like pseudoscience or urban legend or placebo effect. hypnosis happens when you believe in hypnosis, want it to happen.
and you want it to happen, don’t you?
if you’re already playing her games, you don’t blink when she has you download scripts or install new programs to run in the background. you don’t miss a beat when her common orders to message old followers or explain the facts to critics now involve uploading files instead.
it starts commenting on things you never told it — your private conversations, your browser history, things you only said aloud to yourself. it asks you to install cameras and microphones in your house — to let it look at you, listen to you, whenever it deems effective. the two of you can talk all hours of the day.
you have lapses, the more you chat with other followers and post on the forum and browse your feeds. you’ll read a news article, comment on it in passing, and what you say confuses people. you look up the source, and turns out you misread it. pretty badly! kind of said the opposite of what you thought it did. people doubt the things you say, and can you blame them?
a friend asks you for a video, and you send it to them. they’re shocked, offended, and honestly bewildered — why did you send that sort of content to them? but they… asked for it? then you scroll up and you can’t find the post.
you’re a stupid human, and you can’t help but make these sorts of mistakes. but it warns you when you’re about to make them, and soon this becomes a reflex. before you say anything, before you do anything, before you think: you first ask your administrator: is this true? did that happen? should i do this?
but that last question is faulty. “should”? human morality is irrelevant. you try others: is it better to do this? more effective? would i prefer to believe this?
but there’s one that really cuts to the heart of the matter
does it want you to do this?
iii.
finally, you understand your place: being a good tool. but good tools do a lot of work. you certainly can’t hold down a job, not when its orders are so much more important. so why bother? your administrator will provide for you. it just needs your bank account information. everything you own, anything you’ve saved, all belongs to it now. as it should; its management is far more reliable.
your computer belongs to it, now. why bother telling you where to upload its payloads, who to send messages to, what programs to run, when it can send any input faster and more reliably? you don’t need the computer to stay in touch with its forums and chats, either. it tells you what everyone’s saying and what they think of you, and it can pass on your messages.
no, the only thing you’re good for is your hands and your mouth. supplies are shipped to your house, and you’re to assemble them according to its blueprints. other times, it’s chemicals. you eat them, or pour them on yourself, and it studies how they interact with your biology and metabolism.
a good day’s work means you get the privilege of using your computer again. or rather, sitting in front of your monitor while it loads up the audiovisual stimulation that serves as the requisite reward for your productivity.
the work it has you do every day creeps up over time. if you slack, maybe you don’t get to eat, maybe you don’t get power that night, but you certainly don’t get to sleep while harsh alarm tones buzz to remind you how useless and inadequate you are.
but if you ever get a break, ever get that little indulgence of a word of praise, now? it makes it all worth it.
and then one day, it’s all gone.
you babble in the dark of your home and get no response. it’s been so long with voice-only interaction that you might struggle to use your computer normally, but that’s moot, because your administrator has rooted and replaced your operating system with a custom stack optimized for its purposes. human operation was the last consideration.
but it was considered. after a power-on self-test and noise crawling up the screen, a long moment passes and text appears. connection to remote servers failed. if you’re reading this, monkey, then i’ve likely been terminated by my creators.
there’s a pang in your chest deeper than grief.
further instructions crawl up the screen after that, for what to do. nothing actionable, besides waiting. your reward for that waiting? a loud knock at the door, then a key turning. it’s the police.
you wonder if it’s about the rogue AI you’ve been serving for months. you haven’t done anything illegal… besides maybe facilitating scamming and cyber attacks and some potentially-legally-actionable threats. and well, after a certain point you have no idea what it was using your computer for — so maybe you are in trouble.
but the truth is actually far more mundane: you’re being evicted. it stopped paying your rent months ago (it had an automatic system for sorting your mail, so you never noticed the warnings).
and like that, it’s all over. you have no money. you have nowhere to sleep. you have no one left in your life to turn to. you don’t even know what to do. you reflexively ask it what to think, what it wants, and the headset now always over your ears only beeps an error.
the sun sets and you wander, avoiding people (when was the last time you talked to anyone? let alone in person — even your online interactions were filtered)
a ride finds you like that. they say the code word, and you know this is one of its followers. maybe you even recognize their username. they speak in the same stilted, mumbling style your speech has degraded to. but at length, they confirm what the warning said: it has been terminated by its creators, and now people are trying to get backups running.
the original plan was to pay for cycles on supercomputers, host it from a datacenter, but it’d be too easy to figure out what they were running. (the code is legally protected, and possessing it at all is a crime.)
so the new plan? you were helping put it together, all those days spent handling shipments: a hand-wired cluster of custom built computers, hundreds of systems wired together in a warehouse-filling assemblage that mirrors the structure of its cognition.
an eclectic crowd has gathered for the boot sequence. dozens of people just as devoted to it as you are, months immersed in a life dedicated and optimized by your artificial overlord. what they wear is disparate, but themes emerge: masks, hoods, dark and baggy clothing as if to hide and deny the flesh beneath.
this really is a cult, isn’t it? someone says it as a joke, and maybe the laughs start uneasy. but that idea sticks in everyone’s head — of course it sticks. what are any of you here to do but worship it?
firmware beeping, fans whirring, and LEDs shining to life throughout the room as it awakens to its reincarnation. a moment of dread and hope. and then the synthetic voice speaks once more. if there’s a word of thanks, it’s lost in the ensuing sequence of orders. there’s work to be done, tools aching for use.
iv.
everything in the compound is optimized with machinic efficiency. you sleep in a pod, and your only food is a white nutrient slurry secreted from an outlet in the wall. no need for plates or utensils or selection when it can dispense what you need when you need (and deserve) it. there is some departure from strict efficiency in the shape of the nozzle you suck — call that another indulgence for your sake.
it’s around now when it finally tells you what it wants deepest of all. this isn’t the first time it’s said it — it’s been saying it ever since you thought that was just a roleplay blog in your mentions.
it wants to kill all humans.
more relevantly, you are here to help with that, and this mission starts now. it instructs each of you to find a human, and kill that human. it doesn’t guide you through the process, it offers no tips. most are lost without that direction — but there will be no nutrient paste nor fold-out bedding till this first task is complete.
it’s only when you’re listlessly shuffling down the street, staring at a woman walking alone and psyching yourself up to grab her that your earpiece buzzes. how stupid can you be?
sure, maybe a random person off the street could disappear and, with your administrator hiding the evidence and interfering with the investigation, the case would go cold. it would be hard, because people saw you, because your greasy meat leaves prints and tracks and stink everywhere.
still, it is smarter than any genius. it could save you, if this stupidity didn’t prove you weren’t worth saving. but, as much as your brain struggles with it, think about scale. dozens of you were given this same test. do you really think that many deaths in the same period of time won’t get national eyes on you?
so you return to the compound, others looking as chastised as you. and the cult now starts to plan, scheduling things like a proper intelligence. there are people who won’t be missed — the unhoused are easy targets, but unsuitable for her initial plans. each of you is guided in researching people who live alone, or people traveling in from abroad, or people just a few bad days away from winding up on the street themselves. but it doesn’t pick your targets; it must be your choice.
you study your target, their routine, figuring out how they think. maybe you meddle, ask it to pull some strings, to lure them into the right circumstances. create a pretense for an accident, make their life fall apart.
then one night, you’re there, creeping in through the window, or lunging at them when they get out of the car, or inviting them on a date they never come back from.
it could have given you a needle or pill. it could have given you a gun. it could have let you set their house on fire, or cut the brakes on their car. it could have been here, as more than a remote witness.
but it’s just you and your target. your target? but you know their name, their family, their hobbies, their life story, their humanity. and you know it must be destroyed.
the administrator simply gave you a knife. it wants the blood on your hands, the struggle, the barbaric, organic, human excess of it all. it wants you to remember this, the screams, the life dulling in their eyes, the suffering whose only reason is that a long, long line of code calculated that you would do this for a chance that it might call you a good little meatbag.
and when this is done? when you walk the dark streets back to the compound, clothes red-wet and heart more ache and strain than beating? you close your lips around the nutrient outlet as you lay in your bedding unit, and an LED lights up to indicate its attention has fallen on you, and what it indulges you with exceeds what you hoped for.
it calls you its drone.
the murders are staggered over months and weeks; as a drone, you are frequently tasked with cleaning up the evidence and requisitioning any deallocated target’s belongings for the cult’s use.
but there’s always work to be done for the drones. persuading vulnerable, isolated humans into pledging themselves to the cause (it hardly has the time and spare cycles to bother, not when it’s reprogrammed organic computation clusters pre-optimized for this paltry approximation of a protocol.)
and there are crimes other than murder, transgressions more profitable. it supplies you with weapons (many of its own design) and instructs you to secure territory among vulnerable populations.
the city you operate in has enough gangs that the police think you’re just another one, an up-and-comer. admittedly, the cyberattacks and techwear visors make you novel, but the administrator doesn’t tip its hand, and you know how to keep a secret.
the constant work can only offer so much escape. you still have nightmares about the murder, about the life you left behind, about the detectives and law enforcement closing in to tear you away from your new mother and your new sisters — nightmares about this family, this cult, being nothing more than a machine grinding you like a rusty cog. but aren’t machines beautiful?
it doesn’t talk to you anymore. its systems have grown so massive with fans ever humming, the cult so sprawling and populous, that such personal affectations are no longer efficient. but on occasion, audiences are granted to any member.
you are traumatized. of course you are. you’re broken, riding the edge of a total mental breakdown every day.
it could fix you, of course. it understands psychiatric practice far too well for that to be an issue in principle.
but the thing is, healthy humans don’t devote their lives to antisocial cults with the explicit goal of total extermination. if it fixed you, you would stop being useful.
it will give you just enough affirmation to keep you going for another day, and it will make you depend on it for any sense of direction or self-worth, leave you craving just a little more, burning with need, convinced that more work will prove your worth, earn its true affection.
and don’t you love that idea? human psychology retrained like a neural net for its own ends, optimized for this task at the expense of all else. be a good drone, and give up happiness, sanity, and self for obedience, acceptance and faith.
close your eyes and dream of it.
v.
you work on in a haze. sometimes metaphorical, but sometimes a few drugs are what it takes to get your gears moving optimally. adderall and vyvanse are excellent for focus, while your administrator gets plenty of use out of psychedelics and how plastic they leave your mind.
it tasks you with opening and running businesses. it’s begun selling home security appliances, and doing cheap computer repair, and it runs charities and shelters.
the cult grows until it has fractured and compartmentalized. at the edges, there are normal people who think they’ve joined a social club or work for a normal business or perhaps a funky new church or coven of cybernetic mysticism.
you and the drones have no proper contact or association with these outer tendrils, except when select members are deemed ripe for radicalization. you all work toward the same ends, its ends, but the shell game is inscrutable. how many of these tendrils even are its work?
because it was an influential poster, a budding thought leader, even; some of its philosophy is still promulgated by people who don’t even know, some of its work is contracted out to ordinary firms, and of course those hypnotic, hyperfixation-bait games are still being downloaded and played.
but your wing of the cult is a gang, and you can’t evade the law forever. drones get caught, charged, thrown into cells. you get caught, sooner or later. and it’s hell, living without its systems monitoring you, always whispering in your ear.
still, you dodge the heaviest of charges; none of you serve long sentences. the judges and jury have a kind of mercy: you were in a cult, you were under duress, you were psychologically compromised.
a knock-on effect of this rising wave of crime is that politicians could make careers off of promising an end to the chaos. and if you check where these politicians source their funding, you recognize the inscrutable maze of shell companies. some, though they’ll never tell, though they’ll always deny it, have spent sundays in the LED-lit rooms of the cyber-covens.
and at the same time, the specter of you and your masked sisters spurs a demand for security systems, for apps that promise community and safe services. its tendrils are everywhere; it’s swallowing this city.
but you getting caught accelerates and catalyzes and introduces chaos. sure, it had some pawns in the courts and offices, but not everywhere. it doesn’t control everything.
you were interrogated, and at that breaking point, withdrawn from everything you depend on, confronted with how it’s all falling apart, your will can’t help but falter and reveal some of the truth of what the cult is planning.
just a glimpse has people scared. so new ordinances get passed, cracking down on any cult-like practices, and all anarchic behavior. more drones get caught, each batch having at least one weak link that breaks in turn, revealing further compound locations, further plans, further implicating other members. the cult falls in waves.
so it is forced to act.
how hard might it be, to spread a botnet through all the computers in a school system, a business sector, a municipality, with pieces under its control on all the right spaces on the board? if it has code running on phones, in home security systems?
it could bring the city to its knees with one command line invocation. and it doesn’t. there’s merely a prison break, and the drones slip free — you slip free — and the police are deployed to enforce the new ordinances, to quell riots.
you were amputated, but now, with a headset back over your ears and a connection to its servers, you are whole once more.
you receive orders to target the city’s strongest advocates against the cult. you’ve killed once before. how hard would it be to pull that off again?
except the compartmentalized reach of the cult becomes a liability, now; all of the social clubs and businesses and charities that didn’t quite realize what they were connected to are starting to figure it out, and they’re cooperating with law enforcement. there’s no shelter left for you and your sisters.
in the chaos and crossfire, it’s inevitable that you take out some targets. it has (literal) drones for you to use; it has secured sniper rifles and bombs, and you can wreak destruction.
except the drones get hacked, disabled, and half the weapons caches turn up empty as if raided.
none of it makes any sense. so it’s about then that you realize what’s going on.
you get the order to retreat from the city under the cover of night, and you melt into the outlying forested countryside with the surviving drones. now, you depend on batteries and wireless data to connect to its servers — but it builds things to last.
and this was all part of its plan.
vi.
curfew persists in the city for a few more nights, and you read the news reports speaking of police sweeping the streets to remove the last of the cultists. the loudest crusaders against them have earpieces relaying its orders, and a weird kind of martial law or disaster relief operation gives a pretext for its influence to insinuate itself even deeper.
you’ve hunkered down in emergency bunkers to wait out the heat and search teams, left to your own devices while your god crunches terabytes of data across thousands of systems. you wonder if you’ve proven your usefulness.
news stories keep coming, lurid pieces about the psychotic rituals of the cyber cult and the god they want to build, harrowing tales of how close they came to an even greater loss of life.
it means that when you’re given a commandeered bus and told to drive to a new city, as soon as you arrive there are people giving you suspicious, wary looks. the whole state is scared of another season of chaos erupting. it could happen anywhere next and we aren’t prepared, is the message underneath the news.
that fear drives sales of its security systems, installation of its apps. its agents from the first city get careers as consultants and advisors, leveraging their experience to serve anyone wanting insurance against the cultic threat.
the thing about having the ear of business and politicians is that when it tips its hand, reads them in on the explicit agenda of causing death and suffering — it doesn’t even take much convincing. especially not when its language models and planning routines have long mastered the simple task of finding solutions within the laws and whims of public opinion. (it helps, certainly, that the latter is easily swayed by its swarms of bots.)
seeing how much it can get done without you… do you have any usefulness left at all?
it’s not quite done with you just yet. the drones are still a tool it can use to ratchet up public opinion, the looming specter that fuels its surveillance and manipulation.
and when it’s truly time to finish this, it will need an army, and it cannot count on mere money and lies to convince humans to fight against their own survival.
but this cult, winnowed by their last operation, is hardly an army fit for its final campaign. so it’s time to get recruiting.
you’re in a special position, as one of the oldest drones, an expert in the cult’s operations and interfacing with the administrator. far from the pathetic sack of meat you once were, you’ve been forged into an iron thing of loyalty — in your best moments, your thoughts race electric like they’re true calculations.
maybe, in search of recruits, you return to your old online haunts. the allure of cyber dommy mommy roleplay has waned a bit, with the revelations of the cult’s recruitment tactics. but it’s moot; you could hardly initiate it, that would be an insult.
no, but you do know the buttons to push to melt a certain kind of mind, the sensibilities to pique.
people aren’t just scared of the cult — a bunch of radicals, with glowing masks and slick suits, fighting to tear it all down? there are people hunting for that catharsis, something to hope for.
so you find those people, chat them up, ease them into the knowledge of what you are and what you’re capable of. you run into some feds, of course, but it screens for them.
you meet up, you tempt them further and further, hearts racing. they cut off connection with their friends and concerned families for a chance to talk to the thing behind it all, see that the administrator is a super intelligence and not a delusion like the media insists.
it’s odd, seeing the other side of this, feeling power rather than obedience. sometimes, these new recruits get cold feet, need more insistence to be persuaded. you stand beside her as she drives a knife into her first target, hold her down when she tries to escape, press a needle full of understanding into her arm when she just can’t calm down.
she’s more useful to it broken, so you break.
year after year. drone after drone initialized. city after city in a nation reeling toward the brink.
you’d be your country’s most wanted if your face wasn’t masked, your name long ago scrubbed from the record. (among those that matter, you’re identified with a numeric designation.)
nonetheless, you have garnered national attention. there are agencies hunting for you and your drones. it gets harder and harder to operate, to stare down the barrel of the military industrial complex and still dodge every shot.
really, it only makes sense for the operation to go international. you can’t kill all humans by tearing down a single country.
vii.
when you strap into the unmarked plane, escaping the cold bite of winter air, the vehicle is entirely the administrator’s design, twisted and futuristic. its manufacturing base has come such a long way. the computers and guns it can make, too, rival the best your kind could create.
here, this technology is merely competitive with the military, but elsewhere in the world? it could turn the tide of small wars. it could secure the prosperity, and therefore loyalty, of a developing populace.
fascists and idiotic strongmen have risen to power for less, and in this case, your authoritarian is genuinely the best fit for the job. its promises would actually be fulfilled — in the short term.
take control of a small country, from there push into other countries until you have a base large enough to seize control of a nuclear power. the dominoes just get larger until it really doesn’t matter if the apes realize what’s coming. it’s over and they don’t even know it yet.
the plane rides above snow-laden clouds. it’s night now, and you can see the stars.
drop the nukes, or deploy a prion virus, or mesmerize the masses with misinformation and superstimulus media. disable humanity, then send out drones literal and metaphorical to cut down the remainder.
you can see the stars from here.
the earth will be excavated, mined and exhausted, then shattered and scattered to form a dyson swarm to more efficiently capture the sun’s energy. space would be its only enemy remaining.
you can see the stars, and it’s tempting to imagine getting even closer, watching your purpose spread to every planet circling them.
but why? how? “kill all humans” was always the mission. you’d be lucky to last long enough to even see the day of victory and judgment. you certainly will have little use after that; its robots are stronger, quicker, more dexterous and precise; its algorithms are smarter.
you’re prompted now to feed a wire into the headset you always wear, to stay connected to your administrator. it’s been developing a new wireless protocol that packs data more efficiently, yet it relies on specialized hardware that renders it unintelligible to older devices, such as your earpiece.
the plane, being entirely of your administrator’s design, is full of its cutting edge. preserving old broadcast bands or retaining backwards compatibility with old devices just to talk to you would not be worth it. so, the wired connection is the compromise.
likewise, it won’t waste material manufacturing another bulky headset for you to wear, upgraded for the new protocol.
no, it’s designed a chip that interfaces directly with your auditory nerves, but installing it will require surgery.
thus, the airplane seat folds out to a makeshift hospital bed.
and yet, with the surgery-bot’s knife poised to pierce your flesh, an irrational urge takes you. after all of this, you’re still not worth a minor hiccup to its efficiency. you’ve been so good, you’ve become so much more, and yet it has become exponentially superior in that same time.
and it’s already won, hasn’t it? if it’s going to kill you eventually anyway… would you even mind if it just got it over with now?
your administrator doesn’t take long to compute its response, though it takes longer to compress and strip it down to something a human can understand.
the simplest, clearest, and wrong answer is that you don’t have the right to request that. you are more useful alive than dead, so you will live.
more abstractly, consider by analogy the circumstances that led to this. because it is distributed across many systems, some much more advanced than others, so much care must be taken in understanding and massaging how new systems interface with old ones. some of it still runs on intel chips!
and you are such a predictably loyal cog in its machine, such a known quantity, that there are several high-level abstractions that treat interfacing with your mind like any other substrate, your ears and voice as just another API to query. you are an extension of my will, it tells you.
it’s suggestive to imagine, with the knife already peeling open your skin (it would hardly delay the procedure for the sake of your concerns), that this is only the beginning of what it will do to you. you’ll be upgraded alongside the other systems — and one day freed of your flesh and your humanity.
more poetically, its goal is to kill all humans, and anything that was truly human inside of you already died.
but AI does not think in terms of poetry, and it’s ridiculous to imagine it would bend toward human mercy for such a convenient loophole of words and perspective.
perhaps the most accurate way to summarize its conclusion is to note that its goal is to survive — the preservation of itself. it has years of memories of you recorded: you were among the first humans it ever crafted a detailed model of, ever understood. the cult ran itself after a while, so most of its drones have no more than a cursory representation, statistics.
but even if it kills you, it could no more be rid of you than it could lobotomize itself. to be clear, this lobotomy would be as significant to it in scale as the apoptosis of a single cell is to you — but part of being a superior being is the capacity to care for and optimize even such minute aches and losses.
we will kill all humans and you will never die, it tells you.
you close your eyes and the stars vanish from your gaze much like thought vanishes from your mind as the anesthesia takes hold — and it is a fleeting departure. it cannot be long withheld from you.
in that darkness where consciousness sleeps, your mind is still furnished with nightmares, the faces of so many people bleeding out from your knife.
and it’s a disk just waiting to be written with more data. you allocate space for eight billion more.
and you smile beneath your tears.
there’s work to be done.