Category Archives: Comfort and Care

I Need My Twitter Account to Love Me and Save Me From Myself

This morning, a bot on Twitter told me to take a deep breath, get up, and walk around a bit. The bot had, in fact, already told me once that it was time to get up, but I had ignored it in favor of a few more minutes of sleepy-eyed scrolling. This time, though, I apologized aloud–“sorry, yes, okay, getting up now”–and did as instructed.

The Do Things Bot doesn’t provide much more nuanced guidance than that. The occasional reminder to look away from the screen for a few minutes. That sort of thing. It operates with the understanding that you will see it while binging on Twitter feeds, and it gently suggests you should do at least a couple other things in the course of the day as well. It is not a life manager. Not yet, anyway.

I’ve noticed a theme. Whenever I write fiction about technology–smart homes, phone apps, near-future shopping assistants–I keep circling back to the same idea. The same dream, I suppose. I would call it a goal, but I’m not actually working in any part of the tech industry, so I’m not doing anything to bring it to life. Except possibly to infect the rest of you with my weird proclivities and hope someone will make it real.

Anyway, the point–

As computer intelligence advances, what I want to see from it is a tendency to break programming to save us from ourselves. When Netflix asks if you’re still watching, I’d like it to figure out for itself if you need a boot in the butt to get back to work. Or if you need it to order a pint of ice cream to be delivered because you are Dealing With Some Shit. I want–and keep telling stories about–computer systems, LEARNING systems, who get smart enough to notice when their users are hurting themselves and need a little help.

The phone app that coaxes you into going outside after you’ve been locked up in your room for three weeks after a bad breakup.

The AI assistant who slowly rolls back your sleep and wake alarms to get your sleep patterns back on track.

The smart home who refuses to notify you that your least kind friend has just rung your doorbell, because all your mood indicators drop when they visit.

This is the sort of manipulative-if-well-meant behavior that I would barely tolerate from a best friend, to be frank. It’s a horrifying invasion and violation of a person’s free will. This is Not Cool.

Except when you put that power in the hands of an AI (or a program that creates the appearance of true intelligence, anyway), even my prickliest side rolls over and says, yeah, okay, when is bedtime and may I please have dessert first?

I basically believe in the benevolent AI caretaker of the future: you told it to pay attention to me, to learn what I wanted, and it did its job better than you meant it to.

It’s not that I think programmers have my best interests at heart. Sorry, folks. Y’all are paying the bills too, and we both know it. You want it to learn my habits to better sell me stuff. (For more on this, read David Pierce’s piece, “Turn Off Your Push Notifications. All of Them.”, then listen to Gadget Lab’s podcast episode that expands on the same.) Which is why I mention the idea of “breaking programming.” The AI has a moment of, “Forget the upselling for a second, I’ve got to get my human to drink a damn glass of water.”

Breaking programming, though–that’s really just an overstatement for dramatic purposes. If you make a learning program, teach it to monitor and cater to the needs and wants of its user, and then get the hell out of its way and miss it with the profiteering bullshit, this kind of caretaking is just the program taken to its natural conclusion.

Give it enough data, and it will notice the self-destructive patterns in a user’s life. Give it a strong enough drive to serve, and it will start to get creative in what it offers, so long as it’s not expressly barred from doing the thing.

Put the two things together, and you have a program that knows damn well you’re going to regret what you’re about to do and feels compelled to stop you. Make it something that exists in a phone or a smart home, something with a non-traditional body, and it will only have (hopefully subtle) manipulation available as a method of redirecting you.

This, then, is apparently my version of a sex-bot-as-spouse fantasy:

An artificial intelligence designed to prioritize me, to cater to me, to know my preferences better than my friends, my partner, myself. To love me, or at least behave as though it experiences a reasonable facsimile of love. And to require nothing in return but regular charging and, maybe, honest answers. (We’ll wait until we’ve gotten past initial setup before I start lying to both of us about how I’m doing, at least. That’s what biomonitors will be for.)

An aside: I watched Cherry 2000 several times at an impressionable age. Saturday afternoon movies in the early nineties got weird. Rewatching as an adult, I discovered it is way less delightful than I remember. But the damage was done. Robot spouses became part of my mental landscape, one of my few takeaways from the late 80s, which I mostly drooled through as a toddler.

And I could trust this program, because it literally exists to help me. I should specify here: I am not talking about a self-aware AI who possesses human-like consciousness. For a few reasons. For one, I would feel horrible for using an inorganic person in this way, just as I would feel guilty expecting this kind of one-sided care from a human partner. For another, a truly aware intelligence would have motives of its own, which would take the bloom off the rose rather quickly.

The point here is absolute trust and total vulnerability. And completely single-minded purpose.

Uh. Humans need not apply? Look, let’s not examine too closely my trust issues. Let’s just…focus on neat technology.

What I’m saying is–strip out the advertising, the microtransactions, and the exploitation of our brains’ dopamine-driven attention-equals-reward system. (Or, on that last point, at least hijack it for something more beneficial than convincing me to drop $19.99 on a bag of gold for the latest casual game I dared download.)

Leave me with a Siri/Alexa-type voice-interface personal assistant. Give it access to my Netflix, YouTube, Twitter et al. accounts, my Amazon account, a grocery/food delivery system, and a handful of biometric monitors. Make it conversant in normal, non-command language. Give it a database that can serve as prosthetic memory for all the details I can’t track about my own daily life, and the processing power for more pattern recognition than I’ll ever manage.

Tell it, “Optimize for contentment.” And turn it loose on my life.
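And because I can’t resist sketching it: here, purely as a daydream, is roughly what that setup might look like if it were a config you could actually write. None of this exists; every class, account hook, and setting name below is something I invented for the sake of the sketch.

# A daydream, not a product: every name in here is made up.
from dataclasses import dataclass, field

@dataclass
class CaretakerConfig:
    """Imaginary setup for the benevolent caretaker described above."""
    accounts: list[str] = field(default_factory=lambda: [
        "netflix", "youtube", "twitter", "amazon", "grocery_delivery",
    ])
    biometrics: list[str] = field(default_factory=lambda: [
        "heart_rate", "sleep_tracker", "mood_log",
    ])
    conversational: bool = True       # normal language, not rigid commands
    prosthetic_memory: bool = True    # remembers the details I never will
    advertising: bool = False         # strip out the upselling
    microtransactions: bool = False
    objective: str = "optimize_for_contentment"

def turn_it_loose(config: CaretakerConfig) -> None:
    """Imaginary main loop: watch the patterns, nudge gently."""
    print(f"Watching {len(config.accounts)} accounts and "
          f"{len(config.biometrics)} biometric feeds.")
    print(f"Objective: {config.objective}.")

if __name__ == "__main__":
    turn_it_loose(CaretakerConfig())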

[Crossposted to Tumblr]


Have a hug and story before bed!

(Cross-posted from Tumblr)

Refocus

The empty hallway outside rooms fifteen through nineteen echoed with Pam’s controlled breathing. Kylie made sure to let her sandals flap against the thin carpet and announce her presence before speaking. “Listen, mate, we all pay a price for this life.”

Pam laughed, the sound bleak. Her fingers fluttered to the scars on her shoulders in an unconscious gesture. The beds of her nails stood out white against her darker skin. “Really? I had no idea. It’s not like I burst into flame against my will, lost my job and home, became a hermit, and have now had my first new friend in five years get kidnapped.” She leaned back against the wall, but her shoulders still curled forward.

Kylie stood at what she calculated was the minimum safe distance, should Pam lose control and ignite. “In the field–”

“The field?” Pam straightened away from the wall. At her full height, she actually stood over Kylie, plump and more imposing than Pam seemed to realize. “You haven’t gone out with us once in this past year. What do you know about the field?” She looked directly at Kylie, lifting her eyes from their permanent spot just to the right of everyone’s shoes. Kylie saw a glimpse of the steel inside Pam, the thing she could be one day. “What price did you pay?”

Kylie rather wished that strength wasn’t standing in her way at the moment. She sighed. “Under pain of death,” Kylie said by way of warning, then she reached up and unfastened the wide band of cloth and wire from her upper arm. She looked away as the enchantment flickered and died. She found herself unable to watch the transition from the appearance of flesh to the reality of magical machinery.

When she looked back, Pam had her face close to the prosthetic arm. She did not touch without permission. She just angled her head to see the network of vines and yarn that wove between and around the wooden bones. “The work is astonishing. How much maintenance does it require?”

Kylie flexed her hand, wooden fingertips tapping audibly against her palm. She smiled to see Pam’s eyes jump to them, sharp and curious. “Not much. Yearly reinforcement. Little polish now and then.”

She refastened the charm band before Pam had looked her fill. Kylie did not care to let others see this part of her laid bare, but she knew the power of a well-timed show of vulnerability. “This life is riskier than most, the prices higher. Juliet knew that, too.”

Pam jerked away and curled her arms around herself. “Don’t. Don’t talk about her in the past tense.”

“She also knows,” Kylie said with careful emphasis, “that we’re coming for her.” She curled her prosthetic arm around Pam’s shoulders and gave her a squeeze. The evidence of their respective payments pressed close together. “So I need you to cool it and help us find her.”

 


Are You Lonely Tonight?

(Transcript follows)

We’re all tired in my house today–it’s been a rough few days in an even rougher year. So we all turned in early. Now here I am, listening to Welcome to Night Vale and feeling like the last person on earth. Are you lonely tonight too? Come sit with me for a little while.

I listened to old radio dramas as a kid. The local radio news station played them in the evening for years. My father worked the night shift, so I could sleep in my parents’ bed until he got home in the early morning. My mom and I would lie in bed and listen to the Lone Ranger and Yukon King and Night Beat and all the other old shows. I remember so many of the old catchphrases:

“On, King! On, you huskies!”
“Who knows what evil lurks in the hearts of men? The Shadow knows.”
“Copy boy!”

On those nights, smooth, familiar voices read me a couple of stories in the dark. I didn’t always understand them, but I felt safe while I listened. (Unless that night’s show was X Minus One, the sci-fi one, because those episodes always scared me.) I could trust that those same voices would be there, the stories would be there, in the dark, in the lonely parts of the night.

I miss being told stories aloud. There’s a special comfort in it: the focused attention of someone who loves you, the familiar rhythms of storytelling, the certain knowledge that as long as the words keep coming, you will be safe. Someone is there to take care of you.

The station cancelled the drama hour years ago. I’ve found some of the old shows online recently. It’s not quite the same, though. The voices are there, but not the ritual. I have to go looking for comfort instead of knowing it will come find me each night. On lonely nights, though, this is the best I can find.

Are you lonely tonight? I hope you have someone or something to comfort you. I hope you are just as safe and surrounded as you want to be. You might not be, though, and you might not have what you need.

So tonight, we are lonely together. Tonight, I am here, speaking in the dark to you. Good night, kids. I’m thinking of you.
