if these walls could talk (and more)

S.C. Stuart
10 min read · Apr 3, 2021


Did you know that environments in the future will become sentient?

Photo by Efe Kurnaz on Unsplash

A great deal of this research is aimed at helping us — as it’s called — “age in place” (i.e. avoid the warehousing nightmare that led to so many COVID-19 deaths and a miserable end for too many of our elders).

Here’s the idea: our dwellings will respond to us — make sure we’re safe (tracking toxicity), that we’re breathing correctly (monitoring oxygen levels in the environment), that we’re present and correct at nightfall (memory-impairment GPS trackers), and that help is on the way if we fall (proximity sensors).
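To make that idea concrete, here’s a toy sketch of the kind of rules loop such a dwelling might run over its sensor feeds. Everything in it — the sensor names, the thresholds, the alert strings — is my own invention for illustration, not any real product’s API.

```python
# Toy sketch: one pass over a sentient dwelling's sensor readings.
# All keys and thresholds below are illustrative assumptions.

def check_dwelling(readings: dict) -> list[str]:
    """Return a list of alerts for one pass over the sensor readings."""
    alerts = []
    if readings.get("air_toxicity_ppm", 0) > 50:        # toxicity tracking
        alerts.append("ventilate: toxicity high")
    if readings.get("oxygen_pct", 21.0) < 19.5:         # oxygen monitoring
        alerts.append("oxygen low: open vents, notify carer")
    if readings.get("occupant_present_after_dark") is False:  # GPS tracker
        alerts.append("occupant missing at nightfall: check GPS")
    if readings.get("fall_detected"):                   # proximity sensors
        alerts.append("fall detected: dispatch help")
    return alerts

print(check_dwelling({"oxygen_pct": 18.9, "fall_detected": True}))
```

The real systems described in the articles below are, of course, far more probabilistic than this, but the shape — sensors in, graded responses out — is the same.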

Sounds creepy?

Well, I hear you.

But the alternative sounds worse (see above ref: warehousing elders).

It’s not a giant leap from today’s smart speakers and the like — but it’s coming.

Photo by BENCE BOROS on Unsplash

In the spirit of sharing, here are a few articles I’ve written about “sentient dwellings”, followed by *coughs* a few extracts from my novel THE FUTURE HAS LANDED (set in 2051) where environments that respond to humans (and other beings — non-spoiler alert) abound.

Photo by Ginji Ito on Unsplash

ARTICLES

Ready to have a rapport with your residence? Meet Josh, a new artificial intelligence-powered home management system, with a personality of its own (actually several, but more on that later).

[from Meet Josh, Your AI Butler]

“…we are on the cusp of living in mixed realities [in] The Architecture of Singularity: creating interactive environments that challenge traditional fabrication techniques and spatial assemblies, bringing in virtual and augmented reality, robotics and smart space applications. Thinking about architecture as an extension or a form of artificial intelligence by making semi-autonomous systems. These are sensor-based but also have an intelligence, a capacity to self-regulate not just in heating or cooling the environment but also through movement.”

[from article on UCLA Architecture futures program]

Photo by Tom Chen on Unsplash

Marcus, let’s kick off with the backstory for Scream The House Down. As an artist, have you deployed the much-vaunted ‘Primal Scream’ therapy as part of your own process to know it would be helpful for everyone stuck at home, staring into the void of Zoom?
[ML] Well, I’m a fan of psychotherapy in general. And I did another project called House Of Pain, in 2013, which also allowed you to light up a building with your screams. But you did it in person. We made that work at a fairly stressful time in my life. It gave me an excuse to do a lot of screaming while testing it out. This project felt it was the right time to do it in a different way.

[from my article: SCREAMING INTO THE VOID]

Scream the House Down — installation by Marcus Lyall in London during the Pandemic
Photo by FLY:D on Unsplash

Imagine having a place that welcomes you back from the daily grind. Not just with the ideal temperature setting, but a location that’s “alive” and understands the type of music you need to hear or images you want to see, based in part on a readout from your wearable device.

For example, we can record humidity, carbon dioxide, people’s heart rate, but how do you correlate this data to thermal comfort? There are seasonal shifts, metabolic changes, different comfort responses depending on the task you’re engaging with. We take these personal, situational, and temporal differences to build adaptive and responsive environments.

[from my article on Buildings With Personalities]
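The quote above raises the interesting engineering question: how do you fold raw readings (humidity, CO2, heart rate) plus personal and seasonal shifts into a single comfort signal? Here’s one deliberately naive sketch; the weights, baselines, and thresholds are invented for the example and aren’t taken from the article.

```python
# Illustrative only: a crude per-person "thermal comfort" estimate from the
# sensor streams mentioned above. Weights and baselines are assumptions.

def comfort_score(temp_c, humidity_pct, co2_ppm, heart_rate_bpm,
                  preferred_temp_c=22.0, resting_hr_bpm=65.0):
    """Return a score in 0..1, where 1 means 'comfortable' for this person."""
    temp_penalty = abs(temp_c - preferred_temp_c) / 10.0      # personal preference
    humidity_penalty = abs(humidity_pct - 45.0) / 100.0       # 45% as nominal
    co2_penalty = max(0.0, co2_ppm - 800.0) / 2000.0          # stale-air threshold
    hr_penalty = max(0.0, heart_rate_bpm - resting_hr_bpm) / 100.0  # metabolic load
    score = 1.0 - (temp_penalty + humidity_penalty + co2_penalty + hr_penalty)
    return max(0.0, min(1.0, score))

# A responsive dwelling could nudge the HVAC whenever the score drops:
print(comfort_score(26.0, 60.0, 1200.0, 80.0))
```

The researchers’ actual adaptive models are learned from data rather than hand-weighted like this, but the sketch shows why the “seasonal, metabolic, situational” caveats matter: every term here would need to move per person and per context.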

Photo by Marek Okon on Unsplash

Do you consider yourself a brain hacker?
Yes. I do. Anyone who creates these kinds of experiences is getting inside people’s heads. A few years ago, I created a neurogaming experience called SYNCSELF in which I used EEG head trackers to detect participants’ focus, which then influenced the narrative of the film.

[article on XR film-maker Karen Palmer]

Photo by Nguyen Minh on Unsplash

How much smart automation have you built in?
We had requests from many users to build an app so they could “call” the wheelchair to their bed when they woke up, rather than waiting for their carer to arrive. Using Wi-Fi or Bluetooth communications on the chair, an app on the user’s mobile phone, and the same parking assist technology that’s used in semi-automated cars based on ultrasound and vision sensors, it drives itself safely around the clutter of the environment to get to the user.

[article on assistive technology]

The idea that really captured my imagination — because it was so very imaginative — was “Eidla,” a remote companion intended to combat social isolation from Akanimoh Adeleye and Alyssa Kubota, both PhD students in Computer Science at UC San Diego. Eidla takes users on a fantastical trip to another world, where they meet strange and marvelous inhabitants. The idea is to engage people in a purely verbal/audio exchange and keep the psyche intact. Eidla would learn over time to retain data and deliver conversational gambits about the person’s hobbies, family members, and memories. It was charming.

[article on voice assistants for people living with disabilities]

Photo by Kevin Wu on Unsplash

Los Angeles is moving towards a whole Smart Connected City plan and, as part of that, we’re changing how we work. We started off by transitioning to a variety of “Smart” solutions with several vendors, starting with Philips’ Smart Pole initiative, which has communication nodes built in, providing 4G LTE wireless technology. There are also many other types on the table right now, including remote monitoring nodes (maintenance, gas/electricity company), solar panels, EV charging, connected security cameras, air quality meters and other sensors that help us gather data to improve our services.

[article on smart city — responsive environments tech]

still here?

why, thank you.

okay now for the fictional output [which I wrote after doing lots more articles like those above] — read on…

THE FUTURE HAS LANDED*

*out now on AMZN *hint*

She moved her head slightly to the right and the bike took a faster route. Eight minutes later Alex walked through the security checkpoint, paused by the robot sentinels for an eye scan, and took the glass-fronted elevator up to the production studio. The lobby was vast and cavernous, designed to impress and induce awe. Huge digital screens were suspended over the atrium, showing brightly colored augmented reality animations on a loop. These were sentient-aware and profile-dependent, changing depending on who was looking.

Alex looked up. In response, the AV output switched her eyeline visuals to moody black and white, vintage/Paris, Serge Gainsbourg-esque imagery. She grinned and took the stairs two at a time. Striding down the production level hallway in her motorcycle boots, Alex had to dodge a non-bio bug swarm. It paused as she entered the lobby, then continued on, filling every cubic inch of airspace.

“New security protocols?” said Alex to her assistant Tali, indicating the swarm outside. Tali just shrugged and said something about Halloween.

[Extract from THE FUTURE HAS LANDED]

Photo by Fran on Unsplash

Alex lived in a skyscraper which had been converted into live/work spaces just before The Conflict. All fifty-one floors were demarcated into white-washed automated dwellings. She lived on the 30th floor, at number 30–6. It was exactly the same as all the other units. Each one was designed for a single occupant and had neon panels outside three partitions designated as: sleep, eat, bathe. As she entered she touched her hand on the entrance panel. “Honey, I’m home,” she said, softly.

There was silence. Then, at the sound of her voiceprint, the live/work space came to life. Lights went on in the eat space and the whirring of a blender started up. The sleep room lights dimmed, her bed remade itself with fresh sheets on a roller system and dumped the used linens down a chute to the basement laundry room. “My day was okay. How was yours, Dwelling number three zero-six?” she said.

The blender stopped in the eating area. Alex put her head around the partition, grabbed the container of green smoothie and drank it while leaning back against the dividing wall. Then threw it, expertly, into the sink area — where a robotic arm automatically washed it, dried it and placed it on the open shelving where several sets of identical containers were stored, with white plates, bowls and cups. She carefully placed her Communicator and wearables into a lead-lined container. Unlike Trent, she’d not had surgery, so had no monitoring devices inside her body, keeping her stable — or, more importantly, tracked.

[Extract from THE FUTURE HAS LANDED]

Photo by Duminda Perera on Unsplash

Opening a hall closet, she touched a keypad and the dwelling’s 3D textile printer discharged a black cotton t-shirt and shorts. A robotic arm came out from the wall and tried to pick up the leather jacket — Alex slapped it and it retracted. She grabbed the freshly printed nightwear and walked to the bathing area.

“Shall we go through my settings?” said Riley. But Alex was too tired, and told it she wasn’t up for a long human-robot startup session. They’d do it in the morning. A beam emitted from Riley’s undercarriage towards a pocket socket on the wall as it settled in to recharge for the night. Alex collapsed onto her bed and closed her eyes, pulling the duvet over her head, while slipping out of the shorts and t-shirt.

Then a faint ringing sound came from the sleep area. “It’s Trent McKenna,” said Riley.

Alex realized it had already jacked itself into the dwelling’s central nervous system. “I have no intention of speaking with him tonight,” snapped Alex from inside the sleep unit. The ringing sound ended. Alex turned over and tried to go to sleep without chemical assistance. She failed. So she sat up in bed. “I see you didn’t need my permission to do set-up,” she said.

“Affirmative,” said Riley. “But it’s considered polite with human co-workers to at least offer some measure of control.”

“Even if it’s illusory?”

“My training indicates it is a useful bonding exercise.”

[Extract from THE FUTURE HAS LANDED]

Photo by Lennon Cheng on Unsplash

That sort of thing had never stopped her before, but that was because she was sticking it to the spectre of Corporate America. Dealing with the Chinese was a whole other ballgame.

Lying full-length on her new chaise, she blinked in the direction of the entertainment console to wake it up. It did so, and instantly gathered data on her mood, hormonal monthly shift issues, nutrition levels and what she’d enjoyed in her forty-two years on the planet so far.

Doing its best, it came up with some really good ideas for a night in, but Alex switched it off by waving her hand in front of the sensor.

Using voice activation she told the live/work dwelling automation system to CALL RILEY.

NUMBER NOT FOUND, it responded.

[Extract from THE FUTURE HAS LANDED]

Photo by Alexander Popov on Unsplash

Alex outlined the basic premise of Nirvana and ran through her very high level digest on a dozen PhD-authored neuroscience papers about the brainstem/limbic/cortex areas of the brain (physical, emotional, thinking); fight or flight impulses; arousal; the role of hormones and neurotransmitters within the endocrine system (cortisol, dopamine, serotonin) to regulate mood.

One of the hearing-impaired coders typed into the concurrent messaging system: “OMG. Are we building Philip K. Dick’s Penfield Mood Organ for real?”

Alex stopped to scan their words. She paused to get the reference extrapolated via her in-ear AI assistant. It had been a long time since she’d read the original Do Androids Dream of Electric Sheep? She knew a few words of sign language, so acknowledged their input and thanked them.

“In a way,” she continued, speaking to the group as a whole, “but we are not providing a dial-up/down response — we have to build something that literally makes people happy, because their antidepressants aren’t working anymore.”

There was a look of panic on some of the coders’ faces. Alex realized she was looking at a control group. How many of them were on something — legal, or otherwise? These were highly educated, or extremely well self-taught, individuals — the best in biophysics, computer science, game development, machine learning, AI and data science. She bet that they all used pharmaceuticals to sleep, wake up, and function throughout the day. No wonder they looked worried.

“Don’t panic,” she said, smiling, using one of the Douglas Adams Hitchhiker’s Guide to the Galaxy references they peppered throughout their conversational exchanges. “We know enough about human physiology now to tweak moods through the built and/or virtual environment — sound, vision, color, heat, light, touch, textures, and so on,” she said. “Plus an awareness of how the brain works in terms of suggestion, calming effects and generating fantasy.”

“We’re going to manipulate people, basically?” said Max Cho, privately — turning his face away so people couldn’t lip read him — on Alex’s dedicated audio channel.

Alex grinned at him, then sent a message via her Communicator to his augmented reality glasses so it ran as a feed in front of his eyes. “Jeez, man — what do you think we’ve been doing all these years? International diplomacy and humanitarian aid?”

[Extract from THE FUTURE HAS LANDED]


S.C. Stuart

S.C. Stuart is an award-winning futurist, technology commentator and strategist