The impact of AI on the built environment
The opportunities AI brings to the built environment are exciting. But all the conversations seem to be about the what and why, not the how.
Every building has character, a personality, thanks to its design, its context, its inhabitants. In the future, every building may be a personality, as artificial intelligence increasingly provides once-inert spaces with a voice and the ability to act.
This shift will require designers and architects to think in new ways about how inhabitants relate to buildings. As buildings become avatars of a sort, designers will shape the nature of the emotional connection between the building (or the building’s interfaces) and the user.
When we created the Research Lab for TEECOM, one of our goals was to make the lab into a hardware test bench. As we continued to accumulate more and more systems, we wanted to imbue the lab with the ability to manage all of its new hardware, to “keep tabs” on all these systems. By taking on these responsibilities and becoming nearly “self-regulating,” the lab AI would allow us to focus on higher-priority tasks. This is where we see buildings going in the future: AI-powered intelligent infrastructure that maintains its own resources and is self-aware (but without the malevolent overtones of 2001: A Space Odyssey). And while you can’t make a room fully self-aware yet, of course, you can get close.
Our first step toward self-managing infrastructure was creating a “brain” that could communicate with all of the smart elements of the lab, which would, in turn, allow all of these systems, both proprietary and open, to talk to each other and work cooperatively. Over time, we’ve incorporated systems into the lab that can sense occupancy, temperature, light level, and energy usage. Every smart device in the space also self-reports its status and data. The trick was figuring out how to integrate all of those “reports.” What we needed was a room “host” that could not only monitor these smart elements but also take certain actions based on responsibilities we’d assigned to it.
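To make that concrete, here is a minimal sketch of how such a room “host” might aggregate self-reported device data and act on responsibilities assigned to it. The device names, report fields, and the energy threshold are illustrative assumptions, not The Hub’s actual design.

```python
# Minimal sketch of a room "host" that collects self-reported device data
# and acts on assigned responsibilities. Device names, report fields, and
# the 500 W threshold are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class DeviceReport:
    device_id: str   # e.g. "occupancy-sensor-1"
    metric: str      # e.g. "occupancy", "temperature_c", "lux", "watts"
    value: float


@dataclass
class RoomHost:
    latest: Dict[str, DeviceReport] = field(default_factory=dict)
    responsibilities: Dict[str, Callable[[DeviceReport], None]] = field(default_factory=dict)

    def assign(self, metric: str, action: Callable[[DeviceReport], None]) -> None:
        """Give the host a responsibility: when a metric is reported, act on it."""
        self.responsibilities[metric] = action

    def ingest(self, report: DeviceReport) -> None:
        """Integrate an incoming 'report' and trigger any assigned action."""
        self.latest[report.device_id] = report
        action = self.responsibilities.get(report.metric)
        if action:
            action(report)


# Example responsibility: keep tabs on energy usage and flag unusual draw.
host = RoomHost()
host.assign("watts", lambda r: print(f"High draw from {r.device_id}: {r.value} W")
            if r.value > 500 else None)
host.ingest(DeviceReport("bench-psu-2", "watts", 640.0))
```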
The software powering this host had to live somewhere. The obvious choice: a Raspberry Pi, a small embedded computer running Raspbian, an open-source Linux operating system. Next, we needed to build a software platform with an architecture that would let us add more devices and technologies in the future; the software would handle incoming data (the aforementioned reports) and manage all of the current gear. We started designing the platform. Over the ensuing months, the software evolved from a simple RESTful API querying the building lighting system into an interactive, intelligent, self-correcting software platform. We called it “The Hub.”
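As a rough illustration of where that evolution started, the sketch below polls a lighting system over a simple RESTful API. It assumes the Philips Hue bridge’s v1 HTTP interface (GET /api/&lt;key&gt;/lights); the bridge address and application key are placeholders, not The Hub’s actual configuration.

```python
# Sketch of the kind of simple RESTful query The Hub started as: polling a
# lighting system over HTTP. Assumes the Philips Hue v1 API; the address
# and application key below are placeholders.
import requests

BRIDGE_IP = "192.168.1.50"       # placeholder address of the lighting bridge
APP_KEY = "your-app-key-here"    # placeholder API key issued by the bridge


def get_light_states() -> dict:
    """Return the bridge's report of every light: name, on/off state, etc."""
    url = f"http://{BRIDGE_IP}/api/{APP_KEY}/lights"
    response = requests.get(url, timeout=5)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    for light_id, light in get_light_states().items():
        print(light_id, light["name"], "on" if light["state"]["on"] else "off")
```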
So now our lab had a nervous system. The next step in “self-awareness” was giving the lab a personality. We integrated Amazon Alexa, which we later renamed FIERA (Fully Integrated Electronic Room Assistant) for the lab. But we wanted FIERA to extend beyond simple call-and-response dialogue. That’s why, when you walk into the lab now, FIERA greets you and offers a bit of information, such as how many people are in the lab and what the temperature is.
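The article doesn’t detail how that greeting is wired up, but a sketch of the idea might look like this: when the occupancy sensor reports a new arrival, pull the room’s status from The Hub and compose a short spoken welcome. The hub URL, endpoint, fields, and speech output are hypothetical.

```python
# Sketch of how a walk-in greeting might be assembled from The Hub's data.
# The hub address, /room/status endpoint, field names, and speech output
# are assumptions for illustration.
import requests

HUB_URL = "http://hub.local:8080"   # hypothetical address for The Hub's API


def build_greeting() -> str:
    """Compose the short status the room speaks when someone enters the lab."""
    status = requests.get(f"{HUB_URL}/room/status", timeout=5).json()
    return (f"Welcome to the lab. There are {status['occupancy']} people here "
            f"and it is {status['temperature_f']} degrees.")


def on_entry_detected() -> None:
    """Called when the occupancy sensor reports a new arrival."""
    greeting = build_greeting()
    # Hand the text to whatever speech output the room uses (e.g. a TTS call).
    print("FIERA says:", greeting)
```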
We wanted to give FIERA some character, making it as distinct and memorable as meeting a new person for the first time. When you think about the lab, we wanted you to think not just about the physical space, but about the entity that lives inside – not the room and FIERA, but the room with FIERA. As a robotic analogue: if a robot’s hardware is the shell and AI is the brain, then we effectively used the lab space as our shell, powered by FIERA’s personality and intelligence. We’d turned our lab into a robot, transformed it into a personable, smart resource. How many rooms do you know that you can hold a conversation with?
FIERA acts as a conduit to The Hub, a friendly face (so to speak) for all the layers of complex hardware and software. We first programmed FIERA to perform simple tasks. For example, the color of the lights, which are Philips Hue strips, can be altered to set a mood or shift in color temperature based on the time of day.
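A task like that can be as simple as the following sketch, which shifts a Hue strip’s color temperature with the time of day via the Hue v1 REST API. The bridge address, key, light ID, and the mired values chosen for each period are assumptions for illustration.

```python
# Sketch of a simple FIERA-style task: shifting a Hue strip's color
# temperature with the time of day. Bridge address, key, light ID, and
# the mired values per period are illustrative assumptions.
from datetime import datetime

import requests

BRIDGE_IP = "192.168.1.50"
APP_KEY = "your-app-key-here"
LIGHT_ID = 1                      # hypothetical ID of a lab light strip


def color_temp_for_now() -> int:
    """Return a Hue color temperature (in mireds) appropriate to the hour."""
    hour = datetime.now().hour
    if 6 <= hour < 12:
        return 250    # cooler, energizing light for the morning
    if 12 <= hour < 18:
        return 300    # neutral light for the afternoon
    return 400        # warmer light for the evening


def set_mood() -> None:
    """Push the chosen color temperature to the light via the bridge."""
    url = f"http://{BRIDGE_IP}/api/{APP_KEY}/lights/{LIGHT_ID}/state"
    requests.put(url, json={"on": True, "ct": color_temp_for_now()}, timeout=5)
```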
One of the more advanced tricks we taught FIERA was the lab boot-up sequence: when you knock on the door three times, FIERA turns on the lights, opens the garage door, and plays a series of audio effects. The purpose is twofold: 1) to demonstrate synchronized control of distinct and deliberately isolated platforms, and 2) to show that a room can contain elements of character and personality, giving visitors the sense that they and the room are having a two-sided interaction. A one-sided interaction with a room is walking in and flipping the light switch; a two-way interaction is a dialogue: you ask the room to do something and it responds with language and actions that may be unscripted, that will vary and change over time.
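Conceptually, the boot-up sequence is just a short, ordered script triggered by a sensor event. The sketch below is a hypothetical version: the knock detector, Hub endpoints, and audio clip name are stand-ins for the real, otherwise isolated systems being coordinated.

```python
# Hypothetical sketch of a boot-up sequence coordinated across otherwise
# isolated platforms, triggered by a knock pattern. The Hub endpoints and
# clip name are illustrative stand-ins.
import time

import requests

HUB_URL = "http://hub.local:8080"   # hypothetical Hub API


def on_knock_pattern(knock_count: int) -> None:
    """Run the lab boot-up sequence when the three-knock pattern is heard."""
    if knock_count != 3:
        return                      # only the three-knock pattern boots the lab
    requests.put(f"{HUB_URL}/lights", json={"on": True}, timeout=5)
    requests.post(f"{HUB_URL}/garage-door/open", timeout=5)
    time.sleep(1)                   # let the door start moving before the fanfare
    requests.post(f"{HUB_URL}/audio/play", json={"clip": "bootup"}, timeout=5)
```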
The way people interact with rooms right now is incredibly simplistic and manual. You want to start your presentation, so you find the remote, plug in your laptop, find your presentation, bring it up, adjust the room lighting… In the future we’ll interface with our environment in a more natural way. Instead of pushing buttons, we’ll tell the room, “FIERA, dim the lights and start my presentation.” The AI host will know the sequence to follow and what cues to wait for. Voice controls will evolve from requiring rigid syntax, where you have to ask a specific question with a specific order of words, to a more colloquial way of speaking, where you don’t have to think about how you phrase something. The host will understand variations in how we speak and will ultimately understand the intent.
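As a toy illustration of that shift, the snippet below maps several colloquial phrasings onto a single intent. Real assistants resolve intent with trained natural-language models rather than keyword matching; this only sketches the idea of varied wording arriving at the same request.

```python
# Toy illustration of intent resolution: several colloquial phrasings map
# to one intent. Keyword matching stands in for the trained language models
# real assistants use.
from typing import Optional


def detect_intent(utterance: str) -> Optional[str]:
    """Map loosely phrased requests to a known intent (keyword toy version)."""
    text = utterance.lower()
    if "lights" in text and any(w in text for w in ("dim", "lower", "soften")):
        return "dim_lights"
    if "presentation" in text and any(w in text for w in ("start", "bring up", "kick off")):
        return "start_presentation"
    return None


for phrase in ("FIERA, dim the lights, please",
               "Could you soften the lights a bit?",
               "Let's kick off my presentation"):
    print(phrase, "->", detect_intent(phrase))
```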
Effectively, self-regulating rooms will become assistants that record, log, and analyze all your data; manage, interact with, and communicate with the people in the room; and manage and coordinate all the technology in the room, and do it in a way that’s unobtrusive, inviting, always on, and omnipresent. It’ll be easier than pulling out your phone and asking Siri or Alexa a question – you just ask the room. It already knows the context. It understands what you’re trying to do.
Creating a more intelligent workspace will require a shift in how we think about designing buildings. It will require designers to think not only about the ease of use of a room and how a person interfaces with a space, but also about how the space interfaces back. How do we, as designers ourselves, create a personality that makes rooms more inviting and improves productivity within a space? How do we leverage discrete technologies, products, and features to serve unified goals? It will involve a strategy for integrating AV, network, and control systems and merging them into the equivalent of FIERA.