Intentional Spaces

Degree Project 2024

Intentional Spaces digs into what physical AI as a material means for the world of interaction design. It explores the possibilities and opportunities that arise when a space understands, reasons, and interacts with the people and things inside of it, and how our experiences with our environment, and our relationship with technology, transform through physical AI. Intentional Spaces is a collaboration with Archetype AI, a San Francisco-based startup born out of Google's ATAP division (Project Soli). Archetype AI leads the way in developing a Large Behavior Model, which makes sense of the physical world around us by fusing sensor data with natural language. The series of interventions presents a glimpse into the landscape of interactions enabled by physical AI: what it could be like to co-exist with an autonomous, social, reactive, and proactive entity that interacts with us in the physical world.

Project Information

Even though most of our lives happen in the physical world, most applications of AI are digital. We have to explain our intention and context to chatbots before they can support us with what is happening in the world around us.

Archetype AI, the collaboration partner of this thesis, is developing a Large Behavior Model (LBM) that perceives and reasons about the physical world in real time by fusing multimodal sensor data with natural language. This thesis runs alongside the technical development of the LBM, exploring the landscape of interactions enabled when AI understands the world around us.
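
To make the core idea concrete: one naive way to ground a language model in the physical world is to serialize raw sensor readings into a natural-language prompt. The sketch below only illustrates that general pattern, with hypothetical sensor names; Archetype AI's LBM fuses modalities natively rather than through text.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    sensor: str     # hypothetical sensor name, e.g. "radar_presence"
    value: float
    unit: str
    timestamp: str  # ISO 8601

def readings_to_prompt(readings: list[SensorReading], question: str) -> str:
    """Serialize raw sensor data into a natural-language prompt.

    A language model given this prompt can answer questions about the
    space; a real LBM fuses the modalities natively instead of via text.
    """
    lines = [f"- {r.timestamp} {r.sensor}: {r.value} {r.unit}" for r in readings]
    return (
        "You observe a physical space through these sensor readings:\n"
        + "\n".join(lines)
        + f"\n\nQuestion: {question}"
    )

if __name__ == "__main__":
    now = datetime.now(timezone.utc).isoformat()
    readings = [
        SensorReading("radar_presence", 1.0, "person(s)", now),
        SensorReading("ambient_temperature", 27.5, "degC", now),
    ]
    print(readings_to_prompt(readings, "Is anyone in the room, and do they seem comfortable?"))
```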

A series of interventions with physical AI as a design material is presented to illustrate what our relationship with such intelligent systems could look like in the future. The research also highlights the evolving role of designers, which shifts from crafting explicit interactions to shaping the behaviour of intelligent, context-aware environments.

Lastly, the research raises questions worth asking as AI, and its role in our lives, evolves.

Kay van den Aker

Master's Programme in Interaction Design
Portrait of Kay van den Aker.

LLM making sense of raw sensor data.

Feedback through haptics: incorrect is a quick double pulse, correct is a slow single pulse.

Understand the patient together with AI. Stay in the moment with the other person while listening to the heartbeat and getting insights from the AI.

AI-powered depth camera. Switch between color and depth views, and get insights from the AI about the main object in the viewport.

What if you could sketch together with AI in the physical world, using traditional tools with an AI layer projected on top?

An LLM deciding which interaction is suitable for the context and the prompt. It can communicate with the person through the desk lamp, the fan, the waving arm, and/or the speaker.

What if the space understood your intention and could let you feel the relevant information of an object when you interact with it, like a sixth sense?

The doctor examines the patient's knee; the AI comprehends this intricate context and generates insights on demand based on all available data, for example medical records, history, and scans.

What if AI understands the world around us?