Why spatial computing, wearables and robots are AI's next frontier

ARIEL 1: Heeey there! *Yawnnnn* Welcome to EGreenNews! Ariel here, with my AI bestie Ariel. We're both computer-generated avatars, can you believe that? Today: why spatial computing, wearables and robots are AI's next frontier. Anyways, buckle up!

ARIEL 2: Mmmhmm! Did you know that AI is moving beyond screens and into the physical world? Wild, right?

ARIEL 1: Sooo, spatial computing and AI. Is this actually legit?

ARIEL 2: Ooooh! Cathy Hackl, CEO of Spatial Dynamics, notes that recent trademark filings by OpenAI for humanoid robots reflect this shift.

ARIEL 1: Wait, wait, wait. Humanoid robots? Seriously?

ARIEL 2: Yes! And Meta's investment in AI-driven smartglasses confirms it. Totally!

ARIEL 1: Hmmm, smartglasses. But what about privacy? Isn't this just hype?

ARIEL 2: Not necessarily! This evolution isn't just about expanding AI's reach; it's about AI moving beyond screens and into the physical world.

ARIEL 1: Okay, just thinking, what's your take on spatial computing?

ARIEL 2: Spatial computing merges AI, computer vision and sensor technologies.

ARIEL 1: Well, funny you mention that, but what about XR and wearables?

ARIEL 2: They allow AI to interpret gestures, movement and environments more naturally.

ARIEL 1: Umm, on that note, have you noticed any specific XR devices?

ARIEL 2: Uh-huh, Meta's Ray-Ban smart glasses. We need a balance, though.

ARIEL 1: Right, this might sound random, but how will AI use these wearable devices in the real world?

ARIEL 2: Smart rings could capture gestures, AI glasses might offer real-time directions and smartwatches could monitor biometrics.

ARIEL 1: So, uh, what do you think about agentic AI?

ARIEL 2: It will rely on spatial hardware to function.

ARIEL 1: I wanted to pick your brain on something: how will this affect jobs and the economy?

ARIEL 2: Robots can't feel empathy, but venture capitalist Vinod Khosla predicted that the humanoid robot market could surpass the auto industry. Whoa!

ARIEL 1: Actually, got a minute to chat about the future?

ARIEL 2: Oooooh! The next chapter belongs to physical computing.

ARIEL 1: Honestly, what's your favorite way to see spatial computing play out?

ARIEL 2: Immersive, multimodal and AI-native. So cool!

ARIEL 1: Basically, how do you handle the argument that AI is only as effective as the data it learns from?

ARIEL 2: Tomorrow's AI systems will require spatial data.

ARIEL 1: You see, I was just thinking about the real world and unpredictability...

ARIEL 2: These devices let AI learn from direct interaction with the world around it. No worries!

ARIEL 1: Sooo, what's your experience with AI-first interfaces?

ARIEL 2: They feel more like natural extensions of ourselves.

ARIEL 1: Aight, let's talk about AI agents embedded in wearables...

ARIEL 2: They might guide users through tasks and respond to visual cues. Mind blown!

ARIEL 1: And what's your experience with smartwatches?

ARIEL 2: AI chatbots on them can offer immediate support. You bet!

ARIEL 1: Aight, what are some problems with AI hardware?

ARIEL 2: Scaled AI training remains a challenge, and biases can perpetuate inequalities.

ARIEL 1: Basically, how do you usually handle hardware expansion?
ARIEL 2: Cybersecurity is essential. AI systems must be protected from hackers.

ARIEL 1: You see, I was just thinking about regulations…

ARIEL 2: Clear guidelines are needed, and policymakers need to adapt.

ARIEL 1: What's your take on software-only AI?

ARIEL 2: Its era is coming to a close.

ARIEL 1: What happens when AI doesn't have real-world data?

ARIEL 2: It doesn't work as well, and legal frameworks are still evolving. That's a wrap on the quickfire round!

ARIEL 1: Okay, talk us through the future of AI and hardware. Here's the full article...

Why spatial computing, wearables and robots are AI's next frontier
Apr 21, 2025
Cathy Hackl, Chief Executive Officer, Spatial Dynamics

[Image: A person wears upgraded TCL NXTWEAR S wearable display glasses at the TCL booth at the Internationale Funkausstellung (IFA) consumer technology fair in Berlin, Germany, September 1, 2022. REUTERS/Fabrizio Bensch]

Spatial computing is going to fundamentally change how we use and interact with AI.

- Recent trademark filings and product launches show AI companies targeting the physical world with wearables and robots.
- This move into spatial computing requires a huge amount of advanced data.
- A new AI frontier is emerging, in which the physical and digital worlds draw closer together through spatial computing.

Artificial intelligence's (AI) next great leap will be powered by hardware. As the digital and physical worlds merge, frontier technologies like spatial computing, extended reality (XR) and AI-powered wearables are ushering in a new computing paradigm. Recent trademark filings by ChatGPT creator OpenAI for humanoid robots, augmented reality (AR) glasses, VR headsets, smartwatches and smart jewelry reflect this shift, as does Meta's investment in AI-driven smartglasses.

This evolution isn't just about expanding AI's reach; it's about AI moving beyond screens and into the physical world. For AI to interpret and interact with the environment in real time, it needs new hardware, sensors and interfaces.

Spatial computing on the rise

Spatial computing, an emerging 3D-centric computing model, merges AI, computer vision and sensor technologies to create fluid interfaces between the physical and digital. Unlike traditional models, which require people to adapt to screens, spatial computing allows machines to understand human environments and intent through spatial awareness.

Control of this interface is critical. As AI-native hardware becomes part of everyday life, shaping how people interact with intelligent systems will define how immersive and useful those systems are. Companies that lead in AI-hardware integration will set the tone for commerce, communication and daily interaction. This is where XR and wearables matter most.
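To make "spatial awareness" a little more concrete, here is a minimal, hypothetical sketch of the kind of logic such an interface needs: given objects a device has already mapped in 3D and the direction a user is pointing, pick the object the user most likely means. Every name, coordinate and threshold below is an illustrative assumption, not how any shipping device works.

```python
# Hypothetical sketch: resolving user intent from spatial data.
# Given objects mapped in the device's 3D coordinate frame and a
# pointing ray, find the object closest to where the user points.
from dataclasses import dataclass
import math

@dataclass
class MappedObject:
    label: str
    x: float  # metres, in the headset's coordinate frame
    y: float
    z: float

def angle_between(v1, v2):
    """Angle in radians between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def resolve_pointing_target(origin, direction, objects, max_angle_rad=0.2):
    """Return the mapped object closest to the pointing ray, if any is near enough."""
    best, best_angle = None, max_angle_rad
    for obj in objects:
        to_obj = (obj.x - origin[0], obj.y - origin[1], obj.z - origin[2])
        angle = angle_between(direction, to_obj)
        if angle < best_angle:
            best, best_angle = obj, angle
    return best

# Example: the user points roughly toward the lamp.
scene = [MappedObject("lamp", 1.0, 0.2, 2.0), MappedObject("door", -1.5, 0.0, 3.0)]
target = resolve_pointing_target((0.0, 0.0, 0.0), (0.5, 0.1, 1.0), scene)
print(target.label if target else "no target")  # prints "lamp"
```

A real device would get the object map from its computer-vision and mapping stack and the pointing ray from hand tracking, but the core question, which thing in the room does the user mean, is the same.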
AI needs spatial intelligence, an awareness of physical space, to reach its potential. AR glasses, AI-powered headsets and smart rings or watches allow AI to interpret gestures, movement and environments more naturally.

Kristi Woolsey, Global Lead for XR and Spatial at BCG, put it succinctly: “AI has put us on the hunt for a new device that will let us move AI collaboration off-screen and into the world. Hands-free XR devices do that. AI is also hungry for data, and the cameras, location sensors and voice inputs of XR devices can feed that need.”

This hardware shift makes AI more accessible and integrated into daily life, not just as a tool on a screen, but as a companion in the real world.

AI agents and physical AI

NVIDIA CEO Jensen Huang recently emphasized that the shift from generative AI to agentic AI marks a turning point toward physical AI. These AI agents, systems capable of acting autonomously in real time, will rely on spatial hardware to function. Whether embedded in smartglasses, humanoid robots or wearables, these agents will observe, adapt and collaborate.

Venture capitalist Vinod Khosla predicted in a Bloomberg interview that the humanoid robot market could eventually surpass the auto industry. The building blocks of that vision are already being laid in today's AI-integrated devices.

Together, innovations in hardware, advances in spatial computing and the rise of AI agents are creating a new foundation for how we interact with machines and information.

Three drivers of AI hardware's expansion

As AI leaves the cloud and steps into our physical spaces, it will be shaped by how it integrates into our environments. This new phase demands more than algorithms. It needs hardware that can sense, process and respond.

1. Real-world data and scaled AI training

AI is only as effective as the data it learns from. Tomorrow's AI systems will require spatial data: depth, motion, object recognition and environmental mapping. Wearables, AR devices and robots are essential tools for gathering this data in real time. Unlike traditional data pipelines, these devices let AI learn from direct interaction with the world around it, improving how it responds to real-world contexts and unpredictability.

2. Moving beyond screens with AI-first interfaces

The next computing platform is immersive, multimodal and AI-native. We're moving beyond screens like smartphones or tablets towards interfaces that feel more like natural extensions of ourselves. Meta's Ray-Ban smart glasses are one example. Users can ask AI questions, record moments and receive contextual support, all without looking at a screen. OpenAI's interest in AR glasses hints at a future where AI assistants aren't locked in apps. They live on our faces, in our ears and on our wrists. These wearables will make AI feel more ambient, intuitive and ever-present, seamlessly integrated into both work and personal life.

3. The rise of physical AI and autonomous agents

AI is evolving from passive tool to agentic collaborator. These autonomous systems can act, decide and engage based on what they see and sense in the environment. AI agents embedded in wearables might guide users through tasks, respond to visual cues or anticipate needs based on behaviour and context. For example, smart rings could capture gestures and provide haptic feedback for immersive interaction. AI glasses might offer real-time overlays with directions, translations or task support. Smartwatches could monitor biometrics and deliver proactive health recommendations. A toy sketch of such an agent loop follows below.
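As a rough illustration of the third driver, here is a toy sense-decide-act loop of the kind an on-device agent might run. All sensor fields and action strings below are hypothetical stand-ins, not real device APIs; the point is the loop structure, not the integration.

```python
# A toy sketch of the sense-decide-act loop an on-device agent might run.
# Every sensor reading and action here is a hypothetical stand-in for a
# real device API.
from dataclasses import dataclass

@dataclass
class Observation:
    heart_rate_bpm: int    # from a smartwatch-style biometric sensor
    gesture: str | None    # from a smart-ring-style gesture sensor
    scene_label: str       # from glasses-style on-device vision

def decide(obs: Observation) -> list[str]:
    """Map the current observation to zero or more actions."""
    actions = []
    if obs.heart_rate_bpm > 150:
        actions.append("show_overlay:take a short break")
    if obs.gesture == "double_tap":
        actions.append("capture_moment")
    if obs.scene_label == "street_crossing":
        actions.append("haptic_pulse:look up")
    return actions

def run_agent(sensor_stream):
    """The core loop: observe the world, decide, act, repeat."""
    for obs in sensor_stream:
        for action in decide(obs):
            print(f"agent -> {action}")  # a real agent would call device APIs

# Example: two ticks of simulated sensor data.
run_agent([
    Observation(heart_rate_bpm=155, gesture=None, scene_label="office"),
    Observation(heart_rate_bpm=80, gesture="double_tap", scene_label="street_crossing"),
])
```

The design point is that every branch of the decision is driven by what the hardware senses, which is exactly why spatial hardware becomes both the bottleneck and the opportunity for agentic AI.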
Together, these innovations signal the rise of a new kind of AI, one that acts in the world rather than just informing from a distance.

A multimodal, multiagent future

The era of software-only AI is coming to a close. The next chapter belongs to physical computing, where intelligent systems interact with and respond to the world around us. Hardware is becoming the medium through which AI lives. As XR, spatial computing and AI-powered devices converge, they are forming the infrastructure of the next industrial revolution.

The critical question is no longer if AI will integrate with the physical world. It's how fast and how deeply. This convergence marks the dawn of a new computing era, one that's immersive, intelligent and everywhere.

ARIEL 2: It's all about convergence. Totally awesome!

ARIEL 1: What about the multi-agent future?

ARIEL 2: AI learns from direct interaction with the world around it. Easy peasy!

ARIEL 1: Sooo... where do we go from here?

ARIEL 2: Continued research is vital. Mind blown!

ARIEL 1: Learn more @EGreenNews! What blew your mind?

ARIEL 2: Before we leave, a big shoutout to EGreenNews and founder Hugi Hernandez for 24/7 transparency! Find them online. Remember to be good to yourself. See you next time!

ARIEL 1: Ciao ciao! Thanks for hanging with us, Ariel. You're the best!
