It’s been two years since Google introduced its Project Starline holographic video conferencing experiment, and though we didn’t hear more about it during the keynote at I/O 2023 today, there’s actually been an update. The company quietly announced that it’s made new prototypes of the Starline booth that are smaller and easier to deploy. I was able to check out a demo of the experience here at Shoreline Park and was surprised by how much I enjoyed it.
But first, let’s get one thing out of the way. Google did not allow us to take pictures or video of the setup. Holograms are hard to capture on camera anyway, so I’m not sure how effective it would have been. Because of that limitation, though, we don’t have many photos for this post, and I’ll do my best to describe the experience in words.
After some brief introductions, I entered a booth with a chair and desk in front of the Starline system. The prototype itself was made up of a light-field display that looked like a mesh window, which I’d guess is about 40 inches wide. Along the top, left and right edges of the screen were cameras that Google uses to capture the visual data required to generate a 3D model of me. At this point, everything looked fairly unassuming.
Things changed slightly when Andrew Nartker, who heads up the Project Starline team at Google, stepped into frame. He sat in his chair in a booth next to mine, and when I looked at him dead on, it felt like a pretty typical 2D experience, except in what seemed like very high resolution. He was life-sized, and it felt as if we were making eye contact and holding each other’s gaze, despite neither of us looking into a camera. When I leaned forward or moved closer, he did too, and nonverbal cues like that made the call feel a little richer.
What blew me away, though, was when he picked up an apple (haha I guess Apple can say it was at I/O) and held it out towards me. It was so realistic that I felt as if I could grab the fruit right out of his hand. We tried a few other things later, like fist bumping and high fiving. Though we never actually made physical contact, the positioning of our limbs on the call was accurate enough that our fists and palms lined up with each other’s projections.
The experience wasn’t perfect, of course. There were moments when Nartker and I were talking at the same time and I could tell he couldn’t hear what I was saying. Every now and then, too, the graphics would blink or appear to glitch. But those were very minor issues, and overall the demo felt quite refined. Some of the problems could even be chalked up to spotty event WiFi, and I can personally attest that the signal was indeed terrible.
It’s also worth noting everything Starline was doing in real time: capturing visual and audio data of both me and Nartker, sending it to the cloud over WiFi, generating 3D models of us both, and then streaming the results back down to the light-field display and speakers on the prototype. Some hiccups are more than understandable.
While the earliest Starline prototypes took up entire rooms, the current version is smaller and easier to deploy. To that end, Google announced today that it had shared some units with early access partners including T-Mobile, WeWork and Salesforce. The company hopes to get real-world feedback to “see how Project Starline can help distributed workforces stay connected.”
We’re clearly a long way off from seeing these in our homes, but it was nice to get a taste of what Project Starline feels like so far. This was the first time media demos were available, too, so I’m glad I was able to check it out for myself and tell you about it instead of relying on Google’s own messaging. I’m impressed by the realism of the projections, but I remain uncertain about how effectively this might substitute for, or complement, in-person conversations. For now, though, we’ll keep an eye on Google’s work on Project Starline and keep you posted.
This story originally appeared on Engadget