sitkack
2 days ago
I think what's not clear from your project description is that your AR glasses don't natively provide an environment-locked virtual screen, meaning a screen that stays fixed in space while your head moves. The glasses' native display is fixed relative to the glasses themselves, like having a screen locked to your face.
Some AR glasses do provide this feature, so your project works around a product flaw in AR glasses and will at some point no longer be necessary.
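For readers unfamiliar with the distinction: environment locking essentially means re-rendering the virtual screen at the inverse of the current head pose every frame, so head motion cancels out. A rough Python sketch of the idea (not the project's actual code; read_imu_quat() and draw_screen_at() are made-up placeholders):

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    # Anchor the virtual screen 2 m in front of the starting head position (world frame).
    SCREEN_ANCHOR_WORLD = np.array([0.0, 0.0, -2.0])

    def screen_position_in_view(head_quat_xyzw):
        """Where to draw the screen in the glasses' own frame.

        head_quat_xyzw: head orientation in the world frame, e.g. from the IMU.
        Applying the inverse head rotation to the world-fixed anchor makes the
        screen appear to stay put in the room while the head turns.
        """
        head = R.from_quat(head_quat_xyzw)
        return head.inv().apply(SCREEN_ANCHOR_WORLD)

    # Per frame (hypothetical sensor/render calls):
    #   pos = screen_position_in_view(read_imu_quat())
    #   draw_screen_at(pos)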
Your project could also allow for other features that the AR glasses manufacturer didn't think of, or has gated behind upgrades and product segmentation.
Is my assessment correct?
msgodel
2 days ago
I use AR glasses heavily for work and other things. I'm actually typing this on a pair right now. I've never understood what the application for environment locked screens is other than novelty/marketing. My glasses provide enough sensor data to implement this but I just can't be bothered.
bodge5000
2 days ago
I guess it makes a lot more sense if you're emulating more than one monitor; without environment locking, you'd only ever be able to see the edges of the monitor(s) beside the one you're looking at.
But yeah, for a single monitor, I guess it takes a bit of getting used to, but non-locked seems far more reliable.
veilrap
2 days ago
I'm curious how/what you use the screens via AR for if you're not using environment locked screens. Particularly in a productivity/work environment.
Unless I'm misunderstanding the feature, it seems like environment-locked screens allow for more natural usage and interactions with the screens in the virtual space?
My experience with VR/AR products like Oculus has been mostly with environment-locked AR information.
msgodel
2 days ago
I suppose they could? I prefer having my posture decoupled from what I'm looking at though.
It's like having a very nice monitor that uses ~1 watt of power and happens to be positioned exactly wherever is most comfortable, without even having to think about it. It's way better than a normal monitor if you don't have to do e.g. pair programming.
stavros
2 days ago
How are you finding the focus? I use the Xreal Air 2, but the edges are blurry, and I can't get the glasses close enough to my face to see the entire screen in focus, even if the top of the glasses is touching my forehead.
derefr
2 days ago
I think the use-case for these is more VR focused, with the AR just being a "being able to notice when something needs your attention" feature (where you would respond to such an interrupt by taking the glasses off, not by trying to look at the interrupting thing through the glasses.)
I've heard people propose that these "screen in glasses" devices (like the Xreal Air) are useful for situations where you want a lot of visual real-estate but don't have the physical room for it — like in a dorm room, or on a plane. (Or at a library/coffee shop if you're not afraid of looking weird.)
---
Tangent: this use-case could likely just as well be solved today with zero-passthrough pure-VR glasses, with a small, low-quality outward-facing camera+microphone on the front, connected only to an internal background AI model† running on its own core, that monitors your surroundings in order to nudge you within the VR view if something "interesting" happens in the real world. That'd be both a fair bit simpler/cheaper to implement than camera-based synced-reality AR, and higher-fidelity for the screen than passthrough-based AR.
† Which wouldn't even need to be a novel model — you could use the same one that cloud-recording security cameras use in the cloud to decide which footage is interesting enough to clip/preserve/remote-notify you about as an "event".
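As a toy illustration of what that watcher loop might look like (nothing fancy; plain frame differencing, with a made-up nudge_user() hook standing in for the headset's notification UI):

    import cv2
    import numpy as np

    def nudge_user(score):
        # Hypothetical hook into the headset UI; here it just prints.
        print(f"something changed outside (score={score:.1f})")

    cap = cv2.VideoCapture(0)  # the small outward-facing camera
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        if prev is not None:
            score = float(np.mean(cv2.absdiff(gray, prev)))
            if score > 8.0:  # arbitrary threshold; a real model would be smarter
                nudge_user(score)
        prev = gray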
underlipton
2 days ago
The most obvious advantage is being able to "zoom in" on the screen by moving closer to it (as with a real monitor), which is impossible with 3DOF or view-locked XR.
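For what it's worth, the math behind that "lean in to zoom" effect is the same as for a physical monitor: apparent angular size grows as distance shrinks. A quick check in Python (screen width and distances are made-up example numbers):

    import math

    def angular_width_deg(screen_width_m, distance_m):
        # Apparent angular width of a flat screen seen head-on.
        return math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))

    print(angular_width_deg(1.2, 2.0))  # ~33 degrees at 2 m
    print(angular_width_deg(1.2, 1.0))  # ~62 degrees after leaning in to 1 m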
blensor
2 days ago
Yes, that's correct, and it could be expanded to other features. One thing I've been mulling over is multi-monitor input, which you don't even get natively on devices that support what I did here.
gsf_emergency
2 days ago
It is indeed hard to assess from the video at 19s? (Relatively impressive through-the-glass perspective nonetheless!)
https://www.youtube.com/watch?v=D6w5kAA22TsP&t=19s
Is it me, or is there some dynamic here that could/should be further exploited to implement what you suggest?
andrewmcwatters
2 days ago
It's also a Viture-specific project, but I suspect if the author wanted, this could expand to other video display glasses in the future and he could simply rename the project.
blensor
2 days ago
Spot on. And yes, expanding it to other glasses (which are popping up quite a lot recently) would for sure be possible.