This thread is for anyone working on personal projects to share their progress, and hold themselves somewhat accountable to a group of peers.
Post your project, your progress from last week, and what you hope to accomplish this week.
If you want to be pinged with a reminder asking about your project, let me know, and I'll harass you each week until you cancel the service.
Been absolutely swamped between work and volunteer work, but on the hobbyist side, I finally got in rev 0.2 of a carrier board for this thing, along with a used pair of these. I was holding out hope for the L9s (release date: soon), but given some of the issues rev 0.1 hit and the amount of other work needed to glue these two things together, it's probably best to try at a smaller scale first anyway.
What's the application?
Augmented reality overlay.
Edit: ah, just saw your other post. That's really neat. I'm still looking for low-profile glasses with a decent HUD; maybe those are worth a try?
It's just at the napkin stage now, and I don't yet have the programming chops to pull it off. But there are some fun ideas happening in this sphere, and while it'll be a long while before I'm even thinking about that level of design, it's a good way to keep up the motivation to learn.
With the caveat that I've only had them in my hands for a day so far, it depends a lot on what tradeoffs you're willing to make and what your use case is.
The One Lites are lightweight and surprisingly bright, but adjustability is mediocre, especially for very large or very small IPDs. The resolution is about the max of what's relevant for the field of view, but the field of view isn't great (~45 degrees diagonal), and the lower framerate compared to the Pros or XReal's newer offerings is noticeable for gaming (and for fixing a virtual object in real space while the user's head is moving). No AR glasses are going to be stylish; the birdbath-style optics and Temu-brand-sunglasses look don't scream 'geek' as much as a Moverio set, but the thick frames still look weird (arguably weirder) indoors, and the newer generations with waveguide optics are thinner and better quality. The diopter settings are nice if you're nearsighted, but they can't handle astigmatism, and you'll still need prescription lenses if you want to see the world too.
That said, it's really hard to beat the price, especially with the very robust used market right now.
They're not a standalone device, so if you want a HUD outside of the office or a commute, they're not easy to use. I don't have a compatible smartphone, and smartphone compatibility is complicated with any of these glasses. Viture sells a neckband-style mini-computer, but it's supposedly pretty lackluster in about every way. Most (Thunderbolt-equipped) laptops work, but if you want to use a desktop computer or a Raspberry Pi it can get more complicated -- the Pro Dock is built largely around handling some goofiness with the Nintendo Switch, but it might be useful for some of those cases and isn't an awful deal.
Dedicated devices like the Even Realities stuff might be better if you want an ultra-simple HUD that connects to your phone, and they're low-profile enough that I could see them sitting in a normal eyeglasses store, though in turn they're supposed to be a nightmare for hobbyists to develop anything serious around, and the screen specs are (intentionally) pretty crap.
Rangefinder glasses?
Kinda. I've done a proof of concept that was just a rangefinder, using a monochrome SPI OLED, some plastic lenses, and a different time-of-flight chip, and that did have longer range than this layout will (although you start running into eye-safety issues trying to exceed 50m). This one's more intended for closer range (the spec sheet says four meters, and that's being generous), but it gives a reasonable depth map across a wide field of view. Assuming I can get the data off the chip anywhere near the right speeds, the next step's going to be trying to turn this into a wire-grid map overlaid on the user's field of view (rough sketch of that step below).
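For the curious, the zone-to-wireframe step is conceptually just this. A rough Python sketch; the 8x8 zone count, the 45-degree field of view, and the fake frame at the bottom are assumptions for illustration, not numbers off the actual spec sheet:

```python
import numpy as np

# Hypothetical 8x8-zone time-of-flight frame, distances in millimetres.
# A real driver would fill this in from the sensor over I2C/SPI.
ZONES = 8
FOV_DEG = 45.0  # assumed total field of view, not a datasheet value

def zones_to_points(depth_mm: np.ndarray) -> np.ndarray:
    """Project an 8x8 grid of range readings into 3D points (metres).

    Each zone is treated as a ray at a fixed angular offset from the
    sensor's optical axis; multiplying by the measured range gives a
    rough point cloud. Good enough for a wireframe, not for metrology.
    """
    half_fov = np.radians(FOV_DEG) / 2
    angles = np.linspace(-half_fov, half_fov, ZONES)  # zone centres across the FoV
    ax, ay = np.meshgrid(angles, angles)              # per-zone azimuth/elevation
    r = depth_mm / 1000.0                             # mm -> m
    x = r * np.tan(ax)
    y = r * np.tan(ay)
    z = r
    return np.stack([x, y, z], axis=-1)               # shape (8, 8, 3)

def wireframe_edges(points: np.ndarray):
    """Yield pairs of 3D points connecting each zone to its right/down neighbour."""
    rows, cols, _ = points.shape
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:
                yield points[i, j], points[i, j + 1]
            if i + 1 < rows:
                yield points[i, j], points[i + 1, j]

# Fake frame: a flat wall ~2 m away with some noise, just to exercise the code.
frame = 2000 + np.random.normal(0, 20, (ZONES, ZONES))
edges = list(wireframe_edges(zones_to_points(frame)))
print(f"{len(edges)} wireframe segments from one frame")
```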
If that overlay works with low enough latency that it doesn't cause an Exorcist revival, the mid-term goal is to use that map to project virtual desktops or graphics onto solid objects, first from a fixed viewer position and then as the user moves.
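For the fixed-viewer case, the plan is roughly "fit a plane to the depth points, then pinhole-project a virtual quad pinned to that plane into display pixels". Something like the sketch below, where the display intrinsics (f_px, cx, cy) and the stand-in point cloud are completely made up and would need per-device calibration:

```python
import numpy as np

def fit_plane(points: np.ndarray):
    """Least-squares plane fit: returns (centroid, unit normal).

    The normal is the singular vector with the smallest singular value
    of the mean-centred cloud.
    """
    pts = points.reshape(-1, 3)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def project_to_display(points_3d: np.ndarray, f_px: float = 600.0,
                       cx: float = 320.0, cy: float = 240.0) -> np.ndarray:
    """Pinhole projection of 3D points (viewer frame) onto display pixels.

    f_px/cx/cy are placeholders for whatever the glasses' virtual screen
    turns out to need.
    """
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    return np.stack([f_px * x / z + cx, f_px * y / z + cy], axis=-1)

# Example: pin a 0.5 m x 0.3 m "desktop" to a fitted plane.
cloud = np.random.normal([0, 0, 2.0], [0.3, 0.3, 0.02], (64, 3))  # stand-in wall at ~2 m
centroid, normal = fit_plane(cloud)
# Build two in-plane axes orthogonal to the plane normal.
u = np.cross(normal, [0, 1, 0]); u /= np.linalg.norm(u)
v = np.cross(normal, u)
corners = np.array([centroid + su * 0.25 * u + sv * 0.15 * v
                    for su in (-1, 1) for sv in (-1, 1)])
print(project_to_display(corners))
```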
Most of the current implementations for that sort of thing either depend on fiducials like AprilTags (or QR codes), and thus visible-light cameras with a wide variety of privacy concerns, or solely handle angular heading. I don't think all of what I want to try will work -- these glasses near-universally give up on pinning virtual items to absolute positions relative to the user for good reason, as anyone who's tried to integrate IMU data into position will tell you (toy example below) -- but there's a bunch of stuff that might work if you're willing to give up the general case.
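The IMU point in a nutshell, with made-up bias and noise figures rather than anything measured: double-integrating even a small, constant accelerometer error gives position drift that grows with the square of time, which is why nobody dead-reckons position from an IMU alone.

```python
import numpy as np

# Toy illustration of why dead-reckoning position from an IMU blows up:
# a stationary sensor with a small residual accel bias drifts as ~0.5*b*t^2.
rng = np.random.default_rng(0)
dt = 0.01                            # 100 Hz sample rate (assumed)
t = np.arange(0, 10, dt)             # ten seconds
bias = 0.02                          # m/s^2 residual bias -- optimistic for a cheap MEMS part
noise = rng.normal(0, 0.05, t.size)  # white accel noise, m/s^2 (made-up figure)

accel = bias + noise                 # true acceleration is zero: the sensor isn't moving
vel = np.cumsum(accel) * dt          # first integration
pos = np.cumsum(vel) * dt            # second integration

print(f"apparent drift after 1 s:  {pos[int(1/dt)]:.3f} m")
print(f"apparent drift after 10 s: {pos[-1]:.3f} m")
# Roughly a centimetre after one second and a metre after ten with these numbers,
# which is why headsets fuse IMU data with camera/depth references instead of
# integrating it on its own.
```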
Cool -- I've got an older-gen ToF plus a little OLED unit and an old (broken) rangefinder box from a press camera (all sitting in a plastic baggie) that I've been meaning to frankenstein together into a focus-assist hybrid display for large-format cameras. Time is short, but I'll get round to it.
Curious how it looks for accuracy once you dig in -- probably less important in your application, but I need <5 cm on the near end, though it gets much less demanding as distances grow.
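For when I do get round to it, the bench check I have in mind is just comparing tape-measured distances against sensor readings with a distance-dependent error budget. All the numbers below are placeholders to exercise the script, not measurements or datasheet figures:

```python
# Rough bench test for the ToF module: measure a few targets with a tape measure,
# record the sensor reading, and compare the error against a tolerance that is
# tight near the camera and loosens with distance.

def tolerance_m(distance_m: float) -> float:
    """Allowed error: 5 cm at or below 1 m, growing to ~5% of distance beyond that."""
    return max(0.05, 0.05 * distance_m)

# (tape-measured distance, sensor reading) pairs in metres -- fill in on the bench.
samples = [
    (0.60, 0.63),
    (1.00, 1.04),
    (2.50, 2.41),
    (4.00, 4.22),
]

for truth, measured in samples:
    err = abs(measured - truth)
    verdict = "PASS" if err <= tolerance_m(truth) else "FAIL"
    print(f"{truth:4.2f} m: error {err*100:4.1f} cm, "
          f"budget {tolerance_m(truth)*100:4.1f} cm -> {verdict}")
```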