Hacker News

Yeah, I can do Python. I haven't used MicroPython, and I haven't worked with FPGAs before, so that's not what I'm trying to do right now. MicroPython sounds more tangible to me.

So I'm going to see about somehow feeding text into the monocle via phone/BT... like an OBS situation. Then you could use your phone's screen as a mouse/input method. I know it seems pointless, just use your phone... but you know... it's cool.

I have to read the manual on this too.
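Most of the phone-side work here is just pushing text over BLE in write-sized pieces. A minimal sketch of the chunking step, leaving the device-specific parts (characteristic UUIDs, the actual send API) out entirely; `chunk_utf8` and the 20-byte payload size are my own assumptions, not anything from the Monocle docs:

```python
def chunk_utf8(text: str, mtu_payload: int = 20) -> list[bytes]:
    """Split text into BLE-write-sized byte chunks without splitting
    a multi-byte UTF-8 character across a chunk boundary."""
    data = text.encode("utf-8")
    chunks = []
    i = 0
    while i < len(data):
        j = min(i + mtu_payload, len(data))
        # Back off while the boundary lands on a UTF-8 continuation
        # byte (0b10xxxxxx), so each chunk decodes cleanly on its own.
        while j < len(data) and data[j] & 0xC0 == 0x80:
            j -= 1
        chunks.append(data[i:j])
        i = j
    return chunks
```

Each chunk would then be written to whatever UART-style RX characteristic the firmware exposes; the receiving MicroPython side just concatenates and renders.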




Nreal Airs, mentioned in another comment thread, seem viable for the dishes problem. Thought I should mention it to you.


I saw those a while back; they looked interesting. On the both-eyes point, I already pre-ordered a Simula VR headset. I know they're not the same thing, but yeah.


I’ve been looking more at the Nreals since I replied, and they’re potentially an out-of-the-box solution for getting text information while doing dishes.

I’m thinking a custom Textual Python terminal app in a transparent terminal emulator on a black background, with a tiling window manager. I have a GPD MicroPC, so it seems I just need the Nreal adapter to connect over HDMI. Then all that’s really missing is some type of input mechanism, maybe via Bluetooth.

The Nreals look potentially good enough to literally walk around in safely, if the Textual TUI is designed in a way that isn’t too distracting.

Edit: found this for an input mechanism.

https://www.tapwithus.com/product/tap-strap-2/

This idea looks fairly realistic to me.
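The "not too distracting" part mostly comes down to keeping the text narrow and short. A rough stdlib-only sketch of shaping a message for a glanceable HUD column; the function name and the width/line-count numbers are invented for illustration, not tuned for the Nreal's actual field of view:

```python
import textwrap

def format_hud(message: str, width: int = 28, max_lines: int = 4) -> str:
    """Wrap a message into a short, narrow block for a heads-up
    display; truncate with an ellipsis if it runs long."""
    lines = textwrap.wrap(message, width=width)
    if len(lines) > max_lines:
        # Keep the first max_lines lines, marking the cut-off.
        lines = lines[: max_lines - 1] + [lines[max_lines - 1][: width - 3] + "..."]
    return "\n".join(lines)
```

In a Textual app this would just feed a `Static` widget on a black background, so the glasses render only the glyphs.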


Wow, that Tap Strap is a cool product.

I was thinking about how to make something like that. I'd be curious about its accuracy; I'll see if there are videos on it.

Hmm, it doesn't work the way I thought it would. I wanted a QWERTY layout.
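Right, it's chorded input rather than key positions: each letter is a combination of fingers tapped at once, which is why a QWERTY layout doesn't map onto it. A toy decoder shows the shape of the idea; this mapping is invented for illustration and is not the real Tap alphabet:

```python
# Each finger is one bit: thumb=1, index=2, middle=4, ring=8, pinky=16.
# Five fingers give 31 possible chords. Mapping below is made up.
CHORD_MAP = {
    0b00001: "a",  # thumb only
    0b00010: "e",  # index only
    0b00100: "i",  # middle only
    0b01000: "o",  # ring only
    0b10000: "u",  # pinky only
    0b00011: "n",  # thumb + index
    0b00110: "t",  # index + middle
}

def decode_chords(chords: list[int]) -> str:
    """Translate a sequence of finger-combination codes into text;
    unknown chords come out as '?'."""
    return "".join(CHORD_MAP.get(c, "?") for c in chords)
```

So learning it is closer to learning stenography chords than hunting for keys, which is the trade-off against a QWERTY-style layout.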



