Show HN: Open-source digital stylus with six degrees of freedom (github.com/jcparkyn)
583 points by jcparkyn 10 months ago | 108 comments



Very cool. They've done what I've daydreamed about, and actually got it to work!

When I played Elite: Dangerous I used a "hands on throttle and stick" (HOTAS) setup, along with foot pedals. I couldn't help but think that there must be a better way to control a spaceship: your ship can pitch, yaw, and roll in addition to being able to fire thrusters in 6 directions.

I wanted a handheld ship model that I could move such that the ship in Elite would move in the same way. The linked project looks like it could do just that. Thrust would be controlled in a similar way, but with my other hand.

Strange or new input models like that are so amazing to me. Our imagination can really fly high with these sorts of capabilities.


One of my fondest memories is when I was 16 or so. I bought a 3D mouse (it’s like a mouse that you can control by waving it around) and taped it onto a hat that said Marines. Then I immediately went into CS: Source and got ready to annihilate everyone.

I think I played about the same, but it would’ve been funnier if I’d gotten creamed.

The theory was that my head is much more precise than my hands, so using a combination of both should improve my aim. It might sound like you’ll end up looking left and trying to see out of the corner of your eye, but in practice the two-mouse system ends up working out. I’ve always wanted to revive the idea for modern times, but you end up looking like a huge dork with a dildo glued to your head. I like that style though.

Your idea sounds way cool. You should make it!


Modern consoles (and modern controllers I assume) can do this "double-input" for aiming, using a joystick and a gyroscope. I'm still bad at shooters, but it feels super good. Plus you don't look like a huge dork with a dildo glued to your head.


> look like a huge dork with a dildo glued to your head.

What do you mean? I do get stares at the grocery store or on the street occasionally but no one ever mentioned anything.


I still hold onto my SpaceOrb 360 for this sort of game. There is a guy that makes a converter to make it work over USB...

https://en.m.wikipedia.org/wiki/SpaceOrb_360

https://www.tindie.com/products/vputz/orbotron-9001-version-...


If you weren’t playing Descent with this you were losing.


I have the slightly newer version, branded by HP as the SpacePilot, and use it daily in Fusion 360 CAD. It has a native USB connection, and 6 user-definable hotkeys below an unimpressive LCD.

It requires some old drivers that aren't officially supported (why would they remove support for perfectly good hardware from the newer drivers? To send good stuff to the landfill, of course!), but when Autodesk tried to move Fusion to the new driver model exclusively, user outcry persuaded them to leave the old drivers in as an option. Apparently there's quite a few of us using those SpacePilots, and the phrase "from my cold, dead fingers" comes up not infrequently.


For sure, the HP one is a rebrand of the 3Dconnexion one, right?

Those are great, but nothing really compares to the original SpaceOrb360 controller form-factor for gaming IMO.


Yup, as far as I can tell the HP one is literally just a sticker and a different splash screen when the LCD powers up, it's otherwise 100% identical to the 3dconnexion original. Just slightly cheaper because of keyword search.

I haven't used the SpaceOrb360 but it looks like it would be a lot of wrist strain to hold it and use it for long periods. But I like the button placement under the right hand... hmm.

With the SpacePilot, I have my right hand on the keyboard or mouse so I can bind whatever buttons I want, and I suppose I could one-hand a regular controller and get A/B/X/Y/stick/triggers in my right hand instead. Hmm. Would that be easier on the wrists than melding the two together? I'm not sure. It would certainly give me more buttons to play with...


I plugged in my old SpacePilot Pro the other day and was so disappointed that the drivers were so outdated, and like nothing was working.


> I couldn't help but think that there must be a better way to control a spaceship

Two sticks?

https://old.reddit.com/r/EliteDangerous/comments/16xi20a/dua...

https://youtu.be/T2-IHgNYaKA


That's really nice. Before now I hadn't really understood how to use 2 sticks, but now it makes sense.

I used a CH Pro Throttle, which has enough hats to sort of mimic that 2-stick action. Main throttle back/forward for the main engine and retros, and a smaller thumb stick/hat for each of the Y and Z axes.

https://www.chproducts.com/Pro-Throttle-v13-d-719.html


That works wonderfully with VKB's STECS, which lets you swap physical detent profiles in roughly 20 seconds. So you have a forward/backward axis on your throttle with a well defined "W" centre point, plus a high quality thumb stick for your lateral/vertical thrusters.

(In case anyone is wondering, I transitioned from the Warthog throttle with DeltaSim's slew mod with zero issues; I find the STECS Standard way more versatile and flexible.)

I couple this with a Gunfighter Ultimate stick (V3), and Crosswinds pedals, all profile mounted. (My GFU grip has twist, but I don't use it.)

When you add TrackIR in my case (or VR for others), there's quite a bit of immersion going on! :-)

I play DCS and Star Citizen these days, also used to fly Elite Dangerous though. Building the combined quad screen workstation / simrig has been a wonderful side project over the past few years. ;)


Yep. I played Descent II this way and it was glorious.


In the sci-fi series The Expanse, the Rocinante is controlled in part by a 3DConnexion SpaceMouse, commonly used in CAD and 3D modeling.


this 6dof "joystick" has been around for decades: https://3dconnexion.com/nl/spacemouse/


And it works surprisingly well for Descent. The original 3dconnexion one worked first try on Linux too.


I've never found an explanation of what exactly is in that.

Is it a track-ball, a rotary encoder, and a 3D stick?


There's a picture on the Wikipedia article: https://en.wikipedia.org/wiki/3Dconnexion and there's a patent you can look at: https://patents.google.com/patent/US20050172711A1/en?oq=2005...

It's flexible, and there are LEDs and linear optical sensors set up with an occluder in between that casts a shadow (or passes light through a slit, according to the patent). The sensors detect where along their length the shadow or light falls, which indicates how much you have translated or rotated the control in a particular axis.
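
For intuition, here's a toy sketch (in Python) of how readings from six such linear sensors could be turned into a 6-DOF output: each axis ends up as a linear combination of all six sensor deflections, with the mixing matrix determined by calibration. The sensor count, layout, and matrix here are illustrative guesses, not the actual device internals.

    # Hypothetical sketch: mapping six linear-sensor shadow positions to a
    # 6-DOF twist. The calibration matrix C is a placeholder; a real device
    # would determine it at the factory.
    import numpy as np

    NUM_SENSORS = 6
    C = np.eye(6)                  # placeholder 6x6 matrix (sensor -> twist)
    rest = np.zeros(NUM_SENSORS)   # shadow positions with the puck at rest

    def read_twist(shadow_positions: np.ndarray) -> np.ndarray:
        """Return (tx, ty, tz, rx, ry, rz) from raw shadow positions."""
        deflection = shadow_positions - rest
        return C @ deflection      # each axis mixes all six sensor readings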


I don't know exactly, but using one has the sensation of using the TrackPoint on a ThinkPad. The puck you push, pull, and twist to make the motion is a bit springy. The harder you push it, the faster the motion happens in the software on screen. When you release, it springs back to neutral. The old ones (late-90s vintage) had trouble coming back to center, and there was a hotkey to re-zero it if it was starting to drift. The newer ones don't seem to have the issue.


It's a knob that springs back to its origin when let go, and that can move ~45 degrees or ~1/4" in each axis (subjectively). It feels a bit like holding one of those old bobbing dashboard toys.

Mechanically, it's something like an Rx/Rz two-axis joystick with a sliding knob on top that can be slid in X/Y, pushed/pulled in Z, and twisted in Ry as well. All axes are spring-centered.


Imagine a 3-axis joystick that can also sense translation. So aside from pitch/roll/yaw you also get the linear up-down/left-right/forward-back axes.


I can see how pitch/roll/yaw would work, but I can't imagine how linear movements are made on the same joystick. Maybe up and down is push and pull, but what about the other directions? Do you push the device itself?


The trick in comparing it to a joystick is that it can distinguish between tilting the control to a side, and pushing the control to the side. Does that help? Similarly, it can detect pushing straight down and pulling straight up.


Looks like the current incarnation is a Stewart-platform-lookalike optical setup for all axes [1]. I misspoke in a different comment: mine was the older SpaceBall 5000 model, and it was more like two joysticks joined at the stem.

1: https://www.fictiv.com/teardowns/spacenavigator-3d-mouse-tea...


Yup. Use one every day for Fusion360 and Blender.


This. Sadly the drivers suck.


There is an open-source driver package for Unix-type systems [1]. I find them to be pretty reliable. The stock Windows drivers do leave something to be desired.

[1] https://spacenav.sourceforge.net/
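
For anyone curious what reading one of these without the vendor driver can look like: they're ordinary HID devices, and the older SpaceNavigator (0x046d:0xc626) is commonly documented (via community reverse-engineering) as sending translation and rotation as two reports of three little-endian int16 axes each. A rough sketch using the hidapi Python bindings; the IDs and report layout are assumptions, and newer 0x256f devices differ:

    # Sketch of reading a SpaceNavigator over raw HID, bypassing the vendor
    # driver. Assumes the `hidapi` pip package and the report layout commonly
    # documented for the older 0x046d:0xc626 device.
    import struct
    import hid

    dev = hid.device()
    dev.open(0x046D, 0xC626)  # assumed vendor/product IDs

    while True:
        data = dev.read(7)  # report ID + three little-endian int16 axes
        if not data:
            continue
        report_id, payload = data[0], bytes(data[1:7])
        x, y, z = struct.unpack("<hhh", payload)
        if report_id == 1:
            print(f"translate: {x} {y} {z}")
        elif report_id == 2:
            print(f"rotate:    {x} {y} {z}")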


I've always found it fascinating that companies that make specialized hardware devices like this lean so hard into proprietary software, as if that's the thing that gives them the edge on the competition.

If anything the hardware is the hard part to copy and open source drivers could do nothing but benefit them and ensure wide platform compatibility and longevity, and yet so many companies like this insist on closing things up and forcing hackers to reverse engineer everything. It's bizarre logic to me.


Would it help with scripting macro software such as AutoHotkey? I imagine it wouldn't pick up some inputs due to the proprietary driver.

I have the same issue with my Logitech G600: AHK can't pick up some inputs. I set up the macros in LGHUB as keystrokes (multiple modifiers with keys) and use AHK to grab those, which expands the capabilities of my G600 beyond what LGHUB offers.


Exactly, the inputs aren’t exposed as extra axes so those tools don’t see them.


Could it help if the driver had those axes assigned to different keys, so AHK could pick them up and remap them as axes? I did that with my XP-PEN AC-19: I assigned the dial wheel to different keys in the XP-PEN software, used AutoHotInterception with AHK to pick those up, and then mapped them to wheel up/down depending on which window is active.


It does not expose it as a key either. You must link against their proprietary driver (and sign some legal document), which then gives you an API you can use to read the position of the spaceball.


Really? That sucks. I guess it would be better to get a custom keypad with rotary encoders that exposes itself properly.

I find it interesting how companies invest heavily in hardware and don't bother to improve the software side.


I always imagined a sphere suspended / held by a minimum number of strings (or rods?) / attachments. By physically pushing, pulling, and twisting the sphere you could detect these movements via compression and tension in the attachments. You could motorize those strings/rods to give resistance and feedback to the piloting.


Yeah that's almost exactly what I was daydreaming of! Weaker gyroscopes might mean a ship can rotate in one direction slower than others, and feedback to a controller could model that for the user.


> I couldn't help but think that there must be a better way to control a spaceship: your ship can pitch, yaw, and roll ... in 6 directions.

15 years later and I'm still convinced that the SpaceOrb 360 is THE GREATEST GAME CONTROLLER OF ALL TIME. See what I did with the all caps there? It's the G.O.A.T. and it's a tragedy that it lost the battle to "WASD" keyboard-and-mouse gaming and the "two thumbs" style of controller.

If you've ever tried a SpaceOrb, you probably already know this. If you didn't live through the SpaceOrb's heyday (around 1994-5 with the original Quake, and Descent, truly one of the greatest games ever) then this post will sound like I'm a crotchety old dude trying desperately to relive the golden hour of my youth. Ok, so you got me. What of it? ;)

https://www.hanselman.com/blog/the-best-controller-for-fps-a...

Degrees of freedom: https://hanselmanblogcontent.azureedge.net/WindowsLiveWriter...

I had purchased a box of these when they were going away, and sponsored a custom driver. For a while there were other solutions available, like OrbShield:

// via SeedStudio: https://www.seeedstudio.com/Orbshield-v1-0-kit-p-671.html?qu...

// via Tindie: https://www.tindie.com/products/vputz/orbotron-9001-version-...


Reading your comment with the linked project in mind, I started thinking about the idea of a "glove holding a sphere" sort of control system: the glove is the substrate for the ArUco codes, while the sphere could contain the accelerometers, and perhaps also gyroscopes to provide inertial feedback? I don't know anywhere near enough about either technology, but I imagine controlling a ship in 3D space would be a constant fight against inertia, and it would be awesome to feel that.


This is why 2 sticks is a fairly common setup for space games.


If you like the puzzle of firing thrusters manually, try "ΔV: Rings of Saturn". The game is 2D, but you get six keys to individually fire different thruster combos, and a separate key to fire the main engine.

Then try staying in control with an unbalanced ship, or after you inevitably smashed one of the thrusters into a rock. It's a pretty space-nerd game too, they simulate details like reaction wheel saturation.


There’s an entire rabbit hole of dual stick setups for space games. Quack HOSAS to get started.


Ok Quack is adorkable.

What would be the word for "To Kagi"? They use "Fetch" in the search field, but it hasn't really stuck with me.


Why mention the provider instead of using the verb you use the provider for? You want to "search" for something.

You wouldn't say I am AT&Ting or AOLing for browsing the internet, or Honda-ing / Ducati-ing for riding a motorcycle.


Sorry I forgot fun died with web 1.0. I’ll try to be more cynical in the future.


The folks at Sublight Dynamics had a really neat product in this vein that I was excited about. Seems like the perfect flightsim controller. They had trouble getting fully to market. Website is still up though so I'm holding out hope!


Given that a spaceship isn't all that different from a fixed-wing aircraft, and the stick + pedals arrangement has become a standard for controlling the latter for over a century, I think it'd be hard to improve on.


The difference is that spaceships have thrusters, so they have the ability to accelerate up, down, left, or right, in addition to all the controls of a fixed-wing aircraft (throttle + stick + pedals).


Do fixed-wing aircraft spend a lot of the time going backwards? Do they, for example, need to turn around and point their engine in the direction of travel in order to slow down? If a fixed-wing accelerates to 1000mph and then rotates its nose by 90 degrees, how fast is it going and in what direction? What about a spacecraft?


> If a fixed-wing accelerates to 1000mph and then rotates its nose by 90 degrees, how fast is it going and in what direction?

I don't know how fast, but depending on which axis it rotates on and how fast, it may be going in several directions at once.


Lol, right?


With no gravity or drag they're way different.


> This project was part of my undergraduate thesis for electrical engineering.

Undergrad! If you didn't get top marks on this, there is no justice.


Thanks! I'll get results in about two weeks (fingers crossed)


Why do you think so? When I was doing undergrad, the emphasis was on scientific work. So novelty or applicability were not important for grades; what mattered was whether your written communication about it was scientific. I didn't see a write-up of this project, so I couldn't judge it grade-wise.


I agree, and this is one of the things that is problematic with academia. Not that scientific rigor / work / dissemination shouldn't be rewarded--it should. But so should work like OP's.


undergrad != academia

Undergrad is trying to learn the necessary basics to be able to get into academia.

Of course undergrads could publish papers already and be part of academia, but in this case I am not sure if there is any novelty or if it's just a nice product idea.


I don’t give a flying fuck about scientific writing style. In fact, I think much of it is counterproductive.

I do give a flying fuck about applicability and novelty, especially for an undergrad.


I scrolled down to the picture first and the desk made me instantly think “ah, a dorm room”


uni marks != quality :-) I agree with you though.


Very cool. The use of a webcam really makes me wonder if there's a future where our regular single ~78° FOV webcams are going to be replaced by dual (stereo) fisheye webcams that can:

- Enable all sorts of new UX interactions (gestures with eye tracking)

- Enable all sorts of new peripheral interactions (stylus like this, but also things like a steering wheel for racing games)

- Enable 3D 180° filming for far more flexible webcam meetings, including VR presence, etc.

The idea of being able to use the entire 3D space in front of your computer display as an input method feels like it's coming, and using a webcam the way OP describes feels like it's a little step in that direction.


I thought this was around the corner years ago when Intel and partners had RealSense modules being built into laptops but it seems like all the players have shifted focus to more enterprise and industrial markets.


Wii Remote (2006), Wiimote Whiteboard (2007), Kinect (2010), Leap Motion (2010; Ultraleap since 2019).

There are infrared depth cameras in various phones and laptop cameras now.

[VR] Motion controllers: https://en.wikipedia.org/wiki/Motion_controller#Gaming

Inertial navigation system: https://en.wikipedia.org/wiki/Inertial_navigation_system

Inertial measurement unit: https://en.wikipedia.org/wiki/Inertial_measurement_unit :

> An inertial measurement unit (IMU) is an electronic device that measures and reports a body's specific force, angular rate, and sometimes the orientation of the body, using a combination of accelerometers, gyroscopes, and sometimes magnetometers. When the magnetometer is included, IMUs are referred to as IMMUs.[1]

Moasure does displacement estimation with inertial measurement (in a mobile app w/ just accelerometer or also compass sensor data?) IIUC: https://www.moasure.com/

/? wireless gesture recognition RSSI: https://scholar.google.com/scholar?q=wireless+gesture+recogn...

/? wireless gesture recognition RSSI site:github.com : https://www.google.com/search?q=wireless+gesture+recognition...

Awesome-WiFi-CSI-Sensing > Indoor Localization: https://github.com/Marsrocky/Awesome-WiFi-CSI-Sensing#indoor...

3D Scanning > Technology, Applications: https://en.wikipedia.org/wiki/3D_scanning#Technology

Are there a limited set of possible-path-corresponding diffraction patterns that NIRS (Near-Infrared Spectroscopy) could sense and process to make e.g. a magic pencil with pressure sensitivity, too?

/q.hnlog "quantum navigation": https://news.ycombinator.com/item?id=36222625#36250019 :

> Quantum navigation maps such signal sources such that inexpensive sensors can achieve something like inertial navigation FWIU?

From https://news.ycombinator.com/context?id=36249897 :

> Can low-cost lasers and Rydberg atoms e.g. Rydberg Technology solve for [space-based] matter-wave interferometry? [...] Does a fishing lure bobber on the water produce gravitational waves as part of the n-body gravitational wave fluid field, and how separable are the source wave components with e.g. Quantum Fourier Transform and/or other methods?

Because the digitizer


> e.g. a magic pencil with pressure sensitivity, too?

Wouldn't such a capability also be useful for surgical AR/AI, robotics, and training?


I'm sadly bearish on this kind of stuff. The Mac Touch Bar showed that even a context-aware, reactive, and fully integrated input struggled to break the flow gained by keyboard+mouse.

Even today some people are still on the fence about the new fangled mouse!


The Mac Touch Bar sucked; that's why it didn't gain traction.


Likely valid, but can you say why it sucked? And why any other input wouldn't struggle with the same faults?


It sucked for the following reasons:

- too easy to accidentally touch while reaching top-row keys.

- cannot be used by feel.

- replaces existing keys, removing functionality.

- turns off sometimes, so we have to tap once to wake it first.

- some common actions that used to be one key are now two taps (volume, brightness).

- does not always register my taps. No unambiguous indication that the tap was recognized.


I'm sure you've heard of TrackIR, which can do head tracking with some head-mounted reflectors. But without mounting anything to your head, the Tobii eye tracker can do both head and eye tracking completely passively. It seems like adding hand/finger gesture controls, and peripherals like a stylus, mouse, etc. would be an obvious next step, but it doesn't look like they or anyone else is doing that.


I have an Anker PowerConf 300 webcam which (on Mac or Windows) can go up to 115 degrees with the AnkerWork software.


Nice. Especially the fusion.

Some random-ish thoughts from exploring "a laptop keyboard... with hand pose, 3D stylus, and touch". Adding buttons yields a 3D mouse - but camera coverage can be a pain. Note the body is largely empty (and battery could be slimmed) - I could more-or-less type while holding a slim chopstick/toothbrush-like stylus (even with the markers, and a weirdly big tip). A big tip (sliced from a ping-pong ball sized xmas decoration) could slide fairly smoothly on a ThinkPad keyboard (and gave room for a less compact force sensor, and an extra tip marker). Thin stranded silicone ribbon cable can be string-like flexible - I just tethered the stylus to an arduino to get started.

Hmm... I wonder what the inertial sensor might make of something like a dimpled metal clicker, as a button (or three)?


The rolling shutter compensation is pretty cool and isn't something I would have thought of. Did you know that would be an issue from the start or notice it only after you built the rest of the system?


I knew it would have an effect (most of the literature for similar projects just uses global shutter cameras for this reason), but wasn't sure how significant it would be. It turned out to be small enough that it usually wasn't super noticeable, but in certain cases it really showed up (e.g., rotating the pen while keeping the tip in one place).

The thing I was most surprised by was how effective my solution was, given that it's a pretty gross approximation of reality. There are lots of much more sophisticated techniques for dealing with it, which I didn't end up needing.

One thing I would've liked to try out is using rolling-shutter-aware PnP [1], which can theoretically estimate pose and velocity simultaneously from one image, by exploiting the rolling shutter distortions.

[1] https://www-sop.inria.fr/members/Philippe.Martinet/publis/20...
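
For anyone wondering what a "gross approximation" like this might look like in practice, here's a hedged sketch (not necessarily OP's exact method): treat each detected 2D point as if it were captured at its row's readout time, and shift it back to a common reference time using an image-space velocity estimate from the previous frame. The per-row readout time here is a made-up constant.

    # First-order rolling-shutter correction sketch: re-time each detected
    # point to the middle row of the frame using its estimated 2D velocity.
    import numpy as np

    ROW_READOUT_S = 30e-6   # assumed per-row sensor readout time
    IMAGE_HEIGHT = 1080

    def compensate(points_2d: np.ndarray, velocity_px_s: np.ndarray) -> np.ndarray:
        """points_2d: (N, 2) pixel coords; velocity_px_s: (N, 2) px/s."""
        dt = (points_2d[:, 1] - IMAGE_HEIGHT / 2) * ROW_READOUT_S  # row delay
        return points_2d - velocity_px_s * dt[:, None]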


I'm a big fan of all things 6DOF! Nice work on the hardware and the computer-vision pose estimation, but I'm almost more impressed by the software surface you're drawing into and able to rotate. That's interesting, and could be used with any tangible user interface control, like a finger slider, for the same effect. Good project for problem-solving skills; looks like you nailed it, bravo!

Btw, the first 6DOF controller I had (other than a hacked WiiMote used as an IR-LED Bluetooth receiver [1]) was the Logitech MX Air, which was ahead of its day [2].

[1] https://web.cs.ucdavis.edu/~okreylos/ResDev/Wiimote/MainPage...

[2] https://www.cnet.com/reviews/logitech-mx-air-review/


1. Very cool project

2. Helpful documentation

3. Nice real world example for the use of a Kalman Filter!
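
On point 3, for readers who haven't met one before, here's a toy 1D constant-velocity Kalman filter in Python, just to show the predict/update shape of the thing. The actual project tracks full 3D pose and fuses IMU data; the noise values here are arbitrary.

    # Minimal 1D Kalman filter: state = (position, velocity), measure position.
    import numpy as np

    dt = 1 / 30                                  # camera frame interval
    F = np.array([[1, dt], [0, 1]])              # constant-velocity transition
    H = np.array([[1.0, 0.0]])                   # we only observe position
    Q = np.diag([1e-4, 1e-2])                    # process noise (assumed)
    R = np.array([[1e-3]])                       # measurement noise (assumed)

    x, P = np.zeros(2), np.eye(2)

    def step(z: float):
        global x, P
        x, P = F @ x, F @ P @ F.T + Q            # predict
        y = z - H @ x                            # innovation
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x, P = x + K @ y, (np.eye(2) - K @ H) @ P  # update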


This is really cool, and right out of my dreams.

Since the move to full remote work at my company, I've been longing for whiteboard sessions. The best I could conjure up was to use an old iPad and load up a shared whiteboarding webapp in a browser on both iPad and my desktop, and then share screen from my desktop, using a cheapo stylus on the iPad to draw.

It's pretty good, but my iPad is super old and slow. Replacing it is not affordable. But a conventional web cam + a relatively low-tech stylus would be much better, and drop the need for an external device completely.


The cheapest Wacom tablet [1] is 67CAD. Other brands have cheaper ones.

[1] https://wacomstore.ca/product/one-by-wacom-small/


Fantastic. I love that this is brand-independent, like buying a mechanical keyboard.


Very cool!

Could be useful for robotics / VR as well. One-camera hand tracking anyone?

Question: could you use gyro+accel to track pressure as well? Or at least "taps"?

Another question: how much does it cost? In particular, the pressure sensor...


1: You could absolutely use gyro/accel for detecting taps, but for proper pressure sensitivity (i.e., changing pressure in the middle of a stroke), there's not much you can do except have a pressure sensor. It's theoretically possible with a sufficiently accurate pose estimate and a springy pen tip, but not feasible at the level of accuracy I got.

2: I paid about $20 AUD for the pressure sensor, but they can be had for quite a bit cheaper (~$5 USD) in the US (https://www.arrow.com/en/products/hsfpar003a/alps-electric). Only problem is they're quite specialized, so not many places sell them. The custom PCB was another $10AUD, and the Arduino was about $20. There's a full parts list at https://github.com/Jcparkyn/dpoint/blob/main/setup-guide.md.
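
On point 1, a tap detector from accelerometer data alone can be quite simple in principle: look for short spikes above a slowly varying baseline in the acceleration magnitude. A rough sketch; the sample rate and threshold are illustrative guesses, not tuned values.

    # Sketch of tap detection from accelerometer samples alone.
    import numpy as np

    SAMPLE_RATE_HZ = 400
    TAP_THRESHOLD = 3.0  # g, above the slowly varying baseline

    def detect_taps(accel: np.ndarray) -> np.ndarray:
        """accel: (N, 3) samples in g. Returns indices of candidate taps."""
        mag = np.linalg.norm(accel, axis=1)
        baseline = np.convolve(mag, np.ones(25) / 25, mode="same")  # ~60 ms mean
        spikes = (mag - baseline) > TAP_THRESHOLD
        # keep only rising edges so one tap isn't counted several times
        return np.flatnonzero(spikes[1:] & ~spikes[:-1]) + 1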


another question: how do you estimate _depth_ - i.e. the dimension _away_ from the camera?

I'm guessing it's the size of the ArUco markers, combined with accelerometer for smoother tracking - but that seems quite imprecise?


That's done by the PnP solver, but yes it's essentially looking at the size of the aruco markers on the screen. This is implicit in the equation the PnP solver is trying to solve: "find the pose (position and orientation) that results in the minimum re-projection error for all the marker corners". Re-projection error is the difference between the observed 2D location of each marker corner, and the theoretical 2D location for a given pose (using standard camera projection equations).

You're right that this is the least precise dimension for PnP, but it turns out to be good enough if the corner positions are decently accurate. Using sub-pixel refinement for the corner locations helps a lot (this is built into OpenCV — it looks at the brightness values for multiple pixels around the corner to get an estimate more accurate than one pixel). Having corner positions further apart in 3D space helps as well, which is part of the reason I used two "rings" of markers.

In my case I was getting somewhere in the ballpark of 1-2mm of depth error at ~3cm away from the camera, so ~0.5% relative error. Smoothing from the accelerometer stops this from being noticeable most of the time, but it will sometimes crop up in bad conditions (e.g. if one of the markers is being missed on some frames due to lighting problems).
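
For reference, the detect-refine-solve pipeline described above looks roughly like this in OpenCV (pre-4.7 ArUco API). The camera intrinsics are placeholders, and look_up_3d_corners is a hypothetical helper returning the known 3D corner positions on the pen for the detected marker IDs.

    import cv2
    import numpy as np

    aruco_dict = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
    camera_matrix = np.array([[800, 0, 640], [0, 800, 360], [0, 0, 1]], dtype=float)
    dist_coeffs = np.zeros(5)  # placeholder: assume no lens distortion

    def estimate_pose(gray):
        corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
        if ids is None:
            return None
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01)
        refined = [
            cv2.cornerSubPix(gray, c.reshape(-1, 1, 2).astype(np.float32),
                             (5, 5), (-1, -1), criteria)
            for c in corners
        ]
        image_points = np.concatenate(refined).reshape(-1, 2)
        # solvePnP finds the pose minimising re-projection error over corners
        ok, rvec, tvec = cv2.solvePnP(look_up_3d_corners(ids), image_points,
                                      camera_matrix, dist_coeffs)
        return (rvec, tvec) if ok else None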


Outside-in tracking with a camera is not something I would have thought of. Seems cool.

Reminds me of how sad I am nobody's done a good job of cheaply cloning the lighthouse tech that valve/htc use.


I should point out that I'm not the first person to use camera tracking for this [1], but to my knowledge there hasn't previously been a serious attempt to combine it with inertial or pressure sensors (which are both necessary for competing with graphics tablets) or make it open-source.

[1] http://media.ee.ntu.edu.tw/research/DodecaPen/


Isn’t camera tracking + inertial sensors exactly how the Oculus Rift headset & controller tracking worked?


Yes (or something like that, I don't know all the details), but they use infra-red light which requires a dedicated camera. I think they might also require more than one camera (although it is possible to do monocular IR tracking).


AFAIK you can actually get the sensors used for tracking at somewhat OK prices. But iirc the sensors actually do some part of the position calculations and as such aren’t simple enough to be really cheap.


Yep, they're actually just tiny solar panels! I got several trying to figure out how to use the lighthouse system. Not enough hours in the day and at the time there wasn't enough compute in a small package cheap enough for me.


Very cool! It has the added benefit of being able to manipulate objects in 3d. How does it compare to graphic tablets in terms of accuracy?


Currently it's not quite at the level of graphics tablets, but it's not too far off, and I think there's quite a bit of potential to improve it using similar techniques [1].

In terms of absolute accuracy, I measured an average error of 0.89mm (for the position of the tip) across the entire area of an A4 page with the camera in one place. In practice you have more precision than that though, because most of the errors are constant biases (not random noise).

For example, here's one of the tests [2] I did for the thesis, which compares the recorded stroke to what I actually wrote (scanned from carbon paper). After aligning the two captures (as a global 2D position offset, everything else is retained), the average distance from the recorded stroke to the scan was 0.158mm.

[1] This paper (which I linked in another comment) uses some more advanced techniques for pose estimation which could definitely be applied here (but it's closed-source, and I didn't have time to re-implement it from scratch): http://media.ee.ntu.edu.tw/research/DodecaPen/

[2] https://github.com/Jcparkyn/dpoint/files/13329235/main-sketc...
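
For the curious, the alignment-then-distance evaluation described above can be reproduced in a few lines; this sketch uses the centroid difference as the global 2D offset and scipy for nearest-neighbour lookup (OP's actual evaluation code may differ).

    # Align a recorded stroke to a scanned ground truth with one global 2D
    # offset, then report the mean nearest-neighbour distance.
    import numpy as np
    from scipy.spatial import cKDTree

    def stroke_error_mm(recorded: np.ndarray, scanned: np.ndarray) -> float:
        """recorded: (N, 2), scanned: (M, 2), both in mm."""
        aligned = recorded - recorded.mean(axis=0) + scanned.mean(axis=0)
        distances, _ = cKDTree(scanned).query(aligned)
        return float(distances.mean())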


One thing with using graphic tablets for handwriting and drawing in practice is you really want your stylus to be thin, lightweight (15-20g), and have either neutral center of gravity or balanced towards the tip slightly. Also preferably battery-free (non-Wacom tablets were forced to use batteries due to patents, not anymore).

Until that's solved, the better practical use for this is probably as a controller for VR or 3D software.


Brings back joyful memories of me begging random Professors from around the world over email for the source code of their papers.


Nice. It seems there is a systematic error towards the edges of the paper, which can easily be fixed in software.

Though it would be nice to do more tests. For example, there is some error when drawing sharp corners. It would be instructive to see how that changes as you change the angle, etc. And how that changes based on location on the paper.


Add a laser range finder on the camera and retroreflector on the pen and we have a cheap implementation of a 3D measuring device.


Impressive work! I wonder: is the inertial tracking sufficient to cover for occasional occlusion of the markers?


That depends what you consider "occasional", but for most definitions the answer would be no. I'm sure there are better ways to handle the inertial measurements than what I did, but even so, double integration drift causes errors very very quickly. I doubt it'd be possible to go over a second, and my implementation struggled with anything over ~150-200ms.

However, it does an impressively good job with low camera frame rates. I tested it at 10FPS (keeping only every third frame), and the results are barely distinguishable from 30FPS. Below that it starts having difficulty.
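
A quick back-of-the-envelope on why the drift window is so short: position error from a constant accelerometer bias grows quadratically, error(t) = 0.5 * bias * t^2. Assuming an (optimistic) 0.01 m/s^2 residual bias:

    # Double-integration drift from a constant accelerometer bias.
    bias = 0.01  # m/s^2, assumed residual bias after calibration
    for t in (0.2, 1.0):
        print(f"t={t:.1f}s: {0.5 * bias * t**2 * 1000:.2f} mm")
    # t=0.2s: 0.20 mm  (tolerable, matching the ~200 ms figure above)
    # t=1.0s: 5.00 mm  (already far beyond stylus-level accuracy)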


TFA mentions a gyroscope, but I don't see one in the assembly diagram.


I believe the PCB has the IMU built into it along with the BT module. https://wiki.seeedstudio.com/XIAO_BLE/


Wow, very impressive project! I really love it, I will definitely try to build one.


Too bad this wasn't made as an OpenXR API layer so that it could be used with existing software.


The great thing is that it's open source, so you can add whatever niche APIs you want to it!


Why would I be the one to do that? I already have a pressure sensitive 6dof stylus that is compatible with openxr.

OpenXR isn't a niche API when you are dealing with 6dof input devices.


I know this might be shocking, but OP didn't make this for you. There are actually all sorts of people out there making all sorts of things without considering your specific needs.


Please don't feed the troll. They're being deliberately argumentative.


Why are you using that tone? I never had the intention to use this. I am just stating that there is a missed opportunity that exists. Why are you implying that I want this as feature for me to use?


I'll chime in and say that OpenXR integration is a good suggestion, just not something I had the time/effort to implement (this is a university project, after all, and was focused more on the core tracking tech). I think people just took issue with the way you phrased your original comment.


> Why would I be the one to do that?

Because you are the one who is complaining.


Please don't feed the troll. They're picking fights on purpose.


It's open source so...



