
I installed encoder knobs to emulate potentiometers

There's really no need to force this decision to go in either direction. Ever since I first heard of capacitive multitouch, going back to the timeframe before the first iPhone announcement, I've been waiting for someone to build 'stick-on' encoder knobs that the touchscreen controller can read.

These would simply take the form of a knob with a metal leaf or other polygonal electrode in its base, whose rotation could be sensed by code similar to that used to implement crappy 'virtual potentiometers' on existing touchscreens. The fixed part of the knob base would be epoxied or otherwise bonded directly to the screen surface, or perhaps held in place with some sort of frame.
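The sensing side of this idea is simple enough to sketch. Assuming the touchscreen reports the electrode's contact point as an ordinary touch, the knob's rotation falls out of the angle of that point around the knob's (known, fixed) center. A rough Python illustration, with made-up coordinates:

```python
import math

def knob_angle(touch_x, touch_y, center_x, center_y):
    """Angle of the electrode's contact point around the knob center, in radians."""
    return math.atan2(touch_y - center_y, touch_x - center_x)

def delta_rotation(prev_angle, new_angle):
    """Smallest signed change between two angles, handling wrap-around at +/-pi."""
    d = new_angle - prev_angle
    return (d + math.pi) % (2 * math.pi) - math.pi

# Example: the contact point moves a quarter turn around a knob centered at (100, 100)
a0 = knob_angle(110, 100, 100, 100)   # 0 rad
a1 = knob_angle(100, 110, 100, 100)   # pi/2 rad
print(delta_rotation(a0, a1))         # ~1.5708 (pi/2)
```

Accumulating `delta_rotation` across touch reports gives you an unbounded virtual potentiometer, which is exactly what the crappy on-screen versions already do, just without the physical knob.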

Doesn't seem to have happened yet despite being an incredibly obvious (and inexpensive!) thing to do. Seems like the MIDI community would be all over something like this, even if no one else considered it worthwhile.

This reminds me of the Surface Dial. https://www.microsoft.com/en-us/p/surface-dial/925r551sktgn

Being able to set the dial on the screen and just turn it is a really good-feeling workflow, though likely not for something as mission-critical as the article is describing.

What I had in mind would cost closer to $0.99 than $99.00, though. (Well, OK, $9.99.) The Surface Dial was a relatively complex Bluetooth device; it didn't work through the touchscreen itself, except to the extent that the touchscreen somehow knew that a Surface Dial was resting on it.

According to the spec it uses the touchscreen, or at least some touchscreens:

>On-screen detection: Touch digitizer reports the onscreen location through a capacitive pattern (Studio only)


Right, to sense location, not rotation. Rotation goes through Bluetooth, as with any number of existing knob controllers.

Funny, I have a whole sketch for this. Well, the cat is out of the bag; we might as well flesh this out to prevent any patents.

I envision using cam-levered suction cups to hold on rotary knobs and linear sliders that have touchscreen-sensitive rubber tips. One could go as far as to 3D-print arbitrary interaction devices that attach to the face of the touchscreen. You can use the multitouch sensor without the screen but still be able to configure arbitrary devices to go on the front.

I even had a design for a joystick. There are lots of analog opportunities when you have something like a back-illuminated camera or a touch controller that can sense areas. You could also serially transfer data from the device to the touchscreen, using either physical touches or electrically simulated touches.
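That last idea, serial transfer through simulated touches, amounts to on-off keying: the device toggles a touch on and off on a known patch of the screen, one bit per clock tick. A purely illustrative sketch of the framing (the start bit and LSB-first ordering are my assumptions, not anything from an existing protocol):

```python
def encode_bits(data: bytes):
    """Yield touch states (True = touch present) for on-off keying,
    one bit per tick, LSB first, with a start bit so the receiver can sync."""
    for byte in data:
        yield True  # start bit marks the beginning of each byte
        for i in range(8):
            yield bool((byte >> i) & 1)

# 0x0F -> start bit, then bits 1,1,1,1,0,0,0,0 (LSB first)
print(list(encode_bits(b"\x0f")))
```

At typical touch-report rates of 60-120 Hz you'd only get a few bytes per second this way, but that's plenty for a device to announce "I am a knob at this location."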

Yep, any number of other controls besides knobs would work under the same basic principle. A linear slider control would be an obvious one, as would calculator-style membrane buttons.

I'd be surprised if the concept weren't already patented, though, just because the idea of a generalized capacitive control surface seems fairly obvious, and the patent office doesn't really apply an "obviousness" test. What definitely surprises me is that, patented or not, I can't just go out and buy these sorts of controls.

These are great ideas! Tbh, I think much of the reason these don't exist is the configurability of software UIs: these controls on Linux/Windows/macOS would need to be specifically programmed for by the programmer, rather than, say, in Squeak by the users. Breaking down that user/programmer barrier is key, I think.

What if the metal base of the rotatable knob had a certain rotationally asymmetric pattern instead of being completely flat?

We would then be able to receive that pattern and understand how the knob is rotated.
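A rotationally asymmetric pattern would indeed give you absolute position rather than just relative motion. Here's a rough sketch of the decoding, assuming a hypothetical knob with three electrode pads at unevenly spaced angles (so every rotation produces a unique constellation of contact points):

```python
# Hypothetical pad layout at rotationally asymmetric angular offsets (degrees).
# The pairwise gaps (40, 60, 100) are all distinct, so no two rotations
# produce the same constellation of contact points.
PADS = [0, 40, 100]

def estimate_rotation(observed):
    """Brute-force the rotation (1-degree steps) whose rotated pad layout
    best matches the observed contact angles."""
    def cost(rot):
        return sum(min(abs(((p + rot) - o + 180) % 360 - 180) for o in observed)
                   for p in PADS)
    return min(range(360), key=cost)

# Simulate the knob turned to 73 degrees and recover that angle:
print(estimate_rotation([(p + 73) % 360 for p in PADS]))  # -> 73
```

The catch, as discussed downthread, is sensor resolution: with ~4 mm sensor pitch the pads would have to be far enough apart that a small knob couldn't carry very many of them.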

What's the resolution of today's touchscreens?

It's an interesting thought. You'd want to use a full bandwidth data channel (e.g. Bluetooth) for anything complicated.

But for simple things, if possible without affecting the other parts of the screen, it'd be amazing to have a broadly supported, low bandwidth standard.

I can't really make sense of the bandwidth (information rate) of a tactile interface. A measure in units of length would make more sense to me.

Any digital control signal effectively has a minimum bandwidth.

This might be as simple as "Here's my encoded position * frequency of sampling", but for a general interface you'd want something adaptable.
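That "encoded position × sampling frequency" figure is easy to put numbers on. A minimal sketch, with an example knob whose parameters are made up for illustration:

```python
import math

def control_bandwidth(positions: int, sample_hz: float) -> float:
    """Minimum bit rate to report a control with `positions` distinct states
    at `sample_hz` samples per second (position encoded in ceil(log2 n) bits)."""
    return math.ceil(math.log2(positions)) * sample_hz

# A 128-detent knob sampled at 100 Hz: 7 bits * 100 Hz = 700 bits/s
print(control_bandwidth(128, 100))
```

Even a panel full of such controls stays in the low kilobits per second, which is why a low-bandwidth side channel through the touch sensor itself doesn't seem crazy.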

What if there were two dials on the control? Two dials and three buttons? One dial, four buttons, and a joystick?

It's enough for knobs with 1-inch spacing or wider, but there's no easy support for fancy geared knobs that coaxially combine a coarse setting and a fine adjustment.

Very strongly related to what you describe are fiducial markers. They were a big thing in the small pond that was projected touch surfaces some years ago, though I haven't heard much about them in recent years as the industry has headed in a different direction. It's harder to find info on them now, but it's still out there. A quick search yielded https://www.christianholz.net/fiberio.html which contains an image of the concept at the bottom.

Also see Reactable


I’m not sure touch screens have sufficient resolution.

Check this: http://huyle.de/2019/02/12/accessing-capacitive-images/ As you can see, the sensor elements are huge, 4×4 mm each, i.e. there are only 15×27 sensors for the complete touchscreen. On top of that, there's a high amount of noise in each sensor's signal.

The reason it works OK in practice is that fingers have a very predictable shape, and there's a lot of software involved at all levels of the stack. Touchscreen firmware filters out sensor noise and generates touch points. Higher-level GUI frameworks "snap" touches to virtual buttons; some platforms go as far as making virtual keyboard keys different sizes, depending on which keys are expected to be pressed next according to predictive-input software, i.e. dictionaries.
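The firmware's trick for beating the coarse pad pitch is worth spelling out: a fingertip blob covers several cells, and a weighted centroid over those cells interpolates the touch point to sub-pad precision. A toy sketch (the capacitive image values and threshold are invented for illustration):

```python
def touch_centroid(cap_image, threshold):
    """Weighted centroid of above-threshold cells in a capacitive image
    (rows x cols of raw sensor values). Sub-cell interpolation like this is
    how firmware reports positions finer than the ~4x4 mm pad pitch."""
    total = x_acc = y_acc = 0.0
    for y, row in enumerate(cap_image):
        for x, v in enumerate(row):
            w = v - threshold
            if w > 0:
                total += w
                x_acc += w * x
                y_acc += w * y
    if total == 0:
        return None  # no touch detected
    return (x_acc / total, y_acc / total)

# A finger-sized blob straddling two cells lands between them:
img = [[0,  0,  0, 0],
       [0, 50, 50, 0],
       [0,  0,  0, 0]]
print(touch_centroid(img, 10))  # (1.5, 1.0), in cell coordinates
```

A small stick-on electrode, by contrast, might light up only one or two cells, which is exactly why its effective resolution would be worse than a finger's.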

What you propose can probably be done by using a finger-like object, but I don't expect the resolution will be great, at least not in comparison with hardware knobs; even cheap ones can be made extremely precise. See https://en.wikipedia.org/wiki/Rotary_encoder and https://en.wikipedia.org/wiki/Incremental_encoder for more info; both are used in a wide variety of applications. Old ball mice had two of them, and the reason ball mice sucked was not sensor precision but dirt accumulation, which would be a minor issue for a knob.
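For comparison, here's what the hardware side of an incremental encoder does. The two channels (A and B) are offset by a quarter cycle, and the order of state transitions gives you direction; a minimal quadrature decoder, just as a sketch:

```python
# Quadrature decoding for an incremental encoder: two phase-offset channels
# (A, B) are sampled together; the order of state transitions gives direction.
TRANSITIONS = {  # (prev_state, new_state) -> step; states are (A<<1)|B bit pairs
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(states):
    """Accumulate position from a sequence of (A<<1)|B samples,
    ignoring repeated or invalid (two-bit-change) transitions."""
    pos = 0
    for prev, new in zip(states, states[1:]):
        pos += TRANSITIONS.get((prev, new), 0)
    return pos

# One full Gray-code cycle in one direction is 4 counts:
print(decode([0b00, 0b01, 0b11, 0b10, 0b00]))  # 4
```

Note that every valid transition changes exactly one bit (a Gray code), so a single misread sample costs at most one count rather than a large jump, which is part of why even cheap encoders are so robust.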

It would require multi-touch displays. I have no idea how ubiquitous those are these days, but most touchscreen interfaces in my life have not supported it.

In Michael Naimark's series of articles about "VR and AR Fundamentals" [1], the chapter on "Other Senses (Touch, Smell, Taste, Mind)" [2] discusses haptic feedback, and even mentions Hiroo Iwata's delicious "Food Simulator" [3].

[1] VR and AR Fundamentals: https://medium.com/@michaelnaimark/vr-ar-fundamentals-prolog...

[2] Other Senses (Touch, Smell, Taste, Mind): https://medium.com/@michaelnaimark/vr-ar-fundamentals-3-othe...

[3] Food Simulator: https://www.wired.com/2003/08/slideshow-wonders-aplenty-at-s... https://ars.electronica.art/center/en/food-simulator/ http://icat.vrsj.org/papers/2003/00876_00000.pdf

Hiroo Iwata is a brilliant mad scientist [4], and in a previous HN discussion about pie menus and haptic multitouch interfaces [5], I linked to his wonderful work on 3DOF Multitouch Haptic Interface with Movable Touchscreen. [6] [7]

[4] Professor Hiroo IWATA: http://www.frontier.kyoto-u.ac.jp/te03/member/iwata/index.ht...

[5] HN discussion of pie menus and haptic multitouch interfaces: https://news.ycombinator.com/item?id=17105984

[6] 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://www.youtube.com/watch?v=YCZPmj7NtSQ

[7] 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://link.springer.com/chapter/10.1007/978-981-10-4157-0_...

>Shun Takanaka, Hiroaki Yano, Hiroo Iwata, Presented at AsiaHaptics2016. This paper reports on the development of a multitouch haptic interface equipped with a movable touchscreen. When the relative position of two of a user’s fingertips is fixed on a touchscreen, the fingers can be considered a hand-shaped rigid object. In such situations, a reaction force can be exerted on each finger using a three degrees of freedom (3DOF) haptic interface. In this study, a prototype 3DOF haptic interface system comprising a touchscreen, a 6-axis force sensor, an X-Y stage, and a capstan drive system was developed. The developed system estimates the input force from fingers using sensor data and each finger’s position. Further, the system generates reaction forces from virtual objects to the user’s fingertips by controlling the static frictional force between each of the user’s fingertips and the screen. The system enables users to perceive the shape of two-dimensional virtual objects displayed on the screen and translate/rotate them with their fingers. Moreover, users can deform elastic virtual objects, and feel their rigidity.

There are some other really bizarre examples of haptic interfaces in the AsiaHaptics2016 conference videos! (Not all safe for work, depending on your chosen profession, predilection for palpation, and assessment of sphincter tone.) [8]

[8] AsiaHaptics2016: https://www.youtube.com/channel/UC8qMmIgmWhnQBeABjGlzGbg/vid...
