the way I see it, physical buttons have two uses:
one is tactile navigation: the ability to find out what part of the screen you're touching and navigate it without vision.
the second use, though, is activation feedback: a button has to be navigable separately from its activation. unless these protrusions can tell whether they're being pressed or not, it still seems useless to have a lumpy touchscreen...
so I really hope they've found some method of distinguishing a finger simply sliding around in search of a button from a finger actually pressing down... this is the exact same issue the various touch-mouse devices have
(left click with a finger on the left side, right click with a finger on the right side, unless, that is, you like to REST your fingers while clicking)
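to make the rest-vs-press distinction concrete: one plausible way to separate "searching for a button" from "activating it" is a pressure threshold with hysteresis. this is just a sketch of that idea, not how Tactus (or any real touchscreen stack) actually does it; the thresholds and event names are made up for illustration.

```python
# Hypothetical sketch: classify a stream of (x, y, pressure) samples
# into hover/press/hold/release events using a pressure threshold
# with hysteresis, so a resting or sliding finger never triggers a click.

PRESS_THRESHOLD = 0.6    # normalized pressure needed to count as a press
RELEASE_THRESHOLD = 0.4  # lower release threshold so jitter doesn't drop a press

def classify(samples):
    """Yield (event, x, y) tuples: 'hover' while the finger merely rests
    or slides, 'press' on the transition past the threshold, 'hold' while
    pressed, and 'release' when the finger lets up."""
    pressed = False
    events = []
    for x, y, pressure in samples:
        if not pressed and pressure >= PRESS_THRESHOLD:
            pressed = True
            events.append(("press", x, y))
        elif pressed and pressure < RELEASE_THRESHOLD:
            pressed = False
            events.append(("release", x, y))
        else:
            events.append(("hold" if pressed else "hover", x, y))
    return events
```

with something like this, a finger resting on a raised key at light pressure generates only "hover" events, and navigation stays cleanly separated from activation.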
"While touchscreens provide a versatile user experience, they provide no tactile experience for consumers. Vibration haptics and similar solutions try to simulate a sensation of touch, but all are "feedback" technologies, vibrating only after touching the screen (even if they are touched in the wrong place or by mistake). In contrast, Tactus' technology creates real, physical buttons, where users can rest their fingers on the buttons, as on a mechanical keyboard, and input data by pressing down on the keys. Tactus is the only solution to both "orientation" and "confirmation" problems that are inherent in touch screens."