Haptic feedback features found in a recent iMovie update illustrate the power of Apple’s new Force Touch trackpad to provide feedback, not just to act as an input device, says one of the pioneers of the technology.
Freelance film editor Alex Gollner first noticed Apple was using the trackpad to provide tactile feedback in a recent update to iMovie.
When dragging a video clip to its maximum length, you’ll get feedback letting you know you’ve hit the end of the clip. Add a title and you’ll get feedback as the title snaps into position at the beginning or end of a clip. Subtle feedback is also provided with the alignment guides that appear in the Viewer when cropping clips.
Apple showed off the Force Touch feature when announcing the new 12-inch MacBook, also adding it to the 13-inch MacBook Pro with Retina display. The WSJ recently claimed that Apple also plans to introduce the feature to the touchscreen on the next generation of the iPhone …
Apple’s presentation focused on the feature as an input mechanism, controlling how OS X or an app responds to different levels of pressure. But haptics pioneer Vincent Hayward says the trackpad has equal potential as a feedback device, allowing you to feel what an app is doing, reports Wired.
Hayward can imagine it accentuating interaction with all sorts of on-screen elements, like buttons, menus and icons. “It could make interaction more realistic, or useful, or entertaining, or pleasant,” he says. […] A project from a group of Disney researchers involved a touchscreen environment in which icons felt “heavier” based on their file size. [Imagine] a version of Angry Birds where you could sense the tension in the slingshot as you drew it further back.
Hayward says Apple’s reported plan to add Force Touch to the iPhone is a realistic one, requiring only sufficiently powerful and battery-efficient motors.
I’m not posting this as a ‘fanboy’, and I love reading this site, but… just think about it… and…
The possibilities of this technology are truly mind-boggling.
I can’t wait to see what develops out of it.
Great, that’s a realistic move by Apple.
Seems really nice. Can’t wait to see what they do with it in OS X 10.11 this summer.
As a designer, the idea of feeling the haptic feedback when aligning items in Adobe programs seems like it would be a nice touch, and something I almost immediately thought of when I first saw the announcement.
I think it could also put things on a path moving forward for greater accessibility—imagine actually being able to feel the landscape of your computer screen, each icon, window, button, etc. What if it could dynamically output braille as you move the cursor over text on the screen? Sure, that’s probably off in the future a bit, but I agree with Darren—the possibilities are impressive when you look down that path with a little creativity.
This has to be our ‘S’ feature :)
Nope… Apple has never introduced new screen tech on an “S” model.
Plus, Apple increased the size and resolution of the iPhone just last year.
Man, this is exciting. This was like when Sony came out with the DualShock controller. I never wanted to use a normal controller again. It just felt like you were missing out on the experience.
My big question is how this could work in an iPhone and what the toll on battery life is. One of the reasons Apple discourages use of the motor is because it drains battery. If this is efficient enough, the applications of this are wild: subtle cues across the OS, specific triggers in utility apps, and then how immersive it will be for gaming. I bet it would be a big win for accessibility too.
I think for the first iteration the current vibrator motor might be good enough. It might take some rearranging of the motor to provide the best feedback/user experience, but it should work.
As technologies get tweaked and perfected they’ll be able to put something in that’s more robust later.
It’s gonna be huge. Apple needs to make clothes with numerous sensors, including biometric sensors, and Taptic Engines built into them. Combine that with an Apple virtual reality headset, and you could be immersed in virtual reality that would be insanely lifelike. You’re watching a movie and the wind is blowing, and haptic feedback in the clothing gives you the sensation of wind blowing against your body. You’re playing a horror game and the villain grabs you on your shoulder, and you feel it through the haptic feedback right on your shoulder. The possibilities are endless.
I think we’ll leave it up to another company to make clothes. Apple’s VR is going to be good, and they’re going to advance it beyond anything anyone besides Oculus can do.
This adds another output method for the iPhone. Touch should be utilised as fully as visuals.
In a touch mode, could the screen be kept off to lower the demand on the battery?
Really, iOS should have two flavours: one for people with sight and one for people without. A change of layout design for the home screen may be needed. There would be new ways of controlling screen chrome, scrolling, images, Contacts, Calendar, or Settings. This new output should be linked to new Siri commands.
I had a Logitech iFeel mouse with haptic feedback years ago, and I really liked it. It made me sad that it never really caught on, and software was never designed to take full advantage of it from a UX perspective. Hopefully a more integrated approach will yield better results.
I’m guessing it was just a vibration?