> Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.
This excites me so, so much! I can't really use my phone as a passenger in a car without getting motion sick after 1-2 minutes. This seems like it might be a promising thing to try.
I used to get bad nausea from aggressive physics-y VR games. But I heard people claim it can be grinded through. So I did that, and they were right. I can do VR games without needing to vomit, although it’s still uncomfortable.
However… I am now much more sensitive to non VR motion sickness. :|
I played games my whole life and was shocked I had near-instant VR motion sickness in sim racing. Can confirm it can be grinded through: recognize the feelings and stop immediately.
Very similar experience. My instinct would be to fight the sickness and push through, but in reality you need to stop immediately and try again in a few hours. Your tolerance tends to build exponentially!
Yes, sort of. I don’t necessarily have to feel hungry but if I’m on an empty stomach or just haven’t eaten in a while, the odds I get motion sickness are much higher.
If I’m riding somewhere to go get dinner, I have to sit in the front passenger seat. After dinner with a full belly? Throw me in the back and I’ll be fine.
Some people never really learn how to use one-pedal driving, so they end up just going back and forth between accelerating and decelerating. That'll make me motion sick in a hurry, and I bet that is fairly universal (among people prone to motion sickness in cars, that is). So in that sense, any EV or hybrid is potentially a problem, depending on the driver.
I suspect most people's interaction with Priuses is Uber rides. Maybe Uber drivers just get bad habits from the platform incentives (drive fast = get more rides).
I went on a cruise, and had significant (for me) motion sickness that only got better once I ate --- of course, I was avoiding eating because I didn't feel well, so that seems like the wrong choice.
I regularly drive two family members around—one gets motion sick much faster and more frequently when hungry, while the other gets motion sick the same either way.
I wonder what new voices will be added to VoiceOver? We blind people never, ever thought Eloquence, an old TTS engine from 20 years ago now, would ever come to iOS. And yet, here it is in iOS 17. I wouldn't be surprised to see DecTalk, or more Siri voices. More Braille features is amazing, and they even mentioned the Mac! VoiceOver for Mac is notoriously never given as much love as VoiceOver for iOS is, so most blind people still use Windows, even though they have iPhones.
I was expecting to see much better image descriptions, but they've already announced a ton of new stuff for plenty of other disabilities. Having haptic music will be awesome even for me, adding another sense to the music. There is just so much new accessibility stuff, and I can't wait to see what all is really new in VoiceOver, since there are always new things not talked about in WWDC or release notes. I'm hoping that, one day, we get a tutorial for VoiceOver, like TalkBack on Android has, since there are so many commands, gestures, and settings that a new user never learns unless they happen to learn about them.
>I'm hoping that, one day, we get a tutorial for VoiceOver
Maybe it's not feasible for you but if you're ever near an Apple Store, you could definitely call them and ask whether they have an accessibility expert you could talk to. In the Barcelona Apple Store for example, there is a blind employee who is an expert at using Apple's accessibility features on all their devices. He loves explaining his tips and tricks to anyone who needs to.
My friends that use synthetic voices prefer the cleanliness of the older and more familiar voices. One friend listens at about 900 WPM in skim mode, and none of the more realistic voices work well at those rates.
Every once in a while I'll hear a blind person's phone audio while I'm out and about and it sounds like an unintelligible stream of noises, but they're interacting with it and using it and getting information from it. It's amazing, a whole other world of interaction from what I experience with my phone. I kind of want to learn how they're interacting with it.
Back in the 90s we reverse engineered / patched the classic MacInTalk to speed up playback. Our testers had it cranked so fast as they navigated the UI that to me it sounded more like musical notes than text.
The image description stuff is already surprisingly good - I noticed when I got a photo text while driving and it described it well enough for me to know what it was.
It’s sometimes awesome, and often extremely basic. “Contact sent you a picture of a group of people at an airport”. Amazing. “Contact sent you a screenshot of a social media post”. Useless. We know iOS can select text in pictures, so Siri can clearly read it. It knows it’s a social media post, so why not give me the headline?
This is a good time to remind everyone that tomorrow, May 16th, is Global Accessibility Awareness Day (GAAD) (https://accessibility.day), and that there are over 176 events worldwide going on to celebrate the progress we are all making at improving accessibility in our products -- and plenty of learning opportunities for beginners and experts.
Accessibility settings are really a gold mine on iOS for device customization (yes, I agree, they shouldn’t be limited to accessibility).
I’m particularly interested in the motion cues and the color filters for CarPlay - I have color filters set up to enable every night as kind of a Turbo-night shift mode (deep orange-red color shift), would love to do the same for CarPlay.
I also completely forgot iOS had a magnifier built in!
Accessibility features tend to be superpowers though, and I'm glad Apple gates them behind permissions and opt-ins. We all know of applications that try to trick the user into granting them inappropriate access to the device through the Accessibility APIs. I think Dropbox still begs you to grant it Accessibility access so its tendrils can do who-knows-what to your system.
Guaranteed that marketers are salivating at the idea of eye tracking in apps and websites. It's an amazing feature that absolutely needs to be gatekept.
I wonder if it'll use the same architecture as visionOS, where the eye tracking events and UI affordances are processed and composited out-of-process, with the app never seeing them.
It varies. Things like keyboard control, absolutely, but mostly I've used it for stuff like "don't make an animated transition every time I change pages like an overcaffeinated George Lucas" or "actually make night shift shift enough to be useful at night". I also use the background sounds to augment noise cancellation while taking a nap. All of those are just useful things or personal settings, not necessarily attack vectors.
My favorite is the "allow ANC with just one AirPod in". I have no idea why this would be an accessibility feature. If I turn on ANC, then I don't want it to be disabled just because I'm listening with one ear!
Well, they aren't really limited to accessibility, but they are hidden there. It's sort of like a convenient excuse to get UI designers off your back if you want to ship customization.
I love accessibility features because they might be the last features developed solely with the benefit of the user in mind. So many other app/os features are designed to steal your attention or gradually nerf usefulness.
I often use them to get around bad UI/UX (like using Reduce Motion), or to make devices more useful (Color Filters (red) for using at night).
Even outside of this, able-bodied folks can be temporarily disabled due to illness, surgery, injury, etc. So it's great to see Apple continuing to support accessibility.
The red color filter for preserving night vision when outside is a great tip. Some apps have this built in, but it's much better to have the OS change it everywhere.
Recommend creating a Shortcut to toggle this setting.
I don't love that solid UX gets pushed under the accessibility rug, as an option you might never find.
I don't care how cynical it sounds, user experience became user exploitation a long time ago. Big Tech have been running that gimmick at too-big-to-fail scale for the last decade or so.
Let's say you're a developer at a big software company (not necessarily Apple, this happens everywhere) and you want to add a new optional setting.
The bar is pretty high. There are already hundreds of settings. Each one adds cost and complexity. So even if it's a good idea, the leadership might say "no".
Now let's say this same setting makes a big difference in usability for some people with disabilities. You just want to put it in accessibility settings. It won't clutter the rest of the UI.
The iPhone has a hidden accessibility setting where you can map a double and/or triple tap of the back of your phone to a handful of actions. I use this to trigger Reachability (the feature that brings the entire UI halfway down the screen so you can reach buttons at the top) because phone screens are so damn big that I can't reach the opposite top corner with my thumb even on my 13 mini without hand gymnastics. And the normal Reachability gesture is super unreliable to trigger ever since they got rid of the front Touch ID home button.
Accessibility benefits everyone, but on the basics you’re right. Too many simple, straightforward options are now strictly inside accessibility. At least on the Apple side.
And don’t get me started on hidden command line settings.
Some of them are, at least on Apple's side, but it's always for a good technical reason. Screen recognition is only available on devices that have a neural chip, things that require lidar don't work on devices that don't have lidar and so on.
Google is worse at this, Talkback multi-finger gestures used to be Pixel and Samsung exclusive for a while, even though there was no technical reason for it.
Apple has a different problem: many accessibility features aren't internationalized properly. Screen recognition still has issues on non-English systems, and so do image descriptions. VoiceOver (especially on Mac) didn't include voices for some of the less popular languages until very recently, even though Vocalizer, their underlying speech engine, has supported them for years. Siri has the same problem.
Every attention thief is absolutely thrilled at the idea of tracking your eyes. Let’s all imagine the day where the YouTube free tier pauses ads when you’re not actively looking at them.
Shit. I’m turning into one of those negative downers. I’m sorry. I’ve had too much internet today.
Wouldn't a lot of the companies that build in accessibility do it from a viewpoint of gaining an even wider reach and/or a better public image?
I don't see optimizing for that as bad. If they think we'll love the product more by making it better for a given audience, especially if I'm in that audience, I'm happy. Does that mean this company now gets richer? Perhaps, and that's fine by me
This will happen. These features are always ushered in as ways to make someone's life easier, and often that is exactly what it does, for a time, before some product manager figures out how they can maximize profit with it.
This is why Apple is the best. They’ll make features like this because it’s the right thing to do, even if it won’t really make any difference to 99.99% of their customers.
This was the right thing to do, but I doubt “the right thing to do” was the primary motivator for it. This is smart marketing that helps position them as the more caring, inclusive, and capable tech brand. It also expands their reach more than a fraction of a percent. 13% of Americans have some form of disability.
The accessibility features are also useful for automated UI testing.
>They’ll make features like this because it’s the right thing to do
While the positivity is great to see -- I'd temper that expectation they're simply doing the right thing and definitely acting in their perceived interest. People can and often do the right thing -- large companies rarely do.
Do you realize how bad Apple's accessibility issues were before? This is just marketing and I doubt many people are going to drop thousands of bucks to ditch years of tooling they're already intimately familiar with (aka, Windows), but this is an effort to try and entice some people. That's all it is. Marketing.
I have better than 20/20 vision (yes, really) and no mobility problems, but there are some macOS accessibility features that I love.
One is Zoom: I hold down two modifier keys and scroll, and can instantly zoom in to any part of the screen. It is extremely smooth; the more you scroll, the higher you zoom, instantly and with a high frame rate. Great when you want to show someone something, or just zoom into a detail of something if the app doesn’t support it or is too cumbersome. Or “pseudo-fullscreen” something.
The other one is three finger drag on the trackpad. Why that isn’t default is beyond me. It means that if you use three fingers, any dragging on the trackpad behaves as if you held the button down. It’s so convenient.
The default way to drag on a trackpad is something I’ve never gotten good with so I always enable drag lock the second I get a new laptop. Ideally I would switch to the three finger gesture but after 15 years of drag lock I just can’t get my brain to switch over.
It's better to have an easy way to hold a mouse button via keyboard with your left hand and continue to use one finger on the touchpad rather than do the whole three fingers to drag
three finger dragger for life here. whenever I see colleagues struggling to drag items on a Mac, I show them three finger dragging and it blows them away. total game-changer!
If accurate enough, it seems like eye tracking could be useful beyond accessibility, e.g. to control iPads without dragging one’s greasy fingers all over the screen. iPad already supports pointers.
That's a separate option. The option above head tracking in those settings allows for adding your facial expressions (smiling, blinking etc) as shortcuts for clicks.
I definitely get a good amount of motion sickness when using my phone while in a car so I'm super interested about the motion sickness cues and if they'll work. The dots look like they may get in the way a bit but I'm willing to take that tradeoff.
My current car motion sickness mitigation system is these glasses that have liquid in them that supposedly help your ears feel the motion of the car better (and make you look like Harry Potter)
I wonder if Vision Pro has enough microphones to do the acoustic camera thing? If so you could plausibly get "speech bubbles over people's heads", accurately identifying who said what.
I imagine that could be pretty awesome for deaf people.
Deaf people I know certainly wouldn't want to be wearing a headset, say, in a restaurant. But a phone app or tablet that can do live speech-to-text would be good enough in many cases if it can separate voices into streams. Anything like this available?
I don't want to say it's certainly not available, but I doubt it. Maybe someone has done something trying to recognize different voices by differences in pitch/volume/...
The Vision Pro has six mics, which likely enable it to locate the sources of sound in space by the amount of time it takes a sound to reach each mic (i.e. acting as an "acoustic camera"). Tablets and phones aren't going to have that hardware, unfortunately.
And yeah, obviously wearing a headset is a pretty big compromise that's not always (or maybe even often) going to be worth it. This was more of a thought of "the hardware can probably do this cool thing for people already using it" than "people should use the hardware so it can do the cool thing".
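For intuition, the core of that "acoustic camera" idea is time-difference-of-arrival: with two mics a known distance apart, the delay between when a sound hits each mic constrains the direction of the source, and more mic pairs narrow it down to a point in space. A toy sketch in Python, with illustrative numbers rather than anything Apple has documented:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def direction_from_delay(delay_s, mic_spacing_m):
    """Estimate the angle of arrival (radians from broadside) for a
    two-mic array, given the measured inter-mic time delay."""
    # Geometry: delay = spacing * sin(angle) / c
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.asin(ratio)

# A sound arriving 0.1 ms later at one mic of a pair 10 cm apart
# comes from roughly 20 degrees off-center:
angle_deg = math.degrees(direction_from_delay(0.0001, 0.10))
```

With six mics you get many such pairs, enough to triangulate a position rather than just a bearing, which is what would let captions be pinned near whoever is speaking.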
I'm a believer in accessibility features. The difficulty is often in testing.
I use SimDaltonism to test for color-blindness accessibility, and in the last app I wrote, I added a "long press help" feature that responds to long-presses on items by opening a popover containing the label and hint. It makes testing much easier, and doubles as user help.
Accessibility is for everyone, including you, if you live long enough. And the alternative is worse. So your choice is death or you are going to use accessibility features. – Siracusa
I aimed for the upvote button but they’re so tiny that my fat finger hit the downvote button by accident and then I had to retry the action. This is what people mean by accessibility is for everyone all of the time.
I've hidden and reported so many posts on the front page.
I can only hope that the algorithms take into account that there's a decent chance someone trying to hit a link or access the comments will accidentally report a post instead.
Zoom the website in, the browser has accessibility built in and hackernews zooms in fairly well.
Edit: I seem to be missing something as this is getting downvoted. I genuinely cannot use HN under 150% zoom so thought this was a basic comment I was making.
I'm severely hearing impaired and enjoy going to dance classes - swing, salsa, etc. If I'm standing still, I can easily tune into the beat. But once I start moving, I quickly lose it on many songs; dance studios aren't known for having large sound systems with substantial bass. I don't know that this specific setup would fix anything -- it would need some way of syncing to the instructor's iPhone that is connected via bluetooth to the studio's little portable speaker. But it's a step in the right direction.
While on the topic, I can hear music but almost never understand lyrics; at best I might catch the key chorus phrase (example: the words "born in the USA" are literally the only words I understand in that song).
A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music. This has been game-changing for me. I'm catching up on decades' worth of music where I never had any idea what the lyrics were (filthy, that's what they are. So many songs about sex!). It has made exercising on the treadmill, elliptical, etc. actually enjoyable.
> A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music.
JSYK Spotify has had this for years too; it's just under the "View Lyrics" button, but it does highlight the sentence/word karaoke-style. It used to be a "Spotify app" back when those were a genre of the service.
There are a lot of interesting things we can do with haptics since they're relatively cheap to put in stuff. Hopefully accessibility pushes the software and applications further along soon.
I am excited for Vocal Cues - my main frustration with Siri is how poorly it comprehends even explicit instructions.
One thing I wish Apple would implement is some kind of gesture control. The camera can detect fine details of face movement, it would be nice if that were leveraged to track hands that aren't touching the screen.
For example, if I have my iPhone or iPad on the desk in front of me, and a push notification that I don't need obstructs the content on the screen, I would love to be able to swipe my hand up towards the phone to dismiss it.
iOS 17 Image Descriptions are quite good, but audio descriptions don't seem to work on non-Pro devices, even though text descriptions are being shown on the screen and the audio menu is present and activated. Is that a bug?
Even on Pro devices, audio image descriptions stop working after a few cycles of switching between Image Magnifier and apps/home. This can be fixed by restarting the app and disabling/enabling audio image descriptions, but that breaks the use case when an iPhone is dedicated to running only Image Magnifier + audio descriptions, via remote MDM with no way for the local blind user to restart the app.
On-device iOS image descriptions could be improved if the user could help train the local image recognition by annotating photos or videos with text descriptions. For a blind person, this would enable locally-specific audio descriptions like "bedroom door", "kitchen fridge" or specific food dishes.
Are there other iOS or Android AR apps which offer audio descriptions of live video from the camera?
This is one major advantage to Apple sharing a foundation across all their devices. Vision Pro introduced eye tracking to their systems as a new input modality, and now it trickles down to their other platforms.
I am surprised CarPlay didn’t have voice control before this though.
Is this really likely to be downstream of the Vision Pro implementation? I would think that eye-tracking with specialized hardware at a fixed position very close to the eye is very different to doing it at a distance with a general purpose front facing camera.
Typically eye trackers work by illuminating the eyes with near-infrared light and using infrared cameras. This creates a higher-contrast image of the pupils, etc. I assume Apple is doing this in the Vision Pro. Eye tracking can also be done with just visible light, though. Apple has the benefit of knowing where all the user interface elements are on screen, so eye tracking on the iPhone or iPad doesn't need to be high precision. Knowledge of the position of the items can help to reduce the uncertainty of what is being fixated on.
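That last point is easy to picture: the system only has to decide which on-screen target a noisy gaze estimate is closest to, not resolve an exact pixel. A hypothetical sketch of that snapping step in Python (the element names, rectangles, and distance threshold are all made up):

```python
import math

def snap_gaze_to_element(gaze, elements, max_dist=80.0):
    """Return the name of the UI element whose center is nearest the
    noisy gaze point (x, y), or None if nothing is within max_dist px.
    elements maps names to (x, y, width, height) rectangles."""
    best, best_d = None, max_dist
    for name, (x, y, w, h) in elements.items():
        cx, cy = x + w / 2, y + h / 2
        d = math.hypot(gaze[0] - cx, gaze[1] - cy)
        if d < best_d:
            best, best_d = name, d
    return best

buttons = {"back": (0, 0, 100, 44), "share": (280, 0, 100, 44)}
target = snap_gaze_to_element((310, 30), buttons)  # -> "share"
```

The real pipeline is surely probabilistic rather than nearest-center, but the principle (letting the known UI layout absorb sensor noise) is the same.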
Not the hardware side, but the software side. Implementing all the various behaviours and security models for eye tracking and baking it into SwiftUI means that it translates over more easily once they figure out the hardware aspect. But the iPads and iPhones with FaceID have had eye tracking capabilities for a while, just not surfaced in the UI.
CarPlay devices (car components) are essentially playing a streaming video of a hidden display generated by the phone. CarPlay also lets those devices send back touch events to trigger buttons and other interactions. Very little processing is done on the vehicle.
BTW if you are plugged in to CarPlay and take a screen shot, it will include the hidden CarPlay screen.
CarPlay is rendered by the phone itself, so it's not strictly a function of how powerful the car infotainment is. You've been able to talk to Siri since the beginning of CarPlay so additional voice control is really just an accessibility thing
My wife is a hospice nurse and from time to time she'll have a patient without any ability to communicate except their eyes (think ALS) - for these folks in their final days/weeks of life this will be a godsend. There are specialized eye-tracking devices, but they're expensive and good luck getting them approved by insurance in time for the folks in need near the end of their lives.
Eye-gaze devices (a tablet with a camera + software) cost around $20K; even if this offers 1/4 of the features, it's good news for those who can't afford that.
Don't be ridiculous. Solid hardware and software combos for Windows cost a small fraction of that. The convenient and decent Tobii PCEye costs like $1,250 and a very nice TMS5 mini is under $2,000. Your bullshit was off by at least an order of magnitude.
Let's be fair and compare similar products. Do you have any examples of $2000 mobile devices that support eye tracking on the OS level? The products you mention look like they're extra hardware you strap to a Windows PC. Certainly useful, but not quite as potentially life-changing as having that built into your phone.
I wonder if this announcement had anything to do with the bombshells OpenAI and Google dropped this week. Couldn’t this have been part of WWDC next month?
As Terramex pointed out this is tied to a particular, relevant event.
It’s also pretty common for Apple to preannounce some smaller features that are too specialized to be featured in the WWDC announcements. This gives them some attention when they would be lost and buried in WWDC footnotes.
It is also probably an indication that WWDC will be full of new features and only the most impactful will be part of the keynote.
They do it every year at the same time. Also, it’s a small announcement, not a keynote or the kind of fanfare we have at WWDC or the September events. This does not seem calibrated to be effective in an advertising war with another company. All this to say, probably not.
I think it's more of "clearing the decks" for stuff that didn't make the cut for WWDC. I assume WWDC is going to be all about AI and they couldn't find a good spot to put this announcement. "Clearing the decks" isn't a very kind way to refer to this accessibility tech since Apple has always been better than almost everyone else when it comes to accessibility. I don't see this as "we don't care, just announce it early" as much as "we can't fit this in so let's announce it early".
Can't wait for my son to try this. He has general coordination issues due to a brainstem injury but the eyes are probably the part of the body he can better control.
I'm not a fan of Apple's software and didn't have a great experience with the Vision Pro but I am excited to try it out.
One of the most important accessibility features they could bring back is a physical home button on at least one iPad.
I am completely serious. I work with a lot of older adults, many of whom have cognitive challenges, and the lack of a physical button to press as an escape hatch when they get confused is the #1 stumbling block by a mile.
When there were physical buttons, it was very popular in Asia to enable the accessibility option to put a virtual one on screen instead, because they were afraid the physical one would break. So it was kind of useless in the end.
> because they were afraid the physical one would break
On early models it was actually quite common for the button to stop working after a year or two. The durability has since improved, but habits die hard.
It’s not physical, but if the problem is that they need something visible rather than a gesture, then you can put an always-on menu button on the screen that has a home button in it.
Touch ID is so drastically superior to Face ID in so many common “iPad scenarios”, e.g. lying in bed with your face partially obscured.
I don’t understand Apple’s intense internal focus on FaceID only. FaceID with TouchID as an option when a device is flat on the table or when your face is obscured is so much nicer.
While it's a significant step forward for accessibility, it also invites us to consider how such technologies could integrate into everyday use for all users. This could enhance ease of use and efficiency, but it also requires careful consideration of privacy safeguards.
This is the beginning of the end for the mouse, not just on phones, but on desktops, everywhere. I highly recommend that everyone at least schedule a free Apple Vision demo at their local Apple Store.
This is awesome and I love the UX, although I can't help but feel a bit sad that we need to always rely upon Apple and Microsoft for consumer accessibility.
It would be so great if more resources could be allocated for such things within the Linux ecosystem.
I share the sentiment. I've long noticed, with grids of terminal windows, that focus-follows-eyes would be so much faster and present less friction than using hotkeys or focus-follows-mouse.
My first thought upon seeing the Haptic Music feature is to wonder how long until they make compatible headphones and I can relive my high school years, walking around listening to nu-metal and hip-hop with a Panasonic Shockwave walkman.
an iphone constantly broadcasts your location to third parties, can deduce where you work and live, understands the unique shape of your face, has a built-in microphone and multiple kinds of cameras, stores all of your conversations, stores all of your photos, stores your health information, can figure out when you go to bed.. all on a completely closed and proprietary operating system.
it’s like asking “why hasn’t anyone mentioned that we’re all using a website right now”
Very curious to see how well eye tracking works behind a motorcycle visor. Thick leather gloves, bike noise, and a touch screen and audio interface are not much fun.
This will have just enough capability limits and annoyances for you to be convinced to purchase a Vision Pro. It also makes eye tracking more widely accepted by making it available to everyone by using their existing devices.
while we're on topic of accessibility I'd like to point out the following lovely facts:
* on Android, in the builtin popup menu for copypasta, the COPY button is wedged in tightly (by mere millimeters) between the CUT and PASTE buttons on either side of it. A harmless one packed in between two destructive, lossy ones, and usually irreversible.
* the size of an adult human fingertip is no secret (and in fact 99.99% of humans have at least one!)
* Google's staff consists of humans
* Google supposedly hires ONLY "smart" people and their interview process involves some IQ-like tests
* Android has had this obvious anti-pattern for many years now. perhaps 10+?
They keep announcing these bombastic accessibility features while simple things like tabbing remain frustratingly broken. The macOS “allow this app to access that” dialog supports shift+tab, but not tab.
This is how I feel about "face recognition" (it should mean recognizing whether something is a face or not), but it is common to use eye tracking this way.
Where are you getting this language point from? If I look up any company selling "eye trackers", their products are all meant to track where you're looking, e.g., https://www.tobii.com/
Could be, but not necessarily. Eye tracking usually means it's tracking the eyes of one or more people in a video. Gaze tracking usually requires your eyes stay pretty steady and close to the tracker.
remember how google added js apis to detect active tabs and rendering intersection? complex webapps could use less battery and have richer interaction.
Sure it means something, it’s pretty low on the PR talk scale even. Let’s see:
> We believe deeply in
"We", in this case, refers to Apple, and since a company can’t believe in anything, it specifically refers to their employees.
> the transformative power of innovation
A new feature will bring forth change
> to enrich lives
In order to make someone’s life better.
So far so good, at least no synergies being leveraged for 10x buzzword transformation. Let’s see if it passes the test:
There’s a bunch of new accessibility features, that certainly fits the bill for innovation.
As anecdata, at least one of these will drastically change how I interact with my devices, so there’s the transformative power.
I will be able to use these new accessibility features to improve both the way I work and how I privately use technology, so one could argue my life is being positively enriched by this transformative power brought about by innovation.
Do they "believe deeply" in this? That is only for Tim Cook and his employees to know, but they’re one of the few companies pushing advances in accessibility past the finish line, so they might!
I always wonder why press releases include claims that a person said a paragraph or more of text that no one would ever say out loud, but this one actually does sound like something he'd say. In a keynote video, anyway.
You can say a lot of negative true things about Apple but this is just silly. There is no way Apple is going to expose that data to underlying apps in the same way they refused to do it in Vision Pro. I'd bet a good bit of money it works the same way where it's a layer on top of the app that the app can't access and from the video it looks like that's exactly how it works.
Not to underlying 3rd party apps, but for their ad structure it is a total possibility. It might not be now, but down the road this feature will be used for that and you must be certain that Apple already discussed and planned it.
Eye tracking is absolutely an accessibility feature. Just because you don't need it, and it can be abused, does not mean it isn't an absolutely game changing feature for some people.
> Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.
This excites me so, so much! I can't really use my phone as a passenger in a car without getting motion sick after 1-2 minutes. This seems like it might be a promising thing to try.
Vaguely related anecdote.
I used to get bad nausea from aggressive, physics-heavy VR games. But I heard people claim it can be ground through. So I did that, and they were right. I can do VR games without needing to vomit, although it’s still uncomfortable.
However… I am now much more sensitive to non VR motion sickness. :|
I played games my whole life and was shocked that I got near-instant VR motion sickness in sim racing. Can confirm it can be ground through: recognize the feelings and stop immediately.
Very similar experience. My instinct would be to fight the sickness and push through, but in reality you need to stop immediately and try again in a few hours. Your tolerance tends to build exponentially!
I have had good luck with just closing one eye. But that is very tiring to do for long periods.
A similar app that's on Android: https://play.google.com/store/apps/details?id=com.urbandroid...
Unsure if it actually works though, my personal test results are mixed.
Have you noticed any correlation between how hungry you are and how fast motion sickness kicks in?
Yes, sort of. I don’t necessarily have to feel hungry but if I’m on an empty stomach or just haven’t eaten in a while, the odds I get motion sickness are much higher.
If I’m riding somewhere to go get dinner, I have to sit in the front passenger seat. After dinner with a full belly? Throw me in the back and I’ll be fine.
I'm not sure why, but I feel like I only get motion sickness in the back of Priuses. It must be something about their braking curve.
I don't sit in enough EVs to tell if they're the same.
Some people never really learn how to use one-pedal driving, so they end up just going back and forth between accelerating and decelerating. That'll make me motion sick in a hurry, and I bet that is fairly universal (among people prone to motion sickness in cars, that is). So in that sense, any EV or hybrid is potentially a problem, depending on the driver.
Teslas are especially bad for me. I think it’s the rough suspension and fast acceleration/deceleration
I suspect most people's interactions with Priuses are Uber rides. Maybe Uber drivers just pick up bad habits from the platform incentives (drive fast = get more rides).
Toyota's hybrids are the worst. I never get motion sick except as a passenger in a Toyota hybrid.
It’s really interesting you say this. Is this a known correlation? I feel like now that you mention it, it’s incredibly fast if I’m hungry.
I went on a cruise, and had significant (for me) motion sickness that only got better once I ate --- of course, I was avoiding eating because I didn't feel well, so that seems like the wrong choice.
It is a known correlation.
I regularly drive two family members around—one gets motion sick much faster and more frequently when hungry, while the other gets motion sick the same either way.
Does make me wonder what the difference is there.
I have not. For me, it does not matter. The ride begins, and the motion sickness kicks in.
I have motion sickness... It's so hard for me to move around, and I am still not able to find what works best for me.
I wonder what new voices will be added to VoiceOver? We blind people never, ever thought Eloquence, an old TTS engine from 20 years ago now, would ever come to iOS. And yet, here it is in iOS 17. I wouldn't be surprised to see DECtalk, or more Siri voices. More Braille features are amazing, and they even mentioned the Mac! VoiceOver for Mac is notoriously never given as much love as VoiceOver for iOS, so most blind people still use Windows, even though they have iPhones.
I was expecting to see much better image descriptions, but they've already announced a ton of new stuff for plenty of other disabilities. Having haptic music will be awesome even for me, adding another sense to the music. There is just so much new accessibility stuff, and I can't wait to see what all is really new in VoiceOver, since there are always new things not talked about at WWDC or in release notes. I'm hoping that, one day, we get a tutorial for VoiceOver, like TalkBack on Android has, since there are so many commands, gestures, and settings that a new user never learns unless they go looking for them.
>I'm hoping that, one day, we get a tutorial for VoiceOver
Maybe it's not feasible for you but if you're ever near an Apple Store, you could definitely call them and ask whether they have an accessibility expert you could talk to. In the Barcelona Apple Store for example, there is a blind employee who is an expert at using Apple's accessibility features on all their devices. He loves explaining his tips and tricks to anyone who needs to.
My friends that use synthetic voices prefer the cleanliness of the older, familiar voices. One friend listens at about 900 WPM in skim mode, and none of the more realistic voices work well at those rates.
Every once in a while I'll hear a blind person's phone audio while I'm out and about and it sounds like an unintelligible stream of noises, but they're interacting with it and using it and getting information from it. It's amazing, a whole other world of interaction from what I experience with my phone. I kind of want to learn how they're interacting with it.
Here's a great demo by a blind software engineer that explains some of it: https://youtu.be/wKISPePFrIs?si=YOLcW9b2uyLXLn59
Back in the 90s we reverse engineered / patched the classic MacInTalk to speed up playback. Our testers had it cranked so fast as they navigated the UI that to me it sounded more like musical notes than text.
The image description stuff is already surprisingly good - I noticed when I got a photo text while driving and it described it well enough for me to know what it was.
Same, a family member sent a photo while I was driving, and over CarPlay it was described fairly accurately.
It’s sometimes awesome, and often extremely basic. “Contact sent you a picture of a group of people at an airport.” Amazing. “Contact sent you a screenshot of a social media post.” Useless. We know iOS can select text in pictures, so Siri can clearly read it. It knows it’s social media, so why not give me the headline?
This is a good time to remind everyone that tomorrow, May 16th, is Global Accessibility Awareness Day (GAAD) (https://accessibility.day), and that there are over 176 events worldwide going on to celebrate the progress we are all making at improving accessibility in our products -- and plenty of learning opportunities for beginners and experts.
Accessibility settings are really a gold mine on iOS for device customization (yes, I agree, they shouldn’t be limited to accessibility).
I’m particularly interested in the motion cues and the color filters for CarPlay - I have color filters set up to enable every night as kind of a Turbo-night shift mode (deep orange-red color shift), would love to do the same for CarPlay.
I also completely forgot iOS had a magnifier built in!
Accessibility features tend to be superpowers, though, and I'm glad Apple gates them behind permissions and opt-ins. We all know of applications that try to trick the user into granting them inappropriate access to the device through the Accessibility APIs. I think Dropbox still begs you to grant it Accessibility access so its tendrils can do who-knows-what to your system.
With great power comes great responsibility.
Guaranteed that marketers are salivating at the idea of eye tracking in apps and websites. It's an amazing feature that absolutely needs to be gatekept.
I wonder if it'll use the same architecture as visionOS, where the eye tracking events and UI affordances are processed and composited out-of-process, with the app never seeing them.
It varies. Things like keyboard control or that kind of thing, absolutely, but mostly I've used it for stuff like "don't make an animated transition every time I change pages like an overcaffeinated George Lucas" or "actually make Night Shift shift enough to be useful at night". I also use the background sounds to augment noise cancellation while taking a nap. All of those are just useful things or personal settings, not necessarily attack vectors.
My favorite is the "allow ANC with just one AirPod in". I have no idea why this would be an accessibility feature. If I turn on ANC, then I don't want it to be disabled just because I'm listening with one ear!
Well, they aren't really limited to accessibility, but they are hidden there. It's sort of like a convenient excuse to get UI designers off your back if you want to ship customization.
FYI you can also make turbo night shift by scheduling toggling of Reduce White Point, yep, in accessibility settings
I love accessibility features because they might be the last features developed solely with the benefit of the user in mind. So many other app/os features are designed to steal your attention or gradually nerf usefulness.
I often use them to get around bad UI/UX (like using Reduce Motion), or to make devices more useful (Color Filters (red) for using at night).
Even outside of this, able-bodied folks can become disabled due to illness, surgery, injury, etc. So it's great to see Apple continuing to support accessibility.
The red color filter for outside when trying to preserve night vision is a great tip. Some apps have this built-in but much better to have the OS change it everywhere.
Recommend creating a Shortcut to toggle this setting.
I don't love that solid UX gets pushed under the accessibility rug, as an option you might never find.
I don't care how cynical it sounds, user experience became user exploitation a long time ago. Big Tech have been running that gimmick at too-big-to-fail scale for the last decade or so.
Let's say you're a developer at a big software company (not necessarily Apple, this happens everywhere) and you want to add a new optional setting.
The bar is pretty high. There are already hundreds of settings. Each one adds cost and complexity. So even if it's a good idea, the leadership might say "no".
Now let's say this same setting makes a big difference in usability for some people with disabilities. You just want to put it in accessibility settings. It won't clutter the rest of the UI.
You just turned your "no" into a "yes".
The iPhone has a hidden accessibility setting where you can map a double and/or triple tap of the back of your phone to a handful of actions. I use this to trigger Reachability (the feature that brings the entire UI halfway down the screen so you can reach buttons at the top) because phone screens are so damn big that I can't reach the opposite top corner with my thumb even on my 13 mini without hand gymnastics. And the normal Reachability gesture is super unreliable to trigger ever since they got rid of the front Touch ID home button.
Accessibility benefits everyone, but in the basics you’re right. Too many simple straightforward options are now strictly inside accessibility. At least on the Apple side.
And don’t get me started on hidden command line settings.
> developed solely with the benefit of the user in mind
Hopefully accessibility features are never artificially segmented to higher priced devices.
Some of them are, at least on Apple's side, but it's always for a good technical reason. Screen Recognition is only available on devices that have a Neural Engine, things that require lidar don't work on devices that don't have lidar, and so on.
Google is worse at this; TalkBack multi-finger gestures used to be Pixel- and Samsung-exclusive for a while, even though there was no technical reason for it.
Apple has a different problem: many accessibility features aren't internationalized properly. Screen Recognition still has issues on non-English systems, and so do image descriptions. VoiceOver (especially on Mac) didn't include voices for some of the less popular languages until very recently, even though Vocalizer, the underlying speech engine, has supported them for years. Siri has the same problem.
At least in the US, they kind of can't be. The disability community is pretty up front about lawsuits.
I wonder if that would be legal, at least in the US. That feels like it'd be a violation of the ADA?
Every attention thief is absolutely thrilled at the idea of tracking your eyes. Let’s all imagine the day where the YouTube free tier pauses ads when you’re not actively looking at them.
Shit. I’m turning into one of those negative downers. I’m sorry. I’ve had too much internet today.
If this is at all like the eye tracking in Vision Pro, it is only available to the OS and apps are not given access to the data.
Wait for human attention detection to become mandatory to view DRMed content on the telescreen.
Watch the Black Mirror episode “Fifteen Million Merits”, to see how this might end up.
Wouldn't a lot of the companies that build in accessibility do it from a viewpoint of gaining an even wider reach and/or a better public image?
I don't see optimizing for that as bad. If they think we'll love the product more by making it better for a given audience, especially if I'm in that audience, I'm happy. Does that mean this company now gets richer? Perhaps, and that's fine by me
Funny, I was just thinking it was so that they can get more attention-economy eyeballs for ads.
This will happen. These features are always ushered in as ways to make someone's life easier, and often that is exactly what it does, for a time, before some product manager figures out how they can maximize profit with it.
Growth at all costs, I guess.
Accessibility features stand out as user-centric developments, love that
That’s a profound and surprising insight. You’re absolutely correct.
Accessibility features can be used to steal attention too
This is why Apple is the best. They’ll make features like this because it’s the right thing to do, even if it won’t really make any difference to 99.99% of their customers.
They really earn consumer loyalty.
This was the right thing to do, but I doubt “the right thing to do” was the primary motivator for it. This is smart marketing that helps position them as the more caring, inclusive, and capable tech brand. It also expands their reach more than a fraction of a percent. 13% of Americans have some form of disability.
I feel like it also helps them get an edge in the healthcare sector, which has buckets of money in the mix.
75+% of Americans become disabled (mostly temporarily) in their lifetime. So it will affect most people someday. And everyone dies in the end.
Tim Cook in response to a shareholder proposing scrapping accessibility to improve ROI:
“When we work on making our devices accessible by the blind, I don’t consider the bloody ROI”.
The accessibility features are also useful for automated UI testing.
>They’ll make features like this because it’s the right thing to do
While the positivity is great to see -- I'd temper the expectation that they're simply doing the right thing; they're definitely acting in their perceived interest. People can and often do the right thing -- large companies rarely do.
> People can and often do the right thing -- large companies rarely do.
There are companies which try (and succeed) to be honest "citizens". Berkshire and Costco have quite good reputations.
I read this very skeptically.
When I hear eye tracking I immediately think of advertisers targeting me based on what I look at, NOT quadriplegics using Apple devices.
Maybe I'm a cynic
I'm hoping they'll add this to macOS too. Eye tracking would be great for end-user UX research.
I'd also like to see what sort of weird stuff people come up with to use eye tracking. Games could use that in interesting ways.
Do you realize how bad Apple's accessibility issues were before? This is just marketing and I doubt many people are going to drop thousands of bucks to ditch years of tooling they're already intimately familiar with (aka, Windows), but this is an effort to try and entice some people. That's all it is. Marketing.
I have better than 20/20 vision (yes, really) and now mobility problems, but there are some macOS accessibility features that I love.
One is Zoom: I hold down two modifier keys and scroll, and can instantly zoom in to any part of the screen. It is extremely smooth: the more you scroll, the higher you zoom, instantly and with a high frame rate. Great when you want to show someone something, or just zoom into a detail of something if the app doesn’t support it or is too cumbersome. Or “pseudo-fullscreen” something.
The other one is three finger drag on the trackpad. Why that isn’t default is beyond me. It means that if you use three fingers, any dragging on the trackpad behaves as if you held the button down. It’s so convenient.
The default way to drag on a trackpad is something I’ve never gotten good with so I always enable drag lock the second I get a new laptop. Ideally I would switch to the three finger gesture but after 15 years of drag lock I just can’t get my brain to switch over.
It's better to have an easy way to hold a mouse button via keyboard with your left hand and continue to use one finger on the touchpad, rather than doing the whole three-finger drag.
Not for me, no.
But sounds like something accessibility options may also provide, and I can see how it may be better for a lot of people.
Three-finger dragger for life here. Whenever I see colleagues struggling to drag items on a Mac, I show them three-finger dragging and it blows them away. Total game-changer!
> and now mobility problems
Meant to write “no mobility problems”, but too late to edit now.
If accurate enough, it seems like eye tracking could be useful beyond accessibility, e.g. to control iPads without dragging one’s greasy fingers all over the screen. iPad already supports pointers.
I was thinking the same, it would make having an iPad under my monitor much less cumbersome
macOS has had a version of eye tracking for a while, it's really fun to try out.
System preferences -> Accessibility -> Pointer Control
Then turn on the "Head pointer" option.
Cool. I'm a bit unsettled that my camera's green dot didn't turn on for it though.
VisionOS has this too. I mapped it to a triple click of the button for when eye tracking becomes inaccurate.
Cool! That works surprisingly well. But how do you click while using this? Clicking on the trackpad doesn't work, when it's tracking my head.
That's a separate option. The option above head tracking in those settings allows for adding your facial expressions (smiling, blinking etc) as shortcuts for clicks.
Eye tracking coupled with the show grid feature would seem like using a computer the way that people do in movies https://www.youtube.com/watch?v=UxigSW9MbY8
It's basically how the Apple Vision Pro mainly works.
I definitely get a good amount of motion sickness when using my phone in a car, so I'm super interested in the motion sickness cues and whether they'll work. The dots look like they may get in the way a bit, but I'm willing to take that tradeoff. My current car motion sickness mitigation system is these glasses that have liquid in them that supposedly help your ears feel the motion of the car better (and make you look like Harry Potter).
I wonder if Vision Pro has enough microphones to do the acoustic camera thing? If so you could plausibly get "speech bubbles over people's heads", accurately identifying who said what.
I imagine that could be pretty awesome for deaf people.
Deaf people I know certainly wouldn't want to be wearing a headset, say, in a restaurant. But a phone or tablet app that can do live speech-to-text would be good enough in many cases if it can separate voices into streams. Anything like this available?
I don't want to say it's certainly not available, but I doubt it. Maybe someone has done something trying to recognize different voices by differences in pitch/volume/...
The Vision Pro has six mics, which likely enables it to locate the sources of sound in space by the amount of time it takes a sound to reach each mic (i.e. acting as an "acoustic camera"). Tablets and phones aren't going to have that hardware, unfortunately.
And yeah, obviously wearing a headset is a pretty big compromise that's not always (or maybe even often) going to be worth it. This was more of a thought of "the hardware can probably do this cool thing for people already using it" than "people should use the hardware so it can do the cool thing".
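The time-difference-of-arrival idea behind an "acoustic camera" can be sketched in a few lines. This is purely illustrative (the function name and toy signal are made up; real systems use many mic pairs, sub-sample interpolation, and more robust correlation), but it shows the core trick: the lag at which two mic signals correlate best tells you how much later the sound hit one mic than the other.

```python
import numpy as np

def estimate_delay(sig_a, sig_b, fs):
    # Lag (in seconds) by which sig_a is delayed relative to sig_b,
    # found as the peak of their full cross-correlation.
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)
    return lag / fs

# Toy example: the same click reaches mic B five samples after mic A.
fs = 48_000
click = np.zeros(1000)
click[100] = 1.0
mic_a = click
mic_b = np.roll(click, 5)  # delayed copy

delay = estimate_delay(mic_b, mic_a, fs)  # about 104 microseconds at 48 kHz
```

With several mics at known positions, a set of such pairwise delays can be intersected geometrically to estimate where the sound came from, which is what would let a headset pin a speech bubble to the right person.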
Google yesterday also open-sourced their accessibility feature for Android and Windows that controls the cursor using head movements and facial gestures:
https://github.com/google/project-gameface
I'm a believer in accessibility features. The difficulty is often in testing.
I use Sim Daltonism to test for color-blindness accessibility, and, in the last app I wrote, I added a "long press help" feature that responds to long-presses on items by opening a popover containing the label and hint. Makes testing much easier, and doubles as user help.
Accessibility is for everyone, including you, if you live long enough. And the alternative is worse. So your choice is death or you are going to use accessibility features. – Siracusa
I aimed for the upvote button, but they’re so tiny that my fat finger hit the downvote button by accident, and then I had to retry the action. This is what people mean when they say accessibility is for everyone, all of the time.
I have a tremor and I run into this issue on HN all the time. I need to zoom in a lot to be sure I'll hit it.
I've hidden and reported so many posts on the front page.
I can only hope that the algorithms take into account that there's a decent chance someone trying to hit a link or access the comments will accidentally report a post instead.
This may be an unpopular opinion but I'm just happy zoom works as expected on this site and don't mind if once in a while I have to correct a misvote.
I appreciate the information density, and that HN hasn't adopted some "contemporary" UI that's all whitespace and animations.
(And yes I agree the buttons are awkward)
Zoom the website in; the browser has accessibility built in, and Hacker News zooms fairly well.
Edit: I seem to be missing something, as this is getting downvoted. I genuinely cannot use HN under 150% zoom, so I thought this was a basic comment I was making.
See also https://en.wikipedia.org/wiki/Curb_cut_effect
By any chance, do you know where said the above quote? I tried searching online, could not find it.
By any chance I know, because I clipped it in Overcast.
It was on Accidental Tech Podcast #415, 26:19. The topic starts at 25:40
https://atp.fm/415
Music haptics can be a cool way to teach someone how to dance and “feel the beat”
I'm severely hearing impaired and enjoy going to dance classes - swing, salsa, etc. If I'm standing still, I can easily tune into the beat. But once I start moving, I quickly lose it on many songs; dance studios aren't known for having large sound systems with substantial bass. I don't know that this specific setup would fix anything -- it would need some way of syncing to the instructor's iPhone that is connected via bluetooth to the studio's little portable speaker. But it's a step in the right direction.
While on the topic, I can hear music but almost never understand lyrics; at best I might catch the key chorus phrase (example: the words "born in the USA" are literally the only words I understand in that song).
A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music. This has been game-changing for me. I'm catching up on decades worth of music where I never had any idea what the lyrics are (filthy, that's what they are. So many songs about sex!). It has made exercising on the treadmill, elliptical, etc. actually enjoyable.
> A few months ago I discovered the "karaoke" feature on Apple Music, in which it displays the lyrics in time with the music.
JSYK, Spotify has had this for years too; it's just under the "View Lyrics" button, but it does highlight the sentence/word karaoke-style. It used to be a "Spotify app" back when those were a genre of the service.
There's a lot of interesting things we can do with haptics, since they're relatively cheap to put in stuff. Hopefully accessibility work pushes the software and applications further along soon.
Using haptics in music to enhance rhythm perception and dance skills. Sounds really cool!
I am excited for Vocal Cues - my main frustration with Siri is how poorly it comprehends even explicit instructions.
One thing I wish Apple would implement is some kind of gesture control. The camera can detect fine details of face movement, it would be nice if that were leveraged to track hands that aren't touching the screen.
For an example, if I have my iPhone or iPad on the desk in front of me, and a push notification that I don't need obstructs the content on the screen, I would love to be able to swipe my hand up towards the phone to dismiss it.
iOS 17 Image Descriptions are quite good, but audio descriptions don't seem to work on non-Pro devices, even though text descriptions are being shown on the screen and the audio menu is present and activated. Is that a bug?
Even on Pro devices, audio image descriptions stop working after a few cycles of switching between Image Magnifier and apps/home. This can be fixed by restarting the app and disabling/enabling audio image descriptions, but that breaks the use case when an iPhone is dedicated to running only Image Magnifier + audio descriptions, via remote MDM with no way for the local blind user to restart the app.
On-device iOS image descriptions could be improved if the user could help train the local image recognition by annotating photos or videos with text descriptions. For a blind person, this would enable locally-specific audio descriptions like "bedroom door", "kitchen fridge" or specific food dishes.
Are there other iOS or Android AR apps which offer audio descriptions of live video from the camera?
All these features look amazing! That car motion sickness feature especially. Can’t wait to try it!
At least I used to put the phone away while I was in the car. Not the case now. Thank you, Apple, but I'd rather be sick while looking at your phone in a car.
This is one major advantage to Apple sharing a foundation across all their devices. Vision Pro introduced eye tracking to their systems as a new input modality, and now it trickles down to their other platforms.
I am surprised CarPlay didn’t have voice control before this though.
Is this really likely to be downstream of the Vision Pro implementation? I would think that eye-tracking with specialized hardware at a fixed position very close to the eye is very different to doing it at a distance with a general purpose front facing camera.
Typically eye-trackers work by illuminating the eyes with near infrared light and using infrared cameras. This creates a higher contrast image of the pupils, etc. I assume Apple is doing this in the Vision Pro. Eye-tracking can also be done with just visible light, though. Apple has the benefit of knowing where all the user interface elements are on screen, so eye-tracking in this on the iPhone or iPad doesn't need to be high precision. Knowledge of the position of the items can help to reduce the uncertainty of what is being fixated on.
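That last point can be made concrete with a tiny, hypothetical sketch (the function name, coordinate units, and the 80-point snap radius are all invented, not Apple's implementation): because the OS knows where every tappable element is, a coarse gaze estimate only needs to land near a target, and the system can snap to it.

```python
import math

def snap_gaze(gaze, targets, max_dist=80.0):
    # Snap a noisy gaze estimate (x, y) to the nearest known UI target
    # center, if one lies within max_dist screen points; otherwise pass
    # the raw estimate through unchanged.
    best, best_d = None, max_dist
    for t in targets:
        d = math.dist(gaze, t)
        if d < best_d:
            best, best_d = t, d
    return best if best is not None else gaze

# Three hypothetical button centers in screen points.
buttons = [(100, 40), (300, 40), (200, 500)]
snapped = snap_gaze((310, 55), buttons)       # lands on the (300, 40) button
passthrough = snap_gaze((600, 600), buttons)  # nothing nearby, unchanged
```

The same trick is how a low-precision tracker can still drive a dense UI: the error budget only has to be smaller than the spacing between targets, not the size of a pixel.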
Not the hardware side, but the software side. Implementing all the various behaviours and security models for the eye tracking and baking it into SwiftUI, means that it translates over easier once they figure out the hardware aspect. But the iPads and iPhones with FaceID have had eye tracking capabilities for a while, just not useful in UI.
The 'tracking eyes' part is different, but once you have eye position data, the 'how the eyes interact with the interface' could be very similar.
It would be amazing if it gets carried over to the Mac.
CarPlay devices aren't really powerful.
CarPlay devices (car components) are essentially playing a streaming video of a hidden display generated by the phone. CarPlay also lets those devices send back touch events to trigger buttons and other interactions. Very little processing is done on the vehicle.
BTW, if you are plugged in to CarPlay and take a screenshot, it will include the hidden CarPlay screen.
CarPlay is rendered by the phone itself, so it's not strictly a function of how powerful the car infotainment is. You've been able to talk to Siri since the beginning of CarPlay so additional voice control is really just an accessibility thing
My wife is a hospice nurse and from time to time she'll have a patient without any ability to communicate except their eyes (think ALS) - for these folks in their final days/weeks of life this will be a godsend. There are specialized eye-tracking devices, but they're expensive and good luck getting them approved by insurance in time for the folks in need near the end of their lives.
Eye-gaze devices (a tablet with a camera plus software) cost around $20K; even if this offers 1/4 of the features, it is good news for those who can't afford one.
Don't be ridiculous. Solid hardware and software combos for Windows cost a small fraction of that. The convenient and decent Tobii PCEye costs like $1,250, and a very nice TMS5 mini is under $2,000. Your bullshit was off by at least an order of magnitude.
Let's be fair and compare similar products. Do you have any examples of $2000 mobile devices that support eye tracking on the OS level? The products you mention look like they're extra hardware you strap to a Windows PC. Certainly useful, but not quite as potentially life-changing as having that built into your phone.
I wonder if this announcement had anything to do with the bombshells OpenAI and Google dropped this week. Couldn’t this have been part of WWDC next month?
Tomorrow (today in some timezones) is Global Accesibility Awareness Day: https://en.m.wikipedia.org/wiki/Global_Accessibility_Awarene...
As Terramex pointed out this is tied to a particular, relevant event.
It’s also pretty common for Apple to preannounce some smaller features that are too specialized to be featured in the WWDC announcements. This gives them some attention when they would be lost and buried in WWDC footnotes.
It is also probably an indication that WWDC will be full of new features and only the most impactful will be part of the keynote.
They do it every year at the same time. Also, it’s a small announcement, not a keynote or the kind of fanfare we have at WWDC or the September events. This does not seem calibrated to be effective in an advertising war with another company. All this to say, probably not.
I think it's more of "clearing the decks" for stuff that didn't make the cut for WWDC. I assume WWDC is going to be all about AI and they couldn't find a good spot to put this announcement. "Clearing the decks" isn't a very kind way to refer to this accessibility tech since Apple has always been better than almost everyone else when it comes to accessibility. I don't see this as "we don't care, just announce it early" as much as "we can't fit this in so let's announce it early".
As noted elsewhere, Apple always does their accessibility announcements in advance of WWDC.
Can't wait for my son to try this. He has general coordination issues due to a brainstem injury but the eyes are probably the part of the body he can better control. I'm not a fan of Apple's software and didn't have a great experience with the Vision Pro but I am excited to try it out.
Haptics in the Music app will be great but it’s not exactly “new”, considering my haptic music app has been out for several years now: https://apps.apple.com/us/app/phazr/id1472113449
One of the most important accessibility features they could bring back are physical home buttons on at least one ipad.
I am completely serious. I work with a lot of older adults, many of whom have cognitive challenges, and the lack of a physical button to press as an escape hatch when they get confused is the #1 stumbling block by a mile.
When there were physical buttons, it was very popular in Asia to enable the accessibility option to put a virtual one on screen instead, because they were afraid the physical one would break. So it was kind of useless in the end.
> because they were afraid the physical one would break
On early models it was actually quite common for the button to stop working after a year or two. The durability has since improved, but habits die hard.
It’s not physical, but if the problem is that they need something visible rather than a gesture, you can put an always-on menu button on the screen that includes a home button.
https://support.apple.com/en-sg/111794
I agree. I have a last gen base iPad with Touch ID and a home button. I am pretty tech savvy but actually prefer this form factor.
Touch ID is so drastically superior to Face ID in so many common “iPad scenarios”, eg. laying in bed with face partially obscured.
I don’t understand Apple’s intense internal focus on FaceID only. FaceID with TouchID as an option when a device is flat on the table or when your face is obscured is so much nicer.
While it's a significant step forward for accessibility, it also invites us to consider how such technologies could integrate into everyday use for all users. This could enhance ease of use and efficiency, but it also requires careful consideration of privacy safeguards.
This is the beginning of the end for the mouse — not just on phones, but on desktops, everywhere. I highly recommend everyone at least schedule a free Apple Vision Pro demo at their local Apple Store.
This is awesome and I love the UX, although I can't help but feel a bit sad that we need to always rely upon Apple and Microsoft for consumer accessibility.
It would be so great if more resources could be allocated for such things within the Linux ecosystem.
Bummer the eye tracking is iOS only. I’ve been wanting focus-follows-eyes for decades.
I share the sentiment. I've long noticed, in situations with grids of terminal windows, that focus-follows-eyes would be so much faster and present less friction than hotkeys or focus-follows-mouse.
My first thought upon seeing the Haptic Music feature is to wonder how long until they make compatible headphones and I can relive my high school years, walking around listening to nu-metal and hip-hop with a Panasonic Shockwave walkman.
Surely this could be a headphone feature that works without needing support from the player.
It's funny how the post is about enhanced surveillance technology and the text sentiment of the comment thread is overwhelmingly positive.
Surveillance technology: >:(
Surveillance technology with the word "accessibility" in the title: :)
of course it’s surveillance technology.
an iphone constantly broadcasts your location to third parties, can deduce where you work and live, understands the unique shape of your face, has a built-in microphone and multiple kinds of cameras, stores all of your conversations, stores all of your photos, stores your health information, can figure out when you go to bed.. all on a completely closed and proprietary operating system.
it’s like asking “why hasn’t anyone mentioned that we’re all using a website right now”
Very curious to see how well eye tracking works behind a motorcycle visor. Thick leather gloves, bike noise, and a touch screen and audio interface are not much fun.
Gotta make sure they get all the eyeballs, then make sure they don't get to date anybody and become homeless.
I hope this works for people with one eye. There are dozens of us—dozens!
Didn’t Mark Rober say he worked on some motion sickness stuff at Apple?
Nice to see features from Vision Pro make it onto other Apple products
This will have just enough capability limits and annoyances for you to be convinced to purchase a Vision Pro. It also makes eye tracking more widely accepted by making it available to everyone by using their existing devices.
while we're on topic of accessibility I'd like to point out the following lovely facts:
* on Android, in the builtin popup menu for copypasta, the COPY button is wedged in tightly (by mere millimeters) between the CUT and PASTE buttons on either side of it. A harmless one packed in between two destructive, lossy ones, and usually irreversible.
* the size of an adult human fingertip is no secret (and in fact 99.99% of humans have at least one!)
* Google's staff consists of humans
* Google supposedly hires ONLY "smart" people and their interview process involves some IQ-like tests
* Android has had this obvious anti-pattern for many years now. perhaps 10+?
Reminder: eye tracking means an always-on camera.
I'm not fully against it, but it's good to keep in mind the implications.
They keep announcing these bombastic accessibility features while simple things like tabbing remain frustratingly broken. The macOS “allow this app to access that” dialog supports shift+tab, but not tab.
https://support.apple.com/guide/mac-help/use-your-keyboard-l... - Keyboard navigation
https://support.apple.com/guide/mac-help/navigate-your-mac-u... - Full Keyboard Access (accessibility feature, goes beyond just tabbing between elements)
It's annoying that tabbing between UI elements is off by default on macOS. It's one of the first things I turn on with a new mac.
Quibble, but this isn't "eye tracking", it's "gaze tracking". Eye tracking is detecting where your eyes are; gaze tracking is detecting what you're looking at.
This is how I feel about "face recognition" (it should mean recognizing whether something is a face or not), but it is common to use eye tracking this way.
Where are you getting this language point from? If I look up any company selling "eye trackers", their products are all meant to track where you're looking, e.g., https://www.tobii.com/
Interesting. I picked it up working with Tobii devices a few years ago actually. I guess they updated their vocabulary.
Well then it's both?
Could be, but not necessarily. Eye tracking usually means it's tracking the eyes of one or more people in a video. Gaze tracking usually requires your eyes stay pretty steady and close to the tracker.
Imagine an eye-tracking loupe function on mobile: fit the same amount of text but bubble up the part under foveolar gaze. Save on readers* everywhere.
* readers are those glasses you can pick up at the drug store for $17.99.
It took me a couple read-throughs to understand what you meant, but yes, on-screen gaze-sensitive magnification would be amazing, I agree.
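As a rough sketch of how such a loupe might behave, assuming the platform supplied a gaze point: magnification could be full within a small foveal radius and fall off linearly back to 1x outside it. All the constants and the function name here are illustrative, not anything Apple has shipped.

```javascript
// Hypothetical gaze-loupe falloff: full magnification near the gaze
// center, linearly decaying to 1x outside a falloff band. Distances in px.
function loupeScale(distance, maxScale = 1.6, fovealRadius = 40, falloff = 120) {
  if (distance <= fovealRadius) return maxScale;            // inside the foveal zone
  if (distance >= fovealRadius + falloff) return 1.0;       // beyond the loupe entirely
  const t = (distance - fovealRadius) / falloff;            // 0..1 across the band
  return maxScale - t * (maxScale - 1.0);                   // linear blend back to 1x
}
```

A renderer would then apply `loupeScale(dist)` as a per-glyph or per-line scale, centered on the current gaze point.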
remember how google added js apis to detect active tabs and rendering intersection? complex webapps could use less battery and have richer interaction.
everyone rejoiced. nobody implemented anything useful.
soon, advertisers, google included, were selling ad packages by "viewability". lol.
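(For reference, the two APIs being alluded to are the Page Visibility API and IntersectionObserver. A minimal sketch of the throttling pattern they enable, with the decision logic kept as a pure function; the 0.5 threshold is my own choice, not part of either spec:)

```javascript
// Run expensive work only when the tab is visible and at least half of
// the element is within the viewport.
function shouldDoWork(pageVisible, intersectionRatio) {
  return pageVisible && intersectionRatio >= 0.5;
}

// In a browser this would be driven by the two APIs the comment mentions:
//
//   document.addEventListener('visibilitychange', () =>       // Page Visibility API
//     update(!document.hidden, lastRatio));
//   new IntersectionObserver(entries => {                     // IntersectionObserver
//     update(!document.hidden, entries[0].intersectionRatio);
//   }, { threshold: [0, 0.5, 1] }).observe(element);
```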
can't wait for safari to lead with "eyed" ads.
I know this is a joke, but for for anyone who was wondering,
> all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.
It's not an API that apps can adopt; it's a system pointer, like a mouse or trackpad.
eye tracking feature is interesting … for ad-tech companies
[dead]
[flagged]
how can I make sure it's off? Is it off by default?
Yes
[flagged]
my first thought regarding eye-tracking: "whose accessibility to what|whom?"
Do you have another idea beyond giving users without fine motor control access to the phone's screen?
which ads you are looking at
> “We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO.
Does this line actually mean anything? Press releases are so weird.
Sure it means something, it’s pretty low on the PR talk scale even. Let’s see:
> We believe deeply in
"We", in this case, refers to Apple, and since a company can’t believe in anything, it specifically refers to their employees.
> the transformative power of innovation
A new feature will bring forth change
> to enrich lives
In order to make someone’s life better.
So far so good, at least no synergies being leveraged for 10x buzzword transformation. Let’s see if it passes the test: There’s a bunch of new accessibility features, that certainly fits the bill for innovation. As anecdata, at least one of these will drastically change how I interact with my devices, so there’s the transformative power. I will be able to use these new accessibility features to improve both the way I work and how I privately use technology, so one could argue my life is being positively enriched by this transformative power brought about by innovation. Do they "believe deeply" in this? That is only for Tim Cook and his employees to know, but they’re one of the few companies pushing advances in accessibility past the finish line, so they might!
I always wonder why press releases include claims that a person said a paragraph or more of text that no one would ever say out loud, but this one actually does sound like something he'd say. In a keynote video, anyway.
Eye tracking is not an accessibility feature, it is an advertising optimization feature disguised as an accessibility feature.
You can say a lot of true negative things about Apple, but this is just silly. There is no way Apple is going to expose that data to underlying apps, the same way they refused to do it on Vision Pro. I'd bet a good bit of money it works the same way: a layer on top of the app that the app can't access, and from the video it looks like that's exactly how it works.
Not to underlying third-party apps, but for their own ad infrastructure it's entirely possible. It might not happen now, but down the road this feature could be used for that, and you can be certain Apple has already discussed and planned it.
Eye tracking is absolutely an accessibility feature. Just because you don't need it, and it can be abused, does not mean it isn't an absolutely game changing feature for some people.
He didn't say it wasn't an accessibility feature, just that it was disguised as one.
Just because it's not a game changing feature for some people, doesn't mean its primary function isn't advertising.
Apple are very good about not making this kind of thing available to apps that don't have an explicit reason to need it.
[flagged]
My presumption is that apps will not be able to access this data, at least without some sort of permission gate.
Indeed they haven’t, in all the years it was limited to Face ID’s attention option.
It allows people with ALS to navigate their device with their eyes.
Microsoft added a similar feature to Windows about seven years ago.