iOS 18 is actually exciting

I think I echo the sentiments of many when I say that although the past few iOS updates have been substantial, they lacked a killer feature to incentivize people to upgrade. In my mind, Apple Intelligence is the killer feature Apple has been waiting for. Siri’s ability to anticipate your needs and surface information when you need it most really could be a game changer. I hold on to my hardware for a long time. I went from an iPhone 4S to a 6S+ and then to a second-gen iPhone SE. Now I’m using an iPhone 14 Pro Max.

VoiceOver upgrades

VoiceOver adds a number of small quality-of-life improvements as well. For starters, we finally get a tutorial (something Google has had for years with its TalkBack screen reader). You can also set minimum and maximum values (in words per minute) to constrain the speaking rate of your preferred voice.


These WPM Minimum and Maximum steppers are located below the standard sliders for rate, volume, and pitch.

The rate slider adjusts the speech rate as a percentage, as you may be used to.

The WPM (words per minute) controls set the floor and ceiling for speech speed. For example, if the minimum is set to 50 WPM, the speech rate will never drop below 50 words per minute, even with the rate slider at 0%. Conversely, the maximum ensures the speech rate never exceeds your chosen upper limit, preventing the speech from becoming too fast to comprehend. The maximum can be set anywhere from 176 to 900 WPM, and the minimum from 50 to 175 WPM. Curiously, the Alex voice has no values for Timbre or Sentence Pause. As of the first developer beta, I could not test the equalizer button with the Alex, Victoria, Allison, or Ava voices.
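To make the interaction between the slider and the WPM limits concrete, here is a minimal sketch. Apple does not document the exact percentage-to-WPM mapping, so the linear mapping and the function name below are my own assumptions; only the clamping behavior mirrors what the settings describe.

```python
def effective_wpm(slider_percent: float, min_wpm: int = 50, max_wpm: int = 900) -> int:
    """Map a 0-100% rate slider onto words per minute, then clamp the
    result to the user's configured minimum and maximum.

    The linear mapping across the 50-900 WPM range is a hypothetical
    stand-in for whatever curve VoiceOver actually uses.
    """
    raw = 50 + (900 - 50) * (slider_percent / 100)
    return int(max(min_wpm, min(max_wpm, raw)))

# With a minimum of 50 WPM, a 0% slider still speaks at 50 WPM.
print(effective_wpm(0, min_wpm=50))     # 50
# With a maximum capped at 300 WPM, a 100% slider cannot exceed it.
print(effective_wpm(100, max_wpm=300))  # 300
```

The point of the two limits is exactly this clamp: the slider stays convenient for coarse adjustment, while the WPM bounds guarantee speech never drifts below or above speeds you can follow.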

New changes for Braille users

Apple has also introduced automatic braille screen input; the ability to change your braille input and output settings independently using new dedicated rotor options; and access to Spotlight search and the item chooser, where pressing Return starts searching for items matching the entered text. Turning on braille screen input under Settings > VoiceOver > Braille lets you perform a split double tap with two fingers (one at the bottom of the screen and one at the top) to trigger braille screen input. See this post on AppleVis if you’d like a better explanation.

Conclusion

Although I’ve been running large language models (LLMs) on my M2 MacBook Pro for quite some time, it is my sincere hope that Apple makes this technology more approachable and adds value for the average consumer.
