I read this post http://behindthecurtain.us/2010/06/12/my-first-week-with-the-iphone/comment-page-1/#comment-9947 by Austin Seraphin last week. I wanted to blog about it at the time, but my preparations for work in Prague prevented me from doing that. I did comment on Austin’s blog at the time though:
Thank you for sharing this.
I am a sighted iPhone user and reported on these functions when they were introduced on the 3GS. I thought they were tremendous. Reading your commentary on actual use, and understanding more deeply the pleasure these functions give, has been enlightening.
I also noted that many others commenting were, like me, very impressed with his post. So thank you Austin once more, for that food for thought.
Any regular readers may remember that I blogged about the iPhone 3GS’s VoiceOver function back in January http://eduvel.wordpress.com/2010/01/04/iphone-3gs-accessibility/. In that post, I cited another blind user’s view of the then new iPhone accessibility functions. http://www.nillabyte.com/blog.php?b=280 Since January there has been an upgrade to both the phone and the operating system. What’s more, there has been a new and very popular device launch: the iPad is now amongst us, and several of the updated features are available on that too.
iOS 4 provides new and extended accessibility features, which means the 3GS’s capabilities are improved with the addition of, for example:
* Touch Typing – here, the user just draws their finger across the keyboard to hear each letter read out. Once the letter needed is found, the user simply lifts his or her finger to select it.
* Bluetooth wireless braille displays are supported too. Just pair up any one of 30 devices, choose one of the 25 supplied braille language directories and off you go!
And not all improvements are for those with sight impairments or blindness. The deaf or hearing impaired can also be helped by using features such as:
* FaceTime – which provides better access for the deaf, with the new ability to communicate by phone using sign language
* Optional mono audio – which, for users with limited hearing in one ear, routes both left and right channels into both earbuds
But these in-built features are not the sum of what iPhones can do to help learning become inclusive. There are many Applications [Apps] that do similar, sterling work.
A selection of these:
* SpeakIt http://appshopper.com/utilities/say-it (£1.19) is a great way of vocalising text on the iPhone. Cut and paste (or type) text into the window and it will read it back to you in one of several voices. The resulting file can even be emailed!
* iConverse http://www.converseapp.com/ (£5.99) ‘iConverse is an educational tool designed for young children and individuals with communicative disabilities, and also toddler-aged children who have yet to master language.’ At its simplest, you upload a photo, annotate it and the software reads out the annotation.
* Google Voice – http://www.google.com/mobile/google-mobile-app/ (free) – simply tap the microphone icon and speak your search term. No fiddly spelling, and it’s pretty accurate.
* SoundNote – http://soundnote.com/ (£2.99) is an iPad App. It is a note pad. However, the killer feature is that it also records audio. Lots of audio. So a learner (let’s say a dyslexic learner) can make brief notes as he or she listens to the speaker (teacher?) whilst recording the entire class. The recording can then be saved.
What are you using? How are you using it?