By Team IDK | June 8, 2014
Cued Speech may be finding its place alongside digital hearing devices. Since the 1960s, Cued Speech was seen as “too oral” by signing deaf people and “too manual” by families using the verbal approach, but this could be changing.
Reasons For Using Cued Speech
Literacy and reading skills are the main incentives for families and schools to consider Cued Speech. Recent research, however, shows today’s preschoolers developing strong, emerging literacy skills by hearing language after newborn screening, early access to hearing devices, and speech services.
Easier To Learn
Cued Speech takes about 20 hours to learn during workshops, and users can master the communication system in less than a year. Children are also reported to gain two to three months’ literacy progress within just one academic year of using Cued Speech in their school environments.
By Team IDK | June 2, 2014
Bone conduction, long used in hearing aids, is fascinating to see featuring in wearable technologies. It is the physics behind bone-anchored hearing aids (BAHAs), but with the new Bonebridge implant, no physical abutment is needed, making the device easier to wear.
In time, these wearable technologies will shrink and be adapted to new contexts developed specifically for education and health.
By Team IDK | May 25, 2014
Two students at Rochester Institute of Technology, Patrick Seypura and Alec Satterly, who have hearing issues, are gearing up for connected homes with a smartphone-based alarm clock app, to be distributed via Cenify, their company.
This video shows how the app and phone might work in the home context:
A wireless version of the app-managed clock is in progress. Vibrating alarm clocks already exist, but app-based and wearable technology solutions will be the first to link with smart-home products as these reach the market.
By Team IDK | May 22, 2014
Outcomes for children receiving remote speech therapy by telepractice are similar to in-person sessions with a therapist. A report by Hear and Say on using Skype to deliver teletherapy services to remote areas of Australia was published in the Journal of Telemedicine and Telecare (read below).
Early Intervention Boosts Social Integration
Children who start auditory-verbal therapy and early intervention at infant stage have improved social integration, according to an unrelated study by First Voice, an Australia-NZ consortium which includes Hear and Say.
Infant Teaching Improves Language Scores
Notably, children who received verbal-based early teaching had an average language performance standard score of 94.73, placing them in the average range of their hearing peers (the average range for hearing children being a standard score of 85-115).
By Team IDK | May 20, 2014
With today’s classrooms using multiple digital data sources, students who read live captions are challenged by room lighting and shadows, the placement of captioning units, and fitting audio-visual media screens into each student’s line of sight.
Improving Caption Experiences
Researchers at the University of Rochester are tackling these issues, aware that even students with full hearing benefit from captions in these classrooms.
Separately, Michael Argenyi, a past medical student at Creighton University (Omaha), is to receive almost USD 500k in legal fees, after five years of pursuing the right to live captions for his medical studies at the campus.
Note: The university is not required to reimburse Argenyi for the USD 110k he paid himself for captions during his initial two years as a medical student.
By Team IDK | May 19, 2014
Talking to your baby from birth (especially when hearing devices are worn) is crucial for early language development. While most babies hear for two months before birth, babies with hearing devices may need to build up their word and sound vocabulary after missing sounds earlier on.
Chatting During Family Time
One book, Small Talk, written by parents for parents, has lesson plans to develop language in toddlers. Its writers discuss parent-child chats around digital devices but suggest this time be shared, and limited. Similarly, during TV time, parents can comment on what a show is presenting to its audience.
For parents who want coaching and strategies to teach their children to hear and talk, there is remote access to auditory-verbal therapy by telepractice from specific centres in the US. Print resources for working with children at home are routinely available online, free of charge, from these learning centres.
Language – And Behaviour
With confirmed links between a child’s language ability and behaviour, this learning window is being tapped by early-years educators and services.
Family interaction is an opportunity for everyone to learn to slow down and really listen to what’s being said. This is a very transferable skill in today’s fast-paced, diverse world with running distractions at every juncture.
Some resources to start you off:
- KidTalk (Canada) – interaction ideas and reading guidance.
By Team IDK | May 16, 2014
A parent briefing was held in Dublin on May 10th, 2014 by the national cochlear implant centre (NCIC) in Ireland to advise parents on timelines for the bilateral cochlear implant programme to roll out from July 2014.
- July 21st – First simultaneous bilateral cochlear implant surgery
- July 23rd – First sequential bilateral cochlear implant surgery
By end-2014, the goal is to complete 30 sequential bilateral surgeries.
The NCIC has a schedule to guide families on their possible timelines for assessments as the bilateral programme rolls out over a three-year period.
Full details are in the presentations here (note: these files download to your PC).
Presentations were also made by delegates from Yorkshire and Nottingham:
This information was intended to both inform parents, and to answer queries they may have had, regarding bilateral implantation for their children.
Finally, the NCIC team advised that not all new hires are in place, so the roll-out will take time, and asked families to bear with them in the process.
By Team IDK | May 13, 2014
We hear the term ‘disruptive technology’ used in consumer contexts; one very visible example is quality camera-phones superseding digital cameras. Another is Netflix moving its services online. The writer of the piece below looks at bilateral cochlear implants in the same context:
“A disruptive technology is a technology that creates a new market and may eventually disrupt an existing market, replacing an earlier technology.”
Bilateral cochlear implants are effectively disrupting hearing ability in both a biological and a social context (as the piece below shows).
This article shows us that deafness can become ‘hearing’ with devices.
Now we can say that almost every child with hearing loss should be able to hear with appropriate technology. (The exception is children without a cochlea.)
Disruption again occurs here, when (1) children access sounds not heard before digital hearing devices were available and (2) when this access to sound positively impacts their reading ability. With hearing-devices, the sound-to-letter links are heard as they read, translating to better literacy.
* Cochlear Implants – what you need to know
By Team IDK | May 8, 2014
Anyone who requests live captions or CART (Communication Access Realtime Translation) for an educational or training context knows the pain points of (1) defining your hearing issues, (2) explaining what CART is and its benefits, (3) arranging its provision and (4) establishing who actually pays for it.
One blogger, Chelle George, describes in detail the hoops she jumped through to access captions for a writer’s workshop she wanted to attend.
First, she wrote to the community college office; here is the letter:
Read: Letter For Accommodation
Copying-In On Email
Weeks after the office acknowledged her letter, there was still no update on her request. Chelle emailed the college office, copying in the college’s disability centre, to explain that an FM system did not give her 100% access to dialogue:
Read: Requesting CART, again
Eventually, the college called Chelle in to discuss the course, and after some face-to-face contact, CART was arranged for her upcoming course.
Read: CART Success
Accommodation should be easier to arrange than this, particularly with a growing number of deaf people who don’t use sign language. The ‘burden’ of advocacy is often cited; however, colleges and staff will become more informed as more young deaf students request accommodations for their own courses.
By Team IDK | May 6, 2014
For over 30 years, families used cued speech to give deaf children visual access to sounds for lipreading (speechreading in the US) and to facilitate the child’s literacy by using the family’s language for reading and writing.
Visual Cues To Speech Sounds
A very small number of children (even with today’s digital hearing devices) don’t hear certain speech sounds well enough to support lip-reading, and this is where cued speech comes in. It is also a back-up when hearing devices are off, say when a child is going to bed, bathing or swimming.
Interest in cued speech grew recently with the first music video to use the method, “Go” by Twista. Families and school districts also report better literacy when children combine cued speech with hearing devices.
Studies published in 2010 show cued speech to improve speech perception, lipreading and English-first language development in children with cochlear implants – plus the literacy benefits of one language for reading and writing.
Literacy levels in certain students in Illinois (US) were reported to improve one to two grade-levels in a school year, when cued speech was introduced.
Cued speech is not for everyone, but it should be reviewed with other options.