By Jane Odom, M.Ed. - AAC Language Training & Implementation Specialist
I was fortunate enough to go to the Florida Education Technology Conference (FETC) last week. This year it was in Miami, which meant 80 degrees and sunny. I had no problem leaving the cold for the beach.
I was on a mission. I love STEAM (Science, Technology, Engineering, Arts and Mathematics). I think it is such a powerful learning tool and can open so many doors for so many students. But when I have gone looking for accessible coding and robotics programs at other conferences, I have come up empty. Many of the developers were startup companies that had no idea what universal design was or why it is important.
I didn't expect to find much at FETC, but I was hopeful.
I was able to attend the Apple Playground. We were all given iPads and Apple Pencils and got to explore a variety of apps. I was interested in coding. I was actually presenting with Chris Bugaj the next day on teaching core vocabulary through coding. It was a cool presentation on teaching core concepts through motivating activities involving coding and robotics, but those activities did not allow ALL students to participate in ALL aspects.
So, I went over to the demo on Swift Playgrounds. This app was developed by the Apple Education team to help students learn to code. Like most other coding tools, it has the student drag and drop blocks of code into place and then run the program. Not all students have the fine motor skills to do this. So, I complained.
Immediately, the Apple educator went and got someone familiar with the new accessibility features. I explained my dilemma and he thought for a moment and smiled. We turned on "Voice Control" and used the "Show Numbers" command, which overlays a number on everything on the screen that can be activated. By using simple voice commands, we were able to construct the code and run the program. I was ecstatic! But, would it work with a synthesized voice?
We scheduled a time the next morning to meet with these amazing team members and try it with an Accent AAC device. Since the commands were just "tap 6" or "tap 10," it was easy to test.
And it worked – seamlessly!
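To make that concrete, here is a rough sketch of the kind of short program a student might build this way in the app's Learn to Code puzzles. The command names (moveForward(), turnLeft(), collectGem()) are the standard vocabulary from those lessons; the example is illustrative, not the exact code from our demo.

    // Each line is one code block in a Swift Playgrounds puzzle.
    // Instead of dragging a block into place, the student (or the
    // AAC device speaking for them) says "tap" plus the number the
    // Voice Control overlay shows next to that block, then runs it.
    moveForward()
    moveForward()
    turnLeft()
    moveForward()
    collectGem()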
You could also use the Accent to control many other features of the iPad by voice. The folks from Apple had never worked directly with a dedicated device and were very excited to learn about our language system and the students we work with.
I can just imagine how some students now have access to the fun and intrigue of coding. Think of a student using eye gaze, constructing their own commands to run a cool robot...totally independently!!!
I also found a couple of other companies focused on teaching coding and robotics whose tools allow text entry rather than just drag and drop. I will be testing many of these and, if there is enough interest, would love to share my results.
If you are attending the ATIA conference next week, be sure to stop by our presentation “Coding Core: Teaching Language with Coding, Robots, and AAC Devices.” We will also have a demonstration at the AT Makers event on Saturday morning in the exhibit hall.