While assistive technology providers have historically promoted standalone user-centered solutions, much potential exists for assistive technology innovators to leverage the Internet of Things, Smart Homes, or Smart Cities innovations, as well as low-cost consumer devices offering alternative modes of communication. This session will stage a dialogue between industry and advocates to examine collaboration opportunities between the assistive technology, consumer electronics, and mobile industries in promoting Persons with Disabilities' independent living and in developing standards, tools, and platforms that enable seamless interactions with their environment.

Session Chair: David Dikter, CEO, Assistive Technology Industry Association (ATIA)


  • Julie Eshleman, Assistive Technology UX Researcher, Leonard Cheshire
  • Scott Freiermuth, Principal Corporate Counsel, Government Affairs Department, T-Mobile
  • Sandy Hanebrink, Executive Director, Touch the Future, Inc.
  • Juliana Tarpey, Product Lead, Alexa Accessibility, Amazon




OCTOBER 25, 2022
1:30 PM ET


Services Provided By:
Caption First, Inc.
P.O. Box 3066
Monument, CO 80132

This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.

FRANCESCA CESA BIANCHI: Good afternoon. Good afternoon and welcome back. We are resuming our program this afternoon, and it is my pleasure, if I may have your attention, please.
It is my pleasure to introduce the next panel discussion, New Opportunities for Assistive Technologies in an Interconnected Environment. The session Chair is David Dikter, CEO of the Assistive Technology Industry Association, ATIA. The floor is yours. Thank you.

DAVID DIKTER: Thank you, Francesca. It's nice to see everybody. I know some of you. We have a great panel here today, and my role at ATIA, the Assistive Technology Industry Association, is to work with and for the AT industry. That's companies, including mainstream companies, that make technology that folks with disabilities need, require, and want to enhance their lives and enable them to access all of the other technologies that exist in the world.
We run this small little event in Orlando that some of you may or may not have heard of, where we bring together thousands and thousands of practitioners in the space around assistive technology. Sometimes I call it the last mile for technology for people with disabilities. These are the folks who are not here, but the folks who are working to connect children, adults, and seniors with disabilities to the technology that they need.
This field has for the past 30 something years that I've been involved in it, emerged, changed, grown, and the advancement of mainstream technologies has made it even more awesome.
However, just because we have accessible websites doesn't always mean folks with disabilities actually have the technology to be able to access them. So, we still have a long way to go in getting AT out there, whether it be built into technologies, and we'll hear a little bit about that, or whether it be a third-party technology. We still have a long way to go in supporting individuals with disabilities to get the technology that they need, whether it's funding, whether it's training, whether it's figuring out which technology works best.
With that said, we're here today to talk about new opportunities in the interconnected world and in Smart Cities, and what's happening in the space to encourage more AT being built, designed, and developed, more entrepreneurs, and more companies. We have two company representatives here who are building in solutions to give a better experience connecting folks with disabilities, and maybe everybody, to their world with smart technologies and home environment technologies. I happen to be a big, strong advocate of home environment technologies. Over the past three years, I have completely made my own home a Smart Home, and it's been an incredibly eye-opening experience thinking about what that means for folks with disabilities.
With that said, I'm going to turn it over to our speakers and let them introduce themselves briefly. They're going to tell you a little of what they're working on and what they're thinking about the topic. I have questions, but if you have questions, I think that's even more valuable, because answering your questions is far more important than me just coming up with questions. I have a million of them. That being said, I'm going to introduce Julie Eshleman, a UX Researcher at Leonard Cheshire.

JULIE ESHLEMAN: Thank you so much. Hi, I'm Julie, an Autistic sociologist. We don't all do math; I'm representing the non-math, qualitative researchers. I work for Leonard Cheshire on assistive technology research, and my focus is how disabled people use technology to build the lives and experiences they want to have every day. So, our ambition was to offer some smart technology using Alexa and make the living space in care homes in the UK easier to manage, because we wanted people to have more ways to manage their time and space and have as much independence as possible, even in a place where they have a 24/7 support team on site.
So, what we found is that there are a lot more barriers to people using technology in the culture and the values than in the interaction between people and the devices themselves. So, even if we had the perfect device set up perfectly for the perfect task at the perfect time, if everybody around didn't have cultural values that supported the role of technology in that person's life, it would fall really flat. That's what I'm going to talk a little bit about today. Some of the things that we found are that in order for technology users to be successful, the support network has to believe that the person deserves independence, because not everybody believes that; that their caring role is protected if independence increases; and that technology enables activities that make daily life enjoyable. It wasn't about having the kit. It was about having certain beliefs about the role of technology.
So, I wanted to share three big ideas that I think contribute to this topic from a culture and values perspective, because there is real work to do beyond what the disabled tech user and tech provider communities are co-producing together, and that's these value shifts that we all need to understand and work towards.
My first point: I'm sure we've all heard the IBM training manual quote that for people without disabilities, technology makes things easier, and for people with disabilities, technology makes things possible. I think we have moved past a time when technology is a nicety. The world has been so successful at moving everything online and making everything remotely accessible that if you don't have the right kind of access, then you don't have access to information or to web-connected tools; you can't make doctor appointments, can't check your bank balance, and can't pay your local county taxes. Some research on disability and employment from Georgia Tech found that the disabled people who participated in their survey are actually now more employed than they were prior to the use of technology. I think the researchers were expecting a lot of negative impact from COVID, and they actually found that every one of their survey participants reported being more employed now. That's really important. That means technology is no longer a nice-to-have; it's a necessity and it's a right.
The next piece is that there are still fundamentally really low expectations for what a quality life looks like for a disabled person. Somehow, it is still separate in people's minds: ideas about what a nice life is, a nice way to spend your personal time, a nice way to engage with your hobbies and the things you are interested in, are very different for people with disabilities versus people without disabilities. There is a lot of work for us to do. I spoke with a woman, kind of a devastating interview. She was talking to me about her Alexa and how it's given her access to audio books. She was voraciously listening to audio books from the local library, and it was life-changing for how she got to spend her time. When I was talking to her about how that has impacted her, I made the mistake of asking her why she thinks she never learned about this technology before, if she's so interested in reading.
And she said, I'm disabled, and no one thinks I deserve a better life than this. Oof. She said, I'm cared for and I'm loved, but no one thinks that my life should look like this. So, I think that we have a lot of cultural work to do around our expectations for what a good life looks like, and I think there is a real opportunity here for technology to introduce a lot more equity and make people's lives look the way they want them to look.
Then the last idea I have is that if we think technology is a right, and if we think disabled people deserve to live the same daily life as non-disabled people, we have to improve the processes that educate people and get the tools people want to use into their hands and homes. We need to offer technology with more access options for more people and cultivate those changes in our own values. We need to address the physical infrastructure that's necessary for all of those connected devices to be stable and reliable tools, so it doesn't turn into a postcode or zip code lottery: if you have the connectivity, then you get to build the life that you want, if the tools are available. If you don't, you don't.
So, I think maybe my takeaway is that the perfect tool for the perfect task, set up perfectly for the user, is not enough. It's worth all of us investing in the values and the ecosystems around the interactions between people and technologies, so the interactions are meaningful and go well.

DAVID DIKTER: Thank you so much. That's a lot to think about, and I appreciate that.
Yeah. I have about a million questions now just for you, but we'll move on to Scott. Scott, I'm sorry, Scott Freiermuth, the Principal Corporate Counsel of Government Affairs at T-Mobile. Thanks, you have some things to share.

SCOTT FREIERMUTH: Thank you. A pleasure to be here at M Enabling. This is I believe my fourth time here. I've never been on the big stage. This is kind of exciting.
So, as you mentioned, I am Principal Corporate Counsel at T-Mobile. For those who can't see me, I'm wearing a blue sports coat and a black and blue plaid dress shirt, I have brown hair, and I'm sporting a beard right now, there we go. I cover a range of Federal regulatory issues for T-Mobile, but my favorite internal client is T-Mobile Accessibility, the TRS program provider in 35 states. We also provide service in Puerto Rico, and T-Mobile Accessibility is the nation's sole provider of IP Relay Service, at least for now. There are two other companies that have submitted applications for certification.
So, in thinking about this topic, assistive technologies in the interconnected environment, instead of looking too far forward at some of the technologies I think we're seeing out there, and they're super exciting, believe me, I thought it might also be instructive to talk about the OG service, one of the original assistive technologies, which is TRS, and how our increasingly interconnected environment is impacting that service.
Before I delve into it a little more, I want to say these are my comments and observations and not T-Mobile's. I'm sure I'm not breaking new ground; many in the room are familiar with TRS. That said, three letters, ASR, or automatic speech recognition, are having a profound impact on the other three letters, TRS. On September 17th, 2019, the FCC issued a declaratory ruling authorizing the use of, and compensation for, fully automated ASR to produce captions for IP CTS, captioned telephone services. Since that time, multiple providers have gained approval to provide IP CTS via ASR, and applications are coming in for IP Relay Service that utilize ASR.
In my opinion, this move to ASR was inevitable, and it is a direct result of our increasingly interconnected environment and our digitized world that we live in. Frankly, I think it was just a matter of time before ASR engines could produce high quality, accurate, and reliable captions, and I think we're there.
Many of the IP CTS providers are taking a hybrid approach, where ASR is the default technology but there are also live communication assistants that oversee the calls, and there are other providers where the user can choose between ASR and a live communication assistant.
And I would think over time, especially with artificial intelligence, which is continually enhancing and improving ASR, that we'll see fewer and fewer and fewer TRS calls handled by live CAs.
So, as I look forward, I see that ASR and TRS will increasingly converge, bringing about a tremendous disruption to a 30-plus-year-old TRS market. I would expect that the traditional siloed TRS services will converge, and we'll see more one-number solutions that provide TRS access across platforms, including Video Relay Service, IP CTS, as well as IP Relay.
In short, to wrap up, you know, we've all seen technology disruption over the last 10 or 15 years, and I think of Uber and Airbnb and all of these disruptive technologies that come in. This disruption means change, and typically consumers, or in this case TRS users, I think, are going to stand to benefit from that change. Thank you.

DAVID DIKTER: Thank you so much. Before we go on, I want to say that the change is so awesome in that, for example, ASR services are now able to be integrated into so many other realms of technology. It's not just one-directional; there is a multitude of ways to integrate. I want to continue to address that as we go forward, because that's the interconnectedness in all of this, I think.
Next, we have Juliana Tarpey, Product Lead at Alexa Accessibility. She and I have had great exchanges, and they're doing some awesome stuff on the Alexa Team. Juliana?

JULIANA TARPEY: Thank you so much for having me. My name is Juliana. I am still a woman with long brown hair but today wearing a black and brown sweater dress because I was freezing yesterday. (Laughing).
I wanted to talk today on this panel about the great opportunity presented by connectedness, and really how Alexa is so powerful at helping individuals connect with the world around them, but also to themselves: the ability to have this digital assistant that can answer your questions, take notes, and help you remember things. It's also about the self, and that's the piece that drives independence.
In order for Alexa to really have a huge impact, it's all about collaboration at every scale. So, we start with our small little company, (Laughing), Amazon, and just the huge number of teams that exist within Amazon, and we collaborate with those so that customers can reach different parts of Amazon, whether that's shopping, music, or Prime Video. But we can't stop there. We need to go beyond that, so if your preferred music provider is not Prime Music, if it's going to be Spotify or Pandora, we also connect to that. It's ensuring that Alexa is the intermediary there, simplifying your connections, so that instead of having to write things down, you can just tell Alexa to add toilet paper to the shopping list, and when you get to the store you can recall the shopping list. That's really where it becomes very powerful: connecting to all that Amazon offers and beyond.
And then yesterday I touched briefly on the collaboration we're doing within speech understanding with Apple, Meta, Microsoft, and Google, and the University of Illinois and just how important that collaboration is particularly within the realm of accessibility because we need to collect more samples of different voice patterns to be able to better understand everyone.
So that's the huge, Meta, to use that word, collaboration. But most importantly, at Amazon, we start from the customer and work backwards. It's also about working with the different partnerships we have, and then meeting individuals within those who can really teach us how they're using the technology, how they're innovating. Many times we learn from them, and we also find what the gaps are and then work to plug those.
I'll tell you about my friend Josh, who I met earlier this year at an advocacy event with United Spinal. He's been a quad since he was a teenager, just had a baby boy and got this adaptive baby carrier put onto the chair, just an amazing guy, and he was telling me all about how much Alexa he has in the house and how he uses it to operate the elevator and everything in the home. I said, that's great, when am I coming to your house? So, he was kind enough to invite me, and we spent a few hours going through the home, everything from the studio where he records a podcast, using a switch bot connected to Alexa to turn on the instruments. It's about understanding at the individual level how to push the barriers, and we'll be working on a project that we can't talk about just yet, there are always things brewing, but it's always about collaboration at those different levels.

DAVID DIKTER: Awesome. Thank you so much. That's awesome.
So, I'm going to come back. I'm going to give you my question now, but I'm going to come back to it. There is so much that Alexa can do today, and there is so much more that you talked about adding in. Going back to your comments, how is it that consumers with disabilities even know that the solutions exist to begin with? How, within the ecosystem you're working and partnering in, do you get the word out about the solutions? That's my question, and we'll come back to it; I'll let you sit on that one.
Next, we have Sandy Hanebrink and I don't know how long we've known each other, Sandy, a pretty long time, who is the Executive Director of Touch the Future. Many of you here probably know Sandy, but Sandy is all things. She is a consumer with disabilities, she is a leader in this field of accessibility, she is a developer of assistive technologies. She is a provider of assistive technologies, and she's pretty awesome. With that introduction, did I miss something important?

SANDY HANEBRINK: I'll cover it.

DAVID DIKTER: You'll cover that part. Go ahead, Sandy, tell us about your thoughts.

SANDY HANEBRINK: Thank you. Thank you for the opportunity to be part of this panel. I am Sandy Hanebrink, a quadriplegic sitting in a manual wheelchair with a Bluetooth-enabled smart drive device that is programmed through, and calculates a lot of data via, ICT devices and the app.
And I say all of this because a lot of people in technology don't think about all the interconnections of technology and assistive technology being utilized by people with disabilities whether it was intended to be utilized or not. There are some unique opportunities.
So, some of the things we work on, like David said, I wear kind of multiple hats. I'm also a Paralympian, and proud to say I still have four American records and three world records.
And the Paralympic movement is one of the biggest driving forces behind why all of you are here and why people with disabilities are respected. It has been part of the advocacy whereby, as of this past Games, for the Olympics and Paralympics in the U.S. and Australia and other countries, the medal pay for Paralympians is the same as for Olympians, so we're getting there.
And when you think about technology and assistive technology and people with disabilities, a lot of times you see devices like Alexa and how they're working to provide that independence within the home, and how people with disabilities are innovative in making their products do things the makers didn't even know they could do, or aren't sure they want them to do.
But why does it stop there? A lot of this technology stops in the home. Our payment systems in the U.S., CMS, only pay for technology in the home, like people with disabilities are supposed to stay hidden away in the home. But our advantage is we have the ADA generation now, and we have rights and laws, the Convention on the Rights of Persons with Disabilities, we have M-Enabling, right? And so we know that we're not staying in the home. We're slowly but surely integrating into your companies, and if we're not, then you're not doing it right, because you can't do accessibility without the users.
And so, what I think is important is, we have the technologies for in the home or for individual use, and as things get more mainstream, we're seeing some assistive technology that is specialized for people with unique or more complex disabilities get harder to find, and so people are getting left out. We see a lot of things here for vision and hearing, but we don't see neuro. I mean, there are a couple, you know, like Alexa has the new Tap Touch Access and things, but in ICTs we're leaving out mobility, cognition, and sensory disabilities other than hearing and vision.
And so, you know, neurodiverse people, and some of the issues unique to seniors. So I challenge everyone to think inclusively. We keep saying inclusive, but I keep saying: I'm here, but where is everybody else?
The only other people I saw in wheelchairs here were all presenting, so why aren't they coming? That's a big question. Do they feel that they don't belong? Is that because the industry isn't embracing them along with the movement? And so I see unique abilities with Alexa, and everything is based on that WiFi connection, but how do I use that when I come here? I have to set up every single device on another WiFi. Can it hold memory so that it continues to communicate and I can use that device? Because otherwise I have to use a device that costs a couple thousand dollars, uniquely made for somebody with a disability, that, once I sync it, just works. With that device, I can take a remote and push one button, or push one button and have the bellman do it, and I control the TV with my phone, and I can have the bellman plug in one outlet and now I can control the light. Because I don't know how many places, even with the ADA, I can't get to the lamp or it requires dexterity, so I either transfer in the dark and hope I don't fall, or I sleep with the lights on. Or I watch the one channel that comes up on the TV because I can't control that remote.
So how do we make these devices portable? Where I don't have one set up at home and one at work, but when I travel, when I go to someone's house, they're not so uniquely tied. How do we ensure that when the platform my assistive technology is using upgrades, (Laughing), it will still work? I do a lot of work in the health care industry, and I spoke with the presenter from the Department of Justice and asked this question. There is a lot of work that's been done on electronic medical records and portals facing the consumer, the patient. But doctors, nurses, occupational therapists, physical therapists, and others who have disabilities can't input to those systems. They're not accessible on the back end, so why is that not also a focus?
So, there are some unique opportunities, especially as we're starting to see Smart Cities in Uganda and other countries: how are we connecting what we have into Smart Cities, and why are they always doing disability as an afterthought, well, we'll fix that later?
How do we get it to where it's part of the business model? Where we don't need legislation to mandate it, because we just do it, because it's a human performance, human-first initiative. And so that's where I see the opportunities, and the only way you have those opportunities is to include people with disabilities. And true inclusion: like here, well intended, they set up the stage and everything, but there is a table here, so in order for me to be part of it, I'm in front, which is fine, I'm glad to step forward, but you know, do I have to move things out of the way so that I can be a part of it? Simple little things mean a lot.
I work on a lot of DEI initiatives in medical education, and I don't know how many OT, PT, and medical schools do only the minimum, so you come in the side door or back door. After 27 years as an alumna and as faculty, you would think that I could come in the front door of the College of Health Professions.
You know, they did a big diversity thing and have pictures and everything, and it looks like a Photoshop thing, like, are there really that many diverse people in Charleston, South Carolina? You know, I bet they're international. It looks like this beautiful thing, but there's no one with an obvious disability. And they have this video with all of these diverse people walking around campus, and they're at a bus stop that's not accessible, with a bus that's not accessible, and a sign in the background that says wheelchair entrance around back. And they want to diversify and include people with disabilities at the university, and we even, beautifully, got them to create a new position, with a disability studies background, to be the Director of DEI for the College of Health Professions, which is huge, right? But I'm thinking, who are they going to hire? Who is going to be willing to come in the back?
You know, so it says a lot; a picture is worth a thousand words. Think about your pictures and quit using these stock images that are not people with disabilities, because we know. And it's not that just because you know somebody, you know everybody. But probably. Probably. Yeah.
So, but we know. I mean, how someone sits, what someone's mannerisms are, how they're talking, whether they're a quad, I can spot a quad in a picture a mile away, right? But you're using Photoshop and canned stock images that aren't real. If you ever show somebody in a hospital-type wheelchair again, it's like, oh, come on, people, evolve. So don't use that stuff. Find people, and hopefully find them in your organizations.

DAVID DIKTER: Thanks, Sandy. So, we're at the point where I think we have some questions. I'm hoping that all of you have lots of questions as well. I'm hoping that many of you in the audience are considering how your companies could contribute, or considering starting a company, or considering developing a product, because the whole idea here is that there is a need for the continuing emergence of ATs, not necessarily just standalone: stuff that's built in, stuff that's not, stuff that's standalone, we need it all.
One of my challenges to everybody here, and my question to all of you: over a long window, over 30 years of my time, we've seen a lot of products come and go. And when you have folks with disabilities who are deciding to adopt a product, or a person they work with around their disability helps them adopt a product, the worst thing that can happen is the product stops working after a short period of time. We've seen this a lot with apps. Right? The company didn't make it, and all of a sudden the platform, iOS or Android, decides to upgrade. We know that never happens. And the app didn't keep pace and no longer works. So, the individual is either stuck staying in a time warp with their mobile device or their tablet, or they upgrade and now have to go and find a whole new application that does what they thought they needed, if one even exists.
How do you address that concern within your companies, in terms of making sure that there is legacy support for all the things that you develop, and making sure that the stuff you're making now continues to be available to the folks you're serving now? And then, Sandy, how do you advise folks to stay current with the technology being developed? Whoever wants to go first.

SANDY HANEBRINK: I can speak to a good example. Google did an upgrade to their system security where they weren't going to allow any apps that could manipulate the platform. We had a number of assistive technology devices, whether environmental controls or AAC devices or even different input devices, where that's exactly what the apps were doing, so that they simplified things for someone with cognitive disabilities or enabled somebody who is a quadriplegic to easily navigate, move around, and use the functions of the phone.
And so it was shutting apps down right and left. Right? We were able then to put a new process in place so people could notify Google that an app was intended for access for people with disabilities, here is what it does, and they looked at that and then granted approvals, and each individual app can now still be there so that these technologies can still work.
So that's where I see the collaboration happening: where you're working with the people with disabilities and users, the developers of that specialized assistive technology, and the mainstream technologies, and coming up with solutions that work. And even with some of the apps that still posed a security threat, Google worked with those developers, who didn't have the time and money to do it themselves, to put in what was necessary to make their app safe so it could be used on the platform. So that's how the interplay has to happen. And we did it sitting at ATIA in a backroom, because we pinned them down. Come with me. And then trying to find the right people within Google, because this was a different department, and the accessibility department didn't talk with this department, so they had to find the people, too, because you know the stovepipes within our own systems. They had to change how they do business in order to facilitate that. And we were able to do it in a matter of weeks, not years.

DAVID DIKTER: Go right ahead.

JULIE ESHLEMAN: I would love to add to that. One of the things we have found in our research is that there are some consumers for whom the technology they're using is so important to them and so vital for the quality of their daily life that they will persist through some challenges to make that tech work; when it's not working, they will engage with support systems or help desks and things like that. There are some people for whom, if that technology does not work one time, it's untrustworthy and they will never touch it again. I think some of it is about building trust and making sure that consumers know that there are lots of avenues to get support, because technology companies want their technology used, (Laughing). It's important to sell kit, but really you want people to get meaningful use out of the kit they've purchased. So making sure that you're building trust with consumers, so that when an update comes through they know it's not going to drop off and they should keep using it, please stay engaged with this tool, we want it to be an important tool for you, I think is something that needs to be cultivated and encouraged.


SANDY HANEBRINK: One other thing about that is the fear. Some of you may have noticed me yesterday walking around in my exoskeleton. That was developed by a small technology company that was bought out by a big technology company, and even though they do medical exoskeletons, their primary focus, their biggest reason, was industrial exoskeletons, because of course that's volume, right, it's more profit.
But because now this bigger company and their focus being on industrial, it's basically put a dead stop on the continued development and advancement of this technology, so we have to, you know, how do we keep you guys, the main industry overall mainstream industry from not stifling disability innovations?

DAVID DIKTER: I have a kind of interconnected question about this, because you actually gave some examples of how you're working with a whole bunch of organizations, including mainstream platform companies as well as some universities and consumer advocacy organizations. That's a lot of work. How do you make it happen in the context of a technology company that's trying to rapidly develop and deploy, develop and deploy? It's not develop, deploy, and then sit around and wait a little bit; it's a constant process. How do you make those longer-term relationships work in the context of your development world?

Thank you for the question. You answered some of it in the question itself: longer-term relationships. Every time you switch partners, you have to stop and start and get to know each other. What we want to build is the longer-term relationship, so that when there is a quick question, you can just pick up the phone, or if you want to set up a focus group, you can. We were able to put one together within three weeks at one point, because there was a cancellation elsewhere and because we had that depth of relationship. So I think the long term is an important part of it.
I think it's also that we're really able to focus on accessibility on my team, which we're really fortunate about, and we kind of protect our corner of the world. It allows us to sit down and say, where do we want to make progress this year? To your point, we have to continue to work within the vision segment and the hearing segment, but we've been very intentional this year in saying we want to innovate for speech and mobility, and next year we're going to do our first research study within the cognitive space as well. But it always has to begin with research. It always has to start with that.
And so one of the examples, Tap to Alexa, I hope you guys come to the booth and check it out, and I'll go on about it because it was originally developed by David, a colleague of mine who was there from the beginning. He has a nonverbal Autistic brother, and he wanted a way for him to use Alexa without speech. It was the intersection of cognitive and speech. But the reason I love this feature is that it's all about customization, and what I was saying yesterday about accessibility just being personalization for everyone. You shouldn't be bracketed into you're in this group and you're in that one, especially with intersectionality.
What you can do is adapt and create your own icons for the things you want to do: turn on the light, call mom, whatnot. You can add colors, you can change icons. There was one example where we had a participant with CP who had a switch access controller, so of course we made it switch access compatible, and we also made it so you can change the layout: instead of having 9 icons on a screen, you can have 3, so they're bigger touch targets. We showed it to two different participants who used switch. One of them said, hey, that's great, I can hit this. The other one said, yep, I can hit this, but I prefer the switch. Great. You decide. It's about giving more options so that the customer can choose. We can develop options like the bigger icons, and then someone can say, great, I don't need my switch for this, I can do more directly on this device instead of relying on intermediary technology, which has its own challenge of whether it will operate forever. That's also a really exciting opportunity.


SCOTT FREIERMUTH: Sure. I would just echo your comments about having those long-standing relationships with consumers and consumer groups. Again, I'm coming to this from a Relay perspective, but we're providing the services in 35-plus states, and we're talking decades of service here, right, so we have folks out in the field who are regularly attending events in states throughout the country, and our ear is to the ground all the time.
On top of that, we employ quite a number of people with disabilities who use the services, so we kind of have a sort of built-in focus group, if you will, and they also have their ear to the ground. So we're continually looking to improve the services as best we can. We're always making changes to our platforms as they become more app based, making sure those integrate well with the apps and that the user interfaces are as good and as accessible as they can be. It's that continual process of reaching out, understanding, keeping your ear to the ground, listening, taking feedback. Sometimes it bubbles up through complaints, but that feedback is good and always makes it so that we're improving our services.

DAVID DIKTER: Awesome. I want to turn it over to the floor, because I'm sure there are some questions out there. Do we have a mic that's running around the room? If you have a question, let us know. Somebody has to have a mic. Here we go. Right here, in the middle of the room. If you could say your name and organization.

AUDIENCE MEMBER: I'm Alex Dunn with Enable to Play, and my question to the group is: in a more interconnected system, especially with AT, how do you balance resiliency when these many connected integrations, or even just Internet connectivity or reliability, become a challenge? To paint an example: in a Smart Home or Smart City, for me as an able-bodied person, if my Alexa doesn't understand what I'm saying to turn off the lights, no problem, I can get up and go hit the light switch. But how do we create an environment where we're able to create new technologies that solve problems without creating a hard dependency on them over time?

SANDY HANEBRINK: For some people it is a dependency; it's how we access the world and what makes things possible. What I say is: how do we make these technologies more reliable, so they function the way we need them to function? And I think that goes to attitudes, too: yeah, well, if it doesn't work, if it only works half the time, I can still get up and do it. It's not front of mind that that's not everybody's life experience. So I would say that we need to change our mindset and ensure reliability. 70% functionality is not okay just because it's a valuable tool; it needs to be 99.9%. And that all comes back to the biggest barrier we face every day as people with disabilities: attitude.

JULIANA TARPEY: I definitely agree with you that we need to work on that reliability, but on our team we also challenge ourselves, because we know we're not there yet. And there are also situations where the connectivity may not be there.
So let me give you an example. Forgive me for plugging Tap to Alexa again, but it is awesome. When we were designing this and moving it onto a portable device, we were thinking about a lot of kids with speech disabilities going to school with their tablet. Sometimes they get to school and they're not allowed WiFi, because the tablet could connect them to gaming and they would not pay attention in class.
So there was that challenge, where it's not even about the service being there; it's the contextual environment. And so we took a hard look at what we were offering, and within it we built a speech generator. We said, hey, we can do this part offline; we don't need connectivity for this. It was a bit more work on the technical side, but we ended up building a hybrid product: the speech generation is available offline, so they can use it in school and ask questions in class, while the other pieces with Alexa connectivity do require it. So I think it's also about those contextual moments: what can we offer offline, in case there are different environments or in case something fails?
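[Editor's note] The hybrid pattern described here, local speech generation that works offline with cloud-backed features layered on when connectivity is available, can be sketched roughly as follows. All names and icons are hypothetical illustrations, not Amazon's implementation:

```python
# Hypothetical sketch of a hybrid AAC-style feature: speech generation is
# local and works offline, while cloud-backed actions (e.g. smart-home
# commands) require connectivity. Illustrative only, not Amazon's code.

def handle_tap(icon: str, online: bool) -> str:
    """Route a tapped icon to an offline or online capability."""
    offline_phrases = {
        "greet": "Hello, everyone.",
        "help": "I need some help, please.",
    }
    online_actions = {
        "lights": "smart-home: turn on the lights",
        "call_mom": "communication: call mom",
    }
    if icon in offline_phrases:
        # Speech generation is local, so it works with WiFi disabled.
        return f"speak: {offline_phrases[icon]}"
    if icon in online_actions:
        if not online:
            # Degrade gracefully instead of failing silently.
            return "unavailable offline: try again when connected"
        return online_actions[icon]
    return "unknown icon"
```

The point of the sketch is the routing decision: capabilities that can run locally should never depend on the network, and network-dependent ones should fail in a way the user can understand.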

I think there are a lot of interdependencies that we have to pay attention to. Again, like we keep saying, it's not just between the user and the tech, or about the tech being reliable. There is only so much that Alexa and Amazon can do in back-end systems if there are places without good WiFi infrastructure or data coverage. There are clearly a lot of interdependencies that all have to go very, very smoothly, or we have to emphasize the importance of the support network: making sure that an understanding of how important this tool is, and that it works, is embedded in the values of the people in somebody's support system, so they can come alongside and support the troubleshooting and get the device back online, because we know it's important and not just nice to have.

DAVID DIKTER: Thank you.

SCOTT FREIERMUTH: Just to add. I'm not sure this is a great answer to your question, but I think this does center around usability. Again, I'm the lawyer, and I geek out on law and policy and those sorts of things, but the FCC, many years ago with Section 255 and the CVAA, sort of broke things down between accessibility and usability, right? I think a lot of the focus has been on accessibility, on making things accessible.
Sometimes the usability comes second behind that, so that's been a battle that I think we all face internally with our companies and otherwise. You can build these fantastic things that are super accessible, but if nobody knows about them, can discover them, can use them, then you're not really fulfilling the purpose of the mandate under Section 255 or the CVAA. It's a constant battle and a constant struggle to get both of those right.

DAVID DIKTER: There are so many solutions that exist today that people don't know about that could change things. But to go back to your comment about resiliency, about folks with disabilities tolerating some of these faults: I know my own behavior when my WiFi goes out, and I can't repeat the words I use when that happens. (Laughing). Thank goodness now for 5G hotspots on the phone, which I can figure out how to make work, but that's not so easy for everybody to implement. Right? They have a phone, and maybe that would be the backup solution, but it's about having backups for the challenges.
Your question made me think about Smart Cities and automated bus notifications. You're at a bus stop, and I live in a city, so one corner can have three, four, five different bus routes that stop there. How do I know which bus is coming and going? If I'm blind and I'm there, I can ask other people. But what if there are no other people, and the bus is waiting for somebody to wave it down, with all the intricacies of catching buses? That notification system becomes something people depend on, so how do we plan for the inevitable break in the system? It's definitely not going to be perfect. How do we help folks who are leveraging these systems to be resilient when they do fail, and to get those solutions quickly?
And one more question from the audience? Who has got a question? There we go. Right up front here.

AUDIENCE MEMBER: Yes. Paul Burdon with Waymap, and my question is really for Juliana. You and I talked about how much I love Alexa. I used to work with people with neurological conditions to enable Alexa to improve their day-to-day life. For one guy with very advanced ALS, Alexa ran the computer, books, music, the telephone. I said, you're the most Alexa-enabled guy that I know. If you could have Alexa do anything else for you, what would it be? He said, that's easy: I want it to be able to raise my head in my hospital bed.
He had just a little movement in his neck, and that was it. So I found a guy who built a box, and it raised the hospital bed, his feet and his head.
So, my question is this: with all of the skills being basically free, how do you at Amazon incentivize and reward people to build skills that help people like that, as opposed to just skills that play music or whatever?

JULIANA TARPEY: That's a tough question. I can't answer for Amazon as a whole. I can only answer within Alexa. Let me think about that.
Yeah. Our third-party developers are a huge, huge part of Alexa, an enormous part of Alexa. What we've tried to do is offer a lot of resources, a lot of training, and also the ability to ask the teams questions. But the thing that probably comes most to mind is the work we do with the Alexa accelerator. Every year we have a number of folks innovate with accessibility, because as you all know, accessibility drives innovation. The accelerator allows startups to present ideas and then be sponsored to build those ideas and work with people from Amazon; some of my engineers go and work with them for a period of time to drive that forward.
But I think the true innovation happens at a more micro level, so it's more about what we're giving those third-party providers in terms of access to build. Something we've started updating, too, is our accessibility guidelines: not just for apps built for accessibility use cases, but making sure that all the other apps are accessible, so that people with disabilities can use them. It's very easy to get excited about this part of the innovation, the new, the directed, the built-for-us, but I think the other part is almost as important: anything that's being built should be accessible and usable by anyone.

DAVID DIKTER: Can I address that too? I'm sorry, I'm the moderator, but I have another answer for that. It's a 15-year-old who is making those modifications, taking what's built into a platform that permits them. It's actually Carolyn Philipson who is figuring out that somebody has a need and figuring out how to adjust Alexa specifically for the unique needs of an individual, which is what you're talking about.
That's the essence of assistive technology: technology not for the masses, but technology that specifically meets needs that are unique to somebody with a disability. It's not mass-production technology.

We have whole industries of occupational therapists and speech-language pathologists; that's what they do. People don't have money to buy these devices, so if we can get something low cost and utilize it to make what I call AAC light, then as it continues to develop, I can guarantee you the OTs and speech pathologists are going to turn what Alexa does as a speech generating device into a very advanced device. I do stuff with Alexa at my house that they probably were thinking, what the heck. I can turn my oven on and have dinner started because of how I've set things up, and I can set the alarm clock on my phone to trigger my Alexa, which then triggers my coffee to go on, because I'm always trying to figure out ways that help me and the people that I serve. Right.
And so, now that we have other ways to input, it's like, woohoo, I don't have to figure it out, because it's gone mainstream. It's standard now.

JULIANA TARPEY: That's the other thing we're trying to do: you don't have to be a developer to create something. Right. Two things come to mind. One is Blueprints, which is a kind of hacky, simple way for you to create your own Alexa behavior or whatnot. But even easier than that is routines, which I'm sure you use a lot. Folks tend to use Alexa for music, for timers, weather, occasionally lights, and that's just the tip of the iceberg; there is so, so much more. We try to get folks to use routines, which is if this, then that: if I say good morning, turn on the light, tell me the weather, start the coffee machine. And there is also something we're working on for next year, which is adding more triggers and easier setup of routines.
For example, turning audio into visual. Right? If you're deaf and a dog is barking, can the light flash blue? And turning visual into audio as well. So how can we create shortcuts so that routines are easier to make and easier to set up? It's all there, but it requires a little bit of work, and especially as we look at my sister team, which serves the aging demographic, folks are scared to try these things even though they're right there. So we're always trying to make it easier as well.
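[Editor's note] The "if this, then that" routines described here can be modeled as a trigger mapped to an ordered list of actions. The sketch below is a hypothetical illustration of that shape, not Alexa's actual API; the trigger phrases and actions are taken from the examples in the discussion:

```python
# Hypothetical model of an "if this, then that" routine: one trigger
# fires an ordered list of actions. Illustrative only, not Alexa's API.

routines = {
    "good morning": [
        "turn on the light",
        "tell me the weather",
        "start the coffee machine",
    ],
    # An audio event becoming a visual cue, as in the dog-barking example.
    "dog barking detected": ["flash the light blue"],
}

def run_routine(trigger: str) -> list[str]:
    """Return the actions a trigger fires, in order (empty if unknown)."""
    return routines.get(trigger, [])
```

The design choice worth noticing is that the trigger need not be speech: a detected sound, a tapped icon, or a schedule can all key into the same mapping, which is what makes routines useful across disabilities.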

SANDY HANEBRINK: We like not having to hack it now.

DAVID DIKTER: We're coming to the end of our time. I want to clarify what I mean by 15-year-olds. At ATIA every year, we have a Maker Day. It involves turning cardboard into stands that let somebody position their device in a way that works for them, but it also involves a whole bunch of high school and college robotics teams from all over the country who, as part of their robotics work, have a community service component. They're going out and working with local folks with disabilities, doing Alexa projects or building communication boards for a playground, or something of that nature.
But again, it kind of goes back to this issue of collaboration, integration, and partnerships. AT companies are great partners for IT companies or tech development firms that are trying to understand how things work for people with disabilities. You can go to some folks with disabilities, but you need to understand the full range of the disability arena. There is not just one type of blind person; there is a whole culture and community of folks who are visually impaired, who have low vision, who are blind, and who potentially have other disabilities that you need to understand when you're developing your technology. If you don't, then you're leaving out a whole bunch of folks, and that's probably not your intent.
So, the collaborations that you have that we all have been trying to build for forever, need to continue because they're really important. Did you have one other thing you want to add?

JULIANA TARPEY: I just want to add a final thought from our end in terms of that collaboration with customers, something we're really excited about. A number of folks from our partnerships are going to be part of the Artemis 1 Mission: we're actually sending Alexa to space, and they'll be communicating with Alexa from Ground Control. It's really, really exciting, because with this project we said, we need people with disabilities in that room. We need them to inform this project, which is eventually going to shape how astronauts communicate up and down, to space and to the moon. Having people with disabilities be part of that communication is then going to allow some of those astronauts to be people with disabilities, because we don't want it to be that we designed it without them and then, oh, now it won't work. We're really, really proud of that. We have someone from United Spinal going there in a chair who is really awesome, and he's going to be able to say, turn the cabin lights on, and how cold is it up there, and all kinds of things. Even as we go out of this world, we're keeping our customers front and center of everything.

DAVID DIKTER: Thank you so much for that. That sounds pretty cool. Thank you all for being here and joining us. Thank you to the panelists for joining us.
Thanks a lot.

FRANCESCA CESA BIANCHI: Thank you very much, David, and to the panelists for very, very interesting insights. Thank you. We will now have a short 15-minute break and resume here in this room at 2:45 p.m. sharp with the next panel discussion, on generational and intersectional digital gaps and why they matter. Thank you.

