Last week, I gave an invited presentation at my local Mac User Group meeting (Fresno Mac Users Group, FMUG).
The MUG learned about me through one of my former colleagues, who had seen me give an untethered presentation (about the biology research my undergrad and grad students and I conduct at California State University, Fresno) using my iPad, my iPhone, and Explain Everything. You can view that presentation here if you're interested in learning more about mutations; it's geared to a lay audience:
I hadn't known we had a MUG in central California! I was elated to be asked, and it was great to be back in the company of local die-hard Apple evangelists. When I was in high school in Oregon, I took regular advantage of the local MUG (in Corvallis), and I vividly recall borrowing my mom's car to drive there monthly to rub elbows with the other Apple geeks. My strongest memory of the CMUG meetings is the one where I purchased (and had installed) 4x 1 MB RAM chips to replace the 4x 256 KB RAM chips in the Mac Plus I had inherited from my parents when they upgraded to a 9500.
Yes, my family are famously slow adopters of new tech and/or advocates of the "drive it until it dies" approach (cars in my family tend to be on the road for 20+ years before they get replaced). I'm the same way, which sometimes makes me a less-capable advocate of technology because I'm often behind the curve on having the latest and greatest, but that's another story.
I had three main goals for my presentation to FMUG (an overview video, produced in Clips, is at: https://youtu.be/rYo1Zc6niAk):
Goal 1. To demonstrate how Apple builds accessibility features into its hardware and software. As an example, I used these screenshots, which show how inaccessible (to us red-green color-blind folk) color pickers were as recently as System 7:
The introduction of the crayon color picker may have seemed childish, but the fact that the colors were accompanied by names was a huge deal to those of us who rely on that sort of information so that we color grass green and not brown or red!
Here, just for fun, is a current version of the Apple color picker (left) and a color-blind simulated version (right). To me, the two images look identical. If you aren't color-blind, this gives you an idea of how I view the world and why color accessibility is useful.
For more information, please refer to this resource for choosing color-blind friendly colors to use in media you produce!
Goal 2. To broadcast information about Fresno State's "DISCOVERe" program, launched in 2014, which is a 1:1 tablet program (though not exclusively Apple, and probably inevitably moving to a BYOD approach). Participation is entirely voluntary: faculty join the program and then design at least one course that uses mobile devices to increase student engagement and success. Over the past three years, we have quadrupled both the number of DISCOVERe courses and the number of students enrolled in them, and we now have almost 200 faculty fellows, up from an initial cohort of 32 (of which I was a member). Most importantly, I wanted to share with FMUG (and you, too!) that in a single semester last year, DISCOVERe faculty who chose to adopt open educational resources, like iBooks, saved our students a total of over $117,000 in course materials costs!
Goal 3. To share my love of Clips. The highlight of the presentation was a live demonstration of Clips. I first showed a few examples of Clips I had produced, after which questions from the audience immediately turned to the captioning (which I pointed out was a critical component of accessibility).
Not surprisingly (to those with Clips experience), the biggest audible *gasp* in the room came when I gave a live demonstration of Live Titling. Many audience members had previously asked how I created the captions, but I held off on answering those questions until I got to this point!
After I finished demonstrating Clips, I was surprised to hear that most of the audience hadn't heard of the app before, and that, having now seen the basics, they wondered why Apple hadn't made a bigger push to promote it! I told them that this was one reason I was there, and that I would continue to be a resource for them as they adopt Clips.
In every presentation I give, I want to make sure the audience leaves with information or a skill of immediate value. So, that night, I asked all of them to brainstorm how they might use Clips in their own lives. We had a great conversation about using the app to (for example) send multimedia greetings to family members via social media, and one kindergarten teacher asked me whether it would make sense for her to record herself reading picture books for her students.
I replied that Clips would be reasonable for short books like picture books, but I emphasized a key benefit of Live Titling in videos: this accessibility feature is not just for the deaf; non-native English speakers (and those, like children, who are learning to read and write English) really benefit from hearing a word and seeing it written at the same time!
By the end of the meeting, I had demonstrated another way to use Clips: as an aid to newcomers (to any event or organization). Clips, especially in conjunction with time-lapse video shot with the Camera app, is great for showing people how to navigate from one place to another, as I did here for students at the start of the semester to show them how to get from our lecture classroom to my office on campus:
This could work just as well for directions to a church, to your house for birthday party invitees, and so on. Here, I used Clips to provide FMUG with a resource to share with new members to help them find the FMUG meeting space!
I was honored to share the power of Clips with my new Apple allies in Fresno and to build new connections between my campus and community. Again, you can view my FMUG Clip here: https://youtu.be/rYo1Zc6niAk