iPhone

Set your iPhone to open a tuner or take a screenshot when you tap the Apple logo on the back

I have been seeing this tip gain popularity with teachers online, so I feel obligated to share it here:

You can program your iPhone to perform a nearly endless list of actions by double- or triple-tapping the back of it. Go to Settings-->Accessibility-->Touch and then scroll down to the option called "Back Tap."

Alternatively, you can swipe down in settings to reveal a search bar and then type in "back tap."

You can program a tap of the Apple logo on the back of your iPhone to do tons of system actions, like going home, muting your phone, taking a screenshot, or launching Control Center.

The Touch options in the accessibility settings.

Setting a double and triple-tap.

There are lots of options!

You can also choose a Shortcut to launch. And Shortcuts can do anything from launching an app to running JavaScript. So you can imagine the possibilities...
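To give you a taste: one of the available actions is "Run JavaScript on Web Page," which runs a script against a Safari page passed to the shortcut and hands the result back to the actions that follow. Here is a minimal sketch of the kind of script it can run (the completion() callback is how that particular action returns its output; the rest is just an illustration):

```javascript
// Collect every link on the current Safari page and
// hand the list of URLs back to the shortcut.
const links = Array.from(document.querySelectorAll("a"))
  .map(a => a.href)
  .filter(href => href.length > 0);

// The "Run JavaScript on Web Page" action expects completion()
// to be called exactly once with the script's result.
completion(links);
```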

Personally, I have a double-tap set to reveal Control Center and a triple-tap set to open a new note in my note-taking app, Drafts.

To open a specific app, you will first need to make a Shortcut that performs the “Open App” action and then select that Shortcut from the available options in the Back Tap settings. To do that, open the Shortcuts app (pre-installed on every iPhone or available from the App Store on older versions of iOS).

Once in Shortcuts, create a new one with the plus icon in the upper right. Name your shortcut if you want (by pressing the three-dots “More” button), and then press “Add Action.” There is an overwhelming number of options if you are unfamiliar with Shortcuts, so just use the search and look for the action called “Open App.” Select this action from the search results, and a block will appear with a blue “Choose” option where you can pick the app you want it to open. Choose your tuner of choice.

Once saved, this Shortcut will be available as an option in the Back Tap settings.

***Note: The Tonal Energy app actually allows you to set up Shortcuts that jump to specific places within the app like the Analysis or Metronome section. You can find this in the TE settings. It will save you a bunch of extra taps.

Creating a new Shortcut.

Search for the Open App action.

Tonal Energy allows you to make Shortcuts that launch to specific parts of their app in the settings.

METT Episode #22 - Teaching Hybrid, Composing Music, and Finding Balance, with Tyler S. Grant

Syncing Peloton Bike Workouts to the Apple Watch Activity Rings

My wife and I started using the $12-a-month Peloton service, without the bike, early this year. It is full of engaging, thorough, and motivating classes that span everything from yoga to strength training. I recommend it. Even if you don't have an interest in the bike, it is still a viable service for staying physically active at home. That said, we did become interested in the bike through this service and have been owners since around February.

Go to the Apple Health settings of the Peloton app to begin setup.

After my bike workout, I go to this screen of the Peloton app to review my workout.

One of my favorite features of the bike is that it syncs my activity to Apple's health ecosystem, where I also track sleep, water, and numerous other fitness metrics.

The newer and fancier Peloton bike uses Apple’s GymKit technology to sync metrics only the bike knows (like distance) with metrics only the Apple Watch knows (like heart rate) and then immediately records the ride as an Apple Watch workout.

I admit I am slightly jealous I don’t have this version, but you can get the same results with a third-party heart rate monitor. All I do is wear the monitor on my arm when I do a bike ride, and then open the Peloton app on my phone when I am done. The Peloton app syncs my ride metrics to the Apple Health app, which then syncs the fitness data to the Fitness app on the iPhone and Apple Watch, ensuring that I fill my rings.

Post workout, I review my ride in the Peloton app and then open Apple Health to see the data tracked in that workout alongside other things I am tracking, like diet, water, meditation minutes, and blood oxygen.

Next, I open the Apple Fitness app. Sometimes it takes a few minutes for the rings to show up here, and then later on the watch, but they always do.

The best part is that I can charge my watch while I ride, which means I can wear it to track sleep throughout the night using AutoSleep.

Peloton also has an Apple TV and Amazon Fire Stick app now. Great for doing yoga in the living room. I track these workouts on my Apple Watch as usual by starting the appropriate workout type beforehand.

Stay healthy out there.

Post Sticky Notes to Your Home Screen

Speaking of widgets on the iPhone home screen, this is one that I have a feeling a lot of people will appreciate. 

Sticky Widgets allows you to post sticky notes straight to the home screen that come in different colors and say anything you want. The experience is as simple as you can imagine.

Sure, I advocate for using proper note-taking and task management software, but there are times when you just want to write something down directly and trust that it will be plastered in front of your eyes indefinitely.

Check out a full review from MacStories...

Sticky Widgets Brings Simple Sticky Notes to Your Home Screen:

Sticky Widgets enables placing sticky note-style widgets on your iPhone or iPad Home Screen which can be modified simply by tapping on the widget. It’s utility that’s such an obvious fit for widgets, I’m surprised I haven’t seen a hundred other apps doing the same thing.

New Software Updates from Apple: Exploring Widgets!

iOS 14, iPadOS 14, watchOS 7, and tvOS 14 came out a few weeks ago. I have a lot to say about these updates, but today I wanted to write about widgets for a moment.

Widgets are catching on as a significant feature amongst the masses. As someone who rearranges the apps on my home screen at least twice a week, I can tell that widgets are going to add a lot of excitement (and anxiety) to my life. I have been toying with them since July, when this software entered the public beta, and I am far from settled.

Here is where I have landed for now…

My Today View (left), first home screen page (middle), and second page (right).

Page one (middle image) contains my most-tapped app icons. This will be a hard habit to break, but I find lots of value in having upcoming calendar tasks and weather permanently on my most visited screen. Weather Line and Fantastical have the best small-sized widgets, in my opinion. Even the smallest widget size takes up the space of four app icons, so a widget needs to be beautiful and information-dense to be worth sacrificing four apps.

I didn’t think I would want weather on this first screen, but now that it is always visible to me, I don’t see how I could live without it. The Weather Line widget is awesome because its user interface depicts the weather on a line, almost like a chart. It even manages to fit an hourly rain graph into its small space when it is raining out. Not even my second favorite weather widget, Carrot Weather, does that.

The Today View (left image) is where I keep Siri Shortcuts and the older, legacy style widgets from iOS 13. As much as I like the newer widgets’ look, the older style widgets are interactive. I keep OmniFocus, Timery (for time tracking), Streaks (for tracking daily habits), and Waterminder (for quickly logging water) all on this screen because I can tap right on the buttons to act on these apps without the widget needing to launch into the app.

I am continually playing with page 2 (right picture). I like it to be mostly another grid of tappable apps, but I am experimenting with various widgets here. I think what I have settled on is to have the Maps and Notes app widgets stacked on top of each other at the top, and then to use the Siri Suggestion widget, which shows me two rows of apps that swap in and out throughout the day based on my phone’s predictions of which apps I want to use in which contexts. The image above shows some other widgets I am experimenting with, but I think I prefer having more app icons there.

My iPad home screen widgets.

On the iPad, I keep: calendar, weather, notes, Apollo (a Reddit app I use to keep up on the latest news about my interests), Siri Shortcuts, and the Files app for launching into recently modified files. 

On both my phone and iPad, I am waiting for an OmniFocus widget to track my tasks. Even though I like the one in the Today view where you can mark the tasks as done right from the widget, I think I might want to have my next few upcoming tasks permanently visible on page one.

9to5Mac.com and MacStories.net have been two great websites to follow if you want to stay up on which apps offer widgets.

METT Podcast #16 - Master Your Virtual Teaching Tech, with David MacDonald

Thanks to my sponsor this month, MusicFirst

David MacDonald returns to the show to talk about the hardware and software in our virtual teaching setups. Then we speculate about touchscreen Macs and consider how Apple's recent App Store policies might impact the future of creative professional software on iOS.

Topics include:

  • New Zoom features for musicians and teachers
  • David and Philip Rothman's new podcast, Scoring Notes
  • Using Open Broadcaster Software to level up your virtual teaching
  • Routing audio from your apps into Zoom and Google Meet calls
  • Teaching with Auralia
  • LMS integration with third-party music education apps
  • Using MainStage and Logic for performing instruments into virtual classrooms
  • Touchscreen Macs
  • Apple's App Store Policy

Show Notes:

Where to Find Us:
Robby - Twitter | Blog | Book
David MacDonald - Twitter | Website | Blog

Please don't forget to rate the show and share it with others!

Subscribe to Music Ed Tech Talk:

Subscribe to the Blog

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS

Today's episode is sponsored by MusicFirst:

MusicFirst offers music educators and their students easy-to-use, affordable, cloud-based software that enables music learning, creation, assessment, sharing, and exploration on any device, anywhere, at any time.

MusicFirst Classroom is the only learning management system designed specifically for K-12 music education. It combines the flexibility of an LMS with engaging content and powerful software integrations to help manage your students’ progress, make lesson plans, and create assignments.

And for younger students, MusicFirst Junior is the perfect online system for teaching elementary general music. It includes a comprehensive K-5 curriculum, hundreds of lessons & songs, and kid-friendly graphics to make learning and creating music fun!

Whether you’re teaching remotely, in-person, or in a blended learning environment, MusicFirst will work with you to find a solution that fits your program’s unique needs. Try it free for 30 days at musicfirst.com.

David’s teaching setup.

My teaching setup.

…From far away.

Automating Band Warmups, Teaching Auditory Skill, and Managing My Classroom… with Solfege Bingo

Intuition, I realized, was the certainty with which a skill instantly worked on the basis of rational experience. Without training, intuition does not develop. People only think that intuition is inborn. If intuition unexpectedly reveals itself, however, it is because unconscious training has been amassed somewhere along the way.
— Shinichi Suzuki, Nurtured by Love

What is Solfege Bingo?

Solfege Bingo is a game for young music students. You can play it in class to help develop audiation, pitch recognition, and solfege.

The book comes with a series of bingo cards, each of which has a three-note solfege pattern in each square: “do re mi,” “fa sol do,” etc. The book also comes with a CD that has many different recorded examples of a singer singing these patterns, with space in between each pattern. Students match the three-note patterns they hear with the ones on their card until they get bingo.

The CD features a second set of recorded examples in which a clarinet plays the patterns so that the students must recognize the patterns by ear, not by syllable.

I first learned about this series as a student teacher, when the choir teacher I worked with used the tracks as warm-ups and ear training examples to familiarize her ensembles with solfege. On the recorded examples, the space between each pattern is equal to the length of the pattern itself, so you can use them as a call and response: the recording models the pattern, and the choir sings it back.

Transposing the Tracks for Bands and Adding a Drone

A few years ago, I got the idea to transpose these recordings into band keys using GarageBand. I added a clarinet drone on the key center (using one of the software MIDI instruments) to help students hear the relationships of the pitches not only to each other but also to the tonic. 

In band, I start the year by implementing these play-along tracks during warm-ups, starting in concert Bb. I first use the vocalist track and have students sing back. Then they play it back, with brass buzzing on mouthpieces. Then with brass on instruments. (The repetition of this has the side effect of reinforcing fingerings.) Eventually, once I feel like they have begun to internalize the pitches, I play them the clarinet version of the recording. The clarinet drone rings through my entire track, which takes the place of my usual Tonal Energy Tuner drone.

It sounds like this when it’s done…

Excerpt 1
Excerpt 2
In GarageBand, I dragged in the audio file I wanted to edit, creating an audio track. Then, I created a second software instrument track, selected clarinet as the instrument, and held out the note Bb on my MIDI keyboard for the drone. Double-clicking an audio region reveals a transpose option on the left. Dragging the slider moves the pitch of the selected region up or down by a semitone.
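If you are wondering how far to drag that slider, it is just a matter of counting semitones between the key of the original recording and the band key you want. Here is a rough sketch of that arithmetic in JavaScript (purely illustrative; you still have to determine the original key by ear or from the book):

```javascript
// Semitone positions of the twelve pitch classes, C = 0.
const PITCH = { "C": 0, "Db": 1, "D": 2, "Eb": 3, "E": 4, "F": 5,
                "Gb": 6, "G": 7, "Ab": 8, "A": 9, "Bb": 10, "B": 11 };

// Smallest shift (between -6 and +6 semitones) that moves a
// recording from one concert key to another.
function semitoneShift(fromKey, toKey) {
  const shift = (PITCH[toKey] - PITCH[fromKey] + 12) % 12;
  return shift > 6 ? shift - 12 : shift;
}

// Example: a track recorded in concert C, moved to concert Bb,
// needs the GarageBand transpose slider set to -2.
console.log(semitoneShift("C", "Bb")); // -2
console.log(semitoneShift("C", "Eb")); //  3
```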

Classroom Management (Making Two of Me)

I recall a year when I was struggling to engage one of my band classes during warm-ups. I needed a way to create some structure and reinforce expectations for the first 10 minutes of class, while making sure that the winds got the tone and ear development I wanted them to have. It is always easy to assume that students are against you when they are talking amongst themselves, wandering the back of the room, and slouching in their seats. I have come to find that, more often than not, my students aren’t against me; they just flat out don’t understand my expectations for participation, posture, and technique, and they need my support (even when it seems my expectations should be obvious).

My solution was to duplicate myself. I needed there to be one of me on the podium guiding the rehearsal sequence, and another of me walking the room to adjust students’ expectations of themselves.

I added the Solfege Bingo play-along tracks to slides in my daily agenda presentation, which is always on display at the front of the room through a projector. I make all of my slides in Apple’s Keynote. I found that I could embed an mp3 of one of my tracks into a slide and set the presentation to automatically skip to the next slide after a certain length of time had passed. So I created a sequence of these Solfege Bingo tracks, and a couple of other typical warm-ups I do, and embedded them all in Keynote slides so that the warm-up would happen automatically. 

In the upper right corner, click the Transitions button to reveal transitions. From the Start Transition dropdown menu, you can choose to have a slide start automatically after a certain amount of time, using the Delay timer. You might have to tweak this a little bit to get it right, but the result is that these couple of Keynote slides play in a row, automatically, while I walk around the band room and give feedback to students.

This allows me to work the room. While warm-ups are taking place, I can walk into the percussion section and remind them what instrument they play for warm-ups that day (it's on the chart in the back of the room 🤷‍♂️). I can give postural feedback to my trombones. I can high five the tuba player. I can fit someone for a concert shirt. I can do nearly anything. And this is all while reinforcing audiation, tone development, and proper intonation.

I recommend the Solfege Bingo book. It’s effortless to modulate the tracks with software. You can use the pitch-flex feature in GarageBand, as I mentioned above. But you can also use apps like Transcribe!, The Amazing Slow Downer, or Anytune.

Adding a clarinet drone is easy. I added a software instrument track in GarageBand, set it to a clarinet, and played the tonic along to the recording. But you could also use Tonal Energy as a GarageBand instrument.

Conclusion

Given the time I am posting this, it is worth mentioning that I totally intend to use these warmup play-along tracks in my online band classes this fall, which will be taking place in Google Meet. I am using the Loopback app to route the audio of Keynote through to the call, and a soundboard app called Farrago to trigger them. I can run the tracks through Google Meet and everyone plays along while on mute. I am hoping to blog about Farrago soon.

I am also planning to blog about another version of this workflow I have tried in especially needy classrooms, where I go as far as to record myself giving instructions to the band in between transitions, and even program the tracks to rehearse concert music for me while the real ‘me’ works the room. I have run up to 40 minutes of a band rehearsal through pre-recorded instructions and play-along tracks before!

Get a copy of Solfege Bingo here.

Learn OmniFocus Workflow Guest: October 3, 2020

I am thrilled to announce that I will be joining Learn OmniFocus as a Workflow Guest on October 3rd, 2020.

Learn OmniFocus is a website dedicated to helping others live a fulfilling and productive life with OmniFocus, complementary productivity apps, and services.

You can learn a ton from their free resources, including basics like organizing tasks into projects and assigning tags to them. They also have information on advanced features like project templating and automation.

My session will be all about how I use OmniFocus and complementary productivity apps to keep my life as a teacher and musician together. Here is the session description:

Teacher, musician, and technologist, Robby Burns will be joining us from Ellicott City, Maryland to share how he uses OmniFocus and complementary productivity apps to keep his active life on track.

Robby has been using OmniFocus since 2010. He has a long history with Apple technologies and was originally drawn to OmniFocus’ deep integration with Apple’s operating systems. He especially appreciates that the Omni Group is quick to add support for new Apple technologies.

During the LIVE session, Robby will share details of his OmniFocus setup and workflows, including:

  • How and when he uses OmniFocus on his iPhone, iPad, and Mac.

  • Adjustments that he’s made to his use of OmniFocus and complementary productivity apps since switching from in-person to virtual teaching.

  • His strategy for using tags.

  • How he uses the Forecast perspective to keep his calendar lined up with his commitments.

  • How he uses defer dates to relieve the stress of seeing too many things at once.

  • Custom perspectives that help him hone in on his most important tasks, including his “Top 3” perspective that narrows his focus to only three items.

  • How he creates OmniFocus projects based on templates stored in Drafts.

Read more and register here. The session will have a live Q/A and members can interact and share ideas. I hope to see you there!

You can become a member of Learn OmniFocus here. They have educator and student discounts. It is worth checking out if you wish to be more productive!

A free recording of the video will be made available to everyone by October 10.

My Online Teaching Setup (High-Tech Edition)

My studio computer and associated hardware.

When school let out in March, I wrote My Very Straightforward and Very Successful Setup for Teaching Virtual Private Lessons. The impetus for this post, and its snarky title, was an overwhelming number of teachers I saw on Facebook fussing about what apps and hardware they should use to teach online when all you really need is a smartphone, FaceTime, and maybe a tripod.

I stand by that post. But there are also reasons to go high-tech. I have had a lot of time this summer to reflect on the coming fall teaching semester. I have been experimenting with software and hardware solutions that are going to make my classes way more engaging.

Zoom

I have been hesitant about Zoom. I still have reservations about their software. Yet, it is hard to resist how customizable their desktop version is. I will be using Google Meet for my public school classes in September, but for my private lessons, I have been taking advantage of Zoom’s detailed features and settings.

For example, it’s easier to manage audio ins and outs. Right from within the call window, I can change whether my voice input is going through my Mac's internal microphone or my studio microphone, or whether video is coming from my laptop webcam or my external Logitech webcam. This will also be useful for routing audio from apps into the call (we will get to that in a moment).

Zoom allows you to choose the audio/video input from right within the call.

Zoom also allows you to AirPlay the screen of an iOS device to the student as a screen sharing option. This is the main reason I have been experimenting with Zoom. Providing musical feedback is challenging over an internet-connected video call. Speaking slowly helps to convey thoughts accurately, but it helps a lot more when I say “start at measure 32” and the student sees me circle the spot I want them to start in the music, right on their phone.

You can get really detailed by zooming in and out of scores and annotating as little as a single note. If you are wondering, I am doing all of this on a 12.9 inch iPad Pro with Apple Pencil, using the forScore app. A tight feedback loop of “student performance—>teacher feedback—>student adjustment” is so important to good teaching, and a lot of it is lost during online lessons. It helps to get some of it back through the clarity and engagement of annotated sheet music.

Selecting AirPlay as a screen sharing option.

AirPlaying annotated sheet music to the Zoom call using the iPad Pro and forScore app.

As much as I love this, I still think Zoom is pretty student-hostile, particularly with the audio settings. Computers already try to normalize audio by compressing extreme louds. Given that my private lessons are on percussion instruments, this is very bad. Zoom is the worst at it of all the video apps I have used. To make it better, you have to turn on an option in the audio settings called “Use Original Audio” so that the host hears the student’s raw sound, not Zoom’s attempt to even it out. Some of my students report that they have to re-choose this option in the “Meeting Settings” of each new Zoom call.

If this experiment turns out to be worth it for the sheet music streaming, I will deal with it. But this is one of the reasons why I have been using simple apps like FaceTime up until this point.

My Zoom audio settings.

My Zoom advanced audio settings.

Sending App Audio Directly to the Call

I have been experimenting with a few apps by Rogue Amoeba that give me more control over how audio is flowing throughout my hardware and software.

Last Spring, I would often play my public school students YouTube videos, concert band recordings from Apple Music, and warm-up play-alongs that were embedded in Keynote slides. I was achieving this by having the sound of these sources come out of my computer speakers and right back into the microphone of my laptop. It actually works. But not for everyone. And not well.

Loopback is an app by Rogue Amoeba that allows you to combine the audio inputs and outputs of your various microphones, speakers, and apps into new single audio devices that can be recognized by the system. I wrote about it here. My current setup includes a new audio device I created with Loopback which combines my audio interface and a bunch of frequently used audio apps into one. The resulting device is called Interface+Apps. If I select it as the input in my computer’s sound settings, then my students hear those apps and any microphone plugged into my audio interface directly. The audio quality of my apps is therefore more pure and direct, and there is no risk of getting an echo or feedback effect from my microphone picking up my computer speaker’s sound.

A Loopback device I created which combines the audio output of many apps with my audio interface into a new, compound device called “Interface+Apps.”

I can select this compound device from my Mac’s Sound settings.

Now I can do the following with a much higher level of quality...

  • Run a play-along band track and have a private student drum along
  • Play examples of professional bands for my band class on YouTube
  • Run Keynote slides that contain beats, tuning drones, and other play-along/reference tracks
  • and...

Logic Pro X

Logic Pro X is one of the apps I route through to the call via Loopback. I have a MIDI keyboard plugged into my audio interface and a Roland Octopad electronic drum pad that is plugged in as an audio source (though it can be used as a MIDI source too).

The sounds on the Roland Octopad are pretty authentic. I have hi-hat and bass drum foot pedal triggers so I can play it naturally. So in Logic, I start with an audio track that is monitoring the Octopad, and a software instrument track that is set to a piano (or marimba or xylophone, whatever is relevant). This way, I can model drum set or mallet parts for students quickly without leaving my desk. The audio I produce in Logic is routed through Loopback directly into the call. My students say the drum set, in particular, sounds way better in some instances than the quality of real instruments over internet-connected calls. Isn’t that something...

Multiple Camera Angles

Obviously, there is a reason I have previously recommended a set up as simple as a smartphone and a tripod stand. Smartphones are very portable and convenient. And simple smartphone apps like FaceTime and Google Duo make a lot of good default choices about how to handle audio without the fiddly settings some of the more established “voice conference” platforms are known for.

Furthermore, I can’t pick up my desk and move it to my timpani or marimba if I need to model something. So I have begun experimenting with multiple camera angles. I bought a webcam back in March (it finally just shipped). I can use this as a secondary camera to my laptop’s camera (Command+Shift+N in Zoom to change cameras).

Alternatively, I can share my iPhone screen via AirPlay and turn on the camera app. Now I can get up from my desk and go wherever I need to. The student sees me wherever I go. This option is sometimes laggy.

Alternatively, I can log in to the call separately on the iPhone and Mac. This way, there are two instances of me, and if I need to, I can mute the studio desk microphone, and use the phone microphone so that students can hear me wherever I go. I like this option the best because it has the added benefit of showing me what meeting participants see in Zoom.

Logging in to the Zoom call on the Mac and iPhone gives me two different camera angles.

SoundSource

This process works well once it is set up. But it does take some fiddling around with audio ins and outs to get it right. SoundSource is another app by Rogue Amoeba that takes some of the fiddliness out of the equation. It replaces the sound options in your Mac’s menu bar, offering you more control and more ease at the same time.

This app is seriously great.

This app saved me from digging into the audio settings of my computer numerous times. In addition to putting audio device selection at a more surface level, it also lets you control the individual volume level of each app, apply audio effects to your apps, and more. One thing I do with it regularly is turn down the volume of just the Zoom app when my students play xylophone.

Rogue Amoeba's apps will cost you, but they are worth it for those who want more audio control on the Mac. Make sure you take advantage of their educator discount.

EDIT: My teaching setup now includes the use of OBS and an Elgato Stream Deck. Read more here.

Conclusion

I went a little overboard here. If this is overwhelming to you, don't get the idea that you need to do it all. Any one of these tweaks will advance your setup and teaching.

This post is not specific about the hardware I use. If you care about the brands and models of my gear, check out My Favorite Technology to read more about the specific audio equipment in my setup.

🎙 #14 - Empowering Performing Ensembles at a Distance, with Theresa Hoover Ducassoux

Theresa Hoover Ducassoux joins the show to talk about technology for teaching band at a distance, productivity methodologies, Google apps for personal and school use, Flipgrid, empowering students, and more...

Other topics:

  • Personal productivity systems and apps
  • The Getting Things Done Methodology
  • Teaching band online
  • Being creative with whatever teaching scenario and schedule your district is moving forward with this fall
  • Engaging students with musical performance using the Flipgrid video service
  • Google apps for personal productivity
  • Google apps for classroom teaching
  • Organizing files in Google Drive
  • Automating band warm ups
  • Chamber music breakout groups using Google Meet and Soundtrap
  • Getting Google Certified
  • Her book- Pass the Baton: Empowering All Music Students
  • Our favorite album and apps of the week

Show Notes:

App of the Week:
Robby - Loopback by Rogue Amoeba (They have educator discounts)
Theresa - Flat for Docs

Album of the Week:
Robby - Jennifer Higdon Harp Concerto
Theresa - Dustin O’Halloran, piano solos

Where to Find Us:
Robby - Twitter | Blog | Book
Theresa - Twitter | Website - MusicalTheresa.com | Book - Pass the Baton: Empowering All Music Students | Blog - Off the Beaten Path

Please don't forget to rate the show and share it with others!

Subscribe to Music Ed Tech Talk:

Subscribe to the Blog

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS