My Online Teaching Setup (High-Tech Edition)

My studio computer and associated hardware.

When school let out in March, I wrote My Very Straightforward and Very Successful Setup for Teaching Virtual Private Lessons. The impetus for this post, and its snarky title, was an overwhelming number of teachers I saw on Facebook fussing about what apps and hardware they should use to teach online when all you really need is a smartphone, FaceTime, and maybe a tripod.

I stand by that post. But there are also reasons to go high-tech. I have had a lot of time this summer to reflect on the coming fall teaching semester. I have been experimenting with software and hardware solutions that are going to make my classes way more engaging.

Zoom

I have been hesitant about Zoom. I still have reservations about their software. Yet, it is hard to resist how customizable their desktop version is. I will be using Google Meet for my public school classes in September, but for my private lessons, I have been taking advantage of Zoom’s detailed features and settings.

For example, it’s easier to manage audio ins and outs. Right from the call window, I can change whether my voice input is going through my Mac's internal microphone or my studio microphone, and whether video is coming from my laptop webcam or my external Logitech webcam. This will also be useful for routing audio from apps into the call (we will get to that in a moment).

Zoom allows you to choose the audio/video input from right within the call.

Zoom also allows you to AirPlay the screen of an iOS device to the student as a screen sharing option. This is the main reason I have been experimenting with Zoom. Providing musical feedback is challenging over an internet-connected video call. Speaking slowly helps to convey thoughts accurately, but it helps a lot more when I say “start at measure 32” and the student sees me circle the spot I want them to start in the music, right on their phone.

You can get really detailed by zooming in and out of scores and annotating as little as a single note. If you are wondering, I am doing all of this on a 12.9-inch iPad Pro with Apple Pencil, using the forScore app. A tight feedback loop of “student performance → teacher feedback → student adjustment” is so important to good teaching, and a lot of it is lost during online lessons. It helps to get some of it back through the clarity and engagement of annotated sheet music.

Selecting AirPlay as a screen sharing option.

AirPlaying annotated sheet music to the Zoom call using the iPad Pro and forScore app.

As much as I love this, I still think Zoom is pretty student-hostile, particularly with the audio settings. Video call software tries to normalize audio, compressing extreme peaks down toward the average level. Given that my private lessons are on percussion instruments, this is very bad, and Zoom is the worst offender of all the video apps I have used. To make it better, you have to turn on an option in the audio settings called “Use Original Audio” so that the host hears the student’s raw sound, not Zoom’s attempt to even it out. Some of my students report that they have to re-select this option in the “Meeting Settings” of each new Zoom call.
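To see why this "evening out" is so hostile to percussion, here is a minimal sketch of downward compression, the kind of processing a call's audio chain applies. This is illustrative only; Zoom's actual processing is proprietary and more sophisticated.

```python
# Illustrative sketch of downward dynamic-range compression: anything
# louder than a threshold gets scaled toward the average level.

def compress(sample: float, threshold: float = 0.5, ratio: float = 4.0) -> float:
    """Squash the portion of a sample that exceeds the threshold.

    sample is a normalized amplitude in [-1.0, 1.0].
    """
    sign = 1.0 if sample >= 0 else -1.0
    level = abs(sample)
    if level <= threshold:
        return sample  # quiet material passes through untouched
    # Everything above the threshold is divided by the ratio.
    return sign * (threshold + (level - threshold) / ratio)

# A sharp percussive hit (0.9) gets pulled down toward a soft note (0.3):
hit = compress(0.9)    # 0.6 -- the attack is flattened
soft = compress(0.3)   # 0.3 -- unchanged
```

That flattening of the attack is exactly what makes a snare drum sound lifeless on a call, and why hearing the raw, uncompressed signal matters for percussion lessons.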

If this experiment turns out to be worth it for the sheet music streaming, I will deal with it. But this is one of the reasons why I have been using simple apps like FaceTime up until this point.

My Zoom audio settings.

My Zoom advanced audio settings.

Sending App Audio Directly to the Call

I have been experimenting with a few apps by Rogue Amoeba that give me more control over how audio is flowing throughout my hardware and software.

Last Spring, I would often play my public school students YouTube videos, concert band recordings from Apple Music, and warm-up play-alongs that were embedded in Keynote slides. I was achieving this by having the sound of these sources come out of my computer speakers and right back into the microphone of my laptop. It actually works. But not for everyone. And not well.

Loopback is an app by Rogue Amoeba that allows you to combine the audio inputs and outputs of your various microphones, speakers, and apps into new, single audio devices that are recognized by the system. I wrote about it here. My current setup includes a new audio device I created with Loopback which combines my audio interface and a bunch of frequently used audio apps into one. The resulting device is called Interface+Apps. If I select it as the input in my computer’s sound settings, my students directly hear those apps and any microphone plugged into my audio interface. The audio from my apps is therefore purer and more direct, and there is no risk of an echo or feedback loop from my microphone picking up my computer speakers’ sound.
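Conceptually, a compound device like Interface+Apps just sums its source streams into one buffer the system treats as a single input. Loopback does this at the audio driver level; this toy sketch only shows the underlying math, with made-up buffer values.

```python
# Conceptual sketch of what a compound audio device does: sum several
# source streams into one stream the system sees as a single input.

def mix(*streams):
    """Sum same-length sample buffers, clamping to the valid [-1, 1] range."""
    mixed = [sum(samples) for samples in zip(*streams)]
    return [max(-1.0, min(1.0, s)) for s in mixed]

mic = [0.2, 0.4, -0.1]   # audio interface (microphone)
app = [0.1, 0.7, 0.3]    # e.g. an app's playback output
combined = mix(mic, app)  # what the call receives from "Interface+Apps"
```

Because the app audio is summed digitally rather than replayed through speakers into a microphone, nothing is lost to room acoustics and nothing can feed back.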

A Loopback device I created which combines the audio output of many apps with my audio interface into a new, compound device called “Interface+Apps.”

I can select this compound device from my Mac’s Sound settings.

Now I can do the following with a much higher level of quality...

  • Run a play-along band track and have a private student drum along
  • Play examples of professional bands for my band class on YouTube
  • Run Keynote slides that contain beats, tuning drones, and other play-along/reference tracks
  • and...

Logic Pro X

Logic Pro X is one of my apps routing through to the call via Loopback. I have a MIDI keyboard plugged into my audio interface and a Roland Octopad electronic drum pad that is plugged in as an audio source (though it can be used as a MIDI source too).

The sounds on the Roland Octopad are pretty authentic. I have hi-hat and bass drum foot pedal triggers so I can play it naturally. So in Logic, I start with an audio track that is monitoring the Octopad, and a software instrument track that is set to a piano (or marimba or xylophone, whatever is relevant). This way, I can model drum set or mallet parts for students quickly without leaving my desk. The audio I produce in Logic is routed through Loopback directly into the call. My students say the drum set, in particular, sounds way better in some instances than the quality of real instruments over internet-connected calls. Isn’t that something...
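As an aside, when a pad like the Octopad is used as a MIDI source rather than an audio source, each strike arrives as a three-byte note-on message. A hedged sketch follows, using note numbers from the General MIDI percussion key map; the helper names here are my own, not Roland's or Logic's.

```python
# Sketch of a drum-pad strike as MIDI data. Note numbers follow the
# General MIDI percussion key map; channel 10 (zero-based index 9) is
# the conventional percussion channel.

GM_DRUMS = {"kick": 36, "snare": 38, "closed_hat": 42, "open_hat": 46}

def note_on(drum: str, velocity: int, channel: int = 9) -> bytes:
    """Build a 3-byte MIDI note-on message for a drum hit."""
    status = 0x90 | channel  # 0x9n = note-on, where n is the channel
    return bytes([status, GM_DRUMS[drum], max(0, min(127, velocity))])

msg = note_on("snare", 100)  # three bytes: status, note number, velocity
```

A harder hit simply produces a higher velocity byte, which the software instrument maps to a louder, brighter sample.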

Multiple Camera Angles

Obviously, there is a reason I have previously recommended a setup as simple as a smartphone and a tripod stand. Smartphones are very portable and convenient. And simple smartphone apps like FaceTime and Google Duo make a lot of good default choices about how to handle audio, without the fiddly settings some of the more established “voice conference” platforms are known for.

Furthermore, I can’t pick up my desk and move it to my timpani or marimba if I need to model something. So I have begun experimenting with multiple camera angles. I bought a webcam back in March (it finally just shipped). I can use this as a secondary camera to my laptop’s camera (Command+Shift+N in Zoom to change cameras).

Alternatively, I can share my iPhone screen via AirPlay and turn on the camera app. Now I can get up from my desk and go wherever I need to. The student sees me wherever I go. This option is sometimes laggy.

A third option is to log in to the call separately on the iPhone and the Mac. This way, there are two instances of me, and if I need to, I can mute the studio desk microphone and use the phone microphone so that students can hear me wherever I go. I like this option the best because it has the added benefit of showing me what meeting participants see in Zoom.

Logging in to the Zoom call on the Mac and iPhone gives me two different camera angles.

SoundSource

This process works well once it is set up. But it does take some fiddling around with audio ins and outs to get it right. SoundSource is another app by Rogue Amoeba that takes some of the fiddliness out of the equation. It replaces the sound options in your Mac’s menu bar, offering you more control and more ease at the same time.

This app is seriously great.

This app has saved me from digging into my computer's audio settings numerous times. In addition to putting audio device selection right at the surface, it lets you control the individual volume level of each app, apply audio effects to apps, and more. One thing I do with it regularly is turn down the volume of just the Zoom app when my students play xylophone.
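The math behind a per-app volume slider is a simple decibel-to-linear conversion applied to that one app's stream. A sketch, assuming nothing about SoundSource's actual internals (the function name is mine):

```python
# A per-app volume change is just a gain multiplier on that app's samples.

def db_to_gain(db: float) -> float:
    """Convert a decibel change to a linear amplitude multiplier."""
    return 10 ** (db / 20)

# Pulling one app down by 12 dB scales its samples to about a quarter
# of their original amplitude; 0 dB leaves them untouched.
quieter = db_to_gain(-12.0)   # ~0.25
unity = db_to_gain(0.0)       # 1.0
```

This is why a modest-looking dB cut tames a loud xylophone so effectively: the decibel scale is logarithmic, so every 6 dB roughly halves the amplitude.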

Rogue Amoeba's apps will cost you, but they are worth it for those who want more audio control on the Mac. Make sure you take advantage of their educator discount.

EDIT: My teaching setup now includes the use of OBS and an Elgato Stream Deck. Read more here.

Conclusion

I went a little overboard here. If this is overwhelming to you, don't get the idea that you need to do it all. Any one of these tweaks will advance your setup and teaching.

This post is not specific about the hardware I use. If you care about the brands and models of my gear, check out My Favorite Technology to read more about the specific audio equipment in my setup.

🎙 #14 - Empowering Performing Ensembles at a Distance, with Theresa Hoover Ducassoux

Theresa Hoover Ducassoux joins the show to talk about technology for teaching band at a distance, productivity methodologies, Google apps for personal and school use, Flipgrid, empowering students, and more...

Other topics:

  • Personal productivity systems and apps
  • The Getting Things Done Methodology
  • Teaching band online
  • Being creative with whatever teaching scenario and schedule your district is moving forward with this fall
  • Engaging students with musical performance using the Flipgrid video service
  • Google apps for personal productivity
  • Google apps for classroom teaching
  • Organizing files in Google Drive
  • Automating band warm ups
  • Chamber music breakout groups using Google Meet and Soundtrap
  • Getting Google Certified
  • Her book- Pass the Baton: Empowering All Music Students
  • Our favorite album and apps of the week

Show Notes:

App of the Week:
Robby - Loopback by Rogue Amoeba (They have educator discounts)
Theresa - Flat for Docs

Album of the Week:
Robby - Jennifer Higdon Harp Concerto
Theresa - Dustin O’Halloran, piano solos

Where to Find Us:
Robby - Twitter | Blog | Book
Theresa - Twitter | Website - MusicalTheresa.com | Book - Pass the Baton: Empowering All Music Students | Blog - Off the Beaten Path

Please don't forget to rate the show and share it with others!

Subscribe to Music Ed Tech Talk:

Subscribe to the Blog

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS

Due app gets an Update for the Mac

Due is an indispensable app that I depend on daily on iOS, the Apple Watch, and the Mac. There are four things that immediately come to mind when I think about why I like this app over the basic Reminders app.

  1. Its design is beautiful, intelligent, and easy to read.
  2. Its natural language parsing is a breeze... "remind me to help with lunch duty at 11:27 am" adds a reminder with the time just as I type it.
  3. The swipe down gesture to add a new task is very intuitive.
  4. By far, most important: you can set the notifications to keep pinging you until you check the task as done. You can even customize the amount of time it snoozes when you tap the snooze button.
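As a sketch of what natural language parsing like this involves, here is a toy parser that handles just the one phrasing from the example above. Due's real parser is far more capable; this is purely illustrative.

```python
import re

# Toy natural-language reminder parser: extracts a task and a 24-hour
# (hour, minute) time from phrases like
# "remind me to help with lunch duty at 11:27 am".

def parse_reminder(text):
    m = re.match(r"remind me to (.+) at (\d{1,2}):(\d{2}) (am|pm)", text.lower())
    if not m:
        return None
    task = m.group(1)
    hour, minute, period = int(m.group(2)), int(m.group(3)), m.group(4)
    if period == "pm" and hour != 12:
        hour += 12           # 1:05 pm -> 13:05
    if period == "am" and hour == 12:
        hour = 0             # 12:30 am -> 00:30
    return task, (hour, minute)

parse_reminder("Remind me to help with lunch duty at 11:27 am")
# -> ('help with lunch duty', (11, 27))
```

The appeal of Due is that you never think about this machinery: you type the sentence the way you would say it, and the reminder lands at the right minute.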

I don't use this app to manage big projects. For that, I use OmniFocus. But for tasks that have to be acted on in a very specific moment, Due is the tool for the job.

It just got a major Mac update. Most of the changes are design focused, which I can appreciate because the Mac app, while it functioned, was starting to look pretty out of date. If you want to read more of the specifics, I recommend the MacStories article linked below.

Due for Mac Modernized with New Design and Features:

A full-fledged task manager is terrific for many projects, but if you dump your entire life into one, it can quickly become a cluttered mess. At the same time, if you’re focused on a big project, it’s easy to let everything that’s not in your task manager slip through the cracks. One strategy for attacking the problem that has worked well for me is using a separate, lightweight app for tasks like remembering to take out the garbage, pick up medicine at the pharmacy, or publish an article when an embargo lifts.

In the past, I’ve used Due on the iPhone and iPad for these sorts of tasks. There has been a Mac version of Due for years too, but it hadn’t been updated in about two years and was showing its age. However, with today’s update, Due for Mac joins the iOS version with a fully-modern design and slate of new features, putting it on par with the outstanding iOS version, which I’ve covered in the past.

StaffPad Comes to iPadOS (Reflections on App Store Pricing and Touch Screen Operating Systems)

Five years ago, StaffPad came to Windows Surface tablets. StaffPad is a professional music notation application that turns handwritten notes into beautiful music notation. It is built around the stylus being the primary input, and because the iPad did not have stylus support at launch, StaffPad remained Windows only.

Multiple years into Apple supporting its own official stylus, the Apple Pencil, StaffPad is finally here on iPad!

StaffPad’s intro video sells itself, so I am not going to write much about the app here. Instead, I point you to…

StaffPad’s Introductory Blog Post

Download Link to the App Store

Scoring Notes Review - a must read if you are interested

Since the features of StaffPad are covered in the links above, I want to comment on two interesting aspects of this release.

First, the price. At $89.99, this is no impulse purchase. I find it refreshing to see a professionally priced app like this on the App Store. For years, the App Store has seen a race-to-the-bottom approach to grabbing sales. Users are so used to sub-$5 apps that the expectation of paying real money for software has all but disappeared.

Increasingly, developers are finding that subscription-based pricing is the only way to maintain software and put food on the table. There was a big discussion about this in the Apple community last week when the beloved calendar application Fantastical released version 3 and moved to subscription pricing. As is customary when an app goes subscription, users of the application and bystanders alike were enraged at the idea of a calendar costing four dollars a month.

I couldn’t resist sneaking my love of Fantastical into this post. The interface is beautiful.

And the natural language input is one of many essential features that helps me get my work done more efficiently.

As a user of Fantastical, I was happy to keep supporting development. It is one of my most used applications on a daily basis and its features are essential to me having a full time teaching job, while also scheduling gigs, 25 private students, speaking engagements, and all of my other personal events.

Fantastical is what I would call a prosumer application. It offers more power to someone looking for an advanced and well-designed calendar, and it has a wide appeal (everyone needs a calendar!). Four dollars a month is steep, but manageable. Now that the price is recurring, I do think it will appeal to a smaller audience, as each user will have to reevaluate on a monthly or yearly basis whether the application continues to be worth the cost.

StaffPad is very different. It is a professional creation tool. Much like Photoshop is essential to designers and photographers, score editors are essential to the lives of most musicians, composers, and music educators. By charging 89 dollars, StaffPad follows a long history of apps in its field, which are often priced between 200 and 600 dollars.

I have to wonder… if the iPad had more software like this, and from an earlier point in time, would users have adjusted their expectations and would more expensive professional apps be more viable? And if so, would the viability of such professional apps lead to more (and better) professional apps on iOS?

And furthermore, would Apple adjust to these trends? Apple still offers no free trial for paid apps (something that will definitely deter a lot of my music-teaching colleagues from giving StaffPad a chance). Not to mention that professional creative software has a tradition of volume licensing and educator discounts. Educators who could normally afford a program like this for themselves or their class are going to be stuck if they are looking for the same options with StaffPad.

App developers get around this in a number of ways; one example is offering the app as a free download that must be unlocked with an in-app purchase after a week. Of course, there is also the subscription model. I am glad StaffPad went with a more traditional model than a subscription, because it fits within the tradition of how its class of software is priced. And my hope is that this just might convince more developers to bring their own apps to iPadOS.

Which brings me to my second point…

StaffPad doesn’t, and probably won’t, have a macOS app. It is built entirely around stylus input. This is why it could only exist on Windows Surface tablets at first. I am thrilled it is on iPad, but this presents an interesting question for users of Apple products.

A Windows Surface user notices no distinction between StaffPad running on a touch-based OS or a traditional point-and-click OS, because on that platform they are one and the same. Even as macOS and iPadOS move closer and closer together, this distinction has led them to be products with very different potentials.

On the other end, all the other players in the score-editor field (Sibelius, Finale, Dorico) remain “desktop” applications that run on traditional point-and-click operating systems. With the power of the current iPad Pro, there is no reason these applications couldn’t exist on iOS, other than that developing for iOS is very different. None of these developers have shown any signs of bringing their programs to iPadOS any time soon, and I would suspect StaffPad has no plans for a Mac version.

I admire how Apple has held their ground about the iPad being the iPad and the Mac being the Mac. It has made both platforms stronger. But as the iPad becomes a more viable machine for getting work done, Apple has got to get a plan for how to solve this essential “input” question.

Planning Band Rehearsals with MindNode, A Mind Mapping App for macOS and iOS

I am starting my third week of paternity leave tomorrow. And while I am doing my best to ignore work at all costs, I am also reminded that when I return I will have three weeks with my students to work on their band assessment music. My long term sub (who is incredible) will have been working on it with them for three weeks by the time I return. Naturally, there is still a lot I want to accomplish with it on my own time.

To that end, I decided to get organized. When I organize large projects, I like to create a mind map.

In my brain, there are a lot of ways I want to keep kids engaged with our current repertoire. I have score study and lesson planning tasks, music and videos I want to inspire them with, strategies for rehearsal, alongside stories and verbal illustrations to communicate abstract tonal and phrasing ideas. I also have some behavioral concerns that need to get locked down so that our focus is at 100 percent. Personally, I don’t know any way to dump out these interconnected ideas and see how they fit together without a map.

MindNode is a mind mapping application for iOS and macOS that lets you easily dump ideas quickly into a beautifully structured map. A MindNode document starts with a single bubble in the middle of the screen from which you can create “nodes,” or branches, off from the middle. It is possible to create a vast tree of hierarchical concepts, topics, and ideas, without even taking your hands off the keyboard, much like typing a quick bullet point list into a note.

The nodes can later be dragged around freely anywhere on the map. When you move one branch, all of the others adjust around it dynamically, ensuring that your map is balanced.
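Under the hood, a mind map like this is just a tree. A minimal sketch of the node-and-branch idea, including the "view as outline" transformation; the class and method names here are illustrative, not MindNode's actual data model.

```python
# A mind map as a tree: each node has a title and child branches,
# and the whole structure can be flattened into a linear outline.

class Node:
    def __init__(self, title):
        self.title = title
        self.children = []

    def add(self, title):
        """Create a child branch and return it for further nesting."""
        child = Node(title)
        self.children.append(child)
        return child

    def outline(self, depth=0):
        """Render the subtree as an indented bullet outline."""
        lines = ["  " * depth + "- " + self.title]
        for child in self.children:
            lines.append(child.outline(depth + 1))
        return "\n".join(lines)

root = Node("Band Assessment Prep")
rehearsal = root.add("Rehearsal strategies")
rehearsal.add("Tuning chorales")
root.add("Score study")
print(root.outline())
```

This tree structure is also why exports to outliners and task managers work so cleanly: branches become headings, and leaves become items beneath them.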

Dragging a node adjusts the map.

MindNode has a ton of features that are beyond the scope of this post. You can add notes, images, and tasks to nodes, which you can see I have done in the map above. You can apply various themes to the way the nodes look, or even customize your own theme. You can also view your map as a linear outline. The new version, MindNode 7, has just added a visual tagging feature to help you better organize your nodes. You can read about that here.

MindNode is full of tools to conceptualize your map and format it so that it looks great.

One of my favorite new features is the Apple Pencil support. When you screenshot a mind map, you can choose to annotate it like a normal screenshot, or you can select 'Full Page' and MindNode will fit the entire document into view and cut out all of the user interface elements like menus and buttons. This way, you can mark up a clean copy of the file, which you can then export as a PDF to an app of your choice.

This is what annotating a normal screenshot looks like.

MindNode uses Apple's PencilKit API to strip away buttons and menus, leaving you with a clean document to annotate.

MindNode's export tools are amazing. In the next screenshot, you can see all of the options. You can export the document itself as an image, PDF, or outline, just to name a few choices. But what I love is the option to export the nodes to the task manager OmniFocus or Things as a project.

Next, you can see a screenshot of Apple Notes with an exported MindNode PDF (left) alongside the todo app Things (right). MindNode has neatly formatted my map as a project in Things, with headings and checkable todos that I can later give due dates and deadlines to. Awesome!

In that screenshot, notice another PDF beneath my MindNode document. It is another PDF I exported. It is a score map of Greenwillow Portrait by Mark Williams which I am performing with one of the bands.

One of my goals for the quarter is to spend more time in the score. I like to occasionally study a score starting with the big picture and later moving to finer details. To help establish this big picture, I will occasionally make a map that serves as a rough guide to a piece. One of my problems with drawing maps is running out of screen space on any of the four edges of the iPad. To solve this, I used Concepts, an open-canvas drawing app.

It's full of features, but I am most attracted to the style of the pen tools and its ability to keep drawing in any direction, without being limited by the four walls of the iPad's screen. The document just keeps adding room to whichever side I keep drawing on. It's worth checking out if you have a need for this kind of drawing tool.

When I started this document, I ran out of room on the side of the screen at measure 19. All I needed to do to solve that problem is zoom out and keep drawing.

Happy winter and good luck preparing for your spring assessments if you are taking a performing group to one!

The 7 Best Apple HomeKit Devices

Learn about my smart speaker setup on this episode of my podcast:

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS

I keep promising myself that a larger dive into my home automation workflow is coming to this blog. And it is. But I thought that I would first take a moment to outline the top seven apps and devices that I am using in combination with the Apple Home app. These get special attention given that their HomeKit integration allows me to conveniently manipulate them all from within the Apple Home app and command them with Siri. 

All of the devices in this post are also compatible with the Amazon Echo. I only buy home devices that are equally compatible because I use Alexa in my house as well. Furthermore, the home automation space is still very young and fragmented. The more open a platform is, the more flexible it will be now and in the future. 

Philips Hue Lights

Be careful. These WiFi connected light bulbs are the gateway drug of home automation. With them, I can now turn on every light in my house with my phone or voice. For my small house, the bulbs work just fine, but I would recommend the light switches for larger homes for convenience and to save money. Check out the image below to see how these lights appear in the Home app. I can control them individually or group them together. I can automate them by time or location in the Apple Home app. It's really nice to have the lights gently dim around bed time, and gradually wake me up with a gentle red hue an hour before work in the morning. Because my iCloud account also knows who and where my wife is, I can set up an automation that turns off all the lights once both of us have left the house, and another that turns them back on when one of us returns. 
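The "lights off when we both leave, on when one of us returns" automation boils down to a simple rule that fires only on the empty/occupied transitions. HomeKit evaluates this on Apple's side; the function and action names in this sketch are hypothetical.

```python
# Presence-based lighting rule: act only when the house transitions
# between occupied and empty, ignoring partial comings and goings.

def presence_action(prev_home, now_home):
    """Given the set of people at home before and after a change,
    return the automation to fire, or None."""
    if prev_home and not now_home:
        return "turn_off_all_lights"   # last person just left
    if not prev_home and now_home:
        return "turn_on_lights"        # first person just returned
    return None                        # partial changes do nothing
```

The transition check matters: without it, one person leaving while the other stays home would flip the lights, which is exactly the behavior you don't want.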

Check out Philips Hue lights here

The Home app aggregates all of my various different home automation devices.

The Good Morning scene is automatically set to run at 6:30 am on weekdays and at 9:30 am on weekends.

This is the setup screen for my Good Morning scene.

Ecobee Thermostat 

The Nest thermostat was the first home automation device I ever bought. It doesn't work with Apple HomeKit, though. So when it unexpectedly died last year, I jumped at the opportunity to try something new. Ecobee thermostats are the best around. Speaking into thin air, "Hey Siri, I'm cold," to turn up the heat is a modern-day dream. Of course, I can automate temperature in all of the same ways I can automate lights. And I can even group these devices into "scenes" in the Apple Home app to streamline frequent actions. For example, the "Arriving Home" scene turns on the air and the lights. This scene is not only triggered by button or voice, but automatically runs when my phone comes within close proximity to my house.
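That "runs when my phone is near the house" trigger is a geofence: the scene fires when the phone crosses inside a radius around home. HomeKit handles this natively, but the underlying distance check looks something like this sketch (the coordinates and radius are placeholders).

```python
import math

# Geofence check: fire an automation when a phone's coordinates fall
# within a radius of a home location.

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(phone, home, radius_m=100.0):
    """True when the phone is within radius_m meters of home."""
    return distance_m(*phone, *home) <= radius_m
```

In practice the OS delivers "entered region" events rather than polling coordinates, but the region boundary is exactly this kind of radius test.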

This is what you see when you open the ecobee app.

Once you tap on a thermostat, you get more detailed controls.

This is my Arrive Home scene. The door unlocks for me, the thermostat turns on a good temperature, and the lights on the main level of the house turn on.

Schlage Sense Door Lock

My Schlage Sense allows me to unlock my door with the tap of a button. My teaching studio is in the basement of my house and the door is upstairs. It is disruptive to a lesson to constantly be answering my door, so now I just tell my Apple Watch "Hey Siri, unlock the door." It authenticates through contact with my wrist and completes the task. Of course my Arriving Home and Leave Home scenes also unlock and lock the door, in addition to all of the other actions I mentioned above. Having my front door unlock for me when I arrive home makes me feel like I am living in the future. Having it automatically lock when I leave gives me peace of mind that my house is safe. 

Logitech Circle Camera

Of all the HomeKit devices out there, cameras are the hardest to shop for. I have found the Logitech Circle to be the best. Nest makes some great cameras, but their lack of HomeKit support has driven me away. I have the Logitech set up in our dining room, facing down the primary hallway in my home. It is plugged into an iHome smart plug, which is also HomeKit-enabled, so I can turn it off and on remotely. This plug is automated in the Home app to turn on when neither my wife nor I is home and turn off when one of us arrives, so the camera works like a security camera. When it detects motion, it turns on our dining room and kitchen lights. It has a two-way microphone, so you can chat with someone in your home if you need to. And what I love most is that the camera feed shows up right in line with my other smart home controls in the Apple Home app.

The interface for the Logi Circle 2 app.

Eve Sensors

Sensors need no introduction. These things can trigger any other home device to act when they detect motion. Most of mine are set to turn on the lights in a given room when I walk into it. But they can also trigger thermostats and smart plugs. My favorite sensors on the market are made by Eve. They are easy to set up and work reliably. Eve also makes a number of other interesting HomeKit products.

Sensors appear as ‘Triggered’ in the Apple Home app when they have detected motion.

In Apple Home, I can make an automation that turns on the upstairs light whenever my eve sensor is triggered upstairs.

The eve app makes a great alternative to the Apple Home app for controlling all your devices.

iHome Smart Plugs

I like using smart plugs as an all purpose way of turning on and off the things in my house that are otherwise not “smart.” In addition to the camera workflow I mentioned above, I also have these plugged into other devices throughout the house. For example, my bedroom fan is plugged into one. I can now turn it on and off in the middle of the night without getting up. “Hey Siri, turn on the fan.” A lot of brands make smart plugs but the iHome is the easiest to set up and use in my experience. 

Apple HomePod

I was hesitant about the HomePod at first given that it shipped with incomplete software and relies entirely on Siri for voice commands. Still, the device offered some compelling features. When iOS 11.4 brought the features that were missing from release (AirPlay 2 and multi room audio), I scooped one up while Best Buy was running a 100 dollar off deal on them, refurbished. 

The HomePod fulfills a lot of the same purposes as the Amazon Echo. It is distinguished by linking into the Apple ecosystem, allowing me to command Apple Music, Apple Podcasts, and all of the home automation devices mentioned above. 

Control of the HomePod lives inside the Apple Home app, where it appears as a speaker device. The recent addition of AirPlay 2 lets my two Sonos One speakers show up in the Apple Home app as well.

The HomePod is first and foremost a good speaker. But it can also command the other speakers in your house and even the audio output of your Apple TVs. Simply say “Hey Siri, move this music to the living room,” and listen as your music is magically transported from one speaker to the next. You can also route your Apple TV’s audio through the HomePod and control playback with commands like “pause,” “stop,” and “skip ahead 50 seconds.”

The HomePod is the core of the Apple Home experience. Of course, you could just as easily control every device in this post from an Echo. However, as an Apple Music subscriber, and frequent listener to podcasts in the kitchen, having a HomePod makes sense for me to own.

It looks like the investment is going to pay off. This fall, iOS 13 will be adding even more features to the HomePod and Home app. For example, the HomePod will be able to distinguish between my voice and my wife’s. This way, when she asks it what is going on today, it will read from her calendar instead of mine. iOS 13 is also introducing speaker automations for scenes. So my Good Morning scene in the Home app will now play my favorite breakfast playlist in addition to turning on select lights and changing the temperature.

And finally, HomeKit automations and Siri Shortcut automations are going to be tied together more closely and will be able to run automatically. For example, stopping my wake-up alarm will both run the Good Morning scene and automatically run this Siri Shortcut that tells me how I slept, delivers a weather report, and opens a meditation in the Headspace app.
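Under the hood, a shortcut is essentially an ordered list of actions run in response to a trigger. A toy Python sketch of that idea (the action names mirror my morning routine, but the code and function names are hypothetical; this is a conceptual model, not the real Shortcuts app):

```python
# Toy model of a shortcut: a trigger fires an ordered list of actions.
# Invented for illustration; Shortcuts automations are built in the app.

def run_shortcut(actions):
    """Run each action in order and collect its result."""
    return [action() for action in actions]

im_awake = [
    lambda: "set Good Morning scene",
    lambda: "deliver weather report",
    lambda: "report how I slept",
    lambda: "open Headspace meditation",
]

# In iOS 13, stopping the wake-up alarm would be the trigger
# that kicks off this whole chain automatically.
print(run_shortcut(im_awake))
```

The interesting part is the trigger side: time, location, opening an app, or stopping an alarm can all start the chain without me touching anything.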

In iOS 13, HomePod play controls show up right in the Home app.

In iOS 13, music playback can become part of your scenes.

The new Siri Shortcuts app on iOS 13 integrates home automations and personal automations. It also allows them to be automatically triggered by time, location, opening a particular app, and more! In this example, stopping my wake-up alarm triggers my I’m Awake Siri Shortcut, which sets the Good Morning scene, reads me the weather, tells me how I slept, and starts a meditation.

Brief Thoughts on Apple’s Education Event

Well, it has taken me long enough… This past week, Apple held an education event. Below are some brief thoughts on the subject. Chris Russell is coming on my podcast later this week to talk about all of the details. Keep in mind, I do not work in a school with 1:1 iPads or any kind of deployment strategy. But I am very seriously invested in Apple’s role in education and their vision for how their products fit into the classroom.

New iPad

This device looks great. Adding Apple Pencil support to this model will be an asset for schools. But will schools really pay 89 dollars for a pencil after having just purchased numerous 250 dollar iPads?

The thing that gets me most excited about this device is its consumer potential. I am tempted to buy one for myself as a (more) mobile counterpart to my larger 12.9 inch iPad Pro.

iWork Updates

Apple Pencil support. FINALLY. This was my favorite announcement of the day. I anticipate editing Pages documents, scribbling on bus attendance lists made in Numbers, and annotating Keynote slides at the front of the classroom on a daily basis. I hate to be cynical (though the rest of this post will be), but Microsoft Office for iPad has been able to mark up documents with the Apple Pencil since the Pencil launched two years ago.

iBooks Author

It seems like the Mac app is no longer going to receive development. All book publishing features have been moved to Pages for iOS and Mac. It doesn’t appear that the new features do everything iBooks Author can. Hopefully this is like when Apple rewrote Final Cut Pro X, took away some features, but eventually added them back. Or when iWork was rewritten to be the same across iOS and macOS, stripping AppleScript features from the Mac but eventually bringing them back. I would hate to see authors lose the workflows they have built over the years in iBooks Author for the Mac.

Classroom App for Mac

Apple’s learning management system comes to the Mac. Great! But what took so long? And can Apple keep up with the vastly more mature and flexible Google Classroom? (See conclusion below)

Schoolwork App

An app for teachers to give assignments to students, check their progress, and collect the work back. Schoolwork can route students to other apps to complete their assignments using the ClassKit API, which is very cool. But why is this separate from the Classroom app? And where does iTunes U fit into all of this?

Conclusion

Apple is making a lot of solid efforts here, but much of it feels like too little, too late, especially the student and learning management software. I really do hope they can keep up with Google Classroom, which has been eating everyone’s lunch for years. Apple will have to be aggressive about adding new features to all of these new apps and making sure their app ecosystem is flexible enough to compete with Chromebooks, which use browser-based software. Yes, there are way more apps on the App Store than there are Chrome-based apps, but in education (and especially in music education) a lot of the big players are writing for Chrome OS. To me, the draw of Chromebooks in education is not their price but the flexibility of web-based software.

Apple’s software engineers seem spread very thin, unable to consistently balance releases across their various applications. This is true of many of Apple’s consumer apps. Mail and Reminders, two tentpole productivity apps, have fallen way behind the competition. Calendar has seen no more than a few major feature updates since I started using the Mac back in 2006. Apple’s apps are part of the “nice” factor of being in the ecosystem. Sometimes an app like Notes will get some major new features, but then we won’t hear from it for a few years. Google’s apps, by contrast, lack the same design sense but are constantly being updated with new features. And they are not locked into annual OS updates like iOS apps are. In my opinion, this is Apple’s biggest problem right now.

Ironically, software is still my draw to Apple products. Even though their hardware is indisputably the best thing they are doing right now (I am nearly without complaint about my iPhone X, and the 10.5 inch iPad is perfect), it is the software that locks me in. In other words, I am much more committed to macOS and iOS than I am to Mac and iPhone. This leaves me with some long-term concern about my interest in continuing to use Apple products. And great concern about any educational institution that jumps on the iPad bandwagon just because apps are bright and colorful and demo well on stage. Apple has to show continual support for their education software if their dream for the classroom is to come true.

 

🔗 Black Friday App Deals

Every Black Friday, tons of apps go cheap or free on the App Store. I always use this opportunity to score a bunch of apps I have been waiting on, particularly higher priced apps.

The best resource I have found for keeping track of every deal is MacStories. Check out their blog post on The Best Deals for iPhone and iPad Apps, Games, and Accessories. They have already started collecting apps on sale and will be updating the list throughout tomorrow and the coming days. It catalogues everything from iOS apps to Mac apps to tech deals on Amazon.

Happy app purchasing!!!

🔗 GarageBand on Mac Now Syncs Projects with iOS

Read Cult of Mac's overview of the new GarageBand update for macOS. I think this essentially brings to the Mac version of GarageBand the syncing feature Logic added a few months back. I played around with it for a few minutes last night, trying to sync a project between the Mac and iOS versions of GarageBand. Unless I am missing something, this workflow runs into all of the same issues as the Logic feature I wrote about when it was released. The process is not direct: you still have to manually prepare the file for syncing and create a duplicate copy whenever you go from Mac to iOS or iOS to Mac. And I really wish I could edit the audio on my iPad too. My iPad Pro is powerful enough!!!

Quick and Dirty Thoughts on the WWDC Keynote

Here are some quick and dirty thoughts I have on many of the announcements at Apple’s WWDC Keynote on Monday.

Apple TV

Disappointed we didn’t get any new features in tvOS. Maybe next year with the introduction of new Apple TV hardware. YAY for the announcement of an Amazon Prime app though.

watchOS

Not really impressed here. The main things I think Apple Watch struggles with are…

  1. Access to audio controls

  2. A more predictive, contextual, ability to show things on the watch face

As for 1, Apple did address this by making music controls a swipe away during workouts in the Workout app. I was hoping for something a bit more globally accessible. They addressed 2 by introducing the Siri watch face. But for me, the Siri watch face is too much of a compromise because it can’t show any other complications on the screen at the same time.

I am also disappointed that they didn’t announce a Podcast app or Notes app.

macOS

No complaints here, really. I wanted them to start breaking iTunes down into smaller apps, maybe at least splitting Apple Music and TV into their own apps and leaving everything else iTunes does inside the app known as iTunes. Really though, I am cool with Apple making slower, steadier updates to macOS. My Mac is the machine I depend on the most for work, so I appreciate that Apple is focused on stability.

Hardware

The new iPads look great! I can see myself eventually buying the 10.5 inch size. I love my 12.9 inch for reading scores with the forScore app, but I really miss being able to hold it comfortably with one hand and read it in bed. Maybe the 10.5 inch will be the perfect compromise.

The iMac Pro looks fantastic. It’s not a machine I am looking for right now, though, so I will just enjoy it from afar and appreciate that Apple still cares about the Mac and its professional users.

iOS Features for iPad

  • Drag and drop: YES! Love it. Looks really well implemented too.

  • Dock: YES! A great idea I did not expect.

  • Files app: This is where I started to lose my mind. A native file browser with support for Google Drive and Dropbox is going to completely change the way I use my iPad! This might be my favorite announcement of the entire keynote.

  • System-wide markup: This is another one that is going to completely change the way I use my iPad.

  • Notes app: Sooooo much good stuff here. Inline drawing? AWESOME! Document scanner? AWESOME. Text-searchable handwriting? YES! Bye, Evernote.

… yeah. So this iPad stuff is going to be huge.

HomePod

Smart of Apple to position this device as competition for companies like Sonos rather than for products like Google Home and Amazon Echo. The speaker ecosystem is something I really enjoy about having Sonos speakers, but their lack of integration with my phone and music library is a constant hurdle. Something with good sound quality that I can operate without an extra app would be much more enjoyable.

Will I buy one of these? It is really hard to see how this will play out. Amazon Echo and Sonos are working on some kind of integration, which could potentially keep me in that ecosystem, though the idea of selling the Sonos speakers and eventually replacing them with these Apple speakers has crossed my mind. It might be the kind of situation where I get one HomePod just to get a feel for it and then wait on additional purchases.