Apple and Education

Apple held an education event last week at a ridiculously huge high school in Chicago. It was squarely aimed at what used to be one of their core (and most loyal) markets: K-12 schools in the United States. On this side of the pond, there have only been isolated areas where Apple gets a look-in. I used to be one of them, when I taught Media and Film Studies, but even then I didn’t have enough computers in the classroom for anything other than group work.

In these financially straitened times, Apple have been losing share to Google. Schools are starved of funds for ideological reasons, teacher salaries are rock bottom (also for ideological reasons), and Google offer both cheap computers (Chromebook) and a “free” suite of software that integrates with school systems.

Apple’s event introduced a new, cheaper iPad aimed at schools, which supports their (expensive) Pencil and has a suite of software aimed at school IT managers and teachers.

Now, if you take the iPad and consider what it can do, it’s great value. Whereas a Chromebook, like most cheap laptops, will fall apart within 3 years, an iPad will go on forever (as long as you don’t drop it). An iPad can be a still or video camera, and includes software to edit photos, create documents, and edit video or make music. Nothing in the Google suite of apps matches the quality of Apple’s software. Throw in the Pencil, and you can use the iPad across the curriculum. Which is not to mention the privacy concerns I’d have regarding Google and their “free” software.

It seems, however, that Apple has a problem when it comes to implementing class sets and multiple log-ins. Their user-switching tools are reportedly clunky. I don’t think, personally, that this is unique to Apple. I’ve watched students log into networked (PC) computers and (especially if it’s the first time they’ve used that particular machine), it can take a ridiculously long time. I’ve had students in my lessons who’d been issued with a laptop because of special needs, and they have sat waiting for it to log in for an entire lesson.

But if I was in charge of a budget and had the power to make things happen, would I buy iPads?

I don’t think I would. I’d replace suites of Windows PC and Chromebook computers with Apple in a heartbeat, but I’ve never been sold on the iPad.

Here’s the thing. A computer is only as good as its software, and while Apple’s software may be good (the best, even), here in the real world, teachers don’t have time to learn it. It’s not just budgets and salaries that are constrained, but time. You offer me a class set of brand new iPads (or even a one-iPad-per-child policy), and I’m going to shrug my shoulders. Those iPads are going to stay locked away, or in the students’ bags. Not only do I not have time to get to grips with the software I’d be using to assign work and set homework, but I don’t have time to design lessons and activities, or the inevitable administrative tasks that go along with setting class and homework.

We already get pointed towards online services that can be used for homework and resources. “It’ll save you time in marking,” they say. “It’s all marked automatically.” But it’s not just the marking time I don’t have. I don’t have the setting time, the thinking time, or the time to deal with the students who don’t do the assigned tasks (because, when a student doesn’t do the homework, you’re supposed to do something about it).

You think I’m whining. I teach seven different sets of students. Outside the extra time I choose to put in, I get 21 minutes per week, per class to plan lessons, set work, mark books, and do the admin for that class. Obviously, that’s impossible, so the extra time I put in is dedicated to those basic tasks.

So you can hand me the greatest IT tools in the world, the most amazing hardware and software, but I still don’t have time. It wouldn’t be so bad if the students themselves had any IT savvy, but it’s a rare student indeed who knows how to do anything beyond the basics. I spent 10 years teaching students how to use Page Setup and calling out, “You’ve got caps lock on,” when their log-in “wasn’t working.” These days, not being able to do something on a computer has replaced the dog as the most common reason homework isn’t done. I’ve decided that life’s too short to watch any more people accidentally lose all the work they did in an hour, or not know how to resize an image.

The continuing frustrations of Apple Music

The ‘functional high ground’ argument is in the air again, with various tech journalists and podcasters weighing in with their opinions on various parts of Apple software. There are still occasional glimpses of former glory: Music Memos was quietly released and is the kind of songwriting tool I’d have loved to have, back when I was writing a lot of songs. It does exactly what you’d want it to do: it’ll tune your guitar, it has a big record button, it knows what chords you’re playing, and it even adds a decent robot backing track. Only Apple, as we used to say so often, can do this.

But of course, iTunes continues to be horrible, and I’m not alone in thinking that Photos is a poor substitute for Aperture and doesn’t need to keep its editing tools so deeply buried – multiple clicks to achieve a simple edit is not good design.

My greatest frustration continues to be Apple Music, which is awful on so many levels that I haven’t even seen any of the pros complaining about the problems I’m encountering. Theirs are all to do with synching and matching and so on, whereas mine are mainly to do with basic functionality and interface.

But I’ll start off with that synching feeling. A permanent feature on my phone’s screen in Music is the phrase, ‘Showing only music on this iPhone. Show All Music.’

That last bit, highlighted in red, is supposed to be a button, but of course we’re not allowed to have ‘buttons’ anymore because they’re skeuomorphic. So we just have to guess that ‘Show All Music’ is a button. But take a look at this screenshot.

[Screenshot: IMG_8968]

Yes, the greyed out tracks are not on my iPhone. Not. On. My. iPhone. So I shouldn’t be seeing them, should I? And yet they are there.

Getting into my car the other night, I plugged my phone into the media connector, and of course it started playing music, starting with the first song beginning with A (currently ‘After Hours’ by the Velvet Underground). Which is not what I wanted it to do. One of my daily frustrations is that my phone, unlike iPods of old, rarely seems to remember where it had reached in the playback (I play through all my songs alphabetically, which is my favourite form of ‘shuffle’). So it defaults to the first of the ‘A’s and I get angry and frustrated – and begin to kind of hate and resent that hapless first song (of 1000) in the list.

So I’m sitting in the car and ‘After Hours’ starts up, and I pick up the phone to find where I thought the playback had reached, somewhere in the ‘D’s, and I tap the screen to play that song. Somewhere in the ‘D’s. I tap. And it starts to play ‘Baggage Claim’ by Miranda Lambert. What?

After a bit of experimentation, I realise that tapping somewhere in the ‘F’s will start playing somewhere in the ‘C’s, and so on. So as well as showing music that is not on the phone, Music is now also reading a tap in the ‘D’s as a tap in the ‘B’s. Brilliant. I suspect that means that the phone is seeing the tap where it really would be if it wasn’t displaying a bunch of songs that are not on the phone. If I could somehow make them invisible, I’d see that I was, in fact, tapping in the ‘B’s.
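That hypothesis amounts to an index-space mix-up: the tap gets translated into the position it would have if the greyed-out tracks were hidden, but is then looked up against the full, displayed list. Here’s a minimal Python sketch of that suspected behaviour – purely illustrative, with made-up song lists and function names, and no claim that this is how Apple’s code actually works:

```python
# Illustrative model of the suspected Music-app bug: the tapped row is
# translated into the index space of the on-device-only list, but then
# dereferenced against the full (displayed) library. All names hypothetical.

library = [  # everything the app displays, alphabetically
    "After Hours", "Baggage Claim", "Crazy Arms",
    "Daydream", "Eleanor Rigby", "Famous Blue Raincoat",
]
on_device = {"After Hours", "Crazy Arms", "Eleanor Rigby"}  # actually synced

def song_for_tap_buggy(tapped_row):
    # Count how many on-device songs sit above the tapped row: that is the
    # row this tap *would* land on if the greyed-out tracks were hidden...
    hidden_space_row = sum(1 for s in library[:tapped_row] if s in on_device)
    # ...but the result is then used to index the full library, so every
    # greyed-out track above the tap shifts playback earlier in the alphabet.
    return library[hidden_space_row]

def song_for_tap_fixed(tapped_row):
    # Correct behaviour: play the song actually shown on the tapped row.
    return library[tapped_row]

print(song_for_tap_buggy(3))  # tap the 'D' row, get a 'C' song back
print(song_for_tap_fixed(3))  # tap the 'D' row, get the 'D' song
```

Whether that’s the real implementation is anyone’s guess, but the sketch reproduces the symptom: the further down the list you tap, the further back playback lands.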

That was happening under the My Music tab (which is not a button, nor is it a tab, so what are we supposed to call it, Jony Ive? A section?). When I tapped into Playlists, my one and only playlist was there, and I was able to navigate through it and play back normally. But I shouldn’t have to do that, because there’s only one fucking playlist on the phone, and so the My Music section should have just the songs from that.

And I wouldn’t have to be doing all this if the app remembered where it was in the playback list.

Take a look at this second screenshot:

[Screenshot: IMG_8969]

Yes, that’s in the Playlist section, and you can see we’re playing back ‘Give Him a Great Big Kiss’ by the Shangri-Las. Now, what if I wanted to skip to the next track from this screen? I could tap the next on the list, but if I wanted to skip several, I’d have to scroll first, stop the scrolling, then tap the one I wanted. So to see an actual Skip option, I have to tap the tiny white band near the bottom and go into the Playback screen. What if I wanted to pause playback? There is a chance of doing that, because there is a tiny play/pause not-a-button. But what are the chances of hitting this first time if I am walking, or driving my car, or riding my bicycle? 50/50 at best, I’d say. Note that the image above, because of the resolution of my phone’s screen, exaggerates the size of the tap target. Now, given how huge my phone’s screen is, why does the play/pause button have to be so small? Why is there no skip forward or skip back?

Why is this software such a piece of shit at doing what it’s supposed to do?

I suspect the answer to that is that I’m not supposed to want to do any of this. I’m supposed to subscribe to the streaming service, and put my music listening into the hands of curators or algorithms, and just passively accept whatever Music decides to play. I suppose there might be some people who are happy with that. Me? Not so much.

Which brings me to my final complaint. Apart from not giving enough space to playback controls, the Music app also imposes the Radio not-a-button and the Connect not-a-button at the bottom of the screen, where they are liable to (only ever) be clicked accidentally, much like the Moments button on twitter.com. Why can’t I go into a Preferences pane and turn those items off? Why can’t I select an option for a larger playback button? Why doesn’t telling the app only to show music on the iPhone do what it’s supposed to do?

Because Apple Music is shite. And here’s the thing: I’ve tried a couple of other playback apps, hoping to leave all the Music nastiness behind, but they don’t work. They too forget where they were playing back – because they can never take over the System playback from Music. And they too (presumably following the Human Interface Guidelines) have tiny playback buttons.

Grrr!

And don’t get me started on the Music app on the new Apple TV…

Little Red Riding Hood

I wrote and recorded this song some time ago. I think it must have been one of the last things I did with my late-lamented Pro Tools system. I listened to it the other day for the first time in ages. Quite a lot of the music I recorded back then was a bit rubbish, but I still quite like this. So I made a video using found footage from YouTube (and a few clips nicked from student projects).

The history behind it is that, in my previous employment, I was tasked with taking the company into new markets. Music technology was one. Back in the 80s, I’d made a few recordings with portable 4-track and 8-track systems. My ultimate set-up, in the early 90s, was to have a big analogue mixer and a Fostex 8-track reel-to-reel set up in my spare bedroom.

Fast forward to the 2000s, and I was looking for the first time at computer recording systems. My level of ignorance was high. For example, not only did I not know what MIDI was, I couldn’t have told you the difference between MIDI and audio, and I didn’t know what a sequencer was, or a plug-in, and so on.

I spent 18 months researching this stuff, among other things. My employer was remarkably tolerant, but eventually we became a reseller for all the major software and hardware companies in music technology.

Along the way, I accumulated a garage full of gear. I had a G5 iMac, external hard drives, a Joe Meek voice channel, a very decent Shure condenser mic, some high-end headphones, some high-end monitor speakers, an M Box, a Pro Tools LE system, and all the plug-ins.

In truth, I had plug-ins my Mac couldn’t cope with. For example, the Native Instruments Electric Piano was so demanding that I could never do anything with it. So for piano sounds, I used something called Ivory, which was more forgiving, and for a lot of samples I used Sampletank.

I left that job and the plug-ins went on working for a while, but over a couple of years my licences expired and I was left with a bare-bones system. Ironically, of course, by the time I’d learned enough to make the best use of some of those effects plug-ins, I was down to the bare bones. Back when I had them all, I was fairly inept all round and made some fairly shonky recordings.

At the last gasp, before everything stopped working (the system was so old that eventually you had to choose between a working web browser/OS or working Pro Tools, although I didn’t know it beforehand), I participated in February Album Writing Month. I wrote 14 or 15 songs in a month, and recorded very quick and dirty demos.

I had reached a certain maturity. I still loved guitars, but knew that it took me a long time to get good sounds and good takes. It’s quite hard to record an acoustic guitar and have it sound as nice as your actual guitar. Same goes for electric. I’d mastered a certain sound, but in order to avoid it becoming samey, I started making use of more piano and different drum sounds, dipping into samples I’d ignored for years. I think Little Red Riding Hood came out of this. My last few recordings all sound very different from my sometimes quite awful early efforts. I’d achieved a certain competency, and acknowledged that I wasn’t ever going to be the guitar player I wanted to be.

I still quite like this song.