Machine not learning
August 31, 2016 at 10:17 PM by Dr. Drang
A few days ago, I was driving from my hotel to the airport, with Siri and Maps providing directions. I thought there’d be a gas station before I got on the interstate, but as I got closer I realized I was wrong. I turned off the planned route and started hunting.
One of the things I like about Siri is that she doesn’t complain when you deviate from her instructions. In the old days, leaving the route or missing a turn would cause GPS units to say “Recalculating…” in what I interpreted as an exasperated tone. Siri just accepts your change and gives you new directions.
So as I was hunting, Siri kept patiently telling me how to get to the airport. When I didn’t see a gas station after a few blocks’ worth of looking, I gave up and asked her for directions to the closest one.
“Starting route,” she replied. “Head north on Isenberg Street.”
This is why I couldn’t bring myself to read Steven Levy’s new article on how Apple is making great advances in machine learning. The iBrain is already inside my phone? No, not yet.
You see, when Siri told me to head north on Isenberg, I was traveling south on Isenberg. In that circumstance, “head north” is a stupid instruction to give.
Siri knew perfectly well I was going south on Isenberg. Not half a minute earlier, she’d been telling me how to turn off Isenberg to get to the interstate. And she’d been tracking my location continuously since I left the hotel. The context was there, but it wasn’t used.
This is what’s most frustrating about Siri and why I find myself yelling at her so often. It has nothing to do with big data or compromised privacy. The problem I posed was ideal for Apple’s everything-on-the-phone strategy. It didn’t even require changing apps. And yet Siri interpreted “get directions to the nearest gas station” without any regard to the data she had in her hands just seconds earlier. For some reason, when I asked for directions, only my position was used in developing the answer. That I was in a car and traveling south—essential information for giving good directions—was ignored.
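And the direction of travel isn’t hard to come by. As a rough sketch—this is not how Maps actually computes it, and the coordinates are made up for illustration—two consecutive GPS fixes are enough to get a compass bearing and notice that a southbound driver shouldn’t be told to head north:

```python
import math

def initial_bearing(fix1, fix2):
    """Compass bearing in degrees (0 = north) from one (lat, lon) fix to the next."""
    lat1, lon1 = map(math.radians, fix1)
    lat2, lon2 = map(math.radians, fix2)
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def cardinal(bearing):
    """Bucket a bearing into the nearest of the four cardinal directions."""
    return ["north", "east", "south", "west"][round(bearing / 90) % 4]

# Two consecutive fixes on a north-south street, the second one farther
# south: the driver is clearly heading south, so a "head north" instruction
# would mean a U-turn, not a continuation.
heading = initial_bearing((21.300, -157.840), (21.290, -157.840))
print(cardinal(heading))  # south
```

That’s a dozen lines with data the phone is already collecting; the point is only that the context was sitting right there.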
Did Siri figure out I was going south and adjust her directions? Of course she did, and I got to the gas station fairly quickly. But the first thing she said was wrong, and that’s what ruins the experience.
Apple cares about the experience, doesn’t it?