For the drive to Montréal, I borrowed a TomTom GPS device — for fun, really; the way is straightforward. It’s the first time I’ve used one in my car, and I have to say that despite the advantage of portability, the built-in ones I’ve seen in friends’ cars are far better, what with their larger screens and the consequent improvement in user interface.
One thing the TomTom does is show your driving speed. Because it’s measured from the GPS signals, I presume it’s accurate. And I found something interesting: my car’s speedometer reads about 5% high, at least at highway speeds. I had to go 68 or 69 MPH on my speedometer for the GPS to show 65 MPH (the speed limit on the Northway).[1]
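For anyone who wants to check the arithmetic behind that “about 5%”, here’s a small sketch (in Python, using only the numbers quoted above) that computes how far an indicated speed sits above the GPS-measured speed:

```python
# Compare the speedometer's indicated speed with the GPS-measured speed,
# using the figures from the paragraph above: 68-69 MPH indicated when
# the GPS showed 65 MPH.
def percent_high(indicated_mph: float, gps_mph: float) -> float:
    """Return how far the indicated speed reads above the GPS speed, in percent."""
    return (indicated_mph - gps_mph) / gps_mph * 100

for indicated in (68, 69):
    print(f"{indicated} MPH indicated at 65 MPH actual: "
          f"{percent_high(indicated, 65):.1f}% high")
# Prints roughly 4.6% and 6.2%, so "about 5%" is a fair summary.
```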
On thinking about it, it makes sense to design speedometers to read slightly high — to overstate the speed a bit. That’s certainly preferable to having them read low. If I think I’m going 70, but I’m really doing a little less than 67... I think I’m speeding just a bit, but I’m actually following the speed limit pretty well.
Of course, then it becomes like the people who set their clocks fast so they’ll think they’re running late and hurry. They know how fast their clocks are, so they compensate for it — they don’t worry if it looks like they’re ten minutes late because the clock is ten minutes fast. Similarly, I wonder if I’ll now compensate for my speedometer, and mentally subtract 3 MPH from what it reads on the highway.
Probably not.
[1] But note, here, that apart from the accuracy of the two devices, they also have different precisions (see this post for an explanation of the difference). The precision of the GPS device’s speed display is 1 MPH, while the precision of the speedometer’s is 5 MPH. “68 or 69 MPH” is an interpolation.