Dirty vs. Clean Car
Posted by David Zaslavsky

Hot on the heels of their Bullet Fired vs. Bullet Dropped episode, the Mythbusters have another result that’s poised to shake up the world of science… well, maybe not. But this week’s main myth, Dirty vs. Clean Car, is the kind of neat idea that most of us would never think to test and yet turns out to be surprisingly close to practicality. The myth that Adam and Jamie are testing is that dirt on a car has the same kind of effect as golf ball dimples, increasing the fuel efficiency of the car. To sum up the results (SPOILER ALERT ;-), it doesn’t work, at least not with dirt — but putting an actual dimpled coating on a car does increase the fuel efficiency by 11%. (Only on Mythbusters would they dimple a car…)
As with a lot of recent myths, this one deals with fluid dynamics — but not just the simple stuff like drag force, as in the bullet myths. The golf ball effect is based on turbulence, specifically the idea that the rough surface of the ball induces turbulence which disrupts the wake (the pocket of still air) that trails behind the ball. That pocket of still air takes energy to travel along with the ball; specifically, the whole combination of ball and wake has a kinetic energy

\[K = \frac{1}{2}\bigl(m_\text{ball} + m_\text{wake}\bigr)v^2\]
The amount of energy provided by the golf club is, on average, pretty much constant. So if you disrupt the wake, there’s less still air and \(m_\text{wake}\) goes down, which means that \(v^2\) can go up. The ball moves faster and travels further.
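To make that concrete, here’s a quick Python sketch of the energy argument. The kinetic energy and the two wake masses below are made-up, purely illustrative numbers (only the ball mass is roughly realistic); the point is just that with the energy held fixed, a smaller wake means a faster ball.

```python
from math import sqrt

# Kinetic energy delivered by the club, held fixed (illustrative value only)
E = 100.0               # joules
m_ball = 0.046          # kg, roughly the mass of a golf ball

# Hypothetical effective wake masses -- made-up numbers for illustration
m_wake_smooth = 0.020   # kg, bigger wake behind a smooth ball
m_wake_dimpled = 0.005  # kg, smaller wake once the dimples trip turbulence

def speed(m_wake):
    """Speed of the ball-plus-wake combination for fixed kinetic energy E."""
    return sqrt(2 * E / (m_ball + m_wake))

print(f"smooth ball:  v = {speed(m_wake_smooth):.1f} m/s")
print(f"dimpled ball: v = {speed(m_wake_dimpled):.1f} m/s")
```

Same energy, less still air to drag along, higher speed: that’s the whole point of the dimples.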
Turbulent systems are notoriously difficult to analyze in any detail. So I’m not even going to try to reproduce the result from the show with a calculation or simulation. But I do have a couple of points to pick on:
First of all, when measuring fuel efficiency, Adam and Jamie only ran 5 trials for each configuration. Sure, it takes time and effort to run the car down their 1-mile track, so there’s a practical limit on how many times you can do that, but the fact remains that 5 is not a very large sample size. With so few trials, is the improvement they observed from the dimples (11%) really significant, in a statistical sense?
To figure that out, we’d like to calculate the “standard error of the mean” for the data the Mythbusters collected. Standard error of the mean, denoted \(\sigma_{\bar{x}}\), is basically a measure of how precise your average is; there’s a 68% chance that the actual value is between \(\bar{x} - \sigma_{\bar{x}}\) and \(\bar{x} + \sigma_{\bar{x}}\). The smaller the standard error of the mean, the more precise your measurement. If you assume that your individual measurements are fairly reliable (which you could argue about in this case, but I won’t), it can be calculated from the formula

\[\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}\]

where \(\sigma\) is the standard deviation of the individual measurements and \(n\) is the number of trials.
Just one problem, though: we don’t have the data! So I’m going to make a guess based on the second test, with the fully clean car, for which Adam reported that half the trials yielded a value of \(3\tfrac{1}{2}\) inches and the other half yielded a value of \(3\tfrac{5}{8}\) inches. If they ran 4 tests, \(\sigma_{\bar{x}} = 0.031\) inches, corresponding to a relative error \(\sigma_{\bar{x}}/\bar{x}\) of 0.9%. And if the relative error for the clay-covered car was on the same order, about 1%, that’s much smaller than the 11% improvement they noticed. So yeah, it’s definitely statistically significant. (Now I feel kind of silly for going through all that work.)
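If you’d rather let the computer do that arithmetic, here’s a short Python version of the estimate. The four readings are my guess at the data, as described above, and it uses the population standard deviation (dividing by \(n\) rather than \(n-1\)), which is what reproduces the 0.031-inch figure.

```python
import statistics
from math import sqrt

# My guess at Adam's readings for the fully clean car: two trials at 3 1/2 in
# and two at 3 5/8 in (assuming they ran four trials, as above)
readings = [3.5, 3.5, 3.625, 3.625]   # inches

n = len(readings)
mean = statistics.fmean(readings)
sigma = statistics.pstdev(readings)   # population standard deviation (divide by n)
sem = sigma / sqrt(n)                 # standard error of the mean

print(f"mean           = {mean:.4f} in")
print(f"standard error = {sem:.3f} in")
print(f"relative error = {sem / mean:.1%}")
```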
Here’s my other point to pick on (this one positive): Adam makes a good observation in the show about why the fuel efficiency doesn’t change when they add on 800 pounds of clay. As he explained, this is due to the clever way they designed their test: they don’t count the fuel used to accelerate the car up to 65 mph, only the fuel required to maintain that speed along the mile-long track. On the show they only said that, because the test didn’t simulate real-world driving conditions, the mass didn’t have an effect; here’s the quantitative explanation of why you can say that.
As the car moves down the track, it’s subject to the force of the engine (of course), a drag force exerted by the air, and a small amount of rolling friction exerted by the road on the tires. Using Newton’s second law,

\[F_\text{engine} - F_\text{drag} - F_\text{friction} = ma\]
Now, fuel economy is measured in miles per gallon, but each gallon of fuel corresponds to a roughly constant amount of energy. So the reciprocal of fuel economy would be roughly proportional to propulsive energy per unit distance, which is just the engine’s force:

\[\frac{1}{\text{fuel economy}} \propto \frac{E_\text{propulsive}}{d} = F_\text{engine}\]
Putting these last two equations together,

\[\frac{1}{\text{fuel economy}} \propto ma + F_\text{drag} + F_\text{friction}\]
The drag force doesn’t depend on the car’s mass. The frictional force? It probably does depend on mass, but it’s so small that we can basically ignore it (that is, after all, why humans invented wheels in the first place). So the only dependence on mass that’s left is the \(ma\) term. If the car isn’t accelerating, that goes away. By running their tests at constant velocity, the Mythbusters managed to basically remove any effect that the car’s mass would have on the fuel economy they measured.
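To put some rough numbers on that, here’s a little Python sketch of the force balance. Every parameter below (drag coefficient, frontal area, rolling-resistance coefficient, masses) is a generic sedan-ish guess, not anything measured from the show; the point is just that at a steady 65 mph the mass enters only through the small rolling-resistance term, while even a gentle acceleration adds an \(ma\) term that dwarfs everything else.

```python
# Rough force balance at 65 mph; every number here is an illustrative guess,
# not a measurement of the actual test car.
rho = 1.2        # air density, kg/m^3
Cd = 0.35        # drag coefficient, typical sedan
A = 2.2          # frontal area, m^2
Crr = 0.01       # rolling-resistance coefficient
g = 9.8          # m/s^2
v = 29.0         # 65 mph, in m/s

def engine_force(mass, accel=0.0):
    """F_engine = m*a + F_drag + F_friction, from the equation above."""
    F_drag = 0.5 * rho * Cd * A * v**2
    F_friction = Crr * mass * g
    return mass * accel + F_drag + F_friction

m_clean = 1500.0          # kg, car without clay (guess)
m_clay = m_clean + 360.0  # kg, with roughly 800 lb of clay added

print(f"clean car, constant speed:    {engine_force(m_clean):.0f} N")
print(f"clay-covered, constant speed: {engine_force(m_clay):.0f} N")
print(f"clean car, a = 1 m/s^2:       {engine_force(m_clean, accel=1.0):.0f} N")
```

With these guesses, the extra clay shifts the constant-speed force only through the modest rolling-resistance term, while even a gentle 1 m/s² acceleration roughly quadruples the force the engine has to supply.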
Unfortunately, one thing you may notice about that last equation is that any nonzero acceleration drives the fuel economy down. People tend to do a lot of accelerating (in the physics sense, which includes braking) in their cars, and I have a feeling that’s going to be a much larger effect than anything that could be gained by putting dimples on new car models. But hey, like Jamie said, maybe we’ll see it in NASCAR someday…