Nov 28, 2020
RAY: There was a young gal who had a few bad accidents. So her grandmother decided to give her an unusual gift: a large sum of money to buy a brand-new safe car, like a Volvo.
There was one condition, however. When she got the car, Grandma wanted to see it, to make sure that she didn't take the money and go out and buy a Firebird. So on the first available Saturday, she decides to drive to Grandma's house, which is 120 miles away. Because she's not particularly eager to get there, she gets on the highway and sets the cruise control for 40 miles an hour.
She drives 120 miles to Grandma's house. Her new car has a little computer that tells her that her average speed is 40 miles an hour.
When she gets there, she shows Grandma the car and hightails it out of there; she's eager to get home because she wants to go to the tattoo parlor before it closes. She sets the cruise control for 60 miles an hour.
She travels the same road and the same 120 miles. When she gets home, she does a little figuring. She says, "I drove 120 miles up, 120 miles back, or 240 miles. I drove 40 miles an hour up, and 60 miles an hour back, so my average speed was 50 miles an hour, and it should have taken me 4.8 hours. But it took me 5 hours!"
The question is, how can that be?
RAY: She didn't calculate the average speed; she calculated the average of the speeds.
In fact, her average speed is not 50 miles an hour but 48 miles an hour. And you arrive at that by calculating the total distance divided by the total time. She spent 3 hours driving up (120 miles at 40 miles an hour) and 2 hours driving back (120 miles at 60 miles an hour), so the total distance is 240 miles and the total time is 5 hours. So 240 divided by 5 comes out to be 48 miles an hour.
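For anyone who wants to check the numbers, here's a quick sketch of both calculations; the figures come straight from the puzzler, and the variable names are just for illustration.

```python
# Compare the average of the speeds with the true average speed,
# which is total distance divided by total time.

distance_each_way = 120  # miles
speed_out = 40           # mph, cruise control on the way to Grandma's
speed_back = 60          # mph, cruise control on the way home

time_out = distance_each_way / speed_out    # 3.0 hours
time_back = distance_each_way / speed_back  # 2.0 hours

total_distance = 2 * distance_each_way      # 240 miles
total_time = time_out + time_back           # 5.0 hours

average_of_speeds = (speed_out + speed_back) / 2   # 50 mph -- her (wrong) figure
true_average_speed = total_distance / total_time   # 48 mph -- the right answer

print(f"Average of the speeds: {average_of_speeds} mph")
print(f"Actual average speed:  {true_average_speed} mph")
```

Because she spends more time at the slower speed, the slow leg counts for more, which is why the true average lands below 50.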