## FR. Who Wants To Be A Mathematician?

Kyle and Chaim finally get back into the studio!

We first pose a quick question: *If you drive fifty miles in fifty minutes, must there be some ten minute interval in which you drive exactly ten miles?*

Of course there must — mustn’t there? *Well prove it!*

Our main feature this week is an interview with Michael Breen, from the American Mathematical Society, who came and hosted the game show “Who Wants To Be A Mathematician!” About a hundred high school students from all over northwest Arkansas came to cheer on their classmates; Kyle Strong of Har-Ber High in Springdale came in first, winning $1250, and Karan Batra, of Bentonville, placed second with $250.

Our interview includes a few sample problems… I guess we shouldn’t list too many of them, in case Michael wants to recycle them! Mike’s also responsible for the great series of Mathematical Moments posters; check them out!

PS: We opened with the Up To One Million Dollars In Prize Money May Be Given Away gag… Always fun!

## pgc said,

May 1, 2009 at 7:55 am

[spoiler]

Suppose that we cover 50 miles in 50 minutes but that there is no 10 minute period that covers 10 miles.

If the first 10 minutes covers less than 10 miles, then every 10 minute period must cover less than 10 miles (since otherwise, as the window slides along, we’d have to pass through a moment where we covered exactly 10 miles in 10 minutes). It follows that we must travel less than 50 miles in the full 50 minutes.

Similarly if the first 10 minutes cover more than 10 miles then the overall trip would cover more than 50 miles in 50 minutes since we’d never be allowed to slow down enough without violating the initial supposition.

So that’s a proof by contradiction that there must be a 10 minute period that covers exactly 10 miles.

[/spoiler]
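The sliding-window idea in the spoiler can be seen numerically; here is a small sketch (the two-speed trip profile below is a made-up example, not anything from the puzzle or the comments):

```python
# Sketch of the sign-change argument: for a sample trip covering 50 miles
# in 50 minutes, let f(t) = (miles covered in [t, t+10]) - 10. If f changes
# sign, then somewhere f(t) = 0 exactly -- a 10-minute, 10-mile window.

def distance(t):
    """Miles covered by time t (minutes) for a sample non-uniform trip:
    fast for the first 25 minutes, slow for the last 25."""
    if t <= 25:
        return 1.6 * t          # 1.6 miles/min for the first half (40 miles)
    return 40 + 0.4 * (t - 25)  # 0.4 miles/min for the second half (10 miles)

def f(t):
    return distance(t + 10) - distance(t) - 10

# f starts positive (fast stretch) and ends negative (slow stretch)...
assert f(0) > 0 and f(40) < 0

# ...so bisection homes in on a t with f(t) = 0.
lo, hi = 0, 40
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)

print(round(lo, 6), round(distance(lo + 10) - distance(lo), 6))  # -> 20.0 10.0
```

For this particular profile the window runs from minute 20 to minute 30; a different profile would give a different window, but the sign change always forces one to exist.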

## strauss said,

May 1, 2009 at 9:32 am

Stephen S writes:

Firstly, hi: found your show and have listened to all the back podcasts now to catch up. Yes, I have quite a long drive some days. I’m in Bath in the UK, by the way.

Secondly, the quick car trip question is false, since I can do the whole trip at 60 miles an hour (50 miles, fifty mins) and never do 50 at all. In fact I could do 49 miles in 1 min and take the last 1 at a very leisurely pace to do the time and distance.

Actually, in the real world, I’d allow 35 mins to do 50 miles as this is Europe – assuming a clear road and sleeping cops. (Which around here is often…)

We note:

Glad we’ve made your drives go by quicker!

Let’s see, even if you drove 49 miles in one minute (at a steady rate of 49 miles per minute) and then covered the rest in 49 minutes (at a steady rate of 1/49 miles per minute), there’d still be the ten minute interval from 0:48 through 10:48 in which you’d drive exactly 10 miles!

(That is, during that interval you’d drive 1/5 of a minute at 49 mpm, covering 9.8 miles, and 49/5 minutes at 1/49 mpm, covering 0.2 miles, for a grand total of 10 miles in 10 minutes.)

So this would still work out!
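For the skeptical, the interval claimed above can be checked numerically; a quick sketch of the hypothetical two-speed trip from the reply:

```python
# Check the 0:48 -> 10:48 window for the hypothetical trip:
# 49 miles in the first minute, then 1/49 miles per minute afterwards.

def distance(t):
    """Miles covered by time t (in minutes)."""
    if t <= 1:
        return 49 * t           # 49 miles/minute for the first minute
    return 49 + (t - 1) / 49    # 1/49 miles/minute for the remaining 49

start = 48 / 60                  # 0:48, i.e. 0.8 minutes into the trip
miles = distance(start + 10) - distance(start)
print(round(miles, 9))           # -> 10.0
```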

Thanks for writing, and thanks for listening!

Chaim

## Nick Deguillaume said,

February 1, 2010 at 3:19 pm

Proof:

Define a function g on the domain [0,40]: g(t) is the distance travelled, in miles, in the ten minute period following t. Let f be defined by f(t) = g(t) - 10. It is enough to show that there exists a t such that f(t) = 0.

We know that f is continuous from the following argument:

f(t) is the difference of the two integrals I(t) = Integral(v(u), 0, t+10) and J(t) = Integral(v(u), 0, t), where v(u) is the speed at time u. Making the critical assumption that the speed v is continuous (to get from one speed to another you must pass through every speed in between; you can’t change speed instantaneously), these integrals are differentiable, since for any t, I'(t) = v(t+10) and J'(t) = v(t). Differentiable functions are continuous, and the difference of two continuous functions is continuous. Therefore f is continuous.

To complete the proof we invoke the intermediate value theorem.

We have that f(0)+f(10)+f(20)+f(30)+f(40) = 0, from the assumption that you travel 50 miles in 50 minutes (the five values of g partition the trip, so their sum is the total distance, 50). We can assume that none of f(0), f(10), f(20), f(30) or f(40) is equal to 0, since otherwise the result follows immediately. We can therefore assert that there exists a consecutive pair (s, s+10) in the ordered set (0, 10, 20, 30, 40) such that the sign of f(s) is opposite to that of f(s+10). Invoking the intermediate value theorem, there exists a u in the interval [s, s+10] such that f(u) = 0, as required.

QED

This proof relies on the fact that 50 is divisible by 10. I think the statement holds true if we replace 10 by 12, but I have not figured out how to prove this. Please tell me if you find a solution.
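The key identity in this proof, that f(0) + f(10) + f(20) + f(30) + f(40) sums to zero for any 50-mile, 50-minute trip, can be sketched numerically; the wobbly speed profile below is an arbitrary made-up example:

```python
# The five windows [0,10], [10,20], ..., [40,50] partition the trip, so
# g(0) + g(10) + ... + g(40) equals the total distance (50 miles), and
# f(0) + f(10) + ... + f(40) = 50 - 5*10 = 0, whatever the speed profile.

from math import sin

def raw(t):
    """A made-up cumulative distance: about a mile a minute, with a wobble."""
    return t + 2 * sin(t / 3)

total = raw(50)

def d(t):
    """Rescaled so the trip is exactly 50 miles in 50 minutes."""
    return 50 * raw(t) / total

f = lambda t: (d(t + 10) - d(t)) - 10

s = sum(f(t) for t in range(0, 50, 10))
print(round(abs(s), 9))   # -> 0.0: the window sums collapse to d(50) - d(0)
```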

## Nick Deguillaume said,

February 1, 2010 at 3:44 pm

Oops, just read the spoiler; it generalises to 12. Thanks