But interestingly, e^(-x^2) has no elementary antiderivative; that is, no function that can be assembled from the sorts of functions one usually sees: polynomials, trig functions, exponentials, logs, etc. It’s useful enough, though, that “erf” was invented for just this purpose.
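Since the antiderivative has no elementary form, erf is how the integral gets computed in practice. A minimal sketch (the helper names are mine) comparing the erf expression for the integral of e^(-x^2) from 0 to 1 against a crude midpoint-rule sum:

```python
# erf(x) = (2/sqrt(pi)) * integral of e^(-t^2) from 0 to x,
# so integral_0^x e^(-t^2) dt = (sqrt(pi)/2) * erf(x).
from math import erf, exp, pi, sqrt

def integral_via_erf(x):
    """Integral of e^(-t^2) from 0 to x, expressed through erf."""
    return (sqrt(pi) / 2) * erf(x)

def integral_via_riemann(x, steps=100_000):
    """Crude midpoint-rule check of the same integral."""
    h = x / steps
    return sum(exp(-((i + 0.5) * h) ** 2) for i in range(steps)) * h

print(integral_via_erf(1.0))      # ~0.7468
print(integral_via_riemann(1.0))  # agrees to several decimal places
```

The point is only that “erf” is the name given to a function defined by this integral, not a shortcut around it.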

I got none of them, but had some thoughts that might help.

1. I have made several false starts on this one, but have little to add. From a little experimentation (with numbers, not noodles), I think the number is relatively small compared to N and increases much more slowly than N. I think it may even converge to a relatively small number (maybe 2 or phi).

2. This problem turned out to be much harder than I thought. I tried starting with the statistical odds of everyone getting the right nametag (1/N!), but I couldn’t work it the other way, because at every step after the first I am unsure of the number of loops involved. As Steve said, the first person has an (N-1)/N chance of getting the wrong one, but the second may have an (N-1)/(N-1) (100%) chance, or may have an (N-2)/(N-1) chance. Multiplying the first by the second in each case, you get ((N-2)/N)*((N-2)/(N-1)), which covers the case where A drew neither A nor B, plus (1/N)*((N-1)/(N-1)), which covers the case where A drew B (and B then can’t draw B). So the odds of the first 2 both being wrong should be ((N-2)^2)/(N*(N-1)) + 1/N. Doing it this way, the formula gets really long. I think knowing the loops solves this, though: essentially, if we have an expected value for the loops, we can get the probability that no one connects with their own name tag.

3. I am not certain, on this one, whether we are relating M to the number of cards or to the numbers we draw. If the number of cards is unknown but the counting numbers on them are consecutive, I think M should relate to the value of the first card drawn: take a sample M cards large, and the next card to exceed the largest in the sample is the stopping point. I also had the thought that you could go to the first card that exceeds your current largest value by exactly one, but again, this assumes that consecutive numbers have been used.
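Point 2’s case analysis can be checked by brute force; adding the two cases gives (N-2)^2/(N*(N-1)) + 1/N for the first two guests both being wrong, and enumerating all N! draws (a quick sketch, function names mine) agrees:

```python
# Brute-force check of the "first two guests both wrong" probability.
# Tag i belongs to guest i; a draw is a permutation p, and guest 0 is
# wrong when p[0] != 0, guest 1 when p[1] != 1.
from itertools import permutations

def first_two_wrong_exact(n):
    perms = list(permutations(range(n)))
    good = sum(1 for p in perms if p[0] != 0 and p[1] != 1)
    return good / len(perms)

def first_two_wrong_formula(n):
    # the two cases from point 2, added together
    return (n - 2) ** 2 / (n * (n - 1)) + 1 / n

for n in (3, 4, 5, 6):
    print(n, first_two_wrong_exact(n), first_two_wrong_formula(n))
```

Both columns match, which at least confirms the two-guest case before the formula “gets really long.”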

And speaking of words, “plinko” owes its name in part to pachinko, a Japanese game that functions as a cross between a pinball machine and a slot machine, but whose mechanism is Galton-quincunx-like.

Finally, it’s interesting that the Magic Putnam problem is a kind of inverse of the normal curve, and thus gives us a somewhat intuitive idea of how that curve takes shape.

I agree with empire. Good questions, but they are getting harder! I’m hoping Chaim will post the answers at some point.

I’ve seen the spaghetti one before, so I’ll leave that to others; have fun in the kitchen!

2) I have a partial solution.

I’m quite interested in the parties Chaim goes to where the guests put their nametags in a bowl. Are math conferences more fun than I’ve imagined?

Like the creators of the infinite improbability drive, I don’t get invited to those sorts of parties; I just calculate their probabilities and complain about the loud music.

Suppose there are N guests. Let the chance of them all getting the wrong nametag be p(N).

The first guest, A, has an (N-1) in N chance of picking the wrong nametag. Let’s say they pick the nametag for B. Now consider B. They can’t pick their own nametag as it has already been picked. So suppose they pick C’s nametag. Now consider C, again we know their nametag is taken by someone else, B. We can continue like this until we find the person who picks the nametag for A.

Some examples with probabilities:

A picks A – probability 1 in N, but we fail in our goal of everyone having the wrong nametag.

A picks B, B picks A – probability (N-1)/N x 1/(N-1) = 1/N

A picks B, B picks C, C picks A – probability (N-1)/N x (N-2)/(N-1)x 1/(N-2) = 1/N

A picks B, B picks C, C picks D, D picks A – probability (N-1)/N x (N-2)/(N-1)x (N-3)/(N-2) x 1/(N-3)= 1/N

and so on. Note that each loop uses exactly the nametags of the guests in it, and so the remaining nametags match the remaining guests.

The loop is between 1 and N in length, with each possible length having a probability of 1/N. Of course, only loops of length 2 or more will work for us, and these leave between 0 and N-2 guests unaccounted for.
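The claim that each loop length from 1 to N is equally likely can be checked directly by enumerating all permutations for a small N (a sketch; the function name is mine):

```python
# Follow guest A (index 0) around the loop A -> p[0] -> p[p[0]] -> ...
# and tally the loop length over every possible draw of the nametags.
from collections import Counter
from fractions import Fraction
from itertools import permutations

def loop_length_distribution(n):
    counts = Counter()
    for p in permutations(range(n)):
        length, j = 1, p[0]
        while j != 0:
            j = p[j]
            length += 1
        counts[length] += 1
    total = sum(counts.values())
    return {k: Fraction(v, total) for k, v in sorted(counts.items())}

print(loop_length_distribution(5))
# each length 1..5 appears with probability exactly 1/5
```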

Suppose our loop was N-r in length so there are r guests unaccounted for. Since the remaining nametags match the remaining guests we are at the starting position but now we have r nametags and guests, so the chance of success is now p(r).

By summing over the possible lengths we have the following formula:

p(N) = SUM( 0 <= r <= N-2 )( 1/N x p(r) )

and so

Np(N) = SUM( 0 <= r <= N-2 )( p(r) )

Comparing this with the same identity at N+1 gives, for N >= 0:

(N+2)p(N+2) = (N+1)p(N+1) + p(N)

Unfortunately I have no idea how to solve this, but it looks to me as though it should involve e. (Is that convincing?)

I did try substituting q(N) = Np(N). You need to start with N=1 to avoid issues with zeroes. This gives:

q(1) = 0; q(2) = 1; for N >= 1, q(N+2) = q(N+1) + q(N)/N

However this didn’t seem to help at all, so I valiantly retreated. That’s to say I gave up.
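Even without a closed form, the recurrence is easy to iterate numerically, and doing so backs up the hunch about e: starting from p(1) = 0 and p(2) = 1/2, p(N) settles down next to 1/e almost immediately. A sketch (names mine):

```python
# Iterate (N+2)p(N+2) = (N+1)p(N+1) + p(N) from p(1) = 0, p(2) = 1/2
# and compare the values against 1/e.
from math import e

def p_values(n_max):
    p = [None, 0.0, 0.5]  # p[1] = 0, p[2] = 1/2; index 0 unused
    for n in range(1, n_max - 1):
        p.append(((n + 1) * p[n + 1] + p[n]) / (n + 2))
    return p

p = p_values(20)
for n in (2, 5, 10, 20):
    print(n, p[n])
print("1/e =", 1 / e)
```

By N = 10 or so the values are indistinguishable from 1/e at double precision, which is about as convincing as a numerical experiment gets.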

3) Again I don’t have a nice solution, but I do have a formula of sorts.

Let’s assume that the numbers written on the N cards are 1 to N.

If we win, then the largest number in the first M cards is between M and N-1. Say it is r. The probability of us winning with the highest sample card being r is the product of the following probabilities:

all cards higher than r fall after the first M cards; probability (N-M)/N x (N-M-1)/(N-1) x … x (r-M+1)/(r+1);

r is in the first M cards, given the above; probability M/r (with the N-r higher cards fixed after position M, card r is equally likely to be in any of the r remaining positions, M of which are in the sample);

N is the leftmost of the higher cards; probability 1/(N-r).

This product is M/(r(N-r)) x (N-M)!/(r-M)! x r!/N!

So all that’s left is to sum this for r between M and N-1, then calculate the optimum M for any N, then take the limit as N increases.
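For comparison, the classical way to write this strategy’s win chance is P(M) = (M/N) x SUM( M <= k <= N-1 )( 1/k ). A sketch (names mine) that checks this against brute force over all orderings for a small N, then locates the optimum M for a larger N; the ratio M/N does indeed head toward 1/e:

```python
# Strategy: skip the first m cards, then take the first card that beats
# the sample maximum; we "win" when that card is the overall largest, n.
from itertools import permutations
from math import e

def win_prob_formula(n, m):
    return (m / n) * sum(1 / k for k in range(m, n))

def win_prob_brute(n, m):
    wins = total = 0
    for cards in permutations(range(1, n + 1)):
        total += 1
        threshold = max(cards[:m])
        picked = next((c for c in cards[m:] if c > threshold), None)
        wins += (picked == n)
    return wins / total

n = 8
for m in range(1, n):
    print(m, win_prob_formula(n, m), win_prob_brute(n, m))

n = 200
best_m = max(range(1, n), key=lambda m: win_prob_formula(n, m))
print(best_m / n, "vs 1/e =", 1 / e)
```

The formula and the brute-force columns agree exactly, and for n = 200 the best m sits right around n/e.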

It looks to me as though the answer has something to do with e.
