
A Game Theoretic Analysis of Dropped Cell Phone Calls

A problem that I suspect all cell phone users have goes something like this:

  • Alice calls Bob.
  • Alice and Bob get to talking and are having a productive and interesting conversation.
  • One of their phones drops the call.
  • Both Alice and Bob attempt to call the other, but each is directed to the other’s voice mail.
  • Both Alice and Bob try again with similar results.
  • Eventually, and after some frustration, they reconnect.
  • The conversation resumes, but not before the obligatory “my cell phone always drops calls in this part of town” conversation has occurred.

It turns out that we can use game theory to describe this situation.

The normal form game below describes what happens after the call is dropped, and allows both Alice and Bob to take one of two actions – call back or wait.

              call back    wait
call back     (0,0)        (1,1)
wait          (1,1)        (0,0)

(Rows are Alice’s actions and columns are Bob’s; each cell gives (Alice’s payoff, Bob’s payoff).)

This game is both symmetric and non-zero sum, and is commonly called a coordination game.
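As a quick check, a short script can enumerate the pure-strategy Nash equilibria of this 2×2 game. This is only an illustrative sketch; the helper `pure_nash_equilibria` is a hypothetical name, and the payoff entries mirror the matrix above:

```python
# Payoffs for the dropped-call game: keys are (Alice's action, Bob's action),
# values are (Alice's payoff, Bob's payoff).
actions = ["call back", "wait"]
payoffs = {
    ("call back", "call back"): (0, 0),
    ("call back", "wait"):      (1, 1),
    ("wait", "call back"):      (1, 1),
    ("wait", "wait"):           (0, 0),
}

def pure_nash_equilibria(payoffs, actions):
    """Return action profiles where neither player gains by deviating alone."""
    equilibria = []
    for a in actions:
        for b in actions:
            u_a, u_b = payoffs[(a, b)]
            # Alice's action is a best response to b, and vice versa.
            a_best = all(payoffs[(a2, b)][0] <= u_a for a2 in actions)
            b_best = all(payoffs[(a, b2)][1] <= u_b for b2 in actions)
            if a_best and b_best:
                equilibria.append((a, b))
    return equilibria

print(pure_nash_equilibria(payoffs, actions))
# → [('call back', 'wait'), ('wait', 'call back')]
```

The two pure equilibria are exactly the coordinated outcomes: one person calls back while the other waits.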

A common coordination game, which is isomorphic to this situation, is the case where two people are driving toward each other on a road and must decide on which side to drive. Either side is equally preferred by both as long as both choose the same – either left or right. Moreover, both sides are interested in preventing a head-on collision, so both are interested in making the correct choice.

In these types of situations it is common to settle upon some conventions, which are sometimes codified into law and sometimes not. The conventions allow us to coordinate our behavior and help us to avert unwanted outcomes.

As far as I know, no convention has been adopted for who should call back and who should wait when a cell phone call is dropped. Many different conventions would suffice. For instance, the person whose first name comes first alphabetically could call back. Or we could use last names. The ways to settle this are innumerable, but I would like to suggest one that I think makes sense.

The person who initiated the call to begin with is responsible for calling back. So, given our example where Alice originally called Bob and the call dropped, Alice will call back and Bob will wait. That’s it.
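To see what the convention buys us, here is a minimal simulation, under the assumption that without a convention each person independently decides to call back with probability 1/2 each round, and that they reconnect only when exactly one of them calls (both calling sends both to voice mail, both waiting gets nowhere). The helper name `rounds_until_reconnect` is hypothetical:

```python
import random

def rounds_until_reconnect(rng, p_call=0.5):
    """Rounds needed to reconnect when both parties randomize independently.

    Each round, Alice and Bob each call back with probability p_call;
    they reconnect only when exactly one of them calls.
    """
    rounds = 1
    while True:
        alice = rng.random() < p_call
        bob = rng.random() < p_call
        if alice != bob:       # exactly one called: reconnected
            return rounds
        rounds += 1

rng = random.Random(42)
trials = [rounds_until_reconnect(rng) for _ in range(100_000)]
avg = sum(trials) / len(trials)
print(avg)  # near 2: geometric with per-round success probability 1/2
```

Without a convention the expected number of attempts is 2; with the caller-calls-back convention it is always 1.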

References: Leyton-Brown, Kevin and Yoav Shoham. Essentials of Game Theory: A Concise, Multidisciplinary Introduction. Morgan and Claypool, 2008.



The Exponential Distribution

Please see a slightly expanded version here.

This is a derivation of the cumulative distribution function, characteristic function, moment generating function, first and second moments, expected value, and variance of the exponential distribution, given its probability density function. It also shows that the distribution is memoryless and derives the probability that one exponential random variable is smaller than another.

The probability density function of the exponential distribution is:

f_X(x) = \lambda e^{-\lambda x}, \quad x \geq 0, \; \lambda > 0

Thus the cumulative distribution function is:

F_X(a) = P(X \leq a) = \displaystyle\int_{0}^{a} f_X(x) dx = \int_0^a \lambda e^{-\lambda x} dx = -e^{-\lambda x} \Big |_0^a = 1 - e^{-\lambda a}
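A quick Monte Carlo sanity check of the CDF, comparing the empirical proportion of samples below a threshold against 1 - e^{-λa} (the parameter values here are arbitrary choices for illustration):

```python
import math
import random

def exponential_cdf(a, lam):
    """Closed-form CDF of the exponential distribution: 1 - exp(-lam * a)."""
    return 1.0 - math.exp(-lam * a)

rng = random.Random(0)
lam = 2.0
samples = [rng.expovariate(lam) for _ in range(200_000)]

a = 0.5
empirical = sum(x <= a for x in samples) / len(samples)
print(empirical, exponential_cdf(a, lam))  # both near 1 - e^{-1} ≈ 0.632
```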

And the characteristic function:

\phi_X(u) = E[e^{i u X}] = \displaystyle\int_{0}^{\infty} e^{i u x} f_X(x) dx = \int_0^{\infty} e^{i u x} \lambda e^{- \lambda x} dx = \lambda \int_0^{\infty} e^{-(\lambda - iu) x} dx = - \frac{\lambda}{\lambda-iu} e^{-(\lambda-iu)x} \Big |_0^{\infty} = \frac{\lambda}{\lambda - iu}

Now the moment generating function, which is obtained from the characteristic function:

M_X(t) = \phi_X(-it) = \displaystyle\frac{\lambda}{\lambda - i(-it)} = \frac{\lambda}{\lambda - t}, \quad t < \lambda
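The moment generating function can also be checked numerically: E[e^{tX}] estimated by simulation should match λ/(λ - t) for t < λ. The parameter values below are arbitrary illustrative choices:

```python
import math
import random

lam, t = 2.0, 0.5          # requires t < lam for the MGF to exist
rng = random.Random(1)
n = 200_000

# Monte Carlo estimate of E[exp(t * X)] for X ~ Exponential(lam).
mc = sum(math.exp(t * rng.expovariate(lam)) for _ in range(n)) / n
closed_form = lam / (lam - t)
print(mc, closed_form)      # both near 4/3
```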

Now the first moment:

M_X'(t) = \displaystyle\frac{d}{dt} \lambda(\lambda-t)^{-1} = -\lambda (\lambda -t)^{-2} \frac{d}{dt} (\lambda - t) = \frac{\lambda}{(\lambda - t)^2}


E[X] = M_X'(0) = \displaystyle\frac{1}{\lambda}

And the second moment:

M_X''(t) = \displaystyle\frac{d}{dt} M_X'(t) = \frac{d}{dt} \lambda (\lambda - t)^{-2} = -2 \lambda ( \lambda - t)^{-3} \frac{d}{dt} (\lambda - t) = \frac{2 \lambda}{(\lambda - t)^3}


E[X^2] = M_X''(0) = \displaystyle\frac{2}{\lambda^2}

And finally:

\displaystyle Var(X) = E[X^2] - E[X]^2 = \frac{2}{\lambda^2} - \Big(\frac{1}{\lambda}\Big)^2 = \frac{1}{\lambda^2}
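The moment formulas are easy to verify by simulation. With λ = 2 (an arbitrary illustrative choice) we expect E[X] = 1/2, E[X^2] = 1/2, and Var(X) = 1/4:

```python
import random

lam = 2.0
rng = random.Random(2)
xs = [rng.expovariate(lam) for _ in range(200_000)]
n = len(xs)

mean = sum(xs) / n                          # estimates E[X] = 1/lam
second_moment = sum(x * x for x in xs) / n  # estimates E[X^2] = 2/lam^2
variance = second_moment - mean ** 2        # estimates Var(X) = 1/lam^2
print(mean, second_moment, variance)        # near 0.5, 0.5, 0.25
```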

Of course the expected value and the variance can be computed by evaluating:

\displaystyle \int_0^\infty x \lambda e^{-\lambda x} dx


\displaystyle \int_0^\infty x^2 \lambda e^{-\lambda x} dx


Next we’ll see how the exponential distribution is memoryless, that is:

\displaystyle P(X > t+s|X>t) = P(X > s)

First apply Bayes’ Rule, noting that X > t+s implies X > t, so P(X>t|X>t+s) = 1:

\displaystyle P(X>t +s|X>t) = \displaystyle \frac{P(X>t|X>t+s)P(X>t+s)}{P(X>t)}
= \displaystyle \frac{P(X>t+s)}{P(X>t)}
= \displaystyle \frac{1-(1-e^{-\lambda (t+s)})}{1-(1-e^{-\lambda t})}
= \displaystyle \frac{e^{-\lambda (t+s)}}{e^{-\lambda t}}
= \displaystyle e^{-\lambda s}
= P(X>s)
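Memorylessness shows up clearly in simulation: among samples that exceed t, the fraction exceeding t+s matches the unconditional fraction exceeding s. The parameter values are arbitrary illustrative choices:

```python
import random

lam, t, s = 1.0, 1.0, 0.5
rng = random.Random(3)
xs = [rng.expovariate(lam) for _ in range(500_000)]

# Conditional survival: P(X > t+s | X > t), estimated on samples with X > t.
survived_t = [x for x in xs if x > t]
cond = sum(x > t + s for x in survived_t) / len(survived_t)

# Unconditional survival: P(X > s).
uncond = sum(x > s for x in xs) / len(xs)
print(cond, uncond)  # both near e^{-0.5} ≈ 0.607
```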

Next, let X_1 and X_2 be independent exponential random variables with parameters \lambda_1 and \lambda_2 respectively. Then:

\displaystyle P(X_1 < X_2) = \displaystyle \int_0^\infty P(X_1<X_2|X_1=x) f_{X_1}(x) dx
= \displaystyle \int_0^\infty P(X_1<X_2|X_1=x)\lambda_1 e^{-\lambda_1 x} dx
= \displaystyle \int_0^\infty P(x<X_2)\lambda_1 e^{-\lambda_1 x} dx
= \displaystyle \int_0^\infty P(X_2 \geq x)\lambda_1 e^{-\lambda_1 x} dx
= \displaystyle \int_0^\infty e^{-\lambda_2 x} \lambda_1 e^{-\lambda_1 x} dx
= \displaystyle \lambda_1 \int_0^\infty e^{-(\lambda_1 + \lambda_2)x} dx
= \displaystyle \lambda_1 \Big( -\frac{1}{\lambda_1+\lambda_2} e^{-(\lambda_1 + \lambda_2)x} \Big) \Big|_0^\infty
= \displaystyle \frac{\lambda_1}{\lambda_1+\lambda_2}
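The result λ_1/(λ_1 + λ_2) is straightforward to confirm by simulating pairs of independent exponentials. With the arbitrary illustrative choice λ_1 = 1, λ_2 = 3, we expect P(X_1 < X_2) = 1/4:

```python
import random

lam1, lam2 = 1.0, 3.0
rng = random.Random(4)
n = 200_000

# Fraction of trials in which the rate-lam1 exponential arrives first.
wins = sum(rng.expovariate(lam1) < rng.expovariate(lam2) for _ in range(n))
empirical = wins / n
print(empirical, lam1 / (lam1 + lam2))  # both near 0.25
```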


References: Ross, Sheldon M. Introduction to Probability Models, 9th edition. Academic Press, 2007. Brémaud, Pierre. An Introduction to Probabilistic Modeling, 3rd printing. Springer, 1997.
