The article ``A Law Apple Would Like to Break'' by James B. Stewart (which appeared online on Feb. 24, 2011, under `Common Sense') is fundamentally flawed in its interpretation of Bernoulli's law. The author uses Bernoulli's law to ``predict'' Apple's growth rate based on past numbers.
Before pointing out the serious flaws in Mr. Stewart's argument, let me formally state Bernoulli's law (also known as the weak law of large numbers) without getting into the mathematical nuances and notation: ``For an infinitely large collection of random numbers, each generated independently and under identical conditions, the average converges to the true mean of the random variable.'' Armed with this (almost) precise statement, we can see that the author's arguments are deficient on three counts.
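For readers who prefer symbols, a standard formulation of the weak law (one common textbook phrasing, not the article's wording) is the following: for independent and identically distributed random variables $X_1, X_2, \ldots$ with finite mean $\mu$, the sample average $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ satisfies
$$\lim_{n \to \infty} \Pr\!\left( \left| \bar{X}_n - \mu \right| > \epsilon \right) = 0 \qquad \text{for every } \epsilon > 0.$$
Note that both hypotheses -- independence and identical distributions -- are essential, and both are at issue in what follows.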
1. Although sales, revenue, and other such economic indicators of a company's performance may be regarded as random variables, these numbers are neither independent nor generated under identical conditions. It is easy to convince oneself that a company's past and present performance numbers are correlated. Numbers in quarterly balance sheets are not generated by rolling a fair die over and over again!
2. It is not clear what the author means by the true mean of the random variable. Hence, it is illogical to conclude that Apple's growth should slow down by appealing to the ``revert-to-its-mean'' idea. We do not know what the true mean is (no matter how much analysts and experts speculate). I do agree that other physical limitations may lead to slower growth in the case of larger companies (of course Apple cannot sell an iPhone to every human and rock on this planet), but attributing it to the weak law is absurd.
3. Finally, Bernoulli's law applies to random numbers as the collection grows infinitely large. It says nothing about individual outcomes. Incidentally, colloquial misuse of Bernoulli's law (as in the present article) is so common that it has a name of its own -- the Gambler's fallacy. A streak of ten consecutive heads in ten tosses of a fair coin does not increase the probability of seeing tails in subsequent tosses! It is natural and quite intuitive to think that the coin toss experiment should eventually ``even out.'' Indeed, one can justifiably invoke the weak law of large numbers and say that if a fair coin is tossed infinitely many times, then the fraction of heads (and of tails) will become arbitrarily close to 1/2. However, twisting this argument and saying that a streak of one hundred consecutive heads raises the chances of seeing tails in the 101st toss is naive.
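Both halves of this point are easy to check empirically. The short simulation below (a minimal sketch in Python; the streak length and trial counts are arbitrary choices, not taken from the article) shows that the long-run fraction of heads approaches 1/2, while the toss immediately following a streak of heads is still heads about half the time.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Part 1: the weak law in action -- the running fraction of heads
# over many independent fair tosses lands close to 1/2.
n = 100_000
heads = sum(random.random() < 0.5 for _ in range(n))
print(heads / n)  # close to 0.5

# Part 2: the Gambler's fallacy -- condition on a streak of heads
# and look at the very next toss. Independence means the streak
# carries no information about that toss.
streak = 5
trials = 200_000
streaks = 0      # sequences that began with `streak` heads
next_heads = 0   # how often the following toss was also heads
for _ in range(trials):
    tosses = [random.random() < 0.5 for _ in range(streak + 1)]
    if all(tosses[:streak]):
        streaks += 1
        next_heads += tosses[streak]
print(next_heads / streaks)  # still about 0.5, not tilted toward tails
```

The second printed ratio hovering near 1/2 is precisely the refutation of the ``evening out'' intuition: the weak law constrains long-run averages, never the next individual toss.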
To conclude, I agree with the author's intuition that Apple's growth may slow down, but I have serious reservations about ascribing this to Bernoulli's law of large numbers. Apple is not running against this law in any manner whatsoever. No one can.
 See the news article at NYT.com.
 For those with a penchant for mathematics, a precise statement of the weak law can be found in any standard book on probability. For instance, see J. A. Gubner, Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press, 2006, p. 576.
 The term originates from an interesting story that can be found on fallacyfiles.org. Also see the Wikipedia page on this topic and on the rampant misuse of Bernoulli's law under the garb of the law of averages.