Re: Date for Last Common Ancestor?

Stephen Barnard (steve@megafauna.com)
Thu, 15 Aug 1996 15:18:17 -0800

CHESSONP wrote:
>
>
> An example. Let I_k be the interval of real numbers greater than -1/k and
> less than 1+1/k, or in more familiar notation, (-1/k, 1+1/k). This
> interval has length 1+2/k. If we consider the sequence of interval lengths
> {1+2/k} obtained from the associated sequence of intervals {I_k}, we see
> that the lengths are strictly decreasing. However, the sequence of lengths
> does not go to zero, but rather to a non-zero number, namely one.
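
For reference, the arithmetic in that example is not in dispute; a quick
numerical sketch (Python, purely illustrative, with names of my own
choosing) shows the lengths decreasing toward one rather than zero:

    # The interval (-1/k, 1+1/k) has length (1 + 1/k) - (-1/k) = 1 + 2/k.
    for k in (1, 2, 10, 100, 1000, 1000000):
        print(k, 1 + 2/k)
    # roughly 3.0, 2.0, 1.2, 1.02, 1.002, 1.000002
    # -- strictly decreasing, approaching 1, not 0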

I don't see how this applies to my argument. Your example operates in an
entirely different domain (infinite sets of real numbers in some bounded
interval), while my argument deals with finite sets. Furthermore, your
example doesn't make use of the critical inheritance property (everyone has
exactly one mother), which is the basis for my argument.
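
To make the finite setting concrete, here is a purely illustrative sketch
(the population size, random mating, and names are my own inventions, not
the argument itself): in a finite population where each individual has
exactly one mother, distinct maternal lineages can merge as you trace them
back, but they can never split, so the number of distinct ancestral
lineages never increases.

    import random

    random.seed(0)
    POP = 100                      # hypothetical finite population per generation
    lineages = set(range(POP))     # one maternal lineage per living individual

    generations_back = 0
    while len(lineages) > 1:
        # everyone has exactly one mother, chosen here at random from the
        # previous generation, so distinct lineages can only merge
        lineages = {random.randrange(POP) for _ in lineages}
        generations_back += 1

    print("all lineages merge after", generations_back, "generations")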

You can't disprove an argument by presenting an entirely different problem
and saying that it doesn't lead to the same answer.

Thanks for trying, though.

Steve Barnard