[PATCH STABLE] setdiscovery: fix hang when #heads>200 (issue2971)

Matt Mackall mpm at selenic.com
Thu Aug 25 14:53:58 CDT 2011


On Thu, 2011-08-25 at 21:37 +0200, Peter Arrenbrecht wrote:
> # HG changeset patch
> # User Peter Arrenbrecht <peter.arrenbrecht at gmail.com>
> # Date 1314300314 -7200
> # Branch stable
> # Node ID 95abeecfd5d08d3f45119ead5c9df8e1650dbbee
> # Parent  4a43e23b8c55b4566b8200bf69fe2158485a2634
> setdiscovery: fix hang when #heads>200 (issue2971)

Queued, thanks.

> When setting up the next sample, we always add all of the heads, regardless
> of the desired max sample size. But if the number of heads exceeds this
> size, we don't add any more nodes from the still undecided set.
> (This is debatable in itself, and I'll investigate it, but it's how the
> algorithm is designed at the moment.)
> 
> The bug was that we always added the overall heads, not the heads of the
> remaining undecided set. Thus, if #heads>200 (the desired sample size), we
> no longer made progress.
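
To make the quoted description concrete, here is a rough Python sketch of
the sampling step. This is not the actual setdiscovery code; the names
(take_sample, undecided, undecided_heads, all_heads, sample_size) are
invented for illustration only.

    import random

    def take_sample(undecided, undecided_heads, all_heads, sample_size=200,
                    buggy=False):
        # Seed the sample with heads.  The buggy variant always uses the
        # overall heads; once len(all_heads) >= sample_size the sample no
        # longer reflects the shrinking undecided set, so discovery stops
        # converging.  The fix seeds with the undecided set's own heads.
        seed = all_heads if buggy else undecided_heads
        sample = set(seed)
        if len(sample) < sample_size:
            # Top up with random nodes from the still undecided set.
            rest = list(undecided - sample)
            random.shuffle(rest)
            sample.update(rest[:sample_size - len(sample)])
        return sample

With more than sample_size heads, the buggy variant keeps seeding the sample
with the same overall heads no matter how much the undecided set shrinks,
which matches the hang reported in issue2971.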

But it seems like this is still insufficiently probabilistic?

What will happen in legitimate cases where there are >200 heads to
discover?

-- 
Mathematics is the supreme nostalgia of our time.



