hg out of memory
natosha at unity3d.com
Sat Jun 11 06:05:25 CDT 2011
On Thu, Jun 9, 2011 at 10:59 PM, Luca <yelsma at tin.it> wrote:
> Dear Mercurial Users,
> my company would like to switch from svn to hg, but when pulling or cloning
> from svn, hg runs out of memory.
> There is an svn revision of about 280 MB, and none of the revisions after it
> can be cloned or pulled.
> I successfully converted the svn repo to a hg one (with hgsubversion) only
> when I used a 64 bit version of Windows.
> After that, I tried to "hg clone" it from a 32 bit Windows machine, but it
> always fails with 'out of memory'.
> I see there is a link with some useful hints:
> but none of the strategies works, since there is an svn revision too
> big for "hg on 32".
> I also checked that on a 32 bit version of Windows I can use hg to create
> a repo and commit a large file (280 MB), but then I cannot clone it!
> We need to track source files but also some big binary files, so we need a
> tool that can also manage large files, locking them if necessary.
> We tried the bigfiles extension, but it is a bit complicated, since we need
> another way to synchronize big files (e.g. ftp), while with svn we have a
> faster bootstrap: "install prerequisites, svn checkout and go". This
> is very useful, especially when working from a remote site.
> HG manages branches very well and can track a large number of files quickly,
> but it seems a bit strange that we cannot do what the old svn did.
> Is there a plan to fix this limitation on 32 bit machines?
> I would be grateful if someone could help me; advice is welcome.
> Thanks in advance.
Mercurial is known to run out of memory when dealing with very large files;
if you add a file larger than about 10 MB it will generally warn you that
the file will take large amounts of memory to perform Mercurial operations
on. This is because some operations involve loading the entire file into
memory. Whether this counts as a limitation of Mercurial is debatable --
but distributed version control systems in general do not
handle large files well. Some more information about large files in
Mercurial is here: http://mercurial.selenic.com/wiki/HandlingLargeFiles
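To see why a ~280 MB revision can exhaust a 32-bit process, here is a rough back-of-the-envelope sketch. The multiplier is an assumption of mine, not a measured figure -- the point is only that operations which load whole files can need several copies of the data at once, against a limited address space:

```python
# Rough estimate of why a ~280 MB file can exhaust a 32-bit process.
# COPIES_IN_FLIGHT is an assumption: committing or cloning may hold the
# raw text, a compressed copy, and patch buffers in memory at once.

FILE_MB = 280
COPIES_IN_FLIGHT = 3          # assumed worst case, not a measured figure
ADDRESS_SPACE_MB = 2 * 1024   # typical usable user address space on 32-bit Windows

peak_mb = FILE_MB * COPIES_IN_FLIGHT
print(f"estimated peak: {peak_mb} MB of ~{ADDRESS_SPACE_MB} MB address space")
# With heap fragmentation and the interpreter's own overhead, a single
# large contiguous allocation at this scale can easily fail.
```

Even though 840 MB is nominally under the 2 GB limit, a 32-bit heap rarely has that much contiguous space free, which matches the 'out of memory' failures you are seeing.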
Mercurial is (as far as I know) unique in the DVCS world because there are
some options for handling large binary files, although, as you found when
trying the BigFiles extension, they do make your setup a bit more
complicated. One that I would recommend is the HugeFiles extension (still
in the very early stages -- https://bitbucket.org/repo/all?name=HugeFiles),
which is an extension of KBfiles with
the intent of being easy to use outside of Kiln. If it is not urgent for
you, note that those of us who are working on the HugeFiles extension are
planning on releasing a much more developed and polished version of the
extension in about a month or so.
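For reference, extensions of this kind are usually enabled through an [extensions] section in your user or repository hgrc. The extension name and path below are placeholders, not the actual install location:

```ini
[extensions]
; placeholder path -- point this at wherever the extension is installed
kbfiles = /path/to/kbfiles.py
```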
Also, I've never used it myself, but I have heard from a couple of people
that the Snap extension works well.
We use KBfiles/HugeFiles in our setup, which is a 1.7 GB repository with
about 1.2 GB of that being large binaries (one of them as large as 250 MB).
As far as I know, there is no way to deal with large binaries in your
repository without using one of these tools. If you decide to try out the
HugeFiles extension, then please let me know. Of course, implementing any of
these solutions would also require that you get your code out of Subversion
into a proper Mercurial repository.
As for whether the core Mercurial team plans to address the issue of
large binary files internally, I don't know -- someone with more knowledge in
that area would have to comment.
Build & Infrastructure Developer | Unity Technologies
*E-Mail:* natosha at unity3d.com