hg out of memory
natosha at gmail.com
Sun Jun 12 05:41:37 CDT 2011
On Sun, Jun 12, 2011 at 11:54 AM, Luca <yelsma at tin.it> wrote:
> On 11/06/2011 13.05, Na'Tosha Bard wrote:
> Hi Luca,
> On Thu, Jun 9, 2011 at 10:59 PM, Luca <yelsma at tin.it> wrote:
>> Dear Mercurial Users,
>> my company would like to switch from svn to hg, but when pulling or
>> cloning from svn, hg runs out of memory.
>> There is one svn revision of about 280 MB, and all the revisions
>> after it can be neither cloned nor pulled.
>> I successfully converted the svn repo to a hg one (with hgsubversion) only
>> when I used a 64 bit version of Windows.
>> After that, I tried to "hg clone" it from a 32-bit Windows machine, but it
>> always fails with 'out of memory'.
>> I found a page with some useful hints,
>> but none of the strategies works, since one svn revision is too
>> big for hg on 32-bit.
>> I also verified that on a 32-bit version of Windows I can use hg to create
>> a repo and commit a large file (280 MB), but then I cannot clone it!
>> We need to track source files but also some big binary files, so we need a
>> tool that can also manage large files, locking them if necessary.
>> We tried the Bigfiles extension, but it is a bit complicated, since we need
>> another channel to synchronize the big files (e.g. FTP), whereas with svn we
>> have a faster bootstrap: "install prerequisites, svn checkout, and go". This
>> is very useful, especially from our foreign sites.
>> Hg manages branches very well and can track a large number of files
>> quickly, but it seems a bit strange that we cannot do what old svn did.
>> Is there a plan to fix this limitation on 32-bit machines?
>> I would be grateful if someone could help me; advice is welcome.
>> Thanks in advance.
> Mercurial is known to run out of memory when dealing with very large files;
> if you add a file larger than about 10 MB, it will generally warn you that
> the file will take large amounts of memory to perform Mercurial operations
> on. This is because some operations involve loading the entire file into
> memory. Maybe it can be seen as a limitation of Mercurial,
> or maybe not -- but distributed version control systems in general do not
> handle large files well. Some more information about large files in
> Mercurial is here: http://mercurial.selenic.com/wiki/HandlingLargeFiles
> Mercurial is (as far as I know) unique in the DVCS world because there are
> some options for handling large binary files, although, as you mentioned
> from trying the BigFiles extension, they do make your setup a bit more
> complicated. One that I would recommend is the HugeFiles extension (still
> in the very early stages -- https://bitbucket.org/repo/all?name=HugeFiles),
> which is an extension of Kbfiles (
> http://kiln.stackexchange.com/questions/1873/how-do-you-use-kbfiles) with
> the intent of being easy to use outside of Kiln. If it is not urgent for
> you, note that those of us who are working on the HugeFiles extension are
> planning on releasing a much more developed and polished version of the
> extension in about a month or so.
> Also, I've never used it myself, but I have heard reports from a couple of
> people that the Snap extension works well.
> We use Kbfiles/HugeFiles in our setup, which is a 1.7 GB repository with
> about 1.2 GB of that being large binaries (one of them as large as 250 MB).
> As far as I know, there is no way to deal with large binaries in your
> repository without using one of these tools. If you decide to try out the
> HugeFiles extension, then please let me know. Of course implementation of
> these solutions would also require that you get your code out of subversion
> into a proper Mercurial repository.
> As for whether the core Mercurial team plans to somehow address the issue
> of large binary files internally, I don't know -- someone with more
> knowledge in that area would have to comment.
> *Na'Tosha Bard*
> Build & Infrastructure Developer | Unity Technologies
> *E-Mail:* natosha at unity3d.com
> *Skype:* natosha.bard
> Hi Na'Tosha,
> thank you very much for the fast answer.
> I have never used the HugeFiles extension -- I had never heard of it before
> reading your mail -- but I will certainly try it. If it works well, I think
> I will suggest it to my company, but I still have some doubts about all
> these extensions (Bfiles, Bigfiles, HugeFiles, etc.), since I don't
> understand whether they can work "remotely". My company has many sites
> (Italy, China, USA, ...), and today we share the code and the binary files
> through a centralized svn server over https; probably HugeFiles will also
> need external support for transferring binary files (FTP or something similar).
> Although hg has these limitations, I think it is well worthwhile to switch
> from svn to hg, but I also have to make the transition easy for my
> colleagues, otherwise they will reject hg.
You're right that HugeFiles, just like BigFiles and Bfiles, requires you
to set up a shared SSH/FTP/HTTP(S) server; this is fundamental to how they
work. Hopefully, if you have a system administrator, s/he can help with the
setup if you want to give it a serious try.
When I made this switch for our team, I ran the SVN repository through the
convert extension with a filemap that explicitly told it to ignore all of
the large binaries I knew were in there, then I added them back at the last
revision as HugeFiles/Kbfiles. This worked for us because we didn't care
much about the history of the binary files, only the source code.
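For reference, the filemap mechanism of the convert extension looks roughly like this; the paths below are hypothetical examples, not the actual files from either repository:

```
# filemap.txt -- each "exclude" drops that path from the converted history.
# These paths are illustrative; list your own large binaries here.
exclude assets/intro_video.mp4
exclude installers/setup.exe
```

The conversion is then run as `hg convert --filemap filemap.txt <svn-source> <hg-dest>`, with the convert extension enabled in your hgrc. The resulting repository carries no trace of the excluded files, so they can be re-added at the tip through the big-file extension of your choice.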
I suppose you could also just try to get hold of a 64-bit machine for
a while and see if it works.
If you can manage to clone the whole repo as it is, you can run the
hgconvert command to convert all files larger than a certain size to
HugeFiles. You might also want to look into the SnapExtension and see what
it has to offer.
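As a point of comparison, the largefiles extension that later grew out of the Kbfiles line exposes exactly this kind of size-based conversion as the lfconvert command; a sketch, assuming a local clone named "project" and a 10 MB threshold:

```
# Copy "project" to "project-lf", turning every file over 10 MB
# into a largefile tracked outside normal revlog history.
hg lfconvert --size 10 project project-lf
```

The flags of the hgconvert command in HugeFiles/Kbfiles itself may differ; check the extension's own help output.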
We use HugeFiles/Kbfiles from multiple locations around the world without
problems, so once your server infrastructure is set up it should not be an
issue.
> Anyway, I think you're right: I have to get my code out of subversion and
> into a proper Mercurial repository, and right now I can do that only with
> Windows 7 64-bit + hgsubversion, otherwise the conversion fails, but that
> is acceptable; my first problem is that I can't clone that hg repository
> from a 32-bit machine. I have to manually remove all the big files and put
> them into HugeFiles or something similar, and that is a long task. For
> "small" repositories (without large binary files), hgsubversion plays very
> well, and this makes the transition from svn to hg easier (in every company
> there is always someone who says "We have to keep the repository on a
> central server, so it will be safer" or "svn can handle branches well"). In
> fact, hgsubversion lets us keep a central svn server with an hg client
> without losing any history, mixing the two approaches for a while before
> doing a complete "switch off", so everyone can feel the difference between
> a centralized and an "almost" distributed revision control system
> (hgsubversion still uses a centralized svn server, so multiple heads are
> not allowed).
Normally you have a centralized HG repository as well, so I don't think that
is very different. By default Mercurial will not allow you to push multiple
heads, either (unless you are using named branches, of course).
> Last of all, I agree with you: the core Mercurial team should plan
> something to fix the issue, otherwise many repositories will remain on svn.
I don't recall ever saying that ;-) -- only that it would have to be a
Mercurial core developer who commented on whether it could be fixed
internally. I would like to see one official solution for large binaries in
Mercurial, regardless of whether it's built-in or an extension.
"Pain is temporary; quitting lasts forever."