Stalled git cloning with large repository

+1 vote
403 views

When the cloning of a large repo stalls, hitting Ctrl+C removes whatever has already been downloaded, and the process has to be restarted from scratch.

Is there a way to recover or continue from the already downloaded files during cloning? Please point me to an archive URL if a solution exists (though I continue to search through them as I write this).

Could there be something like: git clone --use-method=rsync?

posted Aug 29, 2013 by Sanketi Garg


1 Answer

+1 vote

No, sadly. The pack sent for a clone is generated dynamically, so there's no easy way to support the equivalent of an HTTP Range request to resume. Someone might implement an appropriate protocol extension to tackle this (e.g., peff's seed-with-clone.bundle hack) some day, but for now it doesn't exist.

What you *can* do today is create a bundle from the large repo somewhere with a reliable connection and then grab that over a resumable transport such as HTTP. A kind person has made a service to do exactly that:

http://thread.gmane.org/gmane.comp.version-control.git/181380
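
As a rough sketch of that workaround (not from the original answer; the host name, paths, and file names below are placeholders), assuming you have shell access to a machine with a reliable connection to the source repository:

    # On the machine with the good connection:
    git clone --mirror git://example.com/big-project.git
    cd big-project.git
    git bundle create /var/www/big-project.bundle --all   # one static file holding all refs

    # On your end, fetch the bundle over a transport that can resume (wget -c):
    wget -c http://example.com/big-project.bundle
    git clone big-project.bundle big-project
    cd big-project
    git remote set-url origin git://example.com/big-project.git
    git fetch origin   # catch up on anything pushed since the bundle was made

Because the bundle is a single static file, an interrupted download can simply be resumed instead of the whole clone starting over.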

answered Aug 29, 2013 by Mandeep Sehgal
Similar Questions
+2 votes

I am trying to clone a Perforce branch with git to my local drive, but it is skipping too many files and changelists while fetching them from Perforce.

It would be very helpful if anyone could suggest how to get rid of this issue.
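
Not part of the original question, but as an illustrative git-p4 sketch (the depot path and target directory are placeholders): importing with @all pulls every changelist rather than only the latest one, and --use-client-spec makes git-p4 honour the Perforce client view, which is one possible reason for files being skipped.

    # Import the full history of the depot path, not just the head changelist:
    git p4 clone --destination=project-git //depot/project@all

    # If paths are still missing, retry honouring the Perforce client spec:
    git p4 clone --use-client-spec --destination=project-git //depot/project@all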

+1 vote

I would like to use Git with an SVN repository, so I try to clone the SVN repo with "git svn clone svn://myserver"; it is a repo without a trunk, etc. Git reports the error "Couldn't find a repository". The SVN repo uses authentication (username & password), and a normal svn checkout works fine.
How can I use the SVN repo with Git?
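
Purely as an illustrative sketch (the server path and username below are placeholders, not taken from the question): when the repository has no trunk/branches/tags layout, git svn is pointed at the path directly and --stdlayout is left out; --username covers the authentication, and git svn prompts for the password.

    # Non-standard layout: clone the path as-is, no --stdlayout:
    git svn clone --username=myuser svn://myserver/path/to/repo myrepo

    # For comparison, a repo with the standard trunk/branches/tags layout:
    git svn clone --stdlayout --username=myuser svn://myserver/path/to/repo myrepo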

+1 vote

I've been trying to put the filesystem for a very small busybox-based distro into a git repository, and with success. The only strange thing I cannot get my head around is the following:

When making a compressed tarball from the files in the repository (after clone/checkout), I get a much larger tar.gz file. The size goes up from 16M to 21M (!?)
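
As an aside (not from the question), git archive is one way to build a tarball from exactly what the repository tracks, so the .git directory itself never ends up inside the archive; the output file names here are arbitrary:

    # Tarball of the committed tree only; .git is never included:
    git archive --format=tar.gz -o rootfs.tar.gz HEAD

    # For comparison, a plain tar of the working directory minus .git:
    tar czf rootfs-workdir.tar.gz --exclude=.git .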

+2 votes

I wanted to know whether Git supports large binary files of gigabyte/terabyte size. What would be the cost of using Git with approximately 10 users (this would certainly increase to a larger number later)?

+1 vote

I have some data files that need to be stored along with source code. These data files are large, but I don't need to keep their versions. I only need to keep the versions of the source code.

git-annex is mainly for large files with versioning, so it is not suitable for my situation.

Does anybody know whether there is a way to use git to manage source code (with versioning) as well as data files (without versioning)?
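
Not an answer from the thread, but one simple arrangement along those lines: keep the unversioned data in a directory that git ignores, so only the source tree is tracked (the directory names data/ and src/ here are just examples):

    # Tell git to ignore the data directory entirely:
    echo "data/" >> .gitignore
    git add .gitignore src/
    git commit -m "Track source only; data/ stays unversioned"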

...