
Does Git support large binary files of gigabyte/terabyte size?

+2 votes
268 views

I wanted to know whether Git supports large binary files of gigabyte/terabyte size. Also, what is the approximate cost of using Git with about 10 users (this would certainly grow to a much larger number later)?

posted Apr 14, 2015 by anonymous
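
For context, plain git keeps every version of every file in every clone, so multi-gigabyte binaries quickly bloat the repository. A common workaround is to offload large files to a tool layered on top of git; below is a minimal sketch using Git LFS (one such tool; the file pattern and name are illustrative, not a recommendation):

    # Sketch: store large binaries as LFS pointers instead of git objects.
    # LFS keeps the real file content outside the normal object database,
    # so clones only download the versions they check out.
    git lfs install                 # one-time setup per machine
    git lfs track "*.bin"           # pattern is illustrative; pick your own
    git add .gitattributes          # the tracking rules live in .gitattributes
    git add big-dataset.bin         # hypothetical file name
    git commit -m "store large binaries via LFS pointers"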


Similar Questions
+1 vote

I have some data files that need to be stored along with the source code. These data files are large, but I don't need to keep their versions; I only need to keep versions of the source code.

git-annex is mainly for large files with versioning, so it is not suitable for my situation.

Does anybody know whether there is a way to use git to manage source code (with versioning) as well as data files (without versioning)?
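
One simple approach, assuming the data files can be obtained or regenerated outside git, is to keep them out of version control entirely via .gitignore (a sketch; the data/ path is hypothetical):

    # Sketch: stop git from tracking the data directory at all.
    echo "data/" >> .gitignore       # ignore everything under data/
    git rm -r --cached data/         # untrack files that were already committed
    git add .gitignore
    git commit -m "stop versioning large data files"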

+1 vote

When cloning a large repo stalls and I hit Ctrl+C, git cleans up what has been downloaded, and the process needs a restart.

Is there a way to recover or continue from the already-downloaded files during cloning? Please point me to an archive URL if a solution exists (though I continue to search the archives as I write this).

Can there be something like: git clone --use-method=rsync
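
git clone itself cannot resume, but one common workaround is a shallow clone that is deepened in smaller, individually retryable steps (a sketch; the URL and depth numbers are illustrative):

    # Sketch: fetch history in increments so a failure only loses one step.
    git clone --depth 1 https://example.com/big-repo.git   # URL is hypothetical
    cd big-repo
    git fetch --depth 100      # deepen a little; rerun this step if it fails
    git fetch --depth 1000
    git fetch --unshallow      # finally fetch the rest of the history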

+1 vote

I've been using git for some time now, and host my remote bare repositories on my shared hosting account at Dreamhost.com. As a protective feature of their shared-hosting setup, they enforce a policy that kills processes which consume too much memory. This sometimes happens to git.

By "sometimes" I mean on large repos (>~500MB), when performing operations like git gc and git fsck and, most annoyingly, when doing a clone. It seems to happen in the pack phase, but I can't be sure exactly.

I've messed around with config options like pack.threads and pack.packSizeLimit, and basically anything on the git config manpage that mentions memory. I limit all of these things to 1 or 0 or 1m when applicable, just to be sure. To be honest, I really don't know what I'm doing ;)

Oddly enough, I'm having trouble reproducing my issue with anything but git fsck. Clones were failing in the past, but after a successful git gc, everything seems to be ok(?)

Anyway, I'd like some advice on what settings limit memory usage, and exactly how to determine what the memory usage will be with certain values.
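
For reference, these are the main config knobs that cap git's memory during packing (a sketch; the values are illustrative starting points, not tuned recommendations):

    # Illustrative values only; tune to the host's actual memory ceiling.
    git config pack.threads 1               # one pack thread = one delta window in memory
    git config pack.windowMemory 32m        # cap memory for the delta search window
    git config pack.deltaCacheSize 32m      # cap the cache of computed deltas
    git config core.packedGitLimit 64m      # cap total pack data mapped at once
    git config core.packedGitWindowSize 16m # size of each mapped pack window

Very roughly, peak repack memory is on the order of pack.windowMemory times pack.threads, plus the delta cache and the mapped pack windows.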

+1 vote

I want to block a push if any file has been deleted in the local git clone. Can anyone please help me with that?

I am using Stash for repository management.
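
A client-side way to do this is a pre-push hook that scans the commits being pushed for deletions (a sketch; note that Stash itself would need a server-side pre-receive hook or plugin to enforce this for every user, since this only guards one local clone):

    #!/bin/sh
    # Sketch of .git/hooks/pre-push: reject the push if any pushed commit
    # deletes a file. git feeds one line per ref on stdin:
    #   <local ref> <local sha> <remote ref> <remote sha>
    zero=0000000000000000000000000000000000000000
    while read local_ref local_sha remote_ref remote_sha; do
        [ "$local_sha" = "$zero" ] && continue   # deleting a remote branch: skip
        if [ "$remote_sha" = "$zero" ]; then
            range="$local_sha"                   # new branch: inspect all its commits
        else
            range="$remote_sha..$local_sha"
        fi
        deleted=$(git log --diff-filter=D --name-only --pretty=format: "$range")
        if [ -n "$deleted" ]; then
            echo "push rejected: commits in $range delete files:" >&2
            echo "$deleted" | sort -u >&2
            exit 1
        fi
    done
    exit 0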

+1 vote

Is it possible to configure git to see a group of files as a singular object? For instance, I have a directory with several files:

myjunk/text.txt 
myjunk/somefile.exe 
myjunk/anotherfile.odt

Whenever a modification is made, I would like to see just "myjunk" as modified, not the files beneath it. For all intents and purposes, git would treat myjunk/* as a single binary file.
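
One known way to get close to this, sketched below under the assumption that myjunk is not already tracked by the outer repo, is to make myjunk its own repository and reference it as a submodule; the outer repository then records the whole directory as a single gitlink entry:

    # Sketch: track myjunk as one gitlink entry via a submodule.
    # The inner repo still versions the individual files, but the outer
    # repo only ever sees "myjunk" change as a whole.
    cd myjunk
    git init
    git add -A
    git commit -m "snapshot of myjunk"
    cd ..
    git submodule add ./myjunk myjunk   # newer git may need protocol.file.allow=always
    git commit -m "reference myjunk as a single object"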

...