Git Performance Optimization: Speed Up Your Workflow
Why Git Performance Matters
As repositories grow, Git commands can slow down dramatically. A 10-second `git status` or 5-minute `git clone` kills developer productivity. This guide covers practical techniques to keep Git fast, whether you're working with monorepos, legacy codebases, or CI/CD pipelines.
Key principle: Git performance issues usually stem from three sources: repository size, object count, and network latency. Each requires different optimization strategies.
Measuring Git Performance
Before optimizing, measure:
$ time git status
$ time git log --oneline -n 100
# Analyze repository size
$ git count-objects -vH
$ du -sh .git/objects
# Profile Git operations
$ GIT_TRACE_PERFORMANCE=1 git status
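Spot checks with `time` work, but repeatable numbers need averaging over several runs. A small helper can do this without extra tooling; `avg_time` is a hypothetical name, and the `date +%s%N` nanosecond format assumes GNU coreutils (Linux):

```shell
# avg_time: run a command N times and print the mean wall time in ms.
# Hypothetical helper; date +%s%N assumes GNU coreutils.
avg_time() {
  n=$1; shift
  total=0
  i=0
  while [ "$i" -lt "$n" ]; do
    start=$(date +%s%N)
    "$@" > /dev/null 2>&1          # discard output; we only want timing
    end=$(date +%s%N)
    total=$(( total + (end - start) / 1000000 ))
    i=$(( i + 1 ))
  done
  echo $(( total / n ))            # mean in milliseconds
}

avg_time 5 git status
```

Averaging smooths over filesystem-cache effects, which dominate the first run of most Git commands.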
Repository Size Optimization
1. Git Garbage Collection (git gc)
Git automatically runs `git gc --auto`, but manual tuning helps:
$ git gc --aggressive --prune=now
# Configure auto GC limits
$ git config gc.auto 5000
$ git config gc.autoPackLimit 50
2. Repacking Objects
Packfiles compress objects. Optimize packing:
$ git repack -a -d --depth=250 --window=250
# Use bitmap indexes for faster clones
$ git repack -a -d --write-bitmap-index
3. Removing Unreachable Objects
$ git fsck --unreachable
$ git prune --expire=now
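The gc, repack, and prune steps above can be combined into one routine for periodic runs. A sketch, assuming you run it from inside the repository; `repo_maintenance` is a hypothetical name and the two-week prune window is an arbitrary safety margin:

```shell
# Sketch: periodic maintenance combining the steps above.
# Run from inside the repository (e.g. from cron).
repo_maintenance() {
  git fsck --unreachable                   # report unreachable objects
  git repack -a -d --write-bitmap-index    # consolidate into one bitmapped pack
  git prune --expire=2.weeks.ago           # keep a safety window before deleting
  git gc --auto                            # let Git decide if more GC is needed
  echo "maintenance complete"
}
```

Prefer an expiry window over `--expire=now` in automation: recently orphaned objects may still be referenced by in-flight operations.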
Shallow Clones and Partial Clones
Shallow Clones (--depth)
Clone only recent history, ideal for CI/CD:
$ git clone --depth 5 --branch main git@github.com:user/repo.git
Partial Clones (--filter)
Clone without blobs, fetch only when needed (Git 2.19+):
$ git clone --filter=blob:none https://github.com/user/repo.git
# Tree-less clone: omit trees as well, fetched on demand
$ git clone --filter=tree:0 https://github.com/user/repo.git
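Filters also compose with --depth and --single-branch; for CI jobs that only need the latest commit, a combination like the one below (placeholder URL) minimizes transfer:

```shell
# Blobless + shallow + single-branch clone (placeholder URL)
$ git clone --filter=blob:none --depth 1 \
    --single-branch --branch main \
    https://github.com/user/repo.git
```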
Blobless Clones
After a blobless clone (--filter=blob:none), Git fetches the missing blobs on demand during checkout:
$ cd repo
$ git checkout main # Fetches blobs for this commit
Large File Handling
Git LFS (Large File Storage)
For binaries, media files, and large assets:
$ git lfs track "*.psd" "*.zip" "*.mp4"
$ git add .gitattributes
$ git add file.psd
$ git commit -m "Add design files with LFS"
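`git lfs track` simply appends patterns to .gitattributes; after the commands above it contains entries like:

```
*.psd filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
```

Committing .gitattributes first ensures every collaborator's Git routes those file types through the LFS filter.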
BFG Repo-Cleaner
Remove large files from history permanently:
$ bfg --strip-blobs-bigger-than 100M repo.git
# Delete specific folders/files
$ bfg --delete-folders node_modules repo.git
# Finish the cleanup inside the repository
$ cd repo.git
$ git reflog expire --expire=now --all && git gc --aggressive --prune=now
Index and Working Tree Optimization
1. Sparse Checkout
Work with only a subset of files in monorepos:
$ cd monorepo
$ git sparse-checkout set services/api packages/shared
$ git sparse-checkout add docs
2. fsmonitor (File System Monitor)
Speed up `git status` on large repos (built-in daemon requires Git 2.37+):
$ git config core.fsmonitor true
# Or use Watchman integration
$ git config core.fsmonitor .git/hooks/fsmonitor-watchman
3. Untracked Cache
Cache directory mtimes so `git status` can skip rescanning unchanged directories for untracked files:
$ git config core.untrackedCache true
$ git update-index --untracked-cache
Network Optimization
Compression and Protocol
# Use the SSH transport for fetches and pushes
$ git remote set-url origin git@github.com:user/repo.git
# Adjust compression (0-9; higher saves bandwidth at the cost of CPU)
$ git config core.compression 9
$ git config pack.compression 9
Clone Optimization
| Technique | Command | Speed Gain |
|---|---|---|
| Shallow clone | --depth 1 | 80-95% faster |
| Blobless clone | --filter=blob:none | 60-80% faster |
| Tree-less clone | --filter=tree:0 | 40-60% faster |
| Single branch | --single-branch | 30-50% faster |
Configuration Tuning
$ git config --global core.preloadindex true      # preload index operations in parallel
$ git config --global core.fscache true           # cache filesystem calls (Windows only)
$ git config --global gc.auto 5000                # loose-object threshold before auto GC
$ git config --global pack.threads 8              # match your CPU core count
$ git config --global pack.windowMemory "512m"    # per-thread memory for delta search
$ git config --global pack.deltaCacheSize "256m"  # cache for computed deltas
Monorepo Performance Strategies
1. Partial Clones + Sparse Checkout
Combine techniques for massive repos:
$ cd monorepo
$ git sparse-checkout set --cone packages/my-service
$ git sparse-checkout add scripts
2. Scalar (Microsoft's Git extension)
Optimized for large repos; bundled with Git since 2.38:
$ brew install microsoft/git/scalar # macOS
# Clone with Scalar
$ scalar clone https://github.com/user/monorepo.git
$ scalar register # Register for maintenance
CI/CD Pipeline Optimizations
1. Cache Dependencies
- name: Cache Git objects
  uses: actions/cache@v3
  with:
    path: .git/objects
    key: ${{ runner.os }}-git-${{ hashFiles('.git/refs/heads/main') }}
2. Shallow Fetch in CI
$ git fetch --depth 1 origin $CI_COMMIT_SHA
$ git checkout -qf $CI_COMMIT_SHA
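Since `$CI_COMMIT_SHA` above is a GitLab CI variable, note that GitLab can also apply shallow fetching declaratively, without changing job scripts:

```yaml
# .gitlab-ci.yml: limit every job's fetch to the last 10 commits
variables:
  GIT_DEPTH: "10"
```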
3. Commit History Truncation
$ git clone --mirror --depth 1 source-repo.git build-repo.git
Benchmarking Your Optimizations
# Before applying changes
$ hyperfine 'git status'
# After applying changes
$ hyperfine 'git status'
# Compare different commands
$ hyperfine --warmup 3 'git log -1' 'git show HEAD'
Performance Comparison Table
| Optimization | git status | git log | git clone |
|---|---|---|---|
| Baseline (100MB repo) | 1.2s | 0.8s | 45s |
| + git gc --aggressive | 0.9s (-25%) | 0.5s (-37%) | 38s (-15%) |
| + fsmonitor | 0.3s (-75%) | 0.5s (-37%) | 38s (-15%) |
| + shallow clone (depth 1) | - | - | 2s (-95%) |
| + blobless clone | - | - | 9s (-80%) |
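The percentage deltas in the table above are plain before/after ratios; a shell one-liner reproduces them from your own timings (integer milliseconds):

```shell
# Percent improvement given "before" and "after" timings in integer ms.
improvement() {
  echo $(( ($1 - $2) * 100 / $1 ))
}

improvement 1200 300    # 1.2s -> 0.3s, prints 75 (the fsmonitor row)
```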
Frequently Asked Questions
How often should I run git gc?
Git runs auto GC automatically. For active repos, manual GC every few months helps. Use git gc --auto in cron jobs for servers. Avoid running aggressive GC too frequently: it rewrites all objects and can be heavy.

What is the difference between a shallow clone and a partial clone?
Shallow clone limits history depth (number of commits). Partial clone limits object types (e.g., no blobs). They can be combined: git clone --filter=blob:none --depth 1 gives both benefits.

How do I remove large files from Git history?
Use BFG Repo-Cleaner or git filter-repo. After cleaning, force push and notify all collaborators to re-clone. Example: bfg --strip-blobs-bigger-than 100M repo.git

Is Git LFS worth the overhead?
Yes for large binaries: they're stored separately and downloaded on demand. But LFS adds overhead for small files. Use it only for assets >1MB that change infrequently.

How do I keep a huge monorepo fast?
Combine: partial clones, sparse checkout, Scalar for maintenance, and split CI/CD pipelines. Use git maintenance (Git 2.30+) for background optimization.
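The git maintenance command mentioned in the last answer (Git 2.30+) can replace hand-rolled cron jobs; a minimal setup:

```shell
# Enable background maintenance for the current repository (Git 2.30+)
$ git maintenance register   # add this repo to the global maintenance list
$ git maintenance start      # install scheduler entries (cron/launchd/schtasks)

# Or run individual tasks on demand
$ git maintenance run --task=gc --task=commit-graph
```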