Fix: Git remote rejected — file exceeds GitHub's file size limit of 100.00 MB
Quick Answer
Resolve the GitHub push error when a file exceeds the 100 MB size limit by removing the large file from history, using Git LFS, or cleaning your repository with BFG Repo Cleaner.
The Error
You try to push your commits to GitHub and get this:
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: error: File bigdata.csv is 150.23 MB; this exceeds GitHub's file size limit of 100.00 MB
To github.com:your-username/your-repo.git
! [remote rejected] main -> main (pre-receive hook declined)
error: failed to push some refs to 'github.com:your-username/your-repo.git'
The entire push fails. None of your commits go through, even the ones that do not contain the large file.
Why This Happens
GitHub enforces a hard 100 MB per-file size limit on every push. When you git push, GitHub scans every object across every commit you are pushing. If any single file in any commit exceeds 100 MB, the entire push is rejected.
The key detail most people miss: the file does not need to exist in your current working tree. If you added a 200 MB file, committed it, then deleted it and committed the deletion, both commits go to GitHub on push. The first commit still contains the 200 MB blob, and GitHub still rejects it.
This means simply deleting the file and making a new commit does not fix the problem. You need to rewrite history so the large file never appears in any commit.
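You can see this in a throwaway repo. The demo below uses a tiny file as a stand-in for a real 200 MB one; the repo path and commit messages are made up for the example:

```shell
# Commit a file, then delete it in a later commit — the blob
# is still reachable from the earlier commit.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo "pretend this line is 200 MB of data" > bigdata.csv
git add bigdata.csv
git -c user.email=demo@example.com -c user.name=demo commit -q -m "add big file"
git rm -q bigdata.csv
git -c user.email=demo@example.com -c user.name=demo commit -q -m "delete big file"
# The file is gone from the working tree, but its blob still exists
# in the parent commit and would be uploaded along with both commits:
git cat-file -s "$(git rev-parse 'HEAD~1:bigdata.csv')"
```

The last command prints the blob's size in bytes, proving the object survives the deletion commit — which is exactly what GitHub's pre-receive hook scans for.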
GitHub also issues a warning (but does not block the push) for files between 50 MB and 100 MB. If you see that warning, it is a good time to set up Git LFS before the file grows past the hard limit.
Fix 1: Remove the Large File and Amend the Last Commit
If the large file was added in your most recent commit and you have not pushed yet, this is the fastest fix.
Remove the file from the staging area without deleting it from disk:
git rm --cached bigdata.csv
Add the file to .gitignore so it does not get committed again:
echo "bigdata.csv" >> .gitignore
git add .gitignore
Amend the commit to rewrite it without the large file:
git commit --amend -m "Your original commit message"
Now push:
git push origin main
This only works when the large file exists in the latest commit. If it was introduced in an earlier commit, you need one of the other fixes below.
If you are running into other push errors alongside this one, check out failed to push some refs for additional troubleshooting steps.
Fix 2: Use git filter-repo to Remove the File from All History
git filter-repo is the modern replacement for git filter-branch. It is faster, safer, and recommended by the Git project itself.
Install it first:
# pip (works on all platforms)
pip install git-filter-repo
# Homebrew (macOS)
brew install git-filter-repo
Then remove the large file from every commit in the repository history:
git filter-repo --invert-paths --path bigdata.csvThis rewrites every commit that touched bigdata.csv and removes the file entirely. Every commit hash from that point forward changes.
If you need to remove every blob larger than a size threshold, regardless of file name:
git filter-repo --strip-blobs-bigger-than 100M
To remove all files matching a pattern instead (for example, every .bin file), combine --path-glob with --invert-paths:
git filter-repo --invert-paths --path-glob '*.bin'
After running filter-repo, your origin remote is removed as a safety measure. Add it back:
git remote add origin git@github.com:your-username/your-repo.git
Then force push (see Fix 7 for warnings about force pushing):
git push origin main --force
Pro Tip: git filter-repo requires a fresh clone to run by default. If you are working in your only copy, pass --force to override this safety check: git filter-repo --invert-paths --path bigdata.csv --force. Back up your repo before doing this.
Fix 3: Use BFG Repo Cleaner
BFG Repo Cleaner is a specialized tool built specifically for removing large files and sensitive data from Git history. It is significantly faster than filter-branch on large repositories.
Download BFG from rtyley.github.io/bfg-repo-cleaner, make a fresh mirror clone of your repository with git clone --mirror, and run BFG against that clone:
java -jar bfg.jar --strip-blobs-bigger-than 100M your-repo.git
Or to remove a specific file by name:
java -jar bfg.jar --delete-files bigdata.csv your-repo.git
After BFG finishes, clean up the leftover objects:
cd your-repo.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive
Then force push:
git push origin main --force
BFG does not touch your latest commit by default. It considers your current HEAD as “protected.” If the large file is in your latest commit, remove it with git rm --cached and commit first, then run BFG.
Fix 4: Set Up Git LFS for Large Files
If you legitimately need large files in your repository, Git Large File Storage (LFS) is the correct solution. Instead of storing the full file in Git, LFS stores a pointer in the repo and uploads the actual file to a separate LFS server.
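Instead of the raw bytes, each LFS-tracked file is committed as a small text pointer. A pointer file looks like this (the oid and size values here are illustrative):

```text
version https://git-lfs.github.com/spec/v1
oid sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
size 157286400
```

Git stores only these three lines; the 150 MB payload lives on the LFS server.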
Install Git LFS:
# macOS
brew install git-lfs
# Linux (Debian/Ubuntu)
sudo apt install git-lfs
# Windows: Git LFS is bundled with Git for Windows
After installing, enable LFS once per machine:
git lfs install
Track the file type with LFS:
git lfs track "*.csv"
This creates or updates a .gitattributes file. Commit it:
git add .gitattributes
git commit -m "Track CSV files with Git LFS"
Important: Running git lfs track does not retroactively convert files already in your Git history. You still need to remove the large file from history using Fix 2 or Fix 3 first, then re-add it so LFS handles it.
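For reference, git lfs track records the pattern in .gitattributes like this:

```text
*.csv filter=lfs diff=lfs merge=lfs -text
```

Any file matching the pattern is then routed through the LFS filter on add and checkout.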
To migrate an existing file to LFS and rewrite history in one step:
git lfs migrate import --include="*.csv" --everything
This rewrites your entire history so that all .csv files are stored via LFS. After the migration, force push:
git push origin main --force
If you have already been using LFS and are hitting errors, see Git LFS smudge filter errors for related fixes.
Common Mistake: Running git lfs track after the file is already committed to regular Git does nothing for the existing commits. You must either rewrite history with git lfs migrate import or remove the file from history and re-add it fresh. Simply tracking and pushing again still fails.
Fix 5: Add Large Files to .gitignore Before Committing
This is prevention, not a cure. If you have not committed the large file yet, the simplest path is to ignore it.
Create or edit .gitignore in your repository root:
# Large data files
*.csv
*.sql.gz
data/
models/*.bin
If the file is already tracked by Git, .gitignore alone will not help. You need to untrack it first:
git rm --cached bigdata.csv
echo "bigdata.csv" >> .gitignore
git add .gitignore
git commit -m "Remove large file and add to gitignore"
Verify the file is no longer tracked:
git ls-files | grep bigdata
If this returns nothing, the file is properly untracked. But remember, if the file existed in previous commits, those commits still contain the blob. You will still need Fix 2 or Fix 3 to clean the history before pushing.
Fix 6: Use Interactive Rebase to Edit the Offending Commit
If the large file was added a few commits back and you want precise control over which commit to fix, interactive rebase is the approach.
First, find which commit introduced the large file:
git log --all --diff-filter=A -- bigdata.csv
This shows the commit where bigdata.csv was first added. Count how many commits back that is, then start the rebase:
git rebase -i HEAD~5
In the editor that opens, change pick to edit on the commit that added the large file:
edit abc1234 Add data processing pipeline
pick def5678 Update README
pick 789abcd Fix data parserSave and close the editor. Git pauses at that commit. Remove the file:
git rm --cached bigdata.csv
echo "bigdata.csv" >> .gitignore
git add .gitignore
git commit --amend --no-editContinue the rebase:
git rebase --continue
If you encounter conflicts during the rebase, resolve them and continue. For detailed guidance on handling rebase conflicts, see git rebase conflict.
After the rebase completes, all commit hashes from the edited commit onward are rewritten. Push with force:
git push origin main --force
Fix 7: Force Push After Cleaning History
Every fix that rewrites history (Fix 2, 3, 4, and 6) requires a force push to update the remote. A normal git push fails because the remote history and your local history have diverged.
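The divergence is easy to reproduce in a throwaway setup (the repo names and commit messages below are made up for the demo):

```shell
# Simulate a rewritten history being pushed over an old one.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q --bare origin.git          # stand-in for the GitHub remote
git clone -q origin.git repo
cd repo
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "original history"
git push -q origin HEAD:main
# Rewrite the pushed commit, as filter-repo or BFG would:
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --amend --allow-empty -m "rewritten history"
if git push -q origin HEAD:main 2>/dev/null; then
    echo "unexpected: plain push succeeded"
else
    echo "plain push rejected (non-fast-forward)"
fi
# Only a force push updates the remote to the rewritten history:
git push -q --force origin HEAD:main
echo "force push accepted"
```

The plain push is rejected as non-fast-forward because the rewritten commit is not a descendant of what the remote has; the force push replaces it.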
Force push your cleaned branch:
git push origin main --force
Or use the safer variant that only overwrites if no one else has pushed in the meantime:
git push origin main --force-with-lease
Warning: Force pushing overwrites the remote history. If other people have cloned or forked the repository, their local copies become incompatible with the new history. They need to re-clone, or hard-reset their local branches onto the rewritten remote; a plain git pull --rebase can replay the old commits, large file included, back on top of the cleaned history.
Before force pushing a shared repository:
- Notify your team that history is being rewritten
- Ensure no one has pending work based on the old commits
- Use --force-with-lease instead of --force to avoid overwriting someone else’s recent push
If you are dealing with non-fast-forward push rejections alongside the large file issue, the force push resolves both problems at once. Just make sure you have genuinely cleaned the history first.
Fix 8: Prevent Future Issues with Pre-Commit Hooks
After cleaning up the mess, set up a safety net so large files never sneak into your commits again.
Create a pre-commit hook at .git/hooks/pre-commit:
#!/bin/bash
# Block commits containing files over 100 MB
max_size=$((100 * 1024 * 1024))  # 100 MB in bytes
status=0
# -z with read -d '' handles file names containing spaces
while IFS= read -r -d '' file; do
    if [ -f "$file" ]; then
        file_size=$(wc -c < "$file")
        if [ "$file_size" -gt "$max_size" ]; then
            echo "ERROR: $file is $((file_size / 1024 / 1024)) MB, which exceeds the 100 MB limit."
            echo "Consider using Git LFS or adding it to .gitignore."
            status=1
        fi
    fi
done < <(git diff --cached --name-only --diff-filter=ACM -z)
exit $status
Make it executable:
chmod +x .git/hooks/pre-commitThis hook checks every staged file before allowing the commit. If any file exceeds 100 MB, the commit is blocked with a clear error message.
For team-wide enforcement, use a hooks manager like Husky (Node.js projects) or pre-commit (Python framework) that keeps hooks in the repo so every contributor gets them automatically.
You can also tune how Git itself handles large files globally:
git config --global core.bigFileThreshold 50m
This does not block or warn about commits; it tells Git to store files above the threshold without delta compression, which speeds up operations on repositories that legitimately contain big files.
For projects using GitHub Actions, you can add a check that scans for files approaching the limit. This catches issues even if someone bypasses local hooks. See GitHub Actions exit code 1 errors if you run into CI pipeline issues while setting this up.
Still Not Working?
If you have rewritten history and force pushed but the push still fails, check these less common causes.
Protected Branch Rules
If the branch has protection rules enabled on GitHub, force push is blocked by default. Go to Settings > Branches > Branch protection rules in your GitHub repository and temporarily disable “Do not allow force pushes” while you clean up the history. Re-enable it afterward.
Alternatively, push to a new branch, delete the old one, and rename:
git push origin main:main-cleaned --force
# Then on GitHub: delete main, rename main-cleaned to main
GitLab and Bitbucket Have Different Limits
GitHub’s limit is 100 MB per file. Other platforms differ:
- GitLab: 100 MB default, but self-hosted instances can raise or lower the limit via the Maximum file size push rule
- Bitbucket Cloud: 100 MB per file
- Bitbucket Server: Configurable, default varies by version
- Azure DevOps: 100 MB per file for Git repos
If you migrated a repository from one platform to another and the push fails, check the target platform’s specific limits.
Pack File Size Issues
Even after removing a large file from history, Git may still have the old object in its pack files. Run a full garbage collection:
git reflog expire --expire=now --all
git gc --prune=now --aggressive
Verify the large blob is actually gone:
git rev-list --objects --all | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' | sort -k3 -n -r | head -20
This lists the 20 largest objects in the repository. If the large file still appears, your history rewrite did not fully remove it. Re-run the cleanup with git filter-repo (Fix 2).
The File Was Added in a Merge Commit
Merge commits can carry file changes that are tricky to rebase away. If the large file was introduced via a merge, git filter-repo (Fix 2) or BFG (Fix 3) are more reliable than interactive rebase, because they handle merge commits cleanly.
Local Reference Still Holds the Object
If you have stashes, tags, or other refs pointing to commits that contain the large file, Git keeps the object around. Clear stashes and delete tags that reference the old history:
git stash clear
git tag -l | xargs git tag -d
Note that this deletes every local tag; if only a few tags point at the old history, delete those individually with git tag -d <tag>. Then re-run garbage collection and try pushing again.
If your push errors reference other issues like missing branch refs, check src refspec main does not match for related diagnostics.
Solo developer based in Japan. Every solution is cross-referenced with official documentation and tested before publishing.
Related Articles
Fix: Git submodule update failed / fatal: not a git repository
Resolve Git submodule update and init failures including 'fatal: not a git repository', path conflicts, URL mismatches, shallow clone issues, and CI/CD checkout problems.
Fix: git cherry-pick error: could not apply commit (conflict)
How to fix git cherry-pick conflict errors caused by diverged branches, overlapping changes, missing context, renamed files, and merge commits.
Fix: error: failed to push some refs to remote
How to fix Git error 'failed to push some refs' caused by diverged branches, remote changes, protected branches, authentication failures, and pre-push hooks.
Fix: git error: src refspec 'main' does not match any
How to fix git error src refspec main does not match any caused by empty repos, wrong branch name, no commits, typos, and default branch mismatch.