Rongchai Wang
January 17, 2026 09:16
GitHub is introducing a rate limit of 200 new Actions cache entry uploads per minute per repository, addressing system stability concerns caused by high-volume uploads.
GitHub has implemented a new rate limit for its Actions cache system, limiting uploads to 200 new cache entries per minute for each repository. The change, announced on January 16, 2026, targets repositories that were overloading the cache system with fast uploads and causing stability issues across the platform.
Downloads remain unaffected. If your workflows retrieve existing cache entries, nothing changes. The limit specifically targets the creation of new entries – a distinction that is important for teams running parallel builds that generate new cache data.
Why now? GitHub pointed to cache thrashing as the culprit. Repositories that upload massive numbers of cache entries in short bursts degrade performance for everyone else on the shared infrastructure. The cap of 200 per minute gives heavy users enough leeway for legitimate use cases while preventing the kind of abuse that has destabilized the system.
Part of a broader set of changes
This rate limiting comes as part of several significant changes to the economics of GitHub Actions. Earlier this month, GitHub reduced hosted runner prices by 15% to 39%, depending on size. But the bigger news comes on March 1, 2026, when using self-hosted runners in private repos will cost $0.002 per minute – a new fee that will force some teams to completely rethink their CI/CD architecture.
The cache system itself received an upgrade at the end of 2025: repositories can now exceed the previous 10 GB limit through pay-as-you-go pricing. Each repository still gets 10 GB for free, but heavy users can now buy more instead of constantly wrestling with eviction policies.
What teams should check
Most workflows won't notice this limit. However, if you're running matrix builds that generate unique cache keys across dozens of parallel jobs, you'll need to do the math. A matrix of 50 concurrent jobs can hit 200 cache uploads in under a minute if each job creates just a handful of entries.
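As a rough illustration, the back-of-the-envelope check below (plain Python, with hypothetical job counts and timings) shows how quickly a matrix build can approach the cap.

```python
# Back-of-the-envelope estimate of new cache entries created per minute.
# All numbers are hypothetical; plug in your own matrix size and timings.
def uploads_per_minute(concurrent_jobs: int, entries_per_job: int,
                       window_minutes: float = 1.0) -> float:
    """Upper bound on cache uploads per minute if all jobs finish in the window."""
    return concurrent_jobs * entries_per_job / window_minutes

# 50 parallel jobs writing 4 cache entries each within the same minute
# lands exactly on the 200-per-minute cap.
print(uploads_per_minute(concurrent_jobs=50, entries_per_job=4))  # 200.0
```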
The solution is straightforward: consolidate cache keys where possible, or stagger uploads across jobs if you genuinely hit the ceiling. GitHub has not announced a monitoring dashboard for cache upload rates, so teams concerned about reaching the limit will need to review their workflow logs manually.
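Until a dashboard exists, one option is to poll the repository's cache list yourself. The sketch below is a minimal example assuming GitHub's public REST endpoint for listing Actions caches (GET /repos/{owner}/{repo}/actions/caches) and a token with read access to the repository's Actions; the owner and repository names are placeholders.

```python
# Sketch: count how many Actions cache entries were created in the last minute.
# Uses the REST endpoint for listing caches; requires a token with read access
# to the repository's Actions (read here from the GITHUB_TOKEN env variable).
import os
from datetime import datetime, timedelta, timezone

import requests

OWNER = "your-org"   # placeholder
REPO = "your-repo"   # placeholder
TOKEN = os.environ["GITHUB_TOKEN"]


def recent_cache_uploads(window_seconds: int = 60) -> int:
    """Return the number of cache entries created within the last window."""
    url = f"https://api.github.com/repos/{OWNER}/{REPO}/actions/caches"
    headers = {
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    }
    # Ask for the newest entries first so one page is usually enough.
    params = {"sort": "created_at", "direction": "desc", "per_page": 100}
    cutoff = datetime.now(timezone.utc) - timedelta(seconds=window_seconds)

    resp = requests.get(url, headers=headers, params=params, timeout=10)
    resp.raise_for_status()

    count = 0
    for entry in resp.json().get("actions_caches", []):
        # Timestamps look like "2026-01-16T12:34:56.000Z"; drop the fractional
        # part and parse as UTC for a portable comparison.
        stamp = entry["created_at"].split(".")[0].rstrip("Z")
        created = datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%S").replace(tzinfo=timezone.utc)
        if created >= cutoff:
            count += 1
    return count


if __name__ == "__main__":
    uploads = recent_cache_uploads()
    print(f"Cache entries created in the last minute: {uploads} (limit: 200)")
```

Running this right after a large matrix build finishes gives a quick sense of how close the repository came to the ceiling.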