Strategies for working with large repositories (a.k.a. monorepos) in Git/GitHub: sparse-checkout and partial (filtered) clone
You can also follow an example in this video!
Imagine a repository so large that cloning it in full takes hours. Once a repository is cloned, pulling new content is certainly faster, but some people want to avoid cloning the entire repository in the first place. In the version control world, such a repository is often called a “monorepo”: a single version-controlled code repository that holds many projects.
To work with large files, GitHub typically recommends enabling Git Large File Storage (LFS). In fact, GitHub imposes certain limits, such as 25 MB for files uploaded directly through the web interface and 100 MB for files pushed in general. For large files like these, the only option for placing them in a GitHub repository is LFS.
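As a brief sketch of how LFS works (the `*.iso` pattern here is just an illustrative example): after installing the extension, running `git lfs track "*.iso"` records a rule like the following in `.gitattributes`, and files matching the pattern are then stored as small pointer files in the repository while the real content lives in LFS storage:

```
*.iso filter=lfs diff=lfs merge=lfs -text
```

Committing the `.gitattributes` file alongside the large files ensures everyone cloning the repository applies the same rule.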
But what about cloning these repositories? Is there a technique to clone only what you need instead of copying the entire thing? Such was a requirement for one customer, who wanted their employees to…
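The two techniques in the title can be combined to do exactly that. Here is a hedged, self-contained sketch (the repository name `bigrepo` and the `project-a`/`project-b` directories are hypothetical, created locally just for the demo): a blobless partial clone downloads commits and trees but fetches file contents on demand, and sparse-checkout then limits the working tree to one directory.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Create a small stand-in for a monorepo with two project directories.
git init -q -b main bigrepo
cd bigrepo
mkdir -p project-a project-b
echo "a" > project-a/a.txt
echo "b" > project-b/b.txt
git add .
git -c user.email=demo@example.com -c user.name=Demo commit -qm "initial commit"
# Allow partial (filtered) clones to be served from this repository.
git config uploadpack.allowFilter true
cd ..

# Blobless clone: commits and trees now, file contents fetched on demand.
git clone -q --filter=blob:none --no-checkout "file://$tmp/bigrepo" partial
cd partial
# Restrict the working tree to project-a only.
git sparse-checkout init --cone
git sparse-checkout set project-a
git checkout -q main
ls    # only project-a is present in the working tree
```

Against a real GitHub repository you would replace the `file://` URL with the repository URL; GitHub's servers already allow `--filter`, so no server-side configuration is needed there.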