I could see keeping one big file still being advantageous in that environment. Doing fopen, read, and close over and over on a set of small files does add up, both in CPU time and memory slack, whereas treating the data as one giant packed backing store would win on speed, at a cost in dev time. Compressing it as well could be a further advantage. But I would expect there to be some spot where it crosses over from advantage to disadvantage, and I suspect that crossover sits around small files/objects. That is just for speed, though; in some cases you may have to optimize for size instead. In the end it is usually better to build some test code and find out. Sometimes the results are what you expect. Sometimes you discover you are spending time on things that stopped mattering long ago but were a big deal 20 years ago.
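A rough sketch of the kind of test code I mean, in Python just as a quick timing harness. Everything here is made up for illustration (the object count, the fixed object size, the file names); the point is only to measure open/read/close per small file against one sequential pass over a packed file on your actual setup:

```python
import os
import tempfile
import time

# Hypothetical workload parameters -- tune these to match your real data.
NUM_OBJECTS = 1000
OBJ_SIZE = 256  # bytes per object

def bench(workdir):
    payload = b"x" * OBJ_SIZE

    # Layout 1: many small files, one object per file.
    small_dir = os.path.join(workdir, "small")
    os.makedirs(small_dir)
    for i in range(NUM_OBJECTS):
        with open(os.path.join(small_dir, f"{i}.bin"), "wb") as f:
            f.write(payload)

    # Layout 2: one packed backing store holding the same objects back to back.
    packed_path = os.path.join(workdir, "packed.bin")
    with open(packed_path, "wb") as f:
        for _ in range(NUM_OBJECTS):
            f.write(payload)

    # Read every object via open + read + close, once per file.
    t0 = time.perf_counter()
    total_small = 0
    for i in range(NUM_OBJECTS):
        with open(os.path.join(small_dir, f"{i}.bin"), "rb") as f:
            total_small += len(f.read())
    t_small = time.perf_counter() - t0

    # Read the same bytes with a single open and sequential reads.
    t0 = time.perf_counter()
    total_packed = 0
    with open(packed_path, "rb") as f:
        while chunk := f.read(OBJ_SIZE):
            total_packed += len(chunk)
    t_packed = time.perf_counter() - t0

    return t_small, t_packed, total_small, total_packed

with tempfile.TemporaryDirectory() as d:
    t_small, t_packed, n_small, n_packed = bench(d)
    print(f"many small files: {t_small:.4f}s  one packed file: {t_packed:.4f}s")
```

Run it with object sizes that match your data; somewhere as you shrink the objects you should see the per-file open/close overhead start to dominate, which is the crossover point I mean.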