I'm developing an application which uses some large binary files - in the range 1 GB - 25 GB. The application will run primarily on servers and possibly the odd powerful/modern desktop PC. I could (a) split these large files up so they're always less than 4 GB, or (b) just keep them together as single files.
FAT32 file systems only allow file sizes up to 4 GB (minus one byte). If I don't split up the files, they won't be usable on FAT32 systems.
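For reference, if I went with option (a), I'd do something roughly like this minimal Python sketch (the `.partNNN` naming scheme and buffer size are just my own placeholders):

```python
import os

FAT32_MAX = 4 * 1024**3 - 1  # FAT32 caps files at 4 GB minus one byte

def split_file(path, chunk_size=FAT32_MAX, buffer_size=8 * 1024**2):
    """Split `path` into numbered parts, each at most chunk_size bytes.

    Streams through the source in buffer_size pieces so we never hold
    a multi-gigabyte chunk in memory at once.
    """
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            part_path = f"{path}.part{index:03d}"
            written = 0
            with open(part_path, "wb") as dst:
                while written < chunk_size:
                    data = src.read(min(buffer_size, chunk_size - written))
                    if not data:
                        break
                    dst.write(data)
                    written += len(data)
            if written == 0:
                os.remove(part_path)  # source exhausted; drop the empty part
                break
            parts.append(part_path)
            index += 1
    return parts
```

Reassembly on the consuming side would just be concatenating the parts in order, so the question is really whether the extra bookkeeping is worth it at all.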
Do I need to bother splitting these files?
This application is always going to be running on reasonably modern hardware. Are there any modern servers out there likely to use FAT32? And are there any cloud file systems that impose significant limits on file sizes? (e.g. AWS Elastic File System is fine, as it allows single files up to 47 TB.)