CRC imposes no limit on file size. However, your library should support feeding data in chunks when calculating the CRC. Simply adding the CRC values of the chunks won't work: you have to use the CRC of the preceding chunk as the starting value for the next chunk's calculation, and that needs to be supported by your library.
E.g. pycrc16 gives this example of doing just that:

```python
import crc16
crc = crc16.crc16xmodem(b'1234')
crc = crc16.crc16xmodem(b'56789', crc)
```
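If you'd rather avoid a third-party dependency, the standard library's `binascii.crc_hqx` uses the same unreflected 0x1021 polynomial (CRC-16/XMODEM) and also accepts a running value as its second argument, so chunked feeding can be sketched like this (`crc16_xmodem_chunked` is just an illustrative helper name, not a library function):

```python
import binascii

def crc16_xmodem_chunked(chunks, crc=0):
    """Feed an iterable of byte chunks through CRC-16/XMODEM,
    carrying the running CRC from one chunk into the next."""
    for chunk in chunks:
        crc = binascii.crc_hqx(chunk, crc)
    return crc

# Chaining chunks gives the same result as one pass over all the data.
chunked = crc16_xmodem_chunked([b'1234', b'56789'])
whole = binascii.crc_hqx(b'123456789', 0)
assert chunked == whole  # 0x31C3, the documented CRC-16/XMODEM check value
```

The same pattern works for any chunk boundaries, which is exactly what you need for streaming a large file.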
On the 16 vs 32 bit part. In general, the more bits, the lower the probability of collisions and the better the chances of detecting errors. But that also depends strongly on the polynomial and the other CRC parameters. So choosing the right/best CRC is a good pretext for a holy war :o).
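As a back-of-envelope illustration (assuming corruption that effectively randomizes the data, so every CRC value is equally likely), an error slips past an n-bit checksum with probability about 2^-n:

```python
# Rough odds that random corruption produces the same n-bit CRC
# as the original data: about 1 in 2**n.
for bits in (16, 32):
    print(f"CRC-{bits}: undetected with odds ~1 in {2**bits:,}")
```

For specific error patterns (burst errors, small numbers of flipped bits) the guarantees depend on the polynomial, not just the width, which is why the choice of parameters matters.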
I'm worried about "CRC-32CCITT" though. CRC names are somewhat arbitrary: the same name often refers to different algorithms, and many algorithms have multiple names. But as far as I know, "CCITT" always refers to a CRC-16 algorithm. So, as a word of caution, simply extending from 16 bits to 32 bits isn't a good idea, since good 16-bit polynomials usually aren't very good 32-bit polynomials. If you prefer to use a 32-bit CRC, I'd suggest choosing a proper 32-bit CRC. See here for a good list of "known good" parameter sets.
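For instance, Python already ships a well-established 32-bit CRC (CRC-32/ISO-HDLC, the one used by zip and PNG) as `zlib.crc32`, and it supports the same chunk-chaining pattern as above:

```python
import zlib

# Feed data in chunks, passing the running CRC as the start value.
crc = zlib.crc32(b'1234')
crc = zlib.crc32(b'56789', crc)

# One-shot over the whole buffer gives the same result.
assert crc == zlib.crc32(b'123456789')
# 0xCBF43926 is the documented check value for this CRC over b'123456789'.
assert crc == 0xCBF43926
```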