Large XML files can be hard to work with in simple tool chains, such as text editors, statistical software, and PHP or Python scripts on servers where you can't control the available memory. The large size of some IATI files (e.g. 10MB+, some up to 50MB) raises the barrier to entry for working with the data.
From a use-case perspective it may be preferable to split large IATI files into smaller sections. These could each be registered on the Registry under the same package, simply as different download URLs.
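As a rough illustration of how such splitting could work without ever loading the whole file into memory, here is a sketch using Python's standard-library streaming parser. The function name, chunk size, and output file naming are illustrative assumptions, and it assumes un-namespaced `iati-activity` elements under an `iati-activities` root:

```python
# Sketch: split a large IATI activity file into smaller chunks by
# streaming it with xml.etree.ElementTree.iterparse, so memory use
# stays roughly proportional to one activity, not the whole file.
import xml.etree.ElementTree as ET


def _flush(buf, prefix, n):
    """Write one chunk of serialized activities under a fresh root."""
    path = f"{prefix}-{n:03d}.xml"
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0"?>\n<iati-activities version="2.03">\n')
        f.writelines(buf)
        f.write("</iati-activities>\n")
    return path


def split_iati_file(source, chunk_size=100, prefix="chunk"):
    """Stream `source`, writing groups of <iati-activity> elements
    to numbered files. Returns the list of paths written."""
    paths, buf, n = [], [], 0
    for _event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "iati-activity":
            buf.append(ET.tostring(elem, encoding="unicode"))
            elem.clear()  # free the finished activity's subtree
            if len(buf) == chunk_size:
                paths.append(_flush(buf, prefix, n))
                n += 1
                buf = []
    if buf:  # remainder smaller than chunk_size
        paths.append(_flush(buf, prefix, n))
    return paths
```

For example, `split_iati_file("big-file.xml", chunk_size=500)` would emit `chunk-000.xml`, `chunk-001.xml`, and so on, each a valid standalone activities file. A real implementation would also need to carry over namespaces and attributes on the original root element.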
Would giving guidance on appropriate maximum file sizes for IATI data be useful? Would it create significant challenges for implementers? Or is this better dealt with by aggregators that can serve the data in smaller files?