I did finish fifth grade. My question in no way asked why files are of different lengths. The definition of interval is the time or space between two events or objects. In this case, the "data intervals" are the times between data points within a file. Examining my telemetry files, I find as few as eight and as many as 23 data points recorded per second; the mean is around ten. Hence the question of why. Answering that might explain the inconsistency. The uneven intervals could be corrected if the interval were settable. Another advantage of letting the operator set the interval would be a significantly smaller file. Currently, for my purposes, I see no real value in 9,000+ data points for a 15-minute flight.
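The per-second counts above can be tallied with a simple bucketing pass over the log's timestamps. This is just a sketch: it assumes the telemetry can be reduced to a list of timestamps in seconds from the start of the flight, and the sample list below is made-up illustrative data, not from an actual log.

```python
from collections import Counter

def samples_per_second(timestamps):
    """Bucket timestamps (seconds from log start) by whole second
    and return a dict mapping each second to its sample count."""
    return dict(Counter(int(t) for t in timestamps))

# Made-up uneven logging: 3 samples in second 0, 2 in second 1, 1 in second 2
ts = [0.05, 0.40, 0.90, 1.20, 1.75, 2.60]
counts = samples_per_second(ts)
print(counts)                                  # {0: 3, 1: 2, 2: 1}
print(min(counts.values()), max(counts.values()))  # spread, e.g. 1 3
```

Run over a real log, the spread between the min and max counts is exactly the 8-to-23 variation described above.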
I work with a much more sophisticated system that records telemetry at both 5 and 20 Hz. 5 Hz is used for casual review, but if detail is required, 20 Hz is the "go to" for the best info. When problems arise, it's extremely beneficial to have the 20 Hz data available for review, since electrical faults are often of extremely short duration and can be completely missed in a 5 Hz data plot. They also plot out a whole lot better at higher resolution. One of my responsibilities for my employer is performing incident reviews to determine cause, and you can do that the easy way or the comprehensive way. I noted the roughly 10 Hz average with H telemetry as well. For me that's pretty good: not great, but good enough for what the system is.
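The point about missed faults is simple arithmetic: a transient shorter than the sampling period can fall entirely between two 5 Hz samples yet still be hit by several 20 Hz samples. A toy sketch with made-up numbers (the 130 ms fault window is hypothetical, not from any real incident log):

```python
def samples_hitting(rate_hz, duration_s, fault_start, fault_end):
    """Count sample instants over [0, duration_s] that land
    inside the fault window [fault_start, fault_end]."""
    period = 1.0 / rate_hz
    n = int(duration_s * rate_hz) + 1
    return sum(1 for i in range(n)
               if fault_start <= i * period <= fault_end)

FAULT = (0.21, 0.34)  # hypothetical 130 ms electrical transient
print(samples_hitting(5, 1.0, *FAULT))    # 0 samples: invisible at 5 Hz
print(samples_hitting(20, 1.0, *FAULT))   # 2 samples: visible at 20 Hz
```

At 5 Hz the samples land at 0.0, 0.2, 0.4 s and straddle the fault entirely; at 20 Hz the 0.25 and 0.30 s samples catch it.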