AtariZoll wrote: DrCoolZic wrote:
Do we really need to define a new format?
- Pasti has limitations but it does the job in almost all cases even if it is sometimes ugly
- IPF can do anything you can imagine. The problem is that currently it is only generated by SPS people. Getting IPF files from stream files simply does not happen (I have now been waiting almost three years to get nothing)
so supporting it in an emulator is good, but it is not an option for end users who want to save their games (unless Aufit writes IPF)
- Flux level is an option. It provides information at a level of detail even lower than IPF. Is it useful? I do not know
Well, I already talked about something that would be almost perfect and not very space hungry: ST-like track data for unprotected tracks, STT-like data for simpler protections (like non-standard sectors and headers), and flux level for the better protected tracks. It is fine for most floppies, where 99% of sectors are standard, and it would give smaller images than the Pasti ones made with the public tool. But in many cases it would give larger images, in the range of 10 MB, where the whole floppy uses custom tracks: track dumps, high density (Space Ace serial ...). When SPC is mature enough it should solve all protections, I guess. 10 MB is not much for today's storage.
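A track classifier for such a hybrid format could be sketched in a few lines of Python. To be clear, the `TrackEncoding` names and the sector dictionary keys below are all made up for illustration; they are not part of Pasti, STT, or any existing tool:

```python
from enum import Enum, auto

class TrackEncoding(Enum):
    """Storage level for one track (hypothetical hybrid format)."""
    ST_SECTORS = auto()   # plain 512-byte standard sectors
    STT_DETAIL = auto()   # sector + header detail for simple protections
    FLUX = auto()         # raw flux timings for heavy protection

def classify_track(sectors):
    """Pick the cheapest encoding that still captures the track.

    `sectors` is a list of dicts with hypothetical keys:
    'size', 'crc_ok', 'standard_header'.
    """
    # Fully standard track: store it like a plain .ST image.
    if all(s['size'] == 512 and s['crc_ok'] and s['standard_header']
           for s in sectors):
        return TrackEncoding.ST_SECTORS
    # Readable but with odd sizes or headers: STT-like detail is enough.
    if all(s['crc_ok'] or s['standard_header'] for s in sectors):
        return TrackEncoding.STT_DETAIL
    # Anything stranger falls back to a flux-level dump.
    return TrackEncoding.FLUX
```

With this kind of scheme, a disk where 99% of tracks classify as `ST_SECTORS` stays small, and only the protected tracks pay the flux-level price.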
The "problem" of such a hybrid format is: how do you determine which level of detail should be used to store a specific sector/track? If you know the protection used, you can have some scripts to store detailed information only for the protected tracks, but what if the track is slightly damaged and gives a CRC error when reading it? Is this damage, or is the bad CRC part of the protection?
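One heuristic such scripts could try: read the same track several times and compare which sectors fail CRC. A bad CRC that is identical on every read smells like intentional protection; one that comes and goes smells like dirt or damage. A sketch, with a made-up representation of the reads:

```python
def bad_crc_is_stable(reads):
    """Given several reads of the same track, return True when the
    set of sectors with bad CRCs is identical on every read.

    `reads` is a list of reads; each read is the set of sector IDs
    that failed CRC (a hypothetical representation, not a real API).
    A stable set suggests protection; a varying set suggests damage.
    """
    return all(r == reads[0] for r in reads)
```

This is not a proof either way, of course; a consistently scratched surface can also fail the same way every time, which is why some manual analysis remains.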
For me, the problem is not only the storage format, it is the analysis required to choose which information matters and which doesn't, and this is exactly the way IPF files are produced (and why they take so long).
So, I agree that storing raw flux data solves this problem, as you store the highest level of detail possible, but at the expense of a bigger file size (I know size is not that much of a problem, but coders like to optimize things: why store 2-3 MB when it could be squeezed?).
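As a rough illustration of the squeezing: successive flux intervals on a track are very similar, so delta-encoding them before a generic deflate already helps a lot. This is only a sketch, not any real container format:

```python
import struct
import zlib

def pack_flux(intervals):
    """Delta-encode flux intervals (in ns) and deflate the result.

    Deltas between neighbouring intervals are small, so the byte
    stream is highly repetitive and compresses well.
    """
    deltas = [intervals[0]] + [b - a for a, b in zip(intervals, intervals[1:])]
    raw = struct.pack('<%di' % len(deltas), *deltas)
    return zlib.compress(raw, 9)

def unpack_flux(blob):
    """Inverse of pack_flux: inflate, then prefix-sum the deltas."""
    raw = zlib.decompress(blob)
    deltas = struct.unpack('<%di' % (len(raw) // 4), raw)
    out, acc = [], 0
    for d in deltas:
        acc += d
        out.append(acc)
    return out
```

A real format would pick a smarter entropy coder and a more compact integer width, but even this naive round trip shows the data is far from incompressible.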
And even at flux level, you sometimes need expertise to look at the graph of the disk dump and see whether the data are scattered or not, which could indicate a damaged floppy that needs to be cleaned and dumped again.
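A very crude automated version of "look at the graph" could measure how far each flux interval lands from an ideal bit-cell boundary; clean media clusters tightly around the boundaries, a dirty or damaged disk scatters. The `flux_jitter` helper and its 2 µs nominal cell time are assumptions for illustration, not taken from any dumping tool:

```python
import statistics

def flux_jitter(intervals_ns, cell_ns=2000):
    """Rough scatter estimate for one track of flux intervals.

    Each interval is snapped to the nearest multiple of the nominal
    bit-cell time, and we collect the normalized deviation from that
    ideal position. Returns the population standard deviation of the
    deviations, as a fraction of the cell time (0.0 = perfect).
    """
    devs = []
    for t in intervals_ns:
        cells = max(1, round(t / cell_ns))
        devs.append((t - cells * cell_ns) / cell_ns)
    return statistics.pstdev(devs) if devs else 0.0
```

A validation tool could flag tracks whose jitter exceeds some threshold and ask the user to clean the disk and dump again, instead of relying on an expert eyeballing the plot.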
STX is nice; it can be used by anyone without any technical knowledge, but in the end, if the input floppy is damaged in a certain way, the user will have no idea the dump is bad.
A suitable format should come with tools to validate the data; maybe some parts of the analysis can be automated, but some tricky cases will remain.