As I understand it, NTFS compression does not do "solid" compression, i.e. all files are compressed individually.
Is it intelligent enough to not compress a file if it is already below the size of a cluster?
I do not think so.
NTFS can store very small files resident in their MFT records, so the data of several small files can share a single cluster of the MFT.
https://technet.microsoft.com/en-us/...8WS.10%29.aspx
Quote:
NTFS creates a file record for each file and a folder record for each folder created on an NTFS volume. The MFT includes a separate file record for the MFT itself. These file and folder records are 1 KB each and are stored in the MFT. The attributes of the file are written to the allocated space in the MFT. Besides file attributes, each file record contains information about the position of the file record in the MFT. The figure MFT Entry with Resident Record shows the contents of an MFT record for a small file or folder. Small files and folders (typically, 900 bytes or smaller) are entirely contained within the file’s MFT record.
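To make the allocation point concrete, here is a small sketch of why compressing a sub-cluster file cannot free any disk space. The 4 KB cluster size is an assumption (it is the NTFS default but varies per volume), and the 900-byte resident limit is the approximate figure from the quote above; real NTFS compression additionally works in 16-cluster units, which this simplification ignores.

```python
CLUSTER = 4096          # assumed cluster size; NTFS default, varies per volume
RESIDENT_LIMIT = 900    # approximate resident-record limit from the quote above

def clusters_on_disk(file_size: int) -> int:
    """Clusters a file's data occupies: allocation is in whole clusters."""
    if file_size <= RESIDENT_LIMIT:
        return 0  # data lives inside the file's 1 KB MFT record itself
    return -(-file_size // CLUSTER)  # ceiling division

# Compressing a 3,000-byte file down to 1,500 bytes frees nothing:
assert clusters_on_disk(3000) == clusters_on_disk(1500) == 1
# A 500-byte file takes no data clusters at all -- it is resident in the MFT.
assert clusters_on_disk(500) == 0
```

So even if NTFS does compress a file smaller than a cluster, the on-disk allocation is unchanged; only resident files (below roughly 900 bytes) avoid cluster allocation entirely.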
Where can I download a file compressor that uses the same algorithm as NTFS compression? I want to compare the speed/ratio of my algorithm against NTFS's, thanks.
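NTFS file compression uses the LZNT1 algorithm, which Windows exposes through `RtlCompressBuffer` in ntdll, so you can benchmark against it without a separate download. A minimal ctypes sketch (Windows-only; the chunk size of 4096 and the output-buffer sizing are assumptions for illustration):

```python
import ctypes
import sys

COMPRESSION_FORMAT_LZNT1 = 0x0002    # format code NTFS uses on disk
COMPRESSION_ENGINE_STANDARD = 0x0000

def lznt1_compress(data: bytes):
    """Compress `data` with LZNT1 via ntdll.RtlCompressBuffer.

    Returns the compressed bytes, or None when not running on Windows
    (ntdll only exists there)."""
    if sys.platform != "win32":
        return None
    ntdll = ctypes.WinDLL("ntdll")
    fmt = COMPRESSION_FORMAT_LZNT1 | COMPRESSION_ENGINE_STANDARD
    ws_size = ctypes.c_ulong(0)
    frag_size = ctypes.c_ulong(0)
    # The work-space size must be queried before compressing.
    ntdll.RtlGetCompressionWorkSpaceSize(
        ctypes.c_ushort(fmt), ctypes.byref(ws_size), ctypes.byref(frag_size))
    workspace = ctypes.create_string_buffer(ws_size.value)
    src = ctypes.create_string_buffer(data, len(data))
    dst = ctypes.create_string_buffer(len(data) * 2 + 256)  # generous output buffer
    out_len = ctypes.c_ulong(0)
    status = ntdll.RtlCompressBuffer(
        ctypes.c_ushort(fmt), src, len(data), dst, len(dst),
        ctypes.c_ulong(4096), ctypes.byref(out_len), workspace)
    if status != 0:  # any non-zero NTSTATUS means failure
        raise OSError(f"RtlCompressBuffer failed: 0x{status & 0xffffffff:08x}")
    return dst.raw[:out_len.value]
```

Time calls to `lznt1_compress` over your test corpus and compare `len(result)` against your own algorithm's output for a like-for-like speed/ratio comparison.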