Determining the optimal cluster size for your volumes

If you’re like most people, you have a separate partition where you store your data (MP3s, torrents, software, movies, etc.).  Most people set these partitions up with the default settings (NTFS, 4 KB cluster size).  However, in most cases a 4 KB cluster size is horribly inefficient for a data partition like this.  Chances are most of the files on it are at least 1 MB, and the small cluster size forces your drive to perform extra seeks to read and write the data.  With the rise of the GUID Partition Table (GPT) and drives beginning to ship with sector sizes larger than the traditional 512 bytes, you should get ahead of the curve.

First, you need to determine the optimal cluster (block) size for your data.  The best way to figure this out is to take the median file size on the partition.  Once you know the best cluster size, you will need to copy the data to another partition or drive.  Finally, repartition with the new cluster size and copy your data back.
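A quick way to get that median without the spreadsheet round trip is a short standard-library Python script (a sketch of the idea; pass it whatever mount point or folder you want to survey):

```python
import statistics
from pathlib import Path

def median_file_size(root):
    """Return the median size in bytes of all files under root (0 if empty)."""
    sizes = [p.stat().st_size for p in Path(root).rglob("*") if p.is_file()]
    return statistics.median(sizes) if sizes else 0
```

For example, `median_file_size("D:\\")` on Windows or `median_file_size("/mnt/data")` on Linux.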

Here are the steps:

  1. Run one of the scripts below to dump every file size on the volume (listFilesSize – VBScript) (listFileSizes_python/linux — first attempt at Python!)
  2. Calculate the median file size — open the output in Excel/Calc and use the median() function.  Does anyone have a quicker way to determine median file size?  Keyboard kung fu?
  3. Move data off the volume
  4. Reformat volume using optimal cluster size
  5. Verify new cluster size: [root@localhost]# ntfsinfo -m -d /dev/sdaX
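For step 4, mapping the median to a cluster size is a judgment call.  A rough rule of thumb (my own heuristic, not an official guideline): pick the largest common NTFS cluster size that doesn’t exceed the median file size.

```python
def suggested_cluster_kb(median_bytes):
    """Largest common NTFS cluster size (in KB) not exceeding the median file size."""
    for kb in (64, 32, 16, 8, 4):  # common NTFS cluster sizes
        if median_bytes >= kb * 1024:
            return kb
    return 4  # fall back to the NTFS default

print(suggested_cluster_kb(1024 * 1024))  # a 1 MB median suggests 64 KB clusters
```

So a data partition whose median file is 1 MB or larger would get 64 KB clusters, while one full of small documents stays at the 4 KB default.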
