This is ripped from the sonikmatter forum:
Let's do a quick review:
What does de-fragging do?
De-fragging is meant to move files so they occupy consecutive sectors on the physical platter of a hard drive. This has in the past optimized performance since an entire file could be read with minimal read/write arm movement.
De-fragging utilities did this by reading each sector of a file currently on the drive, and instructing the low-level driver to write that sector to a specific new sector, right next to the last sector of data written.
After hours of work, all your files were snugged up next to each other as close to the hub of the drive as possible. Arm movement was reduced to the minimum possible.
There were real advantages. On a 5.25-inch SCSI or ATA drive, with an OS or a bus that was slower than the hardware, and in situations where a single reader was the only accessor of the drive, you could gain something from de-fragging.
What About Now?
First and foremost (and this has always been true), hard drives try to write data in the most efficient manner anyway, all by themselves. Drives try to "de-frag" naturally. De-fragging (even in its most effective days) really only made a difference when your drive was over 75% full. Files didn't start getting fragmented until space started running out.
Why de-fragging is reaching ineffective stages:
1) Contiguous sectors are often not the most efficient way to read a file on modern hard drives! For those of you old enough: do you remember the "speeded-up" versions of the Apple II disk operating system, like ProntoDOS and others? They did what many modern hard drives do now.
For many hard drive mechanisms, the most effective read pattern is not to read each successive sector, but to read one sector, process it into cache while skipping two sectors, then read the third. So a continuous write operation (a file-system-level write) will result in a sector pattern like this (F1 is this file, fx is other files):
F1 fx fx fx F1 fx fx fx F1.
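To see why the spread-out pattern can win, here's a toy model (all numbers invented, no real drive measured): assume the controller needs two "sector times" of processing after each read before it can start the next one. With sectors packed back to back, the head overshoots the next sector every time and burns a full revolution; with the skip-two layout, it arrives just as the controller becomes free.

```python
# Toy model: the head passes one physical slot per time unit; after reading
# a sector the controller is busy for `process` slot times. If the next
# logical sector is closer than that, the head misses it and has to wait a
# whole extra revolution. (Invented numbers, not real firmware behavior.)

def slot_times_to_read_track(slots: int, gap: int, process: int) -> int:
    """Time, in slot times, to read every logical sector on one track when
    consecutive logical sectors sit `gap` physical slots apart."""
    total = 0
    for _ in range(slots):
        wait = (gap - 1) % slots   # slots swept before reaching the next sector
        while wait < process:      # controller still busy when we arrive:
            wait += slots          # wait one more full revolution
        total += wait + 1          # +1 slot time to read the sector itself
    return total

# A 17-sector track, controller needing 2 slot times between reads:
print(slot_times_to_read_track(17, gap=1, process=2))  # contiguous: 306 (~18 revolutions)
print(slot_times_to_read_track(17, gap=3, process=2))  # skip two:    51  (~3 revolutions)
```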
"De-fragging" that and putting each sector next to the other will actually reduce the efficiency of the system instead of improving it.
All hard drive manufacturers know that de-fragging software is out there, and they know that most de-fragging software does not contain a catalog of optimum sector spreads for every possible hard drive platter/cache combination in the world.
So, many hard drive mechanisms simply ignore de-fragging software. There you are, watching your hard drive de-frag for 4 hours. Inside the box, the de-fragging software is telling the hard drive, "OK, take this sector and put it on cylinder 3, sector $1A." The hard drive itself is replying, "Oh yeah, I'm doin' it, boss, I'm puttin' that data right where you asked," when in reality it is completely ignoring the low-level sector access calls, because they just don't make sense for this drive.
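Here is that conversation as a cartoon in code. This is purely my own sketch of the idea, not any vendor's actual firmware: the drive treats the defragger's carefully chosen address as nothing more than a lookup key, and files the data wherever its own layout policy prefers.

```python
# A cartoon of the drive's translation layer (my own sketch, no vendor's
# actual firmware): the host's sector address is honored as a lookup key,
# but the firmware picks the physical placement itself.

class Drive:
    def __init__(self) -> None:
        self.remap: dict[int, int] = {}      # host-visible address -> physical slot
        self.platter: dict[int, bytes] = {}
        self.next_free = 0                   # stand-in for the layout policy

    def write(self, address: int, data: bytes) -> None:
        # "I'm puttin' that data right where you asked"... sort of.
        slot = self.remap.get(address)
        if slot is None:
            slot = self.next_free            # firmware chooses the real location
            self.next_free += 1
            self.remap[address] = slot
        self.platter[slot] = data

    def read(self, address: int) -> bytes:
        # Reads go through the same map, so the host never notices.
        return self.platter[self.remap[address]]
```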
Think about this part of the hard drive mechanism, also. When you bought your Audio drive, you probably spent a few extra bucks to get a 2 or 8 MByte cache on the hard drive controller board.
Well guess what! If it's working well (and most do) then most of the data reads your OS is making of the drive are being satisfied from the cache, NOT from the hard drive platter itself. The drive is reading ahead and gathering data in the most effective way, and the cache is getting the hits. De-fragging obviously has no effect on a RAM cache.
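A minimal sketch of why that matters (the window size is invented): one mechanical access fills a whole window of the cache, and the following reads are satisfied from RAM no matter how the sectors are arranged on the platter.

```python
# Minimal read-ahead sketch (window size invented): one mechanical access
# fills a whole window of the on-drive cache, so the next several reads
# never touch the platter, whatever the sector layout looks like.

class ReadAheadDrive:
    def __init__(self, window: int = 8) -> None:
        self.window = window
        self.cache: dict[int, bytes] = {}
        self.platter_accesses = 0

    def read(self, sector: int) -> bytes:
        if sector not in self.cache:
            self.platter_accesses += 1               # one mechanical access...
            for s in range(sector, sector + self.window):
                self.cache[s] = b"\x00" * 512        # ...caches the whole window
        return self.cache[sector]

drive = ReadAheadDrive()
for sector in range(64):       # the OS reads 64 sectors in sequence...
    drive.read(sector)
print(drive.platter_accesses)  # ...and only 8 of them touched the platter
```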
There are literally hundreds of platter, arm, and cache designs on the market today, and each will have optimum data-transfer patterns that are slightly different. What is a great sector pattern for a 7200 RPM Western Digital drive with an 8 MByte cache is a bad sector pattern for a 5400 RPM IBM Travelstar with no cache. So defrag software will probably get it wrong.
OK, let's step up to the OS level now.
Guess what? The OS is caching stuff too, and those caching algorithms are a hell of a lot better than they ever were, and they get better all the time. When you think you're reading from the hard drive itself, you are often reading out of main memory, where de-fragging has no impact.
Let's look at one more aspect of de-fragging in 2003 which, even if all of the above were not true, makes it somewhere between less relevant and unimportant:
De-fragging was based on an assumption that there was one accessor of a hard drive at a time, and that accessor would open and read from one (or two) files continuously for a long (in computer terms) time.
Those two assumptions are no longer valid.
First, Mac OS X and many versions of Windows are preemptively multitasked.
That means that no application can grab and hold system resources. The OS gives those resources periodically to every running application. So it is very likely that while you are in the middle of reading from a "de-fragged" 500 MByte file, some other application is going to get a little time, 30 or 40 times over, and might want to read its own files. So the arm moves away from your file, then back. De-fragging becomes meaningless.
Second, and gosh darn it, how many files do YOU have open when you're doing audio work?
In many cases with modern DAWs it can be hundreds: your EXS files, the audio files that you recorded, VST and AU settings files, and so on.
Does it make any sense to de-frag that? Nope.
Again, de-fragging is predicated on extended read access to a single file (or maybe two). That scenario is outmoded.
Let's be very simplistic: you have a "normal" DAW-based song in Logic. That probably means you have 16-32 tracks of audio.
That means you have, at the absolute bare minimum, 1 file open and being read for each audio track, right? So that means at a minimum 16 large files being read constantly to play the music, right?
So no matter what you do, the read head of the drive is going to be hopping around, and you're going to blow the 2 MByte cache on the card. Obviously 16 files cannot be de-fragged together in a way that makes sense, since the de-fragging software cannot know the usage patterns of those 16 files, and cannot know that, for optimum use, interleaving the sectors would be best. De-fragging software only knows how to de-frag one file at a time, in isolation, with no knowledge of the intended use of the file.
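A back-of-envelope simulation makes the point; every number in it is invented. Stream 16 perfectly contiguous files round-robin, one block per track per cycle, and almost no read lands on the sector right after the previous one:

```python
# Back-of-envelope sketch (file count, sizes and spacing all invented):
# stream 16 perfectly contiguous files the way a DAW does, one block from
# each file per playback cycle, and count the non-sequential reads.

TRACKS = 16
BLOCKS = 100          # blocks read from each file
FILE_SPAN = 10_000    # sectors between the starts of the (contiguous) files

def seek_count(playback: bool) -> int:
    """Count reads that are NOT to the sector right after the previous one."""
    if playback:      # DAW-style: one block from each file, round-robin
        reads = [f * FILE_SPAN + b for b in range(BLOCKS) for f in range(TRACKS)]
    else:             # the case de-fragging optimizes: one file, start to end
        reads = list(range(BLOCKS))
    return sum(1 for a, b in zip(reads, reads[1:]) if b != a + 1)

print(seek_count(playback=False))  # 0 seeks: contiguity pays off
print(seek_count(playback=True))   # 1599 seeks: contiguity bought nothing
```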
Which brings us to another thing: the smarts of the application programmers. The guys who program Logic (or Cubase, DP, etc.) DO know how they are going to use the files. So they write very, very, very smart code (research the patents or read the literature) to access the files they need in the most effective way. Since they know what they need from a sample, or from 16 audio files, they can work with the file system themselves, intelligently, to get the data in and ready when it's needed. And they can work with the operating system, hinting about what they plan to do next so the OS can help.
And one final thing: remember Virtual Memory? That evil thing you were all warned to turn off because it killed performance and blew away defragging and disk caches?
Hear about it much anymore? No? Did it go away?
No, it did not; in fact, it's just there. In Mac OS X and WinXP, virtual memory is on and active 100% of the time, and it is not user-controllable.
Why?
Because it works now. Because it's effective and efficient (again, like anything, you can bollox it up by having 1,000 applications open at once).
But keep this in mind: that's another thing accessing drives, completely outside the de-fragger's control.
De-fragging made sense at one point in our computing history, and there are still some situations where it might today. But those are rare, and getting rarer.
Great defrag tutorial
One more point about this... I haven't defragged a drive since I used a 4 GB UW-SCSI disk on a P2 266 MHz.
I can also say that I've never had a failed hard drive in all that time (touch wood), while lots of people around me who defragged religiously kept getting HD-deaths.
I'm no scientist, but I think it's fair to say that those 4-hour defrag sessions are a considerable stress on the disk, on top of which if the power gets cut during that time you're pretty much fucked.
I've found that it's far more effective to reformat and reinstall if it's the OS partition in question; otherwise, just back up or copy the data somewhere else, reformat the partition, and copy the data back.
just a thought.
peace
On 2003-03-14 19:21, jupiter8 wrote:
<<cut>> Instead of being effective at reading each successive sector, the most effective read pattern for many hard drive mechanisms is read one sector, process that into cache while skipping 2 sectors, read the 3rd. So a continuous write operation (file system level write) will result in a sector pattern of (F1 is this file, fx is other files)
F1 fx fx fx F1 fx fx fx F1.

This effect is known as (and solved by) interleaving: low-level formatting where the logical sectors are not placed in physical sequence on the platter, but interleaved. This was an issue with old drives; modern drives all work with a 1:1 interleave: the logical sectors are placed (and read/written) in sequence from/to the platter.
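For the curious, the classic interleave arithmetic is simple. Here's a small sketch (the generic textbook formula, not any particular drive's low-level format):

```python
# Classic interleave math (generic formula, illustrative only): with
# interleave factor k on a track of S sectors, where gcd(k, S) == 1 so
# every slot gets used, logical sector i lands at physical slot
# (i * k) % S. Modern drives use k == 1: logical order equals physical.

from math import gcd

def track_layout(sectors: int, factor: int) -> list[int]:
    """Return the track as a list: slot p holds the logical sector stored there."""
    assert gcd(factor, sectors) == 1, "factor and track size must be coprime"
    track = [0] * sectors
    for logical in range(sectors):
        track[(logical * factor) % sectors] = logical
    return track

print(track_layout(17, 1))  # 1:1 -> [0, 1, 2, ..., 16]
print(track_layout(17, 3))  # 3:1 -> [0, 6, 12, 1, 7, 13, 2, ...]
```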
So I feel the article mixes old-time issues and modern technologies. I'm absolutely convinced that defragging helps me to
- reorganize files on disk: group files that are read often, read occasionally, or written recently.
- remove gaps on the data disk, so a big multi-channel audio file gets written contiguously.
As for mains failure: with modern filesystems (journaled or logged FSes), corruption is rare. I once had a power failure while defragging my Win2K NTFS filesystem, but no corruption occurred. This is no guarantee, but things are not as bad as they seem. And IF corruption occurs, we just restore our backups.
