A data access on a hard drive is the slowest operation a computer makes (apart, obviously, from trying the same access on an even slower drive such as a floppy or CD). A number of factors contribute to that:
1) the amount of external fragmentation -> run defrag to fix this.
2) the amount of internal fragmentation, which in practice is minimal.
3) the file system -> if you are using Win XP it will almost certainly be NTFS. You could change it, but for a 60 GB drive NTFS is the better choice: more secure and faster than FAT32.
4) the cache system and the memory you have -> this is a bit complicated if you don't know at least a bit of computer architecture. There are two levels of caching that matter here: your RAM and the processor's own cache (a sizeable chunk of the processor die is dedicated to it). Caches are small amounts of high-speed memory; they let manufacturers pair a big, slow, cheap storage device with a small, fast, expensive memory and give the illusion of one big, fast memory. These caches keep copies of the most recently accessed data. That is where the system looks first, and on a hit the access is orders of magnitude faster than retrieving the same data from disk.
What I'm trying to get at is that how full your hard drive is doesn't matter much for speed. Windows keeps recently used file data cached in RAM, and pages memory to and from disk when RAM runs out (the "index this drive" checkbox, by the way, is about search indexing, not speed). So what really matters is the amount of RAM you have, not how empty the disk is. Leaving 35 GB of space unused on your hard drive is wasting money, to my eyes.
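The "small fast cache in front of big slow storage" idea above can be sketched in a few lines of Python. This is a toy least-recently-used cache, not how Windows or a CPU actually implements caching; the names (SLOW_STORE, Cache) are made up for illustration:

```python
from collections import OrderedDict

# Pretend slow storage (the "hard drive"): big, cheap, slow to access.
SLOW_STORE = {"report.doc": "report contents", "notes.txt": "notes contents"}

class Cache:
    """Toy LRU cache: small, fast, keeps only recently used entries."""
    def __init__(self, capacity):
        self.capacity = capacity      # small, because fast memory is expensive
        self.entries = OrderedDict()  # most recently used kept at the end
        self.hits = 0
        self.misses = 0

    def read(self, name):
        if name in self.entries:      # cache hit: the fast path
            self.hits += 1
            self.entries.move_to_end(name)
            return self.entries[name]
        self.misses += 1              # cache miss: go to the slow store
        data = SLOW_STORE[name]
        self.entries[name] = data
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used
        return data

cache = Cache(capacity=1)
cache.read("report.doc")   # miss: fetched from the slow store
cache.read("report.doc")   # hit: served from the fast cache
cache.read("notes.txt")    # miss: evicts report.doc to make room
print(cache.hits, cache.misses)  # 1 hit, 2 misses
```

The point of the sketch: repeated access to the same data is cheap because it is served from the fast tier, and a bigger cache (more RAM) means more of your working set gets the fast path.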
Hope this is useful.