Originally Posted by JETninja
Explain to me how you all feel OCing your video cards will harm them?
One word: Electromigration.
Two more words: Electromagnetic Interference (EMI).
By mitigating the heat of your overclocked component, you've avoided immediate "burnout": failing the smoke test. But any time you put two dissimilar, non-inert materials together, you will eventually see corrosion, migration, or dissolution. And electronic components carry their own baggage in the form of electromigration and EMI.
Modern computer chips are typically made from, at minimum, silicon, copper, and plastic. In some cases, gold, steel, and other materials are used as well. You might think there are no "moving parts" in a chip, but you're wrong: electrons move, and they carry charge with them. Over time, two things happen:
1. The transistors develop their own "charge", similar to the way a needle rubbed across a magnet becomes magnetized. Usually this charge is very small and takes a long time to develop, but eventually it can affect not only that transistor, but the transistors around it. This is a form of EMI, and it can permanently disrupt a chip's operation. Turning your computer off for long periods between overclocked uses can mitigate this very well.
2. Over time, connections run too hot and at too high a frequency will "migrate" into each other. Literally, they can fuse and become useless. Keeping your chip well within engineering tolerances can forestall this. Unfortunately, surface heat measurements aren't useful gauges of what's happening to each individual transistor; frequently, transistors are running much, much hotter than a surface temperature gauge can detect. We're playing with lightning inside these things. (A rough model of how heat and current drive this is sketched just below.)
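For the curious: the standard empirical model for electromigration lifetime is Black's equation, MTTF = A * J^-n * exp(Ea / kT). Hotter metal and higher current density both shorten the mean time to failure. Here's a minimal Python sketch of the lifetime ratio between overclocked and stock operation; the prefactor A cancels in the ratio, and the parameter values (n ≈ 2, Ea ≈ 0.9 eV for copper interconnect) are typical textbook figures, not numbers for any specific chip:

```python
import math

K_EV = 8.617e-5  # Boltzmann constant, eV/K

def em_lifetime_ratio(j_ratio, t_stock_c, t_oc_c, n=2.0, ea_ev=0.9):
    """Relative electromigration lifetime, overclocked vs. stock,
    per Black's equation: MTTF = A * J**-n * exp(Ea / (k*T)).
    The prefactor A cancels when taking the ratio.

    j_ratio   -- current-density increase factor (e.g. 1.15 = +15%)
    t_stock_c -- stock junction temperature, deg C
    t_oc_c    -- overclocked junction temperature, deg C
    n, ea_ev  -- assumed textbook values for copper interconnect
    """
    t_stock = t_stock_c + 273.15  # convert to kelvin
    t_oc = t_oc_c + 273.15
    current_term = j_ratio ** -n  # more current density, shorter life
    thermal_term = math.exp((ea_ev / K_EV) * (1.0 / t_oc - 1.0 / t_stock))
    return current_term * thermal_term
```

Under these assumed values, the exponential temperature term dominates: shaving 10 degrees off the junction temperature buys roughly double the lifetime, while cutting current density 10% buys only about 20-25% more.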
I say this with the caveat that electromigration takes years to do its job, and generally only becomes a problem when circuits are run at a higher voltage or frequency than they were designed for. Chip designers understand electromigration and EMI, and how to mitigate them. That said, running a chip well outside the spec it was designed for, even if it "works fine" for long stretches, is almost certainly damaging it. Slowly.
However, it's normally the difference between a useful lifetime of 20-30 years and one of 2-5 years. (I've worked at a silicon design house. Yes, they design these things to last forever, or at least 30 years, even knowing they won't be used nearly that long.) And when the chip starts malfunctioning, reducing the clock speed or voltage can give it a new lease on life for a while longer by bringing it back within engineering tolerances.
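To put rough numbers on that 20-30 year versus 2-5 year claim, here's the sketch from above fed with hypothetical overclock figures — a 70 C stock junction pushed to 90 C with 15% more current density. These are illustrative assumptions, not measurements from any real part:

```python
# Hypothetical example: +15% current density, junction temp 70 C -> 90 C.
ratio = em_lifetime_ratio(j_ratio=1.15, t_stock_c=70.0, t_oc_c=90.0)
print(f"Remaining lifetime fraction: {ratio:.2f}")             # ~0.14
print(f"A 25-year part becomes a {25 * ratio:.1f}-year part")  # ~3.5 years
```

A modest-sounding bump eats about 85% of the design margin, which is exactly the 25-year-to-3-year collapse described above.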
Given PC life-cycles, you can put a big "OK, I know about electromigration, but I don't care" sticker on this post and ignore it.
However, you can't assume you aren't damaging your chip by overclocking it. You almost certainly are. The question is what that damage is worth to you. Given that PC chip prices continually fall, there's a very good chance the trade is worth it for the extra performance; three years from now, that same chip won't be worth enough cash for the damage to have mattered in the first place.