Celerons are not bad chips. Most people who are really down on them are either thinking of the original Celeron, or are just repeating what they heard from someone who was thinking of the original Celeron, or repeating what they heard from someone who heard from someone... and so on, ad nauseam.
The original Celeron was released to compete with the low-price offerings of AMD and Cyrix. To reduce the cost of the chip, Intel completely removed the L2 cache. The reason is that CPU cache is static RAM, not the normal dynamic RAM used for system memory. Static RAM is FAST, but it is also EXPENSIVE to manufacture. The chip was dirt cheap, but with no L2 cache, its performance was perfectly abysmal. You'd have been better off with a 486SX/25 than the original Celeron. That chip was not well-received, obviously, and they didn't sell very many.
The next Celeron chip had 128KB of integrated L2 cache. This is an important distinction. The standard P2 chip had 512KB of L2 cache, but it wasn't integrated; it sat on the daughtercard alongside the slotted CPU. Because it wasn't on the die, there was a bit of latency, and it ran at only half the speed of the core. So you could pick up a Celeron 300A, with 128KB of L2 cache running at the full 300MHz and virtually no latency, for a little over $100, or you could go for the big mamma P2-450 at $800 retail (smart/lucky shoppers could maybe get it for $400-$600), with 512KB of non-integrated L2 cache running at 225MHz, latency included.
Because Intel has traditionally been hung up on the "more is better" philosophy, they didn't see a problem with this at first. Of course, all the independent review sites in the world set to testing, and the results were a godsend to gamers. If you're running an application that performs a wide variety of operations on a large amount of data and disparate data types, then more cache really is better. Game programmers, however, who have long been used to having more software ambition than hardware to back it up, optimize their code to a nearly frightening degree. Games do perform some fairly intensive routines, but those routines tend to be repetitive in terms of operation type, and they tend to operate on the bare minimum amount of data necessary to get the job done. What that meant at the time was this: while 0KB of L2 cache is simply unacceptable, 128KB is sufficient. The real boost was that the Celeron's cache ran at the same speed as the CPU core. So your game didn't need more than 128KB of L2, and it could access that cached data twice as fast as the full-blown P2 could. People using large Excel spreadsheets or PC-based database applications had problems with those Celerons, but gamers were yanking them off the shelves left and right.
There was an added benefit to this. Because the cache was integrated and full-speed, the Celeron could take more abuse than the full P2s could. It was far easier to successfully overclock a Celeron than a P2. It was not at all uncommon to hear of people taking a 300A all the way to 400MHz. I'd heard of people getting quite close to the theoretical 500MHz limit of the P2 core the Celerons were based on, but that was with serious cooling rigs.
Incidentally, it was soon after the 300A that Intel started clock-locking their chips. They said it was because unscrupulous resellers were overclocking their chips, scrubbing off the markings, and selling them at the price of the higher-end chip (sort of like nVidia does with its OC line, except that nVidia tells you what it's doing). There certainly were people doing that, but I still think Intel did it because they were ticked that their $100-ish processors were outperforming their $600+ processors. All you folks who pin-mod, you're the descendants of the people who found out that if you used a plain old graphite pencil to draw a line between two specific contacts on the Celeron's edge connector, you could disable the clock-locking. Of course, we were able to use an eraser to put our warranty back in effect, but you can't have everything.
Today, the current Celeron line has a few other features disabled to ensure that the main Pentium lines rule over the Celeron lines, but the primary difference is still a smaller cache, because that is still where a great deal of the manufacturing expense of a CPU lies. There have been tons of studies done along the lines of "how much cache is necessary," so there's a wealth of information available if you're inclined to look for it. Today's games certainly utilize more cache than the games of the P2 era did, but there is still a definite point beyond which more cache is a total waste of money. In general, 0KB to 128KB is a massive difference, 128KB to 256KB is a large difference, 256KB to 512KB is a noticeable difference, and anything after that is just expensive. The exception is database-server-type applications. That is why the Xeon line, whose main difference from the Pentium line is a much larger L2 cache (up to 2MB), is usually only found in server hardware, where applications of that type are likely to live.
If a user today doesn't intend to do hardcore gaming, serious graphics manipulation, or data-intensive operations, there is absolutely nothing wrong with a Celeron. For people wanting to browse the web, check e-mail, and type letters in Word, the Celeron is a perfect chip.