Yes, 3DMark05 is not the bottom line on your computer's worth, unless you're a benchaholic, in which case I'd ask why you bought a laptop in the first place.
As far as FPS goes, Google it; there's a lot written on it. In my own experience, I did a project on this in tech class in 8th grade. We each sat in a chamber and watched a screen that would flicker, and we'd say whether or not we detected a change. I was able to detect up to 74 FPS, and some kid in my class got 82 FPS (dunno if he was telling the truth though, it was 8th grade).

Anyway, you can test this yourself, but it's easiest on a CRT capable of a 100 Hz+ refresh rate. Most movies run at 24 FPS, but TV is around 30 FPS, as stated. If I flip back and forth between movies and TV, I can easily see the frame rate difference, but I think it depends on how sensitive your eyes are.

For a more comprehensive test, open up HL2 on a high-refresh CRT and put fps_max 15 in the console to limit the max FPS to 15. Then try 20, then 30, then 40, then 50, and so on all the way up to 100. You'll notice a big difference in how the game plays and how your eyes perceive it, up to about 70 FPS anyway. That's how you find out how high a frame rate your eyes can detect. It's very subjective, and everyone is different, but that's the best way I can think of explaining it.

For the record, I enjoy playing my games at 30 FPS because it gives them a movie-like quality. I think when the FPS goes to 60, games start looking gamey and cartoony.
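One reason the low steps in that fps_max test feel so much bigger than the high ones is simple arithmetic: frame time is 1000/FPS milliseconds, so each step up saves less and less time per frame. A quick back-of-the-envelope in Python (my own addition, not part of the original test):

```python
# Frame time in milliseconds for each fps_max cap from the test.
# Note how 15 -> 30 FPS cuts over 33 ms per frame, while 60 -> 100
# only cuts about 6.7 ms, which is why the low steps feel huge.
caps = [15, 20, 30, 40, 50, 60, 70, 100]
for fps in caps:
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {frame_ms:6.2f} ms per frame")
```

So the jump from 15 to 30 FPS is perceptually enormous, while 70 to 100 is a much smaller change per frame, which lines up with most people stopping to notice differences somewhere around 70.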
Hope that helped,