The general rule, as with the benchmarks up there: a single card is enough for 1920x1200 (generally a current-gen single card for current games). You get two cards to enable higher settings, not more FPS in general.
The problem is that at 4xAA, some generally older and VERY well optimised games aren't GPU limited.
Keep in mind the graphs are horribly out of scale: the axis doesn't start at 0, which massively exaggerates the differences.
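If you want to see how badly a truncated axis distorts things, here's a quick matplotlib sketch (the clock speeds and FPS numbers are made up purely for illustration):

```python
# Toy illustration of axis truncation: the same two (made-up) FPS results,
# plotted honestly from 0 and plotted from a truncated baseline.
import matplotlib.pyplot as plt

labels = ["3.1GHz", "4.1GHz"]  # hypothetical CPU clocks
fps = [92, 100]                # hypothetical benchmark results

fig, (ax_honest, ax_truncated) = plt.subplots(1, 2, figsize=(8, 4))

ax_honest.bar(labels, fps)
ax_honest.set_ylim(0, 110)
ax_honest.set_title("Axis from 0: ~9% apart")

ax_truncated.bar(labels, fps)
ax_truncated.set_ylim(90, 101)
ax_truncated.set_title("Truncated axis: looks huge")

for ax in (ax_honest, ax_truncated):
    ax.set_ylabel("FPS")

plt.tight_layout()
plt.show()
```

Same numbers both times; only the baseline changes.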
Go up to 8xAA or a higher res (tri-SLI/quadfire setups are essentially a complete waste at 1920x1200), or move up to a triple-screen setup or a high-res 30" screen, and the extra resolution brings you back to being GPU limited.
Lost Planet (back then) and Stalker are probably the two toughest games (Far Cry 2 looks great, but it runs superbly for how it looks). Lost Planet is still completely GPU limited, and Stalker shows an 8fps total difference from 3.1 to 4.1GHz. And what CPU can't do 3.1GHz these days? Not a one out there; in fact, I don't think there's a CPU you can buy that can't do 3.1GHz on stock volts with little to no heat increase.
Change the detail settings to absolute max and there wouldn't be any CPU scaling at all in most of those games.
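To make the GPU-limited vs CPU-limited point concrete, here's a toy model I put together (my own simplification, all frame times made up): your FPS is set by whichever of the CPU or the GPU takes longer per frame, so once the GPU term dominates, the CPU clock drops out entirely.

```python
# Toy bottleneck model: FPS is capped by whichever of the CPU or GPU
# takes longer per frame. All numbers are made up for illustration.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Effective frame rate when CPU and GPU work mostly in parallel."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical: CPU frame time scales inversely with clock speed.
cpu_ms_at_31 = 10.0              # at 3.1GHz
cpu_ms_at_41 = 10.0 * 3.1 / 4.1  # ~7.6ms at 4.1GHz

for setting, gpu_ms in [("1920x1200 4xAA", 9.0), ('30" screen 8xAA', 20.0)]:
    gain = fps(cpu_ms_at_41, gpu_ms) - fps(cpu_ms_at_31, gpu_ms)
    print(f"{setting}: +{gain:.1f} fps from the 1GHz overclock")
```

At the lighter setting the overclock buys you a few FPS; at the heavier one the GPU is the wall and the gain is exactly zero, which is what the maxed-detail benchmarks show.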
To sum up: you won't notice a lick of difference between 3.4 and 3.8GHz. However, 65C isn't a dangerous temp on i7s; if your P2 was running at 65C you'd want to downclock/lower volts straight away.
Don't get bogged down in the actual number; compare it to the chip's temperature tolerance. i7s are designed to run HOT, very hot in fact; P2s are not at all, but they also don't run hot.
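If it helps, here's the "compare against the chip's tolerance" idea as a rough rule-of-thumb sketch. The limits are ballpark figures from memory, not spec quotes, so check the datasheet for your exact model and stepping:

```python
# Compare a measured temp against the chip's rated limit, not against
# other chips. The limits below are rough ballpark figures, not spec
# quotes; always check the datasheet for your exact model/stepping.
ROUGH_TEMP_LIMIT_C = {
    "Core i7 (Nehalem)": 100,  # TjMax around 100C; designed to run hot
    "Phenom II": 62,           # much lower rated max temp
}

def headroom(chip: str, measured_c: float) -> float:
    """Degrees C left before the (approximate) rated limit."""
    return ROUGH_TEMP_LIMIT_C[chip] - measured_c

for chip in ROUGH_TEMP_LIMIT_C:
    room = headroom(chip, 65)
    verdict = "fine" if room > 20 else "back off the clocks/volts"
    print(f"65C on a {chip}: {room:+.0f}C headroom -> {verdict}")
```

Same 65C reading, roughly 35C of headroom on the i7 and already past the limit on the P2, which is exactly why the raw number on its own tells you nothing.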