
Will i notice a difference between 3.8GHz and 3.4GHz?

Discussion in 'CPUs' started by Thrill, 8 Jun 2010.

  1. Thrill

    Associate

    Joined: 6 Apr 2009

    Posts: 37

    Title says it all. I'm a gamer, and I run an i7 930 with a Megahalems Rev B cooler. I'm a bit worried about my temperatures hitting 65°C while gaming, so I'm going to underclock from 3.8GHz to 3.4GHz and then lower my voltage.

    I was wondering, will I see a noticeable performance difference? It seems a big leap, but I'm not sure to be honest. I originally had my i7 930 at 4GHz; it came clocked at 4GHz from OCUK.

    Thanks!
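
For a rough sense of scale (an editor's sketch using the figures from the post above, not a claim from the thread): the raw clock difference between 3.8GHz and 3.4GHz is only about 10.5%, and real-world fps changes in GPU-limited games are usually smaller still.

```python
# Raw clock-speed delta between the two settings discussed above.
high_ghz = 3.8
low_ghz = 3.4

drop_pct = (high_ghz - low_ghz) / high_ghz * 100
print(f"Clock reduction: {drop_pct:.1f}%")  # prints "Clock reduction: 10.5%"
```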
     
  2. smilertoo

    Mobster

    Joined: 4 Aug 2005

    Posts: 3,077

    Not much difference, but 65°C max under load doesn't seem that hot.
     
  3. Penfold101

    Gangster

    Joined: 19 May 2010

    Posts: 141

    Thermal cut-off for CPUs is normally around the 100°C mark, I think...
     
  4. Big.Wayne

    Capodecina

    Joined: 11 Sep 2003

    Posts: 14,704

    Location: London

    Hello Thrill,

    I would imagine this is the sort of test you can carry out yourself and observe first-hand what the difference is? . . .

    According to this article, it would seem a gamer running a single GPU may as well run their Intel® Core™ i7 at stock?

     
  5. Thrill

    Associate

    Joined: 6 Apr 2009

    Posts: 37

    Hi, well I've underclocked (is that the right word?) from 3.8GHz to 3.6GHz and lowered the voltage. Temperatures are slightly lower, but I'm going to keep tweaking the voltage and find the lowest stable setting :)
     
  6. Big.Wayne

    Capodecina

    Joined: 11 Sep 2003

    Posts: 14,704

    Location: London

    OK, so now run some games and come back and tell us if you can notice the difference! :) . . . Underclocking is technically running the chip below its rated speed . . . in your case that would be lower than 2.8GHz :cool:
     
  7. setter

    Caporegime

    Joined: 14 Dec 2005

    Posts: 28,160

    Location: armoy, n. ireland

    Also considering backing mine off to 3.6GHz or so; currently at 4GHz (1.2750 vcore). I can run it at 3.6GHz on 1.18 vcore. Running 2 x GTX 275s in SLI (stock speed), I can't see the chip at 3.6GHz bottlenecking the cards too much. It would certainly help to keep temps down a bit; I'm also hitting 65°C on the hottest core when gaming at 1920x1200.
     
  8. Thrill

    Associate

    Joined: 6 Apr 2009

    Posts: 37

    Right, I've actually reduced the clock to 3.6GHz and lowered the voltage to 1.2000V, and it's stable, and the temperatures are much better. I've also installed the latest beta patch for ArmA 2 and noticed an INCREASE in performance. I have a 5870 graphics card, and I just don't want my CPU to bottleneck it.

    This is the most demanding game I play, but the game is all about the CPU (heavy AI) and the hard drive (due to large texture streaming, hence my SSD ;))

    Setter, I was at 4GHz at 1.24V too, and my idle temperatures fluctuated between about 45 and 53°C. They're around 40°C now, but if CPU usage goes to 10% they hit 50°C. I've reseated the heatsink and everything, and reapplied thermal paste.

    But yeah, my i7 seems to let me run it at 1.2V, which is very low for 3.6GHz (please correct me if I'm wrong in saying that?)

    Thanks guys
     
  9. setter

    Caporegime

    Joined: 14 Dec 2005

    Posts: 28,160

    Location: armoy, n. ireland

    Currently idling at 40-35-40-34; they've jumped 4-5°C in this recent hotter weather. 1.2V is good, but it's always worth trying lower; I eventually got mine down to 1.18V.
     
  10. Thrill

    Associate

    Joined: 6 Apr 2009

    Posts: 37

    Not bad, I'll try to get the voltage a bit lower. Don't want to go too low though; I've been getting random BSODs, not caused by load, not sure why.
     
  11. Swifty123

    Hitman

    Joined: 13 Feb 2010

    Posts: 888

    So it highly depends on the game; it makes a massive difference in Warhammer :O
     
  12. Exsurgo

    Hitman

    Joined: 2 Oct 2009

    Posts: 823

    Location: Belfast, UK

    I must say those results have not been reflected in my own experience. Crysis, Napoleon: Total War, WoW and BFBC2 have all seen very tangible increases in frame rate between stock and 4GHz. Minimum frames in Crysis have gone from around 22 to ~28, even more so in the benchmark loops, and smooth gaming is all about minimum frames.

    BC2 benefits a lot from the extra CPU power; it basically removes those frame stutters when buildings blow up etc. The difference between 3.6 and 3.8GHz, however, is likely to be pretty minimal.
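
Those minimum-frame figures are worth putting in percentage terms (an editor's arithmetic on the numbers quoted above, nothing more):

```python
# Minimum-fps gain quoted above for Crysis, stock vs 4GHz.
min_fps_stock = 22
min_fps_oc = 28

gain_pct = (min_fps_oc - min_fps_stock) / min_fps_stock * 100
print(f"{gain_pct:.0f}% higher minimum fps")  # prints "27% higher minimum fps"
```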
     
  13. Culinia

    Hitman

    Joined: 10 Jan 2010

    Posts: 661

    My CPU does this too, i.e. a 5°C difference between cores... I thought it was just mine :)

    I also thought this CPU scaling was interesting.

     
  14. Raves

    Gangster

    Joined: 2 Jul 2005

    Posts: 343

    Location: Canberra

    d00d 65C is teh sweetspot - OC MOAR!!!!!!!
     
  15. Thrill

    Associate

    Joined: 6 Apr 2009

    Posts: 37

    oh right lol ;)
     
  16. drunkenmaster

    Caporegime

    Joined: 18 Oct 2002

    Posts: 33,194

    The general rule, as the benchmarks up there show, is: a single card (generally a current-gen single card for current games) is enough for 1920x1200; you get two cards for higher settings, not more fps in general.

    The problem is that at 4xAA some generally older and VERY well optimised games aren't GPU limited.

    Keep in mind the graphs are horribly out of scale, starting not at 0, to massively exaggerate the difference.

    Go up to 8xAA or a higher res (tri-SLI/quadfire setups are essentially a complete waste at 1920x1200); move up to triple-screen setups or high-res 30" screens and the extra resolution brings you back to being GPU limited.

    Lost Planet (back then) and Stalker are probably the two toughest games (Far Cry 2 looks great but runs superbly for how it looks). Lost Planet is still completely GPU limited, and Stalker shows an 8fps total difference from 3.1 to 4.1GHz. And what CPU can't do 3.1GHz these days? Not one out there; in fact, I don't think there's a CPU you can buy that can't do 3.1GHz on stock volts with little to no heat increase.

    Change the detail settings to absolute max and there wouldn't be any CPU scaling at all in most of those games.

    To sum up, you won't notice a lick of difference between 3.8 and 3.4GHz. Also, 65°C isn't a dangerous temp on i7s; if your P2 was running at 65°C, you'd want to downclock/lower volts straight away.

    Don't get bogged down by the actual number; compare it relative to the chip's temperature tolerance. i7s are designed to run HOT, very hot in fact; P2s are not, but they also don't run hot.
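
A crude way to put a number on that point (an editor's sketch; the 60fps baseline is hypothetical, only the 8fps/3.1-4.1GHz figures come from the post above): if a CPU overclock only moves fps by a much smaller fraction than the clock increase, the game is GPU limited at those settings.

```python
def fps_scaling_ratio(fps_low, fps_high, clk_low_ghz, clk_high_ghz):
    """Fraction of the relative clock-speed increase that showed up as fps.

    ~1.0 means fully CPU limited; ~0.0 means fully GPU limited.
    """
    clk_gain = (clk_high_ghz - clk_low_ghz) / clk_low_ghz
    fps_gain = (fps_high - fps_low) / fps_low
    return fps_gain / clk_gain

# Stalker-style example: ~8fps difference going from 3.1GHz to 4.1GHz,
# assuming a hypothetical 60fps baseline for illustration.
ratio = fps_scaling_ratio(60, 68, 3.1, 4.1)
print(f"{ratio:.2f}")  # prints "0.41" -> mostly GPU limited
```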
     
  17. Thrill

    Associate

    Joined: 6 Apr 2009

    Posts: 37

    Ah right, I see...
     
  18. Swifty123

    Hitman

    Joined: 13 Feb 2010

    Posts: 888

    I haven't noticed a difference between 3.1GHz and 3.9GHz. I think I need a new GPU; that would be the only way to see a difference.