AMD vs Intel – Our 8-Core CPU Gaming Performance Showdown!

REPORT ANALYSIS AND FINAL THOUGHTS

So, there you have it. After a wide variety of real-world game tests, we can see that AMD's relatively low-priced FX-8370 is able to keep up with, and even trade blows with, Intel's most powerful consumer desktop processor. Granted, this comparison isn't entirely realistic: not a single game on the market supports the 16 threads the Core i7 5960X offers, and you'd be hard-pressed to find a game that can use all eight cores of either processor.

That said, if we assume the 5960X represents at least the gaming performance you can expect from the Core i7 4790K ($329), or even the Core i5 4690K ($230), then it is clear that the cheaper FX-8370 ($199), and by extension the even cheaper FX-6350 ($129), should be able to keep up with those processors as well.

[Image: FX eight-core pricing]

Yes, that's right: if you're playing modern, graphically demanding titles at resolutions of 1080p and above, you can expect roughly the same experience from a modern AMD processor as from an Intel one. There will no doubt be titles that simply play better on Intel chips. But for the vast majority of titles, you really should be spending more on your graphics card than on your CPU.

[Image: AMD DX12 CPU scaling]

It is also worth considering that all of these tests were done using games based on the aging DirectX 11 and OpenGL APIs. With newer, much more efficient APIs like DirectX 12 and OpenGL's successor Vulkan, we can expect multi-core performance to improve significantly, as those APIs are expected to follow in the footsteps of AMD's Mantle and deliver performance gains on up to six cores.
Now, that is not to say you should always buy AMD over Intel. If your budget allows it, an unlocked Core i5 or i7 will definitely outpace anything AMD offers if you plan on pairing it with something like a pair of GTX 980 Tis in SLI. But if you're on a budget of $500-$800 USD, like a lot of you are, and plan to use a GPU like the EVGA GTX 960 SSC we used in this report, or a Radeon R9 380 like the Sapphire R9 380 Nitro we recently reviewed, then you should definitely give the red team a hard look before you count them out.
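The diminishing returns from extra cores described above can be sketched with Amdahl's law. The parallel fractions and core counts below are purely illustrative assumptions for the sake of the example, not measurements from this report:

```python
# Rough Amdahl's-law sketch of why extra cores help less than you might hope.
# The parallel fractions here are illustrative guesses, not measured values.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup over one core when only part of the CPU frame
    workload can be spread across cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A DX11-era title that parallelizes ~40% of its CPU frame time,
# vs. a hypothetical DX12/Vulkan title that parallelizes ~80%.
for label, p in [("DX11-ish (40% parallel)", 0.40),
                 ("DX12-ish (80% parallel)", 0.80)]:
    gains = [round(amdahl_speedup(p, n), 2) for n in (2, 4, 6, 8)]
    print(label, "speedup at 2/4/6/8 cores:", gains)
```

Under these assumptions, the DX11-style workload tops out around 1.5x no matter how many cores you throw at it, while the more parallel API keeps scaling, which is why the serial-heavy games tested here care so much more about the GPU than the CPU.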

426 comments

  1. As a fellow reviewer, I appreciate the many hours of work that went into this. Thanks for opening people's eyes.

    • Thanks a lot Jason. I’m a huge fan of your work so that means a lot coming from you.

    • On the other hand, I, as a former fellow reviewer, call this benchmark unprofessional and misleading. First of all, you are comparing a $1k CPU with a $190 one. Why not compare it with an i5 and i7? Secondly, if you have done any proper benchmarking, you would have noticed that most games that support multi-core CPUs only use quad-core CPUs efficiently in Intel's case; in AMD's case, it's the FX-6300. Shame on you, sir (Donny Stanley), for being lazy and for creating such low-quality clickbait.

      • Well, sorry to disappoint you then.

      • Former fellow reviewer? For what site? Some random no-name forum?

        The title of this article is "AMD vs Intel – Our 8-Core CPU Gaming Performance Showdown!", so what place does an i5 or quad-core i7 have here? It's not rocket science. There are tons of other reviews that cover those cases, but this one is purely about eight-core CPUs, so the differing price ranges of the chips are totally irrelevant.

        • AMD vs Intel? Shame about the title!
          And yes, it is unprofessional.

        • Except the AMD CPU is not really a true octa-core CPU, since each pair of cores shares a fetcher, a decoder, an FPU, and its cache. This is an 8-thread "4-core" CPU vs. a true 8-core, 16-thread CPU. Unless you have more than 3 top-of-the-line GPUs in your system, the 5960X is a waste, and it will be inferior to the 4-core, 8-thread Intel CPUs in gaming due to lower single-core performance and clock speeds. So yes, this review is very misleading.

      • And the review article shows that a $190 CPU can keep up with a $1000 CPU in games. How is it unprofessional to point that out?

      • You must be an Intel fanboy "reviewer" who cannot accept the fact that a cheap, 3-year-old-architecture AMD 8-core CPU can match an expensive, newer Intel 8-core CPU.

        • THAT'S TRUE! They are sons of the evil Intel Godzilla.

        • When you make everything GPU-bound, you could have put in an i3 and seen that the $120 i3 would have matched your $200 AMD chip…

          • Yeah, try running Netflix or having your internet browser up, as well as your streaming programs and media player. An i3 is not gonna do that.

          • My i5 4690K does all that and more. Hyper-threading is a joke.

          • Hyper-threading is good for games that are CPU-bound, like GTA V, Crysis 3, Cities: Skylines, etc. It's also great for multitasking, plus you get a higher out-of-the-box clock speed with an i7 4790K than with an i5 4690K, for example.

        • If you buy an 8-core Intel (5960X) for gaming, you are dumb as fuck; it's not meant to be used for gaming.

          Dumb people.

          • Exactly. The only use for it in gaming is if you are running 3 or more top-of-the-line graphics cards, and I mean top of the line: we're talking Fury X, GTX 980 Ti, or Titan X. Otherwise you will get shit performance in games with this in comparison to an i7 4790K or i7 6700K.

          • Majorpayne19: I purchased the flagship AMD Phenom II 965 with a Sapphire Radeon 5870 in 2010. This combination heats my house during the winter. Too hot. AMD was good for me back then. I am going to get a Skylake i5 6600 or a Haswell i7 4790. AMD needs to do more work to keep hardware cool and quiet.

            • You could afford a Phenom 965 and a 5870 in 2010, but you couldn't add $25 more for an aftermarket cooler?

      • What I find funny is that he used SLI 970s. Come on, indulge us: for such an expensive CPU, you should at least be using 3x 980 Tis or Titan Xes, maybe dual R9 295X2s.

        • OMG, how stupid are you! 3x 980 Ti, or 2x R9 295X2??? First, a SINGLE 295X2 crushes at minimum 980 Tis in SLI! 2x 295X2 (= 4 GPUs) eats 4x 980 Ti!!!! And second, any card can go on any CPU; it doesn't matter if the CPU is weak, but a strong card on a weak CPU, that is the problem, or bottleneck, if you know what that is 🙂 And yeah, the 9xx series is a "great" card 😀 You can't play the new Batman on a 980, even on LOW settings! And AMD's cards run Batman without a problem… The 9xx series is SHIT! The 7xx series is 3 times better… How many noobs are here! Fucking fanboys! Here are the facts, and you still don't believe them because you are blind fucking fanboys!

          • That's the point. We're meant to see the CPU bottleneck. Or are you saying we should just look and see "Yep, 30 FPS with every game, on every CPU ever made with 8 cores"?

          • In all benchmarks my 8370E beats the i5, except for single-core. My chip cost 160 euros, and the i5 4690K cost 240 euros. In all benchmarks, when overclocked, my chip comes in right between the i7 4770K/4790K and the i5 4690K (at the same clock speeds). Attachments were added in a comment reply to this buddy douche… You do realize that AMD has 95 W TDP chips, right? You like to show the 125 W variants.

            Don't get me wrong, the Intel chip wins in single-core, based on IPC (I also own a couple of Intel-based systems), but the AMD-based systems I own are far from horrid. We're talking about performance that comes in right between the i5 and i7 chips, for the price of an i3 (yes, I own an i3, which needs to be upgraded to a Xeon in my server)!

            Your synthetic benches are not really showing how Intel wins.

            As far as DX12 goes, Radeon GPUs gained up to 80%, while the GTX lineup seems to have taken a hit in performance in most tests.

          • Add 2 980 Tis to your 8370E and see how it will work.

          • Are you buying? I'm going more for the R9 390×2; it's a lot cheaper, with similar effects.

          • https://youtu.be/H1pURs5ZJvw

            That's not SLI, but every Intel fanboi will always state that AMD bottlenecks high-end cards. That is 4K on the 980 Ti, with no bottleneck.

            My guess is that you've never tried it. I must say, 4K on PCIe 2.0 is pretty impressive as it is.

            It's all about preference. If you prefer bang for your buck, go AMD… If you prefer to spend double for a 5% increase in FPS, then go Intel.

          • Well, I have two Furys (around 7% slower than the 980 Ti on the latest drivers) on my 8370 (@ 4.95 GHz, mind you) and pull 178 FPS in Tomb Raider 2013 (ultimate settings), 125 FPS at 1440p ultimate settings, and 66 FPS at 4K, once again at ultimate settings.

          • And ewww, I do not want to fuck you, you sick loser.

          • How can a single 295X2 crush a 980 Ti in SLI? -_- Sorry for replying to a very old comment, but it's pretty clear that you don't follow any game-reviewing websites or even YouTube videos where people compare these GPUs (like 290/290X vs 980/980 Ti, etc.). The 295X2 is nothing but a CrossFire of 290s, and it may give higher FPS than a 980 Ti, but gaming performance will be better on that Ti or an R9 Fury X. So yes, 2x 295X2 has no chance of beating a 980 Ti SLI or an R9 Fury X CrossFire setup, because 3- or 4-GPU scaling is pretty bad most of the time.

          • A 295X2 vs 980 Tis in SLI??? What???? Please, read something about the 295X2!!! OMFG! The 290 in CrossFire is the same as the 295X2????? Boy, you are so stupid! RETARD! 290???? Why not 2x 290X in CrossFire??? Even 2x 290X in CrossFire is not as strong as the fucking BEAST, the DOMINATOR OF GPUs, the R9 295X2! RETARD!!!!! And now a new KILLER card is coming, the Fury X2…. Don't look at benchmarks now; the card is not released, and benchmarks that show the performance of this BEAST card are not true… Check benchmarks when the card comes out… AMD is the king of GPUs, no doubt… Nvidia will never be ahead…

          • Easy there, son. It's not the end of the world.

          • son? 🙂 omg…

          • “computers… it’s my job!”

            Followed by a stream of incoherent and insane rambling and ranting. The sad thing is that people like this *really do* imagine themselves to be experts, when in reality they truly appear to be in need of professional help and remedial education.

          • Yeah, it is my job… That's what you laymen and fanboys say… Do you have any proof??? I have a lot…

          • You clearly have no idea about what you are talking about and it is funny that people trust you with their computers.

          • Yeah… I see what you guys know about this 🙂

          • Two GTX 780 Tis in SLI can beat the R9 295X2, and GTX 980s in SLI will completely demolish it.

            https://www.youtube.com/watch?v=CZhC66yjROs

          • Because after 295 it says X2!!!! And if there are two cards in SLI/CrossFire, it does not mean they are literally twice as strong! What is wrong with you?? For example, if a card has 4 GB and you put two cards in SLI, you will not get 8 GB!!! You stay at 4 GB… Did you know that? No… So two cards in SLI are not twice as strong… stronger, but not even close to double… OMFG!!!

          • This is not twice as strong either, due to it being basically the same as CrossFire. Also, you are not really getting 8 GB of VRAM with this unless the game is designed for VRAM stacking by the devs and the API is DX12, Mantle, or Vulkan. Even if your rig is incorrectly showing 8 GB, you're only really getting 4 in DX11.

          • What?????????????? :))))))) The R9 295X2 is like two 290s in CrossFire??? Then what about the 290X??? Dude, computers are my job 😉

          • Then you are terrible at your job, because the 295X2 is exactly the same as two 290Xs in CrossFire.

            https://www.youtube.com/watch?v=JEgsG_wl0kc

          • I am going to replace my ATI Sapphire Radeon 5870 with a GeForce GTX 960. How is the Radeon R9 290 / 390?

          • My Gigabyte Windforce R9 280X OC ver. 2 is still faster and better than the new 380X… I love AMD, but the 3xx series is the same as the 2xx; it only has a new chip, but it is not better… Only the 390X is good, and the Fury X is amazing. The same thing goes for Nvidia: the 9xx series is not better than the 7xx series, or even some cards of the 6xx. But, like AMD, Nvidia has one great card, the 980 Ti… The GTX 980 Ti and Fury X are new cards; all the other cards are the same as, or even slower than, the old series (R9 2xx/GTX 7xx). And DX12… now all cards with the GCN architecture support DX12: HD 7xxx, R7 2xx, and R9 2xx… I love AMD, but the 3xx series (not the 390X and Fury X) is very bad… :/ But AMD always has AMAZING drivers, not like Nvidia for the 9xx series! I don't hate Nvidia, but it is a fact that Nvidia has terrible drivers… And to end: AMD is still on top, the king of GPUs (like Intel is the king of CPUs, for now; when "Zen" comes out, bye-bye throne 😀 ). AMD always has a flagship card! Now one more flagship card is coming, the 2x Fury X 😀 For now, the R9 295X2 is the killer of all GPUs…

          • Better take the 290… the 390 is very weak… the 960… no comment.

          • shut up bitch

      • I think the article was rather informative. From this and other research I've done, I've found that while Intel does provide the more powerful processors, at a given price point you'll get about the same amount of capability for your buck. The AMD 8-core may only perform as well as the Intel 6-core, but they'll generally have about the same cost. You could take the power route and go with the Intel 8-core for much more money, but since the software being released generally doesn't come close to using its capabilities, you're basically buying a Ferrari to drive to the convenience store; you'll never get to use its full capabilities.

      • Because it is 8-core vs 8-core, you retard. People who compare an 8-core vs a quad-core are just stupid.

        • The AMD FX-8350 is not a true 8-core CPU; it is more similar to a quad-core with hyper-threading than anything, although it is actually still worse, because every two of the so-called 8 cores share a fetcher, decoder, FPU, and cache, whereas with Intel each core is a full-fledged core and has hyper-threading. Since each core does not have to share its resources with another, you get higher single-core performance, and both have only 8 threads. When comparing CPUs with an i7 4790K, for example, which also has the same clock speed and 8 threads, you will see that the AMD FX-8350 gets its ass completely handed to it. I know for a fact because I have owned and tested both.

          https://www.youtube.com/watch?v=PgejkSWzvNs&list=PLyReHG5dDxXWxtuArwVgQkVAv6dppbbEp

  2. Sounds about right. I've always favored AMD for its inexpensive but good CPUs. I was AMD/Nvidia (CPU/GPU) for a while (and Intel/Nvidia before that) and have been AMD/AMD lately, after switching to an ATI 3870 on a lark just to see if it felt any different than running a GeForce (it didn't). Then I stuck with ATI when it got bought out by AMD.

    To be honest, I feel that AMD has always been competitive with Intel on CPUs, in terms of price/performance at least, and the same goes for Nvidia. Personally, I just liked supporting AMD because I do not want Intel and Nvidia to become monopolies. Don't get me wrong, I like Intel and especially Nvidia products, but I don't trust any company when it's a monopoly.

    • It really isn't, though. When you get down to performance, Intel is always ahead. The main issue is that people think more cores = better, which is not true. Very, very few games can efficiently use more than 2-4 cores.

      Intel is at a point where even i3s are outperforming 8-core AMD chips in gaming and day-to-day use.

      Unless you have a specific application that can fully utilize 8 cores, Intel will almost always win out, even with fewer physical cores.

    • Intel and AMD are like Republicans and Democrats.

  3. I would like to see tests using AMD GPUs as well.

    • So much this. Their GPUs are quite known to be less CPU-efficient in DX11 mode.
      This is especially apparent when you couple low-end CPUs (like a Pentium or i3) with their midrange offerings (like a 270).
      There are a bunch of tests proving that.

    • GrimmReaper WithaSpoon

      That would take way too much time. As it is now, I bet this took weeks.

  4. Hope you follow up with a piece on AMD GPUs, as they don't have PhysX support, and that sometimes falls back to the CPU to handle some of that load.

    • False…you can disable PhysX.

      • I normally don’t say this, but please state your justification for claiming that I am false in stating that AMD GPUs do not have native PhysX support.

        You can disable PhysX enhanced features in games and in the Nvidia Control Panel… but you cannot eliminate the *physics* that are processed in gaming environments. The question is specific to physics processing in a GPU-neutral environment – no TressFX, no PhysX. It’s been long perceived that Nvidia has a natural advantage here because of the PhysX acquisition, I’d like to see that revisited is all.

        • Wow, really? I didn't say AMD's have PhysX support. I said you can disable it. You are saying that without an Nvidia card you are taxing the CPU with PhysX, which is bullshit. Disable it.
          PS: GameWorks PhysX =/= physics.
          Of course you can't disable physics that are a part of the game code. Nvidia GameWorks PhysX, however, is something you can disable.

        • My brother got an AMD card and can play with HairWorks and PhysX enabled. He actually got better FPS than me, because he can lower the tessellation to x32 or x16. I have an Nvidia card and can't change the default tessellation of x64; there is no difference between x64 and x32, and little with x16.

          • Odd that they wouldn’t allow that, especially in the lower end cards. If it’s something that cannot be easily adjusted on the card – firmware or something low level – that would make sense. Hopefully they consider that for future cards.

    • That's a good idea; I used Nvidia GPUs simply because that is what I had on hand. Will definitely consider this in the future though.

  5. Yeah, it is kinda hard to justify going over $200 for a gaming CPU, especially if you aren't going to spend on a 144 Hz or 120 Hz monitor. If you are like most people running 60 Hz, all you need to do is get to a constant 60 FPS and you're golden. The most efficient way by far to get to that point is through the GPU.

    Even with this comparison, I think you'd be hard pressed to find an FX-8320 that couldn't reach 4.3-4.6 GHz. Go that route, overclock, and you'd be set for under $150.

    • In a few cases, an AMD CPU will be the cause of not even managing 60 FPS. They suck. I was on an FX-8350 for four years. Four years too long.

      • You cannot look at a single component as the sole factor. At the time of running the FX-8350 what RAM, GPU and disk were you running? Are you sure you didn’t have other bottlenecks in that configuration?

          • Pretty sure you don't know what "bottleneck" even means. Do tell, what other "bottleneck" could explain low FPS in CPU-bound scenarios? You are extremely ignorant if you're going to argue against the validity of stating that AMD CPUs get much, much lower FPS in CPU-bound scenarios, games, or CPU-bound image-quality settings. The fact that Intel beats AMD silly is common knowledge.
            Here's an example; I'll leave it at just the one, as I have better things to do: the steps of Whiterun (Dragonsreach) in Skyrim, looking down. A highly CPU-bound spot. 30 FPS on an FX-8350, even at 4.7 GHz. On a 4790K at 4.6 GHz, over 50 FPS.
            I mod Skyrim with over 250 mods, but that's irrelevant. The spot in Whiterun suffers the same FPS drop, modded or not.
            You might think that's not a big deal. I disagree, and I'm glad to be rid of my old system.

          • He asked you your specs, not your opinion. If you’re running 250 mods on Skyrim, then GPU VRAM could easily be a bottleneck if you weren’t running something really high end.

            Take your bullshit somewhere else. This place is for objective analysis, not “omg my opinion must be right based on anecdotes omg AMD suckz”

          • Oh, Jesus Christ, grab a clue. Learn to read. Modded or not, on an AMD CPU you will get 30 FPS on the steps of Whiterun. On Intel, over 50.
            IT'S A CPU-BOUND SPOT.
            I'm not going to bother listing my specs or going into detail. It's irrelevant.
            Anecdotal, my ass. I owned the piece-of-crap 8350 for 4 years.
            I don't need any more examples, and if you are seriously arguing with what I'm saying, you're completely clueless.

          • Don’t bother typing your long winded witty reply.

          • My bad, buddy. I guess neither I, nor the author, nor anyone else knows anything :/

            Sorry for debating with your irrefutable presentation of facts and thoughtful analysis of why your games ran as so, and thank you for presenting proof to what you’re saying with screenshots or benchmarks of in-game performance like the review did. I’m definitely considering your point of view now.

          • Are you seriously arguing that AMD CPUs are not inferior to Intel's? Are you seriously still responding?
            Don't take my word for it, smart-ass; go do some research. You're arguing against facts. Intel CPUs blow AMD's out of the water.
            Benchmarking games that aren't CPU-dependent doesn't refute this.

          • No, actually, that’s common knowledge. What I am saying is that you are an ass in the way you respond to people you disagree with, in this case about whether or not the CPU is actually the bottleneck.

            Without presenting any information that would lead a reasonable person to conclude that the CPU is indeed the bottleneck (which may as well be the case), you expect everyone who reads your anecdote (because that’s all it is – you have yet to reply with information that would definitely prove you right) to agree with you. Anyone who doesn’t is apparently uneducated and cannot read.

            TL;DR No need to be an ass on a site where everyone loves tech. There are much politer and less exhausting ways to prove your point (and you’re right, the CPU is the bottleneck from your specs down below from a different thread) than to act patronizingly towards those who don’t immediately agree with you.

          • Really? I describe a CPU "bottleneck", or low FPS due to a CPU that sucks at single-threaded workloads, and I get wannabe techs and wise-asses who have the audacity to argue against the fact that AMD CPUs blow, going off on wild tangents about RAM frequency and HDD read speed. looooool
            This article at best minimizes how awful AMD CPUs are, and at worst is deceiving its readers, as it conflicts with just about every source of info available on the internet.

            Oh, and btw, I have yet to respond with another example, as I shouldn't have to. Do you know how to Google? The entire internet will confirm what I'm saying. Get schooled.

          • Do you hear yourself? My comment wasn’t contesting what you said technically – I agreed the CPU is the bottleneck.

            You still sound like an absolute dick. I don't know if you're compensating for something else or acting tough behind your anonymity, but for real, it gets tiring when wannabe tough guys like you keep posting.

            It's fine to say that AMD's processor here isn't as good as Intel's processor here. No one cares that you're doing that. The difference between you and most other people with that same opinion is that most other people would take the time to respond with a link or two (which really isn't hard to find; no one on this site is a tech noob, and that's why they're all here) and offer a dissenting opinion that didn't call everyone who disagreed, word for word, wannabe techs or wise-asses.

            It's not your opinion, but rather your attitude, that's really off-putting for someone who names themselves buddydudeguy.

          • I’m not going to respond with a link or two when you are capable of Googling, which you obviously need to.

          • Holy shit. You really are incapable of comprehending anything except your own ego.

            Lemme give it to you clearly: AMD processors are not as good as Intel processors when it comes to single threaded applications, and nobody here is disputing that.

            What anyone here also agrees on is that you’re a righteous asshole.

            you.
            are.
            an.
            asshole.

            see above
            see above again
            see above again if you still don’t get it

          • Haha. Who's the troll again?
            Why in the heeeeeeeeeeeeeelll were you arguing otherwise, then going off on wild tangents?????
            Rhetorical question. Please… please stop responding.

          • Is that really all you got? I honestly feel pity for you – if this is how you are on the internet, then you’re definitely the inverse in real life. Bullied, minimized, unimportant and ignored.

            Please see a counselor :/ There’s a lot of stuff you clearly need to work out. You’re compensating, and it hurts me that you’re hurting 🙁

          • If you think you're under my skin, TROLL, you're mistaken 🙂 Have fun with… whatever the hell it is you do. Make up your mind.

          • You’re a complete and utter moron. It’s just common knowledge, I don’t even have to provide facts.

          • What is common knowledge? You don't have to provide facts about what? Could you possibly make less sense?
            If you're being sarcastic in regards to my posts, yes… the fact that AMD sucks compared to Intel is common knowledge, and I refuse to go Googling for any of you.

          • I think you're right, at everything you just said.

          • Stop with the troll crap. I get sick of reading comments by technology bigots who can't see past their own bias. I love my Intel gear too, but not to the point where it's the be-all and end-all, period.

          • Dude, you're really an asshole; this opinion is based on your responses. He is not trolling; you're the one doing the trolling here. Shit, even a 3rd-grade level of comprehension has slipped past you, as you're too egocentric and narcissistic to even accept his comments, just because he doesn't subscribe to your opinions!

          • I'm using an AMD FX-9370 OC'd to 5.1 GHz with RAM at around 1866 MHz and two Sapphire Radeon R9 290s in CrossFire, also using Skyrim modded with 4K textures, in 4K on my ASUS PB287Q, and I have ZERO issues with the Whiterun steps, even at 4K ultra with 4K textures. Take your Intel fanboy garbage out of here.

          • Haha. Whatever you say. You're so full of shit your eyes are brown. It's not an "issue"; it's a CPU-bottlenecked spot, and you are not at 60 FPS looking down on the steps of Dragonsreach.

          • Do I have to record the game and show you? Are you that ignorant?

          • rofl! Do eet! I will show you footage of my 4790K wiping the floor with your fail-ass FX.

          • I see your ignorance knows no bounds.

          • By what? 15 seconds and 20 FPS or more? It makes no difference. That's the Intel fanboys' defense, since they feel bad about blowing so much money for 20 or more FPS. It's the video card that does the work.

          • Hahaha. Tell me that 20 more FPS doesn't matter when you get 40 FPS and I get 60. You're in denial. Fail-arse buyer's remorse.

          • Not in the games I play. As long as you can get to 70 FPS, the games run fine for me and my AMD.

          • "It's the video card that does the work"… yeah, you're an idiot.

          • The CPU sends the info to the video card. That's in the builder's 101 guide; read it!

          • Paulie M30 (The Gaming Junkie)

            I have 2 8350s, one on air @ 4.6, one on an H100i @ 4.9. I also have a 4790K @ 4.8 on all cores, and a 5930K @ 4.6. For anyone to claim the 8370 performs the same as the 5960X, a $1000 chip with 16 threads, is FUCKING RETARDED!! The FX chips weren't even that good when they came out 5 years ago. After reading this smut, I ran a good amount of these benchmarks. Absolutely ZERO of them were even close to the results in the bullshit article. If you like AMD, that's fine; have at it. But to think that a chip with 5-year-old architecture can stand toe to toe with what is right now the most powerful consumer-grade chip on the market, you must have brain damage. My 8350 with 1 980 Ti and 16 GB of DDR3 1866 on an Asus Sabertooth 990FX scored 10232 in Firestrike. All I did was swap out the chip and board for a 4790K and a Z97 Sabertooth Mark 1, and my score went up to 14968. Buddydudeguy, there's no point arguing with AMD fanboys; it's just as bad as arguing with console peasants. GET EDUCATED, PEOPLE!

          • Thank you. For Christ's sake, I was arguing with ignorance.

          • Paulie M30 (The Gaming Junkie)

            Well, Buddydudeguy, I think we're in the wrong place to expect intelligence. We're only going to get ignorance unless we have a conversation. Someone asked me for proof of my "CLAIM" that the 5960X is a better chip than the 8370, and apparently the 5 links to REPUTABLE sites such as AnandTech, OC3D, and Linus Tech Tips weren't proof enough. They want my benchmarks on the 5960X, a chip I clearly stated I don't own. But I'm still gonna bench the chips I have. Good day to you, sir.

          • Guys… I don't want to have a thread-cleanup session. Let's stick to the context of the article and leave the personal verbiage out of it, shall we?

          • I wouldn't even provide "proof" that a 5960X is better than an 8370, lol. That's not worth responding to.

          • Education is very important, and part of being educated is understanding why benchmark scores differ when making comparisons. Much of the difference in scores in Firestrike's default benchmark is due to the software giving the Intel chips a fairly large advantage in the way it makes use of the cores on the AMD chip. Run it again and pay attention to core usage on the AMD compared to the Intel, particularly during the combined test. What is funny to me is that a 4790K locked at its turbo frequency and an FX 8-core locked at the 9590's turbo frequency will score very similarly in Firestrike Extreme, and the FX beats the 4790K's overall score at the Firestrike Ultra setting
            (both using a stock 290X). Another important point to understand is just how a synthetic benchmark applies to how the machine is going to normally be used.

          • Don't reply to him; he will just flag your comments, and they'll be removed. It seems pretty clear that it's the buddy guy.

          • Same GPU and RAM in both rigs. 🙂

          • Paulie M30 (The Gaming Junkie)

            Yes, actually. I own 2 970s and 2 980 Tis. I own in total 64 GB of Corsair Vengeance Pro DDR3 1866 and 32 GB of Corsair Vengeance DDR3 1600. I use 16 GB of DDR3 1600 in all benchmarks. Lastly, I was not saying that AMD sucks. I own both AMD and Intel chips. My main argument and issue with this article was the fact that all benchmarks across both chips were the same, both overclocked and stock; there were no variations. Not that AMD is the same as Intel. I'm well aware that you get no major gains in gaming with a $1000 chip. But under no conditions, except ones controlled to show the results you're trying to show, will a stock 8370 perform no differently than an overclocked 8370. Nor will a 5960X perform the same at 3.0 GHz as it performs at 4.4 GHz. That's all I was trying to say. If people wanna believe that, more power to them. Whoever did these benchmarks should run Cinebench R15 on both setups; then let's see what they come up with.

          • Paulie M30 (The Gaming Junkie)

            Go to ANY REPUTABLE source (AnandTech, pcper.com, Linus Tech Tips) and compare these benchmarks. They will not hold up.

          • Don’t waste your time.. he’s just here to crap on AMD.

          • Look man, I get it. CPU-bound games, Intel > AMD, in your opinion. And in 90% or more of those cases it’s true and has been proven time and time again. But it doesn’t mean that every single person needs to avoid AMD APUs, even if they’re playing games.

            Remove MMOs from the equation for the sake of argument. WoW, Skyrim, FFXIV, etc. as they will always be CPU bound, even on high end GPUs in SLI/Crossfire. That’s just the nature of the beast – has been ever since my EQ/SWG days. I don’t play a lot of physics-intensive games and many of the folks I build for play games like Minecraft, SC2 or MOBAs. The reason that I use Intel in my desktop is purely Android build times and other programming/development efforts.

            It goes back to basics again: What do people use the PC for, and what does the job best for them at their price level? What works for you will be different than what works for me. And since everyone’s situation is different, why not discuss those and let others decide?

          • WoW, Skyrim, FFXIV? Skyrim isn’t an MMO.
            I’m trying to walk away, but come on, guys.

          • You should really go back to school; he never said Skyrim was an MMO, he specifically stated that it’s “physics intensive” and “single core dependent.” Dafuc is wrong with you, failed 3rd grade 20 times now?

          • Of course Intel CPUs are better; you’re going to spend $100 more on the chip and $30 (it used to be $100) more on the motherboard for that Intel chip, so it had better be faster.

          • Right….?
            I’m sorry, were you making a point? In your other post you were saying how awesome your beast 8350 is.
            You’re doing it wrong too, using the base clock… or do you have a fail-ass non-Black Edition?
            I’m glad we’ve established Intel is better… wait, I’m confused. Can all of you make up your minds? One minute you’re arguing objective fact and the next you’re talking about how beast your 8350s are and how wrong I am.

          • Nah, what everyone is saying is that you’re an asshole who is overcompensating by acting tough on the internet, cupcake.

          • Just read the thread and this comment says it all.

          • Way to behave like a prick in the comments section. Everyone is talking to you like a human being, and the bile still foams at your mouth.

            GG, fanboy.

          • what GPU were you on?

          • Relax…It’s a CPU…

          • Just in case it wasn’t incredibly clear that he’s making it up, I’d like to point out that ALL FX-8350s are Black Editions. Locked multipliers are gone on FX CPUs.

          • All FX chips are unlocked, moron!!! See, there you have it: no clue from you!

          • The issue is the ALL CAPS replies at the start and the aggression. Also, people who lower themselves to name-calling usually won’t hold much ground, because it just looks like they are throwing a pissy fit to try and get people to see what they see, and it usually never works.

            You could have said all you said early on, that clock for clock Intel is better, and then provided links and proof. Most would then argue back that they can get an AMD 8-core for $200, while an i5 is $200+ and you only get 4 cores.

            So then you argue that the i5 will still beat the AMD [insert links] and even an i3 [insert links] can beat an AMD for reason XYZ.

            Just learn to be less confrontational, and I am sure many people here would have been agreeing with you from the start.

            There are the hardcore AMD people and the hardcore Intel people. In the end, people need to learn that neither company gives two craps about you, so support the product that you get the most performance from for your money, which in reality is Intel almost completely across the board, with a few exceptions for CPU-hungry applications that can utilize 8 cores for $200.

          • Francois Combrinck

            I’m sorry, you want to tell me that for $200 you can build a better gaming PC with Intel parts? Bwahahahahaha

          • Not really; most places now are selling 990 chipset boards for the same prices as new Intel boards, and remember that chipset is almost 4 years old with nothing new really going into it.

            Also, most people buy an AMD 8-core thinking WOW, I got 8 cores, when for gaming and most day-to-day usage an Intel i3 would perform just as well, and in some cases faster, cooler, and using less power.

          • If you’re comparing the 990FX to a cheap H87 board then sure, they’re the same price, but if you want to overclock like many gamers do, then you can go with a $60 AMD 970 board, whereas you’re going to have a hard time finding a decent Z87/Z97 board for anywhere near that cheap.

            If we’re talking about JUST gaming and JUST right now, then sure, there are a couple of decent i3s that will perform just about as well in games as an 8350, but when DX12 and Vulkan get going in the next year that 8350 is going to smoke any i3, and if you’re talking about multithreaded apps it already does.

            You’re right about the power, though; the AMD solution will cost you at least $2 a month in power over the Intel. On the other hand, if you’re like me and live somewhere that gets cold in the winter, I end up saving money with AMD because the extra heat translates directly into less need to run the house heater.

            Next I think we should discuss how much computers, and CPUs in particular, cost before Intel had any serious competitors, as well as every time they gained a significant lead in performance, and you can explain to me how supporting a company that price-gouges every chance it gets is a good thing. Meanwhile, I’ll be saving you hundreds of dollars on your next CPU purchase by helping to support their competitor.
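As an aside, the back-of-envelope arithmetic behind a power-cost claim like “$2 a month” is easy to sketch. The wattage gap, hours per day, and electricity rate below are assumed illustrative numbers, not measurements from this thread:

```python
# Rough monthly cost of a CPU that draws extra power under load.
# All inputs are illustrative assumptions, not measured values.
def monthly_power_cost(extra_watts, hours_per_day, usd_per_kwh, days=30):
    kwh = extra_watts / 1000.0 * hours_per_day * days
    return kwh * usd_per_kwh

# e.g. 100 W extra draw, 5 gaming hours/day, $0.13/kWh
print(round(monthly_power_cost(100, 5, 0.13), 2))  # 1.95
```

With those assumptions the gap lands right around the $2/month figure quoted; a bigger wattage delta or more hours of use scales it linearly.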

          • OK, let’s try this… What car would you rather have, a Mitsubishi Evo or a Nissan GTR? The GTR is clearly better (Intel), but the Evo (AMD) does fine for half the price. I know perfectly well that Intel is faster, but I refuse to pay the extra money for it because that little boost isn’t worth it IMO. And for the love of God, reread the last paragraph: if you have extra money, buy Intel, but if you’re on a budget, get an AMD.

            BTW, both cars will get you pulled over at the same speed, so out on the track is the only place you’ll see your GTR (Intel) outperform. Nürburgring lap times: 2011 GTR, 7:34, $68k :: 2006 Evo 9, 8:11, $33k. {Yes, I ignored the 2015 Nismo GTR because at double the cost of the regular GTR it’d basically be saying “supercomputer @ $150k,” and it was designed totally with that track in mind.}

            To each their own; you should learn that value is an important part of things, or why didn’t you even think to buy DARPA’s 1 THz chip? (I know it’s not for sale to the public, but it proves the point.)

          • “Fine” is subjective. The FX chips result in substantially lower FPS in many cases, numerous enough for me to call them just straight-up bad compared to the alternative.

          • well said!

          • Francois Combrinck

            Haha, I know it’s not relevant, but a modified Evo will always be faster than a GTR around a track. There is no way around physics, and unfortunately the Evo will always be much lighter than the GTR, so the Evo will be much quicker as a track weapon. Just my 2c. It’s the same story if you look at superbikes: 600cc bikes will demolish a Hayabusa around a track. The Hayabusa is too heavy, although it’s faster in a straight line.

          • Oh look another insecure troll. “Meh experience was bad. Wah intel legit m8s” GR8 B8 M8 i R8 8/8

          • I have an FX-8350 (not overclocked) on an Asus M5A99X-EVO R2.0, paired with a Sapphire R9 280X Tri-X OC and Crucial Ballistix Elite RAM at 1866 MHz CL 9-9-9-27 1T, and I get 60 fps looking down the stairs in Dragonsreach at a resolution of 1920×1200. I don’t even have an SSD; I own a pretty old 640 GB WD Black, still good though.

          • Unmodded you dense moron. BIG DEAL.

          • “anecdotal” lol I’m still chuckling. AMD’s inferiority is common knowledge not anecdotal.

          • Gee, running out of VRAM hurts performance! I dudn’t kn0w dat!
            lol, you guys are hilarious. You’re right! I don’t get more fps in Skyrim with Intel; it’s all in my imagination. Must be my slow 1866 RAM too! We all know RAM speed over 1600 MHz matters? Wait… no, it doesn’t.
            You guys need to get learned.

          • Once again, DDR3 compared to DDR4. The bandwidth of DDR3-2133 is 27,000 MB/s and DDR4 is roughly double at 60,000 MB/s. Yes, the bandwidth has a lot to do with performance.

            https://www.micron.com/products/dram/ddr3-to-ddr4

            https://www.corsair.com/en-us/blog/2014/september/ddr3_vs_ddr4_synthetic

            Now stop talking out your arse!!!
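For what it’s worth, theoretical peak DDR bandwidth is easy to check: transfer rate (MT/s) times an 8-byte (64-bit) bus, per channel. A quick sketch; the figures being thrown around in the comment above likely mix different channel counts:

```python
# Theoretical peak bandwidth of a DDR memory interface.
# Each channel is 64 bits (8 bytes) wide, so MB/s = MT/s * 8 * channels.
def peak_bandwidth_mb_s(mt_per_s, channels=1):
    return mt_per_s * 8 * channels

print(peak_bandwidth_mb_s(2133))              # DDR3-2133, one channel: 17064 MB/s
print(peak_bandwidth_mb_s(2133, channels=2))  # dual channel: 34128 MB/s
```

By this arithmetic a dual-channel DDR3-2133 setup tops out around 34 GB/s theoretical, so quoted real-world numbers will depend heavily on channel count and the benchmark used.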

          • I think AnandTech did a thorough review and testing of DDR3 RAM speeds and literally found that with anything over DDR3-1600, you will only notice a difference if you run benchmarks all day long; no one will see a noticeable real-world difference.

          • That’s not entirely true. RAM is only utilized, not used. For example, in games, utilizing more than 4-8 GB of RAM is quite a feat, while rendering can utilize far more, and yes, there is a noticeable difference while rendering. Furthermore, the higher the RAM frequency, the more cycles of CAS latency you see, e.g. 1600 MHz uses CL9, 1866 MHz uses CL10, and 2133 MHz and sometimes 2400 MHz use CL11.
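Those CL figures can be turned into absolute latency: DDR transfers twice per clock, so one memory cycle lasts 2000/(MT/s) nanoseconds. A quick sketch of why higher-clocked RAM with a higher CL is not necessarily slower:

```python
# First-word CAS latency in nanoseconds: CL cycles * cycle time.
# DDR makes two transfers per clock, so cycle time (ns) = 2000 / (MT/s).
def cas_latency_ns(mt_per_s, cl):
    return cl * 2000.0 / mt_per_s

print(round(cas_latency_ns(1600, 9), 2))   # DDR3-1600 CL9:  11.25 ns
print(round(cas_latency_ns(2133, 11), 2))  # DDR3-2133 CL11: 10.31 ns
```

So the 2133 CL11 kit actually has slightly lower absolute latency than the 1600 CL9 kit, on top of its higher bandwidth.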

          • rofl!!!

          • Low FPS in CPU bound scenarios can still see potential limitations in RAM and disk read/writes not being able to feed the necessary information to the CPU in a timely manner. It happens to me all the time when I’m doing Android builds, which is absolutely CPU intensive.

            By the way, I’m curious – how DID you get that FX-8350 4 years ago? Part of the test group prior to its release?

          • Le sigh…
            16 GB of DDR3-1866 and a 250 GB SSD. Happy? Grab a fricken clue.
            And har har, you’re not only clueless but you’re a joker too!
            Whatever… 3 years… 4 years… I should have spent the extra $100 and gone Intel to begin with.
            What part of “I get this fps on this hardware and that fps on that hardware” do you find hard to grasp?
            Go ahead, run Skyrim and go to the steps and look down. Your fps will be much higher on Intel.

          • Thank you. Depending on what GPU you had and how much it was relying on the CPU to pick up the slack, it is indeed possible for the CPU to be a choke point. In CPU intensive tasks, like building Android, the 8350 will build slower than an i7-4770/4790.

            Everything has to be considered in trying to identify these issues. The SSD certainly wasn’t holding things up but that RAM could have been a factor, based on my own experiences going from DDR3-1600 to 2400. Curious what you’re running now, if you’re willing to share.

          • Do you honestly think you’re teaching me something with your responses? You’re darn straight that in CPU-intensive tasks (namely CPU-bound games) AMDs choke. There is nothing to “identify” and you’re arguing objective fact.
            I’m not some noob that uses low-end parts, buys prebuilts, and/or doesn’t know these things.
            I object to this article because it uses fairly GPU-bound games and it leaves out key facts, like in a few games your fps will be abysmal if you’re on an AMD CPU. 20-50% less fps just because you chose AMD for your CPU is nothing to scoff at.

          • Don’t know. But this back and forth might answer questions from another reader.

          • In EVERYONE’S gaming situation, if the scenario/game is CPU bound, your fps will be abysmal compared to Intel. It is perfectly valid.

          • The problem with that argument is that not everyone plays CPU bound games. What would you recommend for a kid playing Minecraft, for example?

          • I said if the scenario/game is CPU bound. Logically, if it’s not, then your choice of CPU doesn’t matter.
            A lot of the time AMD CPUs do fine. The rest of the time they suffer. It’s the rest of the time that bothers me, and the rest of the time happens often enough to matter. If you’re broke, then by all means use an AMD CPU/mobo. Just don’t come here arguing, asking what my HDD is or what speed DDR3 I’m using, when it’s been said how bad AMD CPUs are in CPU-bound scenarios. It’s way off tangent. If you know darn well AMD CPUs are relatively bad, then why argue in the first place?
            Every single response from you is either common sense or off on a tangent, where you could have accepted it as a simple example of AMD’s horrible single-threaded/per-core performance.
            And your example is terrible. Minecraft is CPU bound. I don’t play it and I don’t see the appeal, but at least I know that much.
            If you want a good experience with it, get the best CPU you can afford. For a kid playing Minecraft, or any other game for that matter, of course you go for the least expensive while adequate.
            I’m not even sure what you’re saying any more.

          • Definitely not an AMD CPU at the moment; Minecraft does not handle multithreading at all. The G3258 would perform just as well as an 8350 in that scenario. Actually, both are quite close: the G3258 pulls more fps but is less stable, while the 8350 is consistent and only has frame dips once in a while.

          • Minecraft doesn’t need to be multithreaded; you can run Minecraft maxed out on a cheap APU from AMD. People have this misconception about games like Minecraft: because it’s basically single threaded, it must require an Intel. If you look at frame times on a Pentium, they are pretty crap in games like GTA 5; even if they get high FPS, the stuttering is abysmal, which is why you basically need an Athlon 860K or an i3-4150 or higher to handle most games decently. Anything above the i3 or 860K is pretty minuscule for most games.

          • Well, a good CPU and an HDD, since rendering chunks is very CPU intensive.

            Try playing Minecraft on an AMD A8-6xxx and then on an Intel C2Q 8200. The Intel one will win; while it’s 6 years older, it’s 1.5x faster.

          • Odd that you’re saying these things in the comments of an article that completely contradicts them; it looks like every game in this article manages 60 fps on AMD.

            It’s OK; the majority of the knowledgeable people here already realized you don’t know what you’re talking about the moment you said that games are “CPU-intensive tasks,” when the reality is there are almost no games that will use more than 2-3 cores. You’re wrong about that, you’re wrong about how long you had an AMD CPU, you’re wrong about them not being able to hit 60 fps, and you’re COMPLETELY wrong about your numbers; there is no 20-50% difference in FPS on AMD vs Intel in gaming, period, more like 2-10%, with the majority trending toward the 2% end.

            Dunno, personally I get 60 fps in everything because I overclocked my 8350 by upping the FSB to 220 and jacking the RAM speed through the roof, which is just yet another thing you’re wrong about with regard to your PS statement.
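For readers following the overclocking tangent: on these chips the core clock is just the reference (“FSB”) clock times a multiplier, so raising the reference clock from 200 to 220 MHz scales everything by 10%. A sketch assuming a 20x multiplier (the FX-8350’s stock base multiplier; an unlocked chip lets you change either value):

```python
# Core clock = reference clock * CPU multiplier.
# 20x is assumed here for illustration (FX-8350 stock base multiplier).
def cpu_clock_mhz(ref_clock_mhz, multiplier):
    return ref_clock_mhz * multiplier

print(cpu_clock_mhz(200, 20))  # stock reference clock: 4000 MHz
print(cpu_clock_mhz(220, 20))  # 220 MHz reference clock: 4400 MHz
```

Note that raising the reference clock also raises the memory and HyperTransport clocks derived from it, which is why this route “jacks the RAM speed” as a side effect.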

          • Ya, you’re right. AMD CPUs are great, not a joke compared to Intel at all. Totally, man! 60 fps in any game!
            The entire internet is wrong and you’re right!
            Keep on sniffing that glue, man. Keep on keepin’ on.
            lol, you think RAM speed matters for gaming. I’ve got better things to do than type at clowns like you.

          • Francois Combrinck

            Lol, here you just saw an 8370 keep up with the best Intel chip on the market, yet you still argue. I am running an 8320 at 4.6 GHz with a GTX 670 DCII edition, and my friend has the 4th-gen 4790K with the GTX 680 DCII. Both rigs use 2100 MHz RAM and Samsung EVO SSDs, and we get almost identical frame rates, with a 2 to 5 fps difference in games. Just because you have modded a game with badly designed software, textures, or coding that favours one CPU over the other does not mean that the CPU your mods were not optimised for is bad. But everyone can see you are a fanboy for Intel, and that’s your opinion, which you are entitled to. Just don’t post utter BS and knock things you don’t like. Posting misleading BS because you are a fanboy causes people to get annoyed with you. No one is convincing you to buy an AMD; buy what you want, just don’t talk nonsense to justify your choice of CPU.

          • With GPU-limited testing… now toss in real high-end cards to go with that high-end CPU. Who would buy a $1k CPU and toss in some 960s and 970s? No one who bought the $1k CPU for gaming.

            And to note, anyone who buys a $1k CPU for gaming is not too bright either, unless they plan on a tri-SLI/CrossFire config with high-end cards.

            As noted, they could have tested with an i3 as well and gotten the same results vs AMD… so now should we defend the Intel i3, since it is faster in the end for gaming vs AMD in almost every game they tested?

          • hahaha!!! There are LOTS of examples out there with 20-50% more fps on Intel. I’m not going to Google it for you. Have fun with your “FSB,” NOOB. And I am very, very right about RAM speed. Are you on an APU? Then RAM speed does matter. If not, you’re way out of your element and don’t know what you’re talking about. You’re lucky if you get 2 fps more with faster RAM.
            haha!!!

          • Are you on an APU??? HE SAID HE WAS ON AN 8350!?!? Something is telling me that you are some little 12-year-old defending Intel right now.

          • Get… a grip.. geez

          • haha!! Not even going to reply to you.

          • My question was rhetorical, dumb sh9t. The only time RAM speed matters, like… AT ALL, is on an APU.

          • Why are you so rude? What is the point of being a prick? He is simply stating what he stated; he asked you about your hardware, and you give him lip instead of just trying to find the problem and accepting his help. You seem to be an Intel fanboi, nothing more. My bet is that it isn’t the CPU, rather the PCIe lanes. Go figure that you see a difference between PCIe 2.0 and PCIe 3.0; also go figure that you see a difference with DDR4 as compared to DDR3, let alone dual-channel DDR3 and quad-channel DDR4. I mean, the bandwidth is faster. Stop being so ignorant; his benchmarks are along the lines of other benchmarks done. It can even come down to what chipset you were using (my bet is the 970 chipset, as you bought the FX chip at its release). I tend to agree with the other people here: you have absolutely no clue what you’re saying.

          • You are half right on this one: Skyrim only taxes one core, and AMD CPUs are terrible in single-threaded scenarios.

          • Have you tried meditation?

          • The FX-8350 isn’t that old.
            Me, I’m waiting for Zen.

          • Um..yes it is.

          • Intel has been through… how many revisions, while AMD has done nothing but release an overclocked, higher-TDP 8350 (the 9590)? It’s been a number of years. Stay away from AMD until Zen. Their CPUs suck.

          • Paulie M30 (The Gaming Junkie)

            Thank you! Someone with common sense.

          • You’re right; this test is bullshit, faked by using GPU-bound scenarios and subpar GPUs (1440p using a 970? That GPU can barely manage ultra at 1080p…).

          • Pretty much; they do not strain the CPU. Toss in 2/3 cards in SLI/CrossFire and use a 980 Ti or Titan or some 290Xs or duals, then see where Intel will shine. A $1k CPU is not bought for a gaming rig that uses 960s and 970s.

            Then trim the Intel back to the $300-range CPUs and compare those to the $1k and see the same results.

            Then toss in the i5 and i3 and really start to see a CPU-bound problem, though they’ll likely keep close to the AMD.

          • Skyrim is absolutely atrocious on CPUs and strongly disfavors AMD. Also, “looking down” isn’t really relevant, although I understand why you would test that way.

            Also, the reason WHY AMD performs so poorly in Skyrim is that Bethesda has always had horrible optimization. i3s and Pentium dual cores run it just as well, because they didn’t bother to include any optimizations for anything newer than the P4 architecture.

            As a former owner of an FX 6-core, Skyrim ran pretty shitty for me too. But the big CPU-bound games that are dual-threaded (RTSs, MMORPGs) ran well. Although the i7-3770K I got performed about 20-ish percent better, I didn’t see such a huge jump as you are describing.

            Donny mentioned several times in this article that there ARE games that will run better on Intel, but most games will run just fine on AMD.

            If you play a lot of Skyrim I can understand your issues, but that is on the hands of Bethesda and not AMD. I’ve put hundreds of hours into Fallout 3, New Vegas, and Skyrim on PC, and I will be the first to tell you they aren’t polished at all.

          • You should learn how to articulate your arguments without being so abrasive.

          • So I may be ignorant too. “30 fps on an FX-8350, even at 4.7 GHz. On a 4790K at 4.6 GHz, over 50 fps.” So a CPU that costs $320 is better than one that costs $170? This sort of thing is what the article was showing. It says Intel’s chips are faster, but are they worth it for most gamers, like people think?

            (Sorry for the late-to-the-party comment; this article just showed up in my feed.)
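Taking the numbers quoted in that comment at face value, the value question can be framed as frames per second per dollar. A quick sketch; the fps and price figures are the ones quoted above, not independent measurements:

```python
# Crude value metric: fps per dollar of CPU price.
def fps_per_dollar(fps, price_usd):
    return fps / price_usd

print(round(fps_per_dollar(30, 170), 3))  # FX-8350 as quoted:  0.176
print(round(fps_per_dollar(50, 320), 3))  # i7-4790K as quoted: 0.156
```

By that crude metric the cheaper chip actually wins, which is roughly the article’s point; of course it ignores absolute playability, platform cost, and everything else the thread is fighting about.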

          • IMO, yes, the investment is worth it. By the time I ditched my FX-8350 I was downright sick of the lousy performance. I’m loving my 4790K and wish I had just gone Intel to begin with. As for your question… the thing is, so many people see higher fps on Intel and think that means FXs “bottleneck” GPUs. Fact is, most people misuse the term and shouldn’t bloody use it. More fps on Intel is always due to superior IPC/core performance in CPU-bound scenarios. In anything that’s GPU bound, an FX CPU will not “bottleneck.” More fps on Intel in this area of a game, or in this or that game, simply means the game is more CPU dependent and favors Intel’s superior IPC.
            On the other end of the spectrum, noobs flat-out deny that Intel is better in any substantial or meaningful way. They are flat-out wrong and kidding themselves. Granted, in perfectly optimized games, plus games that aren’t very CPU dependent, FXs do fine! The rest of the time they suck. Case in point: Skyrim. You get A LOT better performance on Intel with Skyrim, or any game that likes a good CPU for that matter.
            AMD’s current CPUs SUUUUUUCK.

          • “Granted, in perfectly optimized games, plus games that aren’t very CPU dependent, FXs do fine!” Since this is my point, I will agree with what you told me and let you bug that other guy with the bottleneck comment.

          • Problem is, there are many cases (games, parts of a game, whatever) that are CPU dependent and run much better on Intel. The list of games that are purely GPU bound, where the CPU doesn’t matter, is really rather small. Even then, “fine” is relative and subjective. “Herp derp, 20 fps isn’t a big deal,” they say… when you’re seeing 40 fps and could be getting 60, yeah, it is a big deal.

          • I don’t run at 1080p; my 720p serves me just fine with my AMD. The 20-30% more fps on Intel does not justify the overcharge to the consumer. Intel plays you guys like morons and you fall for it. Lastly, the games I play don’t need a high frame rate.

          • AHAHAHA. Have fun with the broken physics. Who the f8ck plays Skyrim without iPresentInterval on? You’re doing it wrong and missed the point ENTIRELY.

          • You disabled iPresentInterval, play with no mods, and take a screenshot with nothing happening (let alone the stairs looking down at Whiterun), and think you’re proving me wrong… I’m still chuckling.

          • Is that why you replied twice? 😉
            Most of the time I’m in the 100 to 120 fps range with vsync disabled. I believe 72 is the lowest I saw while in game.
            With vsync enabled I averaged 59.8 fps on ultra settings.
            Post a screenshot of the exact location you are referring to, along with your graphics settings; I’d be interested in seeing what you mean (an fps counter would be good too).

          • With no mods. Wow, what a feat. And you do not get 72 fps on the steps of Dragonsreach looking down at Whiterun 🙂

          • oh my goodness, look what we have here 🙂

          • Enjoy the screwy physics bugs with high fps, btw. You’re a noob. You don’t even know what I’m talking about when I say the steps of Dragonsreach.
            Now run along, go install texture mods, DynDOLOD, Verdant grass, and an ENB, and tell me you get 120 fps with 72 being your lowest.

          • I was most interested in comparing at the exact spot you seem to be obsessed with, but I can see you are more interested in making inflammatory remarks than in actually learning anything. It seems like you are very insecure, and your ego can’t accept anything that threatens your self-made illusion of superiority.

          • Ya das totally it dood.

          • You didn’t seem very happy about me disabling vsync, so I re-enabled it. It doesn’t seem to matter where I look; it’s pretty much locked at 60 fps. Here are the settings, etc.

          • You’re schooling me kid.

          • Between you and me, I’m 50 years old 😉

          • That was sarcasm, genius. You haven’t schooled squat, and you have completely missed the point while babbling about fps that is neither here nor there and proves nothing. I don’t care if you’re 90.

          • One of us knows exactly what he is talking about and posts proof; the other one just talks a lot. 😉

          • Proof of what??? You are so off on a tangent it’s hilarious.

          • I’m gonna spell it out real plain and simple for you, not to mention repeat myself… Follow the STEP guide for mods. On top of that, get Verdant grass and an ENB of your choice. Now run along, do this, and tell me you get 120 fps. In fact, I want to see a screenshot. I don’t give a flying crap what fps you get with vanilla Skyrim. No one does. It’s not demanding. In any case, the steps of Dragonsreach are indeed a CPU-bottlenecked spot, and you will in fact get really bad fps looking down the steps at Whiterun, modded or not. And for the bloody tenth time, you will get worse fps there on AMD than on Intel. Seriously, grab a clue. You’re boring and annoying with your vanilla screenshots. Who the hell plays Skyrim with no mods to fix how arse-ugly and dated it is?

          • I love your noob screenshots of your noob settings too. 8x MSAA, lmao. First off, who the frack uses 8x MSAA? 8x is well past diminishing returns and hurts your fps. Second, from that screenshot alone I can tell you play Skyrim without an ENB, as you disable in-game AA and AF when you use ENBs. Vanilla, unmodded Skyrim is nothing to brag about and is really, really bad ammo to bring to an argument. You’re 50 years old? You come off like a teenager totally new to PC, arguing with me and posting your noob-arse vanilla screenshots.

          • It would appear that you are the one running short on “ammo,” as I have posted proof that you are mistaken about the capabilities of the FX 8-core. I understand that at the same clock speed, in a game that uses basically one core, the Intel will give higher frame rates. However, the FXs can be clocked up to a point where you cannot see a difference between one and a 4790K.
            As for being a computer “noob,” well, I cut my virtual teeth on a Commodore VIC-20 back in the early 1980s and have been building my own machines now for about 15 years. I have, within arm’s length of my desk, 2 FX 8-core rigs, an Opteron 180, a 4790K running at 4.9 GHz, a 2600K @ 4.5, and a 3770K @ 4.7, all on water cooling. I compare them against each other almost daily. Approximately 25 other CPUs are strewn about, just waiting to be played with. I’ve been in the top 100 of the enthusiasts league over at HWBOT (out of around 50,000).
            I don’t have any particular stake in the discussion other than that I don’t like ignorance (my own or others’), and if someone makes a claim that is contrary to what I believe, I respond with curiosity rather than anger and seek the answers for myself.
            I have the feeling that no matter what I post, you will have a problem with it if it shows the truth to be anything other than what you want it to be.

          • You’ve got Down syndrome, man. Stop posting. You posted saying how much fps you get with NO MODS, for Christ’s sake. Who the f8ck cares!!! Vanilla Skyrim isn’t demanding! Whatever fps you get, you will get more with Intel! WTF part of that don’t you grasp? Namely, on the steps of Dragonsreach looking down at Whiterun, your fps plummets on AMD. Not so much on Intel. Same for any other specifically CPU-bound spot. You’re an idiot and I’m sick of getting notices that you brain-farted again and posted. Grab a goddamn clue.

          • lol, skim-read more of your insistent babbling… enthusiasts league! LOL, with an AMD proc. You’re a clown. Stop posting. If I could ignore or mute you I would; it doesn’t seem to be an option.

          • You seem upset. It might be healthy for you to admit you were wrong; you’re only human, after all. Could be very liberating, and not worth losing sleep over. Be well :).

          • https://uploads.disquscdn.com/images/09d6416db6040c39af32814c7af9c6c17fd47df37f00b9bc04bf7fef31044526.jpg https://uploads.disquscdn.com/images/72e80b429ab0004bdf7bc7d96087880a0a621bab42ef20f77159f828664f1a9b.jpg
            I’ve been working 60+ hour weeks, so I haven’t been playing very much at the bot lately, and I’ve lost some places in the enthusiasts league.

          • Stop derping, you make my brain hurt.

          • Well, at least it’s a small hurt 🙂

          • I’m not the one arguing how awesome-sauce AMD FXs are. You’re fking dense.

          • No, I simply speak the truth. You either don’t know what you are talking about, or are just trying to elicit a response through childish trolling. To recap: you have been spouting that the FX is terrible in games like Skyrim. In response, I posted a screenshot of my FX rig pumping out 300 FPS in Skyrim. You weren’t happy about where I took the SS, or the fact that I had disabled vsync, so I posted a SS of my FX rig in Skyrim with vsync enabled at the spot you suggested, maxing out the frame rate at 60 FPS. You doubted that I get 72 fps in that spot with the FX rig, so I disabled vsync again and promptly posted a SS showing… 72 fps on those steps you are obsessed with. Just countering ignorance with the truth.

          • haha. You’re right, AMD fx’s are awesome.

          • https://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference

            AMD CPUs get smoked. SMOOOOKED. Look at that crap: sub-60 fps, Intel at 83.

            You’re an idiot.

          • Sigh, did you even take a look at what the “reviewer” did there?
            He put the 9590 on a 970 board, which is a handicap, and the board he used does not even officially support that CPU.
            Steve has lost a lot of credibility in my eyes. Don’t blindly accept information people present to you; quite often they are 1. inept or 2. pushing an agenda. Eventually I’ll get Fallout 4 and run the benchmark for myself; I will be very surprised if I can’t come very close to the numbers the 4790K put up in that review, even given the fact that his graphics card is better than mine. Either way, when I do, I’ll post the results here. Do you happen to have the game?

          • “Sigh” is right… you continue to argue that AMD’s current CPUs are on the same level as Intel’s. You, sir… take the cake.

          • That's very contentious, because in some applications they are, and in others they are not. Generally, AMD's CPU performance supports the price points at which the chips are being sold. The problem I have is when people who don't know what they are talking about say things that aren't true, for whatever reason. I am fortunate enough to own several examples of CPUs from each company, so I have a good understanding of how they compare against each other. I share that when I think it would be helpful. They are what they are; I'd happily run anything people request on my FX machine for comparison's sake. The important thing to understand before buying your hardware is: will it do what I need it to do, at a level sufficient to make me a satisfied customer? If you need a screwdriver, don't waste your money buying a hammer. Related note: always be skeptical of numbers someone presents you… ALWAYS!

          • So what do you think my FX 8-core with the 780 Ti rig will get for FPS in Fallout 4 at the settings the fellow in that extremely questionable review used? I mean, he had the 9590 giving a lower minimum frame rate than one clocked 1.4 GHz slower. Common sense dictates that the benchmark was botched
            (most likely because he used a CPU that isn't supported by the motherboard… completely irresponsible). The kicker is that he went on to say “it stuttered like AMDs tend to do”, which is absolutely false; if he had actually looked at his data and been responsible about it, he would have known it was a mistake.
            Having run the 4790K and FX 8-cores side by side for most of a year, the 4790K locked at its turbo speed of 4.4 GHz and the FX locked at the 9590's turbo speed, in the overwhelming majority of games out there you won't be able to see a difference between them.

          • the human brain can't process anything over 65 FPS. So what difference does it make if an Intel does 83 FPS? You won't see it. Intel knows this but still pushes higher FPS to idiots who don't realize that.
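
For context on the numbers being thrown around in this sub-thread, frame rate converts directly to a per-frame time budget (t = 1000 / fps). A quick sketch of the arithmetic, just to put concrete figures on the 60-vs-83 FPS dispute above:

```python
def frame_time_ms(fps):
    """Duration of one frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

# 60 FPS -> ~16.7 ms per frame; 83 FPS -> ~12.0 ms per frame,
# i.e. a difference of roughly 4.6 ms per frame.
delta_ms = frame_time_ms(60) - frame_time_ms(83)
```

Whether a ~4.6 ms per-frame difference is perceptible is exactly what the commenters are arguing about; the arithmetic itself is neutral on that question.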

      • What games, that is something most people fail to mention.

      • I was on a fx-8350 for four years. Four years too long.

        That’s interesting, seeing as how the FX-8350 hasn’t even been out for three full years yet, let alone four.

      • How is that possible when the FX 8350 was released on October 2012? The CPU is not even 3 years old yet.

      • I own multiple rigs and technologies, including AMD gear (both CPU & GPU), and find their 8-core CPUs to be more than fine for high-performance gaming.

        • because you're likely GPU bound, not CPU bound. If you're gaming at 1080p, an i3/i5 would also have been fine for most games for most people.

  6. What a stupid comparison. AMD's FX chips suck. Why are they using a select few games that are not CPU bound to do this comparison?? It doesn't even show how bad AMD CPUs in fact are.

    • Get a life….

    • LOL, Intel got their HTT from AMD's SMT, which was in the Athlon 64. The Athlon 64 tore Intel apart from within. Zen is supposed to reintroduce SMT and come out with chips similar to Intel's, i.e. 18-core/36-thread server chips, 8-core/16-thread workstation chips, and 4-core/8-thread consumer chips. Don't put it past AMD to take over again. And if I remember correctly, didn't Intel just lose 2 lawsuits, one against AMD, and one because Intel stacked the data on the P4's synthetic benchmarks? You're just a hater, spewing nonsense. It's noticeable by the lack of knowledge that you repeatedly post!

      • I don't need a history lesson, and the Athlon 64 days were some 10 years
        ago. WTF is your fail ass point? We're talking about NOW, not hopefully
        some day soon. Try again 🙁
        Every single thing I've posted is a fact. And you come here babbling about the Athlon 64, hahahaha. A hater spewing nonsense?? AMD's current chips SUCK. It's a fact you need to face, moron.

        • It states it in my comment: the Zen core is supposed to return to SMT and leave CMT. No one is hating on Intel; I'm simply saying that AMD excelled at SMT back in the day, and beat Intel because of it.

          I'm running an FX 8370E, and I have no problems with it. You had a problem with your chip because you don't have any idea how computers work. I OC'd my chip; it scores right below the 4790K in ALL synthetic benchmarks.

          AMD's best chip right now sells for less than an i5, and outperforms it. Intel i5s are great for gaming, and so are FX chips. I don't know where you got that the FX can't play Skyrim; mine works great for it. My single-core score is 115 in Cinebench R15. Hell, even CPU-Z's new software offers a benchmark, in which the i7 4790K scores 10% better, or 600 points better, than my FX CPU. In multitasking the i7 4790K loses to my chip by 12%, or 1,100 points.

          You are the hater here, as no one here seems to have had such major issues as you have. We also didn't need to falsify our stories. We also knew that all FX chips were unlocked (you claimed certain 8350s weren't BE). Yes, RAM speeds and the latency resulting from them cause differences in the system.

          As stated before in comments, you have no clue about computers; you're just rude to anyone who disagrees that the money you spent on the Intel system was justifiable.

          I own 2 AMD-based computers, 2 Intel-based laptops, and 1 Intel-based server. I mean, we can talk about my consoles if you want also! I'm far from a fanboi, and anything but a hater. I have no problems with either of the manufacturers, outside of me trying to use my FX on a 760G chipset board (which I mentioned above).

          I don’t believe you owned an AMD system, and if you did, your configuration was trash!

          https://uploads.disquscdn.com/images/3fc847e6bfa55554c56cef3f2bed579b58eab292d4866fea86929ef623906011.png https://uploads.disquscdn.com/images/c18c395ba3a8d998ef9b3d8d474a59be75d4e3124b2692da900a257d0b87f003.png https://uploads.disquscdn.com/images/c9a0687f53682f8fab9e41c7ea1b6d5aa6188929b0e9e6991edc87699c7f69be.png https://uploads.disquscdn.com/images/bcdfdc9117a8b8b1d806e2e3c088b0cdb7821a3851ceaa014e06700e02589dec.png https://uploads.disquscdn.com/images/46f2e5a9a2cdeb8a8f98c0ca2f7a2139792032f85d7108707d79342ff0444ee4.png https://uploads.disquscdn.com/images/7ec6fc2c2a12dad025ce997bc8f73cbe8959fe9f4c0ce8530f22af9111ff02a1.png

          • The thing is that the FX chips fall between the i3 and i5 price range. Comparing a $160 FX chip to a $360 i7 chip isn't a price-based comparison at all. The i5 only beats the FX in single-threaded work, and the i3 barely beats it in single-threaded. With DX12 reducing API overhead, you can forget that the i5 will beat the FX chip in any game which uses DX12.

          • It will be 1-2 years before DX12 is mainstream anyway, and by then new CPUs will be out (hopefully something worth getting excited over from AMD); hoping something may be better in the future is not a good way to measure performance now. Single-threaded, the i3 and i5 can beat the FX in many tasks. Unless you can get something that can use 6+ cores properly, the Intel will almost always win in almost everything with multiple threads as well; just review most of the benchmarks and tests around. You can spend almost the same on an i5, generate less heat, use less power, use the stock cooler, and beat out the AMD 8-core, because the simple fact is that very, very few games can efficiently use more than even 4 cores.

          • Games on the XBOne are already using DX12 patches. 99.9% of games on the PC are ported from consoles. The PS4 and XBOne both use AMD 8-core APUs. Previous consoles used 3-core CPUs, and those titles were ported to PC. If you have Fire Strike in 3DMark, you will see how the API Overhead test plays out between AMD and Intel. Intel has had a rough time paired with Nvidia chips in DX12, while AMD hasn't (I figure it's because DX12 basically copied Mantle). Go ahead and buy yourself a dual-core chip (Pentium or i3), or even a quad-core chip (4690K), for gaming, because within 2-3 years you're going to be upgrading. It's as simple as future-proofing, which AMD has been doing with the advent of HBM, HSA, and the new memory in their Zen chips.

            “Unless you can get something that can use 6+ cores properly, the Intel will almost always win in almost everything”

            In video rendering, there is virtually no difference between the 8-thread chips (FX 8xxx and Intel i7 4xxx). Even though Cinebench shows an almost 200-point difference, real-world testing shows nothing more than a couple of seconds to a couple of minutes (nothing exceeding 5 minutes)!

        • BTW, you haven't posted any facts; you've flamed every chance you've gotten. Fact is, all FX 4000+ series have unlocked cores; you stated otherwise. Fact is, there is a difference between RAM speeds and latency; you stated there wasn't. Fact is, the FX 8350 came out in 2012; you stated 2011. Fact is, there is no more than a 10 FPS difference, even in Skyrim, between Intel and AMD CPUs; you stated a minimum of 20. Fact is, you don't seem to understand anything you're going on about.

          On the other hand, I've posted benchmarks, resources on RAM speeds and timings, and manufacturers' websites. You've lipped off in every comment you've posted, yet you have posted no references of your own. Your opinion does not equal facts! Got that yet?

  7. I knew the results as soon as I clicked this

    Also, isn't this pretty useless? Graphics cards are going to do most of the work; these high-end CPUs are for actual processing work. They aren't designed to display fancy graphics, that's what other components are for :s A “gaming” CPU is something mid-range, just enough so it can handle everything decently. The GPUs handle the rest.

  8. Nice article. I own 4790K and FX 8-core systems, and having done a lot of testing, my results are very similar to yours. I've used the 290X Lightning in both, and as your article seems to demonstrate, at 1080p resolution and above with high graphics settings, there really isn't much difference at all in the way they perform for gaming. Depending on personal preferences in games, resolutions, and graphics settings, many people would be better off spending their money on a GTX 980/FX 8-core system than on a lesser video card paired with an i7.

    • Or you just go with the cheapest i5 and a semi-decent H97 mobo. It will cost about the same as an AMD FX-8, but will kick ass when single-thread performance is most important (i.e. lots of games, still).

      Also, with Intel you're not buying ancient stuff. AM3+ is 4 years old now.

      • Even though I'm sad that AMD can hardly keep up with Intel (that means higher prices for us), I'd still have to be objective and go with Intel, unfortunately. AMD CPUs are outdated the second they get out of the factory; they have a years-old architecture, and if you're buying a new chip you don't want to buy something that's already outdated. Besides, the advantage Intel has is that their CPUs are good enough for years and years after you buy them. I still have an old E8600 C2D and it freaking runs all games; the only game I couldn't get to run was Far Cry 4, because it's made for 4-core CPUs. It can even run freaking Star Citizen, and it's a 5- or 6-year-old CPU. That's what Intel does. Meanwhile my other AMD PC is pretty much dead, and it's way, waaay newer than that prehistoric E8600.

        • My Phenom II 955 is still running close to everything quite smoothly, even in Full HD, together with a 6590 and 8 gigs of 1600 RAM. It handles Far Cry 4 easily. And this CPU is, I guess, 7 years old now and its performance is awesome. No need for me to upgrade till Zen.

          • Phenoms are phenomenal. I love my 965 BE, slightly overclocked to 3.6.
            Mine is getting to be 6 years old. Loved the 5870 card with it. It is still the best system I have ever had.

        • Using that same logic you might also say that all the i5s, i3s, and i7s are also dead and outdated since Skylake is here. Anyway, with Windows 10 finally putting to work every core a CPU throws at it, and DX12, great things are to come for the FX AND the i7s.

        • Oh come on, those are talking points. Company X is outdated but company Y's old crap is just great… the gear either works or it doesn't. Some marketing speak about being out of date is meaningless at best.

          • I was thinking the same thing. What does it matter if it’s “outdated” if it fulfils your requirements? Is your 6 year old TV or monitor considered outdated? What about your HDD? They last a relatively long time. How about older video game consoles that people love to play like an NES? The only reason you should care about a component being outdated is when it is not meeting your needs.

          • Agreed. I have friends that only play Battlefield 4 and love their AMD 83x0s, and don't care that the Intel may be 5-10% faster in multiplayer; it wasn't worth the extra $100 for an i5.

          • Then they could have bought an i3 🙂 and got the same performance; it runs cooler and uses less power…

          • No way, you're crazy. The FX8350 is way more powerful than an i3; the FX8350 is as powerful as or more so than the Intel i5-4590, as that is the minimum requirement for the Oculus (you need an Intel i5-4590 or equivalent CPU), and my CPU passes as Oculus-ready. Also, in the coming years when Vulkan, DirectX 12, and Mantle become the standard, you'll be glad you have 8 cores and not 2, as you will see huge benefits and massive improvements from multi-core usage. And I have news for you: DX12 is the future, and Intel is going to need to step up its game, and soon.

          • Especially since DirectX 12, Vulkan, and Mantle all favor multicore processors, your $175 FX8350 will be on the bleeding edge compared to Intel's weaker 4-core processors; in the coming years things will change dramatically.

        • So when DX12 is utilized, where does that leave the dual cores, quad cores, and the quad core with HTT (which is already horrid for gaming)? As we see in early DX12 testing, AMD benefits more than Intel, and even Nvidia.

          Which AMD machine do you have? I mean, I can still game on my old 2GHz dual-core Opteron, and even my quad-core Athlon II X4 640. Maybe it's not the PC, but rather a user who doesn't know how to update the BIOS.

        • I have been with AMD since 1997. I will buy an i7-4790 Haswell or an i5-6600K Skylake in the next few weeks. AMD was good for me. I am sorry baby, we have drifted apart, we have new interests, and you cannot keep up with me. And you get too hot and are too loud all the time.

          • Try the FX8350 with a Cooler Master 212 Evo; you'll be glad you did when DX12 becomes the standard. The Cooler Master 212 Evo is an awesome heatsink and fan, and your system will be whisper quiet.
            (Warning: do not use this cooler with any CPU with a TDP of more than 180 watts. For CPUs above 180 watts you will need a liquid cooling solution instead.) Also, the 8370 is a waste of money; it is basically an overclocked 8350, which you can do yourself.
            https://www.newegg.com/Product/Product.aspx?Item=N82E16835103099

          • Chouwakun Tanthawa

            Yeah, my Phenom II with a 212X never goes over 45°C, even in summer in Thailand without air conditioning.

        • Chouwakun Tanthawa

          My Phenom II X6 1075T is still kicking ass in all games. DX12 and Vulkan are freaking awesome. How old is my CPU? It's 7, almost 8 years old. All my other components have died: the graphics card twice (a GeForce and a 6970), the PSU 3 times, 1 monitor, 2 HDDs, and 1 DVD-ROM, but my RAM and CPU haven't died, and they will still run much longer than I expect.

  9. Good review! It's already a given that at a high resolution like 4K, the CPU will matter less. Your final conclusion was that if the user decides to get a 980 Ti and SLI it, you'd go for an Intel build? TweakTown did an article about that last year where they compared an FX-8350 and an i7-4930K paired with 2 980s in SLI at 4K. Although it's not a Ti, we should see a similar pattern wherein games at that resolution will be GPU bound no matter what modern quad-core or above CPU you pair with them.

    https://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html

  10. The DX12 benchmarks are where?

    The 3DMark API Overhead benchmark is available, as is Star Swarm, and Brad Wardell's Ashes of the Singularity has released DX12 benchmarks too.

    If you really want to show performance that is over-the-top, then add the 3DMark API Overhead feature test and Star Swarm. AMD just crushes Intel/Nvidia due to Asynchronous Shader Pipelines and Asynchronous Compute Engines.

  11. Dude, I think that at 4K, even with 2 980 Tis, the FX still wins…

    • 4K is mainly GPU intensive. When you combine GPU-intensive games with a GPU-intensive resolution, it's more likely to be the same. That was the point of this. They're not saying, “Let's play poorly optimized games or heavily CPU-intensive games!” On DX11 that's where AMD would likely fail. The point of this was to make people understand that there are areas where AMD processors are still good.

      • Yes, but they did it in a fanboy way, to try and say OMG, AMD can hold up to a $1k Intel CPU..

        When really they could also have shown that an Intel dual core or quad core (i3/i5) in the same price range could have given the same results vs an 8-core AMD.

  12. I get that AMD can't pump out new CPUs all the time, and in terms of gaming, CPUs are no longer as important. Heck, I run games on my old Phenom II 965 and it's sporting its third vidcard, now a Radeon 290, just fine for 1080p gaming. But they could at least give us a die shrink. Wouldn't they save money by doing that anyway? The new 14 nm Intel CPUs are tiny.

  13. Wish this was i7-5960X vs. FX-9590.

  14. You guys made sure to use only GPU-bound scenarios by using subpar GPUs for any given resolution (nobody runs Ultra at 1080p on a 960, neither does anyone play Ultra at 1440p on a 970, neither does anyone play 4K on dual 970 3.5GB SLI, cuz it's silly and means you're limited). Because of this manipulation, your tests are not benchmarking the CPU in any heavy tasks. Try running a GTX 980 at 1080p in games like GTA V and we'll see if the FX doesn't bottleneck you.

    Besides, your FX PLATFORM (motherboard, cooling, and power supply are more expensive) costs the same as a locked i5 Haswell setup (which can run on a cheap motherboard without throttling, with no need for 3rd-party cooling), which, as I said, will be superior in performance in CPU-bound situations.

    Except, with all that said, you benchmarked the FX against a CPU setup that costs way more than $1000 on the Intel side of things. Why? Unrealistic scenario. A classic locked i5 outperforms FX CPUs in games easily, so why test the FX against an extreme-grade CPU setup?

    I know why. So that you can make a point: “Yes, Intel is 10% faster in all these GPU-bound scenarios we made, but it costs $1500.”

    Bullshit article, be ashamed of yourselves.

    • The FX processors do not throttle on cheap motherboards as long as you put a fan over the VRMs (if they have no heatsink on the VRMs, which is what I think you're talking about). Or, in my case, just install a 212 Evo on the CPU, then use the stock fan to blow air over the VRMs. Use SpeedFan to create a fan profile, done, 50 bucks saved that are going to the GPU.

      • If you are on a 760G chipset, the FX chips throttle on it; I couldn't even get my 8370E stable at 3.9GHz. Bought me a new mobo with a 990FX chipset from Asus, and pow, I am stable at 4.5GHz with a vcore of 1.3V. I could probably hit 5.2GHz if I OC my RAM and buses.

        • I actually have an M5A97 LE R2.0 motherboard and an FX-8350. When I installed the 212 EVO on my CPU, it kept throttling at 100% load (yes, FX chips do throttle on cheap motherboards, but if anything that's more of a motherboard flaw than a CPU flaw), even though the CPU temperature was more than ideal. One day I decided to install the original fan that came with the stock heatsink over the VRMs, created a fan profile in SpeedFan, and boom, now it's stable in Prime95… So yes, cheap motherboards do get throttled by FX chips, but the fix is very, very easy. BTW, if running a heavy overclock, I've heard that installing a fan over the VRMs, even if they have a heatsink, will help a lot in removing heat and improving the motherboard's health.

          • Yeah, I'm on an M5A99FX R2.0, but I honestly must say it runs really well with my FX 8370E. I have no problems with this board; I'm OC'd to 4.5GHz. The closest thing I have to a VRM fan is a Kingston RAM cooler that has 2 fans running at 3500 RPM. Other than that, I have no fans around the VRMs, as I'm running my CPU radiator at the front intake, and I also have another 120mm intake fan and one 120mm exhaust. I did have major issues on the 760G chipset, though, but only when it came to overclocking; otherwise my chip ran perfectly fine up to 3.8GHz, but over that was no good at all.

    • Nearly any gaming scenario is going to be GPU bound; that's kinda the point. Spending extra on Intel for gaming doesn't make much sense, because you are spending extra in a GPU-bound scenario where your money is wasted.

      • So go buy an i3 or i5, as it would be more than sufficient for most gamers these days… and again, you save on power/cooling and get more performance per clock.

  15. Although I didn't do a lot of research, it seemed that the AMD chip used in this test was a thousand dollars cheaper. Am I missing something? Because if that is the case, there is zero reason to buy the Intel unless you have no problem flushing money down the toilet. I hope I screwed something up.

    • It is a bad comparison; they did it solely to compare 8 cores vs 8 cores, when an i3/i5, for most single-GPU configs, would have given the same results vs the AMD 8-core.

  16. I hope that this brings the Intel fanboys down to earth…

  17. Stupid question but I couldn’t find it in the article. Is this using Windows 10? 8.1?

    I have heard that Win10 might have better scaling for multicore, but I couldn't see where the OS was listed in the review 😛 Do you think the OS is a major factor in how close this race is?

    Regardless this article made me feel great again for picking up my FX-8150 on release day. I knew it wasn’t as terrible as the internet wanted me to think 🙂

  18. You just compared AMD's 4-year-old-architecture, low-budget, 8-semi-core, cheap gaming CPU to the 16-thread extreme enthusiast workstation monster from Intel in a GPU-bound scenario. And you found they are pretty much the same.

    That's like “benchmarking” a 1965 GMC pickup vs a Porsche 911 on a 50-mile-per-hour restricted road and finding out they both run at approx. 50 miles per hour.

    And you do it a year too late. Seriously? Have you heard of Skylake?

    • Uh, that’s sort of the point, mate. Games are primarily GPU bound tasks; if a game is CPU bound there’s something wrong.

      This isn’t a CPU review. If it was I’d be missing a lot of data on other workloads besides gaming. This article is meant to illustrate that when it comes to gaming, any modern quad-core or better CPU will be just fine for the vast majority of use cases.

      I don't think it matters that Skylake exists. It isn't like it is some sort of end-all-be-all for gaming. It's a few percent better or dead even in most cases that I've seen (haven't had the opportunity to test it myself yet). Still, lots of folks are looking to get into PC gaming on low budgets and are told that AMD can't play games as well as Intel, which really isn't the whole truth.

      Some people want to do benchmarks, most want to play games.

      • It kind of is a CPU review, as it clearly states in the title, and in that case they should have shown the i3/i5 as well, as those would show almost the same results in some cases, thus showing that for less than or the same price as the $200 AMD, you could go Intel and get more performance per clock, less power, and less heat…

  19. Cool read. I have always been told Intel is better, but AMD is cheaper and still has solid performance. I'm a gamer on a budget. It's the same exact reason I don't do iPhones. Why pay more for something when you can get the same thing for a cheaper price?

  20. I just prefer AMD over Intel price-wise, and yes, I put more emphasis on the vidcard. I have a 6350 and an Nvidia 960 4GB G1 Gaming card, and it runs my games just fine. No need to spend $300+ on a CPU from Intel when I got my AMD 6350 for just under $120.

  21. There are so many flaws in this review it's really not funny.
    1. Different overclocks. Now people will say not enough to matter, but the fact remains that the AMD chip is clocked higher, and that does impact results.
    2. For those arguing other components: RAM does NOT make more than a 1-3 FPS difference over 1866MHz.
    3. These are NOT high-end GPUs, LOL. They're mid-range GPUs and are also not capable of fully using the abilities of an Intel chip.
    4. Comparing a workstation chip to a gaming chip. Smart move there, lol.
    5. It is a well-known FACT that Intel destroys AMD in single-core performance, which still accounts for roughly 95% of games on the market nowadays.
    6. It is a known and proven fact that DDR3 2133MHz RAM will beat DDR4 2400MHz RAM in gaming benchmarks right now, so there is yet another fact skewed in favor of the AMD setup, lol.
    7. I would bet money that they have wound the X99 system back to PCIe 2.0, since it is well known that the 990FX boards still are not able to run PCIe 3.0 capably, which means yet another gimping of the X99 setup.

    I am no fanboy. These are simply observations of facts that I saw in this article. Seems to me that the article was written with the express purpose of making the AMD setup look to be much better value than it really is.

    Then there's the fact that they managed to somehow show the AMD system holding superior results @ 4K, when every single other benchmark out there that I have seen and read shows the exact opposite.

    It is also a well-known fact that, yes, the AMD chips do bottleneck SLI setups, due to many factors, but mostly the fact that they don't have the cache and capacity to handle the GPU bandwidth needed to utilise them fully. If you question that, then feel free to look at 3DMark one of these days and see just how big a difference there really is between 2 matched systems using AMD and Intel chips. I promise you will see a noticeable difference, lol.

    Now, back to earlier points. RAM is pretty much irrelevant in gaming, short of a very minimal gain from higher MHz. So to those arguing that it can make a big difference: sorry, but you need to educate yourselves, lol. As of right now, an i5 will beat an 8350 in almost every gaming benchmark, an i3 will even beat it in certain games, and yes, even a G3258 will beat it in a few select games as well. So I cannot for one second understand how anyone in their right mind can stand up and make claims like this from an unprofessional and twisted review like this and still expect anyone knowledgeable to take them even slightly seriously, lol.

    • 970s in SLI.
      -Apparently not high end for Mr. full-of-shit Alan Clancy. “Not capable of fully using the abilities of an Intel chip,” he expressed.

      I'd like to see your setup, please, if you think 970s in SLI are not high end.

    • Alan, it doesn’t matter how YOU view the purpose of the 5960x. The simple fact is that so many boutique vendors are selling “Gaming PCs” with the 5960x that this article needed to happen to show people they’re wasting money.

    • First off, the Intel Celeron G3258 will bottleneck a GTX 970; buy one and try it.

      As far as the i5 beating the AMD chip goes, that's no more than 5 FPS, nothing noticeable.

      The throughput of DDR4 2400MHz is 60,000 MB/s, compared to the throughput of DDR3 1866 at 27,000 MB/s, which relates to an almost 0.5-second difference in latency.
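
For readers checking these bandwidth figures: theoretical peak DRAM bandwidth is just transfer rate × bus width × channel count, so claims like the ones in this thread are easy to sanity-check yourself. A minimal sketch (the channel counts are assumptions: dual-channel for a typical AM3+ board, quad-channel for X99):

```python
def peak_bandwidth_mb_s(transfer_rate_mt_s, bus_width_bits=64, channels=1):
    """Theoretical peak = transfers/sec * bytes per transfer * channels."""
    return transfer_rate_mt_s * (bus_width_bits // 8) * channels

ddr3_dual = peak_bandwidth_mb_s(1866, channels=2)  # 29,856 MB/s
ddr4_quad = peak_bandwidth_mb_s(2400, channels=4)  # 76,800 MB/s
```

Real sustained throughput lands below these theoretical peaks, and latency (measured in nanoseconds) is a separate question from bandwidth entirely.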

      All socket 2011 use PCIE 3.0, so your comment about them downplaying it to PCIE 2.0 is irrelevant!

      Overclocking has no bearing on IPC, which we all know Intel wins. As far as the OC being different goes, the Intel chip also has 8 more threads, so don't disregard the tests based on OC alone; they are different architectures, one with SMT and the other with CMT!

      This is also not my opinion, rather an observation that you’re just pissed about a $170 cpu hanging with a $1000 cpu.

      As far as graphics go, why would you test a Titan X GPU in a gaming environment? The Titan X is for rendering, as is the 5960X, which makes sense as to why the FX chip holds its own.

      • So why not show those i3/i5 results then, to compare how a dual core + HT / quad core can beat an 8-core AMD chip or perform the same in this scenario..

        Also, sure, DDR4 has more throughput, but what can actually use it? AnandTech did a large review of DDR3 speeds and found anything over 1600 useless; unless you run benchmarks all day long, you won't see any difference.

        • Show me the i3 benches that hang with the FX; I'll be waiting. As far as the i5 goes, you're absolutely correct that they get a few FPS more; no one is denying that. BTW, there are programs that specifically turn HTT off in the i7s and up, simply for gaming performance and stability.

          The extra throughput matters at higher resolutions (4K), and it's also usable in CAD editing programs. In the same way, no more than 4-6GB of RAM is used in games, while up to 128GB is used in editing (this difference in throughput is also why Intel's X-edition CPUs are monsters in rendering). Did you realize that the Intel 5960X and the AMD FX 8370 have roughly the same single-thread capabilities?

          I myself noticed a difference between 1600MHz and 2133MHz, not only in benchmarks, but also in gaming and rendering.

  22. The Intel CPU is definitely the better of the two; however, at over 5 times the price of its red team counterpart, I find it hard to justify spending over $1000 US on the Intel, especially if your primary use is gaming.

    To me, a much more useful comparison would be between the AMD in this article and something like the i5 4690k which are both roughly the same price.

    • It is a bad comparison; anyone who owns an 8-core AMD would not buy a $1k Intel CPU anyway, so it is an apples-to-oranges comparison really.

      As you said, compare it to an i3/i5 and see almost the same results.

  23. Juhász Roland Márió

    Oh good lord, where am I? What is going on? I just read nearly all of the wall of text that's in the comments section, all of those “Intel is good, AMD suxx”… well, let me tell you, I'm still rocking an AMD FX-4170 with 2x 4GB DDR3 1866MHz and a Radeon 6870, and all of this shit is old as fuck… and guess what? I can run GTA V at 1080p in DX10 mode with everything set to high, with half vsync, for a stable 30 FPS. Shiiiiit, it must be so bad for me… and yes, my SATA II HDD holds the performance back, as sometimes it freezes for a few seconds while my HDD is showing massive amounts of reads, just like in Watch Dogs, same shit. But who am I to compete with my first-world problems, when you guys can't hit a stable 60 or 120 FPS. Don't get me wrong, I too would like to play GTA V at 60 FPS, but I'm being realistic and don't expect miracles from outdated hardware ^^
    PS: the AMD FX-4170 was nearly free back when I bought it, and completely worth it IMO.

  24. When you're using an i7 5960X for high-end gaming… in which world are 2 970s in SLI only high end? There are much better graphics cards out there, like the Titan X or at minimum the 980!

  25. Makes me glad I stick with team Red. Intel is better, but for the price-to-performance ratio I will stick with AMD. Flame on.

    • You should stick with whatever gives you the best performance for your money, and in many cases an i3 will beat those 8-core AMDs in most gaming and day-to-day use…

  26. In gaming the CPU doesn't matter too much. I have an AMD FX-8320 at 4GHz and I have no problem with CPU usage. The GPU matters much more, so I spend less money on an AMD CPU and more on a stronger GPU if I want to play games. And for gaming, an Intel i5 is enough from Intel. No game will make full use of an i7; it's just a waste of money. The i7 is for rendering and 3D modelling, as its price shows.

  27. So all this is saying is the games are GPU limited. If someone is buying the Intel chip for gaming at that high price, they are going to be spending more on a GPU too, probably GTX 980 or 980ti or Fury X (or whatever AMD’s newest flagship is called).

  28. Good review but how about power consumption too?

  29. I am angered by this review. Not because I have any sort of stake in the Intel vs AMD war, but because this article is exceedingly environmentally oblivious.

    Running machines like this at their full potential draws as much power as everything in a typical household aside from the cooling/heating. We're talking 1500 watts or more. The result is needless strain on the power grid, resulting in increased environmental risks and degradation. I'm not altogether against 8-core gaming, but to mention it without recommending solar panels in the same breath is unethical. These machines also stand to ultimately weaken the games industry as a whole, because faced with these rising power draws, power companies will begin pro-rating their customers to fleece the high-consumption gamers. Of course this will make solar panels look much more competitive, at the same time as commerce orgs like the EU begin tightening the screws to make them look even more attractive… I see no justification for not being solidly against 8-core gaming (my opinion: it's the definition of excess).

    • Would have to point out that I run an eight-core CPU with a 270X, both overclocked, and I don't pull anywhere near 1500W at full load. In fact it only pulls between 500 and 600W at 100% resource usage, so I'd love to hear where you are pulling your 1500W number from. Before you say something along the lines of "you don't have 2 GPUs in your PC": that would only raise it to 800-1000W at 100%, which is still far below your 1500W.

    • You are a moron. I AM NOW DUMBER THAN I WAS BEFORE READING YOUR POST. Fucking idiot!

    • Stop charging your cell phone and other mobile devices then… they are among the larger power-consuming devices these days, with everyone owning one.

      Also stop buying massive TVs, and make sure you have all LED lights in your house and efficient appliances and heating devices.

      Oh, and if you have a car, you should stop driving that too, as it does far more damage than my computer does.

    • Wow, just wow… lol, I'm wondering if this is some form of sarcasm or something, it's that stupid!!!
      1500W? Did a flying unicorn that farts rainbows give you that figure? lol
      An FX-9590 and a 290X don't even pull 600W while gaming. To pull 1500W you'd need 4x 290X plus an FX-9590, and even then you'd only just be approaching 1500W!
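To put rough numbers on the power-draw dispute above, here is a small sanity-check sketch. The component wattages are ballpark published TDP/board-power figures assumed for illustration, not measurements from this thread:

```python
# Rough wall-draw estimate from published TDP/board-power figures.
# All wattages below are ballpark assumptions, not measured values.
PARTS = {
    "FX-9590 CPU": 220,        # AMD's rated TDP
    "R9 290X GPU": 290,        # typical board power under load
    "motherboard + RAM": 60,
    "drives + fans": 40,
}

def estimated_draw(parts, psu_efficiency=0.90):
    """Component draw divided by PSU efficiency approximates draw at the wall."""
    return sum(parts.values()) / psu_efficiency

single_gpu = estimated_draw(PARTS)
quad_gpu = estimated_draw({**PARTS, "3 extra 290X GPUs": 3 * 290})

print(f"single-GPU rig: ~{single_gpu:.0f} W at the wall")
print(f"quad-GPU rig:   ~{quad_gpu:.0f} W at the wall")
```

Under these assumptions even a flagship single-GPU rig lands well under 700W at the wall, broadly in line with the 500-600W measurements reported in the replies; getting anywhere near 1500W takes a multi-GPU setup.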

  30. Why did you only compare GPU limited tests? There is nothing that stresses the CPU at all. You could have tested an Intel i5-2500k from 4 years ago and gotten nearly the same results.

    • Because there are about 2 million benchmark tests already. How many have you seen that point this out? If you play these games it doesn't really matter. Play any of the popular games on the list (Counter-Strike: Global Offensive, Dota 2, Tomb Raider, Crysis 3, The Witcher 3, Grand Theft Auto V and Project CARS) and either chip will get you there.

      People see the benchmark tests and just believe Intel is better at everything; this test shows that even the Intel benchmark wins mean nothing in these games.

  31. Actually, AMD seems to crash a lot in my 3D software… and that heat, omg!

  32. What OS did you use?

  33. Thank you for taking the time Donny. This is a good read, and hopefully readers will take the time to appreciate everything you are saying before bashing one side or the other. (We need both teams as consumers 🙂 I never understand why this isn't common sense.)

    Also keep in mind, AMD’s 990FX is still running at GEN2 PCI-e speeds, where Intel is on GEN3.

    • Unless you're running multiple high-end GPUs (not mid-range), you won't see any performance difference; this has been known since PCIe 3.0's release.

  34. Compare an FX-6300 to an Intel i7 4790K: the Haswell will win, but the performance difference isn't very big with all else constant. AMD still offers better value despite being outdated, and with DX12 I foresee the FX series staying relevant for a few more years to come.

  35. This is bad.

    1. Does core count matter so much that you have to compare 8 cores to 8 cores? What about comparing 8 threads to 8 threads? AMD's module design does not contain 8 entirely independent cores; it's more like a more advanced form of Hyper-Threading implemented at the hardware level.

    2. In a similar vein, the i7-5960X has 16 threads vs the FX-8370’s 8.

    3. What does thread count even matter when even a 4 threaded Core i5 – wait, even an OLDER Core i5 such as Ivy Bridge – can beat out the FX-8370? In something like Shogun 2, the old Core i5 beats AMD’s finest.

    4. It gets worse. What about games such as Skyrim or Battlefield 3? The Core i3, and even lowly Pentium G2120 beat the FX-8370! That’s a dual core processor beating AMD’s finest.

    5. The 5960X is part of the HEDT platform and is not designed to go head to head with, well, ANYTHING from AMD. Z170 or even Z97 are better options for people running single graphics cards; try a quad-SLI setup on the 5960X and FX-8370 and watch the Intel pull far ahead.

    6. The AMD platform is outdated, yet that always gets ignored. The 990FX supports the same technologies as the 790FX, and by the same token is missing support for the same newer standards. AMD's most advanced chipset still doesn't support USB 3.0, PCI-Express 3.0, SATA 6Gbps…

    Either the reviewer doesn’t know what he’s doing or there’s an agenda being pushed here.

    • FX isn't AMD's finest and hasn't been for a while, and most tests stack against the AMD solutions. FX is a legacy line for overclockers who know how to boost an FX's overall performance.

      • Is that so? What do you think AMD's finest is? Godavari, AKA the A10-7870K? It trades blows with the Pentium G3258. You DO know that the FX brand was relaunched in 2011, right? And that I'm not talking about the K8-based FX of the early-to-mid 2000s, right? AMD doesn't have an SKU above the FX, which will remain their fastest CPU until Zen arrives.

    • “AMD’s most advanced chipset still doesn’t support USB 3.0, PCI-Express 3.0, SATA 6Gbps…”

      The 990FX does support USB 3.0 and SATA 6Gbps through the AMD chipset…

  36. Hi, that's a great article! Thank you for your work and determination!
    I'm excited to read your other stuff.

  37. Just what I've been telling people for years: if your main purpose in building a new PC is gaming, then AMD can hold its own against over-priced, over-hyped Intel any day. You'd be better off spending your money on a good GPU. I'm so sick and tired of hearing Intel fanboys who don't know what they're talking about bragging about some over-priced crap from Intel. Get a life, and then get some "REAL" knowledge before you talk and confirm to the world just how unknowledgeable you are about computer technology.

    • Well said. I can hear Intel lovers getting pissed…. LMFAO. I am a fan of both companies, but as far as “better bang for the buck”, AMD wins ALL DAY LONG. Also, based on early DX12 results, AMD FX chips and overclocked ones are doing quite well, surprising everyone. Nice to see good competition among the two. (Intel & AMD)

      • Does it really? Compare those AMD 8-cores to i3s/i5s and it doesn't look like you're getting all that much bang for the buck anymore…

        I would love to see the Athlon XP days relived. I want AMD to get off its butt and innovate again and give us some competition… I loved my AMD 6-core rig… but there is little reason to get AMD these days…

        There is no competition between AMD and Intel; facts and tests all show Intel ahead (not including GPU-limited scenarios).

        • LOL, yes, the API overhead situation is quite different between the i3/i5 and FX 8xxx chips.

          The point of the DX12 API is to utilize more cores, more than just the 3 that previous-generation console ports targeted. Hyper-Threading is a poor substitute for real cores in gaming, so the i3 will have massive problems, while the i5 will fare a little better, but a lack of hardware resources under heavy API loads will hold that chip back too.

        • Intel is indeed ahead on technology, but when it comes to working with open-source software, AMD is the way to go. Only the newest AMD processors (not Intel's) are supported by open boot-loaders, and they offer better performance on open-source drivers.

        • Bringing i3s up at this point is useless and worthless.
          i3s are not deals in this day and age… you're looking at real heartache if you pick up anything less than an i5 or an FX 6/8 in 2015. The fact that you all are still trying to spout this i3 nonsense is laughable at best. Yes, go tell everyone to save their ducats and get an i3 over an AMD so they can take the gamble of being locked out of a game altogether because they aren't running a quad core. I love team AMD… I like Intel… but to sit here and mention i3s like they aren't being locked out of newer titles isn't cool.

        • I know, I had the chance to upgrade to an i7 w motherboard swap. I took it and love it!

    • Do you even see what you're writing?

    • Over-priced? Why would someone who owns AMD even consider a $1k Intel CPU? It's apples to oranges. I like Intel, but even I would not buy a $1k Intel CPU for gaming; that is not what that CPU is marketed for anyway, whereas AMD marketed their 8-core CPUs as "gaming CPUs" when maybe 1 or 2 games can really use 8 cores efficiently.

      Now let's look at more realistic benchmarks, where Intel quads beat the AMD 8-cores… and even a dual core can beat AMD or perform the same for less:

      https://www.anandtech.com/show/8316/amds-5-ghz-turbo-cpu-in-retail-the-fx9590-and-asrock-990fx-extreme9-review/8

      https://www.xbitlabs.com/articles/cpu/display/amd-fx-9590-9370_5.html#sect0

      https://techreport.com/review/26977/intel-core-i7-5960x-processor-reviewed/5

      https://www.techspot.com/review/875-intel-core-i7-5960x-haswell-e/page9.html

      https://www.bit-tech.net/hardware/2014/08/29/intel-core-i7-5960x-review/10

      • “I like Intel, but even i would not buy a $1k Intel CPU for gaming, that is not what that CPU is marketed for anyways”

        Hm, yeah it was; it was actually marketed squarely at gamers by Intel… Here is how Intel promotes the CPU on their website:

        https://www.intel.com/content/www/us/en/processors/core/core-i7ee-processor.html?wapkw=5960x

        Intel® Core™ i7 Processor Extreme Edition and Intel® High End Desktop Processors

        Dominate your gaming competition with an unlocked, unleashed, and uncompromised processor.1 With a killer combination of smart features including Intel® Turbo Boost Technology 2.0,2 Intel® Hyper-Threading Technology,3 and overclocking, the Intel® Core™ i7 processor Extreme Edition is flexible and devastatingly powerful. The lineup now include expanded six-core processor offerings, as well as Intel’s first desktop processor with eight dedicated cores for a flexible and devastatingly powerful experience. It’s the ultimate weapon when you demand unrivaled PC performance for gaming at its best—because the only thing more amazing than Intel® technology is what you’ll do with it.

  38. You should do some sort of editing tests… gaming seems silly for a 5960X. I never use my X99 rig for gaming.

  39. Nah, Intel works great for me, AMD is slow and laggy. Weird benchmark; the opposite happens to me.

  40. I fucking dare you to test it at anything other than 4K where all the overhead is on the GPU and not the CPU.

    1080p will show a bottleneck. Also, the 5960X has slightly worse IPC than, say, a mainstream Haswell chip would, so why not test it properly? This review is stupid imo, sorry.

  41. Great post! And I agree with the fact that Intel CPUs are faster to a degree. However, I find the whole argument about debating speed to be very monotonous, not in terms of the facts but in terms of the way 'fanboys' try to smash their opinions into others, and I thank you for not taking that stance in this post. I grew up with Intel CPUs back in the 90's and learned, even back then, that they were highly acclaimed for their speed. Now, as tech evolves, it evolves at such an incredible pace that sometimes the debate seems irrelevant. This feels like a revival of the GHz wars all over again.

    I recently started DIY PC building, and yes, I chose an AMD CPU (FX-8350); despite being 'slower' it has more than enough speed to handle most modern applications plus multitasking. Paired with a high-end GPU and an SSD, I find load times in programs like Windows, TurboCAD and Photoshop are incredibly short. I'm not a full-on gamer, but as a hobbyist inventor and photographer who spends more time making 3D CAD files and processing RAW-format photos, any modern-age CPU will last the long run and be able to handle most applications (not just gaming) for many years to come. I picked an 8-core CPU because, like most say, it's for multitasking, and I do more than just gaming, so it makes sense to me. Yes, the hyper-threaded i7 CPUs by Intel may be faster, but it's by such a small margin it wouldn't make sense to pay $1000 when you can get an 8-core chip for about 2/10ths of the price. It's sad to see a great company struggling with their profits this year, but that's why they're moving into the console arena.

    My only real beef with AMD is not CPUs but motherboards. Intel always gets the new stuff (SATA Express, USB 3.1, Type-C, etc.) while AMD is left in the dust. Now, the 'basic' stuff gets me by just fine, but it would be a huge help if motherboard manufacturers would give team red a little more leverage in terms of add-ons (even though I consider most of them just that: non-essential components). If you want a basic, fast and efficient PC, go with AMD; if you want to future-proof and get all the new tech and toys (knowing it's going to cost more), go with Intel. My personal experience.

    • “My only real buff with AMD is not CPUS, but motherboards. Intel always
      gets the new stuff (SATA express, USB 3.1, type c, etc.) while AMD is
      left in the dust.”

      There are two major reasons behind this. First, AMD uses one socket across multiple CPU generations, while Intel changes sockets every time they upgrade their CPUs (which is a major reason for the price gouging)! Second, AMD still uses a separate northbridge and southbridge, while with Intel the bulk of the controllers are integrated into the CPU (which brings us back to the first point)!

    • Thank you for this post. It seems like real-world users like us are few and far between.

  42. So many butthurt Intel fans in the comments below…

  43. Just listen to all you fuds. So-called "discussions", as in arguments like this, don't help encourage people to come to PC gaming and instead push them towards consoles. Seriously, who gives a sh*t what rig you are running? I don't care what anyone has; I only care about what I have, or more importantly can afford. Beginners need to start somewhere, so calling out people who buy AMD isn't helping the cause at all.
    Now listen here, you fools: I defy ANYONE to stand in front of a monitor and tell me they can spot the difference between 60 FPS and 70 FPS in Battlefield 4 on a standard 24-inch backlit LED monitor. If you say you can, you are lying, and you're a p*ssy for doing so.
    You can have the best f*cking rig in the world, hell, you could have Deep Blue stuffed in your mom's basement, but unless you have bionic retinas you'll still only see detail to the limitations of your eyes.
    For real, who monitors their FPS performance stats when playing a game? Oh yeah, I forgot, half of you saddos, obviously.
    Get a life and just encourage people to game on PCs, no matter what their choice of hardware. If they stick with it they can, and most likely will, get the bug and upgrade sooner rather than later.
    And then they can start to talk like all you w*nkers.
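On the 60-vs-70-FPS point above, the gap is easier to judge in frame times than in frame rates. A quick sketch of the arithmetic (nothing here is measured; it is just the reciprocal relationship between FPS and per-frame time):

```python
def frame_time_ms(fps):
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 70, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

# The 60-to-70 FPS gap works out to only about 2.4 ms per frame.
gap = frame_time_ms(60) - frame_time_ms(70)
print(f"60 vs 70 FPS difference: {gap:.2f} ms per frame")
```

Whether a ~2.4ms per-frame difference is perceptible is exactly the argument the comment is making.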

  44. Shiit benchmark. $1000 intel cpu vs. AMD $190. Try harder.

  45. Wish you woulda used the 9590 🙁

    • AMD FX 8370E or 8370 overclocked to 4.5GHz or better, produces better results than the AMD FX 9590 clocked at 4.7GHz with boost of 5 GHz.

      • The Vishera 8-core CPUs, 9xxx's and 8xxx's, will perform identically when clocked the same. They are the same basic chip, binned differently.

        Now, if you lock all 8 cores of an 8350 to 4.9GHz with turbo disabled, versus a 9590 running its stock 4.7GHz with a 5.0GHz turbo mode, the 8350 can perform better on heavily threaded applications, since the 9590 can't hold its 5GHz turbo across all cores.

        • Not true; benchmarks show that the 'E'-designated chips score better at 4.5GHz with a vCore of 1.3V and no turbo. The 9xxx series' vCore is already at 1.45V stock, and 1.5-1.6V with turbo.

          I can clock this chip higher (4.9GHz at a vCore of 1.375V), which yields a 4% gain over the 9590.

          • When each is clocked the same, they perform Identically.
            There is no debating that.

          • You are absolutely correct on that. My bad, I was going off of benchmark scores, as I don’t have another chip to test. Thank you for posting those screen shots, they are much appreciated.

        • Just curious, what does it score in passmark8? My 8370E gets 10200 points at 4.5GHz.

          • Passmark 8 64 bit trial version

            4.5 and 5 ghz with my FX 9370

          • That is awesome, but I noticed that your V Core was 1.5375 at 4.5GHz. I guess this is the only positive to having an ‘E’ edition, as I’m sitting at 4.5GHz with a V-Core of 1.3-1.325.

          • The newest FX 8xxx's are much more efficient. A process improvement implemented in batch 1429 changed the leakage characteristics of these chips, lowering the voltage needed for stock clocks but generally limiting maximum overclocks on ambient cooling methods. The stock VID of that 9370 is 1.538V, and with turbo it runs 1.58V; it is a voltage hog :). The 8370 I have runs 4.9GHz at 1.44V for daily use, which is much more practical.

          • One other question: what OS are you running? After going back into PassMark 8 to take a screenshot to match yours, I noticed that after I upgraded to Win10 Pro (x64) my single-core score took a 300-point hit, and my overall score took an almost 400-point hit.

          • Win 7 64 bit. My daily rig is torn down for cleaning, I just plopped this chip on a motherboard and downclocked it for the 4.5ghz test . Ram, NB etc. were at defaults, no tuning at all.
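The clock-for-clock claim in this sub-thread reduces to simple frequency arithmetic: since the Vishera 8-cores are the same silicon, the best-case gain from a higher all-core clock scales linearly with frequency. A naive scaling sketch (the 4.9GHz and 4.7GHz figures come from the posts above; real gains also depend on memory and turbo behaviour):

```python
def naive_scaling_gain(clock_a_ghz, clock_b_ghz):
    """Best-case % performance gain from clock speed alone, same architecture."""
    return (clock_a_ghz / clock_b_ghz - 1) * 100

# 8370E locked at 4.9 GHz vs a 9590 holding its 4.7 GHz base clock:
gain = naive_scaling_gain(4.9, 4.7)
print(f"expected best-case gain: ~{gain:.1f}%")
```

That roughly 4% result lines up with the gain reported above, which is consistent with the chips being identical silicon binned differently.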

  46. So I'm very late to this party… but for everybody saying there were no CPU-bound tests: has anybody here played GTA 5? The higher you clock your CPU, the more frames you get. Even on monstrous chips (5960X, 4790, FX-9590) the frames just keep coming. As for the others, OK.
    All in all, I'm surprised the FX-8370 stood up as well as it did. Sure, the results seem fairly cherry-picked, but considering almost all the games I'm playing at the moment are in there, it's pretty impressive.

  47. cpu does very little in this case.. *gpu

  48. amd video Card wrong great warming

  49. Since lots of people obviously missed some stuff in the article…

    1. You don’t need a benchmark beast CPU to play all games.
    2. Intel is better but it comes down to how much more you want to spend/when is the price no longer worth the gain.
    3. If you’re on a budget get an AMD they are fine or spend the extra on a better graphics card or nicer monitor.
    4. DirectX 12 is supposed to utilize multi-core CPUs, so as for the statement that Intel is better because "what's the point of multiple cores" (we will have to see once DirectX 12 is in full swing).

    IMO DirectX 12 will finally put Intel in "oh crap, we need to build cheaper CPUs, as our single-core beasts won't cut it anymore" territory, and that right there is good for business: it forces Intel to innovate or lose ground to cheaper AMD. Or AMD may just switch places with Intel and overcharge.

    (Now, I consider myself a novice. I don't memorize stats, but I've built my own computers since '98 and I'll research them when I need to, so feel free to disagree… unless of course you're just a fanboy throwing a tantrum.)

  50. Thank you for this. It’s nice to see an article that doesn’t unilaterally proclaim Intel superior.

    That said, a game I’d love to see these tests run with is the original Supreme Commander. If done with a game replay, you can be sure that the game plays identically. (Provided the replay doesn’t desync at least.)

  51. Before I start reading I must say I DO NOT AGREE WITH SETUPS HAVING DIFFERENT TYPES OF MEMORY MODULES! DDR3 AND DDR4 ARE COMPLETELY DIFFERENT TECHNOLOGIES!

  52. How about with a 980 Ti, playing at 3440×1440? Is AMD still equal? Thx.

  53. video Test??

  54. Sheikh Shahriar Dipto

    But my friend and I see a big difference in 3D rendering, where Intel shines. Every single Intel core is much more powerful than AMD's, and that's why I left AMD. Games don't need much more, until you go for higher resolutions and something becomes a bottleneck.

  55. Cool test.

    Now please do test at 1080p, or even higher res, in WildStar or World of Warcraft. Do a raid, or go into a very populated city. Experience the hell on the AMD platform, and normal fun on the Intel one.

    You will see the AMD CPU can barely play games that aren't good at using multiple cores. Why would one want to use such a CPU intentionally? I have one, and am sorry I wasted money on it.

  56. An eye-opening article. It correctly states that Intel is superior, but do you need such a powerful processor for gaming?

    If someone is going to use a very high-end GPU, going with a costly Intel is good. But average Joes who pair a very powerful processor with a weak GPU are doing it wrong; they should do the reverse if they really want to raise the frame rate.

  57. Anima minima maxima sunt

    Lol, to be honest I don't know how you can even compare an X-series Intel vs that AMD… you can't even compare a normal i5 or i7 vs Intel's Extreme CPUs. Your test, sir, in my eyes is the exact same thing as saying "hey, let's test an Audi 2.0T vs a BMW M5" as if the price and horsepower were irrelevant… Made me laugh. Very misleading.

  58. Paulie M30 (The Gaming Junkie)

    So let me begin by saying these benchmarks are wildly inaccurate. You mean to tell me that in all games with both chips, both stock and overclocked, you got the same average on both? Total bullshit! I own both of those cards. I'd like to see a 970 get 73 FPS with maxed advanced settings at 1440p. I own an 8350, 4790K, 4690K and a 5930K. ALL 4 CHIPS PERFORM DIFFERENTLY!!!

    • Please share the benchmarks; otherwise it's just hearsay.

      • Paulie M30 (The Gaming Junkie)

        All you have to do is look up benchmarks from any reputable site, like PCPer or the hundreds of other sites. https://www.youtube.com/watch?v=la2-ElzDlHM

      • Paulie M30 (The Gaming Junkie)

        This is one of the best sites on the internet. Don’t get me wrong, I’m not saying the amd chips are bad, I’m just saying they can’t compare to a $1000 chip

        https://www.anandtech.com/bench/product/1317?vs=697

        • I didn't ask you for a link to a website. You claimed to have an FX 8350, 4690K, 4790K, and a 5960X, as well as the GTX 960 and GTX 970. You claimed you ran benchmarks on all of them, including the games. I asked you to post your results on here, not link to AnandTech. Don't get me wrong when I say you're full of shit!

          • Paulie M30 (The Gaming Junkie)

            Never said I had a 5960x, I said I have a 5930k. I don’t have to prove a fucking thing to you. Anyone with half a brain knows this article is garbage. However. I’ll put together my benchmarks and show you them

          • I'm simply stating that I've posted benchmarks on here: benchmarks of the FX 8370E clocked at 4.5GHz. It hangs right between the i5 4690K and i7 4770K/4790K at the same relative clock speeds. One thing is for sure: the more cores that Intel introduces into a chip, the worse its single-core performance gets. You're putting a workstation monster against an FX chip and wondering why the gaming experience is the same. It's because the 5960X is a chip for rendering, not gaming. All the gaming benchmarks that I looked up brought back the same results as his, and I compared them to my chip. Don't be so butt-hurt that a $1000 chip for productivity gives the same results in gaming. Instead, you'd rather talk shit than post benchmarks (for games, with FPS). If you buy either a 5960X or a Titan (X or Z) for gaming only, you're a complete and utter moron. The poster was making the point that the FX hangs with them in games; the results would be different in productivity!

          • Once again guys… You provide valuable information and we thank you. Personal attacks will not be tolerated however. If it continues, a simple deletion of all comments relating to a username will occur and the IP banned from further posts. Please stay within the subject.

          • What personal attacks? The only thing that could be seen as a personal attack, is me stating that someone is talking crap, or its not a smart idea to buy a 5960x or Titan for gaming purposes only. Other than that, I feel like my complete statement is within the realm of this thread.

          • I am not going to get into a back and forth here and I am also not going to tolerate anyone using vulgarity, or making similar references further. If you are saying someone is wrong, simply use his own words against him and identify loss of credibility. Truly, I appreciate your input. I don’t appreciate when 2-3 posters start mudslinging without adding further to what they already stated. it is just as easy to delete all threads related to the three and ban further, although the loss of the valuable input that they did provide is lost as well. Thanks ahead.

          • Paulie M30 (The Gaming Junkie)

            That we can definitely agree on. If you buy a 5960X for gaming, you're either an idiot or you're trying to compensate for something. That being said, I still think the results in this article are wrong. First of all, a 960 is a low-end card. Second, two 970s can hardly handle 4K; I use two 980 Tis for 4K gaming. Third, he did not get 73 FPS on high settings at 1440p with a 970. Just NO. My main issue was the fact that neither chip showed a difference both stock and overclocked. This is legitimately the ONLY article I've seen with results like this. We can go back and forth ALL day and night and I still won't believe this article.

          • These tend to disagree with your statements, and agree with the poster of this thread.

            https://www.youtube.com/watch?v=_wEOK6Hf4h8

            (66 FPS @ 1440P on Ultra in Tomb Raider)

            https://www.youtube.com/watch?v=pmYGAYWHhR8

            https://www.youtube.com/watch?v=ychGsFEttQg

            I was going to post more, but I’m not even going to get into it with you! You’re a troll, and a simple YouTube search completely agrees with the poster of this thread. Please excuse me when I say that I don’t believe you have all the hardware you say you do. These are all GPU bound games, I mean even Crysis 3 with a dual core CPU gets 35 FPS on ultra settings. It’s a shame that you can’t produce benchmarks, and still claim to own all of this hardware. Please enjoy the videos, and afterwards, please kick rocks. Also, please stop flagging my comments, because they don’t agree with your opinions. I’m sorry that I can do actual research before I go spend money on PC components.

  59. Koning Rosekraans

    This is an awesome comparison, I got here through some Youtube argument about what CPU is best for gaming and someone posted a link to this site… the argument ended by the way. Quick question though, was there a noticeable difference in load times for those games?

  60. I really like this review (contrary to the earlier posts).

    I have owned an AMD FX 8350, and took it to 4.8GHz, coupled with an AMD Radeon 7970. Brilliant hardware. Excellent price/performance ratio.

    I now own an Intel i7 4790K, running at 4.6Ghz, coupled with Nvidia 780ti. Brilliant hardware, but expensive.

    The Intel/Nvidia hardware on average has probably cost me 50% more than the AMD hardware mentioned. I couldn't tell you what the performance gains are, but they're definitely not a 50% increase (like the cost of ownership was).

    In terms of watts used, my AMD rig was eating 550W when gaming. And although I can't measure the watts used at the moment on my current rig, it has to be nearly 500W or thereabouts. So I deduce that the electricity-bill/cost-of-ownership argument is irrelevant/marginal.

    I am not a fanboi of either corner. But I would say that for the little videos I edit (and upload) the AMD was far superior to the Intel, rendering the clips faster and with less visible stuttering.
    I found the AMD to be far more responsive in Windows. I perform the same tasks in Windows on the Intel i7 and it feels more sloppy, definitely not as punchy as the AMD 8350.
    For me, the Intel (with its known and documented single-threaded performance) is a winner in most games.

    I personally love the way people demonise AMD, believing that it is a worthless contender. I would swear that some people believe an AMD rig won't run a game, or that the benchmarks are a complete lie, when truth be known AMD isn't really that far off the pace, with numbers written in black and white and sometimes in colour. Some people cannot accept the information as true? For an individual to be unable to accept information which is indeed true and proven is quite a worrying way to be.

    Sometimes the gap on the bar chart looks massive in comparison, and it makes me laugh, because the gap on the chart represents a difference of 10 frames or sometimes less, but it seems as if people are judging by the size of the gap rather than the actual numbers. The gap looks massive, but the difference is marginal. I find it all quite misleading, to be honest. Just ask yourself: what is the real difference between 90 and 100 frames anyway? Who would be able to physically notice? I can physically notice the difference between £160 and £800, however.

    A lot of benchmarking sites benchmark in their own ways, and sometimes the tests can be questionable, but ultimately the cheaper AMD rigs benchmark just fine in my opinion, and the real-world reflection in this benchmark is totally agreeable. In this benchmark an Intel octo-core with Hyper-Threading, able to cope with 16 threads, a chip that is at least 4x more expensive than the AMD alternative, only gains performance increases of around 10% on some occasions, yet it is more than 4x the cost. I wouldn't exactly call that value for money. Superior? Yes, if you have very deep pockets.

    AMD FX 8350 – UK Pounds 149.99
    Asus Sabretooth 990 – UK Pounds 125.99
    AMD Radeon 7970 – UK Pounds 220
    Total = UK Pounds 496

    Intel i7 4790k – UK Pounds 239.99
    Gigabyte UD5 Black – 159.99
    Nvidia 780ti – 309.99

    Total = UK Pounds 710

    So in actual fact I spent about 43% more cash on the Intel/Nvidia hardware, for a performance increase in games that is smaller still (which is well documented in benchmarks all over the web). So unless you have money to blow and don't care what you spend, go for Intel; otherwise the AMD alternative is an excellent choice…
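
    For the curious, the actual premium can be computed directly from the prices listed above; this quick sketch (my own arithmetic, not from the article) comes out to roughly 43%:

    ```python
    # Prices from the commenter's two builds, in UK pounds.
    amd_total = 149.99 + 125.99 + 220.00    # FX 8350 + Sabertooth 990FX + Radeon 7970
    intel_total = 239.99 + 159.99 + 309.99  # i7 4790K + UD5 Black + 780 Ti

    # Percentage premium of the Intel/Nvidia build over the AMD one.
    extra = (intel_total / amd_total - 1) * 100
    print(f"AMD build:   £{amd_total:.2f}")
    print(f"Intel build: £{intel_total:.2f}")
    print(f"Price premium: {extra:.0f}%")  # -> 43%
    ```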

    I like AMD, and having owned both types of chip I can see advantages in both arenas. In an ideal world I would have an AMD rig and an Intel rig, or one chip that does everything I need rather than being good at this and poor at that…

  61. 4 pages of GPU-bound results for a CPU comparison. /facepalm

  62. Worse than the more-or-less comparable AMD performance, I find it sad that game development is such a mess. Then again, most people run low/mid-range setups, and that's the reason developers don't build games for 8 cores/threads or for SLI/CrossFire configs.

    I wish I had never bought a rig 3x more expensive, because I get no benefit from this computer; games are so often incapable of using the hardware. My old €400 rig from 2008 did a better job than this high-end rig.

  63. Nice review but it’s not all about fps.

  64. You mean to tell me for $3,500 I get 4-5 more frames in some games? Where do I sign up?

  66. Thx for the article..

  67. As an occasional gamer and PC builder for years, I think a fair test would have been two systems that cost the same money to build, with similar mobos and the same memory. The AMD's cheaper processor would then have allowed for a better graphics card.
    Intel is generally the better-performing brand, but it costs significantly more to own an Intel quad-core i5 than an AMD quad.
    I was happily running an FX-6300 overclocked on a quality board with other decent components, but was convinced by all the internet guff and benchmarks that my PC experience would be much better with a new K-series Haswell i5.
    About £450 later, with a ROG motherboard, an i5 4670K and 8GB of Vengeance Pro memory, I discovered that despite impressive benchmark increases the actual user experience wasn't much better anywhere. To say I was pissed off at the minimal benefit for the amount spent would be an understatement (bear in mind I know how to overclock well). The i5 alone was the thick end of 200 notes, where I'd picked the 6300 up for less than £70 delivered.
    If I was starting from scratch, I would like to know which CPU would give me the best user experience for a set amount spent on a complete build. Intel, from my experience over a good few years, isn't great value for money; however, if you want to sit safe in the knowledge of having a superior product, price not taken into account, then Intel every time. I'd rather have spent my £450 on better graphics than wasted it on an i5 I didn't need.
    One other thing: my AMD setup is on the 990FX chipset, so my CrossFired HD 7850s (good cards when I bought them) each get 16 PCIe lanes. On the odd graphics benchmark the FX-6300 therefore beats the i5 with its crippled x8 PCIe. CrossFired cards also run cooler than a single card if you don't crank the settings up too much, as they spread the load between them. Noisy card fans really irritate me, so I'm reluctant to get a single card when I upgrade again.
    I play Shogun 2, Rome 2 (Total War) and Skyrim; no idea what the setup would be like on anything else. The i5 barely makes a s##t's worth of difference in these games with my setup running 720p on my 23-inch monitor. My eyesight can't tell the difference with 1080p on such a small screen, so I crank the other settings up instead.

  68. In fact, going by these benches there doesn't seem to be much difference between these processors, and considering the prices the AMD easily wins. You won't notice 10% when playing unless your rig is crawling along.

  69. 90% of comments are useless but really funny.
    Thanks guys.

  70. Lena "Tracer" Oxton

    AMD CPUs are just terrible at single-thread processing. I would not recommend buying an AMD CPU until game devs finally make full use of DirectX 12.

  71. Insignificant Meme

    This is awesome! So at 4K even with a much weaker cpu there is almost no bottleneck!

  72. A year on and this is still the only review of its kind. Really a great idea. The only thing I wish it had included was one quick synthetic CPU test while both chips were in the lab, to set a baseline for the IPC gap. Excellent, excellent job regardless.

  73. I’ve always been willing to try both processors and for the longest time I stuck with Intel, I just thought that was the way things were meant to be. I see now (and have known for a while) that AMD is just as capable in many ways.

    I got sick and tired of needing a graphics card that was expensive, power hungry and oversized (for case fit) just so the Intel processor could impress me.

    I switched to laptop/portable computing and found myself even less impressed with the price tag. When AMD introduced their first APUs I was skeptical. (In some ways, I still am.) and finally decided to take the plunge and see if an APU taking directly from system memory could play my games.

    To my pleasant surprise, they could! (Now, I do play older games such as Left 4 Dead 2, Batman: Arkham Asylum and Arkham City, so they’re not exactly currently resource intensive.) After being impressed with that one, I switched to a newer A-10 APU for my new (and current) laptop and it plays games quite well.

    I know that newer Intel chips have their own built in graphics option, but you still need a separate graphics card to really make that experience work. So, if an AMD APU can do it in a laptop, imagine what an AMD 8 core with a separate graphics card could do.

    Nice to see that AMD can, in fact, stand up to the biggun’s.

  74. There are people here complaining that the games are “cherry picked”. However, I for one do cherry-pick the games I purchase. I choose well-optimized games and leave some of the others in the dust. (Old article, I just noticed; I was looking for ROTTR benchies.)

  75. Joshua Nastonovich

    Something about this article is way off. I’m not sure how they did it, but this is not a representation of how the AMD 8 cores act. They have atrocious single core performance compared to the i7 lineup. Literally 50% worse. Either the author chose games that just so happen to be multi-threaded, or the results are skewed.

    You’ll notice in certain tests the minimum frames are much better on the Intel CPU. This is how all the results should be.

    Reader beware: This is not how FX CPU’s perform in the world. Save your pennies for an Intel setup or wait for AMD Zen.

    • Joshua Nastonovich

      Figured it out. The single-core performance of the 5960X is much worse than that of the other CPUs in the i7 lineup. Checking “passmark” benchmarks, this is evident immediately. Why would you not compare the 8370 to the 6700K? You know, the choice most gamers go with?

      Shady journalism at work here. For shame.

  76. I am trying to find numbers on AMD CPUs at 4K, and these don't look right. You need more powerful video cards; the tests are obviously GPU-limited. Those AMD processors should not be so close; one even beats or ties the 5960X, which doesn't make sense. At 4K the AMD processor would be eaten alive by a 5960X. Put two 1080s in there and watch what happens.

  77. Hi all. I need a processor that can handle 400+ mods for Skyrim and other hardcore games, but I'm on a fixed low income. What's the best one?

  78. Here is how the AMD 8350 and even the 9590 CPUs really stand up against Intel's i7 4790K. It's a better comparison, since the 8350 is not really 8 cores: both of these CPUs have 8 threads. The i7 4790K is more expensive, at about $318, but well worth every penny.
    https://www.youtube.com/watch?v=BDVcpAhegWs

  80. I have owned both the AMD FX8350 and the i7 4790K, and I can tell you the 4790K is about twice as powerful: I went from 30 FPS in GTA 5 to 60 FPS with my GTX 760 installed. I have since upgraded to a GTX 1070, but even with a GTX 980 Ti I struggled to get over 40 FPS in some areas of GTA 5 with the FX8350; not the case with the i7 4790K.

  81. Also, even though the FX8350 can technically call itself an 8-core CPU, it really isn't one, since each pair of cores shares a single fetch/decode unit, FPU and L2 cache. Because of this it struggles with single-core performance, which is what matters in most games right now.
    https://www.youtube.com/watch?v=PgejkSWzvNs&list=PLyReHG5dDxXWxtuArwVgQkVAv6dppbbEp

  82. Can't agree with this. I used an AMD FX CPU for quite some time with a GTX 970, and it seemed to bottleneck the GTX 970 slightly. I just switched to a Skylake i5 system, and even at stock speeds most games ran at least 10-15 fps higher most of the time.

    I think settings like high-res textures, population density and draw distance, which are not really GPU-demanding but take a toll on the CPU, really make the FX stutter.

