Understanding Frame Rate – Uncovering The Truth Behind 30 VS 60 FPS

As someone who loves video games, I spend a lot of time discussing them on the internet in various Reddit communities and forums, and, when I can stomach it, the YouTube comment section. One topic that keeps coming up in the ever-growing battle between PC and console is the difference between 30 and 60 FPS, or rather the lack thereof, according to some arguments. In this article, we’ll take a look at some of these arguments and attempt to clear up the misinformation that is often used in them.

THE GREAT FRAME-RATE DEBATE

The fact that this is even being debated now, after so many years of PC and console gaming co-existing, seems strange, doesn’t it? Well, the reason we’re talking about this now is that, with the latest generation of consoles, many gamers were expecting console games that could rival even the most high-end PCs in terms of graphical fidelity and technical capability.


Unfortunately, we’re increasingly seeing developers limit their games on the new consoles in both resolution and frame-rate (the former is a topic for another article) compared to the PC versions of the very same games, and even go so far as to justify these limitations as stylistic choices. Earlier this year, Ready At Dawn (developer of the upcoming game The Order: 1886) stated that they chose to develop their game at 30 FPS because it delivers a “filmic look”. It’s one thing to be technically limited when developing a game and be honest about it; it’s another to pass off the limitation as a feature or an art-style decision.

What’s worse is that in some cases we’ve seen developers enforce these limitations on the PC version as well, even though they aren’t necessary there. These limitations and attempts at creating artificial parity between the consoles and PC have earned those developers plenty of flak from PC gamers, while also igniting ongoing debates between PC and console gamers. These debates tend to include lots of misinformation and unsubstantiated claims, sometimes mixed with just enough fact to lend them merit.

1. “The human eye can’t see more than 30 FPS”

This is one of the more interesting arguments because it actually gets into human biology. In the many debates among gamers, you’ll find it is one of the points most used by the side arguing for consoles. But is there any merit to it? Is the human eye in fact incapable of seeing above 30 FPS?

Unfortunately, there isn’t a simple answer to this question, which is probably why there is so much misinformation surrounding it. The human eye does not see in frames per second, so it’d be fairly difficult to determine the exact number of frames per second the human eye can perceive. What’s more, what the human eye can perceive varies greatly depending on the situation and the individual.

For example, an Army Ranger will probably be able to see objects clearly at a much higher speed than the average human being, as they’re trained to have heightened senses and to be on high alert for potential threats.


We do know that the human eye is designed to detect motion and changes in light; the world doesn’t pause, so our eyes are constantly being streamed information. If your eyes could only see a maximum of 30 FPS, you’d likely miss a lot of things simply because they happened too fast. If that isn’t enough, here’s a quote from a 2001 article by Dustin D. Brand that tackled this exact question:

“The USAF (United States Air Force), in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to “see” the after image as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS. “

So you see, if this example is any evidence, then the human eye can perceive images not only at greater than 30 frames per second, but well above 200 as well.
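To put the arithmetic behind that claim in plain terms, here’s a quick back-of-the-envelope sketch in Python. It only converts an exposure time into the frame rate with a matching per-frame duration; it is not a model of human vision:

```python
# Purely illustrative arithmetic: the frame rate whose per-frame
# duration equals a given exposure time. Not a model of the eye.
def matching_fps(exposure_seconds: float) -> float:
    return 1.0 / exposure_seconds

print(round(matching_fps(1 / 220)))  # 220 -- the USAF flash duration
print(round(matching_fps(1 / 30)))   # 30  -- one 30 FPS frame (~33 ms)
```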

2. “There’s no difference between 30 and 60 FPS”

Many gamers will say that there is no difference at all between 30 and 60 FPS, or any other frame-rate above 30 for that matter; that as long as the frame-rate is a constant 30 FPS, or close to it, the game will be ‘buttery smooth’ and provide an enjoyable experience.

This is also false, not only in light of everything we covered above, but for reasons that go beyond the perceived “smoothness” of an image. If you’ve ever played a competitive multiplayer game like a first-person shooter or a real-time strategy game, then no doubt you’ve encountered the dreaded input lag. Input lag is the delay between when you press a button and when you see the object or subject on the screen react.

Naturally, a slower input response time will be less immersive, since as humans we’re used to seeing the results of our actions instantly. If you waved at someone and the motion took an extra 30 milliseconds to begin, you’d definitely feel like something was wrong.

Battlefield 4 – 30 FPS

Battlefield 4 – 60 FPS

If you’re playing your games at 30 FPS, then the amount of time your display sits on one frame is 33.3 milliseconds. This means that when you move your mouse to aim at a target, it can take a full 33.3 milliseconds before you even start to see the view move. This delay is roughly halved to 16.7 ms at 60 FPS, and so on. Of course, this is without considering other factors, such as network latency (in multiplayer games) and monitor response time, which add further delay.
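If you’d like to check that math yourself, here’s a minimal Python sketch of the frame-time arithmetic above. It covers only the time each frame sits on the display; the network and monitor latency just mentioned are deliberately left out:

```python
# Time a single frame sits on screen at a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120, 144):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.2f} ms per frame")

# Output:
#  30 FPS -> 33.33 ms per frame
#  60 FPS -> 16.67 ms per frame
# 120 FPS -> 8.33 ms per frame
# 144 FPS -> 6.94 ms per frame
```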

198 comments

  1. I will eagerly await the console defenders and grab my popcorn. Good article.

    • Thanks for the comment!

    • Console master race, u pc peasants are soooo poor and are not even loved by developers, consoles get all the exclusives and love.

      The console race will always be number one for gaming. Consoles were made specifically for playing games, not PCs.

      • Holy late comment Batman. I knew console fools were slow in the head, but that has to be a new record.

      • If you can’t tell the difference between 30 and 60 fps I really have to say I feel sorry for you..

      • mads ludvig timm fager lund

        The console platform is only more popular because developers earn more money from it, thanks to the higher price that console games cost.

        • The PC is about its esports and MMO cash shops. PC can’t beat a console in AAA title game sales. That’s why the PC is a third-rate gaming platform.

          • PCs get more AAA titles. And AAA games aren’t even necessarily better than other titles. I doubt that you couldn’t find a game you like when PCs get most of the games that consoles do, plus exclusive GENRES thanks to the power and precision of a keyboard and mouse. You can also use a controller IF YOU WANT TO. We have options. You don’t.

            #PCMR

          • The PC gets more AAA titles???? Not in this day and age… lol. Most of the games that were PC-only are on console now. The only games that are not on console are RTS games, which I don’t care about, and esports. I’ll admit I wish World of Warcraft was on console, but I’m not worried about that.

            RTS games and MMOs are a dying breed; PC multiplayer is about its MOBAs and esports. I’d rather have a platform where games just work. That’s an OPTION too… rofl.

          • spic@the_teki_bar

            If you are gaming on a PC, the only thing PC really beats consoles in is strategy games. I’ve never seen or played a console strategy game that can compete with the PC version. Even then, the strategy titles for console are far fewer, and the ones they have aren’t done too well. No console would be able to keep up with a game like Total War unless it was scaled down massively, but then the fun of that game would be sucked out. Watching two armies of over 2,000 units go up against a force of similar size is pretty entertaining.

            To spend $1,200-plus on a computer to play Battlefield is stupid, since you can get a console for $400 and there really isn’t a difference. PC people who rant and rave about having high frame rates are super losers that want to justify pissing money away like crazy. The only game I’ve played that an expensive computer is truly justified for is Total War; the rest of the titles can fuck right off and be played on a console, ha

          • $1,200? You can build a PC that is better than the PS4 Pro for $500, with the added benefit that there are hardly any games exclusive to anything anymore. You can also play almost every PC game ever released, emulators let you play a good deal of console games, and now PS Now and Xbox linking to the PC exist too. Where is your so-called advantage? Maybe you should just go upgrade your PS4 to a PS4 Pro, lol.

          • Well, I’m not sure how well a $500 computer would handle something like Total War; that’s pretty much why I was talking about strategy games. Also, I’m thinking of Canadian pricing, which is more expensive than American: your $500 computer would be at least $750 Canadian, and with the extra cost of everything for us it could be closer to $1,000 than $750. That said, more people don’t know how to build a good gaming computer than people who do.

          • This isn’t the ’90s; it’s so easy now that my 11-year-old built his first this year. Other than a little research on what’s compatible, it’s pretty much “don’t put the square peg in the round hole” now.

          • It doesn’t change the fact that a lot of people don’t know and don’t care to know. Building a computer is pretty easy and basic, but most people don’t care to know or to put a little time into figuring it out. Personally, out of everyone I know, fewer than a dozen know how to build a computer and set up the drivers (even though it’s super easy). You forget that most people really don’t know how anything works.

            Even at the engineering firm I worked at, only a few people besides the IT department knew how to do it. If a person doesn’t learn it before their 20s, then chances are they will never bother to learn.

          • You’re giving the average Joe too much credit. The majority of people go through life oblivious to how almost everything they use in their daily life works. I always found it ironic when I would deal with people and mention a PLC, and they didn’t know what it was, even though the machine they use all day at work has one.

      • Yes, but nonetheless PCs have better performance, better cooling, better graphics, and more games; more players are on PC than on all consoles combined; and all games except for a few shitty console ports run at 60 to 122 fps, while most console games run at 30fps.

        • lol, everyone is soo butthurt here. Everyone has their own opinion, so who gives a fuck. I myself am both a PC and console user. There is a difference, but I don’t really care and just game for fun. 🙂

        • This does not factor in persistence of vision, nature’s own checkerboard rendering. Films looked like constant motion at 24 fps; go figure.

      • You console cunts are nothing without PC. How else would games be made?

      • 2022 and this is a joke now

      • What you are saying, that “games are meant for consoles, not PC”, is total bullshit. Games are meant for PC as well; that is why gaming PCs exist these days. I have played a lot of games on PC and it works well for me. I like both consoles and PC.

  2. It’s simple to explain this way: take a 60 FPS video and replace one frame (1/60th of a second) with a blank image. Does the viewer see the blank? If so, the viewer can see 60 FPS.

    • That… doesn’t prove anything… at all….
      (Regardless of how it actually works.) If you could only see 30 FPS, you would most likely see the blank anyway. Assuming you only see every second picture of the 60 FPS material, it’s a 50/50 chance of seeing the blank, depending on whether you catch the even frames or the odd ones.
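      A quick simulation of that 50/50 claim, under the toy assumption that a “30 FPS viewer” samples every other frame of the 60 FPS clip:

      ```python
      import random

      # Toy model: a "30 FPS viewer" perceives every other frame of a
      # 60-frame (one-second) clip, starting on a random phase.
      def sees_blank(blank_frame: int, total_frames: int = 60) -> bool:
          phase = random.choice([0, 1])  # catches even or odd frames
          return blank_frame in range(phase, total_frames, 2)

      trials = 100_000
      hits = sum(sees_blank(random.randrange(60)) for _ in range(trials))
      print(hits / trials)  # ~0.5 -- the 50/50 chance described above
      ```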

    • If you can see the flash of lightning you can see beyond 60FPS. Can you see lightning?

  3. I don’t recall feeling deprived by the hundreds of console games I’ve played over the last few decades that happen to be 30fps. I do, however, remember the ones that looked horrible due to excessive jagged lines at low resolutions for the sake of 60fps.

    • It does have a point and it’s not necessarily PC vs consoles. If anything it’s trying to help game quality on consoles. It’s arguing FPS is an important part of the visual experience, not just resolution and detail. The problem is modern gaming computers set a precedent for the ideal visual experience which is a combination of very high resolution, FPS and detail. The next gen consoles simply don’t have the power to reach two, let alone all three of those to the quality of a modern gaming computer. Which of those factors the console releases sacrifice is up in the air.

      Last gen saw lots of 800P or 900P games on Xbox 360, as they felt that drop in resolution gave the best overall visual experience since it allowed increased FPS and detail. So you had all the resolution junkies freaking out. I know you complained about jaggies and implied that was because of the lower resolution… they could have dropped some detail and applied AA, but they felt detail was more important than smoothing out the jaggies (which I too hate passionately). Again, the article is saying FPS is very much a factor in visual experience. Now next gen consoles are hitting 1080P in all games, but to do that and keep a good amount of detail they’re having to sacrifice FPS and the FPS junkies are coming out in force.

      • Yes, that was exactly my point. While the debate is always framed as PC vs consoles, it shouldn’t be. This is an issue for all of us gamers and we should all demand a better experience.

      • Except they’re NOT hitting 1080p. They’re getting 900p (Assassin’s Creed) or 792p at 30FPS…

        The PS4 can get 1080p in some games. But not all.

    • While your argument for better overall quality rather than 60FPS is very valid, that is not what he is talking about. 60FPS is undoubtedly more visually appealing than 30FPS, all other factors aside. Which is what he is saying.

    • Christopher Suiters

      You do know all console games USED to run at 60fps, right? But then they stopped because they couldn’t push shinier graphics and keep the frame rate up. Every NES and SNES game ran at 60fps.

  4. It is very simple. To have 60 fps on most modern games you need lots and lots of transistors in cpu and gpu. The more transistors you have the more power you draw and the more heat that is created.

    Right now, with current tech, it is clear that the consoles are not even close in performance to most modern PCs built with gaming in mind, because they simply do not pull enough power to do so. With a PC you don’t have to work within the thermal limits of a console shell, nor worry about a strict power budget.

    These design choices to minimize heat and reduce power consumption cost performance, and they are probably the single biggest reason for the limited CPU power of the PS4 and Xbox One. Plus, running a simple cooling solution on a chipset that isn’t all that powerful means fewer RMAs from heat failures, which we all know plagued the last-gen consoles.

    • Your wrong. Maybe some consoles dont but the ps4 has that power. Thanks to the Gddr5. The ps4 has the power but sadly some developers dont know how to use it. Like ubisoft with AC unity. They blew it cause they didnt shared the work between the processors right. Look at bloodbourne. It has a very strong graphic and runs at 60 fps. The order on the other hand. Well look at the graphic. Its like a CGI movie and still its a game. Oh and im a pc, ps4 player. I had enough of always upgrading my pc. To play the next gen games on 60 fps you need a pc with the double price of a ps4. Getting a power house just for bigger fps? No thanks. Story and gameplay above graphics. I had fun with my ps3 for 7 years and under those years i upgraded my pc twice and had to buy a new one for new tech. And now i have a ps4 that will surve me for another 7 years without upgrade. Its worth it. Even just for the exclusives that look better then other games cause the developers know how to optimise it on the ps4. Sorry for my english but im from Hungary.

      • The problem with that is that GDDR5 is wonderful, but all the memory in the world won’t make a lesser graphics card any more powerful. A good bit of the PS4 being better than the Xbox One graphically is due to it having a slightly better GPU along with the GDDR5.

        AC Unity runs horribly on all systems (Xbox One, PS4, and PC), so that is a poor example. Bloodborne looks interesting, but I don’t know if it will be able to push 1080p 60FPS.

        The Order is going to be 1080p 30FPS. They try to pass it off like it’s a stylistic choice (it isn’t; the hardware can’t do the graphics they want at a higher frame rate), but a lower frame rate is never a better option.

        I don’t know about Hungary, but here in the States the PS4 and Xbox One are $400 USD, and you can build a $600 USD PC that is twice as fast. You don’t HAVE to upgrade your PC every few years; you just should. A lot of last-gen games still ran on the 8800 GTX, just at settings comparable to the console. I doubt that an AMD HD 7870 won’t be able to run Xbox One and PS4 games at the same or better settings 4-5 years from now.

        Optimization is one of the biggest lies that gets told. Oftentimes “optimization” means tweaking the effects and textures to run at a playable frame rate.

        Again, I don’t know what the cost of upgrading a PC or buying a PS4 is in Hungary, but here it’s a better idea economically.

        This was not a PC vs Console discussion it was why we shouldn’t be happy with developers telling us 30FPS is better than 60.

        • A lie? How odd.
          Even John Carmack seems to think that optimization is better on consoles and that it theoretically puts them ahead of a PC with equal specs. What a console fanboy! /s

          • Consoles have lower-level APIs than current iterations of DirectX and OpenGL. However, APIs such as Mantle, DirectX 12 (which will be used on the Xbox One as well), and OpenGL-NEXT all share those same low-level properties.

            So, yes, a console with the same specs as a PC does have the potential to run games better, but it isn’t nearly what people try to make it out to be.

          • Optimisation should be better on a given console because, for example, all xbones have the same hardware and drivers making it much more simple.

            However, optimisation can’t make up for the fact that the consoles are using (top-of-the-range) hardware from 5 years ago (for example, the xbone GPU is a 7790, which is equivalent to a 6970, released on December 15, 2010). I don’t think PC gamers expect their 5-year-old hardware to play new games at 1080p and 60 fps, though.

          • No one’s arguing that the hardware on consoles is new.

      • Atty Marc-val De Lara

        *you’re

        • He specifically apologized for his English and cited the fact that he was not a native English speaker, as he is from Hungary. I don’t believe commenting purely to correct his grammar was at all necessary.

      • Excuse me.

        Who told you that GDDR5 RAM makes GRAPHICS better? GDDR5 is nice, but seriously, the developers know all they need to know about optimizing their games (most of the time). It’s simple why the games don’t run at 60 fps: the consoles just don’t have enough power unless they run at the very lowest settings. Also, no, a PC that runs 60 fps does NOT cost double the price of a PS4 (nor an Xbox). https://lmgtfy.com/?q=400%24+console+killer – that’s a “let me google that for you” link; or just google “400$ console killer” and you’ll find it.

        Have a nice day.

      • Klypto Kerrigan

        “always upgrading my pc”

        Wat. I redo mine every 5-6 years.

      • If you value story and gameplay above graphics, which is totally fine by me, why didn’t you just get a cheaper computer? And then just upgrade when necessary?

      • LOL, you used Bloodborne as an example? That game was made FOR the PS4, so the software and coding are specialized to run smooth as fuck on a PS4. Besides that, the time that would have been put into making the PC port (which we will never have ;( ) was put into the console code instead, so it’s very optimized for the PS4. And I have one more reason: Bloodborne is about as easy to run as Dark Souls 1 and 2, and even a “720p ultra” tier gaming PC will be able to run those games at 4K. So there you go, man.

      • Yo, I made a PC that can run ultra 1080p 60fps (mostly) for 450 dollars (that’s the price of a PS4 with all the controllers and some games), and it will last me around 8 to 10 years, gaming at 720p towards the end. It could last even longer if PC parts, like console parts, didn’t have limited lifetimes.

    • And the heat failures didn’t plague all the consoles. I never had trouble with my ps3. Some people keep their console in the wrong place.

    • Well, our consoles have more fluidity and are more effective. We don’t need to build or add anything to them, and they are way cheaper than building a 900 or 1000 dollar gaming PC. My friend spent 5000 dollars on a PC, like wtf, just to have all that stupid fancy stuff. I’d rather pay 400 bucks and play right out of the box.

      • You’re just a dumbass. I can beat your console for 450 dollars, which is the price of your PS4 plus one controller xD. You’re just buying the name, you poor thingy.

  5. Call me weird, but I have been playing my games at 30fps since the late ’90s. It’s a preference I have; I don’t like 60fps or above. I get more immersed at 30fps; it’s more CG-like. I buy the top-of-the-line graphics card every year to use all those hidden settings in Nvidia Inspector or RadeonPro: supersampling, downsampling, negative LOD bias, ambient occlusion, or hidden anti-aliasing algorithms… you name it :p

    • Not weird, it’s a preference. And PC games/drivers typically allow you to customize things to your preference, which is different for everyone. So happy times for PC gamers. The problem is consoles are far more limited in power and typically don’t let you customize the visual experience. The developer gets to choose what’s more important, and this article is trying to bring attention to the fact that “some” people want FPS as a priority and some people really can see a difference between 30 FPS and 60 FPS.

    • For some games I actually prefer 30fps over 60fps. I find if I’m playing a 3rd-person game, or something that doesn’t require any real twitch reaction, I’m totally fine with 30fps. But if I’m playing a first-person shooter where I use quick reactions, anything less than 60fps is unplayable for me.

      • The funny thing, for all you guys out there, is that there was a recent study done at my university with avid pro gamers where they were tricked into believing they were playing at 30fps when it was 60, or vice versa… they were all clueless… bear in mind, all of them claimed they knew the difference.

    • You’ve been gaming since the late ’90s? Are you using a 10-year-old profile pic? Can you please explain to me how 30 fps looks better than 60? It’s not a matter of taste, it’s a fact. Same goes for resolution.

    • WTF? Just throwing around buzz words like you know what you are talking about here huh? You use a Supercharger attached to your Turbo too, huh? Gotta get the flux capacitor tuned right? I call bullshit on your ten year old ass.

  6. I’ve never understood the “input lag” issue as it relates to frame rate, because I’ve never experienced it… ever. Even when running under 30 fps, when I click my mouse, or move it, everything still happens instantaneously. If there’s any lag, it’s completely imperceptible. This goes back to gaming at a time when the very idea of 60 fps was mere fantasy for the entire industry, even on the PC. The ONLY time I’ve EVER experienced input lag is from poorly implemented mouse smoothing techniques, and that’s all.
    On the motion blur front, making a blanket statement that it always sucks is wrong. If it’s done correctly, it enhances realism drastically, especially in racing games. Assetto Corsa is a prime example, as is pretty much any recent racing game from Codemasters. Also, when done correctly, it has zero (or very near zero) effect on frame rate or any sort of latency. Alien: Isolation, which I play at a solid 60fps with all eye candy at maximum, including the motion blur, looks 10x better with the blur ON. (GTX 760, i7 3770.) I’ve also played games that did in fact appear to run smoother at 30fps with motion blur on than some other game at 40fps with motion blur off.
    While it’s true that there are some facts laced into this article, it also has some opinions that don’t necessarily ring true for everyone out there.

    • Vsync though causes terrible input lag. Just switch it on and off and notice the difference.

      You also made a blanket statement. Motion blur does impact performance in some game engines (and also with some underpowered GPUs). So you’re wrong too, buddy. Of course a friggin’ racing game doesn’t have the calculations needed for blur to impact performance; they never push the limits.

      • E.g. Skyrim’s and Fallout’s ENB mods have options for such high quality motion blur that they heavily reduce framerates.

        • That’s irrelevant because you’re talking about an unofficial mod (a shite mod at that), which is not what this article addresses. Any mod that adds on post-effects that is not part of the original graphics engine code is going to be inefficient. I’m talking about in-game visual effects that are part of the ORIGINAL graphics engine, so the fact that a modded motion blur affects framerate is a no-brainer because pretty much ANY effect as part of a mod is going to affect frame rate, with or without motion blur. I don’t give a fuck about mods.

          • You really have no idea what the fuck you are talking about with any of this. First you can’t feel input lag in anything for some reason, even though we all know that it’s there, yet you contest that again and again, and then you call the ENB mod shite? Do you have any idea how amazing the ENB mod even IS? I work with it and make my own custom ENBs, and I can tell you right now that it is an amazing and POWERFUL tool. Not much more I can ask of it. It adds DX11-level effects to shit old ugly DX9 games like Skyrim, fixes major game bugs, allows 32-bit games to utilize more RAM, clears up some VRAM space in games like Skyrim for you, lets you use RAM as VRAM if you so desire, and has an in-game GUI that lets you edit and adjust almost every effect. Etc. I could go on, but you need to get some more experience with pretty much all of this stuff before you call one of the best graphical mods ever made ‘shite’.

          • Input lag is definitely detectable. You can see this if you turn V-sync on or have an extremely low framerate. The ENB mod may not be shit like he said, but it definitely has some less-than-satisfactory aspects. One of which is that it doesn’t use the best of code, isn’t very accessible, and uses a sketchy form of plugging into a game (injection). Also many of the features you said are restricted to Skyrim.

          • Yes, they are, and how is that a problem? It is made for games like Skyrim, Fallout, etc, that have less than satisfactory renderers and DX support. For what it is, and what it can do, it is amazing.

          • For what it is it is awesome, but what it is, is a horribly hacky way of making a game look better.

          • Hardly horrible, though, as it works, and works well.

          • The phrase “If it works, it works” doesn’t work very well with computers. When it comes to ENB it can break a lot of things during injection through an assortment of misguided coding. For one, the code has no safety measures, so incorrect configuration can cause issues from BSOD to simple game crashes. Also being a dll injector if someone wanted to be malicious they could ship a virus with it.

          • Well, say that all you want, but nothing like that has happened to me over the last 4 years. Across two of my own PCs, two friends’ PCs, multiple ENB installs on each, sometimes for more than one game, from my own ENB presets to the ones on the Nexus to some new ENBs I found on a Chinese forum, they have always worked and never caused my system or anyone else’s any harm. The worst that has ever happened with any ENB is the game crashing on startup when an OSD program is enabled. Nothing major.

          • Being good enough doesn’t excuse something from scrutiny. There are legitimate complaints to be had about ENB, and denying them because it works is to ignore its flaws.

          • Fine. Point is, its FAR from ‘shite’. Not to mention, your opinion on this matter is rather void, given your previous statements…

          • It may be far from shit, but it’s also far from professional quality as well, which is my point. And ignoring my point purely based on my previous statements, be they good or bad, is naive.

          • You fail to understand that this is not a “professional” undertaking. There is one man working on the ENB mod. Taking that into account, along with how well it DOES work, it’s damn impressive and does not deserve to be called “shite”. It also brings into question your entire point on this matter: why are you expecting ‘professional quality’ from a one-man team? And as far as professional quality goes, it all really depends on what you would qualify as professional. You have the AAA game developers who ship games like Skyrim with outdated engines, heaps of bugs, updates that completely destroy certain aspects of the memory allocation, shoddy graphics that require work like this to be done, etc. But to some, they could be called professional. You also have developers like Ubisoft, who ship games like Far Cry 4, which have decent engines and up-to-date graphical effects but crash, cause BSODs, etc., exactly like you claim the ENB mod can do. Now, tell me how the developer of the ENB mod is any less professional than these developers? He promises nothing, yet still delivers a better product most of the time, one that in all my time with it HAS NOT done those things, has updated the graphical effects of the games it is tailored for, and has fixed multitudes of bugs across those games. I judge a game/piece of software by its usefulness, usability, and how well it works, and the ENB mod works a hell of a lot better than many “professionally” created games.
            TL;DR: Professional quality? Pfff.

          • You seem to think I expected professional quality from it, when I didn’t. When I saw and downloaded it I expected it to be just as shit as it was. And if you’re going to compare it to games made by professionals that are not of professional quality then you’ve really missed the point. The point is that ENB is NOT of a very high standard. One guy made it (who by the way refuses to accept help because he’s a selfish prick but that’s for another time) and he made it to the best of his ability. While that’s wonderful, and the mod is amazing in that sense, if it were released professionally and sold people would complain because it is a volatile piece of shareware.

          • Really? Can’t feel input lag…. Like, Really?? Wow, just wow you’re dumb…… I get he pissed you off for insulting your mod or whatever, but you are a jackass for even trying to make that particular argument.

          • NO. I was talking about him SAYING that he can’t feel it. Read it again.

          • ENB is actually a fairly good mod, and its code is actually fairly efficient considering it can do near-GI in realtime. But even if it weren’t, that makes it no different from original graphics engines; after all, it IMPROVES the performance of the original game when you turn its additional effects off. Motion blur, however, when done correctly, can very much make lower framerates appear better.

      • Nope, not even with Vsync on have I experienced any input lag, UNLESS the game in question has poor coding for it, which does happen once in a while. Otherwise, I leave it on 100% of the time without any issues whatsoever.

        And I did NOT make a blanket statement about motion blur… if you had read it right the first time, I specifically said, WHEN it’s done CORRECTLY. Meaning that, yes, there ARE some games that have inefficient (and also not the most attractive) motion blur code, and in those cases it can hurt framerate. My point was, motion blur as a visual effect does not degrade framerate 100% of the time, as the article would have us believe. There are always going to be exceptions.

        Racing games can be just as graphically and computationally intense as any shooter once you throw in a complex physics engine and a full field of other cars with AI. In fact, I’ve had more racing games give my system a workout than most anything else, except maybe Xplane 10.

        • Vsync has input lag, there’s no doubt about that (unless you have less than or just above 60fps uncapped). It’s just how the technology works. Problem is, you don’t have the capacity to recognize and feel input lag. It’s definitely there and can be easily spotted with proper hardware.
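          To see why on paper, here’s a rough back-of-the-envelope model (a toy sketch assuming a 60 Hz display where each queued frame waits one full refresh, not a real render pipeline):

          ```python
          # Rough model: with vsync, a finished frame waits for the next
          # screen refresh, and every frame already queued ahead of it
          # adds one more refresh interval of delay.
          REFRESH_HZ = 60
          REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes

          def worst_case_vsync_delay_ms(queued_frames: int) -> float:
              return (queued_frames + 1) * REFRESH_MS

          print(worst_case_vsync_delay_ms(0))  # ~16.7 ms: just missed a refresh
          print(worst_case_vsync_delay_ms(2))  # ~50 ms: two frames queued ahead
          ```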

        • If you played fighting games, you would know what input lag is. I won’t try to convince you it exists, though; see, a player can get used to the lag and still play very well. So you may never realize it is there in the game you play.

        • Christopher Suiters

          Watch the YouTube channel Linus Tech Tips. They use a slow-motion camera and actually test input lag by showing when they click the mouse and when the action goes off, counting the frames in between. That gives you the input lag. Input lag is a thing. Your brain must just be too slow to notice it. I feel bad for you. I would be scared to be on the road near you, because you will have slow-ass reflexes.

          • Yeah, it is a thing, but if you read the other comments, you’ll see that it is often not perceptible by many people, and it doesn’t NEED to be. I have no problem playing very fast paced games, whether I see the input lag or not. Your comparison to a few frames of video that all happens in mere microseconds, versus the things that happen while driving a car is a clear indication of your inexperience at living on this planet. My 30+ years of driving without a car accident are proof that my reflexes work just fine.

          • It’s dumb that we’re arguing over an “article” that is at best a copy-and-paste of a few different opinions written by some nobody. Personally, I’m a PC guy, but I’m more on the side that if you’re above 30 all the time, you’re good.

    • I think input lag is more of a problem for competitive players (and the same for motion blur, hence 120hz monitors). I used to play fps competitively and when you are concentrating on your performance in game, slight lags are much more perceptible. Now I just play games occasionally and casually and it’s less of an issue. My settings for games differ vastly depending on whether it’s a performance based game or whether it’s a story/visual based game.

  7. First-person shooters (Oblivion, New Vegas, Skyrim etc.) on PC need at least 60 FPS average, or more. Ideally your minimum FPS would never go below 60…
    With 60 FPS average, you’re going to get minimums in the 40s and that’s not always enough for smooth, fast, fluid gameplay.
    30 FPS average is crazy – you’re going to get minimum FPS in the teens, heheh…
    Sites like [H]ard OCP are the worst! They set up games on the very latest vidcards to produce ~40 FPS average. As long as minimums stay over 30 they’re ‘happy’.
    They should test vidcards with the games setup to give 60 FPS minimums.
    But as it is, they’re not really truthful with their readers…
    (Yeah sure, that 750 Ti is just fine @ 1920×1080. Look, 30 FPS.)
    I did Skyrim tech support for years, and everyone with problems stuttering, lagging and crashing had waay too high settings enabled, including heavy AA and post-processing and high-rez texture packs etc.
    Achieving 60 FPS with their wimpy rigs (by changing settings) was like a revelation to them – they had never before played the game running smooth, fast and fluid. Fun!
    Even nVidia’s ‘GeForce Experience’ optimization is sad. It keeps piling on the effects and AA until your frames are in the 30s… quite unfortunate.
    Your game will then run poorly, with stuttering and lag…
    “But look at that eye candy! Looks great, doesn’t it?” they’ll say, heheh.
    Sure, there’s plenty of nice-looking slideshows – as long as you’re not the one playing.

    • Meta Expyrosin Palingenesis

      If you did Skyrim tech support for years, you should know that it’s not a first-person shooter.

    • Oblivion, NEW VEGAS, Skyrim? You’ve just picked some of the worst latency offenders in the industry. You’ve also picked games that really don’t require a high frame rate at all.

      Also, 40FPS is generally what most people consider perfectly playable. It isn’t ideal by any stretch, but it’s definitely playable. 30FPS on the other hand is absolutely unplayable for me and many others, but if you went up to 60fps in some games, you’d be leaving a lot of visual effects on the table to hit that mark. Many people aren’t willing to make those sacrifices. I actually applaud nVidia for their “Experience” setups. They give you a slider to adjust for performance or quality and it changes the setups accordingly. This is idiot proof. Is the game not running at the frames you like? Slide it toward performance and verify. Rinse and repeat.

      I think you’re just being overly negative for no reason.

      I realize now this is 10 months old…

  8. Meh. I’ve been playing a bunch of games lately at both 60fps and 30fps across various platforms. I don’t even notice the framerate unless it’s unstable. So as long as a game is LOCKED at 30fps, that’s enough for me. I don’t care to sacrifice resolution or graphics quality for higher frame rates.

    For the record, the games have been Shadow of Mordor (PC at 60fps), Dead Rising 3 (PC at 30fps), Last of Us Remastered (PS4 at 60fps), Sunset Overdrive (XB1 at 30fps), Modern Warfare (PS4 at 60fps) and Destiny (PS4 at 30fps). All played and looked beautiful.

  9. Yeah, that’s not actually correct in reference to film fps. The reason we find it cinematic is that we associate that fps with film; faster fps is more realistic, whereas a slower fps tricks our brain into a feeling of unreality, making it easier to see the vision as a fantasy.
    If you’ve ever watched War of the Worlds at 24fps vs 60fps, you’ll know what I mean…
    Tom Cruise cannot act, but we don’t notice it at 24fps because we’re lodged in the fantasy zone.
    At a realistic framerate, the unreal quality of it all is starkly visible once our brains try to interpret the scene as real.

  10. “Peter Jackson and The Hobbit series is a great example of this”.

    I found 48 frames per second to be a distraction that constantly removed me from the movie experience.

  11. /r/pcmasterrace

  12. Cute article. #1, clear fact. #2, fact, but a bit of opinion slips in. #3, not even technically correct. #4, full blown subjectivity trying to pass itself off as objective fact. Final analysis, blatant pandering (see comment below as to who it was for if you haven’t figured it out yet).

    • #3 is very correct. One of the most common parts of post-production for CGI elements is adding in motion blur to cause the effect.

      #4 is again very correct. You cannot have a subjective opinion on what is an objective fact. While I can say and think gravity does not exist, gravity will still objectively exist. 60FPS is objectively better than 30FPS when all other factors are the same.

  13. Cort Stratton (Sony ICE team, PS4 API engineer) recently made a push saying 30fps is better due to the visual improvements it allows on consoles. I certainly do NOT agree, and any PS4 owner should be concerned by this statement, as there are several cases of frame rate drops even in 30fps titles.
    “PS4 is a supercharged PC”? Apparently BS is dropping from flying pigs these days. On another note, Microsoft said the opposite: that 60fps is better. Console gamers see nice graphics and are bewitched. 60fps will never be standard with people like Cort Stratton around. 60fps unfortunately doesn’t advertise as well as shiny graphics. Everyone should buy PCs. Trust me, you’ll be happier.

    • A huge problem we have is people trusting representatives of these companies. They have no reason to tell the truth, since the truth would hurt their bottom line.

  14. Try using the rift at 30 fps.

  15. Personally, I notice the difference mainly in particles and moving liquids, which is the reason I try to attain 60fps. I don’t seem to notice any difference with input speed.

  16. I really couldn’t care less about console vs PC, but I honestly can’t tell a difference between either video.

    • I will say this: it is far less noticeable in video than it is when you’re actually playing the game yourself. This is because in video there isn’t any input lag, because you’re not interacting.

  17. What a poor comparison. I honestly clicked this expecting a good objective analysis. I’m not interested in your petty console vs pc war.

    Never do you make it clear how significant the difference is. Obviously increasing the fps will not lower the quality of the experience, that’s not helpful to explain.

    Your point 2 is also ridiculous. You don’t react instantaneously based on an image being displayed to you 16 ms earlier every 33 ms. The mouse, keyboard, latency, and your brain are what determine how fast you react. And your brain is already going to do what it’s planning on doing within those 16 ms by which you received the image earlier.

    • “You don’t react instantaneously based on an image being displayed to 16 ms earlier every 33 ms”

      In the article we never stated otherwise. It says that input latency is much lower (half) at 60 FPS compared to 30. It also clearly mentions that there’s additional latency added by other factors such as networking, etc. The whole point was that there is in fact a difference between 30 and 60 FPS when playing competitive games, as latency can (and does) become an issue.

      This article was never intended to fuel the console vs PC debate, but rather to try and debunk and or clarify some of the misconceptions that are tied to it.

      The bottom line is: anyone arguing that 30 FPS is “better”, or “good enough”, is wrong, and our point is that this doesn’t help the gaming community or industry, but hurts it.

  18. You do know consoles can run 60fps, right?

  19. I don’t think that 60 fps is drastically better than 30fps. Most people probably won’t be able to tell the difference past 45+, and if you enjoy a game you won’t necessarily be concerned about frame rates unless there are abysmal drops, like in Fallout or Skyrim. For example, I enjoyed the PS3 version of Tomb Raider just as much as I enjoyed the PS4 version. I’ve played console games for years and was never bothered by 30fps. This whole push for 60fps is generally a recent development that most people do not care about.

    • “Most people probably won’t be able to tell the difference past 45+…this whole push for 60fps is generally a recent development that most people do not care about”

      Who is most people? Who are you to claim what other people will or won’t see, or what they care or don’t care about? This is the point the article is trying to get across: people spreading misinformation and made up stats without any foundation or evidence to back themselves up.

      The actual truth of the matter is, if your claim was correct, then companies would not be making TVs and monitors that display images faster than 60Hz. But they *do* make such panels, so clearly enough people care to the point that companies see it not only worth their engineers’ R&D time and salary, but also see it as a source of profit, to research and produce hardware that support display rates above 60Hz.

      • The point of 120hz and 240hz panels is to help smooth out the judder that can occur on 60hz panels. It’s also needed for motion interpolation features and really has nothing to do with fps values in general.

        Until relatively recently, videos on YouTube were 30fps, and YouTube has only just started letting you select 60fps on desktop. The YouTube app has only just started rolling out updates allowing 60fps playback.

        Most people that look at gameplay videos and comparisons do so on YouTube, so they wouldn’t be able to tell the difference. Even then, in full motion you’d be hard pressed to notice the difference unless you had a version running 60fps right next to it.

        And given that NVIDIA’s game optimizer (not sure if AMD has one) specifically aims for 45fps, and looking at videos, it would seem there is not much difference in fluidity between 45 and 60fps.

        Hence, my conclusion is that games have done well for years at 30fps and they will continue to do well now; games have not suddenly become “unplayable” at 30fps.

        The only time that 60 fps is strictly needed is during multiplayer as the reduction in latency is vital. For single player, a nice bonus but in no way vital or essential.

        • PC Games have supported more than 30fps for quite a while, so yes, now that we’re getting garbage ports from consoles that are locked @ 30fps, they have, in fact, “suddenly” become unplayable.

          Your claim of “Most people that look at gameplay videos on Youtube wouldn’t be able to tell the difference” still leaves me with my two original questions: Who is most people? Who are you to claim what other people will or won’t see? Also a third question now comes into play, assuming your statement is accurate: Wouldn’t the comparison then be compromised? If people are watching 30fps vs 60fps comparisons in a 30fps video, obviously they’re not going to see a difference, and it invalidates the entirety of that sample.

          Granted, I’m just going along with your example… I don’t really know why you brought Youtube into the conversation, because the original claim you made was “Most people probably won’t be able to tell the difference past 45+ and if you enjoy a game you won’t necessarily be concerned about frame rates” implying they would be observing the *source* material, not comparison videos. If you want to see a TRUE side by side comparison, you’d use something more robust with less chance of being artificially limited, such as this: http://www.svp-team.com/wiki/Frame_interpolation_overview or http://www.testufo.com

          Perhaps my inclusion of TV Panels was a mistake; my TV is a true 120Hz panel, so when connected to my PC, it is capable of displaying 120fps content (verified with testufo.com). But obviously that is not true of all 120Hz TV Panels, which as you stated most will simply use motion interpolation to simulate 120Hz motion.

          However, going back to the “most people won’t see a difference” argument, it still stands that TV companies would not waste their time and money messing with 120Hz or 240Hz interpolation if the mass populace was generally happy with 30fps. Another point to add to this is there’s equivalent software for PC Media Players such as MPC-HC and VLC which provide motion interpolation to simulate higher framerates on low-framerate content, e.g. http://www.svp-team.com.

          • The whole point of 120hz panels is motion interpolation, black frame insertion, and 24fps playback. That’s it. TV manufacturers don’t make TVs for gamers; their main intended audience, and the average consumer in this market, is movie/TV watchers.

            If they actually did consider gamers, more of them would make low-lag TVs; however, as it stands, the only one that does this is Sony, and starting this year Samsung. If people were clamoring for 60fps as much as you say, movies would have been made at higher than 24fps a long time ago; it was a frame rate chosen because of a limitation that no longer exists, yet it is still the standard frame rate for movies.

            The reason is that people associate this with the cinematic feel of movies and has existed for so long it has basically become a cinema standard.

            Motion interpolation is a feature with a mixed reception: some like it, but most people hate it because it makes them feel like everything was shot as a cheap home video or soap opera (hence the name “soap opera effect”), similar to the mixed reception to 3D (which I am a fan of).

            Now, I agree that this has no bearing on video games but that served my point of 60fps not really extending past video games.

            For video games, I generally give it up to preference as there is no standard frame rate for video games. It’s not really about the number but about the smoothness. Some say that they will take 60fps every time, some say that they will take a smooth 30fps over slightly stuttery 60fps. Some will even take stuttery 30fps with frame drops if they think the game is good as proven by the scores of people that love games like Fallout 3 and Skyrim that performed horribly on consoles. It might even have a subjective quality to it depending on whether a person is more high octane or laid back, similar to people that prefer to call or text or how people prefer the morning or the night. I don’t know but 60fps is not any sort of standard at all and shouldn’t be treated as such.

            I won’t contend that either side is wrong or right. I’m a person that values high-quality textures and effects above all; I couldn’t care less about frame rate or resolution or shadows or whatever, as long as it’s at least 720p and around 30fps. That’s not to say that I don’t prefer games with 60fps given the choice, but there are some games that I think look very bad, or have some of their effect ruined, at 60fps, and for which I would take 30fps every time (Mass Effect 2, for example, for me).

            Now maybe you took my comment to mean that games shouldn’t be 60fps at all and I apologize for that misunderstanding as that is not what I meant at all. I in all honesty would be in support of all games giving you a choice between locked 30 or unlocked 60.

            My point is that making a big issue of not having a perfectly locked 60fps is really acting sort of entitled and spoiled. Games have not become unplayable at 30fps. The most that the average consumer will ever notice from higher frame rates are smoother transitions between frames which some like and some don’t and most don’t care. It doesn’t make you play any better, it doesn’t make the game any better or faster, it’s probably the last thing that the average consumer thinks about unless they are a really elitist PC master race type of person.

          • “My point is that making a big issue of not having a perfectly locked 60fps is really acting sort of entitled and spoiled. …. it’s probably the last thing that the average consumer thinks about unless they are a really elitist PC master race type of person”

            Tell that to all the entitled and spoiled elitist people who refunded Arkham Knight to the point that WB pulled it from sale, and has yet to reinstate it. If you prefer 30fps, then good for you. If I bought an XB1 or PS4, I’d be fine with 30fps too, because I recognize the hardware is too weak to push much higher than that. But on PC, where hardware is widely available that far surpasses the console internals, developers locking PC games to 30fps is legitimately unconscionable; it’s like opening a public race track where any type of car can join in, with the stipulation that you can only drive at a top speed of 30mph.

            And now with Steam Refunds in play, it’s a poor business decision (and poor PR) to assume your customers are unobservant and uncaring about such things as FPS locks, and overall product quality features such as providing reconfigurable controls and other utterly basic options like FOV, resolution, texture quality, proper anti-aliasing settings… Arkham Knight’s option of “Antialiasing: On” or “Antialiasing: Off” was a good indicator of just how little Warner Brothers & Rocksteady cared about the quality of the PC port.

            Until recently, most gamers had no recourse when developers and/or publishers released these low effort cash-grabs. Warner Brothers and Rocksteady learned the hard way that they can no longer get away with such minimum effort ports.

            So yeah… PC gamers are definitely “entitled and spoiled” for wanting developers and publishers to treat us like the paying customers we are instead of treating us as unattended wallets ripe for the taking. And we are most certainly “elitist” to expect the developers and publishers to provide a high quality product that will make us respect their companies for what they are capable of creating.

        • A lot of older console games played at 60FPS… it is not vital or essential, but saying it is unnecessary or a waste is a bald-faced lie. I prefer higher frame rates because I get irritated at lower frame rates.

          Also, using “most people” in an argument isn’t giving any source or point. Most people like to play FarmVille; what does that actually add?

          In videos it may be because you aren’t interacting but rather taking it all in. Input delay IS a thing.

  20. Nice, now I can see a much clearer picture of 30fps vs 60fps.
    A constant 30fps is not completely annoying, even though a stable 60fps is much better.

  21. As with most technical discussions, people use inappropriate examples to try and support their argument. The case of the Air Force flashing a single image on a screen for 1/220 of a second is a perfect example. Of course a person can see the image/after-image, but that does not mean that if the person were to view images at 30 FPS he would be able to differentiate between 30 and more FPS. They are two completely different criteria, and the single image flashed on the screen does not in any way mean that the eye/brain can differentiate between 30 and more FPS. Nor does the fact that the eye can detect changes in light mean that it can differentiate video images any differently at 30 FPS. The fact of the matter is, scientific testing has been conducted, and it demonstrated that people typically cannot tell the difference between 30 FPS and higher frame rates, even though some people will swear they can. Their belief is not supported by their responses in scientific testing.

    • This proves nothing. All you’ve done here is refer to “scientific tests” that somehow support your claims, but you have never provided any sources, or even so much as described the testing methodology. The fact of the matter is, there are lots of companies that produce monitors with refresh rates not only higher than the 30Hz you claim as the maximum we can perceive (which just about every display on the market already exceeds) but even as high as 144Hz. Furthermore, as stated in the article, even the film industry is starting to adopt higher frame-rates, well above 30FPS; if there weren’t any difference, no one would bother investing in these technologies.

      Also, once again, as stated in the article; even if you truly can’t see a difference in motion or smoothness between 30FPS and 60FPS in video, that still doesn’t change the fact that there is quite a bit of a difference between the two when it comes to interactive media such as video games because of input response time.

    • Klypto Kerrigan

      What scientific test? Where is your proof?

      Walk into Best Buy. Look at the TVs playing whatever kids’ movie they’re into these days. Notice anything weird about how the video feels compared to some of the cheaper models (if they have any)? That’s the TV and its content running at 60 / 120 / 144 FPS.

      https://boallen.com/fps-compare.html

      Or better yet, run this on some of your videos on your PC, the effect is instantly noticeable:

      https://www.svp-team.com/

      ^ Modern higher quality TV’s have something similar to this built into them that interpolates frames in case the video source is not >= 60fps to bring it up to spec.

    • There are millions of PC gamers who play games at 60+ FPS and can see the difference. Even some console games are 60FPS. Why would anyone waste time making games run at 60 or more fps, or make monitors that display more than 60 FPS, IF PEOPLE COULDN’T SEE THE DIFFERENCE? We don’t need scientific testing; we NEED asshats like you to stop debating something that is FACTUAL and not even close to open for interpretation. Articles like this should not even have to be written.

    • “Scientific testing has been done”

      Yet you provide no citations. You really expect people to just take your word for it?

      It’s very easy to see the difference between 30 and 60 FPS. I can tell when my framerate drops easily.

    • The first part of your comment was great but the second is pure retardation.

  22. I feel so bad cause I don’t see a difference

  23. So this article shows up now on my Google Now page on Android. The end result? The article is biased toward praising the overrated 60FPS and trying to ditch the underrated 30FPS as blatantly as possible.

    See, whether you like it or not, whatever’s on your monitor is not reality. And 60FPS is not absolute reality either. 60FPS, for me, is for the old games, while 30FPS is for today’s cinematic titles. Games drop the idea of focusing on the eye candy and instead make you focus on how ‘super fast’ it is when it’s 60FPS, but at 30FPS the eye candy is pretty much what you look at, in a cinematic way, and it helps the visuals stand out.

    About input lag? I hear it’s related to games that tie their input to the frame. Definitely not all games.

    And 60FPS being better than 30FPS is like saying $20 pay is better than $10? You dropped the ball BIG TIME here. 60FPS is not better than 30FPS just because it’s 30FPS higher and devs used it ages ago. I’d rather the devs focus on making great gameplay than trying to favor eye candy or FPS. So if I get a game today that runs at 20FPS but with great gameplay? It’s mine, and I’d prefer it to a lacking game with 60FPS. My only requirement in a 20FPS game is for the timescale to remain correct: a second IRL is a second in-game (see the sketch at the end of this comment).

    60FPS and higher just attempts to mix real life with fantasy in the visuals, and for that I don’t take it for granted. 30FPS mixes fantasy with fantasy and doesn’t look awfully out of place like its 60FPS brother; in fact, it fits. And no, I’m a PC gamer, but I’m waiting for a console because of the failure that is DRM on the PC platform.

    Thanks for the clickbait.
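
    Regarding the timescale requirement above: keeping one real second equal to one in-game second at any frame rate is exactly what delta-time updates provide. A minimal sketch (plain Python, no particular engine assumed):

      # Frame-rate-independent movement: scale every update by the real
      # time elapsed (dt), so a second IRL is a second in-game whether
      # the game renders at 20, 30, or 60 fps.
      import time

      position = 0.0
      SPEED = 5.0                        # units per real-world second
      last = time.perf_counter()
      while position < 50.0:
          now = time.perf_counter()
          dt = now - last                # seconds since the previous frame
          last = now
          position += SPEED * dt         # same speed at any frame rate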

    • I don’t think you really understand how many of the various mechanics you’ve brought up actually work. You seem to be equating FPS with gameplay, as if you had to choose one or the other. That’s false.

      You’ve mentioned input lag and said that it’s related to games that “tie their input to the frame.” This is not at all the case: input lag is present in any title and is always more noticeable at lower frame-rates; it’s just a fact (see the loop sketch below).

      You speak of 60+ FPS as if it were some dated technology that has long been outgrown. This too is untrue. Yes, 60 FPS was much more common in earlier days, as a game dev wouldn’t dream of limiting their FPS to half of the display’s refresh rate, but that doesn’t mean it’s a dated technology. It’s simply the better way of doing things. If you don’t believe me, then simply ask yourself why VR developers are targeting 90+ FPS for their hardware.
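
      To make this concrete, here is a minimal fixed-timestep game-loop sketch (illustrative Python with stubbed I/O, not any engine’s actual code). Input is sampled once per frame, so a press that lands just after a poll waits up to a full frame — about 33ms at 30 FPS versus about 17ms at 60 FPS — before the game even sees it:

        import time

        TARGET_FPS = 30                  # try 60: the worst-case wait halves
        FRAME_TIME = 1.0 / TARGET_FPS

        def poll_input():
            pass                         # read keyboard/controller state (stub)

        def update(dt):
            pass                         # advance the simulation by dt seconds (stub)

        def render():
            pass                         # draw the frame (stub)

        while True:
            start = time.perf_counter()
            poll_input()                 # a press between polls isn't seen until here
            update(FRAME_TIME)
            render()
            # sleep off whatever remains of this frame's time budget
            elapsed = time.perf_counter() - start
            time.sleep(max(0.0, FRAME_TIME - elapsed))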

      • VR needs more frames because this time you are actually part of the fantasy instead of having the fantasy laid out for you on a screen, and you’d expect your eyes to be treated well in a game. But I’m not gonna bother about VR.

        Input lag doesn’t matter with FPS unless your game has the two tied together. Some games will have you turn as easily at 30fps as at 120fps as at 240fps, and some games will basically have ruined input at 30fps and good input at 60fps. I know it.

        I believe 30fps and 60fps should both be options, not have gamers like 80% of the responders here hunt like wolves for 60fps and leave 30fps behind.

    • – 30fps is not an artistic choice, and it’s not cinematic. If I wanted a cinematic experience, then I’d go and watch a movie. The choice to go with 30fps is made because of technical limitations.

      – 60fps is not just for old games. In 2015, 60fps should be the standard, but sadly it’s not. We have developers pushing out fancy graphics instead of fancy gameplay. Yes, 60fps might have been more common in the old days, but that doesn’t mean it’s inferior in any way.

      – 30fps is not higher than 60fps. Where on earth do you get this line of logic from? Even a little kid knows that 60 is greater than 30.

      – You’re waiting for a console because of DRM, even though DRM isn’t that big of an issue to begin with. Only a few games have shitty DRM, and it usually gets cracked pretty quickly anyway. There are also DRM-free services such as GOG. In all my years of PC gaming, only two games (Red Alert 3 retail, and Anno 2070) have given me DRM issues. That’s just two out of the hundreds of other titles that are on PC.

      – You’re also arguing that 20fps = real-time, when FPS isn’t real-time to begin with. It’s a measure of how many frames something can put out per second.

      – Yes, there is far less input lag at 60fps than there is at 30fps.

      – If good eye candy at 30fps is so great, then why did The Order: 1886 flop? It’s because it sacrificed a good frame rate and decent gameplay for fancy visuals.

      – 30fps does not mix fantasy with fantasy. I’d like to know where this (lack of) logic came from. Frames per second has nothing to do with the genre of a game.

      • -30fps IS cinematic. Movies do 24fps; bumping it 6fps more still retains the feeling. Oh, and if I want visuals, I’d like a movie too; it’s just that gameplay must be a priority.
        -Both have their merits, duh.
        -The reason why 30fps is better (for me) is the cinematic feeling. Until I see 90% realistic visuals, 30fps fits.
        -Yay! Let’s pretend DRM doesn’t exist! What about Steam, Origin and Uplay? They are all bad anyway, even if some are better than others. Yeah, I would love GOG to pick up the pace and burn Steam, Origin and Uplay in hell.
        -I forgot what point I brought up about this. Let’s continue; I am too busy seeing so many headless people responding.
        -No. It’s tied to the framerate if you feel you’re better at 60 than at 30.
        -And I don’t care about The Order. The best recipe for winning is great gameplay, then all else afterwards.
        -Lack of logic? More like a SURPLUS OF LOGIC. 30fps makes me believe even more that what is on the screen is a fantasy. As in, not real at all. Near-real movement in a not-real environment is off-putting.

        • Small dick geeky moron. . .

        • Anything that you can get from Steam you can crack within about 30 seconds to run without Steam.

          And then there is the smarter way of doing things: try it before you buy it (get it from a torrent). Then, if you like it and it works… THEN you pay for a license for it through Steam and link the already completely downloaded and installed software to your Steam account.

          Badaboom, badabing: not a thief, and you haven’t given some shit-tier developer free money for utterly unstable, not-fit-for-purpose software.

          Or shitbirds like Laminar Research (X-Plane), who _still_ make a pretty decent but stock-standard flight sim that _still_ doesn’t include their original drawcard stand-out features… while they’re advertising iPhone and Android ports. Another ‘but it’s still in beta’ product that has been through at least 3 full version updates since release, and they want to charge a goddamn $80 for it?

          Software companies take the piss out of us because people who insist on fangirling out constantly let them.

      • Right. About your 20fps = real-time point: no. I never argued about 20fps beyond saying that if there were a 20fps game with great gameplay, I’d prefer it over a 60fps game with no gameplay. The point I’m making is that the gameplay should be the priority, not the framerate.

    • I hope this is just really dedicated trolling. The other replies have pretty much debunked every absurd point you have made, but I’d just like to go into a bit more detail on input lag.

      Your point about it only applying to games that process input frame-by-frame is absurd. Even if we pretend there is no delay in processing inputs, what is the average time that an input will be seen as a matching output at 30FPS? It’s 1/60th of a second. What’s the average time that an input will be seen as a matching output at 60FPS? It’s 1/120th of a second. The fact is, even assuming that inputs are processed instantly, there will always be a greater delay between input and feedback at lower frame rates.
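
      To put numbers on that, here is a tiny back-of-envelope calculation (assuming instant processing and a press landing at a uniformly random moment within a frame, so the mean wait is half the frame time):

        # Average added input-to-display delay from frame timing alone.
        for fps in (30, 60, 144):
            frame_ms = 1000.0 / fps
            print(f"{fps:>3} FPS: frame = {frame_ms:5.2f} ms, "
                  f"average wait = {frame_ms / 2:5.2f} ms")
        # 30 FPS -> ~16.7 ms (1/60th s); 60 FPS -> ~8.3 ms (1/120th s),
        # matching the figures above.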

      • Nice point. You mentioned ‘will be seen’, not ‘will be processed’. I don’t care if I can’t see my input a fraction of a second sooner. You still essentially tell the game to move 5 steps, whether at 30 or 60. And that mini-fraction of a second doesn’t make much of a difference until the fractions get closer to a full second. Then you’d be suffering framerate lag, but your input would remain fine in a properly programmed game.

    • “Cinematic”, haha. If you had a game today that ran at 20FPS, you’d probably have a severe headache before you came to enjoy any aspect of the gameplay. I have my hopes that you’re just doing some internet shenanigans here, but your other Disqus comments seem to be equally thoughtless. But that’s pretty much what you’d expect from an IE user who prefers Uplay over Steam.

      Thanks for the laugh.

      • “If you had a game today that ran at 20FPS, you’d probably have a severe headache before you came to enjoy any aspect of the gameplay.”
        Yep, absolutely. A great example of this is the N64 version of The Legend of Zelda: Ocarina of Time. It is hard-coded at 20fps, so even playing in an emulator won’t help (and if your emulator says otherwise, it is lying).
        Playing this game is an exercise in frustration after playing more modern games, simply because the low framerate causes input-lag issues. I’d recommend the 3DS remake instead.

      • Yay some stupid brainless zombie.

        DRM-free >>>>> Origin > Uplay > Steam

        And about IE? Yeah. It must be pretty trollish for me to use IE11, which consumes less memory than Chrome and is better than the IE you have on Win7 or WinXP. Yeah, this SURE has to do with framerates.

    • LOL, you’re stupid as shit, mate; if only you knew how many people are making fun of you right now.

      Just take a look: https://www.reddit.com/r/pcmasterrace/comments/3c6ccn/so_i_wrote_an_article_about_30_vs_60_fps_a_while/

      So PLEASE, don’t spread your half assed bullshit. Nobody wants to be bothered by your stupidity.

      • The PCMasterRace subreddit? Just shows how stupid and how rotten to the core your PC fan base happens to be. No wonder why console guys hate PC guys.

    • HAHAHAHAHAHAHAHA you’re dumber than a rock and apparently as blind as one.

    • The main benefit of a high framerate is better gameplay, not just the eye candy. At 20fps, many games would be nearly unplayable.

      • If the game is properly designed around 20fps, it should work. Although I doubt any developer is going to think about doing that, let alone actually properly optimize around 20fps.

        That said, 30fps just happens to work great for gameplay today. But so would 60fps and up.

    • Not sure if trolling or just plainly stupid.

    • Holy crap you’re retarded.

    • I generally don’t like to do this, but I’ll just let the others respond intelligently.

      You are one of the single stupidest people I have EVER encountered on the internet. This video is the only thing that can respond to your comment:

      https://www.youtube.com/watch?v=BcjIestFVOc

      • And you were so emotionally shaken by your own stupid thought that you had to link a video I’ll never watch anyway, and rant. Yeah, you sure didn’t tell me where the stupid is. That’s stupid enough of you.

    • Are you kidding?

    • You realize that every single console game is DRM-locked, right?

    • “I’m a PC gamer, but prefer 30 FPS” is the new “I have black friends, therefore, can’t be a racist.”

    • “I’ve never seen so many buzzwords used so incorrectly.”
      Some guy on r/pcmasterrace

      • Some guy on PCMasterRace? Yeah, he must be stupid.

        • lol, I love how no one from PCMasterRace bothered to come here and correct you. It would be way too time-consuming anyway. You know nothing about this, yet you pretend to know everything while insulting anyone who opposes you, Peasant.

  24. What simplistic arguments. Now we PC gamers suddenly have a 60 FPS cap or what?

    Console users have that 30/60 fps limit. We PC users have whatever FPS limit we bloody want.

  25. Erick Costa Fuga

    Half of this article is based on your opinion and has absolutely nothing but your word to support it. If people blindly believe everything they read here, they’ll leave even more ignorant.

    • Could you please point to a single point made that is not factual?

      • Erick Costa Fuga

        I was a little too hard on you. When I posted here I was talking to another person, who used your article to prove some points that were not actually in it, so I guess I got it wrong.

        Still, frame rate has nothing to do with input lag. If anything, it would be output lag(?). Even so, unless you play a game where the objective is to shoot little targets that cross the screen in a second, you won’t have problems with this. In shooters, the targets are big and slow enough that it won’t make a difference. Also, I guess it’s impossible to either prove or refute this, as you would have to run “blind” experiments with a considerable number of people. So unless that has been done, it is a matter of opinion. Obviously there is a break point; the thing is: is it 30fps? People have been playing at 30fps for many years, and PC players are used to playing between these two “marks,” so I think yes, 30fps is a break point. I hardly think you will be at an advantage playing at 60fps against someone playing at 30fps. It is not input lag and, again, even if we consider that it may – technically – cause slower reactions, the effect would be so small that a person wouldn’t be able to notice it and take advantage of it. In the end it is just “perceived smoothness.” Also, there are so many things increasing latency, in both input and output, that the frame rate, above 30fps, would be the least significant one. Mathematically it may make sense, but realistically it doesn’t.

        Now, about preference: if you ask me whether I prefer 30 or 60 fps, ignoring every other factor, I would obviously choose 60 (one would have to be retarded to choose differently). But it’s not that simple. In reality, you can play at 60fps with worse graphics or at 30fps with better ones. Generally I prefer the second option, and again, there is no fact here; it is a personal choice (well, it also depends on the game – if I feel it’s not smooth enough, I may tune some things down a bit).

        This topic has especially caught my attention lately due to the news that Halo 5 may run at 900p@60fps instead of 1080p@30fps, like it’s a good thing. If you ask me, I don’t think it is. To me it’s an excuse by the developers, because it’s easier to ship a game running smoothly at 900p and 60fps than at 1080p and 30 (let’s face it, a 5 fps drop from 60 to 55 is nothing, but from 30 to 25 it’s fairly noticeable). But this discussion will always be around, because no matter how good the computer is (consoles included), it will always be able to run better graphics at a lower frame rate. Maybe the day will come when graphics are so good that the frame rate will be far more noticeable, but to me that day is not here yet. For example, I chose to play Witcher 3 at ~45 fps with everything maxed out and full HairWorks rather than at ~60fps with HairWorks off (OK, 45 is not 30, but the “fps vs. quality” trade-off still applies).

        But anyway, I’m sorry if I gave the wrong impression. You wrote a nice article there; as I said, I was just biased by another person who used it as an argument for something that was not actually said here. See ya!

        PS: What a giant comment I have written… lol

        • Those are some good points you bring up. I don’t disagree with the idea that it isn’t a major competitive advantage; there obviously isn’t enough (or any) data to really support that. However, regardless of advantage, it is a better experience, and one that is usually considered a necessity for competitive gamers.

          As someone who plays solely on PC with a fairly high-end GPU (GTX 970), I will say that in most games I am able to get well above 60 FPS (usually around 100–120) while still playing at very high settings. Sure, I could turn up the graphics settings and play at a lower frame-rate, but in the vast majority of titles I never find the increase in graphical fidelity noticeable enough to be worth the lower frame-rate, and thus the poorer experience.

          All that being said, I do appreciate your comments and your overall maturity in this conversation.

          Have a good day!

  26. You can see a slight difference; you have to be blind not to see a difference, and I wear glasses too.

    https://testufo.com/#test=framerates

  27. PC master racers are morons. But hey, gotta have something when you’re fat, ugly, and a virgin, I guess. I will stick with both consoles AND PCs.

  28. I like 30fps, but the delay from input (keyboard and mouse) is killing me…

  29. Utter baloney. I really feel for those aircraft pilots, as the whole electrically lit world would seem to flicker all the time.
    Lights turn on and off at a rate of about 50Hz (50 times per second); now try to look at a lightbulb and see if you notice it flickering.
    Now get an oscilloscope, attach a bulb, and set it to 35Hz. See it flicker? Congrats, you belong to the special 0.2% of people with above-average senses. Set it to 24Hz. Don’t see it flicker? Congrats, you’re normal.
    The only reason you ever want more than 24fps is uneven frame times: some frames will take 1ms while others take 50ms+. Usually somewhere between 30 and 45 fps is enough to offset this for 99% of the game. But more will definitely give a better result. It’s just a question of whether you want to spend money on it.

    • That’s a good argument. But honestly, I did notice the difference with higher-fps videos. The movement is more fluid to me compared to the usual lower fps.

      • It’s a fucking terrible argument.

        Although I’m not surprised a basement-dwelling future neckbeard (once they hit puberty) can’t tell the difference between the outside world and the computer screen.

        I run both a PlayStation and an MSI gaming laptop. The PlayStation generally sits there playing Foxtel while I game on the laptop.

        If I run a game and it stutters, I turn down the graphics. Do I notice a difference between the console graphics and the PC? No, not for a single bloody moment.

        People make all kinds of whiny arguments for why they’re right no matter what and no matter how ridiculous. This is no different from going through the ‘goofs’ sections on IMDB. A person who enjoys particular movies just watches the movies, and the flaws (within reason) in the background generally go unnoticed…

        And then there are people with waaaaaaaaaaaaaaaaaaaaaaaay too much time on their hands and nothing to do, ripping apart movies left, right and centre to appear far smarter than they are.

        The whole PC/console thing, the 30/60fps debate, and the comments in here are exactly that.

        It’s purely subjective whether you actually notice the frame rate. A shitty game that runs badly on shitty hardware will always look shitty no matter what you do. A well-made title will always look good no matter what it’s running on.

        To think otherwise pretty much makes you a fanboy, no matter which side of this stupid, stupid fence you’re on.

    • My brother used to play CS:GO at 144Hz. He had a job at a factory, and the factory had fluorescent tubes that flickered at 50Hz. He DID see the lights flicker constantly. It bothered the heck out of him.

      I also have another theory that explains why people would be able to see a difference between 75Hz and 144Hz, even if their eyes can only send images to their brain at, let’s say, 60Hz. Let’s simplify it by saying your screen and your eyes can each show you approx. 2 fps. If they are not synchronized, the moment your eyes see an image on the screen, that image is from what happened 0.001–0.499 seconds ago in the game. The image you see is outdated. If the framerate of your eyes does not align neatly with the framerate of the screen, you will get fluctuations in the delay. The size of those fluctuations tells you something about the framerate.
      Example: 2fps -> delay between 0 and 0.5 seconds
      10fps -> delay between 0 and 0.1 seconds
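
      That theory is easy to play with in code. A rough simulation (assuming the “eye” samples at a uniformly random phase relative to the screen):

        import random

        def average_staleness(fps, trials=100_000):
            # The image caught by a random-phase sample was drawn anywhere
            # from 0 to one full frame interval ago.
            frame = 1.0 / fps
            return sum(random.uniform(0.0, frame) for _ in range(trials)) / trials

        for fps in (2, 10, 60):
            print(f"{fps:>2} fps: delay 0..{1.0 / fps:.3f}s, "
                  f"average ~{average_staleness(fps):.3f}s")
        # 2 fps -> delay between 0 and 0.5 s; 10 fps -> between 0 and 0.1 s,
        # matching the examples above.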

    • People who are used to high fps CAN see a difference between 60 and 144 fps, but you only need such high fps for games in which split-second reactions make all the difference. For any other game you would only need high fps to satisfy such gamers’ craving for a fluid image.

  30. Hi, I’m from the future. I have a 144Hz monitor, and the difference between 144 fps and 60 is enormous.

  31. No one cared about the FPS on Xbox 360 or PS3 before; I still don’t understand why they do now. If people CARE so much about FPS, then get a PC…

  32. If you have always played on just a console, NO, it doesn’t matter, because you never had 60fps to begin with. The PS, Xbox, PS2, PS3, and Xbox 360 did not care about FPS; we just wanted to play games.

    There are only 2 types of gamers who care about FPS: PC gamers, and PC gamers with consoles for the exclusives. True console gamers don’t care; we just want a console and the games.

    If you say it does matter, then you are a PC gamer or a PC gamer with a console. If the consoles did 1080p/60fps, there would be no need to have a PC.

  33. Say what you want about FPS, I still don’t see the difference.
    I play on consoles and on PC and everything looks the same to me.
    Even on Borderlands 2, where you can choose the FPS you want. I played it with every FPS setting and it’s always the same to me.

    Also, I play Battlefield 4 on PC online at 30 FPS and I never noticed any lag.
    Lucky me I guess.

  34. If you have always been a console gamer, it doesn’t matter. Let’s look at it like this: say you drive a Corvette and next drive a Honda; then you will see a difference. If you always drive a Honda, you won’t know the difference, and therefore it doesn’t matter.

    The people who say it does matter are PC gamers. No console gamer is going to worry about it; we just play games.

  35. Hecz ✓ᵛᵉʳᶦᶠᶦᵉᵈ

    Console-only gamers be like, “Eh, there’s no difference.” Believe me, there is, fellow console brethren. What I can tell you from my experience (I own a PC as well as a PS3/PS4): I played Mirror’s Edge on PC and PS3. Normally I don’t get headaches, but when I played ME on the PS3, I did; when I played it on PC, everything was so smooth, and there were no headaches either.

    But keep giving that excuse, console-only gamers. Before calling me a fanboy: I play more on PS than on PC, so save your fanboy comments for the birds.

  36. I can’t tell the difference between mp3 and lossless flac and I think 720p is as sharp as a cellphone screen ever needs to be, but I can EASILY tell the difference between 30 and 60fps.

  37. 30fps is good for cinematic games (3rd-person action-adventure). E.g., the Uncharted / TLoU campaigns look (or would look) like they’re running at 2x speed, cartoonish, when played at 60fps, because these are games intended to be played at 30fps and to provide a more movie-like experience.

    Maybe in 10 years 60fps games will manage a cinematic feel too, but for now 30fps is perfect for current technology to provide a movie-like gaming experience.

    60fps is for regular and dynamic games — definitely for FPS / racing / platforming and other games that require instant response and smooth / precise / balanced movement.

    240/300fps is for pro and competitive gaming (DOTA 2, CS:GO at the pro/competitive level).

    And full multiples of 30fps are always better: not something between 30 and 60, but either 30 or 60, because then the picture syncs and displays properly, without frames displaced in time from the proper image, so immersion and focus aren’t interrupted — which really matters in racing, platforming, and dynamic games (see the pacing sketch below).

    For instance, on PS1/PS2, PAL ran at 25/50 fps and NTSC at 30/60 fps, and there is a significant difference in dynamic/racing/platforming games (mostly on PS1).

    There y’all have FPS demystified —

    the opinion/knowledge of someone who has been doing speed runs / hardcore gaming and competitive gaming since ’99, and who is involved with gamedev.
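
    On the “either 30 or 60” point: on a fixed 60Hz display (no variable refresh rate assumed), each game frame is held for a whole number of refreshes, so only rates that divide 60 pace evenly. A quick sketch of that idea:

      def pacing(game_fps, refresh_hz=60, frames=8):
          # How many 60 Hz refreshes each game frame stays on screen.
          r = refresh_hz / game_fps
          return [int((i + 1) * r + 1e-9) - int(i * r + 1e-9) for i in range(frames)]

      print("30 fps:", pacing(30))   # [2, 2, 2, 2, ...]        -> even, smooth
      print("45 fps:", pacing(45))   # [1, 1, 2, 1, 1, 2, ...]  -> uneven, judder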

  38. David Michael McLeod

    Obviously written by a PC elitist.

  39. Riasat Salmin Sami

    This article exactly copies Linus’s video.

    Or is it the other way around?

  40. I play all PC games at 4K on a 58-inch HDR TV. I use the MSI Gaming X 8G 1070, which is just about the fastest 1070 card.
    I am playing Watch Dogs 2 at 2160p with everything maxed out. At 4K, AA isn’t needed. The game looks gorgeous and runs at a minimum of 30fps. The game is completely smooth and the frame rate never stutters. When driving at very high speed, the frame rate never dips and there’s no tearing or stutter.
    I tried the game at 1080p. Obviously, at this low resolution AA is needed. The settings were otherwise identical. The frame rate went up massively, but the graphics were nowhere near as detailed and well defined at the lower resolution.
    I will say that Forza Horizon 3 runs at a constant 60fps at 4K, and Watch Dogs 2 still looks way better. But maybe this is a case of GINGO.
    There’s no answer to this resolution vs. fps debate, as we all see the world differently.
    All I can say is that I find 4K gaming to be vastly superior to 1080p. Yes, the fps is higher at the low resolution, but it’s graphically inferior.
    High-speed driving at 4K at 30fps in WD2 is noticeably superior to 60fps at 1080p. Game speed was identical at both resolutions, but the 4K graphics just looked so much better.
    This is a never-the-twain-shall-meet debate, and it comes down to what the individual can actually see rather than to numbers.

  41. Uhmmm… we’re playing video games that take place in cartoon worlds with cartoon characters. I love when people get all excited for better, “more realistic” graphics. Me: “Hey, did you forget that it’s a cartoon?” I can be startled by the most horrific-looking monster (horrific-looking in graphics terms) regardless, so really it’s a moot point. So what if it may look better? How hard is the game, how many patterns are there for me to try to figure out, how much interaction do I have with the scenery, are there tons of areas where I have to search for important items, how much mobility does my character have? These things make a game more challenging and fun. So regardless of whether you choose PC or console, real gamers know these are the important questions, not the visual appeal.

  42. Achmad Faisal Faiq

    Developers should just add 2 options (30fps and 60fps). Gamers can choose what they want.
    I myself prefer 60fps because the game runs so smooth!

  43. Marc Anthony Forrester

    I came looking for answers. I left realizing I just don’t care that much.

  44. I’m all for higher-fps games, due to shadowing and light textures, but sorry, I gotta call bullshit on the input lag. Input lag has nothing, nothing, nothing to do with fps. Input lag is how quickly your hardware interprets the signal from your controller and displays it on screen; this has nothing to do with fps. You could be running at 5000 fps and still be sitting for 0.5 seconds waiting on your system’s hardware to interpret and visually show the effect of the button you just pushed on another piece of hardware. I myself have experienced input lag on my PC due to my keyboard batteries being low, but this had nothing to do with the fps my machine is capable of.

  45. Is it just me, though, or is most stuff not fast (or slow) enough to only be on screen for 1/220th of a second? If something blurred across the screen that fast, then I don’t think you were meant to see it. Considering how consoles are supposed to be rather affordable, surely that’s the reason they limit FPS. I personally might say “I’d prefer lower FPS” if it costs me less and does a fairly effective job with what I play. I’m thinking of building my own computer within the next week, and from what I’ve read, Intel does better on FPS; but even if it’s halving input reaction time, it’s only milliseconds, which shouldn’t be too much of an issue, especially if you know you’re up against higher-FPS opposition and can tactically play to surprise them and get the jump on them… I think I’ll make an attempt to get something AMD with 60 FPS, but I definitely won’t be going out of my way for such small increases in detail that aren’t required. Until the superhero Flash gets his own game, it seems rather redundant. More detail is better; being money-conscious is even more so.

    • This article was very useful in showing me that higher FPS really isn’t the most important thing to me. I can now see its effects, which was really informative, and more information in the same amount of time is definitely awesome. But if it’s just a few extra pixel positions of a big-ass character in a game, and my sights moving slightly faster to a target, is it really worth it? Especially when, from what I’ve seen, the GPU is so much more important.

  46. There is no difference between 30 and 60 lmao
