Artistic 60fps lock

I am just going to leave this here.

Who cares?

60 is all you need, anyways, you niggers.

It's not like you can see more than 24fps anyway.

...

...

Huh

...

I guess 60fps is the new 30fps, huh?

Can't wait for console titles to all be 60fps while everyone on PC is at 240 and all the plebs come out crying "You can't see more than 60fps anyway!".

Is that the fucking mouse from An American Tail?

I believe they mean the operating frame rate, which is a sort of internal tickrate.
It's something you can feel more than see. I'm not sure if Valve ever fixed this, but in Source games the internal tickrate is stuck at exactly the same rate as the rendering framerate, so if you set a really low FPS cap the game is significantly harder to control.
IIRC this is why Half-Life 2 and Portal 2 speedruns are capped at 120fps.

That already happened. People who care are at 144FPS while everyone else is just now catching up with 60FPS. People are advancing far too slowly.

At least 60 looks nice. I'd rather everyone slobber all over it than defend 30.

I found the economic sweetspot with 1080/75

...

I want the artfags to leave.

i'll check these here

The biggest meme ever to have memed

Who gives a shit? 60fps is more than enough for anyone who isn't autistic.

Stay pleb, niggers.

You can't see more than 3 colors either. The human eye only has three types of cone cells, just like component cables only have 3 color channels. Everything else is just shades and mixtures of those three. Having graphics cards that can produce umpteen millions of colors is just marketing overkill.

Maybe yours.

Have we entered the point in which 60fps isn't good enough anymore or somethin'?

Why? So you can post screenshots with your FPS counter of 200 in the corner?

It was never good enough and it was only shitty hardware that made people tolerate it.

There is definitely good reason to increase refresh rates in first person shooters. 60hz is practically unplayable once you've tasted 144hz. In other genres, however, with much less direct control of the camera, I honestly almost never notice the difference between 60hz and higher refresh rates. I'd still like for refresh rates to increase for input lag and framerate consistency reasons, but there isn't that viscerally different result in a third-person gamepad-based action game like there is with directly controlling a first person camera with your mouse.

There are very good reasons to lock a framerate to 60hz rather than target framerate independence, because of the implementation details behind that.
You can use a variable timestep (i.e. stepping the physics simulation by a dt of 1/(framerate) every frame). This works, but makes it much more difficult to reproduce bugs, and causes tons of physics and animation bugs at higher framerates (see modded Dark Souls, where you start getting stuck in terrain if you try to play above 60hz or so).

Or you can separate a locked simulation framerate (say, locked to 60hz) from a variable rendering framerate, and interpolate between simulation frames to get arbitrary "tweened" rendering frames. This works well for online multiplayer action games, which already have a disconnect between the game's state and the player's view by necessity, but would introduce 1 or 2 frames of lag in the simulation in a single player game relative to directly rendering the game state.
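
(For reference, the locked-simulation-plus-interpolated-rendering loop described above looks roughly like this. A minimal sketch, assuming invented names like World, step() and tween(); it's not pulled from any actual engine.)

    // Minimal sketch: simulation locked to 60hz, rendering at whatever rate the
    // loop runs, tweening between the last two simulation frames.
    #include <chrono>

    struct World {
        double x = 0.0, v = 50.0;            // one object: position, velocity
        void step(double dt) { x += v * dt; }
    };

    static World tween(const World& a, const World& b, double alpha) {
        World out = b;
        out.x = a.x * (1.0 - alpha) + b.x * alpha;   // blend between sim frames
        return out;
    }

    int main() {
        using clock = std::chrono::steady_clock;
        const double dt = 1.0 / 60.0;        // fixed simulation timestep
        double accumulator = 0.0;
        World previous, current;

        auto last = clock::now();
        for (int frame = 0; frame < 1000; ++frame) {   // stand-in for "while running"
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - last).count();
            last = now;

            while (accumulator >= dt) {      // run 0..n fixed steps this frame
                previous = current;
                current.step(dt);
                accumulator -= dt;
            }

            double alpha = accumulator / dt; // how far we are between sim frames
            World view = tween(previous, current, alpha);
            (void)view;                      // a real loop would render(view) here
        }
    }

Note the tradeoff mentioned above: the rendered view always sits between the previous and current simulation frames, i.e. slightly behind the newest state.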

With a single player action game, whose primary market is consoles where 60hz is the max expected to begin with, targeting variable framerates would be a ton of work that makes the game play worse on the intended target device. Makes no sense to bitch about this.
Here are the two reasonable exceptions I can think of to this:

Games could handle the view separately from the rest of the rendering state. Use the latest mouse or analog stick inputs to change the camera while leaving the rest of the view unchanged. You can see how this looks right now if you load up Halo:CE and play an online match. The players you fight and their positions are locked to 20hz or 30hz (I don't remember which), but your view is rendered at whatever your monitor can handle. It's a pretty jarring effect at that low of a simulation rate, but would probably look a lot better with a 60hz simulation rate.

Or, they could simulate the game at some greater multiple of 60hz, like 240hz, which would both reduce input lag in the simulation and provide plenty of frames for a wide variety of rendering rates, at the expense of a potentially significant increase in CPU load.

tl;dr shit's more complicated than you think

I've got a 144 Hz monitor and the hardware to support it. It sucks when I can't go to the max because of a lock, but 60 FPS is adequate. Anything beneath that is unacceptable on PC.
It honestly does not make a difference for certain genres. 60 or 144 is marginally different for Nier: Automata but means everything in a multiplayer shooter.

There's also the es_core approach, where you sample input in a separate thread and the renderer interpolates between the latest timestep update and your input (rough sketch after the links below). This gives you the stability benefits of a fixed physics timestep with the smoothness and low input latency of a variable timestep. Multiplayer shooters usually send and receive updates from the server at a fixed rate regardless of timestep anyhow, so this probably wouldn't have many downsides aside from needing a dedicated input thread.
Similar tricks are out there, usually dreamed up by the VR industry to make headsets feel responsive even with forced vsync. Some of them will probably make their way into regular vidya eventually and even if VR doesn't catch on, we'll probably get more responsive games in the future.
ttimo.typepad.com/blog/2013/05/es_core-an-experimental-framework-for-low-latency-high-fps-multiplayer-games.html
web.archive.org/web/20140719053303/http://www.altdev.co/2013/02/22/latency-mitigation-strategies/
blogs.valvesoftware.com/abrash/
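
(Very rough sketch of that idea, with invented names and numbers; the real design is in the ttimo article above. The point is only that the freshest input gets applied at draw time, independently of the fixed physics rate.)

    // Input polled on its own thread; the renderer reads the latest value and
    // applies it to the camera on top of whatever the fixed-rate sim produced.
    #include <atomic>
    #include <chrono>
    #include <thread>

    std::atomic<float> latest_yaw{0.0f};     // written by the input thread
    std::atomic<bool>  running{true};

    void input_thread() {
        while (running.load()) {
            // pretend mouse read; a real version would poll the OS / raw input
            latest_yaw.store(latest_yaw.load() + 0.001f);
            std::this_thread::sleep_for(std::chrono::milliseconds(1)); // ~1000 Hz
        }
    }

    int main() {
        std::thread input(input_thread);
        for (int frame = 0; frame < 600; ++frame) {
            // ... fixed-rate simulation + tweening as in the loop further up ...
            float yaw = latest_yaw.load();   // freshest input, applied at draw time
            (void)yaw;                       // render(view, yaw) would go here
            std::this_thread::sleep_for(std::chrono::milliseconds(7)); // ~144 fps
        }
        running.store(false);
        input.join();
    }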

Faggots don't know about frame-timing.

Remember back before the LCD cancer? When we ran our displays at 85Hz-120Hz instead of locking them to 60? When you could actually see motion that fast instead of gray-to-gray transitions failing at times to keep up with even 15 FPS?

Dedicated gayman OLEDs can't come soon enough.


FPS puts a hard floor on lag, one that fluctuates as each frame is drawn. For instance, at 30FPS an input will wait anywhere from 0ms (arriving just after a frame is sampled) to 33.3ms (arriving just before the next frame), whereas 120FPS imposes a maximum of just 8.3ms (worked out below). This is particularly exacerbated if your display can only accept whole frames instead of scanning them out from the GPU via the display interface in realtime, pixel-by-pixel and line-by-line, in which case an ADDITIONAL lag floor of 1-2 frames (16.7ms-33.3ms @ 60fps) is imposed. Naturally, this isn't a factor with audio, which has a "framerate" of 40000Hz+.

Input lag is also imposed by other sources (pixel response on LCDs, interface controller on digital displays, the GPU, very badly written game engines, the OS, drivers, interfaces, badly designed input devices), but this is in addition to that imposed by FPS. *Oh, and there's the human sensory-neural-motor system's natural latency of about 100ms-200ms, which is added on top of everything else.*
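
(The frame-time arithmetic from the post above, for anyone who wants to check it; purely illustrative numbers.)

    // Worst case, an input lands just after a frame is sampled and waits a whole
    // frame; every whole-frame buffer in the display path adds another frame.
    #include <cstdio>

    int main() {
        const double rates[] = {30.0, 60.0, 120.0, 144.0};
        for (double fps : rates) {
            double frame_ms = 1000.0 / fps;
            std::printf("%6.1f fps: 0..%5.1f ms sampling lag, +%5.1f ms per buffered frame\n",
                        fps, frame_ms, frame_ms);
        }
    }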


It all boils down to games being written with techniques that were universally condemned in the 1980s. There is absolutely no excuse (especially when you're writing in high-level languages like C++ or Lua, and your graphics are all float-64 vector polygons or scaled sprites with keyframe-tweened or realtime-simulated animation).

The VR-style view-warping holds promise as a great way (especially for emulators with HLE graphics) to get recalcitrant older games to higher FPS without hideous SVP-style laggy 2D interpolation.

Google 8088 MPH and come back

I thought the same thing years ago, but once I got a 144hz monitor I realized how wrong I was. You just haven't ever seen anything above 60 fps because most standard TVs and monitors come at 60hz. The jump wasn't as obviously noticeable as the one from 30 to 60, but it feels so damn smooth that I cannot play any FPS at 60hz anymore.

Unless you're a CS:GO ePeen faggot, 60 fps is perfectly fine

Your eyes actually enjoy a higher FPS more than a lower FPS, and you can notice a difference between framerates; the difference is basically choppiness vs smoothness. The Hobbit was shown at 48 fps and people noticed, since it was outside the norm for movies at its release.

...

You can, for all intents and purposes, see an infinite framerate.

When it comes to motion, the minimum you need is about 15; the reason higher framerates are preferred is all due to the speed of the motion you need to convey. Framerate is about giving your eyes the maximum number of reference frames in a given time. For some things you REALLY don't need that many reference frames; you can get by with as little as 10 frames of reference per second. The more reference frames you do have, however, the more "detailed movement" can be conveyed and the less your brain has to fill in the gaps. You generally don't need more than 60fps for simple movement like a man walking across a floor. You DO need more than 60fps if you want to accurately depict the movement of fast objects like cars, airplanes, or even bullets.

fuck you i'm not buying a new monitor.

honestly that sounds like an excuse for "I tied physics to the framerate"

Do you remember when 60 fps was a muh graphics meme brandished by big companies, and only became an all-important issue when it turned out they couldn't back it up?
Me neither.

Good for you faggot, if you don't want a new monitor you don't need one. Not having games retardedly "capped" at a specific framerate simply means people can choose to go above 60, just like you can still choose to remain at 60 regardless.

As pointed out, there is literally zero excuse for developers to be tying the game physics to the framerate nowadays, as that was considered bad practice back in the 1980s. Devs today that tie physics to framerate are making mistakes that were solvable over 30 fucking years ago.

Not this retarded argument again. The scientific phenomenon is known as the flicker fusion threshold (colloquially, the blur limit), wherein above ~15fps the human eye is incapable of distinguishing the individual images in the display. But that's the whole point of a movie or vidya - you're not trying to view individual animation frames but rather a smooth, collective whole.

There is a noticeable difference between 24, 60 and 144fps and the point isn't to try to view separate frames.
If you honestly can't see a difference in smoothness with increasing frequency then please actually get your eyes checked by a professional doctor instead of Bubba, your local neck-of-the-woods voodoo witch doctor.

Maybe in 15 years time they might even reach 144.

...

lol audio fags have their crystals and vidyacunts have their refresh rates

in 10 years you might have 220.

...

Does playing on 60+ fps actually make any difference?

enjoy your milliseconds of response time, I'll enjoy my nanoseconds of response time.

Diminishing returns and all that, but yeah, it can do. Fighter pilots can identify an enemy craft that they see for 1/255th of a second.
Of course, framerate is dependent on the screen, not just because of the hertz, but because of ghosting and the like, which can blur frames together anyway (which is often why console gamers say they can't tell the difference; the gap in image quality and ghosting between monitors and TVs is considerable).

Enjoy not ever enjoying a Panoramic Experience™ with multiple monitors

Don't worry, until we bludgeon TV manufacturers into supporting adaptive sync over HDMI for consoles, the normalfag herd won't move an inch.


Akshually, the flicker fusion threshold varies by illumination level (and, to a lesser extent, other factors like central/peripheral vision and fatigue), from about 5FPS for a single photon, through ~15FPS at a typical video image's brightness (100cd/m^2), up higher as the pain threshold for brightness is approached.

Also, note that motion perception (framerate) and flicker fusion aren't entirely the same thing. Flicker fusion can be attained just by literally flickering the same image often enough, whereas the illusion of smooth motion is perceived by swapping in new images often enough, and the effectiveness of the two can interact. For instance, theatrical films only contain 24FPS of material, but (by means of a spinning shutter or flickering lamp) are flickered to black at a rate of 96Hz.


Not to mention that even if he has a 240Hz LCD (especially if it's IPS), most gray-gray transitions will ghost way slower than that due to LCD's pathetic response times.


Don't even joke about that, that's laptop-tier.

Oops, wrong pic

Do you have the source for this? I googled it and it looks like it's from USAF, but I'd like to read that exact article from them

I can't remember the original source (pretty sure it was one of those "strange facts" books I'm so fond of) but I did find this old 1967 paper from the US Navy that touches on the visual response time of fighter pilots.
dtic.mil/dtic/tr/fulltext/u2/a178485.pdf

At least we're moving the goalposts.
Before it was 25 FPS and I hope it'll soon be 144.

Yes we all fucking know we can see more than xxFPS, we're not retarded faggots that play at 30.

Haven't heard of that one before. Gonna need a bit of an explanation on that.

holy fuck you guys are cancer.

60fps? not enough? fuck you guys.
seriously
fuck
you

how about they work on arts and storyline and stop trying to please fps faggots

fuck
you
guys

...

dude. shut up.

if we make 60fps an unacceptably low framerate, then it's even less likely that we'll ever see 30fps again

but your eyes can't see more than 60 fps

...

…among autistic enthusiasts and other people seen as having impossibly high standards, changing nothing in the overall scheme of things and in the gaming market, just like mandatory-60 FPS autists and 30 FPS

...

Coding games to run at set framerates is fucking stupid in the first place. It's like purposefully limiting the resolution the renderer can scale to.

As someone who knows little about programming, why would you do that in the first place?

In this day and age it could very well be just to upset people trying to manually raise the FPS up to 60+, I remember there's still an occasional spot of butthurt because Dark Souls freaks out if forced to run at 60FPS and some physics-dependent objects like ladders don't work correctly.

...

Fags per Suck

You gotta admit Soul Reaver is unplayable on the PS1 because of the framerate.

...

...

Daily reminder the PC master race spends their time waiting for bad console ports and poorly emulated games.

Do you guys even play games or just run them and stare at benchmarks?

If anyone's allowed to cap the FPS, it should be the player.

...

if you are designing a game with a low frame rate then the gameplay better not require quick reflex reactions and smooth animation

And I can understand if someone locks their game below a certain frame rate because they were going for a "Johnny Bravo" style of animation where lots of frames have motion blur…
Thing is, I've yet to see a game that executes this well

Real (spectral) colors are seen either when spectrally pure light of a certain wavelength stimulates one or more pigments of cone cell, or when multiple rays of light in different colors stimulate them similarly, causing the brain to interpolate either way. For instance, a pure cyan ray of light will be absorbed by both blue and green cone cells causing the brain to perceive cyan, but overlapping rays of pure blue and pure green light will produce an identical response, and the brain will react as though it is detecting pure cyan light.

Purple (note that the imaginary color "purple" is not the same as the spectral color "violet", and the two are visually distinct to humans) is different, because while on the one hand it is perceived in the brain when blue and red cone cells are stimulated but the green cells are not (by overlapping rays of pure blue and pure red light), on the other hand there is no single pure wavelength of light that will be seen as purple.

This allows the brain to "wrap around" the linear spectrum in such a way that red and blue appear to blend just like any other color, in spite of being on opposite ends of the real spectrum. Incidentally, something similar is responsible for gray/white, which also doesn't correspond to any one wavelength of light, but is caused by multiple rays of light that stimulate all three cone pigments approximately equally:
en.wikipedia.org/wiki/Spectral_color
en.wikipedia.org/wiki/Impossible_color#Imaginary_colors
en.wikipedia.org/wiki/Line_of_purples


Spending your entire career as a programmer targeting consoles, arcades, and embedded microcontrollers. That, of course, and being too masochistic to make your life easier when it comes time to port something to a newer platform. Being a nip (or weeb), in other words.

How would you translate from one internal fps to another? There's no reason to render more than that.

...

How is that possible?

These contrarian kids keep further shitting up this place, I see.

Simply don't tie your update timestep to FPS, and have rendering interpolate as needed based on the framerate. If you use a fixed 30/sec timestep and are rendering at 144 FPS, each rendered frame advances the interpolation factor by roughly 30/144 ≈ 0.208 of a simulation step.

It's the same shit you do for representing multiplayer on a client. A server isn't telling you client positions at a rate of 60/sec. Instead it will have a fixed lower rate and clients will interpolate positions to make it appear smooth.
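
(A toy version of that client-side snapshot interpolation, assuming a 30/sec server tick and made-up names; real netcode adds jitter buffers, extrapolation on packet loss, and so on.)

    // The client draws a point slightly in the past, blending the two server
    // snapshots that bracket it, so remote players move smoothly at any fps.
    #include <deque>

    struct Snapshot { double time; double x; };      // server tick time + position

    double sample(const std::deque<Snapshot>& snaps, double render_time) {
        for (std::size_t i = 1; i < snaps.size(); ++i) {
            if (snaps[i].time >= render_time) {
                const Snapshot& a = snaps[i - 1];
                const Snapshot& b = snaps[i];
                double alpha = (render_time - a.time) / (b.time - a.time);
                return a.x * (1.0 - alpha) + b.x * alpha;   // lerp between ticks
            }
        }
        return snaps.empty() ? 0.0 : snaps.back().x; // nothing newer yet: hold
    }

    int main() {
        const double tick = 1.0 / 30.0;              // fixed server update rate
        std::deque<Snapshot> snaps = {{0.0, 0.0}, {tick, 1.0}, {2 * tick, 2.0}};
        double now = 2 * tick;
        double x = sample(snaps, now - 2 * tick);    // render ~2 ticks behind
        (void)x;
    }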

How many FPS is real life, anyway? I don't mean eyes, I mean the things we're looking at.
It's probably less than infinity, so it should be achievable by computers.

Can't you just not lock them at all? I mean really.

That sounds good, but it would take some figuring out of the motions to interpolate. Acceleration would be tricky. Movement would need to be tracked from one frame to the next. Animations would need some interpolation mechanism. It's a lot of work for purely aesthetic reasons.

Do you know of any books?

It comes down to this: if you autists had your way, every game would cost a billion dollars to produce, meaning no artistic creator would ever make a game again. You guys want people to pander to this cancer, and you're the reason the gaming market sucks.

Wizards.

Honestly it depends how you've set things up and what type of game it is. For most 2D games it's very easy to implement. (When I last did a large 2D multiplayer game it was literally a few lines of handling velocity * frame delta time; see the sketch below.) Of course, if you're doing your own 3D engine and handling keyframe-based animated rigs, there's more work that needs to go into it. Essentially though the premise is the same: the renderer shouldn't care about anything except what it needs to draw. You actually don't need to track movement frame to frame since you're going to interpolate it every frame drawn anyway.

I'm hazy on any real reading behind it but I'm sure if you google fixed timestep there are plenty of articles that go into it.
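
(One way to read the "velocity * frame delta time" trick from the post above, with made-up names and numbers: the renderer projects the last simulated position forward by however long it has been since that update. Strictly speaking that's extrapolating from the newest state; you can equally lerp between the last two updates as in the loop sketch earlier in the thread.)

    // Draw-time position = last simulated position + velocity * time since update.
    struct Sprite { float x = 0, y = 0, vx = 60, vy = 0; };   // px and px/sec

    void draw_position(const Sprite& s, float since_update, float& out_x, float& out_y) {
        out_x = s.x + s.vx * since_update;
        out_y = s.y + s.vy * since_update;
    }

    int main() {
        Sprite s;
        float dx, dy;
        draw_position(s, 0.008f, dx, dy);    // e.g. 8 ms after the last 30 Hz step
        (void)dx; (void)dy;
    }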

It is effectively infinite. I mean, you're limited by the speed of light, but we're talking scales that make notions such as FPS utterly irrelevant.

As a sidenote, fighter pilots have been able to identify planes as friend or foe from images flashed for one frame at 500 FPS.

I watched that movie with my parents
It was a mistake.

I couldn't see shit because each pixel was the size of my hand, but it was awesome.

Overclock that shit, fatty.

Fucking fighter pilot here, these statistics are retarded bullshit. Any person can do that kind of stuff, if a highschool dropout who had all of his blood flow to his feet can do it so can you.

So you're just proving my point even further.

Kill yourself, tbh, retard.

That wasn't why he made that post, he was just telling you that you're an idiot for believing it takes a fighter pilot to be able to identify anything flashed at any speed.

As long as light hits your face, information is received even if extremely briefly, and the result depends on how much you were paying attention.

Thanks. It's getting clearer the more I think about it.

I'd store the animation, position, velocity, acceleration with the object. The interpolation process transforms the object data. The transformed data is then rendered.

It doesn't seem too difficult.

I wonder how input and networking are handled. I guess there is an upper bound to client input, but it's probably too massive to care about, so I guess it would go into effect on the frame it arrived on. But I still know nothing about networking.

It's something to think about.

I never said that it required a fighter pilot. The test that I'm aware of was done on fighter pilots, and I'm not aware of one that was done on the general population.

60 always looks good unlike 30, you retarded snobbish faggots.

This is pathetic.

Even if we get to 10000000000000000 fps there will be nothing artistic about it nor a reason to cap it.

It's not like you could afford a g-sync monitor anyway.

>>>/reddit/
>>>/cuckchan/
>>>/neogaf/

Then inhale a shotgun you cancerous faggot.

>>>/suicide/ for your shitty bait.

For you maybe. I can switch between 60 to 30 to 25 just fine, but only if the game was made to run at that framerate. I always go for 60 if it can be helped. In all honesty, I prefer this push for 60+ frames rather than 4k shit. Having more frames per second doesn't bloat games to 60GB+, and my eyes are fucked so I can't really see 4k.

Why do you need 60 fps in single player non-first person shooter?

People who are annoyed by pseudoscience and kike-defending.

This is what you sound like you bunch of retards. Higher framerate is always better.

I hope you enjoy being unable to handle what would've been otherwise avoidable close calls. A lower framerate reduces the margin of error for precise timing in games where that is necessary, like the entire fighting game genre.

kill yourself

Huuuurrr durr muh consoleshit, 60 fps master race

Duuuurrr so outdated lol consoleshit 100 gajillion fps master race

Framerate autism is the worst autism. You should all promptly kill yourselves.

yeah but 30 to 60 is noticeable faggot anything beyond that, not really

They'll never be satisfied.

Ever.

They will be satisfied, when games do not have framerate locks. This is not hard, user.

(checked)
Real life's "frame time" is 5.39 × 10^−44 seconds, the "Planck time", which works out to a "framerate" of roughly 1.9 × 10^43 FPS. Due to the fact that space and time are actually the same thing, this can also be represented as, and corresponds to, real life's "resolution" of 1.616229(38)×10^−35 meters, the "Planck length".

Sadly, a computer capable of fully simulating the universe would be roughly the same size as the universe. In fact, under information theory, thermodynamic entropy of the universe is directly comparable to a computer's data storage and processing operations, a computer's operations are described in terms of their creating and moving entropy, and the universe itself can be described identically to a computer.


THIS. FPS locks were unanimously lambasted as totally unnecessary even back in the 16-bit era of turbo buttons for CPU underclocking; game programming has fewer excuses than ever to be this retarded. Give us unlocked FPS support, give us unlocked resolution support.

Here's the problem. You can only achieve this in two ways:
Interpolation. Take two game states, and you can generate arbitrary intermediate renderings in-between them. But in order to have two game states to interpolate between, you need to introduce a delay between the game simulation and the rendering. As previously stated, this works fine in multiplayer games because you already have delays and lag compensation, but in a single player game, this would introduce perceptible input lag.
Or extrapolation. Take the latest game state, and extrapolate where it will be in the future given current velocity/acceleration etc. This requires no delays, but introduces serious visual bugs, like objects clipping into each other, animation not activating properly, and in general lots of jittery mess on the extrapolated frames. The more work you do to alleviate these problems (for instance, performing collision detection on extrapolated positions to keep objects from interpenetrating), the closer you get to practically simulating the game at a higher framerate anyway.
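
(Both options boiled down to toy one-liners, just to make the tradeoff concrete; State and the numbers here are invented.)

    struct State { double x, v; };

    // Interpolation: blend two known sim frames. Smooth, but the view runs
    // roughly one simulation step behind the newest state.
    double interpolate(State prev, State curr, double alpha) {
        return prev.x * (1.0 - alpha) + curr.x * alpha;
    }

    // Extrapolation: project the newest frame forward from its velocity.
    // No added lag, but the guess can overshoot, clip, or jitter when wrong.
    double extrapolate(State curr, double ahead_seconds) {
        return curr.x + curr.v * ahead_seconds;
    }

    int main() {
        State a{0.0, 10.0}, b{0.5, 10.0};
        double shown_i = interpolate(a, b, 0.5);   // halfway between sim frames
        double shown_e = extrapolate(b, 0.008);    // 8 ms past the newest frame
        (void)shown_i; (void)shown_e;
    }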

Let's be honest here. The game is so poorly optimized that even a 1080 TI wouldn't be able to maintain 120 without turning everything to minimum.

...

what game is it?

Bad console ports are still superior to the original version 9/10 times.

About that.

m.digitaljournal.com/article/326976

It's extremely rare, but it is possible for humans to be fully functional tetrachromats.

No one's going to notice one frame when we're talking about 60/144+ FPS. You're also forgetting that input shouldn't be dictated by framerate under this system; it should be part of the fixed timestep update, in which case you've already decided to handle input at a set rate. Whether that's 24, 30, or 60 updates a second, unless you're drawing at or below that rate you're going to have frames drawn that aren't registering the input yet. If that's perceivable, your individual timesteps are probably way too infrequent.

What movie? I thought the gif was from an old HHH promo from WWF

There's also the fact that CPUs run at 2-4GHz, meaning the programmer has tens of millions of instruction cycles per core in which to create each frame (a 3GHz core gets about 50 million cycles per 60Hz frame, before counting additional cores). In-engine lag is the result of code architecture that is faulty on a fundamental level for realtime applications such as gaming.

Your eyes stop seeing flickering starting at 100 Hz or thereabouts in the periphery; your central vision may stop being sensitive to flicker at as low as 50 Hz. So technically you don't need more than 200 FPS: if you were to alternate a solid black and a solid white screen on a 200 Hz monitor, you would see a solid gray color.

However, there's persistence of vision and the continuous nature of photosensitive cell activation. You may not see objects flicker faster than 100 Hz due to persistence of vision, but you will see them snap from one spot to another because their "motion" is discontinuous. There's a method to work around it, though. You need very fast and accurate eye tracking. You track the eye's motion and apply fullscreen selective motion blur, so that the objects you track with your eyes are clear and everything else has motion blur on it. This way you simulate the continuous imaging of the eye, that is, the blur effect when objects move across the retina. If you do that, you don't need any more than a 200 FPS screen. If you don't, no amount of FPS will do the business; there will always be some object moving across the screen fast enough that you notice the discontinuous nature of that movement. You will of course need a low persistence display, one that flashes each frame for a minuscule fraction of a second and then goes pitch black for 99% of the time. That's because if the display is continuously lit, the image stays for the whole duration of the frame, and if you track any "moving" object on the display, its image will smear across your retina, creating false motion blur where there should be none.

You can do these fun vision experiments at home: 1) track your computer's mouse cursor with your eyes and move it around; notice how it gets blurred. That's because you use a full persistence display, and the cursor's image gets smeared across your retina since your eye is moving but the image is static on the screen. 2) acquire/create a strobe light that flashes 60 times per second, but make sure it stays pitch black for most of the time in between flashes. Then go into a pitch black room and use the strobe light for illumination. You'll notice that every moving object looks like it's in 60 fps rather than moving continuously.

The Planck length is theorized to be the smallest meaningful distance between any two particles; any closer and you wouldn't be able to distinguish between the two. Planck time is how much time it takes to travel a Planck length at the speed of light. No proof exists that space (let alone time) is quantized and that particles can only occupy coordinates that are multiples of the Planck length. In general both of those are highly arbitrary and don't particularly have anything to do with physics. Planck's constant describes the relation between a photon's energy and its frequency. Because, you know, a single photon carries a very specific amount of energy, depending on the frequency - not more and not less.

You're the retard that needs to kill yourself for getting this asshurt.

It might be, then; I don't remember the movie very well, but the actor is the same and the furniture behind him is fitting. It was a movie about two early 20th century mafiosos who were brothers, and basically one of them is a faggot and you're constantly reminded of it, then at some point the movie goes from a movie about mafia crime to a bunch of gay men having orgies.

I played this shit as a kid. Fucking chinks man. I was so confused but at least it was fun.

Just based on that description you gave it seems kind of obvious that space would be quantized. I would imagine that it doesn't actually matter to physics very much because it's many orders of magnitude smaller than even what particle physics typically deal with, but "any smaller distance basically doesn't matter" sounds exactly like quantization to me.

Let's assume for one second that he's right and it is best at multiples of 30fps; what does he think would be wrong with 90 or 120fps?


There is no reason to ever cap it but testing by the BBC found that most people can't see any difference between 400 and 500fps and no one could see a difference between 500 and 600.

This explanation also known as


It's almost as retarded as older games that were tied to the CPU clock speed. When you bought a faster CPU, the game ran fast as shit because the programmer assumed the CPU clock would always be the same.

I'll let you in on a secret I only learnt recently myself: the "turbo" button actually limited the clockspeed to 16MHz so old games didn't run fast.
That's right, the turbo button actually made your PC run slower.

Time for sleep

Should I put red capital bold emphasis on "theorized" and "no proof exists" for you? How do you even suppose that shit would be quantized? It's "aether"-tier bull crap: there has to exist a central frame of reference for it to work, the same way as if the speed of light were tied to some central frame of reference. For each and every individual frame of reference, space is stretched in a different way, so they all have their own different "grid", while that shit is supposed to be simultaneously equal between each and every one of them. Oh, and there's the fact that space itself gets expanded all the time in all directions, so for all practical purposes each particle is its own geometrical center of the universe. How, with all of that considered, quantized space is supposed to hold up is anyone's guess.

Oh, and it is of course a big deal physics-wise; such objects as neutron stars exist, where matter is compressed into degenerate quark soup, well within the scale range where the supposed quantization would play a big role.

Who? Probably a nobody.

saving it in the hopes of triggering you again

Reminder that "particles" don't exist under quantum mechanics, merely remaining in use by physicists as an abstract "toy model" (much like the "miniature solar system" description of atomic physics, or Newtonian physics itself for macroscopic applications). What we commonly refer to as particles are more accurately described as waves of excitation in the underlying probability field.

What people don't fucking get when 30 gets shat on for 60 is that there's a major difference and that's how smooth and fluid everything looks.

I can say from experience that 60fps outright does not work for movies and that's wonderful because it further forces the divide between games and tryhard cinematic shit.

Nigger, there is an absolute fuckloads of interpretations of quantum equations, from pilot wave that guides a particle through predetermined path that exactly coincides with its wave function, to multi-universe theory where every single solution of every single wave function in the universe is its own parallel universe and we're just happen to be in one of those, with by far the most popular interpretation being "shut the fuck up and solve equation" interpretation, i.e. it doesn't fucking matter what you think of it, it's all the same either way.

You've got it backward. What you're talking about are different theories of quantum mechanics, what I was talking about is quantum field theory, which all of quantum mechanics are a subset of.


You're aware that silent films almost all targeted the exact minimum necessary to perceive motion, 12-16FPS, and only adopted 24FPS (with the advent of "talkies") because it was the exact slowest speed that would produce acceptable audio from optical soundtracks, right?

I've seen 48FPS IMAX HD, and it looks fantastic. If the movie industry weren't such cheapskates, we would've all been standardized on that back in the 1970s, and continued progressing from there for 40 YEARS. Instead, the 1970s was when theatrical films stopped being shot on 65mm (even though theaters are still nearly all outfitted with 70mm projectors) in favor of 35mm, followed later by the massive resolution-dump that was "HD" video (not quite equivalent to 16mm) in place of film. Only thanks to RED have we been slowly crawling back up through equivalents of 35mm, 65mm, 15-perf IMAX, and perhaps eventually to where we REALLY ought to be by now.

Every film shot in 24FPS, every TV show shot in 60FPS, is a theft from future audiences, who will never be able to see that material as it should have been, all because the director wanted to scrimp on film stock that consumes a microscopic fraction of their production budget.