We have achieved graphics cards capable of 4K 60 fps on ultra in most games, so what's next?

What will be the next objective of Nvidia and AMD in the coming years with Volta/whatever? 8K?

Also, what is the maximum resolution that we can achieve? At some point we are going to hit a technology wall or something, mirror resolution maybe?

Other urls found in this thread:

chrissawyergames.com/faq3.htm
masm32.com/

The overwhelming majority of games, yes. Not the ones that came out in the past two years on max settings, though.

I would say the "max" resolution would be the same level as what we see with our eyes. After that, what would be the point, since nobody would notice the difference?
But even then there could still be work done on making it cheaper and more compact.

4K is a meme, a waste of time and money.

You're retarded, it varies with distance, and the ratio of resolution to screen size is more important.

And the vast majority still use 1080p screens or lower, so these cards have overreaching power for their needs.

A computer is a unit, you shouldn't look at the parts separately.
How so? Imagine you have the best PC in the world, but with shit 512MB RAM from 2003.
It will run things badly; the RAM will dictate where the PC's ceiling is.
Same thing: the best PC in the world with a shitty SATA 1 HDD.
It will run things badly; the HDD will dictate where the PC's ceiling is.

So you see, if any component's development stalls, the rest of the computer will be handicapped to work at its lowest common denominator.
RAM today is overwhelmingly ahead of everything, there will be no problems here.
Storage is in a good spot, especially SSDs.
Motherboards are fine as usual.
GPUs are also overwhelming for now.
But CPUs? They will hit a wall hard, and no matter how far ahead the other components are, the CPU will hold the whole PC down to its level.

And then things will stall. But software will keep increasing its demands, so the hardware that is far ahead now will stop being overwhelming, then it will start lacking, then software will either hyper-optimize or things will stop for good.

People really don't know how serious the CPU question is. It's pushing the limits of physics.
A good alternative must come or a technology crisis will.

hopefully the next objective is to achieve graphics cards capable of 4k 60fps in ultra for under $500. or 1080p 60fps in ultra would also be fine.

Bullshit. Most programmers barely even scratch the surface of a modern CPU as they never learned how or are too diverse to understand it. I welcome the day when shitting out javascript hits a wall that takes skill and understanding of parallelism to overcome as it's the day tech becomes great again.

Maybe consolefags will get teevees > 60fps so we can stop pretending that 60fps is acceptable in 3 CYE.

I like the CYE notation, where does it come from?

Do people actually use their GPUs for graphics these days?

A computer doesn't run a single program; many shit devs making shit programs together can turn a good PC into trash.
And I said above:

And yes, the limits of physics are being reached, they need a solution now.

Industry analysts are saying 8K will be the END of the line for resolution increases for general consumers. At 8K you will need a substantially large screen, as in larger than you would be able to fit in your house by any practical means, before you start seeing pixels at a reasonable viewing distance. After 8K, companies will have to start using other shitty buzzwords to market displays, which we are already seeing with "HDR".

In addition, it's general knowledge that silicon's limits will be reached within the next decade, as semiconductor nodes past 5nm may prove impractical.

Right now GPU makers are focusing on improving CPU utilization in games through APIs like Vulkan and DirectX 12. This in theory means significantly more optimized GPU usage, giving developers higher budgets for more a'dem graphigs 'n sheit. After the new APIs reach significant market penetration, it'll really all just be about improving graphical pipelines.
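
For what it's worth, here's a toy C++ sketch of the main CPU-side win those APIs offer: draw commands recorded on several threads into independent command lists, then submitted together. The DrawCommand/CommandList types are made-up stand-ins, not real Vulkan or DirectX 12 objects.

#include <cstdio>
#include <string>
#include <thread>
#include <vector>

// Stand-in types; a real renderer would record into API command buffers instead.
struct DrawCommand { std::string mesh; };
using CommandList = std::vector<DrawCommand>;

int main() {
    const int threadCount = 4;
    std::vector<CommandList> lists(threadCount);
    std::vector<std::thread> workers;

    for (int t = 0; t < threadCount; ++t) {
        workers.emplace_back([t, &lists] {
            // Each thread records its share of the scene without touching a shared
            // lock, which is what keeps one core from becoming the bottleneck.
            for (int i = 0; i < 1000; ++i)
                lists[t].push_back({"mesh_" + std::to_string(t * 1000 + i)});
        });
    }
    for (auto& w : workers) w.join();

    size_t total = 0;
    for (const auto& l : lists) total += l.size();
    std::printf("recorded %zu draw commands across %d threads\n", total, threadCount);
    return 0;
}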

It ends with virtual reality and foveated rendering.
Only the area you are looking at needs to be rendered in high res; your peripheral vision can be rendered at lower res.
There's no reason to go beyond this.
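
A minimal sketch of how the per-region resolution pick could look, assuming the headset reports a gaze direction; the 5/20 degree thresholds and the scale factors are made up for illustration, not taken from any real SDK.

#include <algorithm>
#include <cmath>
#include <cstdio>

// Pick a render-resolution scale for a screen tile from its angular distance
// (in degrees) to the tracked gaze point. Thresholds are illustrative only.
float foveatedResolutionScale(float eccentricityDeg) {
    if (eccentricityDeg < 5.0f)  return 1.0f;   // fovea: full resolution
    if (eccentricityDeg < 20.0f) return 0.5f;   // near periphery: half resolution
    return 0.25f;                               // far periphery: quarter resolution
}

// Angle between the gaze direction and a tile's view direction, both unit vectors.
float eccentricityDeg(const float gaze[3], const float tileDir[3]) {
    float dot = gaze[0]*tileDir[0] + gaze[1]*tileDir[1] + gaze[2]*tileDir[2];
    dot = std::clamp(dot, -1.0f, 1.0f);
    return std::acos(dot) * 180.0f / 3.14159265f;
}

int main() {
    const float gaze[3]    = {0.0f, 0.0f, 1.0f};      // looking straight ahead
    const float tileDir[3] = {0.259f, 0.0f, 0.966f};  // tile roughly 15 degrees off-axis
    float ecc = eccentricityDeg(gaze, tileDir);
    std::printf("tile at %.1f deg eccentricity -> render scale %.2f\n",
                ecc, foveatedResolutionScale(ecc));
    return 0;
}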

>There's no reason to go beyond this.
Don't we already have game engines and other APIs that allow developers to do this?

VR is well developed already, many gaming companies have small departments working with it, big tech companies such as Google and Microsoft as well.

But you know it will never catch on? 100% of the people who use it get severe headaches, and it's impractical to use for anything that isn't short sessions.

I used it once and felt nauseated.

lamow

You need a minimum FPS of like 90 for VR to not make you sick, because at that framerate, even with the worst frame-pacing, you still get adequate latency.
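
Quick numbers behind that claim: the frame period at each refresh rate, plus a worst case where a frame gets dropped. Nothing here comes from a headset spec; the ~20 ms motion-to-photon comfort figure is just a commonly cited rule of thumb.

#include <cstdio>

int main() {
    const double rates[] = {60.0, 90.0, 120.0};
    for (double hz : rates) {
        double frameMs = 1000.0 / hz;  // time between refreshes
        std::printf("%.0f Hz: %.1f ms per frame, %.1f ms if one frame is dropped\n",
                    hz, frameMs, 2.0 * frameMs);
    }
    // 90 Hz gives 11.1 ms per frame, so even a dropped frame lands near ~22 ms,
    // while 60 Hz already sits at 16.7 ms before anything goes wrong.
    return 0;
}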

I used VR at much lower FPS than that though and I'm fucking fine, and I'm pretty sure the few people complaining about headaches and shit are the same vocal minority that gets boat-sick or car-sick easily if they sit in the back seat. It's still something that needs to be addressed if VR ever really wants to "take off", though.

The issue isn't whether or not a person gets sick. It is how long can they last. I tried to see what my limit was. Three hours. At the end I was disoriented, my balance was off, and I vomited. I had one of the worst headaches I had ever had and could barely stand for 15 minutes. The experience was bad enough that I lost all interest in VR. I do not want to feel like that again.

Jesus user, 3 hours of even staring at a screen without VR would probably make even me sick. That's a bit excessive I think

There will be far more things to display on an 8k display than what is done today. Watching a cityscape is amazing. You can see the cars and people on the streets clearly. A higher resolution means higher detail at further distances. Games today are fairly basic compared to what they could be in the future.

Everything 8k will be diminishing returns due to pixel density limits. And this is less of an engineering limit and more of a biological limit, namely the human eye. It would require a substantially large screen before even people with better than 20/20 vision could see pixels. Thus 8K is seen as a "virtual" limit to screen resolution, unless it's a massive IMAX cinema screen, but we're talking purely consumer-level here.

*Everything beyond 8k...
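
Rough arithmetic behind that, assuming 20/20 vision resolves about one arcminute; the viewing distances are just example values for a living room.

#include <cmath>
#include <cstdio>

int main() {
    const double pi = 3.14159265358979;
    const double horizontalPixels = 7680.0;                 // 8K UHD width
    const double arcminuteRad = (1.0 / 60.0) * pi / 180.0;  // ~20/20 acuity limit
    const double distancesM[] = {1.5, 2.5, 4.0};            // example viewing distances
    for (double d : distancesM) {
        double maxPitchM = d * std::tan(arcminuteRad);   // largest pixel pitch you can't resolve
        double minWidthM = horizontalPixels * maxPitchM; // 8K screen width at that pitch
        std::printf("at %.1f m, an 8K screen must be wider than %.1f m before pixels become visible\n",
                    d, minWidthM);
    }
    return 0;
}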

REAL
TIME
RAYTRACING
never

What I'm saying is that there is plenty of room for improvement with GPUs. At least a decade of people constantly upgrading their systems for the next best experience. It is absurd what developers have to do today. To show off details they need to zoom in to a scene. 30 feet, you no longer have rocks, 100 feet, no grass, 200 feet everything is a low poly representation. 400 feet, billboards. The VR Ready label is horse shit too. Games still rely on billboards and flat textured planes. Both look horrible in VR.

I wonder how taxing it would really be on hardware to just start using displacement maps or bump maps over normal maps for things like ground textures

Bump mapping is not as elastic as normal mapping, and I imagine displacement mapping might be expensive due to the fact that the GPU needs to create a mesh out of the texture.
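
A minimal C++ sketch of the contrast, with made-up names and a hypothetical heightScale parameter: normal mapping only swaps the shading normal per pixel, while displacement actually moves vertices, which is why it needs a dense (tessellated) mesh to act on.

#include <cstdio>

struct Vec3 { float x, y, z; };

// Normal mapping: cheap and per-pixel, but it only fakes the lighting. The
// tangent-space normal fetched from the texture replaces the geometric normal
// during shading; silhouettes and parallax stay flat.
Vec3 shadeNormal(const Vec3& tangentSpaceNormalFromTexture) {
    return tangentSpaceNormalFromTexture;  // fed straight into the lighting equation
}

// Displacement mapping: actually moves the vertex along its normal by the
// sampled height, so the mesh must be tessellated densely enough for the
// detail to show up. That extra geometry is where the cost comes from.
Vec3 displaceVertex(const Vec3& pos, const Vec3& normal, float height, float heightScale) {
    return { pos.x + normal.x * height * heightScale,
             pos.y + normal.y * height * heightScale,
             pos.z + normal.z * height * heightScale };
}

int main() {
    Vec3 v = displaceVertex({0, 0, 0}, {0, 1, 0}, 0.5f, 2.0f);  // flat ground vertex pushed up by 1.0
    std::printf("displaced vertex: (%.1f, %.1f, %.1f)\n", v.x, v.y, v.z);
    return 0;
}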

It's a meme I've been forcing. 3 Current Year Era.

4D displays with real-time light field rendering will allow natural stereoscopic imagery and depth of field; it will be equivalent to looking through a literal window rather than the current pseudo-3D displays of today. The only problem is that with a passive display method, the spatial resolution will have to increase enormously, as there will be two more dimensions, azimuth and altitude, corresponding to the angles at which a pixel is viewed.
It probably won't be the next thing but it will be one of the ultimate developments in display technology.
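
A quick illustration of how fast the sample count blows up, assuming an 8K spatial panel and an arbitrary 16x16 grid of angular views; a real light field display would pick its own numbers.

#include <cstdio>

int main() {
    const long long spatialW = 7680, spatialH = 4320;  // 8K spatial resolution
    const long long viewsAz = 16, viewsAlt = 16;       // angular samples (azimuth x altitude) per pixel
    long long samples = spatialW * spatialH * viewsAz * viewsAlt;
    std::printf("%lld light field samples per frame (~%.1f billion), versus %lld for plain 8K\n",
                samples, samples / 1e9, spatialW * spatialH);
    return 0;
}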

Displacement mapping is hard to use as there are all sorts of issues it causes when it's next to another material. Tessellation is much easier to work with.

FUCK OFF
There is still room to grow

The question isn't if someone can see it.
The question is why would you need it?

Why would I only want what I need, commie?

I know what it means, otherwise I wouldn't have said I like it

Well, you can credit hatechan if it spreads. I was going to make a Custom Debian Distribution and have everyone help add CYE to everything that handles time but then I got hired for a second job doing gayme programming and I have no time anymore.

pic related, this is the truth

The whole development of better graphics really bores the shit out of me. I've been out of gaymen for 10 years or so, but the great games from back then weren't great because of their graphics but because of their gameplay. I remember AoE, AoE2, Settlers, GTA Vice City, RollerCoaster Tycoon 1 & 2. Great times. Maybe I'm just too old for this shit, but even if you could have 10000K I would not be tempted to buy it, because I hate associating with normie needs, and also having to have the newest tech is a form of mental illness.

It's lame because these insane video requirements take big studio production to take advantage of. In the old days, a small team or even a single programmer could make a nice game. Wasn't RCT made by a single guy coding in assembly? Anyway that's why I like 8-bit systems - it's easy for any kid to make games, and there are no unrealistic expectations. All that matters is if the game is fun to play.

Yea I heard that rumor too. Hard to believe, why wouldn't you at least use C or some other kind of wrapper. Or there are levels of autism that I can't even imagine.

>chrissawyergames.com/faq3.htm

(heil'd)
It took me way too long to get your pic.

What? Are you a woman?
I can stare at a screen and use a PC for 15-20 hours. But I am using a CRT with a superior refresh rate compared to your shit LCDs.

Enjoy your radiation sickness

60 fps is not really 100% great.
in certain games you need more fps to have acceptable display latenty. for example playing in Quake 3 at 60 fps is an exercise in futility.

ftfy

latency, of course.

For that, the "helmet" needs to track precisely where the eyeballs are looking, in real time. Otherwise the peripheral zone would be misplaced and the player will see some shit.

This is already possible with older and objectively better games (besides graphics), e.g. AA titles or ones more than 3-5 years old.

...

Lots of dyslexic lads in here. Whatever.

1366x768 is perfect, everything is too tiny on anything larger.

OP Here.

My monitor is a 1366x768 one and I can clearly see the improvement on a 1080p monitor. I can't imagine how big the difference would be on a 4K one.

A joint project by NVIDIA and AMD codenamed project Steel Lake Horse to massively accelerate technological progress of the video card industry. Results include:

-Advanced (TM) Realtime Intellectual Property Defense System (R). Scans the framebuffer in real time to make sure you aren't watching a copyrighted video via a non-protected format. It's implemented as software on a coprocessor with full access to video and system RAM. It has a bug or 20 based on stack smashing the said program by writing certain sequences of pixels to the screen. Security is hard.

-Enhanced (R) USB Screen Cast (TM). Cards with this capability allow up to 8 additional monitor outputs over USB. Just plug a USB-HDMI connector into any USB port and you're good to go. Has a vulnerability where plugging in a malicious USB drive results in it having read-write access to the entire framebuffer. Security is hard.

-Advanced (R) Generation 3 GPU (C) Antitheft Technology (TM). As computer graphics cards become more and more advanced, they are slowly becoming more of an investment for high-end users. It is often said that losing the GPU is the biggest concern when it comes to computer theft. With AG3GPUAT technology, you can now pair your phone to your GPU and control it from anywhere. AG3GPUAT-enhanced cards will connect to the internet over the system's NICs as well as through a built-in radio. At any time, you can choose to disable the GPU or display a message through your phone, from anywhere. It is implemented as a software stack on top of a proprietary OS running on a coprocessor. It has a bug or 20 relating to command-line injection and lack of authentication, allowing anyone in the world to connect to the video card and access the full system's memory. Security is hard.

Since it's only on Windows, we can assume he used the Microsoft Macro Assembler, which is actually pretty comfy and has a great macro engine; it's much more advanced than GAS according to the people on its forums, and uses Intel syntax.

masm32.com/

flexible circle-shaped monitors fam, to create an artificial need for new hardware and software
then some years later they'll come up with wide oval-shaped monitors, so you upgrade again
then they'll go 3D with spheres and shit, gonna need big big video cards for processing

it's just your system can't into proper DPI scaling
1366 is nigger tier, even 4 years ago

Light field displays.

making it feel great, eliminating lag and jerkiness for good

the next step is to render 61 fps

...

I like the GTA 3D trilogy, but they run at 30 fps. GTA SA is even at 25 fps. You can uncap it, but then the game glitches out.

I am a poorfag and I sold both my RX 470s to desperate miners after I had them for a year (MSI Gaming G1 8GB). Sold them both at $250 and I didn't even attempt 4K. As of right now, it IS a meme. I'd much rather have a 1440p ultrawide, curved, or HDR monitor and actually use the expanded fidelity. I mostly played KF2 or Dirt Rally at 5760x1080 on an FX-8350 and still held about 65 FPS. In 2 years 4K will be viable is my guess, but us non-poorfags have no reason to leave 1080p or 1440p gaming. As it is now I'm running hybrid PhysX with my now-ancient XFX 4870 1GB and a PNY GTX 260. I'll gladly play 10-year-old games that don't suck, because 90+% of Holla Forums and gaymen in general is TRASH. Hell, the last AAA title I bought was BF3 in 2011, and just about everything newer was gifted to me or bought during a summer sale.

user... you need glasses. Even my T500 laptop has 1080p, and my non-pozzed (suck it, Winblows) T500 has the original 1680x1050 weirdness.
if you still think 1366 is OK
>>>/suicide/

None of those things, despite being able to accomplish them in the future; instead they'll focus all efforts on developing new methods of data-mining users.

THIS. Back in the early 2000s when SIMD and multiprocessing started to seriously take off, I felt certain the days of Gouraud-shaded triangles with bitmapped surfaces and sprites would come to a close. Software renderers of the period, like the ambitious voxel-based game Outcast, or the RTRT-based Nature Suxx techdemo, were capable of amazing things using a single midrange core's integer/float unit, but they turned out to be the last of their breed. All we've gotten since then was the false start of Intel's Larrabee project, which faded into the supercomputer market, and nobody's looked into it since.

There are SO many technologies involving rendering, lighting, geometry, animation, physics, and more, that will never happen as long as Vulkan/DX12 games stick with essentially the same tired old rendering pipelines we've had since 3Dfx.


The new UHDTV 4k/8k broadcast standard supports 120Hz, and there are already shows being shot at 120 FPS. Though HDMI has supported 120Hz since v1.2, and shutterglass 3D TVs have all been theoretically capable of 1080p120. More important IMHO would be dynamic sync over HDMI for consoles and TVs.

This. Yet another among many reasons I'm annoyed by LCDs is that they killed >60FPS gaming from about 2003-2010, and it's now the preserve of a handful of people with special ≥120Hz gaymur monitors, instead of something anybody could at least wring ≥85Hz out of with a random multisync CRT.

144 FPS is important. Post processing and higher resolutions are absolutely not important. Now that graphics are suffering from diminishing returns, spend that extra graphical processing power on more frames not more pixels. Fuck your cinematic meme game experience go watch a movie.

For fuck's sake.

...

Yep, a gaymer alright, go back to Holla Forums.

Impressive.

Turtle time belt?

Utter retard, leave.

displacement mapping is localized sphere tracing (distance field mapping, whatever). Doesn't generate any additional geometry.

Real time ray tracing would be cool, at decent resolutions.

Forward ray tracing (from light source) combined with reverse ray tracing (standard, from the detector/display) would be pretty cool.

running FMEA simulations in real-time alongside gfx.

Distance field rendering is already moving into more of a mainstream role. Let's put 10x the memory in the video card, model the distance function of geometry as a forward propagation in a neural net, then use distance field rendering (sphere tracing, whatever) to render it.
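
A minimal sphere-tracing sketch against a single analytic SDF (a unit sphere at the origin); swapping sceneSDF for a neural-net approximation as suggested above wouldn't change the marching loop. The 128-step cap and the 1e-3 hit threshold are arbitrary.

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float length(const Vec3& v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }
static Vec3  add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  scale(const Vec3& v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Signed distance to the scene; here a unit sphere at the origin. Any SDF works,
// including one approximated by a small neural net.
static float sceneSDF(const Vec3& p) { return length(p) - 1.0f; }

// Sphere tracing: advance the ray by the distance the SDF guarantees is free of
// geometry, until we are close enough to the surface to call it a hit.
static bool sphereTrace(Vec3 origin, Vec3 dir, float maxDist, float* hitT) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxDist; ++i) {
        float d = sceneSDF(add(origin, scale(dir, t)));
        if (d < 1e-3f) { *hitT = t; return true; }  // close enough: surface hit
        t += d;                                      // safe step size
    }
    return false;  // ray left the scene without hitting anything
}

int main() {
    float t = 0.0f;
    Vec3 origin{0.0f, 0.0f, -3.0f}, dir{0.0f, 0.0f, 1.0f};  // ray aimed straight at the sphere
    if (sphereTrace(origin, dir, 100.0f, &t))
        std::printf("hit at t = %.3f (expected ~2.0)\n", t);
    return 0;
}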

Nothing will ever surpass 1080p. It is the highest resolution possible.

normie get out


fukken saved

Consider yourself spared from carrying the terrible weight that comes with knowledge of the Dark Autism


Hello slavfriend

but I can only do 13 or so hours before I get nauseous


Criminally underrated post

Can those run Crysis yet?

...

Tell me about the Volta, why is it being delayed?

cheaper power

MOAR COARS

Widespread use of deep colour on displays, true colour is not enough, I want more colours please.
Sadly, we still have to rely on OLEDs or LCDs and not any other cool technologies like Plasma displays.
How about open source video card hardware?

Yeah I sure miss dissipating one fucking kilowatt for a 32" display, that sure felt cool

Lotta loyalty for a hired pen, huh?

There is literally nothing wrong with OLED compared to plasma. In particular, OLED supports higher pixel density (with the experimental translucent "stacked" variety even eliminating subpixel misregistration), and even faster refresh rates.

That said, current OLEDs suck balls, not because of the screens themselves, but their HDTV driver electronics, which restrict them to 60Hz fixed-sync overprocessed lagginess via a single mud-slow HDMI link.

No, it sucks balls because in ~4 years your screen color is permanently destroyed with possible burn-ins.

I congratulate GPU and display manufacturers for turning resolution increases into a meme.
10 years ago you'd just go out and buy a screen with a resolution your money allows and be done with it, nowadays it's HAY GUISE SHREK OUT MUH 4KAAY CERTIFIED DRAGON DILDO

I've said this before but I'm glad companies don't listen to customers, otherwise we would be stuck with 1280x720 24p screens forever.

I know this is ironic but I know a bunch of retards who genuinely believe it.
I told them to go live in a forest then, their answer was "come on user, it's different".
I really hate niggercattle.

10 years ago was several years after the console apocalypse. We used to get resolution increases all the time on PC and then we got stuck on 1080p forever due to consoles.

...

I'm not one to lose my patience easily and even I suggest you kill yourself

Lifespan has been improving rapidly, as the latest OLEDs can do 100k hours to 1/2 brightness, something plasma couldn't do until 2006.

A phone spends most of its time with the screen off. If you use your samsung phone to display info while docked as some people did at first with the Droid Charge, you'll wreck the screen quickly. Faggot.

It burns in super fast when used as a PC monitor.

I have a Samsung Galaxy S3; the screen is AMOLED.
This 5-year-old phone, which I never used for anything other than making and receiving phone calls (which rarely happens) for 4 years and reading books for the last year, has visible wear where the notification bar sits and on the corners of the screen.

Too bad for you that I had an S4 before it died and the screen was literally turning piss yellow as it aged

Do you even know what you're arguing about anymore?

The burn-in is not worth it. Both my Galaxy S4 and my Galaxy J7 2016 have permanent burn-in and both had relatively light use. I got an LG G5 as my next phone because LG seems like the only phone maker in this day and age that makes high-res smartphone screens that haven't fallen for the OLED meme yet.

OLEDs are planned obsolescence. Plain and simple. When an LCD backlight dies you can still use the LCD, when an OLED panel is even partially dead it pretty much becomes unusable.

Do you expect anybody here to keep live display or other always-on "features" active? I'm going to take offense to that.