4k gaymen mustard rig. Invest now or wait?

So I've come to the conclusion that 4K gaming won't be satisfactory until monitors or TVs have actual refresh rates of 120Hz or above. They advertise ridiculous shit like 1000Hz, but that's all post-processing after the fact to make the image appear smoother, not the actual refresh rate. I considered going for 1440p 144Hz, but I keep hearing how horribly 1080p content upscales to it and what a mess that is to deal with.

Since my mustard rig broke down a year or two ago, I've been console gaymen, all the while saving up for a new one, because the old box was toaster tier at that point and upgrading wouldn't have been worth it since all the components were ancient.

Needless to say, I want to make my investment count. I've considered purchasing a 1080p 120Hz display to entertain me in the meantime while waiting. The question is whether it's the right time to build a PC now, or to wait for these 4K displays to catch up, by which point the components to drive them will be cheaper too.

Other urls found in this thread:

en.wikipedia.org/wiki/Mach_bands
thoreau.library.ucsb.edu/thoreau_faq.html
rtings.com/tv/reviews/by-usage/video-gaming/best
dit.ie/media/physics/documents/yb_keane.pdf
displaymate.com/ShootOut_Part_1.htm
bhphotovideo.com/c/product/999007-REG/sony_pvmx300pac1_pvm_x300_30_4k_monitor.html
bpm-media.de/en/Post-Production/Video-Monitors/27-47-Monitors/Sony-PVM-X300::357299.html
bpm-media.de/en/Post-Production/Video-Monitors/27-47-Monitors/Sony-BVM-X300::368862.html
prad.de/en/monitore/review/2013/review-dell-u2713h-part8.html#Viewing
anandtech.com/show/2803/5
en.wikipedia.org/wiki/DisplayPort#Specifications
prad.de/en/monitore/specials/inputlag/inputlag-part18.html
tftcentral.co.uk/reviews/content/dell_u2717d.htm#viewing
necdisplay.com/p/desktop-monitors/ea275uhd-bk
cdw.com/shop/products/NEC-MultiSync-EA275UHD-BK-LED-monitor-27in/3762183.aspx
ebay.com/itm/Pixio-PX277-27-inch-2560x1440-144Hz-AMD-FreeSync-WQHD-Gaming-PC-Monitor-/262508966762?hash=item3d1ec0f36a:g:Q4QAAOSwuzRXesKp
ebay.com/itm/EVGA-NVIDIA-GeForce-GTX-980-Ti-06G-P4-4996-KR-6GB-6GB-max-GDDR5-SDRAM-/112056636807?hash=item1a17189587:g:hNcAAOSwP0RXh-mp

Hey everyone the OP wants
THE MUSTARD OF FOUR THOUSAND GAY MEN

OP here.

That's one way to put it.

Wait. There are 4k 144Hz monitors coming in a year, and by 2018 Nvidia will be releasing their Volta architecture, which should be capable of driving 4K at 144Hz with an SLI setup.

OP LISTEN TO ME PLEASE FOR THE LOVE OF GOD

do NOT buy any 4k t.v.'s right now because the technology is WAYYYYY beyond any incompetent foolish company's ability to actually implement it on a screen in such a way that it will look nice. I swear to you it looks like you are watching a plastic claymation movie no matter what you try to watch on a 4k t.v. because it looks SO fake that it looks so real that it looks even faker.

basically: 4k t.v.'s look like SHIT and they are very expensive.

1080p LED t.v.'s however have gone way way down in price lately.

assuming you buy a good brand you can trust. my friend got a really good vizio long ago and it seemed clearer than a 1080p tv even though it was only a 720p

just get a good 1080p t.v. and you will save yourself so much suffering. seriously, FUCK 4k. they are the EA of t.v.'s

bump so others can see

4k res is a meme until the price comes way down for standardization, and that won't happen for many years.

1080p with all the special settings turned up looks beautiful, and to the average person 4k doesn't show a noticeable difference in picture quality over 1080p unless you're using a big ass screen.
And at 24 inches or below I'm not sure it would be noticeable to anyone, so 1080p will probably be a standard for a very long time.

...

let the normies keep buying the 4k tv's till a screen can be bought for $150. So I'd say wait till 2018 to start caring.

I just want a freesync 1080p monitor at 144hz

is that so much to ask for, really?

who fucking cares

For 99% of purposes you won't be able to meaningfully tell the difference between 720p and your precious 4k meme. There is literally no justification for shelling out thousands of dollars on negligible improvements in resolution.

16:9 is cancer at any resolution

You might want to get your eyes checked buddy.

People who like nice things.

But I'd wait on 4k. A GTX 1080 is barely hitting 60fps at 4k with current games and SLI is.. SLI. 1440p's time to shine is now, 4k probably in two years. I'll be upgrading shortly to that configuration once inventory shortages clear up.

This has been my gut feeling pretty much.

Looks like 1440p 144hz is the way to go for now then.

Considering how long it will take until 4K displays can output at a decent refresh rate, and until a single card can drive them decently, we'll let the gadget hipsters drive down the price first.

After suffering through a few years of console gaymen, I refuse to spend money on a rig/setup that will not output a decent frame/refresh rate. My eyes are fucking bleeding at this point.

TN 1080p 1440hz monitor for gaming, 1440p IPS for work and looking at pretty things.

Wait for AMD's Zen and the flagship Polaris GPU.

I'm curious when we will get 4K at high refresh rates with 10-bit color depth.

No one seems to talk about color depth, but I can really tell the difference between 8bit and 10bit when looking at monochromatic images.
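To put rough numbers on why that banding is visible, here's a back-of-the-envelope sketch; the 3840-pixel width is an assumed example, not anyone's actual monitor:

# How wide each quantization band gets in a full-width black-to-white gradient.
WIDTH_PX = 3840  # assumed 4K-wide screen, purely for illustration

for bits in (8, 10):
    levels = 2 ** bits              # distinct gray values per channel
    band_px = WIDTH_PX / levels     # pixels per visible step in the gradient
    print(f"{bits}-bit: {levels} gray levels, each band ~{band_px:.1f} px wide")

# 8-bit:  256 levels  -> ~15 px bands, easy to spot on a monochromatic ramp
# 10-bit: 1024 levels -> ~3.8 px bands, much harder to notice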

have you tried fixing it? upgrading hardware is a gimmick in current year. It's the software that needs upgrades, hence the new APIs that try to copy 3dfx now

If you don't mind paying more and want Intel instead of AMD, get a mobo/CPU now with a cheap filler GPU, and wait for Polaris/Pascal to smash each others prices down before getting a real GPU. If you want AMD or just cheaper Intel due to competition, wait for Zen in October.

2016 is set to be the best time to buy PC upgrades in the better part of a decade.


LCDs can just barely almost handle 24-bit color OR 60FPS; both at once would be impossible. Your only hope is the possibility that somebody will make decent OLED monitors. 50" OLEDs with the usual pointless TV chipsets have already broken the $1k barrier, so a theoretical perfect PC gayman OLED monitor would actually be relatively affordable right now.

I generally just use LCDs for text/lineart, and stick to CRTs for gayman.

I have only barely heard of refresh rate and 1440p t.v.'s

what in the world do these really mean, and why do they matter.

is it really easy to find a 1440 tv for a good price at most stores? will it cost the same as a 1080p? will videogames from any console just automatically function for it?

how does a 1440p t.v. look compared to a 1080p?

what are the best 1440p LED's with high refresh rates right now?

Son, you need to go to the eye doctor.

wait, so 1080p 1440hertz

or 1440p 1440hertz

what the fuck?

If someone could explain this to me I would be very happy because hopefully if my t.v. ever doesn't function anymore I will have the right knowledge and wisdom to get a great t.v. at fair price.

The number of times a second something happens. In the context of display specs, this means framerate. TV/film/etc and console peasantry max out at 60Hz, so you shouldn't ever care about this unless you're a PC master race (or enjoy fake interpolated frames like SVP), in which case you shouldn't be using TVs anyway.

This is just the vertical resolution of the display. 1440p usually means 2560x1440. As above, unless you're a PC gamer or enjoy faked upscales, this doesn't matter since nothing else goes above 1080p (1920x1080).
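If it helps to see the raw numbers behind those two specs, here's a quick sketch (the resolutions and rates are just the common ones thrown around in this thread):

# Pixels per frame and time budget per frame for common resolutions and refresh rates.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
refresh_rates_hz = (60, 120, 144)

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels per frame")
# 1080p ~2.07 MP, 1440p ~3.69 MP, 4K ~8.29 MP: 4K is roughly 4x the pixels of 1080p

for hz in refresh_rates_hz:
    print(f"{hz} Hz: {1000 / hz:.1f} ms to render each frame")
# 60 Hz ~16.7 ms, 120 Hz ~8.3 ms, 144 Hz ~6.9 ms of GPU time per frame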

Stop calling them that; it's false advertising. They're just LCD TVs that replaced the fluorescent backlights with LEDs. The only genuine LED TVs use OLEDs.

In short, none of this will matter to anyone but PC gamers until substantial amounts of 4k content starts coming out. Also, LCDs suck. If you want a good bigscreen TV, get a projector and set up a front-projection screen, since they can display proper color and motion unlike LCDs.

whoa sounds cool. maybe too expensive for me? but it sounds like such a smart way to go. how do you set it up so that you can play video games on it or watch movies or t.v. so that you don't block the image by sitting in front of it to watch it?

how do get the angle right?

how do you get sound from a projector?



(cousin's dad lives in Michigan, is military, and gets loads of free shit as bonuses from his commander, who has a big stereotypical army sergeant moustache; he goes abroad and strategizes and reduces the death rates by a measurable amount in his set of squads, dude is a fucking american hero) although I don't support war or shit like that.

bump?

Projectors aren't that expensive. Leaving aside cheap portable stuff, $400-$800 for a decent projector, and the screen varies from $40-$100 for a decent rigid or rollup (or $10 for a homemade wall treatment). If you want it out of your way, ceiling mount installation is best. Adjustment is a one-time thing and pretty trivial.

As for audio, just like TVs, some projectors have mediocre speakers included, but there's no reason you shouldn't just use discrete speakers. Assuming you don't already have an old modular hi-fi system kicking around, a cheap $100 "home theater in a box" with 6 surround speakers and a 200w+ receiver/amp will greatly outperform any TV or soundbar.


Digital displays like HDTVs, projectors, and LCD monitors automatically upscale or downscale any input to the display's native resolution, with varying quality. Adapters to connect various digital interfaces (HDMI, DVI, DP) are inexpensive, so the only hard compatibility barrier you might ever run into is older analog connectors.

Aspect-ratio (square, rectangular) rescaling can be toggled in the settings for TVs and most monitors, again with varying quality.

Projectors can be made to display almost any resolution or aspect ratio natively with the correct settings and lenses.
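If anyone wants to see what "scaling in software" boils down to, here's a minimal sketch using Pillow 9.1+; the filenames and the 4K target are placeholders for the example, not a recommendation of any particular toolchain:

# Upscale a 1080p frame to a panel's native resolution before it ever reaches the
# display's built-in scaler. Filenames and the 4K target are illustrative only.
from PIL import Image

NATIVE_RES = (3840, 2160)  # assumed 4K native panel

frame = Image.open("frame_1080p.png")                           # e.g. a 1920x1080 screenshot
nearest = frame.resize(NATIVE_RES, Image.Resampling.NEAREST)    # exact 2x pixel doubling, stays sharp
filtered = frame.resize(NATIVE_RES, Image.Resampling.LANCZOS)   # filtered resample, softer but less aliased
nearest.save("frame_4k_nearest.png")
filtered.save("frame_4k_lanczos.png")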

Don't waste your money on it. Pretty much all of the good games can be played on toasters, calculators and even casio keyboards.

...

Who or what are you quoting? OP specifically stated he wants a modern gaming rig, which is absolutely worthless because no good games are coming out whatsoever, and the ones that are coming probably run well on dated hardware.

Consumer hardware won't honestly display true 10-bit (16-bit, 4096-bit, etc.) data, and its engineers would only laugh if you told them you need it. First, there is almost no content; second, it is used in special applications, like radiology, in which the eye adjusts to the local contrast levels of part of the picture, and that can absolutely be done with software manipulation (it's faster and reduces eye strain); third, it is only noticeable in generated content, because nature isn't discrete at all.

What you most likely see is a comparison between a plain generated gradient and a dithered (in software or firmware) generated gradient. If they are properly calculated with sufficient bit depth, I assume you're talking about gradients that are close to the diagonals of an RGB cube or their projections onto its faces, as those are the zones of noticeable aliasing of one or more color values. As with edge aliasing in 3D graphics, there are brute-force solutions (generate at a higher rate and average) and smart ones (using a non-orthogonal grid shifted by a non-integral distance), each with their ups and downs. For a grayscale gradient example, calculated values that are close to (x+0.5, x+0.5, x+0.5) can be rounded equally to all 8 points (x,x,x), (x+1,x,x), (x,x+1,x)… (x+1,x+1,x+1) instead of just (x,x,x) and (x+1,x+1,x+1), which generates a visible edge. Dithering (adding noise under some dependent threshold) basically does that.

Also, I didn't know this has a name: en.wikipedia.org/wiki/Mach_bands
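For anyone who wants to see that rounding trick in action, here's a tiny sketch (random dithering of a synthetic grayscale gradient; nothing hardware-specific, just the idea):

# Quantize a high-precision grayscale gradient to 8-bit, with and without dithering.
import itertools
import numpy as np

WIDTH, HEIGHT = 1024, 64
gradient = np.tile(np.linspace(0.0, 255.0, WIDTH), (HEIGHT, 1))    # finer steps than 8-bit can hold

banded = np.round(gradient).astype(np.uint8)                        # plain rounding: visible bands

noise = np.random.uniform(-0.5, 0.5, gradient.shape)                # sub-threshold noise, +/- half a step
dithered = np.clip(np.round(gradient + noise), 0, 255).astype(np.uint8)

def longest_run(row):
    # length of the longest stretch of identical values; long runs read as banding
    return max(len(list(group)) for _, group in itertools.groupby(row))

print("longest flat run, plain:   ", longest_run(banded[0]))
print("longest flat run, dithered:", longest_run(dithered[0]))      # shorter runs that the eye averages out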

Lots of consumer hardware, like cameras, scanners, and GPUs, already does 10-bit or 16-bit "deep color". Even a lot of software, like Creative Cloud, RAW image files, Windows, DX11, and some games such as Alien: Isolation, uses deep color assets, while others can benefit in the rendering stage through custom drivers.

Sadly, LCDs that support it are rare, expensive, and have horrible motion performance. As a result, software support is also clunky and uneven. Probably the cheapest way to experience it now is an external RAMDAC like HDFury hooked up to a CRT.

Banding aside, if you can't see the disgustingness of dithers when looking at even the best high resolution 24-bit grayscale or monochromatic imagery (especially while editing it in art software, especially especially when making alpha channels and 3D textures), you must be blind. I still remember how miffed I was when I got my first truecolor video card, and it still wouldn't go above 8-bit in grayscale mode.

...

Because he said he wants >1080p >60Hz with all the bells and whistles. Even many good old games with all the mods (not to mention enhanced 3D console emulation) will make midrange PCs pour blood out their eyesockets with the settings cranked like that.

Don't put all that shit on them then?

That's peasant talk, user.

Never forget that the Author was an alcoholic who left his wife and child to fend for themselves for the year he spent finding himself in the woods. He was literally a deadbeat dad who wrote a self-congratulatory tale about his commune with nature while he failed in his obligations as a man. His wife and child could have easily died but he didn't give a fuck; he was the epitome of a selfish and vain man.

Purely opinion.

He was a lifelong bachelor, with no children, nor even any confirmed sexual encounters with anyone. 6/10, easily debunked, but believable enough to make me waste a couple minutes looking it up.
thoreau.library.ucsb.edu/thoreau_faq.html

nobody answers the good questions.

please someone spoonfeed me the answers. I'm an idiot and I will never figure it out.

User of a hardware calibrated 4K display here.

It's great. It makes even games from 2007 look good. Major help in ARMA where it allows more pixels on distant enemies and thereby more accurate fire.

If you want to be a Hz queen wait for the DP 1.3 models. Don't buy a TV.

Get a 1070 when the price has relaxed. Turn AA off and lighten up on the postprocessing and you'll be fine.

this is why you should care.

...

OLED can't come quick enough

Emitting equally in all directions?

Neither technology does.

This.

dummy in need of help

No, it refers to video frame resolution and scanning type. A 1080 progressive scan source updates all rows per frame, while a 1080 interlaced scan source only updates 540 rows per frame alternating between odd and even lines.
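A quick sketch of what that means in terms of data per frame (purely illustrative array shapes, not a real deinterlacer):

# Progressive vs interlaced delivery of a 1080-line picture.
import numpy as np

HEIGHT, WIDTH = 1080, 1920

progressive_frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)   # 1080p: all 1080 rows in every frame

even_field = np.zeros((HEIGHT // 2, WIDTH), dtype=np.uint8)     # 1080i: 540 rows (lines 0, 2, 4, ...)
odd_field = np.zeros((HEIGHT // 2, WIDTH), dtype=np.uint8)      # next field: 540 rows (lines 1, 3, 5, ...)

# A simple "weave" deinterlace rebuilds a full frame from two consecutive fields.
woven = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
woven[0::2] = even_field
woven[1::2] = odd_field
print(progressive_frame.shape, even_field.shape, woven.shape)   # (1080, 1920) (540, 1920) (1080, 1920)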

nice!

Like I said in earlier posts, no TV is well suited for gaming (except turn-based and other non-action genres) since they're all laggy, and LCD screens are IMHO just generally inadequate for color/grays/blacks/motion. Also, nothing except PCs actually outputs native video beyond 1920x1080 60FPS yet, and there aren't any TVs (nor PC LCDs above 35") capable of >60Hz input above 1080p anyway. As a result, I'd strongly recommend a PC monitor (for action gaming) and/or a projector (for laggy things like movies and non-action games).

If, after all this, you don't mind the visual performance of LCD screens, and don't mind the lag of TVs, and want a screen instead of a projector, and want a screen bigger than 35", and can't afford an OLED, AND will do a lot of PC gaming/enjoy artificially resolution-upscaled/motion-interpolated video? RTings is probably the best site to compare TVs, here's a list you'll probably find helpful:
rtings.com/tv/reviews/by-usage/video-gaming/best


Note that I only had the number as variable, not the letter on the end (which is irrelevant, since practically nobody has made interlaced displays for years). Even for an interlaced signal, though all lines aren't from the same field, it's still the maximum vertical resolution of the image.

nah. I got a 1440p 30 inch screen and it looks night and day better than a 1080p.

A 4k monitor would look even better. But the problem would be games. A lot of games don't use 4k textures.

how do you get sound from a TN panel?

Tbh once you go above ~50 FPS and 900p, the monitor latency becomes the most important factor with regards to the display

I would take a 1440x900 60Hz monitor with 13ms total latency over the 4K 1337Hz Gaymer(tm) television with all its fucking post processing and lag


Headphones, external speakers, literally the same situation as using a computer monitor

Pajeet, please go.

...

is nobody going to answer this?

bumping

please someone teach me, I'm stupid.

I really want one of these, I can't believe that person has 3.

Like I said before, it's the number of pixels.

Not unless you care about lag; most scalers at such high resolutions look better than native 1080p.

Higher refresh rates are always better

Like I said before, all HDTVs have scalers built in, and will support any 1080p60 or lower input. Also, there are no TVs with 1440p native resolution so far as I'm aware; the next step up from 1080p on TVs is 4k.

Like I said before, I don't think any HDTV is good enough to call "best" for gaming, so I strongly recommend a video projector and/or PC monitor (they come up to 35"). But if for some reason you want a TV and can't afford OLED (cheapest now is ~$1500), read this:
rtings.com/tv/reviews/by-usage/video-gaming/best

These are the same thing. While some TVs can display up to 480Hz, this is only for internal synthetically interpolated content. The very "best" can only accept external input up to 120Hz at 1080p, 60Hz at 4k.

Aside from a handful of 5k sets, TVs currently top out at 4k. Also, like I said above, there are no 1440p native TVs, only PC monitors and video projectors.

Got like 3 of these and they're high as fuck but not god tier due to the membrane. Model F is absolute god tier for NKRO PCB. Sage for offtopic.

These are impossible to find and this guy just hoards three to himself. Good to see them getting some use though.

I'll admit when I stumbled on this looking for flat aperture grille pics I felt a pang of deep, painful jealousy, but it's actually a common recommendation for owners of gourmet CRTs and other vintage hardware to hoard several of the same model (working or not) to cannibalize for spare parts as they slowly die, never to be replaced, ever.

Damnit, OLED is never going to happen, normalfags and the manufacturers that prey on them are too retarded. I hope somebody Kikestarts a modern gayman CRT before the last Chinese CRT factories still running get shut down for good and all their tooling melted down for scrap.

I know you'll come back to me one day SED, once the patents are expired.

thanks so much user! I'm understanding better now.

What? I set up Gentoo in a day or two the first time I used it 10 years ago. Not even sure what the irony in your first paragraph is supposed to be. You seem to be implying a closed source OS written by corporate retards is no worse than Linux. Neither an IDE nor coding from a simple text editor/in the console (probably with vim/emacs) is objectively better than the other. They're both pretty shit actually.


well memed

1 shekel has been deposited to your account.


Never forget you're a namefag.


A lot of games don't use 1080p textures.

Stop being a faggot.

It's like you like shit.

what's wrong with CRT?

wat.

Prettiness aside, the main benefit of a higher-res screen for gayman is that it allows you to spot entities further away, recognize distant silhouettes faster, and aim more precisely.

It's like you like anything new regardless of whether it's actually better

rekt. btw latency will throw off your aiming anyway.

Wait a minute, holy fuck it just hit me the implications of CRTs not having a fixed native resolution

NO SCALING VIDEOS, JUST SET YOUR SCREEN RES TO THE VID RES AND FULLSCREEN THEM

Not really. They're either remakes, or rehashes of the same old concepts. The perfect games have already come out; there's nothing new about new games. This argument belongs on Holla Forums though.

Oh look, a CRT user high on power bills and dishonesty. No surprise there, as all CRT users I've encountered on the internet were dishonest.


CRT measured black level performance is IRE dependent and can suffer during actual video content.


Hitachi invented IPS around 20 years ago, which largely eliminates this problem.


Most CRTs in the day had uniformity equal to or worse than production CCFL LCDs of 2003. The newest Korean 27" panels are seen in the wild with less than 10% luminance non-uniformity. Color uniformity is still a problem, but there is no reason to suspect that CRT has a natural advantage here.

dit.ie/media/physics/documents/yb_keane.pdf

"Luminance uniformity was a big problem with
CRT displays."


No, IPS/MVA panels have been produced as 8-bit + FRC for years now.


Not intrinsic.


The speediest IPS models have no appreciable response deficit compared to CRT.


You are repeating yourself.


Not intrinsic, model dependent, and an overhyped 1337 gay man meme.


This has always been meaningless because there is no specification for pixel structure, and interpolated LCD resolutions aren't less blurry than CRT.


This is not to mention the serious perceptual contrast problem of CRT. Less than 100:1 checkerboard contrast experienced without a screen polarizer leads to well in excess of an order of magnitude contrast benefit to LCD in most viewing conditions. Poor ANSI contrast affects gamut too, depending on the image content.

Memes aside, even a huge CRT doesn't consume more than one or two hundred watts at peak fullscreen brightness. You know, about the same as the incandescent lightbulbs most people had dozens of running all around their house until recently, yet which still didn't account for the largest share of consumption.

Sure, CRT blacks are hurt by internal reflections in thick glass, but that's generally less obvious for most content than the cavernous gap in contrast performance between LCD and every other technology:
displaymate.com/ShootOut_Part_1.htm

Leaving aside the fact that LCD's polarization means it will always have viewing angle problems no other technology does, and that even the best IPS still looks atrocious to me, this doesn't fix the fact that IPS have even worse motion performance than most LCDs. It's an either-or choice, and both suck.

Obviously, I'm not talking about shadow mask junk. And the overwhelming majority of even LED-backlit LCDs are still edge-lit garbage.
Compared to what? Industrial-grade transparency lamps?

That's temporal dithering, i.e.: fake TruColor.
Both indeed are so if you want less embarrassing motion performance.

Wow! Forget IPS for a second, even the fastest TN """"

The link also points out the poor ANSI contrast problem I mentioned. Would I choose a low hours PVM CRT over my EA275UHD for dark room video? Easily.


The polarisation bit is misleading, and so is the notion that CRT has better viewing angles than IPS. CRT misbehaves at wide viewing angles because of the glass barrier.


And yet it is the tech of choice from prepress to medical to even high end video monitors, which should have been displaced by OLED.

bhphotovideo.com/c/product/999007-REG/sony_pvmx300pac1_pvm_x300_30_4k_monitor.html


meme from 2004


You need to be sat down in front of my monitor and played an 8-bit/10-bit video test.

>Forget IPS for a second, even the fastest TN """"

...

You know, I hate seeing this argument anymore. It was bad to begin with, since it's just an appeal to authority or an appeal to popularity, but I feel like it's even worse now. I think the fact that those people are using a non-optimal display points to a problem in the industry, either on their end (not knowledgeable about the equipment, or not caring) or on the production end (good products are too scarce or too expensive).

Not to mention, isn't the medical field still using SED? If I'm remembering right, SED was only ever sold to medical companies; I could be wrong on this.

Interference between the crystals and the polarizers is what causes poor viewing angles in such a thin display. Plain and simple.
CRT (and every other technology aside from direct-view LCD) has no noticeable distortion until you're a few degrees from perpendicular, at which point it rapidly degrades. LCDs, on the other hand, show noticeable color and gray distortion even a few degrees from dead center (especially vertically).

Is it? Aside from the often questionable judgment of professionals at the industry level (like the transition from film to video for even the highest-budget movies, and my own personal experience with prepress being force marched onto LCDs in the mid-2000s), do they really have a choice? The rapid shutdown of general market CRT and plasma factories/laboratories eliminated them as an option for other markets, the supply of OLEDs is still trickle-thin (the OLED version of the display you linked still costs almost double!), and comparatively tight research budgets for OLED mean lifespan and brightness is still far lower than it should be.
bpm-media.de/en/Post-Production/Video-Monitors/27-47-Monitors/Sony-PVM-X300::357299.html
bpm-media.de/en/Post-Production/Video-Monitors/27-47-Monitors/Sony-BVM-X300::368862.html

>meme from 2004
Same reason TN still competes against IPS in different markets today, just like always.

Ah, I understand what you're saying now. In other words, instead of fake 8-bit (from 6-bit), it's fake 10-bit (from 8-bit). As opposed to the actual >8-bit everything other than LCD has always been capable of.

That still shows 2.1ms controller latency before the panel even twitches. That's very good, in fact many times better than 99% of PC LCDs (which is terribly shameful in and of itself, but anyway…), but it's still infinitely slower than the nanoseconds it takes for a GPU's RAMDAC to command a CRT. Why? The same is true of panel controllers for plasma, OLED, DLP, LCoS, and all the rest, yet the task they're performing is simpler in principle than a RAMDAC's. Somebody will have to unfuck panel controllers, one way or another.

You realize scaling in software, or even on your GPU, will look FAR better than the worthless scaler in your LCD, right?


!?!?!?!? I've never heard of SED/FED leaving the handmade prototype stage. Anyway, my ideal tech is synthetic LED, since it's far more mature than anything else; I've never understood why it's treated as so difficult to manufacture (by, for instance, Sony's Crystal LED project, or mLED projects) even though they're made on wafers just like LCDs.

IPS LCDs do not have noticeable color shift from a normal view. To say this means you lack basic technical knowledge.

Again, not a surprise to me because CRT proponents have always been liars. They love SED, spread fud about OLED, and will never acknowledge that an IPS display can flawlessly render images so long as it is not at low picture brightness.


Funny, because until a few years ago GPUs were adding up to a frame of lag on their outputs on non-Quadro/FirePro cards.

Let me put it this way, I've never seen an LCD that could wholly withstand even changes in posture at normal viewing angles. I can't find graphs of performance at different viewing angles, but given the "standard" for viewing angles (drop to 10:1 contrast) it's not surprising:
prad.de/en/monitore/review/2013/review-dell-u2713h-part8.html#Viewing

I've never seen CRT shills badmouthing OLED. Quite the opposite: everywhere I've seen, we're defending it against FUD from LCD proponents outside and inside the industry, pointing out the harmful effects the LCD monopoly has on the development and propagation of every alternative technology (aside from OLED, plasma & SED/FED, another clear example of this "progress" is the killing of eInk), and we're virtually the only people supporting it in the marketplace. Sure, we criticize flaws (many of them due to inheriting subcomponents from LCDs) in current OLEDs, but only because we're aware of how far out display technology limits really are instead of being (literally) blinded by ignorance of currently popular technologies' shortcomings.

Are you talking about genlock and gSync/FreeSync? Those only affect judder, not transmission latency, which is an entirely unrelated effect:
anandtech.com/show/2803/5

It's just a side-effect of sample & hold displays requiring a complete image. The way it works with a CRT, the upper left corner is displayed almost instantly, with the rest displayed as the GPU's framebuffer (and CRT's electron beam) scans out over the entire frame duration. The only workaround I can imagine is if frames were sent to the monitor at a rate higher than display FPS, which would be capped by the limited throughput of DP/HDMI/DVI/etc so harshly, it would be better spent just increasing FPS (especially on non-LCD technologies where refresh isn't a problem).

For instance, 1080p@60Hz requires 0.445GBps, while the DP v1.4 max throughput of 3.24GBps is only about 7.2 times higher, and 4k at reasonable FPS leaves little or no headroom at all. Even the fastest current external bus, InfiniBand, tops out at 37.5 GBps.
en.wikipedia.org/wiki/DisplayPort#Specifications
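Rough math behind those figures, assuming the usual ~148.5 MHz pixel clock for 1080p60 with blanking and 24 bits per pixel; treat it as a sanity check, not exact link accounting:

# Sanity check on the bandwidth numbers above (assumed timings; real link overhead differs a bit).
PIXEL_CLOCK_1080P60_HZ = 148.5e6       # common 1080p60 timing including blanking intervals
BITS_PER_PIXEL = 24

gbps_1080p60 = PIXEL_CLOCK_1080P60_HZ * BITS_PER_PIXEL / 8 / 1e9
print(f"1080p@60Hz: ~{gbps_1080p60:.3f} GB/s")                       # about 0.445 GB/s, as above

DP_1_4_GBPS = 3.24                     # DisplayPort 1.4 max payload in GB/s
print(f"headroom over 1080p60: ~{DP_1_4_GBPS / gbps_1080p60:.1f}x")  # roughly 7x, as above

gbps_4k120 = 3840 * 2160 * BITS_PER_PIXEL / 8 * 120 / 1e9            # ignoring blanking entirely
print(f"4K@120Hz: ~{gbps_4k120:.2f} GB/s")                           # ~2.99 GB/s, most of the link gone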

On the same page you linked:

"The AH-IPS panel provides the DELL U2713H with excellent viewing angle stability."

Everyone here with an IPS panel might report on how this picture looks despite the IPS glow problem.

My view? The EA275UHD has no technical limitation when displaying this image. And this 4K panel is higher in colour shift compared to older IPS models, presumably because of limitations in etching multiple pixel domains within the fine aperture.

IPS glow is light leakage. It does not affect gamma value as you imputed. The emission is broadband wrt the backlight and its main significance is contrast reduction.

IPS has saturation and gamut problems with dark tones. We know this. What you won't acknowledge is that despite the light leakage, the technology has no problem rendering the majority of the flickr trending images.

On input lag with pre-Eyefinity cards:

prad.de/en/monitore/specials/inputlag/inputlag-part18.html

you're a fag has anybody told you that faggot?

I wasn't referring to other problems that manifest with one eye dead center at a long viewing distance, but to shifts that manifest entirely due to (large or, to a lesser but noticeable extent, small) changes in viewing angle. This is a problem common to all LCDs, though (in exchange for worse motion performance) not as severely in IPS as TN.

Depending in intensity on the angle and direction; brightness, saturation, and hue are all affected:
tftcentral.co.uk/reviews/content/dell_u2717d.htm#viewing

I'm sure everybody who cares about input lag has read that article. Are you sure you're interpreting it correctly? It looks to me like they're saying synchronization (due, as I thought, to the lack of genlock or similar GPU/display communication like gSync/FreeSync offer) between multiple outputs from the same GPU was poor, not that the whole thing is laggy.

CRTs do have a fixed native resolution, the dot pitch of the mask used. It's also physically impossible to attain that kind of precision with an electron gun so instead they just pissbeam all over the place.

That's only relevant above the stripe pitch's maximum resolution, though some additional detail is still produced if the tube is fast enough to scan that high, even if additional pixels won't be fully visible through resultant moiré artifacts.

Below the resolution where the stripe pitch of a CRT matters, there are practically no artifacts (which also allows faster refresh). Arbitrary pixel and image aspect ratios work the same way.

This is completely different from pixel-based displays like LCDs, where anything other than native resolution or aspect ratio can only be displayed by scaling or cropping it into a native resolution image prior to display.

Eew, aperture grille 4 lyfe!

Changes which are not present when seated in front of an IPS panel.

Again, no performance limitation to IPS when displaying most picture content.

>necdisplay.com/p/desktop-monitors/ea275uhd-bk
>cdw.com/shop/products/NEC-MultiSync-EA275UHD-BK-LED-monitor-27in/3762183.aspx

2botnet4me

I'd honestly go for a barebones Korean eBay monitor, they're the same panels for cheaper, just without any extraneous features.

I have it and I hate the glow and high black value (and it's not a shit monitor). Other than that, pretty nice.

I always follow these simple rules:

Take note of whether it actually becomes the new standard or simply becomes a meme (3D gaming, etc).

Wait for a later revision of the hardware or software. Early adopters are always the ones who get burned.

You are talking about 50-cent IR sensors and network software to admin all office monitors at once.

What makes it stand out from the Korean eBay monitors is primarily the SpectraView software. There is a keygen for it.

I'm currently building my new gaming rig (X99, 6850K, SLI 1080s) and I'm waiting for the first 4K/144Hz/IPS/27" Monitor (Asus demonstrated one at Computex). I'm confident that my rig can push around 60-144fps @4K, depending on the game. Until the Monitor releases (likely Q4 2016 / Q1 2017) I'll continue to use a 27" 144Hz QHD and 32" 60Hz WQHD Monitor.

For your situation I would recommend building a new rig with a 980Ti (price dropped significantly, ~420USD) and a 144Hz QHD (2560x1440) 27" Monitor; there is no problem with scaling with this configuration (except if you really want to scale 1080p to 1440p, which would be very questionable).

A 980Ti build would likely cost you between 1200-1400USD and a 144Hz QHD Monitor around 800USD, but be aware that the 980Ti cannot push 144fps in all games; you might need to optimize your settings or lower the refresh rate in graphically demanding games. This would be a good option if you want to go the middle way (alternatively you can use a 1070).

If you really want a 4K 144Hz Monitor you need to be aware that it will cost a major premium on release (>1500USD), and even a 5000USD PC with SLI 1080s won't be able to put out over 100fps in graphically demanding games; I would even be surprised if SLI 1080Tis will be able to get 144fps @ 4K. I highly doubt that the cost of a PC able to run 4K @ 144fps will drop below 4000USD before 2018.


PS: Not to sound rude, but try to write well informed and to the point, and give precise information about your question so others can answer it as precisely as possible. Also drop the "meme-speak" (mustard, gaymen, ..); it makes you sound like an edgy, autistic retard (good will, no offense).

$359.99
ebay.com/itm/Pixio-PX277-27-inch-2560x1440-144Hz-AMD-FreeSync-WQHD-Gaming-PC-Monitor-/262508966762?hash=item3d1ec0f36a:g:Q4QAAOSwuzRXesKp

$360
ebay.com/itm/EVGA-NVIDIA-GeForce-GTX-980-Ti-06G-P4-4996-KR-6GB-6GB-max-GDDR5-SDRAM-/112056636807?hash=item1a17189587:g:hNcAAOSwP0RXh-mp

Don't buy new parts; unlike a car or whatever, there is zero wear-and-tear downside to buying used (except, perhaps, for SSDs, but you should RAID all storage anyway). And keep in mind the above are Buy It Now prices; auction bidding or scouring classifieds like Craigslist shaves 25%-50% more off the price of most parts.

Also, remember to sell any old parts you don't need after upgrading. Circle of life and all that.

The ~ symbol means approximately, which in this context means a price around 420USD. Also, I was referring to a new Gigabyte 980 Ti with manufacturer warranty; whether you want or need that is up to you.

I should have clarified that I meant a 27" 144Hz QHD IPS Monitor with G-Sync, which mostly still costs around 800USD, although you can get them as low as ~650USD now. Sure, you can drop G-Sync and go with FreeSync, get a TN panel instead of an IPS one, or even go as far as buying a monitor from a questionable company as you suggested and save a few bucks, but in the context of the OP, who wanted advice on a 4K system or a reasonable current alternative, I expect performance to come first and money second at best. If I misinterpreted that and money is really a bigger issue, then a monitor like the one you suggested and a rig built around a Radeon RX 480, for example, would also be a decent option.


I always either keep my old rigs as backup machines for when I have a LAN or friends over, or for testing purposes, and if I don't need one I give or sell it to some friends. This way the hardware gets used for quite a few years and in the end it gets recycled.

I want Holla Forums's homosexual propaganda front to leave

True enough, if you want one. Though you can still get it with refurbished parts, and a large number of used ones can be warrantied through 3rd parties like Square Trade.

Like I said, going from Buy It Now to auctions on eBay shaves the price quite a bit, sold listings show QHD IPS 144Hz G-Sync LCDs commonly going for ~$400, and 980 Ti GPUs sometimes getting snagged for under $360.

While that's true for certain very new or specific parts that haven't developed a good used market yet (the latest Intel CPUs, for instance, typically have very rigid prices), you'll usually get identical or competitive parts for substantially less money if you shop around.

In that respect, price and performance are most often interchangeable. If a few hours of shopping and a little flexibility on exact parts saves you hundreds or thousands on a build, those cheaper prices can be used for an even more ambitious build within the same budget, instead of plain savings.