Nvidia kills 3+ card SLI

archive.is/Ws9s2

"We suck at doing our gooddamn jobs, so we give up"

We already knew this

Jesus christ, what is the power consumption of that?

it requires its own thorium reactor

but will they bundle fire extinguishers?

i don't know why everyone is surprised.
this is quite a problem with parallel computing: once you account for the overhead, you end up with devices working at half of their maximum performance, or even less
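the overhead argument above is basically Amdahl's law. here's a toy sketch; the 30% serial fraction (sync, driver overhead, inter-card transfers) is just an illustrative number, not a measured one:

```python
# Amdahl's-law-style sketch of why stacking GPUs gives diminishing returns.
# serial_fraction = share of per-frame work that can't be parallelized
# (an assumed, illustrative value).

def speedup(n_gpus, serial_fraction):
    """Ideal Amdahl speedup for n_gpus given a fixed serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_gpus)

for n in (1, 2, 3, 4):
    print(f"{n} GPUs -> {speedup(n, 0.3):.2f}x")
```

with 30% overhead, 4 cards only get you about 2.1x, i.e. each card runs at roughly half its potential.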

So they fucking sucked at it. How about releasing it as open libre hardware instead, so that other competent people could learn how that piece of shit was made?

Anything over 2 cards for SLI has sucked ass anyway. The CUDA architecture was never designed for more than 2-way SLI; everything else was just marketing bullshit for people with money to burn on 4 cards.

Wow gee, those 4 people who use this are sure going to be disappointed.

...

...

ITT: nVidiots in full damage control claiming no one needs 3+ card SLI anyway

But can it play ASS-ASS-IN's Creed?

...

They're killing SLI in general because Vulkan and DX12 are going to be able to address multiple cards natively.

I can't stand people who think graphics programming is moving towards Vulkan and DX12 anywhere outside major studios with tons of money. These APIs are ungodly complicated, and no one would use them unless they needed maximum performance.

Implying indie devs write their own engines from scratch

So 4-way crossfire from now on?

I wonder if Godot will add Vulkan support.


LOL

This was already an issue for the 5+ card rigs typical of HPC apps, from which it was practically impossible to rasterize a unified framebuffer. D3D12/Vulkan will solve that issue, though it probably means no more mobos or SLI bridges with explicit 3/4-way SLI support, so more PCIe bus contention.
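for the anons who haven't touched explicit multi-adapter: D3D12/Vulkan push the "unified framebuffer" problem onto the application. a toy model of split-frame rendering (none of these names are a real API, just an illustration of the scheme):

```python
# Toy model of explicit multi-adapter split-frame rendering.
# Each "GPU" owns a strip of scanlines, renders into its own local
# memory, and the app copies the strips back (over PCIe, in real life)
# into one unified framebuffer. Illustrative sketch, not a real API.

HEIGHT, WIDTH = 8, 8

def render_strip(gpu_id, y0, y1):
    # Stand-in for a per-GPU render pass: fill the strip with the GPU id.
    return [[gpu_id] * WIDTH for _ in range(y0, y1)]

def composite(n_gpus):
    rows_per_gpu = HEIGHT // n_gpus
    framebuffer = []
    for g in range(n_gpus):
        y0, y1 = g * rows_per_gpu, (g + 1) * rows_per_gpu
        framebuffer.extend(render_strip(g, y0, y1))  # the PCIe copy step
    return framebuffer

fb = composite(4)  # 4 GPUs, each renders a 2-row strip
```

the copy step is exactly where the extra bus contention bites when you lose the dedicated bridge.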

Hey goys. Now that the future is not certain, you can't save money by buying two or more cheaper Nvidia cards and going SLI and beating Titans. Make sure you buy a GTX 1080 for now, and then 8 months later buy the Titan!

well it's already written in OpenGL, right? the whole point of Vulkan is that it's a cleaner API than OpenGL. if the Godot guys were doing alright in OpenGL, then they should be fine with Vulkan.

...

Rename yourself to /g/ until one of you comes up with a newsworthy subject, like a smart way to render arbitrary 3D on multiple accelerators with independent memory, or a smart way to write software so that even renderers of linearly dependent frames can benefit from multiple cards.
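the "linearly dependent frames" bit is the real killer for alternate-frame rendering. toy timeline, with made-up numbers, showing why 2-way AFR collapses when frame N+1 reads frame N's output (temporal AA, motion blur, etc.):

```python
# Toy wall-clock model of alternate-frame rendering (AFR) on 2 GPUs.
# If each frame depends on the previous frame's output, the GPUs must
# wait on each other and throughput falls back to a single card.
# Numbers are illustrative, not benchmarks.

def afr_total_time(frames, render_ms, dependent):
    """Wall-clock ms to render `frames` frames on 2 GPUs under AFR."""
    if dependent:
        # Each frame waits for the previous one: fully serialized.
        return frames * render_ms
    # Independent frames: the two GPUs alternate, halving wall-clock time.
    return frames * render_ms / 2

independent = afr_total_time(100, 16.0, dependent=False)  # 800.0 ms
serialized = afr_total_time(100, 16.0, dependent=True)    # 1600.0 ms
```

solving that dependency problem in software would actually be newsworthy.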

Excellent idea! We'll need a logo to start.

Isn't this what D3D12/Vulkan already do?

The issue here is that there has been a regression in technology, where something has been killed for no good reason.