I've always wanted to build a desktop/workstation with dual CPUs to satisfy my autism. I have various uses for them but would also like something to play games on. I also need GPU passthrough because I want to run some version of Windows in a VM. One user told me to wait for AMD's new server-class CPUs back when the shilling for Ryzen was in overdrive; another mentioned Xeons not being the best for applications that only use a single core. I don't really care for either AMD or Intel, I just need 2 CPUs that are good at both single-core and multi-core applications. I want this to be my main work computer for some time to come. My goal is to encode video fast on one CPU while running the OS and maybe even a game on the other.
Dual CPU configurations
use dual gpus instead
The CPU is my bottleneck. The GPU is limited in what it can do, the tools for it aren't as mature, and built-in GPU encoders produce shit results at the same bitrate. I manually offload stuff to the GPU in my scripts and it isn't much.
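Roughly the kind of offload my scripts do, as a sketch (assumes an NVIDIA card and an ffmpeg build with CUDA filters; filenames and settings are placeholders): the GPU handles decode and scaling, libx264 on the CPU still does the actual encode.

import subprocess

def encode(src: str, dst: str) -> None:
    subprocess.run([
        "ffmpeg", "-y",
        "-hwaccel", "cuda",                   # decode on the GPU
        "-hwaccel_output_format", "cuda",     # keep decoded frames in GPU memory
        "-i", src,
        # scale on the GPU, then pull frames back for the CPU encoder
        "-vf", "scale_cuda=1920:1080,hwdownload,format=nv12",
        "-c:v", "libx264", "-preset", "slow", "-crf", "18",  # quality encode on CPU
        "-c:a", "copy",
        dst,
    ], check=True)

encode("raw.mkv", "out.mkv")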
What's the point of multiple CPUs vs multiple CPU cores?
Moar cores
More botnet.
Just get yourself a 16+ core single socket and avoid the NUMA bullshit.
Aren't there encoders for xeon phi? If they're not complete shit they should beat any cpu you can afford. Plus you can ssh to it.
There's no x86_64 cpu with 32 or more cores.
OP will use each cpu for a different task. He won't have NUMA problems.
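Something like this would do it on Linux (a sketch; the CPU ranges are hypothetical, check yours with lscpu, and for strict memory locality you'd still want numactl on top):

import os
import subprocess

SOCKET0 = set(range(0, 8))   # hypothetical: socket 0 -> leave to the OS / game
SOCKET1 = set(range(8, 16))  # hypothetical: socket 1 -> dedicate to encoding

def run_pinned(cmd, cpus):
    # preexec_fn runs in the child right before exec, so only this job gets pinned
    return subprocess.Popen(cmd, preexec_fn=lambda: os.sched_setaffinity(0, cpus))

job = run_pinned(["ffmpeg", "-i", "in.mkv", "-c:v", "libx264", "out.mkv"], SOCKET1)
job.wait()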
Server boards and cpus can actually be better than desktop ones. For instance, some Xeons are just i7s without an integrated GPU that cost less, and the motherboards don't have reddit-tier faggot shit like meme LED connectors that are completely useless and only drive up cost.
The downside is the lack of overclocking options.
only retarded winfag gamers do this
It's a cheap way to get more performance out of your hardware. And it's currently the only way to get more single thread performance on amd64/i386 than Intel sells out of the box.
...
I wrote "cheap", not "free".
...
cheap until your cpu shits the bed because of it.
No, the potentially decreased lifetime is part of the price you pay for overclocking.
I don't even OC my hardware anymore but this is mostly bullshit. I have plenty of old AMDs on stock cooling that were overclocked over a decade ago and will chug along fine. If you're an idiot and burn one up trying to max out an OC on stock cooling, sure, but if you keep the temps low it's fine. I've yet to see one of my own CPUs die because of a mild, responsibly cooled overclock. Power consumption is not an issue in most applications; if you think staying stock saves on the power bill, enjoy the entire penny or two a month. Overclocks are stable when dialed in too. You're just baiting or bought into the meme of it being dangerous.
It would cause redundancy in NSA's database and make it less efficient. It's a great plan.
Intel's encoders are as shit as Nvidia's, and the Phis don't have VT-d yet, in addition to having low clock speeds.
I don't need a cpu configuration, I've got a supercomputer. Like research-level shit. Quit being poor you stupid faggots.
I bought an H8DGi dual socket mobo for 200USD.
I plan to buy 2 Opteron 6276 (2.3GHz base/3.2GHz turbo) for 80USD.
Then 2 AMD furry Nitros at 300USD each.
Autism satisfied.
If you really want to be autistic about dual CPUs then go look for an older server on ebay or something. Maybe you'll find a dual Xeon or Opteron setup for a reasonable price. I think you're being pretty stupid though if you plan to use each CPU for a separate purpose. Just build two PCs if you're going to do that, it will probably be cheaper and faster.
also the new AMD chips are going to have NUMA within the chip (2x "core complexes" per die, multiple dies per package in the Epyc chips), so you can kinda get the same experience there if you want
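if you want to see what the kernel actually thinks your node layout is, on Linux it's all in sysfs (quick sketch):

from pathlib import Path

# each nodeN directory is one NUMA node; cpulist is the CPUs that belong to it
for node in sorted(Path("/sys/devices/system/node").glob("node[0-9]*")):
    cpus = (node / "cpulist").read_text().strip()
    print(f"{node.name}: cpus {cpus}")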
Does the Xeon Phi run a separate linux OS?
What's the knot rate on those?
Yes, it's independent of the host OS.
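So you can just ssh jobs into the card. Rough sketch, assuming Intel's MPSS stack, which names the first card mic0 (adjust to your setup); whatever you run on it has to be built for the Phi:

import subprocess

def run_on_phi(command: str) -> str:
    # "mic0" is the hostname MPSS assigns the first card (an assumption here --
    # adjust to however your card is configured)
    result = subprocess.run(["ssh", "mic0", command],
                            check=True, capture_output=True, text=True)
    return result.stdout

# e.g. count the card's hardware threads
print(run_on_phi("grep -c ^processor /proc/cpuinfo"))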
Why not drop $1500 on an intel processor + liquid cooling and solve that problem, goyim