What do we do when the time comes to upgrade my CPU to a botnetted one?

So, with all the talk about Intel's CPUs having shipped with the Intel ME for a while now, which is fucking botnet incarnate, AMD's upcoming Zen line being just the same, and everybody in Holla Forums urging each other not to get any CPU launched after 2013, there's one big-ass elephant in the room that I need to address:

What are we going to do when the time comes to upgrade our CPUs?

Software gets increasingly bloated every day. We can feel it right now: unless you use NoScript, loading a website means loading billions and billions of Javascripts that literally won't even load quickly on a Core i7 6700. Everything is designed to pretty much only run smoothly on the latest Mac. And no matter how hard we cling to our old not-so-botnet CPUs, there will come a time when we are forced to upgrade. If that weren't the case, we would still be fine with a Pentium 4. What are we going to do when that time comes?

I used it for a while. After a couple months I gave up. It's a fucking hassle when you open a website and you have to hunt down the Javascript that loads the website's content, which is hidden behind a Javascript that was outsourced to a third-party content delivery company that outsources its Javascript delivery to Amazon Cloudfront and that's one of the 15 Javascripts loaded by one of the two main Javascripts.


i'm not upgrading shit.

getting some "modern" hardware for certain tasks? maybe. but i wouldn't use it for mission critical shit. it would just be an internet shopping box, or whatever.

i'm doing fine with a laptop that was just average when it was made 10 years ago. i think people who complain about "slow" computers just don't know how to properly use one. but i'm running OpenBSD with many years of experience. if you're still a Windows faggot at this point then you're pretty fucked.

i can't imagine using firefox without noscript. the web is such a disgusting thing nowadays. my cache, history, cookies, etc are all cleared when i close firefox. i will disable noscript for sites that need that, but afterwards i make sure i close that shit and clean up the mess. textmode browsers make more sense for most of what i do on the web.

fuck the web. it accounts for a small fraction of what i do with my computer.

a lot of people are getting real sick of this backdoor cpu bullshit. i fully expect open and trustworthy hardware at affordable prices within a decade. until then we just have to deal with it.

There are a couple of answers to this. The first, and most practical one, is that Sandy/Ivy Bridge Intel CPUs are very close to being de-botnetted.
hardenedlinux.org/firmware/2016/11/17/neutralize_ME_firmware_on_sandybridge_and_ivybridge.html

Nah, that's too much trouble. I use browsers like Lynx and Links, where you don't have to fiddle with scripts.
If a website doesn't work, then I just don't bother going there.
Only exception is critical sites like my bank, so I have a big bloated browser installed just for that (but only for that).

Good news. You can now disable the intel ME on new processors:

hackaday.com/2016/11/28/neutralizing-intel…

Link fixed

hackaday.com/2016/11/28/neutralizing-intels-management-engine/

Not everyone. Like a user mentioned in the other thread, ME's remote control features can be pretty easily bypassed by either running your traffic on a non-default VLAN if using the onboard NIC or just adding a third-party one. I'd prefer a jumper or BIOS setting to disable it entirely, though, so if the debotnetting procedure becomes bulletproof I'll run it on my own systems. I'm not going to stress beyond that about how (((they))) might still have some secret backdoor. If (((they))) want to burn that capability by using it on me, come at me bro.

That's like worrying about reinforcing your windows to deter burglars when you don't lock your front door. I will say that the menu system of uMatrix is significantly more convenient than noscript.

As for what CPU to use: get an FX-8350 now, as it's currently the most powerful x86 CPU on the market that doesn't force anything like Intel's ME or AMD's PSP onto you and doesn't have a full-retard TDP (or possibly wait to see how the recent solution to neutralizing Intel's ME on some machines turns out). Then get gud at optimizing your shit (which includes blocking JS online) so you can stretch out its usefulness as long as possible (I still use a netbook with an Intel N270 Atom and 1 GB of RAM and it runs well for most of my online use). In the event that you need more processing power, optimizing is no longer an option, and better non-shit processors have not been made, then build a separate computer just for that task and others that can't be accomplished on your older secure machine, so long as those tasks won't compromise what you're doing on your older secure machine.


You do know that the ME can also be exploited locally, right? I don't know how powerful the processor used for Intel's ME is, but AMD's PSP uses an ARM Cortex-A5, which would give an attacker quite a bit of room to work with. Also remember that Intel's ME and AMD's PSP are being used to force DRM onto you as well.

Like, for example, a separate computer just for CPU-intensive compression, data crunching or for compiling my Gentoo packages with distcc?

Also, I already have an FX 8350. I've already overclocked it to 4.4 GHz and plan on getting better cooling on my computer to crank that shit up to 5 GHz.

They'd have to string together a chain of 0-days to get there. I'm not saying leave yourself wide open to every driveby, but my systems are locked down and logged well enough that I'll know if something funny is going on. That's why I'm not stressing that they'd go to all that trouble and expense and risk burning their exploits on a no-value target; I'm not al-Qaeda.
Agreed on DRM, I'd just as soon purge that. I think Intel's next version of ME is actually going to be a full-blown 386 which is... They'd better come up with a good way to secure that shit.

The first 2 criteria for when to build a second less secure but more powerful computer mean when such a task can no longer be reasonably accomplished with your more secure computer. Remember, if you constantly make excuses for using a less secure machine just because it's convenient you will end up screwing yourself with your complacency.

An ARM Cortex A5 is significantly more powerful than a 386. Older ARMv5 processors like Intel's XScale line (used in higher end PDAs and early smart phones from the early-mid 2000s) clocked at 300-400 MHz were powerful enough to run games made for a 386 emulated in DOSbox if I remember correctly (some people messed around with compiling DOSbox for the Zipit Z2 in the late 2000s).

x86 has a better security model than x86-64 :^)

Yeah, it's called "don't have a backdoor built into the CPU" :^)

I don't know OP, I'm a bit spooked. I guess I'll get AMD when I have to but I have few illusions about it being much safer, although at least it's less brazen than intel. What is the most powerful CPU today that doesn't have the current-gen botnet?

Seems like software and games have largely stagnated, the bloat you speak of exists but the FOSS community still respects not hogging resources even if hardware is cheap, and these days FOSS is quickly becoming better than proprietary even in just features. If we're lucky, soon the proprietary ecosystem and the normalfags feeding it will completely separate from Holla Forumsie net that we can safely ignore, just like the Apple matrix.

NoScript is obsolete, use uMatrix.


Oh god please let this happen, I'd pay 5x the money of a botnet cpu with the same benchmark score. Hell I'd pay 10x if I had a higher paying job.


Cool link, but it's still very uncomfortable to be in an arms race with your CPU vendor. The CPU vendor has far greater resources to do sneaky shit, and you can only disable the shit everyone knows about. These things are stopgaps, what we need is a transparent and honest vendor.

We have that. It's called the OpenPOWER Foundation. What's missing is affordable prices for desktop usage. The TDP is also pretty shit.

Why does this keep happening?

Why does software evolve and better extensions get developed over time?

If you're talking about 80386 or earlier chips, then yeah, they're better. There's not almost 40 years of legacy cruft with lots of wiggle room for exploits, and the chip is entirely your own (no ME botnet).
I would love a nice 386DX33 to run DOS again, on the real hardware. Either that or an Amiga or Atari ST.

You can go up to a 486DX2 before you start running into potential issues like System Management Mode being included.

There really is no reason for the average consumer to upgrade within the near future.
Performance gains are so minuscule on newer CPUs/GPUs that it's pointless. We've reached a plateau, and it's gonna be a long time until the plateau is overcome. Developers are going to have to start buckling down and optimizing their shit for current tech, because they can't keep passing the cost of their bloat off to the consumer by insisting that their hardware is obsolete.

so what was the last CPU launched that wasn't part of a botnet? Since they're all before 2013 I assume they'll be cheap by now...looking to build a half-decent budget PC in the spring.

How about you lurk moar faggot. Seriously, there are only 21 posts before yours in this thread and multiple of them in the first 10 talk about exactly that.

im using an amd fx8120, is that botnet?

picture unrelated

Who are you, the identity killer?

youtube.com/watch?v=oL895peZpqY

retard detected. umatrix allows some parts of JavaScript through, unlike noscript, which blocks all javascript. why the fuck do you think the tor project is still using it if it's "obsolete"?

...

It's fine, it's Zen and some of the newer CPUs that are a problem.


AMD FX chips are all fine. I'd get them now before they're discontinued.

Nice smug lain.

It actually does. Remember the Firefox SVG + JS 0day that was found a week ago? That requires JavaScript to function, correct? Try your browser with this test (made by a fellow Holla Forumsnichian): a.pomf.cat/hxqqyl.html
Check the source if you'd like, there's no malicious payload. If it executes correctly, it will crash Firefox; if not, it won't. With noscript, it doesn't work, but with uMatrix it does. Therefore, uMatrix allows some JavaScript to be run.

Again, retard detected, thanks for the lain.

noice.

...

Same here with IceCat, don't know what he's on but must be good stuff.

They did more than that, I still have a couple of Compaq iPaqs which I use as touchscreen remote controls for my TV, surround sound, etc. I can play Quake 1 on one with its meager 400MHz XScale. Although, as you'd imagine, having only a D-pad and a stylus makes it rather hard to play.

No one ever was.


Why would you want to run non-libre JS?

The only reason to buy newer hardware would be for gaymen or media, and even a 5-year-old CPU can handle 1080p 30fps VP8-encoded video.

I have a pentium D that runs linux really well. Barely has an issue even with web browsing and youtube. My main computer is still a gen 1 i7 from 2010. The way modern computers are used is just really damn wasteful. We've kind of hit a soft ceiling in terms of more CPU power being truly useful. The only computer I really want now is an alienware 17 with a geforce 1080. Maybe that is silly. I just like the way it looks and stuff, it looks like something from the early 2000s. People will probably start trying to sell theirs in 2 years and then I will probably use it for 10 (or more if it still works) and that will just be my weirdo person-who-was-alive-in-the-20th-century computer to traumatise people with. Basically just represents the epitome of what computers were at the era we now live in where you still had a box with shit in it that sits on your desk, before the time where google (a subsidiary of microsoft) beams "apps" into your retina through your phone. It conveniently has two m2 slots so I can have one for windows 7 and one for linux.

Well the D doesn't have the same heat problems

Can anyone actually prove that Intel's ME, and whatever AMD has or plans to have, actually allows for active remote access????
There would have to be some kind of network protocol that can talk to these devices directly; there is NO WAY these are actively sending out data.

do Xeons and other server grade cpus have this shit?

I don't remember anything like this from the mentioned era.

The server chips are the only ones where it actually makes sense for the customer to want this "feature". And yeah, they have it (but so does your laptop, heh heh heh!)
I doubt you'll see any random traffic from the ME, but allegedly (I just never observed it personally) it just hijacks your own ethernet and IP addresses, so the only way to notice it is by running a packet sniffer on an upstream router and seeing traffic from your computer even after you shut down the OS.
I just don't know how to activate the ME and connect to it. Apparently there is remote access software you can buy, for the server chips at least.

I know that, I just thought being powerful enough to emulate the processor he was talking about was a great way to make my point about how much more powerful a modern ARM processor is.

its a thick silver laptop with a huge bezel and keys that don't have gaps in between

Intel, AMD & Microsoft openly support remote management, including LOM and have been doing so for several years. It is not a bug, it is a feature and they have invested a lot of money into sub-OS remote control technology for their business clients.


I don't know about their secret botnet shit, but many of their RAT tools are out in the open.
developer.amd.com/tools-and-sdks/cpu-development/tools-for-dmtf-dash

That's different though, in that it doesn't run on a separate system whose code you can't access/control/examine, like Intel's ME/AMD's PSP (at least in the case of a GPU; a PCIe cellular modem could, but why would you want one), nor does a GPU (to my knowledge) have separate RAM that only a ring with higher execution privilege than the kernel can access, like SMM.

Is there anything shipping with relatively current hardware and Coreboot/tianocore? Librem seems like the only laptop out there but I can't find anything for desktop. I know corebooting current intel procs still needs blobs.


I NEED this video. Seems amazing.


In Vishera FX-x300 chips you have to load microfirmware in order to get low-level CPU operations.

If internet browsing is seriously the only fucking thing that's shitting up your CPU, then I suggest not buying a computer to exclusively use the internet on. What kind of fucking idiot makes the internet their main purpose for buying a computer?

archive.is/gb6K1

So are AMD's PSP and Intel's ME the same thing with the same risks, or are there any differences between them?

Because if AMD goes that way, why bother not going with Intel?

That was in the past. It doesn't happen any more.

So is this FX-8350 confirmed for last good CPU?

we are so fucked up

Block the botnet ports on the firewall level?

What kind of board supports multiple FX-8350 CPUs and how many of them?

uMatrix allows 1st party scripts by default. I don't agree with that default setting either but it's easy to change. In every other way uMatrix is superior to noscript.

you want to burn your house down? if so go with 9590

Can we not just break the arm chip on amd cpus?

You can't just take any CPU and use it in a dual CPU configuration, you need a CPU that supports being used in the configuration in the first place. Look at AMD Opterons.

rely on operational security.

ever heard the saying "an ounce of prevention is worth a pound of cure"? good operational security is at least as effective as good hardware and software security.

if you get stuck with a botnet computer when you upgrade, keep your normie shit on your botnet computer, pick up a decent old model thinkpad or a sbc like beaglebone black for when you need to inspect your rare pepes you don't want stolen.

you could even get a raspberry pi zero and run that as your rare-pepe machine, $5 a pop you can effectively use it as a burner computer and switch to a new one every week.

keep the two hardgapped, never have wireless enabled (including bluetooth) while in range of the same network, don't share peripherals between the two, get a usb wifi adapter like the alfa-awus036h (so you can physically unplug it, and it can be run in monitor mode), only use wifi on your pepe-machine at public hotspots (or cracked wifi), always use vpn + gpg / pgp, get good with encryption, store your rare-pepes on micro-sd cards so they can be easily hidden and / or destroyed, never access the same accounts between devices, Vary you fuckin type style an sheeit mane can't be havin no muhfuggin cia nigga be [email protected] dem ai bitchz ta be profilin you an sheeit. fuggin whities be racist cuz dey mad we big dick niggas an sheeit muh WOP DO DIDDA BIX NOOD Muhfugga

Not using uMatrix+uBlock and Self-Destructing Cookies.

i would buy new for:

just get the FX-6350 with the Wraith. for normie stuff it's enough, and not like the 8350 where you need watercooling or an 80 bucks high end air cooler

Yes I looked up the specs and I was surprised about the 4GHz clock. Some reviewer also said the fan was too loud so I guess it must be heat related like the previous posters suggested.

So basically, to unfuck my computer I need to get an AMD cpu from the FX line and a new motherboard with an AM3 socket? Then I can just swap the motherboard and CPU inside my tower PC and I have a non-pozzed machine, right?

What motherboards would be good? I read ASUS are solid but come with UEFI faggotry. What alternatives would there be without UEFI and with similar sturdiness?

you forgot the DDR3 ram
don't think there are any non-UEFI motherboards
it's just standard nowadays, unfortunately

Wraith cooler has been keeping my FX 8370 pretty cool. I hope they keep it around for Ryzen.

That doesn't look very different than the cooler that came with my old socket 939 Opteron. It was always able to keep up with everything, even massive overclocking.

It's quite a bit bigger. But that aside, it's the best fucking stock cooler ever.

Proof/citation?

vp8 is shit, however

the W10 thing? or what else


from tests it seems to be on the level of the Hyper 212 EVO

Can any of you tell me more about the FX-4300? Any personal experiences with this one? Will it overheat? Is the fan loud?
So far I read that it is shit for playing new games but for the low price you actually get decent power. Of course it also comes without intel ME / botnet AFAIK.

I want to build a dedicated linux machine only for browsing the web. The system will be on an encrypted HDD and I have no intention of playing videogames or installing any Michaelsoft software on it, as I already have a gayman pc with an Intel 3770K CPU I bought in 2012. The graphics card would be a cheap one with a passive heat sink, just for getting video output.

Most important question would be: will the machine run as smoothly for everyday browsing as my intel machine? I would absolutely hate it if it was slow and constantly lagging whenever I launch the web browser.

Also can anyone tell what the right time would be to buy one? I read AMD will release new CPUs this year and this would likely lead to prices of existing cpus getting lowered for competition. Should I wait until this effect sets in?

me on the left

That should be absolutely fine as a facebook machine. I doubt the fan would be loud or that you would experience overheating if you don't overclock (much).

Get an SSD as well

I don't know if prices on AMD shit will get much lower than they currently are.

Thanks for the reply, I do not plan on overclocking.

but this will not go well with LUKS, right? If I understand this correctly, encrypted partitions will be written to disk more often and this will be bad for SSDs because there are limits on how many times the flash can be overwritten with new data. Also, I once read something about it not being good privacy-wise because flash cells can be recovered in forensics much more easily or something (can't remember anymore).

FX-4300 does not have the AMD botnet features (PSP or how they call it), right?

SSD + LUKS is fine. Encrypt the entire root partition including swap before you do anything else and forensics isn't an issue. For performance reasons you'll probably want to enable trim. Some consider this a security concern because you'd be able to tell from the trimmed blocks what type of OS is in use but a /boot partition and a LUKS header are going to give it away anyways.

Don't buy a consumer SSD. Get a proper 1 DWPD drive off ebay. You used to be able to get new HK3R and a Samsung model for $130/480 GB.


Doesn't matter. The ME is a red herring. If you don't trust the ME/PSP, you oughtn't trust the silicon at all. Most of the worry about these engines is their out-of-band network access. This can be disabled by using a PCI-E LAN card.

this was new to me. Thanks

I went deeper into the subject and researched about motherboards. The 2 most prominent AM3 boards I found were the ASUS sabertooth and the M5A99X. The sabertooth is marketed as a motherboard with super sturdy components (military specifications) and ASUS will give 5 years of warranty for it. It costs about 50$ more.

Does anyone have experience with those boards? Would the sabertooth be worth the extra bucks?
Online reviews say that the sound on both boards is broken and that the sabertooth is not as good as ASUS claims it is. Some people write they had 2 of 3 boards fail. So far I have been using the Maximus ROG series and it has been working nicely for years. No driver issues, no crashes, nothing. I think some of the bad reviews might also be retards buying computer parts that don't fit each other.

Like I said, I do not plan to overclock or mod anything, just a stock "facebook machine" for web browsing on linux. If this Sabertooth is good and if it gives me some more years to use the machine, it would be worth the 50 bucks. I never overclock nor do I play around with it, I just want it to survive as many years as possible.

If a website requires Javascript, don't use that website.

If a website requires JavaScript and you need to read that website, use archive.is.

If you need to interact with a website which uses JavaScript, fuck with NoScript.

Problem solved.

These high end AM3+ motherboards have overclocking as their main selling point. Eight-core FX CPUs can draw more than 200 watts and need proper power delivery. If you intend to use the 4300 with no overclocking, it's best to look at the features (IO and such) of the boards, because these reviews are all skewed by people who overclock.

whats wrong with ssds?

They could, and I think they will: when AMD released their revision of the FX series with the -E suffix, they dropped prices of older CPUs, in some cases even by 28%.

Had one. They're fucking fantastic. They also allow you to automatically overclock the CPU. That is, the mobo will detect your CPU's optimal frequency. Mine went up a good 500MHz using standard cooling.

To add to this, the only reason I don't have one now is I lost it during a move. I'd buy another but I'm waiting for Ryzen.

wow thats a lot

also read that sabertooth boards have an above average RMA rate and cost far more than they're worth.

Thx for all the replies


you seem to be the happy one. How long did it last? Did you have any problems with it? Apparently the board has been on the market for some years already and there should be people around who bought it back in the day.


It was 170$ for AM3 CPUs when I last checked.
Well, if the components are MIL standard and if the board will be good for at least 5 years, it would be worth it, but there are so many reviewers who write that the board was either dead on arrival or died after a few weeks. But then again, there are so many retards out there and you can never know if the board was actually dead or if they just fried it on their first power-on.


So what would be a mobo like that? I do not care for much connectivity or features, as long as it has a port for a graphics card that will give me video output and connectors for the HDD and DVD drive. Of course the idea is not to buy some cheap crap that will break down after 1 year. I just don't know any manufacturers other than ASUS personally. They have been good with my previous board so I want to give them a chance again. Gigabyte is also good apparently but their boards are priced similarly.

I have a three/four year old system with a Gigabyte 990FXA-UD3 v4.0, an 8320 at 4.5/4.6, with several case moves over its lifetime. The system is pretty stable except for several unexplained overheating-related poweroffs recently (which is what I can remember right now) during fullscreen, at least in the current case; most likely the air from the 200/230mm intake is clashing with the CPU fans. Annoyingly, USB 3 requires the kernel option iommu=soft, which appears to not work with the option needed for PCIe passthrough. At least the CPU voltage setting does not allow for any fine-grained control at all, only appearing to move up in biggish-looking jumps. An earlier USB driver in the kernel had caused a reboot when I connected a specific device, however that has been fixed for a while now. Overall I would recommend it.

1 DWPD = the SSD is warrantied to have its volume's worth of data written to it once per day for 5 years. So a 480 GB 1 DWPD SSD has a lifetime endurance of 876 terabytes written.

Consumer drives are 0.3 DWPD or less, and for most of SSD history they have been plagued with failures.
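If you want to sanity-check those endurance numbers yourself, here's a quick Python sketch (the capacities and DWPD figures are just the ones quoted above, not vendor specs):

# Rough SSD endurance math from a DWPD rating.
def lifetime_writes_tb(capacity_gb, dwpd, warranty_years):
    # total data the drive is warrantied to absorb, in terabytes
    return capacity_gb * dwpd * 365 * warranty_years / 1000.0

print(lifetime_writes_tb(480, 1.0, 5))  # 876.0 TB for the 1 DWPD drive above
print(lifetime_writes_tb(480, 0.3, 5))  # 262.8 TB for a typical consumer drive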

thanks for your reply.

I also found out that the mainboards I looked into have no integrated graphics. So I will have to use a dedicated graphics card. In the low power segment with passive heat sink for minimum noise I found those products at roughly the same price:
Sapphire HD6450
ASUS R5 230 SL-1GD3
ASUS R5 230 SL-2GD3

Will I have audio on those cards as well? I just want the card to give me a 1080p + audio output for my monitor from the HDMI port. I hope they will be sufficient given that I do not plan on playing video games on this rig. Do any of you recommend one model, or are they all the same? Reviewers write that for office use these are sufficient. Some even use them for old video games on low settings. I just want it to give me an HDMI signal with no lag and no corrupt audio.


Then the other issue is CPU underclocking/undervolting. I believe 3.8 GHz is more than enough so I thought about underclocking and undervolting my CPU. The idea behind it was to reduce the fan noise and thermal stress (so it can live longer). Would this be a good idea or is it only a meme with minimal effect? 3GHz should be more than enough for my needs. The ASUS boards I looked into seem to support changing the frequency and voltage of the CPU.

thanks, don't they sell those in shops anymore, so that you need to look for used ones? because the used market here is nonexistent and I don't want to import from germanistan or britanistan

for the average user it's enough and will often last a lot longer
techreport.com/review/27062/the-ssd-endurance-experiment-only-two-remain-after-1-5pb

Probably a stupid question, but is it possible for some BIOS spyware shit to be able to evade a packet sniffer?

I mean packet sniffer running on the same machine.

How difficult is it to build your own computer parts? Sure, it'd take a lot more volume and time than those built with robo arms. But would it be feasible with a user's free time? Computers of old were big as rooms, but people then didn't know what we know now.

If you are an electrical engineer and have experience with designing computer parts, then you might get some basic computer developed after a few years. Then you only need a factory that will build the boards you laid out. Also, engineers usually don't develop every single part. They put existing parts into their database and then use those parts for their schematics and board layouts. Then they call a factory to build their prototype.

...

Someone else here, but I don't think I will wait until Zen for an FX-based PC.
I already see local vendors not stocking DDR3 RAM and such.

If you're fine with a peasantly quad-core and don't plan on upgrading, I would suggest going the APU route. The Richland APUs are the last ones without the PSP and they have up to four of essentially the same cores as the FX-4300. They also have an integrated GPU that would let you do without a graphics card. The main advantage is much more modern chipsets and the mobos that go with them - the FX platform is ancient in comparison and is limited to PCIe 2.0 and USB 2 (though the latter can be fixed by a separate controller chip on the mobo). The disadvantage is that four cores are the max.

I would suggest you buy yourself an A8-6600K and a nice, high-endish mITX FM2+ mobo with all the bells and whistles, so that the mobo has everything integrated and you don't need any expansion cards. You'll then be able to squeeze everything into a tiny, console-sized case like the Chieftec FI-02BC-U3 (I have one of those, it's nice-looking, solid and really cheap).

The stock one bundled with the processor is shit and can be quite loud under load. I suggest buying a 3rd party cooler if that's an issue.

I have a much lowlier A4-5300 in my HTPC and it's pretty snappy (unless I open lots of browser tabs, but that's only because I skimped on RAM). Soft-decoding a 1080p H.265 video takes about 50% of a single core, so no problem there either.

ASAP. All the processors we were discussing are old as fuck and utterly obsolete by now. Don't expect price drops, but do expect them disappearing from the market.


Or a high-end mobo with two ethernet ports. The chipset provides only one, so the other has to be implemented through a separate controller chip. Find out which is which and don't ever use the chipset-provided one.

Also, if you go this route (whether a discrete card or a second mobo port), make sure that the controller chip running that extra network interface is not made by Intel. All contemporary Intel NICs support the "bridged second ghost NIC" mode used by the ME, and the ME surely has drivers for them. This includes WiFi NICs.

I wouldn't be surprised if AMD's PSP also had NSA-supplied drivers for Intel's NICs, but that's pretty speculative. Better safe than sorry though.
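If you're not sure which port is which, a minimal Linux sketch like this (assuming the NICs are PCI devices and sysfs is available; run it and avoid anything it flags) will at least tell you which interfaces have Intel silicon behind them:

import os

SYS_NET = "/sys/class/net"
INTEL_VENDOR_ID = "0x8086"  # Intel's PCI vendor ID

for iface in sorted(os.listdir(SYS_NET)):
    vendor_file = os.path.join(SYS_NET, iface, "device", "vendor")
    try:
        with open(vendor_file) as f:
            vendor = f.read().strip()
    except OSError:
        continue  # loopback, USB dongles etc. have no PCI vendor entry
    verdict = "Intel - keep your traffic off it" if vendor == INTEL_VENDOR_ID else "non-Intel"
    print(iface, vendor, verdict)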

YES.
A botnet-compatible network interface operates as two bridged virtual NICs sharing the same carrier. One is controlled by your OS, the other one by the Management Engine. When the ME sends packets on its virtual NIC, your OS will be none the wiser.
You would need a sniffer on the machine on the other side of the cable to detect those packets.
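Something like this scapy sketch, run on the box at the other end of the cable (mirror port or dumb hub in between), is enough for that check; the MAC below is obviously a placeholder for the machine you're watching:

from scapy.all import sniff, Ether  # needs root and the scapy package

SUSPECT_MAC = "aa:bb:cc:dd:ee:ff"  # MAC of the machine under test (shut its OS down first)

def report(pkt):
    # any frame sourced from the "powered down" machine is suspicious
    if Ether in pkt and pkt[Ether].src.lower() == SUSPECT_MAC:
        print("traffic from the shut-down machine:", pkt.summary())

# iface = whatever interface on the monitoring box faces the suspect machine
sniff(iface="eth0", prn=report, store=False)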

So, what about UEFI? Is it an IME/PSP-level disaster or is it more of just a fucky mess that we can live with?

There's nothing inherently wrong with the UEFI spec itself. It is a welcome and long-overdue overhaul of the boot firmware interface on the PC. There's nothing wrong with the Secure Boot mechanism (which is what the whole debacle is centered around) either - it's a valuable tool in protecting your PC from boot malware. What can be wrong is specific implementations intentionally locking out the machine owner from changing the keys and installing his own bootloader and OS.

The key thing to take away is that this is an issue with specific malicious implementations, not the UEFI spec itself. The majority of UEFI firmwares do not fuck over the owner.

Thank you for your reply, I will have a look at this

Will a Hyper 212 EVO cool an FX-8350 with a bit of an overclock (I want to hit something like 4.5) or do I need some $100 beast for it?

tomshardware.co.uk/answers/id-2073182/cooler-master-hyper-212-evo-amd-8350.html


I took a look at it and your suggestion was pretty good, but I don't want to go the console route because there is only 1 motherboard on the market atm that would be compatible with this case. It is a Gigabyte model and I read somewhere that they have bad linux support on this board.

I built an alternative system based on the A6 chip instead of the FX one. From what I understand so far, the benefits of the A6 would be lower TDP, lower price and integrated graphics with HDMI output. By changing to the A6 I could also buy a less expensive motherboard. The disadvantage would be that the A6 only has 2 cores and not as much power as the 4300.

I am basing this on the A6 because the A8 seems to be out of stock at my pc part shop. If I would take the A8 instead, I could have 4 cores and more performance at the expense of 30 Watts more TDP. For my build I would go with the same case, the same PSU, same RAM, DVDR and HDD. The graphics card can be dropped. From pcpartpicker I would say there are no incompatibilities and the only question would be if this build would run linux and give me audio+video on the integrated HDMI output.

Any thoughts on this?

oh and would it be possible for you to give me a list with all botnet-free AMD A-series APUs? I am not sure if the 7xxx versions of the A8 would have it as well. So far I did not find out how to recognise the botnet ones.

Do you have a recommendation for boards where you can remove all keys and insert your own?

Is there a list of malicious implementations?

You don't want an APU:
amd.com/en-us/press-releases/Pages/amd-strengthens-security-2012jun13.aspx

what about intel xmp on ram?

help

Guess that would make the 9590 afterburner the best out there...

if you have spare liquid nitrogen

what a shame about AMD, even with budget CPUs Intel is dominating them

thank you very much. I did not see this until now. I wish there was a list of all non-compromised CPUs.
So the FX series are confirmed as the last CPUs without the botnet inside? 4300, 6300, 8300 and the 9xxx series all good to go?

I only read that anything past 2013 is not good (on the libreboot page) and the -E line falls into that (8320E, 8370E)

also en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors lists EVP (Enhanced Virus Protection) on all models but I have no idea what that is

Lolwut? While the choice of ITX mobos was always sparser than (m)ATX ones, every major manufacturer cranked out at least one model (if not more). You should have quite a few options to choose from.

My HTPC runs on an Asrock FM2A88X-ITX, and I'm quite happy to recommend that board if you can find it. While it was quite pricey, it has all the shit I'll ever need integrated onboard. There's also the very similar newer model, A88M-ITX that should be fine too. One major advantage of both is that they have the CPU socket near the center, leaving more place around it to install a larger cooler in a cramped case like that Chieftec. And the vent in the side panel is then right over the CPU, letting the cooler suck fresh, cool air from the outside.

There's nothing to support here. Wireless and sound are provided by standard, ubiquitous chinkese chips (Realtek or Broadcom? I don't remember..) and all the other interfaces are straight from the chipset. Everything is perfectly supported by mainline Linux kernel. Firmware updates can be done straight from the UEFI, you don't need any Windows-only flasher programs.
This is not an expensive 1337 g@mer mobo with obscure RAID controllers or "Killer" network cards that have drivers only for Windows. Everything should just werk.

Bear in mind that the TDP is calculated for the worst-case scenario with all cores (including the GPU!) running at full throttle at the same time. If you load only two cores of the quad-core chip, the power draw will be no worse than on the dual-core part. Basically, you only pay the extra watts when you actually use the extra processing power.
And even the shitty stock cooler can keep the top-of-the-line A10's temps in check. At worst, it will ramp up the RPM at full load and sound like a fucking hairdryer, but it will hold out. If you don't like the decibels, you can either get a better cooler, or configure the power governor in the kernel to limit max clocks under all-core loads. Or just don't run anything heavy when silence matters.
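For reference, capping max clocks doesn't even need a BIOS trip; a rough sketch like this (Linux cpufreq sysfs, run as root, and the 3 GHz cap is just an example value) does it from userspace:

import glob

MAX_KHZ = 3000000  # cap every core at 3.0 GHz; pick whatever keeps the fan quiet

for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:
        f.write(str(MAX_KHZ))
    print("capped", path, "at", MAX_KHZ, "kHz")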

Not by much though. All of those chips are dirt cheap today anyway. I don't think it's worth sacrificing this much performance for a few measly bucks, especially if you plan this machine to hold up for many years. Get a quad-core if you can.

The APU contains a GPU, yes, however the video outputs available are determined by the mobo.

Then check out other shops. And fleabay. The Internet is full of offers, and processors are tiny and cheap to ship.

Both are total overkill. If you don't need any addon PCIe cards, it would be a waste of space not to go with a more compact ITX or at least mATX case. Also look for slim cases with half-height expansion slots. Almost all PCIe cards these days (except graphics cards) are half-height anyway, so you won't realistically lose any expandability.
Your build will pull below 100W in total. You don't need a 450W PSU.

Get faster RAM, preferably DDR3-2133. It's only a few bucks more, and the iGPU will massively benefit from extra bandwidth. And get two sticks for dual-channel operation.
I strongly recommend an SSD as a system drive. You can get a 240GB OCZ Trion 150 for barely $10 more than that HDD you specced. 240GB ought to be enough for an office/web machine.
And, for the love of Kek, if you do need an HDD for bulk storage, do NOT buy a WD! These drives fail left and right at ridiculous rates. The only HDD manufacturers with a decent reliability record these days are HGST and Toshiba, and HGST drives are ridiculously overpriced - so Toshiba P300 it is.

Sure, no problem. With some more conservative distros you might have to fiddle a bit with kernel config to make HDMI audio work, everything else should work out of the box.

The first APU generation to have the PSP is the 7xxx. All the 6xxx and older are botnet-free.

thank you for your reply.

Yes I found this board but not in the ITX format. It was the A88X-Plus in micro ATX format.
The only ITX one I found was the GA-F2A88XN.

How much would it boost my performance? I think the motherboard has a limit on RAM frequency as well but I will check this later. Would it be good for the FX build as well? I just went for the same RAM I used until now.

yes, I just thought about this today. If this machine is supposed to last long, it would be stupid not to pack some extra processing power into it, just in case websites get even more javascripted than today and inoperable for anyone without a 5GHz cpu. This is really a retarded design concept: build retarded webpages that are slow as fuck and solve it by throwing shitloads of hardware power at the problem. So I thought about putting in an FX-6300 or FX-6350. The 6300 is priced similarly to the 6600K and they have similar performance. The 6350 might generate significantly more heat as it packs 30 Watts more.

unlikely, because I run a full bitcoin client. The blockchain takes 115GB and my lurk folder is 25GB.
I would rather not have space problems in 2 years.

Thank you for pointing this out, I just thought to take the most popular drive.

I did have western digital external 3.5" disks for my warez a few years ago and they did not fail. I had 5 or 6 of them. Then I switched to Lacie and those were nice as well. Now I have Seagate and those work as well. I usually do not plug in the drives when I do not write stuff into them so maybe they just ran well because I never stressed them too much.

For my current PC I have a WD green and a blue one. The green is utter shit for anything other than storage. You cannot play a game or watch a movie from this drive without lags. The blue one I have is from 2012 and it still runs fine. Never crashed and never had a problem with it. Those newer drives might have been introduced a little later.

I will look at some reviews about them. Those Hitachi ones surely look great but they charge nearly double the price.
Never used a toshiba one but I will check them out.

thank you. Is there any list of which FX series CPUs are affected? Someone mentioned the FX-xxxxE series as potentially compromised.
Maybe they released updated versions of the FX line that included those "features"

hold it you dummies. get seagate 8-10tb drives tbh.
backblaze.com/blog/hard-drive-failure-rates-q3-2016/

they upgraded from hgst to the seagate 8tb drives because they're at a sweet spot for reliability, cost, and storage density. if you're building a large array (say 30 tb+ for your own seedbox or server), they have lower array failure probability than hgst drives for the same array size. and by nature of being a higher capacity drive, you get more storage and consume less power. and on your battlestation, you have a limited number of drive slots - using the seagate drives in a redundant configuration (raid-5 is nice) you get higher capacity compared to hgst while you can still lose a disk and be okay, so reliability is less critical.

I checked on the HDD story and basically Hitachi had the best drives but they were bought by WD. Then Toshiba won a lawsuit against WD and WD had to sell some of the former Hitachi production plants to them. Apparently Toshiba is producing the 1TB 3.5" drives that Hitachi produced before. However, I have no idea if Toshiba is building Hitachi Quality drives there or if they just wanted to benefit from their name. From reviews I read, the P300 drives are not significantly better than WD drives and suffer similar fail rates.

WD drives are especially shitty because they implemented some "energy saving feature" in some of their drives. The post-2012 drives have higher fail rates because of the higher mechanical stress: the heads are moved back to the idle position every few seconds. They say this is to "save energy". It usually ends with HDDs reaching their maximum LCC count much faster than without this "feature". The green drives were affected by this the most, and there was a scandal around it a few years ago that led to WD rebranding their shitty and slow green drives as medium-performance WD blue. They make it extra hard for you to distinguish the shitty ones from the good ones. However, the good thing is that you can disable it completely or increase the timer.

Seagate are the ones with the largest fail rates but to be fair you also have to say that they have the largest market share and sell the most drives.

WD is shit, I would never ever buy them. If I was only going to have 1 hdd I would buy HGST for the reliability. If I'll have more than 3 drives (in my next build I figure on having 6+ drive bays) then I'd go with the 10 tb seagate drives out right now. Check the failure rate on the 8tb seagate drives: 1.46%. It's close to the worst HGST, 1.2%. The 10tb drives are supposed to be similar to the 8tb in performance. That's why I recommend the large seagate drives. With 6 10 tb drives in raid 5, that's 50tb usable and I can lose 1 without data loss. 50tb of storage is pretty tempting. I want to use it for some large datasets.
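Back-of-envelope math if you want to check the argument (assuming the quoted AFRs, a hypothetical 4TB size for the HGST drives, independent failures, and counting only "at least one drive dies in a year"):

def drives_needed(target_tb, drive_tb):
    # RAID-5 burns one drive on parity, so usable capacity = (n - 1) * drive_tb
    n = 2
    while (n - 1) * drive_tb < target_tb:
        n += 1
    return n

def p_any_failure(n, afr):
    return 1 - (1 - afr) ** n

for label, drive_tb, afr in [("Seagate 8TB", 8, 0.0146), ("HGST 4TB", 4, 0.012)]:
    n = drives_needed(30, drive_tb)  # 30 TB usable, as in the post above
    print(label, "-", n, "drives,", round(p_any_failure(n, afr) * 100, 1), "% chance of a dead drive per year")

# and the 6 x 10TB RAID-5 example: usable capacity = (6 - 1) * 10 = 50 TB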

So when you all say Botnetted, what do you actually mean?
A CPU can't keylog, screenshot and send my data to anyone.

Best they can get is turn on/off times, and I suppose location of internet access point.

Right?
Right?

I don't know about data, but it seems they can lock you out of your own PC if it gets ((stolen))

Not really. However, I've never heard of nor came across a Secure Boot enabled retail mobo that wouldn't give the user control over the keys. The only SB lockout horror stories you hear are about laptops and (rarely) big brand OEM prebuilts.


What about it? It's just a table of precise timing specs embedded in the RAM stick readable by the mobo that let the mobo automatically choose the best timings it and the memory can handle.


Wut? Right now AMD has the best bang for buck in the low end (as it always had). Their newest APU gen ain't half bad ignoring the botnet factor - they may not be the fastest, but they're cheap and AMD really managed to squeeze great power efficiency out of them (considering the old, inferior manufacturing node they're made on). I don't really get what you're complaining about.

All desktop FX CPUs are botnet-free.

Enhanced Virus Protection is just AMD's fancy marketing name for the No-eXecute page table bit. Every x86 processor made in the last decade supports that.

I meant the kabylake pentium, that thing seems to pack good performance just for 75 bucks

I have bad news for you:
libreboot.org/faq/#intel
basically it can be controlled remotely, it can phone home, it can see everything in your RAM (this is especially problematic for encrypted operating systems), probably even your keystrokes, and you can never disable it or get rid of it.

libreboot.org/faq/#amd
basically the same. Access to ram, keystrokes and can phone home. Notice the emphasis on the "DRM" part.


They want the same thing as the Trusted Platform Module that microshit is shilling so hard. They want to control what software the goyim can run on their machines. If it is not approved by them, the computer will not execute it. So if you pirate a game and it has not been signed with microsoft's private key, you cannot even execute it on your pc. They also want this for DRM, and there must be some way they can stop your pc from playing pirated movies or music.
Luckily, TPM is still not really used, even though it is already available on many mainboards.

I forgot to mention that once my PC started itself. I was sitting on my bed and it just started itself. It went to the part where you have to enter the password to unlock the encrypted LUKS system. I watched to see if there would be attempts to enter a password, but nothing happened. It just stopped at the password screen and it did not seem like anyone was trying out passwords. The screen showed no change, so after a few minutes I turned it off.

It might have been a malfunction but this incident reminded me that intel systems cannot be trusted. Ever since I am always cutting the power supply after turning off my pc.

Denuvo fucked up piracy real good even before this comes

Then you didn't search hard, did you? I just did a quick search and that Asrock board is widely available both in American and European online shops. Right now it's on sale inb4 it's just $5 on Jewegg: newegg.com/Product/Product.aspx?Item=N82E16813157699
There are also some models by MSI and Biostar floating around.

Going dual-channel will boost your CPU performance by ~5-10% and nearly double the performance of the integrated GPU. Switching to 2133MHz sticks won't measurably improve CPU performance, but will speed up the GPU by further ~20-30%. There is no point in going over DDR3-1600 for pure CPUs like the FX series. Those integrated GPUs however are massively bottlenecked by memory bandwidth, especially the higher models. Every bit of extra bandwidth helps them.
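The bandwidth arithmetic behind that, if anyone cares (DDR3 moves 8 bytes per transfer per channel):

def bandwidth_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000.0  # MT/s x 8 bytes x channels

print(bandwidth_gbs(1600, 1))  # 12.8 GB/s, single-channel DDR3-1600
print(bandwidth_gbs(1600, 2))  # 25.6 GB/s, dual-channel DDR3-1600
print(bandwidth_gbs(2133, 2))  # ~34.1 GB/s, dual-channel DDR3-2133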

Not really. It only contains short traces connecting pins on the CPU socket to those on the RAM socket. The actual DDR controller is on the CPU, and that's what is limiting achievable transfer rates (provided the RAM itself is up to the task). Mobo makers will list "supported" RAM speeds on the packaging, but that is only to ease less informed customers' compatibility concerns.

Streetshitter-made webbloat is only going to get worse, but sure as fuck it won't become multithreaded. Throwing more cores at it won't help you, and individual cores in the FX-63xx ain't any faster than in the quad-core models.
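Amdahl's law spells it out; the 10% parallel fraction below is purely an illustrative guess for browser JS, not a measurement:

def speedup(parallel_fraction, cores):
    # Amdahl's law: the serial part doesn't shrink no matter how many cores you add
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 6, 8):
    print(cores, "cores:", round(speedup(0.10, cores), 2), "x speedup")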

kek
Storage will be much cheaper in 2 years. You can buy a second drive then. Or buy both an SSD for system and apps and an HDD for bulk storage now. Or splurge $130 for a 480GB model and be done with it for a few extra years.
Seriously, the system as a whole feels much snappier with an SSD. In typical office/web usage it will affect perceived speed much more than your CPU choice.

There were no "updated versions". The development of this line was abandoned right as it hit the market, as it was clear that it couldn't be in any way competitive with Intel's offerings in this market segment. All the refreshes since that time are just new binning tiers taking advantage of the maturing manufacturing process. They are binned from the same production line as the older models.

I checked the specs and both CPUs in question (FX-6300 and A8-6600K) have 1866 in their data sheet. Will they go well with higher RAM frequency as well? AMD probably rated it lower to be on the safe side.

Then I read that G.Skill has the best memory (high freq, low CL), but there are reports of mainboards not being able to run them at max speed. Do you have experience with those? Ripjaws and Sniper are the ones I found.

I understand but having a little extra power might come handy because operating systems and programs might be more power hungry as well in the future. I also think about stocking some replacement parts.

So far I have 2 builds on paper and I will compare them when I make the final decision. The FX version has a little more power and room for additional cards or a new gpu, while the A8 version will be a little cheaper, more compact and the parts will be tuned perfectly to each other. The price difference is only about 130$.


thank you very much for your help.

buy both so you can have a backup

Why is AMD better than Intel?

I'll wait..

Because they're not as massive faggots and they suck much less cock.

that pic being posted in 3....2....1

I won't post unrelated pics in this thread. Get banned for it yourself if you want :).

AMD is a slightly less shitty company than intel aviv or invidia. The reason is that AMD has very good open source linux drivers for Radeon graphics cards and they help the developers with developing them. Nvidia cards have little to no linux support, and by the time a good driver becomes available, the card has already aged a lot.

AMD was not always bad. Back in 2011 they cooperated with coreboot, but in 2014 they stopped and claimed it was wasted time in their eyes. Today they are pushing the Windows 10 meme by claiming future cpus will only be compatible with W10 for full power.
I wonder how they did a 180 in only 3 years.
The other good thing is that AMD usually sells their products with less greed. They are known for a good bang for buck ratio and hopefully they still will be when Ryzen comes out.
Even if you don't buy their products you profit, because competition keeps the markets from going full kike.

My greatest hope is that some day we will have open source BIOS motherboards and cpus completely without botnet.

And forgot to mention. We kiss AMD ass so much because their FX processors are some of the last cpus that come without those typical intel botnet "features". As they went full NSA as well now, our hopes are in PowerPC, ARM and RISC although it does not look good there either.

I'm sure Ryzen won't be as forgiving as Kaby is.

...officially.
All the FX chips and the APUs ending with "K" after the number are unlocked. This not only means the main CPU clock can be freely overclocked, but also the memory controller. In practice both typically handle up to 2400MHz no problem. At worst you might need to bump the memory voltage a bit to keep it stable. Unlike core overclocking, bumping memory clocks won't significantly increase the power draw.
Pretty much. In fact, the highest APU model in the series (A10-6800K) is officially rated for DDR3-2133. This is the same chip from the same production line. It's just binned to tighter standards.

Get whatever brand you can get the cheapest. There's really not much difference between different manufacturers, except bling maybe (if that's your thing). Very high freq RAM is in general very finicky, and there's a risk sticks from manufacturer A won't work at rated speeds, but from manufacturer B will. Then with a different CPU the sticks from A work, and the ones from B don't. Up to 2133MHz that shouldn't be a problem, however.
Also, don't go overboard with memory speed. As I said before, there's no point going over 1600MHz for CPU-only chips. Higher data rates significantly benefit only the integrated GPU, and even then over 2133MHz you'll hit diminishing returns (and skyrocketing prices). CL is pretty insignificant in real-life workloads. Your CPU may bench a fraction of a percent faster with lower latency memory, maybe a few percent faster in chosen benchmarks. There is no point in wasting money on memory that is faster than necessary. Obviously though, if you've got a great deal on faster memory for barely a buck or two more, go for it.

If you decide to go with the A8, do get that Chieftec ITX case I shilled earlier. It's cool as fuck to squeeze so much power into a teeny case like that and your build just screams for it :).

are those APU models good for someone who browses a lot + multitasks (vids)? I'm just looking for something cheap for my uncle

Definitely. In fact, they have the best bang for buck in their price range. You might opt for a newer generation model if your uncle doesn't care about botnets though. Which is exactly offtopic for this thread. SAGE

he is extremely paranoid about that though

Ah OK then. Have an apology bump.

Do you have the screencap proving this? I believe you I just need it for archiving.

see


basically AMD put out the FX series, which was primitive technology with poor power efficiency, fewer operations per clock and less performance than intel CPUs. They soon realized that they could not improve those cpus and abandoned all development to focus on Zen. They are selling the FX line as the option for budget gaming and good bang for buck builds.

If you need sources then check out the libreboot page. There is a section about AMD and they claim any CPU released after 2013 is affected. FX models were released in fall 2012. The botnet comes with the new Zen cpus and the newer A-series chips with the 7xxx numbering.

Eh, I wouldn't go that far describing it. It was an ambitious speed-demon type design that simply failed to reach its design goals.

The basic idea of what CPU designers call a speed-demon type core is to shorten each pipeline stage as much as possible to drastically raise achievable clock rates at the expense of IPC. The usual failure mode of a speed-demon design is to succeed in fucking up IPC while failing to raise clocks high enough, and this is exactly what happened here (and in the earlier Intel's stab at this kind of design, the Netburst core).

In fact, the Bulldozer is only a half-failure. By the second iteration it managed to reach IPC parity with previous AMD's K10 cores while significantly lifting up clocks - though not nearly as much as AMD expected to be able to. If it had reached 6+ GHz at launch as it was planned to, it would have competed on par with Intel in single thread performance and beat the shit out of them in multithreaded.
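A toy performance = IPC x clock comparison makes the point; the IPC numbers here are made-up relative figures just to illustrate the tradeoff, not measurements:

def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

print(relative_perf(1.00, 3.7))  # ~3.7: K10-class core at K10-class clocks (baseline)
print(relative_perf(0.85, 4.0))  # ~3.4: lower-IPC speed demon that barely clocked higher
print(relative_perf(0.85, 6.0))  # ~5.1: the same core at the clocks it was designed for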

They only abandoned the high-end >quad-core platform because they knew they weren't going to become competitive in this market segment in the next few years. However they continued to develop the Bulldozer cores for use in their APU line, where they were good enough. The following iterations significantly improved power efficiency, to the point that AMD is now able to offer decently clocked quad-core APUs in a 15W thermal envelope. That's fucking something!

Not sure. What's wrong with the FX-8370, the 8380E and the 9000 series (with high-end cooling)? They're part of the Family 15h systems, so there should be no PSP, right?

you need liquid nitrogen for that shit
it has a 220W TDP stock.

Nothing at all. They're all just different bins from the same production line.

Quit dramatizing. A good air cooler can deal with up to 300W easily.

Pretty much.

Bulldozer was a massive fuckup, though the latest iteration, Excavator, shows some significant progress and demonstrates where AMD was trying to go with it.

AMD saw their hex-core Phenom II CPUs handling 3.4GHz+ on 45nm with no issues. They weren't overheating or consuming unreasonable amounts of power. They figured that lengthening the pipeline a little, combined with a process switch to 32nm, would result in 4.5GHz+ stock-clocked octo-cores. The reality is that a new architecture combined with a process switch is always very risky. They gambled and they lost. This is why Intel always introduces a new architecture on a mature process: fewer things that can go wrong.

AMD wasn't being unreasonable when Bulldozer was on the drawing board. The "speed demon" approach combined with the "dual-core module" design was pretty much the only way to put a cost-effective octo-core on the market. It's just that they put way too much faith in GlobalFoundries' 32nm process. This was the first time they didn't have in-house fabs they could fine-tune for their processors. They also vastly overestimated how badly "moar cores" were needed.

it's standard for EE undergrads to develop computers. A skilled EE can have a custom comp made in a few days. The cost of having it made ranges from reasonably cheap if they are wiring up various ICs/pieces, to retardedly expensive if they want custom silicon (since the expensive part of that is the master, which is used to make one or a thousand, doesn't matter)

I'm gonna switch to a motherboard that simply does not support the botnet processor. Easy as that.

Btw == everyone should read this totally unbiased blog post: blogs.intel.com/evangelists/2015/02/20/tricky-world-securing-firmware/ ==

...

not the main guy here but I bought an FX 6350, I just don't know what gpu to pair it with
I saw some vids where they said the rx 480 has a severe bottleneck on that cpu, but from all the reviews the 480 seems best for my needs since it's only about 20 pricier than the 470
I don't think it's a good idea to wait for VEGA if it's supposed to be 1070/1080 tier, right?

Pretty much. Get that 480, get a solid cooler for your CPU and overclock the shit out of it. It should get you by at ~5GHz.

No such things on newer platforms. Every Intel chipset now in production contains the botnet. AMD is quickly closing on that too.

Also, that blog post gave me cancer.

You'd better hurry up and get an AM3+ board and AMD FX CPU before they go out of production.

Aren't they already out of production?

You cannot develop everything by yourself in short time. This is why companies employ hundreds of developers and each one of them does his part.


I even want to stock spare parts. Statistically motherboards tend to die faster. Should I get 3 replacement boards and 2 replacement cpus?
Also, is there any proof that FX series are botnet free? I read some claims that not all of the FX series were clean.

but you will test them all right?

probably

good

AMD predicted that revenue would drop by 11 percent (plus or minus 3 percent) in the current quarter, however, indicating that AMD won’t have as much to offer before Ryzen’s launch. That’s because AMD wanted to clear out its channel inventory, Su explained, before Ryzen officially ships.

anyone who wants an FX, you have a month left

do you have proof for your theory?
The thing is, your claim contradicts what is noted here:

It was an interview with CEO Su.
pcworld.com/article/3163500/components/amd-confirms-its-ryzen-cpu-will-launch-in-early-march-followed-by-the-vega-gpu.html

Does my i7 920 c0 have the Intel Management Engine?

Introduced in June 2006 in Intel's 965 Express Chipset Family of (Graphics and) Memory Controller Hubs, or (G)MCHs, and the ICH8 I/O Controller Family, the Intel Management Engine (ME) is a separate computing environment physically located in the (G)MCH chip. In Q3 2009, the first generation of Intel Core i3/i5/i7 (Nehalem) CPUs and the 5 Series Chipset family of Platform Controller Hubs, or PCHs, brought a more tightly integrated ME (now at version 6.0) inside the PCH chip, which itself replaced the ICH. Thus, the ME is present on all Intel desktop, mobile (laptop), and server systems since mid 2006.
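If you're running Linux you can at least check whether the ME's host interface (the MEI, formerly HECI, device) is visible to the OS. A rough sketch below - note that not seeing the device does NOT mean the ME is gone, only that its interface isn't exposed; it assumes lspci is installed and the mainline mei_me driver naming:

import glob
import subprocess

# The ME talks to the host OS through the MEI (formerly HECI) PCI device.
# If the mei_me driver has bound to it, /dev/mei* character devices exist.
print("MEI device nodes:", glob.glob("/dev/mei*") or "none visible")

# lspci lists the controller itself, usually a "Communication controller"
# with "MEI" or "HECI" somewhere in the name.
lspci = subprocess.run(["lspci"], capture_output=True, text=True).stdout
for line in lspci.splitlines():
    if "MEI" in line or "HECI" in line:
        print(line)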

We will find out if she was only bullshitting. I really hope they will continue the FX series for a few years. Maybe it is time to buy replacement cpus and AM3+ motherboards for the time when you have to replace broken parts

I have an ASRock mobo that:
-has an AM3+ socket
-DDR3
-SATA2
-IDE

but it DOESN'T have UEFI, it has a BIOS


what's the point of putting the system on an SSD? waste of money

can you give the model name? not like it matters for me since all the crates for the pc are already at home and paid for

Why not string a bunch of pre-2013 boards into a nigger-rigged beowulf cluster?

...

I had to use web archive since current asrock website is unusable

web.archive.org/web/20130504231619/http://www.asrock.com/mb/index.asp

web.archive.org/web/20130524031913/http://www.asrock.com/mb/AMD/985GM-GS3 FX/?cat=Specifications
web.archive.org/web/20130526060610/http://www.asrock.com/mb/AMD/880GM-LE FX/?cat=Specifications
web.archive.org/web/20130526122206/http://www.asrock.com/mb/AMD/970 Pro2/?cat=Specifications
web.archive.org/web/20130504135838/http://www.asrock.com/mb/AMD/960GMU3S3 FX/?cat=Specifications
web.archive.org/web/20130504125741/http://www.asrock.com/mb/AMD/960GM-VGS3 FX/?cat=Specifications
web.archive.org/web/20130504150132/http://www.asrock.com/mb/AMD/960GM-GS3 FX/?cat=Specifications

As you see, you can even have SATA3 mobo without UEFI.

I got an answer. It seems like there was some misunderstanding, or the reporter was bullshitting. This happened with my sister once when a newspaper interviewed her: they printed something, but my sister said she did not say it like that in the interview. Or maybe this CEO has no idea what is going on in the company - it happens very often that the high cadre have no idea about how things are done in the blue-collar field.

However, I believe the shortage concerns are not justified and I expect FX series to be available for at least another year and a half. Maybe the price will go down a little after ZEN and VEGA hit the markets.

why not crysis your mom you nigger

well I already bought all the parts, but when the prices drop I will pick up a non-UEFI board and maybe an fx 4200 for reserve

yes I will buy my parts soon as well. Then I will test everything for half a year and see if prices go down. Goal would be to have replacement parts for AM3+ mainboard and cpu.

seems so
btw just noticed a local vendor added the 8370 with the Wraith cooler to its listing

Newegg has a sale for the a4-6300 for $29. (ending in 4 days)

It was released in July 2013 according to wikipedia. I'm guessing it's botnet free but from my short research it's literally core2duo (think e8600) tier. But dam $29 man.

there are some rumors going around that AMD is working for Intel on some Kaby Lake cpu,
like wtf!

yeah, test your motherboards well if you buy more than one. I'm returning mine since it has a faulty socket and the cpu won't sit properly in it...

maybe I'm just an idiot who can't build a pc worth a damn, but this doesn't look right.

What if I told you that almost all Intel 64-bit CPUs have been based on AMD tech for 17 years!

This

Intel was literally unable to make 32-bit compatible 64-bit CPUs so they had to buy all the technology and know-how from AMD who did it first

Consumers wouldn't appreciate a sudden change to a 64-bit-only environment, since it would invalidate 10 years' worth of Windows software that would need to be replaced with brand new software

Seems like most of them have USB 2.0 headers. Are there UEFI-free boards with USB3 headers?

pretty much all the recent boards come with UEFI. What would be the benefit of BIOS? I doubt you can just flash coreboot onto normal BIOS motherboards.

Just got it
Is there anything I should do in bios or in settings?

MUD BLOOD FOR ZIKA CHAN


No ugly 1337 g4m3r graphical cancer like second pic, for starters. I don't think it's inherently botnet, just shit.


I can't find any either, I guess I'll just have to grin and bear it. How's the Asus M5A97? "Remote GO!" makes me nervous, but they claim it only works on Windows.

yeah, uefi has some security issues, though not botnet ones from what I read

How did you do this? I installed a Cryorig H7 on my FX 6300 and I get 38 degr. centigrade at idle when browsing the bios. It is not bad, but I hoped it would go below 30 degr.

Should I tighten the screws or apply more / less thermal paste?

must have been a bad reading that time, it usually hovers around 32
I have the Wraith cooler, which I botched a bit during installation (my first build ever), with stock paste and the silent profile in BIOS
Don't get your temps though, since the Cryorig is muuuch better and the paste Cryorig includes is top tier
Maybe it's the case? I have a Fractal R5

It is known that (at least the motherboard's) sensors are unreliable. AMD's GPU software is supposedly able to track it more accurately.
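If you want something better than staring at the BIOS screen, you can log what the OS-side sensors report while the box is actually doing work. A minimal python sketch using psutil (assumes Linux with the usual hwmon drivers like k10temp loaded; the same caveat about sensor accuracy still applies):

import time
import psutil  # pip install psutil

# Print whatever hwmon temperature sensors the kernel exposes, once a second.
# Chip names (e.g. k10temp for the CPU, the Super I/O chip for the board) depend on loaded drivers.
for _ in range(10):
    for chip, readings in psutil.sensors_temperatures().items():
        for r in readings:
            print("%s %s: %.1f C" % (chip, r.label or "temp", r.current))
    time.sleep(1)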

How was the installation of that thing? I read they have some new patent pending for it

I have the bequiet Pure Base 500 and no matter what fan speed setting I choose (fastest or most quiet) it always hovers around 37 degr.
However I must also say that the H7 is so much quieter compared to my old intel stock cooler on my other machine. I also read in a review that the largest difference between aftermarket and stock coolers is in the high power range, where temperatures stay below 60 degrees compared to nearly 80 degrees stock.


Pretty easy, but it felt very fragile to me and I was worried about breaking it (it is sturdy though). First you put 4 screws through the backplate and put the screws through your mainboard. Then you put some spacers on the screws on the other side. Then you apply thermal paste to your cpu. Then you position the X on the screws and start screwing until it doesn't move around anymore. I probably applied too much thermal paste because I can see a little of it at the edges.

honestly I have no idea

but I might have another question: I will be swapping out the stock paste for the Noctua one. Is it okay to do it with the mobo in the case? I really don't want to redo the cables and stuff

It would work with the H7 if your case has enough free space behind the motherboard (no screw holes covered by the case's metal). You would just have to keep your computer on a desk when you screw the cooler down, because you need to use a screwdriver from below to tighten it.

I meant the paste, but I will replace the cooler later too. I'm a bit disappointed in the Wraith.

I built my brother a botnetted i7 machine two years ago. He ended up upgrading to another machine because he's a consumer whore and sent it back my way. I hadn't had a powerful computer in many years so I was using it for editing video and vidya. I got lazy and looked at some rare pepes on it, but nothing that really fully compromised my OPsec.

Anyway, the same thing happened to me one night. It had been turned off for a day or so; I was watching TV when I heard it booting up. I turned on the main monitor and it got all the way to the prompt to decrypt the main partition on the SSD. I sat there for 15 minutes waiting to see if something more would happen, then turned it off. I leave it plugged in still, and I think it did it one other time, but I'm not sure if it booted itself or an ex-girlfriend tried to access it. I figured if she did, she would have turned it back off. I unplugged all microphones from it and never ran a camera, so I figure I'm good. I know it leaks data and I really should get it out of the house, but I love being able to encode video on that monster, so it's staying until I replace it with something.

guys help how do I encrypt my shit like you do

I'm assuming you're serious. Check the stickies in this board. Here is a good link to get you started

shodan.me/books/Cryptography/

Also look around for guides on OPsec. It is discussed here every day.

Performance. The computer will feel much snappier.
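Most of that snappiness is random-access latency rather than sequential speed. A crude python sketch you can point at a big file on each drive (crude because the page cache will skew it unless you drop caches first; the paths are placeholders):

import os
import random
import time

def avg_random_read_ms(path, reads=200, block=4096):
    # Average latency of 4K reads at random offsets in the given file.
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)
    start = time.time()
    for _ in range(reads):
        os.lseek(fd, random.randrange(0, max(1, size - block)), os.SEEK_SET)
        os.read(fd, block)
    os.close(fd)
    return (time.time() - start) / reads * 1000

# placeholder paths - point them at a big file on each drive
print(avg_random_read_ms("/mnt/hdd/bigfile.bin"), "ms avg on the HDD")
print(avg_random_read_ms("/mnt/ssd/bigfile.bin"), "ms avg on the SSD")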

Would VIA CPUs be an option?

I know nothing about any of this that's why I'm asking.

isn't that thing like ancient?

For those, the CIA just has to ask.

In addition to being ancient, it probably doesn't have an x64 version. Also, power efficiency is likely to be way worse than even AMD's.

Actually, they had x86-64 pretty early, and the Isaiah architecture was pretty spiffy back in its day. It would probably still be decent today if they shrunk it to 14nm. They were very power-frugal too.

But yes, that shit is pretty ancient today. Shame VIA didn't continue to develop that CPU line. I heard the reason was Intel jewing them out of western markets with an army of patent lawyer kikes. We need to gas "intellectual property" laws already, ffs.

user, can you post your final build, please? I am in the market for something similar, thanks.

When you install a linux distro, like trisquel, you literally just tick the checkbox that says "encrypt my hard drive"
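For a drive that isn't the install target (the installer checkbox only covers the OS disk), the manual LUKS steps are short. A rough sketch of the sequence as a small python wrapper around cryptsetup (needs root; /dev/sdX, the mapper name and the mountpoint are placeholders, and luksFormat wipes whatever is on that device):

import subprocess

DEV = "/dev/sdX"          # placeholder: the drive to encrypt (gets WIPED)
NAME = "cryptdata"        # arbitrary mapper name
MOUNTPOINT = "/mnt/data"  # placeholder: must already exist

def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run("cryptsetup", "luksFormat", DEV)       # asks for confirmation and a passphrase
run("cryptsetup", "open", DEV, NAME)       # asks for the passphrase again
run("mkfs.ext4", "/dev/mapper/" + NAME)    # filesystem inside the container
run("mount", "/dev/mapper/" + NAME, MOUNTPOINT)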

what's with the retards breaking links?

It's done that way intentionally. I'm not going to tell you why, because even a little retard should be able to think for himself and not be a mouth-breathing contribution to the heat death of the universe

what seems to be the problem for you in building a pc?

Just wanted to use user's build as a starting point

found this in consumer general
also as mentioned before he added cryorig h7 to it for cpu cooling

oops

Is javascript really
that bad
?

At first, Javascript was just a way to make the web move: animations, popups, etc. Then it became the replacement for F5. Then infinite scroll happened. Then some more interactivity happened. And then... css2 and css3 animations came, browser caching evolved, html5 came with a video player and websockets.

Today there is 0 reason to use javascript on any website which does not need mouse and keyboard tracking.

I see 2 proper uses for it:
- browser-based games
- keyloggers/mouse recorders (see a demo on hotjar.com)

this x200 is the last pc i will own, most likely

None of you faggots have the slightest idea of how technology works and have completely destroyed the mere idea of a technology board.

wew lads, my G5 is still chuffing like butter

So javascript is used for keylogging, thank you.

I had a similar experience.
I was lying on my bed reading a book. My pc was plugged in and powered but turned off. Then it suddenly turned itself on. I was curious and turned on the monitor to see what would happen. It only went to the LUKS screen where you have to input the password to decrypt the luks partition. No entries were made and I waited 10 minutes to see if there would be any.

Really got me thinking...

The more I learn about software the more hyper paranoid I get about it. I have a windows partition I occasionally use for Normie tier shit and every time I run a proprietary program I just think "This thing could be a RAT and I don't have the traffic analysis knowledge to find out."

I mean windows itself is a botnet that uses other machines to assist with distributed updates. They can also make controlled changes with or without your go ahead. Then you look at what the nsa can do and how all these companies are buddies with them... How do I know the proprietary firmware in my hdd or cpu isn't doing something sketchy?
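You don't need deep traffic analysis skills to at least see which processes have sockets open to the outside. A minimal python sketch with psutil (it only catches live connections, so anything that phones home in short bursts, or below the OS like firmware, will slip right past it):

import psutil  # pip install psutil

# List established TCP connections and which process owns each one.
for conn in psutil.net_connections(kind="tcp"):
    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
        try:
            name = psutil.Process(conn.pid).name() if conn.pid else "?"
        except psutil.NoSuchProcess:
            name = "?"
        print("%s -> %s:%s" % (name, conn.raddr.ip, conn.raddr.port))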

MUD BLOOD FOR ZIKA CHAN!

Not even true dude, Javascript was originally created to make it easier for web developers to code applets into the browser. Before then it was pretty much all Java, and we all know how shit java is.

Now the point of Javascript is to make it possible and easier for a web developer to make the website dynamic, and while not always required today because of what you can do with CSS, it is still a lot easier and faster for most web developers to implement. Doesn't always make it 'correct', but it is what it is.

Try running most SaaS without javascript, and they will break horribly. Unless it's goygle.