Something better

The issue: Modern PCs have become overly complex and undependable. Botnet hardware prevents users from understanding or controlling their machines. Microcode, buggy BIOS, firmware. Backdoors at the hardware level. Layered on top is the most complex set of software, whether you are running Windows, Mac, GNU/Linux, or a BSD. Security holes, endless bugs, and unexpected behavior. A never-ending onslaught of updates, fixes, and "improvements." Proprietary OSes cannot be trusted. The various open source options are an unorganized mess. A large, uncoordinated community of so-called developers relentlessly hacking together haphazard code that is always breaking. The casual, community-driven development model was novel at the time, but now it's unwieldy and dangerous.

Every year software becomes larger and requires more resources. Manufacturers are always coming out with newer hardware, and the average consumer is elated to scrap their thousand dollar device for a more shoddily manufactured and expensive one. None of these machines are built to last.

Think of all the money and time that is spent developing and manufacturing modern computers. Think of the cost that consumers pay to acquire and maintain these devices. And now consider that these items are treated as disposable. It might be forgivable if modern computers fulfilled their purpose, but instead they are riddled with bugs, defects, and poor craftsmanship.

Booting takes an unpredictable amount of time. There are always software updates delaying you. Unwanted reboots that take forever. Overly complex user interfaces with countless dials and toggles for the most useless settings. Software updates routinely introduce regressions and break user-expected behavior. A million things to get in the way of actually using the damn thing.

A solution I would like to see: A simple, bare-bones, future-proof, portable computer. Designed to mainly run in text-mode, but graphics would be possible. Physically robust construction with components designed to last. Extremely low-price.

Low-power, simple CPU. Probably an 8-bit Z80, but maybe something 16-bit. I know very little about CPUs, but the idea would be to have a CPU that is affordable, ubiquitous, and not backdoored or otherwise vulnerable to attack. (Think about microcode infection, firmware exploits, and the recent Ubuntu BIOS corruption debacle.)

Bare-bones BIOS/OS in ROM. Would be able to access and execute code stored on battery-backed RAM, internal flash storage chip, or stored on USB flash. Would include a simple text editor, BASIC, assembler, and small and simple C compiler. Some small utilities for file management, etc. Not likely to have any multitasking.

Standard keyboard. No chiclet shit. Highest quality. Keyboards are how we interact with a computer, and anything subpar is unacceptable.

Powered by standard AA batteries with optional AC adaptor. Proprietary batteries = planned obsolescence. The simple CPU and components would allow the machine to be powered by 4 AA batteries for many hours of continuous use. Rechargeable AA batteries would work.
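Back-of-the-envelope math supports this. The numbers below are rough assumptions, not measurements: a ~2500 mAh alkaline AA and a ~100 mA total draw for a Z80-class machine with a reflective LCD.

```c
/* Rough runtime estimate for an AA-powered machine.
   Cells in series add voltage, not capacity, so 4 AAs in
   series still hold ~2500 mAh at the combined voltage. */
double runtime_hours(double capacity_mah, double draw_ma)
{
    return capacity_mah / draw_ma;
}
/* runtime_hours(2500.0, 100.0) -> 25 hours of continuous use */
```

Halve the draw and you double the runtime, which is why a simple CPU and a non-backlit screen matter so much here.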

Non-backlit reflective LCD screen. Backlights are unnecessary. They waste battery and are just another component that will fail.

Instant boot and shutdown. A dependable machine that is simple and easy to use and maintain.

Perfect for writing code. All needed tools built right into the machine. No nonsense to distract you. Store countless text documents in RAM or internal flash. Perfect for archiving and accessing your data.

Would work very well as a word processor. Power on and just start typing.

Does anyone else feel this way? Does anyone else want such a device? Does it exist? Will it ever?

yeah but how would i use facebook and watch youtube? sounds like a worthless computer and no one will buy it.

There's your problem: that doesn't exist. x86 is not low power and is a botnet. ARM is a lower-power botnet. POWER is not affordable. MIPS is not simple and might be backdoored. RISC-V fits the requirements but is somewhat expensive and is vulnerable to hardware attack as all the others are.
If someone wanted to build a device like this they couldn't use the standard GNU libc implementation; they would have to port their own lightweight libc, designed for simple single-core, non-parallel code without the complexities of modern implementations.
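As a sketch of what such a port might start from (these are my own minimal stand-ins, not actual GNU libc code), the core string/memory routines need nothing from a modern multi-threaded libc:

```c
#include <stddef.h>

/* Freestanding stand-ins for two libc routines: no locales,
   no threads, no allocator. Suitable for a single-core
   8/16-bit target. */
size_t my_strlen(const char *s)
{
    const char *p = s;
    while (*p)
        p++;
    return (size_t)(p - s);
}

void *my_memcpy(void *dst, const void *src, size_t n)
{
    unsigned char *d = dst;
    const unsigned char *sp = src;
    while (n--)
        *d++ = *sp++;
    return dst;
}
```

A whole usable subset (string, a bump allocator, putchar-based stdio) can be built in this style without touching anything core-count dependent.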
Yes. Phones of today get closer to a mobile terminal but are botnets.
No. Closest to it you could get is a samsung galaxy 2 with replicant but it is complex to work with because android and a modem. Or if you used very old hardware, but that isn't low powered.
Not if (((they))) have a say in such. Why bother making this topic? Are you actually going to try something like this and not expect to be (((suicided))) by (((them)))? I hope you succeed.

Sounds a lot like a Model T (M102)

It sounds like you want something like this:
Probably need to avoid USB, it's complicated and potentially dangerous. Maybe CF or SD card would work, I don't know what kind of firmware is needed on those.

Btw the Z80 is pretty flexible. Amstrad released some computers with that chip clocked at 16 MHz in 1995, but of course it didn't do well. Even the Amiga was already dead by then. But the Z80 is allegedly still in production for embedded use. Otherwise you have the AVR stuff as an option.

Harvard architecture.

You forgot 'leaving the computer unattended every night when you go to sleep'.

That's pretty close to OP's request, but it ain't giving you a desktop environment like OP wants.
It's also very under-powered for the energy usage. You aren't going to be compiling C code on that thing quickly for large projects.

I wouldn't go back to using a 16-bit CPU let alone 8-bit. These are nostalgia or hipster feelings you are having. This really is the best timeline because things must change instead of doing the same thing decade after decade. In 20 years the tech will be radically different compared to now and 1997. Fucking count on it.

Good thread, archive'd because impossible dreams must be remembered. Raspberry Pi not good enough? Not judging.

OP I have other specs in my mind for this impossible computer that nobody will build:

- toughbook laptop frame, can be dropped on hard floor and not break
- dual-core 64-bit CPU (low clock speed but large registers for math ops)
- barely any cooling required due to low clock speeds
- improved LCD screen from the future! cheap and last forever
- improved solid state drive from the future!
- powered by cable but also standard batteries (9-volt, C, AA? I don't know, I'm no physicist) but certainly not proprietary stuff that is irreplaceable when it burns out
- requirement: real-time operating system that is read-only, almost instant boot (user installed progs and games are not considered part of the OS)

This wouldn't be dirt cheap, but it would be strong enough to be worth buying.

Ya, there are modern versions of the Z80 and Z180.

The advantage of going with the Z80 line is you instantly have 2 decades' worth of software.

This is true, especially with Ryzen and ARM/smartphones. I think we're going to see some awesome stuff.

Intel had quad core chips as their consumer flagship. I give it a few years and we'll see eight core chips as their flagship.

AMD is bringing some interesting changes to the table with Ryzen, and Nvidia's technology (Volta) is going to change things too. I'm not sure what AMD is going to do with their GPUs, but it will be interesting to see.

Intel might be making discrete GPUs, too. I sure hope so on that one.

I'll make the logo.

Okay. Then they just do it.

Somewhat. Very poor screen on those. I imagine a full-size screen in a standard laptop-style case. Or at least something with much more vertical space.

Good thinking.

Btw the Z80 is pretty flexible
Yes, I think so. Besides early computers, it was used in the TI-83 and other calculators and in Gameboy systems.

Let's not get ridiculous. There's plenty of ways to easily avoid that scenario.

I want no such thing. What I meant was that graphics would be possible. Perhaps certain software would render graphics, like games. There's no point in spending time and resources on a windowing system. Time would be better spent implementing ncurses for such a machine.
We should avoid large projects anyways.

The Pi is, perhaps, a step in the right direction. But it is probably botnet. I think it requires closed source firmware to boot and/or run. It also has no CMOS or clock, which would be nice to have. Where is the standard laptop case with a mechanical keyboard and amber monochrome display that will accept any and all SBCs as a motherboard?
Glad that you feel similarly to me. I like your specs.

2 decades worth of software
Nice. Thanks for those pics and info.


Sorry for the fucked paste job there.

Well, in less lay terms, we have AI, quantum computers, graphene, new storage mediums, CRISPR, etc.

What happens in 4, 5 years when we're at the silicon limit and we can't shrink transistors any further?

Don't you want technology that you can actually control? Wouldn't you rather outright OWN your computer instead of, essentially, leasing it from the corporations that design and sell it? Your continued use of your computer is contingent on their approval. iPhones are throttled when they "get old" now. And there's nothing stopping Intel from slowing your CPU via ME when they decide it's time for you to upgrade. Linux won't save you where we are going. But you seem content with the current state of affairs. Enjoy your botnet.

There might actually be an end to the current bullshit after all. But see above. The hardware will be throttled remotely and stupid people will replace the "aging" hardware. There are people who actually believe that computers slow down with age because the electronic components degrade. Any actual degradation of components is so minuscule that it makes no difference in speed.

The more I think of this, the more it hits me: there is no solution. We have been sucked into a matrix of pure shit. The whole environment is nothing but shit. Because ultimately, your hardware has a backdoor for every device plus the one on your processor.
What will be your development environment? Even your precious """Free""" FOSS software is not to be trusted, rms achieved nothing.

What can we possibly do? Everything is compromised, I don't know of a single thing that isn't, and everybody keeps insisting on sticking to horrible shit like the Web, GNOME, GNU...
Technology is fucked, and while there's a lot to be done about it, there will always be one little thing to compromise it all. From hardware manufacturers, to the fact that writing an OS is no easy task...
We live in dark times, and it will only get worse.

Motherfucker, have you even seen half the shit Boston Dynamics has come up with? Machine learning is fucking huge and it's only going to get bigger. IBM and Intel, among others, are experimenting with neuromorphic CPUs. Robotic waifus will become a reality in your lifetime.

You're right, actually, I'm just a brainlet, just like the OP.

This guy is either a cuck or (((they))) are shilling this thread.

I sure want a computer like your description, OP. It kind of reminded me of the Canon Cat: it's Forth-based, TUI, simple and powerful.
I grew up with late-90s computers, so I don't know what it's like to type BASIC code from a magazine and be able to understand every aspect of the computer, but I love the premise.
My only doubt is: why BASIC? I won't shill for any other language, I can easily learn any other syntax, even Scheme is familiar to me now, but can you explain this choice? Is it just nostalgia?

Philosophically you are right, practically you are wrong, and I'll explain why philosophically (heh): modern electronics are magic, we do not know how they work because they are too complicated. Probably even for the engineers who (partly) designed them. This process of magic-ification cannot be undone without losing technological progress. And if you want to get even more scared: food is magic too unless you build your own farm and make your own food! Otherwise you don't know what colorings, flavorings and preservatives they put in it.

But we still have lawyers who can help us, to make sure that we consumers aren't fucked by the corporations.

Agreed, look at coreboot/libreboot. It won't be supporting new CPUs soon cause botnet, to that I'd say thank you very much for embracing defeat! But again we can go back to lawyers. Maybe they will be our last defense.

Dark times, yes. Only get worse, maybe yes maybe no. Forces lurk in the shadows. ReactOS is just waiting for Windows to go to shit. Haiku is just waiting for Linux to go to shit. There is competition that you don't see because you don't really care about it, despite all your pessimism.


I only proposed it because first, it's the traditional choice, and second, why not. I still proposed asm and C in addition.

Do you know what obsolescence is?

The BASIC on the 80s 8-bits didn't limit you to just BASIC. With PEEK and POKE you could manipulate any memory address you wanted. Interact directly with the hardware.
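In C terms, PEEK and POKE are just one-byte loads and stores. A sketch; the `mem` array here stands in for a 64K address space so the example is self-contained:

```c
#include <stdint.h>

/* mem[] stands in for a 64K address space. On an actual 8-bit
   machine you would dereference the address directly, e.g.
   *(volatile uint8_t *)0xF000, which is exactly what BASIC's
   PEEK/POKE compiled down to. */
static uint8_t mem[65536];

uint8_t peek(uint16_t addr)            { return mem[addr]; }
void    poke(uint16_t addr, uint8_t v) { mem[addr] = v; }
```

That directness is the point: memory-mapped video, sound, and I/O registers were all reachable from the built-in language with no driver layer in between.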

Z80 would definitely be the way to go if you're going to build something from scratch. Just search 'z80 homebrew'.

Also check out this graphical OS

There's only no solution if you insist on being part of the botnet and using its platforms beyond the bare minimum. A lot of people are addicted to modern games, facebook, and so on. So they want the big, nasty web browsers and the nasty Intel computers and iphones that can run those.
Otherwise you have a lot more choice of hardware and OS/software. You can even run Linux on the Z80 if you want to:
I wouldn't (simpler hardware is a nice opportunity to write my own OS), but it shows that the Z80 really is flexible. And there's also m68k:

Forth is another good option for small computers. You could even have it boot into a Forth REPL, like the Jupiter Ace.

If it has a screen, an input method, the speed of a Pentium 4, a disk to read everything from, and 256MB of fast RAM with swap space for everything else, then you could run a DE and a modern web browser on it like Palemoon/IceCat. That's if you ported a libc to the processor and used all FOSS components. Although 256MB is a large amount of RAM for a small computer like what OP wants. Along with the eZ80 having nowhere near enough addressable memory for such software. And images would load very slowly without a dedicated instruction set to act like a GPU. So probably not that.
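The addressable-memory point checks out: the eZ80's address bus is 24 bits wide, so even its extended (ADL) mode tops out at 2^24 bytes, 16 MiB, nowhere near 256 MB.

```c
/* Largest flat address space for a given address bus width:
   2^bits bytes. */
unsigned long addressable_bytes(unsigned bus_width_bits)
{
    return 1UL << bus_width_bits;
}
/* addressable_bytes(16) -> 65536      (classic Z80: 64 KiB)
   addressable_bytes(24) -> 16777216   (eZ80: 16 MiB)       */
```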

The eZ80 sounds perfect for a text word processor at medium power consumption on the go. Near useless for compiling or testing software, modern or otherwise. I remember hearing of a processor that could get the processing power of a Pentium 4 at extremely low energy usage. I forgot what its name was, though.

Yes, the RPi requires a binary blob, specifically the code that the GPU runs. The GPU is in some sense the "master" on the RPi SoC, so there is no way to disable it or otherwise avoid using this binary blob. There are other boards (e.g. BeagleBoard/BeagleBone) which can be run sans GPU with FOSS software. For the purpose of this thread, these boards should be enough.

I've played around with bare metal on the RPi and BBB. I would recommend playing around with bare metal programming on these boards (especially BBB which doesn't need a binary blob) for anyone interested in stripping away the complexity of modern computing. That said, bare metal programming on these kinds of SoC is still fairly complex.


AFAIK OP isn't concerned with the internal complexity of the CPU for its own sake, only because of how it influenced software (e.g. complexity of the BIOS).

For the programmer, yes, the timing of the execution of the code will be slightly unpredictable, but given OP's goals I don't see that as a major problem since OP isn't looking for maximum performance.

The bigger issue, at least for me, was the interaction of the CPU cores with other parts of the system. E.g. the RPi (and I think other systems) is really slow (like 10x slower) until you set up the MMU, which handles the memory caching. It's only a few instructions, iirc.

Then on RPi you have communication with peripherals which is really complicated. Not sure what it's like on the BBB.

The idea is just to get away from this modern bloated software, so that a Z80 is fast enough, and 64K or so of memory is sufficient. I don't want a desktop environment, libc, and the other junk. I mean, I hate using that stuff even on a modern laptop. There I spend most of my time in a small WM with an xterm or even just at the text console. But even a 4 MHz Z80 like in 80's computers can manage graphics and sound well enough to make games (at least the kind of games I like), provided the board has simple graphics/sound chips as were common on 80's computers like the Amstrad CPC and MSX. Don't need any libc or unix-related stuff at all, I'm fine with just BASIC, asm, and Forth. Quite frankly having Intel and Unix everywhere is getting really old and boring.

But yea, if you just want a text editor with no networking, compatibility with modern shit software, 3D graphics, and/or DE eye candy, then the eZ80 is perfect for you.
Did Terry really fit his whole OS in less than 2MB? Ugh, what a mess modern software is that I need 3-7GB of disk space and 256MB of RAM minimum for a working desktop with eye candy and multitasking. Where did everything go so wrong?

Fast, good, cheap: pick two. Terry isn't cheap, and companies go for the worst devs they can hire that still manage to compile something.

Linux isn't going to shit for a long time. Look at Unix in general: it's a 1969 operating system that's still active. It carries concepts no longer used, like the TTY, and wasn't designed with networking and graphical displays in mind because they didn't exist back then. There are better alternatives that carry what Unix did right and incorporate new technologies, and those are Plan 9 and Inferno. Check Inferno's docs: it's an OS that runs on top of a good virtual machine (unlike Java), doesn't need memory protection, is distributed, has garbage collection, and can run on small devices (there's even a port running on the Nintendo DS).
Why don't we use these awesome OSs today? There are a lot of reasons, but I can point out two: licenses and the costs of changing. Today both Plan 9 and Inferno are GPLv2, but they aren't compatible with POSIX software without hackery, so many don't adopt them because you'd have to pay to either develop a compatibility layer, making your software run worse, or develop your software from scratch.

Well the #1 thing that would fix security is to get rid of C code. Entirely. It is by far the single greatest cause on Earth of security flaws in technology and computing. And get rid of compatibility with it in C++ also.

< 1 down, 100 to go OP.

I can port C libraries; in fact I'm dev'ing my own 32-bit OS, bootloaders, and kernel right now. Shit's not difficult, just requires a lot of time and effort.

I'm happy to throw in with whoever on a project like this.

Modern operating systems are trying to be everything to everyone. That is why they're so large: they need to support every conceivable piece of software someone wants to run.

Computers like the VAX with 8-16 megs of RAM supported large numbers of users running the same basic text applications we run today.

A complete 4.3BSD distribution was a 25 meg tarball.
Today FreeBSD 11 is a 650 meg ISO without X11.

Unix sucking has nothing to do with its age. Unix was bad for a 1969 operating system too.

Haiku doesn't have LXQt, Xfce, KDE, Fluxbox, Awesome or any other useful DE/WM.

Haiku also doesn't have 90% of the programs I need, like Firefox/Chromium, Libreoffice, GIMP, Audacious, Audacity, LMMS, etc.

Explain what would be superior to Unix? What are its flaws? Don't "NT Kernel" me.

Haiku has the only sane desktop environment a desktop OS can have. No tabs/windows/taskbars jewry, everything is streamlined in one obvious magnetic UX.

I don't know if you know this, but Haiku does in fact use windows. There are even buttons to close or minimize the windows.

Haiku is fully POSIX compatible and had an X server since the BeOS days. All your shit would run on it. But why waste your time with X when you have the BeOS UI?


That's a really small part of it. Microsoft doesn't hire the worst developers they can to write Windows, and neither are the Linux core devs the worst. I have a friend whose company does hire from the "bottom 40%" as he puts it, but they produce some random corporate software not operating systems. Software bloat mostly happens because of backwards compatibility, the need to support a lot of random use cases, and the need for extra layers to catch errors.

Another reason clean abstractions fail is that however inherently logical your abstractions may be, users may still find it hard to grok them. Git is a great example of this. A lot of programmers I've talked to say they have a very fuzzy mental model of git even after using it for many years. I think the hardest thing to accept as a developer is that your way of thinking may be confusing to the user, and you might have to compromise on the purity of your API to account for this, or even change your own mental model and abstractions.

A related problem is that two developers rarely share exactly the same mental model, and so they often build incompatible abstractions. Each will tell you privately that they know they are doing this but the other developer is wrong. It's really hard to set aside your own ego and go with the mental model of the other person or team.

yea, and for double the price you get much faster 32 bit MCUs with bigger flash
shut up you hipster nostalgia fag

kys Z80 nostalgia fag
the rest of the world moved on, I wonder will your kind

Here you go Z80 fags:

compare the two and then shut up

This is really the best option for what OP is talking about.

Of course, you can't get very far in a thread like this without people (see above) saying, "No, I need a 32-bit CPU!" or "No, 64-bit" or "You know, we just need to port a libc and a 'simple' C compiler..." or "We can totally run Palememe guize." Logo Guy™ hasn't even come up with a logo for Idea Guy™'s idea, and user already wants to bloat the project up to a monstrosity that will enable him to watch Youtube videos in HD.

You can shove that chip up your butt if you like it so much you pompous faggot, btw nice strawman

Yeah, this guy gets it.
If you ever designed a system for a company, whatever the language, you may have had a situation like this:

And stuff like that happens all the time. You can create a system that's simple and beautiful, but if you keep submitting to people's requests for features, you'll end up with a giant, unmaintainable turd.
A related read:

Your asshurt is palpable even over TCP/IP. Your whining is a classic example of "stop liking what I don't like." This thread is about extremely simple computers, purposely limited in their abilities (including their ability to accumulate bloat). If that triggers you, go find a safe space somewhere else.


yea yea, you nostalgia fags are annoying but I am not butthurt that you limit yourselves with outdated overpriced gear

8 bit nigger faggots are more like retarded little cousins that like to play with trains than anything sincerely angering

The first half of this post is basically my entire view on computers. I couldn't agree with it more.

I disagree with your proposed solution however. What you're proposing is essentially just a very old computer, and it won't be able to do much of anything we expect modern PCs to do.

Why not just get an older CPU and put Debian testing on it or something with just the packages you need? I don't see why we'd need anything else.

RC2014 is an open source modular computer you can buy in kit form if you don't want to source everything yourself and build PCBs.
You could run CP/M, FUZIX, or whatever on it.

This guy has lots of expansion projects built around this computer.

Given that we know OSes are essentially userland code, and that Ring -3 exists; yes, we should be doing something about the absolute neck-deep river of shit and vomit we are wading through.

My money is on POWER9. Along with fully open and documented hardware. The graphics system will be a problem.

Beyond that we should reconsider what we use computers for. The first order of business is to kill javascript. Javascript was a mistake.

Totally agree. But 99% of internet discussions that are about wishing some aspect of the world is a better place are really, "I wish the world was a better place while I change absolutely nothing about what I do".
I'd like to see a return to text based systems; the vast majority of images and videos on the net don't really provide anything of value besides another avenue of distraction.

I thought that name sounded familiar. I played the fuck out of Land of Devastation as a little kid.

Computing isn't only about the net.
I use Darktable (which is free software btw) to process photos, for example, and this is quite important.

That's... beautiful.

what a silly thing.
even the cheapest fucking smartphone has more memory and CPU performance, while being smaller and eating less energy.
who in their right mind will buy it?

Not using MC68008

don't post images like this.

m68k is a fine choice, but it's more complicated than the Z80, so not as good for a beginner who wants to get his hands dirty. I was writing Z80 asm on my computer at 13 years old as a second language after BASIC; that's how easy it is. Actually scratch that, I was writing Z80 machine code, by typing the equivalent hex codes (not asm mnemonics), because I didn't have an assembler. So you could do the same exact thing to bootstrap a new computer that has no software whatsoever by toggling in a program from the hardware-level CPU monitor (assuming your design has this, since we're talking about non-shitty, non-modern computers).
Anyway, they're both good choices, and both were used extensively in 80's computers all the way from the ZX-81 to Sun and NeXT workstations. It just comes down to how much power you need, and how many complications you are willing to deal with. For me, I want the simplest possible computer that still does basic graphics and sound, kind of like MSX (video related). The nice thing about a modular system like this is you can try different approaches, whereas with a standard old computer like MSX or Amstrad you're mostly stuck with whatever was designed into it.
Anyway I would say do both projects, starting with the simpler Z80.

Well said. Something robust and with a small footprint like Rust is a must IMHO.

ITT all talk about wanting a text processor on the go and whatnot, focusing on the tools instead of the things you do with those tools. As useless as reinstalling Arch or Gentoo for the millionth time.
Focus instead on writing that book or that software and don't worry so much about sharpening your pencils.

not even a good troll


is the entire image rendered or is it just the face

Dont tell me what to do nigger



Stopped reading here. Your thinking as exemplified above is what made GNOME shitty.

Nice try. Gnome has tons of dials and toggles. Just because they are not visible to the user doesn't mean they aren't there.


@szeloof on Twitter.
He's your guy.

The TRS-80 model 100 runs for 20 hours on some AA batteries.

What made GNOME shitty is the choice of Python and Javascript as primary languages, not your autistic aversion to being productive.

yea, but compared to 18650 tho

But can you get that at any gas station at 3am in BFE?

I am ashamed fam


Use unprotected 18650 cells and a software-controlled ADC to do battery management.
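A sketch of the conversion side of that. Every constant here is an assumption for illustration: a 10-bit ADC, a 3.3 V reference, an external 2:1 voltage divider, and a 3.0 V per-cell discharge cutoff.

```c
#include <stdint.h>

#define ADC_MAX   1023u  /* 10-bit ADC full scale (assumed)       */
#define VREF_MV   3300u  /* ADC reference in millivolts (assumed) */
#define DIVIDER   2u     /* external 2:1 voltage divider (assumed)*/
#define CUTOFF_MV 3000u  /* stop discharging an 18650 below this  */

/* Convert a raw ADC reading back to the cell voltage in mV. */
uint32_t adc_to_cell_mv(uint16_t raw)
{
    return (uint32_t)raw * VREF_MV * DIVIDER / ADC_MAX;
}

/* Unprotected cells have no cutoff circuitry of their own, so
   the software must refuse to keep discharging a low cell. */
int cell_low(uint16_t raw)
{
    return adc_to_cell_mv(raw) < CUTOFF_MV;
}
```

The firmware would poll this on a timer and force a shutdown when `cell_low()` trips, since over-discharging unprotected lithium cells damages them.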

Feel free to disassemble them to reassure yourself that (((they))) haven't snuck in any secret ICs.

Make sure that the lithium cells are near any storage so that you can use them for secure destruction of data

I'm developing a kernel myself. Do you have a repo?


No shit, my old Amiga could process images. I remember being able to substitute backgrounds on photos with that level of technology.

I'm not suggesting that we eradicate image capabilities completely, but that I don't want an entire operating system that does everything (browser) on top of my operating system. The web should be able to fall back to text.

"no longer used"

Okay, I'll bite.

4k isn't bad. It just requires fast hardware to deal with it. You cannot do 4k on 16 bit machines. I appreciate today's tech's achievements.

I don't want to throw them into the trash, but to change them to become less botnet, to have more freedom and to make them less complex. New, faster hardware doesn't keep the suckless guys from providing us with good software (though surf needs something like umatrix).

We need to change the attitude of developers, not go back to the paleolithic age. This won't come easy, big corporations love nondisclosure agREEEments, hiding stuff and developing tech that is good for them, not the users.

Your unwillingness to give up on some incremental improvements gives them the power to take away your freedoms for their profits. For example, there exists no wlan ac chipset that works without proprietary firmware. So most, like you, choose to get the immediate benefit of higher speeds and congratulate themselves about how "pragmatic" they are.

The problem is not compromising or bad tech, it's the corporations. Addressing the symptoms won't help in any way.

You need to recruit normies, too, and normies like the speed, etc. You can make non-botnet hardware which is just as good as the current ones, with less complexity. Talos II isn't exactly for the average consumer, but it's much more promising than going back to the old ages. For example: we need to make (and produce, too) "open source" 2D acceleration GPUs which can be used to watch videos and browse the internet. They are good for most use cases. We need to build alternatives, not some shit that works only for autists who don't create art content.

Tech always goes forward: you cannot turn back the wheel of time, unless you destroy knowledge with wars. Isolating yourself by refusing all modern tech on the basis of "being botnet" won't help the cause. Remember, every single movie today includes CG for a reason (and it's not just the profit motive). Be more practical.

Thankfully I never used any wlan other than g. I prefer cable for high speed and wlan for basic internet access.

Your assumption that "we" want the same things is flawed. I don't care at all about 4K video or similar. Quite frankly, that kind of stuff is what has brought normies (and the media, politicians, etc.) into my nice hobby, where I used to be able to relax in peace. Believe me, if I could just wave my hands to cast a spell and erase the last 25 years of "progress" in the computer field, I totally would.
But I'm no magician, unfortunately, so the next best thing for *me* is to get the hell away from all this modern disaster and back into a more sane environment where the computer is simple enough that the user can understand every part of it, program it himself from the ground up, build his own hardware expansions, and write the drivers for them.
Alas that leaves one big problem. Today you basically need a Firefox/Chrome machine to be able to do basic tasks, because banks, insurance companies, and even the government insist on using Web 2.0 type disaster designs that require nasty modern hardware. If it weren't for that, I'd gladly just throw away this laptop into the trash where it belongs. But that's really the only part that I'm interested in rectifying. The other stuff like fancy graphics, sound, and so forth are things I absolutely do not want, because they complicate the kind of basic, simple, cozy computer designs I actually enjoy using.

what are those black boxes and how are they not backdoored?

This tbh. Only nostalgiafags and hipsterfags would want to turn back time (to the good old days...).

Economies of scale are ridiculous in the tech world, so at the least you will probably not get the same perf/price. Having POWER9 to build on is a stroke of luck. Otherwise I actually fully agree with you, though I did get myself some Atheros chips that also support n, iirc. The EOMA68 project is pretty cool and close to what you are describing; check it out if you haven't already.

Early integrated circuits, probably similar to video. And why would they be backdoored? First of all, there wasn't much room to squeeze in lots of unnecessary logic gates to allow for a backdoor that's most likely never going to be used. A waste of money that could instead be used to design a more competitive chip. The story is different today, as a backdoor would take up insignificant space and cost.
Secondly, a backdoor in an old, simple chip would be much easier to find, and all the reverse engineering of the past 45 years would have found it.
Thirdly, most computers weren't even networked in those days, so what would be the point? And even if the user had a modem, he likely only occasionally connected to a dialup BBS rather than staying permanently attached to the Internet (which itself wasn't monitored and logged as it is today).

Do you suppose old 68030, 68040 chips are back-doored? I'd imagine they would be pretty capable 32 bit CPUs with virtual memory support, and easy enough to write assembly for. I definitely think, with the right software, they could make fine daily drivers for shitposting, IRC and recreational programming, just not video or other heavy web 2.0 stuff.

I've seen the project and I think it's pretty close to ideal.

Back in 2010, I had a Pentium 4 machine with no GPU, integrated or dedicated. It ran only on the CPU, and it wasn't a good experience. Even moving the cursor produced significant load, and it was extremely slow when dragging windows, etc. That's when I became quite sure that GPUs exist for a good reason. Later, I could "upgrade" the machine with an Nvidia 7300GT (256 MB), with which I could seamlessly drive a full-HD monitor and a 1024x768 CRT (dual monitors). Even the slowest GPUs are more than enough for 2D, it seems, while CPUs aren't powerful enough even for a small resolution.

I'd worry more about not having a floating point unit than back doors.

Those are more suspicious, since they were used in many Unix workstations, so it would make more sense to backdoor them. But I don't think they have anything equivalent to Intel's negative CPU rings, so Intel ME-style craziness is right out. A backdoor would take on a different form entirely, and would probably be hard to implement across many OSes, since every Unix vendor had their own OS in those days (in addition to other completely different stuff like MacOS, AmigaOS, and Atari TOS).
Anyway, I think your biggest problem would be SSL, since lots of sites enforce it now. I don't remember how my 486 behaved with https websites, and frankly such sites were pretty rare in the mid '90s (as a frame of reference, rsh and plain telnet were still widely in use...). I guess you could just have an ARM SBC running a proxy that handles the heavy lifting, and connect your board to it via serial port (running SLIP for the routing).

In that case, get the non-cucked 68040; its FPU is even faster than an equivalent 486 (at the same clock speed).

You can always have a discrete FPU. The 68040 has one on-die.

The original 68040 is the only non-pozed version. The LC had no FPU, and the EC had neither an FPU nor an MMU. There was never an option for an on-die FPU on the 68030, though it did have an on-die MMU (except for the EC030 variant).

How hard is it to build accelerator boards out of similar era chips for things like crypto?

People used DSPs to do 3d a long time ago. I'll bet you could use one to speed up cryptography.

I don't know anything about the crypto maths, but I've seen dedicated FFT and DSP chips on equipment from ~ 1991, so it might be possible to do something but I wouldn't know where to start. Worst case I guess you just end up with another 68040 coprocessor, or two or three (if you can manage to parallelize the task).

Yes. The 68030 could use an external FPU. You often saw the socket in systems like the 68k Macs. The chips were the 68881 and 68882.

That backdoor would be very hard to implement because Motorola would have no idea where those chips would end up. They were used a lot in industrial and embedded applications that Motorola had no part in designing. The general-purpose computers like Apple/CBM/Atari/NEC/etc. all shared very little. They all ran completely different OSes. Different network implementations, if they even had one at all.

Thinking about Z80s and other 8/16-bit CPUs isn't hipster in all cases. It's realistically the limit of what an amateur computer hobbyist could put together with his own hands and mind. He could write his own ROM, bootstrap, or even his own OS on his own.
Very few people could pick up a PGA or bigger chip and build a working system around it from scratch.

What's Haiku? How is it better than Linux? I plan on setting up a Friend server: a cloud-based desktop service that can run Linux, Mac, and Windows apps simultaneously. What do anons think of Friend?

Haiku's an OS
It's based on old BeOS
Still in alpha though

After my first Z80 build I realised how hard it would be for a single guy to build a semi-modern computer.
On the hardware level, the bus width would be a big problem,
unless you can print your own circuits.

I was thinking about the Intel 8088, a 16-bit x86 CPU with an 8-bit bus.
It's basically the first modern PC CPU: it runs DOS, Windows, and most likely Linux too.

I may try to do it some day when I have free time.

A lot of work and purified autism, yes, "omg2hard", not really.

Minix 1 and 2 are an option for a 'modern OS'.
Even NetBSD had a port for a homebrew computer, the PC532, and I've had a go at porting NetBSD to a new board: system initialization, devices, interrupts -- and then the system is just about bootable.

Santa gave me a pair of 8284A Clock Generator ICs, and I have a plan for a homebrew board with 8088/8086 and >= 512kB of RAM, and then try to get Minix or ELKS going on it.


If you don't want backdoors, I honestly think Pentium 1 CPUs are safe; they're the last CPUs I know of that were made for modular platforms.

Everything you have said has been said before many years ago. In those days, the problem was assembly, and the solution was to design hardware for high-level languages like Fortran, Cobol, Algol, Basic, and Lisp.

The problem today, the real reason why software is not better, is the C programming language by Bell Labs. High-level languages (Fortran, etc.) were carefully designed to be able to run on a variety of hardware, but C is based on the PDP-11 model. The bad design of C makes it impossible to have an elegant, efficient OS and hardware design.

Are you frustrated?

Get an Athlon XP or a Pentium III if you want a non-pozed daily driver that still supports all modern file formats, protocols, and hardware, and is easily available at a low price for everybody everywhere. You can easily build an inexpensive system from that generation and run it on 100% libre software (drivers and firmware included). It'll run a modern browser, drive a 4k screen, offer 2D and 3D acceleration with support for Vulkan and OpenGL, edit high-resolution images, edit HD video (with some patience), and support modern SATA drives, 10GigE, USB3, and Thunderbolt via PCI cards. Yet it is still simple enough that you can make full use of its capabilities with nothing more than an assembler and a machine code monitor.

I've had an idea of picking up tons of unused Core 2 Duo-era embedded low-power processors from China, like the ones in X200 laptops, and building liberty-respecting NASes or routers with them.
Am I a completely bonkers LARPing daydreamer, nine years too late, or is it a viable concept? I'm an Ideas Guy btw™

Build just one and you are no longer an ideas guy, but have an actual product to sell. From there on it's basic business. If you can offer a good enough product at a good enough price for the mass market, and have some value added stuff for a niche market, then you are golden. If you have a niche product for a reasonable price, then it's up to how much you are willing to lose if things don't work out.

How about you name one fucking normie task that can’t be done on a late 68k chip that’s not due to bloated web interfaces for everything from email to online banking.

It can even do video, albeit at potato resolution, unless you have some exotic accelerator hardware like what Avid used to make.


Just rig up a fucking ARM SOC with a self made case, screen and make it run on regular batteries. It's not hard. Your retarded dream computer won't ever come true because it's electronic trash and has no use aside from the novelty of being old.

I was thinking more of tasks people actually need to do. I’ll grant you that things like YouTube, Netflix, FaceTime, etc. on an Amiga, Atari TT030, or an early Mac is a bit much to ask, but I still maintain that saying they, given the right software, are still capable office and email machines does not make one a “computer illiterate 15-year-old.” In fact, how many 15-year-olds have actually used such a machine, never mind had one as their daily driver for years?

That doesn’t solve the botnet problem unless you’re using one of the early 90s ARM chips, in which case you might as well use something more widely available from the same era.

Especially the former. Basic word processing is an obvious example. Even spreadsheets: VisiCalc came out in 1979, and was originally written for an 8-bit processor.

Modern email, well, that's tougher. If you demand all of the necessary functionality in the microcomputer and its OS themselves, you're talking about an Ethernet driver, TCP/IP stack, IMAP/POP client, SMTP software, a TLS implementation (unless you're okay with your email credentials being sent in plaintext, if your email provider even allows that), and a mail client. That's a tall order.
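That list is a tall order, but worth separating: the mail protocols themselves are featherweight, and it's really TLS doing the damage. A toy sketch in Python of the POP3 wire format (RFC 1939), with invented helper names, just to show how little the protocol layer itself demands:

```python
# Toy sketch of the POP3 wire protocol (RFC 1939). Helper names are
# invented for illustration; this is not from any real mail client.

def pop3_command(verb, *args):
    """Build one CRLF-terminated POP3 command line."""
    parts = [verb] + [str(a) for a in args]
    return (" ".join(parts) + "\r\n").encode("ascii")

def parse_status(line):
    """Every POP3 reply starts with +OK or -ERR; return (ok, detail)."""
    text = line.decode("ascii").rstrip("\r\n")
    if text.startswith("+OK"):
        return True, text[3:].strip()
    if text.startswith("-ERR"):
        return False, text[4:].strip()
    raise ValueError("malformed POP3 reply: " + text)
```

A full fetch is just USER, PASS, STAT, RETR, QUIT exchanged this way; the hard part is the TLS session those bytes have to travel through.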

Therefore you won't actually learn anything about computer design, and you won't have any fun getting your hands dirty. The newest I would go for is an AVR microcontroller, since you still have to build the circuit and program it yourself. Very good battery life, too.

I don’t see a problem with the email protocol stack, but TLS is probably a lot harder. Something that calls for hardware acceleration. Unfortunately I have neither the mathematical nor the electronics skills to even conceptualise such a device, but I’m sure it’s possible. Hobbyist electronics have come a long way.

There's a recent version of OpenSSL for 68k Amiga (68020+), so maybe the whole performance problem talk is bogus. Someone should try it out and see. It's going to be slower than on amd64, but so long as it's fast enough to be usable to download POP/IMAP mail without the session timing out, that's what matters.

That's a bold statement. TCP/IP (which isn't part of an "email protocol stack" per se, but which you failed to mention), would be challenging to implement on its own. IMAP/POP/SMTP are not trivial, and would probably have to be implemented in assembler, Forth, or C, based on the kind of hardware we're talking about.

TLS would be difficult, but not impossible if you stuck to a subset of the permitted ciphers and key exchange protocols and only implemented those.

Frankly, I don't see the point in dumping all of that effort into email stuff. Email is shit, and normies will never use an 8-bit or 16-bit computer, anyway. Fuck 'em.

Haiku is a single-user OS and it has a built-in WM, so it will never replace Linux. Linux development is pushed by the server market, not the desktop, and there multiuser support is a requirement. On the desktop, choice of DE/WM is crucial on Linux systems. I like its design, though: it has window tabbing like pekwm or Fluxbox. That's very useful on non-tiling WMs.

Seconding this. Dual Athlon XP or Tualatin P3 servers were very powerful. Those boards don't have PCIe, though, and I don't know which GPU was the last one with AGP. The problem is that those machines can hardly be stronger than an Athlon 64 X2, which is insufficient for browsing the web. Now, because of heavier JS and memory-eating browsers, even that is getting worse. I'd suggest building the last non-botnet servers you can build (K10 Opteron, doesn't matter which, or 6-core Socket 604 Xeons, maybe debotneted 1st-gen i7 Xeons) and getting to know their motherboards to get hacking on them. Supermicro mobos usually come with an integrated old Matrox card which runs on free Linux drivers. I'm really thinking of copying/getting old Matrox cards for 2D accel.

Cryptography, decoding modern compression formats. I'm sure that if you get a rar or tar.xz, it's gonna be shit. I'm almost sure listening to FLAC will severely strain the computer. Also, afaik Linux was initially written for 32-bit. Oh, and you know, in 2038, 32-bit integer timestamps will overflow. You'd need to build lots of hardware acceleration cards. Better use FPGAs, but unfortunately even those require new hardware, and most of them use nonfree toolchains...

Oh, I heard AMD is going to make the PSP switchable with their new AGESA. Non-botnet Ryzens when?

A 68030 or 68040 is going to breeze through all these things. They used to power Unix workstations; the first web server ran on just a 68030 CPU. It's not going to be a problem. I think you underestimate the power of these chips.

Not a problem. That's a software issue, not a hardware one. 32 bit integer timestamps on 32 bit systems is an optimisation choice, not a hard limit.

For TCP/IP there is already uIP, and there is also a version with IPv6 support; IMAP/POP should be no problem. TLS seems to be the main problem, but it seems to be possible considering there is wolfSSL, which is quite compact already; with tight asm code and limited cipher support it should even fit on an 8-bit micro.

I just wanted to show how much stuff you will need to deal with. I know you can emulate 64bit integers on 8bit systems, too. It's just that counting time is gonna be slooow as fugg. Or you could count time from now, that's also a good option.
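For the curious, the numbers behind the overflow and the "count from now" trick, sketched in plain Python (nothing here is 8-bit-specific; it's just arithmetic):

```python
# A signed 32-bit seconds counter overflows after 2^31 - 1 seconds,
# about 68 years. Where that lands depends entirely on the epoch you
# pick, which is why counting from "now" is a legitimate dodge.
from datetime import datetime, timedelta, timezone

INT32_MAX = 2**31 - 1  # 2147483647 seconds, roughly 68 years

# Unix epoch (1970): the counter runs out in early 2038.
unix_overflow = datetime(1970, 1, 1, tzinfo=timezone.utc) \
    + timedelta(seconds=INT32_MAX)

# Same 32-bit counter, epoch moved to 2000: good until 2068.
y2k_overflow = datetime(2000, 1, 1, tzinfo=timezone.utc) \
    + timedelta(seconds=INT32_MAX)
```

Widening the counter to 64 bits costs a few multi-precision adds on an 8-bit CPU: slow, but time only ticks once a second.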

Well, the Motorola 68k is a 32-bit processor, that helps a lot.

Radeon HD 4670.

What's with this obsession with using Motorola?
You know the late 68k came out in the same period as the Pentium series, and the latter were much more powerful and are much easier to deal with today, being x86.

So they never released anything newer. I checked AGP in 2010, but it was frigging long ago. The 4670 was a good one; it could run Crysis on a dual P3. And it has pretty good drivers on Linux now, afaik. Maybe the AGP one doesn't, idk for sure.

There are plenty of ARM SoCs to choose from if you look in the industrial/automotive market. Many older chips even lack TrustZone or the back doors are at least available for the device manufacturer (i.e. you) to tinker with.

I'm still running my i5-750. I might replace it two years from now.

Not newer, but more powerful. IIRC there was a Radeon 48XX for AGP. But good luck finding one of those.

Old x86 was shit. That's why SPARC, MIPS, Alpha, and 68k existed back then.

complex operating systems to be replaced with a simple DOS-like loader.

This. The reason why x86 won was compatibility, prices, and modularity. You could easily extend the functionality of your PC by attaching new cards. Not performance or anything else.

Aren't the controller chips on modern SATA drives pozzed, though?

How the hell does being x86 make it easier to deal with? 68k assembly is much nicer to work with, and there is no shortage of compilers targeting the architecture either.

x86 is a piece of shit mired in baggage due to having to remain backwards compatible with a lot of poor decisions. It’s not “something better”, it’s exactly the kind of mediocre garbage we need to move away from with a project like this.

Any arch can have an expansion bus. There's nothing inherently cheap, compatible, or modular about x86 either. Its success is entirely down to Microsoft's incompetence in porting their OS and toolchain to other architectures. And the low cost is entirely down to economies of scale. And compatibility with legacy x86 software wasn't an argument either; an emulator could have been used as a seamless stopgap measure. Which is what x86 ended up being anyway: a proprietary RISC processor running an x86 emulator.

TL;DR: x86 won because of PC and its "perks".

I didn't say x86 was the reason for modularity. It isn't the arch that defines the modularity: the PCs were modular, and the PC was associated with the arch called x86. Heh, you just made me remember when I learnt how to attach DRAM to the 8085s.
Yes. But also consider that you could upgrade/replace parts easily. There must have been others who did this, too, and they lost because of the network effect and this.
Yeah, it could have been done with emulation, but that could have led to potential errors and reduced performance. Back then these were more serious concerns, I think; now they aren't. And just look at people: most of them never buy something again if it dissatisfied them in only one thing. Stupid consumers + Microshit and others wanting to keep profiting.
Couldn't say it better.

The 68040 did pretty well against similarly clocked 486. Motorola had a hard time matching DX2/4 486s. I think Sun had gone to SPARC by then and Apple had already planned to use PPC before the 68060 came out. I think the 68060 got a lot of use in embedded devices back then, but you are correct. It was no match for the original Pentium.

Would it be possible to build a relatively simple homebrew computer (no graphics, UART for serial interface) with a ColdFire v2 microprocessor?

You can't possibly solder that by hand; however, there are boards already available.



Ah, fuck.

There is nothing I want more than an OSI 300 trainer, so fucking cute. Originals are basically non-existent; schematics are online and I have ordered all the parts, but they're just sitting in a box as I don't trust my soldering skills with all the timing and shit. One guy made a reproduction, and sold a few kits at cons, but never posted the board files online.
He also hasn't been seen online in years.
I'd do fucking anything just for those board files.


Those d-subs aren't actually for monitors, are they?

The 65C816 is still being made by WDC. That might be an option for a portable, it's reasonably low power, and is static.

6502-based computers are kind of a meme now, because of the whole "muh Commodore 64 nostalgia," but it's worth considering.

The first motherboard has a DVI output; the second has a simple DB9.

It seems to be a (9-pin) serial port

Nope. DB9 connector -- very unlikely to be CGA/TTL. Here's a board with 2 UARTs and a CAN.

For some fucking reason I can't get code to execute on this board. Uploading: fine; Monitor inspection: fine; Execution: crash. :/

1. Get an FPGA.
2. Make your own instruction set.
3. ???
4. Profit.

Look at this big boy.
Pure x86, none of that (((AMD_64))) bullshit, integrated FPU, 4GiB RAM through PAE, GPU with 1080p anime codec.

OP wanted low-energy-usage processors. x86 is not that. Granted, it's great you found a chink x86 processor that may or may not be a fucking botnet. Alternatives to kiked (((AMD PSP))) and (((jewtel ME))) are always nice.

pic related


Alright you fucks, let's start the project.

The CPU: Zilog Z180

100% compatible with the Z80, available in 10, 20, and 33 MHz.

On-chip: clock generator, 16-bit counters/timers, interrupt controller, wait-state generators, serial ports, and a DMA controller.

The GPU: Yamaha V9958,
for its simplicity and for compatibility reasons with the OS.
If you can find something better, let me know.

The OS: SymbOS

Pretty impressive what that guy managed to do on an 8-bit CPU with limited RAM.
There is even an IDE to make programs for it!
That being said, it doesn't seem to work properly and isn't maintained.

Install Fuzix, kid.
Last commit on github was earlier this month and it's *nix.

Just practice on random junk boards and components until you feel skilled enough to do that project. Soldering through-hole isn't hard; it just takes some practice. Also make sure to always wear eye protection (you need goggles; plain glasses aren't enough).
Anyway, cool blog. I really like his idea to turn a credit card reader into a CP/M computer. I'll try the same if I can find an old cash register or whatever.

I bet you could do even more impressive stuff with a custom-built Forth OS.

I like to have a DE, tbh.

Why not a more powerful eZ80?

Is it still manufactured and in wide use? No point in developing something that's going to fall to bit-rot in five years and will be hard to get until then. Also, why use an analog video processor in 2017? It should not be too hard to get a microcontroller to generate a DVI or DisplayPort signal.

Yeah, Fuzix is Alan Cox's project; it's legit.

Couldn't find it standalone.
Like I said, it's more for compatibility reasons with the OS; it was designed for that chip.
I like having something that looks like a modern OS instead of a black screen and a command line.

I would use a microcontroller as a GPU instead of the analog video chip. Just have it offer a frame buffer to the rest of the system and maybe run some graphics code. This way you could use whatever CPU you like without having to worry about the requirements of modern display interfaces.
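A sketch of that split in Python (purely illustrative names; the real thing would be C on a microcontroller): the "GPU" side owns nothing but a byte-per-pixel framebuffer, and the host CPU's bus writes reduce to poke calls:

```python
# Software model of a microcontroller-as-GPU: it exposes a dumb
# framebuffer, and the host CPU writes pixels over the bus. Names and
# the 320x240, byte-per-pixel format are assumptions for illustration.

class FrameBuffer:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.pixels = bytearray(width * height)  # one byte per pixel

    def poke(self, x, y, value):
        """Stand-in for a host bus write into display memory."""
        self.pixels[y * self.width + x] = value & 0xFF

    def peek(self, x, y):
        """Stand-in for a host bus read."""
        return self.pixels[y * self.width + x]

fb = FrameBuffer(320, 240)
fb.poke(10, 20, 0x55)
```

The host never needs to know how the scanout side turns that array into DVI or VGA timing, which is exactly the decoupling being proposed.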

How free are Qualcomm Snapdragons? Particularly the 410e, but I'm asking because the graphics can be used with an open source driver and I don't know about the other components (wlan, etc.).

I think the whole point of this is to use as basic components as possible in order to maximize safety.

The boot process is cryptographically locked down to require blobs; forget it.

Good topic OP, I also have thought about this. I like the idea of something like the Canon Cat, I would use Forth to do the whole thing. You can have a nice, secure computer with a small set of drivers for its limited set of peripherals, which people can audit themselves, running on a simple CPU.

Look up PropForth, I think basically we're already halfway there. There are other options too, as there are still many embedded CPUs and SOCs which don't appear to be pozzed. I think the secret laws which mandate that our computers come with backdoors only apply to chips intended for computers like laptops, tablets, phones, and desktops, and not super small low power solutions meant for embedded applications. These small computers would be perfect though, and some of them are simply micronized versions of CPUs which were used in high-end Unix multi-user systems as recently as the 1990s. For a single user computer they would be perfect.

Also it's amazing to see all the sliding in this thread, this topic obviously bothers the fuck out of the CIA niggers.

I've built a few 8-bit single board computers. Mostly Z80, but I did build one 6800 (not 68k) computer. I have a few 8085 chips and a CMOS 6502 in my collection of socket pulls. If I were to prototype up an SBC for this board, what kind of shit would we want on it?

I can source just about any peripheral, but 6502 peripherals are easier to find on account of WDC still making them. Xilinx XC9500 series CPLDs are also cheap as shit, so if there's something obscure we need, we can custom program it. I have a USB Blaster, but you can program an Arduino to replay XSVF files.

Alternatively, if you wanted something more powerful and capable of running Linux, we could use a Microblaze or Nios II IP core in an FPGA.

Serial is a must, ideally a couple channels supporting higher speeds. I would like to see some kind of wireless as well.

Maybe Inferno would be something to consider, or Plan9 / 9front? They will run on very weak hardware and are ready to go. Avoiding countless hours working on the software stack, networking, etc. would allow people to concentrate on more important things like making the user facing software comfy to work with.



They're even more scared of FPGAs.

I've been looking at designs for homebrew 8-bit computers, and I've noticed two problems that you could potentially rectify:
1. They're way too fucking complicated. Backplanes, tons of accessory cards, graphics, etc. It's impressive, no doubt, and after technicians here have built a few simpler computers, they might be ready to tackle something like that, but for now it's just too complicated and expensive. I'll be the first to admit that, while I've done some simple soldering before, I wouldn't be confident about tackling one of those designs, especially after popping $100+ on all the custom PCBs and shit that they basically require. That could be an expensive fuckup.
2. They all seem to require at least one component that is no longer readily available new from electronics suppliers, necessitating scrounging on eBay for used components. For example, I finally found a design for a 6502 (65C02, really) board where I thought I could source everything from Jameco, etc. Nice! But, nope. It's an old design, and it requires 1 MHz components, and the new ones are 14 MHz. I guess you can underclock the 65C02, but can you underclock the 65C22 VIA (which is also 14 MHz)? Do you even need to? How would it be done? Fuck if I know.

So my suggestion would be to start with a simple design that noobs like me could actually build. If a later iteration were to add the more complicated stuff that people want, then great, but I think the first one needs to be nearly as basic as possible. To that end:

-Z80 or 65C02 processor
-ROM for which EEPROM programmers are inexpensive and readily available

Something simple, interfaced with over a USB-to-serial cable from a modern computer, that could run a BASIC or a Forth or even simple assembly programs loaded into ROM.

And if all of the parts could be sourced new from a big supplier or two, that would be ideal.

Plan 9 and Inferno are shit. It would take less time to invent your own OS from scratch and write it in assembly (see MenuetOS and KolibriOS) than to polish those turds into something halfway decent.

Agreed. We should emphasize the "single" in "single board computer". It should be composed of as few boards as possible, preferably one. Since WDC still makes the VIA, PIA, and ACIA chips, we can use those if we want. Or we could use CPLDs to implement faster versions of old chips, or we could even write the functionality into a microcontroller. I have plenty of AVR experience. Although something about using a chip to interface with a CPU several times less powerful than itself seems wrong to me. For graphics, we can stick to a graphical LCD, so we don't have to worry about VGA or NTSC video (the latter being a lot harder for color, but easier for monochrome). STN LCDs are piss easy to talk to.

Fundamentally, you don't need much for a barebones 8-bit SBC. Strictly speaking, you don't even need RAM, but it really cripples your system to not have it. 8k EEPROM, 8k RAM, CPU, Xtal, and some kind of interface to the outside world (even just a simple parallel bus) is really all you need.

Most, if not all, parts we would need could be sourced from Digikey, Mouser, or Jameco.
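The glue logic for that barebones map is equally small. A model of the address decoder in Python, assuming a Z80-style reset at address 0 (a 6502 wants its vectors at the top of memory instead); the exact layout is an assumption, not a standard:

```python
# Address decoding for the "8k EEPROM + 8k RAM" single-board machine.
# On real hardware this is a couple of gates or half a CPLD; the map
# itself is an illustrative choice, not a fixed convention.

ROM_BASE, ROM_SIZE = 0x0000, 0x2000  # 8 KiB EEPROM at the reset vector
RAM_BASE, RAM_SIZE = 0x2000, 0x2000  # 8 KiB SRAM directly above it

def chip_select(addr):
    """Return which chip a 16-bit address activates."""
    addr &= 0xFFFF
    if ROM_BASE <= addr < ROM_BASE + ROM_SIZE:
        return "ROM"
    if RAM_BASE <= addr < RAM_BASE + RAM_SIZE:
        return "RAM"
    return "open bus"  # nothing decoded; reads return floating data
```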

check this

Basic idea for a Z80 SBC. I haven't built it up physically to test it, but this is the basic idea. It has 8 LEDs that you can set by outputting a byte, and 8 DIP switches that you can read by inputting a byte. When you call an IN instruction, it sets a latch that will cause the Z80 to wait until you press SW1. That way you can input data.

I just did this up from memory of how the Z80 functions, some signals may need to be inverted, I'll have to build it up to play with it.
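One cheap way to check it before building: model the board in software. A rough Python model of the design as described -- OUT latching the LEDs, IN raising the wait latch until SW1 releases it. This models intent only, not actual TTL signal timing:

```python
# Software model of the Z80 LED/switch SBC described above. It captures
# the intended behavior (not electrical timing): OUT sets an 8-bit LED
# latch, IN reads the DIP switches and holds the CPU until SW1.

class Z80IOBoard:
    def __init__(self):
        self.leds = 0x00         # latch driven by the OUT instruction
        self.switches = 0x00     # current DIP switch settings
        self.wait_latch = False  # set by IN, cleared by pressing SW1

    def out_port(self, value):
        self.leds = value & 0xFF

    def in_port(self):
        self.wait_latch = True   # CPU would now sit in WAIT states
        return self.switches

    def press_sw1(self):
        self.wait_latch = False  # releases the CPU to continue
```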

brainlet confirmed

The ENTIRE REASON that DOS was replaced on PCs before the advent of 64-bit/SMP machines was programmer convenience. Instead of every single application having to be manually aware of screen resolutions, detect and use different types of hardware (especially graphics and sound!), contain a full TCP/IP stack, and so on, you get a set of APIs and libraries to write against, and a kernel to manage time sharing so you can multitask. Windows was by no means the best way of doing this, but it was dirt cheap compared to OS/2 and was backwards compatible with DOS stuff. This in turn drove x86 sales and economies of scale through the roof, and so by the time Linus Torvalds and the BSD guys were looking for a sub-$5000 home micro to target their Unix clones to, PCs were the obvious choice. So you're not just wrong, you're SO wrong that the entire OS and application landscape on x86 is centered around how wrong you are.

t. tried doing modern things on FreeDOS for a week and almost shot myself

No they're not. FPGA toolchains are so proprietary they make Visual Studio look like GCC. If that ever changes, THEN we'll have a fighting chance.

Looks like the 9958 can still be bulk ordered. Maybe it's still in production?

The trick with DOS (and other simple systems) development is to not do modern things, and to build up a solid library of... libraries for all the tedious stuff. Forget about preemptive multitasking, generic GUIs, always-on network connections and all that other modern crap. Instead, write simple applications that fit your work and your workflow.

Honestly you could ship the GUI modules, hardware abstraction layers (DirectX/OpenGL, etc.), drivers, and TCP/IP stack as open source libraries. The reason Microsoft didn't is that they were trying to make money off the platform. It raises the complexity a bit, but you could probably implement dlopen() for a sequential loader like DOS, at which point something like DJGPP for porting POSIX software could let you do some very complex things from a simple base.

I'm not sure what he's complaining about re: FreeDOS, unless it's that he couldn't waste time on Faceberg and Youtube. I tried out FreeDOS a while back, got a packet driver set up, and was able to browse webpages, download files, zip/unzip, listen to mp3s, program in asm and C, etc.

He seems to be missing the point of OP's thread, however, which is to keep things simple instead of rebuilding the current tower of garbage from scratch.

Tower of garbage. Completely misses the point of the OP. You want GUI, HAL, OpenGL, complex POSIX shit? You have it. It's called Linux.

Wrong thread, m8.

I think the RC2014 includes all necessary components if you order the full kit. Granted, you have to solder everything, but it's not a very hard job. The components are all old-school and large, with lots of space between the pins. Also, the chips all have sockets, so it's not like you're soldering the chips directly to the board. You solder the socket and then plug in the chip.
And the base kit is simple, pretty much the bare bones to get a BASIC interpreter running. I doubt you can get much simpler than this without using a pre-assembled system. But if it's all on one board, how do you expand it later when you want to branch out and experiment? I mean, short of designing your own thing entirely on breadboard and wire-wrapping everything...
Maybe their price is a bit high? I don't know how much it would cost to build all that from scratch (I mean design your own PCB, gather components, and so forth). But it doesn't seem too hard to actually put together, even for a relative n00b. This kit, along with some useful Z80 hardware specs and general documentation like pic-related (the PDF is easy to find online, btw), would make a pretty good starting point.
But hey, if you can find a simpler and/or cheaper kit, post it. The more alternatives, the better.

Well the nice thing with CP/M is you don't actually have to know anything about the hardware. Only the "BIOS" part of CP/M is platform specific. Your program interacts with the CP/M's BIOS not the hardware directly.
This allows a single program to run on vastly different platforms.
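The layering idea can be sketched like this (all names below are illustrative, not real CP/M entry points; a real CP/M program makes its calls from Z80 code, not Python):

```python
# Sketch of the CP/M idea: programs call a fixed OS interface, and only
# the platform-specific "BIOS" layer differs from machine to machine.

class Bios:
    """Platform-specific layer: each machine supplies its own."""
    def con_out(self, ch):
        """Write one character to the console device."""
        raise NotImplementedError

class SerialBios(Bios):
    """One hypothetical platform: console is a serial port (faked as a list)."""
    def __init__(self):
        self.sent = []
    def con_out(self, ch):
        self.sent.append(ch)

def print_string(bios, text):
    """Portable 'program': knows only the BIOS interface, never the hardware."""
    for ch in text:
        bios.con_out(ch)

bios = SerialBios()
print_string(bios, "HELLO")
```

Porting to a new machine means rewriting only the `SerialBios`-equivalent; `print_string` runs unchanged everywhere.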

The RC2014 has not just some, but *all* of the problems I identified.

-The bare minimum RC2014 consists of 5 PCBs.
-It uses a UART that does not seem to be manufactured anymore and has to be scrounged.
-The "bare bones" kit is expensive, running around $100 with shipping.
-The ROM that it uses does not seem to have an inexpensive EEPROM programmer readily available. The cheapest one I found that might work was over $350.

What does everyone think of this thing? Looks really nice, physically. But I don’t know enough to size up the internals.

I’m not crazy about that book. Can I still purchase the components I need? The whole thing needs to be plugged into a terminal. I don’t want to buy an ancient bulky dying terminal for an exorbitant price off of shitbay. And I’m not going to use the computer via serial port on another computer. Might as well emulate in that case.


There's tons of stuff out there to choose from, you could probably use a TDD.

It's 5 bux

It uses 27 series eproms. You could program a 27xXX with literally any burner made in the last 40 years.

I stand corrected about the UART and programming the 27C. The RC2014 design is still too complicated, though. Reconfigured into a single board design, it might be a good option.

i have that book in pdf if anyone is interested.
sadly since it's an old book from the 70s, pretty much all the components can't be found anymore

So just ditch the bus and card idea and put everything on a giant perfboard. Wire wrap the whole thing.

Try and source all those parts on your own. You end up spending 1/2 that price easily. Then think of all the time it's going to take to reconfigure the layout onto a single board. Then wire wrap or solder everything point to point. And what if you want to add something in later or replace something?

Is RC2014 the best possible design? Who knows. But it is readily available and made up of easily sourced parts if you had to replace something. All the legwork has been done and it has a community that is still actively developing for it.
CP/M has already been ported to it so in a weekend you could build the thing. Boot an OS and get to work on your code. All on a system you know 100% from bare metal on up because you built it. No place for botnet to hide.

And RC2014 also comes in SBC form if you really want everything in one board.


Here is yet another Z80 design

-CPU: Z80 running at 7.3728 MHz (* see below)
-Interface: Two high-speed serial ports at 115200 baud, one with a fully compatible RS232 driver. Fully buffered serial input on both serial ports, supported by interrupt-driven drivers and circular buffers. Buffer size is 60 bytes. Full signal indicated when 20 chars received, to accommodate run-on from the sender. Empty signal sent when 5 chars remain in the buffer.
-Disk: 64MB or 128MB Compact Flash card support, containing 8 or 16 logical drives, respectively.
-RAM: 64K byte
-ROM: 16K byte, switched off when CP/M is active. ROM contains the bootloader and memory load utilities. Also contains Microsoft BASIC, as used on the NASCOM computer, modified to remove code that is not relevant to a serial-interfaced board (eg. screen handling and keyboard matrix scanning).
-Resets: Both cold (full reset) and warm reset (used to return to CP/M prompt) circuitry
-Power consumption: Less than 250mA
-Chip count: 9 for a fully functional CP/M implementation with an RS232 compliant serial port
-CP/M support: 2.2 with included software
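The buffered-serial scheme above can be sketched as a circular buffer with hysteresis flow control. Note the watermark interpretation here -- assert "stop" while 20 or fewer free slots remain (the run-on margin), de-assert once the buffer drains to 5 queued bytes -- is an assumption about the wording, not taken from the real firmware:

```python
# Sketch of an interrupt-driven receive buffer: 60 bytes, circular,
# with flow-control watermarks as (plausibly) described above.

BUF_SIZE = 60
STOP_MARGIN = 20    # assert flow-off with this many free slots left
RESUME_LEVEL = 5    # de-assert flow-off when this few bytes remain queued

class RxBuffer:
    def __init__(self):
        self.buf = [0] * BUF_SIZE
        self.head = 0       # next write index (interrupt handler side)
        self.tail = 0       # next read index (application side)
        self.count = 0
        self.flow_stopped = False

    def put(self, byte):
        """Called from the receive interrupt for each incoming byte."""
        if self.count == BUF_SIZE:
            return False                      # overrun: byte lost
        self.buf[self.head] = byte
        self.head = (self.head + 1) % BUF_SIZE
        self.count += 1
        if BUF_SIZE - self.count <= STOP_MARGIN:
            self.flow_stopped = True          # tell the sender to pause
        return True

    def get(self):
        """Called by the application to consume one byte, FIFO order."""
        if self.count == 0:
            return None
        byte = self.buf[self.tail]
        self.tail = (self.tail + 1) % BUF_SIZE
        self.count -= 1
        if self.flow_stopped and self.count <= RESUME_LEVEL:
            self.flow_stopped = False         # tell the sender to resume
        return byte
```

The gap between the stop and resume levels is what gives the sender room to "run on" for a few bytes after being told to stop, without losing anything.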

There are a few solutions for ethernet

This one is an open design. The source code is published in ANSI C. Lots of documentation in the source zip

There is also all the Arduino stuff.

If you want to handle the TCP/IP stack yourself there are straight media converters available cheap

don't let them slide this, we can make it.
SymbOS (Z80) has a networking stack.

We could just all use Contiki, it's available for a shitload of platforms and has a TCP stack.


Oh look, a Maker™. Too bad you took the Buying Like A Witless Consumer Tool route, instead of learning everything yourself.

But it's precisely not a consumer product, since you have to assemble/solder it yourself and you can expand it to do whatever, including adding prototype boards. It's closer to old computer systems you built yourself like in the late 70's (pic related), except more expandable by design.
The cost is the biggest issue IMO, especially after adding VAT and shipping. But it's dumb to say it's worthless for learning.

You either trace your own circuit board or stop larping too.

For a one-off project? You just outed yourself as a teenager.
PCBs are a mass-production method. They take effort to design, tens of hours usually. They're 'popular' on MakeABuckToday sites because someone wants to sell thousands of them, and 40-year-old born-again techies /think/ that's what they should be doing too because muh Cargo-cult. Go have some Smashed Avocado on toast w/ bottled water, pretender. You learned to solder, not how to design a computer.

For complex projects you are often better off designing and etching a circuit board, even if you just want to build one. It's far less of a headache than dealing with a billion wires.

You only design your own PCB after you've gotten your prototype working. They're great if you need a lot of the same board or you're using BGA packages or something equally difficult to work with.

Check this out
32MHz 68030
256MB of RAM!
Runs Linux

Gentlemen, get your hackerspaces ready! There are ravenous nixie tubes on the prowl.

the 68030 can address up to 4GB of RAM.
what do you think?

The 68030 and 68040 are great processors in their own right, but they are expensive to source. Especially the non-EC versions (EC versions of the chips have no MMU and no FPU, and Linux needs an MMU). It may be more practical to build an 8 bit computer than it would be to build a fully Linux compatible machine, especially for us Holla Forumsfags. I have some experience building single board computers, but only 8 bit systems.

Out of curiosity, are the AMD Geode chips pozzed?

I know a lot about the 6502; the microcomputer era was a golden age for efficiency.
The amount of complexity of the system could be managed by a small team of engineers.
What made the 6502 "better" than the 68000 was having programmers involved who understood how critical interrupts are and the memory bottleneck.
While I find minicomputers more elusive due to high level programming languages, the microcomputer offers real world results.
There is no market to build a new generation.
Even if you've mastered VHDL and semiconductor production, you still need to learn Chinese and have a million dollar investment with no guarantee of yield.

I do believe that something new can come from old off the shelf components, but you require a software stack built mostly ground up.
TL;DR: where is the market, and where are the engineers?

Maybe companies, government, and military could make good use of sane and secure computers?

Everything is depending on totally pozzed machines, and it will only take the slightest push to send everything into total chaos.

There is no market here. If you wanted to turn this idea into a commercial product, you would have to go the route of ✓ and find a product with a viable market that happens to use the kind of hardware we are talking about, and could be converted into a home computer by enthusiasts.

True, but a 32-bit CPU with up to 4 gigs of RAM would basically mean almost complete freedom.
we'd be able to run most tasks.

if we're going 8bit, i think a Z80 + SymbOS would be the closest thing to a modern computer we can get.

could also go for an 8088/86 but there are very few builds/literature about them.
you can get DOS and Windows 3.0

anything else is black screen CP/M or BASIC, you can't do much with that, it's the dark ages. may as well just use a pencil and a calculator

I think the software educator would be interested in a platform that lets students do low level and high level tasks in their first class -- without requiring system administration skills.
A system with a default language(s), module/library systems, graphics system, sound system, documentation, an editor, a build system, SCM system, all baked in.
Something with an authoritarian design like an Apple computer, but less homosexual.

In the 80's drawing to the screen was done on day one of programming 101.
Now you have to teach the student how to install/compile packages and libraries, teach OOP, teach a rendering api, a gui framework, how to link and compile, and most importantly -- how to copy others when your shit doesn't work.
This wouldn't be an issue if students were learning with something sensible like Racket (or even BASIC lol), but due to pressure from industry, people prefer to teach shitty languages like C# / Java as they do address the previous issues.

When you only teach the easy high level languages the student begins to "believe in magic" (Stroustrup), but when you only teach low level they don't learn proper abstraction and safety measures (autistic homos writing their own memory manager and containers, dangling pointers, shit written in ASM, C with classes, never writing GUIS, thinking ncurses/TTY is cool, etc.).

This video is a good summary (sorry for youtube):
The amount of "suckless" hobbyist programmers out there is not enough to make a dent even if we were somehow significantly more skilled.
For every one user that reads SICP, 1000 hipsters read "Eloquent Javascript".
The education system is pozzed, consumer software market is pozzed, crap-wear is incentivized, and indians/chinese will probably financially dominate you in your own country forcing you to join the trades or do labour.

And that's why I don't see a market there. I am thinking more of small electronics which are usually built with micro-controllers, where using dedicated components instead, and having some extra terminals on the PCB, would not increase the price too much. Stuff like clocks, home weather stations, dictaphones, programmable keyboards or game controllers, etc..

AMD realizes it can't compete with Nvidia in GPUs so they're pivoting to making APUs with Intel. Decent GPU and CPU in one chip; it's perfect for phones and low-powered laptops. Really is a fantastic move, it should make cheap phones even better.

Why the fuck would you use AA batteries? That's retarded. Use 18650s. They last a long time, they're lipo so they're not shit rechargeable, they have WAAAAAAY more energy density than AA, and they're ubiquitous. Most bigger laptops just use 16 of them in series, something like this could run off of 4 and last a decent amount of time. They also have good, cheap, long lasting external chargers. A cord would be optional.

Well, this thread just got a whole lot more relevant.

Maybe our path forward is simple, secure terminals just powerful enough to run a basic OS, video games, etc. and a Drawterm instance to the 9front /g/+Holla Forums grid.


The URL is a typo. It's actually 750,000 workers that could get deported.

Let us suppose we decide on a 32-bit CPU. I think we should make USB, ISA, PCI, PCI-E, or something available to whatever platform we create. For example: with PCI, it would be very easy to get sound cards or USB, even 2D acceleration with old S3 graphics or whatever. Or if we go for USB directly, tons of devices would be available pretty soon.

Also: I've been a SCSI fanboy since long ago for example. IDE/PATA might be easier to implement, but SCSI scaled very well compared to its age. More devices/controller, more types of devices available (card readers, scanners, tape drives) and sometimes having several data storage units in RAID is a must, and older computers cannot be expected to deal with software raid implementations. Yes, SCSI voodoo might be a pain in the ass to deal with, but with more and more of it becoming open hardware, it will be easier to debug than the black boxes we had/have.

You're missing the point of the thread. Think of an MSX or a Tandy CoCo or an Apple //c, not a PC or an Amiga or something. If there's ethernet you can do network backups to a real OS with real RAID and a real filesystem.

Then you basically want thin clients, right?

No, he wants a simple home computer that's not 30 years old.

You guys gotta fuck off. If you want that stuff then go literally buy it at Walmart. Stop trying to change the idea behind this thread. Only bare bones, “ancient” shit in here.


hell I've even seen some old chink laptop batteries use these on their insides

I have a few Newton MessagePads. One of them is an MP110 with a 20 MHz ARM610 CPU, and the other two are MP2000 and MP2100 units with 162 MHz StrongARM 110 CPUs. All of these machines will run for about a month on a set of four AA batteries. The SA-110 draws about 230mW max at 166 MHz. I'm not sure about the ARM610, but it should be comparable. These Newtons also came with rechargeable battery packs, which were internally NiCd AA batteries. On the early models, these battery packs were literally two pieces of metal glued to the sides of four NiCd AA batteries.
According to the Z180 datasheets, a 20 MHz chip draws max 50mA and typ 30mA at VDD = 5V. From what I can tell, that equates to 150mW typ and 250mW max at that voltage. A single rechargeable AA might have a capacity of 1.2V * 2800mAh = 3.36Wh. I don't know how much the other chips on the board would draw, but that single AA battery should power a Z180 CPU for more than 13 hours on a charge at maximum power consumption. 9Wh 18650s are massively overkill for something like this.
The 68030, according to the 1991 datasheet, draws a max of 1.7W at 33 MHz. Although this scenario would make 18650s more justifiable, they still wouldn't be my preference because 18650s are expensive, hard to find, and make things more complicated than they need to be.
You should read the datasheets yourself instead of trusting my numbers. I'm not 100% sure I interpreted everything correctly.

Compare all of this to a newer laptop with a 45W quad core Intel CPU and a 90Wh lithium battery made out of a fuckton of 18650s. Under full load, that laptop would theoretically last about two hours on a charge.
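The arithmetic in the last two posts can be sanity-checked (figures as quoted from the datasheets by the posters; treat them all as rough):

```python
# Back-of-envelope power/runtime check for the numbers cited above.

# Z180 @ 20 MHz: max 50 mA at VDD = 5 V
z180_max_w = 0.050 * 5.0          # 0.25 W (250 mW), matching the post

# One rechargeable AA: 1.2 V * 2800 mAh
aa_wh = 1.2 * 2.8                 # about 3.36 Wh

# Hours a single AA runs the Z180 at max draw
z180_hours = aa_wh / z180_max_w   # about 13.4 h, CPU alone

# Modern laptop comparison: 90 Wh pack, 45 W CPU under full load
laptop_hours = 90 / 45            # 2 h

print(z180_max_w, round(aa_wh, 2), round(z180_hours, 1), laptop_hours)
```

The caveat from the original post still applies: this is CPU draw only, ignoring RAM, ROM, glue logic, and conversion losses.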

So a thin client

No, more like a GameBoy or GBC. Or the credit card reader I posted earlier.
Something simple that's got an 8-bit CPU with around 64 KB of RAM and a plain serial interface, and optionally low-res graphics and a simple sound chip. A system that boots into a ROM BASIC, or simple OS like CP/M or SymbOS, or even a Forth REPL like the Jupiter Ace.
Anything more that tries to run modern OS or software doesn't fit. Those are too complicated and detrimental.

Says someone who hasn't had to deal with a billion circuit traces...

What's all that stuff stacked up on top of the monitor in picture two? Looks like it has some sort of cupboard specifically designed for all of that hardware, and the room was designed to fit that cupboard in the wall so it doesn't take up space in the room.
Is that some sort of Japanese micro house?

I know what you meant, but it's still kinda funny.

You underestimate the amount of work required to build a modern computer motherboard. I've routed PCIe for work and, let me tell you, even PCIe x1 is a bitch to do correctly. Modern CPUs will also require at least a four layer board, if not six, just due simply to the number of pins that need to be connected. And this isn't even taking into consideration the absolute insanity that clock and data lines will be at even mid-megahertz frequencies.

I'll look into using uCLinux on a 68EC030 or 68EC040 (no MMU), but I think that's the upper limit of the CPU power we can hope to work with and not quintuple our board prices and prototyping time.

Maybe if we find a module with processor and memory, designing a baseboard is much less of a hassle, fuck routing even semi modern ram on anything not 6 layers.

What reasonably power, non-pozzed module is there? I work with the Nvidia TX2s at work (and previously the TK1s), but they are way too expensive and are probably botnetted.

Perhaps some MIPS module? What about Gumstix?

I was thinking of systems at ~pentium/486 level, PCIE was just something I mentioned because it might be a possible upgrade in 10 years. PCI was alive and kicking on 486s and ISA

The vortex86 and eoma68 mentioned before looked fine in some ways, and yes, price will constrain the attempts.

ISA seems like it would be a step back. How easy is it for hobbyists to implement PCI devices?

I'm a 68k NuBus fanboy, but those transceiver chips have long since been discontinued and would require botnet FPGAs or a ton of glue logic to implement. But that might be true of parallel PCI too.

not easy.

I think you're on to something here. The ideal educational computing system will have excellent development tools and support high-level programming for ease of entry and learning theoretical concepts, yet have a bedrock abstraction, preferably at the microcode level (though sadly going below the machine code level isn't really feasible without very exotic hardware).

Early Macs do get pretty close to the mark with things like HyperCard and a simple OS with no pre-emptive multitasking or memory protection (which is fine for an educational system; crashing once in a while and figuring out how to recover is a good learning experience), but they didn't really come with any good low level tools. With no user groups or friends with Macs when I grew up, I never got hold of or even heard of things like Macsbug and programmer switches and proper documentation.

does anyone know where i can find the circuit diagram of an MSX2+ or an MSX turboR ?

No idea, maybe someone on an MSX forum will know. Only thing I happened to notice was there exists a DIY kit for MSX2 with lots of docs, including schematics:
It's about $500 (370 euros) though.
But they also have some other docs here:

i know, that's why i'm looking for the plans, that bastard is charging $500 for something that costs no more than $50!

ok, you worked on it and want to make some bucks, so make it $100 or $150, not fucking $500!

The sad part is that the price is not entirely unreasonable for what he is selling. Which is why having an open alternative would be so nice.

Like hell it is. A 100% compatible C64 clone sold for half the price, and a clone of an Aussie computer called the Microbee-- which came with a complete casing and keyboard-- was 2/3 that. It's a N8VEM-level product, and they sell for 1/4 of what that guy's autism is asking for.

NON-FREE, BOTNET wives, you should add

Is the hifive1/FE310 any good for avoiding botnets and hardware bugs? It's an entirely open-source board/cpu that uses RISC-V


Universal Shitty Botnet

With this new Intel CPU security flaw, building our own computer has become a serious matter.

Shut up faggot.

A CPU in 10 or 12

There is a guy in the UK selling a full Z80 computer kit with almost the same components for $188
Sadly it's not MSX compatible but still, that Russian faggot is way off in price.

In theory, RC2014 could be expanded to be MSX compatible. Or a ZX Spectrum, Amstrad CPC/PCW, TRS-80...

in theory, i guess, but i don't know the address map.
you can't just plug the components and expect them to just werks, we need to put them at the right addresses

There's a map here, at least for the first model:
Looks like it came from a 1985 book (and magazines). Maybe the rest is in other print resources. The only trouble is some of those are in non-English languages.


it has a microusb

That chip is probably at the upper limit of what you can hand solder. The soldering is actually the easy part - flood one side of the chip with solder, then wick it off. Solders all (or most) of the pins on one side of the chip in one operation. The hardest part is lining all the pins up. I usually put a tiny ball of sticky tack on the bottom of the chip to hold it in place.

That package is easy compared to things like QFN (which I have also hand soldered).

And the price of that would be 1/10th if it used regular stripboard, instead of this attempt to be a wannabe Sir Clive.

Pic related. Stripboard system I'm building. Spent $15 on parts at this point.

The traces are already there, minimizing time.

I would love such a computer.

What has time got to do with it? If anything, a hobbyist -enjoys- the extra time spent on the activity.

wires are unreliable, they fall off with time, some joints will get cut, and you'll have a hard time fixing it.
it's also a pain in the ass for decoupling; your circuit will become unstable as you go higher in MHz. it will add noise.
printed boards are way better


Host Interface (microUSB): Program, Debug, and Serial Communication

It is the most libre "full stack" computer.
The FTDI chip is a black mark but otherwise it is the least botnet, modern embedded CPU that exists.

I thought that was just the power port

See this m8y? Runs at 9MHz-- when a PCB design would hit the wall at 5. The 74LS parts are good up to 25.

This is like the 8th time in this thread you've proven you're a 15yo talking out of his ass.

gas yourself you insecure little cunt.

i didn't say it was impossible i said it was unreliable.

Why haven't we banned tripfags yet?

Thanks that was an interesting read, particularly the rom part.
i hope there is an msx2 version of this.

bump to save from sliding

Is anyone aware of a 65C02-based design that's similar to the RC2014?

hey retard, do you even know what forum sliding actually is or are you just parroting words other people said to try and fit in?

What's wrong with ARM?

This thread isn't about

It's about extremely simple 8-bit computing. (Re?)read the OP.


it would be extremely painful

The thread is not about any particular bit width.

If you absolutely must tie it to a particular metric, it's better to tie it to manufacturing process size. Chips fabricated with a smaller process are harder to reverse-engineer and easier to hide hardware-level botnets in.


Such a convincing argument.

I'd rather focus on end-to-end metrics: pick the target functionality and work backwards from there. Making a computer that runs at 8MHz is all well and good, and maybe you can program it to blink some LEDs or something, but it's not useful.

What's the minimum requirement for shitposting on imageboards? That's what I would like to see the focus on. That means it needs to be able to do internet, so enough processing power to run ethernet or wifi controllers, it needs to be able to cache and render thumbnails and webpages, so a decent chunk of RAM too, and it needs to be able to run some facsimile of a browser. That last bit is where you determine what sort of software you're going to be running, which will determine what sort of language you will be using, which will determine what sort of instruction set you will be using. Modern browsers are right out, throw them in the trash, you're better off using the web APIs directly and making your own single purpose viewing app like Overchan. The question is how minimal can such an app be made that it works well at sub-1GHz clock speeds? Sub 500MHz clock speeds even.

The main problem with viewing imageboards on old hardware isn't really the networking. Usenet was around even in the early eighties. The relatively high resolution images and videos on the other hand...
You're overestimating the needed clockspeed though. You can browse imageboards on an Amiga 1200, and that's far off from 500MHz.

What about GPUs? If it's so strenuous on the CPU, hand off the work to a separate dedicated processor for rendering images and videos.

8-bit ≠ 8MHz

That's nice. Maybe you should make your own thread if that's what you want to focus on. This one isn't about building minimum-viable imageboard shitposting machines.

interesting, i wonder how it works exactly, is it a multi-core computer or a single-core computer that can on occasion access another cpu

if i had to guess, i'd say a 68060 (maybe two) and +128MB of ram, that would really be the bare minimum (the 68060 can take up to 4gig in theory)

the 68060 @75MHz has 110 MIPS, which is equivalent to a Pentium at 100MHz.

as for GPU a Yamaha V9990 seem to be the right choice.
could probably go for an ISA vga too, maybe ?
Sound chip is up to you, there are thousands out there. you're not limited.

OS is obviously gonna be Linux, there is a port available for it and you can build on it.
obviously, you're not gonna be using Firefox or shit like that.

is it doable ?

yeah, but there are only two difficulty settings: very hard and nightmare mode, depending on whether you want to go for CPLD or old school 7400 TTL.

From the links it doesn't look like they are running the Z80 and 6502 in the same system. They are just replacing the Z80 CPU board with the 6502 board. The modular design of the RC2014 makes that simple.

With the Apple ]['s Z80 "softcard" they used the DMA line on the bus to "sleep" the main (6502) CPU. The 2nd CPU (Z80) then takes over and talks on the bus.

Yeah OP nobody cares about your autistic wall of text about modern CPUs and shit

So anyways I have an old Commodore 64C laying around with a datasette drive. I have no floppy drive though. I can write any .tap to a cassette tape easily with pretty much anything with an audio jack though. Is there any way I can throw together some DIY solution for plugging this into my router and logging onto some BBS FOR SCIENCE?

building on the same architecture with a better instruction set sounds great.
it also doesn't sound anywhere near viable.


I could help you but I won't now because fuck you too.

Why obviously? Why not a lighter weight OS?

Do you really need preemptive multitasking and protected memory just to shitpost? And I think it's missing the point of the thread, to get away from all that complexity.

The thread says otherwise.

Create your own thread instead of trying to derail this one.

I came.

Shut up faggot.

Anyway won't Linux have effectively more overhead when doing context switches between kernel and userland? When you have old CPU, better not waste any cycles (AmigaOS was really efficient in part due to not having memory protection). Plus, being stuck in userland all the time is a drag, and you don't need the additional security on a personal Z80 or m68k that's not going to run network server and big fat javascript browser.

It's also probably not the best idea to try to shove an entire general purpose OS into the mix anyways. Even Temple OS is more than you'd need. A better path would be to try to imitate video game console OSs (old ones, not modern "consoles" that are really just shitty PCs in disguise), where the entire environment is purpose made to do one thing and do it well.

why start from scratch? especially considering how hard it is to make an os and a webbrowser.

user it's easy as fuck to make an OS and a webbrowser. Coding your own webbrowser is like year 2 CS student homework. Now coding your own javascript interpreter is a different matter, but I think we can all agree that there's no reason to support that cancer anyways and the sooner we get off of it the better. The same goes for OSs. A fucking while (true) loop is an OS. There's no reason we need the extreme amount of cross compatibility, backwards compatibility, general purpose compatibility, reads-every-pdf-version and executes-legacy-code and runs systemd and x11 behemoth just to run a purpose built communication app that doesn't do much more than send HTTP requests.
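The "a while (true) loop is an OS" point can be made concrete. Here's a minimal cooperative round-robin scheduler, a dozen lines with no preemption and no memory protection (the task structure and names are invented for illustration):

```python
# Minimal cooperative "OS": tasks are generators that yield control,
# and the whole scheduler is one loop over a ready queue.

def make_counter(log, name, limit):
    """A 'task' is just a generator that yields back to the loop."""
    def task():
        for i in range(limit):
            log.append((name, i))
            yield                      # cooperative yield point
    return task()

def run(tasks):
    """The entire 'operating system': loop until every task finishes."""
    while tasks:                       # the while-true loop, with an exit
        task = tasks.pop(0)
        try:
            next(task)                 # run the task to its next yield
            tasks.append(task)         # still alive: back of the queue
        except StopIteration:
            pass                       # task finished: drop it

log = []
run([make_counter(log, "a", 2), make_counter(log, "b", 2)])
```

The catch, of course, is that one task that never yields hangs everything, which is exactly the trade-off the old single-tasking machines made.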

I'm not a programmer, that really isn't my thing tbh.

c'mon step it up


Ya that's why CP/M is popular on Z80 machines. It lets you do simple "DOS" stuff and when you run your program it gets the fuck out of the way. It's even acceptable to clobber the OS that's in RAM if you need more memory. You just call a warm boot when your program exits and the whole OS gets read back into RAM from disk. Takes only a second or 2 even on the lowest end machine.
That is how you get 63K of usable RAM on a system that only has 64K.


Someone made a wireless modem that plugs into a serial port and connects to Wifi. I don't know if it works with the C64, but it works on IBM clones and other 16 bit machines I believe.

It works with the C64, Amigas, Ataris, pretty much anything from that vintage.

Seems reasonably priced. I was unsure about the C64 because I forgot if its serial port was standard or not. The 16-bit/80s workstation stuff definitely has more of the standard connections instead of proprietary stuff.

Is that better than RPi?

It's not a C64 support thread. And if you don't care about an "autistic wall of text about modern CPUs" you can fuck off to your own thread, don't speak for the rest of us.


The Model T is still looking pretty good:

I also like the HP LX series MS-DOS handhelds.

A new machine should aspire to their level of usability at least, there's no reason why a desktop, laptop or handheld form factor can't be running on the same hardware internally.

holy fuck that is some cancerous shit right there.
pahahaha pathetic faggots

Is CHIP actually backdoored? I know it has blobs though.

I might get a Model T. Being limited to BASIC is not great, but the other
features are nice. There's a similar machine: The Cambridge Z88. And those HP
LX's and other similar machines are really cool.

My main concern with buying these old machines is that you don't know what you
are getting into until it's too late. How do you know that what you are buying
is not messed up or altered in a gay way by a previous owner? I'd also be
concerned that one of these old machines will give out at any moment with no
warning.
Do you think it is safe to assume otherwise? I don't know about you, but I don't
trust those queers.

With the limited resources of these machines BASIC's a pretty good fit. Any code I write on my Model T is small, math functions and what not for now.

Most of these older machines are so cheap that you could buy a few and have a spares pile. On the other hand they were built very robustly.

Yeah it's a problem, it's being worked on though AFAIK.

The motherboard you're using to post runs at 100x that. No one cares about your shitty address peeker.

You literally do.
Let's not forget that even Microsoft didn't try to drop a web browser into anything less than a 32 bit, preemptive multitasking OS.
Without this, enjoy JavaScript overwriting your kernel while it runs.

You don't if you offload your network stack onto separate hardware. Just let another machine (like a microcontroller) handle the network and present the data you need to the main system, in the form of serial streams or memory pages, on demand.
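The offload idea can be sketched as a host-side driver talking to the networking micro over a dumb serial link. The CONNECT/SEND/RECV line protocol below is entirely made up for illustration (real serial-to-TCP parts each define their own command sets), and the controller is a stand-in object rather than real hardware:

```python
# Sketch: host CPU sees only a command/response serial link; a separate
# controller behind it is assumed to do all the TCP/IP work.

class FakeNetController:
    """Stands in for the dedicated networking micro on the serial link."""
    def __init__(self, canned_reply):
        self.canned_reply = canned_reply
        self.received = []
    def command(self, line):
        verb = line.split()[0]
        if verb == "CONNECT":
            return "OK"
        if verb == "SEND":
            self.received.append(line[5:])   # payload after "SEND "
            return "OK"
        if verb == "RECV":
            return self.canned_reply
        return "ERR"

def fetch(ctrl, host, request):
    """Host-side driver: the main CPU's entire 'network stack'."""
    if ctrl.command("CONNECT " + host) != "OK":
        raise IOError("connect failed")
    ctrl.command("SEND " + request)
    return ctrl.command("RECV")
```

The point is how little the host side needs: no TCP, no buffers, no interrupt juggling, just line-oriented request/response, which is what makes the main system auditable.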

There are parts for this now actually, the package handles all the networking and tcp/ip and appears as a simple serial connection to the host cpu. But how to audit such a thing?

In the context of this thread: The same way you handle the rest of the system, you make it yourself. The idea is that two very simple and specialized systems are easier to understand and manage than one complex generalized system.

Heh. I wouldn't ever want javascript on a Z80 computer. It already slows down my amd64 enough as it is. Also don't forget: 64 KB of addressable memory.
As for multitasking, you can do that entirely within your program. Effectively it is the OS.

The idea is to get away from that user.

You're right, but unfortunately, a vocal minority in this thread doesn't seem to understand that, and wants to recreate the entire shitshow we have now, only from PCBs in their garage.

You'll never get it to perform well, since it's already slow on modern systems. Besides, it's also another means to pwn your computer.

There's also a part that doesn't understand that "just shitposting" or "just watching anime" requires a good 80% of the cruft we have already.

So what? They're not in charge. Ignore them like the rest of us. We can't be bothered with people who fail to grasp the simple premise of this thread.

I think OP should just go make his own thread if the things people are discussing upset him so much.

Arachnid exists, you know.

Arachne, sorry.

Lynx was also ported to DOS. And someone even started a fork called BOBCAT that ran on 8088.


no Ethernet

ARM as in the spying program from the NSA?

Acorn computers' ARM chips might be interesting, if you can find some. They go back some 30 years, about the same timeframe as the Amiga, but they weren't as widely produced. 68000s are easy to find, since they were used in tons of different computers. Also, it looks like ARMs were surface-mounted, unlike the 68000, which would require advanced soldering skills.

How about this bad boy right here?

Zilog Z80, programmable in TI-BASIC or Z80 asm. A very nice option, except for the fact that there's no way to work with it directly over a serial terminal and keyboard (AFAIK).

It's crap compared to a 50g from a user / developer standpoint though.

The 50g is really the thinking person's calculator, and now it's discontinued and has shot up to almost 10x what it cost before.

A keyboard is included; you type BASIC programs directly on it. Maybe it's not the greatest keyboard, but it's still functional and not much worse than the ZX-81, Spectrum 48k, and such. Anyway, you won't type big programs, because memory is limited. I don't know about asm, because at the time my TI-82 didn't have an asm shell like the TI-85 did (this was 1994). A TI link cable was used to transfer programs.
Much software here:

This one gets a demerit for having USB. That's the kind of modern thing we strive to avoid, as it greatly increases the complexity of the design. A simple serial connection is all that's needed.

You're going to have to explain why USB is bad besides just saying it's bad because it's "modern". Where is the botnet in USB?

I didn't say it's botnet, although on x86 it certainly is (see: BadUSB), but that it's more complicated than the goals of this project warrant. The USB standard is convoluted and needs lots of code to support, compared with a simple serial interface. Our aim here is to make things as simple as possible, not to replicate the insanity of modern computers, where nothing works right and bugs are ever present, even in the hardware itself.

What is the deal with this? I understand that you can write programs in asm on a computer and transfer them over. But can you write asm programs directly on the 83 without an external computer?

I've often wondered if aggressive parallelization can help. If you've got a massive array of 8086s, maybe you could get some decent websurfing done.

From memory, I believe you can, but you're writing machine code directly. It isn't pretty.

That's what the Amiga does. The CPU is less than 100MHz but there are a bunch of custom chips to handle audio, video, etc.

The Amiga also has a bunch of weird upgrades available so you can basically put the poor thing on life support. The problem is you'll have to pay out the ass for them, as well as shell out the inflated price for a base Amiga system.

It's hard to imagine that Microsoft used to make decent products. The most popular CP/M machine ever made was actually the A2+Softcard.
There is a dude in S. Korea who makes clones and is also working on an MSX co-pro board.
It is under "My Z80 card" on here

You can use the USB power when changing batteries so you don't lose your data, otherwise IIRC it just behaves like a serial over USB connection. For a new device, sure, avoid USB, but the 50g even with it beats the shit out of any TI calculator.


Thoughts about what sort of peripherals and buses will be needed? This will dictate the selection of CPUs.
- The good old parallel address/data bus with chip selects. Needed for SRAM
- I²C
- U(S)ARTs are a must

Many old bus standards were based on address/data buses (e.g. ISA). It would be great to be able to use old ISA cards in a Z80 system.
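The address/data bus plus chip-select idea is easy to picture in software. Here's a toy chip-select decoder for a hypothetical Z80-style 64 KB memory map; the boundaries are made up for illustration, not taken from any real board:

```python
# Hypothetical memory map for a Z80-style system (64 KB address space).
# The split points below are illustrative, not from any real design.
ROM_TOP = 0x4000   # 16 KB ROM at the bottom of the map
IO_BASE = 0xFF00   # memory-mapped I/O in the top page

def chip_select(addr: int) -> str:
    """Return which chip's select line goes active for a given address."""
    if not 0 <= addr <= 0xFFFF:
        raise ValueError("Z80 addresses are 16-bit")
    if addr < ROM_TOP:
        return "ROM"
    if addr >= IO_BASE:
        return "IO"
    return "SRAM"
```

In real hardware this is just a couple of gates (or a 74138 decoder) hanging off the high address lines, which is exactly why this bus style is so homebrew-friendly.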

How about a remake of the pozzed education system?
Information in one's mind can never be deemed illegal, but the process of acquiring it can be. Some books we can freely and legally obtain today might be illegal under an authoritarian state.

I know piracy is morally wrong and violates IP law, but it's good for humanity as a whole. Copyrights, patents, and laws protect content creators and their royalties, but in the long run it only takes an abusive authoritarian to monopolize or exploit all of this in their own interest.

Every piece of human work has its incentives, but most of it will just vanish or be censored or monopolized with the passage of time or authority.
People should make more non-copyrighted content, or at least build another internet sub-iteration, or at most a better network where everything is freely accessible, maybe illegally too, and can't be shut down. Something like Project Xanadu, but more archival and checksummed, keeping different untouched versions of information, because the current internet is a clusterfuck of identity politics and narcissism with disinfo and shutdowns. The internet is weak and it's dying in a degenerative stupidity.

I think information should be freed, just as humanity should always be. Things only get worse from here, if you can feel it.
Let's get out while we can.
If you think about it, only the people in authority have the means to fabricate CPUs; it was entirely the hard work of humanity that brought them to fruition, yet they don't belong to us at all. Here we are discussing feasibly usable old-ass CPUs for a nice project, or waiting for RISC-V, but in the end we're just going to be peasants against magicians.

Don't wait until it's too late. Build the weapon.

Z80 stuff was mostly S100. "ISA" like your photo was really only a PC clone thing.

I like what RC2014 uses because it makes building your own cards easy. Cheap and simple. You could even just use perfboard.

Getting PCBs made for edge-card buses like ISA and S100 is kind of a pain today.

How do you intend to do file transfers if not with USB? This seems like it would be an ideal distraction-free machine for creative writing or programming projects, but the whole thing is sort of pointless if there's no way to move your work to a computer you can distribute it from. We already know you don't want a networking stack, so how do you intend to handle file transfers? With a floppy drive?

Serial, parallel, ethernet, or even copying files over floppies/swapping HDDs would work. USB isn't the only way to transfer data.

Not according to Urbit friend :^)

Kermit works over anything: Telnet session, RS-232, SSH session, dialup, radio links, IR, etc.
If you can move ASCII text in and out, then you can use Kermit.

There are clients ported to all the 8-bit CPUs. There are clients for Mac, Windows, Linux, BSD, licensed under GNU and BSD licenses. The bootstrap is so small you could even type it in by hand in hex if you had to.

The *nix clients can even run as a daemon, so you could connect your own machine to a Linux box running the daemon and put/fetch files interactively, like ftp/ftpd.

So even if your homebuilt computer is so limited that it only has one RS-232 link for a console running at 9600 baud, you could use Kermit to send files in and out over that dumb console port.
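Part of why Kermit fits on tiny machines is that its packets are plain printable ASCII with a one-character checksum. Here's a sketch of the classic type-1 block check as commonly described (sum the bytes, fold the top two bits into the low six, then bias into printable range); this is a simplified illustration, not a full Kermit implementation:

```python
def tochar(n: int) -> str:
    """Bias a 6-bit value into printable ASCII, per Kermit convention."""
    return chr(n + 32)

def block_check_1(data: bytes) -> str:
    """Kermit-style type-1 block check: one printable checksum character.

    s is the arithmetic sum of the packet bytes; bits 6-7 of the sum are
    folded back into the low 6 bits so all 8 bits of each byte matter.
    """
    s = sum(data)
    return tochar((s + ((s & 0xC0) >> 6)) & 0x3F)
```

Since every packet byte and the check character are printable, the whole exchange survives 7-bit links, dumb terminal ports, and anything else that can carry text.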

That doesn't actually tell you a whole lot. Which serial and parallel protocols, specifically, are available?
Requires networking.
This is practical as long as you can actually get your hands on a floppy disk and a normie computer with a floppy drive. For some reason I was envisioning a device that looks like one of those old ARM palmtops from the 90s, but re-reading the OP it looks more like he wants something shaped like a C64, so size probably isn't an issue. Definitely the best non-USB option here.
That's probably too time-consuming, though it should technically work as long as you give this thing a Linux-compatible filesystem.

So you're all right with giving it a networking stack? Personally, I'd consider networking to be more something to avoid than USB.

Swapping hard drives isn't too time consuming assuming you put them somewhere easy to access such as a drive caddy.

Floppies are pretty easy to obtain along with the necessary drives. The only problem is that production is pretty minimal if at all, so you're stuck with old disks unless you have a way of making more.

As for serial and parallel, the smaller RS-232 ports and your standard printer interface seem like the way to go.


What network stack? I'm guessing you didn't read the protocol spec. Kermit doesn't need IP at all. Sure, it could work over IP, but it also works fine over straight RS-232.
If you can send ASCII text over whatever the link is, then you can use Kermit.

Me personally, I don't see TCP/IP as a problem. Everything is open, and you can implement it over something as simple as UARTs using SLIP or PPP. All open standards. On smaller machines it's better to have individual programs handle IP rather than have a big stack running in the background at all times. That is how it was done on CP/M and DOS.
USB, on the other hand, requires botnet chips, and then layers of software to actually do anything. It just seems like adding unnecessary complexity for very little gain. If you only need USB for moving files around, then RS-232 and Kermit would be a better choice imo.

The problem with using serial is that files are no longer measured in kilobytes; they're measured in gigabytes. You can't make a baud rate fast enough to push that much data through; it'd take forever to move files around.
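For scale, the back-of-the-envelope arithmetic (assuming 8N1 framing, so about one data byte per ten bits on the wire, and ignoring protocol overhead): 1 GB at 9600 baud is roughly 12 days, and about a day at 115200 baud. Painful, but days, not years:

```python
def transfer_days(size_bytes: int, baud: int) -> float:
    """Days to move size_bytes over a serial link.

    Assumes 8N1 framing (10 bits on the wire per data byte) and no
    protocol overhead, so real transfers would be somewhat slower.
    """
    bytes_per_sec = baud / 10
    return size_bytes / bytes_per_sec / 86400

GB = 10**9
# 1 GB at 9600 baud is about 12 days; at 115200 baud, about a day.
```

Of course, the point of a machine like this is that you'd never be moving gigabytes in the first place; text files measured in kilobytes go over the same link in seconds.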

Satan pls.


You're not using your vision to see, user. People who want something better and simpler won't be downloading their 3gb doujins to their simple devices. Images and video could be done away with entirely on such a platform if you like.

There's no reason for most multimedia, so why not do away with it? If you like, you can have a simple and safe vector/raster format with strict file size limits, say 4 KB per image.

You do realize USB is serial, right? So it would potentially be feasible to get a non-botnet serial connection operating at similar speeds.

Terry's Super Simple Serial is looking better and better: low-speed devices on one bus for stuff like keyboards and mice, other peripherals on another bus for high-speed connections. Every device listens on one channel or more on each bus; stuff like keyboards and mice can even be made to share a channel.

There's no reason that devices have to have a manufacturer ID and such shit, or that you have to pay a pretty hefty sum to join the industry group that hands these IDs out, without which you can't put a device into a machine.

Any real effort will look something like S100 or the Apple II, where the spec is fully published and available and anybody who wants to can make an expansion card.
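The channel idea a few posts up could be as trivial as a [channel, length, payload, checksum] frame on a shared bus. This is a hypothetical sketch to show the shape of it, not Terry's actual spec:

```python
# Hypothetical frame for a channel-addressed shared serial bus:
# [channel byte, length byte, payload..., checksum byte].
# Illustrative only; not Terry's actual Super Simple Serial spec.

def make_frame(channel: int, payload: bytes) -> bytes:
    """Build a frame addressed to one channel on the bus."""
    if not (0 <= channel <= 255 and len(payload) <= 255):
        raise ValueError("channel and length must each fit in one byte")
    body = bytes([channel, len(payload)]) + payload
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def parse_frame(frame: bytes) -> tuple:
    """Return (channel, payload); raise on a bad checksum."""
    channel, length = frame[0], frame[1]
    payload = frame[2:2 + length]
    if (sum(frame[:-1]) & 0xFF) != frame[-1]:
        raise ValueError("bad checksum")
    return channel, payload
```

Devices that don't care about a frame just watch the channel byte and skip `length + 1` bytes; no IDs, no enumeration, no licensing body.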

What about AVR? Arduino-compatible boards can certainly rival early 8-bit microcomputers in terms of power if you don't mind an SoC. It would work about the same with a BASIC interpreter and some simple I/O.

If you want an AVR you might as well get the SiFive E310, more power, more open.

Fair. With the higher clock speed it would be possible to generate a VGA signal on the main CPU. However, it would be more impressive (and cheaper and more easily repeatable) to use Arduino compatibles instead. The "open" part of RISC-V has more to do with licensing the ISA to manufacturers than with how libre the design and hardware are. The Arduino board (though not all its chips) and software are all freely licensed. The SiFive is only marginally "more open" while being more expensive and less common.

The E310 is open hardware as well.
You can grab the RTL used to make the chip.

Yes there will be closed hardware RISC-V chips but SiFive are making two open source CPUs as well.

Even if there are closed source RISC-V CPUs you still have to pass their verification standards to carry the name "RISC-V" on your product.

RS-485 can have more than one device on the bus. And it's a simple and open standard, like RS-232.

"Open Hardware" is a meme. Even Stallman knows it's pointless, because you can't verify what the fab does. Once it's on silicon, even if it is well documented, whether or not it's "open hardware" stops mattering; it's a black box. The best you can hope for is no proprietary firmware and no undocumented instructions. The RTL being available doesn't really change much, because there is only a single FPGA (the iCE40 from Lattice Semiconductor) with a free software toolchain, and even then it's not very powerful, possibly not enough for the design. The toolchain is the result of reverse engineering, not documentation of the chipset.

We don't have 100% verification in the stack yet that is true but it is far from a meme.
The more documentation you have the better you can verify the hardware you have does what you tell it to do.
Stallman still wants open hardware.

Eventually we might get portable fab tech.
For something as simple as the E310 you could buy five, image three, and use two if you were really paranoid.
The E310 is 180nm; you could do it a la

The ability to fabricate microchips in the home can't come soon enough.

I see people here talking about proprietary hardware, with some arguing in favor of it on the same principles of free software, so allow me to put in my two small copper discs that are each valued at 1/100th of a United States dollar.

Free and open source hardware designs aren't really important when you look at the big picture. Even machines shipping with proprietary and/or closed source software out of the box is totally fine. The problem is when end users are not allowed to install their own software, and are not given adequate tools and documentation to make their own free and open source substitutes. Even something like the Intel Management Engine would be a non-issue if Intel gave me the ability to permanently disable it. Hell, I might even enjoy some of the cool remote management features it provides if I could run it with my own free firmware. Manufacturers could even charge something like $99 for a developer account that gets you the documentation on their hardware. That's totally fine, since it still gives you a choice in the matter, and they still make a profit.

At the end of the day, it's the software that controls the hardware, and if you don't control your own software, you don't control the hardware either. I think my biggest problem is with non-free BIOS software. I know there's Coreboot, but that's just a bandaid on the underlying problems we see in the tech industry today. A bigger push for free software is desperately needed. I'm saddened and discouraged that the only systems I can get that run 100% free software are 10 year old ThinkPads. We need hardware manufacturers to simply give us the ability to choose what software we run. I don't care if it's hard to replace the non-free firmware; I just want to be able to do it without obscene amounts of reverse engineering of embedded devices. That's ridiculous and isn't sustainable.

I was wrong. According to Wikipedia, one of the RISC-V SoCs has been demonstrated running on an iCE40 FPGA. It's a shame SiFive's development boards don't use it; it's most likely because of the low clock rate.

How can you be sure it's really disabled if the design is not at the very least open?

Besides, half the point of the free software movement is to allow people to have the knowledge to modify or create their own software. The reason RMS didn't originally believe non-free hardware to be an issue was because it was relatively straightforward to reproduce with existing knowledge, but the expense of say, fabricating one's own silicon made the task impossible for an individual. His opinion has changed now that firmware is designed to be routinely updated, and because malicious features can be implemented in hardware, such as DRM and backdoors.

How does the Libreboot tranny verify that the management engines in those machines are disabled? Through the various debugging tools available for these platforms, which can be used to interact with it. As hard as Intel tries to make the thing a black box from which no secrets can escape, it's still an embedded processor with input and output mechanisms that you can hack into to study it and get a better understanding of how it works.

Exactly, and manufacturers should be made to release instructions on how you can interface with their hardware to create your own software for it. You don't need CAD files for every little microcontroller and every CPU core to do this. You just don't.

Do you mean in silicon? Or in firmware? Because I'm a bit out of the loop on DRM bullshit, though last I heard, DRM was implemented in software in the OS and in firmware with drivers and EFIs and stuff, but not in silicon. Implementing DRM in silicon would be ridiculous because then someone can come along and crack it and all past devices are cracked for life. So if they're doing the DRM in software/firmware, then free software would solve this problem. If they're doing it in silicon, then the reverse engineers have time and permanence on their side.

I don't really see what the issue is.

Modern CPUs have billions of transistors, there could be a second secret management engine in there for all we know. The point of this thread was to explore looking at devices that predate the NSA's meddling or are at least simple enough to reverse engineer when decapped and examined under a microscope. 7400 series ICs and 6502 are about the latest thing you can verify.

You can do this without giving away the working of the device. Think of a microcontroller like the BASIC stamp. If the entire system was implemented on a single IC, it would be a black box, but it could still be interfaced with per the documentation.

It can be either, but hardware-based DRM doesn't need to be implemented in silicon.

The software could still claim to be open source, but without the ability to read the ROM, it can't be trusted. A similar but easier to bypass method was used in arcade systems in the 80s/90s.

It only needs to remain uncracked for the lifecycle of the device. It took a pretty long time for suicide batteries, bluray, and the PS3 to be cracked. All of these have longer lifecycles than a modern smartphone.

I'd like to point out that the tranny doesn't develop anything at all. All he does is rip out binary blobs and re-release Coreboot; he doesn't do any development, doesn't have the knowledge base or wherewithal to do it, and doesn't verify any claims made by Coreboot. He just strips out blobs, repackages, and re-releases.

Closed source hardware is not the same as DRM, nor is it a prerequisite for DRM. It's an entirely different issue, and you can have open source hardware with open source DRM, which would be even worse, because a lack of obscurity means tougher security systems with more thought put into them. What's important is being able to control all of the software that runs on your system, either by buying a system already running FOSS or by writing the alternative software yourself.

I also don't give a damn if there's hidden hardware and software as long as it doesn't screw with the rest of my system. There could be a tiny MIPS CPU on my motherboard playing PS2 games in the background and I wouldn't care, beyond the power it draws for such frivolous shit. And I'll know if it's accessing the BIOS, kernel/OS, drivers, or any other devices, just as we know about the AMD PSP and Intel ME. These things wouldn't stay hidden for very long, at least not among Libreboot numales and chan nerds like us. I mean, SSDs and HDDs have non-free controller boards and firmware in them, but there's nothing that can really be done about that. It could be better, but there are far more pressing issues within the computer itself, like the boot firmware.

That scenario with coated hardware is pretty out there, and I doubt OEMs would go to those lengths.

By life cycle, I think you're trying to say as long as it's sold by the manufacturer or as long as they give it firmware updates. One of the greatest advantages of free software is that it allows you to keep your software up to date long after that cutoff. The thing about freetards is that we don't mind using old hardware. Hell, the newest device I own is a 2013 Nexus 5. My laptop is a 2012 ThinkPad X230. I have little use for the latest and greatest stuff when I can get these for cheap and install FOSS operating systems on them, and other embedded firmware in them is being reverse engineered as well. I just keep using my tech until it breaks. For me, that's the life cycle of my devices, and it's often 10+ years because I treat my personal belongings well.

That being said, I fully support open hardware and I'll be buying RISC-V stuff as soon as someone packages it up into a nice little pre-built desktop or laptop. Nothing will ever be perfect, but we can push for something better. I just think that this push should start with software, and that focusing entirely on hardware will just take longer.

As someone who is in contact with some of the Coreboot maintainers and gives them anonymous bug fixes for some of my boards, I can tell you that isn't entirely true. The tranny does substitute in some firmware of "her" own, does some pretty extensive testing with the help of others, and maintains it all in source and binary form for anyone who wants to use it. I'll shit on the tranny for many things, including being an insufferable faggot, but not for Libreboot. It's a good project.


I forgot how comfy CP/M and MP/M were. The distribution disk comes with ed, an assembler, a debugger, and load/dump. What more do you need?

What am i looking at?

Couldn't you just get a Pi 3 and do this?

bombing the thread