10 years ago, 512MB of RAM was perfectly adequate for everything.

How do we fix this? Or is resistance completely futile?

1. Remove botnet
2. Remove pajeet
3. Convince Terry Davis to write a new computer ecosystem to supersede all other operating systems

wget??????????????

Not just that, but modern graphical systems (like Windows, KDE, GNOME, etc) are ridiculously bloated and slow.

The reason is that hardware companies work with software corporations to make software more and more complex. If software were as lightweight as it used to be, hardware companies would make little profit, because computers would stay cheap.

It has always been like this: the original UNIX was created and popularized because all the other software of the time was just too damn slow. Note, though, that UNIX wasn't developed to be sold. The same thing happens now. Well-designed software is still developed by some open-source communities, but there is no market for it. People are happy enough to pay hundreds of dollars for a new PC every few years.

We've had 10 years of programmers getting shittier. Look at Holla Forums cry about C. They need GC'd languages because they can't do anything without all the hand-holding, but GC requires a ton of RAM. So we get 512MiB Hello Worlds in C#, Java, JavaScript, etc.

Even on a 1GB RAM machine that works perfectly with everything else, the browser will bring it to its knees.

Garbage collection is not the problem, dumbfuck. Lisp machines were a thing in the '80s. They had 2 megs of RAM, garbage collection, and an OS you could edit in real time while it was running.

I actually need 16GB of RAM because I work with image files as big as 65358 x 27375, ~100MB raw, brute-force processing.
Once I opened one in Paint by mistake on a 4GB RAM machine with an i7 (not a top model), and it took 30 minutes to load with the computer frozen the whole time.
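For scale, a rough back-of-the-envelope sketch in C (assuming 3 bytes per pixel once decompressed; the actual raw format isn't specified here): the decompressed bitmap alone comes out near 5 GiB, more than that whole 4GB machine had.

/* Rough estimate of what a 65358 x 27375 image costs once decompressed.
   Assumes 3 bytes per pixel (24-bit RGB); the actual file may differ. */
#include <stdio.h>

int main(void) {
    unsigned long long w = 65358ULL, h = 27375ULL, bpp = 3ULL;
    unsigned long long bytes = w * h * bpp;
    printf("%llu pixels, ~%.1f GiB uncompressed\n",
           w * h, bytes / (1024.0 * 1024.0 * 1024.0));
    return 0;
}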

kys

The situation is so insane nowadays that even the most basic of programs have this kind of issue.
When I did work on an embedded machine, I decided I would leave bash installed so the user could have a friendlier shell.
And then I ran bash, and it used all of the RAM, and Linux crashed.
And then I realized why every single router ever runs ash/dash

i686 user detected

10 years ago stateful fixed-function OpenGL was considered perfectly adequate for everything, but now every modern graphics API wants to directly expose the hardware.
10 years ago JavaScript as-is was considered perfectly adequate, but now it has JIT compiler implementations, asm.js and WASM.
10 years ago Python's performance, memory use and GIL were considered normal, but now there's more whining about it being a "slow" language than there ever was.

FUCK!

should of installed fish tbh

Very simple.
Browsers need to go back to being what they were.
AKA browsers, and not operating systems.

Give up on modern browsers.

Should have learned English properly

Life started off as single-celled organisms; now we're multicellular life with functioning ecosystems with adaptable flaws and benefits.
/g/enerally software molds itself around hardware: Linux adopting ARM instruction sets, modern browsers integrating spell check, native HTML video, SMP. That's the evolution of things.
Keep in mind that switching between cores and caches does inherently require memory, depending on which architecture you're using.
It's not futile, but yes, you can run 1GB of RAM on a modern system with Chromium and Ubuntu LTS just fine. I use one in my homelab.
As others have stated, other things generally use more RAM too. Windows XP isn't viable on the internet, and Windows 7 is bloated. Linux itself has added a little memory use, but the real growth is in the things around Linux, such as Xorg, desktop environments, and growing desktop screen resolutions. Back in the 512MB era a computer could run like shit if it had 20+ autostarted services; now it could run fine. Also, integrated graphics started using system RAM as RAM sizes increased.
Generally Linux shines a lot on older hardware with minimal browsers. I don't think it's laziness on the part of these programmers.
Unused RAM is wasted RAM on modern systems: just because your system might show 99% used memory doesn't mean you can't run a program that will use 40% of it. Often the program/OS caches data to avoid reading it from disk again, which is a godsend, because the kernel can drop that cache and hand the memory straight back to malloc when something actually needs it (see the sketch below).
Also, making an OS won't help, because as we all know an OS just runs programs, so unless you want to build an entire ecosystem focused on minimalism, eehh.
I understand OP's sentiments, but I just embrace change, because, well, that is life. We get older; it's part of life. Ehhh
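A minimal sketch of that cache point, assuming Linux and its /proc/meminfo format: pages used for cache count as "used", but MemAvailable (not MemFree) is roughly what a new program can actually get before swapping.

/* Minimal sketch: print the fields that show cached memory is reclaimable.
   Linux-only; reads the text format of /proc/meminfo. */
#include <stdio.h>
#include <string.h>

int main(void) {
    FILE *f = fopen("/proc/meminfo", "r");
    if (!f) { perror("/proc/meminfo"); return 1; }
    char line[256];
    while (fgets(line, sizeof line, f)) {
        if (!strncmp(line, "MemTotal:", 9)  ||
            !strncmp(line, "MemFree:", 8)   ||
            !strncmp(line, "MemAvailable:", 13) ||
            !strncmp(line, "Cached:", 7))
            fputs(line, stdout);
    }
    fclose(f);
    return 0;
}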

Should have learned how to use punctuation as well.

Look at this loser and laugh

Terry?

I practically used it as a website viewer with javashit and everything disabled; why the hell it needs so much RAM I'll never understand.

A lightweight browser with customizable keybinds, modes and hinting would be a godsend, but sadly everything these days is based on some WebKit or on qt5-webengine, which both suck.

I see you do not into cryptocurrencies.
Or servers.
Or big data.
Or number crunching.
Etc.

Hi CS grad.

Are you retarded?

This.
XP was hopeless with less than 1GB. Fedora Linux was even worse.

www.microchip.com/wwwproducts/en/ATMEGA328

I would like to interject for a moment.
When you're referring to a CS grad, you don't actually want to imply that computer science is bad globally, as it only sucks in North America.
In Europe, CS is actually an extremely difficult and serious course.
Therefore, please use the appropriate terminology, such as US/CS or, as I've recently started to call it, computer bullshit for illiterate Americans on amphetamines and a fast-food diet.

512MB is a lot of RAM for most applications. For example: people went to the moon and back on 4KB of RAM, and the rover on Mars has only 256MB of RAM and can do a lot of things, including image processing.
Now, of course there are some applications that need tons of RAM to work (video editing, 3D, image editing, science), but using 2GB of RAM to display some text on screen is unacceptable. If games can render whole virtual worlds in 2GB of RAM (which is more complex than rendering text/images) and browsers can't do the same, then there are problems with the architecture of the application. I sincerely hope that hardware stops improving so that people will fix bloated software, but this won't happen and we will just have to live with it. I wonder what it will be like in 10 years' time? From 2005 to now, Firefox memory usage increased from 70MB to 1GB. That's roughly 14.6x (about a 1360% increase) in 12 years. If we assume that growth rate stays constant, we are looking at ~15GB of RAM used by Firefox in 2029. But the thing is, this increase is not linear. The LOC of Firefox is going up exponentially: openhub.net/p/firefox/analyses/latest/languages_summary - more code, more RAM needed.
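That extrapolation is just compound growth; a quick sketch in C using the same rough figures (70MB in 2005, ~1GB in 2017; rough numbers, not measurements):

/* Compound-growth extrapolation of browser RAM use.
   The 70 MB (2005) and 1024 MB (2017) inputs are the rough numbers
   quoted above, not measurements. Build with -lm. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double start = 70.0, end = 1024.0;           /* MB */
    int years = 12;                              /* 2005 -> 2017 */
    double rate = pow(end / start, 1.0 / years); /* ~1.25x per year */
    double mb_2029 = end * pow(rate, 12);        /* 12 more years */
    printf("yearly growth: %.2fx, projected 2029: ~%.1f GB\n",
           rate, mb_2029 / 1024.0);
    return 0;
}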

Text Only Internet.

XP ran perfectly fine with 512MB. Even right out of the box.

is this supposed to be an insult?

Only use elinks.

Yeah CS is a complete fucking joke that nobody takes seriously. You should have gone CE or EE.

Not if you then tried to run any applications. I did tech support for XP.

Your flaming anus is beautiful user

No
My server never uses more than 400 MB
No
No

Even OS X was good, with all its applications. I did a lot of video editing, audio format conversion and image editing on OS X 10.4, on an iBook G4 with 512MB of RAM. No problems.

When Half-Life 2 launched, it could run on 256MB of RAM, and the recommended amount was 512MB.
officialsystemrequirements.com/half-life-2-system-requirements/

Take a look at how much this page + browser + GPU process uses in the Chrome/Chromium task manager. Go on. Add to that µBlock, HTTPS Everywhere and other essentials.

Your computer may be able to run HL2, but it can't run this site. Let that sink in, for fuck's sake.

The only people who think GC isn't a gigantic clusterfuck of awful are CS students who have never experienced software development in the real world.

Daily reminder that computer science is not a degree in programming. A computer scientist's programming skill with or without garbage collection is irrelevant to the subject of computer science.

Windows has had consistent system requirements since Vista, mainly because later iterations are just tweaked Vista

A computer scientist has no skill in programming at all -- only theory, hence the reason they both think GS is a good idea (in THEORY) and the reason they're a big joke outside of academia.

Fug I meant GC.

Your SAT scores weren't high enough so you are self-educated superhacker, we get that. There there. Now shut the fuck up.

I love garbage collection! I'm all for letting my computer do the work that it's perfectly able to do! With your attitude, you might as well get off your computer and return back to the abacus to do all the manual calculations that you love to do so much.

And I used XP on a 512MB machine, for a couple of years, doing media creation with Adobe's CS. More RAM would have been nice, but it was never a problem.

Try Dillo

OP is obviously talking about PCs

With your attitudes we'll be needing 32GB of RAM just to open up a file browser.

It's the whole point. Programming, anyone can do; leave that to the engineers. Designing new, improved algorithms and ways to crunch data fast, or at least approximate it fast, is for the CS guys. If not, why look at P and NP or big O? Or at whether solving some problem is even possible? The idea is processing shit fast if it can be done, and if not, applying some heuristic and solving it approximately.
If your CS degree is worth a shit you will inevitably end up seeing optimization. The point of the degree is designing fast algorithms, so sometimes you'll end up having to implement them fast too, unless it's not worth the hassle. Also, at some point you'll end up touching low-level stuff (like kernel design), unless you go to a meme university like in the USA.
Reminds me of a thread we had here where, at some point, a Brazilian CSfag said something about doing babby's first kernel and some American said it was "very advanced shit". Everywhere else in the world that comes with the CS experience. But in the US, CS is just teaching Java and "here's your keyboard, go 'code' something".
Adding to that, in my experience I had to do manual memory management by the second semester of the degree (C++) (the first was run by the math department: Algebra I and Calculus I, and the algebra classes also had lessons where they taught some Haskell).
US CS programs are bottom-tier in this regard. A true CS guy will know that GC and the like impact performance, but also that when you're dealing with high-level abstractions it's not worth the hassle. Identifying where that line is makes for a good CSfag.

You can always just try a lightweight web browser. I really like pic related.

But it's shit and abandonware. NetSurf with JS disabled is better.

Kill JavaScript

Just read that as:
dildo
the fast and light browser

I'm onto you, gaucho.

...

How many CS programs in the world do you think have fucking Haskell in the algebra course plus calculus in the first semester? It was a given.

43 Languages.
What in the blue hell are they even thinking.
C++ 31.4% OK
JavaScript 24.1% OK
HTML 12.2% OK
C 13.7% OK
Rust 4.7% Starts being a bit sketchy
XML 3.4% OK
Python 3.5% Why another scripting language?
Java 2.1% Here it gets strange. Why put Java here? For Java in the browser?
CSS 1.2% OK
Assembly 1.1% OK
Autoconf 0.5% OK
shell script 0.5% ????
OpenGL Shading 0.6% OK, but is that really the job of a browser?
Objective-C 0.3% WTF? C/C++ are already here
Make 0.3% OK
Perl 0.1% more scripting languages?
NSIS 0.1% Windows installer, OK
TeX/LaTeX 0.0% WTF? Why have a markup language?
CMake 0.0% OK
Automake 0.0% OK
DOS batch script 0.0% WTF?
DCL 0.0% why the hell should a browser have a DB with access rights...
Ada Alternatives are present
XSL Transformation OK?
Pascal Alternatives already present
Matlab WTF? Why put calculations in the source of software?
C# WTF? For install/GUI under Windows
Jam Alternatives are present and it seems like a dead project
Ruby ...
PHP Why.
Go More scripting
Emacs Lisp more languages
Scala With the Java that does not belong here ...
AWK Text processing when there is already a pile of other software that could do that
HaXe OK?
Visual Basic see C#
CoffeeScript More scripting
MetaFont OK?
AMPL More maths
IDL/PV-WAVE/GDL OK?
R Data analysis in my browser source code ?
D one more ...
Vim Script ... wtf.

I think Firefox is a piece of modern art, protected from ridicule by its appearance of complexity.

Anything below 1% could just be code examples; the horrific part is the JavaScript share.
That is NOT OK: basically your browser not only implements the DOM and JS, it RUNS on them. A fucking quarter of it is JS; basically the entire add-on interface is JS.
I understand why they did it, but it means any component you add is going to be yet another cancerous JS tumor running constantly for each page you load.
Case in point:
github.com/gorhill/uMatrix/tree/master/src/js

1% of 18M is 180K (the Dillo browser has 88K lines). I doubt those are just examples. Even at 0.1% we are talking about 18K lines, and that is a lot of complexity. But this is what you get if you accept pulls from everyone because "we encourage everyone to contribute". No one wants to clean up, because that is boring; instead we get ~200 new commits per day. If you look at almost any other project, LOC goes down sometimes (people remove old shit), but not Firefox: it just keeps growing.

With each day I lose a bit of the joy in using computers. Everything has become so complex that it is difficult to take apart and study. I'll probably start doing analog electronics or something. There are enough people contributing to software already; I don't get why there is a "massive shortage of CS graduates". Apparently software is not complex enough, so we will add more people to the field to write more code, and that will surely help. Software is fun for the end user but hell for developers.

I don't have a degree and I have a job as a C++ coder at Nokia working on 5G. I think I'm the only one there who doesn't have a degree or isn't in the middle of getting one, though. I don't know why.

There isn't, they're just required by law to say that because that's the single legal condition by which they're allowed to hire so many Pajeets.

No you don't. Nokia sold its wireless tech division to Renesas and it was all Erlang.

Finally someone who isn't posting inanities
This
And even better, there's a need to produce free/libre hardware (thinking about POWER9).
So that errata and other madness can be minimized.


WTF ?

Bring back Basic.

nuke SF Bay Area
install Links

nuke SF Bay Area
install Links

You can't even use this garbage site with Links, can you?

This is how I know we are fucked. Fucking JS developers.

Source 2004 was missing a lot of features that are in the current Source version, Source 2013, like HDR lighting.

Yes, servers that have few visitors often get by with little RAM. I'm talking about real servers, not just your toy VPS that happens to be running a few daemons.


That's not obvious at all from OP's retarded post.

Also, don't be stupid. Cryptocurrency stuff and number crunching are run on desktops and laptops all the time.

Keep resisting

Try running HL2 on 256 or even 512MB. Tell me how the experience was.

I don't know about the current version, but the original release ran very well with 512MB of RAM.

Here's your why


I thought summer had already ended, but the roleplaying is still strong.

Half-Life ran on 32MB

Nobody here makes any content of any kind.

stop using ""64"" bit anything
seriously, it's not even a full 64 bit bus for crying out loud!

...

What is JanusVR.

t. x86 infant

Nowadays you'd eat up that much just to get a terminal running.

That looks better than what I remembered. I might have to reinstall my original CD version to just compare.

Pretty sure it's the Steam version. They updated it to have the option for some higher quality models which may have actually originated from the unreleased Dreamcast port IIRC.

Well sheeeit, that's a surprise to me.

Just 5 years ago 1GB was enough to do stuff with. Sites have become bloated cows of unnecessary bullshit because the people designing them don't think. If it runs on their Mac OS X with 16GB of RAM, it must be fine!

For the record, I have a netbook with 1.5 GB ram and I can still do everything I could do 10 years ago. Some sites don't work, not a loss, I just ditched them because fuck them.

Windows XP worked fine on it, but you have to use software from before 2012, otherwise it consumes all your RAM. Every Linux distro I tried on it worked without any issues. The only things that didn't work were GNOME 3 and KDE.

See picture.

...

That first one is what I remember it looking like on my very first workstation running XP with 512MB of RAM and a FirePro card.

Striving for efficiency, not stagnation. What I don't want to see is the increase in hardware being used as a crutch or excuse for software to get bloated and lazy.

Fucking this. I bought more RAM and a beefier CPU to run the shit I already had, and more of it, without it slowing down.

Sorry autismo. It should have been obvious because of the fact that not many people have servers.

you're're'rour gay.

idlewords.com/talks/website_obesity.htm

There is no such thing as "progress" or "technological advancement." It's a meme. In some cases it's a useful one for describing when we learn how to do new things, but not really in the case of software.
Software isn't "discovered" or learned. Humans design it. The vast, vast majority of the difference between old and new software is pure style and norms, not "advancement" of any sort. And most of the new styles and norms are shit compared to the old ones, even though the old ones were shit in various ways too.
Browsers taking gigabytes of RAM is not "progress"; it's shit. Trash. It's not more advanced, superior technology because it's more complicated. It's a different stylistic norm, and it's a bad one.

GC's problem persists no matter how much memory you have, because your programs will scale to match the hardware's capabilities. Lisp machines consumed an atrocious amount of memory for the time, and you had to pay out the ass for them too; it was $545 for a C64, $1298 for an Apple II, and $1565 for an IBM PC. Better models of these could go for about $3000. And by better, I mean they had more RAM. Think about this: the kind of RAM home computers had was somewhere between 4K and 64K around the mid-eighties. Now, compare that to a GC'd Lisp machine: those things could be $50,000 - $70,000. And the amount of memory they needed compared to the computers of the time was disgusting. Lisp machines were by far the biggest waste of computer resources: 70k of 1985 dollars just to run your GC'd language, because it would never run on any normal computer.

WebGL, WebUSB, Web Storage, and the rest of that shit are "inner-platform" reinventions of features provided by the OS.
en.wikipedia.org/wiki/Inner-platform_effect

JavaScript and HTML are not meant to do all of these things. That's scope creep. C, C++, and Java are also scope creep.
en.wikipedia.org/wiki/Scope_creep

JavaScript, Electron, WebAssembly, and all of these other "web technologies" are all "square wheels", inferior replacements for existing technology that was already better. C is also a square wheel which suffers from a huge number of problems that were already solved in the 1960s.
en.wikipedia.org/wiki/Reinventing_the_square_wheel

Programmers today are inept and care more about adding stupid shit and copying others (durr, Firefox) than actually making good software.

KolibriOS is a fully GUI-enabled OS that can run with 8 megs of RAM. It's a product of careful design, not the 'shoot and pray' way devs use now.

I'm pretty sure image processing isn't handled on Curiosity itself.

Some people get it; unfortunately, JS shills and bloated frameworks are everywhere.


We've allowed people to enable "developers" to get away with this kind of waste. When we trashed them, they went to their enablers who write popular blogs, who tell them "Shhh, shhh, it's okay, just call them autistic or Luddites, they're just afraid of change and progress!", and then these enablers shilled their shit because they got money if they did.

Look anywhere in life; that's the message being told. Each section of society has these enablers.


Curious, why is C scope creep? Or do you mean at some point C became scope creep?

C++, and even more so C++11 and beyond, is 100% scope creep. In any active C++ project I've banned anything beyond C++03, and even 03 is terrible.


Blame Stack Overflow. Can't get something to compile? Stack Overflow will fix it for you! Can't align divs in an HTML page? Stack Overflow will tell you how to use jQuery!

Prior to this you could go to forums which were run by people who knew the languages they wrote in, and if you weren't willing to learn you were told to go elsewhere. Assembly seems to be immune to this, most of those are still forum based communities.

That's pretty neat. This looks like they took Windows 95 (which only needs 4mb of RAM) and enhanced the fuck out of it. Nathan Lineback approved?

en.wikipedia.org/wiki/Hazcam
It takes 1024x1024 grayscale pictures in stereo to build a 3D map of its surroundings for autonomous navigation.

youtube.com/watch?v=kr58r0b5LKM

It also compresses color images to transmit them back to Earth (1600x1200). You can really do a lot in what seems like a constrained system by today's standards.

just install Chrome, it is fast

Apparently KolibriOS is "Unwanted Software". Was the website compromised, is Firefox scared of its lack of bloat, or are they being genuinely evil?

Got the same error on Chromium. Either it's indeed (((them))), it's a false positive or the site got compromised recently.

Checked the source; there are no strange things going on. The only JS on the page is responsible for the UI. Some idiot probably installed the OS on bare metal and deleted all his files, so he decided to report the page to Firefox as a malware site.

Lisp Machines were the 80s equivalent of supercomputers though. Home computers in the 80s were basically novelty toys, but Lisp Machines were the de facto standard in AI research.

The female's head is somehow bizarrely hidden by the angle, or she hasn't got it?
Getting really triggered trying to figure it out.

C is scope creep because it's meant as a small systems language and it's getting bigger and adding features that are irrelevant for systems programming. The C11 standard is 700 pages and it didn't solve the basic problems of C.

C programmers tell people that inferior solutions in C are just as good as using another language that does them properly, but that's the "reinventing the square wheel" problem.

They think C is as good as Fortran for array processing because it has restrict.
pcc.qub.ac.uk/tec/courses/f90/stu-notes/F90_notesMIF_5.html

They think C is a safe language because it has Annex K.
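For reference on the restrict point, a minimal C sketch: restrict promises the compiler that the pointers don't alias, which is the guarantee Fortran array arguments get by default. This is only an illustration (the function name is made up), not a claim that it closes the whole gap with Fortran.

/* With restrict the compiler may assume x and y never overlap, so it can
   keep values in registers and vectorize the loop; without the qualifier,
   every store through y could potentially change what x points at. */
void axpy(long n, double a, const double *restrict x, double *restrict y) {
    for (long i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}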

...

> transparencyreport.google.com/safe-browsing/search?url=http://kolibrios.org/en/
Normally I'd dismiss unironic use of (((-)))s as moronic but this time I think you're justified

Hidden or just chopped off?

...

...

this comment was posted with Links over tor

yeah fuck off faggot. i write real software and it all works well under 512MB

4GB*

NOT OK

You can browse 8ch with Links, elinks, w3m and probably lynx.
I've tried at least the first 3; posting works too, but the captcha is a pain in the ass if you're using a text-only browser. I had to write a script to open the captcha image and submit, and then it required reopening whichever thread I was in to be able to post. w3m simplifies this with inline image loading.

Fucking hell, what is wrong with you? C++ is a disaster without std::move. You were forcing a team to churn out garbage.

I'm in my 40s and worked on supercomputers at SDSC. lisp machines were a meme and I never ran into anyone that used one. They also took wall clock minutes to run a single GC pass.

Nigga that's nothing. My boss banned me from using the not operator over "x == false" because it was not readable enough.

My understanding is that lisp machine users would work all day, then before they left they would start garbage collection in hopes that it would be finished by the time they got back.

Webpack will save you. They'll just take AAAAALLL of those pesky little scripts and package them up into a single .js bundle with the rest of the main site code.

I can use palememe with 2.5GB RAM.

I use Links but I just bypass the captcha.

C++ is a disaster period.

I fucking hate all browsers. I have multiple computers and it's my browsers that eat all the ram and constantly have to be restarted every day due to memory leaks. Why the fuck they have to use more memory than muh games I don't understand.

...

Wrong. They had a lot of memory because they were high-end specialist systems.

By your logic, C++ requires a minimum of 64GB of RAM to work, because Windows is mostly written in C++ and there are workstations with 64GB of RAM.

Show me Common Lisp or whatever on a C64 or an Apple II. There's a reason it's only a curiosity, and no actual programs have been written for those systems in Lisp (aside from "Hey, we did it!" programs).
In the 1980s, Lisp actually did require enterprise hardware to be useful.

en.wikipedia.org/wiki/Macintosh_Common_Lisp

Alright, you win

S-someone yielded in internet fight? You should be recorded in annals of gentlemanry historics.

Lazy developers who copy and paste from Stack Overflow and are promoted by cancerous faggots like Jeff "The Cuck" Atwood, Sam "boipussy" Saffron, and Joel "Shekelstein" Spolsky.

My unused RAM is there so that when I'm ready, I may launch an application to use it, not so a shitty web browser can consume it all. Fuck that.


Well the other option was to allow even shittier C++11 code to be written.


After several years of using it, it sure fucking is.

The only way to keep arguing about Lisp is if I moved the goalposts, so I can't say anything bad about it anymore. Other than that, I don't know of any notable programs written in Lisp for those computers.

I think that because Lisp can do all the things it can do, the GC is not a problem. And Lisp is never used for things where it isn't appropriate, so nobody complains about it being GC'd.

But because of Web 2.0, everyone has to run JavaScript, a GC'd language, and it wastes an incredible amount of memory. In an AI simulation or whatever it would be excusable, or in Emacs, where it's just a text editor, so as long as it's responsive the GC is OK, but:

Now everyone has to deal with GC hangs when running JavaScript webpages, GC hangs when playing Minecraft or whatever, GC hangs in all the games that try to use GC in real-time intensive programs... I think it would be much better if the people who wrote those systems were forced to use C or some other non-managed language.

There are instances of the kind of GC or VM things we see in managed languages working, like in Quake 3 with the QVM, or how Crash Bandicoot uses Lisp and assembly, apparently, and because they work, nobody is concerned with the technology they used. But most of the time, retards use them to make their programming time shorter, and everyone has to suffer because of it.

HDR lighting was added, I think, in a 2005 branch with Half-Life 2: Lost Coast, to test it publicly before the release of Episode One. The 2007 branch with the Orange Box added things like dynamic lighting (the flashlight could cast shadows on level geometry), cinematic physics (pre-calculated destruction; think White Forest cabins being blown up by Striders), and I forget what else. I can't remember when they added colour correction, though; it might have been with DoD:S, so around the same time as Lost Coast and Ep. 1.

C++11 is a massive pile of long overdue fixes to C++ that make it viable in a lot of domains. I don't see what your problem is with it.

...

Actually, we may as well have a new banner, eh?

Shit.

I used to alt-tab between an Opera browser with 8 tabs open, Winamp and 50+ fps WoW back in 2005 or whatever, on a machine with probably 2GB or less of RAM.
Some time between 2008 and now, everything got fucked.

If you think C is another square wheel, you are mistaken. C was a refinement of B. B was a refinement of BCPL. Before BCPL came the CPL concept, which was derived from ALGOL. ALGOL was the first language alongside COBOL to seek standardization.

Systems versus applications programming is not scope creep for C. C was conceived as a high-level language that could access low-level functionality, as compared to BCPL, which was purely a systems programming language.

You can't have it both ways. Either say humanity is finding better methods, or say everything is just a reinvention that happens to get progressively worse. C is being obsoleted, but nothing has yet taken its crown, because UNIX forged it.

By Rust™.

Standardize architectures and have everything written in ASM.

Terry implemented his own C known as HolyC, you dumb fuck.

Anyone else remember heavy.com?

More importantly, at the moment, what browsers have good adblock?

Firefox is fast approaching unusable and I don't want to go to Chrome; what alternatives have good ad-blocking software?

heavy-r?

Now there's some rose-tinted glasses.

used to be you could run like a dozen flash movies on firefox at once

You have a lot to learn, son. Start with dropping your ignorance, maybe.
Sage for off topic.

no problems with pale moon

Fuck no, Rust is shit.

Joke's on you, my browser uses 350 mebibytes of RAM, but the real reason my RAM is bottlenecking is bloatware and proprietary laptop fan control programs.
The biggest problem from my perspective is lazy use of the power of 64-bit computing. 32-bit browsers are worth it; go use one.

it's not though. stop larping

It is though, stop being faggot.

no it's not. kys larper

Everything is shit until proven otherwise. Do you have any proof of Rust being non-shit?

it is memory safe. that means it is less shit than c and c++. that is good enough for me

This. It used to be that you could run a trillion shitty Flash animations on a 2GHz Pentium.

Would have said Palemoon since Firefox extensions still worked, but now the developer is acting like a Cuck like all Swedes.


Take your commiefornia language and go back to whatever Gloryholes the three of you faggots left.

Just disable Javascript. Boom, no ads.

Many languages that aren't shit are memory safe. Rust, on the other hand, doesn't scale. Even relatively small projects for a systems language, like Servo, have unbearably long recompile times.
I'm really not sure why Rust exists, as the thing it was supposed to improve on was the tooling and process of Ada development, yet it went backwards in every way.

You can ask that about most programming languages. Every post-70's language is just some DSL meme faggotry.

He's not a cuck, he got bribed by the GOOGLE

most languages that are memory safe are so because of garbage collection.
lol

Hi CS Grad. The length of the build-run-test cycle basically defines programmer productivity. If I have to wait an hour for a recompile my workflow is going to look more like debugging a Mars probe than just adding 1 and running make check.

Back in the days of shitty mainframes, people waited a whole day to compile their programs, and they had no syntax checkers and no linters. You won't die from waiting a few minutes. If you want a write-compile-debug loop that badly, you can structure your project into crates.

Rust is unmaintainable, but not because of compile times. That's a shitty excuse considering they are working on it, unlike the retarded syntax or the batshit insane typing verbosity.

lol. keep on larping

Install w3m

If you need to check that your code still works after "adding 1" somewhere, you are the problem.

adblock is a meme
also see

that's not a thing in the discourse of programming languages, unless we're talking about Prolog

this

C compiles slow
Java takes a long time to start up
what does that leave us with that has non obfuscated binaries? Python?

I am pretty sure you haven't used Rust outside of toy programs, which would make your post very ironic because unlike you, I have used Rust.

Build tools.
Build tools.
Bindings for the OSX GUI.
Windows build tools.

No shit, and progress was glacial.
The authors themselves don't even have that working adequately for Servo so it's a meme solution in a meme language.

C's fine. The time it takes to recompile is proportional to the size of the change made. That's what allows us to easily scale development as projects grow. An inability to scale development is the defining trait of a toy language.

what kind of autism is this

RAM is the cheapest component. It's so cheap that software can expand into it, cuz every1 can afford a lot of it
futile

The autism that allows one to work on a kernel module without having to spend all day planning the compile like it was going to be done via punch cards?

your answers amount to "change the internet," not "this is the best browsing solution"

lol. i have definitely written more and bigger things in rust than you


wrong: github.com/servo/servo/blob/master/docs/ORGANIZATION.md
also pic related
it is the same for rust

The pose of her fingers on the ground is very deliberate. You need a head for that.

Valve actually got it to run on Xbox, so that would be 64 or even just 32 MB, I don't remember off the top of my head.

the source engine is tight as fuck
youtube.com/watch?v=3o1A16NyrEI

Lots of that is likely in the build scripts and source tools.

Maybe she positioned them before they chopped the head off, and that nigger is humping her for days now so she entered rigot mortis. Maybe he continued until she was nothing but dust.

Anime is not Holla Forums.

That's standard dynamic typing.
x = {a:10, b:'abc'}; y = x + 1
Now y is the string "[object Object]1".

You are still the problem if you write garbage code like that in the first place. It's really even worse than worrying about a possible overflow or logic error somewhere, which is what I initially assumed the problem was.

...

Good excuse for filling the caches with useless shit and then having to wait >100 ns for a single memory access, repeatedly.

Yesss, finally Holla Forums learned the basics.

$ qlop -g rust gcc | ack $CURRENT_YEAR': \d{4} sec'
rust: Fri Jun 16 20:14:55 2017: 6125 seconds
rust: Wed Aug 9 17:54:45 2017: 6753 seconds
rust: Sat Aug 26 19:17:36 2017: 6039 seconds
gcc: Thu Feb 16 20:05:49 2017: 3362 seconds
gcc: Sun May 7 19:45:44 2017: 3345 seconds
gcc: Sun May 7 23:13:00 2017: 3834 seconds
gcc: Tue May 16 19:23:55 2017: 3943 seconds
gcc: Fri May 19 16:16:24 2017: 1035 seconds
gcc: Tue Aug 8 23:38:35 2017: 4707 seconds
gcc: Wed Aug 9 05:29:29 2017: 1122 seconds
yeah sure, suck a dick

I totally agree with this. I believe that technology is moving too fast for us to have a chance to optimize the styles currently in use. By the time optimization really comes to mind, we are already trying to put together something new.

It kind of sounds more like a reluctance on the part of companies/programmers, thinking "oh well, they are just going to buy another computer, so I can leave it like this". Back in the day when things were really limited, everyone had to work with the mindset "we only have so much".

Now, however, maybe because memory is cheap as fuck, we can be more neglectful about the sloppy crap that is served to le people. Quite a cycle..

Maybe AIs will fix this problem for everyone in the next 6 years by searching for better optimizations missed by humans with "neglectful tendencies". We are going to be fucking replaced.

you're onto something

The other day I was using my mom's computer, which only has 4 gigs of RAM, and Firefox suddenly started lagging to all hell. I checked its usage and it was using 2.5 gigs. I was only browsing here and listening to a YouTube video. I'm starting to unironically lean towards text-based browsers.

Have you tried it, nigger? It still takes forever.

That's not what being 64-bit means. I could give you a 64-bit processor with a bus 1 bit wide and it'd still be 64-bit, because the reason it's called that is that it can address memory addresses 64 bits wide (which means having registers that big).
Next time, don't make an ass of yourself before posting.


Try this:
%define offset_char 0
%define offset_pointer_next 1
mov rax, [rdi + offset_pointer_next]

So was the 386 a 64 bit processor because of the FPU registers?

>>>/g/

...

...

Did it use the FPU register for addressing memory, dumbass?

No, he's saying it's simultaneously 32-bit and 48-bit, and he's also saying that registerless processor architectures have no n-bit categorization at all.

Mac Lisp required 4MB each of RAM & HDD, which cost thousands of dollars at the time:
thefreelibrary.com/MACINTOSH COMMON LISP 2.0 SHIPS-a012458182
scribd.com/document/45488252/Macintosh-Common-Lisp-Reference-for-Version-2-0
Whereas contemporary C environments on the Mac, like Symantec Think, ran in under 1MB, there were other C environments for 8-bit platforms, and of course compiled C object code requires no runtime.

GC is bloat, period.


Fuck you. If your software doesn't compile clear past cheats like interpreters, bytecode, and garbage collection, with as many optimizations as possible, you're wasting the time and money of your users (including your own, if your personal hacks are something you use frequently rather than just tinker with).

If you want to do rapid prototyping, use a higher-level language, and identify performance-intensive parts to rewrite in lower-level languages after your overall design is stabilized.


64 MB, shared with the GPU as VRAM at that. There were also traced-out pads for a 128MB mod, supported by some homebrew.


This is absolutely true, except for three things:
1. The level of RAM waste is reaching absolutely insane levels. Eating up hundreds of megs EACH just to render a text file with some simple scripts, when older systems used kilobytes for the exact same results, is wastefulness on a scale where working with multiple simultaneous documents AT ALL could soon become impossible, even with terabytes of SSD virtual memory. Developer entitlement and user submissiveness are reaching potentially catastrophic levels; this can't continue.
2. The parallelization problem isn't as new as many now claim, even by the late '90s the writing was on the wall and OS/IDE/CPU vendors were trying to force multithreading down the throats of app coders. By the mid-2000s, even Intel had dropped all pretenses about clocks and IPC, admitting that multithreading was the only realistic path forward.
3. Parallelization isn't the only problem of its kind, adoption of vector instructions, GPGPU, and FPGA resources by app coders is laughably behind schedule compared to (for instance) the rapid adoption of FPU & MMU instructions when each developer's platform gained access to them in the 1980s.

2000-era Flash has less overhead than the JavaScript abortions of 2017.

1GB Ram Vs Modern Computing
hooktube.com/watch?v=f-hiEF0HjTo

You realize Core2 was 10 years ago, right? 1GB wasn't even standard, unless you're talking about Apple.

A 68000 is and will always be a 16-bit processor, even if it can address 24 bits of memory and is internally 32-bit. The data bus is the determining factor.

Current Intel processors have 48-bit memory addressing as well. You could try to be more wrong, but I don't think it's possible.

I honestly couldn't be bothered researching or checking the horseshit I wrote down because it doesn't matter, I'm anonymous.

yeah but now tonight when you go to sleep you're going to have to think about how you're a retarded cringey faggot

Posting from Abaco. There are still low-memory browsers.