ITT: We correct W3C

Since Tim Berners-Lee basically opened his ass cheeks for big corporations and the web was already pretty screwed anyway, is it time for us to replace the W3C?

Other urls found in this thread:

whatwg.org/faq
tools.ietf.org/html/rfc2119
whatwg.org/code-of-conduct
citizencodeofconduct.org/
docs.racket-lang.org/html-template/index.html?q=shtml
ronja.twibright.com/
web-artanis.com/scheme.html
web.archive.org/web/20170525152209/https://forum.nim-lang.org/t/2332
github.com/domgetter/NCoC
kernel.org/doc/html/latest/process/code-of-conflict.html
contributor-covenant.org/version/1/4/code-of-conduct/CODE_OF_CONDUCT.txt
blog.plan99.net/its-time-to-kill-the-web-974a9fe80c89
w3.org/TR/SRI/
wiki.mozilla.org/Electrolysis
media.urbit.org/whitepaper.pdf
youtube.com/watch?v=QH6xgVRdfX0
youtube.com/watch?v=g1qroWiZF90
youtube.com/watch?v=eM4J7ljCExM
stackoverflow.com/a/34984276

It's time to replace the web, not the consortium that standardizes it.

If you can get normies to use your implementation of whatever autistic standard this board is capable of coming up with, sure.

these guys:

whatwg.org/faq

If that "gif files shouldn't be allowed" discussion is anything to go by, you can bet your life on that never happening.

Why would you want to have normies in the first place? It's not like they produce any useful content anyway. You'll also attract corporations pushing ads to the "new web", so you'll basically have the same thing we have now. People will get bribed and sell out if enough money is involved (sir/cuck Tim Berners-Lee); nobody can be trusted 100%.


Why don't we just build standalone applications? Web technology is bloated (support for lots of things only a few sites use, but every browser has to include them anyway) and most of the time you have to fight it to achieve anything. With standalone applications you can do whatever you want.

What we should do is remove javascript and expand HTML. HTML should have features that allow, for instance, loading more content and replacing old content when a button is pressed or you hover over something, and changing CSS classes on a button press or a timer, so you could make interactive elements without scripts.

To be honest I'd rather just start over and rebuild the entire web standard from scratch. HTML is bloated shit and the syntax is gay, CSS is broken and makes no sense at times, and javascript should be replaced by userscripts (see: if you want the website to work like an app, you must install a script for that site).

YES
Key words for use in RFCs to Indicate Requirement Levels
tools.ietf.org/html/rfc2119

Standard 1: The web browser shall render text.
Standard 2: The web browser shall render non animated images.
Standard 3: The web browser shall not under any circumstance execute code.

THOU SHALT NOT EXECUTE CODE IN THE CLIENT


It's dead
whatwg.org/code-of-conduct

>citizencodeofconduct.org/

Keep your greasy hands off my internet.

Web
What ?

Separate your fucking concerns. Right now, HTML acts as model and view, which must be overridden with a thousand hacks with CSS, and sometimes even as controller. The equivalent of HTML should come in two parts: a data source and a layout. Content, and where it should be acquired from (the datasource document may specify downloading data from an API), should have nothing to do with how it is presented; at best the layout file should reference where to acquire the data from, like in a virtual filesystem with paths and all that jack. I would also argue theming should have nothing to do with layout, so I would separate the equivalent of CSS into colors and other non-vital visual aspects versus UI layout.

Also, be fucking consistent. HTML is all over the place, and I can't fucking believe most input elements aren't all under the same input tag considering the importance of CSS selectors today.

Lispweb when?

Hmmm. So you're saying we could put all the website content into some kind of data object like YAML or TOML, then use a CSS type language to place all the elements from the data object onto the page.

This is basically how a lot of blog engines work. I use hugo for my website. I write my blogpost in markdown with a YAML 'frontmatter' and then hugo spits out the entire website. I use a custom template that is 100% static and free of client-side scripting.
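For reference, a minimal hugo-style post looks something like this: a markdown body under a YAML frontmatter (the title, date, and fields here are just illustrative):

```markdown
---
title: "Example post"
date: 2018-01-15
draft: false
---

Body text in plain markdown. At build time hugo merges this with a
template and emits static HTML, so nothing dynamic ships to the client.
```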

Very important: code must not be tolerant. If something does not conform to the standard the entire page must not be displayed, instead an error message is given which states what the error is and where it lies.

As it stands currently, in order to be compatible with all the sub-pajeet tier web pages, browser engines have become such a clusterfuck of hacks that it's now impossible to write a new engine unless you have massive financial backing.
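As a sketch of what "not tolerant" could mean in practice, here's a toy validator (Python, invented for illustration; it ignores attributes and void tags) that refuses to approve a page unless every tag is properly closed, and says exactly what the first error is and where:

```python
import re

def parse_strict(src):
    """Refuse to render anything unless the whole document is well-formed.
    On the first violation, report exactly what and where, then stop."""
    stack = []
    for m in re.finditer(r"<(/?)([a-z]+)>", src):
        closing, name = m.group(1), m.group(2)
        if not closing:
            stack.append((name, m.start()))      # remember where it opened
        elif not stack or stack[-1][0] != name:
            raise SyntaxError(f"unexpected </{name}> at offset {m.start()}")
        else:
            stack.pop()
    if stack:
        name, pos = stack[-1]
        raise SyntaxError(f"<{name}> opened at offset {pos} is never closed")
    return True  # only now may the page be displayed
```

Compare that to what engines do today: silently repair the tag soup and render something anyway, which is exactly how the clusterfuck of recovery hacks accumulated.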

Something similar to that, but not exactly. Most modern websites don't inline their "dynamic" (as in, not hardcoded) content because otherwise it would be a nightmare; it's the server that hacks this together using scripts. I am talking about client-side templating, where the relevant content is in one file, the presentation is in another, and the latter just references the first. This would make localization extremely easy for one, but it would provide several more advantages: easy crawling, pretty much "incidental" API generation, easy as fuck restyling, sane preparation of contents server-side (basically, making life easier for both backend and frontend developers), etc.

Say, HTML currently describes a paragraph of text as a tag, with the contents of the tag inside. If you want to change the contents of said tag at any time, you must be able to inject the contents by placing "yourself" (probably handled by your framework or templating engine, but this is irrelevant to the generated file your clients will read) in the middle of those tags. If we separated the concerns, said tag may just have a pointer to a virtual filesystem generated by the datasource file, i.e. (please ignore the fact the example was XML-like). The guys at the backend generate that document describing the contents of the web, and the guys at the frontend just link to them.

In fact, if you want to go into more powerful behaviours, you may indeed define a layout mode "list" where you could just reference , and it would read the contents of the 795509 "folder" in the datasource file and display them one after another with the layout "post" defined in the components subsection of this layout document, which may or may not be in the same file (you could just reference it in this file as an external file as an "include" to allow for better caching, then the virtual filesystem would aggregate all routes to compute the actual layouts). The post layout would then reference its contents with relative routes, so it would read like .

There, the web but slightly less insane.
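A toy sketch of the idea (Python; every name and path here is made up): the datasource acts as a little virtual filesystem of pure content, the layout holds nothing but element kinds and paths into it, and the client does the joining at render time:

```python
# Hypothetical "datasource" document: pure content, addressed like a
# virtual filesystem. The backend generates only this.
datasource = {
    "posts/795509/title": "Replacing W3C",
    "posts/795509/body":  "Since the web is already screwed anyway...",
}

# Hypothetical "layout" document: pure presentation, holding only
# element kinds plus paths into the datasource. The frontend writes this.
layout = [
    ("heading",   "posts/795509/title"),
    ("paragraph", "posts/795509/body"),
]

def render(layout, datasource):
    """Client-side join: resolve every layout reference against content."""
    return [(kind, datasource[path]) for kind, path in layout]
```

Restyling means shipping a new layout with the same paths; localization means shipping a new datasource under the same paths; and the datasource alone is already a crawlable, API-like view of the site.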

I wasn't trying to imply that's exactly what a blog generator does, just that it does something similar. Unless I'm totally missing something, we are on the same page.

Now all we need to do is make it a lisp.

A man of culture, I see. S-expressions are probably the best tool for the job.

I worked on an s-expression based language that transpiled to HTML for a while. Didn't manage to get too far because Rust is verbose as fuck for stuff like this, but maybe I could post some design documents for the "enriched" s-expression syntax that allowed atoms to be tagged with special meanings. The language was named HOTSEX, for HTML Over Translated/Transpiled S-EXpressions, so I imagine that if we call this language something stupid like Hypertext Over Tight/Tiny S-EXpressions, we should avoid going too mainstream/enterprise, which was exactly HTML's problem.

...

The web shouldn't be allowed to run scripts behind your back and do arbitrary shit, but it should be possible to enhance a particular website manually if you visit it regularly, and it should be possible to use a language that isn't javascript. Disabling JS isn't the answer because it'll immediately break 90% of the internet; even many tech and hobbyist sites break if you don't allow javascript.

The main reason people add scripts to begin with is because there's no other way to get some basic features like toggling things on the page, but it quickly escalates into a massive pile of cancer because there's nothing stopping it.

I'm also not going to install a special snowflake app for browsing Holla Forums and deal with the retarded shit that its developer did, when I could just add a simple script to my browser to get the feature(s) I want.

This. Please don't make things unnecessarily complex. All we need is a way to create simple text/image documents that can be retrieved remotely. Why not use something simple like markdown? CSS or any other layout language will only attract design hipsters (muh coffee table with macbook air jumbotron in sepia colors with blur scrolling bullshit); focus should be on content, and you can always just include an image if you need more advanced design.

If you need more control over how content is displayed/retrieved, create a standalone application and define as much complexity as your heart desires (but make it GPL so I can check that I'm not running botnet). For instance: create an imageboard using separate client/server applications, and define data/layout models in a way that makes the most sense (you're not restricted by any DOM or other framework that tries to solve everything). If you make your server free software, people can just host a new server and add its address to the client to get new boards. KISS please; I know some of you are really smart, but sometimes stupid simple solutions are best.

at this point, we'd better make our own global community mesh internet with foss hardware and free access for every node owner, but governments and corp soup prolly won't let us

Why not start with creating a new browser with very limited capabilities.. Doesn't it all boil down to what the browser handles/understands?

How about a new web where any site that is publicly listed (for this browser) can be accessible, replacing the need for search engines in some manner? With text-based pages the overhead will be small and no ads might be needed (unless it's to promote other friends' websites).

Imagine how we have online games with their server lists.. Similar concept but for the web on publicly visible sites and maybe descriptions..

Sites can be connected to this new lens that can replace the need for search engines in some manner... Still need to implement a way to search.

Should we start a github project for this venture in general? Should /prog/ know?

If we want to build something we all have to agree on what the standard should be, for the most part. I say text-only is a good start.. With simple images.

Pretty much the web before it became bloated- but also more modernized to our liking. Should we look to the past for some physical examples visually?

lel. this is high quality LARPing
keep it up

It sounds a lot like we are trying to go backwards in time.. What can we do right now to organize this effort to actually make it become a reality?

Every web browser must be similar to a remote desktop client. All fancy graphics is rendered server-side and then transmitted as lossless video to clients.

Yo dude just make vector hypertext fidonet a reality already.

Why on earth would you use Rust for that? Pick a Lisp which already has an HTML templating library.
docs.racket-lang.org/html-template/index.html?q=shtml

I'm sure there are other ones as well, writing an S-expression -> [HT/X]ML converter is baby's first DSL.
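To illustrate just how small that DSL is, here's a hedged sketch in Python, using nested tuples to stand in for read S-expressions (a real version would bolt an actual S-expression reader onto the front):

```python
def to_html(node):
    """Convert a (tag child ...) tuple tree into an HTML string."""
    if isinstance(node, str):              # atoms are text leaves
        return node
    tag, *children = node                  # first element names the tag
    inner = "".join(to_html(c) for c in children)
    return f"<{tag}>{inner}</{tag}>"

# ("p" "hello " ("b" "world")) written as Python tuples:
doc = ("p", "hello ", ("b", "world"))
```

`to_html(doc)` yields `<p>hello <b>world</b></p>`. Attributes, escaping, and void tags are left out on purpose; the point is that the whole core fits in a dozen lines.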

This ronja.twibright.com/ fused with gnunet.

Doesn't artanis already do that ?
web-artanis.com/scheme.html

How is pic related? I think a browser like this would have everything we could need. Links either open pages or default applications. Pages would be simple markdown with images. Styles would be defined by the client only, no CSS.

Fuck off with your bloated botnet.

Image rendering isn't bad.
Video or animations on the other hand are problematic.

The web browser should be an 8-bit virtual machine, and web programs/pages should be assembly coded for this 8-bit virtual machine. C should be the language of choice for this new internet.

s/non//
Otherwise get out.

Okay, but what's the rationale for having animated images? Supporting animated gifs because muh kek lol lol videos is not a good reason.

Why the fuck is this even allowed? My ophthalmologist says fuck you.

Some of us are not living in basements and have plenty of ambient light. This is only an idea draft. If somehow Holla Forums starts to build something we would probably have a vote on the default color scheme, or just read .Xresources.

I vote #F0F background with #00F text

How do you plan on handling forms?

Eh, that's kind of what I have in mind. But I have to agree on not having the same shitty standards. White is just an awful color for a background; it's too bright.. A color closer to black would be preferred as a default imo.... Or maybe as specified in the settings? Like first-time users can specify how they want it to look?

Voting can work but we need to be serious about it.. People are always throwing stuff off.

I think we should take some of the things we like from HTML but ultimately to make our own syntax for this shit would be nice.
I don't know how everyone else feels about the way html pages are written and designed but we should consider tossing it altogether and making something we all really like. It is for us after all..

Oh and maybe we need to solve the problem on how to better organize and arrange data on the pages from the start without having to rely on any bloated code. Or should orientation of how things appear not be important? Picture bootstrap when I am describing this...
We need to determine to what extent the new standard should work out-of-the-box for ideal non-bloatiness.

Just kill yourself already, you fucking degenerate.

What are you doing here, on an imageboard??

>implying CoCks aren't for political leverage.
We need projects with a 'CoC' that says "Every contributor is assumed to be an unambiguous heterosexual male of European descent, and all ambiguous use of pronouns shall refer to male sex only."

b...but the w-whole point is... to k-keep them out

JavaScript is (((Lisp))) with different syntax.

Why not also replace the "tag" syntax with s-expressions? That should make things slightly more light-weight.

You just went full suckless. Never go full suckless.


I agree Lisp would have been better. Fuck, JavaScript would have been better if only to avoid emscripten fuckery. I just wanted to try out Rust and build a Lisplike abomination.

At this point you might as well build the software of the dynabook if you want to start over.

You're just like those fucktards who want to replace everything with C syntax. OCaml has to become "Reason" so they can understand it. Fuck you.

Why would you replace everything? The thing that could have fixed the web back when we had HTML 4.01 was better iframes, so that we wouldn't have used XMLHttpRequests for all kinds of shit.

The rest of the modern shit-web should have been plugins.

I think a good way of "encouraging" the web in the right direction is a websearch with a benchmarking engine, so that smaller and faster websites (not download speed) would be rated higher. That way site owners would look to hire better developers instead of SEO people.

I say remove features from the standard so we force web devs to not bloat websites. So we basically have text + image + webm only.

No, he is right. S-expressions make sense to represent tree-like data, probably more so than XML, considering arrays/lists in XML are a hack, and making a distinction between text leaf nodes and non-leaf nodes requires you to do some data structure gymnastics (generating virtual nodes, which isn't really hard but it's retarded) even though this should be trivial. Also, XML is uncomfortable to write by hand, and so is using most XML generation libraries if you still have to define the tree by hand. S-expressions would reduce the amount of keystrokes required BY FAR.

I was thinking that there would be no forms. For basic tasks such as search engines, data could be passed in the URL (searchengine.tech/query). If enough people would like to have forms, we would implement them. The key thing here is to think about what components are really necessary to avoid bloat. I would much rather see dynamic content sites be standalone applications, but if people feel like they should be integrated, they will be. My standpoint is to keep the "new web" for document/information sharing only (wikis, software pages, documentation, guides, static files, anything not generated by users).


Keep ideas coming. Don't worry about looks, we could easily add client customization.

XML-based markup definitely needs to go; it's too painful to write by hand. It should be replaced by a simple markup language similar to markdown, with additional tags to mark external content which would call applications set by the user (or defaults, or specified on first run, or something).

What we all need to do before writing any code is to examine all possible options for approaching this problem and choose the best solution, so by all means keep providing ideas. As I stated many times, I would like to see the web be as simple as possible (text and images) and create standalone applications for sites like imageboards. If some of you want to create complex tree-based document structures I'll help however I can, but it seems like it could end up like the DOM with different syntax. Are multiple levels of depth for layout really necessary? Can't content just be linear, one thing after another?

I don't see why not. Do we have any clear understanding of what we want to see and have from such a venture?

Just implement Xanadu and be done with it.

For me it's text with basic styling, images, and the ability to launch predefined external applications by clicking links. Gopher almost feels like what I want, but it's not exactly there yet; maybe extending it could be a good idea? I need more of your input though. Tell me what you would like to see in the "new web".

content should be simple to the point it can be assumed the user is the one choosing the style.

I was all for freedom of expression and shit. But I think "web devs" and designers have already shown their commitment to expression.

User chooses the stylesheets, websites don't. Linear content. Content, not "fixes" and "hacks". Everything simple.

This is exceedingly difficult unless you castrate javascript and json. If they can't just load a third-party script in the head by manipulating the DOM or using ajax, they'll just append the secret sauce to the local script.

If this is the direction you want to go, ditch javascript. Ditch browsers. Start replacing web-apps with stand alone software.

HTML is not XML and it's not a hack in HTML.
<li>abc
<li>abc

Gimme a few hours. I got a headache. I'm taking a nap. You guys keep discussing for now ~_~
How would protocols look? You can discuss that maybe. idunno

you not closing those li tags is hurting my autism

Not at all. Right now things seem pretty bad with the DRM thing, but don't let that fool you: now it's DRM; ten years ago it was Flash, Java, Silverlight, Microsoft's proprietary extensions, etc.
Ads were still everywhere, adblocking was in its infancy, and IE-only sites were still a thing.
Corporations will always try to fuck up the Internet, but we'll still have good forums, wikis, IRC, torrents, etc. to fall back to.

Then don't use it. Nobody is going to miss you or your retard friends.

remove everything in HTML except p,b,i,table,a,img,video (and video no longer has options to let the author control how the video is played, has controls=1 by default, etc), and a few others; remove head and shit. remove all this metashit, including title, which doesn't belong in HTML
remove pretty much everything in HTTP
remove DNS/X.509, use actual secure (defined by zooko's triangle) links
remove js, remove css
remove about 9 million other things, cant be bothered to list

(cont.)
remove HTML itself as well and replace it with something that can't be parsed if it's not valid syntax, such as binary, prefix coding, etc
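A hedged sketch of what "can't be parsed if it's not valid" looks like with a length-prefixed binary node (the format here is invented for illustration: a one-byte node type, a 4-byte big-endian payload length, then the payload). A reader either gets exactly what the length field promises, or rejects the whole document; there is no tag-soup recovery mode:

```python
import struct

def encode(node_type: int, payload: bytes) -> bytes:
    """Emit: 1-byte node type, 4-byte big-endian length, then payload."""
    return struct.pack(">BI", node_type, len(payload)) + payload

def decode(buf: bytes):
    """Strict reader: a truncated or overlong document is simply invalid."""
    node_type, length = struct.unpack(">BI", buf[:5])
    payload = buf[5:5 + length]
    if len(payload) != length:         # truncated => refuse to render
        raise ValueError("truncated document, refusing to render")
    return node_type, payload
```

Prefix coding like this also makes skipping unknown node types trivial (seek forward by `length`), which is how you version the format without a recovery heuristic in sight.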

Poettering spotted

I think there should be at least some basic style options like the background and foreground color, the font, and link colors. I just can't stand black text on a white background.

We should have forms stored in separate files much like how gopher handled search forms. Only you can add more fields.

plain text is the most retarded shit ever invented. the UNIX way is shit (and so is systemd; lack of UNIX way compliance doesn't make something good)

so literally change just one line of code/config in your shit

Your post is the most retarded shit ever invented. What now?

Holy cow, this. It would keep the normies away for sure.

And everyone else too.

And you're the one saying that people should kys.
Who's more civilized now?
Listen m8, we all know what CoCs can't do, and that's forcing people to behave a certain way in order to change them.
That's what CoC pushers believe, that it changes people by forcing them, but it doesn't.
Add a CoC and tell me what magic will stop someone from misbehaving?
Nothing. Whether you have a CoC or not, people can still misbehave.
The nim community understood that a long time ago
web.archive.org/web/20170525152209/https://forum.nim-lang.org/t/2332

The only thing that we can do to help people (and not to tell people what to do) is to make a guide to expressing themselves on whatever chat system, to enable better constructive work, and of course to correctly manage/close shit that goes off-topic.
It's the only way possible without being a power tripping faggot that forces political views on others.
We can inspire ourselves from multiple sources
The no-CoC
github.com/domgetter/NCoC
Linux kernel:
kernel.org/doc/html/latest/process/code-of-conflict.html

>github.com/domgetter/NCoC
This is already agenda pushing, it's basically saying "look at how tolerant we are towards diverse people!"

The only thing your CoC needs to say, if anything at all, is "Behave yourself" or "Don't be a dick" or something along those lines.

...

ACTUALLY, it's part of an HTML document; it contains hyperlinks and styled green text, thus it is not plain text.

No, you're just retarded.
No, that just means that it's something that isn't worth taking into account.

If you want a real example of what you've just suggested look at the contributor convenant:
contributor-covenant.org/version/1/4/code-of-conduct/CODE_OF_CONDUCT.txt
contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.

I agree with this guy. 99% of HTML is not parsed by humans but by browsers and spiders, and on the rare occasion that it is viewed by humans, the chances are it's obfuscated machine generated gibberish, since that is the case 99% of the time these days (even more true for JS).
As long as the binary format is just for creating structured documents and is not some executable code format (like webassembly), then it's not an issue.

His comment is fine because it is meant to be read by humans. Such raw data in plain text would still come along with the binary structural information, although I assume both would be compressed using lz4 in our standard.
Configuration files and log files are meant to be written and read by humans, and machines don't constantly pass them to each other (or at least they shouldn't) so they don't count either.
Essentially, using text for inter-process communication where the chance of a human wanting to read the message is < 0.000001% is inefficient and inelegant. The web falls under this category.

crawlers*
I just woke up 5 minutes ago

If I understand this correctly, the binary would only be used to structure the content of the website, and the content itself would be in cleartext.
The problem with binary is that it's like having obfuscated content.
Well, that's my perspective on binaries.

Obfuscation is useless anyway; it's not encryption. At most it's a little tiresome to translate, but it doesn't provide any security.

so exactly what we have now but even slower

I think we should avoid any debate over whether what we currently have with the internet is fine as it is. Looks like we are leaning more towards an alternative to current practices of accessing information..

Some of you express opinions that how the internet is becoming more influenced by corporate entities is ruining the experience in some manner or fashion. I think so too.

We should try to lean towards a direction that isn't influenced by money and has some guaranteed protections of WYSIWYG. I guess some concerns are of privacy.

What features in general/specific are you wanting to see and can you say if that still goes along with the notion/appeal of what we want to accomplish/use? Just trying to help guide the discussion for now.

are you actually going with that "argument", or are you just memeing?

We need a big push to remove arbitrary code execution from web viewing clients. It's the worst idea ever, it has always been a bad idea, and it has never improved. ActiveX, Flash, Javashit. All cancer.

Move that shit to the server. Oh what? You really don't need to execute all that bullshit code to render a fucking web page? NO SHIT. When it's the server's resources at stake, all of a sudden non-retarded decisions about code execution get made...

For pages that actually need dynamic functions and code, suddenly there is a demand for efficient, multi-threaded skilled programmers. Javashit monkeys need not apply.

The only way to make this happen is with fire. Massive, painful compromises of javashit engines and web browsers, ad servers, etc. It has to be painful and it has to be wide spread. It has to be prolonged. Make the dream reality Holla Forums.

I just want this to stop.

Makes perfect sense but how does that look in terms of resources being used on the backend/server? Can we handle it?
I understand efficiency will make it more possible but can we look at any real examples if there are any? How does this look when it is handling thousands of users and at what complexity? Is it more susceptible to abuse?

This was web development pre-JS nonsense, aka about 10 years ago.

this nigger gets it

no.
long story: most shit just generates text and spends 11523518751346345781251578223508175 of your CPU cycles to make it "look good". there are cases where it actually makes sense to do computation on the client, but that's largely outside the realm of serving 3 paragraph text files

If that's the case then we are also focusing more on standards and practices. Like I said, we need to specify and agree on what extent is acceptable/allowed. I'm considering that maybe we can have it set up in such a way that a site/page gets rejected if it isn't efficient enough, but would people find a way to bypass that? Hence why what is and isn't allowed needs to be considered, period.

We need to put together "a list of demands" that can express clearly what is needed. Once we can understand and sort that out maybe some work can go under way.
Since we are trying to redefine something more than merely the browser, this is going to take a while and we need more seasoned programmers here to discuss this properly while providing feedback. Security, privacy, functionality, efficiency.. all of this is going to be important in the matter. Can any of us start pulling in more people with the right talent who might be interested to discuss this?

All the "kill JS" and "no client-side scripting" sentiments are agreeable to a certain extent: JS sucks, and it's being abused horrifically even in terms of what it was originally designed for. On top of that, the sentiments about turning most modern "web apps" into proper standards like eMail/USENET/IRC/FTP/Gopher/etc are also obvious (and sort of already happening in mobile platforms' walled gardens).

But a thought occurs to me: PostScript is both a document format and a Turing-complete language, why isn't it so cancerous? Could something similar to it be used to make usable rich documents?


Isn't W³C dead in favor of the cancer, right there? The idea of maintaining any kind of standard for browsers to target is long, long gone. Now, browsers are officially SUPPOSED to make up standards, and WHATWG's actual job is just to scribble them down. Worse, with everyone except Microsoft and (as they attempt their hardest to commit suicide) Mozilla all using the same Konqueror-derived engines, there have been serious proposals to ditch standards entirely in favor of "the standard IS Blink's source tree".


That's how HTML was originally intended to function, completely separate content and presentation, in lieu of something like CSS for presentation. Of course, since browsers didn't originally have CSS, HTML was hideously abused using tables and spacer GIFs in the HTML 4 days. Once W³C compliance became a big deal for browser devs and web writers, the W³C sought to expunge these sins with XHTML.

HTML & XML are, of course, both descendants (and primarily subsets) of the older SGML. So tying them together into the same schema is a homecoming of sorts, quite logical.

QUOTED FOR FUCKING TRUTH!
Rich text shouldn't be scrawled by hand in markup, it shouldn't be puked across the internet and into storage as slow and bloated text, its format shouldn't be spec'd out without a functional rendering and authoring implementation.

Little-known bit of internet history: In the WWW as originally conceived by Berners-Lee, there was no such thing as a "browser", the early client implementations all had full GUIs for both browsing and editing, which continued into the W³C's testbed clients Arena and Amaya.

If I were to point to how web browsers SHOULD work, it would be like Framemaker, where both the format and the software are designed to lend maximum prominence to separation of content and presentation by simple boolean logic, all in a powerful elegant GUI that supports viewing and modifying all features.

This entire thread is a LARP anyway, nobody will ever use this retarded web 0.1 sub-netscape garbage you're shilling for.

Isn't it ironic, then, that you're replying to me when I'm making one of the least radical suggestions ITT?

You can't live action role play on the internet. If anything it's RPing.

Yeah, no. Are you out of your fucking mind? Do you want all document editing to happen in WYSIWYG editors or via libraries? Fucking Microsoft Office used to do this and they had to drop it for XML because it made no fucking sense.

Arguing about client side scripting, html, and css is pointless; the real issues are security, privacy, and anonymity. We need to use a different standard below these three technologies that eliminates the issues they create and attempt to address.

Instead of the web being cowboy coding and embrace+extend pseudocompliance writ large? Yeah, that is exactly what I want.


Even if the internet was turned into the distributed, anonymized, redundant, self-repairing system it should be, it would still just be a network. And if that network was still being used primarily to access a mutant document format pozzed by pajeets into a JS-powered pseudo-OS in every tab filled with layer after layer of cancer, we would still have to fix HTML.

Fixing the internet and fixing the web are two separate problems.

this fbh fam.
how can we create a new internet from the ground up using existing infrastructure? has it been done? is it TOR?

tonight at 11

W3C are idiots
think about what they've done with the tag

no wait

no no it's only
it's self-closed this way, trust us, because of S E M A N T I C values and bla bla

This is already happening. Normalfags hardly actually use a browser, they just use apps on their phone.

True, but the problem is all of them are proprietary and closed, much like the walled gardens such apps are distributed through. This contrasts poorly with older special-purpose apps like USENET, IRC, FTP, Gopher, etc.

A perfect example is Skype: while never as open as an independent standard like H.323, it originated as a P2P architecture for actual conversations, one with an official API for 3rd-party clients, and it maintained compatibility from 1.x clear to 3.x of its 1st-party client. But then they switched to a server architecture for everything, killed the API, and have been herding everyone onto the latest version, orphaning older platforms and dropping features from older clients in the process.

You could have something that compiles your hand written xhtml into a binary format if writing hideous markup is so important to you.

Here's a good ripping apart of webapps that was posted on HN.
He pretty much ignores how HTML is still broken even for its original structured documents, saying it's good enough.
blog.plan99.net/its-time-to-kill-the-web-974a9fe80c89

They are not separate problems in the sense that in order to fix the web, one must first understand what the new web should be about. Fixing the internet might mean using something like GnuNet, assuming something like that is even theoretically scalable in practice. However, server-side scripting would not work in such a scenario, since most nodes in a P2P network would be just dumb nodes that transmit information.

How would one, for example, build a perfect web store? The P2P model probably wouldn't work very well, since server-side processing would probably be necessary. On the other hand, web sites that only serve static data could easily be handled in a P2P fashion.

Should we have separate protocols/standards for web services and web data, where services would be client-server based, and data would be P2P? Would these two standards compete with each other? If so, the P2P standard would probably die, because it would, in a way, just be a subset of the client-server model. As a result, our beautiful anonymous, private, secure, resilient P2P network would die as well, since the main standard supporting it would no longer be there, and normies don't give a fuck about privacy anyway.

FUCK YOU. NEW WEB: XHTML5 ONLY

This.
JS was made by a jew to destroy the internets!
CSS and HTML expansion for clicks on elements, loading of nearly infinite content (like comments or search results), or updating a chat without reloading an entire embedded website are necessary.
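Part of this exists already: HTML's `<details>`/`<summary>` element and CSS selectors like `:checked` give script-free show/hide today. A sketch of what works now (the id names are just for illustration):

```html
<!-- Script-free interactivity HTML/CSS already provide -->
<details>
  <summary>Show replies</summary>
  <p>This expands and collapses with zero JavaScript.</p>
</details>

<!-- The "checkbox hack": :checked toggles a later sibling's visibility.
     Clicking the label toggles the hidden checkbox. -->
<style>
  #more-toggle, #more-panel { display: none; }
  #more-toggle:checked ~ #more-panel { display: block; }
</style>
<input type="checkbox" id="more-toggle">
<label for="more-toggle">Load more</label>
<div id="more-panel">Extra content, shown purely via CSS state.</div>
```

Actually fetching new content from the server, though, has no declarative equivalent in current HTML - that is the expansion being proposed here.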

No embedding either. That can be done server side. And no cookies, because there's no JS to make them in the first place.

NO SCRIPTS. JS only got extended to geolocation, canvas, webgl and other security flaws.

We're talking about something a CIA-nigger like you can't understand. The possibility that someone could fry your GPU through a driver bug, in a DOCUMENT, is simply retarded.
JS has shit performance and all games that run on it suck too. Stupid browser games weren't what made Flash worth it (video streaming was).

I correct myself: there would be no harm in embedding, since there are no scripts, and HTML and CSS can both only work within their boundaries.

Maybe we shouldn't even have a "web" at all but rather separate networks for specific purposes with clients to retrieve and send data. Kinda like IRC clients or torrent clients, there are dozens out there but they essentially all do the same thing; connect to servers with standardized protocols. So you could have an imageboard protocol or something that narrowly defines what one is and sets up a standard template that can be customized to a limited extent, and clients can be developed for it by anyone to prevent monopoly.

Would you install a client for a specific type of imageboard?
But you're still making a good point. Lots of things should move back to external applications, like chat systems.
Inaccessibility is one of the main issues on the internet. So I propose adding the necessary stuff to HTML and CSS and slowly dropping JS support, like Flash.


out

security is out of scope for the web. the web is only useful for serving documents. the only security one could talk about is authentication of documents

postscript is fucking garbage

there are proper ways to do binary. i dont know about this journald, but it's probably shit like everything else. HTTP2 can't possibly be good because HTTP isn't good in the first place. plaintext of HTTP provides zero benefit even today. most browsers give you some sort of idea of what headers they sent / received, but you really have to read the source to know what's going on, or use wireshark, which will support binary just as well anyway. the ability to _construct_ HTTP using plaintext is also largely useless, since there's no way to know how to do it aside from being an expert webshotter and knowing how each server will treat your request. but i digress, HTTP is a shit protocol regardless of encoding

and all these "apps" are just a container holding a web document. that aside, phone software is shit. performance/aesthetics aside, they're full of ACL fail

this. even something as simple as an imageboard will never be a good user experience when implemented through the web. ignoring the lag caused by the web browser being a bloated piece of shit:

web:
if you just want to view a thread's images one at a time, pressing pagedown causes the viewport to appear in the middle of an image, then you have to press some other button to scroll up a small amount. you could go to the media window in Firecox, but that's still not ideal
normal software:
press "next image" button/hotkey to set the viewport such that the next image in the thread is at the top of the screen
hipster faggot's suggested fix: hurr durr it can just be a plugin or supported through JS
reality: web browser rebinding hotkeys is annoying as fuck aside from having security and privacy implications. web plugins are shit and buggy as fuck, and often freeze the entire browser because they literally do cooperative multitasking. said plugin will compromise every other site you browse, even when the browser tries to support an option to limit it to one domain, so you need a separate browser for each website

web: you want to authenticate the identity of a tripfag, but imageboard provides some non-cryptographic method of doing this
normal software: tripfag post has PGP signature attached to it, software automatically verifies the post during the 1 millisecond it takes to load the thread
hipster faggot solution: web plugin, same issues as above, crypto written in JS, which even when done correctly is retarded as fuck; and some other bullshit compromise solution
neckbeard solution: copy and paste signature into PGP. web page hijacked copy/paste (or simply had invisible metacharacters and effects even people with JS off) and now you pasted `muhsignature174812591257198175^C\nrm --no-preserve-root -rf */` into your terminal

imageboards could also use a DAG view, but i can guarantee you, if that's implemented in the web it's going to be shit.

forums like PHPBB, invbb, reddit, quora, etc should also not be web. they should just use crypto for authentication.
your viewer guarantees that any poster you see is authenticated
quotes can simply be some metadata specifying ranges of other posts. now people can't forge quotes either and you can follow the chain to know exactly what post is being quoted
you could literally implement this over a weekend if you just want to use a centralized server
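The weekend version sketches out like this, under stated assumptions: a centralized server, names like `sign` and `resolve_quote` invented for illustration, and HMAC standing in for the real public-key signatures a PGP/Ed25519 client would use. Quotes are ranges into other posts, not copied text, so they can't be forged:

```python
import hmac, hashlib

# Hypothetical wire format: each post carries a signature, and quotes
# are (post_id, start, end) ranges into other posts' bodies.
# HMAC-SHA256 is a stand-in for a real public-key signature scheme.

def sign(key: bytes, body: str) -> str:
    return hmac.new(key, body.encode(), hashlib.sha256).hexdigest()

def verify(key: bytes, body: str, sig: str) -> bool:
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(sign(key, body), sig)

def resolve_quote(posts: dict, quote: tuple) -> str:
    """Follow a quote range back to the exact quoted text."""
    post_id, start, end = quote
    return posts[post_id]["body"][start:end]

tripfag_key = b"demo-key"
posts = {1: {"body": "HTTP is a shit protocol regardless of encoding"}}
posts[1]["sig"] = sign(tripfag_key, posts[1]["body"])

reply = {"body": "disagree", "quotes": [(1, 0, 4)]}

assert verify(tripfag_key, posts[1]["body"], posts[1]["sig"])
assert resolve_quote(posts, reply["quotes"][0]) == "HTTP"
```

The viewer verifies every signature as it loads the thread; a forged quote simply fails to resolve to the claimed text.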

HTTP is not a shit protocol, or at least it is not on the same grade of shittiness as the rest of the web is.

Taking this seriously

Here's a strategic plan.

1. Everything starts and ends with the browser so take an open-source browser (say, Mozilla).
2. Replicate the build and test chain.
3. Document how it all works so newbies can hop in to a specific point without being scared off by a gigantic pile of code.
4. Build a system to keep importing updates from the corporations that are still spending millions on development.
5. Decide what we want to keep out and/or put in. Why are we forking?
6. Offer our platform to every other Mozilla fork to reduce the duplication of effort. We pull in from Mozilla core, they can pull in from core or from us.
7. Launch a subsidiary effort or two to modernize the code and leapfrog everyone else in technology. Doesn't Mozilla still use the same process for all tabs so they're all fucked if one hangs? Chrome fixed that years ago.
8. One of these modernization efforts will be to make it easier to develop alternative languages to HTML by having the browser incorporate something like antlr. If you don't like where HTML is going, you can make your own personal markup language, have a data file that describes it, and the browser will interpret it without needing to install a plugin. JS and CSS could be fed through the same sort of thing.
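Point 8 can be illustrated with a toy: a "personal markup language" described purely by a data table and interpreted without a plugin. A real browser would embed a grammar engine (e.g. ANTLR); the `grammar` dict and `render` function here are invented stand-ins for that data file and interpreter:

```python
import re

# Hypothetical "personal markup" description: a data table mapping the
# user's custom tags onto standard HTML elements.
grammar = {
    "h":  "h1",
    "em": "strong",
    "p":  "p",
}

def render(src: str, grammar: dict) -> str:
    """Translate [tag]...[/tag] markup into HTML via the grammar table."""
    def open_tag(m):  return "<%s>"  % grammar[m.group(1)]
    def close_tag(m): return "</%s>" % grammar[m.group(1)]
    src = re.sub(r"\[(\w+)\]",  open_tag,  src)   # opening tags
    src = re.sub(r"\[/(\w+)\]", close_tag, src)   # closing tags
    return src

html = render("[h]My page[/h][p]Hello [em]web[/em][/p]", grammar)
# html == "<h1>My page</h1><p>Hello <strong>web</strong></p>"
```

Swap the table and the same document renders under a different vocabulary; that's the whole idea of shipping the language description as data rather than a plugin.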

Web should be for serving documents, nothing else. Want an interactive app? Use telnet or something.

W3C, When Power Goes To Your Head

There is a bunch of problems in this thread, but I think some of them can be treated separately.

1. Personally I'm in no rush to replace HTTP. Sure, you can replace it with something slightly different, but what would the benefit be other than a small performance improvement? Anyway, provided you can convert back and forth between the two, it's OK.

2. Does anyone know of a tool to simplify HTML trees? As in remove useless intermediate nodes (such as aesthetic divs, etc). Or remove specific tags by request (script comes to mind).

3. As far as I am concerned layout, colors etc are a presentation problem. The browser should be able to ignore all server side presentation bits. In firefox you can disable CSS or fiddle with browser.display.document_color_use, browser.display.use_document_fonts, permissions.default.image. But none of this is immediately usable.

4. There is significant pressure to make websites conform to certain design expectations: ajax popups, endless scroll, etc. If you want to move this forward you need back pressure - go and register javascriptkillsmymobile.io and start a campaign to list sites you hate. BTW, the sites you hate are also the ones that work really badly for people with accessibility problems.

5. The closest thing we have to an easy to hack browser is webkit. Chromium is a pain to get and build from source, as is ff. Servo is not usable. We need a toolkit to build browsers that allows us to externalise features (video, audio, webrtc, WTF-web-3.0). webextensions are not enough.

6. For security, w3.org/TR/SRI/ fell short of what we really needed, i.e. the ability to add integrity tags to arbitrary external resources AND to have multiple URLs as references to the same resource.
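Point 2 is doable with nothing but the standard library. A minimal sketch (the class name and tag sets are my own choices) that drops `script`/`style` wholesale and unwraps purely aesthetic `div`/`span` wrappers, discarding attributes for brevity:

```python
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    """Re-emits HTML while dropping or unwrapping requested tags."""
    DROP_WITH_CONTENT = {"script", "style"}   # tag AND its contents go
    UNWRAP = {"div", "span"}                  # keep children, drop the tag

    def __init__(self):
        super().__init__(convert_charrefs=False)
        self.out = []
        self.skip_depth = 0   # >0 while inside a dropped subtree

    def handle_starttag(self, tag, attrs):
        if tag in self.DROP_WITH_CONTENT:
            self.skip_depth += 1
        elif tag not in self.UNWRAP and not self.skip_depth:
            self.out.append("<%s>" % tag)     # attributes dropped for brevity

    def handle_endtag(self, tag):
        if tag in self.DROP_WITH_CONTENT:
            self.skip_depth -= 1
        elif tag not in self.UNWRAP and not self.skip_depth:
            self.out.append("</%s>" % tag)

    def handle_data(self, data):
        if not self.skip_depth:
            self.out.append(data)

s = TagStripper()
s.feed('<div><p>text</p><script>evil()</script></div>')
# "".join(s.out) == "<p>text</p>"
```

A real simplifier would also collapse nested single-child wrappers and keep whitelisted attributes, but the skeleton is the same.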


In firefox the setting to disable autoplay is media.autoplay.enabled, but it does not work for gifs (gif animation is controlled by image.animation_mode instead). And it gets messed up because you don't see the controls.

This one is partially fixed, or being worked on until ff 57: wiki.mozilla.org/Electrolysis I think ff 54 already uses per-tab content processes.


telnet 2.0


If you get a way to convert back and forth between HTML and a canonical s-expression representation, I'll buy you beer.
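One direction of that round trip is short work with Python's stdlib parser. Here's a sketch producing an SXML-flavoured nested-list tree (the `*top*` root and the quoting of tag names are arbitrary choices of mine, not any standard), with the inverse direction left as the symmetric walk:

```python
from html.parser import HTMLParser

class SexpBuilder(HTMLParser):
    """Parse HTML into nested lists, an SXML-style s-expression tree."""
    def __init__(self):
        super().__init__()
        self.stack = [["*top*"]]          # stack[0] is the document root

    def handle_starttag(self, tag, attrs):
        # attributes become ("@" name value) sublists, SXML-style
        node = [tag] + [["@", k, v] for k, v in attrs]
        self.stack[-1].append(node)
        self.stack.append(node)

    def handle_endtag(self, tag):
        self.stack.pop()

    def handle_data(self, data):
        if data.strip():
            self.stack[-1].append(data.strip())

def to_sexp(node) -> str:
    """Serialize the nested-list tree as an s-expression string."""
    if isinstance(node, str):
        return '"%s"' % node
    return "(" + " ".join(to_sexp(n) for n in node) + ")"

b = SexpBuilder()
b.feed('<p class="x">hi <b>there</b></p>')
# to_sexp(b.stack[0]) ==
#   '("*top*" ("p" ("@" "class" "x") "hi" ("b" "there")))'
```

Going the other way is a plain tree walk emitting tags, so the conversion is lossless up to whitespace (which this sketch deliberately normalizes).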

Shut the fuck up, retard.

yes it is. HTTP doesn't even have a purpose. it's just a bunch of "le verbs" which you're supposed to shittily fit into your application in question, and a bunch of key value pairs ("headers"). then a bunch of bullshit about caching etc which nobody gives a fuck about except professional webshotters, which don't even make any sense because it's not even clear what the protocol is for. they literally just tweak these higher plane concepts to match their use case of the day on their shitty stacks of the day

there's nothing of value in firecox. i use a fork of firecox in the rare event when i want to access some shit website, and in the process i almost throw my computer into the window

no. People that talk about race nowadays imply that you can't be racist against white people, and actually want to be hostile to them and find any possible fault in their behaviour. They make the CoC a weapon rather than a shield. They can just say whatever you do is harassment, and they will all agree, while they will say anything they do to you isn't. And they will all agree.

CoCs should not only be rejected; people supporting them should be mocked, spat on and destroyed. If not, they will slowly creep up, wait for a moment of weakness, and amass power until they get what they want. They will lie, manipulate and extort to get what they want.

In general, "emotional intelligence" stunts (manipulation and shit without any kind of value) should be punished, have attention drawn to them, and be a cause of humiliation. If the main thing you bring to a project is drama and your ability to make some people like you, then you should be ejected and treated like the parasite you are.

This thread has inspired me to write a browser that ignores all the js, css, and shitty parts of html5, and has built in adblocking, but can still render things nicely.
I'm thinking of using electron since I know js and html.

Urbit

media.urbit.org/whitepaper.pdf

youtube.com/watch?v=QH6xgVRdfX0
youtube.com/watch?v=g1qroWiZF90

No thanks!

The real problem with CoCs and HR isn't any of the things they putatively restrict or guarantee, it's the fact that they are so blatantly redundant for any legitimate purpose. Every one of the things CoCs bitch about is:
A) Already covered by the law. You know, the REAL law enforced by real cops and real judges, with real checks and balances, decided by real democracy.
B) Something legal and, aside from possible emotional trauma that anyone more mature than the average 6-year-old can ignore, completely harmless.
C) Sufficiently fuzzy that it is practically impossible to consistently define, let alone enforce policies against it, in an honest and fair way.

So, if CoCs don't actually do anything useful that isn't already done by existing mechanisms, what useless thing is their true purpose? The HR mechanism itself, and the completely arbitrary power those among its ranks hold over their fearful subjects.

Could we all take a step back here?
Why don't people start working on their own systems and concepts and see what we can come up with? It doesn't have to be taken seriously - more as a little side hobby/project. I feel like talking alone isn't going to get the job done. Talking isn't going in a good direction for achieving results. The only people who are going to make sense are the ones showcasing real projects/examples.

Hello luminescent black men working on 1984 again I see.
The problem with spam is the retards who believe in it.
If people were taught how to filter their damn shit (aka only trust emails that come from trusted sources) before even having a computer, spam would be irrelevant.
Urbit is the worst possible fucking idea ever; it just brings more problems.
What are the problems of the internet?
-Centralization
You have to pay/go through a central authority to get X.509 certificates.
Same thing with DNS, which requires enormous centralized infrastructure.
-Anonymity
There's no anonymity with the actual architecture of the internet.
You need server and client anonymity so that freedom can be ensured.
Gnunet achieves that:
youtube.com/watch?v=eM4J7ljCExM


>>>Holla Forums10664174
>>>Holla Forums10664174
>>>Holla Forums10664174

-__-

*Yawn*

Maybe Holla Forums is a satire.

Go back to Gopher for static documents, NNTP for discussion, and IRC for general shitposting. Execute every web developer and mainstream browser programmer. It's the only way to fix the internet.

I know this is out of context, but who is the girl in this image?

a whore

Be more specific pls.


any sauce?
who is she?


Oh look, another dipshit that doesn't know the difference between the web and the internet.

why don't we just build a program/browser that renders hyperlinked pdf-equivalent-documents hosted on ipfs?

Is it a feminine penis?

into the trash it goes

Only rule that I agree with (and would break anyways if it meant doxxing antifa to get revenge on them doxxing me).

I'll make the logo.

WHATWG is less shitty than W3C due to stackoverflow.com/a/34984276 but hardly ideal due to the CoC concerns expressed in the posts above.