./netrunner

factually incorrect

Really? Then why don't you post the benchmark code?

It's not intended to be publicly available as-is, and I don't think the reward of extracting it is worth it.

Compressing already compressed data usually doesn't offer much of an improvement.
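Easy to sanity-check with the Python stdlib (exact numbers vary with the payload, but the shape won't):

import gzip, json
payload = json.dumps([{"id": i, "name": "user%d" % i} for i in range(10000)]).encode()
once = gzip.compress(payload)   # first pass: big win on redundant JSON text
twice = gzip.compress(once)     # second pass: compressing compressed data
print(len(payload), len(once), len(twice))
# the first pass shrinks the JSON several-fold; the second barely moves,
# and can even grow slightly because of gzip's own header/framing overhead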

A bitmap is not compressed. Binary serialization is also not a compression algorithm.

Well, I linked you to benchmarks demonstrating that there's no practical difference between them.

Are you dense?

I literally linked benchmarks showing exactly this.

Deserialization is almost never the bottleneck; network or disk speed is. And even then, the benchmarks I linked show that JSON deserializes almost as fast, except for a couple of types like double-precision floats.
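Don't take my word for it, time it yourself (stdlib only; exact numbers are machine-dependent):

import json, timeit
ints = json.dumps(list(range(100000)))
floats = json.dumps([i + 0.5 for i in range(100000)])
print("ints  :", timeit.timeit(lambda: json.loads(ints), number=20))
print("floats:", timeit.timeit(lambda: json.loads(floats), number=20))
# on CPython the double array usually parses noticeably slower, since
# decimal-to-binary float conversion dominates; integers are cheap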

>auth0.com/blog/beating-json-performance-with-protobuf/
You BTFO yourself, anon. This blog post shows that Protobuf is way faster and smaller than JSON. Only when you pipe it through a shitty compression algorithm can JSON catch up.
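You don't even need protobuf installed to see the gap; a naive fixed-width struct pack is a crude stand-in for a binary wire format (protobuf's varints would do even better on small ints):

import gzip, json, struct
records = [(i, i * 2, i / 3) for i in range(10000)]
as_json = json.dumps([{"a": a, "b": b, "c": c} for a, b, c in records]).encode()
as_bin = b"".join(struct.pack("<iid", a, b, c) for a, b, c in records)
for name, blob in (("json", as_json), ("binary", as_bin)):
    print(name, len(blob), "gzipped:", len(gzip.compress(blob)))
# raw binary is a fraction of the JSON size; gzip then narrows the gap
# because it eats JSON's repeated keys and punctuation for free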

There exist better compression algorithms than the ones those webshitters are currently using. Stop being dense.
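For example, zstd (needs the third-party zstandard package, pip install zstandard; numbers depend on the payload):

import json, zlib
import zstandard  # third-party: pip install zstandard
payload = json.dumps([{"id": i, "tag": "x" * 8} for i in range(50000)]).encode()
print("raw :", len(payload))
print("zlib:", len(zlib.compress(payload, 6)))  # roughly what gzip-on-the-wire does
print("zstd:", len(zstandard.ZstdCompressor(level=3).compress(payload)))
# zstd typically matches or beats zlib's ratio while compressing several
# times faster; brotli does even better on ratio for static content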

Well, let's see what context that was linked in.

Wrong. If you are not using compression, you are wasting a shit ton of space, binary or not.

Huh, so it was in the context of "these things are all the same size when compressed". Wonder what "only when you pipe it through a shitty compression algorithm can JSON catch up" has to do with it.

Now on to your next comment:
Other compression algos exist, but they bottleneck before the network does, so they are useless and decrease the performance of the entire system.
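Napkin math on one core (stdlib only; your numbers will vary):

import json, time, zlib
payload = json.dumps([{"id": i, "tag": "x" * 8} for i in range(200000)]).encode()
t0 = time.perf_counter()
zlib.compress(payload, 9)  # max-ratio setting, worst-case speed
dt = time.perf_counter() - t0
print("%.0f MB/s through zlib -9; a 10GbE link moves ~1200 MB/s" % (len(payload) / dt / 1e6))
# if the compressor pushes bytes slower than the NIC drains them,
# the compressor is the bottleneck, not the network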

They either come pre-compressed (FLIF, etc.), which beats any blind general-purpose compression, or they don't need to be compressed to be transferred locally.
Are you unironically saying that interprocess communication inside one machine needs compression? This is not even funny anymore.