Google noto

archive.is/RWaHK
hothardware.com/news/noto-google-monotype-unveil-universal-font-project-all-languages

google noto

oy gevalt talk about bloat. why must these projects always pander to every possible group of people from every bumfuck corner of the planet that will never be important to the internet?

unicode obviously was insufficient, what will be the death of google noto? will you use it? thoughts and comments?

Other urls found in this thread:

en.wikipedia.org/wiki/Newspeak
en.wikipedia.org/wiki/Law_of_triviality
users.teilar.gr/~g1951d/

Try making another thread, this time without sounding like a retard.

Thanks for correcting the record!

This is what business looks like. There are people who need this font. They exist, whether you consider them "important to the internet" or not. Google wants them to be able to use their operating system, so they make sure they can.

This implements unicode. Do you have any idea what you're talking about?

This thread isn't even Hillary related

Oy vey!
It's like another shoah!

This is the ultimate fallback font, you retard. Before that it was Droid Sans

disagree, it's bloat. tbh everything beyond ascii is bloat.

I actually like it when a font covers all languages; it means I can use it for any document and have it work. I often need Latin, Cyrillic and Greek, and occasionally being able to type and display an aleph (ℵ) in math is useful as well.

So what if there is no typographical history? If you found yourself writing a paper and needed to print some runes or cuneiform, you would very much appreciate it if the glyphs didn't stand out like a sore thumb.


And any language besides English is bloat as well, right?

i'm making an analogy you idiot. unicode 9.0 is bloated as fuck because people gotta add stupid fucking things like emojis. with noto which tries to have a font for everything inevitably there will be some stupid fucking obscure language or even a fake language or they'll want dozens or hundreds of font faces for every language, or something else that we can't even imagine now - similar to emojis with unicode. and eventually it'll have the same problems. what a waste of fucking resources for people that don't and will never matter.

no. they included dead languages and languages that haven't had fonts ever, even in print before. i'm sure the cherokee font will be such an improvement to the internet, such an enabling technology, we just couldn't exist without it, those cherokees sure will be able to run their casinos at double the efficiency with this font. lel. english is lingua franca of the world. and smaller regions have their own regional lingua francas. these people were already served.

i couldn't have said it better myself!

already available and standard

literally you could have used any dingbats type font for a dead culture. it just needs to look the part, nobody will know the difference. trying to be exact for every possible script including dead cultures is pure autism and creates bloat.

Oh no, my precious megabytes.

I'll have to clear out some of my webms or anime titty pictures if I'm going to contain all that "bloat".
No wait, I just realized I'm not a fag and actually have a hard drive made in the past decade.

Nigger you're functionally retarded.

Unicode didn't have enough diversity.

I wonder if it supports Klingon alphabet because I want to use Klingon as my language on Linux to keep normies away from my PC.

People still want to exchange text in dead languages and languages that didn't originally have a font. It's useful to them and has essentially no cost for you.

I understand that it's not useful to you, but why does that make it bad?

...

This many Google fanboys, god I fucking hate /g/.

good thread

You sad, sad attention whoring SJW.

The purpose of including dead languages is the ability to catalog and exchange works in those languages. You do know that people still study dead languages, right? And those people do use computers to do their work, right?

I'd understand it if it was something like Klingon script, which has no actual use outside of a hobbyist realm (and even then, a Klingon community could use a private block to communicate in Klingon), but even dead languages are "used", as they are studied. A language being dead doesn't mean that it's never touched, it just means it has no more native speakers.

Either way, languages that are dead are still useful to catalog and store efficiently, as we usually have written works that linguists refer to, and they need to be searchable and useable in textual format, which raw images aren't.
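Rough Python sketch of the searchability point, if anyone doubts it (the specific cuneiform signs are just picked for illustration):

# Once a dead script has real code points (cuneiform starts at U+12000),
# ordinary string search works on it, which a scan or a photo can't offer.
tablet_line = "\U00012000\U00012038\U0001202D"   # arbitrary cuneiform signs
query = "\U00012038"
print(query in tablet_line)        # True: plain substring search
print(tablet_line.index(query))    # 1: position within the text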

I used to use GNU unifont for languages I didn't give a shit about but still wanted basic support for.

...

Goddamn, chinese is such a pile of bullshit.
How the fuck do you make a language that impractical?

I just tried Noto out and I'm quite disappointed that it's actually a bunch of different fonts that only cover a part of Unicode each, and not every font supports every variation (e.g. no bold variant of the monospace font). They aren't finished with the project though (phase 3 has just recently started), so maybe it will be fixed in the future. I'm uninstalling it for now.


Dingbats is the most retarded shit ever, followed closely by Emoji. Try searching a database of cuneiform writings: you would first have to translate the cuneiform to whatever your Dingbats really are, resulting in something that looks like Qbert swearing, then enter that mess, get some other mess out of it and then convert it back to Dingbats. And that's if you can see, because if you have to rely on a screen reader you are fucked. Unicode is about providing proper semantics, just looking right is not enough. I have wanted to use some old-style Cyrillic writings on a website but every font I was able to find was doing the same retarded thing of making Latin characters look like Cyrillic, making it useless for anything outside of print.
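Here's a quick Python illustration of why the reskinned-Latin trick breaks search (the words are just examples):

# A faux-Cyrillic font that reskins Latin letters produces text that will
# never match real Cyrillic in a search, because the code points differ.
latin_fake = "BOPOH"    # Latin B, O, P, O, H dressed up to look Cyrillic
cyrillic = "ВОРОН"      # real Cyrillic code points (U+0412, U+041E, ...)
print(latin_fake == cyrillic)             # False: different code points
print([hex(ord(c)) for c in latin_fake])  # ['0x42', '0x4f', '0x50', '0x4f', '0x48']
print([hex(ord(c)) for c in cyrillic])    # ['0x412', '0x41e', '0x420', '0x41e', '0x41d']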


Fucking this!


Wow, that's an ugly font.

When the third Reich starts up we should be prepared to replace world languages with something better designed to make machine use and interpretation simple and accurate. A better language could trivialize speech recognition, natural language parsing, and determining meaning.

I actually wondered that myself. Turns out it made sense back in the day because China had one central government that had to control a bunch of different regions with different languages. By using ideographic characters it was possible for someone from region A to read a message from region B even though they did not understand each other's spoken language.

unifont is meant to be very small and bare bones, not pretty. It's a bitmap font and can be easily used as a texture atlas in vidya. It's expected that it only gets used when a symbol isn't provided by another font.

I have noto sans on KDE5 and it's great. I'm glad Jewgle has made these open source fonts, because the old ones that were available were cancer.

I hear Roboto is also good.

Why? What reason is there for this to exist? Who would need a font with these features?

...

Noto is a font. A font is not the same thing as an encoding. Encoding is the representation of the text as bytes. A font translates the parsed characters to a representation that can be displayed. If the encoding supports more characters than the font you might get those ugly rectangles, and if the font supports more characters than the encoding some of the font will never be used. But switching to a different font won't increase document size.

UTF-8, UTF-16 and UTF-32 are three popular unicode encodings.

UTF-8 can encode all unicode characters and is backward compatible with ASCII, so ASCII characters never take more than one byte in it, even when mixed with other characters. It's the standard for everything that's not old and not Windows.

Windows uses UTF-16. It seems like you think it encodes all characters in two bytes (16 bits), but that's not the case. It encodes everything in at least two bytes, but four bytes if needed. Not all characters have the same width in UTF-16. UTF-16 is completely fucked up but a new font won't make it any worse.

UTF-32 is a unicode encoding that does make all characters the same width (four bytes). It's useful for really fast indexing but not much else.
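If it helps, here's a quick Python check of those byte counts (little-endian variants used so the BOM doesn't muddy the numbers):

samples = ["hello", "héllo", "日本語", "hi 😀"]
for text in samples:
    print(
        f"{text!r:12}"
        f" utf-8={len(text.encode('utf-8')):2}B"
        f" utf-16={len(text.encode('utf-16-le')):2}B"
        f" utf-32={len(text.encode('utf-32-le')):2}B"
    )
# ASCII stays one byte per character in UTF-8; the emoji is outside the BMP,
# so UTF-16 spends a surrogate pair (4 bytes) on it; UTF-32 is always 4 bytes
# per character.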

gotcha fam
en.wikipedia.org/wiki/Newspeak

So they just won't use the whole thing most of the time then?

They will, but it won't take up any extra space: how many bytes each character needs depends only on the encoding (characters don't all have the same width), and it's completely unrelated to which font you use.

Why did Microsoft and IBM standardize on UTF16? It's the most retarded encoding.

All of unicode could fit in 16 bits back then.

That's not true. UCS-2 could, but UTF-16 has always been a variable-length encoding.

The font has nothing to do with your document. The file is stored as a sequence of numbers and the meaning of those numbers is determined by the encoding. The font is just for visual representation; it's used by a program in order to display the file on screen. You could even "display" it without any font at all if you used a screen reader to read it out loud instead of showing it on a screen. You could take your file (a series of numbers) to a different computer that doesn't have the Noto font and it would still be the same size.

However, if you were to store the font inside the file as well, then yes, it would get larger. I don't know which file formats do that (probably PDF?), but I assume they are clever enough to embed only the glyphs actually used in the document (font subsetting). So even in that case your file wouldn't end up larger than it would with any other font.
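If you want to see it, a tiny Python sketch (plain text file, nothing fancy):

import os
# A plain-text file records only encoded bytes; no font is involved, so its
# size is the same whether or not Noto (or any other font) is installed.
text = "Noto covers Ελληνικά, кириллица and 日本語 alike."
with open("note.txt", "w", encoding="utf-8") as f:
    f.write(text)
print(len(text.encode("utf-8")), "bytes in memory")
print(os.path.getsize("note.txt"), "bytes on disk")   # identical; fonts never enter into it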

Most of CJK is actually just a bunch of ligatures. Zoom that image out and you can even see a visual gradient where it starts with one base glyph and progressively adds more shit to it.

If it was done today, it'd just be one or two planes of combining characters, like the useless emoji ZWJ sequences added in Unicode 8/9.

Holla Forumstards really do get triggered over anything and everything

Oh look you just heard about it on your 'tech news' site. How cute. Noto has been around for a long time and this latest news is just a report on a google blog. There hasn't been any major work done on the fonts in months.

Google didn't make the standard, the Unicode Consortium did, and it existed before Google did. If you understood the situation you would know that people have been asking for reference fonts since day one, or at minimum a fallback set.


Emojis make up a few hundred code points at most yet get nearly all the news coverage. That is because normies can't understand Unicode, linguistics, or writing systems. Emojis, well everyone can comment about that. This is the classic bike shed problem.
en.wikipedia.org/wiki/Law_of_triviality


Klingon is not in the Unicode standard. It is in the ConScript Unicode Registry, which assigns it to the Private Use Area.
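Quick Python check, if you're curious (the U+F8D0 start point is the registry's informal assignment, not anything official):

import unicodedata
# Private Use Area code points are valid to store and exchange, but the
# standard gives them no name or semantics; the ConScript registry's
# Klingon (pIqaD) block is conventionally placed at U+F8D0 onward.
cp = 0xF8D0
print(unicodedata.category(chr(cp)))                    # 'Co' -> Private Use
print(unicodedata.name(chr(cp), "<no official name>"))  # the standard assigns none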

Does it have hieroglyphics? If not, it's shit.

Which kind?

users.teilar.gr/~g1951d/

This is awkward, Hitler took that name about 80 years ago and WW2 happened just to destroy it. You should call yours the second Third Reich to avoid confusion with Hitler's Third Reich.

We're not surrendering, there's just been a delay.

yet they fuck up everything. literally death by bloat.


tibet isn't a country or a people or a culture or even a font fuck off.


i advocate for not bloating tech up. i seem to have struck a nerve in this thread, lots of butthurt fags worried about their fonts. making fonts for literally dead languages is absolutely retarded as nobody can tell the difference. using dingbats in place of a dead language is efficient use of resources. and unicode already covers basic cuneiform so it's literally a moot point.

Do you know what a dead language is? Do you think it's a language that nobody understands any more?

It's a language nobody in their right minds cares about any more.