Google's new operating system, Fuchsia

github.com/fuchsia-mirror

Thoughts, opinions, and more. Go!

Other urls found in this thread:

github.com/fuchsia-mirror/magenta/blob/master/docs/getting_started.md
doc.cat-v.org/plan_9/4th_edition/papers/9

The name is extremely problematic for men, who can only see 8 distinct colors.

Well at least it's not written in Java this time.

we can see more colours than that, we just aren't as good at interpreting shades as women, nor do we need to be.
If we went to work in a place that forced us to differentiate between shades more, we'd get pretty good at it (though still at a certain disadvantage compared to women).

C fanboys BTFO

Where does the "men are bad with colours" meme come from?
All great painters of history were men.

k

People keep using Google, I don't know why. They "trust it". Dumb fucks.

I heard some time ago that the gene for interpreting color is on the X chromosome, and a gene that improves contrast is found on the Y chromosome. It sounds like bullshit, but I'm too tired to check it right now.

Men are more likely to be color blind because of something like that. If all your X chromosomes have a certain gene you're color blind.

Men see the same colours that women see, but the average woman can name more colours than the average man.
My theory is that women learn those colour names from the makeup products they use all the time, while most men have no need to learn anything beyond the basic divisions of colour.

So it's just a social construct

Is there any place with instructions on how to run that thing in virtual machine?

This sounds more like it.

Shut up, we are talking about why men are biologically and genetically superior to women.

What kind of idiot would use an OS written by Google?

Don't give me that "open sewers" excuse, you know exactly how that turns out in practice: The codebase is usually incomprehensible to anyone but the original authors and if your interests ever conflict with Google's, guess who wins.

I see all the colors, I just don't have time for bullshit. The background on this page is light blue, the posts are less light blue, the links are dark blue. Done.

This "colors are special" BS can be dealt with with RGB values and a color picker.

yes they provide detailed instructions on running it in qemu
>github.com/fuchsia-mirror/magenta/blob/master/docs/getting_started.md

Redox > fuchsia
/thread

enjoy your botnet goy

Are you referring to google the company or google the racial epithet for black people?

This is pretty amazing. Dart is a fantastic language.

How is it any different from Rust or Go or any other "modern attempt at re-writing C with a fancy logo attached"?

It's neither. It's actually JavaScript made to work a little more like Java.

...

ew

nope. it's a thing of the past, when men used to hunt and women used to collect berries.

some women, due to genetics, can even exhibit tetrachromacy

If someone uses that solid pink square as a Google+ profile picture will they get autobanned for porn?

so has anybody tried it yet?

that's a true story

and you know what?

I think we need to make a series of R?GB monitors, because usual RGB monitors have only 3 channels and cannot display all the colors tetrachromats can see.
In other words, traditional computer displays are oppressive and there should be alternative, true-color displays.

Rust is basically javascript: java edition. Rust is basically C++ + ML: actually good edition.
Go is basically C: C is too safe for us edition, but also we want a GC because fuck 99% of software kinds.

Yes, it's fucking nothing. It just has a few test programs, and that's it. Even redox is further along.

What can't you write in a garbage-collected language, and why? I see a lot of people mention GC negatively but never understood what's wrong with it, especially in the case of Go, where the GC pauses seem to take single-digit milliseconds.

Try writing an audio (or video) player in a GCed language.

You meant to start with Dart right?

I wrote something that handles audio in Go and had no issues with it, what's supposed to happen?

Displays of color, you bigot!

Yes, pls no bulli


System-level stuff, embedded programs, and anything that needs to run in realtime because of the lack of determinism in runtime and the resource usage. It's also unsuitable (but usable) for complex desktop programs (you'll end up with performance issues and, more importantly as it happens more often with greater consequences, memory consumption issues).

In relation to the desktop performance, is this design- and implementation-specific? As in, the design and implementation of the GC itself, not the client program. Would it be suitable with a good GC, or are all GC methods inherently unsuitable in these situations?

I haven't looked into it heavily but people seem to say Go's GC isn't typical, I'm curious if it actually handles some issues that other languages have for their GC.

Partly, and partly not. You could hand-design a GC for your particular application where these issues are minimized, but you can't make a generic GC that does not show these issues. It's a consequence of the difficulty of the problem that GCs try to solve.
So-called realtime GCs (which bound the garbage collection time per episode) would certainly work better than more traditional approaches for high-performance or complex programs, but in exchange, it may take a lot longer to collect memory, resulting in leak-like behavior. On the other hand, with more traditional GCs, you end up having significant wait times on GC passes when large amounts of memory can be freed.

As for the suitability of any given GC, it will obviously always depend on the client program (memory access pattern, memory consumption, CPU utilization, etc.)

Go's GC is not atypical, and only recently has it become a non-joke (it used to be orders of magnitude slower than Java's, which isn't even the best by any means; now it's somewhat better).

Thanks for the information.

Do you mean that it decodes an audio file into PCM?

If you don't want to install in /usr/local (the default), which will require you to be root, add --prefix=/path/to/install (perhaps $HOME/qemu) and then you'll need to add /path/to/install/bin to your PATH.

I'm confused. How do I do this?

Do I append "--prefix=/path/to/install" like this:
./configure --target-list=arm-softmmu,aarch64-softmmu,x86_64-softmmu --prefix=/path/to/install

And do this:
export PATH=$PATH:/path/to/install/bin

I distrust Google more than Microsoft, and that's saying something.
They're not even hiding the fact that they steal your data.

It decodes Vorbis into PCM, which gets played back by the OS. The audio is decoded and handed off in 8192-byte segments; before they're handed off they have to be prepared with headers, and a finished chunk has to be unprepared before the buffer can be refilled.

I've never done anything like that before, so I may have done it poorly. I'm also pretty sure the interface I'm using is deprecated, but I couldn't find what deprecated it, and thus no documentation for the new proper system. I'm using the Windows waveOut interfaces to get a device handle, prepare and unprepare the chunks, and then play them back.

They don't steal it, you give it to them willingly.

You could just, you know, stop being retarded and not do that... but that wouldn't fit your narrative.

This.

It's a social construct.
Read this somewhere reputable.

Weird, the background of the page is dark grey, the posts are lighter grey, and the links are light blue over here.

:^)
Ohoho

Wouldn't that be better for everyone, feminist memes aside?

I can't wait for this to replace Android/ChromeOS. We'll get to see the shitty meme kernel Linux's market share plummet to the bottom. The butthurt from Freetard GNU/Fanboys (or GNU+Fanboys, as I've been calling them) will be glorious.

It would be better for everyone but it would be even better for women therefore we can't do it.

Google harvests data whether or not you use their products.

How do you expect that to happen? They don't seem to be preparing it as a drop-in replacement for Linux within Android, or they wouldn't be doing the UI stuff the way they're doing it now, and simply dropping Android would be dumb.

No, it's color blindness. Color blindness is more common in men because the mutation that causes it is on the X chromosome. Women have two, so they have a non-mutated backup gene that serves the function normally. Men have only one X, so whatever version of the gene we get is all we have. Even so, the majority of men have normal color vision. This, combined with the fact that women are just more into fashion and colors and so learn more names for shades, drives the "men are bad at colors" meme. If you want to see areas where men are just as anal about color, look at the guitar world, where some guys can tell Candy Apple Red vs Fiesta Red vs Dakota Red from 20 feet away.

It's a hybrid-kernel-based design; swapping it in for Linux would be piss easy, and it wouldn't have to sacrifice compatibility, because all that would need to be ported over is the Android Runtime and the Bionic C library, neither of which is tied to Linux by any means.

Why would they be mad when a company they hate stops associating with Linux? Especially given that Android almost entirely runs on locked down devices that are exactly what those people would be against.

It does seem like fuchsia is planning support for natively running android and linux programs, though.

You're thinking of Andromeda, which they're running on the new "Pixel" devices.

No.

Why?

Didn't you play pokemon?

Even if it's Google, the license is permissive, so there will be something to reuse. It'll be interesting to see them try a microkernel approach; the only other "popular" OS using one is Minix. In reality, microkernel vs. monolithic doesn't matter much. In Plan 9, for instance, there's no message passing, since devices are directly addressable through the file system by the process, and this kind of thing blurs the difference between these kernel styles.

Plan 9 isn't a microkernel at all though, is it? Why is it relevant that it doesn't have message passing?

Wikipedia says it's a hybrid kernel, but there's no such thing, so it's monolithic.
Message passing is often used in microkernels as RPC, with the kernel involved in managing the messages; this results in overhead because of context switches. In Plan 9, devices are files (actual files, not sockets or similar, like in Linux) and processes simply use file operations like read and write to do RPC [1]. The file server decodes the message received and does the appropriate operation, which is much simpler than the microkernel approach.

[1]doc.cat-v.org/plan_9/4th_edition/papers/9

I want to learn Plan9 just to know myself if it's actually worth using in some way, but I'm afraid the literature won't be there. What is the best way to learn Plan9 and should I commit myself to learning it?

Plan 9 is just a toy OS. It's full of good ideas and innovative stuff, but because of its initially weird license (today it's open source) and its difference from Unix (it's actually simpler), it wasn't adopted.
Just read the manuals and try 9front if you want, but don't expect stuff like a web browser with JavaScript.
As a curiosity, look at how the command "dircp" works.

It's beautiful, but it won't get you a job. Please do study it in your own time, and try to faithfully extend it.

I'll look into it, thanks.


If it's legitimately better in some way I can be content with just knowing and using it on my own. I don't think I have the skill to improve or extend an OS with a history like this one but who knows. I'll have to learn it first regardless.

A GC which stops the process at an unfortunate time will cause audio to drop out or glitch.

With shitty TN panels, it's hard to tell.

But women are more important…
They live about 10 years longer on average because they aren't dumb, and they are not as lazy as most men.

LOL

i like this bait

...

2008:

2016:

really makes you think

really made me think

2024:

Nobody gave much of a shit about the botnet meme until the Snowden docs.

Google already released one security hell (Android). Isn't that enough?