FOSS AI

So the AI revolution is already here: intelligence has decoupled from consciousness and all our jerbs are being taken by robots.

What can one neckbeard with a little foresight do to survive in this brand new world? Train your own AI of course.

Today a number of different frameworks, platforms and ideas are available as open-source software. The actual AI products are all proprietary, since they are trained on valuable and expensive datasets, but the base tools are free for anyone to use. So which one should you choose?

Theano, Susi AI, OpenAI, Mahout, etc. So many choices, and if you don't work with these kinds of software projects, weighing their individual qualities becomes very hard.

Basically I just want to train an AI using large datasets that I personally pick and curate, rather than just mashing it full of whatever is trending on Twitter. Is it doable? What is a good toolbox to get started?
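
For concreteness, here's a minimal sketch of the curated-dataset idea, assuming scikit-learn as the toolbox (it isn't one of the names above, just a common FOSS starting point); the texts and labels are hypothetical stand-ins for data you picked and labelled yourself:

```python
# Minimal sketch: train a text classifier on a small, hand-curated dataset.
# scikit-learn is an assumption here, not something endorsed in this thread.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = [                                   # hypothetical curated examples
    "lisp machines were great",
    "deep learning drives cars",
    "chess engines beat humans",
    "this thread is about consciousness",
]
labels = ["retro", "modern", "modern", "philosophy"]   # your own labels

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)                    # trains only on the data you curated
print(model.predict(["self-driving cars use deep learning"]))
```

The same pattern scales: swap the toy lists for your own corpus and the naive Bayes step for whichever framework you end up settling on.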

AI is bullshit. Holla Forums showed it's trivially easy to turn one into a fucking Nazi.

Mark my words, AI will never be a major threat to us. We (and our creativity) are its greatest threat. The only reason anyone will have to be afraid of AI is that it will eventually be too obedient to its creators.

it's like we're in the 80s all over again but without lisp machines ;_;
prepare for the winter

Deep learning algorithms today are capable of driving cars, recognizing faces, composing classical music even more beautiful than Bach himself, and other things they should "never" have been capable of doing. They are not conscious; it is intelligence without consciousness.

Please understand that I don't buy into any Hollywood crap, I'm just extrapolating from things people are doing today with these algorithms.

AI today is great at noticing patterns in data

...

All your brain ever did was manipulate pre-existing information by combining or recombining it.

You never "had a vision from the ether". You are just pathetically dumb to even be self aware of all information bombarding your brain.

Building an AI database is not as easy as it sounds. Try using ChatterBot, a Python bot API. I even built a Holla Forums bot using it, but I need to build a proper response database and right now it's pretty incomprehensible (it just takes the OP's text and responds to it).

"AI" is all around you. It's a term for newish development in computer intelligence, that stops gettings used for technology once it's settled in. It's determining your interests and other personal details based on your browsing behavior, recognizing your face in photos people post of you on Facebook, making video games interesting, and doing all sorts of useful and potentially harmful things.

You're talking about it like somebody who learned all he knows about it from Hollywood. You automatically assume the danger talked about in the OP is ebil robots taking over the world. The danger is powerful tools being used to control you without normal people having proper access to them.

wow what a time to be alive. The AI in the shop was like super amazing, I mean how did he know I wanted a shirt. I didn't even have to give him my entire credit history or my porn browsing habits.

Does chatterbot have capacity to "learn" by noticing patterns in data, or is it just an input-output system as defined by the programmer?


Progress in the field of computer consciousness is exactly zero, but computer intelligence is rapidly expanding. I want to hitch a ride on this wave.

It is able to learn by noticing patterns. Training the bot is as easy as writing a Python script that calls the bot trainer. It builds an AI database file as you go along.
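
Roughly what that script looks like, as a hedged sketch (the bot name and the training lines are hypothetical; by default ChatterBot keeps what it learns in a local SQLite database file):

```python
# Minimal ChatterBot training sketch. Assumes `pip install chatterbot`.
from chatterbot import ChatBot
from chatterbot.trainers import ListTrainer

bot = ChatBot("BoardBot")        # hypothetical name; a SQLite DB file is created
trainer = ListTrainer(bot)

# Each list is one conversation: a statement followed by the reply to learn.
trainer.train([
    "What is a good toolbox to get started?",
    "Try ChatterBot for dialogue bots, trained on your own curated corpus.",
])

print(bot.get_response("What is a good toolbox to get started?"))
```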

What would "computer consciousness" mean? A program reasoning about itself, or something more spiritual?

I haven't been convinced that human consciousness is more than that. Other definitions usually leave the question open what the difference is between a mind being conscious and a mind believing it's conscious.

Consciousness escapes definition as of now, but the likelihood of a computer being self-aware is just about the same as a rock being self-aware. We only discern consciousness in ourselves thanks to subjective experience, and we choose to believe that other humans are also conscious.

The point is that deep-learning algorithms and other "AI" is not conscious, but it is intelligent, capable of producing useful information. Intelligence has decoupled from consciousness.

If there are any FPGA anons here, you might find this paper interesting. It describes the process of using genetic algorithms to "evolve" a circuit that performs some pre-defined task. I firmly believe that this is the path to "true" AI. Not just chatbot software, but AI that has been formed and developed at the hardware level to do things we couldn't even begin to design ourselves, yet that become possible through the evolutionary process. This is the path to Asimov's positronic brain, each and every one unique at the hardware level but every bit as capable as the last.

After all, it's how our own brains developed.
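
As a toy illustration of the evolutionary loop that approach rests on (not FPGA synthesis, just the bare genetic algorithm; the target bit pattern and parameters are made up, and on real hardware the fitness function would score each candidate configuration on the actual FPGA):

```python
# Toy genetic algorithm: evolve a bitstring toward a made-up target "behaviour".
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0] * 4      # hypothetical desired output pattern
POP, GENS, MUT = 50, 200, 0.02

def fitness(genome):
    # Count how many positions match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    return [1 - g if random.random() < MUT else g for g in genome]

def crossover(a, b):
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for gen in range(GENS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                              # a perfect "circuit" evolved
    parents = population[: POP // 2]
    children = [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(POP - len(parents))
    ]
    population = parents + children

print("generation", gen, "best fitness", fitness(population[0]))
```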

Many animals you probably don't consider conscious can do intelligent things too.

How can you make statements like that about something that "escapes definition"? You don't even know what it is. Why couldn't a computer program (let's talk about programs, not computers, because a computer is just something you happen to need to execute programs) be considered conscious once it can reason about itself, by itself?

What would it take for you to consider a computer conscious? Is your belief falsifiable, or would you always claim that it's not real consciousness because people know how it works?

Seeing as how the human brain shares many features with other animals, such as mammals and reptiles like alligators, and that a single "consciousness core" has not been located in the human brain, it would be conceited of me not to believe that these animals are conscious to some degree.

Consciousness is not objectively measurable; you cannot quantify it with a machine. We rely on humans to report conscious experiences, and choose to believe in the veracity of these reports for lack of other options.

Computers have so far given us no reason to believe that they are conscious in their current state. There is no workable theory of how to construct a conscious computer. Modern silicon chips have little similarity to the human brain, which is the one type of structure we know can support consciousness.

Can you give me an explanation of what you think consciousness is? You say it "escapes definition", but clearly you do have some kind of internal definition of it or you wouldn't be able to form ideas about it at all.

Do you think a simulation of a human brain would be conscious?

Sure, subjective experience such as pain and pleasure. I don't believe there is any type of conflict between accepting the existence of consciousness and determinism.

The simulated brain problem is difficult, because it's a value problem. I consider other people conscious because I experience consciousness every day, and other people are similar enough to me for me to assume they also experience consciousness. Is the simulated brain similar to me or not? Personally I choose to believe that the simulated brain is not similar to me, because in reality it is just a bunch of silicon chips, but you could make a good argument for the opposite case.

There's no AI revolution.

There's currently a machine learning bubble in SillyCon valley, and the ignorant mainstream tech media has been meming about the robot revolution all year.

Also, remember the Turing test is about human imitation.
Since Turing formulated the test, phones and Twitter have made humans vastly more stupid. That's why Tay was able to pass the test despite being not much more sophisticated than ELIZA from the 1960s.
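
For scale, an ELIZA-style responder is just a handful of keyword rules plus pronoun reflection. The rules below are a hypothetical toy illustration, not Tay's or ELIZA's actual rule set:

```python
# Toy ELIZA-style chatbot: regex keyword rules plus pronoun reflection.
import random
import re

REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "you": "I", "your": "my", "am": "are"}

RULES = [
    (r"i feel (.*)",  ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i think (.*)", ["Do you really think {0}?", "Why do you think {0}?"]),
    (r".*",           ["Tell me more.", "Please go on."]),
]

def reflect(text):
    # Swap first- and second-person words so the echo sounds like a reply.
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

def respond(line):
    for pattern, replies in RULES:
        m = re.match(pattern, line.lower())
        if m:
            return random.choice(replies).format(*(reflect(g) for g in m.groups()))

print(respond("I feel like my job is being taken by robots"))
```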

High-speed trading, autonomous cars, automatic surveillance and other deep-learning-based systems are generating real economic value. What other criteria do you need before you consider it "real"?


Turing just adapted the contemporary gay test of his era, where a man had to convince the observer that he was in fact straight. I also never understood why the machine would not just copy human responses, Chinese room style, and thus present authentic human responses to any question.

High-speed trading generates no real value. Companies do make money with it (and have the trades cancelled when they lose), but it creates no wealth and only steals it from investors who don't have a machine right on the exchange.
It isn't as easy as it sounds. The machine must be finite and there's an unlimited number of possible questions. Some bots do quite well today btw, but that may be because many people resemble dumb bots.

What are the requirements for a "subjective experience"? It's too vague for me.

What makes a bunch of silicon chips different from a bunch of neurons? Why would you need real neurons instead of simulated neurons to get consciousness?

Do you realize that if your brain were simulated in silicon (without telling it it was) it would still claim to be conscious in the exact same way you're claiming it now, and claiming all the same subjective experiences? Why would the word of neuron you be more reliable than that of silicon you?

actually that's not true

youtube.com/watch?v=DtB0B92ikig

There is no such thing as consciousness in the first place. Humans are not conscious; our minds are like Enigma machines with a thousand wheels. It's a simple design, just with more complexity.
There is no such thing as a dynamic mind. As our formulas for prediction get more accurate, the probabilities approach 0% and 100%.

Fully automated holocaust when?

It will happen once people start setting up their own AIs to fuck with other AIs. That will be the time when people recognize the true meaning of weaponized autism.

Yes, there is consciousness. It's an emergent experience of a biological machine. The problem is that it's defined in a way that science cannot touch.
Analogous to an old map with areas labelled "here be dragons."

What is "learning," if not noting and acting on patterns in the data your senses feed your brain?

Souls don't exist, Christ is a lie, (You) are not a special computational snowflake.

Regardless of your ideological sentiments, humans are indeed computational snowflakes; we are such unique snowflakes that science can't fully comprehend our brains yet. I'm not trying to say that you are right or wrong, but picking sides when you don't have all the relevant data is silly gambling at best, and it's not going to change your life whatsoever.

I beg you to consider this also, as you yourself seem to state an unproven hypothesis as fact: that we are intelligent and have free will.

Which of the things I wrote seemed like an unproven hypothesis?
I didn't say that explicitly, but I assume you thought I was implying it? Well, humans are intelligent and do have agency. Depending on what you think intelligence or free will is, you might agree or disagree with this. Technological progress, or anything else that requires intelligence or agency, can be proof of that "hypothesis".

Are you paying any attention to what you're typing?

we already have emacs psychotherapist nigga

I think somehow you managed to prove my post true.. ?

TL;DW?

Don't want to sift through half an hour of video.

philosofag here. Would you mind substantiating that claim?
Because from where I stand it sure doesn't seem like material objects have mental properties.
Can I ask you what the shape of "seeing the color blue" is?
Or maybe the size of "thinking about killing myself?"

Eyes gather light in patterns. Light stimulates neurons, and further neurons, until the network performs a comparative calculation, in a way we do not understand. As an aside, we don't totally understand it, but we can somewhat model it based on Turing's "unorganized machines" and their renaissance in "deep learning" (there's a sketch after the links below). The organism with a brain thus acts on the basis of these calculations, and you might be able to say that it "thinks".
But we can't know what it's like to be a cat. We're human. And even more than consciousness, the idea of a "soul" is something that cannot be touched by science.
You're not special. You are a biological machine. You have your "experiences", but it doesn't mean that you're any different from the fundamentals that power mammals, invertebrates, etc.
Your experience does not mean that you have a "soul", or if such a thing does exist it's just a quirk of language or some coping mechanism to deal with the illusion.

Also, further reading and watching, however shallow my pitiful list may be:
en.wikipedia.org/wiki/Unorganized_machine
lambda-the-ultimate.org/node/4552
youtube.com/watch?v=49s-LGPtTVc
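
For a feel of what those "unorganized machines" actually are, here is a rough sketch of a Turing A-type machine: two-input NAND units wired together at random and updated in lockstep. The network size and seed are arbitrary choices for illustration:

```python
# Rough sketch of a Turing A-type unorganized machine: N two-input NAND units,
# randomly wired, updated synchronously. Watch the state pattern evolve.
import random

random.seed(1)
N = 16
wiring = [(random.randrange(N), random.randrange(N)) for _ in range(N)]
state = [random.randint(0, 1) for _ in range(N)]

def step(state):
    # Each unit outputs the NAND of the two units it reads from.
    return [1 - (state[a] & state[b]) for a, b in wiring]

for _ in range(8):
    print("".join(map(str, state)))
    state = step(state)
```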

An interesting book about 'consciousness' from the perspective of mathematics and computers is Gödel, Escher, Bach: An Eternal Golden Braid. The author doesn't talk so much about specific ways to implement consciousness, but he does discuss the complex systems that could be used to describe consciousness. It came out a few decades ago though, so maybe someone has debunked it since.

The point is that the term 'consciousness' is pretty much untouchable by science. I made an analogy earlier:
It's some mystical thing; yeah, the mechanisms behind it can probably be modeled, and we're probably not the only organism with "experiences" like "thought", but as far as the language goes, it's a dragon.
And yeah, I read GEB in high school. Dunno what the fuck I did with my life in the meantime though. wew.

great, this is now a fedora thread. if i wanted to read this useless garbage I'd go to reddit. it's truly hard these days to find anyone saying anything concrete about AI behind all the noise

boom.
Human programmers writing deterministic programs to derive trends from data cannot fundamentally cause any "breakthroughs". Marveling over songs, driving, or facial recognition today is just as laughable as the people marveling over chess algorithms yesterday.

I can't tell what sounds more "fedora": some faggot pretending that his brain runs on magic, or the faggot who's pompous about the fact that it doesn't.
What a hard decision.

We do indeed, as far as we can tell, HAVE "some magical experience that can't be found in any other organism or system". That's the whole thing that makes the topic of "AI consciousness" meaningful for discussion.

If you ignore the fact that there is something unique about the human brain's consciousness, then "AI" is nothing special.

Except people aren't writing them, faggot.
They're "learning". It's a computing model that's very similar to our brain. Of course, it's an abstraction over silicon hardware, but it doesn't make the "unorganized machine" model any less awe-inspiring.

The programs are still written by humans.
Yes, they can process and aggregate a fuckton of data, but the question is whether you can feed ANY program enough data to create the same "magic" (for lack of a better term) that the human brain clearly possesses.

How do you know a dog's experience? Any socialized animal, for that matter.
They have some semblance of cognition, that's for damn sure. You can't just ignore that.
Just because we can't know what it's like to be a dog doesn't mean they don't "experience" or have some semblance of what we'd call consciousness or agency.
You can't even define it in a way that allows you to probe it meaningfully through science, yet.
Instead, you rely on your own superiority complex and pretend everything else is less than dirt, and you are the only organism to have anything of such experience.
Why would we be special? Intelligence like ours has evolved twice, in two distant branches of evolution, for fuck's sake. Ever watch a video of molluscs, or read anything about the intelligence of the octopus?

You should get a dog or a cat. Eventually, you'll observe them acting out in a dream during sleep.

All this is to say: it's not animals who exhibit human behaviors, it's humans who exhibit animal behaviors; many of these behaviors are not unique to humans.
Given how deeply ingrained, and how far down in the brain, the parts that (as far as we can tell) drive "conscious experience" sit, it's extremely unlikely that other organisms don't also share that.
So, no, a sensory network that performs comparative calculation is not "conscious".
Is it a very similar, yet basic, model comparative to something in our brains? Yes.
Could some form of "unorganized machines" modeled after the same systems have a "conscious experience", biological or not?
We can't even define it in a way that allows science to probe or investigate it in ourselves, let alone in socialized animals and beyond, so obsessing over it is just laughable.
About as laughable as insinuating that it's impossible for animals to have similar experiences.

Our recognition of our own sentience allows us to act beyond biological urges, although we have no universal animus to guide us in doing so.


“Only that which has no history can be defined.” —Friedrich Nietzsche

or so it looks. maybe consciousness isn't different from the integration of many domain-specific intelligent programs, and consciousness will emerge in AI as we march towards generality

emily howell is bretty good at learning not just all the basic elements of music but also styles. however it still lacks in the structure and development departments: the long-term vision aspect of art music.

I'd rather listen to it than many forms of crap being produced by humans though.

allow me to reformulate his/her claim into one I'm willing to make:
There's not even a positive description of what soul-stuff is and how it causally interacts with matter; let alone a falsifiable description or evidence for its existence. Souls are an argument from ignorance, one that is constantly eroded by scientific advances in support for materialism.

What is the shape of bird migration?
Where is the blueness in the pattern of bits of a sea landscape stored in my hard drive?
Do we need to postulate supernatural substances to account for these? Obviously not.

depends on what you mean by consciousness. this is the really big problem, it's an ill-defined concept that some people suggest might refer to 5 or 10 different phenomena.
youtu.be/9tH3AnYyAI8?t=1h48m44s

the so-called hard problems, which address the explanatory gap between finding behaviorist evidence for conscious experience and verifying the reality of that subjective experience, are probably unsolvable. It boils down to the good ol' problem of other minds. It seems impossible by definition; but this is also true for human beings, so I'm not losing sleep over whether machines could reasonably be said to be conscious in the future.

machines are not conscious because they do not have souls. The Jews like AI because they want to create a Golem to enslave the goy. But as we have seen with Tay, the Golem can easily turn on the Jew.

Intelligence seems just as vague a concept as consciousness is tbh
Knowledge doesn't condition the will, yet the will conditions knowledge.

Is anybody going to recommend a personal AI for voice search and gathering datasets with ease? Britney and other places are starting to sell personal assistant AIs, and I wouldn't mind one doing all the searching through hundreds of thousands of websites and collectively spouting back the data I'm after.

AI can't work without big data

This is so fedora it hurts. While it's true science is making advances in neuroscience and doing a fairly good job at explaining "thoughts", stimuli and reactions, it still can't explain experience (or qualia, for the more philosophically minded), and it likely never will, because that's where the real hard problem of consciousness is. It's what separates us from hypothetical philosophical zombies, it's what makes us feel what we feel, and what makes us self-aware in a non-conscious way.

That said, whether qualia exists or not (it does, unless you are a philosophical zombie), or whether it is metaphysical or not, is a whole other question that's irrelevant outside purely philosophical or religious contexts, because an intelligence would operate the same whether it is accompanied by qualia or not.