Euclideon

Are these guys ever going to actually release anything we can get our hands on and play with?

youtube.com/watch?v=1-gf8p5_t7M

Other URLs found in this thread:

youtube.com/watch?v=gM6QkPsA2ds
youtube.com/watch?v=GjPWk0UhKDQ
youtu.be/5AvCxa9Y9NU
youtube.com/channel/UCX_5t4F9zsOVfW5_0alc7AQ/videos?disable_polymer=1
youtube.com/watch?v=XqkKTDlD2d8
youtube.com/watch?v=f3Ets6bWjEQ
youtu.be/00gAbgBu8R4
codersnotes.com/notes/euclideon-explained/

youtube.com/watch?v=gM6QkPsA2ds

oh lawd

youtube.com/watch?v=GjPWk0UhKDQ

hmmm this might redeem them

That's pretty dumb because:
1. The engine still uses textures
2. It uses a huge amount of processing power
3. It demands a great deal of effort
4. It can't be used to make VR games

Also, the guy compared current-year engine graphics made by top professionals with a 2011 demo made by amateurs.

See this here:
youtu.be/5AvCxa9Y9NU


They already made it, as you know. Their biggest asset is their method, which is almost certainly patented, so things will proceed slowly.

I hate the patents system.

Everyone does, even the patent holders. But you need some way to reward inventors.
If I had power over this, I would set up a system where the government pays the inventor a monthly salary based on the outreach, adoption and relevance of the invention - thereby freeing patents and making everything public domain.

They have an application designed for geospatial information systems. The idea is that you can scan an area and display it on any laptop. This kind of system is useful for engineers and planners who need to see an area without physically being there.

They already did. There was a web demo IIRC.
It's mostly shit because the technique requires that everything be static. The only way they can animate anything is with "flipbook"-style animation (a rough sketch follows below this post).

They also made some sort of VR game / experience place or something.
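For reference, "flipbook" animation over static scenes boils down to something like the sketch below. Purely illustrative - the names and structure are mine, not anything Euclideon has published.

```python
# Toy sketch of "flipbook"-style animation: every animation frame is its own
# pre-built static model, and playback just swaps which model gets rendered.
# Nothing inside a scene ever moves. Names here are illustrative only.
import time

def play_flipbook(frames, render_scene, fps=24, loops=1):
    """frames: list of pre-built static scenes; render_scene draws one of them."""
    for _ in range(loops):
        for scene in frames:
            render_scene(scene)    # swap in the whole static scene for this tick
            time.sleep(1.0 / fps)  # hold it for one frame interval
```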

irrelevant thread

There is a guy on YouTube who has been making something similar in Unity:
youtube.com/channel/UCX_5t4F9zsOVfW5_0alc7AQ/videos?disable_polymer=1

Below is a comment from this video that says something about his and Euclideon's point-cloud renderers: youtube.com/watch?v=XqkKTDlD2d8

Don't forget that they bypass memory and render the data directly.
It's basically the perfect thing everyone ever wanted.


Not...

You don't know what you're talking about, do you?
youtube.com/watch?v=f3Ets6bWjEQ

I don't think this is quite the same method Euclideon uses. Euclideon's method is based on a tree search: the detail of a scene comes from searching through a tree of nodes in a pre-structured scene. I'm not sure this voxel system works the same way.
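For what it's worth, the tree search being described probably looks something like the sketch below: a sparse voxel octree that is only descended until a node is roughly pixel-sized on screen, so detail comes straight out of the pre-built tree. The node layout, the cutoff heuristic and every name here are my own guesses for illustration, not Euclideon's actual data structure.

```python
# Sketch of a sparse-voxel-octree search: descend the pre-built tree and stop
# wherever a node would cover no more than about one pixel, emitting one point
# per such node. Illustrative only; not Euclideon's code.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class OctreeNode:
    center: tuple                  # (x, y, z) center of the node's cube
    size: float                    # edge length of the cube
    color: tuple = (255, 255, 255)
    children: List[Optional["OctreeNode"]] = field(default_factory=lambda: [None] * 8)

def collect_visible(node, eye, pixel_angle, out):
    """Append (position, color) for every node that is leaf-sized or pixel-sized."""
    if node is None:
        return
    dx, dy, dz = (c - e for c, e in zip(node.center, eye))
    dist = max((dx * dx + dy * dy + dz * dz) ** 0.5, 1e-6)
    # Small-angle test: if the node subtends <= one pixel, one point is enough.
    if node.size / dist <= pixel_angle or all(c is None for c in node.children):
        out.append((node.center, node.color))
        return
    for child in node.children:
        collect_visible(child, eye, pixel_angle, out)

# Usage per frame (pixel_angle is roughly the camera FOV in radians divided by
# the screen width in pixels):
#   points = []
#   collect_visible(root, camera_position, fov_radians / screen_width, points)
```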

Euclideon says they fill the point map in with voxels. It's in their first video.
youtu.be/00gAbgBu8R4

I met the guy who wrote the Atomontage engine at Oculus Connect 4. He had John Carmack use the engine in VR while kneeling on the floor. It was an absolute 10/10 topest of keks.
PS: check out his super lit pelican case rig.

I've been following them for a while.

I've read their patent; it's cool and a good idea, but not enough to base a company on. The patent is basically about approximating a perspective projection of the scene (hard to compute) with an orthogonal projection (easy to compute). I didn't read the whole thing, but I assume they stitch together a bunch of orthogonal projections to build the whole scene (a toy sketch of the idea follows after this post).

The key is that this is possibly an advancement on voxel rendering, but it isn't going to fundamentally change the situation of voxel rendering vs polygon rendering. Given this, their business strategy makes no sense. They should have been doing one of two things:

(1) marketing their method as a voxel rendering engine that could be used alongside polygon rendering. This would require dropping the bullshit about replacing polygon rendering, and focusing on running their code on the GPU.

(2) marketing their method as a replacement for existing voxel rendering. This would require doing direct comparisons with existing voxel rendering engines, which I'm pretty sure they never did.

Instead, they first marketed their method as a replacement for polygon rendering. Then, when this didn't work, they pivoted to "hologram" technology (i.e. 3D projectors/screens and glasses), which in no way depends on their actual tech.
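To make the projection claim concrete, here's a toy version of the idea as I read it from that post (not the patent's actual construction): a perspective projection needs a divide by depth for every point, while an orthographic projection is a single scale, so a batch of points at similar depth can be handled with one orthographic pass.

```python
# Toy comparison of perspective projection (a divide per point) with an
# orthographic approximation (one shared scale for a depth-coherent batch).
# My own illustration of the idea, not the patented method.
import numpy as np

def perspective_project(points, focal):
    """Exact: x' = f*x/z, y' = f*y/z for each point."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    return np.stack([focal * x / z, focal * y / z], axis=1)

def orthographic_approx(points, focal, z_ref):
    """Approximate: scale the whole batch by f/z_ref instead of f/z per point.
    Accurate when every point in the batch has z close to z_ref."""
    return points[:, :2] * (focal / z_ref)

pts = np.array([[0.9, 0.4, 10.0], [1.1, 0.5, 10.2], [1.0, 0.6, 9.8]])
exact = perspective_project(pts, focal=800.0)
approx = orthographic_approx(pts, focal=800.0, z_ref=float(pts[:, 2].mean()))
print(np.abs(exact - approx).max())  # small, because the depths are similar
```

If the scene is cut into many small depth-coherent chunks, each chunk gets its own z_ref, and stitching the chunks together approximates the full perspective view.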

If there were a patent, anyone with a sliver of programming knowledge would be able to make a duplicate. That's why nobody files their industry secrets as fucking patents.

Or you can, you know, just charge royalties for using your patent.

On a mathematically perfect Turing machine, yeah.

It's shit.

codersnotes.com/notes/euclideon-explained/

But the patent system does the opposite, especially if people want to research prior art and build on previous discoveries:
>only patentable ideas will receive funding (because patents are weapons for, and deterrents against, lawsuits, and you can collect rent by licensing them for money)

They fixed that problem without your mememachine.

At this point "Unlimited Detail" is just an investment scam. They've been showing the same shitty demos for a decade now, while the dirty polygon tech has far exceeded what they've been promising.

Maybe it doesn't require it, but that is what they are currently doing for performance reasons AFAIK.

This isn't by Euclideon though. I'm not discrediting voxel-based engines as a whole, just what Euclideon is doing.

Well, you're wrong. Modern engines still can't do what Euclideon does.
That game they made? They only had a team of 3 people making it, 3 amateurs.

It's not due to performance reasons. They're simply creating a game engine from scratch, they have no experience with it at all, and their whole company has only 30 employees, of whom maybe 10 write code.

Also, in case you guys don't know, Euclideon can't partner with, sell to, buy, or be bought by any other company - they're funded by the Australian government and don't have any decision-making power of their own.

Maybe, but they can release a product which is an essential feature in any game engine.
Download link? Big claims with no proof - you can't fault others for assuming it's snake oil.
Call me crazy, but maybe they should hire someone who has the experience instead of doing it wrong just to learn how not to do it.

Their purpose isn't even to be a game company or to make a game engine; they're a geomapping service provider doing extra projects in their free time.
It wasn't released to the public.
It's in OP's video.
Are you being forced to buy anything? Were you cheated? No.
Agreed.

Why would I want to put a face botnet on?

...

...

Game engines have existed for ages. They don't have to do it right if they don't want to, but if they have any interest in it, the lead must know how to develop an engine or they'll end up with an unusable mess.

There's also some really cool research out there on using machine learning to remove noise from an image/frame, meaning you can do all sorts of complicated lighting and stuff in real time.
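The overall pattern there is: render cheaply with only a few samples per pixel, then denoise the result before display. The sketch below uses a plain box filter as a stand-in where a learned denoiser would sit - the functions, sizes and numbers are made up for illustration.

```python
# Render-cheap-then-denoise pattern. The box filter is only a placeholder for
# a trained denoising network with the same interface (image in, image out).
import numpy as np

def noisy_render(h, w, samples=2, seed=0):
    """Stand-in for a low-sample-count path trace: a smooth truth image plus noise."""
    rng = np.random.default_rng(seed)
    truth = np.linspace(0.2, 0.8, w)[None, :].repeat(h, axis=0)
    noisy = truth + rng.normal(0.0, 0.3 / samples ** 0.5, size=(h, w))
    return noisy, truth

def denoise(img, radius=2):
    """Box-filter placeholder for the learned denoiser."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    k = 2 * radius + 1
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

noisy, truth = noisy_render(64, 64)
print(np.abs(noisy - truth).mean())           # error before denoising
print(np.abs(denoise(noisy) - truth).mean())  # error after: noticeably lower
```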

They're essentially the same thing, points in a 3D space like you said. Voxels are generally thought of as the "pixels" of a 3D grid of known dimensions, say 512x512x512, like how pixels are a unit of a 2D grid. Point clouds are generally used in a world space of undefined dimensions, like cm or mm in XYZ as floating-point coordinates, but they can just as easily be integer coordinates, just like in a voxel volume. It's trivial to convert a point cloud to a voxel volume and vice versa, so they are equivalent mathematically. I think the easiest way to think of it is that voxels are elements of a space with predefined, quantized dimensions, and points of a point cloud are elements of a space with undefined, non-quantized dimensions. Or maybe more succinctly, voxels are elements of a specific, defined voxel volume, and point clouds are just a collection of points.
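The conversion being described is basically quantization. A small sketch (my own toy code, with an arbitrary grid size):

```python
# Point cloud -> occupancy grid by snapping coordinates to cells, and back by
# emitting each occupied cell's center. Grid size and names are arbitrary.
import numpy as np

def points_to_voxels(points, voxel_size, grid_dim):
    """points: (N, 3) float world coordinates -> boolean occupancy grid."""
    idx = np.clip(np.floor(points / voxel_size).astype(int), 0, grid_dim - 1)
    grid = np.zeros((grid_dim,) * 3, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid

def voxels_to_points(grid, voxel_size):
    """Occupied cells -> one point per cell center (sub-voxel detail is lost)."""
    return (np.argwhere(grid) + 0.5) * voxel_size

cloud = np.random.rand(1000, 3) * 5.0                    # points inside a 5 m cube
grid = points_to_voxels(cloud, voxel_size=5.0 / 128, grid_dim=128)
print(grid.sum(), voxels_to_points(grid, 5.0 / 128).shape)
```

Note that the round trip snaps everything to cell centers, which is exactly the quantized vs. non-quantized distinction made above.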

But user, there is no bijection between the set of real numbers and the set of integers.

There is a bijection in real world datasets.

...

(not him)
True, but there is between the set of floating point numbers in any architecture and integers.
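Concretely: on any fixed architecture the floats are a finite set of bit patterns, so reinterpreting those bits gives a one-to-one map into the integers. A quick sketch for IEEE 754 doubles:

```python
# Reinterpret the 64 bits of a double as an unsigned 64-bit integer and back.
# This is a bijection between the finite set of double bit patterns and the
# 64-bit integers, which is all the comment above needs.
import struct

def float_to_bits(x: float) -> int:
    return struct.unpack("<Q", struct.pack("<d", x))[0]

def bits_to_float(n: int) -> float:
    return struct.unpack("<d", struct.pack("<Q", n))[0]

for x in (0.0, 1.5, -2.25, 3.141592653589793):
    assert bits_to_float(float_to_bits(x)) == x  # round-trips exactly
```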

That's not how it works. A bijection means every element of one set is paired with exactly one element of the other and vice versa; with the naturals and the reals you can't do that, because the reals are a larger infinity (basically there are infinitely many numbers between 0.0 and 1.0, but not between 0 and 1).

True, I only nitpicked the comment because he said it was true mathematically.

There are no infinite numbers in real life.

Holy shit, you're clueless.

Reminder that quantum mechanics posits the entire universe is composed of 1-Planck-length voxels, at a frame interval of 1 Planck time, with energy leaping at a discrete bit depth of one Planck energy.

Reminder that quantum mechanics doesn't exist.

Just because we can't measure lengths smaller than 1 Planck length doesn't mean a thing's length can be represented by an integer number of Planck lengths.

Integers don't exist in real life, for they are an infinite set. You can write down an infinity symbol, but that does not make what it represents constructible.