C++ or Java

Which is better, Java or C++, for making mobile applications and video games?

also

C++ = better, Java = easier.

Java, because of its ability to be used on many platforms with ease.
C++, because of the performance you get from handling memory management yourself and not carrying the huge overhead of garbage collection. You can also inline assembly to speed up certain parts.
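As a hedged illustration of that last point, here is a tiny sketch of GCC/Clang extended inline assembly on x86-64, reading the timestamp counter to time a hot loop. The loop being timed is just a placeholder, not anything from this thread:

#include <cstdint>
#include <cstdio>

// Read the x86-64 timestamp counter via GCC/Clang extended inline assembly.
static inline uint64_t rdtsc() {
    uint32_t lo, hi;
    __asm__ volatile("rdtsc" : "=a"(lo), "=d"(hi));
    return (static_cast<uint64_t>(hi) << 32) | lo;
}

int main() {
    uint64_t start = rdtsc();
    volatile int sink = 0;                      // placeholder work to time
    for (int i = 0; i < 1000000; ++i) sink = sink + i;
    uint64_t end = rdtsc();
    std::printf("cycles: %llu\n", static_cast<unsigned long long>(end - start));
    return 0;
}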

But both of these are fucking botnets.
Use Rust for video games: you avoid Java's garbage collection and still retain C++'s memory speed. You also don't have to worry about learning how to manage memory for each target platform, because Rust won't let memory errors through. It would be unwise to use Rust for mobile apps, though, because you would have to compile for a great many platforms and their GUI systems, which would be a pain in the ass even with something like Qt.

Research just a little bit before being an idiot

HolyC, you heathens.

C++ botnet wth

I'd rather kill myself. The only things a game needs are to be performant and, if it's online, to be free of potential RCE vulnerabilities, and the first one is more about designing a good system than about muh C++. I mean, Minecraft gets slowdowns not because it was coded in Java, but because it was coded by Notch. There are ways and ways to optimize Java and any GC language, and one of them is pooling via factories and never calling new, which I am sure Minecraft doesn't do. Is Java going to be enough for a triple-A game? Probably not, but it's not like you will ever write a AAA game all by yourself.

What I am trying to say is: just use whatever for 2D, and anything not interpreted for 3D. I'm looking into Nim for gamedev right now, and it seems like a sane option for everything but the most batshit engines.

...

Nobody gives a shit about the security of games. Memory safety has everything to do with performance, you dipshit. If I load *insert video buffer here* and transfer it to an OpenGL context via inlined LLVM intermediate language, I don't want any memory leaks being compensated for by the GPU as a performance loss.
Plus, most people making games nowadays are indie devs and outsourced AAA shops. So it's better that a language holds their hand instead of letting them fuck up all over the place the way C++ does.

I guess Holla Forums is right when they say these people can't into memes.

So, this is what Rust fairyland looks like. For games, even indie-tier shit needs you to come up with the most performant implementation. Anything that gets in the middle is an obstacle.
I'll not say that memory bugs aren't one obstacle. But Rust's restrictive ideology would be a much bigger one.

Yes. And then there's no point.

For problem domains that need performance, given enough resources and time you would even prefer to go down to the assembly level in some parts. Sadly, most of the time we deal with what's reasonable. Rust, however, is much more of a compromise, made only for some vague concept of security that, while nice, has nothing to do with the domain of games.
Of course, some indie games might not require performance... But then there's no particular need to go for a performant language like Rust either.

Do a game in Rust if you like. There's nothing wrong with that. But don't try to claim Rust is the best for all problem domains.

Or actually, continue doing it. I want the Rust dream to fail at this point. You people are extremely obnoxious, and the best way for you to lose is to continue doing what you're doing.

Until you fuck up and those pesky Brazilian hackers figure out how to inject cryptoware into your clients' computers because your sloppy C code expected very specific packet lengths. Something similar already happened with GMod, and it was a shitstorm.
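To make that failure mode concrete, here is a minimal C++ sketch of length-prefixed packet parsing that validates the claimed length before copying. Everything in it (the Message type, the 64 KiB cap, the little-endian wire format) is a hypothetical illustration, not anything from the actual GMod incident:

#include <cstdint>
#include <cstring>
#include <optional>
#include <vector>

// Hypothetical length-prefixed message. The classic bug is trusting the
// length field sent by the peer and copying that many bytes blindly.
struct Message {
    std::vector<uint8_t> payload;
};

std::optional<Message> parse(const uint8_t* data, std::size_t received) {
    if (received < sizeof(uint32_t))
        return std::nullopt;                 // too short to even hold a header

    uint32_t claimed_len;                    // assumes little-endian wire format
    std::memcpy(&claimed_len, data, sizeof(claimed_len));

    // Reject anything whose claimed length disagrees with what actually
    // arrived, or exceeds a sane maximum, instead of blindly copying it.
    constexpr uint32_t kMaxPayload = 64 * 1024;
    if (claimed_len > kMaxPayload || sizeof(uint32_t) + claimed_len > received)
        return std::nullopt;

    Message m;
    m.payload.assign(data + sizeof(uint32_t),
                     data + sizeof(uint32_t) + claimed_len);
    return m;
}

int main() {
    uint8_t wire[8] = {4, 0, 0, 0, 'p', 'i', 'n', 'g'};  // len = 4, "ping"
    auto msg = parse(wire, sizeof(wire));
    return msg ? 0 : 1;
}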

Memory leaks are fixable and cause no damage to clients beyond minor annoyances. The damage caused by non-DoS zero-days may be unfixable.

Rust has a problem, and it is that you will never finish a game engine with it if you give a fuck about performance. You have to be very sure where to implement Clone, where to put your heap-allocating object in that kilometric type signature, wonder if there is a better way to do something (it's Rust; there WILL be a better way), design your data structures around all this, realize all those Rc's and Boxes will inevitably slow down your application because of cache misses, and refactor your code 1001 times when you inevitably realize you fucked up your type signatures. Rust is not really the most execution-efficient language for these kinds of things (although it comes close), but it may be one of the slowest to develop in. Seriously, efficient systems make more of a difference than GC or no GC; just give a fuck about data structures and algorithmic complexity and you will be fine.

Also, lol.

Big teams are just one core developer and a bunch of orbiters, each doing a tiny amount of busywork.

I don't know about mobile applications, but I'd guess Java. For games, C++ only. You need the performance. Even when making pixel shit, it should be fast. Lazy garbage that looks like NES and yet requires a 2 GHz CPU and a GPU makes my autism go wild. I'm coding a pixel-shit game and it runs on my 166 MHz Pentium 1 computer (and that's quite a fast PC for this kind of game, but I'm not an optimisation god yet).

Most pixel shit is now C# (Java) in Unity. People seem to have accepted the stuttering.

This is my shit. The video is very old; I have very little time to work on this, unfortunately. 640x360, 256 colors, software rendering, all with my own blit routines. I use SDL 1 & 2 for creating the window, input and sound, but I use my own drawing routines. Not because SDL is too slow, far from it, their routines are very optimised, but they have a case for everything (is this an 8/16/24-bit surface? does it have transparency? does it clip or not? etc.), so making my own for my specific case works a little bit faster (and besides, that's just what I like to do). Also, SDL doesn't have any blit routines for rotating or scaling (in software).

This is recorded from my normal PC running SDL2; the same code in SDL1 gives me about 400 frames. SDL2 is a bit slower, because it doesn't have the old-style interface for software graphics. It can only do hardware-accelerated textures (which is good if you're making a normal modern game). So what I have to do is make a streamable texture that acts as my framebuffer while keeping my own software 8-bit framebuffer in RAM. When the whole frame is drawn, I convert it to the texture format and display it as such. That's why it's at ~150 FPS; that's just how fast my GPU (the integrated GPU in a 2009-ish laptop) can render such textures.
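For illustration, a minimal C++/SDL2 sketch of the streaming-texture setup described above. The 8-bit framebuffer, the grayscale palette, and the window size are stand-ins; the real blit routines would draw into fb:

#include <SDL.h>
#include <cstdint>
#include <vector>

// Keep a software 8-bit indexed framebuffer, expand it through a palette
// into a streaming SDL2 texture each frame, and let the GPU present it.
int main() {
    const int W = 640, H = 360;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("fb", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, W, H, 0);
    SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
    SDL_Texture* tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                         SDL_TEXTUREACCESS_STREAMING, W, H);

    std::vector<uint8_t>  fb(W * H, 0);      // the software 8-bit framebuffer
    std::vector<uint32_t> palette(256);      // 256-color palette, ARGB
    std::vector<uint32_t> out(W * H);        // expanded frame for the texture
    for (int i = 0; i < 256; ++i)            // placeholder grayscale palette
        palette[i] = 0xFF000000u | (i << 16) | (i << 8) | i;

    bool running = true;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) running = false;

        // ... game logic draws into fb with custom blit routines here ...

        for (int i = 0; i < W * H; ++i)      // palette lookup, 8-bit -> ARGB
            out[i] = palette[fb[i]];
        SDL_UpdateTexture(tex, nullptr, out.data(), W * sizeof(uint32_t));
        SDL_RenderCopy(ren, tex, nullptr, nullptr);
        SDL_RenderPresent(ren);
    }
    SDL_Quit();
    return 0;
}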

The 166 MHz Pentium 1 laptop can run this at about 25 frames, but I'm sure I can make it run at 60 FPS, because the bottleneck is Windows 98... at least I think so. When I removed the final function that puts the finished framebuffer on the screen (so the code was still doing the game logic and the pixel manipulation in memory), it suddenly ran at about 75 FPS. I want to try coding my own tiny VESA driver and see if that makes it faster. You can run real-mode code in Win9x, so why not take advantage of that and run in direct graphics mode, bypassing the Windows screen driver. I think the best thing to do for such a game would be to just make it runnable from DOS. It's possible to enter protected mode from DOS and run your 32-bit code, but I don't think I have enough talent/autism to do that. Besides, this is barely a game yet, just a tile and sprite rendering thing.

This is nothing to brag about... but I wanted to explain things a little bit.

The graphics are taken from the Amiga game Turrican II and are used as placeholders only. They were upscaled by a factor of 2, so the actual resolution is better than it seems from this video.

I meant that SDL1 gives me about 400 frames

When will you ever learn?
Holla Forums is always right.

Do you need to program on the GPU? C++.
Otherwise, Java.

Minecraft is about the level of graphics quality you can optimally get out of Java.

tfw mobile dev.

For physics simulations and similar workloads running on custom computers with large numbers of graphics cards (16+), Java gets dropped in favor of CUDA C++ or OpenCL.

Shitty music. Switch to aesthetic 80's vaporwave tbh.

Vulkan now but yes.

Sure.
Garbage collection => no-go for games with action, unless you don't care about making a quality product.


Your time is clearly not worth much.
Just don't be a fucking retard; use destructors to free whatever needs to be freed or pre-allocate and reuse resources.
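A minimal C++ sketch of both suggestions. The Texture type and its numbers are hypothetical; the point is the deterministic destructor and the up-front reserve:

#include <cstdio>
#include <vector>

// RAII: the destructor frees the resource, so there is nothing to forget.
// Texture is a hypothetical stand-in for any GPU or file resource.
struct Texture {
    int handle;
    explicit Texture(int h) : handle(h) { std::printf("alloc %d\n", handle); }
    ~Texture() { std::printf("free %d\n", handle); }   // freed automatically
    Texture(const Texture&) = delete;                  // no accidental copies
    Texture& operator=(const Texture&) = delete;
};

int main() {
    {
        Texture t{1};        // acquired here...
    }                        // ...freed here, deterministically, no GC pause

    // Pre-allocate and reuse instead of allocating mid-frame:
    std::vector<float> particles;
    particles.reserve(10000);            // one allocation up front
    for (int frame = 0; frame < 3; ++frame) {
        particles.clear();               // reuses capacity, no new allocation
        for (int i = 0; i < 1000; ++i) particles.push_back(i * 0.5f);
    }
    return 0;
}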

This is not in-game music, although I will use mod tracks. The vaporwave meme is shit.

How about a meme engine that does all the work for me, where all I have to learn is a Python-like language?

Vaporwave isn't a meme and isn't shit.
The LE 80'S XDD aesthetic is beyond garbage, though.

Protip: the GC usually runs on allocations. That means you can trick it into not running by doing all those news in loading screens, or by pooling objects. You know, what you should do in C++ anyway if you are not retarded. Minecraft slowdowns are due to a stressed GC full of faraway chunks to free.
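An illustrative sketch of that pooling pattern in C++ (all names made up for the example); the same idea carries over to Java or any GC language:

#include <cstddef>
#include <vector>

// Object pool: all Particles are allocated once up front; "spawning" hands
// out a dead slot instead of calling new mid-frame.
struct Particle {
    float x = 0, y = 0, vx = 0, vy = 0;
    bool alive = false;
};

class ParticlePool {
public:
    explicit ParticlePool(std::size_t n) : slots_(n) {}   // one allocation

    Particle* spawn(float x, float y) {
        for (auto& p : slots_) {
            if (!p.alive) {                // reuse a dead slot
                p.x = x;  p.y = y;
                p.vx = 0; p.vy = 0;
                p.alive = true;
                return &p;
            }
        }
        return nullptr;                    // pool exhausted; no new allocation
    }

    void update(float dt) {
        for (auto& p : slots_) {
            if (!p.alive) continue;
            p.x += p.vx * dt;
            p.y += p.vy * dt;
        }
    }

private:
    std::vector<Particle> slots_;
};

int main() {
    ParticlePool pool(1024);               // done during a loading screen
    pool.spawn(0.0f, 0.0f);
    pool.update(1.0f / 60.0f);
    return 0;
}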

Also, muh grafix performance is usually mostly GPU-bound. That means OpenGL and GLSL will do the work for you and your CPU, and you could technically get muh grafix in any language. Fuck, even WebGL runs okay on fucking JavaScript.

There are things the GPU still can't do. And there are things that require you to lay out memory in certain ways before feeding it to the GPU (interleaved vs. non-interleaved data, for example).
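A short sketch of what the interleaved layout looks like on the CPU side with OpenGL. This assumes an existing GL context and a loader such as glad, and the Vertex format here is purely illustrative:

#include <cstddef>
#include <glad/glad.h>   // any GL loader works; a current context is assumed

// Interleaved layout: position and color of each vertex sit next to each
// other, so one vertex is one contiguous, cache-friendly chunk.
struct Vertex {
    float pos[3];
    float color[4];
};

void setup_interleaved(GLuint vbo, const Vertex* verts, GLsizei count) {
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, count * sizeof(Vertex), verts, GL_STATIC_DRAW);

    // Both attributes read from the same buffer with a stride of one Vertex.
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (void*)offsetof(Vertex, pos));
    glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (void*)offsetof(Vertex, color));
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
}

// The non-interleaved alternative keeps all positions in one array and all
// colors in another, uploaded separately and bound with stride 0.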

But sure, you can make JS draw amazing stuff thanks to WebGL (whose shaders are a C-like language that compiles like a non-interpreted language); that's beside the point.

Even with WebGL2, if you use regular JS you will be extremely limited in what you can do. JavaScript is excellent for showing the capabilities of WebGL, but when you want to render thousands of particles or do a real-time simulation with 100 ragdolls, you and I both know that one will lag and the other will not.

Can't wait until WASM becomes widespread, so JS loses its one advantage and people start using other languages.

You might as well use C++ and not have a GC at all. You'll also avoid the overhead of a virtual machine while you're at it.


/thread


It's like you shills are not even trying anymore. Holy shit.