Game Creation/Design Catch All

Um...
Impressive, wow! Does this mean no UV mapping either?
Unreal Engine 5 tech demo (skip a few minutes in to get to the game footage):
https://www.unrealengine.com/en-US/b...

From what it sounds like, yeah, no UV mapping. When you have triangles as small as pixels you can probably flat-shade them to get really good effects.

So I wonder if it is some sort of voxel-meets-streaming technology, because that is essentially the principle behind voxels.
Or perhaps someone has unlocked ZBrush's pixol technology. Lord knows you are going to need a content creation tool built on this technology to have any chance of creating art for the engine in a normal time frame. I guess the 3D scanning industry will explode and artists can just kit-bash scanned data together.

It will cause major headaches tuning these levels: fixing clipping, getting stuck, and falling-out-of-the-world issues.

Come to think of it, you are going to have to have art tools based on this technology just for the streaming alone. ZBrush can handle a gazillion polygons, but without streaming you will have to make/fix/detail them in chunks, where load times will kill productivity and impair testing and finding issues.

Rykin wrote:

From what it sounds like, yeah, no UV mapping. When you have triangles as small as pixels you can probably flat-shade them to get really good effects.

The size of a triangle compared to a pixel is a function of how far away the viewpoint is. Setting that aside, your smallest color element has to live on the GPU somewhere. Whether that is pixels in a texture or color values in an array, it's pretty much the same thing.

But that's just for scanned stuff. I bet Lara Croft was UV mapped by a human.

fangblackbone wrote:

So I wonder if it is some sort of voxel-meets-streaming technology, because that is essentially the principle behind voxels.
Or perhaps someone has unlocked ZBrush's pixol technology. Lord knows you are going to need a content creation tool built on this technology to have any chance of creating art for the engine in a normal time frame. I guess the 3D scanning industry will explode and artists can just kit-bash scanned data together.

It will cause major headaches tuning these levels: fixing clipping, getting stuck, and falling-out-of-the-world issues.

Come to think of it, you are going to have to have art tools based on this technology just for the streaming alone. ZBrush can handle a gazillion polygons, but without streaming you will have to make/fix/detail them in chunks, where load times will kill productivity and impair testing and finding issues.

At about the 6-minute mark they talk about how they imported the statue directly from ZBrush. It was made from 33 million triangles with no baked lighting and no LOD optimization, and they were able to put 500 of them in a room, with the engine figuring out the culling and LOD dynamically. If it really is running on a PS5 it is pretty crazy, and probably only doable thanks to the fast SSD. Unreal Engine has supported asset streaming for a long time now; it has just kind of sucked, largely due to the speed of the storage media that have been used.
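As a quick sanity check on why the engine has to figure out culling and LOD dynamically, here is a back-of-envelope sketch using only the demo's quoted figures plus an assumed 4K display:

```python
# Back-of-envelope: the demo's quoted numbers (500 statues x 33M triangles)
# versus the pixels available on an assumed 4K display.
statues = 500
tris_per_statue = 33_000_000
source_tris = statues * tris_per_statue      # 16.5 billion source triangles

pixels_4k = 3840 * 2160                      # ~8.3 million pixels

tris_per_pixel = source_tris / pixels_4k
print(f"{source_tris:,} source triangles across {pixels_4k:,} pixels")
print(f"~{tris_per_pixel:,.0f} source triangles competing for each pixel")
# Even rendering one triangle per pixel, the engine must cull or simplify
# away well over 99.9% of the source geometry every frame.
```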

Digital Foundry did an overview of what is going on. They say that the Lumen system doesn't require ray tracing hardware, but it can be combined with ray tracing effects on hardware that supports it.

AMD made a GPU with an SSD built into it a few years back. It was mainly targeted at the video editing/rendering market, but that might be something to look at building into next-gen gaming cards. I am not sure I buy the numbers Sony is claiming for their PS5 SSD, but with a couple of NVMe SSDs in RAID 0, or perhaps a single high-end NVMe controller with a ton of flash memory modules on a PCIe 4.0 bus, it seems feasible to get similar drive performance on PC. It will probably bottleneck somewhere else in real-world applications, though. If a graphics card had direct access to a fast SSD, without having to go through the rest of the system bus to get to it, that might be a game changer.
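A rough sketch of that RAID 0 idea; the per-drive speed and overhead figures below are labeled guesses, not benchmarks:

```python
# Hypothetical throughput math: NVMe drives striped in RAID 0 versus the
# ~5.5 GB/s raw figure Sony quoted for the PS5 SSD.
def raid0_read_gbps(per_drive_gbps, drives, overhead=0.10):
    """Ideal striped sequential read, minus an assumed controller overhead."""
    return per_drive_gbps * drives * (1 - overhead)

pcie4_drive_gbps = 5.0   # assumed high-end PCIe 4.0 NVMe sequential read
ps5_claimed_gbps = 5.5   # Sony's quoted raw figure

two_drive_raid = raid0_read_gbps(pcie4_drive_gbps, drives=2)
print(f"2x PCIe 4.0 in RAID 0: ~{two_drive_raid:.1f} GB/s "
      f"(PS5 claim: {ps5_claimed_gbps} GB/s)")
# Raw sequential reads look reachable on PC; the real question is whether
# decompression and the system bus keep up end to end.
```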

After the demo they have an interview with people from Epic in this one:

fangblackbone wrote:

Um...
Impressive, wow! Does this mean no UV mapping either?
Unreal Engine 5 tech demo (skip a few minutes in to get to the game footage):
https://www.unrealengine.com/en-US/b...

I believe you still have to do UV mapping, but programs like Substance Painter do a really good job of creating them for you at this point.

Humble Bundle is having a game development learning bundle. I got the last one and never used it but the collection seems pretty extensive.

fangblackbone wrote:

Humble Bundle is having a game development learning bundle. I got the last one and never used it but the collection seems pretty extensive.

Thanks. Zenva courses are pretty short and focused. I did a couple from the last bundle and they were okay. For $25 that's a pretty large collection. I'm interested in a couple of the topics, which I haven't seen covered in other courses (Cinemachine/camera movement, for one). Here's a link in case it helps.

The RTS one seems interesting too.
But I am going to hold off. As I said, I haven't touched the other ones I got years ago.

Yeah, I hear you and can totally understand. I've been making consistent progress this year, so I feel a bit more confident I'll dig into them. Still a long way to go to be good at Unity, but it's feeling like it's starting to come together.

Very neat on the demo and the DF reaction. Cool stuff.

The demo had such an impact that it affected my sleep!
But then of course it led to more questions:
They showed physics and collisions, which are very taxing and typically the reason why cinematics can look so much better than actual gameplay (because they are turned off).
However, they only showed one character, and while you may not need many lights, obviously at some point adding 10, 100, or 1,000 lights will break the system or bog it down. Likewise, how many enemies will it be able to handle with the same physics, collision, pathfinding, and logic as the character?

Also, I know they have been working on compression for a while, but it had better be really good now. Games have hundreds if not thousands of assets, and the size and detail of Unreal 5 worlds will require that many more. File sizes of objects at just 5 million polys can reach 1 GB. Current-gen games are already at 60+ GB, so you could see an explosion to sizes of 250 GB or more. I mean, can you imagine a half-TB game? It looks like it is coming, and in less than 2 years.
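To put rough numbers on that explosion, here is a hedged sketch built from the post's ~1 GB per 5-million-poly figure; the asset count and compression ratio are pure assumptions:

```python
# Crude install-size sketch using the post's ~1 GB per 5M-poly figure.
# Asset count and compression ratio are made-up illustrative assumptions.
gb_per_5m_polys = 1.0
source_polys = 20_000_000     # assumed film-fidelity source mesh
unique_assets = 100           # assumed count of unique high-detail assets
compression_ratio = 0.6      # assumed: shipping data is 60% of raw

raw_gb = unique_assets * (source_polys / 5_000_000) * gb_per_5m_polys
shipped_gb = raw_gb * compression_ratio
print(f"raw geometry ~{raw_gb:.0f} GB, shipped ~{shipped_gb:.0f} GB")
# With these guesses you land right around the 250 GB ballpark above.
```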

I usually look to Hacker News to evaluate stuff like this, and a lot of the HN comments revolved around past Unreal demos that oversold features, or hyped tech that ultimately never panned out. So, wait and see, I guess?

For me, while the demo was obviously amazing, it seemed like there was some weird aliasing whenever the camera gradually approached a complex surface (e.g. where she walks toward a big rocky wall, early on). It looked like the engine was continuously varying the LoD-style optimization, such that the pixel-level details morphed around. But the artifacts could have been in the video.

Random thought: the kinds of games that demo foretells - mega-resolution, multi-TB, cinematic, and requiring hardware not yet common in gaming PCs - would be hella good news for Stadia et al. A blockbuster game on this tech could be a real beachhead for streaming services (if they stick around long enough).

It also makes me wonder whether Nvidia got the memo about including an SSD on next-gen graphics cards.
Does their Quadro line include one?

More details:

fangblackbone wrote:

Also, I know they have been working on compression for a while, but it had better be really good now. Games have hundreds if not thousands of assets, and the size and detail of Unreal 5 worlds will require that many more. File sizes of objects at just 5 million polys can reach 1 GB. Current-gen games are already at 60+ GB, so you could see an explosion to sizes of 250 GB or more. I mean, can you imagine a half-TB game? It looks like it is coming, and in less than 2 years.

This could actually cut down on the number of assets though. Instead of having 5 or 6 versions of a single in-game object, each with its own textures and UV maps (or whatever is being used) for different levels of detail, there can just be a single high-quality object that has its level of detail dynamically figured out by the engine.

That video was incredibly helpful. Much more helpful than the interview one or the other DF reaction video.

Such a fascinating breakdown. I think this is the most exciting video game tech thing I've seen since the jump from 2D to 3D. Every advance in 3D games has been so incremental, like tessellation. Then you get something like ray-traced lighting, which is a pretty darn big jump but really, really hampered by performance cost, so it's still kinda pie in the sky for the mainstream. The combined features of Lumen and Nanite are crazy. I'm so excited to see where this all goes. I want more details on the PS5, and I want to know if PC graphics cards are going to be changing their designs. I'm guessing there will be a pretty decent solution for UE5 on Series X and PCs with NVMe drives (I hope, since I just put in an NVMe drive).

I wish I had a bit of artistic sense so I could play around making 3D models and goofing around with them in UE5 when that becomes available to the general public. It sure seems like this will make for a lot less hassle.

Rykin wrote:
fangblackbone wrote:

Also, I know they have been working on compression for a while, but it had better be really good now. Games have hundreds if not thousands of assets, and the size and detail of Unreal 5 worlds will require that many more. File sizes of objects at just 5 million polys can reach 1 GB. Current-gen games are already at 60+ GB, so you could see an explosion to sizes of 250 GB or more. I mean, can you imagine a half-TB game? It looks like it is coming, and in less than 2 years.

This could actually cut down on the number of assets though. Instead of having 5 or 6 versions of a single in-game object, each with its own textures and UV maps (or whatever is being used) for different levels of detail, there can just be a single high-quality object that has its level of detail dynamically figured out by the engine.

I hope that is the case! The only way I can see the consoles dealing with larger-than-100 GB games is to have an archiving system where you plug in a 6 TB HDD and it archives stuff automatically or manually, similar to how the Switch does things. You'd really need a local drive in this case because you just can't keep redownloading 100 GB games. Even a slow backup drive should be able to read at 100 MB/s, so you could copy a 100 GB game to your fast SSD in a little under 17 minutes.
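The restore-time arithmetic works out like this (drive speed as an assumed sustained read, not a benchmark):

```python
# Restore-time check: copying an archived game from a slow backup drive
# back to the fast internal SSD, at an assumed sustained read speed.
def transfer_minutes(size_gb, speed_mb_per_s):
    return size_gb * 1000 / speed_mb_per_s / 60   # decimal GB -> MB -> minutes

minutes = transfer_minutes(size_gb=100, speed_mb_per_s=100)
print(f"100 GB at 100 MB/s: ~{minutes:.1f} minutes")   # ~16.7 minutes
```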

tuffalobuffalo wrote:
Rykin wrote:
fangblackbone wrote:

Also, I know they have been working on compression for a while, but it had better be really good now. Games have hundreds if not thousands of assets, and the size and detail of Unreal 5 worlds will require that many more. File sizes of objects at just 5 million polys can reach 1 GB. Current-gen games are already at 60+ GB, so you could see an explosion to sizes of 250 GB or more. I mean, can you imagine a half-TB game? It looks like it is coming, and in less than 2 years.

This could actually cut down on the number of assets though. Instead of having 5 or 6 versions of a single in-game object, each with its own textures and UV maps (or whatever is being used) for different levels of detail, there can just be a single high-quality object that has its level of detail dynamically figured out by the engine.

I hope that is the case! The only way I can see the consoles dealing with larger-than-100 GB games is to have an archiving system where you plug in a 6 TB HDD and it archives stuff automatically or manually, similar to how the Switch does things. You'd really need a local drive in this case because you just can't keep redownloading 100 GB games. Even a slow backup drive should be able to read at 100 MB/s, so you could copy a 100 GB game to your fast SSD in a little under 17 minutes.

Yeah, I will probably move my Xbox One X's external drive to the Series X once I get one and manually move stuff around as needed. I have come pretty close in the past to hitting my data cap on my broadband service downloading games, so it is nice to be able to keep a local cache of them around (it would be nice if I could offload to my file server over my gigabit LAN). I do wish the Xbox Series X had gone with some sort of cover you could pop off to install a new M.2 SSD instead of the proprietary thing they are doing (which is probably just a funky form factor for an NVMe-based solution). I am sure they are trying to make it easier for less tech-savvy people while also being able to charge a premium for being the only way to get it. Some sort of intelligent background file manager that moves stuff around automatically based on what you are playing would be a nice feature.

This could actually cut down on the number of assets though. Instead of having 5 or 6 versions

I don't see it.
You are talking 5 to 6 versions of the same object, each taking roughly half the cost of the prior one in succession. Contrast that with 1 version at film fidelity. Those 5-6 versions were derived from the film-fidelity version, which itself was never used in game. So you are talking a 20-million-polygon item used to make 5-6 assets of 100,000 polygons and lower. And with Nanite, you just take the 20-million-polygon item into the engine/editor. I know there are loads of other resources like all the specular, reflection, and normal maps, etc. But the difference is 20 million polys versus 250,000 with maps.

My math shows:
100K-poly object = 3.5 MB; plus 4K texture x60 (10 maps for 6 objects) = 48 MB total (uncompressed)

20-million-poly ZBrush object = 1 GB?
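The halving chain above is a geometric series, so it has a tidy bound; here is a minimal sketch, with the per-mesh sizes as assumptions rather than measured figures:

```python
# "Each LOD ~half the cost of the prior" is a geometric series, so a whole
# LOD chain totals just under 2x the largest LOD. Mesh sizes are assumptions.
def lod_chain_total(top_lod_size_mb, levels, ratio=0.5):
    return sum(top_lod_size_mb * ratio**i for i in range(levels))

top_lod_mb = 3.5         # assumed ~3.5 MB for a 100K-poly game mesh
chain_mb = lod_chain_total(top_lod_mb, levels=6)
source_mb = 1024.0       # the ~1 GB film-fidelity ZBrush source

print(f"6-level LOD chain: ~{chain_mb:.1f} MB total")
print(f"single source mesh: ~{source_mb:.0f} MB")
# The traditional chain is tiny next to the source mesh, which is the
# storage trade-off a Nanite-style approach takes on.
```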

I am sure this conversation means something and I'm glad everyone is excited about it.

Also, it was reported that PCs with SSDs will be able to handle Nanite at high performance levels, but that there was some special sauce to the PS5 and its SSD.

I will say that $500 for a PS5 is compelling compared to spending an extra $450-1000 on a GPU+SSD for your desktop. Do we know if the PS5 comes with a 500 GB or 1 TB SSD?

Anyone have any recommendations for a good course, training, tutorial, etc. for younger kids to learn about making games? With (home)school ending soon, I'm looking ahead to some weeks with my 8 year old at home with summer camps canceled and thought game design would be a fun father-daughter project.

Mark Rosewater, the lead designer on Magic: the Gathering, wrote an article entitled "Ten Things Every Game Needs" that he based on a talk he gave to his daughter's 5th grade class. It's not something I would hand to an 8 year old to read for themselves, but it should have some good stuff for you to pass along.

Oh, and it's focused on board and card games, not video games, though that largely manifests itself with an assumption of multiple players. Also this thread title says "Catch All" with no mention at all of anything digital, so I stand by my submission.

Definitely take a look at hour of code. https://hourofcode.com/us/learn
It teaches coding through little gamelets. There is a lot to choose from, so if something gets boring you can move on to something else they are interested in.

I LOVE Lightbot. It is a light puzzle game that teaches logic and functions.

This looks amazing!
Why hasn't anyone told me about this? Why do we not have something like this on PC/desktop?
Do any of you mess around with Dreams on PS4?
https://www.linkedin.com/posts/marti...

Hello!

Amateur gamedev and 3D artist here. I've been looking for small/student projects to help on to get more experience working in a team.

Amanda actually hosted me on the GWJ twitch for a bit while I was working on a portfolio piece.

If anyone needs some artwork for their project, please let me know.
https://www.artstation.com/aqpham

Hello Felsparrow and welcome! If no one responds here, have you tried posting on the itch.io Help Wanted or Offered forum? I've had very good luck finding people and connecting through it.