Unreal Engine 4.15 Released (unrealengine.com)
116 points by richardboegli 38 days ago | 48 comments



Still no diff patching support? I'm really sick of having to re-download 40 gigs every other day when an early-access game based on Unreal decides it needs to update.

Seriously, diff patching has been a thing for decades and we're in the age of artificially-restricted bandwidth allocation. If Unreal can't get with the times, then Unreal needs to go away and make room for engines that bother to use modern practices.


There are no better engines out there and, what's more, there are only two serious competitors: CryEngine and Unity. CryEngine has nowhere near the amount of documentation UE has (both official and unofficial), and investing time into learning to work with CryEngine is questionable since the company behind it may go bankrupt at any time and the skills you'll learn are not really usable outside of your project(s), since pretty much no one in the industry (except CIG and Warhorse) uses it. Unity is great for what it is, but it's not exactly the best choice as far as ambitious projects go.


CIG switched to Amazon's Lumberyard fork of CryEngine, which I suspect will become the canonical upstream, provided Amazon keeps up its support and doesn't alienate people with crazy Amazon service tie-ins. My (not very in-depth or researched) impression is that the original team at CryTek suffered a brain drain long before this latest round of financial difficulties (e.g. lead engineer Tiago Sousa moving to id Software to succeed John Carmack as lead renderer guy), so I suspect they no longer have a huge "the real change is happening in this branch" advantage over Lumberyard, and Amazon is surely more stable than CryTek is.

Note I'm not working in this industry; this is just pieced together from observing the news.


For public engines, yes; for private engines it's another story!


Well, we're talking about public engines here, so your point is kind of irrelevant.


Steam does delta compression automatically, so for most changes this shouldn't be an issue.

UE4 has an append mode to add extra content in a separate .pak, but if you are using Steam to distribute, it doesn't save you anything. I believe Steam understands .pak files well enough to decompress, delta encode, then recompress.


That's strange, you could almost just use XDelta...

Maybe they've never heard of it?
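
Something like this is roughly all it would take, too. A minimal sketch of a diff/patch step built on the xdelta3 CLI (assumes xdelta3 is installed and on PATH; the file names are just placeholders):

    # Minimal sketch: build and apply a binary delta with the xdelta3 CLI.
    import subprocess

    def make_patch(old_build, new_build, delta):
        # -e = encode a delta, -s = the source (old) file to diff against
        subprocess.run(["xdelta3", "-e", "-s", old_build, new_build, delta], check=True)

    def apply_patch(old_build, delta, rebuilt):
        # -d = decode: reconstruct the new file from the old file plus the delta
        subprocess.run(["xdelta3", "-d", "-s", old_build, delta, rebuilt], check=True)

    # e.g. make_patch("game_v1.pak", "game_v2.pak", "v1_to_v2.xd3")
    #      apply_patch("game_v1.pak", "v1_to_v2.xd3", "game_v2.pak")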


I expect they view this problem as something third party developers are capable of handling on their own.


Yeah, given how Steam tends to fragment game files on disk, I figure Steam probably figures this stuff out for your game (unless we're talking about one of those games you can play before it's all there). If that's the case, I can see why they wouldn't bother adding it to the engine.


Unreal has had the ability for games to create patches (new exe and changed/additional content) for some time.

Distribution and application are generally the harder parts of diff patching, though, and I suspect most early-access games prefer not to worry about it (though 40 GB seems a bit further along than early access!)

https://docs.unrealengine.com/latest/INT/Engine/Deployment/P...
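
For the curious, that page describes cooking a patch against a previously saved release via UAT's BuildCookRun. A rough sketch of driving it from a build script (flag names are as I remember them from the 4.x patching docs, so verify against the page above; paths and the release version name are placeholders):

    # Rough sketch: cook a patch .pak against a previously saved release using
    # UE4's BuildCookRun. Paths and version names are placeholders; double-check
    # the flags against the patching docs linked above for your engine version.
    import subprocess

    UAT = "C:/UE_4.15/Engine/Build/BatchFiles/RunUAT.bat"  # placeholder path

    subprocess.run([
        UAT, "BuildCookRun",
        "-project=C:/Projects/MyGame/MyGame.uproject",  # placeholder
        "-platform=Win64", "-clientconfig=Shipping",
        "-cook", "-stage", "-pak",
        "-generatepatch",              # only package content changed since the base release
        "-basedonreleaseversion=1.0",  # release saved earlier with -createreleaseversion=1.0
    ], check=True)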


It doesn't exist across all platforms, so my PlayStation and Steam stuff takes a serious chunk of bandwidth.


What engines provide this diff patching?


None. They're not silly command-line tools two people wrote in their spare time.


* Monoscopic far field rendering for mobile VR.

Not sure how long this has been brewing, but I imagine it provides a significant performance boost.

Anyone got any experience using it from the previews?

Great work Epic!


Right now it's extremely scene-dependent. If your scene works well with it (lots of far objects, not too many objects in the clip-plane area) it can get you 20+% perf wins. If you have a scene that doesn't work with it (small rooms, lots of objects in the clip plane), you're gonna lose performance, as there is a non-trivial cost to running that third monoscopic view and doing the mono compositing into the stereo buffers.

The feature can be dynamically turned on or off though, so devs can work with that :)

We still have issues with transparency around the clip plane area.


What is it?


I believe it's when an object is so far in the distance that it doesn't require stereo rendering, so you just render it once.


What about the parallax effect, though? Edit: I guess it would only apply in cases where there is nothing in the foreground?


The idea is that the parallax of two objects that are, say, 200 meters and 250 meters in the distance is negligible and can be ignored as long as you get the parallax correct for nearby objects.

You might have to set your nearfield scale based on the contents of the scene. It wouldn't work if there were, say, some buildings at 200 meters and then mountains a kilometer away. But if you're in a cathedral with some chandeliers high up on the ceiling, cheating to remove the parallax of the chandeliers will not detract from the VR effect.
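
Back-of-the-envelope numbers, assuming a ~64 mm interpupillary distance (the IPD value and the distances are just illustrative):

    # Rough binocular disparity (vergence angle) at various distances,
    # assuming an average ~64 mm interpupillary distance.
    import math

    IPD = 0.064  # meters (assumed average)

    def disparity_deg(distance_m):
        return math.degrees(2 * math.atan((IPD / 2) / distance_m))

    for d in (2, 30, 200, 250):
        print(f"{d:>4} m: {disparity_deg(d):.4f} degrees")

    # ~1.83 deg at 2 m, ~0.12 deg at 30 m, ~0.018 deg at 200 m, ~0.015 deg at 250 m.
    # The 200 m vs 250 m difference is a few thousandths of a degree, so collapsing
    # both into the monoscopic far pass is imperceptible.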


Former Psych major with a focus on perception here.

Your stereoscopic vision is good/useful only out to about 30 meters. Beyond that it's really diminishing returns.


They translate the renderbuffer for the other eye. It's far enough that the objects in the far pass don't have any significant parallax.


This is a huge drop; great work, Epic. I was excited enough to see the reduced memory consumption, the blend space and animation features, etc., and then I got to the Linux patch notes and my jaw dropped. Maybe I can finally get some work done on my nix boxes and not have to keep using Windows as much. (Mostly crashes on certain screens, like the inability to bone edit, the lack of a launcher and marketplace in the Linux version, etc.)

Compiling now


Maybe this is a good place to ask:

I have an Actor that rotates a child Static Mesh through a series of meshes (via "Set Static Mesh") to display a number 0-30. When I preview or launch the game, the mesh ends up rendering with some crummy-looking pixelation:

https://www.spinda.net/files/pixelation2.PNG

https://www.spinda.net/files/pixelation1.PNG

So far I've tried: ensuring Settings->Resolution Scale->Engine Scalability is set to 100%; switching between Temporal AA, FXAA, and MSAA; disabling motion blur; restarting the editor multiple times; and creating a whole new project from the basic/minimal template and dropping my object into an otherwise-empty scene. Nothing's gotten rid of this jaggedness.


Sounds like texture streaming. There is a way to disable it on a per-asset basis.


Thanks for the suggestion. I just tried disabling texture streaming globally but unfortunately it didn't help. It's not a texture that's ending up pixelated but rather the edges of the 3D model, as if it's being rendered at some lower resolution and upscaled or anti-aliasing isn't being applied properly.

Another example: https://i.imgur.com/tuvcAA8.png - FXAA is turned on yet the edges are all jagged and pixelated.


Looks like it has AA to me. You would be better off with temporal AA; FXAA reduces jaggies but doesn't know enough about the underlying shape to really do anything like supersampling along subtle curves and such. For lines it can sometimes use enough surrounding context on screen to do a passable job.

If you use the new forward renderer you can turn on MSAA and get supersampling along the edges too, but the aliasing is still going to be there to some degree. 2xMSAA is only going to get you the equivalent of two bits of translucency information for each edge pixel.


Oh, make sure your mesh has smoothing groups too, or you will get aliasing on the specular on the internal polygons. Some of your edge artifacts look like they may be that.


I have smoothing groups enabled in Blender's FBX export settings, so they should be there. I can't find a way to tell if they've been imported correctly from within UE4. The model is here, if you're willing to take a look at it: https://www.spinda.net/files/number6.fbx

I switched from Temporal AA to FXAA because it interacted oddly with the changing Static Mesh. When it switched from one number mesh to the other, the new one would jitter and sort of dissolve in until it settled out. I have another object with a texture that rapidly changes, cycling through a set, and Temporal AA causes ugly artifacting there which settles out once the texture stops changing. (This happens both with texture streaming on and off.)

edit: video of what I'm talking about - https://www.spinda.net/files/temporal-aa.mp4


Temporal AA can have issues with scrolling textures and stuff. I think there is a setting on the material to disable temporal AA around that area, but it may only be for translucency.

For a sort of flipbook mesh like that I could see it happening too. Not sure what your best option is. You can change around some variables on how big the temporal AA history filter is, but if you lower it enough to get rid of all that ghosting you may end up as bad off as the 2xMSAA in terms of how much AA you actually get out of it.

Your best bet may be trying a higher level of MSAA; r.MSAACount will control the level of it. With really dense meshes it can begin to have a high cost.
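
If it ends up helping, those settings can also live in config instead of being typed into the console every run. A rough DefaultEngine.ini sketch (section and key names from memory of 4.15, so double-check them against Project Settings > Rendering):

    ; Config/DefaultEngine.ini -- rough sketch, verify the key names in the editor.
    [/Script/Engine.RendererSettings]
    ; Forward renderer (needed for MSAA) and MSAA as the default AA method
    ; (0 = off, 1 = FXAA, 2 = Temporal AA, 3 = MSAA).
    r.ForwardShading=True
    r.DefaultFeature.AntiAliasing=3

    [SystemSettings]
    ; Startup value for the cvar mentioned above.
    r.MSAACount=4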


You were right about turning down Temporal AA ending up with worse results than 2xMSAA. MSAA with r.MSAACount = 4 is a huge improvement, though: https://www.spinda.net/files/4x-msaa.mp4

Thanks for the help! I'll go forward with this for now and see if I run into performance issues. Around how dense is "really dense"?


If you look at the quad complexity view, that will give you a good idea. It will start getting red when your triangle edges are dense within a 4-screen-pixel area, and I think it will match up pretty closely with the MSAA costs.


Kind of seems like you're on a HiDPI display and the game isn't aware of it. (Or maybe the buffer that mesh is rendering to, but it seems like you've covered that, and I don't know Unreal well enough to suggest otherwise.)


I am in fact using a HiDPI display. I tried setting Windows 10's scale factor to 100%, disabling my HiDPI (laptop) monitor, and moving UE4 to a 1x (1080p) display. Unfortunately the issue is still present (https://i.imgur.com/tuvcAA8.png). I don't know whether this means HiDPI scaling isn't the issue or I failed to completely disable whatever scaling is being applied.

I also checked Application Scale in Developer Tools->Widget Reflector, but it was already at 1.0.


That image does not seem to have the issue at all, which makes me think that, if it doesn't happen for everything in the scene, the engine is for some reason rendering that object to a lower-resolution buffer and upscaling it with point sampling.


I am amazed at the pace of Unreal Engine: the HUGE changelog and improvements with every release, shipped consistently and in a very timely manner.

How is this even possible? Just try scrolling through that page! Is there any other software that gets such a huge list of improvements every few months?


It has support for Nintendo Switch


I'd hold off on excitement until 3rd party dev details are released.


According to a recent video [1] some features from the new audio engine should have made it into 4.15, but I can't see them mentioned in the release notes. Didn't they make it in?

[1] https://www.youtube.com/watch?v=h8o2xQcrb_E


Found the answer to my own question: they made it into 4.15, but since they are experimental, the features are not mentioned in the release notes. [1]

[1] https://forums.unrealengine.com/showthread.php?136947-4-15-R...


Good stuff. I always enjoy the way they present their patch notes; even as someone with just the basics of Unreal, it's always a good read.


Any VR performance updates? (I made it halfway through.)



Thanks, we changed the URL to that from https://www.unrealengine.com/blog/unreal-engine-4-15-release... because it seems to give more information. Can change it back if people object.


Please change it back.

The Ars article is just a high-level summary plus some of their older articles, whereas the original article is the actual patch notes.

Thanks.


Ok, we did.


What I really like about Unreal Engine is its Unreal Assets, i.e. all those modifications that can be applied according to the field you're working in. Good example: UnityCar, the package that was used to build "My Summer Car".


Are you confusing the Unity and Unreal engines perhaps?


I am! Apologies.



