Does Unity need a new Rendering pipeline for 4k+ devices?

My understanding is that display resolution is outrunning GPU power, or at least GPU power as used by current rendering technology.

So does Unity need new rendering pipeline technology for a future of displays at 4k resolution and beyond, and if it does, what might that be?

I know this is more of a Graphics topic, but I think it belongs in General because it is fundamental to the future of games and game engines as device resolutions increase up to and beyond 4k.

How would that help?

3 Likes

This is an Arowx thread. I’m 100% willing to bet that they saw a video or an article on 4k rendering tech and desperately want Unity to adopt it without any real thought on the matter.

2 Likes

I don’t know about any of you.
But I am perfectly content with 1920 x 1080.

And if you want 4K so bad, just render at a higher internal resolution and scale it, like Star Wars Battlefront does with its resolution slider.
I’m sure Unity can do that. Granted, it’s not true 4K, but it does give better effective pixel resolution.
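A rough sketch of what that could look like in Unity’s built-in pipeline (the component and scale values below are just an illustration of the idea, not how Battlefront actually implements its slider):

```csharp
using UnityEngine;

// Illustrative render-scale sketch for the built-in pipeline: render the
// camera into a RenderTexture at a fraction (or multiple) of the screen
// resolution, then blit the result to the back buffer. The scale range
// and default are example values.
[RequireComponent(typeof(Camera))]
public class RenderScale : MonoBehaviour
{
    [Range(0.5f, 2f)]
    public float scale = 1.5f; // >1 supersamples, <1 renders cheaper and upscales

    Camera cam;
    RenderTexture rt;

    void OnEnable() { cam = GetComponent<Camera>(); }

    void OnPreRender()
    {
        int w = Mathf.RoundToInt(Screen.width * scale);
        int h = Mathf.RoundToInt(Screen.height * scale);
        rt = RenderTexture.GetTemporary(w, h, 24);
        cam.targetTexture = rt;                  // camera renders at the scaled size
    }

    void OnPostRender()
    {
        cam.targetTexture = null;                // so the blit goes to the screen
        Graphics.Blit(rt, (RenderTexture)null);  // filtered up/downscale to native res
        RenderTexture.ReleaseTemporary(rt);
    }
}
```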

2 Likes

Higher resolutions are great for some things, and nearly a requirement for others. VR, for example, really needs a higher resolution if you’re going for a wide FOV. Then there’s the cases where people want to use screens more like paper - added resolution is great there, too.

For standard gaming on a monitor, though, I agree that 1080p is pretty reasonable.

5 Likes

You just don’t want to have to find a way to fit a 40-inch monitor onto your desktop to be able to see the cursor. :stuck_out_tongue:

3 Likes

but once you do, it’s really really nice.

Just get a bigger GPU. I’m failing to see the problem here.

Of course for most applications we are already past the point where the human eye can actually separate the pixels. So going to higher resolution really doesn’t help much.

3 Likes

Just because it is relevant to this thread: http://forum.unity3d.com/threads/scheduled-gdc-talk-4k-rendering-breakthrough-the-filtered-and-culled-visibility-buffer.424820/

Seems like some people are already trying to find this “holy grail of new rendering technology” which would make 4k less of a GPU crusher.

On the allure of 4K for games: Thanks to upgrading to a new OCed GTX 1070, and the 4k 40" screen I bought some time ago at a reduced price, I can finally start testing out 4k for myself.
Some observations:

  • I can get 4k at 60Hz with a single $500 GPU with max settings in 2016! … well, for some games at least. Obviously older games not using the latest and greatest technology. But the fact that this not-so-fast-as-hoped GPU still seems to have enough power to at least get up to 60Hz in slightly older titles maxed out is cool as hell.
  • 4k IS looking good… just not as good as many have hoped, me included. Even games that HAVE been developed to also be playable at 4k sometimes just do not look as good as you would hope. The additional pixel count easily lets you see textures that are not so well done; a lot of things that look fine in Full HD look kinda meh once you have the additional pixels.
    My guess is that many devs just added the options to the menu, then MAYBE (I’ll get to that next) made sure the UI scales, and called it a day without testing it out much.
  • UI scaling. Really, it’s the same mess as with Windows all over again. How hard can it be? … then again, dabbling in game development myself, I know it is kinda hard, and also way more work than anyone would think (see the sketch after this list).
    Point is, if the UI does not scale, games are hardly usable on a 40" 4k screen. Thank god I am not trying to play those games on a 24" screen, or even worse, a tiny 17" laptop screen!
  • AA… yes, I was also sure that 4k would make me finally abandon worries about crappy in-game AA, or trying to fry my GPU with downsampling.
    But to be honest, AA is still needed. Even on a 24" screen, specular aliasing with small tris would most probably make AA essential. On my 40" screen, I am currently trying to decide if I should scale down to 30Hz for proper AA, or keep just the in-game FXAA (which I could probably switch off just as well) and play at 60Hz.
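On the UI scaling point above: with Unity’s own uGUI this mostly comes down to configuring the CanvasScaler. A minimal sketch, where the reference resolution and match factor are example choices rather than recommended settings:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: make a uGUI canvas scale with the output resolution instead of
// staying at a fixed pixel size, so it stays readable at 4k. The reference
// resolution and match factor are example values.
[RequireComponent(typeof(CanvasScaler))]
public class ScaleUiWithResolution : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080); // the resolution the UI was designed at
        scaler.screenMatchMode = CanvasScaler.ScreenMatchMode.MatchWidthOrHeight;
        scaler.matchWidthOrHeight = 0.5f;                     // blend width and height scaling equally
    }
}
```

The same can of course be set once in the inspector; the point is just that resolution-independent UI is a solved problem on the engine side, and it is still skipped surprisingly often.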

I would say 4k is the future. But it’s still not there yet, and the biggest problem is not GPU power increasing too slowly, or renderer technology not utilizing that power efficiently enough.
It’s plainly and simply game developers developing for Full HD first and foremost, leaving 4k as an afterthought and often not even investing the time to look into proper UI scaling.

And really, I personally am just a little bit disappointed because games in 4k just don’t look as spectacularly better as I had hoped they would. Which might have been me expecting too much.
Betting on 8k now… when will the first 8k screens finally come out? Where are the 8k-capable single GPUs? :wink:

This is exactly what happened when HDTVs first became available at cheap prices but most broadcasts (and even stuff like DVD content) were still SD. People were simply shocked at how bad SD really was, but they needed a dramatically better display before it was obvious.

You want 8K? Got $133K? :slight_smile:

http://www.theverge.com/2016/1/5/10713490/lg-98-inch-8k-oled-tv-uh9800-ces-2016

1 Like

Well, I noticed that the Ogre 3D graphics engine developers were talking about bandwidth limits of deferred rendering at >= 4k. They speculated that Forward+ rendering would be needed to provide performant higher-resolution rendering on current-generation hardware.

But I also noticed a technical blog post about different approaches to this issue when I posted.

But the GDC 2016 lecture notes on using a visibility buffer look fascinating > http://www.conffx.com/Visibility_Buffer_GDCE.pdf

Summary:
  • Forward rendering does too much overdraw.
  • Deferred rendering uses too much bandwidth at higher resolutions.
  • Visibility buffering reduces rendering work, with lower bandwidth overhead than deferred rendering.
  • Vulkan and DirectX 12 allow visibility buffering.

It sounds like Forward+ rendering (forward rendering with light culling) combined with visibility buffering could allow more performance in 4k+ resolution games and VR (on DX12/Vulkan-level GPUs).
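To put rough numbers on the bandwidth argument above (the per-pixel byte counts are my own assumptions for a typical fat G-buffer versus a 32-bit ID-only visibility buffer, not figures from the slides):

```csharp
using UnityEngine;

// Back-of-the-envelope comparison of per-frame write bandwidth at 4k.
// The layouts are assumptions: a "fat" deferred G-buffer of four RGBA8
// render targets versus a visibility buffer storing one packed 32-bit
// draw/triangle ID per pixel.
public static class BandwidthEstimate
{
    public static void Log4kEstimate()
    {
        const long pixels = 3840L * 2160L;         // ~8.3 million pixels at 4k

        const int gBufferBytesPerPixel    = 4 * 4; // four RGBA8 targets = 16 bytes
        const int visibilityBytesPerPixel = 4;     // one 32-bit packed ID

        Debug.Log(string.Format("G-buffer writes:   {0:F1} MB per frame",
            pixels * gBufferBytesPerPixel / (1024f * 1024f)));    // ~126.6 MB
        Debug.Log(string.Format("Visibility writes: {0:F1} MB per frame",
            pixels * visibilityBytesPerPixel / (1024f * 1024f))); // ~31.6 MB
    }
}
```

Every pass that later reads those buffers back pays a similar cost again, which is roughly why the gap matters more and more as resolution climbs.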

1 Like

It’s just a fill-rate issue. 4k is 4x the pixels of 1080p. If the graphics card is not 4x faster at filling pixels, then the framerate will drop. Same thing happened on iPad when they went from 1024x768 up to 2048x1536 but only bumped the processing performance by 2x in that generation… instant slower performance. It wasn’t until the next version (iPad 4?) that the speed bump was 2x again - enough to make up for the fill-rate increase.
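For reference, the raw pixel counts behind those 4x figures (just arithmetic, nothing Unity-specific):

```csharp
// The pixel counts behind the "4x the pixels" claims above.
public static class PixelCounts
{
    public const long FullHd        = 1920L * 1080L; // 2,073,600 pixels
    public const long UltraHd4k     = 3840L * 2160L; // 8,294,400 pixels -> exactly 4x Full HD
    public const long IPadNonRetina = 1024L *  768L; //   786,432 pixels
    public const long IPadRetina    = 2048L * 1536L; // 3,145,728 pixels -> exactly 4x, with only ~2x the GPU that generation
}
```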

There is not really anything Unity can do with regards to making things faster when it’s a fill-rate issue. Maybe a little optimizing here and there, but nothing on the order of a 400% boost. It’s purely down to the graphics card horsepower. Although maybe more modern APIs would help, like the replacement for OpenGL, for example.

4 Likes

If you look at the presentation slides, you can see that the visibility buffer does not boost performance by 400%, but it can boost performance enough to stay above an FPS target (note that further optimisations are being worked on), and the approach uses less memory.

That is situation-dependent. It depends on the resolution, the size of the screen, your distance from it, and your eyesight.

@Arowx - SD, HD, UHD, 4K, the story is the same. It is simply a new transition. 4K just raises the bar yet again and we have to squeeze more performance out of software and hardware… the same problem the industry has been working on since its infancy. This can be achieved through more powerful GPUs, more optimization, new APIs and new tricks… usually a mix of them all. There is already 8K :wink:

Will rendering change again in the future? Certainly but how and what will it be? Who knows… There is a cost to changing stuff and Unity will do it when they feel that the results are worth it :slight_smile:

Oh, remember that the majority is still stuck at 1080p… don’t worry, be happy :slight_smile:

1 Like

I agree that Arowx’s threads can be quite frustrating. The issue of different rendering approaches/pipelines being more or less fit for higher resolutions does have merit, however. To my knowledge, Apple was able to go for “Retina” resolutions on their mobile devices earlier than the competition thanks to their use of PowerVR hardware, whose tile-based rendering approach handles high resolutions better. Then again, this is a hardware discussion more than a software one.

PowerVR delivering iPhone GPUs is an interesting turn of events, since PowerVR has been around for a while. Due to their exotic approach, they struggled to gain ground in the gaming market during the ’90s. Yet 20 years later they were back all of a sudden, supplying a company that until then had been struggling with gaming. Oh, the irony.

Apple’s move to 4K and 5K resolutions on their desktop iMacs baffled me, however. The Intel iGPUs and 2xx/3xx ATI cards they have been using visibly struggle under the load, even with just desktop effects.

I’d like to see Unity implement forward+ at some point (not for 4K rendering, just in general).

I don’t really care about 4K at this point to be honest. I’d rather see consistent, properly antialiased 1080p @ 60fps before we make the jump to 4K.

2 Likes

What about Unity adopting Clustered Forward Rendering as used by Doom and explained here → DOOM (2016) - Graphics Study - Adrian Courrèges (a really good article on Doom’s rendering pipeline)?
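For anyone who hasn’t read the article: the core idea is to divide the view frustum into a 3D grid of clusters (screen tiles sliced along depth) and bin lights per cluster, so each fragment only loops over the lights in its own cluster. A conceptual sketch of the cluster lookup, where the grid size and the logarithmic depth slicing are illustrative assumptions rather than Doom’s exact parameters:

```csharp
using UnityEngine;

// Conceptual sketch of clustered forward shading's cluster lookup.
// In a real renderer this runs in a compute/fragment shader; lights are
// binned per cluster up front and each fragment shades only the lights
// stored in its own cluster's list.
public static class ClusteredLookup
{
    const int ClustersX = 16, ClustersY = 8, ClustersZ = 24; // illustrative grid size

    // Map a screen-space position (0..1) and view-space depth to a flat cluster index.
    public static int ClusterIndex(Vector2 uv, float viewDepth, float near, float far)
    {
        int x = Mathf.Min((int)(uv.x * ClustersX), ClustersX - 1);
        int y = Mathf.Min((int)(uv.y * ClustersY), ClustersY - 1);

        // Logarithmic depth slicing keeps clusters roughly evenly proportioned.
        float slice = Mathf.Log(viewDepth / near) / Mathf.Log(far / near) * ClustersZ;
        int z = Mathf.Clamp((int)slice, 0, ClustersZ - 1);

        return x + ClustersX * (y + ClustersY * z);
    }
}
```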

It’s an alternative rendering pipeline. That’s it.

Even Doom’s clustered rendering struggles to run at 4k @ 60Hz on high-end hardware → http://www.techspot.com/review/1173-doom-benchmarks/page4.html

Naturally hardware will struggle with massive increases in resolution when the hardware is still largely made for much lower resolutions (usually 1080p/1440p). Hardware needs to push four times as many pixels as before.

By the way, your link is very dated. The GTX 1080 can handle 4K at playable speeds.

http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,16.html

Yes, the GTX 1080 is a bit pricey but a 4K display isn’t cheap either.