EXE slows down when turning - keeping assets in memory?

My problem is that I work with a huge amount of polygons in a huge scene, and any time I turn with the FPS camera it slows down for a while (until the assets load into memory). I use a video card with 2 GB of RAM, so I have no problem storing everything in memory permanently. How can I do that? This slowdown kills the user experience… Thanks

Define huge.

Verts: 38M, Tris: 6.7M, VRAM usage: 22–330 MB

But I have the same issue with far smaller scenes… I turn to the left, for example, and it slows down; after loading the assets into memory it runs perfectly. With the huge scene, my problem is that it keeps slowing down for quite a while…

Um…is that what is currently being rendered? :shock:

Yep… I think it will be the biggest project made in Unity so far :slight_smile:

Well, no wonder you’re having problems with the game! There isn’t a game alive that renders 6.7 million triangles per frame!

Hell, Crysis (the hardest game to render) only renders 2 million tris at the highest settings at best!!! How on earth are you even getting a framerate? Did I miss something? :shock:

Can we see a screenshot? :smile:

Wait…is it 6.7 million per frame? Or is that the total amount of geometry in your level? :?

6.7 million per frame is perfectly doable on a beefy graphics card, I think (with simple shaders etc.). Crysis used two million, but then it had quite complex shaders and lots of other stuff going on.

Anyway - I think the slowdown is the graphics driver actually uploading the vertex buffers, textures and whatnot on the first use.

What some games do is: at the start of the level, render “stuff that will be needed” into some invisible place. E.g. create a temporary render texture, and render 10 frames with various camera rotations into that texture. Then destroy the texture and proceed normally.

Actually using some resources (vertex buffers, textures, shaders, …) is the only way to make the graphics driver actually create them, because most of drivers defer lots of stuff until it is actually used.

Yes, I know it is huge :) The framerate is about 30–40 fps with an ATI 4870 X2, so it is OK. Unfortunately I cannot attach a screenshot, it is business stuff; once it is ready and presented I will make a video and pics just to show what Unity is capable of…

Thanks Aras, I think that could be the solution; we will give it a try. I don’t know yet whether it will decrease performance or not.

Are you on a PC? How did you make a Unity game if you’re on a PC? That card isn’t supported by the Mac. :?

Um, Unity makes PC builds (if you have Pro).

–Eric

Yeah I know that. I was just curious about him running a 6.7 M tri scene in the editor on his Mac (I guess the Mac can somehow handle it?).

Sorry about confusing myself. :smile:

Anyway, I can run Crysis at 2M tris, and his card is probably quadruple the power of mine (including the dual-GPU design his card has). I’d love one of those cards. I wish Apple would put a gun to AMD’s and Nvidia’s heads and tell them to do their jobs and make some drivers for the Mac (and have Crossfire and SLI too!) instead of Apple doing all the hard work. :lol:

An 8800 GT can handle it perfectly for editing… Unity isn’t really supported by Crossfire; we did a test yesterday.

Oh? You own the same card as me for the Mac? :smile:

You’re right about Crossfire and SLI. Then again, not many games support them in the first place. Maybe when they get more popular they’ll be supported. But right now, all you could ever need is that 4870 X2. :stuck_out_tongue:

The 8800 GT is quite a good card. We tested the 4870 X2 and the GTX 285 against each other, and the ATI was faster. And don’t forget that under Windows these cards are faster… even the 8800.

Hello Aras!

“create a temporary render texture, and render 10 frames with various camera rotations into that texture. Then destroy the texture and proceed normally.”

We made an additional camera and the framerate decreased. What did you mean by destroying the texture? Thanks

I mean to create this camera and render texture only for the “warmup” period. Like this:

  1. create camera, create render texture that camera renders into
  2. render that camera for N frames, using different orientations, to make sure almost everything around will be rendered.
  3. destroy the render texture and the camera.
  4. proceed as normally.

The above will still have delays and whatnot to load all the data in (obviously - a lot of data needs to be loaded), but the rendering will be invisible. In the main camera you can display “loading” or whatever during that time.
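The steps above could be sketched as a Unity script roughly like the following. This is a minimal sketch, not Aras’s exact code; the class name, texture size, and frame count are all illustrative assumptions, and it assumes the warmup camera can see the level geometry from its position.

```csharp
using UnityEngine;
using System.Collections;

// Hypothetical warmup script: create a temporary camera that renders into an
// off-screen render texture for a few frames with different rotations, so the
// driver uploads meshes, textures and shaders before real gameplay starts.
public class ResourceWarmup : MonoBehaviour
{
    public int warmupFrames = 10;

    IEnumerator Start()
    {
        // 1. create camera, create render texture that camera renders into
        var rt = new RenderTexture(256, 256, 24);
        var camGO = new GameObject("WarmupCamera");
        var cam = camGO.AddComponent<Camera>();
        cam.targetTexture = rt; // off-screen, so nothing shows up on screen

        // 2. render for N frames, using different orientations
        for (int i = 0; i < warmupFrames; i++)
        {
            cam.transform.rotation =
                Quaternion.Euler(0f, i * (360f / warmupFrames), 0f);
            cam.Render();      // force a render into the texture this frame
            yield return null; // wait one frame (a good place for a loading UI)
        }

        // 3. destroy the render texture and the camera
        cam.targetTexture = null;
        Destroy(rt);
        Destroy(camGO);

        // 4. proceed as normally
    }
}
```

The main camera keeps running during the loop, so it can display a “loading” screen while the warmup renders happen invisibly.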

Thank you. Most of the steps are already done. The “destroy” part is my problem: is it enough to hide the object with the render texture, or is there a separate “destroy” command for render textures?

The key is to stop rendering into the render texture. Just like with about everything else, calling Destroy(something) will destroy the render texture.

Thanks, it is clear.