Approach for HUD Map

I am making a little map so the player can see where they are as they move around an irregular path (i.e. not square, circular, oval, etc.), and I was wondering what others thought might be a good approach. Here are some possibilities I have looked at, at least briefly:

  1. Use GUI textures - Have a texture for the map and a little texture to represent the player's position. This would require the map texture to have the same proportions as the “real” thing, plus a reference point from which a script could calculate an offset to transform the player's position into the correct position on the map (a sketch of this transform appears after the list). The complication here seems to be that the map graphic must be precise, and the reference offset must be precisely placed, for the player blip to be calculated and displayed correctly.

  2. Use a separate map camera - For this I would place the path on a different layer that only the map camera renders. To show the map I could use the camera's viewport, or render to a texture that is placed on a GUITexture. If the camera has to be far away to show the whole path, another object closer to the map camera will probably be needed to show the position. A script would keep this “blip object” in the correct relative position.
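To make approach 1 concrete, here is a minimal C# sketch of that offset transform. It assumes the player moves in the world XZ plane; the reference point, scale value, and field names are illustrative placeholders, not anything from an actual project.

```csharp
using UnityEngine;

// Hypothetical sketch for approach 1: move a small "blip" GUITexture over a
// static map GUITexture by converting the player's world position.
public class MapBlip : MonoBehaviour
{
    public Transform player;             // the object to track
    public Vector3 worldReference;       // world-space point that corresponds to mapOriginPixels
    public Vector2 mapOriginPixels;      // screen position (pixels) of that reference point on the map image
    public float worldToMapScale = 0.1f; // screen pixels per world unit (assumed value)

    void Update()
    {
        // Offset from the reference point, taken in the world XZ plane.
        Vector3 offset = player.position - worldReference;

        // Scale the offset into map pixels and move the blip's pixelInset there.
        float x = mapOriginPixels.x + offset.x * worldToMapScale;
        float y = mapOriginPixels.y + offset.z * worldToMapScale;

        GUITexture blip = GetComponent<GUITexture>();
        blip.pixelInset = new Rect(x, y, blip.pixelInset.width, blip.pixelInset.height);
    }
}
```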

From others' experience, is one of the above generally better? Or are there better approaches?

I recall that one of the space-flight game designers in here considered both options and eventually decided to use the second one, using layers to create symbols for the mini-map.

In Intifada we had an RTS-like map camera. When the player rotated the camera, the map would rotate as well.

What I did was have a separate map camera that could only see the “map” layer. This map layer contained a textured quad (with the map graphics) and a particle system for blips. I used Particles.Emit to make the blips, and rotated and moved the map camera to position it so it would match the in-game camera. I just had a scale-offset factor to go from world to map coordinates. It was a bit annoying to get right, but it worked out in the end and looked really nice.
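For anyone reading along, here is a rough sketch of that scale-offset idea, written as a generic Unity script rather than the actual Intifada code; the scale value, offset, and camera height are assumptions.

```csharp
using UnityEngine;

// Sketch: keep a map camera in sync with the game camera, converting world
// coordinates to "map layer" coordinates with a scale-offset factor.
public class MapCameraFollow : MonoBehaviour
{
    public Transform gameCamera;          // the in-game camera to mirror
    public Vector3 mapOffset;             // where the map quad sits in the scene (assumed)
    public float worldToMapScale = 0.01f; // scale factor between world and map quad (assumed)
    public float cameraHeight = 10f;      // height of the map camera above the map quad

    void LateUpdate()
    {
        // Scale-offset the game camera's position onto the map quad.
        Vector3 p = gameCamera.position * worldToMapScale + mapOffset;
        transform.position = new Vector3(p.x, mapOffset.y + cameraHeight, p.z);

        // Rotate the map with the player's heading while looking straight down.
        transform.rotation = Quaternion.Euler(90f, gameCamera.eulerAngles.y, 0f);
    }
}
```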

I think you're referring to me :smile: and I ended up with something like this. Keep in mind this is a super old build! Many things are wrong/funky. However, I'm using a second camera. It was pretty easy to set up: each ship has a child “icon” that's attached and only rendered on a certain layer. Very slick, very easy. Just my experience, but it's a good way to go.

Edit: I forgot to say that you can't move in that demo, just look around (mouse) and roll (Q and E).

Bill

Bill, that’s a -very- nice compass design!

Thank you everyone.

Nicholas, you have given me some additional things to think about. I like the idea of a particle system as the “icon”; that seems to open up some extra options for the “blip”.

Bill, I remember when you were working on that. Are you still working on it, or have you moved on to other things? In my current situation, the child icon approach used for the radar has the complication (at least as it seems to me) that the reference point is not the player. I could probably take the path model, scale it down by a factor of maybe 500, and then use child icons. Then all I have to do is transform between the locations with the scaling factor, like in the radar :slight_smile:
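As a hedged sketch of that plan, something like the following could sit on the child icon; the “Minimap” layer name, the 1/500 factor, and the object references are assumptions.

```csharp
using UnityEngine;

// Sketch: place a child "icon" object on the scaled-down copy of the path.
// The icon lives on a layer that only the map camera renders.
public class PathIcon : MonoBehaviour
{
    public Transform player;        // the real player in the full-size world
    public Transform miniPath;      // the 1/500-scale copy of the path model
    public Transform worldOrigin;   // point on the real path matching miniPath's origin
    public float scaleFactor = 500f;

    void Start()
    {
        // Only the map camera should include this layer in its culling mask.
        gameObject.layer = LayerMask.NameToLayer("Minimap");
    }

    void Update()
    {
        // Transform the player's offset from the reference point down to map scale.
        Vector3 offset = (player.position - worldOrigin.position) / scaleFactor;
        transform.position = miniPath.position + offset;
    }
}
```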

By the way, I still really like that radar :smile:

I am still working on that project. It's a big one, but I am having immense fun with this creative outlet. If you ask nicely, maybe I can show you a few more things :wink: So what exactly is your current situation? Screens! :smile:

Bill

I don’t have any screens; I am only a helper. However, I hope in a month or so to at least have a screen of my next project :)

So I have a scaled-down version of my path which I render to a texture, and then I use that render texture as the texture on a GUITexture. When I scale the GUITexture down to a smaller size, the path becomes pixelated, even though x and y are scaled by the same factor. How can I prevent the pixelation? I changed the width and height of the GUITexture from 128 to 512, but that did not seem to matter. The texture looks fine when I view it in the Inspector.
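For reference, the wiring described above can be sketched like this; the 512 size and the field names are assumptions.

```csharp
using UnityEngine;

// Sketch: render the map camera into a RenderTexture and show it on a GUITexture.
public class MiniMapSetup : MonoBehaviour
{
    public Camera mapCamera;       // the camera that sees only the path layer
    public GUITexture mapDisplay;  // the GUITexture that shows the minimap

    void Start()
    {
        // 512x512 with a 16-bit depth buffer; the size is an assumption.
        RenderTexture rt = new RenderTexture(512, 512, 16);
        mapCamera.targetTexture = rt;  // camera now renders into the texture
        mapDisplay.texture = rt;       // GUITexture displays that texture
    }
}
```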

Maybe the texture you’re rendering into has a small size?

If I look in the Inspector, the width and height of the render texture are both 512. Can I only set those values when it is first created, or at any time? Also, is the only way to control the size of a GUITexture through the Scale?

It just occurred to me that my using a split screen could be contributing to the problem (doh!). At run time the scaling changes from (.15x, .15y) to (.05x, .15y). However, changing the x value manually while running does not change the pixelation. I have attached a picture if that helps.

[Attachment: picture_1_954.png]

Nope, you can also do that by changing the pixel offsets (Xmin, Ymin, Xmax, YMax).

–Eric

I played with the pixel offsets but I still get the pixelation. For reference, here is what the render texture looks like. I figure I am doing something wrong with the scaling.

[Attachment: picture_2_579.png]

Render textures don’t have mipmaps, so scaling one down by 2x or more will look bad (bilinear filtering won’t be able to filter it nicely). I suspect that’s what you’re seeing. What you could do:

  1. use a 2x smaller render texture or
  2. make the minimap larger on the screen or
  3. render only into a part of the render texture, approximately to match the minimap’s size on the screen.

And of course, clearing it to transparent black (or white… whichever is better) instead of transparent blue will get rid of the blue-ish fringes (caused again by bilinear filtering).
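A hedged sketch of what options 1 and 3 plus the transparent-black clear might look like in code; the sizes and the camera reference are assumptions.

```csharp
using UnityEngine;

// Sketch of the suggestions above: a smaller render texture, a viewport that
// only covers part of it, and clearing to transparent black to avoid fringes.
public class MiniMapTuning : MonoBehaviour
{
    public Camera mapCamera;

    void Start()
    {
        // Option 1: use a render texture closer to the on-screen size (assumed 256x256).
        RenderTexture rt = new RenderTexture(256, 256, 16);
        mapCamera.targetTexture = rt;

        // Option 3 (alternative): render into only part of a larger texture by
        // shrinking the camera's normalized viewport rect.
        // mapCamera.rect = new Rect(0f, 0f, 0.5f, 0.5f);

        // Clear to transparent black so bilinear filtering doesn't pull in blue.
        mapCamera.clearFlags = CameraClearFlags.SolidColor;
        mapCamera.backgroundColor = new Color(0f, 0f, 0f, 0f);
    }
}
```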

This definitely looks like what is happening.

These sound like some excellent ideas :slight_smile: Just a couple of questions on #1 and #3:

  1. Do I make a smaller render texture by reducing the Width and Height parameters of the Render Texture?
  2. I assume I use the viewport to render to only part of the texture. Is this correct?

I will also do this.

After all this is done, I can put the scripts and an example on the Wiki if others are interested. My current scripts position the map camera, calculate the scaling factor, and of course, place the blips at the appropriate location.

Are these two correct, and if not, can someone point me in the right direction? Using the viewport to render into part of a texture does render into that part, but the remaining part of the texture is sometimes garbled. There seems to be a subtlety I am missing.

  1. Make the render texture smaller by changing its width and height.
  2. Leave the viewport size at the full window.

Thank you Joachim, glad to know I wasn’t completely out in left field. The smaller render texture hasn’t seemed to help, but I will play with it some more this evening when I get home. I have at least one thing I want to try while doing this in order to rule out another cause. I do know that 128x128 is too small, since the render texture itself becomes overly pixelated.

You want to make sure the pixel size matches. (In the GUI texture, use scale 0 and a pixel inset of the size you want to use.)

From code you can create render textures that are not power of two, and you can use non-power-of-two render textures from a GUI texture.
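A small sketch of that setup; the 200x150 size, the inset offset, and the object names are assumptions.

```csharp
using UnityEngine;

// Sketch: create a non-power-of-two render texture from code and show it on a
// GUITexture at 1:1 pixel size (scale 0, pixelInset matching the texture size).
public class MatchedMiniMap : MonoBehaviour
{
    public Camera mapCamera;
    public GUITexture mapDisplay;

    void Start()
    {
        int w = 200, h = 150;  // assumed on-screen minimap size in pixels
        RenderTexture rt = new RenderTexture(w, h, 16);
        mapCamera.targetTexture = rt;

        mapDisplay.texture = rt;
        mapDisplay.transform.localScale = Vector3.zero;    // scale 0
        mapDisplay.pixelInset = new Rect(10f, 10f, w, h);  // size in pixels; offset assumed
    }
}
```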

Thank you Joachim. I will give it a whirl tonight :slight_smile: