Cranke · #1
The Problem:
I have a VERY simple project with an image and a PointerEnter event on that image. I'm seeing some resolution-dependent results that I can't understand. I'd appreciate some help understanding why my PointerEnter event doesn't trigger at 1920x1080 resolution.
The Configuration:
- Two monitors in Windows 10, set up in extended desktop mode.
- The left monitor is 1600x1200.
- The right monitor is 1920x1080.
- If the program is run on the right monitor with a resolution greater than 1600x1200, the right-aligned image (or even a regular button) will not trigger events. No clicks. No PointerEnter/Exit. No button animations. Nothing.
The Results:
- If I run this project in the Unity Editor and put my mouse over the image, I see in the log: “Pointer entered successfully!”. All good!
- If I build and run this project as a standalone .exe (Windows) on the right monitor at 640x480 resolution, I see in output_log.txt the same “Pointer entered successfully!” log entry.
- Now, if I build and run this project as a standalone .exe on the right monitor at 1920x1080 resolution, I don't get the "Pointer entered successfully!" log entry. It seems as if the PointerEnter event just never fires.
The Project:
- Create a new 2D project (Unity 5.3.1f1 Personal).
- Create a canvas.
- Create a panel on the canvas.
- Set the panel anchors to stretch/top, panel Height = 50, and panel Pos Y = -25.
- Add a Horizontal Layout Group component to the panel, uncheck Child Force Expand width/height, and set Child Alignment to Middle Right.
- Add an image to the panel.
- Add a Layout Element component to the image and set Preferred Width and Preferred Height to 50 each.
- Add a Box Collider 2D component to the image with offset 0,0 and size 50,50.
- Add an Event Trigger to the image that calls a simple script when PointerEnter happens.
- In the simple script, write a Debug.Log to show that the PointerEnter event triggered successfully (see the sketch after this list).
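For reference, a minimal sketch of that "simple script" is below. The class and method names are placeholders of my own; any public parameterless method works, wired to the Event Trigger's PointerEnter entry in the Inspector.

```csharp
using UnityEngine;

// Sketch of the "simple script" referenced in the steps above.
// Class and method names are placeholders, not from the original project.
public class PointerEnterLogger : MonoBehaviour
{
    // Wire this method to the Event Trigger's PointerEnter entry in the Inspector.
    public void OnPointerEntered()
    {
        Debug.Log("Pointer entered successfully!");
    }
}
```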
This is what it looks like: [screenshot]
Cranke · #2
It looks like the problem is related to the Target Display set on the overall canvas (in Screen Space - Overlay mode). The canvas's targetDisplay is always set to Display 1 (index 0), and when that is the case, the problems I describe above happen. If I set the canvas to Display 2 (index 1), which, by the way, is the display the program is actually running on, everything goes back to working like magic.
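If you want to apply the same fix from code rather than in the Inspector, here is a minimal sketch. The component name and the hard-coded display index are assumptions, so adjust them to your own setup.

```csharp
using UnityEngine;

// Sketch of the workaround: force the overlay canvas onto the display the game runs on.
// The class name and serialized fields are placeholders, not an official API.
public class CanvasDisplayFixer : MonoBehaviour
{
    [SerializeField] private Canvas overlayCanvas;        // the Screen Space - Overlay canvas
    [SerializeField] private int targetDisplayIndex = 1;  // Display 2 = index 1

    void Awake()
    {
        // Only retarget if that display actually exists on this machine.
        if (overlayCanvas != null && targetDisplayIndex < Display.displays.Length)
        {
            overlayCanvas.targetDisplay = targetDisplayIndex;
        }
    }
}
```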
If anyone knows why I pick Display 2 from the Display Resolution Dialog but canvas.targetDisplay ends up set to Display 1, I'd be interested.
I've posted a follow-up question to figure out why the discrepancy exists and what I can do about it manually (link text).