04-11-10, 09:49 AM
Pixel Perfection for Specific Resolution

This post has been updated per Haleth's information below. Thanks Haleth!

I came across an issue I didn't like while designing my UI: in-game pixels were not "real" pixels. They were close, but not exact, so textures, borders, and the like would sometimes get fudged when WoW scales the interface up from its in-game resolution to your monitor's resolution.

The solution is to manually set the uiScale CVar (which just stores the value of the UI Scale slider in the Video Options).

Here's the quick'n'dirty macro line:

/run SetCVar("uiScale", 768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)"))
Here's the explanation, plus a longer, more hands-on method:

To calculate the value, you need your resolution height (mine is 1050, from a 1680x1050 resolution), and you need to know that the game's internal height is always (?) 768. You can verify that your in-game height is 768 by running the following lines:

/console useuiscale 1
/console uiscale 1
/run print(GetScreenHeight())
Your UI scale is the ratio between your actual resolution and the in-game resolution. For me, it's:

768 / 1050 = 0.731428571
/console uiscale 0.731428571
NOTE: The minimum value is 0.64; the client will not apply anything lower.
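That floor matters on tall screens: 768 / 1200 = 0.64 exactly, so any display taller than 1200 pixels computes a ratio below the minimum. Here's a variant of the quick'n'dirty macro that clamps the result (the math.max clamp is my addition; the rest is the same macro as above):

/run SetCVar("uiScale", math.max(0.64, 768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)")))

On screens taller than 1200px the UI still won't be exactly pixel-perfect, but this keeps the scale as close as the client allows.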

You can check the resulting size with:

/run print(GetScreenWidth(), "x", GetScreenHeight())
And there you have it: the exact resolution of your monitor (within floating-point accuracy) to work with when designing your UI. Since we're modifying a CVar, the setting is saved across sessions.
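If you'd rather not run a macro by hand, here's a sketch of the same calculation applied automatically on login from an addon file (standard frame/event API; the 0.64 clamp and the PLAYER_LOGIN timing are my assumptions, not something I've packaged):

local f = CreateFrame("Frame")
f:RegisterEvent("PLAYER_LOGIN")
f:SetScript("OnEvent", function()
    -- Parse the height out of the current resolution string, e.g. "1680x1050"
    local height = string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)")
    SetCVar("useUiScale", 1)
    SetCVar("uiScale", math.max(0.64, 768 / height))
end)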

Disclaimer: This is possibly/probably a bad idea if you plan to package your UI. I use it only for my own UI.

Last edited by TravisWatson : 04-11-10 at 11:11 AM.