Pixel Perfection for Specific Resolution
This post has been updated per Haleth's information below. Thanks Haleth!
I came across an issue I didn't like while designing my UI: in-game pixels were not "real" pixels. They were close, but not exact, so textures, borders, etc. would sometimes get fudged when WoW scales its internal UI coordinate system (which is 768 units tall) up to your monitor's resolution. The solution is to manually set the uiScale CVar (which just stores the value from the slider in the Video Options). Here's the quick'n'dirty macro line: Code:
/run SetCVar("uiScale", 768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)")) To calculate the value by hand, divide 768 (the game's internal height, which is always — as far as I can tell — 768) by your resolution height (mine was 1050 for a 1680x1050 resolution): Code:
768 / 1050 = 0.731428571 Make sure UI scaling is actually enabled first: Code:
/console useuiscale 1 You can check the resulting in-game size with: Code:
/run print(GetScreenWidth(), "x", GetScreenHeight()) Disclaimer: This is possibly/probably a bad idea if you plan to package your UI. I use it only for my own UI. |
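For anyone who wants the one-liner unpacked, here's the same logic as a readable sketch. The variable names are mine, and I'm assuming the useUiScale CVar matches the /console useuiscale command; the API calls themselves are the ones from the macro above.

```lua
-- The current resolution comes back as a string such as "1680x1050".
local resolution = ({GetScreenResolutions()})[GetCurrentResolution()]

-- Grab the height (the digits after the "x") and compute the scale
-- that maps the UI's fixed 768-unit height onto real screen pixels.
local height = tonumber(string.match(resolution, "%d+x(%d+)"))
local perfectScale = 768 / height  -- 1680x1050 -> 0.731428571...

SetCVar("useUiScale", 1)   -- assumed equivalent of /console useuiscale 1
SetCVar("uiScale", perfectScale)
```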
I too have noticed "fudging" or "blending" with my UI but I always simply thought of it as a limitation of the UI overlay, thanks so much for this! Now my OCD is satiated.
|
@Gsusnme, Glad I could help! Indeed it was also my OCD that led me down this path! ;)
|
It's much easier if you just do:
/script SetCVar("uiScale", 768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)")) |
UI Scale is one way, but when I'm making an addon, I can't set the user's UI scale to my value, so I have to work around it. If I scale any value with the following function, the result is the number of "virtual pixels" the frame needs in order to cover the desired number of "real pixels".
Code:
local mult = SetCVar("uiScale", 768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)"))/GetCVar("uiScale") |
If you don't want to override the user's uiScale, you can use the following code to position a frame on real pixel boundaries: http://pastey.net/134131
It includes some test code that renders pixel-wide nested borders. The top one in this screenshot is adjusted, the bottom one isn't. |
Quote:
One problem though: neither GetCVar("uiScale") nor UIParent:GetScale() returns its value instantly; you'll have to hook an event like PLAYER_ENTERING_WORLD (I don't know the first event at which they become available, but I do know they're available at PLAYER_ENTERING_WORLD). |
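A minimal sketch of that workaround, assuming PLAYER_ENTERING_WORLD is early enough as the post above suggests (the frame name and structure are mine):

```lua
-- Defer any pixel math until the scale values are actually available.
local watcher = CreateFrame("Frame")
watcher:RegisterEvent("PLAYER_ENTERING_WORLD")
watcher:SetScript("OnEvent", function(self, event)
    local scale = UIParent:GetEffectiveScale()
    -- ... compute pixel-perfect sizes/positions here ...
    self:UnregisterEvent("PLAYER_ENTERING_WORLD")  -- one-shot, as noted below
end)
```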
Do any of these methods take into consideration using windowed mode (sans Maximized option)? I ask because, say your Windows desktop resolution is 1024x768. If you run WoW at 1024x768 in a non-maximized windowed mode, the actual resolution is a bit lower due to the WoW window's borders, title bar, and even the task bar.
|
Quote:
In fact, if I understand what you guys are doing here, I highly disapprove. It looks like you guys are making your addons ignore the UI scale... and in my opinion, this is a terrible idea! This is not what I had in mind AT ALL. Your addons should never ignore a user's UI scale. If I found one that did, I would probably promptly trash it. Please correct me if I misunderstood what this code is doing. |
It doesn't ignore the scale. It just (optionally, per element) multiplies the value of a real pixel so that it fits the size of a full ui pixel and won't be anti-aliased.
Both solutions are acceptable, but the fact is not everybody is going to have their UI set to this scale. Also, just as importantly, changing the UI scale modifies your entire UI. When switching to a normalized UI scale I had to drastically modify every individual element of my UI to be ~20% smaller and in a position to match the old. If I'm releasing an addon that requires pixel-perfect borders, should I tell all of my users to change every UI element's scale and position, or should I emulate the value of a pixel in my own UI and leave everything else alone? I'd always choose the latter. |
Quote:
What am I missing here? This is the exact code from Saiket. From the looks of it, it blatantly ignores scale. (It should scale with the map, but it doesn't.) Quote:
If an addon developer runs at a UI Scale of 2 and optimizes font sizes, etc for that, it's going to look like hell for someone with a default of 0.9 (no ui scale). And by using the UIScale method when setting up my custom UI or pack, I'm making every addon pixel perfect, not just some random one that I'm creating. This goal isn't to create one pixel perfect addon, but an entire pixel perfect UI regardless of the addons I use. I really believe we're talking about two completely different problems. Hopefully you'll see my intents now. Quote:
|
Quote:
I'm still not entirely sure what you want, or what you don't want. If you want everything pixel perfect, you must either use the uiScale method or hook every SetPoint operation in the API. If you want only your mod to be pixel perfect, position your frames using the formula in my test code. |
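This isn't Saiket's exact formula (the pastey link above has that), but the idea of snapping a coordinate to real pixel boundaries can be sketched like this; the function name and structure are my own:

```lua
-- Round a coordinate to the nearest physical screen pixel.
local function PixelSnap(frame, value)
    -- Size of one physical pixel, expressed in this frame's units:
    local screenHeight = string.match(
        ({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)")
    local onePixel = (768 / screenHeight) / frame:GetEffectiveScale()
    return onePixel * math.floor(value / onePixel + 0.5)
end

-- Usage: frame:SetPoint("TOPLEFT", PixelSnap(frame, x), PixelSnap(frame, y))
```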
Quote:
I also wanted to suggest a method for addon authors to design a more consistent user interface experience. All too often I see an author who designed his options GUI with his scale set to something crazy, and it ends up not blending at all with the rest of the UI components. I'm suggesting that by designing at your native monitor resolution, you can deliver a higher-quality, more consistent experience. Quote:
|
Quote:
I probably lack knowledge here, but all scale(1) does is multiply by 1. So scale(23.124578) is still 23.124578 with this function of yours |
Quote:
http://wowprogramming.com/docs/api/GetCVar I also don't agree with nightcracker's original code either (and this may be simply a misunderstanding on my part as well). It seems like the multiplier calculation should only call GetCVar: Code:
local mult = (768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)"))/GetCVar("uiScale") Anyways, the premise of this method is finding the reverse multiplier that WoW is using: it scales you one way, and you scale it back before you draw, thereby negating the effects of both the UI scaling and the "in-game" resolution based on a 768-pixel-high screen. I'm still going to reiterate this warning: using this method will cause the code to ignore UI scaling. Use it only where appropriate. For instance, a 1-pixel border ignoring UI scale is fine, as it will stretch to encompass the entire container and still serve its purpose while retaining its pixel perfection. Using it to scale an entire frame will result in the UI looking enlarged or contracted in comparison to your addon, because your addon isn't scaling with everything else. |
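For illustration, here's how that corrected multiplier might be put to use. The Scale helper and the border usage are my own names for the sketch, not code from the thread:

```lua
-- mult is how many UI units one physical pixel occupies at the
-- current uiScale (using only GetCVar, per the correction above).
local screenHeight = string.match(
    ({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)")
local mult = (768 / screenHeight) / GetCVar("uiScale")

-- Hypothetical helper: convert "real" pixels to UI units.
local function Scale(pixels)
    return mult * pixels
end

-- e.g. a border texture that is exactly one physical pixel tall:
-- border:SetHeight(Scale(1))
```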
Ok, then I can't use the scale function. I'm still not sure which code to actually use then, which would be useful. Probably the stuff by Saiket. But that code is kind of 'littered' :o
|
Quote:
You can try with my modification as well: Code:
local mult = (768/string.match(({GetScreenResolutions()})[GetCurrentResolution()], "%d+x(%d+)"))/GetCVar("uiScale") I agree that Saiket's code is geared more towards a specific purpose, so it's going to be a little obfuscated. |
Heh, thanks. Have a look at this screenshot. On the topright, you'll see the addon I'm working on.
http://img651.imageshack.us/img651/2...210011038h.jpg For now, it's always pixelperfect with the manual /run thingy... |
Can anyone give some enlightenment on this:
Is there ANY way to get the user's UI scale BEFORE any loading-progress event (PLAYER_ENTERING_WORLD, VARIABLES_LOADED, etc.)? |
I don't think there is. The default UI waits until it's available (VARIABLES_LOADED) before setting the scale on UIParent but as this happens during the loading screen or loading progress, no one notices the scaling happening.
|
I still haven't found a suitable solution.
I've looked into the scripts here; the problem is that if you use such a script, the addon no longer scales with the UI scale but uses a fixed scale at all times. It also gives a weird effect if your addon IsSizable() and the user tries to resize it... I think the easiest way is just to tell people to use the manual /run SetCVar() if they want to use a UI scale and not have a problem with pixels. |
Can anyone who's used this method before successfully share a demo? I too, tried this out and was unable to get it to perform correctly.
What I want to see is a scalable and movable frame whose borders and size always break on "real" pixels rather than in-game pixels regardless of scale or position. |
Quote:
Also, Dominos is gay for getting pixel-perfect borders, especially with PixelSkin for ButtonFacade. I might try out a different AB mod just because of this. :/ |
I wasn't aware Dominos had a sexual preference. :rolleyes:
|
Quote:
If you want your whole UI to be perfect, check out the first post in this thread (that includes Dominos if you leave it on 100% scale). |
Quote:
Wait, does this make me GAY?!? Oh, man! How'm I gonna break this to my girlfriend? |
Actually, I'll illustrate:
Quote:
Quote:
|
In other words, whenever someone says "gay" because it's shorter than "this is not working correctly, and here's what it's doing wrong...", I say "I can't help you." because it's shorter than explaining that it's working correctly but the user just doesn't know what they're doing. :D
|
Pedantry is not funny.
|
Quote:
Have I mentioned lately how good it is to have you back here on the forums? /nods |
Quote:
Oh, and I edited the post you lifted for your new sig... There was a critical "NOT" missing in the middle there, so it makes more sense now. :D:mad: |
You only have to hit this macro once if you are not switching between windowed mode and fullscreen mode frequently.
Basically, you have to hit it once every time you change something in your graphics settings (multisampling, for example) and hit Apply. |
I have a problem... I read about this "pixel perfection" method a while ago on these forums, but it does not work exactly for all resolutions.
My resolution is 1440x900. When I put in 768/900, the result is 0.85333333333333333 to infinity. I edited my config file so that the scale is uiScale "0.85333333333333". But since it's not an exact number, the resultant UI scale is not perfectly accurate, only the closest available. If I use /run print(GetScreenWidth(), "x", GetScreenHeight()) to see the resolution my number gets me, the result is 1439.9999662615 x 899.9999969073. I set all addons to use 100% scale. The result of all of this is that some of my addons have 1px borders — my minimap, Grid, tooltips — and those look fine. I also have graphic replacements with 1px borders that sometimes appear smudged. Any help/advice? |
I also have a resolution of 1440x900 and I used Google to work out 768/900 for me. The result was 0.85 followed by seven 3s:
0.853333333 Try this in your command prompt and everything should be fine. |
It's still 0.853333.. repeating to infinity. Google just doesn't display it properly.
|
I know what it is, but I used what Google displayed and got a perfect result. The maths here aren't important, he was just asking for a way to solve the problem.
PS: The correct term is 'recurring'. :P |
Let's hope those seven 3's are magical enough then. :)
Actually, it's either recurring or repeating - http://en.wikipedia.org/wiki/Repeating_decimal |
Is it possible to "lock" the UIScale setting? I set up my interface pixel perfect, and any time I change something in the video settings (view distance, etc.) it changes the UIScale after a reload. And no, the Config.wtf isn't locked ;)
|
OK, so, I haven't really been following this thread, mostly because what is being discussed I simply can't follow. You've probably heard the expression "It's all Greek to me". Well, I'm no good at foreign languages. That being said:
So, I ran this: Code:
/run print(GetScreenWidth(), "x", GetScreenHeight()) Code:
1280.0000179939 x 800.00002924007 Since my monitor's resolution is 1280x800, I'm guessing this equates to being able to do "pixel perfect" ...... stuff. Oh, I have WoW set to play in windowed mode and maximized. Don't know if that makes a difference. |
Did Blizzard change something with how the UI's resolution is computed in the beta? I'm getting blurry pixel lines using the macro from the first post in the beta. It works perfectly on the live client.
|
Quote:
|
Code:
/console useuiscale 1 Live: 767.99998249287 Beta: 767.99998249287 It doesn't look like they've changed the way UI resolution is computed, so it must be something else. |
Having had a little play around myself with the live-to-beta scaling changes: I can get some things pixel perfect with 8x multisampling on beta, whereas on live they smudge.
Also, the 'no UI scale' scale seems to have changed. I'm not sure I'm looking at this correctly, but without the 'Use UI Scale' box checked, the numbers I get from UIParent:GetScale() on live and beta are as follows: Live: 0.89999997615814 Beta: 0.73142856359482
And /run print(GetScreenWidth(), "x", GetScreenHeight()) returns different values on live and beta. I'm running the game windowed, maximized, at 1680x1050. Live returns: 1365.333381317 x 853.33341730482 Beta returns: 1680.0000326139 x 1050.0000923593
I honestly don't know tho', this scaling stuff is rather confusing to me and does seem to be a bit of voodoo magic. And also Quote:
Live: 768.00005398167 Beta: 768.00005398167 |
Got exactly the same problem here - there's a tiny tiny difference in UI scale between live and beta for me. Almost everything looks fine, but some 1-pixel gaps become 2-pixel, and font shadows look messed up.
|
I use PLAYER_ENTERING_WORLD; it works fine as long as you unregister the event afterwards, for obvious reasons.
|
vBulletin © 2024, Jelsoft Enterprises Ltd
© 2004 - 2022 MMOUI