Module winit::dpi

DPI is important, so read the docs for this module if you don't want to be confused.

Originally, winit dealt entirely in physical pixels (excluding unintentional inconsistencies), but now all window-related functions both produce and consume logical pixels. Monitor-related functions still use physical pixels, as do any context-related functions in glutin.

If you've never heard of these terms before, then you're not alone, and this documentation will explain the concepts.

Modern screens have a defined physical resolution, most commonly 1920x1080. Independent of that is the amount of space the screen occupies, which is to say, its width and height in millimeters. The relationship between these two measurements is the pixel density. Mobile screens require a high pixel density, as they're held close to the eyes. Larger displays also benefit from a higher pixel density, hence the growing presence of 1440p and 4K displays.

So, this presents a problem. Let's say we want to render a square 100px button. It will occupy 100x100 of the screen's pixels, which in many cases seems perfectly fine. However, because this size doesn't account for the screen's dimensions or pixel density, the button's size can vary quite a bit. On a 4K display, it would be unusably small.

That's a description of what happens when the button is 100x100 physical pixels. Instead, let's try using 100x100 logical pixels. To map logical pixels to physical pixels, we simply multiply by the DPI (dots per inch) factor. On a "typical" desktop display, the DPI factor will be 1.0, so 100x100 logical pixels equates to 100x100 physical pixels. However, a 1440p display may have a DPI factor of 1.25, so the button is rendered as 125x125 physical pixels. Ideally, the button now has approximately the same perceived size across varying displays.
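The conversion described above can be sketched as follows. These simplified structs mirror winit's LogicalSize and PhysicalSize in name only; the real types carry additional conversions and fields, so treat this as an illustration of the multiplication, not the actual API:

```rust
// Minimal sketch of the logical -> physical conversion described above.
// These are simplified stand-ins for winit's LogicalSize/PhysicalSize types.

#[derive(Debug, PartialEq)]
struct LogicalSize { width: f64, height: f64 }

#[derive(Debug, PartialEq)]
struct PhysicalSize { width: f64, height: f64 }

impl LogicalSize {
    // Map logical pixels to physical pixels by scaling with the DPI factor.
    fn to_physical(&self, dpi_factor: f64) -> PhysicalSize {
        PhysicalSize {
            width: self.width * dpi_factor,
            height: self.height * dpi_factor,
        }
    }
}

fn main() {
    let button = LogicalSize { width: 100.0, height: 100.0 };
    // Typical desktop display, DPI factor 1.0: 100x100 physical pixels.
    println!("{:?}", button.to_physical(1.0));
    // 1440p display with DPI factor 1.25: 125x125 physical pixels.
    println!("{:?}", button.to_physical(1.25));
}
```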

Failure to account for the DPI factor can create a badly degraded user experience. Most notably, it can make users feel like they have bad eyesight, which will potentially cause them to think about growing elderly, resulting in them entering an existential panic. Once users enter that state, they will no longer be focused on your application.

There are two ways to get the DPI factor:

- Track the HiDpiFactorChanged event of your windows, which is sent whenever a window's DPI factor changes.
- Query it directly, by calling get_hidpi_factor on a window (for the factor currently applied to it) or on a monitor.

Depending on the platform, the window's actual DPI factor may only be known after the event loop has started and your window has been drawn once. The most robust way to handle these cases is to monitor the HiDpiFactorChanged event and dynamically adapt your drawing logic to the current DPI factor.

Here's an overview of what sort of DPI factors you can expect, and where they come from:

The window's logical size is conserved across DPI changes, resulting in the physical size changing instead. This may be surprising on X11, but is quite standard elsewhere. Physical size changes always produce a Resized event, even on platforms where no resize actually occurs, such as macOS and Wayland. As a result, it's not necessary to separately handle HiDpiFactorChanged if you're only listening for size.

Your GPU has no awareness of the concept of logical pixels, and unless you like wasting pixel density, your framebuffer's size should be in physical pixels.

winit will send Resized events whenever a window's logical size changes, and HiDpiFactorChanged events whenever the DPI factor changes. Receiving either of these events means that the physical size of your window has changed, and you should recompute it using the latest values you received for each. If the logical size and the DPI factor change simultaneously, winit will send both events together; thus, it's recommended to buffer these events and process them at the end of the queue.
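The buffering strategy above can be sketched like this. The event names mirror winit's, but the types here are simplified stand-ins rather than the real API; the point is to fold the whole queue into the latest logical size and DPI factor before recomputing the physical size once:

```rust
// Illustrative sketch (simplified stand-ins, not winit's actual event types):
// buffer the queued events, keeping only the latest logical size and DPI
// factor, then recompute the physical size once at the end of the queue.

enum Event {
    Resized(f64, f64),       // new logical size
    HiDpiFactorChanged(f64), // new DPI factor
}

// Fold a queue of events into the resulting physical size.
fn physical_after(events: &[Event], mut logical: (f64, f64), mut dpi_factor: f64) -> (f64, f64) {
    for event in events {
        match *event {
            Event::Resized(w, h) => logical = (w, h),
            Event::HiDpiFactorChanged(factor) => dpi_factor = factor,
        }
    }
    // One recomputation, using the latest values of both.
    (logical.0 * dpi_factor, logical.1 * dpi_factor)
}

fn main() {
    // The window moves to a monitor with DPI factor 2.0; both events arrive
    // together, so we process them as a batch.
    let queue = [Event::HiDpiFactorChanged(2.0), Event::Resized(400.0, 300.0)];
    let (w, h) = physical_after(&queue, (800.0, 600.0), 1.0);
    println!("physical size: {}x{}", w, h); // prints "physical size: 800x600"
}
```

Processing the events one at a time would recompute the physical size twice, with an inconsistent intermediate state; folding the queue avoids that.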

If you never received any HiDpiFactorChanged events, then your window's DPI factor is 1.



Structs

LogicalPosition
A position represented in logical pixels.

LogicalSize
A size represented in logical pixels.

PhysicalPosition
A position represented in physical pixels.

PhysicalSize
A size represented in physical pixels.

Functions

validate_hidpi_factor
Checks that the DPI factor is a normal positive f64.
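A check like this could plausibly be implemented as follows. This is a sketch based on the description above, not necessarily winit's exact code: a "normal positive f64" excludes zero, subnormals, infinities, NaN, and negative values, which is exactly what the standard library's f64::is_normal and f64::is_sign_positive cover together:

```rust
// Sketch of a "normal positive f64" check, per the description above
// (not necessarily winit's exact implementation).
fn validate_hidpi_factor(dpi_factor: f64) -> bool {
    // is_normal() rejects zero, subnormals, infinities, and NaN;
    // is_sign_positive() rejects negative values.
    dpi_factor.is_sign_positive() && dpi_factor.is_normal()
}

fn main() {
    println!("{}", validate_hidpi_factor(1.25));     // true
    println!("{}", validate_hidpi_factor(0.0));      // false: zero is not normal
    println!("{}", validate_hidpi_factor(-1.0));     // false: negative
    println!("{}", validate_hidpi_factor(f64::NAN)); // false: NaN is not normal
}
```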