Send it to Tim Cook's email. It worked for me in getting a DisplayPort DSC bug fixed: after Catalina, later macOS releases lost the ability to drive monitors above a 60Hz refresh rate.
Apple support tortured me with all kinds of diagnostics, only to WontFix it a few weeks later. I wrote the email and it got fixed in Sonoma :)
https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...
Thank you
No, it didn’t get fully fixed.
Fucking with DP 1.4 was how they managed to drive the Pro Display XDR.
If your monitor could downgrade to DP 1.2, you got better refresh rates than over 1.4 (mine did 95Hz SDR and 60Hz HDR over 1.4, but if the monitor claimed it could only do 1.2, that went to 120/95 on Big Sur and above; the same monitor managed 144Hz HDR on Catalina).
I would be absolutely unsurprised if their fix was to lie to non-Apple monitors during negotiation, claiming the GPU only supported 1.2, and I would be equally unsurprised to learn that this is related to the current issue.
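For context on why those numbers are strange, here is some rough link-budget math. This is only a sketch: it ignores blanking intervals, and DSC or 4:2:2 chroma subsampling would change the budget substantially (the quoted 120/95Hz over "1.2" almost certainly relied on one of those).

```swift
import Foundation

// Back-of-the-envelope DisplayPort bandwidth check. Ignores blanking
// intervals, DSC, and chroma subsampling, all of which change the math.
// Link rates are effective data rates after 8b/10b encoding.
let dp12Gbps = 17.28  // DP 1.2: 4 lanes of HBR2
let dp14Gbps = 25.92  // DP 1.4: 4 lanes of HBR3

func requiredGbps(width: Int, height: Int, hz: Int, bitsPerPixel: Int) -> Double {
    Double(width * height * hz * bitsPerPixel) / 1e9
}

for hz in [60, 95, 120, 144] {
    let need = requiredGbps(width: 3840, height: 2160, hz: hz, bitsPerPixel: 24)
    print(String(format: "4K@%dHz needs ~%.1f Gbit/s  (DP 1.2: %@, DP 1.4: %@)",
                 hz, need,
                 need <= dp12Gbps ? "fits" : "no",
                 need <= dp14Gbps ? "fits" : "no"))
}
// 4K@144 8-bit RGB (~28.7 Gbit/s) doesn't fit even DP 1.4 uncompressed,
// which is why DSC support matters for these monitors.
```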
Ahh, true, I now top out at 120Hz, but it's fine, which is why I said fixed :) I now recall that on Catalina I had the full 144Hz and VRR options! The monitor is a Dell G3223Q via a CalDigit TS4 DP.
The ideal work/coding resolutions and sizes for macOS that I would suggest, if you are going down this rabbit hole:
- 24 inch 1080p
- 24 inch 4K (2x scaling)
- 27 inch 1440p
- 27 inch 5K (2x scaling)
- 32 inch 6K (2x scaling)
Other sizes are either going to look bizarre or force you to deal with fractional scaling.
Given that 4K is common at 27/32 inches and those are cheap displays, these kinds of problems are expected. In the past I personally refused to believe that 27 inch 4K was as bad as people say and got one myself, only to regret buying it. Get the correct size and scaling and your life will be peaceful.
I would recommend the same for Linux and Windows too, tbh, but people who game might be fine with other sizes and resolutions.
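If it helps to see why exactly those pairings work, here is a quick pixels-per-inch sketch (panel resolutions assumed to be the usual ones; 6016x3384 is the Pro Display XDR's):

```swift
import Foundation

// PPI for common size/resolution combos. The "good" macOS combos cluster
// near ~110 PPI (1x) or ~185-220 PPI (2x); 27"/32" 4K lands awkwardly
// in between, which is what forces fractional scaling.
func ppi(_ w: Double, _ h: Double, diagonal: Double) -> Double {
    (w * w + h * h).squareRoot() / diagonal
}

let combos: [(String, Double, Double, Double)] = [
    ("24\" 1080p", 1920, 1080, 24),
    ("24\" 4K",    3840, 2160, 24),
    ("27\" 1440p", 2560, 1440, 27),
    ("27\" 4K",    3840, 2160, 27),   // the in-between case
    ("27\" 5K",    5120, 2880, 27),
    ("32\" 4K",    3840, 2160, 32),   // the in-between case
    ("32\" 6K",    6016, 3384, 32),
]

for (name, w, h, d) in combos {
    print(name, String(format: "~%.0f PPI", ppi(w, h, diagonal: d)))
}
```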
For me it would be 16-27" 4k is fine, but and as you go up to 32" I'd be wanting 5 or 6k ideally as it's quite noticable for text (even when high DPI scaling is working and across operating systems).
32" 4k display at fraction scaling of 1.5 (150%) is fine for my day-to-day work (Excel, VS Code, Word, Web browsing, Teams etc.). It delivers sharp enough text at an effective resolution of 2560x1440 px. There are many 32" 4k displays that are affordable and good enough for office workers. I work in a brightly lit room, so I find that monitor brightness (over 350 nits) is the most important monitor feature for me, over text sharpness, color accuracy, or refresh rate.
So macOS supports only a handful of low-DPI resolutions, and HiDPI must be an integer multiple of one of those?
It doesn't have to be, but it's really designed to run at exactly 2x scale.
What makes you say that? Unless I am mistaken, it's only the Pro models that run at 2x by default.
I use a 4K 32'' Asus ProArt monitor and didn't notice any difference between my M2 Pro and my M4 Pro. I will admit my eyesight is not the best anymore but I think I would notice given I'm a bit allergic to blurry monitors.
Anyway I will run the diagnostic commands and see what I get.
I thought I was going crazy when my new M4 seemed "fuzzier" on my external 4Ks. I tried replicating settings from my old MacBook to no avail. I wonder if Apple is doing this on purpose for everything except their own displays.
It's a bit nit-picky on my part, but this bizarre world of macOS resolution/scaling handling vs. other operating systems (including Windows 11, for crying out loud) is one of my biggest gripes with using Apple hardware.
I remember having to work hard years ago to make a non-Apple display look 'right' on an Intel-based Mac due to weirdness with scaling and resolutions that a Windows laptop didn't even flinch at handling. It was a mix of hardware limitations and a lack of options for the resolutions and refresh rates available over a Thunderbolt dock, things I shouldn't have to think about.
I honestly hope they finally fix this. I would love it if they allowed sub-pixel text rendering options again too.
Props to the author for putting in what looks like a ton of work trying to navigate this issue; it's a shame they have to go to these lengths to even have their case considered.
Thanks. It was a good portion of my weekend bashing my head against the keyboard trying to figure out what was going on and whether there was a workaround I could use (there isn't, that I've found).
I went to hell and back trying to get PIP/PBP to work on my 57" G9 ultrawide with my M2 Pro. I ended up having to use a powered HDMI dongle, a DisplayLink cable, and DisplayPort, with 3 virtual monitors via BetterDisplay. Enabling the BD setting that allows resolutions outside the Mac's limitations is what did the trick. I don't envy OP. Having 5120x1440 at LoDPI was the worst: just ever so slightly too fuzzy, but the perfect UI size. I eventually got a steady 10240x2880 @ 120Hz with HDR. I literally laughed out loud when I read the title of the thread. Poor guy.
Sadly I have the issue on a new M5 Air. I have a 60Hz 4K work monitor and two high-refresh 4K gaming displays. The 60Hz one pairs fine with either gaming monitor, but with the two gaming ones together, one just doesn't get recognized. I spent way too long trying new cables before realizing it's a bandwidth limitation.
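A rough sketch of that bandwidth math, assuming 8-bit RGB and ignoring blanking and DSC:

```swift
import Foundation

// Rough uncompressed bandwidth per display (no blanking, no DSC):
func gbps(_ w: Int, _ h: Int, _ hz: Int, bitsPerPixel: Int = 24) -> Double {
    Double(w * h * hz * bitsPerPixel) / 1e9
}

let work   = gbps(3840, 2160, 60)    // ~11.9 Gbit/s
let gaming = gbps(3840, 2160, 144)   // ~28.7 Gbit/s each

print(String(format: "60Hz + one gaming display: ~%.0f Gbit/s", work + gaming))
print(String(format: "two gaming displays:       ~%.0f Gbit/s", 2 * gaming))
// Two high-refresh 4K streams (~57 Gbit/s uncompressed) are far past a
// ~40 Gbit/s Thunderbolt 3/4 link. DSC narrows the gap, but no cable
// swap changes the budget.
```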
This is not a normal Retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't, but almost no one wants this, which probably explains why Apple doesn't care.
In my case it's a standard LG UltraFine 4K monitor plugged into a standard 16" M5 MacBook Pro via standard Thunderbolt (over USB-C). I'm not sure what's not normal about this? I've confirmed it with other monitors and M5 MacBook Pros as well.
Isn't that just 2x supersampling? If you want "perfect" antialiasing that's the minimum you need, no?
Yes, it is supersampling but historically almost no one runs that way.
To be frank, it's kind of embarrassing if an entry-level Windows laptop with a decent integrated GPU handles this without much effort.
Apple is free to make its own choices on priorities, but I'm disappointed when something considered the pinnacle of creative platforms, sporting one of the most advanced consumer processors available, can't handle a slightly different resolution.
I don't know why this was downvoted; I agree that this is a highly unusual configuration. Why render to a framebuffer with 2x the pixels in each direction vs. the actual display, only to then scale the whole thing down by 2x in each direction?
TFA doesn't say: does anyone know if this applies to 5K and 6K monitors? On my 5K display on an M4 Max, I see the default resolution in System Settings is 2560x1440.
If the theory about the framebuffer pre-allocation strategy holds any water, I would think that 5K and 6K devices would suffer too, maybe even more. Given that you can attach two 5K monitors, the pre-allocation strategy as described would need to account for that.
This would be even more compelling if you included screenshots with magnified detail insets showing the text blur.
I'm sure you've already given this a crack via some other technique (I just Cmd-F'd for it and didn't find it), but I have had monitors with confusing EDIDs before that macOS didn't handle well, and the "screenresolution" CLI app https://github.com/jhford/screenresolution always let me set an arbitrary one. It was the only way to get some monitors to display at 100Hz for me, and it worked very well for that since the resolution is mostly sticky.
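For anyone curious what that tool does under the hood: as far as I can tell it is built on the Quartz Display Services (CGDisplay) API. A minimal Swift sketch of the same idea, listing every mode for the main display:

```swift
import Foundation
import CoreGraphics

// List every display mode for the main display, including the HiDPI
// duplicates that this API normally hides.
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary
let display = CGMainDisplayID()

guard let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] else {
    fatalError("could not copy display modes")
}

for mode in modes {
    // width/height are in points; pixelWidth/pixelHeight are physical
    // pixels, so a HiDPI mode shows up as e.g. 1920x1080 backed by 3840x2160.
    print("\(mode.width)x\(mode.height) @ \(mode.refreshRate)Hz",
          "(pixels: \(mode.pixelWidth)x\(mode.pixelHeight))")
}

// Applying one is a single call (returns a CGError):
// CGDisplaySetDisplayMode(display, someMode, nil)
```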
Wouldn't HiDPI be 1080p@2x? Is that still available?
Yeah. I don't get it. If you've got a 3840x2160 display, intended use on macOS as a 1920x1080@2x display, what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?
(I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)
> what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?
Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.
Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.
Feels like a huge power loss just to get slightly better text. You slow rendering down 4x for this.
Yes, but Apple got to drop subpixel anti-aliasing support because this workaround is "good enough" for all of their built-in displays and their overpriced, mediocre external ones, so we all get to suffer rendering 4x the pixels we need.
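To put numbers on "4x the pixels", here is the per-frame cost of the buffers being discussed (a sketch assuming 4-byte BGRA8 pixels; wide-gamut or HDR formats cost proportionally more):

```swift
import Foundation

// Pixel counts and per-frame memory for the candidate backing buffers.
let buffers = [("native 4K (1x)", 3840, 2160),
               ("1440p @ 2x",     5120, 2880),
               ("2160p @ 2x",     7680, 4320)]

for (name, w, h) in buffers {
    let mib = Double(w * h * 4) / 1_048_576
    print(name, "-", "\(w * h) px,", String(format: "~%.0f MiB per frame", mib))
}
// The 7680x4320 buffer is exactly 4x the pixels of the panel it is
// ultimately shown on: that's the "4x the pixels" complaint above.
```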
Yes 1920x1080@2x absolutely works on M4. I use this mode all day every day.
Yeah if I understand it correctly, this is more like 2160p@2x which is... unusual?
Yes, I would actually be surprised to learn that mode is available on any system. I've never seen it anywhere, though I only have an M1 Pro and an M4 Pro (and various Intel Macs).
You're rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode; it will look better and perform way better!
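Taking that 1.75x figure at face value (it is this commenter's number, not anything Apple documents), a quick check of typical "looks like" options against a 4K panel shows which HiDPI modes such a policy would cull:

```swift
import Foundation

// Check typical "looks like" widths against a 3840-wide panel under a
// hypothetical 1.75x framebuffer-to-panel limit.
let panelWidth = 3840.0

for looksLikeWidth in [1920.0, 2560, 3008, 3840] {
    let bufferWidth = looksLikeWidth * 2          // HiDPI renders at 2x
    let ratio = bufferWidth / panelWidth
    print(String(format: "looks like %4dw -> %4dw buffer, ratio %.2fx %@",
                 Int(looksLikeWidth), Int(bufferWidth), ratio,
                 ratio <= 1.75 ? "(offered)" : "(culled)"))
}
// Only "looks like 3840x2160" (a 7680-wide buffer, 2.0x) exceeds 1.75x,
// which is exactly the mode at issue in this thread.
```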
Because 1x mode has no subpixel antialiasing and thus looks absolutely terrible.
I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.
Yeah, I'm not sure what the point of this article is really, or maybe I'm misunderstanding something? There's no such thing as 4K HiDPI on a 4K monitor; that would be 2160p @ 2x on an 8K monitor. 4K at 100% scaling looks terrible in general across every OS.
This is the sort of Apple gotchas that really upset me.
They've got a good thing going, but they keep finding ways to alienate people.
Their suggestion: get the Apple monitor we just launched.
Well, it sounds like a real issue, but the diagnosis is AI slop. You can see, for example, how it takes the paragraph quoted from waydabber (attributing the issue to dynamic resource allocation) and expands it into a whole section without really understanding it. The section is in fact self-contradictory: it first claims that the DCP firmware implements framebuffer allocation, then almost immediately goes on to say it's actually the GPU driver and "the DCP itself is not the bottleneck". Similar confusion throughout the rest of the post.
I think you are probably right: it's a real problem.
As an article it is not 100% coherent, but there is valid data and a clear, real problem.
Is this for specific versions of macOS?
The article doesn't mention it.
How did none of the Apple devs notice this? 4K at 32 inches is the industry standard for HiDPI monitors.
Apple doesn't make a 4K external monitor.
They’re likely all on Studio Displays.
And prior to Apple’s re-entry into the display market, everybody internally was likely on 2x HiDPI LG UltraFine displays or integrated displays on iMacs and MacBooks.
Fractional scaling (and lately, even 1x scaling “normal”) displays really are not much of a consideration for them, even if they’re popular. 2x+ integer scaling HiDPI is the main target.
Not in the Apple world, and this article is centered on Apple.
https://bjango.com/articles/macexternaldisplays/
- 24" you need 4K.
- 27" you need 5K.
- 32" you need 6K.
Windows subpixel anti-aliasing (ClearType) manages a lot better with lower pixel density. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really.
This still baffles me. Never mind Windows; I've been able to get sub-pixel font rendering, with the ability to fine-tune it, on virtually any major Linux distro since around 2010.
Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "HiDPI everywhere": Retina or Retina-like displays. Which would be great... except "everywhere" turned out to mean "very specific monitors supporting specific resolutions".
Tbh I'm not even sure what the issue is here. I have a personal M1 MacBook, a work M4, and a 4K display. I don't see any issues or differences between them on my display. The M4 seems to be outputting a 4K image just fine.
The article could just be AI slop, since it contains hyper-in-depth debugging without ever articulating what the problem is.
In layman's terms: with some UI scaling options, text is rendered blurry by M4/M5 Macs.
Don't think I'd call 4K at 32" high DPI.
Now I know I was not crazy and the "cheap" 4K screen I bought a couple months ago doesn't actually suck.
Tim Apple's Apple has been fu#$%&ing me again..
Apple software is written by codeslaves under constant fear of deportation. They’re cheap and they can’t do software worth a damn.