Dummy display for Apple Silicon Macs to achieve custom resolutions (github.com/waydabber)
333 points by PikachuEXE 30 days ago | 198 comments



You maybe wouldn’t expect it, but in this regard Windows is far superior to macOS. It will output whatever resolution your monitor has, and then you can just set a “scaling factor” which will make the interface exactly the size you like while still being pixel perfect.

Given how simple most UI is, especially these days (circles filled with gradients, roundrects) and given how many different screen sizes and resolutions are used even within Apple’s first party displays, it’s almost insane macOS isn’t resolution independent.


Android is like that too. Some devices have non-integer pixel densities (multipliers or device pixel ratios or scaling factors or whatever term you prefer), especially 1.5x aka "hdpi" that was popular around 2011. You can provide separate resources for each pixel density if you still use bitmap graphics for some reason. Oh and there's also "ldpi", which is something like 0.75x, though there were very few devices with it.

With how advanced the macOS graphics stack is, I don't understand why Apple doesn't do this, insisting instead on integer multipliers. Some iPhone models, too, render the UI at 3x and then downscale it to fit the screen. Even more curious is that the API has returned the scale factor as a float for as long as retina displays have been a thing: https://developer.apple.com/documentation/appkit/nsscreen/13...

edit: one important difference I forgot to mention. On Android, all draw calls on the Canvas, and all view dimensions and other things like text sizes, always take physical pixels. It's your job to multiply and round everything correctly. On macOS and iOS, all graphics APIs take "points", which are those logical pixels you get after the scaling is applied, "density-independent pixels" as Android calls them. I guess Apple thought it would make it easier to adapt existing apps to retina displays. Android has always supported UI scaling, so there was no problem designing the API the way it is.
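The API-convention difference above can be sketched in a few lines (TypeScript for concreteness; the function names are illustrative, not real framework APIs):

```typescript
// Android convention: callers convert density-independent units (dp)
// to physical pixels themselves, rounding to land on the pixel grid.
function dpToPx(dp: number, density: number): number {
  return Math.round(dp * density);
}

// Apple convention: APIs take points; the framework multiplies by the
// scale factor, so app code never sees physical pixels directly.
function pointsToBackingPixels(points: number, scale: number): number {
  return points * scale; // can be fractional at non-integer scales
}

console.log(dpToPx(25, 1.5));               // 25dp on an hdpi (1.5x) device -> 38
console.log(pointsToBackingPixels(3, 1.5)); // 4.5 backing pixels
```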


I worked on the Cocoa frameworks back when the hope was arbitrary scaling. The short answer to your question is "rounding." At 1.5x, a box of length 3 occupies 4.5 physical pixels. Either one side of the box will be blurry, or the box must get resized to align with physical pixels, which can cause mis-centering of text or icons, or unexpected clipping or truncation.
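The rounding dilemma can be made concrete with a tiny sketch (TypeScript, illustrative): snapping both edges of a box to the device-pixel grid keeps the edges sharp, but identical boxes come out different sizes depending on where they sit, which is exactly the mis-centering and clipping problem described above.

```typescript
// Snap a box's edges to the device-pixel grid at a given scale factor.
// Edges stay sharp, but the snapped length depends on the box's origin.
function snapToPixels(origin: number, length: number, scale: number) {
  const left = Math.round(origin * scale) / scale;
  const right = Math.round((origin + length) * scale) / scale;
  return { origin: left, length: right - left };
}

// At 1.5x, a 3-point box covers 4.5 device pixels, so it must snap to
// either 4 or 5 pixels, and which one it gets depends on its position:
console.log(snapToPixels(0, 3, 1.5).length); // ~3.333 points (5 device px)
console.log(snapToPixels(1, 3, 1.5).length); // ~2.667 points (4 device px)
```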

There was also a lot of custom layout code in existing apps, both first and third party, which just didn't work at arbitrary scale factors because it was never tested.

Yet another complication is that apps are expected to handle scale changes, e.g. if the user drags a window from one display to another. Android doesn't have to deal with this.

After years of trying to make it work, Apple basically gave up and did the scaling in WindowServer.


> I worked on the Cocoa frameworks

The thing I love about HackerNews is how you can casually run into people who worked on things you've been using every day for the last 10 years.

> The short answer to your question is "rounding."

Yeah, on Android, you're supposed to round yourself. View dimensions are integers, so you have to; graphics APIs take floats, so you don't strictly have to, but you want to anyway. (Even though there's no such thing as pixel-perfect on AMOLED displays, thanks to their subpixel layout.) When you use XML and specify `25dp`, the system does the rounding for you.

> Android doesn't have to deal with this.

Oh it does! Though it deals with this in a lame way: your activity with all its views gets destroyed and recreated for a "configuration change" unless you've explicitly specified that you can handle this specific kind of change. Density doesn't change often, but the possibility is certainly there.

I remember how around 2013 there was this bizarre device. It was a phone, but it came with a tablet you could dock it into. The tablet contained only a screen and a battery. The screens on the phone and on the tablet had different densities. There were bug reports about my app not handling these transitions correctly, presumably because I was dumb back then. I can imagine something similar happening with the new hotness, foldable devices, where the external and internal screens could have different densities.

And while I have you, one question. Why does macOS have the coordinate system origin in the bottom left? I've always found that bizarre, especially when iOS does use the usual top left.


Ah yes, the Hi-DPI days. I left Apple before all of that was sorted out. I assume all of the code dealing with scaling was just removed from the frameworks?


I agree, resolution independence should be a higher priority for macOS. They've been able to dance around the subject by just doubling everything and calling it Retina, but true accessibility would allow everyone to set the scale that works best for them.


Resolution control on my mac mini + 3rd party 4k monitor works just great, you can choose to have native resolution, 'retina' (i.e. 2x scaling), or a few steps in-between the two, there's plenty of flexibility.

Plus, the way it works means that (as I understand it), programs don't really need to do anything to support all these variations. Behind the scenes, the OS renders everything at a very high resolution, then uses the graphics hardware to smoothly scale everything down to your chosen size.
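As a sketch of that arithmetic (TypeScript; the numbers match the commonly described behavior of macOS scaled modes, where the desktop is composited at 2x the chosen logical size and then resampled to the panel's native resolution):

```typescript
// A "looks like 2560x1440" mode on a 4K panel: the desktop is first
// rendered into a 2x backing store, then downsampled to native pixels.
function backingStore(logicalW: number, logicalH: number) {
  return { w: logicalW * 2, h: logicalH * 2 };
}

const panel = { w: 3840, h: 2160 };     // native 4K panel
const looksLike = { w: 2560, h: 1440 }; // chosen logical size
const backing = backingStore(looksLike.w, looksLike.h); // 5120x2880
const downscale = panel.w / backing.w;  // 0.75: every frame is resampled
console.log(backing, downscale);
```

This is why the in-between settings cost GPU work and memory (a 5120x2880 buffer for a 4K screen) rather than rendering at the panel's resolution directly.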

My eyes aren't good enough to use the native resolution, and using the 'retina' setting wastes too much screen real estate (I mean, that's what you buy a big monitor for, right?), so having something between the two is essential. I was concerned that choosing a non-simple scaling (neither doubled nor native) would make things blurry or slow, but it just works, and works well. Even dragging windows between a high-DPI and a normal-DPI display is seamless.

As an aside, after years of using a low-res monitor, with tiny fonts to squeeze as many terminals and emacs windows onto my screen, high-DPI is a godsend. For a long time after getting the 4k monitor, I'd just play around with the mac's screen zoom accessibility function (uses the mouse wheel to smoothly zoom into an area of the screen) - it was astounding to me just how much you can zoom before individual characters become blocky. They used to be drawn as 8x8 pixels and now, to my aging eyes, each letter looks as smooth and detailed as printed text!


It's not as simple as fixing the display-settings code. For example, WebKit is not happy with non-integer DPRs; it will leave gaps here and there. The cases where those gaps happen are probably rare enough that it's not a showstopper, but I suspect lots of other macOS software has similar issues.

Another tangentially related issue: how do you display a pixel-perfect image at a non-integer DPR? For example, let's say you want to show 8-bit Mario in HTML at exactly 16 (or some multiple of 16) device pixels so that there's no odd scaling. It's quite a pain: it requires JS, and it requires a flexible page layout, since you won't be able to use "CSS pixels" to decide the size to display it at. Possibly that's an argument for never having non-integer DPRs.
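The arithmetic behind that pain can be sketched as follows (TypeScript; in a real page you would read window.devicePixelRatio, here it is just a parameter):

```typescript
// To render a sprite at an exact number of device pixels, the CSS size
// must be devicePx / dpr, which is fractional at non-integer DPRs:
// hence the need for JS and a flexible layout instead of fixed CSS px.
function cssSizeForDevicePixels(devicePx: number, dpr: number): number {
  return devicePx / dpr;
}

console.log(cssSizeForDevicePixels(16, 2));    // 8   (integer DPR: clean)
console.log(cssSizeForDevicePixels(16, 1.25)); // 12.8 (fractional CSS px)
console.log(cssSizeForDevicePixels(32, 1.25)); // 25.6
```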


You're right, it's not simple; it would require a refresh of UIKit and a transition period for developers to learn how to opt in or opt out. Existing UI frameworks, like what Android provides, show how to define UI elements, strokes, padding, margins, and text to use either scaled pixels or fixed pixels.

Leave everything as fixed pixels for backwards compatibility and add scaled pixels for new development. I can guarantee that if Apple ported the main apps (Finder, Mail, Safari, Calendar, Music) to support resolution independence, other developers would jump on board quickly to keep up to date.


Hold alt while clicking "scaled" in the display settings. You'll get the full list of resolutions.


Interestingly, looks like that's presented as a checkbox in Monterey, rather than an option-click (never Apple's most discoverable UX).


I don't use an external monitor, so I'm not familiar with the issue here, but are you saying this person created a solution for a problem that doesn't exist? Is this one of those RTFM moments?


I use a dual-monitor setup with my M1 MacBook Air. My external monitor is 1080p native and connected through an Anker USB-C dock. When I connected the monitor, macOS looked blurry as hell on it, as if a strange resolution was being imposed on my 1080p monitor that ended up looking worse. I tried changing the resolution through the Display settings, and it generally didn't help. So the "secret menu" in Displays didn't help at all; macOS is doing something to my external monitor that makes it blurry, whereas it is crisp and clear in Windows and Linux.


I would not ever say that, I do not have an M1 (which is specifically mentioned in the readme). I am responding to the parent because indeed there is a way to set a resolution as desired rather than just the "pick from these 4 scaled images".


Then half your apps will look blurry because they don’t support scaling, this includes quite a few built in utilities.


When was the last time you used Windows?

Windows 10 has solved all of this; you'd need to find and use some old third-party apps to run into these issues.


I've been running Win 10 with 150% scaling for a year or so, with fairly light use because I spend easily >10x more time in OSX (where scaling works correctly nigh-universally).

Win10 scaling is..... very hit or miss. Various programs are of course all over the place, but some system utilities don't even render correctly. Window size and control placement is mostly correct everywhere, but text? Images? Tooltips? Ooooh boy are those a mess. Maybe like 20% of apps I've ever touched don't render something correctly. That's A LOT of problems.

And that's before I start talking about running two screens at different densities. Then it's borderline hilarious how broken things are.


The worst thing is that scaling isn't applied correctly at all. You can have a taskbar icon with correct scaling that has a tooltip that isn't scaled at all and a context-menu that is scaled correctly but the scaling was also applied to the position so now it floats 50 pixels in the wrong direction.

It's an inconsistent mess. Great for people who think inconsistent messes are beneficial for them, not great for other people.


Taskbar stuff! I totally forgot about that. Yeah, those are very frequently wildly screwed up. I've had their icons squashed weirdly several times, context menus in particular have no rhyme or reason I can discern about how they break (they just do everything imaginable), and a good number just never show their popup content at all. Maybe it's appearing off-screen? It's impossible to tell!

Not that stuff in the taskbar is at all stable normally, of course. About half of the time I'm trying some new application out, it'll disappear from the taskbar as I mouse over it, apparently because that portion crashed or something and for some reason windows doesn't detect that until it's moused over. Or sometimes they'll double or triple up for no apparent reason, and mousing over will clear out the duplicates. I can't imagine what leads dozens of applications to have those same kinds of problems, so I'm forced to assume it's at least partly because of Windows.


I'm on Windows 10 right now, running a 1440p display, and a 4k display at a logical resolution of 1440p. Lots of stuff misbehaves on the scaled 4k display.

For a couple of examples: The nVidia Control Panel ignores the scaling on the 4k display and renders tiny text as if the display was set to 4k. GPU-Z looks like a blurry mess.


The Netflix Windows app and my 4K monitor at 125% do not work together. The UI elements are tiny and the padding is wrong.


Tell that to my Windows 10 install, because it's displaying a hodge-podge of non-uniformly scaled bullshit.

Solved would mean I'm not hacking a scaling factor into the perforce config file to get its window to match the rest of the UI. Solved would mean that Windows handled it for everything drawing windows.


The last time I used Windows was last night. Unlike Mac where I hold back on updates until they force my hand, I update my Win 10 as soon as it hits the public channel. I can assure you, the problem is far from solved. Whereas on MacOS, not a single app misbehaves.

Most apps are 3rd party apps!


I've been using Windows and hidpi for the past few years & routinely encounter scaling bullshit.


Just on the weekend I ran into Keyshot pretty much ignoring the scaling. Then Rainway installer rendered in some weird scale factor. Then Modern CSV got issues all over the UI. It is far from being solved…


Pretty much any app using bog standard winforms seems to suffer.


There's still some issues if you have multiple monitors with different display scalings, but they have just about solved that too.


You mean old 3rd party apps such as Device Manager?


I’m on Windows 11 and still regularly see this.


It really has not


This was true in 2016 but not anymore today. I cannot find a single blurry app on my laptop now.


I always hate this response. "I don't have this problem, so no one else does either." This all depends on what tools you use on a daily basis. I have some data tools that have scaling problems and they're unique depending on whether you use Windows Server or Windows 10/11. It's one of the cons of Windows having such great backwards compatibility. Some tools just aren't updated in basic ways despite receiving plenty of updates.


Sure, I'm not saying the problem is completely solved. But for most people using the most common tools, the problem is solved. There will always be holdouts or legacy programs that are not HiDPI-aware. But if you ask 100 people to list their 10 most-used tools, I'd wager that 95% of the unique entries on that list are not blurry when used scaled.


Man, what're you talking about? If you hook a Windows 10 machine up to a pair of monitors with different scaling percentages, then try to drag a Google Chrome window across them, you get sent straight to Resolution Hell. Even the first party apps that "work" go crazy for a while right after they're dragged across

I ended up having to buy a new monitor to work around the issue, since I couldn't handle having apps explode on me any more


"[M]ost used tools" are not the things that really annoy people. It's the "least used critical tools" -- the things that you absolutely need to do, but only once every few months. They often involve loading three layers of progressively older control panels.

This is a failure brought by the overreliance on data and telemetry to show what is "most used," without regard for what is most important.


yeah, Microsoft loves to give an inferior experience to 1% of its users in favor of polishing the experience for the 99%

The problem is that everyone is a member of a different 1%. "Oh, only one percent of users have that monitor setup" "Oh, only one percent of users run that app" "Oh, only one percent of users change that setting in any given month" "Oh, that app only crashes once per hundred uses"

If you do 100 things with your computer every month, and every one of them has a 1/100 chance of crapping out, you're looking at a 64% chance of something crapping out on you over the course of the month


> If you do 100 things with your computer every month, and every one of them has a 1/100 chance of crapping out, you're looking at a 64% chance of something crapping out on you over the course of the month

Good point (1 - 0.99 ^ 100 ~= 0.63397), and this also relates to the idea that the higher the number of dimensions (in this case features used by a given user), the more volume there is away from the middle. Our spatial intuitions for distributions in two or three dimensions do not prepare us to handle distributions in 100 dimensions.


And that's what I'm saying is not the case. We have at least 10 Windows machines in my office and everyone has different scaling issues with different apps, even things as ubiquitous as Google Chrome. It's mostly an issue when there are multiple monitors of different resolutions. I have 2 of the exact same monitor so my issues only pop up with legacy software where the UI is still bitmap scaled but every person here has some kind of situation where Windows scaling just craps out.

The only apps that don't really have issues are Microsoft's apps. VNC and RDP mostly function correctly so that's the direct response to this main post but to say that it's "mostly solved" is not accurate, in my opinion. Searching Google for Windows HiDPI issues and filtering to the last year still yields support docs from Dell and other monitor manufacturers (from this year) that describe these issues.


Depends which 100 people, which was the point of the comment you replied to.


Qt apps still have scaling issues. They can either be blurry, or they can have mismatched text and button sizes. VLC is terrible at 125% scaling.


Is that worse than simply not being able to scale the other half of the apps that do work without kludges like this?


If you just use Chrome and Visual Studio, you won't run into half of those apps. But yeah, any old app that renders directly to a bitmap will have problems if you need to use it.

I used to write a lot of WPF apps, and the resolution independence was a real thing. I guess I just got lucky in that I never used any legacy apps.


That's how it works for me. I have a 4K monitor, my MacBook Pro connects to it at native resolution, and I select the scaling factor I want.

This is a good thing, since as much as I like 4K resolution, my middle aged eyes prefer to have things scaled up a bit.


That's indeed nice...but it only works for 4K or higher resolution screens (on lower res, scaling isn't offered).

For example with the Windows solution, I have a laptop with a 1920x1080 screen which I set to 125% and then a 3440x1440 monitor which I set to 150%.

When I connect my MacBook to that monitor, I have to resort to hacks like the one linked here, or live with tiny text and heaps of wasted space.


The issue is with lossy scaling. I also have a 4k monitor but you can clearly see a difference in clarity if you select anything other than 3840x2160 (1x, everything is tiny) or 1920x1080 (2x, everything is huge).


How does Windows solve that? Anything other than 2x multiples is lossy I thought.


Windows UIs can draw natively at non-integer scales. This put them a good decade behind macOS for software support because 3rd party products all needed tons of work to support it, and would all get drawn at 1x with pixelated upscaling in the meantime.

But now that it's more widely supported, it's really the more efficient and more precise solution.

I had a Surface Pro 3 with a default 150% scale and the experience in 2014 was great as long as you lived in OneNote.


Some* Windows UIs can draw natively at non-integer scales.

Many first party programs even still look _awful_ at best, and are non-functional at worst as a result.


What I mean is Windows UIs can draw natively at non-integer scales, not that they all actually do it. But it's certainly more than it used to be, especially consumer-focused stuff.

In contrast to Mac where your UI draws at 2x and then gets downscaled to fit on screen, unless the user has picked the one resolution option where it's actually native 2x mode, which I never see anyone do because it's not enough space.

I imagine any niche business targeted software on Windows still doesn't do scaling right, but that's the nature of the Windows ecosystem. You can run your software that hasn't had meaningful updates since 1995, and someone will happily keep selling you their software that hasn't had meaningful updates since 1995.


Windows has had support for variable DPI since win95. They've had to hide or rework it over the years because it hasn't always held up to problematic dev practices like fixed pixel dimensioning but the fundamental graphics subsystem has supported it.


Now if Windows could just make my laptop not use half the battery overnight with the lid closed.

(This is frustrating me to the point that I am on the verge of getting a macbook for the first time now even though my laptop is otherwise great and only a year old, so it's good (for my pocketbook anyway) to know the other side has its own issues).


> Now if Windows could just make my laptop not use half the battery overnight with the lid closed.

Eh, I have a MacBook, and various apps will cause the laptop to not go to sleep. Web browsers are the largest offenders in my experience. I've often enough come into my office to a laptop that was running at 100% CPU all night.


I don't get how idle power use is still so bad on competing products. I understand Android, at least, has improved somewhat, but it spent years being laughably bad compared to iOS, even on tablet hardware with no cell radio to worry about. Dead after three days in a drawer, while the iPad that'd been in there for three weeks still had plenty of charge left. WTF.


Check your manufacturer's website for updates, and check your power scheme settings. I have an early-2021 Dell XPS 13 (the 9310), and I can close it, forget it in my bag for two weeks, open it, and it has 94% battery left because it auto-hibernated.


It's been a while, but I think I had that working too, except wifi would be totally dorked up after waking. I'd have to manually disconnect and reconnect each time (XPS 9700). That was more annoying than it being dead 1/3 of the time, so I left it like this. Are you experiencing anything similar?

That said, yeah I'll check if there are updates I'm missing.


FWIW I disabled "Allow network connectivity during connected-standby" in group policy and it seems to be better (fingers-crossed).

Apparently I'd set that to "enabled" last year when trying to deal with the wifi flakiness after wakeup. However setting it to disabled now, I am not getting the flakiness so far, so maybe a software update fixed that problem. (Again, fingers crossed).


Two days and so far it's worked. Drops 1% power overnight (compared to 8% per hour pre-change), and connects straight back to wifi upon wakeup.

Note hibernation is still disabled. This is just sleep. And I didn't hack anything AFAIK to disable the connected-standby that Dell/Windows forces. So, pretty happy! (Of course, was looking forward to trying out a Mac, but oh well).

    The following sleep states are available on this system:
      Standby (S0 Low Power Idle) Network Disconnected

    The following sleep states are not available on this system:
      Standby (S1)
        The system firmware does not support this standby state.
        This standby state is disabled when S0 low power idle is supported.

      Standby (S2)
        The system firmware does not support this standby state.
        This standby state is disabled when S0 low power idle is supported.

      Standby (S3)
        This standby state is disabled when S0 low power idle is supported.

      Hibernate
        Hibernation has not been enabled.

      Hybrid Sleep
        Standby (S3) is not available.
        Hibernation is not available.
        The hypervisor does not support this standby state.

      Fast Startup
        Hibernation is not available.

      Standby (S0 Low Power Idle) Network Connected
        Connectivity in standby is disabled by policy.


Try looking into the Power Profile and Power Management settings; likely you are on the high-performance profile, or the profile was set incorrectly. Also look in Device Manager and change the power settings for each device; some devices can turn on the computer and keep it awake throughout the night. Even apps can (I managed to find out why my computer kept waking up in the middle of the night: a particular app had no reason to use the wake-timer function).


As someone who has to do UI/UX work on Windows, I prefer the way macOS handles it. The issue is, if I use a non-HiDPI-aware app on Windows, I have no idea what my UI elements will look like; on macOS, they may show up blurry, but I know exactly what they'll look like. Another issue is that Windows does UI scaling on non-HiDPI screens by default.


I think it embodies the philosophical difference between Microsoft and Apple. And I’m not making a judgement here.

Windows: give users choice, including the choice to end up with blurry apps with ill aligned UI.

Mac: we cannot let you stray outside these boundaries, otherwise your UI experience will deviate too much from the experience we determine to be the best.

For me, scaling on non-HiDPI screens is a feature. I like big text and big buttons, and Windows gives me that. I'll gladly accept a perfect big UI 99% of the time if the cost is an ugly UI 1% of the time.


Windows still requires a dummy dongle for a virtual (software) video output "monitor", I think. (Something like this: https://de.aliexpress.com/item/1005001604432934.html)


>in this regard Windows is far superior to macOS

I agree, and wish to add that the new graphics stack on Linux is also far superior to MacOS in the same way. By "new graphics stack" I mean Wayland without XWayland.

In fact, of the 3 desktop operating systems, I distinctly prefer the details of how my screen looks on Linux: on Windows, small (plus or minus 25%) changes in scaling factor tend to cause large changes in the details of the rendering of text (e.g., the average width of lines and curves can seem to halve or double or the text can seem to change font) which I find a little distracting.

macOS is by far the worst of the three. Not only does it look horribly blurry to me (on a normal old monitor -- I do not own any HiDPI monitors) when a non-integer scaling factor is applied (via the Displays pane of System Preferences), but even at the native resolution of the display, it is blurrier than I like, because of a decision by Apple long ago to optimize for how closely text on the screen matches how the same text looks when printed out. (At least one of the terms "hinting" and "anti-aliasing" is relevant here, but I don't know the details.)

Windows is further along than Linux in the rollout of a truly resolution-independent graphics stack: on Linux, I have to pass certain flags to Chrome to get it to circumvent XWayland and not be blurry -- and then there are a few bugs, but bugs I can definitely live with. Also, on Linux, I have to use a special branch of Emacs (named feature/pgtk) to get Emacs to be non-blurry when a non-integer scaling factor is set in Gnome Settings. feature/pgtk has a bad bug (freezing at random times) which I learned to work around (by starting a second "sacrificial" Emacs instance, which luckily would always be the first to freeze, and once frozen would somehow prevent the first instance from freezing).

I was a macOS user for 10 years, and would still be one today if I hadn't spent time on Windows 10 and had my eyes opened to the painfulness of the two aforementioned sources of blurriness (namely, Apple's decision to optimize for fidelity to hardcopy, and macOS's use of a scaling algorithm when the scaling factor is not an exact integer). I always knew during those 10 years that macOS was too blurry for me at non-integer scaling factors, but I thought it was OK because the two individual apps I spent the most time in (Emacs and my browser) have app-specific scaling factors that don't introduce blurriness. It wasn't until I spent time on a different OS that I understood how sub-optimal macOS was for my pattern of use and my particular visual cortex.

If you are on a Mac, a good way to experience what I am talking about is to install Google Chrome, then operate the "Zoom" control in the menu of the 3 vertical dots. Note how every element in the viewport instantly changes size. PNG and JPG images (e.g., the white "Y" in the left top corner of this page, but not the white rectangle surrounding it) might become blurry (or stop being blurry) because unlike essentially everything else on a modern web page, PNGs and JPGs are not stored as resolution-independent mathematical descriptions of curves. Well, Windows has a control in its Settings app that has the same effect on every visual element on the OS (including the mouse cursor). And on my Linux box, Gnome Settings has a control that does the same thing. (Safari and Firefox might have the same "zoomability" as Chrome does; I haven't used Firefox on a Mac in years; back when I did, its zoom control zoomed only the text, but not the images; I no longer have access to a Mac, so cannot experiment with Safari.)


> I do not own any HiDPI monitors

This is a major factor. macOS has aggressively optimized for these in recent years, often at the expense of classic 1x display experience. I use 5K 27" and 4K 24" monitors at 2x scaling (this is the default), and the result is excellent.


Maybe so, but the blurriness caused by decisions around antialiasing and hinting was present already in Snow Leopard, which predates the first retina Mac.


That was my first thought. All this crap Mac users have to go through if they want to go off the "happy path" even slightly with an aftermarket monitor.

I worked for Apple more than two decades ago, but now I'm a happy Windows 11 user.


I have an aftermarket monitor, runs at 4K, has a scaling factor applied so that I can make everything bigger. I don't feel like I had to go through any crap, either, it's the simplest of UIs.


I agree that it is simple (i.e., the Displays pane of System Preferences) but the result IMHO is unpleasantly blurry.

These days, fonts, SVG images, CSS, etc., are stored as mathematical descriptions of curves that can be rendered at any scaling factor. Any use of a scaling algorithm, like the one macOS applies when the scaling factor is not an integer multiple of the display's true resolution, is an unnecessary source of blurriness. A HiDPI display makes the blurriness less noticeable, but it does not solve the basic problem, which is that the scaling algorithm reduces the "effective resolution" of the display.

(I have never used a HiDPI display with a desktop OS, but another participant here on HN has and reported that he notices MacOS's blurriness relative to Windows even on a HiDPI display.)


Cloud providers that offer Mac mini servers use a HDMI dummy plug to emulate the presence of a display, but macOS will usually not treat it as a HiDPI display. This project is great for creating 4k screenshots on such servers.


EDID dongles have made my remote life work (with HP RGS of all things...) far more nice than any alternative. Goes up to 4K (but low Hz... which is OK for the kind of dev work I do).


What app do you use for remote work? I've tried TeamViewer / AnyDesk, but the text is always blurry.

(I have a 4K screen and need to scale up the text, so the logical resolution is 2560x1440. But I found these apps only send the 2560 screen over remote.)


I settled on Parsec; if you can't tweak the quality to your liking in the GUI, you can pass custom encoder settings as well.


I find ConnectWise ScreenConnect works well, but there is a pretty noticeable delay.

The best remote experience I have had with macOS is using the built in VNC, but not with a standard client. Using the Remotix client provides a great experience. It's $50.

I have also heard that NoMachine for macOS works really well, but I have not tried it.


I used NoMachine in the past. It works really well; it feels like Windows RDP's brother without the MS name. It handles things better (less slideshow-y, and it did well with video playing on the remote device) than the VNCs I used in the past.


Just realized how simple it is with the built-in VNC. But it just shows a DOT for the mouse cursor :s And I tried Chrome Remote Desktop, which also has good-quality text. But both cause CPU hogging.


HP RGS is quite OK, I'm surprised to say... Since we're an HP shop it's all free.


For those asking why this is useful:

Mac minis without a monitor hooked up can only display at a certain aspect ratio while remote screen sharing. The typical solution is to use a dummy HDMI plug (~$10) plus an app like SwitchResX (~$20) to support custom screen resolutions.

If OP's app works as described, then this is a free software solution to a $30 problem.


Your presentation has presenter notes but your MacBook has only one display?

Create a dummy display for the full-screen presentation, share that into the remote meeting/projector, and preserve the primary display for presenter notes, logtails, chat, etc.


Sounds like it'd be easier to just run your slideshow in a window and share that window. PowerPoint and Keynote can both do that.

Bonus to sharing only a window - you never accidentally show something you didn't want to, like that email or messages notification...


For those who don't know, in PowerPoint go to Setup Slide Show on the Slide Show tab and select "Browsed by an individual (window)" and then share that window. Great for Zoom et al.


I use OBS to handle the screen sharing because it's easy to misclick a window that's not for public view. In OBS, the capture is bound to specific window titles, and it's easy to crop out the black bar on top of the PowerPoint window that would otherwise appear in Zoom directly (without OBS). I have a few scenes set up this way, and switching between them is quicker than doing it through Zoom itself.


This doesn't sound like the core use case though based on the README - it sounds like it's used for enabling HiDPI (aka not-blurry) scaling on less-than-4K monitors. So, using native resolution but not necessarily 1:1 DPI.


It is not just a monetary problem though. Sometimes you don’t have the required physical access or permissions to install additional hardware.


Hi, I tested the Screen Sharing scenario with BetterDummy running on an Intel headless Mac Mini 2018 running Big Sur. Works splendidly: all resolutions are available and resolution changes work on the fly through Screen Sharing!


Was looking for this exact thing a month ago [1]. Now, this.

I've given up the idea of connecting to mini over screensharing and connected a couple of displays to it, happily.

[1] - https://apple.stackexchange.com/q/428243/51800


Here is another reason this is useful:

If you connect a Mac (mini, MacBook Pro, Air) to an external monitor, AND this monitor is not recognized as "Retina" (as, e.g., Apple's $4599 XDR is), AND you are not running at the monitor's highest resolution, the screen will be blurry.

There are a lot of "ands" in the above sentence, but it's actually quite a common situation. Retina monitors are expensive, and for non-retina monitors, the native highest resolution still renders objects too small for many people.

On a MacBook Pro/Air the solution is to mirror the screens. macOS thinks that it is rendering to the retina screen and sends an appropriately rescaled image to the external monitor. But mirroring has its problems. For example, the aspect ratio will be the same as the MacBook's internal screen. This may result in black bars on the monitor, depending on its aspect ratio.

Also, on the M1 MacBook Pro 2021, the aspect ratio is variable, due to the 72-pixel menu bar up top going in and out. I am not sure what will happen when you mirror the screens there.

Hopefully the above software solves this problem.


Interesting. For those wondering how this could be useful apart from what's already mentioned in the README: projects like my Weylus [1] or the similar Deskreen [2] would greatly benefit from this, as both can mirror your screen to a tablet, so one could use a dummy display and said tablet to create an additional screen.

On a related note, does anyone know of similar software to create dummy displays on Linux? All I could find so far is some trickery which only works with Intel hardware on X11 [3].

[1]: https://github.com/H-M-H/Weylus

[2]: https://github.com/pavlobu/deskreen

[3]: https://github.com/H-M-H/Weylus#intel-gpu-on-xorg-with-intel...


Yes, I thought about this when reading the README. Sidecar has been better than 3rd-party apps in this regard (maybe it uses the same technique), but having no DPI control sucks. Thanks for the links, will try to see how it works.


Hmm. Didn't think of that use, but I tried it and it works nicely. However, BetterDummy does not have 4:3 (iPad) aspect ratio support as of 1.0.7; I'll add it so Sidecar fills the iPad screen properly.


I stand corrected: 4:3 (16:12) is already available in the app, I just wasn't thinking right lol. So Sidecar scaling is supported as of now.


Thanks, trying it out now! One more thing: is it possible to have this on Homebrew? Just to make sure it becomes a default app for me.

EDIT: works well with Sidecar -> mirroring the Dummy display! Lag/latency-wise, it feels at least the same as native Sidecar. And thanks for the humongous number of resolution options, even though I don't need them right now.

I have another use case: I used to have an off-brand external display that doesn't register HiDPI resolutions to Macs. Bought DisplayResX for that, but maybe this can help. Once again, thank you!


>does anyone know of similar software to create dummy displays on Linux?

Xvfb (https://linux.die.net/man/1/xvfb)

wayvnc: https://github.com/any1/wayvnc

Weston has a headless backend also.
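For the fully headless case specifically, a minimal sketch with Xvfb (the display number, geometry, and the x11vnc step are illustrative assumptions, not a prescribed setup):

```shell
# Start a virtual framebuffer X server on display :99, 1920x1080, 24-bit color
Xvfb :99 -screen 0 1920x1080x24 &

# Point clients at the virtual display
export DISPLAY=:99

# Optionally export it over VNC (assumes x11vnc is installed)
x11vnc -display :99 -forever &
```

Note this gives you a software-rendered virtual screen rather than an extra output on a real GPU, which is fine for remote access but not for mirroring to a tablet alongside a physical display.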


I have absolutely no use for this. But I love that people make stuff like this. There's so much creativity and skill involved in this type of thing, that often seems to get completely overlooked.

So, to people who build stuff like this and release it... Thank you for your service!


What does it actually do? What is the point of a fake display that doesn't exist?


It's useful on a headless Mac used as a server.


Is this a joke or a serious answer? Why would a headless mac need one more display that it doesn’t use?


When you remote into the Mac with VNC (which is the protocol Apple's native Screen Sharing solution uses), you are limited to the display resolutions that the hardware thinks are available. This tricks the hardware into allowing additional resolutions that you otherwise wouldn't be able to get over a VNC session.


This is just a completely foreign concept for Windows users. My main work PC is a laptop with a 4k built-in display, and 2x 1920x1080 external monitors. I can RDP in from my home machine, which has a single 5120x1440 monitor, with zero issues.


Only if you use RDP, which is basically a remote graphics card, whereas VNC-type systems capture the framebuffer of the actual graphics card on the remote machine. Steam Link, TeamViewer, etc. all have similar issues on Windows as well.

Tried doing Steam Link to a Windows computer without a monitor attached and it would only work at something like 1024x768; I needed a monitor attached to get 1920x1080. I almost bought a dummy plug for the Windows machine similar to this, but I had an extra monitor so I didn't bother.

Also, a VNC-like system will turn on the monitor and people can see what you are doing remotely; RDP doesn't do that. RDP is a really nice system that gets really good performance over low-bandwidth links by remoting higher-level graphics APIs, at least it used to. I wish Apple had a built-in equivalent that worked similarly.


Oh yeah, I know that RDP is superior. But Apple's own first-party remote desktop solution is basically VNC, and suffers from similar limitations. They could be more like Microsoft, but they seemingly don't care.


NoMachine is a great alternative to RDP. My remote desktop only had Windows Home, which has the RDP server disabled, so NoMachine was the answer to that. There is a command-line way to enable RDP on Windows Home, but it requires several steps and an external library to get it functioning. Now I have a new Intel NUC with Windows Pro, so I ditched NoMachine for MS RDP. I would say NoMachine trails just behind MS RDP, while Apple's VNC is near the bottom of the totem pole. NoMachine works flawlessly on macOS and Windows (and it has Linux support).


macOS still uses the garbage VNC protocol instead of doing what Windows did, which was treat remote display as a first-class product; indeed, RDP is one of Windows' greatest unheralded/underappreciated features.


Haha this is even crazier than the original fake-mirroring usage!


I'm guessing to reap benefits of https://en.m.wikipedia.org/wiki/Supersampling


On Windows, I use AMD's VSR (Virtual Super Resolution) to set 5k screen resolution on a physical 4k display, letting me set the UI scale factor to 200% and have reasonable font sizes with clean icons (125% and 150% looks fuzzy). Under Linux, it is an xrandr spell that achieves the same thing.
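The xrandr spell mentioned above looks roughly like this (the output name and modes are examples; check the output of a bare `xrandr` for your own):

```shell
# Render a 5120x2880 framebuffer and have the GPU downscale it
# to the panel's native 3840x2160 mode
xrandr --output DP-1 --mode 3840x2160 --scale-from 5120x2880
```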


How does this work? Don't you have to scale the virtual 5k resolution down to 4k? Wouldn't that also generate fuzziness?

If a stroke is 1px wide at 100%, 2px at 200% / 5k, how does that stay sharp when you scale it down to 80%?

Or is the idea that whatever algorithm AMD uses is better than the one Windows uses?


Say you have a 28" 4K display but you want to run a 2560x1440 resolution to get your desired UI size. You can either run 2560x1440 scaled up to 4K, or 5120x2880 scaled down to 4K. The latter will look better.
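A small sketch of the arithmetic behind the two options (numbers from the example above):

```python
panel = (3840, 2160)       # physical 4K panel
desired = (2560, 1440)     # the UI space you actually want

# Option A: render at 2560x1440 and scale UP to the panel
up_factor = panel[0] / desired[0]        # 1.5 - non-integer, so 1px strokes land between pixels

# Option B: supersample at 2x the UI space, then scale DOWN to the panel
super_res = (desired[0] * 2, desired[1] * 2)   # (5120, 2880)
down_factor = panel[0] / super_res[0]    # 0.75 - each panel pixel averages more than one rendered pixel

print(up_factor, super_res, down_factor)
```

In option A, detail rendered at 1440p has to be invented to fill 4K; in option B, detail rendered at 5K is averaged down, which reads as sharper.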


Or you run at 4K and let applications scale – which sadly not all support properly.


This is what I usually do. I run 4k at "100%", but then I increase font size as needed. I usually don't care about the icons, widgets, etc since I rarely interact with them, so to me, it's actually a good thing they're small. This way everything is plenty sharp, and I also get more screen real estate.

The reason I use a 4k screen is to have sharp text, so I try to avoid scaling.


That’s why – despite being a free software advocate – I currently run Windows for all critical software, as on Windows I can scale content without re-scaling windows. (Qt and Android support it as well, but Gnome and macOS don’t).


I'm not using Gnome, but scaling the text works fine for GTK apps on i3.


Yes, but Elementary, Android, Windows and Qt can scale icons and all elements together with the text to any arbitrary scale, pixel-perfect, without any post-processing.

GTK can only scale text, or scale the whole window to 2x, 3x, etc and then scale it back down in post-processing.


Downscaling doesn't bother me as much for some reason. Icons scaled to 125% or 150% from their 1x size look more blurry than the 2x version of the same icon scaled down. It also looks better to me if the system uses integer scaling to blow up the 1x icon to 200% size first, then scale down from there.


The github page describes the exact scenario that this is supposed to solve. I don't think I can explain it any better than that page already did.


Is this the "explanation"?

> To fix this problem, BetterDummy creates a virtual dummy display which you can then utilize as a mirror main.

What is a "mirror main"?


By using display mirroring (i.e. having all your displays show the same output) you can effectively use scaling settings determined applicable for the primary display that the OS wouldn't normally allow you to apply to the secondary displays. This is a workaround solution for that OS limitation in which a fake primary display is created to give you more settings for the real display (by making the real display the secondary).


Thanks, that's actually an explanation. Seems crazy that there's no simpler solution!


I find it unfortunate that m1 macs have this problem. This person's creativity could have been put to use on something other than fixing a flaw in a computer.


Yeah, I made this app as I switched a family member's Mac from an Intel mini to M1 and was mystified by the fact that no matter what tricks I was trying, it was not working with the Lenovo 24" QHD display as before (getting a 4K display was not an option due to cost and size constraints - no good 24" 4K display is available for a reasonable price, and there are people who don't want huge displays on their desks). I purchased an HDMI dummy but had constant issues with it, so I had to explore other avenues.

My creativity is normally put to use on something other, like MonitorControl, but working on BetterDummy is fun as well. :)

https://github.com/MonitorControl/MonitorControl


Probably doesn’t qualify as reasonably priced, but this 24” monitor has suited me well. I shrink everything by one notch and everything is very crisp and “retina”.

LG UltraFine 4K Display

https://store.apple.com/xc/product/HMUA2VC/A


Now that is software I have a use case for! I used some abandoned menu bar app for this. Nothing worked, except the manual slider, which was enough for me.

I guess I'll have to try this with my display!


Can anyone link to or explain the problem? I'm in the market for a new display and didn't quite get the README explanation.


I think it's explained quite well in the introduction text:

> M1 macs tend to have issues with custom resolutions. Notoriously they don't allow sub-4K resolution displays to have HiDPI ("Retina") resolutions even though (for example) a 24" QHD 1440p display would greatly benefit from having an 1920x1080 HiDPI "Retina" mode.

So, if you connect a non-4k display, the Mac will render in a way that won't look as sharp as the display could.

If you are in the market for a new display, just buy a 4K model (unless you have good reasons not to) and you won't need this hack.


> So, if you connect a non-4k display, the Mac will render in a way that won't look as sharp as the display could.

That's not true. A display will happily utilise its full hardware resolution. The problem, as I understand it, is that you can't change the DPI of the monitor. You can configure a 2560x1440 display to use a lower resolution (e.g. 1920x1080 or 1600x900), but then the monitor itself will upscale this into a blurry full screen.

Tricking macOS into changing the DPI would let you configure the display to use all 2560x1440 pixels while drawing everything bigger, at full resolution.


You are correct, but I was trying to explain it a bit in layman's terms, since the OP did not appear to understand the intro text in the README.md.


But this is a fake display, it doesn't actually show anything. I think the explanation only makes sense for people that already know what the issue is.


Ok, silly question, because I've never run into this use case, but it got me worried about using real monitors.

I'm using 1920x1200 monitors. If I upgrade to a M1 Pro/Max Mac Mini (if those will ever be available) will I have any trouble using them at their native resolution?

From what I understand, this trick is helpful for headless machines.


According to my tests the app seems to provide headless macs with a HiDPI virtual display with customizable resolution (tested via Screen Sharing, resolution switch on the fly worked, all resolutions are available without any real display connected - should work with VNC as well, but I did not test that).

I tested this as part of testing the app on Intel and Big Sur (BetterDummy worked fine on this Intel config with an integrated Intel UHD630 so headless Mac Mini 2018 users should be fine).


No, you won't have trouble using your Mac at native resolution with a 1920x1200 display; you don't need BetterDummy for that! :)


The display output of the M1, according to the very clever devs working on Asahi Linux, is a bit funky under the hood.

I've found display compatibility has not been straightforward, having owned an M1 mini for the last year...which is unfortunate for a computer without a built-in screen.

Don't count on it supporting multiple external monitors. Expect compatibility issues, etc.

Last week it kind of burned a flickering rectangle onto my Dell ultrasharp 4k, which I thought was toast, but it gradually cleared up.

Like I said - weird.


> Don't count on it supporting multiple external monitors. Expect compatibility issues, etc.

Well, no way I'm buying a M1 due to the lack of RAM. Hopefully Apple will make a decent desktop for me next year and fix some silicon bugs while they're at it.

Incidentally, my current x86 mac mini is wonky with the hdmi output as well. The monitor randomly goes black for a few seconds once in a while.


RAM is on the package with the CPU now, so I don't see that changing in the near future, with the laptop/Mac mini versions of the M1 at least.


The M1 Super Duper Max Pro supports 64 GB of RAM, if you haven't noticed. All they have to do is put it in a Mac mini... I don't need a new laptop right now.


The black screen problem is usually a bad HDMI cable.


All my HDMI cables are bad? :)

The only kind I'm missing is the gold plated kind.


Apparently so, as Mac hardware never fails! ;)


Dummy plugs are an ingenious solution, but the fact that we still depend on them is baffling. Does Apple seriously not recognize the need for these use cases?


You can simply create two dummies and mirror each one to separate displays. Or you can use the 4K display as is (since it will natively have 1920x1080 HiDPI on M1) and create a dummy to use with the QHD display to get 1920x1080 HiDPI.


I guess, but my point is: There is no reason macOS shouldn't allow doing all of this on the software level.

Especially with the wholly Apple-made M series of chips, they control the entire video stack down to the transistor level.

Dummy plugs should be syscalls, not physical things.


Yep, sorry, I guess I was clicking the wrong link while trying to respond to something else.

I agree with you, it is funny that we need dummies and workarounds (being hardware or software).


This is my main problem with macOS - text on non-4K displays is so blurry; Windows and Linux do a much better job of rendering fonts. And Apple doesn't seem to care about it at all - they even removed subpixel antialiasing in a recent system update (I believe it was Mojave). So weird, given that most of the high-refresh-rate monitors on the market are 1440p, so if you have a gaming PC you need to buy a separate 4K monitor for your MacBook, because no one seems to care about fixing this. WTF?


Thanks for doing and sharing this!

It has been really helpful to finally have virtual displays that respond to CGDisplay methods when developing Lunar (https://lunar.fyi)

It also helped in unearthing some bugs apparently. [0]

[0] https://github.com/MonitorControl/MonitorControl/issues/752#...


Yeah thanks @alin23 for creating extra work for me lol :), I'll be working on fixing that in MonitorControl (https://monitorcontrol.app).


I wonder if any macOS app is able to do scaling and add a black border around the screen like what "NVIDIA scale resolution" does in Windows.

My monitor has light leakage issues. In Windows, I scale the image down and use black tape to cover all 4 edges. It looks like a normal screen. :)

When I use it as an external display on macOS, I have to guess the top menu position :p


Related to this: I'm looking for a way to have the HDMI output of a 2015 MBP be horizontally flipped (for teleprompting). SwitchResX is supposed to be able to do this, but it hangs when I try flipped resolutions. Might be an interesting feature for this software… Anyone have an idea how to achieve this on an Intel Mac?


A number of ways:

https://telepromptermirror.com/mirror-flip-screen/

Personally I gave up on the window flippers and use LunaDisplay w/ iPad.

Why? The window flippers flip the window — but not the mouse or button click targets! So you need the whole screen to flip, not just the window.


Tried out the WindowMirror utility and the click targets are indeed pretty confusing… but it might still come in handy when presenting. By the looks of it, I could use it to flip a windowed OBS projector view and drag that to the external monitor that is my teleprompter. I wouldn't need to click much in that scenario. Thanks!


The 11.6 (I believe) release of macOS finally allowed native support for 5120x1440 (my display).

The thing that annoys me, is that I have a second display that I use, from time to time (when giving video classes, for instance), but it remains assigned to a different device, most times.

However, I can't leave it plugged in. If I do, the OS insists that I have a second screen, even though it is not actually connected to the Mac. This means that I can lose my cursor and the graphics processors are rendering undisplayed pixels.

I can use an app like SwitchResX[0], but I feel that I should not need it. The OS should detect more than just if an HDMI cable is plugged in. It needs to know whether or not the screen is actually "live," and rendering graphics.

[0] https://www.madrau.com


Maybe you want a physical KVM in between the two sources and the monitor? I tried to get a similar setup working locally (2 monitors, a macOS source and a Debian source, supporting all four input/output combinations) and couldn't find a satisfying software solution. There were a few people using ddcutil/ddccontrol/ddctool on Linux hosts to script changes on source changes, but (a) it was difficult/impossible to get DDC to work consistently with DP and HDMI outputs with first party Nvidia drivers (b) I couldn't find anything comparable on the macOS side.

Hardware KVMs weren't much better, but if you're willing to spend the money, and have enough DP sources on both sides, they aren't too bad.

Funny how hard/expensive it is to replicate the convenience of a nice analogue KVM with a bunch of VGA/DVI/PS2 I/O, a knob, and not much else...


What about IP-KVM something like PiKVM? I imagine it would be far more capable than those software solutions.


Actually, belay that. I just tested with Monterey, and a new MBP, and it works the way that I expect. I need to use the built-in HDMI, but it works.


Does anybody know of software-defined ways to switch this behavior? I have the opposite problem sometimes -- I have an external display shared between two devices (of varying OSes), and sometimes I want it to remain a logical display on each device no matter which input is selected on the display. But other times I want the display to be removed when the device is not the active input.


I'll bet that SwitchResX will do it (for the Mac). It's a pretty heavy-duty utility (look at the link in the root).


I remember when Windows had to be subjected to lots of different downloaded freeware applications to "unlock" full customization of the OS.

macOS is in those days at the moment. What does that say about its product arc?


Does this allow to create virtual resolutions wider than 6016 px? I have a Samsung 32:9 monitor and this seems to be a hard limit on a Mac so far. Would be great if I could go higher with this.


Great app. I can confirm it works well with the new M1 Max and Monterey. Currently using with dual Pro Display XDRs in clamshell mode (14" MacBook Pro).

It allowed using HiDPI at a higher resolution (4k or 3840x2160) than the M1 chipset natively allows (3k or 3008x1692 HiDPI on the 6k XDRs).


Can anyone confirm if the new M1 Max (or Monterey) can output HiDPI at higher resolutions than the 1st gen M1 13" MacBook pro? Using a 6k Pro Display XDR on the M1 with 3k scaling is brutal. The interface is way too big.

An iMac pro with an eGPU was able to accomplish this perfectly.

--edit

The Display Dummy app resolves this problem perfectly. In the Display Dummy menu, select 16:9 ratio, then go to displays in system preferences and choose 3840x2160. Curious if it will be able to work with dual Pro Display XDRs on M1.


I'm away from my 13" M1 MacBook right now but there was a 3rd party app I used that lets me choose any resolution and scaling on the built in or external displays. I like having a lot of workspace and thankfully have good eyes so my usual setup is 2560x1600 no scaling for the built in and 3840x2160 no scaling for the external portable USB C monitor I carry around with it.

There may well be a way to do it from the CLI as well, but the app just sits as a tray icon in the menu bar, which is great from an "I don't know what I just plugged in but I want it to just be <x>" perspective.


Ideally 3840x2160 in HiDPI on the Pro Display XDR. I get this with an eGPU puck on an iMac pro with SwitchResX.

Unfortunately the original M1 limited SwitchResX from using HiDPI at higher resolutions. It is limited to 3008x1692 at HiDPI (half of 6k).

The higher resolutions are available but they aren't HiDPI and text looks noticeably worse.

It's a known issue with several display tools like: https://github.com/xzhih/one-key-hidpi/issues/164


Can't you just go to System Preferences and click Alt + Scaled? That usually gives you all the resolutions that your monitor supports without having to use the "Larger/Smaller" visual options.


The issue was that the M1 limited the choices to a maximum of 3008x1692.

I just tested the Dummy Display app and it resolved this perfectly. The app definitely works, even better than the m1 limitation of SwitchResX.


It gives more, but not all, resolution and scaling options.


--update

Confirmed working with dual Pro Display XDRs, M1 Max, Monterey (14" MBP in clamshell mode).


--update 2

Bad news, you lose HDR color support (ouch).

Current workarounds involve toggling BetterDummy as needed, or higher resolution on one monitor and native HDR on the other.


I've used SwitchResX for years to create/use custom resolutions, though I don't have an M1 Mac (on order now). Maybe that's another way of achieving this?


SwitchResX, I think, is different. It is superior in the sense that it creates additional display modes which macOS can then use as native resolutions on Intel. On M1 the problem is that it can only create scaled resolutions, and even then cannot create HiDPI resolutions if the display reports itself as sub-4K. This is due to an inherent limitation in the M1 driver implemented by Apple, not the fault of the SwitchResX developer. I was a SwitchResX user for many years on Intel Macs and was dumbstruck when, after replacing my Intel Macs with M1, I found out that I could not use my displays as I used to.

BetterDummy simply navigates around the whole problem by creating Virtual Displays that are mirrored to the main display. This is in one regard inferior, since this is inherently a workaround but on the other hand can work on-the-fly and all possible resolutions are instantly available and accessible via System Preferences/Displays.


thx!


This is how I got hiDPI to work with a 1440P monitor https://github.com/bbhardin/A-Guide-to-MacOS-Scaled-Resoluti...


"Does not utilize graphics hardware in vain so it is somewhat faster."

Can anyone shed a light on this feature? Doesn't HiDPI require graphics hardware ?


There is some difference, but it is marginal indeed, at least in terms of speed. Via the traditional route with a real dummy, the display hardware needs to produce two sets of DisplayPort output streams (one of which is converted to HDMI via a DisplayPort-HDMI controller chip, the MCDP2900, to drive the dummy), sync up the two displays in terms of vertical sync (this does not always work well, which is why real HDMI dummy users sometimes experience mouse jitter), and also scale the full-res framebuffer to two independent displays (but scaling is done super efficiently on M1). With BetterDummy all this is obviously not needed.

For people who until now had to resort to mirroring the internal display (as M1 MacBooks - before the new MBPs - support only the internal display + a single external display), the benefit is more obvious: they can now use clamshell mode and don't have to drive the full MacBook display hardware (with brightness turned down to zero) all the time.


Thanks for the detailed answer. Btw, any chance this can work in a virtualized environment without a real GPU? Last time I gave this a shot (virtual screen), I couldn't manage to get it to work with HiDPI.


This is an interesting question, honestly I have no idea. It might be that these APIs are relying on the presence of a GPU for acceleration (as the private framework APIs used are made by apple for Sidecar and AirPlay primarily). If you test this, please let me know about the results at the project GitHub page (https://github.com/waydabber/BetterDummy). Thank you!


Does anyone know if it's possible to run a Mac app at a "greater" resolution than what the display supports? We have quite a few 2017 MacBook Airs running at 1440x900, and that doesn't have enough width to display an enterprise app we need to use.

Basically would need some sort of VM or sandbox that could run the apps in 1920 width and scale down to fit 1440px.


This can do exactly that.

You would create a dummy display at 1920x1080 (or higher), then have your real display become a mirror of the dummy display. macOS will scale down to display on 1440x900.


Does this work on an Intel Mac? I want to set my MBP screen to a resolution that fills my PC screen for VNC.


Works for me on an Intel Mac mini; needs macOS 11 (Big Sur) or higher.


I can confirm this, I had now the chance to test it as well. My findings:

- Works just fine on an Intel Mac (tested one with an Intel UHD 630).

- Works well with Big Sur (tested on an Intel Mac).

- Does seem to provide headless Macs with a HiDPI virtual display with customizable resolution (tested via Screen Sharing; resolution switch on the fly worked, and all resolutions are available without any real display connected - should work with VNC as well, but I did not test that).


Amazing, thanks for the report! Did not have a chance to test it on Intel but I did compile the app to be Intel and Big Sur+ compatible and the APIs used by the app should be available on these platforms as well.


Thank you! I now have it working just fine to access my two headless minis. Small vote of thanks via open collective sent too.


Wow, thanks! :)


I love this, it's so crazy that the resolutions I can display on my screen can include/exclude resolutions that it will display only when an external display is connected.

SwitchResX is cool but also a bit complex. 'Display Menu' is simpler but also somewhat limited.


This is exactly what I’ve been looking for. Wanted to use my 2012 imac as a second monitor for my MacBook and in the absence of target display mode decided to use NoMachine but couldn’t get resolution + separate display without a physical dummy hdmi plug. Thank you


Is there a way to keep using the internal screen? When I turn mirroring on, the internal display starts mirroring the virtual screen too. The readme says "Your internal screen will be available as an extended space on a MacBook", but I do not know how to do it.


Of course, you can configure everything as usual under System Preferences/Displays.

If you have a 2+ display setup for example, you can create a Dummy, mirror it to one display but leave the other display unaffected. Or you can create 2 dummies, mirror each one to a separate display.

Every display can serve as a Main or Extended display, as well as a Mirror for some other display. Therefore a mirrored set can be Main or Extended as well. A typical use case is to use your MacBook display as an Extended display alongside your external display which is a Mirror of a Dummy that is set as Main display (providing all the fine grained HiDPI resolution options).

Hope this helps!


Is this a macOS 12 feature? I am stuck on 11 for now and I do not see any Main/Extended setting.


The app is compiled for Big Sur but I did not test it and received conflicting reports whether it works or not. Please try it and let me know! Thanks!


I tried again and got it working this time. It turns out that in versions below 12, the way to set up mirroring of only 2 out of 3 displays is: "Press and hold the Option key and drag one display icon onto another display icon to mirror those two displays." Ten years of using a Mac every day and I never knew that was possible.

It is pretty buggy though: the "Arrangement" tab stops working and I cannot change relative screen positions anymore. So to get it kind of working, I had to set my real and dummy displays to the same lower resolution before dragging one onto the other; then I could set my desired resolution on the dummy screen and it got mirrored onto the physical screen.


Nice to see this wrapped up as an app. If all you need is a cli, then https://github.com/jakehilborn/displayplacer works great too.


I don't think displayplacer is intended to create a virtual display; it seems to be a CLI to manipulate display modes and mirroring for existing displays, so these seem to be two different things entirely. I might be wrong of course.


I agree. But the problem the OP had was forcing a video mode. The virtual display was a means to that end. I think displayplacer solves that problem more directly.
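For reference, forcing a mode with displayplacer looks something like this (the screen id is a placeholder taken from `displayplacer list`, and the exact flags may differ between versions):

```shell
# Show connected displays, their ids, and available modes
displayplacer list

# Apply a resolution/scaling/origin to a specific display
displayplacer "id:<screen-id> res:1920x1080 hz:60 scaling:on origin:(0,0) degree:0"
```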


This kind of dongle is also useful for certain workstations that refuse to boot without a screen if you want to turn them into servers.


BetterDummy will not help with that, the app fully runs in user space as a normal app and can start after login.


Wonder if this can be used to bypass ImmersedVR's display limitations


That is the reason I was reading through the comments. Guess it's time to just go try it out.


Immersed apparently uses the same technique as BetterDummy to create virtual displays. I don't know whether additional BetterDummy-created virtual displays show up in Immersed without a subscription (as I have Elite), but for sure, BetterDummy-created displays do show up in Immersed.


That's awesome. Thanks for trying it out.


Hey, I have a Quest 2, I'll try this one out myself as well. :) Thanks for the idea!


A hardware fix to a software annoyance. Nice.


It is actually a software alternative for a hardware fix to a software annoyance lol. :)



