As a general rule, 4K monitors do tend to slow down LR in the Develop module. How much slower depends on many things. If LR's GPU acceleration works well with your video card and drivers, things can be quite nice for most edits. If they don't play well together, things can be hit-or-miss.
As a Windows guy using a standard monitor, I can't go into much detail on the Mac-Lightroom integration.
What ManiacJoe posted is generally true on Macs too. As I understand it, the introduction of 4K+ and Retina/HiDPI displays on Windows and Mac is what caused Adobe to improve GPU acceleration in Lightroom Classic, because CPU-only rendering was not keeping up with higher resolution displays. A Mac mini is going to use its integrated graphics to update the display, and the integrated graphics on a 2018 Mac mini should perform much better than the integrated graphics in a 2012 Mac mini. That might work out well enough, but I don't actually have a 2018 Mac mini.
Another factor is the pixel dimensions of the images you edit. The more megapixels per image, the longer you may wait for screen updates. The worst-case scenario is supposed to be a 4K (or higher) display combined with very large raw files, such as over 30 megapixels each; that's a lot of pixels to update. If that's what you work with, and Lightroom Classic is still laggy on the 2018 Mac mini, one somewhat costly option is to plug an eGPU (external GPU, a full-size graphics card in an enclosure) into a Thunderbolt 3 port.
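To put rough numbers on the "lots of pixels" point, here's a quick back-of-the-envelope calculation (illustrative figures only, not anything from Adobe's specs):

```python
# Rough pixel math for a 4K display vs. a large raw file.
# These are illustrative round numbers, not benchmarks.

display_4k = 3840 * 2160   # pixels a 4K UHD panel must refresh
raw_30mp = 30_000_000      # pixels in a ~30-megapixel raw file

print(f"4K display: {display_4k:,} pixels")           # 8,294,400
print(f"30 MP raw:  {raw_30mp:,} pixels")             # 30,000,000
print(f"ratio:      {raw_30mp / display_4k:.1f}x")    # ~3.6x
```

So a 4K screen is roughly four times the pixels of an old 1080p display, and the raw file itself holds several times more pixels still, which is why each slider tweak means a lot of recomputation before the screen can update.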