6 votes

Programming/gaming monitor recommendations

I'm a work-from-home software engineer, and I spend most of the day at my desk staring at my dual-monitor setup. There are some specs I'd like to upgrade based on what I've read online, but I want to hear whether any Tildes users have strong opinions about the hardware.

My primary display is a Dell S2719DGF (1440p 144Hz), and my secondary display is a Dell P2719H (1080p 60Hz). My primary display is for programming & gaming, and I want to upgrade to 4k 144Hz. My secondary display is my main display for work, only used for programming. I want to upgrade it to at least 1440p, maybe 4k if it's as good as they say. I also need 100x100mm VESA mount support, though I think most monitors have that these days.

A few points that I'm not certain about:

  • I've read that 4k is better for reading and writing code because the higher pixel density makes text sharper. I definitely prefer 1440p over 1080p, but is the jump from 1440p to 4k as noticeable? I've never used a 4k monitor.

  • My current primary display has a low response time. I don't play fast-paced PvP games anymore; is this something I can give up without noticing?

  • I think IPS panels are the move for both displays, for better contrast and to avoid burn-in, but I'm no longer well educated on the current landscape of panels. MiniLED? QLED? QD-LED? What'sNextLED??

  • Both of my displays are 27 inches. I'm hesitant to upgrade to something larger like 32 inches and lose pixel density. Is the difference between 4k@27in and 4k@32in negligible?

  • I just moved my office into a loft with poor lighting. I read that dark rooms require better contrast but I'm not sure what good or poor contrast looks like.

  • Is my fps going to take a hit from increasing the resolution of my secondary display? I don't know if there's a lot of extra overhead from the increased resolution. AMD GPU/GNOME/Wayland btw.

If you're a programmer/gamer with a hill to die on regarding monitors, please share it with me!

8 comments

  1. [2]
    Greg
    Link
    A pixel density calculator that might be helpful: https://www.sven.de/dpi/ - since you’ve got two different densities already in front of you to compare, it’s hopefully not too hard to visualise roughly how the other options will measure up based on that number.

    One really key thing to remember is that for anything rasterised, sharper means smaller - toolbars, UI elements, etc. often have a default pixel size that maps to a physical size sweet spot at around 110dpi, so 4k 27” (163dpi) can leave you in a bit of an awkward middle ground. You’re either living with tiny UI widgets or scaling 1.5x and losing a lot of your dpi gains to dithering.

    It’s not an issue for text, since you can obviously zoom that however much or little you like, and broader UI toolkits are thankfully starting to take more of a web-inspired approach to vector rendering and scalability too, but there are still enough pixel-based edge cases in a lot of software that it’s worth bearing in mind unless you live in a terminal.
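
    To put rough numbers on the density side of that, here’s a quick back-of-the-envelope sketch using the same diagonal-pixels-over-diagonal-inches maths as the calculator (the sizes are just the ones mentioned in this thread):

    ```python
    # PPI = diagonal resolution in pixels / diagonal size in inches
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    for name, w, h, d in [
        ('1080p 27"', 1920, 1080, 27),
        ('1440p 27"', 2560, 1440, 27),
        ('4k 27"',    3840, 2160, 27),
        ('4k 32"',    3840, 2160, 32),
    ]:
        print(f"{name}: {ppi(w, h, d):.0f} PPI")

    # 1080p 27": 82 PPI
    # 1440p 27": 109 PPI  (right around the ~110dpi sweet spot)
    # 4k 27":    163 PPI
    # 4k 32":    138 PPI
    ```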

    Beyond that, I’ll second the love for QD-OLED; I got a new TV last year with a Samsung panel and I’ve been very impressed. My current monitors are still IPS because I’ve prioritised excessively high dpi, but as soon as an equivalent OLED comes out I’m switching!

    5 votes
    1. ogre
      Link Parent
      I hadn’t considered the impact of DPI on raster UI elements, thank you for bringing it to my attention!

      My current monitors are still IPS because I’ve prioritised excessively high dpi

      What is excessively high to you?? Are they Mac Retina displays?

      2 votes
  2. [3]
    chromakode
    (edited)
    Link
    I upgraded this year to an LG 32GS95UE-B, a 4K 32" WOLED monitor. Previously, I used a 27" 1440p IPS monitor. I spend the majority of my time programming, with some gaming. I enjoy fast-paced games, but I also notice and appreciate the smoothness of regular desktop motion at 144Hz and above.

    If budget allows, these days I'd prioritize an OLED or QD-OLED monitor for coding, thanks to the deeper blacks and better color gamut. I was very interested in QD-OLED displays, but this latest (3rd) generation has a semi-gloss coating and issues in bright rooms (look up "raised blacks"). I work in a bright room, so I went with the LG, and I don't regret it. Text is less sharp than a Mac Retina display, but it's no longer distractingly pixelated like my 27" 1440p. I have good eyesight and I can faintly see the pixels if I get closer to my monitor, but from 2ft away it's fine. I appreciate having the additional physical screen real estate.

    Display technology is improving rapidly right now, and I anticipate future QD-OLEDs tipping the balance with better coatings and color vibrancy. Both the LG and Samsung displays in this class are a bit immature, as the tech is coming downstream from TVs which are more mass market than computer displays. I wouldn't necessarily wait though -- the ergonomic improvements were very worthwhile IME -- but don't be surprised if a screen you get becomes obsolete next year.

    3 votes
    1. [2]
      ogre
      Link Parent
      You don’t run into any burn-in issues from your editor on OLED? I worry that static elements in a desktop OS would easily burn. My TN panel has temporary burn issues (image persistence) and it doesn’t take long at all to set in.

      1 vote
      1. chromakode
        Link Parent
        OLED monitors have a large bag of tricks to reduce burn-in effects, but the jury is still out on how effective they'll be for desktop use, since the desktop is a less established market and a harder use case than televisions. The monitor runs an "image cleaning routine" during long idle periods, which resolves short-term persistence effects. There's also imperceptible pixel shifting over time to prevent static items from burning specific pixels.

        In practice some of the tricks (e.g. dimming large white areas on the screen) are more noticeable to me than occasional image persistence (which does happen), but it hasn't been particularly annoying. 70% of the time I'm looking at white-on-black IDE text, and for that use case (QD-)OLED is unbeaten and quite efficient.

        I accept that this monitor is a consumable which may burn in over time. My biggest concern with that is on ecological grounds. As a business expense it's well worth it for the QOL; I stare at this thing for 10+ hours a day including personal use.

        3 votes
  3. [2]
    floweringmind
    Link
    Make sure you pick a monitor whose refresh rate is a multiple of 60Hz: 60, 120, 240, 480. Otherwise you will have issues with video playback.
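
    A rough sketch of why, assuming the display simply repeats the most recent video frame (no VRR or motion smoothing): at refresh rates that aren't a multiple of 60Hz, each frame of 60fps video gets held for an uneven number of refresh cycles, which shows up as judder in smooth panning shots.

    ```python
    # How many refresh cycles each frame of 60fps video is held for,
    # assuming the display just repeats the most recent frame (no VRR).
    def repeats(refresh_hz, video_fps=60, n_frames=10):
        return [(i + 1) * refresh_hz // video_fps - i * refresh_hz // video_fps
                for i in range(n_frames)]

    for hz in (60, 120, 144, 240):
        print(hz, repeats(hz))

    # 60  [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]   even cadence
    # 120 [2, 2, 2, 2, 2, 2, 2, 2, 2, 2]   even cadence
    # 144 [2, 2, 3, 2, 3, 2, 2, 3, 2, 3]   uneven -> judder
    # 240 [4, 4, 4, 4, 4, 4, 4, 4, 4, 4]   even cadence
    ```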

    1 vote
    1. ogre
      Link Parent
      Could you elaborate on that? My primary display is 144Hz and I haven’t noticed any issues. Maybe if you point them out I’ll notice and then I won’t be able to unsee them haha

      1 vote
  4. Pistos
    Link
    I have a 4k main display, and would only begrudgingly go back to 1080p. I have a BenQ PD3200U, and it's easily the best hardware purchase I've made in my entire life. 4k is significantly better in almost all ways. The main trade-off:

    Is my fps going to take a hit from increasing the resolution

    Yes, because the graphics hardware has to do more work to achieve the same results (per unit time) with more pixels.
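
    For a rough sense of scale (assuming raster work grows roughly with pixel count; the real-world hit depends heavily on the game and on what's actually being rendered on each display):

    ```python
    # Relative pixel counts per frame (more pixels ~= more raster work)
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}
    base = 2560 * 1440  # the 1440p primary you're upgrading from
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} MP, {w * h / base:.2f}x vs 1440p")

    # 1080p: 2.1 MP, 0.56x vs 1440p
    # 1440p: 3.7 MP, 1.00x vs 1440p
    # 4k:    8.3 MP, 2.25x vs 1440p
    ```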

    I have a 3-monitor setup, so I see my previous monitor (1080p) and new 4k monitor side by side all the time. When I drag something from one to the other, the difference is obvious.

    That all said, I'm not that intense a gamer, and I'm content to play games at 60fps or less. In fact, I prefer better graphics at lower framerates over higher framerates with worse graphics. In your case, if you really want to play above 100 Hz, that'll probably increase your cost by quite a bit, both in the monitor and the graphics card.

    You'll need to do some display scaling (mainly at the OS level), so things are not too small (relative to 1080 at the same distance from your eyes). I set my default Firefox font scaling to 150%. Also, the occasional game will obviously be designed for 1080, and UI elements will be ridiculously tiny at 4k, or the screen real estate will be laid out with a 1080 bias (bad UX at 4k). Most games are fine in 4k, though, in this respect.

    1 vote