9 votes

4K screen on 15" laptop - worth it?

Pricing up my next Thinkpad (I'm a lifer for Thinkpads I think now) and I keep hovering over the 4K screen option. I'm looking at a 15.6" screen. The FHD 14" screen I currently have is lovely and sharp with a decent colour gamut, and I don't think I can see pixels, even now when the machine is literally on my lap. I'd guess the screen is maybe 35cm from my eyes at the moment.

I don't really game, I do edit photos, video (HD, not 4K) and do a little 3D work with Blender/FreeCAD/etc. I usually run Debian/Gnome, occasionally dropping into Windows because my 3D printer's preferred slicing software is Windows only (grrrr).

The other bonus to 4K is HDR400 and twice as many nits of brightness, but again, I'm not sure that's worth an extra £250. I'd probably turn the brightness down anyway. The HDR is potentially interesting, but as I don't watch TV/movies on this machine and my camera doesn't output HDR, that's likely not very useful despite sounding good. I could buy quite a lot more compute power and RAM with that money instead.

I would go and look at one in person, but I have no idea where the nearest 4K Thinkpad on display is, and even if I did, I don't really want to go into shops right now.

Any thoughts, experiences, advice, etc would be much appreciated.

17 comments

  1. [3]
    asoftbird
    Link

    I'm going to link this chart and would say it's pretty accurate.

    At that size, I wouldn't say it's worth it unless your head is 1ft away from the screen at all times.

    If you want something a little sharper than 1080p, check out 1440p or similar, but 4k is just not worth it at that point, imo.
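
    For what it's worth, here's a rough back-of-the-envelope version of that reasoning (a quick Python sketch; the ~1 arcminute figure for 20/20 acuity and a 15.6" 16:9 panel are my assumptions, not numbers taken from the chart):

    ```python
    import math

    def ppi(diagonal_in, res_w, res_h):
        """Pixels per inch from a panel's diagonal size and resolution."""
        return math.hypot(res_w, res_h) / diagonal_in

    def acuity_distance_m(ppi_value, arcmin=1.0):
        """Distance beyond which a single pixel subtends less than `arcmin`
        arcminutes (~1 arcmin is a common figure for 20/20 acuity)."""
        pixel_pitch_m = 0.0254 / ppi_value
        return pixel_pitch_m / math.tan(math.radians(arcmin / 60.0))

    for name, w, h in [("FHD", 1920, 1080), ("4K UHD", 3840, 2160)]:
        density = ppi(15.6, w, h)
        print(f"{name}: {density:.0f} PPI, pixels resolvable closer than "
              f"~{acuity_distance_m(density):.2f} m")

    # Roughly: FHD ~141 PPI -> ~0.62 m, 4K ~282 PPI -> ~0.31 m, which lines
    # up with the "about a foot away" rule of thumb above.
    ```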

    10 votes
    1. [2]
      mat
      Link Parent

      Working that line back to 15" looks to me like I'd be in the blue at <0.6m, which is where I mostly sit.

      But, that said, I can only barely see the difference between 720 and 1080 on my 46" TV that I sit 2 metres from. So I suspect my aging eyes (or at the very least, eyes in need of some new glasses) are coming into this a bit.

      If Lenovo offered a 1440p I think that would be perfect.

      5 votes
      1. Gaywallet
        Link Parent

        Given that laptops are often quite literally in our laps, I would argue it's absolutely in the distance range at which you will be able to tell a visual fidelity difference between 1080 and 4K. However, a higher quality display will often offset this. Almost everything has an IPS panel nowadays, but something with good color reproduction (such as HDR, if available) would be more important than overall resolution here.

        One consideration with a 4K screen is that it takes more energy to power. Realistically a 1080p or 1440p panel is going to draw much less power and give you longer battery life. I personally don't think 4K is worth it for most laptop users currently, but it really depends on what you're doing with the display. If you're an artist or the extra pixels are important, then go for the 4K; otherwise I'd highly suggest a lower resolution.

        5 votes
  2. [4]
    Akir
    Link

    IMHO, the most important thing in a modern display is not the pixel count, but the color reproduction. If you're working on visual arts, you'll want to have a display capable of displaying as wide (and accurate) a color gamut as possible so you can make sure your end result looks as good as possible no matter what it's being shown on. So getting the screen upgrade might be worth it for that reason alone.

    By the way, you don't need a special camera to do HDR. You can just stack multiple exposures. If you're shooting Canon, you can use Magic Lantern to help you capture the bracketed shots quickly.
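
    As a minimal sketch of the stacking idea (this uses OpenCV's Mertens exposure fusion; the filenames are placeholders and it assumes you've already shot a bracketed set):

    ```python
    import cv2
    import numpy as np

    # Hypothetical bracketed exposures of the same scene, darkest to brightest
    paths = ["under.jpg", "mid.jpg", "over.jpg"]
    images = [cv2.imread(p) for p in paths]

    # Mertens fusion blends the brackets directly; no exposure times needed
    merged = cv2.createMergeMertens().process(images)  # float32, roughly 0..1

    # Scale back to 8-bit for viewing/saving on an ordinary (non-HDR) display
    result = np.clip(merged * 255, 0, 255).astype(np.uint8)
    cv2.imwrite("fused.jpg", result)
    ```

    (If you record the exposure times you can go the MergeDebevec-plus-tonemap route instead, but the fusion above is usually the quick win.)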

    8 votes
    1. [3]
      mat
      Link Parent

      That's a good point. I think realistically the colour handling of a Dolby Vision certified display is probably more tempting than the raw pixel count, but the standard Lenovo screens are already fairly good at colour. I can't recall the exact spec of my current screen, but I seem to remember it's around 75% sRGB, which isn't bad.

      What I meant with the camera was that its usual behaviour is to produce standard 8-bit JPEGs that any display can show most of (colour gamut notwithstanding). It can make 14-bit raws, but frankly life is too short for shooting raw 99% of the time. It will shoot expanded dynamic range at the sensor with some nifty automatic per-pixel ISO adjustments - up to four EV - which is fun.

      2 votes
      1. [2]
        Akir
        Link Parent

        The thing about 75% sRGB is that it's a smaller section of the color gamut than you might expect. Take a look at this graphic to see what I mean.

        That being said, I'm not really 100% convinced that having a wider color gamut to work on actually means that you'll produce better results. If you show someone an image on a screen with an accurate 8-bit colorspace and another with a 14-bit colorspace, they wouldn't likely be able to tell the difference. Vision is spongy and subjective; there is literally no such thing as perfect vision. And HDR really is much better for movies, IMHO.

        I'm kind of shocked you just shoot JPEG. I'm assuming you're probably either a sports photographer or you're just really confident that you can shoot exactly what you've got in your mind. Personally speaking, I shoot RAW just so I can change my mind later. :P

        4 votes
        1. mat
          Link Parent

          That's kind of what I mean about 75% sRGB being OK. It's not huge in the grand scheme of things, but it's good enough and to be honest almost all of my photos (100% of the work images) are being seen on someone else's screen so optimising for that is almost easier on a less-than-perfect screen. I used to work with a video guy who had a second desk covered in cheap TVs and monitors so he could master for the screens people would actually see his work on, rather than on his eye-wateringly expensive insanely-high gamut professional display. It was sort of useless in that sense, although holy crap it was beautiful. I get there are advantages to working at the highest res/gamut possible then mastering down at the end, but I'm not sure that's me.

          A little off topic but fwiw I'm mostly an amateur photographer. I do shoot photos for work stuff but they're done in a lightbox which is so reliably repeatable that I have a jpeg preset in-camera just for that use. My post-processing workflow at this point is "place SD card in laptop, choose image, crop, upload", which over the years has probably saved me weeks of tinkering around converting raws in Darktable.

          A number of Fuji-using pros I know shoot JPEG because the in-camera conversion is so, so good. I generally like being restricted somewhat while I'm taking photos for fun; I have a handful of quick-select presets I use - B&W, Chrome emulation, Velvia emulation, Pro Neg Hi - and a couple of variations within those for dynamic range, noise reduction and so on. I'll regularly change settings during a day out. I rarely miss what I'm going for, and if I'm really not sure I turn on JPEG+RAW mode, and even then 99% of the time I use the JPEG. Post-processing is a time and energy sink I'm increasingly not interested in, although I appreciate how some people enjoy it. It's similar to how one of my favourite lenses is an 8mm fisheye. You have to get everything perfectly set up before you hit the shutter because you can't unbend the image in post! But with care while shooting you can get amazing results, and I really like putting the focus (as it were) back on me holding a camera rather than me pushing a slider on a screen.

          2 votes
  3. dblohm7
    Link

    I would say that, for text, there hasn't been much benefit to 4K. I had delusions that as a developer I would benefit from the extra pixels, but my eyesight isn't what it used to be. Due to the pixel density, I end up needing to make my fonts so large that I lose any benefits.

    Working with graphics is obviously a different kettle of fish, but I definitely agree with other replies here that colour accuracy should take precedence over pixel density.

    6 votes
  4. [2]
    skybrian
    Link

    It seems like it's less whether it's noticeable and more whether you care. Why pick up more-expensive tastes when you don't need to? If you don't really see the difference, maybe it's better not to be in a hurry to find out?

    5 votes
    1. mat
      Link Parent

      That is also an excellent point. My current screen is fine. Maybe it would be a few almost-impossible-to-quantify percent "better" if it were 4K. But then I could instead spend that money on more CPU/GPU/RAM, which I know would be useful, and I'd never know what, if anything, I was missing with the screen.

      3 votes
  5. [2]
    pew
    Link

    Since I bought my first Retina MacBook 13" there's no way for me to go back to Full HD. I switched jobs earlier this year and got a notebook with an FHD display and it was really bad. Eventually they provided me with a MacBook with a Retina display and everything was good again.

    I know, #firstworldproblem, but still. I feel like once you've had 4K/Retina you can't go back, even on a 13" machine.

    On the other hand, if you're using Linux I'm not sure about 4K; my experience was really bad. Even in 2020, everything's kinda messed up.

    4 votes
    1. snazz
      Link Parent

      Yeah, I think the software compatibility aspect is the most significant consideration. macOS and nearly all third-party Mac apps support Retina very nicely. Windows supports HiDPI monitors to an extent (many included programs like Computer Management are fuzzy), and third-party software is less guaranteed to support it. Linux might be challenging to get working with HiDPI displays outside of programs like the default GNOME applications, which are meant to work at those resolutions.
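
      For what it's worth, on GNOME the usual desktop-wide knobs are exposed through gsettings; a small sketch (this assumes a GNOME session - the key names are the ones I believe current GNOME ships, and other desktops do this differently):

      ```python
      import subprocess

      def gset(schema, key, value):
          """Thin wrapper around `gsettings set` (GNOME only)."""
          subprocess.run(["gsettings", "set", schema, key, value], check=True)

      # Whole-desktop integer scaling (honoured by X11 sessions)
      gset("org.gnome.desktop.interface", "scaling-factor", "2")

      # Or scale just the text if 2x is too much on a given panel
      gset("org.gnome.desktop.interface", "text-scaling-factor", "1.25")

      # On Wayland, per-monitor fractional scaling sits behind a mutter flag;
      # once set, pick e.g. 150% under Settings > Displays
      gset("org.gnome.mutter", "experimental-features",
           "['scale-monitor-framebuffer']")
      ```

      Per-application HiDPI behaviour still varies, of course - this only covers the desktop-wide settings.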

      5 votes
  6. [2]
    phormix
    Link

    4K is great, but also make sure it's a bright screen with good color contrast. I don't know that 4K automatically gives you good brightness, as I've seen some screens that just look kinda... dull. I've got a 4K laptop and my SO has a Surface Book. Frankly, the Surface looks much better for a lot of stuff.

    That said, I really do find that just having the extra desktop real estate is very useful for productivity on a high-res screen, so long as you've got good vision. Since you're a Linux user you should be able to find plenty of utilities to divvy up your screen, but even in Windows the Win+arrow combination works pretty well to tile things.

    2 votes
    1. mat
      Link Parent

      I have a cheap, nasty Chuwi tablet which uses the same screen as the previous generation of Surface devices and it's chuffin' lovely. Slow as hell, but it looks gorgeous. The Lenovo 4K screens are 500 nits IIRC, with a very decent colour gamut.

      There are two things pulling me towards 4K, one is the HDR/colour rendition for photo editing (which is for fun) and the other is being able to fit more video editing stuff into my desktop (which is for work-ish). Trying to cram a full 1080p NLVE environment into my current FHD/14" screen is fiddly as hell. Hence a slightly larger screen with some more pixels is quite tempting.

      I do need some new glasses - lockdown got in the way of eye tests and stuff - but generally I'm pretty good at seeing stuff.

      GNOME's desktop-switch shortcut Ctrl-Alt-Up/Down is so ingrained in me that I do it in Windows all the time, where it, for some reason, turns the screen upside down. I'm not sure that's such an important operation that it needs a shortcut, but apparently it does.

      2 votes
  7. [3]
    Comment deleted by author
    Link
    1. [2]
      mat
      Link Parent

      I run Linux 99.9% of the time, so I don't care about Lenovo's crapware. Their build quality has always been very good in my experience, and for me it's important that my computers are easy to take apart, repair and upgrade, which Thinkpads are - but the reason I'll always buy Thinkpads is the keyboards. I've never used a laptop which comes close; even most desktop keyboards aren't as good. I have physical access restrictions which make me very sensitive to keyboard quality, so it's a huge deal for me. I know I can safely buy a Thinkpad and type on it with minimal pain. I can't say that for any other manufacturer, even if their keyboards might be OK. It takes a good couple of weeks of use to determine if a keyboard is long-term 'safe' for me, and returning a system after that long can be awkward. A Thinkpad is a safe buy.

      Slightly smaller numbers for me, but the other way around. When I worked at $LargeMultinational we had Thinkpads and Dell and HP laptops and the Thinkpads were the most reliable, the Dells were OK and the HPs were complete trash. I knew if I sent someone out with a Thinkpad they likely wouldn't be coming back to me with problems.

      My whole team had Dell systems at my last programming job and it was such a pain. Every one of them had some weird issue that Dell weren't interested in fixing. My manager's machine rebooted every day at 15:21 despite multiple reinstalls and even a change of building (we didn't move offices just to try to fix his computer, but doing so also didn't fix his computer). Super weird. Put me off Dells.

      Related to all this, I read something ages ago about how everyone in IT has a preference for disk manufacturers - personally I'm a Seagate guy - and tends to think the brand they like is more reliable than the others, and the one they don't like (man, fuck those clowns at Western Digital) is worse, but on the numbers there was basically zero difference in reliability between all the main brands. I suspect most hardware is the same, and as such I tend to view comments like yours with that in mind, especially when my experience runs the other way. It's not that I disbelieve you, nor am I unappreciative of you sharing your knowledge and experience. More data is always good. But that said, I'm not about to go and price up a Dell machine. ;)

      2 votes
      1. Akir
        Link Parent

        The people who work on software and firmware for Dell must have a serious case of the "good enoughs".

        Their laptops have firmware "good enough" to run Windows, but so broken that they won't run anything else. I couldn't initially run Linux on my Dell laptop because it didn't have the proper configuration for the integrated peripherals. I could only get it to run properly by manually passing the configuration to the kernel at boot. Dell has openly said they do not plan to fix this firmware.

  8. mat
    Link

    I'm sure you'll all be fascinated to hear that I just ordered a Thinkpad T15P with a 4K screen.

    I wasn't sure about the screen right up to the point that I (a) made a few sales and had a little more budget to play with and (b) managed to snag a 10% discount code for Lenovo's store. Top tip: hit the shop in Chrome - I'd been on and off the UK store all week in Firefox, then for unrelated reasons I happened to load it in Chrome and immediately a "sign up to our spam and get a 10% voucher" box popped up. In went my spam email address, and seconds later my 4K system was back in budget range. I could have got a little extra CPU instead, but it was only a few % more power compared to over twice as many extra pixels.

    In the end I decided that 16GB of RAM would do for now, because I can buy third-party upgrades later for half the price Lenovo wanted. It did get me some sweet i7-10750H hexa-core CPU goodness though, and a passable (for a non-gamer) GTX 1050 GPU - so maybe I'll be a little less of a non-gamer in future. The current-gen Lenovo 4K panel is 100% Adobe RGB, which is pretty impressive.

    Thanks for the feedback and advice everyone, much appreciated.