• SocialMediaRefugee@lemmy.world · ↑7 · edited 3 hours ago

    At a certain point your eyes can’t tell much of a difference. It’s like music: people would obsess over tweaking their stereo systems to the point where I doubt they could physically tell the difference; it was mostly imagined.

    Huge TVs also require big rooms to make the viewing angle work. Not everyone has a room where that works; apartments especially are too small for huge TVs.

  • lemmydividebyzero@reddthat.com · ↑58 · 10 hours ago

    4K is enough, 60 fps is enough, no smart or AI stuff is perfectly fine…

    What about reducing energy consumption? That’s an innovation I want.

    • HereIAm@lemmy.world · ↑42 · 10 hours ago

      I hope you mean 60 Hz is enough for TVs. Because I certainly don’t want that regression on my monitor 😄

    • 1984@lemmy.today · ↑11 ↓14 · edited 9 hours ago

      Are you aware that reducing the energy consumption of monitors is completely irrelevant compared to the giant data centers coming online now, each taking as much power as a full city?

      One typical AI data center ≈ 1 TWh/year ≈ the electricity used by 100 000 average homes annually.

      A very large AI data center under construction ≈ 20 TWh/year ≈ electricity for ~2 million homes annually.

      Fun times, isn’t it?
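
      A quick sanity check on those equivalences, as a sketch (the ~10.5 MWh/year figure for an average US household is my assumption, not from the comment):

      ```python
      # Back-of-the-envelope check of the data-center-vs-homes comparison.
      HOME_KWH_PER_YEAR = 10_500  # assumed: ~10.5 MWh/year for an average US home

      def homes_equivalent(twh_per_year: float) -> float:
          """Convert a facility's annual draw in TWh to 'average homes'."""
          return twh_per_year * 1e9 / HOME_KWH_PER_YEAR  # 1 TWh = 1e9 kWh

      print(f"1 TWh/year  = {homes_equivalent(1):,.0f} homes")   # ~95,000
      print(f"20 TWh/year = {homes_equivalent(20):,.0f} homes")  # ~1.9 million
      ```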

        • HugeNerd@lemmy.ca · ↑8 ↓6 · 7 hours ago

          Yes, sort of like if your kitchen is on fire but you also need to vacuum the living room. You should definitely focus on finishing the vacuuming before addressing the fire, because, you know, you can care about more than one thing.

          • badgermurphy@lemmy.world · ↑7 · 7 hours ago

            Fires are more urgent than messes, but we have firemen and custodians and need them both. The poster can vote with his wallet to get the best energy efficiency possible in his home electronics and possibly vote at the ballot box to help regulate corporate energy waste.

            It doesn’t seem sane that he would have to forgo every other endeavor in his life until the most urgent issue in it is resolved, even if there is no direct action he can take on that one at this time.

            I need a new car and also to do the dishes. The new car is much more important, but I have no means to work on the car issue today and am already standing in the kitchen. Is it more productive if I pace around wringing my hands in concern about the car problem, or maybe wash some dishes now and get a car in the morning when the dealership opens?

          • hume_lemmy@lemmy.ca · ↑2 · 7 hours ago

            Try to keep up with me here: what if you could put out the fire and vacuum at the same time?

        • 1984@lemmy.today · ↑1 ↓1 · 5 hours ago

          Of course, buy monitors that use a bit less energy. That will probably save you a dollar a year, and you can spend that on something nicer. A cup of coffee, maybe.
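
          Back of the envelope, as a sketch (every number here is my assumption: ~5 W saved, 4 h/day of use, $0.15/kWh):

          ```python
          # Rough check of the "a dollar a year" claim.
          watts_saved = 5        # assumed: an efficient monitor draws ~5 W less
          hours_per_day = 4      # assumed usage
          price_per_kwh = 0.15   # assumed electricity price, $/kWh

          kwh_per_year = watts_saved * hours_per_day * 365 / 1000  # = 7.3 kWh
          print(f"~${kwh_per_year * price_per_kwh:.2f}/year")      # ~$1.10
          ```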

      • lemmydividebyzero@reddthat.com · ↑7 · 7 hours ago

        I personally do reject that kind of thinking.

        “But others are worse” reminds me of kindergarten…

        The EU has introduced energy-efficiency classes for monitors based on what’s technically achievable, and we are still far from that goal.

        I can’t tear down AI data centers, but I can choose to buy a monitor that does not heat up my living room and leads to a nicer electricity bill for me.

        • Doomsider@lemmy.world · ↑2 · 6 hours ago

          “I can’t tear down AI data centers”

          I mean, if the wealthy won’t listen while they try to steal all our resources, that is exactly what must happen.

      • HugeNerd@lemmy.ca · ↑1 · 7 hours ago

        I’m reminded of this type of absurdity every time my Creative T40 speakers auto-shut off after a few minutes of inactivity, and take 4 seconds to wake up again. Yes, that entire millijoule of (entirely renewable) electrical energy is making a huge difference.

  • aceshigh@lemmy.world · ↑13 ↓6 · 7 hours ago

    Getting rid of my tv was the best thing I did for myself. That’s the future. Removing and reducing all screen time.

    • ipkpjersi@lemmy.ml · ↑8 ↓1 · edited 4 hours ago

      For me, the opposite was true. Ever since I injured my knees last year, putting a 75 inch TV in my bedroom has improved my quality of life.

      I know people will probably say “oh, just fix your knees” and think that sentiment helps, but I tend not to take medical advice from technology communities; I listen to doctors instead. It may sound rude, but it’s true that medical advice should come from medical professionals for the best possible outcome.

      I really do love having a nice big TV in my bedroom.

    • bridgeenjoyer@sh.itjust.works · ↑6 ↓1 · 6 hours ago

      We didn’t have one for a long time.

      Then we got one for free, so we use it for movies and some gaming together, but we don’t have any streaming subscriptions.

      I agree it’d be better not to have one. But I’m into old console gaming, so I like to have a couple of TVs.

      It’s good to balance it with reading and exercising, of course! But movie nights are fun.

  • Pyr@lemmy.ca · ↑46 ↓1 · 14 hours ago

    Fuck.

    Now instead of each new generation of TVs being slightly higher in resolution, some goddamn business tech executive is going to focus on some goddamn bullshit change or addition which is going to be absolutely fucking ridiculous or pointless and annoying, like AI television shit or TV gaming or fuck my life. Smart TVs are bad enough, but they can’t help themselves; they need to change or add shit all the fucking time to “INNOVATE!!!” and show stockholders value.

    Resolution was an easy, time-consuming treadmill, but we can’t rely on it to keep them from fucking TVs up any more.

  • ComradePenguin@lemmy.ml · ↑11 ↓2 · 11 hours ago

    The only real innovations for TVs after 1080p were HDR, better sound formats, 60-120 Hz, and upscaling to 4K.

      • jestho@lemmy.zip · ↑3 · 5 hours ago

        Only time I used 3D on my TV was playing Black Ops split-screen, where both players got the full screen, which was pretty neat.

  • AA5B@lemmy.world · ↑81 · 20 hours ago

    I want a dumb TV with the price and specs of a smart TV. Less is more.

    • keyez@lemmy.world · ↑3 · 5 hours ago

      My LG C3, not connected to the internet and driven by an HTPC and an Nvidia Shield, is working great so far.

    • thatKamGuy@sh.itjust.works · ↑27 ↓1 · 17 hours ago

      What you’re looking for are commercial screens. They’re a bit more expensive for a comparable panel, as they’re intended for 24/7 use, but they’re about as dumb as they get nowadays.

      • NocturnalMorning@lemmy.world · ↑15 · 10 hours ago

        A bit more expensive? I was about to get a smart TV for like 800 bucks. The equivalent dumb TV would have been a few thousand dollars, and Best Buy said they would only sell it to business accounts, which was infuriating to read.

        • Doomsider@lemmy.world · ↑5 · 6 hours ago

          This isn’t a peasant TV. And it doesn’t even have any tracking, I am not sure a pleb can even legally own these. Sorry, but you have to be a wealthy person who watches CP to have it in your home.

    • avg@lemmy.zip · ↑18 · 18 hours ago

      You would likely have to pay more since they aren’t getting to sell your information.

      • Alcoholicorn@mander.xyz · ↑11 · 17 hours ago

        *You would have to pay more because major companies know they can charge more. There isn’t some fixed amount of profit a company wants to make, from which it derives a price; they price as high as the market will bear.

      • AA5B@lemmy.world · ↑4 · 11 hours ago

        Realistically, not only do I not want an 8K TV, I might not get a TV at all if I had to buy one today. We rarely watch TV anymore. The brainrot is the same, but we’re much more likely to use individual screens.

  • Omega_Jimes@lemmy.ca · ↑35 · 22 hours ago

    There’s a ton of other things I want my TV to do before more pixels.

    Actual functional software would be nice; better tracking on high-speed shots (in particular sweeping landscapes, or reticles in video games); higher frame rates and variable-frame-rate content; making the actual use of the TV faster, things like changing inputs or channels; oh man, so much more.

    Anything but more pixels.

    • yeehaw@lemmy.ca · ↑18 · 20 hours ago

      I still probably watch 90% 1080p and 720p stuff lol. As long as the bitrate is good it still looks really good.

    • Skullgrid@lemmy.world · ↑23 ↓1 · 22 hours ago

      Actual functional software would be nice

      you do not want software on your TV.

      • uniquethrowagay@feddit.org · ↑11 · 15 hours ago

        Yes I do. I want an actual smart TV with a practical, open-source, TV-optimized Linux OS. It’s not that software on a TV is a bad idea in itself; it’s how it’s ruined by for-profit companies.

        • balsoft@lemmy.ml · ↑14 · 12 hours ago

          Nah, honestly, I think stuffing an entire computer inside a monitor and relying on it to generate/show content is a bad idea no matter what software it runs. A dumb TV + a small computing dongle requires only a tiny fraction more labor to produce than a smart TV, but it’s so much easier to upgrade in the future if you decide you need faster boot times or wanna game on the TV, etc. And if the TV breaks before the dongle does, you can also buy a new TV and keep all your settings/media without transferring anything.

          • Aceticon@lemmy.dbzer0.com · ↑5 · edited 11 hours ago

            Also, to add to this: the life cycle of a TV display is mismatched with the life cycle of media-playing hardware, or of hardware for general computing. You need to update the latter more often to keep up with things like new video codecs (for performance, those are implemented in hardware) and, more generally, to be able to run newer software with decent performance.

            I’ve had a separate media box for my TV for over a decade, and in my experience you go through 3 or 4 media boxes for every TV you replace, partly because new video codecs come out and partly because the computing hardware in those things is usually low-end, so newer software won’t run as well. I eventually settled on a generic mini-PC with Linux and Kodi as my media box (which is pretty much the same to use in your living room as a dedicated media box, since you can get a wireless remote for it, so no keyboard or mouse is needed), and it doubles as a server in the background (remotely managed via ssh), something that wouldn’t be possible at all with computing hardware integrated into the TV.

            In summary, having the computing separate from the TV is cheaper and less frustrating (you don’t have to endure slow software after a few years because the hardware is part of an expensive TV you don’t want to throw out), and it gives you far more options to do whatever you want (let’s just say that if your network-connected media box gets enshittified, it’s pretty cheap to replace it, or you can even go the way I went and replace it with a system you fully control).

      • Omega_Jimes@lemmy.ca · ↑9 · 20 hours ago

        I mean, yes and no. I like eARC, and I like being able to adjust settings other than v-hold. But I don’t want this slow crud-fest that keeps telling me when my neighbour turns on Bluetooth on their iPhone.

          • Omega_Jimes@lemmy.ca · ↑1 · 10 hours ago

            Yeah, all my inputs go to the TV, then I run a wire to the receiver. My PS5 and PC are plugged directly into the TV so I can get different resolutions and variable refresh rate, and the TV can control the receiver. So when I turn something on, the TV and receiver turn on and set themselves to matching settings: Dolby, stereo, whatever. It’s not huge, but it’s a nice convenience over the older optical connection.
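
            For what it’s worth, that same power-on chaining can be scripted from a PC over HDMI-CEC. A minimal sketch, assuming libcec’s cec-client is installed and your adapter exposes a CEC port (logical address 0 is the TV, 5 is the audio system):

            ```python
            # Send one-shot commands to the HDMI-CEC bus via cec-client.
            import subprocess

            def cec(command: str) -> None:
                """Run cec-client in single-command mode (-s) with quiet logging (-d 1)."""
                subprocess.run(["cec-client", "-s", "-d", "1"],
                               input=command.encode(), check=True)

            cec("on 0")        # wake the TV
            cec("on 5")        # wake the receiver/audio system
            # cec("standby 0") # ...and put the TV back to sleep later
            ```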

      • Zozano@aussie.zone · ↑3 ↓2 · 21 hours ago

        I want software on my TV.

        Steam Link specifically. I like streaming to my TV via Ethernet.

    • Lenggo@lemmy.world · ↑8 · 21 hours ago

      I have a Samsung Frame because I wanted a TV that didn’t look so much like one, but the software is so goddamn bad. The only way to switch sources quickly is to set them as favorites, which isn’t always straightforward if you didn’t do it right away. Regardless, you have to let the home page fully render before you can even worry about that. Even the Samsung TV app, which you’d think would be perfectly optimized for the hardware since the same company makes the software, is barely functional and loads like a web page on AOL in 1998.

      • lobut@lemmy.ca · ↑2 · 20 hours ago

        I like my Frame because it faces a set of windows, and with all my other TVs I would have to close the blinds to see the screen in the daytime.

        However, the software is straight garbage. I didn’t even know about the favourite thing… every time I change source, it spends like a minute or two trying to figure out if it can connect to it, for no reason.

  • FireWire400@lemmy.world · ↑204 · edited 1 day ago

    It’s about time the electronics industry as a whole realises that innovation for the sake of innovation is rarely a good thing

    • Scratch@sh.itjust.works · ↑142 · 1 day ago

      Look, we can’t have TVs that last 15 years anymore!

      We need to keep people buying every year or two. Otherwise line not go up! Don’t you understand that this is about protecting The Economy?!

      • givesomefucks@lemmy.world · ↑58 · 1 day ago

        Boomer economic policy is like if Isaac Newton saw an apple falling from a tree and concluded it would keep accelerating at the same rate no matter what, even though the ground, with the entire-ass planet behind it, is right fucking there.

        Numbers cannot go up forever; that’s just what was happening their whole lives, and they can’t accept that their childhoods were a blip, not how things always were and always will be.

        They just can’t wrap their heads around it. They have such shit-tier empathy that they can’t comprehend that they’re the exception.

        • cmbabul@lemmy.world · ↑27 · 1 day ago

          A large number of the problems we currently face and will in the future come down to boomers being worse than their predecessors at grasping, understanding, and accepting their own impermanence and unimportance on the grand stage of reality.

          Most of them need to have a series of existential crises, or maybe read some fucking Sartre, so they can stop with the Me Generation bullshit. It’s wild that the first generation to do LSD en masse is somehow the one that needs to experience ego death the most.

          • givesomefucks@lemmy.world · ↑26 · edited 1 day ago

            It’s wild that the first generation to do LSD en masse

            I want to say hippies were less than 1% of that generation, but for some reason I think it was recorded as 2-3%, which would be a gross over-estimate.

            But for every hippie you think of sticking daisies in rifle barrels, there were 100 spitting on Black kids for going to the school they were legally required to go to.

            It would be like if in 2080 they think we’re all catboys with blue hair and 37 facial piercings.

            Sure, those people exist as a fringe demographic, but they’re not the norm.

            Most hippies had more issues with peers their own age than with their parents’ generation, but that part of the folk tale gets left out, because the people who want us to think they were hippies and “grew out of it” were the ones beating hippies for being different.

            All they were ever trying to do was lie to younger generations in the hope they’d conform to decades-old social norms. It’s weird how many people still don’t understand that boomers just lie about shit instinctively. They grew up in a world full of lead and are literally incapable of caring about logical inconsistencies. They want younger generations to think they were cool, so they just fucking lied about what their generation was like.

            If you ever run into a real deal old hippie some day, ask them what the majority of people their age was like back then.

        • explodicle@sh.itjust.works · ↑6 · edited 18 hours ago

          To be fair Boomers didn’t create this economic policy. Their parents elected Nixon, who broke the Bretton Woods agreement “temporarily”, and then we adopted Keynesian macroeconomic policy afterwards to justify it.

          Inb4 someone regurgitates a defense of this “boomer” policy and proves that it’s not just them and never was. It’s always been the rich and their loyal servants.

    • UnderpantsWeevil@lemmy.world · ↑76 · edited 1 day ago

      It’s not even innovation, per se. It’s just Big Number Go Up.

      Nobody seems to want to make a TV that makes watching TV more pleasant. They just want to turn these things into giant bespoke advertising billboards in your living room.

      Show me the TV manufacturer who includes an onboard ad blocker. That’s some fucking innovation.

      • redditmademedoit@piefed.zip · ↑12 · 1 day ago

        The galaxy-brain move is buying an old dumb TV for a pittance and using it to watch Jellyfin/Plex/streams from a browser with uBlock Origin and DNS filtering – all running on some relative’s “obsolete” smart toaster from last year, which they happily gift you because “the new version’s bagel mode IS LIT – pun intended – but it needs the 128 GB of DDR7 RAM in the new model; it can barely toast on the old one any more”.

      • Evilschnuff@feddit.org · ↑9 · 1 day ago

        I think this just comes down to human nature. Give people (engineers, execs) a metric that looks like a good proxy for performance and they will overcommit on that metric as it is a safer bet than thinking outside the box. I think the incremental improvements in deep learning with all those benchmarks are a similar situation.

      • chocrates@piefed.world · ↑5 · 1 day ago

        You can’t really find a dumb TV anymore. I might see how big of a monitor I can find when I’m ready to upgrade, but I doubt I’ll find one big enough and cheap enough.

        • UnderpantsWeevil@lemmy.world · ↑4 · 23 hours ago

          I hooked my computer up to the HDMI and have used that as my primary interface.

          It’s not perfect, but it screens out 95% of the bullshit.

          • tyler@programming.dev · ↑2 · 21 hours ago

            That doesn’t help unless you’ve blocked your TV from network access, because they use ACR (Automated Content Recognition), which literally scans what’s being displayed over your HDMI port and then sells it off to advertisers.

          • dual_sport_dork 🐧🗡️@lemmy.world · ↑15 · 23 hours ago

            That won’t save you anymore. My boss bought a smallish smart TV in contravention of my explicit instructions for use as a CCTV monitor because it was “cheap.” It nags you on power up with a popup whining about not being able to access the internet, and if you don’t feed it your Wifi password it will subsequently display that same popup every 30 minutes or so requiring you to dismiss it again. And again. And again. Apparently the play is to just annoy you into caving and letting it access your network.

            Instead I packed it up and returned it. Fuck that.

            • tyler@programming.dev · ↑5 ↓1 · 21 hours ago

              If you are at a business you should have an access point or router that is capable of blocking specific devices from WAN access. But I would create a new segmented network, block that network from WAN access entirely, put it on its own VLAN, and then connect the TV to that network.
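
              A minimal sketch of that blocking step on a Linux-based router, assuming nftables with an existing `inet filter` table and `forward` chain, a WAN interface called wan0, and a static lease for the TV (all placeholders to adapt):

              ```python
              # Drop anything from the TV heading out the WAN interface, while
              # still letting it reach the LAN (e.g. a local media server).
              import subprocess

              TV_IP = "192.168.20.50"  # hypothetical static lease for the TV

              subprocess.run(
                  ["nft", "add", "rule", "inet", "filter", "forward",
                   "ip", "saddr", TV_IP, "oifname", "wan0", "drop"],
                  check=True,
              )
              ```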

              • explodicle@sh.itjust.works · ↑4 · 18 hours ago

                I’d assume it nags whenever it can’t connect to the home server, and just says “network”.

                So when they go out of business any remaining units will nag forever.

                • tyler@programming.dev · ↑2 · 15 hours ago

                  You can use your router or access point tools to check what address it’s trying to resolve and then set up a redirect to a device that can respond with a fake response.
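
                  A toy sketch of such a fake responder (UDP only, A records only, no EDNS/TCP; the answer address is a placeholder). Point the router’s DNS redirect for the TV at the machine running this:

                  ```python
                  # Minimal fake DNS server: answers every A query with one fixed IP.
                  import socket

                  ANSWER_IP = "192.168.20.1"  # hypothetical: what the TV should "resolve"

                  sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                  sock.bind(("0.0.0.0", 53))  # port 53 needs root/CAP_NET_BIND_SERVICE

                  while True:
                      query, addr = sock.recvfrom(512)
                      # Header: copy query ID, flags 0x8180 (response, RD+RA), copy
                      # QDCOUNT, claim one answer, zero NS/AR counts.
                      header = (query[:2] + b"\x81\x80" + query[4:6]
                                + b"\x00\x01" + b"\x00\x00" * 2)
                      question = query[12:]            # echo the question section back
                      answer = (b"\xc0\x0c"            # name: pointer to offset 12
                                + b"\x00\x01\x00\x01"  # type A, class IN
                                + b"\x00\x00\x00\x3c"  # TTL 60 s
                                + b"\x00\x04"          # RDLENGTH: 4 bytes
                                + socket.inet_aton(ANSWER_IP))
                      sock.sendto(header + question + answer, addr)
                  ```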

    • Lucidlethargy@sh.itjust.works · ↑2 ↓1 · 22 hours ago

      We traded 3D TVs, which are amazing if you watch the right stuff, for 8K…

      8K is great, but we need media in 8K to go with it.

  • lechekaflan@lemmy.world · ↑12 · 19 hours ago

    Most developing countries have cheap 1080p TVs right now, but others are still using CRTs, and still others are watching on their phones (like some of my poorer relatives, who get their entertainment fix through their phones while the big TVs in their living rooms rarely get turned on).

    • BanMe@lemmy.world · ↑10 · 18 hours ago

      I think my TV is like 32" and 720p from 2012 or so. It’s fine.

      Before that I had a projector which was fun sometimes but mostly too big for the room. Cool to take psychedelics and run a visualizer while it was pointed along the floor at you. You could feel the fractals on your skin. I don’t do that anymore, so a 32" TV is fine.

      • hereiamagain@sh.itjust.works · ↑1 · 7 hours ago

        My TV is a 50", I think? 1080p, from 2013. It was cheap then and it’s worth nothing on the market now. But it works fine.

        The backlight is evenly distributed, which is good; uneven backlighting is a pet peeve of mine. But otherwise it’s unremarkable.

        Honestly, I’d really like to try those new HDR TVs, a mini LED or OLED or something. But I just can’t justify it. Why? Because the TV I have, works fine 🤷‍♂️

        If it magically died tomorrow, I’d upgrade. But I definitely don’t need 8k. Heck I don’t need 4k. I barely watch any content at 1080, it’s mostly 720 🤷‍♂️

  • ohulancutash@feddit.uk · ↑42 ↓1 · 23 hours ago

    They showed Skyfall on 70 ft IMAX screens, and that film was finished at 2880 x 1200. It’s not all about the pixel count.

      • ohulancutash@feddit.uk · ↑4 · edited 14 hours ago

        DP Roger Deakins has a forum on his site where he answers these sorts of questions. The Arri Alexa Studio was essentially developed for him, as he prefers an optical viewfinder. It has an Academy gate, and Deakins favours spherical lenses rather than anamorphic, so we simply take the width of the sensor and crop down to a 2.40:1 aspect ratio to get the quoted dimensions. Your link quotes the capture resolution, which will have some excess compared to the finished material.

        This was put through the DI process, and at a later stage IMAX made a DNR’d blow-up to 15/70mm.

  • MonkderVierte@lemmy.zip · ↑99 ↓1 · 1 day ago

    Gaming was supposed to be one of the best drivers for 8K adoption.

    Huh? When 4K still struggles with GPU power? And for next to no benefit?

      • Kyden Fumofly@lemmy.world · ↑1 · 8 hours ago

        Everything above 1440p doesn’t really offer extra usable display space. I have a 4K monitor, but 150% DPI scaling is needed to make things big enough to work with.
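
        The workspace math behind that, as a quick sketch: the usable (logical) space is just the native resolution divided by the scale factor.

        ```python
        # Logical desktop size after DPI scaling.
        def effective_space(width: int, height: int, scale: float) -> tuple[int, int]:
            return round(width / scale), round(height / scale)

        print(effective_space(3840, 2160, 1.5))  # 4K at 150%    -> (2560, 1440)
        print(effective_space(2560, 1440, 1.0))  # 1440p at 100% -> (2560, 1440)
        # Same usable space -- the 4K panel just renders it more sharply.
        ```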

        • AnUnusualRelic@lemmy.world · ↑1 · 3 hours ago

          I’m using a 5K-by-1200 monitor and it’s awesome with KDE. I suppose it would be equally great with other interfaces.

    • LiveLM@lemmy.zip · ↑20 · edited 22 hours ago

      Introducing the new DLSS 9, where we upscale 720p to 8K. Looks better than native, pinky swear.

    • olympicyes@lemmy.world · ↑31 ↓1 · 1 day ago

      What’s dumb is that 3D failed because of a lack of resolution and brightness, and now we have more pixels than we can handle and screens so bright they can hurt to look at. The PS3 had a couple of games that showed different screens to two players wearing 3D glasses. I’d love to see full-screen couch co-op games with modern tech. 8K isn’t solving any problems.

      • Rooster326@programming.dev · ↑18 ↓1 · edited 20 hours ago

        3D failed for the exact same reason VR is failing now: nobody wants to wear headsets at home.

        • No1@aussie.zone · ↑8 · edited 14 hours ago

          Nobody wants to wear headsets at home.

          Or anywhere, really. AR/XR glasses are a big improvement, but can still be clunky.

          Or bring on the neural shunt. Not so fast, Elon; it’s not for your purposes, it’s for mine. And I can’t see who I could trust with direct connections to my brain… but once I’m plugged in: I love Amazon, and you should subscribe!

      • raldone01@lemmy.world · ↑6 · edited 24 hours ago

        Screen dimming is technically possible over HDMI/DisplayPort; no idea why it’s not properly supported and integrated into monitors, graphics drivers, Windows, and Linux. KDE sometimes shows dimming for monitors? I don’t know whether that’s software or real hardware dimming, though.
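
        On Linux, hardware dimming does usually work today through DDC/CI, e.g. with the ddcutil CLI (VCP feature 0x10 is the standard brightness control). A minimal sketch, assuming ddcutil is installed and the monitor supports DDC/CI:

        ```python
        # Set the panel's own backlight brightness (not a software gamma ramp).
        import subprocess

        def set_brightness(percent: int) -> None:
            """Set hardware brightness (0-100) on the first detected display."""
            subprocess.run(["ddcutil", "setvcp", "10", str(percent)], check=True)

        set_brightness(30)  # dim the panel to 30%
        ```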

    • addie@feddit.uk · ↑11 · 1 day ago

      I dunno. Oxygen Not Included looks crisp on a 4K monitor. And it makes my job easier, being able to have an absolute tonne of code on-screen and readable. I reckon I could probably use an 8K monitor for those things.

      Yeah, I generally have FSR running on any 3D game made in about the last decade - even if I can run it at 4K at a reasonable framerate, my computer fans start to sound like a hoover and the whole room starts warming up. But upscaling seems a better solution than having separate monitors for work and play.

    • Korhaka@sopuli.xyz · ↑7 ↓1 · 1 day ago

      Pretty sure my GPU could run Rimworld at 4K; just play good games instead of AAA games.

      • MonkderVierte@lemmy.zip · ↑1 · edited 11 hours ago

        But we’re talking TVs here, where an RPG with a controller is more suitable, and there are some great RPGs.

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · ↑6 ↓3 · edited 1 day ago

      FR. What distance and size do I need to actually see the difference between 1080p and 4K? Because my current setup doesn’t let me notice anything but the massive reduction in performance, unless it’s a 2D game, and then everything becomes too tiny to play effectively.
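
      A common rule of thumb puts the limit of 20/20 vision around 60 pixels per degree of visual angle; past that, extra resolution is mostly invisible. A sketch to check your own setup (the 65"-at-8-feet example is just an illustration):

      ```python
      # Pixels per degree of visual angle for a 16:9 screen at a given distance.
      import math

      def pixels_per_degree(diag_in, horiz_px, vert_px, distance_in):
          width_in = diag_in * horiz_px / math.hypot(horiz_px, vert_px)
          px_per_inch = horiz_px / width_in
          inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
          return px_per_inch * inch_per_degree

      # 65" TV viewed from 8 feet (96"):
      print(f"1080p: {pixels_per_degree(65, 1920, 1080, 96):.0f} ppd")  # ~57
      print(f"4K:    {pixels_per_degree(65, 3840, 2160, 96):.0f} ppd")  # ~114
      # 1080p already sits near the ~60 ppd threshold here, so 4K's extra
      # detail is largely invisible unless you sit closer or go bigger.
      ```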

      • calcopiritus@lemmy.world · ↑13 · 1 day ago

        4K is noticeable on a standard PC.

        I recently bought a 1440p screen (for productivity, not gaming) and I can fit so much more UI at the same visual fidelity compared to 1080p. Of course, the screen needs to be physically bigger for the text to stay the same size.

        So if 1080p → 1440p is noticeable, 1080p → 4K must be too.

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · ↑5 ↓1 · edited 1 day ago

          Like I said, with 2D things it’s noticeable only because it makes everything smaller (there is more space because the elements inside that space are smaller). But movies and 3D games? No difference.

          Even going from 640x480 to 1024x768 makes a noticeable size difference in the 2D elements of a UI.

      • toofpic@lemmy.world · ↑2 · 1 day ago

        I’m using a 60-inch TV as a monitor for my desktop; I sit in front of it at a distance of about 2 m. It feels really nice to have stuff in 4K, so it’s always 4K, except for the games that are too tough for my 2060 Super to give me 60 fps.

      • Kogasa@programming.dev · ↑3 ↓1 · 1 day ago

        It’s very noticeable at the DPI of a 27" screen from arm’s length. Or maybe not, if you can’t see very well. But on a TV from 10 feet away, I don’t know if I could differentiate 1440p from 4K personally.

          • Kogasa@programming.dev · ↑3 · 1 day ago

            27" 2K is a good DPI. I personally only went up to 4K 27" because I also wanted OLED, and the 2K OLED panel I was using had some noticeable text fringing because of the subpixel layout. At 4K it’s not noticeable anymore.

      • kameecoding@lemmy.world · ↑2 ↓3 · 1 day ago

        On a PC you can actually see the benefit of 4K over lower resolutions.

        It’s on TVs where 8K is likely useless.

  • panda_abyss@lemmy.ca · ↑152 ↓1 · 1 day ago

    There’s no 8k content, and only recently do standard connectors support 8k at high refresh rates.

    There’s barely any actual 4K content you can consume.

    • Kyden Fumofly@lemmy.world · ↑1 · 7 hours ago

      There is a lot of 4K to consume now. The scarcity you describe was the reality 5 years ago (and 4K has existed for more than 10). I would say 4K is slowly becoming the new FHD, but very, very slowly.

      The problem is that there is a lot of low-quality 4K, because of bandwidth, file sizes, etc.

      • panda_abyss@lemmy.ca · ↑2 · 7 hours ago

        I feel like most streaming platforms lock it behind their top-tier plans and still compress it to crap, though.

    • UnderpantsWeevil@lemmy.world · ↑53 · 1 day ago

      There’s barely any actual 4K content you can consume.

      Honestly a little surprised the IMAX guys didn’t start churning out 4k+ content given that they’ve been in the business forever.

      But I guess “IMAX in your living room” isn’t as sexy when the screen is 60" rather than 60’

      • jqubed@lemmy.world · ↑39 · 1 day ago

        You don’t even need IMAX for 4K; ordinary 35mm film scans to a nice 4K video. Films shot on the 65mm IMAX cameras would probably make good 8K content, but most of that was educational films, not what most people apparently want to watch all the time.

        The digital IMAX projections were actually a step backwards in resolution.

        • hcbxzz@lemmy.world · ↑2 · 17 hours ago

          IMAX is a mess. They can’t even figure out a consistent aspect ratio, so most of the content shot on IMAX is cropped after delivery.

        • UnderpantsWeevil@lemmy.world · ↑18 · 1 day ago

          Films shot on the 65mm IMAX cameras would probably make good 8K content, but most of that was educational films, not what most people apparently want to watch all the time.

          Sure. But the cameras exist. You can use them for other stuff.

          Hateful Eight was filmed in 70mm, and while it wasn’t Tarantino’s best work it certainly looked nice.

        • Anakin-Marc Zaeger@lemmy.world · ↑5 · 1 day ago

          Films shot on the 65mm IMAX cameras would probably make good 8K content

          So there’s still hope that they might release The Last Buffalo in 8k 3D sometime in the future? Got it. :)

      • queermunist she/her@lemmy.ml · ↑19 · 1 day ago

        They don’t want IMAX in your living room, they want IMAX in the IMAX theater, where you pay a premium for their service.

      • worhui@lemmy.world · ↑3 · 20 hours ago

        IMAX is 4K-or-less content. Its edge is special projection that can look good and bright on huge screens.

        Only IMAX film prints are significantly better than anything else.

    • givesomefucks@lemmy.world · ↑21 ↓2 · 1 day ago

      People really need to understand that a lot of what “smart” TVs do is upscale the “4K” signal to something actually resembling real 4K.

      Like how some 4K torrents are 3 GB, while a 1080p of the same movie is 20 GB.

      It’s “worse” resolution, but it looks miles better, because the TV is upscaling real 1080p to 4K instead of taking an existing shitty 4K stream and trying to make it look better without just juicing the resolution.

      So we don’t need 8K content for 8K TVs to be an incentive. We need real 4K media; then 8K TVs would show a real improvement.

      • Chronographs@lemmy.zip · ↑29 · 1 day ago

        Yeah, you’re talking about bitrate. A lot of 4K content is encoded with more efficient codecs, but if it’s sourced from the streaming services, the bitrate is so abysmal that it’s usually a tossup between the 1080p and 4K streams. At least the 4K usually has HDR these days, which is appreciated.

      • bdonvr@thelemmy.club · ↑11 ↓1 · 1 day ago

        Yeah. A 1080p Blu-ray clocks in around 20 GB. A 4K Blu-ray is 60-80 GB.

        If you’re downloading something smaller, it’s probably lower quality.

    • bdonvr@thelemmy.club · ↑13 · edited 1 day ago

      There’s barely any actual 4K content you can consume

      I feel like that’s not true, but you’ve got to try. If you’re streaming it, chances are it’s not really any better. 4K Blu-ray (or rips of them…), though? Yeah, it’s good. And since film actually has 8K+ of resolution, old movies can be rescanned at high resolution if the original film still exists.

      Supposedly Sony Pictures Core is one streaming service that can push nearly 4K Blu-ray bitrates… but you’ve got to have really good internet. Like pulling 50-80 GB in the span of a movie’s runtime.

      • Kogasa@programming.dev · ↑17 ↓1 · edited 1 day ago

        You’re probably aware of this since you mentioned bitrate, but a lot of 4K streaming services use bitrates that are too low to capture much more detail at 4K compared to a lower resolution. A lot of games will recommend/effectively require upscaling (DLSS/FSR/XeSS) to achieve good performance at 4K. All of this is still maybe better than 1440p, but it shows 4K is still kind of hard to make full use of.

        • applebusch@lemmy.blahaj.zone · ↑2 · 14 hours ago

          I wish they didn’t feel the need to fake it with upscaling. In my experience upscaling looks like shit every time, whether it’s a video or a game. Most of the time a good 1080p video with a good bitrate will look way better than a 4K upscale.

          • Kogasa@programming.dev · ↑2 · 13 hours ago

            For video, bitrate is definitely king. 4K high bitrate just gets insanely large. I opt for 1080p bluray quality when available over 4K usually. I looked into AI upscaling for video recently and it can be pretty good, but it’s a technology that changes fast so I’d rather store the original resolution and upscale in real time later (if at all).

            For games, I find even FSR2 upscaling from 1440p to 2160p is excellent as long as it’s implemented properly (i.e. scaling the 3D world and not the UI), and FSR3/4 even better.

      • CmdrShepard49@sh.itjust.works · ↑4 · 24 hours ago

        Not true on paper, but true in practice. Most people don’t buy or use Blu-rays (or any other physical media) anymore, to the point that retailers aren’t even bothering to stock them on the shelves these days. Their days are certainly numbered, and then all we’ll be left with is low-quality 4K streaming.

    • mlg@lemmy.world · ↑4 · 21 hours ago

      There’s barely any actual 4K content you can consume.

      Ironically, there actually is if you bother pirating, because that’s the only crowd that will share full 4K Dolby Vision + Dolby Atmos/DTS:X Blu-ray rips.

      Aside from that, though, even 4K gaming is a struggle, because GPU vendors went off the deep end with frame generation, which coincidentally is the same mistake lots of TV OEMs already made.

    • Asmodeus_Krang@infosec.pub · ↑10 · 1 day ago

      I’ve got a nice 4K mini-LED TV with a 4K Blu-ray player, and there’s plenty of excellent 4K content, but it’s a niche market because most people aren’t using physical media for movies. 4K streaming is garbage compared to UHD Blu-ray.