• flemtone@lemmy.world · 34 points · 3 days ago

    And how much are they charging to support the new standard? DisplayPort is at least free and open.

  • mlg@lemmy.world · 30 points · 4 days ago

    Not only is DisplayPort better, there’s also eDP, which has been used as a beefed-up MIPI for laptops and tablets for years.

    Plus it supports HDMI and DVI for backward compatibility, so HDMI is really just the last corporate media standard standing that hasn’t fallen to its superior open counterpart.

  • Altima NEO@lemmy.zip · 6 points · 3 days ago (edited)

    Now I wonder how strict they’ll actually be with the standard. HDMI 2.1 is a mess because you can call just about anything 2.1 now.

    And then it takes forever for companies to adopt it.

  • Lost_My_Mind@lemmy.world · 25 up / 4 down · 3 days ago

    No thank you. I’d like to pass on this.

    We can already do 8K resolution. We still haven’t gotten to a point where the average broadcast is anything more than 720p or 1080p.

    It’s the reason I never bought a 4K or 8K TV. Sure, I have a new TV, but the only thing that’s 8K is the forced ads in the TV’s OS.

    And that’s why I don’t see the benefit of a new HDMI. It’s just going to support more protocols and make TVs do more things that we don’t want. It’s going to put DRM in the cable, it’s going to enable unskippable ads, it’s going to bring all this shit that nobody wants or needs, but ooooOOOOooooo!!! Look! It’s new tech, so everybody’s gotta have it!!!

    But how is it going to improve your visual experience if the content isn’t any higher resolution to begin with?

    • patatahooligan@lemmy.world · 6 points · 3 days ago

      On HDMI 2.1 you can do 8K at 30 fps before you have to compress the stream with DSC, which is “visually lossless”, i.e. actually lossy. We can’t even do 5K at 120 fps or 4K at 240 fps without compression, and those are common refresh rates for gaming. So you could say the highest resolution that covers all use cases without compromises is 1440p. That’s definitely not enough even by today’s standards.

      I think you’re underestimating the time it takes for standards to reach widespread adoption. The average viewer won’t have access until the technology is cheap and enough time has passed for at least a few hundred million units to reach the market. If you think you’re going to be using it in 2030 or later for your “average broadcast”, then it needs to be designed today.
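
      To put the bandwidth numbers above in perspective, here’s a quick back-of-the-envelope sketch (Python; active pixels only, 10-bit RGB, ignoring blanking intervals and protocol overhead, and assuming roughly 42.7 Gbit/s of usable FRL bandwidth on HDMI 2.1, i.e. 48 Gbit/s minus 16b/18b encoding):

```python
# Rough uncompressed-bandwidth check against HDMI 2.1's usable FRL rate.
# Active pixels only -- real signals also carry blanking, audio and FEC.
HDMI_2_1_USABLE_GBPS = 42.7  # ~48 Gbit/s raw minus 16b/18b encoding overhead

def needed_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Raw pixel data rate in Gbit/s (30 bpp = 10-bit RGB)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("8K @ 30 Hz",     7680, 4320, 30),
    ("8K @ 60 Hz",     7680, 4320, 60),
    ("5K @ 120 Hz",    5120, 2880, 120),
    ("4K @ 240 Hz",    3840, 2160, 240),
    ("1440p @ 240 Hz", 2560, 1440, 240),
]
for name, w, h, hz in modes:
    need = needed_gbps(w, h, hz)
    verdict = "fits" if need <= HDMI_2_1_USABLE_GBPS else "needs DSC"
    print(f"{name}: ~{need:.1f} Gbit/s -> {verdict} on HDMI 2.1")
```

      Even with blanking ignored, 5K 120 Hz and 4K 240 Hz land well above what the link can carry uncompressed, while 8K only squeezes in at 30 Hz.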

      Of course HDMI is shit for the reasons you mention. DisplayPort is better, but it’s not an open standard either, and it supports DRM as well. But designing for higher bandwidth is necessary, and that has nothing to do with HDMI’s problems.

    • rottingleaf@lemmy.world · 1 up / 1 down · 3 days ago

      Yes. It’s nice to have crisp fonts, but how realistic do the pictures on my screen really need to be?

      Life is finite. I have my dog, family members, some need for rest and food and cleaning and music, forgotten chores, and a girl I’d like to talk to (likely it’s either my imagination and she doesn’t like me that way, or the ship has sailed, but still, to hope is to live).

      Pictures on an ordinary 1920x1080 display are already as good as anything people in the 1950s could have dreamed of seeing on paper. I’m not judging those who want more, but it seems like pressure for the sake of it. More GHz, more pixels, bigger something.

      I hope that pressure for “more” will change to pressure for “better” some day.

    • dual_sport_dork 🐧🗡️@lemmy.world · 29 points · 4 days ago

      I wish manufacturers would bother to mark the capabilities of their otherwise identical-looking ports and cables, so we could figure out what the hell we’re looking at when holding a Device That Does Not Work in one hand and a cable in the other.

      • Ilovethebomb@lemm.ee · 5 points · 4 days ago

        I think this is the reason a number of standards are moving to active cables, so the device will know when the cable isn’t up to spec.

        Or in the case of USB-C, so it doesn’t catch fire after having five amps cranked through it.
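
        Conceptually it’s just capability negotiation: the source asks the cable’s marker chip what it can handle and never exceeds that. A toy sketch of the idea (not the actual USB PD protocol; the names and numbers here are made up for illustration):

```python
# Toy model of e-marked/active-cable negotiation -- illustrative only,
# not the real USB Power Delivery or HDMI FRL signalling.
from dataclasses import dataclass

@dataclass
class Cable:
    max_current_a: float  # current the cable's marker chip says it can carry
    max_gbps: float       # highest data rate it claims to support

def negotiate(wanted_current_a, wanted_gbps, cable):
    """Clamp the source's wishes to what the cable declares it can handle."""
    current = min(wanted_current_a, cable.max_current_a)
    rate = min(wanted_gbps, cable.max_gbps)
    if (current, rate) != (wanted_current_a, wanted_gbps):
        print(f"cable limited: running at {current} A / {rate} Gbit/s")
    return current, rate

cheap_cable = Cable(max_current_a=3.0, max_gbps=10)
negotiate(5.0, 40, cheap_cable)  # -> 3 A and 10 Gbit/s instead of a fire
```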

        • dual_sport_dork 🐧🗡️@lemmy.world · 6 points · 4 days ago

          Well, bully for the device if it knows, but that doesn’t help the user who has just pulled one identical-looking cable out of the many in the drawer and has no idea, until they plug it in, whether they’ll get a picture, nothing, near-undiagnosable partial functionality, or smoke.

          • Ilovethebomb@lemm.ee · 1 point · 4 days ago

            I’m thinking that the user will get a notification that the cable they’re using isn’t the correct one.

              • Ilovethebomb@lemm.ee · 3 points · 3 days ago

                HDMI is backward compatible a long way back; almost every device will fall back to a standard that doesn’t require such an expensive cable.

                Come on man, this is simple stuff.
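
                Roughly speaking, link setup just walks down the list of rates until the cable passes; a conceptual sketch (real HDMI link training is more involved than this, and can also drop back to the older TMDS modes):

```python
# Conceptual fallback: try link rates from fastest to slowest and settle on
# the first one the cable actually sustains. Rates here are the nominal HDMI
# FRL tiers; real training also tests each lane, retries, and can fall back
# to legacy TMDS signalling.
FRL_RATES_GBPS = (48, 40, 32, 24, 18)

def train_link(cable_max_gbps):
    for rate in FRL_RATES_GBPS:
        if cable_max_gbps >= rate:
            return rate      # first rate the cable can hold
    return None              # no common rate -> no picture at all

print(train_link(18))  # 18 Gbit/s-class cable -> lowest FRL tier
print(train_link(48))  # certified Ultra High Speed cable -> full rate
```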

    • Pasta Dental@sh.itjust.works · 3 points · 4 days ago

      I can definitely see the difference between a 1440p 27-inch display and a 5K 27-inch display; add in a high refresh rate and HDR and you’re already close to exceeding the DP 2.0 maximum bandwidth (without Display Stream Compression). I wish we could finally get decent high-DPI monitors on desktops that aren’t made by or for Apple Macs.

      • LaggyKar@programming.dev · 3 points · 3 days ago

        Though that’s not where you would use HDMI. I would argue that for TVs, 4K is generally enough, and HDMI 2.1 already has enough bandwidth for 4K 120 Hz at 12 bits per color uncompressed.

        But DisplayPort, yeah, that could use a bit more.

        • Pasta Dental@sh.itjust.works · 3 points · 3 days ago

          The point is that resolution alone doesn’t matter; it’s viewing distance plus pixel density that matters. This is what Apple calls Retina: when we stop seeing individual pixels (jagged edges) at a normal viewing distance. That means a phone needs a much higher pixel density than a desktop monitor or a TV. But the low-DPI displays we still have are unacceptable in 2024; the icons and text look so ugly…
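
          One way to make that concrete is angular pixel density, i.e. pixels per degree of vision. A rough sketch (the viewing distances are just illustrative assumptions, and ~60 px/degree is the usual ballpark at which 20/20 vision stops resolving individual pixels):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Linear pixel density of a panel in pixels per inch."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(panel_ppi, viewing_distance_in):
    """Angular pixel density: pixels packed into one degree of vision."""
    return math.radians(1) * viewing_distance_in * panel_ppi

setups = [
    ("27in 1440p monitor @ 24in",  ppi(2560, 1440, 27), 24),
    ("27in 5K monitor    @ 24in",  ppi(5120, 2880, 27), 24),
    ("6.1in ~460 PPI phone @ 12in", 460,                12),
    ("65in 4K TV @ 96in (8 ft)",   ppi(3840, 2160, 65), 96),
]
for name, density, dist in setups:
    print(f"{name}: ~{pixels_per_degree(density, dist):.0f} px/deg")
```

          The 1440p desktop lands around 45 px/degree while the 5K panel, the phone, and the couch-distance 4K TV all clear the threshold, which is exactly the viewing-distance point.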

          • Cocodapuf@lemmy.world · 1 up / 1 down · 3 days ago (edited)

            The thing is, I prefer actually owning my media; I don’t use streaming services for the most part. But even with my 40 TB of media storage, I just don’t have the space for 5K content. If it’s worthwhile, it gets 1080p; if it matters less (kid shows or anything that came from a DVD), it gets 720p at best.
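
            For a sense of scale, library size is basically hours times bitrate. A quick sketch (the bitrates and library size are just illustrative assumptions; real files vary a lot with codec and content):

```python
# Rough library-size math: hours of video * average bitrate.
def library_tb(hours, mbit_per_s):
    """Total size in terabytes (decimal TB)."""
    return hours * 3600 * mbit_per_s / 8 / 1e6  # Mbit -> MB -> TB

HOURS = 2000  # e.g. ~1000 titles at ~2 h each
for label, bitrate in [("720p", 4), ("1080p", 8), ("4K/5K HDR", 50)]:
    print(f"{label} (~{bitrate} Mbit/s): ~{library_tb(HOURS, bitrate):.1f} TB")
```

            At those rates the same library goes from roughly 7 TB at 1080p to around 45 TB at high-bitrate 4K/5K, which is exactly the 40 TB problem.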