Hey Community,

Since I just read a post about the X11 vs. Wayland situation, I'm wondering whether I should stay on X11 or switch to Wayland. I'd appreciate your opinions on this decision, plus answers to a few questions. Further information about my systems is at the bottom.

  • What are the advantages of Wayland? What are the disadvantages?
  • I mostly do music production, programming, browsing, etc., but I occasionally get back into gaming (on the desktop). How's performance there? Anything that might break?
  • What would be the best way to migrate?
  • Why have/haven't you made the switch?

Desktop: Ryzen 3 3100, 16 GB RAM, RX 570, Arch Linux with KDE, 144 Hz FreeSync monitor and a shitty 60 Hz monitor

Laptop: ThinkPad L540 (iirc), i3-4100, 8 GB RAM, Intel UHD 630 graphics (iirc), Arch Linux with heavily customized i3-gaps

  • @michaelrose@lemmy.ml
    1 year ago

    “double performance in wayland”

    Ya that is complete nonsense.

    Firefox accelerated decoding works on X11, as does mixed DPI.

    • Coelacanthus
      1 year ago

      There is a benchmark using https://webglsamples.org/aquarium/aquarium.html:

      https://blog.lilydjwg.me/2021/11/12/display-tearing.215968.html

      • X11 + Intel graphics: 1080p 60fps video, GPU fully utilized, and a third of the frames dropped! 4K 60fps is about the same, so the limiting factor turns out to be the video's frame rate, not the resolution (the GPU isn't used to its full capacity anyway).
      • Wayland + Intel graphics: 4K 60fps video with no dropped frames at all, and the GPU at only about half capacity for the graphics work.
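
      If you want to reproduce that comparison, it's worth confirming which session type you're actually in first (it's easy to end up on X11 without noticing). A minimal sketch in Python that just reads the standard session environment variables:

        import os

        # XDG_SESSION_TYPE is set by the session manager ("x11" or "wayland");
        # WAYLAND_DISPLAY only exists inside a Wayland session.
        session = os.environ.get("XDG_SESSION_TYPE", "unknown")
        wayland_socket = os.environ.get("WAYLAND_DISPLAY")

        if session == "wayland" or wayland_socket:
            print(f"Wayland session (socket: {wayland_socket})")
        elif session == "x11":
            print("X11 session")
        else:
            print(f"Could not tell (XDG_SESSION_TYPE={session})")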

      For hardware decoding: when I switched to Wayland, it was only implemented on Wayland. After they implemented EGL on X11, they implemented hardware decoding on X11 as well.
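
      Related to the Firefox point above: even in a Wayland session, Firefox may still run through XWayland unless you opt in to its native Wayland backend. A small sketch of launching it that way (MOZ_ENABLE_WAYLAND is the usual opt-in environment variable; I'm assuming `firefox` is on your PATH):

        import os
        import subprocess

        # MOZ_ENABLE_WAYLAND=1 asks Firefox to use its native Wayland backend
        # instead of running through XWayland.
        env = dict(os.environ, MOZ_ENABLE_WAYLAND="1")
        subprocess.run(["firefox"], env=env)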

      For mixed DPI, applications can implement it themselves using screen information, but not all applications do. Wayland, however, requires them to implement this.
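
      To make the mixed-DPI point concrete: the "screen information" an app has to act on is basically each monitor's pixel and physical size, from which it works out a per-monitor scale on its own. A toy sketch of that arithmetic (the monitor sizes below are made-up examples):

        # Per-monitor DPI from resolution and physical width; this is the kind
        # of calculation an X11 app has to do itself, while a Wayland compositor
        # tells each surface its scale factor directly.
        MM_PER_INCH = 25.4
        BASELINE_DPI = 96  # conventional 1x scale

        def dpi(pixels_wide: int, width_mm: float) -> float:
            return pixels_wide / (width_mm / MM_PER_INCH)

        # Hypothetical monitors, values made up for illustration.
        monitors = {
            "27in 4K": (3840, 597.0),
            "15.6in 1080p": (1920, 345.0),
        }

        for name, (px, mm) in monitors.items():
            d = dpi(px, mm)
            print(f"{name}: {d:.0f} DPI, scale ~{d / BASELINE_DPI:.2f}x")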