• Narrrz@kbin.social · 49 points · 1 year ago

    The irony of the first panel being a guy at his home computer, and the second at a work computer.

  • TrustingZebra · 44 points · 1 year ago

    Working from home has killed PC gaming for me. I have no desire to sit at my desk after a full work day.

    • Waker@lemmy.ml · 18 points · 1 year ago

      Same. I can’t even be in the same room anymore. Somehow it made gaming there feel like a chore as well…

      I’m so bored of games that I’ll go work on some other stuff (home servers) and just take care of day-to-day life stuff instead. Maybe I’m just growing up, but I never thought I’d be bored of games… Ever…

      I still prefer working remotely though!

      • Obi@sopuli.xyz · 5 points · 1 year ago

        I solve this by putting an Xbox downstairs and only playing games where short sessions are possible (Rocket League, racing, indie games). I’d love for them to port WoW so I could grind/level up on the Xbox and go to the PC for raids, though realistically, even like that, I probably still don’t have enough time anymore.

        • Waker@lemmy.ml · 2 points · 1 year ago

          Yeah, I’ve noticed I do feel more at ease in my living room now. But the only console I own is a Nintendo Switch; games that aren’t multiplayer bore me faster, and the Switch’s multiplayer games don’t really speak to me too much…

          Well, you could in theory use XPadder or something like that to play with a controller on PC. It’s not ideal by any means though…

          With that + Steam streaming to the TV app (just add the game as a non-Steam game), you could play WoW. Not anything competitive though lol. Just questing, simple stuff, or easier dungeon content.

      • TrustingZebra · 4 points · 1 year ago

        Home servers are a hobby for me as well. It’s a better hobby than gaming, and it’s what got me into my current career.

        But yeah, kid me would be somewhat disappointed in adult me losing all interest in gaming…

    • Fushuan [he/him]@lemm.ee · 12 points · 1 year ago

      Working from home has empowered PC gaming for me. I just play a little when I’m stuck, and closing the work laptop screen is enough to disassociate from the working environment. I even have a KVM and reuse the same screens, same keyboard, same mouse.

      • NιƙƙιDιɱҽʂ@lemmy.world · 6 points · 1 year ago

        I work on my own PC, but keep two separate desktops going. One for personal, one for work. Ctrl + alt + arrow key to switch between. We don’t have any dystopian monitoring software to install, so it works great for my use case.
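
        If anyone wants to script that switch instead of relying on the keybinding, here’s a minimal sketch. It assumes a Linux/X11 setup with wmctrl installed and desktop 0 used for work, desktop 1 for personal — the actual OS and layout aren’t stated above, so treat those details as placeholders.

        ```python
        """Toggle between a 'work' and a 'personal' virtual desktop using wmctrl."""
        import subprocess

        WORK, PERSONAL = "0", "1"  # assumed desktop numbers

        def current_desktop() -> str:
            # `wmctrl -d` lists desktops; the active one is marked with '*'
            out = subprocess.run(["wmctrl", "-d"], capture_output=True, text=True, check=True).stdout
            for line in out.splitlines():
                fields = line.split()
                if len(fields) > 1 and fields[1] == "*":
                    return fields[0]
            raise RuntimeError("could not determine the active desktop")

        def toggle() -> None:
            target = PERSONAL if current_desktop() == WORK else WORK
            subprocess.run(["wmctrl", "-s", target], check=True)  # switch to the other desktop

        if __name__ == "__main__":
            toggle()
        ```

        Bind that to a hotkey and it does the same thing as Ctrl + Alt + arrow, just explicitly.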

        • Fushuan [he/him]@lemm.ee · 4 points · 1 year ago

          I’d still prefer being provided with a workstation. That way it’s easier for the company IT team to install whatever they need, I don’t need to care about privacy tools and stuff on the work PC, and I can just focus on working. Also, no way in hell I’m paying for storage space for work files.

      • TrustingZebra · 3 points · 1 year ago

        I won’t do that during the work day, but having a KVM switch could be helpful. Every time I want to use my personal machine I need to unplug and replug the USB hub and monitor. It’s only a couple of cables, but it’s annoying enough that I usually don’t bother.

        Actually, I just saw this post, I really should get a KVM switch…

        • Fushuan [he/him]@lemm.ee · 3 points · 1 year ago

          They are really great. I have two screens, a mechanical keyboard, and a good mouse. When I press the key with the laptop up, I suddenly have 3 screens to work with. If I run any long-running process, or I want to listen to music instead of running anything on the laptop, I just press the switch and the top screens connect to the desktop, while the laptop’s own screen below keeps showing me the progress of those processes.

          People talk about working with 2 screens, but I quite enjoy using 3 haha.

            • Fushuan [he/him]@lemm.ee · 3 points · 1 year ago

              No no, the KVM supports 2 HDMI monitors (and I had to use a myriad of adapters to make them fit the DisplayPort and DVI ports of my monitors and the graphics card), and then the laptop itself works as a third monitor in the middle of the other two, where I keep windows I don’t pay much attention to but that are still useful to have open, like Teams, etc.

    • kameecoding@lemmy.world · 4 points · 1 year ago

      Yep, if I game it’s in the living room on the PS5.

      But mostly I’d rather just relax or exercise. Nowadays gaming feels more like escapism that I don’t even enjoy; I just do it to procrastinate.

  • steve228uk@lemmy.world · 20 points · 1 year ago

    I’m literally at the same computer, and I’m still the second picture when I’m working on my own stuff after hours 😂

  • Dylan@lemdro.id · 16 points · 1 year ago

    Can’t wait to leave after 10 hours of bad screen to go home to 10 hours of good screen.

    • MystikIncarnate@lemmy.ca · 10 points · 1 year ago

      Honestly, it’s the only real downside of WFH. Your computer happy place becomes a den of work and pain for the hours between 9 and 5.

      Everything else about working from home is amazing though.

  • Mossy Feathers (She/They)@pawb.social · 6 points · 1 year ago

    The irony is that nowadays the monitors would be swapped. The “good PC” would have a CRT (because most CRTs nowadays are probably in enthusiast rigs), while the “bad PC” would have the common 1080p Dell IPS display.

    On a semi-related note, why are Dell’s IPS panel monitors so ridiculously common? VA and TN panels are a lot cheaper, so I’d think companies wanting to get the most bang for their buck would use those instead. Is it the fact that IPS panels have a decent horizontal viewing angle, so Mr. Micromanager can look over your shoulder and see what you’re doing more easily?

    • w2tpmf@lemmy.world · 5 points · 1 year ago

      Dell produces monitors in much larger numbers, so most distributors like CDW etc. will have them in every warehouse in the country. This makes it much easier to standardize equipment across a large organization when you can always order the exact same SKU for several years in a row.

    • funkajunk@lemm.ee · 4 points · 1 year ago

      Where are you getting this information about CRTs from? I know they get used for old-school emulation, but I’m pretty sure for modern systems a high refresh rate and FreeSync/G-Sync is where it’s at.

      • Mossy Feathers (She/They)@pawb.social · 6 points · 1 year ago

        People who are into older games tend to have a CRT plus a retro rig or a digital-to-analog converter. A lot of older PC games legitimately look nicer on CRTs. Additionally, CRTs can have ludicrously high refresh rates and resolutions; don’t let the 4:3 aspect ratio fool you. High-end CRTs (specifically computer monitors, not TVs) tended to max out at 1600x1200 (vs 1920x1080), giving them a slightly larger vertical resolution at the cost of a lower horizontal resolution, with some going as high as 2048x1536, comparable to 1440p (and yes, CRT computer monitors were mostly progressive scan, not interlaced like TVs). On top of that, the refresh rates on later CRTs tended to start at 75 Hz (vs 60 Hz on LCDs) and could max out at 200 Hz on high-end monitors. You’d sacrifice resolution to do so, though I think you could mitigate some of that by using a BNC cable if your monitor supported it (though I doubt most rigs could run anything even close to 200 fps without decreasing resolution). Finally, CRTs tend to have extremely low response times, very good color depth, and true blacks.
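
        To put rough numbers on that resolution-vs-refresh trade-off, a minimal sketch: a CRT’s horizontal scan rate is the fixed budget, and the vertical refresh is roughly that rate divided by the total scanlines per frame (visible lines plus blanking). The ~125 kHz scan rate and ~40 blanking lines below are illustrative assumptions, not the specs of any particular monitor.

        ```python
        # Vertical refresh is roughly: horizontal scan rate / total scanlines per frame.
        def max_refresh_hz(h_scan_khz: float, visible_lines: int, blanking_lines: int = 40) -> float:
            """Approximate max vertical refresh for a CRT at a given vertical resolution."""
            return h_scan_khz * 1000 / (visible_lines + blanking_lines)

        for lines in (1536, 1200, 1024, 768, 600):
            # assume a high-end tube with a ~125 kHz horizontal scan rate
            print(f"{lines:>4} visible lines -> ~{max_refresh_hz(125, lines):.0f} Hz max")
        ```

        Which is why the really high refresh numbers only show up once you drop the resolution.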

        That said, CRTs are heavy, fragile, and nowadays expensive (before the pandemic you could get a high-end Sony Trinitron 20" PVM (professional video monitor) for like $300-$400, and shipping was more expensive than the monitor; nowadays you’re easily talking $1000 or more). Most LCD panels can beat CRTs in resolution and refresh rate nowadays (though even high-end LCD panels tend to struggle to beat CRT response time), and OLEDs outclass CRTs in almost every way.

        Edit: oh, another weakness of CRTs is that they can burn in. That’s where the term originated. If you left an image on the screen too long, it’d burn into the display, causing it to persist even after the monitor was turned off and unplugged. Since no one’s making CRTs anymore, there’s a smaller and smaller pool of CRTs in good condition, which means they’ll get more expensive until someone decides it’s worth the money to start making the tubes again.

        Edit 2: that’s also why screensavers were a thing! Screensavers were there to stop you from accidentally burning in your monitor. I wonder why they haven’t made a comeback with OLEDs.

    • moosetwin@lemmy.dbzer0.com · 3 points · 1 year ago

      I bet it’s that Dell already has a business relationship with a ton of companies, and the inertia is keeping them common.