• huginn@feddit.it · 5 months ago

    Resources are just way cheaper than developers.

    It’s a lot cheaper to double the RAM than it is to pay someone to optimize your code.

    And if you’re working with code that requires that serious a level of resource optimization, you’ll invariably end up with low-level code libraries that are hard to maintain.

    … But fuck the Always on internet connection and DRM for sure.

    • rbn@sopuli.xyz · 5 months ago

      If you consider only the RAM on the developers’ PCs, maybe. If you factor in thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. And if a new Windows feature forces millions to buy new hardware, that’s pretty disastrous from a sustainability point of view.

      • huginn@feddit.it · 5 months ago

        Last time I checked, your personal computer wasn’t a company cost.

        Until it is, nothing changes - and to be totally frank, the last thing I want is a corporate machine at home.

        • CosmicTurtle0@lemmy.dbzer0.com · 5 months ago

          When I was last looking for a fully remote job, a lot of companies gave you a “technology allowance” every few years where they give you money to buy a computer/laptop. You could buy whatever you wanted but you had that fixed allowance. The computer belonged to you and you connected to their virtual desktops for work.

          Honestly, I see more companies going in this direction. My work laptop has an i7 and 16GB of RAM. All I do is use Chrome.

          • huginn@feddit.it · 5 months ago

            It’d be nice to have that, yeah. My company issued me a laptop with only 16 GB of RAM and expected me to build Android projects on it.

            Idk if you know Gradle builds, but a multi-module project regularly consumes 20+ GB of RAM during a build. Even though the cost difference would have been paid for in productivity gains within a month, it took 6 months and a lot of fighting to get a 32 GB laptop.

            My builds immediately went from 8-15 minutes down to 1-4.
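            For context, the Gradle daemon’s memory budget is configured in `gradle.properties`; a minimal sketch of the kind of settings involved, with illustrative values rather than a recommendation:

            ```properties
            # gradle.properties - illustrative values, tune to your machine
            # Heap for the Gradle build JVM; multi-module builds thrash or fail when this is too low
            org.gradle.jvmargs=-Xmx20g -XX:MaxMetaspaceSize=1g
            # Build independent modules in parallel (each worker adds memory pressure)
            org.gradle.parallel=true
            # Reuse task outputs across builds instead of recomputing them
            org.gradle.caching=true
            ```

            The trade-off is exactly the one described above: parallelism and a bigger heap cut build times dramatically, but only if the machine has the RAM to back them.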

            • CosmicTurtle0@lemmy.dbzer0.com · 5 months ago

              I always felt that this is where cloud computing should be. If you’re not building all the time, then 32GB is overkill.

              I know most editing and rendering of TV shows happens on someone’s computer and not in the cloud, but wouldn’t it be more efficient to push the work to the cloud, where you can spin up instances with a ton of RAM?

              I have to believe this is a thing. If it isn’t, someone should take my idea and then give me a slice.

              • huginn@feddit.it · 5 months ago

                It’s how big orgs like Google do it, sure. Working there, I had 192 GB of RAM on my cloudtop.

                That’s not exactly reducing the total spend on dev RAM, though - quite the opposite. It’s giving devs access to more RAM than you could ever fit in a single device.

                But you can’t have it both ways: you can’t bitch and moan about “always-on internet connections” and simultaneously push for an always-on, internet-connected IDE to do your builds.

                I want to be able to work offline whenever I need to. That’s not possible if my resource-starved terminal requires an internet connection to run.

                RAM is dirt cheap and only getting cheaper.

          • Ardyssian@sh.itjust.works · 5 months ago

            Alternatively, they could just use Windows VDI and hand you a card + card reader for Remote Desktop connections, avoiding the hardware cost entirely - which is what my company is doing. Sigh.

    • 2xsaiko@discuss.tchncs.de · 5 months ago

      You can also build a chair out of shitty plywood, instead of quality cut wood, that falls apart when someone who weighs a bit more sits on it. I mean, fine if you want to make a bad product, but then you’re making a bad product.

      • huginn@feddit.it · edited · 5 months ago

        Resource optimization has nothing to do with product quality. Really good experiences can ship with shitty resource consumption, and really bad experiences can be blisteringly well optimized.

        The reason programmers work in increasingly abstract languages is to do more with less effort, at the cost of less efficient resource utilization.

        RollerCoaster Tycoon was written in assembly; Slay the Spire in Java. They’re both excellent games.

        • 2xsaiko@discuss.tchncs.de · 5 months ago

          Yeah, I don’t really have a problem with games, except for the stuff added on purpose just to make the user experience worse, like DRM. I was thinking more about trends like using Electron for desktop development.