As always, please ensure you stop your Jellyfin server and take a full backup before upgrading!

  • Einar@lemm.ee · 6 months ago

    As always, please ensure you stop your Jellyfin server and take a full backup before upgrading!

    Now, if only there were a simple, built-in way to back up/export and restore/import all settings and other data, so that every platform could do this easily, without having to search the internet for which folders to back up…

    FYI, this is the best we have atm (which is pretty terrible). Please correct me if there is a better way:

    How to backup a JF instance?

    Jellyfin Docs: Migrating
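
    For reference, this is roughly what the manual backup boils down to on a typical Linux package install. The paths below are the usual Debian/Ubuntu defaults and are an assumption on my part; adjust them for your platform, and stop the server first so the SQLite databases are consistent:

      # Minimal sketch of a manual Jellyfin backup on a typical Linux install.
      # Paths are common Debian/Ubuntu package defaults (assumptions, adjust as needed);
      # run this with the Jellyfin service stopped.
      import tarfile
      from datetime import datetime

      JELLYFIN_DIRS = [
          "/etc/jellyfin",        # configuration
          "/var/lib/jellyfin",    # databases, metadata, plugins
          "/var/cache/jellyfin",  # optional: image/transcode cache
      ]

      with tarfile.open(f"jellyfin-backup-{datetime.now():%Y%m%d}.tar.gz", "w:gz") as tar:
          for directory in JELLYFIN_DIRS:
              tar.add(directory)

    Restoring is the reverse: stop the server, unpack the archive over the same paths, start it again.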

    • kakes@sh.itjust.works · 6 months ago

      I run JF in a docker container, and although I don’t have backups of my config files yet (because I don’t really care about setting up from scratch if need be), it would be trivial to simply back up the mounted config volumes. Makes upgrading safe and easy, too.

      That’s probably how I would recommend going about this, personally.
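
      Something like this is all it takes, assuming a compose project with bind mounts at ./config and ./cache (the service name and mount paths are just examples, not Jellyfin defaults):

        # Rough sketch: stop the container, snapshot the bind-mounted volumes, start it again.
        # Service name and mount paths are assumptions for illustration.
        import subprocess
        import tarfile
        from datetime import datetime

        subprocess.run(["docker", "compose", "stop", "jellyfin"], check=True)
        try:
            archive = f"jellyfin-config-{datetime.now():%Y%m%d-%H%M%S}.tar.gz"
            with tarfile.open(archive, "w:gz") as tar:
                tar.add("config")   # settings and SQLite databases
                tar.add("cache")    # optional, but speeds up recovery
        finally:
            subprocess.run(["docker", "compose", "start", "jellyfin"], check=True)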

      • SuitedUpDev@feddit.nl · 6 months ago

        Theoretically, support for that could be coming… Emby (which Jellyfin is based on) always used its own custom layer for interacting with a SQLite database. All that custom-made logic is currently being swapped out for EF Core, a .NET library for interacting with databases that also supports MySQL, PostgreSQL and SQL Server besides SQLite.

        So my guess is that, once all that work is completed, support for other databases can be added.

        For a little bit of context: I am currently running Jellyfin on Btrfs, and there is quite a performance impact due to CoW. If two clients decide to browse the libraries at the same time, both grind to a near standstill when it comes to actually seeing anything. So I am following this work with quite some interest.

        • Laser@feddit.de · 6 months ago

          I am currently running Jellyfin on Btrfs, and there is quite a performance impact due to CoW. If two clients decide to browse the libraries at the same time, both grind to a near standstill when it comes to actually seeing anything.

          CoW is not recommended for databases; all DB servers advise turning it off for the actual database files. You’ll run into the same issue with a dedicated database server if you leave CoW on, I guess. You could also disable CoW for Jellyfin’s database directory right now, and performance should increase.
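
          If you want to try that, a rough sketch (the path is the typical Linux package default and an assumption; stop Jellyfin first). Note that chattr +C only affects files created after the flag is set, which is why the existing databases get rewritten as fresh copies:

            # Disable btrfs CoW for Jellyfin's database directory, then rewrite the
            # existing database files so they pick up the NOCOW attribute.
            import shutil
            import subprocess
            from pathlib import Path

            DATA_DIR = Path("/var/lib/jellyfin/data")  # assumed default, adjust to your install

            # New files created in the directory (including WAL/journal files) inherit +C.
            subprocess.run(["chattr", "+C", str(DATA_DIR)], check=True)

            for db in DATA_DIR.glob("*.db*"):          # library.db, jellyfin.db, -wal, -shm, ...
                tmp = db.with_name(db.name + ".nocow")
                shutil.copy2(db, tmp)                  # fresh copy inherits NOCOW from the directory
                tmp.replace(db)                        # swap the copy into place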

          I also follow the progress towards a dedicated DB, but on the other hand I don’t know how much sense it makes architecturally. The likelihood that you have multiple Jellyfin server instances accessing the same database is low; after all, there is info in there that is very specific to the server, like file paths. Even a plain migration is already not easy, so how likely is sharing the database live? And if each database is specific to an instance, why not use SQLite (like it’s done right now) and allow for more specific parameter tuning, like how much memory it may use?
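
          To illustrate the kind of per-instance tuning SQLite already offers, something like this (run it against a copy of the database, not the live one; the path is assumed and the values are made up):

            # Example SQLite tuning knobs via Python's sqlite3; values are illustrative only.
            import sqlite3

            conn = sqlite3.connect("/var/lib/jellyfin/data/library.db")  # use a copy, not the live DB
            conn.execute("PRAGMA journal_mode = WAL")      # readers no longer block the writer
            conn.execute("PRAGMA synchronous = NORMAL")    # fewer fsyncs, still safe with WAL
            conn.execute("PRAGMA cache_size = -262144")    # negative value = size in KiB, so ~256 MiB
            conn.execute("PRAGMA mmap_size = 1073741824")  # memory-map up to 1 GiB of the file
            print(conn.execute("PRAGMA journal_mode").fetchone())
            conn.close()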

      • stevestevesteve@lemmy.world · 6 months ago

        That’s my absolute #1 wish for JF. I’m sure it’s hard work and people are on it; it excites me to think about.

    • deweydecibel@lemmy.world · 6 months ago

      I believe they’re suggesting just doing a full backup of your system/Docker container. Which isn’t ideal, but I think they’re trusting people who can run a Jellyfin server to be able to use the scripts.

      • Einar@lemm.ee · 6 months ago

        Sure. But what if Docker is not available on a machine? What if the import needs to happen on a Linux machine, coming from a Windows install? What if I want to sync two installations on different OSs?

        I know it’s all doable, but not easy, let alone foolproof. It’s so easy to install, but genuinely not easy to keep safe without tech knowledge.

        • exu@feditown.com · 6 months ago

          Syncing two instances sounds like a fun challenge. I think there’s some project to replicate an SQLite DB over the network. Similarly, you could use Ceph or other distributed storage for the media.

          I built something like this for Nextcloud a few years back, fun times.
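
          As a toy version of the idea, SQLite’s online backup API can snapshot a live database so another host can pull the copy; this is only a sketch with made-up paths, and real replication projects are far more robust:

            # Periodically snapshot a live SQLite database with the online backup API.
            import sqlite3
            import time

            SOURCE = "/var/lib/jellyfin/data/library.db"   # assumed path
            SNAPSHOT = "/srv/replica/library.db"           # somewhere another host can sync from

            while True:
                src = sqlite3.connect(SOURCE)
                dst = sqlite3.connect(SNAPSHOT)
                src.backup(dst)    # consistent copy, even while the source is being written
                dst.close()
                src.close()
                time.sleep(300)    # every five minutes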