Well, update on that: after about 3 days of struggling with Nextcloud, I've finally got it set up with a working Collabora CODE (collabora-code) integration.
That lets me have basically my own self-hosted "Google Drive" or "Office 365" experience: I can upload, sync, download, and edit files online in my own self-hosted cloud.
It looks effectively like this: you have a main dashboard with apps along the top left (looks a lot like all those Office 365 or Desire2Learn school/work hosted solutions, doesn't it?):
2023-08-23_11-32.png
And then you can open up docx or other micro$oft files and edit them in the browser; supposedly this can be done live with others as well (still untested, and I guess I must have issues with the spellchecker right now…):
2023-08-23_11-33.png
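(For my own notes: the bit that wires Nextcloud to Collabora is the richdocuments app's WOPI URL. It can be set in the admin UI, or from the command line with occ, roughly like this; the service names and port match the compose sketch further down, so adjust to taste.)

```bash
# Point the Nextcloud Office (richdocuments) app at the Collabora CODE server.
# "nextcloud" is the compose service name and http://collabora:9980 is the
# in-network Collabora URL -- both are placeholders for whatever you actually use.
docker compose exec -u www-data nextcloud \
  php occ config:app:set richdocuments wopi_url --value="http://collabora:9980"

# Sanity check from the docker host: Collabora should answer on its
# capabilities endpoint if the container is up.
curl -s http://localhost:9980/hosting/capabilities
```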
So next I'm going to figure out how to make sure my PiVPN is set up so that these can be accessed remotely. I have them set up as plain HTTP instead of HTTPS, since it wouldn't work the other way.
This is my docker-compose file for it (saving it here since I need to back it up now that it's working, and it took 3 days to figure out):
NOTE: don't use this on anything publicly facing. I have HTTPS turned off; it's "fine" here because I'm planning to only access it remotely via a VPN, not through the public internet.
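(This is a sketch of the general shape of it rather than the exact file; passwords, image versions, and the domain value are placeholders, not my real values.)

```yaml
# docker-compose.yml -- Nextcloud + MariaDB + Collabora CODE, HTTP only (VPN/LAN use)
version: "3"

services:
  db:
    image: mariadb:10.11
    restart: unless-stopped
    command: --transaction-isolation=READ-COMMITTED --log-bin=binlog --binlog-format=ROW
    volumes:
      - db:/var/lib/mysql
    environment:
      - MYSQL_ROOT_PASSWORD=changeme-root
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud
      - MYSQL_PASSWORD=changeme

  nextcloud:
    image: nextcloud:27
    restart: unless-stopped
    depends_on:
      - db
    ports:
      - "8080:80"          # plain HTTP, only reachable on the LAN/VPN
    volumes:
      - nextcloud:/var/www/html
    environment:
      - MYSQL_HOST=db
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud
      - MYSQL_PASSWORD=changeme

  collabora:
    image: collabora/code
    restart: unless-stopped
    ports:
      - "9980:9980"
    environment:
      # allow the Nextcloud host; dots are escaped because this is a regex
      - domain=nextcloud\.home\.lan
      # run Collabora without TLS since everything stays inside the VPN
      - extra_params=--o:ssl.enable=false
    cap_add:
      - MKNOD

volumes:
  db:
  nextcloud:
```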
Now that that's all set up, I need to make sure my WireGuard setup lets remote machines use the Pi-hole as their DNS server, so I can use a locally defined domain to connect to the Nextcloud.
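The rough plan: point the WireGuard clients' DNS at the Pi-hole and add a local DNS record for the Nextcloud box. On the client side the PiVPN-generated profile would look something like this (keys, addresses, and hostnames here are all placeholders):

```ini
# Client-side wg0.conf, trimmed down.
[Interface]
PrivateKey = <client private key>
Address = 10.6.0.2/24
# Use the Pi-hole as the DNS server while the tunnel is up, so names like
# nextcloud.home.lan resolve to the LAN address.
DNS = 192.168.1.10

[Peer]
PublicKey = <server public key>
Endpoint = my-public-hostname.example.com:51820
# Route the VPN subnet and the home LAN through the tunnel.
AllowedIPs = 10.6.0.0/24, 192.168.1.0/24
PersistentKeepalive = 25
```

Then on the Pi-hole side it should just be a Local DNS record (Local DNS → DNS Records in the web UI) mapping nextcloud.home.lan to the server's LAN IP.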
Then it's on to setting up a second Proxmox node with an SSD, migrating this over to it, then reimaging the laptop with an SSD in place of the HDD. I got it a 512 GB SSD for now; maybe I'll up that at some point, but I'm also hoping to get a rack server eventually. Right now it's on a 600 GB HDD, so it'll be a bit of a downgrade in capacity, but I think the SSD will be worth it.
Right now, honestly, browsing Nextcloud isn't too bad off the HDD.
In other news from my self-hosting adventure, I also set up Gitea.
That was way easier and faster than Nextcloud and almost too easy.
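How easy? Something like this single-container compose (using the bundled SQLite database) is enough to get it going; not necessarily my exact file, but it gives the idea:

```yaml
# docker-compose.yml -- minimal Gitea, using the bundled SQLite database
version: "3"

services:
  gitea:
    image: gitea/gitea:1.20
    restart: unless-stopped
    environment:
      - USER_UID=1000
      - USER_GID=1000
      - GITEA__database__DB_TYPE=sqlite3
    volumes:
      - gitea:/data
    ports:
      - "3000:3000"   # web UI
      - "2222:22"     # SSH for git pushes/pulls

volumes:
  gitea:
```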
That one I don't consider ready for production just yet. I'm waiting until I get the second machine up, and then I'll probably look into having Gitea pull over all my private repos from GitLab. It looks like that is a thing, at least.
I may keep my GitLab around and see if I can't get Gitea to mirror changes back up to GitLab as well, so I have a redundant backup of any code projects I care about.
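From what I've read, the pull side is Gitea's "New Migration" page, or the same thing over its API with the mirror option turned on so it keeps syncing. Something like this, where the hostnames, repo names, and tokens are obviously placeholders:

```bash
# Ask Gitea to migrate a private GitLab repo and keep it as a pull mirror.
# GITEA_TOKEN is a Gitea API token; GITLAB_TOKEN is a GitLab personal access
# token with read access to the repository.
curl -X POST "http://gitea.home.lan:3000/api/v1/repos/migrate" \
  -H "Authorization: token ${GITEA_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
        "clone_addr": "https://gitlab.com/myuser/myproject.git",
        "auth_token": "'"${GITLAB_TOKEN}"'",
        "repo_name": "myproject",
        "mirror": true,
        "private": true
      }'
```

The other direction (pushing changes back up to GitLab) looks like it's covered by Gitea's push-mirror option in a repo's settings, but I haven't tried that yet.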
The one thing I want to make sure I can do is back up my crap out of these containers, since right now it seems non-trivial to figure out how to get just the files out of the volumes.
For now I am taking snapshots as necessary to basically save the state of the containers, but having a way to get full backups of just the files/data would be good. I don't necessarily want a full image/backup of the machine that's running these things; I just want a way to pull the file copies out of there and back them up somewhere, without dragging the whole container image along with them.
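The most promising approach I've seen for that is the usual throwaway-container trick: mount the named volume read-only into a small container and tar it out to the host. Roughly like this (volume and service names here match the compose sketch above; check `docker volume ls` for the real names, since compose prefixes them with the project name):

```bash
# Dump the contents of a named docker volume to a tarball on the host.
docker run --rm \
  -v nextcloud_nextcloud:/source:ro \
  -v "$(pwd)":/backup \
  alpine tar czf "/backup/nextcloud-files-$(date +%F).tar.gz" -C /source .

# The database is the exception: dump it with mysqldump instead of copying
# raw files, so the backup is consistent (ideally with Nextcloud in
# maintenance mode first: occ maintenance:mode --on).
docker compose exec -T db \
  sh -c 'exec mysqldump -u nextcloud -p"$MYSQL_PASSWORD" nextcloud' \
  > "nextcloud-db-$(date +%F).sql"
```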