• 4 Posts
  • 117 Comments
Joined 2 years ago
Cake day: December 12th, 2023

  • I had a website that was set up only for my personal use. According to the logs, the only activity I ever saw was my own. However, it involves a compromise: obscurity at the cost of accessibility and convenience.

    First, when I set up my SSL cert, I chose to get a wildcard subdomain cert. That way I could use a random subdomain name and it wouldn’t show up on https://crt.sh/

    Second, I use an uncommon port. My needs are very low so I don’t need to access my site all the time. The site is just a fun little hobby for myself. That means I’m not worried about accessing my site through places/businesses that block uncommon ports.

    Accessing my site through a browser looks like: https://randomsubdomain.domainname.com:4444/
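    As a rough sketch, a setup like that could look something like this in a Caddyfile. Everything here is a placeholder (domain, port, site root), and the wildcard DNS challenge assumes Caddy was built with a DNS plugin such as caddy-dns/desec:

```
# Hypothetical Caddyfile: wildcard cert via DNS challenge, served on an
# uncommon port. Domain, port and paths are made-up examples.
*.example.com:4444 {
	tls {
		dns desec {env.DESEC_TOKEN}
	}

	@mysite host randomsubdomain.example.com
	handle @mysite {
		root * /srv/www
		file_server
	}

	# Any other subdomain hitting the wildcard gets dropped.
	handle {
		abort
	}
}
```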

    I’m going on the assumption that scrapers and crawlers search common ports to maximize the number of sites they can access, rather than wasting their time searching uncommon ports.

    If you are hosting on common ports (80, 443), then this isn’t going to be helpful at all, and you would likely need some sort of third party to manage scrapers and crawlers. For me, I get to enjoy my tiny corner of the internet with minimal effort and worry. Except my hard drive died recently, so I’ll pick up again in January when I am not focused on other projects.

    I’m sure given time, something will find my site. The game I’m playing is seeing how long it would take to find me.



  • There are a few things I back up from my phone.

    • Music downloaded from Seeker
    • Youtube audio downloaded from YTDLnis
    • Backups of Termux
    • Notes in plain text
    • Backups from certain apps that make their own backup data
    • Pictures that I have sorted and want to save

    I have an Android phone so I use Termux as a terminal emulator. I use ssh and passwordless keys to make transfers simpler and quicker.

    This is closer to a backup process than to something like Syncthing, which syncs a folder between two devices. I don’t believe rsync is capable of acting like Syncthing, but I’m tempted to dig into rsync more and see if I can put something basic together one day.




  • She called someone who she trusts in a time when she needed support. I made sure she was safe, calm and around people she felt safe being around for the rest of the night.

    Before that night, we became our own little mental health support group so of course I’m going to make sure she feels safe after dealing with some unhinged shit like how that guy was acting.

    Also she lives in a different country, her life is her own as much as my life is my own in my own country. It’s possible for two people to be friends, care for each other and not expect to be in an intimate or romantic relationship with each other.




  • I’ve been writing POSIX scripts as a sort of hobby and don’t really have any Bash experience, but I think I can still give some insight. Hopefully what I say is accurate; this is what I’ve learned so far.

    POSIX is a standard. To put it as simply as possible, it sets the minimum requirements for the environment, programs, commands and the options for those commands, with the goal of making those commands as portable as possible. That way a POSIX script will work on any POSIX-compliant system. For example, a POSIX script could work on Arch, Debian, a Raspberry Pi or even Mac products. In theory it could work on Windows too. If an operating system ships with a POSIX-compliant shell, you are very likely able to run a POSIX script.

    Bash is a shell, but it has a bunch of features that expand beyond the basic feature set defined by the POSIX standard. Bash also has more features and flexibility for scripting, which is why it’s so common to see Bash scripts. Those scripting features are usually referred to as “bashisms.” Since Bash expands on POSIX scripting, a Bash script can look similar to a POSIX script but would not work as intended if you ran it outside of a Bash shell.
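    As a tiny made-up example of a bashism, here is the same string test written both ways:

```shell
name="foobar"

# Bashism: [[ ]] with pattern matching only works in Bash (and ksh/zsh):
#   [[ $name == foo* ]] && echo "match"
# In plain sh that line errors out, because [[ is not a POSIX construct.

# POSIX-portable equivalent using case:
case $name in
    foo*) echo "match" ;;
    *)    echo "no match" ;;
esac
```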

    Most modern OSes are likely to have Bash installed, so you most likely don’t need to worry about anything. However, Bash is not a standard and is not required to be installed on every system.

    If you care about your script working on as many systems as possible, without worrying about which shell is being used, you will probably prefer writing a POSIX-compliant sh script.

    Since POSIX shells and scripts work at a much more basic level, they can lack some depth, and finding workarounds for issues can start to look unreadable/insane. A good example is how arrays are handled: POSIX is limited to one array, where Bash has much better support for arrays.
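    That one array in POSIX sh is the list of positional parameters, which you can overwrite with set --. A tiny illustration:

```shell
# POSIX sh's only real "array": the positional parameters ($1, $2, ...).
set -- apple banana cherry
echo "count: $#"            # prints: count: 3
for item in "$@"; do
    echo "$item"
done

# Bash, by contrast, has named arrays:
#   fruits=(apple banana cherry)
#   echo "${fruits[1]}"     # banana (arrays are 0-indexed)
```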

    There are advantages to using either but with the popularity of Bash, it’s not really that big of a deal in the end.


  • My guild had so many tanks and off-tanks that I was always last pick as a tank, but I still attended most raids. I made myself useful by collecting every alchemy, cooking and first aid recipe, along with doing damage and tanking for “oh shit” moments.

    As a warrior, I had access to every type of gear, so I used my first points on Onyxia bags for all my bag slots. I carried random gear like the underwater breathing staff, a huge number of potions (especially running potions), a wedding dress and a flame-enchanted broom to beat people with while wearing the wedding dress. I had so many gimmick items to amuse people during any down time.

    I had so many points that I suddenly went from a mix of random gear to a mix of really good random gear. It was fun to be a menace in PvP before PvP gear became the norm. Healers loved me because they liked playing with my life and I was always quick to protect them. Enemies hated me because I’d get all the heals, or I’d be the most annoying mosquito if they attacked my healers.

    Lots of good memories from that time, but MMOs never hit the same after that game. The people I met during that time were what made it all special.


  • I remember starting a Hunter because they could have pets but got real bored real quick. It felt too easy. After a bit of research, I changed to a warrior. At launch, the warrior was the most underpowered class.

    Solo levelled my way to 60. It took me twice as long to get there because I kept going on adventures. Made it to Gadgetzan at some ridiculously low level (after many deaths). I also found a bunch of easter eggs before hitting 60.

    I was allowed to be a DPS warrior in raids and at one point was matching or outdoing Rogues for damage. Used to speed run Stratholme and Scholomance as a fury Warrior because my healers loved the chaotic challenge of keeping me alive.

    I had so much fun playing my own way and that probably contributed to why I had such good friends in the guild during that time. I had to quit because the expansions kept adding too much grind and it sucked having all that hard earned gear become pointless every new expansion :(


  • I have two systems that sort of work together.

    The first system involves a bunch of text files for each task: OS installation, basic post-install tasks, and a file for each program I add (like UFW, AppArmor, ddclient, Docker and so on). They basically look like scripts with comments. If I want to, I can just copy/paste everything into a terminal and reach a specific state that I want to be at.

    The second system is a sort of “skeleton” file tree that only contains all the files that I have added or modified.

    Here's an example of what my server skeleton file tree looks like:
    .
    ├── etc
    │   ├── crontabs
    │   │   └── root
    │   ├── ddclient
    │   │   └── ddclient.conf
    │   ├── doas.d
    │   │   └── doas.conf
    │   ├── fail2ban
    │   │   ├── filter.d
    │   │   │   └── alpine-sshd-key.conf
    │   │   └── jail.d
    │   │       └── alpine-ssh.conf
    │   ├── modprobe.d
    │   │   ├── backlist-extra.conf
    │   │   └── disable-filesystems.conf
    │   ├── network
    │   │   └── interfaces
    │   ├── periodic
    │   │   └── 1min
    │   │       └── dynamic-motd
    │   ├── profile.d
    │   │   └── profile.sh
    │   ├── ssh
    │   │   └── sshd_config
    │   ├── wpa_supplicant
    │   │   └── wpa_supplicant.conf
    │   ├── fstab
    │   ├── nanorc
    │   ├── profile
    │   └── sysctl.conf
    ├── home
    │   └── pi-user
    │       ├── .config
    │       │   └── ash
    │       │       ├── ashrc
    │       │       └── profile
    │       ├── .ssh
    │       │   └── authorized_keys
    │       ├── .sync
    │       │   ├── file-system-backup
    │       │   │   ├── .sync-server-fs_01_root
    │       │   │   └── .sync-server-fs_02_boot
    │       │   └── .sync-caddy_certs_backup
    │       ├── .nanorc
    │       └── .tmux.conf
    ├── root
    │   ├── .config
    │   │   └── mc
    │   │       └── ini
    │   ├── .local
    │   │   └── share
    │   │       └── mc
    │   │           └── history -> /dev/null
    │   ├── .ssh
    │   │   └── authorized_keys
    │   ├── scripts
    │   │   ├── automated-backup
    │   │   └── maintenance
    │   ├── .ash_history -> /dev/null
    │   └── .nanorc
    ├── srv
    │   ├── caddy
    │   │   ├── Caddyfile
    │   │   ├── Dockerfile
    │   │   └── docker-compose.yml
    │   └── kiwix
    │       └── docker-compose.yml
    └── usr
        └── sbin
            ├── containers-down
            ├── containers-up
            ├── emountman
            ├── fs-backup-quick
            └── rtransfer
    

    This is useful to me because I can keep track of every change I make. I even have it set up so I can use rsync to quickly chuck all the files into place after a fresh install or after adding/modifying files.

    I also created and maintain a “quick install” guide so I can install a fresh OS, rsync all the modified files from my skeleton file tree into place, then run through all the commands in my quick install guide to get myself back to the same state in a minimal amount of time.
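    A rough sketch of that rsync step, with made-up paths and hostnames; the -n dry-run flag is kept on so nothing is written until it is removed deliberately:

```shell
# Hypothetical restore step: overlay the skeleton tree onto a target root
# (a mounted fresh install, or root@host:/ over ssh).
restore_skeleton() {
    skel=$1     # e.g. "$HOME/skeleton/server" (placeholder)
    target=$2   # e.g. /mnt/newroot/ or root@pi-server:/
    # -a archive, -v list files, -n dry run: remove -n when ready
    rsync -avn "$skel/" "$target"
}

# restore_skeleton "$HOME/skeleton/server" root@pi-server:/
```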


  • I do want to write up a guide about how to set up Caddy + DeSec.io, but I don’t have the time at the moment. If you have any questions, feel free to ask. I can try to help where I can.

    I’ll leave you this previous post I made, you might find some additional information in there if you get stuck. https://lemmy.dbzer0.com/post/51117983

    Also, someone suggested using a wildcard cert to allow any subdomain name. I chose to learn and use that because it helps obscure my services. If you have any interest in security, it might interest you. In terms of security, it’s not an absolute way to protect yourself, but I think it helps when combined with other security measures. If you read the comments in the post, you should get some more insight about it.


  • True.

    My self-hosting strategy is wildly alternative and not one I speak much about publicly. I’m the only person connecting to my own domain so as long as I continue to practice shutting the fuck up, I can get away with using multiple layers of obscurity rather than fiddling with third party solutions.

    I check my logs daily and the only activity I ever see is my own. Since I am not hosting anything critical or sensitive, I have the opportunity to experiment this way without much risk to myself.

    The way I’m set up, I am not concerned with DDoS attacks because they would fail to get past the dynamic DNS. If I were hosting a social media platform or something more public, then I would need to take stronger measures to protect myself and that data.



  • Even though I don’t host anything important, I’m still glad I found alternative ways to host my own stuff without using any Cloudflare services.

    I’ve noticed over time that the self-hosted communities have been suggesting Cloudflare Tunnels less and less since Trump and his gang took over America. Maybe this latest outage will push more people to not recommend Cloudflare again in the future.

    I still remember when I first got into self-hosting and being mocked pretty hard for questioning the use of such a large centralized service like Cloudflare. I’m glad I persisted and kept learning in my own direction but that still was very demotivating at the time.


  • I actually started with RPi’s. The first one, a used Pi 4b, is dedicated only to HomeAssistant. I don’t tinker with it anymore because it does what I want and I don’t want unexpected downtime when I have to use the bathroom or use the lights in my room.

    I bought a used Pi5 with the intention of upgrading later. In life I am quite minimal and find joy in using what little tools and material I have to create something new. That seems to hold true for technology and scripting too. The RPi5 with an old USB3 HDD is actually way more power than I can currently use, or can imagine using, for a long time. The extra room to work is convenient though.

    I’ll have a look into some of the places you suggested, those seem like the places to draw good inspiration from, thank you.


  • I don’t have much experience with curl. From what I understand, it’s an old but constantly maintained command line tool. If you type curl https://www.google.com/ in your terminal of choice, you should get the text of Google’s search page in return. That’s if the curl command is installed on your system, which it most likely is.

    You won’t be able to interact with it since it’s plain text, but you can see how the page is written in HTML before it gets rendered into the website you would normally see in a web browser.

    When it comes to terminal commands, I find it helpful to do web searches using linux <command name>. For example, linux curl will lead me to many sites that help explain the command and give multiple examples of how to use it.

    Once you get more experienced with using a terminal, the command options --help or -h will give you information that can help you use the command. For example, curl --help.

    There’s also manual pages, or man pages that give a more technical look at commands within your terminal of choice. You can access them with man <command name>. Example: man curl.

    In the case of federation, every platform that federates uses a communication protocol called ActivityPub. Simplified, it functions like email, but instead of private messages it transfers public social media content. Microblogging and threaded-conversation platforms can communicate with each other using ActivityPub, but the information exchanged between the two kinds of platform differs slightly. That’s how we get quirks like this when two different ActivityPub platforms communicate with each other.
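    As a small illustration, you can ask a server for that ActivityPub JSON directly with curl by setting the Accept header; the helper name and URL here are placeholders:

```shell
# Hypothetical helper: fetch a URL's ActivityPub JSON instead of its HTML.
# Servers that speak ActivityPub return the machine-readable object when
# asked for the application/activity+json media type.
ap_fetch() {
    url=$1
    curl -sL -H 'Accept: application/activity+json' "$url"
}

# Example (placeholder URL):
# ap_fetch 'https://lemmy.example/post/12345'
```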



  • I started out rewriting my network backup scripts, only to realize I was adding functionality to a previous script I wrote to automatically mount and unmount LUKS-encrypted volumes. I still want to type in my LUKS passphrase because I don’t want everything automated; I prefer to keep inconvenience as an additional security measure for some of my data.
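    A minimal sketch of that kind of flow, with made-up device and mount names; cryptsetup open prompts for the passphrase interactively, which keeps the human in the loop on purpose:

```shell
# Hypothetical manual mount/unmount for a LUKS-encrypted backup volume.
mount_backup() {
    dev=$1     # e.g. /dev/disk/by-uuid/XXXX (placeholder)
    name=$2    # mapper name, e.g. backup
    mnt=$3     # mount point, e.g. /mnt/backup
    [ -n "$dev" ] && [ -n "$name" ] && [ -n "$mnt" ] || return 1
    # cryptsetup open asks for the LUKS passphrase here
    cryptsetup open "$dev" "$name" && mount "/dev/mapper/$name" "$mnt"
}

unmount_backup() {
    name=$1
    mnt=$2
    umount "$mnt" && cryptsetup close "$name"
}

# mount_backup /dev/sdb1 backup /mnt/backup
# ... run the backup ...
# unmount_backup backup /mnt/backup
```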

    I also recently came to the realization that the reason I don’t relate strongly to other self-hosters is that I’ve unknowingly been trying to create a minimal self-hosted system better suited to small, low-powered devices.

    I’ve been using Alpine Linux, I install only bare, older but well-established tools, and I’ve been creating scripts based solely on those tools instead of seeking out bigger, more complicated modern tools. For example, creating workflows using only rsync, or using https://github.com/RayCC51/BashWrite to build a static blog site with just Bash and GNU sed.
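    As a toy illustration of that sed-only idea (the function and markup rules here are made up, and far simpler than what BashWrite actually does):

```shell
# Toy static-blog step: turn a plain-text post into an HTML fragment
# using nothing but sed.
to_html() {
    sed -e 's|^# \(.*\)|<h1>\1</h1>|' \
        -e 's|^$|<br>|'
}

# printf '# My Post\n\nhello world\n' | to_html
```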

    At least now that I’m aware of this, I can keep an eye out for such projects or communities and would hopefully be able to contribute something in that direction.