
Dust0741

@Dust0741@lemmy.world


Dust0741 OP ,

Hmm interesting. Maybe I'll have to do more testing.

Dust0741 OP ,

Alright, is there any way to have it only select new photos? I.e. I turn on syncing, and every photo after this point in time is uploaded. Or do I have to upload them all?
FYI there is ~200GB of photos/videos on the phone.

Dust0741 ,

I've got a $50 USD 6500T running 25+ Docker containers, including Jellyfin, and it is amazing. It isn't the drive space you want, but pure CLI Linux is very lightweight.
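Something like this is all it takes to run Jellyfin on a box like that (just a sketch using the official image; swap the example host paths for your own config and media directories):

# Jellyfin via Docker: web UI on port 8096, example host paths under /srv
docker run -d \
  --name jellyfin \
  -p 8096:8096 \
  -v /srv/jellyfin/config:/config \
  -v /srv/jellyfin/cache:/cache \
  -v /srv/media:/media:ro \
  --restart unless-stopped \
  jellyfin/jellyfin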

Dust0741 OP ,

No, every version of Minecraft except PlayStation allows connecting to external servers.

Dust0741 OP ,

It is bridged as you described by default. The issue is that the Switch cannot install a VPN client.

Dust0741 OP ,

Huh, very cool. I'm going to have to try this.

Dust0741 ,

GrapheneOS can be used almost identically to stock Android. You can install Google apps and use them, or not. The biggest thing it gives you is the choice.

There is no account associated with GOS. You can log in to an existing Google account etc., just like on any Android.

GOS has Messages for SMS only. It has a Gallery app for photos and a Files app for system files.
There aren't many apps it comes with, so getting alternative apps is easy, mostly via F-Droid (or Droid-ify for a more modern-looking client).
For a better photos app, I recommend Aves.
For a drive app, a private option would be Proton Drive.
A notes app can be anything you want, but GOS doesn't come with one. If you want to use Google's notes app you can. I wouldn't recommend it, but you can. There are lots on F-Droid to choose from.

As for cloud sync, GOS doesn't do this, but again, you can use any other service you'd normally use to sync.
I use Syncthing to sync a folder on my phone to a folder on my PC. That way I can easily have things like my photos on my desktop and have backups.
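If you want the PC end of that running in Docker alongside everything else, a minimal sketch with the official Syncthing image looks like this (the host path is just an example for where Syncthing keeps its config and synced folders):

# Syncthing via Docker: web UI on 8384, sync traffic on 22000/tcp+udp, local discovery on 21027/udp
docker run -d \
  --name syncthing \
  -p 8384:8384 \
  -p 22000:22000/tcp \
  -p 22000:22000/udp \
  -p 21027:21027/udp \
  -v /srv/syncthing:/var/syncthing \
  --restart unless-stopped \
  syncthing/syncthing

After that you pair the phone (Syncthing Android app) and the PC as devices in the web UI and share the photos folder between them.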

As for app stores, GOS doesn't recommend Aurora because they don't sign the apps they provide, but I use it anyway, as it is the best way to get apps without a Google account.

You definitely don't need your own Nextcloud or Homelab. I prefer paying for hardware I own instead of cloud things, but both have good positives.

Also, your questions aren't stupid, they're great! You're just learning about this stuff and that's amazing. Keep learning.

Dust0741 ,

No ad blocker built in, but Rethink DNS is a great app that will set up a local VPN and do firewall and DNS filtering.
There are other apps too, and they should all work on any OS.

Personally, I self-host a VPN and Pi-hole and stay connected to that.
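As a rough sketch of that setup (the server name and addresses here are made-up examples), the client side is just a WireGuard config whose DNS points at the Pi-hole on the VPN network; the same config can be imported into the WireGuard app on a phone:

# Hypothetical addressing: 10.8.0.1 = VPN server running Pi-hole, 10.8.0.2 = this client
sudo tee /etc/wireguard/wg0.conf > /dev/null <<'EOF'
[Interface]
PrivateKey = <client-private-key>
Address = 10.8.0.2/32
DNS = 10.8.0.1

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
AllowedIPs = 0.0.0.0/0, ::/0
EOF

sudo wg-quick up wg0

With all DNS forced through the Pi-hole, every app on the device gets ad and tracker blocking without needing a per-app filter.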

Dust0741 OP ,

As I mentioned, I have a server, and I always stay connected to it over a VPN. That makes using a paid VPN a bit harder. A dedicated VPN IP should fix this issue, but I haven't looked into how difficult that'd be.

Dust0741 OP ,

Yup. Tailscale + Mullvad isn't a bad option, but I'd rather not depend on Tailscale, and a true local connection will always be better.
Plus you then have to pay through Tailscale, which makes you more identifiable.

Dust0741 OP ,

Very similar, yes. Like TrackMeNot, but for any site, not just search engines. Although it may be a good option too.

Dust0741 OP , (edited )

I like this. Is there some sort of existing list of safe sites that I could use in a script?

Edit: something like this

Dust0741 OP ,

Yea. My issue now is finding a list of these sites.

Dust0741 OP ,

That's a good idea.

Probably just a shell script. Someone mentioned using curl, so that'd be pretty easy.

Dust0741 OP , (edited )

Little curl shell script that works:

#!/bin/bash

# Random_Curl_Request.sh
# Requests a random website from a CSV list once a minute.

# CSV file containing websites (URL in the first column)
CSV_FILE="/home/user/Documents/randomSiteVisitor/websites.csv"

while true; do
    # Get a random line from the CSV file
    RANDOM_LINE=$(shuf -n 1 "$CSV_FILE")

    # Extract the website URL from the random line
    WEBSITE=$(echo "$RANDOM_LINE" | cut -d ',' -f 1)

    # Make a curl request to the random website, then wait a minute
    curl "$WEBSITE"
    sleep 60
done
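To keep it running in the background, the simplest option is probably a cron entry at boot (assuming the script lives in the same directory as the CSV above; the log file name is just an example):

# crontab -e
@reboot /home/user/Documents/randomSiteVisitor/Random_Curl_Request.sh >> /home/user/Documents/randomSiteVisitor/random_curl.log 2>&1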
