Discord… still isn’t public?
They’re certainly talking about it, but they haven’t announced a date yet.
Apologies, I struck the lines out of my previous comment. It was simply an example of how you can still be captured.
Running Linux would block this feature too.
Keep in mind that you can still be captured by this feature indirectly. Discord, for example, certainly doesn’t intend to do anything to hide your messages; they recently went public, so in their eyes, the more tracking the better.
And the backups.
This is the Firefox extension I use. I would check the headers your browser passes with WhoAmI to verify your user-agent; alternatively, you can use Invidious to get around YouTube’s bullshit.
I host a public Invidious instance for folks with a Canadian IP - https://inv.halstead.host/
I have to use Chrome to access a couple of sites that don’t play nice with Firefox.
I bet those sites will play nice if you switch your user-agent to display as Chrome.
How is the art a positive?
While true, they still collect data on the results. Hosting your own instance can prevent you from hitting rate limits as often.
- SearxNG (Google Privacy frontend)
SearXNG is more than just a frontend for Google Search; it’s an aggregator that, if configured properly, can collect results from Bing, Startpage, Wikipedia, DuckDuckGo, and Brave.
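For reference, engines are toggled in SearXNG’s settings.yml. A rough sketch of enabling a few extra sources (the key layout follows SearXNG’s engine list format; exact defaults vary by version):

```yaml
# settings.yml excerpt: enable extra engines alongside Google
engines:
  - name: bing
    disabled: false
  - name: startpage
    disabled: false
  - name: duckduckgo
    disabled: false
  - name: brave
    disabled: false
```

Results from all enabled engines get merged and ranked on one page.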
https://freetubeapp.io/ comes packed with DeArrow and SponsorBlock, and natively uses the Invidious or Piped APIs for playback.
Yes, back up your stuff regularly; don’t be like me and break your partition table with a 4-month gap between backups. Redoing 4 months of work in 5 hours is not fun.
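The easiest way to avoid that gap is to not rely on remembering. A hedged sketch of automating it with cron and rsync (the paths are placeholders, point them at whatever you actually care about):

```cron
# Crontab entry: mirror the compose folder to a backup drive nightly at 03:00
0 3 * * * rsync -a --delete /home/you/compose/ /mnt/backup/compose/
```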
So why would you not write out the full path?
The other day my Raspberry Pi decided it didn’t want to boot up. I guess it didn’t like being hosted on an SD card anymore, so I backed up my compose folder and reinstalled Raspberry Pi OS under a different username than my last install.
If I specified the full path on every container, it would be annoying to redo them all if I decided to move to another directory/drive or change my username.
As others stated, it’s not a bad way of managing volumes. In my scenario I store all volumes in a /config folder.
For example, on my SearXNG instance I have a volume like so:
```yaml
services:
  searxng:
    …
    volumes:
      - ./config/searx:/etc/searxng:rw
```
This keeps the files for SearXNG two folders away. I also store these in the /home/YourUser directory so Docker avoids needing sudo access whenever possible.
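The payoff of the relative ./config path is that the whole stack moves as one unit. A sketch using throwaway /tmp paths (no Docker needed, just to illustrate the idea):

```shell
# Simulate a compose folder with a relative ./config volume path
mkdir -p /tmp/stack/config/searx
echo "server: {}" > /tmp/stack/config/searx/settings.yml
# Relocate the whole stack to a "new drive"; nothing inside needs editing
mv /tmp/stack /tmp/newhome
cd /tmp/newhome
ls ./config/searx/   # the relative ./config path still resolves
```

With absolute paths, that move would mean editing every volume line in every compose file.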
Would this be the GIF killer? If PNG can support a comparable frame count and duration with even marginally better image quality, it just may.
Grandma probably doesn’t do the actual torrenting herself; chances are OP has an Overseerr or Jellyseerr type of setup. Grandma makes the request and things just flow.
“Technically” my Jellyfin is exposed to the internet; however, I have Fail2Ban set up blocking every public IP and whitelisting only the IPs I’ve verified.
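For anyone curious what that looks like, a rough jail.local sketch; the jail name, log path, and IPs are assumptions for illustration, not my actual config, and the `jellyfin` filter would need its own filter file:

```ini
[jellyfin]
enabled  = true
port     = 8096
filter   = jellyfin
logpath  = /var/log/jellyfin/*.log
maxretry = 1
bantime  = -1
ignoreip = 127.0.0.1/8 203.0.113.10   # verified IPs go here
```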
I use GeoBlock for the services I want exposed to the internet; however, I should also set up Authelia or something along those lines for further verification.
Reverse proxy is Traefik.
If you aren’t already familiar with the Docker Engine, you can use Play With Docker to fiddle around and spin up a container or two using the docker run command. Once you get comfortable with the command structure, you can move on to Docker Compose, which makes handling multiple containers easy using .yml files.
Once you’re comfortable with Compose, I suggest working into reverse proxying with something like SWAG or Traefik, which let you put a domain in front of the IP, handle SSL certificates, and offer plugins that give you more control over how requests are handled.
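With Traefik, for example, the proxying is configured through container labels. A hedged sketch (the domain and the `letsencrypt` resolver name are placeholders you’d define in Traefik’s own config):

```yaml
services:
  whoami:
    image: traefik/whoami
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.whoami.rule=Host(`whoami.example.com`)"
      - "traefik.http.routers.whoami.tls.certresolver=letsencrypt"
```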
There really is no “guide for dummies” here, you’ve got to rely on the documentation provided by these services.
Me: Siri turn lights on
Siri: Now playing Bon Jovi’s “Wanted Dead or Alive”
Me: Siri shut the fuck up, you have one job, do that job
Interesting that you chose Reddit as an example.
I was in a rush! Honestly, it was the quickest thing I could come up with on the spot. Plus, database tools are something I lack a lot of knowledge about, so I really couldn’t go in depth even if I wanted to.
Appreciate the history behind Reddit’s database!
Postgres, SQLite, etc. are tools for database management; things like user data, application data, and so on are stored there.
Edit: DB_PASS=“postgres” is the default password when setting up a Postgres database.
Take Reddit’s karma system or upvotes/downvotes, for example: they’re stored in a database, and however Reddit wants to utilize that data, Postgres makes it easy to call upon.
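To make that concrete, a hedged SQL sketch of one way votes could be stored and tallied; the table and column names are made up for illustration, not Reddit’s actual schema:

```sql
-- One row per vote; value is +1 or -1, one vote per user per post
CREATE TABLE votes (
    user_id INTEGER NOT NULL,
    post_id INTEGER NOT NULL,
    value   INTEGER NOT NULL CHECK (value IN (-1, 1)),
    PRIMARY KEY (user_id, post_id)
);

-- A post's score is just a sum over its votes
SELECT post_id, SUM(value) AS score
FROM votes
GROUP BY post_id;
```

The application never recounts anything by hand; it just asks the database.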
I’m sure others can give more detailed responses, I’m typing this out in a rush.
Not a single soul asked for it, waste of company resources.