
  • Yes, it's allowed and encouraged between RED<->OPS. There are a few tools on the RED and OPS forums to automate most of the process (e.g. Transplant, REDCurry, Takeout, Orpheus-Populator, etc.). Cross-posting torrents on many sites is allowed and fine, you just have to be aware of the rules of the source site, e.g. some places don't want their internals to be shared, or some have a literal timer countdown before cross-posting is allowed. On the other hand, most sites are not going to enforce other sites' exclusivity demands (PTP explicitly has a note about this). If an exclusive file is cross-posted onto PTP, PTP isn't going to take it down on anyone's behalf.

    I'll note that private tracker culture has warmed up quite a bit in the past decade and a half that I've been on them. Trackers (and their users) don't usually see other trackers as rivals/competitors anymore, release groups are respectful of each other, there are a ton of tutorials and help forums around to help low-skill members learn how to do the advanced stuff, and so on. There are recognizable usernames everywhere, and the general vibe is to cross-upload as much as possible and help build everyone's trackers together. Cross-seed (the program) has helped a lot with this, and seedbases have become very strong even on smaller trackers as a result.

  • Mainly, HDDs have gotten bigger, and FLAC is future-proof: a lossless master can be transcoded into whatever format comes next. I also think the death of What.CD really impressed upon the next generation that preservation is of utmost importance. A lot of albums were fully lost during the transition to RED/OPS, and a good chunk of albums that used to have a lossless copy now only survive as lossy versions from those who kept MP3 libraries. IMO, piracy is ownership, and owning the master lossless copy so you can generate any other format is that concept taken to its logical conclusion.

  • Seconding the notion to get into OPS somehow if at all possible. RED's economy is one of the few tracker economies that is actually non-trivial, whereas OPS's is trivial. A large amount of RED content is automatically mirrored to OPS, so you can just grab it at OPS and cross-seed it back to RED (there are a few tools to do this automatically, e.g. nemorosa). RED is still definitely the more active and higher-quality place to be, but cross-seeding shenanigans with OPS will keep RED's economy in check.

  • A lot of people just rip Qobuz, Deezer, and Tidal FLAC for free using shared keys that you can find on the megathread ("Knowledge & Tokens"). Autosnatchers will give you at least one snatch per upload. No one is actually buying most of that WEB FLAC. There also might be a big batch of freeleech tokens during December for kickstarting a library. Also, I'd recommend just going full FLAC from the start; MP3 is easier/smaller to snatch, but it's 2025 and no one wants MP3, so long-term you'll get the best results by perma-seeding a large FLAC library.

  • I don't think it will be a big deal to transcode MP3 to Opus, as long as you're okay with knowingly keeping imperfect audio files. Every time a lossy encoder has a go at the files (especially a different encoder each generation), it discards a little more of the signal and leaves artifacts behind in the waveform. Are they audible? Doubtful. If you want to keep a neat and high-quality library, I'd recommend collecting FLAC next time around.

    Also, this won't work on Win11, and I don't think you can make it transcode MP3, but if anyone happens to have slightly different requirements I'll plug https://gitlab.com/beep_street/mkopuslibrary, which I use to keep my FLAC library in sync with a parallel Opus library for mobile use.
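    For the curious, the core loop of a FLAC-to-Opus mirror is simple enough to sketch. This is not how mkopuslibrary actually works, just a minimal Python illustration assuming opus-tools' opusenc is on PATH; the function names and the 128 kbps figure are my own arbitrary choices.

```python
import subprocess
from pathlib import Path

def opus_target(flac_file: Path, flac_root: Path, opus_root: Path) -> Path:
    # Mirror the FLAC library's directory layout, swapping the extension.
    return (opus_root / flac_file.relative_to(flac_root)).with_suffix(".opus")

def sync_library(flac_root: Path, opus_root: Path, kbps: int = 128) -> None:
    for flac_file in sorted(flac_root.rglob("*.flac")):
        target = opus_target(flac_file, flac_root, opus_root)
        if target.exists():
            continue  # already transcoded on a previous run
        target.parent.mkdir(parents=True, exist_ok=True)
        # opusenc (from opus-tools) reads FLAC input directly; ffmpeg's
        # libopus encoder would work just as well here.
        subprocess.run(
            ["opusenc", "--bitrate", str(kbps), str(flac_file), str(target)],
            check=True,
        )
```

    A real tool also has to handle deletions and renames on the FLAC side and copy over cover art, which is where most of the complexity lives.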

  • Since you've clearly not read or comprehended any of the subpoenas that I linked, nor the encryption analysis, nor any of Signal's blogposts, I see no point in responding any further. You are spreading FUD, and I question your motives.

  • No, and in fact they have fought to unseal and publish the subpoenas they have received. The point is that if you read the subpoenas, they request a lot of data from Signal, and Signal can only ever return the phone number, account creation date, and last-connected timestamp. So either Signal is consistently lying to various governments, or they actually don't have any of that data. Signal's client is also open-source and has been audited, and they have published many blogposts about how the technology works.

    I'd strongly recommend digging deeper into this and trusting the auditors and experts instead of dismissing it based on lazy and cynical guesses. If you don't trust anyone you're welcome to read the source code of the client yourself. Soatok recently posted an 8-part series going through Signal's encryption that you can read as a primer: https://soatok.blog/2025/02/18/reviewing-the-cryptography-used-by-signal/.

  • Doing your own encodes is also really cool. I'm not too sure what the AV1 compatibility of your friends' players would be, but yes AV1 encodes are a very efficient way to microsize. If you happen to be on PTP, there's a giant AV1 research thread with people testing stuff out. It looks like they prefer SVT-AV1-PSYEX as of the latest posts, though I don't know enough to understand which encoding settings are the most impactful.
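    For reference, a basic SVT-AV1 encode through ffmpeg looks something like the sketch below. The crf/preset values are placeholders of my own, not the PTP thread's recommendations (and PSYEX is a fork with its own builds), so treat the numbers as a starting point only.

```python
def av1_cmd(src: str, dst: str, crf: int = 30, preset: int = 4) -> list[str]:
    # ffmpeg's libsvtav1 wrapper: a lower preset means slower/better
    # compression, a lower crf means higher quality / bigger file.
    # Audio is re-encoded to Opus here.
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libsvtav1", "-crf", str(crf), "-preset", str(preset),
        "-c:a", "libopus", "-b:a", "128k",
        dst,
    ]
```

    Feed the resulting list to subprocess.run to actually encode.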

  • There's nothing worrying about Signal's centralization model in itself. The server acts only as a clueless message relay, and it has near-zero information on any of its users, even as it delivers messages from person to person. The only information Signal knows is whether a phone number is registered and the last time it connected to the server. Great care is taken to make sure everything else is completely end-to-end encrypted and unknowable, even by subpoena.

    The only real issue with Signal's centralization is that if Signal the company goes down, then all clients can no longer work until someone stands up a new server to act as a relay again. Signal isn't the endgame of privacy, but it's the best we have right now for a lot of usecases, and it's the only one I've had any luck converting normies to as it's very polished and has a lot of features. IMO, by the time the central Signal server turns into an actual problem we'll hopefully have excellent options available to migrate to.

    Also TMK, the only reason you still need a phone number for Signal is to combat spam. You can disable your phone number being shown to anyone else in the app and only use temporary invite codes to connect with people, so I don't count the phone number as a huge problem, though the requirement does still annoy me as it makes having multiple accounts more difficult and asserts a certain level of privilege.

  • If you're only at 10mbps upload you'll have to be very careful about selecting microsized 1080p (4-9mbps) or quality 720p (6-9mbps) encodes, and even then I really wouldn't bother. If you're not able to get any more upload speed from your plan then you'll either have to cancel the idea or host everything from a VPS.

    You can go with a VPS and maybe make people chip in for the storage space, but in that case I'd still lean towards either microsized 1080p encodes or 1080p WEB-DL (which is inherently efficient for the size) if you want a big content base without breaking the bank. E.g., these prices look pretty doable if you've got people who can chip in: https://hostingby.design/app-hosting/. I'm not very familiar with which VPS options are available or reputable, so you'll have to shop around. Anything with a big hard drive should pretty much work, though I'd recommend at least a few gigs of RAM just for Jellyfin (my long-running local instance is taking 1.3GB at the moment; no idea what the usual range might be). Also, you likely won't be able to transcode video, so you'll have to be a little careful about what everyone's playback devices support.

    Edit: Also, if you're not familiar with microsized encodes, look for groups like BHDStudio, NAN0, hallowed, TAoE, QxR, HONE, PxHD, and such. I know at least BHDStudio, NAN0, and hallowed are well-regarded, but intentionally microsizing for streaming is a relatively new concept, and it's hard to sleuth out who's doing a good job and who's just crushing the hell out of the source and making a mess - especially because a lot of these groups don't even post source<->encode comparisons (I can guess why). You can find a lot of them on TL, ATH, and HUNO, if those acronyms mean anything to you. Otherwise, a lot of these groups post completely publicly as well, since most private trackers do not allow microsizing.
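    The upload math above is worth making explicit. Here's a rough Python sanity check; the 20% headroom reserved for protocol overhead and bitrate spikes is my own assumption.

```python
def concurrent_streams(upload_mbps: float, encode_mbps: float,
                       headroom: float = 0.8) -> int:
    # How many direct-play streams the uplink can carry at once, after
    # reserving some margin for overhead and bitrate spikes.
    return int(upload_mbps * headroom // encode_mbps)

# On a 10 mbps uplink: a 4 mbps microsized 1080p encode fits twice,
# while a 9 mbps encode doesn't reliably fit even once.
```

    Which is why, at 10 mbps, I really wouldn't bother hosting from home.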

  • SuccessfulCrab only does WEB-DLs so "subjective quality" isn't as much of an issue as it would be with the encoding groups, but yeah I agree that scene is usually best avoided if you have access to reliable P2P sources. Quality > speed for me any day.

  • SuccessfulCrab is a legitimate scene group, and ELiTE appears to be some sort of P2P x265-1080p transcode bot/group (their releases on IPT/TL look fine and go back quite a ways). I'd stop using whatever indexer is either serving you malware or failing to police malware in its users' uploads. The real problem is that someone is mimicking these groups and putting out fake releases; playing whack-a-mole with the fake tags that person is using only treats the symptoms, since they can easily change the tag again.

  • Yeah h264 is the base codec (also known as AVC), x264 is the dominant encoder that encodes in that codec. So the base BDs are just plain h264, and remuxes will take that h264 and put it into an mkv container. Colloquially, people tag WEB-DL and BDs/remuxes as "h264" as they're raw/untampered-with, and anything that's been encoded by a person as "x264". Same thing for h265/HEVC and x265, and same for h266/VVC and x266.

  • Yeah I'm reading a little bit on it, and it seems like apt-get can't install new packages during an upgrade. On initial reading I was thinking there were specific packages it couldn't download or something, but this makes sense too. Regardless, this is news to me; I always assumed that apt and apt-get were the same process, just with apt-get having stable text output for awk'ing and apt being human-readable. I've been using nala for a long time anyway, but this is very useful knowledge.

  • Whoa, do you have something to read up on that? I'd be extremely surprised, since apt-get is supposed to be the script-safe variant, i.e. I'd imagine it's the more stable of the two.

  • As an idea, I use an SSD as the "Default Download Directory" within qBittorrent itself, and qB then automatically moves the finished download to an HDD. I do this because I want writes into my ZFS pool to be sequential, since ZFS has no defragmentation capability.

    Hardlinks are only important if you want to continue seeding the media in its original form and also have a cleaned-up/renamed copy in your Jellyfin library. If you're going to continue to seed from the HDD, it doesn't matter that the initial download is done on the SSD. The *arr stack will make the hardlink only after the download is finished.
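    If it helps, the hardlink behavior is easy to demo. A small sketch (the function name is mine, not anything from the *arr stack):

```python
import os
from pathlib import Path

def hardlink_into_library(download: Path, library_dir: Path) -> Path:
    # A hardlink is a second directory entry pointing at the same inode,
    # so it costs no extra disk space -- but both paths must be on the
    # same filesystem, which is why linking happens after the move to HDD.
    library_dir.mkdir(parents=True, exist_ok=True)
    target = library_dir / download.name
    os.link(download, target)
    return target
```

    Deleting the seeded copy later doesn't touch the library copy; the data blocks are only freed once the last link to the inode is gone.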

  • Yep, fully agree. At least Blu-rays still exist for now. Building a beefy NAS and collecting full Blu-ray discs lets us brute-force picture quality through sheer bitrate, at least. There are a number of other problems to think about before we even get to the encoder stage, such as many (most?) 4K movies/TV shows being mastered in 2K (essentially 1080p) and then upscaled to 4K. Not to mention a lot of 2K Blu-rays are upscaled from 720p! It just goes on and on. As a whole, we're barely using the capabilities of true 4K today. Most of the UHD/4K "quality" craze is being driven by HDR, which has its own share of design/cultural problems. The more you dig into all this stuff, the worse it gets. 4K is billed as "the last resolution we'll ever need", which IMO is probably true, but they don't tell you that the 4K discs they're selling you aren't really 4K.

  • The nice thing is that Linux is always improving and Windows is always in retrograde. The more users Linux has, the faster it will improve. If the current state of Linux is acceptable enough for you as a user, then it should be possible to get your foot in the door and ride the wave upwards. If not, wait for the wave to reach your comfort level. People always say <CURRENT_YEAR> is the year of the Linux desktop, but IMO the real year of the Linux desktop was 4 or 5 years ago, and hopefully that captured momentum will keep going until critical mass is achieved (optimistically, I think we're basically already there).