I would say if it is a general question, like you asked, feel free to ask here. Other people might have the same question and it could help them. If it is something where you might need to send me links to content you are having problems downloading and you don't want everyone on the site knowing what you are jerking off to, then feel free to send me a PM. I have no problem helping out anybody if they have questions. I am not a professional programmer, so all I ask is that you be a bit patient with me if I am helping troubleshoot, as it's my first time as well.
The idea behind the .rat file is that it keeps track of all content that passes through the folder it is in. I personally use one .rat file per folder, per girl/model/content creator/etc. Technically you could use it to keep track of anything, but the idea is to track individual models. For example, if you went to the exploitedcollegegirls thread and downloaded all the bunkr links using ripandtear into an 'exploitedcollegegirls' folder that had a .rat file, the .rat file would keep track of all the links downloaded from bunkr. That way, if you attempted to download the same link again, it would just skip it, saving you time and bandwidth.
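If you're curious how that skip works conceptually, here is a rough Python sketch. To be clear, the .rat layout and field names below are just my guesses for illustration, not the actual format ripandtear uses:

```python
import json
from pathlib import Path

# Hypothetical sketch of the skip-on-repeat idea. The file name, JSON
# layout, and "downloaded_urls" key are assumptions for illustration only.
RAT_NAME = ".rat"

def already_downloaded(folder: Path, url: str) -> bool:
    """Return True if this folder's .rat already records the url."""
    rat = folder / RAT_NAME
    if not rat.exists():
        return False
    record = json.loads(rat.read_text())
    return url in record.get("downloaded_urls", [])

def record_download(folder: Path, url: str) -> None:
    """Append the url to the folder's .rat so it gets skipped next time."""
    rat = folder / RAT_NAME
    record = json.loads(rat.read_text()) if rat.exists() else {}
    urls = record.setdefault("downloaded_urls", [])
    if url not in urls:
        urls.append(url)
    rat.write_text(json.dumps(record, indent=2))
```

So the second time the same link passes through the folder, the lookup hits and nothing is re-downloaded.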
Yep. That is 100% how it works. If you wanted to add bunkr links, coomer post links, or any link with downloadable content, you would just add it under the 'urls_to_download' section with the
-u
flag. Any link that has content you want to download should be added under there. gofile, bunkr, *Blacklisted site*, cyberdrop.me, jpg.fish, etc. If you run
ripandtear -su
that tells ripandtear to go through all the urls_to_download in the .rat file and attempt to download every link it finds an extractor for. If an extractor doesn't exist, it tells you and moves on to the next link. I might not have an extractor for the site now, but I might have one in the future, so feel free to add it anyway (that's what I do so I don't have to come back and add urls later). When I make an extractor for the link, the next time you run
-su
it will be downloaded.
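Conceptually, the -su pass is just a loop over the saved urls that splits them into "have an extractor for this" and "keep for a later run". Here is a toy sketch of that idea (the extractor list and the url parsing are made up for the example, not how ripandtear actually dispatches):

```python
# Toy model of the -su behaviour: download the urls we have an extractor
# for, and leave the rest saved in the .rat for a future run. The set of
# supported hosts below is an illustrative assumption.
EXTRACTORS = {"bunkr.su", "gofile.io", "cyberdrop.me"}

def sync_urls(urls_to_download: list[str]) -> tuple[list[str], list[str]]:
    """Split saved urls into (downloaded_now, kept_for_later)."""
    downloaded, kept = [], []
    for url in urls_to_download:
        # crude host extraction just for the sketch
        host = url.split("/")[2] if "://" in url else url
        if host in EXTRACTORS:
            downloaded.append(url)   # an extractor exists -> grab it now
        else:
            kept.append(url)         # no extractor yet -> try again later
    return downloaded, kept
```

The point is that a url with no extractor is never lost; it just stays in the file until a future version can handle it.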
I have data caps on my internet, so there are times when I want to download a bunch of stuff, but I don't want to do it right away, go over my data cap, and pay extra money. I store the links I want to download with
-u
and then when a new month hits and I can afford to download the content I go back to the folder with the .rat and run
ripandtear -su
(sync urls_to_download) so ripandtear will go and download everything I have saved.
The coomer and simpcity entries under the links section are intended for sites that will always have new posts/content added to them. This works for coomer now, and I am planning on doing the same for simpcity in the future. Example: you add belledelphine's coomer profile to the .rat with
ripandtear -c 'https://coomer.party/onlyfans/user/belledelphine'
. Then you run
ripandtear -sc
to sync the coomer profile(s). Ripandtear looks at all of the profile links that are stored, finds every post on every page under those profiles, then finds the links to all the content in each post. It downloads everything it finds and saves the links in the .rat file for record keeping. A month later you run
ripandtear -sc
again. This time, while looking at posts, it only adds posts that it hasn't recorded yet. If it doesn't have a record for the post/content, it downloads it. If it does have a record that it was already downloaded, it skips the post. That way you only download what is new, not a bunch of duplicates.
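The incremental part of that boils down to a simple set difference: only fetch the posts that have no record yet. A rough sketch (the record structure is my own assumption, not the real .rat schema):

```python
# Toy model of the incremental -sc logic: posts already recorded in the
# .rat are skipped, so a re-sync only touches what is new.
def posts_to_fetch(profile_posts: list[str], recorded: set[str]) -> list[str]:
    """Return only the posts that have no record yet, in original order."""
    return [post for post in profile_posts if post not in recorded]
```

On the first sync `recorded` is empty, so everything gets downloaded; on the next sync only posts published since then come back from this filter.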
The command line flags that add information just append it. They also check for duplicates, so you don't have to worry about adding identical information. If in doubt, just add it (if you mess up by adding a wrong name you will have to manually remove the entry from the .rat). Here is an example of a folder and .rat I have for belledelphine. Let's say I find out she also has a reddit account. All I need to do is go to where her .rat is stored and run
ripandtear -r <reddit_name>
and it will add the reddit name to the reddit section under names. I could then download all unrecorded reddit content with
ripandtear -sr
(sync reddit). Currently ripandtear only adds information, so you don't have to worry about accidentally deleting anything.
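The append-plus-duplicate-check behaviour of flags like -r can be sketched like this (again, the .rat structure shown is just an assumed example, not the real file layout):

```python
# Sketch of how "add a name" can be append-only and still safe to repeat:
# duplicates are checked before appending, and nothing is ever deleted.
def add_name(rat: dict, site: str, name: str) -> dict:
    """Append a name under rat['names'][site], skipping duplicates."""
    names = rat.setdefault("names", {}).setdefault(site, [])
    if name not in names:
        names.append(name)
    return rat
```

Running the same add twice is a no-op the second time, which is why "if in doubt, just add" is safe.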