johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
0fya082315al84db03fa9bf467e3.png
Made a semi-breaking change to ripandtear. It only applies from 0.9.6 onwards. Previously I used a comma ( , ) as the separator between urls to download and names. That created a conflict with redgifs, so I have changed it to a pipe ( | ).

Here is how it looks now:

Previously:

ripandtear -d 'https://www.redgifs.com/gifs/lexi-luna,cumshot?order=new,https://www.redgifs.com/watch/masculineupbeatlangur#rel=user%3Aoutrageousone;order=new'

Currently:
ripandtear -d 'https://www.redgifs.com/gifs/lexi-luna,cumshot?order=new|https://www.redgifs.com/watch/masculineupbeatlangur#rel=user%3Aoutrageousone;order=new'

Adding names to a .rat:

Previously:

ripandtear -r 'test,name'

Currently:

ripandtear -r 'test|name'
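For anyone scripting around this, the practical effect is that commas inside a single URL (redgifs tag searches use them) no longer break the argument apart. A rough Python illustration of the old vs. new splitting behaviour (the URLs are just examples from the post above):

```python
# A redgifs tag-search URL legitimately contains a comma.
url_one = "https://www.redgifs.com/gifs/lexi-luna,cumshot?order=new"
url_two = "https://www.redgifs.com/watch/masculineupbeatlangur"

# Old comma separator: the comma inside url_one splits the argument in half.
old_arg = ",".join([url_one, url_two])
print(len(old_arg.split(",")))   # 3 pieces -- url_one is broken apart

# New pipe separator: each URL survives intact.
new_arg = "|".join([url_one, url_two])
print(len(new_arg.split("|")))   # 2 pieces
```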
 
  • Like
Reactions: Mingor

glandalf

Superfan
Mar 14, 2022
34
680
832
Please, Log in or Register to view quotes

Thank you for this info, it's been really helpful. I've used ZSH in the past, but hadn't set it up on the VM I use to download stuff, so I looked into FISH, thought it was cool, and have gone ahead and set it up. I've now got a for loop like yours running to de-dupe and sort, and it's working like a dream. Definitely something for me to learn and build upon.

Can I ask if your -H flag only looks in the working directory, or does it also look in the subdirectories that have previously been -S sorted when comparing new downloads?
 
  • Like
Reactions: johnny.barracuda

glandalf

Superfan
Mar 14, 2022
34
680
832
Please, Log in or Register to view quotes

No need to prioritize anything on my behalf mate, you're already doing a lot by taking the time to explain and teach me new stuff, so I'm happy to wait for whenever you get around to implementing things!

Thanks for this further info, it's a great help. I understand the premise of what you're suggesting I do, it's just applying it to an actual Python script that I need to get my head around. But you've given me a great deal to research and start trying to apply, so again thank you. I'll get tinkering with it and see what I can come up with.
 
  • Like
Reactions: johnny.barracuda

glandalf

Superfan
Mar 14, 2022
34
680
832
Please, Log in or Register to view quotes
Many thanks for this!

I'm getting closer to finishing the function you suggested. At least I think I am lol.

One thing I wanted to call out for consideration: would you perhaps think of using the name method instead of stem in the rat_info.py util? stem interprets a period as the beginning of a suffix/extension and cuts it off.
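For context, here is the difference in plain pathlib terms; 'user.name' stands in for any username that happens to contain a period:

```python
from pathlib import Path

p = Path("user.name")  # a username that happens to contain a period

# .stem treats everything after the last period as an extension and drops it.
print(p.stem)  # user

# .name keeps the full final path component untouched.
print(p.name)  # user.name
```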

[old imgur media embed was here once, but it's now gone]
 
  • Like
Reactions: johnny.barracuda

glandalf

Superfan
Mar 14, 2022
34
680
832
Please, Log in or Register to view quotes
S'all good mate, no need for contributor, just happy that I could have been some help. A friend of mine actually showed me name when I asked them a question earlier, so I thought I'd relay the info as it helped fix an issue I was having.


Just updated the tool, tried creating a rat file in the directory I was having issue with, and BOOM, works like a dream. Thank you for patching that so quickly :)
 
  • Like
Reactions: johnny.barracuda

glandalf

Superfan
Mar 14, 2022
34
680
832
So I got the script written and executed! Thank you again for your help.

I split the usernames from the tags, then passed the usernames individually as generics to RAT (I will clean them up later and assign them correctly to the right sites in the rat file) and the tags individually as tags. I also added a step to rename the folder to strip the tags from it before passing the usernames and tags into RAT, so that the rat file and the directories were cleaner.
I did notice an oversight with my script though, as there were a couple of folders which didn't have the same naming convention/format as the rest of them, and actually had a '(' before the tags. But I took a backup of the directory names before running, so I don't mind a little manual clean up.

One thing I've noticed after running this is that RAT doesn't seem to like UTF-8 characters, so for models with non-English characters in their usernames (e.g. ø, ö, 呆, etc.) it replaces the actual character with its unicode escape; for example 'ø' becomes '\u00f8' inside the rat file.
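For reference, that behaviour matches Python's json default of ensure_ascii=True; a minimal sketch of the difference (rat_info's actual write call may differ):

```python
import json

data = {"names": ["brandi_løve", "min呆"]}

# Default: non-ASCII characters are written out as \uXXXX escapes.
print(json.dumps(data))                      # ... "brandi_l\u00f8ve" ...

# ensure_ascii=False keeps the characters as-is in the file.
print(json.dumps(data, ensure_ascii=False))  # ... "brandi_løve" ...
```

Either form loads back to the same string with json.loads, so it's cosmetic, but the escaped version is harder to read in a text editor.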

So it took me all day to write 70 lines of code, most of which are comments or empty lol. Though I'm not too disheartened by that, as this is literally the first Python script I've written outside of basic beginner follow-along learning videos.

If you're interested, here is what I came up with:
Python:
Please, Log in or Register to view codes content!

Then ran it via:
Bash:
Please, Log in or Register to view codes content!
 
Last edited:
  • peepoClap
Reactions: johnny.barracuda

glandalf

Superfan
Mar 14, 2022
34
680
832
Please, Log in or Register to view quotes

Yeah, that's why I added so many comments, so that when I come back to it, or just refer to it, I can quickly see what each part is doing. It also helped me whilst I was scripting it to compartmentalise each section. A lot of the day was spent reading documentation and watching videos on specific functions and methods.

Please, Log in or Register to view quotes

I tried to avoid the edge cases by building a bunch of test scenarios out of dummy directories I set up, trying as many different combinations of characters/number of tags and usernames/etc. as I could. Some having '(' a bit earlier in the directory name than the tags didn't cross my mind until it was too late though. But you live and learn!
I can imagine scripting around bunkr can be tricky, what with them constantly changing the configuration and limits, plus their servers going in and out of maintenance a lot as of late.

Please, Log in or Register to view quotes

Ah ok cool, that's fine then. This came from a case of me again not using the rat file as intended and just reading it in a text editor rather than calling the print function.

Please, Log in or Register to view quotes

I come from the world of databases, so I know SQL quite well, and the programmatic thinking is/was there for me already. Although there are some similarities, programming is a whole different world, so it was still quite the jump and a lot to get my head around. It helped having something specific I wanted to do/achieve that I could apply myself to, rather than "let's build a calculator", or seeing how many different ways you can get the terminal to spit out "hello world". Don't get me wrong, that stuff is very useful and I'm glad I've done it (several times lol).

Please, Log in or Register to view quotes

Well thank you again mate, it's much appreciated!


On a different note, can I ask about your workflow for how you typically go about setting up a new model and rat file? Do you just type out the commands and arguments manually each time, or do you have other ways of doing it? I was thinking about setting up a template of all the different arguments in a file, so I can just copy-paste the various usernames/tags/urls/etc., and then when it's populated with everything, execute it. The only issue with that is if an argument is blank because the model doesn't have an Instagram or something, you can't execute the command.
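On the blank-argument problem: one way around it is to build the command programmatically and simply omit empty fields. A hypothetical sketch (the flag names are taken from earlier posts in this thread; the values are placeholders):

```python
# Template of per-site names; leave a value empty if the model has no account.
template = {"-r": "some_reddit_name", "-o": "some_of_name", "-i": ""}

args = ["ripandtear"]
for flag, value in template.items():
    if value:  # skip blank entries entirely
        args.extend([flag, value])

print(args)  # ['ripandtear', '-r', 'some_reddit_name', '-o', 'some_of_name']
```

The resulting list could then be handed to subprocess.run(args) instead of pasting a command by hand.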
 
  • Like
Reactions: johnny.barracuda

glandalf

Superfan
Mar 14, 2022
34
680
832
Please, Log in or Register to view quotes
To be honest I agree with you on the archivist mentality and keeping the first file, but I guess this works too. I've gone through my entire library now updating uppercase file extensions to lowercase, so I shouldn't have the issue I was having with -H removing files that were hashed already but not sorted due to them having uppercase extensions. I've also now finished re-sorting and hashing every file in my library. It took a few days to iterate over everything, but it's done now!

Please, Log in or Register to view quotes

Nice, I will test installing it again, because when I installed it on my Windows system I had to install a different package than the one I install on Linux. Yeah, I get the "it works on my machine" mentality, though this tool is great, so I can see it gaining traction :)

Please, Log in or Register to view quotes
Awesome, thank you! I've dabbled a little with git for my job, but nothing collaborative, so my usage is basically just pushing and pulling scripts from a directory only I work in; pull requests are not something I've dealt with. I will do some research on it.

Please, Log in or Register to view quotes
Thank you again. That makes sense, I will store the full URL then.
 
  • Like
Reactions: johnny.barracuda

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
Bunkr changed their domain name to bunkrr. Updated RAT so it keeps downloading.

Added a -eu flag to erase urls to download that are saved in a .rat file. It deletes every url that is currently saved.
 
  • Like
Reactions: Fordaboyss

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
Just updated Ripandtear to download, add and sync *Blacklisted site* users/videos. If you have already downloaded it, feel free to upgrade:

(Linux/Mac)
pip install --upgrade ripandtear

(Windows)
python -m pip install --upgrade ripandtear
 

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
Please, Log in or Register to view quotes
Could you send me the link? You could also attempt the download with ripandtear -d <link> -l 2 to have it print information about what it is doing to the screen. If you want super detailed information you can do -l 1. It might give you some hints about what is going on. If I had to guess, the url of the content follows a pattern that I haven't accounted for yet.

Bunkr is the one site that has a bunch of edge cases that prevent ripandtear from working. I already have 4 different links opened in my browser that I need to go through and account for to make sure they are downloaded.
 

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
Please, Log in or Register to view quotes
I would say if it is a general question, like you asked, feel free to ask here. Other people might have the same question and it could help them. If it is something where you might need to send me links to content you are having problems downloading and you don't want everyone on the site knowing what you are jerking off to, then feel free to send me a PM. I have no problem helping out anybody if they have questions. I am not a professional programmer, so all I ask is that you be a bit patient with me if I am helping troubleshoot, as it's my first time as well.

The idea behind the .rat file is that it keeps track of all content that passes through the folder it is in. I personally use one .rat file per folder per girl/model/content creator/etc. Technically you could use it to keep track of anything, but the idea behind it is individual models. For example, if you went to the exploitedcollegegirls thread and downloaded all the bunkr links using ripandtear into an 'exploitedcollegegirls' folder that had a .rat file, the .rat file would keep track of all the links downloaded from bunkr. That way if you attempted to download the same link again, it would just skip it, saving you time and bandwidth.

Please, Log in or Register to view quotes

Yep. That is 100% how it works. If you wanted to add bunkr links, coomer post links or any link with downloadable content, you would just add it under the 'urls_to_download' section with the -u flag. Any link that has content you want to download should be added there: gofile, bunkr, *Blacklisted site*, cyberdrop.me, jpg.fish, etc. If you run ripandtear -su, that tells ripandtear to go through all the urls_to_download within the .rat file and attempt to download every link it finds an extractor for. If an extractor doesn't exist, it tells you and moves on to the next link. I might not have an extractor for a site now, but I might have one in the future, so feel free to add it anyway (that's what I do so I don't have to come back and add urls later). When I make an extractor for the link, the next time you run -su it will be downloaded.

I have data caps on my internet, so there are times when I want to download a bunch of stuff, but I don't want to do it now, go over my data cap, and pay extra money. I store the links I want to download with -u, and then when a new month hits and I can afford to download the content, I go back to the folder with the .rat and run ripandtear -su (sync urls_to_download) so ripandtear will go and download everything I have saved.

The coomer and simpcity entries under the links section are intended for sites that will always have new posts/content added to them. This works for coomer now, and I am planning on doing the same for simpcity in the future. Example: you add belledelphine's coomer profile to the .rat with ripandtear -c 'https://coomer.party/onlyfans/user/belledelphine'. Then you run ripandtear -sc to sync the coomer profile(s). Ripandtear looks at all of the profile links that are stored, finds every post on every page under the profile(s), then finds the links to all content under every post. It downloads all of the links it finds and saves them in the .rat file for record keeping. A month later you run ripandtear -sc again. This time, while looking at posts, it only adds posts that it hasn't recorded yet. If it doesn't have a record for the post/content, it downloads it. If it does have a record that it was already downloaded, it skips the post. That way you just download what is new, not a bunch of duplicates.
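The skip-if-recorded logic described above boils down to a set lookup against the .rat's records; a hypothetical sketch (function and variable names are made up for illustration, not ripandtear's actual internals):

```python
def sync_posts(found_posts, rat_record):
    """Return only the posts not already recorded in the .rat data.

    found_posts: list of post URLs scraped from the profile pages.
    rat_record: set of URLs already downloaded (loaded from the .rat file).
    """
    new_posts = [url for url in found_posts if url not in rat_record]
    for url in new_posts:
        # the actual download of `url` would happen here
        rat_record.add(url)  # record it so the next sync skips it
    return new_posts

seen = {"https://example.com/post/1"}
todo = ["https://example.com/post/1", "https://example.com/post/2"]
print(sync_posts(todo, seen))  # only post/2 is new
```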

The command line flags that add information just append information. They also check for duplicates, so you don't have to worry about adding identical information. If in doubt, just add it (if you mess up by adding a wrong name you will have to manually remove the entry from the .rat). Here is an example of a folder and .rat I have for belledelphine. Let's say I find out she also has a reddit account. All I need to do is go to where her .rat is stored and run ripandtear -r <reddit_name> and it will add the reddit name to the reddit section under names. I could then download all unrecorded reddit content with ripandtear -sr (sync reddit). Currently ripandtear only adds information, so you don't have to worry about accidentally deleting anything.

Code:
Please, Log in or Register to view codes content!

JSON:
Please, Log in or Register to view codes content!
 

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
froppalo Here is an example of how I use ripandtear

-mk - make a directory with '<name>' and then go into it (and run the following flags)

-r - adds reddit username

-o - adds onlyfans username

-s - stores simpcity link so I can download it later when I create an extractor

-i - stores instagram name

-u - save the following urls so I can download them later by running ripandtear -su

-sr - sync reddit content (running it the first time downloads all reddit content)

-H - hash files and remove duplicates

-S - sort files into the correct directory.

This is the result:

nude-eva/
├── nude-eva.rat
├── pics/ (197 pics)
└── vids/ (73 vids)



Please, Log in or Register to see links and images
 

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
froppalo I forgot to add her redgifs profile. All I do is run this command

cd nude-eva

ripandtear -R 'evaporshe' -sR -H -S

-R - add her redgifs username to the .rat (create a .rat if one doesn't exist)

-sR - sync Redgifs

-H - Hash and remove duplicates

-S - sort files
 

johnny.barracuda

Bathwater Drinker
Sep 26, 2022
221
3,039
1,252
Please, Log in or Register to view quotes
No, but looking at the tags on redgifs it looks like it might be easy to implement. Could you either post a link to what you are talking about here or PM the link to me so I know we are talking about the same thing? I might be able to add that functionality today if you do.
 

glandalf

Superfan
Mar 14, 2022
34
680
832
Really like your tool johnny.barracuda, I'm kinda annoyed I've only just found it, to be honest lol.

Got a couple of suggestions of things you might consider implementing.

1) The ability to pass a specific path into the tool, instead of having to have the directory you want to work in as your current working directory. E.g. ripandtear --directory="/path/to/folder" -H -S
2) The file sorter util seems to be case sensitive and only looks at extensions that are lowercase. E.g. it will sort video.mov into the vids folder, but not video.MOV
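On point 2, the usual fix is to lower-case the suffix before comparing; a hedged sketch of what case-insensitive matching could look like (not ripandtear's actual code, and the extension set is just an example):

```python
from pathlib import Path

VIDEO_EXTENSIONS = {".mov", ".mp4", ".mkv"}  # example set

def is_video(path: Path) -> bool:
    # .lower() makes 'video.MOV' match the same as 'video.mov'
    return path.suffix.lower() in VIDEO_EXTENSIONS

print(is_video(Path("video.MOV")))  # True
print(is_video(Path("video.mov")))  # True
```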