
Guide: OnlyFans Downloading - A complete guide for PC and Mobile

The DRM video I'm trying to get is also above 500 MB, and the download gets stuck around 430 MB.
 
sim0n00ps I saw that your downloader uses aria2c, is it built into the downloader or do you have to download it and set the path? It wasn't mentioned in the readme. :LUL
 
I have just removed the use of aria2c within the yt-dlp command and it seems to work fine so I will create a new release in a moment and see if that has any difference for people.
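For illustration only (the actual command the downloader builds isn't shown in this thread), here's a hedged Python sketch of what toggling aria2c on and off amounts to. The helper name build_ytdlp_cmd is hypothetical; --downloader aria2c is a real yt-dlp option for delegating downloads to an external downloader.

```python
def build_ytdlp_cmd(url: str, output: str, use_aria2c: bool = False) -> list[str]:
    """Build a yt-dlp invocation; optionally hand downloads to aria2c."""
    cmd = ["yt-dlp", "-o", output, url]
    if use_aria2c:
        # yt-dlp can delegate HTTP downloads to aria2c; dropping these
        # flags falls back to yt-dlp's native downloader.
        cmd += ["--downloader", "aria2c"]
    return cmd

# Removing aria2c is just a matter of not appending the flags:
print(build_ytdlp_cmd("https://example.com/v.mp4", "video.mp4"))
```

The native downloader avoids the dependency on a separate aria2c binary on the user's PATH, which is likely why dropping it "just works" for more people.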
 
Another thing I noticed: dotnet --list-runtimes uses the 64-bit dotnet.exe by default and doesn't list Microsoft.NETCore.App 7.0.5 if only the x86/32-bit version is installed. It probably falls back to the 32-bit dotnet.exe and finds the correct version only when no 64-bit install exists. It was a bit confusing when I installed the downloader. :peepoShrug:
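To make the confusion concrete, here is a small hedged sketch that parses dotnet --list-runtimes text for a given runtime. The has_runtime helper is made up for illustration; the listing format ("Microsoft.NETCore.App 7.0.5 [path]") matches real dotnet output, and the point is that the 64-bit host's listing simply won't contain an x86-only runtime.

```python
def has_runtime(listing: str, name: str, version_prefix: str) -> bool:
    """Check `dotnet --list-runtimes` output text for a runtime version."""
    for line in listing.splitlines():
        parts = line.split()
        # Lines look like: Microsoft.NETCore.App 7.0.5 [C:\...\shared\Microsoft.NETCore.App]
        if len(parts) >= 2 and parts[0] == name and parts[1].startswith(version_prefix):
            return True
    return False

# The 64-bit dotnet.exe only lists 64-bit runtimes, so an x86-only
# 7.0.5 install is invisible here:
x64_listing = "Microsoft.NETCore.App 6.0.16 [C:\\Program Files\\dotnet\\shared\\Microsoft.NETCore.App]"
print(has_runtime(x64_listing, "Microsoft.NETCore.App", "7."))
```

Running the x86 host explicitly (typically under "C:\Program Files (x86)\dotnet\dotnet.exe") would show the 32-bit runtimes instead.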

 
Awesome work sim0n00ps. (y)(y)(y)🧎‍♂️
The scraper works a treat and is much handier than the python script option.

I'm kicking myself a bit, that I took a punt on TubeDigger but I guess I can still use it for other purposes.

A little minor request if I may, with regard to the functionality of the scraper.
Would it be possible to add an option to go back to the main menu after a successful scrape, rather than having to exit the program and then launch it again.
I know you can select multiple OF accounts to scrape in one go, but on the off chance that I want to do individual scrapes, said option would be good.
 
Glad it works for you. I can have a look into that.
 
sim0n00ps An option or hardcoded change to start scraping at the latest wall post? :peepoBless: Right now I would have to download 70GB again for some models to get the newest post. :hahaa:
 
lol, I was coming here to request this. We need an option to download by date range. Like if I just re-subbed to someone, I would only want to download the posts from like the last 3 weeks or so.
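The requested date-range filter could be sketched like this; this is an illustration of the idea rather than the tool's actual code, and the posts_since name and created_at field are assumptions.

```python
from datetime import datetime, timedelta

def posts_since(posts, weeks: int):
    """Keep only posts newer than the given number of weeks (field names assumed)."""
    cutoff = datetime.now() - timedelta(weeks=weeks)
    return [p for p in posts if p["created_at"] >= cutoff]
```

With something like posts_since(posts, 3), a re-subscribed user would only fetch roughly the last three weeks of posts.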
 
I'll make it so it goes newest first, give me a few mins
 
I too would like it to download by latest post first.

We need an option to download by date range.
One thing I will say, though: try to avoid too many user-configurable settings. I think this is part of why DigitalCriminals' script has so many issues; people can configure it in so many different ways that it's impossible for him to test every configuration. Obviously some settings might be necessary and shouldn't cause issues, like adding dates to the start of filenames.
 
Newest release should now download newest media first
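Conceptually, the newest-first change amounts to sorting the scraped post list by date descending before downloading. A minimal sketch (function name and created_at field assumed, not the tool's actual code):

```python
def newest_first(posts):
    """Sort posts so the most recent media is downloaded first."""
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)
```

Combined with the existing duplicate skip, a re-run then reaches new posts immediately instead of walking through the whole back catalogue first.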
 
Many thanks mate.

I may be mistaken but I thought I read somewhere that there is a setting in the scraper that automatically skips existing/already downloaded content?
 
sim0n00ps Would it be possible to customize the downloaded folder structures? It's not a huge priority, but it would be nice for me to be able to have it structured the same as my settings on DC's old scraper. Thanks.
 
It does currently skip duplicate content; however, b0b is suggesting that he doesn't want to re-download 70GB of stuff he already has just to get the latest posts from a user, hence why I've made it so the newest posts are downloaded first.
What is the structure you have, if you don't mind me asking? As long as it follows a structure like Posts/Free/Videos somewhere, it might just be a case of letting you change the root folder from __user_data__ to whatever you want. However, if it's completely different, that's where it might be a little difficult, as I don't want to overload the auth.json file with loads of different values people can change, because that might lead to problems as mentioned above.
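The folder layout under discussion could be sketched like this; the media_dir helper and its merge_tiers option are hypothetical illustrations of the configurable-root idea, not the tool's actual code.

```python
from pathlib import Path

def media_dir(root: str, section: str, tier: str, kind: str,
              merge_tiers: bool = False) -> Path:
    """Build e.g. __user_data__/Posts/Free/Videos.

    merge_tiers drops the Free/Paid level, giving e.g. Messages/Videos.
    """
    if merge_tiers:
        return Path(root) / section / kind
    return Path(root) / section / tier / kind
```

For example, media_dir("__user_data__", "Messages", "Free", "Videos", merge_tiers=True) yields the combined Messages/Videos layout requested below, while the default keeps the existing Free/Paid split.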
 
It just removes the Free and Paid folder and just combines them. So instead of like Messages/Free/Videos it's just Messages/Videos and everything free or paid goes in the same folder.
 
I'll consider it, but it's probably not one of the things that will make the priority list right now.
 