Tools Cyberdrop-DL: Mass Downloader for most forum sites and SimpCity threads

YeOlCurry

Diamond Tier
Oct 29, 2023
11
973
737
Fandanzo it's in your user folder, as shown (usually) by the command prompt you're using. I use the Windows Terminal app with PowerShell, but any command prompt you use defaults to C:/users/<your username here>/<downloaded scripts, files, packages, etc. go here>
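If you want to double-check where your prompt is actually dropping things, a quick check like this works (assumes Python is installed; purely illustrative, not part of CDL):
Python:
import os
from pathlib import Path

print("home folder:", Path.home())   # usually C:\Users\<your username>
print("current dir:", os.getcwd())   # where downloads and configs land by default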
 

ukdoll

Lurker
Dec 4, 2023
4
0
9
Can anybody please help me solve my problem? I'm trying to bulk download videos from bunker-albums.io with Cyberdrop-DL, but it shows a 403 error and stops working. I also tried JDownloader 2, but it shows a DDoS-protection error as well.
 

Get_RekT

Bathwater Drinker
Mar 13, 2022
35
4,655
1,242
Please, Log in or Register to view quotes
I have made a script to auto-download files from bunkrr one by one by opening a Chrome window in debug mode. Run the script with Python >= 3.10. Here are the steps (a rough sketch of the idea is at the end of this post):
  • Install Python 3 if you don't have it already. Check with python --version or python3 --version in cmd or PowerShell.
  • Unzip bunker-dl.zip, move into that directory, and open a terminal/cmd there.
  • Put your album download LINK in the LINKs.txt file (only works for 1 LINK at this time).
  • Run the script: python3 main.py
  • This will open a Chrome window, and a 30-second delay is given for you to pass the DDoS guard. You have to complete the DDoS check manually in the newly opened window.
  • After passing the check, DO NOT CLICK ANYWHERE ON THE SITE in the opened window. After a few moments the script will install uBlock Origin in the current debug window from the .crx file in the folder (this blocks pop-ups; otherwise the script won't work).
  • Then the script will download the media files one by one. I suggest putting the window on a separate monitor, minimizing it if you don't want to watch, or resizing it to your preference.
  • You can view the download status in the terminal window.
  • If a DDoS check appears in the middle of downloading an album, pass it if you're paying attention. Otherwise just rerun the script; it should resume from the last downloaded file.
This script is not polished and is slow but the DDoS errors are annoying lol.
Here's the zip file
Please, Log in or Register to see links and images
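For anyone who wants to see the idea before running the zip: here is a minimal sketch of the same approach written with Selenium. It is NOT the actual main.py (which attaches to Chrome in debug mode, and real album pages may need extra navigation per file); LINKs.txt and the uBlock .crx are the files mentioned in the steps above, everything else is illustrative.
Python:
# Minimal sketch of the approach above -- not the actual main.py.
# Assumes Selenium 4+, requests, and a local Chrome install.
import time
from pathlib import Path

import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

options = webdriver.ChromeOptions()
options.add_extension("ublock.crx")          # block pop-ups, as the steps explain
driver = webdriver.Chrome(options=options)

album_url = Path("LINKs.txt").read_text().strip()   # only one link supported
driver.get(album_url)
time.sleep(30)                               # window to pass the DDoS guard manually

# Collect media links from the album page, then fetch them one by one.
for a in driver.find_elements(By.CSS_SELECTOR, "a[href]"):
    href = a.get_attribute("href")
    if href and href.rsplit(".", 1)[-1].lower() in {"mp4", "jpg", "png", "gif"}:
        name = href.rsplit("/", 1)[-1]
        with requests.get(href, stream=True, timeout=60) as r:
            r.raise_for_status()
            with open(name, "wb") as f:
                for chunk in r.iter_content(chunk_size=1 << 16):
                    f.write(chunk)
        print("downloaded", name)
        time.sleep(2)                        # crude spacing so bunkrr doesn't throttle

driver.quit()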
 

WrangledHog

Diamond Tier
Mar 11, 2022
68
779
929
Thank you Jules--Winnfield! There wasn't a CDL folder in C:\Users\XXXXX\AppData\Local\, but mentioning the config.yaml was the hint I needed. I did a search and there was a config.yaml and a URLs.txt file at the very top of my user folder. No idea why they were there, but that doesn't matter. Deleting these files did the trick. Thank you.
 

Get_RekT

Bathwater Drinker
Mar 13, 2022
35
4,655
1,242
Please, Log in or Register to view quotes
bunkrr will give error 429 (too many requests) if more than 2 files are downloading at the same time. Is the delay between each download too long? Does each individual file get downloaded, but the next one takes a while to start? The delay between downloads was tuned to my internet speed so that it won't request more than 2 files at the same time. What's your internet speed? I will try to add an option to set the delay to match your connection speed. Meanwhile you can try changing the numbers.
Python:
Please, Log in or Register to view codes content!

If you get "Service not available in your region", there's nothing for it but to use a VPN to connect from a different region and try again.
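The gated snippet above isn't visible without logging in, but the idea being described is just a tunable per-file delay plus a back-off when bunkrr answers 429. A rough, illustrative version (DELAY_SECONDS, MAX_RETRIES, and the file names are placeholders, not the real script's variables):
Python:
# Illustrative only -- shows a per-file delay sized so bunkrr never sees
# more than a couple of requests at once, with a back-off on 429.
import time
import requests

DELAY_SECONDS = 5          # raise this on slow connections, lower on fast ones
MAX_RETRIES = 3

def download(url: str, dest: str) -> None:
    for attempt in range(MAX_RETRIES):
        resp = requests.get(url, stream=True, timeout=60)
        if resp.status_code == 429:        # too many requests: back off and retry
            time.sleep(30 * (attempt + 1))
            continue
        resp.raise_for_status()
        with open(dest, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1 << 16):
                f.write(chunk)
        return
    raise RuntimeError(f"gave up on {url} after {MAX_RETRIES} tries")

for link in open("LINKs.txt").read().split():
    download(link, link.rsplit("/", 1)[-1])
    time.sleep(DELAY_SECONDS)              # space out requests to avoid 429s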
 

WrangledHog

Diamond Tier
Mar 11, 2022
68
779
929
Is there a way to disable logging? When I try to run two CDL instances at the same time, the second instance fails because the first one is already using the log.
 

WrangledHog

Diamond Tier
Mar 11, 2022
68
779
929
I mean, it worked just fine with v4. I have it set to ignore history anyway, so I don't think I need a DB at all.
 

AMoussaGuido

Bathwater Drinker
Oct 18, 2022
33
1,585
1,242
I thought I read in one of the earlier posts that if I get a 403 scrape failure, it will just try scraping those again at the end of the queue?

If that's true, then just re-running the same URL list will ultimately scrape all of them, right?