Tools Cyberdrop-DL: Mass Downloader for most forum sites and SimpCity threads

mariowanders

Bathwater Drinker
Mar 11, 2022
I have a problem when downloading files from *Blacklisted site*: when it gets to the download part, nothing else happens, as if the download speed is very low. I tried other albums and the same thing happened, but if I download from any other host in the program, the speed is normal.
 

Somerandostuff

Casual
Oct 4, 2022
Well, crap. My URL list has gotten too long for Bunkr, I guess, and I just got my IP blocked when kicking off another download. I really wish URLs would get removed from the URL list after a successful download. I have a few links here and there that work sometimes but not all the time, so I've left them in the list to be downloaded; they sporadically complete. Well, it's more than a few, I guess, which is why I haven't cleaned up the list.

Anybody know how long Bunkr's IP block lasts?
 

Jules--Winnfield

Cyberdrop-DL Creator
Mar 11, 2022
The URLs.txt is in a lot of ways designed to be continuous. Take SimpCity threads, for example: they are always updating, so you can keep downloading them.
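
If you really do want finished links pruned, a small external script can do it between runs. Rough sketch only — the completed.txt history file here is an assumption, not something CDL actually writes:

import pathlib

urls_file = pathlib.Path("URLs.txt")
done_file = pathlib.Path("completed.txt")  # hypothetical log of finished URLs, one per line

done = set(done_file.read_text().splitlines())
# keep only non-empty lines that haven't completed yet
remaining = [u for u in urls_file.read_text().splitlines() if u.strip() and u not in done]
urls_file.write_text("\n".join(remaining) + "\n")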

The only time I've known Bunkr to IP block is when you are creating excessive amounts of traffic and essentially bringing down the site. I very much doubt you are creating that much traffic with CDL, but I've been proven wrong before.
 

Jules--Winnfield

Cyberdrop-DL Creator
Mar 11, 2022
As a heads up, by default certain sites have an additional rate limiter (whichever limit is lower is the one that takes precedence). Bunkr's default secondary rate limiter is 20. So before you changed it from whatever you had it at (assuming it was greater than 20), it was already limited to 20.
 

Jules--Winnfield

Cyberdrop-DL Creator
Mar 11, 2022
Yea, so what I'm saying is:

For Bunkr, as the example here, there are two rate limiters: one adheres to the config, and one is hard-coded.

The config in your instance was 50 requests per second, and the hard-coded one is 20 requests per second. As the hard-coded limit is smaller, it's what the program is limited by. It was running at 20 requests per second even with 50 in the config.
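
A minimal sketch of that "lower limit wins" behavior (names are illustrative, not CDL's actual internals):

# illustrative only -- not CDL's real internals
HARD_CODED_LIMITS = {"bunkr": 20}  # requests per second, per domain

def effective_rate_limit(domain: str, config_limit: float) -> float:
    # a hard-coded limit, where one exists, caps whatever the config says
    return min(config_limit, HARD_CODED_LIMITS.get(domain, config_limit))

print(effective_rate_limit("bunkr", 50))  # 20, even though the config says 50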
 

Somerandostuff

Casual
Oct 4, 2022
Just some clarification...

I understand you're saying there is a hard limit for Bunkr which I cannot go above no matter what the settings are. But when I set it to 20, it seemed considerably slower than before.

I wasn't using the config when I got blocked. I had issues with the YAML file a while back and stopped using it. I've also reinstalled Windows, reinstalled the downloader, and only set up the batch file with an output directory. Everything else was left untouched, so whatever rate it was running at wasn't being pulled from the YAML. The default of 50 I was speaking of is what the wiki says the default is, and it's what was set in the non-applied/unused YAML. I changed it to 20, along with modifying the "UI" and changing the attempt limit for some pesky downloads that keep failing. After that, it seemed to take quite a bit longer to parse URLs than before.


I got blocked while it was still parsing the URLs (all the URLs started turning red and stayed that way in the CLI). I opened my browser and found that I could not access the site. I connected a browser-only VPN client and my browser could once again access the site, but the downloader on my PC couldn't. I connected a proper VPN client on my machine and the downloader started working again. Dropped the VPN and it dropped traffic to the site again.

My normal URL list is about as big as it's ever been, at about 220 URLs to parse on startup. But I had been stockpiling a bunch of Bunkr URLs for when that finished (I kept having failures on files, which is why I never cleaned the list up and just kept piling on more "important" URLs). My to-do Bunkr URL list is about 925 URLs. I dropped that list at the top of my normal URLs.txt and started the batch file. It had only made it a fraction of the way through that list. I'm pretty sure it made it further than the 220 or so in my other list, but I don't think it made it halfway.
 

noxod

Bathwater Drinker
Mar 21, 2022
Any way to limit the number of downloads happening at the same time? And/or limit download speeds?
 

Jules--Winnfield

Cyberdrop-DL Creator
Mar 11, 2022
You can't limit the download speed using CDL. You might be able to using a third-party utility and specifically limiting the command prompt and Python.

You can, however, change the number of simultaneous downloads.


This isn't an overarching limit, but per domain. If you set it to 1 and try to download from both GoFile and Bunkr, for example, each site will download one at a time, but there will be two downloads running at the same time.
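
Roughly how that per-domain cap behaves, as a sketch (illustrative, not CDL's actual code):

import asyncio

MAX_PER_DOMAIN = 1
semaphores: dict[str, asyncio.Semaphore] = {}

def limiter_for(domain: str) -> asyncio.Semaphore:
    # each domain gets its own semaphore, so the cap applies per site,
    # not across all downloads combined
    if domain not in semaphores:
        semaphores[domain] = asyncio.Semaphore(MAX_PER_DOMAIN)
    return semaphores[domain]

async def download(domain: str, url: str) -> None:
    async with limiter_for(domain):
        ...  # actual download happens here

# gofile and bunkr each run one download at a time -- two in flight total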
 

propjoe

Lurker
Oct 27, 2022
As always, thanks for the continued support of cyberdrop-dl :pepoLove: The new console interface looks awesome.

I'm personally having trouble with downloads stalling on cyberdrop (the site). The download stops, but it doesn't error out or time out; it just sits there idle forever. It doesn't skip those files and try to download the ones it can, because all the slots are taken up by cyberdrop. What can I do to deal with this?

edit: here is the log
2023-06-12 15:02:48,452:DEBUG:downloader_utils:downloader_utils.py:81:ClientPayloadError('Response payload is not completed')
2023-06-12 15:02:48,452:DEBUG:downloader_utils:downloader_utils.py:82:Retrying (0) https://cyberdrop.me/vLq2j5Pbvjk73TFqLMBh_25_cba38f5875976371023153d9107d10ba_video-8lp3zEGA.mp4...
2023-06-12 15:03:55,494:DEBUG:downloader_utils:downloader_utils.py:81:ClientPayloadError('Response payload is not completed')
2023-06-12 15:03:55,494:DEBUG:downloader_utils:downloader_utils.py:82:Retrying (0) https://cyberdrop.me/video_2021-06-27_02-52-12-6DJyX3Uo.mp4...
2023-06-12 15:04:24,096:DEBUG:downloader_utils:downloader_utils.py:81:ClientPayloadError('Response payload is not completed')
2023-06-12 15:04:24,096:DEBUG:downloader_utils:downloader_utils.py:82:Retrying (0) https://cyberdrop.me/VID_20201206_222832_021-RicWsPbq.mp4...
2023-06-12 15:06:21,168:DEBUG:downloader_utils:downloader_utils.py:81:ClientPayloadError('Response payload is not completed')
2023-06-12 15:06:21,168:DEBUG:downloader_utils:downloader_utils.py:82:Retrying (1) https://cyberdrop.me/video_2021-06-27_02-52-12-6DJyX3Uo.mp4...
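
edit 2: if it helps anyone, the stall looks like a response body being read with no socket-read timeout, so nothing ever raises. A generic sketch of the kind of timeout that turns a dead transfer into a retryable error (plain aiohttp, not necessarily how CDL wires it up):

import aiohttp

# total=None so long downloads aren't cut off mid-transfer; sock_read makes
# a stalled body raise a TimeoutError instead of sitting idle forever
TIMEOUT = aiohttp.ClientTimeout(total=None, sock_read=60)

async def fetch(url: str) -> bytes:
    async with aiohttp.ClientSession(timeout=TIMEOUT) as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            return await resp.read()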
 

hkei

Less is More
Mar 12, 2022
If you haven't set up your config.yaml, I would suggest doing so.

Then I would do a "two-pass" run for new threads you're ripping that have cyberdrop.me links (if cyberdrop ever comes back / has some traffic left).
You'll want to have:
skip_hosts: [cyberdrop]
under the "Ignore" category in your config; then you can always remove it, or move it over to "only_hosts" if cyberdrop ever comes back, so the second pass processes only that site.

This is what I did a few months ago, when you could still download from cyberdrop.
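
If you'd rather split the list up front than flip the config between passes, a quick sketch (the host match here is a naive substring check, nothing more):

import pathlib

urls = pathlib.Path("URLs.txt").read_text().splitlines()
pass1 = [u for u in urls if u.strip() and "cyberdrop" not in u]  # everything else first
pass2 = [u for u in urls if "cyberdrop" in u]                    # cyberdrop-only second pass
pathlib.Path("URLs_pass1.txt").write_text("\n".join(pass1) + "\n")
pathlib.Path("URLs_pass2.txt").write_text("\n".join(pass2) + "\n")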
 