Tools Cyberdrop-DL: Mass Downloader for most forum sites and SimpCity threads

Dec 23, 2023
How do you force the script to re-download files you already downloaded instead of skipping them? Start from scratch, so to speak.

The script appears to be skipping files that I don't even have on my drive. I just want to start from the beginning.

affected URLs
Please, Log in or Register to see links and images
Please, Log in or Register to see links and images
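For context on why deleting the files isn't enough: Cyberdrop-DL remembers completed downloads in a history database, so a URL it has finished once stays "done" even if the file is gone from disk. Below is a minimal sketch of that idea using SQLite; the file name, table, and column names here are hypothetical stand-ins, not CDL's real schema.

```python
import sqlite3

# Hypothetical schema standing in for the downloader's history database;
# the real table and column names may differ.
def open_history(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS media (url TEXT PRIMARY KEY, completed INTEGER)")
    return conn

def already_downloaded(conn, url):
    # The downloader skips any URL it has marked completed before,
    # regardless of whether the file still exists on disk.
    row = conn.execute("SELECT completed FROM media WHERE url = ?", (url,)).fetchone()
    return row is not None and row[0] == 1

def mark_completed(conn, url):
    conn.execute("INSERT OR REPLACE INTO media VALUES (?, 1)", (url,))

def clear_history(conn):
    # "Start from scratch": wiping the table makes every URL look new again.
    conn.execute("DELETE FROM media")
```

Under this model, deleting (or renaming) the history database file on disk is what resets the skip behavior, not deleting the downloaded files themselves.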
 

BuyUsYuckFou

Casual
Mar 12, 2022
Please, Log in or Register to view quotes
In Terminal, the keyboard shortcut Control + W deletes the previous word, which lets you clear lines of URLs quickly (provided there are no spaces within the URLs). So if you want to delete all the URLs in the text file from within the UI, simply hold down that shortcut for a bit until everything is deleted.
 

Beestly

Bathwater Drinker
Apr 30, 2022
I guess I'll ask a vaguer question:

Does a forced captcha on login on ANOTHER SITE break CDL entirely, or is there a way I can get around it? It's one of the sites that has its login information in the config, which has worked for a long time. I'm assuming the newish captcha is what's breaking the functionality. Thanks.

(Thanks for the explanation, as well.)
 

EightyThree

Enthusiasm Enthusiast
Apr 3, 2022
Trying to download just the Saint links from a SimpCity thread. I set only_hosts: ['saint', 'simpcity'], but nothing gets scraped. The log says the link was skipped due to the config:

DEBUG : 2023-12-25 16:58:47,883 : utilities.py:98 : Skipping URL by Config Selections: https://simpcity.su/threads/isaramirezoficial.8281

Full log:
Code:
Please, Log in or Register to view codes content!
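One thing worth checking with host filters is exactly what string the filter compares against. The sketch below shows one plausible matching scheme (substring match on the URL's hostname); it is illustrative only, not Cyberdrop-DL's actual code. If the real matching is against exact domains or scraper names instead, an entry spelled differently from what the config expects (e.g. 'saint' vs. a numbered domain like 'saint2.su') can silently filter everything out, including links you meant to keep.

```python
from urllib.parse import urlparse

# Illustrative sketch of host-based filtering, not Cyberdrop-DL's actual code:
# when only_hosts is non-empty, a URL is kept only if one of the entries
# appears somewhere in its hostname.
def allowed(url, only_hosts):
    host = urlparse(url).hostname or ""
    return not only_hosts or any(h in host for h in only_hosts)
```

Under this naive scheme 'simpcity' would match simpcity.su, so the skip in the log above suggests the real config matches on something stricter; comparing your entries against the exact names the program expects is the first thing to verify.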
 

DarthPasma

Tier 3 Sub
Mar 12, 2022
I really don't know if this has already been answered, but is there any way to work around the 429 Too Many Requests thing on bunkrr? I can't even download a single file manually.
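A 429 is the server itself throttling you, so the only client-side mitigation is to slow down and retry. Here is a generic retry-with-backoff sketch; Cyberdrop-DL handles its own rate limiting internally, so this just illustrates the idea rather than showing the tool's code.

```python
import time
import urllib.error
import urllib.request

def backoff_delay(attempt, retry_after=None, base_delay=2.0):
    # Honor the server's Retry-After header when it is present, otherwise
    # back off exponentially: 2s, 4s, 8s, ...
    return float(retry_after) if retry_after else base_delay * 2 ** attempt

def fetch_with_backoff(url, retries=5):
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except urllib.error.HTTPError as e:
            if e.code != 429:
                raise  # only retry on rate limiting, not on other errors
            time.sleep(backoff_delay(attempt, e.headers.get("Retry-After")))
    raise RuntimeError(f"still rate-limited after {retries} attempts: {url}")
```

If the host rate-limits by IP, no amount of backoff fully fixes it; waiting a while or reducing simultaneous downloads is usually the practical answer.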
 

holyfukkk

Lurker
Dec 22, 2023
Just installed cyberdrop and tried to download a bunkrr album
Please, Log in or Register to see links and images
(306 files, 141.1 MB). Downloads complete without failures, but all I get is files with different names and the same size, each an image saying "404 - File gone Homie. Move on".

Upd: I let it finish and ended up with 20 good files actually downloaded, but 173 files are as described above.
 

mussie

Bathwater Drinker
Mar 11, 2022
First, thank you for this awesome tool that we all get to use for free.

With all the hassle of getting files from bunkr and the 429 errors, is it possible to have the program let us know when one of our URLs is 'finished'? As in, everything has been downloaded or is a 404, or something along those lines. Or some kind of status per URL (% complete, % skipped, % 404, % 429, etc.).

I'm asking because I usually have several urls in the file and I never really know the status of any of them without having to manually check the thread or album they point to.
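For what it's worth, the feature being requested is essentially a per-URL outcome counter. A minimal sketch of what that could look like (hypothetical class, not anything Cyberdrop-DL ships):

```python
from collections import Counter

class UrlStats:
    """Hypothetical per-source-URL tracker sketching the requested feature."""

    def __init__(self):
        self.counts = {}

    def record(self, source_url, outcome):
        # outcome is a label like 'ok', 'skipped', '404', or '429'
        self.counts.setdefault(source_url, Counter())[outcome] += 1

    def summary(self, source_url):
        # Percentage breakdown of outcomes for one input URL.
        c = self.counts.get(source_url, Counter())
        total = sum(c.values())
        return {k: round(100 * v / total, 1) for k, v in c.items()} if total else {}
```

Printing each input URL's summary at the end of a run would answer the "is this thread done?" question without manually rechecking the album.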
 

errananna

Bathwater Drinker
Jun 24, 2023
Is there anything I can add to the command line to default it to start downloading all configs?

--download-all-configs --retry-failed

Please, Log in or Register to view quotes

Because of the way MEGA encrypts files, you either need the MEGA app to download and handle the decryption, or you download in your browser and let the decryption happen there. Exit the MEGA app the next time you download something, and download without it: you'll see progress in the browser, but the file won't actually land in your downloads until the browser finishes decrypting it, at which point it looks like the huge download happened instantly.
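The two-phase behavior described above can be shown with a toy model: the ciphertext is fetched completely and decrypted in the page before the finished file is handed to the downloads folder. XOR stands in for MEGA's real AES encryption here; this is purely an illustration of the flow, not MEGA's actual scheme.

```python
# Toy model of a browser-side encrypted download. XOR is a stand-in cipher
# (MEGA actually uses AES); the point is the ordering of the two phases.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def browser_style_download(ciphertext_chunks, key):
    # Phase 1: stream all ciphertext into memory (the long progress bar).
    buffered = b"".join(ciphertext_chunks)
    # Phase 2: decrypt, then hand the finished file to the downloads folder,
    # which is why the final "download" appears to complete instantly.
    return xor_cipher(buffered, key)
```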
 