Tools Cyberdrop-DL: Mass Downloader for most forum sites and SimpCity threads

torrentsite69420

Tier 3 Sub
May 30, 2022
Is there a way to tell exactly which videos failed to download when downloading a bunkr album? I went through the "downloader" txt log and there are plenty of entries that say something like

2023-06-01 21:06:55,085:DEBUG:downloader_utils:downloader_utils.py:79:ClientPayloadError('Response payload is not completed')

but there are lots of retries, so there are way more of these entries than actual failed downloads. My album had 270 files and 12 failed.
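The closest I've gotten is grepping the log myself. This assumes the URL or filename gets logged somewhere near each error line, which I haven't actually confirmed:

Bash:
# count the raw error lines (retries inflate this well past the real failure count)
grep -c "ClientPayloadError" downloader.log

# grab a few lines of context around each error and fish out anything URL-shaped
grep -B 3 "ClientPayloadError" downloader.log | grep -oiE "https?://[^ ]+" | sort -u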
 

torrentsite69420

Tier 3 Sub
May 30, 2022
Opened it in notepad and I don't see an "output_failed" line

Apply_Config: false
Configuration:
  Authentication:
    gofile_api_key: ''
    gofile_website_token: ''
    *Blacklisted site*_password: ''
    *Blacklisted site*_username: ''
    pixeldrain_api_key: ''
    simpcity_password: ''
    simpcity_username: ''
    *Blacklisted site*_password: ''
    *Blacklisted site*_username: ''
    xbunker_password: ''
    xbunker_username: ''
  Files:
    db_file: download_history.sqlite
    errored_download_urls_file: Errored_Download_URLs.csv
    errored_scrape_urls_file: Errored_Scrape_URLs.csv
    input_file: URLs.txt
    log_file: downloader.log
    output_folder: Downloads
    output_last_forum_post_file: URLs_last_post.txt
    unsupported_urls_file: Unsupported_URLs.csv
  Forum_Options:
    output_last_forum_post: false
    scrape_single_post: false
    separate_posts: false
  Ignore:
    exclude_audio: false
    exclude_images: false
    exclude_other: false
    exclude_videos: false
    ignore_cache: false
    ignore_history: false
    only_hosts: []
    skip_hosts: []
  JDownloader:
    apply_jdownloader: false
    jdownloader_device: ''
    jdownloader_password: ''
    jdownloader_username: ''
  Progress_Options:
    hide_album_progress: false
    hide_domain_progress: false
    hide_file_progress: false
    hide_forum_progress: false
    hide_new_progress: false
    hide_overall_progress: false
    hide_thread_progress: false
    refresh_rate: 10
    visible_rows_albums: 2
    visible_rows_domains: 2
    visible_rows_files: 10
    visible_rows_threads: 2
  Ratelimiting:
    connection_timeout: 15
    ratelimit: 50
    throttle: 0.5
  Runtime:
    allow_insecure_connections: false
    attempts: 10
    block_sub_folders: false
    disable_attempt_limit: false
    include_id: false
    max_concurrent_albums: 0
    max_concurrent_domains: 0
    max_concurrent_downloads_per_domain: 4
    max_concurrent_threads: 0
    output_errored_urls: false
    output_unsupported_urls: false
    proxy: ''
    remove_bunkr_identifier: false
    required_free_space: 5
    skip_download_mark_completed: false
    user_agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/112.0
  Sorting:
    sort_directory: Sorted Downloads
    sort_downloads: false
    sorted_audio: '{sort_dir}/{base_dir}/Audio'
    sorted_images: '{sort_dir}/{base_dir}/Images'
    sorted_others: '{sort_dir}/{base_dir}/Other'
    sorted_videos: '{sort_dir}/{base_dir}/Videos'
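Looking at that config again, the only keys that seem related are these two. I'm only guessing from the names, but maybe flipping output_errored_urls to true is what actually makes it write that Errored_Download_URLs.csv file?

Code:
Files:
  errored_download_urls_file: Errored_Download_URLs.csv
Runtime:
  output_errored_urls: true   # default was false; guessing this is the switch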
 
Aug 11, 2022
I had used the program to download a few threads. I assumed that any new posts to the threads would automatically download if I were to rerun the program. What should I do specifically to make sure it does? You say to give it the links again, but the links are already in the URLs file. Should I rerun with it cleared and then again with the links I want downloaded?
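For now I'm just re-running it against the same URLs.txt and hoping the history database stops it re-downloading everything; that's my reading of the db_file / ignore_history settings in the config posted above, but I could be wrong:

Bash:
# re-run with the same URLs.txt still in place; if I understand it right,
# download_history.sqlite makes it skip files already grabbed, so only new posts come down
cyberdrop-dl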
 

Big donut hair

Diamond Tier
Mar 15, 2022
Is there a way to download all the pictures from a user on jpg.pet.com, specifically if they're not in an album? I've tried downloading this album but only 42/690 images downloaded. Is there something I'm doing wrong?

 

Tubbz

Bathwater Drinker
Apr 6, 2022

I've been wanting to give this a try and I had the same question! :monkaHmm:

Anyway, it was really easy to get this running from a Docker image. Here's how you do it:

1. Create a Dockerfile like this

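Mine is basically just a Python base image with the tool pip-installed; something along these lines should work (the base image version and the /data path are just my picks, and I'm assuming the PyPI package and the console script are both named cyberdrop-dl):

Code:
# minimal image: Python base with cyberdrop-dl installed from PyPI
FROM python:3.11-slim
RUN pip install --no-cache-dir cyberdrop-dl
# /data is where the host directory with URLs.txt gets mounted
WORKDIR /data
ENTRYPOINT ["cyberdrop-dl"]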

2. Build the image:

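Run this from the directory containing the Dockerfile; the cyberdrop-dl tag is just the name I gave the image:

Code:
docker build -t cyberdrop-dl .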

3. Create a new directory and add a URLs.txt file
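For example (the folder name and the URL here are just placeholders):

Bash:
mkdir cyberdrop && cd cyberdrop
# one link per line; this URL is only an example
echo "https://bunkr.example/a/some-album" > URLs.txt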

4. Run a container from the directory containing the URLs.txt file

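Roughly this; -it matters because the downloader draws a live progress display, and /data has to match the WORKDIR from the Dockerfile above:

Bash:
# anything after the image name gets passed straight through to cyberdrop-dl
docker run --rm -it -v "$PWD:/data" cyberdrop-dl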

This spits out some files the first time you run it (cache, config, downloads, etc.) so you should run it from the same directory every time, or replace $PWD with a specific path of your choice. You should still be able to pass any cyberdrop-dl arguments to it.

Also, you might wanna create an alias for that last command.
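Something like this in your shell config (call it whatever you want):

Bash:
# ~/.bashrc or ~/.zshrc -- $PWD expands when you run the alias, not when you define it
alias cdl='docker run --rm -it -v "$PWD:/data" cyberdrop-dl'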
 

J4k_The_Reaper

Tier 2 Sub
Mar 12, 2022
Hey Jules... firstly, just wanted to say that you honestly are the GOAT. This script is very handy (no pun intended, seriously) and you are such a patient human, considering how many issues we all keep posting. Second, I seem to have this traceback issue lately. I dunno if it's normal due to a file not being able to download or what. Here is the downloader.log via G-Drive. One is viewable and there's a direct link to it if you need to download it... (the error is close to the bottom of the log)

