Tools Cyberdrop-DL: Mass Downloader for most forum sites and SimpCity threads

clocksucker

Lurker
Mar 14, 2022
5
2
58

axej

(◕‿◕✿) STYLE (◠‿◠)✌ START (◠‿◠✿)PORNHOARD
Mar 11, 2022
276
3,695
1,299
Did you ever add pixeldrain support? Or was it always just pixl?
 

rdnk

Ctrl + "WTF" for ℱ𝓪𝓷𝓬𝔂 𝓦𝓣ℱ
Mar 10, 2022
222
10,785
1,402
Are you opposed to me restructuring the code to make adding new file hosts easier and more maintainable?

I'd like to know before I spend my time optimizing the gofile scraper.

If you wanna chat about the specifics, let me know.
 

tyrz

Lurker
Mar 11, 2022
4
0
46
Is there a way to add comments to lines in the URLs file?
It would avoid having to check every link one by one to find out what it is; at the least, being able to add the URL's title after the URL itself would help.
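For what it's worth, comment support like this is simple to parse. Below is a minimal sketch of how such a reader could work; it is not Cyberdrop-DL's actual code, and the function name and `#` comment syntax are assumptions for illustration:

```python
# Hypothetical sketch: skip blank lines and full-line comments, and strip
# an optional trailing "# title" annotation when reading a URLs.txt-style
# file. This is illustrative only, not Cyberdrop-DL's implementation.
def read_urls(path: str) -> list[str]:
    urls = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and lines that are entirely comments
            if not line or line.startswith("#"):
                continue
            # Allow an inline title after the URL: "https://... # my album"
            urls.append(line.split(" #", 1)[0].strip())
    return urls
```

With that scheme, a URLs file could carry a `# comment` header and per-link titles without affecting the scrape.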
 

Jules--Winnfield

Cyberdrop-DL Creator
Mar 11, 2022
2,178
5,124
1,127
2.6.0 is out.

Big thanks to rdnk for his contribution to the new update. Thanks to him, GoFile downloading should be quite a bit better and no longer requires Chrome to be installed.

This update also has a fair bit of code cleanup, including a swap from in-memory downloading to on-disk downloading (which should make things run more smoothly).
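The in-memory vs. on-disk distinction can be sketched roughly as follows. This is not Cyberdrop-DL's actual code; the chunk size, the `.part` naming, and the function are illustrative assumptions. The idea is to stream fixed-size chunks straight to a temporary file instead of holding the whole response in memory:

```python
# Illustrative sketch: stream byte chunks to a ".part" file on disk and
# rename it only when the whole download completed, so memory usage stays
# flat and a crash leaves an obvious ".part" leftover.
import os

CHUNK_SIZE = 64 * 1024  # 64 KiB per chunk keeps memory usage bounded

def download_to_disk(chunks, dest: str) -> None:
    """Write an iterable of byte chunks to dest via a temporary .part file."""
    part = dest + ".part"
    with open(part, "wb") as f:
        for chunk in chunks:  # e.g. an HTTP response iterated in CHUNK_SIZE pieces
            f.write(chunk)
    # Rename only after every chunk landed; an interrupted run keeps the .part
    os.replace(part, dest)
```

The `.part` suffix also makes incomplete downloads easy to spot later, which ties into the retry discussion further down the thread.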
 

humanwan

Casual
Mar 11, 2022
5
26
225
Strange, for the URL
Please, Log in or Register to see links and images
, I get a "no links found" error. I tried another bunkr album and it worked fine. This is a fresh install on macOS.
 

neolith

egirl shill
Mar 11, 2022
597
37,756
1,793
Any way to implement a "done.txt" that logs links you've successfully scraped, so that we know which ones we can safely remove from the URLs.txt on huge scrapes?

This is my current solution, but I'm sure it has some false positives.

Python:
Please, Log in or Register to view codes content!
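Since the snippet above is hidden behind the login wall, here is a hedged sketch of one way such a done.txt could work. The file names, function names, and matching logic are assumptions for illustration, not neolith's actual code or Cyberdrop-DL's:

```python
# Hypothetical sketch: append each source link to done.txt once its scrape
# finishes, then filter URLs.txt against it on the next run. Note this only
# records that a link was *scraped*, not that every file finished, which is
# the kind of false positive mentioned above.
def mark_done(url: str, done_path: str = "done.txt") -> None:
    with open(done_path, "a", encoding="utf-8") as f:
        f.write(url + "\n")

def remaining_urls(urls_path: str = "URLs.txt",
                   done_path: str = "done.txt") -> list[str]:
    try:
        with open(done_path, encoding="utf-8") as f:
            done = {line.strip() for line in f}
    except FileNotFoundError:
        done = set()
    with open(urls_path, encoding="utf-8") as f:
        return [u.strip() for u in f if u.strip() and u.strip() not in done]
```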
 

Jules--Winnfield

Cyberdrop-DL Creator
Mar 11, 2022
2,178
5,124
1,127
Please, Log in or Register to view quotes
With how the program is currently set up, no, at least if you are looking for the original links that are in URLs.txt. If you are fine with the end links, you can look at the logs.log file. I'd have to pass the original link through to the very end of the program, and then check that all its downloads completed fully before adding it to the done.txt. Right now it'd be easier to just see which folders have ".part" files; the links that correspond to those need to be run again.

Edit: also, just running the program again will verify that all files are complete, and that's a really quick process once the link scraping finishes.
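The manual ".part" check described above can be scripted in a few lines. This is a sketch under the assumption that incomplete downloads leave ".part" files inside per-album subfolders; the folder layout and function name are illustrative, not taken from Cyberdrop-DL:

```python
# Sketch: walk the download directory and collect subfolders that still
# contain ".part" files, i.e. albums whose links should be run again.
import os

def incomplete_folders(root: str) -> set[str]:
    leftovers = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        if any(name.endswith(".part") for name in filenames):
            leftovers.add(dirpath)
    return leftovers
```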
 