greytmeam | 12 points
This post will likely be pretty stupid, but I plan to ask either way:
I am planning to set up automatic downloading of /r/opendirectories through wget. Is there any way to automatically get content from /r/megalinks as soon as it's released, similar to the above? I presume not (otherwise the bots of the companies whose content it is would report it fast), but wanted to ask either way. Also, for backups of data in encrypted containers, does anyone have a suggestion other than MultCloud for transferring backups between online providers? (I have a fair amount more than 2TB on Google Drive, about 17TB on MEGA, and might also start adding to Amazon.)
Thanks for being such an awesome community, guys, and I appreciate any advice!
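A sketch of how the "automatic downloading through wget" part could be scheduled with cron; the URL, target directory, and flag choices below are placeholders, not anyone's actual setup:

```
# Nightly at 03:00: mirror one open directory (placeholder URL).
# -r recurse, -np don't ascend to the parent directory,
# -nc skip files already fetched, -P set the download directory.
0 3 * * * wget -r -np -nc -P /data/opendirs http://example.com/files/
```

Since cron treats each line as one job, the whole command has to stay on a single line.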
[-] starbuck93 | 2 points
[-] ruralcricket | 3 points | Apr 11 2017 23:30:45
Probably not, since posters are using various methods to prevent automation from working. Did you see the posting that used steganography to hide the links?
I'd also advise against auto-downloading all of the content of open directories. Besides needing petabytes like /u/Dougygob mentioned, you also risk downloading a lot of duplicate content, poor content (480p movies), content you really don't want in your possession, and personal financial documents (e.g. tax returns). Finally, you get to weed through the crap to find the stuff you want to keep. Oh, and do you have usage limits on your ISP?
Example - I like e-books. Found a list of Calibre libraries. WGET'd just
permalink
[-] greytmeam | 2 points | Apr 12 2017 00:05:57
You bring up fair points. I have unlimited storage, so file count shouldn't be an issue, and sorting can be done essentially through FileBot and similar scripts, but it would still require some sorting and removal of duplicate (and low-quality) content. Ultimately, I might go back to using Sonarr, Headphones, and CouchPotato to auto-torrent, but I was hoping to avoid it. If you care to share the directories you used for books (and whatever else you might want to offer), though, I'd love to visit them and download (PM or comment). Thanks a ton for the input!
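One low-tech way to handle the duplicate-removal part (separate from FileBot's renaming): hash every file and list the ones whose contents collide. A sketch with made-up file names, assuming GNU coreutils:

```shell
# Set up a toy download folder (contents are illustrative only).
mkdir -p downloads
printf 'same bytes\n'  > downloads/show.s01e01.mkv
printf 'same bytes\n'  > downloads/show.s01e01.copy.mkv
printf 'other bytes\n' > downloads/show.s01e02.mkv

# Hash every file, sort by hash, then print only lines whose first 32
# characters (the md5 digest) repeat, i.e. byte-identical duplicates.
find downloads -type f -exec md5sum {} + | sort | uniq -w32 -D
```

Which copy to delete is then a manual (or scripted) decision; note that `-w32` and `-D` are GNU `uniq` extensions.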
permalink
[-] Dougygob | 5 points | Apr 12 2017 00:20:28
I'd be happy to help with the sorting! (I have no life; I play video games all day and check /r/megalinks three times an hour.)
permalink
[-] PainDoflamiongo | 3 points | Apr 12 2017 14:42:06
bruh i need an income source like yours.
permalink
[-] Dougygob | 2 points | Apr 12 2017 17:33:45
I have none. I really do live with my parents, and I'm homeschooled; moved too. No friends, just video games. Pretty neato life tho, excluding the loneliness and other shit.
permalink
[-] Cartmanishere | 2 points | Apr 12 2017 18:15:16
Can relate.
permalink
[-] greytmeam | 2 points | Apr 12 2017 18:35:57
Haha, that's kinda awesome. I'm pretty much in a similar situation -- high school is getting pretty boring, so I'm just dicking around with this type of stuff. Pro tip: try to get a girlfriend and/or find hobbies. For me: getting a girlfriend, rock climbing, sailing, etc. were the solution.
permalink
[-] greytmeam | 1 point | Apr 12 2017 18:34:08
Thanks a ton! For now, I'm probably gonna do most of it on my own, but I can send you links to it when it's in a somewhat stable state (and has a decent amount of content). Always appreciate help!
permalink
[-] ruralcricket | 3 points | Apr 12 2017 02:57:20
https://www.reddit.com/r/opendirectories/comments/5dw9gr/ebook_database_of_over_200000_ebooks/
permalink
[-] greytmeam | 1 point | Apr 12 2017 18:33:07
Awesome, thanks!
permalink
[-] ruralcricket | 1 point | Apr 12 2017 03:04:57
Also, I have this list of Calibre sites lying around (probably outdated). You need to visit the site and then modify the num=xxx to a value larger than the actual number of books.
Then use wget. The -i sitelist.txt assumes the list is in a file; you could of course just do one at a time.
The contents of sitelist.txt: https://pastebin.com/1kZx9BYX
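The two steps described above (bump num=xxx, then feed the list to wget with -i) might look roughly like this; the example URL and the value 100000 are assumptions, not taken from the actual list:

```shell
# sitelist.txt holds one Calibre library URL per line (made-up URL):
printf 'http://example.com/calibre/browse?num=25\n' > sitelist.txt

# Raise the num= parameter to something larger than any library's
# actual book count, so each listing page shows every book:
sed -i 's/num=[0-9]*/num=100000/' sitelist.txt

# Then hand the whole list to wget; -i reads URLs from a file.
# (Left commented out here because it would hit the network.)
# wget -i sitelist.txt
```

The sed call edits the file in place; `-i` without a suffix is GNU sed behavior.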
permalink
[-] greytmeam | 1 point | Apr 12 2017 18:33:01
Awesome, thanks a bunch! I'll be sure to do that this weekend when I have a bit more time on my hands.
permalink
[-] Cartmanishere | 1 point | Apr 12 2017 18:18:08
This isn't the one you asked for, but this site has quite a lot of mobi ebooks; probably in the hundreds of thousands.
permalink
[-] greytmeam | 1 point | Apr 12 2017 18:32:40
Awesome, thanks! I'll check it out this weekend. I typically use ebook.bike, but I'm not sure how to bulk download from there with scripts (tbh, I'm pretty bad at scripting).
permalink
[-] Cartmanishere | 1 point | Apr 12 2017 18:38:36
You can use wget to download this. Roughly 8.2GB.
Here's the relevant thread on /r/opendirectories.
permalink
[-] sneakpeekbot | 2 points | Apr 12 2017 18:38:41
#1: Every Season of Game of Thrones in 1080p - Updated Weekly | 22 comments
#2: Princeton.edu's FTP server, it contains theses submitted by every student from every year since the mid-1980s, among various other files hosted by the university. | 27 comments
#3: 2.7TB EN MSDN Dump [all microsoft products from its developer network] | 75 comments
I'm a bot, beep boop | Downvote to remove | Contact me | Info | Opt-out
permalink
[-] greytmeam | 1 point | Apr 12 2017 20:55:11
Awesome, thanks! Do you by chance know if I can do something similar for sites like ebook.bike? I recall them having some token based download thing, but I'm not sure.
permalink