Moderator: CCGHQ Admins
by MidnightLightning » 07 Nov 2016, 18:24
I think IPFS might be a good option for this community: it offers the same sort of "browse for individual files" flexibility, but doesn't have the "single point of failure" problem (if Mega were ever forced to shut down, those files would be gone). IPFS is more like a torrenting setup, where community members can locally seed the files, but it allows picking and choosing individual files from the collection.
For example, I seeded the 4th Edition XLHQ folder, and it now can be accessed from:
https://gateway.ipfs.io/ipfs/QmRPFDMW86yhqwVK7DUPYaa43eN59XrraCXdq6WcCVFZTp
Anyone who wants to run a local IPFS node can issue the command:
Code:
ipfs pin add QmRPFDMW86yhqwVK7DUPYaa43eN59XrraCXdq6WcCVFZTp
The downside is that adding files to a folder changes the folder's hash/identifier. Right now, the Mega link https://mega.nz/#F!p8RBBT6Y!ksgSGJbMsKU0HX_ho-QS5g always goes to the most up-to-date folder with all the newest sub-folders added, but directing users to an IPFS folder would mean updating the link's hash every time new folders were added at the root level. Alternatively, IPFS has a naming system (IPNS) that would let CCGHQ keep a stable name pointing at the most recent hash, though that part of the IPFS protocol doesn't seem to be finalized yet.
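To see why the folder hash changes, here is an illustrative sketch (not IPFS's real chunking/DAG algorithm, and all file names below are invented): the identifier is derived from the folder's contents, so adding a file necessarily produces a new identifier.

```shell
set -e
rm -rf /tmp/ccghq-hash && mkdir -p /tmp/ccghq-hash/4ED
echo "scan data" > /tmp/ccghq-hash/4ED/card001.txt

# Hash every file name + content in the tree into one short identifier.
folder_id() {
  (cd "$1" && find . -type f | sort | xargs sha256sum) | sha256sum | cut -c1-16
}

folder_id /tmp/ccghq-hash > /tmp/ccghq-hash-v1
mkdir -p /tmp/ccghq-hash/5ED            # a new set is added at the root
echo "new scan" > /tmp/ccghq-hash/5ED/card001.txt
folder_id /tmp/ccghq-hash > /tmp/ccghq-hash-v2
diff /tmp/ccghq-hash-v1 /tmp/ccghq-hash-v2 || echo "identifier changed"
```

The same effect is why a Mega-style "stable folder URL" needs an extra indirection layer (like IPNS) on a content-addressed system.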
Anyone have thoughts on this or tried working with this in the past?
by JohnnyTH » 10 Nov 2016, 17:17
Plus, for this kind of repository we need something more like Git: people tend to sync against this archive, so if random files are patched or updated we also need a means to view the change history and fetch only the updates from a certain point onward.
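A hypothetical sketch of that workflow (file names and commit messages invented): if the archive were kept in git, the change history would be queryable, and you could list exactly what changed since any point you last synced.

```shell
set -e
rm -rf /tmp/ccghq-history
git init -q /tmp/ccghq-history
cd /tmp/ccghq-history
git config user.email demo@example.com
git config user.name demo
echo "scan v1" > 4ED.txt
git add . && git commit -qm "Add 4th Edition scans"
first=$(git rev-parse HEAD)              # remember the last-synced point
echo "scan v2" > 4ED.txt
git commit -qam "Patch a corrupted 4ED scan"
# Everything that changed since the remembered point:
git log --oneline "$first"..HEAD
git diff --stat "$first" HEAD
```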
by MidnightLightning » 16 Nov 2016, 06:51
by JohnnyTH » 19 Dec 2016, 12:44
Do you have any other solution in mind? I started a git repository locally, checking in all the sets I'm interested in, but rebuilding the whole tree in git could take a toll; I'm not sure how git behaves with this much data.
Git also has the advantage of being a distributed version-control system, so each of us could keep our own repo and we'd just need a way to sync regularly somehow.
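A minimal sketch of that distributed setup, with invented names and paths: one member's repo serves as a peer remote, and another member clones it and later pulls only the new commits.

```shell
set -e
rm -rf /tmp/ccghq-alice /tmp/ccghq-bob
git init -q /tmp/ccghq-alice
cd /tmp/ccghq-alice
git config user.email alice@example.com
git config user.name alice
echo "4ED scans" > sets.txt
git add . && git commit -qm "Initial archive"

# Bob clones Alice's repo: he now holds the full history locally too.
git clone -q /tmp/ccghq-alice /tmp/ccghq-bob

# Alice adds a new set; Bob syncs by pulling just the new commits.
echo "5ED scans" >> sets.txt
git commit -qam "Add 5th Edition"
git -C /tmp/ccghq-bob pull -q origin
```

Because every clone has the full history, the archive survives any single host going down, which addresses the Mega single-point-of-failure concern as well.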
by skibulk » 19 Dec 2016, 14:09
1) It would take a ton of work to set up, and I just don't have the time to do it. It might be manageable with a tried-and-true database like Magic Album's, but GH has requested that we not use his data.
2) I fear that somebody would complain about copyright infringement and our git would just be taken down. I posted a question to Law Stack Exchange but got no input on this topic.
by charlequin » 19 Dec 2016, 16:52
I haven't compared closely, but at least in my past spot checks their data has been pretty spot-on.