Edit: Never mind, I didn't see the `ipfs add ...` call... I guess this just goes to show how simple it is to use IPFS. ;)
Edit2: I can't seem to resolve any of my files on `ipfs.io` -- probably some problem with my network config that's preventing peers from mirroring my file?
Sharing screenshots quickly sounds like a useful app. I'm interested immediately.
I've never used IPFS so when I see the installation steps list things like "install IPFS" and "start IPFS daemon," I'm scared to actually take those actions. I just wonder what this is going to do to my system. Is it going to be running a web-accessible service?
My vague idea of what IPFS is leads to more questions:
Can I remove content after publishing?
Do I have to have the daemon running forever?
Is it like pastebin for images? Am I anonymous or not?
etc.
Because of those types of questions, I haven't actually tried it out yet, but this seems like a good opportunity to package up a 'program' that people can just 'run' on their computers. If you're relying on an existing network that any party can enter, the program just works and you don't have any infrastructure. Am I on the right track here?
I'm not really looking for answers to any of those questions, just pointing out what thoughts I have after reviewing your project. If I wanted to know the answers, I would go spend a while learning about IPFS.
If the author is still here, I am wondering what benefits/drawbacks using IPFS has compared to, say, a shell script that uploads images to your own server and places the link in your clipboard.
The IPFS daemon installation will result in an internet-facing service being installed. It's not "web accessible," but it is accessible over the protocols that IPFS uses. Other nodes will contact your instance and communicate with it.
Your node will share its external IP address, and all internal IP addresses configured on the machine it is running on, with all other IPFS nodes.
You can remove the content after publishing, but then other nodes won't be able to retrieve it. If you know the other party has received the content, you can use IPFS commands to remove it and eventually, as nodes garbage-collect their content, it will go away. This assumes no other node decides to permanently pin your content before you remove it.
You have to have the daemon running for as long as required for other parties to retrieve the content. If you can get another node to 'pin' the content you can shut your node down when that node has retrieved the data.
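A sketch of that pin-and-unpin flow with the IPFS CLI (assuming a running daemon on both machines; the hash is the one used elsewhere in this thread as an example):

```
# On another node that should keep the file available:
$ ipfs pin add QmTf3EquRYzxa4njTRpimp6YEcJJPSUf5pQE9TE54qCa3M

# On your own node, to stop hosting it:
$ ipfs pin rm QmTf3EquRYzxa4njTRpimp6YEcJJPSUf5pQE9TE54qCa3M
$ ipfs repo gc
```

After the `repo gc`, your node no longer serves the blocks, but any node that pinned them continues to.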
You are not anonymous. If you share the hash of the content, any IPFS user can go from that hash to your node hash, and from there to all IP addresses configured on your machine, with a couple of IPFS commands, for as long as the content is shared by your IPFS daemon.
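That hash-to-IP chain can be sketched with two CLI commands (assuming a running daemon; `<peer-id>` is a placeholder for one of the peer IDs the first command returns):

```
# List peer IDs of nodes providing the content behind a hash:
$ ipfs dht findprovs QmTf3EquRYzxa4njTRpimp6YEcJJPSUf5pQE9TE54qCa3M

# Look up the multiaddrs (IP addresses) that peer announces:
$ ipfs id <peer-id>
```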
> Can I remove content after publishing?

No. Once a file has been hashed, that hash always equals that file. No. Matter. What.
Now, if no one else in the world has that file that equals that hash, then the content is dead.
But you can never remove things. It wouldn't make sense to do so.
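The "that hash always equals that file" property is just deterministic hashing. A minimal sketch using plain `sha256sum` (IPFS actually uses multihashes over chunked DAGs, but the principle is the same — identical bytes, identical hash):

```shell
# Two files with the same bytes produce the same digest.
printf 'hello\n' > a.txt
printf 'hello\n' > b.txt
sha256sum a.txt b.txt | awk '{print $1}' | uniq | wc -l   # prints 1: the digests match
```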
>Do I have to have the daemon running forever?
You can run it as long as you want. But if you're the only host of certain images, then quitting means those files can't resolve. If your stuff is popular, though, other IPFS daemons will also provide bandwidth and capacity.
> Is it like pastebin for images? Am I anonymous or not?
It's like Pastebin, kind of. Your node is NOT anonymous, but the content itself can be. The content is its hash, no matter who shared it.
In the truest sense of the word, if data for a given hash is on any computer, that computer can provide the data behind the hash.
If that data was found 1000 years from now, and entered into the system, the data would generate the same hash, and then be able to fulfill any hash requests.
Also, IPFS does data deduplication across blocks (chunks of 512 KB, I believe), files, and directories. So if I share a source repo that includes someone else's repo, and they share theirs too, then my copy helps speed up their files as well.
There's an ongoing project to get OpenStreetMap into IPFS, so that local areas have their own hash, all the way up to state, country, continent, and world. That way the whole geographic repo could be shared amongst all, without the burden of storage falling on a single person or group.
I don't really grasp the point. Isn't ipfs.io still pretty much a centralized service? So you have the image stored in IPFS, but what is the added value over using something like imgur?
ipfs.io is simply a public gateway to the ipfs network. If you have the ipfs daemon running, you can fetch the image through the ipfs network like this:
$ ipfs get QmTf3EquRYzxa4njTRpimp6YEcJJPSUf5pQE9TE54qCa3M
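Without a local daemon, any public gateway can serve the same hash over plain HTTP (assuming some node is still hosting the content):

```
$ curl -s https://ipfs.io/ipfs/QmTf3EquRYzxa4njTRpimp6YEcJJPSUf5pQE9TE54qCa3M -o screenshot.png
```

The gateway is just a convenience; the same hash resolves through any gateway or any daemon, which is the decentralized part.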
Hi guys, just noticed I got on the frontpage! I wanted to demo how easy it is to use IPFS and was a bit surprised no one had made a CloudApp clone with this tech. As you can see in the code, it took like 30 min to put together. I'm not that familiar with IPFS either, so thanks for answering all the questions, guys.
FYI: thanks to the guys who put in pull requests; I've already merged them in.
Creator of CloudApp here (no longer involved). This is really neat, thanks for sharing! You could potentially monitor ~/Desktop on OS X and identify screenshots with:
mdls 'Screen Shot 2016-10-24 at 4.14.54 PM.png' | grep IsScreenCapture
That would match the original behavior even closer.
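One way to wire that check up, as a macOS-only sketch (assumes a running ipfs daemon; `kMDItemIsScreenCapture` is the Spotlight attribute that grep was matching):

```
# macOS-only sketch: add any screenshots found on the Desktop to IPFS.
for f in "$HOME"/Desktop/*.png; do
  # mdls -raw prints just the attribute value; screenshots report 1 (true)
  if [ "$(mdls -raw -name kMDItemIsScreenCapture "$f" 2>/dev/null)" = "1" ]; then
    ipfs add "$f"
  fi
done
```

A real watcher would use an FSEvents-based tool instead of a one-shot loop, but this shows the filtering idea.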