
A short trip to Jupyter via the Inter-planetary File System

Getting started with IPFS, Python, and Jupyter Notebooks

The Inter-planetary File System (IPFS)

IPFS is the Distributed Web. It is an idea and a protocol designed to support a distributed file system that seeks to connect all computing devices with the same system of files. There has been plenty of hype around IPFS, but much of it is well-deserved. There’s a growing list of awesome apps building on IPFS technologies, and the open source community around it is exploding. Why should you care? Because IPFS offers exciting opportunities for data sharing and access, code versioning, and a whole lot more to come. And the best part? It’s actually quite easy to get started on the distributed web!

While the reference implementations for IPFS are written in Go and Javascript, Pythonistas can still take part in the decentralized web using their favorite language. In this short post, we’ll go through the setup required to get you started, and point out some resources to satisfy your (hopefully building) curiosity.

Install IPFS

You can start with the official install page here. Grab the IPFS binary for your platform, and install it like you would any piece of software (you can also read the README for help). The trick here is to make sure it is installed somewhere in your $PATH. Try running ipfs --version to make sure it’s working. IPFS uses a local object repository, stored in ~/.ipfs, which you can initialize with ipfs init. Feel free to follow the suggested commands there, and get a feel for the IPFS command-line tools.

Once you’re ready to take things to the next level, run the daemon in another terminal: ipfs daemon. Watch for (at least) three lines to appear, and make note of the tcp ports you get. Now, if you’re connected to the network, you should be able to see the ipfs addresses of your peers: ipfs swarm peers (there should be at least 4 to start with).

Client Library

Now it’s time to install the Python client library to interact with your running daemon’s local IPFS API. Your usual pip install ipfsapi will get you what you need. This is probably a good point to recommend you install the latest version of IPFS, and run Python 3.4 or greater (not strictly required, but it will make your life much easier all around).
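As a quick sanity check, you can ask the daemon for its version through the API. This is a sketch that assumes the daemon from the previous section is still running on the default API port, 5001:

```python
# Sanity check: connect to the local IPFS API and ask for its version.
# 5001 is the daemon's default API port; adjust it if you changed yours.
host, port = "localhost", 5001

try:
    import ipfsapi
    api = ipfsapi.Client(host, port)
    print(api.version())
except Exception as exc:
    # The daemon isn't running (or the library isn't installed)
    print("Could not reach the IPFS API at {}:{} -> {}".format(host, port, exc))
```

If the connection succeeds, you should see a dict describing your daemon’s version; if not, double-check that ipfs daemon is still running.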

Jupyter Notebook

Let’s fire up a Jupyter Notebook running IPython, and start playing around with IPFS: jupyter notebook, then New > Python. I’ve also made one available over IPFS to get you started.

We’ll start with the basics:

import ipfsapi

api = ipfsapi.Client("localhost", 5001)

and reproduce the little example that gets printed when you initialized your peer:
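For instance, ipfs init suggests cat-ing a readme out of your local repository, and the same thing works through the API. A sketch (the hash below is a placeholder; substitute the one printed by your own ipfs init):

```python
# Placeholder hash: use the one printed by your own `ipfs init`
readme_path = "QmYourInitHashHere/readme"

try:
    import ipfsapi
    api = ipfsapi.Client("localhost", 5001)
    print(api.cat(readme_path).decode())
except Exception as exc:
    print("Could not fetch {}: {}".format(readme_path, exc))
```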


You can even use administrative functions, such as listing locally pinned files with api.pin_ls(type='all'). To make things a bit more interesting, we can take advantage of plotting functionality in Notebooks by directly accessing images (in this case, an XKCD comic) and displaying them inline:

from IPython.display import Image

path = "QmSeYATNaa2fSR3eMqRD8uXwujVLT2JU9wQvSjCd1Rf8pZ/1553 - Public Key/1553 - Public Key.png"
Image(api.cat(path))

And because we’re working in Python, we can take advantage of some of the nice features these bindings afford us, such as helper functions for adding strings and dicts directly to IPFS:

metadata = {"data": "about data"}
cid = api.add_json(metadata)
# You can also check the public gateway:
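The comment above hints at a nice property of IPFS: any public HTTP gateway can serve content by its hash. A minimal sketch (the CID below is a placeholder standing in for the hash returned by api.add_json, not real content):

```python
# A placeholder CID, standing in for the hash returned by api.add_json
example_cid = "QmPlaceholderCid"

# Any public HTTP gateway can resolve content your peer has announced,
# using the path scheme /ipfs/<cid>
gateway_url = "https://ipfs.io/ipfs/" + example_cid
print(gateway_url)
```

Opening that URL in a browser fetches the content from the network, no local daemon required on the reader’s side.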

and then check that our data is actually available via our local peer:

import json

assert json.loads(api.cat(cid).decode()) == metadata

And that’s it for now! At Textile, we use a whole range of IPFS tools to interact with our own data, as well as the growing infrastructure we’re developing to enable secure photo backup and sharing on the distributed web. Come check us out, and jump on the Textile Photos waitlist to request early access to a whole new way to control your photos.


A short trip to Jupyter via the Inter-planetary File System was originally published in Hacker Noon on Medium, where people are continuing the conversation by highlighting and responding to this story.