OCDS Kingfisher Collect


Kingfisher Collect is a tool for downloading OCDS data and storing it on disk and/or sending it to an instance of Kingfisher Process for processing.


Use the OCP Data Registry to download OCDS data, worldwide. (The registry uses Kingfisher Collect to prepare datasets on a regular schedule.)

(If you are viewing this on GitHub, open the full documentation for additional details.)

Instead of installing Kingfisher Collect on your computer, you can:

- Follow this interactive step-by-step guide to use Kingfisher Collect in Google Colaboratory.
- Try using Kingfisher Collect with Scrapy Cloud.

How it works

Kingfisher Collect is built on the Scrapy framework, which we use to author “spiders” that you can run to “crawl” data sources and extract OCDS data.

When collecting data from a source, Kingfisher Collect writes each OCDS file to a separate file on disk. It also ensures that every file is either a record package or a release package, depending on the source.
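In OCDS, the two package types are distinguished by their top-level key: a release package has a "releases" array, and a record package has a "records" array. A minimal sketch of that check, using only the standard library (the sample data is illustrative, not from a real source):

```python
import json


def package_type(data: dict) -> str:
    """Classify an OCDS package by its top-level key."""
    if "releases" in data:
        return "release package"
    if "records" in data:
        return "record package"
    return "unknown"


# Illustrative sample: a minimal (empty) release package.
sample = json.loads('{"uri": "https://example.com/1.json", "releases": []}')
print(package_type(sample))  # release package
```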

By default, these files are written to a data directory (you can change this) within your kingfisher-collect directory (which you will create during installation). Each spider creates its own directory within the data directory, and each crawl of a given spider creates its own directory within its spider’s directory. For example, if you run the zambia spider (learn how), then the directory hierarchy will look like:

└── data
    └── zambia
        └── 20200102_030405
            ├── C8E
            │   ├── <...>.json
            │   └── <...>
            └── D1D
                ├── <...>.json
                └── <...>

The data directory contains a zambia directory (matching the spider’s name), which in turn contains a 20200102_030405 crawl directory (matching the time at which you started the crawl – in this case, 2020-01-02 03:04:05). Within the crawl directory, the .json files – the OCDS data – are split among subdirectories with opaque names, so that no single directory exceeds filesystem limits.
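The crawl directory name is the crawl’s start time formatted as YYYYMMDD_HHMMSS. The sketch below builds such a path with the standard library; the hash-based subdirectory naming is an assumption for illustration only, not necessarily how Kingfisher Collect derives its opaque names:

```python
import hashlib
from datetime import datetime

# Crawl directory: the start time formatted as YYYYMMDD_HHMMSS.
start = datetime(2020, 1, 2, 3, 4, 5)
crawl_dir = start.strftime("%Y%m%d_%H%M%S")
print(crawl_dir)  # 20200102_030405


def subdirectory(file_name: str) -> str:
    """Hypothetical opaque subdirectory name: the first 3 hex digits of a
    hash of the file name (an assumption; the real scheme may differ)."""
    return hashlib.md5(file_name.encode()).hexdigest()[:3].upper()


# A path in the shape of the hierarchy shown above.
print(f"data/zambia/{crawl_dir}/{subdirectory('page-1.json')}/page-1.json")
```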