1. Set up the environment

Author: Luis Rosenstrauch


Requires docker-2.3.0 and docker-compose-1.13.0.

Note: make sure to run the following command on your host; Elasticsearch needs it to work.

sudo sysctl -w vm.max_map_count=262144
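If you want to confirm the setting before or after running the command above, here is a small sketch that reads the value from `/proc`, the standard Linux location for this kernel parameter (assumption: a Linux host with `/proc` mounted):

```shell
# Read the current value of vm.max_map_count and compare it against
# the minimum Elasticsearch needs.
required=262144
current=$(cat /proc/sys/vm/max_map_count 2>/dev/null || echo 0)
if [ "$current" -lt "$required" ]; then
  echo "vm.max_map_count is $current; raise it with: sudo sysctl -w vm.max_map_count=$required"
else
  echo "vm.max_map_count is $current; OK"
fi
```

Note that `sysctl -w` does not survive a reboot; to make the change permanent, add the line `vm.max_map_count=262144` to `/etc/sysctl.conf`.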

Step 1: Get the code and fetch the images

Start by getting your copy of the spiderbreeder:

$ git clone [email protected]:hoaxly/hoaxly-scraping-container.git

$ cd hoaxly-scraping-container

Log in to our registry (using your GitLab credentials) to pull the images you need to build and run spiders locally.

docker login registry.acolono.net:444

docker pull registry.acolono.net:444/hoaxly/hoaxly-storage-container

docker pull registry.acolono.net:444/hoaxly/hoaxly-scrapydaemon-container

docker pull registry.acolono.net:444/hoaxly/hoaxly-scraping-container
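The pulls can also be scripted. This sketch simply prints the three `docker pull` commands (image names taken from the steps above) so you can review them, then pipe the output to `sh` to execute after logging in:

```shell
# Print the docker pull command for each hoaxly image used in this setup.
registry="registry.acolono.net:444/hoaxly"
for image in hoaxly-storage-container \
             hoaxly-scrapydaemon-container \
             hoaxly-scraping-container; do
  echo "docker pull ${registry}/${image}"
done
```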

Step 2: Spin up the local instances and initialize them.

In your project's root folder, run:

fin init

Open your preferred browser and go to: http://portia.hoaxly.docksal