After the yearly lease on my VPS came up, I decided to move providers and ended up going with AWS. I also did not want to use WordPress anymore, and I liked the idea of static site generators; after some thought, I chose Jekyll. Since I was not using GitHub Pages, I wanted an alternative way to have my site regenerated after every successful git push. This led me to my next few choices:

1. Run my blog on Docker
------

Respawning the site after every change made Docker seem like a perfect choice, and I already had some experience with it. After I had created my site in Jekyll, I gathered all the dependencies needed to run it in Docker.

2. Use nginx for control
------

Using just Jekyll leaves you with a working static www root but no control over rewrites, redirects, or custom magic. No offense to Jekyll: it does a great job with static site generation, but you should serve the result with a real web server. Jekyll supports dumping the site content into a folder, which makes it very compatible with nginx or Apache.

The configuration
------

I worked with Jekyll locally and created a template I admired, which started from [martin308]'s [left-stripe] theme with a handful of modifications to make it work for me. These included category support, social share buttons, a complete color theme change, and a few others.

After my testing, I had come up with a solid config, my _Dockerfile_. The configuration does a few things:

1. Sets the base image to Ubuntu 15.10
2. Updates the image and installs my dependencies
3. Copies my complete repository to the container
4. Copies my nginx configs to the container
5. Creates the www root and runs Jekyll using my copied repository files as _source_ and the www root as _destination_
6. Exposes ports 80 and 443
7. Starts my services script, which basically just starts nginx

So this gets my blog up and running, but I would still need to manually build and run my Docker images.
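A Dockerfile following those seven steps could look something like this. This is a sketch reconstructed from the list, not my exact file; the package list, paths, and the `start-services.sh` name are assumptions:

```dockerfile
# 1. Base image: Ubuntu 15.10
FROM ubuntu:15.10

# 2. Update the image and install dependencies
RUN apt-get update && apt-get install -y \
        nginx ruby ruby-dev build-essential \
    && gem install jekyll

# 3. Copy the complete repository into the container
COPY . /srv/blog

# 4. Copy the nginx configs
COPY nginx/default.conf /etc/nginx/sites-available/default

# 5. Create the www root and build the site from the copied repo
RUN mkdir -p /var/www/blog \
    && jekyll build --source /srv/blog --destination /var/www/blog

# 6. Expose HTTP and HTTPS
EXPOSE 80 443

# 7. Run the services script, which basically just starts nginx
CMD ["/srv/blog/start-services.sh"]
```

Because the `jekyll build` happens at image build time, every image push carries a fully rendered site; the container only has to serve static files.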
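On the nginx side, serving the generated folder takes only a minimal server block. A sketch, assuming the site was built into `/var/www/blog`:

```nginx
server {
    listen 80;
    server_name example.com;

    # Serve the static files Jekyll generated
    root /var/www/blog;
    index index.html;

    location / {
        # Return a 404 if the requested file does not exist
        try_files $uri $uri/ =404;
    }
}
```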
I want to push changes to my private BitBucket repo and have my container automatically restarted with the new content. To do this, I need BitBucket to notify Docker Hub, and Docker Hub to relay a successful build notification to my server.

This is the fun part :) the automagical devopsy stuff
------

With the Docker/nginx/Jekyll work completed, I needed a workflow for committing changes and triggering automatic builds. Having every push to my BitBucket repo trigger a Docker Hub build was pretty simple to set up. Next, I needed an endpoint for Docker Hub to call after a successful build. After some searching, I found [captainhook], a nice project by [bketelsen]: a web listener that runs scripts based on the URL called, aka a "webhook." However, captainhook does not run over SSL, so I recommend having nginx forward requests to captainhook, which is how I set mine up (an SSL reverse proxy). Also, run captainhook from cron, or alternatively under supervisord. This ensures that if captainhook has a hiccup and dies, it will start again under a different PID.

My configuration for restarting my Docker containers looks like this:

_update-docker.json_

_update-docker.sh_

My ultimate workflow
------

1. Test my content changes locally with Jekyll using _jekyll serve_
2. If satisfied, delete the temporary Jekyll \_site destination directory, then _git commit_ & _git push_

That sums it up! Let me know what you think!

[martin308]: https://github.com/martin308 "martin308"
[left-stripe]: https://github.com/martin308/left-stripe "left-stripe"
[captainhook]: https://github.com/bketelsen/captainhook "captainhook"
[bketelsen]: https://github.com/bketelsen "bketelsen"
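As a sketch of what _update-docker.json_ could contain: captainhook reads JSON files from a config directory, each one defining the scripts to run when its endpoint is called. The script path here is a placeholder:

```json
{
    "scripts": [
        {
            "command": "/opt/hooks/update-docker.sh",
            "args": []
        }
    ]
}
```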
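A sketch of _update-docker.sh_, which pulls the freshly built image from Docker Hub and swaps the running container; the image and container names are assumptions:

```shell
#!/bin/bash
set -e

# Pull the image Docker Hub just rebuilt
docker pull myuser/blog:latest

# Stop and remove the old container; ignore errors if none is running
docker stop blog || true
docker rm blog || true

# Start a fresh container from the new image
docker run -d --name blog -p 80:80 -p 443:443 myuser/blog:latest
```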
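The SSL reverse proxy in front of captainhook can be another small nginx server block. A sketch, assuming captainhook listens locally on port 8080 and certificates are already in place:

```nginx
server {
    listen 443 ssl;
    server_name hooks.example.com;

    ssl_certificate     /etc/ssl/certs/hooks.example.com.crt;
    ssl_certificate_key /etc/ssl/private/hooks.example.com.key;

    location / {
        # Forward webhook calls from Docker Hub to captainhook on localhost
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```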