100% free way to host your website, email, and custom domain with Google Cloud, Cloudflare, etc. on Ubuntu 22.04 using Docker.

You have a WordPress website with a custom domain name, but don't want to pay for DNS, email, SSL, hosting, and so on. It's only $8/year to register a custom domain name with Cloudflare. Everything else is free, as long as you don't exceed 1 GB of egress a month. You can use both WordPress Jetpack and Cloudflare caching to stay under that limit.
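To get a feel for what that 1 GB/month egress budget buys, here is a back-of-the-envelope sketch. The page weight and Cloudflare cache-hit rate below are hypothetical numbers, not measurements; substitute your own.

```python
# Rough estimate of monthly page views that fit in the free egress budget.
# PAGE_BYTES and CACHE_HIT_RATE are assumptions -- measure your own site.
BUDGET_BYTES = 1 * 1024**3      # 1 GiB of free egress per month
PAGE_BYTES = 500 * 1024         # assume ~500 KB served per page view
CACHE_HIT_RATE = 0.9            # assume Cloudflare serves 90% from its cache

# Only cache misses count against the origin VM's egress budget.
origin_bytes_per_view = PAGE_BYTES * (1 - CACHE_HIT_RATE)
views_per_month = int(BUDGET_BYTES / origin_bytes_per_view)
print(views_per_month)
```

With these assumed numbers the origin can absorb roughly twenty thousand page views a month for free, which is why fronting the site with Cloudflare's cache matters so much.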

Some advantages of hosting your own custom WordPress website with Docker include:
- You have full control over your website and can customize it however you want.
- Docker makes it easy to set up and maintain your website.
- You can run your website on any platform that supports Docker.
- Your website will be more scalable and reliable than if it were hosted on a traditional web server.

Set up Cloudflare

Create an account and add your domain.

Set up a free tier instance:

From: cloud.google.com/free/do…

Set up billing notifications so you're alerted if charges exceed $0.03.

Compute Engine free tier
1 non-preemptible e2-micro VM instance per month in one of the following US regions:
Oregon: us-west1
Iowa: us-central1
South Carolina: us-east1

30 GB-months standard persistent disk

5 GB-month snapshot storage in the following regions:
Oregon: us-west1
Iowa: us-central1
South Carolina: us-east1
Taiwan: asia-east1
Belgium: europe-west1

1 GB network egress from North America to all region destinations (excluding China and Australia) per month
Your Free Tier e2-micro instance limit is by time, not by instance. Each month, eligible use of all of your e2-micro instances is free until you have used a number of hours equal to the total hours in the current month. Usage calculations are combined across the supported regions.
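The "limit is by time, not by instance" rule just means the free allowance equals the number of hours in the calendar month, so one always-on instance exactly consumes it:

```python
import calendar

def free_e2_micro_hours(year: int, month: int) -> int:
    """Free e2-micro usage for a month equals that month's total hours."""
    days_in_month = calendar.monthrange(year, month)[1]
    return days_in_month * 24

# One always-on VM uses exactly this many hours; two VMs running
# simultaneously would exhaust the allowance halfway through the month.
print(free_e2_micro_hours(2022, 7))   # July has 31 days -> 744 hours
```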

The Compute Engine free tier does not charge for an external IP address.

GPUs and TPUs are not included in the Free Tier offer. You are always charged for GPUs and TPUs that you add to VM instances.


Setup Dynamic DNS with Cloudflare

From: How to use Cloudflare for Dynamic DNS on Ubuntu 22.04
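The linked guide boils down to periodically pushing the VM's current public IP to a Cloudflare DNS A record via Cloudflare's v4 API. A minimal sketch of building that update request follows; the zone ID, record ID, and API token shown are placeholders, not real values.

```python
import json

API_BASE = "https://api.cloudflare.com/client/v4"

def build_ddns_update(zone_id, record_id, record_name, ip, token):
    """Build the URL, headers, and body for a Cloudflare A-record update."""
    url = f"{API_BASE}/zones/{zone_id}/dns_records/{record_id}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "type": "A",
        "name": record_name,
        "content": ip,    # the VM's current public IP
        "ttl": 120,       # short TTL so IP changes propagate quickly
        "proxied": True,  # keep Cloudflare's proxy and cache in front
    })
    return url, headers, body

# Placeholder IDs -- substitute your own zone ID, record ID, and API token,
# then send this as an HTTP PUT (e.g. with curl or urllib) from a cron job.
url, headers, body = build_ddns_update(
    "ZONE_ID", "RECORD_ID", "blog.jphein.com", "203.0.113.7", "API_TOKEN")
```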

Install docker, and dependencies

From: docs.docker.com/engine/i…

## Set up the repository
# Update the apt package index and install packages to allow apt to use a repository over HTTPS:

sudo apt-get update

sudo apt-get install \
    ca-certificates \
    curl \
    gnupg

##Add Docker’s official GPG key:

sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
#Use the following command to set up the repository:

echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

#Install Docker Engine
#Update the apt package index, and install the latest version of Docker Engine, containerd, and Docker Compose, or go to the next step to install a specific version:

sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin docker-compose

Install configurations

From: carlwillimott.medium.com…

git clone https://github.com/carlwillimott/docker-compose-wordpress-ssl.git
cd docker-compose-wordpress-ssl

nano -w docker-compose.yml

version: '3.3'

services:

  swag:
    image: linuxserver/swag
    container_name: swag
    restart: always
    depends_on:
      - wordpress
    volumes:
      - ./config:/config
      - ./default:/config/nginx/site-confs/default
    environment:
      - [email protected]
      - URL=blog.jphein.com
      - VALIDATION=http
      - TZ=America/Los_Angeles
      - PUID=1001
      - PGID=1002
    ports:
      - "443:443"
      - "80:80"

  wordpress:
    image: wordpress:latest
    container_name: wordpress
    hostname: wordpress
    depends_on:
      - db
    restart: always
    ports:
      - "8080:80"
    volumes:
      - ./www/:/var/www/html/
      - ./custom.ini:/usr/local/etc/php/conf.d/custom.ini
    environment:
      WORDPRESS_DB_HOST: db:3306
      WORDPRESS_DB_USER: wp_db_user
      WORDPRESS_DB_PASSWORD: wp_db_pass
      WORDPRESS_DB_NAME: wp_db
      WORDPRESS_CONFIG_EXTRA: |
        define( 'WP_MEMORY_LIMIT', '96M' );

  db:
    image: mysql:5.7
    container_name: db
    volumes:
      - /var/lib/mysql:/var/lib/mysql
    restart: always
    ports:
      - "3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: mysql_root_pass
      MYSQL_DATABASE: wp_db
      MYSQL_USER: wp_db_user
      MYSQL_PASSWORD: wp_db_pass
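The ./default file mounted into the swag container is the nginx site config that terminates SSL and forwards traffic to the WordPress container. A minimal sketch of what it needs to contain, assuming SWAG's stock ssl.conf include and the container names from the compose file:

```nginx
server {
    listen 443 ssl http2;
    server_name blog.jphein.com;

    # Certificates and SSL settings generated by SWAG
    include /config/nginx/ssl.conf;

    location / {
        # Forward to the wordpress container over the compose network
        proxy_pass http://wordpress:80;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```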

Bring up the containers

docker-compose up -d

Make the containers come up on boot

From: bootvar.com/systemd-serv…

# Write the systemd service file
nano -w /etc/systemd/system/mysite.service

[Unit]
Description=Service for mysite
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/usr/bin/env /usr/bin/docker-compose -f /home/jp/jphein.com/docker-compose.yml up -d
ExecStop=/usr/bin/env /usr/bin/docker-compose -f /home/jp/jphein.com/docker-compose.yml stop

[Install]
WantedBy=multi-user.target


Now that you have created the systemd service, enable and start it. Enabling the service ensures it starts at server boot.

# Enable the service to start at server boot
sudo systemctl enable mysite

# Start the service (executes the ExecStart command)
sudo systemctl start mysite

Enable PageSpeed in WordPress docker

To be continued…

Configure WordPress

Go to yourdomain.com

Install WordPress Plugins


Test SSL: www.ssllabs.com/ssltest/…

Test Pagespeed module: ismodpagespeedworking.co…

Test Varnish: isvarnishworking.uk/

Test your server response time: www.bytecheck.com
Should be 500ms or under. Ideally under 200ms.


Test email: www.mail-tester.com/

Forward your custom domain email to your gmail account

Cloudflare offers this service for free with its Email Routing feature.

Automated backups to Cloud storage bucket

Free tier

Cloud Storage
5 GB-months of regional storage (US regions only)
5,000 Class A operations per month
50,000 Class B operations per month
1 GB network egress from North America to all region destinations (excluding China and Australia) per month

Free Tier is only available in the us-east1, us-west1, and us-central1 regions. Usage calculations are combined across those regions.
You get the first 5 GB of Cloud Storage for free.

Create a bucket in the cloud console or terminal.

Install gcsfuse

Download the .deb package from: github.com/GoogleCloudPl…

wget https://github.com/GoogleCloudPlatform/gcsfuse/releases/download/v0.41.6/gcsfuse_0.41.6_amd64.deb
sudo dpkg -i gcsfuse_0.41.6_amd64.deb 


You may need to run "gcloud init" if your permissions aren't set up correctly. You may also need to edit your bucket permissions.

Mount bucket
From: github.com/GoogleCloudPl…

# Mount the bucket
gcsfuse bucket_name mount_point
# Unmount it again
fusermount -u /path/to/mount/point

Add to fstab to mount on boot

# Note: "sudo echo ... >> /etc/fstab" fails because the redirection runs as your user, not root; use tee instead
echo "jphfree /home/jp/jphfree gcsfuse rw,_netdev,allow_other,uid=1001,gid=1002" | sudo tee -a /etc/fstab

#Backup script

NOTE: If you want to use duplicity gs: urls then you should follow this guide: jphein.com/how-to-use-du…

#Install mysqldump
sudo apt install mysql-client-core-8.0
#Install duplicity
sudo apt install duplicity
nano -w backup

#!/bin/bash
#--A simple backup script
#----Change the variables below to reflect your information
echo "------------------------------------------------------------------"
echo "          Website Backups"
echo "------------------------------------------------------------------"
echo "Creating variables..."
printf "$0 <-- Name and path of this script.\n"
#Working directory
WD=/home/jp
printf "$WD <-- Working Directory on local server.\n"
echo "------------------------------------------------------------------"
#Cloud storage bucket
BUCKET=jphfree
printf "A Google Cloud storage bucket is used for long-term storage of incremental backups.\n"
printf "$BUCKET <-- Google Cloud storage bucket name\n"
echo "------------------------------------------------------------------"
#MySQL database (credentials from docker-compose.yml)
MYSQL_DB=wp_db
MYSQL_USER=wp_db_user
MYSQL_PASS=wp_db_pass
printf "$MYSQL_DB <-- MySQL Database Name\n"
printf "$MYSQL_USER <-- MySQL Database Username\n"
echo "------------------------------------------------------------------"

#Mount the Google Cloud storage bucket if it is not already mounted
printf "Mounting Google Cloud storage bucket...\n"
mountpoint -q $WD/$BUCKET || gcsfuse $BUCKET $WD/$BUCKET
echo "------------------------------------------------------------------"

#MySQL dump (the db container publishes 3306 on the host)
printf "Dumping MySQL database...\n"
mysqldump --opt -h localhost --protocol=tcp --user=$MYSQL_USER --password=$MYSQL_PASS $MYSQL_DB > $WD/mysqldump/database.sql
echo "------------------------------------------------------------------"

#Duplicity: back up everything to the bucket
echo "------------------------------------------------------------------"
echo "Backing up data to a Google Cloud storage bucket using duplicity..."
echo "------------------------------------------------------------------"
echo "An incremental backup every day. A full backup every month."
echo "We keep 12 full backups and their corresponding incrementals."
echo "One year of daily backup data!"
echo "------------------------------------------------------------------"

#Clean up in case of aborted backups
duplicity cleanup --no-encryption file://$WD/$BUCKET/backups

#Back up data to the bucket
duplicity --progress --no-encryption --exclude $WD/$BUCKET --full-if-older-than 1M $WD file://$WD/$BUCKET/backups

#Delete old backups (--force is required for duplicity to actually delete)
#duplicity remove-older-than 1Y --force $SYNC_DIR/$i gs://$BUCKET/$i
duplicity remove-all-but-n-full 12 --force file://$WD/$BUCKET/backups

chmod 700 backup
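The retention math behind those duplicity flags works out as follows; the incrementals-per-chain count is an assumed average, since the exact number depends on when each full backup rolls over.

```python
# --full-if-older-than 1M plus a daily cron gives roughly one full backup
# per month with ~29 incrementals chained off it.
# remove-all-but-n-full 12 keeps the last 12 fulls and their chains,
# so the retained history is about one year of daily restore points.
FULLS_KEPT = 12
AVG_INCREMENTALS_PER_CHAIN = 29   # assumed average for ~30-day months

restore_points = FULLS_KEPT * (1 + AVG_INCREMENTALS_PER_CHAIN)
print(restore_points)
```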

#Add to crontab

crontab -e
# Edit this file to introduce tasks to be run by cron.
# Each task to run has to be defined through a single line
# indicating with different fields when the task will be run
# and what command to run for the task
# To define the time you can provide concrete values for
# minute (m), hour (h), day of month (dom), month (mon),
# and day of week (dow) or use '*' in these fields (for 'any').
# Notice that tasks will be started based on the cron's system
# daemon's notion of time and timezones.
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
# For more information see the manual pages of crontab(5) and cron(8)
# m h  dom mon dow   command
  0 0  *   *   *     /home/jp/backup > /home/jp/backup.log 2>&1

Restore from Backups

# Restore the most recent backup to /home/jp-restored
duplicity file:///home/jp/jphfree/backups /home/jp-restored
