100% Free way to host your website, email, and custom domain with Google, Cloudflare, etc… on Ubuntu 22.04 using Docker.

You have a WordPress website with a custom domain name, but you don't want to pay for DNS, email, SSL, hosting, etc… It's only $8/year to register a custom domain name with Cloudflare. Everything else is free, as long as you don't exceed 1GB of egress a month. You can use both WordPress Jetpack and Cloudflare to keep egress under that limit.

Some advantages of hosting your own custom WordPress website with Docker include:

– You have full control over your website and can customize it however you want.
– Docker makes it easy to set up and maintain your website.
– You can run your website on any platform that supports Docker.
– Your website will be more scalable and reliable than if it were hosted on a traditional web server.

Set up Cloudflare

Create an account and add your domain.

Set up a Google free tier instance:

From: cloud.google.com/free/do…

Set up billing notifications so you get alerted if charges ever go over $0.03.
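You can also create the budget alert from the command line with gcloud. This is just a sketch; the billing account ID and budget name below are placeholders, and the flags assume a reasonably recent gcloud release:

#Find your billing account ID first
gcloud billing accounts list

#Create a $0.03 budget that alerts at 100% of the amount
gcloud billing budgets create \
  --billing-account=000000-000000-000000 \
  --display-name="free-tier-guard" \
  --budget-amount=0.03USD \
  --threshold-rule=percent=1.0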

Compute Engine free tier 
1 non-preemptible e2-micro VM instance per month in one of the following US regions:
Oregon: us-west1
Iowa: us-central1
South Carolina: us-east1

30 GB-months standard persistent disk

5 GB-month snapshot storage in the following regions:
Oregon: us-west1
Iowa: us-central1
South Carolina: us-east1
Taiwan: asia-east1
Belgium: europe-west1

1 GB network egress from North America to all region destinations (excluding China and Australia) per month
Your Free Tier e2-micro instance limit is by time, not by instance. Each month, eligible use of all of your e2-micro instances is free until you have used a number of hours equal to the total hours in the current month. Usage calculations are combined across the supported regions.

Compute Engine free tier does not charge for an external IP address.

GPUs and TPUs are not included in the Free Tier offer. You are always charged for GPUs and TPUs that you add to VM instances.
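If you prefer the terminal to the console, an instance that stays inside the free tier can be created with something like the following (the instance name is a placeholder; what matters for staying free is the e2-micro machine type, a standard persistent disk of 30GB or less, and one of the three US regions listed above):

gcloud compute instances create free-tier-vm \
  --zone=us-west1-b \
  --machine-type=e2-micro \
  --image-family=ubuntu-2204-lts \
  --image-project=ubuntu-os-cloud \
  --boot-disk-size=30GB \
  --boot-disk-type=pd-standard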


Set up Dynamic DNS with Cloudflare

From: How to use Cloudflare for Dynamic DNS on Ubuntu 22.04
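The guide above covers this step by step; the gist is a small script, run from cron, that pushes the VM's current public IP to a Cloudflare A record through the API. A rough sketch, with the API token, zone ID and record ID as placeholders you would look up in the Cloudflare dashboard:

#!/bin/bash
#Placeholders - substitute your own values
CF_API_TOKEN="your_api_token"
ZONE_ID="your_zone_id"
RECORD_ID="your_dns_record_id"
DOMAIN="blog.example.com"

#Current public IP of this VM
IP=$(curl -s https://api.ipify.org)

#Update the A record to point at that IP
curl -s -X PUT "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/dns_records/$RECORD_ID" \
  -H "Authorization: Bearer $CF_API_TOKEN" \
  -H "Content-Type: application/json" \
  --data "{\"type\":\"A\",\"name\":\"$DOMAIN\",\"content\":\"$IP\",\"ttl\":1,\"proxied\":true}"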

Install Docker and dependencies

From: docs.docker.com/engine/i…

##Set up the repository
#Update the apt package index and install packages to allow apt to use a repository over HTTPS:

sudo apt-get update

sudo apt-get install \
    ca-certificates \
    curl \
    gnupg \
    lsb-release

##Add Docker’s official GPG key:

sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
#Use the following command to set up the repository:

echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

#Install Docker Engine
#Update the apt package index, and install the latest version of Docker Engine, containerd, and Docker Compose, or go to the next step to install a specific version:

sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin docker-compose
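Confirm the engine works before moving on:

sudo docker run hello-world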

Install configurations

From: carlwillimott.medium.com…

git clone https://github.com/carlwillimott/docker-compose-wordpress-ssl.git
cd docker-compose-wordpress-ssl

nano -w docker-compose.yml

version: '3.3'

services:

  swag:
    image: linuxserver/swag
    container_name: swag
    restart: always
    depends_on:
      - wordpress
    volumes:
      - ./config:/config
      - ./default:/config/nginx/site-confs/default
    environment:
      - [email protected]
      - URL=blog.jphein.com
      - VALIDATION=http
      - TZ=America/Los_Angeles
      - PUID=1001
      - PGID=1002
    ports:
      - "443:443"
      - "80:80"

  wordpress:
    image: wordpress:latest
    container_name: wordpress
    hostname: wordpress
    depends_on:
      - db
    restart: always
    ports:
      - "8080:80"
    volumes:
      - ./www/:/var/www/html/
      - ./custom.ini:/usr/local/etc/php/conf.d/custom.ini
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wp_db_user
      WORDPRESS_DB_PASSWORD: wp_db_pass
      WORDPRESS_DB_NAME: wp_db
      WORDPRESS_CONFIG_EXTRA: |
        define( 'WP_MEMORY_LIMIT', '96M' );

  db:
    image: mysql:5.7
    container_name: db
    volumes:
      - /var/lib/mysql:/var/lib/mysql
    restart: always
    ports:
      - "3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: mysql_root_pass
      MYSQL_DATABASE: wp_db
      MYSQL_USER: wp_db_user
      MYSQL_PASSWORD: wp_db_pass

Bring up the containers

docker-compose up -d
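It can take a minute or two for SWAG to fetch the Let's Encrypt certificate. Watch its progress and check that all three containers are running:

docker-compose logs -f swag
docker-compose ps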

Make the containers come up on boot

From: bootvar.com/systemd-serv…

# Write the systemd service file (adjust the paths below to wherever your docker-compose.yml lives)
sudo nano -w /etc/systemd/system/mysite.service
[Unit]
Description=Service for mysite
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
WorkingDirectory=/home/jp/jphein.com
Environment=COMPOSE_HTTP_TIMEOUT=600
ExecStart=/usr/bin/env /usr/bin/docker-compose -f /home/jp/jphein.com/docker-compose.yml up -d
ExecStop=/usr/bin/env /usr/bin/docker-compose -f /home/jp/jphein.com/docker-compose.yml stop
StandardOutput=syslog
RemainAfterExit=yes

[Install]
WantedBy=multi-user.target

Now that you have created the systemd service, enable it and start it. Enabling the service makes sure it starts at server boot.

# will enable service to start at server bootup
sudo systemctl enable mysite

# start service ( Executes ExecStart command )
sudo systemctl start mysite
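Check that the unit started cleanly:

sudo systemctl status mysite
journalctl -u mysite --no-pager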

Enable PageSpeed in WordPress docker

To be continued…

Configure WordPress

Go to yourdomain.com and run through the WordPress installation wizard.

Install WordPress Plugins

Test

Test SSL: www.ssllabs.com/ssltest/…

Test Pagespeed module: ismodpagespeedworking.co…

Test Varnish: isvarnishworking.uk/

Test your server response time: www.bytecheck.com
Should be 500ms or under. Ideally under 200ms.
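You can also measure time to first byte straight from the terminal with curl (swap in your own domain):

curl -o /dev/null -s -w "Connect: %{time_connect}s  TTFB: %{time_starttransfer}s  Total: %{time_total}s\n" https://yourdomain.com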

Test overall page performance: pagespeed.web.dev/

Test email: www.mail-tester.com/

Forward your custom domain email to your Gmail account

Cloudflare offers this service (Email Routing) for free.

Automated backups to a Cloud Storage bucket

Free tier

Cloud Storage
5 GB-months of regional storage (US regions only)
5,000 Class A Operations per month
50,000 Class B Operations per month
1 GB network egress from North America to all region destinations (excluding China and Australia) per month

Free Tier is only available in us-east1, us-west1, and us-central1 regions. Usage calculations are combined across those regions.
You get the first 5GB of cloud storage for free

Create a bucket in the Google Cloud Console or from the terminal.
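From the terminal, a regional bucket in one of the free-tier US regions can be created with gsutil (the bucket name is a placeholder; bucket names are globally unique):

gsutil mb -l us-west1 -c standard gs://your-bucket-name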

Install gcsfuse

Download the .deb package from: github.com/GoogleCloudPl…

wget https://github.com/GoogleCloudPlatform/gcsfuse/releases/download/v0.41.6/gcsfuse_0.41.6_amd64.deb
sudo dpkg -i gcsfuse_0.41.6_amd64.deb 

Permissions

You may need to run “gcloud init” if your permissions aren’t set up correctly. You may also need to edit your bucket permissions.
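If the VM's service account can't write to the bucket, one option is to grant it object access on that bucket (the service account email below is a placeholder; note the VM's access scopes also need to allow Cloud Storage read/write, since the default scopes are usually read-only for storage):

gsutil iam ch serviceAccount:[email protected]:objectAdmin gs://your-bucket-name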

Mount bucket
From: github.com/GoogleCloudPl…

gcsfuse bucket_name mount_point
#unmount
fusermount -u /path/to/mount/point

Add to fstab to mount on boot

# sudo doesn't apply to the >> redirection, so append with tee instead
echo "jphfree /home/jp/jphfree gcsfuse rw,_netdev,allow_other,uid=1001,gid=1002" | sudo tee -a /etc/fstab
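Make sure the mount point directory exists, then confirm the fstab entry works without a reboot:

mkdir -p /home/jp/jphfree
sudo mount -a
df -h /home/jp/jphfree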

Backup script

NOTE: If you want to use duplicity gs: urls then you should follow this guide: jphein.com/how-to-use-du…

#Install mysqldump
sudo apt install mysql-client-core-8.0
#Install duplicity
sudo apt install duplicity
nano -w backup
#!/bin/bash
#--A simple backup script
#----Change the variables below to reflect your information
echo "------------------------------------------------------------------"
echo "          Website Backups"
echo "------------------------------------------------------------------"
echo "Creating variables..."
printf "$0 <-- Name and path of this script.\n"
#Working directory
WD="/home/jp"
printf "$WD <-- Working Directory on local server.\n"
echo "------------------------------------------------------------------"
#Google Cloud storage bucket"
printf "A Google Cloud storage bucket is used for longterm storage of incremental backups.\n"
BUCKET="jphfree"
printf "$BUCKET <-- Google Cloud storage bucket name\n"
echo "------------------------------------------------------------------"
#MySQL database
MYSQL_DB="wp_db"
MYSQL_USER="root"
MYSQL_PASS="mysql_root_pass"
printf "$MYSQL_DB <-- MySQL Database Name\n"
printf "$MYSQL_USER <-- MySQL Database Username\n"
echo "------------------------------------------------------------------"

#Google Cloud Storage Mount the storage bucket
printf "Mounting Google Cloud storage bucket...\n"
gcsfuse $BUCKET $WD/$BUCKET
echo "------------------------------------------------------------------"

#MySQL dump
printf "Dumping MySQL database...\n"
mysqldump --opt -h localhost --protocol=tcp --user=$MYSQL_USER --password=$MYSQL_PASS $MYSQL_DB > $WD/mysqldump/database.sql
echo "------------------------------------------------------------------"


#-------Duplicity Backup everything to the bucket
	echo "------------------------------------------------------------------"
	echo "Backing up data to a Google Gloud storage bucket using duplicity.."
	echo "------------------------------------------------------------------"
	echo "An incremental backup every day. A full backup every month."
	echo "We keep 12 full backups and their corresponding incrementals."
	echo "One year of daily back up data!"
	echo "------------------------------------------------------------------"

	#Clean up in case of aborted backups (--force is needed to actually delete)
	duplicity cleanup --force --no-encryption file://$WD/$BUCKET/backups

	#Backup data to bucket
	duplicity --progress --no-encryption --exclude $WD/$BUCKET --full-if-older-than 1M $WD file://$WD/$BUCKET/backups

	#Delete old backups (--force is needed to actually delete)
	#duplicity remove-older-than 1Y --force $SYNC_DIR/$i gs://$BUCKET/$i
	duplicity remove-all-but-n-full 12 --force file://$WD/$BUCKET/backups
#------------------------------------------------------------------------------
chmod 700 backup
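It's worth running the script once by hand and asking duplicity what it sees before trusting it to cron:

./backup
duplicity collection-status --no-encryption file:///home/jp/jphfree/backups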

Add to crontab

crontab -e
# m h  dom mon dow   command
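# Run the backup script daily at midnight and log its output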
  0 0  *   *   *     /home/jp/backup > /home/jp/backup.log 2>&1

Restore from Backups

duplicity restore --no-encryption file:///home/jp/jphfree/backups /home/jp/restored/
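Restoring brings back the WordPress files and the SQL dump, but the dump still has to be loaded into the database container. A hedged example using the container name and credentials from the compose file above:

docker exec -i db mysql -u root -pmysql_root_pass wp_db < /home/jp/restored/mysqldump/database.sql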
