
on-demand DBMS

instead of running multiple servers (MySQL, SQL Server, Postgres, ...) and having to stop/start their services, you can just set them up using docker and use them when you want, or easily have multiple instances of the same provider running on different ports.

or when setting up a new windows machine, instead of installing them all one by one, you have them on demand; for linux there's ansible...

in some cases though you'd need a certain dll for a library to work, like in rust's diesel case; you can download the dll and point diesel to the folder you've downloaded it into.

Setup using docker compose

  • i've been using this setup for a while now; here's my repo which you can use as a reference or a starting point. the readme needs cleaning up though

i like keeping each one in its own folder so things stay clean and tidy when it comes to their data folders.

basically what you do is have a docker-compose.yml for each instance and just start and stop it with the usual docker commands (docker compose up -d to bring it up, docker compose stop or docker compose down to take it down).
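for example, a layout along these lines (the folder names are just hypothetical, use whatever you like):

databases/
├── mySql/
│   ├── docker-compose.yml
│   └── mySql-db-data/        <- created by the bind mount in the compose file
├── postgres/
│   └── docker-compose.yml
└── kafka/
    └── docker-compose.yml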

in docker compose you can set environment variables either in a separate text file (env_file) or in a .env file; i never used the text file option, so here's how to do it with .env:

  1. have your .env file:

DATABASE_USER=example
DATABASE_PASSWORD=example1234

  2. in docker-compose you can load them in with ${}:

      - DATABASE_USER=${DATABASE_USER}
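so, for example, the MySQL service from the next section could read its root password from that .env file like this (the variable names are the hypothetical ones above; compose picks up a .env file sitting next to the docker-compose.yml automatically):

version: "3.8"

services:
  mySQL:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: ${DATABASE_PASSWORD}   # substituted from the .env file at compose time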

MySql example

version: "3.8"

services: 
  mySQL:
    image: mysql  <-- i recommend using a set version so ur data don't get messed with on a new version update
    environment:
      MYSQL_ROOT_PASSWORD: root   <-- you can use ${.env} 
    restart: always               <-- so it starts as soon as docker desktop starts 
    container_name: mySqlDockerMain <-- give it a name
    ports:
      - "3306:3306"  <-- what ports
    volumes: 
      - "./mySql-db-data:/var/lib/mysql"
        ________________ the data folder which will get created locally
                           |
volumes:                   |
  db-data:           <------

“without the annotations”

version: "3.8"

services: 
  mySQL:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: root
    restart: always
    container_name: mySqlDockerMain
    ports:
      - "3306:3306"
    volumes: 
      - "./mySql-db-data:/var/lib/mysql"

volumes: 
  db-data:
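and since each instance lives in its own folder, running a second MySQL on a different port is just a copy of the same file with a different container name and host port. a hypothetical second instance:

version: "3.8"

services:
  mySQL:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: root
    restart: always
    container_name: mySqlDockerSecond   # must differ from the first instance's container name
    ports:
      - "3307:3306"                     # different host port, same port inside the container
    volumes:
      - "./mySql-db-data:/var/lib/mysql"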

More “involved” examples

Kafka cluster and zookeeper

you can run the entire thing using docker compose up. cool!

  • the docker raw file (a rough sketch of it is included after this list)
    • obviously you can play around with it more, but i'm still learning it (●’◡’●)
  • docker compose up (with or without -d) will pull the images and start everything
  • test it through Offset Explorer
    • in the advanced tab, set Bootstrap servers to broker:29090, localhost:9092
    • screenshots: setup-1, setup-2, setup-3
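for reference, here is roughly what a single-broker zookeeper + kafka compose file looks like with the Confluent images. this is a sketch, not the exact file from my repo, so treat the image versions and listener setup as assumptions (i kept the broker:29090 / localhost:9092 pair from the Offset Explorer note above):

version: "3.8"

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    container_name: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-kafka:7.4.0
    container_name: broker
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                     # host access, e.g. for Offset Explorer
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # two listeners: one for other containers on the compose network, one for the host
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29090,PLAINTEXT_HOST://localhost:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1   # single broker, so replication factor 1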

Aerospike

in this case you've only got to give it the aero_config folder (which holds your aerospike.conf).
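something along these lines should do it. this is a sketch based on the official aerospike/aerospike-server image docs; the container mount path and the --config-file flag are assumptions, so double-check them against your own setup:

version: "3.8"

services:
  aerospike:
    image: aerospike/aerospike-server      # community edition image
    container_name: aerospikeDocker        # hypothetical name
    restart: always
    ports:
      - "3000-3002:3000-3002"              # default aerospike ports
    volumes:
      - "./aero_config:/opt/aerospike/etc"   # the folder holding your aerospike.conf
    command: ["--config-file", "/opt/aerospike/etc/aerospike.conf"]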

Airflow

same thing as Aerospike, but the folder is named dags and it houses your handmade .py scripts.

after it is done initializing, you can navigate to localhost:8080; the user and password are both airflow.

you should see a lot of dummy example dags which you can play with, but remember: the dag names you see are not the names of the python scripts but the name you gave the dag inside the script:

dag

# imports (Airflow 2.x paths)
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago


def crap():
    print("I am POWER!")


default_args = {
    'owner': 'Test',
    'depends_on_past': False
}
dag = DAG(
    'crap',  # <-- this is the name of the dag!
    default_args=default_args,
    start_date=days_ago(0)
)
task1 = PythonOperator(
    task_id="CRAP",
    python_callable=crap,
    dag=dag
)

closing

normally you'll always find documentation for a provider you want to run in docker; just search dockerhub or uncle google. if you couldn't find anything though, you can always try to install and run it normally and see what environment variables or other things it may need.

or if you found a docker image, all the better: you can spin it up and shell into it (with docker exec -it), or use docker desktop to do so!

