Allows Extron processors to connect with external databases
Not affiliated with the Extron corporation
https://github.com/mefranklin6/ExtronDatabaseConnector
- Vendor agnostic. The base services can handle anything that works with the REST API or can connect through TCP sockets, which should cover most control processors. Control-system-specific instructions for interacting with the server are planned.
- Server-side code rewritten as Go (golang) microservices. You can find the development REST server in a dev branch now. This will increase the capacity of existing servers, increase speed, and add horizontal scalability.
- Extron control processors send REST-like API calls to a proxy web server
- Proxy server converts GET or POST commands from the processors into SQL SELECT/INSERT commands
- Proxy server communicates with the database server and sends data back to the control processors as JSON over HTTP.
- Programmers write JSON strings and send them out from GCP processors using a generic TCP driver
- Proxy server receives these strings via TCP socket, then interprets and formats these strings
- Proxy server sends data to the SQL server
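To make the GCP flow concrete, here is a minimal sketch of how the socket server might interpret one incoming JSON string. The field names (`room`, `time`, `metric`, `action`) are assumptions mirroring the example `testextron` table below; the real payload format is defined by your driver strings and the GCP_Socket_Server code.

```python
import json

# Hypothetical payload shape -- field names mirror the example
# 'testextron' table (room, time, metric, action); adjust to match
# what your GCP driver actually sends.
def parse_payload(raw: bytes):
    """Interpret one JSON string received over the TCP socket and
    turn it into a parameterized SQL INSERT (never string-formatted)."""
    msg = json.loads(raw.decode("utf-8"))
    params = (msg["room"], msg["time"], msg["metric"], msg["action"])
    sql = ("INSERT INTO testextron (room, time, metric, action) "
           "VALUES (%s, %s, %s, %s)")
    return sql, params

sql, params = parse_payload(
    b'{"room": "Room101", "time": "2024-01-01T08:00:00", '
    b'"metric": "System On", "action": "Started"}'
)
```

Keeping the SQL template and the values separate here is what lets the server hand them to the database driver as a parameterized query.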
- Control_Processor_Files : Add these files to your Extron processors running Extron Control Script (ECS)
- ECS_Server: Server files for an x86_64 machine for bi-directional database communication with processors running ECS
- GCP_Server: OPTIONAL: server files for unidirectional GCP-processor-to-external-database communication.
- The server apps were built to be mostly asynchronous for best performance, so commands may occasionally be processed out of order.
- The REST_Connector code on the control processors uses the Wait decorator as a multi-threaded hack to prevent blocking the main thread when waiting for the external servers.
- SQL queries are never formatted on the control processors; we do not trust what the control processors say. SQL injection prevention and command formatting are handled in the mysql_tools module.
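The idea behind that server-side handling is driver-level parameterization. The sketch below uses the stdlib `sqlite3` module so it is self-contained; the actual mysql_tools module targets MySQL, whose drivers use `%s` placeholders instead of `?`, but the principle is the same.

```python
import sqlite3

# Illustration of driver-side parameterization (the approach described
# above); sqlite3 is used only so this sketch runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE testextron (room TEXT, time TEXT, metric TEXT, action TEXT)"
)

untrusted = "Room101'); DROP TABLE testextron;--"  # hostile processor input
# The value is bound as data, never spliced into the SQL text,
# so the injection attempt is stored as a harmless literal string.
conn.execute(
    "INSERT INTO testextron (room, time, metric, action) VALUES (?, ?, ?, ?)",
    (untrusted, "2024-01-01T08:00:00", "System On", "Started"),
)
rows = conn.execute("SELECT room FROM testextron").fetchall()
```

Because the placeholder binding happens in the driver, the table survives and the malicious text ends up stored verbatim in the `room` column rather than executed.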
- The instructions below use a usage-tracking metrics system as an example, but the code can be modified to do any read/write to the database from Extron control processors.
The instructions and requirements below are only for ECS processors, but you can perform the same steps to run GCP_Socket_Server
for GCP processors if needed. These two apps/services can co-exist as long as they use different ports.
Requirements:
- An x86 server running the MySQL database (could be ported to other databases too). Instructions are for hosting on a remote Ubuntu Server + bash.
- An x86 server running modern Python 3 (built on 3.10.12). For this repo, this is the same server as the one running the database, but it can be two different servers.
- Reserved or static IPs on the servers
- Processors must be running Extron Control Script (GCP devices use GCP_Socket_Server)
- TCP 22 SSH : From your management PC (if running headless/remote servers)
- TCP 8080 HTTP : To/From FastAPI server (if using two servers) and To/From Extron processors
- TCP 3306 MySQL : To/From Database Server and FastAPI server (if running on different machines)
- Optional: TCP 3306 MySQL : To/From your management PC for easier access to the database, MySQL workbench, etc.
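Once the servers are up, a quick way to confirm the ports above are reachable from your management PC is a small socket check. This is just a convenience sketch; the host below is a placeholder for your server's IP.

```python
import socket

# Reachability check for the ports listed above (22, 8080, 3306).
# Replace the host with your server's reserved/static IP.
def check_port(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in (22, 8080, 3306):
    print(port, "open" if check_port("127.0.0.1", port) else "closed")
```

A closed result for 8080 or 3306 usually means the service is not running yet or a firewall rule is missing.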
sudo apt install mysql-server
If using two servers: bind MySQL to the server's IP address. You can leave it as 127.0.0.1 if running FastAPI and MySQL on the same server.
sudo nano /etc/mysql/mysql.conf.d/mysqld.cnf
- Find 'bind-address' and set that to the IP of your MySQL server
- Write-out and exit nano
Enter MySQL as root
sudo mysql -u root
Create a database: example, a database called 'devdb'
CREATE DATABASE devdb;
Enter the database:
use devdb;
Create a table: example, table named 'testextron'
CREATE TABLE testextron (
room varchar(255),
time varchar(255),
metric varchar(255),
action varchar(255)
);
Example: a user named 'fast_api' with the password 'mypw', which will have INSERT (write) and SELECT (read) privileges on the table 'testextron' in the database 'devdb'. Grant only the minimum privileges needed.
Also, please use a stronger password, or consider a better authentication system. The server IP should match the bind-address you set in mysqld.cnf earlier (127.0.0.1 if you didn't change it).
CREATE USER 'fast_api'@'<your server IP>' IDENTIFIED BY 'mypw';
GRANT INSERT, SELECT ON devdb.testextron TO 'fast_api'@'<your server IP>';
FLUSH PRIVILEGES;
Copy this repo into /opt/yourproject and cd into it
cd /opt/yourproject
If needed, edit the FastAPI_Server.py
with the proper credentials for your MySQL account you setup above.
- If you ever hit permissions errors in /opt:

sudo chgrp -R users /opt/yourproject
sudo chmod -R g+rwx /opt/yourproject
Install venv if you don't have it
sudo apt install python3.10-venv
Create a virtual Python environment called 'my-virt'
sudo python3 -m venv my-virt
Start and enter the virtual environment
source /opt/yourproject/my-virt/bin/activate
Install packages
pip install -r requirements.txt
- If you hit a permissions error:
sudo chown -R $USER:$USER /opt/yourproject/my-virt/
Exit the virtual environment:
deactivate
Make a shell script that runs at boot, which enters the virtual environment and starts the FastAPI app.
sudo nano /opt/yourproject/fastapi_startup.sh
#!/bin/bash
cd /opt/yourproject
source /opt/yourproject/my-virt/bin/activate
python3 FastAPI_Server.py my-virt
Make the shell script and FastAPI app executable
sudo chmod +x fastapi_startup.sh
sudo chmod +x FastAPI_Server.py
Create a new service tied to the shell script to start the app on boot
sudo nano /etc/systemd/system/fastapi.service
[Unit]
Description=fastapi
After=mysql.service
[Service]
ExecStart=/opt/yourproject/fastapi_startup.sh
TimeoutSec=60
Restart=on-failure
RestartSec=60
[Install]
WantedBy=multi-user.target
Reload the systemd daemons, enable the service, then start it
sudo systemctl daemon-reload
sudo systemctl enable fastapi
sudo systemctl start fastapi
Check if the service is running now.
sudo systemctl status fastapi
If you need to stop the App to make changes:
sudo systemctl stop fastapi
Copy REST_Connector.py to your control processor repository. I copied it to /src/modules/project
Instantiate the connector in your control script code
from modules.project.REST_Connector import REST_Connector
API = REST_Connector(
<str: this processor name>,
<str: IP address of your server>
)
This will check the health of the app and database before sending additional commands.
Running this check at every system startup is recommended.
In Extron Control Script:
def Startup(): # your startup function
@Wait(0.1) # multi-thread hack to prevent blocking
def CheckMetricsEnabled():
API.EnableAPI_Metrics = API.get_global_api_metrics_enable()
print("API Metrics is {}".format(API.EnableAPI_Metrics))
Usage: call API.start_metric or API.stop_metric, passing the name of the metric as the only parameter
- At some system-on function, add:
API.start_metric("System On")
- This will end up in your database as:

|<processor name>|<current time in ISO format>|System On|Started
The parameter passed to API.start_metric can be any string under 255 characters. This string is sanitized before being sent to the database to prevent SQL injection.
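The row a start_metric call produces can be sketched as follows. The function and processor name here are illustrative; the real row formatting happens in REST_Connector.py and on the server side.

```python
from datetime import datetime

MAX_METRIC_LEN = 255  # matches the varchar(255) columns in testextron

# Illustrative sketch of the row a start_metric call produces:
# (processor name, current ISO time, metric name, "Started").
def build_metric_row(processor_name: str, metric: str, action: str):
    if len(metric) > MAX_METRIC_LEN:
        raise ValueError("metric name must be under 255 characters")
    return (processor_name, datetime.now().isoformat(), metric, action)

row = build_metric_row("Room101-Processor", "System On", "Started")
```

A matching stop_metric call would produce the same row shape with "Stopped" as the action, which is what lets you compute durations from the table later.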
In FastAPI_Server.py, define an action and a page that acts as a trigger
In this example, we will read everything from a table called 'testtable' and send that to the processor when "Btn_GetTable" is pressed
@app.get("/get_table")
async def get_table():
data = await db_connect.db_read('SELECT * FROM testtable')
return {"message": data}
In Extron Control Script:
@event(Btn_GetTable, "Pressed")
def GetTable(btn, state):
@Wait(0.1) # multi-thread hack to prevent blocking
def GetTable_Inner():
return API.rest_read(
url="<your_server_ip>:8080/get_table",
timeout=5 # remember this is non-blocking
)
In Extron Control Script:
@event(Btn_PC_Input, "Pressed")
def SetInput_PC(btn, state):
<switcher commands, GUI commands, etc>
API.start_metric("PC", group="Inputs")
The current input is stored in the REST_Connector class. When group is passed as a parameter and that group is "Inputs", it will send a metric-stopped message for the old input and a metric-started message for the new input.
Currently only the "Inputs" group is supported for mutual exclusivity, but you can add more by adding a class attribute to store the data, adding that attribute to the clear_group method, and adding your logic to the _handle_group method.
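A minimal sketch of that mutual-exclusivity behavior, with stand-in method and attribute names (`_handle_group`, `clear_group`, `current_input`) that mirror the description above; the real implementation lives in REST_Connector.py.

```python
# Sketch of the "Inputs" group behavior: starting a new input metric
# first stops the metric for the previously active input.
class MetricGroups:
    def __init__(self, send):
        self.send = send           # callable that ships a metric message
        self.current_input = None  # attribute backing the "Inputs" group

    def _handle_group(self, group, metric):
        if group == "Inputs":
            if self.current_input and self.current_input != metric:
                self.send(self.current_input, "Stopped")  # unload old input
            self.current_input = metric

    def start_metric(self, metric, group=None):
        if group:
            self._handle_group(group, metric)
        self.send(metric, "Started")

    def clear_group(self, group):
        if group == "Inputs":
            self.current_input = None

log = []
api = MetricGroups(lambda metric, action: log.append((metric, action)))
api.start_metric("PC", group="Inputs")
api.start_metric("HDMI", group="Inputs")
```

Adding another mutually exclusive group means adding another backing attribute plus its branches in `_handle_group` and `clear_group`, as described above.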