Both wind and sun simulations are provided by AIT Infrared. This is a glue-code repository to communicate with their endpoint. You need a CityPyO user at DCS to use this repo.

To run a sun simulation, POST a request to /trigger_calculation_sun. The JSON payload should contain only your CityPyO user id: `{ "city_pyo_user": "YOUR_ID" }` (it is used to get the building geometries from CityPyO).
Requests to calculate wind comfort for a CityPyO user. Inputs are:
- Wind speed
- Wind direction
- CityPyO user id (used to get the building geometries from CityPyO)

Results are provided as GeoJSON or PNG. Results are calculated by the Infrared API of AIT.
The "wind comfort" service predicts a plane of Lawson Criteria categories, given an input wind direction and speed. The returned normalised values represent categories as seen in the following table:
value | lawson criteria category |
---|---|
0.0 | "Sitting Long" |
0.2 | "Sitting Short" |
0.4 | "Walking Slow" |
0.6 | "Walking Fast" |
0.8 | "Uncomfortable" |
1.0 | "Dangerous" |
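As a sketch, a normalised result value can be decoded into its category label with a simple lookup. The helper below is illustrative, not part of this repo; it snaps to the nearest tabulated step to tolerate floating-point noise:

```python
# Map a normalised wind-comfort value to its Lawson criteria category,
# using the steps from the table above. Illustrative helper, not repo code.
LAWSON_CATEGORIES = {
    0.0: "Sitting Long",
    0.2: "Sitting Short",
    0.4: "Walking Slow",
    0.6: "Walking Fast",
    0.8: "Uncomfortable",
    1.0: "Dangerous",
}

def lawson_category(value: float) -> str:
    # Snap to the nearest tabulated step to tolerate float noise.
    nearest = min(LAWSON_CATEGORIES, key=lambda step: abs(step - value))
    return LAWSON_CATEGORIES[nearest]
```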
value | sunlight hours average |
---|---|
0.0 | "< 1.2 h/day" |
0.1 | "2.4 h/day" |
0.2 | "3.6 h/day" |
0.3 | "4.8 h/day" |
0.4 | "6 h/day" |
.... | |
1.0 | "12 h/day" |
Results are obtained through a 3-step process:

1. Trigger a calculation (wind): POST request to `/trigger_calculation`
    - Params:
        - `"wind_speed"`: INT [km/h]
        - `"wind_direction"`: INT [0-360°] (0 being north, 90 east)
        - `"city_pyo_user"`: YOUR_CITYPYO_USER_ID
    - Returns the task id of the Celery task:
      ```json
      { "taskId": __TASK_ID__ }
      ```

   Trigger a calculation (sun): POST request to `/trigger_calculation_sun`
    - Params:
        - `"city_pyo_user"`: YOUR_CITYPYO_USER_ID
    - Returns the task id of the Celery task:
      ```json
      { "taskId": __TASK_ID__ }
      ```

2. Get the result of the Celery task: GET request to `/check_on_singletask/TASK_ID`
    - Returns a group task id:
      ```json
      { "result": __GROUP_TASK_ID__ }
      ```

3. Get the result of the group task: GET request to `/collect_results/GROUP_TASK_ID`
    - Param: `"result_format"`: `"geojson"` || `"png"`
    - Returns the actual result, accompanied by some meta information on group task calculation progress:
      ```json
      {
        "results": { __RESULT_OBJECT__ },
        "grouptaskProcessed": boolean,
        "tasksCompleted": 1,
        "tasksTotal": 7
      }
      ```
    - RESULT_OBJECT for result_format "geojson":
      ```json
      { "results": { "type": "FeatureCollection", "features": [...] } }
      ```
    - RESULT_OBJECT for result_format "png":
      ```json
      {
        "bbox_coordinates": [ [ LAT, LONG ], ... ],
        "bbox_sw_corner": [ [ LAT, LONG ] ],
        "image_base64_string": "PNG_STRING",
        "img_height": PIXELS_Y,
        "img_width": PIXELS_X
      }
      ```
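The steps above can be sketched as a small client flow. The HTTP calls are injected as callables so the sketch stays self-contained and testable; with `requests` you would pass thin wrappers around `requests.post(...).json()` and `requests.get(...).json()`. Only the endpoint paths and field names come from this README; the function itself is illustrative:

```python
def run_wind_flow(post, get, user_id, wind_speed, wind_direction,
                  result_format="geojson"):
    """Trigger a wind calculation, resolve the task, collect results."""
    # 1. Trigger the calculation; the API returns a Celery task id.
    task = post("/trigger_calculation", {
        "wind_speed": wind_speed,
        "wind_direction": wind_direction,
        "city_pyo_user": user_id,
    })
    # 2. Resolve the single task into a group task id.
    group_id = get(f"/check_on_singletask/{task['taskId']}")["result"]
    # 3. Collect the (possibly still partial) group results.
    return get(f"/collect_results/{group_id}?result_format={result_format}")
```

A real client would poll steps 2 and 3 until `grouptaskProcessed` is true, since group results arrive per 300m bbox.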
- Clone repo
- Create venv, install requirements
- Export ENV variables (see below)
- RUN `docker-compose up redis` to start only the Redis docker
- Activate venv
- RUN `celery -A tasks worker --loglevel=info --concurrency=8 -n worker4@%h`
- RUN `entrypoint.py`
- RUN the mock-api via `docker-compose up ait-mock-api`
- Test by executing `sample_request.sh`
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASS=YOUR_PASS
- INFRARED_URL=http://ait-mock-api:5555/
- INFRARED_USERNAME=test
- INFRARED_PASSWORD=test
- CITY_PYO=https://YOUR_CITYPYO_URL
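For illustration, a service would typically read these variables from the environment at startup. A minimal sketch, assuming the variable names from the list above (the defaults and the assembled URL are assumptions, not taken from this repo):

```python
import os

# Read the configuration listed above from the environment.
# Defaults here are assumptions for local development, not repo defaults.
REDIS_HOST = os.environ.get("REDIS_HOST", "redis")
REDIS_PORT = int(os.environ.get("REDIS_PORT", "6379"))
REDIS_PASS = os.environ.get("REDIS_PASS", "")
INFRARED_URL = os.environ.get("INFRARED_URL", "http://ait-mock-api:5555/")
CITY_PYO = os.environ.get("CITY_PYO", "")

# e.g. a Redis connection URL assembled from these values:
REDIS_URL = f"redis://:{REDIS_PASS}@{REDIS_HOST}:{REDIS_PORT}/0"
```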
In general this software is a wrapper around the AIT Infrared API. The software takes care of:
- subdividing the area of interest into 300m x 300m bboxes ("projects") for calculation
- creating/updating "projects" at Infrared; each project contains a set of buildings, and the wind-comfort calculation is run per project
- translating geospatial data into the local coordinates and format of AIT projects
- automated merging of results at bbox/project intersections
- converting a project's result to GeoJSON
- providing result GeoJSONs as PNG if requested
- keeping your projects at the AIT API alive (by regular requests to them)
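The first point, subdividing the area of interest into 300m x 300m bboxes, can be sketched as a simple tiling in projected metre coordinates. The function below is illustrative, not the repo's actual implementation:

```python
def subdivide_bbox(min_x, min_y, max_x, max_y, size=300):
    """Split a bounding box (projected, in metres) into size x size tiles.

    Edge tiles are clipped to the bbox, so the whole area is covered
    without overshooting its bounds.
    """
    tiles = []
    y = min_y
    while y < max_y:
        x = min_x
        while x < max_x:
            tiles.append((x, y, min(x + size, max_x), min(y + size, max_y)))
            x += size
        y += size
    return tiles
```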
The AIT API is a GraphQL API which allows creating and updating projects and calculating results per project. The mock API mocks this behaviour and will always return the same mock result.
This sample project shows how to use Celery to process task batches asynchronously. For simplicity, the sum of two integers is computed here. To simulate complexity, a random duration (3-10 seconds) is added to the processing. Using Celery, this tech stack offers high scalability. For ease of installation, Redis is used here: through Redis the tasks are distributed to the workers, and the results are also stored on Redis.
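A hedged sketch of that demo task, shown here as a plain function so the logic is clear; in the actual stack it would be registered as a Celery task (e.g. via `@app.task`):

```python
import random
import time

def add(x, y, delay_range=(3, 10)):
    """Add two integers after a random delay that simulates real work."""
    time.sleep(random.randint(*delay_range))
    return x + y
```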
Wrapped with an API (Flask), the stack provides an interface for other services. The whole thing is then deployed with Docker.
After a task has been successfully processed, the result is cached on Redis along with the input parameters wind_speed, wind_direction and buildings.geojson. The cached result is then returned whenever a (different) task is requested with the same input parameters.
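The caching described above implies a deterministic key derived from the three inputs. One way to sketch it (the hashing scheme and key prefix are assumptions, not taken from this repo):

```python
import hashlib
import json

def cache_key(wind_speed, wind_direction, buildings_geojson):
    """Build a deterministic Redis key from the task's input parameters."""
    payload = json.dumps(
        {
            "wind_speed": wind_speed,
            "wind_direction": wind_direction,
            "buildings": buildings_geojson,
        },
        sort_keys=True,  # key ordering must not change the hash
    )
    return "result:" + hashlib.sha256(payload.encode()).hexdigest()
```

A worker would SET this key to the serialised result after computing; a later task with identical inputs would GET it instead of recomputing.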
- Python
- Celery
- Redis
- Flask
- Docker
```shell
docker-compose build
docker-compose up -d
celery -A tasks worker --loglevel=info
```
List tasks:

```shell
redis-cli -h HOST -p PORT -n DATABASE_NUMBER llen QUEUE_NAME
```

List queues:

```shell
redis-cli -h HOST -p PORT -n DATABASE_NUMBER keys \*
```