
Docker Setup and Test Redis and Celery Locally


To enhance performance, we use Redis, Celery Beat, and a Celery worker to process queued scheduled tasks such as the nightly MV refresh, the legal documents reload, and the Elasticsearch backup.

Download tasks are queued in Redis. The Celery worker finds the job in the queue, processes the request, builds and executes the query against the database, and uploads the query output to the S3 bucket. The app sends a message to Slack at the end of the task.
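For orientation, here is a minimal sketch of the beat/worker pattern described above. It is illustrative only; the real tasks and schedules live in `webservices/tasks/`, and the task name below is hypothetical:

```python
import celery
from celery.schedules import crontab

# Beat publishes due tasks to the Redis broker; a worker pops and runs them.
app = celery.Celery('openfec-sketch', broker='redis://localhost:6379/0')

@app.task(name='refresh_materialized_views')  # hypothetical task name
def refresh_materialized_views():
    """Stand-in for a nightly job such as the MV refresh."""
    ...

app.conf.beat_schedule = {
    'nightly-mv-refresh': {
        'task': 'refresh_materialized_views',
        'schedule': crontab(hour=5, minute=0),  # illustrative time
    },
}
```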

To test Redis, Celery Beat, and the Celery worker locally:

Modify the task code

Temporarily modify the webservices/tasks/__init__.py file:

  1. Disable SSL locally. Change

`"url": redis_url() + "?ssl=true",`

to

`"url": redis_url(),`
  2. Comment out the broker_use_ssl and redis_backend_use_ssl variables (a quick connectivity check follows this list):
    # broker_use_ssl={
    #     'ssl_cert_reqs': ssl.CERT_NONE,
    # },
    # redis_backend_use_ssl={
    #     'ssl_cert_reqs': ssl.CERT NONE,
    # },
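With SSL disabled, you can confirm the worker will be able to reach a local Redis over a plain connection. This is a minimal sketch using redis-py, assuming the docker-compose default of Redis on localhost:6379:

```python
import redis

# Plain (non-TLS) connection, matching the local config change above.
r = redis.Redis.from_url('redis://localhost:6379/0')
r.ping()  # raises an exception if Redis is unreachable or expects TLS
print('Redis reachable without SSL')
```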

Retrieve AWS S3 Bucket Credentials

  1. Log in to cloud.gov with the Cloud Foundry CLI and get the AWS S3 bucket credentials and user-provided credentials:
cf login --sso
cf target -s <space>
cf env api
  2. In the output, look for the S3 service named 'fec-s3-api' and export the environment variables:
export AWS_ACCESS_KEY_ID="xxxxx"
export AWS_SECRET_ACCESS_KEY="xxxxx"
export AWS_PUBLIC_BUCKET="xxxxx"
export AWS_DEFAULT_REGION="xxxxx"
  • To test downloads, also export the download API key (not needed when testing in the 'dev' space):
export FEC_DOWNLOAD_API_KEY="xxxxx"
  • To test the Slack message, export the webhook URL (a sanity check for these credentials follows this list):
export SLACK_HOOK="xxxxx"
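Before wiring these into Celery, a quick standalone check can confirm the credentials and webhook are usable. This is a sketch, not project code; boto3 reads the AWS_* variables exported above automatically:

```python
import os
import boto3
import requests

# S3: head_bucket raises ClientError if the credentials or bucket are wrong.
s3 = boto3.client('s3', region_name=os.environ['AWS_DEFAULT_REGION'])
s3.head_bucket(Bucket=os.environ['AWS_PUBLIC_BUCKET'])
print('S3 credentials OK')

# Slack: incoming webhooks accept a JSON payload with a "text" field.
if os.environ.get('SLACK_HOOK'):
    resp = requests.post(os.environ['SLACK_HOOK'],
                         json={'text': 'local celery test: webhook works'})
    resp.raise_for_status()
    print('Slack webhook OK')
```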

Start CMS

  1. Open a separate terminal, go to the fec-cms repo, and activate the Python virtual env.
  2. Export the necessary env variables and point to the local API:
export DATABASE_URL=postgresql://:@/cfdm_cms_test
export FEC_API_URL=http://localhost:5000
export FEC_WEB_API_KEY_PRIVATE=xxxxx # not needed when pointing to the local or 'dev' API
export FEC_WEB_API_KEY_PUBLIC=xxxxx  # not needed when pointing to the local or 'dev' API
export FEC_CMS_ENVIRONMENT=LOCAL
  3. Run the CMS: `./manage.py runserver` (a quick check that the server is up follows below)
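Once runserver is up, a quick probe confirms the CMS is answering locally. A sketch, assuming Django's runserver default of port 8000:

```python
import requests

# Any HTTP response (even a redirect or 404) means the dev server is listening.
resp = requests.get('http://127.0.0.1:8000/', allow_redirects=False)
print('CMS responded with status', resp.status_code)
```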

Build the API

  1. Open the Rancher Desktop application.
  2. In the same terminal as step two, build the image if it hasn't been built before: `docker-compose build`

Test Cron Jobs

Note: this tests against your local database.

  1. Temporarily change the schedule in webservices/tasks/__init__.py so a job runs in the near future.
  2. Start all services: `docker-compose up -d`
  3. Check the logs in Rancher Desktop to ensure there are no errors (or inspect the queue directly; see the sketch below).
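To see whether Beat actually enqueued anything, you can look at the broker directly. A sketch using redis-py; 'celery' is Celery's default queue name, so adjust if the project routes tasks to other queues:

```python
import redis

r = redis.Redis.from_url('redis://localhost:6379/0')
# Pending tasks sit in a Redis list named after the queue.
print('tasks waiting in default queue:', r.llen('celery'))
```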

Test Downloads

  1. To test against the dev database, add `- SQLA_CONN=${SQLA_CONN}` to the celery-worker service's environment in docker-compose.yml. If not added, it will test against your local database.
  2. Stop services: `docker-compose down`
  3. Start the necessary services: `docker-compose up -d openfec db celery-worker redis`
  4. Go to http://127.0.0.1:8000 and find data to export.
  5. Check the logs in Rancher Desktop to ensure there are no errors (or list the bucket directly; see the sketch after this list).
  6. When done, remove the SQLA_CONN environment variable from the celery-worker service.
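After requesting an export, the worker should upload the result to the bucket configured earlier. A quick way to confirm, sketched with boto3 using the same env vars as before:

```python
import os
import boto3

s3 = boto3.client('s3', region_name=os.environ['AWS_DEFAULT_REGION'])
resp = s3.list_objects_v2(Bucket=os.environ['AWS_PUBLIC_BUCKET'], MaxKeys=1000)
# Sort by LastModified to spot the object the worker just uploaded.
newest = max(resp.get('Contents', []), key=lambda o: o['LastModified'], default=None)
print('newest object:', newest['Key'] if newest else 'bucket is empty')
```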