# Infrastructure to deploy Research Projects
The infrastructure is expected to be deployed in the same Pod / local environment as the analysis container and handles all of its outbound communication.
## Configuration
The configuration is located in `/config.yaml`. All relevant variables can be overridden by exporting environment variables (see the example after the table).
| Environment Variable | Default | Description |
|-------------------------------|--------------------------------|--------------------------------------------------------------------------------|
| _service_ | | |
| LOGGING_LEVEL_ROOT | DEBUG | Logging level for service logger |
| RESPONSE_FILE_EXTENSION        | ".NER_ENTITIES.json.gz"        | Extension of the file that stores the analyzed response on storage              |
| _probing_webserver_ | | |
| PROBING_WEBSERVER_HOST | "0.0.0.0" | Probe webserver address |
| PROBING_WEBSERVER_PORT | 8080 | Probe webserver port |
| PROBING_WEBSERVER_MODE | production | Webserver mode: {development, production} |
| _rabbitmq_ | | |
| RABBITMQ_HOST | localhost | RabbitMQ host address |
| RABBITMQ_PORT | 5672 | RabbitMQ host port |
| RABBITMQ_USERNAME | user | RabbitMQ username |
| RABBITMQ_PASSWORD | bitnami | RabbitMQ password |
| RABBITMQ_HEARTBEAT | 7200 | Controls AMQP heartbeat timeout in seconds |
| _queues_ | | |
| REQUEST_QUEUE | request_queue | Requests to service |
| RESPONSE_QUEUE | response_queue | Responses by service |
| DEAD_LETTER_QUEUE | dead_letter_queue | Messages that failed to process |
| _callback_ | | |
| RETRY | False | Toggles retry behaviour |
| MAX_ATTEMPTS | 3 | Number of times a message may fail before being published to dead letter queue |
| ANALYSIS_ENDPOINT             | "http://127.0.0.1:5000"        | HTTP endpoint of the analysis container                                          |
| _storage_ | | |
| STORAGE_BACKEND | s3 | The type of storage to use {s3, azure} |
| STORAGE_BUCKET | "pyinfra-test-bucket" | The bucket / container to pull files specified in queue requests from |
| TARGET_FILE_EXTENSION | ".TEXT.json.gz" | Defines type of file to pull from storage: .TEXT.json.gz or .ORIGIN.pdf.gz |
| STORAGE_ENDPOINT              | "http://127.0.0.1:9000"        | Endpoint URL of the S3-compatible storage                                        |
| STORAGE_KEY                   | root                           | Storage access key                                                               |
| STORAGE_SECRET                | password                       | Storage secret key                                                               |
| STORAGE_AZURECONNECTIONSTRING | "DefaultEndpointsProtocol=..." | Azure Storage connection string (used when STORAGE_BACKEND is azure)            |
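For example, to point the service at a different RabbitMQ broker and storage bucket, export the corresponding variables before starting the service (the values below are placeholders, not required settings):
```bash
# Override selected defaults from config.yaml via environment variables
export RABBITMQ_HOST="localhost"
export RABBITMQ_USERNAME="user"
export RABBITMQ_PASSWORD="bitnami"
export STORAGE_BACKEND="s3"
export STORAGE_BUCKET="pyinfra-test-bucket"
export STORAGE_ENDPOINT="http://127.0.0.1:9000"
```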
## Development
### Local Setup
You can run the infrastructure either as a Python module via `src/serve.py` or as a Docker container simulating the Kubernetes environment.
1. Install the module / build the Docker image
```bash
pip install -e .
pip install -r requirements.txt
```
```bash
docker build -f Dockerfile -t pyinfra .
```
2. Run RabbitMQ & MinIO
```bash
docker-compose up
```
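If you prefer not to use the compose file, the two dependencies can also be started directly. This is only a sketch, assuming the Bitnami RabbitMQ and MinIO images with the default credentials from the table above; the repository's `docker-compose.yml` is authoritative:
```bash
# RabbitMQ (credentials match the RABBITMQ_USERNAME / RABBITMQ_PASSWORD defaults)
docker run -d --name rabbitmq -p 5672:5672 \
  -e RABBITMQ_USERNAME=user -e RABBITMQ_PASSWORD=bitnami \
  bitnami/rabbitmq:latest

# MinIO (credentials match the STORAGE_KEY / STORAGE_SECRET defaults)
docker run -d --name minio -p 9000:9000 \
  -e MINIO_ROOT_USER=root -e MINIO_ROOT_PASSWORD=password \
  minio/minio:latest server /data
```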
3. Run the module / Docker container
```bash
python src/serve.py
```
```bash
docker run --net=host pyinfra
```
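Configuration overrides can be passed to the container in the same way. A minimal example using the defaults from the table (adjust the values to your environment):
```bash
# Run the container with explicit RabbitMQ and storage settings
docker run --net=host \
  -e RABBITMQ_HOST=localhost \
  -e STORAGE_ENDPOINT=http://127.0.0.1:9000 \
  -e STORAGE_BUCKET=pyinfra-test-bucket \
  pyinfra
```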