# Bedrock Workers
Bedrock supports workers.
Workers are processes that run outside the main application and may be triggered by other workers or endpoints.

This means they will be deployed in their own lambdas.
# How to create a worker
You can use the bedrock helper command to create a new worker:

```shell
bedrock new worker <name>.<py|js>
```
Bedrock will automatically detect your worker and deploy it to AWS Lambda (and also build a layer with all dependencies if needed).

Behind the scenes, all it does is create a new folder in your `workers` directory with the name of your worker.
For example, `my_app/workers/system_fetcher/`.

It then creates a file called `main.py` (for python workers) or `main.js` (for javascript workers) in that folder.
This file will contain a class with the CamelCase name of the folder, for example `SystemFetcher`.

The folder will also contain a `requirements.txt` file (for python workers) or a `package.json` file (for javascript workers).

Additionally, javascript workers will include a `worker.js` file that contains a base class for javascript workers.
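The naming convention (folder `system_fetcher` → class `SystemFetcher`) can be sketched as a one-liner; this is an illustration, not Bedrock's actual implementation:

```python
def worker_class_name(folder_name: str) -> str:
    # Convert a snake_case folder name into the CamelCase class name
    # that Bedrock expects to find inside main.py / main.js.
    return "".join(part.title() for part in folder_name.split("_"))

print(worker_class_name("system_fetcher"))  # SystemFetcher
```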
## Python example
Run the following command to create a new worker:
```shell
bedrock new worker system_fetcher.py
```
This will create a new folder called `system_fetcher` in your `workers` directory with the following structure:
```
my_app/
├── app/
│   ├── config/
│   ├── endpoints/
│   ⋮
│   ├── workers/
│   │   ├── system_fetcher/
│   │   │   ├── __init__.py
│   │   │   ├── main.py
│   │   │   └── requirements.txt
⋮   ⋮   ⋮
```
You can then modify the `main.py` file to add your logic:
```python
# my_app/workers/system_fetcher/main.py
import requests

from bedrock.workers.worker import Worker


class SystemFetcher(Worker):
    def execute(self, event, context):
        response = requests.get("https://badass-space-info-website.com/api/systems")
        if response.status_code != 200:
            return False
        return response.json()  # Or False, or True. It just needs to be loadable by json.loads()
```
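Whatever `execute` returns has to survive a JSON round trip, since it is passed between lambdas as a JSON payload. A quick sanity check you can run on a candidate return value:

```python
import json

# True, False, and any dict/list of JSON-compatible values are all fine;
# objects like datetimes or sets would fail json.dumps.
for value in [True, False, {"systems": [{"id": 1, "name": "Sol"}]}]:
    assert json.loads(json.dumps(value)) == value
```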
You can also add dependencies to your worker by adding them to the `requirements.txt` file:
```
# my_app/workers/system_fetcher/requirements.txt
requests==2.28.2
lambda-bedrock==0.5.0
```
## Javascript example
Run the following command to create a new worker:
```shell
bedrock new worker system_fetcher.js
```
This will create a new folder called `system_fetcher` in your `workers` directory with the following structure:
```
my_app/
├── app/
│   ├── config/
│   ├── endpoints/
│   ⋮
│   ├── workers/
│   │   ├── system_fetcher/
│   │   │   ├── main.js
│   │   │   ├── package.json
│   │   │   └── worker.js
⋮   ⋮   ⋮
```
You can then modify the `main.js` file to add your logic:

```javascript
// my_app/workers/system_fetcher/main.js
const axios = require('axios');
// This is the base class for javascript workers. It gets automatically generated by bedrock when you create a new worker.
const { Worker } = require('./worker');

class SystemFetcher extends Worker {
    async execute(event, context) {
        const response = await axios.get("https://badass-space-info-website.com/api/systems");
        if (response.status !== 200) {
            return false;
        }
        return response.data; // Or false, or true. It just needs to be loadable by python's json.loads()
    }
}
```
You can also add dependencies to your worker by adding them to the `package.json` file:
```
# my_app/workers/system_fetcher/package.json
{
    "name": "system_fetcher",
    "version": "1.0.0",
    "description": "",
    "main": "main.js",
    "dependencies": {
        "axios": "^1.5.1"
    }
}
```
# How to invoke a worker

Bedrock provides a helper function to invoke a `Worker` (or any other lambda) via an `invoke` function (see
`bedrock.workers.invoker.invoke`).

This function takes the ARN of the worker and a payload as parameters. Optionally it can also take a `require_response`
parameter (default: `False`).

It also works locally by figuring out the worker's class name from the ARN (as bedrock derives the lambda function's
name from the folder name) as well as its extension (based on available files) and then calling the `handle` method of
the worker.
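Remotely, `require_response` presumably maps onto AWS Lambda's two invocation types; the helper's internals here are an assumption, but the type names are boto3's:

```python
def invocation_type(require_response: bool) -> str:
    # "RequestResponse" blocks until the lambda returns a payload;
    # "Event" is asynchronous fire-and-forget.
    return "RequestResponse" if require_response else "Event"

print(invocation_type(True))   # RequestResponse
print(invocation_type(False))  # Event
```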
## Example
```python
# my_app/app/endpoints/systems.py
from bedrock.workers.invoker import invoke


class SystemsEndpoint(Endpoint):
    # ...

    def get_global(self, event, context):
        response = invoke("system_fetcher", {}, True)
        if response:
            return 200, response
        return 500, {"error": "Failed to invoke worker"}
```
The Terraform provided by Bedrock will automatically create an environment variable called
`[YOUR_APP]_WORKER_ARN_[YOUR_WORKER_NAME]`.
That gets added to `config["workers"][your_worker_name]`.
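For example, for an app prefixed `MY_APP` (a hypothetical prefix) with a `system_fetcher` worker, the environment variable maps to the config key like this:

```python
import os

# Bedrock's Terraform would set this for real deployments; here we fake it.
os.environ["MY_APP_WORKER_ARN_SYSTEM_FETCHER"] = (
    "arn:aws:lambda:eu-west-1:123456789012:function:my_app-system_fetcher"
)

prefix = "MY_APP"
# Strip the prefix and lowercase the remainder to get the config key.
workers = {
    k.replace(f"{prefix}_WORKER_ARN_", "").lower(): v
    for k, v in os.environ.items()
    if k.startswith(f"{prefix}_WORKER_ARN_")
}
print(workers["system_fetcher"])  # prints the ARN set above
```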
# Using workers in applications built pre-0.5.0
## Update your config
You should update your `your_app/app/config/config.py` file with the following changes:

1. Rename the `"lambdas"` dictionary in `parameters` to `"workers"`.
2. Add `{**{k.replace(f"{prefix}_WORKER_ARN_", "").lower(): v for k, v in get_all_by_prefix(f"{prefix}_WORKER_ARN_").items()}}` as a value to the `"workers"` dictionary (example below).
```python
# your_app/app/config/config.py
# ...
parameters = {
    # ...
    "workers": {
        **{k.replace(f"{prefix}_WORKER_ARN_", "").lower(): v for k, v in get_all_by_prefix(f"{prefix}_WORKER_ARN_").items()}
    }
}
# ...
```
## Update your terraform
It's best to compare your terraform with [bedrock's template terraform](https://github.com/TheKeyholdingCompany/bedrock/tree/main/bedrock/generators/templates/terraform).

The important changes are in:
* `terraform/lambda/`
* `terraform/lambda-layer/`
* `terraform/main.tf`
## Update your deployment workflows
You should update your deployment workflows to include building any workers and any worker layers.

### Example:
```shell
bedrock build --worker=system_fetcher
bedrock build --worker-layer=system_fetcher
```
# How does it work behind the scenes?

I want to make the distinction between what I'll call "Endpoint Lambda" and "Worker Lambda" here:
* Endpoint Lambda is the lambda that has been packaged based on what the endpoint needs (i.e. the code you write in `app/` except for `app/workers`).
* Worker Lambda is the lambda that was packaged based on the folder in `app/workers` (i.e. the code you write in `app/workers/your_worker`).
Very simply put, this is what happens when the application is deployed:
```mermaid
flowchart LR
    el[Endpoint Lambda]
    wl[Worker Lambda]
    el <-- boto3 api call --> wl
```
In more detail:

```mermaid
sequenceDiagram
    Endpoint->>Invoker: Invoke WorkerX
    Invoker->>Config: Get WorkerX ARN
    Config-->>Invoker: WorkerX ARN
    Invoker->>WorkerX: call (via boto3)
    WorkerX->>WorkerX: whatever it needs to do
    WorkerX-->>Invoker: reply*
    Invoker-->>Endpoint: reply*
```
\*If the calls are "fire-and-forget", the replies just release control back immediately.
## Local behaviour

Bedrock tries to simulate the behaviour locally too.

### Python

Python workers run in separate threads:

```mermaid
flowchart TB
    subgraph Python App
        el[Endpoint Lambda]
        wl[Worker Lambda]
    end
    el <-- asyncio --> wl
```
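A minimal sketch of what thread-based local dispatch could look like; this mirrors the diagram above, not Bedrock's actual code, and `run_worker_locally` is a made-up name:

```python
import queue
import threading

def run_worker_locally(execute, event, require_response=False):
    """Run a worker's execute() on a separate thread, optionally waiting for the result."""
    results: queue.Queue = queue.Queue()
    thread = threading.Thread(target=lambda: results.put(execute(event, None)))
    thread.start()
    if require_response:
        thread.join()
        return results.get()
    return None  # fire-and-forget: the caller carries on immediately

result = run_worker_locally(lambda event, context: {"ok": True}, {}, require_response=True)
print(result)  # {'ok': True}
```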
### Javascript

Javascript workers run in entirely separate processes:

```mermaid
flowchart LR
    subgraph Python App
        el[Endpoint Lambda]
    end
    subgraph Node Process
        wl[Worker Lambda]
        n[node]
    end
    el <-- system call --> n
    n <--> wl
```