# Bedrock v6.1.20
# Getting started

## Installation

You'll need to set up our pypi repository
([see our docs](https://docs.keyholding.com/doc/sonatype-nexus-KyEZsQGa0M#h-pypi)).
Let's create a project called "geo-area":

```shell
mkdir geo-area
cd geo-area
python3 -m venv venv  # optional, but recommended, especially if you get a managed-environment error when trying to install bedrock globally
source ./venv/bin/activate
pip install lambda-bedrock
deactivate
source ./venv/bin/activate  # refresh the virtual environment paths after bedrock is installed
```
### Dependency note

Bedrock uses `boto3` but doesn't declare it as part of its dependencies.
This is because AWS Lambda's Python runtime already includes it.

Locally, your `requirements.txt` file should include `boto3`.
## Set up

### Step 1: Initialise your project

Inside your `geo-area` directory (the one you created in the Installation step), run:

```shell
bedrock init --name=geo-area
```

This should have created a few files in your project directory. You should be able to see the following structure now:
```
geo-area/
├── .github/ <---------------------- GitHub actions for CI/CD
│   ├── common/
│   │   └── deploy-to-aws/
│   │       └── action.yml
│   └── workflows/
│       ├── api-tests.yml
│       ├── build-artifacts.yml
│       ├── deployment.yml
│       ├── integration-tests.yml
│       └── unit-tests.yml
├── .gitignore
├── .idea/
│   ├── .gitignore
│   └── workspace.xml
├── app/ <-------------------------- Your microservice application code
│   ├── .coveragerc
│   ├── .project
│   ├── Dockerfile
│   ├── config/
│   │   ├── __init__.py
│   │   └── config.py
│   ├── endpoints/ <---------------- This is where REST endpoints and Kafka listeners go
│   │   ├── __init__.py
│   │   └── status.py
│   ├── handler.template.py <------- Used to generate the lambda handler during the build process
│   ├── local.handler.py <---------- Used to run the application locally
│   ├── model/ <-------------------- Your database models go here
│   │   └── __init__.py
│   ├── requirements-dev.txt
│   ├── requirements.txt
│   ├── tests/
│   │   ├── integration/
│   │   └── unit/
│   └── workers/ <------------------ If your application chains off background workers, they go here
│       └── __init__.py
├── database-seed/ <---------------- Generated database schema and seed files
├── docker-compose.yml
├── docs/ <------------------------- Persistent documentation files end up here (and your icon.svg too)
│   └── adr/ <---------------------- Architectural Decision Records
├── migrations/ <------------------- Sqlow database migration files
├── openapi/ <---------------------- OpenAPI spec files (these files are auto-generated)
├── readme.md
└── terraform/
    ├── main.tf
    ├── outputs.tf
    ├── terraform.tf
    └── vars.tf
```
### Step 2: Run it!

In a terminal, run:

```bash
pip install -r app/requirements.txt -r app/requirements-dev.txt
bedrock run
```

And in your browser, go to http://localhost:5050/status

You should see something like this:
```json
{
  "version": "0.0.0",
  "db": {
    "canConnect": false,
    "hasRequiredTableAccess": false
  }
}
```
You probably want to configure your application (see next step) so that your app has the right DB access.
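Once the server responds, you can also sanity-check the payload programmatically, e.g. in a smoke test. A minimal stdlib sketch (the field names are taken from the sample response above; adjust if your bedrock version reports additional checks):

```python
import json

def status_is_healthy(payload: str) -> bool:
    """Return True when a /status response reports full DB access.

    Field names come from the sample /status response; this helper is
    an illustrative sketch, not part of bedrock itself.
    """
    status = json.loads(payload)
    db = status.get("db", {})
    return bool(db.get("canConnect")) and bool(db.get("hasRequiredTableAccess"))

sample = '{"version": "0.0.0", "db": {"canConnect": false, "hasRequiredTableAccess": false}}'
print(status_is_healthy(sample))  # False until the database is configured
```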
### Step 3: Add an endpoint

An endpoint for `/status` has been generated for you; you can use that as an example!

You may also use bedrock's code generators to create a new endpoint with an associated model.
#### Step 3.1 - Create endpoints for planets and countries

Let's create 2 endpoints:
* a `/planets` endpoint with:
  * a `Planet` model
  * auto-sync based on the `planet-changed` kafka topic
  * authorisation by planets
* a `/planets/{planetUuid}/countries` endpoint with:
  * a `Country` model
  * authorisation by planets
In your `geo-area` directory, run:

```shell
# Do Planets
bedrock new endpoint '/planets' \
  --kafka-topics="planet-changed" \
  --protection="planets" \
  --model-table='global_planets' \
  --model-attributes='name:String(255),nullable=False;code:String(255),nullable=False;is_habitable:Boolean,nullable=False,default=False' \
  --model-path-hint="self.code"

# Do Countries
bedrock new endpoint '/planets/{planetUuid}/countries' \
  --protection="planets" \
  --model-attributes='name:String(255),nullable=False;population:Numeric,nullable=False,default=False' \
  --model-path-hint="self->Planet.code"
```
These commands will create a couple of files each, so your directory structure will now look like this:

```
geo-area/
⋮
├── app/
⋮   ⋮
│   ├── endpoints/
│   │   ├── __init__.py
│   │   ├── countries.py
│   │   ├── planets.py
│   │   └── status.py
⋮   ⋮
│   ├── model/
│   │   ├── __init__.py
│   │   ├── country.py
│   │   └── planet.py
⋮   ⋮
```
`endpoints` is what will be used to handle the requests to your API, and `model` is what will be used to interact with the database.

You can customise these further by looking at the documentation for [endpoints](https://bedrock-docs.keyholding.com/bedrock/endpoints.html) and [models](https://bedrock-docs.keyholding.com/bedrock/model.html).
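The `--model-attributes` value used above is a semicolon-separated list of `name:Type,option=value` column specs. To make the format concrete, here is a rough stdlib sketch of how such a string decomposes (this parser is our own illustration, not bedrock's actual code, and it assumes types contain no commas):

```python
def parse_model_attributes(spec: str) -> dict:
    """Split a bedrock --model-attributes string into column specs.

    Illustrative only -- mirrors the documented format rather than
    bedrock's real parser.
    """
    columns = {}
    for column in spec.split(";"):
        # Each entry is "name:Type[,option=value,...]"
        name, definition = column.split(":", 1)
        type_, *options = definition.split(",")
        columns[name] = {"type": type_, "options": options}
    return columns

attrs = parse_model_attributes(
    "name:String(255),nullable=False;is_habitable:Boolean,nullable=False,default=False"
)
print(attrs["is_habitable"])
# → {'type': 'Boolean', 'options': ['nullable=False', 'default=False']}
```

The types (`String(255)`, `Boolean`, `Numeric`, `JSONB`) and options (`nullable`, `default`) follow SQLAlchemy column conventions.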
#### Step 3.2 - Generate the database seed

Now that we've got some model objects, we can generate the database seed.

```bash
bedrock generate schema
```

This will create two files in `./database-seed/`:
* `./database-seed/drop.sql`
* `./database-seed/seed.sql`

*Note: when adding, removing, or changing a model definition, you'll have to re-generate the database schema and reset your database (see below).*
#### Step 3.3 - Run the database

Now you can create the database using the provided `docker-compose.yml` file:

```bash
docker compose up -d database
```

The compose file configures a postgres database.
It will create a volume for the database data in `./.db-data`.

If you'd like to reset the database, you can run:

```bash
docker compose down
rm -rf ./.db-data
bedrock generate schema
docker compose up -d database
```
#### Step 3.4 - Try it!

Start the server again:

```bash
bedrock run
```

And try to access one of the endpoints, e.g. `/planets`:

```bash
curl -X GET http://localhost:5050/planets
```
### Step 4: Create a cache server

A local cache server is required to use our websocket functionality locally. To start the cache server, run:

```bash
docker compose up -d cache
```

This will create a volume for the cache data in `./.cache-data`.

If you'd like to wipe the data stored in the local cache server, you can run:

```bash
docker compose down
rm -rf ./.cache-data
docker compose up -d cache
```
# Upgrade notes

See [upgrade-notes](./upgrade-notes.md).

# Documentation

Full documentation is hosted at https://bedrock-docs.keyholding.com/

Or, to build the docs locally:
* Create a python [virtual env](#installation)
* Install `./requirements.txt`
* Run `./scripts/docs.sh`
* Drag & drop the generated `./.docs/index.html` file into a browser
# Deployment

`bedrock init` should have created a `terraform` folder for you, with some base code that you can edit.

Look inside `terraform/vars.tf` for any variables you need to set a default value for. A good list to start with is:
* `app_name`
* `top_domain`
* `vpc_id`
* `account_id`
* `subnet_ids`
* `cert_arn`
* `hosted_zone_id`
* `needs_own_database`

Also check if there are variables that you don't need.
## Prepare deployment by building necessary elements

Similar commands should already be set up inside [`.github/workflows/build-artifacts.yml`](.github/workflows/build-artifacts.yml).

```shell
# Generate OpenAPI spec that is used for documentation & API Gateway configuration
bedrock generate openapi-spec [options]

# Generate supporting files that get used by Terraform
bedrock generate tf-support-files

# Generate the database seed
bedrock generate schema

# Build the zips for each endpoint and the lambda
bedrock build --endpoint=endpoint_name1    # generates the lambda zip
bedrock build --endpoint=endpoint_name2    # generates the lambda zip
bedrock build --layer                      # generates the zip for the layer
bedrock build --worker=worker_name1        # generates the lambda zip for the worker
bedrock build --worker=worker_name2        # generates the lambda zip for the worker
bedrock build --worker-layer=worker_name1  # generates the zip for the worker layer
bedrock build --worker-layer=worker_name2  # generates the zip for the worker layer
```
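If you drive these builds from your own CI scripts, generating the command list programmatically keeps it in sync with your endpoints and workers. A small sketch (the endpoint and worker names below are placeholders, and the helper itself is ours, not part of bedrock):

```python
def bedrock_build_commands(endpoints, workers):
    """Produce the sequence of `bedrock build` invocations for a deploy.

    Mirrors the build steps shown above; run each command via your CI
    shell or subprocess of choice.
    """
    commands = [f"bedrock build --endpoint={e}" for e in endpoints]
    commands.append("bedrock build --layer")
    commands += [f"bedrock build --worker={w}" for w in workers]
    commands += [f"bedrock build --worker-layer={w}" for w in workers]
    return commands

for cmd in bedrock_build_commands(["planets", "countries"], []):
    print(cmd)
```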
## Note on deploying to AWS

The build process should be done on a linux machine (which can be done through GitHub actions).

On a Mac, the application can be *run* locally via `bedrock run`, but the lambdas **cannot be built and zipped locally** for upload to AWS, because the Python libraries are produced in a different way that causes errors.

So the easiest thing is to build and deploy via GitHub actions.
# Bedrock CLI

```
Usage:
  bedrock --help
  bedrock --version
  bedrock init --name=<project-name>
  bedrock <run|check|build> [options]
  bedrock new endpoint <path> [options]
  bedrock new worker <name>.<js|py>
  bedrock build-all [options]
  bedrock generate schema
  bedrock generate openapi-spec [options]
  bedrock generate tf-support-files
  bedrock test [--unit|--integration] [options]

Init Options:
  --name=<project-name>
      Your project name. This will be used by terraform, openapi spec and other generators to label/prefix resources and names.

Build Options:
  --resource=<resource>
      DEPRECATED. Build the designated resource.
  --endpoint=<endpoint>
      Build the designated endpoint.
  --layer
      Build the layer for the application.
  --worker=<worker>
      Build the designated worker.
  --worker-layer=<worker>
      Build the layer for the designated worker.
  --pre-package-layer-command=<command>
      Run a command just before zipping the dependencies.

New Endpoint Options:
  <path>
      The full path for this endpoint. E.g. /countries/countryUuid/regions/regionUuid/cities
  --kafka-topics=<topics>
      Optional comma-separated list of topic names.
  --protection=<entity>
      Optional entity for @protected('entity'). Defaults to 'accounts'
  --model=<model>
      Optional model name to use as the related model for this endpoint.
  --model-attributes=<attributes>
      Required semi-colon separated list of colon-separated column name/type pairs.
      E.g.: 'name:String(255),nullable=False;description:String(1024),nullable=True;details:JSONB'
  --model-path-hint=<hint>
      Optional model path hint.
  --model-table=<table_name>
      Optional model table name.

Build-all Options:
  --url=<url>
      URL for the server
  --env=<env>
      Environment to use (Default: testing)
  --aws-region=<region>
      AWS Region for API Gateway (Default: eu-west-1)

Generate API Spec Options:
  --env=<env>
      Environment|Host pair to use (Default: 'testing|-api.testing.keyholding.com')
  --aws-region=<region>
      AWS Region for API Gateway (Default: eu-west-1)
  --output-filename=<filename>
      File name for the generated api spec (Default: openapi.spec.yml)
  --api-version=<version>
      API version (typically, the version of the application)
  --documentation-only
      Don't generate OPTIONS verbs or x-amazon-apigateway-integration fields for endpoints.

Unit Test Options:
  --minimum-coverage=<value>
      Minimum expected coverage (fails if not met)

Integration Test Options:
  --minimum-coverage=<value>
      Minimum expected coverage (fails if not met)
  --database-host=<host>
      Sets the database host to use for integration tests
  --database-username=<username>
      Sets the database username to use for integration tests
  --database-password=<password>
      Sets the database password to use for integration tests
  --database-port=<port>
      Sets the database port to use for integration tests
  --database-schema=<schema>
      Sets the database schema to use for integration tests
```