Fix typos and grammar in md files (#223)
## Changes

Propagating fixes for typos and grammar in documentation `md` files from
a bid back to the template.
yoomlam authored Jul 16, 2024
1 parent dded29c commit ecc0dd0
Showing 12 changed files with 63 additions and 60 deletions.
6 changes: 3 additions & 3 deletions docs/app/README.md
@@ -58,7 +58,7 @@ You can switch which way many of these components are run by setting the `PY_RUN
* `export PY_RUN_APPROACH=local` will run these components natively
* `export PY_RUN_APPROACH=docker` will run these within Docker

```diff
-Note that even with the native mode, many components like the DB and API will only ever run in Docker, and you should always make sure that any implementations work within docker.
+Note that even with the native mode, many components like the DB and API will only ever run in Docker, and you should always make sure that any implementations work within Docker.
```

Running in the native/local approach may require additional packages to be installed on your machine to get working.

Expand All @@ -69,8 +69,8 @@ Running in the native/local approach may require additional packages to be insta
* Run `poetry install --all-extras --with dev` to keep your Poetry packages up to date
* Load environment variables from the local.env file, see below for one option.

```diff
-One option for loading all of your local.env variables is to install direnv: https://direnv.net/
-You can configure direnv to then load the local.env file by creating an `.envrc` file in the /app directory that looks like:
+One option for loading all of your local.env variables is to install `direnv`: https://direnv.net/
+You can configure `direnv` to then load the local.env file by creating an `.envrc` file in the /app directory that looks like:
```

```sh
#!/bin/bash
```
6 changes: 3 additions & 3 deletions docs/app/database/database-management.md
@@ -33,8 +33,8 @@ To clean the database, use the following command:
```sh
make db-recreate
```

```diff
-This will remove _all_ docker project volumes, rebuild the database volume, and
-run all pending migrations. Once completed, only the database container will be
+This will remove _all_ docker project volumes, rebuild the database volume, and
+run all pending migrations. Once completed, only the database container will be
```
running. Simply run `make start` to bring up all other project containers.

## Running migrations
@@ -100,7 +100,7 @@ make db-migrate-history

When multiple migrations are created that point to the same `down_revision` a
branch is created, with the tip of each branch being a "head". The above history
```diff
-command will show this, but a list of just the heads can been retrieved with:
+command will show this, but a list of just the heads can be retrieved with:
```

```sh
make db-migrate-heads
```
2 changes: 1 addition & 1 deletion docs/app/database/database-testing.md
@@ -4,7 +4,7 @@ This document describes how the database is managed in the test suite.

## Test Schema

```diff
-The test suite creates a new PostgreSQL database schema separate from the `public` schema that is used by the application outside of testing. This schema persists throughout the testing session is dropped at the end of the test run. The schema is created by the `db` fixture in [conftest.py](../../../app/tests/conftest.py). The fixture also creates and returns an initialized instance of the [db.DBClient](../../../app/src/db/__init__.py) that can be used to connect to the created schema.
+The test suite creates a new PostgreSQL database schema separate from the `public` schema that is used by the application outside of testing. This schema persists throughout the testing session and is dropped at the end of the test run. The schema is created by the `db` fixture in [conftest.py](../../../app/tests/conftest.py). The fixture also creates and returns an initialized instance of the [db.DBClient](../../../app/src/db/__init__.py) that can be used to connect to the created schema.
```

Note that [PostgreSQL schemas](https://www.postgresql.org/docs/current/ddl-schemas.html) are entirely different concepts from [Schema objects in OpenAPI specification](https://swagger.io/docs/specification/data-models/).
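The create-then-drop lifecycle behind such a `db` fixture can be sketched roughly as follows. This is a hypothetical illustration, not the template's actual code: the schema-name prefix and the shape of the `execute` callable are assumptions.

```python
# Hypothetical sketch of the schema lifecycle behind a session-scoped `db`
# fixture: create an isolated schema for the test session, drop it afterwards.
import uuid
from contextlib import contextmanager


def make_test_schema_name():
    # Unique per session so parallel test runs cannot collide.
    return f"test_schema_{uuid.uuid4().hex}"


@contextmanager
def isolated_test_schema(execute, schema_name):
    """Create `schema_name`, yield it, and drop it even if tests fail.

    `execute` is any callable that runs one SQL statement, e.g. a SQLAlchemy
    connection's `execute` composed with `text`.
    """
    execute(f'CREATE SCHEMA "{schema_name}"')
    try:
        yield schema_name
    finally:
        execute(f'DROP SCHEMA "{schema_name}" CASCADE')
```

A `pytest` fixture would wrap `isolated_test_schema` with `scope="session"` so the schema is created once and cleaned up after the whole run.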

2 changes: 1 addition & 1 deletion docs/app/getting-started.md
@@ -17,7 +17,7 @@ A very simple [docker-compose.yml](/app/docker-compose.yml) has been included to
```sh
curl -sSL https://install.python-poetry.org | python3 -
```

```diff
-3. If you are using an M1 mac, you will need to install postgres as well: `brew install postgresql` (The psycopg2-binary is built from source on M1 macs which requires the postgres executable to be present)
+3. If you are using an M1 Mac, you will need to install Postgres as well: `brew install postgresql` (The psycopg2-binary is built from source on M1 Macs which requires the Postgres executable to be present)
```

4. You'll also need [Docker Desktop](https://www.docker.com/products/docker-desktop/) installed and running.

@@ -8,7 +8,7 @@ This document describes how logging is configured in the application. The loggin

We have two separate ways of formatting the logs which are controlled by the `LOG_FORMAT` environment variable.

```diff
-`json` (default) -> Produces JSON formatted logs which are machine-readable.
+`json` (default) -> Produces JSON formatted logs, which are machine-readable.
```

```json
{
}
```

@@ -27,7 +27,7 @@ We have two separate ways of formatting the logs which are controlled by the `LO

```diff
-`human-readable` (set by default in `local.env`) -> Produces color coded logs for local development or for troubleshooting.
+`human-readable` (set by default in `local.env`) -> Produces color-coded logs for local development or troubleshooting.
```

![Human readable logs](human-readable-logs.png)
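Selecting a formatter from the `LOG_FORMAT` environment variable might look roughly like the sketch below. The real implementation lives under `app/src/logging`; the class name, the fields emitted, and the human-readable format string here are illustrative assumptions.

```python
# Hypothetical sketch: pick a log formatter based on LOG_FORMAT.
import json
import logging
import os


class JsonFormatter(logging.Formatter):
    """Machine-readable output: one JSON object per log record."""

    def format(self, record):
        return json.dumps(
            {
                "name": record.name,
                "levelname": record.levelname,
                "message": record.getMessage(),
            }
        )


def choose_formatter(log_format=None):
    # json is the default; local.env sets human-readable for development.
    log_format = log_format or os.environ.get("LOG_FORMAT", "json")
    if log_format == "json":
        return JsonFormatter()
    return logging.Formatter("%(levelname)-8s %(name)s %(message)s")
```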

Expand All @@ -37,11 +37,11 @@ The [src.logging.flask_logger](../../../app/src/logging/flask_logger.py) module

## PII Masking

```diff
-The [src.logging.pii](../../../app/src/logging/pii.py) module defines a filter that applies to all logs that automatically masks data fields that look like social security numbers.
+The [src.logging.pii](../../../app/src/logging/pii.py) module defines a filter that applies to all logs and automatically masks data fields that look like social security numbers.
```
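A filter of this kind can be sketched as follows. This is not the template's actual `pii.py`; the regex and the replacement token are assumptions for illustration.

```python
# Hypothetical sketch of a logging filter that masks SSN-like values
# (e.g. 123-45-6789) before a record reaches any handler.
import logging
import re

SSN_PATTERN = re.compile(r"\b\d{3}-?\d{2}-?\d{4}\b")


class MaskPiiFilter(logging.Filter):
    def filter(self, record):
        # Rewrite the message in place; never drop the record.
        record.msg = SSN_PATTERN.sub("*********", str(record.msg))
        return True
```

Attaching the filter to the root handler makes it apply to all logs, which matches the behavior described above.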

## Audit Logging

```diff
-* The [src.logging.audit](../../../app/src/logging/audit.py) module defines a low level audit hook that logs events that may be of interest from a security point of view, such as dynamic code execution and network requests.
+* The [src.logging.audit](../../../app/src/logging/audit.py) module defines a low-level audit hook that logs events that may be of interest from a security point of view, such as dynamic code execution and network requests.
```
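Such a hook builds on Python's `sys.addaudithook`. The sketch below is an assumption about the shape of the template's hook, not its actual code; the event names come from CPython's built-in audit events, but which ones the template logs is a guess.

```python
# Hypothetical sketch of a low-level security audit hook.
import sys

EVENTS_OF_INTEREST = {"exec", "compile", "socket.connect", "subprocess.Popen"}


def init_audit_hook(log=print):
    def audit_hook(event, args):
        # Audit hooks run on every auditable operation, so they must be
        # fast and must never raise.
        if event in EVENTS_OF_INTEREST:
            log(f"AUDIT {event}")

    sys.addaudithook(audit_hook)
```

Note that audit hooks cannot be removed once installed, so the real module installs its hook once at application startup.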

## Additional Reading

12 changes: 6 additions & 6 deletions docs/app/monitoring-and-observability/logging-conventions.md
@@ -8,7 +8,7 @@ Logging is a valuable tool for engineering teams to support products in producti

### Make code observability a primary tool for debugging and reasoning about production code

```diff
-When a user runs into an issue in production, logs offer one of the primary ways of understanding what happened. This is especially important for situations where we can’t or don’t know how to reproduce the issue. In general it is not feasible to attach a debugger to production systems, or to set breakpoints and inspect the state of the application in production, so logs offer a way to debug through “print statements”.
+When a user runs into an issue in production, logs offer one of the primary ways of understanding what happened. This is especially important for situations where we can’t or don’t know how to reproduce the issue. In general, it is not feasible to attach a debugger to production systems, or to set breakpoints and inspect the state of the application in production, so logs offer a way to debug through “print statements”.
```

### Make it easy for on-call engineers to search for logs in the codebase

Expand All @@ -30,21 +30,21 @@ Log querying systems are often limited in their querying abilities. Most log dat

### Log event type

```diff
-- **INFO** – Use INFO events to log something informational. This can be information that's useful for investigations, debugging, or tracking metrics. Note that events such as a user or client error (such as validation errors or 4XX bad request errors) should use INFO, since those are expected to occur as part of normal operation and do not necessarily indicate anything wrong with the system. Do not use ERROR or WARNING for user or client errors to avoid cluttering error logs.
-- **ERROR** – Use ERROR events if the the system is failed to complete some business operation. This can happen if there is an unexpected exception or failed assertion. Error logs can be used to trigger an alert to on-call engineers to look into a potential issue.
-- **WARNING** – Use WARNING to indicate that there *may* be something wrong with the system but that we have not yet detected any immediate impact on the system's ability to successfully complete the business operation. For example, you can warn on failed soft assumptions and soft constraints. Warning logs can be used to trigger notifications that engineers need to look into during business hours.
+- **INFO** – Use `INFO` events to log something informational. This can be information that's useful for investigations, debugging, or tracking metrics. Note that events such as a user or client error (such as validation errors or 4XX bad request errors) should use `INFO`, since those are expected to occur as part of normal operation and do not necessarily indicate anything wrong with the system. Do not use `ERROR` or `WARNING` for user or client errors to avoid cluttering error logs.
+- **ERROR** – Use `ERROR` events if the system fails to complete some business operation. This can happen if there is an unexpected exception or failed assertion. Error logs can be used to trigger an alert to on-call engineers to look into a potential issue.
+- **WARNING** – Use `WARNING` to indicate that there *may* be something wrong with the system but that we have not yet detected any immediate impact on the system's ability to successfully complete the business operation. For example, you can warn on failed soft assumptions and soft constraints. Warning logs can be used to trigger notifications that engineers need to look into during business hours.
```
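These level conventions can be illustrated with a small (hypothetical) handler: a client validation error logs at `INFO`, while an unexpected failure of the business operation logs at `ERROR`. The function and `save` helper are stand-ins, not code from the template.

```python
# Illustrative sketch of the level conventions above.
import logging

logger = logging.getLogger(__name__)


def save(data):
    """Stand-in for the real persistence layer (assumption)."""


def update_application(data):
    if "application_id" not in data:
        # A 4XX-style client error: expected in normal operation, so INFO.
        logger.info("application update rejected, missing application_id")
        return False
    try:
        save(data)
    except Exception:
        # The system failed to complete the business operation: ERROR.
        logger.exception("application update failed")
        return False
    return True
```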

### Log messages

```diff
-- **Standardized log messages** – Consistently formatted and worded log messages easier to read when viewing many logs at a time, which reduces the chance for human error when interpreting logs. It also makes it easier to write queries by enabling engineers to guess queries and allow New Relic autocomplete to show available log message options to filter by.
+- **Standardized log messages** – Consistently formatted and worded log messages are easier to read when viewing many logs at a time, which reduces the chance of human error when interpreting logs. It also makes it easier to write queries by enabling engineers to guess queries and allowing New Relic autocomplete to show available log message options to filter by.
```
- **Statically defined log messages** – Avoid putting dynamic data in log messages. Static messages are easier to search for in the codebase. Static messages are also easier to query for those specific log events without needing to resort to RLIKE queries with regular expressions or LIKE queries.

### Attributes

- **Log primitives not objects** – Explicitly list which attributes you are logging to avoid unintentionally logging PII. This also makes it easier for engineers to know what attributes are available for querying, or for engineers to search for parts of the codebase that logs these attributes.
- **Structured metadata in custom attributes** – Put metadata in custom attributes (not in the log message) so that it can be used in queries more easily. This is especially helpful when the attributes are used in "group by" clauses to avoid needing to use more complicated queries.
- **system identifiers** – Log all relevant system identifiers (uuids, foreign keys)
```diff
-- **correlation ids** – Log ids that can be shared between frontend events, backend logs, and ideally even sent to external services
+- **correlation ids** – Log ids that can be shared between front-end events, backend logs, and ideally even sent to external services
```
- **discrete or discretized attributes** – Log all useful non-PII discrete attributes (enums, flags) and discretized versions of continuous attributes (e.g. comment → has_comment, household → is_married, has_dependents)
- **Denormalized data** – Include relevant metadata from related entities. Including denormalized (i.e. redundant) data makes queries easier and faster, and removes the need to join or self-join between datasets, which is not always feasible.
- **Fully-qualified globally consistent attribute names** – Using consistent attribute names everywhere. Use fully qualified attribute names (e.g. application.application_id instead of application_id) to avoid naming conflicts.
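Several of these conventions can be shown in one short (hypothetical) example: a statically worded message, with dynamic data carried as fully-qualified custom attributes via `extra`. The attribute names below are illustrative, not from the template.

```python
# Illustrative sketch: static message, fully-qualified custom attributes.
import logging

logger = logging.getLogger(__name__)


def log_application_submitted(application_id, has_dependents):
    # The message text never changes, so it is easy to grep for in the
    # codebase and to query for in the log platform.
    logger.info(
        "application submitted",
        extra={
            "application.application_id": application_id,
            "application.has_dependents": has_dependents,
        },
    )
```

Keys passed via `extra` become attributes on the log record, so a JSON formatter can emit them as queryable fields rather than burying them in the message.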
@@ -18,7 +18,7 @@ Which format and structure should these records follow?
Chosen option: "MADR 2.1.2", because

* Implicit assumptions should be made explicit.
```diff
-Design documentation is important to enable people understanding the decisions later on.
+Design documentation is important to enable people to understand the decisions later on.
```
See also [A rational design process: How and why to fake it](https://doi.org/10.1109/TSE.1986.6312940).
* The MADR format is lean and fits our development style.
* The MADR structure is comprehensible and facilitates usage & maintenance.