[#13227] Improve testing docs #13242

Open · wants to merge 6 commits into base: master
2 changes: 1 addition & 1 deletion docs/_markbind/layouts/default.md
@@ -29,10 +29,10 @@
* [Emails]({{ baseUrl }}/emails.html)
* [Unit Testing]({{ baseUrl }}/unit-testing.html)
* [End-to-End Testing]({{ baseUrl }}/e2e-testing.html)
* [Snapshot Testing]({{ baseUrl }}/snapshot-testing.html)
* [Performance Testing]({{ baseUrl }}/performance-testing.html)
* [Accessibility Testing]({{ baseUrl }}/axe-testing.html)
* [Search]({{ baseUrl }}/search.html)
* [Snapshot Testing]({{ baseUrl }}/snapshot-testing.html)
* [Static Analysis]({{ baseUrl }}/static-analysis.html)
* [Troubleshooting Guide]({{ baseUrl }}/troubleshooting-guide.html)
* [Glossary]({{ baseUrl }}/glossary.html)
53 changes: 5 additions & 48 deletions docs/development.md
@@ -288,56 +288,13 @@ If you are using the Cloud SDK method, you can use `Ctrl + C` in the console to

There are two big categories of testing in TEAMMATES:

- **Component tests**: white-box unit and integration tests, i.e. they test the application components with full knowledge of the components' internal workings. This is configured in `src/test/resources/testng-component.xml` (back-end) and `src/web/jest.config.js` (front-end).
- **E2E (end-to-end) tests**: black-box tests, i.e. they test the application as a whole without knowing any internal working. This is configured in `src/e2e/resources/testng-e2e.xml`. To learn more about E2E tests, refer to this [document](e2e-testing.md).
- **Component tests**: White-box unit and integration tests, i.e. they test the application components with full knowledge of the components' internal workings. To learn more about running and writing component tests, refer to [this guide](unit-testing.md).
- **<tooltip content="End-to-end">E2E</tooltip> tests**: Black-box tests, i.e. they test the application as a whole without knowing any internal working. To learn more about running and writing E2E tests, refer to [this guide](e2e-testing.md).

<div id="running-tests">
Other tests in TEAMMATES include:

#### Running tests

##### Frontend tests

To run all front-end component tests in watch mode (i.e. any change to source code will automatically reload the tests), run the following command:

```sh
npm run test
```

To update snapshots, run the following command:
```sh
npm run test
```

Followed by `a` to run all the test cases. Check through the snapshots to make sure that the changes are as expected, and press `u` to update them.

To run all front-end component tests once and generate coverage data afterwards, run the following command:

```sh
npm run coverage
```

To run an individual test in a test file, change `it` in the `*.spec.ts` file to `fit`.

To run all tests in a test file (or all test files matching a pattern), you can use Jest's watch mode and filter by filename pattern.

##### Backend tests

Back-end component tests follow this configuration:

Test suite | Command | Results can be viewed in
---|---|---
`Component tests` | `./gradlew componentTests --continue` | `{project folder}/build/reports/tests/componentTests/index.html`
Any individual component test | `./gradlew componentTests --tests TestClassName` | `{project folder}/build/reports/tests/componentTests/index.html`

You can generate the coverage data with `jacocoReport` task after running tests, e.g.:

```sh
./gradlew componentTests jacocoReport
```

The report can be found in the `build/reports/jacoco/jacocoReport/` directory.

</div>
- **Performance tests**: Evaluate the application's stability and responsiveness under expected, peak, and prolonged loads. See [Performance Testing](performance-testing.md).
- **Accessibility tests**: Ensure the application is accessible to users with disabilities. See [Accessibility Testing](axe-testing.md).
Contributor: Would it be good to also mention that Docker has to be running to do testing?


## Deploying to a staging server

2 changes: 1 addition & 1 deletion docs/e2e-testing.md
@@ -10,7 +10,7 @@
- It aims to ensure all integrated components of the application work together as expected when it is being used by the end user.
- This is done by simulating user scenarios on the fully built product.

E2E tests in TEAMMATES can be found in the package `teammates.e2e`.
E2E tests in TEAMMATES are located in the package `teammates.e2e` and configured in `src/e2e/resources/testng-e2e.xml`.

## Running E2E tests

20 changes: 11 additions & 9 deletions docs/index.md
@@ -25,22 +25,24 @@ Here are some documents important for TEAMMATES developers.
### Supplementary documents

* **Coding standards** for:
[Java](https://oss-generic.github.io/process/codingStandards/CodingStandard-Java.html),
[CSS](https://oss-generic.github.io/process/codingStandards/CodingStandard-Css.html),
[HTML](https://oss-generic.github.io/process/codingStandards/CodingStandard-Html.html)

* [Java](https://oss-generic.github.io/process/codingStandards/CodingStandard-Java.html),
* [CSS](https://oss-generic.github.io/process/codingStandards/CodingStandard-Css.html),
* [HTML](https://oss-generic.github.io/process/codingStandards/CodingStandard-Html.html)
* **Best practices** for:
* [Coding](best-practices/coding.md)
* [Testing](best-practices/testing.md)
* [Data migration](best-practices/data-migration.md)
* [UI design](best-practices/ui-design.md)
* [Accessibility](best-practices/accessibility.md)
* [Mobile-friendliness](best-practices/mobile-friendliness.md)

* **How-to guides** for:
* [Static analysis](static-analysis.md): Performing code quality checks
* [Setting up third-party email providers](emails.md)
* [Setting up CAPTCHA](captcha.md)
* [Setting up developer documentation](documentation.md)
* [Setting up third-party email providers](emails.md)
* [Unit testing](unit-testing.md)
* [End-to-End testing](e2e-testing.md)
* [Snapshot testing](snapshot-testing.md)
* [E2E testing](e2e-testing.md)
* [Accessibility testing](axe-testing.md)
* [Performance testing](performance-testing.md)
* [Accessibility testing](axe-testing.md)
* [Setting up full-text search](search.md)
* [Static analysis](static-analysis.md): Performing code quality checks
20 changes: 10 additions & 10 deletions docs/performance-testing.md
@@ -8,16 +8,6 @@ TEAMMATES makes use of [JMeter](https://jmeter.apache.org/) for load and performance testing.

The performance test cases are located in the [`teammates.lnp.cases`](https://github.com/TEAMMATES/teammates/tree/master/src/lnp/java/teammates/lnp/cases) package.

## Creating Performance Tests

Each new test case must inherit the `BaseLNPTestCase` class, and implement the methods required for generating the test data and the JMeter L&P test plan. The L&P test plans are created in Java using the JMeter API.

The inherited test cases can run JMeter test by calling `runJmeter` method. When passing the parameter `shouldCreateJmxFile` as `true`, an equivalent `.jmx` file can be generated from this test plan.

To help with debugging, you can open this `.jmx` file in the JMeter GUI and add Listeners.

To see a sample implementation of a test case, you can refer to `FeedbackSessionSubmitLNPTest`. It is a _simple_ test case which load tests a PUT endpoint (`/webapi/responses`).

## Running Performance Tests

If you want to use your own copy of [JMeter](https://jmeter.apache.org/download_jmeter.cgi), update the `test.jmeter.*` properties in `src/lnp/resources/test.properties` accordingly.
@@ -68,3 +58,13 @@ However, you should not use the GUI to run large scale tests as it is very resource-intensive.

Remember to **disable or remove all `Listeners`** in the `.jmx` file, unless you are debugging. Having them enabled can have a negative impact on the test performance.
</box>

## Creating Performance Tests

Each new test case must inherit the `BaseLNPTestCase` class, and implement the methods required for generating the test data and the JMeter L&P test plan. The L&P test plans are created in Java using the JMeter API.

The inherited test cases can run a JMeter test by calling the `runJmeter` method. If the parameter `shouldCreateJmxFile` is passed as `true`, an equivalent `.jmx` file is generated from the test plan.

To help with debugging, you can open this `.jmx` file in the JMeter GUI and add Listeners.

To see a sample implementation of a test case, you can refer to `FeedbackSessionSubmitLNPTest`. It is a _simple_ test case which load tests a PUT endpoint (`/webapi/responses`).
49 changes: 20 additions & 29 deletions docs/snapshot-testing.md
@@ -8,15 +8,15 @@

Snapshot testing is an extension of usual expected-vs-actual assertion where the following conditions apply:

1. The expected object ("snapshot" afterwards) is too large to be contained in a few lines of code (e.g. a web page HTML content)
1. The snapshot should be tolerant to some degree of data variation (e.g. date and time of test execution)
1. (Optional) There is an automated way to update the snapshot based on the state of the actual object at any point of time
1. The expected object (or "snapshot") is too large to be contained in a few lines of code (e.g. a web page HTML content)
2. The snapshot should be tolerant to some degree of data variation (e.g. date and time of test execution)
3. (Optional) There is an automated way to update the snapshot based on the state of the actual object at any point of time

In the case of TEAMMATES, snapshot testing is useful for the following comparisons:

1. Expected web page content/DOM structure vs actual rendered web page
1. Expected email content vs actual email generated by the system
1. Expected CSV content vs actual CSV generated by the system
1. Expected web page content/DOM structure vs. actual rendered web page
2. Expected email content vs. actual email generated by the system
3. Expected CSV content vs. actual CSV generated by the system

## How does Snapshot Testing work?

@@ -27,20 +27,20 @@ In verification mode, snapshot testing is essentially the same as a normal expec
Auto-update mode is the flagship feature of snapshot testing. Essentially, this _reverses_ the process of testing, i.e. using the _actual_ generated object to overwrite the snapshot provided in the test.
To remove redundancy, even if auto-update mode is enabled, this overwriting procedure only happens when a test fails during the test run.

For such testing method to be effective, before the changes are committed, a *manual* (by the developer) verification to ensure only the intended changes have occurred is mandatory.
For such a testing method to be effective, before the changes are committed, a *manual* verification by the developer is mandatory to ensure that only the intended changes have occurred.

## How do we use Snapshot Testing?

For the web page comparison and CSV content generation, the tests are done in the front-end using Jest. [Jest has native support for snapshot testing (in fact, that is where the name is obtained from!)](https://jestjs.io/docs/en/snapshot-testing). Auto-update mode is activated by pressing `u` when running Jest under watch mode.
For the web page comparison and CSV content generation, the tests are done in the frontend using Jest, which has [native support](https://jestjs.io/docs/en/snapshot-testing) for snapshot testing (in fact, that is where the name comes from!). Auto-update mode is activated by pressing `u` when running Jest in watch mode.

For email generation, the tests are done in the back-end. Auto-update mode is activated by setting the value of `test.snapshot.update` to `true` in `test.properties`.
For email generation, the tests are done in the backend. Auto-update mode is activated by setting the value of `test.snapshot.update` to `true` in `test.properties`.

## When do we use Snapshot Testing?

Snapshot testing is typically used in the following two situations:

1. To create a new source file for a (new) HTML comparison test, email content test, or CSV file content test.
1. To update existing source files to reflect intended changes to the UI of the web pages, the email content, or the CSV file content.
2. To update existing source files to reflect intended changes to the UI of the web pages, the email content, or the CSV file content.

The following example describes the behaviour of snapshot testing and how it can be used in practice. Let us consider the case where the following line of test code is executed:

@@ -52,9 +52,9 @@ Here are three possible situations and the corresponding behaviours of snapshot

1. If the snapshot exists and has the correct content, no updates to the source file will be observed.

1. If the snapshot exists but has the wrong content, it will be updated with the correct content. Subsequently, the test case will pass subsequent test runs with/without auto-update mode enabled.
2. If the snapshot exists but has the wrong content, it will be updated with the correct content. The test case will then pass subsequent test runs, with or without auto-update mode enabled.

1. If the snapshot does not exist, it will be created with the given name (in Jest it is auto-generated) AND with the correct content. Subsequently, the test case will pass subsequent test runs with/without auto-update mode enabled.
3. If the snapshot does not exist, it will be created with the given name (in Jest, the name is auto-generated) AND with the correct content. The test case will then pass subsequent test runs, with or without auto-update mode enabled.

The same idea applies to email content test:

@@ -65,32 +65,25 @@ EmailChecker.verifyEmailContent(email, recipient, subject, "/studentCourseJoinEm

## When do we NOT use Snapshot Testing?

Snapshot testing is useful for regression testing, e.g. if a change to front-end logic caused the rendered HTML to change, snapshot testing will reflect those changes and it can be checked whether those changes are intended or not.
Snapshot testing is useful for regression testing, e.g. if a change to frontend logic causes the rendered HTML to change, snapshot testing will reflect those changes so that you can check whether they are intended.

However, snapshot testing should NOT be used to check how a rendered HTML looks like before/during/after interactions, e.g.:

1. Take one snapshot at the initial phase
1. Perform a 'click' on a selected component of the page
1. Take another snapshot afterwards
2. Perform a 'click' on a selected component of the page
3. Take another snapshot afterwards

Such (mis-)usage of snapshot testing will erode the value of the test, as the difference between the two snapshots cannot be traced.

In such a case, normal assertions should be used, e.g. by checking the component's internal data before and after the 'click'.

## Best Practices

1. Snapshot testing complements but does NOT replace unit tests.
1. Remember to disable the auto-update mode once the necessary changes have been made, before committing the changes.
1. Confirm that all the changes are EXPECTED.
1. After all the necessary changes have been made, run the test suites once without auto-update mode enabled to ensure that the tests pass.

## Final Notes

Do NOT create or modify the snapshot objects manually. Use auto-update mode even for seemingly trivial changes. The generated snapshot may not reflect the actual object identically. Some modifications have been made to achieve cross-compatibility (e.g. white space standardization).

Running any snapshot test with auto-update mode enabled can lead to false positive results since comparison failures, if any, are suppressed. This further underscores the need to run the test suite WITHOUT auto-update mode enabled to truly test the system.

In general, only the lines that are modified should be changed. However, since there are some forms of standardization such as white spacing, sometimes multiple (seemingly unrelated) lines may be affected due to changes in the indentation, and as such it is not a cause for concern. An example of this is shown below:
* Snapshot testing complements unit tests, but does not replace them.
* **Modify snapshot objects with auto-update mode**: Avoid creating or modifying snapshot objects manually; use auto-update mode even for seemingly trivial changes. The generated snapshot may not reflect the actual object identically, as some modifications are made for cross-compatibility (e.g. white space standardization).
* Once the necessary changes have been made, confirm that all the changes are EXPECTED and disable auto-update mode before committing the changes.
* **Run tests without auto-update mode**: Running any snapshot test with auto-update mode enabled can lead to false positive results, since comparison failures, if any, are suppressed. After all the necessary changes have been made, run the test suite WITHOUT auto-update mode enabled to truly test the system.
* In general, only the lines that are modified should change. However, since there are some forms of standardization, such as white spacing, multiple (seemingly unrelated) lines may sometimes be affected by changes in indentation; this is not a cause for concern. An example is shown below:

```html
<div>
@@ -111,5 +104,3 @@ being changed to
</div>
</div>
```

Happy Testing!
56 changes: 50 additions & 6 deletions docs/unit-testing.md
@@ -11,10 +11,56 @@ Unit testing is a testing methodology where the objective is to test components
- It aims to ensure all components of the application work as expected, assuming its dependencies are working.
- This is done in TEAMMATES by using mocks to simulate a component's dependencies.

Frontend Unit tests in TEAMMATES are located in `.spec.ts` files, while Backend Unit tests in TEAMMATES can be found in the package `teammates.test`.
Frontend unit tests in TEAMMATES are located in `*.spec.ts` files and configured in `src/web/jest.config.js`.

Backend unit tests in TEAMMATES are located in the package `teammates.test` and configured in `src/test/resources/testng-component.xml`.

## Writing Unit Tests
Contributor (on lines 11 to -17): I think it looks fine.

Just a note: I typically run tests using IntelliJ's built-in features rather than executing them manually in the terminal. In the test file, I just click the green test button and a task window pops up. In the task window, I then select to run unitTests, since we've only been working with unit tests. A screenshot of the IntelliJ pop-up window is attached below:

[Screenshot: IntelliJ run-configuration pop-up, 2025-02-27]

Would it be helpful to add comments for developers who primarily use IDE features? Maybe we could include guidance for popular IDEs like VS Code and IntelliJ.

## Running Unit Tests

### Frontend tests

To run all frontend component tests in watch mode (i.e. any change to source code will automatically reload the tests), run the following command:

```sh
npm run test
```

Most frontend component tests use [Snapshot Testing](snapshot-testing.md). To update snapshots, run the following command:
Contributor: I think it's useful to show that we need to run `npm run test` for snapshot testing. However, I think it would also be great if you could add a hyperlink in snapshot-testing.md, in the "How do we use Snapshot Testing?" section, to this part, or perhaps add instructions there on which commands to use. It's a little confusing to newer developers because the given Jest docs link shows a different command (`jest --updateSnapshot`).

On a side note, it would help me greatly as well since I haven't found the right way to do so with my button PR 🤣


```sh
npm run test
```

Then press `a` to run all the test cases. Check through the snapshots to make sure that the changes are as expected, and press `u` to update them.

To run all frontend component tests once and generate coverage data afterwards, run the following command:

```sh
npm run coverage
```

To run an individual test in a test file, change `it` in the `*.spec.ts` file to `fit`.

To run all tests in a test file (or all test files matching a pattern), you can use Jest's watch mode and filter by filename pattern.

### Backend tests

Backend component tests follow this configuration:

Test suite | Command | Results can be viewed in
---|---|---
`Component tests` | `./gradlew componentTests --continue` | `{project folder}/build/reports/tests/componentTests/index.html`
Any individual component test | `./gradlew componentTests --tests TestClassName` | `{project folder}/build/reports/tests/componentTests/index.html`

Contributor: Would it be useful to include a rough approximation of how long the tests will take to run (not just for this, but for the frontend as well)? Some of the tests take ages to run, and that might catch new developers off guard.

You can generate coverage data with the `jacocoReport` task after running tests, e.g.:

```sh
./gradlew componentTests jacocoReport
```

The report can be found in the `build/reports/jacoco/jacocoReport/` directory.

## Creating Unit Tests

### General guidelines

@@ -121,7 +167,7 @@ it('getStudentCourseJoinStatus: should return true if student has joined the cou

By injecting the values in the test right before they are used, developers are able to more easily trace the code and understand the test.

### Frontend
### Frontend tests

#### Naming
Unit tests for a function should follow the format:
@@ -179,7 +225,7 @@ it('triggerDeleteCommentEvent: should emit the correct index to deleteCommentEve
});
```

### Backend
### Backend tests

#### Naming
Unit test names should follow the format: `test<functionName>_<scenario>_<outcome>`
@@ -202,5 +248,3 @@ account.setEmail("[email protected]");
Student student = getTypicalStudent();
student.setName("New Student Name");
```
Contributor: I think it might be useful to add some helpful tips for when the tests cannot run. For instance, there was a time when I couldn't run my tests, and running `./gradlew clean build` rectified the issue.


<include src="development.md#running-tests" />