Merged
Changes from 2 commits
112 changes: 102 additions & 10 deletions TESTING.md
Original file line number Diff line number Diff line change
Expand Up @@ -147,7 +147,7 @@ runtime:
// src/drivers/my_driver/test/my_driver_integration_test.c
#include "my_driver_test.h"

void my_driver_test_int_main(void)
void main(void)
{
LOG_DEBUG("Starting my_driver integration tests\n");
my_driver_init();
Expand Down Expand Up @@ -215,15 +215,109 @@ hardware_integration_test_suite(

That's it! The next bringup or pico build will automatically include your test.

### Using `hardware_test_assert`
### Using `test_harness`

Integration tests run on hardware where a `fatal_error()` would halt the MCU
and prevent remaining tests from executing. The
`//src/tasks/hardware_test:hardware_test_assert` library provides a non-fatal
`ASSERT` override that logs failures via `LOG_ERROR` instead of aborting, so the
full test suite always runs to completion.
A test harness for running and reporting on groups of test cases is provided
in `src/test_infrastructure/` and is automatically linked by the
`samwise_test` and `samwise_integration_test` rules.

Note that `hardware_test_assert` is automatically added as a dependency by bazel, so no need to ask for it in deps.
#### 1. Write test functions that return `int`

Each test function must return `0` on success or non-zero on failure. Use the
`TEST_ASSERT` macro for assertions — it logs the failure message and returns
`-1` automatically:

```c
// src/drivers/my_driver/test/my_driver_test.c
#include "my_driver_test.h"

int test_my_driver_read(void)
{
    uint8_t buf[16] = {0};
    my_driver_read(buf, sizeof(buf));
    TEST_ASSERT(buf[0] == 0xAB, "First byte should be 0xAB");
    return 0;
}

int test_my_driver_write(void)
{
    uint8_t data[] = {0x01, 0x02, 0x03, 0x04};
    int rc = my_driver_write(data, sizeof(data));
    TEST_ASSERT(rc == 0, "Write should succeed");
    return 0;
}
```

#### 2. Define a test case table

Build an array of `test_harness_case_t` entries. Each entry has a unique
numeric ID, a function pointer, and a human-readable name:

```c
// src/drivers/my_driver/test/my_driver_test.c (continued)
const test_harness_case_t my_driver_tests[] = {
    {0, test_my_driver_read, "Read"},
    {1, test_my_driver_write, "Write"},
};
const size_t my_driver_tests_len =
    sizeof(my_driver_tests) / sizeof(my_driver_tests[0]);
```

Export the table and its length in the shared header so both unit and
integration entry points can reference it:

```c
// src/drivers/my_driver/test/my_driver_test.h
#include "test_harness.h"

int test_my_driver_read(void);
int test_my_driver_write(void);

extern const test_harness_case_t my_driver_tests[];
extern const size_t my_driver_tests_len;
```

#### 3. Run the suite

Call `test_harness_run` from your entry point (`main()` for both unit and
integration tests). It runs every case, logs pass/fail results, and returns
`0` if all tests passed or `-1` if any failed:

```c
// Unit test entry point
int main(void)
{
    return test_harness_run("My Driver", my_driver_tests,
                            my_driver_tests_len, my_init_func);
}
```

Note that `my_init_func` is run only for test cases that set `requires_custom_init = true`; for all other cases, the slate passed into the test is initialized directly with `clear_and_init_slate`. If `my_init_func` is `NULL`, `clear_and_init_slate` is also run for cases with `requires_custom_init = true`, so in that configuration the flag has no effect (both `false` by default and `true` with a `NULL` init function end up calling `clear_and_init_slate`).

The function `my_init_func` MUST call `clear_and_init_slate` (or equivalent) to initialize the slate itself, as the harness does not do this when a custom init function is supplied.

#### 4. Run a subset of tests (optional)

To run only specific tests by ID, use `test_harness_include_run`:

```c
uint16_t ids[] = {0, 2};
test_harness_include_run("My Driver", my_driver_tests, my_driver_tests_len,
                         my_init_func, ids, 2);
```

To run all tests _except_ certain IDs, use `test_harness_exclude_run`:

```c
uint16_t skip[] = {1};
test_harness_exclude_run("My Driver", my_driver_tests, my_driver_tests_len,
                         my_init_func, skip, 1);
```

#### 5. Integration tests

Because the tests array is exported in the header file, the same `test_harness_run` call works in integration tests as well: the code described in steps 3 and 4 can be reused verbatim in the integration test file.

See `src/filesys/test/` for a complete working example of the test harness
in use.

---

Expand Down Expand Up @@ -321,8 +415,6 @@ one-line edit in the `tests` dict.
5. **Initialize state** — create a fresh `slate_t` for each test, using `clear_and_init_slate()`
6. **Share test logic** — write core test functions in a shared `.c`/`.h` pair
and reuse them in both `samwise_test()` and `samwise_integration_test()`
7. **Add `hardware_test_assert`** to integration test deps so failures don't
halt the MCU

## Troubleshooting

Expand Down
7 changes: 4 additions & 3 deletions bzl/defs.bzl
Expand Up @@ -159,8 +159,8 @@ def samwise_test(name, srcs, deps = [], copts = [], defines = [], **kwargs):
test_deps = _remap_deps_to_mocks(deps)

# Add standard test infrastructure
if "//src/test_infrastructure" not in test_deps:
test_deps.append("//src/test_infrastructure")
if "//src/test_infrastructure:test_infrastructure" not in test_deps:
test_deps.append("//src/test_infrastructure:test_infrastructure")
# Combine defines with TEST flag
test_defines = list(defines)
Expand Down Expand Up @@ -240,7 +240,8 @@ def samwise_integration_test(name, int_src, srcs = [], deps = [], copts = [], de

# cc_test does not support `hdrs`; pop it from kwargs before forwarding.
hdrs = kwargs.pop("hdrs", [])
deps = list(deps) # Make a mutable copy of deps
# Add standard test infrastructure
if "//src/test_infrastructure:hardware_test_infrastructure" not in deps:
deps.append("//src/test_infrastructure:hardware_test_infrastructure")
Expand Down
97 changes: 2 additions & 95 deletions src/filesys/test/filesys_integration_test.c
Expand Up @@ -2,99 +2,6 @@

void main(void)
{
LOG_DEBUG("========================================\n");
LOG_DEBUG("Starting Filesys Test Suite\n");
LOG_DEBUG("========================================\n");

int result;
int tests_passed = 0;
int tests_failed = 0;

// Define test functions
struct
{
int (*test_func)(void);
const char *name;
} tests[] = {
{filesys_test_write_readback_success, "Write and Readback"},
{filesys_test_initialize_reformat_success, "Initialize and Reformat"},
{filesys_test_start_file_write_already_writing_should_fail,
"Start File Write - Already Writing"},
{filesys_test_write_data_to_buffer_bounds_should_fail,
"Write Data to Buffer - Bounds Checking"},
{filesys_test_write_data_to_buffer_when_no_file_started_should_fail,
"Write Data to Buffer - No File"},
{filesys_test_write_buffer_to_mram_when_no_file_started_should_fail,
"Write Buffer to MRAM - No File"},
{filesys_test_write_buffer_to_mram_clean_buffer_success,
"Write Buffer to MRAM - Clean Buffer"},
{filesys_test_complete_file_write_dirty_buffer_success,
"Complete File Write - Dirty Buffer"},
{filesys_test_cancel_file_write_success, "Cancel File Write"},
{filesys_test_cancel_file_write_no_file_should_fail,
"Cancel File Write - No File"},
{filesys_test_clear_buffer_success, "Clear Buffer"},
{filesys_test_crc_correct_success, "CRC Verification - Correct"},
{filesys_test_crc_incorrect_should_fail,
"CRC Verification - Incorrect"},
{filesys_test_crc_no_file_should_fail, "CRC Check - No File"},
{filesys_test_multiple_files_success, "Multiple File Writes"},
{filesys_test_blocks_left_calculation_success,
"Blocks Left Calculation"},
{filesys_test_multi_chunk_write_success, "Multi-Chunk Write"},
{filesys_test_write_at_offset_success, "Write at Offset"},
{filesys_test_multiple_files_commit_success,
"Multiple File Writes Committed"},
{filesys_test_write_long_file_crc32_success,
"Write Really Long File with CRC32"},
{filesys_test_file_too_large_should_fail,
"File Too Large for Filesystem"},
{filesys_test_second_file_out_of_space_should_fail,
"Second File Runs Out of Space"},
{filesys_test_raw_lfs_write_large_file_success,
"Raw LFS Write Large File (510KB)"},
{filesys_test_list_files_empty_filesystem_success,
"List Files - Empty Filesystem"},
{filesys_test_list_files_multiple_files_success,
"List Files - Multiple Files"},
{filesys_test_list_files_max_files_limit_success,
"List Files - Max Files Limit"},
{filesys_test_list_files_after_cancel_success,
"List Files - After Cancel"},
{filesys_test_list_files_crc_mismatch_success,
"List Files - CRC Mismatch Detection"},
};

int num_tests = sizeof(tests) / sizeof(tests[0]);

for (int i = 0; i < num_tests; i++)
{
LOG_DEBUG("\n--- Running Test %d/%d: %s ---\n", i + 1, num_tests,
tests[i].name);
result = tests[i].test_func();
if (result == 0)
{
tests_passed++;
LOG_DEBUG("--- Test %d PASSED ---\n", i + 1);
}
else
{
tests_failed++;
LOG_ERROR("--- Test %d FAILED ---\n", i + 1);
}
}

LOG_DEBUG("\n========================================\n");
LOG_DEBUG("Test Suite Complete\n");
LOG_DEBUG("Passed: %d / %d\n", tests_passed, num_tests);
LOG_DEBUG("Failed: %d / %d\n", tests_failed, num_tests);
LOG_DEBUG("========================================\n");

if (tests_failed > 0)
{
LOG_ERROR("SOME TESTS FAILED!\n");
return;
}

LOG_DEBUG("All Filesys tests passed!\n");
    test_harness_run("Filesys", filesys_tests, filesys_tests_len,
                     filesys_test_setup);
}