
Review and add tips from notion doc #1053

Merged (8 commits) on Mar 4, 2025
1 change: 1 addition & 0 deletions content/guides/models/track/launch.md
@@ -127,6 +127,7 @@ The following are some suggested guidelines to consider when you create experiments:
2. **Project**: A project is a set of experiments you can compare together. Each project gets a dedicated dashboard page, and you can easily turn on and off different groups of runs to compare different model versions.
3. **Notes**: Set a quick commit message directly from your script. Edit and access notes in the Overview section of a run in the W&B App.
4. **Tags**: Identify baseline runs and favorite runs. You can filter runs using tags. You can edit tags at a later time on the Overview section of your project's dashboard on the W&B App.
5. **Create multiple run sets for easy comparison**: When comparing experiments, split runs into multiple run sets so their metrics are easier to compare. You can toggle run sets on or off on the same chart or group of charts.

The following code snippet demonstrates how to define a W&B Experiment using the best practices listed above:

22 changes: 22 additions & 0 deletions content/guides/models/track/runs/_index.md
@@ -403,6 +403,28 @@ Delete one or more runs from a project with the W&B App.
For projects that contain a large number of runs, you can use the search bar to filter the runs you want to delete with a regular expression, or use the filter button to filter runs by status, tags, or other properties.
{{% /alert %}}

## Organize runs

This section describes how to organize runs using groups and job types. Assigning runs to groups (for example, experiment names) and job types (for example, preprocessing, training, evaluation, or debugging) streamlines your workflow and makes model comparison easier.

### Assigning runs to groups and job types

Each run in W&B can be categorized by a **group** and a **job type**:

- **Group**: Represents a broader experiment category, making it easier to organize and filter runs.
- **Job type**: Describes the function of the run, such as preprocessing, training, or evaluation.

In the following [example workspace](https://wandb.ai/stacey/model_iterz?workspace=user-stacey), a baseline model is trained using increasing amounts of data from the Fashion-MNIST dataset. The color coding in the workspace represents the amount of data used:

- **Yellow to dark green**: Increasing amounts of data for the baseline model.
- **Light blue to violet to magenta**: Increasing amounts of data for a more complex "double" model with additional parameters.

Using W&B's filtering options and search bar, you can easily compare runs based on specific conditions, such as:
- Training on the same dataset.
- Evaluating on the same test set.

Applying filters dynamically updates the **Table** view, allowing you to quickly identify performance differences between models. For example, you can determine which classes are significantly more challenging for one model compared to another.

<!-- ### Search runs

Search for a specific run by name in the sidebar. You can use regex to filter down your visible runs. The search box affects which runs are shown on the graph. Here's an example: