Mhs/das 2216/quick fixes #19

Merged — 25 commits, merged Oct 30, 2024
Changes shown are from 23 of the 25 commits.

Commits (25)
bb7ebd8
DAS-1934: Most basic implementation to allow consistent grids
flamingbear Aug 28, 2024
d072bca
DAS-1934: Update comments and README for new grid validation.
flamingbear Aug 29, 2024
addf305
DAS-1934: Update changelog
flamingbear Aug 29, 2024
8150470
DAS-1934: Update changelog redux
flamingbear Aug 29, 2024
44589d8
DAS-1934: Update service version and address snyk vulnerabilities
flamingbear Aug 29, 2024
f9890f0
Update CHANGELOG.md
flamingbear Sep 1, 2024
ed0ff0e
DAS-1934: Fix release note extraction script.
flamingbear Sep 1, 2024
7622aa7
DAS-2216: Quick Fixes 1, 2 and 4
flamingbear Sep 9, 2024
b319a63
Merge remote-tracking branch 'origin/main' into mhs/DAS-2216/quick-fixes
flamingbear Sep 12, 2024
31c5bea
Merge branch 'main' into mhs/DAS-2216/quick-fixes
joeyschultz Sep 23, 2024
2a5a6ef
DAS-2216: Modify earthdata-varinfo config for quick fix 1
joeyschultz Oct 10, 2024
4477475
DAS-2216: Resolve unit tests that were failing due to quick fixes
joeyschultz Oct 10, 2024
404088c
DAS-2216: Update service version and CHANGELOG.
joeyschultz Oct 10, 2024
72223b9
Modify transpose_if_xdim_less_than_ydim to resolve mask array not bei…
joeyschultz Oct 16, 2024
656c858
Create notebook for PR testing and demo purposes
joeyschultz Oct 16, 2024
3601f92
Install and run pre-commit
joeyschultz Oct 16, 2024
735894d
Remove unused exception, fix CHANGELOG links, and other minor updates
joeyschultz Oct 21, 2024
8770da0
Reorganize some of the notebook functions
joeyschultz Oct 21, 2024
78dd9df
Modify assumption comments in get_variable_values for clarity
joeyschultz Oct 21, 2024
25cdc10
Add to varinfo config, re-enable MissingReprojectedDataError, modify …
joeyschultz Oct 23, 2024
b707bd2
Remove TEMPO_O3TOT_L2_example.ipynb and add it to JIRA ticket instead.
joeyschultz Oct 23, 2024
6678d5a
Simplify variable transposal, update effected typehints and unit tests
joeyschultz Oct 25, 2024
e5d4be1
Merge remote-tracking branch 'origin/main' into mhs/DAS-2216/quick-fixes
joeyschultz Oct 25, 2024
bde29b6
Add get_rows_per_scan utility and associated unit tests
joeyschultz Oct 28, 2024
cd54588
Apply coordinates MetadataOverride to geolocation group
joeyschultz Oct 28, 2024
24 changes: 17 additions & 7 deletions CHANGELOG.md
@@ -1,6 +1,14 @@
# Changelog

## [v1.2.0] - 2024-10-10

### Changed

- [[DAS-2216](https://bugs.earthdata.nasa.gov/browse/DAS-2216)]
The Swath Projector has been updated with quick fixes to add support for TEMPO level 2 data. These changes include optional transposing of arrays based on dimension sizes and updates to the configuration file for TEMPO_O3TOT_L2 to correctly locate coordinate variables and exclude science variables with dimensions that do not match those of the coordinate variables.

## [v1.1.1] - 2024-09-16

### Changed

- [[TRT-558](https://bugs.earthdata.nasa.gov/browse/TRT-558)]
@@ -12,6 +20,7 @@
also been renamed to `earthdata_varinfo_config.json`.

## [v1.1.0] - 2024-08-29

### Changed

- [[DAS-1934](https://bugs.earthdata.nasa.gov/browse/DAS-1934)]
@@ -37,14 +46,15 @@ include updated documentation and files outlined by the

Repository structure changes include:

* Migrating `pymods` directory to `swath_projector`.
* Migrating `swotrepr.py` to `swath_projector/adapter.py`.
* Addition of `swath_projector/main.py`.
- Migrating `pymods` directory to `swath_projector`.
- Migrating `swotrepr.py` to `swath_projector/adapter.py`.
- Addition of `swath_projector/main.py`.

For more information on internal releases prior to NASA open-source approval,
see legacy-CHANGELOG.md.

[v1.1.1]:(https://github.com/nasa/harmony-swath-projector/releases/tag/1.1.0)
[v1.1.0]:(https://github.com/nasa/harmony-swath-projector/releases/tag/1.0.1)
[v1.0.1]:(https://github.com/nasa/harmony-swath-projector/releases/tag/1.0.1)
[v1.0.0]:(https://github.com/nasa/harmony-swath-projector/releases/tag/1.0.0)
[v1.2.0]: (https://github.com/nasa/harmony-swath-projector/releases/tag/1.2.0)
[v1.1.1]: (https://github.com/nasa/harmony-swath-projector/releases/tag/1.1.1)
[v1.1.0]: (https://github.com/nasa/harmony-swath-projector/releases/tag/1.1.0)
[v1.0.1]: (https://github.com/nasa/harmony-swath-projector/releases/tag/1.0.1)
[v1.0.0]: (https://github.com/nasa/harmony-swath-projector/releases/tag/1.0.0)
6 changes: 3 additions & 3 deletions bin/project_local_granule.py
@@ -125,8 +125,8 @@ def project_granule(
{
'url': local_file_path,
'temporal': {
'start': '2021-01-03T23:45:00.000Z',
'end': '2020-01-04T00:00:00.000Z',
'start': '2020-01-03T23:45:00.000Z',
'end': '2025-01-04T00:00:00.000Z',
},
'bbox': [-180, -90, 180, 90],
}
@@ -141,5 +141,5 @@ def project_granule(

reprojector = SwathProjectorAdapter(message, config=config(False))

with patch('swotrepr.shutil.rmtree', side_effect=rmtree_side_effect):
with patch('swath_projector.adapter.shutil.rmtree', side_effect=rmtree_side_effect):
reprojector.invoke()
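The patch target above changed because the module moved from `swotrepr` to `swath_projector/adapter.py`: `unittest.mock.patch` must name the module in which `shutil.rmtree` is looked up at call time. A minimal sketch of that pattern, assuming the `swath_projector` package is importable and `reprojector` stands in for an already-configured adapter instance:

```python
from unittest.mock import patch


def run_without_deleting_workdir(reprojector):
    # Patch the reference the adapter module actually resolves at call time:
    # swath_projector.adapter.shutil.rmtree, not the bare shutil.rmtree.
    with patch('swath_projector.adapter.shutil.rmtree') as mock_rmtree:
        reprojector.invoke()

    # Return the recorded calls so a caller can inspect what would have
    # been deleted without anything actually being removed from disk.
    return mock_rmtree.call_args_list
```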
2 changes: 1 addition & 1 deletion docker/service_version.txt
@@ -1 +1 @@
1.1.1
1.2.0
43 changes: 40 additions & 3 deletions swath_projector/earthdata_varinfo_config.json
@@ -1,12 +1,35 @@
{
"Identification": "Swath Projector VarInfo configuration",
"Version": 3,
"Version": 4,
"CollectionShortNamePath": [
"ShortName"
"ShortName",
"collection_shortname"
],
"Mission": {
"VNP10": "VIIRS"
"VNP10": "VIIRS",
"TEMPO_O3TOT_L2": "TEMPO"
},
"ExcludedScienceVariables": [
{
"Applicability": {
"Mission": "TEMPO",
"ShortNamePath": "TEMPO_O3TOT_L2"
},
"VariablePattern": [
"/support_data/a_priori_layer_o3",
"/support_data/cal_adjustment",
"/support_data/dNdR",
"/support_data/layer_efficiency",
"/support_data/lut_wavelength",
"/support_data/N_value",
"/support_data/N_value_residual",
"/support_data/ozone_sensitivity_ratio",
"/support_data/step_1_N_value_residual",
"/support_data/step_2_N_value_residual",
"/support_data/temp_sensitivity_ratio"
]
}
],
"MetadataOverrides": [
{
"Applicability": {
@@ -21,6 +44,20 @@
}
],
"_Description": "VNP10 SnowData variables have incorrect relative paths for coordinates."
},
{
"Applicability": {
"Mission": "TEMPO",
"ShortNamePath": "TEMPO_O3TOT_L2",
"VariablePattern": "^/product/.*|^/support_data/.*"
},
"Attributes": [
{
"Name": "coordinates",
"Value": "/geolocation/latitude, /geolocation/longitude"
}
],
"_Description": "TEMPO_O3TOT_L2 variables only contain basenames for coordinates, which are found in sibling hierarchical groups. This rule fully qualifies the paths to these coordinates. Some variables in these groups are excluded via 'ExcludedScienceVariables'"
}
]
}
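A small illustration of how the `VariablePattern` in the new MetadataOverrides rule selects which variables receive the fully qualified `coordinates` attribute. The regular expression is copied from the configuration above; `/product/column_amount_o3` is a hypothetical variable name used only for this sketch.

```python
import re

# Pattern copied from the MetadataOverrides rule above.
pattern = re.compile(r'^/product/.*|^/support_data/.*')

candidate_paths = [
    '/product/column_amount_o3',   # hypothetical science variable name
    '/support_data/N_value',       # listed in ExcludedScienceVariables above
    '/geolocation/latitude',       # the coordinate variable itself
]

for path in candidate_paths:
    print(path, bool(pattern.match(path)))

# The first two paths match and would receive the override (N_value is
# still dropped separately as an excluded science variable); the
# coordinate variable itself does not match and is left unchanged.
```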
1 change: 1 addition & 0 deletions swath_projector/interpolation.py
@@ -268,6 +268,7 @@ def get_ewa_results(
ewa_information['target_area'],
variable['values'],
maximum_weight_mode=maximum_weight_mode,
rows_per_scan=2, # Added in QuickFix DAS-2216 to be fixed in DAS-2220
)

if variable['fill_value'] is not None:
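The hard-coded `rows_per_scan=2` above is flagged for follow-up in DAS-2220, and the final two commits (not included in this 23-commit view) mention a `get_rows_per_scan` utility. A minimal sketch of one plausible implementation, assumed rather than taken from the repository, that returns a scan length dividing the row count evenly:

```python
def get_rows_per_scan(total_rows: int) -> int:
    """Return the smallest factor of total_rows that is at least 2.

    EWA resampling expects rows_per_scan to divide the number of swath
    rows evenly; if total_rows is prime (or fewer than 2), fall back to
    total_rows itself. This is an assumed sketch, not the repository's
    actual implementation.
    """
    if total_rows < 2:
        return 1
    for candidate in range(2, int(total_rows**0.5) + 1):
        if total_rows % candidate == 0:
            return candidate
    return total_rows
```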
16 changes: 7 additions & 9 deletions swath_projector/swath_geometry.py
@@ -13,7 +13,7 @@


def get_projected_resolution(
projection: Proj, longitudes: Variable, latitudes: Variable
projection: Proj, longitudes: np.ma.MaskedArray, latitudes: np.ma.MaskedArray
) -> Tuple[float]:
"""Find the resolution of the target grid in the projected coordinates, x
and y. First the perimeter points are found. These are then projected
@@ -40,7 +40,7 @@ def get_extents_from_perimeter(


def get_extents_from_perimeter(
projection: Proj, longitudes: Variable, latitudes: Variable
projection: Proj, longitudes: np.ma.MaskedArray, latitudes: np.ma.MaskedArray
) -> Tuple[float]:
"""Find the swath extents in the target CRS. First the perimeter points of
unfilled valid pixels are found. These are then projected to the target
@@ -59,19 +59,17 @@
def get_projected_coordinates(
coordinates_mask: np.ma.core.MaskedArray,
projection: Proj,
longitudes: Variable,
latitudes: Variable,
longitudes: np.ma.MaskedArray,
latitudes: np.ma.MaskedArray,
) -> Tuple[np.ndarray]:
"""Get the required coordinate points projected in the target Coordinate
Reference System (CRS).

"""
if len(longitudes.shape) == 1:
coordinates = get_all_coordinates(longitudes[:], latitudes[:], coordinates_mask)
coordinates = get_all_coordinates(longitudes, latitudes, coordinates_mask)
else:
coordinates = get_perimeter_coordinates(
longitudes[:], latitudes[:], coordinates_mask
)
coordinates = get_perimeter_coordinates(longitudes, latitudes, coordinates_mask)

return reproject_coordinates(coordinates, projection)

@@ -135,7 +133,7 @@ def get_absolute_resolution(polygon_area: float, n_pixels: int) -> float:


def get_valid_coordinates_mask(
longitudes: Variable, latitudes: Variable
longitudes: np.ma.MaskedArray, latitudes: np.ma.MaskedArray
) -> np.ma.core.MaskedArray:
"""Get a `numpy` N-d array containing boolean values (0 or 1) indicating
whether the elements of both longitude and latitude are valid at that
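The type-hint changes above (from `Variable` to `np.ma.MaskedArray`) reflect that callers now pass already-sliced data rather than `netCDF4.Variable` objects. A short, self-contained sketch of the distinction; the dimension and variable names are placeholders chosen for this example only.

```python
import netCDF4
import numpy as np

# Build a tiny in-memory file so the sketch is self-contained.
with netCDF4.Dataset('scratch_swath.nc', 'w', diskless=True, persist=False) as dataset:
    dataset.createDimension('y', 3)
    dataset.createDimension('x', 2)
    lon = dataset.createVariable('lon', 'f8', ('y', 'x'), fill_value=-9999.0)
    lon[:] = np.arange(6).reshape(3, 2)

    lon_variable = dataset['lon']   # netCDF4.Variable: a lazy handle to the data
    lon_values = lon_variable[:]    # numpy.ma.MaskedArray read into memory

    print(type(lon_variable).__name__, type(lon_values).__name__)
    # The updated swath_geometry functions now receive lon_values
    # (sliced arrays) instead of lon_variable objects, so they no longer
    # need to index with [:] internally.
```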
48 changes: 39 additions & 9 deletions swath_projector/utilities.py
@@ -31,7 +31,8 @@ def get_variable_values(
As the variable data are returned as a `numpy.ma.MaskedArray`, this will
return no data in the filled pixels. To ensure that the data are
correctly handled, the fill value is applied to masked pixels using the
`filled` method.
`filled` method. The variable values are transposed if the `along-track`
dimension size is less than the `across-track` dimension size.

"""
# TODO: Remove in favour of apply2D or process_subdimension.
@@ -42,27 +43,38 @@
if len(variable[:].shape) == 1:
return make_array_two_dimensional(variable[:])
elif 'time' in input_file.variables and 'time' in variable.dimensions:
# Assumption: Array = (1, y, x)
return variable[0][:].filled(fill_value=fill_value)
# Assumption: Array = (time, along-track, across-track)
return transpose_if_xdim_less_than_ydim(variable[0][:]).filled(
fill_value=fill_value
)
else:
# Assumption: Array = (y, x)
return variable[:].filled(fill_value=fill_value)
# Assumption: Array = (along-track, across-track)
return transpose_if_xdim_less_than_ydim(variable[:]).filled(
fill_value=fill_value
)


def get_coordinate_variable(
dataset: Dataset, coordinates_tuple: Tuple[str], coordinate_substring
) -> Optional[Variable]:
) -> Optional[np.ma.MaskedArray]:
"""Search the coordinate dataset names for a match to the substring,
which will be either "lat" or "lon". Return the corresponding variable
from the dataset. Only the base variable name is used, as the group
path may contain either of the strings as part of other words.
data from the dataset. Only the base variable name is used, as the group
path may contain either of the strings as part of other words. The
coordinate variables are transposed if the `along-track` dimension size is
less than the `across-track` dimension size.

"""
for coordinate in coordinates_tuple:
if coordinate_substring in coordinate.split('/')[-1] and variable_in_dataset(
coordinate, dataset
):
return dataset[coordinate]
# QuickFix (DAS-2216) for short and wide swaths
Member: Is this comment in the right place still? They are in the code so that they can be removed when the quick fixes are actual fixes.

Collaborator (author): Lines 70-73 are all part of the quickfix so I think the comment is where we want it.

Contributor: I'm actually on the fence about this as I suspect these are essentially default behaviors and not necessarily going away with the quick fix. That said, there is at least the case of setting rows-per-scan = 2 that will need follow-up modification. We can go with it for now, but do need to follow-up.

if dataset[coordinate].ndim == 1:
return dataset[coordinate][:]

return transpose_if_xdim_less_than_ydim(dataset[coordinate][:])

raise MissingCoordinatesError(coordinates_tuple)


@@ -216,3 +228,21 @@ def make_array_two_dimensional(one_dimensional_array: np.ndarray) -> np.ndarray:

"""
return np.expand_dims(one_dimensional_array, 1)


def transpose_if_xdim_less_than_ydim(
variable_values: np.ma.MaskedArray,
) -> np.ma.MaskedArray:
"""Return transposed variable when variable is wider than tall.

QuickFix (DAS-2216): We presume that a swath has more rows than columns and
if that's not the case we transpose it so that it does.
"""
if len(variable_values.shape) != 2:
raise ValueError(
f'Input variable must be 2 dimensional, but got {len(variable_values.shape)} dimensions.'
)
if variable_values.shape[0] < variable_values.shape[1]:
return np.ma.transpose(variable_values).copy()

return variable_values
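A short usage sketch of the `transpose_if_xdim_less_than_ydim` helper defined above, showing the quick-fix behaviour on a "short and wide" masked array. It assumes the `swath_projector` package is importable; the array contents are arbitrary example data.

```python
import numpy as np

from swath_projector.utilities import transpose_if_xdim_less_than_ydim

# A "short and wide" swath: 2 along-track rows, 5 across-track columns.
wide = np.ma.masked_array(np.arange(10).reshape(2, 5), mask=False)

tall = transpose_if_xdim_less_than_ydim(wide)
print(wide.shape, '->', tall.shape)   # (2, 5) -> (5, 2)

# Arrays that are already at least as tall as they are wide come back
# unchanged, and non-2-D input raises ValueError, per the diff above.
already_tall = np.ma.masked_array(np.zeros((5, 2)))
print(transpose_if_xdim_less_than_ydim(already_tall).shape)  # (5, 2)
```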
5 changes: 5 additions & 0 deletions tests/unit/test_interpolation.py
@@ -384,6 +384,7 @@ def test_resample_ewa(
self.mock_target_area,
mock_values,
maximum_weight_mode=False,
rows_per_scan=2, # Added in QuickFix DAS-2216 to be fixed in DAS-2220
)
mock_write_output.assert_called_once_with(
self.mock_target_area,
@@ -423,6 +424,7 @@ def test_resample_ewa(
self.mock_target_area,
mock_values,
maximum_weight_mode=False,
rows_per_scan=2, # Added in QuickFix DAS-2216 to be fixed in DAS-2220
)
mock_write_output.assert_called_once_with(
self.mock_target_area,
@@ -491,6 +493,7 @@ def test_resample_ewa_nn(
self.mock_target_area,
mock_values,
maximum_weight_mode=True,
rows_per_scan=2, # Added in QuickFix DAS-2216 to be fixed in DAS-2220
)
mock_write_output.assert_called_once_with(
self.mock_target_area,
@@ -530,6 +533,7 @@ def test_resample_ewa_nn(
self.mock_target_area,
mock_values,
maximum_weight_mode=True,
rows_per_scan=2, # Added in QuickFix DAS-2216 to be fixed in DAS-2220
)
mock_write_output.assert_called_once_with(
self.mock_target_area,
@@ -581,6 +585,7 @@ def test_resample_ewa_nn(
harmony_target_area,
mock_values,
maximum_weight_mode=True,
rows_per_scan=2, # Added in QuickFix DAS-2216 to be fixed in DAS-2220
)

# The Harmony target area should be given to the output function
12 changes: 5 additions & 7 deletions tests/unit/test_swath_geometry.py
@@ -65,8 +65,8 @@ def setUpClass(cls):

def setUp(self):
self.test_dataset = Dataset(self.test_path)
self.longitudes = self.test_dataset['lon']
self.latitudes = self.test_dataset['lat']
self.longitudes = self.test_dataset['lon'][:]
self.latitudes = self.test_dataset['lat'][:]

def tearDown(self):
self.test_dataset.close()
@@ -101,8 +101,8 @@ def test_get_projected_resolution_1d(self):
"""Ensure the calculated one-dimensional resolution is correct."""
resolution = get_projected_resolution(
self.geographic_projection,
self.test_dataset['lon_1d'],
self.test_dataset['lat_1d'],
self.test_dataset['lon_1d'][:],
self.test_dataset['lat_1d'][:],
)

self.assertAlmostEqual(resolution, 5.0)
@@ -167,9 +167,7 @@ def test_get_perimeter_coordinates(self):
np.logical_not(valid_pixels), np.ones(self.longitudes.shape)
)

coordinates = get_perimeter_coordinates(
self.longitudes[:], self.latitudes[:], mask
)
coordinates = get_perimeter_coordinates(self.longitudes, self.latitudes, mask)

self.assertCountEqual(coordinates, expected_points)
