
Conversation

ma595 (Collaborator) commented Dec 5, 2025

This PR addresses the issue observed by @TomMelt, where a vertical 'seam' or fold appears at the meridian line (lon = 0). We hadn't noticed this before because we were plotting data over lons 0-360.

This is illustrated below in the raw flux outputs of the network. Tom has already verified that the inputs are smooth.
[image: raw flux output of the network showing the vertical seam at lon = 0]

This issue arises because the UNet does not enforce longitudinal periodicity. We can fix it by applying nn.CircularPad2d wherever convolution is performed, i.e. in the Conv_block, Upsample and Attention_block classes.

I also recommend replication padding at the poles using nn.ReplicationPad2d. This avoids introducing potentially steep gradients there.

self.pad_layer = nn.Sequential(
    nn.CircularPad2d((padding, padding, 0, 0)),      # (left, right, top, bottom): wrap in longitude
    nn.ReplicationPad2d((0, 0, padding, padding)),   # repeat edge rows in latitude (poles)
)
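For illustration, here is a minimal, self-contained sketch of how such a pad layer can be combined with a convolution that uses padding=0, plus a quick check that the output becomes periodic in longitude. The class name PaddedConvBlock, the channel counts and the grid shape are made up for the example and are not the repo's actual Conv_block; it also assumes a recent PyTorch, since nn.CircularPad2d is only available in newer releases (older versions can use F.pad(..., mode='circular') instead).

```python
import torch
import torch.nn as nn

class PaddedConvBlock(nn.Module):
    """Sketch only: circular padding in longitude, replication padding in latitude."""

    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        self.pad_layer = nn.Sequential(
            nn.CircularPad2d((padding, padding, 0, 0)),     # (left, right, top, bottom)
            nn.ReplicationPad2d((0, 0, padding, padding)),
        )
        # Padding is done explicitly above, so the convolution itself uses padding=0.
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride=1, padding=0)

    def forward(self, x):
        return self.conv(self.pad_layer(x))

# Quick periodicity check: shifting the input circularly in longitude should
# shift the output by the same amount, i.e. no discontinuity at lon = 0.
block = PaddedConvBlock(4, 8)
x = torch.randn(1, 4, 32, 64)                       # (batch, channels, lat, lon)
y = block(x)
y_rolled = block(torch.roll(x, shifts=16, dims=-1))
print(torch.allclose(torch.roll(y, shifts=16, dims=-1), y_rolled, atol=1e-5))  # True
```

Rolling the input in longitude and obtaining the same roll of the output is exactly the property that removes the seam at lon = 0.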

TODO:

  • Zero pad or replication pad in latitude?
  • Light testcase to ensure training will work (a rough sketch follows this list).
  • Add some docs to the code for clarity.
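As a possible starting point for that light testcase, here is a hedged sketch of a one-step training smoke test. It uses a small stand-in model built from the padded convolution pattern above rather than the repo's actual UNet, and every name, shape and hyperparameter here is an assumption for illustration only.

```python
import torch
import torch.nn as nn

def make_tiny_model(channels=4, padding=1):
    # Stand-in for the real UNet: one padded convolution, same padding scheme as this PR.
    pad = nn.Sequential(
        nn.CircularPad2d((padding, padding, 0, 0)),      # periodic in longitude
        nn.ReplicationPad2d((0, 0, padding, padding)),   # replicate at the poles
    )
    return nn.Sequential(pad, nn.Conv2d(channels, channels, kernel_size=3, padding=0))

def test_one_training_step_runs():
    torch.manual_seed(0)
    model = make_tiny_model()
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(2, 4, 16, 32)        # (batch, channels, lat, lon)
    target = torch.randn(2, 4, 16, 32)
    loss = nn.functional.mse_loss(model(x), target)
    loss.backward()
    optimiser.step()
    assert torch.isfinite(loss)

test_one_training_step_runs()
```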

Closes #56

ma595 self-assigned this Dec 5, 2025
ma595 added 3 commits December 5, 2025 16:39
  • padding=0 condition never met in Conv_block, Attention_block or Upsample
ma595 marked this pull request as draft December 6, 2025 13:44
TomMelt (Collaborator) commented Dec 8, 2025

Thanks for raising this @ma595. It would be good to see if we can make the output continuous across the 0° meridian.

TomMelt added the ICCS label Dec 8, 2025
amangupta2 (Contributor) commented

@ma595 Looks like a nice straightforward fix. Is it possible to add this during inference itself? Or do we need to retrain the model with the ReplicationPad2d layer?

ma595 (Collaborator, Author) commented Dec 9, 2025

> @ma595 Looks like a nice straightforward fix. Is it possible to add this during inference itself? Or do we need to retrain the model with the ReplicationPad2d layer?

I think we'd need to retrain the model.

Review comment on the convolution keyword arguments:

    kernel_size=kernel_size,
    stride=stride,
    padding=padding,
    padding=0,
Collaborator commented:

Just a reminder: can you remove these lines completely? The default for Conv2d is padding=0. https://docs.pytorch.org/docs/stable/generated/torch.nn.Conv2d.html
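For illustration, a sketch of what the call could look like once the explicit padding arguments are dropped (the values below are made up; in Conv_block they would come from the constructor arguments):

```python
import torch.nn as nn

# Hypothetical values; in Conv_block these come from the constructor arguments.
in_channels, out_channels, kernel_size, stride = 4, 8, 3, 1

# Spatial padding is handled by the explicit pad layer, so the convolution
# can simply rely on Conv2d's default of padding=0.
conv = nn.Conv2d(in_channels, out_channels, kernel_size=kernel_size, stride=stride)
print(conv.padding)  # (0, 0)
```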

