JOSS review: questions regarding optimization #243

@fabian-sp


This is related to the review of FunFact for JOSS (see openjournals/joss-reviews#4502)

I have a few questions regarding your optimization procedures.

  • If I understand correctly, I could use any torch.optim optimizer as the opt argument, because they possess a step method, correct? As I am not so familiar with JAX: is this also true for JAX optimizers? Maybe you could provide a short example of this in the docs (or at least I couldn't find one)? (See the interface sketch after this list.)
  • Regarding the point above, I was wondering why you reimplemented Adam and RMSProp in https://github.com/yhtang/FunFact/blob/4e5694f7c9881223fcb41fcb21e49007586aa779/funfact/optim.py. They are already included in PyTorch, so reimplementing them seems a bit counterintuitive to me. Even though these algorithms are fairly simple, the reimplementation is an unnecessary potential source of errors. Is there a reason not to use, by default, Adam from torch or one of the JAX optimizers, depending on the active backend?
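For context, here is a minimal sketch of the interface difference my first question hinges on. It assumes optax as the JAX-side optimizer library (FunFact may use something else) and does not touch FunFact's own API: torch.optim optimizers are stateful objects updated via a step method, while JAX-ecosystem optimizers are typically functional, threading their state through explicitly, so they cannot simply be dropped in wherever a step method is expected.

```python
# torch.optim optimizers: stateful objects with a `step` method.
import torch

params = [torch.randn(3, requires_grad=True)]
opt = torch.optim.Adam(params, lr=1e-2)  # any torch.optim optimizer exposes the same interface

loss = (params[0] ** 2).sum()
loss.backward()
opt.step()       # in-place parameter update via `step`
opt.zero_grad()

# JAX optimizers (here: optax) are functional instead: no `step` method,
# the optimizer state is passed in and returned explicitly.
import jax
import jax.numpy as jnp
import optax

params_j = jnp.ones(3)
tx = optax.adam(learning_rate=1e-2)
state = tx.init(params_j)

grads = jax.grad(lambda p: jnp.sum(p ** 2))(params_j)
updates, state = tx.update(grads, state)       # pure function of (grads, state)
params_j = optax.apply_updates(params_j, updates)
```

So even if any object with a step method works for the opt argument on the torch backend, the JAX case presumably needs an adapter or a separate code path, which is why a documented example would be helpful.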

Thank you in advance for your help!
