
Review on the JCON Paper #27

Open
kellertuer opened this issue Jan 26, 2025 · 2 comments
These are a few comments on the JCON paper review at JuliaCon/proceedings-review#179.

I marked a few things in the PDF and wrote additional notes; they are numbered so we can discuss them.
Maybe the best way to address these is in a PR.

Review ExponentialFamilyManifolds JCON

  1. I would maybe postfix every package name with .jl for clarity? (I only marked the first 3 and a few throughout.)
  2. Concerning the advanced optimization techniques: are these done directly or via Manopt.jl? If that fits, mentioning Manopt.jl in the abstract would be great (if it is used for them, that is).
  3. Can you maybe provide a reference that the family is a popular choice?
  4. “Natural parameters” sounds like a known, fixed term in statistics? If so it can stay as is; otherwise a short explanation would be nice.
  5. Can you maybe add that they are implemented using the interface for manifolds provided in ManifoldsBase.jl? That fits a bit better than the mention of Manifolds.jl in the sentence afterwards.
  6. Would `NaturalParametersManifold <: AbstractManifold` be correct here? Maybe even `ManifoldsBase.AbstractManifold` (though that is a bit long-ish).
  7. Again, to be a bit picky: ProductManifold is already available from ManifoldsBase.jl; Manifolds.jl (also a bit of a heavy dependency) would not be necessary for this.
  8. How does get_natural_manifold work? What are its two arguments? Maybe a link to the docs would be nice here.
  9. The call retract(M, p, X), starting from p in direction X, is missing the retraction method m, i.e. retract(M, p, X, m); without it, it is equivalent to just calling exp(M, p, X), unless default_retraction_method(M) is overwritten accordingly, but then that method should be mentioned again.
  10. Ah, from here you might indeed need Manifolds.jl, since the positive numbers are from there. The footnote is a bit misleading: it might be read as saying that PositiveNumbers is actually implemented in ExponentialFamilyManifolds.jl as well.
  11. Oh, it gets a bit technical in Code 1. ForwardDiff computes the Euclidean gradient, but on manifolds you need the Riemannian gradient; see the note at https://manoptjl.org/stable/tutorials/AutomaticDifferentiation/#EmbeddedGradient and change_representer. On the positive numbers you might only be slightly wrong, but it would be better to actually change the representer here as well.
    In principle you (implicitly) do the same in Code 4, just that there the Fisher metric is the one you aim for.
    We just do not have Fisher–Rao much in Manifolds.jl, but that change is the same one you would need in Code 1 as well, unless M is Euclidean.
  12. This explanation is too short for me, but I am not a statistics expert, maybe for them that is enough information
  13. What is reference [7]? Is it meant to be the Zenodo DOI? Then please add that; the reference seems quite incomplete in its current form.
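To make the point in item 9 concrete: a minimal sketch using the Sphere from Manifolds.jl purely for illustration (the paper would use the natural-parameter manifolds); the retraction methods are part of the ManifoldsBase.jl interface.

```julia
using Manifolds, LinearAlgebra

# Illustrative manifold and point, not taken from the paper.
M = Sphere(2)
p = [1.0, 0.0, 0.0]
X = [0.0, 0.5, 0.0]  # a tangent vector at p

# With the exponential retraction, retract is the same as calling exp.
q_exp = retract(M, p, X, ExponentialRetraction())

# An explicit retraction method m gives a (cheaper, first-order) approximation
# and generally a different point.
q_proj = retract(M, p, X, ProjectionRetraction())

q_exp ≈ exp(M, p, X)  # without m, retract reduces to exp
```

So without the fourth argument m, the call only differs from exp(M, p, X) if default_retraction_method(M) has been overwritten, which is the ambiguity the comment points at.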
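And to illustrate the gradient conversion in item 11: a minimal sketch, assuming Manifolds.jl's PositiveNumbers and a made-up cost f (not from the paper), of turning a ForwardDiff derivative into the Riemannian gradient via change_representer.

```julia
using Manifolds, ForwardDiff

M = PositiveNumbers()
f(p) = log(p)^2   # illustrative cost, chosen only for this sketch
p = 2.0

# ForwardDiff yields the Euclidean gradient (here just the scalar derivative) ...
egrad = ForwardDiff.derivative(f, p)

# ... which has to be converted into the Riemannian gradient with respect to
# M's metric; on PositiveNumbers this amounts to multiplying by p^2.
rgrad = change_representer(M, EuclideanMetric(), p, egrad)
```

This is exactly the change of representer the comment asks for; on a Euclidean M the conversion is the identity, which is why the issue only shows up on curved or rescaled metrics.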


10.21105.jcon.00172.pdf

@kellertuer (author):

In a second part I will add a few comments on the documentation and the code in a few days.

@kellertuer (author):

Here is part II on the documentation; I did not yet look deeply into the actual implementations, but the code snippets I saw look nice.

Concerning the code:

  1. I am missing a “get started” section or some installation help:
  • The Readme says “See the docs”, which is fine (could be a link maybe)
  • The docs (stable) are one single page; that is ok, but could maybe be split – but all docs are merely the description of the functionality.
  • So: how do I get started (after the usual `using Pkg; Pkg.add("ExponentialFamilyManifolds")`)?
  2. Nice that you run Aqua, and great that you have very good code coverage!
  3. For the very last code example with optimization in the docs (https://reactivebayes.github.io/ExponentialFamilyManifolds.jl/dev/#Optimization-example): the gradient is created just doing forward differences, so is that the Euclidean gradient? Or is that by chance fine due to the natural parameters? That could maybe be mentioned.
  4. As a future improvement it might be worth having a list of the distributions you already cover / that fit your framework. Currently get_natural_manifold_base (not sure why the name includes “base”) has a long list of definitions, which might be considered such a list, but it is a bit clumsy to follow up on.
  5. The review criteria mention that the package should clearly indicate how people can contribute or seek support. It would be great if you could write about that e.g. in the Readme.
  • Maybe link to discourse?
  • Maybe mention a Slack channel where you generally hang out, look regularly?
  • Mention issues / discussions on the repo?
