[API] redesign of logging and monitoring for 2.0 #1700
Comments
Maybe a question for @jdb78. @dr-upsilon, is this happening in a way that is duplicative with `HyperparametersMixin`?

Yes, I think so. `BaseModel` inherits from `HyperparametersMixin` from `lightning.pytorch.core.mixins.hparams_mixin`.

Looked through this multiple times and I cannot come up with a good reason (without hearing @jdb78's original rationale). I suppose it is something that changed between different versions. Would you like to contribute a PR?
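To make the suspected duplication concrete, here is a minimal sketch with hypothetical stand-in classes (the names mirror the real ones, but this is not the actual pytorch-forecasting or Lightning code): if both a base class and a subclass call a `save_hyperparameters()`-style method in their `__init__`, the same attribute gets recorded twice.

```python
# Hypothetical stand-ins for the classes under discussion; NOT the real
# pytorch-forecasting / lightning implementation, just an illustration of
# how two __init__ methods can both record the same hyperparameter.
class HyperparametersMixin:
    def __init__(self):
        self.hparams = {}
        self.save_count = 0

    def save_hyperparameters(self, **kwargs):
        # Each call merges more entries into the shared dict.
        self.hparams.update(kwargs)
        self.save_count += 1


class BaseModel(HyperparametersMixin):
    def __init__(self, loss="MSE"):
        super().__init__()
        self.save_hyperparameters(loss=loss)


class ConcreteModel(BaseModel):
    def __init__(self, loss="MSE", hidden_size=16):
        super().__init__(loss=loss)
        # Saving again here re-records 'loss' on top of the base call.
        self.save_hyperparameters(loss=loss, hidden_size=hidden_size)


m = ConcreteModel()
print(m.save_count)  # 2: 'loss' was handled by both __init__ methods
```

Whether the extra call is intentional (e.g. to capture subclass-only arguments) is exactly the open question above.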
pytorch-forecasting seemingly unnecessarily saves `loss` and `logging_metrics` multiple times
@dr-upsilon, monitoring and logging have emerged as a top priority item for the API redesign. I have hence converted this issue into a more general API redesign issue for monitoring and logging.
Discussion on API design of monitoring and logging layer for version 2.0.
Top level umbrella issue: #1736
Converted from original issue below, by @dr-upsilon.
pytorch-forecasting seemingly unnecessarily saves `loss` and `logging_metrics` multiple times:

```
C:\...miniconda3\envs\envpt\Lib\site-packages\lightning\pytorch\utilities\parsing.py:208: Attribute 'logging_metrics' is an instance of `nn.Module` and is already saved during checkpointing. It is recommended to ignore them using `self.save_hyperparameters(ignore=['logging_metrics'])`.
```
This is caused by the `self.save_hyperparameters()` call in the `__init__` method of `TemporalFusionTransformer`: `save_hyperparameters()` uses `inspect` and the calling frame to identify all hyperparameters. What is the reason to keep it, or should we add handling in `__init__`?
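The frame-inspection behavior described above can be sketched in plain Python (a minimal, hypothetical mimic using only the standard library, not Lightning's actual implementation): the method walks one frame up to the caller's `__init__` and collects its arguments, which is why `nn.Module`-valued arguments like `loss` and `logging_metrics` get captured unless explicitly ignored.

```python
# Minimal sketch (stdlib only, hypothetical) of a frame-inspecting
# save_hyperparameters(): it captures every __init__ argument it finds
# in the caller's frame, unless the name is in the ignore list.
import inspect


class Model:
    def __init__(self, hidden_size=16, loss="MSE", logging_metrics=("MAE",)):
        # Mirrors the recommendation in the warning: ignore module-like
        # attributes that checkpointing already saves.
        self.save_hyperparameters(ignore=["loss", "logging_metrics"])

    def save_hyperparameters(self, ignore=()):
        # Walk one frame up to the calling __init__ and read its
        # argument names and values, mimicking frame inspection.
        frame = inspect.currentframe().f_back
        args, _, _, values = inspect.getargvalues(frame)
        self.hparams = {
            name: values[name]
            for name in args
            if name != "self" and name not in ignore
        }


m = Model()
print(m.hparams)  # only 'hidden_size' survives the ignore list
```

This suggests one possible handling in `__init__`: pass `ignore=["loss", "logging_metrics"]` (as the Lightning warning itself recommends), so the modules are checkpointed once rather than duplicated into the hyperparameters.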