add docstring to dtype
killershrimp committed Dec 22, 2024
1 parent 6ef8d80 commit eccd5f0
Showing 1 changed file with 2 additions and 0 deletions.
src/adapters/configuration/adapter_config.py (2 additions, 0 deletions)
@@ -483,6 +483,7 @@ class LoRAConfig(AdapterConfig):
             Place a trainable gating module besides the added parameter module to control module activation. This is
             e.g. used for UniPELT. Defaults to False. Note that modules with use_gating=True cannot be merged using
             `merge_adapter()`.
+        dtype (str, optional): torch dtype for reparametrization tensors. Defaults to None.
     """
 
     architecture: Optional[str] = "lora"
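
For context, a minimal usage sketch of the new parameter (not part of this commit; it assumes the AdapterHub `adapters` package at a version that includes this change, and the adapter name "my_lora" is illustrative):

    from transformers import AutoModelForSequenceClassification
    import adapters
    from adapters import LoRAConfig

    model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
    adapters.init(model)  # enable adapter support on the plain Transformers model

    # Per the docstring added above, dtype is a string naming a torch dtype;
    # here the LoRA reparametrization tensors are kept in float16.
    config = LoRAConfig(r=8, alpha=16, dtype="float16")
    model.add_adapter("my_lora", config=config)
    model.train_adapter("my_lora")  # activate and make only the adapter trainable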
@@ -542,6 +543,7 @@ class ReftConfig(AdapterConfig):
         subtract_projection (bool): If True, subtract the projection of the input.
         dropout (float): The dropout rate used in the intervention layer.
         non_linearity (str): The activation function used in the intervention layer.
+        dtype (str, optional): torch dtype for intervention tensors. Defaults to None.
     """
 
     layers: Union[Literal["all"], List[int]]
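And a corresponding sketch for ReftConfig (continuing from the previous snippet; values other than dtype are illustrative, and the fields prefix_positions, suffix_positions, r, and orthogonality are assumed from the surrounding class definition, which this hunk does not show):

    from adapters import ReftConfig

    # Per the docstring added above, dtype names the torch dtype used for the
    # intervention tensors; None keeps the model's default precision.
    reft_config = ReftConfig(
        layers="all",        # intervene in every transformer layer
        prefix_positions=3,  # intervene on the first 3 token positions
        suffix_positions=0,  # no suffix positions
        r=1,                 # low-rank dimension of the intervention
        orthogonality=True,  # orthogonal projection, as in LoReft
        dtype="bfloat16",    # cast intervention tensors to bfloat16
    )
    model.add_adapter("my_reft", config=reft_config)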
