
Commit: clean up fsdp lora logic
Jiayi-Pan committed Feb 5, 2025
1 parent e01412a commit a0be6d9
Showing 1 changed file with 1 addition and 1 deletion.
verl/workers/sharding_manager/fsdp_vllm.py (2 changes: 1 addition & 1 deletion)
@@ -89,7 +89,7 @@ def __enter__(self):
 
         if isinstance(self.module._fsdp_wrapped_module, PeftModel):
            with FSDP.summon_full_params(self.module):
-               self.module._fsdp_wrapped_module.unmerge_adapter()
+               self.module.unmerge_adapter()
        del params
        torch.cuda.empty_cache()
        log_gpu_memory_usage('After del state_dict and empty_cache in sharding manager', logger=logger)
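
For context, the hunk above touches the pattern of unmerging LoRA adapter weights from the base model while FSDP has gathered the full parameters, before the sharding manager frees memory. Below is a minimal illustrative sketch of that pattern, not verl code: the function name unmerge_lora_adapters is hypothetical, and it assumes the FSDP instance directly wraps a PeftModel, as in the diff.

# Illustrative sketch (not verl code): unmerge LoRA adapter weights from the
# base model while FSDP has materialized the full, unsharded parameters.
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from peft import PeftModel


def unmerge_lora_adapters(fsdp_module: FSDP) -> None:
    """Undo a previous merge of LoRA weights into the base weights.

    Assumes fsdp_module wraps a PeftModel directly, as in the diff above.
    """
    if isinstance(fsdp_module._fsdp_wrapped_module, PeftModel):
        # summon_full_params gathers the sharded parameters so that PEFT's
        # in-place weight rewrite operates on full tensors, not shards.
        with FSDP.summon_full_params(fsdp_module):
            # FSDP forwards attribute lookups it does not handle to its wrapped
            # module, so this resolves to PeftModel.unmerge_adapter(); spelling
            # it as fsdp_module._fsdp_wrapped_module.unmerge_adapter() is
            # equivalent.
            fsdp_module.unmerge_adapter()
    # Release cached blocks left over from the temporarily gathered full params.
    torch.cuda.empty_cache()

Both spellings shown in the diff resolve to the same PeftModel.unmerge_adapter call, since FSDP delegates unknown attribute access to its wrapped module.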
