
(remove some small docstring references to slurm)
pgunn committed Nov 21, 2023 — commit 2202528 (1 parent: 637b7ac)
Showing 2 changed files with 1 addition and 3 deletions.
2 changes: 1 addition & 1 deletion caiman/source_extraction/cnmf/map_reduce.py
@@ -41,7 +41,7 @@ def cnmf_patches(args_in):
         number of global background components
     backend: string
-        'ipyparallel' or 'single_thread' or SLURM
+        'ipyparallel' or 'single_thread'
     n_processes: int
         number of cores to be used (should be less than the number of cores started with ipyparallel)
2 changes: 0 additions & 2 deletions caiman/source_extraction/cnmf/temporal.py
@@ -113,7 +113,6 @@ def update_temporal_components(Y, A, b, Cin, fin, bl=None, c1=None, g=None, sn=N
     ipyparallel, parallelization using the ipyparallel cluster.
     You should start the cluster (install ipyparallel and then type
     ipcluster -n 6, where 6 is the number of processes).
-    SLURM: using SLURM scheduler
     memory_efficient: Bool
         whether or not to optimize for memory usage (longer running times). necessary with very large datasets
@@ -287,7 +286,6 @@ def update_iteration(parrllcomp, len_parrllcomp, nb, C, S, bl, nr,
     ipyparallel, parallelization using the ipyparallel cluster.
     You should start the cluster (install ipyparallel and then type
     ipcluster -n 6, where 6 is the number of processes).
-    SLURM: using SLURM scheduler
     memory_efficient: Bool
         whether or not to optimize for memory usage (longer running times). necessary with very large datasets
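After this commit, the backend choices documented in these docstrings are 'ipyparallel' and 'single_thread'. As a hedged illustration only — this helper is hypothetical and not part of CaImAn's API — a validator for such a parameter might look like:

```python
# Hypothetical sketch, not CaImAn code: the accepted values simply
# mirror the docstrings above after the SLURM option was removed.
VALID_BACKENDS = {"ipyparallel", "single_thread"}

def check_backend(backend: str) -> str:
    """Return the backend name if it is one of the documented choices."""
    if backend not in VALID_BACKENDS:
        raise ValueError(
            f"backend must be one of {sorted(VALID_BACKENDS)}, got {backend!r}"
        )
    return backend
```

One side note on the surviving docstring text: recent ipyparallel releases expect the subcommand form `ipcluster start -n 6` rather than the bare `ipcluster -n 6` shown above.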
