# Does engagement matter: Do mice see the world differently when they don't care?

This repository contains code for our project for Neuromatch Academy: Computational Neuroscience.

Traditionally, V1/VISp is considered a simple feature detector. Stimulus representation in A1 is known to adapt to engagement in a go/no-go task. Does stimulus representation in V1 also differ between engagement (active) and disengagement (passive)? We investigate this question at both the single-cell and population levels.

## Requirements

Required Python packages are listed in the `requirements.txt` file. The dataset can be accessed through the Allen SDK (`allensdk`) API.
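
A minimal sketch of how the data could be pulled through `allensdk`, assuming the Allen Visual Behavior Ophys dataset (which contains matched active-task and passive-replay sessions); the cache directory and the filtering shown here are illustrative, not the project's actual configuration:

```python
# Hypothetical example: load VISp experiments from the Allen Visual Behavior
# Ophys dataset. Paths, filters, and the chosen experiment are placeholders.
from allensdk.brain_observatory.behavior.behavior_project_cache import (
    VisualBehaviorOphysProjectCache,
)

# Point to (or create) a local cache of the dataset, streamed from S3.
cache = VisualBehaviorOphysProjectCache.from_s3_cache(cache_dir="./allen_data")

# Table of imaging experiments; keep only primary visual cortex (VISp).
experiments = cache.get_ophys_experiment_table()
visp = experiments[experiments["targeted_structure"] == "VISp"]

# Load one experiment (active task or passive replay) by its ID.
exp = cache.get_behavior_ophys_experiment(ophys_experiment_id=visp.index[0])
dff = exp.dff_traces                 # cell-wise dF/F traces
stims = exp.stimulus_presentations   # stimulus presentation table
```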

## Results

- At the single-cell level, our methods yield no clear results.
- Population-level dissimilarity (RDM) analysis hints at a difference between the active and passive sessions, but with the pre-stimulus control this difference appears to be independent of the stimuli (a minimal RDM sketch follows this list).
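
A minimal sketch of the population-level RDM comparison, assuming trial-averaged responses are already arranged as a stimuli-by-cells matrix; the array shapes, variable names, and random data below are illustrative rather than the project's actual code:

```python
# Hypothetical RDM sketch: 1 - Pearson correlation between population
# response vectors of every pair of stimuli, compared across conditions.
import numpy as np
from scipy.spatial.distance import pdist, squareform

def rdm(responses: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix over a (n_stimuli, n_cells) array."""
    return squareform(pdist(responses, metric="correlation"))

# Random data standing in for trial-averaged active/passive responses.
rng = np.random.default_rng(0)
active = rng.normal(size=(8, 100))    # 8 stimuli x 100 cells
passive = rng.normal(size=(8, 100))

rdm_active, rdm_passive = rdm(active), rdm(passive)

# One coarse comparison: correlate the upper triangles of the two RDMs.
iu = np.triu_indices(8, k=1)
similarity = np.corrcoef(rdm_active[iu], rdm_passive[iu])[0, 1]
print(f"RDM similarity (active vs passive): {similarity:.3f}")
```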

Refer to the respective notebooks for the single-cell and population-level analyses. For detailed results, see the presentation file.

## References