start article on variable precision #230

Draft · wants to merge 2 commits into base: develop
1 change: 1 addition & 0 deletions _pkgdown.yml
@@ -76,6 +76,7 @@ articles:
- articles/bmm_mixture_models
- articles/bmm_imm
- articles/bmm_sdm_simple
- articles/bmm_variable_precision

- title: Using the bmm package
desc: >
18 changes: 18 additions & 0 deletions vignettes/articles/REFERENCES.bib
@@ -101,3 +101,21 @@ @article{oberauerSimpleMeasurementModels2019
urldate = {2022-07-26}
}

@article{bergVariabilityEncodingPrecision2012,
title = {Variability in Encoding Precision Accounts for Visual Short-Term Memory Limitations},
author = {van den Berg, Ronald and Shin, Hongsup and Chou, Wen-Chuang and George, Ryan and Ma, Wei Ji},
year = {2012},
month = may,
journal = {Proceedings of the National Academy of Sciences},
volume = {109},
number = {22},
pages = {8780--8785},
publisher = {National Academy of Sciences},
issn = {0027-8424, 1091-6490},
doi = {10.1073/pnas.1117465109},
url = {https://www.pnas.org/content/109/22/8780},
urldate = {2020-08-17},
chapter = {Biological Sciences},
langid = {english},
pmid = {22582168}
}
93 changes: 93 additions & 0 deletions vignettes/articles/bmm_variable_precision.Rmd
@@ -0,0 +1,93 @@
---
title: "Implementing variable precision in the mixture models for visual working memory"
output:
bookdown::html_document2:
number_sections: false
author:
- Ven Popov
bibliography: REFERENCES.bib
vignette: >
  %\VignetteIndexEntry{Implementing variable precision in the mixture models for visual working memory}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
pkgdown:
as_is: true
---

```{=html}
<style type="text/css">
div.main-container {
max-width: 850px !important;
}

p {
margin-top: 1.5em ;
margin-bottom: 1.5em ;
}
.author{
display: none;
}
</style>
```

```{r, include = FALSE}
options(crayon.enabled = TRUE)
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
dev = "jpeg",
dpi = 100,
fig.asp = 0.8,
fig.align = "center"
)
fansi::set_knit_hooks(knitr::knit_hooks, which = c("output","message","error"))
options(width = 300)

ggplot2::theme_set(tidybayes::theme_tidybayes() + cowplot::panel_border())
```

# What is variable precision?

In the context of [visual working memory (VWM)](bmm_vwm_crt.html) research, the precision of memory representations is often assumed to be constant across all items in a display. However, there is evidence that precision varies across items and trials. This can be captured in formal models of VWM, such as those implemented in the bmm package, by allowing the precision parameter to vary across items or trials - so-called variable precision.

Traditionally, variable precision models were considered to be a separate class of models [@bergVariabilityEncodingPrecision2012], and estimating them required the development of specialized mathematical tools, such as deriving the likelihood function that results from a mixture of variable precision distributions. Fortunately, this is no longer the case, as the bmm package allows for the estimation of variable precision models using the same tools that are used to estimate constant precision models. In this article, we will demonstrate how to implement variable precision for any of the existing models in the bmm package.

# Implementing variable precision in the bmm package

Depending on your goals, amount of data, and computational resources, you can implement variable precision in the bmm package at several levels of pooling.

## Trial-by-trial variability in precision

The simplest option is to specify that `kappa`, the precision parameter of the von Mises distribution, is allowed to vary as a random effect across trials. Let's implement this for the standard two-parameter mixture model:

```r
# load the package
library(bmm)

# no variable precision:
# thetat and kappa vary across set sizes, with by-participant random effects of set size
formula_fixed <- bmf(thetat ~ 0 + set_size + (0 + set_size | ID),
kappa ~ 0 + set_size + (0 + set_size | ID))

# kappa varies across trials (variable precision)
formula_vp <- bmf(thetat ~ 0 + set_size + (0 + set_size | ID),
kappa ~ 0 + set_size + (0 + set_size | ID) + (1|trial))
```

In this formula, `(1|trial)` specifies that the precision parameter `kappa` is allowed to vary across trials. The `0 + set_size` term estimates separate fixed effects of `thetat` and `kappa` for each set size, and `(0 + set_size | ID)` adds by-participant random effects of set size, so both parameters can differ between participants at each set size.

We would then fit these models like so, starting with the constant precision version:

```r
mydata <- oberauer_lin_2017
mymodel <- mixture2p(resp_error = "dev_rad")
fit_fixed <- bmm(formula_fixed, data = mydata, model = mymodel, file = "local/vp_tutorial_fixed",
backend = "cmdstanr", cores = 4)
```
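
The variable precision model is fitted in exactly the same way. The sketch below simply mirrors the call above with `formula_vp`; the file name `"local/vp_tutorial_vp"` is just an example cache location:

```r
# fit the variable precision model; the file argument caches the fit locally
fit_vp <- bmm(formula_vp, data = mydata, model = mymodel, file = "local/vp_tutorial_vp",
              backend = "cmdstanr", cores = 4)
```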

The variable precision model above makes a strong assumption: that the variability over trials is the same for all participants. Even though we have set random effects on the overall precision (that is, participants can vary in how precise their memory is on average), we have assumed that every participant shows the same amount of trial-to-trial variability.
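
One way this assumption could in principle be relaxed is to let the standard deviation of the trial-level random effect differ between participants. The sketch below uses brms's `gr(..., by = ...)` grouping syntax for this; it assumes that `bmf()` passes such terms through to brms unchanged, which has not been verified here:

```r
# kappa varies across trials, with a separate trial-level standard deviation
# estimated for each participant (assumes bmf() forwards gr() terms to brms)
formula_vp_by_id <- bmf(thetat ~ 0 + set_size + (0 + set_size | ID),
                        kappa ~ 0 + set_size + (0 + set_size | ID) +
                          (1 | gr(trial, by = ID)))
```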




# References