binaryRL: Reinforcement Learning Tools for Two-Alternative Forced Choice Tasks

Tools for building Rescorla-Wagner models for Two-Alternative Forced Choice tasks, which are commonly employed in psychological research. Most concepts and ideas in this R package are drawn from Sutton and Barto (2018) <ISBN:9780262039246>. The package allows reinforcement learning models to be defined intuitively with simple if-else statements; the three basic models built into the package follow Niv et al. (2012) <doi:10.1523/JNEUROSCI.5498-10.2012>. Our approach to constructing and evaluating these computational models follows the guidelines proposed by Wilson & Collins (2019) <doi:10.7554/eLife.49547>. The example datasets included with the package are taken from Mason et al. (2024) <doi:10.3758/s13423-023-02415-x>.
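
For illustration, the following is a minimal sketch of how a Rescorla-Wagner update for a single Two-Alternative Forced Choice trial can be written in plain R with if-else statements. Function and argument names here are hypothetical and do not reflect the binaryRL API.

# Hypothetical sketch; not the binaryRL interface.
rw_update <- function(value, choice, reward, alpha = 0.1) {
  # value:  numeric vector of length 2, current expected values of the two options
  # choice: 1 or 2, the option chosen on this trial
  # reward: observed reward (e.g., 0 or 1)
  # alpha:  learning rate
  pe <- reward - value[choice]          # prediction error for the chosen option
  if (choice == 1) {
    value[1] <- value[1] + alpha * pe   # update option 1 only
  } else {
    value[2] <- value[2] + alpha * pe   # update option 2 only
  }
  value
}

# Example: simulate 100 trials where option 1 pays off 80% of the time
set.seed(123)
v <- c(0, 0)
beta <- 3                               # inverse temperature of the softmax choice rule
for (t in seq_len(100)) {
  p1 <- exp(beta * v[1]) / (exp(beta * v[1]) + exp(beta * v[2]))
  ch <- if (runif(1) < p1) 1 else 2
  rw <- if (ch == 1) rbinom(1, 1, 0.8) else rbinom(1, 1, 0.2)
  v <- rw_update(v, ch, rw)
}
v  # learned values should approach c(0.8, 0.2)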

Version: 0.9.7
Depends: R (≥ 4.0.0)
Imports: Rcpp, compiler, future, doFuture, foreach, doRNG, progressr
LinkingTo: Rcpp
Suggests: stats, GenSA, GA, DEoptim, pso, mlrMBO, mlr, ParamHelpers, smoof, lhs, DiceKriging, rgenoud, cmaes, nloptr
Published: 2025-08-19
DOI: 10.32614/CRAN.package.binaryRL
Author: YuKi [aut, cre]
Maintainer: YuKi <hmz1969a at gmail.com>
BugReports: https://github.com/yuki-961004/binaryRL/issues
License: GPL-3
URL: https://yuki-961004.github.io/binaryRL/
NeedsCompilation: yes
CRAN checks: binaryRL results

Documentation:

Reference manual: binaryRL.html, binaryRL.pdf

Downloads:

Package source: binaryRL_0.9.7.tar.gz
Windows binaries: r-devel: binaryRL_0.9.7.zip, r-release: binaryRL_0.9.7.zip, r-oldrel: binaryRL_0.9.7.zip
macOS binaries: r-release (arm64): binaryRL_0.9.7.tgz, r-oldrel (arm64): binaryRL_0.9.0.tgz, r-release (x86_64): binaryRL_0.9.7.tgz, r-oldrel (x86_64): binaryRL_0.9.7.tgz
Old sources: binaryRL archive

Linking:

Please use the canonical form https://CRAN.R-project.org/package=binaryRL to link to this page.