This is a list of “core” papers that everyone should be familiar with. Please feel free to recommend some!
Semantic Memory
- Explaining our theory:
- Explaining the opposing theory:
  - Patterson, K., Nestor, P. J. & Rogers, T. T. Where do you know what you know? The representation of semantic knowledge in the human brain. Nat Rev Neurosci 8, 976–987 (2007). [PDF]
  - Lambon Ralph, M. A., Jefferies, E., Patterson, K. & Rogers, T. T. The neural and computational bases of semantic cognition. Nat Rev Neurosci 18, 42–55 (2017). [PDF]
- The theoretical basis of our theory:
  - Barsalou, L. W. Grounded Cognition. Annu Rev Psychol 59, 617–645 (2008). [PDF]
  - Simmons, W. K. & Barsalou, L. W. The similarity-in-topography principle: reconciling theories of conceptual deficits. Cogn Neuropsychol 20, 451–486 (2003). [PDF]
  - Meyer, K. & Damasio, A. Convergence and divergence in a neural architecture for recognition and memory. Trends Neurosci 32, 376–382 (2009). [PDF]
BCI
- One of the prominent groups that have decoded speech from the motor cortex:
  - Makin, J. G., Moses, D. A. & Chang, E. F. Machine translation of cortical activity to text with an encoder–decoder framework. Nat Neurosci 23, 575–582 (2020). [PDF]
  - Moses, D. A. et al. Neuroprosthesis for decoding speech in a paralyzed person with anarthria. N Engl J Med 385, 217–227 (2021). [PDF]
Machine Learning
- Convolutional Networks: Szegedy, C. et al. Going Deeper with Convolutions. arXiv (2014). [PDF]
- Residual Networks: He, K., Zhang, X., Ren, S. & Sun, J. Deep Residual Learning for Image Recognition. arXiv (2015). [PDF]
- The Transformer: Vaswani, A. et al. Attention Is All You Need. arXiv (2017). [PDF] (the core attention operation, together with a residual connection, is sketched after this list)
- GPT: Radford, A., Narasimhan, K., Salimans, T. & Sutskever, I. Improving Language Understanding by Generative Pre-Training. OpenAI (2018). [PDF]
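For orientation, the residual-network and Transformer entries come down to two small operations: a residual connection, y = F(x) + x (He et al.), and scaled dot-product attention, softmax(QKᵀ/√d_k)·V (Vaswani et al.). Below is a minimal NumPy sketch of both, assuming toy shapes and function names chosen here for illustration; it is not code from either paper.

```python
# Minimal NumPy sketch of two ideas cited above; shapes, names, and the
# toy input are illustrative assumptions, not code from the papers.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)   # (seq_q, seq_k) similarity scores
    return softmax(scores, axis=-1) @ V              # weighted sum of the values

def residual_connection(x, sublayer):
    """Residual learning (He et al., 2015): learn F(x), output F(x) + x."""
    return sublayer(x) + x

# Toy usage: self-attention over 5 token embeddings, wrapped in a residual add.
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 16))                     # 5 tokens, 16-dim embeddings
out = residual_connection(x, lambda h: scaled_dot_product_attention(h, h, h))
print(out.shape)                                     # (5, 16)
```

In the Transformer, each layer composes exactly these pieces: self-attention and a position-wise feed-forward sublayer, each wrapped in a residual connection followed by layer normalization.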