Published online by Cambridge University Press: 24 June 2022
Background: Brain-machine interface research has used multichannel single-neuron recordings to decode movement intention. However, the prefrontal cortex (PFC) contains mental representations of more abstract task and goal elements that may serve as important signals for a brain-machine interface. We therefore used virtual reality to simulate a real-world task while recording from ensembles of primate PFC neurons.

Methods: Two male rhesus macaques (Macaca mulatta) were trained to navigate a virtual reality environment using a joystick and to learn a context-object association rule. We implanted each monkey with two 96-channel Utah arrays (Blackrock Microsystems) in the lateral PFC (areas 9/46 and 8a) and simultaneously recorded from multiple single neurons.

Results: A linear support vector machine decoded task elements (context, target location, and chosen direction of movement) with significantly greater-than-chance accuracy. This information was decoded sequentially as the primates made a rule-based decision: context information appeared first, followed by target location, and then the chosen side.

Conclusions: We found that distinct neuronal ensembles encode the elements needed to implement the context rule, and that these ensembles are activated sequentially. Brain-machine interface systems may benefit from integrating neural data from the PFC, which provides salient goal-related information such as the content of a goal and its spatial location.
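To illustrate the kind of analysis described above, the following is a minimal sketch (not the authors' actual pipeline) of sliding-window population decoding: a cross-validated linear support vector machine is trained on binned spike counts to decode a binary task variable, and the resulting accuracy time course shows when that variable becomes decodable. All array shapes, window parameters, and labels here are assumptions for illustration, using synthetic data in place of real recordings.

```python
# Minimal sketch (assumed setup, not the authors' pipeline): decode a binary
# task variable (e.g., context) from binned spike counts with a linear SVM,
# evaluated in sliding windows to reveal when information becomes decodable.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical data: trials x neurons x time bins of spike counts,
# plus one label per trial (e.g., context A vs. context B).
n_trials, n_neurons, n_bins = 200, 96, 40   # assumed dimensions
spikes = rng.poisson(2.0, size=(n_trials, n_neurons, n_bins)).astype(float)
labels = rng.integers(0, 2, size=n_trials)  # placeholder labels

window = 5  # decode from 5-bin sliding windows
accuracy = np.empty(n_bins - window + 1)
for start in range(n_bins - window + 1):
    # Average spike counts within the window -> one feature per neuron.
    X = spikes[:, :, start:start + window].mean(axis=2)
    clf = make_pipeline(StandardScaler(), LinearSVC(dual=False))
    # 5-fold cross-validated decoding accuracy for this window.
    accuracy[start] = cross_val_score(clf, X, labels, cv=5).mean()

# Chance level for a binary label is 0.5; repeating this for each task
# variable (context, target location, chosen side) and comparing when each
# accuracy curve rises above chance would expose the sequential ordering
# reported in the Results.
print(accuracy.round(2))
```

With real recordings, significance against chance would typically be assessed with label-shuffled permutations rather than the nominal 0.5 baseline used in this toy example.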