Previous words in a sentence can influence the processing of the current word on the timescale of hundreds of milliseconds. The current research provides a possible explanation of how certain aspects of this on-line language processing can occur, based on the dynamics of recurrent cortical networks. We simulate prefrontal area BA47 as a recurrent network that receives on-line input of “grammatical” words during sentence processing, with plastic connections between cortex and striatum (a homology with Reservoir Computing). The system is trained on sentence–meaning pairs, where meaning is coded as activation in the striatum corresponding to the roles that different “semantic” words play in the sentences. The model learns an extended set of grammatical constructions and demonstrates the ability to generalize to novel constructions. This shows that an RNN can decode grammatical structure from sentences in an on-line manner in order to generate a predictive representation of the meaning of the sentences.
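The architecture described above can be sketched in the Reservoir Computing style: a fixed random recurrent network (the BA47 analogue) is driven word by word, and only the linear readout (the corticostriatal connections) is trained to produce role activations. The following is a minimal illustrative sketch, not the authors' implementation; the vocabulary, roles, network sizes, and toy constructions are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES = 100                                # reservoir ("cortical") units
VOCAB = ["the", "X", "was", "to", "by"]    # grammatical words; "X" marks a semantic word
ROLES = ["agent", "object"]                # "striatal" role outputs for the first X

# Fixed (non-plastic) input and recurrent weights.
W_in = rng.uniform(-1, 1, (N_RES, len(VOCAB)))
W = rng.normal(0, 1, (N_RES, N_RES))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(sentence):
    """Drive the reservoir with one-hot word inputs; return the final state."""
    x = np.zeros(N_RES)
    for w in sentence:
        u = np.zeros(len(VOCAB))
        u[VOCAB.index(w)] = 1.0
        x = np.tanh(W @ x + W_in @ u)      # simple leak-free update
    return x

# Toy sentence-meaning pairs: an active-like and a passive-like construction,
# with the target coding the role of the first semantic word.
corpus = [
    (["the", "X", "was", "to", "X"], [1.0, 0.0]),   # first X = agent
    (["the", "X", "was", "by", "X"], [0.0, 1.0]),   # first X = object
]

X_states = np.stack([run_reservoir(s) for s, _ in corpus])
Y = np.array([t for _, t in corpus])

# Train only the readout ("corticostriatal") weights, here by ridge regression.
W_out = np.linalg.solve(X_states.T @ X_states + 1e-3 * np.eye(N_RES),
                        X_states.T @ Y).T

pred = X_states @ W_out.T                  # role activations per sentence
```

In the actual model the readout would be driven at every word, yielding the on-line, predictive role representation the abstract describes; the sketch reads out only the final state for brevity.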