Numerous behavioral and physiological studies show how the brain compensates for uncertainties and unexpected changes in the sensory environment while still successfully performing motor tasks. To date, a variety of phenomenological models have been proposed to explain this sensorimotor adaptation. However, to relate the suggested control algorithms to their neural realizations, it is important to have biologically plausible mechanisms that capture both neural activities and the higher-order behaviors they give rise to. Here we extend a previous model, the Recurrent Error-driven Adaptive Control Hierarchy (REACH), which accounts for dynamic and kinematic adaptation, to also capture visuomotor adaptation. To demonstrate this behavior, we consider the conventional 'visuomotor rotation' task performed by a two-link arm in a planar reaching setting. The model consists of anatomically organized structures, including M1, PMd, and the cerebellum, that incorporate different aspects of the behavior. The extended model includes a multimodal Kalman filter that serves as an internal model for dynamic prediction of limb states and for sensory integration of vision and proprioception. A spike-based algorithm is implemented to learn the visuomotor transformation. Replicating experiments in humans and non-human primates, the model performs reaches under either abrupt or gradually introduced (implicit) rotations. We also use the model to explore multi-rate adaptation, further demonstrating classical characteristics such as savings and interference. The proposed model is consistent with experimental data on rapid adaptation while also exhibiting spiking activity comparable to empirical data. A plausible, anatomically organized neuron model describing adaptation to visuomotor rotations, and eventually to other visuomotor transformations, can provide insights and significantly improve our understanding of motor system organization and function.
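
The multimodal Kalman filter mentioned above can be illustrated with a minimal sketch. This is not the paper's spiking implementation: the dynamics, noise covariances, and the assumption that vision and proprioception both measure position (with vision being noisier) are illustrative choices, stacked into a single observation vector so one update step fuses both modalities.

```python
import numpy as np

# Illustrative linear Kalman filter: predict a 1-D limb state [position,
# velocity] forward in time and fuse two noisy position measurements,
# "vision" and "proprioception", in one update. All parameters are assumptions.
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (constant velocity)
H = np.array([[1.0, 0.0], [1.0, 0.0]])  # both modalities observe position
Q = 1e-4 * np.eye(2)                    # process noise covariance
R = np.diag([0.05, 0.01])               # vision assumed noisier than proprioception

def kf_step(x, P, z):
    # predict limb state one time step ahead (internal forward model)
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # fuse the stacked measurement z = [z_vision, z_proprioception]
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# simulate a limb moving at constant velocity 1.0 and track it
x, P = np.zeros(2), np.eye(2)
true_pos, rng = 0.0, np.random.default_rng(0)
for _ in range(200):
    true_pos += dt * 1.0
    z = true_pos + rng.normal(0.0, [0.05 ** 0.5, 0.01 ** 0.5])
    x, P = kf_step(x, P, z)
```

Because the two modalities enter through one stacked observation, the Kalman gain automatically weights the less noisy channel (proprioception here) more heavily, which is the standard cue-combination interpretation of such filters.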