The hand is an exceptionally versatile effector that enables a wide range of behaviors, from picking up a coffee cup to playing the piano. From a neural standpoint, the versatility of the hand depends on a sophisticated motor system that has evolved to give rise to dexterity (Bortoff and Strick 1993; Lemon 1999). Manual interactions also require sensory feedback, including vision, proprioception, and touch. Proprioceptive afferents provide only weak sensitivity to mechanical fingertip contact (Macefield and Johansson 1996), and visual signals convey little information about contact events. Instead, most manual interactions with objects rely on the sense of touch, which provides precise information about objects – their local contours, their surface texture, their motion across the skin – and about our interactions with them – the location, magnitude, and directionality of contact forces (Johansson and Flanagan 2009). Ultimately, tactile signals are used to guide object interactions, as evidenced by the deficits that arise when these signals are abolished through digital anesthesia (Augurelle et al. 2003) or disease (Jeannerod et al. 1984). The goal of my dissertation is to investigate different representations of hand-object interactions across the neuraxis, beginning with the coding of task-relevant features in individual tactile fibers and culminating with representations of hand movements in populations of motor cortical neurons.