Observation of biological motion has been shown to produce dynamic changes in sensorimotor activation that depend on the observed kinematics. The physical plausibility of the spatio-kinematic relationships of human movement may play a major role in the top-down processing underlying human motion recognition. Here, we investigated the time course of scalp activation during observation of human gait, with a view to its future use in an integrated brain-computer interface based on virtual reality (VR). From high-density EEG recordings, we analyzed event-related potentials (ERPs), event-related spectral perturbations (ERSPs), and inter-trial coherence (ITC) around video onset (−200 to 600 ms), as well as steady-state visual evoked potentials (SSVEPs) during a 3D animation of human walking presented in three conditions: Normal; Upside-down (inverted images); and Uncoordinated (pseudo-randomly mixed images). We found that the early visual evoked response P120 was reduced in the Upside-down condition, while the N170 and P300b amplitudes were reduced in the Uncoordinated condition. In both the Upside-down and Uncoordinated conditions, alpha power and theta phase-locking were decreased. Gamma power, by contrast, was increased during the Upside-down animation and decreased during the Uncoordinated animation. We also describe an SSVEP-like response oscillating at about 10 Hz, whose oscillatory pattern was enhanced 300 ms after the heel-strike event in the Normal but not in the Upside-down condition. Our results are consistent with most previous point-light display studies and further support the possible use of virtual reality for neurofeedback applications.
Keywords: ERP; ERSP; ITC; SSVEP; observation; virtual reality; walking.