In this paper, we propose a novel unified framework for virtual guides. Human–robot interaction is based on a virtual robot controlled through admittance control. The unified framework combines virtual guides, control of the dynamic behavior, and path tracking. Different virtual guides and active constraints can be realized by introducing dead-zones in the position part of the admittance controller. The proposed algorithm can act in a changing task space and allows selection of the task-space and redundant degrees of freedom during task execution. The admittance control algorithm can be implemented at either the velocity or the acceleration level. The proposed framework has been validated experimentally on a KUKA LWR robot performing the Buzz-Wire task.
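As a rough illustration of the dead-zone idea (a sketch only; the gains $C_p$, $C_v$, the dead-zone width $\delta$, and the exact form are assumed notation and do not reproduce the formulation used in the paper), a velocity-level admittance law with a dead-zone on the position error can be written as
\[
\dot{x}_c = C_v F_{\mathrm{ext}} + C_p\, \mathrm{dz}_{\delta}(x_d - x),
\qquad
\mathrm{dz}_{\delta}(e) =
\begin{cases}
0, & |e| \le \delta,\\
e - \delta\,\operatorname{sgn}(e), & |e| > \delta,
\end{cases}
\]
applied component-wise. Inside the dead-zone the position term vanishes and the operator moves freely under the force term alone; outside it, the position term pulls the virtual robot back toward the guide, so shrinking $\delta$ toward zero in a given direction recovers a stiff constraint along that direction, while a large $\delta$ leaves it effectively unconstrained.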