A major challenge in human-robot interaction and collaboration is the synthesis of non-verbal behaviour for the expression of social signals. Appropriate perception and expression of dominance (verticality) in non-verbal behaviour is essential for social interaction. In this paper, we present our work on the algorithmic modulation of robot bodily movement to express varying degrees of dominance. We developed a parameter-based model for head tilt and body expansiveness and applied it to a variety of behaviours. These behaviours were evaluated by human observers in two studies: one with static pictures of key postures (N=772) and one with real-time gestures (N=31). Overall, specific behaviours proved to communicate different levels of dominance. Furthermore, modulation of body expansiveness and head tilt robustly influenced perceived dominance, independent of the specific behaviour and of observer viewing height and angle. The modulation did not influence perceived valence, but it did influence perceived arousal. Our results show that dominance can be reliably expressed both by the selection of specific behaviours and by their modulation.