Drone simulation
The installation of V-REP is straightforward. From the official V-REP site, download the archive corresponding to your system and extract it. You now have somewhere on your disk a somewhere/V-REP_PRO_EDU_..../ directory. Let us define an environment variable holding the path of this directory.
Edit your ~/.bashrc and add the following lines at the end (using the actual path of your V-REP directory instead of somewhere/V-REP_PRO_EDU_..../, of course).
# V-REP location
export VREP_ROOT=somewhere/V-REP_PRO_EDU_....
Then, close all terminals and open a new one, so that the $VREP_ROOT variable is actually defined. To launch V-REP, just run the startup script shipped in that directory:
$VREP_ROOT/vrep.sh
Now, we are ready for the real work...
Our first drone scene

Let us build our drone simulator from an existing drone in V-REP, in order to discover some V-REP features. The drone should follow the target. Now, let us rename some parts. What happens? Edit the drone Lua script and fix what the renaming has broken.
Let us have a deeper look at the Lua script. Try to understand what it does (except the calculation of the propeller velocities, which will be changed later). Why is the sphere detached? How is the propeller speed passed to the propeller controllers?
Game-like tracking camera
Let us add a tracking camera. Why is the drone frozen in the floating view?
Our track cam is attached to the drone frame, but another frame would be better: the "Track_base" frame moves with the drone, just as the "Quadricopter_base" frame does. The trick consists in removing the pitch and roll rotations from the "Track_base" frame. In the initialization part of the script, add the following:
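A minimal sketch of that initialization code, assuming the camera base object is named "Track_base" in the scene hierarchy:

```lua
-- Handle of the tracking-camera base ("Track_base" is the name
-- assumed here; adapt it to your scene hierarchy).
trackBase = simGetObjectHandle('Track_base')
-- Detach it from the drone: -1 means "no parent", and true
-- keeps the object in place in the world frame.
simSetObjectParent(trackBase, -1, true)
```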
We have unparented trackBase so that we can further set its position independently from the motion computation performed for Quadricopter_base. The object trackBase is thus now freed from any physics computation and can be moved in the world without being influenced by the simulation.
The script already has the following statement:
Now, in the sensing part of the script, let us get the position of the drone...:
So we only have to adjust (in the sensing part of the script) the absolute position of the "Track_base" frame from the absolute position of the "Quadricopter_base" frame:
ori = simGetObjectOrientation(d,-1)
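Putting it together, a sketch of the whole sensing-part update: the camera base copies the drone's absolute position, but keeps only the yaw component of its orientation (in V-REP's Euler angles, the third component, about the z axis), so that pitch and roll do not tilt the camera.

```lua
-- Copy the drone's absolute position onto the camera base...
local pos = simGetObjectPosition(d, -1)
simSetObjectPosition(trackBase, -1, pos)
-- ...but keep only the yaw: pitch and roll are zeroed out.
local ori = simGetObjectOrientation(d, -1)
simSetObjectOrientation(trackBase, -1, {0, 0, ori[3]})
```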
The "Track_base" frame is nice for a game-like display, but it will also be useful for incoming control of the drone.
Enter V-REP ?
Understanding how V-REP works might help you to build your own robots and interface them with ROS. It is really worth it, and this is what the rest of this page addresses. Nevertheless, you can skip it if you want to play directly with an already-made drone. If you do not want to skip it, go to the next section. Otherwise, execute the following instructions and you are done.
Removing existing control code
The idea here is to build up our own controller. To do so, we need first to clear the existing target-based control.
Elevation control
Let us control the height of the drone. This consists of adjusting the overall thrust so that the drone reaches and holds a desired height.
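A minimal sketch of such a controller, as a PD loop on the altitude error (targetHeight, the gains, and the 5.335 hovering-thrust value are illustrative numbers to be tuned for your drone, not values from this tutorial):

```lua
-- PD control of the altitude (sketch; gains are to be tuned).
targetHeight = 1.0                      -- desired altitude (m), hypothetical
local pos  = simGetObjectPosition(d, -1)
local err  = targetHeight - pos[3]      -- altitude error
-- Derivative of the error (lastErr should be reset to 0 in init).
local verr = (err - (lastErr or 0)) / simGetSimulationTimeStep()
lastErr = err
-- 5.335 is roughly the hovering thrust used by the demo
-- quadricopter script; treat it and the gains as tunable.
local thrust = 5.335 + 2.0*err + 1.0*verr
```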
Yaw control
The yaw control can be done quite easily.
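A sketch of what "quite easily" can look like: a proportional loop on the yaw error, whose correction is added to one diagonal propeller pair and subtracted from the other, since the two pairs spin in opposite directions (targetYaw and the gain are hypothetical values):

```lua
-- P control of the yaw (sketch; gain to be tuned).
targetYaw = 0.0                          -- desired heading (rad), hypothetical
local ori     = simGetObjectOrientation(d, -1)
local yawErr  = targetYaw - ori[3]       -- ori[3]: rotation about z
local rotCorr = 0.2 * yawErr
-- The correction modulates the diagonal pairs in opposition, e.g.:
-- velocities = {thrust+rotCorr, thrust-rotCorr,
--               thrust+rotCorr, thrust-rotCorr}
```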
Pitch control
The pitch control is a bit more complex, since maintaining the pitch at some constant inclination value leads to a constant acceleration of the drone. Here is what I propose.
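One way to handle this is a cascaded loop: the desired forward velocity sets a small pitch setpoint, and an inner loop tracks that pitch with a differential thrust between front and rear propellers. The sketch below assumes small yaw (so the world-frame beta angle approximates the pitch) and uses made-up gains:

```lua
-- Cascaded pitch control (sketch; all gains are to be tuned).
targetVx = 0.0                              -- desired forward speed (m/s), hypothetical
local linVel, angVel = simGetObjectVelocity(d)
-- Outer loop: velocity error -> pitch setpoint (rad).
local pitchSet = 0.1 * (targetVx - linVel[1])
-- Inner loop: pitch error -> differential thrust correction.
local ori = simGetObjectOrientation(d, -1)
local pitchCorr = 2.0 * (pitchSet - ori[2])
-- pitchCorr is then added to the rear propellers and subtracted
-- from the front ones (or vice versa, depending on the frames).
```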
Roll control
The roll control is identical to the pitch one.
Stability tests
Now, you can change the drone height and orientation during the simulation run. It should return to a still position.
ROS interface
In order to interact with ROS, you have to install some ROS packages in your workspace and add libraries to V-REP that handle the interaction with ROS. Follow the instructions here.
We need roscore to be running before V-REP is started. So close your current V-REP, run roscore, and restart V-REP.
The twist topic
Now, let us set up the ROS communication. The twist topic /drone/cmd_vel is now available. Let us use it to set the elevation of the drone to 1 meter.
rostopic pub /drone/cmd_vel geometry_msgs/Twist "linear:
  x: 0.0
  y: 0.0
  z: 1.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0"
Now, let us use the keyboard to control the drone navigation. I propose a teleop package for that purpose. Read the code in order to learn how to interact with the keyboard in ROS.
cd your_catkin_workspace/src
git clone https://github.com/HerveFrezza-Buet/demo-teleop.git
cd ..
catkin build
Now, run that teleop node and control the drone with the keyboard.
rosrun demo_teleop key_teleop.py cmd_vel:=/drone/cmd_vel
Do not forget to save your scene.
The image topic
Our drone really needs a camera (i.e. a vision sensor, not yet another OpenGL view of the scene). Let us add one, and then bind it to a ROS image topic.
In the frame of the drone (i.e. "Quadricopter_base"), add a perspective vision sensor named "FrontVision". Be sure to orient it carefully. Place it just in front of the central stick (i.e. the body) of the drone, setting its position and orientation relatively to the parent frame: the blue axis points toward the front, the green one toward the top and the red one to the left. Set its resolution to 640x480 and configure a far clipping plane of 100 m. Then, in the init part of the Lua script, in the section for ROS stuff, add:
frontImage = simGetObjectHandle('FrontVision')
image_publisher = simExtRosInterface_imageTransportAdvertise('/drone/front', 1)
Then in the cleanup section, add
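Assuming the RosInterface plugin naming used above, the matching shutdown call for an image-transport publisher is (to the best of my knowledge) the following; check the plugin documentation of your V-REP version if the name differs:

```lua
-- Release the image publisher when the simulation stops.
simExtRosInterface_imageTransportShutdown(image_publisher)
```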
Last, let us get the image data and publish it on the image topic. Add this in the sensing part of the script, at the end.
-- ROS stuff : publish the image
local data,w,h = simGetVisionSensorCharImage(frontImage)
simExtRosInterface_imageTransportPublish(image_publisher, data, w, h, "drone front")
Do not forget to save your scene.
Then, the ROS image topic /drone/front is available. You can display it with:
rosrun image_view image_view image:=/drone/front
From the V-REP interface, select your drone and save it as a model (.ttm) file. Put the model into the "models" subdirectory of your V-REP directory. It is now available, like any other robot, for building up a drone experimentation platform.