Tag Archives: quadcopter

myQrc – test completed

I just uploaded the last and final version of the myQ release candidate to GitHub.


All the software has been debugged and tested.

All the functionality is now stable.

I tested different configurations (debug mode, netscan activated, sensor logging), and the result is that I can run the main loop every 10 ms and get sensor data every 6 ms.

A delay of 2–3 ms can occur on the sensor data loop when a log entry is added.
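As a minimal sketch of how such a fixed-rate main loop can be scheduled (the 10 ms period is the one measured above; the function names are my own illustration, not the actual myQ code):

```python
import time

MAIN_PERIOD = 0.010  # 10 ms main loop, as measured in the test

def run_main_loop(step, cycles):
    """Run `step` at a fixed period; return the measured elapsed time."""
    start = time.monotonic()
    next_tick = start
    for _ in range(cycles):
        step()
        next_tick += MAIN_PERIOD
        # Sleep only for the remaining slice of the period, so a slow
        # iteration (e.g. a 2-3 ms log write) does not accumulate drift.
        delay = next_tick - time.monotonic()
        if delay > 0:
            time.sleep(delay)
    return time.monotonic() - start

elapsed = run_main_loop(lambda: None, cycles=20)  # ~20 * 10 ms
```

Keying the sleep to an absolute `next_tick` rather than sleeping a fixed 10 ms is what keeps the occasional slow iteration from shifting every later cycle.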

 

I removed the webserver from the list of tests to do, so it is not supported in this version of the software. The main reason is an instability when running on the Raspberry. I have to investigate a more robust way to manage the communication via browser.

 

The next time I write a post, the drone will have just landed… (after its first flight!!!)


Alfa3 test. Here comes the Python code

So now, after some additional tests, here is the software code: alfa3

All the basic components necessary to build quadcopter software are now implemented.
This is the list of modules developed:

  • motor.py
  • sensor.py
  • MPU6050.py
  • pid.py
  • rc.py

Each of these modules implements a specific object. I plan to spend some time reviewing the Python page with detailed help for each module.

Here I want just to summarise the main aspects.

rc is the new entry in this alfa3 test session. It is an object that works in a parallel thread, just waiting for input from the user. In this particular case the input comes from the keyboard, but with the same approach it is possible (I'm working on it, and a tutorial is almost ready) to get input from a webserver running on the rpi.
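The pattern can be sketched like this (my illustration with hypothetical names, not the actual rc.py code): a background thread pushes user commands onto a queue, and the main loop polls it without ever blocking.

```python
import threading
import queue

class RemoteControl:
    """Collects user commands in a background thread (sketch of the rc idea)."""
    def __init__(self, read_input):
        # `read_input` is injected, so the same class can serve the keyboard,
        # a webserver, or (as here, for testing) a canned list of commands.
        self._read = read_input
        self._commands = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        for cmd in iter(self._read, None):  # read until None is returned
            self._commands.put(cmd)

    def poll(self):
        """Non-blocking: return the next command, or None if there is none."""
        try:
            return self._commands.get_nowait()
        except queue.Empty:
            return None

# Usage with a simulated input source instead of the keyboard:
fake_keys = iter(["up", "down", None])
rc = RemoteControl(lambda: next(fake_keys))
rc._thread.join(timeout=1.0)
received = [rc.poll(), rc.poll(), rc.poll()]  # ["up", "down", None]
```

Injecting the read function is what makes the keyboard-vs-webserver swap mentioned above a one-line change.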

Motor is an object that manages the motor speed through the ESC. It uses the RPIO library to generate the PWM.
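ESCs are driven with a servo-style pulse. The helper below is my own illustration (not necessarily what motor.py does), and the 1000–2000 µs endpoints are a common ESC convention that must be checked against your specific ESC; on the Pi the resulting value would then be handed to the RPIO PWM servo call.

```python
MIN_PULSE_US = 1000  # typical ESC idle pulse width (check your ESC manual)
MAX_PULSE_US = 2000  # typical ESC full-throttle pulse width

def throttle_to_pulse(throttle):
    """Map a 0.0-1.0 throttle command to an ESC pulse width in microseconds."""
    throttle = max(0.0, min(1.0, throttle))  # clamp out-of-range commands
    return int(MIN_PULSE_US + throttle * (MAX_PULSE_US - MIN_PULSE_US))

half = throttle_to_pulse(0.5)  # 1500 us, mid throttle
```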

Sensor is an object that works in a parallel thread. Around every 6 ms it updates the info from the gyroscope and the accelerometer: quadcopter inclination and rotational speed.
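Unlike rc, which queues commands, a sensor thread only needs to expose the latest snapshot. A sketch of that idea (hypothetical names, with a fake read function standing in for the IMU):

```python
import threading
import time

class Sensor:
    """Background sampler that always exposes the latest reading (sketch)."""
    def __init__(self, read_hw, period=0.006):
        # `read_hw` stands in for the real IMU read; period ~6 ms as measured.
        self._read_hw = read_hw
        self._period = period
        self._lock = threading.Lock()
        self._latest = None
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while not self._stop.is_set():
            value = self._read_hw()
            with self._lock:
                self._latest = value
            time.sleep(self._period)

    def latest(self):
        """Called from the main loop: never blocks on the hardware."""
        with self._lock:
            return self._latest

    def stop(self):
        self._stop.set()
        self._thread.join()

# Usage with a fake reading instead of the MPU6050:
ticks = iter(range(1000))
sensor = Sensor(lambda: next(ticks))
time.sleep(0.05)            # let a few samples arrive
snapshot = sensor.latest()  # most recent sample
sensor.stop()
```

This way the 10 ms main loop and the ~6 ms sensor loop run at independent rates, which matches the timings reported in the test post above.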

MPU6050 is a pure interface between the Raspberry and the sensor hardware. If you want to use a different sensor, you only need to rewrite this specific class.

Pid is the object that includes the calculation for the proportional, integral and derivative control.
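The core of such an object can be sketched as follows (my illustration of the standard PID form, not necessarily the actual pid.py code):

```python
class PID:
    """Basic proportional-integral-derivative controller (sketch)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_error = None

    def update(self, setpoint, measured, dt):
        """Return the control output for one step of duration dt [s]."""
        error = setpoint - measured
        self._integral += error * dt
        # No derivative contribution on the very first sample.
        if self._prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative

# A purely proportional controller scales the error by kp:
pid = PID(kp=0.5, ki=0.0, kd=0.0)
out = pid.update(setpoint=10.0, measured=0.0, dt=0.01)  # 0.5 * 10 = 5.0
```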

Alfa3 test. PID introduction and more…

Starting with this series of tests, Alfa3, I added some new features that move the project toward the final setup. In particular I added the following modifications, which allow the system to work wireless:

  1. I added a second ESC and a second motor.
  2. I modified the connector board, adding a new 3-pin connector (C in the photo) that carries the zero volt (black wire), the signal (white wire), which is bridged with the 4th signal (green wire), and also the 5 volt (red wire). In this way I can choose whether the rpi is powered by the ESC (C connection) or not (B connection).
  3. So the schema for this test consists of connecting motor M[0] in A and motor M[1] in C.
  4. I also connected my wifi dongle to the rpi USB port. I discovered that neither the PC nor the wifi dongle can act as an access point. You can verify this using the command:

iw list | grep -A 10 "Supported interface modes"

So I used my smartphone, activating its wifi hotspot function, to build up a network including the PC and the rpi.

Below are the conventions used in the test:


  • M[0] turns counterclockwise; the red wire is connected to the “T” wire of the ESC (the one closer to the T of the Turnigy logo), the yellow wire to the “Y” wire, and the black wire to the center wire of the ESC.
  • M[0] mounts a standard left prop.
  • M[1] turns clockwise; the yellow wire is connected to the “T” wire of the ESC, the red wire to the “Y” wire, and the black wire to the center wire of the ESC.
  • M[1] mounts a right prop (marked R).
  • According to the position of the IMU, I get a negative roll rotation if M[0] moves down.

I have already done some tests with this configuration. In the next post: the first results and the description of the new sw module rc.py, which is the remote control module.

Alfa1 test. Preparation

During the last weekend I was busy, so it was not possible to continue the testing.

Anyway, here comes a new post where I describe the code I created for what I call the alfa tests, a session of experiments to prepare the final code.

Alfa1 test includes the following functionalities:

  1. manual motor jogging
  2. IMU reading
  3. data logging (motor w and IMU angles)

The actions I want to take during this test are:

  1. verify the disturbance on IMU data due to motor vibrations
  2. test different IMU filter values
  3. start the motor and try to understand what the hover w (motor angular speed) is for the current configuration
  4. start the motor and understand how a delta w can influence the angle of the quadcopter
  5. mount different weights on the quadcopter arm and see the different wh (hover w), in order to verify my math model
  6. measure the actual motor angular speed with my smartphone (just curious about the possibility of measuring the real w with a sound spectrum analyzer).

I hope to start the alfa1 test tomorrow; then I will share the code.

Class sensor.py ready!

After some complementary activities (building a new table for the next experiments, for example) I finalized the tests on the IMU and created this sensor object that can run in a parallel thread, so I can get the latest sensor information at any time (every 5–8 ms) while my main quadcopter loop is running.
You can find and download the sensor_test example. When you run it, you can see on the terminal the current roll, pitch and yaw of your IMU.

The available data are: angular position [deg], angular rate [deg/sec], linear acceleration [m/sec²].

The main differences with respect to IMU_test are:

  • I saw that the mpu6050 initialization of some parameters can fail. The order in which you set the IMU parameters can affect the setting itself, so I added a function to verify that the parameters are in fact what I chose.
  • The accelerometer vector has been normalized, so its length, when the sensor is not moving, is equal to gravity. This can help if I want to integrate it to estimate the linear speed and the linear movement.
  • the class sensor.py can run in a parallel thread, so at any time I can read the current sensor values.
  • minor adjustments to the naming of the variables.
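The normalization mentioned above can be sketched like this (my illustration; the axis order and the 9.8 constant are assumptions, and the scale factor should be computed with the sensor at rest):

```python
import math

G = 9.8  # gravity [m/s^2]

def normalize_to_gravity(acc):
    """Scale an accelerometer vector so that, at rest, its length equals g."""
    ax, ay, az = acc
    length = math.sqrt(ax * ax + ay * ay + az * az)
    scale = G / length
    return (ax * scale, ay * scale, az * scale)

# At rest the raw reading may be slightly off (e.g. 9.6 instead of 9.8):
calibrated = normalize_to_gravity((0.0, 0.0, 9.6))
# the length of `calibrated` is now G
```

With the rest length pinned to g, any later integration of the acceleration starts from a vector whose static part is known exactly.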

Next steps are:

  • Buy some connectors and do some soldering to have a final version of the wiring
  • mount the sensor on the frame, test it with running motors, and see if I need more severe filtering.

Tutorial: How to read data from IMU

In the previous post I described how to setup raspberry pi for connection with the IMU.

Now it is time to see how to read some data from the sensor.

First you need software that manages the i2c interface. There are many examples; you can find one called adafruit_i2c.py on github.

Then it is necessary to have the code specific to the sensor, in my case an MPU6050.

I tried for some days to build my own code, but I encountered problems related to inconsistent results: even if I did not move the sensor, the returned values were always different. I suspect it was a problem with how I formatted the values.
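One classic formatting pitfall with this sensor, and a plausible cause of readings like these, is that each MPU6050 measurement is a signed 16-bit big-endian value split across two registers; skipping the two's-complement step turns small negative readings into huge positive ones. A sketch of the correct conversion:

```python
def to_int16(high, low):
    """Combine two I2C register bytes into a signed 16-bit value."""
    value = (high << 8) | low   # big-endian: high byte first
    if value >= 0x8000:         # two's complement for negative readings
        value -= 0x10000
    return value

# 0xFF 0xFE is -2, not 65534:
reading = to_int16(0xFF, 0xFE)  # -2
```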

Finally, thanks to the great job done by Hove in his blog, I used his code and I'm now able to collect correct data from the sensor.

I did some minor modifications and prepared these IMU_test files.

So I started some preliminary tests to verify the sensor behaviour.

I fixed the sensor on a bar, horizontally, then turned the bar by a known angle (13 degrees, measured with my smartphone level), then moved it back to horizontal.


I recorded the sensor data to a file: acceleration along the axes (from the ACCelerometer) and rotational speed (from the GYROscope). In an excel sheet I calculated the angle around x from the ACC and from the GYRO:

  • rx ACC = DEGREES(ATAN2(accZ+9.8; accY))
  • rx(i) GYRO = wx(i)*dt + rx(i-1)
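The same two estimates in Python look like this (note that Excel's ATAN2(x; y) is math.atan2(y, x); the +9.8 offset follows the spreadsheet above and assumes accZ reads about -9.8 at rest):

```python
import math

def roll_from_acc(acc_y, acc_z):
    """Roll [deg] from the accelerometer: DEGREES(ATAN2(accZ+9.8; accY))."""
    return math.degrees(math.atan2(acc_y, acc_z + 9.8))

def roll_from_gyro(prev_roll, w_x, dt):
    """Roll [deg] by integrating the gyro rate: rx(i) = rx(i-1) + wx(i)*dt."""
    return prev_roll + w_x * dt

# Integrating a constant 13 deg/s rate for 1 s gives the 13-degree turn:
roll = 0.0
for _ in range(100):
    roll = roll_from_gyro(roll, w_x=13.0, dt=0.01)
```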

Below you can see the graph.

(graph: roll angle computed from the ACC and from the GYRO)

I underline in the picture the 2 typical problems of an IMU:

  • the gyro drift (you can see an angle of 1 degree while in reality it was 0)
  • the accelerometer's sensitivity to noise.

So the next development step is to filter/reduce these 2 problems by combining the results of the 2 sensors.
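A common way to combine them is the complementary filter: trust the gyro over short intervals and let the accelerometer slowly pull the drift back. This is a sketch of that standard technique, not code from this project, and the 0.98 coefficient is an assumption to be tuned.

```python
def complementary_filter(prev_angle, gyro_rate, acc_angle, dt, alpha=0.98):
    """Blend gyro integration (short-term) with the accelerometer (long-term).

    prev_angle and acc_angle in degrees, gyro_rate in deg/s, dt in seconds.
    """
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * acc_angle

# With the sensor at rest, a previously accumulated 1-degree gyro drift is
# pulled back toward the accelerometer's 0 instead of persisting forever:
angle = 1.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.0, acc_angle=0.0, dt=0.006)
# angle has decayed toward 0
```

The accelerometer noise is averaged away by the small (1 - alpha) weight, while the gyro drift cannot accumulate: exactly the two problems shown in the graph.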