Proof of concept: Using Unity to control a servo

…With the purpose of assessing the suitability of Unity as a ‘world simulator’ for robotics applications, and as a test bed for evolving motor control networks!


Telemus: A remotely operated vehicle based on Raspberry Pi and Arduino

NASA’s Curiosity rover is just awesome. I wanted one in my life, and so set about thinking and building… To what extent could I construct a similar robot using bits and pieces available to mortal men? I’m not interested in sampling rocks or seeking out evidence of extraterrestrial activity just yet, and so I boiled my desires down to simple remote control with a video feed – it’d be awesome fun to trundle around my house with a telepresence droid, from anywhere in the world, much like NASA are able to host a remote human presence on a planet 225 million kilometres away.

Planning

Acknowledging that such a construction was not exactly an insignificant undertaking, I spent some time in project manager mode considering my approach. I settled on assembling my droid as a series of independent engineering work ‘streams’, that when brought together, would result in the main goal: an internet controlled Remote Operated Vehicle (ROV).

Core: a computer connected to the internet that can receive commands and tell motors to move on demand. The computer has a camera attached, and can present its video stream via the internet.

I planned to use a Raspberry Pi as my onboard computer. This little hackable Linux machine has low power requirements, and I’ve seen some impressive things done with it, including numerous projects involving cameras and motors. The Raspberry Pi is not a great candidate for controlling motors directly from its GPIO pins though: motors draw a lot of current, and the tiny computer is built for very low current logic output.

In a biological system, it’s typical for a motor intention (e.g. ‘move my arm in this way’) to pass from a relatively complex ‘high level’, down through successively simpler layers of control, and finally into mechanical effectors: the brain controls muscles via the motor cortex, for instance. I thought that this methodology would be a great way to get my Raspberry Pi communicating with physical reality – by pushing commands to a ‘low level’ motor controller – for my purposes, an Arduino microcontroller. But what’s the best way to get a message from the Pi to the Arduino? I established (‘forked’) a project stream, Bridge, which was intended to solve this.

Setting up a web camera on the Raspberry Pi was a critically important task, so I gave it a stream title: Eye. Similarly, internet connectivity is an essential work module (connectivity is not provided ‘out of the box’ with the RPi), and so another stream was forked: Uplink, which was to establish WiFi internet connectivity.

Vessel: I’d need to build a scaffold to hold everything together, with wheels to form a mobile platform. It’d need a drive system with sufficient torque to deal with basic obstacles (carpet/hard floor transitions), and to stay put when not driving.

Juice: There are energy requirements that must be met – a battery or batteries sufficient for both motor control and core computer power.

Soul: In order to animate the whole thing, I’d need to do some software development in a couple of sub work-streams. Firstly I’d have to present a control interface to the user (me at a remote machine), so that I could send commands to some process running on the robot ‘host’ computer. This should then push commands onto its motor control layer, to ultimately translate my control intentions into motor actions.

Production

It’s cool to think that if I were ‘in industry’, I could farm out each stream to a separate team – that’d allow something really big and cool to be produced efficiently; just like at NASA! However, I was tackling my project alone (kind of, as you’ll see), and so worked through the streams one after another, switching to a different task whenever I got stuck or bored. What follows is a debrief of each work stream – hopefully some of the things I learned will assist others looking to achieve similar goals.

Uplink – Raspberry Pi WiFi Connectivity
After a bit of internet research targeting ‘cheap wifi dongle for Raspberry Pi’, I hoped to follow the success of others with the Edimax 7811un WiFi dongle. Unfortunately for me and my year-out-of-date Raspbian Linux distro, this did not work at all with the prebuilt driver modules helpfully posted by others on the web. Not fully confident about building from source for my distro (as a Linux noob), I tried to get the prebuilt drivers working with an updated Raspbian distro. Failed again – now the distro was too new for the drivers! I then tried an entirely different OS, Arch Linux, hoping the device would work immediately with the included drivers, but was totally put off by the different command environment (I really am a Linux noob). I finally tried Occidentalis 0.2 (based on familiar Raspbian, and with a lot of hobbyist-centric modules preloaded); after some very simple configuration, the Edimax worked ‘out of the box’!

This was a difficult project stream for me, since it involved a lot of learning about Linux hardware compatibility and setup – the entire Linux hardware configuration process seems so arbitrary; options are littered about various obscure places on the core file system, and little sensible help aimed at newcomers (who want to understand the process rather than just copy some commands) can be found on the web. Occidentalis came ‘pre-baked’ and ready to go: a perfectly simple solution to wireless internetting on the RPi, with a dirt cheap WiFi dongle.

Eye – webcam streaming with Raspberry Pi
Similar problems to those experienced with Uplink were encountered whilst setting up the webcam (the lesson: Linux hardware setup is a pain). The main issue was finding a Linux distro that supported both the Edimax dongle for WiFi and my webcam (a Microsoft LifeCam 3000). Occidentalis 0.2 totally saved the day, with support for my camera already built in, and my camera hardware was ready to use. I also tried a really cheap camera, the Microsoft Xbox Live Vision (around a fiver including postage via Amazon), and this worked beautifully too, with no setup required! I found this hardware compatibility resource particularly useful whilst shopping for suitable hardware: http://elinux.org/RPi_VerifiedPeripherals

Actually streaming the video from my functional camera turned out to be a mammoth research and development endeavour – I tried various systems designed to stream a camera feed to the internet with no success; the popular programs ffserver, avserver and motion all failed at various points during investigation, each with obscure errors. I was running out of patience after days getting nowhere, until I happily stumbled upon mjpg-streamer (installed from the command line like other Linux programs using apt-get) – it worked immediately, and it worked so very well. Furthermore, the video stream format hosted by this program was ready to drop straight into a webpage, which is perfect for my requirements – rather than writing a ‘native’ desktop program to control the robot remotely, I’d simply be able to use a web page to build my control interface: much simpler.

Bridge – communication from Raspberry Pi to Arduino
There turn out to be numerous methods of getting a computer to talk to another bit of hardware: serial, I2C, etc. I wanted a quick and easy solution, so opted for basic digital logic output directly through the Raspberry Pi’s general purpose input/output (GPIO) pins. I can define my motor control intentions in terms of binary states, and by setting the logic voltage of the pins high for 1 and low for 0, I can push the control intention to my Arduino:

Each motor has three states, which can be described by two bits: a ‘drive’ bit and a ‘reverse’ bit.

idle: 00 (both ‘drive’ and ‘reverse’ bits zero)
driving forwards: 01 (reverse bit 0, drive bit 1)
driving backwards: 11 (reverse bit 1, drive bit 1)

…which means that I need just four bits to control two motors according to the following:

idle – both motors stationary: 0000
driving forwards – both motors turning forwards: 0101
driving backwards – both motors turning backwards: 1111
turning left – left motor turning backwards, right motor forwards: 1101
turning right – right motor turning backwards, left motor forwards: 0111

Further consideration reveals that three bits can actually encode up to eight distinct states – more than enough for my five commands, freeing up one pin from the scheme above – but I eventually stuck with keeping things simple and obvious, using four bits according to the scheme above.
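To make the composition of those patterns concrete, here’s a tiny sketch of the scheme (a hypothetical Python model – on the robot itself these bits simply become high/low levels on four GPIO pins; the function names are my own):

```python
# Sketch of the four-bit drive scheme: each motor contributes two bits,
# in the order (reverse, drive), with the left motor in the high pair
# and the right motor in the low pair. Names are illustrative only.
def motor_bits(direction):
    """Two bits for one motor: '00' idle, '01' forwards, '11' backwards."""
    return {"idle": "00", "forwards": "01", "backwards": "11"}[direction]

def drive_pattern(left, right):
    """Compose the full four-bit pattern for both motors."""
    return motor_bits(left) + motor_bits(right)
```

For example, `drive_pattern("backwards", "forwards")` reproduces the ‘turning left’ pattern `1101` from the list above.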

The voltage output from the Raspberry Pi is 3.3V, whilst the voltage expected by the Arduino inputs is 5V. Whilst it’s possible to connect the Pi directly to the Arduino (3.3V reads perfectly well as a logic 1 – the ‘high’ threshold on the Arduino is closer to 2.8V in my experience), I wanted to ‘do things properly’ with a logic buffer. I tried pushing my bits from the RPi to a 74HCT541 octal buffer IC, with 5V on the Arduino side. Using the chip to ‘drive’ inputs to the Arduino worked okay until a fourth ‘high’ output was demanded from it – there seemed to be a voltage drop associated with each high output, and with all four bits set on (1111) the informational state was very unstable, with ‘high’ represented by slightly less than 2.8V on each pin on the Arduino side.

A conversation with a seasoned industrial engineer (thanks dad!) introduced me to the difference between ‘source’ and ‘sink’ drive as it pertains to IC circuits – I had assumed that ‘source’ mode was how every IC worked! A quick bit of prototyping using a ‘sink’ setup to drive four LEDs worked much better than the source drive arrangement; the logic voltage remained consistent with each additional high output from the octal buffer. But I just couldn’t figure out how to drive my Arduino inputs in this way – ‘pulling’ current from the Arduino to represent a logic 1 (I’m also an electronics newbie, though in retrospect I’m sure this could be done with a PNP transistor arrangement).

I ditched the IC and tried an entirely different approach using plain old transistors to translate my Raspberry Pi output voltage from 3.3V to 5V, but experienced a voltage drop across the transistor that left my 5V below 2.8V (regardless of swapping out resistors, changing transistor types, etc.) – again, no good!

Given all the electronics grief thus far, I ended up just plugging the RPi directly into the Arduino (sometimes the simplest solutions are the best), acknowledging that this is perfectly safe since the connected Arduino pins are in input mode – they won’t be dropping 5V onto the RPi anytime soon. I also noted that I seriously need to spend more time working with electronics to understand what caused my numerous issues during this work stream.

Vessel – a simple wheeled robot body
Possibly the easiest task of all: have some fun with Meccano (a great modular construction system with enough robustness for light robot applications). I looked into the possibility of adding tank tracks (a belt sander reel cut in half lengthways), but this didn’t work out as intended – I preferred a smaller frame than dictated by the size of the belt, so wheels it was.

By modifying a couple of servo motors so that they rotated continuously in either direction (rather than the fixed angle default), I had some high torque drive units, with easy control using the pulse width modulation output from the Arduino.

Juice – Power supply for Pi and motors
Whilst browsing the discounted goods aisle in Maplin, I found a 9000mAh rechargeable power pack with USB output, intended as a supplementary charging source for mobile devices. It’s pretty heavy, but the capacity is massive! A first trial with my WiFi connected Raspberry Pi powered by this little brick proved that it could idle the Pi happily for fourteen hours.

My drive servos require a good bit of power, however (servos typically need 6V, and a fair bit of current) – I didn’t want to put too great a demand on my new power source, and so thought it’d do well as the ‘CPU’ power source for both the Arduino and Raspberry Pi. I’d use a separate pack of 4 AA batteries (~6V) to drive my motors.

I built a small ‘power station’ circuit to allow me to source power from each supply, with a common ground rail connected to the rest of the robot circuitry by a master on/off switch. I learned a core lesson in electronics at this point: batteries of different voltages can happily be combined so long as their grounds are connected. I once couldn’t comprehend how this was possible without one battery ‘leaking’ charge into the other, leaving the batteries ‘unbalanced’. Such a thing just doesn’t happen – for charge to enter a battery at one terminal, an equal charge must leave at the other, so connecting multiple batteries to a common ground simply gives the charge from each a pathway to complete its circuit. There’s no adverse effect from ‘excess’ charge crossing from one battery to the other; a simple fact, but fundamental to my growing practical understanding.

Soul – Programming a telepresence ‘droid
As a software developer by trade, I’m on familiar ground here. A bit of research into how the Raspberry Pi’s GPIO pins could be controlled via the internet led me to discover a set of interconnected software elements called WebIOPi – an *amazing* bit of work by a single person. Not only did this software fulfill my requirements almost immediately, all the way from the front end control interface down to GPIO action on the Pi, but the author (blog linked here) was in the process of writing an article for a hobbyist magazine about making a ‘cambot’ – refer to issues 9 and 10 of The MagPi; basically the same as my own project! Furthermore, he’d kindly committed his control interface source to the repository for use by anyone (with files supporting the written article also available here)… In this respect I cannot claim credit for this portion of the ‘Soul’ work stream – it turned out to have been fulfilled already by another skilled engineer. With a very few small tweaks (and a donation where it’s due to the author of WebIOPi), my project requirements were met.

To summarise the architecture briefly: WebIOPi hosts a webserver on the Raspberry Pi, upon which HTML content can be accessed remotely using a web browser. On this webserver I can place a simple HTML document that uses WebIOPi’s JavaScript API to communicate with a Python script running on the Raspberry Pi; this sets the state of the GPIO pins based on commands originating from button presses on the web page. ‘Macro’ commands can be defined, and so it’s easy to define a ‘forward’ button as meaning ‘0101’ on any four GPIO pins (the intention to drive both my motors accordingly), for example.
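To give a rough feel for the macro layer, here’s an illustrative stand-in for the Python side – not the actual WebIOPi script (the real version wraps this logic in WebIOPi’s macro mechanism and its GPIO calls), and the pin numbers here are invented:

```python
# Illustrative model only: a named command arriving from the web page
# becomes four GPIO levels. write_pin is a stand-in for a real GPIO
# write so the logic can run anywhere; pin numbers are hypothetical.
DRIVE_PINS = [17, 18, 22, 23]

MACROS = {
    "forward":  "0101",
    "backward": "1111",
    "left":     "1101",
    "right":    "0111",
    "stop":     "0000",
}

def run_macro(name, write_pin):
    """Write each bit of the named macro's pattern to its drive pin."""
    for pin, bit in zip(DRIVE_PINS, MACROS[name]):
        write_pin(pin, bit == "1")
```

A button press on the web page ultimately triggers something like `run_macro("forward", …)`, spreading the pattern `0101` across the four pins.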

The final portion of programming was a very simple bit of code on the Arduino. A loop monitors input on the four pins connected to the Raspberry Pi’s GPIO, and sets the ‘angle’ desired by my modified servo motors accordingly, via PWM output. The commanded angle will never be reached by my modified servos, so they’ll spin continually in one or other direction, with a speed proportional to how far the angle is set from the stop (centre) position.
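The bit-reading and angle-setting logic can be modelled in a few lines (a Python sketch rather than the actual Arduino code – the angle values are illustrative, the key idea being that a continuous-rotation servo treats the centre position as ‘stop’ and the offset from centre as direction and speed):

```python
def motor_command(drive, reverse, stop_angle=90, speed=90):
    """Map one motor's two input bits to a servo 'angle'.

    A modified (continuous-rotation) servo never reaches the commanded
    angle, so the offset from the stop position sets direction and
    speed. The specific angle values here are illustrative.
    """
    if not drive:
        return stop_angle           # idle: hold centre, motor stops
    if reverse:
        return stop_angle - speed   # drive + reverse: spin backwards
    return stop_angle + speed       # drive only: spin forwards

def decode(bits):
    """Split the four input bits into (left, right) servo angles.

    Bit order follows the scheme above: (reverse, drive) per motor,
    left motor in the high pair.
    """
    left = motor_command(drive=bits[1] == "1", reverse=bits[0] == "1")
    right = motor_command(drive=bits[3] == "1", reverse=bits[2] == "1")
    return left, right
```

So the ‘turning left’ pattern `1101` decodes to the left servo spinning backwards and the right servo forwards.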

Final assembly
A bit of insulating tape to hold everything in place and it’s good to go! My robot’s name: Telemus – a cyclops from legend, and a nod towards the word ‘telemetry’! The project was a success: a lot of learning during construction, and a lot of fun to be had trundling around my own local planet Mars. A video of my rover in action follows…


Project complete

The future

I’d really like to add a set of high power LEDs to ‘light up the dark places’ – their on/off state will easily be controlled with one GPIO pin connected to a transistor switch.

I’d also really like my robot to be able to tackle stairs! This will require both a larger body scale and a better locomotion system than wheels – tracks might do it, but I’m highly intrigued by Boston Dynamics’ RHex robot, with its leg/wheel crossover type locomotion system…

A larger body could support a beefier power system for drive. It might be interesting to add a manipulator or arm of some kind, for basic interaction with the environment – possibly even aid with raising the front of the (tracked) robot’s body for mounting stairs.


AppHack-e :: ‘Appcopter’ toy helicopter, hacked. Part II: Flying robot

In the previous post of this two-part series, I revealed the process by which the binary control signal for the ‘Appcopter’ toy could be deciphered. This was the first step towards the ultimate goal of commanding the helicopter to perform a sequence of predefined manoeuvres, like the well-known educational toy robot ‘Turtle’.

The remaining task is to write a program capable of generating an audio signal that looks like the image below. Chattering the signal out of a computer’s line out port and into the transmitter will send the toy off doing whatever I tell it to, so long as the parameters are within bounds – “fly, my pretty, fly!”.

Throttle at 10%

Generating the signal

I set out to use Processing (a Java-based programming environment) as my tool of choice; it’s free, cross-platform, and pretty easy to get into. Its audio support ‘straight out of the box’ isn’t quite adequate for easy audio manipulation, so I opted to plug in a third party library called Minim to manage my requirement for dynamic audio generation.

Brace yourselves for some code – there’s no way to make this section any easier than simply posting the full code with explanatory comments. Feel free to use this source as you wish (evil robot swarms excluded).


// IMPORTS

// requires Minim audio library: http://code.compartmental.net/tools/minim/
import ddf.minim.*;
import ddf.minim.signals.*;
import ddf.minim.effects.*;
// Java's AudioFormat is required by Minim's createSample() Function
import javax.sound.sampled.AudioFormat ;

// VARS
Minim minim ; // minim instance
int outputSampleRate = 44100 ; //audio sample rate (hz)

// arrays storing prefabricated signal elements - a full signal is built from
// assemblies of these elements in the correct order for a particular message
float[] sig_open ;
float[] sig_0 ;
float[] sig_1 ;

// affects how quickly one program value becomes another - rather than change e.g. throttle abruptly, smoothly ease it from one value to the next
float ease = 0.75 ;

// 'current' control values - these are the values that are actually sent to the transmitter. They are based on the desired value contained in the current instruction, modified by easing
int cThrottle ;
int cPitch ;
int cYaw ;
int cTrim ;
//

// signal is being generated and transmitted during main update
boolean signalOn = false ;

// time at which the program began (used to compare time to next instruction)
int programStartTime ;

// a modifiable audio sample to host the signal data
AudioSample signal ;

// In the same way that a washing machine has a program - 'prewash, main wash, spin' - I want to be able to say 'take off, fly forwards a bit, turn left', etc. I use a list of Instruction objects, with a variable pointer to the current instruction. The suffix '_ch1' indicates that this could be made into an array of instruction arrays, thus supporting multiple airborne agents simultaneously controlled by one computer
int programIndex_ch1 = 0 ; // current program index (1)
Instruction[] program_ch1 = new Instruction[0] ; // program instruction set (1)

// CONSTANTS
int MIN_THROTTLE = 0 ;
int MAX_THROTTLE = 127 ;
int DEFAULT_THROTTLE = 0 ;
int MAX_PITCH = 0 ; // note: the pitch values run 'inverted' - 0 is full forwards
int MIN_PITCH = 63 ; // ...and 63 is full backwards
int DEFAULT_PITCH = 32 ;
int MIN_YAW = 0 ;
int MAX_YAW = 31 ;
int DEFAULT_YAW = 16 ;
int MIN_TRIM = 0 ;
int MAX_TRIM = 63;
int DEFAULT_TRIM = 31 ;
int DEFAULT_CHANNEL = 1 ;
int DEFAULT_SPROG = 2 ; // in the last post I had no idea what this was - it apparently controls the on/off state of the bright white nose light, but may also be something to do with resetting the helicopter control board between flights!
//
int throttleBits = 7 ;
int pitchBits = 6 ;
int yawBits = 5 ;
int trimBits = 6 ;
int channelBits = 2 ;
int sprogBits = 2 ;
//

// INITIALISATION
void setup()
{
 frameRate(6) ; // feels about right: repeats the full signal every 6th of a second
 size(256, 256); // any old window size - doesn't really matter since there's no UI

 minim = new Minim(this); // make a Minim audio instance

 int i = 0 ; // a counter

 // 0, 1 prefix: each signal value (0 or 1) is preceded by a fixed length output close to zero. In this block of code I create this data 'building block', in such a way that it can be included in order with other building blocks, and a full signal thus made
 float[] sig_prefix ;
 float prefixSignalDuration_ms = 0.41 ; // the duration of the signal element
 int prefixSignalNumSamples = round( outputSampleRate * (prefixSignalDuration_ms/1000.0) ) ;
 sig_prefix = new float[prefixSignalNumSamples] ;
 for( i = 0 ; i < prefixSignalNumSamples ; i++) sig_prefix[i] = 0.01 ; // set a value very close to zero

 // open: construct the 'on' signal building block, which precedes the data values in every full signal
 float openSignalDuration_ms = 1.6 ;
 int openSignalNumSamples = round( outputSampleRate * (openSignalDuration_ms/1000.0) ) ;
 sig_open = new float[openSignalNumSamples] ;
 for( i = 0 ; i < openSignalNumSamples ; i++) sig_open[i] = 1.0 ; // the 1 signal data (set to -1 for MacOS?)

 // the '0' value building block
 float zeroSignalDuration_ms = 0.4 ;
 int zeroSignalNumSamples = round( outputSampleRate * (zeroSignalDuration_ms/1000.0) ) ;
 float[] tSig_0 = new float[zeroSignalNumSamples] ;
 for( i = 0 ; i < zeroSignalNumSamples ; i++) tSig_0[i] = 1.0 ; // the 0 signal data (set to -1 for MacOS?)
 sig_0 = concat( sig_prefix, tSig_0 ) ; // prepend the 0, 1 prefix building block from above

 // the '1' value building block
 float oneSignalDuration_ms = 0.8 ;
 int oneSignalNumSamples = round( outputSampleRate * (oneSignalDuration_ms/1000.0) ) ;
 float[] tSig_1 = new float[oneSignalNumSamples] ;
 for( i = 0 ; i < oneSignalNumSamples ; i++) tSig_1[i] = 1.0 ; // the 1 signal data (set to -1 for MacOS?)
 sig_1 = concat( sig_prefix, tSig_1 ) ; // prepend the 0, 1 prefix building block from above

 // make a container for the full signal data - an assembly of the building blocks above in whatever order we choose. Length is some arbitrary value big enough to contain the entire signal
 int nullSignalNumSamples = 2000 ;
 float[] nullSig = new float[nullSignalNumSamples] ;
 for( i = 0 ; i < nullSignalNumSamples ; i++) nullSig[i] = 0 ; // set the full signal to zero

// build our audio sample, using our 'nullSignal' data source to drive it
 AudioFormat signalAudioFormat = new AudioFormat( outputSampleRate, 16, 1, true, false ) ;
 signal = minim.createSample( nullSig, signalAudioFormat ) ;

 // That's the building blocks ready: now use them to set up our custom helicopter control program!
 initProgram() ;
}

void initProgram()
{
 // as the battery drains, the appcopter loses power - it needs proportionally more throttle to climb to the same altitude, etc. tScale value allows uniform tweaking of the full program to keep the helicopter behaving consistently throughout multiple runs, across the entire range of the battery's charge
 float tScale = 0.5 ;

 float t = 0 ; // time offset

  int myTrim = DEFAULT_TRIM - int(11 * tScale) ; // my heli needs this - one too many crashes I think!

  // spin up and stabilise - send '0' to the 'sprog' param briefly; seems to be the way that the proper Appcopter app does it (reset?)
  program_ch1 = appendInstruction( program_ch1, t, 0, DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, 0 ) ;
  program_ch1 = appendInstruction( program_ch1, t+=1, 0.3, DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  //

  // fly, my pretty, fly... Draw a square.

  // lift off and stabilise
  program_ch1 = appendInstruction( program_ch1, t+=1, 1 * tScale , DEFAULT_PITCH, DEFAULT_YAW, DEFAULT_TRIM, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.6, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  //

  // forwards - pitch forwards for a time
  program_ch1 = appendInstruction( program_ch1, t+=1, 1 * tScale , MAX_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  // strip momentum and stabilise - pitch briefly back to arrest forwards momentum and return to hover
  program_ch1 = appendInstruction( program_ch1, t+=1.5, 1 * tScale , MIN_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.3, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // turn 90 - yaw full for a brief time
  program_ch1 = appendInstruction( program_ch1, t+=1, 0.9 * tScale , DEFAULT_PITCH, MIN_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.5, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // forwards
  program_ch1 = appendInstruction( program_ch1, t+=1, 1 * tScale , MAX_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  // strip momentum and stabilise
  program_ch1 = appendInstruction( program_ch1, t+=1.5, 1 * tScale , MIN_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.3, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // turn 90
  program_ch1 = appendInstruction( program_ch1, t+=1, 0.9 * tScale , DEFAULT_PITCH, MIN_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.5, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // forwards
  program_ch1 = appendInstruction( program_ch1, t+=1, 1 * tScale , MAX_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  // strip momentum and stabilise
  program_ch1 = appendInstruction( program_ch1, t+=1.5, 1 * tScale , MIN_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.3, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // turn 90
  program_ch1 = appendInstruction( program_ch1, t+=1, 0.9 * tScale , DEFAULT_PITCH, MIN_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.5, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // forwards
  program_ch1 = appendInstruction( program_ch1, t+=1, 1 * tScale , MAX_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  // strip momentum and stabilise
  program_ch1 = appendInstruction( program_ch1, t+=1.5, 1 * tScale , MIN_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.3, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // turn 90
  program_ch1 = appendInstruction( program_ch1, t+=1, 0.9 * tScale , DEFAULT_PITCH, MIN_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;
  program_ch1 = appendInstruction( program_ch1, t+=0.5, 0.9 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // landing - throttle down to allow safe descent
  program_ch1 = appendInstruction( program_ch1, t+=1, 0.55 * tScale , DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, DEFAULT_SPROG ) ;

  // shutdown - set 'sprog' to 1, again seems to be the way appcopter app does it, but seems maybe unnecessary?
  program_ch1 = appendInstruction( program_ch1, t+=1.5, 0, DEFAULT_PITCH, DEFAULT_YAW, myTrim, DEFAULT_CHANNEL, 1 ) ;
}

// construct an instruction and put it into the control program array
Instruction[] appendInstruction( Instruction[] program, float time_s, float throttle, int pitch, int yaw, int trimOffset, int channel, int sprog )
{
  Instruction instruction = new Instruction( time_s, throttle * MAX_THROTTLE, pitch, yaw, trimOffset, channel, sprog ) ;
  program = (Instruction[]) append( program, instruction ) ;
  return program ;
}

// the main update loop
void draw()
{
 if( ! signalOn ) return ; // don't do anything if the signal is off

 // otherwise...
 readProgram() ; // fetch the latest instruction
 sendData() ; // transmit it as audio
}

void readProgram()
{
 float currentTime_s = float(millis() - programStartTime) / 1000.0 ;

 int nextInstructionIndex = programIndex_ch1 + 1 ;

 if( nextInstructionIndex < program_ch1.length )
 {
 Instruction nextInstruction = program_ch1[nextInstructionIndex] ;
 if( nextInstruction.time_s < currentTime_s )
 {
 programIndex_ch1 = nextInstructionIndex ; // next instruction is now due: step the program index along
 Instruction instr = program_ch1[ programIndex_ch1 ] ;
 println( "Set instruction: @time: " + instr.time_s + " throttle: " + instr.throttle + " pitch: " + instr.pitch + " yaw: " + instr.yaw + " trimOffset: " + instr.trimOffset + " sprog " + instr.sprog ) ;
 }
 }
}

void sendData()
{
 float[] buffer = signal.getChannel(1) ;

 int offset = 0 ;

 Instruction instr = program_ch1[ programIndex_ch1 ] ;

 // ease
 cThrottle += (instr.throttle - cThrottle) * ease ;
 cPitch += (instr.pitch - cPitch) * ease ;
 cYaw += (instr.yaw - cYaw) * ease ;
 cTrim += (instr.trimOffset - cTrim) * ease ;
 println( "Sent data: Throttle: " + cThrottle + " Pitch: " + cPitch + " Yaw: " + cYaw + " Trim: " + cTrim ) ;
 //

 // open
 offset = writeData( sig_open, buffer, offset ) ;

 // throttle
 offset = writeValue( throttleBits, cThrottle, buffer, offset ) ;

 // pitch
 offset = writeValue( pitchBits, cPitch, buffer, offset ) ;

 // yaw
 offset = writeValue( yawBits, cYaw, buffer, offset ) ;

 // trim
 offset = writeValue( trimBits, cTrim, buffer, offset ) ;

 // channel
 offset = writeValue( channelBits, instr.channel, buffer, offset ) ;

 // sprog
 offset = writeValue( sprogBits, instr.sprog, buffer, offset ) ;
 //offset = writeData( sig_1, buffer, offset ) ;
 //offset = writeData( sig_0, buffer, offset ) ;

 //
 signal.trigger() ;
}

void keyPressed()
{
 // any keypress: toggle signal on / off
 signalOn = ! signalOn ;

 if( signalOn )
 {
 // reset the values to default when the signal turns back on
 cThrottle = DEFAULT_THROTTLE ;
 cPitch = DEFAULT_PITCH ;
 cYaw = DEFAULT_YAW ;
 cTrim = DEFAULT_TRIM ;
 programStartTime = millis() ;
 programIndex_ch1 = 0 ;
 println( "program started" ) ;
 }
 else
 {
 println( "program stopped" ) ;
 }
}

int writeValue( int numBits, int value, float[] target, int offset )
{
 byte tByte = (byte)value ;

 // nudge extra bits off
 int mask = 0x80;
 int bitDiff = 8 - numBits ;
 for( int i = 0 ; i < bitDiff ; i++ ) mask >>= 1;

 while( mask > 0 )
 {
 if( (mask & tByte) != 0)
 {
 offset = writeData( sig_1, target, offset ) ;
 //println( "wrote 1" ) ;
 }
 else
 {
 offset = writeData( sig_0, target, offset ) ;
 //println( "wrote 0" ) ;
 }
 mask >>= 1;
 }
 //println( ".." ) ;
 return offset ;
}

int writeData( float[] src, float[] target, int offset )
{
 int i ;
 for( i = 0 ; i < src.length ; i++ ) target[offset + i] = src[i] ;
 return offset + i ;
}

void stop()
{
 minim.stop();
 signal.close() ;

 super.stop();
}

class Instruction
{
 float time_s ;
 float throttle ;
 int pitch ;
 int yaw ;
 int trimOffset ;
 int channel ;
 int sprog ;

 Instruction( float time_s, float throttle, int pitch, int yaw, int trimOffset, int channel, int sprog )
 {
 this.time_s = time_s ;
 this.throttle = throttle ;
 this.pitch = pitch ;
 this.yaw = yaw ;
 this.trimOffset = trimOffset ;
 this.channel = channel ;
 this.sprog = sprog ;
 }
}
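To give a feel for how a flight program is assembled from these, here's a hypothetical 'square flight' sketched in Python (a mirror of the Instruction class above, trimmed to the essentials — the timings and control values are illustrative, not my actual tuning):

```python
from dataclasses import dataclass

@dataclass
class Instruction:
    time_s: float    # seconds from program start at which this step takes effect
    throttle: float
    pitch: int       # 32 is neutral; lower values pitch forward
    yaw: int         # 16 is neutral; 31 is full left

# Hypothetical 'flying turtle' square: alternate straight legs with yaw turns.
# All values are made up for illustration - real tuning took plenty of trial and error.
program = [Instruction(0.0, 0.8, 32, 16)]              # climb and stabilise
t = 4.0
for leg in range(4):
    program.append(Instruction(t, 0.6, 10, 16))        # pitch forward: fly a leg
    program.append(Instruction(t + 2.0, 0.6, 32, 31))  # yaw left through ~90 degrees
    t += 4.0
program.append(Instruction(t, 0.0, 32, 16))            # cut throttle: land

print(len(program), "steps, landing at t =", program[-1].time_s)
```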

Troubleshooting:

I’ve tried this code on both Mac and Windows machines – for a reason I don’t yet understand, I needed to invert the sign of the signal data on MacOS (see the comment in the setup function) for the thing to work.

I note that the Appcopter app sets the output volume to max when the transmitter is plugged into the iPhone/iPad. Similarly, you need to set an alarmingly high volume for the transmitter to push out any infrared! Needs an amplifier in there methinks.

Sprog:

In the last post I indicated that the last two bits were some unknown signal element. After further study, this appears to be a control for the ‘headlight’ LED on the nose of the Appcopter. I suspect there may be more to it, however – perhaps a ‘reset’ signal? I base this on the discovery that if the signal doesn’t go from 0 to 2 to 1 during a flight, the helicopter can be landed in an inconsistent state, in which it’ll not respond to the control signal at all.

Flying time:

Ultimately, my ambition of commanding the helicopter to fly a square kind of worked… Check out the really poor video. I need a proper video camera.

Immediately you’ll note: It’s not a very square square. Uhh no.

Conclusions:

I quickly realised the naivety of my ambitions after the first couple of automated flights. The control signal was adequately deciphered, and the program was fully sufficient to tell the helicopter what to do in a robotic way. My ‘flying turtle’ kind of worked, being able to draw a (very rough) square! But I expected perhaps a little more… accuracy?

Unlike the Turtle robot I once enjoyed at school, no two of my helicopter’s flights were the same. ‘Tiny’ variables, like rotor positioning at the start of a run, became massive deviations in direction as the flight progressed – reality brings the full weight of the chaos effect to bear on my toy robot! In my control signal I’ve tried to avoid a predictable episode of turbulent disruption caused by ground effect (the helicopter buffeted by its own rotor downwash) by launching up to a respectable altitude and stabilising before setting off. My flying robot, with its four degrees of freedom (up/down, forwards/backwards, yaw, pitch), is a lot more complicated than a terrestrial buggy already at peace with gravity (and a mere two degrees of freedom)!

So what next? A sensible answer would be to attach sensors, and allow the helicopter to take care of its own stability to some extent. But my little toy can’t possibly be expected to carry a load (gyroscopic sensors, distance sensors, altimeter, the list of possible enhancements goes on), and with each new system comes a greater requirement for power, which obliges a bigger battery, which requires more lift, so bigger rotors or more powerful motors… Basically, my explorations with the Appcopter have led me to consider the possibilities of constructing a multirotor helicopter in order to explore ‘real’ onboard flight AI. I’m currently working on a hexacopter built from three broken Appcopters, in order to get a feel for constructing my own flying machine:


Raspberry Pi: Unresponsive keyboard problem solved with a customised USB cable

After collecting the various things requisite to operate the RPi (HDMI to DVI cable, 8GB SD card with OS image, micro USB cable, low power USB keyboard and mouse), I eagerly plugged it all in and watched the boot screen flash by… Awesome.

Raspberry Pi Login:

Type pi

Nothing appears on screen. No characters.

Bash keyboard a bit…

Still nothing.

Conscious that the power supply (iPad charger, rated 5v, max 2.1 amps) might be at fault (having read that the iPad charger wasn’t really up to the job of powering the Pi), I tested the board voltage with a multimeter across the labelled test points ‘TP1’ and ‘TP2’ on the board. It looked like the voltage was cycling between 3.5v and 4.5v – that’s not correct at all – and the power LED was also suspiciously flickering! I tried a similarly rated power supply (one gifted by my energy supplier with their ‘iPlan’ home energy monitor), which did make a tiny difference to the cycling voltage values, but otherwise the board behaved the same – receiving no keyboard input at all.

I then tried another USB cable (from a Samsung mobile phone), and this did make a difference – the voltage no longer cycled up and down, but was still apparently too low for the keyboard to operate.

So I’d tracked at least part of the problem down to a dodgy micro USB cable. Frustrated by the lack of Pi desktop action, and faced with the prospect of spending money on A) a powered USB hub or B) another cable and/or power supply that may or may not have even worked, I had nothing really to lose by taking a scalpel to the cheaper USB cable (which had cost me just 68p including postage on Amazon!). I figured I would simply try and connect the terminals with some thicker gauge wire to reduce the cable’s resistance, and whilst I was in there, also remove the unnecessary data cables, which weren’t doing any good for power purposes. Here’s a montage:

1) micro end of USB cable, with internal wires laid bare

2) The micro USB end decapitated, with data cables removed

3) micro USB end, with thicker wires

4) USB end, ready to be modded

5) The finished, upgraded power cable!

I connected the modified cable to the iPad charger and Pi, and eagerly watched the boot screen flash by… Awesome.

Raspberry Pi Login:

Type pi

pi appears on screen

Celebrate.

I checked the voltage across the board test points, and found a stable voltage of 4.9v (with my alternative power supply, a stable 5v)! Job done.
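The numbers make sense in terms of plain Ohm’s law: the Pi draws a fair current, and every ohm of round-trip cable resistance drops V = I × R volts before the board sees them. A back-of-envelope Python sketch, with assumed (not measured) figures for illustration:

```python
# Assumed figures for illustration - not measurements of my actual setup
supply_v = 5.0           # charger output voltage
current_a = 0.7          # ballpark current draw for an early Pi
thin_cable_ohms = 1.0    # cheap thin-gauge cable, round trip
thick_cable_ohms = 0.15  # after the thick-wire mod

for r in (thin_cable_ohms, thick_cable_ohms):
    drop = current_a * r  # Ohm's law: V = I * R
    print(f"{r} ohm cable: {drop:.2f} V lost, board sees {supply_v - drop:.2f} V")
```

With these guesses, the thin cable eats the best part of a volt – easily enough to brown out a fussy USB keyboard.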

Err… Now what?

AppHack-e :: ‘Appcopter’ toy helicopter, hacked. Part I: The control signal

Motivation

A few weeks ago, I went to a local hobby and model shop to look for inspiration (and maybe something to dismantle). Of particular interest was this:

Appcopter

A small infra-red controlled toy helicopter, with a transmitter that plugs into an iPhone – simply download the virtual control panel from the app store to fly it. I was immediately struck by the potential for tinkering; it’s rare to see anything ‘unofficial’ plugged into an iPhone because the platform is so closed (Apple don’t let just anyone mess with their hardware), but here was a cheap toy making innovative use of the audio output socket, to fly a vehicle no less – that’s pretty clever! As a programmer I could see that in order to make a custom controller for this toy, all that’d be needed would be some code to ‘make the right noise down the wire’ to the transmitter, so to speak. With this simple view in mind I went ahead and bought it, already having some idea of what I wanted to achieve…

I remember once playing with a programmable robot at school, driven by the ‘Logo’ language (the robot itself was nicknamed the ‘Turtle’). It was a white plastic, dome-shaped vehicle with wheels that you could command to perform a series of navigations in sequence, leading it to traverse a maze or draw a picture. Program instructions were really simple; you could draw a square by commanding something like the following:

; draw a square: go forwards 10 units and turn right 90 degrees. Repeat four times.
Forward 10
Right 90
Forward 10
Right 90
Forward 10
Right 90
Forward 10
Right 90
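Memory of the exact syntax aside, the idea translates directly into a loop. Here’s a tiny Python simulation of the turtle’s position and heading (a sketch, not real Logo), confirming that the path closes back on itself:

```python
import math

def run(program):
    """Tiny Logo-style interpreter: track the turtle's position and heading."""
    x = y = 0.0
    heading = 90.0  # degrees; start facing 'up' the page
    path = [(x, y)]
    for cmd, arg in program:
        if cmd == "forward":
            x += arg * math.cos(math.radians(heading))
            y += arg * math.sin(math.radians(heading))
            path.append((round(x, 6), round(y, 6)))
        elif cmd == "right":
            heading -= arg  # turning right decreases the heading angle
    return path

square = [("forward", 10), ("right", 90)] * 4
path = run(square)
print(path[-1])  # the turtle ends up back where it started
```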

I thought it’d be an interesting challenge to hack this toy helicopter into behaving like a ‘flying turtle’, so I could send it a series of instructions and watch as it performed them in sequence. That sounds like a pretty simple goal, but to get this far I’d have to achieve a number of tasks, most definitely learning a lot in the process. This modest goal also represents a necessary step on the path to bigger things, such as building a fully autonomous flight platform with sensory feedback systems – now that’d be awesome…

Control mechanics

So after playing with the helicopter for a bit to get a feel for the flight characteristics (cats now run from the room when the toy starts up), I looked in more detail at the available control mechanics.

The virtual controller app in d-pad mode

The virtual control panel gives you a number of controls, the most obvious being throttle – how fast the rotors spin, which translates into how quickly the helicopter ascends or descends. Get things balanced just right, and it hovers nicely.

A significant feature of this helicopter is that it has two pairs of ‘coaxially mounted’ rotor blades – each pair spins in the opposite direction to the other, with the effect of mutually cancelling any bodily rotation induced during spin. The control circuitry on-board the helicopter manipulates the Y-axis rotation (yaw) of the helicopter in response to your command by slowing down or speeding up just one or the other pair of blades, deliberately allowing a little induced torque in either direction. There’s also ‘trim’ – it’s possible that the unique physiology of the helicopter (factory differences in motor power, for instance) means that some rotational force needs constant compensation; trim allows you to apply a permanent offsetting yaw for this.

Next is pitch, managed by a tiny motor and upwards-facing propeller on the tail boom – spin the rotor one way to dip the tail and make the helicopter fly backwards, or spin it the other to raise the tail and fly the helicopter forwards. Leaving the control untouched allows the helicopter to straighten out and hover.

The app offers two control modes. First is a ‘d-pad’ (‘digital pad’) for simple ‘zero-or-maximum’ forwards, backwards, yaw left, yaw right and combinations of these. Secondly there’s a pretty cool mode that allows you to fly the helicopter with the accelerometer of your iPhone by tilting it left, right, forwards or backwards to affect yaw and pitch respectively.

Finally, hidden away in another screen of the app, is a channel selection option: A, B or C – this setting lets you and some friends fly up to three helicopters on different channels, in the same room, each with their own transmitters and controller apps.

Control signal

When you fire up the throttle on the app, and the helicopter’s motors whirr into action, you can hear the plug-in transmitter box coughing out a staccato series of sounds through a tiny built-in speaker (which on first appraisal could passably be described as ‘helicopter rotor noises’, made for effect). The same sound is made through the speakers on the iPhone when the transmitter is unplugged, so there must be more to these sounds than mere aural aesthetic; it must be the control signal! It’s a pretty straightforward exercise to plug the iPhone’s audio output into the audio line-in port on a computer, and record a bit of the sound. It’s even easier to then look at it visually portrayed in a sound editing program – here’s what the audio signal looks like at two levels of zoom:

1) A little over 3.5 seconds of audio signal from the app, throttle set at 10%

2) Zoomed in upon one of the many individual ‘blips’ from the signal in image (1). The portion containing dips and peaks is about 30 milliseconds long.

Zoomed right into one of the many pulses visible in image (1), you can see that the signal looks like a series of valleys carved out from a line centred around zero (image 2). Though the signal looks like it tends upwards a bit across its entire length, the ‘valley floors’ along it look to be of pretty consistent widths and depths, with some looking like exactly double the width of the shorter ones.

The signal above looks most definitely digital (an analogue signal would present a more chaotic and varied-looking series of smooth valleys and peaks of different heights and depths, not the regular ‘rectangular wave’ that we see). A little more research reveals that your average TV remote sends a signal that doesn’t look too different to the one sent by the helicopter transmitter… The important point being that the typical TV remote controller signal has a definite structure: it first opens with a ‘low signal’ or valley of a fixed length (like shouting someone’s name to get their attention), the signal data then follows like a spoken sentence, flowing in a series of words: control values spelled out in binary (with the length of each low denoting either a ‘1’ or a ‘0’), before closing with another fixed pattern of lows. Just in case the TV failed to get the message the first time around, the signal is then repeated. It looks like my helicopter similarly has a consistent ‘hello’ valley at the start (longest trough, far left of image 2), some binary data then follows (with a longer valley denoting a ‘1’, and a shorter valley denoting a ‘0’) before coming to a definitive close. The entire signal is repeated over and over at regular intervals so long as the throttle is greater than zero – setting the throttle to zero stops the signal audio.
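Read that structure as a program and decoding is just a matter of bucketing valley widths. A toy Python sketch, with thresholds assumed from the timings I eventually measured (~1600 µs header, ~800 µs for a ‘1’, ~400 µs for a ‘0’):

```python
def decode_pulses(pulse_widths_us):
    """Bucket measured valley widths (microseconds) into bits.
    Thresholds are illustrative: ~1600us header, ~800us '1', ~400us '0'."""
    bits = []
    for w in pulse_widths_us:
        if w > 1200:       # header pulse: start a fresh frame
            bits = []
        elif w > 600:      # long valley -> binary 1
            bits.append(1)
        else:              # short valley -> binary 0
            bits.append(0)
    return bits

print(decode_pulses([1600, 800, 400, 400, 800]))  # -> [1, 0, 0, 1]
```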

A brief moment to explain binary…

When we count from 0 to 9, we visit a series of unique numerical symbols in one ‘column’, which we call the ‘units’: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9. When we get to the number 10, we wrap over into two columns: the ‘tens’ (of which we have one in this instance) in addition to the ‘units’ (of which we have none). Count up to 100 and we have one ‘hundred’, no ‘tens’ and no ‘units’. Following so far? We’ve just been counting in ‘base-10’ – our numerical system is in fact built around base-10. We don’t have to constrain ourselves to ‘units’, ‘tens’, ‘hundreds’, etc. though – you can work in a numerical system of any base – binary is simply counting in ‘base-2’! The table below shows exactly what that looks like – on the left is the familiar number in base-10, on the right is the same number in binary. See that instead of ‘tens’, ‘hundreds’ and ‘thousands’ (powers of ten), we instead have ‘twos’, ‘fours’, ‘eights’, ‘sixteens’, etc. (powers of two), with a new column opening up to the left each time the existing ones fill.

1 (one ‘unit’) → 1 (one ‘one’)

2 (two ‘units’) → 10 (no ‘ones’ plus one ‘two’)

3 (three ‘units’) → 11 (one ‘one’ plus one ‘two’)

4 (four ‘units’) → 100 (one ‘four’ plus no ‘two’ and no ‘one’)

5 (five ‘units’) → 101 (one ‘four’ plus no ‘two’ and one ‘one’)

6 (six ‘units’) → 110 (one ‘four’ plus one ‘two’ and no ‘one’)

7 (seven ‘units’) → 111 (one ‘four’ plus one ‘two’ and one ‘one’)

8 (eight ‘units’) → 1000 (one ‘eight’ plus no ‘fours’ plus no ‘two’ and no ‘one’)

9 (nine ‘units’) → 1001 (one ‘eight’ plus no ‘fours’ plus no ‘two’ and one ‘one’)

10 (one ‘ten’, no ‘units’)→ 1010 (one ‘eight’ plus no ‘fours’ plus one ‘two’ and no ‘one’)

When using binary in computing, it’s typical to specify a number with a certain length of symbols – like putting in a placeholder ‘0’ for each column we might possibly use – this is the number of ‘bits’ (‘binary digits’) used to specify the number:

The number 1 in ‘four-bit binary’: 0001

The number 15 in ‘four-bit binary’: 1111

So what happens if we try to specify the number 16 in four bits? We just can’t do it! There’s a limit to the maximum number we can specify with a certain number of bits. To show the number 16 in binary we need at least five bits:

The number 16 in ‘five-bit binary’: 10000
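For the programmers following along, Python’s built-in formatting does the same book-keeping:

```python
# Fixed-width binary: pad with leading zeros to the requested bit count
print(format(1, "04b"))   # '0001' - the number 1 in four bits
print(format(15, "04b"))  # '1111' - the number 15 in four bits
print(format(16, "05b"))  # '10000' - 16 needs at least five bits
print(2 ** 4 - 1)         # 15 - the largest value four bits can hold
```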

Decoding the binary signal

So we’ve seen what controls the helicopter responds to: throttle, yaw, pitch, trim, and channel. The signal data must contain information about each of these – there’s no magic happening when you push up the throttle control! We’ve established that the signal is binary in format; it’s sending a series of numbers represented by zeroes and ones – these numbers must mean something about each control, but in what order do the values get sent, and by how many bits is each represented? With applied scientific method, we can easily figure out the signal encoding! The process is really simple:

- Choose just one variable (be it throttle, yaw, pitch, trim or channel) and modify it to a known value, for example, set throttle at 10%. Record the signal audio – this is our ‘control’ signal, against which all further modifications will be compared.

- Change the selected variable by a small amount, for example, set throttle to 20%. Record the signal audio.

- Open up the audio file for the control signal (in this case throttle at 10%), and place alongside it the audio for the latest recording (throttle at 20%) – see how the signal changes between the two wave forms (image 3) – there’s clearly a connection between a particular variable (throttle in this case) and the part of the second recording that changes in response.

- Repeat many times, for a range of control settings for a particular variable, before moving onto a different variable and starting over!

3) Signal comparison: Throttle at 10% (top) and 100% (bottom). There’s definitely something going on with those first few bits of signal (highlighted, bottom)
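The comparison step can be mechanised too: once two recordings are reduced to bit strings, the positions that differ point straight at the field under test. A hypothetical Python sketch (the bit strings below are invented for illustration):

```python
def changed_bits(frame_a, frame_b):
    """Return the bit positions at which two equal-length frames differ."""
    assert len(frame_a) == len(frame_b)
    return [i for i, (a, b) in enumerate(zip(frame_a, frame_b)) if a != b]

# e.g. throttle 10% vs throttle 20%: only the throttle field should change
print(changed_bits("0001101100000", "0011010100000"))
# -> [2, 4, 5, 6], all within the first seven (throttle) bits
```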

I found it possible to get a pretty big hint about the location of a particular control value in the signal with a little trick: the controls have maximum and minimum values (for example, throttle has both a 0% and 100% setting), so assuming these aren’t mapped to weird arbitrary values, the extremes on the app should correspond to the extreme values in the signal – the signal should show the highest value for throttle, say, when the app has it set at 100%. So set the throttle to 100%, and the signal shows a series of long valleys (obviously then our ‘ones’) at the location that must contain the throttle value!

I initially started writing a program to automate the full testing procedure, exporting the audio files to a text file format (easier to crunch the numbers that way). I eventually ran out of patience with this though after encountering a number of issues, and resorted to pencil and paper – it turned out to be much quicker and easier – about an hour to figure out the encoding, and like any puzzle, really good fun.

Pencil and paper: Change one aspect of the control (e.g. set throttle @ 20%) and see how the signal changes relative to some base setting (I used throttle @ 10% as my reference signal). Repeat many times, identifying the characteristics of the control components of the signal

The signal exposed

So here’s a summary of my findings, and everything you need to start writing a program to take control of the helicopter:

- The audio signal is a rectangular waveform representing binary data, with a normalised peak amplitude ‘pulse’ of about 0.6 units, when represented in uncompressed .wav file format.

- The signal begins with an ‘on’ pulse of ~1600 microseconds

- A binary ‘0’ is indicated by a pulse of ~400 microseconds following a gap of ~400 microseconds

- A binary ‘1’ is indicated by a pulse of ~800 microseconds following a gap of ~400 microseconds

- The entire signal carries 28 bits

- Following the on pulse the controls are sent as follows:

Throttle: 7-bits, defaulting to 0, full at 127
Pitch: 6-bits, defaulting to 32, full forwards at 0, full reverse at 63
Yaw: 5-bits, defaulting to 16, full right at 0, full left at 31
Trim: 6-bits, defaulting to 31, full right at 0, full left at 63
Channel: 2-bits, A: 1, B: 2, C: 0
???: 2-bits, no idea – let’s find out in the next post!
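Put together, the table above is enough to build a frame in software. A minimal Python sketch of the packing step (this mirrors what the Processing program does, assuming most-significant-bit-first within each field, which is how the writeValue function sends them):

```python
# Field order and widths, as decoded from the transmitter audio
FIELDS = [("throttle", 7), ("pitch", 6), ("yaw", 5),
          ("trim", 6), ("channel", 2), ("sprog", 2)]

def pack_frame(throttle=0, pitch=32, yaw=16, trim=31, channel=1, sprog=0):
    """Return the 28-bit frame as a string of '0'/'1' characters."""
    values = dict(throttle=throttle, pitch=pitch, yaw=yaw,
                  trim=trim, channel=channel, sprog=sprog)
    bits = ""
    for name, width in FIELDS:
        v = values[name]
        assert 0 <= v < (1 << width), f"{name} out of range"
        bits += format(v, f"0{width}b")  # MSB first, zero-padded to the field width
    return bits

print(pack_frame(throttle=127))  # full throttle, everything else at default
# -> '1111111100000100000111110100'
```

Each bit would then be rendered as the appropriate ~400 or ~800 microsecond pulse, prefixed with the ~1600 microsecond header.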

That’s it for now – there’s a lot of content here, so I’ll deliver the story in two parts. In the next post I’ll reveal the program written in Processing to generate the audio signal for the transmitter, and talk about automating the helicopter’s flight!
