ROS: Robot Operating System

I was following the Mac install instructions, but the pip instruction failed. I turned off the Anaconda distro I had recently installed by commenting out its line in my ~/.profile file.

assimp.hpp not found or some stuff

brew info assimp

brew install ros/deps/pyassimp

I give up on Mac. It sucks. Using an Ubuntu VirtualBox instead. Setup is super easy, mostly just an apt-get.

So what is ROS? Well, my impression is that its focus is kind of like Unix piping. It's a uniform interface you can use to ram little chunks of programs into one another.

But the community has also built up a bunch of ready-made robotics programs.





Check this out. Depending on how well this works, it could be pretty cool.

sudo apt-get install ros-jade-visp

You need to be running roscore in another window for stuff to work

Install instructions here

had problems at catkin_make

pip install catkin_pkg


Okay, the problem I had was that a weirdo Python distro had been installed while doing something else (Goddamn you Moose. What gives you the right?), so I removed it and had to rerun all the workspace creation stuff. Now we're good.




Visual Odometry and Epipolar Stuff

I went on a huge projective geometry kick a few years back. I loved it when I found out about it.

Anyway, I've forgotten a great deal of it now. I'm looking into using cameras to determine the displacement of a robot.

Useful stuff: vis%20odom%20tutor%20part1%20.pdf

So if we translate and rotate 3d vectors we have

x' = Rx+t

If we take the cross product we have the equation

t\times t=0

t \times x' = t \times R x

Since whatever comes out of a cross product is perpendicular to what went in

x' \cdot (t \times x') = 0 = x' \cdot (t \times Rx)

Now we can just interpret these 3d vectors as 2d projective plane vectors from two different camera viewpoints (homogeneous coordinates).

This last equation gives the epipolar constraint. It describes what the ray that projects to the object from camera 1 looks like in camera 2. And that ray is going to be a line. So the essential matrix converts from points in 1 camera to lines in the other camera.
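A quick numpy sanity check of the constraint, with a made-up R (a rotation about z), t, and point (everything here is a hypothetical example, not from a real camera):

```python
import numpy as np

def skew(v):
    # matrix form of the cross product: skew(v) @ u == np.cross(v, u)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

theta = 0.3  # made-up rotation angle about the z axis
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([1.0, 0.5, 0.2])  # made-up translation

E = skew(t) @ R  # the essential matrix

x = np.array([0.3, -0.7, 1.0])  # a point as seen by camera 1 (homogeneous)
xp = R @ x + t                  # the same point as seen by camera 2

# x' . (t x Rx) = 0, i.e. x'^T E x = 0
print(xp @ E @ x)  # zero up to floating point
```

Note that E x is the epipolar line: xp is only constrained to lie on it, not pinned to a point.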


Then how do you get the R and t back from point correspondences? A cross product can be represented as a skew symmetric matrix (it's a linear map that takes a vector to another vector).

t\times = [t]_x

The cross product kills one vector, t itself, so t has eigenvalue 0 for this matrix.

And it flips two other vectors with some minus signs. In an orthogonal basis with t as one element, it would look something like

\begin{matrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{matrix}

So when we take the SVD of E, E = USV^T, it will have 2 equal singular values and 1 zero singular value.

The matrix V^T transforms from the x frame to the t frame.

The matrix U transforms from the t frame to the x’ frame.

We can just unflip the two vectors switched up by the cross product in the t basis (the matrix that does this is called W) and compose U and V to reconstitute R.

Then we can find [t]_x = ER^{-1} (which is just ER^T, since R is a rotation).
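A numpy sketch of this recovery, using a made-up R and t to build a synthetic E. This glosses over the fourfold (R, t) ambiguity and sign issues; a real pipeline picks the combination that puts triangulated points in front of both cameras.

```python
import numpy as np

def skew(v):
    # matrix form of the cross product: skew(v) @ u == np.cross(v, u)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# made-up ground truth, just so there is something to decompose
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 0.5, 0.2])
E = skew(t_true) @ R_true

U, S, Vt = np.linalg.svd(E)  # S has two equal entries and one zero

W = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])  # undoes the flip the cross product applied

# two rotation candidates, and the translation direction (up to scale and sign)
R_a = U @ W @ Vt
R_b = U @ W.T @ Vt
t_dir = U[:, 2]  # the left null direction of E, parallel to t
```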

Note that the t you find is unknown up to scale. You can only find the direction of translation, not how far.

The problem comes from the fact that you can't tell the difference between images of dollhouses and real houses. There is an unavoidable scale ambiguity. BUT, one would have estimates from all sorts of other places: context maybe, other sensors, current estimates of velocity, estimates of the actual 3d position of the points you're using.

Makes sense, I think. Now I have to actually try it.


Getting Goddamn Wifi on the Goddamn Orange Pi PC

I tried the Lubuntu and Raspbian distros; maybe another one would work out of the box. I've heard rumblings that an OpenELEC distro might work with wifi? Also I've heard that 8188eu chipsets will work out of the box. Can't confirm either.

Used the legacy Jessie server 3.4.112 build here. Unpacked it using Keka.

Installed it onto the SD card using dd.

Made the wifi drivers as described here

make scripts failed

After making the scripts I mostly followed the github itself instructions.

I had to edit the kernel files to comment stuff out like discussed here

Seemed to work.

(Putting the wifi dongle in the lower usb port might be important? I’ve heard reports of that)

iwlist wlan0 scan

finds the local router

My Edimax RTL8188CUS adapter does appear to be close to working now. I am connected to the router.

made a /etc/wpa_supplicant/wpa_supplicant.conf file
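For reference, a minimal wpa_supplicant.conf sketch in the usual Debian form (the SSID and passphrase here are placeholders):

```
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourPassphrase"
}
```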



changed the /etc/network/interfaces file to have

auto wlan0
iface wlan0 inet dhcp
    wpa-conf /etc/wpa_supplicant/wpa_supplicant.conf

Can’t ping external ips. TBD if everything is actually fixed.


route add default gw <router-ip> wlan0

Gateway was misconfigured

Also added

allow-hotplug wlan0

to the /etc/network/interfaces file. I think until I did this it didn't bring up the wifi unless you had logged on with ethernet first.



A Stack Monad in Python

So I saw a talk where he uses a stack as an example of a state monad.

I get the Maybe monad, I think, and some others, but the State monad is one more level of abstraction that still confuses me. When I try to reason about it, I fail to see the point sometimes. Pure state-to-state functions are immediately composable. To also have a return value alongside the returned state usually doesn't seem necessary.

The actual monad is the full state transforming and value returning function. This is very abstract.

Most functions you’ll be writing will return a function that takes state and outputs (val, state) pair. It’s weird.

Thought I'd try doing it in Python. It is sticky. Maybe I need to churn through it a bit more. The fact that I need to keep putting those throwaway functions in there seems strange. Perhaps I need a bit more thought. Although the do notation does expand out to that often, I think.

# This is horrific

# Currying would help the correspondence to Haskell. Then push(a, stack)
# would become push(a) returning a function of stack.
def push(a):
    return lambda stack: (None, [a] + stack)

pop = lambda stack: (stack[0], stack[1:])

def myreturn(x):
    return lambda stack: (x, stack)

def runStackFunc(statefulstackfunc):
    return statefulstackfunc([])

# A stateful function has type state -> (val, state)
def bind(statemanip, regtostatemanip):
    return lambda state: regtostatemanip(statemanip(state)[0])(statemanip(state)[1])

# Do notation is also pretty clutch. You need to toss the value a lot.

print(runStackFunc(bind(bind(bind(push(1), lambda _: push(2)), lambda __: pop), push)))
# associativity? The value-tossing lambdas need to be switched outside of the binds. That seems weird.
print(runStackFunc(bind(push(1), lambda _: bind(push(2), lambda __: bind(pop, push)))))

print(runStackFunc(bind(bind(push(1), lambda _: push(2)), lambda __: bind(pop, push))))

It is neato how the associativity kind of works.



I later realized I had forgotten >> aka then

def then(statemanip, dontneednothin):
    return lambda state: dontneednothin(statemanip(state)[1])

# More general: define it by referring back to bind.
def then(statemanip, dontneednothin):
    return bind(statemanip, lambda _: dontneednothin)

print(runStackFunc(bind(then(then(push(1), push(2)), pop), push)))


Pretty straightforward. Just added the throwaway lambda into a function. You can do this manually or by referring back to bind (which is better, since then basically just has the logic of bind in it).


Keras and Learning Sine

One perspective on machine learning is that it is interpolation, curve fitting.

So I decided to try to fit sin(x) using a neural network.

Linear layers with Relu units will make piecewise linear curves.
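As a toy illustration of that, separate from the network itself: a weighted sum of shifted ReLUs is piecewise linear, with a kink wherever one of the ReLUs turns on. The particular weights and shifts below are made up.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# A hand-built sum of shifted ReLUs. Each term changes the slope at its
# shift point, so the result is piecewise linear: here a "tent" that is
# 0 outside [-1, 1] and peaks at 1 when x = 0.
def piecewise(x):
    return relu(x + 1.0) - 2.0 * relu(x) + relu(x - 1.0)

x = np.linspace(-2.0, 2.0, 9)
print(piecewise(x))
```

Training just moves those kinks and slopes around to hug the target curve.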

It works, just not well.

Deep networks tend to get kind of hung up.

from keras.models import Sequential
from keras.layers.core import Dense, Activation

model = Sequential()

# ReLU between the Dense layers gives the piecewise linear fit;
# the final Dense maps back down to a single output value.
model.add(Dense(output_dim=5, input_dim=1))
model.add(Activation('relu'))
model.add(Dense(output_dim=5))
model.add(Activation('relu'))
model.add(Dense(output_dim=5))
model.add(Activation('relu'))
model.add(Dense(output_dim=1))

#Depth doesn't seem to work so well
#Wide seems better for this task.
#Of course what we're doing is incredibly stupid

model.compile(loss='mean_squared_error', optimizer='sgd')

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-np.pi, np.pi, 100)

data = (np.random.random((100000)) - .5) * 2 * np.pi
vals = np.sin(data)
model.fit(data, vals, nb_epoch=5, batch_size=32)

y = model.predict(x, batch_size=32, verbose=0)

plt.plot(x, y)
plt.plot(x, np.sin(x))
plt.show()



The final result is never all that great. Seems to have a lot of trouble at the minima and maxima of sine, probably because a relu is not a good model for what is happening at those points.  And it takes a lot of iterations for the rest to look pretty good. Perhaps the optimizer needs more fine tuning. I have not looked within the box much at all. Maybe a more aggressive optimizer would converge faster for this very simple task.

Here I plotted the result at intermediate steps of convergence. The accuracy tends to propagate out from 0 and sort of wrap around the curves. I wonder if this is due to where the thing is initialized.



Still, I guess for being dead easy, not bad?