Cellular Automata in Haskell

Ben recently tried this and I wanted to see if I could do it my way

I’ve seen this done (Bartosz?) before but I tried to do it without looking anything up.

The comonad is an interesting pattern to use here. It automates the translation-invariant nature of cellular automata. This would also be useful for translationally invariant PDEs like the simple wave equation or others.

I used the laziness of Haskell to start with an infinite plane of zeros. Of course, if you ever want to look at it, you need to pick out a finite slice at the end using dtake.
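As a rough sketch of the idea (in Python rather than the original Haskell, with a sparse dict standing in for the lazy infinite tape; `step`, `take`, and the choice of rule 90 are mine, not the post's):

```python
# The infinite plane of zeros becomes a sparse dict (missing cells are 0),
# and take() plays the role of dtake, slicing out a finite window at the end.
def step(cells):
    # candidate cells: anything adjacent to a nonzero cell
    support = set()
    for i in cells:
        support.update((i - 1, i, i + 1))
    new = {}
    for i in support:
        v = cells.get(i - 1, 0) ^ cells.get(i + 1, 0)  # rule 90: XOR of neighbours
        if v:
            new[i] = v
    return new

def take(cells, lo, hi):
    # materialize a finite slice of the conceptually infinite tape
    return [cells.get(i, 0) for i in range(lo, hi)]

g = {0: 1}  # a single live cell on an infinite background of zeros
for _ in range(2):
    g = step(g)
# take(g, -2, 3) gives the familiar Sierpinski row: [1, 0, 0, 0, 1]
```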



Ordered pairs in Idris

I implemented an ordered pair type in Idris. Took me a while.


Idris already has an inequality type. That may have been better to use.

Exploration and guesswork were necessary. How the heck do I get a hotkey for holes in Atom?

The first couple Opairs are ordinary pairs. Just exploring the syntax.

I switched away from an InEq type that explicitly stores the number it is talking about, to better match the equality type.

The Inequality type acts something like a mix of Nat and Equality.

My head hurts and I’m dehydrated.



Position Tracking using Wifi


I have been aware of machine learning projects that use RSSI (received signal strength). For example, try typing iwconfig in a Linux terminal; you can grep the output for the signal level.
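For instance, a quick way to scrape that in Python (the interface name and field format are assumptions; they vary by driver):

```python
import re
import subprocess

# "Signal level=-40 dBm" is the common iwconfig format, but drivers differ
SIGNAL_RE = re.compile(r"Signal level[=:]\s*(-?\d+)\s*dBm")

def parse_rssi(iwconfig_output):
    # pull the dBm value out of iwconfig's text, or None if absent
    m = SIGNAL_RE.search(iwconfig_output)
    return int(m.group(1)) if m else None

def current_rssi(interface="wlan0"):
    # interface name is a guess; yours may be wlp2s0 or similar
    out = subprocess.run(["iwconfig", interface],
                         capture_output=True, text=True).stdout
    return parse_rssi(out)
```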

CSI (channel state information) is more, though. It requires alternative firmware and gets you phase sensitivity.


Mini PCIe is a pain. Many laptops and mini PCs support it, but still, that is a ~$100 investment or so.

A plausible but unconfirmed alternative for Atheros cards


Another paper describing localization



LimeSDR, getting going


I had trouble installing the PPA the first time I tried, about a month ago. It seemed to go through the second time around.

Install LimeSuite using the PPA instructions for Ubuntu. I have 16.04.


lsusb shows OpenMoko as the device. Odd.



OK. A capitalized command, LimeSuiteGUI, is insane. No commands are capitalized.

It brings up some spaceship of cryptic options.

Some of the stuff in the top menu bar seems good. Connecting, Programming, FFT, etc.

The commands

LimeUtil --info

LimeUtil --find

list a device. So that's good. It also does updating?


Added the Myriad RF PPA for GNU Radio.


sudo apt-get install gnuradio gnuradio-dev

Also installed gqrx. This is very useful for determining if the goddamn thing is working at all.

Install Gqrx SDR on Ubuntu Linux



Ok. GQRX was crashing on boot.

So was gnuradio when I tried playing with the osmocom source

GNU C++ version 5.4.0 20160609; Boost_105800; UHD_003.010.001.001-release

ImportError: /usr/lib/python2.7/dist-packages/gnuradio/uhd/_uhd_swig.x86_64-linux-gnu.so: undefined symbol: _ZN3uhd4usrp10multi_usrp7ALL_LOSB5cxx11E

So did Python, when running

from gnuradio import uhd

Eventually I apt-get installed libuhd-dev, which makes gqrx run now with an RTL-SDR.

gqrx with the Lime


should work but didn’t


Hmm another twist

The LimeSDR is power hungry.

I may not have a USB 3 port on my 2012 desktop, insane as that sounds.

So I need external power? I wonder if that is what the blinking LED signifies.


Trying to install in Windows now. Maybe that will work better.

Install PothosSDR

After running Zadig, gqrx can recognize an RTL-SDR.


Install the drivers as specified for limesdr.

I accidentally installed the x86 instead of the x64 drivers (got error code 48) and needed to uninstall with deletion and then reinstall properly.

It appears to be working.


GQRX opens.



It has been months since I tried in ubuntu 16.04. Maybe things have gotten better.

OK. At first gqrx wouldn't load, but after fiddling with the sampling rate and bandwidth it does. I don't get any FM signals with nothing attached to the Lime. That is either good or bad. It does appear to be receiving data, at least.


So I have not gotten gqrx to actually receive signals with the LimeSDR, but GNU Radio appears to be working to some degree. This graph plays audio on 101.5, although very crappily. I hope it just needs filtering (I think I should be low-passing those resamplers; also, I probably shouldn't be receiving in the middle of the band where the DC spike is). I doubled up the resampler because a single resampler threw an error for too much decimation. Is this standard procedure?
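The resampling arithmetic can at least be sanity-checked. The rates below are assumptions (I'm not reproducing the exact flowgraph), but they show how a too-large single-stage ratio splits into two chained rational resamplers:

```python
from fractions import Fraction

hw_rate = 2_000_000    # assumed LimeSDR sample rate
audio_rate = 48_000    # assumed audio sink rate

total = Fraction(audio_rate, hw_rate)   # overall ratio: 3/125
stage1 = Fraction(1, 5)                 # first resampler: 2 MS/s -> 400 kS/s
stage2 = total / stage1                 # second resampler: 400 kS/s -> 48 kHz

# each stage should also low-pass to its own output Nyquist
assert stage1 * stage2 == total
```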


Useful gnuradio tutorials


Possibly useful blog series



ok. Next step is simple transmit and receive




3D Printing Linkages


I’m starting by trying to make a simple scissor linkage. The challenge is the rotating joint


I decreased the radius of the peg by 0.5 mm compared to the hole. This gave me a nice fit: it slides smoothly and doesn't pop out.

Hmm, only for the center hole though. The others are much too loose.

It took about 1:10 to print 8, and about 12 minutes to print 1.

0.3 is definitely too big.

0.4 is nice and tight, but it isn't good enough. The thing still wants to get out of the hinge.

Back to the drawing board. A three piece design? Or just make the pegs and holes longer?


Update: I extended the holes so the diameter equals the length. It works GORGEOUSLY.



I printed a circular version. You just need to angle the bars of the scissor by 2π/N, where N is the number of sides of the polygon you want.
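In numbers (a one-liner for that formula; `bar_angle_deg` is my name for it):

```python
import math

def bar_angle_deg(n_sides):
    # tilt each scissor bar by 2*pi/N to close the chain into an N-gon
    return math.degrees(2 * math.pi / n_sides)
```

So a square wants 90° per bar and an octagon 45°.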



Notes on ROS and ORB SLAM 2


This seems like the most recent openly available SLAM package I could find, so let's try it out.

Whenever using ROS, always:



When using ROS:

rosnode info or rosnode list to inspect running nodes.

rostopic gets the available messages.

Something like this. However, you should edit the yaml config file.

sudo apt-get install ros-kinetic-usb-cam

To inspect the image

rosrun usb_cam usb_cam_node __name:=camera

renames the node to camera, so its topics appear under /camera.


Dang. I'm pretty impressed. With a bad calibration file and no tuning, it works OK in the short term. Blur clearly seems to be an issue.

The contents of the example yaml config
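The file itself isn't reproduced here, but an ORB-SLAM2 monocular config looks roughly like this (the calibration numbers below are generic placeholders, not my camera's):

```yaml
%YAML:1.0

# camera intrinsics (placeholders -- use your own calibration)
Camera.fx: 600.0
Camera.fy: 600.0
Camera.cx: 320.0
Camera.cy: 240.0

# radial/tangential distortion coefficients
Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

Camera.fps: 30.0
Camera.RGB: 1

# ORB extractor settings
ORBextractor.nFeatures: 1000
ORBextractor.scaleFactor: 1.2
ORBextractor.nLevels: 8
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7
```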


Other things to consider LSD-SLAM



Actually, many people are recommending SVO and DSO, saying that orb_slam, while the better-behaved one, works poorly on the Pi.








Also a perhaps useful ROS short course

Old 6/16:

So I downloaded a cool looking package


roslaunch svo_ros live.launch

You need to have a camera running

sudo apt-get install ros-jade-usb-cam

run with

rosrun usb_cam usb_cam_node

You can see it using image_view:

rosrun image_view image_view image:=/usb_cam/image_raw

Change the param file in svo_ros so that the camera has the same resolution and stuff

Using Z3 to solve a simple logic puzzle

I was checking out Smullyan's The Lady or the Tiger? It starts with some very simple logic puzzles. Let's make Z3 do one for us!

There are 100 politicians.

At least one is honest.

For any two, at least one is crooked.

How many honest politicians are there?

And here is one way of encoding this into z3:

Z3 is kind of a pain in that it won't immediately cast to bools when I want them. A mistake I made at first was missing that "not i is j" in the list comprehension. Another unfortunate feature is that this encoding does not make it clear that there is a unique answer, due to permutations of the variables. But if we just asked for the number of honest politicians, it's not clear to me how to encode the for-any-two constraint. Very curiously, Z3 picks 'x18' to be the one honest man. I wonder why?


When I first heard about Prolog, and that its "functions" are reversible (they are really more like relations), my first thought on how I would build such a thing was wrong. Let's say you wanted plus(x,y,z). I thought maybe you'd write three functions, one for each of the possible outputs you'd like: so x+y → z, z−x → y, and z−y → x. Then it's not so hard to imagine something like wiring up the graph of connections between relations and figuring out an ordering of computation that gives you the query you want. You could make such a thing non-deterministic by returning a list of possible answers rather than a single answer when appropriate, as in square(x,y), where the square root has a plus and a minus answer: y → [+sqrt(y), −sqrt(y)].
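A toy version of that "one function per mode" idea (the None-for-unknown convention and the names are mine):

```python
import math

def plus(x, y, z):
    # the relation x + y = z; pass None for the unknown argument.
    # returning a list of answers leaves room for non-determinism.
    if z is None:
        return [x + y]
    if y is None:
        return [z - x]
    if x is None:
        return [z - y]
    return [z] if x + y == z else []   # all known: check the relation

def square(x, y):
    # the relation x*x = y; running it "backwards" yields two answers
    if y is None:
        return [x * x]
    r = math.sqrt(y)
    return [r, -r]
```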

Prolog actually uses a backtracking search mechanism, though, and doesn't require you to write a function three times (although maybe under the hood primitive addition relations have something like this for efficiency?), which makes the thing appear all the more magical at the programmer level.

But I think that first idea is somewhat related to something called a propagator.


Ed Kmett is a god. He is also horrendously opaque unless maybe you have some serious mathematical chops. The propagators are a unifying paradigm for some very disparate-sounding things. The Introduction to Lattices and Order book by Davey and Priestley that he mentions is actually a fairly pleasant read. The reference about how orders are involved in programming languages that comes to mind is https://en.wikibooks.org/wiki/Haskell/Denotational_semantics. The laziness of Haskell makes for a richer set of possibilities than just the result of a function terminating or not. If you don't look inside a data type, it may not matter that a piece of it involved a non-terminating computation. So it turns out you can define an ordering relation between data that have non-termination in different slots. The Haskell function fix, which is sort of a nasty (or useful) recursive higher-order function, is so called because it is related to taking the fixed point of a function from this order-theory perspective.

It is very interesting that this analogy may allow you to apply the tricks of SAT solvers to speed up parallelism and the like. As an example, I think the idea is that a function requires the previous values that are its inputs to be computed before it can fire. So you have this mess of functions waiting to fire, and every time a new answer gets completed, maybe you have to search through the mess for the next one that is ready. But the two-watched-literal technique from SAT solvers is a nice way of doing this. It really reduces the number of functions from the blob you need to check by keeping tabs on only a much smaller set per intermediate result (naively, you'd keep something like a hash or list of all possible functions to check for readiness after an intermediate result comes in). The two-watched-literal technique doesn't keep ALL the possibly firing functions in the list. Each function is put into only a couple of watch lists. When one of those intermediate results comes in, you pop the function out of that list and put it into a list for something else it needs that hasn't come in yet. If you can't find another list to put it in, it's ready to fire. Each function may end up getting inspected far fewer times than the number of variables it has, depending on the order the intermediates come in, rather than once per variable as in the naive approach.
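A sketch of that scheduling idea in Python. For dataflow (where every input is needed) it's enough for each waiting function to watch a single missing input, which is the analog of the SAT trick; the representation and names are mine:

```python
def run_dataflow(functions, initial):
    # functions: list of (inputs, output, fn), where fn maps the dict of
    # computed values to the output value
    values = dict(initial)
    watch = {}    # missing-input name -> functions parked on it
    ready = []

    def park(entry):
        inputs, _, _ = entry
        for name in inputs:
            if name not in values:
                # park on ONE still-missing input instead of all of them
                watch.setdefault(name, []).append(entry)
                return
        ready.append(entry)   # nothing missing: ready to fire

    for entry in functions:
        park(entry)

    while ready:
        inputs, output, fn = ready.pop()
        values[output] = fn(values)
        # only the functions watching this result get re-inspected
        for entry in watch.pop(output, []):
            park(entry)
    return values
```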

http://composition.al/ This is the blog of Lindsey Kuper, who did that PhD work on LVars he mentions.


Bartosz Milewski himself talking about MVars (the last 2 videos in the playlist).

MVars block on write if full and on read if empty. So they are useful for synchronization.

IVars can only be written once and block on read until full. They are like promises.
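A minimal IVar sketch using a threading.Event (a rough analog only, and not hardened against racing writers):

```python
import threading

class IVar:
    # write-once variable: get() blocks until put() has happened
    def __init__(self):
        self._filled = threading.Event()
        self._value = None

    def put(self, value):
        # note: the check-then-set here is not race-free for
        # two simultaneous writers
        if self._filled.is_set():
            raise ValueError("IVar already written")
        self._value = value
        self._filled.set()

    def get(self):
        self._filled.wait()   # blocks until the value arrives
        return self._value
```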

CRDTs – https://en.wikipedia.org/wiki/Conflict-free_replicated_data_type

Propagators are mentioned here





Caffe: Getting Started

To use the Movidius Neural Compute Stick we have to use Caffe.



Caffe is not super friendly or well documented in my opinion.

I’m having lots of installation problems on my mac.

Segmentation Fault 11


Trying the last thing here. Be sure to change the version numbers in the path

I eventually got it to import with

PYTHON_INCLUDE := /usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/include/python2.7 \

PYTHON_LIB := /usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib

in Makefile.config

I run the command with python2, which is supposedly the Homebrew Python.

python2 -m pip install protobuf scikit-image


Ok. Now we can start.


It is configuration-file based. You make a protobuf text file describing the network and then load it up.

In standard configuration, you pull the data off of a database.

Data layers have the training data.

The top parameter is the output of a layer.

bottom is the input.

include { phase: TRAIN } / include { phase: TEST } blocks are ways to include different layers at different stages.


Any loss layer contributes to the eventual loss. You can weight each one with a loss_weight parameter. The solver optimizes all loss layers by default; there is no parameter to specify which.
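Putting those pieces together, a minimal train-phase prototxt might look like this (layer names and sizes are illustrative, not from my actual files):

```protobuf
name: "toy_net"
layer {
  name: "data"
  type: "HDF5Data"
  top: "data"        # top = this layer's outputs
  top: "label"
  include { phase: TRAIN }
  hdf5_data_param { source: "train.txt" batch_size: 32 }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "data"     # bottom = this layer's input
  top: "ip1"
  inner_product_param { num_output: 10 }
}
layer {
  name: "loss"
  type: "EuclideanLoss"
  bottom: "ip1"
  bottom: "label"
  top: "loss"
  loss_weight: 1.0   # scales this layer's contribution to the total loss
}
```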

Some beginner files for syntax and exploration.

Sets input blobs in Python (probably not the preferred methodology, but it is fast to get up and running; I should probably at least dump into an HDF5).

Performs InnerProduct, which is a matrix product, and computes a Euclidean loss.

Check out the mnist folder in caffe/examples. It has the clearest stuff.






This is the location of caffe.io

It has some routines for conversion to and from arrays and preprocessing in Transformer.








Deep Learning With Caffe In Python – Part I: Defining A Layer


You can set certain layers to not train by setting their learning rate multiplier to 0.
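Concretely, that's done per-layer with lr_mult on the weight and bias params (the layer below is illustrative):

```protobuf
layer {
  name: "frozen_ip"
  type: "InnerProduct"
  bottom: "data"
  top: "frozen_ip"
  param { lr_mult: 0 }   # weights: frozen
  param { lr_mult: 0 }   # biases: frozen
  inner_product_param { num_output: 10 }
}
```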

What a great talk! NP solvers in Clojure for puzzles

I was listening to this guy on the podcast Programming Throwdown and really liked the cut of his jib, so I sought out the talks he mentioned.

Links to stuff mentioned


Rolling Stones – bindings to the Java SAT solver Sat4j


Tarantella – Dancing Links


Loco – bindings to Choco, a CSP solver in Java




I guess the difference is the community and its perspective. CSP tends to come from AI; SMT comes from formalizing and proving correctness of programs.