Metropolis sampling of a quantum Hall wavefunction

I thought I’d whip up a fast little script to show the radial density of the fluid from a Laughlin-style wavefunction.

Doing it algebraically and exactly I’ve run into some hiccups that may or may not iron out.

Using Monte Carlo sampling is easy enough.

Using the Metropolis algorithm, I suggest a new configuration of electron positions, then accept or reject depending on whether a uniform random number is less than the ratio of the probabilities of the new and old configurations (probabilities being given by the square of the wavefunction).
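In generic form (independent of the particular wavefunction), one Metropolis update looks something like this sketch:

```python
import numpy as np

def metropolis_step(x, propose, p, rng=np.random):
    """One Metropolis update for an unnormalized density p."""
    x_new = propose(x)
    # Accept with probability min(1, p(x_new)/p(x)); otherwise keep the old state.
    # rng.rand() is uniform on [0, 1), so a single comparison handles both cases:
    # ratios >= 1 always pass.
    if rng.rand() < p(x_new) / p(x):
        return x_new
    return x
```

Repeating this step gives samples distributed according to p, which is all the histogram below needs.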

Then I make a histogram.

There are a ton of inefficiencies in this implementation, some that would be very easy to fix. For example, I could put the binning in the loop. That’d save me that huge data structure and let the thing run for hours. But whatever. Works well enough for now.

import numpy as np
import matplotlib.pyplot as plt


# Now it is set up to plot filling-factor-style density

N = 20  # number of electrons

def Prob(x):
    # |psi|^2 for the nu = 1/3 Laughlin state:
    # prod_{i<j} |z_i - z_j|^6 * exp(-0.5 * sum_i |z_i|^2)
    prod = 1.
    for i in range(N):
        for j in range(i):
            prod = prod * ((x[i,0]-x[j,0])**2 + (x[i,1]-x[j,1])**2)**3
    return prod * np.exp(- 0.5 * np.sum(x * x))

def suggestMove(fromx):
    # Move one randomly chosen electron by a Gaussian step
    index = np.random.randint(N)
    newx = fromx.copy()
    newx[index] = newx[index] + np.random.randn(2)
    return newx

def Step(x):
    # Metropolis accept/reject: accept with probability min(1, P(new)/P(old))
    xnew = suggestMove(x)
    acceptanceratio = Prob(xnew) / Prob(x)
    if np.random.rand() < acceptanceratio:
        return xnew
    return x

x = np.random.randn(N,2)
steps = 100000
data = np.zeros((steps,N,2))
for step in range(steps):
    x = Step(x)
    data[step,:,:] = x

q = np.sqrt(data[:,:,0]**2 + data[:,:,1]**2)
#r = np.linspace( 0.01, 7., num = 128)
p,edges = np.histogram(q, bins=128,density=True)
edges = (edges[1:] + edges[:-1])/2

#Will get better sampling results if I move this radial factor into the probability. 
#The origin is more poorly sampled just due to geometric factors. And then dividing
# by a small number amplifies this.
p = N * p / edges  # N*p(r)/r = 2*pi*n(r) in magnetic-length units, i.e. the local filling factor

plt.plot(edges, p)
plt.xlabel("r")
plt.ylabel("filling factor")
plt.show()
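One more caveat: Prob multiplies O(N^2) pair factors together, so for more than a couple dozen electrons it will overflow or underflow a float64. Working in log space avoids this; a sketch (the acceptance test then becomes comparing `np.random.rand()` against `np.exp(logProb(xnew) - logProb(x))`):

```python
import numpy as np

def logProb(x):
    # log of the same weight as Prob:
    # sum_{i<j} 3*log|z_i - z_j|^2  -  0.5 * sum_i |z_i|^2
    n = len(x)
    logp = -0.5 * np.sum(x * x)
    for i in range(n):
        for j in range(i):
            r2 = (x[i, 0] - x[j, 0])**2 + (x[i, 1] - x[j, 1])**2
            logp += 3.0 * np.log(r2)
    return logp
```

For small systems exp(logProb) agrees with Prob exactly; for large ones only the log version survives.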


Not totally sure everything is right, but at least we are getting a filling of 1/3 for the Laughlin state. So that’s something.

I believe that peak is an actual effect. I think I’ve seen it in plots before.

Machine Learning on AWS and not

So I’m gonna give machine learning a try on AWS. Ultimately, I think this is not economically viable, but since I’m just screwing around, whatever.

I searched for a prebuilt AMI ami-180ad670 called gpu_theano or something. Picked the first one for no reason except it seemed reasonable.

I picked a g2.2xlarge spot instance with $0.30 per hour as the max rate.

Ran a test script inside; Theano appears to be working on the GPU.

Let’s see if we can get this puppy running

Error: Command '['/home/ubuntu/neural-doodle/pyvenv/bin/python3', '-Im', 'ensurepip', '--upgrade', '--default-pip']' returned non-zero exit status 1

Okay. Forget doing the virtual env stuff. Not worth it on a burner computer.

sudo apt-get install python3-dev

sudo apt-get install python3-pip

sudo python3 -m pip install numpy

sudo python3 -m pip install scipy

Why aren’t these in requirements.txt?

I wonder if I could use python2. scipy and numpy are probably already installed.

sudo python3 -m pip install --upgrade setuptools

sudo python3 -m pip install --upgrade cython

sudo apt-get update

sudo apt-get build-dep matplotlib

sudo apt-get install libfreetype6-dev

sudo apt-get build-dep pillow

sudo python3 -m pip install --upgrade scikit-image

sudo python3 -m pip install theano

sudo python3 -m pip install lasagne

Bad move. Don’t install lasagne and theano on their own.

python3 -m pip install --ignore-installed -r requirements.txt

This is running slow as hell. What is up? (GitHub page)

Using screen so a dropped ssh connection won’t kill the job:

use screen

run your job

detach with ctrl-a ctrl-d

then you can reattach with screen -r

Interesting Link. Should try this next time.

Okay. I got a free Nvidia graphics card (GTX 560 Ti) from a bro. I set up my router to forward port 22 to my desktop so I can ssh in from anywhere. Installed CUDA and cuDNN. TensorFlow by default doesn’t support a graphics card this old. Saw some rumblings about workarounds.

Finally got scipy to install once I downloaded the BLAS and LAPACK library prerequisites using apt-get.

I was having a lot of trouble installing scikit-image with some kind of error about pgen. Eventually I renamed /usr/local/bin/pgen, which is not the program it is expecting, to /usr/local/bin/pgentmp, and then it seemed to get past it fine.

sudo apt-get install libatlas-base-dev

Needed to install some matplotlib dependencies.

Needed to set a .theanorc file. Change cuda5.5 to the version you have.
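For reference, the .theanorc I mean is something like this (a sketch; the cuda root path depends on which version is actually installed):

```ini
[global]
device = gpu
floatX = float32

[cuda]
root = /usr/local/cuda-5.5
```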

Finally running.

I’d say the speed is on par with AWS. It will take about an hour or two to finish the job at 40 iterations.

Phase 3 is the beast.

Failed at phase 3. My card has only 1 GB of RAM. Not enough, I guess.

I’ll post this for now, but clearly a work in progress.




So we made a drone

Got a kit off of AliExpress. Had most bits: a frame, a CC3D flight computer, motors, ESCs (electronic speed controllers), and a power distribution board. I got a Turnigy battery off of Amazon: 2200 mAh, 20-30C, which means a peak current of ~60 A (multiply the two numbers). Each ESC says 12 A, so we’re okay? Also a charging unit from HobbyKing and a Turnigy 9x controller and receiver unit. We also got a bunch of crap for FPV (first-person video) which hasn’t all come yet.
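The battery math above, spelled out (the per-ESC split assumes a quad pulling current evenly, which is my assumption, not something from the kit specs):

```python
capacity_ah = 2.2            # 2200 mAh pack, converted to amp-hours
c_rating = 30                # top of the 20-30C burst rating
peak_current_a = capacity_ah * c_rating   # current the pack can deliver in a burst
per_esc_a = peak_current_a / 4            # rough per-motor share on a quad
print(peak_current_a, per_esc_a)          # 66.0 A total, 16.5 A per ESC
```

So at a full 30C burst, each ESC would be asked for more than its 12 A rating; that’s fine only as long as the motors never actually pull that much at once.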

So we pieced together the frame from pics floating around.

Figuring out which prop is which is tough. We did it wrong. Got no lift. Then flipped em around.

Silver-capped motors rotate counterclockwise. Need to solder the wires accordingly.

Downloaded the OpenPilot software to configure stuff. Went through the wizards. Treated us right.


When we first launched, the thing flipped immediately on takeoff. We had put the CC3D unit in sideways and upside down, so we used the attitude adjustment to switch things around until the flight-data screen actually reflected what we were doing.

Then Success!!!
