## More with RTL-SDR

http://web.stanford.edu/class/ee179/labs/Lab.html

http://aaronscher.com/

Seems hella useful

Pulls some data off my boy (the dongle) and puts it in the file ab120_10.dat.

-g sets gain. -f sets the center frequency. The sample rate defaults to 2.048 Msamples/s. -n is the number of samples; I set it to 10x the sample rate, so that means 10 s of capture.

I basically reimplemented the example code from MATLAB in numpy.

A surprising glitch with using pcolormesh on the FFT/spectrogram is that you need to use the fftshift function for it to come out right (hypothetically, it feels like this shouldn’t be necessary).

I was confused for a long time by this. The example code for spectrogram doesn’t use a complex signal, so I figured that was the problem, but I could see the spike if I just plt.plot()’d a slice of time. This is bad behavior. I don’t see it documented anywhere obvious.
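For the record, here is a minimal numpy-only sketch of the fftshift issue (the tone frequency and sizes are just illustrative picks). np.fft.fft returns bins ordered 0 up to +fs/2, then the negative frequencies, so plotting against a linear axis scrambles things until you shift both the spectrum and the axis:

```python
import numpy as np

fs = 2.048e6            # sample rate (the rtl_sdr default)
n = 1024
t = np.arange(n) / fs
# complex tone at -200 kHz, like a mixed-down IQ signal
x = np.exp(2j * np.pi * -200e3 * t)

spec = np.fft.fft(x)
freqs = np.fft.fftfreq(n, d=1 / fs)
# raw FFT order starts at 0, runs up to +fs/2, then wraps to the
# negative frequencies, so a naive pcolormesh/plot looks scrambled
assert freqs[0] == 0.0

# fftshift reorders both so frequency increases monotonically
spec_s = np.fft.fftshift(spec)
freqs_s = np.fft.fftshift(freqs)

peak = freqs_s[np.argmax(np.abs(spec_s))]
print(peak)  # the tone lands in the right place on the axis
```

The same shift applies per-column when building a spectrogram of complex IQ data, which is (I think) why the real-signal examples never mention it.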

RadioReference has an interesting listing of local radio stations. I don’t know how to decode P25, some kind of digital radio standard.

https://github.com/szechyjs/dsd

Maybe this is the right thing

How did people do anything before the internet?

The raw IQ data is unsigned bytes, alternating I then Q.

The rtl_tcp server serves up this data stream.

It also accepts data commands, but I do not see where this is documented.
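Given that format, loading a capture into complex samples is a few lines of numpy. A sketch (the scaling convention, centering on 127.5, is my own common-but-unofficial pick):

```python
import numpy as np

def load_iq(path):
    """Load an rtl_sdr raw capture: unsigned bytes, interleaved I then Q."""
    raw = np.fromfile(path, dtype=np.uint8)
    # center the 0..255 range on zero and scale to roughly [-1, 1]
    iq = (raw.astype(np.float64) - 127.5) / 127.5
    return iq[0::2] + 1j * iq[1::2]   # I + jQ

# samples = load_iq("ab120_10.dat")
```

The commented-out line assumes the filename from the capture above.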

## Playin around with ESP8266

So I tried tying my ESP8266 to the $2 nano I just received in the mail. Was getting nothing for a while; it turned out it needed an external power supply to work. Too bad.

http://rancidbacon.com/files/kiwicon8/ESP8266_WiFi_Module_Quick_Start_Guide_v_1.0.4.pdf

Set the serial monitor to both NL & CR line endings and 9600 baud to get it to respond to AT with OK.

firmware version

AT+GMR
0018000902-AI03

AT+CWMODE=3

Sets it into combined station + access point mode. Necessary for the next step to work.

CW stands for?

List Access Points

AT+CWLAP lists local access points

Join access point

I’ve noticed that the ESP is visible from my computer as an access point.

AT+CIFSR tells me the IP addresses

I believe one is its IP as a client and one as an access point.

I do not have a great success rate with commands. The serial is getting garbled.
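One thing that helps keep the line endings straight is to frame every AT command with CR+LF before it hits the wire. A tiny sketch; the pyserial usage is commented out since I can't assume the library or the port name here:

```python
def at_frame(cmd):
    """Frame an ESP8266 AT command: the module wants CR+LF line endings."""
    return (cmd + "\r\n").encode("ascii")

# Usage with pyserial (not run here; the port name is a guess):
# import serial
# port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
# port.write(at_frame("AT+CWLAP"))   # list access points
# print(port.read(1024).decode(errors="replace"))

print(at_frame("AT+GMR"))
```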

https://github.com/tommie/esptool

Looks like there is a python script to load the firmware

https://github.com/nodemcu/nodemcu-firmware.

Or maybe I’m done for today.

## Getting Going with RTL-SDR (A computer radio thingy)

Using a Mac kind of adds a little bit of shit to every encounter I have with open-sourcy projects. The really mainstream ones fully support Macs, but the sort of off-road ones need tweaking and digging, even though by and large OS X is compatible with Linux as far as I can tell.

Install MacPorts. I tried using Homebrew and failed.

It’ll install an explosion of stuff.

run gqrx

select the device.

192000 sample rate

Leave the other stuff at 0. I do not yet know what that does.

When I change the output device it crashes hard for some reason. Default works though

You can click and right click on numbers to change them

Around 100 MHz you’ll see some FM stations.

Pick a demodulation: WFM mono or stereo.

http://gqrx.dk/doc/practical-tricks-and-tips

Here’s a bunch of garbage that didn’t work:

Followed these instructions:

https://github.com/robotastic/homebrew-hackrf

The build failed on gr-baz with an error. Hmm.

Tried doing some stuff. Apparently gr-baz is not updated to GNU Radio 3.7? It’s optional, so maybe that’s OK.

Reported that it can see my stick (A NooElec R820T2 NESDR Mini 2)

Screw it. I don’t know how to get this working. Let’s try a different tack.

Fired up my ubuntu virtual machine. Go into settings and add a usb filter for the SDR dongle

Alright, after some significant hiccups, changing tack:

https://github.com/roger-/pyrtlsdr

I’d bet that one of the previous stages was necessary to get this going.

Check out demo_waterfall.py. Pretty cool.

## ngspice, huh

Got tipped off to ngspice. Circuit simulation. Not bad. Can’t get it to graph properly on my computer though. I vastly prefer specifying the circuit in a file to specifying it in a schematic. That sucked.

http://www.ngspice.com

is a solid online doohickey

Supposedly this link is a copy of that circuit. Wonder how long it’ll last:

http://www.ngspice.com/index.php?public_circuit=T6XqVE

Axebot Lives.

## Random Potentials and QFT I

There is a remarkable mathematical similarity between systems with random potentials and interacting quantum systems (QFT). I think that systems with random potentials are conceptually more straightforward.

The problem is often that you want to take the average value with respect to the distribution of possible potentials. There is a choice of where you place this averaging (the integration with respect to the probability distribution), and you can make smart approximations by placing this averaging where it technically shouldn’t go.

Let us suppose we have a system with $H=H_0+V$, where $H_0$ is an ordinary Hamiltonian for a single particle and $V$ is a random, non-time-varying (for simplicity) background potential.

Solving the system for a given instance of V is no big deal. Put it in a computer; it’s doable, basically.

The trouble is averaging over all the V.

The mother of all questions in quantum mechanics (or mechanics in general) is how to time evolve a system.

The hiccup we run into is the simple fact that

$\overline{e^{-iHt}} \ne e^{-i\bar{H}t}$

Try expanding the exponentials to see.

$1-i \overline{H}t-\overline{HH}t^2/2! +\ldots \ne 1-i\overline{H}t-\bar{H}\bar{H}t^2/2! + \ldots$

The problem starts with that second term. On the left side we have the correlations of V. On the right side we never have any correlations.
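To see the mismatch explicitly, take the convention $\overline{V}=0$ (absorb any mean into $H_0$) and expand the second-order term:

$\overline{HH}=\overline{(H_0+V)(H_0+V)}=H_0^2+H_0\overline{V}+\overline{V}H_0+\overline{VV}=H_0^2+\overline{VV}$

while

$\bar{H}\bar{H}=H_0^2$

So the discrepancy at order $t^2$ is exactly the correlation $\overline{VV}$, which is where the statistics of the random potential first enter.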

What are we to do? The right side is easily computed. The average value of V is the only thing that shows up (and it would probably be a reasonable convention to set that equal to 0 and put any bias it has into $H_0$). The left side is a hairy mess. Naively, we would have to solve the eigenvalue problem parametrized by V in closed form, then average that answer with respect to the distribution of V, basically an impossible task.

So what are we to do?

Well, what we can do is start working on moments. Then we can do better and better.

Even if your initial $\psi$ is set, after any evolution $\psi$ becomes a random variable, inheriting its randomness from the randomness of V. That means we can ask about its average values (a bad idea, or at least you should know what you’re asking: the phase of a wavefunction is a fragile and sometimes irrelevant thing, and when you average over it you may get 0. Again, the classic misconception that the averaged object has to look like a characteristic example of the object. The sum of 1 and -1 is 0, and 0 is neither 1 nor -1. Pedantic and yet obtuse? You’re welcome.).

Sooooo. We can form $\psi \otimes \psi$ and evolve that.

$(e^{-iHt}\otimes e^{-iHt})( \psi\otimes\psi)$

I assume you can totally see where I’m going with this. Because I see it only hazily. But I know there is something here.

How does this connect to QFT?

## Some thoughts on many body quantum mechanics

There is a lot going on all at once when you open up a textbook on many-body physics. Also, an insistence (which for practical calculational reasons makes a lot of sense, but leaves a conceptual hole) on using second-quantized talk and notation, when the thing you’re most familiar with, and spent years grappling with beforehand, is Schrödinger-style quantum mechanics. It is not impossible to do most, if not all, of the things you do in second-quantized form in first-quantized form.

A good question to ask throughout is: what the hell are we talking about? In the sense of what subspace we are really working in.

Let us suppose that things really are made of “just” electrons. Then the total vector space is clear and the total Hamiltonian is clear. A reasonable wavefunction will be $\psi(x_1,x_2,x_3,...)$ which is properly antisymmetric in all the variables, one variable for each electron. In a discretized form the vector will look like $\psi_{i_1 i_2 i_3 ...}$ where it is properly antisymmetrized between indices, one index for each electron. If you have a lattice of N sites and m electrons, then the number of amplitudes you have to give to specify your vector is roughly $N^m/m!$ (Number for independent particles divided by permutations) not including some mild savings that you could achieve because of the antisymmetrization. This is quickly enormous. You might equivalently talk about solving PDE in 1,2,3 dimensions. 3d PDEs are way harder because for even modest lattice sizes the total size of vectors is huge.
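Just to see how fast $N^m/m!$ blows up, a quick back-of-the-envelope in Python (the lattice size and electron counts are illustrative picks):

```python
from math import factorial

def n_amplitudes(N, m):
    """Rough amplitude count for m electrons on N sites: N^m / m!."""
    return N**m // factorial(m)

# a 10x10 lattice (N = 100 sites) with a growing handful of electrons
for m in (1, 2, 5, 10):
    print(m, n_amplitudes(100, m))
```

Ten electrons on that modest lattice already need on the order of $10^{13}$ amplitudes, which is why the full configuration space is off the table.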

So really, it is unacceptable to really discuss a vector in the context of the full configuration space of all the particles for most purposes.

Let us say that a power on high has given you the true ground state wavefunction of the system. How would you construct a reasonable space to work in? A simple answer is the single-particle subspace, which can be constructed as $\sum_P (-1)^P \psi(x)\psi_0 (x,x,x,x,x...)$, where we are summing over all permutations of the variables, with that pesky -1 for odd permutations. Perhaps I should stick to distinguishable particles for the moment. It cleans things up a bit and we don’t need all these sums for antisymmetrization: $\psi(x_{new})\psi_0 (x,x,x,x,x...)$.

For concreteness, we’ll worry about evolving a given wavefunction forward in time, sort of the canonical quantum mechanics problem. It tends to be that if you can solve that one, you can solve any other question you might ask. Evolving wavefunctions forward in time is related to all the high falutin Green’s function techniques that maybe I’ll get to in another post.

If the extra particle had no interaction with the other particles, let us say it is on Venus or something, this would be a very good wavefunction indeed. The new wavefunction would evolve independently of the ground state wavefunction (which evolves trivially with a phase, since it is an energy eigenstate) under its own Hamiltonian.

$\psi(x,t)e^{-i\omega_0 t}\psi_0 (x,x,x,x,x...)$

However, if the particle really does interact, then the wavefunction does not maintain this nice factored form, and that is what we really should expect. The factored form implies that really the new particle is not correlated at all in what it is doing to all the other particles. If they interact then of course they get correlated. If we want to be perfectly honest we really need to play in the full unusably huge vector space.

Luckily physics is not honest so ALL HOPE IS NOT LOST.

Here’s the dumbest thing we can do (I don’t mean that disparagingly. The dumbest thing you can do is the first thing you should do. Don’t get fancier than you need to.). Let’s hope that by restricting ourselves to wavefunctions of the form $\psi(x)\psi_0 (x,x,x,x,x...)$ we can do a good enough job of describing the system. Maybe if they aren’t interacting that much those wavefunctions will still be pretty good since they were really good when they didn’t interact at all. Now all we need to specify is the new $\psi(x)$ that we put on there, which has only N entries. Much more swallowable.

Ok. So how can we go about evolving the system in time? Well, we have the Hamiltonian, which probably has messy Coulomb interactions or something in it. What if we just find the matrix elements of the true Hamiltonian in the subspace we’re working in (which form an $N\times N$ matrix), then solve the thing by exponentiating it? Sure. Doable. For reasonable lattice sizes, we can totally solve for the eigenvalues of that matrix or whatever. When we find the effective interaction between the ground state particles and the new particle, it is essentially the potential between the new particle and a continuum blob whose density equals the probability density of the ground state wavefunction: $V(x)=\int dx_1 \frac{\rho(x_1)}{|x-x_1|}$.

No.

What do we really want?

What we actually want is the matrix elements between the true correctly evolved wavefunction, whatever that happens to be, and our goofy subspace. Let’s call the projection to our subspace the operator P which is 1 if you’re in the subspace and 0 if you’re outside.

What we kind of would prefer is this

$\langle\psi |Pe^{-iHt}P|\psi \rangle$

The evolution is for real and honest, but we only allow our restricted beginning and end states.

What we’ve done is this

$\langle\psi |e^{-iPHPt}|\psi \rangle$

Which has dishonest evolution and really can’t be trusted.

Here is where things start to be, and will continue to be, interesting. How do we achieve the former? We are now touching upon one of the many faces of renormalization! Isn’t that exciting?

Let us form a definition of the effective Hamiltonian:

$e^{-iP H_{eff} P t} = P e^{-iHt} P$

Ok. Definitions are nice, but to be certain, this does not immediately tell us how to calculate $H_{eff}$.
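As a sanity check that $Pe^{-iHt}P \ne e^{-iPHPt}$ in general, here is a toy numpy experiment with a random Hermitian H and a projector onto the first k basis states. The sizes, the seed, and the eigendecomposition-based matrix exponential are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def expm_herm(H, t):
    """e^{-iHt} for Hermitian H via eigendecomposition."""
    w, U = np.linalg.eigh(H)
    return (U * np.exp(-1j * w * t)) @ U.conj().T

n, k, t = 6, 2, 1.0
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2              # random Hermitian "true" H

P = np.zeros((n, n))
P[:k, :k] = np.eye(k)                 # project onto first k basis states

honest = P @ expm_herm(H, t) @ P      # P e^{-iHt} P: honest evolution
cheap = P @ expm_herm(P @ H @ P, t) @ P   # e^{-i PHP t}, restricted

print(np.linalg.norm(honest - cheap))  # nonzero: the two disagree
```

The honest version lets amplitude leak out of the subspace and come back; the cheap one never leaves it, which is exactly the correlation the projected Hamiltonian throws away.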

We’ll start there next time.

(PS: WordPress LaTeX freaks out if you have a double whitespace. Are you kidding me?)

## Recording Audio in Python

I was thinking that recording audio might be a convenient way to get reasonable amounts of data into my computer. Probably better than using an Arduino for a bunch of reasons. Plus I’ve already bought some external DACs for recording audio.

http://www.swharden.com/blog/2013-05-09-realtime-fft-audio-visualization-with-python/

I followed these instructions to install: https://gist.github.com/jiaaro/9767512210a1d80a8a0d

Here is the pyaudio example code, very slightly modified. You may want to eventually select which input device to use.

Maybe I’ll write in a moving FFT; some realtime watching would be nice.
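A moving FFT is easy enough to sketch with numpy alone. Here fake sine data stands in for the microphone samples, and the window and hop sizes are arbitrary picks:

```python
import numpy as np

def moving_fft(x, nfft=256, hop=128):
    """Chop a signal into overlapping windows and FFT each:
    a poor man's spectrogram."""
    windows = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, hop)]
    return np.abs(np.fft.rfft(np.array(windows) * np.hanning(nfft), axis=1))

fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 1000 * t)     # stand-in for microphone samples
S = moving_fft(x)
print(S.shape)                       # (frames, nfft//2 + 1)
```

Each row of S is one time slice; feed the rows to a plot as they arrive and you have the realtime watcher.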

Pretty Good!

## Curses and Ncurses or whatevs.

So I was wondering how to make slightly less wonky command line programs. Some aren’t wonky, like top or nano or vi and stuff.

I think the answer is curses.

https://docs.python.org/2/howto/curses.html

Here’s a short example program that I think sort of demonstrates enough to get you going.

Press q to quit

z, r, and p do stuff.

Very curious that you have to do this sequence of build-up and tear-down or else it freaks out. Ctrl-C exits nicely due to that bit of code about signals; Ctrl-C sends the SIGINT signal.

## Python Wave File Fun and Audio on Arduino

I had never touched the struct library before. I believe it is the main way to use Python to interact with binary formats.

https://docs.python.org/2/library/struct.html

Strings are byty things, so I guess struct uses them as a bridge between Python’s weirdo types and the bare metal of C types. I think something similar can be done using numpy.

16-bit wave is signed integers, 8-bit is unsigned. That goofed me up. I never got the resaved file to not sound like a demon, though, so there is something wrong in the code. The outputted C file worked like a charm, so I guess who cares. That’s all I wanted anyhow; I was just rewriting the wave for debugging.
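To keep the conventions straight, here is a stdlib-only sketch (not my original debugging code) that writes a tone both ways with struct and wave. Mixing up 'h' (signed 16-bit) and 'B' (unsigned 8-bit, centered on 128) is exactly the kind of thing that makes a file sound like a demon:

```python
import math
import struct
import wave

def write_tone(path, sampwidth, rate=8000, freq=440, seconds=0.1):
    """Write a mono sine tone. 16-bit wave = signed ('h'),
    8-bit wave = unsigned ('B') centered on 128."""
    n = int(rate * seconds)
    frames = bytearray()
    for i in range(n):
        s = math.sin(2 * math.pi * freq * i / rate)
        if sampwidth == 2:
            frames += struct.pack('<h', int(s * 32767))     # signed 16-bit
        else:
            frames += struct.pack('B', int(s * 127) + 128)  # unsigned 8-bit
    with wave.open(path, 'wb') as w:
        w.setnchannels(1)
        w.setsampwidth(sampwidth)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

write_tone('tone16.wav', 2)
write_tone('tone8.wav', 1)
```

Read an 8-bit file as if it were signed (or vice versa) and silence becomes full-scale offset; hello, demon voice.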

Tried using an R-2R ladder to make a DAC for the Arduino. It worked; we buffered the output with a simple emitter follower circuit.

Way overkill though.

I was thinking of using the analog in, in some kind of feedback loop with a capacitor, as a DAC when I found this.

http://highlowtech.org/?p=1963

This does the same thing with no parts (well, still need a buffer to get full volume). Clev. Majorly clev.

A speaker can’t respond at ~60 kHz (nor would you hear it), so the fact that there is a PWM and not a true DAC going does not matter. Noice.

Incidentally, I installed Crayon for WordPress. Code looks way freaking better.

## Havin’ Some Problems with Beaglebone

When I turned on the BeagleBone Black and saw that it served its own IDE, I was immensely pleased. Also Javascript! Language of the future (not necessarily due to inherent qualities, but due to its adoption as a lingua franca; there is no easier toolchain to get going)!

However things turned to shit and now I’m a little more hesitant to embrace dat bone.

The pin headers are too long and unmarked. It’s a pain to count from the side to find the pin you need, especially since the pins have different functionality. Not all pins are PWM, for example.

A problem that took forever was getting the Xbox controller to work properly. Apparently some kind of EMI was occurring; dmesg|grep usb had some business about babble. Found a forum post that said to use a hub or USB extender and then you’ll be fine. Used a USB extender and it worked. Bizarre.

For some unknown reason I had difficulty getting more than 2 PWMs to work. It spit out mysterious messages about device trees (which apparently are important. A lot of things in this world are important.).

I went into uEnv.txt in the boot folder and started disabling stuff. My hypothesis was that the pins were being configured for other purposes, which was blocking their use.

First I uncommented the line to disable HDMI. Then, I accidentally bricked the beaglebone by uncommenting a line in uEnv.txt that enables eMMC which is where the beaglebone boots from. I am an idiot.

I unbricked it by loading the debian image onto a microSD. Not a big deal except for the scary factor. I do not like formatting things. Especially not with sudo. Also, I did not have a microSD card handy and had to wait for one.

I essentially followed the instructions here https://www.raspberrypi.org/documentation/installation/installing-images/mac.md

replacing the raspberry pi image with the beaglebone image. dd is an odd duck. I used /dev/disk3 rather than /dev/rdisk3 which might explain why it took so goddamn long. Apparently one is buffered and the other isn’t.

So yeah. I love having the beef of a full Linux on board. Really, I think control from an Xbox controller would have been infeasible natively on an Arduino (don’t quote me on that; never actually looked. I’ve heard the Leonardo can do USB stuff, maybe). But for simple purposes I think you’re better off with an Arduino controlled over serial. You could then make some Python program to accept the Xbox controller on a host computer. Check out this clever snippet for a simple serial protocol. You can type in numbers followed by a letter for various commands. The `- '0'` thing is because the digit characters come in sequential order in ASCII.

I could type into the serial console

123a

And then it would set servo a to 123 degrees.
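The protocol is easy to mimic on the host side. Here’s a hypothetical Python version of that parsing loop, with the `- '0'` trick spelled out as `ord(ch) - ord('0')`:

```python
def parse_stream(s):
    """Parse '123a456b' style commands: accumulate digits,
    dispatch on each command letter."""
    value = 0
    commands = []
    for ch in s:
        if '0' <= ch <= '9':
            # same trick as the Arduino's `c - '0'`: digit characters are
            # contiguous in ASCII, so ord(ch) - ord('0') is the digit value
            value = value * 10 + (ord(ch) - ord('0'))
        else:
            commands.append((ch, value))  # e.g. ('a', 123): servo a to 123
            value = 0
    return commands

print(parse_stream("123a"))
```

Each letter closes out whatever number preceded it, so a stream like "90b5c" yields two commands.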

Also, an update on the H-bridge:

It caught on fire.