Quantum Information and Computation Resources

A playlist I keep in no particular order. In particular I recommend Gottesman’s Quantum Information course from the Perimeter Institute, which has tons of other interesting physics courses and colloquia as well. Also check out the CSSQI lectures. Some of these videos are just entry points into a whole glut of connected videos from the same people, so look at the sidebar and at other videos from the same users.

 

Preskill’s classic notes. I don’t find them very approachable?

http://www.theory.caltech.edu/people/preskill/ph229/

Quantum Computing since Democritus. Unbelievable.

http://www.scottaaronson.com/democritus/

I liked these

https://people.eecs.berkeley.edu/~vazirani/

The Simons Institute has had some interesting workshops. Coincidentally, I’ve been interested in SAT problems recently, coming from the logic side.

https://simons.berkeley.edu/workshops/qhc2014-boot-camp

 

Recommended Books:

Nielsen and Chuang – The standard

Kitaev, Shen, and Vyalyi – Classical and Quantum Computation

Mermin – Quantum Computer Science

Mark Wilde – Quantum Information Theory. Much less quantum computation. A good, different perspective.

 

Interesting Languages – Both have tutorial videos available

Quipper – Haskell-based

LIQUi|> (“Liquid”) – Microsoft’s, F#-based

 

IBM Quantum Experience – Lets you run on real hardware!

https://www.research.ibm.com/ibm-q/

Rigetti has a similar thing going on

http://rigetti.com/

 


Drone Notes

April 2017

Racing firmware mostly

CleanFlight vs. Betaflight vs. iNav – a family of related firmwares.

iNav for autonomous flight, Betaflight for the latest features.

Betaflight might be taking over?

 

ArduPilot seems to be the leader for autonomous drones.

Pixhawk is the premier flight controller.

http://ardupilot.org/dev/docs/raspberry-pi-via-mavlink.html

 

GStreamer for video streaming.

GStreamer basic real-time streaming tutorial:

https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html

 

UV4L – maybe even better for streaming from the Pi?

The only thing we have working is the WebRTC browser-based camera.

You need to click Call to make it start.

 

https://blog.athelas.com/a-brief-history-of-cnns-in-image-segmentation-from-r-cnn-to-mask-r-cnn-34ea83205de4

 

Get the AVR branch of ArduPilot.

Go into the examples folder.

make apm2

make apm2 upload
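The steps above as a concrete shell session (the branch name and examples path below are from memory and are assumptions, so check the repository before trusting them):

```
git clone https://github.com/ArduPilot/ardupilot.git
cd ardupilot
git checkout master-AVR            # AVR branch name is an assumption
cd libraries/AP_HAL_AVR/examples   # assumed location of the examples
make apm2                          # build for the APM 2.x board
make apm2 upload                   # build and flash over USB
```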

My APM 2.6 was not registering as a serial device. OK, my USB cable was bad. What are the odds?

Installing APM Planner from http://ardupilot.org/planner2/docs/installation-for-linux.html

The command given there is missing an underscore.

RTL is Return to Launch.

 

SITL is the recommended simulator

Installed Vagrant to use SITL on the Mac.

http://ardupilot.org/dev/docs/setting-up-sitl-using-vagrant.html

http://sourabhbajaj.com/mac-setup/Vagrant/README.html

I had to make a Vagrantfile to get it to work. By default Vagrant was trying to use some nonsense box.

Make a Vagrantfile following:

https://www.vagrantup.com/intro/getting-started/boxes.html
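For reference, a minimal Vagrantfile along the lines of what worked. The box name here is an assumption, not necessarily what the SITL docs use; substitute whichever Ubuntu box they currently suggest.

```ruby
# Minimal Vagrantfile: name an explicit box instead of the default
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"   # assumed box name; substitute your own
  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048                  # SITL builds appreciate extra RAM
  end
end
```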

 

JMAVSim for software-in-the-loop on the Pixhawk 2

https://pixhawk.org/users/hil

https://pixhawk.org/dev/hil/jmavsim

 

What is the difference between apm planner and mission planner?

Set up the Pi as an access point. Could then use it as the radio link. Not very long range.

https://learn.adafruit.com/setting-up-a-raspberry-pi-as-a-wifi-access-point/overview

 

Supposedly the APM 2.6 will connect through USB.

Dronekit

http://python.dronekit.io/guide/quick_start.html

MAVLink and pymavlink. Evidently DroneKit uses pymavlink.

pymavlink gives low-level Python control of MAVLink messages.

MAVProxy – command-line ground station software. More feature-packed than APM Planner? It has the ability to use multiple linked ground stations.

MAVProxy can forward data to a given port. Useful, but I can’t find it documented in the MAVProxy docs themselves.
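The forwarding is done with MAVProxy’s --out option; a sketch of an invocation (the device path, baud rate, and addresses are made-up examples, not our setup):

```
# telemetry radio in on serial, forwarded to two ground stations over UDP
mavproxy.py --master=/dev/ttyUSB0 --baudrate 57600 \
            --out=udp:192.168.1.10:14550 \
            --out=udp:192.168.1.11:14550
```

Each --out adds another destination, which is presumably how the multiple linked ground stations work.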

 

Dronecode is a set of projects

Dronecode Platform

Really nice looking simulator

https://github.com/Microsoft/AirSim/blob/master/docs/linux_build.md

I had to sign up with Epic Games and link my GitHub account to be able to clone the Unreal Engine.

We’re using a Turnigy 9x. Got a PPM encoder to be able to attach it to the Pixhawk.

 

Setting up the pixhawk 2:

The motors need to be plugged in according to their number

http://ardupilot.org/copter/docs/connect-escs-and-motors.html

Download APM planner 2

Flashed the firmware

Ran through the initial calibration. Followed onscreen instructions.

Not immediately getting all the buttons working

http://ardupilot.org/copter/docs/common-rc-transmitter-flight-mode-configuration.html

Swapped channels 5 and 6 on the controller to get the flight mode switch.

Flight modes

Stabilize – self-levels the roll and pitch axes

FS_THR_VALUE error. Not sure why.

The compass is not calibrating. Not sure why.

 

We had lots of problems until we uploaded the latest firmware. It loaded firmware at the beginning, but I guess it wasn’t the latest. We built APM Planner from source, and perhaps that re-update brought the firmware to 3.5.1.

Spinning up, it flew but kept rotating. We had wired up the motors CCW and CW opposite to the wiring diagram but never changed it in the firmware.

 

Dronecode uses QGroundControl. This is sort of an APM Planner alternative.

v.channels gives a dict.

Channel 2 was right stick up/down.

Channel 3 was left stick up/down.
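A sketch of how we read these. The mapping is just what we observed on our transmitter, and describe_channels is a made-up helper name; with DroneKit, vehicle.channels behaves like a dict keyed by channel-number strings.

```python
# Map channel numbers to the stick behavior we observed on our Turnigy 9x.
# Channel assignments vary by transmitter, so treat these as examples.
STICK_NAMES = {
    '1': 'right stick left/right (roll)',
    '2': 'right stick up/down (pitch)',
    '3': 'left stick up/down (throttle)',
    '4': 'left stick left/right (yaw)',
}

def describe_channels(channels):
    """Turn a channels dict (channel number -> PWM value) into labeled readings."""
    return {STICK_NAMES.get(ch, 'channel ' + ch): pwm
            for ch, pwm in channels.items()}

# With a live vehicle this would be something like:
#   from dronekit import connect
#   vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)
#   print(describe_channels(vehicle.channels))
```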

 

DroneKit Cloud. Web APIs for drone control? This kind of seems to be for when you have a ton of drones. Forward-looking.

 

In the field we can connect to the drone using the phone as a hotspot.

 

It seems like only Guided mode will accept MAVLink commands.

The controller’s mode switch overrides what the Pi says.

Stabilize mode should ignore MAVLink commands? In case they get wonky.

RTL.

So we set the controller’s flight-mode switch to those three modes, in case something goes wrong.

Put this in a dronerun file:

nohup python "$@" &

so the program won’t stop when the SSH pipe dies (a bare & can still be killed by SIGHUP when the session closes; nohup avoids that).

Need to set the RTL speed and altitude. The defaults may be alarming.

WPNAV_SPEED

WPNAV_SPEED_UP default is 250 (cm/s); WPNAV_SPEED_DN default is 150 (cm/s).
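A sketch of dialing these down before flight, assuming a DroneKit-style vehicle.parameters mapping. The parameter names are the standard ArduCopter ones (units are cm and cm/s); the values chosen here are illustrative, not recommendations, and set_conservative_nav_params is a made-up helper name.

```python
def set_conservative_nav_params(params):
    """Cap waypoint/RTL speeds and set an explicit RTL altitude.

    params is any dict-like mapping of ArduCopter parameter names to
    values, e.g. DroneKit's vehicle.parameters. Units: cm and cm/s.
    """
    params['WPNAV_SPEED'] = 300      # horizontal waypoint speed
    params['WPNAV_SPEED_UP'] = 150   # climb speed (firmware default 250)
    params['WPNAV_SPEED_DN'] = 100   # descent speed (firmware default 150)
    params['RTL_ALT'] = 1500         # return-to-launch altitude, 15 m
    return params
```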

Crash in RTL mode. (Toilet-bowl behavior? It seemed to be moving in a circle.) I also felt like Loiter mode responded counterintuitively to my commands.

 

We’d like to use the Raspberry Pi camera for visual odometry.

The optical-flow MAVLink message is implemented in ArduPilot.

https://github.com/PX4/OpticalFlow

http://mavlink.org/messages/common#OPTICAL_FLOW_RAD

http://ardupilot.org/dev/docs/copter-commands-in-guided-mode.html

Actual source:

https://github.com/ArduPilot/ardupilot/blob/master/ArduCopter/GCS_Mavlink.cpp#L967

 

Movidius Neural Compute Stick

https://developer.movidius.com/getting-started

Installed VirtualBox and Ubuntu 16.04 on my MacBook (welcome to the danger zone). Nice and fresh. sudo apt-get update and upgrade. I eventually ran into problems that are not cleared up on the forum. Switched to using a native 16.04 installation. The setup ran without a hitch. Beautiful.

Get the latest SDK

https://ncs-forum-uploads.s3.amazonaws.com/ncsdk/MvNC_SDK_01_07_07/MvNC_SDK_1.07.07.tgz

following these instructions

https://developer.movidius.com/getting-started/software-setup

I had to restart the terminal before running setup.sh for ncapi. It added something to my .bashrc, I think. OK, that is actually mentioned in the manual. Nice.

Now to test, in the bin folder.

Also examples 01–03.

They all seem to run. Excellent.

Looks like ~100 ms for one inference, for whatever that is worth.

“example00 compiles lenet8 prototxt to a binary graph, example01 profiles GoogLeNet, example03 validates lenet8 using a simple inbuilt image.”

https://developer.movidius.com/getting-started/run-inferencing-on-ncs-using-the-api

Go to ncapi/c_examples

make

 

Options for ncs-fullcheck are inference count and log level.

go to py_examples

stream_infer

It really likes “oxygen mask”.

But it was successful on sunglasses and a coffee mug, although it did oscillate a little.

The README in stream_infer is interesting.

stat.txt holds the average RGB and standard-deviation values.

I wonder if I could run two sticks?

A lot of the stuff is gstreamer related

The Movidius beef seems to be: you just load the tensor and then get it back.

There is some recommended preprocessing of the image, and grabbing the label files and so on, but that is all standard Python. Change the mean and std dev to match the network. Also convert to a float16 array. Resize to 227×227.
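A sketch of that preprocessing in plain NumPy. The mean/std values here are placeholders (the real ones come from the stat.txt shipped with each network), and a real pipeline would resize with OpenCV or GStreamer rather than the index trick below.

```python
import numpy as np

def preprocess(img, size=227, mean=(110.0, 115.0, 120.0), std=255.0):
    """Resize an HxWx3 uint8 frame and normalise it for LoadTensor().

    Nearest-neighbour resize via index math, then subtract the
    per-channel mean, divide by the std dev, and cast to float16.
    """
    h, w, _ = img.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = img[rows][:, cols].astype(np.float32)
    out = (resized - np.asarray(mean, dtype=np.float32)) / std
    return out.astype(np.float16)

# With the stick attached, the round trip is then roughly:
#   graph.LoadTensor(preprocess(frame), 'user object')
#   output, userobj = graph.GetResult()
```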

I’ve never used GStreamer. I wonder if there is a problem using the standard OpenCV stuff instead. Doesn’t seem like there should be.

 

In the last couple days, they released instructions on how to run on a raspberry pi.

 

Object localization would be very useful for us.

Get the script for Faster R-CNN:

https://github.com/rbgirshick/py-faster-rcnn/blob/master/data/scripts/fetch_faster_rcnn_models.sh

Copy the contents.

chmod +x that script and run it.

 

To take a new network and make it run:

You run mvNCCompile on the prototxt (which describes the shape of the network, among other things) and the caffemodel weight file.

For example:

python3 ./mvNCCompile.pyc ./data/lenet8.prototxt -w ./data/lenet8.caffemodel -s 12 -o ./lenet8_graph

Then you can profile it and check its correctness. It is unclear at this point how easy it will be to take stock networks and get them to run.
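Profiling and checking use the sibling tools from the same SDK, with the same argument shape as the compile step (flags mirrored from the compile command above, so double-check them against the SDK docs):

```
python3 ./mvNCProfile.pyc ./data/lenet8.prototxt -w ./data/lenet8.caffemodel -s 12
python3 ./mvNCCheck.pyc   ./data/lenet8.prototxt -w ./data/lenet8.caffemodel -s 12
```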

https://huangying-zhan.github.io/2016/09/22/detection-faster-rcnn.html