Da Laser BACK END BABY

We need to build that back end baby. The butt so to speak. Where the action BE AT.

The idea is to have collaborative aiming of our laser across the internet.

Check it out https://github.com/ishmandoo/laser

So we’re gonna docker this. Because we’re off the goddamn chain.

Ben already has some code on his GitHub:

https://github.com/ishmandoo/multiplayer_test

that we’re going to modify to our needs. He essentially used

http://socket.io/get-started/chat/ as a base level + some basic Docker.

So this already does what we need it to do basically. Just change the name of the git repository and it’ll boot up the server.

check this out https://www.digitalocean.com/community/tutorials/docker-explained-using-dockerfiles-to-automate-building-of-images for more deets.

“I built it with ‘docker build -t laser .’ and ran it with ‘docker run -d -p 80:3000 laser’.” - Ben
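For reference, the Dockerfile for a socket.io-chat-style Node app usually looks something like this. This is a sketch, not necessarily the exact file in the repo; the base image and the entry point (index.js) are assumptions:

```dockerfile
# Sketch of a Dockerfile for a socket.io chat-style Node app.
# Base image and entry point (index.js) are assumptions, not from the repo.
FROM node

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY package.json ./
RUN npm install

# Then copy in the rest of the app
COPY . .

# The chat tutorial listens on port 3000, hence the -p 80:3000 above
EXPOSE 3000
CMD ["node", "index.js"]
```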

Okay, also technically we need a front end too, so

https://developer.mozilla.org/en-US/docs/Web/API/Touch_events#Create_a_canvas

duct taped together with the chat application and http://stackoverflow.com/questions/4037212/html-canvas-full-screen full screening.

We just need to tell the server when a touch is down and where. We normalized x and y to be fractions from 0 to 1 across the screen. It seems we can’t rely on the values not going a little over 1, so be aware.
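The normalization (with clamping for that going-over-1 thing) looks roughly like this. The `socket` variable and the "move" event name are our guesses here, not necessarily what the repo uses:

```javascript
// Normalize a touch point to 0-1 fractions of the screen, clamped so a
// touch that runs slightly off the edge can't exceed the 0-1 range.
function normalizeTouch(touch, width, height) {
  const clamp = (v) => Math.min(1, Math.max(0, v));
  return {
    x: clamp(touch.clientX / width),
    y: clamp(touch.clientY / height),
  };
}

// In the browser this would get wired up roughly like so
// (event name "move" is our invention):
// canvas.addEventListener("touchmove", (e) => {
//   e.preventDefault();
//   const pos = normalizeTouch(e.touches[0], canvas.width, canvas.height);
//   socket.emit("move", pos);
// });
```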

The Socket.io part is very straightforward.

The Express stuff just serves the ordinary webpages. (app.get)

The only thing kind of funky is storing data associated with each client. We just attach it to the socket object, which should be self-cleaning to some degree. We also add a timeout just in case we somehow lose a touch end. Maybe unnecessary.

Polling the path /location will return JSON of the average position of everyone touching the screen.
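The averaging itself is trivial. Hypothetically, with the helper names made up (not from the repo):

```javascript
// Average the positions of everyone currently touching. Each entry is a
// normalized {x, y} as stored on the sockets.
function averagePosition(touches) {
  if (touches.length === 0) return null; // nobody touching
  const sum = touches.reduce(
    (acc, t) => ({ x: acc.x + t.x, y: acc.y + t.y }),
    { x: 0, y: 0 }
  );
  return { x: sum.x / touches.length, y: sum.y / touches.length };
}

// Express wiring, roughly ("activeTouches" is a hypothetical helper):
// app.get("/location", (req, res) => {
//   res.json(averagePosition(activeTouches()));
// });
```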

All in all pretty straightforward. And it seems to work with our phones. Except not Beth’s.

Now onto the microcontroller code. It should be simple, but for some reason it’s not. We have to glue together a couple of things if we want to bolt the Arduino to an ESP8266. Alternatives are to go native on an ESP8266 or use a Photon. It is bizarre to me that one of these is not a clear winner. Why is it not incredibly easy to make HTTP requests on the Photon?

Also, a comment: being named the Photon makes it impossible to search for. It sounds cool, but it’s a bad name.

Aiming a Laser Project

So we hot glued two servos together and then to a mirror.

We’re aiming a laser at it. A blindingly powerful laser.

Essentially Servo B controls the angle in the x-y plane of the mirror and servo A controls the z angle to that plane.

The mirror law is v_2=v_1 - 2(v_1 \cdot n)n

where all are unit vectors: v_1 is the incoming ray, n is the mirror normal, and v_2 is the outgoing ray.
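In code the mirror law is basically a one-liner. A sketch, with plain arrays as vectors:

```javascript
// Mirror law: v2 = v1 - 2 (v1 . n) n, all unit vectors.
function reflect(v1, n) {
  const dot = v1[0] * n[0] + v1[1] * n[1] + v1[2] * n[2];
  return [
    v1[0] - 2 * dot * n[0],
    v1[1] - 2 * dot * n[1],
    v1[2] - 2 * dot * n[2],
  ];
}
```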

We fixed the laser to the base so

v_1 = \hat{y}

The outgoing ray must hit the ceiling (at a height R) at position (x, y).

v_2 = (x,y,R) \frac {1} {\sqrt{x^2+y^2+R^2}}

Here’s a nice observation:

v_2 - \hat{y} \propto n

So we have an algorithm for finding n right there. Find v_2, subtract off \hat{y}, and then normalize the resulting vector to get \hat{n}.
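That recipe in code, a sketch (laser fixed along +y, target (x, y) on a ceiling at height R):

```javascript
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}

// Mirror normal needed to send the beam to (x, y) on a ceiling at height R,
// with the incoming ray fixed at v1 = y-hat.
function mirrorNormal(x, y, R) {
  const v2 = normalize([x, y, R]);        // outgoing unit vector
  return normalize([v2[0], v2[1] - 1, v2[2]]); // n is proportional to v2 - y-hat
}
```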

Then finally we can write n in terms of the angles \alpha,\beta, which heavily depend on our conventions of where angle 0 is and whether the servos spin clockwise or counterclockwise.

I believe ours came out to be something like:

n = (\cos(\alpha)\sin(\beta), -\cos(\alpha)\cos(\beta) ,\sin(\alpha))
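Inverting that expression gives the servo angles. Again a sketch under our sign conventions; your servos will almost certainly differ:

```javascript
// Recover (alpha, beta) from n = (cos(a)sin(b), -cos(a)cos(b), sin(a)).
// The signs depend entirely on how the servos are mounted, so expect to
// fiddle with them.
function anglesFromNormal(n) {
  const alpha = Math.asin(n[2]);        // z-component gives alpha directly
  const beta = Math.atan2(n[0], -n[1]); // x and -y components give beta
  return { alpha, beta };
}
```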

In all honesty, we coded her up, then fiddled with minus signs until it was working. Not necessarily a bad way of going about things. Find the things to think about and find the things to just try.

Then here is the code: