Contracts and Category Theory

Contracts are both less cryptic and less impressive than a type system.

A type system proves at compile time that certain bad things can never happen: you'll never get a string where you expected an int, and sometimes far more powerful facts. Contracts only guarantee that if certain problems do occur, the program will at least freak out and tell you at runtime.

I had been vaguely aware of contracts, which I'd considered a kind of band-aid Racket thing: their solution to type safety (besides Typed Racket). But I never went into depth, and I viewed them more as a methodology for showing loop invariants and algorithmic correctness than type correctness. I don't know if that's an accurate portrayal of what's in Racket, and I'm pretty sure contracts don't actually originate there (Eiffel?).

Mike Stay (who you may know as the co-author of the Rosetta Stone paper https://arxiv.org/abs/0903.0340) made a bunch of videos which I don't know how I didn't come across before (they're kind of old by skateboarding millennial Mountain Dew front end developer standards. Did Node even exist? Barely.). JavaScript (well, maybe after Python) was my language of greatest comfort a couple years ago, and I would have loved this. I absolutely loved Bartosz Milewski's Category Theory for Programmers series. There is a crap-ton of line noise in JavaScript that kind of muddies the waters, though. I wonder if it makes sense to me only because I mostly already understand what he's going for from a Haskell context. He has some cool patterns here, like using objects as typeclasses/interfaces.

https://jscategory.wordpress.com/

The really neat thing he discusses is higher-order contracts, which I'd bet are a central piece of contract-based programming, but I had never gotten that far.
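The flavor of a higher-order contract can be sketched in a few lines (in Python here for brevity; the videos use JavaScript, and the names `flat` and `func` below are my own hypothetical choices). A first-order contract checks a value immediately; a higher-order contract checks a *function* by wrapping it, so the argument and result checks happen lazily at each call, and blame can be assigned to whichever side broke the deal.

```python
# Minimal sketch of first-order vs. higher-order contracts.
# Not any particular library's API; just the idea.

def flat(pred, name):
    """First-order contract: check a value right now."""
    def check(x, blame):
        if not pred(x):
            raise TypeError(f"{blame} broke contract {name}: got {x!r}")
        return x
    return check

def func(dom, rng):
    """Higher-order contract: wrap a function so that the domain
    contract blames the caller and the range contract blames the callee."""
    def check(f, blame):
        def wrapped(x):
            return rng(f(dom(x, f"caller of {blame}")), blame)
        return wrapped
    return check

is_int = flat(lambda x: isinstance(x, int), "int")

inc = func(is_int, is_int)(lambda x: x + 1, "inc")
inc(3)      # fine, returns 4
# inc("3")  # raises TypeError blaming the caller of inc
```

The blame-tracking is the interesting part: since a function contract can't be checked eagerly, the wrapper has to remember who to point the finger at when a check eventually fails.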

I’m still skimming over the vids, but nice job.


Convolution and Tensor Product

I realized a relation the other day that made me feel like a doof.

The tensored extension H\otimes I + I\otimes H of a 1-particle operator H occurs often. This is also the form of a Laplacian in many variables.

One of the most fantastic properties of the exponential and tensor product is

e^{H\otimes I +I\otimes H}=e^{H}\otimes e^{H}

I guess that looks okay, but it is worth remarking how well Kronecker multiplication meshes with the matrix exponential.
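The identity is easy to check numerically, since H\otimes I and I\otimes H commute. A quick sketch, assuming SciPy's `expm` and a small symmetric H of my own choosing:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
H = A + A.T  # symmetric, standing in for a 1-particle operator

I = np.eye(3)
# e^{H (x) I + I (x) H}  vs  e^H (x) e^H
lhs = expm(np.kron(H, I) + np.kron(I, H))
rhs = np.kron(expm(H), expm(H))
print(np.allclose(lhs, rhs))  # True
```

It works because (H\otimes I)^n = H^n\otimes I, so e^{H\otimes I} = e^H\otimes I, and the two commuting pieces let the exponential of the sum factor into the product.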

While the exponential is easy, the inverse is not. You can't write the inverse of H\otimes I + I\otimes H simply in terms of the inverse of H, the way you can write its exponential in terms of Kronecker products of e^H.

At the same time, this operator is roughly the propagator e^{tH}, which you often use in its Laplace-transformed version G(E)=\frac{1}{H-E}. But that right there gives you an interesting integral expression for the inverse of the tensor extension of H.

e^{t H\otimes I}e^{t I\otimes H} is a product in the time domain. Hence it becomes a convolution in the Fourier domain.

G(E) =\frac{1}{H-E} (*) \frac{1}{H-E}=\int dE' \frac{1}{H-E'}\otimes\frac{1}{H-(E-E')}

where that star is some kind of Kroneckerized convolution. It makes sense that a Kronecker product in the time domain becomes a Kronecker convolution in the Fourier domain. If you work the integral out in an eigenvalue expansion, you can see that it does work.

This suggests a kind of interesting method for finding the inverse of the tensorized sum (I'm having a hard time knowing what to call it, but sums of the form A\otimes I+I\otimes B) in terms of the resolvents of the original operators, using numerical integration.