Problems involving outer products of vector variables can be relaxed into semidefinite programs. That's a general trick: lift x to X = xx^T, then drop the rank constraint and keep only X being positive semidefinite. The best low-rank piece from an SVD of the relaxed solution then gives an approximate solution for the original vector.
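A minimal sketch of the recovery step, skipping the SDP solve itself (a real instance would use a solver; here the "SDP solution" X is faked as a near-rank-1 PSD matrix):

```python
import numpy as np

# Lift-and-relax sketch: a quadratic problem in x becomes linear in X = x x^T;
# the relaxation keeps X PSD but drops rank(X) = 1. Afterwards, a vector is
# recovered by truncating the relaxed solution to rank 1.

rng = np.random.default_rng(0)
x_true = rng.standard_normal(5)
X = np.outer(x_true, x_true) + 1e-3 * np.eye(5)  # stand-in "SDP solution"

# Best rank-1 approximation of a symmetric PSD matrix: top eigenpair.
w, V = np.linalg.eigh(X)            # eigenvalues in ascending order
x_hat = np.sqrt(w[-1]) * V[:, -1]   # recovered vector, sign is ambiguous

# x_hat matches x_true up to sign
err = min(np.linalg.norm(x_hat - x_true), np.linalg.norm(x_hat + x_true))
print(err)
```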

https://en.wikipedia.org/wiki/Matrix_completion

convex relaxation for distributed optimal control

http://ieeexplore.ieee.org/document/7464306/

graph matching in relation to image correspondence

One relaxation of permutation matrices: keep the constraints that every row and every column sums to 1 (with non-negative entries), i.e. relax to doubly stochastic matrices.
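A quick illustrative sketch of that relaxed set: Sinkhorn iteration takes any positive matrix to a doubly stochastic one by alternately normalizing rows and columns (this is one standard way to work with the relaxation, not something from the references above):

```python
import numpy as np

# The Birkhoff relaxation replaces permutation matrices with doubly
# stochastic ones: non-negative entries, every row and column sums to 1.
# Sinkhorn iteration scales a positive matrix into that set.

def sinkhorn(M, iters=200):
    """Alternately normalize rows and columns of a positive matrix."""
    P = M.copy()
    for _ in range(iters):
        P /= P.sum(axis=1, keepdims=True)  # rows sum to 1
        P /= P.sum(axis=0, keepdims=True)  # columns sum to 1
    return P

rng = np.random.default_rng(1)
P = sinkhorn(rng.random((4, 4)) + 0.1)
print(P.sum(axis=0))  # columns close to 1
print(P.sum(axis=1))  # rows close to 1
```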

quickMatch. Actually not convex programming, but it was the root of the chain of references I'm digging through.

http://openaccess.thecvf.com/content_ICCV_2017/papers/Tron_Fast_Multi-Image_Matching_ICCV_2017_paper.pdf

https://dl.acm.org/citation.cfm?id=2600314

matchALS

www.cis.upenn.edu/~kostas/mypub.dir/xiaowei15iccv.pdf

https://arxiv.org/abs/1402.1473

https://vision.in.tum.de/research/convex_relaxation_methods

Finding an approximate MaxCut of a graph is a classic one (the Goemans-Williamson SDP relaxation).
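A sketch of the Goemans-Williamson rounding step. The SDP assigns each vertex a unit vector; the solve itself needs an SDP solver (cvxpy/SCS, say), so as an assumption the vectors here are just random unit vectors, and the point is the random-hyperplane rounding and that any rounded cut is bounded by the true optimum:

```python
import itertools
import numpy as np

# Goemans-Williamson: the MaxCut SDP maximizes sum over edges of
# (1 - v_i . v_j) / 2 over unit vectors v_i; rounding picks a random
# hyperplane and splits vertices by which side their vector falls on.

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]  # 5-cycle, max cut = 4

def cut_value(sides):
    return sum(1 for i, j in edges if sides[i] != sides[j])

# Brute-force optimum for reference on this tiny graph.
opt = max(cut_value(s) for s in itertools.product([0, 1], repeat=5))

# Stand-in for the SDP solution: random unit vectors (an assumption here).
rng = np.random.default_rng(2)
V = rng.standard_normal((5, 5))
V /= np.linalg.norm(V, axis=1, keepdims=True)

# Random-hyperplane rounding, keeping the best cut found.
best = 0
for _ in range(100):
    r = rng.standard_normal(5)
    sides = (V @ r > 0).astype(int)
    best = max(best, cut_value(sides))

print(opt, best)  # best never exceeds opt
```

With the actual SDP vectors instead of random ones, the expected rounded cut is within a 0.878 factor of the optimum.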

Quantum Semidefinite programming course

Density matrices have a semidefinite constraint (non-negative probabilities) plus a trace-1 constraint.
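A small check of why that is: a density matrix is Hermitian, PSD, and trace 1, and its eigenvalues are the probabilities in its eigen-decomposition, so "probabilities are non-negative" is exactly positive semidefiniteness:

```python
import numpy as np

# Build a density matrix as an equal mixture of |0> and |+>, then verify
# the semidefinite structure: trace 1 and non-negative eigenvalues
# (the probabilities of the pure states in its eigen-decomposition).

ket0 = np.array([1, 0], dtype=complex)
ketp = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ketp, ketp.conj())

probs = np.linalg.eigvalsh(rho)  # real, since rho is Hermitian

print(np.trace(rho).real)  # 1.0
print(probs)               # non-negative, sums to 1
```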

https://cs.uwaterloo.ca/~watrous/CS867.Winter2017/

Sum of Squares programming is a semidefinite program that can certify that candidate Lyapunov functions actually work.
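The simplest instance of that idea, as a sketch: for a linear system x' = Ax, a quadratic Lyapunov function V(x) = x^T P x works iff P > 0 and A^T P + P A < 0, which are semidefinite constraints; SOS programming generalizes this to polynomial systems and polynomial V. Here P is found directly from the Lyapunov equation rather than an SDP solver:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# For x' = A x with A stable, solving A^T P + P A = -Q (Q > 0) yields a
# positive definite P, certifying V(x) = x^T P x as a Lyapunov function.

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])  # stable: eigenvalues -1 and -2
Q = np.eye(2)

# solve_continuous_lyapunov(a, q) solves a @ X + X @ a.conj().T = q,
# so pass a = A.T and q = -Q to get A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

print(np.linalg.eigvalsh(P))  # all positive: V decreases along trajectories
```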