How do I prove that the Hadamard satisfies $H\equiv e^{i\pi H/2}$?

8

1

How can I demonstrate the following exponential identity for the Hadamard matrix: $$H=\frac{X+Z}{\sqrt2}\equiv\exp\left(i\frac{\pi}{2}\frac{X+Z}{\sqrt2}\right)?$$

More generally, how can I show that $\exp(i(\pi/2)M) \equiv M$ for any matrix $M$ whose eigenvalues are all $\pm 1$, and that such matrices are exactly the ones satisfying $M^2=I$?

(Actually, I tried a guess: since the eigenvalues are all $\pm 1$, hence real, the matrix is Hermitian, and since $M^2=I$ the matrix is unitary, so the relation $\exp(i(\pi/2)M) \equiv M$ should hold.)

Source: https://community.qiskit.org/textbook/chEx/Ex2.html

walid

Posted 2019-10-22T21:42:34.990

Reputation: 305

Answers

9

First of all, note that the statement, as written, is wrong (or rather, it is correct only as long as the "$\equiv$" symbol is taken to mean "equal up to a phase"). An easy way to see this is to take the determinant of both sides of $H=e^{i\pi H/2}$, which gives $-1=1$ (using $\det[\exp(A)]=\exp[\operatorname{Tr}(A)]$ for all $A$ and $\operatorname{Tr}(H)=0$).
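As a quick numerical illustration (just a sanity check with NumPy/SciPy, not part of the argument), the exponential indeed works out to $iH$, i.e. $H$ only up to a global phase:

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = (X + Z) / np.sqrt(2)

U = expm(1j * np.pi / 2 * H)
print(np.round(np.linalg.det(H).real, 6))   # -1.0
print(np.round(np.linalg.det(U).real, 6))   #  1.0, so U cannot equal H exactly
print(np.allclose(U, H))                    # False
print(np.allclose(U, 1j * H))               # True: equal up to the global phase i
```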


Now, assume $A$ is any matrix satisfying $A^2=I$ (note that we don't even need to assume that $A$ has a specific dimension). Then, $\exp(i\alpha A)=\cos(\alpha)I + i\sin(\alpha)A$ (you can see this by expanding the exponential in a power series as shown in another answer, or by directly using the identity $\exp(iA)=\cos(A)+i\sin(A)$, which also holds for matrices). It follows that $$e^{i\pi A/2}=i A.$$ Your formula is a special case of this. You can check that, given any direction $\hat{\mathbf n}$ with $|\hat{\mathbf n}|=1$, and denoting by $\boldsymbol\sigma_i$ the $i$-th Pauli matrix, you have $$(\hat{\mathbf n}\cdot\boldsymbol \sigma)^2=I,$$ and thus the conclusion follows.
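If you want to convince yourself numerically, here is a small sketch (using NumPy/SciPy, with a randomly chosen unit vector $\hat{\mathbf n}$) checking the three statements above:

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(0)
n = rng.normal(size=3)
n /= np.linalg.norm(n)                        # random unit vector n_hat
A = n[0] * X + n[1] * Y + n[2] * Z            # A = n_hat . sigma

print(np.allclose(A @ A, np.eye(2)))          # (n_hat . sigma)^2 = I

alpha = rng.uniform(0, 2 * np.pi)
lhs = expm(1j * alpha * A)
rhs = np.cos(alpha) * np.eye(2) + 1j * np.sin(alpha) * A
print(np.allclose(lhs, rhs))                  # exp(i alpha A) = cos(alpha) I + i sin(alpha) A

print(np.allclose(expm(1j * np.pi / 2 * A), 1j * A))  # e^{i pi A/2} = i A
```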

Finally, because you also referenced the eigenvalues, note that $A^2=I$ if and only if $A=PDP^{-1}$ with $P$ invertible and $D$ diagonal with $\pm1$ elements. See this answer on math.SE for a proof.

glS

Posted 2019-10-22T21:42:34.990

Reputation: 12 247

That makes sense, thank you @glS. – walid – 2019-10-23T19:12:17.603

What do you mean by: " given any direction n with |n|=1, denoting with σi the i-th Pauli matrix, you have (n⋅σ)^2=(n⋅σ) " ? – walid – 2019-10-23T19:30:54.070

@glS is there any policy regarding exercise questions like this, similar to the ones in, say, Physics SE? As I see it, you basically repeated both my answer and the answer of ChainedSymmetry, and I already felt that my hint should have been more than enough for anyone really working on this question to solve it. – Marsl – 2019-10-23T19:38:58.643

I think you were looking for $(\hat n \cdot \sigma)^2 = I$. As stated that's not accurate. It might also be worth noting that equation in the cited example is not wrong, but as stated in the question (without the definition of $\equiv$) it's ambiguous. – Jonathan Trousdale – 2019-10-23T19:44:51.780

@ChainedSymmetry you are right, of course! I'll add a remark along those lines – glS – 2019-10-23T19:56:37.893

1

@Marsl there isn't, but you might want to check out the recent discussion on meta about adding a vote-to-close reason for "not enough effort questions". Not that I think this question would fall into that category, mind you. My personal opinion is that a question should stand as long as it is a useful contribution to the site (i.e. it might help someone looking for the answer to a similar problem in the future)

– glS – 2019-10-23T20:01:13.167

6

For questions like this, the conventional physics notation is easier to work with than the QIT gate notation. Define $\vec \sigma = (\sigma_1,\sigma_2,\sigma_3)$ to represent the three Pauli matrices $$\sigma_1 = X = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \;\;\; \sigma_2 = Y = \begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}, \;\;\; \sigma_3 = Z = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}.$$ The Pauli matrices form a basis for the Lie algebra $\mathfrak{su}_2$, and the corresponding Lie group elements, $U \in SU(2)$, are given by the exponential map $$U = e^{i\, \vec \phi \, \cdot \vec \sigma}, \;\;\; \vec \phi \in \mathbb{R}^3.$$ Define the vector $\vec \phi$ by a magnitude $\alpha$ and a unit vector $\hat \phi = (\phi_1,\phi_2,\phi_3)$ such that $\vec \phi = \alpha \hat \phi$. Simple multiplication shows that $(\vec \phi \cdot \vec \sigma)^2 = \alpha^2 I$. With this relationship, the Taylor expansion of $U$ works out very nicely: $$U = \sum \limits_{n=0}^\infty \frac{i^n}{n!} \, (\vec \phi \cdot \vec \sigma)^n = I \sum \limits_{j=0}^\infty \frac{(-1)^j}{(2j)!} \, \alpha^{2j} + i \hat \phi \cdot \vec \sigma \sum \limits_{j=0}^\infty \frac{(-1)^j}{(2j + 1)!} \, \alpha^{2j+1}$$ $$=I \, \cos \alpha + i \hat \phi \cdot \vec \sigma \sin \alpha = \begin{bmatrix} \cos \alpha + i \phi_3 \sin \alpha & (\phi_2 + i \phi_1) \sin \alpha \\ (-\phi_2 + i \phi_1) \sin \alpha & \cos \alpha - i \phi_3 \sin \alpha \end{bmatrix}.$$
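As a sanity check of the closed-form expression (purely illustrative, using NumPy/SciPy with a randomly chosen $\vec\phi$):

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(1)
phi_vec = rng.normal(size=3)                 # a random Lie algebra parameter vector
alpha = np.linalg.norm(phi_vec)              # magnitude alpha
p1, p2, p3 = phi_vec / alpha                 # unit vector phi_hat

U_exact = expm(1j * (phi_vec[0] * X + phi_vec[1] * Y + phi_vec[2] * Z))
U_formula = np.array([
    [np.cos(alpha) + 1j * p3 * np.sin(alpha), (p2 + 1j * p1) * np.sin(alpha)],
    [(-p2 + 1j * p1) * np.sin(alpha), np.cos(alpha) - 1j * p3 * np.sin(alpha)],
])
print(np.allclose(U_exact, U_formula))       # True
```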

With this formula it's simple to find the group element corresponding to given Lie algebra parameters. In the case of your specific question, $\alpha = \tfrac{\pi}{2}$ and $\hat \phi = (\tfrac{1}{\sqrt{2}}, 0, \tfrac{1}{\sqrt{2}})$. Plugging this in gives $$U_{x+z} = \begin{bmatrix} \frac{i}{\sqrt{2}} & \frac{i}{\sqrt{2}} \\ \frac{i}{\sqrt{2}} & -\frac{i}{\sqrt{2}} \end{bmatrix} = \frac{i(X + Z)}{\sqrt{2}}.$$ In the underlying question from Qiskit, $\equiv$ is defined as equivalence modulo global phase, so, as desired, the result equals $\tfrac{X+Z}{\sqrt{2}}$ up to a global phase of $e^{i\frac{\pi}{2}}$.

The more general question of determining which other Lie algebra parametrizations share this property (apart from the trivial solutions given by multiples of the identity) reduces to finding solutions of $$e^{i \vec \phi \, \cdot \, \vec \sigma} = e^{i \theta} \, \hat \phi \cdot \vec \sigma$$ for some global phase $\theta$. This requires $\cos \alpha = 0$, so any solution has $\alpha = \pm \tfrac{\pi}{2}$, in which case $e^{\pm i \frac{\pi}{2} \hat \phi \, \cdot \, \vec \sigma} = \pm i \, \hat \phi \cdot \vec \sigma$.

Jonathan Trousdale

Posted 2019-10-22T21:42:34.990

Reputation: 2 714

4

Hint: Consider the series expansion of the exponential in the case of matrices $M$ that satisfy $M^2=-I$ (in your case $M=i \frac{X+Z}{\sqrt{2}}$). You should find something akin to Euler's formula, which renders the proof trivial.
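(If you want to check the hint numerically, here is a minimal sketch with NumPy/SciPy, using the identification $M=i\frac{X+Z}{\sqrt{2}}$ from the hint:)

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
M = 1j * (X + Z) / np.sqrt(2)

print(np.allclose(M @ M, -np.eye(2)))        # M^2 = -I, so the series splits like Euler's formula
print(np.allclose(expm(np.pi / 2 * M), M))   # exp((pi/2) M) = M, i.e. e^{i pi H/2} = i H
```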

Marsl

Posted 2019-10-22T21:42:34.990

Reputation: 729

3

You can take the definition of a function $f$ acting on any normal matrix to be such that if $$ M=\sum_i\lambda_i|\lambda_i\rangle\langle\lambda_i|, $$ then $$ f(M):=\sum_if(\lambda_i)|\lambda_i\rangle\langle\lambda_i|. $$ So, for any $M$ such that $M^2=I$, we can write $M=P_+-P_-$ in terms of the projectors onto its $\pm1$ eigenspaces, with $P_++P_-=I$. Solving these two equations simultaneously gives $P_+=(I+M)/2$ and $P_-=(I-M)/2$. Now, taking $f(x)=e^{itx}$, $$ f(M)=e^{it}P_++e^{-it}P_-=e^{it}(I+M)/2+e^{-it}(I-M)/2=\cos(t)I+i\sin(t)M. $$ Setting $t=\pi/2$ gives $e^{i\pi M/2}=iM$, which is $M$ up to a global phase.
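A small numerical sketch of this construction (purely illustrative, taking $M$ to be the Hadamard matrix and $f(x)=e^{itx}$ with $t=\pi/2$):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
M = (X + Z) / np.sqrt(2)          # M^2 = I

I2 = np.eye(2)
P_plus = (I2 + M) / 2             # projector onto the +1 eigenspace
P_minus = (I2 - M) / 2            # projector onto the -1 eigenspace

t = np.pi / 2
fM = np.exp(1j * t) * P_plus + np.exp(-1j * t) * P_minus
print(np.allclose(fM, np.cos(t) * I2 + 1j * np.sin(t) * M))  # True
print(np.allclose(fM, 1j * M))                               # True: e^{i pi M/2} = i M
```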

DaftWullie

Posted 2019-10-22T21:42:34.990

Reputation: 35 722