Orientational embeddings are tensorial representations of orientations with the specific property that each class of symmetrically equivalent orientations has a unique tensor representation. The simplest tensorial representation of a rotation is its rotation matrix. However, in the presence of crystal symmetry multiple rotation matrices describe the same orientation. This can be avoided by restricting the space of admissible matrices to the so-called fundamental region. However, this creates the problem that two similar orientations may be represented by very different matrices in the fundamental region. This usually happens if the orientations are close to the boundary of the fundamental region.
The central problem is that the geometry of the fundamental region is not the geometry of the orientation space. Let's demonstrate this by taking pairs \(\mathtt{ori_1}\), \(\mathtt{ori_2}\) of random orientations in the fundamental region
and comparing their misorientation angle \(\omega(\mathtt{ori}_1,\mathtt{ori}_2)\) with the Euclidean distance \(\lVert \mathtt{tensor(ori_1)} - \mathtt{tensor(ori_2)} \rVert_2\) of the corresponding rotation matrices and the Euclidean distance \( \lVert \mathtt{R(ori_1)} - \mathtt{R(ori_2)} \rVert_2\) of the corresponding Rodrigues-Frank vectors.
We observe that orientations with very small misorientation angle \(\omega(\mathtt{ori}_1,\mathtt{ori}_2)\) may be very far from each other in Rodrigues-Frank space, i.e. \(\lVert\mathtt{R(ori_1)} - \mathtt{R(ori_2)}\rVert_2\) is large. As a consequence, we cannot simply compute the average of two orientations by taking the mean of the corresponding Rodrigues vectors.
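A comparison along these lines can be sketched in MTEX as follows; the symmetry class and the number of sampled pairs are arbitrary choices for this illustration:

```matlab
% cubic crystal symmetry - an arbitrary choice for this illustration
cs = crystalSymmetry('432');

% pairs of random orientations, projected into the fundamental region
ori1 = project2FundamentalRegion(orientation.rand(1000,cs));
ori2 = project2FundamentalRegion(orientation.rand(1000,cs));

% misorientation angles omega(ori1,ori2)
omega = angle(ori1,ori2);

% Euclidean distances between the Rodrigues-Frank vectors
dRF = norm(Rodrigues(ori1) - Rodrigues(ori2));

% plot one quantity against the other
scatter(omega./degree, dRF)
```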
Let's have a look at the extreme case of finding the mean of the orientations \((44^{\circ},0^{\circ},0^{\circ})\) and \((46^{\circ},0^{\circ},0^{\circ})\)
The mean orientation \((0^{\circ},0^{\circ},0^{\circ})\) computed from the average of the Rodrigues vectors is far away from the true mean.
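This computation can be sketched in MTEX as below, assuming cubic symmetry, for which the \(46^{\circ}\) rotation about the z-axis is symmetrically equivalent to a \(-44^{\circ}\) rotation and therefore lands on the opposite side of the fundamental region:

```matlab
cs = crystalSymmetry('432');
ori1 = orientation.byEuler(44*degree,0,0,cs);
ori2 = orientation.byEuler(46*degree,0,0,cs);

% Rodrigues-Frank vectors of the representatives in the fundamental region
r1 = Rodrigues(project2FundamentalRegion(ori1)); % about  tan(22 deg) * z
r2 = Rodrigues(project2FundamentalRegion(ori2)); % about -tan(22 deg) * z

% the naive average of the Rodrigues vectors is close to the zero vector,
% i.e., to the orientation (0,0,0) - far from the true mean (45,0,0)
rMean = 0.5 * (r1 + r2);

% compare with the proper orientation mean
mean([ori1,ori2])
```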
This issue applies not only to the mean but to all statistical methods that work well for vectorial data and that one would like to apply to orientation data.
Defining an Embedding
The crucial idea of an embedding is to replace the vectorial representation by a higher dimensional tensorial representation that preserves the geometry and the distances of the orientation space as well as possible. In MTEX such an embedding \(\mathcal E(\mathtt{ori})\) of an orientation ori is defined by calling the function embedding.
This creates variables e1 and e2 of type embedding that behave like lists of vectors, i.e., they can be summed, rotated, scaled, and one can compute their inner product. Let's have a look at the Euclidean distances \(\lVert\mathcal E(\mathtt{ori_1}) - \mathcal E(\mathtt{ori_2}) \rVert_2\) between the embeddings e1 and e2.
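In code this might look as follows; a sketch in which the call signature of embedding follows the description above:

```matlab
cs = crystalSymmetry('432');
ori1 = orientation.rand(100,cs);
ori2 = orientation.rand(100,cs);

% compute the embeddings of both lists of orientations
e1 = embedding(ori1);
e2 = embedding(ori2);

% Euclidean distances between the embeddings ...
d = norm(e1 - e2);

% ... to be compared with the misorientation angles
omega = angle(ori1,ori2);
```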
We observe that the distance in the embedding differs slightly from the misorientation angle. However, especially for small misorientation angles the approximation is very good.
Let's go back to our second example of averaging the orientations \((44^{\circ},0^{\circ},0^{\circ})\) and \((46^{\circ},0^{\circ},0^{\circ})\). If we compute the embedding of both orientations, average the resulting tensors and project the mean tensor back to an orientation we end up with the correct result \((45^{\circ},0^{\circ},0^{\circ})\).
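Sketched in MTEX this reads as below; note that the projection of an embedding back to an orientation via orientation(m) is an assumption about the API, not taken from the text above:

```matlab
cs = crystalSymmetry('432');
ori1 = orientation.byEuler(44*degree,0,0,cs);
ori2 = orientation.byEuler(46*degree,0,0,cs);

% average the embeddings of both orientations
m = 0.5 * (embedding(ori1) + embedding(ori2));

% project the mean tensor back to an orientation
% (orientation(m) is assumed to perform this projection)
mori = orientation(m)
```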
Basic Properties
By construction the embeddings of all orientations have the same norm.
In other words, the embeddings are located on the surface of a ball with radius \(1\). When computing the mean from a list of embeddings the resulting tensor has in general a smaller norm, i.e., lies inside this ball. As in spherical statistics, the norm of the mean of the embeddings can be interpreted as a measure of the dispersion of the orientations. If the norm is close to 1 the orientations are tightly concentrated around a preferred orientation, whereas if the norm is close to zero some of the orientations are at maximum distance from each other.
Let's compare the norm
\[ n=\left\lVert\frac{1}{N} \sum_{i=1}^N \mathcal E(\mathtt{ori}_i) \right\rVert\]
of the mean embedding with the standard deviation
\[ \sigma = \left(\frac{1}{N} \sum_{i=1}^N \omega(\mathtt{ori}_i, \mathtt{mori})^2\right)^{1/2},\]
where \(\omega(\mathtt{ori}_i, \mathtt{mori})\) denotes the misorientation angle between the orientations \(\mathtt{ori}_i\) and the mean orientation \(\mathtt{mori}\).
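Both quantities can be computed side by side. The sketch below draws the orientations from a unimodal ODF; unimodalODF and discreteSample are assumed MTEX calls for constructing and sampling the density, and the halfwidth and sample size are arbitrary:

```matlab
cs = crystalSymmetry('432');

% a unimodal ODF with 20 degree halfwidth, sampled at N orientations
odf = unimodalODF(orientation.rand(cs),'halfwidth',20*degree);
ori = discreteSample(odf,1000);

% norm of the mean embedding
n = norm(mean(embedding(ori)));

% standard deviation of the misorientation angles to the mean orientation
mori  = mean(ori);
sigma = sqrt(mean(angle(ori,mori).^2));
```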
It appears as if the norm of the mean embedding is a function of the standard deviation. However, the reason for this false relationship is that we have generated the orientations out of a single family of random variables - unimodal de la Vallée Poussin distributed density functions. A broader family of density functions is given by the Bingham distributions. Let's repeat the experiment for this family.
We observe that there is no one-to-one relationship between the discrete standard deviation and the norm of the mean embedding.
Operations
The following operations are supported for embeddings:
- +, -, .*, ./
- sum, mean
- norm, normalize
- dot
- rotate, rotate_outer
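For illustration, the operations above might be combined as in the following sketch; the list length and the applied rotation are arbitrary choices:

```matlab
cs = crystalSymmetry('432');
e  = embedding(orientation.rand(10,cs));

% sum and mean over the list of embeddings
s = sum(e);
m = mean(e);

% norm of the mean and its normalization back to the unit ball surface
n  = norm(m);
en = normalize(m);

% inner product between embeddings
d = dot(e,e);

% rotate the embeddings
r = rotate(e,rotation.byAxisAngle(zvector,30*degree));
```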
Low dimensional representation
Internally, the tensorial representation of the embedding is slightly larger than required; in many practical applications a lower dimensional representation is sufficient.
References
The theory behind these embeddings is explained in the papers
- R. Arnold, P. E. Jupp, H. Schaeben, Statistics of ambiguous rotations, Journal of Multivariate Analysis 165, 2018.
- R. Hielscher, L. Lippert, Isometric Embeddings of Quotients of the Rotation Group Modulo Finite Symmetries, arXiv:2007.09664, 2020.