In the section on density estimation we have seen that the correct choice of the kernel halfwidth is essential for a good match between the true density function and the reconstructed density function. If the halfwidth is set too small, the reconstructed density function usually oscillates and the individual sampling points become visible as sharp peaks. If the halfwidth is too large, the reconstructed density function is usually too smooth and does not reproduce the features of the original density function.
Finding an optimal kernel halfwidth is a hard problem, as the optimal halfwidth depends not only on the number of sampling points but also on the smoothness of the true, but unknown, density function. MTEX offers several methods, selected by flags passed to the kernel calculation command. A very conservative choice for the kernel halfwidth that takes into account only the number of sampling points is implemented in MTEX with the flag 'magicRule'. The flag 'RuleOfThumb' considers, in addition to the number of sampling points, their variance as an estimate of the smoothness of the true density function. The most advanced (and default) method for estimating the optimal kernel halfwidth is Kullback-Leibler cross validation. This method tests different kernel halfwidths on a subset of the random sample and selects the halfwidth that best reproduces the omitted points of the random sample.
In order to demonstrate this functionality, let us start with the following orientation density function
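As a concrete example (the particular model ODF chosen here is only an illustrative assumption) we may take one of the ODFs predefined in MTEX:

```matlab
% a model ODF to sample from; the SantaFe sample ODF predefined in MTEX
% serves here only as an illustrative choice
odf = SantaFe;
```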
and compute \(10000\) random orientations representing this density function using the command discreteSample
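A minimal sketch of this sampling step, assuming the model ODF odf defined above:

```matlab
% draw 10000 random individual orientations from the model ODF
ori = discreteSample(odf,10000);
```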
Next we estimate the optimal kernel function using the command calcKernel with the default settings.
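With the default settings this reduces to a single call; the halfwidth is then estimated by Kullback-Leibler cross validation as described above:

```matlab
% estimate the optimal kernel from the sampled orientations
% (default method: Kullback-Leibler cross validation)
psi = calcKernel(ori);
```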
This kernel can now be used to reconstruct the original ODF from the sampled orientations by kernel density estimation.
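A sketch of the reconstruction step; passing the estimated kernel via the option 'kernel' is stated here as an assumption about the call syntax:

```matlab
% reconstruct the ODF from the sampled orientations using the estimated kernel
odf_rec = calcDensity(ori,'kernel',psi);
```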
Exploration of the relationship between estimation error and number of single orientations
In this section we compare the different methods for estimating the optimal kernel halfwidth. To this end, we simulate \(10, 100, \ldots, 1000000\) single orientations from the model ODF odf, compute optimal kernels according to the 'magicRule', the 'RuleOfThumb' and Kullback-Leibler cross validation, and then compute the fit between the reconstructed ODF odf_rec and the original odf.
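A sketch of this comparison loop; the flag names follow the description above, while the use of calcError as error measure and the exact call syntax are assumptions:

```matlab
% compare the three halfwidth estimation methods for growing sample sizes
e = zeros(3,6);
for k = 1:6

  % draw 10^k random orientations from the model ODF
  ori = discreteSample(odf,10^k);

  % estimate the optimal kernel with the three methods
  psi1 = calcKernel(ori,'magicRule');
  psi2 = calcKernel(ori,'RuleOfThumb');
  psi3 = calcKernel(ori); % default: Kullback-Leibler cross validation

  % reconstruct the ODF with each kernel and record the fit to the model ODF
  e(1,k) = calcError(calcDensity(ori,'kernel',psi1),odf);
  e(2,k) = calcError(calcDensity(ori,'kernel',psi2),odf);
  e(3,k) = calcError(calcDensity(ori,'kernel',psi3),odf);

end
```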
Finally, we plot the estimation error against the number of single orientations sampled from the original ODF.
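One possible way to visualize the result, assuming the error matrix e computed in the sketch above:

```matlab
% plot the estimation error against the number of sampled orientations
loglog(10.^(1:6),e,'-o')
legend('magicRule','RuleOfThumb','Kullback-Leibler cross validation')
xlabel('number of single orientations')
ylabel('estimation error')
```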