I am running Jupyter notebook with ipykernel (v5.1.0) in Mozilla Firefox (66.0.3) but I've seen this occurring in Chrome as well. I'm on Windows 10, 64bit, if that information is relevant.
When MathJax renders in an existing notebook, it often looks like this:
where the inline MathJax symbols overlap with the surrounding text. It seems the renderer is treating the inline math content as having zero width.
Just to be clear:
- the math symbols display fine while I'm typing them; they even appear fine if I execute the markdown cell.
- the issue only appears after I save the notebook, shut down the server, start a new Jupyter server, and reopen the notebook.
- it only affects inline math, not displayed math, as you can see from the image.
- I can fix it by re-executing the markdown cell again every new session, but it's getting pretty annoying.
The underlying code looks like:
<span class="kw">Principal Component Analysis</span> is an unsupervised linear transformation technique.
It helps to identify patterns in data based on correlations between features.
PCA aims to find the directions of maximum variance in high-dimensional data and project the data onto a new subspace spanned by those directions (with equal or fewer dimensions than the original space).
Mathematically speaking, we construct a $d\times k$-dimensional transformation matrix $W$ that allows us to map a sample vector $x$ in the $d$-dimensional feature space ($x\in \mathbb{R}^d$) onto a new $k$-dimensional subspace.
$$
x \longrightarrow z = xW
$$
**General PCA algorithm**
1. Center the $d$-dimensional dataset around the origin. Optionally standardize (z-score) by also dividing each feature by its standard deviation; this may or may not help.
2. Construct the covariance matrix.
3. Decompose the covariance matrix into its eigenvectors and eigenvalues.
4. Sort the eigenvalues in decreasing order to rank the corresponding eigenvectors.
5. Select the $k$ eigenvectors that correspond to the $k$ largest eigenvalues, where $k$ is the dimensionality of the new feature subspace ($k\leq d$).
6. Construct a projection matrix $W$ from the $k$ eigenvectors chosen in step 5.
7. Perform the projection by computing $XW$ (an $n\times k$ matrix) to obtain the new $k$-dimensional feature representation.
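For context (this is not part of the affected markdown cell), the algorithm above can be sketched in a few lines of NumPy; the data `X` here is randomly generated just for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))        # n=100 samples, d=5 features

# 1. Center the dataset around the origin.
Xc = X - X.mean(axis=0)

# 2. Construct the covariance matrix (d x d).
cov = np.cov(Xc, rowvar=False)

# 3. Eigendecomposition (eigh is appropriate: the covariance matrix is symmetric).
eigvals, eigvecs = np.linalg.eigh(cov)

# 4. Sort eigenvalues (and their eigenvectors) in decreasing order.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 5-6. Keep the k leading eigenvectors as the projection matrix W (d x k).
k = 2
W = eigvecs[:, :k]

# 7. Project: Z = XW is the new n x k representation.
Z = Xc @ W
print(Z.shape)  # (100, 2)
```

The first column of `Z` captures the most variance, the second the next most, and so on.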
I can't imagine I'm the only one experiencing this, but I haven't been able to find anyone discussing it online. How do I fix this?