There are only two possibilities:
Things like fancy features/enhanced functionality, mitigating security issues, dealing with hardware bugs/errata, performance, scalability, and supporting a very wide range of different hardware all increase the complexity of the code. If you look at a real commercial OS (e.g. Linux) that has to care about all of these things, it's hard to learn about any one thing (e.g. memory management) without all of that complexity getting in the way and making it significantly harder to learn.
If you have a simple OS that does none of those things (no fancy features, no mitigation of security issues, ...), then it's much, much easier to learn the basic principles from it; but it also becomes impossible to use it to learn about fancy features, mitigating security issues, dealing with hardware bugs/errata, ...
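To make that concrete, here's a minimal sketch (in the spirit of XV6's kalloc.c freelist allocator, not its actual code) of how a simple OS can manage physical memory. The page size and the kinit range are assumptions for illustration. The entire allocator fits on one screen, whereas Linux's equivalent (the buddy allocator plus per-CPU page lists, NUMA handling, etc.) runs to thousands of lines:

```c
#include <stddef.h>

#define PAGE_SIZE 4096  /* assumed 4 KiB pages */

/* Each free page stores a pointer to the next free page
   in its own first bytes, so the freelist costs no extra memory. */
struct run { struct run *next; };

static struct run *freelist;  /* head of the list of free pages */

/* Return one page to the allocator. */
static void kfree(void *page)
{
    struct run *r = page;
    r->next = freelist;
    freelist = r;
}

/* Hand out one page, or NULL if none are left. */
static void *kalloc(void)
{
    struct run *r = freelist;
    if (r)
        freelist = r->next;
    return r;
}

/* Build the initial freelist from a page-aligned [start, end) range
   of usable physical memory (assumed to be known at boot). */
static void kinit(char *start, char *end)
{
    for (char *p = start; p + PAGE_SIZE <= end; p += PAGE_SIZE)
        kfree(p);
}
```

That's the whole "memory management" story in a teaching OS: you can read it, trace it, and understand it in minutes, which is exactly the point.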
The solution is to start with a simple OS (e.g. XV6) to learn the basics, then switch to a real OS later to learn everything else.
However, most OS courses at universities are not intended to teach you how to write an OS. Instead, they're intended to give you basic information about operating systems so that you can use that knowledge when writing application programs for existing operating systems. For that reason (and because there are time constraints) they only do the first part (with a simple OS like XV6), and then the course finishes.