Currently learning about database internals, and it seems like there are a lot of interesting tradeoffs when you're designing a data management system. Some of these tradeoffs seem to depend on what kind of behavior you expect once users begin to use your database.
For example, you might design your database to have very fast reads for certain types of data, at the cost of slower writes. In doing so, you're assuming that your users mostly care about read speed.
One of the takeaways I've gotten so far is that assumptions like this are dangerous: requirements change, and you can't predict user behavior. So it got me thinking: are there any databases out there that use the database's own internal statistics, do some cost analysis, and change their internal file structures/data structures/algorithms to something that better suits their users?
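To make the idea concrete, here's a toy sketch of what I mean (this is purely my own illustration, not how any real database does it): a key-value store that counts reads and writes, and when a periodic check sees a read-heavy workload, it switches from an append-only layout (O(1) writes, O(n) reads) to a sorted layout (O(n) writes, O(log n) reads). The class name, the cost rule, and the `check_every` knob are all made up for the example.

```python
import bisect

class AdaptiveStore:
    """Toy key-value store that switches its layout based on the
    observed read/write mix. Hypothetical illustration only."""

    def __init__(self, check_every=100):
        self.data = []            # list of (key, value) pairs
        self.sorted_mode = False  # start append-only (write-friendly)
        self.reads = 0
        self.writes = 0
        self.ops = 0
        self.check_every = check_every

    def _maybe_adapt(self):
        # Every `check_every` operations, consult the statistics and
        # pick the layout a crude cost model says is cheaper overall.
        self.ops += 1
        if self.ops % self.check_every:
            return
        read_heavy = self.reads > 2 * self.writes  # made-up threshold
        if read_heavy and not self.sorted_mode:
            self.data.sort()          # one-time O(n log n) reorganization
            self.sorted_mode = True
        elif not read_heavy and self.sorted_mode:
            self.sorted_mode = False  # fall back to cheap appends

    def put(self, key, value):
        self.writes += 1
        if self.sorted_mode:
            bisect.insort(self.data, (key, value))  # O(n): shifts elements
        else:
            self.data.append((key, value))          # O(1) append
        self._maybe_adapt()

    def get(self, key):
        self.reads += 1
        self._maybe_adapt()
        if self.sorted_mode:
            i = bisect.bisect_left(self.data, (key,))
            if i < len(self.data) and self.data[i][0] == key:
                return self.data[i][1]
            return None
        # Unsorted layout: linear scan, newest write wins.
        for k, v in reversed(self.data):
            if k == key:
                return v
        return None
```

From what I've read so far, real systems do something in this spirit but much more carefully: query optimizers already keep table statistics for cost-based planning, and research systems like "database cracking" reorganize data incrementally as a side effect of queries instead of in one big sort.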
English is not my first language so I apologize if there's any bad grammar, happy to elaborate in comments if it doesn't make sense :)