I don't know if this is strictly a programming question, but it is an issue that may be solvable through programming.
Based on my limited knowledge of how the display pipeline in computers works, I theorised that the pixels shown on the monitor are allocated space in a memory buffer (a framebuffer) somewhere, and that the size of this buffer depends on the screen resolution. So, can we trick the computer into thinking we have a bigger monitor than we actually do and take advantage of that, for instance by screencasting at a higher resolution than our physical display supports?
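To make the idea concrete, here is roughly what I had in mind on Linux, assuming Xvfb and ffmpeg are installed. The display number :99, the 4K geometry, and the output filename are just placeholders I picked; I'm not sure this is the right approach, so treat it as a sketch:

```python
import subprocess
import time

# Start a virtual X server (Xvfb) with a 3840x2160 "screen" that exists
# only in memory; no physical monitor of that size is needed.
xvfb = subprocess.Popen(["Xvfb", ":99", "-screen", "0", "3840x2160x24"])
time.sleep(1)  # give the server a moment to come up

# Any application I want to capture would have to be launched with
# DISPLAY=:99 so it renders into that virtual screen instead of my real one.

# Record the virtual display for 10 seconds using ffmpeg's x11grab input.
ffmpeg = subprocess.Popen([
    "ffmpeg", "-y",
    "-f", "x11grab",
    "-framerate", "30",
    "-video_size", "3840x2160",
    "-i", ":99.0",
    "-t", "10",
    "capture.mp4",
])
ffmpeg.wait()
xvfb.terminate()
```

Is something like this the standard way to do it, or is there a cleaner mechanism (on Linux or other operating systems) for presenting a virtual monitor at an arbitrary resolution?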