
I don't know if this is a programming question, but it is an issue that may be possible to solve through computer programming.

Based on my limited knowledge of how the display processing pipeline in computers works, I theorised that the pixels shown on the monitor are allocated space in a memory buffer somewhere, and that the size of this buffer depends on the size of the screen. So, can we fool the computer into thinking that we have a bigger monitor than we actually have, and take advantage of that, for instance by screencasting at a larger resolution than our monitor supports?

juztcode

2 Answers


Yes, there's a large chunk of memory (probably in your video card) that contains the actual displayed pixels, and there's a completely separate memory area maintained by the desktop software. It is possible (and in fact common) for the latter to maintain a "virtual" desktop that is larger than your monitor, extending the desktop into a second monitor, or perhaps scrolling or page flipping to access the extended areas.

All of this is very OS-specific.
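For example, on Linux under X11 you can ask for a desktop larger than the panel and have X scale it back down, which is the same idea. A minimal sketch (assuming an X11 session with the xrandr tool installed, a 1920x1080 panel, and an output named eDP-1; check yours with xrandr --query):

    import subprocess

    OUTPUT = "eDP-1"  # assumed output name; list yours with `xrandr --query`

    # Render the desktop at 4K and let X scale it back down onto the
    # (assumed) 1920x1080 panel, similar in spirit to a "virtual" desktop.
    subprocess.run(
        ["xrandr",
         "--output", OUTPUT,
         "--mode", "1920x1080",   # the panel's native mode
         "--scale", "2x2",        # the desktop is rendered at 3840x2160
         "--fb", "3840x2160"],    # grow the framebuffer to match
        check=True,
    )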

Lee Daniel Crocker

To answer your question, more information about your hardware and OS (and drivers) is required, as it all very much depends on these.

Nvidia, for example, has Dynamic Super Resolution (DSR), while AMD has Virtual Super Resolution (VSR).

Both expose a bigger resolution to applications (games) than your monitor actually provides. How to enable/configure this depends on your hardware, so you should Google a bit for your specific setup.

Your OS is then able (or should be able) to scale the image down so it displays properly on your monitor.
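A quick way to check that applications really see the larger resolution (a sketch using Python's standard tkinter module; not tied to any vendor tool):

    import tkinter as tk

    # After enabling DSR/VSR (or a similar virtual resolution), the desktop
    # resolution the OS reports to applications should be the larger one.
    root = tk.Tk()
    root.withdraw()  # no window needed, we only query the screen metrics
    print(f"Reported desktop resolution: "
          f"{root.winfo_screenwidth()}x{root.winfo_screenheight()}")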

If your screen capture software is able to directly capture your video memory (and not what is output to your monitor), it will capture the higher resolution. (I have no experience with screen capturing software, so I won't be of much help with this.)
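As a rough illustration of that last point (assuming the third-party Python package mss, which grabs whatever the OS reports as the desktop): if the desktop is running at a virtual 4K resolution, the grab comes out at 4K as well, regardless of the panel's physical size.

    from mss import mss
    from mss.tools import to_png

    with mss() as screen:
        monitor = screen.monitors[1]   # the primary monitor as reported by the OS
        shot = screen.grab(monitor)    # grabbed at the reported (virtual) resolution
        print(f"Captured {shot.width}x{shot.height}")
        to_png(shot.rgb, shot.size, output="frame.png")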

Veger
  • While reading a bit more on the subject (as your question triggered my interest), I found this article on how to set up screen recording at a higher resolution for Nvidia: https://stormystudio.com/record-4k-hd-monitor/ – Veger Jan 30 '20 at 14:23
  • It seems Nvidia provides some pretty neatly tied-up things. I don't know if Linux's open-source drivers provide an easy-to-use option like that for AMD. Thanks for the answer though, I will also try to look into it and post if I figure anything out. :) – juztcode Jan 30 '20 at 15:51