I know there is a plethora of frameworks on macOS that make working with graphics easy. In this case I want to make my life hard on purpose, for the learning experience and the customizability.

I simply want to make a window of X by Y pixels, create an array of X by Y pixel values, fill that array with color data, and have the window display those pixels.

Basically I want to toy around with making my own little engine to draw things so I can learn from the experience. And I don't want to use OpenGL, Metal, or any other framework that does the hard work for me. Simply give me a window and let me color the pixels one by one. Once I've learned how to do what I want to do there, I can move up to a higher-level framework.

So what in macOS will let me do just that? I've looked at a couple of the frameworks, but there are too many of them for me to make heads or tails of where to really start. Once I know where to start I can figure out the rest from there.

So far the best idea I have is to use Core Graphics: create a pixel buffer, draw it to fill the whole window, and ignore all the other fancy stuff that Core Graphics can do for me. I'd still like to go a level lower than that if possible.
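
To make that concrete, here's roughly the shape I have in mind. This is an untested sketch; `PixelView`, `plot()`, and the 8-bit RGBA layout are just illustrative choices of mine:

```swift
import Cocoa

// Untested sketch: a view that owns a raw RGBA byte buffer and blits it
// to the window in draw(_:) via a CGBitmapContext.
final class PixelView: NSView {
    private let pixelsWide = 320
    private let pixelsHigh = 240
    private lazy var buffer = [UInt8](repeating: 0,
                                      count: pixelsWide * pixelsHigh * 4)

    // Fill the buffer one pixel at a time; a gradient, just to show the layout.
    func plot() {
        for y in 0..<pixelsHigh {
            for x in 0..<pixelsWide {
                let i = (y * pixelsWide + x) * 4
                buffer[i]     = UInt8(255 * x / pixelsWide)  // R
                buffer[i + 1] = UInt8(255 * y / pixelsHigh)  // G
                buffer[i + 2] = 128                          // B
                buffer[i + 3] = 255                          // A
            }
        }
        needsDisplay = true
    }

    override func draw(_ dirtyRect: NSRect) {
        guard let target = NSGraphicsContext.current?.cgContext else { return }
        buffer.withUnsafeMutableBytes { raw in
            // Wrap the buffer in a bitmap context, snapshot it as a CGImage,
            // and draw that image scaled to the view's bounds.
            guard let bitmap = CGContext(data: raw.baseAddress,
                                         width: pixelsWide,
                                         height: pixelsHigh,
                                         bitsPerComponent: 8,
                                         bytesPerRow: pixelsWide * 4,
                                         space: CGColorSpaceCreateDeviceRGB(),
                                         bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
                  let image = bitmap.makeImage() else { return }
            target.draw(image, in: bounds)
        }
    }
}
```

The idea would be to put that view in a window and call `plot()` (or whatever does the real pixel-pushing) whenever the buffer changes.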

RTHarston
  • In case people are wondering, I have nothing against using a library, but it seems odd to me that I'd need a library to do something that should be the lowest level of the stack: put pixels in a window. Are the system APIs really built in such a way that it is easier to draw geometry and text than a single pixel? I could believe that since most use cases want the geometry, but it still seems odd that pixels aren't an option. Unless I'm looking at this all wrong, and in reality the system APIs are higher level than the libraries people are suggesting. So what is the lowest level? OpenGL/Metal? – RTHarston May 19 '20 at 18:04
  • 1
    The reason that direct pixel access isn't really supported is because it prevents the sort of abstraction that enables modern display mechanisms. Windows are composited by the Window Server. The backing textures reside in VRAM and aren't in main memory. The backing texture pixel formats and layout may be optimized for hardware and thus not linear or have a predictable pixel format. Etc. – Ken Thomases May 19 '20 at 23:52
  • I understand some of what the window compositor is doing, and I have a basic grasp of how the CPU and GPU get along, but the more I look into it the more I think I may have some of the steps out of order. I don’t know. That’s what I’m trying to figure out. I thought I could find a way to make an array of pixels and put them in a window before the compositor takes over, besides just loading a bitmap image in a window. I wanted to do things on the CPU before it gets shipped off (for simplicity), but I now think that isn’t possible because of how much modern macOS does on the GPU. – RTHarston May 20 '20 at 02:54

2 Answers

I'm not sure whether it's possible. It's certainly not officially supported. There are some old, long-deprecated APIs for accessing the framebuffer of the display (not of a window); I have no idea if they still work. You would use the Quartz Display Services API to first capture the display(s) and then obtain the framebuffer address, for example with CGDisplayAddressForPosition().

Does it really matter for your purposes if you're accessing the real framebuffer vs. accessing an off-screen raster buffer and blitting that to screen with a minimal high-level API call?

Something you might try is using an IOSurface as the contents of a CALayer. It's supported but not clearly documented. Obviously, there's a lot of high-level stuff going on to get the IOSurface contents to show in the layer, but you don't necessarily need to deal with it, as such.
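
Here's a very rough, untested sketch of that idea; the property list and the BGRA pixel format are illustrative choices, not requirements:

```swift
import AppKit
import CoreVideo   // for kCVPixelFormatType_32BGRA
import IOSurface

// Untested sketch: create an IOSurface, write pixels into it on the CPU,
// then hand it to Core Animation as a layer's contents.
let width = 320, height = 240

guard let surface = IOSurface(properties: [
    .width: width,
    .height: height,
    .bytesPerElement: 4,
    .pixelFormat: kCVPixelFormatType_32BGRA
]) else { fatalError("could not create IOSurface") }

// CPU access to the surface memory must be bracketed by lock/unlock.
_ = surface.lock(options: [], seed: nil)
let base = surface.baseAddress.assumingMemoryBound(to: UInt8.self)
for y in 0..<height {
    // Rows may be padded for the hardware; don't assume width * 4.
    let row = base + y * surface.bytesPerRow
    for x in 0..<width {
        row[x * 4]     = UInt8(255 * x / width)   // B
        row[x * 4 + 1] = UInt8(255 * y / height)  // G
        row[x * 4 + 2] = 0                        // R
        row[x * 4 + 3] = 255                      // A
    }
}
_ = surface.unlock(options: [], seed: nil)

// Attach the surface to a layer (e.g. the layer of a layer-backed NSView).
let layer = CALayer()
layer.contents = surface
```

Be aware that after further CPU writes you may need to prod Core Animation (for example by re-assigning contents) before the new pixels show up on screen; the exact behavior here isn't well documented.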

For good measure, I'll mention the NSDrawBitmap() function in AppKit. It's perhaps the most low-level-style interface. That's not to say it's the lowest level of the stack. For example, it's very likely that internally it just constructs a CGImage from the bitmap data and then draws that.
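
If NSDrawBitmap()'s C-style parameter list is off-putting from Swift, NSBitmapImageRep offers the same bytes-in, image-out flow with a friendlier face. Again an untested sketch, with illustrative sizes and values:

```swift
import Cocoa

// Untested sketch: build a 32-bit RGBA bitmap rep, scribble into its
// backing store, and draw it from within some NSView.draw(_:).
let w = 320, h = 240
guard let rep = NSBitmapImageRep(bitmapDataPlanes: nil,
                                 pixelsWide: w,
                                 pixelsHigh: h,
                                 bitsPerSample: 8,
                                 samplesPerPixel: 4,
                                 hasAlpha: true,
                                 isPlanar: false,
                                 colorSpaceName: .deviceRGB,
                                 bytesPerRow: w * 4,
                                 bitsPerPixel: 32)
else { fatalError("could not create NSBitmapImageRep") }

// bitmapData is the raw backing store: w * h pixels, 4 bytes each (RGBA).
if let p = rep.bitmapData {
    for i in 0..<(w * h) {
        p[i * 4]     = 255   // R
        p[i * 4 + 1] = 0     // G
        p[i * 4 + 2] = 0     // B
        p[i * 4 + 3] = 255   // A
    }
}

// Requires a current graphics context, i.e. call this inside draw(_:).
_ = rep.draw(in: NSRect(x: 0, y: 0, width: w, height: h))
```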

Ken Thomases
  • "Simply give me a window and let me color the pixels one by one." I don't want access to the screen's framebuffer, just a single window's buffer. So "accessing an off-screen raster buffer and blitting that to screen with a minimal high-level API call" sounds more like what I want to do, but again, just within a single window. That said, I will look in to the CALayer idea. Thanks! – RTHarston May 19 '20 at 17:56
  • The "off-screen raster buffer" comment was basically referring to what I thought you already had: using Core Graphics with a bitmap context and then drawing the image of that to the window. – Ken Thomases May 19 '20 at 23:41
  • It’s the idea I’ve had, but I haven’t actually done it yet. I started thinking about this last night while I was supposed to be doing homework, so it may be a few days before I can try anything, but I’m making note of your ideas to try when I get the chance. – RTHarston May 20 '20 at 02:45

  • You could try with SFML on macOS if you wish - see here.

  • Or with CImg (which needs XQuartz) - see other answer to same question.

  • Or you can mess about with svgalib in a VirtualBox Linux on your Mac - see here.

  • Or you can write to the framebuffer on a Raspberry Pi if you have $30 to spare.

  • Or you can go even lower-level with the Raspberry Pi framebuffer - see here.

Mark Setchell
  • I have a RaspberryPi and one of the things I want to do is this, so thank you for sharing those last couple of bullet points. Those are helpful for the spirit of the question: toying with graphics on a rectangle of pixels. However, in this case I am looking for something more to the letter of the question. At least first anyway. I'd like to play around on my computer first where things are easier to iterate, and then maybe play with the Pi later once I have a better idea of what I'm doing. But maybe the other way around will be easier. ;) – RTHarston May 19 '20 at 17:59
  • That SFML answer looks compelling. I'll look into that a bit more when I get some time. It does what I want, and it's great that it's multiplatform, but now I have to admit one of the reasons I wanted something macOS 'native': I want to use Swift. I like Swift and I want to use more of it. I already use C++ at work, so I want to use Swift at home where I can. Maybe I'll have to make my own binding in Swift, or switch to Rust... I've been wanting to learn Rust, and it has an SFML binding. That is tempting... Thanks for giving me an alternative to consider. – RTHarston May 19 '20 at 18:10
  • I believe I installed it with `brew install sfml` by the way. I also used **homebrew** to install `pkgconfig` to work out all the compilation switches shown at the top of the code. – Mark Setchell May 19 '20 at 18:44