
I'm learning the Core Image and Vision frameworks on iOS by implementing a coloring app. The app shows the user a sketch with a white background and black edges to color in, such as this sketch of a tiger: (image omitted)

I can color a view using Canvas and specify the line width and color. However, the challenges are:

1- How can I prevent the user from coloring over the black lines in the original image?

2- If the user colors a specific area black, how can I let them modify that area later (erase the black, or paint over it with a different color) while still preventing them from coloring over the black lines in the original image?

3- If the user can select a sketch from a set of sketches, and each sketch has a different width and height, what is the best approach to achieve point 1?

I followed this tutorial to detect the black lines using contour detection: contour detection tutorial

Because my code is too long to include here, I'm sharing it via Pastebin: source code

abs8090

1 Answer


Conceptually, you could do the following:

Separate the template (the tiger) and the canvas the user draws on into two different images, so that the template image is never modified directly.

Then blend the user's drawing with the template image using the multiply blend mode. This way, any area that is black in the template (the contour lines) stays black in the result (black = 0, so someColor * 0 = 0, i.e. black), while the white areas show the user's color unchanged (white = 1, so someColor * 1 = someColor).

SwiftUI has a .blendMode(.multiply) modifier that might do the trick for your use case.

By separating the user's drawing from the template, you also ensure that when the user erases parts in their drawing, they don't erase the template image.
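A minimal SwiftUI sketch of this layering, assuming a hypothetical `DrawingCanvas` view that renders the user's strokes and an image asset named "tigerTemplate" (both names are placeholders, not from the original question):

```swift
import SwiftUI

struct ColoringView: View {
    var body: some View {
        ZStack {
            // The user's strokes live in their own layer underneath,
            // so erasing or recoloring here never touches the template.
            DrawingCanvas()
            // The template is multiplied on top: black contour pixels (0)
            // force the result to black, white pixels (1) let the user's
            // color show through unchanged.
            Image("tigerTemplate")
                .resizable()
                .scaledToFit()
                .blendMode(.multiply)
        }
    }
}
```

Because all editing happens inside `DrawingCanvas`, the template image itself is never modified, which also answers question 2.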

Frank Rupprecht