I have a PORO service that receives one image at a time and stores the original; afterwards it schedules a Sidekiq job that converts the image to webp in three different dimensions.

The problem: the Sidekiq process consumes ~200MB of memory at startup, and as soon as it starts processing a 4MB .jpeg it quickly grows to ~350MB. If a user sends 8 consecutive requests with a total image size of ~18MB, the worker can grow to as much as 800MB, and this memory is not freed after the jobs complete, so every further request only pushes the memory usage higher.

I'm running Docker on a Linux machine, with plain ActiveStorage and the image_processing gem using the libvips processor. Has anyone run into the same problem, or does anyone know how to reduce the memory usage?
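There is nothing custom in the configuration beyond selecting vips as the variant processor, i.e. just the standard setting:

# config/application.rb, plain setup, vips as the ActiveStorage variant processor
config.active_storage.variant_processor = :vips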
Here is the code of the job:
class Api::V1::Ads::Images::ResizeAndUploadJob < Api::V1::ApplicationJob
  sidekiq_options queue: 'high'

  def perform(blob_id)
    @blob = ActiveStorage::Blob.find_by(id: blob_id)
    return if @blob.nil?

    # Adjust the in-memory filename so each generated variant gets its own name,
    # then generate and upload the three webp variants.
    @blob.filename = "#{image_filename}_x1200.webp"
    @blob.variant(format: :webp, resize_to_limit: [nil, 1200]).process

    @blob.filename = "#{image_filename}_x560.webp"
    @blob.variant(format: :webp, resize_to_limit: [nil, 560]).process

    @blob.filename = "#{image_filename}_x130.webp"
    @blob.variant(format: :webp, resize_to_limit: [nil, 130]).process
  end

  private

  # The stored original carries an "_ORIGINAL" suffix; strip it to build the variant names.
  def image_filename
    @image_filename ||= @blob.filename.base.split('_ORIGINAL').first
  end
end
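For reference, the service side is straightforward: it stores the original blob and enqueues the job. A simplified sketch (class and variable names here are illustrative, not the exact code):

# Simplified sketch of the PORO service (illustrative names); it stores the
# original image and enqueues the resize job, one image per request.
class Api::V1::Ads::Images::StoreService
  def initialize(record, uploaded_file)
    @record = record
    @uploaded_file = uploaded_file
  end

  def call
    basename = File.basename(@uploaded_file.original_filename, '.*')

    # Keep the original with an _ORIGINAL suffix so the job can derive variant names from it
    blob = ActiveStorage::Blob.create_and_upload!(
      io: @uploaded_file,
      filename: "#{basename}_ORIGINAL#{File.extname(@uploaded_file.original_filename)}",
      content_type: @uploaded_file.content_type
    )
    @record.images.attach(blob)

    # The webp conversion itself happens asynchronously in Sidekiq
    Api::V1::Ads::Images::ResizeAndUploadJob.perform_later(blob.id)
    blob
  end
end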