You can use the sidekiq-unique-jobs gem; it looks like it does exactly what you need.
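A minimal sketch, assuming the gem is in your Gemfile (the worker name is hypothetical, and the option key depends on the gem version: recent releases use lock:, older ones used unique:):

class BackgroundWorker
  include Sidekiq::Worker
  # a second job with the same class and arguments will not be enqueued
  # while the first one is still waiting or running
  sidekiq_options queue: :background, lock: :until_executed

  def perform(record_id)
    # ...
  end
end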
Added later:
Here is a basic implementation of what you are asking for. It won't be fast, but it should be OK for small queues. I also ran into a problem when re-packing the JSON: in my environment it was necessary to re-encode the JSON exactly the same way, otherwise the final LREM calls would not match the stored payloads.
# for proper JSON packing (I had an issue with it while testing):
# ActiveSupport normally encodes BigDecimal as a string, so a decode/encode
# round trip would not reproduce the original payload byte for byte
require 'bigdecimal'

class BigDecimal
  def as_json(options = nil) #:nodoc:
    if finite?
      self
    else
      nil
    end
  end
end
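Before relying on this, you can sanity-check that a decode/encode round trip really is byte-identical on a real payload from your queue (a quick hypothetical check, same queue name as below):

# take one job from the queue and compare it with its re-encoded form
raw = Sidekiq.redis { |c| c.lindex('queue:background', 0) }
if raw
  repacked = JSON.generate(ActiveSupport::JSON.decode(raw))
  puts(repacked == raw ? 'round trip is byte-identical' : 'round trip differs, LREM below would not match')
end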
Sidekiq.redis do |connection|
  # how many jobs are waiting (handy if you later want to scan the queue in chunks)
  items_count = connection.llen('queue:background')
  # take the first 101 jobs; they stay in the queue and are removed
  # below by exact value, so there is no need to pop them first
  items = connection.lrange('queue:background', 0, 100)
  # jobs are stored as JSON - decode them
  items_decoded = items.map{|item| ActiveSupport::JSON.decode(item)}
  # group them by class and arguments
  grouped = items_decoded.group_by{|item| [item['class'], item['args']]}
  # keep only the groups that actually contain duplicates
  duplicated = grouped.values.delete_if{|mini_list| mini_list.length < 2}
  # mark every copy except the last one in each group for deletion
  for_deletion = duplicated.map{|a| a[0...-1]}.flatten
  # re-encode them exactly as they are stored so LREM can match the payload
  for_deletion_packed = for_deletion.map{|item| JSON.generate(item)}
  # removing duplicates one by one; each payload is unique thanks to its jid,
  # so count 0 deletes exactly the job it matches
  for_deletion_packed.each do |packed_item|
    connection.lrem('queue:background', 0, packed_item)
  end
end
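The same logic can be run over the whole queue instead of just the first hundred jobs, since lrange accepts -1 as the end index. A sketch for moderate queue sizes (pause your workers first, so the list is not being popped while you scan it):

Sidekiq.redis do |connection|
  items = connection.lrange('queue:background', 0, -1)
  decoded = items.map { |item| ActiveSupport::JSON.decode(item) }
  # group by class and arguments, then drop every copy except the last one
  decoded.group_by { |item| [item['class'], item['args']] }.each_value do |group|
    group[0...-1].each do |dup|
      # each payload is unique (it contains a jid), so count 0 removes just this job
      connection.lrem('queue:background', 0, JSON.generate(dup))
    end
  end
end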