Getting or calculating the entropy of an image with Ruby and ImageMagick

Posted 2024-10-16 10:24:17 · 1894 characters · 9 views


How do I find the "entropy" of an image with ImageMagick (preferably mini_magick) in Ruby? I need this as part of a larger project: finding the "interestingness" in an image so as to crop it.

I found a good example in Python/Django, which gives the following pseudo-code:

image = Image.open('example.png')
histogram = image.histogram() # Fetch a list of pixel counts, one for each pixel value in the source image

#Normalize, or average the result.
for each histogram as pixel
  histogram_recalc << pixel / histogram.size
endfor

#Place the pixels on a logarithmic scale, to enhance the result.
for each histogram_recalc as pixel
  if pixel != 0
    entropy_list << log2(pixel)
  endif
endfor

#Calculate the total of the enhanced pixel-values and invert(?) that.
entropy = entropy_list.sum * -1

This would translate to the formula entropy = -sum(p.*log2(p)).
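For reference, that formula can be written directly in Ruby. The helper below is my own illustration (the name `shannon_entropy` is not from the original code); it takes an array of pixel counts, normalizes each count into a probability, and accumulates -p * log2(p):

```ruby
# Hypothetical helper: Shannon entropy of a list of pixel counts,
# mirroring entropy = -sum(p .* log2(p)) from the pseudo-code above.
def shannon_entropy(counts)
  total = counts.sum.to_f
  counts.reduce(0.0) do |entropy, count|
    next entropy if count.zero?   # skip empty bins, as the pseudo-code does
    p = count / total
    entropy - p * Math.log2(p)
  end
end

shannon_entropy([8, 8])        # two equally likely values => 1.0 bit
shannon_entropy([16])          # a single value => 0.0 bits
shannon_entropy([1, 1, 1, 1])  # four equally likely values => 2.0 bits
```

A uniform histogram gives the maximum entropy (log2 of the number of bins), while a flat, single-colored slice gives zero, which is exactly why the metric works as an "interestingness" score for cropping.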

My questions: did I interpret the Django/Python code correctly? And how can I fetch a histogram in Ruby's mini_magick, if at all?
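On the histogram question: mini_magick has no histogram accessor that I know of, but ImageMagick itself can emit one via `convert example.png -format %c histogram:info:-`, which prints one line per color such as `  12345: (255,255,255) #FFFFFF white`. A small parser for that output is sketched below; the MiniMagick invocation is commented out and should be treated as an assumption, not verified API usage:

```ruby
# Parse ImageMagick "histogram:info:" output into an array of pixel counts.
# Each line starts with a count followed by a colon, e.g.
#   "      1024: (  0,  0,  0) #000000 black"
def histogram_counts(histogram_text)
  histogram_text.scan(/^\s*(\d+):/).flatten.map(&:to_i)
end

# Sketch (unverified): fetching the histogram text through mini_magick's
# command-line wrapper. Adjust to whatever tool API your version exposes.
#
# require 'mini_magick'
# text = MiniMagick::Tool::Convert.new do |convert|
#   convert << 'example.png'
#   convert.format '%c'
#   convert << 'histogram:info:-'
# end
# counts = histogram_counts(text)

sample = <<~HIST
      1024: (  0,  0,  0) #000000 black
       512: (255,255,255) #FFFFFF white
HIST
histogram_counts(sample)  # => [1024, 512]
```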

Most important question: is this algorithm any good in the first place? Would you suggest a better approach to find the "entropy", "amount of changing pixels", or "gradient depth" in (parts of) images?

Edit: using, among others, the resources provided in the answer below, I came up with the following working code:

# Compute the entropy of an image slice.
def entropy_slice(image_data, x, y, width, height)
  slice = image_data.crop(x, y, width, height)
  entropy(slice)
end

# Compute the entropy of an image, defined as -sum(p.*log2(p)).
# Note: instead of log2, which is only available in Ruby > 1.9, we use
# log(p)/log(2), which has the same effect.
def entropy(image_slice)
  hist = image_slice.color_histogram
  hist_size = hist.values.inject(0) { |sum, x| sum + x }.to_f

  entropy = 0
  hist.values.each do |h|
    p = h.to_f / hist_size
    entropy += p * (Math.log(p) / Math.log(2)) if p != 0
  end
  entropy * -1
end

Where image_data is an RMagick::Image.
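Once each slice's entropy is known, choosing the crop reduces to finding the window of consecutive slices with the highest total entropy. The sketch below is my own illustration of that selection step (it is not smartcropper's actual code) and works on plain numbers, so it needs no image library:

```ruby
# Given the entropy of each equal-width column slice, return the starting
# index of the window of `window_size` consecutive slices whose summed
# entropy is highest. Illustrative only -- not smartcropper's implementation.
def best_window(slice_entropies, window_size)
  best_index = 0
  best_sum = -Float::INFINITY
  (0..slice_entropies.length - window_size).each do |i|
    sum = slice_entropies[i, window_size].sum
    if sum > best_sum
      best_sum = sum
      best_index = i
    end
  end
  best_index
end

best_window([0.2, 1.5, 3.1, 2.8, 0.4], 2)  # => 2 (slices 2..3 are "busiest")
```

The chosen index, multiplied by the slice width, gives the x offset to pass to `crop`; the same scan run over row slices gives the y offset.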

This is used in the smartcropper gem, which allows smart slicing and cropping for images with e.g. paperclip.


Comments (2)

獨角戲 2024-10-23 10:24:18

Using facets' Array#entropy:

require 'facets'
puts File.read('path/to/image.png').chars.entropy
千里故人稀 2024-10-23 10:24:17


Entropy is explained here (with MATLAB source, but hopefully the qualitative explanation helps):

Introduction to Entropy (Data Mining in MATLAB)

For a more formal explanation, see:

"Elements of Information Theory" (Chapter 2), by Cover and Thomas
