Matplotlib out of memory

Ian Fako

I am running into two problems, both related to memory. The first one occurs when about 5 resolvers are used (see the code and explanation below), the second when about 15 resolvers are used. For the first problem there are similar questions on Stack Overflow; the usual fix is to clear the figure's memory after every loop iteration, but since I want to draw several data lines in a single figure, that does not work for me.

Here is the code snippet where it all happens:

import matplotlib.pyplot as plt
import numpy as np

fig = plt.figure()
ax = fig.add_subplot(111)

def add_plot(resolver_name, results):
    # Normalize the response times so they sum to 1 and plot their
    # cumulative sum against the raw values as one CDF line
    sum_results = sum(results)
    norm = [float(i)/sum_results for i in results]
    cy = np.cumsum(norm)
    ax.plot(results, cy, label=resolver_name, linewidth=0.8)


for resolver in resolvers:
    results = db.get_rt(resolver["ipv4"], tls)
    add_plot(resolver["name"], results)        

# Positioning of legend
box = ax.get_position()
ax.set_position([box.x0, box.y0, box.width * 0.8, box.height])
ax.legend(loc='center left', bbox_to_anchor=(1, 0.5))
fig.set_size_inches(10,5)

ax.set_xscale('log')
plt.title('CDF response time for '+('DNS-over-TLS measurements' if tls else 'DNS measurements'))
plt.xlabel("Response time (ms)")
plt.ylabel("CDF")
plt.grid(True)

png_name = V.base_directory+"/plots/rt_cdf.png"
if (tls):
    png_name = V.base_directory+"/plots/rt_cdf_tls.png"
log.info("Plotting graph to "+png_name)
plt.savefig(png_name)

The variable resolvers contains some information about several public DNS resolvers. The variable results is a list of float values. None of the other, unexplained variables should be relevant to this question, but feel free to ask if you need further explanation.

Problem 1

As mentioned before, this happens when about 5 resolvers are used. The size of the results entries varies between ~1 million and ~6 million values. A MemoryError appears at the last line:

Traceback (most recent call last):
File "plot_building/rt_cdf.py", line 63, in <module>
    plt.savefig(png_name)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/pyplot.py", line 695, in savefig
    res = fig.savefig(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/figure.py", line 2062, in savefig
    self.canvas.print_figure(fname, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/backend_bases.py", line 2263, in print_figure
    **kwargs)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/backends/backend_agg.py", line 517, in print_png
    FigureCanvasAgg.draw(self)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/backends/backend_agg.py", line 437, in draw
    self.figure.draw(self.renderer)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/artist.py", line 55, in draw_wrapper
    return draw(artist, renderer, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/figure.py", line 1493, in draw
    renderer, self, artists, self.suppressComposite)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/image.py", line 141, in _draw_list_compositing_images
    a.draw(renderer)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/artist.py", line 55, in draw_wrapper
    return draw(artist, renderer, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/axes/_base.py", line 2635, in draw
    mimage._draw_list_compositing_images(renderer, self, artists)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/image.py", line 141, in _draw_list_compositing_images
    a.draw(renderer)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/artist.py", line 55, in draw_wrapper
    return draw(artist, renderer, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/lines.py", line 756, in draw
    tpath, affine = (self._get_transformed_path()
File "/usr/local/lib/python2.7/dist-packages/matplotlib/transforms.py", line 2848, in get_transformed_path_and_affine
    self._revalidate()
File "/usr/local/lib/python2.7/dist-packages/matplotlib/transforms.py", line 2822, in _revalidate
    self._transform.transform_path_non_affine(self._path)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/transforms.py", line 2492, in transform_path_non_affine
    return self._a.transform_path_non_affine(path)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/transforms.py", line 1564, in transform_path_non_affine
    x = self.transform_non_affine(path.vertices)
File "/usr/local/lib/python2.7/dist-packages/matplotlib/transforms.py", line 2271, in transform_non_affine
    return np.concatenate((x_points, y_points), 1)
MemoryError
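
For scale, a rough back-of-the-envelope estimate (my own numbers, assuming float64 data and the ~6 million points mentioned above; nothing here was measured on the actual VM) shows why the np.concatenate call in the last frame, which stacks the x and y values into one N x 2 vertex array per line, already weighs a lot:

n_points = 6000000                    # size of the largest results list (assumed)
bytes_per_line = n_points * 2 * 8     # (x, y) vertex pairs stored as float64
print(bytes_per_line / 1024.0 ** 2)   # about 91.6 MiB of vertices for a single line
# Several such lines, plus the temporary copies made while the log-scale
# transform revalidates each path, add up quickly on a 2 GB VM.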

Problem 2

This one was harder to figure out. At some point during the run the process simply stopped. After some searching I found the following in /var/log/syslog:

[27578124.494907] Out of memory: Kill process 376 (python) score 897 or sacrifice child
[27578124.495020] Killed process 376 (python) total-vm:2081432kB, anon-rss:1833416kB, file-rss:1464kB

I think a few other lines in the log file may also belong to this issue, but from these I concluded that the problem is caused by running out of memory.


The script runs on an Ubuntu VM with 2 GB of RAM.

Any ideas how to solve these problems?

jt2.4.6

Were you watching the system monitor while running this? Are you running out of RAM?
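
A quick way to check that without a GUI system monitor (a sketch of my own using the standard-library resource module; resolvers, db, add_plot and log come from the question's code) is to log the peak resident set size after each resolver is added:

import resource

for resolver in resolvers:
    results = db.get_rt(resolver["ipv4"], tls)
    add_plot(resolver["name"], results)
    # ru_maxrss is reported in kilobytes on Linux
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    log.info("peak RSS after " + resolver["name"] + ": " + str(peak_kb) + " kB")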

6 million points seems huge, can't you just downsample?
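
As a minimal sketch of that idea (max_points is a hypothetical parameter I added; everything else mirrors the add_plot() from the question), each line can be thinned to a few thousand points before it ever reaches matplotlib:

import numpy as np

def add_plot(resolver_name, results, max_points=10000):
    # Same computation as in the question, but vectorized with numpy
    results = np.asarray(results, dtype=np.float64)
    cy = np.cumsum(results / results.sum())
    # Keep only max_points evenly spaced samples per line; with millions of
    # input values the plotted curve looks virtually the same but each line
    # holds far fewer vertices. (This assumes results comes back sorted, as a
    # CDF plot implies; if not, sort it first with np.sort.)
    if len(results) > max_points:
        idx = np.linspace(0, len(results) - 1, max_points).astype(int)
        results, cy = results[idx], cy[idx]
    ax.plot(results, cy, label=resolver_name, linewidth=0.8)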
