Why does R appear to use much more memory on Windows than it reports itself?
> sessionInfo()
R version 3.4.0 (2017-04-21)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows >= 8 x64 (build 9200)
On a 32 GB system, I got this error while creating a distance matrix:
df <- remove_duplicates_quanteda(dfm, df)
Error: cannot allocate vector of size 1.3 Gb
Looking through my environment, there is little apparent reason for concern:
print(object.size(x = lapply(ls(), get)), units = "Mb")
96.5 Mb
Hadley puts it quite simply in Advanced R:
This number won’t agree with the amount of memory reported by your operating system for a number of reasons:
It only includes objects created by R, not the R interpreter itself.
Both R and the operating system are lazy: they won’t reclaim memory until it’s actually needed. R might be holding on to memory because the OS hasn’t yet asked for it back.
R counts the memory occupied by objects but there may be gaps due to deleted objects. This problem is known as memory fragmentation.
For more information, see the chapter on Memory.
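The points above can be checked directly from the console. A minimal sketch (it assumes the pryr package is installed; `pryr::object_size()` also counts memory shared between objects, which `utils::object.size()` misses):

```r
## Size of the objects in the global environment, as in the question
print(object.size(x = lapply(ls(), get)), units = "Mb")

## Trigger a garbage collection and print R's own accounting of
## cells/vectors used versus the high-water mark ("max used")
gc()

## Total memory R believes it is using (requires the pryr package)
# pryr::mem_used()
```

Note that `gc()`'s "max used" column can stay high even after objects are removed, which is one reason the OS-reported figure exceeds the sum of your objects' sizes.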
Thanks for the reference. But this is not just a matter of how allocated memory is reported and reclaimed, because it actually results in being unable to allocate a 1.3 GB vector. For various reasons (including vanity) I have no intention of closing Visual Studio/rebooting. – user1603472
Could it be that each column/row (vector) of the matrix to be allocated is about 1.3 GB? What are the dimensions of the distance matrix you are computing? – AaronP
What does the same exercise report when using `pryr::object_size()`? – hrbrmstr
'Error: cannot allocate vector of size 1.3 Gb' means that R had been happily allocating memory until no more was available. It reports how much memory the next allocation (inside the function you called) would need. Even if you have more than 1.3 GB free, that may not be enough for the next allocation. Distance matrices can be huge. You cannot tell from your global environment whether there is reason for concern. You have to think about the memory requirements of the operations you want to perform. – Roland
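The point about estimating memory requirements can be made concrete with a back-of-the-envelope calculation. A sketch with a hypothetical number of documents `n` (not from the question): a lower-triangular `dist` object stores `n*(n-1)/2` doubles at 8 bytes each, and a full `n x n` matrix stores `n^2` doubles.

```r
n <- 60000                                 # hypothetical number of rows/documents
dist_gb <- n * (n - 1) / 2 * 8 / 1024^3    # lower-triangular "dist" object, in GB
full_gb <- n^2 * 8 / 1024^3                # full square distance matrix, in GB
round(c(dist = dist_gb, full = full_gb), 1)
```

At this hypothetical size, even the triangular form needs roughly 13 GB and the full matrix roughly 27 GB, so a 32 GB machine can run out of memory well before any single object in the environment looks alarming.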