Matthias Lee
2012-10-18 01:28:46 UTC
Hello there,
I've been using PyMongo for a while and have read a few smaller BSON files,
but today I was trying to convert a large BSON file to JSON (it contains no
binary data).
Every way I tried reading and decoding it resulted in me maxing out my RAM at
32GB.
Is there a more efficient way of reading/decoding BSON than this:
import bson

# Read the whole file into memory and decode every document at once
with open("bigBson.bson", 'rb') as f:
    result = bson.decode_all(f.read())
Perhaps it can be decoded incrementally?
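For context, here is the sort of incremental approach I was imagining. It is only a rough sketch that walks the file one document at a time using BSON's 4-byte little-endian length prefix; iter_bson is just a name I made up for illustration:

import struct
import bson

def iter_bson(path):
    # Yield documents one at a time instead of decoding the whole file.
    with open(path, 'rb') as f:
        while True:
            # Each BSON document begins with a 4-byte little-endian int32
            # giving its total size, including these 4 length bytes.
            length_bytes = f.read(4)
            if len(length_bytes) < 4:
                break  # end of file
            length = struct.unpack('<i', length_bytes)[0]
            data = length_bytes + f.read(length - 4)
            # decode_all on one document's bytes returns a one-element list
            yield bson.decode_all(data)[0]

for doc in iter_bson("bigBson.bson"):
    pass  # e.g. convert each doc to JSON with bson.json_util.dumps(doc)

Something along these lines should keep only one document in memory at a time, but I am not sure it is the intended way to do this with PyMongo.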
In comparison, using mongorestore to load the same file barely increased my
memory usage.
Thanks,
Matthias