2018-12-10 07:56:50 UTC
I've been trying the MongoSpark connector in Scala for writing data from
Spark into MongoDB. So far so good: the data is written flawlessly, with
one minor detail that is confusing me.

In my code I create Date objects with the hours, minutes, seconds and
milliseconds reset to 0, and the remaining fields set by me. Somehow,
though, the date that gets written into MongoDB is 2 h earlier than the
one I created. There is no issue with the Spark time zone in the
DataFrame or the like; I've already checked that.

Any clue, or config params I may be missing?
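For reference, here is a minimal sketch of what I mean (the zone name and the helper are illustrative, not my actual code). Note that java.util.Calendar builds the Date in the JVM's default time zone, while MongoDB stores BSON dates as epoch milliseconds in UTC, so a "midnight local" Date built in a UTC+2 zone corresponds to 22:00 UTC of the previous day:

```scala
import java.util.{Calendar, Date, TimeZone}

// Illustrative helper: a Date at midnight of the given day, in the given zone,
// with hours/minutes/seconds/milliseconds all reset to 0.
def midnightIn(zone: TimeZone, year: Int, month: Int, day: Int): Date = {
  val cal = Calendar.getInstance(zone)
  cal.clear()               // zero all time-of-day fields
  cal.set(year, month, day) // month is 0-based in java.util.Calendar
  cal.getTime
}

// "Midnight" in a UTC+2 zone (e.g. Europe/Madrid in summer) vs midnight UTC.
// MongoDB stores the underlying epoch instant, which is 2 h apart:
val local = midnightIn(TimeZone.getTimeZone("Europe/Madrid"), 2018, Calendar.JULY, 10)
val utc   = midnightIn(TimeZone.getTimeZone("UTC"),           2018, Calendar.JULY, 10)
println(utc.getTime - local.getTime) // 7200000 ms = 2 hours
```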
For other MongoDB technical support options, see: https://docs.mongodb.com/manual/support/
You received this message because you are subscribed to the Google Groups "mongodb-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to firstname.lastname@example.org.
To post to this group, send email to email@example.com.
Visit this group at https://groups.google.com/group/mongodb-user.
To view this discussion on the web visit https://groups.google.com/d/msgid/mongodb-user/32733676-0639-493d-817f-be25162740e6%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.