OOM problems caused by a MongoDB paging query


OOM Description:

```
2018-09-18 14:46:54.338 [http-nio-8099-exec-8] ERROR o.a.c.c.C.[.[.[.[dispatcherServlet] - Servlet.service() for servlet [dispatcherServlet] in context with path [/party-data-center] threw exception [Handler dispatch failed; nested exception is java.lang.OutOfMemoryError: GC overhead limit exceeded] with root cause
java.lang.OutOfMemoryError: GC overhead limit exceeded
    at org.bson.io.ByteBufferBsonInput.readString(ByteBufferBsonInput.java:154)
    at org.bson.io.ByteBufferBsonInput.readString(ByteBufferBsonInput.java:126)
    at org.bson.BsonBinaryReader.doReadString(BsonBinaryReader.java:245)
    at org.bson.AbstractBsonReader.readString(AbstractBsonReader.java:461)
    at org.bson.codecs.BsonStringCodec.decode(BsonStringCodec.java:31)
    at org.bson.codecs.BsonStringCodec.decode(BsonStringCodec.java:28)
    at org.bson.codecs.BsonArrayCodec.readValue(BsonArrayCodec.java:102)
    at org.bson.codecs.BsonArrayCodec.decode(BsonArrayCodec.java:67)
    at org.bson.codecs.BsonArrayCodec.decode(BsonArrayCodec.java:37)
    at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:101)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:84)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41)
    at org.bson.codecs.configuration.LazyCodec.decode(LazyCodec.java:47)
    at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:101)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:84)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41)
    at org.bson.codecs.configuration.LazyCodec.decode(LazyCodec.java:47)
    at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:101)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:84)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41)
    at org.bson.codecs.configuration.LazyCodec.decode(LazyCodec.java:47)
    at org.bson.codecs.BsonArrayCodec.readValue(BsonArrayCodec.java:102)
    at org.bson.codecs.BsonArrayCodec.decode(BsonArrayCodec.java:67)
    at org.bson.codecs.BsonArrayCodec.decode(BsonArrayCodec.java:37)
    at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:101)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:84)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41)
    at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:101)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:84)
    at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:41)
    at com.mongodb.connection.ReplyMessage.<init>(ReplyMessage.java:51)
    at com.mongodb.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:301)
```

From this stack trace it appears that the MongoDB driver is consuming too much memory while decoding query results, which triggers the OOM.

Export the heap dump and analyze it. After opening the file with Eclipse MAT (Memory Analyzer Tool), it reports a "Problem Suspect 1" (the most likely cause of the memory overflow):

The thread org.apache.tomcat.util.threads.TaskThread @ 0xf9b19fa0 http-nio-8099-exec-8 keeps local variables with total size 58,255,056 (60.49%) bytes. The memory is accumulated in one instance of "java.lang.Object[]" loaded by "<system class loader>". The stacktrace of this Thread is available. Keywords: java.lang.Object[]

Click "See stacktrace".

The stack trace is still quite long, so work through it patiently. In it we find:

```
at com.mongodb.DB.command(Lcom/mongodb/DBObject;Lcom/mongodb/ReadPreference;Lcom/mongodb/DBEncoder;)Lcom/mongodb/CommandResult; (DB.java:496)
at com.mongodb.DB.command(Lcom/mongodb/DBObject;Lcom/mongodb/ReadPreference;)Lcom/mongodb/CommandResult; (DB.java:512)
at com.mongodb.DB.command(Lcom/mongodb/DBObject;)Lcom/mongodb/CommandResult; (DB.java:467)
```

So the failure happens while executing a Mongo command, and the return type is CommandResult. Isn't that the Mongo query result set being returned? Could the result set be too large? Quite likely! Keep looking further down...

```
at com.fosung.data.party.dao.DetailDao.detailQuery(Lcom/fosung/data/party/dto/PartyItemDto;)Lcom/fosung/data/party/vo/OutDetailCountVo; (DetailDao.java:314)
at com.fosung.data.party.dao.DetailDao$$FastClassBySpringCGLIB$$caf49f16.invoke(ILjava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object; (Unknown Source)
```

Here we finally see one of our own business methods, DetailDao.detailQuery, which is very likely the source of the OOM. After carefully analyzing that method, we found the root cause: when computing the total record count, the query carried no paging conditions (skip and limit), so it fetched every record matching the conditions (more than 60,000 of them) and loaded them all into memory, causing the OOM.
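The defensive side of the fix can be sketched in plain Java. This is an illustrative helper, not the project's actual DetailDao code; the names and the cap value are assumptions. The idea is that every list query derives explicit skip/limit bounds instead of running unbounded:

```java
// Minimal paging helper (illustrative sketch, not the real DetailDao code).
// Ensures every query carries skip/limit bounds so a single request can
// never pull tens of thousands of documents into the heap.
public class Paging {
    static final int MAX_PAGE_SIZE = 500; // assumed cap to protect the heap

    // 1-based page number -> number of documents to skip
    static int skipFor(int page, int pageSize) {
        if (page < 1 || pageSize < 1) {
            throw new IllegalArgumentException("page and pageSize must be >= 1");
        }
        return (page - 1) * pageSize;
    }

    // clamp a caller-supplied page size into [1, MAX_PAGE_SIZE]
    static int limitFor(int requestedSize) {
        return Math.min(Math.max(requestedSize, 1), MAX_PAGE_SIZE);
    }

    public static void main(String[] args) {
        System.out.println(skipFor(3, 20));   // 40
        System.out.println(limitFor(100000)); // 500
    }
}
```

The skip/limit values computed here would then be passed to the MongoDB query (e.g. as `$skip` and `$limit` pipeline stages), so the server only ever returns one bounded page.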

Solution: have MongoDB compute the total number of matching records inside the aggregation pipeline itself:

```javascript
db.getCollection('user_order').aggregate([
    { "$match": { "code": "100002255842358" } },
    { "$project": { "code": 1, "yearInfo": 1, "personInfo": 1 } },
    { "$unwind": "$yearInfo.counts" },
    { "$unwind": "$yearInfo.counts.code" },
    { "$match": { "yearInfo.counts.code": { "$in": ["1"] } } },
    { "$sort": { "code": 1, "yearInfo.counts.sort": 1 } },
    { "$lookup": { "from": "user_info", "localField": "yearInfo.counts.detail", "foreignField": "_id", "as": "personInfo" } },
    { "$unwind": "$personInfo" },
    { "$group": { "_id": null, "totalCount": { "$sum": 1 } } },
    { "$project": { "totalCount": "$totalCount", "_id": 0 } }
])
```

This way there is no need to fetch all the records and then count them on the application side; the server returns only a single document containing the total.
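The principle can be illustrated with a plain-Java analogy (purely illustrative, not the project's code): count matches as the data is traversed instead of collecting every match into a list first, which is what the pipeline's `$group`/`$sum` stage does for us on the server:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// In-process analogue of counting server-side (illustrative sketch):
// a streaming count touches one element at a time and keeps only a
// running total, instead of materializing 60,000+ matches in memory.
public class CountSketch {
    static long countMatching(List<Integer> codes, int wanted) {
        // nothing is collected; only the count accumulates
        return codes.stream().filter(c -> c == wanted).count();
    }

    public static void main(String[] args) {
        List<Integer> data = IntStream.range(0, 60_000)
                .map(i -> i % 3)
                .boxed()
                .collect(Collectors.toList());
        System.out.println(countMatching(data, 1)); // 20000
    }
}
```

The memory profile is the difference: collecting first is O(matches) in heap, while the running count is O(1), just as the `$group` stage returns a single totalCount document.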

After the modification, the test passed perfectly...

30 December 2020, 10:10 | Views: 3092
