1. Recreated the collection and ran the import again; the result was the same.
2. Added the batchSize option; after the first batch imported, the progress bar stopped updating.
mongoimport -h 127.0.0.1 --port XXXXX -d database -c collection --file logfile.log
2017-03-07T10:40:51.514+0800    connected to: 127.0.0.1:XXXX
2017-03-07T10:40:54.507+0800    [...]  database.collection  4.7MB/319.1MB (1.5%)
2017-03-07T10:40:57.507+0800    [...]  database.collection  4.7MB/319.1MB (1.5%)
2017-03-07T10:41:00.507+0800    [...]  database.collection  4.7MB/319.1MB (1.5%)
2017-03-07T10:41:03.507+0800    [...]  database.collection  4.7MB/319.1MB (1.5%)
2017-03-07T10:41:06.507+0800    [...]  database.collection  4.7MB/319.1MB (1.5%)
Has anyone else run into this problem?
Thank you very much.
CodePudding user response:
Is the file in JSON format? Can you check whether there are any errors in the log file?
CodePudding user response:
I validated the contents of the file; it is indeed valid JSON.
CodePudding user response:
Is it a sharded cluster? Is there no shard key? Take a few documents out and try inserting them manually.
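A manual insert can be sanity-checked from the mongo shell along these lines (host, port, and the database/collection names are placeholders; substitute your own):

```shell
# Connect to the same mongod/mongos instance the import targets
mongo --host 127.0.0.1 --port XXXXX

# Then, inside the shell, insert one sample document and count:
#   use database
#   db.collection.insert({ "test": 1 })
#   db.collection.count()
```

If a manual insert succeeds instantly, the problem is more likely on the mongoimport side than in the server.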
CodePudding user response:
Importing through a front-end tool works fine and is very fast. mongoimport also used to work, then one day it suddenly stopped, and nothing was changed in the meantime.
CodePudding user response:
Can you try adding the batchSize option? I saw this on Stack Overflow; I don't know whether it works:
http://stackoverflow.com/questions/33315243/mongoimport-stuck-at-same-point-while-importing-a-json-file
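The workaround suggested in that Stack Overflow thread is to cap the batch size. A sketch of the command (host, port, database/collection names, and file path are placeholders from the original post, not verified values):

```shell
# Force mongoimport to send one document per batch;
# slower, but avoids the large-batch stall some versions exhibit.
mongoimport -h 127.0.0.1 --port XXXXX -d database -c collection \
    --batchSize 1 --file logfile.log
```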
CodePudding user response:
I added batchSize, set to 10 for example; it imports only 10 documents and then hangs.
CodePudding user response:
Could you tell me how you finally solved this? I'm hitting the same problem now; the import gets stuck after a few documents.
CodePudding user response:
OP, how did you solve it?