There are 1.8 million records in the database table that I want to index with the fos:elastica:populate command. Once roughly 500,000 documents have been indexed into Elasticsearch, the indexing stops abruptly.
In dmesg on the backend server I saw this:
[19314221.670723] Out of memory: Killed process 2640772 (php) total-vm:21738676kB, anon-rss:19847596kB, file-rss:4kB, shmem-rss:0kB, UID:0 pgtables:42352kB oom_score_adj:0
[19314224.812907] oom_reaper: reaped process 2640772 (php), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB
Elasticsearch, MySQL, and the PHP Symfony backend (where populate runs) are on separate servers.
How can I fix this and finish populating the index?
FOSElasticaBundle is v5.1.0
CodePudding user response:
Okay. The only solution I found (short of renting a more powerful server) is to run the populate command in chunks from a bash script. Each chunk runs in a fresh PHP process, so memory is released between chunks instead of accumulating until the OOM killer steps in.
Here is a sample script to index the remaining data.
#!/bin/bash
# Index pages in chunks of 4,000; each bin/console invocation is a
# separate PHP process, so memory does not accumulate across chunks.
for i in {6000..14000..4000}
do
    bin/console fos:elastica:populate --no-reset --first-page="$i" --last-page="$((i + 3999))"
done
# Catch-all for everything after the last chunk.
bin/console fos:elastica:populate --no-reset --first-page=18000
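Before running the real thing, it can help to preview which page ranges the loop will cover. This is a minimal sketch that only echoes the ranges the chunked loop above would pass to populate (assuming chunk boundaries of 4,000 pages starting at page 6000), without touching Elasticsearch:

```shell
#!/bin/bash
# Print the page ranges the chunked populate loop will cover,
# without running the actual fos:elastica:populate command.
preview_ranges() {
    local i
    for i in {6000..14000..4000}; do
        echo "pages $i-$((i + 3999))"
    done
    # The final populate run takes everything from page 18000 onward.
    echo "pages 18000-end"
}
preview_ranges
```

If the echoed ranges look right (contiguous, no gaps or overlaps), swap the `echo` for the real `bin/console fos:elastica:populate` call.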