Can PostgreSQL handle data at the billion-row level?
Time:10-09
CodePudding user response:
A single table can probably handle it if the rows are small (few fields, and short ones). You could also try a partitioned table, but that is still not ideal.
CodePudding user response:
I am considering switching from PostgreSQL to ClickHouse, but I have never used ClickHouse and don't know what pitfalls it has.
CodePudding user response:
Partitioning can support it; the main thing is to keep the indexes a manageable size. If you want less work, you can keep a single table and split the indexes instead, for example with range-based indexes. For partitioning, see: https://blog.csdn.net/goldenhawking/article/category/1222775
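As a concrete illustration of the range partitioning suggested above, here is a minimal sketch using PostgreSQL's declarative partitioning (available since PostgreSQL 10). The table and column names (`events`, `created_at`) are illustrative, not taken from the thread:

```sql
-- Parent table partitioned by a timestamp range.
CREATE TABLE events (
    id         bigint      NOT NULL,
    created_at timestamptz NOT NULL,
    payload    text
) PARTITION BY RANGE (created_at);

-- One partition per month keeps each partition's index small.
CREATE TABLE events_2023_01 PARTITION OF events
    FOR VALUES FROM ('2023-01-01') TO ('2023-02-01');
CREATE TABLE events_2023_02 PARTITION OF events
    FOR VALUES FROM ('2023-02-01') TO ('2023-03-01');

-- An index created on the parent is propagated to every
-- partition automatically in PostgreSQL 11 and later.
CREATE INDEX ON events (created_at);
```

Queries that filter on `created_at` will only scan the partitions whose ranges match, so each scan touches a fraction of the billion rows.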
CodePudding user response:
The partitioning test in that blog only goes up to ten million rows, but a billion should behave similarly; inserting a billion rows would simply take too long, which is why it wasn't posted. In a production environment I store about 600 million records, over 400 GB, without any major problems.
For partitioning on a non-sequential field, refer to that article's approach. Write the CHECK conditions carefully so the planner can skip child tables that don't need to be searched.
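The CHECK-condition technique described above refers to the older, inheritance-based partitioning style, where pruning relies on non-overlapping CHECK constraints plus `constraint_exclusion`. A minimal sketch, with illustrative table and column names (`measurements`, `bucket`) that are not from the thread:

```sql
-- Parent table; children are attached via INHERITS.
CREATE TABLE measurements (
    id     bigint NOT NULL,
    bucket int    NOT NULL,  -- non-sequential partition key
    value  double precision
);

-- Non-overlapping CHECK constraints are what allow the planner
-- to prove a child table cannot match and skip scanning it.
CREATE TABLE measurements_p0
    (CHECK (bucket >= 0   AND bucket < 100)) INHERITS (measurements);
CREATE TABLE measurements_p1
    (CHECK (bucket >= 100 AND bucket < 200)) INHERITS (measurements);

-- Enable pruning for inheritance-style partitions.
SET constraint_exclusion = partition;

-- With a WHERE clause matching one CHECK range, e.g.
--   EXPLAIN SELECT * FROM measurements WHERE bucket = 42;
-- the plan scans only measurements_p0.
```

If the CHECK ranges overlap or don't cover the query predicate cleanly, the planner must scan every child table, which is exactly the unnecessary searching the reply warns about.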
CodePudding user response:
PostgreSQL's performance is close to Oracle's; at this scale it should basically be no problem.