Database entry and processing questions

Time:11-09

We wrote a database application; the database sits on an Alibaba Cloud server and is shared by a few users, who connect to it directly through ADOConnection.
The source data is Excel (a single record may take two rounds to complete, and the format is different every round). The software then uploads to Alibaba Cloud at intervals, several thousand records each time. The data to record is:
1. Purchase specification table: code, name, trader name, allotment-object name, whether the shareholder code is valid, intended purchase price (RMB), planned purchase amount (RMB), assets, offering price (RMB), shares allotted (shares), codes, amount, success rate, type
2. Allotment objects table: trader name, allotment-object name, shareholder code. Note: this data is used to supplement the announcement data and may be useful when cleaning the data.
3. Statistics table: name, assets, maximum number of valid quotations, number of quotations, accuracy
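To make the three tables above concrete, here is a minimal sketch of what the schema might look like. This is purely illustrative: all table and column names are my own translations and guesses, SQLite stands in for the actual cloud database, and the real columns and types would come from the Excel layout.

```python
import sqlite3

# Hypothetical, abbreviated versions of the three tables described above;
# names are placeholder translations, not the original schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchase_spec (          -- 1. purchase specification
    code TEXT, name TEXT, trader TEXT, allotment_object TEXT,
    shareholder_code_valid INTEGER,   -- whether the shareholder code is valid
    intended_price REAL, planned_amount REAL, assets REAL,
    offering_price REAL, shares_allotted INTEGER,
    amount REAL, success_rate REAL, type TEXT
);
CREATE TABLE allotment_objects (      -- 2. allotment objects
    trader TEXT, allotment_object TEXT, shareholder_code TEXT
);
CREATE TABLE stats (                  -- 3. statistics
    name TEXT, assets REAL,
    max_valid_quotes INTEGER, quote_count INTEGER, accuracy REAL
);
""")
print([r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")])
```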

Please check whether my approach below is sound, and whether there is a better solution:

1. Convert each record into an SQL statement; once 200 statements have accumulated, send them to the server, and repeat until everything is uploaded.
2. Sometimes the shareholder code (or another field) is missing, so the data is first uploaded to a temporary table on the server, then completed by a stored procedure that queries the missing values before the final save.
3. Since the statistics change after every import, should I build a statistics table instead of a view and regenerate it after each import? An entity table and a view feel about the same in query speed; which way should I go?
4. Queries over a large volume of data are slow; is there any way around this?
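Step 1 above (send records in batches of 200) can be sketched as follows. This is a Python/SQLite stand-in for the Delphi ADO code, not the actual implementation; the `quotes` table and its columns are made up for the example.

```python
import sqlite3

BATCH_SIZE = 200  # send records to the server in batches of 200, as in step 1

def batch_insert(conn, records):
    """Insert records in transactions of BATCH_SIZE rows each.

    `records` is an iterable of (code, name, price) tuples; the
    `quotes` table and its columns are hypothetical placeholders.
    """
    cur = conn.cursor()
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == BATCH_SIZE:
            with conn:  # one transaction per 200-row batch
                cur.executemany(
                    "INSERT INTO quotes (code, name, price) VALUES (?, ?, ?)",
                    batch)
            batch.clear()
    if batch:  # flush the final partial batch
        with conn:
            cur.executemany(
                "INSERT INTO quotes (code, name, price) VALUES (?, ?, ?)",
                batch)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quotes (code TEXT, name TEXT, price REAL)")
batch_insert(conn, [(f"C{i:04d}", f"name{i}", 1.0 + i) for i in range(450)])
print(conn.execute("SELECT COUNT(*) FROM quotes").fetchone()[0])  # 450
```

Grouping each batch into one transaction avoids paying a commit per row, which is usually the dominant cost of row-at-a-time inserts over a network.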

CodePudding user response:

Inserting the 200 records inside a single transaction will be faster. FireDAC, the database access framework in newer versions of Delphi, also supports batch operations, which is faster still. A view generally stores no physical data; behind the scenes it still reads the underlying entity tables, so naturally there is not much speed difference. For slow queries, look at the query plan the database generates: check whether some table is being scanned row by row, or whether the same computation is repeated many times. Table scans can usually be eliminated by creating appropriate indexes, and repeated computation can be factored out into a CTE. If the data volume is very large, you can also consider saving the intermediate results of the calculation, so that queries use the intermediate results directly; that will be much faster.
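The "look at the query plan, then add an index" advice can be demonstrated with SQLite's `EXPLAIN QUERY PLAN` (the equivalent on the poster's server database would be its own query-plan tool; the table here is again a made-up example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quotes (code TEXT, price REAL)")
conn.executemany("INSERT INTO quotes VALUES (?, ?)",
                 [(f"C{i:04d}", float(i)) for i in range(1000)])

def plan(sql):
    # The detail column of EXPLAIN QUERY PLAN says whether the engine
    # scans the whole table or searches an index.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT price FROM quotes WHERE code = 'C0500'"
before = plan(query)               # full table scan: plan mentions SCAN
conn.execute("CREATE INDEX idx_quotes_code ON quotes(code)")
after = plan(query)                # plan now mentions the index instead
print(before)
print(after)
```

The exact wording of the plan varies by database and version, but the pattern is the same everywhere: a scan over the whole table before the index, an index search after it.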

Just a small suggestion; I hope it helps.