Two identical records submitted 101 milliseconds apart were both written to the database

Time:09-16

We use a third-party form application. Two identical records were submitted 101 milliseconds apart and both ended up in the database; the deduplication rule had no effect. The third-party company says the interval was too small for the system to detect the duplicate. I don't know whether they are right. Please help me understand this in detail, thank you.

CodePudding user response:

It's impossible that the database can't tell them apart; 101 ms or even 0.1 ms makes no difference to it, unless you have hit a truly massive bug.
It sounds like only the application enforces uniqueness of the data, with no primary key or unique index in the database? That is no small problem in the program, so it's no surprise the third-party company doesn't dare take the blame ~
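
For reference, this is the kind of database-level guarantee the reply is talking about. The sketch below uses Python with SQLite purely for illustration, and the table and column names (form_submissions, id_card_no) are made up, not the actual schema; the point is simply that a unique index lets the database itself reject the second identical row, no matter how close together the two inserts arrive.

```python
import sqlite3

# Illustrative only: hypothetical table/column names, SQLite stands in for
# whatever database the third-party application actually uses.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE form_submissions (id INTEGER PRIMARY KEY, id_card_no TEXT)")
conn.execute("CREATE UNIQUE INDEX ux_form_submissions_id_card ON form_submissions (id_card_no)")

conn.execute("INSERT INTO form_submissions (id_card_no) VALUES ('110101199001011234')")
try:
    # The second identical row is rejected by the database itself,
    # regardless of whether it arrives 101 ms or 0.1 ms later.
    conn.execute("INSERT INTO form_submissions (id_card_no) VALUES ('110101199001011234')")
except sqlite3.IntegrityError as e:
    print("second insert rejected:", e)
```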

CodePudding user response:

The deduplication rule didn't take effect.

The rule is just a query; the check is evidently done in the back-end service.

In general, this situation arises roughly as follows:

The form has a submit button. You click it once, nothing seems to happen (or the page doesn't navigate), so you click it again. The form is therefore submitted twice, i.e. the back-end service is called twice.

The deduplication rule defined in the back-end service then executes in this order:
1. The first submission arrives and runs the check; it finds no duplicate data.
2. The second submission arrives and also runs the check; it likewise finds no duplicate data.
3. The first submission writes its data.
4. The second submission writes its data.

What you expected was the order 1, 3, 2, 4.
In reality, the first submission's data has not yet been stored when the second check runs, and that is essentially where the problem lies (sketched in code below).

(Swapping steps 3 and 4 gives the same result.)

So, from the database's point of view, the recommendation is to put a uniqueness constraint (unique index) on the table and let the database itself guarantee that the data is unique.
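
To make the 1-2-3-4 interleaving concrete, here is a minimal sketch in Python with SQLite (hypothetical names; the real back-end and schema are unknown). The "rechecking rule" is just a SELECT, and because both checks run before either insert, the application-level check passes twice and two identical rows end up stored.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Note: no unique index on id_card_no -- uniqueness is left to the application.
conn.execute("CREATE TABLE form_submissions (id INTEGER PRIMARY KEY, id_card_no TEXT)")

def already_exists(id_card_no):
    # The back-end "rechecking rule": just a query.
    row = conn.execute("SELECT 1 FROM form_submissions WHERE id_card_no = ?",
                       (id_card_no,)).fetchone()
    return row is not None

def write(id_card_no):
    conn.execute("INSERT INTO form_submissions (id_card_no) VALUES (?)", (id_card_no,))

id_no = "110101199001011234"
check_a = already_exists(id_no)   # step 1: request A checks -> no duplicate found
check_b = already_exists(id_no)   # step 2: request B checks -> still no duplicate found
if not check_a:
    write(id_no)                  # step 3: request A writes
if not check_b:
    write(id_no)                  # step 4: request B writes the same data again

count = conn.execute("SELECT COUNT(*) FROM form_submissions WHERE id_card_no = ?",
                     (id_no,)).fetchone()[0]
print("rows stored:", count)      # 2 -- the check alone cannot prevent this
```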

CodePudding user response:

Quoting the 2nd-floor reply from selling fruit net:
"The deduplication rule didn't take effect ... from the database's point of view, the recommendation is to put a uniqueness constraint (unique index) on the table and let the database itself guarantee that the data is unique."


But these two submissions came from different people; I made one of them from my own phone. An ID-card-number deduplication check is configured, yet it had no effect and the same ID-card information was entered twice. The third party says the problem is that the submissions were only 101 milliseconds apart.

CodePudding user response:

Quoting the 1st-floor reply from minsic78:
"It's impossible that the database can't tell them apart ... it sounds like only the application enforces uniqueness, with no primary key or unique index in the database?"


Thank you. Could you explain what you mean by "only the application enforces uniqueness of the data" and "a primary key or unique index in the database"?

CodePudding user response:

Godoor
Quoting the 4th-floor reply (which quoted minsic78's 1st-floor reply):
"Thank you. Could you explain what you mean by 'only the application enforces uniqueness of the data' and 'a primary key or unique index in the database'?"


It means: if the database itself guarantees uniqueness with a primary key or unique index, duplicate data simply cannot appear. The duplication you are seeing now is a problem in the program and has nothing to do with the database, millisecond delays, or anything of the sort.
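
As a sketch of what "let the database guarantee uniqueness" looks like from the application side (hypothetical names again, Python/SQLite only for illustration): once the column has a unique constraint, the service can simply attempt the insert and treat a duplicate-key error as "this record already exists", instead of relying on a check-first query.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE form_submissions (id INTEGER PRIMARY KEY, id_card_no TEXT UNIQUE)")

def save_submission(id_card_no):
    try:
        conn.execute("INSERT INTO form_submissions (id_card_no) VALUES (?)", (id_card_no,))
        return True               # stored
    except sqlite3.IntegrityError:
        return False              # duplicate, however small the time gap was

print(save_submission("110101199001011234"))  # True
print(save_submission("110101199001011234"))  # False -- timing is irrelevant
```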

CodePudding user response:

Quoting godoor's 3rd-floor reply (which quoted selling fruit net's 2nd-floor reply):
"But these two submissions came from different people ... the third party says the problem is that the submissions were only 101 milliseconds apart."


Two different people submitting fits this even better: it is exactly the 1, 2, 3, 4 order I described.

CodePudding user response:

Put simply, their excuse is nonsense; the third-party company has a serious bug.

CodePudding user response:

If the database table had a primary key, this kind of problem wouldn't happen; the design is at fault.

CodePudding user response:

Just set a primary key! A timing problem like this can be avoided entirely by design!

CodePudding user response:

Quoting godoor's 3rd-floor reply (which quoted selling fruit net's 2nd-floor reply):
"But these two submissions came from different people ... the third party says the problem is that the submissions were only 101 milliseconds apart."


Put a unique constraint on the database table and you can stop worrying about the time difference.

CodePudding user response:

Why not do an idempotency check?
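
One common shape such an idempotency check could take, sketched with hypothetical names (the real form application is unknown): the page is issued a one-time token when it is rendered, and the server stores the token in a column with a unique constraint, so a double submission of the same form is recognised no matter how close together the requests arrive.

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE form_submissions (
                    id INTEGER PRIMARY KEY,
                    submit_token TEXT UNIQUE,   -- one token per rendered form
                    id_card_no TEXT)""")

def handle_submit(token, id_card_no):
    try:
        conn.execute("INSERT INTO form_submissions (submit_token, id_card_no) VALUES (?, ?)",
                     (token, id_card_no))
        return "accepted"
    except sqlite3.IntegrityError:
        return "duplicate submission ignored"

token = str(uuid.uuid4())                           # issued together with the form page
print(handle_submit(token, "110101199001011234"))   # accepted
print(handle_submit(token, "110101199001011234"))   # duplicate submission ignored
```

Note that a token like this only de-duplicates repeated submissions of the same form. For the case in this thread, where two different people submitted the same ID-card number, a unique constraint on the business field (as suggested earlier) is still the guard that matters.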