Online migration with no perceived cutover

Time:04-21

All current business scenarios, such as inter-group relationships, large accounts, family payment, and historical data, are moved through a one-time data migration to complete the full-network migration. Because the one-time data volume is large, it is recommended to run the migration at night, and to avoid running version-release database scripts during the migration window.

The current one-time migration moves the CB1.0 physical database and in-memory data, batched by city and by product, into the CB2.0 database. The CB1.0 data divides into the business library, the account-management library, and the in-memory billing data, which correspond in CB2.0 to the customer center, user center, account center, finance center, billing center, credit-control center, and so on. The physical database moves from Oracle to DRDS, and the in-memory database changes from standalone to distributed.

Drawing on the migration experience of the provincial branches, the migration uses bulk data operations to move the batch data from CB1.0 to CB2.0 in the shortest possible window.
With reference to the current onboarding process, and combining it with the provincial migration process, the offline migration roughly breaks down into the following steps: data extraction, data cleansing (multiple passes), data validation, lock (marking), relocation, audit, loading, routing update, and unlock (activation). The bulk processing of each step was analyzed, developed, and validated.
Data extraction:
• Requirement: users associated with a non-splittable unit (a group or an individual account) must be migrated together as a whole.
• Based on the provincial migration targets, delineate the user scope for the whole business (groups, large individual accounts, family payment, etc.) and extract the migration user data (partial migration); subsequent migration work uses this set of target users as its basis.
• Data validation produces a check-result table and per-customer detail records.
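As a minimal sketch of the whole-unit extraction requirement above (the field names `user_id` and `unit_id` are hypothetical stand-ins for the real group/account keys), users sharing a non-splittable unit can be grouped so they are always extracted together:

```python
from collections import defaultdict

def build_migration_units(users):
    """Group users by their owning unit (a group or an individual account).

    'unit_id' is a hypothetical field: users sharing the same unit_id
    cannot be split and must be extracted and migrated as one whole unit.
    """
    units = defaultdict(list)
    for u in users:
        units[u["unit_id"]].append(u["user_id"])
    return dict(units)
```

Extracting by unit rather than by individual user is what guarantees a group and all its member accounts land in the same migration batch.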
Data cleansing:
Data cleansing mainly includes basic data cleansing and the old-to-new model transformation of the migrated data. The conversion is done at the field level according to the conversion tables (field mapping rules) provided by each business center; afterwards, secondary cleansing is performed along the data dimension and the business dimension, to ensure the transformed tables can land directly in the new architecture.

Internally, cleansing splits into pre-cleansing and final cleansing, with account locking as the dividing point: pre-cleansing runs before the lock, and final cleansing runs after it. Pre-cleansing mainly exposes problems; final cleansing produces the final data.
Cleansing solution:
Pre-cleansing: considering the load impact on the production environment, cleansing is split into mirror-library cleansing and production cleansing, based on a mirror of the production database.
Mirror-library cleansing mainly finds bulk data-class and business-class violations, minimizing the cleansing that must run in production.
Considering resource contention in production, the production pre-cleansing may be done incrementally.
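An incremental pre-cleansing pass might look like the following sketch (the `modified` change timestamp and the shape of `clean_fn` are assumptions; the real implementation would read change timestamps or logs from the production database):

```python
def incremental_preclean(rows, last_watermark, clean_fn):
    """One incremental pre-cleansing pass: only rows modified after the
    last watermark are cleansed, reducing the load on production.

    'modified' is a hypothetical change timestamp on each row.
    Returns (cleaned_rows, new_watermark) for the next pass.
    """
    cleaned, newest = [], last_watermark
    for row in rows:
        if row["modified"] > last_watermark:
            cleaned.append(clean_fn(row))
            newest = max(newest, row["modified"])
    return cleaned, newest
```

Repeated incremental passes keep the final (post-lock) cleansing small, which shortens the locked window.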
Final cleansing:
Old-to-new model transformation:
1. Basic table remapping
For tables whose old and new models are almost identical, remapping only requires field-to-field mapping.
2. Horizontal and vertical table remapping
Converting a horizontal (wide) table into a vertical (narrow) table, or a vertical table into a horizontal one, requires special handling.
3. Merged table remapping
Several tables are combined into one, while also taking conversion efficiency into account.
4. Special table remapping
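The horizontal/vertical case (type 2 above) is the one that needs special handling; a minimal in-memory sketch of both directions, with hypothetical field names:

```python
def horizontal_to_vertical(row, key_field, value_fields):
    """Unpivot one wide (horizontal) row into (key, attribute, value) rows."""
    return [(row[key_field], f, row[f]) for f in value_fields if f in row]

def vertical_to_horizontal(narrow_rows, key_field="id"):
    """Fold (key, attribute, value) rows back into one wide row per key."""
    wide = {}
    for key, attr, value in narrow_rows:
        wide.setdefault(key, {key_field: key})[attr] = value
    return list(wide.values())
```

In the real migration this transformation would run inside the cleansing SQL or the remapping tool rather than row by row in memory, but the shape of the mapping is the same.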
Data validation:
• Data validation mainly covers routine checks (primary key, unique index, etc.) and business-dimension checks. The business dimension refers to the business-class limit checks already offered by the current online migration, such as in-flight repair orders and cross-region restrictions.
• Validation results: generate a check-result table and per-customer detail records. All validation rules are run in one pass for each user, to avoid fixing one problem only to surface another on the next run. The results store the related information, such as user_id and serial_number, and, if an order is in flight, key information such as the order number.
• See the validation scripts.
Validation scope:
1. Business class: in-flight orders;
2. Data class: primary-key and unique-index conflicts with the new production architecture;
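The run-every-rule-per-user approach described above can be sketched as follows (the two rules shown are hypothetical stand-ins for the real business- and data-class checks):

```python
def validate_user(user, rules):
    """Run every validation rule for one user in a single pass and collect
    all failures, so fixing one problem does not surface another on rerun.
    Each failure records key fields such as user_id and serial_number."""
    failures = []
    for name, passes in rules:
        if not passes(user):
            failures.append({
                "user_id": user["user_id"],
                "serial_number": user.get("serial_number"),
                "rule": name,
            })
    return failures

# Hypothetical rules standing in for the real checks.
RULES = [
    ("no_inflight_order", lambda u: not u.get("inflight_order")),
    ("has_serial_number", lambda u: bool(u.get("serial_number"))),
]
```

The returned failure records map naturally onto the check-result table and the per-customer detail records mentioned above.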
Lock:
To ensure the data stays static during the migration, the users and accounts being moved need to be locked. The user center, account center, and billing center provide the corresponding locking rules, and the migration tool completes the locking according to these rules.
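A minimal sketch of that locking step, assuming a hypothetical `state` field on each account and a rule callback supplied by the owning center; the real rules come from the user, account, and billing centers:

```python
def lock_accounts(accounts, lock_rule):
    """Mark accounts as locked per a center-provided rule.

    Already-locked accounts are skipped, so the step is safe to re-run
    if the migration tool is restarted mid-batch.
    """
    locked, skipped = [], []
    for acct in accounts:
        if acct["state"] == "LOCKED":
            skipped.append(acct["account_id"])
        elif lock_rule(acct):
            acct["state"] = "LOCKED"
            locked.append(acct["account_id"])
    return locked, skipped
```

The unlock (activation) step at the end of the migration is the mirror image, restoring the pre-migration state.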
Intermediate table remapping & data relocation:
After the data cleansing steps, the remapped intermediate tables conform to the cBSS2.0 data model, so the one-time migration can bulk-import them into the new architecture through a dedicated data remapping tool. Because validation against the new architecture is complex, the new-architecture table validation rules are moved forward into the data cleansing steps; the intermediate table data is therefore the real data to be migrated, and the follow-up steps can migrate directly from these tables.
• Oracle tables are migrated to DRDS through the DataX tool. DataX is wrapped with a further layer that dynamically generates the job configuration; an example invocation:
mydatax.sh tablename oracle_tns drds_tns
• DRDS-side scripts write the staged data into the production tables: the data is moved from the temporary tables into production by inserting directly into the production tables through the data-proxy layer; before the data lands in the production instance, an audit checks whether it already exists in production.
• This module outputs the following scripts:
Import script
DRDS production audit script
DataX start/stop scripts
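The audit-before-insert logic of the production audit script can be sketched as follows (a hypothetical in-memory stand-in: `existing_keys` represents keys already present in the DRDS production table, which the real script would query through the data-proxy layer):

```python
def audit_and_insert(existing_keys, staged_rows, key_field="user_id"):
    """Before staged rows are written into a production table, audit
    whether their key already exists there; conflicting rows are
    reported for manual handling instead of being inserted."""
    inserted, conflicts = [], []
    for row in staged_rows:
        key = row[key_field]
        if key in existing_keys:
            conflicts.append(key)
        else:
            existing_keys.add(key)
            inserted.append(row)
    return inserted, conflicts
```

Auditing before the insert keeps the bulk load idempotent: re-running the import after a partial failure reports the already-loaded keys as conflicts rather than creating duplicates.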


