I have a production server and a training server. On a monthly basis the training server needs to have its data refreshed from the production server. This consists of approximately 1000 tables, with a few hundred thousand rows in many of them.
I was wondering what would be the best way to accomplish this in Integration Services. Obviously I do not want to create a task for each of the 1000 tables, and the data needs to be copied as fast as possible (it is a substantial amount of data).
I was thinking of loading a table with all the table names and using a loop to dynamically work through them, but I don't want to use a SQL task to do a "SELECT * ... INTO" across a linked server, as that would be very inefficient.
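For illustration, this is the sort of driver table I have in mind (all names made up):

-- Hypothetical control table holding the list of tables to refresh
CREATE TABLE dbo.TablesToCopy
(
    SchemaName sysname NOT NULL,
    TableName  sysname NOT NULL
);

-- Populate it from the catalog so the list maintains itself
INSERT INTO dbo.TablesToCopy (SchemaName, TableName)
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';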
There are no real transformations needed, just copying many objects from one server to another.
There are other ways, but I can't help thinking they are not the best approach.
Any ideas on the best practice for this?
Thanks

|||

What about the Transfer Objects task? It does exactly what it says on the tin, and also exactly what you described above.
-Jamie

|||

Hi Jamie
Thanks for the assistance.
Yes, I did investigate the copy objects task. Sorry, I should have mentioned that my concern is that I do not want to copy all rows from these tables. Is there a dynamic SQL command one can use for the data extraction in this task?
Also, this list of over 100 tables may change, which would require me to change the package every time. Is there a way to dynamically drive this task from, say, a table holding the objects that need to be copied? Perhaps this is possible in the copy objects task, but I doubt it.
Thanks again for your help.

|||

Hi Jamie
I have realized that dynamically driving through the table list will not work with a Foreach Loop container and a Data Flow task: the source can use dynamic SQL, but the OLE DB destination cannot be dynamic, so I cannot change the destination along with the source.
It looks like the only way is to loop through using a SQL task with a linked server.
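In other words, each pass of the loop would hand an Execute SQL Task a statement along these lines (PRODSRV and ProdDB are placeholder linked server and database names):

-- @SchemaName and @TableName would really be mapped in from the loop variables
DECLARE @SchemaName sysname;
DECLARE @TableName  sysname;
DECLARE @sql        nvarchar(max);

SET @SchemaName = N'dbo';
SET @TableName  = N'SomeTable';

-- Assumes identical table definitions on both servers and no identity columns
SET @sql = N'TRUNCATE TABLE ' + QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName)
         + N'; INSERT INTO '  + QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName)
         + N' SELECT * FROM PRODSRV.ProdDB.' + QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName) + N';';

EXEC sp_executesql @sql;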
Any ideas?
Thanks

|||
The other thing that would worry me with only a partial data transfer is integrity. If you transfer only 100 rows per table, for example, what ensures that foreign key relationships will not be violated when you do not get matching data?
I don't think a totally generic and automated solution is feasible; the type of data, static vs. transactional, will influence how it is treated. If the tables themselves are dynamic, with tables being added and removed, you will need to know what each table is, and therefore how to treat it.
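A quick check against the catalog views will at least tell you whether foreign keys are a factor, something like:

-- Lists referencing/referenced table pairs for every foreign key in the database
SELECT
    OBJECT_NAME(parent_object_id)     AS ReferencingTable,
    OBJECT_NAME(referenced_object_id) AS ReferencedTable
FROM sys.foreign_keys
ORDER BY ReferencingTable;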
|||

Hi Darren

Thanks for the assistance. There are no foreign keys, so that should not be an issue.
I agree, I do not think it is feasible in SSIS in this situation. I have opted for snapshot replication instead, as it is easier to add and remove articles etc. I just thought there might be a cool way to do it in SSIS.
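For what it's worth, adding a table then comes down to a call roughly like this (the publication name and table are made up, and the publication itself has to exist first via sp_addpublication):

-- Run in the production (publisher) database
EXEC sp_addarticle
    @publication   = N'TrainingRefresh',   -- placeholder publication name
    @article       = N'Orders',            -- placeholder table
    @source_owner  = N'dbo',
    @source_object = N'Orders',
    @filter_clause = N'OrderDate >= ''20060101''';  -- optional row filter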
But thanks for the feedback!
Cheers

|||

Did you try using the SMO enumerator with the Transfer Objects task? This seems ideally suited, though I admit I don't fully understand what you're trying to do with your data.