Problems with inital refresh of Power BI Data Flow - Too demanding on data source

Time:01-20

I am faced with a challenge I would love to get some pointers for.

I have a couple of very large tables housed in a SQL Server instance that is also the primary production database for the application it supports.

I want to load 2 years' worth of historical data, after which I will be implementing incremental refresh. The problem is that when I tried to do the initial load, the end users of the application experienced timeouts and all the other things we do not want to see in production.

I am looking for a way to feed the dataflow in smaller increments of a month or less, or to provide the historical data separately via a .csv or in some other way.
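One way to implement the monthly-increment idea is to generate a list of month boundaries and emit one date-bounded extraction query per chunk, then run those chunks off-peak. A minimal sketch in Python; the table `dbo.FactSales` and date column `OrderDate` are hypothetical placeholders, not names from the original post.

```python
from datetime import date

def month_starts(first: date, last: date) -> list[date]:
    """Return the first day of each month from `first` through `last`, inclusive."""
    starts = []
    y, m = first.year, first.month
    while (y, m) <= (last.year, last.month):
        starts.append(date(y, m, 1))
        m += 1
        if m == 13:
            y, m = y + 1, 1
    return starts

def chunk_queries(first: date, last: date) -> list[str]:
    """Build one half-open [start, end) range query per month.

    Table and column names are hypothetical -- substitute your own.
    """
    starts = month_starts(first, last)
    # Append one extra boundary so the final month also gets an end date.
    tail = starts[-1]
    bounds = starts + [date(tail.year + (tail.month == 12), tail.month % 12 + 1, 1)]
    return [
        "SELECT * FROM dbo.FactSales "
        f"WHERE OrderDate >= '{lo:%Y-%m-%d}' AND OrderDate < '{hi:%Y-%m-%d}';"
        for lo, hi in zip(bounds, bounds[1:])
    ]

# Two years of history -> 24 monthly queries.
queries = chunk_queries(date(2021, 1, 1), date(2022, 12, 1))
```

The half-open ranges (`>= start AND < next_start`) avoid double-counting rows on month boundaries, and each small query holds locks on the production database for far less time than one two-year scan.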

Can anybody share some insight on how to go about this? I've tried researching this issue, but I've not found a way so far.

Thank you in advance!

CodePudding user response:

If you don't have the option to mirror your data, I would suggest a dirty read:

-- Read without taking shared locks; may return uncommitted ("dirty") rows
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

BEGIN TRANSACTION;

-- your sql script

COMMIT TRANSACTION;
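The two suggestions combine naturally: each monthly extraction can be wrapped in its own READ UNCOMMITTED batch. A hedged sketch of that wrapping as plain string composition (the `dbo.FactSales` query is a hypothetical placeholder); whether dirty reads are acceptable depends on your tolerance for seeing uncommitted rows.

```python
def as_dirty_read_batch(query: str) -> str:
    """Wrap a SELECT in a READ UNCOMMITTED batch, mirroring the snippet above.

    Note: the explicit transaction is optional for a single SELECT; the
    SET statement alone changes the session's isolation level.
    """
    return (
        "SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;\n"
        "BEGIN TRANSACTION;\n"
        f"{query}\n"
        "COMMIT TRANSACTION;"
    )

batch = as_dirty_read_batch("SELECT * FROM dbo.FactSales;")
```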