It's a simple job that copies data older than 15 days from Database A on-premises to Database B in Azure, then deletes the old data from Database A. A simple archive job. There's not much there to fail - and it runs every day - just not today.
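Conceptually, the package does something like the following (the table and column names here are illustrative, not from the actual package - SSIS handles the cross-server movement, so this is just the logic):

```sql
-- Hypothetical sketch of the archive job's logic.
-- Step 1: copy rows older than 15 days to the Azure destination
-- (in SSIS this is the OLE DB Source -> ADO NET Destination data flow)
INSERT INTO ArchiveTable
SELECT *
FROM   SourceTable
WHERE  CreatedDate < DATEADD(DAY, -15, GETDATE());

-- Step 2: delete the archived rows from the source
DELETE FROM SourceTable
WHERE  CreatedDate < DATEADD(DAY, -15, GETDATE());
```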
Well, let's have a look at the error messages in the Progress tab:
[ADO NET Destination ] Error: An exception has occurred during data insertion, the message returned from the provider is: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "ADO NET Destination" (16) failed with error code 0xC020844B while processing input "ADO NET Destination Input" (19). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
[OLE DB Source ] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
There now, isn't that informative?
Well, now that I have worked out what went wrong, it is - here are the important words in all that guff:
An exception has occurred during data insertion
It is trying to add data to the destination and fails, reporting a timeout.
The hexadecimal stuff doesn't really help me terribly much. It looks really specific - but when you Google error codes 0xC02020C4 and 0xC020844B, there are all sorts of suggestions: a network glitch (that old standby), memory settings, all sorts of things.
What I THINK happened was a problem delivering data to Database B in Azure. And the problem in my case is that the Azure database has been set to 100GB, and I'm trying to put more data than that into it!
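If you want to confirm this rather than guess, you can compare the space used with the configured limit by running something like this against the Azure database itself (standard system views, nothing specific to my setup):

```sql
-- Space actually used, in GB (8KB pages)
SELECT SUM(used_page_count) * 8 / 1024.0 / 1024.0 AS used_space_gb
FROM   sys.dm_db_partition_stats;

-- Configured maximum size, in GB
SELECT DATABASEPROPERTYEX(DB_NAME(), 'MaxSizeInBytes')
       / 1024.0 / 1024.0 / 1024.0 AS max_size_gb;
```

If the first number is at (or very near) the second, the timeout on insert makes a lot more sense.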
It's a Standard-tier Azure SQL database, so the maximum size is 250GB. Of course you can go much bigger than that, but it would involve paying Microsoft big bucks. Let's stick to the 250.
-- *** Note: this needs to be run against the MASTER database
ALTER DATABASE [Archive_2019Q34] MODIFY (MAXSIZE = 250 GB);
I suppose that if Azure databases grew of their own accord, I would be bitching about how they grew so big they cost a fortune. It would be nice if SSIS gave us something more meaningful, though.