Wednesday, March 7, 2012

Error 1 Memory error: The operation cannot be completed because the memory quota estimate exceeds the available system memory

I get the following error when trying to process a cube on the server.

Error 1 Memory error: The operation cannot be completed because the memory quota estimate exceeds the available system memory. 0 0

It runs on Microsoft Windows NT 5.2 (3790) with the June SQL 2005 Developer release. It has 4 GB of RAM.

This error occurs when trying to process cubes of any size.

Hi,

I am receiving the same error when I try to process a specific measure group. This measure group has 13 partitions, each partition containing approximately 26 million rows of data.

I have read this Microsoft article, http://support.microsoft.com/kb/914595/, which details the same error message for processing large dimensions; the solution there is to apply Service Pack 1. I've talked to our DBA and he confirmed that we have SP1 applied. My problem is slightly different, since I am processing measure groups.

The exact error message that I am receiving is [Analysis Services Execute DDL Task] Error: Memory error: The operation cannot be completed because the memory quota estimate (17010MB) exceeds the available system memory (13308MB).

Any advice would be much appreciated.

Thanks.

Dev88

|||

Try searching this forum for previous posts reporting similar errors. For instance:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=334677&SiteID=1

The problem is that your machine doesn't have enough memory for Analysis Services to complete the processing operation. Before it starts processing, Analysis Services tries to estimate how much memory it needs to complete the operation. The error is raised if there is a shortage of memory.

Things to try:

1. Process objects one by one, not in parallel in a single transaction.
2. Check how much memory you have on your machine and whether you can free some of it for AS to complete the processing.
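A minimal sketch of option 1, assuming hypothetical object IDs (MyDatabase, Dim Customer): an XMLA batch that forces sequential processing by setting MaxParallel to 1.

```xml
<!-- Hypothetical IDs; substitute those of your own database and objects.
     MaxParallel="1" makes the jobs inside run one at a time. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel MaxParallel="1">
    <Process>
      <Object>
        <DatabaseID>MyDatabase</DatabaseID>
        <DimensionID>Dim Customer</DimensionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
</Batch>
```

You can add one Process element per object inside the same Parallel element; with MaxParallel="1" they still execute sequentially within a single transaction.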

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Hi Edward,

Thank you for your reply. I had already looked at that other post and tried to process the objects one by one, with no success. The error I received still stated that the memory quota estimate of 17 GB exceeded the available memory of 13 GB. That is, regardless of whether I processed objects one at a time or in parallel, Analysis Services still produced the same 17 GB memory quota estimate.

In any case, we have a 64-bit machine with 20 GB of memory. In order to process these partitions I had to change some of the Analysis Services properties. I changed OLAP \ Process \ BufferMemoryLimit, OLAP \ Process Plan \ MemoryLimit, and Memory \ TotalMemoryLimit all to 90 percent. All of these properties govern the amount of memory that Analysis Services is allowed to use when processing dimensions and partitions, specifically MemoryLimit and BufferMemoryLimit. Only then was I able to process these partitions.
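For reference, these properties can be edited on the server's Advanced Properties page in SSMS or directly in msmdsrv.ini. A sketch of the corresponding ini fragment, with the element nesting assumed from the property paths above; values from 0 to 100 are interpreted as a percentage of physical memory:

```xml
<!-- Fragment of msmdsrv.ini (assumed nesting per the property paths above).
     Values of 100 or less are percentages of total physical memory. -->
<ConfigurationSettings>
  <Memory>
    <TotalMemoryLimit>90</TotalMemoryLimit>
  </Memory>
  <OLAP>
    <Process>
      <BufferMemoryLimit>90</BufferMemoryLimit>
    </Process>
    <ProcessPlan>
      <MemoryLimit>90</MemoryLimit>
    </ProcessPlan>
  </OLAP>
</ConfigurationSettings>
```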

This Microsoft article, http://support.microsoft.com/kb/914595/, deals with this same error message for processing dimensions and states that Analysis Services incorrectly estimates the amount of memory needed to process the dimension. Is the same true for processing measure groups? I have processed cubes with 4 partitions each containing more than 40 million rows of data. The most recent cube that I processed contained 13 partitions with approximately 26 million rows per partition, and only then did I receive the error. As I said earlier, I increased the amount of memory available to AS and only then was I able to process these partitions. My concern is that when I add more data to my cube, I will receive this error again.

So here are my questions:

1. Is this a problem with Analysis Services incorrectly estimating the memory quota?

2. Is there any way to change how AS estimates its memory usage?

3. Is the only solution to purchase more memory?

Any advice would be much appreciated.

Thanks.

|||

Some of the properties you suggest changing would have little effect on processing. For some reason I cannot get to the KB article you refer to.
The one property that probably had the most effect in your situation was BufferMemoryLimit. You can read about it and about other properties in http://download.microsoft.com/download/4/f/8/4f8f2dc9-a9a7-4b68-98cb-163482c95e0b/SSASProperties.doc. I am not sure I would suggest using this option in your case.

As for the options you have:

Let me try to rephrase #1 in my list above: process the smallest possible objects one by one ...

In your case, processing the entire measure group internally resolves into processing all of its partitions in parallel. Of course these partition-processing jobs all start at the same time and compete for memory.
You should try issuing commands that process partitions explicitly, making sure you control how many partitions are processed in parallel.

Analysis Services is not perfect in estimating the amount of memory needed for processing, but its estimate is pretty close. And the estimation logic is there for a reason: it prevents you from processing for hours before running into a memory shortage and having processing abort.

In your case, I don't think you need more memory.

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

The document that you suggested is the one I read prior to modifying my AS properties. The property that actually made it possible to process the partition was an undocumented property, OLAP \ Process Plan \ MemoryLimit. I agree that increasing the amount of memory available was probably not the best solution, but it seemed to be the only one that worked. I'm not sure what you mean by 'issuing commands that process partitions explicitly to control the number of partitions being processed in parallel', but the steps I am following to control how many partitions are processed in parallel are manual.

In SQL Server Management Studio, I browse down to the partitions directory of my cube. I then right-click the partition I would like to process and click Process. By processing this single partition, does AS perform its estimate as if all of the partitions were being processed in parallel? Or is there another method I can use to control how many partitions are being processed?

Once again I appreciate all your help and I look forward to hearing your response.

Thanks,

Dev88

|||

Try the following:

Navigate in SQL Server Management Studio to the Partitions node in your measure group. The "Summary" page on the right will list all of your partitions.

Multi-select the partitions you would like to process and launch the Processing dialog from the right-click menu on the summary page. At this point you will see multiple partitions in the processing dialog window.

Click the "Change Settings" button and you will see the dialog window where you can choose the processing order. There you can select how many partitions you would like to process in parallel.

Closing the "Change Settings" dialog gets you back to the Processing dialog; there, click the Script button at the top to script the processing command into an SSMS window.

Take a look at the generated script. It has all of your partitions listed. The MaxParallel attribute of the Parallel element specifies how many partitions will be processed in parallel.

Here is a whitepaper about processing: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsql90/html/sql2k5_asprocarch.asp

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

Hello, I tried what you suggested. I also reviewed that white paper again since I had previously read it a couple of months ago.

I only selected one partition. On the page after clicking "Change Settings", I selected one partition to process in parallel. After closing that dialog to get back to the processing dialog, I clicked the Script button to open the XMLA script in SSMS. Here is the script:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ErrorConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <KeyErrorLimit>-1</KeyErrorLimit>
    <KeyNotFound>IgnoreError</KeyNotFound>
    <NullKeyNotAllowed>IgnoreError</NullKeyNotAllowed>
  </ErrorConfiguration>
  <Parallel MaxParallel="1">
    <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <Object>
        <DatabaseID>Cube009</DatabaseID>
        <CubeID>B Data</CubeID>
        <MeasureGroupID>P Warehouse</MeasureGroupID>
        <PartitionID>P Warehouse</PartitionID>
      </Object>
      <Type>ProcessFull</Type>
      <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
  </Parallel>
</Batch>

As you can see, the Parallel element has MaxParallel="1". I am still receiving the error:

<return xmlns="urn:schemas-microsoft-com:xml-analysis">
  <results xmlns="http://schemas.microsoft.com/analysisservices/2003/xmla-multipleresults">
    <root xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
      <Exception xmlns="urn:schemas-microsoft-com:xml-analysis:exception" />
      <Messages xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
        <Error ErrorCode="3238199299" Description="Memory error: The operation cannot be completed because the memory quota estimate (17017MB) exceeds the available system memory (13308MB). " Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
        <Error ErrorCode="3240034325" Description="Errors in the OLAP storage engine: An error occurred while processing the indexes for the P Warehouse 1 partition of the P Warehouse measure group of the B Data cube from the Cube009 database." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" />
      </Messages>
    </root>
  </results>
</return>

Is there anything else that I can try?

Thanks.

|||

At this moment I am thinking your partition may simply be too big. You have 20 GB of memory available, and I can hardly imagine a partition that cannot be processed with that much memory. Try partitioning your data further.
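Partitioning the data further means creating additional, smaller partitions in the measure group. A sketch of an XMLA Create command that does this with a query binding; the new partition ID, data source ID, fact table, and date predicate are all hypothetical and would need to match your own schema:

```xml
<!-- Hypothetical example: carve off one slice of the fact data into its own
     partition. IDs, table, and WHERE clause are placeholders. -->
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ParentObject>
    <DatabaseID>Cube009</DatabaseID>
    <CubeID>B Data</CubeID>
    <MeasureGroupID>P Warehouse</MeasureGroupID>
  </ParentObject>
  <ObjectDefinition>
    <Partition xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ID>P Warehouse 2007H1</ID>
      <Name>P Warehouse 2007H1</Name>
      <Source xsi:type="QueryBinding">
        <DataSourceID>MyDataSource</DataSourceID>
        <QueryDefinition>
          SELECT * FROM FactWarehouse WHERE DateKey &lt; 20070701
        </QueryDefinition>
      </Source>
      <StorageMode>Molap</StorageMode>
    </Partition>
  </ObjectDefinition>
</Create>
```

Each partition's query must select a disjoint slice of the fact table so no rows are duplicated or dropped across partitions.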

It is also possible that you've changed some server properties and the server's estimation logic is now skewed. Go back to the server defaults.

And lastly, it is possible there is a bug in Analysis Services, so please try contacting customer service.

Edward Melomed.
--
This posting is provided "AS IS" with no warranties, and confers no rights.
