Wednesday, November 18, 2009 20:36
Just started using the November CTP today. First thing, I pointed it at our Oracle data warehouse. Two errors:
1. "Data overflow converting to the data type for table" on what appears to be an Oracle DATE column.
2. "Memory error: Allocation failure: Not enough storage is available to process this command. The operation has been cancelled." when attempting to download a relatively large table.
The client is a single-CPU VM with 2 GB of memory.
Wednesday, November 18, 2009 21:05
Re: issue #1, depending on the version of Oracle you are connecting to, you may want to use a different provider.
PowerPivot defaults to the Microsoft OLEDB Provider for Oracle; this setting can be overridden in the Advanced properties of the connection page to use an Oracle provider instead (note that you will need to install that provider on the client machine first).
Re: issue #2, this typically occurs when the client is under memory pressure from processing a large dataset; you can try specifying a filter (the Preview & Filter option in the Table Import Wizard) or allocating more RAM to your image.
Wednesday, November 18, 2009 22:22
Regarding issue #1,
I have isolated the problem to the legitimate Oracle date value 1/1/0001, which is essentially Oracle date 0, being returned to PowerPivot.
With the Microsoft OLEDB driver, the error is as originally posted. With the Oracle OLEDB driver, a different OLEDB error is returned.
If I filter out Oracle date 0 by issuing a SQL query instead of selecting the table, both drivers return data to PowerPivot.
Interestingly, if I choose the table and preview it, I can see the date value in question. The error occurs only during the import into PowerPivot.
Wednesday, November 18, 2009 22:51
As to issue #2, I have upped the memory on the VM to 4 GB. Same error after loading about 2.5 million rows of data from one table.
Thursday, November 19, 2009 01:10
Out of curiosity, how large is this table in its raw data format? While PowerPivot has some great compression capabilities, it's still quite possible that you just don't have enough memory. For example, I imported one table with more than 44 million rows, but because it was mostly ints, the final Excel workbook was only 2 MB. For another table where I had imported fewer than 12 million rows, because there were a lot of text columns, the overall file was 5 GB. For the latter, I ultimately had to give my Hyper-V VM 8 GB of memory.
Thursday, November 19, 2009 01:43
The tables I'm working with come from our data warehouse, and most of the columns are numeric. I have been watching the performance monitor and don't see used memory approaching the current 4 GB limit. Some of these tables are wide. I will try a smaller set to see how big the resulting file is.
I am also seeing what I consider poor performance, though I have nothing to compare it against: almost 5 minutes to download 1 million rows from the Oracle server. I've tried the Oracle OLEDB driver, the Microsoft OLEDB Provider for Oracle, and the Oracle native client driver; all take about the same time. That's too slow :(
Thursday, November 19, 2009 09:10
For the performance issue:
I can empathize with this. Note that PowerPivot does not ship with any special drivers of its own; it simply uses the OLEDB drivers already installed. As you can tell from what you have tried, the Oracle-to-SQL performance problem persists regardless of driver.
In general, I would suggest the Microsoft Connectors for Oracle and Teradata by Attunity. Their performance moving data from Oracle to SQL Server is extremely fast; more information can be found in "Moving Large Amounts of Data Between Oracle and SQL Server: Findings and Observations". While this is great for the data warehouse scenario, the Microsoft connectors are not available for PowerPivot to use at this time.
One suggestion, though not very straightforward: transfer your data from Oracle to a SQL Server staging database via the Microsoft Connectors, then connect PowerPivot directly to SQL Server to download the data, if performance is your primary concern.
Another possibility is to download into PowerPivot for Excel only a small portion of the data you actually need (e.g., create a view against your table that does select top 1000) and build your model. Then upload the workbook to SharePoint and use the Data Refresh option to load the entire dataset once you alter the view to return all of the data. Admittedly, this just shifts the burden from downloading in Excel to downloading in SharePoint, but if it's in a data center, the SharePoint system will potentially have faster network connectivity to your Oracle server.
Back to the file size: yes, if you can report how big the resulting file is, that would help us better understand the issues here.
This posting is provided "AS IS" with no warranties, and confers no rights
Thursday, November 19, 2009 16:25
I think the problem is that we detect the column type as date and set it accordingly, but the value returned is Oracle date 0, which cannot be converted.
We will investigate how to handle this case. For now, you can filter out the value and proceed with your solution.
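The suggested workaround, filtering out the sentinel value before import, can be sketched in Python. This is purely illustrative: the row layout, column name, and filter function are hypothetical, not PowerPivot or Oracle APIs.

```python
from datetime import date

# Hypothetical sentinel: Oracle "date 0", surfaced as 1/1/0001.
SENTINEL = date(1, 1, 1)

def filter_sentinel_dates(rows, date_col):
    """Drop rows whose date column holds the 1/1/0001 sentinel,
    mirroring the advice to filter the value out before import."""
    return [r for r in rows if r[date_col] != SENTINEL]

rows = [
    {"key": -1, "order_date": date(1, 1, 1)},       # "no date" member
    {"key": 1001, "order_date": date(2009, 11, 18)},
]
print(filter_sentinel_dates(rows, "order_date"))    # only key 1001 survives
```

In the real workaround the equivalent filter would live in the SQL query text or in the Preview & Filter step of the Table Import Wizard rather than in client code.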
If you are using 32-bit Excel and 32-bit PowerPivot, you may hit memory pressure as soon as you reach the ~1 GB mark.
Please give 64-bit a try; it doesn't have this memory limitation.
Thursday, November 19, 2009 16:51
The actual date value Oracle returns is 1/1/0001; that is their equivalent of date 0. We use the Windows stack for reporting and querying all the time with the standard set of connectivity drivers and have had no issues. In fact, the Preview & Filter panel shows the date just fine.
It looks to me like the issue is somewhere within the compression algorithms.
Thursday, November 19, 2009 17:01
Thanks, Kyle.
Are you using 32-bit or 64-bit? If you use 64-bit, that could resolve the memory pressure issue.
Regarding the date, the issue is happening during the datetime conversion at import time.
Is there a reason the date value is set to 1/1/0001?
Thursday, November 19, 2009 17:42
Hi Deva,
I am using the 32-bit build, as that is what any business user would have. The date in question comes from a date dimension in the data warehouse. The date key value -1 ("no date") has the date value 1/1/0001. As the dates in question form part of a unique constraint, a null value is not allowed. Since the date has to be set to something, the smallest date .NET allows seemed the best candidate.
Thursday, November 19, 2009 23:30
Sorry, I incorrectly clicked the proposed answer.
1) For 32-bit there is a limitation on the memory address space. As far as I know, the maximum is about 1 to 1.1 GB. If you need to go beyond that, I would recommend a 64-bit machine with the 64-bit builds.
2) For datetime, I think there is a limitation on how far back it can go. I don't think it supports Jan 1, 0001.
I can find out how far back it goes.
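One plausible explanation for the lower bound (an assumption on my part, not confirmed anywhere in this thread): OLE Automation dates count days from the epoch 1899-12-30 and cannot represent dates before the year 100, so 1/1/0001 falls far outside their valid range. A quick Python check of what 1/1/0001 would look like expressed as an OA date:

```python
from datetime import date

OA_EPOCH = date(1899, 12, 30)  # OLE Automation date zero

def to_oadate_days(d):
    """Whole-day part of an OLE Automation date for a given date."""
    return (d - OA_EPOCH).days

# 1/1/0001 maps to a huge negative day count, outside the range
# the OLE Automation date format supports.
print(to_oadate_days(date(1, 1, 1)))   # -693593
```

If the import path converts through OA dates at some point, this would explain why the sentinel value fails while larger dates work fine.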
Friday, November 20, 2009 07:55
Hey Kyle,
For the x86 address space: even if your machine has 4 GB available, only 2 GB is theoretically available to 32-bit Excel. In addition, due to various overheads of Excel and the PowerPivot add-in, about 800 MB of virtual address space is used up by the time you launch the PowerPivot window, so with a practical limit of 1.8 GB of virtual memory there is not much headroom. As you suggested, removing columns (especially those with large strings) may help, as would of course limiting the rows coming in. During the import process, PowerPivot holds a buffer of 1-2 million rows of uncompressed data, so while the eventual space taken will be less, there is an additional memory requirement during the import itself.
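A rough back-of-envelope for that uncompressed import buffer (the column count and bytes per value below are illustrative assumptions, not PowerPivot internals):

```python
# PowerPivot holds roughly 1-2 million rows of raw data during import.
# For a wide warehouse table, assume 50 columns at ~8 bytes per value.
rows = 2_000_000
cols = 50            # assumed width of a "wide" table
bytes_per_value = 8  # assumed average raw value size
buffer_bytes = rows * cols * bytes_per_value
print(buffer_bytes)  # 800000000, i.e. ~0.75 GiB before compressed storage
```

Against a practical 1.8 GB virtual address limit, a transient buffer of that size on top of Excel's own overhead would plausibly trigger the allocation failures reported earlier in the thread.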
Re: the datetime issue, we're investigating this and can reproduce the issue with SQL Server's datetime2 as well. So whatever Oracle can do, we can do as well. <g>
Program Manager, Analysis Services