# XLSQL for using Excel (XLS) as datasource in VC

Hi Benjamin,
Hope you are having a good day. I know it's been a while since you worked on this piece ("DATE issues when using BI JDBC Excel datasource in VC"), but we are trying to do the same thing you did: use xlSQL to connect to an Excel file as a datasource for VC modeling.
We want to use the BI Java connector. I have loaded the JAR files, but when we get to Manage Connections, we are not sure what to enter for the Driver, URL, Schema, Username and Password.
What I now see is that the Connection Test (for the BI JDBC system) works fine with the following:
Driver: com.nilostep.xlsql.jdbc.xlDriver
Connection URL (Unix box): jdbc:nilostep:excel:/reports/spec (this is the folder where I have the xls file)
FixedSchema: (should my excel file name go in here?)
FixedCatalog: (should my excel sheet name go in here?)
When I do all of this and search for TABLES in VC on that system, it says None found. Any light shed on this would be very helpful.
Thanks,
BR

Hi,
How To… Configure a BI JDBC System for Visual Composer, Version 1.03 (March 2006)
Follow all the steps... Some remarks about the steps in this document:
Step 7: add all additional .jar files that are in the /lib folder of the xlsql driver package
Steps 11 - 14: ... skip these. After step 15, restart the whole J2EE cluster; this is really needed to load the .jar files.
Step 15: although optional and not part of the configuration you need to get VC working, it is handy to set this up, as it allows you to use the JDBC test page for debugging etc.
The only two parameters you need to configure in this step are:
Driver: com.nilostep.xlsql.jdbc.xlDriver
Connection URL: jdbc:nilostep:excel:/reports/spec/
...Now restart the j2ee cluster...
After the restart you can use the following url to see if the driver works: http://<hostname>:<port>/TestJDBC_Web/TestJDBCPage.jsp
To continue the configuration and allow VC to access the driver, you need to configure a System object in the portal...
Step 26: Again only fill in the following:
Driver Class name: com.nilostep.xlsql.jdbc.xlDriver
Connection URL: jdbc:nilostep:excel:/reports/spec/
Steps 29, 30 and 34, 35: the User Mapping Type can be left untouched.
Additional step (do this after step 33): to make the system visible in VC, you must set the portal permission for the users that use VC to Read + End User. When the model is used at runtime later, the end user also needs Read + End User permission to use the system object. We used the built-in group Authenticated User for this and set the permissions on this group to Read + End User.
This should do the trick...
FAQ 1 - After uploading a new (or overwriting an existing) XLS file to the folder, the changes are not visible. How do I solve this?
ANSWER 1 - Restart the JDBC connector via /nwa > Manage > Applications > (search for jdbc) > Stop / Start.
FAQ 2 - After deleting an XLS file, the data is still queryable, even after a restart of the JDBC driver.
ANSWER 2 - This is some kind of caching bug; deleted files are only really gone when you restart the J2EE engine.
FAQ 3 - After a restart of the JDBC driver, the first query fired in VC returns an error.
ANSWER 3 - This is because on first access the XLS files are read and written into an in-memory HSQLDB database, which takes more time. After that, all calls are made against this in-memory copy of the XLS data.
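The mechanism described in ANSWER 3 can be sketched in miniature with Python's built-in sqlite3 (this is only an illustration of the pattern, not actual xlSQL code; the table name and rows are invented): the file data is loaded once into an in-memory SQL engine, and every later query runs against that copy, which is also why deleted files linger until a restart.

```python
import sqlite3

# Stand-in for the cell values the driver would read from an XLS file
# on first access.
rows = [("ACME", 1200), ("Globex", 800), ("Initech", 450)]

# Like the driver's in-memory HSQLDB: load the data once...
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spec (customer TEXT, amount INTEGER)")
conn.executemany("INSERT INTO spec VALUES (?, ?)", rows)

# ...then all queries hit the cached copy, not the file on disk, so
# changes to the file stay invisible until the cache is rebuilt.
total = conn.execute("SELECT SUM(amount) FROM spec").fetchone()[0]
print(total)  # 2450
```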
Hope this helps you...
Cheers,
Benjamin Houttuin
Edited by: Benjamin L.F. Houttuin on May 26, 2011 8:04 PM

### Similar Messages

• Error using Excel as a DataSource for Report Builder 3.0 - ODBC connection

Hi,
I'm getting the error message below while trying to use Excel as a datasource within Report Builder 3.0. I can see the columns and rows but am unable to display/run the report. I am using 32-bit Excel and have the driver and user DSN created under c:/windows/syswow64/odbcad32.
ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

Hi Cherise,
I have commercial experience in migrating Crystal Reports to SSRS. When performing any type of migration it is always wise to revisit the business requirements; it is likely that they have changed since the Crystal reports were initially developed.
I've done some research for you and the following link looks quite impressive and offers a cheap trial for a direct migration. Please tell me how you get on:
http://www.sqlcircuit.com/2013/08/ssrs-how-to-create-report-using-excel.html
I emphasize again: I have commercial experience of many migration projects, and it would be very unwise not to revisit the business requirements as part of the migration process.
Kind Regards,
Kieran.

• Using Excel as a Datasource

I created a report that will use Excel as a datasource. The report runs fine from my desktop. I would like to place this report on the Business Objects Server. Can anyone tell me what I would need to do in regards to the excel file.
In addition, when I moved the Excel file to a shared folder and reconfigured the system DSN, the report would not run. I received an error message: could not decrypt data. Any help would be appreciated. Thanks.

Hi
Put the excel file on a network drive which is accessible by the server.
Create a Crystal Report using the Access/Excel DAO connection based on the above excel file.
Save it to the enterprise.
Hope this helps!!
Regards
Sourashree

• Any workaround for using Excel  2000 in 2004s ???

Greetings.
1. We have Excel 2000 and Windows 2000 standardized in the company and are not upgrading to Windows XP/Vista in the immediate future.
2. Since the new 2004s BI tools do not support downloading to Excel 2000, we are looking at other options, such as continuing to use the 3.5 Query Designer. But from the 3.5 Query Designer, the "Display Query on the Web", Publish to Portal and BEx Broadcaster links are not working or are erroring out.
Greatly appreciate your tips and suggestions for this scenario.
Thanks.
PK

Hello PK,
Do not try to find a workaround to use Excel 2000 with NW 2004s; Excel 2000 is not supported by SAP for NW 2004s.
But you should be able to continue using the BW 3.5 tools. Please post the errors you are getting so that members can help you.
Regards,
Sheik Bilal

• Using excel file as datasource in crystal report asking database logon

hai all,
We are using a plain Excel file in Crystal Reports to develop some reports, and it is working fine. The issue is that when we use parameters and then use the Crystal report in Live Office, it asks for database logon details. We are not using any database; it is a plain Excel file, without even an ODBC driver. Please suggest what credentials we have to give.
Thanks and regards.
suresh.p

It looks like your Live Office settings are set to "On Demand".
Give your BO (InfoView) logon credentials and try it out.
I'm Back

• Best practises for using Excel functions on Power Pivot Data

Hi
How do you suggest performing calculations on cells in a Power Pivot table? Obviously the ideal approach is to use a DAX measure, but given that DAX doesn't have every function, is there a recommended way of adding an "extra" (i.e. just adjacent) column to a pivot table? (In particular I want to use BETA.INV.)
I could imagine one option of adding some VBA that would update the extra column as the pivot table itself refreshes (and adds or removes rows).
thanks
sean

Hi Sean,
I don't know your exact requirement regarding this issue; maybe you can share some sample data or a scenario with us for further investigation. As we know, if we need to add an extra column to the Power Pivot data model, we can directly create a calculated column.
There are some differences between Excel and DAX functions; here are the lists for your reference:
Excel functions (by category):
http://office.microsoft.com/en-us/excel-help/excel-functions-by-category-HA102752955.aspx
DAX Function Reference:
http://msdn.microsoft.com/en-us/library/ee634396.aspx
Hope this helps.
Regards,
Elvis Long
TechNet Community Support

• The connection string for coded UI Data driven test using excel data source is not working

Hello,
I am using the visual studio 2012 coded UI test, i added the following connection strings to connect to an excel data source:
[DataSource("System.Data.Odbc", "Dsn=Excel Files;Driver={Microsoft Excel Driver (*.xls)};dbq=C:\\Users\\shaza.said.ITWORX\\Desktop\\Automation\\On-track Automation Framework\\On-track_Automation\\Automation data file.xls;defaultdir=.;driverid=790;maxbuffersize=2048;pagetimeout=5;readonly=true", "Sheet1$", DataAccessMethod.Sequential), TestMethod]
[DataSource("System.Data.Odbc", "Dsn=Excel Files;dbq=|DataDirectory|\\Automation data file.xls;defaultdir=C:\\Users\\shaza.said.ITWORX\\Desktop\\Automation\\On-track Automation Framework\\On-track_Automation\\Automation data file.xls;driverid=1046;maxbuffersize=2048;pagetimeout=5", "Sheet1$", DataAccessMethod.Sequential), TestMethod]
But i get the following error:
Error details: ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified"
Thanks,
Shaza

Hi shaza,
From the error message, I suggest you refer to Adrian's suggestion and check that the data source connection string is correct.
In addition, you can refer to the following about how to create a data-driven coded UI test:
http://msdn.microsoft.com/en-us/library/ee624082.aspx
Or you can also try to use a configuration file to define a data source for the coded UI test.
For example:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <section name="microsoft.visualstudio.testtools"
             type="Microsoft.VisualStudio.TestTools.UnitTesting.TestConfigurationSection, Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
  </configSections>
  <connectionStrings>
    <add name="ExcelConn"
         connectionString="...;driverid=790;maxbuffersize=2048;pagetimeout=60;"
         providerName="System.Data.Odbc"/>
    <add name="ExcelConn1"
         connectionString="...;driverid=790;maxbuffersize=2048;pagetimeout=60"
         providerName="System.Data.Odbc"/>
  </connectionStrings>
  <microsoft.visualstudio.testtools>
    <dataSources>
      <add name="..." connectionString="ExcelConn" dataTableName="Addition$" dataAccessMethod="Sequential"/>
      <add name="ExcelDS_Multiply" connectionString="ExcelConn1" dataTableName="Multiply$" dataAccessMethod="Sequential"/>
    </dataSources>
  </microsoft.visualstudio.testtools>
</configuration>
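Since the advice above is to check the connection string first, one quick sanity check is to split it into its key/value pairs and confirm that entries such as Dsn and Driver are spelled exactly as they appear in odbcad32 (a DSN name mismatch, or a 32-bit DSN with a 64-bit test host, is the usual cause of error IM002). A minimal sketch; the string below is a shortened, hypothetical example, not the exact one from the post:

```python
def parse_odbc(conn_str):
    """Split an ODBC connection string into a {key: value} dict.

    Naive split on ';': it does not handle semicolons inside
    brace-quoted values, which real ODBC syntax allows.
    """
    pairs = {}
    for part in conn_str.split(";"):
        if "=" in part:
            key, _, value = part.partition("=")
            pairs[key.strip().lower()] = value.strip()
    return pairs

cs = "Dsn=Excel Files;Driver={Microsoft Excel Driver (*.xls)};readonly=true"
parsed = parse_odbc(cs)
print(parsed["dsn"])     # Excel Files
print(parsed["driver"])  # {Microsoft Excel Driver (*.xls)}
```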
Best Regards,

• Need flexibility on using Excel as datasource

I found this tutorial: http://blogs.oracle.com/dataintegration/2010/03/connecting_to_microsoft_excel.html
Is it just me, or is there simply no flexibility if you need to predefine the data area (named range) beforehand? In most use cases you'd want to automate ETL, so the data source must support a varying number of entries (rows) at the very least.
Is there any work around for this?
TIA

For Excel, these are the steps.
1. Create a DSN in ODBC.
2. Create a connection in the data server (Topology) using the Sun ODBC-JDBC bridge driver.
3. Reverse-engineer the Excel workbook using selective reverse. When handling multiple sheets, ODI defines each of them as a table (datastore), with names populated from the sheet names: Sheet1$, Sheet2$ and so on.
Now you can use them as source datastores in an interface, as you would with any other RDBMS datastore.

• Using Excel as  Datasource in Crystal Reports 8.5

I have never used Excel as a source for Crystal data and I am having a bit of an issue. I have an Excel file on my desktop that I am using as the datasource (eventually it will be on a network drive). I set up all my reports as blank reports, so when it brings up the Data Explorer pop-up I choose ODBC --> Excel Files and choose the file. I then add the sheet to the report, close the Data Explorer window and start designing my report. I drag the fields I want onto it, but when I preview I don't see any data at all. I have also tried it with the Report Wizard; no go there either. Is there something special I need to do to get the data to show up? Does it need to be formatted a certain way? Or is it just easier to convert it over to an Access database? Any help anyone could suggest would be wonderful!
Thanks
Jami Benson

I just checked something and figured out what I was doing wrong myself. It seems I was choosing the wrong sheet. One of the sheets, Sheet 1, appears twice in the list: once with the name of the whole spreadsheet, and once as just Sheet 1. I was using the one with the name, which has the headings but no data. The other has the data and headings, just no name. Once I chose it, all was well.
Jami

• Leading zeros removed from display when sending an .xls attachment via email (00444 displayed as 444)

I am facing a problem with the code for sending an .xls attachment via email. A field value contains leading zeros, but Excel automatically removes them from the display (i.e. 00444 will be displayed as 444). Kindly guide.

Hi Chhayank,
the problem is not the exported XLS. If you have a look inside with Notepad or something similar, you will see that your leading zeros are exported correctly. Excel's settings cause this problem; it is all about how the document is opened. If you use the import assistant you will have no problems, because there are options available for how to handle the different columns.
Another solution might be to get familiar with the ABAP2XLS project. I recall there is a method implemented there that will help you solve this problem, but that is not a five-minute job
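When you control the file generation, another common trick is to emit the value as a text formula so Excel does not coerce it to a number. A minimal Python sketch (the column names are invented for illustration, and this produces a CSV rather than a true XLS):

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Material", "Plant"])
# Wrapping the value as ="00444" makes Excel evaluate it as a text
# formula, so the leading zeros survive display. The csv module
# escapes the inner quotes, producing: "=""00444""",1000
writer.writerow(['="00444"', "1000"])
content = buf.getvalue()
print(content)
```

Reading the file back with a CSV parser confirms the field round-trips intact as the string `="00444"`.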
~Florian

• Excel xls spreadsheets no longer open from links in Firefox 3.6.8 for Mac OS 10.6.4

My website contains links to MS Excel xls documents. In the past, I have always been able to open those Excel docs from within Firefox. But after I got a new Mac Mini with Mac OS 10.6.4 and MS Office 2008, my links no longer worked on the Mac OS platform.

You need to use the command line switch -safe-mode in the Terminal to start Firefox 3.6 versions in Safe Mode.
*https://support.mozilla.org/kb/Safe+Mode
*http://kb.mozillazine.org/Safe_Mode
The last Firefox version that runs on Mac OS X 10.4 is Firefox 3.6.28.
For an unofficial Firefox 17.0.5 ESR compatible version (TenFourFox) that runs on PowerPC Macs with OS X 10.4.11 or OS X 10.5.8 you can look at:
*http://www.floodgap.com/software/tenfourfox/
*http://www.macupdate.com/app/mac/37761/tenfourfox

• Excel *.xls or *.xlsx for dimension sheet format in Admin Console?

Dear all:
We are in the process of upgrading BPC 5.1 to 7.0, and I was prompted with a message about the Excel format (2007 or 2003) when I first opened a dimension in the Admin Console, after BPC 7.0 was installed.
My questions are:
1. What are the pros and cons?
2. Excel 2003 has a limitation of 65000 rows, so does that mean that once I choose the 2003 format, my dimension members are limited to 65000? If not, what is the workaround?
3. If I select Excel 2007 as my default format, and want to switch to Excel 2003, is there a way to do so?
Thank you very much!
Sincerely,
Brian

Hi Brian Hsu,
Please refer to the SAP Note below:
SAP Note 1265872 - Excel 2007 format template supported in Admin console.
Summary
Symptom
Admin console does not support Excel 2007 format in previous versions.
Other terms
BPC 7.0 SP2 Microsoft, BPC 7M SP02, Excel 2007 format(xlsx), Admin console, dimension member sheet,
Reason and Prerequisites
The previous version of Admin console supported Excel 2003 file format only. Users could not use Excel 2007 file format ('xlsx').
Solution
Implemented in 7.0M SP2.
The BPC Administration console will support both the Excel 2003 and Excel 2007 file formats for the dimension member sheet. The console internally uses the value of 'MEMBERSHEET_VERSION' in the tblDefaults table to judge which file format is used for the dimension member sheet. The value can be "2003" or "2007" and is set by selecting the value from a popup window as below. Once the value is set via the popup window, it cannot be changed, so we recommend reading the explanation below for each case first and then setting the value.
KeyID and Value in tblDefaults;
- [KeyID]: MEMBERSHEET_VERSION
- [VALUE]: "2003" OR "2007"
Below is the behavior for each Excel version when users try to save a dimension member sheet to the server or process a dimension with a member sheet:
1. No record having MEMBERSHEET_VERSION in tblDefaults after installing BPC 7M SP2 or later
a) Excel 2003 users
- Create, Modify, Copy dimension: the file format of the member sheet will be 'xls'.
b) Excel 2007 users
- A popup window for selecting 2003 or 2007 will come up; if the user selects 2007, a warning that Excel 2003 users cannot use the dimension member sheet file anymore will come up once again, and then the file format will be changed to 'xlsx'.
2. MEMBERSHEET_VERSION=2003
a) Excel 2003 users
b) Excel 2007 users
- The dimension member sheet file will be saved in the 2003 file format via 'Save As'.
3. MEMBERSHEET_VERSION=2007
a) Excel 2003 users
- Cannot access the dimension member sheet any more.
b) Excel 2007 users
- Download the 'xlsx' file from the server; if there is only an 'xls' file, it will be downloaded from the server and then changed to 'xlsx' when saved back to the server.
- Create dimension: the file format will be 'xlsx'.
- Modify, Copy dimension: the file format will follow that of the existing member sheet.
[Limitation]
If the value of MEMBERSHEET_VERSION is set to 2007, users who use Office 2003 can no longer access or modify the dimension member sheet file. Only users who use Office 2007 can access and modify the dimension member sheet.

• How-to use Excel for the XML file input?

Hello all,
Following our discussion with Gerhard Steinhuber on the very nice tutorial from Horst Schaude, "How to upload mass data via XML File Input", I am starting this new discussion.
In the comments section of that tutorial, Rufat Gadirov explains how to use an XML file generated from Eclipse, instead of your XSD file, as your source in Excel.
However, in spite of all the instructions, I am still facing the same issue in Excel when I try to save my file as XML: "The XML maps in this workbook are not exportable".
What I am trying to do is create one or more Sales Orders with multiple Items from an XML File Input, using Excel to enter the data.
The File Input part is working (if I directly upload my file to the WebDAV, it creates a sales order instance with multiple items).
The only missing part is the Excel data input, which I cannot make work. Any help on this matter would be greatly appreciated.
Here is my XML file that I try to use as a source in Excel before inputing data from Excel:
<?xml version="1.0" encoding="UTF-8"?>
<CreationDateTime>2015-03-02T12:00:00.000Z</CreationDateTime>
<List actionCode="01" listCompleteTransmissionIndicator="true" reconciliationPeriodCounterValue="0">
<MyDateTime>2015-03-02T12:00:00.000Z</MyDateTime>
<MyName languageCode="EN">MyName</MyName>
<MyBillToParty schemeAgencyID="token" schemeAgencySchemeAgencyID="1" schemeAgencySchemeID="token" schemeID="token">token</MyBillToParty>
<MyDateToBeDelivered>2001-01-01</MyDateToBeDelivered>
<MyEmployeeResponsible schemeAgencyID="token" schemeAgencySchemeAgencyID="1" schemeAgencySchemeID="token" schemeID="token">token</MyEmployeeResponsible>
<MySalesUnit schemeAgencyID="token" schemeAgencySchemeAgencyID="1" schemeAgencySchemeID="token" schemeID="token">token</MySalesUnit>
<MyItem>
<MyItemID>token</MyItemID>
<MyItemProductID schemeAgencyID="token" schemeID="token">token</MyItemProductID>
<MyItemDescription languageCode="EN">MyItemDescription</MyItemDescription>
<MyProductTypeCode>token</MyProductTypeCode>
<MyRequestedQuantity unitCode="token">0.0</MyRequestedQuantity>
<MyConfirmedQuantity unitCode="token">0.0</MyConfirmedQuantity>
<MyNetAmount currencyCode="token">0.0</MyNetAmount>
</MyItem>
<MyDateTime>2015-03-02T12:00:00.000Z</MyDateTime>
<MyName languageCode="EN">MyName</MyName>
<MyBillToParty schemeAgencyID="token" schemeAgencySchemeAgencyID="1" schemeAgencySchemeID="token" schemeID="token">token</MyBillToParty>
<MyDateToBeDelivered>2001-01-01</MyDateToBeDelivered>
<MyEmployeeResponsible schemeAgencyID="token" schemeAgencySchemeAgencyID="1" schemeAgencySchemeID="token" schemeID="token">token</MyEmployeeResponsible>
<MySalesUnit schemeAgencyID="token" schemeAgencySchemeAgencyID="1" schemeAgencySchemeID="token" schemeID="token">token</MySalesUnit>
<MyItem>
<MyItemID>token</MyItemID>
<MyItemProductID schemeAgencyID="token" schemeID="token">token</MyItemProductID>
<MyItemDescription languageCode="EN">MyItemDescription</MyItemDescription>
<MyProductTypeCode>token</MyProductTypeCode>
<MyRequestedQuantity unitCode="token">0.0</MyRequestedQuantity>
<MyConfirmedQuantity unitCode="token">0.0</MyConfirmedQuantity>
<MyNetAmount currencyCode="token">0.0</MyNetAmount>
</MyItem>
</List>
Thank you all for your attention.
Best regards.
Jacques-Antoine Ollier

Hello Jacques-Antoine,
I suppose that, as you tried to construct a map from the schema, you took the elements from the List level down; in this case I also can't export the map.
But if you take the elements from the MySalesOrderUploaded level down, you get an exportable map (see screenshots).
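If the Excel map keeps refusing to export, an alternative is to skip Excel and generate the upload XML programmatically. A rough sketch with Python's xml.etree, reusing a few of the anonymized element names from the sample file above (the IDs, quantities and the unitCode value here are invented placeholders):

```python
import xml.etree.ElementTree as ET

def build_item(item_id, product_id, qty):
    """Build one MyItem element, following the anonymized sample above."""
    item = ET.Element("MyItem")
    ET.SubElement(item, "MyItemID").text = item_id
    ET.SubElement(item, "MyItemProductID").text = product_id
    ET.SubElement(item, "MyRequestedQuantity", unitCode="EA").text = str(qty)
    return item

# One List element containing several items, as in the two-item sample.
root = ET.Element("List", actionCode="01")
for item_id, product, qty in [("10", "P-100", 2), ("20", "P-200", 5)]:
    root.append(build_item(item_id, product, qty))

xml_bytes = ET.tostring(root, encoding="utf-8", xml_declaration=True)
print(xml_bytes.decode("utf-8"))
```

The resulting file can then be uploaded to the WebDAV folder directly, which the poster confirmed already works.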
Best regards,
Leonid Granatstein

• Cache and performance issue in browsing SSAS cube using Excel for first time

Hello Group Members,
I am facing a cache and performance issue the first time I open an SSAS cube connection using Excel (Data tab -> From Other Sources -> From Analysis Services) after the daily cube refresh. On an end user's system (8 GB RAM), the first time, it takes 10 minutes to open the cube; from the next run onwards it opens quickly, within 10 secs.
We have a daily ETL process running on high-end servers. The dedicated SSAS cube server has 8 cores and 64 GB RAM. In total we have 4 cubes, of which 3 get a full refresh and 1 an incremental refresh. We have seen that after the daily cube refresh it takes 10-odd minutes to open the cube on an end user's system; from the next time onwards it opens really fast, within 10 secs. After a cube refresh, on server systems (16 GB RAM), it takes 2-odd minutes to open the cube.
Is there any way we could reduce the time taken for the first attempt?
Best Regards, Arka Mitra.

Thanks Richard and Charlie,
We have implemented the solution/suggestions in our DEV environment and have seen a definite improvement. We are waiting for this to be deployed in the UAT environment to note down the actual performance and time improvement while browsing the cube for the first time after the daily cube refresh.
Guys,
This is what we have done:
We have 4 cube databases and each cube db has 1-8 cubes.
1. We are doing daily cube refresh using SQL jobs as follows:
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Parallel>
<Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200">
<Object>
<DatabaseID>FINANCE CUBES</DatabaseID>
</Object>
<Type>ProcessFull</Type>
<WriteBackTableCreation>UseExisting</WriteBackTableCreation>
</Process>
</Parallel>
</Batch>
2. Next we create a separate SQL job (Cache Warming - Profitability Analysis) to warm the cache for each individual cube in each cube DB, like:
CREATE CACHE FOR [Profit Analysis] AS
{[Measures].members}
*[TIME].[FINANCIAL QUARTER].[FINANCIAL QUARTER]
3. Finally, after each cube refresh step, we create a new step of type T-SQL in which we call these individual jobs:
EXEC dbo.sp_start_job N'Cache Warming - Profit Analysis';
GO
I will update the post after I receive the actual improvement figures from the UAT/Production environment.
Best Regards, Arka Mitra.

• Performance issue in browsing SSAS cube using Excel for first time after cube refresh

Hello Group Members,
This is a continuation of my earlier blog question -
https://social.msdn.microsoft.com/Forums/en-US/a1e424a2-f102-4165-a597-f464cf03ebb5/cache-and-performance-issue-in-browsing-ssas-cube-using-excel-for-first-time?forum=sqlanalysisservices
As that thread is marked as answered but my issue is not resolved, I am creating a new thread.
I am facing a cache and performance issue the first time I open an SSAS cube connection using Excel (Data tab -> From Other Sources -> From Analysis Services) after the daily cube refresh. On an end user's system (8 GB RAM, but only around 4 GB available), the first time, it takes 10 minutes to open the cube; from the next run onwards it opens quickly, within 10 secs.
We have a daily ETL process running on high-end servers. The dedicated SSAS cube server has 8 cores and 64 GB RAM. In total we have 4 cube DBs, of which 3 get a full refresh and 1 an incremental refresh. We have seen that after the daily cube refresh it takes 10-odd minutes to open the cube on an end user's system; from the next time onwards it opens really fast, within 10 secs. After a cube refresh, on server systems (32 GB RAM, around 4 GB available), it takes 2-odd minutes to open the cube.
Is there any way we could reduce the time taken for the first attempt?
As mentioned in my previous thread, we have already implemented cube cache warming, but there is no improvement.
Currently, the cumulative size of all 4 cube DBs is more than 9 GB in Production, with each cube DB containing 4 individual cubes on average and the biggest cube DB being 3.5 GB. Now, the question is: how does Excel work with an SSAS cube after the daily cube refresh?
Does Excel create a cache of the schema and data each time the cube is refreshed, and in doing so need to download the cube schema into Excel's memory? Downloading the schema and data of each cube database from server to client will take significant time, depending on the bandwidth of the network connection.
Is it in any way dependent on the client system's RAM? Today the biggest cube DB is 3.5 GB; tomorrow it will be 5-6 GB. Though the client system has 8 GB RAM, the available (free) RAM is around 4 GB. So what will happen then?
Best Regards, Arka Mitra.

Could you run the following two DMV queries, filling in the name of the cube you're connecting to? Then please post back the row count returned by each of them (by copying the results into Excel and counting the rows).
I want to see if this is an issue I've run across before, with thousands of dimension attributes and MDSCHEMA_CUBES performance.
select [HIERARCHY_UNIQUE_NAME]
from $system.mdschema_hierarchies
where CUBE_NAME = 'YourCubeName'

select [LEVEL_UNIQUE_NAME]
from $system.mdschema_levels
where CUBE_NAME = 'YourCubeName'
Also, what version of Analysis Services is it? If you connect Object Explorer in Management Studio to SSAS, what's the exact version number it says on the top server node?
http://artisconsulting.com/Blogs/GregGalloway