Issue with Custom data parameter in Excel Data Connection

Hello,
We are querying an OLAP (SSAS 2008) cube from Excel 2007 by creating a data connection to the SSAS server with the extended parameter "Custom Data". We use this custom data parameter to apply our application's user security and filter dimension members in Excel's pivot table report. We have created a pivot table in Excel, and Excel includes this custom data parameter with every request it submits to SSAS.
However, when the user selects the "Show Properties in Report" option for a dimension on the row axis of the pivot table, Excel does not include the custom data parameter. Why is this "Custom Data" parameter omitted from the request only in this specific case?
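For reference, the connection is created along these lines (a minimal ADOMD.NET sketch with placeholder values; in Excel the same CustomData value is supplied through the connection's extended properties):

using Microsoft.AnalysisServices.AdomdClient;

// Placeholder server name; CustomData carries the per-user security token.
AdomdConnection conn = new AdomdConnection(
    "Data Source=ssasserver;Initial Catalog=QAV10_12thMay2010;CustomData=xxxxxx");
conn.Open();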
1. The request sent from Excel to SSAS when the user clicks the "Show Properties in Report" option:
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header>
    <Session xmlns="urn:schemas-microsoft-com:xml-analysis" SessionId="AE6B294B-4983-4010-BC5F-E0896A49ECD9"/>
  </soap:Header>
  <soap:Body>
    <Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
      <RequestType>DISCOVER_LITERALS</RequestType>
      <Restrictions/>
      <Properties>
        <PropertyList>
          <Content>SchemaData</Content>
          <Format>Tabular</Format>
        </PropertyList>
      </Properties>
    </Discover>
  </soap:Body>
</soap:Envelope>
2. A sample request sent from Excel to SSAS in all other cases:
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header>
    <Session xmlns="urn:schemas-microsoft-com:xml-analysis" SessionId="74165AD6-C240-4634-92A1-61A400A3FC97"/>
  </soap:Header>
  <soap:Body>
    <Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
      <RequestType>MDSCHEMA_CUBES</RequestType>
      <Restrictions>
        <RestrictionList>
          <CUBE_NAME>RepCube</CUBE_NAME>
        </RestrictionList>
      </Restrictions>
      <Properties>
        <PropertyList>
          <Catalog>QAV10_12thMay2010</Catalog>
          <Timeout>5000</Timeout>
          <SafetyOptions>2</SafetyOptions>
          <MdxMissingMemberMode>Error</MdxMissingMemberMode>
          <LocaleIdentifier>1033</LocaleIdentifier>
          <DbpropMsmdMDXCompatibility>1</DbpropMsmdMDXCompatibility>
          <CustomData>xxxxxx</CustomData>
        </PropertyList>
      </Properties>
    </Discover>
  </soap:Body>
</soap:Envelope>
Because the Custom Data parameter is missing in the first request, our application logic throws a custom exception to Excel, and Excel cannot interpret the exception message. Excel then sends the same request repeatedly (going into an infinite loop) and the Excel window freezes; it is impossible to work with Excel until it is restarted.
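A defensive check on the server side stops the freeze, even though it does not explain the missing parameter. A minimal sketch, assuming the filter logic runs in a server-side ADOMD.NET assembly (names and the fallback policy are illustrative):

using Microsoft.AnalysisServices.AdomdServer;

public static class SecurityHelper
{
    // Context.CustomData is null when the request did not carry the
    // CustomData property (as in the DISCOVER_LITERALS request above).
    // Returning a default instead of throwing keeps Excel from looping.
    public static string GetUserFilterToken()
    {
        string customData = Context.CustomData;
        if (string.IsNullOrEmpty(customData))
        {
            return null; // hypothetical fallback: apply the most restrictive security
        }
        return customData;
    }
}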
Can anyone help me resolve the above? Quick help is much appreciated!
Thanks,
Venkat

Venkat,
We have a very similar issue, but there is almost no material to help us. If this reaches you, please let me know whether you made any progress or whether I need to go to Microsoft for a solution.
Thanks
David

Similar Messages

  • Issues with Custom Chart

    Hi,
    I am facing some issues with custom charts.
    1. The X-axis value is getting cut off. I set a date as the X-axis parameter, and the last 2 digits of the date are cut off (format like 19-Apr-2011T00:00:00). This cut-off happens only for the custom chart.
    2. On right-clicking the custom chart and selecting Preview, a new pop-up window opens with the error "Application error occurred during the request processing." and no preview is generated.
    3. In Server Scaling, I checked "Use global auto scaling", and it often shows an improper Y axis, multiple Y-axis values with the same value, or an improper global range.
    Kindly help.
    Regards
    Muzammil

    Muzammil,
    Without seeing your data and your chart configuration, it is difficult to understand exactly the issues you are encountering. I have the same JRE and the same version and build of MII as you. I have no difficulty with the scaling or with Global Range, but I do see problems displaying the date in my tests.
    I would suggest that you enter a ticket into the SAP Support System and enclose a copy of your data (run the query, use Browser with Content Type = text/xml) and export a copy of your display template.
    Lastly, what type of query are you using - sql, tag?
    Kind Regards,
    Diana Hoppe

  • HR ABAP: Issue with using 'nocommit' parameter on FM HR_INFOTYPE_OPERATION

    Issue with using nocommit parameter on FM HR_INFOTYPE_OPERATION:
    My client has a requirement to create the following 4 infotypes in sequence in a LUW, i.e either all are created or none is created.
    9045   (custom infotype)
    0045
    0078
    0015
    I tried to use the nocommit parameter on FM HR_INFOTYPE_OPERATION to insert the 4 infotypes in nocommit mode, and at the end I issued 'COMMIT WORK', but to my surprise only infotype 0015 was created in the database; the first three (9045, 0045 and 0078) did not make it to the database.
    I searched many threads on SDN but could not find a solution.
    Please let me know if there is any solution to implement the LUW.
    Your inputs will be appreciated.

    Hi,
    I think you can also try FM HR_MAINTAIN_MASTERDATA; see its documentation.
    Nocommit works like a simulation mode. What you can do is
    call the FM for all infotypes and collect all error messages, if any, then finally call the FM for all infotypes again without passing nocommit (i.e. space).
    regards
    prabhu

  • Issues with Custom Settings

    Is anyone else having issues with custom settings on 10.8.2?
    I am trying to configure basic office 2011 settings based on the keys located at http://afp548.com/mediawiki/index.php/Office_2011_Settings using a device group.
    However, when I log into an authenticated user's machine and load Pref Setter, I can see that the customizations are not present on the newly managed machine. I know the push updates are working, but for custom settings I seem to be getting nothing.
    My questions:
    - Am I just flat out doing it wrong? (meaning, should the plist files be copied over?)
    - Are there common issues to look out for?
    - Can anyone share basic custom settings they have that work?
    Thanks!

    That's a bad idea. Office does not react well to having its preferences copied over from another machine.
    Check out: http://www.officeformachelp.com/office/administration/mcx/
    It's a good guide for MS Office prefs.

  • Issue with camera: it says "there is no connected camera" on my MacBook Pro

    Issue with camera: it says "there is no connected camera" on my MacBook Pro.

    I got that "no connected camera" error from Photo Booth too.
    After trying all the usual suggestions (reset PRAM, repair permissions, etc.), I plugged in an external USB camera (it happened to be a ProScope) and - for no apparent reason - the internal iSight camera sprang to life and has been working fine ever since.
    Worth a try - any USB camera should do it.

  • I am having an issue with iPhone 5 Bluetooth, not able to connect to Honda CR-V

    I am having an issue with iPhone 5 Bluetooth; I am not able to connect to my Honda CR-V.

    Don't know if you can consider this an Apple issue. I know you are probably frustrated, and I am not defending Apple here, just saying that the iPhone does not update to support individual automobiles; the automobile manufacturers update to support the Bluetooth profiles. The iPhone has the latest Bluetooth stack and supports a number of profiles. Problems generally revolve around the manufacturer not supporting the latest Bluetooth version and having to update their systems. Check with Honda; they are probably aware of this, and if you search the forum you will find a number of posts regarding the CR-V.

  • Export of Customer Receivables Ageing to Excel - dates can't sort

    Hi all ,
    When I export the Customer Receivables Ageing Report to Excel by clicking on the Excel icon, it successfully exports the data, but the dates in the report are treated by Excel as text and not as a date field.
    Is there any way to export the data without having to change the format of each date individually to date format?
    Options used are:
    1. Save as type: Microsoft 97-2003 Workbook (*.xls)
    2. Export Currency Symbols: Yes, to the same column
       (However, I have tried all 3 responses with no change to the date format.)
    3. Enable the macros in AutoOpen.xls
    4. Posting date and due date in the Excel spreadsheet are left-justified text fields.
    Any ideas anyone?

    Thanks Gordon. What do you mean by "match the data format to your machine"?
    The dates show in B1 as 18.12.2011, which is not a recognised date format in Excel.
    By changing the "." to "/" in Excel so the dates show as 18/12/2011, they are recognised as dates.
    We can make this change, but I would rather the CEO did not have to convert the data in Excel before he can use it for his analysis.
    Are there settings in B1 where we can specify the formats for dates etc.? If so, where are they? We use Australian/British standards rather than American standards on our PCs, and the server that B1 is on has its operating system set to Australian English and Australian time.
    regards
    Chris
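    As an aside, dot-separated dates only parse under a culture whose short-date pattern uses dots, which is why Excel under an English locale leaves them as text. A minimal C# sketch illustrating the mismatch (sample value taken from the post above):

    using System;
    using System.Globalization;

    // "18.12.2011" is not a date under en-AU/en-US; an explicit format is needed.
    string raw = "18.12.2011";
    DateTime parsed = DateTime.ParseExact(raw, "dd.MM.yyyy", CultureInfo.InvariantCulture);
    Console.WriteLine(parsed.ToString("d", new CultureInfo("en-AU"))); // 18/12/2011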

  • Issue with Info Object Transport after changing Data Type

    Hi Experts,
    We have a DSO which has been running for the past 5 years, and recently (3 days ago) we added new fields to that DSO. Delta loaded for the last three days, and the new info object's (say XXX) data was populated into the DSO.
    Now the problem is, we identified that the info object's data type is not correct: we used data type NUMC instead of CHAR, so the character information is missing for that field.
    Example:
    Data from ECC: ABC123
    Data loaded to DSO: 123 (missed characters ABC)
    So we deleted the data from the DSO and changed the info object's data type in the Development system.
    I also deleted only the last three days' delta records from the DSO and transported my info object to Production, but it gives the error: "Info object contains data in DSO".
    Yet that info object field is empty in the DSO; I have already deleted the last three days' delta.
    Do I need to delete all 5 years of data from the DSO to change the data type of the recently added info object?
    Please give me your solutions and ideas to solve this issue.
    Thank you,
    Best Regards,
    Santhosh

    Hi Raman,
    Thank you for your answer.
    When I changed the data type of the info object in Dev, I deleted the content of the DSO, and the same was transported to QA. Before importing the changed info object, I deleted the content of the DSO in QA, so the transport completed successfully in QA.
    But in Production we have a history of the past 5 years, so I can't delete all the contents of the DSO.
    So I deleted only the delta requests which contain data for that info object (3 days ago we moved that info object (NUMC data type) to Production, so only the last 3 days' delta was loaded for it) and tried the transport, but it failed. I am sure we need to drop all data from the DSO if I want to keep the same info object.
    I have some comments on your first approach:
    1. If we delete the info object from the Dev DSO and transport to Production, it will give a transport failure, because my previous transport errors clearly say the info object contains data in the DSO, so it won't allow removing that info object from the DSO.
    2. We are using the same info object technical name in the BO Data Federator as well, so if we add new info objects we need to make changes in BO too. I am treating this as my last option if I can't find any other solution.
    Thank you.
    Best Regards.

  • Getting error with date parameter field - "Invalid Date"

    Hi
    I created a report with version 9.2 and added 2 date parameter fields to it; within Crystal Reports it works fine.
    But if I call it from the VS2005 CrystalReports.NET (ReportDocument) SDK, I get the error: Invalid Date "enter a Date Value". Actually it is a DTP control; how can I modify it?

    There's not enough detail to get a good understanding of your scenario, but it sounds like you're using some kind of date control to retrieve the date value and pass it to the report's parameter field at runtime.
    If this is correct, it sounds like the format of the date value from the control may not be what the report is expecting. To debug, I would first use hardcoded values when you set the report parameter fields to see what works. Then compare that to the return value from the control to see if they match. I suspect the formats differ.
    I did a quick search on the BOBJ kbase and found a couple of articles that may help you. I recommend searching, as there is lots of information on how to set parameter fields at runtime.
    c2010247: http://technicalsupport.businessobjects.com/KanisaSupportSite/search.do?cmd=displayKC&docType=kc&externalId=c2010247&sliceId=&dialogID=8582434&stateId=1 0 8584068
    c2010251: http://technicalsupport.businessobjects.com/KanisaSupportSite/search.do?cmd=displayKC&docType=kc&externalId=c2010251&sliceId=&dialogID=8582434&stateId=1 0 8584068
    -MJ
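    For reference, hardcoding the parameter values as suggested looks roughly like this (a minimal sketch; the report path and parameter names are hypothetical):

    using System;
    using CrystalDecisions.CrystalReports.Engine;

    // Pass known-good DateTime values first, to rule out the DTP control's format.
    ReportDocument report = new ReportDocument();
    report.Load(@"C:\Reports\MyReport.rpt");      // hypothetical path
    report.SetParameterValue("StartDate", new DateTime(2007, 1, 1));
    report.SetParameterValue("EndDate", DateTime.Today);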

  • Issues with CLOB dataype in 10g. (Data truncation)

    I have a product which runs perfectly fine on Oracle 9i. We are trying to upgrade the product to 10g. Once the database is upgraded and the data is imported, we face a few issues with CLOB data type values. We use the SQLGetData function to fetch data from these CLOB columns, and the data is getting truncated on retrieval. Oracle says it is an issue with the ODBC driver released with 10g Release 1. I tried upgrading the ODBC driver, but even after upgrading I still face the same issue. I also tried Oracle 10g Release 2; still the same issue. The data stored in the CLOB column is XML, so after retrieving the truncated data, the XML parser is not able to parse it.
    When I trace the error using ODBC Error Tracing, I get this error:
    [01004] [Oracle][ODBC]String data, right truncated. (0)
    Has anyone faced anything like this? Your inputs will be greatly appreciated.
    Thanks in advance.

    Hi,
    Thanks for the reply. I couldn't quite get what you meant by "referential variable". Could you please elaborate on that?
    Thanks in advance.
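    For reference, the usual workaround for 01004 truncation is to fetch the CLOB in chunks rather than in one call. A minimal C# sketch of the equivalent chunked read (assuming System.Data.Odbc and a reader opened with CommandBehavior.SequentialAccess; names are illustrative):

    using System.Data.Odbc;
    using System.Text;

    // Read a CLOB column in fixed-size chunks instead of one large fetch.
    static string ReadClob(OdbcDataReader reader, int column)
    {
        StringBuilder sb = new StringBuilder();
        char[] buffer = new char[4096];
        long offset = 0;
        long read;
        while ((read = reader.GetChars(column, offset, buffer, 0, buffer.Length)) > 0)
        {
            sb.Append(buffer, 0, (int)read);
            offset += read;
        }
        return sb.ToString();
    }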

  • Issue with Advanced Replication - ORA-01403: no data found

    Hi,
    I got into this weird issue with my replicated environment (9.2.0.5.0 on Solaris). We have a bidirectionally replicated table which has been working fine for years but is now showing issues on certain records.
    We had to manually update some fields locally; that is something we have done in the past, ensuring both sides were aligned, but this time it does not work.
    The two records are identical, but when we try to update on one side, the far end fires:
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at "SYS.DBMS_DEFER_SYS_PART1", line 442
    ORA-06512: at "SYS.DBMS_DEFER_SYS", line 1854
    ORA-06512: at "SYS.DBMS_DEFER_SYS", line 1900
    ORA-06512: at line 1
    Thought there could be some field out of sync, but the update we execute is quite simple:
    Records on current DBs:
    9495411494 EA10CARD 10 169 0 used 1.0000E+17 31-DEC-99 E-VOUCHER_10_CARD 28-MAR-13 12 10 949541
    9495411494 EA10CARD 10 169 0 used 1.0000E+17 31-DEC-99 E-VOUCHER_10_CARD 28-MAR-13 12 10 949541
    Trying to:
    update ucms_batches set batch_status='new' where serial_no=9495411494;
    commit;
    I see the change applied locally, but I see an error reported on the far end. I could not decode the content of the user_data completely; the only thing I could get was:
    3.45.53207 0 0 ?
    OPS$SCNCRAFT? UCMS_BATCHES$RP?
    REP_UPDATE? ?A
    !? used? new?EA10CARD??Y
    7
    Y?'
    !?AE-VOUCHER_10_CARD??AF!????_`*!??!?A
    !????_`*_!?E
    !??? NSMS_MASTERDEF.RMS? N
    Is there any way I could verify the SQL attempted on the far end together with the data used to match the record?
    Thanks
    Mike

    Found a solution: using the dump function to get the data decoded and compare it across nodes.
    It turned out a date field was not matching on the hh:mm:ss values.
    Thanks
    Mike

  • Need help with Importing These Kinds of Excel Data Sheets

    Hi there, can someone assist me with this?
    I have a package in SSIS where I need to import Excel data sheets. Those sheets come from a cube, so their format is a bit different. Kindly help me import them with a Data Flow Task.
    Also, as you can see, the Excel sheets have filters on their columns, so I quite often get errors importing those sheets via the Data Flow Task.
    Thanks in advance.
    Note: I cannot edit each Excel file before running the packages. The packages are automated, and the sheets are automated too; they get stored on the FTP by a cyclic job. So I have to do it at the package level.

    You cannot hit a moving target, and your data quality is at risk, too. Even if a package succeeds, there is still a degree of doubt about whether the data ended up in the right place.
    You can create a helper program that checks for data integrity; e.g. a series of SQL queries can be executed against the source to pinpoint what metadata shape the Excel arrived in. Then branch the package flow based on one or the other variant (there will be only a handful). The branching can be done by using Precedence Constraints. But see what I said above.
    Arthur My Blog

    Okay, then I must do it differently, or get the source in a better and more secure manner.
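    For what it's worth, the metadata check Arthur describes can be sketched in a Script Task along these lines (the file path and provider string are hypothetical):

    using System;
    using System.Data;
    using System.Data.OleDb;

    // List the sheets an incoming workbook actually contains, so the package
    // can branch on the variant it received.
    string conn = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\ftp\in.xlsx;" +
                  "Extended Properties='Excel 12.0;HDR=YES'";
    using (OleDbConnection db = new OleDbConnection(conn))
    {
        db.Open();
        DataTable sheets = db.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
        foreach (DataRow row in sheets.Rows)
            Console.WriteLine(row["TABLE_NAME"]);   // e.g. "Sheet1$"
    }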

  • Issue with custom receive Pipeline component

    I have been facing an issue creating a custom receive pipeline component. The pipeline is to receive large files; if the file size is large, it has to write the incoming stream to a folder and pass only some metadata through the MessageBox. The Execute method I am using is:
    #region IComponent Members
    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        if (_largeFileLocation == null || _largeFileLocation.Length == 0)
            _largeFileLocation = Path.GetTempPath();
        if (_thresholdSize == 0)
            _thresholdSize = 4096;
        if (pInMsg.BodyPart.GetOriginalDataStream().Length > _thresholdSize)
        {
            Stream originalStream = pInMsg.BodyPart.GetOriginalDataStream();
            string largeFilePath = _largeFileLocation + "\\" + pInMsg.MessageID.ToString() + ".zip";
            FileStream fs = new FileStream(largeFilePath, FileMode.Create);
            // Write the message body to disk in chunks
            byte[] buffer = new byte[4096];
            int bytesRead = originalStream.Read(buffer, 0, buffer.Length);
            while (bytesRead != 0)
            {
                fs.Write(buffer, 0, bytesRead);
                bytesRead = originalStream.Read(buffer, 0, buffer.Length);
            }
            fs.Flush();
            fs.Close();
            // Replace the body with a small xml message carrying only the path
            string xmlInfo = "<ns0:MsgInfo xmlns:ns0='http://SampleTestPL.SchemaLocation'><LargeFilePath>" + largeFilePath + "</LargeFilePath></ns0:MsgInfo>";
            byte[] byteArray = System.Text.Encoding.UTF8.GetBytes(xmlInfo);
            MemoryStream ms = new MemoryStream(byteArray);
            pInMsg.BodyPart.Data = ms;
        }
        return pInMsg;
    }
    #endregion
    Here I want the xml to be dropped into the file share, e.g. E:\Dropbox\PL\send, and the entire message to be dropped into the folder E:\Dropbox\SendLarge. So in the receive pipeline properties I set the locations accordingly,
    and in the send port the destination I give is E:\Dropbox\PL\send.
    The issue is that both the xml and the message get dropped into the same folder, E:\Dropbox\PL\send, and the message is not dropped into E:\Dropbox\SendLarge. Any help is greatly appreciated.

    using System;
    using System.Collections.Generic;
    using System.Text;
    using Microsoft.BizTalk.Message.Interop;
    using Microsoft.BizTalk.Component.Interop;
    using System.IO;

    namespace Sample.ReceivePipelineLargeFile
    {
        [ComponentCategory(CategoryTypes.CATID_PipelineComponent)]
        [ComponentCategory(CategoryTypes.CATID_Decoder)]
        [System.Runtime.InteropServices.Guid("53fd04d5-8337-42c2-99eb-32ac96d1105a")]
        public class ReceivePipelineLargeFile : IBaseComponent,
            IComponentUI,
            IComponent,
            IPersistPropertyBag
        {
            #region IBaseComponent Members
            public string Description
            {
                get { return "Pipeline component used to receive a large file and save it to disk"; }
            }
            public string Name
            {
                get { return "ReceivePipelineLargeFile"; }
            }
            public string Version
            {
                get { return "1.0.0.0"; }
            }
            #endregion

            #region IComponentUI Members
            public IntPtr Icon
            {
                get { return new System.IntPtr(); }
            }
            public System.Collections.IEnumerator Validate(object projectSystem)
            {
                return null;
            }
            #endregion

            #region IPersistPropertyBag Members
            private string _largeFileLocation;
            private int _thresholdSize;

            public string LargeFileLocation
            {
                get { return _largeFileLocation; }
                set { _largeFileLocation = value; }
            }
            public int ThresholdSize
            {
                get { return _thresholdSize; }
                set { _thresholdSize = value; }
            }
            public void GetClassID(out Guid classID)
            {
                classID = new Guid("B261C9C2-4143-42A7-95E2-0B5C0D1F9228");
            }
            public void InitNew()
            {
            }
            public void Load(IPropertyBag propertyBag, int errorLog)
            {
                object val1 = null;
                object val2 = null;
                try
                {
                    propertyBag.Read("LargeFileLocation", out val1, 0);
                    propertyBag.Read("ThresholdSize", out val2, 0);
                }
                catch (ArgumentException)
                {
                    // Property not present in the bag; keep defaults.
                }
                catch (Exception ex)
                {
                    throw new ApplicationException("Error reading PropertyBag: " + ex.Message);
                }
                if (val1 != null)
                    _largeFileLocation = (string)val1;
                if (val2 != null)
                    _thresholdSize = (int)val2;
            }
            public void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties)
            {
                object val1 = (object)_largeFileLocation;
                propertyBag.Write("LargeFileLocation", ref val1);
                object val2 = (object)_thresholdSize;
                propertyBag.Write("ThresholdSize", ref val2);
            }
            #endregion

            #region IComponent Members
            public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
            {
                if (_largeFileLocation == null || _largeFileLocation.Length == 0)
                    _largeFileLocation = Path.GetTempPath();
                if (_thresholdSize == 0)
                    _thresholdSize = 4096;
                if (pInMsg.BodyPart.GetOriginalDataStream().Length > _thresholdSize)
                {
                    Stream originalStream = pInMsg.BodyPart.GetOriginalDataStream();
                    string largeFilePath = _largeFileLocation + "\\" + pInMsg.MessageID.ToString() + ".zip";
                    FileStream fs = new FileStream(largeFilePath, FileMode.Create);
                    // Write the message body to disk in chunks
                    byte[] buffer = new byte[4096];
                    int bytesRead = originalStream.Read(buffer, 0, buffer.Length);
                    while (bytesRead != 0)
                    {
                        fs.Write(buffer, 0, bytesRead);
                        bytesRead = originalStream.Read(buffer, 0, buffer.Length);
                    }
                    fs.Flush();
                    fs.Close();
                    // Replace the body with a small xml message carrying only the path
                    string xmlInfo = "<ns0:MsgInfo xmlns:ns0='http://SampleTestPL.SchemaLocation'><LargeFilePath>" + largeFilePath + "</LargeFilePath></ns0:MsgInfo>";
                    byte[] byteArray = System.Text.Encoding.UTF8.GetBytes(xmlInfo);
                    MemoryStream ms = new MemoryStream(byteArray);
                    pInMsg.BodyPart.Data = ms;
                }
                return pInMsg;
            }
            #endregion
        }
    }
    Thanks Osman Hawari, for trying to help me out.
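    One detail that may explain the routing: the pipeline only changes the message body, so both messages look identical to the send port filters. A common pattern is to promote a context property in Execute and filter each send port on it; a sketch (the property name and schema namespace are hypothetical and must exist as a deployed property schema):

    // Inside Execute, after swapping the body for the metadata xml:
    // promote a flag that send port filters can distinguish on.
    pInMsg.Context.Promote("LargeFile", "https://SampleTestPL.PropertySchema", true);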

  • Issue with customer Incoming Payment

    Hi Gurus,
    I need your help with an issue related to incoming customer payments. This is the first time I am doing the incoming payment set-up; I have done the payment program set-up for outgoing payments. I need to configure the automatic payment program for incoming payments. After the payment run is successfully executed, we need to create and send IDocs to the bank. This process of payment run and IDoc creation is being used successfully for outgoing payments.
    On the configuration side, I have created a new payment method and set it up for incoming payments. I selected ACH as the format and PPD as the format supplement. I did the bank set-up and assigned a partner profile to it. I assigned reference key 17 in SPRO -> Print -> Payment media.
    The issue is that when I run the payment run for a customer open item, the run is successful and the item gets cleared, but no IDocs are generated. I have entered my house bank and account number in the variant as well. I am getting the following error:
    Step 001 started (program SAPFPAYM_SCHEDULE, variant &0000000000869
    No variants found for format ACH
    No Payment Media Created in Format 100
    I would really be thankful if you can help me resolve this issue. I am not sure which step I am missing.
    Thanks,
    Best Regards,
    Shalu

    Hi Lakshmipathi,
    Yes, the document split has been activated. This is a specific issue. We have different C&F locations, and sales happen across India, hence we have created the business area concept for each location to get a location-wise balance sheet. I have also activated zero balancing for profit center and for business area so that I can derive the location-wise balance sheet.
    Now a sales invoice has been processed from a C&F location. The customer deposits the cheque directly at the head office bank and not at the C&F location bank. I don't want to clear the existing open item, because 50% of the cheques get bounced, so I am booking the incoming payment without clearing the open item.
    While processing the incoming payment, the user enters the C&F location business area in the customer line item (credit line item) and the head office business area for the bank line item, as the bank is at the head office, so that we can derive the balance sheet.
    Now the system is not deriving the profit center for the customer line item even though business-area-wise zero balancing has been activated. The recon account field status group has the profit center marked as optional, but in F-28 I am not able to get the profit center field (because it is a balance sheet item).
    The document type has already been configured for document split, and the recon account is also categorised as Customer in the doc splitting. I don't have any problem with document split other than this issue.
    For this issue, can anything be done in the document split?
    Regards,
    Devendran

  • Issue with custom fields mapping from CRM to ECC.

    Hi all,
    I have issues with replicating custom fields in a service order. I have created new fields with EEWB and EEWC. Now the structure has been changed in ECC and CRM, but I have to map these fields to the correct structures, and I also need to apply some logic.
    I have been pointed, in a couple of threads, to a custom function module, as in the user exit for FI generation.
    But I didn't understand the concept of this custom function module. Why is it used?
    Could anyone explain, with a good example, how this has to be achieved?
    Points are highly awarded.
    Kindly reply.
    Kiran...

    Hi,
    Here you go.
    After you create the custom characteristics:
    1. Assign the category.
    2. Map the contents.
    3. Double-check that your master catalog has the mapped content.
    4. Publish the master catalog.
    As per your message, you are not getting any error message during import, which tells me that you are one step from closing.
    Cheers, Renga
