Problem while loading FS Items through Flex Upload

Hi,
While trying to append some new FS items to an existing hierarchy, I'm facing issues.
As suggested by Eugene (in his blog), I am mapping the 'Node' column in the load file to 'Item' in the method.
However, in this case the nodes are text nodes (e.g. 'INTERNALACCRUALS'), and the system seems to treat this value as an Item: it truncates it to the length of the Item field and then issues the error message 'the value of Item is invalid'.
Please note that the nodes as well as the hierarchy already exist; we are just trying to append new FS items to the hierarchy.
We are on version 4.0.
I'm sure I'm doing something wrong but I'm not able to figure out what. Could you please help me with this?
Best Regards,
SSC

I've never used text nodes in a BCS item hierarchy, and I'm not sure whether that is possible (I haven't read Eugene's blog). I've always created them as FS items with a posting block and kept them in the same numbering sequence as the accounts below them.
For example
FS_ITEMS         FS Item Hierarchy
- 1000000000  Assets
  - 1100000000  Cash & Equivalent
    - 1110000000  Cash
      - 1111000000  Bank Accounts
        - 1111100001  Bank Account 1
        - 1111100002  Bank Account 2
- 2000000000  Liabilities
- xxx  etc.
Hope that helps,
Chris

Similar Messages

  • Issue while loading Master Data through Process Chain in Production

    Hi All,
    We are getting an error in Process chain while loading Master Data
    Non-updated Idocs found in Source System
    Diagnosis
    IDocs were found in the ALE inbox for Source System that are not updated.
    Processing is overdue.
    Error correction:
    Attempt to process the IDocs manually. You can process the IDocs manually using the Wizard or by selecting the IDocs with incorrect status and processing them manually.
    I have also checked the PSA but could not find any records, and the strange thing is that the job itself is not getting scheduled. Can anyone help me out in resolving this issue?
    Regards
    Bhanumathi

    Hi
    This problem is not related to the process chain.
    You can try this:
    In RSMO, select the particular load you want to monitor.
    In the menu bar, Environment >>> Transact. RFC >>> select whichever is required, BW or Source System.
    In the next screen you can select the Execute button and the IDocs will be displayed.
    Check Note 561880 - Requests hang because IDocs are not processed.
    OR
    Transact. RFC status stays yellow ('running') for a long time (Transact. RFC is enabled in the Status tab in RSMO).
    Step 1: Go to Details > Status, get the IDoc number, and go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC
    queue under outbound processing and click Display IDoc in the menu bar.
    Step 2: In the next screen, click Display tRFC calls (this will take you to the particular tRFC call in SM58),
    place the cursor on the particular transaction ID, go to Edit in the menu bar --> press 'Execute LUW'
    (Display tRFC calls (will take you to the particular tRFC call in SM58) ---> select the transaction ID ---> Edit ---> Execute LUW).
    Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc number, as it will take you
    to the particular tRFC request for that IDoc.
    OR
    Go into the job overview of the load; there you should be able to find the data package TID.
    (In the RSMO screen > Environment > there is an option for Job overview.)
    This data package TID is the transaction ID in SM58.
    OR
    In SM58, enter * / the user name or the background (ALEREMOTE) user name and execute. It will show you all the pending tRFCs with
    their transaction IDs.
    In the Status Text column you can see two statuses:
    Transaction Recorded and Transaction Executing.
    Don't disturb entries whose status is the second one, Transaction Executing. If the status is the first one (Transaction Recorded), manually
    execute them via "Execute LUWs".
    OR
    Go directly to SM58 > enter * / the user name or the background (ALEREMOTE) user name and execute. It will show the tRFCs to be executed
    for that user. Find the particular tRFC (SM37 > request name > TID from the data packet with sysfail), select the transaction ID (SM58) --->
    Edit ---> Execute LUW.
    (from JituK)
    Hope it helps
    Darshan

  • Error while loading the dimension through ODI

    Hi ALl,
    Here I am facing a different error. I migrated all the cubes from the dev server to the integration server, then reloaded and saved my rule files. When I load the dimension manually it loads perfectly, but when we load the dimension through the ODI package (which works fine on the dev server), we get the problem:
    "Cannot build dimension. Analytic Server Error(1007083): Dimension build failed. There are many possible causes (for example, problem allocating memory). Check the server log file and the dimension build error file to locate the error that caused the failure..""
    Then I tried in Essbase; now it is giving an error that it cannot open the .otn object in the Arbor path.
    It is not opening the .otn file. Can anyone help? It would be appreciated.
    Thanks

    The .otn file is created while Essbase is restructuring and, once the restructure completes, is renamed to .otl; if it does not complete, it can leave the .otn file behind.
    You should be able to delete the .otn file (maybe stop the application first) and open the outline.
    I would also check the logs generated by ODI as well as the Essbase application logs to troubleshoot the issue.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error while Loading Entity Dimension through ODI

    Hi,
    When I try to load the outline for the Entity dimension into Planning (11.1.1.0) through ODI, I get the following error:
    7000 : null : java.sql.SQLException: Invalid COL ALIAS "STORAGE C2_DATA_STORAGE" for column "DATA"
    java.sql.SQLException: Invalid COL ALIAS "STORAGE C2_DATA_STORAGE" for column "DATA"
         at com.sunopsis.jdbc.driver.file.bb.b(bb.java)
         at com.sunopsis.jdbc.driver.file.bb.a(bb.java)
         at com.sunopsis.jdbc.driver.file.w.e(w.java)
         at com.sunopsis.jdbc.driver.file.w.executeQuery(w.java)
         at com.sunopsis.sql.SnpsQuery.executeQuery(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.h.y(h.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Kindly help.
    -Jitendra

    Hi,
    Send over the models/interface and I will have a look.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Problem while loading kernel image through Jtag (on ZEDBOARD)

    Hi all,
    I am trying to boot a kernel image on the ZedBoard via JTAG.
    But while running "run netboot" I am getting the following error:
    "wrong image format for bootm command"
    ERROR: can't get kernel image
    I have enclosed the console screenshot.
    Please suggest a solution for the above problem.

    
    Try re-building the image and see if that helps.
    Can you try booting the image using bootm commands and see if you are able to boot?
    Example:
    uboot> tftpboot 0x1000000 /tftpboot/image.ub
    uboot> tftpboot 0x2000000 /tftpboot/urootfs.cpio.gz
    uboot> tftpboot 0x3000000 /tftpboot/system.dtb
    uboot> bootm 0x1000000 0x2000000 0x3000000
    For detailed steps, refer to the link.
    Regards,
    Achutha

  • Errors while loading a hierarchy through flat file

    Hi All,
    I have prepared a flat file to load data for a characteristic (location) hierarchy and am trying to load it.
    It keeps throwing an error, although it loaded fine with a smaller number of records (25 earlier; 75 now).
    The error coming up is:
    Data records for package 1 selected in PSA - 1 error(s)
    Could anyone please suggest possible causes and troubleshooting points?
    Thank you
    Tanu

    Hi
    Just check for lower-case characters in your file; all characters should be in upper case.
    You may also choose to edit the single record in the PSA and process it further.
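    If it helps, here is a minimal sketch (the file name is an assumption about your setup) for spotting the records that contain lower-case characters before loading:
    # flag flat-file rows that contain lower-case characters (a common cause of
    # hierarchy/master-data load errors when the InfoObject allows upper case only)
    with open("location_hierarchy.csv", encoding="utf-8") as f:   # file name is an assumption
        for line_no, line in enumerate(f, start=1):
            if any(ch.islower() for ch in line):
                print(f"line {line_no}: {line.rstrip()}")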
    Regards
    Sudeep

  • Error while loading and adding IC Transaction

    I received the following error while loading IC transactions through the Load Tasks utility and adding an IC transaction through the Process IC Transactions option:
    Transactions Load - log:
    'Cell does not support inter-company transaction details'
    Rajhi Holding; Falcon; 4110100;[NONE];[NONE];[NONE];[NONE];CF1234; 1; 01/01/2006; SAR; 2000; 2000; 1; FC1234; Grant;OWE
    Kindly tell me what it is and how I can resolve this issue.

    Hello,
    The problem is that you have to include a Transactions rule in your rules file to enable the accounts where you want to load ICP transactions.
    If you are working with YTD, you should use a different account to load the ICP transactions and then make the adjustment,
    because the program will not allow you to load the data, and if you already have data it will give you an error if you load your rules with the integrity box checked.
    Sub Transactions()
         ' Enable intercompany transaction detail on the accounts used to load ICP transactions
         Hs.SupportsTran "A#Modulo_ICP_Ventas"
         Hs.SupportsTran "A#Modulo_ICP_Costos"
         Hs.SupportsTran "A#Modulo_ICP_Deudores"
         Hs.SupportsTran "A#Modulo_ICP_Acreedores"
    End Sub
    hope it helps

  • Excel, PowerView error in SharePoint 2013: "An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access the data source."

    I've installed SQL Server 2012 SP1 + SharePoint Server 2013 + SSRS and the PowerPivot add-in.
    I also configured Excel Services correctly. Everything works fine, but Power View doesn't work!
    When I open an Excel workbook containing a Power View report, an error occurs: "An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions
    to access the data source."
    error detail: 
    <detail><ErrorCode xmlns="http://www.microsoft.com/sql/reportingservices">rsCannotRetrieveModel</ErrorCode><HttpStatus xmlns="http://www.microsoft.com/sql/reportingservices">400</HttpStatus><Message xmlns="http://www.microsoft.com/sql/reportingservices">An
    error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access the data source.</Message><HelpLink xmlns="http://www.microsoft.com/sql/reportingservices">http://go.microsoft.com/fwlink/?LinkId=20476&amp;EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&amp;EvtID=rsCannotRetrieveModel&amp;ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&amp;ProdVer=11.0.3128.0</HelpLink><ProductName
    xmlns="http://www.microsoft.com/sql/reportingservices">Microsoft SQL Server Reporting Services</ProductName><ProductVersion xmlns="http://www.microsoft.com/sql/reportingservices">11.0.3128.0</ProductVersion><ProductLocaleId
    xmlns="http://www.microsoft.com/sql/reportingservices">127</ProductLocaleId><OperatingSystem xmlns="http://www.microsoft.com/sql/reportingservices">OsIndependent</OperatingSystem><CountryLocaleId xmlns="http://www.microsoft.com/sql/reportingservices">1033</CountryLocaleId><MoreInformation
    xmlns="http://www.microsoft.com/sql/reportingservices"><Source>ReportingServicesLibrary</Source><Message msrs:ErrorCode="rsCannotRetrieveModel" msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&amp;EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&amp;EvtID=rsCannotRetrieveModel&amp;ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&amp;ProdVer=11.0.3128.0"
    xmlns:msrs="http://www.microsoft.com/sql/reportingservices">An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access the
    data source.</Message><MoreInformation><Source>Microsoft.ReportingServices.ProcessingCore</Source><Message msrs:ErrorCode="rsErrorOpeningConnection" msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&amp;EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&amp;EvtID=rsErrorOpeningConnection&amp;ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&amp;ProdVer=11.0.3128.0"
    xmlns:msrs="http://www.microsoft.com/sql/reportingservices">Cannot create a connection to data source 'EntityDataSource'.</Message><MoreInformation><Source></Source><Message>For more information about this error navigate
    to the report server on the local server machine, or enable remote errors</Message></MoreInformation></MoreInformation></MoreInformation><Warnings xmlns="http://www.microsoft.com/sql/reportingservices" /></detail>
    Please help me solve this issue. I don't know whether uploading the Excel workbook is enough or whether it needs to connect to another data source.
    I appreciate it in advance.

    Hi Ali.y,
    Based on the current error message, the error can be related to the
    Claims to Windows Token Service (C2WTS) and is an expected error under certain conditions. To verify the issue, please check the aspects below:
         1. The C2WTS Windows service and C2WTS SharePoint service are both running.
         2. Check that the SQL Server Browser service is running on the machine that has the PowerPivot instance of SSAS.
         3. Check the domains. You're signing into SharePoint with a user account in some domain (call it Domain A). Domain A must either be the same as the domain in which the SharePoint server itself is located (call it Domain B), or Domain A must trust Domain B.
    In addition, the error may be caused by a Kerberos authentication issue due to a missing SPN. To make Kerberos authentication work, you need to configure Analysis Services to run under a domain account and register the SPNs for the Analysis
    Services server.
    To create the SPN for the Analysis Services server that is running under a domain account, run the following commands at a command prompt:
    • Setspn.exe -S MSOLAPSvc.3/Fully_Qualified_domainName OLAP_Service_Startup_Account
    Note: Fully_Qualified_domainName is a placeholder for the FQDN.
    • Setspn.exe -S MSOLAPSvc.3/serverHostName OLAP_Service_Startup_Account
    For more information, please see:
    How to configure SQL Reporting Services 2012 in SharePoint Server 2010 / 2013 for Kerberos authentication
    Regards,
    Heidi Duan
    TechNet Community Support

  • Error while Loading data through .csv file

    Hi,
    I am getting the date error below when loading data into OLAP tables through a .csv file.
    The data stored in the .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

    1) Wrong format; you won't get much support for loading OLAP cubes in here, I think.
    2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
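    For what it's worth, a quick check outside Informatica (plain Python, purely illustrative) shows why the mangled value cannot be repaired downstream: the scientific-notation form has already lost the day and time digits, while the original 14-digit string parses fine with the same mask.
    from datetime import datetime
    original = "20071113121100"   # value as written to the .csv by the SQL extract
    mangled = "2.00711E+13"       # value after Excel reformats and re-saves the cell
    # the original 14-digit string converts cleanly with the mask used in the mapping
    print(datetime.strptime(original, "%Y%m%d%H%M%S"))   # 2007-11-13 12:11:00
    # the mangled value has lost the trailing digits for good
    print(f"{float(mangled):.0f}")                        # 20071100000000
    # so TO_DATE(..., 'YYYYMMDDHH24MISS') can only fail; the file has to be regenerated
    # without letting Excel re-save it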
    *** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target column is set up as a date datatype and the source is String(19).
    The expression in Informatica is set up as below:
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak

  • Error in loading data into essbase while using Rule file through ODI

    Hi Experts,
    Referring to my previous post, Error while using Rule file in loading data into Essbase through ODI:
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to existing values I get an error.
    'test' is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
    at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for the reply. I have already checked that no locks are held on the rule file. I don't know what the problem is. It works fine without the rule file, but throws the error only when the rule file is used.
    Please help on this.
    Thanks,
    Chinnu

  • An error occurred while loading the model for the item or data source 'EntityDataSource'.

    Hi Team,
    We are trying to configure Power View in SharePoint 2013. We did the entire configuration by following the steps below. When we open an Excel workbook (2013) containing a
    Power View sheet (using Excel Web Services) deployed in a SharePoint document library, we get the error below.
    “An error occurred while loading the model for the item or data source
    'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access the data source.”
    Steps followed for configuring Power View in SharePoint Server 2013 (Single Machine setup):
    Installed SharePoint Server 2013 Enterprise edition
    Did not configure it at this time
    Installed SQL Server 2012 with default instance installation (selected every features)
    Later installed SP1 for SQL Server 2012
    After that, installed Analysis Services in SharePoint mode (SQL Server PowerPivot for SharePoint)
    Then installed another Analysis Services instance and selected the tabular data model mode
    Then ran the SharePoint 2013 configuration wizard for farm configuration (only for Central Administration creation)
    Later we ran PowerPivot for SharePoint 2013 (spPowerpivot.msi)
    Then, using the PowerPivot Configuration Tool for 2013, completed all the steps, configured Excel Services, and registered the above two Analysis Services instances.
    Questions/Doubts are:
    Are the steps we followed above correct for configuring Power View in SharePoint Server 2013 Enterprise edition?
    Is SQL Server 2012 Enterprise edition with SP1 sufficient to view Power View sheets in SharePoint 2013, or do we need to use the SQL Server 2012 SP1
    CTP3 software for the Power View feature, as mentioned in the link below?
    http://technet.microsoft.com/en-us/library/jj219634.aspx
    However, we don't have an option to download the CTP3 software from the Microsoft site now.
    Or do we need to install
    Microsoft SQL Server 2012 With Power View For Multidimensional Models CTP?
    http://www.microsoft.com/en-sg/download/details.aspx?id=35822
    A few blogs say that the SQL Server 2012 instance should have been installed with SP1 in one go (not separately).
    http://www.microsoft.com/en-in/download/details.aspx?id=35575
    Error detail :
    <detail><ErrorCode xmlns="http://www.microsoft.com/sql/reportingservices">rsCannotRetrieveModel</ErrorCode><HttpStatus xmlns="http://www.microsoft.com/sql/reportingservices">400</HttpStatus><Message
    xmlns="http://www.microsoft.com/sql/reportingservices">An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access
    the data source.</Message><HelpLink xmlns="http://www.microsoft.com/sql/reportingservices">http://go.microsoft.com/fwlink/?LinkId=20476&amp;EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&amp;EvtID=rsCannotRetrieveModel&amp;ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&amp;ProdVer=11.0.3000.0</HelpLink><ProductName
    xmlns="http://www.microsoft.com/sql/reportingservices">Microsoft SQL Server Reporting Services</ProductName><ProductVersion xmlns="http://www.microsoft.com/sql/reportingservices">11.0.3000.0</ProductVersion><ProductLocaleId
    xmlns="http://www.microsoft.com/sql/reportingservices">127</ProductLocaleId><OperatingSystem xmlns="http://www.microsoft.com/sql/reportingservices">OsIndependent</OperatingSystem><CountryLocaleId xmlns="http://www.microsoft.com/sql/reportingservices">1033</CountryLocaleId><MoreInformation
    xmlns="http://www.microsoft.com/sql/reportingservices"><Source>ReportingServicesLibrary</Source><Message msrs:ErrorCode="rsCannotRetrieveModel" msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&amp;EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&amp;EvtID=rsCannotRetrieveModel&amp;ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&amp;ProdVer=11.0.3000.0"
    xmlns:msrs="http://www.microsoft.com/sql/reportingservices">An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access
    the data source.</Message><MoreInformation><Source>Microsoft.ReportingServices.ProcessingCore</Source><Message msrs:ErrorCode="rsErrorOpeningConnection" msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&amp;EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&amp;EvtID=rsErrorOpeningConnection&amp;ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&amp;ProdVer=11.0.3000.0"
    xmlns:msrs="http://www.microsoft.com/sql/reportingservices">Cannot create a connection to data source 'EntityDataSource'.</Message><MoreInformation><Source>Microsoft.AnalysisServices.SPClient</Source><Message>We
    cannot locate a server to load the workbook Data Model.</Message><MoreInformation><Source></Source><Message>We cannot locate a server to load the workbook Data Model.</Message><MoreInformation><Source>Microsoft.Office.Excel.Server.WebServices</Source><Message>We
    cannot locate a server to load the workbook Data Model.</Message></MoreInformation></MoreInformation></MoreInformation></MoreInformation></MoreInformation><Warnings xmlns="http://www.microsoft.com/sql/reportingservices"
    /></detail>
    Please help me in  configuring Power View for SharePoint 2013.
    Thanks in advance.
    Pavan Kumar

    I'll bet that I know the answer to this one - and you probably won't like it ;)
    Here's the telling portion of that error:  "We cannot locate a server to load the workbook Data Model."
    This error is thrown in two scenarios; the first, when you haven't registered an analysis server.  The second appears to be the issue you're having:
    PowerPivot requires SQL Server 2012 SP1.  Microsoft issued a release of SQL Server 2012 SP1 that really wasn't SP1...
    Check the version of SQL that you are running.  It should be version 11.0.3000 or greater.  If it's not, you have the wrong "SP1" installed.  The correct one is
    here.
    Here's the bad news: you can't simply redeploy the correct one, because the installer already thinks that SP1 is installed. You'll have to back up (or detach and copy) your databases, then re-install SQL with the correct SP1, then restore or reattach
    the databases. Once you've done that, PowerPivot and Power View will work properly.
    I sincerely hope this resolves your issues - we wasted TWO WEEKS of our time with this problem!
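    As a side note, here is a minimal sketch of the version check described above (the 11.0.3000 baseline comes from this thread; the sample build numbers are illustrative):
    # returns True when a reported SQL Server build is at or above the SP1 baseline 11.0.3000
    def meets_sp1_baseline(product_version: str, baseline=(11, 0, 3000)) -> bool:
        parts = tuple(int(p) for p in product_version.split(".")[:3])
        return parts >= baseline
    print(meets_sp1_baseline("11.0.3128.0"))   # True  - a genuine SP1 build
    print(meets_sp1_baseline("11.0.2100.60"))  # False - illustrative pre-SP1 build, the "wrong SP1" case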

  • Error while loading Data in HFM application through ODI

    Hello All,
    I am performing an integration from a source (flat file) to an HFM application. I have created an integration for it, and while loading the data I get the following error:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
      File "<string>", line 3, in <module>
        at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
        at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
        at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
        at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
        at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
        at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
        at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
        at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
        at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
        at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
        at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:577)
        at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:468)
        at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2128)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
        at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
        at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
        at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
        at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
        at java.lang.Thread.run(Thread.java:662)
    Caused by: Traceback (most recent call last):
      File "<string>", line 3, in <module>
        at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
    com.hyperion.odi.common.ODIHAppException: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
        at org.python.core.PyException.fillInStackTrace(PyException.java:70)
        at java.lang.Throwable.<init>(Throwable.java:181)
        at java.lang.Exception.<init>(Exception.java:29)
        at java.lang.RuntimeException.<init>(RuntimeException.java:32)
        at org.python.core.PyException.<init>(PyException.java:46)
        at org.python.core.PyException.<init>(PyException.java:43)
        at org.python.core.Py.JavaError(Py.java:455)
        at org.python.core.Py.JavaError(Py.java:448)
        at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
        at org.python.core.PyObject.__call__(PyObject.java:355)
        at org.python.core.PyMethod.__call__(PyMethod.java:215)
        at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
        at org.python.core.PyMethod.__call__(PyMethod.java:206)
        at org.python.core.PyObject.__call__(PyObject.java:397)
        at org.python.core.PyObject.__call__(PyObject.java:401)
        at org.python.pycode._pyx15.f$0(<string>:6)
        at org.python.pycode._pyx15.call_function(<string>)
        at org.python.core.PyTableCode.call(PyTableCode.java:165)
        at org.python.core.PyCode.call(PyCode.java:18)
        at org.python.core.Py.runCode(Py.java:1204)
        at org.python.core.Py.exec(Py.java:1248)
        at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
        at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
        ... 19 more
    Caused by: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
        at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:240)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
        ... 33 more
    Caused by: com.hyperion.odi.common.ODIHAppException: Properties [CLEAR_ALL_METADATA_BEFORE_LOAD, REPLACE_MODE] are mandatory.
        at com.hyperion.odi.hfm.ODIHFMUtil.validateRequiredProperties(ODIHFMUtil.java:87)
        at com.hyperion.odi.hfm.ODIHFMMetadataLoader$OptionsMetadataLoad.validate(ODIHFMMetadataLoader.java:66)
        at com.hyperion.odi.hfm.ODIHFMMetadataLoader.validateOptions(ODIHFMMetadataLoader.java:188)
        at com.hyperion.odi.hfm.ODIHFMAppStatement.validateLoadOptions(ODIHFMAppStatement.java:168)
        at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:195)
        ... 38 more
    Any help appreciated
    Thanks,
    Pratik

    The LKM looks fine. Sorry, it was my bad: the "data" at the end was not visible, and you are right, it's the IKM for the integration.
    The log file that you have provided talks about these parameters, but I doubt they are related to data.
    I was going through the link and it talks about the parameters CLEAR_ALL_METADATA_BEFORE_LOAD and REPLACE_MODE, but they are related to metadata:
    Oracle Hyperion Financial Management
    I have no idea. I will keep you posted if I get to know anything.
    HTH
    Amarnath
    ORACLE | Essbase

  • Error while loading data from PSA to Infoobject through DTP(URGENT)

    Hi-
    I am running into an issue while loading data from the PSA to an InfoObject.
    It says
    Data Package 1 : error during processing.
    Updating Attributes for Infoobject
    Error in substep
    Process Terminated
    Please let me know your solution as this is very urgent.

    Data Package 1 : error during processing.
    This error occurs when your flat file is open at the time of scheduling. Please close your flat file; it will then upload correctly.
    Let me know if you have any further queries.

  • While loading through External Tables, Japanese characters wrong load

    Hi all,
    I am loading a text file through an external table. While loading, Japanese characters are loaded as junk characters. In the text file, the characters display correctly.
    My spool file:
    SET ECHO OFF
    SET VERIFY OFF
    SET Heading OFF
    SET LINESIZE 600
    SET NEWPAGE NONE
    SET PAGESIZE 100
    SET feed off
    set trimspool on
    spool c:\SYS_LOC_LOGIC.txt
    select CAR_MODEL_CD||',' || MAKER_CODE||',' || CAR_MODEL_NAME_CD||',' || TYPE_SPECIFY_NO||',' ||
         CATEGORY_CLASS_NO||',' || SPECIFICATION||',' || DOOR_NUMBER||',' || RECOGNITION_TYPE||',' ||
         TO_CHAR(SALES_START,'YYYY-MM-DD') ||',' || TO_CHAR(SALES_END,'YYYY-MM-DD') ||',' || LOGIC||',' || LOGIC_DESCRIPTION
    from Table where rownum < 100;
    spool off
    My external table load script:
    CREATE TABLE SYS_LOC_LOGIC
    (
         CAR_MODEL_CD        NUMBER,
         MAKER_CODE          NUMBER,
         CAR_MODEL_NAME_CD   NUMBER,
         TYPE_SPECIFY_NO     NUMBER,
         CATEGORY_CLASS_NO   NUMBER,
         SPECIFICATION       VARCHAR2(300),
         DOOR_NUMBER         NUMBER,
         RECOGNITION_TYPE    VARCHAR2(30),
         SALES_START         DATE,
         SALES_END           DATE,
         LOGIC               NUMBER,
         LOGIC_DESCRIPTION   VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL
    (
         TYPE ORACLE_LOADER
         DEFAULT DIRECTORY XMLTEST1
         ACCESS PARAMETERS
         (
              RECORDS DELIMITED BY NEWLINE
              FIELDS TERMINATED BY ','
              MISSING FIELD VALUES ARE NULL
              (
                   CAR_MODEL_CD, MAKER_CODE, CAR_MODEL_NAME_CD, TYPE_SPECIFY_NO,
                   CATEGORY_CLASS_NO, SPECIFICATION, DOOR_NUMBER, RECOGNITION_TYPE,
                   SALES_START date 'yyyy-mm-dd', SALES_END date 'yyyy-mm-dd',
                   LOGIC, LOGIC_DESCRIPTION
              )
         )
         LOCATION ('SYS_LOC_LOGIC.txt')
         --location ('products.csv')
    )
    REJECT LIMIT UNLIMITED;
    How can I solve this?
    Thanks in advance,
    Pal

    Just so I'm clear, user1 connects to the database server and runs the spool to generate a flat file from the database. User2 then uses that flat file to load that data back in to the same database? If the data isn't going anywhere, I assume there is a good reason to jump through all these unload and reload hoops rather than just moving the data from one table to another...
    What is the NLS_LANG set in the client's environment when the spool is generated? Note that the NLS_CHARACTERSET is a database setting, not a client setting.
    What character set is the text file? Are you certain that the text file is UTF-8 encoded, and not encoded using the operating system's local code page (assuming the operating system is capable of displaying Japanese text)?
    There is a CHARACTERSET parameter for the external table definition, but that should default to the character set of the database.
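    If it helps, here is a minimal sketch (the path comes from the spool script above; the candidate encoding list is an assumption) to check which encoding the spool file is actually in:
    # try a few candidate encodings against the raw bytes of the spool file
    candidates = ["utf-8", "shift_jis", "euc_jp", "cp932"]
    with open(r"c:\SYS_LOC_LOGIC.txt", "rb") as f:
        raw = f.read()
    for enc in candidates:
        try:
            raw.decode(enc)
            print(f"{enc}: decodes without errors")
        except UnicodeDecodeError as exc:
            print(f"{enc}: fails at byte offset {exc.start}")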
    Justin

  • Error while loading data on 0LIV_DS01 ODS through DTP

    Hi Expert,
    I am trying to load data into the 0LIV_DS01 ODS through the 2LIS_06_INV DataSource. The data is coming in correctly in the PSA,
    but when I try to load data into the ODS through the DTP it runs properly until the 9th package; at the 10th package it shows me an error
    like "Exception condition "CTT_NOT_FOUND" raised".
    Error Analysis
    "A RAISE statement in the program "SAPLRSW0" raised the exception
    condition "CTT_NOT_FOUND".
    Since the exception was not intercepted by a superior
    program, processing was terminated."
    Please tell me how I can rectify the error.
    Thanks and Regards
    Lalit Kumar

    Hi,
    This error has something to do with currency conversion.
    While loading from the DSO to the cube, do you have any currency conversions?
    Please look at the below threads:
    Loading error in 0LIV_DS01
    Raise Exception error: CTT_NOT_FOUND
    Hope this helps.
    Regards
    Vishal
