Member load error from EIS

I'm loading data into Essbase from EIS. After the member load, the following error is given:
EssbaseAPI: You have been logged out due to inactivity or explicitly by the administrator.
When I try the same thing using the user 'admin' it works fine, but the error occurs for other users.
Is there any setting that keeps a user from getting logged out?

Could you explain how to solve this issue?
Thanks in advance.

Similar Messages

  • Hierarchy load error from R/3

    Hi All,
    I got an error while loading a hierarchy from R/3 to BW. The error is:
    Diagnosis
        The hierarchy level of the parent node must be one larger than the
        hierarchy level of the actual node. This is not the case for the node
        with ID 00092826 and its parent node with ID 00092827.
    System response
    Procedure
        Correct the hierarchy level of the node or of the parent node in your
        source system.
    Procedure for System Administration
    Can anybody help with how to solve this error and where to correct this hierarchy in R/3?
    This is an urgent issue; all my dependent loads are stopped because of this error.
    Thanks in advance
    Narendra.

    Hi,
    Refer to Note 339453.
    Is this the first time?
    Thanks
    Assign points if useful

  • Loading error from BI to Oracle

    We are trying to connect to the ABAP connector through Open Hub to Oracle.
    a. A function module is called when we run the extractor to load data from BI to Oracle.
    b. When the extractor is started it calls a function module.
    c. When the function module is executed it opens a dialog box asking whether the data should be deleted from the table.
    d. This dialog box is stopping us from doing a load.
    I need a solution to suppress the dialog box.
    Thanks & Regards
    Ramakanth.

    Hi
    Thanks for your reply. The following are the settings in my interface.
    Source: MS SQL Server 2K. It contains a table called sampleTable with two fields, empId and empName; empId is the primary key. Under the Flow tab, I have selected LKM SQL to SQL.
    Target: Oracle 10g. It contains a table OracleTable with two fields, empId and empName; empId is the primary key. In the Flow tab, I have selected IKM Oracle Incremental Update.
    When I execute the interface it says "Session started". When I open Operator, I find the following error under the Sessions folder:
    933 : 42000 : java.sql.SQLException: ORA-00933: SQL command not properly ended
    java.sql.SQLException: ORA-00933: SQL command not properly ended
    at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
    at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
    Where can I see the code generated by the interface? Is it PL/SQL code?
    Please help me figure out where I am going wrong.
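    For what it's worth, ORA-00933 generally means the statement sent to Oracle carries syntax Oracle does not accept, and the generated code is usually visible in Operator by drilling into the failed session's steps. A minimal illustration (not the actual ODI-generated statement) of one common trigger, a T-SQL-style table alias:

        -- Oracle does not allow AS on a table alias and raises ORA-00933 here.
        SELECT e.empId, e.empName
        FROM OracleTable AS e;

        -- Oracle-accepted form of the same query:
        SELECT e.empId, e.empName
        FROM OracleTable e;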

  • Loading error from CRM datasource - Info IDoc received with status 8

    Hello All,
    NEED YOUR HELP
    We have a CRM datasource, 0CRM_SALES_ACT_1, which works well in the source system in RSA3. The data load is a full load and fetches 19,000 records from CRM.
    However, when we load it from an InfoPackage in BW it gives us "No data in source system", Info IDoc with status 8. We have tried the following to eliminate some doubts:
    1) Checked the connection in SM59 using ALEREMOTE. Connection and authorization successful. We have also loaded data into the BW system for other datasources from the same CRM system and it works well, so there are no authorization/connection issues.
    2) In the source system CRM, tried transaction RSA3 --> extract data: got 19,000 records.
    3) In the source system CRM, tried transaction RSA3 --> Import Request from BW --> successful.
    4) Checked the tRFC queue in SM58; the queue is empty.
    5) When we check the logs in CRM for ALEREMOTE we get the following message:
    Synchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
    DATASOURCE = 0CRM_SALES_ACT_1
             Current Values for Selected Profile Parameters
    abap/heap_area_nondia......... 0
    abap/heap_area_total.......... 9437184000
    abap/heaplimit................ 40000000
    zcsa/installed_languages...... ED
    zcsa/system_language.......... E
    ztta/max_memreq_MB............ 256
    ztta/roll_area................ 3000000
    ztta/roll_extension........... 2000000000
    Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 0 records
    Result of customer enhancement: 0 records
    IDOC: Info IDoc 2, IDoc No. 266228, Duration 00:00:00
    IDoc: Start = 2008.10.22 14:18:38, End = 2008.10.22 14:18:38
    Synchronized transmission of info IDoc 3 (0 parallel tasks)
    IDOC: Info IDoc 3, IDoc No. 266229, Duration 00:00:00
    IDoc: Start = 2008.10.22 14:18:39, End = 2008.10.22 14:18:39
    Job finished
    What is the issue? Data is present and confirmed available in the source system, yet we get 0 records in the BW system. We need your help!

    Hi
    Maybe you need to provide full authorizations to the user 'ALEREMOTE'. Ask your Basis team to check the authorizations.
    Regards,
    Chandu.

  • Data load error from ODS to Cube?

    Guru's,
    Here is the status of a data load; please advise.
    We are loading data from an ODS to a cube as a full load. The load is done through a process chain and runs every half hour. Each time, the earlier load is deleted and the next load is taken into the cube. Two days ago the load went wrong. When I tried to see the cause of the failure in the Monitor details, I could see that the update and processing steps were missing. In the process chain's log view I got information stating that some invalid characteristic had appeared. But when I search in the PSA, all of the data packets are green. If there were any invalid characteristics, the PSA should show a red status, but it does not. Please guide me on how I can solve this.
    Points will definitely be awarded for proper answers.
    Thanks in advance.
    vasu.

    Hi Vasuvasu
    Once the edit is done and the data is reloaded, it won't show you the records in red.
    Hope it helps!
    Regards
    KISHORE M REDDY
    **Winners Don't Do Different things,They Do things Differently...!**
    > I don't think this is due to disk space, because after
    > every full load we are deleting the data.

  • Invalid column name error while loading members in EIS

    We have renamed the sort_order_01 column to LEVEL01_SORT_ORDER in the SQL Server view, but the change is not reflected in EIS during the member load. We also manually added LEVEL01_SORT_ORDER in the table properties of the dimension member, but it is still not reflected in EIS, and it pops up the error: Invalid column name "sort_order_01".
    Hi techies, kindly throw some light on this.
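    A quick sanity check on the database side (a sketch with a hypothetical view name, dbo.dim_view; substitute the view your OLAP model actually reads): confirm the view really exposes the renamed column, since EIS builds its member-load SQL from the column names recorded in the OLAP model and metaoutline.

        -- Hypothetical view name; replace with the one registered in the OLAP model.
        -- If this fails, the rename never reached the view EIS queries.
        -- If it succeeds, the stale "sort_order_01" reference is cached in the
        -- EIS OLAP model / metaoutline metadata and must be refreshed there.
        SELECT TOP (1) LEVEL01_SORT_ORDER
        FROM dbo.dim_view;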

    We have done the same; even then it is not reflected in the OLAP model and metaoutline. Is the problem in SQL Server or in EIS?
    Please suggest and help me out.
    sunil

  • IS Error Member load terminated with error in Essbase Integration Services

    Hi all,
    Please help!
    When I try to build the outline of a cube using Essbase Integration Services 9, it gives me the following error:
    IS Error Member load terminated with error
    Any replies - appreciated.
    Thanks in advance.
    Avishek Sarkar.

    Hi Avishek,
    Can you please post the EIS log contents from when the error occurs? Also, which options are you selecting when loading members?
    Atul K,

  • EIS: parent-child member load gives "duplicate across dimensions" message

    I am trying to do a member load from the EIS (version 7.1) desktop using an ASO metaoutline. The outline does not get built properly for one of the dimensions.
    The message log shows that one of the values is a duplicate, but there are no duplicate values.
    The message log shows SQL similar to the following (the backend database is Oracle 9.2.0.4, connected with ODBC/OCI):
    SELECT  /*+ */ PARENT_ID, CHILD_ID, ALIAS_NAME, SEQUENCE_NUMBER from DIM1 order by 4 ASC
    Example data:
    CHILD_ID  PARENT_ID  SEQUENCE_NUMBER
    ROOT                 1
    L1_1      ROOT       2
    L1_2      ROOT       3
    L1_3      ROOT       4
    L2_1      L1_3       5
    L2_2      L1_3       6
    L1_4      ROOT       7
    L1_5      ROOT       8
    The message log shows the following line twice:
    Member "L1_3" is duplicate across all dimensions.
    Member "L1_3" is duplicate across all dimensions.
    I checked all the dimensions and the member value is unique across all dimensions.
    The outline gets created as follows (L1_1, L1_2 and L1_3 show up as siblings of DIM1 instead of children):
    Outline
        DIM1
            L1_2
            L1_3
            L1_4
            L1_5
        L1_1
        L1_2
        L1_3
    The dimension has recursion defined in the model, with PARENT_ID as the parent column and CHILD_ID as the child column.
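    One way to double-check the source before going further (a sketch against the DIM1 table shown in the log above): list any CHILD_ID values that occur more than once, which is what the "duplicate across all dimensions" message implies.

        -- If this returns no rows, the source itself is clean and the duplicate
        -- is being introduced during the build (e.g. L1_3 landing both under
        -- DIM1 and as a stray sibling), not by the data.
        SELECT CHILD_ID, COUNT(*) AS occurrences
        FROM DIM1
        GROUP BY CHILD_ID
        HAVING COUNT(*) > 1;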

    Thanks John - still not quite working
    I should give another detail / extra complexity
    The "Geographic Location" dimensions looks like this:
    "Geographic Location" (Gen 1)
    "United States" (Gen 2)
    "SC" (Gen 3)
    "Unique Facility 1", "Unique Facility 2" etc (Gen 4)
    The data I have is for Facility 1 and Facility 2, which are unique member names that just happen to roll up under SC in the outline. I would have thought the data would load directly to those members without needing to specify their parent values.
    Do I need to specify [Geographic Location].[United States].[SC].[Facility 1]? This seems like something that needs to be done in the source file rather than with the prefix/suffix features of the rules file.

  • EIS Member Build Error

    We have an ASO cube in 11.1.2.0 which is refreshed nightly with the outline and data for the most recent month. Occasionally we get an error during the member-only load process. It appears after the outline-write and verification stage. At the very last step it fails, saying: "Applying Compression to Dimension 'Metrics'. IS Error Member load terminated with error."
    The EIS outline itself is correct, because most of the time it works fine. Has anyone experienced a similar issue and does anyone know the root cause and fix?
    Thanks a lot

    Thanks all! The error has been resolved.
    I just had to create a directory in the Integration Services folder: $ISHOME/loadinfo.
    The loadinfo folder was missing.
    Prathap,
    Is that view available at that time? The query is generated automatically.
    Which data source and which version of Hyperion? The data source is Oracle 10g and the Hyperion version is 9.3.

  • ERROR: "Exceptions in Substep: Rules" while loading data from DSO to cube

    Hi All,
    I have a scenario like this.
    I was loading data from an ODS to a cube daily. Everything was working fine until two days ago. I added some characteristic fields to my cube, while I didn't make any change to my key figures. Now I have started reloading data from the ODS to the cube, and I get the error "Exceptions in Substep: Rules" after some packages of data have loaded successfully.
    I deleted the added fields from the InfoCube and then tried to reload the data again, but I am facing the same error: "Exceptions in Substep: Rules".
    Can anyone tell me how to get out of this situation?
    Regards,
    Komik Shah

    Dear,
    Check some related SDN posts:
    zDSo Loading Problem
    Re: Regarding Data load in cube
    Re: Regarding Data load in cube
    Re: Error : Exceptions in Substep: Rules
    Regards,
    Syed Hussain.

  • Error when loading data from DSO to Cube

    Hi,
    After upgrading to 2004s we get the following error message when trying to load data from a DSO to a Cube:
    Unexpected error: RSDRC_CONVERT_RANGES_TO_SELDR
    Has anyone experienced a similar problem or have some guidelines on how to resolve it?
    Kind regards,
    Fredrik

    Hi Martin,
    Thanks!
    Problem solved.
    //Fredrik

  • Load error code 11 when attempting to open .vi from library

    Hello,
    Recently a colleague sent me a .llb containing a VI and sub-VIs of a program needed for my research. The VI opens correctly, but none of the sub-VIs can be found by it, and when I attempt to open the sub-VIs I get load error code 11: VI version (6.0) cannot be converted to the current LabVIEW version (8.5.1) because it has no block diagram.
    I am running Vista with 8.5.1 and attempting to open a library of version 6.0. I do not know what OS the library was compiled on (Linux is a possibility). The colleague who sent me the program does not know either, as he received it from another colleague whom he is no longer in contact with.
    I am fairly certain the library was saved with block diagrams, so I do not think that is the source of the error.
    Does anyone know what might be a way to get these sub-VIs working?

    Okay,
    File posted.  Someone please tell me if this has a block diagram on it.
    Attachments:
    Find First Error.vi (24 KB)

  • Bought Apple iPhone 4s 16GB in error from family member, but wanted 32GB. Is there any way to increase space, as planning to use it a lot for music and photos?

    Bought iPhone 4s 16GB from family member in error. Wanted 32GB as a present for my daughter.
    Is there any way to increase space, as she needs lots for photos and music, other than deleting stuff?

    harryham12 wrote:
    Bought iPhone 4s 16GB from family member in error. Wanted 32GB as a present for my daughter.
    Is there any way to increase space, as she needs lots for photos and music, other than deleting stuff?
    No.

  • Error while loading table from flat file (.csv)

    I have a flat file which I am loading into a target table in Oracle Warehouse Builder. It uses SQL*Loader internally to load the data from the flat file, and I am facing an issue. Please find the following error (this is an extract from the generated error log):
    SQL*Loader-500: Unable to open file (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv)
    SQL*Loader-552: insufficient privilege to open file
    SQL*Loader-509: System error: The data is invalid.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    I believe this is related to a SQL*Loader error.
    Actually, the flat file resides on my system (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv). I am connecting to an Oracle server.
    Please suggest:
    Is it required that I place the flat file on the Oracle server system?
    Regards,
    Ashoka BL
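    As a rule, the SQL*Loader job that OWB runs on the server side can only read files the database server host can reach, so a path on your local D:\ drive will not work. One way to verify server-side access is an external table over the same directory (a sketch with made-up column names; adjust the layout to the real Acquisition.csv):

        -- The directory object must point at a path on the database server host.
        CREATE OR REPLACE DIRECTORY src_systems AS '/path/visible/to/the/db/server';

        -- Hypothetical two-column layout; replace with the file's real structure.
        CREATE TABLE acquisition_ext (
          col1 VARCHAR2(100),
          col2 VARCHAR2(100)
        )
        ORGANIZATION EXTERNAL (
          TYPE ORACLE_LOADER
          DEFAULT DIRECTORY src_systems
          ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
          )
          LOCATION ('Acquisition.csv')
        )
        REJECT LIMIT UNLIMITED;

        -- If this SELECT works, the server can see and read the file.
        SELECT COUNT(*) FROM acquisition_ext;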

    Hi
    I am getting an error as well which is similar to the one described above, except that I get:
    SQL*Loader-500: Unable to open file (/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv)
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: The system cannot find the file specified.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    The difference is that Ashoka was getting
    SQL*Loader-552: insufficient privilege to open file
    and I get
    SQL*Loader-553: file not found
    The initial thought is that the file does not exist in the specified directory or that I have spelt the filename incorrectly, but this has been checked and double-checked. The Unix directory also has read and write permissions.
    Also in the error message is:
    Control File: C:\u21\oracle\owb_staging\WHITEST\source_depot\INV_LOAD_LABEL_INVENTORY.ctl
    Character Set WE8MSWIN1252 specified for all input.
    Data File: /u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv
    Bad File: C:\u21\oracle\owb_staging\WHITEST\source_depot\Durham_Inventory_Labels.bad
    As can be seen from the above, it seems to be trying to create the .ctl and .bad files on my C: drive instead of on the server in the same directory as the .csv file. The location is registered to the server directory /u21/oracle/owb_staging/WHITEST/source_depot.
    I am at a loss, as this works fine in development and I have just promoted all the development work to a systest environment using OMBPlus.
    The directory structure in development is the same as in systest, except that the data file is /u21/oracle/owb_staging/WHITED/source_depot/Durham_Inventory_Labels.csv, and everything works fine: the .ctl and .bad files are created in the same directory and the data successfully loads into an Oracle table.
    Have I missed a setting in OWB during the promotion to systest, or is there something wrong in the way the repository in the systest database is set up?
    The systest and development databases are on the same box.
    Any help would be much appreciated
    Thanks
    Edwin

  • Error while loading data from a text file to a datastore

    Hi all,
    I am a beginner in ODI. I am trying to load data from a text file into a datastore. I have put the file on a Unix server whose location is defined in Topology Manager. When I try to "view data" it gives me the error that the file does not exist at <location where I have put the file>. Please help me.

    Hi,
    1. If your files are on a remote File System, you will need to copy one of your files to the machine ODI Designer is running on to allow ODI to retrieve the metadata information of the file.
    2. In Topology create a Physical Schema, the directory you enter for Data and Work Schema should point at this local file.
    3. Then define the File Datastore in ODI Designer. Enter a name, browse to and select the file, and fill in each field of the Files tab.
    * If it is a fixed-width file, click on the grid icon on the Columns tab to enter the columns and have Automatic Adjustment checked.
    * If it is a delimited file, use the Reverse button on the Columns tab to reverse the columns.
    * Right click on the File Datastore select View Data, if you can view data, that means the File Datastore has been defined correctly.
    * If not, check each tab of the File Datastore to make sure everything is defined correctly and retry.
    4. Once View Data is successful, you now change the directories (Data and Work Schema in Topology) to point at the remote File System. These directories must be accessible to the ODI Agent that will be used to run the transformations. The directory can be an absolute path (m:/public/data/files) or relative to the ODI Agent startup directory (../demo/files). It is strongly advised to use a UNC (independent from the execution location) for the path. When running the transformations with "no agent", the directory is relative to the directory where Oracle Data Integrator has been installed.
    You need to have an agent process running on the system where your source file resides.
    Then, while running the ODI interface, choose that agent.
    Thanks,
    Sutirtha
