Why are stale sessions created during data load execution?

Hi Experts,
Can anyone tell me why stale sessions are created every time I run an incremental load in the Operator tab? Thanks in advance.
Regards,
raj

Hi Experts,
I found the information below on some blogs. Is it correct?
When the network connection has a problem, or an agent momentarily gets disconnected from the master and work repositories, we end up with stale sessions. The same can happen when the master and work repositories crash.
Thanks

Similar Messages

  • What are the frequent data load errors in production support?


    It is a long list. Here are some of them:
    1. IDoc not arriving from the source system.
    2. Previous processes failed.
    3. Change run unsuccessful or did not run properly, leaving the master data in a mess.
    4. Invalid characters in the data.
    5. Duplicate records found.
    ...and on and on.
    Ravi Thotahdri

  • How many sessions are created on the server?

    Hi all developers,
    SCJP back again...
    Can anybody please tell me:
    Is there any way to find out how many sessions have been created on the server?
    Thanks in advance.

    Only by keeping track of them yourself.
    The servlet session listeners (HttpSessionListener and HttpSessionBindingListener) are useful here.
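    Here is one minimal counting listener (the class name is made up; this assumes the Servlet 3.0+ API, where @WebListener registers the listener, or it can be declared in web.xml on older containers):

    import java.util.concurrent.atomic.AtomicInteger;
    import javax.servlet.annotation.WebListener;
    import javax.servlet.http.HttpSessionEvent;
    import javax.servlet.http.HttpSessionListener;

    // Counts live sessions by listening for session creation and destruction.
    @WebListener
    public class SessionCounter implements HttpSessionListener {

        private static final AtomicInteger ACTIVE = new AtomicInteger();

        @Override
        public void sessionCreated(HttpSessionEvent se) {
            ACTIVE.incrementAndGet();   // a new session was created
        }

        @Override
        public void sessionDestroyed(HttpSessionEvent se) {
            ACTIVE.decrementAndGet();   // a session timed out or was invalidated
        }

        // Call from anywhere in the web app to read the current count.
        public static int activeSessions() {
            return ACTIVE.get();
        }
    }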

  • How to create a data load file from Excel

    Hi All,
    I'm new to HFM and would like to load data into an HFM application from an Excel file that contains all the data. When I load the data directly, it throws an error saying "No section has been specified to determine if this is data, description or line item detail". How can I convert this Excel file into the proper (.dat) format that HFM understands?

    There are several ways to get this data into HFM.
    1) FDM - best option if you have it
    2) Webforms/Data Grids
    3) HsSetValue formulas in Excel
    4) DAT file loads
    5) JVs, etc
    If you wish to use DAT files created via Excel, you will likely want to use Excel VBA macros to create your DAT file to load. We do this on occasion for special projects and it works quite well. What you can do is set up an Excel file with your data inputs to look however you want, then link your POV members and amounts to another tab (we commonly call this the Export tab and it is set up in an HFM-friendly format).
    Create a macro to write a DAT file to a specified location using data from the Export tab. The DAT file will need to be formatted as below. For a specific sample, you can extract data from your HFM app and see the format.
    !Data
    Scenario;Year;Period;View;Entity;Value;Account;ICP;Custom1;Custom2;Custom3;Custom4;Amount
    Scenario;Year;Period;View;Entity;Value;Account;ICP;Custom1;Custom2;Custom3;Custom4;Amount
    Scenario;Year;Period;View;Entity;Value;Account;ICP;Custom1;Custom2;Custom3;Custom4;Amount
    Scenario;Year;Period;View;Entity;Value;Account;ICP;Custom1;Custom2;Custom3;Custom4;Amount
    Scenario;Year;Period;View;Entity;Value;Account;ICP;Custom1;Custom2;Custom3;Custom4;Amount
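    (The macro itself would normally be written in Excel VBA, as described above. Purely to illustrate the write-out step, here is a minimal Java sketch that produces the same semicolon-delimited layout; the member names, amount, and output path are made-up examples.)

    import java.io.PrintWriter;

    public class DatFileWriter {
        public static void main(String[] args) throws Exception {
            // Hypothetical rows mirroring an Export tab: one POV per line,
            // fields in the order shown above, with the amount last.
            String[][] rows = {
                {"Actual", "2013", "Jan", "YTD", "Ent01", "<Entity Currency>",
                 "Sales", "[ICP None]", "[None]", "[None]", "[None]", "[None]", "1000"}
            };
            try (PrintWriter out = new PrintWriter("load.dat")) {   // made-up path
                out.println("!Data");                   // section header HFM expects
                for (String[] row : rows) {
                    out.println(String.join(";", row)); // semicolon-delimited record
                }
            }
        }
    }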
    Brush up on Replace, Merge, or Accumulate load options in the HFM Admin and User Guides, then upload your new DAT file.

  • DMRS_tablename: why are these tables created?

    Hi ,
    I created a schema and designed the data model in SQL Data Modeler. After extracting and running the DDL, the tables were created perfectly, but many redundant tables named DMRS_tablename were also created, and I don't know why.
    I am sure these were created only after connecting the schema to the Data Modeler tool. Kindly let me know why they are created, what those tables are, and what their importance is.
    thanks in advance,
    Jeevanand.K

    Hi Jeevanand,
    DMRS_xxx tables are created by the "Export to Reporting Schema" functionality. If you don't need a reporting repository, go to "Export to Reporting Schema" again, select the connection for your schema, and drop the tables using "Drop repository" on the Maintenance tab.
    Philip

  • Why are user sessions frequently blocked?

    Since we migrated the database from Oracle 8i to Oracle 9i, and the application to Developer 6i, we have been facing a new problem: user sessions are frequently blocked.
    Does anybody have any idea? Is there a database parameter that must be changed?
    Thanks,

    Hi,
    Since you migrated from Oracle 8i to Oracle 9i, performance is supposed to increase rather than decrease, provided the hardware and Oracle configuration did not change.
    How did you determine that user sessions are frequently blocked?
    You can query v$session_wait to check why a session is blocked or waiting, and v$locked_object to check whether it is blocked by a lock on some other object.
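    For example, a minimal JDBC sketch of those two checks (connection details are placeholders; querying v$ views requires privileges your DBA must grant):

    import java.sql.*;

    public class BlockedSessionCheck {
        public static void main(String[] args) throws SQLException {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "system", "password");
                 Statement st = con.createStatement()) {

                // What is each session currently waiting on?
                try (ResultSet rs = st.executeQuery(
                        "SELECT sid, event, seconds_in_wait FROM v$session_wait")) {
                    while (rs.next()) {
                        System.out.printf("SID %d waiting on '%s' for %d s%n",
                                rs.getInt(1), rs.getString(2), rs.getLong(3));
                    }
                }

                // Which objects are locked, and by which sessions?
                try (ResultSet rs = st.executeQuery(
                        "SELECT session_id, object_id, locked_mode FROM v$locked_object")) {
                    while (rs.next()) {
                        System.out.printf("SID %d holds lock mode %d on object %d%n",
                                rs.getInt(1), rs.getInt(3), rs.getInt(2));
                    }
                }
            }
        }
    }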
    Regards

  • Very slow performance in every area after massive data load

    Hi,
    I'm new to Siebel. I had a call from a customer saying that virtually every aspect of the application (login, etc.) is slow after they did a massive data load of around 15 GB.
    Could you please point out the best practices for this kind of massive data load exercise? All the table statistics are up to date.
    Has anyone encountered this kind of problem before?

    Hello,
    Siebel CRM is a highly customizable customer relationship management solution. There are a number of customizations (scripting, workflows, web services, ...) and integrations (custom C++, Java, ERP systems, ...) that can cause Siebel performance issues.
    Germain Monitoring v1.8.5 can help you clean up all your Siebel performance issues (it starts working about 5 minutes after installation, which itself can take between 4 hours and 10 days depending on whether it is deployed against your Siebel dev/QA or production environment) and then monitor your Siebel production system at every layer of your infrastructure, down to Siebel user clicks and back-end transactions, either solving or identifying the root cause of Siebel performance issues, 24x7.
    Germain Monitoring Software (currently version 1.8.5) helps Siebel customers 1) solve more quickly the performance issues introduced by customizations, and 2) effectively solve performance issues before the business is impacted once Siebel is in production.
    Customers like NetApp, J.M. Smucker, and Alltel/Verizon have saved hundreds of thousands of dollars using Germain Monitoring software.
    Let us know whether you would like to discuss this further. Good luck with these issues,
    Regards,
    Yannick Germain
    GERMAIN SOFTWARE LLC
    Siebel Performance Software
    21 Columbus Avenue, Suite 221
    San Francisco, CA 94111, USA
    Cell: +1-415-606-3420
    Fax: +1-415-651-9683
    [email protected]
    http://www.germainsoftware.com

  • Choosing Dimension Members / Creating Data Load Rule Files

    Hi
    Let me explain the problem using an example. Let's say I have a dimension table called "Product" which has ProductId, ProductCode, and ProductName fields, and a fact table with ProductId and Value columns. As you can see, ProductId is used to link to the fact table. I want to display ProductCode as the dimension member (with ProductName as the alias). I have no problem creating this scenario using EIS, but I am not sure whether I can do it with EAS; I tried but cannot get it working. Does anybody know how to do this in EAS, please? Also, how do I create the data load rule for this using ProductId?
    Regards
    Chandra

    You can do this using SQL load rules.
    One rule builds your dimension: a simple SELECT statement on the dimension table alone.
    A second rule loads the data; it requires a SELECT statement with a join that pulls the product code (the member name) from the dimension table and the data from the fact table, linked on ProductId.
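    As a hedged sketch, using the table and column names from the example above, the data-load SELECT might look like this; here it is simply run through plain JDBC to sanity-check the join before pasting it into the load rule (connection details are placeholders):

    import java.sql.*;

    public class LoadRuleSqlPreview {
        public static void main(String[] args) throws SQLException {
            // Member name (ProductCode) from the dimension table,
            // data value from the fact table, joined on ProductId.
            String sql = "SELECT p.ProductCode, f.Value "
                       + "FROM Product p JOIN Fact f ON p.ProductId = f.ProductId";
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "user", "password");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(sql)) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " -> " + rs.getDouble(2));
                }
            }
        }
    }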

  • Data load in Essbase ASO cube

    Hi,
    I have not used ASO cubes before and have worked only on BSO cubes. Now I have a requirement to create a rule file to load data into an ASO Essbase cube. I created the data load rule file the way I would for a BSO cube, and it validates correctly. However, when I run the data load I get the following warning:
    "Aggregate storage applications ignore update to derived cells. [480] cells skipped"
    I investigated further and found that an ASO cube does not allow data to be loaded at upper levels or into members calculated through formulas. I have since made sure that I load data only into level-zero members that are not calculated through a formula. But I am still unable to load the data and keep getting the same warning.
    Could you please help me and let me know if there is anything else I am missing here?
    Thanks in advance...
    AKW

    Hi AKW,
    "Aggregate storage applications ignore update to derived cells. [480] cells skipped"This is only a warning message that means only those many cells were skipped might be for some reasons like any member pointing to those cells will be missing.
    If you want to copy the Data of your BSO cube to an ASO Application why dont you use an PARTIONING it will copy your whole data from BSO to ASO (If Outline is common in both then copy any member of Sparse dimension like "Scenario 1" from Source i.e. BSO, to same member like "Scenario 1" in Target i.e ASO ),
    This is only an alternate wayThanks
    Avneet Singh Bhatia

  • Data Load Rules file - Conditional load

    Hi,
    I need to create a data load rules file to load sales data. The data comes at the product, entity, month, year, version, scenario, and account level. I need to load data against only two of the entities, NCC and NCCCBU: if a product comes under a particular hierarchy called UK, it needs to be loaded against the NCC entity member, and if the product is under the CBU hierarchy, it needs to be loaded against the NCCCBU entity.
    All of the data comes in the same file, so I need to load it in one go without splitting the file in two.
    Thanks
    Vikash

    You can achieve this.
    In the data load rule, when you map the entity column, set the properties like this:
    Field Properties -> Global Properties
    Replace "UK" with "NCC"
    Replace "CBU" with "NCCCBU"
    Field Properties -> Data Load Properties
    Map the column to the Entity dimension.

  • Is the Data load option in 4.1 really helpful?

    Hi,
    While trying to create a data load page in my application, I was facing a "no data found" issue; I was able to resolve it at the time, but it raised another question, which I thought I would post separately rather than in the existing thread.
    I was trying to load data into a DB table from a .csv file, and at the data validation page/step it would throw a "no data found" error. I then noticed that on the second data/table mapping page, where all the rows are shown, there is a "Column Name" list box at the top whose entries default to "Do not load" and which contains all the column names.
    So, just to confirm: do we need to select all the columns manually every time? In the very first step I did select the option saying the column names are in the first row of the CSV file. We have quite a few columns, and selecting them would be tedious for the business to do manually. Shouldn't the application just use the CSV first-row headers as the column names?
    How do you work around this?
    Thanks,
    Sun

    Hi VC,
    It does take the column values from the first row of the file. The problem was that a column value had a space at the end; hence the issue. I was able to fix that.
    Thanks,
    Sun

  • Announcing 3 new Data Loader resources

    There are three new Data Loader resources available to customers and partners.
    •     Command Line Basics for Oracle Data Loader On Demand (for Windows) - This two-page guide (PDF) shows command line functions specific to Data Loader.
    •     Writing a Properties File to Import Accounts - This 6-minute Webinar shows you how to write a properties file to import accounts using the Data Loader client. You'll also learn how to use the properties file to store parameters, and how to use the command line to reference the properties file, thereby creating a reusable library of files for importing or overwriting numerous record types.
    •     Writing a Batch File to Schedule a Contact Import - This 7-minute Webinar shows you how to write a batch file to schedule a contact import using the Data Loader client. You'll also learn how to reference the properties file.
    You can find these on the Data Import Resources page, on the Training and Support Center.
    •     Click the Learn More tab> Popular Resources> What's New> Data Import Resources
    or
    •     Simply search for "data import resources".
    You can also find the Data Import Resources page on My Oracle Support (ID 1085694.1).

    Unfortunately, I don't believe that approach will work.
    We use a similar mechanism for some loads (using the bulk loader instead of web services) for the objects that have a large quantity of daily records.
    There is a technique (though messy) that works fine. Since Oracle does not allow the "queueing up" of objects of the same type (you have to wait for one "account" file to finish before you load the next "account" file), you can monitor the .LOG file for the SBL 0363 error, which means you cannot submit another file yet (typically because one is already being processed).
    By monitoring for this error code in the log, you can sleep your process and then try again after a preset amount of time.
    We use this to allow an UPDATE followed by an INSERT on the account, and then a similar technique so "dependent" objects wait for the prime object to finish processing.
    PS: Normal Windows .BAT scripts aren't sophisticated enough to handle this. I would recommend either Windows PowerShell or C/Korn/Bourne shell scripts on Unix. A rough sketch of the idea follows.
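    (Written here in Java rather than PowerShell, purely as an illustration of the polling idea; the log path, wait times, and the exact error token to search for are placeholders you would adapt.)

    import java.nio.file.*;
    import java.util.concurrent.TimeUnit;

    public class BulkLoadRetry {

        // Placeholder: invoke the Data Loader command line for the next file.
        static void submitNextFile() { /* site-specific, omitted */ }

        public static void main(String[] args) throws Exception {
            Path log = Paths.get("C:/dataloader/load.log");  // placeholder path
            while (true) {
                submitNextFile();
                TimeUnit.SECONDS.sleep(30);  // give the loader time to write the log
                String contents = new String(Files.readAllBytes(log));
                if (!contents.contains("SBL 0363")) {
                    break;  // no "busy" error, so the submission was accepted
                }
                TimeUnit.MINUTES.sleep(5);   // previous load still running; retry later
            }
        }
    }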
    I hope that helps some.

  • Relationship between ERPi metadata rules and Data load rules

    I am using version 11.1.1.3 to load EBS data into ERPi and then into FDM. I have created a metadata rule where I have assigned a ledger from the Source Accounting Entities, and a data load rule that references the same ledger as the metadata rule. I have about 50 ledgers to integrate, so I have added the source adapters that reference the data load rule.
    My question is: what is the relationship between the metadata rule and the data load rule with respect to the ledger? If you can specify only one ledger per metadata rule, how does FDM know to use another metadata rule with another ledger attached to it? Thanks!

    Aksik,
    1. How frequently does this activation problem occur? If it happened only once, replicate the DataSource and activate the transfer structure (though in general, as you know, activation of the transfer structure should happen automatically after the object is transported).
    2. One reason for a difference in time is environmental: as you know, in a production system many jobs run at the same time, so system performance will obviously be slower compared with a dev system. In your case, though, both systems are performing equally, at about 100,000 records per hour: you said the dev system took half an hour for 50,000 records and production took 2 hours for 200,000 records, so production simply has more records and took longer. If it really causes a problem, then you will have to do some performance tuning.
    Hope this helps.
    Thanks,
    Sat

  • Schema name is not showing in the Data load

    Hi All,
    I am trying to load a CSV file using the Oracle APEX data load option, with the options "new table" and "file upload (.csv)". On the load data page, the schema list does not include my current schema, because of which I cannot upload the CSV file.
    Can anyone please help with this?
    I am using Oracle APEX 4.1.1.
    Regards
    Rajendrakumar.P

    Raj,
    Did you perhaps export this application from another workspace? I have seen in the past that if you create a data load page based on schema A, then import that application and set it to parse as schema B, it will not work.
    The solution, although unsupported, is to simply alter the parse-as schema reference in the APEX export file. A more supported approach would be to re-create the data load pages in the target application, so that they pick up the proper parse-as schema.
    Thanks,
    - Scott -
    http://spendolini.blogspot.com
    http://www.enkitec.com

  • Unable to Create New Data Model

    Hello all,
    I have been using BI Publisher 11g for some time, but I am at best a newbie. Until some time ago I was able to create new data models, reports, layouts, etc.
    Now when I try to create a new data model I get an error message saying:
    Unable to load ~weblogic/usr/temp/xxxxxxxx.xdm
    I click OK and create my data set using a SQL query.
    Now when I try to generate sample XML, I get the following error message:
    org.xml.sax.SAXParseException: : XML-20108: (Fatal Error) Start of root element expected.
    I understand something has changed on the server, but I cannot pinpoint what or where.
    Any help will be highly appreciated.
    Thanks,
    Geetesh

    The JDK/Java that ODI uses should be 64-bit if ODI 11.1.1.6 is 64-bit. (I think JDK 1.6 is recommended for this ODI version.)
    Also, for which technology are you creating a data server, and what error message are you getting while creating it?
    Thanks,
    Santy.
