Warnings while executing mapping

Hi All,
While executing mappings we are facing the following warnings. It happens once or twice a week; on the remaining days the mappings execute successfully. The warnings are as follows:
Warning CursorFetchMapTerminationRTV20007
Warning ORA-01002: fetch out of sequence
ORA-02063: preceding line from FRPDTA
Warning ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[IBM][iSeries Access ODBC Driver][DB2 UDB]SQL0913 - Row or object F4801LA in FRPDTA type *FILE in use.
ORA-02063: preceding 2 lines from FRPDTA
01. We are using OWB version 11.1.0.7.0.
02. The database link FRPDTA is working fine; other mappings run successfully, but one or two mappings face this problem.
03. We checked whether there are any locks on the source/target objects, but found none.
Thanks in Advance
Ava

Hi Ava,
In my opinion the problem is on the source DB2 database (the error "[IBM][iSeries Access ODBC Driver][DB2 UDB]SQL0913 - Row or object F4801LA in FRPDTA type *FILE in use." is returned by the DB2 ODBC driver).
Regards,
Oleg
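If it helps, a minimal way to confirm Oleg's diagnosis is to probe the remote object over the same database link while the warning is occurring. This is only a sketch: the table name F4801LA is taken from the mapping name and may need a schema prefix or different quoting depending on the gateway configuration. On the IBM i side, the command WRKOBJLCK FRPDTA/F4801LA *FILE shows which job is holding the file.

-- Manual probe over the same database link; if this also fails with SQL0913 while the
-- mapping runs, the lock is on the IBM i source and not in the Oracle/OWB layer.
SELECT COUNT(*)
FROM   "F4801LA"@FRPDTA;

If a batch job or save operation on the iSeries holds the file at the moment the mapping fetches from it, the SQL0913 is expected, and the ORA-01002 then follows because the open cursor's fetch is interrupted.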

Similar Messages

  • Error while executing mapping with 500 K records

    Hi,
    I have a mapping which has a couple of transformation operators (procedures), some joiners and a splitter. The operating mode is set-based fail over to row-based. The target is set to update/insert and has about 20 fields.
    I am loading about 500K records from source to target. I get the following error: "unable to extend segment by 4 in undo tablespace". My DBA has allocated 14 GB for the tablespace.
    Can anybody provide pointers/ideas on why this is happening and how I can overcome it?
    My target and source are on Oracle 10g R2 and OWB is also 10g R2.
    Thanks a lot in advance!
    Maruthi

    Hi, Maruthi!
    Running OWB 10.1.0.4 on an Oracle 10gR1 database we had similar problems with UNDO tablespaces.
    One reason could be that the UNDO tablespace is too small, but 500K rows with 20 columns do not seem very large to me (of course it depends on the column types, but I suppose they are mainly some NUMBERs, VARCHARs or DATEs).
    You could use TOAD to have a quick look at the tablespaces or use the following statement:
    SELECT *
    FROM dba_data_files
    WHERE tablespace_name = 'SYS_UNDO';
    Another reason could be improper UNDO management settings.
    You should ask the DBA to check that they fit the application's requirements.
    Regards,
    Günther
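    For reference, the UNDO configuration Günther mentions can be checked with standard dictionary views; a minimal sketch ('SYS_UNDO' above is just the tablespace name used at that site, so the second query looks it up from the instance parameter instead):
    -- Which UNDO mode, tablespace and retention (in seconds) the instance is using.
    SELECT name, value
    FROM   v$parameter
    WHERE  name IN ('undo_management', 'undo_tablespace', 'undo_retention');
    -- Size and autoextend settings of the data files behind that UNDO tablespace.
    SELECT tablespace_name, file_name, bytes / 1024 / 1024 AS size_mb, autoextensible
    FROM   dba_data_files
    WHERE  tablespace_name = (SELECT value FROM v$parameter WHERE name = 'undo_tablespace');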

  • Facing Warnings While Executing

    Hi All,
    While executing a mapping through the Control Center we are facing the following warnings. How can we avoid them?
    Name      Status      Log
    MAP_FRPDTA_F4801LA     Warning     CursorFetchMapTerminationRTV20007
    MAP_FRPDTA_F4801LA     Warning     ORA-01002: fetch out of sequence
    ORA-02063: preceding line from FRPDTA
    MAP_FRPDTA_F4801LA     Warning     ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
    [IBM][iSeries Access ODBC Driver][DB2 UDB]SQL0913 - Row or object F4801LA in FRPDTA type *FILE in use.
    ORA-02063: preceding 2 lines from FRPDTA
    Rgds,

    Hi,
    Does your mapping access a flat file?
    Usually, flat files are hosted on the same database server machine, and the parameter UTL_FILE_DIR is set to '*'.
    Please confirm these points.
    Regards,
    Gustavo.
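    As a small aside, the setting Gustavo is presumably referring to is the init.ora parameter UTL_FILE_DIR, which can be checked with a simple query (standard view; '*' means PL/SQL file I/O is allowed in any server directory):
    -- Shows which server directories PL/SQL file I/O may use in this instance.
    SELECT name, value
    FROM   v$parameter
    WHERE  name = 'utl_file_dir';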

  • Mapping - Warning while executing

    Hi All,
    While executing a mapping, the following warning is raised:
         ORA-04091: table %s.%s is mutating, trigger/function may not see it
    ORA-02063: preceding line from DBLINK.
    The mapping contains one filter and one expression.
    We created a function to get MAX_DATE from target table A.
    We use this function in the filter, with the formula UPDATE > (MAX_DATE - 30), to select the records for the last 30 days and load them into target table A.
    At the expression level we convert the CHAR datatype to VARCHAR2.
    Regards,

    If you have the get_max_date function being used in an expression within the mapping, then this date can change during the course of the load, which is problematic from a logical perspective.
    What you need to do is get the function out of the main generated DML statement.
    To do that, create a constant operator and set its value to your get_max_date function, then use the constant in the logic of the ETL. In the generated code the constant becomes a package constant that is initialized once and only once, when the mapping is first called, and its value is then fixed for the duration of the mapping.
    Oh! And check to ensure that this IS what is causing the problem. Is there a trigger on your target table which could be raising the error?
    Mike
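    To make Mike's point concrete, here is a hand-written sketch (not actual OWB-generated code; the package, table and column names are invented for illustration) of the difference between calling the function inside the DML and capturing its value once in a package constant, which is what the constant operator gives you:
    -- Problematic pattern: the function is evaluated as part of the INSERT itself,
    -- so it reads target_a while target_a is being modified:
    --   INSERT INTO target_a ... WHERE update_dt > (get_max_date() - 30);
    -- Constant-operator style: the value is fetched once, before the DML runs.
    CREATE OR REPLACE PACKAGE map_load AS
      PROCEDURE run;
    END map_load;
    /
    CREATE OR REPLACE PACKAGE BODY map_load AS
      -- Initialized once, when the package is first referenced, then fixed for the run.
      g_max_date CONSTANT DATE := get_max_date();
      PROCEDURE run IS
      BEGIN
        -- The filter uses the pre-computed constant, so the function no longer
        -- queries target_a while target_a is being loaded.
        INSERT INTO target_a (id, update_dt)
        SELECT id, update_dt
        FROM   source_s
        WHERE  update_dt > (g_max_date - 30);
        COMMIT;
      END run;
    END map_load;
    /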

  • Error while executing operation mapping

    Hi All,
    I am getting the error below while executing the operation mapping, whereas the same payload
    works fine with message mapping. Please suggest.

    Hi Chandra,
    Please refer to Mark's comment in the thread below:
    Unable to display tree view
    Regards,
    Akshay

  • While executing the query in the web template I am facing the issue below

    Hello SAP geniuses,
    Please help me on my issue  ,
    While executing the query in the web template, I am facing the issue below.
    The variable for the characteristic (region) appears, but the characteristic itself does not appear in the free characteristics area. When we execute the query without the web template, it shows both the variable and the free characteristic.
    Can anybody help us identify what the issue is with the web template?
    Thanks
    Alok

    Hi,
    Please check your report: execute it in the Query Designer, take its technical name and run it in RSRT. Log out, log in and try again.
    If that does not help, check your authorization.
    Regards....KP

  • We are facing error while executing Bex report.

    Hi Gurus:
    We are facing error while executing Bex report.
    Message:
    "Query 0: Runtime error program state check for conversation 05284411 / CP with parallel processing via RFC
    Message no. DBMAN428"
    The query is based upon Multiprovider (based upon 2 basic cubes).
    Both cubes are partitioned by InfoObject 0CALMONTH (valid range 01.2005 - 12.2010).
    The cubes do not contain any non-cumulative key figures, and the support package level for SAP NetWeaver BI 7.0 is 'SAPKW70012'.
    I have checked those cubes in RSRV, carried out "All Elementary Tests -> Transaction Data" and executed the repair "Consistency of the Time Dimension for an InfoCube".
    I found a system inconsistency such as "Record with the DIMID 2 contains non-fitting time characteristics".
    Detail ERROR:
    Record with the DIMID 2 contains non-fitting time characteristics
    Message no. RSCV053
    Diagnosis
    The data record of the time dimension table with the described DIMID contains values for time characteristics that do not fit together.
    I have found Note 1130450 - 'Query displays error DBMAN 284; RSRV: Time dimension check' for this, but I am still not sure whether it will work.
    I tested it in the dev server and it did not seem useful.
    Could you please let us know how to correct this error?
    Answers will be rewarded.
    Regards,

    >
    Surendra Pande wrote:
    > Answers will be rewarded.
    > Regards,
    Go read the RULES of these forums.

  • ORA-06550 error while executing the mapping

    Hi,
    I am using OWB client 11.2.0.3. I have a mapping MAP_EMLAP_SRC; in this mapping I have imported tables from a different schema and done a one-to-one mapping between the tables.
    While executing the mapping I got the following error.
    ORA-06550: line 1, column 1082:
    PLS-00302: component 'MAP_EMLAP_SRC' must be declared
    ORA-06550: line 1, column 1062:
    PL/SQL: Statement ignored

    grant execute on owner_name.MAP_EMLAP_SRC to user_name;
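    Expanding on that one-liner, a hedged sketch of the usual combination (owner_name and user_name are placeholders; the MAIN entry point shown in the comment is only the name OWB typically generates and should be verified in your repository):
    -- As the owning schema (or a DBA): allow the other user to execute the mapping package.
    GRANT EXECUTE ON owner_name.map_emlap_src TO user_name;
    -- As user_name: either reference the package with the owner prefix, e.g.
    --   BEGIN owner_name.map_emlap_src.main(...); END;
    -- or create a synonym so the unqualified name resolves and PLS-00302 goes away.
    CREATE SYNONYM map_emlap_src FOR owner_name.map_emlap_src;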

  • Error encountered while executing the transform project name.map name. Error:Unable to create the transform. Microsoft.XLANGs.Engine

    Hi All,
    I developed a BizTalk application that uploads flat file data to our MS SQL Server database. It worked fine until one day an exception was generated, as below.
    "Error encountered while executing the transform project name.map name. Error:Unable to create the transform. Microsoft.XLANGs.Engine"
    It continues to generate the exception until I restart the BizTalk host instance. It happens occasionally, with no pattern, and cannot be traced or debugged.
    Anybody has an idea about this error? Please help, thank you very much.
    BizTalk Server 2009 is running on a Hyper-V virtual machine on Windows Server 2008.
    The BizTalk application consists of a map with functoids, an orchestration, a flat file schema, a flat file disassembler pipeline, a SQL send port, a file receive port and a send port.
    Eric

    Hi Eric,
    I think it might happen because of a large message (out-of-memory exception). As you said, you have a flat file inbound, you do the mapping and then send it to MS SQL.
    So you can check the following things:
    1) Is any custom pipeline component used in the receive pipeline? (If so, optimize its code.)
    2) Does the issue happen only with large messages? (BizTalk converts the flat file to XML, so a 1 MB flat file produces a considerably larger XML message.)
    Again, everything mentioned above is an assumption.
    You can use the DebugDiag tool to see whether a memory leak is happening:
    http://www.microsoft.com/en-in/download/details.aspx?id=26798

  • Facing a small problem while executing WDA application from portal

    Hi Experts,
    I developed a Web Dynpro for ABAP application with a table UI element that contains 270 records. Executing it from SE80, it works fine and shows all records clearly.
    The application is integrated in the portal; executing it from the portal also works fine and shows all records.
    The small problem I am facing is with the pager (footer area) of the table.
    When the table is displayed, there is a small box below the table showing Rows 244 of 270.
    The 244 appears in a small box and is not shown clearly when executing from the portal (only half of the last digit 4 is displayed), but it is shown clearly when executing from SE80.
    Kindly suggest how to resolve this problem.
    Thanks & Regards
    Sridhar

    Hi Gopi Krishna,
    Thanks for your time. From SE80 it works fine and shows the row number clearly. There is no property for increasing the width of the small box that contains the current row number.
    The problem occurs only when executing from the portal.
    Thanks & Regards
    Sridhar

  • Facing Problem while executing a command through WLST

    Hi,
    I am using WebLogic 11g (10.3.2). Whenever I execute the command below in WLST on Windows, it works fine.
    reassociateSecurityStore(domain="base_domain",admin="cn=orcladmin",password="welcome1",ldapurl="ldap://<hostname>:389",servertype="OID",jpsroot="cn=jpsroot_idm_idmhost1")
    But whenever I execute the same command on Linux, it throws the error below:
    wls:/base_domain/serverConfig> reassociateSecurityStore(domain="base_domain",admin="cn=orcladmin",password="welcome1",ldapurl="ldap://<hoistname>:389",servertype="OID",jpsroot="cn=jpsroot_idm_idm1")
    Traceback (innermost last):
    File "<console>", line 1, in ?
    NameError: reassociateSecurityStore
    Please suggest....
    Regards
    Pavan


  • Full Load: Error while executing :TRUNCATE TABLE: S_ETL_PARAM

    Hi All,
    We are using BI Apps 7.9.6.1. The full load was running fine, but now we are facing a problem with truncating the table S_ETL_PARAM.
    I have restarted the Informatica server and also the DAC server, but I still get the same message in the DAC log:
    "ANOMALY INFO::: Error while executing : TRUNCATE TABLE:S_ETL_PARAM
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DBConnection_OLTP:SIEBTRUN ('siebel.S_ETL_PARAM')
    Values :
    Null Map
    EXCEPTION CLASS::: java.lang.Exception"
    Any suggestions?
    Thanks in Advance,
    Deepak

    Are you trying to run an incremental load when you get this truncate error? Can you re-run the full load and see whether it still runs OK? Please also check your DW-side database logs (such as the alert log) for any DB-level issue; such errors do not produce friendly messages on the DAC/Informatica side.
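    If it is not a load-order issue, a small hedged check run as the user behind DBConnection_OLTP can confirm that it can see, and is allowed to truncate, siebel.S_ETL_PARAM (standard dictionary views; adjust the owner if your Siebel schema is named differently):
    -- Is the table visible to this connection at all?
    SELECT owner, object_name, object_type
    FROM   all_objects
    WHERE  object_name = 'S_ETL_PARAM';
    -- Truncating another schema's table needs DROP ANY TABLE (or ownership);
    -- these show what the connecting user actually holds.
    SELECT * FROM user_sys_privs WHERE privilege = 'DROP ANY TABLE';
    SELECT * FROM user_tab_privs WHERE table_name = 'S_ETL_PARAM';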

  • Error while executing the WorkBook

    Hi,
    While executing a workbook, I am facing the error "A critical program error has occurred. The program will now terminate. Please refer to the trace for further information".
    When I check the trace, the error description is "System.NullReferenceException: Object reference not set to an instance of an object."
    Can someone please guide me on how to solve this issue?
    Thanks & Regards
    Seshendra Reddy

    Hi,
    Make sure you have .NET Framework 2.0 with the latest service packs installed on your system. Also try to see whether you can run the report with more selections that restrict the data to a minimum. This is because BI 7 uses more memory than BW 3.5 did, which in turn means some reports will not run for the same selection for which we could get a result back in BW 3.5.
    Thanks,
    KR

  • Runtime error while executing rule

    Hello All,
      While executing the DTP for a cube, I am facing the error "Runtime error while executing rule -> see long text".
      The source is another cube; I am loading data from cube to cube.
    The error description is as follows:
    Error Location: Object Type    TRFN
    Error Location: Object Name    0T9SCR6Q4VWS1SOPNRRBOY1YD51XJ7PX
    Error Location: Operation Type ROUTINE
    Error Location: Operation Name
    Error Location: Operation ID   00034 0000
    Error Severity                 100
    Original Record: Segment       0001
    Original Record: Number        557
    The detailed description is:
    Diagnosis
        An error occurred while executing a transformation rule:
        The exact error message is:
        Division by zero
        The error was triggered at the following point in the program
        GP4H0CW3MLPOTR3E8Y93QZT2YHA 4476
    System Response
        Processing the data record has been terminated.
    Procedure
    The following additional information is included in the higher-level
    node of the monitor:
       Transformation ID
       Data record number of the source record
       Number and name of the rule which produced the error
    Please let me know your valuable suggestions on this error.
    thanks.

    Hello,
    I have checked all the transformations and end routines; all are working fine. Yesterday I loaded some data into it, but today it is erroring out.
    I have searched the forum thoroughly for threads related to this, but could not find a thread with a proper solution.
    thanks,
    srinivas.

  • Error while executing DTP

    Hi gurus,
    I am trying to extract data from R/3 using a generic extractor. The data is loaded into the PSA successfully, but I am getting the following errors while executing the DTP.
    1.An error occurred while executing a transformation rule:
    The exact error message is:
    The argument 'EA' cannot be interpreted as a number
    The error was triggered at the following point in the program
    GP4D35STLXQI3SHIVNQC2FSJ7MB 791
    2.The data record was filtered out because data records with the same key
    have already been filtered out in the current step for other reasons and
    the current update is non-commutative (for example, MOVE). This means
    that data records cannot be exchanged on the basis of the semantic key.
    Please guide me accordingly.
    Regards
    Amar.

    Hi
    When mapping quantity fields, you must add a unit of measure (UOM) to the quantity field and map it to the relevant InfoObject.
    The semantic keys defined in the DTP can also cause issues; try adding a dummy key figure if you are using a DSO in the data flow, as the DSO has overwrite mode.
    (Choose  Semantic Groups to specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, define key fields. Data records that have the same key are combined in a single data package. This setting is only relevant for DataStore objects with data fields that are overwritten. This setting also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.)
    Hope this helps and is clear.
