Single record in Export Error log - Essbase Adapter

We are using FDM to load data to Essbase cubes. Currently the export.err log file gives us the kickout details for only a single record at a time. Is there any setting that would give us the entire list of kickouts in a single log?
Thanks.

Use a load rule to load the data and you will get an error log listing all errors, up to the maximum number of errors specified in the Essbase configuration. To use this method from FDM instead of the standard adapter load, check the Enable String Load option in the adapter configuration settings and specify the name of the load rule you wish to use. Personally I would always go for the load rule option when loading Essbase from FDM.

Similar Messages

  • Create Error log table

    I am supposed to insert the records that fail to load into the main table into an error log table, and then email the users about the error records that were not inserted. How am I supposed to do that? (See the sketch at the end of this message.)
    I have a few more questions.
    What is the best way to upload the data from a file?
    1. I could do batch processing, or
    2. I could browse and upload the file through the APEX application and insert the records.
    Which tutorial is the best to read for the above two methods, and how do I create and insert records into the error log table and send the user a CSV or txt file containing the error records, in both methods?
    Would following the method below be the right way for the second approach?
    http://oraexplorer.blogspot.com/2007/11/apex-to-upload-text-file-and-write-into.html

    OK,
    I am trying to insert the records into an existing table from a CSV file.
    I am using the post below to do so:
    Re: File Browse, File Upload
    I get some errors executing the htmldb_tools package.
    Error at line 27: PLS-00103: Encountered the symbol "/"
    create or replace PACKAGE htmldb_tools
    AS
    -- Utility functions --{{{
    PROCEDURE parse_textarea ( --{{{
    -- Parse a HTML textarea element into the specified HTML DB collection
    -- The c001 element from the collection is used
    -- The parser splits the text into tokens delimited by newlines, spaces
    -- and commas
    p_textarea IN VARCHAR2,
    p_collection_name IN VARCHAR2
    );
    PROCEDURE parse_file( --{{{
    -- Generic procedure to parse an uploaded CSV file into the
    -- specified collection. The first line in the file is expected
    -- to contain the column headings, these are set in session state
    -- for the specified headings item.
    p_file_name IN VARCHAR2,
    p_collection_name IN VARCHAR2,
    p_headings_item IN VARCHAR2,
    p_columns_item IN VARCHAR2,
    p_ddl_item IN VARCHAR2,
    p_table_name IN VARCHAR2 DEFAULT NULL
    );
    END htmldb_tools;
    /
    create or replace PACKAGE BODY htmldb_tools
    AS
    TYPE varchar2_t IS TABLE OF VARCHAR2(32767) INDEX BY binary_integer;
    -- Private functions --{{{
    PROCEDURE delete_collection ( --{{{
    -- Delete the collection if it exists
    p_collection_name IN VARCHAR2
    )
    IS
    BEGIN
    IF (htmldb_collection.collection_exists(p_collection_name))
    THEN
    htmldb_collection.delete_collection(p_collection_name);
    END IF;
    END delete_collection; --}}}
    PROCEDURE csv_to_array ( --{{{
    -- Utility to take a CSV string, parse it into a PL/SQL table
    -- Note that it takes care of some elements optionally enclosed
    -- by double-quotes.
    p_csv_string IN VARCHAR2,
    p_array OUT wwv_flow_global.vc_arr2,
    p_separator IN VARCHAR2 := ','
    )
    IS
    l_start_separator PLS_INTEGER := 0;
    l_stop_separator PLS_INTEGER := 0;
    l_length PLS_INTEGER := 0;
    l_idx BINARY_INTEGER := 0;
    l_quote_enclosed BOOLEAN := FALSE;
    l_offset PLS_INTEGER := 1;
    BEGIN
    l_length := NVL(LENGTH(p_csv_string),0);
    IF (l_length <= 0)
    THEN
    RETURN;
    END IF;
    LOOP
    l_idx := l_idx + 1;
    l_quote_enclosed := FALSE;
    IF SUBSTR(p_csv_string, l_start_separator + 1, 1) = '"'
    THEN
    l_quote_enclosed := TRUE;
    l_offset := 2;
    l_stop_separator := INSTR(p_csv_string, '"', l_start_separator + l_offset, 1);
    ELSE
    l_offset := 1;
    l_stop_separator := INSTR(p_csv_string, p_separator, l_start_separator + l_offset, 1);
    END IF;
    IF l_stop_separator = 0
    THEN
    l_stop_separator := l_length + 1;
    END IF;
    p_array(l_idx) := (SUBSTR(p_csv_string, l_start_separator + l_offset,(l_stop_separator - l_start_separator - l_offset)));
    EXIT WHEN l_stop_separator >= l_length;
    IF l_quote_enclosed
    THEN
    l_stop_separator := l_stop_separator + 1;
    END IF;
    l_start_separator := l_stop_separator;
    END LOOP;
    END csv_to_array; --}}}
    PROCEDURE get_records(p_blob IN blob,p_records OUT varchar2_t) --{{{
    IS
    l_record_separator VARCHAR2(2) := chr(13)||chr(10);
    l_last INTEGER;
    l_current INTEGER;
    BEGIN
    -- Sigh, stupid DOS/Unix newline stuff. If HTMLDB has generated the file,
    -- it will be a Unix text file. If user has manually created the file, it
    -- will have DOS newlines.
    -- If the file has a DOS newline (cr+lf), use that
    -- If the file does not have a DOS newline, use a Unix newline (lf)
    IF (NVL(dbms_lob.instr(p_blob,utl_raw.cast_to_raw(l_record_separator),1,1),0)=0)
    THEN
    l_record_separator := chr(10);
    END IF;
    l_last := 1;
    LOOP
    l_current := dbms_lob.instr( p_blob, utl_raw.cast_to_raw(l_record_separator), l_last, 1 );
    EXIT WHEN (nvl(l_current,0) = 0);
    p_records(p_records.count+1) := utl_raw.cast_to_varchar2(dbms_lob.substr(p_blob,l_current-l_last,l_last));
    l_last := l_current+length(l_record_separator);
    END LOOP;
    END get_records; --}}}
    -- Utility functions --{{{
    PROCEDURE parse_textarea ( --{{{
    p_textarea IN VARCHAR2,
    p_collection_name IN VARCHAR2
    )
    IS
    l_index INTEGER;
    l_string VARCHAR2(32767) := TRANSLATE(p_textarea,chr(10)||chr(13)||' ,','@@@@');
    l_element VARCHAR2(100);
    BEGIN
    l_string := l_string||'@';
    htmldb_collection.create_or_truncate_collection(p_collection_name);
    LOOP
    l_index := instr(l_string,'@');
    EXIT WHEN NVL(l_index,0)=0;
    l_element := substr(l_string,1,l_index-1);
    IF (trim(l_element) IS NOT NULL)
    THEN
    htmldb_collection.add_member(p_collection_name,l_element);
    END IF;
    l_string := substr(l_string,l_index+1);
    END LOOP;
    END parse_textarea; --}}}
    PROCEDURE parse_file( --{{{
    p_file_name IN VARCHAR2,
    p_collection_name IN VARCHAR2,
    p_headings_item IN VARCHAR2,
    p_columns_item IN VARCHAR2,
    p_ddl_item IN VARCHAR2,
    p_table_name IN VARCHAR2 DEFAULT NULL
    )
    IS
    l_blob blob;
    l_records varchar2_t;
    l_record wwv_flow_global.vc_arr2;
    l_datatypes wwv_flow_global.vc_arr2;
    l_headings VARCHAR2(4000);
    l_columns VARCHAR2(4000);
    l_seq_id NUMBER;
    l_num_columns INTEGER;
    l_ddl VARCHAR2(4000);
    BEGIN
    IF (p_table_name is not null)
    THEN
    BEGIN
    execute immediate 'drop table '||p_table_name;
    EXCEPTION
    WHEN OTHERS THEN NULL;
    END;
    l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
    htmldb_util.set_session_state('P149_DEBUG',l_ddl);
    execute immediate l_ddl;
    l_ddl := 'insert into '||p_table_name||' '||
    'select '||v(p_columns_item)||' '||
    'from htmldb_collections '||
    'where seq_id > 1 and collection_name='''||p_collection_name||'''';
    htmldb_util.set_session_state('P149_DEBUG',v('P149_DEBUG')||'/'||l_ddl);
    execute immediate l_ddl;
    RETURN;
    END IF;
    BEGIN
    select blob_content into l_blob from wwv_flow_files
    where name=p_file_name;
    EXCEPTION
    WHEN NO_DATA_FOUND THEN
    raise_application_error(-20000,'File not found, id='||p_file_name);
    END;
    get_records(l_blob,l_records);
    IF (l_records.count < 3)
    THEN
    raise_application_error(-20000,'File must have at least 3 ROWS, id='||p_file_name);
    END IF;
    -- Initialize collection
    htmldb_collection.create_or_truncate_collection(p_collection_name);
    -- Get column headings and datatypes
    csv_to_array(l_records(1),l_record);
    csv_to_array(l_records(2),l_datatypes);
    l_num_columns := l_record.count;
    if (l_num_columns > 50) then
    raise_application_error(-20000,'Max. of 50 columns allowed, id='||p_file_name);
    end if;
    -- Get column headings and names
    FOR i IN 1..l_record.count
    LOOP
    l_headings := l_headings||':'||l_record(i);
    l_columns := l_columns||',c'||lpad(i,3,'0');
    END LOOP;
    l_headings := ltrim(l_headings,':');
    l_columns := ltrim(l_columns,',');
    htmldb_util.set_session_state(p_headings_item,l_headings);
    htmldb_util.set_session_state(p_columns_item,l_columns);
    -- Get datatypes
    FOR i IN 1..l_record.count
    LOOP
    l_ddl := l_ddl||','||l_record(i)||' '||l_datatypes(i);
    END LOOP;
    l_ddl := '('||ltrim(l_ddl,',')||')';
    htmldb_util.set_session_state(p_ddl_item,l_ddl);
    -- Save data into specified collection
    FOR i IN 2..l_records.count
    LOOP
    csv_to_array(l_records(i),l_record);
    l_seq_id := htmldb_collection.add_member(p_collection_name,'dummy');
    FOR i IN 1..l_record.count
    LOOP
    htmldb_collection.update_member_attribute(
    p_collection_name=> p_collection_name,
    p_seq => l_seq_id,
    p_attr_number => i,
    p_attr_value => l_record(i)
    );
    END LOOP;
    END LOOP;
    DELETE FROM wwv_flow_files WHERE name=p_file_name;
    END;
    BEGIN
    NULL;
    END;
    /
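    For the original question (recording the rejected rows in an error log table and then sending the users an extract), a minimal PL/SQL sketch of the usual pattern is shown below. The table and column names (STG_UPLOAD, MAIN_TABLE, UPLOAD_ERR_LOG, ID/NAME/AMOUNT) are hypothetical placeholders, not objects from this thread.
    -- Hypothetical sketch: row-by-row load that writes every failed row to an error log table.
    -- STG_UPLOAD, MAIN_TABLE and UPLOAD_ERR_LOG are placeholder names.
    BEGIN
      FOR r IN (SELECT * FROM stg_upload) LOOP
        BEGIN
          INSERT INTO main_table (id, name, amount)
          VALUES (r.id, r.name, r.amount);
        EXCEPTION
          WHEN OTHERS THEN
            -- capture the failed row together with the Oracle error text
            INSERT INTO upload_err_log (id, name, amount, err_msg, err_date)
            VALUES (r.id, r.name, r.amount, SQLERRM, SYSDATE);
        END;
      END LOOP;
      COMMIT;
    END;
    /
    The rows collected in UPLOAD_ERR_LOG can then be turned into a CSV/txt extract and mailed to the users (for example with APEX_MAIL or UTL_MAIL), which covers the second part of the question.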

  • FDM Essbase Adapter - Duplicate members from same dimension on data records

    I am getting this error message when trying to export data to Essbase using FDM.
    Error - Duplicate members from same dimension on data records.
    Though this error looks quite straightforward, I could not find anything in the data file to indicate the same. I even tried loading a single row of data, but still got the same message.
    What other things I should be looking at to fix this issue?
    Thanks.

    There are one or two previous threads on this, but this might give you some help to start with, as it links to an Essbase document which gives suggestions for the error message:
    Error in dataload- Duplicate Members From Same Dimension On Data Record
    Another thread indicated that the first record had a blank in it, and re-sorting so that the first records had all fields populated resolved the issue (not sure on this one).
    From the FDM side:
    1. Are the dimensions correctly mapped in the Essbase adapter?
    2. Are the dimensions in the correct sequence for loading to Essbase?
    3. Does the same value appear in more than one field? (I can't remember for certain, but I think Essbase didn't use to allow the same member name across dimensions.)

  • ORA-28056: Writing audit records to Windows Event Log failed - error during 11g installation at database creation time

    Hi Friends,
    OS = Windows XP SP3
    Database = Oracle 11g R2 32-bit
    Processor = Intel P4 2.86 GHz
    RAM = 2 GB
    Virtual memory = 4 GB
    I was able to install Oracle 11g successfully, but during installation, at the time of database creation, I got the following error many times and I ignored it each time:
    ORA-28056: Writing audit records to Windows Event Log failed. At 55% my installation finally hung and nothing happened after that, so I ended the installation and tried to create the database afterwards with DBCA, but the same thing happened.
    Please can someone help me out, as I need to install on the same machine.
    Thanks and Regards

    AAP wrote:
    Thanks. Now I am able to create a database, but with one error.
    When I created the database using DBCA, at the last stage I got this error:
    Database Configuration Assistant : Warning
    Enterprise Manager Configuration failed due to the following error: Listener is not up or database service is not registered with it. Start the listener, register the database service, and run EM Configuration Assistant again.
    But when I checked, the listener was up.
    So what was the problem? I am able to connect and work through SQL*Plus, but I didn't get the link for EM, and when I try to create a new connection in SQL Developer it gives an error (Status: failure - Test failed: the Network Adapter could not establish the connection).
    Thanks & Regards
    Creation of the dbcontrol requires a connection via the listener. When configuring the dbcontrol as part of database creation, it appears that the dbcontrol creation step runs before the dynamic registration of the database with the listener is complete. Now that the database itself is complete and enough time (really, just a minute or two) has passed to allow the instance to register, use DBCA or EMCA to create the dbcontrol.
    Are you able to get a SQL*Plus connection via the listener (sqlplus scott/tiger@orcl)? That needs to be the first order of business.

  • DML ERROR LOGGING - how to log 1 constraint violation on record

    Hi there
    We are using DML error logging to log records which violate constraints into an error table.
    The problem is that when a record violates more than one constraint, it logs the record but details only one constraint violation. Is there a way to get it to record all constraint violations on an individual record?
    Many Thanks

    In the Netherlands several years ago a framework called CDM RuleFrame was introduced that did just this. Their main thought was that it is desirable to collect all error messages from one transaction and display them all at the end of the transaction.
    [url http://www.dulcian.com/papers/ODTUG/2001/BR%20Symposium%202001/Boyd_BR.htm]Here is an article that explains the concept.
    In short: it involves coding every single business rule as a database trigger using transaction management of CDM RuleFrame.
    I would not recommend it, however, because I think [url http://rwijk.blogspot.com/2007/09/database-triggers-are-evil.html]database triggers are evil. It may appeal to first-time users of an application, though.
    Hope this helps.
    Regards,
    Rob.
    Message was edited by:
    Rob van Wijk
    But it cannot be "turned on" by some switch: you have to design your system this way. So the short answer to your question is: no, it is not possible with standard DML error logging, which records only the first error per rejected row (see the sketch below).
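    For reference, here is a minimal sketch of the standard DML error logging feature being discussed; the EMP and EMP_STAGE table names are illustrative only, not from the original post. Note that the logged ORA_ERR_MESG$ column carries only the first error encountered for each rejected row, which is exactly the limitation described above.
    -- One-off setup: create the error table ERR$_EMP for the target table EMP.
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'EMP');
    END;
    /
    -- Load with error logging: failed rows go to ERR$_EMP instead of failing the statement,
    -- but each logged row records only the first violated constraint (ORA_ERR_MESG$).
    INSERT INTO emp (empno, ename, sal)
    SELECT empno, ename, sal
    FROM   emp_stage
    LOG ERRORS INTO err$_emp ('emp load') REJECT LIMIT UNLIMITED;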

  • Essbase Error log issue for Shared Members

    I am loading shared members for a custom dimension in Essbase.
    The error log is getting created as below:
    \\Record #1 - Warning: Account Only attributes cannot be set in non-Accounts dimensions
    PR0011,APP_PR,Shared Member,
    But that specific member is getting loaded.
    I am loading only shared members in this load, and all records are shown as rejected even though most of them are actually loaded. This is causing issues in finding the real rejects.
    -app

    Hi, check this:
    http://oraclebizint.wordpress.com/2008/12/14/hyperion-essbase-931-obe-series-designing-essbase-databases-bso-outlines-data-load-global-schema-part-2-building-dimensions/
    In the dimension build settings, don't set the "allow moves" property, and use parent/child references.
    I have tested it and it works.
    Regards

  • Error at workflow Export task in Essbase data load

    Hi All,
    I am trying to load data to Hyperion Essbase. I have two small questions. The first is:
    1) Can anybody tell me the basic steps for specifying dimensions for the target Essbase application? I have done a data load to an HFM application previously, but with Essbase it is a bit more complex.
    2) I am done with creating the import format and assigning it to a Location after specifying the dimensions. At the workflow Export task I am getting the following error:
    [Can not calculate Dimension member [Apr] with restricted member [Apr]
    Thanks in Advance

    The FDM Configuration Guide outlines the process to map the dimensions in the Essbase adapter to the proper dimensions in the Essbase outline. You will want to take a look at this on Oracle Technology Network under Documentation.
    You will want to right-click on the adapter in the Workbench and choose "Configure".
    You will then create your machine profile with the source machine value of the FDM app server name and the target machine value of the Essbase server name.
    On the second tab you will specify the Essbase application and database name and the logon method to use for integrating with Essbase.
    Finally, click on the Dimensions tab and click on each dimension in the left pane; in the right pane, if the integration with Essbase is set up correctly, you will be able to select the corresponding Essbase dimension from the Dimension drop-down list, and it will populate the foreign name as Tony mentioned.
    You can then apply the changes when complete.
    Hope this is helpful.

  • Getting error "Incomplete update due to error in single records"

    Dear All,
    We are loading data from a DSO to a Cube using a full load, and the load is failing with the error message "Collection in the source system ended". When we check the error message button it shows "Incomplete update due to error in single records", and also one more message: "Messages (type E) for data records with record number 0, Message no. RSM2714". Can anyone tell us the reason for the failure of this load, and how to resolve it?
    When we click the help button of the error message, it displays the message below:
    Incomplete update due to errors in single records --> Long text
    Message no. RSM2712
    Diagnosis
    In the update rules, one InfoSource record was used to create several records in the data target. These records must be handled in the same way to enable tracking into the PSA and the treatment of errors in individual records.
    In the previous case, one record was updated in this kind of group generated by update rules, whereas other records in the same group were rejected. If you updated the PSA data record again, the records that were already updated would be updated again. Duplicate records would appear in the data target and the data target would thus be inconsistent.
    System Response
    The data record with errors was highlighted in the PSA. However, no error request was generated.
    Procedure
    Delete the request in the data target and, after removing the error, update all records for the request to the data target again.
    Regards,
    JayaKrishna

    Hi,
    Can you please check the error record in the PSA and compare it with the source DSO, as this load is DSO -> Cube?
    If the record is not correct as per the DSO, correct it in the PSA. Before correcting the data in the PSA, please delete the request from the Cube; it will then allow you to correct or delete the record in the PSA and push the data from the PSA to the target Cube again.
    If the record is correct as per the DSO, re-run the load to the DSO. Sometimes a few records are wrongly updated by an end user and corrected later in the day for submission, so you will get the corrected record and can then manually run the further load to the Cube.
    Hope this will help.
    Regards,
    Mahesh

  • Can't log in to the Facebook app on iPhone 5 (iOS 6.0.2): 'There was an error logging in using single sign on' / 'session expired'

    Hi, I can't log in to the Facebook app on my iPhone 5, iOS 6.0.2. I keep getting an error message saying 'There was an error logging in using single sign on', and when I'm asked to log in again I get a 'session expired' message. This only started happening yesterday. Anyone else having this problem? Thanks.

    I am having the same problem and took the following steps to mitigate it, to no avail.
    1. I deleted the Facebook app on the phone and turned off Facebook in the iPhone's system-wide settings.
    2. I re-enabled Facebook in the iPhone's system-wide settings, reinstalled the Facebook app and logged in again. It worked. For about an hour.
    3. I completely restored the phone to a previous backup (before the problems started), re-enabled Facebook, and reinstalled the app. Now it works intermittently, but it hasn't worked in about 12 hours (I just tried a few minutes ago).
    Please advise.

  • Sender File Adapter cannot send a single record per message?

    Hi,
    I have a flat file to JDBC scenario, but the sender file adapter doesn't split the records into a single record per message, even though I have set "Recordsets per Message" = 1:
    Document Name: MT_APINVOICE
    Document Namespace: urn:file:jdbc:apivinvoice
    Document Offset: 8
    Recordset Name: INPUT
    Recordset Namespace:
    Recordset Structure: RECORD,*
    Recordset Sequence: Ascending
    Recordsets per Message: 1
    Key Field Name:
    Key Field Type: String
    RECORD.fieldFixedLengths: 10,5,10,10
    RECORD.endSeparator: 'nl'
    RECORD.fieldNames: F1,F2,F3,F4
    Please advise.
    Thank You and Best Regards
    Fernand

    >> But then how do I get, for example, more than 1 record per message, like 10 records per message? Should I set RECORD,10?
    That is right. Just try it out yourself.
    @Shesagiri,
    The number of records per recordset is decided by the Recordset Structure parameter, and the number of recordsets within a message is decided by Recordsets per Message.
    Regards
    Jaishankar

  • SNMP trap on OutOfMemory Error Log record

    I would like to implement SNMP trap on OutOfMemory Error Log record.
    In theory SNMP LogFilter with Severity Level "Error" and Message Substring "OutOfMemory" should do the trick.
    In reality it does not work (doh)(see explanations below), I wonder if someone managed to make it work.
    Log entry has following format:
    ----------- entry begin ----------
    ####<Nov 12, 2003 3:09:23 PM EST> <Error> <HTTP> <ustrwd2021> <local> <ExecuteThread: '14' for queue: 'default'> <> <> <101020> <[WebAppServletContext(747136,logs2,/logs2)] Servlet failed with Exception>
    java.lang.OutOfMemoryError
         <<no stack trace available>>
    ------------ entry end ------------
    Notice that java.lang.OutOfMemoryError is NOT part of the log record; it seems the exception stack trace is not part of the log record. Thus the filter can only be applied to the "<[WebAppServletContext(747136,logs2,/logs2)] Servlet failed with Exception>" string, which is really useless.
    Here is a fragment of the trap data (I had to remove the Message Substring in order to get the Error trap to work):
    1.3.6.1.4.1.140.625.100.50: trapLogMessage: [WebAppServletContext(747136,logs2,/logs2)] Servlet failed with Exception

    Andriy,
    I don't think you can do much here: since OutOfMemory is not part of the log record, the SNMP agent cannot filter on it. I would be curious to hear if anyone got it to work using SNMP.
    sorry,
    -satya

  • When is an error log created during BDC recording?

    Hi,
    I am working on reports. I need to know how BDC recording works, and when the error log will be created.

    Hi Anitha,
    You have got a message generated from message class "RQ" with message number "XXX".
    Are you getting this error during recording? Which transaction are you using for recording?
    The error messages are generated when your document is not processed completely via BDC. You can collect the error messages into an internal table and print them at the end of the program as a report.
    If you are using CALL TRANSACTION:
    *------ Declaring Internal Table to Store BDC Errors *----
    DATA: BEGIN OF T_MESSTAB OCCURS 0.
            INCLUDE STRUCTURE BDCMSGCOLL.
    DATA: END OF T_MESSTAB.
    CALL TRANSACTION 'TCODE' USING BDC_TAB MODE 'A'
                           MESSAGES INTO T_MESSTAB.
    LOOP AT T_MESSTAB.
            WRITE / TEXT-001.
            WRITE:/10 T_MESSTAB-DYNAME, 30 T_MESSTAB-DYNUMB.
            WRITE:/10 T_MESSTAB-MSGV1.
    ENDLOOP.
    Lanka

  • Hyperion HAL Essbase adapter error

    Hi all,
    When using HAL, the Planning component is working fine.
    When trying to use the Essbase adapter, I receive an error: exception #1 error code 1000 could not create session.
    I'm so desperate.
    Please help!
    Thanks,
    Wendy :)
    Edited by: 956623 on 07:25 03/09/2012

    Hi,
    This is usually a path/classpath problem.
    Ensure the following environment variable references exist on each machine running HAL and the Essbase adapter.
    To view/set the environment variables, navigate to My Computer -> Properties -> Advanced -> Environment Variables.
    User CLASSPATH:
    <HAL_Install_Path>\vbis\Hyperion\EssbaseAdapter
    System or User PATH:
    <HAL_Install_Path>\vbis\Hyperion\EssbaseAdapter\adm
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Essbase error logs

    ODI 10.1.3.6
    IKM SQL to Hyperion Essbase (DATA)
    I usually just hard-code a path in the error log filename.
    I've never understood what the default string in the error logs is all about:
    <Default>:<?=java.lang.System.getProperty("java.io.tmpdir")?>/<%=snpRef.getTargetTable("RES_NAME")%>.err
    Can someone please shed some light?
    Cheers

    The default error path just uses Java system properties and ODI substitution methods to dynamically build up the error log path. This works for everyone regardless of their specific setup, and that's why it's the default. There is nothing wrong with hard-coding the error log path, but it can affect portability when moving objects between environments. I've always found that setting the error log path dynamically in an ODI variable and then referencing the variable for the error log filename is a good approach.

  • Error records written to application log

    Hi,
    We have an issue when uploading data from the source system: the data does not even arrive in the PSA, and the load fails with messages like the following:
    Error records written to application log
    Error in an arithmetic operation in record 793
    Record 793: Contents 0 0 from field ANZZL cannot be converted in type INT1 -> long text
    Record 793: Contents 0        0.00 from field PREIS cannot be converted in type CUR
    Analysis:
    1) There were no issues in the source system with the field data types.
    2) We regenerated the DataSource, replicated it into BW and reloaded, but we still get the same issue as above.
    3) In the PSA data packets we are unable to find the records that are the issue, as the data has not arrived in the PSA.
    We urgently need to get the data into the system.
    Thanks in advance.
    Mahesh

    Hi.
    May I recommend that you post this question in the BI General forum (Business Intelligence Old Forum (Read Only Archive))?
    There you can get an answer from our colleagues.
    Regards.
