Records are different in source & target ODS

hi,
is it possible that a target ODS can have more records than the source ODS? I am facing this problem. The mapping is one to one. What can be the reason? I am new to BI, so please tell me if there is a simple explanation for it.

company_code  costcntr  branch  amount
100           2300      A       20
100           2301      A       20
100           2302      A       20
100           2303      A       20
In the above case, if company_code is the only key field, then the ODS will hold only the last record,
because records with the same key value get overwritten.
If you also include costcntr in the key fields, the ODS will hold all 4 records, since the key-field combination is then unique for each record.
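As a rough illustration only (this is not the actual ODS activation code, and the field lengths are assumptions), the same overwrite-versus-keep behaviour can be mimicked in ABAP with internal tables whose unique key matches the ODS key:

  TYPES: BEGIN OF ty_rec,
           company_code TYPE c LENGTH 4,
           costcntr     TYPE n LENGTH 4,
           branch       TYPE c LENGTH 2,
           amount       TYPE p LENGTH 8 DECIMALS 2,
         END OF ty_rec.

  DATA: lt_key1 TYPE SORTED TABLE OF ty_rec
          WITH UNIQUE KEY company_code,            "key = company_code only
        lt_key2 TYPE SORTED TABLE OF ty_rec
          WITH UNIQUE KEY company_code costcntr,   "key = company_code + costcntr
        ls_rec  TYPE ty_rec.

  ls_rec-company_code = '100'.
  ls_rec-branch       = 'A'.
  ls_rec-amount       = 20.

  DO 4 TIMES.
    ls_rec-costcntr = 2299 + sy-index.             "2300 .. 2303
    "Insert a new key, otherwise overwrite the existing row -
    "roughly what ODS activation does per key combination.
    INSERT ls_rec INTO TABLE lt_key1.
    IF sy-subrc <> 0. MODIFY TABLE lt_key1 FROM ls_rec. ENDIF.
    INSERT ls_rec INTO TABLE lt_key2.
    IF sy-subrc <> 0. MODIFY TABLE lt_key2 FROM ls_rec. ENDIF.
  ENDDO.
  "lines( lt_key1 ) = 1, lines( lt_key2 ) = 4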

Similar Messages

  • Number of records are different in database table and select statement

    Hi All,
    I need to fetch data from table BSID for customer 0010000145.
    If I write the code like this:
        SELECT bukrs kunnr umskz shkzg dmbtr zfbdt zbd1t kkber
               FROM bsid
               INTO TABLE it_bsid
               FOR ALL ENTRIES IN it_kna1
               WHERE kunnr = it_kna1-kunnr
                 AND bukrs = pa_bukrs.
    the number of records in the internal table it_bsid is 130,
    while the actual number of records is 200.
    But when I hardcode the customer number,
    I get the exact records:
       SELECT bukrs kunnr umskz shkzg dmbtr zfbdt zbd1t kkber
              FROM bsid
              INTO TABLE it_bsid
              FOR ALL ENTRIES IN it_kna1
              WHERE bukrs = pa_bukrs
                AND kunnr = '0010000145'.
    Records in internal table = records in the database = 200.
    How is this possible?
    Why is the first piece of code not giving the correct number of records?
    Please reply asap.
    Thanks in advance,
    Madhu

    Madhu,
    You need to use the conversion routine before you pass KUNNR:
    CONVERSION_EXIT_ALPHA_RANGE_I
    CONVERSION_EXIT_ALPHA_RANGE_O
    Hope this helps.
    Chandra.
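    As a minimal sketch of that suggestion (assuming it_kna1-kunnr was filled without its leading zeros, and using the standard function module CONVERSION_EXIT_ALPHA_INPUT rather than the range variants named above), the customer numbers can be brought into the internal, zero-padded format before the FOR ALL ENTRIES selection:

        FIELD-SYMBOLS: <fs_kna1> LIKE LINE OF it_kna1.

        LOOP AT it_kna1 ASSIGNING <fs_kna1>.
          CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
            EXPORTING
              input  = <fs_kna1>-kunnr
            IMPORTING
              output = <fs_kna1>-kunnr.   "e.g. '10000145' -> '0010000145'
        ENDLOOP.

        IF it_kna1 IS NOT INITIAL.        "FOR ALL ENTRIES on an empty table would select everything
          SELECT bukrs kunnr umskz shkzg dmbtr zfbdt zbd1t kkber
                 FROM bsid
                 INTO TABLE it_bsid
                 FOR ALL ENTRIES IN it_kna1
                 WHERE kunnr = it_kna1-kunnr
                   AND bukrs = pa_bukrs.
        ENDIF.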

  • Data mart cube to cube copy records are not matching in target cube

    Hi experts,
    I need help with the questions below on a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes:
    Cube A1 - sourced from the R/3 system (delta update), and Cube B1 - sourced from cube A1 (full update). The two cubes are connected through update rules with one-to-one mapping and no routines. Basis did a copy of the back-end R/3 system from the Production to the Quality server approximately 2 months back.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from the previous cube A1) I am not getting the full volume of data; I get only a meagre amount, although the load shows a successful status in the monitor.
    We tried giving conditions in the InfoPackage (as was done in the previous year's loads), but it still fetches the same meagre volume of data.
    To check whether this happens only for that particular cube, we tried other cubes that are sourced through the Myself system, and they also get meagre data rather than the full data.
    For example: if 1000 records are available for an employee, the system extracts only some 200 records at random.
    Any quick reply would be very helpful. Thanks.

    Hi Venkat,
    Did you do any selective deletions in cube A1?
    First reconcile the data between cube 1 and cube 2:
    match the totals of cube 1 with cube 2.
    Thanks,
    Vijay.

  • Staging area different from target

    I am using Oracle 10g as the target database. I want to upload a flat data file into an Oracle table.
    I uploaded it successfully using LKM Oracle (SQL Loader) in an interface. In this case I am using the Oracle database as the staging area, so the whole load is processed on my database.
    I want to create the staging area on my file server instead. Please help me resolve this problem.

    A staging area must accept SQL syntax.
    A file server won't accept SQL syntax.
    So if you mean the physical server where your file sits, then maybe, if there is an RDBMS installed on it; otherwise I don't really see how you can.
    You can specify another data server for your staging area, or use the SUNOPSIS_MEMORY_ENGINE, which I think uses an in-memory cache, but I don't think it is optimized.
    To do this, on the Definition tab of your interface choose 'Staging Area Different From Target' and then select SUNOPSIS_MEMORY_ENGINE in the list box.

  • Journalization updates are not reflecting in the target

    Hi,
    I am using simple journalization for SQL Server tables. Only the newly added rows are reflected in the target, not the updated ones. I inserted two rows and updated one row in the source; only the inserted records are reflected in the target, and the updated record is not.
    When I check the intermediate (I$_CDC_TARGET) and JKM (J$CDC_SOURCE) tables, both the inserted and the updated rows are there. Why are these updated records not reflected in the target?
    Both Source and Target tables have Primary keys.
    Thanks,
    Naveen Suram

    Hi,
    I am using IKM SQL Incremental Update (row by row). The updates are coming into the I$ table and I can see these records in the journal data as well.
    One doubt here: no extra columns are coming in my source. I referred to John Goodwin's blog; according to the blog there should be some extra columns related to the journal, right? Why are they not coming?
    First I added the JKM to my model, then added the data source to CDC, then created a subscriber, then started the journal.
    Am I missing anything?
    Thanks,
    Naveen Suram

  • File sizes are different?

    Hi Experts,
    I have implemented an ESR bypass scenario.
    The problem is that the file sizes are different in the source folder and the target folder.
    How can we resolve this issue?
    Please guide me.
    Thanks and Regards,
    Ravi Teja.

    Refer to question no. 2 in the wiki below and see if that configuration helps.
    Sender File Adapter Frequently Asked Questions - Process Integration - SCN Wiki

  • How to add a new field in the target ODS that is not available in the source ODS?

    Hello everyone, this is the first time I am posting a question.
    Currently we have an existing ODS based on the 2LIS_02_SCL data source. This ODS is already in production.
    We have a requirement to use this existing ODS as the source to load the required data into two different data targets, which are also ODSs. Their update rules are different.
    Problem: there is now a requirement to include a new field (PO need date) in the two target ODSs. This field is not available in our source ODS. How do we go about adding it?
    Concern: as the source ODS is already in production, how do we go about loading data if this new field is added?
    Anticipating your cooperation.

    The new field is 0SCL_DELDATE  (planned delivery date of document schedule line)
    It is already there in the communication structure of our source ODS, but as it was not needed earlier it was not used/added (i.e. it is not in the source ODS update rules).
    But now we are basing our targets on this source ODS.
    How do I go about doing this?
    The main reason for doing it this way is that we do not want to reload data from R/3.

  • Routine for Updating a field in the source ODS to target ODS

    Dear BW Users,
    I have a source and a target ODS, let's say ODS1 and ODS2. ODS1 updates Field1 into Field2 in ODS2. The fields hold the same information, but Field1 (InfoObject1) is defined as CHAR whereas Field2 (InfoObject2) is defined as NUMC. Since the data types are different I cannot map them directly. I need an update routine to populate InfoObject2/Field2, whose data type is NUMC. Thanks in advance.
    Best Regards,
    John Bartheimer

    In the update routine of that particular InfoObject (Field2), add the following code:
    result = comm_structure-/bic/zfield1.   "where ZFIELD1 stands for the technical name of Field1
    Hope it helps...
    thanks
    Kumar
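    A slightly fuller sketch of such a routine, assuming the classic BW 3.x update routine interface (RESULT, RETURNCODE, ABORT), a NUMC length of 10, and the placeholder field name /BIC/ZFIELD1 - all of which would need to be adjusted to the real objects:

        DATA: lv_numc TYPE n LENGTH 10.                   "length of InfoObject2 - adjust

        IF comm_structure-/bic/zfield1 CO '0123456789 '.   "convert only purely numeric values
          lv_numc = comm_structure-/bic/zfield1.           "CHAR -> NUMC, right-aligned, zero-padded
          result  = lv_numc.
          returncode = 0.                                  "record is updated
        ELSE.
          returncode = 4.                                  "skip records that are not numeric
        ENDIF.
        abort = 0.                                         "do not abort the whole data package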

  • Error occurs while loading data from source system to target ODS

    Hi..
    I started loading records from the source system to the target ODS. While running the job I got the following errors:
    Record 18211 :ERROR IN HOLIDAY_GET 20011114 00000000
    Record 18212 :ERROR IN HOLIDAY_GET 20011114 00000000
    Please help me with the following errors.
    Thanks in advance,

    Hello,
    How are you?
    I think this problem is at the ODS level; ZCAM_O04 is your ODS name.
    Could you check the ODS settings and see whether 'Unique Data Records' is checked or not?
    Best regards,
    Sankar Kumar
    +91 98403 47141

  • What are the differences between the target tablespace and the source tablespace

    The IMPDP command creates so many errors, yet the EXAMPLE tablespace is transported to the target database successfully. It seems that the transported tablespace is no different from the source tablespace.
    Why are so many errors created?
    How do I avoid these errors?
    What are the differences between the target tablespace and the source tablespace?
    Is this Data Pump action really successful?
    The following is the log output:
    [oracle@hostp ~]$ impdp system/oracle dumpfile=user_dir:demo02.dmp tablespaces=example remap_tablespace=example:example
    Import: Release 10.2.0.1.0 - Production on Sunday, 28 September, 2008 18:08:31
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLESPACE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLESPACE_01": system/******** dumpfile=user_dir:demo02.dmp tablespaces=example remap_tablespace=example:example
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
    CREATE TABLE "OE"."CUSTOMERS" ("CUSTOMER_ID" NUMBER(6,0), "CUST_FIRST_NAME" VARCHAR2(20) CONSTRAINT "CUST_FNAME_NN" NOT NULL ENABLE, "CUST_LAST_NAME" VARCHAR2(20) CONSTRAINT "CUST_LNAME_NN" NOT NULL ENABLE, "CUST_ADDRESS" "OE"."CUST_ADDRESS_TYP" , "PHONE_NUMBERS" "OE"."PHONE_LIST_TYP" , "NLS_LANGUAGE" VARCHAR2(3), "NLS_TERRITORY" VARCHAR2(30), "CREDIT_LIMIT" NUMBER(9,2), "CUST_EMAIL" VARCHAR2(30), "ACCOUNT_MGR_ID" NU
    ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
    ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
    CREATE TABLE "IX"."ORDERS_QUEUETABLE" ("Q_NAME" VARCHAR2(30), "MSGID" RAW(16), "CORRID" VARCHAR2(128), "PRIORITY" NUMBER, "STATE" NUMBER, "DELAY" TIMESTAMP (6), "EXPIRATION" NUMBER, "TIME_MANAGER_INFO" TIMESTAMP (6), "LOCAL_ORDER_NO" NUMBER, "CHAIN_NO" NUMBER, "CSCN" NUMBER, "DSCN" NUMBER, "ENQ_TIME" TIMESTAMP (6), "ENQ_UID" VARCHAR2(30), "ENQ_TID" VARCHAR2(30), "DEQ_TIME" TIMESTAMP (6), "DEQ_UID" VARCHAR2(30), "DEQ_
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "SH"."CUSTOMERS" 9.850 MB 55500 rows
    . . imported "SH"."SUPPLEMENTARY_DEMOGRAPHICS" 695.9 KB 4500 rows
    . . imported "OE"."PRODUCT_DESCRIPTIONS" 2.379 MB 8640 rows
    . . imported "SH"."SALES":"SALES_Q4_2001" 2.257 MB 69749 rows
    . . imported "SH"."SALES":"SALES_Q1_1999" 2.070 MB 64186 rows
    . . imported "SH"."SALES":"SALES_Q3_2001" 2.129 MB 65769 rows
    . . imported "SH"."SALES":"SALES_Q1_2000" 2.011 MB 62197 rows
    . . imported "SH"."SALES":"SALES_Q1_2001" 1.964 MB 60608 rows
    . . imported "SH"."SALES":"SALES_Q2_2001" 2.050 MB 63292 rows
    . . imported "SH"."SALES":"SALES_Q3_1999" 2.166 MB 67138 rows
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."REGIONS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."REGIONS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."COUNTRIES" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."COUNTRIES" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."LOCATIONS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."LOCATIONS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."DEPARTMENTS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."DEPARTMENTS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOBS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOBS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."EMPLOYEES" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."EMPLOYEES" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOB_HISTORY" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOB_HISTORY" TO "EXAM_03"
    ORA-39112: Dependent object type OBJECT_GRANT:"OE" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"OE" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"OE"."CUSTOMERS_PK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"OE"."CUST_ACCOUNT_MANAGER_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"OE"."CUST_LNAME_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"OE"."CUST_EMAIL_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"PM"."PRINTMEDIA_PK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMER_CREDIT_LIMIT_MAX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMER_ID_MIN" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMERS_PK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"PM"."PRINTMEDIA__PK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"IX"."SYS_C005192" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUSTOMERS_PK" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_ACCOUNT_MANAGER_IX" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_LNAME_IX" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_EMAIL_IX" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"PM"."PRINTMEDIA_PK" creation failed
    Processing object type TABLE_EXPORT/TABLE/COMMENT
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    ORA-39112: Dependent object type REF_CONSTRAINT:"OE"."CUSTOMERS_ACCOUNT_MANAGER_FK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39083: Object type REF_CONSTRAINT failed to create with error:
    ORA-00942: table or view does not exist
    Failing sql is:
    ALTER TABLE "OE"."ORDERS" ADD CONSTRAINT "ORDERS_CUSTOMER_ID_FK" FOREIGN KEY ("CUSTOMER_ID") REFERENCES "OE"."CUSTOMERS" ("CUSTOMER_ID") ON DELETE SET NULL ENABLE
    ORA-39112: Dependent object type REF_CONSTRAINT:"PM"."PRINTMEDIA_FK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    Processing object type TABLE_EXPORT/TABLE/TRIGGER
    ORA-39082: Object type TRIGGER:"HR"."SECURE_EMPLOYEES" created with compilation warnings
    ORA-39082: Object type TRIGGER:"HR"."SECURE_EMPLOYEES" created with compilation warnings
    ORA-39082: Object type TRIGGER:"HR"."UPDATE_JOB_HISTORY" created with compilation warnings
    ORA-39082: Object type TRIGGER:"HR"."UPDATE_JOB_HISTORY" created with compilation warnings
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    ORA-39112: Dependent object type INDEX:"OE"."CUST_UPPER_NAME_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_UPPER_NAME_IX" creation failed
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/POST_INSTANCE/PROCACT_INSTANCE
    ORA-39112: Dependent object type PROCACT_INSTANCE skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39083: Object type PROCACT_INSTANCE failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    BEGIN
    SYS.DBMS_AQ_IMP_INTERNAL.IMPORT_SIGNATURE_TABLE('AQ$_ORDERS_QUEUETABLE_G');COMMIT; END;
    Processing object type TABLE_EXPORT/TABLE/POST_INSTANCE/PROCDEPOBJ
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."AQ$_ORDERS_QUEUETABLE_V" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE_N" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE_R" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."AQ$_ORDERS_QUEUETABLE_E" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLESPACE_01" completed with 63 error(s) at 18:09:14

    Short of trying to then reverse-engineer the objects that are in the dump file (I believe Data Pump export files contain some XML representations of DDL in addition to various binary bits, making it potentially possible to try to scan the dump file for the object definitions), I would tend to assume that the export didn't include those type definitions.
    Since it looks like you're trying to set up the sample schemas, is there a reason that you wouldn't just run the sample schema setup scripts on the destination database? Why are you using Data Pump in the first place?
    Justin

  • How to send data from 4 different data sources to one ODS

    Hello gurus,
    There is a transaction called KSB1 in R/3.
    It has data related to cost center, cost element and G/L.
    In BI, I need to transfer the data from these data sources to one ODS.
    Can anybody give me some idea?
    Points will be awarded for any kind of response.
    Thanks,
    Anu

    Hi Rupa,
    In your requirement some of the data sources (cost centre, cost element) are master data sources.
    They are available as InfoObjects in BI (Content),
    so there is no need to create an ODS on these.
    The general scenario for creating an ODS from multiple data sources applies to transaction documents,
    like:
    PO header, PO line items...
    For this, the prerequisite is to have common fields between the data sources (e.g. the PO number).
    Hope this gives you an idea.
    Cheers,
    Varma

  • Records are missing in the File which XI has placed in Target FTP server

    Hi All,
    I have a scenario wherein XI transfers files from ECC to the target system. No transformation is required, and I am using AAE to run this scenario.
    The issue I am facing is that a few records are missing in the file that XI placed on the target system's FTP server. If I check the same file in the archive folder on ECC (where XI archived it after picking it up), the complete set of records is present.
    I need your inputs, please.
    Note: XI uses AAE to transfer the files and there is no mapping. I also tried to check the audit logs in channel monitoring; unfortunately I was not able to see the logs to check the bytes transferred while reading and writing the file. I have sometimes faced audit log issues in PI 7.1.
    Regards
    Vinay P.

    Please use a temporary file name scheme:
    http://help.sap.com/saphelp_nwpi711/helpdata/en/44/6830e67f2a6d12e10000000a1553f6/content.htm
    hope this helps
    regards
    Ninad

  • Can I use the same customer master record in different sales areas

    Dear sir,
    I have created a customer master record, RAM, in the sold-to-party account group using sales area (I),
    i.e. XYZ (sales org) - A (distribution channel) - XX (division).
    Now I have created another customer master record, DAVID, in the sold-to-party account group using sales area (II),
    i.e. XYZ (sales org) - A (distribution channel) - YY (division).
    When I try to change the SH partner function of RAM to DAVID, it throws an error message that DAVID is not created in sales area (I).
    Please let me know how to resolve this.
    Regards,
    vivek

    Hi,
    There are two solutions to control this.
    1) Extend the customers in the respective sales areas, i.e. DAVID in XYZ-A-XX or RAM in XYZ-A-YY.
    2) Define a reference sales organisation, channel and division and assign them to the respective ones.
    Example:
    Define sales org. "0000" and assign it to all sales orgs.
    Define channel "00" and assign it to all channels.
    Define division "00" and assign it to all divisions.
    Now create customer masters only in sales area 0000-00-00 and you can use such a customer anywhere, for any sales area (at the transactional level). This is a one-time activity and saves a lot of time in creating (or extending) master records in different sales areas.
    Thanks,
    Tarpan

  • Special Ledger records are coming to the PSA and not updating in the data target

    Special Ledger records are coming into the PSA but are not updating the data target at month end, and the InfoPackage is failing; because of that, we are manually updating from the PSA to the data target.
    Can someone tell me why this is happening, and give a solution for it?

    Hi Sankar,
    If your InfoPackage uses 'Only PSA and Update Subsequently in Data Targets' on the Processing tab, then you need to add a 'Read PSA and Update Data Target' process to your process chain, after the InfoPackage load process. It will then take the data from the PSA and load it into the data target.
    Hope this helps...

  • Staging area different from target - which KM?

    Hi,
    I need to transfer data from a CSV file to a database (new inserts only).
    I am working with the following KMs:
    LKM File to SQL
    IKM SQL Control Append
    It is working fine. However, I now need to keep the staging area (where the C$, I$ etc. tables are created) in a separate schema.
    I have created a different schema for the staging area and selected 'Staging Area Different From Target' in the interface.
    However, I am not sure which KMs to use.
    Please let me know how to achieve this.
    Thanks,
    Rosh

    Hi
    First, you have to specify the work schema for the temporary tables when creating the physical schema in the data server for the target.
    Then select 'Staging Area Different From Target' in the overview of the interface.
    After doing this, when you use the predefined KMs in the interface, they will create the temporary tables in the work schema.
    Now suppose you did not give a work schema when creating the physical schema, but selected 'staging different from target' in the interface: the C$ table will be created in your work schema, but the I$ table used by the IKM will be created in the target schema. In that case you would have to change where the IKM creates the I$ table (the work schema) by selecting the corresponding logical schema.
    So it is better to specify the work schema when creating the physical schema.
    Here is the statement the SYSDBA can run to grant the privilege:
    GRANT CREATE ANY TABLE TO ODI_TEMP;
    Hope you got it.
    (Please mark the answer as correct or helpful and close the thread.)
    Thanks
