Data load issue with export data source - BW 3.5

Hi,
We are facing issues in loading data with the help of an export data source.
We have created an export data source for the 0PCA_C01 cube. With the help of this export DataSource, we are loading data to another custom cube. The scenario is working fine in the development server.
But after we transported the objects to the quality server, data is not getting loaded to the custom target cube.
It is extracting zero records. All transports are OK, and we had generated the export DataSource in quality before the transports. We also regenerated the export DataSource after the transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but data is not getting extracted.
RSA3 for the 80PCA_C01 DataSource isn't extracting any records in Quality. Records are getting extracted in development. We are on BW 3.5 with patch level 19.
Please guide us to resolve the issue.
Thanks,
Aditya

Hi
Make sure that you have the relevant roles and authorizations in Quality/PRS.
You have to transport the source cube first and then generate the export DataSource in QAS. Then replicate the DataSources for the BW QAS source system and make sure the replicated DataSource exists in QAS. Only then can you transport the new update rules for the second cube.
Hope this helps and is clear.

Similar Messages

  • Master data load issue with flexible data source

    Hi All,
    I have the data source 1CL_OLIS001 for loading configuration data. The changed records are not updating into this DS regularly.
    I have a delta load on this every day. It loads correct data for one month and then suddenly starts loading 0 records every day without failing. If I reset the delta, it works fine for a month or two and then gets corrupted again.
    Has anyone had this kind of issue before? Please suggest how to resolve this issue permanently.
    Thanks and Regards,
    Pooja

    Hi Pooja,
    1) It might be a database tablespace issue in the background. Please check it out with BASIS.
    Regards,
    Marasa.

  • Data Pump issue with nls date format

    Hi Friends,
    I have a database with NLS date format 'DD/MM/YYYY HH24:MI:SS' from which I wish to take an export. I have a target database with NLS date format 'YYYY/MM/DD HH24:MI:SS'. I have a few tables whose create statements have some date fields with DEFAULT '01-Jan-1950', and these CREATE TABLE statements are failing in my target database when processed by Data Pump. The tables are not getting created due to this error:
    Failing sql is:
    CREATE TABLE "MCS_OWNER"."SECTOR_BLOCK_PEAK" ("AIRPORT_DEPART" VARCHAR2(4) NOT NULL ENABLE, "AIRPORT_ARRIVE" VARCHAR2(4) NOT NULL ENABLE, "CARRIER_CODE" VARCHAR2(3) NOT NULL ENABLE, "AC_TYPE_IATA" VARCHAR2(3) NOT NULL ENABLE, "PEAK_START" VARCHAR2(25) NOT NULL ENABLE, "PEAK_END" VARCHAR2(25), "BLOCK_TIME" VARCHAR2(25), "FLIGHT_TIME" VARCHAR2(25), "SEASON" VARC
    ORA-39083: Object type TABLE failed to create with error:
    ORA-01858: a non-numeric character was found where a numeric was expected
    The table create SQL adds a column as VALID_FROM DATE DEFAULT '01-jan-1970', which I think is the issue. Appreciate if someone can suggest a way to get around this. I have tried altering the NLS of the source DB to be the same as the target database; the impdp still fails.
    Database is 10.2.0.1.0 on Linux X86 64 bit.
    Thanks,
    SSN

    "Appreciate if someone can suggest a way to get around with this."
    Change the DDL of the CREATE TABLE to include the TO_DATE() function.
    With Oracle, characters between single quote marks are STRINGS!
    'This is a string, 2009-12-31, not a date'
    When a DATE datatype is desired, use the TO_DATE() function, e.g. DEFAULT TO_DATE('01-JAN-1970','DD-MON-YYYY') instead of DEFAULT '01-jan-1970'.

  • Sql loader - Data loading issue with no fixed record length

    Hi All,
    I am trying to load the following data through SQL*Loader, but only records 1, 3 and 4 load successfully into the table; the rest of the records show up as BAD. What is missing in my syntax?
    .ctl file:
    LOAD DATA
    INFILE 'C:\data.txt'
    BADFILE 'c:\data.BAD'
    DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
    INTO TABLE icap_gcims
    TRAILING NULLCOLS
    (
         CUST_NBR_MAIN           POSITION(1:9)     CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
         CONTACT_TYPE            POSITION(10:11)   CHAR NULLIF (CONTACT_TYPE=BLANKS),
         INQUIRY_TYPE            POSITION(12:13)   CHAR NULLIF (INQUIRY_TYPE=BLANKS),
         INQUIRY_MODEL           POSITION(14:20)   CHAR NULLIF (INQUIRY_MODEL=BLANKS),
         INQUIRY_COMMENTS        POSITION(21:60)   CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
         OTHER_COLOUR            POSITION(61:75)   CHAR NULLIF (OTHER_COLOUR=BLANKS),
         OTHER_MAKE              POSITION(76:89)   CHAR NULLIF (OTHER_MAKE=BLANKS),
         OTHER_MODEL_DESCRIPTION POSITION(90:109)  CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
         OTHER_MODEL_YEAR        POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
    )
    data.txt file:
    000000831KHAN
    000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
    000001110KHAP
    000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
    000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
    Thanks,

    Hi,
    It would have been better if you had given the table structure. I checked your script and it was fine:
    11:39:01 pavan_Real>create table test1(
    11:39:02   2  CUST_NBR_MAIN  varchar2(50),
    11:39:02   3  CONTACT_TYPE varchar2(50),
    11:39:02   4  INQUIRY_TYPE varchar2(50),
    11:39:02   5  INQUIRY_MODEL varchar2(50),
    11:39:02   6  INQUIRY_COMMENTS varchar2(50),
    11:39:02   7  OTHER_COLOUR varchar2(50),
    11:39:02   8  OTHER_MAKE varchar2(50),
    11:39:02   9  OTHER_MODEL_DESCRIPTION varchar2(50),
    11:39:02  10  OTHER_MODEL_YEAR varchar2(50)
    11:39:02  11  );
    Table created.
    11:39:13 pavan_Real>select  * from test1;
    no rows selected
    C:\Documents and Settings\ivy3905>sqlldr ara/ara@pavan_real
    control = C:\control.ctl
    SQL*Loader: Release 9.2.0.1.0 - Production on Sat Sep 12 11:41:27 2009
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 5
    11:42:20 pavan_Real>select count(*) from test1;
      COUNT(*)
             5
    control.ctl:
    LOAD DATA
    INFILE 'C:\data.txt'
    BADFILE 'c:\data.BAD'
    DISCARDFILE 'c:\data.DSC' DISCARDMAX 50000
    INTO TABLE test1
    TRAILING NULLCOLS
    (
    CUST_NBR_MAIN POSITION(1:9) CHAR NULLIF (CUST_NBR_MAIN=BLANKS),
    CONTACT_TYPE POSITION(10:11) CHAR NULLIF (CONTACT_TYPE=BLANKS),
    INQUIRY_TYPE POSITION(12:13) CHAR NULLIF (INQUIRY_TYPE=BLANKS),
    INQUIRY_MODEL POSITION(14:20) CHAR NULLIF (INQUIRY_MODEL=BLANKS),
    INQUIRY_COMMENTS POSITION(21:60) CHAR NULLIF (INQUIRY_COMMENTS=BLANKS),
    OTHER_COLOUR POSITION(61:75) CHAR NULLIF (OTHER_COLOUR=BLANKS),
    OTHER_MAKE POSITION(76:89) CHAR NULLIF (OTHER_MAKE=BLANKS),
    OTHER_MODEL_DESCRIPTION POSITION(90:109) CHAR NULLIF (OTHER_MODEL_DESCRIPTION=BLANKS),
    OTHER_MODEL_YEAR POSITION(110:111) CHAR NULLIF (OTHER_MODEL_YEAR=BLANKS)
    )
    data.txt
    000000831KHAN
    000000900UHFA WANTS NEW WARRANTY ID 000001017OHAL
    000001110KHAP
    000001812NHDE231291COST OF SERVICE INSPECTIONS TOO HIGH MAXIMA 92 MK
    000002015TPFA910115CUST UPSET WITH AIRPORT DLR. $200 FOR PLUGS,OIL,FILTER CHANGE. FW
    CUST_NBR_MAIN     CONTACT_TYPE     INQUIRY_TYPE     INQUIRY_MODEL     INQUIRY_COMMENTS     OTHER_COLOUR     OTHER_MAKE     OTHER_MODEL_DESCRIPTION     OTHER_MODEL_YEAR
    000000831     KH     AN     NULL     NULL     NULL     NULL     NULL     NULL
    000000900     UH     FA      WANTS     NEW WARRANTY ID 000001017OHAL     NULL     NULL     NULL     NULL
    000001110     KH     AP     NULL     NULL     NULL     NULL     NULL     NULL
    000001812     NH     DE     231291C     OST OF SERVICE INSPECTIONS TOO HIGH MAXI     MA 92 MK     NULL     NULL     NULL
    000002015     TP     FA     910115C     UST UPSET WITH AIRPORT DLR. $200 FOR PLU     GS,OIL,FILTER C     HANGE. FW     NULL     NULL
    - Pavan Kumar N

  • Data load issue with Master infoobject

    Hi SDNs,
    I have a master InfoObject with a list of time-independent attributes, and I have successfully loaded the data to this master InfoObject. This master InfoObject is being used as an attribute in some InfoObjects and as an InfoObject in InfoCubes and DSOs.
    Now the requirement has changed and a few attributes of the master InfoObject have become time dependent. When I tried to delete the data in the master InfoObject, I was able to delete some of the data but am unable to delete the rest.
    Is it necessary to delete the data from all the InfoProviders before deleting the data from the master InfoObject when the change is in its attributes?
    Advance thanks
    karunakar

    I guess there is no need to delete the master data for this change! Only if you want to uncheck the time-dependent flag must the data be deleted.
    You can tick time dependent for the master data object and activate it; the system will tell you whether a deletion of data is necessary or not.
    After this action you will see Date To and Date From in the transformations. You can map them and do the load; it will overwrite the master data object.

  • Date comparison issue with different date formats

    Hi Friends
    I am trying to find the difference in hours between two dates, but I am not able to do that as the formats of the two dates I have are different.
    I have both dates in String format. Below is my code.
    The first date is in the format yyyy-MM-dd-HH.mm.ss.SSS: String sDate = "2011-05-29-13.50.44.050761";
    The second date is in the format yyyy-MM-dd HH.mm.ss: String uiDate = "2011-05-29 13.50.44";
    Now I need to parse both Strings into Date objects and format them so that I can subtract the two dates using
    long difference = uiDate.getTime() - sDate.getTime();
    Can you please let me know how I can do that, given that the formats of the two dates are different?
    Is there a common format to which both can be converted and the calculation done?
    Any suggestions and solutions would be of great help.

    Thanks all for your suggestions.
    Well, I figured out the solution with some Googling and R&D.
    For the first case I was getting the Date and Time fields from the UI as two different fields. I combined these into a Date object and set the format for the first date to yyyy-MM-dd-HH.mm.ss.SSS.
    The second date was already in that format. Then I just did the Date comparison and got the difference in hours, which was what I wanted.
    Vikeng
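
    A minimal Java sketch of the approach described above, assuming both values arrive as strings; the class name and the trimming of the fractional seconds are illustrative assumptions, not part of the original post:

    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.concurrent.TimeUnit;

    public class DateDiffExample {
        public static void main(String[] args) throws ParseException {
            String sDate  = "2011-05-29-13.50.44.050761";
            String uiDate = "2011-05-29 13.50.44";

            // SimpleDateFormat only understands millisecond precision, so keep just
            // the first three fraction digits of the database-style timestamp.
            SimpleDateFormat dbFormat = new SimpleDateFormat("yyyy-MM-dd-HH.mm.ss.SSS");
            SimpleDateFormat uiFormat = new SimpleDateFormat("yyyy-MM-dd HH.mm.ss");

            Date first  = dbFormat.parse(sDate.substring(0, 23)); // "2011-05-29-13.50.44.050"
            Date second = uiFormat.parse(uiDate);

            long diffMillis = second.getTime() - first.getTime();
            long diffHours  = TimeUnit.MILLISECONDS.toHours(Math.abs(diffMillis));
            System.out.println("Difference in hours: " + diffHours);
        }
    }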

  • Has anyone had issues with Administration\Data Import/Export\Data Import???

    I have a client who has recently upgraded from version 2007 to version 8.81. They were successfully using this standard function (Administration > Data Import/Export > Data Import) to import supplier prices into their master price list, but now it fails.
    I have looked at the file they are importing and it appears to be fine.
    On closer inspection, it contained approximately 46,000 entries, so I took the first 1,000 and created a test file, which imported fine.
    The only issue I found was speed, with the test file of 1,000 records taking about 30 minutes to import. This appeared to get slower and slower the further through the file it got.
    Based on this, I have estimated that the whole file would take about 13 hours to import. The client says that when they ran it on version 2007 it was far quicker.
    In practice, it does appear to run, but the speed is the issue. Having said this, I set the whole file to run last night (overnight), and this morning it appeared to have hung after about 2,307 rows, with nothing else being updated.
    Has anyone any ideas, or is anyone aware of performance issues like this?
    Thanks,
    Ian

    Always an option, but would you give your clients access to this tool?
    Not sure really.
    I have uploaded a copy of their database onto my test system and run the same routine. It is equally as SLOW.
    I can't gauge whether it's an issue that 8.81 has and 2007 didn't, as I only have the client's word on it; however, I have no reason to disbelieve them.
    Kind regards,
    Ian

  • Issue with 2LIS_02_SCL Data source

    Hi Everybody,
    I am facing the below 2 issues with the 2LIS_02_SCL data source:
    1) It is fetching only the records with ETENR (Delivery Schedule Line Counter) value '1' and ignoring the others, e.g. 2, 3 and 4. Hence the data does not reconcile with the ECC system.
    2) The standard field GLMNG is not getting any data, although data exists at table (EKET) level. So I have written code and the data is coming through now. But the problem is that this does not seem to consider the ROCANCEL indicator. All the other key figure values come in with a negative sign when the ROCANCEL value is 'X' or 'R', but this field gets only positive values irrespective of the ROCANCEL indicator, and hence shows incorrect values compared to ECC.
    Can anybody help me on this,
    Regards,
    Gopinath

    Hi Gopinath:
       Have you already applied any SAP Note to solve this problem?
    Please check if the SAP Note below is applicable to your system.
    668177 - "LIS BW: wrong quantity for documents with invoice plan"
    Regards,
    Francisco Milán.

  • Inventory data load issue

    Hi all,
    We have 2 source systems, SAP 4.7 and ECC 6.0, and I am using the 3 DataSources BX, BF and UM.
    We require one year of data from SAP 4.7 and all data from ECC 6.0.
    In SAP 4.7 a total of 7 years of data is available, but we require only the last year. My doubt is: if I extract only the last year of data, will OPENING STOCK and CLOSING STOCK show correctly in my report?
    - The SAP 4.7 closing stock will be the opening stock in ECC 6.0, in the manner in which it was uploaded in the source system.
    Since we are getting data from two source systems, what extraction steps do I have to follow?
    Kindly give me your suggestions.
    Thanks
    sara

    Hi,
    First you need to make sure the closing stock of R/3 4.7 matches the ECC 6.0 opening stock. It should be the same; usually this is addressed when the data cutover happens. You can cross-check it using transaction MB5B / MB52 etc.
    When you load the data using the BX DataSource, it is a full load and it pulls the stock as on date. So it will bring the data only from the connected ECC 6.0 system, and there won't be any issue with that data.
    BF will bring the material movements, which are needed if you want to see historic data. So do the loading in the normal manner and split the load depending on the data volume. While doing the setup table activity in 4.7, select only the needed period, and do the same for the UM DataSource.
    While setting up the delta, do it only for 6.0.
    Regards

  • Data Load Issue "Request is in obsolete version of DataSource"

    Hello,
    I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object) I get the below message:
    Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
    The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
    I have taken the following actions:
    1. Replicated the DataSource
    2. Deleted all requests from the PSA
    3. Activated the DataSource using RSDS_DATASOURCE_ACTIVATE_ALL
    4. Re-transported the DataSource, transformation and DTP
    Still getting the same issue.
    If you have any idea please reply asap.
    Samit

    Hi
    Generate your datasource in R/3 then replicate and activate the transfer rules.
    Regards,
    Chandu.

  • TileList data load issue

    I am having an issue where the data that drives a TileList works correctly when the TileList is not loaded on the first page of the application. When it is put on a second page in a ViewStack, the TileList displays correctly when you navigate to it. When the TileList is placed on the first page of the application, I get the correct number of items in the TileList, but the information the item renderer is supposed to display (a picture, caption and title) does not appear. The strange thing is that a Tree populates correctly in the same situation. Here is the sequence of events:
    // get_tree is the data for the tree and get_groups is the data for the tilelist
    creationComplete="get_tree.send();get_groups.send();"
    <mx:HTTPService showBusyCursor="true" id="get_groups"
        url="[some xml doc]" resultFormat="e4x"/>
    <mx:XMLListCollection id="myXMlist"
        source="{get_groups.lastResult.groups}"/>
    <mx:HTTPService showBusyCursor="true" id="get_tree"
        url="[some xml doc]" resultFormat="e4x" />
    <mx:XMLListCollection id="myTreeXMlist"
        source="{get_tree.lastResult.groups}"/>
    The data providers of the TileList and Tree are then set accordingly. I tried moving the data calls from creationComplete to the initialize event, thinking it would fire earlier in the process and be done by the time final creation completed, but that didn't help either. I'm at a loss as to why the Tree works fine no matter where I put it, but the TileList does not. It's almost as if the Tree and the TileList will sit and wait for the data, but the item renderer in the TileList will not. That would explain why clicking on the TileList still produces the correct sequence of events while the visual component of the TileList is just not working right. Anyone have any ideas?

    Ok, so if ASO value is wrong, then its a data load issue and no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • Master Data Load Issue

    Hi Experts,
    This is regarding an issue with Master Data "DELTA" load.
    WBSe master data has the below fields:
    WBSe: ABCD
    Approval Year: 2014
    Program definition: ZZPA
    Program position: Z_ABCD
    Funding Type: CB
    Program Position master data has the below fields:
    Approval Year: 2014
    Program definition: ZZPA
    Program position: Z_ABCD
    Funding Type: DC
    The WBSe master data table has a lookup on Program Position to populate Funding Type based on Approval Year, Program Definition and Program Position.
    The issue is that the daily delta load is not picking up the recent/new Funding Type values.
    But if I load the same record with a Program Position selection in a FULL load, the new Funding Type is updated.
    Please tell me why Funding Type is not getting updated with the new value in the DELTA load.
    Thanks
    Asha

    Hi Asha,
    In delta mode only new or changed records are loaded to the WBSe master data,
    and the lookup will only pick up data for records that are loaded from the DataSource to the WBSe master data.
    There are 2 solutions to this:
    1) Change the WBSe master data to a full load; that will take care of it, as master data is always overwritten.
    2) Enhance this field in the source system, i.e. a DataSource enhancement, if you know the lookup table in the source system.
    Thanks and Regards,
    Amit.

  • C#/SharePoint - View State related issue while exporting data to Excel

    We have a web application based on SharePoint with a list view that displays data based on search criteria. We have a total of around 16,000 records. When we try to export all data to Excel it gives an error. The cause of the issue is:
    View State is used to save the state of the page, i.e. the previous state. When the export-to-Excel link is clicked, the data present in the grid is saved in view state so that the overhead of generating the data again (as per the selection criteria) can be avoided. This view state has a limit on how many records it can store. When we put in some selection criteria the number of records is small, so there is no problem storing the view state; but when there are no selection criteria, 16,306 records are too much to store in view state, and the error occurs.
    We tried to solve this issue by creating a compressor class and overriding two methods:
    LoadPageStateFromPersistenceMedium
    SavePageStateToPersistenceMedium
    This has resolved the issue with Export to Excel. However, there is now an issue with the paging of the list view. The list view has a column with a hyperlink which opens related data in a form. When we click the link on the 2nd page of the list view, it displays the data for the link in the same row on the 1st page.
    Please suggest how to resolve this issue or any workaround for it.

  • Demantra Data Load Issue

    I am new to Demantra. I have installed a stand-alone Demantra system on our server. In order to load data, I created a new model, defined item and location levels, and then clicked on 'Build Model'. The data is loaded into 3 custom tables created by me. After creating the model, I cannot log in to 'Collaborator Workbench'; it gives the message 'There are system errors. Please contact your System Administrator'. Can anyone please tell me what I am doing wrong and how to resolve the issue?
    Thanks

    Ok, so if ASO value is wrong, then its a data load issue and no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values and not overwrite.

  • DSO data Load issue

    Hi all,
    I have an issue with a DSO data load. While loading, the data comes into the PSA perfectly (238 records), but when I trigger the DTP, I am getting only 6 records.
    Can anyone please suggest what might be wrong?
    Thanks,
    Gayatri.

    Hi Gayatri,
    If you have already loaded some data to the DSO and are now trying to do a delta, it is possible that it is picking up only the delta data,
    (or)
    you may have Start/End routines or rule routines written that delete records based on some conditions,
    (or)
    it may depend on the key fields you have selected in the DSO. If the key field you have selected has repeated values, the data will be aggregated while loading into the DSO: if you have 10 rows for a key field value of, say, 101, then the DSO will hold only one row with value 101 (10 rows becoming 1 row), with the key figure either summed or overwritten depending on what you selected in the rule details for the key figure (you can check this by right-clicking the key figure mapping > Rule Details > there you can see whether it is Overwrite or Summation). A small illustration follows after this reply.
    Also, as mentioned in the posts above, you can check the DSO --> Manage --> the number of rows transferred and the number of rows added.
    Hope it is clear & helpful!
    Regards,
    Pavan
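
    A toy Java sketch (not BW code; the key values and amounts are made up, and it assumes Java 16+ for the record syntax) of how repeated key-field values collapse into one row per key during a DSO-style load, with the key figure either overwritten or summed:

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class DsoKeyCollapse {
        record Row(String key, int amount) {}

        public static void main(String[] args) {
            // 5 incoming PSA rows, but only 2 distinct key values.
            List<Row> psaRows = List.of(
                new Row("101", 10), new Row("101", 20), new Row("101", 5),
                new Row("202", 7),  new Row("202", 3));

            Map<String, Integer> overwrite = new LinkedHashMap<>();
            Map<String, Integer> summation = new LinkedHashMap<>();
            for (Row r : psaRows) {
                overwrite.put(r.key(), r.amount());                 // last record per key wins
                summation.merge(r.key(), r.amount(), Integer::sum); // amounts are added up
            }
            // Either way, 5 transferred rows become 2 added rows -- one per key.
            System.out.println("Overwrite: " + overwrite); // {101=5, 202=3}
            System.out.println("Summation: " + summation); // {101=35, 202=10}
        }
    }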
