BW first data load

Hi everyone, I'm a new user of this very exciting forum and I have a problem. I have two connected systems (R/3 and BW). I have activated the business content, but I don't know how to load the data from R/3 to BW. Can anyone help me? The content activated is FI-CO and SD.
Thank you all
Massimiliano

Hi,
Assuming that you have activated the business content in BW and also the DataSources in R/3, let me take you through the process.
You have to send those business content DataSources to BW: select Transfer DataSources in RSA6 on the R/3 side, then come to BW, right-click on the source system, and select Replicate DataSources.
After that, activate the transfer rules and update rules, create an InfoPackage on the InfoSource, and load the data.
Hope this helps
Assign points if useful
Regards,
venkat

Similar Messages

  • No data found in report csv or pdf output on first page load

    No data found in the report CSV output on first page load. The report shows up on the page, but no data is found in the CSV or PDF output. I always need to submit the page first, and then the downloads show data. The page contains a read-only form with a report that references 4 page items using something like the following:
    and b.employee_no = v('P5_EMPLOYEE_NO')
    and c.fisc_year_pk = v('P5_FISC_YEAR_PK')
    and b.job_class = v('P5_JOB_CLASS')
    and b.organization = v('P5_ORGANIZATION')
    Again, the report on the page shows up, but there is no data in the downloads. Can anyone help me solve this problem? I'm running Oracle APEX 4.2.1 on Oracle Database 11gR2.
    Thanks.

    Sounds like a classic session state issue.
    You need to populate your page items using page rendering computations to ensure they're set in session state.
    Or, if the download is triggered by a button press, you need to ensure those page items are submitted to session state, perhaps via a PL/SQL action.
    Scott

  • Data Loader - Only imports first record; remaining records fail

    I'm trying to use Data Loader to import a group of opportunities. Every time I run the Data Loader, it only imports the first record. All the other records fail with the message "An unexpected error occurred during the import of the following row: 'External Unique Id: xxxxxxx'". After running the Data Loader, I can modify the data file and remove the first record that was imported. By running the Data Loader again, the first row (previously the second row) will import successfully.
    Any idea what could be causing this behavior?

    We need a LOT more information, starting with the OS and the version of ID, including any applied patches.
    Next we need to know whether you are doing a single record per page or multiple records, whether the placeholders are on the master page, how many pages are in the document, and whether they all have fields on them (some screen captures might be useful -- embed them using the camera icon on the editing toolbar on the webpage rather than attaching, if it works [there seem to be some issues at the moment, though only for some people]).
    What else is on the page? Are you really telling it to merge all the records, or just one?
    You get the idea... Give a full description of what you have, what you are doing, and what you get instead of what you expect.

  • Why do we load master data first before loading transaction data?

    Hi Experts,
    Why do we load master data first before loading transaction data? Can you give specific reasons for that? Is it mandatory to load MD first?
    I will allocate points to those who help me in detail. My advance thanks to all who respond to my query.
    Edited by: Nagireddy Pothireddy on Mar 10, 2008 8:17 AM

    Hi Nagireddy,
    I hope this helps....
    The bottom line for building cubes is to view facts against dimensions. When I say facts, these are the key figures, i.e. sales volume, sales VAT, etc., measured against some characteristics like sales area, cost center, plant.
    Basically, characteristics are those against which the key figures are measured, like cost center, plant, material, etc.
    Dimensions are groupings of related characteristics. So basically a cube has a central fact table with dimensions associated to it in a relational schema. Imagine now that you want to view the key figure sales volume against the dimension plant. When you consider plant, it has a distribution channel, purchasing organisation, company code, sales area, region, etc. associated with it; these form the attributes of plant, and plant also has descriptions (texts) and possibly a hierarchy. Those attributes, texts, and hierarchies are master data, which the transaction data only references, so first we load the master data and then the transaction data follows.
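    To make this concrete, here is a toy sketch in Java (not how BW is implemented internally, just an illustration of the star-schema idea; every name in it is invented for the example). The fact table stores surrogate keys (SIDs) instead of characteristic values, so the SID and text tables should already be filled by the master data load before transaction data references them:
    import java.util.HashMap;
    import java.util.Map;

    public class SidDemo {
        static Map<String, Integer> plantSids = new HashMap<>();  // SID table for the plant characteristic
        static Map<Integer, String> plantTexts = new HashMap<>(); // text table, keyed by SID
        static int nextSid = 1;

        static int sidFor(String plant) {
            // Assign a surrogate key on first use
            return plantSids.computeIfAbsent(plant, p -> nextSid++);
        }

        public static void main(String[] args) {
            // Master data load: SIDs and texts exist before any fact arrives
            plantTexts.put(sidFor("1000"), "Hamburg plant");

            // Transaction data load: the fact row carries only the SID
            int sid = sidFor("1000");
            System.out.println("fact -> plant SID " + sid + " (" + plantTexts.get(sid) + "), salesVolume=42");

            // A fact arriving before its master data still gets a SID on the fly,
            // but its texts and attributes stay empty until the master data is loaded
            int unknown = sidFor("2000");
            System.out.println("fact -> plant SID " + unknown + " (" + plantTexts.get(unknown) + "), salesVolume=7");
        }
    }
    The second fact prints a null text: the load succeeds, but reports show blanks for that plant until its master data arrives, which is why master data is loaded first.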

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: "Delete data from table" option is not working, please help

    Issue:
    I have a SAP BW system and a SAP HANA system.
    SAP BW is connected to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table with the help of the DB Connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the "Delete data from table" option, and 16 records were loaded from BW to HANA.
    The second time I executed it from BW to HANA, there were 32 records (it appends).
    Then I executed the Open Hub service with the "Delete data from table" option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    Checking from SAP BW to SAP BW, it works fine.
    Does this option work through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level: Destination tab and field definition).
    There is a check box there which I have already selected; that is exactly my issue: even though it is selected,
    the deletion is not performed at the target level.
    SAP BW to SAP HANA via DB Connection:
    1. The first time, suppose 16 records: the DTP executed and loaded 16 records up to HANA.
    2. The second time it executed from BW, the HANA side appended: 16+16 = 32.
    3. So I selected the "Delete data from table" check box at the Open Hub level.
    4. Now executing the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Does the "delete data from table" option apply to HANA at all?
    Thanks
    Santhosh Kumar

  • Master data load failed: error "Update mode R is not supported by the extraction API"

    Hello Experts,
    I load master data for 0Customer_Attr through a daily process chain, and it was running successfully.
    For the last 2 days, the master data load for 0Customer_Attr has failed with the following error message:
    "Update mode R is not supported by the extraction API"
    Can anyone tell me what that error means and how to resolve this issue?
    Regards,
    Nirav

    Hi
    The update mode R error comes up in the following case:
    You run a delta (for master data) which fails due to some error. To resolve that error, you set the load to red and try to repeat the load.
    This time the load fails with update mode R,
    as a repeat delta is not supported here.
    So the only thing you can do now is re-init the delta (as described in the posts above) and then proceed. The earlier problem has nothing to do with update mode R.
    For example, say your first delta failed with a replication issue:
    replicating and repeating alone will not fix the update mode R error;
    you have to do both, i.e. replicate the DataSource and re-init the delta.
    One more thing I would like to add:
    If the delta that failed the first time (not with update mode R) had already picked up records, then
    you have to do an init with data transfer.
    If it failed without picking up any records,
    then do an init without data transfer.
    Hope this helps
    Regards
    Shilpa
    Edited by: Shilpa Vinayak on Oct 14, 2008 12:48 PM

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system where I pull data from a source using a Direct Access DTP (it does not extract from the PSA; it extracts directly from the source).
    The source is a table in the Oracle DB, and using a DataSource and a Direct Access DTP, I pull data from this table into my BW InfoCube.
    The DTP's package size has been set to 100,000, and whenever this load is triggered, a lot of data records from the source table are fetched in multiple data packages. This has been working fine and works fine now as well.
    But, very rarely, the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from the source.
    It ends with the message "No more data records found" even though we have records waiting to be pulled. This DTP in the process chain does not even fail; it continues to the next step with a green status.
    Have you faced a similar situation in any of your systems? What is the cause, and how can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your Suggestions.
    Unfortunately, I would not be able to implement any of your suggestions because I'm not allowed to change the DTP settings.
    So I'm working on finding the root cause of this issue and came across SAP Note 1506944 ("Only one package is always extracted during direct access"), which says this is a program error.
    Hence, I'm checking further with SAP on this and will share their insights once I hear back from them.
    Cheers
    Shiva

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a data flow from 3.5 to 7 in development and moved it to production. So until now in production, the data loads happened using InfoPackages:
    a. InfoPackage1 from the DataSource to the ODS, and
    b. InfoPackage2 from the ODS to the cube.
    Now, after transporting the migrated data flow to production, to load the same InfoProviders I use:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from the PSA to the DSO.
    3. DTP2 to load from the DSO to the cube.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. However, when I try step b (above), the cube loads fine using the InfoPackage. So I am unable to understand why the DTP fails while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Data load problems

    Hello friends
    I am facing a problem with a data load. I modified a cube by adding a few characteristics. The characteristics were first added to the communication structure and the transfer rules. Then I reactivated the update routine. Finally, I deleted all previous data load requests for the cube and did a full load. However, I wasn't able to find any data for the newly added fields in the cube.
    Did I miss something? Any help in this regard will be appreciated.
    Thanks
    Rishi

    How come an ODS came into the picture? This was not mentioned in your previous post. Are you loading from the ODS to the cube and having problems?
    It looks like you are not using a DTP. In that case, check the change log for the newly added fields, then follow the data flow ODS > PSA > Cube and check whether those fields are present in the PSA. If yes, check the update rules in debugging mode to see whether the values get deleted there.
    Hope it Helps
    Chetan
    @CP..

  • Data loading after field enhancement.

    Dear all,
    We are using BI 7.00, and in one of our DataSources a new field has to be enabled. Our people are under the impression that, without downtime, the previous data which is available in the target and the PSA can get values for the new field as well.
    I cannot see how that is possible, so experts' suggestions are required. Can you kindly answer the following questions?
    1) Can an enhancement be done to the DataSource without deletion of the setup tables?
    2) Can the delta queue stay as it is, without stopping the delta pull process, i.e. the process chain and the background jobs?
    3) If the field is enhanced, can values for the field be loaded into all the data previously loaded to the PSA and the target?
    I request the experts to provide an apt solution so that the field enhancement can take place without disturbing any of the data loads.
    I went through the forum posts and was able to find something about export DataSources and loop-back principles; these suggest that my requirement is possible.
    I do not know the process, though. Can the experts provide step-by-step suggestions for my query?
    Regards,
    M.M

    Hello Magesh,
    1) An enhancement cannot be done if there are records in the setup tables.
    2) When an enhancement is done, the delta queue also needs to be empty, so you will have to stop the collective run jobs, lock the system, and empty the delta queue by scheduling the delta package twice. Only then will the transports to production be successful.
    3) Until you fill the setup tables again and do a historical load, the old records will not have values for the newly added field.
    If you just do an init without data transfer and schedule new delta loads, then the newly added field will contain values from that day onwards and for subsequent changes; data previously loaded to BW will remain as it is. To get values for the newly added field in the history, you need to load the history through full repair loads after filling the setup tables first.
    Follow these steps to load only the new values for the added field:
    1) Lock the system.
    2) Schedule the collective update job through job control so that all the records are in the delta queue and no records or LUWs are left in LBWQ for that DataSource.
    3) Schedule the delta InfoPackage twice so that even the queue for the repeat delta is empty.
    4) Do the transports, then delete the old init and do a new init without data transfer.
    5) Schedule the normal delta.
    To also have history for the added field:
    1) Lock the system.
    2) Delete the old init and clear the LUWs from LBWQ.
    3) Do the transports.
    4) Fill the setup tables and do an init without data transfer for the DataSource.
    5) Unlock the system.
    6) Do the full repair loads to the BW data targets.
    7) Schedule the delta loads.
    Thanks
    Ajeet

  • Data load for target after target enhancement

    Dear all,
    We are using BI 7.00, and in one of our designs the data is first loaded to the ODS and then to the cube. Now we want to get the value of one more field. This field is already available in the DataSource, and its data is flowing up to the PSA. I have added the field to the ODS. My problem starts here: I want the data to flow for all the previous requests, i.e. the requests loaded to the ODS and the cube prior to the enhancement, without disturbing the data loads.
    Kindly provide step-by-step instructions. I do not want the earlier data to be deleted; the field's value should be updated in the previous loads to the ODS and the cube.
    Regards,
    M.M

    Hi,
    The overwrite option is available in the update mode of the key figures.
    It is the key figures that determine whether the DSO is in overwrite or addition mode.
    Go to the mapping of each InfoObject in the transformation from the DataSource to the DSO, check the mapping type of each key figure, and see whether it is overwrite or not.
    To avoid reloading the cube, one option is to add the field in both the cube and the DSO and then activate the transformations, mappings, DTPs, etc.
    After that, load the data to the DSO and schedule the delta from the DSO to the cube. This will bring all the changes in the DSO into the cube.
    But this delta may be a huge one and may fail, depending on the amount of data in the DSO.
    You can try this, and if it does not work, you may have to delete the cube and reload it.
    Thanks
    Ajeet

  • Data Load

    We are trying to load 2LIS_03_BF data from SAP R/3 into SAP BW.
    The following steps were followed in the process:
    1. Delete data from the inventory queue (LBWQ, MCEX03 entries).
    2. Delete the setup tables (LBWG).
    3. Check the data in the extractor (RSA3); 0 records should be there.
    4. Fill the setup tables for 2LIS_03_BX (MCNB): termination date = next day, transfer structure = 2LIS_03_BX, only valuated stock (with posting block) on 14th August.
    5. Fill the setup tables for 2LIS_03_BF (OLI1BW), with the data restricted by posting date 01.01.1999-14.08.2007.
    6. Generate the initial status for 2LIS_03_BX in RSA1 (BW); done within the posting block.
    7. Collapse the data with marker update.
    8. Start/schedule the control job for BF on the R/3 side to run every 2 hours (LBWE), as suggested by the external consultant.
    9. Initialize the delta process for 2LIS_03_BF (RSA1); started on 15th August but failed due to a termination in R/3.
    10. So we started full updates in two parallel data loads into BW, 3 months at a time. Each load took 2 days to bring in 2 million records.
    11. This load of data up to 14th August 2007 finished on 4th September 2007.
    12. An init load without data transfer was done successfully (to activate the delta for BW).
    13. The delta to BW was scheduled, and it transferred 0 from 0 records.
    14. Checked the data in the R/3 delta queue (RSA7): data records are shown from 01.09.2007 to 04.09.2007; we are unable to find data from 15.08.2007 to 31.08.2007.
    15. Performed a full data load from 15.08.2007 until today (in order to get the data for the missing days) via RSA1; 0 from 0 records were transferred.
    We are looking for any advice on getting the data records from 15th August until today.
    This is a very critical issue, because we are unable to provide our business with any production or stock reports.
    Please, someone, help us resolve the issue as early as possible.

    Hi,
    I have a suggestion you can try.
    As you said, your delta init failed on 15th August, but a later init completed successfully as of today's date, right?
    So, if your init and delta activation were successful, the system would have started capturing the data through whichever update mode you set.
    Therefore, first go to RSA7 and check whether you have any delta records there.
    If you find none,
    then go to transaction LBWQ and check the entries against 'MCEX03'; you should be able to see the number of records.
    Step 2: double-click on that record and check the value in the status field,
    and if it is anything other than 'Ready', change the status to 'Ready'...
    and revert back to me for further steps.

  • To get first date and end date after entering any month and year

    Hi,
    I need to get the first date and the end date of a month and year in yyyyMMdd format. I am reading the month and year from a properties file, but I don't know how to get the first date and end date in the given format. The properties file gives me just text, and I don't know how to build the date format from it. I need this urgently. Can anyone help me with code for this?
    I am reading the fields as,
    Properties props = new Properties();
    props.load(new FileInputStream("AnyMonthVolume.properties"));
    String date_month = props.getProperty("date_month");
    String date_year = props.getProperty("date_year");
    Thanks.

    I know this has been posted a while ago, but in case someone is looking for it, here is the code to get the end-of-current-month date:
    import java.util.Calendar;

    Calendar cal = Calendar.getInstance();
    cal.setTime(new java.util.Date());
    cal.set(Calendar.DATE, 1);   // set the date to the start of the month
    cal.add(Calendar.MONTH, 1);  // move to the start of the next month
    cal.add(Calendar.DATE, -1);  // step back one day: the last day of the current month
    System.out.println(cal.getTime());
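    For the original question (the first and the last date of a given month and year in yyyyMMdd format), here is a minimal sketch using java.time on Java 8+. The sample values are hypothetical stand-ins for whatever date_month and date_year hold in the properties file:
    import java.time.LocalDate;
    import java.time.format.DateTimeFormatter;
    import java.time.temporal.TemporalAdjusters;

    public class MonthBounds {
        public static void main(String[] args) {
            String date_month = "03"; // hypothetical values; read them from the
            String date_year = "2008"; // properties file as shown in the question
            LocalDate first = LocalDate.of(Integer.parseInt(date_year), Integer.parseInt(date_month), 1);
            LocalDate last = first.with(TemporalAdjusters.lastDayOfMonth());
            DateTimeFormatter yyyyMMdd = DateTimeFormatter.ofPattern("yyyyMMdd");
            System.out.println(first.format(yyyyMMdd)); // 20080301
            System.out.println(last.format(yyyyMMdd));  // 20080331
        }
    }
    Calendar works too, as shown above, but java.time is less error-prone: its months are 1-based and LocalDate is immutable.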

  • Data load failed while loading data from one DSO to another DSO

    Hi,
    A data load failed on SID generation while loading data from the source DSO to the target DSO.
    The following error is occurring:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I am not getting why it succeeded in one DSO, i.e. the source, but failed in the other DSO, i.e. the target.
    While analyzing, I found that "SID Generation upon Activation" is checked in the source DSO but not in the target DSO. Is that the reason it failed?
    Please explain.
    Thanks,
    Sneha

    Hi,
    I hope your data flow has been designed in such a way that the 1st DSO acts as a staging layer, all transformation rules and routines are maintained between the 1st and the 2nd DSO, and "SID generation upon activation" is maintained on the 2nd DSO. That way, the data in the 1st DSO is the same as the source system data, since no transformation rules or routines are applied on the way in, which helps avoid data load failures.
    Please analyze the following:
    1) Have you loaded master data before the transaction data? If not, please do that first.
    2) Go to the properties of the first DSO and check whether "SID generation upon activation" is maintained there (I guess it may not be).
    3) Go to the properties of the 2nd DSO and check whether "SID generation upon activation" is maintained there (I expect it is).
    This may be the reason.
    Also check whether any special characters are involved in your transaction data (even lowercase letters).
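    On the last point, you can see exactly which characters the offending value contains by decoding the HEX dump that BW prints in the error message. A small stand-alone Java sketch (the class name is invented for the example; the bytes in this dump decode as UTF-16LE):
    import java.nio.charset.StandardCharsets;

    public class HexDumpDecode {
        public static void main(String[] args) {
            // The (truncated) HEX dump from the error message above
            String hex = "450078007400650072006E0061006C0020005200650066";
            byte[] bytes = new byte[hex.length() / 2];
            for (int i = 0; i < bytes.length; i++) {
                bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
            }
            String value = new String(bytes, StandardCharsets.UTF_16LE);
            System.out.println("[" + value + "]"); // prints [External Ref]
            // List each code point to spot characters BW rejects by default
            value.chars().forEach(c -> System.out.printf("U+%04X %c%n", c, (char) c));
        }
    }
    Note that lowercase letters and '#' are not in BW's default permitted character set (see transaction RSKC), and the quoted value above contains both, plus trailing blanks.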
    Regards
    BVR

  • Error in data loading from a 3rd-party source system with DBConnect

    Hi,
    We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
    After the upgrade, we are facing a problem with data loads from a third-party Oracle source system using DBConnect.
    The connection is working OK and we can see the tables in the source system, but we cannot load the data.
    The error in the monitor is as follows:
    'Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.'
    But, unfortunately, the error message has no further information.
    If we look at the job log in SM37, the job finished with the following entries:
    27.10.2009 12:14:19 Job started                                                                                00           516          S 
    27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA)                    00           550          S 
    27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG                                              RSM1          797          S 
    27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container                                     OL           356          S 
    27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G     RSM1          796          S 
    27.10.2009 12:14:24 Job finished                                                                                00           517          S 
    In a BW 3.10 system, there is no message related to the element NOAUTHORITYCHECK, so I am wondering if this is something new in NW 7.0.
    Thanks in advance,
    Rajib

    Several things can cause errors like this:
    1. The RFC connection failed.
    2. A problem in the source system; check the source system.
    3. The Oracle consultants may be filling up the loads; check with them whether they are, and tell them to stop.
    4. Check the IDoc processing.
    5. Memory issues.
    6. Finally, change the DataSource, activate it again, and run the load.
    Also check the RFC connection in SM59. If it is OK, then
    check SAP Note 692195 for authorization.
    Santosh
