Help needed in loading data from BW(200) to APO(100).

Hi everybody,
I am trying to load master data from BW (200) to APO (100); this is what I did.
1) Created an InfoObject with some fields (General: EBELN; Attributes: BUKRS, WERKS, MATNR).
2) Created and activated the communication structure.
3) Went to BW (200), created a DataSource (SBIW), and extracted the data from a particular InfoObject that had the same fields, then saved it. I then checked RSA3 for data availability in the DataSource, and the data was available there too.
4) Came back to APO (100); in the source system tab, opened BW (200) and replicated the DataSources. I was able to see the DataSource name that was created in BW (200).
5) Created and activated the transfer structure.
6) Created an InfoPackage and loaded the data, but the monitor says "not yet completed", "currently in process", and it also shows "0 of 0 records".
I want to know:
1) Is there any mistake in what I have done above?
2) How long will it take to complete the process (i.e. the loading)?
Please help me through this problem.
Thanks,
Ranjani.

Hi,
I am surprised by your steps. In APO, you want to load data from a particular InfoObject in BW, so why did you create a specific extractor in SBIW?
You have just reinvented the wheel... It reminds me of some people in the world...
Here is what you should do:
- In BW, at the InfoSource level, create a direct InfoSource based on the InfoObject whose data you want to extract to APO (let's say 0MATERIAL).
- In BW, at the InfoSource level, right-click on the InfoSource itself and choose 'Generate Export DataSource'. That will create the DataSources for you (attributes, texts, hierarchies) depending on the InfoObject settings. The names of these DataSources begin with 8, for the data mart.
- In APO, replicate the BW source system. You will now find the DataSources 80MATERIAL and so on.
- In APO, create the transfer rules to your InfoSource, and you can load.
Just give it a try
Regards

Similar Messages

  • Help needed with loading data from ODS to cube

    Hi
    I am loading data from an ODS to a cube in QA. The update rules are active and the definition seems right. I am getting the following error in the update rules:
    "Error in the unit conversion. Error 110 occurred"
    Help.
    Thanks.

    Hi Chintai,
    You can see the record number where the error occurred in the monitor (RSMO) for this data load: go to the Details tab and open up the Processing area (+ sign). Try it out...
    Also, about ignoring the error record and uploading the rest: this is done if you have set error handling in your InfoPackage (Update tab), but it has to be done before the load starts, not after the error happens.
    Hope this helps...
    And since you thanked me twice, also please see here:-) https://www.sdn.sap.com/irj/sdn?rid=/webcontent/uuid/7201c12f-0701-0010-f2a6-de5f8ea81a9e

  • Help needed to load data using sql loader.

    Hi,
    I am trying to load data from xls into an Oracle table (on Solaris OS) and it is failing to load the data.
    Control file:
    LOAD DATA
    CHARACTERSET UTF16
    BYTEORDER BIG ENDIAN
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
    HOST_NM,
    SERVICE_9071_DOLLAR DOUBLE,
    SERVICE_9310_DOLLAR DOUBLE,
    SERVICE_9700_DOLLAR DOUBLE,
    SERVICE_9701_DOLLAR DOUBLE,
    SERVICE_9710_DOLLAR DOUBLE,
    SERVICE_9711_DOLLAR DOUBLE,
    SERVICE_9712_DOLLAR DOUBLE,
    SERVICE_9713_DOLLAR DOUBLE,
    SERVICE_9720_DOLLAR DOUBLE,
    SERVICE_9721_DOLLAR DOUBLE,
    SERVICE_9730_DOLLAR DOUBLE,
    SERVICE_9731_DOLLAR DOUBLE,
    SERVICE_9750_DOLLAR DOUBLE,
    SERVICE_9751_DOLLAR DOUBLE,
    GRAND_TOTAL DOUBLE
    )
    Log file:
    Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    HOST_NM FIRST * , CHARACTER
    SERVICE_9071_DOLLAR NEXT 8 DOUBLE
    SERVICE_9310_DOLLAR NEXT 8 DOUBLE
    SERVICE_9700_DOLLAR NEXT 8 DOUBLE
    SERVICE_9701_DOLLAR NEXT 8 DOUBLE
    SERVICE_9710_DOLLAR NEXT 8 DOUBLE
    SERVICE_9711_DOLLAR NEXT 8 DOUBLE
    SERVICE_9712_DOLLAR NEXT 8 DOUBLE
    SERVICE_9713_DOLLAR NEXT 8 DOUBLE
    SERVICE_9720_DOLLAR NEXT 8 DOUBLE
    SERVICE_9721_DOLLAR NEXT 8 DOUBLE
    SERVICE_9730_DOLLAR NEXT 8 DOUBLE
    SERVICE_9731_DOLLAR NEXT 8 DOUBLE
    SERVICE_9750_DOLLAR NEXT 8 DOUBLE
    SERVICE_9751_DOLLAR NEXT 8 DOUBLE
    GRAND_TOTAL NEXT 8 DOUBLE
    Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column HOST_NM.
    Field in data file exceeds maximum length
    Table FIT_UNIX_NT_SERVER_COSTS:
    0 Rows successfully loaded.
    1 Row not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Please help me as soon as possible.
    Awaiting your reply.

    Hi,
    I verified and everything looks fine to me.
    Table structure:
    HOST_NM VARCHAR2(30)
    SERVICE_9071_DOLLAR NUMBER(8,2)
    SERVICE_9310_DOLLAR NUMBER(8,2)
    SERVICE_9700_DOLLAR NUMBER(8,2)
    SERVICE_9701_DOLLAR NUMBER(8,2)
    SERVICE_9710_DOLLAR NUMBER(8,2)
    SERVICE_9711_DOLLAR NUMBER(8,2)
    SERVICE_9712_DOLLAR NUMBER(8,2)
    SERVICE_9713_DOLLAR NUMBER(8,2)
    SERVICE_9720_DOLLAR NUMBER(8,2)
    SERVICE_9721_DOLLAR NUMBER(8,2)
    SERVICE_9730_DOLLAR NUMBER(8,2)
    SERVICE_9731_DOLLAR NUMBER(8,2)
    SERVICE_9750_DOLLAR NUMBER(8,2)
    SERVICE_9751_DOLLAR NUMBER(8,2)
    GRAND_TOTAL NUMBER(8,2)
    Control file:
    LOAD DATA
    BYTEORDER BIG ENDIAN
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
    HOST_NM ,
    SERVICE_9071_DOLLAR NUMBER(8,2),
    SERVICE_9310_DOLLAR NUMBER(8,2),
    SERVICE_9700_DOLLAR NUMBER(8,2),
    SERVICE_9701_DOLLAR NUMBER(8,2),
    SERVICE_9710_DOLLAR NUMBER(8,2),
    SERVICE_9711_DOLLAR NUMBER(8,2),
    SERVICE_9712_DOLLAR NUMBER(8,2),
    SERVICE_9713_DOLLAR NUMBER(8,2),
    SERVICE_9720_DOLLAR NUMBER(8,2),
    SERVICE_9721_DOLLAR NUMBER(8,2),
    SERVICE_9730_DOLLAR NUMBER(8,2),
    SERVICE_9731_DOLLAR NUMBER(8,2),
    SERVICE_9750_DOLLAR NUMBER(8,2),
    SERVICE_9751_DOLLAR NUMBER(8,2),
    GRAND_TOTAL NUMBER(8,2)
    )
    Sample data file:
    ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
    ABOS39,6391.16,,1315.00,,1400.00,,,,,,,,,4081.88,13188.04
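    One likely cause of the "Field in data file exceeds maximum length" error on HOST_NM is that CHARACTERSET UTF16 / BYTEORDER BIG ENDIAN are being applied to a plain single-byte CSV, so the comma delimiters are never recognized and the first field runs past the default 255-character limit; the binary DOUBLE datatype also does not match the text numbers shown in the sample rows. The control file below is only a sketch under those assumptions (the CHAR length and the DECIMAL EXTERNAL datatypes are guesses, not verified against the actual file):
        -- Sketch only: assumes cost.csv is plain ASCII/UTF-8 text,
        -- so the UTF16 / byte-order clauses are dropped.
        LOAD DATA
        INFILE 'cost.csv'
        BADFILE 'consolidate.bad'
        DISCARDFILE 'Sybase_inventory.dis'
        INSERT
        INTO TABLE FIT_UNIX_NT_SERVER_COSTS
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        TRAILING NULLCOLS
        (
          -- CHAR(30) matches the VARCHAR2(30) column
          HOST_NM             CHAR(30),
          -- text numbers in the file, so DECIMAL EXTERNAL rather than binary DOUBLE
          SERVICE_9071_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9310_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9700_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9701_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9710_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9711_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9712_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9713_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9720_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9721_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9730_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9731_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9750_DOLLAR DECIMAL EXTERNAL,
          SERVICE_9751_DOLLAR DECIMAL EXTERNAL,
          GRAND_TOTAL         DECIMAL EXTERNAL
        )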

  • Help needed to insert data from different database

    Hi ,
    I have a requirement where I need to fetch data from a different database through a database link. Depending on the user request, the DB link needs to change, and the data from the corresponding table in that database has to be fetched and populated. Could I use EXECUTE IMMEDIATE for this? Would a DB link work within EXECUTE IMMEDIATE? If not, could you please let me know of any other approach?

    What does "the dblink needs to change" mean?
    Are you trying to dynamically create database links at run-time? Or to point a query at one of a set of pre-established database links at run-time?
    Are you sure that you really need to get the data from the remote database in real time? Could you use materialized views, Streams, etc. to move the data from the remote databases to the local database? That tends to be far more robust.
    Justin
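    If the database links are pre-established, EXECUTE IMMEDIATE does work; the link name just has to be concatenated into the SQL text (it cannot be supplied as a bind variable). A minimal PL/SQL sketch, assuming two existing links LINK_A and LINK_B that both expose a table called REMOTE_TABLE (these names are made up for illustration):
        DECLARE
          v_dblink VARCHAR2(30);
          v_count  NUMBER;
        BEGIN
          -- Choose the link based on the user request (hard-coded here).
          v_dblink := 'LINK_A';
          -- The link name is part of the SQL text, so it is concatenated in.
          EXECUTE IMMEDIATE
            'SELECT COUNT(*) FROM remote_table@' || v_dblink
            INTO v_count;
          DBMS_OUTPUT.PUT_LINE('Rows seen over ' || v_dblink || ': ' || v_count);
        END;
        /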

  • Help needed in  extracting data from PCD tables

    Hi Friends
    I have a requirement for creating a custom portal activity report; even though we have a standard report, the extracted data will be used to create BW reports later.
    My part is to find a way to extract the data from the PCD tables for creating custom portal activity reports.
    I have selected the following tables for the data extraction:
    WCR_USERSTAT, WCR_WEBCONTENTSTAT, WCR_USERFIRSTLOGON, WCR_USERPAGEUSAGE.
    My questions are
    1. Did I select the exact PCD tables?
    2. Can I use the UME API for accessing the data from those tables?
    3. Can I use the data extracted from the PCD tables in a JSPDynPage or Web Dynpro app?
    4. Can I query the PCD tables from a JSPDynPage or Web Dynpro?
    Please help me in finding a solution for this.
    Thanks
    Ashok Battula

    Hi Daniel,
    Can you tell me whether I can develop the following custom reports from those WCR tables:
         Report Type
    1     Logins
          - Unique Count
          - Total Count
          - Most Active Users (by Partner Name)
          - Most Active Users (by Contact Name)
          - Entry Point (by page name)
          - Session Time
          - Hourly Traffic Analysis
    2     Login Failures
          - Total Count
          - Count by error message
          - Credentials Entered (by user name and password)
    3     Content Views (by File Name)
          - Unique Count
          - Total Count
          - Most requested Files
          - Most requested Pages
          - File Not Found
    4     Downloads (by File Name)
          - Unique Count
          - Total Count
          - Most requested Files
          - File Not Found
    5     Portal Administration
          - Site Content (by file name)
          - Site Content (by page name)
          - Latest Content (by file name)
          - Expired Content (by file name)
          - Subscriptions Count (by file name)
    6     Login History (by Partner, Contact Name)
          - No Login
          - First Login
          - Duration between registration and first login
          - Most Recent Login
          - Average Number of Logins
    Please help me in finding a way.
    Thanks,
    Ashok

  • Need to load data from source .CSV files to oracle target database.

    Hi,
    This is my scenario:
    I have .CSV files in an FTP folder and need to load the data into target tables.
    For that I need to create a package and load the data on a daily basis.
    But sometimes the .csv file name will vary from day to day.
    Can anyone suggest an approach?
    Thanks in advance.
    Zakeer

    Dear Roy,
    Thanks for your response
    Now I am able to extract the .zip file with OdiUnZip (file) and load the data into the target, but this is happening in a static way.
    My scenario is that sometimes I will get .zip files with different names, containing different .csv files. I need to dynamically find the new .zip file, extract it, and load the data into the target.
    Please advise me.
    Thanks in advance
    Zakeer

  • Urgent help needed with reading data from Cube

    Hi Gurus,
    Can anyone tell me how to read a value from a record in the cube using a key (a combination of fields)?
    Please share any custom function module or piece of code if you have one.
    Any ideas are highly appreciated.
    Thanks,
    Regards,
    aarthi

    Due to the nature of the cube design, that would be difficult; that's why there are APIs, function modules, and MDX capabilities - to eliminate the need to navigate the physical structure. Otherwise you would have to concern yourself with:
    - the fact that there are two fact tables, E and F, that would need to be read (the fact view could take care of this one);
    - wanting to be able to read aggregates if they were available;
    - the fact table only having DIM IDs, or SIDs (for line-item dimensions), to identify the data, not characteristic values. A DIM ID represents a specific combination of characteristic values.

  • Help needed in reading data from a Crystal Report

    I am trying to read data values from a saved Crystal Report (Groovy code snippet below).
    I open a new ReportClientDocument and set the RAS using the inprocConnectionString property.
    Then I get the rowsetController and set defaultNumOfBrowsingRecords, rowsetBatchSize and maxNumOfRecords all to 1000000.
    Using the browseFieldValues method after that returns only 100 records; I want to get all of them.
        ReportClientDocument clientDocSaved;
        def pathout = rc.pathToSavedReports + rr.path;
        //****** BEGIN OPEN SAVED SNAPSHOT ************
        clientDocSaved = new ReportClientDocument();
        clientDocSaved.setReportAppServer(ReportClientDocument.inprocConnectionString);
        // Open the saved report read-only
        println("Reading file in.")
        clientDocSaved.open(pathout, OpenReportOptions._openAsReadOnly);
        def rowsetController = clientDocSaved.getRowsetController()
        rowsetController.setDefaultNumOfBrowsingRecords(1000000)
        rowsetController.setRowsetBatchSize(1000000)
        rowsetController.setMaxNumOfRecords(1000000)
        // Set up metadata from the report's result fields
        IRowsetMetaData rowsetMetaData = new RowsetMetaData()
        Fields fields = clientDocSaved.getDataDefController().getDataDefinition().getResultFields()
        rowsetMetaData.setDataFields(fields)
        // Browse the values of the first result field (this only returns 100 records)
        def values = rowsetController.browseFieldValues(fields.get(0), 1000000)
        values.each { value ->
            println(value.displayText(new Locale("ENGLISH")))
        }
    Before using the browseFieldValues method I was trying to use a rowsetCursor
        // get the rowset cursor to navigate through the results
        RowsetCursor rowsetCursor = clientDoc.getRowsetController().createCursor(null, rowsetMetaData)
        // navigate through the rowset and log the name and value
        rowsetCursor.moveTo(0)
        while (!rowsetCursor.isEOF()) {
            Record currentRecord = rowsetCursor.getCurrentRecord()
            //println("current record number: " + rowsetCursor.getCurrentRecordNumber())
            for (int i = 0; i < fields.size(); i++) {
                //println("Column - " + fields.get(i).getName())
                if (currentRecord.size() > 0) {
                    println(currentRecord.getValue(i))
                }
            }
            rowsetCursor.moveNext()
        }
    The currentRecord was always an empty list, and I did not get any data values at all.
    Am I missing something in using these approaches?
    I'm fairly new to using the Java SDK for Crystal Reports. Any help will be greatly appreciated.
    Thanks

    Hi, I am trying this second method to read the values from the report, but all the records come back as null. Kindly help me resolve this issue.

  • Help needed to recover data from a 2009 iMac with a failing drive that won't boot

    How can I boot my 2009 iMac, which came with Snow Leopard, from my MacBook Pro running Yosemite, so that I can recover my data from the iMac's failed drive?

    Yes, in target disk mode. Can I do it wirelessly, or does it need a wired FireWire cable between the computers? I want to try a repair from Disk Utility and recover my data using my Data Rescue app first. Will there be any problems using a MacBook Pro that is running Yosemite?

  • Advanced Help Needed - Restoring iPhone data from Backup

    Upon updating my iPhone 3G[S] from 3.1 to 3.1.2 I did a full restore, and I made sure to back up the device before this. Post-update/restore I accidentally hit "Set up as new phone", thereby deleting my earlier backup, a backup I had been building on for over 2 years. When I realized that iTunes had automatically deleted my backup, I quickly opened my file recovery program and got the whole backup folder restored to a separate drive. I moved the freshly recovered backup folder to the correct directory within the iTunes AppData location, right next to where my new/empty/useless backups were. Right-clicking my device within iTunes and selecting "Restore from backup", my backup from earlier in the day was not listed.
    After further inspection of the backup folder I had recovered, I noticed that there was no Info.plist file. I went into the other/useless backup that iTunes had just made and brought its Info.plist into my older backup folder; now my older backup was listed in iTunes. I selected it and attempted to restore, only to receive an error halfway through the restore, "(-32)" to be exact. I checked again and again with my recovery tools to see if the original Info.plist file had been deleted; it had not, and I have no idea why it would be missing. Worse, I have no idea how to get iTunes to restore the backup without it, considering it doesn't even recognize the backup as existing without it. Is there a way to create a custom Info.plist so I could restore, or to force a restore outside of iTunes? Any advice would be greatly appreciated!

    Backup: c:\users\[username]\AppData\Roaming\Apple Computer\MobileSync\Backup
    If you copy these folders onto a new computer in the same location (or the contents of a current long-name folder into one of the long-name folders in Backup), you should be able to restore your old data.
    Outside of that, maybe track down the location of your Outlook PST or Address Book database; support.microsoft.com probably has some tips.

  • Loading data from SAP Business One to BW

    We need to load data from several SAP Business One databases to BW. I found some special objects for SAP Business One in the Business Content,
    and a short description in the Help:
    "The source for the data upload is a file system. The relevant files can be generated using the views provided within the SAP Business One database."
    Maybe someone has experience in this area or could point me to documentation about this.
        Or maybe it would be better to use DB Connect?
    Thanks for your help

    Hello Elena,
    I am just curious whether you've found an answer to your question about how to extract data from Business One into SAP BI. What can we use? Only flat files, or do some standard extractors exist? Or have you used DB Connect for that?
    Thank you,
    -Vitaliy

  • How to regularly load data from .csv file to database (using apex)

    Hi,
    I am using APEX 3 and I need to load data from a CSV file into APEX. I need to perform this automatically through code at a regular interval of 5-10 seconds.
    Is it possible? If yes, how? Please reply as early as possible; this will decide whether or not to use APEX for this application.
    This is a question for Application Express; I don't know why it is in the BPEL forum.
    Edited by: TEJU on Oct 24, 2008 2:57 PM

    Hello,
    Do you really need to load the data every 5-10 seconds? Presumably it's read-only?
    I would look at using an Oracle external table instead; that way you just drop your CSV in a location the DB can read, and you build a table that essentially references the underlying CSV (so when you query the table you see the data currently in the CSV file).
    Take a look at this link for a quick example on usage -
    http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
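    A minimal sketch of that approach (the directory path, file name, and columns below are illustrative assumptions):
        -- One-time setup: a directory object pointing at the folder the CSV is dropped into.
        CREATE OR REPLACE DIRECTORY csv_dir AS '/u01/app/incoming';

        -- External table that reads the CSV in place; no load job is needed.
        CREATE TABLE csv_data_ext (
          id    NUMBER,
          descr VARCHAR2(100)
        )
        ORGANIZATION EXTERNAL (
          TYPE ORACLE_LOADER
          DEFAULT DIRECTORY csv_dir
          ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
          )
          LOCATION ('data.csv')
        )
        REJECT LIMIT UNLIMITED;

        -- Every query sees whatever is currently in data.csv, so refreshing every
        -- 5-10 seconds is simply a matter of overwriting the file.
        SELECT * FROM csv_data_ext;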
    Hope this helps,
    John.
    Blog: http://jes.blogs.shellprompt.net
    Work: http://www.apex-evangelists.com
    Author of Pro Application Express: http://tinyurl.com/3gu7cd
    REWARDS: Please remember to mark helpful or correct posts on the forum, not just for my answers but for everyone!

  • Load Data from a table on one server's database, to the same table structure in multiple server databases

    Hi,
    I have a situation where I have to load data from one server/database table to multiple servers/databases.
    Example:
    I need to load data from dbo.TABLE_A (on Server: Server_A, Database: Database_A) to the same table on a list of server databases like:
    Server: Server_B , Database: Database_B
    Server: Server_C , Database: Database_C
    Server: Server_D , Database: Database_D
    Server: Server_E , Database: Database_E
    Server: Server_F , Database: Database_F
    Server: Server_G , Database: Database_G
    Server: Server_H , Database: Database_H
    and so on, across 250 such server/database combinations.
    The table structure is the same on all the servers.
    If I make the source or destination dynamic, it throws an error during mapping.
    I cannot get linked-server permissions, and the SQL Server configurations approach doesn't work either.
    Please suggest how to load data from one source to multiple servers/databases.
    Thank you.

    I just need to transfer one table's data. I have to use a query to pick only the most recent data, so I use something like: select A, B, C, D from dbo.table where ETL_TIMESTAMP > (the max(ETL_TIMESTAMP) in the destination on the other server). There are no foreign key relationships, and the data should not be truncated; it just has to append the new records.
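    A sketch of that incremental append for a single destination, assuming the package runs one query against the destination connection and one against the source connection (the column list mirrors the example above, and @last_loaded stands in for however the value is passed between the two steps):
        -- Step 1: against the destination server, find the high-water mark.
        SELECT MAX(ETL_TIMESTAMP) AS last_loaded
        FROM dbo.TABLE_A;

        -- Step 2: against the source server, pull only the newer rows and
        -- append the result set to the destination table.
        SELECT A, B, C, D, ETL_TIMESTAMP
        FROM dbo.TABLE_A
        WHERE ETL_TIMESTAMP > @last_loaded;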

  • Loading data from APO-BW cube into DP planning area

    Hello,
    Please explain to me how to load data from an InfoCube within APO into the DP planning area. So far, I have generated the data mart, created the update rules for the DP planning area (cube), and replicated the DataSources. The problem I am having is that when I create the InfoPackage, there are no data targets on the data target tab. How can I get the data targets to display? All the other actions appear to be set up correctly (confirmed by BW experts). Can data be loaded from an APO-BW cube directly to the DP planning area?
    Thanks

    Hi James,
    Loading data from an InfoCube into a planning area is done using transaction /SAPAPO/TSCUBE (program /SAPAPO/RTSINPUT_CUBE). You just specify the InfoCube, target planning area, selection condition, InfoCube key figure to planning area key figure mapping, etc.
    Have you tried this transaction already?
    Hope this helps.

  • Load data from BW 7.0 DSO into a ECC 6.0 SD standard/Custom pricing table

    Gurus,
    We have a scenario where we need to load data from a BW DSO to an SD standard/custom pricing table in ECC 6.0. The data will be a few thousand records.
    To my knowledge, in BPS, retractors are available to update data from BW to ECC, and Open Hub can also be used to handle similar scenarios.
    Has anyone come across a similar scenario?
    If you have a third option (not BPS retractors or Open Hub) as a solution to handle this kind of scenario, please share the knowledge; it will be greatly appreciated.
    Thanks in advance,
    Vittal

    Hi Yogesh,
    Thanks for your reply.
    We have large data volumes for the billing DataSource, so moving flat files using PI is not an option; also, as I mentioned, part of the requirement is monitoring of the whole process. What I mean by this is that if 1,000 billing document items were passed on to CRM 7 to create member activities and, for some reason, a member activity was not created for one of the billing document items, we should know the problematic record and the reason why the member activity was not created in CRM 7 (reason code), and then be able to fix it. All this requires an end-to-end monitoring capability and also guaranteed delivery of data to CRM 7.
    Hence I was trying to explore the enterprise web service option.
    What I am not sure of is how to expose a BW DSO delta request to CRM 7 using a web service, or any other method that gives end-to-end monitoring capability and also guaranteed delivery of data to CRM 7.
    Any other suggestions ?
    Thanks
    CK
