Moving data out of an ODS

I'm playing with ODS for the first time. I've created the ODS and the update rules. Now I have data loaded and activated in the ODS. I want to move that data to the next ODS level. What steps do I take to move this data from one ODS to another? (I've got valid activated update rules defined between the two ODS objects.)
thanks,
Chris

Chris, for an ODS you don't have to do Generate Export DataSource; it's done automatically by BW. Just look for the InfoSource starting with 8(ODS name) under InfoSources, data marts.
If you don't see it, go to the main menu and somewhere there you will find an option to Display Generated Objects (look around in the main menu).
All you need to do is create an InfoPackage and start loading (because you said the update rules between the two ODS objects are already done).
Uday-Ram Chamarthy

Similar Messages

  • Error when loading the data from PSA to ODS......

    Hi BW gurus,
    I am facing a problem while loading data from PSA to ODS, so please help me in this regard.
    Please give me step-by-step guidelines.
    The errors while loading from PSA to ODS are "There are no PSA tables for these selection criteria" and "An error occurred when reading PSA data".
    Thanks in advance,
    Ashok

    Hi Ashok,
    You can push data from PSA to ODS. To do this, go to the PSA in RSA1 > PSA > go to that request > right-click > select "Schedule update immediately"; the data will then be moved from PSA to ODS.
    Or:
    In the ODS, delete the failed request, then go to the Processing tab of the InfoPackage, select the third option, "PSA and then subsequently into data targets", and schedule the InfoPackage.
    Bye,
    Sunil

  • I need to get data out of Oracle 8i??!!

    I am working for a company that is moving from one old application to a new application.
    The Old application is using Oracle 8i for a backend. I know SQL Server very well. However, I don't know Oracle.
    1) One guy on the forum helped me to find the SIDs and see the databases being run.
    2) I have also changed the password file and can now get in as internal.
    Great, but I need data.
    1) How can I get the data out of Oracle 8i?
    2) Is there a way to export the data to CSV or spreadsheet?
    3) There are 1400 tables with cryptic table names. How can I see the tables and their column names?
    (this is easy to do with SQL Server, but again, I am just starting to use Oracle).
    Thanks in advance for any and all help!!! This is crazy!

    Hello,
    No one likes a thread that goes on and on. I really appreciate those that try to help.
    I know this database is being used; we are trying to read it and create reports before switching over to the new application.
    I've tried
    select table_name, column_name from user_tab_columns;
    I get over 15,000 rows of this
    TABLE_NAME COLUMN_NAME
    EXU81REFC ROWNER
    EXU81REFC RTNAME
    EXU81REFC CNAME
    EXU81REFC CNO
    EXU81REFC RCNO
    EXU81REFC ACTION
    EXU81REFC ENABLED
    EXU81REFC DEFER
    EXU81REFC PROPERTY
    EXU81REFC ROBJID
    EXU81REFC ROWNERID
    EXU81REFC REFTYPE
    EXU81REFI OBJID
    EXU81REFI OWNER
    EXU81REFI OWNERID
    EXU81REFI TNAME
    EXU81REFI ROWNER
    EXU81REFI RTNAME
    EXU81REFI CNAME
    EXU81REFI CNO
    EXU81REFI RCNO
    (none of which looks like user tables, but rather system tables)
    when I tried
    SELECT TABLE_NAME, NUM_ROWS FROM ALL_TABLES;
    I get over 1400 rows of this:
    TABLE_NAME NUM_ROWS
    CNBIL1
    CNBIL2
    CNBJR
    CNBJT
    CNBLC
    CNBLDE
    CNBLES
    CNBLK1
    CNBLK4
    CNBLLOG
    CNBLNET
    CNBLQ
    CNBLR
    CNBLR1
    CNBLRD
    CNBLRR
    CNBLRT
    CNBLRW
    CNBLTF5
    CNBLX
    CNBPC
    I've tried to do a select on a few table_names and I get "no rows selected".
    Isn't there a way in Oracle to see what tables have a rowcount > 0 and then get the table_names and column_names from those tables?
    I found the free download of TOAD, but not sure how much more that would help me than using SQL Plus at this stage of the game when I don't even see data. Maybe TOAD would help getting the data once I know where to look...I guess.
    Thanks for helping to straighten out the confusion.
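    For reference, here is roughly the kind of data dictionary query that helps in this situation; APP_OWNER and the table/column names below are only placeholders for whatever the application schema really is:
    -- List application tables, filtering out the SYS/SYSTEM objects you were seeing
    SELECT owner, table_name
    FROM all_tables
    WHERE owner NOT IN ('SYS', 'SYSTEM')
    ORDER BY owner, table_name;
    -- Column names and types for one schema
    SELECT table_name, column_name, data_type
    FROM all_tab_columns
    WHERE owner = 'APP_OWNER'
    ORDER BY table_name, column_id;
    -- NUM_ROWS is only filled in after the tables have been analyzed,
    -- e.g. ANALYZE TABLE app_owner.some_table ESTIMATE STATISTICS;
    SELECT owner, table_name, num_rows
    FROM all_tables
    WHERE owner = 'APP_OWNER' AND num_rows > 0;
    -- A simple CSV-style export from SQL*Plus
    SET HEADING OFF
    SET PAGESIZE 0
    SET FEEDBACK OFF
    SPOOL some_table.csv
    SELECT col1 || ',' || col2 FROM app_owner.some_table;
    SPOOL OFF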

  • What are required Oracle products for moving data from IBM IMS/DB(mainframe) to Oracle environment?

    I am a z/OS system programmer; our company is using IMS as its main OLTP database. We are investigating moving data off the mainframe for data warehousing and online fraud detection. One option is using IBM InfoSphere CDC and DB2; another option is using IMS Connect and writing our own program. I am wondering what the Oracle solution is for this kind of issue.
    I am not an Oracle technician, but I googled and found that Oracle has products like Oracle Legacy Adapters, OracleAS CDC Adapter and Oracle Connect on z/OS; however, I didn't find them on the Oracle site (https://edelivery.oracle.com/), so I don't know whether these products are deprecated or not.
    I would very much appreciate any help or guidance you are able to give me.

    Thank you for responding.
    I've considered dumping the data into a flat file and using SQL*Loader to import as you suggest but this would require some scripting on a per-table basis. Again: all I want to do is to copy the contents of a table from one database to another. I do not think I should have to resort to creating my own dump and load scripts in order to do that. However I agree with you that this type of solution may be my final solution.
    I've tried the db link solution. It was just as slow as the 'imp' solution for some reason. Don't know why. The tables are rather large (3 tables of a few GB each) and therefore require intermediate commits when loaded. Otherwise the rollback segment will run out of space. So the 'db link solution' is really a PL/SQL script with a commit for each x records.
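    Roughly, that script looks like the sketch below (SRC_LINK, MY_BIG_TABLE and the COL1/COL2 column list are placeholders for the real names):
    DECLARE
      l_rows_done PLS_INTEGER := 0;
    BEGIN
      FOR r IN (SELECT col1, col2 FROM my_big_table@src_link) LOOP
        INSERT INTO my_big_table (col1, col2) VALUES (r.col1, r.col2);
        l_rows_done := l_rows_done + 1;
        IF MOD(l_rows_done, 10000) = 0 THEN
          COMMIT;  -- intermediate commit so the rollback segment stays small
        END IF;
      END LOOP;
      COMMIT;
    END;
    /
    The intermediate commits keep the rollback segment from filling up, but they are also part of why it is slow.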
    I think Oracle is making it a bit difficult for me to copy the contents of a table from one database to another and to do it efficiently. Perhaps I'm missing something here?

  • Extracting Data Out of BW

    Hi,
    I'm researching what other companies are doing with regard to extracting data out of their BW data warehouses.
    Do you allow data to be extracted out of BW?
    If so, do you push data out to them?  Or do the external systems pull the data out?
    There's no right answer here.  I'm just trying to get a feel for best practices as to whether external clients should pull data out or have it pushed to them.
    By PUSH, I mean that BW initiates the data extraction.  And by PULL, I mean that external systems initiate the data extraction.
    THANKS!

    Hi,
    Does anybody know whether it is possible to create an InfoSpoke on a MultiCube/MultiProvider?
    When creating an InfoSpoke you can only choose between
    DataSource types Basic Cube, InfoObject (x2) & ODS; if you choose the Basic Cube option and then do the F4 dropdown, the dropdown does not contain a list of MultiProviders.
    However, if you type the name of the MultiProvider straight into the field, the system will accept it and the spoke will activate.
    I have tried to execute the spoke and it does generate a few data packets before erroring. I am unclear whether the error is because the spoke is based on a MultiProvider or is a different issue. The error message is provided below:
    System error: RSDRC / FORM AUTHORITY_CHECK RSDRC / FORM AUTHORITY_CHECK R
    System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET RSDRC / FUNC RSDRC_B
    System error: RSDRC / FORM DATA_GET RSDRC / FORM DATA_GET RSDRC / FORM DA
    Thanks
    Tony

  • Moving Calendar to a New Computer & Moving Data: Downloads and Add-ons

    Moving Calendar to a New Computer & Moving Data: Downloads and Add-ons
    The handle says is all. Totally lost here & in a virtual house of cards. Please first see screen-shot.
    I have the Mozilla calendar (Lightning?) program on my computer. Is this screen-shot Thunderbird, or is it an add-on called Lightning that was added on to Thunderbird? I don't even recall how I acquired it a few years ago. Whatever it is, I find it invaluable. Indispensable. Brilliantly conceived. A superb creation.
    Notice that I have several calendars in different colors for different purposes and a long task agenda on the right. Wonderful program!! I absolutely love it! I cannot afford to lose these calendars and this data. However, big caveat, entrusting my life (our lives) to it has become extremely dangerous, for it's a virtual house of cards upon the purchase of a new computer. The move-over is more like a career, and most of us already have one. Need anyone really wonder why PC sales are down? Res ipsa loquitur.
    FIRST ISSUE:
    I must move this/these Mozilla program/s and calendars over to the new computer, but I don't have a clue where to begin. What must I download? Thunderbird? What must I add on? Lightning? Is this a suite of programs? All I care about are my calendars and task agendas that you see in the screen-shot -- not any email programs etc. Just this personal management system that you see.
    SECOND ISSUE:
    After downloading, installing, and adding on what I need, how do I then move (migrate) my profile (data) over to the new computer?
    Please know that I do know where that 'profile' and its .ics calendar profiles are stored. I am also well backed up. I just need to know how to get (download/install) what I need and move this over to the new computer. (See screen-shot.)
    First, what must I download, install, and add on? Second, how do I move this data/profile (a virtual life) over to the new computer without it all coming down like a house of cards?
    Please, unless it's all been documented in one location, a link to this and a link to that probably won't be very helpful. Anyone who can lay-out a simple, easy to follow plan (1, 2, 3, 4, done) will be a hero and make a significant contribution to people who dread buying a new computer for precisely these reasons. It will certainly encourage more people to use this absolutely wonderful personal management system. It's the best I've ever seen and I must keep it. But it's soooo risky.

    Obviously the screen-shot is posted to show what I see now and want to see when I am done. Was that really not clear? Nothing could be more germane ("Germaine" is a person's name) than a visual that shows the multiple-calendar dynamic and what I actually want to see, nor better communicates the fact that all I want is that and not other programs in a suite – a question, by the way, asked but not answered.
    "I have no idea what the threshold issue is. Perhaps Zeno can assist you."
    I now see Zeno's reply, but I never got an email re: my questions and just noticed it. I will respond to him separately. But obviously the threshold issue is ISSUE ONE (1): exactly what, and only what, must I download and install.
    Was that really not clear? That was not answered in your first reply. And since you here, later, state that Lightning does not need to be downloaded and installed, how on earth could you presume the first threshold issue (question 1) was a joke?
    Once again I wrote and you even quoted me “Please, unless it's ALL been documented in ONE LOCATION, a link to this and a link to that probably won't be very helpful. Anyone who can lay-out a simple, easy to follow plan (1, 2, 3, 4, done) ...”
    As this exchange demonstrates, that statement has certainly proven to be correct.
    You then wrote, “Read the question... the question is. How do I move my profile to a new computer? You are the one making much more of it than that. ... All covered in the link.”
    Which “the” (singular) of the several links (plural) you provided are you talking about? But no matter. No, it was not "all" covered in the multiple links you sent, which do not address the more important threshold question (1). The only issue you addressed was (2) data transfer, which is secondary to the threshold issue, which is question one (1). Threshold issue/question presented: before we can go to question (2) we must address question (1).
    As I stated “Please know that I do know where that 'profile' and its .ics calendar profiles are stored.”
    You replied “No you do not. Local calendars are not stored in ICS files.”
    First, please notice you make no attempt to clarify where they are; but, once again, please read carefully. I did not say that the calendars are “stored in the ICS files” but that the .ics files ARE the calendars, and I do know where they are stored.
    Please see the screen-shot you claim is not germane. Each of those different colored calendars is (“not stored” in but is) an .ics file that is stored in the Profile directory. In the directory c:\Users\myName\AppData\Roaming\Thunderbird is a directory or folder called \Profiles, and in that directory is a file named 9w2ydrc4.default. That file IS the profile. In that profile is a folder called “calendar-data”, and in that folder are the calendar .ics files that are the different colored calendars you see in the very germane screen-shot. So I do know where the calendars are. So I gather that all I copy into the Profiles directory is the file (profile) called 9w2ydrc4.default.
    Again, as I wrote before, I do not want a full suite with email. So what do I download and install to get only what you see in the screen-shot? I don't think we ever installed a full suite before. I never saw an email program. So, again, asked but not answered.
    Then you stated that I do not need to download and install Lightning (please notice, this is a partial answer to my first, threshold, question you presumed was a joke) but only Thunderbird, since “Lightning is already in your profile so will move with it When you actually move the profile.”
    Well, since you like to use links, exactly what is your documented Mozilla authority for that hearsay? Since I posted the question I have spent much time at Mozilla. Literally everything I have seen indicates I must first download and install Thunderbird. But again, I do not want to, and never did, install a full suite of programs, and, as far as I know, there are no Mozilla mail or other suite programs on this (my current old computer) and there never were. So, once again, asked and not answered.
    Everything at Mozilla states I must first download & install Thunderbird and then add on Lightning. Nothing I have seen at Mozilla suggests otherwise, or states that when we want to get Mozilla calendars over to a new computer we must first and *only* download and install Thunderbird and then just copy the profile (9w2ydrc4.default) into the 'Profiles' directory. Nothing I have seen at Mozilla suggests I do not have to download and install Lightning. So please show me where Mozilla documents that. Please notice that even Zeno states, “You download AND INSTALL LIGHTNING as an add-on.”
    Since this contradicts you, do tell me: what is your Mozilla authority for the claim that I do not need to download and install Lightning because it's already in my profile? I see nothing in my profile that suggests Lightning is there.
    Finally, I wrote to you “...it is axiomatic that Windows transfer wizard does NOT transfer programs but only data. You did not know this?” I only wrote that because it appeared your reference to Transfer Wizard was a response to the threshold question of (1) how do I get the program onto the new machine. But your response is very strange, for you wrote “You are the first person I have encountered that thought it did more than transfer data.” Show me where I said that. I said the opposite. How on earth did you come to that interpretation as a response to my simple and very clear statement that “...it is axiomatic that Windows transfer wizard does NOT transfer programs but only data”?

  • Database Copy Doesn't find tables for moving data

    I'm running SQL Dev 3.1.7 on a Win7-64 PC against Oracle 11g. Using the Database Copy function to move and update tables from one instance to another. The application seems to work fine, creating all of the tables, but in the script when it tries to move the data, for some tables it fails as follows:
    Moving Data for object MY_TABLE_NAME
    Unable to perform batch insert.
    MY_TABLE_NAME ORA-00942: table or view does not exist
    Earlier in the script it dropped and then recreated the table in the target instance. The table does exist in the source instance. This occurred for 24 tables out of 105.
    Any ideas what is going on?

    According to the log, the table got created. Here are the log entries as they pertain to our APPOINTMENT table ...
    DROP TABLE "APPOINTMENT" cascade constraints;
    table "APPOINTMENT" dropped.
    DROP SEQUENCE "APPOINTMENT_SEQ";
    sequence "APPOINTMENT_SEQ" dropped.
    -- DDL for Sequence APPOINTMENT_SEQ
    CREATE SEQUENCE "APPOINTMENT_SEQ" MINVALUE 1 MAXVALUE 9999999999999999999999999999 INCREMENT BY 1 START WITH 17721 CACHE 20 NOORDER NOCYCLE ;
    sequence "APPOINTMENT_SEQ" created.
    -- DDL for Table APPOINTMENT
    CREATE TABLE "APPOINTMENT" ("APPOINTMENT_ID" NUMBER(19,0), "PRACTICE_PATIENT_ID" NUMBER(19,0), "PRACTICE_PET_OWNER_ID" NUMBER(19,0), "STAFF_MEMBER_ID" NUMBER(19,0), "APPT_STATUS_ENUM_NAME" VARCHAR2(32 BYTE), "EXT_PRACTICE_APPOINTMENT_ID" VARCHAR2(255 BYTE), "CONFIRMATION_KEY" VARCHAR2(255 BYTE), "REASON" VARCHAR2(255 BYTE), "EXT_REASON" VARCHAR2(255 BYTE), "EXT_SECONDARY_REASON" VARCHAR2(255 BYTE), "NOTE" BLOB, "APPOINTMENT_DATE" TIMESTAMP (6), "CREATED_DATE" TIMESTAMP (6), "CREATED_BY" VARCHAR2(255 BYTE), "LAST_MODIFIED_DATE" TIMESTAMP (6), "LAST_MODIFIED_BY" VARCHAR2(255 BYTE), "CONFIRMATION_TIME" TIMESTAMP (6)) SEGMENT CREATION IMMEDIATE PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "VET2PET_T" LOB ("NOTE") STORE AS BASICFILE ( TABLESPACE "VET2PET_T" ENABLE STORAGE IN ROW CHUNK 8192 RETENTION NOCACHE LOGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)) ;
    table "APPOINTMENT" created.
    It even truncates the table before attempting to load the data ...
    TRUNCATE TABLE "APPOINTMENT";
    table "APPOINTMENT" truncated.
    But when it goes to move the data ...
    --- START --------------------------------------------------------------------
    Moving Data for object APPOINTMENT
    Unable to perform batch insert.
    APPOINTMENT ORA-00942: table or view does not exist
    --- END --------------------------------------------------------------------
    There is a lookup table involved but at the time of moving the data none of the foreign keys have been enabled.
    All the indices for the table get created with no issue and here is the log entry for when the constraints are created and enabled ...
    -- Constraints for Table APPOINTMENT
    ALTER TABLE "APPOINTMENT" ADD CONSTRAINT "PK_APPOINTMENT" PRIMARY KEY ("APPOINTMENT_ID") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "VET2PET_I" ENABLE;
    ALTER TABLE "APPOINTMENT" MODIFY ("CREATED_BY" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("CREATED_DATE" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("CONFIRMATION_KEY" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("EXT_PRACTICE_APPOINTMENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPT_STATUS_ENUM_NAME" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("PRACTICE_PET_OWNER_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("PRACTICE_PATIENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPOINTMENT_ID" NOT NULL ENABLE);
    ALTER TABLE "APPOINTMENT" MODIFY ("APPOINTMENT_DATE" NOT NULL ENABLE);
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    table "APPOINTMENT" altered.
    Any ideas?
    thanks.
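    In case it helps anyone hitting the same thing, a few dictionary checks (not a definitive diagnosis) can narrow down why the insert gets ORA-00942 even though the DDL step succeeded. Run them in the target connection that the data-move step uses; APPOINTMENT here is just the table from the log above:
    -- Which user / current schema is the data-move connection really using?
    SELECT USER, SYS_CONTEXT('USERENV', 'CURRENT_SCHEMA') FROM dual;
    -- Does APPOINTMENT exist, and in which schema?
    SELECT owner, object_type, status FROM all_objects WHERE object_name = 'APPOINTMENT';
    -- Is a synonym resolving the name to a different (possibly dropped) table?
    SELECT owner, table_owner, table_name FROM all_synonyms WHERE synonym_name = 'APPOINTMENT';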

  • Fastest way to extract data out of xml with following constraints.

    10.2 on Linux.
    XML files are being dropped off into a queue. In the queue the documents must be stored as CLOBs so that control can be given back to the client as soon as possible.
    Once in the queue we would like to extract all the data from the XML and place it in relational staging tables. The data is then moved from these tables into production.
    The only thing that can change is what happens between the queue and the staging tables. Currently I am just using extract statements to pull the data out of the CLOB.
    The files are around 20 MB and currently take over 20 minutes to process, which is way too long.
    I looked at DBMS_XMLSTORE, but we cannot alter the XML format.
    I looked at Oracle Text, but if I understand it correctly, we would have to rebuild the entire index after every new queue item.
    I have very little experience with XML, so I want to make sure I know all my options.
    From what I can tell my only option is to take the CLOB and let XML DB parse it into O-R tables... but even that seems like a horrible waste.
    Is there anything else I can do? Any pointers?
    Thanks for any help!
    By the way, this forum has been of great help. My only problem is that I don't seem to ask the right questions at the right time.

    Chris
    Most people seem to find that allowing XML DB to persist the XML using object-based storage and nested tables, and then using insert-as-select operations, is the most effective way to do what you want. There are a number of threads on how best to do this.
    The question to ask is do you really need the relational staging tables. If you read through the forum you'll see that once the XML has been persisted as objects, and the XML objects have been stored using a nested table storage models you can easily create relational views to represent the staging tables.
    This process will work very well if there are no updates to the staging tables. Effectively you will process the XML once, when you insert into the Schema based tables, and then use the relational views as the source for the migration from staging to production.
    If you haven't already done so, reading the following posts will help you with this
    XMLType column based on XML Schema: several questions
    http://forums.oracle.com/forums/thread.jspa?threadID=347820&tstart=0
    problem with sql/xml
    XML Query Performance on Nested Tables
    Basically you'll need an XML Schema that describes your XML, and you'll need to set up nested table storage for each of the collections in your XML Schema in order to get the required performance when using the views.
    The easiest way will be to use the default table that is created when registering the XML Schema with the annotation xdb:storeVarrayAsTable="true", and then ensure that you sequence each collection correctly.
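    To make the relational-view idea concrete, here is a rough sketch only: the schema-based XMLType table PO_XML, the PO_PRODUCTION target and the element names are all invented for illustration, and how well the optimizer rewrites the view down to the underlying object/nested tables depends on your XML Schema annotations and version.
    CREATE OR REPLACE VIEW po_staging_v AS
    SELECT x.po_number, x.po_date, x.customer
      FROM po_xml p,
           XMLTABLE('/PurchaseOrder'
                    PASSING p.OBJECT_VALUE
                    COLUMNS po_number VARCHAR2(20)  PATH 'PONumber',
                            po_date   DATE          PATH 'PODate',
                            customer  VARCHAR2(100) PATH 'Customer') x;
    -- The move to production then becomes a plain insert-as-select from the view:
    INSERT INTO po_production (po_number, po_date, customer)
    SELECT po_number, po_date, customer FROM po_staging_v;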

  • Error while updating data from PSA to ODS

    Hi SAP Gurus,
    I am facing an error while updating data from PSA to ODS in BI 7.0.
    The exact error message is:
    The argument 'TBD' cannot be interpreted as a number
    The error was triggered at the following point in the program:
    GP44QSI5RV9ZA5X0NX0YMTP1FRJ 5212
    Please suggest how to proceed on this issue.
    Points will be awarded.

    Hi ,
    Try to simulate the update; that can give you the exact error location.
    It seems that while updating, a few records may not be in the format of the field into which they are being updated.
    Regards
    Rahul Bindroo

  • How do I get my data out of Keeper HD after updating iOS 7.1

    Recently I updated my iPad to iOS 7.1 and now my free Keeper HD won't open so that I can get my data out. Keeper support say they can't assist.

    Try this  - Reset the iPad by holding down on the Sleep and Home buttons at the same time for about 10-15 seconds until the Apple Logo appears - ignore the red slider - let go of the buttons. (This is equivalent to rebooting your computer.) No data/files will be erased. http://support.apple.com/kb/ht1430
    Troubleshooting apps purchased from the App Store
    http://support.apple.com/kb/TS1702
    Delete the app and redownload.
    Downloading Past Purchases from the iTunes Store, App Store and iBooks Store
    http://support.apple.com/kb/ht2519
     Cheers, Tom 

  • How to send data from 4 different data sources to one ODS

    Hello Gurus,
    There is a transaction called KSB1 in R/3.
    It has data related to cost center, cost element, and G/L.
    In BI, I need to transfer the data from these DataSources to one ODS.
    Can anybody give me some ideas?
    Points will be awarded for any kind of response.
    Thanks .
    Anu

    Hi Rupa..
    In your requirement some of the data sources (cost centre, cost element) are master data sources.
    They are available as InfoObjects in BI (Content).
    So there is no need to create an ODS on these.
    But the general scenario for creating an ODS from multiple data sources is in the case of transaction documents,
    like:
    PO header, PO line items...
    For this, the prerequisite is to have common fields between these data sources (e.g. PONo).
    Hope this gives you an idea.
    Cheers...
    Varma

  • Loading Data from PSA to ODS

    Hi,
    Right now there are nearly 50 requests in my PSA. I want to load the entire data from PSA to ODS at one time. Ours is a 3.x flow. I am able to load one request at a time. Can anyone help me load all requests at once?
    Please give me the steps.

    Hi Narendra,
    First of all, not all the records can be picked up in a single attempt, for the following reason:
    It is not possible to load all the PSA requests to the target at once, because requests are picked up from PSA to the data target one by one, due to dependencies on the routines that run during the data transfer. So you will have to load the requests one by one, which of course is a completely manual process.
    Hope you have understood the scenario.
    Regards,
    Vamsi Krishna Chandolu

  • Error in Loading FIGL data into a standard ODS

    Hi,
    I have encountered an error when I attempted to load FIGL data into a standard ODS. Below is the error I received:
    Error message during processing in BI
    Diagnosis
    An error occurred in BI while processing the data. The error is documented in an error message.
    System Response
    A caller 01, 02 or equal to or greater than 20 contains an error message.
    Further analysis:
    The error message(s) was (were) sent by:
    Second step in the update
    Second step in the update
    Second step in the update
    Second step in the update
    Procedure
    Check the error message (pushbutton below the text).
    Select the message in the message dialog box, and look at the long text for further information.
    Follow the instructions in the message.
    Initially I discovered a workaround wherein I loaded the data into a write-optimized ODS. It was successful. However, when I attempted to data-mart the data and put it into a standard ODS, the same error occurred.
    Please help me with this problem.
    Ramon

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a data mart load); give the request name and it should give you the details about the request. If it's active, make sure that the job log is getting updated at frequent intervals.
    2) SM66: get the job details (server name, PID etc. from SM37) and see in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it's a data mart load). See if it's accessing/updating some tables or not doing anything at all.
    3) RSMO: see what is available in the Details tab. It may be in the update rules.
    4) ST22: check if any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a data mart load).
    Once you identify the error you can rectify it.
    If all the records are in the PSA you can pull them from the PSA to the target. Otherwise you may have to pull the data again from the source InfoProvider.
    If it's running and you are able to see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting tables etc.
    If you feel it's active and running, you can verify by checking whether the number of records has increased in the new data table.
    Thanks,
    JituK

  • Error when i am loading data from PSA to ODS

    Hi Gurus,
    Very good morning.
    I am loading data from PSA to ODS.
    But when I load the data I get the error: InfoObject /BIC/0REFER_DOC does not contain ALPHA-conforming value 0.
    Can anybody help me solve the problem and reload the data?
    Thanks & Regards
    Jayavardhan

    Hi,
    There are lakhs of postings in this forum on this topic. Searching the forum is the best option for you now.
    With rgds,
    Anil Kumar Sharma .P

  • Error when sending data from Cube to ODS

    HI,
    I am extracting data from 2lis_02_itm and I stored the data in 0pur_o01 (ODS). The data is fine in the ODS, but when I try to send data from the ODS to cube 0pur_c07 it gives the following error:
    InfoSource 80PUR_O01 is not defined in the source system
    Errors in source system     
    InfoSource 80PUR_O01 is not defined in the source system.
    Message no. R3005
    Diagnosis
    The InfoSource 80PUR_O01 specified in the data request, is not defined in the source system.
    System response
    The data transfer is terminated.
    Procedure
    In the Administrator Workbench of the Business Information Warehouse, update the metadata for this source system, and delete the InfoPackages belonging to InfoSources that no longer exist.
    Thanks
    Priya

    Hi A Priya
    First check whether InfoSource 80PUR_O01 is present in the DataSource list of your BW system. Check this in RSA1 -> InfoSources -> Settings -> Display Generated Objects, and then search for 80PUR_O01.
    If it is not present there, go to ODS 0PUR_O01 -> right-click
    -> Generate Export DataSource.
    If it is present, then replicate the DataSource by right-clicking on the transfer rules, then go to SE38 -> enter program name RS_TRANSTRU_ACTIVATE_ALL -> enter InfoSource name 80PUR_O01 and the BW system as the source system -> Execute. This will activate the transfer rules.
    Then try reloading from ODS to cube
    Regards
    Pradip
