Extracting Data Out of BW

Hi,
I'm researching what other companies are doing with regard to extracting data out of their BW data warehouses.
Do you allow data to be extracted out of BW?
If so, do you push the data out to the external systems, or do they pull the data out themselves?
There's no right answer here.  I'm just trying to get a feel for best practices as to whether external clients should pull data out or have it pushed to them.
By PUSH, I mean that BW initiates the data extraction.  And by PULL, I mean that external systems initiate the data extraction.
THANKS!

Hi,
Does anybody know whether it is possible to create an InfoSpoke on a MultiCube/MultiProvider?
When creating an InfoSpoke you can only choose between the data source types BasisCube, InfoObject (x2) and ODS. If you choose the BasisCube option and then open the F4 dropdown, the list does not contain any MultiProviders.
However, if you type the name of the MultiProvider straight into the field, the system will accept it and the spoke will activate.
I have tried to execute the spoke and it does generate a few data packets before erroring out. I am unclear whether the error occurs because the spoke is based on a MultiProvider or whether it is a different issue. The error messages are provided below:
System error: RSDRC / FORM AUTHORITY_CHECK RSDRC / FORM AUTHORITY_CHECK R
System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET RSDRC / FUNC RSDRC_B
System error: RSDRC / FORM DATA_GET RSDRC / FORM DATA_GET RSDRC / FORM DA
Thanks
Tony

Similar Messages

  • Fastest way to extract data out of XML with the following constraints.

    Oracle 10.2 on Linux.
    XML files are being dropped off into a queue. In the queue the documents must be stored as CLOBs so that control can be given back to the client as soon as possible.
    Once in the queue, we would like to extract all the data from the XML and place it in relational staging tables. The data is then moved from these tables into production.
    The only thing that can change is what happens between the queue and the staging tables. Currently I am just using EXTRACT statements to pull the data out of the CLOB.
    The files are around 20 MB and currently take over 20 minutes to process, which is way too long.
    I looked at DBMS_XMLSTORE, but we cannot alter the XML format.
    I looked at Oracle Text, but if I understand it correctly, we would have to rebuild the entire index after every new queue item.
    I have very little experience with XML, so I want to make sure I know all my options.
    From what I can tell, my only option is to take the CLOB and let XML DB parse it into object-relational tables... but even that seems like a horrible waste.
    Is there anything else I can do? Any pointers?
    Thanks for any help!
    By the way, this forum has been of great help. My only problem is that I don't seem to ask the right questions at the right time.

    Chris
    Most people seem to find that allowing XML DB to persist the XML using object-based storage and nested tables, and then using INSERT AS SELECT operations, is the most effective way to do what you want. There are a number of threads on how best to do this.
    The question to ask is whether you really need the relational staging tables. If you read through the forum you'll see that once the XML has been persisted as objects, and the XML objects have been stored using a nested table storage model, you can easily create relational views to represent the staging tables.
    This process will work very well if there are no updates to the staging tables. Effectively you will process the XML once, when you insert it into the schema-based tables, and then use the relational views as the source for the migration from staging to production.
    If you haven't already done so, reading the following posts will help you with this
    XMLType column based on XML Schema: several questions
    http://forums.oracle.com/forums/thread.jspa?threadID=347820&tstart=0
    problem with sql/xml
    XML Query Performance on Nested Tables
    Basically you'll need an XML Schema that describes your XML, and you'll need to set up nested table storage for each of the collections in your XML Schema in order to get the required performance when using the views.
    The easiest way will be to use the default table that is created when registering the XML Schema, together with the annotation xdb:storeVarrayAsTable="true", and then ensure that you sequence each collection correctly.

  • SAP ECC 6.0 - Using BAPIs from C# to extract data

    I have had a client recommend a strategy of using C# (Visual Studio 2005) to extract data from SAP ECC using BAPIs.
    We do have an existing methodology in place using flat files to extract data to non-SAP systems.
    We have purchased XI, which we intend to implement next year.
    I basically wanted to keep things tidy and continue with the flat file interface strategy, just to keep things simple going forward to XI.
    Then there is also the security piece: the developer has had to slowly build a profile/roles that allow him to come in through Visual Studio to access these BAPIs.
    I am looking for anyone who has had experience with this and what your experiences are. As well, what are the SAP best practices concerning this strategy?
    Thank YOU ALL who will reply

    I think that SAP's strategy toward exposing BAPIs is leaning toward leveraging their AS-JAVA stack.  With the AS-JAVA Enterprise Services and standard delivered web content, SAP is already exposing much existing BAPI/business functionality outside of the ABAP world.  However, that's not to say that traditional flat files have no place in the future.  As a matter of fact, I don't see flat/delimited files going away any time soon.
    It all really comes down to what you are doing; you may even end up with a landscape that's a mix of files and RFCs.  Here in-house, we leverage file-based transfers for large data sets such as SAP BI OHS extracts and daily R/3 FI extracts.  Why files?  Because they're simple, easy to control, and you can see the data being transferred.  A file can be "held in my hands", if you will; at best there are just a few characters in flight.
    As for RFCs, any type of RFC in fact (anything from the SAP .NET Connector to AS-JAVA ESOA-based Enterprise Services), they are basically designed for transactional data.  Here, you are looking at things that are high in transaction count but small in individual size.  As for your client's call to use C#, it all really depends on how you are doing it.  MOST IMPORTANTLY: make sure you are using a proven, standardized, and SAP-supported way, PERIOD.  NEVER EVER go into production with a "hack".
    For the high-transaction-count, small-individual-size transactions, we leverage BizTalk Server (BTS) here.  I'll be frank: we are not using BTS because we wrote it.  We are using BTS because it just works better for us, in our environment.  BTS now supports the SAP 2.0 adapter (registered program ID) and SAP 3.0 (WCF-based, direct SAP RFC call, supported by Microsoft AND SAP).  We are leveraging these adapters and BTS (as distribution and transformation) in our environment more and more.
    So, long story short: determine the transaction type, accept that multiple solutions are OK, and make sure the solution is supported!

  • What's the easiest way to extract data out of a multi-page document in Adobe Acrobat 8 Pro?

    We have lenders that are now sending us lists with multiple social security numbers in a single PDF with multiple pages; before, they used to break them up by the departments they go to. So I installed Acrobat 8 Pro on the users' machines. Now we need to take certain lines of data within that document and make a single document for different departments out of that one document. I tried the crop tool and made a document from the cropped image, but it pastes it in the corner extremely small for some reason, and I don't see how you can extract that data and bring it into Word or make it a separate PDF document, because it doesn't keep any of the formatting.
    Please help

    You might just try to do a save as DOC. The formatting may be lost, depending on how the PDF was made.

  • Extract data from ECC to Oracle using Data Services 4.0

    How do I extract data from ECC 6.0 business content extractors to Oracle using SAP BO Data Services 4.0?

    Are you trying to use the SAP BW Business Content to extract data out of ECC and load into Oracle tables with Data Services? If that's the case, then you cannot do that. The SAP BW Business Content was developed to only be used in conjunction with SAP BW. When using Data Services to access the extractors in ECC, it has to have an SAP BW InfoPackage associated with it to execute. In this architecture, Data Services is only a pass through from ECC to BW and allows the ability to do some transformations of data prior to loading into the EDW layer (staging tables basically) on SAP BW.
    To connect ECC to Oracle, you're going to have to have all of the SAP BusinessObjects supplied Function Modules loaded onto ECC, along with a non-dialog logon account that has the ability to pass dynamic ABAP programs, generate the programs and schedule them. Depending on how you want to process the output, you may also have to have the ability to write to files on the ECC application servers and have an FTP account created on the application servers that can GET flat files and potentially DELETE them (you're going to need to delete periodically, otherwise your jobs will crash when the file space allocation has been consumed).

  • Extract data from CNiReal64Vector

    I'm using MFC Visual C++ to read 3 thermocouple channels.  My partial code to read the data looks like the following:
    CNiReal64Vector Data;
    CNI_MFC_DAQmx_App_UserCode userCode;
    userCode.GetData(Data);
    How do I extract the data from the Data variable and store it in a file?  How is the data for all three channels stored in the Data variable?
    Thanks,
    Tuan Bui

    /* Create a file to save your datalog */
    try {
        m_fileWrite.exceptions( ofstream::failbit | ofstream::badbit );
        m_fileWrite.open( "YourDatalog.txt" );
    } catch ( ofstream::failure e ) {
        AfxMessageBox( "Error opening text file." );
        if ( m_fileWrite.is_open() ) { m_fileWrite.close(); m_fileWrite.clear(); }
        return;
    }
    /* Your CNiReal64Matrix data is processed into a CNiReal64Vector, that is, if you
       used these data types. The matrix type caters to multiple signals - thermo1,
       thermo2 and thermo3 in your case. The output is tabulated in column form;
       open it in Excel for readability. */
    int channelCount = m_data.GetRows();
    int dataCount    = m_data.GetCols();
    // Write header: one column per physical channel
    for ( unsigned int j = 0; j < m_task->AIChannels.Count; j++ ) {
        m_fileWrite << m_task->AIChannels[j].PhysicalName << "\t";
    }
    m_fileWrite << endl;
    m_outStream.str( "" );
    CNiReal64Vector v;
    // Write data: one row per sample, one column per channel
    for ( int i = 0; i < dataCount; i++ ) {
        m_data.CopyColumn( i, v );
        for ( int j = 0; j < channelCount; j++ ) {
            m_outStream << dec << v[j] << "\t";
        }
        m_outStream << endl;
    }
    m_lTotalSamples += dataCount;
    if ( m_fileWrite.is_open() ) {
        m_fileWrite << m_outStream.str();
        m_fileWrite.close();
    }
    This is a snippet from an analog voltage program with file I/O, with a few modifications.

  • Time Out Dump while extracting data from table CKIS

    Dear Friends,
    I am getting a TIME_OUT dump for the code below while extracting data from table CKIS.
    Table CKIS doesn't have any indexes. Please guide me to resolve this.
    Regards,
    Viji.
    form get_keko_ckis.
      SELECT kalnr kalka kadky tvers bwvar matnr werks kokrs
             FROM keko
             INTO TABLE i_keko1
             FOR ALL ENTRIES IN i_final_modify
                 WHERE matnr = i_final_modify-main_f
                   AND werks = p_werks
                   AND kokrs = p_kokrs
                   AND kadat = p_kadat
                   AND bidat = p_bidat
                   AND bwdat = p_bwdat.
      IF sy-subrc = 0.
        SORT i_keko1 BY kalnr kalka kadky tvers bwvar.
        SELECT kalnr kalka kadky tvers bwvar posnr typps kstar
               matnr menge gpreis
               FROM ckis
               INTO TABLE i_ckis_temp
               FOR ALL ENTRIES IN i_keko1
               WHERE kalnr = i_keko1-kalnr
                 AND kalka = i_keko1-kalka
                 AND kadky = i_keko1-kadky
                 AND tvers = i_keko1-tvers
                 AND bwvar = i_keko1-bwvar.
            IF sy-subrc = 0.
              SORT i_ckis_temp BY kalnr kalka kadky tvers bwvar.
              LOOP AT i_ckis_temp INTO wa_ckis_temp.
                wa_ckis-kalnr  = wa_ckis_temp-kalnr.
                wa_ckis-kadky  = wa_ckis_temp-kadky.
                wa_ckis-posnr  = wa_ckis_temp-posnr.
                wa_ckis-typps  = wa_ckis_temp-typps.
                wa_ckis-kstar  = wa_ckis_temp-kstar.
                wa_ckis-matnr1 = wa_ckis_temp-matnr1.
                wa_ckis-menge  = wa_ckis_temp-menge.
                wa_ckis-gpreis = wa_ckis_temp-gpreis.
              CLEAR wa_keko1.
              READ TABLE i_keko1 INTO wa_keko1
                                 WITH KEY kalnr = wa_ckis_temp-kalnr
                                          kalka = wa_ckis_temp-kalka
                                          kadky = wa_ckis_temp-kadky
                                          tvers = wa_ckis_temp-tvers
                                          bwvar = wa_ckis_temp-bwvar
                                          BINARY SEARCH.
                 IF sy-subrc = 0.
                    wa_ckis-matnr = wa_keko1-matnr.
                    wa_ckis-werks = wa_keko1-werks.
                 ENDIF.
                 APPEND wa_ckis TO i_ckis.
                 CLEAR: wa_ckis_temp, wa_ckis.
              ENDLOOP.
            ENDIF.
        REFRESH: i_keko1, i_ckis_temp.
      ENDIF.
    endform.                    " get_keko_ckis

    Hi, try minimising the conditions in the WHERE clause:
         SELECT fields..... FROM CKIS
           WHERE KALNR = KEKO-KALNR
             AND KADKY = KEKO-KADKY
             AND TVERS = KEKO-TVERS
             AND TYPPS = 'M'.
    After this, delete the unwanted records from the internal table as per the remaining conditions (see the sketch after this reply).
    Regds,
    Anil
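    A minimal ABAP sketch of this suggestion, assuming the declarations from the form above and that i_keko1 is still sorted by kalnr kalka kadky tvers bwvar: select from CKIS with fewer key fields plus TYPPS = 'M', then drop the rows that have no matching KEKO entry in memory.
      IF i_keko1 IS NOT INITIAL.     "never run FOR ALL ENTRIES on an empty table
        SELECT kalnr kalka kadky tvers bwvar posnr typps kstar
               matnr menge gpreis
               FROM ckis
               INTO TABLE i_ckis_temp
               FOR ALL ENTRIES IN i_keko1
               WHERE kalnr = i_keko1-kalnr
                 AND kadky = i_keko1-kadky
                 AND tvers = i_keko1-tvers
                 AND typps = 'M'.
      ENDIF.
      SORT i_ckis_temp BY kalnr kalka kadky tvers bwvar.
      "Apply the conditions left out of the WHERE clause in memory:
      "delete CKIS rows without a matching KEKO entry.
      LOOP AT i_ckis_temp INTO wa_ckis_temp.
        READ TABLE i_keko1 INTO wa_keko1
                           WITH KEY kalnr = wa_ckis_temp-kalnr
                                    kalka = wa_ckis_temp-kalka
                                    kadky = wa_ckis_temp-kadky
                                    tvers = wa_ckis_temp-tvers
                                    bwvar = wa_ckis_temp-bwvar
                                    BINARY SEARCH.
        IF sy-subrc <> 0.
          DELETE i_ckis_temp.
          CONTINUE.
        ENDIF.
        "... build wa_ckis and APPEND it to i_ckis as in the original form ...
      ENDLOOP.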

  • Unable to extract data from an AS/400 system.

    Hello experts.
    We are trying to extract data from an AS/400 system, but have not had any success so far.
    I'll write down the steps that we have followed so far:
    1.- Create a DB Connect connection between both systems.
    2.- Create a source system for the AS/400 in the workbench under the DB Connect directory.
    3.- Generate DataSources from the tables specified in the schema of the connection.
    --- break point ---
    At this point, we had a problem with some tables that have at least one field name containing the character "Ñ".
    After asking SAP for possible solutions, they told us this is not supported, as the system can't have any object with the character "Ñ", so the transfer structure could not be activated with these fields in the DataSource.
    --- end of break point ---
    4.- After those issues, we decided to implement, in another schema, views on those tables which had field names with the character "Ñ", changing it to an "N".
    5.- We created another source system with that schema, and a user that can see that schema.
    6.- To be able to see those views in transaction RSDBC, we had to deactivate the two checkboxes in the first window (Choose tables and Choose views).
    7.- Right after that, we could correctly generate the DataSources from these logical tables.
    8.- We have designed the whole data flow for these DataSources and everything went right.
    9.- But when we tried to execute the InfoPackage to extract data from those logical tables, we did not get any records. Actually, the load remains yellow after the job has finished.
    Please, I would appreciate any help you could give us on this problem.
    Thank you very much
    Regards
    Joaquin

    I'd like to add something to this thread, and maybe clarify the question a little bit.
    The only way the BW system recognizes those logical tables through transaction RSDBC is by deactivating the two checkboxes in this transaction, "Select Tables" and "Select Views".
    I don't know how these logical tables have been created, but does this mean that they are neither tables nor views as BW understands them?
    Please, if someone knows anything about this, reply to this thread.
    Thank you very much.
    Joaquin Sobrido

  • Extracting Data from APO PP/DS to BW

    Hi Gurus,
    I'm trying to extract data from APO PP/DS (SCM 5.1) to BW (BI 7.0). I'm new to SCM and am not sure how the extraction from SCM to BI happens other than that we need to read data from LiveCache.
    The extractors we are interested in are:
    0APO_PPDS_RESCAPREQ_01
    0APO_PPDS_PROD_CUST_01
    0APO_PPDS_OPERATION_01
    0APO_PPDS_ORDER_01
    Pls kindly help me out with the procedure.
    Thanks in advance.

    Hi
    Have a look at the URLs below:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5f229690-0201-0010-84ba-9ee5a8958a05
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/4fe5d590-0201-0010-6c8d-ada86492cf11
    Re: APO to BW Design Question
    Re: APO BW Integration
    Hope it helps
    Thanks
    Teja

  • When extracting data from a DSO to a Cube by using a DTP

    When I am extracting data from a DSO to a Cube by using a DTP,
    I am getting the following errors:
    Data package processing terminated (Message no. RSBK229).
    Error in BW: error getting datapakid cob_pro (Message no. RS_EXCEPTION105).
    Error while extracting from source 0FC_DS08 (type DataStore) - (Message no. RSBK242).
    Data package processing terminated. (Message no. RSBK229).
    Data package 1 / 10/04/2011 15:49:56 / Status 'Processed with Errors'. (Message no. RSBK257).
    This is a brand new BI 7.3 system, implementing PSCD and TRM.
    I have used the standard business content objects in FI-CA (dunning history header, items, activities) and the standard DataSources (0FC_DUN_HEADER, 0FC_DUN_ITEMS, 0FC_DUN_ACTIVITIES). I have extracted data up to the DSO level. When I try to pull the data to the InfoProvider level (cube) using a DTP, I get the errors above.
    My observation: whenever I use a DSO as the source for any target, such as another DSO or a cube, it throws the same kind of error for any flow, including a simple flat file one.
    Please suggest whether I need to maintain any basic settings, since this is a brand new BI 7.3 system.
    Please help me out on this issue. I am not able to move forward; it's very urgent.

    Hello,
    Have you solved the problem?
    I have the same error...
    If you have solved this error, can you help me please? I have the same error.
    Yimi Castro Garcia
    [email protected]

  • Extracting data from Excel to Illustrator with JavaScript or VBScript

    Hi all-
    I was wondering if there is a way to extract data from Excel to be used in Illustrator. I know there is an option with variables and XML, and I don't want that. I've seen and tried out how to read Illustrator and write to Excel, and I get that.  What I would like to do is pretty much the opposite:
    1. Pre-fill an Excel file (.xls, .csv, doesn't matter) with data such as a filename in column 1 and the replacement text in column 2, and close it manually.
    2. Run a script (VBScript, JavaScript, doesn't matter).
    3. For each row in the Excel file where the cell in the first column is not empty, open the Illustrator template containing a placeholder text frame named "DWG" and replace the contents of that frame with the replacement text from column 2.
    4. Save each to a PDF file and name the file with the text from column 1 (Filename).
    In a nutshell, there will be a single Illustrator template with a premade text frame named "DWG". Excel will contain two columns: one for the filename to use and one for the text to replace the placeholder with in AI. I hope I explained this well enough without causing too much confusion. Thanks in advance.
    Filename     Replacement Text
    test1.pdf    DWG01
    test2.pdf    DWG02
    test3.pdf    DWG03
    test4.pdf    DWG04

    As text… \n is the newline character and \r is the return character. I can't remember which Excel uses, but they both equate to a line/paragraph… I very quickly threw together an example for you…
    #target Illustrator
    textToPDF();
    function textToPDF() {
        if ( app.documents.length == 0 ) { return; }
        var doc, csvFile, i, fileArray, opts;
        csvFile = File( '~/Desktop/ScriptTest/Test.csv' );
        if ( !csvFile.exists ) { return; }
        fileArray = readInCSV( csvFile );
        doc = app.activeDocument;
        opts = new PDFSaveOptions();
        opts.pDFPreset = '[Press Quality]';
        // Here we loop the main array
        for ( i = 0; i < fileArray.length; i++ ) {
            // Here we get the second item of sub array i (the replacement text)
            doc.textFrames.getByName( 'DWG' ).contents = fileArray[i][1];
            // Here we get the first item of sub array i (the output file name)
            doc.saveAs( File( fileArray[i][0] ), opts );
        }
    }
    function readInCSV( fileObj ) {
        var fileArray, thisLine, csvArray;
        fileArray = [];
        fileObj.open( 'r' );
        while ( !fileObj.eof ) {
            thisLine = fileObj.readln();
            csvArray = thisLine.split( ',' );
            fileArray.push( csvArray );
        }
        fileObj.close();
        return fileArray;
    }
    I haven't tested it but it should be close…?

  • How to extract data from Oracle GL in "full" and not "delta" mode with FDM ERPI?

    Hi
    I am using FDM ERPI 11.1.2.1 to extract data from Oracle EBS 11.
    I can see that FDM ERPI extracts data in "delta mode" using the GL_TRACK_BALANCES_DELTA table (FDM adds a row to this table after the first load of the period).
    Is it possible to force "full mode" in order to reload all the data from Oracle GL for each FDM data load?
    I don't know why, but each month I have rows which are not imported into FDM.
    thanks in advance for your help
    Fanny

    Hi
    I have seen that I have to install the FDM 11.1.2.1.502 patch in order to see this option in my ERPI adapter options.
    I have applied this patch.
    So I have the ERPI-FIN-C1 adapter.
    I can see a new option (Time Out Value) but no "Load Data" or "Execution mode" option :-(
    Any idea where the problem comes from?
    Fanny

  • How to extract data from a remote system

    Hi,
    I want to extract data from another system and map it to a target table on my local machine. What is the procedure to do that?
    I tried to create another module specifying the location of that remote system. I could extract the table, but when I map it to the target table I get this error:
    ORA-04052: error occurred when looking up remote object [email protected]@ORACLE_LOCATION_UBN_15
    ORA-00604: error occurred at recursive SQL level 1
    ORA-01017: invalid username/password; logon denied
    ORA-02063: preceding line from UBNDW@ORACLE_LOCATION_UBN_15
    Here ORACLE_LOCATION_UBN_15 is the location and UBNDW is the SID for that system.
    SCOTT is the username I provided while creating the module.
    Please help me out with this.
    Thanks

    Hi Roberto,
    Thank you for the fast answer. I have read this before, but I don't understand the needed steps on the Oracle system.
    Hope anyone can help me...
    Thanks
    Muammer
    Message was edited by:
            Muammer Kizilaslan

  • How to extract data from BCS consolidation cube to Group BW extraction cube

    Gurus,
    I have to figure out a way to extract data from a local BCS consolidation totals cube to a group BW extraction cube via a virtual cube in between the two. How can I do the extraction, and what are the different ways in which it can be done?
    Detailed steps would be appreciated; also, how should I proceed with this?
    Thanks
    Cheers:
    Sam

    Hi Sam,
    Instead of extracting, you can also consider reporting on the data with a MultiProvider.
    There is a doc here that might be useful:
    http://www.affine.co.uk/files/How%20to%20Create%20a%20MultiProvider%20over%20BW%20and%20BCS.pdf
    Kevin

  • How to extract data from planning book

    New to APO. I need help regarding how to extract data from a planning book, given the planning book name, view name and some key figures.
    Total Demand (Key Figure DMDTO):
    o     Forecast
    o     Sales Order
    o     Distribution Demand (Planned)
    o     Distribution Demand (Confirmed)
    o     Distribution Demand (TLB-Confirmed)
    o     Dependent Demand
    Total Receipts (Key Figure RECTO):
    o     Distribution Receipt (Planned)
    o     Distribution Receipt (Confirmed)
    o     Distribution Receipt (TLB-Confirmed)
    o     In-Transit
    o     Production (Planned)
    o     Production (Confirmed)
    o     Manufacture of Co-Products
    Stock on Hand (Key Figure STOCK):
    o     Stock on Hand (Excluding Blocked stock)
    o     Stock on Hand (Including Blocked stock)
    In-Transit Inventory (Cross-company stock transfers)
    Work-in-progress (Planned orders / Process orders with start date in one period and finish date in another period)
    Production (Planned), Production (Confirmed) and Distribution Receipt elements need to be converted based on Goods Receipt date for projected inventory calculation.

    Hello Debadrita,
    Function Module BAPI_PBSRVAPS_GETDETAIL2 or BAPI_PBSRVAPS_GETDETAIL can help you.
    For BAPI BAPI_PBSRVAPS_GETDETAIL, the parameters are:
    1) PLANNINGBOOK - the name of your planning book
    2) DATA_VIEW - name of your data view
    3) KEY_FIGURE_SELECTION - list of key figures you want to read
    4)  SELECTION - selection parameters which describe the attributes of the data you want to read (e.g. the category or brand). This is basically a list of characteristics and characteristic values.
    BAPI_PBSRVAPS_GETDETAIL2 is very similar to BAPI_PBSRVAPS_GETDETAIL but is only available from SCM 4.1 onwards.
    For the complete list of parameters, you can go to transaction SE37, enter the function module names above and check out the documentation.
    Please post again if you have questions.
    Hope this helps.
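    For orientation, a rough ABAP call skeleton using only the parameter names listed in this reply; 'MY_BOOK' and 'MY_VIEW' are made-up placeholders, the skeleton assumes the two selections are table parameters, and the declarations and result tables are omitted because their exact structures should be taken from the function module interface in SE37, as suggested above.
      "Sketch only - not production code. Declare lt_key_figures and
      "lt_selection with the structures shown for KEY_FIGURE_SELECTION
      "and SELECTION in SE37 before using this.
      CALL FUNCTION 'BAPI_PBSRVAPS_GETDETAIL'
        EXPORTING
          planningbook         = 'MY_BOOK'       " 1) name of your planning book
          data_view            = 'MY_VIEW'       " 2) name of your data view
        TABLES
          key_figure_selection = lt_key_figures  " 3) key figures you want to read
          selection            = lt_selection.   " 4) characteristics and values
      "The key figure values come back through the BAPI's result parameters;
      "check their names and structures in SE37 as well.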
