SPM Extractor Kit 2.1

Hi,
I would like to check in the post below whether anyone has had a similar experience.
Re: Help in Spend Analytics 2.1 Data model
Thanks,
Vikas

Hi Vikas,
The fields in the data model and the fields provided by the extractors certainly do not match one to one.  Below are a few reasons:
1. Some fields are calculated while data is being loaded into the SPM data model.
  Example: transaction currency measures are converted into global currency measures; the same applies to quantity measures.
2. Some fields may not be available in the source and may come from a different source.  If your source system does maintain them, you can specify the source and populate them by enhancing the extractor.
  Example: Cleansed Category (or simply Category).  The source system category is typically different from the cleansed category; the same applies to Source System Supplier versus Supplier and a few other master data objects.
3. A few fields are determined when loading data into SPM on the BI side.
  Example: Upload ID, a unique ID assigned to every data load, and Source System (not to be confused with the physical source of your data), a field that logically identifies and separates source-system-dependent master and transaction data.
4. A few other fields are concatenations of multiple fields, also generated when loading data into the SPM data model (see the sketch below).
  Example: Document Number + Item Number + Source System concatenated to create a unique ID.
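As a rough illustration of that last point, here is a minimal sketch of such a concatenated key. The function name, separator, and sample values are assumptions for illustration, not the actual SPM transformation logic.

    def build_invoice_key(doc_number: str, item_number: str, source_system: str) -> str:
        """Concatenate document number, item number, and source system into one unique record ID."""
        return f"{doc_number}/{item_number}/{source_system}"

    # The same document/item numbers coming from two source systems stay distinct:
    print(build_invoice_key("5100000123", "00010", "ERP_EU"))  # 5100000123/00010/ERP_EU
    print(build_invoice_key("5100000123", "00010", "ERP_US"))  # 5100000123/00010/ERP_US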
I think there might be a few more that I forgot, but you get the general idea.  Hopefully this answers your questions.
Regards,
Rohit

Similar Messages

  • SPM Extractors Starter Kit

    Hi,
    I am working with SPM, extracting data from one of our R/3 systems (R/3 4.7). I installed the extractors in our system, copied the existing project delivered by SAP, and then generated the extractors for the objects we need. My question is about delta capability. I read that delta is available for some objects, based on their creation, posting, or last change date. Nevertheless, I cannot make any of them work in delta mode (even for those where a date is available for delta extraction, for example Cost Centers, Invoices, and POs). I go to transaction Z_SA_EXTR, select the project and object, and then click Start. I immediately get the message 'Object file created successfully' (which is true; the file is there). But I am never asked whether I want to execute it in full, init, or delta mode.
    The SPM application will be installed on a standalone server (not on any of our existing BW systems), so I will be generating flat files to be FTPed to the final destination. Is it that, if I do not have a DataSource in BW, the delta mechanism is not available?
    Please clarify how this works. Many thanks!
    Claudia.

    Hi Claudia,
    SPM extractors do run in delta mode.  The SPM application maintains the date and time of the previous successful load. When you extract again, the range from the previously recorded date and time to the current date and time is used to bring in the delta records.  This date range is automatically passed to the InfoPackage as a date filter during extraction to bring in the delta extract.  This is also the reason only one InfoPackage has to be created against the SPM extractors in BW.
    When executing these extractors in the source system using RSA3 (extractor test), you can pass the date range to retrieve delta records for testing.
    The very first time you run the extractors they run in INIT mode, meaning they bring in all records from the beginning of time until today; the next load runs in DELTA mode, from the previous successful load until today.
    If you want to split the INIT load into multiple pieces, use the following steps (see the sketch after this list):
    - Specify a date range in the InfoPackage on the BW side to bring in one year's worth of data (e.g. 01/01/2009 to 12/31/2009).
    - Make sure you delete the corresponding previous successful load timestamp from the table OPM_SOURCE_ST, which maintains the extractor status.  This is the table used to determine the previous successful load.
    - Repeat the above two steps for as many INIT loads as you want to perform.
    - For the final load, specify only the from-date and leave the to-date blank; this will bring in data up to the current date.  After a successful load, the system records the current date in the table, and that date is used for the next delta extract.
    Keep in mind that a safety factor of 5 days is used during extraction to make sure all records are extracted.
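    As a rough, assumption-laden sketch of the date window logic described above (the real implementation lives in the SPM extractors and the InfoPackage, and the exact point where the 5-day safety margin is applied is assumed here):

        from datetime import date, timedelta
        from typing import Optional, Tuple

        SAFETY_DAYS = 5  # safety factor mentioned above (assumed to widen the lower bound)

        def delta_date_range(last_load: Optional[date], today: Optional[date] = None) -> Tuple[date, date]:
            """Return the (from_date, to_date) window for the next extraction run:
            INIT (no recorded load) pulls everything up to today; DELTA starts at the
            previous successful load minus the safety margin."""
            today = today or date.today()
            if last_load is None:
                return date(1900, 1, 1), today
            return last_load - timedelta(days=SAFETY_DAYS), today

        # First an INIT window, then a delta window with the safety margin applied:
        print(delta_date_range(None, date(2010, 8, 15)))
        print(delta_date_range(date(2010, 8, 10), date(2010, 8, 15)))

    Splitting the INIT load is then just a matter of running the extraction once per yearly (or monthly) slice of such a window.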
    Hope this helps.
    Regarding your previous question on package size, please read the article posted on this topic - /people/divyesh.jain/blog/2010/07/20/package-size-in-spend-performance-management-extraction
    Regards,
    Rohit

  • SPM Extractors from SAP R/3

    Hi, I am quite new to working with SPM and am currently investigating options for extracting data from R/3 into SPM.
    I could see there is an extraction wiki, where I found very useful information, and the main options are:
    1. Use the SAP-provided ABAP extractors (probably as a starter kit)
    2. Write our own ABAP programs
    3. Use another ETL tool, for example BOBJ IM (Business Objects Integration Manager)
    My main concern is performance, as I expect to deal with a high volume of data. I'd like to hear some past experience of how the SAP-provided extractors work. Have any of you implemented these extractors or had the chance to see them running? Does the delta capability work fine? Is the delta just based on a document creation/change date, or are other changes (in master data, for example, where we do not always have a change date) also captured? How much customizing should I expect will be required (of course, I understand this depends on how much customizing we have in our R/3)?
    Is it worth exploring these standard extractors, or should I start considering other options like BOBJ IM? I think I will have to use BOBJ IM anyway for some non-SAP data sources which I also have to include.
    Any experience with these extractors is appreciated!
    Many thanks.
    Regards,
    Claudia.

    Hi Claudia,
    Below are some of my comments regarding the SPM extractors.
    1. Using the SPM extractors as a starter kit is a great idea.  They bring in all the necessary data from the standard R/3 tables, with the logic needed to populate SPM.
    2. Writing your own ABAP programs / extractors may be necessary only if you have quite a lot of customization in the R/3 system that is your data source.
    3. Using BOBJ IM is necessary for extracting data from non-SAP data sources.
    As always, the initial load can be a lengthy process.  To speed it up, it is recommended to break it down into smaller, manageable loads.
    Delta is primarily based on the create / change date, for both master data and transaction data (see the sketch below).
    In terms of customizations, you are right, it depends on what you are looking for; you may end up adding new fields or writing an exit / code to add business logic to existing fields.  I have seen customers do a combination of both.
    I have seen quite a lot of customers take the route of the delivered extractors, which gives them a great start.  Also, by looking at the extractors you can tell exactly which tables and fields are used to extract the data; that way, even if you plan to build your own extractors, you know exactly which fields are needed.
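    To make the delta criterion above concrete, here is an illustrative filter, under the assumption that each record carries a creation date and an optional last-change date; the real extractors apply the equivalent logic to the date fields of the source tables.

        from datetime import date
        from typing import Optional

        def is_delta_record(created_on: date, changed_on: Optional[date],
                            from_date: date, to_date: date) -> bool:
            """A record qualifies for a delta run if it was created or changed
            inside the extraction window."""
            last_touched = max(created_on, changed_on) if changed_on else created_on
            return from_date <= last_touched <= to_date

        # A record changed inside the window is picked up; an untouched old one is not.
        print(is_delta_record(date(2009, 1, 5), date(2010, 8, 14),
                              date(2010, 8, 10), date(2010, 8, 15)))  # True
        print(is_delta_record(date(2009, 1, 5), None,
                              date(2010, 8, 10), date(2010, 8, 15)))  # False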
    Regards,
    Rohit

  • Transporting SPM-Extractors

    Hi,
    I just experienced some problems with transporting the SPM extractors (I know it's not the recommended way).
    The extract structures don't have an object catalog entry. By reactivating them they get one, but the generated data types in them still don't have an entry. Is there a more comfortable way than reactivating every single data element?
    Thanks and best regards
    Pascal

    Hi Pascal,
    As you mentioned, this is not the recommended way to transport. What we suggest is transporting the metadata and generating the objects in each client separately. If you are following the approach of transporting generated objects, there is currently no other way but to generate the object directory entry for each of them individually.
    Thanks and Regards,
    Divyesh

  • Can SPM realize AP requirement?

    Hi,
    Our company wants to use SPM to meet an AP reporting requirement. Is that possible?
    The AP reporting requirement contains these fields:
    company code--from invoice data
    cost center---from invoice data
    GL account----from invoice data
    PO number----from invoice data
    vendor---from invoice data
    Invoice number-----from invoice data
    invoice amount-----from invoice data
    invoice paid amount----from AP data
    payment term----from AP data
    invoice status----from GL data
    invoice date------from invoice data
    invoice paid date-----from AP data
    So basically, the data sources cover AP and invoices. I want to know whether SPM can meet this requirement. There are some related questions:
    Does the SPM data management tool use only certain specific standard DataSources? For example, 2LIS_06 for invoices, 2LIS_02 for purchase orders... Per the SPM documents I found on the web, no AP BI content is mentioned for SPM. Can the 0FI_AP_4 DataSource be used by SPM?
    Can we use customized DSOs in SPM, rather than the standard ones (0ASA_DSXX)?
    Can DataSources be enhanced in SPM?
    Thanks,
    Jack

    Hi Jack,
    Looking at your requirement, we cover the invoice portion of it and a lot more.  SPM has its own set of extractors that are used to pull data from ERP; we do not use any standard BW DataSources.  But if you are already pulling all the required information [Invoice, AP] using the standard BW DataSources, you can load that data into the SPM data model [it will be your responsibility to map the necessary fields and load them].
    As part of the standard SPM extractors and data model we do not have AP data, so you can either build a new set of objects [DSO and cube] or reuse the 0FI* objects to hold this data and add the necessary fields to the final SPM MultiProvider in the reporting area.
    The SPM application has functionality that lets you introduce a new data model [like the one mentioned above] and expose additional content in the SPM UI based on your needs.  You can also modify the SPM extractors to bring in additional information such as AP data or any other master data that is necessary.
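    Purely as an illustration of the field mapping that would be your responsibility, here is a hypothetical sketch of mapping an AP line (e.g. from a 0FI_AP_4-style load) onto a custom AP structure on the SPM side; the target field names are invented for the example, not delivered content.

        def map_ap_record(ap_row: dict) -> dict:
            """Map a few standard AP source fields to illustrative target fields."""
            return {
                "COMP_CODE":    ap_row.get("BUKRS"),   # company code
                "VENDOR":       ap_row.get("LIFNR"),   # vendor
                "INVOICE_NO":   ap_row.get("BELNR"),   # accounting document number
                "FISCAL_YEAR":  ap_row.get("GJAHR"),
                "PAYMENT_TERM": ap_row.get("ZTERM"),
                "PAID_DATE":    ap_row.get("AUGDT"),   # clearing date used as 'invoice paid date'
                "PAID_AMOUNT":  ap_row.get("DMBTR"),
            }

        sample = {"BUKRS": "1000", "LIFNR": "0000100123", "BELNR": "5100000123",
                  "GJAHR": "2010", "ZTERM": "Z030", "AUGDT": "20100815", "DMBTR": "1250.00"}
        print(map_ap_record(sample))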
    Hope this answers your question.
    Regards,
    Rohit

  • SPM data extraction question: invoice data

    The documentation on data extraction (Master Data for Spend Performance Management) specifies that invoice transactions are extracted from the table BSEG (General Ledger). On the project I'm currently working on, the SAP ERP team is quite worried about running queries on BSEG, as it is massive.
    However, the extract files are called BSIK and BSAK, which seems to suggest that the invoices are in reality extracted from those accounts payable tables.
    Can someone clarify which tables are really used, and if it is the BSIK/BSAK tables, which fields are mapped?

    Hi Jan,
    A few additional mitigation thoughts which may help along the way, as the same concerns came up during our project.
    1) Sandbox stress testing
    If available, take advantage of an ECC sandbox environment for extractor prototyping and performance impact analysis. BSEG can be huge (it contains all financial movements), so e.g. BI folks typically do not fancy a re-init load, for the reasons outlined above. Ask Basis to copy a full year of all relevant transactional data (normally FI & PO data) onto the sandbox and then run the SPM extractors for a full-year extraction to get an idea of the extraction's system impact.
    Even though system sizing and parameters may differ compared to your production box, you should still get a reasonable idea of the system impact and its direction.
    2) In a second step you may then consider breaking the data extraction (Init/Full mode for your project) down into 12 monthly extracts for the full year (this gives you 12 files from which you init your SPM system), with significantly less system impact and more control (e.g. it can be scheduled overnight).
    3) Business scenario
    You may consider using the vendor-related movements in BSAK/BSIK instead of the massive BSEG cluster table as the starting tables for the extraction process (and fetch / look up BSEG details only on a need basis); the index advantages were outlined above already.
    Using this approach we managed to extract invoice data with a reasonable source system impact (see the sketch below).
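    A rough sketch of that driving pattern is below; the column lists are trimmed, and the db handle and its query() call are placeholders for whatever data access the extraction job uses (the delivered extractors implement this in ABAP).

        # Drive the selection from the indexed vendor item tables and fetch
        # BSEG details only for the documents actually selected.
        OPEN_ITEMS_SQL = """
            SELECT BUKRS, BELNR, GJAHR, LIFNR, BUDAT
            FROM BSIK
            WHERE BUDAT BETWEEN :from_date AND :to_date
        """

        BSEG_DETAIL_SQL = """
            SELECT BUZEI, HKONT, KOSTL, DMBTR
            FROM BSEG
            WHERE BUKRS = :bukrs AND BELNR = :belnr AND GJAHR = :gjahr
        """

        def extract_invoices(db, from_date, to_date):
            for item in db.query(OPEN_ITEMS_SQL, from_date=from_date, to_date=to_date):
                details = db.query(BSEG_DETAIL_SQL, bukrs=item["BUKRS"],
                                   belnr=item["BELNR"], gjahr=item["GJAHR"])
                yield item, list(details)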
    Rgrds,
    Markus

  • Filter Credit Memo's from  Invoice Reports

    Hi All,
    The SPM extractor for invoices includes a filter based on posting key to include only the types 21, 22, 31, and 32, which are Credit Memo, Reverse Invoice, Invoice, and Reverse Credit Memo. We have a requirement on our invoice-based reports to exclude Credit Memos, Reverse Invoices, and Reverse Credit Memos. As the posting key is not brought into the data model in SPM, is there any other way to create a filter to cater for this requirement?
    Regards,
    Gary Elliott

    Thanks Divyesh,
    We had thought this would be the case. We will look at extending the extractor to include the posting key. This will give the users the option to filter on the type of document (see the sketch below).
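    Once the posting key (BSCHL) is available in the data model, the report-side filter could look roughly like this sketch; the mapping simply restates the posting key values mentioned above.

        POSTING_KEY_TYPES = {
            "21": "Credit Memo",
            "22": "Reverse Invoice",
            "31": "Invoice",
            "32": "Reverse Credit Memo",
        }

        def invoices_only(records):
            """Keep genuine invoices; drop credit memos and reversals."""
            return [r for r in records if r.get("BSCHL") == "31"]

        sample = [{"BELNR": "5100000001", "BSCHL": "31"},
                  {"BELNR": "5100000002", "BSCHL": "21"}]
        print(invoices_only(sample))  # only the posting key 31 document remains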
    Many thanks,
    Gary

  • Issues while generating Schema DAT files

    We are facing two types of issues when generating schema ".dat" files from an Informix database on Solaris OS using the "IDS9_DSML_SCRIPT.sh" file.
    We are executing the command at the Solaris prompt as follows:
    "IDS9_DSML_SCRIPT.sh <DBName> <DB Server Name>".
    The first issue is that, after the command is executed, the following error occurs while generating the ".dat" files. This error occurs for many tables:
    19834: Error in unload due to invalid data : row number 1.
    Error in line 1
    Near character position 54
    Database closed.
    This happens randomly for some schemas, so we then move the script to a different folder in Unix and execute it again.
    Can we get a solution for avoiding this error?
    2. The second issue is as follows:
    When the ".dat" files are generated without any errors using the script, these .dat files are provided to the OMWB tool to load the source model.
    The issue here is that sometimes OMWB is not able to complete the process of creating the source model from the .dat files and gets stuck.
    Sometimes the tables are loaded, but with wrong names.
    For example, the .dat file has the table name s/ysmenus for the sysmenus table,
    and when loaded into Oracle the table is created with the name s_ysmenus.
    Based on our analysis and understanding, this error occurs due to the delimiter.
    For example, this is a snippet from a .dat file generated by the IDS9_DSML_SCRIPT.sh script. The table name sysprocauthy is generated as s\ysprocauthy.
    In Oracle this table is created with the name s_ysprocauthy.
    s\ysprocauthy║yinformixy║y4194387y║y19y║y69y║y4y║y2y║y0y║y2005-03-31y║y65537y║yT
    y║yRy║yy║y16y║y16y║y0y║yy║yy║y╤y
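    If the stray character after the first letter of the table name is indeed the only corruption, a post-processing cleanup of the generated names could look like this hypothetical sketch (it does not address the unload error itself):

        def clean_table_name(raw_name: str) -> str:
            """Strip the stray escape character from an unloaded table name."""
            return raw_name.replace("\\", "").replace("/", "")

        print(clean_table_name(r"s\ysprocauthy"))  # sysprocauthy
        print(clean_table_name("s/ysmenus"))       # sysmenus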
    Thanks & Regards
    Ramanathan KrishnaMurthy

    Hello Rajesh,
    Thanks for your prompt reply. Please find my findings below:
    *) Have there been any changes in the extractor logic causing it to fail before the write-out to file, since the last time you executed it successfully? - I am executing only the standard extractors from the extractor kit, so presumably this shouldn't be an issue.
    *) Can this be an issue with changed authorizations? - I will check this today, but again this does not seem likely, as the same object in a different test project I created executed fine and a file was created.
    *) Has the export folder been locked or write-protected at the OS level? Have the network settings (if it is a virtual directory) changed? - It does not seem so, for the reason above.
    I will do some analysis today and get back to you.
    Regards
    Gundeep

  • Generic delta creation with Extractor Starter Kit - BW Datasource option

    Hi Rajesh, Rohit,
    As per the wiki, http://wiki.sdn.sap.com/wiki/display/CPM/FAQ-DataExtractionforSpendPerformance+Management, the question:-
    Do these starter kits have any support for delta (periodic data updates) after the initial load is completed?
    A. Yes. Deltas are available (for all source objects that support delta mechanism) in both of the extraction options. You don't need to reload the full set of data, when you perform the periodic data updates.
    I understand you can use the date range, but in my view this is still a full load.  We understand that you can create your own generic DataSource based on the function module generated when generating the object, e.g. PO.
    We have tried to do this for the object PO, as follows:
    RSO2 > create a new transaction DataSource > base it on the generated DataSource (Z_SADSERPPO) and use its function module and extract structure: Z_SAERPPO_DS (FM), ZEXTR_ERPPO (extract structure).
    When we go to create a generic delta based on DATE_RANGE we get the error message:
    Z_PDPOTEST: TABLES-paramter E_T_DATA for extractor Z_SAERPPO_DS is missing
    Can you create a blog around this as we understand other customers have managed to create these deltas?
    Best regards,
    Pom

    Hi Claudia,
    That's fine, please continue to use Z_SA_DEPD to generate the DataSource.  If you then run RSA2 on this DataSource, you can see its underlying function module and extract structure, which is also fine.
    The bizarre thing is that through RSO2 you cannot see the function module associated with this generated extractor, hence why the generic delta is not working.
    What we thought of doing, in line with the SAP recommendation, was to create a new transaction DataSource from transaction RSO2 and base it on the above function module and extract structure, but when we go to choose DATE_RANGE as the generic delta, it doesn't work and gives the error.
    I am still waiting for an answer.
    Will keep you posted.
    Thanks,
    Pom

  • SAP eSourcing 5.1 - Java/SQL based data extractor starter kit for BW

    Hi,
    We're implementing eSourcing 5.1, which is fully Java based. I'm not able to find the Java/SQL-based extractor for it. Does anyone have any idea where we could get the Java/SQL-based extractor? If anyone has prior experience extracting data from eSourcing 5.1 into BW, I would appreciate any advice on the best way to do it, whether using the standard Java/SQL-based extractor, flat files, or some other method.
    Thanks in advance for your help.
    Regards,
    Femmie.

    Hi Sivaraju,
    Thanks so much for the info.
    For DB Connect, do you know whether there are any standard extractors for it?
    Rgds,
    Femmie.

  • Document Date - Invoice Extractor Issue

    Hi Folks,
    I am facing an issue with the invoice extractor's document date format while extracting data from the source system.
    The record fails at the SPM validation step.
    Date in ECC and the BKPF table: 0949.08.10 (format: yyyy.mm.dd)
    This data comes into the PSA but fails at the SPM validation step.  If you look at the date, it is in the valid format and arguably should not fail.
    The error message I get in SPM is: Field XSATRANDAT is of type DATS and contains unallowed values
    Let me know how we can resolve this issue. We have 3 source systems, and across all of them we have approximately 5 such records creating problems.
    Regards,
    Sampat Desai

    Hi Sampat,
    Can you check to make sure that under global settings the date format is set up correctly?  You can also choose the date format under the properties of each load.
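    Note also that a year like 0949 is syntactically a valid DATS value but almost certainly bad source data; a small pre-load check along these lines (an illustrative sketch, not part of SPM) can isolate the handful of offending records for correction:

        from datetime import datetime

        def check_dats(value: str, min_year: int = 1900) -> bool:
            """Return True if an 8-character DATS value (YYYYMMDD) parses to a
            plausible calendar date; reject syntactically valid but implausible years."""
            try:
                parsed = datetime.strptime(value, "%Y%m%d")
            except ValueError:
                return False
            return parsed.year >= min_year

        print(check_dats("20100815"))  # True
        print(check_dats("09490810"))  # False - year 0949 flagged for correction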
    Regards,
    Rohit

  • Extractor for FICO (Internal Order)

    Hi Gurus,
    I was wondering if you happen to know of any extractor that can capture transactions that use an Internal Order in FICO. I can't seem to find one. Thanks in advance.
    - Kit

    Hi Kit,
    Internal Orders in FI are recorded in the BSEG table (AUFNR is the field), so based on this table I believe there is no standard DataSource.
    For your information, the following link provides the source tables for the SD, MM & FI extractors:
    http://wiki.sdn.sap.com/wiki/display/BI/BWSDMMFIDATASOURCES
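    Lacking a standard DataSource, internal-order postings could, for illustration, be identified directly from BSEG via its AUFNR field; in the sketch below the db handle and its query() call are placeholders, and the column list is trimmed.

        # Select FI line items that carry an internal order (AUFNR populated).
        INTERNAL_ORDER_SQL = """
            SELECT BUKRS, BELNR, GJAHR, BUZEI, AUFNR, DMBTR
            FROM BSEG
            WHERE AUFNR <> ''
        """

        def internal_order_postings(db):
            return list(db.query(INTERNAL_ORDER_SQL))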
    Regards
    Ram.

  • SAP Standard Extractors with Data Integrator

    Can the SAP ERP or APO standard extractors be used and managed by Data Integrator, e.g. 2LIS_12_VCITM, etc.?

    Hi,
    You might want to post the question in the EIM area. This forum is for the Integration Kit for SAP product.
    Ingo

  • How do I add Typekit fonts to a Muse web site

    How do I add Typekit fonts to a Muse website?

    Hi.
    Check this video; it might be helpful.
    Let me know if you have any questions.

  • Extractor 0FC_BP_ITEMS - Business Partner Items Enhancement

    I'm looking at using the 0FC_BP_ITEMS extractor to extract open and cleared items. Has anyone worked on an enhancement of this extractor? If so, what are the steps involved in enhancing it? Is it similar to the full-load extractors 0FC_CI_01 and 0FC_OP_01? Any help will be great.

    Hi,
    Please have a look at help.sap.com -> NetWeaver -> BI Content -> FICA.
    This is not the same as the two DataSources you mentioned. This DataSource will give you delta records.
    Regards,
    Asish
