Issue with PU12 extraction

Hi,
I want to extract master data from various infotypes using PU12.
1. Is it possible to extract only master data from various infotypes using PU12?
2. If so, how can I extract it?
3. Is it possible to add our own field to the final layout even if it is not in the infotypes? (I want to add some constant fields, and the constant values need to change based on the values I fetch.)
It would be very helpful to get early responses.
Regards,
Satheesh.


Similar Messages

  • Issue with generic extraction (Generic Delta)

    I have one more issue, with generic delta extraction. I selected the field relevant to my requirement, but when I try to save I get the error "Still OLTP have errors".
    I need some help with the settings to choose when selecting a generic delta. I am currently working on Inventory Management.
    Please, someone help me as soon as possible.
    Thanks in advance.
    Regards,
    SDR.

    Hi,
    I think some of the key figure fields need a 0UNIT or 0CURRENCY field in the extract structure, but it is missing. Check that all unit of measure and currency unit fields are not hidden.
    Regards,
    Vijay.

  • Issue with Data Extraction from OAGIS 9 User Area Tags ...

    We are working on displaying data coming from OAGIS9 in an Excel sheet. We created a custom schema for the Excel sheet and are able to display all the OAGIS9 data except for what is in UserArea. We are facing an issue extracting data from the tags in the UserArea of the OAGIS 9 schema and transforming it into our custom-built schema.
    Can you please suggest a way to extract data from the UserArea?
    Thanks in advance ~

    Here is an example with instructions; we have just done this, and it works.
    1. You need to have the custom schema that comes inside <UserArea>/<any>.
    For example, this custom schema is the one shown below, and you are using the CustomerPartyMaster of OAGIS:
    <?xml version="1.0" encoding="utf-8"?>
    <xs:schema xmlns="http://www.userarea.com/mdm/oagisextensions" elementFormDefault="qualified" targetNamespace="http://www.userarea.com/mdm/oagisextensions" xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="Customer" type="CustomerType" />
      <xs:complexType name="CustomerType">
        <xs:sequence>
          <xs:element minOccurs="0" name="Gender" type="xs:string" />
          <xs:element minOccurs="0" name="DOB" type="xs:date" />
          <xs:element minOccurs="0" name="SSN" type="xs:string" />
          <xs:element minOccurs="0" name="PAN" type="xs:string" />
        </xs:sequence>
      </xs:complexType>
    </xs:schema>
    And the input data snippet inside CustomerPartyMaster will look like this:
    <tnsa:UserArea>
         <tnso:Customer>
              <tnso:Gender>string</tnso:Gender>
              <tnso:DOB>1980-10-13</tnso:DOB>
              <tnso:SSN>string</tnso:SSN>
              <tnso:PAN>string</tnso:PAN>
         </tnso:Customer>
    </tnsa:UserArea>
    where xmlns:tnsa="http://www.openapplications.org/oagis/9" and xmlns:tnso="http://www.userarea.com/mdm/oagisextensions".
    Let's assume your target schema is like this, where only DOB comes from the UserArea:
    <CreateRequest>
      <record>
        <firstname></firstname>
        <lastname></lastname>
        <dob></dob>
      </record>
    </CreateRequest>
    2. Declare the namespace of the incoming UserArea custom schema in your XSL file like this:
    xmlns:nsext="http://www.userarea.com/mdm/oagisextensions"
    3. Here is the XSL code that copies the element from CustomerPartyMaster/UserArea:
    <xsl:template match="/">
      <imp1:CreateRequest>
        <xsl:for-each select="/inp1:ESBProcessCustomerRequest/body/ns4:ProcessCustomerPartyMaster/ns4:DataArea/ns4:CustomerPartyMaster">
          <imp1:record>
            <xsl:for-each select="ns4:UserArea/nsext:Customer/nsext:DOB">
              <imp1:dob>
                <xsl:value-of select="."/>
              </imp1:dob>
            </xsl:for-each>
          </imp1:record>
        </xsl:for-each>
      </imp1:CreateRequest>
    </xsl:template>
    where imp1 is the target schema namespace and ns4 is the OAGIS standard namespace.

  • Issue with generic extraction using FM

    Hi All,
    I am trying to load from one DSO to another DSO using generic extraction with a function module: I created a structure matching the second DSO and a generic DataSource based on a function module (I copied RSAX_GET_SIMPLE_DATA and added the transformation logic to it).
    When I load data with an InfoPackage, it loads into the PSA perfectly (100 records). But when I load through a DTP to the data target (the second DSO), it loops over an enormous number of records in each data package (around 8 million per package) and finally updates just 100 records into the data target, taking a very long time to finish.
    I would like to know whether there is anything I have to change in the function module code to fix this.
    I would also like to know: since the request is already in the PSA, will the data be extracted through the FM again when updating via the DTP?
    Thanks
    R,man
    Points will be awarded.

    Hi
    I had the same problem. In the function module code, check the RAISE exception logic (an ABAP colleague can help), and try debugging the code in RSA3 in debug mode. It is probably going into an infinite loop.
    Regards
    Souresh
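
    For reference, the usual cause of such a loop is an extractor that never raises NO_MORE_DATA, so the framework keeps requesting packages. Below is a minimal sketch of the data-fetch branch, modelled on the standard extractor template; the table name /BIC/AZSRCDSO00 and the variable names are illustrative assumptions, not the poster's actual code:

    STATICS: s_cursor            TYPE cursor,
             s_counter_datapakid TYPE sy-tabix.

    IF i_initflag = sbiwa_c_flag_on.
      " Initialization call: store the selections; no data is returned yet.
    ELSE.
      IF s_counter_datapakid = 0.
        " Assumed active table of the source DSO - replace with your own.
        OPEN CURSOR WITH HOLD s_cursor FOR
          SELECT * FROM /bic/azsrcdso00.
      ENDIF.
      FETCH NEXT CURSOR s_cursor
        APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE i_maxsize.
      IF sy-subrc <> 0.
        CLOSE CURSOR s_cursor.
        " Without this RAISE, the caller keeps requesting packages forever.
        RAISE no_more_data.
      ENDIF.
      s_counter_datapakid = s_counter_datapakid + 1.
    ENDIF.

    As for the second question: a DTP whose source is the DataSource reads the request from the PSA; it does not call the function module in the source system again.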

  • Data Validation issue with Standard Extraction

    Hi Experts,
    I have a requirement to publish a report in BI from ECC; all the details come from the DataSource 2LIS_02_ITM.
    This DataSource has many fields that come from the following tables:
    EKKO
    EKPO
    MARA
    EKBE
    I replicated the DataSource in BI, created the transformation, and extracted the full data. The number of records in the DSO (0PUR_O01) is 27,910.
    I checked the ECC tables, and the record counts are:
    EKKO  -    79,129
    EKPO  -   250,502
    MARA  -    40,950
    EKBE  - 1,447,470
    I just don't know how to check whether the BI data is consistent with ECC.
    Can anyone please tell me how to reconcile the data?
    Thanks

    Hi,
    you have to check the number of records for the DataSource in transaction RSA3.
    Also check the start routine at the transformation level, as well as any selections you used for the load.
    Hope this helps,
    Regards,
    Rama Murthy.

  • Issue with data extracted to "loader" format

    I only just noticed something weird (yet probably not directly related to SQL Developer).
    Using 3EA4, I see the following line when unloading a query result to "loader" format:
    CONTINUEIF NEXT(1:1) = '#'
    I assumed this was harmless, yet today I noticed it causes SQL*Loader to systematically ignore the first character of every line in the data section of the .ldr file!
    I use this format often, as it conveniently bundles spec and data (ldr = ctl + dat), but I had never paid attention to this glitch before. Could this perhaps affect this particular version of SQL*Loader only? Right now I only have access to the 10.2.0.1.0 win32 version that comes with XE10g. Can someone confirm?
    If such a widespread version (free, from Oracle, so I assume it has attracted a lot of users) is affected, wouldn't it make sense not to generate the offending line from SQL Developer?

    Wow - on closer inspection I see SQL Developer actually systematically includes a leading space, which can then be eaten by the CONTINUEIF while still enabling support for multi-line records.
    My problem only occurred because I had stripped that leading space.
    Every little thing counts, it seems :-(
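
    To illustrate the mechanics (the table and data below are made up, not from the post): with CONTINUEIF NEXT(1:1) = '#', SQL*Loader inspects column 1 of the next physical record to decide whether to concatenate, and it removes that column from every physical record when assembling logical records. That is why SQL Developer writes a sacrificial leading space on each data line:

    -- hypothetical single-file bundle (control + data)
    LOAD DATA
    INFILE *
    CONTINUEIF NEXT(1:1) = '#'
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    (empno, ename)
    BEGINDATA
     7839,KING
     7698,BLAKE

    Strip that leading space and SQL*Loader consumes the first real character of every line instead, exactly as described above.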

  • Issue with Data Extraction for Master data

    Hi All,
    I have a requirement to enhance the standard DataSource 0VENDOR_ATTR to populate certain fields from table LFBK.
    I have enhanced the DataSource, and I am able to pull the data in ECC via RSA3.
    After that, I replicated and activated the DataSource in BI; the enhanced fields are available in the BI DataSource, and I have created the transformation between the DataSource and 0VENDOR.
    When I run the InfoPackage, the newly enhanced fields are not populated. I really wonder why the data is not coming through to BI when it is populated in RSA3 in ECC.
    Please share your suggestions for solving this issue.
    Regards,
    Gowrisankar

    Hello Gowrisankar,
    After making changes to your 0VENDOR_ATTR DataSource, make sure that you have replicated the DataSource on the BI side.
    If not, please replicate the DataSource first and check that the newly enhanced fields are available in the PSA.
    Regards,
    Divyesh Khambhati

  • Issue with Hrs extraction.

    Hi all;
    We are using delta for DataSource 0CA_TS_IS_1, and extraction has been working fine for approved CATS hours. But recently we added another company code and cost centers for it, and the new employees' hours for those cost centers are not coming through in the extraction. We tested in RSA3 with no luck. Is there a setting or something we are missing?
    Krishma.


  • Issue with controlling extraction

    Hi
    Due to an SAP R/3 error, my data load from CO_OM_CCA_9 to a data target has been stuck since October. It is now OK in SAP R/3. How can I load the data from October to date into my data target?
    Please suggest.

    Hi Simon!
    "Stuck" means it has been failing since last October.
    Yes, I am running deltas.
    Yes, the last timestamp is set to X (that is for 09.05.2008).
    In the BWOM_SETTINGS table, the most recent entry has no value.
    We have an ODS to resolve duplicate records.
    Please also tell me the details of these.

  • R&R issue with extracts

    Hi All,
    We have an issue with the R&R daemon when creating a new extract. The queue AC_Extract holds with a DDIC_WRITE error. After deleting it and releasing the queue, the extract eventually ends successfully. We have the same problem at all MSA sites. What does the error mean, and what can we do to prevent it?
    Thanks!

    Hi,
    isn't it SPE_DDIC_WRITE that causes the problem?
    In that case, if you are running 5.0, please check note 765953.
    Regards,
    Wolfhard

  • Multiple issues with Creative Cloud File Syncing

    I'm having issues syncing files on my Mac to Adobe Creative Cloud. I don't have issues syncing via drag and drop in my web browser, but I keep getting this error while syncing via Finder:
    I also have a fairly complex PSD with folders that I'm toggling on and off using Layer Comps. I don't see any difference when switching Layer Comps within Extract in the browser, although I do see the changes when turning the folders on and off. What's the deal with that?
    Why does it take so long to render the PSD after I turn on a folder or switch Layer Comps? It takes about 30 seconds to re-render, which is crippling for our workflow.
    This would be an amazing tool for our developers if these three issues were resolved.

    Thanks for the information.
    Could you now send me some log files, please?
    The log files can be found here:
    Mac: /Users/<yourusername>/Library/Application Support/Adobe/CoreSync/
    Windows: C:\Users\<yourusername>\AppData\Roaming\Adobe\CoreSync\
    The logs have the date in the filename, like "CoreSync-2014-03-25.log". Please compress (zip) all the CoreSync-2014-MM-DD.log files and email them to me directly at [email protected]
    Thanks
    Warner
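
    If it helps, the Mac logs can be collected in one step from Terminal (assuming the default location above):

    cd ~/Library/Application\ Support/Adobe/CoreSync/
    zip coresync-logs.zip CoreSync-2014-*.log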

  • Issues with using the output redirection character with newer NXOS versions?

    Has anyone seen any issues with using the output redirection character with newer NXOS versions?
    I am receiving "Error 0x40870004 while copying."
    Simply copying a file from bootflash to TFTP is OK.
    This occurs with both the 3CDaemon and Tftpd32 software.
    I have tried it on multiple switches - same issue.
    Any known bugs?
    thanks!
    The following is an example of bad (NXOS 4.1.1b) and good (SANOS 3.2.1a) behavior:
    MDS2# sho ver | inc system
      system:    version 4.1(1b)
      system image file is:    bootflash:///m9200-s2ek9-mz.4.1.1b.bin
      system compile time:     10/7/2008 13:00:00 [10/11/2008 09:52:55]
    MDS2# sh int br > tftp://10.73.54.194
    Trying to connect to tftp server......
    Connection to server Established. Copying Started.....
    TFTP put operation failed:Access violation
    Error 0x40870004 while copying tftp://10.73.54.194/
    MDS2# copy bootflash:cpu_logfile tftp://10.73.54.194
    Trying to connect to tftp server......
    Connection to server Established. Copying Started.....
    |
    TFTP put operation was successful
    MDS2#
    ck-ci9216-001# sho ver | inc system
      system:    version 3.2(1a)
      system image file is:    bootflash:/m9200-ek9-mz.3.2.1a.bin
      system compile time:     9/25/2007 18:00:00 [10/06/2007 06:46:51]
    ck-ci9216-001# sh int br > tftp://10.73.54.194
    Trying to connect to tftp server......
    |
    TFTP put operation was successful

    Please check with the new version of the Tftpd32 server. The error may be due to an older version of the TFTP server; the newer version available resolves this error, and files then upload with no issues.
    1. Download tftpd32b.zip from:
    http://tftpd32.jounin.net/tftpd32_download.html
    2. Copy the tftpd32b.zip file into an empty directory and extract it.
    3. Copy the file you want to transfer into the directory containing tftpd32.exe.
    4. Run tftpd32.exe from that directory. The "Base Directory" field should show the path to the directory containing the file you want to transfer.
    At this point, the TFTP server is ready to begin serving files. As devices request files, the main tftpd32 window will log the requests.
    Best Regards...

  • Performance Issues with large XML (1-1.5MB) files

    Hi,
    I'm using XML Schema-based object-relational storage for my XML documents, which are typically 1-1.5 MB in size, and I'm having serious performance issues with XPath queries.
    When I run an XPath query against an element of SQL type VARCHAR2, I get good performance. But when I run a similar XPath query against an element of SQL type collection (VARRAY of VARCHAR2), performance is very poor.
    I have also created indexes on extract() and analyzed my XMLType table and indexes, but I see no performance gain. I have tried all the storage options available for collections, i.e. VARRAYs, nested tables, IOTs, LOBs, inline, etc., and all of them gave me the same bad performance.
    I even tried creating XMLType views based on XPath queries, but the performance didn't improve much.
    I guess I'm running out of options, and patience as well. ;)
    I would appreciate any ideas/suggestions. Please help.
    Thanks;
    Ramakrishna Chinta

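    For context, scalar XPath predicates tend to perform well in object-relational storage because Oracle can rewrite them to the underlying relational columns and use a function-based index, whereas a predicate that reaches into a collection maps one row to many values and cannot be served by such an index. A minimal sketch with a hypothetical XMLType table and paths, using the 10g-era extractValue function:

    -- hypothetical function-based index on a scalar element
    CREATE INDEX doc_status_idx ON xml_docs
      (extractValue(object_value, '/Document/Status'));

    -- this scalar predicate can be rewritten to use the index
    SELECT extractValue(object_value, '/Document/Title')
      FROM xml_docs
     WHERE extractValue(object_value, '/Document/Status') = 'ACTIVE';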

  • Performance issues with 0CO_OM_WBS_1

    We use BW 3.5 and R/3 4.7 and encounter huge performance issues with 0CO_OM_WBS_1: we always have to do a full load involving approx. 15M records, even though there are on average only 100k new records since the previous load. This takes a long time.
    Is there a way to delta-enable this DataSource?

    Hi,
    This DataSource is not delta-enabled, so you can only do a full load. For a delta-enabled one, you need to use 0CO_OM_WBS_6. It works like the other Financials extractors, with a safety delta (configurable, default 2 hours, in table BWOM_SETTINGS).
    What you could do is use WBS_6 as a delta and only extract full loads from WBS_1 for shorter durations.
    As you must have an ODS for WBS_1 at the first stage, I would suggest doing a full load only for the posting periods that are open. This will reduce the data load.
    You may also look at creating your own generic DataSource with delta, if you are clear on the tables and logic used.
    cheers...

  • Issue with 0hrposition master data

    We are extracting data from SAP using the 0HRPOSITION_ATTR DataSource. I noticed that the data is not maintained correctly in the master data tables in BW, which gives us incorrect results in reporting. Consider the following scenario:
    Position A is created as vacant on 04/01/2006, with start date (BEGDA / Valid from) 04/01/2006 and end date (ENDDA / Valid to) 12/31/9999. The following entries appear under maintain master data for 0HRPOSITION in BW:
    Position   Valid To     Valid From   Position Vacant
    A          03/31/2006   01/01/1000
    A          12/31/9999   04/01/2006   X
    Position A is then delimited on 09/15/2006, as it is no longer required. In SAP, the position has a record only from 04/01/2006 to 09/15/2006, as vacant. When the record is extracted into BW, it creates the following entries in the master data table:
    Position   Valid To     Valid From   Position Vacant
    A          03/31/2006   01/01/1000
    A          09/15/2006   04/01/2006   X
    A          12/31/9999   09/16/2006   X
    The entry 09/16-12/31 is incorrect, as the position does not exist for this duration. If we report on 0HRPOSITION with key date 09/30/2006, position A shows as vacant even though it no longer exists.
    Has anyone come across this situation? Any help is greatly appreciated.
    Kamal
    P.S.: Milind Rane... I was searching through the forums and came across your post. I would appreciate it if you could let me know how you solved this issue...

    Hi KK,
    I have a similar issue. Can you please let me know how this issue with the 0HRPOSITION_ATTR extractor was resolved? In my case, I get incorrect data in BW when reporting is done. The correct records come through the extractor as far as the PSA, but the 0HRPOSITION master data contains incorrect records.
    Please help.
    Thanks
    Hari
