0CO_PC_PCP_03 Extraction

Hi Experts,
I have a problem loading data; please share your thoughts.
I am working on 0CO_PC_PCP_03 and _04 to load data to BW.
I want to load data for 12 periods using the InfoPackage's "From" and "To" selections, but I cannot load more than one period at a time, and input for this field (PERIODE) is mandatory.
I checked the extractor in RSA3 and discovered that a function module (KKBW_PCP_ITEMS_GET_ITEMIZATION) of this standard SAP extractor controls the input. When I run the extractor with a single period, it succeeds; with a range it fails, and with a blank value it fails too.
Please share your thoughts.
Thanks.
Message was edited by: Santosh Varma

Hi,
You can load more than one month as well.
Add one more selection row for the period and try loading for two periods; if that works, add rows for all 12 periods and load them.
Try it out and let me know if there is any problem.
Regards,
mahantesh
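If the single-value selection turns out to be a hard limit of the extractor, one common workaround is to fill PERIODE from an ABAP routine in the InfoPackage's data selection and run the load once per period (for example, twelve times from a process chain). A rough sketch of such a selection routine; the literal period value is a placeholder:

```abap
* Sketch of an InfoPackage data-selection routine (selection type 6,
* "ABAP routine") for the PERIODE field. It fills exactly one period
* per load; schedule the InfoPackage once per period to cover a year.
  DATA: l_idx LIKE sy-tabix.

  READ TABLE l_t_range WITH KEY fieldname = 'PERIODE'.
  l_idx = sy-tabix.

  l_t_range-sign   = 'I'.
  l_t_range-option = 'EQ'.
  l_t_range-low    = '2005001'.   " <- set the period for this run

  MODIFY l_t_range INDEX l_idx.
  p_subrc = 0.
```

The `l_t_range` / `p_subrc` frame is the standard BW routine interface; only the period value changes between runs.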

Similar Messages

  • Errors occured during extraction at RSA3

Hi,
I am running these DataSources and getting data in RSA3, and it gives an error like "Errors occurred during extraction".
Can you please let me know what the reason could be?
    0CO_PC_PCP_03
    0CO_PC_PCP_04

    Hi,
Check the log in RSA3; you can find the log there, review it, and correct the error.
Steps:
1. Install the DataSource in RSA5.
2. Check it in RSA6.
3. Extract in RSA3.
    Thanks
    Reddy

  • 0CO_PC_PCP_03 Cost Estimate Itemizations at the top level

We want to use extractor 0CO_PC_PCP_03 to extract the itemization behind a cost estimate (the data in table CKIS). The issue with the extractor is that it breaks a multilevel BOM down to the basic levels (i.e., itemization explosion). However, we want the data at the top level, as it is displayed in a cost estimate itemization. Is it possible to get this extractor to extract only the top level of the itemization, without further explosion of detail?
Thanks
Geoff Armstrong

    I have the same problem now.
I plan to create a generic DataSource whose extractor is a function module derived from KKBW_PCP_ITEMS_GET_ITEMIZATION.
Have you found a better way since?
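For reference, a generic function-module extractor is usually built by copying SAP's template RSAX_BIW_GET_DATA_SIMPLE. A heavily abbreviated sketch of that pattern; the function name, the CKIS selection, and the omitted pieces are all illustrative assumptions, not working code:

```abap
FUNCTION z_bw_get_top_itemization.
* Abbreviated sketch modelled on the template RSAX_BIW_GET_DATA_SIMPLE.
* The interface (I_INITFLAG, I_MAXSIZE, I_T_SELECT, E_T_DATA, exception
* NO_MORE_DATA, ...) comes from the template; the idea here is to read
* only top-level itemization lines from CKIS instead of exploding the BOM.
  STATICS: s_cursor             TYPE cursor,
           s_counter_datapakid  TYPE sy-tabix.

  IF i_initflag = 'X'.
    " first (init) call: validate and store the selections (not shown)
  ELSE.
    IF s_counter_datapakid = 0.
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT * FROM ckis
          " restrict to top-level lines here; selection logic omitted
          WHERE ...
    ENDIF.
    FETCH NEXT CURSOR s_cursor
      APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
      PACKAGE SIZE i_maxsize.
    IF sy-subrc <> 0.
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
    s_counter_datapakid = s_counter_datapakid + 1.
  ENDIF.
ENDFUNCTION.
```

The DataSource is then created on this function module in RSO2, with an extract structure matching E_T_DATA.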

  • Extractor 0CO_PC_PCP_03: diferent costing runs for the same material

    Hi, experts.
We are extracting data with the 0CO_PC_PCP_03 extractor, but some materials do not return any records.
If we use standard CO transactions, we can get the values for all materials, but with the extractor only a few materials are returned.
We are filtering the extractor by costing run name and costing run date, but some materials are not extracted.
This occurs for one specific month only. We have extracted data for other months / costing runs, and the same materials appear both in the CO transaction and in the extractor result. What is different about this problematic month is that there are other costing runs of the same type (costing variant) with a later date.
It seems the extractor can only extract the last costing run for a material / month. Any ideas?
    Thanks,
    Henrique Teodoro

    Hello,
I have an update on my findings. A few days ago, an R/3 user ran a new costing run for a set of materials, using the same costing variant / costing version used one year ago. Now, when I extract the costing run created one year ago, the materials present in the recent run no longer appear. I ran this same extraction a few weeks ago and the materials were fine; after the new costing run with the same costing variant / costing version, they are gone from the extraction.
It seems this extraction can only return the last costing variant / costing version for each material. Or maybe the key of the table that stores this data is costing variant / costing version instead of costing run / costing run date.
Does anybody have any knowledge or experience of this?
    Thanks,
    Henrique

  • Has anybody used 0CO_PC_PCP_03

    Hi all
I was wondering if anybody has used the extractor 0CO_PC_PCP_03 for cost estimate itemization. I know that it is basically used for SEM, but I came to know that it can be used for BW after applying a few OSS notes.
If anybody has used this extractor for BW, please let me know about its validity.
We have already applied the following OSS notes for this extractor:
743357
774376
But we are still not sure we are getting correct data.
Any help would be appreciated.

    Hi Patrick.
Thanks for your reply.
I have applied the three OSS notes to 0CO_PC_PCP_03 in R/3, but we are still missing certain material cost estimates, although these material entries are found in the KEKO and CKIS tables and have a quantity structure. I know that this extractor only extracts material cost estimates with a quantity structure.
It would be great if you could answer a few of my questions:
1. Does this extractor take into consideration any other factors relating to material cost estimates?
2. What we notice is that it brings BOM details only for primary cost elements; is this how the extractor behaves?
3. What kind of BOM usage does it take into consideration?
Thanks for your time and help.

  • Cost Estimate - 0CO_PC_PCP_03 extractor

    Hello people,
I am extracting data from 0CO_PC_PCP_03, which gives the breakdown of cost estimates. However, ONE key figure, for item component 'G' and cost component '15 (Loading)', does not come through in the extractor; it is always zero for every material.
Going into R/3 (t-code CK13N), I found that this is because the extractor gives the cost of goods manufactured, not the cost of goods sold. In fact, the only difference between COGM and COGS is this 'Loading' cost.
I also found that the function module CK_F_ITEMIZATION_CONVERT sets the key figures for Loading to zero. This is the root of the issue. Does anyone have a solution, or know how to get at the cost of goods sold data?
    Many thanks

    Hi,
I am facing the same issue and do not know how to solve it. Do you have any solution for this?
    Regards,
    Elena

  • Itemizations Calculation - CK13N with 0CO_PC_PCP_03 or 0CO_PC_PCP_30

    Hi all,
I have a request to bring the itemization calculation into BW as it appears in transaction CK13N.
I have activated the DataSources 0CO_PC_PCP_03 and 0CO_PC_PCP_30, but they are not working as I expected.
I would like to have the different costs per cost component view (direct material, direct cost, TIC, ...), but with the two DataSources I only ever get the total value for the material, not the individual components.
I am new to CO extraction, and I don't know whether these are simply not the right DataSources for the data I am looking for, or whether I am doing something wrong.
    Thanks in advance for any input.
    Regards,
    Elena

    Hi,
We have the same data loading requirement. Can you let us know how to bring all plant/material data into BI?
What selection conditions best suit these InfoPackages for getting the data into BI?
    Thanks,
    Arun

  • 0co_pc_pcp_03 not validating PERIODE

    Hi all
We are using 0CO_PC_PCP_03 for product cost itemization. When I generate the DataSource 0CO_PC_PCP_03 and run the extractor checker in RSA3, it brings back the same data no matter what PERIODE I enter in the selection.
PERIODE is a required selection field for this extractor, but the extractor program is ignoring the entry in this field.
Example: the extractor brings back the same data for
     1) PERIODE = 2005002 and
     2) PERIODE = 1234567
Did anyone face this kind of problem? I would appreciate your help in this regard.

    Hi Gattu,
Could you please check the function module KKBW_PCP_ITEMS_GET_ITEMIZATION? Inside this function module there is an include program, LKKBW_PCP_ITEMSF03, which allows only one period to be run in the InfoPackage. I am facing the problem that I can load only one period at a time, whereas I need to load 12 periods.
Look into this function module; hope it helps.
The worst thing is that this extractor doesn't support delta. I think there is a bug in this extractor. Do let me know if you come across any OSS note or workaround. Experts, please contribute your inputs.
    Thanks.
    San.
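The single-value restriction described here typically comes from a selection check of roughly the following shape. This is a guess at the logic only; the actual code lives in include LKKBW_PCP_ITEMSF03 and will differ:

```abap
* Illustrative only, NOT the real SAP source: the kind of check that
* accepts exactly one single-value selection for PERIODE and rejects
* both ranges and empty selections.
  READ TABLE i_t_select INTO ls_select
       WITH KEY fieldnm = 'PERIODE'.
  IF sy-subrc <> 0                      " no selection at all
     OR ls_select-option <> 'EQ'        " range (BT etc.) rejected
     OR ls_select-low IS INITIAL.       " blank value rejected
    RAISE error_passed_to_mess_handler.
  ENDIF.
```

If the check really looks like this, a "From"/"To" interval in the InfoPackage arrives as option 'BT' and fails immediately, which matches the behaviour seen in RSA3.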

  • How to extract Cleared and Open Items from SAP R/3?

    Hi experts,
    I have a requirement from user to extract cleared and open invoice items from SAP R/3.
    Can someone tell me how to do that?
    Thanks!

Hi,
Use the DataSource 0FI_AR_4 to know the status (0FI_DOCSTAT) of payments to be made by the customer.
OR
Enhance 2LIS_13_VDITM with VBUP fields, which will give the status of billing.
    With rgds,
    Anil Kumar Sharma .P
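The enhancement route mentioned above is normally done by appending the VBUP fields to the extract structure and filling them in the transaction-data user exit. A rough sketch; the appended field name ZZFKSTA and the local structure handling are assumptions for illustration:

```abap
* In include ZXRSAU01 (user exit EXIT_SAPLRSAP_001, transaction data).
* ZZFKSTA is an assumed append field for a billing status from VBUP.
  DATA: ls_vditm TYPE mc13vd0itm.

  CASE i_datasource.
    WHEN '2LIS_13_VDITM'.
      LOOP AT c_t_data INTO ls_vditm.
        SELECT SINGLE fksta FROM vbup
          INTO ls_vditm-zzfksta
          WHERE vbeln = ls_vditm-vbeln
            AND posnr = ls_vditm-posnr.
        MODIFY c_t_data FROM ls_vditm.
      ENDLOOP.
  ENDCASE.
```

For volume loads, a SELECT ... FOR ALL ENTRIES into an internal table with a sorted READ would be preferable to the SELECT SINGLE inside the loop.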

  • Extraction and loading of Training and Event Management data (0HR_PE_1)

    hello,
    I've got the following doubt:
Before the BI 7.0 release, extractor 0HR_PE_1 extracts event data (event ID, attendee ID, calendar day, ...), but if you load it straight into cube 0PE_C01, then, since Calendar year/month needs a reference date (for example, the event start date), you get the total duration of the event in hours or days assigned to the event start date.
So in a query filtered by month, you get the total duration of an event that starts in the filtered month but may not end until a few months or even a year later, so you don't get appropriate information.
Example:
Event          calday        Hours
10004377  20081120   500        but from event_attr the event end date is 20090410.
In a query filtered by 200811, you get the total duration (500 hours, when the event hours in 200811 were only 20), and if you filter by any month of 2009, you get no information at all about the duration of that event.
I had to create a copy of the standard cube (without calday; only Calendar year/month and Calendar year in the time dimension), disaggregate the data by creating as many entries per event as the months it lasts, and adjust the ratio calculation (duration of the event in each month / total duration of the event).
    The question is: Is there any improvement or change on business content in BI 7.0 to treat Training and Event Management data? Has anybody had to deal with that?
    Thank you very much.
    IRB
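The disaggregation described above can be sketched roughly as follows. This shows an even split across months for brevity (the poster's real logic weighted each month by its actual event hours), and all field and variable names are illustrative:

```abap
* Spread one event record over the months it spans (even split).
  DATA: lv_year     TYPE i,
        lv_month    TYPE i,
        lv_calmonth(6) TYPE n.

  lv_year  = ls_event-start_date(4).      " YYYY from start date
  lv_month = ls_event-start_date+4(2).    " MM   from start date

  DO lv_no_of_months TIMES.               " months from start to end
    lv_calmonth     = lv_year * 100 + lv_month.
    ls_out-calmonth = lv_calmonth.
    ls_out-hours    = ls_event-hours / lv_no_of_months.
    APPEND ls_out TO lt_out.

    lv_month = lv_month + 1.
    IF lv_month > 12.                     " roll over the year boundary
      lv_month = 1.
      lv_year  = lv_year + 1.
    ENDIF.
  ENDDO.
```

In BW terms, this would typically sit in a transformation or update routine between the DataSource and the copied cube.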

    Hi,
    TEM data is stored in HRP tables.
    You can load the catalog by creating LSMWs for objects Business event group (L), Business event types (D), Locations (F), Organizers (U) as per requirement.
    LSMW for tcode PP01 can be used to create these objects.
    To create Business Events (E) you can create LSMW for PV10/PV11.
    To book attendee create LSMW for tcode PV08 as here you can specify the actual business event ID which reduces ambiguity.
Use tcode PV12 to firmly book events and tcode PV15 to follow up.
    Hope this helps.
    Regards,
    Shreyasi.

  • How to Extract Data for a Maintenance View, Structure and Cluster Table

    I want to develop  3 Reports
    1) in First Report
    it consists only two Fields.
    Table name : V_001_B
    Field Name1: BUKRS
    Table name : V_001_B     
    Field Name2: BUTXT
V_001_B is a maintenance view.
I cannot find any DataSource for it.
How do I extract the data for this maintenance view?
    2)
    For the 2nd Report also it consists Two Fields
    Table name : CSKSZ
    Field Name1: KOSTL (cost center)
    Table name : CSKSZ
    Field Name2: KLTXT (Description)
CSKSZ is a structure.
I cannot find any DataSource for it either.
How do I extract the data for this structure?
    3)
    For the 3rd Report
In this report, all fields belong to the table BSEG.
BSEG is a cluster table.
I cannot find a suitable DataSource for this one either; I find very few objects among the DataSources.
How do I extract the data for this one?
    Please provide me step by step procedure.
    Thanks
    Priya

Hi Sachin,
I don't get your point; can you explain it briefly?
    I have two Fields for the 1st Report
    BUKRS
    BUTXT
    In the 2nd Report
    KOSTL
    KLTXT
If I use the 0COSTCENTER_TEXT DataSource,
I will get the KOSTL field only.
What about KLTXT?
    Thanks
    Priya

  • Basic extraction  doubts sap SAP R/3 to SAP BI

In SBIW, which radio button should I select if I am extracting a table in R/3 with 77 fields into SAP BI?
1) Transaction data, or
2) Master data attributes, or
3) Text data?
Please tell me.
After that, if I want to use the table for reporting in SAP BI, which is the best data target to create in SAP BI?
a) InfoCube
b) DSO
And please tell me the steps to follow in the SAP BI system.
I am asking all this because I am new to SAP BI.
Please do write back... thank you!

    Hi,
I suppose you need transaction data; we could be sure if you posted the table you want to extract.
The InfoProvider you need depends on the data you need for reporting. Please search SDN or help.sap.com for modelling help.
    Regards
    Edited by: Zoltan Both on Aug 20, 2008 1:14 PM

  • BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?

    Hi.
Can anyone advise on the difference between the two data extraction flows for extracting data from SAP R/3?
1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
This ABAP flow generates an ABAP program and a .dat file.
We can also upload this program and run jobs with the 'Execute preloaded' option on the datastore.
This works fine.
2) We can also pull the SAP R/3 table directly:
DF2 >> SAP R/3 table (this has a red arrow, like an OHD) >> query transformation >> target
This also works fine, and we are able to see the data directly in Oracle.
It can also be scheduled as a job.
But I am unable to understand the purpose of the different types of data extraction flows:
when to use which type of flow, and the advantages/disadvantages of the two.
What we do not understand is this: if we can pull data from an R/3 table directly through a query transformation into the target table, why use the flow of creating an R/3 data flow, doing a query transformation again, and then populating the target database?
There might be practical reasons for using these two different types of flows, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
For this we use an R/3 data flow and the ABAP program, which we upload to the R/3 system so that we can use the 'Execute preloaded' option and run the jobs.
Since we have no control over our R/3 servers, nor anyone for ABAP programming, we do not do anything at the SAP R/3 level.
I was doing trial-and-error testing on our workflows for our new requirement:
WF 1, which has some 15 R/3 tables.
For each table we have created a separate data flow.
In some data flows, where the SAP tables had a lot of rows, I decided to pull the data directly, bypassing the ABAP flow.
The entire workflow and data extraction still run fine.
In fact, I created a new sample data flow and tested it using both direct download and 'Execute preloaded'.
I did not see any major difference in the time taken for data extraction, because either way we pull the entire table, then choose what we want to bring into Oracle through a view for our BO reporting, or aggregate it and bring the data in as a table for universe consumption.
Actually, I was looking at other options to avoid the ABAP generation and the R/3 data flow, because our dev and QA environments give delimiter errors, whereas production works fine. Production is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and it is these that show the delimiter error.
I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
While trying to resolve this problem, I ended up trying to pull the R/3 table directly, without the ABAP workflow, just by trial and error of every drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire e-recruiting module of SAP.
I don't know whether I can use this direct pulling of data for the new job I have created, which has 2 workflows with 15 data flows in each workflow, and push this job into production.
Nor do I know whether I can bypass the ABAP flow and pull R/3 data directly in all data flows for ANY of our future SAP R/3 data extraction requirements. This technical difference between the two flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular kind of data extraction.
As advised, I shall check the schedules for a week, and then we shall probably move it into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
We are using open hub to download transaction files from InfoCubes to the application server, and we would like the filename to be dynamic based on period and year, i.e., the period and year of the transaction data being downloaded.
I understand we could use a logical file for this purpose. However, we are not sure how to have the period and year derived dynamically in the filename.
I have read a number of messages posted on SDN on a similar topic, and many have suggested a 'How-to' paper titled "How to Extract Data with Open Hub to a Logical Filename". However, I cannot seem to get the document from the link given.
I just wonder if anyone has the correct or latest link to the document, or I would appreciate it if you could share the document with everyone on SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
After creating the open hub destination, press F1 in the 'Application server file name' text box. From the help window, click 'Maintain client-independent file names and file paths'; you will be taken to the Implementation Guide screen.
1. Click 'Cross-client maintenance of file names' and create a logical file path via 'New Entries'.
2. Under 'Logical file name definition', enter your logical file name, the physical file (your file name followed by month or year, whatever is applicable; press F1 for more information), data format (ASC), application area (BW), and the logical path created in step 1 (choose it from the F4 selection).
3. Under 'Assignment of physical path to logical path', enter the syntax group; the physical path is the path you gave in the logical file name definition.
We created a logical file name that identifies the file by system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help will be of much use here. All the steps above will help you create a dynamic logical file.
Hope this helps to some extent.
    Regards
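For the dynamic period/year part, one option is to define the physical file in transaction FILE with a parameter placeholder (e.g. <PARAM_1>) and resolve it at runtime with FILE_GET_NAME. A sketch; the logical file name and the parameter value are assumptions:

```abap
* Assumed logical file name 'Z_OHD_PERIOD_FILE', defined in tcode FILE
* with <PARAM_1> in its physical file name. The period/year string
* would be derived from the data being extracted.
  DATA: lv_file TYPE filename-fileextern.

  CALL FUNCTION 'FILE_GET_NAME'
    EXPORTING
      logical_filename = 'Z_OHD_PERIOD_FILE'
      parameter_1      = '2011_012'       " year_period for this load
    IMPORTING
      file_name        = lv_file
    EXCEPTIONS
      file_not_found   = 1
      OTHERS           = 2.
  IF sy-subrc <> 0.
    " handle a missing or inconsistent logical file definition
  ENDIF.
```

The resolved lv_file (with <PARAM_1> substituted) can then be used as the target filename for the open hub extraction.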

  • F.01 - extract to consolidation on company code level

    Hi everyone
    We're trying to do an extract to consolidation on company code level (1). When popup dialog "Filename for extraction file" appears we can choose between presentation server or application server.
    When choosing presentation server and logical filename ECCS_FILE we get error GC027 (Error opening file c:\temp\EXTRACT_CS_CO.txt).
Choosing the application server, however, works (although the path is not the same).
We have already checked and double-checked the settings of the logical file name and logical path in transaction FILE.
Any suggestions on what to do, or is it simply not supposed to work via the presentation server?
    Thanks in advance and best regards to all
    Renaud

I have not seen anywhere that you mentioned entering a file path.
Is this being customized for your company?
How did you download this before?
