GLPCA data extraction...
I need to extract GLPCA table data into BW. Is there any Standard Business Content available for doing the same?
Hi Akhila,
Please check:
http://help.sap.com/saphelp_nw04/helpdata/en/42/e77a8e0e2c6346a0be901408be8781/content.htm
Hope this helps...
Similar Messages
-
GLPCA Data extraction in BW...
I need to bring GLPCA data into BW. I am going to activate the DataSource 0EC_PCA_3 and have created an ODS based on it. I need to combine the GLPCA data with billing data to meet my requirements. Will there be any performance issues in bringing GLPCA data into BW? Please let me know.
Hi Akhil,
Not sure what data volumes you are looking at. If the volume is too high, you can consider initialising the DataSource in buckets based on periods.
I have used the same DataSource 0EC_PCA_3 for an initial extraction of around 17 million records and faced no problems. I initialised the DataSource for one fiscal year at a time, i.e. 001.2004 to 012.2004, 001.2005 to 012.2005, and so on.
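The year-by-year bucketing can be sketched as follows (a hypothetical helper in plain Python, just to illustrate how the period selections for each init request line up; the function name and structure are my own, not part of SAP or BW — in practice these ranges go into the InfoPackage selection screen):

```python
# Sketch: generate the period selections for initialising 0EC_PCA_3 in
# fiscal-year buckets (001.YYYY to 012.YYYY), one init request per year.
# Illustration only; not SAP code.

def init_buckets(first_year, last_year):
    """Return one (from_period, to_period) pair per fiscal year."""
    return [(f"001.{year}", f"012.{year}")
            for year in range(first_year, last_year + 1)]

buckets = init_buckets(2004, 2006)
# One init request per pair, e.g. ("001.2004", "012.2004") for fiscal 2004.
```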
Also take a look at the help link below,
(an extract of the same below)
http://help.sap.com/saphelp_nw2004s/helpdata/en/5d/ac7d8082a27b438b026495593821b1/frameset.htm
To optimize the performance of data procurement when DataSource 0EC_PCA_3 accesses line item table GLPCA in the OLTP, you should set up an additional secondary index. This should contain the following fields:
· RCLNT
· CPUDT
· CPUTM
Hope it helps.
Cheers
Anurag
.....don't forget to assign points if it helps.... -
BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?
Hi.
Can anyone advise what the difference is between the following two dataflows for extracting data from SAP R/3?
1) DF1 >> SAP R/3 dataflow (R/3 table - query transformation - .dat file) >> query transformation >> target
This ABAP flow generates an ABAP program and a .dat file.
We can also upload this program and run jobs with the 'Execute preloaded' option on the datastore.
This works fine.
2) We can also pull the SAP R/3 table directly:
DF2 >> SAP R/3 table (this has a red arrow, like in OHD) >> query transformation >> target
This also works fine, and we are able to see the data directly in Oracle.
This can also be scheduled as a job.
But I am unable to understand the purpose of these different types of data extraction flows:
when to use which type of flow for data extraction, and
the advantages/disadvantages of the two dataflows.
What we do not understand is this:
if we can pull data directly from the R/3 table through a query transformation into the target table,
why use the flow of creating an R/3 dataflow,
then doing a query transformation again,
and then populating the target database?
There might be practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
Many thanks
indu
Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM
Hi Jeff,
Greetings. And many thanks for your response.
Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
For this we use an R/3 dataflow and the generated ABAP program, which we upload to the R/3 system
so that we can use the 'Execute preloaded' option to run the jobs.
Since we have no control over our R/3 servers, and no one for ABAP programming,
we do not do anything on the SAP R/3 side.
I was doing some trial-and-error testing on our workflows for our new requirement:
WF 1, which has some 15 R/3 tables.
For each table we have created a separate dataflow.
In some of the dataflows, for the SAP tables that had a lot of rows, I decided to pull the data directly,
bypassing the ABAP flow.
The entire workflow and data extraction still runs fine.
In fact I tried creating a new sample dataflow and tested it
using both direct download and 'Execute preloaded'.
I did not see any major difference in the time taken for data extraction,
because in any case we pull the entire table, then choose what we want to bring into Oracle through a view for our BO reporting, or aggregate and bring the data in as a table for universe consumption.
Actually, I was looking for other options to avoid the ABAP generation and the R/3 dataflow, because we are having problems in our dev and QA environments: delimiter errors. In production it works fine. Production is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and it is these that show the delimiter error.
I did not understand how to resolve it as per this post : https://cw.sdn.sap.com/cw/ideas/2596
While trying to resolve this problem, I ended up trying to pull the R/3 table directly, without the ABAP workflow, just by trial and error with each drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire e-recruiting module of SAP.
I don't know whether I can use this direct pulling of data for the new job I have created,
which has 2 workflows with 15 dataflows in each workflow,
and push this job into production.
I also don't know whether I can bypass the ABAP flow and pull R/3 data directly in all dataflows for ANY of our future SAP R/3 data extraction requirements. The technical difference between the 2 flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular kind of data extraction.
As advised I shall check the schedules for a week, and then we shall move it probably into production.
Thanks again.
Kind Regards
Indu
Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM -
Open data extraction orders - Applying Support Packs
Dear All,
I have done the IDES 4.6C SR2 installation.
While updating the support packs, I get the following message in the
CHECK_REQUIREMENTS phase:
Open data extraction orders
There are still open data extraction orders in the system
Process these before the start of the object import, because changes to the ABAP Dictionary structures could lead to data extraction orders no longer being readable after the import and their processing terminating.
For more details about this problem, see Note 328181.
Go to the Customizing cockpit for data extraction and start the processing of all open extraction orders.
I have checked the Note.
But this is something I am facing for the first time.
Any suggestion!!!
Rgds,
NK
The exact message is:
Phase CHECK_REQUIREMENTS: Explanation of the Errors
Open Data Extraction Requests
The system has found a number of open data extraction requests. These
should be processed before starting the object import process, as
changes to DDIC structures could prevent data extraction requests from
being read after the import, thus causing them to terminate. You can
find more information about this problem in SAP Note 328181.
Call the Customizing Cockpit data extraction transaction and process all
open extraction requests. -
allison.moore
Any plans for adding the following objects under Bulk API V2.0 for data extraction from Eloqua? Extracting the data for these objects using the REST API makes it complicated.
Thanks for the quick response. Extracting these objects using the REST API with depth=complete poses lots of complications from the code perspective, since these objects contain multiple nested or embedded objects. Is there any guideline on how to extract these objects using REST so that we can get all the data required for analysis/reporting?
-
Data Extraction and ODS/Cube loading: New date key field added
Good morning.
Your expert advice is required on the following:
1. A data extract was done previously from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain then first clears all the data in the ODS and cube and then reloads, activates, etc.
2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period, not all the data as before.
3. Data must be appended in future.
4. The current InfoPackage in the ODS is a full upload.
5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
My questions are:
Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
Your assistance will be highly appreciated. Thanks
Cornelius Faurie
Hi Cornelius,
Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
-->> Try to load the data into the ODS in overwrite mode with a full update, as before (this adds new records and overwrites previous records with the latest values). Then push a delta from this ODS to the cube.
If the existing ODS loads in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta or full if possible), and push only the delta onwards to the cube.
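As a rough illustration of what overwrite mode means (an upsert on the ODS key fields), here is a minimal sketch in plain Python; the field names are invented and this is not BW code:

```python
# Sketch: ODS overwrite mode behaves like an upsert keyed on the key fields.
# New keys are appended; an existing key is replaced by the latest record.
# Nothing already loaded is deleted, so history is preserved.

ods = {}  # (forecast_period, item) -> latest record for that key

def load_overwrite(ods, records):
    for rec in records:
        key = (rec["forecast_period"], rec["item"])
        ods[key] = rec  # insert new key or overwrite existing one

# First load: one record for period 001.2011.
load_overwrite(ods, [{"forecast_period": "001.2011", "item": "A", "qty": 10}])
# Next load: a correction for the same key plus a new period.
load_overwrite(ods, [{"forecast_period": "001.2011", "item": "A", "qty": 12},
                     {"forecast_period": "002.2011", "item": "A", "qty": 7}])
# ods now holds two keys, with the corrected qty 12 for 001.2011/A.
```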
Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
--> Yes, it is correct. Otherwise you will lose the historic data.
Hope it Helps
Srini -
Hi All,
I have gone through the SDN link for FI extraction and found it very useful.
Still, I have some doubts...
For line item data extraction: do we need to extract data from 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 into ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain a transformation between the ODS and all three DataSources?
Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.
Jacob Jansen wrote:
> Hi Raj.
>
> Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
>
> br
> jacob
Not necessarily, for systems above Plug-In 2002:
As of Plug-In 2002.2, it is no longer necessary to have DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another (see note 551044).
Source:http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm -
Generic Data Extraction From SAP R/3 to BI server using Function Module
Hi,
I want step by step procedure for Generic Extraction from SAP R/3 application to BI application
using Functional module.
If anybody has a document or PPT, please reply and post it in the forum; I will give points for it.
Thanks & Regards
Subhasis Pan
Please go through this link:
[SAP BI Generic Extraction Using a Function Module|https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/sap%20bi%20generic%20extraction%20using%20a%20function%20module.pdf]
[Generic Data Extraction Using Function Module |Re: Generic Data Extraction Using Function Module; -
Steps for Data extraction from SAP r/3
Dear all,
I am new to SAP BW.
I have done data extraction from Excel into the SAP BW system, roughly like this:
Create InfoObjects > InfoArea > catalog
--> characteristic catalog
--> key figure catalog
Create InfoSource
Upload data
Create InfoCube
I need similar steps for data extraction from SAP R/3:
1. when the data is in Z-tables (using views/InfoSets/function modules etc.)
2. when the data is in standard SAP using Business Content.
Thanks and Regards,
Gaurav Sood
Hi,
Check these links:
Generic Extraction
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
CO-PA
http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
CO-PC
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
Inventory
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
Extractions in BI
https://www.sdn.sap.com/irj/sdn/wiki
LO Extraction:
/people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
/people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
/people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
/people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
/people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
Remya -
How to add new fields to a data extract
The following data extract program generates an output file which is displayed using a publisher template as a check format report.
Oracle Payments Funds Disbursement Payment Instruction Extract 1.0
How would I add new fields to the generated file? In other words what is the procedure to add new fields to the data extract?
Thanks
Can anyone please advise how to customize the payment extraction program? We also have a similar requirement to extract extra fields to format a payment file.
-
'Error 8 occurred when starting the data extraction program'
Hello Experts,
I am trying to pull master data (full upload) for an attribute. I am getting an error on the BW side: 'The error occurred in Service API.'
So I checked in the source system and found that an IDoc processing failure had occurred. The failure shows 'Error 8 occurred when starting the data extraction program'.
But when I check the extractor through RSA3, it looks fine.
Can someone tell me what might be the reason for the IDoc processing failure and how it can be avoided in future? The same problem kept occurring later as well.
Thanks
Regards,
KP
Hi,
Check whether the IDocs from SM58 of the source system are being processed correctly into your target (BI) system.
In SM58, enter * for the user and your BI system as the target destination, execute, and check whether any entries are pending there.
rgds,
Edited by: Krishna Rao on May 6, 2009 3:22 PM -
Vendor Master Data extraction???
Hi,
I need to extract Vendor Master Data from SAP into a flat file.
The format should be similar to file input required for the Vendor Master Upload program: RFBIDE00.
Is there any program which can be used to extract the data in the required format?
Any help is appreciated.
Thanks and Regards,
Varun
Hi Varun,
check this link..
<a href="http://sap.ittoolbox.com/groups/technical-functional/sap-r3-dev/vendor-master-data-extract-from-r3-to-flat-file-944432#">VENDOR MASTER EXTRACTION</a>
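If no standard program fits, a rough sketch of producing such a flat file outside SAP follows; the field list, order and delimiter here are assumptions purely for illustration, not the actual layout expected by RFBIDE00, which you would need to match exactly:

```python
# Sketch: write vendor master records to a delimited flat file.
# Field names (lifnr, name1, land1, ort01) and the tab delimiter are
# assumptions for illustration; align them with the RFBIDE00 input layout.
import csv

vendors = [
    {"lifnr": "0000100001", "name1": "ACME GmbH", "land1": "DE", "ort01": "Berlin"},
    {"lifnr": "0000100002", "name1": "Widget Inc", "land1": "US", "ort01": "Dallas"},
]

with open("vendor_extract.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    for v in vendors:
        writer.writerow([v["lifnr"], v["name1"], v["land1"], v["ort01"]])
```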
<b>Don't forget to mark helpful answers.</b>
Message was edited by: Ashok Kumar Prithiviraj -
Hi,
In my current project there is a requirement for data migration from a legacy system, so can anyone please help me by providing the data extraction template for vendor and customer open line items, G/L balances and the bank directory?
Your help in this regard is highly appreciated.
Thanks
Rajesh. R
Hi Rajesh,
When you extract the data from the legacy system, the points you should keep in mind are:
1. How the organisation structure in the legacy system is mapped to the SAP system, because the data upload into SAP should also happen in the way the reports are expected from SAP.
There is no standard extraction layout; you need to make sure that you extract from the legacy system all the information that needs to be uploaded into SAP.
As an example, fields you may include in the extraction layout are: vendor account, document date, posting date, document type, company code, amount in document currency and in local currency, etc.
2. Please keep in mind that some G/L accounts are maintained in SAP on an open item basis. Your extraction should therefore possibly happen per transaction.
3. Certain G/L accounts are maintained in foreign currency, such as bank G/L accounts. In such cases you need to extract the balance in the foreign currency.
My suggestion is to think the process through first and then go ahead with the extraction.
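As a rough illustration of such a per-item extraction record (the field names are my own, not a standard template; align them with your actual SAP upload structure):

```python
# Sketch of a per-line-item extraction record for legacy open items.
# Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class OpenItemRecord:
    vendor_account: str
    document_date: str      # e.g. "2011-08-22"
    posting_date: str
    document_type: str      # e.g. "KR"
    company_code: str
    amount_doc_curr: float  # amount in document currency
    doc_currency: str       # extract FX items in the original currency
    amount_local_curr: float
    local_currency: str

rec = OpenItemRecord("100001", "2011-08-01", "2011-08-02", "KR",
                     "1000", 150.00, "USD", 105.00, "EUR")
```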
Hope this helps
Regards
Paul -
R/3 SP Import:Open data Extraction requests Error
We are getting the error below while Basis is applying a support package upgrade in the R/3 source system.
Open Data Extraction Requests
The system has found a number of open data extraction requests. These
should be processed before starting the object import process, as
changes to DDIC structures could prevent data extraction requests from
being read after the import, thus causing them to terminate. You can
find more information about this problem in SAP Note 328181.
Call the Customizing Cockpit data extraction transaction and process all
open extraction requests.
Initially we cleared the entries in LBWQ, RSA7 and SM13. But even after clearing these entries we still get the same error.
For a support package upgrade in R/3, do we need to delete the setup tables in the production environment as well?
Is there any other way to upgrade the support package in the R/3 system without deleting the setup tables?
Please help us with your inputs urgently.
Thanks
Message was edited by:
Ganesh S
Thanks Siggi for the suggestion.
We have already cleared the V3 updates by running the RMBWV* jobs manually. After clearing all the entries in LBWQ, RSA7 and SM13 we still got the same error.
When we imported into the R/3 development system after deleting the setup tables, we did not get the error. But we are concerned about deleting the setup tables in the production system, even though the deltas are already running fine.
Is there any other way around this, or is deleting the setup tables the only option?
Please help...
Please help... -
Master data extraction from SAP ECC
Hi All,
I am a newbie here and am teaching myself SAP BI. I have looked through the forums regarding master data extraction from SAP ECC but could not find an answer to my question. Please help me out.
I want to extract customer attributes from SAP ECC. I have identified the standard DataSource 0CUSTOMER_ATTR and replicated it in SAP BI. I have created an InfoPackage for a full update. I checked the extractor in the extractor checker (RSA3) and found 2 records for 0CUSTOMER_ATTR.
When I run the InfoPackage, it remains in yellow status until it times out. Please let me know if I am missing anything, and whether there is any other process for master data extraction, similar to transaction data extraction.
Hi All,
I did the below, and after clicking Execute in the simple job it takes me back to the InfoPackage monitor screen.
From your InfoPackage monitor --> menu Environment --> Job overview --> Job in the source system --> it prompts for the source system (ECC) user ID and password; enter them and you will find your job in SM37 (ECC). Select the job and click the job log icon; there you can see the details of your error. Please share a screenshot of that error.
Please find the screenshots.
I did an Environment --> Check connection and found the below.