FM for extracting data
Hi Friends,
Can anybody tell me whether there is any FM available to extract the data from any database table? Or do we have to create a new FM?
Thanks in Advance.
CALL FUNCTION 'RFC_READ_TABLE'
  DESTINATION rfc_sys
  EXPORTING
    query_table          = 'DOKHL'
*   delimiter            = ' '
*   no_data              = ' '
*   rowskips             = 0
*   rowcount             = 0
  TABLES
    options              = options1
    fields               = nametab1
    data                 = stmp_dokhl
  EXCEPTIONS
    table_not_available  = 1
    table_without_data   = 2
    option_not_valid     = 3
    field_not_valid      = 4
    not_authorized       = 5
    data_buffer_exceeded = 6
    OTHERS               = 7.
Check it...
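RFC_READ_TABLE returns each row of its DATA table as a single fixed-width character string, while the FIELDS table describes each column's name, offset, and length. A minimal sketch of splitting those rows on the client side (Python; the sample field metadata and values below are made up for illustration):

```python
def split_rows(rows, fields):
    """Split fixed-width RFC_READ_TABLE rows into per-field values.

    `fields` mirrors the FIELDS table: one (name, offset, length)
    tuple per column, as returned in FIELDNAME / OFFSET / LENGTH.
    """
    result = []
    for row in rows:
        # Slice each row by the offsets the function module reported.
        record = {name: row[off:off + length].strip()
                  for name, off, length in fields}
        result.append(record)
    return result
```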
Similar Messages
-
How to create a RFC destination for extracting data to HANA
Hello All,
Could someone help me by providing a document or note on how to create an RFC destination for extracting data from an SAP data source to HANA using SAP LT Replication Server?
I am able to create a database connection when transferring data from a non-SAP data source, but wasn't able to transfer data from SAP ABAP tables.
Hi Venkatesh,
In transaction SM59 we create RFC destinations.
Go through this video for creating RFC destinations step by step:
How to setup a trusted RFC connection between SAP systems: a step-by-step guide - YouTube -
BO DI for extracting data from legacy system to SAP BW 7.0
Hi,
Has anyone worked on a scenario where BO Data Integrator is used as the ETL tool for extracting data from legacy systems to SAP BW 7.0?
Please share your experience with such a landscape or any related documents you have come across.
Regards,
Pritesh
There are a couple of them - Lawson ERP, Kalido, AS/400. ECC 5.0 might be there as well.
Regards,
Pritesh.
Edited by: pritesh prakash on Jun 14, 2010 2:00 PM -
BO DI for extracting data from legacy to SAP BW 7.0
Hi,
Has anyone worked on a scenario where BO Data Integrator is used as the ETL tool for extracting data from legacy systems to SAP BW 7.0?
Please share your experience with such a landscape or any related documents you have come across.
Regards,
Pritesh
Hi,
BO Data Integrator connects to another system (database), and you create your reports on top of that DB or system. I never saw it being used to pull data from legacy to BW.
Regards,
Bashir Awan -
Best practice for extracting data to feed external DW
We are having a healthy debate with our EDW team about extracting data from SAP. They want to go directly against ECC tables using Informatica and my SAP team is saying this is not a best practice and could potentially be a performance drain. We are recommending going against BW at the ODS level. Does anyone have any recommendations or thoughts on this?
Hi,
As you asked for Best Practice, here it is in SAP landscape.
1. Full Load or Delta Load data from SAP ECC to SAP BI (BW): SAP BI understands the data element structure of SAP ECC, and the delta mechanism is the continuous process of loading data from SAP ECC (the transaction system) to BI (the analytic system).
2. You can store transaction data in DSOs (granular level) and in InfoCubes (at a summarized level) within SAP BI. You can have master data from SAP ECC coming into SAP BI separately.
3. Within SAP BI, you SHOULD use the OpenHub service to provide SAP BI data to other external systems. You must not connect an external extractor to fetch data from DSOs and InfoCubes to the target system. The OpenHub service is the tool that facilitates data feeding to external systems. You can have Informatica take data from OpenHubs of SAP BI.
Hope I explained this to your satisfaction.
Thanks,
S -
FM code for extracting data from 10 tables
Hi All,
I have a requirement where I need to extract the data from 10 tables using an FM, which is further used to create a DataSource in SAP R/3.
The fields are almost similar in all the tables, but I still have to fetch the data from all of them.
My approach:
1. I have created a structure with all the fields from all the tables, which comes to 35 fields.
2. Create a structure for each table and loop it into an internal table and work area.
3. Write a READ statement based on a key value from all the work areas and populate the result into my final structure.
Note: there is a key field that needs to be considered when updating the data into the final structure.
All help on the code / logic would be appreciated.
Thanks
Moderator message : Spec/requirements dumping not allowed, show the work you have already done. Thread locked.
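As a language-neutral sketch of the key-based merge described in steps 1-3 above (hypothetical field names; the real implementation would be ABAP with READ TABLE ... WITH KEY), one record per key value can be built like this:

```python
def merge_by_key(tables, key):
    """Merge rows from several similarly-shaped tables into one
    record per key value, with later tables filling in their fields.

    `tables` is a list of lists of dicts (one inner list per source
    table); `key` is the shared key field name.
    """
    merged = {}
    for table in tables:
        for row in table:
            # Create the target record on first sight of the key,
            # then fold this table's fields into it.
            merged.setdefault(row[key], {}).update(row)
    return list(merged.values())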
Edited by: Vinod Kumar on Feb 28, 2012 8:35 PM -
Error trying to extract data via HFM objects
I've written a program to extract selected data from HFM (version 11.1.1.3.500) using the API objects. The program (shown at the bottom of this post) is failing on the 2nd of the following 2 lines:
oOption = oOptions.Item(HSV_DATAEXTRACT_OPT_SCENARIO_SUBSET)
oOption.CurrentValue = lBudgetScenario
where oOption is a data load/extract object previously initialized and lBudgetScenario is the long internal ID for our budget scenario.
The error is usually "COM Exception was unhandled" with a result code of "0x800456c7", but, mysteriously, even with no code changes, it sometimes throws the error "FileNotFoundException was not handled", where it says that it could not load "interop.HSXServerlib or one of its dependencies". The second error occurs even though HSXServer was previously initialized in the program and used in conjunction with the login.
I've carefully traced through the VB.NET 2010 code and find that all relevant objects are instantiated and variables correctly assigned. It also occurred to me that the data load DLLs might have been updated when the 11.1.1.3.50 and 500 patches were applied. For that reason, I removed the references to those DLLs, deleted the interop files in the debug and release folders and copied the server versions of those DLLs to my PC. I then restored the DLL references in Visual Studio which recreated the interops. However, the error still occurs.
The ID I'm using (changed to generic names in the code below) has appropriate security and, for example, can be used to manually extract data for the same POV via the HFM client.
I've removed irrelevant lines from the code and substituted a phony ID, password, server name and application name. The line with the error is preceded by the comment "THE LINE BELOW IS THE ONE THAT FAILS".
Imports HSVCDATALOADLib.HSV_DATAEXTRACT_OPTION
Module Module1
Public lActualScenario, lBudgetScenario As Long
Public oClient As HSXCLIENTLib.HsxClient
Public oDataLoad As HSVCDATALOADLib.HsvcDataLoad
Public oOptions As HSVCDATALOADLib.IHsvLoadExtractOptions
Public oOption As HSVCDATALOADLib.IHsvLoadExtractOption
Public oSession As HSVSESSIONLib.HsvSession
Public oServer As HSXSERVERLib.HsxServer
Sub Main()
'Create a client object instance, giving access to
'the methods to logon and create an HFM session
oClient = New HSXCLIENTLib.HsxClient
'Create a server object instance, giving access to
'all server-based methods and properties
oServer = oClient.GetServerOnCluster("SERVERNAME")
'Establish login credentials
oClient.SetLogonInfoSSO("", "MYID", "", "MYPASSWORD")
'Open the application, which will initialize the server
'and session instances as well.
oClient.OpenApplication("SERVERNAME", "Financial Management", "APPLICATION", oServer, oSession)
'Instantiate a data load object instance, which will be used to extract data from
'FRS.
oDataLoad = New HSVCDATALOADLib.HsvcDataLoad
oDataLoad.SetSession(oSession)
'Initialize the data load options interface.
oOptions = oDataLoad.ExtractOptions
'Find the internal ID numbers for various scenarios and years.
'These are required for HFM API function calls.
lActualScenario = GetMemberID(DIMENSIONSCENARIO, "Actual")
lBudgetScenario = GetMemberID(DIMENSIONSCENARIO, "Budget")
'Construct file names for open data.
Dim strFileName As String = "c:\Temp\FEWND_BudgetData.dat"
Dim strLogFileName As String = "c:\Temp\FEWND_BudgetData.log"
'Extract data for the current open cycle.
ExtractData("Budget", BudgetYear, "Dec", strFileName, strLogFileName)
End Sub
Sub ExtractData(ByVal strScenario As String, ByVal strYear As String, ByVal strPeriod As String, _
ByVal strFileName As String, ByVal strLogFileName As String)
'Populate the Scenario element.
oOption = oOptions.Item(HSV_DATAEXTRACT_OPT_SCENARIO_SUBSET)
If strScenario = "Actual" Then
oOption.CurrentValue = lActualScenario
Else
'THE LINE BELOW IS THE ONE THAT FAILS
oOption.CurrentValue = lBudgetScenario
End If
End Sub
Function GetMemberID(ByVal lDimID As Long, ByVal strMemLabel As String) As Long
Dim oMetaData As HSVMETADATALib.HsvMetadata
Dim oEntityTreeInfo As Object
oMetaData = oSession.Metadata
oEntityTreeInfo = oMetaData.Dimension(lDimID)
GetMemberID = oEntityTreeInfo.GetItemID(strMemLabel)
End Function
End Module
I stumbled upon the solution to my problem. The documentation for extracting data via objects defines member ID variables as Longs. In fact, I've always defined such variables as Longs in previous object programs and had no problems. It appears that the data load/extract "option" property CurrentValue is defined as Integer. When I changed all of my member ID items (such as the lBudgetScenario variable that was the right side of the failing assignment statement) to Integers, the program worked.
-
Account based COPA datsource taking long time to extract data
Hi
We have created an account-based COPA DataSource, but it is not extracting data in RSA3 even though the underlying tables have data in them.
If the COPA DataSource is created using fields only from CE4 (segment) and not CE1 (line items), then it extracts data, but even then only after a very long time.
If the COPA DataSource is created using fields from both CE4 (segment) and CE1 (line items), then it does not extract any records and RSA3 gives a timeout error.
Also, the job scheduled from the BW side for extracting data runs for days but does not fetch any data, and neither does it give any error.
The COPA tables have a huge amount of data, so performance could be an issue. But we have also created indexes on them. Still it is not helping.
Please suggest a solution to this...
Thanks
Gaurav
Hi Gaurav,
Check note 392635, it might be useful.
Regards
Jagadish
Symptom
The process of selecting the data source (line item, totals table or summarization level) by the extractor is unclear.
More Terms
Extraction, CO-PA, CE3XXXX, CE1XXXX, CE2XXXX, costing-based, account-based, profitability analysis, reporting, BW reporting, extractor, plug-in, COEP, performance, upload, delta method, full update, CO-PA extractor, read, datasource, summarization level, init, DeltaInit, Delta Init
Cause and Prerequisites
At the time of the data request from BW, the extractor determines the data source that should be read. In this case, the data source to be used depends on the update mode (full initialization of the delta method or delta update), and on the definition of the DataSource (line item characteristics (except for the REC_WAERS field) or calculated key figures) and the existing summarization levels.
Solution
The extractor always tries to select the most favorable source, that is, the one with the lowest dataset. The following restrictions apply:
o Only the 'Full' update mode from summarization levels is
supported during extraction from the account-based profitability
analysis up to and including Release PI2001.1. Therefore, you can
only ever load individual periods for a controlling area. You can
also use the delta method as of Release PI2001.2. However, the
delta process is only possible as of Release 4.0. The delta method
must still be initialized from a summarization level. The following
delta updates then read line items. In the InfoPackage, you must
continue to select the controlling area as a mandatory field. You
then no longer need to make a selection on individual periods.
However, the period remains a mandatory field for the selection. If
you do not want this, you can proceed as described in note 546238.
o To enable reading from a summarization level, all characteristics
that are to be extracted with the DataSource must also be contained
in this level (entry * in the KEDV maintenance transaction). In
addition, the summarization level must have status 'ACTIVE' (this
also applies to the search function in the maintenance transaction
for CO-PA data sources, KEB0).
30.03.2009 Page 2 of 3
SAP Note 392635 - Information: Sources with BW extraction from the CO-PA
o For DataSources of the costing-based profitability analysis, data
can only be read from a summarization level if no other
characteristics of the line item were selected (the exception here
is the 'record currency' (REC_WAERS) field, which is always
selected).
o An extraction from the object level, that is, from the combination
of tables CE3XXXX/CE4XXXX ('XXXX' is the name of the result area),
is only performed for full updates if (as with summarization
levels) no line item characteristics were selected. During the
initialization of the delta method this is very difficult to do
because of the requirements for a consistent dataset (see below).
o During initialization of the delta method and subsequent delta
update, the data needs to be read up to a defined time. There are
two possible sources for the initialization of the delta method:
- Summarization levels manage the time of the last update/data
reconstruction. If no line item characteristics were selected
and if a suitable, active summarization level (see above)
exists, the DataSource 'inherits' the time information of the
summarization level. However, time information can only be
'inherited' for the delta method of the old logic (time stamp
administration in the profitability analysis). As of PlugIn
Release PI2004.1 (Release 4.0 and higher), a new logic is
available for the delta process (generic delta). For
DataSources with the new logic (converted DataSources or
DataSources recreated as of Plug-In Release PI2004.1), the line
items that appear between the time stamp of the summarization
level and the current time minus the security delta (usually 30
minutes) are also read after the suitable summarization level
is read. The current time minus the security delta is set as
the time stamp.
- The system reads line items if it cannot read from a
summarization level. Since data can continue to be updated
during the extraction, the object level is not a suitable
source because other updates can be made on profitability
segments that were already updated. The system would have to
recalculate these values by reading of line items, which would
result in a considerable extension of the extraction time.
In the case of delta updates, the system always reads from line
items.
o During extraction from line items, the CE4XXXX object table is read
as an additional table for the initialization of the delta method
and full update so that possible realignments can be taken into
account. In principle, the CE4XXXX object table is not read for
delta updates. If a realignment is performed in the OLTP, no
further delta updates are possible as they would make the data
inconsistent between OLTP and BW. In this case, a new
initialization of the delta method is required.
o When the system reads data from the line items, make sure that the
indexes from note 210219 for both the CE1XXXX (actual data) and
CE2XXXX (planning data) line item tables have been created.
Otherwise, you may encounter long-running selections. For
archiving, appropriate indexes are delivered in the dictionary as
of Release 4.5. These indexes are delivered with the SAP standard
system but still have to be created on the database. -
Error while extracting data into 0GL_ACCOUNT
Hi all,
The initialize delta process has been done successfully for extracting data into 0GL_ACCOUNT using DataSource 0GL_ACCOUNT_ATTR.
However, when using delta update for extracting data, the following error is displayed: 'ALE change pointers are not set up correctly', and it asks to activate the change pointer in BD61 (source system).
I am not exactly sure what I am supposed to do in BD61. Please help.
Regards,
Srikar
Hi Sushant,
Thanks for the reply.
I have checked in RSA2 for DataSource 0GL_ACCOUNT_ATTR.
All the entries are present; however, the status of SAKAN is not successful (a criss-cross shape is displayed).
What am I supposed to do?
Regards,
Srikar -
Hello,
I am building a plugin for extracting data from a binary file.
My problem is that the scaling of the values is not defined by a linear equation (y = a1*x + a0, where conversion is easy with the Factor and Offset properties of Channel), but by a cubic equation (y = a3*x^3 + a2*x^2 + a1*x + a0) with 4 coefficients. For the moment, I perform the operation in a loop over all the values of the channel... this takes a lot of time to convert all the values!!! Is there a more efficient way to multiply arrays in VBA, or channels in DIAdem (for plugins)?
Hi Ollac,
There is no way to efficiently apply polynomial scaling in a DataPlugin. You could create each of the polynomial component terms with an eMultiplyProcessor type ProcessedChannel in the DataPlugin, but we can't add them together with another ProcessedChannel because you can't add a ProcessedChannel to another ProcessedChannel. If the manual cell-by-cell scaling that you've already tried is too woefully slow, then I'd suggest exposing the raw data values to channels in the Data Portal and adding the "a0", "a1", "a2", "a3" polynomial coefficients as channel properties. Once the raw data is in DIAdem, you can use the ChnCalculate() or the older and faster FormulaCalc() command to apply the polynomial scaling in-place based on the coefficients in the channel properties. You might want the Data Plugin to add the "_Raw" suffix to the channel names and have the scaling VBScript remove the "_Raw" suffix or replace it with a "_Scaled" suffix so you don't accidentally apply the polynomial scaling twice.
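As a sketch of the cubic scaling outside the DataPlugin API (plain Python, with the coefficient names a0..a3 from the question), Horner's form cuts the per-sample work to three multiplies and three adds, which matters in interpreted loops:

```python
def scale_cubic(raw, a0, a1, a2, a3):
    """Apply y = a3*x^3 + a2*x^2 + a1*x + a0 to every sample.

    Horner's form ((a3*x + a2)*x + a1)*x + a0 evaluates the cubic
    with three multiplications and three additions per value.
    """
    return [((a3 * x + a2) * x + a1) * x + a0 for x in raw]
```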
Ask if you have questions about this,
Brad Turpin
DIAdem Product Support Engineer
National Instruments -
I'm fairly new to XML & web services, so bear with me please.
I'm trying to extract data from XML documents where the structure/element names are unknown.
The condition for extracting data is whether a keyword matches some data content.
I'm using Java and the DOM API with Xerces, and I've managed to parse the XML file. What I need to do next is traverse the data content checking against keywords, and if a keyword matches, strip the tags and save the data locally.
I've trawled through all the documentation, but as there are so many different methods I'm somewhat unsure which is the best way to proceed.
Thanks
Tough question. I would suggest looking at JDOM as a solution for you, because it is a relatively simple one to use for searching through a DOM. Aside from that, it's probably up to you what sort of logic you want to use.
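For comparison, here is a minimal sketch of the keyword-matching traversal using Python's built-in xml.etree instead of Xerces/DOM or JDOM (the document and keyword below are illustrative):

```python
import xml.etree.ElementTree as ET

def find_matching_text(xml_string, keyword):
    """Walk every element of a document with unknown structure and
    collect the (tag-stripped) text of elements containing the keyword."""
    root = ET.fromstring(xml_string)
    matches = []
    for elem in root.iter():  # depth-first over all elements
        if elem.text and keyword in elem.text:
            matches.append(elem.text.strip())
    return matches
```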
-
I have tried this!
When I use SSIS to extract data from SSAS, that means I use an MDX query; then a random error occurs.
Hope someone can understand my poor English...
The error info is shown below.
Code Snippet
Error: 0xC0202009 at Data Flow Task - For Individual User Tech Points, OLE DB Source 1 1 [31]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E05.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for Analysis Services 2005" Hresult: 0x00000001 Description: "Error Code = 0x80040E05, External Code = 0x00000000:.".
Error: 0xC004701A at Data Flow Task - For Individual User Tech Points, DTS.Pipeline: component "OLE DB Source 1 1" (31) failed the pre-execute phase and returned error code 0xC0202009.
I have had the same error on SQL2008 and now on SQL2012 SSIS, but have been able to eliminate / work around it.
We have a Loop Container in our Control Flow that contains a data-flow task with an MDX source. The MDX query for the data-flow source is dynamically built (via an expression) on each iteration of the Loop Container (however, it always returns the "same shaped" results - only the filters in the WHERE clause are different).
We've found the error to be somewhat intermittent - sometimes the package will complete successfully, other times it will fail with the 0x80040E05 error at varying iterations thru the container loop.
To alleviate the problem we setup the SQL Agent job-step for this package to re-try on failure for up to 5 retries - not an ideal workaround, but it helped to improve the success rate of the Job.
We have no idea why this error is occurring or what is causing it; however, it appears to be timing-related in some way, and I have only seen the issue when using an SSAS OLE DB data source with a dynamically generated MDX query. I have managed to virtually eliminate the error with a not-ideal workaround in the SSIS package - no idea why this works/helps (hopefully Microsoft will be able to work it out and resolve the issue, as it's been plaguing us since SQL2008 and is still here in SQL2012 SP1...).
Workaround for MDX causing the 0x80040E05 error:
Within our Loop Container we have added a Script task with an OnSuccess precedence constraint to the data-flow task that contains the dynamically generated MDX source query. The Script task simply introduces a WAIT of about 5 seconds immediately after the data-flow task completes, before allowing SSIS to continue with the next iteration (e.g. System.Threading.Thread.Sleep(5000)).
With this delay in place we have had much more stable SSIS package executions - don't know why, but that's what we have observed. Also note that when we migrated to SQL2012 SSIS packages the 0x80040E05 error returned; however, we were able to eliminate it once more by increasing the WAIT time to 10 seconds on this Script task.
Now, waiting for 10 seconds is not an ideal solution/workaround to this problem - particularly when it is contained within a Loop Container (in our case it has added nearly 30 minutes of "WAIT time" to the package execution duration) - however this workaround is better than having the package fail 80%+ of the time...
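The job-step retry plus fixed post-step delay can be sketched generically (Python stand-in for the SQL Agent retry and the Script-task Sleep; the function names are hypothetical):

```python
import time

def run_with_retries(step, retries=5, delay_seconds=0):
    """Run a flaky step, retrying up to `retries` times.

    The optional fixed delay after a successful attempt mimics the
    Script-task WAIT the workaround adds inside the loop container.
    """
    for attempt in range(1, retries + 1):
        try:
            result = step()
            time.sleep(delay_seconds)  # settle time before the next iteration
            return result
        except RuntimeError:
            if attempt == retries:
                raise  # give up after the last allowed attempt
```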
regards,
Piquet -
ODI : extract datas from HFM filtering on base members of certain accounts
Does anyone know how to filter only base members for certain accounts when extracting data?
I need to extract data for level-0 members of certain accounts, which is about 300. If I don't put the filter on, it's more than 2000 records.
Hi - unfortunately you cannot do that in the current KM. We faced the same issue and logged it with Oracle. It's being considered as an enhancement, not as a bug, so we have to wait for future releases. As a workaround we created Task Lists in HFM and extracted data that way.
-app -
Extract data from Essbase - MDX/Report Script
Hi,
I was trying to extract data from an ASO cube using both MDX and Report Script approaches.
I liked the MDX method since I could specify:
- level zero members with values for the Revenue measure.
SELECT
{[Revenue]} on columns,
NON EMPTY{[Accounts].Levels(0).Members} on AXIS(1),
NON EMPTY {[Accounts2].Levels(0).Members} on AXIS(2),
NON EMPTY {[Accounts3].Levels(0).Members} on AXIS(3),
NON EMPTY {[Accounts4].Levels(0).Members} on AXIS(4)
FROM CUBER.CUBER
Problem: For another measure, it shows many #Missing values in the Cost measure field. Not sure how that happened. I tried to use:
SELECT
{[Revenue]} on columns,
NON EMPTY { FILTER ([Accounts].Levels(0).Members, [Revenue] <> Missing) } on AXIS(1),
NON EMPTY {[Accounts2].Levels(0).Members} on AXIS(2)
FROM CUBER.CUBER
But it still does not solve the issue.
Then I tried the Report Script approach. However, it is not simple to declare that I want all level zero members within dimension Account1 (which has a 5-level hierarchy, with level zero members at almost every level) and only entries that have values for my measures "Revenue" or "Cost".
Hi Ye,
For extracting data from Essbase you need either a calc script or a report script, and you have to use LKM Essbase (Data) to SQL as your KM.
For loading data into SQL Server you can use IKM SQL Append or IKM SQL Incremental Update.
When you extract metadata from Essbase, you have to use LKM Essbase (Metadata) to SQL as your KM.
So all sources have LKMs (as KMs) and all targets have IKMs as KMs.
-app -
How to extract data from table for huge volume
Hi,
I have around 200,000 material document numbers for which I need to get the material number from the MSEG table, but when using SE16 it gives a dump. I have even tried breaking it into batches of 20,000 records, but SAP still gives a dump on executing SE16 for MSEG. Please advise if there is any alternate way to get data from SE16 for such a large volume.
Note: In our system SE16N does not work; only SE16 is there for our SAP version.
Thanks,
Vihaan
Hi Jurgen,
Thanks for your reply.
I am getting a dump when I enter more than 5000 records as input parameters in MSEG; if I put in more than that, it gives the dump "ABAP runtime errors SAPSQL_STMNT_TOO_LARGE".
I understand that I can extract data restricting to 5000 every time, but I have around 250,000 material docs, so if we consider batches of 5000 I need to run the step 50 more times --> 50 Excel files. I wanted to avoid that as it is going to take a lot of my time.
Any suggestion? Please help.
Also wanted to highlight that apart from the material doc number I am entering plant (8 plants) and movement type (14 movement types) as input parameters.
Regards,
Vihaan
Edited by: Vihaan on Mar 25, 2010 12:30 AM
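Batching a large key list below the statement-size limit can be sketched as follows (Python; the 5000-record chunk size comes from the dump threshold mentioned above and is illustrative):

```python
def chunked(items, size):
    """Yield consecutive slices of at most `size` elements, so each
    slice can feed one bounded selection (e.g. one SE16 run or one
    FOR ALL ENTRIES-style query)."""
    for start in range(0, len(items), size):
        yield items[start:start + size]
```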