How to extract data for two particular members of the same dimension
As per the requirement, I need to export data for certain members of a dimension. Let's say we need data for two account members, A and B, which are in the Account dimension but are not direct children of it. I need the data for all available years too. Please suggest how my DATAEXPORT command should look.
When I use an AND-style selection it does not behave as expected. Say I fix on years 2007 and 2009, but the output file comes out for 2009 and 2010.
Something similar happens when I fix on OPEX_31 and OPEX_32: the values come out not only for OPEX_31 and OPEX_32 but for many other accounts too.
Here is my DATAEXPORT script for reference:
SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportColFormat ON;
DataExportDimHeader ON;
DataExportOverwriteFile ON;
};
FIX("LC","Total_Year","ESB1","2009","SIERRA","COSTCENTER_NA","CELLULAR_NA","OPEX_31",
"January","February","March","April","May","June","July","August","September","October","November","December")
DATAEXPORT "File" " " "D:\exports\feb.txt";
ENDFIX;
I need data for OPEX_31 and OPEX_32 for all available years, from 2001 to 2025.
Please suggest what modifications are needed to get the desired result.
Thanks in advance
Hi,
There are a few different options you can use for fixing on the months and years, e.g.
FIX("January":"December")
or
FIX(@CHILDREN("YearTotal")) <- depends on what the parent of the months is
The same goes for years:
FIX("2009":"2025")
or
FIX(@CHILDREN("Year"))
If your period dimension is dense you can always use it as the column header, e.g. DataExportColHeader "Period", and then fix on the accounts you require.
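Putting those options together, a corrected export script might look like the following. This is only a sketch: the member names, the OPEX accounts, the file path, and the 2001-2025 year range are taken from the question; year members are quoted because member names starting with digits require quotes; Total_Year is omitted on the assumption that the month members cover the period dimension. Adjust everything to your own outline.

```
SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportColFormat ON;
DataExportDimHeader ON;
DataExportOverwriteFile ON;
};

FIX("LC","ESB1","SIERRA","COSTCENTER_NA","CELLULAR_NA",
    "OPEX_31","OPEX_32",
    "2001":"2025",
    "January":"December")
    DATAEXPORT "File" " " "D:\exports\feb.txt";
ENDFIX
```

Note that FIX takes no semicolon after its member list; the stray semicolon there is one common reason a DATAEXPORT script misbehaves.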
Cheers
John
http://john-goodwin.blogspot.com/
Similar Messages
-
How should I restrict data for a particular month?
Hi Experts,
How can I retrieve data for a particular month without a WHERE condition?
Hi,
You must have a Month object in your Webi document, right?
The user will be given the opportunity to select his/her required [Month].
Create a variable as =UserResponse("Enter value for Month") (the same text as you give in the prompt text), and name it [UMonth].
Go to the Analysis tab > Filter > Add filter > [Month]=[UMonth].
This report will then always run for the month the user requires. -
How to use two members from same dimension in Dimension Layout, for HFR
I am developing a report in Hyperion Financial Reporting. I have placed two dimensions in the rows and the remaining dimensions in the columns. After completing one section, I need to create a new section of the report on the same grid and page by adding one more member of the second row dimension; that is, the data should be the summation of two members of that dimension, say "ACCOUNT".
Kindly guide me through the steps. Also, please note that I am a beginner with Hyperion Financial Reporting.
Thanks
Roshan
Can you give some explanation of your grid layout and what you are trying to achieve?
Like Account and Scenario in row
Year, Market in columns
Product in Page
Regards
Celvin
http://www.orahyplabs.com
Please mark the responses as helpful/correct if applicable -
How to Extract Data for a Maintenance View, Structure and Cluster Table
I want to develop 3 Reports
1) The first report consists of only two fields.
Table name : V_001_B
Field Name1: BUKRS
Table name : V_001_B
Field Name2: BUTXT
V_001_B is a Maintenance View
For this one I could not find any DataSource.
For this maintenance view, how can I extract the data?
2)
For the 2nd Report also it consists Two Fields
Table name : CSKSZ
Field Name1: KOSTL (cost center)
Table name : CSKSZ
Field Name2: KLTXT (Description)
CSKSZ is a Structure
For this one I could not find any DataSource.
For this structure, how can I extract the data?
3)
For the 3rd Report
in this Report all Fields are belonging to a Table BSEG
BSEG is a Cluster Table
For this one also I can't find any DataSource;
I find very few objects in the DataSources.
For this one, how can I extract the data?
Please provide me step by step procedure.
Thanks
Priya
Hi Sachin,
I don't get your point; can you explain it briefly?
I have two Fields for the 1st Report
BUKRS
BUTXT
In the 2nd Report
KOSTL
KLTXT
If I use 0COSTCENTER_TEXT Data Source
I will get KOSTL Field only
what about KLTXT
Thanks
Priya -
How to see data for particular date from a alert log file
Hi Experts,
I would like to know how I can see data for a particular date from alert_db.log in a Unix environment. I'm using Oracle 9i on Unix.
Right now I'm using tail -500 alert_db.log > alert.txt and then viewing the whole thing. But is there an easier way to see a particular date or time?
Thanks
Shaan
Hi Jaffar,
Here I have to pass the exact date and time. Is there any way to see records for, say, Nov 23 2007? Because when I used
tail -500 alert_sid.log | grep " Nov 23 2007" > alert_date.txt
it did not work. Here is the sample log file:
Mon Nov 26 21:42:43 2007
Thread 1 advanced to log sequence 138
Current log# 3 seq# 138 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
Mon Nov 26 21:42:43 2007
ARCH: Evaluating archive log 1 thread 1 sequence 137
Mon Nov 26 21:42:43 2007
ARC1: Evaluating archive log 1 thread 1 sequence 137
ARC1: Unable to archive log 1 thread 1 sequence 137
Log actively being archived by another process
Mon Nov 26 21:42:43 2007
ARCH: Beginning to archive log 1 thread 1 sequence 137
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_137
.dbf'
ARCH: Completed archiving log 1 thread 1 sequence 137
Mon Nov 26 21:42:44 2007
Thread 1 advanced to log sequence 139
Current log# 2 seq# 139 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
Mon Nov 26 21:42:44 2007
ARC0: Evaluating archive log 3 thread 1 sequence 138
ARC0: Beginning to archive log 3 thread 1 sequence 138
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_138
.dbf'
Mon Nov 26 21:42:44 2007
ARCH: Evaluating archive log 3 thread 1 sequence 138
ARCH: Unable to archive log 3 thread 1 sequence 138
Log actively being archived by another process
Mon Nov 26 21:42:45 2007
ARC0: Completed archiving log 3 thread 1 sequence 138
Mon Nov 26 21:45:12 2007
Starting control autobackup
Mon Nov 26 21:45:56 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0033'
handle 'c-2861328927-20071126-01'
Clearing standby activation ID 2873610446 (0xab47d0ce)
The primary database controlfile was created using the
'MAXLOGFILES 5' clause.
The resulting standby controlfile will not have enough
available logfile entries to support an adequate number
of standby redo logfiles. Consider re-creating the
primary controlfile using 'MAXLOGFILES 8' (or larger).
Use the following SQL commands on the standby database to create
standby redo logfiles that match the primary database:
ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
Tue Nov 27 21:23:50 2007
Starting control autobackup
Tue Nov 27 21:30:49 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0280'
handle 'c-2861328927-20071127-00'
Tue Nov 27 21:30:57 2007
ARC1: Evaluating archive log 2 thread 1 sequence 139
ARC1: Beginning to archive log 2 thread 1 sequence 139
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_139
.dbf'
Tue Nov 27 21:30:57 2007
Thread 1 advanced to log sequence 140
Current log# 1 seq# 140 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo1.log
Tue Nov 27 21:30:57 2007
ARCH: Evaluating archive log 2 thread 1 sequence 139
ARCH: Unable to archive log 2 thread 1 sequence 139
Log actively being archived by another process
Tue Nov 27 21:30:58 2007
ARC1: Completed archiving log 2 thread 1 sequence 139
Tue Nov 27 21:30:58 2007
Thread 1 advanced to log sequence 141
Current log# 3 seq# 141 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
Tue Nov 27 21:30:58 2007
ARCH: Evaluating archive log 1 thread 1 sequence 140
ARCH: Beginning to archive log 1 thread 1 sequence 140
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_140
.dbf'
Tue Nov 27 21:30:58 2007
ARC1: Evaluating archive log 1 thread 1 sequence 140
ARC1: Unable to archive log 1 thread 1 sequence 140
Log actively being archived by another process
Tue Nov 27 21:30:58 2007
ARCH: Completed archiving log 1 thread 1 sequence 140
Tue Nov 27 21:33:16 2007
Starting control autobackup
Tue Nov 27 21:34:29 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0205'
handle 'c-2861328927-20071127-01'
Clearing standby activation ID 2873610446 (0xab47d0ce)
The primary database controlfile was created using the
'MAXLOGFILES 5' clause.
The resulting standby controlfile will not have enough
available logfile entries to support an adequate number
of standby redo logfiles. Consider re-creating the
primary controlfile using 'MAXLOGFILES 8' (or larger).
Use the following SQL commands on the standby database to create
standby redo logfiles that match the primary database:
ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
Wed Nov 28 21:43:31 2007
Starting control autobackup
Wed Nov 28 21:43:59 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0202'
handle 'c-2861328927-20071128-00'
Wed Nov 28 21:44:08 2007
Thread 1 advanced to log sequence 142
Current log# 2 seq# 142 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
Wed Nov 28 21:44:08 2007
ARCH: Evaluating archive log 3 thread 1 sequence 141
ARCH: Beginning to archive log 3 thread 1 sequence 141
Wed Nov 28 21:44:08 2007
ARC1: Evaluating archive log 3 thread 1 sequence 141
ARC1: Unable to archive log 3 thread 1 sequence 141
Log actively being archived by another process
Wed Nov 28 21:44:08 2007
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_141
.dbf'
Wed Nov 28 21:44:08 2007
ARC0: Evaluating archive log 3 thread 1 sequence 141
ARC0: Unable to archive log 3 thread 1 sequence 141
Log actively being archived by another process
Wed Nov 28 21:44:08 2007
ARCH: Completed archiving log 3 thread 1 sequence 141
Wed Nov 28 21:44:09 2007
Thread 1 advanced to log sequence 143
Current log# 1 seq# 143 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo1.log
Wed Nov 28 21:44:09 2007
ARCH: Evaluating archive log 2 thread 1 sequence 142
ARCH: Beginning to archive log 2 thread 1 sequence 142
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_142
.dbf'
Wed Nov 28 21:44:09 2007
ARC0: Evaluating archive log 2 thread 1 sequence 142
ARC0: Unable to archive log 2 thread 1 sequence 142
Log actively being archived by another process
Wed Nov 28 21:44:09 2007
ARCH: Completed archiving log 2 thread 1 sequence 142
Wed Nov 28 21:44:36 2007
Starting control autobackup
Wed Nov 28 21:45:00 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0202'
handle 'c-2861328927-20071128-01'
Clearing standby activation ID 2873610446 (0xab47d0ce)
The primary database controlfile was created using the
'MAXLOGFILES 5' clause.
The resulting standby controlfile will not have enough
available logfile entries to support an adequate number
of standby redo logfiles. Consider re-creating the
primary controlfile using 'MAXLOGFILES 8' (or larger).
Use the following SQL commands on the standby database to create
standby redo logfiles that match the primary database:
ALTER DATABASE ADD STANDBY LOGFILE 'srl1.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl2.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl3.f' SIZE 10485760;
ALTER DATABASE ADD STANDBY LOGFILE 'srl4.f' SIZE 10485760;
Thu Nov 29 21:36:44 2007
Starting control autobackup
Thu Nov 29 21:42:53 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0206'
handle 'c-2861328927-20071129-00'
Thu Nov 29 21:43:01 2007
Thread 1 advanced to log sequence 144
Current log# 3 seq# 144 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo3.log
Thu Nov 29 21:43:01 2007
ARCH: Evaluating archive log 1 thread 1 sequence 143
ARCH: Beginning to archive log 1 thread 1 sequence 143
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_143
.dbf'
Thu Nov 29 21:43:01 2007
ARC1: Evaluating archive log 1 thread 1 sequence 143
ARC1: Unable to archive log 1 thread 1 sequence 143
Log actively being archived by another process
Thu Nov 29 21:43:02 2007
ARCH: Completed archiving log 1 thread 1 sequence 143
Thu Nov 29 21:43:03 2007
Thread 1 advanced to log sequence 145
Current log# 2 seq# 145 mem# 0: /oracle/NEWDB/oradata/NEWDB/redo2.log
Thu Nov 29 21:43:03 2007
ARCH: Evaluating archive log 3 thread 1 sequence 144
ARCH: Beginning to archive log 3 thread 1 sequence 144
Creating archive destination LOG_ARCHIVE_DEST_1: '/oracle/NEWDB/admin/arch/1_144
.dbf'
Thu Nov 29 21:43:03 2007
ARC0: Evaluating archive log 3 thread 1 sequence 144
ARC0: Unable to archive log 3 thread 1 sequence 144
Log actively being archived by another process
Thu Nov 29 21:43:03 2007
ARCH: Completed archiving log 3 thread 1 sequence 144
Thu Nov 29 21:49:00 2007
Starting control autobackup
Thu Nov 29 21:50:14 2007
Control autobackup written to SBT_TAPE device
comment 'API Version 2.0,MMS Version 5.0.0.0',
media 'WP0280'
handle 'c-2861328927-20071129-01'
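For reference, the grep above misses because alert-log timestamps look like "Mon Nov 26 21:42:43 2007" (day-of-week first, time in the middle), so the literal " Nov 23 2007" never occurs, and the message lines under a timestamp carry no date at all. One way around that is a small state-tracking filter; this is a sketch, and the entries_for_date helper is ours, not an Oracle tool:

```python
import re

# Timestamp lines look like "Mon Nov 26 21:42:43 2007":
# day-of-week, month, day, time, year.
TS = re.compile(r"^\w{3} (\w{3}) (\d{1,2}) \d{2}:\d{2}:\d{2} (\d{4})$")

def entries_for_date(lines, month, day, year):
    """Yield alert-log lines belonging to entries stamped on the given date.

    A timestamp line switches the filter on or off; message lines inherit
    the state of the most recent timestamp above them.
    """
    in_range = False
    for line in lines:
        m = TS.match(line.rstrip("\n"))
        if m:
            in_range = (m.group(1), int(m.group(2)), int(m.group(3))) == (month, day, year)
        if in_range:
            yield line.rstrip("\n")

sample = """Mon Nov 26 21:42:43 2007
Thread 1 advanced to log sequence 138
Tue Nov 27 21:23:50 2007
Starting control autobackup"""

# Prints the Nov 26 stamp line plus its message line only.
print("\n".join(entries_for_date(sample.splitlines(), "Nov", 26, 2007)))
```

A plain grep such as grep "Nov 26 .* 2007" would match only the timestamp lines themselves, which is why some state between lines is needed to recover the messages under each stamp.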
Thanks
Shaan -
How to extract data for 0FI_GL_50
hi guys,
We need to use DataSource 0FI_GL_50; delta is not supported for this DataSource. How can we extract data from ECC?
Hi Ester,
If the DataSource doesn't support delta loads, then you can't change that.
For an ODS it will not be a problem if you keep the aggregation type as "Overwrite".
But if you want aggregation type Summation at the ODS level, then you have to delete the content of the ODS before loading.
At the InfoCube level there will be duplicate records, but you can avoid them in the following way:
Use a full DTP to load from the ODS to the cube, and in the process chain include the process "Delete Overlapping Requests from InfoCube"; it will delete the previous request, so there will be no duplication of records.
Actually, what I feel is that there should be three stages of loading:
a) At the first level there should be an ODS with aggregation type Overwrite, so there is no need to delete the content.
b) At the second level, an ODS with aggregation type Summation; deletion of the content is required.
c) Then load from the ODS to the cube, and include the step "Delete Overlapping Requests from InfoCube" at the end.
Regards,
Debjani -
How to extract data for condition records with different valid periods
Dear Gurus,
There was an error in a recent condition-record price upload. Now I want to extract the data into an Excel file for different validity periods. Please let me know how I can do this. In the extracted file I would like fields such as valid-from and valid-to dates, condition type, price list type, material, condition record number, last changed date, last changed user ID, etc. Please suggest. Thanks in advance.
rgds
Suri
Hi Suresh,
Create a quick query (SQVI) and extract table KONP to Excel.
The only thing you need to take care of is the table join: the fields DATAB (valid from) and DATBI (valid to) are not in KONP; you need to take them from the right condition table (the A-tables).
Anil -
How to extract data for this tcode
Hello BW Experts,
I have a transaction code in R/3. I observe that it uses a program P1 and a structure S1 to generate the table display in the R/3 transaction.
I go to SE38 and display the code; it seems to be a module pool.
Any suggestions for extraction of this data is highly appreciated.
Thanks,
BWer
Hi,
If you know ABAP, analyze the code to see which tables are being updated by it. Based on these tables you can either create a generic extractor or search for Business Content.
You can also use the SQL trace to find the tables:
Go to ST05 -> switch on the SQL trace -> go to your custom transaction -> create an entry -> come back to ST05 and deactivate the trace -> then display the trace.
There you will see which tables are used by this transaction.
Hope this helps.
All the best.
Regards,
Nagesh Ganisetti. -
How to extract data for these fields from R/3 ?
Hi frnds,
My requirement is to extract the following MM-related fields from R/3 to BI: condition type, condition value, bill of lading, quantity in external delivery note, different invoicing party, and invoicing document. But I found that they are not in any purchasing or inventory DataSources (except different invoicing party, but its values are not picked up by the standard extractors).
Anyone please tell me are there any standard datasources for extracting these fields ?
If not, and I want to enhance DataSources for these fields, which DataSource do I need to enhance, i.e. 2LIS_02_ITM, 2LIS_02_SCL, 2LIS_02_S012, or 2LIS_03_BF, 2LIS_03_UM?
Full points will be assigned.
Regards,
Bhadri M.
Hi,
I think you have filled the setup table and replicated the DataSource, and after replicating you have activated it on the BI side.
Then it should extract the data.
Are you on BI 7.0?
Try one thing:
Replicate the DataSource again, and then activate the transfer rules using program RS_TRANSTRU_ACTIVATE_ALL.
Also check the Selection tab of your InfoPackage; you may be giving some selection. Also check whether the data comes through to the PSA; if it does, then check your transformation and DTP.
It may help you.
Regards,
Debjani
Edited by: Debjani Mukherjee on Sep 21, 2008 11:18 AM -
Display two members of same dimension hierarchy using mdx on 2 diff columns
Hi,
Is it possible to print the parent of a member that belongs to a certain level in the hierarchy? For this example, the dimension hierarchy is a variable hierarchy based on a row source - a BOM (Bill of Materials).
Let's say there is a BOM hierarchy with 4 levels:
Finished Good - Comp1 - Comp2 - Comp3.
Then we want to display it in the IOP report as follows (two columns):
Finished Good     Supply Chain      <- these are the column headers
Finished Good1    Finished Good1
                  Comp1
                  Comp2
Finished Good2    Finished Good2
                  Comp3
                  Comp4
                  Comp3
Edited by: user596497 on Mar 16, 2011 1:39 PM
The question is wrong ...
MDX helps you construct a cross-product of indices for the underlying cube and then tells the system to find whether there is any cell with that index; if so, it returns a value for that combination.
If you put two dimensions next to each other, what does it mean? There is no such combination in the underlying cube, as each dimension is uniquely represented. We will throw an error as follows.
You cannot mix up display semantics with the syntax/semantics of MDX.
"Can only cross join sets with different dimensions Statement"
An internal error occurred during the query execution phase. Please contact your administrator.
Query Definition
WITH SET [AnotherGeography] AS '{[Geography].DefaultMember}' SELECT {Fiscal.DefaultMember} ON COLUMNS, { CrossJoin( {[Geography].DefaultMember}, {[Customers].DefaultMember}, {[AnotherGeography]}, {Hierarchize(Descendants([Product].DefaultMember, [Product].[Product Family], SELF_AND_BEFORE), POST) }, { [Measures].[Target Revenue], [Measures].[Demand Plan Revenue], [Measures].[Market Share], [Measures].[Adjusted Demand Plan Quantity] }) } ON ROWS FROM [Demand]
Error: Can only cross join sets with different dimensions (for the statement above). -
Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"
Hi all,
We are using Open Hub to download transaction files from InfoCubes to the application server, and we would like the filename to be dynamic based on period and year, i.e. the period and year of the transaction data being downloaded.
I understand we could use a logical file for this purpose. However, we are not sure how to derive the period and year dynamically in the filename.
I have read a number of posts on a similar topic on SDN, and many have suggested a how-to paper titled "How to Extract Data with Open Hub to a Logical Filename". However, I could not get the document from the link given.
Just wonder if anyone has the correct or latest link to the document, or would appreciate if you could share the document with all in sdn if you have a copy.
Many thanks and best regards,
Victoria
Hi,
After creating the open hub, press F1 in the 'Application server file name' text box. From the help window, click 'Maintain client-independent file names and file paths'; you will be taken to the Implementation Guide screen. Click 'Cross-client maintenance of file names' and create a logical file path via 'New Entries'. After creating the logical file path, go to 'Logical file name definition' and enter your logical file name, physical file (your file name followed by month or year, whatever is applicable; press F1 for more info), data format (ASC), application area (BW), and logical path (choose via F4 the path you created first). Then go to 'Assignment of physical path to logical path' and give a syntax group; the physical path is the path you gave in the logical file name definition.
However, we created a logical file name that identifies the file by system date, whereas your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help would be of much use here. All the above steps will help you create a dynamic logical file.
Hope this helps to some extent.
Regards -
How to extract data from planning book
I am new to APO. I need help on how to extract data from a planning book, given the planning book name, view name, and some key figures.
Total Demand (Key Figure DMDTO):
o Forecast
o Sales Order
o Distribution Demand (Planned)
o Distribution Demand (Confirmed)
o Distribution Demand (TLB-Confirmed)
o Dependent Demand
Total Receipts (Key Figure RECTO):
o Distribution Receipt (Planned)
o Distribution Receipt (Confirmed)
o Distribution Receipt (TLB-Confirmed)
o In-Transit
o Production (Planned)
o Production (Confirmed)
o Manufacture of Co-Products
Stock on Hand (Key Figure STOCK):
o Stock on Hand (Excluding Blocked stock)
o Stock on Hand (Including Blocked stock)
In-Transit Inventory (Cross-company stock transfers)
Work-in-progress (Planned orders / Process orders with start date in one period and finish date in another period)
Production (Planned), Production (Confirmed), and Distribution Receipt elements need to be converted based on the goods receipt date for the projected inventory calculation.
Hello Debadrita,
Function Module BAPI_PBSRVAPS_GETDETAIL2 or BAPI_PBSRVAPS_GETDETAIL can help you.
For BAPI BAPI_PBSRVAPS_GETDETAIL, the parameters are:
1) PLANNINGBOOK - the name of your planning book
2) DATA_VIEW - name of your data view
3) KEY_FIGURE_SELECTION - list of key figures you want to read
4) SELECTION - selection parameters which describe the attributes of the data you want to read (e.g. the category or brand). This is basically a list of characteristics and characteristic values.
BAPI_PBSRVAPS_GETDETAIL2 is very similar to BAPI_PBSRVAPS_GETDETAIL but is only available from SCM 4.1 onwards.
For the complete list of parameters, you can go to transaction SE37, enter the function module names above and check out the documentation.
Please post again if you have questions.
Hope this helps. -
How to extract data from ODS to non-SAP system
Hi,
Can anybody tell me, step by step, how to extract data from an ODS to a non-SAP system?
Is it possible to do it without programming effort? And are there volume limits for this kind of extraction?
The non-SAP system is a Unix system.
Thanks in advance
Ella
Ella,
You can look at it from the angle of a BAdI / InfoSpoke.
Extract the data from the ODS to a flat file / RDBMS using an InfoSpoke. I am not sure how the InfoSpoke loads data into the RDBMS (I did it very long ago), but you can push it into an RDBMS and I am sure it will be system-neutral.
Hope this helps...
Arun
Assign points if it helps -
How to extract data from custom made Idoc that is not sent
Hi experts,
Could you please advise if there is a way to extract data from a custom-made IDoc (it collects a lot of data from different SAP tables)? Please note that this IDoc is not sent, as the target system is not fully maintained.
For now, we would like to verify what data is extracted.
Any help would be appreciated!
Hi,
The fields for each segment have their lengths given in the EDSAPPL table. How to map them is explained in the example below.
Suppose for segment1, EDSAPPL has 3 fields so below are entries
SEGMENT FIELDNAME LENGTH
SEGMENT1 FIELD1 4
SEGMENT1 FIELD2 2
SEGMENT1 FIELD3 2
Data in EDID4 would be as follows
IDOC SEGMENT APPLICATION DATA
12345 SEGMENT1 XYZ R Y
When you are extracting data from these tables into your internal table, mapping has to be as follows:
FIELD1 = APPLICATIONDATA+0(4) to read first 4 characters of this field, because the first 4 characters in this field would belong to FIELD1
Similarly,
FIELD2 = APPLICATIONDATA+4(2).
FIELD3 = APPLICATIONDATA+6(2).
FIELD1 would have XYZ, FIELD2 = R, FIELD3 = Y
This would remain true in all cases. So all you need to do is identify which fields you want to extract, and simply code as above to extract the data from this table.
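Outside ABAP, the same offset arithmetic can be sketched in Python. The field names and lengths below are the example values from this post, not real EDSAPPL content:

```python
def split_segment(application_data, field_lengths):
    """Slice an IDoc segment's application-data string into fields,
    using the per-field lengths (as maintained in EDSAPPL)."""
    fields, offset = [], 0
    for length in field_lengths:
        # Each field occupies a fixed-width slice; trim trailing padding.
        fields.append(application_data[offset:offset + length].rstrip())
        offset += length
    return fields

# SEGMENT1 from the example: FIELD1 length 4, FIELD2 length 2, FIELD3 length 2,
# and EDID4 application data "XYZ R Y " for that segment.
print(split_segment("XYZ R Y ", [4, 2, 2]))  # ['XYZ', 'R', 'Y']
```

This mirrors APPLICATIONDATA+0(4), +4(2), +6(2) in the ABAP above: the offsets are just the running sum of the preceding field lengths.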
Hope this was helpful in explaining how to derive the data. -
How to extract data from info cube into an internal table using ABAP code
HI
Can Anyone plz suggest me
how to extract data from an InfoCube into an internal table using ABAP code, like BAPIs or function modules.
Thankx in advance
Regards,
Ajay
Hi Dinesh,
Thanks for your reply,
but I have already tried to use the function module.
When I try to use the function module RSDRI_INFOPROV_READ,
I get an information message "ERROR GENERATION TEST FRAME".
Can you please tell me what the problem could be?
Bye
AJAY