Extract data from a cube into an Excel file
Hi All,
I am new to SAP BI.
How can I extract the contents of a cube into an Excel sheet, including the request ID?
Thanks in advance,
Aparna.
Go to RSA1 --> InfoProvider.
Right-click on your InfoCube --> Manage.
Then select the Contents tab and click on InfoCube Content.
Select the fields you want in your Excel output file (include the Request ID field that you need).
Change the maximum number of hits from 200 to blank or a larger value.
Execute (press F8).
You will get the output on your screen; the next step is to save it as an Excel file.
Click the Local File icon (or press CTRL + SHIFT + F9), then select the Spreadsheet radio button.
Regards,
Vinod
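If the same extract has to be produced repeatedly, the manual download step above can also be scripted. Below is a minimal, untested sketch; the internal table contents and the file path are placeholder assumptions. It writes already-selected records to a tab-delimited file that Excel opens directly:

```abap
* Sketch only: save an internal table as a tab-delimited file that
* Excel can open. lt_output and the path below are placeholders.
DATA: lt_output TYPE STANDARD TABLE OF string.

* ... fill lt_output with the cube records selected earlier ...

CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
    filename = 'C:\temp\cube_content.xls'   " placeholder path
    filetype = 'DAT'                        " tab-delimited; Excel opens it
  TABLES
    data_tab = lt_output
  EXCEPTIONS
    OTHERS   = 1.
IF sy-subrc <> 0.
  MESSAGE 'Download failed' TYPE 'E'.
ENDIF.
```

FILETYPE 'DAT' produces a tab-separated file, which avoids having to build CSV quoting by hand.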
Similar Messages
-
How to extract data from info cube into an internal table using ABAP code
Hi,
Can anyone please suggest how to extract data from an InfoCube into an internal table using ABAP code, such as BAPIs or function modules?
Thanks in advance,
Regards,
AJAY
Hi Dinesh,
Thank you for your reply,
but I have already tried using the function module.
When I try to use the function module RSDRI_INFOPROV_READ,
I get an information message "ERROR GENERATION TEST FRAME".
Can you please tell me what could be the problem?
Bye
AJAY -
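For reference, here is a minimal sketch of how RSDRI_INFOPROV_READ is typically called from a report (untested here; the cube name ZMYCUBE and the characteristic/key-figure names are placeholders to replace with your own). The "error generating test frame" message usually only means SE37 cannot build a test frame for this module's parameter types; the module can still be called from a program:

```abap
* Sketch only: read InfoCube data into an internal table with
* RSDRI_INFOPROV_READ. ZMYCUBE, 0MATERIAL and 0QUANTITY are placeholders.
TYPE-POOLS: rs.

TYPES: BEGIN OF ty_row,
         material TYPE /bi0/oimaterial,
         quantity TYPE /bi0/oiquantity,
       END OF ty_row.

DATA: lt_data       TYPE STANDARD TABLE OF ty_row,
      lt_all        TYPE STANDARD TABLE OF ty_row,
      ls_sfc        TYPE rsdri_s_sfc,
      lth_sfc       TYPE rsdri_th_sfc,
      ls_sfk        TYPE rsdri_s_sfk,
      lth_sfk       TYPE rsdri_th_sfk,
      lt_range      TYPE rsdri_t_range,
      l_end_of_data TYPE rs_bool,
      l_first_call  TYPE rs_bool VALUE rs_c_true.

* Characteristic(s) to return
ls_sfc-chanm    = '0MATERIAL'.
ls_sfc-chaalias = 'MATERIAL'.
INSERT ls_sfc INTO TABLE lth_sfc.

* Key figure(s) to return, aggregated by the database
ls_sfk-kyfnm    = '0QUANTITY'.
ls_sfk-kyfalias = 'QUANTITY'.
ls_sfk-aggr     = 'SUM'.
INSERT ls_sfk INTO TABLE lth_sfk.

* Read the cube in packages until all data has been returned
DO.
  CALL FUNCTION 'RSDRI_INFOPROV_READ'
    EXPORTING
      i_infoprov       = 'ZMYCUBE'          " placeholder cube name
      i_th_sfc         = lth_sfc
      i_th_sfk         = lth_sfk
      i_t_range        = lt_range
      i_reference_date = sy-datum
      i_save_in_table  = rs_c_false
      i_save_in_file   = rs_c_false
      i_packagesize    = 10000
    IMPORTING
      e_t_data         = lt_data
      e_end_of_data    = l_end_of_data
    CHANGING
      c_first_call     = l_first_call
    EXCEPTIONS
      illegal_input    = 1
      no_authorization = 2
      OTHERS           = 3.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  APPEND LINES OF lt_data TO lt_all.
  IF l_end_of_data = rs_c_true.
    EXIT.
  ENDIF.
ENDDO.
```

The data comes back in packages; the loop keeps calling the module until E_END_OF_DATA is set.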
Best way to extract data from archived cube
Hello Experts,
Can anyone tell me the best way to extract data from an archived cube?
Basically, I am trying to pull all the data from an archived cube and then load it into a brand new InfoProvider in a different box.
Also, I need to extract all the master data for all InfoObjects.
I have two options in my mind:
1) Use open hub destination
or
2) InfoProvider > Display data > select the fields and download the data.
Is it really possible to extract data using option (2) if the record count is very high, and then load it into another InfoProvider in the new system?
Please suggest the pros and cons of the two options.
Thanks for your time in advance.
Hello Reddy,
Thanks a lot for your quick reply.
Actually, in my case I am trying to extract the archived InfoCube data and then load it into a new InfoProvider in a different system. If I had connectivity, I could simply generate an export DataSource from the archived InfoCube and then reload into the new InfoProvider.
But there is no connectivity between those two systems (where the archived cube is and where the new InfoProvider is), so I am left with the two options I mentioned:
1) Use Open Hub
or
2) Extract data manually from infoprovider into excel.
Can anyone let me know which of the two options is best? I also have doubts about extracting data with Excel, since Excel has a limit of 65,536 rows.
Thanks
Edited by: saptrain on Mar 12, 2010 6:13 AM -
How to extract data from BW cube
Hello friends,
Kindly advise me how to extract data from a BW cube.
My requirement is to write an ABAP program to retrieve the data from the BW cube.
Kindly provide me any sample code for my requirement.
Thanks in advance.
S.Jaffer Ali
Hi Jaffer,
If you want to load the data from the BW cube into any other Cube/ODS in the same or another BW system, you can use the export DataSource.
If you want to load the data from the BW cube into any other system, you can use any of these options:
1. InfoSpoke - Open Hub service; this is widely used.
2. APD
3. RFC-enabled function modules.
Bye
Dinesh -
ABAP Function Module Example to move data from one Cube into Another
Hi experts,
Can anyone please help out with this?
A simple ABAP function module example to move data from one cube into another cube
(how do I send the data from one client to another client using a function module).
Thanks
-Upen.
Moderator message: too vague, help not possible, please describe problems in all technical detail when posting again, BI related? ("cube"), also search for information before asking.
Edited by: Thomas Zloch on Oct 29, 2010 1:19 PM -
Error extracting data from essbase cube using MDX method
Hi,
We have some problems extracting data from an Essbase cube using the MDX method. We believe the problem is the MDX query. The error and the query are below:
ERROR:
[DwgCmdExecutionThread]: Cannot perform cube view operation. Analytic Server Error(1260046): Unknown Member SELECTNON used in query
com.hyperion.odi.essbase.ODIEssbaseException: Cannot perform cube view operation. Analytic Server Error(1260046): Unknown Member SELECTNON used in query
at com.hyperion.odi.essbase.wrapper.EssbaseMdxDataIterator.init(Unknown Source)
MDX:
SELECT
NON EMPTY {[YearTotal].[Jan]} ON COLUMNS,
NON EMPTY {[Total Movimientos].[Presupuesto Base]} ON AXIS(1),
NON EMPTY {[Año].[FY11]} ON AXIS(2),
NON EMPTY {[Escenario].[Presupuesto_1]} ON AXIS(3),
NON EMPTY {[Version].[Trabajo]} ON AXIS(4),
NON EMPTY {[Moneda].[Moneda Input]} ON AXIS(5),
NON EMPTY {[Centros de Costo].[1101]} ON AXIS(6),
NON EMPTY {Descendants([Resultado Operacional],4)} ON AXIS(7)
FROM [DSR02].[ROP]
We tried extracting data from a sample cube and it works fine; this is the MDX query:
SELECT
{[Actual],[Budget]} ON COLUMNS,
{[Sales]} ON ROWS,
NON EMPTY {[Product].levels(0).members} ON PAGES,
NON EMPTY {[East].levels(0).members} ON AXIS(3),
NON EMPTY {[Year].levels(0).members} ON AXIS(4)
FROM Sample.Basic
The reversed model ([DSR02].[ROP]) has the same structure that the query needs; the query and the model look fine, but we definitely cannot see the problem. Can someone help us?
Regards
You will be able to test the MDX query in EAS; it is usually best to test the query first before trying to use it in ODI.
Is there any reason you are using MDX to extract the data? Have you tried a report script? I usually find it more efficient for extracting data.
Cheers
John
http://john-goodwin.blogspot.com/ -
How extract data from bw cube to MS SQL server.
Hi all,
This is Sateesh. For my requirement, I need to extract data from a BW cube and move it to MS SQL Server. Can anyone give me a solution? Please give me the process for moving data from BW to SQL Server.
Hi Kumar,
Try Open Hub.
Check these:
http://help.sap.com/saphelp_nw04/helpdata/en/66/76473c3502e640e10000000a114084/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/c5/03853c01c89d7ce10000000a11405a/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/59/90070982d5524b931ae16d613ac04a/frameset.htm
http://help.sap.com/saphelp_nw2004s/helpdata/en/ce/c2463c6796e61ce10000000a114084/frameset.htm
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/43f92595-0501-0010-5eb5-bb772d41ffa4
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5f12a03d-0401-0010-d9a7-a55552cbe9da
http://help.sap.com/saphelp_nw04/helpdata/en/66/76473c3502e640e10000000a114084/frameset.htm
/people/happy.tony/blog/2006/09/18/hyperion-essbase-data-extraction-transformation-and-loading
/people/marc.bernard/blog/2005/07/11/reorganization-of-sap-business-intelligence-forums
Using OpenHub Unix to export to SQL Server
Infospoke
HOW TO RETRACT DATA FROM BPS 7.1 TO SRM 5.0
Regards,
Vijay. -
Unable to extract data from R/3 into BW 3.5
I have just started learning BW. I am currently trying to upload data from R/3 into BW. I have tried uploading data into an InfoCube, an ODS, and InfoObjects from R/3, and in all cases I get error messages. Please shed some light on what I am doing wrong. Below are the steps I have taken.
For loading transaction data from R/3 in ODS
1. Created view from VBAP and VBAK tables in R/3
2. Created Transaction type DataSource in R/3 using RSO2 transaction
3. Replicated above DataSource in BW under Source Systems. using 'Replicate DataSource'
4. Created ODS object along the same lines as the above created view in R/3
5.Created InfoSource and assigned the replicated DataSource
6. Activated the Transfer Rules, Transfer Structure, and Communication Structure
7. Created Update Rules for ODS and activated them
8. Created InfoPackage and started the Data Request
The error message that I see in Monitor is as below
Requests (messages): Everything OK
Data request arranged
Confirmed with: OK
Extraction (messages): Missing messages
Missing message: Request received
Missing message: Number of sent records
Missing message: Selection completed
Transfer (IDocs and TRFC): Errors occurred
Request IDoc : IDoc with errors added
Processing (data packet): No data
I am getting the same error message when I try InfoCube instead of ODS.
Please let me know what is wrong.
Vidya
I checked the 'Details' and 'Status' tab pages in the Monitor.
On the Status page it says:
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
On the Details tab page, it says:
Requests (messages): Everything OK
Data request arranged
Confirmed with: OK
Extraction (messages): Missing messages
Missing message: Request received
Missing message: Number of sent records
Missing message: Selection completed
Transfer (IDocs and TRFC): Errors occurred
Request IDoc : IDoc with errors added
Processing (data packet): No data -
Extracting data from Essbase & loading into flat file through ODI
Hi,
I want to extract data from Essbase and load it into a flat file through ODI (for the extraction from Essbase I am using a report script). I am using these KMs: LKM Hyperion Essbase DATA to SQL, IKM SQL to FILE Append, and RKM Hyperion Essbase for reversing. All the mappings have been done and the interface has been built, but when I execute the interface it throws the error below:
ODI-1217: Session ESS_FILEI (114001) fails with return code 7000.
ODI-1226: Step ESS_FILEI fails after 1 attempt(s).
ODI-1240: Flow ESS_FILEI fails while performing a Loading operation. This flow loads target table ESS_FILE.
ODI-1228: Task SrcSet0 (Loading) fails on the target FILE connection FILE_PS_ODI.
Caused By: java.sql.SQLException: ODI-40417: An IOException was caught while creating the file saying The system cannot find the path specified
at com.sunopsis.jdbc.driver.file.impl.commands.CommandCreateTable.execute(CommandCreateTable.java:62)
at com.sunopsis.jdbc.driver.file.CommandExecutor.executeCommand(CommandExecutor.java:33)
at com.sunopsis.jdbc.driver.file.FilePreparedStatement.execute(FilePreparedStatement.java:178)
at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:619)
Please let me know what I'm missing and how I can resolve this error.
Thanks
It seems that you are trying to use the file as your staging area. The Hyperion LKM extracts Essbase data into a DB staging area, which your file IKM can then use to load the data into the file.
You need to use an RDBMS for your staging area.
Extract Data from a View into multiple columns
I'm fairly new to SQL. I have a question about extracting data from a view I have created and formatting it the way the user needs.
My View has the following columns:
Account
Department
Budget
Spent
Values would be such as:
Account Department Budget Spent
1 A 1.00 1.00
1 B 2.50 1.45
1 C 3.00 4.00
2 A 4.00 1.00
2 B 2.00 1.00
What I'm wanting to do is extract the data out in the following format:
<dept> <account> 1 <account> 2
<budget> <spent> <budget> <spent>
A 1.00 1.00 4.00 1.00
B 2.50 1.45 2.00 1.00
C 3.00 4.00
Basically, Department as a row label, then separate Budget and Spent columns for each Account going across.
Thanks!
Search for pivot queries in this forum.
However, you need a finite number of accounts in the Account column; otherwise the output will be, I must say, unreadable, and the query to be written would be impractical.
Error when extracting data from R/3 into BW
Hi,
There is an InfoObject Payscale Level (DataSource 0PAYSCALELV_ATTR) in the BW HR module that extracts data from R/3. While extracting, the following error occurs: "Assignment to feature PFREQ did not take place". I debugged the standard function module HR_BW_EXTRACT_IO_PAYSCALELV and I believe there is definitely an error in the data, but I am not able to find that record. Has anyone come across this error before? Please let me know if there is any solution. The help for this error provides the following details, but I haven't understood it. Any help would be greatly appreciated.
You can assign a specific operation (assignment) to each decision option
within a feature. You can also define decision option ** whose
assignment is effected for all unlisted decision options (this is known
as a default value).
Example:
o D PERSK
o K1 &ABRKS=D1, <- D1 for employee subgroup K1
o K2 &ABRKS=D1, <- D1 for employee subgroup K2
o K3 &ABRKS=D2, <- D2 for employee subgroup K3
o ** &ABRKS=D0, <- D0 for all other employee subgroups
If you now create employee subgroup K4 and the default entry ** is
missing, the system is unable to find a decision option for payroll area
ABKRS.
Procedure
Define a decision option for K4 or for default value **.
Hi,
The error is due to the existing customizing of feature PFREQ for MOLGA = 05: there, the value to be returned depends on WERKS, but apparently a new line for the "Otherwise" case was forgotten,
as was already done for MOLGA = 03, for example. Please add a line for MOLGA = 05 and "Otherwise" and the issue should be solved.
If this does not solve your issue, then kindly check these OSS notes:
1033423 -> 0PAYSCALELV_ATTR: Short dump feature_error
842212 -> 0PAYSCALELV_ATTR: DataSources deliver wrong results
92055 -> RP_FROM_PERIOD_TO_PERIOD: FEATURE_ERROR dump
Regards,
Lokesh -
Loading Data from one Cube into another Cube
Hi Guys,
I am trying to load data from one cube (A) into another cube (B). Cube A has around 200,000 records. I generated an export DataSource on cube A, replicated the DataSource, and created and activated an InfoSource.
I created update rules for cube B, selecting cube A as the source. I have a start routine to duplicate records from cube A. Now, when I schedule the load,
it stops at "Processing Data Packet" and says no data. Is there something wrong with the update routine, or is there a simpler way to load from cube to cube?
Thanks in advance
This is the start routine to duplicate records in two currencies.
DATA: datew    TYPE /bi0/oidateto,
      datew2   TYPE rsgeneral-chavl,
      fweek    TYPE rsgeneral-chavl,
      prodhier TYPE /bi0/oiprod_hier,
      market   TYPE /bic/oima_seg,
      segment  TYPE /bic/oizsegment.
DATA: BEGIN OF s_data_pack OCCURS 0.
        INCLUDE STRUCTURE /bic/cs8zsdrev.
DATA: END OF s_data_pack.

s_data_pack[] = DATA_PACKAGE[].
REFRESH DATA_PACKAGE.
LOOP AT s_data_pack.
  MOVE-CORRESPONDING s_data_pack TO DATA_PACKAGE.
  IF DATA_PACKAGE-loc_currcy = 'EUR'.
    DATA_PACKAGE-netval_inv = DATA_PACKAGE-/bic/zsdvalgrc.
    DATA_PACKAGE-currency   = 'USD'.
    APPEND DATA_PACKAGE.
    DATA_PACKAGE-netval_inv = DATA_PACKAGE-/bic/zsdvalloc.
    DATA_PACKAGE-currency   = 'EUR'.
    APPEND DATA_PACKAGE.
  ELSE.
    DATA_PACKAGE-netval_inv = DATA_PACKAGE-/bic/zsdvalgrc.
    DATA_PACKAGE-currency   = 'USD'.
    APPEND DATA_PACKAGE.
  ENDIF.
ENDLOOP.
This is the update routine to load the Quantity field:
RESULT = COMM_STRUCTURE-bill_qty.
This is the update routine to load the Value field:
RESULT = COMM_STRUCTURE-netval_inv.
UNIT = COMM_STRUCTURE-currency. -
Use transaction FILE to store data from a cube into a file with Open Hub
Hi people:
I'm using BI 7.0. My requirement is to create a flat file from the data of a virtual cube. The file name must contain the month number and the year. I know this is possible through the FILE transaction.
Can anybody give me a clue how this transaction is used? What are the steps to assemble the name of the file? Or is there any other option? I have already defined the physical directory where the file must be placed.
Any help will be great. Thanks in advance.
Hi,
pick up whatever code you need from below.
REPORT RSAN_WB_ROUTINE_TEMP_REPORT .
TYPES: BEGIN OF y_source_fields ,
/BIC/ZTO_ROUTE TYPE /BIC/OIZTO_ROUTE ,
ZINT_HU__Z_WM_HU TYPE /BIC/OIZ_WM_HU ,
CREATEDON TYPE /BI0/OICREATEDON ,
ROUTE TYPE /BI0/OIROUTE ,
PLANT TYPE /BI0/OIPLANT ,
PLANT__0STREET TYPE /BI0/OISTREET ,
PLANT__0CITY TYPE /BI0/OICITY ,
PLANT__0REGION TYPE /BI0/OIREGION ,
PLANT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
/BIC/ZRECVPLNT TYPE /BIC/OIZRECVPLNT ,
ZRECVPLNT__0STREET TYPE /BI0/OISTREET ,
ZRECVPLNT__0CITY TYPE /BI0/OICITY ,
ZRECVPLNT__0REGION TYPE /BI0/OIREGION ,
ZRECVPLNT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
KYF_0001 TYPE /BI0/OIDLV_QTY ,
ROUTE__Z_CR_DOCK TYPE /BIC/OIZ_CR_DOCK ,
REFER_DOC TYPE /BI0/OIREFER_DOC ,
END OF y_source_fields .
TYPES: yt_source_fields TYPE STANDARD TABLE OF y_source_fields .
TYPES: BEGIN OF y_target_fields ,
RECORDTYPE TYPE /BI0/OISTREET ,
CONTAINER TYPE /BI0/OICITY ,
/BIC/ZTO_ROUTE TYPE /BIC/OIZTO_ROUTE ,
TRACKINGNUMBER TYPE /BIC/OIZ_WM_HU ,
PO TYPE /BI0/OICITY ,
STAGEDDATE TYPE /BI0/OICITY ,
MOVEMENTTYPE TYPE /BI0/OICITY ,
ROUTE TYPE /BI0/OIROUTE ,
PLANT TYPE /BI0/OIPLANT ,
PLANT__0STREET TYPE /BI0/OISTREET ,
PLANT__0CITY TYPE /BI0/OICITY ,
PLANT__0REGION TYPE /BI0/OIREGION ,
PLANT__0POSTAL_CD TYPE /BI0/OIPOSTAL_CD ,
ORIGINCONTACTNAME TYPE /BI0/OISTREET ,
ORIGINCONTACTPHONE TYPE /BI0/OISTREET ,
/BIC/ZRECVPLNT TYPE /BIC/OIZRECVPLNT ,
ZRECVPLNT__0STREET TYPE /BI0/OISTREET ,
ZRECVPLNT__0CITY TYPE /BI0/OISTREET ,
ZRECVPLNT__0REGION TYPE /BI0/OISTREET ,
ZRECVPLNT__0POSTAL_CD TYPE /BI0/OISTREET ,
DESTINATIONCONTACTNAME TYPE /BI0/OISTREET ,
DESTINATIONCONTACTPHONE TYPE /BI0/OISTREET ,
RCCCODE TYPE /BI0/OISTREET ,
GLCORCLLICODE TYPE /BI0/OISTREET ,
JFCCODE TYPE /BI0/OISTREET ,
DESCRIPTIONOFWORK1 TYPE /BI0/OISTREET ,
DESCRIPTIONOFWORK2 TYPE /BI0/OISTREET ,
INSTRUCTIONS TYPE /BI0/OISTREET ,
REQUESTEDSHIPDATE TYPE /BI0/OICITY ,
ROUTE__Z_CR_DOCK TYPE /BIC/OIZ_CR_DOCK ,
REQUESTEDDELIVERYDATE TYPE /BI0/OICITY ,
ATTSEORDER TYPE /BI0/OICITY ,
CUBE TYPE /BI0/OISTREET ,
WEIGHT TYPE /BI0/OISTREET ,
PIECES TYPE /BI0/OIREFER_DOC ,
REEL TYPE /BI0/OISTREET ,
REELSIZE TYPE /BI0/OISTREET ,
VENDORSKU TYPE /BI0/OISTREET ,
ATTSESKU TYPE /BI0/OISTREET ,
COMPANYNAME TYPE /BI0/OISTREET ,
OEM TYPE /BI0/OISTREET ,
REFER_DOC TYPE /BI0/OIREFER_DOC ,
REFERENCENUMBER2 TYPE /BI0/OISTREET ,
REFERENCENUMBER3 TYPE /BI0/OISTREET ,
REFERENCENUMBER4 TYPE /BI0/OISTREET ,
END OF y_target_fields .
TYPES: yt_target_fields TYPE STANDARD TABLE OF y_target_fields .
* Begin of type definitions -
* TYPES: ...
* End of type definitions -
FORM compute_data_transformation
USING it_source TYPE yt_source_fields
ir_context TYPE REF TO if_rsan_rt_routine_context
EXPORTING et_target TYPE yt_target_fields .
* Begin of transformation code -
DATA: ls_source TYPE y_source_fields,
ls_target TYPE y_target_fields,
var1(10),
var2(10),
year(4),
month(2),
day(2),
date(10),
it_workdays type table of /bic/pzworkdays,
wa_workdays type /bic/pzworkdays,
sto_date(10),
V_tabix TYPE sy-tabix,
Y_tabix TYPE sy-tabix,
sto_var1(10),
sto_year(4),
sto_month(2),
sto_day(2),
sto_final_date(10),
W_HEADER LIKE LS_TARGET-RECORDTYPE,
W_HEADER1(12) TYPE C VALUE 'HEDR00000000',
W_FOOTER LIKE W_HEADER VALUE 'TRLR0000',
CNT(5),
CMD(125) TYPE C.
********** CODE FOR GENERATING CSV FILE PATH **********
data: OUTFILE_NAME(100) TYPE C,
OUTFILE_NAME1(10) TYPE C VALUE '/sapmnt/',
OUTFILE_NAME3(18) TYPE C VALUE '/qoutsap/ATTUVS',
DATE1 LIKE SY-DATUM,
DD(2) TYPE C,
MM(2) TYPE C,
YYYY(4) TYPE C.
MOVE SY-DATUM+6(2) TO DD.
MOVE SY-DATUM+4(2) TO MM.
MOVE SY-DATUM(4) TO YYYY.
CONCATENATE YYYY MM DD INTO DATE1.
CONCATENATE OUTFILE_NAME1 SY-SYSID OUTFILE_NAME3 '.CSV' INTO
OUTFILE_NAME.
********** END OF CODE FOR GENERATING CSV FILE PATH **********
OPEN DATASET OUTFILE_NAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
* Code for generating Header.
CONCATENATE W_HEADER1 SY-DATUM SY-UZEIT INTO W_HEADER.
APPEND W_HEADER TO ET_TARGET.
TRANSFER W_HEADER TO OUTFILE_NAME.
CLEAR W_HEADER.
* End of code for generating Header.
* Code for excluding the rows whose Quantity (PIECES) equals zero.
LOOP AT it_source INTO ls_source WHERE kyf_0001 NE '0'.
* End of code for excluding the rows whose Quantity (PIECES) equals zero.
MOVE-CORRESPONDING ls_source TO ls_target.
ls_target-RECORDTYPE = 'PKUP'.
ls_target-CONTAINER = ''.
ls_target-TRACKINGNUMBER = ls_source-ZINT_HU__Z_WM_HU.
ls_target-PO = ''.
* Date Conversion for Staged Date.
var1 = ls_source-CREATEDON.
year = var1+0(4).
month = var1+4(2).
day = var1+6(2).
CONCATENATE month '/' day '/' year INTO date.
* End of Date Conversion for Staged Date.
ls_target-STAGEDDATE = date.
ls_target-MOVEMENTTYPE = 'P'.
ls_target-ORIGINCONTACTNAME = ''.
ls_target-ORIGINCONTACTPHONE = ''.
ls_target-DESTINATIONCONTACTNAME = ''.
ls_target-DESTINATIONCONTACTPHONE = ''.
ls_target-RCCCODE = ''.
ls_target-GLCORCLLICODE = ''.
ls_target-JFCCODE = ''.
ls_target-DESCRIPTIONOFWORK1 = ''.
ls_target-DESCRIPTIONOFWORK2 = ''.
ls_target-INSTRUCTIONS = ''.
ls_target-REQUESTEDSHIPDATE = date.
* Calculating STO Creation Date + 3 working days.
select /BIC/ZWORKDAYS from /bic/pzworkdays into table it_workdays.
loop at it_workdays into wa_workdays.
if wa_workdays-/bic/zworkdays = ls_source-CREATEDON.
V_tabix = sy-tabix.
Y_tabix = V_tabix + 3.
endif.
If sy-tabix = y_tabix.
sto_date = wa_workdays-/bic/zworkdays.
endif.
Endloop.
clear v_tabix.
clear Y_tabix.
sto_var1 = sto_date.
sto_year = sto_var1+0(4).
sto_month = sto_var1+4(2).
sto_day = sto_var1+6(2).
CONCATENATE sto_month '/' sto_day '/' sto_year INTO sto_final_date.
* End of calculating STO Creation Date + 3 working days.
ls_target-REQUESTEDDELIVERYDATE = sto_final_date.
ls_target-ATTSEORDER = ''.
ls_target-CUBE = ''.
ls_target-PIECES = ls_source-KYF_0001.
ls_target-REEL = ''.
ls_target-REELSIZE = ''.
ls_target-VENDORSKU = ''.
ls_target-ATTSESKU = ''.
ls_target-COMPANYNAME = 'AT&T'.
ls_target-OEM = ''.
ls_target-REFERENCENUMBER2 = '0'.
ls_target-REFERENCENUMBER3 = '0'.
ls_target-REFERENCENUMBER4 = '0'.
APPEND ls_target TO et_target.
TRANSFER ls_target TO OUTFILE_NAME.
CNT = CNT + 1.
ENDLOOP.
CNT = CNT + 2.
* Code for generating Header - Footer.
SHIFT CNT LEFT DELETING LEADING SPACE.
CONCATENATE W_FOOTER CNT INTO W_HEADER.
APPEND W_HEADER TO ET_TARGET.
* End of code for generating Header - Footer.
* Code for file permissions.
TRANSFER W_HEADER TO OUTFILE_NAME.
CLOSE DATASET OUTFILE_NAME.
CONCATENATE 'chmod 644' OUTFILE_NAME INTO CMD SEPARATED BY SPACE.
CALL 'SYSTEM' ID 'COMMAND' FIELD CMD.
* End of code for file permissions.
* End of transformation code -
ENDFORM.
Hope it helps
bhaskar -
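On the original question about assembling a file name with month and year via transaction FILE: a logical file name defined there can include reserved placeholders such as <YEAR> and <MONTH> in its physical path, which FILE_GET_NAME resolves at runtime. Below is a minimal, untested sketch; the logical file name Z_MONTHLY_EXTRACT and its physical path are placeholder assumptions:

```abap
* Sketch: resolve a logical file name maintained in transaction FILE.
* Z_MONTHLY_EXTRACT is a placeholder; its physical path could be
* defined there as e.g. /interfaces/extract_<YEAR><MONTH>.csv
DATA: l_filename LIKE filename-fileextern.

CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'Z_MONTHLY_EXTRACT'
  IMPORTING
    file_name        = l_filename
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.
IF sy-subrc = 0.
  OPEN DATASET l_filename FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
* ... write the records, then CLOSE DATASET l_filename ...
ENDIF.
```

This keeps the naming logic in customizing instead of hard-coding the path in the routine, as the reply above does.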
Discoverer Plus Automation- Extract data from Pac2K and fill in Excel 5 min
Hi,
I am new to this tool. I have zero knowledge on this tool.
I got a requirement to fetch data from the Pac2K tool every 10 minutes and export it to an Excel sheet. This should be done using Discoverer Plus; I am using the Discoverer Plus web application.
As it is very difficult to fetch the data manually every 10 minutes, I was asked to automate this process.
So, if you would be so kind, could you please let me know whether it is possible to automate the above process using the Discoverer Plus web application?
If possible, please let me know in detail.
Hi,
Using Plus it is not possible to automate the export to Excel.
This can be done using the Desktop edition via a command line:
Tamir -
How to extract data from virtual cube..?
Gurus,
How can I read data from a virtual InfoCube through ABAP code? Is there any function module I can use?
Kindly help me with this.
It is really urgent.
Thanks
Sam
Subray,
Thanks for the reply. I have created a wrapper for this FM in the same way the DEMO program shows, but it still doesn't work:
no data is returned in the table.
I also tried using the FM RSDRI_INFOPROV_READ_RFC; it does some processing but does not return the results.
I am attaching my code. Can you please help me with it?
TYPES:
BEGIN OF gt_s_data,
cs_version(3) TYPE c,
cs_chart(2) TYPE c,
bcs_llob(4) TYPE c,
bcs_lcus(3) TYPE c,
bcs_ldch(2) TYPE c,
bcs_lprg(5) TYPE c,
curkey_lc TYPE /BI0/OICURKEY_LC,
curkey_tc TYPE /BI0/OICURKEY_TC,
bcs_lmay TYPE /BIC/OIBCS_LMAY,
figlxref3 TYPE /BIC/OIFIGLXREF3,
unit TYPE /BI0/OIUNIT,
CS_TRN_LC TYPE /BI0/OICURKEY_GC,
CS_TRN_TC TYPE /BI0/OICURKEY_GC,
CS_TRN_QTY TYPE /BI0/OIUNIT,
END OF gt_s_data.
DATA:
l_msg_text TYPE string,
g_s_sfc TYPE rsdri_s_sfc,
g_th_sfc TYPE rsdri_th_sfc,
g_s_sfk TYPE rsdri_s_sfk,
g_th_sfk TYPE rsdri_th_sfk,
g_s_range TYPE rsdri_s_range,
g_t_range TYPE rsdri_t_range.
DATA:
g_s_data TYPE gt_s_data,
g_t_data TYPE STANDARD TABLE OF gt_s_data,
g_t_rfcdata TYPE rsdri_t_rfcdata,
g_t_sfc TYPE rsdri_t_sfc,
g_t_sfk TYPE rsdri_t_sfk,
g_t_field TYPE rsdp0_t_field.
DATA:
g_first_call TYPE rs_bool.
g_first_call = rs_c_true.
CLEAR g_s_sfc.
g_s_sfc-chanm = '0CS_VERSION'.
g_s_sfc-chaalias = 'cs_version'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = '0CS_CHART'.
g_s_sfc-chaalias = 'CS_CHART'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = 'BCS_LLOB'.
g_s_sfc-chaalias = 'BCS_LLOB'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = 'BCS_LCUS'.
g_s_sfc-chaalias = 'BCS_LCUS'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = 'BCS_LDCH'.
g_s_sfc-chaalias = 'BCS_LDCH'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = 'BCS_LPRG'.
g_s_sfc-chaalias = 'BCS_LPRG'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = '0CURKEY_GC'.
g_s_sfc-chaalias = 'CURKEY_LC'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = '0CURKEY_GC'.
g_s_sfc-chaalias = 'CURKEY_TC'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = 'BCS_LMAY'.
g_s_sfc-chaalias = 'BCS_LMAY'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = 'FIGLXREF3'.
g_s_sfc-chaalias = 'FIGLXREF3'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
CLEAR g_s_sfc.
g_s_sfc-chanm = '0UNIT'.
g_s_sfc-chaalias = 'UNIT'.
g_s_sfc-orderby = 0.
INSERT g_s_sfc INTO TABLE g_th_sfc.
***Fill up the key figures data.
CLEAR g_s_sfk.
g_s_sfk-kyfnm = '0CS_TRN_GC'.
g_s_sfk-kyfalias = 'CS_TRN_LC'.
g_s_sfk-aggr = 'SUM'.
INSERT g_s_sfk INTO TABLE g_th_sfk.
CLEAR g_s_sfk.
g_s_sfk-kyfnm = '0CS_TRN_GC'.
g_s_sfk-kyfalias = 'CS_TRN_TC'.
g_s_sfk-aggr = 'SUM'.
INSERT g_s_sfk INTO TABLE g_th_sfk.
CLEAR g_s_sfk.
g_s_sfk-kyfnm = '0CS_TRN_QTY'.
g_s_sfk-kyfalias = 'CS_TRN_QTY'.
g_s_sfk-aggr = 'SUM'.
INSERT g_s_sfk INTO TABLE g_th_sfk.
* Fill up selection criteria.
CLEAR g_s_range.
g_s_range-chanm = '0CS_VERSION'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = '100'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = 'BCS_VERS'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = 'ACT'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = '0CS_CHART'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = 'ZG'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = '0CO_AREA'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = 'AZ01'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = '0FISCVARNT'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = 'K2'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = '0FISCYEAR'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = '2007'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = '0FISCPER3'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = '003'.
APPEND g_s_range TO g_t_range.
CLEAR g_s_range.
g_s_range-chanm = '0SEM_CGCOMP'.
g_s_range-sign = rs_c_range_sign-including.
g_s_range-compop = rs_c_range_opt-equal.
g_s_range-low = 'US0075'.
APPEND g_s_range TO g_t_range.
g_t_sfc = g_th_sfc.
g_t_sfk = g_th_sfk.
CALL FUNCTION 'RSDRI_INFOPROV_READ_RFC'
  EXPORTING
    I_INFOPROV            = 'BCS_C1V11'
    I_REFERENCE_DATE      = SY-DATUM
    I_SAVE_IN_TABLE       = ' '
*   I_TABLENAME           =
    I_SAVE_IN_FILE        = ' '
*   I_FILENAME            =
    I_AUTHORITY_CHECK     = RSDRC_C_AUTHCHK-READ
    I_CURRENCY_CONVERSION = 'X'
*   I_S_RFCMODE           =
    I_MAXROWS             = 0
    I_USE_DB_AGGREGATION  = RS_C_TRUE
    I_USE_AGGREGATES      = RS_C_TRUE
    I_ROLLUP_ONLY         = RS_C_TRUE
    I_READ_ODS_DELTA      = RS_C_FALSE
    I_RESULTTYPE          = ' '
    I_DEBUG               = RS_C_FALSE
* IMPORTING
*   E_END_OF_DATA         =
*   E_AGGREGATE           =
*   E_RFCDATA_UC          =
*   E_SPLIT_OCCURRED      =
  TABLES
    I_T_SFC               = g_t_sfc
    I_T_SFK               = g_t_sfk
    I_T_RANGE             = g_t_range
*   I_T_TABLESEL          =
*   I_T_RTIME             =
*   I_T_REQUID            =
    E_T_RFCDATA           = g_t_rfcdata
*   E_T_RFCDATAV          =
    E_T_FIELD             = g_t_field
  EXCEPTIONS
    ILLEGAL_INPUT          = 1
    ILLEGAL_INPUT_SFC      = 2
    ILLEGAL_INPUT_SFK      = 3
    ILLEGAL_INPUT_RANGE    = 4
    ILLEGAL_INPUT_TABLESEL = 5
    NO_AUTHORIZATION       = 6
    GENERATION_ERROR       = 7
    ILLEGAL_DOWNLOAD       = 8
    ILLEGAL_TABLENAME      = 9
    ILLEGAL_RESULTTYPE     = 10
    X_MESSAGE              = 11
    DATA_OVERFLOW          = 12
    OTHERS                 = 13.
IF SY-SUBRC <> 0.
  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
CALL FUNCTION 'RSDRI_DATA_UNWRAP'
EXPORTING
i_t_rfcdata = g_t_rfcdata
CHANGING
c_t_data = g_t_data.