Parameters in Transformations file "data export"
Hi,
I'm trying to export data from SAP BPC to a txt file, using the standard package (Export Transactional Data).
I have several issues. Firstly, it seems the data export generates a lot of lines containing all the "PARENT" calculations. So my file is perhaps only 30 lines at base-member level, but the final flat file is perhaps 3,000 lines due to all the calculated (parent-level) lines.
Is there any way to "disable" all these parent calculated lines in the export?
Also, I've tried finding some examples of transformation files for data export, but cannot find any.
Does anyone know which parameters I can use in the *OPTIONS section of the transformation file?
Thank you,
Joergen Dalby
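For reference, export transformation files use the same three sections as import ones. A minimal *OPTIONS sketch (parameter names from the standard set; whether each one is honored on export varies by BPC version and support package):

```
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
VALIDATERECORDS = NO

*MAPPING
ACCOUNT=ACCOUNT
TIME=TIME

*CONVERSION
```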
Similar Messages
-
Query in File to File Data Export
Hi,
I have done a project for file to file data export using this link below:
http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/odi/odi_11g/odi_project_ff-to-ff/odi_project_flatfile-to-flatfile.htm
My project works fine. Here the source file is on the local machine (C:\Oracle\Middleware\Oracle_ODI1\oracledi\demo\file).
1) What shall I do if the source file is on a remote machine (say its IP is 172.22.18.90)?
2) Will it cause any problem if the remote machine's OS is Unix?
Thanks
Papai
NB: My machine's OS is Windows 7.
If you can access the other machine as a shared folder, then provide that path in the physical schema, e.g. \\my_other_pc_on_shared\new_folder.
If you cannot access it like that, then create an agent on that machine which can access the path, and execute your project using that agent.
Thanks
Bhabani
http://bhabaniranjan.com/ -
BPC 7.5 NW SP5 - Question regarding Transformation file for exporting data
Hi Experts,
I have a requirement to export data from a BPC application to App.server/UJFS. There are currently 13 dimensions in the application. The user requires the output file to contain only three dimensions followed by the amount (SIGNDATA), namely
| COSTCENTER | ACCOUNT | TIME | AMOUNT |.
How do I handle this in the transformation file?
Currently I am using the following transformation file, but I am having issues validating it.
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
OUTPUTHEADER=*COSTCENTER,ACCOUNT,TIME, PERIODIC
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
*MAPPING
COSTCENTER=*COL(2)
ACCOUNT=*COL(4)
TIME=*COL(13)
AMOUNT=*COL(14)
I tried removing the OUTPUTHEADER from *OPTIONS, but it still gives the same issue.
Please help me with how I can handle this scenario.
Thanks,
Victor
The 'Export' data package which comes as part of a support package enhancement does not support using transformation/conversion files to modify data prior to export.
Please refer to section 3 (Limitations) in the below how to guide.
http://www.sdn.sap.com/irj/bpx/go/portal/prtroot/docs/library/uuid/b0427318-5ffb-2b10-9cac-96ec44c73c11?QuickLink=index&overridelayout=true
Hence your only option is to manually massage data in the spreadsheet after exporting. -
Essbase Data Export not Overwriting existing data file
We have an ODI interface in our environment which is used to export data from Essbase applications to text files using DataExport calc scripts; we then load those text files into a relational database. Lately we are seeing an issue where the DataExport calc script is not overwriting the file; it just appends the new data to the existing file.
The DataExportOverWriteFile option is set to ON.
SET DATAEXPORTOPTIONS {
DataExportLevel "Level0";
DataExportOverWriteFile ON;
DataExportDimHeader ON;
DataExportColHeader "Period";
DataExportDynamicCalc ON;
The "Scenario" variable is a substitution variable which is set at runtime. We are trying to extract "Budget", but the calc script is not clearing the "Actual" scenario (which was extracted earlier) from the text file. It's as if, after the calc script executes, the file contains both "Actual" and "Budget" data. We have not been able to find the root cause of why this happens and why the DataExportOverWriteFile command is being ignored by the data export calc script.
We have also deleted the text data file to make sure there were no temporary files on the server. But when we ran the data export directly from Essbase again, the file once again contained both "Actual" and "Budget" data, which is really strange. We have never encountered an issue like this before.
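For context, a complete DataExport script usually has the shape below; the closing brace and the DATAEXPORT statement are what make the options take effect. The file path, delimiter and scenario member here are placeholders, not taken from the thread:

```
SET DATAEXPORTOPTIONS
{
DataExportLevel "Level0";
DataExportOverWriteFile ON;
DataExportColHeader "Period";
};
FIX ("Q2FCST-Budget")
DATAEXPORT "File" "," "/tmp/budget_export.txt";
ENDFIX
```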
Any suggestions regarding this issue?
Did some more testing and pretty much zoomed in on the issue. Our scenario is actually something like "Q1FCST-Budget", "Q2FCST-Budget", etc.
This is why we need to use a member function: the calc script reads &ODI_SCENARIO (which is set to Q2FCST-Budget) as a number and gives an error, so to convert this value to a string we use the @MEMBER function. And this seems to be the root cause of the issue. The ODI_SCENARIO variable is set to "Q2FCST-Budget", but when we run the script with @MEMBER("&ODI_SCENARIO"), the data file brings back the values for "Q1FCST-Budget" out of nowhere, in addition to the "Q2FCST-Budget" data we are trying to extract.
Successful test case 1:
1) Hard-coded the scenario "Q2FCST-Budget" in the script
e.g. "Q2FCST-Phased"
2) Ran the script
3) Result OK. The script overwrote the file with the Q2FCST-Budget data.
Successful Case 2:
1) Put scenario in @member function
e.g. @member("Q2FCST-Budget")
2) Result again OK
Failed Case:
1) Deleted the file
2) Put the scenario in a substitution variable, used the member function @MEMBER("&ODI_SCENARIO"), and ran the script. ODI_SCENARIO is set to Q2FCST-Budget in the Essbase variables.
e.g. @member("&ODI_SCENARIO")
3) Result : Script contained both "Q1FCST-Budget" as well as "Q2FCST-Budget" data values in the text file.
We are still not close to the root cause of why this issue is happening. Putting the sub var inside the member function changes the complete picture and gives us inaccurate results.
Any clues anyone? -
Data conversion while exporting data into flat files using the Export Wizard in SSIS
Hi,
While exporting data to a flat file through the Export Wizard, the source table has NVARCHAR columns.
Could you please help me with how to do the data conversion while using the Export Wizard?
Thanks.
Hi Avs sai,
By default, the columns in the destination flat file will be non-Unicode columns, i.e. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input column when outputting to the destination file, you can check the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings..." button to check the data types.
Regards,
Mike Yin
TechNet Community Support -
Script Logic VS Data Transformation File
Hi all,
I'm new to SAP BPC. I have knowledge of SAP BW.
I can see the conversion file, which we reference in the data transformation file, and which we can use for mapping and converting external data into internal data.
How is the data transformation file different from script logic? Do we reference script logic in the data transformation file for each required dimension?
Can any of you give me clarity on where script logic and the data transformation file sit in the BPC data management flow?
I will really appreciate all your help!
Thanks
Ben.
Nilanjan,
I have another quick question...
Suppose my BPC application has 5 dimensions. For 4 of the 5 dimensions I'm getting data directly from SAP BW. Assume the remaining dimension needs to be populated by doing a lookup on a different table, which also resides in BW.
How do I populate data for DIM5?
I got your point that the data transformation file is purely for field mapping. But suppose I want to populate DIM5 from script logic; what do I need to map in the transformation file? I hope you got my point.
My question is how to populate a dimension in BPC using a lookup approach.
Thanks,
Ben. -
Hi,
Can any of you explain the purpose of the data transformation file?
How do we use script logic based on a BADI in SAP BPC? Do we need to call it in Default.lgf?
If I want to apply some custom logic while populating data using a BADI, do I need to call that BADI in the data transformation file?
Thanks,
Ben.
Hi Ben,
There is no possibility of calling user-defined scripts or a BADI from transformation files. The main use of the transformation file is to map data from source to destination.
If you require a data load based on some filter criteria, there are certain functionalities available for transformation file creation, which you can find in the link below.
http://help.sap.com/saphelp_bpc70sp02/helpdata/en/66/ac5f7e0e174c848b0ecffe5a1d7730/frameset.htm
Hope this helps,
Regards,
G.Vijaya Kumar -
How to create .ZIP or .RAR files on data export
Dear All,
On export I want to create .DMP and .RAR/.ZIP files. How can I do this in Oracle 10.2.0.1?
EXP full=Y file=d:\abc.dmp
Thanks in advance
For Linux, you can create a shell script to export the database and afterwards create a .tar file from the exported dump.
On Windows, you can create a .bat file to run the export and then zip the dump file automatically.
On my blog, you can see an example of this procedure
http://kamranagayev.wordpress.com/2009/02/23/using-oracle-utl_file-utl_smtp-packages-and-linux-shell-scripting-and-cron-utility-together-2/
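As a rough sketch of the Linux variant (the exp call needs an Oracle client, so it is commented out here and a stand-in dump file is created instead; all file names are placeholders):

```shell
# Run the export first (requires an Oracle client; commented out here):
# exp system/password full=y file=abc.dmp log=abc.log
printf 'dump contents' > abc.dmp    # stand-in for the real export dump
tar -czf abc.tar.gz abc.dmp         # pack the dump into a compressed archive
ls -l abc.tar.gz
```

The same two steps can go into a cron-driven shell script so the export and compression run unattended.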
Kamran Agayev A. (10g OCP)
http://kamranagayev.wordpress.com -
Issue with the /CPMB/EXPORT_TD_TO_FILE transformation file.
Good evening to all,
We have to export data from a cube to a text file. To do this we have created in BPC (NW 7.5 SP09) a package with the process chain /CPMB/EXPORT_TD_TO_FILE, but when we execute the package we get an error. We think the error is generated by the transformation file. In the chain's log we can see that the process didn't finish successfully; the Appl Source and Clear BPC Tables variants are in red, and we get the error "Job or process BPCAPPS waiting for event".
Can anybody help us?
Our transformation file is like this:
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
*MAPPING
DIMENSION_1=/CPMB/NAMEDIMENSION_1
DIMENSION_2=/CPMB/NAMEDIMENSION_2
DIMENSION_3=/CPMB/NAMEDIMENSION_3
DIMENSION_4=/CPMB/NAMEDIMENSION_4
DIMENSION_5=/CPMB/NAMEDIMENSION_5
Amount=/CPMB/SDATA
*CONVERSION
Is this the proper use of the TF in this case?
The complete error log:
/CPMB/MODIFY completed in 0 seconds
/CPMB/APPL_TD_SOURCE completed in 2 seconds
/CPMB/CLEAR completed in 0 seconds
MEASURES=PERIODIC
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\TF_EXTRACCION_PLANO.xls
FILE= DATAMANAGER\DATAFILES\prueba1.txt
ADDITIONINFO= Yes
(Member selection)
CONCEPTO_FI:
FUENTE:
GRUPOCAME:
MONEDA:
SOCIEDAD:
TIEMPO: 2011.JAN
VERSION:
Task name TRANSACTION DATA SOURCE
MDX statement error: Value CONTCORMOL /CPMB/T5DDIW
Application: FINANZAS Package status: ERROR
Thanks in advance,
Best regards
Hi. Thanks for the reply.
We are still working on the issue. As a test we ran the package without the TF, but we got the same error. We found that the BPC Appl Source variant (APPL_TD_SOURCE) is the variant that generates the issue.
When we ran the package, it didn't finish and got stuck in status Active; but when we ran the package in another application, the package ended with an error.
The variant log error was:
RSPROCESS 4ENFI7BF1OI7M7E5GKHXIT5IL is unknown
Here is the log from that run:
/CPMB/MODIFY completed in 0 seconds
/CPMB/APPL_TD_SOURCE completed in 5 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
MEASURES=PERIODIC
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\TF_EXTRACCION_PLANO2.xls
FILE= DATAMANAGER\DATAFILES\EXAMPLES\prueba_ETD.txt
ADDITIONINFO= Yes
(Members selection)
CONCEPTO_FI:
SOURCE:
GRUPOCAME:
CURRENCY:
SOCIETY:
TIME: 2011.JAN,2011.FEB,2011.MAR,2011.APR,2011.MAY,2011.JUN,2011.JUL,2011.AUG,2011.SEP,2011.OCT,2011.NOV,2011.DEC
VERSION:
[Messages]
Task name TRANSACTION DATA SOURCE
MDX statement error: Value CONTCORMOL /CPMB/T5DDIW
Application: FINANZAS Package status: ERROR
Thanks for your help, best regards. -
BPC10 - Data manager package for dimension data export and import
Dear BPC Experts,
Need your help.
I am trying to set up a data manager package for the first time, to export dimension master data from one application and import it into another application (both have the same properties).
I created a test data manager package via Organize > Add Package, with process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
In the Advanced tab of each task there is some script logic already populated. Please find attached the details of the script logic written under each of the tasks, i.e. MD_Source, Convert and Target.
I have not made any changes to the script inside the tasks.
But when I run the package, I select the dimension 'Entity'; in the second prompt it asks for a transformation file, and the system automatically adds the file ...\ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls.
I have not changed anything there.
In the next prompt it asks for an output file, and it won't allow me to enter a file name.
Not sure how to proceed further.
I would be grateful if someone could guide me, from your experience, on how to set up a simple data manager package for master data export from a dimension. Should I update the transformation file in the script for the import file, and the output file in the Advanced tab? How, and what transformation file needs to be created and linked to the data manager package for export/import?
What are the steps to be executed to run the package for exporting master data from a dimension and importing it into another application?
Thanks in advance for your guidance.
Thanks and Regards,
Ramanuj
=====================================================================================================
Details of the tasks
Task : APPL_MD-SOURCE
PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
Task : EXPORT_MD_CONVERT
PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
Task : FILE_TARGET
PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
================================================================================
1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
Another option is to generate the profiles in the target system. But for that your config has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
Cheers,
Julius -
Data export from BPC 7.5 NW
I have created and validated my transformation file, which takes two dimensions and concatenates them. For example Company and ProfitCenter, in the mapping section:
U_Entity=Company+ProfitCenter.
Where my company = 5110 and my profit center = SVS8110
The records are validated and in my output file I can see 5110SVS8110 and the associated transaction data.
However, in the *Conversion section of the Transformation file I reference the conversion file
U_Entity=U_EntityConv.XLS
Within the Conversion file, in the internal column I have 5110SVS8110, and in the external column I have UKSERVCO.
Both files validate, but when I run the Export Transactional Data package, the output file doesn't include UKSERVCO.
The BPC 420 training says *CONVERSION can be used to convert external keys from the flat file to internal keys. But can it convert internal keys to external keys in the flat file? If so, what do I need to do here?
Thanks.
Hi,
have you tried inverting the column values?
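If "invert" means swapping the values between the two columns for the export direction, the U_Entity conversion sheet would read as below (values taken from the post; this is only a sketch of the suggestion, not a confirmed fix):

```
EXTERNAL        INTERNAL
5110SVS8110     UKSERVCO
```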
Kind regards
Roberto -
How to convert flat file data into SAP tables?
How do I upload flat file data into an SAP table? Before the upload there is also mapping on some fields. Can anyone give me the steps for uploading and mapping?
Hi
See the sample code
REPORT zmmupload.
* Internal table for upload data
DATA: i_mara LIKE mara OCCURS 0 WITH HEADER LINE.
PARAMETERS: p_file LIKE ibipparms-path. " Filename
* At selection-screen on value request for file name
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
* Get the F4 values for the file
CALL FUNCTION 'F4_FILENAME'
EXPORTING
program_name = syst-cprog
dynpro_number = syst-dynnr
IMPORTING
file_name = p_file.
* Upload the file into the internal table
CALL FUNCTION 'UPLOAD'
EXPORTING
filename = p_file
filetype = 'DAT'
TABLES
data_tab = i_mara
EXCEPTIONS
conversion_error = 1
invalid_table_width = 2
invalid_type = 3
no_batch = 4
unknown_error = 5
gui_refuse_filetransfer = 6
OTHERS = 7.
IF sy-subrc <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
* Update the database table from the internal table
MODIFY MARA from TABLE i_MARA.
Regards
Anji. -
Help needed on meta data export
I need a metadata export dump of two schemas from a source database, to import into a fresh new database.
Could you please help with the steps to be followed?
Assuming you are using 10g and the exp utility, you can specify ROWS=N:
exp help=yes
Export: Release 10.2.0.1.0 - Production on Thu Jul 23 13:36:59 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
You can let Export prompt you for parameters by entering the EXP
command followed by your username/password:
Example: EXP SCOTT/TIGER
Or, you can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default) Keyword Description (Default)
USERID username/password FULL export entire file (N)
BUFFER size of data buffer OWNER list of owner usernames
FILE output files (EXPDAT.DMP) TABLES list of table names
COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
GRANTS export grants (Y) INCTYPE incremental export type
INDEXES export indexes (Y) RECORD track incr. export (Y)
DIRECT direct path (N) TRIGGERS export triggers (Y)
LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y) PARFILE parameter filename
CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
OBJECT_CONSISTENT transaction set to read only during object export (N)
FEEDBACK display progress every x rows (0)
FILESIZE maximum size of each dump file
FLASHBACK_SCN SCN used to set session snapshot back to
FLASHBACK_TIME time used to get the SCN closest to the specified time
QUERY select clause used to export a subset of a table
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
TTS_FULL_CHECK perform full or partial dependency check for TTS
TABLESPACES list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE template name which invokes iAS mode export
Export terminated successfully without warnings. -
I want to store the flat file data temporarily and also as a backup, but its size is so huge that processing it in one pass is difficult. Is there any 'Z' program that will split the records and put them in a new file? Anyone please give an idea, and your answer will be rewarded with maximum points if it helps me on this issue.
Hi Manjula,
Check out the program below; it will help you solve your requirement: it splits the records as per the specified limit and puts them in a new file. I think it will help you.
REPORT zc1_split_file MESSAGE-ID ztestmsg.
TABLES: mara.
DATA: BEGIN OF input,
mandt LIKE mara-mandt,
matnr LIKE mara-matnr,
ersda LIKE mara-ersda,
ernam LIKE mara-ernam,
matkl LIKE mara-matkl,
END OF input.
DATA: i_mara_tab LIKE TABLE OF input WITH HEADER LINE,
i_mara_temp LIKE TABLE OF input,
w_mara_tab LIKE LINE OF i_mara_tab,
v_newfile(120) TYPE c,
v_no_lines TYPE i,
v_split(4) TYPE n VALUE 1,
v_count(4) TYPE n VALUE 1.
SELECTION-SCREEN BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
PARAMETERS: p_file TYPE rlgrap-filename MEMORY ID file,
p_count(4) TYPE n.
SELECTION-SCREEN END OF BLOCK b1.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
* Displaying the dialog box for opening the file
CALL FUNCTION 'WS_FILENAME_GET'
EXPORTING
mask = '.,..'
IMPORTING
filename = p_file.
START-OF-SELECTION.
IF p_file IS INITIAL. " check to ensure file.
MESSAGE i001.
EXIT.
ENDIF.
IF p_count IS INITIAL. " check to ensure record count.
MESSAGE i002.
EXIT.
ENDIF.
CALL FUNCTION 'WS_UPLOAD'
EXPORTING
filename = p_file
filetype = 'DAT'
TABLES
data_tab = i_mara_tab
EXCEPTIONS
conversion_error = 1
file_open_error = 2
file_read_error = 3
invalid_type = 4
no_batch = 5
unknown_error = 6
invalid_table_width = 7
gui_refuse_filetransfer = 8
customer_error = 9
OTHERS = 10.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
LOOP AT i_mara_tab INTO w_mara_tab.
IF v_count < p_count.
APPEND w_mara_tab TO i_mara_temp.
v_count = v_count + 1.
ELSE.
APPEND w_mara_tab TO i_mara_temp.
v_count = v_count + 1.
PERFORM split_records.
REFRESH i_mara_temp.
v_count = 1.
v_split = v_split + 1.
ENDIF.
ENDLOOP.
IF v_count NE 1.
PERFORM split_records.
ENDIF.
FORM split_records.
CONCATENATE p_file v_split INTO v_newfile.
CALL FUNCTION 'WS_DOWNLOAD'
EXPORTING
filename = v_newfile
filetype = 'DAT'
mode = 'A'
TABLES
data_tab = i_mara_temp
EXCEPTIONS
file_open_error = 1
file_write_error = 2
invalid_filesize = 3
invalid_type = 4
no_batch = 5
unknown_error = 6
invalid_table_width = 7
gui_refuse_filetransfer = 8
customer_error = 9
no_authority = 10
OTHERS = 11.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
WRITE:/'Data has been written into:' , v_newfile.
ENDFORM.
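As an aside, outside ABAP the same chunking can be done with the standard Unix split utility (file names here are placeholders):

```shell
# Create a sample file with 10 records, then split it into chunks of 4 records each
seq 1 10 > input.txt
split -l 4 -d input.txt part_    # produces part_00 (4 lines), part_01 (4), part_02 (2)
wc -l part_*
```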
**Reward points if found useful....
Regards,
Manikandan.A -
We recently performed an upgrade to 11.5.10 and are experiencing issues in the Purchasing module. The problem occurs when trying to do a data export of our position hierarchy. I have no problems defining or setting up the hierarchy; however, when I click Tools >> Export Data, I am receiving the error message below. Any help or suggestions would be greatly appreciated.
**************************** Error Message **************************************
APP-PER-289792: Before you can run a mail merge or data extraction from this window, your system administrator must define a Web ADI integrator and associate it with the window. There are currently no integrators associated with this window.
*************************** End Error Message *********************************
Hi, for downloading the data, in addition to the Web ADI function you need to associate the Position Hierarchy form with a Web ADI integrator.
Please find the sample setup steps which I did for other forms. You can try out the same for your requirement too.
From your Web ADI responsibility , choose Create Document.
Select your chosen Viewer. i.e. Excel XP. DO NOT tick the Reporting checkbox because you will be uploading a new integrator definition to the database. Click on Next button.
Choose HR Integrator Setup integrator. Click on Next button.
Choose default Layout, Integrator Setup. Click on Next button.
Choose None for Content to open empty document. Click on Next button.
On Document Creation Review page, Click on Create Document button. Open file and Enable Macros if prompted. A Processing window will open and a Confirmation window will advise when document has been created.
In the blank spreadsheet enter a value for each of the columns as follows.
Upl - ignore
Metadata Type - select List of Values from the Oracle menu (or right-click); choose UPDATE
Application Short Name - choose your application short name (see Getting Started)
Integrator User Name - enter a name for your integrator (e.g. Update Asg Details)
View Name - create your own view, which must include object_version_number from per_all_assignments_f (see Getting Started)
Form Name - GENERAL
API Package Name - HR_ASSIGNMENT_API
API Procedure Name - UPDATE_EMP_ASG_CRITERIA
Interface User Name - enter a unique name
Interface Param List - enter a unique name
API Type - select List of Values from the Oracle menu (or right-click); choose PROCEDURE
API Return Type - ignore
Upload by going to the Oracle menu on the spreadsheet toolbar and selecting Upload. If during upload you get the error PER_289866_ADI_CONT_COL_FAIL, it means the view you specified above does not exist in the APPS schema.
Create a new form function, or use an existing one for this integrator, and add it to the menu from which you will be running the integrator.
Associate the function with your new integrator by following steps 10 and 11 in the previous example.
These are the three Input parameters from the API which don’t have defaults. In other words they are mandatory input parameters for the call to the API to be successful. The first two should be placed in the Header section, whilst the assignment_id can be a line item.
You can then choose the optional fields you wish to see in the spreadsheet. Save the layout by pressing the Apply button.
Run integrator from HRMS Create Document. In Select Content Parameters page, you are asked to enter Session Date and an optional Extra where clause to restrict the query. N.B. The view can also be set up to restrict the data downloaded. Continue and Create Document.
Enter Header fields, EFFECTIVE_DATE and DATETRACK_UPDATE_MODE. i.e. UPDATE or CORRECTION. Then make changes to individual lines which will flag each row for update.
Upload by going to Oracle menu on spreadsheet toolbar, and select Upload
Now try to click the Export button in the Position Hierarchy form; this will help resolve your issue.
Regards
Ramesh Kumar S