Delta upload for flat file extraction
Hello everyone,
Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do that.
Hi Norton,
For a flat file DataSource, the upload is always FULL.
Please refer to the following doc:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710 if you need to extract delta records from a flat file. We can write a routine at the InfoPackage level.
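A routine of that kind might, for example, derive the file name from the load date so that each run picks up only the newest file. A minimal sketch follows; the generated FORM signature varies by release, and the directory and naming convention here are assumptions for illustration:

```abap
* Hypothetical InfoPackage file-name routine: build today's file name
* so each delta-like run loads only the current day's flat file.
* The path '/interface/asf/' and the YYYYMMDD.csv convention are assumptions.
form compute_flat_file_name
  changing p_filename type c
           p_subrc    like sy-subrc.

  data lv_date(8) type c.

  lv_date = sy-datum.                          " system date, YYYYMMDD
  concatenate '/interface/asf/' lv_date '.csv'
         into p_filename.
  p_subrc = 0.
endform.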
Regards,
Harish.
Similar Messages
-
Hello Everyone
I am Srikanth. I would like to know whether we have a facility for delta upload for flat files. If yes, can you please give me the steps?
thanks in advance
Srikanth
Hi Sabrina, thank you for your help. I did load the data from the cube to the ODS.
Steps:
1. I generated an export DataSource on the cube.
2. I found the name of the cube, with prefix 8<infocube name>, in the InfoSource under the DM application component.
3. The communication structure and transfer rules were already activated, but when I created update rules for the ODS I got the message that 0RECORDMODE is missing in the InfoSource.
4. So I went to the InfoSource, added 0RECORDMODE to the communication structure, and activated it, but the transfer rules stayed yellow; no object was assigned to 0RECORDMODE, but I activated anyway.
5. I went to the ODS again, created the update rules, and activated them (this time I got no message about 0RECORDMODE).
6. I created an InfoPackage and loaded.
a) Now my question is: without a green signal on the transfer rules, how did data get populated into the ODS? In your answer you mentioned creating the communication structure and transfer rules, but I did not do anything there.
b) Will I face any problem if I keep loading data into the ODS from the cube with the yellow signal? Is this the correct procedure? Please correct me. Thanks in advance. -
Omit the Open Hub control file 'S_*' for flat file extracts
Hi Folks,
a quick question is it somehow possible to omit the control file generation for flat file extracts.
We have some Unix scripts running that get confused by the S_* files, and we were wondering if we can switch off the creation of those files.
Thanks and best regards,
Axel
Hi,
However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
As far as I know, and as I have checked, this is not the case. The S_ file lists the fields in the order of the InfoSpoke object sequence (the source structure in the transformation tab).
I would suggest you check it again.
Derya -
Using Web Upload for Flat File in BW-BPS 3.5 Stack 13
Hi,
I am attempting to implement the "How To Guide Load a Flat File into BW-BPS Using a Web Browser" (the 7/2005 version). However, certain BSP files named in the How to Guide, such as "JS.htm" and "checkbrowser.htm" do not exist in my generated BSP application. We are on BW-BPS version 3.5 Stack 13. Is this guide compatible with this BW version?
Thanks,
Cynara
(Note: I have already attempted to correct the incompatibilities by changing the statement <%@include file="js.htm" %> to <%@include file="JS3.htm" %> and eliminating the code <%@include file="checkbrowser.htm" %>. However, I still get the JS3.htm error "field R_PAGE is unknown".)
Hi Cynara,
I am trying to do the same.
I left out both of those lines, and also the line concerning the style:
<link type="text/css" href="<%= cstyle %>" rel="stylesheet">
The upload then worked, except that no data was written into the cube. Even when I press the Save button, nothing happens.
Any ideas?
Regards,
Juergen -
How to Delta Update in Flat Files
Hi all,
I would like to know: is delta update supported for flat files? If yes, how does this work? Is there an SAP white paper or document on this?
Thanks
Nathan
Hi,
Yes, delta upload is supported for flat files. When delta data records are loaded from a flat file, only delta type 'F' is used for DataSources of flat file source systems. Flat files are imported into the PSA when you execute the InfoPackage, and the staging process then begins. The delta queue is not used here.
As Siggi already said in the following thread, post your flat file data to an ODS object with update mode overwrite; the ODS object will then do the delta for you.
Delta Upload for Flat File
Also Check this.......
SAP NetWeaver 7.0 BI: Data Transfer Process with Only get Delta Once
SAP NetWeaver 7.0 BI: Data Transfer Process (DTP) / The Normal DTP ...
Hope this helps.
Regards,
Debjani
Edited by: Debjani Mukherjee on Nov 12, 2008 9:40 AM -
Process for Flat file delta load from Application server to transactional??
I want to do a flat file extraction with delta loads from the application server, loaded into transactional cubes by scheduling a process chain.
Can anyone help with the ABAP code and a step-by-step process for this?
Thank you
Devi
Hi Devi,
As per your explanation, you want to load a list of files from the application server using a process chain.
You can do this with below steps.
1) Create an event and use it to run the process chain multiple times.
2) Put all the file names in a file, say Source-File.
3) Write a routine to read the file name from Source-File and trigger the InfoPackage.
4) At the end of the process chain, create a new custom program that deletes the first line from Source-File, so that the next run picks up the next file, and that also deletes the currently loaded file. Once this is done, raise the event again to restart the process chain.
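The routine in step 3 could be sketched roughly as follows; the control-file path is an assumption based on the application server directory mentioned elsewhere in this thread, and the real routine would sit in the InfoPackage and hand the name on:

```abap
* Hypothetical sketch for step 3: read the next file name from the
* control file 'Source-File' on the application server.
* The path below is an assumption for illustration only.
data: lv_ctrl     type string
        value '/usr/sap/BI1/DVEBMGS00/work/source_file.txt',
      lv_filename type string.

open dataset lv_ctrl for input in text mode encoding default.
if sy-subrc = 0.
  " The first line holds the next flat file to load
  read dataset lv_ctrl into lv_filename.
  close dataset lv_ctrl.
endif.
* Pass lv_filename to the InfoPackage as the file to load.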
Regards,
Ganesh -
Oracle API for Extended Analytics Flat File extract issue
I have modified the Extended Analytics SDK application as such:
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
instead of
HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
Where does the FLATFILE get saved/output when using the API?
Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
thanks for your help,
Chris
Never mind. I found the location on the server.
-
What are the settings for datasource and infopackage for flat file loading
Hi,
I am trying to load data from a flat file to a DSO. Can anyone tell me what the settings for the DataSource and InfoPackage are for flat file loading?
Please let me know.
Regards,
Kumar
Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into a BI 7 system.
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects in the template, so you do not have to map them during the transformation; the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProvider.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
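For illustration, a flat file matching these settings (CSV format, comma separator, one header row to ignore) could look like the fragment below; the field names are made up for this example:

```text
CUSTOMER,MATERIAL,QUANTITY,AMOUNT
C1000,M2001,10,250.00
C1001,M2002,5,125.50
```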
3. Creation of data targets
In left panel select info provider
Select created info area and right click to create ODS( Data store object ) or Cube.
Specify a name for the ODS or cube and click Create.
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create a Data Transfer Process (DTP) by right-clicking the data target.
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. An ODS has two tables, a new table and an active table; to move data from the new table to the active table, you must activate the request after loading. Alternatively, the monitor icon can be used.
Loading of master data in BI 7.0:
For Uploading of master data in BI 7.0
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects in the template, so you do not have to map them during the transformation; the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProvider.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets. -
How to allow users to upload a flat file to BW
Hi All,
For a planning application I would like to permit our users to upload a flat file from their local desktop to the InfoPackage and execute the load.
We would like to empower the users to prepare and upload their flat files into BW from their desktop without asking for BW support.
Please let me know if any of you have followed this approach.
Thanks
Karen
Hi,
The possible steps..
1. Create a small program with a selection screen:
File name: ______
Note: ask users to always use the same file name, e.g. xyz.csv.
Once the user gives the file name and executes the program, the file is saved on the application server (fix the path, e.g. /usr/sap/BI1/DVEBMGS00/work, and create a separate folder on the application server).
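A minimal sketch of such a program follows; the server path and file name come from the suggestion above, but the report name is hypothetical and real code would validate the input and handle the open/transfer errors properly:

```abap
REPORT zupload_flat_file.
* Hypothetical sketch of step 1: copy a file from the user's desktop
* to a fixed path on the application server.
PARAMETERS: p_file TYPE string LOWER CASE.

DATA: lt_data TYPE TABLE OF string,
      lv_line TYPE string.

START-OF-SELECTION.
  " Read the local file from the user's PC
  CALL METHOD cl_gui_frontend_services=>gui_upload
    EXPORTING
      filename = p_file
    CHANGING
      data_tab = lt_data
    EXCEPTIONS
      OTHERS   = 1.
  IF sy-subrc <> 0.
    MESSAGE 'Upload from desktop failed' TYPE 'E'.
  ENDIF.

  " Write the contents to the fixed server location
  OPEN DATASET '/usr/sap/BI1/DVEBMGS00/work/xyz.csv'
       FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
  LOOP AT lt_data INTO lv_line.
    TRANSFER lv_line TO '/usr/sap/BI1/DVEBMGS00/work/xyz.csv'.
  ENDLOOP.
  CLOSE DATASET '/usr/sap/BI1/DVEBMGS00/work/xyz.csv'.
```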
2. Create a small program that raises an event:
REPORT ZTEST_EV.
DATA: EVENTID LIKE TBTCJOB-EVENTID.
DATA: EVENTPARM LIKE TBTCJOB-EVENTPARM.
EVENTID = 'ZEVENT1'.
EVENTPARM = 'ZEVENTPARAM'.
CALL FUNCTION 'RSSM_EVENT_RAISE'
EXPORTING
I_EVENTID = EVENTID
I_EVENTPARM = EVENTPARM
EXCEPTIONS
BAD_EVENTID = 1
EVENTID_DOES_NOT_EXIST = 2
EVENTID_MISSING = 3
RAISE_FAILED = 4
OTHERS = 5.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
3. Once the user has uploaded the file in step 1, they need to run this program.
4. Create a process chain triggered by this event; once the user runs the program, the data loads will happen through the process chain.
Note: even though this is a lengthy process, it is fully protected, because we are not giving any access to the user; we only give them reports/programs to execute.
Thanks
Reddy
Error message while uploading the flat file
Hi Experts,
I am getting the error message while uploading the flat file.
Message class: MG
Number: 147
The message is: Several descriptions exist for the language JA.
Please guide me on why this error is occurring.
Regards
Akshay
Hi,
How are you uploading the file, and where?
You can use OPEN DATASET, READ DATASET, or GUI_UPLOAD.
check this link
http://help.sap.com/saphelp_nw04/helpdata/en/c8/e92637c2cbf357e10000009b38f936/frameset.htm -
Extended Analytics Flat File extract in ANSI
Hi,
Version: EPM 11.1.2.1
The Flat file extract using Extended Analytics from HFM is exported in UNICODE format. Is there any option to get the export in ANSI code format?
Thanks,
user8783298
Hi,
In the latest version, data from HFM can currently be exported only in Unicode format. A defect has been filed with Oracle for this.
At present the only workaround is to change the file encoding after it has been extracted, using third-party software.
For example, the extracted file can be opened in Windows Notepad and the desired encoding selected when the file is saved using Save As.
Hope this helps.
thank you
Regards,
Mahe -
hi,
I have tried GoldenGate for Oracle and non-Oracle databases. Now I am trying the flat file adapter.
What I have done so far:
1. I have downloaded Oracle "GoldenGate Application Adapters 11.1.1.0.0 for JMS and Flat File Media Pack"
2. Kept it on the same machine where the database and the GG manager process exist. Port for the GG manager process: 7809; for the flat file adapter: 7816.
3. Following the GG flat file administrator's guide, page 9 --> configuration.
4. Extract process on the GG manager:
edit params FFE711
extract ffe711
userid ggs@bidb, password ggs12345
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
add extract FFE711, EXTTRAILSOURCE ./dirdat/oo
add rmttrail ./dirdat/pp, extract FFE711, megabytes 20
start extract FFE711
view report ffe711
Oracle GoldenGate Capture for Oracle
Version 11.1.1.1 OGGCORE_11.1.1_PLATFORMS_110421.2040
Windows (optimized), Oracle 11g on Apr 22 2011 03:28:23
Copyright (C) 1995, 2011, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-11-07 18:24:19
Operating System Version:
Microsoft Windows XP Professional, on x86
Version 5.1 (Build 2600: Service Pack 2)
Process id: 4628
Description:
** Running with the following parameters **
extract ffe711
userid ggs@bidb, password ********
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
CACHEMGR virtual memory values (may have been adjusted)
CACHEBUFFERSIZE: 64K
CACHESIZE: 1G
CACHEBUFFERSIZE (soft max): 4M
CACHEPAGEOUTSIZE (normal): 4M
PROCESS VM AVAIL FROM OS (min): 1.77G
CACHESIZEMAX (strict force to disk): 1.57G
Database Version:
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
PL/SQL Release 11.1.0.7.0 - Production
CORE 11.1.0.7.0 Production
TNS for 32-bit Windows: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
Database Language and Character Set:
NLS_LANG environment variable specified has invalid format, default value will be used.
NLS_LANG environment variable not set, using default value AMERICAN_AMERICA.US7ASCII.
NLS_LANGUAGE = "AMERICAN"
NLS_TERRITORY = "AMERICA"
NLS_CHARACTERSET = "AL32UTF8"
Warning: your NLS_LANG setting does not match database server language setting.
Please refer to user manual for more information.
2011-11-07 18:24:25 INFO OGG-01226 Socket buffer size set to 27985 (flush size 27985).
2011-11-07 18:24:25 INFO OGG-01052 No recovery is required for target file E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, at RBA 0 (file not opened).
2011-11-07 18:24:25 INFO OGG-01478 Output file E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote is using format RELEASE 10.4/11.1.
** Run Time Messages **
5. On the flat file GGSCI prompt:
edit params FFR711
extract ffr711
CUSEREXIT E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params "E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\sample-dirprm\ffwriter.properties"
SOURCEDEFS E:\GoldenGate11gMediaPack\V26071-01\dirdef\vikstkFF.def
table ggs.vikstk;
add extract ffr711, exttrailsource ./dirdat/pp
start extract ffr711
view report ffr711
Oracle GoldenGate Capture
Version 11.1.1.0.0 Build 078
Windows (optimized), Generic on Jul 28 2010 19:05:07
Copyright (C) 1995, 2010, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-11-07 18:21:31
Operating System Version:
Microsoft Windows XP Professional, on x86
Version 5.1 (Build 2600: Service Pack 2)
Process id: 5008
Description:
** Running with the following parameters **
extract ffr711
CUSEREXIT E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params "E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\sample-dirprm\ffwriter.properties"
E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\ggs_Windows_x86_Generic_32bit_v11_1_1_0_0_078\extract.exe running with user exit library E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll, compatiblity level (2) is current.
SOURCEDEFS E:\GoldenGate11gMediaPack\V26071-01\dirdef\vikstkFF.def
table ggs.vikstk;
CACHEMGR virtual memory values (may have been adjusted)
CACHEBUFFERSIZE: 64K
CACHESIZE: 1G
CACHEBUFFERSIZE (soft max): 4M
CACHEPAGEOUTSIZE (normal): 4M
PROCESS VM AVAIL FROM OS (min): 1.87G
CACHESIZEMAX (strict force to disk): 1.64G
Started Oracle GoldenGate for Flat File
Version 11.1.1.0.0
** Run Time Messages **
Problem I am facing:
I am not sure where to find the generated flat file.
Even the reports show there is no data at the manager process.
I was expecting a replicat instead of an extract for the flat file FFR711.prm.
This is how far I have gotten; please give me some pointers.
Thanks,
Vikas
Ok, I haven't run your example, but here are some suggestions.
Vikas Panwar wrote:
extract ffe711
userid ggs@bidb, password ggs12345
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
ggsci> add extract FFE711, EXTTRAILSOURCE ./dirdat/oo
ggsci> add rmttrail ./dirdat/pp, extract FFE711, megabytes 20
ggsci> start extract FFE711
You of course need data captured from somewhere to test with. You could capture changes directly from a database and write those to a trail, and use that as a source for the flat-file writer; or, if you have existing trail data, you can just use that (I often test with old trails, with known data).
In your example, you are using a data pump that is doing nothing more than pumping trails to a remote host. That's fine, if that's what you want to do. (It's actually quite common in real implementations.) But if you want to actually capture changes from the database, then change "add extract ... extTrailSource" to be "add extract ... tranlog". I'll assume you want to use the simple data pump to send trail data to the remote host. And I will assume that some other database capture process is creating the trail dirdat/oo
Also... with your pump "FFE711", you can create either a local or remote trail, that's fine. But don't use a rmtfile (or extfile). You should create a trail, either a "rmttrail" or "exttrail". The flat-file adapter will read that (binary) trail and generate text files. Trails automatically roll over; the "extfile/rmtfile" do not (but they do have the same internal GG binary log format). (You can use 'maxfiles' to force them to roll over, but that's beside the point.)
Also, <ul>
<li> don't forget your "table" statements... or else no data will be processed!! You can wildcard tables, but not schemata.
<li> there is no reason that anything would be discarded in a pump.
<li> although a matter of choice, I don't see why people use absolute paths for reports and discard files. Full paths to data and def files make sense if they are on the SAN/NAS, but then I'd use symlinks from dirdat to the storage directory (on Unix/Linux)
<li> both windows and unix can use forward "/" slashes. Makes examples platform-independent (another reason for relative paths)
<li> your trails really should be much larger than 5MB for better performance (e.g., 100MB)
<li> you probably should use a source-defs file, instead of a dblogin for metadata. Trail data is by its very nature historical, and using "userid...password" in the prm file inherently gets metadata from "right now". The file-writer doesn't handle DDL changes automatically.
</ul>
So you should have something more like:
Vikas Panwar wrote:
extract ffe711
sourcedefs dirdef/vikstkFF.def
rmthost 10.180.182.77, mgrport 7816
rmttrail dirdat/ff, purge, megabytes 100
table myschema.*;
table myschema2.*;
table ggs.*;
For the file-writer pump:
5. On the flat file GGSCI prompt:
extract ffr711
CUSEREXIT flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params dirprm\ffwriter.properties
SOURCEDEFS dirdef/vikstkFF.def
table myschema.*;
table ggs.*;
ggsci> add extract ffr711, exttrailsource ./dirdat/pp
ggsci> start extract ffr711
Again, use relative paths when possible (the flatfilewriter.dll is expected to be found in the GG install directory). Put the ffwriter.properties file into dirprm, just as a best-practice. In this file, ffwriter.properties, is where you define your output directory and output files. Again, make sure you have a "table" statement in there for each schema in your trails.
Problem I am facing:
I am not sure where to find the generated flat file,
even the reports are showing there is no data at manager process
I am expecting replicat instead of extract at Flatfile FFR711.prm
I have done this much what to do give me some pointers.....
The generated files are defined in the ffwriter.properties file. Search for the "rootdir" property, e.g.:
goldengate.flatfilewriter.writers=csvwriter
csvwriter.files.formatstring=output_%d_%010n
csvwriter.files.data.rootdir=dirout
csvwriter.files.control.ext=_data.control
csvwriter.files.control.rootdir=dirout
...
The main problem you have is: (1) use rmttrail, not rmtfile, and (2) don't forget the "table" statement, even in a pump.
Also, for the flat-file adapter, it does run in just a "extract" data pump; no "replicat" is ever used. The replicats inherently are tied to a target database; the file-writer doesn't have any database functionality.
Hope it helps,
-m -
Changing the file name in Flat File Extraction
Hi,
Currently I am using flat file extraction for my forecast data, and I am sending the file through the application server.
I have created the directory successfully, and every morning I receive a file through the FTP server with the name 20060903.csv; this name is based on one field in my flat file data, e.g. /interface/asf/20060903.csv.
Mid-month we have a cut-off date, and this cut-off date varies for each month. At that time the file name changes on the FTP server, and a file with a different name, i.e. 20061002.csv, will exist on the application server.
Now, in the InfoPackage I also need to set the deletion settings: if the file name is the same, delete the previous requests. I could achieve this if I could get the file name changed.
Let's say I am not changing the file name: how do I set the deletion condition so that it does not delete if the field (scenario) changes, i.e. from 20061002 to 20061101? I should have only one file for 20061002, one file for 20061101, and so on. If the scenario is the same, it should delete.
Can anyone kindly advise? Very urgent and critical.
Tks & regards,
Bhuvana
Hi Bhuvana,
Try the following ABAP code in a routine under the External Data tab of the InfoPackage:
data: begin of i_req occurs 0,
rnr like RSICCONT-rnr,
end of i_req.
select * from RSICCONT UP TO 1 ROWS
where ICUBE = <datatargetname>
order by TIMESTAMP descending.
i_req-rnr = rsiccont-rnr .
append i_req.
clear i_req.
endselect.
loop at i_req.
select single * from RSSELDONE where RNR eq i_req-rnr and
filename = p_filename.
if sy-subrc = 0.
CALL FUNCTION 'RSSM_DELETE_REQUEST'
EXPORTING
REQUEST = i_req-rnr
INFOCUBE = <datatargetname>
EXCEPTIONS
REQUEST_NOT_IN_CUBE = 1
INFOCUBE_NOT_FOUND = 2
REQUEST_ALREADY_AGGREGATED = 3
REQUEST_ALREADY_COMDENSED = 4
NO_ENQUEUE_POSSIBLE = 5
OTHERS = 6.
IF SY-SUBRC <> 0.
MESSAGE ID sy-MSGID TYPE 'I' NUMBER sy-MSGNO
WITH sy-MSGV1 sy-MSGV2 sy-MSGV3 sy-MSGV4.
else.
message i799(rsm1) with i_req-rnr 'deleted'.
ENDIF.
endif.
endloop.
let me know if you get any problem in this logic.
regards,
Raju -
Encountering problem in Flat File Extraction
Flat file extraction: the settings I maintain in the DataSource "Fields" tab are conversion routine "ALPHA" and format "External". But when I load the data, a record maintained in lowercase (in the flat file) is not converted into uppercase. If I try without the ALPHA conversion, the lowercase data is converted into the SAP-internal format (uppercase). Could you please help me fix the problem when I use both ALPHA and External together?
Hi, did you enable the "Lowercase letters" checkbox at the InfoObject level for the objects in which you want lowercase values? Check the help.sap.com site as well.
Good day.