Parsing field data into mailto:
Is there any way to pull field data, like with a variable %TextField2%, so it can be used somewhere else in the form? What I'm trying to set up is a form that takes the data from TextField2 so I can use it to prepopulate the subject of the mailto: submission. This is the line I'm using:
I have %TextField2% entered as an example. So basically I just need the info from TextField2 to be used in the subject= line. If anyone has any information on how to do this it would be greatly appreciated. Thanks!
Got it. Here is the script I used:
event.target.submitForm({cURL:"mailto:[email protected]?subject=" + TextField1.rawValue, cSubmitAs:"PDF", cCharset:"utf-8"});
TextField1 is the field I referenced. Hope this helps someone out.
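One caveat worth adding (not from the original post): raw field text can contain characters such as spaces, '&' or '?' that break a mailto URL, so it is safer to URL-encode the value first. A hedged sketch of the idea in plain JavaScript; the helper name and the sample address are illustrative only, and encodeURIComponent is also available in Acrobat's JavaScript engine:

```javascript
// Build a mailto URL with a URL-encoded subject taken from a form field value.
// "buildMailto" and the example address are made-up names for illustration;
// the submitForm() call itself stays exactly as shown above.
function buildMailto(address, subject) {
  return "mailto:" + address + "?subject=" + encodeURIComponent(subject);
}

// Spaces and ampersands survive as %20 and %26 instead of terminating
// the subject parameter.
var url = buildMailto("someone@example.com", "Order 12 & invoice");
console.log(url);
```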
Similar Messages
-
How to parse xml data into java component
Hi everybody,
I am new to XML, and I am trying to parse XML data into a Java application. Can anybody guide me how to do it? The following is my file:
//MyLogin.java
import javax.swing.*;
import java.awt.*;
import java.awt.event.*;
import java.io.*;
import java.util.*;
class MyLogin extends JFrame implements ActionListener {
    JFrame loginframe;
    JLabel labelname;
    JLabel labelpassword;
    JTextField textname;
    JPasswordField textpassword;
    JButton okbutton;
    String name = "";
    FileOutputStream out;
    PrintStream p;
    Date date;
    GregorianCalendar gcal;
    GridBagLayout gl;
    GridBagConstraints gbc;

    public MyLogin() {
        loginframe = new JFrame("Login");
        gl = new GridBagLayout();
        gbc = new GridBagConstraints();
        labelname = new JLabel("User");
        labelpassword = new JLabel("Password");
        textname = new JTextField("", 9);
        textpassword = new JPasswordField(5);
        okbutton = new JButton("OK");
        gbc.anchor = GridBagConstraints.NORTHWEST;
        gbc.gridx = 1;
        gbc.gridy = 5;
        gl.setConstraints(labelname, gbc);
        gbc.gridx = 2;
        gbc.gridy = 5;
        gl.setConstraints(textname, gbc);
        gbc.gridx = 1;
        gbc.gridy = 10;
        gl.setConstraints(labelpassword, gbc);
        gbc.gridx = 2;
        gbc.gridy = 10;
        gl.setConstraints(textpassword, gbc);
        gbc.gridx = 1;
        gbc.gridy = 15;
        gl.setConstraints(okbutton, gbc);
        Container contentpane = loginframe.getContentPane();
        contentpane.setLayout(gl);
        contentpane.add(labelname);
        contentpane.add(labelpassword);
        contentpane.add(textname);
        contentpane.add(textpassword);
        contentpane.add(okbutton);
        okbutton.addActionListener(this);
        loginframe.setSize(300, 300);
        loginframe.setVisible(true);
    }

    public static void main(String a[]) {
        new MyLogin();
    }

    public void reset() {
        textname.setText("");
        textpassword.setText("");
    }

    public void run() {
        try {
            String text = textname.getText();
            if (text.equals("")) {
                System.out.println("First Enter a UserName");
            } else {
                date = new Date();
                gcal = new GregorianCalendar();
                gcal.setTime(date);
                out = new FileOutputStream("log.txt", true); // append to the log
                p = new PrintStream(out);
                name = text;
                String entry = "UserName:- " + name + " Logged in:- " + gcal.get(Calendar.HOUR) + ":" + gcal.get(Calendar.MINUTE)
                        + " Date:- " + gcal.get(Calendar.DATE) + "/" + (gcal.get(Calendar.MONTH) + 1) + "/" + gcal.get(Calendar.YEAR); // Calendar.MONTH is zero-based
                p.println(entry);
                System.out.println("Record Saved");
                reset();
                p.close();
            }
        } catch (IOException e) {
            System.err.println("Error writing to file");
        }
    }

    public void actionPerformed(ActionEvent ae) {
        String str = ae.getActionCommand();
        if (str.equals("OK")) {
            run();
        }
        // loginframe.setDefaultCloseOperation(DISPOSE_ON_CLOSE);
    }
}
hi, thanks for your reply.
I visited that URL and was able to learn a lot about XML. So now my requirement is DOM, but I don't know how to code it in my existing file; I want to know how to link all my text fields to an XML file.
Can you please help me out? I am confused.
Waiting for your reply -
How to parse URL Data into an NSString Array in iphone application
Hi everyone,
I am a newbie to iPhone programming. I am having problems reading and displaying the data in a table view. My application has to be designed like this: there is a CSV file on the server machine, and I have to access that URL line by line. Each line consists of 8 comma-separated values; consider that each line has a first name, second name, and so on. I have to parse the data on commas and newlines and store them in an array of first name, second name array, and so on. Next, I have to display the first name and second name combined in the UITableView. Can anyone provide me with an example of how to do it? I know I am asking for the solution, but I ran into problems doing the connection methods and the parsing separately. Your help is much appreciated.
Thanks
What does that have to do with a URL?
The only thing that doesn't sound good is "array of first name" and "second name array". For each row, extract all the fields and store them in an NSDictionary. Add a derived field consisting of first name concatenated with last name. That will be easy to display in a table. -
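The responder's suggestion (one keyed record per row, plus a derived full-name field) is language-independent; a minimal sketch of the same idea in JavaScript, with made-up column names:

```javascript
// Parse one CSV line of 8 comma-separated values into a keyed record and add
// a derived "fullName" field, as the reply suggests. Column names here are
// assumptions for illustration; the real file defines its own order.
function parseRow(line) {
  var fields = line.split(",");
  var row = {
    firstName: fields[0],
    lastName: fields[1]
    // ...the remaining six columns would be keyed the same way
  };
  row.fullName = row.firstName + " " + row.lastName;
  return row;
}

var row = parseRow("John,Smith,a,b,c,d,e,f");
console.log(row.fullName); // "John Smith"
```

The Objective-C version would build an NSMutableDictionary per row and set the derived key the same way.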
Parsing Remote Data into Model Objects?
Hello,
I'm new to Flex and I'm trying to find out if my instincts
from other environments hold true. I have XML data from a remote
API that I'd like to use in my Flex application, specifically from
a 37Signals web application.
When I bring in the remote data into Flex, my instinct is to
parse it into data models, then use those models to bind to various
UI controls. I'm thinking that it will be easier to create
list/detail views if I have the data organized this way versus
having to do another remote XML lookup to populate the detail view.
I'm thinking it will also help me persist the data using the AIR
APIs.
Most of the information I'm finding wants me to parse the XML
into some sort of collection and bind it directly. Can someone tell
me if looking toward models is a good approach, or if it's a
paradigm/pattern that isn't appropriate?
If it is a decent approach, any information that shows
someone using it would be extremely helpful to me.
Thanks!
Scott
I create one bindable class and in it put an ArrayCollection
of ArrayCollection objects which if you make this a singleton
instance class then each module within your flex application can
initialize and maintain the ArrayCollection element within this
class. This seems to work very well since we are pulling several
xml lists throughout a tabbed multi state view flex application.
I'm not sure if this is what you are looking for or if you need an
example. How are you pulling the data into your application: through
HTTPService calls, or are you using bindable RemoteObjects? -
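The singleton-with-collections pattern the responder describes can be sketched in a few lines (JavaScript here for brevity; in Flex this would be a [Bindable] class exposing a static getInstance(), and the names below are illustrative, not from the thread):

```javascript
// A minimal singleton data model holding a list of lists, mirroring the
// "ArrayCollection of ArrayCollection objects" idea from the reply.
var DataModel = (function () {
  var instance = null;
  function create() {
    return { lists: [] };           // each entry plays the role of one ArrayCollection
  }
  return {
    getInstance: function () {
      if (instance === null) { instance = create(); }
      return instance;              // every module sees the same object
    }
  };
})();

// Two "modules" initialize and read the same shared collection.
DataModel.getInstance().lists.push(["itemA", "itemB"]);
console.log(DataModel.getInstance().lists.length); // 1
```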
Parsing XML data into MySQL database
I need to parse data from an xml document rsiding on amother
server into a mysql database residing on my server, so that I can
pull data into a php site.
Any help? Mybe there's a commercial script or something?
I would also need to update the data daily
Thanks!!
Check this out.
http://download-west.oracle.com/docs/cd/B10501_01/appdev.920/a96612/d_xmlsav.htm#1008593 -
Using sql load insert multiple fields data into a single column in database
Hi ,
I have my log file in sun OS box something like this
=======
(07/29/2009 00:02:24.467) 367518 (07/29/2009 00:02:26.214) 949384011
(07/29/2009 00:02:26.236) 3675 (07/29/2009 00:02:28.207) 949395117
(07/29/2009 00:02:28.240) 337710 (07/29/2009 00:02:30.621) 949400864
=============
I am trying to insert the data into oracle data base as follows.
=============================
column1 : (07/29/2009 00:02:24.467)
column2 : 367518
column3 : (07/29/2009 00:02:26.214)
column4 : 949384011
===========================
Can anyone help me with the control file format?
someone suggested me the code below.
==========
LOAD DATA
INFILE 'D:\work\load.txt'
INTO TABLE sample
(col1 POSITION(02:24) char,
col2 POSITION(27:32) INTEGER EXTERNAL,
col3 POSITION(35:57) CHAR,
col4 POSITION(60:68) INTEGER EXTERNAL
===========
but this works only for fixed-length data? Please help.
Is the requirement to load all data into a single column or into multiple columns? The thread subject and body are conflicting. -
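Since POSITION(start:end) assumes fixed-width columns, one common workaround (my sketch, not from the thread) is either delimited parsing in the control file (SQL*Loader supports FIELDS TERMINATED BY WHITESPACE) or normalizing the log before loading. The splitting logic for these paired timestamp/value lines looks like this:

```javascript
// Split one log line of the form
//   (MM/DD/YYYY hh:mm:ss.mmm) value1 (MM/DD/YYYY hh:mm:ss.mmm) value2
// into four fields, regardless of how much whitespace separates them.
function splitLogLine(line) {
  var m = line.match(/^\((.*?)\)\s+(\S+)\s+\((.*?)\)\s+(\S+)\s*$/);
  if (m === null) { return null; }            // line did not match the layout
  return { ts1: m[1], val1: m[2], ts2: m[3], val2: m[4] };
}

var r = splitLogLine("(07/29/2009 00:02:24.467) 367518 (07/29/2009 00:02:26.214) 949384011");
console.log(r.val1, r.val2); // 367518 949384011
```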
Parsing xml data into oracle database
Oracle ACEs,
We got a requirement that needs to load XML data into an Oracle DB. Is it possible to load XML data into an Oracle DB using a built-in package? Does Oracle provide one?
Our database version is Oracle 9i.
Your help in this regard is highly appreciated.
Many Thanks.
Check this out.
http://download-west.oracle.com/docs/cd/B10501_01/appdev.920/a96612/d_xmlsav.htm#1008593 -
How to parse my data into decimals
Hi all, I've been trying to convert my data from exponential form into decimal. How should I go about doing it?
Eventually, I will want to average the data points (current) into a single value. Is there a better way to do this instead of changing them into decimal form and then averaging them? Perhaps just apply some formula and get the final average value?
I'm using LV 7.1. attached is the program i've done.
thanks in advance
Rgds,
Linda
Attachments:
1.1.vi 104 KB
Hi Linda,
I attached an example of using 'Scan from String' to convert your string to an array of numbers.
To format the numbers you should use 'Format into String' or set the formatting properties of your numeric indicators accordingly!
(There's a huge page in the online help on formatting codes :-)
Best regards,
GerdW
CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
Kudos are welcome
Attachments:
Convert_SFS_71.vi 22 KB -
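For readers without the attached VI: the conversion step is trivial in most languages, because standard numeric parsing already accepts exponential notation, and the average can then be taken directly on the parsed numbers without reformatting them to decimal first. A sketch (JavaScript; the array contents are made up):

```javascript
// Parse exponential-notation strings and average them in one pass.
// parseFloat accepts forms like "1.0E1" directly, so no manual
// decimal conversion is needed before averaging.
function averageReadings(strings) {
  var sum = strings.reduce(function (acc, s) { return acc + parseFloat(s); }, 0);
  return sum / strings.length;
}

var avg = averageReadings(["1.0E1", "3.0E1"]);
console.log(avg); // 20
```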
Script for parsing xml data and inserting in DB
Thank you for reading.
I have the following example XML in an XML file. I need to write a script that can insert this data into an Oracle table. The table does not have primary keys. The data just needs to be inserted.
I do not have an xsd file in this scenario. Please suggest how to modify Method 1 (https://community.oracle.com/thread/1115266?tstart=0) so that I can read the XML mentioned below and insert it into a table.
Method 1
Create or replace procedure parse_xml is
l_bfile BFILE;
l_clob CLOB;
l_parser dbms_xmlparser.Parser;
l_doc dbms_xmldom.DOMDocument;
l_nl dbms_xmldom.DOMNodeList;
l_n dbms_xmldom.DOMNode;
l_file dbms_xmldom.DOMNodeList;
l_filen dbms_xmldom.DOMNode;
lv_value VARCHAR2(1000);
l_ch dbms_xmldom.DOMNode;
l_partname varchar2(100);
l_filename varchar2(1000);
l_temp VARCHAR2(1000);
TYPE tab_type IS TABLE OF tab_software_parts%ROWTYPE;
t_tab tab_type := tab_type();
BEGIN
l_bfile := BFileName('DIR1', 'SoftwareParts.xml');
dbms_lob.createtemporary(l_clob, cache=>FALSE);
dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
dbms_lob.loadFromFile(dest_lob => l_clob, src_lob => l_bfile, amount => dbms_lob.getLength(l_bfile));
dbms_lob.close(l_bfile);
dbms_session.set_nls('NLS_DATE_FORMAT','''DD-MON-YYYY''');
l_parser := dbms_xmlparser.newParser;
dbms_xmlparser.parseClob(l_parser, l_clob);
l_doc := dbms_xmlparser.getDocument(l_parser);
dbms_lob.freetemporary(l_clob);
dbms_xmlparser.freeParser(l_parser);
l_nl := dbms_xslprocessor.selectNodes(dbms_xmldom.makeNode(l_doc),'/PartDetails/Part');
FOR cur_emp IN 0 .. dbms_xmldom.getLength(l_nl) - 1 LOOP
l_n := dbms_xmldom.item(l_nl, cur_emp);
t_tab.extend;
dbms_xslprocessor.valueOf(l_n,'Name/text()',l_partname);
t_tab(t_tab.last).partname := l_partname;
l_file := dbms_xslprocessor.selectNodes(l_n,'Files/FileName');
FOR cur_ch IN 0 .. dbms_xmldom.getLength(l_file) - 1 LOOP
l_ch := dbms_xmldom.item(l_file, cur_ch);
lv_value := dbms_xmldom.getnodevalue(dbms_xmldom.getfirstchild(l_ch));
if t_tab(t_tab.last).partname is null then t_tab(t_tab.last).partname := l_partname; end if;
t_tab(t_tab.last).filename := lv_value;
t_tab.extend;
END LOOP;
END LOOP;
t_tab.delete(t_tab.last);
FOR cur_emp IN t_tab.first .. t_tab.last LOOP
if t_tab(cur_emp).partname is not null and t_tab(cur_emp).filename is not null then
INSERT INTO tab_software_parts
VALUES
(t_tab(cur_emp).partname, t_tab(cur_emp).filename);
end if;
END LOOP;
COMMIT;
dbms_xmldom.freeDocument(l_doc);
EXCEPTION
WHEN OTHERS THEN
dbms_lob.freetemporary(l_clob);
dbms_xmlparser.freeParser(l_parser);
dbms_xmldom.freeDocument(l_doc);
END;
<TWObject className="TWObject">
<array size="240">
<item>
<variable type="QuestionDetail">
<questionId type="String"><![CDATA[30]]></questionId>
<questionType type="questionType"><![CDATA[COUNTRY]]></questionType>
<country type="String"><![CDATA[GB]]></country>
<questionText type="String"><![CDATA[Please indicate]]></questionText>
<optionType type="String"><![CDATA[RadioButton]]></optionType>
<answerOptions type="String[]">
<item><![CDATA[Yes]]></item>
<item><![CDATA[No]]></item>
</answerOptions>
<ruleId type="String"><![CDATA[CRP_GB001]]></ruleId>
<parentQuestionId type="String"></parentQuestionId>
<parentQuestionResp type="String"></parentQuestionResp>
</variable>
</item>
<item>
<variable type="QuestionDetail">
<questionId type="String"><![CDATA[40]]></questionId>
<questionType type="questionType"><![CDATA[COUNTRY]]></questionType>
<country type="String"><![CDATA[DE]]></country>
<questionText type="String"><![CDATA[Please indicate]]></questionText>
<optionType type="String"><![CDATA[RadioButton]]></optionType>
<answerOptions type="String[]">
<item><![CDATA[Yes]]></item>
<item><![CDATA[No]]></item>
</answerOptions>
<ruleId type="String"><![CDATA[CRP_Q0001]]></ruleId>
<parentQuestionId type="String"></parentQuestionId>
<parentQuestionResp type="String"></parentQuestionResp>
</variable>
</item>
</array>
</TWObject>
Reposted as
Script to parse XML data into Oracle DB -
How to read a CSV file and Insert data into an Oracle Table
Hi All,
I have a Clob file as an IN parameter in my PROC. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
Please let me some suggestions on this.
Thanks,
Chandra R
jeneesh wrote:
And, please don't "hijack" a 5-year-old thread.. Better start a new one..
I've just split it off to a thread of its own. ;)
@OP,
I have a Clob file as an IN parameter in my PROC. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
You don't have a "Clob file", as there's no such thing. CLOB is a datatype for storing large character-based objects. A file is something on the operating system's filesystem.
So, why have you stored comma seperated data in a CLOB?
Where did this data come from? If it came from a file, why didn't you use SQL*Loader or, even better, External Tables to read and parse the data into structured format when populating the database with it?
If you really do have to parse a CLOB of data to pull out the comma seperated values, then you're going to have to write something yourself to do that, reading "lines" by looking for the newline character(s), and then breaking up the "lines" into the component data by looking for commas within it, using normal string functions such as INSTR and SUBSTR or, if necessary, REGEXP_INSTR and REGEXP_SUBSTR. If you have string data that contains commas but uses double quotes around the string, then you'll also have the added complexity of ignoring commas within such string data.
Like I say... it's much easier with SQL*Loader or External Tables, as these are designed to parse such CSV-type data. -
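The hand-rolled approach the reply describes (INSTR/SUBSTR over each line, with extra care for quoted strings) amounts to a small character walk; a sketch in JavaScript (the PL/SQL version would do the same walk over the string):

```javascript
// Split one CSV line into fields, treating commas inside double-quoted
// strings as data rather than separators, as the reply warns about.
function splitCsvLine(line) {
  var fields = [], current = "", inQuotes = false;
  for (var i = 0; i < line.length; i++) {
    var ch = line.charAt(i);
    if (ch === '"') {
      inQuotes = !inQuotes;            // toggle quoted state; quotes are dropped
    } else if (ch === "," && !inQuotes) {
      fields.push(current);            // an unquoted comma ends the field
      current = "";
    } else {
      current += ch;
    }
  }
  fields.push(current);
  return fields;
}

console.log(splitCsvLine('1,"Smith, John",42')); // [ '1', 'Smith, John', '42' ]
```

Escaped quotes ("") inside quoted fields would need one more rule; this only covers the case the reply mentions.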
Parse/seperate data from string
I have a string that is returned from Google GeoCoding:
{ "name": "193 Farrow Hill Road,Davisville,WV", "Status": { "code": 200, "request": "geocode" }, "Placemark": [ { "id": "p1", "address": "Davisville, WV, USA", "AddressDetails": { "Accuracy" : 4, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WV", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Davisville" }, "SubAdministrativeAreaName" : "Wood" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 39.2357655, "south": 39.1665921, "east": -81.4344257, "west": -81.5624851 } }, "Point": { "coordinates": [ -81.4984554, 39.2011873, 0 ] } } ] }
I want to be able to parse this data into variables that I can use and update a table.
I currently use the ListGetAt() function and try to find common delimiters to narrow down what I am trying to parse, but this process does not always return the results I need.
Does anyone know of any faster more robust way of parsing this string down to variables? And, it could be that I am not using the ListGetAt() to its full potential as well....
What I need most out of this string is shown below... you'll notice different string lengths depending on whether it recognizes the street address or not.
This is the string returned if it DOES NOT recognize the street address:
{ "name": "193 Farrow Hill Road,Davisville,WV", "Status": { "code": 200, "request": "geocode" }, "Placemark": [ { "id": "p1", "address": "Davisville, WV, USA", "AddressDetails": { "Accuracy" : 4, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WV", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Davisville" }, "SubAdministrativeAreaName" : "Wood" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 39.2357655, "south": 39.1665921, "east": -81.4344257, "west": -81.5624851 } }, "Point": { "coordinates": [ -81.4984554, 39.2011873, 0 ] } } ] }
This is the string returned if it DOES recognize the street address:
{ "name": "W7499 So. Mound Rd,Neillsville,WI", "Status": { "code": 200, "request": "geocode" }, "Placemark": [ { "id": "p1", "address": "S Mound Rd, Neillsville, WI 54456, USA", "AddressDetails": { "Accuracy" : 6, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WI", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Neillsville", "PostalCode" : { "PostalCodeNumber" : "54456" }, "Thoroughfare" : { "ThoroughfareName" : "S Mound Rd" } }, "SubAdministrativeAreaName" : "Clark" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 44.5921279, "south": 44.5858326, "east": -90.5981846, "west": -90.6802503 } }, "Point": { "coordinates": [ -90.6394276, 44.5889072, 0 ] } }, { "id": "p2", "address": "S Mound Rd, Neillsville, WI 54456, USA", "AddressDetails": { "Accuracy" : 6, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WI", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Neillsville", "PostalCode" : { "PostalCodeNumber" : "54456" }, "Thoroughfare" : { "ThoroughfareName" : "S Mound Rd" } }, "SubAdministrativeAreaName" : "Clark" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 44.5920678, "south": 44.5857725, "east": -90.6802503, "west": -90.7004168 } }, "Point": { "coordinates": [ -90.6899796, 44.5889209, 0 ] } } ] }
That is indeed JSON, as Jochem says. You can pick out data using structs and arrays. You will know which structure or array functionality to use after doing something like this:
<cfsavecontent variable="myJSON1">
{ "name": "193 Farrow Hill Road,Davisville,WV", "Status": { "code": 200, "request": "geocode" }, "Placemark": [ { "id": "p1", "address": "Davisville, WV, USA", "AddressDetails": { "Accuracy" : 4, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WV", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Davisville" }, "SubAdministrativeAreaName" : "Wood" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 39.2357655, "south": 39.1665921, "east": -81.4344257, "west": -81.5624851 } }, "Point": { "coordinates": [ -81.4984554, 39.2011873, 0 ] } } ] }
</cfsavecontent>
<cfsavecontent variable="myJSON2">
{ "name": "W7499 So. Mound Rd,Neillsville,WI", "Status": { "code": 200, "request": "geocode" }, "Placemark": [ { "id": "p1", "address": "S Mound Rd, Neillsville, WI 54456, USA", "AddressDetails": { "Accuracy" : 6, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WI", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Neillsville", "PostalCode" : { "PostalCodeNumber" : "54456" }, "Thoroughfare" : { "ThoroughfareName" : "S Mound Rd" } }, "SubAdministrativeAreaName" : "Clark" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 44.5921279, "south": 44.5858326, "east": -90.5981846, "west": -90.6802503 } }, "Point": { "coordinates": [ -90.6394276, 44.5889072, 0 ] } }, { "id": "p2", "address": "S Mound Rd, Neillsville, WI 54456, USA", "AddressDetails": { "Accuracy" : 6, "Country" : { "AdministrativeArea" : { "AdministrativeAreaName" : "WI", "SubAdministrativeArea" : { "Locality" : { "LocalityName" : "Neillsville", "PostalCode" : { "PostalCodeNumber" : "54456" }, "Thoroughfare" : { "ThoroughfareName" : "S Mound Rd" } }, "SubAdministrativeAreaName" : "Clark" } }, "CountryName" : "USA", "CountryNameCode" : "US" } }, "ExtendedData": { "LatLonBox": { "north": 44.5920678, "south": 44.5857725, "east": -90.6802503, "west": -90.7004168 } }, "Point": { "coordinates": [ -90.6899796, 44.5889209, 0 ] } } ] }
</cfsavecontent>
<cfdump var="#deserializeJSON(myJSON1)#">
<cfdump var="#deserializeJSON(myJSON2)#"> -
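Outside ColdFusion, the same extraction is one JSON parse plus property access; a sketch in JavaScript, using a trimmed version of the response above (only the payload is abbreviated; the structure mirrors the real strings):

```javascript
// Pull the accuracy and the point coordinates out of a Google GeoCoding
// response after parsing it as JSON.
var response = JSON.parse(
  '{ "Placemark": [ { "id": "p1",' +
  '  "AddressDetails": { "Accuracy": 4 },' +
  '  "Point": { "coordinates": [ -81.4984554, 39.2011873, 0 ] } } ] }'
);

var mark = response.Placemark[0];          // first (best) match
var lng = mark.Point.coordinates[0];       // GeoCoding orders longitude first
var lat = mark.Point.coordinates[1];
console.log(mark.AddressDetails.Accuracy, lat, lng); // 4 39.2011873 -81.4984554
```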
FTP/File Adapter - Error parsing empty date field
I have an FTP Adapter and I defined a native schema using the JDeveloper FTP Adapter wizard (CSV file). One of the fields is a date specified in this format: "M/d/yyyy". I have defined the corresponding element in the schema as follows:
<xsd:element name="DOB" type="xsd:date" nxsd:dateFormat="M/d/yyyy" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;"/>
This seems to work correctly and read the date into an xml date variable. However if the date is empty, it just can't figure out what to do and throws an exception saying that it can't parse an empty field.
Is there any attribute that I can add to indicate that the field should be ignored if empty or non-parseable?
I know that I can get the data in as a string and do the conversion myself later (which is not trivial either), but why complicate things? I can't believe that the creators of the FTP adapter did not accommodate empty fields.
Thanks, I appreciate any suggestions.
BTW, I tried adding the "nillable=true" attribute but it didn't work either.
Edited by: user10770892 on Jun 21, 2011 10:38 AM
Yatan,
If I change the nillable and minOccurs properties, BPEL does not work.
Is there any other way to handle blank lines?
Thanks -
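If the schema-level attributes don't help, the fallback the poster mentions (read the field in as a string and convert it later) is straightforward to make null-safe. A sketch of that conversion step (JavaScript for illustration only; in the actual composite this logic would live in a transform step):

```javascript
// Convert an M/d/yyyy string to a yyyy-MM-dd string, returning null for an
// empty or unparseable field instead of throwing, which is exactly what an
// empty CSV column needs.
function safeParseDate(s) {
  if (s === null || s.trim() === "") { return null; }
  var m = s.trim().match(/^(\d{1,2})\/(\d{1,2})\/(\d{4})$/);
  if (m === null) { return null; }
  var mm = ("0" + m[1]).slice(-2);   // zero-pad month
  var dd = ("0" + m[2]).slice(-2);   // zero-pad day
  return m[3] + "-" + mm + "-" + dd;
}

console.log(safeParseDate("6/21/2011")); // "2011-06-21"
console.log(safeParseDate(""));          // null
```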
Parsing the data from and xml type field
Hi - I have registered a schema and inserted a record into the table with the XMLType column. Now I want to parse the data from the XMLType field into a relational table. I have been using the following select statement to accomplish this, and it does work if there is data in all the selected fields; but when a field is null, the whole select statement fails and brings back 'no rows returned'. If the value is null, I want the select statement to return null. Please give any ideas.
SELECT version,frmd_transaction_date,extractValue(value(b), 'event_update/location')"location",
extractValue(value(b), 'event_update/sending_system')"sending_system",
extractValue(value(b), 'event_update/event_identifier')"event_identifier",
extractValue(value(b), 'event_update/event_link')"event_link",
extractValue(value(b), 'event_update/organization_code')"organization_code",
nvl(extractValue(value(c), '/schedule/event_duration_minutes'),'000')"event_minutes"
FROM fraamed_user.frmd_event_update , TABLE(xmlsequence(extract(xml_event_update, '/event_update')))b,
TABLE(xmlsequence(extract(xml_event_update, '/event_update/schedule')))c
...then I guess you have to rewrite the query.
Is schedule another xml sequence inside of event_update sequence ?
If it is not you can try this :
SELECT version,frmd_transaction_date,extractValue(value(b), '/event_update/location/text()')"location",
extractValue(value(b), '/event_update/sending_system/text()')"sending_system",
extractValue(value(b), '/event_update/event_identifier/text()')"event_identifier",
extractValue(value(b), '/event_update/event_link/text()')"event_link",
extractValue(value(b), '/event_update/organization_code/text()')"organization_code",
extractValue(value(b), '/event_update/schedule/event_duration_minutes/text()')"event_minutes"
FROM fraamed_user.frmd_event_update , TABLE(xmlsequence(extract(xml_event_update, '/event_update')))b
...if yes, did you try the nvl function? (I don't think this would solve the problem):
SELECT version, frmd_transaction_date,
nvl(extractValue(value(b), '/event_update/location/text()'), 'NULL VALUE') "location",
nvl(extractValue(value(b), '/event_update/sending_system/text()'), 'NULL VALUE') "sending_system",
nvl(extractValue(value(b), '/event_update/event_identifier/text()'), 'NULL VALUE') "event_identifier",
nvl(extractValue(value(b), '/event_update/event_link/text()'), 'NULL VALUE') "event_link",
nvl(extractValue(value(b), '/event_update/organization_code/text()'), 'NULL VALUE') "organization_code",
nvl(extractValue(value(c), '/schedule/event_duration_minutes/text()'), 'NULL VALUE') "event_minutes"
FROM fraamed_user.frmd_event_update , TABLE(xmlsequence(extract(xml_event_update, '/event_update')))b,
TABLE(xmlsequence(extract(xml_event_update, '/event_update/schedule')))c
If none of this works post your xml schema. -
Not able to populate correct data into fields in ALV report
hi experts,
question: from the delivery document number (likp-vbeln), go to the delivery items to get lips-matnr and lips-lgort.
TYPE-POOLS:SLIS.
TABLES: MARC,LIPS,LIKP,VBAK,VBAP,VBRP.
SELECT-OPTIONS:S_VKORG FOR LIKP-VKORG,
S_VBELN FOR LIKP-VBELN,
S_MATGR FOR MARC-MATGR,
S_AUART FOR VBAK-AUART.
DATA: BEGIN OF ITAB OCCURS 0 ,
MATGR LIKE MARC-MATGR,
MATNR LIKE LIPS-MATNR,
LGORT LIKE LIPS-LGORT,
WADAT_IST LIKE LIKP-WADAT_IST,
AUART LIKE VBAK-AUART,
WAVWR LIKE VBRP-WAVWR,
KWMENG LIKE VBAP-KWMENG,
VBELN LIKE LIKP-VBELN,
VBELN_VA LIKE VBAK-VBELN,
LGORT2 TYPE LIPS-LGORT,
END OF ITAB.
DATA: BEGIN OF JTAB OCCURS 0,
VBELN LIKE VBAK-VBELN,
END OF JTAB.
DATA: I_FIELDCAT TYPE SLIS_T_FIELDCAT_ALV WITH HEADER LINE,
I_EVENTCAT TYPE SLIS_T_EVENT WITH HEADER LINE.
START-OF-SELECTION.
SELECT A~MATNR A~LGORT INTO CORRESPONDING FIELDS OF TABLE ITAB
FROM LIPS AS A INNER JOIN LIKP AS B ON B~VBELN EQ A~VBELN
WHERE B~VBELN IN S_VBELN AND B~VKORG IN S_VKORG.
*I_FIELDCAT-COL_POS = 1.
*I_FIELDCAT-FIELDNAME = 'VBELN'.
*I_FIELDCAT-TABNAME = 'ITAB'.
*APPEND I_FIELDCAT TO I_FIELDCAT.
*CLEAR I_FIELDCAT.
I_FIELDCAT-COL_POS = 1.
I_FIELDCAT-FIELDNAME = 'MATNR'.
*I_FIELDCAT-TABNAME = 'ITAB'.
I_FIELDCAT-REF_FIELDNAME = 'MATNR'.
I_FIELDCAT-REF_TABNAME = 'LIPS'.
APPEND I_FIELDCAT .
*CLEAR I_FIELDCAT.
I_FIELDCAT-COL_POS = 2.
I_FIELDCAT-FIELDNAME = 'LGORT'.
*I_FIELDCAT-TABNAME = 'ITAB'.
I_FIELDCAT-REF_TABNAME = 'LIPS'.
APPEND I_FIELDCAT .
*CLEAR I_FIELDCAT.
CALL FUNCTION 'REUSE_ALV_LIST_DISPLAY'
EXPORTING
* I_INTERFACE_CHECK = ' '
* I_BYPASSING_BUFFER =
* I_BUFFER_ACTIVE = ' '
I_CALLBACK_PROGRAM = SY-REPID
* I_CALLBACK_PF_STATUS_SET = ' '
* I_CALLBACK_USER_COMMAND = ' '
* I_STRUCTURE_NAME =
* IS_LAYOUT =
IT_FIELDCAT = I_FIELDCAT[]
* IT_EXCLUDING =
* IT_SPECIAL_GROUPS =
* IT_SORT =
* IT_FILTER =
* IS_SEL_HIDE =
I_DEFAULT = 'X'
I_SAVE = ' '
* IS_VARIANT =
IT_EVENTS = I_EVENTCAT[]
* IT_EVENT_EXIT =
* IS_PRINT =
* IS_REPREP_ID =
* I_SCREEN_START_COLUMN = 0
* I_SCREEN_START_LINE = 0
* I_SCREEN_END_COLUMN = 0
* I_SCREEN_END_LINE = 0
* IMPORTING
* E_EXIT_CAUSED_BY_CALLER =
* ES_EXIT_CAUSED_BY_USER =
TABLES
T_OUTTAB = ITAB
EXCEPTIONS
PROGRAM_ERROR = 1
OTHERS = 2.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
I am unable to populate correct data into the fields; I am getting MATNR values as all 1's
and LGORT as empty.
Can anyone help me out?
Thanks.
Hi there
This is what i found on the Forum
"You can use FM RS_VARIANT_CONTENTS to display ONE variant associated to a report
If you want to see the information of ALL VARIANTS associated to a report I think the only way is checking table VARID to get all the variants associated to the report and then do a loop and call RS_VARIANT_CONTENTS for each variant."
Regards
Tatenda -
Download the KTOPL field data and GLT0 table data into one Internal table
Hi,
I have downloaded the GLT0 table field data to a PC file, but I also need to download the KTOPL (Chart of Accounts) data. The GLT0 table has no KTOPL field, but the SKA1 table does. The issue is that the GLT0 data and the KTOPL field data need to be downloaded into one internal table.
Could anybody please solve this problem? I need to solve it immediately.
Below is the code.
REPORT ZFXXEABL_1 NO STANDARD PAGE HEADING
LINE-SIZE 200.
* Tables Declaration
TABLES : GLT0.
* Data Declaration
DATA : FP(8) TYPE C,
YEAR LIKE GLT0-RYEAR,
PERIOD(3) TYPE C,
DBALANCE LIKE VBAP-NETWR VALUE 0 ,
CBALANCE LIKE VBAP-NETWR VALUE 0.
*Internal table for for final data..
DATA : BEGIN OF REC1 OCCURS 0,
BAL LIKE GLT0-TSLVT value 0,
COAREA LIKE GLT0-RBUSA,
CA(4) TYPE C,
KTOPL LIKE ska1-ktopl,
CCODE LIKE GLT0-BUKRS,
CREDIT LIKE VBAP-NETWR,
CURRENCY LIKE GLT0-RTCUR,
CURTYPE(2) TYPE N,
DEBIT LIKE VBAP-NETWR,
YEAR(8) TYPE C,
FY(2) TYPE C,
ACCOUNT LIKE GLT0-RACCT,
VER LIKE GLT0-RVERS,
VTYPE(2) TYPE N,
CLNT LIKE SY-MANDT,
S_SYS(3) TYPE C,
INDICATOR LIKE GLT0-DRCRK,
END OF REC1.
DATA : C(2) TYPE N,
D(2) TYPE N.
DATA REC1_H LIKE REC1.
* Variable declarations
DATA :
W_FILES(4) TYPE N,
W_DEBIT LIKE GLT0-TSLVT,
W_CREDIT LIKE GLT0-TSLVT,
W_PCFILE LIKE RLGRAP-FILENAME ,
W_UNIXFILE LIKE RLGRAP-FILENAME,
W_PCFILE1 LIKE RLGRAP-FILENAME,
W_UNIXFIL1 LIKE RLGRAP-FILENAME,
W_EXT(3) TYPE C,
W_UEXT(3) TYPE C,
W_PATH LIKE RLGRAP-FILENAME,
W_UPATH LIKE RLGRAP-FILENAME,
W_FIRST(1) TYPE C VALUE 'Y',
W_CFIRST(1) TYPE C VALUE 'Y',
W_PCFIL LIKE RLGRAP-FILENAME.
DATA: "REC LIKE GLT0 OCCURS 0 WITH HEADER LINE,
T_TEMP LIKE GLT0 OCCURS 0 WITH HEADER LINE.
DATA: BEGIN OF REC3 OCCURS 0.
INCLUDE STRUCTURE GLT0.
DATA: KTOPL LIKE SKA1-KTOPL,
END OF REC3.
DATA: BEGIN OF T_KTOPL OCCURS 0,
KTOPL LIKE SKA1-KTOPL,
SAKNR LIKE SKA1-SAKNR,
END OF T_KTOPL.
* Download data.
DATA: BEGIN OF I_REC2 OCCURS 0,
BAL(17), " like GLT0-TSLVT value 0,
COAREA(4), " like glt0-rbusa,
CA(4), " chart of accounts
CCODE(4), " like glt0-bukrs,
CREDIT(17), " like vbap-netwr,
CURRENCY(5), " like glt0-rtcur,
CURTYPE(2), " type n,
DEBIT(17), " like vbap-netwr,
YEAR(8), " type c,
FY(2), " type c, fiscal yr variant
ACCOUNT(10), " like glt0-racct,
VER(3), " like glt0-rvers,
VTYPE(3), " type n,
CLNT(3), "like sy-mandt,
S_SYS(3), "like sy-sysid,
INDICATOR(1), " like glt0-drcrk,
END OF I_REC2.
* Selection screen.
SELECTION-SCREEN BEGIN OF BLOCK BL1 WITH FRAME TITLE TEXT-BL1.
SELECT-OPTIONS : COMPCODE FOR GLT0-BUKRS,
GLACC FOR GLT0-RACCT,
FISYEAR FOR GLT0-RYEAR,
no intervals no-extension, "- BG6661-070212
FISCPER FOR GLT0-RPMAX,
busarea for glt0-rbusa,
CURRENCY FOR GLT0-RTCUR.
SELECTION-SCREEN END OF BLOCK BL1.
SELECTION-SCREEN BEGIN OF BLOCK BL2 WITH FRAME TITLE TEXT-BL2.
PARAMETERS:
P_UNIX AS CHECKBOX, "Check box for Unix Option
P_UNFIL LIKE RLGRAP-FILENAME, " Unix file Dnload file name
default '/var/opt/arch/extract/GLT0.ASC', "- BG6661-070212
P_PCFILE AS CHECKBOX, "Check box for Local PC download.
P_PCFIL LIKE RLGRAP-FILENAME " PC file Dnload file name
default 'C:\GLT0.ASC'. "- BG6661-070212
DEFAULT 'C:\glt0_gl_balance_all.asc'. "+ BG6661-070212
SELECTION-SCREEN END OF BLOCK BL2.
*eject
* Initialization *
INITIALIZATION.
* Try to default download filename
p_pcfil = c_pcfile.
p_unfil = c_unixfile.
if sy-sysid eq c_n01.
p_unfil = c_unixfile.
endif.
if sy-sysid eq c_g21.
p_unfil = c_g21_unixfile.
endif.
if sy-sysid eq c_g9d.
p_unfil = c_g9d_unixfile.
endif.
* Default for download filename
*{ Begin of BG6661-070212
CONCATENATE C_UNIXFILE
SY-SYSID C_FSLASH C_CHRON C_FILENAME INTO P_UNFIL.
*} End of BG6661-070212
AT SELECTION-SCREEN OUTPUT.
loop at screen.
if screen-name = 'P_PCFIL'. "PC FILE
screen-input = '0'.
modify screen.
endif.
if screen-name = 'P_UNFIL'. "UN FILE
screen-input = '0'.
modify screen.
endif.
endloop.
if w_first = 'Y'.
perform path_file.
w_first = 'N'.
endif.
if w_cfirst = 'Y'.
perform cpath_file.
w_cfirst = 'N'.
endif.
* Start-of-Selection *
START-OF-SELECTION.
*COLLECT DATA
PERFORM COLLECT_DATA.
*BUILD FILENAMES
PERFORM BUILD_FILES.
*LOCAL
IF P_PCFILE = C_YES.
PERFORM LOCAL_DOWNLOAD.
ENDIF.
*UNIX
IF P_UNIX = C_YES.
PERFORM UNIX_DOWNLOAD.
ENDIF.
IF P_PCFILE IS INITIAL AND P_UNIX IS INITIAL.
MESSAGE I000(ZL) WITH 'Both download flags are unchecked'.
ENDIF.
END-OF-SELECTION.
IF P_PCFILE = C_YES.
WRITE :/ 'PC File' , C_UNDER, P_PCFIL.
ENDIF.
*& Form DOWNLOAD
* Download *
FORM DOWNLOAD.
P_PCFIL = W_PATH.
DATA LIN TYPE I.
DESCRIBE TABLE I_REC2 LINES LIN.
WRITE:/ 'No of Records downloaded = ',LIN.
CALL FUNCTION 'WS_DOWNLOAD'
EXPORTING
FILENAME = P_PCFIL
FILETYPE = C_ASC "c_dat "dat
TABLES
DATA_TAB = I_REC2 " t_str
fieldnames = t_strhd
EXCEPTIONS
FILE_OPEN_ERROR = 1
FILE_WRITE_ERROR = 2
INVALID_FILESIZE = 3
INVALID_TABLE_WIDTH = 4
INVALID_TYPE = 5
NO_BATCH = 6
UNKNOWN_ERROR = 7
OTHERS = 8.
IF SY-SUBRC NE 0.
  WRITE: / 'Error in WS_DOWNLOAD, SY-SUBRC =', SY-SUBRC.
ENDIF.
ENDFORM.
*& Form WRITE_TO_SERVER
*       text
*  -->  p1        text
*  <--  p2        text
FORM WRITE_TO_SERVER.
DATA : L_MSG(100) TYPE C,
L_LINE(5000) TYPE C.
P_UNFIL = W_UPATH.
DATA LIN TYPE I.
DESCRIBE TABLE I_REC2 LINES LIN.
WRITE:/ 'No of Records downloaded = ',LIN.
OPEN DATASET P_UNFIL FOR OUTPUT IN TEXT MODE MESSAGE L_MSG.
IF SY-SUBRC <> 0.
  WRITE: / 'Error opening dataset:', L_MSG.
  EXIT.
ENDIF.
perform header_text1.
LOOP AT I_REC2.
TRANSFER I_REC2 TO P_UNFIL.
ENDLOOP.
CLOSE DATASET P_UNFIL.
WRITE : / C_TEXT , W_UPATH.
SPLIT W_UNIXFILE AT C_DOT INTO W_UNIXFIL1 W_UEXT.
CLEAR W_UPATH.
IF NOT W_UEXT IS INITIAL.
CONCATENATE W_UNIXFIL1 C_DOT W_UEXT INTO W_UPATH.
ELSE.
W_UEXT = C_ASC. " c_csv.
CONCATENATE W_UNIXFIL1 C_DOT W_UEXT INTO W_UPATH.
ENDIF.
ENDFORM. " WRITE_TO_SERVER
*& Form BUILD_FILES
FORM BUILD_FILES.
IF P_PCFILE = C_YES.
W_PCFILE = P_PCFIL.
***Split path at dot**
SPLIT W_PCFILE AT C_DOT INTO W_PCFILE1 W_EXT.
IF NOT W_EXT IS INITIAL.
CONCATENATE W_PCFILE1 C_DOT W_EXT INTO W_PATH.
ELSE.
W_PATH = W_PCFILE1.
ENDIF.
ENDIF.
IF P_UNIX = C_YES.
W_UNIXFILE = P_UNFIL.
SPLIT W_UNIXFILE AT C_DOT INTO W_UNIXFIL1 W_UEXT.
IF NOT W_UEXT IS INITIAL.
CONCATENATE W_UNIXFIL1 C_DOT W_UEXT INTO W_UPATH.
ELSE.
W_UPATH = W_UNIXFIL1.
ENDIF.
ENDIF.
ENDFORM.
FORM CPATH_FILE.
CLEAR P_PCFIL.
CONCATENATE C_PCFILE
C_COMFILE SY-SYSID C_UNDER SY-DATUM SY-UZEIT
C_DOT C_ASC INTO P_PCFIL.
ENDFORM. " CPATH_FILE
FORM PATH_FILE.
CLEAR P_UNFIL.
if sy-sysid eq c_n01.
CONCATENATE C_UNIXFILE
C_COMFILE SY-SYSID C_UNDER SY-DATUM SY-UZEIT
C_DOT C_ASC INTO P_UNFIL.
endif.
if sy-sysid eq c_g21.
concatenate c_g21_unixfile
c_comfile sy-sysid c_under sy-datum sy-uzeit
c_dot c_asc into p_unfil.
endif.
if sy-sysid eq c_g9d.
concatenate c_g9d_unixfile
c_comfile sy-sysid c_under sy-datum sy-uzeit
c_dot c_asc into p_unfil.
endif.
ENDFORM. " PATH_FILE
*& Form LOCAL_DOWNLOAD
* Local download *
FORM LOCAL_DOWNLOAD.
perform header_text.
LOOP AT REC1.
REC1-CLNT = SY-MANDT.
REC1-S_SYS = SY-SYSID.
MOVE: REC1-BAL TO I_REC2-BAL,
REC1-COAREA TO I_REC2-COAREA,
REC1-CA TO I_REC2-CA,
REC1-KTOPL TO I_REC2-CA,
REC1-CCODE TO I_REC2-CCODE,
REC1-CREDIT TO I_REC2-CREDIT,
REC1-CURRENCY TO I_REC2-CURRENCY,
REC1-CURTYPE TO I_REC2-CURTYPE,
REC1-DEBIT TO I_REC2-DEBIT,
REC1-YEAR TO I_REC2-YEAR,
REC1-FY TO I_REC2-FY,
REC1-ACCOUNT TO I_REC2-ACCOUNT,
REC1-VER TO I_REC2-VER,
REC1-VTYPE TO I_REC2-VTYPE,
REC1-CLNT TO I_REC2-CLNT,
REC1-S_SYS TO I_REC2-S_SYS,
REC1-INDICATOR TO I_REC2-INDICATOR.
APPEND I_REC2.
CLEAR I_REC2.
ENDLOOP.
IF NOT I_REC2[] IS INITIAL.
PERFORM DOWNLOAD .
CLEAR I_REC2.
REFRESH I_REC2.
ELSE.
WRITE : / 'No records exist due to unavailability of data'.
ENDIF.
ENDFORM. " LOCAL_DOWNLOAD
*& Form UNIX_DOWNLOAD
FORM UNIX_DOWNLOAD.
LOOP AT REC1.
REC1-CLNT = SY-MANDT.
REC1-S_SYS = SY-SYSID.
MOVE: REC1-BAL TO I_REC2-BAL,
REC1-COAREA TO I_REC2-COAREA,
REC1-CA TO I_REC2-CA,
REC1-KTOPL TO I_REC2-CA,
REC1-CCODE TO I_REC2-CCODE,
REC1-CREDIT TO I_REC2-CREDIT,
REC1-CURRENCY TO I_REC2-CURRENCY,
REC1-CURTYPE TO I_REC2-CURTYPE,
REC1-DEBIT TO I_REC2-DEBIT,
REC1-YEAR TO I_REC2-YEAR,
REC1-FY TO I_REC2-FY,
REC1-ACCOUNT TO I_REC2-ACCOUNT,
REC1-VER TO I_REC2-VER,
REC1-VTYPE TO I_REC2-VTYPE,
SY-MANDT TO I_REC2-CLNT,
SY-SYSID TO I_REC2-S_SYS,
REC1-INDICATOR TO I_REC2-INDICATOR.
APPEND I_REC2.
CLEAR I_REC2.
ENDLOOP.
IF NOT I_REC2[] IS INITIAL.
PERFORM WRITE_TO_SERVER.
CLEAR I_REC2.
REFRESH I_REC2.
ELSE.
WRITE : / 'No records exist due to unavailability of data'.
ENDIF.
ENDFORM. " UNIX_DOWNLOAD
*& Form HEADER_TEXT
*       text
*  -->  p1        text
*  <--  p2        text
FORM HEADER_TEXT.
  CONCATENATE C_BAL C_BA C_CA C_CC C_CREDIT C_CURRENCY C_CURTYPE
              C_DEBIT C_FISYEAR C_FISVAR C_ACCT C_VER C_VTYPE C_INDICATOR
         INTO T_STRHD
         SEPARATED BY C_COMMA.
  APPEND T_STRHD.
ENDFORM.                    " HEADER_TEXT
*& Form HEADER_TEXT1
*       text
FORM HEADER_TEXT1.
  CONCATENATE C_BAL C_BA C_CA C_CC C_CREDIT C_CURRENCY C_CURTYPE
              C_DEBIT C_FISYEAR C_FISVAR C_ACCT C_VER C_VTYPE C_INDICATOR
         INTO T_STRHD1
         SEPARATED BY C_COMMA.
  APPEND T_STRHD1.
  TRANSFER T_STRHD1 TO P_UNFIL.
ENDFORM.                    " HEADER_TEXT1
*& Form COLLECT_DATA
* Collect Data *
FORM COLLECT_DATA.
SELECT * FROM GLT0 INTO TABLE REC3
WHERE BUKRS IN COMPCODE
AND RYEAR IN FISYEAR
AND RPMAX IN FISCPER
AND RACCT IN GLACC
AND RTCUR IN CURRENCY.
* T_KTOPL has two fields, so both must be selected
SELECT KTOPL SAKNR FROM SKA1
       INTO TABLE T_KTOPL
       FOR ALL ENTRIES IN REC3
       WHERE SAKNR = REC3-RACCT.
SORT T_KTOPL BY SAKNR.
LOOP AT REC3 .
  SELECT * FROM GLT0
           INTO TABLE T_TEMP
           WHERE RLDNR = REC3-RLDNR
             AND RRCTY = REC3-RRCTY
             AND RVERS = REC3-RVERS
             AND BUKRS = REC3-BUKRS
             AND RYEAR = REC3-RYEAR
             AND RACCT = REC3-RACCT
             AND RBUSA = REC3-RBUSA
             AND RTCUR <> 'ZAR'
             AND RPMAX = REC3-RPMAX.
  IF SY-SUBRC = 0.
    REC1-BAL = '0.00'.
  ELSE.
    REC1-BAL = REC3-HSLVT.
  ENDIF.
  READ TABLE T_KTOPL WITH KEY SAKNR = REC3-RACCT.
  IF SY-SUBRC = 0.
    MOVE T_KTOPL-KTOPL TO REC3-KTOPL.
  ENDIF.
CLEAR: CBALANCE, DBALANCE.
REC1-BAL = REC3-HSLVT.
  IF REC3-DRCRK = 'S'.
    IF REC3-HSLVT NE C_ZERO.
      YEAR   = REC3-RYEAR.
      PERIOD = '000'.
      CONCATENATE PERIOD C_DOT YEAR INTO FP.
      REC1-INDICATOR = REC3-DRCRK.
      REC1-DEBIT     = C_ZERO.
      REC1-CREDIT    = C_ZERO.
      REC1-CCODE     = REC3-BUKRS.
      REC1-YEAR      = FP.
      REC1-CURRENCY  = REC3-RTCUR.
      REC1-ACCOUNT   = REC3-RACCT.
      REC1-BAL       = REC3-HSLVT.
      DBALANCE       = REC1-BAL.
      REC1-CURTYPE   = C_CTYPE.
      REC1-FY        = C_FY.
      REC1-COAREA    = REC3-RBUSA.
      REC1-VER       = REC3-RVERS.
      REC1-VTYPE     = C_CTYPE.
      REC1-CA        = C_CHART.
      APPEND REC1.
      C = 0.
      PERFORM D.
ENDIF.
IF REC3-HSL01 NE C_ZERO.
YEAR = REC3-RYEAR.
PERIOD = '001'.
CONCATENATE PERIOD C_DOT YEAR INTO FP.
REC1-INDICATOR = REC3-DRCRK.
REC1-DEBIT = REC3-HSL01 .
REC1-CCODE = REC3-BUKRS.
REC1-YEAR = FP.
REC1-CURRENCY = REC3-RTCUR.
REC1-ACCOUNT = REC3-RACCT.
rec1-bal = REC3-hsl01 + dbalance.
dbalance = rec1-bal.
REC1-CURTYPE = C_CTYPE.
REC1-FY = C_FY.
REC1-COAREA = REC3-RBUSA.
REC1-VER = REC3-RVERS.
REC1-VTYPE = C_CTYPE.
REC1-CA = C_CHART.
REC1-KTOPL = REC3-KTOPL.
APPEND REC1.
C = 1.
PERFORM D.
ENDIF.
IF REC3-HSL02 NE C_ZERO.
YEAR = REC3-RYEAR.
PERIOD = '002'.
CONCATENATE PERIOD C_DOT YEAR INTO FP.
REC1-INDICATOR = REC3-DRCRK.
REC1-DEBIT = REC3-HSL02.
REC1-CCODE = REC3-BUKRS.
REC1-YEAR = FP.
REC1-CURRENCY = REC3-RTCUR.
REC1-ACCOUNT = REC3-RACCT.
rec1-bal = REC3-hsl02 + dbalance.
dbalance = rec1-bal.
REC1-CURTYPE = C_CTYPE.
REC1-FY = C_FY.
REC1-COAREA = REC3-RBUSA.
REC1-VER = REC3-RVERS.
REC1-VTYPE = C_CTYPE.
REC1-CA = C_CHART. "-BF7957-070503
REC1-KTOPL = REC3-KTOPL. "+BF7957-070503
APPEND REC1.
C = 2.
PERFORM D.
ENDIF.
IF REC3-HSL03 NE C_ZERO.
YEAR = REC3-RYEAR.
PERIOD = '003'.
CONCATENATE PERIOD C_DOT YEAR INTO FP.
REC1-INDICATOR = REC3-DRCRK.
REC1-DEBIT = REC3-HSL03.
REC1-CCODE = REC3-BUKRS.
REC1-YEAR = FP.
REC1-CURRENCY = REC3-RTCUR.
REC1-ACCOUNT = REC3-RACCT.
rec1-bal = REC3-hsl03 + dbalance .
dbalance = rec1-bal.
REC1-CURTYPE = C_CTYPE.
REC1-FY = C_FY.
REC1-COAREA = REC3-RBUSA.
REC1-VER = REC3-RVERS.
REC1-VTYPE = C_CTYPE.
REC1-CA = C_CHART. "-BF7957-070503
REC1-KTOPL = REC3-KTOPL. "+BF7957-070503
APPEND REC1.
C = 3.
PERFORM D.
ENDIF.
IF REC3-HSL04 NE C_ZERO.
YEAR = REC3-RYEAR.
PERIOD = '004'.
CONCATENATE PERIOD C_DOT YEAR INTO FP.
REC1-INDICATOR = REC3-DRCRK.
REC1-DEBIT = REC3-HSL04.
REC1-CCODE = REC3-BUKRS.
REC1-YEAR = FP.
REC1-CURRENCY = REC3-RTCUR.
REC1-ACCOUNT = REC3-RACCT.
rec1-bal = REC3-hsl04 + dbalance .
REC1-CURTYPE = C_CTYPE.
REC1-FY = C_FY.
REC1-COAREA = REC3-RBUSA.
REC1-VER = REC3-RVERS.
REC1-VTYPE = C_CTYPE.
REC1-CA = C_CHART. "-BF7957-070503
REC1-KTOPL = REC3-KTOPL. "+BF7957-070503
APPEND REC1.
dbalance = rec1-bal.
C = 4.
PERFORM D.
ENDIF.
Thanks and Regards,
Ramu
-
Use logical database SDF, nodes SKA1 and SKC1C.
A.
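The logical-database suggestion above can be sketched roughly as follows. This is a minimal, untested sketch: logical database SDF must be assigned in the report attributes (not in code), and the availability and exact fields of its nodes depend on your release; the report name and WRITE fields here are illustrative assumptions.

```abap
* Sketch only: assumes logical database SDF is entered in the report
* attributes and exposes node SKA1 (G/L account, chart-of-accounts
* segment). SKC1C would be declared and handled the same way if SDF
* provides it in your release.
REPORT zska1_via_sdf.

NODES: ska1.

* One GET event fires per SKA1 record delivered by the logical database
GET ska1.
  WRITE: / ska1-ktopl, ska1-saknr.
```

The logical database then supplies its own selection screen and authorization checks, so the hand-written SELECTs on the master-data tables can be dropped.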