Unwanted data row in file
I've created two VIs that write a header and then write data under it. Both VIs use the same code to collect the data; the difference is in how the headers are created and how the channels are named in the Choose Channel VI. The Choose Channel VI produces an unwanted row of data under the header, which appears after the header information is written. I paused the program right after the header information was written and the row wasn't there yet. Is there something about naming channels that may cause this?
Thanks for the help.
Attachments:
Unwanted Row.xls 1 KB
Choose Channel1.vi 164 KB
Smartman DAQmx v21.vi 95 KB
This looks like a simple timing issue. If you force the wait to occur sequentially it should eliminate the problem:
Message Edited by smercurio_fc on 05-29-2008 03:54 PM
Attachments:
Choose Channel1_BD.png 3 KB
Similar Messages
-
Need to remove unwanted data in log file
Hi experts,
How do I identify and remove the unwanted data from the log file? Please help me.
Regards,
Pugazh

brarchive takes a backup of all offline redo logs, then deletes them. This is done at the OS level.
if you do not need them and you do not need archiving, a permanent solution is to disable archiving:
stop SAP first, then:
sqlplus / as sysdba
shutdown immediate;
startup mount
alter database noarchivelog;
alter database open;
exit
After this, you will never be able to do a point-in-time recovery, and you won't be able to start online backups. -
Writing large xmltype data to UTL_FILE and setting max row per file
Hey Gurus,
I am trying to create a procedure (in Oracle 9i) that writes my XML data out to several XML files (the output would probably be too large for one XML file; I am doing this for 270,000 rows of data), with a maximum of 1,000 rows per file. I know this needs a looping construct, but I am just not adept enough in PL/SQL to figure it out at the moment.
So essentially there would be some sort of loop construct and substring process that creates a file after looping through 1,000 rows, then continues the count and creates another file, until all 270 XML files are created. Simple enough, right? Well, I've tried doing this and haven't gotten anywhere; my PL/SQL coding skills are still elementary. I've only been doing this for about three months and could use the assistance of someone more experienced.
Here are the particulars...
This is the xmltype view code that I used to create the xml data.
select XMLELEMENT("macess_exp_import_export_file",
XMLELEMENT("file_header",
XMLELEMENT("file_information")),
XMLELEMENT("items",
XMLELEMENT("documents",
(SELECT XMLAGG(XMLELEMENT("document",
XMLELEMENT("external_reference"),
XMLELEMENT("processing_instructions",
XMLELEMENT("update", name)),
XMLELEMENT("filing_instructions",
XMLELEMENT("folder_ids",
XMLELEMENT("folder",
XMLATTRIBUTES(folder_id AS "id", folder_type_id AS "folder_type_id")))),
XMLELEMENT("document_header",
XMLELEMENT("document_type",
XMLATTRIBUTES(document_type AS "id")),
XMLELEMENT("document_id", document_id),
XMLELEMENT("document_description", document_description),
XMLELEMENT("document_date",
XMLATTRIBUTES(name AS "name"), document_date),
XMLELEMENT("document_properties")),
XMLELEMENT("document_data",
XMLELEMENT("internal_file",
XMLELEMENT("document_file_path", document_file_path),
XMLELEMENT("document_extension", document_extension)
)))) from macess_import_base WHERE rownum < 270000)))) AS result
from macess_import_base WHERE rownum < 270000;
This is the Macess_Import_Base table that I am creating xml data from
create table MACESS_IMPORT_BASE (
MACESS_EXP_IMPORT_EXPORT_FILE VARCHAR2(100),
FILE_HEADER VARCHAR2(20),
ITEMS VARCHAR2(20),
DOCUMENTS VARCHAR2(20),
DOCUMENT VARCHAR2(20),
EXTERNAL_REFERENCE VARCHAR2(20),
PROCESSING_INSTRUCTIONS VARCHAR2(20),
PATENT VARCHAR2(20),
FILING_INSTRUCTIONS VARCHAR2(20),
FOLDER_IDS VARCHAR2(20),
FOLDER_ID VARCHAR2(20),
FOLDER_TYPE_ID NUMBER(20),
DOCUMENT_HEADER VARCHAR2(20),
DOCUMENT_PROPERTIES VARCHAR2(20),
DOCUMENT_DATA VARCHAR2(20),
INTERNAL_FILE VARCHAR2(20),
NAME VARCHAR2(20),
DOCUMENT_TYPE VARCHAR2(40),
DOCUMENT_ID VARCHAR2(64),
DOCUMENT_DESCRIPTION VARCHAR2(200),
DOCUMENT_DATE VARCHAR2(100),
DOCUMENT_FILE_PATH VARCHAR2(200),
DOCUMENT_EXTENSION VARCHAR2(200)
);
Directory name to write output to "DIR_PATH"
DIRECTORY PATH is "\app\cdg\cov"
Regards,
Chris

I also would like to use UTL_FILE to achieve this functionality in the procedure.
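The looping construct described above (write at most 1,000 rows, then start a new file) can be sketched independently of the XML details. This is only an illustration in Java of the chunking logic, not the PL/SQL/UTL_FILE implementation the poster needs; the class and method names are hypothetical, and the "files" are collected in memory for clarity:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkWriter {
    // Split a list of row strings into chunks of at most maxRows rows,
    // one chunk per output file. 270,000 rows at 1,000 per file -> 270 chunks.
    static List<List<String>> chunk(List<String> rows, int maxRows) {
        List<List<String>> files = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += maxRows) {
            // copy the slice [i, i + maxRows) — the last chunk may be shorter
            files.add(new ArrayList<>(rows.subList(i, Math.min(i + maxRows, rows.size()))));
        }
        return files;
    }
}
```

In PL/SQL the same shape would be a counter that triggers `UTL_FILE.FCLOSE`/`UTL_FILE.FOPEN` on a new file name every 1,000 rows.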
-
Hi guru,
I want to upload data from an Excel file for MM02. First of all, please help me with how to upload data from Excel.
I have used the FM ALSM_EXCEL_TO_INTERNAL_TABLE, but it is not working; it uploads garbage values. Please tell me how to use it correctly.
Help me on this matter.

Check the example below.
parameters : p_file LIKE rlgrap-filename OBLIGATORY.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
PERFORM get_file_name.
PERFORM get_file_to_excel.
*& Form get_file_name
FORM get_file_name.
DATA: lv_name LIKE sy-repid.
lv_name = sy-repid.
CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
EXPORTING
program_name = lv_name
dynpro_number = syst-dynnr
static = 'X'
CHANGING
file_name = p_file.
ENDFORM. " get_file_name
*& Form get_file_to_excel
FORM get_file_to_excel.
DATA: idata LIKE alsmex_tabline OCCURS 0 WITH HEADER LINE.
* "f" was not declared in the original post; a minimal declaration:
DATA: BEGIN OF f OCCURS 0,
        belnr LIKE bkpf-belnr,
      END OF f.
CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
EXPORTING
filename = p_file
i_begin_col = '1'
i_begin_row = '2' "Do not require headings
i_end_col = '2'
i_end_row = '60000'
TABLES
intern = idata
EXCEPTIONS
inconsistent_parameters = 1
upload_ole = 2
OTHERS = 3.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
STOP.
ENDIF.
* Get first row retrieved
READ TABLE idata INDEX 1.
* Set first row retrieved to current row
DATA: gd_currentrow TYPE i.
gd_currentrow = idata-row.
LOOP AT idata.
* Reset values for next row
IF idata-row NE gd_currentrow.
APPEND f. CLEAR f.
gd_currentrow = idata-row.
ENDIF.
CASE idata-col.
WHEN '0001'.
f-belnr = idata-value.
ENDCASE.
ENDLOOP.
APPEND f. CLEAR f. -
How to read data from multiple files and append in columns in one file
Hi Guys,
I have a problem in appending data from files in different columns. I have attachement has file A and B which I am reading and not able to get data as in file Result.txt. Please comment on how can I do this
Solved!
Go to Solution.
Attachments:
Write to file.vi 13 KB
A.txt.txt 1 KB
B.txt.txt 1 KB

You cannot simply append columns to an existing file. Since the data is arranged line-by-line as one long linear string in the file, you can only append rows. A new column needs to be interlaced into the original file, shifting everything else. If you want to append columns, you need to build the entire output structure in memory and then write it all at once.
(I also don't think you need to set the file positions, it will be remembered from the last write operation.)
Unless the files are gigantic, here's what I would do:
(Also note that some of your rows have an extra tab at the end. If this is normal, you need a little more code to strip out empty columns. I've included cleaned-up files in the attachment. I also would not call them A.txt.txt etc.; a plain A.txt is sufficient.)
EDIT: It seems Dennis's solution is similar.)
LabVIEW Champion . Do more with less code and in less time .
Attachments:
Write to fileMOD.zip 6 KB
MergeColumns.png 6 KB -
Can't get data from text file to print into JTable
Instead of using JDBC, I am using a text file as a database. I can't get data from the text file to print into a JTable when I click the Find button. The goal is to find a record and print that record only, but for now I am trying to print all the records; once I get that working I will change my code to search for the desired record and print it. When I click the Find button nothing happens. Can you please take a look at my code, the dbTest() method? Thanks.
void dbTest() {
    BufferedReader dis = null;
    String dbRecord = null;
    try {
        File f = new File("customer.txt");
        dis = new BufferedReader(new FileReader(f));
        Vector dataVector = new Vector();
        Vector headVector = new Vector(2);
        headVector.addElement("Title");
        headVector.addElement("Type");
        // read the records of the text database
        while ((dbRecord = dis.readLine()) != null) {
            Vector row = new Vector(); // a NEW row vector for every record
            StringTokenizer st = new StringTokenizer(dbRecord, ",");
            while (st.hasMoreTokens()) {
                row.addElement(st.nextToken());
            }
            System.out.println("row: " + row);
            dataVector.addElement(row);
        }
        // build the table once, after all records are read
        dataTable = new JTable(dataVector, headVector);
        dataTableScrollPane.setViewportView(dataTable);
    } catch (IOException e) {
        // catch io errors from FileReader or readLine()
        System.out.println("Uh oh, got an IOException error! " + e.getMessage());
    } finally {
        // if the file opened okay, make sure we close it
        if (dis != null) {
            try {
                dis.close();
            } catch (IOException ioe) {
            }
        } // end if
    } // end finally
} // end dbTest

Here's a thread that loads a text file into a JTable:
http://forum.java.sun.com/thread.jsp?forum=57&thread=315172
And my reply in this thread shows how you can use a text file as a simple database:
http://forum.java.sun.com/thread.jsp?forum=31&thread=342380 -
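For reference, a common cause of an empty or wrong JTable with the pattern above is reusing a single row Vector for every record. A minimal sketch of just the parsing step, with a fresh row per line (the class name is hypothetical, and no GUI code is included):

```java
import java.util.Vector;

public class RecordParser {
    // Parse comma-separated records into a Vector of row Vectors.
    // A NEW row Vector must be created for every line; otherwise every
    // table row points at the same ever-growing data.
    static Vector<Vector<String>> parse(String[] lines) {
        Vector<Vector<String>> data = new Vector<>();
        for (String line : lines) {
            Vector<String> row = new Vector<>();   // fresh row per record
            for (String field : line.split(",")) {
                row.add(field.trim());
            }
            data.add(row);
        }
        return data;
    }
}
```

The resulting data Vector (plus a header Vector) can then be handed to the `JTable(Vector, Vector)` constructor once, after the whole file is read.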
How to load the data from excel file into temprory table in Forms 11g?
Hi
How do I load the data from an Excel file (.CSV extension) into a temporary Oracle table in Forms 11g?
My Forms Version is - Forms [64 Bit] Version 11.1.2.0.0 (Production)
Kindly Suggest the Solution.
Regards,
Sachin

Declare
v_full_filename varchar2(500);
v_server_path varchar2(2000);
v_separator VARCHAR2(1);
v_filename VARCHAR2(400);
filename VARCHAR2 (100);
v_stop_load varchar2 (2000);
v_rec_error_log varchar2(4000);
v_error_log varchar2(4000);
ctr NUMBER (12);
cols NUMBER (2);
btn number;
RES BOOLEAN;
application ole2.obj_type;
workbooks ole2.obj_type;
workbook ole2.obj_type;
worksheets ole2.obj_type;
worksheet ole2.obj_type;
cell ole2.obj_type;
cellType ole2.OBJ_TYPE;
args ole2.obj_type;
PROCEDURE olearg
IS
args ole2.obj_type;
BEGIN
args := ole2.create_arglist;
ole2.add_arg (args, ctr);
ole2.add_arg (args, cols);
cell := ole2.get_obj_property (worksheet, 'Cells', args);
ole2.destroy_arglist (args);
END;
BEGIN
v_full_filename := client_get_file_name(directory_name => null
,file_name => null
,file_filter => 'Excel files (*.xls)|*.xls|'
||'Excel files (*.xlsx)|*.xlsx|'
,message => 'Choose Excel file'
,dialog_type => null
,select_file => null
);
If v_full_filename is not null Then
v_separator := WEBUTIL_CLIENTINFO.Get_file_Separator ;
v_filename := v_separator||v_full_filename ;
:LOAD_FILE_NAME := substr(v_filename,instr(v_filename,v_separator,-1) + 1);
RES := Webutil_File_Transfer.Client_To_AS(v_full_filename, v_server_path||substr(v_filename,instr(v_filename,v_separator,-1) + 1));
--Begin load data from EXCEL
BEGIN
filename := v_server_path||substr(v_filename,instr(v_filename,v_separator,-1) + 1); -- to pick the file
application := ole2.create_obj ('Excel.Application');
ole2.set_property (application, 'Visible', 'false');
workbooks := ole2.get_obj_property (application, 'Workbooks');
args := ole2.create_arglist;
ole2.add_arg (args, filename); -- file path and name
workbook := ole2.get_obj_property(workbooks,'Open',args);
ole2.destroy_arglist (args);
args := ole2.create_arglist;
ole2.add_arg (args, 'Sheet1');
worksheet := ole2.get_obj_property (workbook, 'Worksheets', args);
ole2.destroy_arglist (args);
ctr := 2; --row number
cols := 1; -- column number
go_block('xxx');
FIRST_RECORD;
LOOP
--Column 1 VALUE --------------------------------------------------------------------
olearg;
v_stop_load := ole2.get_char_property (cell, 'Text'); --cell value of the argument
:item1 := v_stop_load;
cols := cols + 1;
--Column 2 VALUE --------------------------------------------------------------------
olearg;
:item2 := ole2.get_char_property (cell, 'Text'); --cell value of the argument
cols := cols + 1;
--<and so on>
EXIT WHEN v_stop_load IS NULL; -- leave the loop at the first empty row (missing in the original)
ctr := ctr + 1;                -- next spreadsheet row
cols := 1;
NEXT_RECORD;
END LOOP;
ole2.invoke (application, 'Quit');
ole2.RELEASE_OBJ (cell);
ole2.RELEASE_OBJ (worksheet);
ole2.RELEASE_OBJ (worksheets);
ole2.RELEASE_OBJ (workbook);
ole2.RELEASE_OBJ (workbooks);
ole2.RELEASE_OBJ (application);
END;
--End load data from EXCEL

Please mark it as answered if it helped. -
How to export the text edit data to excel file without splitting the data in excel file?
how to export the text edit data to excel file without splitting the data in excel file?
I have a requirement in SAP HR where the appraiser can add comments in the given area and export them to an Excel file. Currently the file is exported, but the comments get split into different rows.
I want the entire comment to be fit in one row.
Please help.
Thank you

Hi,
If your text edit value is stored in the lv_string variable, then before exporting the value to Excel you have to remove CL_ABAP_CHAR_UTILITIES=>NEWLINE (displayed as '#') from lv_string.
For that, use code something like this:
REPLACE ALL OCCURRENCES OF CL_ABAP_CHAR_UTILITIES=>NEWLINE in lv_string WITH space.
I think this will do the trick. -
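The same idea, shown here as a language-neutral sketch in Java for comparison with the ABAP REPLACE above: flatten any embedded line breaks to spaces before the value is written to the export, so the comment stays on one row (the class and method names are illustrative):

```java
public class CommentFlattener {
    // Replace embedded line breaks (CRLF, LF, CR) with spaces so a
    // multi-line comment occupies a single row when opened in Excel.
    static String flatten(String comment) {
        return comment.replace("\r\n", " ")
                      .replace("\n", " ")
                      .replace("\r", " ");
    }
}
```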
Reading data from flat file Using TEXT_IO
Dear Gurus
I already posted this question, but this time I need some other changes... sorry for that.
I am using Forms 10g and TEXT_IO for reading data from a flat file.
My data is like this :-
0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
9|2430962.89|000111111111|
1|61304.88|000014104113|
1|41961.73|000022096086|
1|38475.65|000023640081|
1|49749.34|000032133154|
1|35572.46|000033093377|
1|246671.01|000042148111|
Here each column is separated by |. I want to read all the columns and do some validation.
How can I do that?
Initially my requirement was to read only 2 or 3 columns, so I did it like this:
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle utl_file.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- For the Path
v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
LOOP
UTL_FILE.get_line (v_handle, v_filebuffer);
IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
SELECT line_0 INTO line_0_date
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_0 INTO line_0_Purp
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 4;
SELECT line_0 INTO line_0_count
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 7;
SELECT line_0 INTO line_0_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 8;
SELECT line_0 INTO line_0_ccy
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 9;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
SELECT line_9 INTO line_9_Acc_no
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_9 INTO line_9_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 2;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
line_1_flag := line_1_flag+1;
SELECT line_1 INTO line_1_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
Line_1_tot := Line_1_tot + line_1_sum;
END IF;
END LOOP;
DBMS_OUTPUT.put_line (Line_1_tot);
DBMS_OUTPUT.PUT_LINE (Line_1_flag);
UTL_FILE.fclose (v_handle);
END;
But now what should I do? Should I use a SELECT statement like this for all the columns?

Sorry for that.
As per our requirement:
I need to read the flat file and it looks like like this .
*0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106*
*9|2430962.89|000111111111|*
*1|61304.88|000014104113|*
*1|41961.73|000022096086|*
*1|38475.65|000023640081|*
*1|49749.34|000032133154|*
*1|35572.46|000033093377|*
*1|246671.01|000042148111|*
*1|120737.25|000053101979|*
*1|151898.79|000082139768|*
*1|84182.34|000082485593|*
I have to check the file. The validations are:
The 1st line should start with 0; otherwise raise an error and insert that error into a table.
The same for the 2nd line: it should start with 9, else raise an error and insert it into the table.
The 3rd line onward should start with 1, else raise an error and insert it into the table.
After that I validate the 1st line's 2nd column: it should be BP-V1, else raise an error and insert it into the table. Then I check the 3rd column, which is 20100928; it should be in YYYYMMDD format, else the same error handling.
Like this I have a different validation for each column.
Then I check the 2nd line's 3rd column, which is an account number. First it should be 12 characters, else an error. Then I compare it with what the user entered in the form: for example, if the user entered 111111111, I compare it against the 000111111111 in the 2nd line, i.e. I have to prefix 000 to the account number the user entered.
Then, for the lines starting with 1, I take the 2nd column of every such line and compute a sum. I compare that sum with the 6th column of the 1st line (the one starting with 0); it should match, else an error.
In the same way I count all the lines starting with 1 and compare the count with the 7th column of the 1st line; it should match. In this file it should be 9.
My code is:
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle TEXT_IO.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
LC$Token VARCHAR2(100) ;
i PLS_INTEGER := 2 ;
lfirst_char number;
lvalue Varchar2(100) ;
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- For the Path
--v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
Message(lfile_name);
v_handle := TEXT_IO.fopen(lfile_name, 'r');
BEGIN
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
--Message('First Char '||lfirst_char);
IF lfirst_char = '0' Then
Loop
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
Message('VAL - '||LC$Token);
lvalue := LC$Token;
EXIT WHEN LC$Token IS NULL ;
i := i + 1 ;
End Loop;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if ;
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '9' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
Forms_DDL('Commit');
raise form_Trigger_failure;
End IF;
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '1' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if;
END LOOP;
--END IF;
END LOOP;
EXCEPTION
When No_Data_Found Then
TEXT_IO.fclose (v_handle);
END;
Exception
When Others Then
Message('Other error');
END;
I am calling the FUNCTION which you gave SPLIT as mcb_simulator_pkg.Split. -
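The validation rules described in this thread can be sketched independently of Forms and TEXT_IO. Here is a minimal Java version for illustration only; the class name is hypothetical, error handling is simplified to returned strings instead of inserts into an error table, and the BP-V1/date/account checks are omitted:

```java
import java.util.ArrayList;
import java.util.List;

public class FileValidator {
    // Checks the layout described above: line 1 starts with "0", line 2 with
    // "9", remaining lines with "1"; the "1"-line amounts (2nd column) must
    // sum to the 6th column of the "0" line, and the "1"-line count must
    // match its 7th column.
    static List<String> validate(List<String> lines) {
        List<String> errors = new ArrayList<>();
        if (lines.size() < 2 || !lines.get(0).startsWith("0|")) {
            errors.add("First line should start with 0");
            return errors;          // cannot check totals without a header
        }
        if (!lines.get(1).startsWith("9|"))
            errors.add("Second line should start with 9");
        double sum = 0.0;
        int count = 0;
        for (int i = 2; i < lines.size(); i++) {
            if (!lines.get(i).startsWith("1|")) {
                errors.add("Line " + (i + 1) + " should start with 1");
                continue;
            }
            sum += Double.parseDouble(lines.get(i).split("\\|")[1]);
            count++;
        }
        String[] header = lines.get(0).split("\\|");
        if (Math.abs(sum - Double.parseDouble(header[5])) > 0.005)
            errors.add("Detail sum does not match header total");
        if (count != Integer.parseInt(header[6]))
            errors.add("Detail count does not match header count");
        return errors;
    }
}
```

In the Forms version, each returned error string would correspond to one insert into MU_SIMULATOR_output_ERR.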
Load data in xml file into Pdf form created by LiveCycle Designer
I want to load data from an XML file into a PDF form when the PDF is opened, so the form fields are filled with data from the XML file. I tried to use $host.importData("filename.xml"); but I could not find a suitable place to run my code. Can anyone give me some advice? Thank you.
Hi,
extract your xml and then you can use insert all clause.
here's very small example on 10.2.0.1.0
SQL> create table table1(id number,val varchar2(10));
Table created.
SQL> create table table2(id number,val varchar2(10));
Table created.
SQL> insert all
2 into table1 values(id,val)
3 into table2 values(id2,val2)
4 select extractValue(x.col,'/a/id1') id
5 ,extractValue(x.col,'/a/value') val
6 ,extractValue(x.col,'/a/value2') val2
7 ,extractValue(x.col,'/a/id2') id2
8 from (select xmltype('<a><id1>1</id1><value>a</value><id2>2</id2><value2>b</value2></a>') col from dual) x;
2 rows created.
SQL> select * from table1;
ID VAL
1 a
SQL> select * from table2;
ID VAL
2 b

Ants -
How to saving data in csv file
I have a problem with saving data in a CSV file. I would like to save my data to look
like this example:
excel preview :
A B C
1 10 11 12
2 13 14 15
As we see all values are in separate cell A1=10, B1=11, C1=12 ...
so I try :
PrintWriter wy = new PrintWriter(new FileWriter("test.csv"));
String[][] values = new String[2][3];
values[0][0]="10";
values[0][1]="11";
values[0][2]="12";
values[1][0]="13";
values[1][1]="14";
values[1][2]="15";
for (String[] row : values) {
    for (String col : row) {
        wy.print(col + "\t");
    }
    wy.println();
}
wy.close();
but the csv file looks like:
A1=10 11 12
A2=13 14 15
and B1-B2 and C1-C2 are empty.
The second step was to use the Ostermiller library:
OutputStream out;
out = new FileOutputStream("temp.csv");
CSVPrinter csvp = new CSVPrinter(out);
String[][] values = new String[2][3];
csvp.changeDelimiter('\t');
values[0][0]="10";
values[0][1]="11";
values[0][2]="12";
values[1][0]="13";
values[1][1]="14";
values[1][2]="15";
csvp.println(values);
but the result is also the same. Does anyone know how to resolve this problem?

But I don't want to separate with commas... I want to separate each value into a separate cell.

When you save the file, separate with commas. When Excel loads it, it will automagically put each value in its own cell.

What is "w/"? When I separate my data with commas, Excel doesn't put the values into separate cells.

"w/" is an abbreviation of "with". Post the code that doesn't work, along with the results you get. If it's not working with commas, you must be doing something else wrong. -
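To make Excel place each value in its own cell, the row values need to be joined with commas and each row terminated with a line break. A minimal sketch of that, writing to a String instead of a file so the output is easy to inspect (the class name is hypothetical):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class CsvWriter {
    // Join each row with commas (not tabs): Excel's CSV import splits on
    // commas, so every value lands in its own cell; println ends the row.
    static String toCsv(String[][] values) {
        StringWriter sw = new StringWriter();
        PrintWriter wy = new PrintWriter(sw);
        for (String[] row : values) {
            wy.println(String.join(",", row));
        }
        wy.flush();
        return sw.toString();
    }
}
```

Replacing the StringWriter with a FileWriter on "test.csv" gives a file Excel will open with 10, 11, 12 in cells A1, B1, C1 and so on.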
Problem Loading Microsoft Sql Serer table data to flat file
Hi Experts,
I am trying to load data from a SQL Server table to a flat file, but it errors out.
I have selected a staging area different from the target (I am using SQL Server as my staging area).
Knowledge module used: IKM SQL to File Append
I receive the following error:
ODI-1217: Session table to file (124001) fails with return code 7000.
ODI-1226: Step table to file fails after 1 attempt(s).
ODI-1240: Flow table to file fails while performing a Integration operation. This flow loads target table test.
ODI-1227: Task table to file (Integration) fails on the source MICROSOFT_SQL_SERVER connection POS_XSTORE.
Caused By: java.sql.SQLException: Could not read heading rows from file
at com.sunopsis.jdbc.driver.file.FileResultSet.<init>(FileResultSet.java:164)
Please help!!!
Thanks,
Prateek

Have you defined the File Datastore correctly, with the appropriate delimiter and file properties?
Although the example is for Oracle to file, see if this helps you in any way: http://odiexperts.com/oracle-to-flat-file.
Read data from Excel file and diaplay in Webdynpro
Hi all,
I need some help. I have an Excel file with a set of names and phone numbers. I want to know how to display the data using Web Dynpro. Could someone help me? Help is appreciated and I promise to award points for the right answer.
Thank you
Maruti

Hi,
I can explain how to read data from an Excel file.
First you have to download the jxl.jar file. You can get it from the site below:
http://www.andykhan.com/jexcelapi/download.html (jexcelapi jar)
It will be in compressed format, so unzip it to get the contents.
After unzipping the file you will get a folder (jexcelapi/jxl.jar).
Now in NWDS open the Web Dynpro Explorer and right-click your project; a popup menu will appear, and in it click Properties.
You will get window displaying your Project Properties
On Left Side of the window You Will Find "Java Build Path"
Click That "Java Build Path" and you will get 4 Tabs Showing ( Source,Projects,Libraries,Order and Export)
Click Libraries Tab
You will find options many options buttons
In that click the Button "Add External Jars"
You will get Window in order to fecth the jxl.jar file from the location you had stored
After selecting the jxl.jar i will get displayed and click ok
Now Open Navigator
Open Your Project
You will find Lib folder
Copy the jxl.jar to that lib folder
Note: You cannot read the content of the Excel file directly.
First you have to copy the file to the server,
and from the server you can get the file's absolute path.
With the absolute path you can read the contents of the Excel file.
You have to save the Excel file in .xls format and not in .xlsx format; jxl will not accept xlsx.
You have to upload the Excel file using the FileUpload UI element.
The following code extracts 3 columns from the .xls file:
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import jxl.Cell;
import jxl.Sheet;
import jxl.Workbook;
import com.sap.fileupload.wdp.IPrivateFileUpload_View;
import com.sap.tc.webdynpro.services.sal.datatransport.api.IWDResource;
public void onActionUpload_File(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent )
{
  //@@begin onActionUpload_File(ServerEvent)
  IPrivateFileUpload_View.IContextElement element1 = wdContext.currentContextElement();
  IWDResource resource = element1.getFileResource();
  element1.setFileName(resource.getResourceName());
  element1.setFileExtension(resource.getResourceType().getFileExtension());
  //@@end
}

public void onActionUpload_File_in_Server(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent )
{
  //@@begin onActionUpload_File_in_Server(ServerEvent)
  InputStream text = null;
  int temp = 0;
  try
  {
    File file = new File(wdContext.currentContextElement().getFileResource().getResourceName().toString());
    FileOutputStream op = new FileOutputStream(file);
    if (wdContext.currentContextElement().getFileResource() != null)
      text = wdContext.currentContextElement().getFileResource().read(false);
    while ((temp = text.read()) != -1)
      op.write(temp);
    op.flush();
    op.close();
    path = file.getAbsolutePath();
    wdComponentAPI.getMessageManager().reportSuccess(path);
  }
  catch (Exception e)
  {
    e.printStackTrace();
  }
  //@@end
}

public void onActionUpload_Data_into_Table(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent )
{
  //@@begin onActionUpload_Data_into_Table(ServerEvent)
  try
  {
    Workbook wb = Workbook.getWorkbook(new File(path));
    Sheet sh = wb.getSheet(0);
    //wdComponentAPI.getMessageManager().reportSuccess("Columns = "+sh.getColumns());
    //wdComponentAPI.getMessageManager().reportSuccess("Rows = "+sh.getRows());
    int columns = sh.getColumns();
    int rows = sh.getRows();
    int i = 0;
    for (int j = 1; j < rows; j++)   // row 0 holds the headings
    {
      ele = wdContext.nodeTable_Data().createTable_DataElement();
      Cell c1 = sh.getCell(i, j);
      ele.setTab_Name(c1.getContents());
      Cell c2 = sh.getCell(i + 1, j);
      ele.setTab_Degree(c2.getContents());
      Cell c3 = sh.getCell(i + 2, j);
      ele.setTab_Percentage(c3.getContents());
      wdContext.nodeTable_Data().addElement(ele);
    }
  }
  catch (Exception ex)
  {
    wdComponentAPI.getMessageManager().reportSuccess(ex.toString());
  }
  //@@end
}
/*
 * The following code section can be used for any Java code that is
 * not to be visible to other controllers/views or that contains constructs
 * currently not supported directly by Web Dynpro (such as inner classes or
 * member variables etc.).
 * Note: The content of this section is in no way managed/controlled
 * by the Web Dynpro Designtime or the Web Dynpro Runtime.
 */
//@@begin others
String path;
IPrivateFileUpload_View.ITable_DataElement ele;
//@@end
Regards
Chandran S -
Error: An error occurred while skipping data rows
I am constantly getting the error message "Error: An error occurred while skipping data rows". Here is my scenario:
I am pulling data from a text flat file which is vertical bar (|) delimited. My data source files naturally come without headers (no column names on the first row). So what I did to bypass this is:
I used a control file to supply the header names when I developed my package. This control file's column names follow the column order used in the data source files.
I then added an expression to my Flat File Connection Manager so that the system automatically picks up new incoming files (files without headers).
Now when I try to run my package I receive "Error: An error occurred while skipping data rows". Could this be because the system expects the column names I entered when I developed the package while the new file does not have any headers, or could it be something different? There is very little information on the internet about this error message. The only meaningful information I have found is from the link below, but unfortunately the case explained there is not the same as mine:
http://social.msdn.microsoft.com/forums/sqlserver/en-US/a790460b-9ef5-482b-b219-f55c05fa30d3/error-occured-while-skipping-data-rows
I'm also aware that one can get the same error if the delimiter specified in the package differs from the one in the flat file, but that is not the case here either. Can someone please help me troubleshoot this?
Many thanks,
Mpumelelo

I have managed to address my problem. I did not have to manually enter some nearly 2,000 column names by hand. Here is what I did. I will explain it in detail for the benefit of those who might face the same problem but have no answers on how to work around it.
I have maintained the column names from the control file because it was important for me to do this in the interest of some data flow components further down the pipeline. It therefore means that my package Flat File Source components have column names
on both the External Column and Output Column sides.
When I configured the Flat File Source component using the control file, I left the option "Column names in the first row" ticked (the default selection from Microsoft).
I then went to my flat file connection manager, right-clicked it and selected "Properties". You will notice that the property "ColumnNamesInFirstRow" is set to "True", because when configuring my flat file manager I left that selection ticked. To address my problem I changed "ColumnNamesInFirstRow" from "True" to "False" and saved my package. This means that at runtime the package will not use the first row as column names but will use the predefined column names I entered from the control file. Please note that deselecting "use the first row as column names" should be done from the Properties window rather than from the Flat File Source component: changing this property on the Flat File Source component automatically changes all the metadata, whereas changing it in the Properties window leaves the metadata on the Flat File Source component unchanged.
Mpumelelo -
To get the table data into a file in pl/sql developer
Hi
I have a table with 90k rows and I want to get the table data into a file, like Excel.
I executed the select statement, but it shows only 5k rows because of a buffer problem.
Can you please suggest an alternative way to get the table data into a file,
using cursors or something like that?

Really? Excel for 90K rows :)
face/desk/floor/"Hi and sorry neighbours below, it's me again"
Err, I see your point, thanks. Dang, I completely missed the 90k recs part; I must be getting old or modern ;)
@Ramanjaneyulu,
"Can you please suggest an alternative way to get the table data into a file?"

You can always dump a query to a file. Check these:
http://tkyte.blogspot.com/2009/10/httpasktomoraclecomtkyteflat.html
How to create Excel file (scroll down when necessary)
http://forums.oracle.com/forums/search.jspa?threadID=&q=export+to+excel&objID=f137&dateRange=all&userID=&numResults=15&rankBy=10001
Depending on your database version ( the result of: select * from v$version; ) you might have some other options.