PL/SQL character-to-date conversion issue
Dear all,
I have run into an issue with character-to-date conversions in PL/SQL.
The following is a short description:
I use Oracle 8i, and the problem is that a wrong date value is inserted into a table column.
The date value is retrieved from a table and is of type VARCHAR2(240). It is then converted to DATE and inserted into another table's column of type DATE.
For example, if the date retrieved as VARCHAR2 is '21/05/2003', the value inserted is '21/5/0003'.
The conversion is done by the following portion of code:
dateVariable DATE := NULL;
-- dateStringVariable is retrieved from the db and is of type VARCHAR2(240)
-- DATE_FORMAT is a string retrieved from the db whose value equals 'DD/MM/YYYY HH24:MI:SS'
dateVariable := TO_DATE(dateStringVariable, DATE_FORMAT);
dateVariable is then put into a recordset, which in turn is what gets inserted into the db.
My guess is that the problem occurs during the char-to-date conversion.
I wonder if anyone knows what produces this error.
Any suggestion is welcome.
With regards
SQL> desc t
Name Null? Type
DATE# DATE
SQL> alter session set nls_date_format = 'DD-MON-RR';
Session altered.
SQL> select * from t;
no rows selected
SQL> insert into t values(to_date('21/05/2003','DD/MM/YYYY'));
1 row created.
Now Oracle stores the correct date - the 21st of May 2003.
How you display it depends on your NLS_DATE_FORMAT settings:
SQL> select * from t;
DATE#
21-MAY-03
SQL> alter session set nls_date_format = 'MM/DD/YYYY';
Session altered.
SQL> select * from t;
DATE#
05/21/2003
So now try to do
SELECT to_char(<<your new date column>>,'DD/MM/YYYY') from <<your table>>
to be sure your date is kept right.
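The same storage-versus-display separation exists outside SQL*Plus too. In Java, for instance, a Date value holds no format of its own; only the formatter decides how it prints (a small illustration, not part of the original thread):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateDisplay {
    // Re-display the same parsed date in another pattern; the Date itself stores no format.
    public static String redisplay(String in, String inPat, String outPat) throws ParseException {
        Date d = new SimpleDateFormat(inPat).parse(in);
        return new SimpleDateFormat(outPat).format(d);
    }

    public static void main(String[] args) throws ParseException {
        // One stored value, two displays -- analogous to changing NLS_DATE_FORMAT in SQL*Plus.
        System.out.println(redisplay("21/05/2003", "dd/MM/yyyy", "MM/dd/yyyy")); // prints 05/21/2003
        System.out.println(redisplay("21/05/2003", "dd/MM/yyyy", "yyyy-MM-dd")); // prints 2003-05-21
    }
}
```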
Rgds.
Similar Messages
-
Data conversion for VARBINARY in SQL 2000?
I have VARBINARY data in my SQL 2000 database. I need to know what the right
conversion for VARBINARY is. I used byte[], but unfortunately it gives an SQLException:
Unsupported data type. I also tried String; this time it does not give the
Unsupported data type exception, but the data is not the same. All I can do is
check the length, and the length is not the same as in the database.
NEED HELP!!!
Hi. A VARBINARY column should be accessible from JDBC, via ResultSet.getBytes(),
getBinaryStream(), or getObject(). Let me know...
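A small helper sketch along those lines (the toHex helper and all names are mine, for illustration only; reading the column itself still needs a live ResultSet):

```java
import java.sql.ResultSet;
import java.sql.SQLException;

public class VarbinaryUtil {
    // Read a VARBINARY column as raw bytes -- never as String, which mangles binary data.
    public static byte[] readBlob(ResultSet rs, int col) throws SQLException {
        return rs.getBytes(col);
    }

    // Hex-encode for display, or for comparing against the database's own representation.
    public static String toHex(byte[] data) {
        StringBuilder sb = new StringBuilder();
        for (byte b : data) {
            sb.append(String.format("%02X", b));
        }
        return sb.toString();
    }
}
```

Comparing hex strings on both sides is an easy way to confirm the bytes survived the round trip unchanged.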
Joe -
SQL Generation Error after converting eFashion.unv to eFashion.unx
One of the first things I tried to do with SAP BusinessObjects Business Intelligence 4.0 was convert the built-in eFashion universe. Unfortunately, the UNX generates unresolvable outer joins, even though the data foundation layer does not contain any. I am using BI 4.0 SP02 Fix 4. Any ideas?
Here is what a query on the original eFashion.unv looks like for Year, State, Store name, and Revenue.
SELECT
Calendar_year_lookup.Yr,
Outlet_Lookup.State,
Outlet_Lookup.Shop_name,
sum(Shop_facts.Amount_sold)
FROM
Calendar_year_lookup,
Outlet_Lookup,
Shop_facts
WHERE
( Outlet_Lookup.Shop_id=Shop_facts.Shop_id )
AND
( Shop_facts.Week_id=Calendar_year_lookup.Week_id )
GROUP BY
Calendar_year_lookup.Yr,
Outlet_Lookup.State,
Outlet_Lookup.Shop_name
And here's the SQL generated by the converted eFashion.UNX. Notice the outer joins in the FROM clause even though the universe doesn't contain outer joins.
SELECT
Calendar_year_lookup.Yr,
Outlet_Lookup.State,
Outlet_Lookup.Shop_name,
sum(Shop_facts.Amount_sold)
FROM Calendar_year_lookup,
Outlet_Lookup,
Shop_facts,
{ oj Outlet_Lookup LEFT OUTER JOIN Shop_facts ON Outlet_Lookup.Shop_id=Shop_facts.Shop_id },
{ oj Shop_facts LEFT OUTER JOIN Calendar_year_lookup ON Shop_facts.Week_id=Calendar_year_lookup.Week_id }
GROUP BY Calendar_year_lookup.Yr, Outlet_Lookup.State, Outlet_Lookup.Shop_name
How should I resolve the issue so correct SQL is generated by the Information Design Tool 4.0?
Correct,
Miguel has opened a dialog with the Sample Report Team, and I'm not sure what the answer was. Currently there are no samples shipped with BOE 4.0, so technically there is no issue...
All I can suggest is that you use your own universe, or try to fix it yourself if that's possible. I don't think they are planning on shipping samples with the GA release. They may eventually, but I'm not sure at this time.
Don -
Hi Everybody..
I'm new to Oracle APEX and I'm facing an issue loading data into a table. My issue is that text data is getting loaded with double quotes.
Please advise.
Thanks,
Suresh
Hi,
assuming you're loading data via the APEX GUI via Home > SQL Workshop > Utilities > Data Workshop > Load Data.
Otherwise provide more info, like the APEX version used and how/where you are uploading the data.
If you put your data in a file and select "Upload file (comma separated or tab delimited)", then on the next screen you can set the field "Optionally Enclosed By" to a double quote (").
This should strip the double quotes from the data during import.
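What "Optionally Enclosed By" does can be sketched in a few lines (this illustrates the stripping rule, not APEX's actual implementation):

```java
public class CsvField {
    // Strip an optional enclosing quote character from a CSV field;
    // fields without the enclosure are returned unchanged.
    public static String stripEnclosure(String field, char quote) {
        if (field.length() >= 2
                && field.charAt(0) == quote
                && field.charAt(field.length() - 1) == quote) {
            return field.substring(1, field.length() - 1);
        }
        return field;
    }
}
```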
Regards
Bas -
Hi,
I'm very new to Streams and have a question regarding an ORA-01403 issue happening during replication. I need your kind help in this regard. Thanks in advance.
Oracle version : 10.0.3.0
1. Suppose there are 10 LCRs in a txn, and one of the LCRs causes ORA-01403, so none of the LCRs gets executed.
We can read the data of this LCR and manually update the record in the destination database.
Even though this is done, while re-executing the transaction I get the same ORA-01403 on the same LCR.
What could be the possible reason?
Since this is a large-scale system with thousands of transactions, it is not possible to handle the no-data-found issues occurring in the system by hand.
I have written a PL/SQL block which can generate UPDATE statements from the old data available in the LCR, so that I can re-execute the transaction again.
The PL/SQL block is given below. Could you please check whether there are any issues in how it generates the UPDATE statements? Thank you.
/* Formatted on 2008/10/23 14:46 (Formatter Plus v4.8.7) */
--Script for generating the Update scripts for the Message which caused the 'NO DATA FOUND' error.
DECLARE
   res         NUMBER;                  -- no. of errors to be resolved
   ret         NUMBER;                  -- return value from GETOBJECT
   i           NUMBER;                  -- loop index
   j           NUMBER;                  -- loop index
   k           NUMBER;                  -- loop index
   pk_count    NUMBER;                  -- no. of PK columns for a table
   lcr         ANYDATA;                 -- the logical change record
   typ         VARCHAR2 (61);           -- type of a column
   rowlcr      SYS.LCR$_ROW_RECORD;     -- the LCR that caused the error in the txn
   oldlist     SYS.LCR$_ROW_LIST;       -- old data of the record that was updated/deleted
   newlist     SYS.LCR$_ROW_LIST;
   upd_qry     VARCHAR2 (5000);
   del_qry     VARCHAR2 (5000);         -- declaration added (used below but missing in the original)
   ins_qry     VARCHAR2 (5000);         -- declaration added (used below but missing in the original)
   equals      VARCHAR2 (5) := ' = ';
   data1       VARCHAR2 (2000);
   num1        NUMBER;
   date1       TIMESTAMP (0);
   timestamp1  TIMESTAMP (3);
   iscomma     BOOLEAN;

   TYPE tab_lcr IS TABLE OF ANYDATA INDEX BY BINARY_INTEGER;
   TYPE pk_cols IS TABLE OF VARCHAR2 (50) INDEX BY BINARY_INTEGER;

   lcr_table   tab_lcr;
   pk_table    pk_cols;
BEGIN
   i := 1;

   SELECT COUNT (1) INTO res FROM dba_apply_error;

   FOR txn_id IN (SELECT message_number, local_transaction_id
                    FROM dba_apply_error
                   WHERE local_transaction_id = '2.85.42516'
                   ORDER BY error_creation_time)
   LOOP
      SELECT DBMS_APPLY_ADM.GET_ERROR_MESSAGE (txn_id.message_number,
                                               txn_id.local_transaction_id)
        INTO lcr
        FROM DUAL;

      lcr_table (i) := lcr;
      i := i + 1;
   END LOOP;

   i := 0;
   k := 0;
   DBMS_OUTPUT.PUT_LINE ('size >' || lcr_table.COUNT);

   FOR k IN 1 .. res
   LOOP
      rowlcr := NULL;
      ret := lcr_table (k).GETOBJECT (rowlcr);
      --DBMS_OUTPUT.PUT_LINE (rowlcr.GET_OBJECT_NAME);

      -- Find the PK columns of the table
      pk_count := 0;

      SELECT COUNT (1)
        INTO pk_count
        FROM all_cons_columns col, all_constraints con
       WHERE col.table_name = con.table_name
         AND col.constraint_name = con.constraint_name
         AND con.constraint_type = 'P'
         AND con.table_name = rowlcr.GET_OBJECT_NAME;

      DBMS_OUTPUT.PUT_LINE ('Count of PK Columns >' || pk_count);

      del_qry := 'DELETE FROM ' || rowlcr.GET_OBJECT_NAME || ' WHERE ';
      ins_qry := 'INSERT INTO ' || rowlcr.GET_OBJECT_NAME || ' ( ';
      upd_qry := 'UPDATE ' || rowlcr.GET_OBJECT_NAME || ' SET ';

      oldlist := rowlcr.GET_VALUES ('old');

      -- Generate the UPDATE query from the old values
      newlist := rowlcr.GET_VALUES ('old');
      iscomma := FALSE;

      FOR j IN 1 .. newlist.COUNT
      LOOP
         IF newlist (j) IS NOT NULL
         THEN
            IF j < newlist.COUNT
            THEN
               IF iscomma
               THEN
                  upd_qry := upd_qry || ',';
               END IF;
            END IF;

            iscomma := FALSE;
            typ := newlist (j).data.GETTYPENAME;

            IF typ = 'SYS.VARCHAR2'
            THEN
               ret := newlist (j).data.GETVARCHAR2 (data1);

               IF data1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || newlist (j).column_name || equals
                          || ' ''' || SUBSTR (data1, 0, 253) || '''';
                  iscomma := TRUE;
               END IF;
            ELSIF typ = 'SYS.NUMBER'
            THEN
               ret := newlist (j).data.GETNUMBER (num1);

               IF num1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || newlist (j).column_name || equals
                          || ' ' || num1;
                  iscomma := TRUE;
               END IF;
            ELSIF typ = 'SYS.DATE'
            THEN
               ret := newlist (j).data.GETDATE (date1);

               IF date1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || newlist (j).column_name || equals
                          || ' TO_DATE( ''' || date1
                          || ''', ''DD/MON/YYYY HH:MI:SS AM'')';
                  iscomma := TRUE;
               END IF;
            ELSIF typ = 'SYS.TIMESTAMP'
            THEN
               ret := newlist (j).data.GETTIMESTAMP (timestamp1);

               IF timestamp1 IS NOT NULL
               THEN
                  upd_qry := upd_qry || ' ''' || timestamp1 || '''';
                  iscomma := TRUE;
               END IF;
            END IF;
         END IF;
      END LOOP;

      -- Set the WHERE condition from the PK columns
      upd_qry := upd_qry || ' WHERE ';

      FOR i IN 1 .. pk_count
      LOOP
         SELECT column_name
           INTO pk_table (i)
           FROM all_cons_columns col, all_constraints con
          WHERE col.table_name = con.table_name
            AND col.constraint_name = con.constraint_name
            AND con.constraint_type = 'P'
            AND position = i
            AND con.table_name = rowlcr.GET_OBJECT_NAME;

         FOR j IN 1 .. newlist.COUNT
         LOOP
            IF newlist (j) IS NOT NULL
            THEN
               IF newlist (j).column_name = pk_table (i)
               THEN
                  upd_qry := upd_qry || ' ' || newlist (j).column_name
                          || ' ' || equals;
                  typ := newlist (j).data.GETTYPENAME;

                  IF typ = 'SYS.VARCHAR2'
                  THEN
                     ret := newlist (j).data.GETVARCHAR2 (data1);
                     upd_qry := upd_qry || ' ''' || SUBSTR (data1, 0, 253) || '''';
                  ELSIF typ = 'SYS.NUMBER'
                  THEN
                     ret := newlist (j).data.GETNUMBER (num1);
                     upd_qry := upd_qry || ' ' || num1;
                  END IF;

                  IF i < pk_count
                  THEN
                     upd_qry := upd_qry || ' AND ';
                  END IF;
               END IF;
            END IF;
         END LOOP;
      END LOOP;

      upd_qry := upd_qry || ';';
      DBMS_OUTPUT.PUT_LINE (upd_qry);
      --Generate Update Query - End
   END LOOP;
END;
Thanks for your replies HTH and Dipali.
I would like to make some points clear from my side, based on the issue I have raised.
1. The "no data found" error is happening on a table for which supplemental logging is enabled.
2. As per my understanding, the Apply process is comparing the existing data in the destination database with the "old" data in the LCR.
Once there is a mismatch between these two, ORA-01403 is thrown. (Please tell me whether my understanding is correct or not.)
3. This mismatch can be on a date field, or even on the timestamp millisecond.
Now, the point I'm really wondering about:
Somehow a mismatch got generated in the destination database (not sure about the reason) and ORA-01403 is thrown.
If we could update the destination database with the "old" data from the LCR, this mismatch should be resolved, shouldn't it?
Reply to you Dipali:
If nothing else works out, I'm planning to put a conflict handler on all tables with the "OVERWRITE" option, using the following script.
--Generate script for applying a conflict handler to the tables for which supplemental logging is enabled
declare
count1 number;
query varchar2(500) := null;
begin
for tables in (
select table_name from user_tables where table_name IN ('NAMES OF TABLES FOR WHICH SUPPLEMENTAL LOGGING IS ENABLED')
)
loop
count1 := 0;
dbms_output.put_line('DECLARE');
dbms_output.put_line('cols DBMS_UTILITY.NAME_ARRAY;');
dbms_output.put_line('BEGIN');
select max(position) into count1
from all_cons_columns col, all_constraints con
where col.table_name = con.table_name
and col.constraint_name = con.constraint_name
and con.constraint_type = 'P'
and con.table_name = tables.table_name;
for i in 1..count1
loop
query := null;
select 'cols(' || position || ')' || ' := ' || '''' || column_name || ''';'
into query
from all_cons_columns col, all_constraints con
where col.table_name = con.table_name
and col.constraint_name = con.constraint_name
and con.constraint_type = 'P'
and con.table_name = tables.table_name
and position = i;
dbms_output.put_line(query);
end loop;
dbms_output.put_line('DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(');
dbms_output.put_line('object_name => ''ICOOWR.' || tables.table_name|| ''',');
dbms_output.put_line('method_name => ''OVERWRITE'',');
dbms_output.put_line('resolution_column => ''COLM_NAME'',');
dbms_output.put_line('column_list => cols);');
dbms_output.put_line('END;');
dbms_output.put_line('/');
dbms_output.put_line('');
end loop;
end;
Reply to you HTH:
Our destination database is a replica of the source, and no triggers are running on any of these tables.
This is not the first time I'm facing this issue. Earlier, we had to take big outages, clear the replica database, and apply a dump from the source...
Now I can't think about going through that again. -
Hi All,
the following piece of code was working fine in 4.6C, but in ECC 6.0 I get the following error:
" "END_OF_RECORD" must be a character-type data object (data type C, N,D, T or STRING) . "
I tried type-casting with field symbols but still was not able to remove the error. I cannot convert end_of_record directly to type C, as that may hamper the functionality. Please advise how to remove the error without converting type X to type C.
In the following code :
DATA: DELIMITER TYPE C VALUE CL_ABAP_CHAR_UTILITIES=>HORIZONTAL_TAB,
end_of_record TYPE x.
SPLIT data_file_i AT delimiter INTO it_ekko-rtype
it_ekko-ebeln
it_ekko-bsart
it_ekko-lifnr
it_ekko-bedat
it_ekko-ekorg
it_ekko-ekgrp
it_ekko-bukrs
it_ekko-zterm
it_ekko-zbd1t
it_ekko-zbd1p
it_ekko-zbd2t
it_ekko-zbd2p
it_ekko-zbd3t
it_ekko-inco1
it_ekko-inco2
it_ekko-waers
it_ekko-wkurs
it_ekko-kufix
it_ekko-verkf
it_ekko-telf1
it_ekko-ihrez
it_ekko-unsez
it_ekko-angnr
it_ekko-ihran
it_ekko-submi
it_ekko-loekz
end_of_record.
where all these fields except "end_of_record" are of character type, and "data_file_i" is a character-type structure as defined below:
DATA :
BEGIN OF data_file_i OCCURS 0,
record(1000),
END OF data_file_i.
Type X is not allowed in Unicode. When a field is declared as type X with value '09' (or any other value), it can be resolved by using classes.
Before Unicode
CONSTANTS: c_hex TYPE x VALUE '09'.
Resolution:
It works for any value of x.
First a temporary field of type C should be declared. The following class will convert a type X variable into type C.
Example:
CONSTANTS: c_hex TYPE x VALUE '09'.
DATA: LV_TEMP TYPE STRING.
DATA: LV_TMP TYPE C.
TRY.
CALL METHOD CL_ABAP_CONV_IN_CE=>UCCP
EXPORTING
UCCP = c_hex
RECEIVING
CHAR = LV_TMP .
CATCH CX_SY_CONVERSION_CODEPAGE.
CATCH CX_PARAMETER_INVALID_TYPE.
CATCH CX_SY_CODEPAGE_CONVERTER_INIT.
ENDTRY.
CONCATENATE I_OUTPUT-BKTXT I_OUTPUT-BVORG
I_OUTPUT-BUDAT I_OUTPUT-MESSAGE
INTO LV_TEMP SEPARATED BY LV_TMP.
I_BUFFER = LV_TEMP.
CLEAR LV_TEMP.
CLEAR LV_TMP.
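The same idea expressed in Java, for comparison (an illustration only, not part of the ABAP solution): a code point such as 0x09 is simply the horizontal tab character.

```java
public class HexToChar {
    // Convert a code point (e.g. 0x09) to its character,
    // analogous to what CL_ABAP_CONV_IN_CE=>UCCP does in ABAP.
    public static char fromCodePoint(int cp) {
        return (char) cp;
    }
}
```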
OR
Note: It works only for type x value 09.
CLASS cl_abap_char_utilities DEFINITION LOAD.
CONSTANTS: c_hex TYPE c VALUE
cl_abap_char_utilities=>HORIZONTAL_TAB. -
Data conversion while exporting data into flat files using the export wizard in SSIS
Hi ,
while exporting data to a flat file through the export wizard, the source table has NVARCHAR types.
Could you please help me with how to do the data conversion while using the export wizard?
Thanks.
Hi Avs sai,
By default, the columns in the destination flat file will be non-Unicode columns, e.g. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input column when outputting to the destination file, you can check
the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings..." button to check the data types.
Regards,
Mike Yin
TechNet Community Support -
Using CVS in SQL Developer for Data Modeler changes.
Hi,
I am fairly new to SQL Developer Data Modeler and associated version control mechanisms.
I am prototyping the storage of database designs and version control for the same, using the Data Modeler within SQL Developer. I have SQL Developer version 3.1.07.42 and I have also installed the CVS extension.
I can connect to our CVS server through sspi protocol and external CVS executable and am able to check out modules.
Below is the scenario where I am facing some issue:
I open the design from the checked out module and make changes and save it. In the File navigator, I look for the files that have been modified or added newly.
This behaves rather inconsistently, in the sense that even after clicking the refresh button it sometimes does not refresh.
Next I try to look for the changes in the Pending Changes (CVS) window. According to other posts, I am supposed to look at the View - Data Modeler - Pending Changes window for data modeler changes, but that always shows up empty (I am not sure if it is only tied to Subversion). I do, however, see the modified files / files to be added to CVS under the Versioning - CVS - Pending Changes window.
The issue is that when I click the refresh button in that window, all the files just vanish and all the counts show 0. Strangely, if I go to Tools - Preferences - Versioning - CVS and just click OK, the pending changes window gets populated again (the counts are inconsistent at times).
I believe this issue is fixed and should work correctly in 3.1.07.42 but it does not seem to be case.
Also, I'm not sure whether I can use the CVS functionality available in SQL Developer for the data modeler, or whether I should be using an external client such as WinCvs for check-in/check-out.
Please help.
Thanks
Hi Joop,
I think you will find that in Data Modeler's Physical Model tree the same icons are used for temporary Tables and Materialized Views as in SQL Developer.
David -
BizTalk 2013 and EDI message: Invalid character in data element
Hi:
Background: I have a Vendor sending us EDI
856 (Advance Shipping Notice). We are using EDI X12 and BizTalk 2013. In the Parties Agreement we use ISA11 = U-US EDI Community of ASC X12. The vendor is sending 'U' in ISA11.
Error:
At the beginning of EDI Schema import, during EDI audit, we get the following error messages, repeated a few time.
Error: 1 (Field level error)
SegmentID: LIN
Position in TS: 18
Data Element ID: LIN09
Position in Segment: 9
Data Value:
6:
Invalid character in data element
Error: 2 (Field level error)
SegmentID: PID
Position in TS: 20
Data Element ID: PID05
Position in Segment: 5
Data Value:
6:
Invalid character in data element
Root Cause: In all the error cases the data
fields contain "Ü" (U with umlaut), a German character.
Question: What is the best way to deal with
the issue, or to replace the "Ü" characters?
Looking forward to your reply.
Regards, Toraj
Toraj [email protected]
I edited the source file and saved it as UTF-8. It seems that resolved the issue.
Very cool.
Toraj
Toraj [email protected]
Glad that you have solved this issue by yourself, and thanks for sharing your solution with us.
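Why re-saving as UTF-8 helps can be seen from the byte lengths involved (a small illustration, not from the thread): the file's declared encoding must match its actual bytes, and "Ü" encodes differently in each.

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EncodingCheck {
    // Number of bytes a string occupies in a given encoding.
    public static int byteLength(String s, Charset cs) {
        return s.getBytes(cs).length;
    }

    public static void main(String[] args) {
        // 'Ü' (U+00DC) is one byte in ISO-8859-1 but two bytes (0xC3 0x9C) in UTF-8;
        // reading one encoding as the other produces "invalid character" errors.
        System.out.println(byteLength("\u00DC", StandardCharsets.ISO_8859_1)); // prints 1
        System.out.println(byteLength("\u00DC", StandardCharsets.UTF_8));      // prints 2
    }
}
```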
Best regards,
Angie
Getting "SQL*Net more data from client" waits when running a query through a web-based interface
Hi, you all,
We are having this weird behavior when running a query through a web-based interface: we get a lot of "SQL*Net more data from client" waits, and OEM indicates that the current wait event is SQL*Net more data from client.
It's just a very simple query which invokes a db link.
When I execute the same query in any PL/SQL tool like Toad or SQL Developer it works fine, but the same query inside an application executed through the web-based interface hangs forever.
Where can I start looking for the problem?
We are working on a 3-node RAC 11gR2; both databases are on the same RAC.
Thanks.
Hi,
we managed to reproduce the case in test environment, below are the steps:
1)have 2 databases on different machines, will call the first one local, the other one remote.
2)in the local database create:
a - DBLink to remote database.
b - read data from remote database(we simply used select count(*) from dummy_table )
c - insert data into a table on the local database
d - terminate the connection between the 2 databases (disconnect either machine from the network)
e - commit on local database.
what we noticed was the following:
1) When the local database is disconnected from the network (the machine is not connected to any network at the moment): it almost immediately throws an error, and issuing the following:
select * from dba_2pc_pending;
we found some data.
2) When the remote database was disconnected (the local database still connected to the network):
after 7-8 seconds an error is thrown, and issuing the following:
select * from dba_2pc_pending;
did not return any data.
Since this is pretty similar to our case, we concluded that it's a network issue.
Is this the correct behavior?
As a temporary solution till the network issue is fixed, we did the following:
1) changed the call of the remote procedure to a call to a local procedure that calls the remote procedure.
2) added pragma autonomous_transaction to the local procedure.
3) at the end of the local procedure, rolled back the autonomous transaction.
It seems that since the global transaction does not use the DBLink, the database does not issue a 2PC commit.
This works in my case since the DBLink is only used to read data. -
Extract data from a BW 7.0 cube to a SQL DB using Data Services XI
Hi Gurus,
We are trying to extract data from a BW 7.0 cube to a SQL DB using Data Services XI. The issue is that we cannot read text without making joins between the SID in the fact table and the master data tables. Do you know if it is possible to read text in a natural way?
Best Regards
Thanks Wondewossen,
As you know, the DataStores (Data Services) provide access to:
1.-Tables
2.-Functions
3.-IDOCs
4.-Open Hub Tables
We are trying to extract data using the first one (Tables), not using Open Hub.
Best Regards -
Problem converting a date into DB format
Respected Experts,
I get the current date into a String variable, and from this string variable I want to convert it into date format.
For converting to date format I wrote the .java file below. It converts the date correctly into the format "dd/MM/yyyy", but I want the format "dd/MM/yyyy HH:MM pm/am"; what changes should I make in the following program?
import java.text.SimpleDateFormat;
import java.text.DateFormat;
import java.text.ParseException;
import java.util.Date;
public class UtilityFunction {
    public static String toUserFormatDate(Date dt) throws ParseException {
        SimpleDateFormat sdf = new SimpleDateFormat("dd/MM/yyyy");
        String str = sdf.format(dt);
        return str;
    }

    public static java.sql.Date toDbFormatDate(String sdt) throws ParseException {
        SimpleDateFormat sdf = new SimpleDateFormat("dd/MM/yyyy");
        Date d = sdf.parse(sdt);
        SimpleDateFormat nsdf = new SimpleDateFormat("dd-MM-yyyy");
        String str = nsdf.format(d);
        Date nd = nsdf.parse(str);
        java.sql.Date rdate = new java.sql.Date(nd.getTime());
        return rdate;
    }

    public static void main(String[] args) throws ParseException {
        SimpleDateFormat df = new SimpleDateFormat("dd-MM-yyyy");
        Date d2 = df.parse("08-09-1984");
        System.out.println("Date converted to User interface form");
        System.out.println("Database Date" + "08-09-1984");
        String p = toUserFormatDate(d2);
        System.out.println("Converted date" + p);
        java.sql.Date r = toDbFormatDate("12/12/2009");
        System.out.println("Date converted to database form");
        System.out.println("User Interface Date" + "12/12/2009");
        System.out.println("Converted date" + r);
    }
}
After 22 posts on the forums, I'd have expected you to have learnt how to use the code tags by now (click on 'CODE' when posting).
And while you get points for posting an SSCCE, you've left too much fluff in there. I don't know what method you want to discuss.
And if you've reached this far with SimpleDateFormat, what kept you from reading the rest of the available patterns? It's all the same thing; you just needed to make it dd/MM/yyyy hh:mm a.
Do note that with your code you're creating a date from a string that's missing the time, so you'll only get 12:00 AM all the time with the above pattern. Also, you'd use hh and not HH, since putting AM/PM with 24-hour clock time is plain dumb. -
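A self-contained sketch of that pattern (the sample date string is made up; Locale.ENGLISH pins the AM/PM marker so parsing doesn't depend on the default locale):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class DateTimePattern {
    // Parse and re-format with the suggested pattern: hh = 12-hour clock, a = AM/PM marker.
    public static String roundTrip(String in) throws ParseException {
        SimpleDateFormat sdf = new SimpleDateFormat("dd/MM/yyyy hh:mm a", Locale.ENGLISH);
        Date d = sdf.parse(in);
        return sdf.format(d);
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(roundTrip("12/12/2009 03:45 PM")); // prints 12/12/2009 03:45 PM
    }
}
```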
SQL*Net more data to dblink event for hours or days
Hello Everyone,
in our production database when we commit a transaction we call a remote procedure over dblink.
usually the call succeeds ,but every now and then a couple of sessions hang up,
when I use the session browser of Toad I notice that these sessions are waiting with the event SQL*Net more data to dblink
below are some queries and their results:
select sid,event,wait_class,wait_time,seconds_in_wait,state from gv$session_wait where sid=225
rslt:
225 SQL*Net more data to dblink Network -1 18279 WAITED SHORT TIME
select * from gv$session_wait_history where sid=225
rslt:
INST_ID SID SEQ# EVENT# EVENT P1TEXT P1 P2TEXT P2 P3TEXT P3 WAIT_TIME WAIT_TIME_MICRO TIME_SINCE_LAST_WAIT_MICRO
2 225 1 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8144 0 0 8 41
2 225 2 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8143 0 0 13 39
2 225 3 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8149 0 0 7 37
2 225 4 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8145 0 0 8 40
2 225 5 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8145 0 1 11394 37
2 225 6 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8143 0 0 7 37
2 225 7 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8145 0 0 7 36
2 225 8 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8138 0 0 8 37
2 225 9 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8149 0 0 8 38
2 225 10 344 SQL*Net more data to dblink driver id 1413697536 #bytes 8149 0 1 11476 37
I'm not sure, but from the above results, is it safe to conclude that I am stuck in an infinite loop trying to write to the dblink?
additional notes:
<li>some times when I look at the current statement I find that the statement is a query or insert into a local table.
<li>there were some network outages.
<li>when viewing the database log files I found: Error 3135 trapped in 2PC on transaction 7.6.306086. Cleaning up.
Error stack returned to user:
ORA-03135: connection lost contact
ORA-02063: preceding line from MPF
(where MPF is the name of the dblink), even though we use the DBLink only to execute the procedure, without any changes on the remote DB, and we don't use 2PC.
<li> the local DB is a RAC
select * from dba_blockers
rslt:
no rows
select * from dba_waiters
rslt:
no rows
select * from gv$lock where sid=225
rslt:
INST_ID ADDR KADDR SID TYPE ID1 ID2 LMODE REQUEST CTIME BLOCK
2 0000000199D54F60 0000000199D54FB8 225 AE 100 0 4 0 20152 2
2 000000018EA18108 000000018EA18180 225 TX 1114138 251539 6 0 19654 2
select * from gv$session where sid=225
rslt:
INST_ID SADDR SID SERIAL# AUDSID PADDR USER# USERNAME COMMAND OWNERID TADDR LOCKWAIT STATUS SERVER SCHEMA# SCHEMANAME OSUSER PROCESS MACHINE PORT TERMINAL PROGRAM TYPE SQL_ADDRESS SQL_HASH_VALUE SQL_ID SQL_CHILD_NUMBER SQL_EXEC_START SQL_EXEC_ID PREV_SQL_ADDR PREV_HASH_VALUE PREV_SQL_ID PREV_CHILD_NUMBER PREV_EXEC_START PREV_EXEC_ID PLSQL_ENTRY_OBJECT_ID PLSQL_ENTRY_SUBPROGRAM_ID PLSQL_OBJECT_ID PLSQL_SUBPROGRAM_ID MODULE MODULE_HASH ACTION ACTION_HASH CLIENT_INFO FIXED_TABLE_SEQUENCE ROW_WAIT_OBJ# ROW_WAIT_FILE# ROW_WAIT_BLOCK# ROW_WAIT_ROW# TOP_LEVEL_CALL# LOGON_TIME LAST_CALL_ET PDML_ENABLED FAILOVER_TYPE FAILOVER_METHOD FAILED_OVER RESOURCE_CONSUMER_GROUP PDML_STATUS PDDL_STATUS PQ_STATUS CURRENT_QUEUE_DURATION CLIENT_IDENTIFIER BLOCKING_SESSION_STATUS BLOCKING_INSTANCE BLOCKING_SESSION FINAL_BLOCKING_SESSION_STATUS FINAL_BLOCKING_INSTANCE FINAL_BLOCKING_SESSION SEQ# EVENT# EVENT P1TEXT P1 P1RAW P2TEXT P2 P2RAW P3TEXT P3 P3RAW WAIT_CLASS_ID WAIT_CLASS# WAIT_CLASS WAIT_TIME SECONDS_IN_WAIT STATE WAIT_TIME_MICRO TIME_REMAINING_MICRO TIME_SINCE_LAST_WAIT_MICRO SERVICE_NAME SQL_TRACE SQL_TRACE_WAITS SQL_TRACE_BINDS SQL_TRACE_PLAN_STATS SESSION_EDITION_ID CREATOR_ADDR CREATOR_SERIAL# ECID
2 00000001993E4F58 225 445 1353611 0000000198E2FA10 198 <schema> 47 2147483644 000000018EA18108 ACTIVE DEDICATED 198 <schema> oracle 1234 <cluster name> 49993 unknown JDBC Thin Client USER 00000001968A1250 3198676106 72y8ztfzagv4a 2 02/04/2013 11:18:22 ص 33554852 00000001968A18E0 3992616824 03mm4u3qznzvs 0 02/04/2013 11:18:22 ص 33554730 158207 1 158207 1 JDBC Thin Client 2546894660 0 12206 122409 8 49354 0 94 02/04/2013 10:53:20 ص 19559 NO NONE NONE NO DISABLED ENABLED ENABLED 0 NOT IN WAIT NOT IN WAIT 42844 344 SQL*Net more data to dblink driver id 1413697536 0000000054435000 #bytes 8144 0000000000001FD0 0 00 2000153315 7 Network -1 19553 WAITED SHORT TIME 8 19553325216 SYS$USERS DISABLED FALSE FALSE FIRST EXEC 100 0000000198E2FA10 2 004qLk^iPyp0bqw5wFDCiW0002fR000B^f
Hi,
we managed to reproduce the case in test environment, below are the steps:
1)have 2 databases on different machines, will call the first one local, the other one remote.
2)in the local database create:
a - DBLink to remote database.
b - read data from remote database(we simply used select count(*) from dummy_table )
c - insert data into a table on the local database
d - terminate the connection between the 2 databases (disconnect either machine from the network)
e - commit on local database.
what we noticed was the following:
1) When the local database is disconnected from the network (the machine is not connected to any network at the moment): it almost immediately throws an error, and issuing the following:
select * from dba_2pc_pending;
we found some data.
2) When the remote database was disconnected (the local database still connected to the network):
after 7-8 seconds an error is thrown, and issuing the following:
select * from dba_2pc_pending;
did not return any data.
Since this is pretty similar to our case, we concluded that it's a network issue.
Is this the correct behavior?
As a temporary solution till the network issue is fixed, we did the following:
1) changed the call of the remote procedure to a call to a local procedure that calls the remote procedure.
2) added pragma autonomous_transaction to the local procedure.
3) at the end of the local procedure, rolled back the autonomous transaction.
It seems that since the global transaction does not use the DBLink, the database does not issue a 2PC commit.
This works in my case since the DBLink is only used to read data. -
Check newline character in data
Hi,
How do I check for a newline character in data using a SQL query?
Thx
select * from test where cc like '%' || chr(13) || chr(10) || '%'
remember that unix does not use carriage return, so this may be more complete:
select * from test where cc like '%' || chr(10) || '%'
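The chr(13)||chr(10) versus chr(10) distinction is just CRLF versus LF; the same two checks in Java, for illustration:

```java
public class NewlineCheck {
    // Windows line ending (CR+LF, i.e. chr(13)||chr(10) in Oracle terms).
    public static boolean hasCrLf(String s) {
        return s.contains("\r\n");
    }

    // Unix line ending (LF only, i.e. chr(10)).
    public static boolean hasLf(String s) {
        return s.contains("\n");
    }
}
```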
bye
aldo -
Hi all,
Has anyone seen the following error, or does anyone have a troubleshooting hint? -
"[NT AUTHORITY\SYSTEM (15/10/2012 18:35:12) - Service request cancelled due to an error.
Error Code: 10000
Error Description: Failed to create lease requisition.
Fault code: soap:Server
Fault string: Service Form Field: 'WarningDate2' has Date format issue.
Fault details: REQ_0024Service Form Field: 'WarningDate2' has Date format issue.
CIAC = 3.01
Date and Time format on the CCP, CPO, vmware and SQL servers all Italian (dd/mm/yy)
This only happens when we add a Lease Time on the request.
Do they all have to be set to the US format for this to work?
If this is a regional setting thing, do I have to change the format on all of the servers (CIAC components)?
Cheers
md
This test program might help...
import java.util.*;
import java.text.*;
public class ExpandYear {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat sdf_2dyear = new SimpleDateFormat("MM/dd/yy");
        SimpleDateFormat sdf_4dyear = new SimpleDateFormat("MM/dd/yyyy");
        String test1 = "3/21/00";
        System.out.println("test1: " + test1 + " to : " +
            sdf_4dyear.format(sdf_2dyear.parse(test1)));
        String test2 = "4/9/99";
        System.out.println("test2: " + test2 + " to : " +
            sdf_4dyear.format(sdf_2dyear.parse(test2)));
    }
}