Problems With Data Alignment when spooling to a CSV file
Dear members,
I am spooling data to a CSV file. My data contains three columns.
For example:
col1 col2 col3
USD,10000033020000000000000,-1144206.34
The 2nd column is alphanumeric: some rows contain only digits, and some contain both digits and letters.
The 3rd column contains only numbers, positive or negative.
I am facing a problem with alignment: when I open the spooled CSV file, the 3rd column is right-aligned.
In the 2nd column, rows that contain only digits are right-justified, while rows with alphanumeric data are left-justified.
I tried using the JUSTIFY clause of the SQL*Plus COLUMN command, but it is still not working for me.
Can anybody advise on how to control the alignment in spooled CSV files?
Your response is highly appreciated.
Here is my code :
WHENEVER SQLERROR CONTINUE
SET TIMING off
set feedback off
set heading off
set termout OFF
set pagesize 0
set linesize 200
set verify off
set trimspool ON
SET NEWPAGE NONE
col to_char(glcd.segment1||glcd.segment2||glcd.segment3||glcd.segment4||glcd.segment5||glcd.segment6) ALIAS CONCATENATED_SEGMENTS
col CONCATENATED_SEGMENTS justify left
col to_char(decode(glbal.currency_code,glsob.currency_code, -
(begin_balance_dr - begin_balance_cr) + (period_net_dr - period_net_cr), -
(begin_balance_dr_beq - begin_balance_cr_beq) + (period_net_dr_beq - period_net_cr_beq))) ALIAS Total_Functional_Currency
col Total_Functional_Currency justify left
COLUMN V_INSTANCE NEW_VALUE V_inst noprint
select trim(lower(instance_name)) V_INSTANCE
from v$instance;
column clogname new_value logname
select '/d01/oracle/'|| '&&V_inst' ||'out/outbound/KEMET_BALANCE_FILE_EXTRACT' clogname from dual;
spool &&logname..csv
SELECT glsob.currency_code ||','||
to_char(glcd.segment1||glcd.segment2||glcd.segment3||glcd.segment4||glcd.segment5||glcd.segment6) ||','||
to_char(decode(glbal.currency_code,glsob.currency_code,
(begin_balance_dr - begin_balance_cr) + (period_net_dr -period_net_cr),
(begin_balance_dr_beq - begin_balance_cr_beq) + (period_net_dr_beq -period_net_cr_beq)))
from gl_balances glbal , gl_code_combinations glcd , gl_sets_of_books glsob
where period_name = '&1' /* Period Name */
and glbal.translated_flag IS NULL
and glbal.code_combination_id = glcd.code_combination_id
and glbal.set_of_books_id = glsob.set_of_books_id
and glbal.actual_flag = 'A'
and glsob.short_name in ('KEC-BOOKS' , 'KUE' , 'KEU','KEMS', 'KEAL' , 'KEAL-TW' , 'KEAL-SZ' , 'KEAM')
and glcd.segment1 != '05'
and decode(glbal.currency_code , glsob.currency_code , (begin_balance_dr - begin_balance_cr) + (period_net_dr -period_net_cr) ,
(begin_balance_dr_beq - begin_balance_cr_beq) + (period_net_dr_beq -period_net_cr_beq)) != 0
and glbal.template_id IS NULL
ORDER BY glcd.segment1 || glcd.segment2 || glcd.segment3 || glcd.segment4 || glcd.segment5 || glcd.segment6;
spool off
SET TIMING on
set termout on
set feedback on
set heading on
set pagesize 35
set linesize 100
set echo on
set verify on
Thanks
Sandeep
I think you do not have to worry about your code, since you say the plain-text file looks fine when opened in Notepad. It is Excel that applies the alignment you are seeing: a CSV file carries no formatting of its own, and Excel left-aligns anything it parses as text and right-aligns anything it parses as a number. You might want to read about applying cell styles in Excel by going through its help menu.
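Building on that reply: since the alignment comes entirely from Excel's type guessing, one commonly used workaround is to write a field as a formula literal, ="value", which Excel renders as left-aligned text (it also preserves leading zeros). A minimal sketch in Java; the file name and sample values are invented, and this only helps when Excel is the consumer:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CsvTextHint {
    // Wrapping a field as ="value" makes Excel treat it as text:
    // it stays left-aligned and keeps leading zeros.
    static String asExcelText(String value) {
        return "=\"" + value + "\"";
    }

    public static void main(String[] args) throws IOException {
        String line = String.join(",",
                "USD",
                asExcelText("10000033020000000000000"),
                "-1144206.34");
        Path out = Files.createTempFile("balances", ".csv");  // hypothetical target file
        Files.write(out, List.of(line));
        System.out.println(line); // USD,="10000033020000000000000",-1144206.34
    }
}
```

Plain-text consumers will of course see the literal ="..." wrapper, so this is only appropriate when the file is destined for Excel.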
Similar Messages
-
Problem with date format when ask prompt web-intelligence
BO XI R2 with SP5, installed on Windows 2003 with Russian support.
Inside BO all labels and buttons use Russian. But when I invoke a web report and the prompt appears, there is a problem with the date format.
It looks like a Korean date format, 'jj.nn.aaa H:mm:ss'. I checked the system date settings in Windows; everything is right.
What do I have to do? Where can I change the date format for BO?

GK, try this...
decode(instr(packagename.functionname(param1, param2), '2400'), 0,
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" hh24mi'), 'mm/dd/yyyy hh24mi'), 'mm/dd/yyyy hh24mi'),
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" "2400"') + 1, 'mm/dd/yyyy "0000"'), 'mm/dd/yyyy "0000"'))
- Marilyn
Error in date format when I load a CSV file
I am using Oracle 10g XE and I am trying to load data into my database from comma-separated files.
When I load the data from a CSV file which has dates in the format "DD/MM/YYYY", I receive the error "ORA-01843: not a valid month".
I have NLS_LANG set to AMERICAN. I have tried the command ALTER SESSION SET NLS_DATE_FORMAT='DD/MM/YYYY' and it does nothing. When I run "SELECT SYSDATE "NOW" FROM DUAL;" I get the date in the format "10-NOV-06".
I will appreciate any help about migrating my data with date fields in format DD/MM/YYYY.
Sincerely,
Polonio

See Re: Get error in date when I load a CSV file
-
Problem with finding correct application to open a .csv file from Downloads
Hi guys. I've been a Mac user for some time now but have only just found this forum! Anyway, I've downloaded a .csv file to my Downloads folder, but when I click on it to open it I get the "Can't find an application" message. When the "Choose Application" drop-down menu appears, it won't choose (i.e. the "Open" button won't enable). What is a .csv file, and what application would open it (if I can get the thing to choose one in the first place)?
Your help would be greatly appreciated. (BTW, the file comes from my PayPal history, if that helps.)

A .csv file is a standard text file which typically contains data of some sort.
CSV stands for Comma Separated Values.
You could open the file using TextEdit, or a Word processor... but the content of the file might look a little strange and it won't be easy to read.
Your best bet is to import this .csv file using a Spreadsheet application such as Microsoft Excel or other. -
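To make the "comma-separated values" idea concrete, here is a minimal sketch of reading one such line in Java. The sample line is invented, and note that a naive split is only correct for simple data; a real CSV parser is needed once fields can contain quoted commas:

```java
public class CsvLine {
    // Naive field split: fine for simple data,
    // wrong for quoted fields such as "a,b".
    static String[] parse(String line) {
        return line.split(",");
    }

    public static void main(String[] args) {
        // One record from a hypothetical PayPal-style export
        String[] fields = parse("2012-01-03,Payment,-19.99");
        System.out.println(fields.length); // 3
        System.out.println(fields[2]);     // -19.99
    }
}
```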
Problem with data integration when using KCLJ
Hello,
For a project, I had to integrate a new field using transaction KCLJ.
For this I extended the DDIC structure of the sender structure, and after that I updated the corresponding transfer rules.
When I execute transaction KCLJ I get no error, and table BUT000 is updated with the data from the flat file.
The problem is that it also erases six BUT000 fields; they are not in the sender structure and so have no transfer rules.
Could you help me?

Hi
Please read this.
External Data Transfer
These activities are not relevant if you use a CRM/EBP system.
In the following activities you make definitions for transfer of business partner data or business partner relationship data from an external system to a SAP System.
Data transfer takes place in several stages:
1. Relevant data is read from the external system and placed in a sequential file by the data selection program. The data structure of the file is defined in the sender structure.
This procedure takes place outside of the SAP environment and is not supported by SAP programs. For this reason, data changes can be made at this point by the data selection program.
2. The sequential file is stored on an application server or a presentation server.
3. The SAP transfer program reads data from the file and places this in the sender structure. This does not change the data. This step is carried out internally by the system and does not affect the user.
4. Following transfer rules that have to be defined, the transfer program takes the data from the sender structure and places it in the receiver structure. During this step you can change or convert data.
The receiver structure is firmly defined in the SAP system. Assignment of the sender structure to the transfer program, and of the transfer program to the receiver structure is made using a defined transfer category.
5. The data records in the receiver structure are processed one after the other and, if they do not contain any errors, they are saved in the database.
Before you transfer external data for the first time, make the following determinations:
The structure of the data in the external system may not match the structure expected by the SAP system. You may have to supplement data.
There are two ways in which you can adapt the structure:
You make the required conversions and enhancements within the data selection program prior to beginning the transfer to the SAP system. This will be the most practical solution in most cases since you have the most freedom at this point.
You do the conversion using a specially developed transfer program and transfer rules.
You then define the fields of the sender structure. The system offers you the option of automatically generating a sender structure that is compatible with the receiver structure.
You define transfer rules to create rules according to which the fields of the sender structure are linked with those of the receiver structure.
You now carry out the transfer.
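The flow in steps 1-5 above (sender structure, transfer rules, receiver structure) can be sketched generically. This is plain Java purely for illustration, not SAP code; every name here is invented:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class TransferSketch {
    // A "transfer rule" maps one sender field to one receiver field,
    // optionally converting the value on the way (step 4 above).
    record Rule(String senderField, String receiverField,
                Function<String, String> convert) {}

    static Map<String, String> apply(Map<String, String> senderRecord,
                                     Iterable<Rule> rules) {
        Map<String, String> receiver = new HashMap<>();
        for (Rule r : rules) {
            String raw = senderRecord.get(r.senderField());
            if (raw != null) {
                receiver.put(r.receiverField(), r.convert().apply(raw));
            }
        }
        return receiver; // step 5 would validate and save this record
    }

    public static void main(String[] args) {
        Map<String, String> sender = Map.of("NAME1", "acme corp");
        var rules = java.util.List.of(
                new Rule("NAME1", "PARTNER_NAME", String::toUpperCase));
        System.out.println(apply(sender, rules)); // {PARTNER_NAME=ACME CORP}
    }
}
```

Note that only fields covered by a rule reach the receiver record, which mirrors the problem reported above: sender fields without transfer rules contribute nothing.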
SAP Enhancements for External Data Transfer
The following SAP enhancements are offered in the following areas of External Data Transfer:
Four customer exits exist for the data transfer and for the conversion from IDoc segments. The exits are contained in the enhancement KKCD0001. As soon as the customer exits are activated, they are carried out for all sender structures or segments. The first two customer exits require minimal coding once they are activated.
The sender structure concept is used when loading data into the SAP system; the segment concept is used in the context of distribution within the SAP system. Either way, it is a matter of a record of data to be transferred or converted.
It is advisable to code a CASE instruction within the customer exit, where (differentiated according to sender structure (REPID) or segment) different coding is reached. With conversion from IDoc segments, the parameter REPID contains the name of the segment; the parameter GRPID is not filled. You should have a WHEN OTHERS branch within the CASE instruction, in which 'SENDER_SET' is assigned to 'SENDER_SET_NEW' (or 'RECEIVER_SET' to 'RECEIVER_SET_NEW'); otherwise the return code keeps its initial value. You can view a possible solution in the code sample.
The first Customer Exit is accessed before the summarizing or conversion. It is called up as follows:
CALL CUSTOMER-FUNCTION '001'
  EXPORTING
    GRPID          = GRPID       " origin
    REPID          = REPID       " sender program
    SENDER_SET     = SENDER_SET  " sender record
  IMPORTING
    SENDER_SET_NEW = SENDER_SET  " modified sender record
    SUBRC          = SUBRC.      " return code
If the variable 'SUBRC' is initial, the modified record is processed further; otherwise it is skipped. The import parameter 'SENDER_SET_NEW' must be filled in the customer exit, as only this field, and not the field 'SENDER_SET', is processed further. In particular, this means you must assign the value of 'SENDER_SET' to the import parameter 'SENDER_SET_NEW' even for records that need no special handling.
The second Customer Exit is accessed after the summarization and before the update:
CALL CUSTOMER-FUNCTION '002'
  EXPORTING
    REPID            = REPID         " sender program
    GRPID            = GRPID         " origin
    RECEIVER_SET     = RECEIVER_SET  " summarized record
  IMPORTING
    RECEIVER_SET_NEW = RECEIVER_SET  " modified summarized record
    SUBRC            = SUBRC.        " return code
The modified record is only updated if the variable 'SUBRC'
is initial.
The import parameter 'RECEIVER_SET_NEW' must be filled in the customer exit, since only this field, and not the field 'RECEIVER_SET', is updated.
The third Customer Exit is used for replacing variables. It is called up when you load the transfer rules.
CALL CUSTOMER-FUNCTION '003'
  EXPORTING
    REPID = REPID
    GRPID = GRPID
    VARIA = VARIA
    RFELD = RFELD
    VARTP = VARTP
  CHANGING
    KEYID = KEYID
  EXCEPTIONS
    VARIABLE_ERROR = 1.
The parameters REPID and GRPID are supplied with the sender structure and the origin. The variable name is in the field VARIA. The name of the receiver field is in the parameter RFELD. Field VARTP contains the variable type; valid types are fixed values of the domain KCD_VARTYP. You transfer the variable values in the parameter KEYID. If an error occurs, you raise the exception VARIABLE_ERROR.
The fourth customer exit is required in EC-EIS only. It is called up after the summarization and before the determination of key figures. It is a necessary enhancement to the second customer exit, because changes to the keys are considered before the database is checked to see whether records exist for those keys.
The function is called up as follows:
CALL CUSTOMER-FUNCTION '004'
  CHANGING
    RECEIVER_SET = R
    SUBRC        = UE_SUBRC.
The parameter RECEIVER_SET contains the receiver record to be changed; it is a CHANGING parameter. The function module does not need to be modified if it is not used.
The User-Exits can be found in the Module pool 'SAPFKCIM'. If you want to use the Customer Exits, you can create a project and activate the Customer Exits with the transaction 'CMOD'. The enhancement which you must use with it is KKCD0001.
Note that when programming customer exits, these will also run when corrected data records are imported into the data pool during post-processing, for both test runs and real runs.
I will provide some pointers soon. Give me some time.
Hope this will help.
Please reward suitable points.
Regards
- Atul -
Problem with data format when getting from a database
I'm trying to get a date from the database, but the problem is that it also returns the time. I have tried to change the format of the date when it is retrieved from my database, but it still returns the time.
Any solutions?
String theDBDate = rset.getString("date_of_call");
SimpleDateFormat formatterdate = new SimpleDateFormat ("EEE, MMM d, ''yy");
String date = formatterdate.format(theDBDate);
theCalls.setDateofCall(date); // calling my Call class to store the date, which is a string

Hi, could anyone help me? I have had this problem for the last few days. I gave up on it for a while, but today I'm trying to get to the bottom of it.
I am connecting to an Oracle database, but it returns the date and time; I want only the date.
Here is my code.
Locale currentLocale= new Locale("en","GB");
Date today = new Date();
DateFormat formatter = DateFormat.getDateInstance(DateFormat.DATE_FIELD,currentLocale);
String theSQL = (" Select * From Call ") ;
Statement stmt;
stmt = conn.createStatement();
ArrayList CallsList = new ArrayList();
ResultSet rset;
rset = stmt.executeQuery(theSQL);
while (rset.next()) {
    Call theCalls = new Call();
    theCalls.setCallNo(rset.getInt("call_no"));
    theCalls.setUsername(rset.getString("username"));
    theCalls.setCompID(rset.getString("comp_id"));
    theCalls.setTimeofCall(rset.getString("time_of_call"));
    Date todaydate = rset.getDate("date_of_call");
    Timestamp theDBDate = rset.getTimestamp("date_of_call");
    SimpleDateFormat formatterdate = new SimpleDateFormat("EEE, MMM d, ''yy");
    String date = formatterdate.format(theDBDate);
    theCalls.setDateofCall(date);
    theCalls.setPriorty(rset.getString("priorty"));
    theCalls.setProbCat(rset.getString("problem_cat"));
    theCalls.setProbDesc(rset.getString("problem_desc"));
    theCalls.setCloseDate(rset.getString("close_date"));
    theCalls.setProbSol(rset.getString("problem_sol"));
    theCalls.setStatus(rset.getString("status"));
    theCalls.setAssign(rset.getString("assign"));
    CallsList.add(theCalls);
}
conn.close();
return CallsList; -
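For the date-only requirement in the thread above, either read the column with rset.getDate (java.sql.Date carries no time of day) or keep getTimestamp and rely on a date-only format pattern, which simply omits the time fields. A minimal sketch, with a hard-coded timestamp standing in for rset.getTimestamp("date_of_call"):

```java
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.Locale;

public class DateOnly {
    // The pattern contains no hour/minute fields, so the time is dropped
    static String dateOnly(Timestamp t) {
        return new SimpleDateFormat("EEE, MMM d, ''yy", Locale.ENGLISH).format(t);
    }

    public static void main(String[] args) {
        // Stand-in for rset.getTimestamp("date_of_call")
        Timestamp dbValue = Timestamp.valueOf("2006-11-10 14:32:05");
        System.out.println(dateOnly(dbValue)); // Fri, Nov 10, '06
    }
}
```

Pinning the Locale makes the output deterministic; the code in the thread uses the JVM default locale, so its month and weekday names depend on the machine.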
Problem with data selection when inserted through textarea
Hi there,
I am inserting data from a textarea into a database field
and then displaying it. The data is displayed, but as one continuous string.
I want to display it as it was inserted, with new lines.
For example, if I entered this through the <textarea>:
hello
all
there
it displays as:
hello all there
but I want it displayed as it was entered.
i have three pages as given below and my database field is varchar (using oracle 9i)
1.first.html
<html>
<body>
<form method="post" action="insert.jsp">
<textarea name="text">
</textarea>
<input type="submit"></input>
</form>
</body>
</html>

2. insert.jsp
<%@ page import = "java.io.*"%>
<%@ page import="java.sql.*" %>
<%!
String txt;
%>
<%
Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
Connection con =DriverManager.getConnection("jdbc:odbc:ods","scott","tiger");
txt= request.getParameter("text");
out.println(txt);
PreparedStatement ps = con.prepareStatement("insert into text values('"+txt+"')");
ps.executeUpdate();
out.println("values inserted successfully");
%>

3. display.jsp
<%@ page import = "java.io.*"%>
<%@ page import="java.sql.*" %>
<%
Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
Connection con =DriverManager.getConnection("jdbc:odbc:ods","scott","tiger");
PreparedStatement ps = con.prepareStatement("select * from text");
ResultSet rs=ps.executeQuery();
while (rs.next()) {
    out.println(rs.getString(1));
}
%>

Thanks & regards,
Sweety

This is a feature of HTML. It's the way it is supposed to work.
If you view the source of your generated JSP page, you should see that the HTML exactly mirrors what you typed into the textarea.
If you don't want it acting this way, then either:
1 - put it in a textarea again,
2 - put <pre> tags around what you want to display unformatted,
or
3 - change all newlines to <br> in the HTML.
This is what you want in the html:
<pre>
Mulitple
line
Data
</pre> -
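Option 3 above (changing newlines to <br>) is a one-line replacement; a minimal sketch:

```java
public class NewlineToBr {
    // Browsers collapse newlines in HTML, so convert them to <br> tags
    // before rendering text captured from a <textarea>.
    static String toHtml(String text) {
        return text.replaceAll("\r\n|\r|\n", "<br>");
    }

    public static void main(String[] args) {
        System.out.println(toHtml("hello\nall\nthere")); // hello<br>all<br>there
    }
}
```

In real code the text should also be HTML-escaped first, and the insert should bind the value with a PreparedStatement parameter rather than the string concatenation used in insert.jsp above.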
Problem with data connection in adobe forms using wsdl file....
Hi Experts,
I am trying to create an Adobe form in which I need to create a data connection using a WSDL file, but I run into the problem that it cannot find the path of the URL.
I tried to create the data connection in two ways and came across the same error:
1) using the URL of the generated WSDL file;
2) saving the file as a WSDL file on the desktop and using that to create it.
In both cases I get the same error.
Note: I am not using Web Dynpro; I am doing this in ABAP.
Regards,
Madhu

@ankur17
Won't insult you by suggesting that you haven't already checked Settings > mobile network > Data connection = On, as you have confirmed already that Internet Sharing is working.
You may need to go to Settings > about > reset your phone and see if by setting up phone afresh it reads settings and applies correctly from SIM card.
Is the Network Setup application included, or available from Marketplace > Nokia Collection? It might be worth running this application too.
If this issue still remains you may need to have device software re-installed at Nokia Care Point to try to resolve this.
Happy to have helped forum with a Support Ratio = 42.5 -
Strange problem with my ClassFactory when a non used jar file is missing.
Hello,
I have a small problem regarding a class factory and I don't really understand how to solve it.
If you see the example below we can imagine that I have a classFactory who can provide 3 different types of classes.
public class ClassFactory {

    public ClassFactory() {
    }

    public static Service getClass(String szType) {
        if (szType.equals("1")) {
            return new Class1();
        } else if (szType.equals("2")) {
            return new Class2();
        } else if (szType.equals("3")) {
            return new Class3();
        } else {
            return null;
        }
    }
}
My problem is the following. Let's imagine that Class1 and Class2 do not use any specific jar file, but Class3 uses a jar file which is not always present in old installations.
I thought it would be possible to get this factory running correctly if the customer only uses Class1 and Class2, even if the specific jar file used by Class3 is not present, but this is not the case.
I noticed that if the specific jar file used in Class3 is not present in my classpath, the ClassFactory fails (even if we never try to get a "Class3").
My problem is that I want to keep my code backward compatible without asking the customer to add a new jar file that is only used in new installations.
Can anyone of you explain me why it is reacting like this and how can I solve this problem?
Thanks in advance for your reply.
Alain.

There's a standard way of handling these things now. You place a text file in each jar that contributes candidate concrete classes, with the FQN of an implementing class on each line. Your factory class then collects all these using getResources() on the context class loader and loads the classes dynamically.
The text file goes in the directory META-INF/services, and the file name is the FQN of the interface or abstract class they all implement.
Typically there's a small factory class for each implementation and you ask each, in turn, whether they accept the specifications the caller requested of your factory (typically static getInstance() method).
Or you could, for example, put the selection name/index or whatever in each class as an annotation. -
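A lighter-weight variant of the same idea, closer to the original factory: reference the optional class only by name, so the JVM does not try to load it until type "3" is actually requested. A sketch with invented names (the stand-ins for Class1/Class2 are arbitrary JDK types, and com.example.OptionalClass3 is hypothetical):

```java
public class LazyFactory {
    // The type-"3" implementation lives in an optional jar. Referencing it
    // only via its name means the JVM does not load it until it is requested,
    // so a missing jar cannot break types "1" and "2".
    public static Object getService(String type) throws Exception {
        switch (type) {
            case "1": return new StringBuilder();          // stand-in for Class1
            case "2": return new java.util.ArrayList<>();  // stand-in for Class2
            case "3":
                return Class.forName("com.example.OptionalClass3")
                            .getDeclaredConstructor()
                            .newInstance();
            default:  return null;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(getService("1") != null); // true, no optional jar needed
        try {
            getService("3");
        } catch (ClassNotFoundException e) {
            System.out.println("optional jar missing: only type 3 fails");
        }
    }
}
```

java.util.ServiceLoader wraps the META-INF/services mechanism described above, so in modern code you would usually call ServiceLoader.load(Service.class) instead of rolling your own lookup.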
I'm having a problem with dates when I send my Numbers doc to Excel. The dates are all out, and the recipients have to cut and paste individual entries onto their spreadsheet. Any idea how I can prevent this?
I'm using Lion on an MBP, and Numbers is the latest version.

May you give more details about what is wrong with your dates?
M…oSoft products aren't allowed on my machines but I use LibreOffice which is a clone of Office.
When I export from Numbers to Excel and open the result with LibreOffice, the dates are correctly treated.
To be precise, dates after 01/01/1904 are correctly treated. Dates before 01/01/1904 are exported as strings but, as this is flagged during the export process, it's not surprising.
Yvan KOENIG (VALLAURIS, France) mardi 3 janvier 2012
iMac 21”5, i7, 2.8 GHz, 12 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.2
My iDisk is : http://public.me.com/koenigyvan
Please : Search for questions similar to your own before submitting them to the community
For iWork's applications dedicated to iOS, go to :
https://discussions.apple.com/community/app_store/iwork_for_ios -
Got one more problem Merilyn and Radhakrishnan...
Regarding the solution you provided me earlier in the thread "Problem with date format"...
What is happening is: I am able to change the 2400 to 0000, but when it is changed from 2400 on Jan 1st to 0000, the hour changes but not the date; the date still remains Jan 1st instead of Jan 2nd.
E.g.: Jan 1st 2400 -- changed to -- Jan 1st 0000
instead of Jan 2nd 0000
Could you please help me in this issue...
Thanks,
GK

GK, try this...
decode(instr(packagename.functionname(param1, param2), '2400'), 0,
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" hh24mi'), 'mm/dd/yyyy hh24mi'), 'mm/dd/yyyy hh24mi'),
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" "2400"') + 1, 'mm/dd/yyyy "0000"'), 'mm/dd/yyyy "0000"'))
- Marilyn
Tp ended with error code 0247 - addtobuffer has problems with data- and/or
Hello Experts,
If you give some idea, it will be greatly appreciated.
This transport issue started after a power outage; the SAP system went through a hard shutdown.
Then we brought the system back up. Before that, we did not have this transport issue.
Our TMS landscape is:
DEV -> QA -> PRD
SED -> SEQ -> SEP
DEV has the TMS domain controller.
FYI:
* At OS level, when we run the scp command as the root user, it works fine for any TR.
In STMS, while adding TR in SEQ(QA system), we are getting error like this.
Error:
Transport control program tp ended with error code 0247
Message no. XT200
Diagnosis
An error occurred when executing a tp command.
Command: ADDTOBUFFER SEDK906339 SEQ client010 pf=/us
Return code: 0247
Error text: addtobuffer has problems with data- and/or
Request: SEDK906339
System Response
The function terminates.
Procedure
Correct the error and execute the command again if necessary.
This is tp version 372.04.71 (release 700, unicode enabled)
Addtobuffer failed for SEDK906339.
Neither datafile nor cofile exist (cofile may also be corrupted).
standard output from tp and from tools called by tp:
tp returncode summary:
TOOLS: Highest return code of single steps was: 0
ERRORS: Highest tp internal error was: 0247

When we do scp using SM69,
SEDADM@DEVSYS:/usr/sap/trans/cofiles/K906339.SED SEQADM@QASYS:/usr/sap/trans/cofiles/.
it throws the error like below,
Host key verification failed.
External program terminated with exit code 1
Thanks
Praba -
I have one problem with Data Guard. My archive log files are not applied.
I have one problem with Data Guard: my archive log files are not applied, although all archive log files have been received by my physical standby database.
I have created a Physical Standby database on Oracle 10gR2 (Windows XP professional). Primary database is on another computer.
In Enterprise Manager on the primary database everything looks OK; I get the message "Data Guard status: Normal".
But, as I wrote above, the archive log files are not applied.
After I created the Physical Standby database, I have also done:
1. I connected to the Physical Standby database instance.
CONNECT SYS/SYS@luda AS SYSDBA
2. I started the Oracle instance at the Physical Standby database without mounting the database.
STARTUP NOMOUNT PFILE=C:\oracle\product\10.2.0\db_1\database\initluda.ora
3. I mounted the Physical Standby database:
ALTER DATABASE MOUNT STANDBY DATABASE
4. I started redo apply on Physical Standby database
alter database recover managed standby database disconnect from session
5. I switched the log files on Physical Standby database
alter system switch logfile
6. I verified the redo data was received and archived on Physical Standby database
select sequence#, first_time, next_time from v$archived_log order by sequence#
SEQUENCE# FIRST_TIME NEXT_TIME
3 2006-06-27 2006-06-27
4 2006-06-27 2006-06-27
5 2006-06-27 2006-06-27
6 2006-06-27 2006-06-27
7 2006-06-27 2006-06-27
8 2006-06-27 2006-06-27
7. I verified the archived redo log files were applied on Physical Standby database
select sequence#,applied from v$archived_log;
SEQUENCE# APP
4 NO
3 NO
5 NO
6 NO
7 NO
8 NO
8. on Physical Standby database
select * from v$archive_gap;
No rows
9. on Physical Standby database
SELECT MESSAGE FROM V$DATAGUARD_STATUS;
MESSAGE
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC3: Archival started
ARC4: Archival started
ARC5: Archival started
ARC6: Archival started
ARC7: Archival started
ARC8: Archival started
ARC9: Archival started
ARCa: Archival started
ARCb: Archival started
ARCc: Archival started
ARCd: Archival started
ARCe: Archival started
ARCf: Archival started
ARCg: Archival started
ARCh: Archival started
ARCi: Archival started
ARCj: Archival started
ARCk: Archival started
ARCl: Archival started
ARCm: Archival started
ARCn: Archival started
ARCo: Archival started
ARCp: Archival started
ARCq: Archival started
ARCr: Archival started
ARCs: Archival started
ARCt: Archival started
ARC0: Becoming the 'no FAL' ARCH
ARC0: Becoming the 'no SRL' ARCH
ARC1: Becoming the heartbeat ARCH
Attempt to start background Managed Standby Recovery process
MRP0: Background Managed Standby Recovery process started
Managed Standby Recovery not using Real Time Apply
MRP0: Background Media Recovery terminated with error 1110
MRP0: Background Media Recovery process shutdown
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[1]: Assigned to RFS process 2148
RFS[1]: Identified database type as 'physical standby'
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[2]: Assigned to RFS process 2384
RFS[2]: Identified database type as 'physical standby'
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[3]: Assigned to RFS process 3188
RFS[3]: Identified database type as 'physical standby'
Primary database is in MAXIMUM PERFORMANCE mode
Primary database is in MAXIMUM PERFORMANCE mode
RFS[3]: No standby redo logfiles created
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[4]: Assigned to RFS process 3168
RFS[4]: Identified database type as 'physical standby'
RFS[4]: No standby redo logfiles created
Primary database is in MAXIMUM PERFORMANCE mode
RFS[3]: No standby redo logfiles created
10. on Physical Standby database
SELECT PROCESS, STATUS, THREAD#, SEQUENCE#, BLOCK#, BLOCKS FROM V$MANAGED_STANDBY;
PROCESS STATUS THREAD# SEQUENCE# BLOCK# BLOCKS
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
RFS IDLE 0 0 0 0
RFS IDLE 0 0 0 0
RFS IDLE 1 9 13664 2
RFS IDLE 0 0 0 0
10) on Primary database:
select message from v$dataguard_status;
MESSAGE
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC3: Archival started
ARC4: Archival started
ARC5: Archival started
ARC6: Archival started
ARC7: Archival started
ARC8: Archival started
ARC9: Archival started
ARCa: Archival started
ARCb: Archival started
ARCc: Archival started
ARCd: Archival started
ARCe: Archival started
ARCf: Archival started
ARCg: Archival started
ARCh: Archival started
ARCi: Archival started
ARCj: Archival started
ARCk: Archival started
ARCl: Archival started
ARCm: Archival started
ARCn: Archival started
ARCo: Archival started
ARCp: Archival started
ARCq: Archival started
ARCr: Archival started
ARCs: Archival started
ARCt: Archival started
ARCm: Becoming the 'no FAL' ARCH
ARCm: Becoming the 'no SRL' ARCH
ARCd: Becoming the heartbeat ARCH
Error 1034 received logging on to the standby
Error 1034 received logging on to the standby
LGWR: Error 1034 creating archivelog file 'luda'
LNS: Failed to archive log 3 thread 1 sequence 7 (1034)
FAL[server, ARCh]: Error 1034 creating remote archivelog file 'luda'
11)on primary db
select name,sequence#,applied from v$archived_log;
NAME SEQUENCE# APP
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00003_0594204176.001 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00004_0594204176.001 4 NO
Luda 4 NO
Luda 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00005_0594204176.001 5 NO
Luda 5 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00006_0594204176.001 6 NO
Luda 6 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00007_0594204176.001 7 NO
Luda 7 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00008_0594204176.001 8 NO
Luda 8 NO
12) on standby db
select name,sequence#,applied from v$archived_log;
NAME SEQUENCE# APP
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00004_0594204176.001 4 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00003_0594204176.001 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00005_0594204176.001 5 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00006_0594204176.001 6 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00007_0594204176.001 7 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00008_0594204176.001 8 NO
13) my init.ora files
On standby db
irina.__db_cache_size=79691776
irina.__java_pool_size=4194304
irina.__large_pool_size=4194304
irina.__shared_pool_size=75497472
irina.__streams_pool_size=0
*.audit_file_dest='C:\oracle\product\10.2.0\admin\luda\adump'
*.background_dump_dest='C:\oracle\product\10.2.0\admin\luda\bdump'
*.compatible='10.2.0.1.0'
*.control_files='C:\oracle\product\10.2.0\oradata\luda\luda.ctl'
*.core_dump_dest='C:\oracle\product\10.2.0\admin\luda\cdump'
*.db_block_size=8192
*.db_domain=''
*.db_file_multiblock_read_count=16
*.db_file_name_convert='luda','irina'
*.db_name='irina'
*.db_unique_name='luda'
*.db_recovery_file_dest='C:\oracle\product\10.2.0\flash_recovery_area'
*.db_recovery_file_dest_size=2147483648
*.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
*.fal_client='luda'
*.fal_server='irina'
*.job_queue_processes=10
*.log_archive_config='DG_CONFIG=(irina,luda)'
*.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/luda/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=luda'
*.log_archive_dest_2='SERVICE=irina LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=irina'
*.log_archive_dest_state_1='ENABLE'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_max_processes=30
*.log_file_name_convert='C:/oracle/product/10.2.0/oradata/irina/','C:/oracle/product/10.2.0/oradata/luda/'
*.open_cursors=300
*.pga_aggregate_target=16777216
*.processes=150
*.remote_login_passwordfile='EXCLUSIVE'
*.sga_target=167772160
*.standby_file_management='AUTO'
*.undo_management='AUTO'
*.undo_tablespace='UNDOTBS1'
*.user_dump_dest='C:\oracle\product\10.2.0\admin\luda\udump'
On primary db
irina.__db_cache_size=79691776
irina.__java_pool_size=4194304
irina.__large_pool_size=4194304
irina.__shared_pool_size=75497472
irina.__streams_pool_size=0
*.audit_file_dest='C:\oracle\product\10.2.0/admin/irina/adump'
*.background_dump_dest='C:\oracle\product\10.2.0/admin/irina/bdump'
*.compatible='10.2.0.1.0'
*.control_files='C:\oracle\product\10.2.0\oradata\irina\control01.ctl','C:\oracle\product\10.2.0\oradata\irina\control02.ctl','C:\oracle\product\10.2.0\oradata\irina\control03.ctl'
*.core_dump_dest='C:\oracle\product\10.2.0/admin/irina/cdump'
*.db_block_size=8192
*.db_domain=''
*.db_file_multiblock_read_count=16
*.db_file_name_convert='luda','irina'
*.db_name='irina'
*.db_recovery_file_dest='C:\oracle\product\10.2.0/flash_recovery_area'
*.db_recovery_file_dest_size=2147483648
*.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
*.fal_client='irina'
*.fal_server='luda'
*.job_queue_processes=10
*.log_archive_config='DG_CONFIG=(irina,luda)'
*.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/irina/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=irina'
*.log_archive_dest_2='SERVICE=luda LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=luda'
*.log_archive_dest_state_1='ENABLE'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_max_processes=30
*.log_file_name_convert='C:/oracle/product/10.2.0/oradata/luda/','C:/oracle/product/10.2.0/oradata/irina/'
*.open_cursors=300
*.pga_aggregate_target=16777216
*.processes=150
*.remote_login_passwordfile='EXCLUSIVE'
*.sga_target=167772160
*.standby_file_management='AUTO'
*.undo_management='AUTO'
*.undo_tablespace='UNDOTBS1'
*.user_dump_dest='C:\oracle\product\10.2.0/admin/irina/udump'
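For reference, the listings above show every log with APPLIED=NO. A common cause, assuming the standby is mounted and reachable, is that managed recovery (MRP) was never started on the standby. A hedged check from SQL*Plus on the standby (standard 10.2 views and syntax):

```sql
-- Verify the media recovery process (MRP0) is running on the standby:
SELECT process, status, sequence# FROM v$managed_standby;

-- If no MRP row appears, start redo apply (Oracle 10.2 syntax):
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;

-- Check for an archive gap that would stall apply:
SELECT * FROM v$archive_gap;
```

If a gap is reported, the missing archived logs have to be registered on the standby before apply can continue.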
Please help me!
Hi,
After several tries my redo logs are being applied now. I think in my case it had to do with tnsnames.ora: at the moment I have both databases in both tnsnames.ora files using the SID and not the SERVICE_NAME.
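For anyone hitting the same thing, a minimal tnsnames.ora entry keyed on the SID rather than the SERVICE_NAME looks like the sketch below (host, port, and the SID value are placeholders; the SID must match the actual instance name on the standby host):

```sql
-- tnsnames.ora fragment (Oracle Net syntax, placeholders marked)
LUDA =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = standby-host)(PORT = 1521))
    (CONNECT_DATA =
      (SID = luda)   -- must match the instance SID, not the service name
    )
  )
```

The fal_client/fal_server and log_archive_dest_2 SERVICE= names in the init.ora files above must resolve through entries like this on both sides.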
Now I want to use DGMGRL. Adding a configuration and a standby database works fine, but when I try to enable the configuration, DGMGRL gives no feedback and looks like it is hanging. The log, however, says that it succeeded.
In another session 'show configuration' results in the following, confirming that the enable succeeded.
DGMGRL> show configuration
Configuration
  Name:                avhtest
  Enabled:             YES
  Protection Mode:     MaxPerformance
  Fast-Start Failover: DISABLED
  Databases:
    avhtest     - Primary database
    avhtestls53 - Physical standby database
Current status for "avhtest":
Warning: ORA-16610: command 'ENABLE CONFIGURATION' in progress
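ORA-16610 here usually just means the broker is still working through the ENABLE. A hedged way to watch progress, using the database names from the output above (standard DGMGRL commands):

```sql
DGMGRL> show configuration verbose
DGMGRL> show database verbose 'avhtestls53'
```

It can also help to tail the broker's own trace file, drc<SID>.log, in each database's background_dump_dest, which records what the DMON process is actually doing during the enable.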
Is there anybody who has experienced the same problem and/or knows the solution?
With kind regards,
Martin Schaap
Hi to everyone,
I have a problem with data acquisitioning in LV 7.1.
I made the transition from Traditional NI-DAQ to NI-DAQmx in my LabVIEW application.
The problem is that when I acquire data with Traditional NI-DAQ (just reading, without writing anywhere), there is no scan backlog. But when the acquisition is based on DAQmx, the scan backlog indicator shows numbers from 20 to 50 for about 6 minutes, and then that number increases quite quickly until I get an error ("Unable to acquire data. The data was overwritten.").
The acquisition settings are the same in both cases. When I acquire with DAQmx I use global channels. Could the cause of this behaviour be in the global channels' data processing? It seems strange that it runs quite smoothly for about 6 minutes and then gets stuck.
Best regards,
Ero
If you have an old DAQ unit it may not be compatible with DAQmx. Which DAQ unit do you have? I think NI has a list showing which DAQ driver you can use with your card.
Besides which, my opinion is that Express VIs, like Carthage, must be destroyed (read: deleted).
(Sorry, no LabVIEW "brag list" so far)
URGENT! JDEV 10.1.2: Problem with data control generated from session bean
I have a problem with a data control generated from a session bean which returns a collection of data transfer objects.
The DTOs seem to be correct: the session bean loads the data correctly and the objects are full of data. Using the console to display the DTO content works fine.
When I generate a data control from this session bean and associate the DTO contained in the collection, only the first object level and one-to-one DTO objects are correctly set in the data control. Objects that represent collections inside the DTO (one-to-many foreign keys) are set as collections with an iterator, but the structure of those objects is not set. I don't know how to associate this second level of collection with the DTO bean class in order to obtain the attribute definitions.
I created a test case with the HR schema, like the hrApp demo application in the tutorial, using the departments and employees tables. I got the same problem.
Is it a bug?
Is there a workaround to force the data control to understand the collection's data structure?
Help is welcome! This is urgent!
We found the problem: assign the child DTO bean class to the node representing the iterator in the XML file corresponding to the master DTO.
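To illustrate the workaround, here is a sketch of what that edit in the master DTO's structure XML might look like. The element and attribute names below are assumptions for illustration only (not verified against the JDeveloper 10.1.2 metadata schema); the point is simply that the iterator node for the child collection needs the child bean class spelled out:

```xml
<!-- Hypothetical fragment of the master DTO's structure XML.
     Element/attribute names and the BeanClass value are assumptions. -->
<AccessorAttribute id="employees"
                   IsCollection="true"
                   BeanClass="model.EmployeeDTO"/>
```

With the child bean class named on the accessor, the data control can expose the attribute definitions of the nested collection instead of an untyped iterator.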