A problem regarding Date?
Hi,
I have created an InputField UI element of type date. I want an error to appear when the user does not fill in this field.
I am doing it in this fashion:
view.java
// wdInit()
this.initialize();

void initialize() {
    wdContext.currentContextElement().setInputfield(new Date(0));
}

// checkMandatory(java.lang.String fieldName) -- generates the error message

// onActionMyActionClick()
void onActionMyActionClick() {
    Date date1 = wdContext.currentContextElement().getInputfield();
    if (date1.equals(new Date(0))) {
        this.checkMandatory("Inputfield");
    }
}
But in my application 01/01/1970 appears by default in my input field, which I do not want.
How can I avoid this?
Please help me.
regards
Nidhideep
Hi,
You can use the code below to check.
If you don't want to see anything in the input field, just don't initialize it with any value in initialize().
Date dd = wdContext.currentContextElement().getDate1();
if (dd == null) {
    wdComponentAPI.getMessageManager().reportSuccess("Empty");
    checkRequired();
} else {
    wdComponentAPI.getMessageManager().reportSuccess("Not Empty");
}
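As background on why 01/01/1970 shows up at all: new Date(0) is the Unix epoch, so initializing the context attribute with it makes the field display that date, while leaving it unset (null) shows an empty field. A standalone plain-Java sketch (outside Web Dynpro; the class name here is made up for illustration):

```java
import java.util.Date;

public class EpochDemo {
    public static void main(String[] args) {
        // new Date(0) represents the Unix epoch: Jan 1, 1970 00:00:00 UTC.
        // Initializing the context attribute with it is what made the
        // input field display 01/01/1970.
        Date epoch = new Date(0);
        System.out.println(epoch.getTime()); // prints 0

        // Leaving the attribute unset (null) shows an empty field instead,
        // and a null check catches the "not entered" case.
        Date unset = null;
        System.out.println(unset == null); // prints true
    }
}
```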
Regards,
Sridhar
Message was edited by: Sridhar kanchanapalli
Similar Messages
-
Problem regarding data function 'SUMGT' (Overall Result) in BEx query desig
Hi,
I have created a query where I want to see inventory data month-wise. So I first placed 0CALMONTH (with a variable) in the columns, then put all the key figures I want to see month-wise under one structure, and placed that structure under 0CALMONTH in the columns. In that structure I have one key figure, 'total stock', and a global calculated key figure, '% of total inventory' (= 'total stock' %A SUMGT 'total stock').
Now the problem is that whenever I give a month range from 02.2006 to 05.2006, it calculates the value of '% of total inventory' for every month with respect to the overall result of 'total stock' for month 05.2006.
But I want the value of '% of total inventory' for each month to be calculated with respect to the overall result of 'total stock' for the corresponding month.
Please suggest what I should do.
Thanks,
Arijit.
Hi,
Now I see what the issue is.
Check the aggregation properties of 'total stock' (RSD1 > enter the technical name of 'total stock' > Display > Aggregation tab).
Exception aggregation has probably been set to 'Last Value' with some time characteristic as the aggregation reference characteristic.
That is the reason why SUMGT (or %SUMGT) is calculated on the last value of the 0CALMONTH interval.
I.e., if you enter 02.2006 to 07.2006, '% of total inventory' for every month is calculated with respect to the value for 07.2006.
Hope this helps.
Don't forget to reward if it helps!
thanks
Message was edited by: Murali -
Hi All,
I have a problem regarding data export. When I use the EXP command from my Oracle form to export data from the database, the dump file is larger than the dump produced when I export data from the same database using DBMS_DATAPUMP. With EXP the dump size is, say, 40,225 KB, while for the same database DBMS_DATAPUMP produces, say, 36,125 KB. Why does this difference occur? Is it a problem? What would be the solution?
Please Help ASAP.
Thanks in advance.
Regards
Sanjit Kr. Mahato
Edited by: userSanjit on Jul 23, 2009 6:19 PM
Edited by: userSanjit on Jul 23, 2009 6:24 PM
Hi,
Expdp (Data Pump) and Exp are different Oracle export utilities, and hence the output file sizes are not the same; that is why the difference occurs.
No, this is not a problem.
Since it is not a problem, there is no solution needed.
Why do you see this as a problem?
Cheers
Anurag -
Problem while data processing TRANSACTION data from DSO to CUBE
Hi Guru's,
We are facing a problem while processing transaction data from a DSO to a cube. The data packets are processing and updating very slowly. Please help me regarding this.
Thanks and regards,
Sridhar
Hi,
I suggest you check a few places where you can see the status:
1) SM37 job log (search by the BI request name): it should give you details about the request. If it is active, make sure the job log is being updated at frequent intervals.
2) SM66: get the job details (server name, PID, etc. from SM37) and check whether the job is running. See whether it is accessing/updating some tables or not doing anything at all.
If it is running and you can see it active in SM66, you can wait for some time to let it finish.
3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
4) ST22: check whether any short dump has occurred.
You can also try SM50 / SM51 to see what is happening at the system level, such as reading/inserting into tables.
If you feel it is active and running, you can verify this by checking whether the number of records in the cube has increased.
Thanks,
JituK -
Got one more problem, Marilyn and Radhakrishnan...
Regarding the solution you provided me earlier in the thread "Problem with date format":
What is happening is that I am able to change 2400 to 0000, but when it is changed from 2400 on Jan 1st to 0000, the hour changes while the date does not. The date still remains Jan 1st instead of Jan 2nd.
E.g.: Jan 1st 2400 -- changed to -- Jan 1st 0000,
instead of Jan 2nd 0000.
Could you please help me with this issue?
Thanks,
GK
GK, try this...
decode(instr(packagename.functionname(param1, param2), '2400'), 0,
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" hh24mi'), 'mm/dd/yyyy hh24mi'), 'mm/dd/yyyy hh24mi'),
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'), 'Month dd, yyyy "at" "2400"') + 1, 'mm/dd/yyyy "0000"'), 'mm/dd/yyyy "0000"'))
-Marilyn -
Problem with date format from Oracle DB
Hi,
I am facing a problem with date fields from Oracle DB sources. The field's format in the DB table is 'Date base type is DATE and DDIC type is DATS'.
I mapped the date fields to date characteristics in BI. Now the data that arrives in the PSA is in a weird format; it shows something like -0.PR.09-A.
I have tried changing the field settings in the DataSource to internal and external, and I have also tried mapping these date fields to text fields, without luck. All deliver the same format.
I have also tried using conversion routines like CONVERSION_EXIT_IDATE_INPUT to change the format. It also delivers the same old result.
If any of you have suggestions, or have experienced such problems, please share your experience with me.
Thanks in advance.
Regards
Varada
Thanks for all your replies. I have only seen solutions that create a view in the database; I want a solution that can be done in BI. I would appreciate it if some of you have ideas on this.
The issue again in detail:
I am facing an issue with date fields from Oracle data. The data that is sent from Oracle is in the format -0.AR.04-M. I am able to convert this date in BI with a conversion routine into the format 04-MAR-0.
The problem is that I am getting data of length 10 (output format) in the format -0.AR.04-M, where the month is not numeric. Since it is text, it takes one character more of spacing.
I have tried converting in different ways and increased the length in BI; the result is the same. I am wondering if we can change the date format in the database.
I am puzzled by this date format. I have checked date fields coming from other Oracle DB connections in BI; they arrive in the format 20.081.031, which BI can convert. Only the system I am working with is creating a problem.
Regards
Varada -
Problem with date format dd/mm/yyyy. But I need to convert yyyy-mm-dd.
Dear friends,
I have a problem with a date format. I receive dates in the format dd/mm/yyyy, but I can upload to MySQL only in the format yyyy-mm-dd.
How should I handle this situation? I have written the code lines below, but I have some problems with them. Please help me solve this.
String pattern = "yyyy-mm-dd";
SimpleDateFormat format = new SimpleDateFormat(pattern);
try {
    Date date = format.parse("2006-02-12");
    System.out.println(date);
} catch (ParseException e) {
    e.printStackTrace();
}
System.out.println(format.format(new Date()));
This output gives me Tue Apr 03 00:00:00 IST 2007.
But I need the date in the format yyyy-mm-dd.
regards,
maza
Thanks in advance.
Thanks, dear BalusC.
I tried with this:
rs.getString("DATA_SCAD1") // the source is from .xls files
String pattern = "yyyy-MM-dd";
SimpleDateFormat format = new SimpleDateFormat(pattern);
try {
    Date date = format.parse("DATA_SCAD1");
    System.out.println(date);
} catch (ParseException e) {
    e.printStackTrace();
}
System.out.println(format.format(new Date()));
This output gives me Tue Apr 03 00:00:00 IST 2007.
But I want to display the date in the format yyyy-mm-dd.
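For reference, a minimal sketch of the intended conversion: parse the incoming dd/MM/yyyy string with one formatter, then print it with a second yyyy-MM-dd formatter. Note the capital MM (month); lowercase mm means minutes, which is a likely bug in the snippets above. The class name and sample value here are made up for illustration:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateConverter {

    // Parse a dd/MM/yyyy string and re-format it as yyyy-MM-dd for MySQL.
    public static String toMySqlDate(String input) {
        SimpleDateFormat in = new SimpleDateFormat("dd/MM/yyyy");   // MM = month, mm = minute
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd");
        try {
            Date date = in.parse(input);
            return out.format(date);
        } catch (ParseException e) {
            throw new IllegalArgumentException("expected dd/MM/yyyy: " + input, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(toMySqlDate("12/02/2006")); // prints 2006-02-12
    }
}
```

Also note that parse(...) must receive the actual date string (e.g. the value read via rs.getString("DATA_SCAD1")), not the literal column name.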
regards,
maza -
A problem regarding set up of Oracle Lite 3.6.0.2.0 on Win 95, with JDK 1.1.8 and Java 2 SDK ( Ver 1.3 Beta)
After installing Oracle Lite 3.6.0.2.0 on a laptop (with the Win 95 OS), when I run Oracle Lite Designer from the Start menu, I receive the following error message:
====================================
Invalid class name 'FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'
usage: java [-options] class
where options include:
-help print out this message
-version print out the build version
-v -verbose turn on verbose mode
-debug enable remote JAVA debugging
-noasyncgc don't allow asynchronous garbage collection
-verbosegc print a message when garbage collection occurs
-noclassgc disable class garbage collection
-ss<number> set the maximum native stack size for any thread
-oss<number> set the maximum Java stack size for any thread
-ms<number> set the initial Java heap size
-mx<number> set the maximum Java heap size
-classpath <directories separated by semicolons>
list directories in which to look for classes
-prof[:<file>] output profiling data to .\java.prof or .\<file>
-verify verify all classes when read in
-verifyremote verify classes read in over the network [default]
-noverify do not verify any class
-nojit disable JIT compiler
Please make sure that JDK 1.1.4 (or greater) is installed in your machine and CLASSPATH is set properly. JAVA.EXE must be in the PATH.
====================================
My ORACLE_HOME is c:\program files\ora95_2 and Oracle Lite is installed under the ORACLE_HOME in LITE\DESIGNER directory.
The JDK version is 1.1.8, which is greater than 1.1.4, installed in c:\program files\jdk1.1.8. My PATH and CLASSPATH are set in AUTOEXEC.BAT as follows:
set CLASSPATH=c:\Progra~1\jdk1.1.8\lib\classes.zip;c:\progra~1\ora95_2\lite\classes\olite36.jar;c:\progra~1\ora95_2\lite\designer\oldes.jar;c:\progra~1\ora95_2\lite\designer\swingall.jar
PATH=C:\Progra~1\Ora95_2\bin;.;c:\Progra~1\jdk1.1.8\lib;c:\Progra~1\jdk1.1.8\bin;C:\Progra~1\Ora95_2\lite\Designer;C:\WIN95;C:\WIN95\COMMAND;C:\UTIL
And I can run JAVA.EXE from any directory at the command prompt.
With the Java 2 SDK (ver 1.3 beta) instead of JDK 1.1.8, I get a different error message:
=============================
java.lang.NoClassFoundError: 'FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'
Please make sure that JDK 1.1.4 (or greater) is installed in your machine and CLASSPATH is set properly. JAVA.EXE must be in the PATH.
==============================
The PATH and CLASSPATH were set accordingly, as with JDK 1.1.8, and there was no classes.zip in the CLASSPATH.
Also, the class/jar file looks weird or wrapped in the error message: 'FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'.
Another interesting thing I noticed: if I run oldes.exe from the installation CD, Oracle Lite Designer runs fine and without error, and I am able to modify tables in the database on my laptop as well.
Could someone shed some light on what I am doing wrong here?
Thanks for the help in advance.
Regards
Viral
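A note on the wrapped name in the errors above ('FILES\ORA95_2\LITE\DESIGNER\oldes.jar;C:\PROGRAM'): it looks like an unquoted long path containing the space in C:\Program Files is being split when java is invoked, which would also explain why launching from the CD (a path without spaces) works. This is an assumption, not a confirmed diagnosis; a sketch of a workaround, assuming the Designer shortcut or launch script builds its own CLASSPATH from ORACLE_HOME:

```bat
rem Use the 8.3 short form (or quote the path) so the space in
rem "Program Files" does not split the java command line:
set CLASSPATH=C:\Progra~1\Ora95_2\lite\designer\oldes.jar;%CLASSPATH%
```

The AUTOEXEC.BAT settings quoted above already use 8.3 short names, so the launcher's own path handling is the more likely suspect.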
On 07/20/2015 06:35 AM, Itzhak Hovav wrote:
> hi
> [snip]
> [root@p22 eclipse]# cat eclipse.ini
> -startup
> plugins/org.eclipse.equinox.launcher_1.3.0.v20120522-1813.jar
> --launcher.library
> plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.200.v20120913-144807
>
> -showsplash
> org.eclipse.platform
> --launcher.XXMaxPermSize
> 256m
> --launcher.defaultAction
> openFile
> -vmargs
> -Xms40m
> -Xmx512m
> [snip]
Try this: http://wiki.eclipse.org/Eclipse.ini. You should have read the sticky posts at the top of the forum for getting-started help. -
"evdre encountered a problem retrieving data from the webserver"
Hi
We are using a big report which takes a very long time to expand, and sometimes we get the error message "evdre encountered a problem retrieving data from the webserver" when expanding it, though not very often for most of our users. But we have one user who gets this error message almost every second time he expands the report. This user has a computer with the same capacity as the rest of us; could there be some setting on the computer or in the client installation that causes this problem?
We are using BPC 5.1 SP 5
/Fredrik
Hi,
This error usually occurs if there is a huge amount of data in the combination of dimensions.
Also, if the selection in your member set does not match the combination of data present in the back end, you may encounter a data retrieval error.
regards
sashank -
I have one problem with Data Guard. My archive log files are not applied.
I have one problem with Data Guard: my archive log files are not applied, although I have received all archive log files on my physical standby DB.
I have created a physical standby database on Oracle 10gR2 (Windows XP Professional). The primary database is on another computer.
In Enterprise Manager on the primary database it looks OK; I get the message "Data Guard status: Normal".
But, as I wrote above, the archive log files are not applied.
After I created the Physical Standby database, I have also done:
1. I connected to the Physical Standby database instance.
CONNECT SYS/SYS@luda AS SYSDBA
2. I started the Oracle instance at the Physical Standby database without mounting the database.
STARTUP NOMOUNT PFILE=C:\oracle\product\10.2.0\db_1\database\initluda.ora
3. I mounted the Physical Standby database:
ALTER DATABASE MOUNT STANDBY DATABASE
4. I started redo apply on Physical Standby database
alter database recover managed standby database disconnect from session
5. I switched the log files on Physical Standby database
alter system switch logfile
6. I verified the redo data was received and archived on Physical Standby database
select sequence#, first_time, next_time from v$archived_log order by sequence#
SEQUENCE# FIRST_TIME NEXT_TIME
3 2006-06-27 2006-06-27
4 2006-06-27 2006-06-27
5 2006-06-27 2006-06-27
6 2006-06-27 2006-06-27
7 2006-06-27 2006-06-27
8 2006-06-27 2006-06-27
7. I verified the archived redo log files were applied on Physical Standby database
select sequence#,applied from v$archived_log;
SEQUENCE# APP
4 NO
3 NO
5 NO
6 NO
7 NO
8 NO
8. on Physical Standby database
select * from v$archive_gap;
No rows
9. on Physical Standby database
SELECT MESSAGE FROM V$DATAGUARD_STATUS;
MESSAGE
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC3: Archival started
ARC4: Archival started
ARC5: Archival started
ARC6: Archival started
ARC7: Archival started
ARC8: Archival started
ARC9: Archival started
ARCa: Archival started
ARCb: Archival started
ARCc: Archival started
ARCd: Archival started
ARCe: Archival started
ARCf: Archival started
ARCg: Archival started
ARCh: Archival started
ARCi: Archival started
ARCj: Archival started
ARCk: Archival started
ARCl: Archival started
ARCm: Archival started
ARCn: Archival started
ARCo: Archival started
ARCp: Archival started
ARCq: Archival started
ARCr: Archival started
ARCs: Archival started
ARCt: Archival started
ARC0: Becoming the 'no FAL' ARCH
ARC0: Becoming the 'no SRL' ARCH
ARC1: Becoming the heartbeat ARCH
Attempt to start background Managed Standby Recovery process
MRP0: Background Managed Standby Recovery process started
Managed Standby Recovery not using Real Time Apply
MRP0: Background Media Recovery terminated with error 1110
MRP0: Background Media Recovery process shutdown
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[1]: Assigned to RFS process 2148
RFS[1]: Identified database type as 'physical standby'
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[2]: Assigned to RFS process 2384
RFS[2]: Identified database type as 'physical standby'
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[3]: Assigned to RFS process 3188
RFS[3]: Identified database type as 'physical standby'
Primary database is in MAXIMUM PERFORMANCE mode
Primary database is in MAXIMUM PERFORMANCE mode
RFS[3]: No standby redo logfiles created
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[4]: Assigned to RFS process 3168
RFS[4]: Identified database type as 'physical standby'
RFS[4]: No standby redo logfiles created
Primary database is in MAXIMUM PERFORMANCE mode
RFS[3]: No standby redo logfiles created
10. on Physical Standby database
SELECT PROCESS, STATUS, THREAD#, SEQUENCE#, BLOCK#, BLOCKS FROM V$MANAGED_STANDBY;
PROCESS STATUS THREAD# SEQUENCE# BLOCK# BLOCKS
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
RFS IDLE 0 0 0 0
RFS IDLE 0 0 0 0
RFS IDLE 1 9 13664 2
RFS IDLE 0 0 0 0
10) on Primary database:
select message from v$dataguard_status;
MESSAGE
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC3: Archival started
ARC4: Archival started
ARC5: Archival started
ARC6: Archival started
ARC7: Archival started
ARC8: Archival started
ARC9: Archival started
ARCa: Archival started
ARCb: Archival started
ARCc: Archival started
ARCd: Archival started
ARCe: Archival started
ARCf: Archival started
ARCg: Archival started
ARCh: Archival started
ARCi: Archival started
ARCj: Archival started
ARCk: Archival started
ARCl: Archival started
ARCm: Archival started
ARCn: Archival started
ARCo: Archival started
ARCp: Archival started
ARCq: Archival started
ARCr: Archival started
ARCs: Archival started
ARCt: Archival started
ARCm: Becoming the 'no FAL' ARCH
ARCm: Becoming the 'no SRL' ARCH
ARCd: Becoming the heartbeat ARCH
Error 1034 received logging on to the standby
Error 1034 received logging on to the standby
LGWR: Error 1034 creating archivelog file 'luda'
LNS: Failed to archive log 3 thread 1 sequence 7 (1034)
FAL[server, ARCh]: Error 1034 creating remote archivelog file 'luda'
11)on primary db
select name,sequence#,applied from v$archived_log;
NAME SEQUENCE# APP
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00003_0594204176.001 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00004_0594204176.001 4 NO
Luda 4 NO
Luda 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00005_0594204176.001 5 NO
Luda 5 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00006_0594204176.001 6 NO
Luda 6 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00007_0594204176.001 7 NO
Luda 7 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00008_0594204176.001 8 NO
Luda 8 NO
12) on standby db
select name,sequence#,applied from v$archived_log;
NAME SEQUENCE# APP
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00004_0594204176.001 4 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00003_0594204176.001 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00005_0594204176.001 5 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00006_0594204176.001 6 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00007_0594204176.001 7 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00008_0594204176.001 8 NO
13) my init.ora files
On standby db
irina.__db_cache_size=79691776
irina.__java_pool_size=4194304
irina.__large_pool_size=4194304
irina.__shared_pool_size=75497472
irina.__streams_pool_size=0
*.audit_file_dest='C:\oracle\product\10.2.0\admin\luda\adump'
*.background_dump_dest='C:\oracle\product\10.2.0\admin\luda\bdump'
*.compatible='10.2.0.1.0'
*.control_files='C:\oracle\product\10.2.0\oradata\luda\luda.ctl'
*.core_dump_dest='C:\oracle\product\10.2.0\admin\luda\cdump'
*.db_block_size=8192
*.db_domain=''
*.db_file_multiblock_read_count=16
*.db_file_name_convert='luda','irina'
*.db_name='irina'
*.db_unique_name='luda'
*.db_recovery_file_dest='C:\oracle\product\10.2.0\flash_recovery_area'
*.db_recovery_file_dest_size=2147483648
*.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
*.fal_client='luda'
*.fal_server='irina'
*.job_queue_processes=10
*.log_archive_config='DG_CONFIG=(irina,luda)'
*.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/luda/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=luda'
*.log_archive_dest_2='SERVICE=irina LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=irina'
*.log_archive_dest_state_1='ENABLE'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_max_processes=30
*.log_file_name_convert='C:/oracle/product/10.2.0/oradata/irina/','C:/oracle/product/10.2.0/oradata/luda/'
*.open_cursors=300
*.pga_aggregate_target=16777216
*.processes=150
*.remote_login_passwordfile='EXCLUSIVE'
*.sga_target=167772160
*.standby_file_management='AUTO'
*.undo_management='AUTO'
*.undo_tablespace='UNDOTBS1'
*.user_dump_dest='C:\oracle\product\10.2.0\admin\luda\udump'
On primary db
irina.__db_cache_size=79691776
irina.__java_pool_size=4194304
irina.__large_pool_size=4194304
irina.__shared_pool_size=75497472
irina.__streams_pool_size=0
*.audit_file_dest='C:\oracle\product\10.2.0/admin/irina/adump'
*.background_dump_dest='C:\oracle\product\10.2.0/admin/irina/bdump'
*.compatible='10.2.0.1.0'
*.control_files='C:\oracle\product\10.2.0\oradata\irina\control01.ctl','C:\oracle\product\10.2.0\oradata\irina\control02.ctl','C:\oracle\product\10.2.0\oradata\irina\control03.ctl'
*.core_dump_dest='C:\oracle\product\10.2.0/admin/irina/cdump'
*.db_block_size=8192
*.db_domain=''
*.db_file_multiblock_read_count=16
*.db_file_name_convert='luda','irina'
*.db_name='irina'
*.db_recovery_file_dest='C:\oracle\product\10.2.0/flash_recovery_area'
*.db_recovery_file_dest_size=2147483648
*.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
*.fal_client='irina'
*.fal_server='luda'
*.job_queue_processes=10
*.log_archive_config='DG_CONFIG=(irina,luda)'
*.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/irina/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=irina'
*.log_archive_dest_2='SERVICE=luda LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=luda'
*.log_archive_dest_state_1='ENABLE'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_max_processes=30
*.log_file_name_convert='C:/oracle/product/10.2.0/oradata/luda/','C:/oracle/product/10.2.0/oradata/irina/'
*.open_cursors=300
*.pga_aggregate_target=16777216
*.processes=150
*.remote_login_passwordfile='EXCLUSIVE'
*.sga_target=167772160
*.standby_file_management='AUTO'
*.undo_management='AUTO'
*.undo_tablespace='UNDOTBS1'
*.user_dump_dest='C:\oracle\product\10.2.0/admin/irina/udump'
Please help me!
Hi,
After several tries, my redo logs are applied now. I think in my case it had to do with tnsnames.ora. At the moment I have both databases in both tnsnames.ora files using the SID and not the SERVICE_NAME.
Now I want to use DGMGRL. Adding a configuration and a standby database works fine, but when I try to enable the configuration, DGMGRL gives no feedback and it looks like it is hanging. The log, though, says that it succeeded.
In another session, 'show configuration' results in the following, confirming that the enable succeeded:
DGMGRL> show configuration
Configuration
Name: avhtest
Enabled: YES
Protection Mode: MaxPerformance
Fast-Start Failover: DISABLED
Databases:
avhtest - Primary database
avhtestls53 - Physical standby database
Current status for "avhtest":
Warning: ORA-16610: command 'ENABLE CONFIGURATION' in progress
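A side note on the standby alert output earlier in the thread: the repeated line "RFS[...]: No standby redo logfiles created" suggests the standby has no standby redo log groups, which the LGWR ASYNC transport configured in log_archive_dest_2 normally expects. A sketch of adding them (the file names and the 50M size are placeholders; match your online redo log size and create one group more than the primary's online groups):

```sql
-- Run on the physical standby (and ideally also on the primary,
-- so a later switchover works):
ALTER DATABASE ADD STANDBY LOGFILE
  ('C:\oracle\product\10.2.0\oradata\luda\srl01.log') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE
  ('C:\oracle\product\10.2.0\oradata\luda\srl02.log') SIZE 50M;
```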
Is there anybody who has experienced the same problem and/or knows the solution to this?
With kind regards,
Martin Schaap -
Problem in data carrier "IDENTIFY FRONT END COMPUTER"
Hello all,
I am facing a problem with data carrier creation for the front-end computer. I am creating a data carrier of type PC and I want to set it as the default for all desktops. After DEFINE DATA CARRIER TYPE SERVER, FRONT END, selecting PC, and IDENTIFY FRONT END COMPUTER, it shows the following error:
No entries found that match selection criteria
Message no. SV004
Diagnosis
No entries were found when importing data from the data base.
Procedure
If you have specified selection conditions, start the transaction again with conditions that are less restricted.
You can make new entries independently of this. To do this, select the function New entries in menu Edit.
Help me.
Hi Vivek,
In the SAP standard these entries are maintained only by the system itself, and they cannot be entered manually in transaction DC20. I can only provide you with the following information on the possible settings for the variable HOSTNAME and the DEFAULT front-end entry.
1. Default entry
Call transaction DC20. Double-click the node 'Define data carrier type "server, front end"'. Select the respective data carrier type (e.g. 'PC') and double-click the node 'Identify frontend computers'. Click the button 'Default entry' in the top right corner of the table control.
Note that it is only possible to create one default entry!
2. System variable HOSTNAME
If you run Windows XP, go to Start -> Settings -> Control Panel. Double-click 'System'. Click the 'Advanced' tab and then the 'Environment variables' button. Create the system variable 'hostname' and assign it a value. Save the new variable.
As there is already a DEFAULT entry for front-end type 'PC', it cannot be maintained for another front-end type.
Best regards,
Christoph -
RFC PROBLEM:NO DATA AVAILABLE IN ST03N
Hi experts,
On t-code ST03, when I enter the last minute's load and confirm, I get the error "RFC Problem: No data available".
How should I go ahead?
From
Deepak Keesara.
Hi Deepak,
First, check that the job SAP_COLLECTOR_FOR_PERFMONITOR is running, via t-code SM37.
This job runs in client 000 under user DDIC with a period of one hour.
If not, schedule it in t-code SM36: log in as user DDIC to client 000, execute t-code SM36, give the job name SAP_COLLECTOR_FOR_PERFMONITOR, specify the ABAP program RSCOLL00, set a periodic value of one hour, then save and release.
Regards
Ashok
Edited by: Ashok Dalai on Aug 18, 2008 1:20 PM -
Screen refresh problem where data is entered and the screen doesn't refresh
Many people in the company are experiencing the odd screen refresh problem where data is entered and the screen doesn't refresh to show the updated result in corresponding cell formulas.
Microsoft issued a hotfix to fix the issue for Excel 2003 (KB978908). Display memory tends to pick up data from hidden sheets and paste it into the active screen. There is no impact on the file. This occurs when protecting and unprotecting worksheets in VBA. I also suspect that enabling and disabling screen refresh contributes to this problem. In any case there is a fix, albeit with the following disclaimer:
As of yet I have not been able to find a fix for this for Office 2010 and 2013. Any suggestions would be great.
Hi,
Based on your description, Excel does not show the text strings as you type them. It may be caused by the cell format: if the cell format is set to ";;;" as a custom format, it will not display the text that you typed.
The issue may also be caused by a third-party input method; there are some compatibility issues between them.
If the issue still exists, please try the following methods:
Verify/install the latest updates
Repair your Office program
Regards,
George Zhao
TechNet Community Support -
Hi to everyone,
I have a problem with data acquisition in LV 7.1.
I made a transition from Traditional NI-DAQ to NI-DAQmx in my LabVIEW application.
The problem I have is that when I acquire data with Traditional NI-DAQ (without writing anywhere, just reading), there is no scan backlog. But when I acquire data in an application based on DAQmx, the scan backlog indicator shows numbers from 20 to 50 for about 6 minutes, and then that number increases quite quickly until I get an error (unable to acquire data; the data was overwritten).
The acquisition settings are the same in both cases. When I acquire with DAQmx I use global channels. Is the reason for this phenomenon in the data processing of global channels? It seems strange that it flows quite smoothly for about 6 minutes and then gets stuck.
Best regards,
Ero
If you have an old DAQ unit it may not be compatible with DAQmx. Which DAQ unit do you have? I think NI has a list showing which DAQ driver you can use with your card.
Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
(Sorry no Labview "brag list" so far) -
Query Regarding Data transfer ( Pls Respond Quickly)
Hi,
I need to download the data to an XML file. I have hard-coded the file path in the report program. This report should be scheduled for every night.
But the problem is that since I have hard-coded the path, whenever the program is deployed to the production system there will be a problem with the file path.
Can anybody throw some light on how to avoid hard-coding the file path?
Reward points are assured.
Best Regards
Bhagat.
Hey Kishan,
In that program the file path was hard-coded. Can you let me know the alternative way, so that I can avoid hard-coding?
The thing is that the report should be scheduled for every night.
Best Regards
Bhagat.
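For the hard-coded-path question above, one common ABAP approach (a sketch; the logical file name Z_XML_EXPORT is a made-up placeholder) is to maintain a logical file name in transaction FILE per system and resolve it at runtime with the function module FILE_GET_NAME, so the report carries no physical path at all:

```abap
DATA: lv_path TYPE filename-fileextern.

* Resolve the logical file name (maintained in transaction FILE)
* into the physical path valid for the current system.
CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'Z_XML_EXPORT'
  IMPORTING
    file_name        = lv_path
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.
IF sy-subrc <> 0.
  MESSAGE 'Logical file name not maintained' TYPE 'E'.
ENDIF.
```

Since the mapping lives in customizing, development and production can point the same logical name at different physical directories without changing the report.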