Problem with data in cube using InfoSource
Hi,
I have a problem loading data using InfoSource 2LIS_11_VAITM. The cube is not showing any data even though I deleted and refilled the setup tables in R/3. RSA3 shows 1,000 records, but the cube shows only 0 values.
I created transformations and a DTP, and the data arrives in the PSA.
Can anyone send me the complete LO extraction steps in detail for BI 7, covering both the R/3 side and the BW side up to the cube? I need to use multiple LO extractors.
Thanks
Hi Reddy,
Please check that your compounded objects also hold the proper mapping in the transformation.
Please cross-verify your existing data, considering the combination of both the compounded objects and your 0PLAN_NODE. I suspect this is what is causing the issues in your DSO.
Check the data once with the combinations.
thanks
Hope this helps
Similar Messages
-
Problem with data in Cube 0IC_C03 in BEx Query
Hi,
(1) I am loading data into cube 0IC_C03 from DataSources 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM. I need one more field, Manufacturing date, in my cube.
(2) That's why I enhanced DataSource 2LIS_03_BF and populated HSDAT (Manufacturing date, from the MSEG table). The date is visible in the DataSource.
(3) I loaded data into the cube from 03_BX and collapsed, then loaded data from 03_BF and collapsed. In the cube data display (LISTCUBE) I can see the Manufacturing date and the key figure values (Received stock qty: transit and Issued stock qty: transit) in one row.
eg.-
Plant Manufac.date Issued Qty Received qty
5102 01.07.2007 2000 3000
But in my BEx query, I see:
Plant Manufac.date Stock:in Transit
5102 01.07.2007 #
5102 # 1000
Ideally, I want the query to show:
Plant Manufac.date Stock:in Transit
5102 01.07.2007 1000
Should I also enhance the 2LIS_03_BX DataSource with Manufacturing date?
Please suggest; it is urgent.
Thanks
Saurabh
That's great, Shalini.
If it is helpful, please assign points.
If you need any information on that please mail me [email protected]
Regards,
RK Ghattamaneni. -
Problem with Retrieving objects and using info...
OK, here's my problem.
I have implemented a Queue using a linked list in Java. I then created a new object (in its own class, called Data) with two values, amount and price. I want the queue to store these two values in one node, so I pass the values to it like this:
Data A = new Data();
A.price = Integer.parseInt(jTextField.getText());
A.amount = Integer.parseInt(jTextField.getText());
Now I pass it to the Queue like this:
Queue B = new Queue ();
B.enqueue(A);
That seems to work fine; the problem is that I need to work with the numbers in the Queue: update them, put them back, and retrieve them.
So I create a new Data object and try to dequeue like this:
Data C = new Data();
C = B.dequeue(); //So i can work with the two values
but this doesn't work; it gives me an error saying it needs an object.
Am I doing it right?
Anyone got any better ideas on how to do this, i.e. passing and retrieving two values in one node?
Please help; I'm a newbie in Java and can't find any tutorials on the internet.
Why don't you use a Vector?
Vector queue = new Vector();
Data a = new Data();
a.price = Integer.parseInt(jTextField.getText());
a.amount = Integer.parseInt(jTextField.getText());
queue.addElement(a);
Data c = new Data();
c.price = Intger.parseInt(jTextField.getText());
c.amount = Integer.ParseInt(jTextField.getText());
queue.addElement(c);
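The reply above uses the raw, pre-generics Vector API. The original dequeue error is most likely just a missing cast: an untyped queue returns Object, so `Data C = (Data) B.dequeue();` would be needed. A minimal sketch of the same idea with a typed queue (the Data class here is a hypothetical stand-in for the poster's own), where no cast is needed at all:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class DataQueueDemo {
    // Hypothetical holder for the two values read from the form fields.
    static class Data {
        int price;
        int amount;
        Data(int price, int amount) { this.price = price; this.amount = amount; }
    }

    public static void main(String[] args) {
        // A typed queue: dequeue returns Data directly, no cast required.
        Queue<Data> queue = new ArrayDeque<>();
        queue.add(new Data(100, 5));

        Data head = queue.remove(); // dequeue the node
        head.amount += 1;           // work with / update the two values
        queue.add(head);            // put the record back

        System.out.println(queue.peek().amount); // prints 6
    }
}
```

The same pattern applies to a hand-written linked-list Queue: either make it generic (`Queue<Data>`) or cast the returned Object back to Data.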
Problem with data integration when using KCLJ
Hello,
For a project, I had to integrate a new field using transaction KCLJ.
For this I extended the DDIC structure of the sender structure, and after that I updated the corresponding transfer rules.
When I execute transaction KCLJ I have no error, and table BUT000 is updated with the data of the flat file.
The problem is that it also erases six BUT000 fields that are not in the sender structure and therefore have no transfer rules.
Could you help me?
Hi,
Please read this.
External Data Transfer
These activities are not relevant if you use a CRM/EBP system.
In the following activities you make definitions for transfer of business partner data or business partner relationship data from an external system to a SAP System.
Data transfer takes place in several stages:
1. Relevant data is read from the external system and placed in a sequential file by the data selection program. The data structure of the file is defined in the sender structure.
This procedure takes place outside of the SAP environment and is not supported by SAP programs. For this reason, data changes can be made at this point by the data selection program.
2. The sequential file is stored on an application server or a presentation server.
3. The SAP transfer program reads data from the file and places this in the sender structure. This does not change the data. This step is carried out internally by the system and does not affect the user.
4. Following transfer rules that have to be defined, the transfer program takes the data from the sender structure and places it in the receiver structure. During this step you can change or convert data.
The receiver structure is firmly defined in the SAP system. Assignment of the sender structure to the transfer program, and of the transfer program to the receiver structure is made using a defined transfer category.
5. The data records in the receiver structure are processed one after the other and, if they do not contain any errors, they are saved in the database.
Before you transfer external data for the first time, make the following determinations:
The structure of the data in the external system may not match the structure expected by the SAP system. You may have to supplement data.
There are two ways in which you can adapt the structure:
You make the required conversions and enhancements within the data selection program prior to beginning the transfer to the SAP system. This will be the most practical solution in most cases since you have the most freedom at this point.
You do the conversion using a specially developed transfer program and transfer rules.
You then define the fields of the sender structure. The system offers you the option of automatically generating a sender structure that is compatible with the receiver structure.
You define transfer rules to create rules according to which the fields of the sender structure are linked with those of the receiver structure.
You now carry out the transfer.
SAP Enhancements for External Data Transfer
The following SAP enhancements are offered in the following areas of External Data Transfer:
Four customer exits exist for the data transfer and for the conversion from IDoc segments. The exits are contained in the enhancement KKCD0001. As soon as the customer exits are activated, they are carried out for all sender structures or segments. The first two customer exits require minimal coding once they are activated.
The sender structure concept is used when loading data into the SAP system; the segment concept is used in the context of distribution between SAP systems. In both cases it is a matter of a record of data to be transferred or converted.
It is advisable to code a CASE statement within the customer exit, in which different coding is accessed depending on the sender structure (REPID) or segment. For the conversion from IDoc segments, the parameter REPID contains the name of the segment; the parameter GRPID is not filled in that case. You should have a WHEN OTHERS branch within the CASE statement, in which SENDER_SET is assigned to SENDER_SET_NEW, or RECEIVER_SET to RECEIVER_SET_NEW. Otherwise the return code will have its initial value. You can view a possible solution in the code sample.
The first Customer Exit is accessed before the summarizing or conversion. It is called up as follows:
CALL CUSTOMER-FUNCTION '001'
  EXPORTING
    GRPID      = GRPID      " Origin
    REPID      = REPID      " Sender program
    SENDER_SET = SENDER_SET " Sender record
  IMPORTING
    SENDER_SET_NEW = SENDER_SET " Modified sender record
    SUBRC          = SUBRC.     " Return code
If the variable SUBRC is initial, the modified record is processed further; otherwise it is skipped. The parameter SENDER_SET_NEW must be filled in the customer exit, as only this field, and not SENDER_SET, is processed further. In particular, this means that you must assign the value of SENDER_SET to SENDER_SET_NEW even for records that receive no special handling.
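The dispatch-plus-pass-through rule described above (handle known sender structures explicitly, and in the default branch copy the input record to the *_NEW parameter unchanged) can be sketched as follows. This is an illustration in Java, not ABAP, and the structure name ZSND1 and the transformation are hypothetical; the real exit lives in enhancement KKCD0001:

```java
import java.util.Locale;

public class SenderExitDemo {
    // Sketch of the exit contract: the return value plays the role of
    // SENDER_SET_NEW, which must always be filled, even for records
    // that receive no special handling.
    static String transform(String repid, String senderSet) {
        switch (repid) {
            case "ZSND1":
                // Structure-specific handling, e.g. normalize to upper case.
                return senderSet.toUpperCase(Locale.ROOT);
            default:
                // The WHEN OTHERS branch: pass the record through unchanged,
                // so records from other sender structures are not lost.
                return senderSet;
        }
    }

    public static void main(String[] args) {
        System.out.println(transform("ZSND1", "abc")); // handled: ABC
        System.out.println(transform("ZSND2", "abc")); // passed through: abc
    }
}
```

The default branch is the part the documentation warns about: omitting it silently drops every record the exit does not explicitly handle.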
The second Customer Exit is accessed after the summarization and before the update:
CALL CUSTOMER-FUNCTION '002'
  EXPORTING
    REPID        = REPID        " Sender program
    GRPID        = GRPID        " Origin
    RECEIVER_SET = RECEIVER_SET " Summarized record
  IMPORTING
    RECEIVER_SET_NEW = RECEIVER_SET " Modified summarized record
    SUBRC            = SUBRC.       " Return code
The modified record is only updated if the variable SUBRC is initial.
The parameter RECEIVER_SET_NEW must be filled in the customer exit, since only this field, and not RECEIVER_SET, is updated.
The third Customer Exit is used for replacing variables. It is called up when you load the transfer rules.
CALL CUSTOMER-FUNCTION '003'
  EXPORTING
    REPID = REPID
    GRPID = GRPID
    VARIA = VARIA
    RFELD = RFELD
    VARTP = VARTP
  CHANGING
    KEYID = KEYID
  EXCEPTIONS
    VARIABLE_ERROR = 1.
The parameters REPID and GRPID are supplied with the sender structure and the origin. The variable name is in the field VARIA, and the name of the receiver field is in the parameter RFELD. Field VARTP contains the variable type; valid types are fixed values of the domain KCD_VARTYP. You transfer the variable values in the parameter KEYID. If an error occurs, raise the exception VARIABLE_ERROR.
The fourth customer exit is required in EC-EIS only. It is called up after the summarization and before the determination of key figures. It is a necessary enhancement to the second customer exit, because changes to the keys must be considered before the database is checked for existing records with those keys.
The function is called up as follows:
CALL CUSTOMER-FUNCTION '004'
  CHANGING
    RECEIVER_SET = R
    SUBRC        = UE_SUBRC.
The parameter RECEIVER_SET contains the receiver record to be changed; it is a changing parameter. If the exit is not used, no changes need to be made to the function module.
The user exits can be found in module pool SAPFKCIM. If you want to use the customer exits, create a project and activate them with transaction CMOD; the enhancement you must use is KKCD0001.
Note when programming customer exits that they also run when corrected data records are imported into the data pool during postprocessing, in both test and real runs.
I will provide some pointers soon. Give me some time.
Hope this will help.
Please reward suitable points.
Regards
- Atul -
Problem with date format from Oracle DB
Hi,
I am facing a problem with date fields from Oracle DB sources. For the field in the DB table, the database base type is DATE and the DDIC type is DATS.
I mapped the date fields to date characteristics in BI. The data that arrives in the PSA is in a weird format; it shows up like '-0.PR.09-A'.
I have tried changing the field settings in the DataSource between internal and external, and I have also tried mapping these date fields to text fields, without luck. Everything delivers the same format.
I have also tried using conversion routines, such as CONVERSION_EXIT_IDATE_INPUT, to change the format; they deliver the same result.
If any of you have suggestions or have experienced such problems, please share your experience with me.
Thanks in advance.
Regards
Varada
Thanks for all your replies. So far I only see solutions based on creating a view in the database; I want a solution done in BI. I would appreciate it if any of you have an idea.
The issue again in detail
I am facing an issue with date fields from Oracle data. The data sent from Oracle arrives in the format '-0.AR.04-M'. I am able to convert this date in BI with a conversion routine into the format '04-MAR-0'.
The problem is that I am getting data of length 10 (output format) in the format '-0.AR.04-M', where the month is not numeric. Since it is text, it takes one more character of space.
I have tried converting in different ways and increased the length in BI; the result is the same. I wonder whether we can change the date format in the database.
This date format puzzles me. I have checked date fields from other Oracle DB connections in BI; they deliver data in the format '20.081.031', which can be converted in BI. Only the system I am working with is causing a problem.
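For illustration only (a real BW conversion routine would be ABAP): once a feed delivers a digits-with-separators value like the '20.081.031' mentioned above, stripping the non-digit characters yields a DATS-style yyyymmdd string that parses cleanly. A sketch of that normalization, with the sample value taken from the post:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DatsNormalizeDemo {
    // Strip separator characters, leaving the raw yyyymmdd digits,
    // then parse as a basic ISO (DATS-style) date.
    static LocalDate fromDottedDats(String raw) {
        String digits = raw.replaceAll("\\D", ""); // "20.081.031" -> "20081031"
        return LocalDate.parse(digits, DateTimeFormatter.BASIC_ISO_DATE);
    }

    public static void main(String[] args) {
        System.out.println(fromDottedDats("20.081.031")); // 2008-10-31
    }
}
```

The truly garbled '-0.AR.04-M' values cannot be recovered this way, since the month arrives as text; that case needs fixing at the source or in a database view, as the other replies suggest.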
Regards
Varada -
Hi,
I have a problem with a data load. For the 12th month I don't have any actual data (related to finance) in R/3, but when I look in the cube there is data. I don't know how the data is coming from R/3 without any actuals. How can I check whether that data came from R/3 or somewhere else, and if I find its source, how and from where should I delete that data? Please help me solve this issue.
Hi,
If there is data in the InfoCube, there must be a request. Use the 'Contents of Data Target' option in Manage, select any one data record that is in scope, and find its request number. Using this, find the same request on the Requests tab of Manage InfoCube. Click the monitor symbol on that request, which takes you to the monitor screen, where you can find the source system, the InfoPackage, and the DataSource used in the upload.
If there is no ALE scenario in your system landscape, then there is no possibility of data appearing in BW that is missing in R/3, so check the data flow and availability once again.
Reg,
Vish. -
I'm having a problem with dates when I send my Numbers document to Excel. The dates are all out, and the recipients have to cut and paste individual entries onto their spreadsheet. Any idea how I can prevent this?
I'm using Lion on an MBP, and Numbers is the latest version.
Can you give more details about what is wrong with your dates?
M…oSoft products aren't allowed on my machines but I use LibreOffice which is a clone of Office.
When I export from Numbers to Excel and open the result with LibreOffice, the dates are correctly treated.
To be precise, dates after 01/01/1904 are treated correctly; dates before 01/01/1904 are exported as strings, but since this is flagged during the export process, it's not surprising.
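The 01/01/1904 cut-off reflects the spreadsheet's internal epoch: Numbers, and Excel's optional "1904 date system", count days from 1 January 1904 (Excel's default system is 1900-based instead). A sketch, purely as an illustration of the arithmetic and not of anything either application exposes, of resolving a serial day number under the 1904 epoch:

```java
import java.time.LocalDate;

public class EpochDemo {
    // In the 1904 date system, serial 0 is 1 January 1904.
    static LocalDate from1904Serial(long serial) {
        return LocalDate.of(1904, 1, 1).plusDays(serial);
    }

    public static void main(String[] args) {
        System.out.println(from1904Serial(0));   // 1904-01-01
        System.out.println(from1904Serial(366)); // 1905-01-01 (1904 is a leap year)
    }
}
```

Dates before the epoch would need negative serials, which is consistent with pre-1904 dates falling back to plain strings in the export.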
Yvan KOENIG (VALLAURIS, France) mardi 3 janvier 2012
iMac 21”5, i7, 2.8 GHz, 12 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.2
My iDisk is : http://public.me.com/koenigyvan
Please : Search for questions similar to your own before submitting them to the community
For iWork's applications dedicated to iOS, go to :
https://discussions.apple.com/community/app_store/iwork_for_ios -
Problem with date format in prompt in Web Intelligence
BO XI R2 with SP5, installed on Windows 2003 with Russian support.
Inside BO all labels and buttons use Russian, but when I invoke a web report and the prompt appears, there is a problem with the date format.
It looks like a Korean-style date format, 'jj.nn.aaa H:mm:ss'. I checked the system date settings in Windows; everything is right.
What do I have to do?
Where can I change the date format for BO?
GK, try this...
decode(instr(packagename.functionname(param1, param2), '2400'), 0,
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'),
                               'Month dd, yyyy "at" hh24mi'),
                       'mm/dd/yyyy hh24mi'),
               'mm/dd/yyyy hh24mi'),
       to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'),
                               'Month dd, yyyy "at" "2400"') + 1,
                       'mm/dd/yyyy "0000"'),
               'mm/dd/yyyy "0000"'))
- Marilyn
Problem with date format dd/mm/yyyy; I need to convert it to yyyy-mm-dd.
Dear friends,
I have a problem with the date format. I receive the date in the format dd/mm/yyyy, but I can upload to MySQL only in the format yyyy-mm-dd.
How should I handle this situation? I've written the code lines below, but I have some problem with them; please help me solve it.
String pattern = "yyyy-mm-dd";
SimpleDateFormat format = new SimpleDateFormat(pattern);
try {
Date date = format.parse("2006-02-12");
System.out.println(date);
} catch (ParseException e) {
e.printStackTrace();
System.out.println(format.format(new Date()));
This output gives me: Tue Apr 03 00:00:00 IST 2007
But I need the date in the format yyyy-mm-dd.
regards,
maza
Thanks in advance.
Thanks, dear BalusC,
I tried with this,
rs.getString("DATA_SCAD1") // where the source is from .xls files
String pattern = "yyyy-MM-dd";
SimpleDateFormat format = new SimpleDateFormat(pattern);
try {
Date date = format.parse("DATA_SCAD1");
System.out.println(date);
} catch (ParseException e) {
e.printStackTrace();
System.out.println(format.format(new Date()));
This output gives me: Tue Apr 03 00:00:00 IST 2007
But I want to display the date in the format yyyy-mm-dd.
regards,
maza
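The conversion this thread is after boils down to parsing with one pattern and formatting with another, minding the case of the pattern letters (`MM` is month, `mm` is minutes) and passing the actual value rather than the literal string "DATA_SCAD1" to parse(). A minimal sketch, with a hypothetical sample value:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateReformatDemo {
    // Convert an incoming dd/MM/yyyy string to the yyyy-MM-dd form MySQL expects.
    static String toMySqlDate(String input) throws ParseException {
        SimpleDateFormat in  = new SimpleDateFormat("dd/MM/yyyy"); // source pattern
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd"); // target pattern
        Date parsed = in.parse(input); // parse the value itself, not a column name
        return out.format(parsed);
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(toMySqlDate("12/02/2006")); // 2006-02-12
    }
}
```

In the poster's code the value would come from `rs.getString("DATA_SCAD1")` instead of a literal.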
Tp ended with error code 0247 - addtobuffer has problems with data- and/or
Hello Experts,
If you give some idea, it will be greatly appreciated.
This transport issue started after a power outage in which the SAP system went through a hard shutdown.
Then we brought the system back up. Before that, we did not have this transport issue.
Our TMS landscape is:
DEV (SED) - QA (SEQ) - PRD (SEP)
DEV hosts the TMS domain controller.
FYI:
At the OS level, when we run an scp command as the root user, it works fine for any TR.
In STMS, while adding the TR to SEQ (the QA system), we get the following error.
Error:
Transport control program tp ended with error code 0247
Message no. XT200
Diagnosis
An error occurred when executing a tp command.
Command: ADDTOBUFFER SEDK906339 SEQ client010 pf=/us
Return code: 0247
Error text: addtobuffer has problems with data- and/or
Request: SEDK906339
System Response
The function terminates.
Procedure
Correct the error and execute the command again if necessary.
This is tp version 372.04.71 (release 700, unicode enabled)
Addtobuffer failed for SEDK906339.
Neither datafile nor cofile exist (cofile may also be corrupted).
standard output from tp and from tools called by tp:
tp returncode summary:
TOOLS: Highest return code of single steps was: 0
ERRORS: Highest tp internal error was: 0247
When we do scp using SM69,
SEDADM@DEVSYS:/usr/sap/trans/cofiles/K906339.SED SEQADM@QASYS:/usr/sap/trans/cofiles/.
it throws the error like below,
Host key verification failed.
External program terminated with exit code 1
Thanks
Praba -
I have one problem with Data Guard. My archive log files are not applied.
I have one problem with Data Guard: my archive log files are not applied, although I have received all archive log files on my physical standby DB.
I have created a Physical Standby database on Oracle 10gR2 (Windows XP professional). Primary database is on another computer.
In Enterprise Manager on the primary database it looks OK; I get the message 'Data Guard status: Normal'.
But as I wrote above, the archive log files are not applied.
After I created the Physical Standby database, I have also done:
1. I connected to the Physical Standby database instance.
CONNECT SYS/SYS@luda AS SYSDBA
2. I started the Oracle instance at the Physical Standby database without mounting the database.
STARTUP NOMOUNT PFILE=C:\oracle\product\10.2.0\db_1\database\initluda.ora
3. I mounted the Physical Standby database:
ALTER DATABASE MOUNT STANDBY DATABASE
4. I started redo apply on Physical Standby database
alter database recover managed standby database disconnect from session
5. I switched the log files on Physical Standby database
alter system switch logfile
6. I verified the redo data was received and archived on Physical Standby database
select sequence#, first_time, next_time from v$archived_log order by sequence#
SEQUENCE# FIRST_TIME NEXT_TIME
3 2006-06-27 2006-06-27
4 2006-06-27 2006-06-27
5 2006-06-27 2006-06-27
6 2006-06-27 2006-06-27
7 2006-06-27 2006-06-27
8 2006-06-27 2006-06-27
7. I verified the archived redo log files were applied on Physical Standby database
select sequence#,applied from v$archived_log;
SEQUENCE# APP
4 NO
3 NO
5 NO
6 NO
7 NO
8 NO
8. on Physical Standby database
select * from v$archive_gap;
No rows
9. on Physical Standby database
SELECT MESSAGE FROM V$DATAGUARD_STATUS;
MESSAGE
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC3: Archival started
ARC4: Archival started
ARC5: Archival started
ARC6: Archival started
ARC7: Archival started
ARC8: Archival started
ARC9: Archival started
ARCa: Archival started
ARCb: Archival started
ARCc: Archival started
ARCd: Archival started
ARCe: Archival started
ARCf: Archival started
ARCg: Archival started
ARCh: Archival started
ARCi: Archival started
ARCj: Archival started
ARCk: Archival started
ARCl: Archival started
ARCm: Archival started
ARCn: Archival started
ARCo: Archival started
ARCp: Archival started
ARCq: Archival started
ARCr: Archival started
ARCs: Archival started
ARCt: Archival started
ARC0: Becoming the 'no FAL' ARCH
ARC0: Becoming the 'no SRL' ARCH
ARC1: Becoming the heartbeat ARCH
Attempt to start background Managed Standby Recovery process
MRP0: Background Managed Standby Recovery process started
Managed Standby Recovery not using Real Time Apply
MRP0: Background Media Recovery terminated with error 1110
MRP0: Background Media Recovery process shutdown
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[1]: Assigned to RFS process 2148
RFS[1]: Identified database type as 'physical standby'
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[2]: Assigned to RFS process 2384
RFS[2]: Identified database type as 'physical standby'
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[3]: Assigned to RFS process 3188
RFS[3]: Identified database type as 'physical standby'
Primary database is in MAXIMUM PERFORMANCE mode
Primary database is in MAXIMUM PERFORMANCE mode
RFS[3]: No standby redo logfiles created
Redo Shipping Client Connected as PUBLIC
-- Connected User is Valid
RFS[4]: Assigned to RFS process 3168
RFS[4]: Identified database type as 'physical standby'
RFS[4]: No standby redo logfiles created
Primary database is in MAXIMUM PERFORMANCE mode
RFS[3]: No standby redo logfiles created
10. on Physical Standby database
SELECT PROCESS, STATUS, THREAD#, SEQUENCE#, BLOCK#, BLOCKS FROM V$MANAGED_STANDBY;
PROCESS STATUS THREAD# SEQUENCE# BLOCK# BLOCKS
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
ARCH CONNECTED 0 0 0 0
RFS IDLE 0 0 0 0
RFS IDLE 0 0 0 0
RFS IDLE 1 9 13664 2
RFS IDLE 0 0 0 0
10) on Primary database:
select message from v$dataguard_status;
MESSAGE
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC3: Archival started
ARC4: Archival started
ARC5: Archival started
ARC6: Archival started
ARC7: Archival started
ARC8: Archival started
ARC9: Archival started
ARCa: Archival started
ARCb: Archival started
ARCc: Archival started
ARCd: Archival started
ARCe: Archival started
ARCf: Archival started
ARCg: Archival started
ARCh: Archival started
ARCi: Archival started
ARCj: Archival started
ARCk: Archival started
ARCl: Archival started
ARCm: Archival started
ARCn: Archival started
ARCo: Archival started
ARCp: Archival started
ARCq: Archival started
ARCr: Archival started
ARCs: Archival started
ARCt: Archival started
ARCm: Becoming the 'no FAL' ARCH
ARCm: Becoming the 'no SRL' ARCH
ARCd: Becoming the heartbeat ARCH
Error 1034 received logging on to the standby
Error 1034 received logging on to the standby
LGWR: Error 1034 creating archivelog file 'luda'
LNS: Failed to archive log 3 thread 1 sequence 7 (1034)
FAL[server, ARCh]: Error 1034 creating remote archivelog file 'luda'
11)on primary db
select name,sequence#,applied from v$archived_log;
NAME SEQUENCE# APP
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00003_0594204176.001 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00004_0594204176.001 4 NO
Luda 4 NO
Luda 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00005_0594204176.001 5 NO
Luda 5 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00006_0594204176.001 6 NO
Luda 6 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00007_0594204176.001 7 NO
Luda 7 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00008_0594204176.001 8 NO
Luda 8 NO
12) on standby db
select name,sequence#,applied from v$archived_log;
NAME SEQUENCE# APP
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00004_0594204176.001 4 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00003_0594204176.001 3 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00005_0594204176.001 5 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00006_0594204176.001 6 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00007_0594204176.001 7 NO
C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00008_0594204176.001 8 NO
13) my init.ora files
On standby db
irina.__db_cache_size=79691776
irina.__java_pool_size=4194304
irina.__large_pool_size=4194304
irina.__shared_pool_size=75497472
irina.__streams_pool_size=0
*.audit_file_dest='C:\oracle\product\10.2.0\admin\luda\adump'
*.background_dump_dest='C:\oracle\product\10.2.0\admin\luda\bdump'
*.compatible='10.2.0.1.0'
*.control_files='C:\oracle\product\10.2.0\oradata\luda\luda.ctl'
*.core_dump_dest='C:\oracle\product\10.2.0\admin\luda\cdump'
*.db_block_size=8192
*.db_domain=''
*.db_file_multiblock_read_count=16
*.db_file_name_convert='luda','irina'
*.db_name='irina'
*.db_unique_name='luda'
*.db_recovery_file_dest='C:\oracle\product\10.2.0\flash_recovery_area'
*.db_recovery_file_dest_size=2147483648
*.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
*.fal_client='luda'
*.fal_server='irina'
*.job_queue_processes=10
*.log_archive_config='DG_CONFIG=(irina,luda)'
*.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/luda/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=luda'
*.log_archive_dest_2='SERVICE=irina LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=irina'
*.log_archive_dest_state_1='ENABLE'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_max_processes=30
*.log_file_name_convert='C:/oracle/product/10.2.0/oradata/irina/','C:/oracle/product/10.2.0/oradata/luda/'
*.open_cursors=300
*.pga_aggregate_target=16777216
*.processes=150
*.remote_login_passwordfile='EXCLUSIVE'
*.sga_target=167772160
*.standby_file_management='AUTO'
*.undo_management='AUTO'
*.undo_tablespace='UNDOTBS1'
*.user_dump_dest='C:\oracle\product\10.2.0\admin\luda\udump'
On primary db
irina.__db_cache_size=79691776
irina.__java_pool_size=4194304
irina.__large_pool_size=4194304
irina.__shared_pool_size=75497472
irina.__streams_pool_size=0
*.audit_file_dest='C:\oracle\product\10.2.0/admin/irina/adump'
*.background_dump_dest='C:\oracle\product\10.2.0/admin/irina/bdump'
*.compatible='10.2.0.1.0'
*.control_files='C:\oracle\product\10.2.0\oradata\irina\control01.ctl','C:\oracle\product\10.2.0\oradata\irina\control02.ctl','C:\oracle\product\10.2.0\oradata\irina\control03.ctl'
*.core_dump_dest='C:\oracle\product\10.2.0/admin/irina/cdump'
*.db_block_size=8192
*.db_domain=''
*.db_file_multiblock_read_count=16
*.db_file_name_convert='luda','irina'
*.db_name='irina'
*.db_recovery_file_dest='C:\oracle\product\10.2.0/flash_recovery_area'
*.db_recovery_file_dest_size=2147483648
*.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
*.fal_client='irina'
*.fal_server='luda'
*.job_queue_processes=10
*.log_archive_config='DG_CONFIG=(irina,luda)'
*.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/irina/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=irina'
*.log_archive_dest_2='SERVICE=luda LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=luda'
*.log_archive_dest_state_1='ENABLE'
*.log_archive_dest_state_2='ENABLE'
*.log_archive_max_processes=30
*.log_file_name_convert='C:/oracle/product/10.2.0/oradata/luda/','C:/oracle/product/10.2.0/oradata/irina/'
*.open_cursors=300
*.pga_aggregate_target=16777216
*.processes=150
*.remote_login_passwordfile='EXCLUSIVE'
*.sga_target=167772160
*.standby_file_management='AUTO'
*.undo_management='AUTO'
*.undo_tablespace='UNDOTBS1'
*.user_dump_dest='C:\oracle\product\10.2.0/admin/irina/udump'
Please help me!
Hi,
After several tries, my redo logs are now applied. I think in my case it had to do with tnsnames.ora: at the moment I have both databases in both tnsnames.ora files using the SID and not the SERVICE_NAME.
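For anyone hitting the same thing, a tnsnames.ora entry keyed on the SID (rather than SERVICE_NAME) looks roughly like this. Host and port are placeholders, and luda is the standby SID from this post:

```
LUDA =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = standby-host)(PORT = 1521))
    (CONNECT_DATA =
      (SID = luda)
    )
  )
```

An equivalent entry for the primary (irina) would sit in both files, so that log shipping and FAL can resolve each database from either side.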
Now I want to use DGMGRL. Adding a configuration and a standby database works fine, but when I try to enable the configuration, DGMGRL gives no feedback and looks like it is hanging. The log, though, says that it succeeded.
In another session 'show configuration' results in the following, confirming that the enable succeeded.
DGMGRL> show configuration
Configuration
Name: avhtest
Enabled: YES
Protection Mode: MaxPerformance
Fast-Start Failover: DISABLED
Databases:
avhtest - Primary database
avhtestls53 - Physical standby database
Current status for "avhtest":
Warning: ORA-16610: command 'ENABLE CONFIGURATION' in progress
Is there anybody who has experienced the same problem and/or knows the solution to this?
With kind regards,
Martin Schaap -
Problems with a VBA Userform using Multipage (2) and DTPicker.
Hi
Problems with a VBA Userform using Multipage (2) and DTPicker (4)
On Page 1 I've got two DTPickers, one for the date and the second for the time.
Same thing on Page 2.
Problem:
Only one set will work: if I close the UserForm with the MultiPage on Page 2, only that set will work.
Likewise, if I close on Page 1, just the set on Page 1 will work.
Has anyone seen this problem? Any workaround you can think of would help.
I'm using Windows 7 with MS Office Pro 2003; same problem on Windows Vista with Excel 2003.
Cimjet
There are a number of issues relating to the way that date pickers are handled, but the most important is that their output is text. In order to get the values into Excel, you need to format the Excel columns as Date and Custom (time format) and convert the output written to the worksheet from text to date values.
Date pickers also display a few anomalies on multi-page forms, so you need a belt-and-braces approach. Personally, I would put the code that calls the form and enters the values in a standard module (as in the example) and take a belt-and-braces approach to maintaining the format.
I think you will find the example now works.
Revised Example
Graham Mayor - Word MVP
www.gmayor.com -
Hi to everyone,
I have a problem with data acquisition in LabVIEW 7.1.
I made a transition from Traditional NI-DAQ to NI-DAQmx in my LabVIEW application.
The problem is that when I acquire data with Traditional NI-DAQ (just reading, without writing anywhere), there is no scan backlog. But when the acquisition is based on DAQmx, the scan backlog indicator shows numbers from 20 to 50 for about 6 minutes, and then the number increases quite quickly until I get an error ("Unable to acquire data. The data was overwritten.").
Acquisition settings are the same in both cases. When I acquire with DAQmx I use global channels. Could the reason for this phenomenon lie in global-channel data processing? It seems strange that it runs quite smoothly for about 6 minutes and then gets stuck.
Best regards,
Ero
If you have an old DAQ unit, it may not be compatible with DAQmx. Which DAQ unit do you have? I think NI has a list showing which DAQ driver you can use with your card.
Besides which, my opinion is that Express VIs, like Carthage, must be destroyed (deleted).
(Sorry no Labview "brag list" so far) -
URGENT! JDev 10.1.2 problem with data control generated from session bean
I have a problem with a data control generated from a session bean that returns a collection of data transfer objects.
The DTOs seem to be correct: the session bean loads the data correctly and the objects are full of data. Displaying the DTO content on the console is OK.
When generating a data control from this session bean and associating the DTO included in the collection, only the first object level and one-to-one DTO objects are correctly set in the data control. Objects that represent a collection inside the DTO (a one-to-many foreign key) are set as a collection with an iterator, but the structure of the object is not set. I don't know how to associate this second level of collection with the DTO bean class to obtain the attribute definitions.
I created a test case with the HR schema, like the hrApp demo application in the tutorial, with the departments and employees tables. I get the same problem.
Is it a bug?
Is there a workaround to force the data control to understand the collection data structure?
Help is welcome! This is urgent!
We found the problem by assigning the child DTO bean class to the node representing the iterator in the XML file corresponding to the master DTO.
-
The iOS 8 update is causing innumerable problems with my iPad. The info from my calendar just disappeared, I am typing blind with the keyboard most of the time, and I cannot upload pics or paste to FB. When will this be fixed, and what can I do in the meantime?
Have you tried yesterday's update to iOS8.0.2?
iOS 8.0.2
This release contains improvements and bug fixes, including:
• Fixes an issue in iOS 8.0.1 that impacted cellular network connectivity and Touch ID on iPhone 6 and iPhone 6 Plus
• Fixes a bug so HealthKit apps can now be made available on the App Store
• Addresses an issue where 3rd party keyboards could become deselected when a user enters their passcode
• Fixes an issue that prevented some apps from accessing photos from the Photo Library
• Improves the reliability of the Reachability feature on iPhone 6 and iPhone 6 Plus
• Fixes an issue that could cause unexpected cellular data usage when receiving SMS/MMS messages
• Better support of Ask To Buy for Family Sharing for In-App Purchases
• Fixes an issue where ringtones were sometimes not restored from iCloud backups
• Fixes a bug that prevented uploading photos and videos from Safari
For information on the security content of this update, please visit this website:
http://support.apple.com/kb/HT1222