Missing Database Index on Production Server
Hi,
We are getting "Missing Database Index ZPP_EXCAVAT_DET5~0" on the production server. For this missing database index, SE14 shows "Exists in the database", while transaction DB02 reports
ZPP_EXCAVAT_DET5~0 - missing database index.
We followed SAP Note 1248769, but it is not working. Please suggest.
Regards
Priyanka M Jain.
Dear Vinod,
We tried the following steps:
1. Call transaction SE14.
2. Enter the name of the table and choose "Edit".
3. Choose "Indexes".
4. Select the index and choose "Choose (F2)".
5. Choose "Activate and adjust"; the system creates the index again and it is consistent.
6. Check the object log of this activation.
7. If an error occurs, eliminate the cause and reactivate the index.
But we could not create the index. The following error occurred:
SQL STATEMENT COULD NOT BE EXECUTED
INDEX COULD NOT BE CREATED COMPLETELY IN THE DATABASE
INDEX ZPP_EXCAVET_DET5-0 COULD NOT BE CREATED
Similar Messages
-
Database Portability of restored Database in Production server
Hi All,
I have a query regarding database portability in Exchange 2010 server.
I have restored a Journaling database which is ~350 GB and just wanted to check: if I dismount the production database, which has no data in it and only the journaling mailbox pointed to it, can I move the restored .edb file to the production server's physical location and mount it? Will the journaling account connect to the restored database I copied, or do I have to run a command to reconnect the mailbox?
BR/Deepak

Hi,
You can use the Connect-Mailbox -Identity -Archive cmdlet to re-connect an archive mailbox.
You can refer to the following article.
https://technet.microsoft.com/en-GB/library/ee633490(v=exchg.141).aspx
Best regards,
Belinda Ma
TechNet Community Support -
Following the instructions for downloading Adobe Extension Manager and looking for the .zxp or .xzp files doesn't work with my CW 2014.1 CC. There are NO files with those extensions anywhere in my program folders. This is frustrating as a new customer. I PAID lynda.com for tutorials after I PAID Adobe for software, and now I'm at a standstill. Apparently Adobe doesn't want to talk to their end-users. Has anyone else had this problem? I desperately need to get PHP and MySQL working for one of my paying customers.
I was pointed in the direction of using PHP and MySQL for database development
PHP and MySQL are correct, but the connection between them should be through mysqli or PDO, not the deprecated mysql extension used by Dreamweaver server behaviors.
There are many good tools for creating database-powered websites, but it can be done with just Dreamweaver and the tools included with the free XAMPP software package. That package contains the Apache web server, MySQL, PHP and phpMyAdmin (a web database interface).
Get the book PHP Solutions by Powers. It will get you going. -
Once I connected my new server to my farm's config db, it returned all of the following as missing locally. I stripped out any redundancies and headings, and I'm left with 43. I'm looking for an efficient strategy. Should I start with the lowest version number
and work my way up? Current DB version is 14.0.7015.1000. IIRC, SP2 is cumulative, so can I ignore the first two (SP1 and Hotfix), install SP2, and then the Language packs and etc on top?
Sorted by version:
Microsoft SharePoint 2010 Service Pack 1 (SP1) (14.0.6029.1000)
Hotfix for Microsoft SharePoint Server 2010 (KB2775353) 64-Bit Edition (14.0.6105.5000)
Service Pack 2 for Microsoft SharePoint 2010 (KB2687453) 64-Bit Edition (14.0.7015.1000)
Service Pack 2 for Microsoft 2010 Server Language Pack (KB2687462) 64-Bit Edition (14.0.7015.1000)
Microsoft Office Server Proof (English) 2010 (14.0.7015.1000)
Microsoft Office Server Proof (French) 2010 (14.0.7015.1000)
Microsoft Office Server Proof (Russian) 2010 (14.0.7015.1000)
Microsoft Office Server Proof (Spanish) 2010 (14.0.7015.1000)
Microsoft SharePoint Portal (14.0.7015.1000)
Microsoft User Profiles (14.0.7015.1000)
Microsoft SharePoint Portal English Language Pack (14.0.7015.1000)
Microsoft Shared Components (14.0.7015.1000)
Microsoft Shared Coms English Language Pack (14.0.7015.1000)
Microsoft Slide Library (14.0.7015.1000)
Microsoft InfoPath Forms Services (14.0.7015.1000)
Microsoft InfoPath Form Services English Language Pack (14.0.7015.1000)
Microsoft Word Server (14.0.7015.1000)
Microsoft Word Server English Language Pack (14.0.7015.1000)
PerformancePoint Services for SharePoint (14.0.7015.1000)
PerformancePoint Services in SharePoint 1033 Language Pack (14.0.7015.1000)
Microsoft Visio Services English Language Pack (14.0.7015.1000)
Microsoft Visio Services Web Front End Components (14.0.7015.1000)
Microsoft Excel Services Components (14.0.7015.1000)
Microsoft Document Lifecycle Components (14.0.7015.1000)
Microsoft Excel Services English Language Pack (14.0.7015.1000)
Microsoft Search Server 2010 Core (14.0.7015.1000)
Microsoft Search Server 2010 English Language Pack (14.0.7015.1000)
Microsoft Document Lifecycle Components English Language Pack (14.0.7015.1000)
Microsoft Slide Library English Language Pack (14.0.7015.1000)
Microsoft SharePoint Server 2010 (14.0.7015.1000)
Microsoft Access Services Server (14.0.7015.1000)
Microsoft Access Services English Language Pack (14.0.7015.1000)
Microsoft Web Analytics Web Front End Components (14.0.7015.1000)
Microsoft Web Analytics English Language Pack (14.0.7015.1000)
Microsoft Excel Mobile Viewer Components (14.0.7015.1000)
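A quick build-number comparison can triage a list like this, on the assumption stated above ("IIRC, SP2 is cumulative") that anything with a lower build than SP2's 14.0.7015.1000 is superseded. A minimal sketch, with abbreviated item names:

```python
# Sketch: flag "missing locally" items whose build number is lower than the
# SP2 build, on the assumption (from the post above) that SP2 is cumulative.

SP2_BUILD = (14, 0, 7015, 1000)

def parse_build(version):
    """Turn '14.0.6029.1000' into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def superseded_by_sp2(items):
    """items: (name, version-string) pairs; return the names SP2 covers."""
    return [name for name, version in items
            if parse_build(version) < SP2_BUILD]

missing = [
    ("SharePoint 2010 SP1", "14.0.6029.1000"),
    ("Hotfix KB2775353", "14.0.6105.5000"),
    ("SharePoint 2010 SP2", "14.0.7015.1000"),
]
print(superseded_by_sp2(missing))  # ['SharePoint 2010 SP1', 'Hotfix KB2775353']
```

As the rest of the thread shows, the installer can still demand the earlier packages for prerequisite reasons, so treat this only as a triage aid, not an install order.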
Recommendations?
Thanks,
Scott

Thanks guys. I was able to get through all of the patches except for
Language Pack for SharePoint, Project Server and Office Web Apps 2010 - English missing locally
Language Pack for SharePoint, Project Server and Office Web Apps 2010 -
Spanish/Español missing locally
This was my process:
Config Wizard:
Adding Index server to existing farm, "Server Farm Product and Patch Status" returned 34 Missing Locally required products and patches.
SKIP installing the following two, as SP2 is cumulative
Microsoft SharePoint 2010 Service Pack 1 (SP1) (14.0.6029.1000) (officeserver2010sp1-kb2460045-x64-fullfile-en-us.exe)
Hotfix for Microsoft SharePoint Server 2010 (KB2775353) 64-Bit Edition (14.0.6105.5000)
install SP2 oserversp2010-kb2687453-fullfile-x64-en-us.exe
install oslpksp2010-kb2687462-fullfile-x64-en-us.exe
Got "There are no products affected by this package installed on this system."
SO!
Uninstalled SharePoint 2010 Server.
Will try to install again, without skipping #2, and reorder installation of oslpksp2010-kb2687462-fullfile-x64-en-us.exe
Retry (the long way):
Run SharePointServer.exe, get Missing Locally...
Install oslpksp2010-kb2687462-fullfile-x64-en-us.exe SUCCESSFUL
Install officeserver2010sp1-kb2460045-x64-fullfile-en-us.exe SUCCESSFUL
Reboot
Install oserversp2010-kb2687453-fullfile-x64-en-us.exe SUCCESSFUL
Rerun Config
STILL MISSING
Download oslpksp2010-kb2687462-fullfile-x64-es-es.exe, run, "there are no products affected..."
Uninstall.
Re-retried, only got through the first couple:
Run SharePointServer.exe, get Missing Locally...
Install SP1 officeserver2010sp1-kb2460045-x64-fullfile-en-us.exe, SUCCESSFUL
Install English Lang Pack oslpksp2010-kb2687462-fullfile-x64-en-us.exe, "There are no products affected by this package installed on this system."
Install Spanish Lang Pack oslpksp2010-kb2687462-fullfile-x64-es-es.exe,
Install SP2 oserversp2010-kb2687453-fullfile-x64-en-us.exe, run Config Wizard
I'm now downloading 462150_intl_x64_zip.exe, going to try and install it as step three.
Any suggestions greatly appreciated.
Thanks,
Scott -
Creating Indexes in Production Server.
Hi All,
1. We have a master data table in the production server with 2 million records. This master data table doesn't have any indexes, which causes delays in the loading process. How do we create indexes in production?
2. This table has an index in DEV, but it was created in the $tmp package so it cannot be transported. If I create the index on this master data table, which already contains data, what could the impact be (for example, loss of data)? Or is there another way? Please post your valuable comments.
Thanks & Regards
Kranthi.
Edited by: Arun Varadarajan on Apr 15, 2009 11:54 PM

Hi Kranthi,
Creating Secondary Indexes
Procedure
1. In the maintenance screen of the table, choose Indexes.
If indexes already exist on the table, a list of these indexes is displayed. Choose Create.
2. In the next dialog box, enter the index ID and choose Continue.
The maintenance screen for indexes appears.
3. Enter an explanatory text in the field Short text.
You can then use the short text to find the index at a later time, for example with the R/3 Repository Information System.
4. Select the table fields to be included in the index using the input help for the Field name column.
The order of the fields in the index is very important. See "What to Keep in Mind for Secondary Indexes".
5. If the values in the index fields already uniquely identify each record of the table, select Unique index.
A unique index is always created in the database at activation because it also has a functional meaning (it prevents duplicate entries in the index fields).
6. If it is not a unique index, leave Non-unique index selected.
In this case you can use the radio buttons to define whether the index should be created for all database systems, for selected database systems, or not at all in the database.
7. Select "For selected database systems" if the index should only be created for selected database systems.
Click on the arrow behind the radio buttons. A dialog box appears in which you can define up to 4 database systems with the input help. Select Selection list if the index should only be created on the given database systems. Select Exclusion list if the index should not be created on the given database systems. Choose Continue.
8. Choose Save.
Regards,
Prabhudas
Edited by: Arun Varadarajan on Apr 15, 2009 11:53 PM
Edited by: Arun Varadarajan on Apr 15, 2009 11:54 PM -
Hi friends, I need a suggestion from you on how to:
insert data into all tables in database "A" on the Test server, by
selecting data from all tables in database "A" on the Production server
where id = 123.
Database A has the same structure on Test and Production, and all tables have the Id column in common.
The purpose of this insert: as we all know, Production has the latest data, and I need to push it to the Test server on request for a particular ID only (maybe once or twice a week).
I have a linked server set up, named "LINQ".
An example for one table is below; likewise, I need a script which does this for all 154 tables.
Insert into ABC (id, name) -- insert into test server
Select Id, name from LINQ.ProdServername.databasename.ABC where id = 123
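The per-table pattern above can be generated for all 154 tables with a short script. A minimal sketch, assuming the table list is supplied by hand (in practice it could be read from sys.tables on the production side) and using a placeholder four-part linked-server path:

```python
# Sketch: generate one INSERT ... SELECT per table for copying a single id
# from production to test through the linked server "LINQ".
# The table names and the four-part path are placeholders.

LINKED_PREFIX = "LINQ.ProdServername.databasename"  # hypothetical path
TABLES = ["ABC", "DEF", "GHI"]  # stand-ins for the real 154 tables

def build_copy_script(tables, source_prefix, record_id):
    """Return a T-SQL batch that copies the rows for one id."""
    statements = []
    for table in tables:
        statements.append(
            f"INSERT INTO {table}\n"
            f"SELECT * FROM {source_prefix}.{table} WHERE id = {record_id};"
        )
    return "\n".join(statements)

print(build_copy_script(TABLES, LINKED_PREFIX, 123))
```

Note that INSERT without a column list plus SELECT * only works when the column order matches exactly on both sides; with identical structures (as stated above) that holds, but listing the columns explicitly, as in the ABC example, is safer.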
Please help me.
Thanks

Why not use the Export/Import wizard for this, if you've got read access to production?
Please Mark This As Answer if it solved your issue
Please Vote This As Helpful if it helps to solve your issue
Visakh
My MSDN Page
My Personal Blog
My Facebook Page -
Hi,
I took a data dump (export) on an Oracle 9i machine and imported it into Oracle 10g (the production machine), but it shows an error: language set error.
Could you tell me how to take a data dump with the correct language set?
Regards,
Suva

Hi PaulM,
Please find the details.
The development server is a 9i machine (I exported on this machine), and I imported on the production server (Oracle 10g).
When importing on the production server the error occurs; the error log is added below.
Production Database (language details)
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET UTF8
NLS_RDBMS_VERSION 10.2.0.1.0
Development Database (language details)
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET UTF8
NLS_RDBMS_VERSION 10.2.0.1.0
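Both listings above show NLS_CHARACTERSET UTF8, so the databases agree; the conversion reported in the import log comes from the client-side NLS_LANG in effect when exp/imp ran. A sketch of invoking imp with a matching NLS_LANG (connect string and file names are placeholders):

```python
# Sketch: run the legacy imp tool with NLS_LANG matching the database
# character set (UTF8, per the listings above) so the client does not
# convert through its default WE8MSWIN1252 character set.
import os
import subprocess

env = dict(os.environ)
env["NLS_LANG"] = "AMERICAN_AMERICA.UTF8"  # <language>_<territory>.<charset>

def run_imp(connect, dumpfile):
    """Invoke imp with the corrected environment."""
    return subprocess.run(
        ["imp", connect, f"file={dumpfile}", "full=y", "log=imp.log"],
        env=env,
    )

# run_imp("jw_admin/password", "dump.dmp")  # placeholder credentials
print(env["NLS_LANG"])
```

The same environment should be set for the 9i exp run, so the dump is written in the database character set rather than the client default.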
Log file
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
Export file created by EXPORT:V09.02.00 via conventional path
import done in WE8MSWIN1252 character set and UTF8 NCHAR character set
import server uses UTF8 character set (possible charset conversion)
export server uses AL16UTF16 NCHAR character set (possible ncharset conversion)
. importing JW_OR's objects into JW_OR
. importing JW_OS's objects into JW_OS
. importing JW_ADMIN's objects into JW_ADMIN
. importing JW_OR's objects into JW_OR
. . importing table "ACCRXNS" 234671 rows imported
. . importing table "AUTHORLINKS" 790450 rows imported
. . importing table "AUTHORS" 79500 rows imported
. . importing table "CATSOL" 25505 rows imported
. . importing table "CATSOLSYNONYMS" 80045 rows imported
. . importing table "CHAPTERTITLES" 133 rows imported
. . importing table "COMPOUNDLINKS" 601785 rows imported
. . importing table "CONDITIONS" 207445 rows imported
. . importing table "JOURNALS" 2327 rows imported
. . importing table "LANGUAGE" 0 rows imported
. . importing table "MAINDATA" 234659 rows imported
. . importing table "MOLDATA" 721174 rows imported
. . importing table "PLAN_TABLE" 1 rows imported
. . importing table "REFERENCES" 276783 rows imported
. . importing table "ROLES" 2 rows imported
. . importing table "RXNKEYLINKS" 1724404 rows imported
. . importing table "RXNKEYWORDS" 848 rows imported
. . importing table "TABLETITLES" 2400 rows imported
. . importing table "TEMP_TABLE" 165728 rows imported
. . importing table "TEMP_WILEY_MAINDATA" 155728 rows imported
. . importing table "TEMP_WILEY_PDF_MAP" 16672 rows imported
. . importing table "TEMP_WILEY_YEAR_VOL_MAP" 42 rows imported
. . importing table "WEX_ACCRXNS" 3465 rows imported
. . importing table "WEX_AUTHORLINKS" 14183 rows imported
. . importing table "WEX_AUTHORS" 79500 rows imported
. . importing table "WEX_CHAPTERTITLES" 133 rows imported
. . importing table "WEX_COMPOUNDLINKS" 10925 rows imported
. . importing table "WEX_CONDITIONS" 5297 rows imported
. . importing table "WEX_JOURNALS" 2327 rows imported
. . importing table "WEX_LANGUAGE" 0 rows imported
. . importing table "WEX_MAINDATA" 3465 rows imported
. . importing table "WEX_MOLDATA" 10358 rows imported
. . importing table "WEX_REFERENCES" 3795 rows imported
. . importing table "WEX_RXNKEYLINKS" 34540 rows imported
. . importing table "WEX_RXNKEYWORDS" 848 rows imported
. . importing table "WEX_TABLETITLES" 2400 rows imported
. . importing table "WEX_WILEY_HTML_MAP" 17316 rows imported
. . importing table "WEX_WILEY_MAINDATA" 3465 rows imported
. . importing table "WEX_WILEY_PDF_MAP" 23925 rows imported
. . importing table "WEX_WILEY_YEAR_VOL_MAP" 58 rows imported
. . importing table "WILEY_HTML_MAP" 17316 rows imported
. . importing table "WILEY_MAINDATA" 234659 rows imported
. . importing table "WILEY_PDF_MAP" 23925 rows imported
. . importing table "WILEY_YEAR_VOL_MAP" 58 rows imported
. importing JW_OS's objects into JW_OS
. . importing table "ACCRXNS" 7116 rows imported
. . importing table "ATMOSPHERE" 47 rows imported
. . importing table "AUTHORLINKS" 33276 rows imported
. . importing table "AUTHORS" 6555 rows imported
. . importing table "CATSOL" 1463 rows imported
. . importing table "CATSOLSYNONYMS" 9370 rows imported
. . importing table "CHEMICALS" 78197 rows imported
. . importing table "COMPOUNDLINKS" 20799 rows imported
. . importing table "EXPDET" 1 rows imported
. . importing table "FOOTNOTES" 77825 rows imported
. . importing table "JOURNALS" 2 rows imported
. . importing table "LANGUAGE" 2 rows imported
. . importing table "MAINDATA" 7116 rows imported
. . importing table "PATHSTEP" 7199 rows imported
. . importing table "PROCEDURENOTES" 77293 rows imported
. . importing table "ROLES" 2 rows imported
. . importing table "RXNKEYLINKS" 23096 rows imported
. . importing table "RXNKEYWORDS" 1272 rows imported
. . importing table "WEX_ACCRXNS" 135 rows imported
. . importing table "WEX_ATMOSPHERE" 47 rows imported
. . importing table "WEX_AUTHORLINKS" 613 rows imported
. . importing table "WEX_AUTHORS" 6555 rows imported
. . importing table "WEX_CHEMICALS" 0 rows imported
. . importing table "WEX_COMPOUNDLINKS" 497 rows imported
. . importing table "WEX_EXPDET" 1 rows imported
. . importing table "WEX_FOOTNOTES" 2184 rows imported
. . importing table "WEX_JOURNALS" 2 rows imported
. . importing table "WEX_LANGUAGE" 2 rows imported
. . importing table "WEX_MAINDATA" 135 rows imported
. . importing table "WEX_PATHSTEP" 135 rows imported
. . importing table "WEX_PROCEDURENOTES" 2253 rows imported
. . importing table "WEX_RXNKEYLINKS" 695 rows imported
. . importing table "WEX_RXNKEYWORDS" 1272 rows imported
. importing JW_ADMIN's objects into JW_ADMIN
. . importing table "APP_USER" 76 rows imported
. . importing table "AUTHOR" 61874 rows imported
. . importing table "CITATION"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10794
Column 2 77
Column 3 1
Column 4 24
Column 5
Column 6 Science of Synthesis
Column 7 Negishi, E.-i.; Takahashi, T. Science of Synthesis...
Column 8 681–848
Column 9 2
Column 10
Column 11 2002
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10879
Column 2 77
Column 3 1
Column 4 110
Column 5
Column 6 Comprehensive Organic Synthesis
Column 7 Hiemstra, H.; Speckamp, W. N.; Trost, B. M.; Flemi...
Column 8 1047–108
Column 9 2
Column 10
Column 11
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10880
Column 2 77
Column 3 1
Column 4 111
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 De Koning, H.; Speckamp, W. N.; Helmchen, G.; Hoff...
Column 8 1953–200
Column 9 E21b
Column 10
Column 11 1995
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10904
Column 2 77
Column 3 1
Column 4 135
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 Ryu, I.; Murai, S.; de Meijere, A., Ed. Houben-Wey...
Column 8 1985–204
Column 9 E17c
Column 10
Column 11 1997
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10905
Column 2 77
Column 3 1
Column 4 136
Column 5
Column 6 The Chemistry of the Cyclopropyl Group
Column 7 Tsuji, T.; Nishida, S.; Patai, S.; Rappoport, Z., ...
Column 8 307–373
Column 9
Column 10
Column 11 1987
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10906
Column 2 77
Column 3 1
Column 4 137
Column 5
Column 6 The Chemistry of the Cyclopropyl Group
Column 7 Vilsmaier, E.; Patai, S.; Rappoport, Z., Eds. The ...
Column 8 1341–145
Column 9
Column 10
Column 11 1987
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10952
Column 2 77
Column 3 1
Column 4 183
Column 5
Column 6 Cyclopropane-Derived Reactive Intermediates
Column 7 Boche, G.; Walborsky, H. M. Cyclopropane-Derived R...
Column 8 117–173
Column 9
Column 10
Column 11 1990
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 10958
Column 2 77
Column 3 1
Column 4 189
Column 5
Column 6 Houben-Weyl Methods of Organic Chemistry
Column 7 Klunder, A. J. H.; Zwanenburg, B. Houben-Weyl Meth...
Column 8 2419–243
Column 9 E17c
Column 10
Column 11 1997
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 10995
Column 2 77
Column 3 1
Column 4 226
Column 5
Column 6 Science of Synthesis
Column 7 Cha, J. K. Science of Synthesis 2005, 325–338.
Column 8 325–338
Column 9
Column 10
Column 11 2005
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 10, maximum: 8)
Column 1 17123
Column 2 82
Column 3 1
Column 4 13
Column 5
Column 6 Comprehensive Organometallic Chemistry II
Column 7 Dushin, R. G.; Edward, W. A.; Stone, F. G. A.; Wil...
Column 8 1071–109
Column 9 12
Column 10
Column 11 1995
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 17124
Column 2 82
Column 3 1
Column 4 14
Column 5
Column 6 Modern Carbonyl Olefination
Column 7 Ephritikhine, M.; Villiers, C.; Takeda, T. Ed. Mod...
Column 8 223–285
Column 9
Column 10
Column 11 2004
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "JW_ADMIN"."CITATION"."PAGE" (actual: 9, maximum: 8)
Column 1 17126
Column 2 82
Column 3 1
Column 4 16
Column 5
Column 6 Transition Metals for Organic Synthesis (2nd Editi...
Column 7 Furstner, A.; Beller, M.; Bolm, C. Eds. Transition...
Column 8 449–468
Column 9
Column 10
Column 11 2004
17712 rows imported
. . importing table "FOOTNOTE" 38 rows imported
. . importing table "GT_STATS_REPORT" 0 rows imported
. . importing table "GT_VALIDATION_REPORT" 0 rows imported
. . importing table "OR_USERS" 1 rows imported
. . importing table "OS_USERS" 1 rows imported
. . importing table "PROCEDURENOTE" 70 rows imported
. . importing table "QC_TRACKING" 539881 rows imported
. . importing table "ROLE" 5 rows imported
. . importing table "SCHEMA" 3 rows imported
. . importing table "TASK_ALLOCATION" 159370 rows imported
. . importing table "USER_LOG" 174488 rows imported
. . importing table "VERSION" 3 rows imported
About to enable constraints...
IMP-00017: following statement failed with ORACLE error 2298:
"ALTER TABLE "AUTHOR" ENABLE CONSTRAINT "FK_AUTHOR_CITATIONID""
IMP-00003: ORACLE error 2298 encountered
ORA-02298: cannot validate (JW_ADMIN.FK_AUTHOR_CITATIONID) - parent keys not found
Import terminated successfully with warnings.
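The rejected rows above all fail on the 8-byte PAGE column: with NLS_LENGTH_SEMANTICS = BYTE, the en dash in a page range like 681-848 takes 3 bytes in UTF8, so the value needs 9 bytes. A pre-import scan can find such values (widening PAGE in the target, for example to VARCHAR2(16), would then let the rows load); the sample values below are taken from the log:

```python
# Sketch: find values that will overflow a byte-limited column before import.
# With NLS_LENGTH_SEMANTICS = BYTE (see the listings above), the en dash in
# a page range costs 3 bytes in UTF8, which is why a 7-character range
# overflows the 8-byte PAGE column.

def oversized(values, max_bytes, encoding="utf-8"):
    """Return (value, byte_length) pairs exceeding the column limit."""
    return [
        (value, len(value.encode(encoding)))
        for value in values
        if len(value.encode(encoding)) > max_bytes
    ]

pages = ["681\u2013848", "307\u2013373", "25-30"]  # \u2013 is the en dash
for value, size in oversized(pages, max_bytes=8):
    print(f"{value!r} needs {size} bytes")
```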
Regards,
Subash -
Standby database archive log apply in production server.
Dear All,
How do I apply standby database archive logs on the production server?
Please help me.
Thanks,
Manas

"How can I use the standby database as primary for those 48 hours?" Perform a switchover (role transition).
First check if the standby is in sync with the primary database.
Primary database:
sql>select max(sequence#) from v$archived_log; ---> Value A

Standby database:
sql>select max(sequence#) from v$archived_log where applied='YES'; ---> Value B

Check if Value B is the same as Value A.
If the standby is in sycn with the primary database, then perform the switchover operation (refer the below link)
http://www.articles.freemegazone.com/oracle-switchover-physical-standby-database.php
http://docs.oracle.com/cd/B19306_01/server.102/b14230/sofo.htm
http://www.oracle-base.com/articles/9i/DataGuard.php#DatabaseSwitchover
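The two queries above reduce to a single comparison. A monitoring sketch (how Value A and Value B are fetched is left abstract; in practice they come from v$archived_log on each database):

```python
# Sketch: a switchover is safe only when the highest applied sequence on the
# standby (Value B) has caught up with the highest archived sequence on the
# primary (Value A).

def standby_in_sync(primary_max_seq, standby_applied_max_seq):
    """True when every archived log has been applied on the standby."""
    return standby_applied_max_seq >= primary_max_seq

print(standby_in_sync(1042, 1042))  # True: in sync, switchover can proceed
print(standby_in_sync(1042, 1039))  # False: 3 logs behind, resolve the gap first
```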
manas

Mark your questions as answered if you feel that you have got better answers rather than building up a heap of unanswered questions. -
Production and warehouse database on same server
I have a production database whose data moves every six months to a warehouse database, where it is stored for history. There will not be much interaction between production and warehouse, but both databases exist on one server. I would like to know the best practice, since Oracle recommends keeping one database per server.
Read the following link, which addresses your problem:
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:316916600346973310 -
Features missing in production server
Hi,
I have an application with file upload and download features. The file upload and download features are visible on the development server but not on the production server.
Any clues on this?

Hi Parama,
Check whether the activity moved to the production server was successfully built on the server and was not broken.
Were the activation and transport successful?
Regards
Chander Kararia -
Error on Production Server when PHP service is called.
The web application runs fine on my development machine, but on the production server it gives me the following error and no response when a PHP service is called.
error:
Send failed
Channel.Connect.Failed error NetConnection.Call.BadVersion: : url:
'http://ipaddress/Project/public/gateway.php'
Network Monitor:
Development Machine:
Windows 7
MySQL 5.5.16
Zend Framework 1.10
Production Machine:
Ubuntu 11.10
MySQL 5.1.58
Zend Framework 1.11
I followed the following steps to deploy the application on production server;
1. created identical database with same credentials on the production server
2. confirmed that my php service works on the server by manually navigating to the php file and printing the data retrieved on the page
3. modified the paths in amf_config.ini to reflect production server
[zend]
webroot = "/var/www"
zend_path ="/usr/local/zend/share/ZendFramework/library"
library ="/var/www/Project/library"
services ="/var/www/Project/services"
[zendamf]
amf.production = true
amf.directories[]=Project/services
4. initialized the class by adding
// Initialization customization goes here
_serviceControl.endpoint = "http://ipaddress/Project/public/gateway.php";
in the .as file for the php service
I have also found from my zend server logs that when my php service is called its throws the following error:
PHP Notice: Undefined index: HTTPS in /usr/local/zend/share/ZendFramework/library/Zend/Amf/Response/Http.php on line 59
I doubt that there is anything wrong in the deployment process, but please do let me know if you can think of something I might have missed.
The other thing I'm wondering is whether the PHP notice could be caused by the different Zend Framework versions.
Kindly advise.

EGJ Steens has posted a fix at http://forums.adobe.com/message/4097998#4097998 (using Flash Builder).
Zend Framework 1.11 added an extra function in the Http.php file (the one mentioned in the post), which was fixed as follows:
protected function isIeOverSsl()
{
    $_SERVER['HTTPS'] = 'off'; // Adding this line fixed the issue
    $ssl = $_SERVER['HTTPS'];
    if (!$ssl || ($ssl == 'off')) {
        // IIS reports "off", whereas other browsers simply don't populate
        return false;
    }
    $ua = $_SERVER['HTTP_USER_AGENT'];
    if (!preg_match('/; MSIE \d+\.\d+;/', $ua)) {
        // Not Microsoft Internet Explorer
        return false;
    }
    return true;
}
My concern is that if I wish to secure my application with SSL using HTTPS, will this fix prevent me from doing so?
Any pointers, anybody? -
SSIS package is taking more time on production server than on test server
The same SSIS package was taking 18 hours on production while taking just 6-7 hours on the test server. When I increased the frequency of the Tlog backups, it dropped to 11 hours, but a difference of 3-4 hours remains.
I suspected that online transactions on the production server were causing this; however, this was proved wrong when I shut down the production sites and got the same result.
Any idea what I am missing?
Santosh Singh

Thanks for taking the time to try to understand the explanation I shared. Please find my comments inline:
Are they performing server or database maintenance in the same window of time when the packages run. These include backups, index re-organize/rebuilt, partition switching, DBCC checks, statistics update, etc.?
Comments: I am DBA too for them so I know better what is going on. No such jobs are executing during this window.
What is the recovery model being used in production Full, Simple, Bulk-logged?
Comments: Full
Are the indexes and statistics being maintained in the source system you are extracting data from?
Comments:Yes
Have SQL Server configurations been modified such as Max Degree of Parallelism and Parallelism thresholds?
Comments: Yes, it was earlier; I then changed it back to the default while comparing with the Test environment, which has everything at defaults.
Are there any special trace flags used to start up SQL Server in production?
Comments: No, only trace flag 1118 for tempdb is enabled.
Is SQL Server Auditing, Compression or Encryption being used in production?
Comments: Only Backup compression has been enabled, no Auditing and no Encryption
Is there Replication, Mirroring, Log-Shipping, Change Data Capture, Change Tracking being used in production?
Comments: No replication, mirroring, log shipping, change data capture, or change tracking is being used in production. We had log shipping earlier, but we removed it a long time ago.
Are there other applications running along with SQL Server in the same Windows Server?
Comments: No, only SQL Server.
Has SQL Server max memory changed from the default value? How much RAM is left for Integration Services and Windows? Check if paging is occurring.
Comments: This is a cluster setup with two instances on each node, so no, max memory is not at the default; I had to adjust it. However, I didn't find high memory usage or memory issues in any of the logs (SQL or Windows). 28 GB is left for Integration Services and Windows, and 50 GB each for the two SQL Server instances. Paging looks normal too.
Are there query hints being used in the SELECT queries?
Comments: The query we have is bad; however, tuning it is our last target. The same query executes in 6 hours on the Test server, so why it now takes 11 hours is the question of the hour for me.
Thanks for all help again.
Please let me know if I am missing anything.
Santosh Singh -
Archive Log Gap between Disaster Recovery Server & Production Server
Hi
Can anybody provide me a script to find the archive logs missing on the production server? That is, an archive log was backed up and then deleted from the production server, while the rest of the archive logs have been shipped to the DR server. The script should show the gap between the production server and the DR server up to the log that was deleted from production; the later logs have already been shipped to the DR server, but they can only be applied once the deleted log is restored on the production server and shipped to the DR server.
thanks.
Regards
Ravi Kant Arya

Ravi,
The question is how are you getting the archive logs shipped over to your standby database? If you are doing it via FTP or scripts, then you may want to look at configuring DataGuard to do it for you. Look at "Setting Up Oracle Dataguard for SAP" at:
http://www.oracle.com/us/solutions/sap/wp-ora4sap-dataguard11g-303811.pdf
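In the meantime, if you just want to see the gap, a quick sketch like the following can help (run on the standby; note that `prod_link` is a hypothetical database link to the primary, not a real object in your system):

```sql
-- On the standby: any gap the recovery process has detected
SELECT thread#, low_sequence#, high_sequence#
FROM   v$archive_gap;

-- Compare the highest archived sequence on the primary with the
-- highest applied sequence on the standby (prod_link is hypothetical)
SELECT 'PRIMARY' AS site, MAX(sequence#) AS seq
FROM   v$archived_log@prod_link
UNION ALL
SELECT 'STANDBY', MAX(sequence#)
FROM   v$archived_log
WHERE  applied = 'YES';
```

Any sequence numbers between the two results are the logs you still need to restore from backup and ship across.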
Good Luck.
Mike Kennedy -
Way to find Missing table Index in SAP ?
Hi All,
Is there any way to check missing tables index in SAP?
and also DB size ? and reorganize the index and tables?
Cheers
Usman
Hi Usman,
Transactions <b>SE14</b> and <b>DB02</b>. To reorganize indexes and tables, use report <b>RSSDBREO</b>.
DB size: in transaction <b>DB02</b>, perform a DB check and you will get the size.
Or you can check the SAP data file systems on your database server to get the gross size of your DB.
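If the system runs on an Oracle backend (an assumption — adjust for your actual database), the gross size can also be read straight from the dictionary:

```sql
-- Gross database size (data files plus temp files), in GB
SELECT ROUND(SUM(bytes) / 1024 / 1024 / 1024, 1) AS size_gb
FROM   (SELECT bytes FROM dba_data_files
        UNION ALL
        SELECT bytes FROM dba_temp_files);
```

This matches what you would get by summing the file sizes at the OS level.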
Thanks
Saquib Khan
Message was edited by: Saquib Khan -
PPS Tuning Advisory – Missing OOTB Index On s_audit_item In 8.x
Hi All,
We have had to add this missing out-of-the-box index at several of our accounts, so I wanted to pass it along. Its absence hurts deletes, such as the deletion of quotes.
-- azgpf8ju1yj6s
ALTER SESSION SET OPTIMIZER_MODE = FIRST_ROWS_10 ;
ALTER SESSION SET "_OPTIMIZER_SORTMERGE_JOIN_ENABLED" = FALSE ;
ALTER SESSION SET "_OPTIMIZER_JOIN_SEL_SANITY_CHECK" = TRUE;
ALTER SESSION SET "_HASH_JOIN_ENABLED" = FALSE;
DELETE FROM SIEBEL.S_QUOTE_ITM_SPA WHERE ROW_ID IN
(SELECT TBL_RECORD_ID FROM SIEBEL.S_AUDIT_ITEM WHERE GROUP_NUM = :B1);
DROP INDEX SIEBEL.S_AUDIT_ITEM_CUSTOM01_X;
CREATE INDEX SIEBEL.S_AUDIT_ITEM_CUSTOM01_X ON SIEBEL.S_AUDIT_ITEM
(GROUP_NUM)
NOLOGGING
TABLESPACE SIEBEL1D
PCTFREE 10
INITRANS 2
MAXTRANS 255
STORAGE (
INITIAL 8M
NEXT 8M
MINEXTENTS 1
MAXEXTENTS UNLIMITED
PCTINCREASE 0
BUFFER_POOL DEFAULT)
PARALLEL 6;
ALTER INDEX SIEBEL.S_AUDIT_ITEM_CUSTOM01_X
NOPARALLEL
LOGGING;
BEGIN
DBMS_STATS.GATHER_INDEX_STATS(
ownname => 'SIEBEL',
indname => 'S_AUDIT_ITEM_CUSTOM01_X',
estimate_percent => 100,
degree => DBMS_STATS.AUTO_DEGREE);
END;
/
Robert Ponder
Lead Architect and Director
Ponder Pro Serve
cell: 770.490.2767
fax: 770.412.8259
email: [email protected]
web: www.ponderproserve.com
Edited by: Robert Ponder on Sep 13, 2010 7:18 PM
Hi,
it's a known issue (at least for me :)). DBMS_LOB is a special interface used by clients to operate on LOBs. When it is used to manipulate LOB data, the SQL engine is bypassed (OK, that is my speculation — strictly it cannot be bypassed), which results in wait events being accounted to a non-existent cursor (never parsed and executed) in the 10046 trace. It is usually the first "available" cursor number in the trace (I have not seen other behavior). TKPROF shows all waits accounted to DBMS_LOB activity in the OVERALL section; OraSRP shows them under the 'unaccounted-for' section.
So that's the first part: DBMS_LOB actions are not accounted to a particular cursor in the trace.
There's a second thing to remember. From [direct path write|http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/waitevents.htm#sthref3038]:
"During Direct Path operations, the data is asynchronously written to the database files."
And from [direct path write and direct path write temp|http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/instance_tune.htm#i16292]:
"Like direct path reads, the number of waits is not the same as number of write calls issued if the I/O subsystem supports asynchronous writes."
This means that the time for direct path operations is not real wall-clock time; it is just the time to submit an I/O request if the I/O subsystem supports asynchronous I/O. And that explains the "lost" wait time.
Also, parameters to that events are:
P1 - File_id for the write call
P2 - Start block_id for the write call
P3 - Number of blocks in the write call
So P2 is not the object id, nor the data object id. Usually such cases are identified based on the number of 'direct path' events and by looking at the raw trace file.
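Given the P1/P2 values above, a common way to map a direct path write back to its segment is to probe the extent map (a sketch; the bind values `:p1` and `:p2` come from the trace event's P1 and P2):

```sql
-- Map a direct path write's P1 (file#) / P2 (start block#)
-- to the segment that owns that block
SELECT owner, segment_name, segment_type
FROM   dba_extents
WHERE  file_id = :p1
AND    :p2 BETWEEN block_id AND block_id + blocks - 1;
```

This can be slow on large databases (DBA_EXTENTS is expensive to scan), so it is best run only for a handful of representative waits.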