Sunstudio12 dbx: core and recompiled sources
Hi Support!
I am trying to run dbx with a core file for which I have the original executable and libs.
I recompiled the objects from the original source, but dbx complains that they were compiled at a different time and refuses to read them:
Skipping stabs. (see `help finding-files')
Object file: XXX/XXX/obj_debug.solaris/main.o
compiled on: Tue Mar 24 15:00:33 2009
Executable contains object file compiled on: Fri Mar 20 11:54:06 2009
How can I force dbx to ignore the compiled-in date stamps and read the object files?
DBX version:
/opt/studio12/SUNWspro/bin/dbx -V
Sun Dbx Debugger 7.6 SunOS_i386 Patch 124873-01 2007/07/12
Machine info:
uname -a
SunOS xappd10a 5.10 Generic_138889-03 i86pc i386 i86pc
Thanks,
Ilya
Unfortunately, there's nothing you can do. There is an option for reading a core file that dbx thinks does not correspond to the executable, but - alas - there is no such option for object files.
It is somewhat strange that you ran into this problem at all: by default, the debug information produced by the Sun Studio 12 compilers is DWARF, which is stored in the loadobjects rather than in the object files, making the latter unnecessary for debugging. It is of course too late to change that for your case, but you probably want to consider switching from stabs to DWARF for future builds of your product.
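For what it's worth, a minimal sketch of such a rebuild (assuming the Sun Studio compiler drivers are on PATH; the file names are placeholders) would be:

```shell
# -xdebugformat=dwarf asks the Sun Studio compilers to emit DWARF
# debug information, which ends up in the loadobject itself, so dbx
# no longer depends on the original .o files or their timestamps.
cc -g -xdebugformat=dwarf -c main.c
cc -g -xdebugformat=dwarf -o myprog main.o
```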
Let me know if you have any questions about debug info (or any other, for that matter).
Anyway, the problem you are having seems to have a reasonable solution, so if you have a support contract, you can escalate it through the support channel. Otherwise, since it is related to the quickly aging stabs debug format, I am not sure it will be fixed in newer versions of dbx. Still, you can file an RFE (Request For Enhancement) through bugs.sun.com.
Similar Messages
-
Dbx against core and recompiled sources
Hi,
We are trying to run dbx on a core file for which we have the original executable and libs, but not the source / object tree. We have recompiled the objects from the original source, but dbx complains that they were compiled at a different time, and refuses to read them:
Object file: <pathname>/server.o
compiled on: Mon Oct 4 13:50:43 2004
Executable contains object file compiled on: Sun Mar 28 10:19:30 2004
dbx: warning: Object file is not the same one that was linked into executable.
Skipping stabs. (see `help finding-files')
How can we force dbx to ignore the compiled-in date stamps and read the object files? Here's the dbx version info:
Dbx Debugger 7.0 Patch 111709-02 2002/07/09
And the output of 'uname -a':
SunOS ex2-ssmsun1 5.8 Generic_108528-23 sun4u sparc SUNW,Ultra-2
Thanks,
Chris
This version - Forte Developer 7 - has been announced EOL (End of Life). If you have a support contract with Sun, please follow the service channel and seek help.
http://developers.sun.com/tools/cc/support/support_matrix.html
Otherwise, we strongly recommend upgrading to the current version, Sun Studio 9, which can be found at:
http://wwws.sun.com/software/products/studio/index.html
The dbx version is 7.3 in Sun Studio 9.
Thanks.
Rose -
dbx core dumps when I debug my application with RTC enabled (I'm trying to detect some memory leaks in my code), and I'm getting the following:
RTC: Enabling Error Checking...
RTC: Running program...
(/opt/SUNWspro/bin/../WS6U2/bin/sparcv9/dbx) suppress rui
(/opt/SUNWspro/bin/../WS6U2/bin/sparcv9/dbx) cont
t@10 (l@10) signal SEGV (no mapping at the fault address) in __rtc_trap_handler at 0xff2e1148
0xff2e1148: __rtc_trap_handler+0x0058: ld [%l5], %l0
dbx: warning: undefined type number (0,158) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #207 sqlca:(0,158),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,211) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #530 host:(0,211),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,212) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #531 pwd:(0,212),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,213) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #532 usrid:(0,213),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,214) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #533 address:(0,214),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,228) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #577 module:(0,228),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,237) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #638 data:(0,237),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,238) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #639 dest:(0,238),
assuming type `int {assumed}'
dbx: warning: undefined type number (0,846) at /export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #1136 username:G(0,846),
assuming type `int {assumed}'
Current function is queue_message
751 EXEC SQL CALL p_message.queue(:seq,:dest, :data, :notempty, :ret);
(/opt/SUNWspro/bin/../WS6U2/bin/sparcv9/dbx) where
current thread: t@10
[1] sqlscht(0x48708, 0x47914, 0xa, 0x1, 0x3, 0xa), at 0xfe8cd0f8
[2] sqldbid(0x48708, 0xfacf8aa8, 0x4dbf8, 0xff0d180c, 0x4, 0x390), at 0xfe8b0fec
[3] sqlexp(0x48708, 0x1, 0x7a, 0x7a, 0xfacf8aa8, 0xfedecb18), at 0xfe8b4208
[4] sqlcmex(0x0, 0x4dbf8, 0xfacf8aa8, 0xff0d18a6, 0xfedecafc, 0xfee1689c), at 0xfe8ab770
[5] sqlcxt(0xfacf9cac, 0xff0e3d84, 0xfacf8aa8, 0xff0d18a6, 0xfacf9d48, 0xa), at 0xfe8abdbc
Segmentation Fault (core dumped)
Some information you might need:
Compiler : code compiled with cc version 6.2
Debugger : 6.2 patch 111683-03
OS : Solaris 8 patch 108528-12
Machine : E250
Thanks
/export/home/husam/release/lib/libstorage_s.so.1:backend.c stab #530 host:(0,211),
The key to your problem is probably the stabs reading error.
How was backend.c compiled?
Normally dbx will load stabs information on demand as needed.
You can start dbx and use the "module" command to load
the stabs information right away without running the program or using RTC.
Try this:
(dbx) module backend.o
Does that cause the stabs warnings?
If you can "repair" or "normalize" the stabs information in that module,
the crash will probably go away. -
So our customer has asked us to compare the Amazon Workspace and Azure RemoteApp offerings so they can choose between them. While looking at Amazon Workspace, it clearly defines bundles with specific CPU cores, memory, and user storage. However, Azure RemoteApp
only specifies user storage and vaguely compares its Basic vs. Standard plans in terms of "task worker" vs. "information worker".
I tried looking through its documentation but couldn't find the specific CPU cores that are dedicated per user in the Basic vs. Standard plans. I have the following questions:
Can anyone point me in the right direction or help understand how many CPU cores and memory are dedicated (or shared) per user in each plan?
Our customer would most likely need a "custom" image for their custom apps. Is it possible for us to choose specific CPU cores and memory for the users to be able to run their apps in Azure RemoteApp?
In case I am misunderstanding the basic difference between AWS Workspace and Azure RemoteApp, I'd appreciate some help in understanding that as well.
Thanks!
Hi,
With Azure RemoteApp users see just the applications themselves, and the applications appear to be running on their local machine similar to other programs. With Workspaces users connect to a full desktop and launch applications within that.
1. Azure RemoteApp currently uses size A3 Virtual Machines, which have 4 vCPUs and 7GB RAM. Under Basic each VM can have a maximum of 16 users using it whereas under Standard each VM is limited to 10 users. The amount of CPU available
to a user depends on what the current demands are on the CPU at that moment from other users and system processes that may be on the server.
For example, say a user is logged on to a VM with 3 other users and the other users are idle (not consuming any CPU). At that moment the user could use all 4 vCPUs if a program they are running needed to. If a few moments later
the other 3 users all needed lots of CPU as well, then the first user would only have approximately 1 vCPU for their use. The process is dynamic and seeks to give each user their fair share of available CPU when there are multiple users demanding CPU.
Under the Standard plan a user will receive approximately a minimum of .4 vCPU assuming that the VM has the maximum number of users logged on and that all users are using as much CPU as possible at a given moment. Under the Basic plan the approximate
minimum would be .25 vCPU.
2. You cannot choose the specific number of cores and memory. What you can do is choose the Azure RemoteApp billing plan, which affects the user density of each VM as described above. If you need a lower density than Standard you
may contact support.
-TP -
Missing Format Text, Insert, and Edit Source, etc Options on Some Pages
I'm having trouble understanding why certain pages have more edit options than others. I'm new to SharePoint Workspace 2010 and I've been reading and researching for answers for 2 days now. I tried to HyperSnap in some pictures; however, it will not allow me
to until my account is verified, and it's not sending a verification email, sorry!
Issue:
On some pages, I have the edit pencil with lots of great edit options such as Format Text, Insert and Edit Source. (This is good!)
On some pages I have no edit pencil, but when I go to Page → Edit Page, I get few edit options, not including Format Text, Insert, or Edit Source. (This is not good.)
After reading other somewhat similar posts, I noticed that on the pages where I get lots of edit options, they show SitePages in the URL. Where I get few edit options, SitePages is not in the URL.
Is there a fix so that I can have the expanded edit options on all of my pages? Thank you for your help!
Hi,
In SharePoint, edit-page options [the ribbon toolbar's page-edit formatting options] are not available for all page types; application pages cannot be edited the way site pages can.
Also, a missing edit button might be due to many causes:
Can you verify that it's not a permissions issue? Try navigating to the page in edit mode by appending to the url: your-site/Pages/YourPage.aspx?ControlMode=Edit&DisplayMode=Design
Or you can try this
your-site/Pages/YourPage.aspx?ToolPaneView=2
1. On some pages the ribbon is hidden by default; you need to make it visible by clicking "Show Ribbon" on the "Site Actions" drop-down menu.
2. You haven't done a Visual Upgrade (which actually replaces your master page with v4.master).
3. You have a different master page, and maybe it is not 100% compatible with SP2010, nor does it use the elements from v4.master (collaboration) or the example nightandday.master.
To add the button to your page you could use <SharePoint:PageStateActionButton id="PageStateActionButton" runat="server" Visible="false" /> (if you've seen it already, that would generate the button on the left side of the ribbon)!
Please post the URL of the page and tell us how you created that page; we would be happy to help further.
Krishana Kumar http://www.mosstechnet-kk.com -
Hi!
I purchased my AppleTV in Miami, but I live in Argentina.
I have not been able to get my device running correctly, and although I've read tons of things online, I still have no solution to my problem.
Here's the thing: after everything's plugged in and on, I see the Apple logo briefly, then the TV displays a sign that says no signal, check power cords and power source.
But everything is plugged in and connected properly; otherwise, I doubt I'd be seeing the Apple logo, right?
I tried changing the HDMI cable, reversing the ends, switching ports, hard resetting the ATV, and trying a different TV. No luck.
I read there is some issue with Samsung TVs, and most LED and LCD tvs down here are Samsung.
On the other hand, I cannot return it or exchange it, since I bought it in the states.
Oh! One of the tvs I tried it on displays a little sign that reads "connected at 1280x720 @60hz", and only THEN, goes to the "no signal" sign.
Please!!! I am being driven up the wall here.... HELP!!
Try this:
Change the Apple TV resolution:
Press and hold the Menu and Up button on your Apple Remote for six seconds.
The Apple TV will automatically cycle to the next resolution at approximately 20 second intervals.
Press Play on the Apple Remote to keep the current resolution or Select to manually cycle to the next resolution. -
What are the differences between the target tablespace and the source tablespace?
The IMPDP command produces many errors, but the EXAMPLE tablespace is transported to the target database successfully. It seems that the transported tablespace is no different from the source tablespace.
Why are so many errors produced?
How can these errors be avoided?
What are the differences between the target tablespace and the source tablespace?
Was this Data Pump action really successful?
The following is the log output:
[oracle@hostp ~]$ impdp system/oracle dumpfile=user_dir:demo02.dmp tablespaces=example remap_tablespace=example:example
Import: Release 10.2.0.1.0 - Production on Sunday, 28 September, 2008 18:08:31
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_IMPORT_TABLESPACE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLESPACE_01": system/******** dumpfile=user_dir:demo02.dmp tablespaces=example remap_tablespace=example:example
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
CREATE TABLE "OE"."CUSTOMERS" ("CUSTOMER_ID" NUMBER(6,0), "CUST_FIRST_NAME" VARCHAR2(20) CONSTRAINT "CUST_FNAME_NN" NOT NULL ENABLE, "CUST_LAST_NAME" VARCHAR2(20) CONSTRAINT "CUST_LNAME_NN" NOT NULL ENABLE, "CUST_ADDRESS" "OE"."CUST_ADDRESS_TYP" , "PHONE_NUMBERS" "OE"."PHONE_LIST_TYP" , "NLS_LANGUAGE" VARCHAR2(3), "NLS_TERRITORY" VARCHAR2(30), "CREDIT_LIMIT" NUMBER(9,2), "CUST_EMAIL" VARCHAR2(30), "ACCOUNT_MGR_ID" NU
ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
CREATE TABLE "IX"."ORDERS_QUEUETABLE" ("Q_NAME" VARCHAR2(30), "MSGID" RAW(16), "CORRID" VARCHAR2(128), "PRIORITY" NUMBER, "STATE" NUMBER, "DELAY" TIMESTAMP (6), "EXPIRATION" NUMBER, "TIME_MANAGER_INFO" TIMESTAMP (6), "LOCAL_ORDER_NO" NUMBER, "CHAIN_NO" NUMBER, "CSCN" NUMBER, "DSCN" NUMBER, "ENQ_TIME" TIMESTAMP (6), "ENQ_UID" VARCHAR2(30), "ENQ_TID" VARCHAR2(30), "DEQ_TIME" TIMESTAMP (6), "DEQ_UID" VARCHAR2(30), "DEQ_
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SH"."CUSTOMERS" 9.850 MB 55500 rows
. . imported "SH"."SUPPLEMENTARY_DEMOGRAPHICS" 695.9 KB 4500 rows
. . imported "OE"."PRODUCT_DESCRIPTIONS" 2.379 MB 8640 rows
. . imported "SH"."SALES":"SALES_Q4_2001" 2.257 MB 69749 rows
. . imported "SH"."SALES":"SALES_Q1_1999" 2.070 MB 64186 rows
. . imported "SH"."SALES":"SALES_Q3_2001" 2.129 MB 65769 rows
. . imported "SH"."SALES":"SALES_Q1_2000" 2.011 MB 62197 rows
. . imported "SH"."SALES":"SALES_Q1_2001" 1.964 MB 60608 rows
. . imported "SH"."SALES":"SALES_Q2_2001" 2.050 MB 63292 rows
. . imported "SH"."SALES":"SALES_Q3_1999" 2.166 MB 67138 rows
Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."REGIONS" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."REGIONS" TO "EXAM_03"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."COUNTRIES" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."COUNTRIES" TO "EXAM_03"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."LOCATIONS" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."LOCATIONS" TO "EXAM_03"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."DEPARTMENTS" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."DEPARTMENTS" TO "EXAM_03"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."JOBS" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."JOBS" TO "EXAM_03"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."EMPLOYEES" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."EMPLOYEES" TO "EXAM_03"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'USER1' does not exist
Failing sql is:
GRANT SELECT ON "HR"."JOB_HISTORY" TO "USER1"
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'EXAM_03' does not exist
Failing sql is:
GRANT SELECT ON "HR"."JOB_HISTORY" TO "EXAM_03"
ORA-39112: Dependent object type OBJECT_GRANT:"OE" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type OBJECT_GRANT:"OE" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
ORA-39112: Dependent object type INDEX:"OE"."CUSTOMERS_PK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type INDEX:"OE"."CUST_ACCOUNT_MANAGER_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type INDEX:"OE"."CUST_LNAME_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type INDEX:"OE"."CUST_EMAIL_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type INDEX:"PM"."PRINTMEDIA_PK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMER_CREDIT_LIMIT_MAX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMER_ID_MIN" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMERS_PK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type CONSTRAINT:"PM"."PRINTMEDIA__PK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
ORA-39112: Dependent object type CONSTRAINT:"IX"."SYS_C005192" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUSTOMERS_PK" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_ACCOUNT_MANAGER_IX" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_LNAME_IX" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_EMAIL_IX" creation failed
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"PM"."PRINTMEDIA_PK" creation failed
Processing object type TABLE_EXPORT/TABLE/COMMENT
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
ORA-39112: Dependent object type REF_CONSTRAINT:"OE"."CUSTOMERS_ACCOUNT_MANAGER_FK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39083: Object type REF_CONSTRAINT failed to create with error:
ORA-00942: table or view does not exist
Failing sql is:
ALTER TABLE "OE"."ORDERS" ADD CONSTRAINT "ORDERS_CUSTOMER_ID_FK" FOREIGN KEY ("CUSTOMER_ID") REFERENCES "OE"."CUSTOMERS" ("CUSTOMER_ID") ON DELETE SET NULL ENABLE
ORA-39112: Dependent object type REF_CONSTRAINT:"PM"."PRINTMEDIA_FK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
Processing object type TABLE_EXPORT/TABLE/TRIGGER
ORA-39082: Object type TRIGGER:"HR"."SECURE_EMPLOYEES" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR"."SECURE_EMPLOYEES" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR"."UPDATE_JOB_HISTORY" created with compilation warnings
ORA-39082: Object type TRIGGER:"HR"."UPDATE_JOB_HISTORY" created with compilation warnings
Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
ORA-39112: Dependent object type INDEX:"OE"."CUST_UPPER_NAME_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_UPPER_NAME_IX" creation failed
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
Processing object type TABLE_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/POST_INSTANCE/PROCACT_INSTANCE
ORA-39112: Dependent object type PROCACT_INSTANCE skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
ORA-39083: Object type PROCACT_INSTANCE failed to create with error:
ORA-01403: no data found
ORA-01403: no data found
Failing sql is:
BEGIN
SYS.DBMS_AQ_IMP_INTERNAL.IMPORT_SIGNATURE_TABLE('AQ$_ORDERS_QUEUETABLE_G');COMMIT; END;
Processing object type TABLE_EXPORT/TABLE/POST_INSTANCE/PROCDEPOBJ
ORA-39112: Dependent object type PROCDEPOBJ:"IX"."AQ$_ORDERS_QUEUETABLE_V" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE_N" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE_R" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
ORA-39112: Dependent object type PROCDEPOBJ:"IX"."AQ$_ORDERS_QUEUETABLE_E" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
Job "SYSTEM"."SYS_IMPORT_TABLESPACE_01" completed with 63 error(s) at 18:09:14
Short of trying to reverse-engineer the objects that are in the dump file (I believe Data Pump export files contain some XML representations of DDL in addition to various binary bits, making it potentially possible to scan the dump file for the object definitions), I would tend to assume that the export didn't include those type definitions.
Since it looks like you're trying to set up the sample schemas, is there a reason that you wouldn't just run the sample schema setup scripts on the destination database? Why are you using Data Pump in the first place?
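If Data Pump must be used, one possible workaround (a sketch only; the directory and dump-file names are placeholders modeled on the log above) is to export in schema mode instead of tablespace mode, since schema mode includes object type definitions, which have no segments in the tablespace and so are skipped by a tablespace-mode export:

```shell
# Schema-mode export carries the CREATE TYPE definitions (e.g. OE's
# CUST_ADDRESS_TYP) that a tablespace-mode export leaves behind.
expdp system/password directory=user_dir dumpfile=demo_schemas.dmp schemas=OE,PM,IX
impdp system/password directory=user_dir dumpfile=demo_schemas.dmp schemas=OE,PM,IX
```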
Justin -
How to open a page and close the source page in JDev 10.1.3.3
How can I open a page and close the source page in JDev 10.1.3.3?
For example:
From page 'A', we open page 'B' and close page 'A'. How can this be achieved? Please let me know.
Your question is totally unclear.
What page? a JSF page? A code editor in JDeveloper? -
How can I get a program to copy all of the files in a directory except itself, where the source of the copy function is the directory the final program is in? This application must be in LabVIEW 8.
you mean something like this (see below)?
Now you may have to implement code to check if the destination folder exists and to create it, etc. But if you use the Front Panel Control to select the destination folder, it should be okay.
Not the best implementation, mind you, but you'll get the idea.
Message Edited by JoeLabView on 04-18-2007 03:43 PM
Attachments:
copy folder contents.PNG 10 KB -
Creation of a generic extractor and data source for the FAGLFLEXA table
Hi All,
Need to create a generic extractor and data source for the FAGLFLEXA table to support AR reporting. This table contains the necessary profit center information to perform LOB reporting against the AR data.
Please advise on how to do this.
Regards, Vishal
Hi Vishal,
This seems a simple exercise:
1. Go to RSO2 and choose the relevant option, i.e. whether you want to create a Transactional DS, Master Data DS, or Text DS.
2. Name it accordingly and then create it.
3. Give it a description and then give the table name FAGLFLEXA.
4. Save it and activate it. If you need it to be delta-enabled, then click on Delta and choose accordingly.
If you still face some problem then do mail me at [email protected]
Assign points if helpful
Regards,
Himanshu -
Creation of System Alias between Portal and E Sourcing System
Hi Team,
I would like to create a System Alias between SAP Portal 7.4 and E Sourcing System 10.0. I would like to know which template I need to use during the creation of the System Object (is it HTTP System, or R/3 System using Load Balancing?).
If it is an HTTP System, I am getting "Could not find Connector Factory value: null".
Could you please help me understand what the issue could be.
Pls. note: E Sourcing is a Java-based SAP system.
Hi Sankar,
Please go through the below SAP Note and see if that helps.
711769 - SSO between SAP Java Software Products
BR,
Anurag -
Creation of connection pools and data sources
Hi,
is there a way to create JDBC connection pools and data sources not manually, but with a script or an MBean?
That would be helpful, because at the moment every developer has to do that for himself (because of the individually generated passwords).
Thanks
The weblogic.management.configuration.JDBCDataSourceMBean
defines a non-transactional JDBC data source.
http://e-docs.bea.com/wls/docs90/javadocs_mhome/weblogic/management/configuration/JDBCDataSourceMBean.html
The JDBCConnectionPoolMBean defines a JDBC connection pool.
http://e-docs.bea.com/wls/docs90/javadocs_mhome/weblogic/management/configuration/JDBCConnectionPoolMBean.html -
I've had my farm upgraded from SP2010 to SP2013 for over 6 months now and all is well, however, I was refreshing my staging environment from production and I noticed that one of the databases still shows these errors when I run test-spcontentdatabase:
Category : Configuration
Error : False
UpgradeBlocking : False
Message : The [SharePoint Web App] web application is configured with claims authentication mode however the content database you are trying to attach is intended to be used against
a windows classic authentication mode.
Remedy : There is an inconsistency between the authentication mode of target web application and the source web application. Ensure that the authentication mode setting in upgraded web application is the
same as what you had in previous SharePoint 2010 web application. Refer to the link "http://go.microsoft.com/fwlink/?LinkId=236865" for more information.
This doesn't make sense considering I converted the production web application to claims during the upgrade and then verified all sites were working with claims logins. I also verified that existing AD user identities were converted to claims by checking out
the database tables. Yet test-spcontentdatabase still thinks there is a mismatch here.
My farm is SP1 and no further CUs. The point of this particular refresh is so I can update to the November CUs in my test farm. Anyone else see this? Seems like it's a bug/safe to ignore because my stuff is working.
Thanks,
Aaron
See:
http://thesharepointfarm.com/2014/11/test-spcontentdatabase-classic-to-claims-conversion/
Trevor Seward
This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs. -
4 cores or 8 cores and which video card to choose
My father and I are presented with a dilemma: should we get the 8-core model, or the 4-core model at several hundred dollars less?
I have never actually bought a computer for myself, and I want to make this one last. At the moment, 8 cores is completely top of the line, and assuming I choose such a model, I will probably have an exceptionally capable computer that will last me many, many years.
I am most likely going to at all times be running Windows and Leopard side by side using Parallels and have a game running in Windows and developing, browsing, chatting, watching movies, ventrilo, etc. in the other space.
For this reason, would it be wise to purchase the 8 core model? It is considerably cheaper to add the extra quad core with Apple as it is nearly half the price of one found on newegg or a dell configuration. I really want this computer to last and use it for a long time and play the most top of the line games with my friends such as Age of Conan, Crysis, etc. Also, I will have a 24 inch monitor that I will also need my computer to power.
My current computer cannot even run Age of Conan at the lowest quality settings. I figure if I want to be able to run this game at 1920x1200 resolution in Parallels while running applications on Leopard, shelling out the extra three or four hundred dollars would be worth it. Seeing as I will be using this computer for years, even though most applications do not use 8 cores now, they will then.
I believe that the 4-core model of the Mac Pro does not allow for an additional Xeon quad core, so if I did buy the single quad-core model, I would not be able to add another quad core later, thus limiting me. Also, even if it were possible, it would void my warranty.
Also, I have heard of the 8800 GT causing some problems in the Mac Pro, typically in the first generation ones, however, I had seen cases of people noting problems in the January edition. I want to do a lot of gaming in this rig, and the 8800 is the de facto card of choice for most gamers, at a reasonable price. However, will it give me problems? I have heard of some people having 0 problems at all with this card. Will it give me those occasional blue and green lines through my monitor? Has Apple fully developed the drivers for this card? Will I have problems with it?
Thanks for any help and I await your responses.
Fromethius,
you cannot assign CPUs to the VM in Parallels. In VMware you can select 1 or 2 virtual CPUs for the guest OS, though that doesn't necessarily translate 1:1 to your physical CPUs.
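For reference, the virtual-CPU count in VMware is a per-VM setting in the machine's .vmx configuration file (the setting name below is from VMware's .vmx format; treat the exact value as a sketch for a 2-vCPU guest):

```ini
; fragment of a VMware .vmx file - give the guest OS two virtual CPUs
numvcpus = "2"
```

The guest still time-shares the host's physical cores with everything else running on the Mac, which is why 2 vCPUs does not mean 2 dedicated cores.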
As for games, neither VMware nor Parallels will run most recent games well; remember, shader support is only experimental right now, so your results will vary.
I play games on Mac natively, but it depends on which ones you are after.
There is no problem with the 8800 card. The issue being talked about - which is valid - is its subpar Core performance. That will hopefully be addressed in a driver update, but it won't affect you.
As for the system - get all you can afford. Don't wait for 'upgrades' - you'll never buy a system otherwise, as something new is always on the way.
I work with a lot of video, so I have an 8-core 3.2 with 16GB of RAM - and would never want less. Does that mean you need that, or that someone else doesn't need more? Not at all - you just need to look at what you actually plan to do. 4 cores and the 8800 should be just fine for you. If you can afford the 8-core and will do a lot of work at the same time, great. Otherwise, just don't skimp on memory. Anything under 4GB, IMHO, ain't good enough.
Cheers,
dan -
[solved] pacman apart from core and extra repos
I just can't get pacman to update any repos apart from core and extra since I moved to arch64 - I tried changing servers directly in pacman.conf and in the include files, and even unpacked the databases manually to /var/lib/pacman - still can't get it to work - what am I missing here?
Last edited by mykey (2008-01-07 14:19:05)
sure! - pacman.conf:
# /etc/pacman.conf
# See the pacman manpage for option directives
# GENERAL OPTIONS
[options]
LogFile = /var/log/pacman.log
NoUpgrade = etc/passwd etc/group etc/shadow etc/sudoers
NoUpgrade = etc/fstab etc/raidtab etc/ld.so.conf
NoUpgrade = etc/rc.conf etc/rc.local
NoUpgrade = etc/modprobe.conf etc/modules.conf
NoUpgrade = etc/lilo.conf boot/grub/menu.lst
HoldPkg = pacman glibc
IgnorePkg = kernel26
#XferCommand = /usr/bin/wget --passive-ftp -c -O %o %u
# REPOSITORIES
# - can be defined here or included from another file
# - pacman will search repositories in the order defined here
# - local/custom mirrors can be added here or in separate files
# - repositories listed first will take precedence when packages
# have identical names, regardless of version number
#[testing]
#Include = /etc/pacman.d/testing
[core]
# Add your preferred servers here, they will be used first
#Server = ftp://archlinux.puzzle.ch/core/os/x86_64
Include = /etc/pacman.d/core
[extra]
# Add your preferred servers here, they will be used first
#Server = ftp://archlinux.puzzle.ch/extra/os/x86_64
Include = /etc/pacman.d/extra
#[community]
# Add your preferred servers here, they will be used first
#Server = ftp://archlinux.puzzle.ch/community/os/x86_64
Include = /etc/pacman.d/community
#[unstable]
# Add your preferred servers here, they will be used first
#Server = ftp://archlinux.puzzle.ch/unstable/os/x86_64
Include = /etc/pacman.d/unstable
# An example of a custom package repository. See the pacman manpage for
# tips on creating your own repositories.
#[custom]
#Server = file:///home/custompkgs
and - pacman -Sy:
:: Synchronizing package databases...
core is up to date
extra 303.3K 373.6K/s 00:00:01 [#####################] 100%
local database is up to date
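One likely culprit, judging from the pacman.conf posted above (a guess, not a confirmed fix): the [community] and [unstable] section headers are commented out while their Include lines are still active, so those Include directives fall under the preceding repo section instead of defining new ones, and pacman never registers the repos - which matches the output showing only core and extra syncing. Uncommenting the headers should let pacman -Sy pick them up:

```ini
# corrected fragment of /etc/pacman.conf - the section header must be
# uncommented for the Include below it to define the repo
[community]
Include = /etc/pacman.d/community

# drop the [unstable] block entirely if you don't want that repo;
# otherwise uncomment its header the same way
[unstable]
Include = /etc/pacman.d/unstable
```

Also make sure the x86_64 mirrors in /etc/pacman.d/community actually exist for your architecture, then run pacman -Sy again.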