No asset was archived for AM_ASSET OBJECT
Hi Experts,
I am new to SAP and Data Archiving.
Please help me.
What should be the residence time for the AM_ASSET object?
And if I try to archive AM_ASSET with the residence time as 0 or blank, the job log shows the message "no asset was archived". What could be the reason?
Regards,
Mudiyappa
If you are new to SAP and archiving, then you should start with the documentation, which can be found at help.sap.com.
The easiest way to get to the documentation is to use the archiving object name,
e.g. in Google just enter
AM_ASSET site:help.sap.com
Once you open a link like Archiving FI Asset Data (FI-AA) - Financial Accounting (FI) - SAP Library,
you can find the general introduction to data archiving as the first entry in the navigation menu on the left.
This will keep you busy for some hours (or days, if you take the chance to test what you read).
The object-specific documentation also has sections about prerequisites and residence times.
In any case, before you carry out an archiving job, you should make sure that you have the most recent OSS notes with bug fixes in your system. Archiving is something final; there is not much you can do once the data is gone. So go to the service.sap.com/notes site, search again with the archiving object name, read those notes, and decide which ones need to be in your system. Among them you will find plenty of information to get a deep understanding of archiving in general and of this specific object. Some of it is in dedicated consulting notes, some just as explanation next to bug fixes.
And if you execute an archiving job as a beginner, always use the detailed log to get full information. Remember that the job log relates to the technical execution of the job; the log with the error messages from archiving is found in the spool file of the job.
Similar Messages
-
Regarding data archiving for SD objects.
Hi All,
Can you please help me to learn data archiving of SD objects.
I am very new to SDN and this is my first thread.
Hi Hariprasad,
You can find a lot of information relating to archiving SD data in SAP Help.
Here is the link you need
Archiving SD data: http://help.sap.com/saphelp_erp2005vp/helpdata/en/d1/90963466880c30e10000009b38f83b/frameset.htm
Hope this helps
Cheers!
Samanjay -
Are there REST APIs to retrieve entity metadata for eloqua objects?
There is a list of all the objects which can be accessed by REST for CRUD in this link: REST API - Documentation for Core Objects under the Core Objects section.
For each of the objects listed under the Core Objects section, there is field metadata under the Properties section.
For example, for the Email object (REST API - Accessing Emails), under the Properties section there are corresponding entries for the fields of the Email object under the
Name, Type, Description and Validations headings.
Is there a REST API for retrieving the same information, i.e. the field metadata for an Eloqua object, programmatically?
If not, it is a serious hindrance to building systems that are metadata driven, especially since SOAP support is being deprecated...
Metadata is 'top level' information on the object, and it is available whether you query an individual object (a single form, or email asset) or query for multiple objects of that type (list all forms, list all emails). Consider using a depth of minimal or partial for faster performance if the specific configuration of those objects is not important.
Example:
GET /assets/forms?depth=minimal&count=2
Returns:
{
  "elements": [
    {
      "type": "Form",
      "currentStatus": "Draft",
      "id": "19",
      "createdAt": "1409623550",
      "createdBy": "8",
      "depth": "minimal",
      "folderId": "7",
      "name": "zzztestCS_3-9381543541_AutocompleteTest",
      "permissions": "fullControl",
      "updatedAt": "1409623623",
      "updatedBy": "8"
    },
    {
      "type": "Form",
      "currentStatus": "Draft",
      "id": "22",
      "createdAt": "1409781207",
      "createdBy": "11",
      "depth": "minimal",
      "folderId": "466",
      "name": "daisychain1",
      "permissions": "fullControl",
      "updatedAt": "1412779449",
      "updatedBy": "20"
    }
  ],
  "page": 1,
  "pageSize": 2,
  "total": 130
}
Without limiting the count to 2, this would return up to 1000 results if you had that many forms in your system, giving you a basic top-level view of each. Similarly, you can use GET /assets/form/{id}?depth=minimal to get the same sort of information for a single form.
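To make this concrete, here is a minimal Python sketch (an illustration, not official Eloqua sample code) that builds the minimal-depth listing URL and pulls the top-level metadata out of a response shaped like the example above. The BASE_URL is a placeholder; you would substitute your own pod URL and an authenticated HTTP client.

```python
# Hypothetical sketch: constructing the minimal-depth form listing request
# and reading the top-level metadata out of the response.
BASE_URL = "https://secure.eloqua.com/api/REST/1.0"  # placeholder, pod-specific

def minimal_forms_url(base, count=2):
    """URL for a minimal-depth listing of forms, as in the example above."""
    return f"{base}/assets/forms?depth=minimal&count={count}"

def summarize(response):
    """Pull the top-level metadata (id, name, updatedAt) from each element."""
    return [{"id": e["id"], "name": e["name"], "updatedAt": e["updatedAt"]}
            for e in response.get("elements", [])]

# Offline demonstration using the sample payload shown above:
sample = {
    "elements": [
        {"type": "Form", "currentStatus": "Draft", "id": "19",
         "name": "zzztestCS_3-9381543541_AutocompleteTest",
         "updatedAt": "1409623623"},
        {"type": "Form", "currentStatus": "Draft", "id": "22",
         "name": "daisychain1", "updatedAt": "1412779449"},
    ],
    "page": 1, "pageSize": 2, "total": 130,
}
print(minimal_forms_url(BASE_URL))
print(summarize(sample))
```

The same summarize helper works for the single-object GET /assets/form/{id}?depth=minimal response, since the per-element fields are identical.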
Other endpoints can be found on the REST livedocs page here (requires authentication):
https://secure.eloqua.com/api/docs/Dynamic/Rest/1.0/Reference.aspx
Regards,
Bojan -
UNABLE TO DO WRITE-UP FOR ASSET AS IT IS ASKING FOR COST OBJECT.
Hi
I am unable to do a write-up for an asset; an error message says that the cost element requires a cost object assignment.
But in the asset master we have a cost object assignment.
I have checked all my configuration and it seems to be correct, but I don't know why I am getting the error.
Can anyone help me regarding this?
Hi,
I am giving you a detailed description of the error I am getting.
Account 60210 requires an assignment to a CO object
Message no. KI235
Diagnosis
You have not defined a CO account assignment for an account that is relevant to cost accounting.
System Response
Account 60210 is defined as a cost element.
This means that you must always specify a CO account assignment.
Procedure
Enter one of the following CO account assignments
Order
Cost center / cost center/ activity type
Sales order item (for a project or cost relevant)
Project / WBS element
Cost object (Process manufacturing)
Network / network activities
Business process
Profitability segment
Real estate object
The posting row affected is 0000000002, account 60210.
But in the asset master we have done the cost object assignment.
Waiting for a quick response.
Edited by: Smruti Mohanty on Nov 14, 2008 11:30 AM -
Dear all,
I am troubleshooting a critical error showed up on Event log. It said:
Insufficient sql database permissions for user 'Name:domain\wss_search ....... EXECUTE permission was denied on the object 'proc_GetTimerRunningJobs', database 'SharePoint_Config', schema 'dbo'
domain\wss_search is the default content access account. According to
http://technet.microsoft.com/en-us/library/cc678863.aspx I should not grant it the Farm Administrators permission.
In the Search Center I am able to search out documents as expected so I think the search service is fine. However I have no clue why this account is trying to access 'proc_GetTimerRunningJobs'.
Mark
Hi Mark,
This issue was caused by the search account's permissions. To resolve it, please do the following:
1. Expand your SharePoint configuration database 'SharePoint_Config' and navigate to 'proc_GetTimerRunningJobs' under Programmability -> Stored Procedures
2. Right-click proc_GetTimerRunningJobs and choose Properties
3. Click on Permissions in the left pane
4. Click the Search button and browse for 'WSS_Content_Application_Pools'
5. Grant 'Execute' permission to 'WSS_Content_Application_Pools'
6. Click OK
Here are some similar posts for you to take a look at:
http://adammcewen.wordpress.com/2013/03/01/execute-permission-denied-on-sharepoint-config-db/
http://technet.microsoft.com/en-us/library/ee513067(v=office.14).aspx
I hope this helps.
Thanks,
Wendy
Wendy Li
TechNet Community Support -
Problem in Archiving for object FM_BEL_CA
Dear all,
Currently I am working on an archiving project for a system that still uses R/3 4.7. One of the problematic archiving objects is FM_BEL_CA.
My questions regarding this object are the following:
1. In a test run of the Write process in transaction SARA, the system only shows the document types that can or cannot be archived. Is it possible that the result of the Write process in SARA also mentions the document number?
2. Why does the system request full authorization for all FM areas?
3. Is there any way to restrict the archiving to certain FM areas only?
4. When I check the information structure of this object, there is no information structure for FM_BEL_CA.
5. Is it mandatory to create an information structure for this object?
6. What is the impact on reports on the application side if we do not maintain an information structure in transaction SARI?
7. Is there any support package in relation to this object?
Is there anyone out there can help me?
Thank you.
Azni
Hi Ajay, thanks for your information.
I have tried to create the recommended info structure. In the process of info structure creation, there is no object FM_BEL_CA in the object list. Is there any reason for this?
My next question: in practice, is it common to archive this object?
My question arose after I read Note 114628. Archiving this object removes the archived data from the database with no way to retrieve it. So if any payment is received in a following period or fiscal year after the archiving key date, the payment will be posted as unassigned revenue/expense.
I would appreciate your support.
Thank you so much.
Azni -
No log was created for object SDOK and sub-object QUEUEING
Hi All,
I am getting the error "No log was created for object SDOK and sub-object QUEUEING" while uploading a document.
Error is triggered at the call of Function Module:
CALL FUNCTION 'SDOK_PHIO_STORE_CONTENT'
Kindly help with the solution.
thanks,
Arti.
Hi Arti,
Kindly take help from an ABAP colleague and resolve it in the following manner:
Check SAP Note 1173675 and error 26296.
Check and maintain the entries in table SDOKPROP.
I hope this will resolve the query.
Regards,
Ravindra -
Hi,
Every time I start iTunes, it crashes with the following error :
incorrect checksum for freed object - object was probably modified after being freed.
The problem occurs after I last added new tunes from external mp3 files.
Maybe a corrupted file ?
How can this make iTunes crash ?
Anyway, anybody knows how to find the guilty file ?
Are there any logs ?
Thanks.
In the Console window, select
DIAGNOSTIC AND USAGE INFORMATION ▹ User Diagnostic Reports
(not Diagnostic and Usage Messages) from the log list on the left. There is a disclosure triangle to the left of the list item. If the triangle is pointing to the right, click it so that it points down. You'll see a list of crash reports. The name of each report starts with the name of the process, and ends with ".crash". Select the most recent report related to the process in question. The contents of the report will appear on the right. Use copy and paste to post the entire contents—the text, not a screenshot.
I know the report is long, maybe several hundred lines. Please post all of it anyway.
If you don't see any reports listed, but you know there was a crash, you may have chosen Diagnostic and Usage Messages from the log list. Choose DIAGNOSTIC AND USAGE INFORMATION instead.
In the interest of privacy, I suggest that, before posting, you edit out the “Anonymous UUID,” a long string of letters, numbers, and dashes in the header of the report, if it’s present (it may not be.)
Please don’t post other kinds of diagnostic report—they're very long and rarely helpful. -
IMovie '09 as archive for all my video assets?
Hi,
I have been searching the posts but couldn't find an answer to my (somewhat general) question: I've started to use iMovie '09 as an archive for all my DV and HDV clips, as I like the way it gives access to all clips once imported (like iTunes). IMO this is better than the clips view pane in iMovie HD.
Now, I have been reading that iMovie '09 reduces video quality on import ("skipping every other line" or so), as it's designed for lower-quality internet publishing. My intention is not to publish on the internet, but to have convenient access to all the video captured over the years. The idea is to eventually watch edited video on my HD TV using Apple TV.
However, if the video quality is reduced, it's not a tool to archive my clips. Should I a) go back to iMovie HD, b) use FCE instead, or c) look for something completely different?
Any thoughts from the experts are highly appreciated.
thanks!
A couple of reasons. The main reason is that I wanted a way to archive the video, since my cam is SD-card based (no more tapes). The archive is also a much smaller file than the imported files. Since I use the video on ATV, I import at Large and save to ATV format. In the future, if the hardware allows an easier way to play the video at its full resolution (I film everything on the highest setting, 1920x1080), then I can go back to the archive and import at Full.
-
Archiving MM_MATNR - dependent objects
Experts
I'm trying to archive MM_MATNR (material master) and I understand that all dependent objects need to be archived before the material master can be archived.
My question is: what is the most effective way of identifying and archiving dependent objects, collectively or individually?
Any assistance will be much appreciated, but please don't post the following links or anything from them (I have read their contents and want more than they offer):
A step-by-step process will be much appreciated...
[http://www.sap-img.com/bc003.htm]
[http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/90f75a80-867f-2d10-7aa6-ac164e43e89f?quicklink=index&overridelayout=true]
Regards
Sylvester
Edited by: Sylvester Mokgwe on Jan 11, 2011 5:16 AM
You really have a lucky hand in finding poor documentation. Why don't you search directly in help.sap.com?
The technical archiving process is well documented there, step by step.
Further, you should search OSS notes with the key "MM_MATNR"; there are plenty, so make sure you are current with notes before you start. Even more interesting in the beginning are the consulting notes there, explaining the errors you may discover when archiving materials.
Of course you have to do this for any object that you need to archive first.
Which objects these can be, in theory, can be seen in the network graphics; this answer was already given.
And you have such a network graphic for any object that you need to archive first...
Of course you do not need to archive every object there, just the objects you use, and only those records from each object that are relevant to the materials you want to archive.
Usually you set up an archiving project, defining residence times for all potential archiving objects; then you do the customizing, and finally you start testing and then real archiving. This is explained in much more detail at help.sap.com and service.sap.com/plm.
And do not underestimate the effort. You need consultants from every module, people from the business, and it is good to have auditors in the project as well. You will certainly face errors; any process that is not defined and executed cleanly from beginning to end will give you headaches.
I had a 50-day project (just my time) to get familiar with the object, do its customizing and execution (this can differ from object to object, even though you start everything from SARA) and of course the documentation. I have already spent more than 70 days over 3 years (it is not a high-priority project) and I have not yet archived materials that were in use.
If a material was never used in dependent objects, then you can archive it right away.
SAP does extensive checks (and you may have to add your own additional checks in user exits) to avoid screwing up your production system (most archived data cannot be retrieved into the database). My system takes between 1 and 3 minutes to archive one material (or to finally fail because of an error).
Getting contacts updated in sync logs after pushing data for activity object
Hi,
I used below snippet to create Import Definition for Activity Object using Eloqua Bulk 2.0.
{
  "syncActions": null,
  "isSyncTriggeredOnImport": "true",
  "name": "External Activity via Bulk API",
  "updateRule": "always",
  "secondsToRetainData": "3600",
  "fields": {
    "C_EmailAddress": "{{Activity.Contact.Field(C_EmailAddress)}}",
    "CampaignID": "{{Activity.Campaign.Id}}",
    "AssetName": "{{Activity.Asset.Name}}",
    "AssetType": "{{Activity.Asset.Type}}",
    "AssetDate": "{{Activity.CreatedAt}}",
    "ActivityType": "{{Activity.Type}}"
  }
}
After pushing data for this import definition, I checked the sync logs and found that 2 contacts were updated.
Why did this happen?
Here is my sample data:
[
  {
    "C_EmailAddress": "[email protected]",
    "CampaignID": 32,
    "AssetName": "Tradeshow",
    "AssetType": "Tradeshow",
    "AssetDate": "2014-05-12",
    "ActivityType": "Visited Booth"
  },
  {
    "C_EmailAddress": "[email protected]",
    "CampaignID": 32,
    "AssetName": "Tradeshow",
    "AssetType": "Tradeshow",
    "AssetDate": "2014-05-12",
    "ActivityType": "Visited Booth"
  }
]
Please find sync logs in attachment.
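For reference, here is a small Python sketch of the Bulk 2.0 activity-import flow described above. This is an illustration under assumptions, not the verified cause: it builds the import definition payload and notes where each step posts. One plausible explanation for the "contacts updated" entries is that each activity row references a contact via C_EmailAddress, so importing the activities touches the matching contact records.

```python
# Hypothetical sketch of the Bulk 2.0 external-activity import flow.
# Base URL and credentials are placeholders you would supply yourself.

def activity_import_definition(name):
    """Build the import definition payload for an external activity import."""
    return {
        "syncActions": None,
        "isSyncTriggeredOnImport": "true",
        "name": name,
        "updateRule": "always",
        "secondsToRetainData": "3600",
        "fields": {
            "C_EmailAddress": "{{Activity.Contact.Field(C_EmailAddress)}}",
            "CampaignID": "{{Activity.Campaign.Id}}",
            "AssetName": "{{Activity.Asset.Name}}",
            "AssetType": "{{Activity.Asset.Type}}",
            "AssetDate": "{{Activity.CreatedAt}}",
            "ActivityType": "{{Activity.Type}}",
        },
    }

# The flow is then roughly:
#   1. POST the definition to /bulk/2.0/activities/imports -> response includes a "uri"
#   2. POST the activity rows to {uri}/data
#   3. Because isSyncTriggeredOnImport is "true", the sync starts automatically;
#      otherwise you would create a sync referencing the import uri.
# Each row names a contact through C_EmailAddress, which is likely why the
# sync log reports contacts as updated alongside the imported activities.
rows = [
    {"C_EmailAddress": "[email protected]", "CampaignID": 32,
     "AssetName": "Tradeshow", "AssetType": "Tradeshow",
     "AssetDate": "2014-05-12", "ActivityType": "Visited Booth"},
]
```

With two distinct rows (as in the sample data), two contact records would be matched, which matches the "2 contacts updated" entries in the sync log.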
thanks
Hi,
I recently changed jobs and want to archive all email (with attachments in place) for the associated work email account. I also plan to remove the old work email account (IMAP) from the list of active mailboxes in Apple Mail. I don't want Spotlight searches for personal or current work email cluttered with results from my old job.
I have created an archive of the old work account by doing Mailbox Archive for all of the IMAP folders for that account. The entire archive is 1.7GB in size.
Before I delete the old account from my Mail preferences, I want to make sure that I have really exported everything. So I've compared the size of the new archive folder with that of the mailbox folder in users/.../Library/Mail for the same account. The latter is 8.7GB in size. Why the enormous difference?
Can the difference be attributed to lack of compression and indices (which would be rebuilt if I import the archive) for spotlight searches? Am I missing something in the archive that I created?
I've created a zip of the system folder. Can I easily restore this at a later time if I delete the email account from my Mail preferences?
Is there a better way to achieve my objective?
Mitoids
I did find a "Recovered Messages" folder by looking in the Mail client under "On My Mac" (I don't normally use that feature). One message with an accidental 99MB attachment that I never sent was recovered ~60 times, consuming 5.7GB.
I deleted all of those, but the size of the ~/Library/Mail/xxxaccount folder did not change much. It is still much larger than the archive I created. BTW, this is an IMAP account that was syncing with corporate GMail.
Re: Bloated clients (was: pattern for persistence management)
Geoff,
From what I've understood of Forte's partitioning behaviour, the only way to get rid of b, c, d and e Projects from the client partition is by
deploying them as a shared library and having the Client dynamically
load them at run-time. Any other way, including creating a complex
structure of interfaces to avoid the client referencing the other
projects directly, or even deploying the Server projects as another
Application, will not prevent these projects from being included in the
Client Partition.
The problem I guess is that when we think along the lines of components
we immediately associate them with Service Objects. Clearly, it's the
Project and not the SO which can be classified as a component in Forte.
The question I have is: now that we know how Forte's partitioning
mechanism works, would it significantly impact the way we partition our
apps? I'm not so sure.
Eric
----Original Message Follows----
Date: Thu, 22 Oct 1998 14:52:29 -0400
From: Geoff Puterbaugh <[email protected]>
To: [email protected]
Subject: Bloated clients (was: pattern for persistence management)
Reply-To: Geoff Puterbaugh <[email protected]>
This topic is of keen interest to me as well. At my customer's
urging, I wrote a little test application spread across
four or five projects, each supplying the next..
a << b << c << d << e
The projects b,c,d, and e each contain a class and a service
object based on that class. I partitioned it (a client
partition and four server partitions for b c d e) and made
a distribution. Then I looked at the client partition
and discovered b c d e were projects for it!
If you code-generate this and run fcompile, you will see
that all the code for all the projects in the entire chain
of supplier projects winds up in the client partition.
To be clear about this, if project e contains a class X,
you will see the source code for X.init and all other
X methods in the client partition.
This appears to be the default behavior, and the open
question is how to change the default behavior, if that
can be done.
All my best,
Geoff
P.S. If e uses GenericDBMS as a supplier, it will show up
in the client too. Of course!
This is at least my understanding as of today.
To unsubscribe, email '[email protected]' with
'unsubscribe forte-users' as the body of the message.
Searchable thread archive <URL:http://pinehurst.sageit.com/listarchive/>
-
Hi all,
I am doing archiving for an R/3 archiving object that has never been archived. I first started preprocessing of the object with a date constraint, and it is taking a long time. Does anyone know why it takes this much time?
hi
good
For the continuous operation of an SAP system, the archiving of R/3 application data plays an important role.
Only regular data archiving at short intervals keeps the database and the SAP data archiving performance optimal, thus facilitating system administration. Otherwise, the database continues to grow, which reduces performance.
Check this SAP help on the Archiving Workbench:
http://help.sap.com/saphelp_nw04/helpdata/en/39/7ac81bff5ed74e87649b1368aad37c/frameset.htm
For 4.7
http://help.sap.com/saphelp_470/helpdata/en/6d/56a06a463411d189000000e8323d3a/frameset.htm
For 4.6
http://help.sap.com/saphelp_46c/helpdata/en/6d/56a06a463411d189000000e8323d3a/frameset.htm
Check these links on archiving:
http://help.sap.com/saphelp_nw04/helpdata/en/87/b12642aaf1de2ce10000000a1550b0/frameset.htm
http://help.sap.com/saphelp_erp2005/helpdata/en/5c/11afa1d55711d2b1f80000e8a5b9a5/frameset.htm
http://www.sap-press.com/downloads/h956_preview.pdf
http://help.sap.com/saphelp_webas610/helpdata/en/5c/11afaad55711d2b1f80000e8a5b9a5/content.htm
http://help.sap.com/saphelp_nw2004s/helpdata/en/e9/c36642ea59c753e10000000a1550b0/frameset.htm
http://help.sap.com/saphelp_47x200/helpdata/en/2e/9396345788c131e10000009b38f83b/frameset.htm
Re: Archiving
reward point if helpful.
thanks
mrutyun^ -
Exporting and Importing Statistics for Schema objects.
Hello All,
I am trying to gather stats for optimization using gather_schema_stats for all objects under the schema. The manual I am reading says it covers both tables and indexes, and that we can also include all partitions. For safety reasons, a friend advised taking a backup of the old stats into a user-defined table, so that if the new stats cause performance issues we can fall back to the backup copy. So I have created a user table and exported the stats from the existing schema. My question is: do I have to create a separate table for the indexes too, or is it all in one table? I hope my question is clear.
Generally, manuals don't teach this kind of thing unless we learn it ourselves by trial and error, or get solutions from great resolvers like you.
Hope to hear soon.
Thanks in advance.
One table is enough for both table and index stats. A little test for you:
SCOTT@demo102> exec DBMS_STATS.CREATE_STAT_TABLE('SCOTT','MYTBL');
PL/SQL procedure successfully completed.
SCOTT@demo102> desc mytbl
Name Null? Type
STATID VARCHAR2(30)
TYPE CHAR(1)
VERSION NUMBER
FLAGS NUMBER
C1 VARCHAR2(30)
C2 VARCHAR2(30)
C3 VARCHAR2(30)
C4 VARCHAR2(30)
C5 VARCHAR2(30)
N1 NUMBER
N2 NUMBER
N3 NUMBER
N4 NUMBER
N5 NUMBER
N6 NUMBER
N7 NUMBER
N8 NUMBER
N9 NUMBER
N10 NUMBER
N11 NUMBER
N12 NUMBER
D1 DATE
R1 RAW(32)
R2 RAW(32)
CH1 VARCHAR2(1000)
SCOTT@demo102> create table mytable_obj as select * from all_objects;
Table created.
SCOTT@demo102> create index myindex on mytable_obj(object_id);
Index created.
SCOTT@demo102> exec dbms_stats.gather_table_stats(ownname=>'SCOTT',tabname=>'MYTABLE_OBJ',cascade=>true);
PL/SQL procedure successfully completed.
SCOTT@demo102> select last_analyzed from all_tables where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102> select last_analyzed from all_indexes where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102> exec dbms_stats.export_table_stats('SCOTT','MYTABLE_OBJ',stattab=>'MYTBL')
PL/SQL procedure successfully completed.
SCOTT@demo102> exec dbms_stats.delete_table_stats('SCOTT','MYTABLE_OBJ');
PL/SQL procedure successfully completed.
SCOTT@demo102> select last_analyzed from all_tables where table_name='MYTABLE_OBJ';
-->No value
SCOTT@demo102> select last_analyzed from all_indexes where table_name='MYTABLE_OBJ';
-->No value
SCOTT@demo102> exec dbms_stats.import_table_stats('SCOTT','MYTABLE_OBJ',stattab=>'MYTBL');
PL/SQL procedure successfully completed.
SCOTT@demo102> select last_analyzed from all_tables where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102> select last_analyzed from all_indexes where table_name='MYTABLE_OBJ';
07/06/06
SCOTT@demo102>
Nicolas.