What are all of these tables in my database using 10gEx?
Hi, I just installed SQL Developer and managed to connect to my 10g XE database.
I see tens of tables and views in my database. Can I delete them and start building my own tables?
Why do they have such oddly structured names, like AQ$_INTERNET_AGENT_PRIVS?
Can I build a schema in my database and use that schema for my data?
Why are they not categorized into schemas? Or maybe it is SQL Developer's fault for showing them like this?
Thanks
Short answer:
a) These are Oracle-supplied tables. Don't mess with them.
b) By all means. It is best practice to create a separate schema and keep your objects (tables, indexes, etc.) in that schema. In fact, build lots of schemas: one per application.
Long answer:
Oracle has a bunch of support schemas that are delivered in typical installations.
You also want to look at the schema owner of these tables and views to get some idea of what they are and why they are used. For example:
a) Userids:
CTXSYS: Oracle Text feature and capabilities
DBSNMP: Oracle system management
FLOWS_020100: Application Express
FLOWS_FILES: Application Express
HR: Database and Application Express 'main & demo' user
MDSYS: Oracle Locator feature and capabilities (subset of Spatial option)
OUTLN:
PUBLIC: general public userid
SPATIAL: Oracle Locator feature and capabilities (subset of Spatial option)
SYS: the database owner
SYSTEM: the database administrator
TSMSYS:
XDB: XML and HTTP features and capabilities
Stuff owned by SYS is sacrosanct: never, ever change it directly. Especially anything that includes a '$' or a '#' in the name.
Some specific table prefixes and functionality:
AQ - advanced queueing
AUDIT - system auditing
CDC - change data capture
DBMS - generally the utility packages that make Oracle special
DIR - directory support (I think)
EXP - export/import support
HS - heterogeneous services
LOGMNR - log miner
SCHEDULE - scheduler
STREAMS - advanced queueing streams component
XDB - XML DB
Most of these are documented in the feature-related docco at http://www.oracle.com/pls/db102/portal.portal_db?selected=1 - a place you should visit often, once you are comfortable with the docco at http://www.oracle.com/pls/xe102/homepage
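For (b), a schema in Oracle is just a user that owns objects. A minimal sketch of creating one in XE (the user name, password, and tablespace here are examples, not anything Oracle-supplied):

```sql
-- Create an application schema and give it room to build tables
CREATE USER myapp IDENTIFIED BY change_me
  DEFAULT TABLESPACE users
  QUOTA UNLIMITED ON users;

-- Just enough privileges to log in and create basic objects
GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW, CREATE SEQUENCE TO myapp;
```

Connect as that user in SQL Developer and everything you create lands in your own schema, well away from the Oracle-supplied ones.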
Message was edited by:
forbrich
By the way - feel free to join us over in the XE forum, where XE specific questions are discussed. Registration link on the XE 'database home page' if you have the database installed and running properly.
Similar Messages
-
What are all the scenarios in which we will open the database in OPEN RESETLOGS mode
What are all the scenarios in which we will open the database in OPEN RESETLOGS mode? Please advise.
Another situation I could think of is when the active/current online redo logs are lost.
Regards,
Jaffar -
What are all types of tables in EP
Hi,
May I know if there are any special tables in EP? When I searched for that I found only 3 types of ABAP tables, but I want to know if we have any tables to use for EP.
And what is the GenericInterce in Web Dynpro, and why should we use it?
The concept of Web Dynpro is to avoid writing code, so my question here is: why should we learn the APIs in Web Dynpro?
regards,
rahul.
Hi Rahul,
webdynpro applications and tutorials
<a href="https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/webcontent/uuid/28113de9-0601-0010-71a3-c87806865f26?rid=/webcontent/uuid/8921447c-0501-0010-07b4-83bd39ffc7be">Getting Started</a>
<a href="https://www.sdn.sap.com/irj/sdn/developerareas/webdynpro">Webdynpro Java</a>
<a href="https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/webcontent/uuid/28113de9-0601-0010-71a3-c87806865f26?rid=/library/uuid/49f2ea90-0201-0010-ce8e-de18b94aee2d">Webdynpro Sample Applications and Tutorials</a>
https://www.sdn.sap.com/irj/sdn/thread?threadID=139467
https://www.sdn.sap.com/irj/sdn/thread?threadID=200852
Very useful
http://help.sap.com/saphelp_nw04/helpdata/en/5c/1b76bc3da0504e8b535cf3e154eaa7/content.htm
Regards,
<b>Ramganesan K</b> -
What are all the keywords that are not recommended in user exits or enhancements
Dear Experts,
Can any one of you tell me which keywords are not recommended for use in user exits or enhancement spots?
Regards,
Mohana
Moderator message: please search for available information/documentation.
Edited by: Thomas Zloch on Feb 22, 2011 4:10 PM -
What are all the tables used for this report:
hi
what are all the tables used for this report:
report:
<b>Stock Report, which will give opening balance, receipt, issue, and closing balance for any given Duration for any material.</b>
Thanks in advance.
Tables: MSEG, MKPF, MARD.
For reference, see transaction MB5B.
Message was edited by: Sharath kumar R -
Dear experts, I am quite a newbie when it comes to understanding the Mac filing system, as I originally came from the PC world. There are lots of things I don't understand in the iMac file viewer, and how to organise and back up my photos is an important issue for me.
I know that if I want to look at the original photo files on my iMac, I can right-click on users/myname/pictures and select "Show Package Contents".
Question 1 - What are all these folders for?
Please can someone explain the difference between all the folders I see? Some of them seem to be exact duplicates of the others, e.g. Masters, Modified and Originals all seem to have the same content. So here is a list of the folders I see. What is in them, and what is their purpose?
Data
Data.noindex
Modified
Originals
Apple TV Photo Cache
Attachments
Auto Import
Backup
Caches
Contents
Database
iLifeShared
iPod Photo Cache
Masters
Previews
ProjectCache
Thumbnails
Question 2 - Which photo folder should I back-up?
If I want to keep a physical backup of my photos, which of the above folders should I copy to an external hard drive? (I use Get Backup to automatically copy all important new or changed files to an external drive)
Question 3 - Using the cloud: What is the best way to backup my large photo library in the cloud safely?
I would like to have some kind of safe backup in the cloud for my photos. However the size of the iphoto library is huge at 165GB. Even the Masters folder is huge. It is 130GB. Is it possible to back up files of this size in the cloud? I have a couple of services called photo streaming and Dropbox, but they don't seem to be able to handle this kind of size. Photo streaming only works with 1000 photos (as far as I can tell), and my Dropbox probably has a limit too. I guess it's about 5GB. I am already using about 3GB of my Dropbox space for other files. I would consider both paid and free solutions.
Many thanks to all the experts for your help!
>I know that if I want to look at the original photo files on my iMac, I can right-click on users/myname/pictures and select "Show Package Contents".
Don't do that. That's like opening the hood of your car and trying to figure out what all the different bits and pieces are and which you can yank out and dispose of. Simply put, there are no user-serviceable parts in here.
So, your Question 2:
You back up the iPhoto Library as a single unit.
Most Simple Back Up:
Drag the iPhoto Library from your Pictures Folder to another Disk. This will make a copy on that disk.
Slightly more complex: Use an app that will do incremental back ups. This is a very good way to work. The first time you run the back up the app will make a complete copy of the Library. Thereafter it will update the back up with the changes you have made. That makes subsequent back ups much faster. Many of these apps also have scheduling capabilities: So set it up and it will do the back up automatically.
Example of such apps: Chronosync - but there are many others. Search on MacUpdate or the App Store
Your question 3:
There is no good back up to the Cloud. There are a couple of reasons for this. One is that the datasets are so large and the cloud services shape their download speeds. This means restoring can take days to complete. Then, and this is the big problem, the iPhoto Library needs to sit on a disk formatted Mac OS Extended (Journaled). Bluntly, no servers online are formatted appropriately, and if the Library is written to - by an incremental back up, for instance - there is a very high likelihood that the library will be corrupted.
Your Question 1:
The Library you're describing there sounds like one that has been updated a few times. Not everything you list there is a folder. Some will be aliases.
The Data folders hold thumbnails.
The Masters and Originals folders hold the files as imported from the camera
The Previews hold the versions of the edited photos that are accessed via the Sharing mechanism.
I think if you look closely you'll notice that one of the Data folders and one of either the Masters or the Originals folders is actually an alias.
Everything else is a database or cache file of some form. All are required for iPhoto to work.
As an FYI:
For help accessing your photos in iPhoto see this user tip:
https://discussions.apple.com/docs/DOC-4491 -
How to fetch what are all the tables used in this TR no and Package name of
Hi Friends,
I have input of Transport Request no (E070-TRKORR).
How can I fetch all the tables used in this TR no. and the package name of each table?
Hi,
FYI
SELECT e071~obj_name, e070m~devclass
FROM e071, e070m
WHERE e071~trkorr = '<your request no.>'. -
What are all the tables updates..
Hello experts,
Do we have any option provided by SAP to find which tables get updated when we run a transaction (e.g. VA01)?
Thanks & Regards,
Prakash Reddy S
>
Naveen Inuganti wrote:
But please don't underestimate my answer, as it definitely meets all the angles of that question.
The problem with your answer is that it requires lots of manual work, and that will bring lots of possible errors as well (and in the case of transactions like VA01, "lots of" has to be understood as "LOTS OF").
>
Naveen Inuganti wrote:
Ex: If you do not have authorization or the correct inputs, or don't know the screen sequences for VA01, then how can people analyze it from ST05?
ST05 will only give you a list of DB tables; you don't have to be aware of screen sequences (I believe 99% of developers are not). And if you don't have authorization, then you have to ask for it.
(My original concern was that if someone searches SCN and finds this thread, they will think the most points were given to the best answer, but the point assignment was changed meanwhile.) -
What are the functions of tables in ods
Hi,
what are the functions of tables in ods.
new data table
change log table
active data table.
How 0recordmode work with these 3 tables.
and how sid will work with ods?
can anyone give detailed explanation.
please do not give sap help links.
Thanks,
madhu
Thanks
Hi,
The new data table holds the newly loaded data in the ODS. If you start a new request to load data into your ODS, it first goes into the new data table. Then you normally activate the data, either manually or by process chain. The data is then copied over from the new table to the active table. The change log table holds the delta information: what will change during activation, which records will be overwritten with new data, and which data is new at all.
With 0RECORDMODE you can specify which update mode, or more exactly which delta mode, will be used. The SID is necessary for master data, because the ODS holds only the SID values of the used characteristics and not the characteristic values themselves. It's part of the normalization used in BI.
For more details I recommend to look here in SDN, go to a class or read help.sap.com pages.
Regards,
Juergen -
How to identify what are all the errors appears in process chain,
Hi all,
I have a process chain running, but I want to find out all the errors that the process chain has thrown.
thanks
pooja
Hi Pooja,
Errors in monitoring:
SID error
Reason: If the corresponding master data is missing for the particular transaction data.
1. Check the load timing of the respective master data; if it is less than an hour, then make the request red and repeat the IP.
2. If the data is loaded to PSA, then you have to delete the request from the target and manually push the data from PSA.
3. If a selective update is required, then note down the InfoSource from the header and find it in RSA1; select the one with 'Single' as postfix.
4. Go to the data selection tab and change the range.
Tip: change the last 4 digits of 'from' to 0000 and the last 4 digits of 'to' to 9999.
5. Repeat the IP.
6. In case of failure in only one target, go to RSA1, find the PSA, put in the request no. and reschedule it.
Note: If PSA is present, never make the request red; rather, delete it.
Replication error
Reason: Data source replication failed.
1. In order to handle this error you should know the IP, InfoSource and source system.
2. Go to RSA1 and find the data source in the source system tab.
3. Right-click on the data source and replicate it.
4. Since all the transformation rules pertaining to this data source need to be reactivated, go to SE38 and execute rs_transtru_activat_all with the data source and system name.
5. Delete the red request from the target.
Update R not supported
Reason: The corresponding initialization flag for the IP is lost.
1. Go to the header, click on the IP, go to the schedule tab and click 'Initialize in the source system'; in the screen that appears, delete the only request present (the initialization flag).
2. Go to RSA1 and find the IP in the InfoSource (the one with the ad hoc initialize flag).
3. Go to the update tab and select 'Initialize delta process'; try going for 'with data transfer'.
4. Reschedule the IP.
Duplicate Record Error
Reason: Duplicate error records for the loaded master data.
1. Go to the info package via the header or via RSA1.
2. Go to the processing tab, change the IP settings to select only PSA and 'ignore duplicate records', and rerun the IP.
3. Remember to change the IP settings back to the original after the second step.
ODS activation failure
Reason: Prerequisites for ODS activation not satisfied, i.e. unique key.
1. Go to maintain variant.
2. Check the QM status of the requests in the target; they should be green.
3. Then click the ODS activation tab.
4. In the screen which appears, put in the requests for which ODS activation failed.
5. Activate these and keep refreshing them until the status changes from green.
Remember to refresh these requests one at a time.
6. If requests are red, then delete them from the target.
7. Reschedule the IP.
Note: Never try activating the ODS manually if it is Y1.
Aggregate Rollup error
Reason: No aggregate found for the respective rollup.
1. Click on the variant in which the error occurred.
2. Go to the chain tab and copy the variant and instance.
3. Run the Nestle-provided program YBW05.
4. Put the info in there and set the status to G (green).
5. Execute and refresh the status.
Lock issue
Reason: The same IP has been locked by another user or may be used by another process chain.
1. We can see the locked entries in transaction SM12.
2. Wait for the other process to complete; once the IP loads to the target in that process, there is no need to run it again for this process.
3. Otherwise make the request red; when PSA is present, go to the environment tab -> schedule -> initiate update.
4. In the box that appears, select update in the background.
5. And then manually update the failed IPs by selecting manual update in the context menu.
Alfa confirming value error, time conversion error, chain didn't start, delay due to a long-running job, poor system performance, hierarchy node exists in duplicate.
Reasons:
Alfa confirming error: data format mismatch.
Time conversion error: date/time format mismatch.
Chain didn't start: a scheduled chain didn't trigger at the prescribed time.
-For all the above error we have to raise a ticket.
Idoc or TRFC error
Reason: An IDoc has been stuck somewhere.
1. Reload the master data manually again from the InfoPackage in RSA1.
2. Release the IDoc.
3. At the source system level, go to environment -> transaction -> Transaction RFC, or at the data warehousing level.
4. In the screen, if the status is 'Transaction recorded' it means it is stuck; go to edit and click 'Execute LUW' or press F6.
5. If the status is 'Transaction executing' then it is fine; wait.
6. Or raise a ticket.
Error Due to short Dump
Reason: Error due to Short dump could be due to many reasons i.e. memory, table space, lock, error record, page allocation, failed change run.
Process terminated in the Source system.
Reason: Sometimes a load fails due to job termination at the source system.
This happens due to unavailability of the source system or some connectivity problem between the source and target systems.
1. Make the request red.
2. Delete the request from the data target.
3. Reschedule the IP.
4. If the job still fails, raise a ticket.
And also check in following links:
Process Chain Errors
/people/mona.kapur/blog/2008/01/14/process-chain-errors
Common Process chain errors
For Data Load Errors check this blog:
/people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
Please assign points if it helps you.
Thanks & Regards,
Madhu -
Sub:ztable fields are used by what are all transactions
Hi Experts,
I have 2 table fields which are used in 2 different Z-tables. I want to modify these two Z-fields, but I want to tell our end users which transaction codes will be affected if I make changes to these two fields. I can get all the programs that use these 2 fields through the where-used-list option, but the users don't need to know the program names, do they? They are familiar only with transaction codes.
So is there any other option to find out the transaction codes?
Thanks,
Sankar M
Hi,
Good.
Your query is a bit confusing. If you are going to change anything in the Z-fields, there must be a requirement for that from your end user. Apart from that, if you are making any changes, you must know which reports are going to be affected and change them accordingly. I don't think there is any table which will give you an idea of which t-codes are going to be affected, because a table may have been used in different reports in different scenarios, and you have to change all those reports accordingly.
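If it helps, one place to dig is the ABAP Dictionary cross-reference tables: D010TAB records which tables a program uses, and TSTC maps transaction codes to their programs. A rough sketch (the Z-table name is an example, and this only catches transactions whose program references the table directly):

```sql
-- Candidate transaction codes whose programs reference the Z-table
SELECT DISTINCT t~tcode
  FROM d010tab AS d
  INNER JOIN tstc AS t ON t~pgmna = d~master
 WHERE d~tabname = 'ZMYTABLE'.
```

This won't catch indirect usage (e.g. via function modules called from other programs), so treat the result as a starting list, not a complete one.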
thanks
mrutyun^ -
How to identify what are all the events are created in background jobs?
Hi all,
how to identify all the events that are created for background jobs, and which events get triggered for a particular job.
thanxs
haritha
Hi Haritha,
A JOB is a program which starts at a determined point in time and executes some standard programs in the system. JOBs can be planned for a determined point in time on a regular basis (every night, for example) or for some discrete time moments. So the JOB can be planned and will then be started automatically without a manual start.
Real-time programs are understood in most cases as actual program execution which is started by somebody at the actual moment in time.
Typically, via JOBs some special processes are started that should be executed automatically and regularly: for example, IDoc application, some correction reports, statistics updates etc.
Standard jobs are those background jobs that should be run regularly in a production SAP System These jobs are usually jobs that clean up parts of the system, such as by deleting old spool requests.
Use
As of Release 4.6C, the Job Definition transaction ( sm36 ) provides a list of important standard jobs, which you can schedule, monitor, and edit.
for more information you can go thru the following thread:
http://help.sap.com/saphelp_nw70/helpdata/en/24/b884388b81ea55e10000009b38f842/frameset.htm
About Events:
Events have meaning only in the background processing system. You can use events only to start background jobs.
Triggering an event notifies the background processing system that a named condition has been reached. The background processing system reacts by starting any jobs that were waiting for the event.
Types of Events:
There are two types of events:
1.)System events are defined by SAP. These events are triggered automatically when such system changes as the activation of a new operation mode take place.
2.) User events are events that you define yourself. You must trigger these events yourself from ABAP or from external programs. You could, for example, signal the arrival of external data to be read into the SAP system by using an external program to trigger a background processing event.
The event scheduler processes an event if the event is defined in the system.
For example, if a system (System 1) receives an event from another system (System 2), the event scheduler of System 1 processes the event only if it is defined in System 1. That event does not need to be defined in System 2 (the sending system).
You define an event by assigning a name (EVENTID) to it. When defining an event, you do not define the event arguments.
for more information you can go thru the following thread:
http://help.sap.com/saphelp_nw04s/helpdata/en/fa/096e2a543b11d1898e0000e8322d00/frameset.htm
When you schedule the process chain or infopackages the jobs associated with it run in the background mode. In case you want to create a job for a specific activity you can do so in SM36. You would be creating jobs that would get executed in any one of the options:
1. Immediate
2. Date & Time
3. After event.
4. After job.
5. At Operation mode.
In case you want to view the job logs go to sm37.
Also Pls check DB02 for database performance and ST03 for workload .
Analyse and you will have an idea.
Please assign points if the info is useful.
Regards
CSM reddy
What are all information brought into database buffer cache ?
Hi,
What information is brought into the database buffer cache when a user does any one of the operations "insert", "update", "delete" or "select"?
Is only the data block to be modified brought into the cache, or are all the data blocks of the table brought into the cache while doing the operations I mentioned above?
What is the purpose of the SQL area? What information is brought into the SQL area?
Please explain the logic behind the questions I asked above.
thanks in advance,
nvseenuDocumentation is your friend. Why not start by
reading the
[url=http://download.oracle.com/docs/cd/B19306_01/serv
er.102/b14220/memory.htm]Memory Architecturechapter.
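One concrete point while you read: only the blocks needed to satisfy the statement are copied into the cache (a full table scan may read several blocks per I/O, but not necessarily the whole table). If you want to see for yourself which blocks of an object are currently cached, one sketch is to query V$BH (this needs DBA privileges, and the table name here is just an example):

```sql
-- Count cached buffers per status for one object
SELECT o.object_name, b.status, COUNT(*) AS buffers
  FROM v$bh b
  JOIN dba_objects o ON o.data_object_id = b.objd
 WHERE o.object_name = 'EMP'
 GROUP BY o.object_name, b.status;
```

Run it before and after a query against the table and you can watch the buffer count change.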
Message was edited by:
orafad
Hi orafad,
I have read the Memory Architecture chapter.
In that documentation, the following explanation is given:
"The database buffer cache is the portion of the SGA that holds copies of data blocks read from datafiles."
But I would like to know whether all or only a few data blocks are brought into the cache.
thanks in advance,
nvseenu -
What are all the built-in templates available in ODI?
What is the use of the built-in templates?
What are all the built-in templates available in ODI?
Hi Harmeet,
ODI's biggest asset is the Knowledge Module. These are the built-in templates of ODI.
Knowledge Modules (KMs) are code templates. Each KM is dedicated to an individual task in the overall data integration process.
There are 6 types of KMs that come with ODI.
Reverse-engineering knowledge modules (RKM) are used for reading the table and other object metadata from source databases.
Journalizing knowledge modules (JKM) record the new and changed data within either a single table or view or a consistent set of tables or views.
Loading knowledge modules (LKM) are used for efficient extraction of data from source databases and include database-specific bulk unload utilities where available.
Check knowledge modules (CKM) Checks data integrity against CONSTRAINTS defined on a Datastore. Rejects invalid records in the error table created dynamically.
Integration knowledge modules (IKM) are used for efficiently transforming data from staging area to the target tables, generating the optimized native SQL for the given database.
Service knowledge modules (SKM) provide the ability to expose data as Web services.
You can see the list of KMs in the /impexp directory of your installation folder.
Thanks,
Guru -
What are the most important tables in SD, MM, PP, FI , CO and QM?
Hi all,
What are the most important tables in SD, MM, PP, FI , CO and QM? i.e. most used.
Thanks,
Charles.
<b>In FI:</b>
BKPF Accounting documents (Header)
BSEG Item level
BSID Accounting: Secondary index for customers
BSIK Accounting: Secondary index for vendors
BSIM Secondary Index, Documents for Material
BSIP Index for vendor validation of double documents
BSIS Accounting: Secondary index for G/L accounts
BSAD Accounting: Index for customers (cleared items)
BSAK Accounting: Index for vendors (cleared items)
BSAS Accounting: Index for G/L accounts (cleared items)
<b>In SD:</b>
VBAK Header data
VBAP Item data
VBPA Partners in sales order
VBKD Sales district data
VBEP Data related to line items, delivery lines
VBRK Billing document header data
VBRP Billing document item data
LIKP Delivery header
LIPS Delivery item
VTTK Shipment header
VTTP Shipment item
VTTS Stage in transport
VTSP Stage in transport per shipment item
VTPA Shipment partners
VEKP Handling Unit - Header Table
VEPO Packing: Handling Unit Item (Contents)
<b>In MM:</b>
EKKO Purchase document
EKPO Purchase document (item level)
EKPV Shipping-Specific Data on Stock Tfr. for Purch. Doc. Item
EKET Delivery schedule
VETVG Delivery Due Index for Stock Transfer
EKES Order Acceptance/Fulfillment Confirmations
EKKN Account assignment in purchasing
EKAN Vendor address purchasing
EKPA Partner functions
EIPO Item export / import data
EINA Purchase info record (main data)
EINE Purchase info record (organisational data)
EORD Source list
EBAN Purchase requisition
EBKN Purchase Requisition Account Assignment
MKPF material document
MSEG material document (item level)
MARA Material master
MAKT Material text
MARC Material per plant / stock
MVKE Material master, sales data
MARD Storage location / stock
MSKA Sales order stock
MSPR Project stock
MARM Units of measure
MEAN International article number
PGMI Planning material
PROP Forecast parameters
MAPR Link MARC <=> PROP
MBEW Material valuation
MVER Material consumption
MLGN Material / Warehouse number
MLGT Material / Storage type
MPRP Forecast profiles
MDTB MRP table
MDKP Header data for MRP document
MLAN Tax data material master
MTQSS Material master view: QM
<b>In QM:</b>
QALS Inspection lot record
QAMB Link inspection lot - material document
QAVE Inspection usage decision
QDPS Inspection stages
QMAT Inspection type - material parameters
QINF Inspection info record (vendor - material)
QDQL Quality level
QDPS Inspection stages
<b>In PP:</b>
AUFK Production order headers
AFIH Maintenance order header
AUFM Goods movement for prod. order
AFKO Order header data PP orders
AFPO Order item
RESB Order components
AFVC Order operations
AFVV Quantities/dates/values in the operation
AFVU User fields of the operation
AFFL Work order sequence
AFFH PRT assignment data for the work order(routing)
JSTO Status profile
JEST Object status
AFRU Order completion confirmations
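As an illustration of how the header/item pairs above fit together: VBAK and VBAP join on the sales document number VBELN, and the same pattern applies to LIKP/LIPS, VBRK/VBRP, EKKO/EKPO and MKPF/MSEG. A small sketch (standard column names; the date filter is just an example):

```sql
-- Sales order headers with their line items, joined on the document number
SELECT h.vbeln, h.erdat, i.posnr, i.matnr
  FROM vbak h
  JOIN vbap i ON i.vbeln = h.vbeln
 WHERE h.erdat >= '20240101';
```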