DB instance/file architecture

Can you share DB files amongst different
instances? That is, can the control files
for different instances point to the
same data files? (A parallel-server concept?)

Hi Sylvia,
Yes, you can share DB files, but you need to use OPS (the Oracle Parallel Server option).
This option is used to get maximum scalability for the DB.
The choice between OPS, which has additional overhead to manage the distributed lock, and an architecture based on distributed databases depends on the kind of application you need to manage.
Please specify your need.
Bye Max

Similar Messages

  • Best practices for file architecture? I mainly have photos and music to organize.

    I am moving my files from an old macbook pro to a newer one.
    I want to make sure that I am more organized with this new computer than I was in the past.
    Does anyone have suggestions regarding how best to set up the file architecture?
    Thanks!


  • System 9.3 Essbase Analytics essbaseapp.instance file

    We are upgrading to System 9.3. When we create Essbase apps using AAS, it creates a file called essbaseapp.instance along with the other normal ones. When we delete the app using AAS, everything except this file gets deleted. It looks like this file is in use by Essbase.
    Question - What is this instance file for? Has anyone come across this problem? How can it be resolved?

    Hello,
    I am finding exactly the same error when installing EAL. How did you fix it?
    Thanks,
    oriol

  • Mail file architecture configurable?

    Hello,
    Is it possible to configure Mail to store messages in such a way that each message is a separate file?
    As I understand it, each time I download a mail from the server, Mail updates the corresponding mbox file. Having been updated, the mbox file is flagged for backup. And when my five email accounts at the office weigh an average of 150 MB after 20 months of work, the backup of my emails alone adds quite some weight to the backup file.
    Well, yes, I like to keep all my emails to refer to them later when needed. And with Spotlight, it's a breeze to find one specific email even after many months.
    So, if the emails were each saved in a separate file, only the new emails would eventually be saved to the backup.
    Also, I could easily edit the received emails to remove attachments (to save space again).
    And more! I could store the emails in a shared location, so different people in the same department could read the emails sent to the department email account (say, Emy's and my Mail apps could read the same marketing email files)...
    Well, I guess it's not possible now to configure Mail to use that kind of file architecture. But do you think it would be hard to implement?
    Looking forward to reading your comments...
    Frederic
    (and if you like the idea, don't forget to suggest it in Apple feedback form)

    Frederic,
    ever since 10.4, Mail no longer uses the mbox format - messages are stored in individual files named ####.emlx in a Messages subfolder of your mailbox folders (this was done mostly for Spotlight support, as well as to prevent mailbox overflow).
    Putting the mails in a shared location - I doubt that this will work. In addition to the individual emlx files, Mail uses an SQLite database to keep track of them (the file ~/Library/Mail/EnvelopeIndex). Since the numbers of the emlx files correspond to the entries within that SQLite db, this will never be identical across multiple computers (and you would need to rebuild the mailbox on each computer in order to force-update the db and have new messages appear).
    Removing attachments - you probably should not do this manually by editing the individual messages on disk (again, because you would mess up the integrity of the database). There is a command in Mail's menu for this: "Message → Remove Attachments".
    Andreas

  • BPM - ADF binding files architecture for ADF human taskflow

    Hi,
    I want to know what the architecture is behind the hundreds of files generated for the BPM ADF integration for the human taskflow.
    Is there any method of debugging why data sent to the BPM human taskflow is not coming through properly to ADF?
    I would appreciate any debugging pointers. These are clearly XML files and we have no control over their generation, but we still want to know how the data is populated and, if a data corruption error occurs, how we can resolve it.
    For reference, we are facing major data corruption issues with the ADF BPM components.
    Thanks
    Vipin

    Hi Vipin
    This post has an interesting discussion related to your question. I am glad BPM is in fact generating all these complex XML files behind the scenes and taking care of so many things, thanks to the ADF framework. Yes, I totally agree, debugging may be a nightmare to begin with, but with divide and conquer you can get over it. At the end of the day it is just pure workflow plus a complex ADF layer. The ADF part can be carefully stripped out and all the actions run from a normal web application. If we design properly as pure SOA, we create back-end services like EJBs that can be consumed from web services or the ADF layer. Once we have this, we can integrate these services into any BPM process.
    Re: Integrating BPM taskflows into an ADF application
    Now, coming back to your questions:
    1. You have a complex BPM process with complex BOs (business objects) and arrays of BOs. I hope you have XSDs for your BOs and that you created the BusinessObjects and ProcessDataObjects from them. Now let's assume you have a BPM process that uses this complex data structure. Take the very first human task initiator, who may see a Task Details page with your complex objects. Say the Task Details page has a header section with 10 fields, another section with a table of multiple rows and columns (arrays of data), and so on.
    2. Now the main question is: when this user saves or submits the task, you should be doing something with that data, right? Are you saving it to a database, or is everything kept in the payload itself? If it is the payload itself, then I hope you have a complex structure to pass arrays of data throughout the process.
    3. Assuming all of this gets stored in a backend database, let's not worry about retrieving it when the task goes to the second user. The payload will always store the key IDs, like OrderId or CustomerId, and using those the task details page can pull any data from the backend and display it.
    4. This is where ADF comes into the picture. Even before you integrate your complex backend data with the process, first create a simple web application. If you have EJB methods to handle this data, you create data controls and bindings, and the bindings help generate the screens. You can first test your application completely here. Once all of this is working, that is when you call these services from your process.
    And the beauty is that the new release of BPM Suite (it should be in BPM 11.5 + FP (Feature Pack)) has a concept of simulation. This should help you simulate the process without any real data and focus purely on testing the flow of the process. This comes in really handy for testing.
    And for the data used in your process, you have ADF tools to create a standalone ADF application.
    We have some applications like yours with very complex data structures, arrays that get stored in something like 25 tables with references going everywhere. First we built a simple data layer using EJBs and a JPA layer (JPA entities). These entities have annotated methods to take care of most things. In the EJB we exposed some generic methods instead of having ADF generate them, due to the complex data structure. Then we created data controls, bindings, and custom JSPX files. This is where we ended up having our own layout for the data. As you have probably already seen, the task details page generated by default is not usable at all; it dumps all the data in one single long column, so we do not use it. Since we know our data, we created our own task details page. The only things we add are the actions bar and the comments and attachments section. When you auto-generate a task details page for any .TASK file, you will see data bindings created for the payload and actions, and also separately for comments and attachments. Just drag these comments and attachments onto your page and define your own layout.
    Re: deploying a large Oracle BPM Application with multiple UI projects (Single taskdetails for multiple UIs and Tasks)
    Re: What exactly 'PAYLOAD' Means??? (Simple definition for Payload word referred in any Process)
    If you look at one of the old posts, there is a way to reuse just one single task details page for multiple different human .TASK files. All that differs is the top actions menu, which is different for each human task: someone can Submit, someone can only Approve or Reject, and so on. We can club all these actions into a single actionsMenu.jsff, put it on the task details page, and assign it to all the .task files. The menus are automatically enabled or disabled using the workflow APIs, so you do not need any special coding for them. Look at the code snippet for any action button; it has something like isAllowedAction and takes the name of the action, and the workflow APIs check internally whether that .TASK file has an action with that name.
    In conclusion, there is a clear separation between the process and the ADF-generated task details page.
    Thanks
    Ravi Jegga

  • Impdp using ASM or local to instance file

    Hi,
    I am wondering if anyone has compared performance running impdp from
    a) ASM on fiber-optic (SAN) storage
    b) a local file system on the instance host
    Would there be any performance benefit to moving from a local file system to ASM?
    Linux, Oracle 10g, 2-node RAC
    Thanks
    Turtle
    Edited by: Turtle on Jun 26, 2012 3:20 AM
    Edited by: Turtle on Jun 26, 2012 3:22 AM

    sed -e 's/\/.*$//' proxyLog
    Roger

  • Deleted language files + architecture

    Ok, I screwed up big time I think. I used Monolingual to delete what I thought were unnecessary files. I have the Intel versions of the software I use, like Dreamweaver, Acrobat and Photoshop. Well, none of them want to start now. I can only put it down to my removing the language files and architectures. I have English and English (United Kingdom) as my only language files. After all, that is the only language I ever use.
    Could that be responsible for the problems with Photoshop Acrobat and Dreamweaver? Thunderbird, Firefox, Word, Excel, Xcode, ImageReady, Fireworks, Shake, Final Cut Pro and Blender all work just fine.
    My computer is a Macbook Pro with 2GB RAM, Dual processor and 160GB HD 70GB of which is free space.
    I have tried reinstalling the above software, so this problem really intrigues me. I am not really aware of any sort of activity monitor, as I have only been using Macs for a few months. So sorry if I sound like a total newbie here.

    Archive and Install will not work. It will preserve your Applications folder as-is with all the missing components. If you opt to preserve users then your /Home/Library/ folder will be preserved including any now damaged components.
    Trust me, I've been down this path once. When you remove required frameworks, you cannot get them back with an Archive and Install, because the A&I will preserve parts of the installed OS that you don't want preserved.
    Do your backup, but do an Erase and Install then reinstall all your third-party software.

  • Generic instance maker and class definition in separate files - problem

    file: genericinstantiator.java
    package testGenericInstantiator;
    public final class GenericInstantiator {
         public <T> T makeInstance(Class<T> c) {
              T instance = null;
              try {
                   instance = c.newInstance();
              } catch (IllegalAccessException e) {
                   e.printStackTrace();
              } catch (InstantiationException e) {
                   e.printStackTrace();
              }
              return instance;
         }
    }
    file: genericinstantiatorusage.java
    package testGenericInstantiator.usage;
    import testGenericInstantiator.GenericInstantiator;
    final class SillyClass {
         public void sayBoo() {
              System.out.println("boo");
         }
    }
    public final class GenericInstantiatorUsage {
         public static void main(String[] args) {
              GenericInstantiator gi = new GenericInstantiator();
              SillyClass sc = gi.makeInstance(SillyClass.class);
              sc.sayBoo();
              System.exit(0);
         }
    }
    thrown exception:
    java.lang.IllegalAccessException
    on the line 'instance = c.newInstance();' in instantiator
    reason in jdk:
    if the class or its nullary constructor is not accessible.
    question:
    how to avoid this? (in my situation, the code that creates the instance and the class being instantiated MUST be in separate files)
    Edited by: gizmo640 on Mar 27, 2008 11:39 AM

    package testGenericInstantiator;
    import java.lang.reflect.Constructor;
    import java.lang.reflect.InvocationTargetException;
    public final class GenericInstantiator {
         public <T> T makeInstance(Class<T> c) throws IllegalArgumentException, InstantiationException, IllegalAccessException, InvocationTargetException, SecurityException, NoSuchMethodException {
              T instance = null;
              Class<?>[] emptyParams = {};
              Constructor<T> con = c.getDeclaredConstructor(emptyParams);
              con.setAccessible(true);
              instance = con.newInstance();
              return instance;
         }
    }
    Setting the constructor to accessible works well too.
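    For what it's worth, the reflective workaround above is only needed because SillyClass is package-private, so its implicit no-argument constructor is not visible to GenericInstantiator from the other package. If you control the target class, a minimal alternative sketch (keeping the original class names) is simply to declare it public in its own file:
    file: SillyClass.java
    package testGenericInstantiator.usage;
    // A public top-level class gets a public implicit no-arg constructor,
    // which is all c.newInstance() needs when called from another package.
    public final class SillyClass {
         public void sayBoo() {
              System.out.println("boo");
         }
    }
    SillyClass then moves out of genericinstantiatorusage.java into its own file, and the original gi.makeInstance(SillyClass.class) call works unchanged.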

  • How do I add data to an existing file, without having to create a new one?

    Hi,
    I made a servlet which is supposed to write the word "new" in a file every time the server creates an instance:
    File chatInfos = new File("LOCATION_OF_THE_FILE/infos.dat");
    if (chatInfos.exists()) {
         try {
              FileOutputStream fos = new FileOutputStream(chatInfos);
              ObjectOutputStream os = new ObjectOutputStream(fos);
              os.writeObject("new");
              os.close();
              fos.close();
         } catch (IOException e) {
         }
    }
    But it doesn't work. It seems like the old file is destroyed and a new one is created every time. But I don't want that. I simply want to add a new object to the file.
    thanx

    http://java.sun.com/j2se/1.4.1/docs/api/java/io/FileOutputStream.html#FileOutputStream(java.io.File, boolean)
    Kind regards,
    Levi
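    The constructor Levi links to takes a second boolean argument. A minimal sketch, reusing the snippet from the question: passing true opens the file in append mode instead of truncating it.
    File chatInfos = new File("LOCATION_OF_THE_FILE/infos.dat");
    if (chatInfos.exists()) {
         try {
              // true = append: keep the existing contents instead of overwriting them
              FileOutputStream fos = new FileOutputStream(chatInfos, true);
              ObjectOutputStream os = new ObjectOutputStream(fos);
              os.writeObject("new");
              os.close();
         } catch (IOException e) {
              e.printStackTrace();
         }
    }
    One caveat: each new ObjectOutputStream writes its own stream header, so a file built up by repeated appends cannot be read back with a single ObjectInputStream. For a simple marker like "new", appending plain text (for example with a FileWriter opened in append mode) is usually the easier route.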

  • Multiple sections in a single file

    How would one design a datastore for a file that has multiple sections? Each of the sections has a different structure and can appear any number of times.

    Could you validate your instance against your schema?
    In Visual Studio, select your schema and set the property "input instance file name" to your sample instance.
    Then right-click on the schema file and try to validate your instance. If it does not succeed, it means your instance does not comply with your schema definition; you should make sure it succeeds.

  • Monitoring File History / backups in Windows 8.1

    Hi,
    By tracking some events I was able to monitor Windows 7 backups, by means of scheduled tasks triggered by the events below.
    Image Backup successful : Event log : Microsoft/Windows/Backup/Operational. Source : Backup. Event ID : 14.
    Profile Backup successful : Event Log : Application. Source : Windows Backup. Event ID 4098
    (BTW, the Microsoft/Windows/Backup/Operational log not holding all backup logging is a bit disappointing, to say the least.)
    How can I monitor Windows 8 "File History"?
    If we keep the default "every hour" File History schedule, the best approach is perhaps to track failures instead of successes, also because I haven't seen any event logged for a successful backup in the log: Microsoft/Windows/FileHistory-Engine/File History backup log/Operational.
    When I disconnect the backup device I do get an Event ID 203 in there, though.
    What other event IDs are related to a backup failure?
    Is there a configurable Action Center notification on backup failure?

    Hi,
    Log files related to File History can be found in Applications and Services Logs -> Microsoft -> Windows. We found the following two interesting event logs:
    WHC under FileHistory-Core
    File History Backup Log under FileHistory-Engine
    Information-level entries in WHC appear whenever File History runs, stops, or is turned off or on. The File History Backup Log, on the other hand, records warnings and error messages, for instance: a file was not backed up due to some error, an unusual condition was encountered during finalization of a backup cycle for a configuration, or File History was unable to scan user libraries for changes and perform a backup of modified files for a configuration.
    For more information about file history, you can refer to this link:
    Windows 8 File History Analysis
    http://articles.forensicfocus.com/2013/11/24/2869/
    Hope these could be helpful.
    Kate Li
    TechNet Community Support

  • Sqlldr errors loading XML file

    I am getting two errors on two different data elements trying to load an XML file. DB version is 11.2.0.2.0 - 64bit.
    This is part of the instance file:
    <?xml version="1.0" encoding="utf-8"?>
    <SamseAMSS xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://SAMSE-AMSS.org/namespace">
    <FileInfo>
    <AviationUIC>
    <OrigUnit>
    <UIC>WV7PB0</UIC>
    <UnitName>B TRP 1-230TH ACS</UnitName>
    <OrgLocCde>A</OrgLocCde>
    <DPICode>Need LOGSA DPICode</DPICode>
    <IsReporter>true</IsReporter>
    <UnitPOC />
    <POCNbr>(865)985-4634</POCNbr>
    <POCEmail />
    </OrigUnit>
    <AMSSUtilCde>
    <UtilCde>0</UtilCde>
    <Summary>
    <EIC>ROC</EIC>
    *<MDS xsi:type="xsd:string">OH-58D</MDS>*
    *<Model>OH-58D</Model>*
    <QtyAuth>6</QtyAuth>
    <QtyOH>6</QtyOH>
    <RptTimeCde>H</RptTimeCde>
    </Summary>
    </AMSSUtilCde>
    </AviationUIC>
    The problem is with the MDS and Model elements.
    This is part of the schema:
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema xmlns="http://SAMSE-AMSS.org/namespace" xmlns:xs="http://www.w3.org/2001/XMLSchema"
    xmlns:xdb="http://xmlns.oracle.com/xdb"
    targetNamespace="http://SAMSE-AMSS.org/namespace"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:smse="http://samsecommon.xsd/namespace"
    elementFormDefault="qualified" attributeFormDefault="unqualified">
    <xsd:import namespace="http://samsecommon.xsd/namespace" schemaLocation="samsecommon.xsd"/>
    <xs:element name="SamseAMSS" xdb:defaultTable="SAMSEAMSS">
    <xs:sequence>
    <xs:element name="FileInfo" smse:type="SamseFileInfoType"/>
    <xs:element name="AviationUIC" maxOccurs="unbounded">
    <xs:annotation>
    <xs:documentation>1 element for each UIC being reported</xs:documentation>
    </xs:annotation>
    <xs:complexType xdb:SQLType="AMSSUIC_T">
    <xs:sequence>
    <xs:element name="AMSSUtilCde" minOccurs="0" maxOccurs="unbounded">
    <xs:annotation>
    <xs:documentation>1 element for each UtilCde being reported for this UIC</xs:documentation>
    </xs:annotation>
    <xs:complexType xdb:SQLType="AMSSUTILCDE_T">
    <xs:sequence>
    <xs:element name="UtilCde">
    <xs:annotation>
    <xs:documentation>Key Field</xs:documentation>
    </xs:annotation>
    <xs:simpleType>
    <xs:restriction base="xs:token">
    <xs:length value="1"/>
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name="Summary" maxOccurs="unbounded">
    <xs:annotation>
    <xs:documentation>1 element per type of equipment (by EIC / PrimeEIC) authorized for this UIC -- always sent every time.</xs:documentation>
    </xs:annotation>
    <xs:complexType xdb:SQLType="SUMMARY_T">
    <xs:sequence>
    <xs:element name="EIC" minOccurs="0" smse:type="EICType">
    <xs:annotation>
    <xs:documentation>Key Field</xs:documentation>
    </xs:annotation>
    </xs:element>
    *<MDS xsi:type="xsd:string">OH-58D</MDS>*
    *<Model>OH-58D</Model>*
                   <xs:element name="PrimeEIC" minOccurs="0" smse:type="EICType" nillable="true">
    <xs:annotation>
    <xs:documentation>Key Field</xs:documentation>
    </xs:annotation>
    </xs:element>
    <xs:element name="QtyAuth" type="xs:unsignedShort"/>
    First, it doesn't recognize the "Model" data element that is defined in the schema:
    Record 3: Rejected - Error on table SAMSEAMSS.
    ORA-30937: No schema definition for 'Model' (namespace 'http://SAMSE-AMSS.org/namespace') in parent '/SamseAMSS/AviationUIC[1]/AMSSUtilCde[1]/Summary[1]'
    If I remove the Model from the instance file and try to load, it does not recognize the xsi:type in the MDS element. I also tried using xsi:type in the schema definition.
    Record 1: Rejected - Error on table SAMSEAMSS.
    ORA-31079: unable to resolve reference to type "string"
    If I change xsi:type to xsd:type it loads with no errors. However we don't control the incoming XML files so it needs to load with xsi:type.
    Thanks for your help.

    Whoops, no, I didn't put that in the schema definition! This is the part with those elements:
    <xs:complexType xdb:SQLType="AMSSUTILCDE_T">
    <xs:sequence>
    <xs:element name="UtilCde">
    <xs:annotation>
    <xs:documentation>Key Field</xs:documentation>
    </xs:annotation>
    <xs:simpleType>
    <xs:restriction base="xs:token">
    <xs:length value="1"/>
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name="Summary" maxOccurs="unbounded">
    <xs:annotation>
    <xs:documentation>1 element per type of equipment (by EIC / PrimeEIC) authorized for this UIC -- always sent every time.</xs:documentation>
    </xs:annotation>
    <xs:complexType xdb:SQLType="SUMMARY_T">
    <xs:sequence>
    <xs:element name="EIC" smse:type="EICType">
    <xs:annotation>
    <xs:documentation>Key Field</xs:documentation>
    </xs:annotation>
    </xs:element>
                   *<xs:element name="MDS" minOccurs="0" type="xsd:string"/>*
                   *<xs:element name="Model" minOccurs="0" type="xsd:string"/>*
                   <xs:element name="PrimeEIC" smse:type="EICType" nillable="true">
    <xs:annotation>
    <xs:documentation>Key Field</xs:documentation>
    </xs:annotation>
    </xs:element>
    Thanks.
    Edited by: user4109719 on Jun 12, 2012 6:32 AM Put correct Model definition
    Edited by: user4109719 on Jun 12, 2012 6:37 AM

  • System Profiler Content File Directory Location?

    Hello,
    I recently changed out the microprocessor in my PowerBook G3 because the old one was not working properly. All works fine now, and it is smoothly running 10.4.11. However, when I look in System Profiler, the serial number has changed to that of the new microprocessor and does not match the sticker on the bottom of the machine. I managed to modify the "About This Mac" and "Login Window" serial number just fine, but I am stumped regarding System Profiler. I did some research and found that in 10.5 you can manually input the serial by finding an "SMBIOS" type file architecture in the Extensions folder of System > Library. My question is:
    Where can I find the system files that give System Profiler its information? Or where can I find the 10.4 equivalent of 10.5's SMBIOS system files? I appreciate any advice. Please indicate the entire file directory so I can easily find the location of the files I need to modify. Have a nice day.

    I suspect that System Profiler gets the serial number either by an internal call to the I/O registry which in turn has gotten it from the logic board firmware, or else directly from the firmware itself. If either of these mechanisms is correct, then there are NO files that carry the serial number, and therefore I think there is probably nothing you can change. This may be incorrect, but so far I've seen no evidence of a datafile that has this information.
    By way of background, here is my understanding. Hopefully I will be corrected if I am wrong:
    A file is a particular type of data structure, a collection of bytes assembled by the operating system in a specific way. Files usually reside on physical storage devices such as disk drives, and persist even after the computer is turned off. A user with sufficient privileges can generally access the information stored in a file.
    There are, however, other types of OS data structures in the "kernel," below the level of the user interface. These structures are not files - it is not that they are "hidden files" or "system files" - they are not files at all. These data structures exist in a protected space that users cannot alter.
    ioreg -l | grep IOPlatformSerialNumber | awk '{ print $4 }'
    I have no idea exactly "where" this piece of information is coming from, meaning the serial number that appears. It would be greatly appreciated if you could help me track down exactly where on the system the terminal command here is pulling this info from. It clearly gets it from some place, but it just may not be something we can see on the system as an object.
    According to the link I posted earlier, the ioreg terminal command directly displays (but cannot alter) the I/O registry. This data structure is not a file, and does not exist in the filesystem that you access via the GUI or via Terminal filesystem commands. Even worse, the link said:
    the Registry is not stored on disk or archived between boots. Instead, it is built at each system boot and resides in memory
    What this means is that the I/O Registry resides only in RAM - it is completely destroyed when you turn off the computer, and gets rebuilt when you start back up. And again, it is not a file.
    That's why I said "This doesn't sound promising"!
    there has to be a way to override the "connection" between the dynamic database itself and how it obtains information from it.
    If System Profiler.app does either read the I/O registry or read the firmware directly, then this instruction may be hardcoded in the app itself. I don't know this, but if so then I don't think there is any way to get it to look "somewhere else."
    Good luck, but I think this project is going to be "The Impossible Dream"

  • Applet can't read local file on web server, security issue!

    Is there any way to read/write files on the web server through an applet, other than using a signed applet?
    If you have any ideas, please reply soon.
    Thanks in advance

    Applets are downloaded from web servers and execute on the client machine.
    Therefore they have no access to the web server file system, signed or not.
    They could have access to the client file system (the machine where they run),
    but for security reasons only signed applets have this privilege.
    So to answer your question, you sign if you want to access client files.
    To access web server files, you don't need to sign the applet, but you need
    to provide some method of accessing files remotely, for instance:
    - Files published for the web can be read using HTTP
    - Files can be read or written with the help of an FTP server on the same machine as the web server
    - A servlet running on the HTTP server can collaborate with your applet to exchange files
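    As a minimal sketch of the first option (reading a published file over HTTP), assuming a file named data.txt sits next to the applet on the web server; the file name and class name here are only illustrative:
    import java.applet.Applet;
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;
    public class ReadFromServerApplet extends Applet {
         public void start() {
              try {
                   // Resolve the file relative to the URL the applet was loaded from,
                   // then read it over plain HTTP - no signing required, because unsigned
                   // applets may always connect back to their originating host.
                   URL fileUrl = new URL(getCodeBase(), "data.txt");
                   BufferedReader in = new BufferedReader(new InputStreamReader(fileUrl.openStream()));
                   String line;
                   while ((line = in.readLine()) != null) {
                        System.out.println(line);
                   }
                   in.close();
              } catch (IOException e) {
                   e.printStackTrace();
              }
         }
    }
    Writing back to the server still needs something on the server side to receive the data, which is where the FTP or servlet options above come in.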

  • SAP File System are being updated at Storage Level and as well as in Trash

    Hi Friends,
    We are facing a strange but serious issue with our Linux system. We have multiple instances installed on it, but one instance's file systems are visible in the Trash.
    The exact issue is this:
    1. We have DB2 installed on Linux. One of our instance's mount points is showing up in the Linux Trash: if I create a file at the storage level, i.e. /db2/SID/log_dir (touch test), it is dynamically reflected in the Trash and created there as well.
    2. This cannot be normal behavior for any OS.
    3. If I delete any file from the Trash related to this particular SID (instance), the file is deleted from its actual location.
    I know this is not related to SAP configuration, but I just want to find the root cause. If any Linux expert can help with this issue, I am waiting for an early reply.
    Regards,

    Hi Nelis,
    I think you have misinterpreted this issue; let me explain in detail. We have the following mount points in storage, with SAP installed on them:
    /db2/BID
    /db2/BID/log_dir
    /db2/BID/log_dir2
    /db2/BID/log_archive
    /db2/BID/db2dump
    /db2/BID/saptemp1
    /db2/BID/sapdata1
    /db2/BID/sapdata2
    /db2/BID/sapdata3
    /db2/BID/sapdata4
    Now I can see the same mount points in the Linux Trash, and if I create a folder or file in any of the above-mentioned mount points it is dynamically reflected in the Trash; if I delete something at the storage/OS level, the same is deleted from the Trash, and vice versa.
    I have checked everything and no symlink exists anywhere, but I am not sure about the storage/OS level; that is what I want to find out.
    Regards,
