Query to Identify File and Block IDs

I'm executing the following statement:
SELECT segment_name,
file_id,
block_id
FROM dba_extents
WHERE owner = 'OE';
It works when the owner is OE, but when I use HR or other owners it returns "no rows selected".
Please suggest.

A user that appears in dba_users will not necessarily appear in dba_extents. Some users may only have select privileges on other schemas' objects. You will only find a user name in dba_extents if that user owns at least one object with allocated extents; an empty schema will not show up in dba_extents.
Regards
Girish Sharma
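
A quick way to confirm this is to check whether the schema owns any segments at all. A minimal check, assuming you have SELECT access to the DBA views:

SELECT owner, COUNT(*) AS segment_count
FROM dba_segments
WHERE owner IN ('OE', 'HR')
GROUP BY owner;

If HR returns no row here, the schema owns no segments, so the dba_extents query has nothing to report for it.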

Similar Messages

  • Map files and map IDs

    This question was posted in response to the following article: http://help.adobe.com/en_US/robohelp/robohtml/WS5b3ccc516d4fbf351e63e3d11aff59c571-7ff9.html

    Typo in this section:
    Remove an unused map ID
    Your map files must be unlocked to use this option. To remove unused map IDs:
    Expand the Context-Sensitive Help folder in the Project Set-up pod.
    Right-click the Map Files folder.
    Select Edit.
    Do the following:
    Map Files: Select the map files from which to remove unused map IDs.
    Select All: Click to remove unused map IDs from all map files.
    Clear All: Click to deselect all map files and not remove unused map IDs.
    The third point should read "Select Remove Unused Map IDs..." instead of "Select Edit"; then the rest of the topic makes sense.

  • Unix program to identify file and video types

    A co-worker gave me a link to this script, which I am now using to quickly identify and sort batches of video files. It labels them with file format, video codec, view size and frames per second. It's not very polished, but if you get the one from the download link at the bottom of the page you can just double-click and drop a folder in the window.
    http://www.machacking.net/forums/index.php?showtopic=5339

    Great link, juliejuliejulie. Thanks a lot!

  • IronPort S160/ASA5510 integration - PAC file and blocking Port 80

    We have successfully integrated our ASA 5510 and IronPort S160 appliance with Active Directory and eDirectory.  We've configured AD to push IE settings to use the IronPort proxy.pac file.  Now we need to block unconfigured IE access to port 80 traffic.
    In my ASA I have a firewall exception for our WAN IP ranges (source) to any destination on tcp/http, tcp/https and domain.  If I remove tcp/http from the exception, ALL port 80 traffic stops, including that from PCs configured to use the IronPort proxy.pac file.
    So where have I gone wrong?  I want to block unconfigured IE access to port 80, forcing all users to pass through the IronPort appliance.

    I hate this job.  About 11:10 PM, as I was trying to get ready for bed, I had the same thought.  Of course I had to test it out, so back to the VPN connection I went, added the filter permit for port 80 for the IronPort's IP address, and voilà, it worked.  Thanks for answering my post just the same.

  • Procedure to save the output of a query into excel file or flat file

    I want to store the output of my query in a file and then export it from SQL Server Management Studio to a desired location using a stored procedure.
    I have run the query --
    DECLARE @cmd VARCHAR(255)
    SET @cmd = 'bcp "select * from dbo.test1" queryout "D:\testing2.xlsx;" -U "user-PC\user" -P "" -c '
    Exec xp_cmdshell @cmd
    error message--
    SQLState = 28000, NativeError = 18456
    Error = [Microsoft][SQL Server Native Client 10.0][SQL Server]Login failed for user 'user-PC\user'.
    NULL
    Goel.Aman

    Hello,
    -T: Specifies that the bcp utility connects to SQL Server with a trusted connection using integrated security. The security credentials of the network user, login_id, and password are not required. If -T is not specified, you need to specify -U and -P to successfully log in.
    -U: Specifies the login ID used to connect to SQL Server.
    Note: When the bcp utility is connecting to SQL Server with a trusted connection using integrated security, use the -T option (trusted connection) instead of the user name and password combination.
    I would suggest you take a look at the following article:
    bcp Utility: http://technet.microsoft.com/en-us/library/ms162802.aspx
    A similar thread regarding this issue:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/b450937f-0ef5-427a-ae3b-115335c0d83c/bcp-connection-error-sqlstate-28000-nativeerror-18456?forum=sqldataaccess
    Regards,
    Elvis Long
    TechNet Community Support
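    For example, a minimal sketch of the original xp_cmdshell call switched to a trusted connection (assuming xp_cmdshell is enabled and the SQL Server service account can write to D:\):
    DECLARE @cmd VARCHAR(255)
    SET @cmd = 'bcp "select * from dbo.test1" queryout "D:\testing2.xlsx" -T -c'
    EXEC xp_cmdshell @cmd
    Note that bcp with -c writes delimited character data; the .xlsx extension alone does not produce a true Excel workbook, so export to .csv or use the SSMS Import and Export Wizard if you need a real Excel file.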

  • How Do RoboHelp 9 WebHelp Generated Files Handle Map IDs and Aliases?

    The text below was written by our team's developer/architect. I am the help author who uses RoboHelp to write content and generate the help files, but I am clueless how it all gets generated and is deployed. Please help. We use RoboHelp 9. I use it in Windows XP and our app and help run on IE 7, 9, and Firefox (multiple versions).
    "Our application uses the numeric identifiers associated with the Map ID. For example, to get to the <appname>_home_page.htm file, we use the number 1053. <appname> = pecs, in this example.
    All of this is used in a call to a RoboHelp method defined in the RoboHelp_CSH.js file. The method we are calling is the RH_ShowHelp() JavaScript method, and the code to perform the call, when you click on Page Help, is this:
    RH_ShowHelp(0, '/pecsHelp/index.htm>pecsHelp', HH_HELP_CONTEXT, topic);
    Topic is translated to the Map ID number for the page help. HH_HELP_CONTEXT is defined in the RoboHelp_CSH.js file. This method translates into a URL and from what I have seen, the URL that gets generated is this:
    http://{server}[:port]/pecsHelp/index.htm/{server}[:port]/pecsHelp/index.htm#<id=1053>>pecsHelp
    Server and port get replaced with the appropriate values. I have no clue how id=1053 is supposed to get translated to mean "pecs_home_page.htm". If you check the PECS_help.h file, you will see the following entry:
    #define PECS_Home_Page1 1053
    Then in the RoboHelp alias file (PECS 3.0.ali), the following line is in the file:
    <alias name="PECS_Home_Page1" link="pecs_home_page.htm"> </alias>
    But both of these files are used during the WebHelp generation process and I don't know how the WebHelp generated files handle the Map ID and aliases."

    You need to assign the numbers you find in the pecs_help.h file to topics in your help. You do this in Context Sensitive Help > Map Files > All Map IDs. (From RH7, but I assume the location is similar in RH9.) This creates the entries in the .ali file.
    Peter Grainge suggests a couple of sites to read for a greater understanding here:
    http://www.grainge.org/pages/authoring/calling_webhelp/using_map_ids.htm
    (Although the second  site is based on RH X5, the basic concepts and procedures should be very similar. )
    HTH,
    Amber

  • ABAP query processing in bgrnd and saving file to prsntn/application server

    Hello Gurus
    I have a question regarding ABAP query execution. I am trying to find out whether an ABAP query can be executed in the background and its result stored in a UNIX directory or on the presentation server (desktop). What I found is that the report needs to be executed using the option Private File, and by default the data will be stored in the directory /sap/outbound/.
    However, I also found that the maximum number of fields is 64. If the query result has more than 64 fields, the error WRITE_TO_OFFSET_TOOLARGE occurs.
    My questions are:
    1. Can we set the UNIX directory to somewhere other than /sap/outbound/?
    2. Can we have the report stored without the 64-field limitation?
    Thanks in advance

    Hi,
    Yes, you can set the UNIX directory to another location; you just need to get the path confirmed by your Functional Analyst.
    Thanks

  • Verity - files and db query combo

    Is it possible to have both a search of some files and a query of a database table in the same cfcollection? If so, guidance on how to do this?

    I'm not sure if this answers your question, but it's possible to combine database and files while indexing.

  • Tried opening a file in library and it states can't open database with library name? It says Relaunch then will not open? and Blocks me completely from Aperture. I have to go to Finder to Rename it? I need this file how do I get it to open?


    Aftershotz,
    You're going to have to give a bit more information.
    What do you mean by "opening a file in library?"  There is no function of Aperture to open files -- you can open (switch) libraries.
    You'll have to be more specific about error messages, too.  Perhaps some screenshots would be useful to diagnose your problem.  "Can't open database with library name" is not enough detail about what Aperture is really telling you.
    nathan

  • General Query to identify parent columns and child columns in a table

    Does anyone know a general query that can be run to identify the child records associated with the parent column, both within a single table and between tables?
    Am I correct in assuming the parent column is the 'primary key'?
    Thanks. I'm new to Oracle and need some help understanding my company's crazy DB structure.

    You can use the User_Constraints and User_Cons_Columns views to identify parent and child table columns:
    SELECT * FROM User_Constraints WHERE Constraint_Type = 'R' AND Table_Name = '<TABLENAME>';
    SELECT * FROM User_Cons_Columns WHERE Constraint_Name = '<name from the above query>';
    The second query gives you the foreign key columns of the child table if you use the value from constraint_name, and the referenced columns of the parent table if you use the value from r_constraint_name.
    HTH..
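    If you want the child and parent columns side by side, a combined sketch along the same lines (assuming the constraints are in your own schema and <TABLENAME> is the child table name, in upper case):
    SELECT c.constraint_name,
           cc.table_name   AS child_table,
           cc.column_name  AS child_column,
           pc.table_name   AS parent_table,
           pc.column_name  AS parent_column
    FROM   user_constraints c
           JOIN user_cons_columns cc ON cc.constraint_name = c.constraint_name
           JOIN user_cons_columns pc ON pc.constraint_name = c.r_constraint_name
                                    AND pc.position = cc.position
    WHERE  c.constraint_type = 'R'
    AND    c.table_name = '<TABLENAME>';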

  • Need to take a value from the csv file and query it in an OAF page

    Hello,
    I have a requirement to take the list of employee numbers in a csv file and display its corresponding job on the page.
    I have created an item 'MessageFileupload' where the user will upload the csv file containing the employee numbers, and a button 'Display Jobs' which will display the corresponding jobs on the page.
    Any idea how to take the values from the csv file and query it?
    Regards,
    den123.

    Hi,
    Check
    http://oraclearea51.com/contribute/post-a-blog-article/csv-file-upload-for-oa-framework.html
    http://www.roseindia.net/jsp/upload-insert-csv.shtml
    The code below works, based on the blogs above.
    package xx.oracle.apps.pa.Lab.webui;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    import oracle.apps.fnd.common.VersionInfo;
    import oracle.apps.fnd.framework.OAApplicationModule;
    import oracle.apps.fnd.framework.OAException;
    import oracle.apps.fnd.framework.server.OAViewObjectImpl;
    import oracle.apps.fnd.framework.webui.OAControllerImpl;
    import oracle.apps.fnd.framework.webui.OAPageContext;
    import oracle.apps.fnd.framework.webui.beans.OAWebBean;
    import oracle.jbo.domain.BlobDomain;
    import oracle.cabo.ui.data.DataObject;
    import oracle.jbo.Row;

    /**
     * Controller for the CSV upload region.
     */
    public class deptCsvUploadCO extends OAControllerImpl
    {
      public static final String RCS_ID = "$Header$";
      public static final boolean RCS_ID_RECORDED =
        VersionInfo.recordClassVersion(RCS_ID, "%packagename%");

      /**
       * Layout and page setup logic for a region.
       * @param pageContext the current OA page context
       * @param webBean the web bean corresponding to the region
       */
      public void processRequest(OAPageContext pageContext, OAWebBean webBean)
      {
        super.processRequest(pageContext, webBean);
      }

      /**
       * Procedure to handle form submissions for form elements in a region.
       * @param pageContext the current OA page context
       * @param webBean the web bean corresponding to the region
       */
      public void processFormRequest(OAPageContext pageContext, OAWebBean webBean)
      {
        super.processFormRequest(pageContext, webBean);

        // Code addition started for CSV upload
        OAApplicationModule am = (OAApplicationModule) pageContext.getApplicationModule(webBean);
        OAViewObjectImpl vo = (OAViewObjectImpl) am.findViewObject("deptCsvVO1");

        //if ("GoBtn".equals(pageContext.getParameter(EVENT_PARAM)))
        if (pageContext.getParameter("GoBtn") != null)
        {
          System.out.println("Button Pressed");

          DataObject fileUploadData = (DataObject) pageContext.getNamedDataObject("FileUploadItem");
          String fileName = null;
          String contentType = null;
          Long fileSize = null;
          Integer fileType = new Integer(6); // not used below
          BlobDomain uploadedByteStream = null;
          BufferedReader in = null;

          try
          {
            // Read the uploaded file's name, MIME type and contents
            fileName = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_NAME");
            contentType = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_MIME_TYPE");
            uploadedByteStream = (BlobDomain) fileUploadData.selectValue(null, fileName);
            in = new BufferedReader(new InputStreamReader(uploadedByteStream.getBinaryStream()));
            fileSize = new Long(uploadedByteStream.getLength());
            System.out.println("fileSize" + fileSize);
          }
          catch (NullPointerException ex)
          {
            throw new OAException("Please Select a File to Upload", OAException.ERROR);
          }

          try
          {
            // Open the CSV file for reading, one line at a time
            String lineReader = "";
            long t = 0;
            String[] linetext;
            while ((lineReader = in.readLine()) != null)
            {
              // Split the delimited data, skipping empty lines
              if (lineReader.trim().length() > 0)
              {
                System.out.println("lineReader" + lineReader.length());
                linetext = lineReader.split(",");
                t++;
                // Prepare the view object once before inserting rows
                if (!vo.isPreparedForExecution())
                {
                  vo.setMaxFetchSize(0);
                  vo.executeQuery();
                }
                System.out.println("Trimmed " + linetext[1].replace("\"", ""));
                // Create a new row from the current CSV line and insert it at the end
                Row row = vo.createRow();
                row.setAttribute("Deptno", linetext[0].trim());
                row.setAttribute("Dname", linetext[1].trim().replace("\"", ""));
                row.setAttribute("Loc", linetext[2].trim().replace("\"", ""));
                //row.setAttribute("Column4", linetext[3].trim());
                vo.last();
                vo.next();
                vo.insertRow(row);
              }
            }
          }
          catch (IOException e)
          {
            throw new OAException(e.getMessage(), OAException.ERROR);
          }

          //else if (pageContext.getParameter("Upload") != null)
          am.getTransaction().commit();
          throw new OAException("Uploaded Successfully", OAException.CONFIRMATION);
        }
      }
    }
    Thanks,
    Jit

  • Identifying image and video file names in my presentation

    I want to identify the file names of all the images and videos in my Keynote presentation so that I can re-assemble the presentation in Final Cut Pro. I thought that just using Inspector might give me the file names, but it only shows me what the image or video looks like, not the file names.
    My photo images are not in iPhoto (I don't use iPhoto), they're in various other folders.
    Thank you.

    There are two ways to find file names of media used in Keynote:
    1
    for a few images, select the images on the slide
    Inspector > Builds
    click on more options
    if there are no builds set up, choose an effect
    the file name will display in the build order:
    2
    It is more useful and easier to do the following:
    right-click the Keynote file and select Show Package Contents
    all the images and QuickTime files will be listed
    you can make a screen grab to make a note of the file names (in Finder: Command-Shift-4)
    the files can be copied to a folder and imported into other applications for further work

  • TS3212 I have removed my pop-up blocker completely and still receive the following error message when attempting to download iTunes:  "The file was blocked because it does not have a valid digital signature that verifies its publisher".....any ideas?


    That suggests that the installer is getting damaged during the download.
    I'd first try downloading an installer from the Apple website using a different web browser:
    http://www.apple.com/itunes/download/
    If you use Firefox instead of IE for the download (or vice versa), do you get a working installer?

  • I have the following iMac and iPhoto 11; I upgraded the iPhoto file and lost all the latest event files. Can you help solve the problem?

    I have the following iMac and iPhoto 11 in a Finder folder. I upgraded the iPhoto file and lost all the latest event files. Can you help me solve the problem? Thank you.
    Model Name: iMac
    Model Identifier: iMac8,1
    Processor Name: Intel Core 2 Duo
    Processor Speed: 2.4 GHz

    http://support.apple.com/kb/HT2638?viewlocale=en_US&locale=en_US

  • Transaction execution time and block size

    Hi,
    I have an Oracle Database 11g R2 64-bit database on Oracle Linux 5.6. My system has ONE hard drive.
    Recently I experimented with an 8.5 GB database in a TPC-E test. I was watching transaction time for 2K, 4K and 8K Oracle block sizes. Each time I started a new test on a different block size, I created a new database from scratch to avoid messing something up (each time the SGA and PGA parameters were identical).
    In all experiments I gave my own tablespace (NEWTS) a different configuration because of Oracle block/datafile size limits:
    2K Oracle block database had 3 datafiles, each 7 GB.
    4K Oracle block database had 2 datafiles, each 10 GB.
    8K Oracle block database had 1 datafile of 20 GB.
    The best transaction execution time was on the 8K block, the 4K block had slightly longer transaction times, but the 2K Oracle block definitely had the worst transaction time.
    I identified a SQL query (when using 2K and 4K blocks) that was creating hot segments on the E_TRANSACTION table, which is the largest table in the database (2.9 GB), and was executed slowly (the number of executions was low compared to the 8K numbers).
    Now here is my question. Is it possible that multiple datafiles are the reason for these slow transaction times? I have AWR reports from that period, but as someone who is still learning about being a DBA, I would like to ask how I could identify this multi-datafile problem (if that is THE problem) by looking inside the AWR statistics.
    THX to all.

    It's always interesting to see the results of serious attempts to quantify the effects of variation in block sizes, but it's hard to do proper tests and eliminate side effects.
    > I have an Oracle Database 11g R2 64-bit database on Oracle Linux 5.6. My system has ONE hard drive.
    A single drive does make it a little too easy for apparently random variation in performance.
    > Each time I started a new test on a different block size, I created a new database from scratch to avoid messing something up.
    Did you do anything to ensure that the physical location of the data files was a very close match across databases - inner tracks vs. outer tracks could make a difference.
    > (each time the SGA and PGA parameters were identical)
    Can you give us the list of parameters you set? As you change the block size, identical parameters DON'T necessarily result in the same configuration. Typically a large change in response time turns out to be due to changes in execution plan, and this can often be associated with different configuration. Did you also check that the system statistics were appropriately matched (which doesn't mean identical across all databases)?
    > 2K Oracle block database had 3 datafiles, each 7 GB. 4K Oracle block database had 2 datafiles, each 10 GB. 8K Oracle block database had 1 datafile of 20 GB.
    If you use bigfile tablespaces I think you can get 8TB in a single file for a tablespace.
    > The best transaction execution time was on the 8K block ... but the 2K Oracle block definitely had the worst transaction time.
    We need some values here, not just "best/worst" - it doesn't even begin to get interesting unless you have at least a 5% variation - and then it has to be consistent and reproducible.
    > I identified a SQL query (when using 2K and 4K blocks) that was creating hot segments on the E_TRANSACTION table ... and was executed slowly.
    Query, or DML? What do you mean by "hot"? Is E_TRANSACTION a partitioned table - if not then it consists of one segment, so did you mean to say "blocks" rather than segments? If blocks, which class of blocks?
    > Is it possible that multiple datafiles are the reason for these slow transaction times? ... how could I identify this multi-datafile problem (if that is THE problem) by looking inside the AWR statistics?
    On a single disc drive I could probably set something up that ensured you got different performance because of different numbers of files per tablespace. As SB has pointed out there are some aspects of extent allocation that could have an effect - roughly speaking, extents for a single object go round-robin on the files, so if you have small extent sizes for a large object then a tablescan is more likely to result in larger (slower) head movements if the tablespace is made from multiple files.
    If the results are reproducible, then enable extended tracing (dbms_monitor, with waits) and show us what the tkprof summaries for the slow transactions look like. That may give us some clues.
    Regards
    Jonathan Lewis
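
    A minimal sketch of the extended tracing mentioned above, assuming you can identify the SID and serial# of the test session and can read the trace file from the diagnostic directory:
    -- enable a 10046-style trace with wait events for the test session
    EXEC DBMS_MONITOR.SESSION_TRACE_ENABLE(session_id => :sid, serial_num => :serial, waits => TRUE, binds => FALSE);
    -- ... run the slow transactions, then switch tracing off ...
    EXEC DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => :sid, serial_num => :serial);
    -- summarise the raw trace file with tkprof from the OS prompt, e.g.:
    --   tkprof <tracefile>.trc <outfile>.txt sort=exeela waits=yes
    The tkprof summary then shows elapsed time and wait events per statement for the slow transactions.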
