Scan source paths... and performance.

In another thread,
Struts Page Flow Editor Unusably Slow
poor performance turned out to be due to a problem with having 'Scan source paths to determine project contents' enabled.
Apart from the problems with the diagrammer, this manifested itself as jdev using 100% CPU for periods of anything from one second to much longer.
There are lots of other (generally intermittent) occasions where jdev temporarily locks up at 100% CPU. A regular one is the component palette appearing/disappearing when switching between a jsp and a java file. A more intermittent one is waiting for the text to appear in a menu or dialog box (i.e. the new 'window' appears blank and you can't do anything for a variable period until the text appears).
I shall watch these to see if disabling source scanning helps these as well.

Hi,
I've got a problem with JDeveloper permanently scanning the source files after I created a "Web Service Proxy". After I closed and restarted JDeveloper I got an OutOfMemory exception at startup. The only thing that helped was to delete the whole JDeveloper profile.
I hope this helps to analyze the problem.

Similar Messages

  • 9.0.3 scan source paths: how to control which files are picked?

    Using 9.0.3 Preview. I've been waiting for the "scan source paths to determine project contents" feature, but I'm running into problems with this:
    * It's picking up "Packages" called com.mycompany.CVS, with "files" called Entries, Repository, etc. This messes up the display (making it contain twice the number of entries that I'd like it to show).
    * Also, there's no way to exclude certain files or directories from the list (or is there?). I happen to have a few .jsp files lying around which are copied (by our Ant build.xml) to their proper location - but the build insists on compiling these .JSP files, and worse, spits out the error:
    Error: JSP files must reside in the server root directory or a subdirectory beneath it
    What the heck does this mean?
    How do I get around these problems?
    Shankar.

    Hi,
    You can quickly create a project based on a directory (e.g. the sourcepath) and its (filtered) contents using the New Project from Existing Source wizard in the New Object Gallery (from File->New...). If you're not too bothered about the project being dynamically updated with changes on the filesystem, you may find this solves your issue.
    However, I do agree that there should be some access to filtering for "dynamic" projects. I've logged the following bug against JDeveloper's SCM support (incl. CVS):
    2552750 DYNPRJ: WE SHOULD USE THE DYNAMIC PROJECT FILTER API FOR SCM FILES
    to support filtering out SCM control files (such as Entries, Root etc.) and the following ER against the IDE:
    2552763 DYNPRJ: PROVIDE UI SUPPORT FOR FILTERING
    to provide UI for filtering in a dynamic project.
    Both the bug and ER are published, so you can track their progress on metalink.
    Thanks,
    Brian
    JDev Team

  • "Scan Source Paths to Determine Project Contents"

    It would be nice if the "Scan Source Paths to Determine Project Contents" option compared "what's currently on my project's path" against "what it was the last time I exited JDeveloper".
    1. Create a "junk" folder under Web Content and within that folder place some .uix or .jsp files. Compile the files. Click File --> Save All.
    2. Click Tools --> Project Properties --> Common --> Input Paths.
    3. Select the checkbox labeled "Scan Source Paths to Determine Project Contents".
    4. Click OK.
    5. Exit JDeveloper 9.0.5.2 Build 1618.
    6. Go to your public_html directory and remove the "junk" folder.
    7. Restart JDeveloper.
    8. Recompile the project. JDeveloper is still looking for the deleted files.

    Hi,
    This is a known bug with projects. Unfortunately, the only workaround is to edit the .jpr file manually and remove all references to the files.
    The project file format has been completely overhauled for the next major release of JDeveloper (10.1.3). So-called dynamic projects will also be the default in the new release, and generally work a lot better.
    Thanks,
    Brian

  • Font source path and vendor id

    two questions
    1.
    Since the step from 1.4 to 1.5, my old font folder is no longer accepted by
    MakeOTF. It stops compiling the font at the status "Running txlib...".
    Hmm - I downloaded the examples and all works fine. So I copied my
    font paths into the "Example Font Sources" directory and MakeOTF runs
    like MakeOTF 1.4 did before. But how can I change the path, so MakeOTF will
    operate from whichever path my font files are stored in?
    2.
    How can I apply my vendor ID to the font - of course I can extract the
    table from the font, use a hex editor and place it back, but I'd like
    to do it with MakeOTF. I have checked the Adobe samples and the vendor
    ID is ADBE in the final font, but I haven't found any code in the feature
    files setting this value. If I compile my own fonts the result is UKWN
    - unknown. The values from the source pfa and fontinfo file are
    ignored by MakeOTF.
    (I use the Windows SDK)
    Andreas Seidel

    Antoine Picard wrote:
    >
    > Question 1 is quite a puzzle. I suspect that I would need to see some of
    > the paths involved to even guess at what the problem might be.
    >
    > Question 2 is easy. You can use the line:
    > Vendor "VDOR";
    > in the OS/2 table definition in the feature file, where "VDOR" is your
    > desired vendor id, and that should do it.
    Hallo Antoine,
    how exactly? I got a fatal error from MakeOTF saying: syntax error at
    "VDOR" missing "}"
    This is the look of my OS2 table.
    table OS/2 {
    # TypoAscender 622;
    # TypoDescender -150;
    # Panose 2 0 5 6 5 0 0 2 0 4;
    # XHeight 501;
    # CapHeight 622;
    FSType 8;
    VDOR TEST;
    } OS/2;
    Andreas Seidel
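
    For the record, the syntax Antoine describes would make the table look like this - a sketch based on the OpenType feature-file syntax, where Vendor is the keyword and the four-character ID goes in quotes ("TEST" here is a placeholder, not a registered vendor ID); the `VDOR TEST;` line above is not valid feature-file syntax:

    ```
    table OS/2 {
        FSType 8;
        # Vendor keyword per the feature-file spec; "TEST" is a placeholder ID
        Vendor "TEST";
    } OS/2;
    ```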

  • ZFD App distribution, source path and fs rights...

    I am encountering an issue, where it seems that the only place you can
    specify the SOURCE_PATH variable for a ZFD app distribution is via the TED
    policy...
    Whenever I try and specify an application specific SOURCE_PATH (i.e. on the
    golden app) I get the following error reported in the dist TED.LOG:
    'the source path %SOURCE_PATH% is invalid. Make sure the "SOURCE_PATH"
    variable value is defined in the Tiered Electronic Distribution policy, and
    that the policy is associated with the Distributor objects parent container.'
    I understand that you can reference subscriber specific variables to set
    the destination folder and substructure within the distribution object, but
    cannot seem to set an application specific SOURCE_PATH on either the app or
    distribution object - it just wants to use the policy based value, which
    can only be set once for the distributor...
    Is it not possible to set an application/distribution specific value? And
    if not, how does the transfer of Common->File System Rights work?
    I'm sure I must be doing something wrong/approaching this the wrong way!
    Please help if you can!
    Many thanks
    David

    The actual issue is that the SOURCE_PATH macro IS NOT modified when
    distributing MSI based apps; this works fine for snapshot style
    apps... For whatever reason the distributor treats MSI based apps and
    traditional app objects differently. If you try to 'gather' an MSI based app
    with the SOURCE_PATH variable used in the Package Source List, which HAS
    been defined in the Common Macros, it insists the variable is invalid...
    As for the rights, the best way I've found is to give the rights to an
    associated group and maintain the group through the distribution. i.e.
    mirror structure and maintain associations. If you do this then rights ARE
    assigned to the distributed group object
    HTH
    Regards
    David
    > > File rights that are explicitly assigned in the Application object using
    > > the Rights to Files and Folders tab are not transferred.
    >
    > And this is one of the biggest pains in the ass about replication... It
    > means you've got to manually go thru and reassign file rights to an
    > application, even tho they're all set in the golden app.. I've suggested
    > this as an enhancement several times, but it's not made it there yet.
    >
    >
    > Graham
    >
    > "Ron van Herk" <[email protected]> wrote in message
    > news:[email protected]...
    > > For application distribution the SOURCE_PATH needs to be specified on the
    > > application: Common -> Macros
    > >
    > > File rights that are explicitly assigned in the Application object using
    > > the Rights to Files and Folders tab are not transferred. Have a look at:
    > >
    > > http://www.novell.com/documentation/...k.html#bt16403
    > >
    > > Ron
    > >
    > > <[email protected]> wrote in message
    > > news:[email protected]...
    > >>I am encountering an issue, where it seems that the only place you can
    > >> specify the SOURCE_PATH variable for a ZFD app distribution is via the
    > >> TED
    > >> policy...
    > >>
    > >> Whenever I try and specify an application specific SOURCE_PATH (i.e. on
    > >> the
    > >> golden app) I get the following error reported in the dist TED.LOG:
    > >>
    > >> 'the source path %SOURCE_PATH% is invalid. Make sure the "SOURCE_PATH"
    > >> variable value is defined in the Tiered Electronic Distribution policy,
    > >> and
    > >> that the policy is associated with the Distributor objects parent
    > >> container.'
    > >>
    > >> I understand that you can reference subscriber specific variables to set
    > >> the destination folder and substructure within the distribution object,
    > >> but
    > >> cannot seem to set an application specific SOURCE_PATH on either the app
    > >> or
    > >> distribution object - it just wants to use the policy based value, which
    > >> can only be set once for the distributor...
    > >>
    > >> Is it not possible to set an application/distribution specific value? And
    > >> if not, how does the transfer of Common->File System Rights work?
    > >>
    > >> I'm sure I must be doing something wrong/approaching this the wrong way!
    > >> Please help if you can!
    > >>
    > >> Many thanks
    > >>
    > >> David
    > >>
    > >>
    > >>
    > >
    > >
    >
    >

  • How to use "Add from POM" to add source path and docpath?

    Hi
    My team is developing multiple projects with JDeveloper 11gr2.
    We share libraries and sources between team members using maven extension for JDeveloper.
    We deploy class jar, source jar and javadoc jar to repository.
    The project pom has some dependency defined as below
    <dependency>
                <groupId>com.example</groupId>
                <artifactId>CommonClient</artifactId>
                <version>0.0.1-SNAPSHOT</version>
                <type>jar</type>
                <scope>compile</scope>
    </dependency>
    We use Project Properties->Maven->Dependencies->Add from POM to add dependencies to the project.
    The problem is that the class jar is added to the classpath but the source jar and javadoc jar are not.
    As a result, team members cannot see the javadoc in the code editor.
    Configuring IDE libraries is an option, but it does not work well with Maven.
    Does anyone know how to add the source jar and javadoc jar to a project using the "Add from POM" button?
    Thank you!
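
    As a manual workaround (not via the "Add from POM" button), Maven convention publishes the source and javadoc jars under the same coordinates with classifiers, so they can be declared explicitly - a sketch reusing the coordinates from the post (whether JDeveloper then attaches them as source/doc paths is a separate question):

    <!-- The sources jar: same groupId/artifactId/version, classifier "sources" -->
    <dependency>
                <groupId>com.example</groupId>
                <artifactId>CommonClient</artifactId>
                <version>0.0.1-SNAPSHOT</version>
                <classifier>sources</classifier>
                <scope>provided</scope>
    </dependency>
    <!-- The javadoc jar works the same way with classifier "javadoc" -->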

    Hi ,
    Sample code snippet for GET_RELATED_CONTENT is as follows :
    dataBinder.putLocal("IdcService", "GET_RELATED_CONTENT");
    dataBinder.putLocal("dSource","CS");
    dataBinder.putLocal("dID","3202");
    dataBinder.putLocal("dLinkTypeID","1");
    serializer.serializeBinder (System.out, dataBinder);
    // Send the request to Content Server
    ServiceResponse response = idcClient.sendRequest(userContext,dataBinder);
    // Get the data binder for the response from Content Server
    DataBinder responseData = response.getResponseAsBinder();
    // Write the response data binder to stdout
    serializer.serializeBinder (System.out, responseData);
    // Retrieve the SearchResults ResultSet from the response
    DataResultSet resultSet = responseData.getResultSet("RelatedContent");
    // Iterate over the ResultSet, retrieve properties from the content items
    for (DataObject dataObject : resultSet.getRows ()) {
    System.out.println ("Related ContentID is : " + dataObject.get ("dDocName") );
    }
    Point to note:
    dLinkTypeID is set to 1, because the Rendition link type is what relates the content items here, so we are looking for related items that are Renditions.
    The possible values are as follows:
    dLinkTypeID 1 - Rendition
    dLinkTypeID 2 - Supersedes
    dLinkTypeID 3 - Has Supporting Content
    dLinkTypeID 4 - Cross References
    Two more sub-types are:
    Supports - dLinkTypeID=3 with the extra parameter isGetParents=1
    Cross Referenced By - dLinkTypeID=4 with the extra parameter isGetParents=1
    Hope this helps .
    Thanks,
    Srinath

  • Ant question: difference btwn source-path/compiler-source-path/include-libraries

    I have a few questions about build.xml include/lib directories (questions in xml file)
    <mxmlc>
    <source-path="../frameworks"/>
    <compiler.source-path path-element="path to include actionscript include similar to c include"/>
    <compiler.include-libraries dir="${basedir}">
        <include name="I have no idea what this is. Does flex generate binaries similar to how c compilers generate .so or .dll?"/>
    </compiler.include-libraries>
    </mxmlc>
    1. What's the difference between source-path and compiler.source-path
    2. So what exactly is a compiler.include-libraries?  Just curious what produces a library.
    I've had success accessing actions script files that reside in other directories using compiler.source-path but not source-path.  Any help appreciated
    Regards,
    Monty

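
    For what it's worth, the `<source-path="../frameworks"/>` line above is not well-formed XML at all (the attribute has no name), which by itself would explain why only the compiler.source-path form worked. A sketch of the nested-element form the Flex Ant tasks use (file and directory names here are illustrative, not from the post):

    <!-- Assumes the flexTasks.jar taskdefs are already loaded. -->
    <mxmlc file="src/Main.mxml" output="bin/Main.swf">
        <!-- Directories searched for .as/.mxml sources, like a C include path -->
        <compiler.source-path path-element="src"/>
        <compiler.source-path path-element="../frameworks"/>
        <!-- include-libraries takes compiled .swc library files - Flex's
             rough equivalent of a .so/.dll for the question in the post -->
        <compiler.include-libraries dir="libs">
            <include name="*.swc"/>
        </compiler.include-libraries>
    </mxmlc>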

  • LSMW - source file from Logical Path and Files?

    Hi,
    When specifying the source file (legacy data) you want to load from, you can choose a file from the application server.
    For the application server file, does anyone know whether there is any way to use a logical path and file to represent this source file?
    I do not see this option in LSMW and am surprised at this, as it necessitates changing the LSMW in each target system.
    Thanks in advance.
    Kind regards,
    Mark


  • Configuring Doc Path (and Source Path) for default JDeveloper library

    hi
    Sometimes a default JDeveloper library has no Doc Path (or Source Path) configured.
    Adding a project library with only a Doc Path (and/or Source Path) configured (so no Class Path) can make the relevant API documentation more easily available in JDeveloper,
    see http://www.consideringred.com/files/oracle/img/2011/library-doc-path-20110529.png
    - (q1) Why has the "WebLogic 10.3 Remote-Client" library no Doc Path configured by default?
    - (q2) Are somehow/somewhere source files available (maybe only of "API related"/non-implementation classes) for the "WebLogic 10.3 Remote-Client" library, so these can be configured in a library Source Path, to make the API documentation even more easily accessible in JDeveloper?
    many thanks
    Jan Vervecken

    Thanks for your reply John.
    John Stegeman wrote:
    ... At least some of the classes' javadocs are here: http://download.oracle.com/docs/cd/E21764_01/apirefs.1111/e13941/toc.htm ...
    As I write in my initial post, I am able to add a project library with only a Doc Path configured (as shown in library-doc-path-20110529.png),
    to "Oracle Fusion Middleware Oracle WebLogic Server MBean Javadoc 11g Release 1 (10.3.5) Part Number E13945-05 "
    at http://download.oracle.com/docs/cd/E21764_01/apirefs.1111/e13945/
    So, questions (q1) about a default Doc Path configuration and question (q2) about source files (similar to ADF) remain.
    regards
    Jan

  • Calculating the memory and performance of a oracle query

    Hi,
    I am developing an application in Java with Oracle as the back-end. My application requires a lot of queries to be executed, and the system is getting slow due to these queries.
    So I planned to develop a stand-alone application in Java that shows statistics like memory and performance. For example, if I enter an SQL query in a text box, my standalone application should display the processing time it takes to fetch the values and the memory used by that query.
    Can anybody give ideas, suggestion, etc etc...
    Thanks in Advance
    Regards,
    Rajkumar
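
    On the Java side, the elapsed-time half of this is straightforward to sketch - a minimal timing wrapper (class and method names here are illustrative, not from the thread; memory use is the harder part and is better measured on the Oracle side, as the reply below shows):

    ```java
    import java.util.concurrent.Callable;

    // Minimal sketch of timing a unit of work such as a JDBC query.
    // QueryTimer and timeMillis are illustrative names, not from the thread.
    public class QueryTimer {
        // Runs the work once and returns wall-clock elapsed milliseconds.
        public static <T> long timeMillis(Callable<T> work) throws Exception {
            long start = System.nanoTime();
            work.call(); // e.g. stmt.executeQuery(sql)
            return (System.nanoTime() - start) / 1_000_000;
        }

        public static void main(String[] args) throws Exception {
            // Simulated "query" so the sketch runs without a database.
            long elapsed = timeMillis(() -> {
                Thread.sleep(50);
                return null;
            });
            System.out.println("Query took " + elapsed + " ms");
        }
    }
    ```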

    This is now an Oracle question, not a JDBC question. :)
    The following are samples of explain plan / autotrace / SQL*Trace.
    (You really need to read stuff like the Oracle SQL tuning books...)
    SQL> create table a as select object_id, object_name from all_objects
    2 where rownum <= 100;
    Table created.
    SQL> create index a_idx on a(object_id);
    Index created.
    SQL> exec dbms_stats.gather_table_stats(user,'A');
    SQL> explain plan for select * from a where object_id = 1;
    Explained.
    SQL> select * from table(dbms_xplan.display());
    PLAN_TABLE_OUTPUT
    Plan hash value: 3632291705
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    PLAN_TABLE_OUTPUT
    | 0 | SELECT STATEMENT | | 1 | 11 | 2 (0)| 00:00:01 |
    | 1 | TABLE ACCESS BY INDEX ROWID| A | 1 | 11 | 2 (0)| 00:00:01 |
    |* 2 | INDEX RANGE SCAN | A_IDX | 1 | | 1 (0)| 00:00:01 |
    PLAN_TABLE_OUTPUT
    Predicate Information (identified by operation id):
    2 - access("OBJECT_ID"=1)
    SQL> set autot on
    SQL> select * from a where object_id = 1;
    no rows selected
    Execution Plan
    Plan hash value: 3632291705
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 1 | 11 | 2 (0)| 00:00:01 |
    | 1 | TABLE ACCESS BY INDEX ROWID| A | 1 | 11 | 2 (0)| 00:00:01 |
    |* 2 | INDEX RANGE SCAN | A_IDX | 1 | | 1 (0)| 00:00:01 |
    Predicate Information (identified by operation id):
    2 - access("OBJECT_ID"=1)
    Statistics
    1 recursive calls
    0 db block gets
    1 consistent gets
    0 physical reads
    0 redo size
    395 bytes sent via SQL*Net to client
    481 bytes received via SQL*Net from client
    1 SQL*Net roundtrips to/from client
    0 sorts (memory)
    0 sorts (disk)
    0 rows processed
    SQL> exec dbms_monitor.session_trace_enable(null,null,true,true);
    -- SQL> alter session set events '10046 trace name context forever, level 12';
    -- SQL> alter session set sql_trace = true;
    PL/SQL procedure successfully completed.
    SQL> select * from a where object_id = 1;
    no rows selected
    SQL> exec dbms_monitor.session_trace_disable(null, null);
    -- SQL> alter session set events '10046 trace name context off';
    -- SQL> alter session set sql_trace = false;
    PL/SQL procedure successfully completed.
    SQL> show parameter user_dump_dest
    /home/oracle/admin/WASDB/udump
    SQL>host
    JOSS:oracle:/home/oracle:!> cd /home/oracle/admin/WASDB/udump
    JOSS:oracle:/home/oracle/admin/WASDB/udump:!> ls -lrt
    -rw-r----- 1 oracle dba 2481 Oct 11 16:38 wasdb_ora_21745.trc
    JOSS:oracle:/home/oracle/admin/WASDB/udump:!> tkprof wasdb_ora_21745.trc trc.out
    TKPROF: Release 10.2.0.3.0 - Production on Thu Oct 11 16:40:44 2007
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    JOSS:oracle:/home/oracle/admin/WASDB/udump:!> vi trc.out
    select *
    from
    a where object_id = 1
    call count cpu elapsed disk query current rows
    Parse 1 0.00 0.00 0 0 0 0
    Execute 1 0.00 0.00 0 0 0 0
    Fetch 1 0.00 0.00 0 1 0 0
    total 3 0.00 0.00 0 1 0 0
    Misses in library cache during parse: 0
    Optimizer mode: ALL_ROWS
    Parsing user id: 55
    Rows Row Source Operation
    0 TABLE ACCESS BY INDEX ROWID A (cr=1 pr=0 pw=0 time=45 us)
    0 INDEX RANGE SCAN A_IDX (cr=1 pr=0 pw=0 time=39 us)(object id 65441)
    Elapsed times include waiting on following events:
    Event waited on Times Max. Wait Total Waited
    ---------------------------------------- Waited ---------- ------------
    SQL*Net message to client 1 0.00 0.00
    SQL*Net message from client 1 25.01 25.01
    Hope this helps

  • Get the source code and the interface of a FORM routine (4.6C)

    Hello,
    I am looking for a way to get the source code and the interface of a FORM routine. Having a look at the functionality used in SE80, I found the function module RS_SEARCH_FORM.
    By specifying the following parameters:
    Include name:                  I_INCL
    Main program:                  I_MAINPROGRAM
    Form routine name:             I_OBJECTNAME
    Object type for FORM routine:  I_OBJECTTYPE
    you receive the "pattern" that is generated by using the drag-and-drop
    functionality for FORM routines in SE80. For example:
    PERFORM test
                USING
                   a.
    In addition to this, I am looking for a way to get the source code and the interface of a FORM routine in a 4.6C system.
    The mentioned function module has a TABLES parameter called i_source, but I didn't find out how to execute it in order to receive the source code.
    It would be great if you could help me to solve this issue.
    Best regards,
    Fabian

    Hi,
    the only idea I have is to scan the source code for the FORM routine in order to retrieve the needed information. It would be great to have a standard function module that provides this functionality.
    Regards,
    Fabian

  • Time Machine, Deep event scan at path: /Volumes/Data reason:must scan subdirs|require scan|

    Had a data drive failure, and replaced the drive. Restored just that volume's data from TM to the new drive via "cp -a". For some reason just that drive (Data) forces a full scan at every TM backup. I have also upgraded to 10.8.1 to see if that would fix things, but no.
    from .Backup.log:
    Running preflight for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 6DA5C1F4-C522-48B2-892A-36362C54F632)
            Scanning nodes needing deep traversal
            Deep event scan at path: /Volumes/Data reason:must scan subdirs|require scan|
            Calculating size of changes
            Should copy 61 items (76 KB) representing 18 blocks of size 4096. 510751904 blocks available.
    Preflight complete for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 6DA5C1F4-C522-48B2-892A-36362C54F632)
    Time elapsed: 0.002 seconds
    Mind you, at this point I've excluded all of the contents of Data from actually being backed up (the 61 items is a separate bug in backupd, where it reports file counts incrementally instead of per volume); before that it was doing a FULL copy every time, regardless of what actually changed.
    From everything I've been able to read/find/fiddle with, the part that doesn't make sense is the "require scan" on the "Deep event" line above. Other people have had problems with full scans, but it's usually something about 'no fsevents db' or "fsevents reset"... this one feels like a setting somewhere telling it to do a full scan, and that's where I'm lost.
    The only thing I haven't done yet is fully reset the Time Machine drive. I'll do it, but in theory there should be a reason that will explain this, i.e. a setting on there I could fix instead. I'm not a big fan of the reboot/reinstall method of fixing things; it just hides the problem instead of solving it.
    And to make this into a question..  how do I fix "Data" to be more like my "Video" drive :
    Running preflight for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
            Calculating size of changes
            Should copy 58 items (32 KB) representing 7 blocks of size 4096. 510751904 blocks available.
    Preflight complete for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Time elapsed: 0.000 seconds
    Thoughts? 

    When this one went off, I wasn't even home; only Safari, Console, Terminal and Finder were running, though there was a cp running to copy things off of the Data drive to an external USB drive for safety's sake.
    syslog:
    8/26/12 7:46:56.322 PM com.apple.backupd[4449]: Starting automatic backup
    8/26/12 7:46:58.576 PM com.apple.backupd[4449]: Backing up to: /Volumes/Time Machine/Backups.backupdb
    8/26/12 7:47:05.840 PM com.apple.backupd[4449]: Forcing deep traversal on source: "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    8/26/12 7:47:06.009 PM com.apple.backupd[4449]: Using file event preflight for Fast
    8/26/12 7:47:07.420 PM com.apple.backupd[4449]: Will copy (207 KB) from Fast
    8/26/12 7:47:07.421 PM com.apple.backupd[4449]: Using file event preflight for Video
    8/26/12 7:47:07.421 PM com.apple.backupd[4449]: Will copy (Zero KB) from Video
    8/26/12 7:47:07.733 PM com.apple.backupd[4449]: Deep event scan at path:/Volumes/Data reason:must scan subdirs|require scan|
    8/26/12 7:47:07.733 PM com.apple.backupd[4449]: Finished scan
    8/26/12 7:47:25.056 PM com.apple.backupd[4449]: Found 2112 files (26.2 MB) needing backup
    8/26/12 7:47:25.064 PM com.apple.backupd[4449]: 1.92 GB required (including padding), 2.46 TB available
    8/26/12 7:48:31.092 PM com.apple.backupd[4449]: Copied 1258 files (4.3 MB) from volume Fast.
    8/26/12 7:48:36.552 PM com.apple.backupd[4449]: Copied 1264 files (4.3 MB) from volume Video.
    8/26/12 7:57:01.251 PM com.apple.backupd[4449]: Copied 3328 files (29.6 MB) from volume Data.
    8/26/12 7:57:01.571 PM com.apple.backupd[4449]: Using file event preflight for Fast
    8/26/12 7:57:02.468 PM com.apple.backupd[4449]: Will copy (46 KB) from Fast
    8/26/12 7:57:19.956 PM com.apple.backupd[4449]: Found 2088 files (26.1 MB) needing backup
    8/26/12 7:57:19.957 PM com.apple.backupd[4449]: 1.92 GB required (including padding), 2.41 TB available
    8/26/12 7:58:04.441 PM com.apple.backupd[4449]: Copied 428 files (263 KB) from volume Fast.
    8/26/12 7:58:15.160 PM com.apple.backupd[4449]: Copied 434 files (263 KB) from volume Video.
    8/26/12 8:06:45.605 PM com.apple.backupd[4449]: Copied 2498 files (25.6 MB) from volume Data.
    8/26/12 8:06:55.112 PM com.apple.backupd[4449]: Created new backup: 2012-08-26-200653
    8/26/12 8:07:26.837 PM com.apple.backupd[4449]: Starting post-backup thinning
    8/26/12 8:07:26.837 PM com.apple.backupd[4449]: No post-back up thinning needed: no expired backups exist
    8/26/12 8:07:26.882 PM com.apple.backupd[4449]: Backup completed successfully.
    .Backup.log
    2012-08-26-19:47:05 - Starting backup
    Previous snapshot:
              /Volumes/Time Machine/Backups.backupdb/Peter Drier’s Mac Pro/2012-08-26-192548
    Date of Previous snapshot: 1346023548470589
    Will use FS events for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Gathering events since 7162257515116441245.
    Will use FS events for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Gathering events since 7162257515113311785.
    Will traverse "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    === Starting backup loop #1 ===
      Will use IncrementalBackupCopier
    Running preflight for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
              Calculating size of changes
              Should copy 48 items (207 KB) representing 50 blocks of size 4096. 600903601 blocks available.
    Preflight complete for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Time elapsed: 1.412 seconds
    Running preflight for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
              Calculating size of changes
              Should copy 48 items (207 KB) representing 50 blocks of size 4096. 600903601 blocks available.
    Preflight complete for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Time elapsed: 0.001 seconds
    Running preflight for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
              Scanning nodes needing deep traversal
              Deep event scan at path: /Volumes/Data reason:must scan subdirs|require scan|
              Calculating size of changes
              Should copy 2112 items (26.2 MB) representing 6400 blocks of size 4096. 600528095 blocks available.
    Preflight complete for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    Time elapsed: 17.634 seconds
    Processing preflight info
              Space needed for this backup: 1.92 GB (468018 blocks of size 4096)
              Preserving last snapshot /Volumes/Time Machine/Backups.backupdb/Peter Drier’s Mac Pro/2012-08-26-192548
    Finished processing preflight info
    Copying items from "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Finished copying items for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Time elapsed: 1 minute, 6.000 seconds
              Copied 1258 items (4.3 MB)
    Gathering events since 7162257515117251301.
    Needs new backup due to change in /private/var/db/.dat1161.001
    Copying items from "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Finished copying items for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Time elapsed: 4.589 seconds
              Copied 1264 items (4.3 MB)
    Gathering events since 7162257515113311785.
    Copying items from "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    Finished copying items for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    Time elapsed: 8 minutes, 24.000 seconds
              Copied 3328 items (29.6 MB)
    === Starting backup loop #2 ===
      Will use IncrementalBackupCopier
    Running preflight for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
              Calculating size of changes
              Should copy 23 items (46 KB) representing 11 blocks of size 4096. 588570106 blocks available.
    Preflight complete for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Time elapsed: 0.898 seconds
    Running preflight for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
              Calculating size of changes
              Should copy 24 items (46 KB) representing 11 blocks of size 4096. 588550912 blocks available.
    Preflight complete for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Time elapsed: 1.407 seconds
    Running preflight for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
              Calculating size of changes
              Should copy 2088 items (26.1 MB) representing 6360 blocks of size 4096. 588164391 blocks available.
    Preflight complete for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    Time elapsed: 16.080 seconds
    Processing preflight info
              Space needed for this backup: 1.92 GB (468003 blocks of size 4096)
              Preserving last snapshot /Volumes/Time Machine/Backups.backupdb/Peter Drier’s Mac Pro/2012-08-26-194705.inProgress/220D224E-57C5-4AE7-ACC9-B362141889D7
    Finished processing preflight info
    Copying items from "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Finished copying items for "Fast" (mount: '/' fsUUID: 79881815-0EDF-3698-A685-315F87B9BE52 eventDBUUID: F0683183-083A-408F-930C-AAE5DA0C3F5E)
    Time elapsed: 44.483 seconds
              Copied 428 items (263 KB)
    Gathering events since 7162257515117277992.
    Needs new backup due to change in /Users/drierp/Library/Messages/chat.db-wal
    Copying items from "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Finished copying items for "Video" (mount: '/Volumes/Video' fsUUID: A34192F9-20D3-3CCD-B80D-D55FC4EF34DC eventDBUUID: 2BD9CE6F-CCC5-456C-B1E4-26111ADAF2BE)
    Time elapsed: 10.449 seconds
              Copied 434 items (263 KB)
    Gathering events since 7162257515113311785.
    Copying items from "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    Finished copying items for "Data" (mount: '/Volumes/Data' fsUUID: 35066226-2A07-3845-AF7F-69FE0268B156 eventDBUUID: 9CD375B1-5483-4D33-8759-1B5AC2E9A7E7)
    Time elapsed: 8 minutes, 30.000 seconds
              Copied 2498 items (25.6 MB)
    Some filesystem changes made during the course of the backup may not be accounted for. Still busy after 2 retries.
    Backup complete.
    Total time elapsed: 19 minutes, 47.000 seconds

  • Difference between <compiler.source-path path-element="" /> and <source-path path-element="" />

    Hi guys,
    I am not able to understand a few differences when I am trying to compile my Flex project using Ant. In the Adobe resources I can see lots of places where they mention <compiler.source-path path-element="" />,
    and in lots of places I use <source-path path-element="" />. Can anybody give a better explanation of this?
    In the example below you can see that in some places the attribute is written with the compiler. prefix and in other places with the direct name.
    <?xml version="1.0" encoding="utf-8"?>
    <!-- myMXMLCBuild.xml -->
    <project name="My App Builder" basedir=".">
        <taskdef resource="flexTasks.tasks" classpath="${basedir}/flexTasks/lib/flexTasks.jar" />
        <property name="FLEX_HOME" value="C:/flex/sdk"/>
        <property name="APP_ROOT" value="apps"/>
        <property name="DEPLOY_DIR" value="c:/jrun4/servers/default/default-war"/>
        <target name="main">
            <mxmlc
                file="${APP_ROOT}/Main.mxml"
                output="${DEPLOY_DIR}/Main.swf"
                actionscript-file-encoding="UTF-8"
                keep-generated-actionscript="true"
                incremental="true"
            >
                <!-- Get default compiler options. -->
                <load-config filename="${FLEX_HOME}/frameworks/flex-config.xml"/>
                <!-- List of path elements that form the roots of ActionScript
                class hierarchies. -->
                <source-path path-element="${FLEX_HOME}/frameworks"/>
                <!-- List of SWC files or directories that contain SWC files. -->
                <compiler.library-path dir="${FLEX_HOME}/frameworks" append="true">
                    <include name="libs" />
                    <include name="../bundles/{locale}" />
                </compiler.library-path>
                <!-- Set size of output SWF file. -->
                <default-size width="500" height="600" />
            </mxmlc>
        </target>
        <target name="clean">
            <delete dir="${APP_ROOT}/generated"/>
            <delete>
                <fileset dir="${DEPLOY_DIR}" includes="Main.swf"/>
            </delete>
        </target>
    </project>
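    As far as I can tell, in the Flex Ant tasks the compiler. prefix and the bare option name are interchangeable aliases for the same mxmlc compiler option, mirroring the command-line forms -compiler.source-path and -source-path. A hedged sketch (paths are placeholders) showing the two equivalent spellings:

    ```xml
    <mxmlc file="${APP_ROOT}/Main.mxml" output="${DEPLOY_DIR}/Main.swf">
        <!-- These two elements should set the same source-path root: -->
        <source-path path-element="${FLEX_HOME}/frameworks"/>
        <compiler.source-path path-element="${FLEX_HOME}/frameworks"/>
    </mxmlc>
    ```

    So the choice between the two forms in the documentation is stylistic rather than functional.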

    I know that Oracle has been putting work into improving the performance of XMLTable and XQuery, and that is the direction Oracle recommends developers take in their code. I've yet to write any production code like your query 2, but I agree with you that they should be equivalent. For a quick and dirty test, you could simply do an explain plan on both queries and see if they come out similar. As always, an explain plan does not mean that is exactly what Oracle will run during execution, but it is close enough for most situations.
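    The quick explain-plan comparison suggested above can be sketched in SQL*Plus like this (table, column, and XPath names are placeholders; substitute your own two queries):

    ```sql
    -- Capture the plan for the first candidate query
    EXPLAIN PLAN FOR
    SELECT x.*
    FROM my_table t,
         XMLTABLE('/root/row' PASSING t.xml_col
                  COLUMNS id NUMBER PATH '@id') x;

    -- Display the captured plan
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Repeat both statements for the second query and compare the output
    ```

    If the two plans show the same access paths and operations, the optimizer is treating the queries as equivalent.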

  • Why do I have to Open and physically Save a provided .xlsx File in order to source it and process it in SSIS?

    It sounds CRAZY...but it's true and I proved it out over many tests...
    We receive a .xlsx File from an outside vendor. If we simply move the file over to our processing Path and then try to source it and process it in SSIS, we get this error...
    Exception from HRESULT: 0xC02020E8
    Error at Data Flow Task [Excel Source [1]]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB Error has occurred.
    Error code: 0x80004005
    However, if we open the file and Save It and/or Save As to our processing library path, we can source the .xlsx File and process it accordingly.
    Why is this? Is it because something funky is happening under the covers when the vendor creates and saves the .xlsx File, which gets cleaned up when we open and save it on our network?
    Is there any way that we can get around this? I hate for the processing clerk to have to go through what seems like a meaningless task....but I cannot think of any way around this if we cannot source the original file. And a C# Edit Script using Microsoft.Office.Interop.Excel
    is not possible because we are not allowed to put Microsoft Office on the Server where the SSIS Package will eventually reside.
    Any thoughts...suggestions...etc...would be GREATLY appreciated.
    Thanks for your review and am very hopeful (Holding my breath) for a reply and possible solution and answer to this strangeness.
    ITBobbyP85

    First of all...THANK YOU ARTHUR!
    You're probably sick and tired of seeing my posts out here on this.
    We are running Microsoft Office 32-bit which is why I downloaded the 32-bit driver of Microsoft Access Database Engine 2010 from...
    http://www.microsoft.com/en-us/download/details.aspx?id=13255
    Sooooo I think what you're saying is that it is a Microsoft Office 2010 .xlsx File but utilizing the 32-bit Microsoft Access Database Engine 2010 only allows me to source as Microsoft Excel 2007.
    Nowwww....this will run on a Server...soooo a couple of questions...
    1.) Are you saying I need to use the 64-bit Microsoft Access Database Engine 2010 in order to process this file without having to open and Save it? Can I do that if we're running Microsoft Office 32-bit? Will I be able to source this on the Server, given that
    we cannot install Microsoft Office 2010 on the Server where this SSIS Package will live and breathe?
    2.) How can I tell which Microsoft Access Database Engine 2010 is on the server where this will eventually reside and run? 32-bit or 64-bit?
    Welllll...maybe a 3rd question...
    3.) Is there any way to get around this driver dependency? Can I create a script to bulk-copy it so that it would simply take what's on the server?
    Now mind you...we DO NOT have Microsoft Office installed on our server where this will reside...a licensing issue...
    So maybe now this explains my struggles with this damn file(excuse my French and frustration)...and why I was using a C# Edit Script to Open and Save the File until I then deployed it to the server and the Microsoft.Office.Interop.Excel DLL was not available
    on the Server because we cannot install Microsoft Office on the Server.
    Sooooo it seems as though I'm back at square one.
    UGH!
    Any additional help you can suggest or even if we can continue this "offline" would be GREATLY GREATLY appreciated.
    Thanks!
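    One Office-free workaround worth trying (a sketch, not a guaranteed fix): an .xlsx file is just a ZIP container, so re-packing the vendor's file with Python's standard library sometimes normalizes whatever container quirk the OLE DB provider chokes on, much like the manual open-and-save step, and it needs no Office install on the server. File paths in the usage comment are placeholders.

    ```python
    import zipfile

    def repack_xlsx(src_path: str, dst_path: str) -> None:
        """Rewrite an .xlsx (ZIP) archive entry by entry.

        Re-zipping can normalize container quirks left by the producing
        application, without needing Excel installed.
        """
        with zipfile.ZipFile(src_path, "r") as src, \
             zipfile.ZipFile(dst_path, "w", zipfile.ZIP_DEFLATED) as dst:
            for name in src.namelist():
                dst.writestr(name, src.read(name))

    # Hypothetical usage:
    # repack_xlsx(r"\\share\incoming\vendor.xlsx", r"\\share\processing\vendor.xlsx")
    ```

    Whether this fixes the HRESULT 0xC02020E8 error depends on what exactly the vendor's writer does differently, so it would need testing against a real problem file.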

  • Forms and Reports: Automated Test tools - functionality AND performance

    All,
    I'm looking to get a few leads on automated test tools that may be used to validate Oracle Forms and Reports (see my software configuration below). I'm looking for tools that can automate both functional and performance testing. By this I mean:
    Functional Testing:
    * Use of shortcut keys
    * Navigation between fields
    * Screen organisation (field locations)
    * Exercise forms validation (bad input values)
    * Provide values to forms, simulate a user commit, and then verify that the database state is as expected
    Performance Testing:
    * carry out tests for fixed user load
    * carry out tests for scaled step increase in user load
    * automated collection of log files and metrics during test
    So far I have:
    http://www.neotys.com/
    Thanks in advance for your response.
    Mathew Butler
    Configuration:
    Red Hat Enterprise Linux x86-64 architecture v4.5 64 bit
    Oracle Application Server 10.1.2.0.2 ( with patch 10.1.2.3 )
    Oracle Developer Suite (Oracle Forms and Reports) V10.1.2.0.2 ( with patch 10.1.2.3 )
    Oracle JInitiator 1.3.1.17 or later
    Microsoft Internet Explorer 6

    are there any tools for doing this activity like oracle recommended tools?
    Your question is unclear.  As IK mentioned, the only tool you need is a new version of Oracle Forms/Reports.  Open your v10 modules in a v11 Builder and select Save.  You now have a v11 module.  Doing a "Compile All PL/SQL" before saving is a good idea, but not required.  The Builders and utilities provided with the version 11 installation are the only supported tools for upgrading your application.  If you are trying to do the conversion of many Forms files in a scripted manner, you can use the Forms compiler in a script.  Generating new "X" files will also update the source modules (fmb, mmb, pll).  See MyOracleSupport Note 955143.1
    Also included in the installation is the Forms Migration Assistant.  Although it is more useful to people coming from older versions, it can also be used to move from v10 to 11.  It allows you to select more than one file at a time.  Documentation for this utility can be found in the Forms Upgrade Guide.
    Using the Oracle Forms Migration Assistant
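    If you do script the Forms compiler as described above, a small helper can generate one batch-compile command per module in a directory. This is only a sketch: the frmcmp_batch flags shown are the typical ones (module, userid, module_type, compile_all) and should be checked against your installation, and the connect string in the usage comment is a placeholder.

    ```python
    from pathlib import Path

    def build_compile_commands(src_dir: str, connect: str) -> list:
        """Build one frmcmp_batch command per Forms module in src_dir.

        Compiling with the newer compiler regenerates the executables and
        also upgrades the source modules.
        """
        commands = []
        for module in sorted(Path(src_dir).glob("*.fmb")):
            commands.append(
                f"frmcmp_batch module={module} userid={connect} "
                f"module_type=form compile_all=yes"
            )
        return commands

    # Hypothetical usage:
    # for cmd in build_compile_commands("/apps/forms/src", "scott/tiger@orcl"):
    #     print(cmd)
    ```

    The same pattern works for menu (.mmb) and library (.pll) modules by adjusting the glob pattern and module_type.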

Maybe you are looking for

  • Can't resume incremental backups

    I first finished moving, and after a few weeks of sitting in a box I set up my Time Capsule to resume operation as my backup device and wireless router. Unfortunately, I must have made a mistake during setup because now my MacBook Pro is treating the T

  • Oracle Linked Server problem in SQL Server

    I'm using the MS OLE DB Provider for Oracle in my SQL Server linking an Oracle 9i server to the SQL Server 2000 server. I get the following error when I try to execute a query thru the SQL Server Query Analyzer: [OLE/DB Provider 'MSDAORA' IOpenRowset

  • I have Adobe Reader 11.0.09 for Windows 7; PDFs are opening but the minimize and close icons are not showing

    please help

  • How to view basic price

    Dear Friend, How can I see the basic price, price without taxes, etc., with reference to a PO list? In ME2L, ME2N, ME2M, MB51 etc. I get the price with taxes. Can I get the unit basic price with reference to a PO for a list of POs? Thanks, Sangeeta

  • SQLLDR fails when records contain delimiter

    I am using Oracle 9i and am trying to load logs into a table using SQL*Loader. In the control file, one field has been specified as: S_STATUS_MSG enclosed by '"', However, in the incoming data, a record has the field with the following value: "[AUTH]