Looping through channels is really slow in Photoshop CC
I've been working on a .NET application and ran into a little performance snag. I have a simple method that loops through the document's channels by index, grabs the indexed channel's name, and if it matches a predetermined variable it simply deletes that channel. It works as written but is painfully SLOW. Being fairly new to programming I figured it was just me, so I opened up ExtendScript to run a few tests.
var docChannels = app.activeDocument.channels;
for (var i = 0; i < docChannels.length; i++) {
    alert(docChannels[i].name);
}
Running this script on a document containing only the RGB channels plus 3 other alpha channels takes approximately 30 to 40 seconds. Running the same script modified to cycle through about 20 document layers takes only about 5 seconds.
I don't know if this is a bug in Photoshop CC or not; I don't have any earlier versions to test on. I can accomplish the same thing by using a try/catch statement and directly trying to delete the channel by name. This seems to run a little faster, but not by much, and it is not really a proper use of a try/catch statement.
Any insight into this problem, or a better way to implement my approach, would be most appreciated.
I am not sure about .NET apps, but I think the reason the JavaScript object model can be slow when working with layers and channels is that every time you create a layer or channel object, all the properties of that object are built. I think building the histogram property is the biggest reason for the slowdown, especially if the document has a large canvas.
Action Manager can be much faster because you can get only the property you need, avoiding both creating a DOM object and building the histogram. Try this to see if getting the channel names is faster:
function getProperty( psClass, psKey, index ){// integer:class, integer:key, optional integer:index (1-based)
    var ref = new ActionReference();
    if( psKey != undefined ) ref.putProperty( charIDToTypeID( "Prpr" ), psKey );
    if( index != undefined ){
        ref.putIndex( psClass, index );
    }else{
        ref.putEnumerated( psClass, charIDToTypeID( "Ordn" ), charIDToTypeID( "Trgt" ) );
    }
    try{
        var desc = executeActionGet( ref );
    }catch(e){ return; }// return undefined on error
    if( desc.count == 0 ) return;// return undefined if the property doesn't exist
    var dataType = desc.getType( psKey );
    switch( dataType ){// not all types supported - returns undefined if not supported
        case DescValueType.INTEGERTYPE:
            return desc.getInteger( psKey );
        case DescValueType.ALIASTYPE:
            return desc.getPath( psKey );
        case DescValueType.BOOLEANTYPE:
            return desc.getBoolean( psKey );
        case DescValueType.UNITDOUBLE:
            return desc.getUnitDoubleValue( psKey );
        case DescValueType.STRINGTYPE:
            return desc.getString( psKey );
        case DescValueType.OBJECTTYPE:
            return desc.getObjectValue( psKey );
        case DescValueType.LISTTYPE:
            return desc.getList( psKey );
        case DescValueType.ENUMERATEDTYPE:
            return desc.getEnumerationValue( psKey );
    }
}

var channelCount = app.activeDocument.channels.length;
var channelNames = [];
for( var channelIndex = 1; channelIndex <= channelCount; channelIndex++ ){
    channelNames.push( getProperty( charIDToTypeID( "Chnl" ), charIDToTypeID( "ChnN" ), channelIndex ) );
}
alert( channelNames );
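If reading the names through Action Manager turns out to be faster, the delete itself can be done the same way, which avoids the try/catch workaround. The sketch below is a hedged assumption rather than tested code: it uses the "Dlt " (delete) event and "Chnl" (channel) class IDs, which are standard Action Manager identifiers, but verify them against your Photoshop version; deleteChannelByIndex is simply a name I made up.

```javascript
// Hypothetical helper (the name is mine): delete the channel at a given
// 1-based index through Action Manager, mirroring the getProperty approach.
// Assumes Photoshop's ExtendScript globals (ActionReference, ActionDescriptor,
// charIDToTypeID, executeAction, DialogModes) are available.
function deleteChannelByIndex(index) {
    var ref = new ActionReference();
    ref.putIndex(charIDToTypeID("Chnl"), index); // target the channel by index
    var desc = new ActionDescriptor();
    desc.putReference(charIDToTypeID("null"), ref);
    executeAction(charIDToTypeID("Dlt "), desc, DialogModes.NO); // suppress dialogs
}
```

To delete a channel matching your variable, loop from the highest index down to 1, fetch each name with getProperty, and call this when the name matches; iterating backwards keeps the remaining indexes valid after each delete.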
Similar Messages
-
Looping through inbox items on timer event is really slow...?
Hi,
I have an issue with an email sync event that I am running. Essentially, I am given a date (the received date of the last email synced) via another process and then need to return all the emails in the user's inbox where receivedDate > this given date.
I have a timer object that triggers my sync event every 15 mins. I also have a button that allows the user to manually activate the sync - this basically changes the timer to 3 secs so that it fires the sync event.
If the sync event is fired manually as per the above, it loops through everything quite quickly (~12,000 emails in 30 secs) and I can store the sender/recipient and received-time data I require from these in a list of objects, which I sync later on in a background process. During this loop the main UI freezes up, which is fine since I show a progress bar so the user knows how long they need to wait, and a large sync like this should theoretically only occur on their first sync. If the first sync event is fired without the manual input - so still called from the same timer event, but on the normal 15 min interval, not the shortened 3 sec one - then it runs differently. The sync takes a lot longer, and the UI doesn't freeze up.
I presume this might be because it is fired slightly differently and does not hog up the main thread, but rather shares it with normal Outlook operations. Maybe the operations are being run on a background thread and then keep having to jump into the
main thread to access the email items...? Either way, I'm not quite sure how this is happening since the only difference is the user pressing a button that changes the timer.interval. The code that is run is the same, and the button press doesn't
trigger the actual sync itself.
Is there a way I can force this first sync to always run solely in the main thread, so that it can freeze up Outlook and show the progress bar until it is done? Ideally any subsequent syncs would then be run with no progress bar and no freezing of the UI.
Many thanks,
Tom

Hello Tom,
> Maybe the operations are being run on a background thread and then keep having to jump into the main thread to access the email items...?
You shouldn't use other threads when dealing with the Outlook object model. Office applications use the single-threaded apartment (STA) model and don't support multithreading. All calls made from other threads are marshalled by Outlook to the main thread. However, you can use low-level code (Extended MAPI) to access the data from secondary threads. For example, you can use Redemption, which is based on Extended MAPI.
> it loops through everything quite quickly (~12,000 emails in 30 secs)
Instead of looping over all items in the folder, I'd recommend using the Find/FindNext or Restrict methods of the Items class. You can read more about them in the following articles:
How To: Use Find and FindNext methods to retrieve Outlook mail items from a folder (C#, VB.NET)
How To: Use Restrict method to retrieve Outlook mail items from a folder
You can use the System.Windows.Forms.Timer class, which uses the main thread for invoking the Tick event. The .NET Framework Class Library provides three different timer classes: System.Windows.Forms.Timer, System.Timers.Timer, and System.Threading.Timer. Each of these classes has been designed and optimized for use in different situations. The Comparing the Timer Classes in the .NET Framework Class Library article examines the three timer classes and helps you gain an understanding of how and when each class should be used.
Also, you may find the AdvancedSearch method of the Application class helpful. Pay special attention to the fact that the search is performed in another thread. You don't need to run another thread manually, since the AdvancedSearch method runs it automatically in the background. See Advanced search in Outlook programmatically: C#, VB.NET for more information. -
Nested Loops...looping through one month of data at a time year by year
Hi all,
I'm trying to create an insert statement that loops through a table holding 10 years of data (2001 to 2010), month by month, to minimize the impact on the server and to commit more frequently to avoid filling up the redo logs and rollback tablespaces. The table is large, with about 40 million records per year. Let's say the structure of the table is the following:
Customer_ID number(9),
Order_Item_1 number(6),
Order_Item_2 number(6),
Order_Item_3 number(6),
Order_date date
The table is in flat format but I want to normalize it so that it looks like the following:
Customer_ID Order_Seq Order_Item Order_date
999999999 1 555555 01-jan-2001
999999999 2 666666 01-jan-2001
999999999 3 444444 01-jan-2001
888888888 1 555555 03-jan-2001
888888888 2 666666 03-jan-2001
But because I want to loop through month by month, I need to set it up so that it loops month by month, year by year (using the Order_date field), and Order_item by Order_item.
So, hardcoded instead of put into a loop, my insert statements would be something like (the source table name below is a placeholder):
insert into orders_normalized
(Customer_id,Order_seq,Order_item,Order_date) select customer_id,1,Order_item_1,Order_date from <source_table> where Order_item_1 is not null and to_char(order_date,'yyyy') = '2001' and to_char(order_date,'mm')='01';
insert into orders_normalized
(Customer_id,Order_seq,Order_item,Order_date) select customer_id,2,Order_item_2,Order_date from <source_table> where Order_item_2 is not null and to_char(order_date,'yyyy') = '2001' and to_char(order_date,'mm')='01';
insert into orders_normalized
(Customer_id,Order_seq,Order_item,Order_date) select customer_id,3,Order_item_3,Order_date from <source_table> where Order_item_3 is not null and to_char(order_date,'yyyy') = '2001' and to_char(order_date,'mm')='01';
insert into orders_normalized
(Customer_id,Order_seq,Order_item,Order_date) select customer_id,1,Order_item_1,Order_date from <source_table> where Order_item_1 is not null and to_char(order_date,'yyyy') = '2001' and to_char(order_date,'mm')='02';
insert into orders_normalized
(Customer_id,Order_seq,Order_item,Order_date) select customer_id,2,Order_item_2,Order_date from <source_table> where Order_item_2 is not null and to_char(order_date,'yyyy') = '2001' and to_char(order_date,'mm')='02';
insert into orders_normalized
(Customer_id,Order_seq,Order_item,Order_date) select customer_id,3,Order_item_3,Order_date from <source_table> where Order_item_3 is not null and to_char(order_date,'yyyy') = '2001' and to_char(order_date,'mm')='02';
Hope this makes sense.
Thanks

Does the sequence of items in an order really matter? In other words, do we really need to preserve that an item was in position 2 versus position 1? I bet that the sequence or position of each item in an order is not meaningful. They were probably numbered 1, 2, and 3 just to make them uniquely named columns, so there would be three slots to hold up to 3 items in the denormalized table.
You only have about 400 million rows to insert, so it could feasibly be done in a single transaction (depending on your database environment).
You can always do a create table as select (CTAS) to help with undo / redo issues and get better performance. You could run it in parallel, and spit it out to a new table partitioned by month. Single DDL statement running in parallel making your new table--sounds good to me.
How about something like this:
CREATE TABLE ORDERS_NORMALIZED
(CUSTOMER_ID, ORDER_ITEM, ORDER_DATE)
PARTITION BY RANGE (ORDER_DATE)
(PARTITION p200901 VALUES LESS THAN (TO_DATE('200902','YYYYMM')),
PARTITION p200902 VALUES LESS THAN (TO_DATE('200903','YYYYMM')),
...
PARTITION p201012 VALUES LESS THAN (TO_DATE('201101','YYYYMM')))
AS SELECT CUSTOMER_ID, ORDER_ITEM_1, ORDER_DATE
FROM OTHER_TABLE
WHERE ORDER_ITEM_1 IS NOT NULL
UNION ALL
SELECT CUSTOMER_ID, ORDER_ITEM_2, ORDER_DATE
FROM OTHER_TABLE
WHERE ORDER_ITEM_2 IS NOT NULL
UNION ALL
SELECT CUSTOMER_ID, ORDER_ITEM_3, ORDER_DATE
FROM OTHER_TABLE
WHERE ORDER_ITEM_3 IS NOT NULL ...
Out of curiosity, why not normalize it further? You could have used two tables instead of one.
One (ORDER) with:
ORDER_ID
CUSTOMER_ID
DATE
Order_id would be a new surrogate key / primary key.
Another table (ORDER_ITEM) with:
ORDER_ID
ORDER_ITEM
It would be a table that links ORDERS to ITEMS. You get the idea. -
How to loop through the "On My Mac" folders in mail?
Hi there - I am new to AppleScript, but am slowly working out how to use it.
I am stumped on how to write a rule that will move a message to a folder identified by a tag in the message (I am using MailTags).
I have a script that will take the first tag and move the message to a mail folder in a fixed location (e.g. Archive/<tag name>); however, I wanted to make it more generic by looping over all of my mail folders and finding one that matches the tag name.
However, I am stumped on how to loop over all of my mail folder names - I have tried:
repeat with aFolder in (every mailbox of (mailbox for "ON MY MAC"))
and
repeat with aFolder in every mailbox of inbox
etc.
but none of these seem to work.
Is there some magic syntax to do this, so that I can do:
if (name of aFolder = msgTag) then
move oneMessage to mailbox (name of aFolder)
end if
Tim

You don't necessarily need to assign a variable to the entire list in order to loop through it (unless you really want to) - for example:
tell application "Mail" to repeat with aFolder in (get mailboxes)
-- do stuff with aFolder
end repeat
There are several AppleScript resources, but the main ones I use are:
AppleScript Language Guide
AppleScript Tutorials at MacScripter.net -
Hello Colleagues,
In BRFPLus I understand we can create Loop Expressions that allow you to loop through a table and perform different Actions based on the retrieved contents of the table.
We are not using BRFPLus (but the old BRF instead). Does anyone know how to build Loop Expressions in BRF without the use of ABAP Function Modules?
Your feedback would be really appreciated.
Thanks in advance.
Regards,
Ivor M. -
SQL Query running really slow, any help in improving will be Great!
Hi,
I am really new to performance tuning and optimization techniques. On explain plans, too, I only have theoretical knowledge, and no clue how to find the real issue that is making the query slow. So if anyone can give me a good direction on where to even start, it would be great.
Now, my issue is, I have a query which runs really, really slowly. If I run it for a small subset of data, it runs fast (it's flying, actually), but if I run the same query for everything (the full required data), it runs for ages. (It has actually been running for 2 days now and is still running.)
I am pasting my query here; the output shows that the query gets stuck after "Table created".
SQL> @routinginfo
Table dropped.
Table created.
Please please help!
I also ran explain plan for this query and there are a number of rows in the plan_table now..
SORRY! Is there a way to insert a file here, as I want to attach my explain plan as well?
My query -Routinginfo.sql
set trimspool on
set heading on
set verify on
set serveroutput on
drop table routinginfo;
CREATE TABLE routinginfo
( POST_TOWN_NAME VARCHAR2(22 BYTE),
DELIVERY_OFFICE_NAME VARCHAR2(40 BYTE),
ROUTE_ID NUMBER(10),
ROUTE_NAME VARCHAR2(40 BYTE),
BUILDING_ID NUMBER(10),
SUB_BUILDING_ID NUMBER(10),
SEQUENCE_NO NUMBER(4),
PERSONAL_NAME VARCHAR2(60 BYTE),
ADDRESS VARCHAR2(1004 BYTE),
BUILDING_USE VARCHAR2(1 BYTE),
COMMENTS VARCHAR2(200 BYTE),
EAST NUMBER(17,5),
NORTH NUMBER(17,5)
);
insert into routinginfo
(post_town_name,delivery_office_name,route_id,route_name,
building_id,sub_building_id,sequence_no,personal_name,
address,building_use,comments,east,north)
select
p.name,
d.name,
b.route_id,
r.name,
b.building_id,
s.sub_build_id,
b.sequence_no,
b.personal_name,
ad.addr_line_1||' '||ad.addr_line_2||' '||ad.addr_line_3||' '||ad.addr_line_4||' '||ad.addr_line_5,
b.building_use,
rtrim(replace(b.comments,chr(10),'')),
b.east,
b.north
from t_buildings b,
(select * from t_sub_buildings where nvl(invalid,'N') = 'N') s,
t_routes r,
t_delivery_offices d,
t_post_towns p,
t_address_model ad
where b.building_id = s.building_id(+)
and s.building_id is null
and r.route_id=b.route_id
and (nvl(b.residential_delivery_points,0) > 0 OR nvl(b.commercial_delivery_points,0) > 0)
and r.delivery_office_id=d.delivery_office_id
--and r.delivery_office_id=303
and D.POST_TOWN_ID=P.post_town_id
and ad.building_id=b.building_id
and ad.sub_building_id is null
and nvl(b.invalid, 'N') = 'N'
and nvl(b.derelict, 'N') = 'N'
union
select
p.name,
d.name ,
b.route_id ,
r.name ,
b.building_id ,
s.sub_build_id ,
NVL(s.sequence_no,b.sequence_no),
b.personal_name ,
ad.addr_line_1||' '||ad.addr_line_2||' '||ad.addr_line_3||' '||ad.addr_line_4||' '||ad.addr_line_5,
b.building_use,
rtrim(replace(b.comments,chr(10),'')),
b.east,
b.north
from t_buildings b,
(select * from t_sub_buildings where nvl(invalid,'N') = 'N') s,
t_routes r,
t_delivery_offices d,
t_post_towns p,
t_address_model ad
where s.building_id = b.building_id
and r.route_id = s.route_id
and (nvl(b.residential_delivery_points,0) > 0 OR nvl(b.commercial_delivery_points,0) > 0)
and r.delivery_office_id=d.delivery_office_id
--and r.delivery_office_id=303
and D.POST_TOWN_ID=P.post_town_id
and ad.building_id=b.building_id
and ad.sub_building_id = s.sub_build_id
and nvl(b.invalid, 'N') = 'N'
and nvl(b.derelict, 'N') = 'N'
union
select
p.name,
d.name,
b.route_id ,
r.name ,
b.building_id,
s.sub_build_id ,
NVL(s.sequence_no,b.sequence_no) ,
b.personal_name ,
ad.addr_line_1||' '||ad.addr_line_2||' '||ad.addr_line_3||' '||ad.addr_line_4||' '||ad.addr_line_5 ,
b.building_use,
rtrim(replace(b.comments,chr(10),'')),
b.east,
b.north
from t_buildings b,
(select * from t_sub_buildings where nvl(invalid,'N') = 'N') s,
t_routes r,
t_delivery_offices d,
t_post_towns p,
t_localities l,
t_localities lo,
t_localities loc,
t_tlands tl,
t_address_model ad
where s.building_id = b.building_id
and s.route_id is null
and r.route_id = b.route_id
and (nvl(b.residential_delivery_points,0) > 0 OR nvl(b.commercial_delivery_points,0) > 0)
and r.delivery_office_id=d.delivery_office_id
--and r.delivery_office_id=303
and D.POST_TOWN_ID=P.post_town_id
and ad.building_id=b.building_id
and ad.sub_building_id = s.sub_build_id
and nvl(b.invalid, 'N') = 'N'
and nvl(b.derelict, 'N') = 'N';
commit;
Edited by: Krithi on 16-Jun-2009 01:48
Edited by: Krithi on 16-Jun-2009 01:51
Edited by: Krithi on 16-Jun-2009 02:44

This link is helpful alright, but as a beginner it is taking me too long to understand. I am going to learn the techniques for sure, though.
For the time being, I am pasting my explain plan for the above query here, in the hope that an expert can really help me with this one.
STATEMENT_ID TIMESTAMP REMARKS OPERATION OPTIONS OBJECT_NODE OBJECT_OWNER OBJECT_NAME OBJECT_INSTANCE OBJECT_TYPE OPTIMIZER SEARCH_COLUMNS ID PARENT_ID POSITION COST CARDINALITY BYTES
06/16/2009 09:33:01 SELECT STATEMENT CHOOSE 0 829,387,159,200 829,387,159,200 3,720,524,291,654,720 703,179,091,122,042,000
06/16/2009 09:33:01 SORT UNIQUE 1 0 1 829,387,159,200 3,720,524,291,654,720 703,179,091,122,042,000
06/16/2009 09:33:01 UNION-ALL 2 1 1
06/16/2009 09:33:01 HASH JOIN 3 2 1 11,209 87,591 15,853,971
06/16/2009 09:33:01 FILTER 4 3 1
06/16/2009 09:33:01 HASH JOIN OUTER 5 4 1
06/16/2009 09:33:01 HASH JOIN 6 5 1 5,299 59,325 6,585,075
06/16/2009 09:33:01 VIEW GEO2 index$_join$_006 6 7 6 1 4 128 1,792
06/16/2009 09:33:01 HASH JOIN 8 7 1 5,299 59,325 6,585,075
06/16/2009 09:33:01 INDEX FAST FULL SCAN GEO2 POST_TOWN_NAME_I NON-UNIQUE ANALYZED 9 8 1 1 128 1,792
06/16/2009 09:33:01 INDEX FAST FULL SCAN GEO2 POST_TOWN_PK UNIQUE ANALYZED 10 8 2 1 128 1,792
06/16/2009 09:33:01 HASH JOIN 11 6 2 5,294 59,325 5,754,525
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_DELIVERY_OFFICES 5 ANALYZED 12 11 1 7 586 10,548
06/16/2009 09:33:01 HASH JOIN 13 11 2 5,284 59,325 4,686,675
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_ROUTES 4 ANALYZED 14 13 1 7 4,247 118,916
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_BUILDINGS 1 ANALYZED 15 13 2 5,265 59,408 3,029,808
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_SUB_BUILDINGS 3 ANALYZED 16 5 2 851 278,442 3,898,188
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_ADDRESS_MODEL 7 ANALYZED 17 3 2 3,034 1,582,421 88,615,576
06/16/2009 09:33:01 NESTED LOOPS 18 2 2 10,217 1 189
06/16/2009 09:33:01 NESTED LOOPS 19 18 1 10,216 1 175
06/16/2009 09:33:01 HASH JOIN 20 19 1 10,215 1 157
06/16/2009 09:33:01 HASH JOIN 21 20 1 6,467 80,873 8,168,173
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_ROUTES 11 ANALYZED 22 21 1 7 4,247 118,916
06/16/2009 09:33:01 HASH JOIN 23 21 2 6,440 80,924 5,907,452
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_BUILDINGS 8 ANALYZED 24 23 1 5,265 59,408 3,029,808
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_SUB_BUILDINGS 10 ANALYZED 25 23 2 851 278,442 6,125,724
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_ADDRESS_MODEL 14 ANALYZED 26 20 2 3,034 556,000 31,136,000
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_DELIVERY_OFFICES 12 ANALYZED 27 19 2 1 1 18
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 DELIVERY_OFFICE_PK UNIQUE ANALYZED 1 28 27 1 1
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_POST_TOWNS 13 ANALYZED 29 18 2 1 1 14
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 POST_TOWN_PK UNIQUE ANALYZED 1 30 29 1 1
06/16/2009 09:33:01 MERGE JOIN CARTESIAN 31 2 3 806,976,583,802 3,720,524,291,567,130 703,179,091,106,188,000
06/16/2009 09:33:01 MERGE JOIN CARTESIAN 32 31 1 16,902,296 73,359,971,046 13,865,034,527,694
06/16/2009 09:33:01 MERGE JOIN CARTESIAN 33 32 1 1,860 1,207,174 228,155,886
06/16/2009 09:33:01 MERGE JOIN CARTESIAN 34 33 1 1,580 20 3,780
06/16/2009 09:33:01 NESTED LOOPS 35 34 1 1,566 1 189
06/16/2009 09:33:01 NESTED LOOPS 36 35 1 1,565 1 175
06/16/2009 09:33:01 NESTED LOOPS 37 36 1 1,564 1 157
06/16/2009 09:33:01 NESTED LOOPS 38 37 1 1,563 1 129
06/16/2009 09:33:01 NESTED LOOPS 39 38 1 1,207 178 12,994
06/16/2009 09:33:01 TABLE ACCESS FULL GEO2 T_SUB_BUILDINGS 17 ANALYZED 40 39 1 851 178 3,916
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_BUILDINGS 15 ANALYZED 41 39 2 2 1 51
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 BUILDING_PK UNIQUE ANALYZED 1 42 41 1 1 31
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_ADDRESS_MODEL 25 ANALYZED 43 38 2 2 1 56
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 MODEL_MODEL2_UK UNIQUE ANALYZED 2 44 43 1 1 1
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_ROUTES 18 ANALYZED 45 37 2 1 1 28
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 ROUTE_PK UNIQUE ANALYZED 1 46 45 1 1
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_DELIVERY_OFFICES 19 ANALYZED 47 36 2 1 1 18
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 DELIVERY_OFFICE_PK UNIQUE ANALYZED 1 48 47 1 1
06/16/2009 09:33:01 TABLE ACCESS BY INDEX ROWID GEO2 T_POST_TOWNS 20 ANALYZED 49 35 2 1 1 14
06/16/2009 09:33:01 INDEX UNIQUE SCAN GEO2 POST_TOWN_PK UNIQUE ANALYZED 1 50 49 1 1
06/16/2009 09:33:01 BUFFER SORT 51 34 2 1,579 60,770
06/16/2009 09:33:01 INDEX FAST FULL SCAN GEO2 LOCAL_COUNTY_FK_I NON-UNIQUE ANALYZED 52 51 1 14 60,770
06/16/2009 09:33:01 BUFFER SORT 53 33 2 1,846 60,770
06/16/2009 09:33:01 INDEX FAST FULL SCAN GEO2 LOCAL_COUNTY_FK_I NON-UNIQUE ANALYZED 54 53 1 14 60,770
06/16/2009 09:33:01 BUFFER SORT 55 32 2 16,902,282 60,770
06/16/2009 09:33:01 INDEX FAST FULL SCAN GEO2 LOCAL_COUNTY_FK_I NON-UNIQUE ANALYZED 56 55 1 14 60,770
06/16/2009 09:33:01 BUFFER SORT 57 31 2 806,976,583,788 50,716
06/16/2009 09:33:01 INDEX FAST FULL SCAN GEO2 TLAND_COUNTY_FK_I NON-UNIQUE ANALYZED 58 57 1 11 50,716
-------------------------------------------------------------
Edited by: Krithi on 16-Jun-2009 02:47 -
I got this really great script that loops through some XML files and does stuff based on the content.
What I would like to do now is have it repeat when it gets to a certain field. For example, my XML file would look like this:
<xml>
<fileName>nameoffile</fileName>
<userEmail>[email protected]</userEmail>
<commandList>
<item>info here</item>
<item>more info here</item>
<item>yet more info</item>
<item>still more info here</item>
</commandList>
</xml>
Then basically I need the AppleScript to see the <commandList> tag and then repeat for each item.
I can't seem to wrap my head around how to make that happen.
Currently I am using this to extract the normal tags from the XML file:
set theXML to choose file
tell application "System Events"
tell XML element 1 of contents of XML file theXML
set the fileName to value of (XML elements whose name is "fileName")
... and so on
end tell
end tell
I believe I have to put an if statement in, saying that if the element name == "commandList" then
repeat with item in commandList
my stuff here.
end repeat
but I'm not sure how to build the statement.
Thanks for any help.
P

Something like this?

set theXML to POSIX path of (choose file)
tell application "System Events"
	tell XML element 1 of contents of XML file theXML -- root element
		set the fileName to value of XML element "fileName"
		get value of XML elements of XML element "commandList" -- sub elements
		repeat with anItem in the result
			log anItem
		end repeat
	end tell
end tell
-
How to loop through XML data in a table of XMLType?
Hi,
I am fairly new to XML document processing in Oracle using PL/SQL.
My DB version: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
I have successfully loaded an xml document into a table using the following two statements:
1) CREATE TABLE mytable2 OF XMLType;
2) INSERT INTO mytable2 VALUES (XMLType(bfilename('IMAGE_FILE_LOC', 'IFTDISB20100330172157C002.xml'), nls_charset_id('AL32UTF8')));
Now I need to traverse through the various nodes within the xml document and extract values of elements of each node. The question I have is:
How do I loop through a node? A VALID record is enclosed within the <checkItem> </checkItem> tags.
Here is a snippet of the data in that xml document:
++++++++++++++++++++++++++++++++++++++++++++++++
<?xml version="1.0" encoding="UTF-8"?>
<bdiData>
<documentControlInfo>
<documentInfo>
<docDescription>Check images and data for Test Company, account number 1234567890</docDescription>
<docID>
<ID>20100330172157</ID>
</docID>
<docModifier>Test Company</docModifier>
<docCreateDate>2010-03-30</docCreateDate>
<docCreateTime>17:21:57-0700</docCreateTime>
<standardVersion>1.0</standardVersion>
<testIndicator>0</testIndicator>
<resendIndicator>0</resendIndicator>
</documentInfo>
<sourceInfo>
<sourceName>The Bank IFT</sourceName>
<sourceID>
<idOther>TheBankIFT</idOther>
</sourceID>
</sourceInfo>
<destinationInfo>
<destinationName>Test Company</destinationName>
<destinationID>
<idOther>FEI3592</idOther>
</destinationID>
</destinationInfo>
</documentControlInfo>
<checkItemCollection>
<collectionInfo>
<description>Items</description>
<ID>1269994919135</ID>
<Classification>
<classification>Items</classification>
</Classification>
</collectionInfo>
<checkItemBatch>
<checkItemBatchInfo>
<description>Paid Checks</description>
<ID>1269994919135</ID>
<Classification>
<classification>Paid Checks</classification>
</Classification>
</checkItemBatchInfo>
<checkItem>
<checkItemType>check</checkItemType>
<checkAmount>86468</checkAmount>
<postingInfo>
<date>2010-03-29</date>
<RT>10700543</RT>
<accountNumber>1234567890</accountNumber>
<seqNum>009906631032</seqNum>
<trancode>001051</trancode>
<amount>86468</amount>
<serialNumber>300040647</serialNumber>
</postingInfo>
<totalImageViewsDelivered>2</totalImageViewsDelivered>
<imageView>
<imageIndicator>Actual Item Image Present</imageIndicator>
<imageViewInfo>
<Format>
<Baseline>TIF</Baseline>
</Format>
<Compression>
<Baseline>CCITT</Baseline>
</Compression>
<ViewSide>Front</ViewSide>
<imageViewLocator>
<imageRefKey>201003260000738400851844567205_Front.TIF</imageRefKey>
<imageFileLocator>IFTDISB20100330172157M002.zip</imageFileLocator>
</imageViewLocator>
</imageViewInfo>
<imageViewInfo>
<Format>
<Baseline>TIF</Baseline>
</Format>
<Compression>
<Baseline>CCITT</Baseline>
</Compression>
<ViewSide>Rear</ViewSide>
<imageViewLocator>
<imageRefKey>201003260000738400851844567205_Rear.TIF</imageRefKey>
<imageFileLocator>IFTDISB20100330172157M002.zip</imageFileLocator>
</imageViewLocator>
</imageViewInfo>
</imageView>
</checkItem>
<checkItem>
<checkItemType>check</checkItemType>
<checkAmount>045</checkAmount>
<postingInfo>
<date>2010-03-29</date>
<RT>10700543</RT>
<accountNumber>1234567890</accountNumber>
<seqNum>008518967429</seqNum>
<trancode>001051</trancode>
<amount>045</amount>
<serialNumber>200244935</serialNumber>
</postingInfo>
<totalImageViewsDelivered>2</totalImageViewsDelivered>
<imageView>
<imageIndicator>Actual Item Image Present</imageIndicator>
<imageViewInfo>
<Format>
<Baseline>TIF</Baseline>
</Format>
<Compression>
<Baseline>CCITT</Baseline>
</Compression>
<ViewSide>Front</ViewSide>
<imageViewLocator>
<imageRefKey>201003290000713900851896742901_Front.TIF</imageRefKey>
<imageFileLocator>IFTDISB20100330172157M002.zip</imageFileLocator>
</imageViewLocator>
</imageViewInfo>
<imageViewInfo>
<Format>
<Baseline>TIF</Baseline>
</Format>
<Compression>
<Baseline>CCITT</Baseline>
</Compression>
<ViewSide>Rear</ViewSide>
<imageViewLocator>
<imageRefKey>201003290000713900851896742901_Rear.TIF</imageRefKey>
<imageFileLocator>IFTDISB20100330172157M002.zip</imageFileLocator>
</imageViewLocator>
</imageViewInfo>
</imageView>
</checkItem>
<checkItemBatchSummary>
<totalItemCount>1028</totalItemCount>
<totalBatchAmount>61370501</totalBatchAmount>
<totalBatchImageViewsDelivered>2056</totalBatchImageViewsDelivered>
</checkItemBatchSummary>
</checkItemBatch>
<collectionSummary>
<totalBatchCount>1</totalBatchCount>
<totalItemCount>1028</totalItemCount>
<totalCollectionAmount>61370501</totalCollectionAmount>
<totalCollectionImageViewsDelivered>2056</totalCollectionImageViewsDelivered>
</collectionSummary>
</checkItemCollection>
<documentSummaryInfo>
<totalCollectionCount>1</totalCollectionCount>
<totalBatchCount>1</totalBatchCount>
<totalItemCount>1028</totalItemCount>
<totalDocumentAmount>61370501</totalDocumentAmount>
<totalDocumentImageViewsDelivered>2056</totalDocumentImageViewsDelivered>
</documentSummaryInfo>
</bdiData>
++++++++++++++++++++++++++++++++++++++++++++++++
Any ideas and or suggestions will be greatly appreciated.
Cheers!
Edited by: user12021655 on Aug 3, 2010 1:25 PM

I really need to update my blog to get the example you are looking for posted. I did a quick search on the forums for XMLTable and found a good example at {message:id=4325701}. You will want to use OBJECT_VALUE in the PASSING clause where you need to reference the column in your table.
Note: See the FAQ in the upper right for how to use the tag to wrap objects to retain formatting. Also your XML is missing closing nodes. -
After Effects CC really slow in response
Hello, and sorry for my bad English!
So, my nightmares with CC continue!
I have serious lag when I do almost anything in AE. Press CTRL+Z, I have to wait; press CTRL+A... wait; press P... wait; press the spacebar to move the preview window... wait for the hand. So the problem, I think, is clear (by the way, the camera tools are really, really slow!).
My system specs: GTX680 4GB (latest drivers), Intel DX58SO mobo, 12GB of DDR3 RAM in triple channel (fast enough), i7 980X.
With the same setup in CS6 everything works smoothly. This is a fresh install of CC; I have only Encore CS6 on my machine besides it. Everything is up to date. Win 7 Professional SP1.
Multiprocessing is turned off, GPU acceleration on (I use 3000MB texture memory). I have read of many people who have the same problem on the net. I'm opening new projects, not ones from CS6, so that is not the problem. No internet connection, no antivirus scanning.
What may be causing these problems? I'm on the verge of removing AE CC and going back to AE CS6 to work smoothly.
Thank you and hope to find a solution really soon,
bests from Italy,
Roberto

Here's what we know about the issue with slowness when clicking to open a menu or when using a keyboard shortcut. (We're still digging, though, so it may turn out that some of this is incomplete or incorrect.)
Because of a change for After Effects CC (12.0) that shows the activated user in the menus, After Effects is checking a file on disk each time that menus are accessed. Because of the way that we implement keyboard shortcuts, this also means that the file is checked each time a keyboard shortcut is used.
On computers where disk access is slow (e.g., because the disk is slow, the bus is slow, or something else is reading from the disk at the same time), this simple check of a file can add a noticeable amount of time---maybe half a second or even a couple of seconds in some cases.
We are in the process of fixing this. It is my hope (but not a promise) that we'll be able to have a good fix in place for the next update, which we're targeting for September.
In the meantime, here's how you should be able to mitigate the problem:
Make sure that your software is installed on the fastest disk that you have, preferably an SSD. If you don't have an SSD, then at least make sure that your software (applications and OS) is not stored on the same drive from which you're reading footage or to which you're writing output files. And certainly don't run applications with heavy disk access (like WinZip, or anti-virus software that intervenes at every disk access, just for a couple of examples) while you are working with After Effects. (BTW, this is all good advice for high performance even without this bug.)
If you make these changes and notice an improvement, let us know. Actually, let us know either way so that we can collect more information.
So, how did this get past us? I think it's because we tend to test on computers that are set up as I describe above, including using SSDs and spreading our disk access across multiple disks. That said, we do apologize for letting this get out and inconvenience you. -
How to loop through data of ALV grid in Webdynpro for Abap view
hello,
I have a view with an ALV component (SALV_WD_TABLE). I just want to loop through the lines shown (particularly when the user has set a filter).
Can you help me to find how to reach the internal table ?
In summary, I want to know the reverse method of BIND_TABLE()
Thanks!

Not really...
Here is the solution:
" Get the UI info from the ALV interface controller.
DATA lo_ui TYPE if_salv_wd_table=>s_type_param_get_ui_info.
lo_ui = lo_interfacecontroller->get_ui_info( ).

" Indexes of the rows currently displayed (with the user's filter applied).
DATA: lt_displayed TYPE salv_bs_t_int,
      li_displayed TYPE int4.
lt_displayed = lo_ui-t_displayed_elements.
CHECK NOT lt_displayed IS INITIAL.

LOOP AT lt_displayed INTO li_displayed.
  " li_displayed is the index of a displayed row; read that row from your
  " source table here, e.g. READ TABLE lt_data INDEX li_displayed.
ENDLOOP.
Regards -
I need a program where I can remove objects and touch up dead grass, etc.
Reviews say Aperture runs really slow with Lion, any suggestions?
To add on to Corky02's answer:
The recent Aperture release, 3.3.x, is very efficient on both Lion and Mountain Lion, but you need decent hardware to support it. Older Macs will have problems with the storage and CPU requirements. For good performance on large raw images, plenty of RAM is important: at least 4 GB, though 8 GB would be much better. Your library should also be on a fast disk, preferably an internal drive, and don't let the system drive get too full.
Many reports about Aperture being slow come down to putting the Aperture library on slow disks or accessing the original image files over the network, insufficient RAM, or corrupted or ill-designed Aperture databases.
Aperture excels at all kinds of image processing that can be considered image development: raw processing and color/lighting adjustments. It does not do compositing. If you are shooting raw and want professional image development, I'd recommend doing it in Aperture rather than iPhoto; you will have much more control over it in Aperture (after a steep learning curve). You can also repair and retouch images to correct minor blemishes. For graphics compositing and inpainting, you can set up an external editor and send your images from Aperture to it; any of the editors Corky recommended would be fine for this.
To see whether you want Aperture's advanced image processing, or whether iPhoto's easier, more basic options will be sufficient for you, have a look at the Aperture User Manual or the tutorial on the support page: Aperture Support
But Aperture is a professional application, not "plug and play". You should only consider it if you are willing to spend some time learning to use it properly and to work your way through the manual.
Regards
Léonie -
Images load really slow after using Reduce File Size...
After using the "Reduce File Size..." feature in Acrobat 9 on Windows Vista, some of the images load really slowly as I page through the document. Is there some kind of compression setting that I should look for to prevent the images from being affected by this?
For some of my documents it will take 30 to 60 seconds to load an image. Sometimes I can't do anything with the document until the graphic loads entirely.

Can you post an example of a file that has this issue? Personally, I recommend using the PDF Optimizer; it gives better control over the final PDF file.
-
WiFi *really* slow to connect on wake
I recently moved to a new neighborhood and now my wifi is really slow to connect when I open up my MacBook from sleep mode.
My old place had other networks in the vicinity, but about half as many as there are available at my new place.
Either way, it's really becoming a pain, as I have become very used to opening my MacBook and being instantly online.
I have power cycled my router and modem and done everything short of resetting the firmware.
I am using a standard Linksys WRT-54G and it's served me fine for years, as has this MacBook.
Any ideas on how to remedy this?
Nejat
Message was edited by: nugx

How long is this "really slow"? Several minutes?
You may want to look through System Preferences > Network and take a look at the "Advanced" options for AirPort. It might be worth clearing out all preferred networks and getting them reset.
~Lyssa -
How do I loop through AFrames?
I feel dumb asking this, but I really think the code I have should work. All I want to do is loop through all the AFrames in a document. To do this, I assign the first AFrame to a variable named vAFrame. Then I created a while loop where the test is vAFrame.ObjectValid(). However, the while loop never tests true, even though the data browser shows that the vAFrame variable contains a valid object AND it supports the ObjectValid() method AND the valid object is an AFrame. I must be missing something really obvious here. Any ideas?
main();

function main() {
    var vDoc = app.ActiveDoc;
    var vFlow = vDoc.MainFlowInDoc;
    var vTextFrame = vFlow.FirstTextFrameInFlow;
    var vAFrame = vTextFrame.FirstAFrame;
    while (vAFrame.ObjectValid()) {
        vAFrame = vAFrame.NextAFrame;
    }
}

I am heads down on a project, so I can't give you much code right now, but I can point you in the right direction. The method you are using only works for a single text frame, so you would also have to include a loop for all of the text frames in the flow. A better approach is to get a list of FrameAnchor items from the main flow of the document. Then you can loop through the text items to process each anchored frame.
// Set a variable for the main flow in the document.
var flow = doc.MainFlowInDoc;
// Get a list of the anchored frames in the flow.
var textItems = flow.GetText(Constants.FTI_FrameAnchor);
// Loop through the anchored frames.
for (var i = 0; i < textItems.len; i += 1) {
    var aFrame = textItems[i].obj;
    // Do something with the anchored frame here.
}
Note that this will only get anchored frames in the main flow itself; it will skip anchored frames that are inside table cells or nested in other anchored frames. Please let me know if you have any questions or comments.
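To make the pitfall concrete outside FrameMaker, here is a plain Node.js mock; the text-frame and anchored-frame objects below are simulated stand-ins, not the real FrameMaker API. It shows why walking only the first text frame's FirstAFrame/NextAFrame chain misses frames that a pass over every text frame (or the flat FrameAnchor list) finds:

```javascript
// Simulated stand-in for a FrameMaker text frame: each one holds a linked
// list of anchored frames, mirroring FirstAFrame/NextAFrame.
function makeTextFrame(names) {
  let head = null;
  for (let i = names.length - 1; i >= 0; i--) {
    head = { name: names[i], next: head };
  }
  return { firstAFrame: head };
}

// A flow with two text frames; anchored frame 'c' lives in the second one.
const flow = [makeTextFrame(['a', 'b']), makeTextFrame(['c'])];

// Walking only the first text frame (like using FirstTextFrameInFlow alone).
function firstFrameOnly(flow) {
  const found = [];
  for (let f = flow[0].firstAFrame; f; f = f.next) found.push(f.name);
  return found; // never sees 'c'
}

// Looping over every text frame finds all anchored frames, which is what
// the flat FrameAnchor list gives you in a single pass.
function allFrames(flow) {
  const found = [];
  for (const tf of flow) {
    for (let f = tf.firstAFrame; f; f = f.next) found.push(f.name);
  }
  return found;
}
```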
Rick Quatro -
Elements 13 works really slow in editing. Can someone help? I have OS X Yosemite 10
First of all, did you have PSE installed before updating to 10.10? If so, you need to do this:
A Reminder for Mac Folks upgrading to Yosemite | Barbara's Sort-of-Tech Blog
Also, be sure you have updated to 10.10.1. And if you have a large number of recently imported images in the Organizer, PSE will be slow until the Organizer is through creating thumbnails for the photos.