Best way of optimizing..

Hi All,
I have a procedure as follows, but I am not sure whether it is the best way of doing it or not.
Can any one of you please advise me on the best way of doing it?
CREATE OR REPLACE procedure xx_test_credit1 (p_credit in number)
is
l_even_first_digit Number;
l_even_second_digit Number;
l_even_third_digit Number;
l_even_fourth_digit Number;
l_even_fifth_digit Number;
l_odd_first_digit Number;
l_odd_second_digit Number;
l_odd_third_digit Number;
l_odd_fourth_digit Number;
l_odd_fifth_digit Number;
l_odd_sixth_digit Number;
l_len_value Number;
l_sum_value Number;
begin
select substr(p_credit,'10',1) into l_even_first_digit from dual;
select substr(p_credit,'8',1) into l_even_second_digit from dual;
select substr(p_credit,'6',1) into l_even_third_digit from dual;
select substr(p_credit,'4',1) into l_even_fourth_digit from dual;
select substr(p_credit,'2',1) into l_even_fifth_digit from dual;
select substr(p_credit,'11',1) into l_odd_first_digit from dual;
select substr(p_credit,'9',1) into l_odd_second_digit from dual;
select substr(p_credit,'7',1) into l_odd_third_digit from dual;
select substr(p_credit,'5',1) into l_odd_fourth_digit from dual;
select substr(p_credit,'3',1) into l_odd_fifth_digit from dual;
select substr(p_credit,'1',1) into l_odd_sixth_digit from dual;
l_even_first_digit:=l_even_first_digit*2;
l_even_second_digit:=l_even_second_digit*2;
l_even_third_digit:=l_even_third_digit*2;
l_even_fourth_digit:=l_even_fourth_digit*2;
l_even_fifth_digit:=l_even_fifth_digit*2;
l_sum_value:=(l_even_first_digit+l_even_second_digit+l_even_third_digit+l_even_fourth_digit+l_even_fifth_digit+
l_odd_first_digit+l_odd_second_digit+l_odd_third_digit+l_odd_fourth_digit+l_odd_fifth_digit+l_odd_sixth_digit);
if mod(l_sum_value,10)=0 then
dbms_output.put_line('The number is valid credit card number');
else
dbms_output.put_line('The number is not valid credit card number');
end if;
select length(p_credit) into l_len_value from dual;
if l_len_value<>11
then
dbms_output.put_line('Please enter 11 digit value');
end if;
end xx_test_credit1;
Thanks a lot.
Zaheer.

How about:
CREATE OR REPLACE PROCEDURE xx_test_credit1 (p_credit IN VARCHAR2) IS
        l_len_value  PLS_INTEGER  := NVL(LENGTH(p_credit), 0);
        l_sum_value  PLS_INTEGER := 0;
BEGIN
        IF l_len_value != 11 THEN
                dbms_output.put_line ('Please enter 11 digit value');
        ELSE
                FOR i IN 1 .. l_len_value
                LOOP
                        l_sum_value := l_sum_value + (TO_NUMBER(SUBSTR(p_credit, i, 1)) * (MOD(i, 2)+1));
                END LOOP;
                IF MOD (l_sum_value, 10) = 0 THEN
                        dbms_output.put_line('The number is valid credit card number');
                ELSE
                        dbms_output.put_line('The number is not valid credit card number');
                END IF;
        END IF;
END;
/
If you are already in PL/SQL it will be more efficient than calling a SELECT statement to do the calculation.
As others have mentioned, you don't need to do:
select substr(p_credit,'10',1) into l_even_first_digit from dual;
Just do:
l_even_first_digit := substr(p_credit,'10',1);
and the same for:
select length(p_credit) into l_len_value from dual;
Just do:
l_len_value := length(p_credit);
Also, the parameter you pass in, "p_credit in number", should not be a "number":
if the card number starts with a leading 0 then the
if l_len_value != 11
test will not work
(if you pass in 00000000001 you really pass in 1 - which is not 11 digits long).
You probably need to pass in a VARCHAR2 instead, to preserve the leading 0's.
Also
substr(p_credit,'10',1)
The second parameter to SUBSTR is a NUMBER, not a string.
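For readers outside PL/SQL, the weighting scheme the rewritten procedure uses can be sketched in Python (a hypothetical helper, mirroring the posted logic as-is; note that the standard Luhn algorithm would additionally subtract 9 from doubled digits greater than 9, which the posted code does not do):

```python
def check_credit(p_credit: str) -> str:
    # Mirror of the PL/SQL above: weight digit i (1-based) by
    # MOD(i, 2) + 1, i.e. odd positions count double, then test mod 10.
    if len(p_credit) != 11:
        return "Please enter 11 digit value"
    total = sum(int(d) * ((i % 2) + 1)
                for i, d in enumerate(p_credit, start=1))
    if total % 10 == 0:
        return "The number is valid credit card number"
    return "The number is not valid credit card number"
```

Taking a VARCHAR2-style string parameter (rather than a number) also sidesteps the leading-zero problem described above.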
GP>

Similar Messages

  • Best way to determine optimal font size given some text in a rectangle

    Hi Folks,
    I have a preview panel in which I am showing some text for the current selected date using a date format.
    I want to increase the size of the applied font so that it scales nicely when the panel in which it is drawn is resized.
    I want to know the best way, in terms of performance, to achieve this. I did some reading about AffineTransform and determining the correct size by checking in a loop, but it does not feel like a good way.
    I would appreciate some tips.
    Cheers.
    Ravi

    import java.awt.*;
    import java.awt.font.*;
    import java.awt.geom.*;
    import javax.swing.*;
    public class ScaledText extends JPanel {
        String text = "Sample String";
        protected void paintComponent(Graphics g) {
            super.paintComponent(g);
            Graphics2D g2 = (Graphics2D)g;
            g2.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                                RenderingHints.VALUE_ANTIALIAS_ON);
            Font font = g2.getFont().deriveFont(16f);
            g2.setFont(font);
            FontRenderContext frc = g2.getFontRenderContext();
            int w = getWidth();
            int h = getHeight();
            float[][] data = {
                { h/8f, w/3f, h/12f }, { h/3f, w/4f, h/8f }, { h*3/4f, w/2f, h/16f }
            };
            for(int j = 0; j < data.length; j++) {
                float y = data[j][0];
                float width = data[j][1];
                float height = data[j][2];
                float x = (w - width)/2f;
                Rectangle2D.Float r = new Rectangle2D.Float(x, y, width, height);
                g2.setPaint(Color.red);
                g2.draw(r);
                float sw = (float)font.getStringBounds(text, frc).getWidth();
                LineMetrics lm = font.getLineMetrics(text, frc);
                float sh = lm.getAscent() + lm.getDescent();
                float xScale = r.width/sw;
                float yScale = r.height/sh;
                float scale = Math.min(xScale, yScale);
                float sx = r.x + (r.width - scale*sw)/2;
                float sy = r.y + (r.height + scale*sh)/2 - scale*lm.getDescent();
                AffineTransform at = AffineTransform.getTranslateInstance(sx, sy);
                at.scale(scale, scale);
                g2.setFont(font.deriveFont(at));
                g2.setPaint(Color.blue);
                g2.drawString(text, 0, 0);
            }
        }
        public static void main(String[] args) {
            JFrame f = new JFrame();
            f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            f.getContentPane().add(new ScaledText());
            f.setSize(400,400);
            f.setLocationRelativeTo(null);
            f.setVisible(true);
        }
    }
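    The heart of the Java example above is the uniform scale factor: measure the string once at a base font size, then shrink or grow it by the smaller of the two width/height ratios so it fits the rectangle without distortion. A minimal sketch of just that computation (function name is mine, for illustration):

```python
def fit_scale(box_w, box_h, text_w, text_h):
    # Same min(xScale, yScale) the Java code computes: the largest
    # uniform scale at which the measured text still fits the box.
    return min(box_w / text_w, box_h / text_h)
```

    Measuring once and deriving a transformed font avoids the loop over candidate sizes mentioned in the question.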

  • For optimal sound, what is the best way to connect an iMac to a high end stereo system?

    If the answer is wired rather than through an Airport Express, what is the best way to wire an iMac to a good stereo? Is there any way to do it other than through the headphones out jack?

    Morning Acousticare,
    Thanks for using Apple Support Communities.
    Thunderbolt cables should not exceed three meters for maximum performance. The Apple Thunderbolt to Thunderbolt cable (2.0 m) is two meters in length.
    For more information on this, take a look at this article:
    About Thunderbolt to Thunderbolt cable (2.0 m)
    http://support.apple.com/kb/HT4614
    Hope this helps,
    Mario

  • Best Buy Delivery - OPTIMA - YUK

    My Experience: 
    5/31/14-bought dishwasher, delivery & set up, scheduled for 6/1/14.  Told phone call would come the night prior with a 2 hr. window.  Did receive email saying delivery was set.
    6/9/14-received 2nd email confirming delivery would come 6/10/14, no phone call.
    6/10/14-called Best Buy - couldn't find the delivery scheduled; on the phone for 1 1/2 hrs., then told by the clerk it wasn't Best Buy's problem: Optima (delivery company) did not pick it up, it wouldn't be delivered, and I needed to call Optima and complain to them.
    6/10/14-called Optima- said they couldn't help it, they didn't have a tech for my area, wouldn't get delivered, but someone from Optima would call me back to resolve, phone call never came.
    6/10/14-Called Best Buy back, talked w/associate, he called Optima, returned my phone call, said he was working on it, I would be called back between 4-6pm, call came @ 730pm.  Sorry no delivery today.  He would work on it @ 800am on 6/11/14
    6/11/14-Called Best Buy @ 1215pm, no call had come to me.  I was told the assc. working the problem would call me back, he did @ 1245pm, couldn't resolve my issue, he's still working on it, but, no delivery today.  Will let me know when it can be delivered.
    Took the day off work for delivery, spent approx. 5 hrs on the phone with various people, washed dishes 3 times as I have a large gathering scheduled for the weekend.  This was all explained to several people to no avail, again, sorry, can't do anything about it.
    DEFINITELY TIME TO RETHINK YOUR DELIVERY CONTRACT WITH "OPTIMA"

    Hi barbella58,
    Wow, this is certainly way more contacts than necessary when making an appliance purchase with Best Buy! While I definitely thank you for making your purchase with us, I sincerely apologize for any delay you’ve experienced with the delivery and setup of your dishwasher. Rearranging your schedule due to an installation appointment can often be difficult, and I further apologize if you made arrangements, only for your appointment to not occur.
    I reached out to Optima on your behalf, and spoke with one of their representatives. They spoke to the previous details about not having a technician in your area, and it sounds like this is the main reason for the delay. Your installation and delivery order is in the hands of Optima’s escalations team, and they generally have a 24-48 hour turnaround time for a response. I was advised that they would be reaching out to you once a proper technician is lined-up to perform the services.
    Please let me know if you don’t hear from Optima, and I’ll happily assist further. Thank you for registering on the forums and sharing your concerns with us, and don’t hesitate to let me know if you need further assistance.
    Sincerely,
    Brian|Senior Social Media Specialist | Best Buy® Corporate

  • What's the best way to dry an iPhone that was briefly in water?  Rice? How long?

    iPhone 5 was briefly in water.  I turned it off immediately. What is the best way to dry it?  It's in a bag of rice right now.  If that's good, how long should it be in there before turning it back on?

    If you dropped it into water and it was submersed, even for a few seconds:
    IMMEDIATELY
    Turn it off, remove the sim card.
    Put it in a bowl or box of uncooked rice.
    Leave it there a few days (3 days is optimal).
    Your warranty is voided by exposure to water.  There are two sensors in the iPhone which are white, and turn red or pink when touched by water. 
    Repair coverage? - Check your Service and Support Coverage - https://selfsolve.apple.com/agreementWarrantyDynamic.do.  This will also show the make and model of the device. 

  • Best way to Fetch the record

    Hi,
    Please suggest the best way to fetch the records from the table designed below. It is Oracle 10gR2 on Linux.
    Whenever a client visits the office, a record is created for him. Company policy is to keep 10 years of data in the transaction table, and the table accumulates about 3 million records per year.
    The table has the following key Columns for the Select (sample Table)
    Client_Visit
    ID Number(12,0) --sequence generated number
    EFF_DTE DATE --effective date of the customer (sometimes the client becomes invalid and he will be valid again)
    Create_TS Timestamp(6)
    Client_ID Number(9,0)
    Cascade_Flg varchar2(1)
    On most of the reports the records are fetched by Max(eff_dte) and Max(create_ts) and cascade flag ='Y'.
    I have the following queries, but both are not cost effective and take 8 minutes to display the records.
    Code 1:
    SELECT au_subtyp1.au_id_k,
           au_subtyp1.pgm_struct_id_k
      FROM au_subtyp au_subtyp1
     WHERE au_subtyp1.create_ts =
              (SELECT MAX (au_subtyp2.create_ts)
                 FROM au_subtyp au_subtyp2
                WHERE au_subtyp2.au_id_k = au_subtyp1.au_id_k
                  AND au_subtyp2.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                  AND au_subtyp2.eff_dte =
                         (SELECT MAX (au_subtyp3.eff_dte)
                            FROM au_subtyp au_subtyp3
                           WHERE au_subtyp3.au_id_k = au_subtyp2.au_id_k
                             AND au_subtyp3.create_ts < TO_DATE ('2013-01-01', 'YYYY-MM-DD')
                             AND au_subtyp3.eff_dte <= TO_DATE ('2012-12-31', 'YYYY-MM-DD')))
       AND au_subtyp1.exists_flg = 'Y'
    Explain Plan
    Plan hash value: 2534321861
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  1 |  FILTER                  |           |       |       |       |            |          |
    |   2 |   HASH GROUP BY          |           |     1 |    91 |       | 33265   (2)| 00:06:40 |
    |*  3 |    HASH JOIN             |           |  1404K|   121M|    19M| 33178   (1)| 00:06:39 |
    |*  4 |     HASH JOIN            |           |   307K|    16M|  8712K| 23708   (1)| 00:04:45 |
    |   5 |      VIEW                | VW_SQ_1   |   307K|  5104K|       | 13493   (1)| 00:02:42 |
    |   6 |       HASH GROUP BY      |           |   307K|    13M|   191M| 13493   (1)| 00:02:42 |
    |*  7 |        INDEX FULL SCAN   | AUSU_PK   |  2809K|   125M|       | 13493   (1)| 00:02:42 |
    |*  8 |      INDEX FAST FULL SCAN| AUSU_PK   |  2809K|   104M|       |  2977   (2)| 00:00:36 |
    |*  9 |     TABLE ACCESS FULL    | AU_SUBTYP |  1404K|    46M|       |  5336   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("AU_SUBTYP1"."CREATE_TS"=MAX("AU_SUBTYP2"."CREATE_TS"))
       3 - access("AU_SUBTYP2"."AU_ID_K"="AU_SUBTYP1"."AU_ID_K")
       4 - access("AU_SUBTYP2"."EFF_DTE"="VW_COL_1" AND "AU_ID_K"="AU_SUBTYP2"."AU_ID_K")
       7 - access("AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd
                  hh24:mi:ss') AND "AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
           filter("AU_SUBTYP3"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND
                  "AU_SUBTYP3"."EFF_DTE"<=TO_DATE(' 2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
       8 - filter("AU_SUBTYP2"."CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00')
       9 - filter("AU_SUBTYP1"."EXISTS_FLG"='Y')
    Code 2:
    I already raised a thread a week back and Dom suggested the following query; it is cost effective, but the performance is the same and it uses the same amount of temp tablespace:
    select au_id_k,pgm_struct_id_k from (
    SELECT au_id_k
          ,      pgm_struct_id_k
          ,      ROW_NUMBER() OVER (PARTITION BY au_id_k ORDER BY eff_dte DESC, create_ts DESC) rn,
          create_ts, eff_dte,exists_flg
          FROM   au_subtyp
          WHERE  create_ts < TO_DATE('2013-01-01','YYYY-MM-DD')
          AND    eff_dte  <= TO_DATE('2012-12-31','YYYY-MM-DD') 
          ) d  where rn =1   and exists_flg = 'Y'
    --Explain Plan
    Plan hash value: 4039566059
    | Id  | Operation                | Name      | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT         |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  1 |  VIEW                    |           |  2809K|   168M|       | 40034   (1)| 00:08:01 |
    |*  2 |   WINDOW SORT PUSHED RANK|           |  2809K|   133M|   365M| 40034   (1)| 00:08:01 |
    |*  3 |    TABLE ACCESS FULL     | AU_SUBTYP |  2809K|   133M|       |  5345   (2)| 00:01:05 |
    Predicate Information (identified by operation id):
       1 - filter("RN"=1 AND "EXISTS_FLG"='Y')
       2 - filter(ROW_NUMBER() OVER ( PARTITION BY "AU_ID_K" ORDER BY
                  INTERNAL_FUNCTION("EFF_DTE") DESC ,INTERNAL_FUNCTION("CREATE_TS") DESC )<=1)
       3 - filter("CREATE_TS"<TIMESTAMP' 2013-01-01 00:00:00' AND "EFF_DTE"<=TO_DATE('
                   2012-12-31 00:00:00', 'syyyy-mm-dd hh24:mi:ss'))
    Thanks,
    Vijay
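    For what it's worth, the filter both queries express - keep, per au_id_k, the row with the greatest (eff_dte, create_ts), then require exists_flg = 'Y' - can be sketched outside the database to pin down the intended semantics (a hypothetical helper, not a replacement for the SQL):

```python
def latest_per_key(rows):
    # rows: dicts with keys au_id_k, eff_dte, create_ts, exists_flg.
    # Keeps the row with the greatest (eff_dte, create_ts) per au_id_k,
    # then filters on exists_flg = 'Y', like the ROW_NUMBER() rn = 1
    # analytic query above.
    best = {}
    for r in rows:
        order_key = (r["eff_dte"], r["create_ts"])
        k = r["au_id_k"]
        if k not in best or order_key > best[k][0]:
            best[k] = (order_key, r)
    return [r for _, r in best.values() if r["exists_flg"] == "Y"]
```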

    Hi Justin,
    Thanks for your reply. I am running this on our test environment as I don't want to run it on the production environment now. The test environment holds 2,809,605 records (about 2.8 million).
    The query returns 281,699 rows (about 280 thousand), so the selectivity is 0.099. There are 2,808,905 distinct (create_ts, eff_dte, exists_flg) combinations. I am sure the index scan is not going to help much, as you said.
    The core problem is that both queries use a lot of temp tablespace. When we use this query to join to other tables (which have the same design), the temp tablespace usage grows even bigger.
    Both the production and test environment are 3 Node RAC.
    First Query...
    CPU used by this session     4740
    CPU used when call started     4740
    Cached Commit SCN referenced     21393
    DB time     4745
    OS Involuntary context switches     467
    OS Page reclaims     64253
    OS System time used     26
    OS User time used     4562
    OS Voluntary context switches     16
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     2487
    bytes sent via SQL*Net to client     15830
    calls to get snapshot scn: kcmgss     37
    consistent gets     52162
    consistent gets - examination     2
    consistent gets from cache     52162
    enqueue releases     19
    enqueue requests     19
    enqueue waits     1
    execute count     2
    ges messages sent     1
    global enqueue gets sync     19
    global enqueue releases     19
    index fast full scans (full)     1
    index scans kdiixs1     1
    no work - consistent read gets     52125
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time cpu     1
    parse time elapsed     1
    physical write IO requests     69
    physical write bytes     17522688
    physical write total IO requests     69
    physical write total bytes     17522688
    physical write total multi block requests     69
    physical writes     2139
    physical writes direct     2139
    physical writes direct temporary tablespace     2139
    physical writes non checkpoint     2139
    recursive calls     19
    recursive cpu usage     1
    session cursor cache hits     1
    session logical reads     52162
    sorts (memory)     2
    sorts (rows)     760
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     1
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     9
    Second Query
    CPU used by this session     1197
    CPU used when call started     1197
    Cached Commit SCN referenced     21393
    DB time     1201
    OS Involuntary context switches     8684
    OS Page reclaims     21769
    OS System time used     14
    OS User time used     1183
    OS Voluntary context switches     50
    SQL*Net roundtrips to/from client     9
    bytes received via SQL*Net from client     767
    bytes sent via SQL*Net to client     15745
    calls to get snapshot scn: kcmgss     17
    consistent gets     23871
    consistent gets from cache     23871
    db block gets     16
    db block gets from cache     16
    enqueue releases     25
    enqueue requests     25
    enqueue waits     1
    execute count     2
    free buffer requested     1
    ges messages sent     1
    global enqueue get time     1
    global enqueue gets sync     25
    global enqueue releases     25
    no work - consistent read gets     23856
    opened cursors cumulative     2
    parse count (hard)     1
    parse count (total)     2
    parse time elapsed     1
    physical read IO requests     27
    physical read bytes     6635520
    physical read total IO requests     27
    physical read total bytes     6635520
    physical read total multi block requests     27
    physical reads     810
    physical reads direct     810
    physical reads direct temporary tablespace     810
    physical write IO requests     117
    physical write bytes     24584192
    physical write total IO requests     117
    physical write total bytes     24584192
    physical write total multi block requests     117
    physical writes     3001
    physical writes direct     3001
    physical writes direct temporary tablespace     3001
    physical writes non checkpoint     3001
    recursive calls     25
    session cursor cache hits     1
    session logical reads     23887
    sorts (disk)     1
    sorts (memory)     2
    sorts (rows)     2810365
    table scan blocks gotten     23856
    table scan rows gotten     2809607
    table scans (short tables)     1
    user I/O wait time     2
    user calls     11
    workarea executions - onepass     1
    workarea executions - optimal     5
    Thanks,
    Vijay

  • Best way to Insert Records into a DB

    hi
    I have around 400,000 (4 lakh) records to insert into a DB. What's the optimal way to do it?
    I tried doing it using threads and could only gain 2 seconds for the 4 lakh records. Please suggest a better way.

    A very hard thread; the little information you give us does not help me understand the real problem.
    Where do the 400,000 (input) records come from?
    Must those records only be added to the output table?
    How many rows, and how many indexes, does the output table have?
    The cost of each insert also depends on how many indexes the DBMS has to maintain on the output table.
    In general, the time taken to add a large number of rows to a table depends on many variables (hardware performance, DBMS performance and so on).
    If you only have to insert, and your input records are already in another table, I think the fastest way is an insert-select:
    insert into output select * from input
    If your input records are in a text file, the best way is to use the native DBMS importer.
    Let me know something more....
    Regards
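    To make the batching point concrete, here is a minimal sketch using Python's built-in sqlite3 driver (table and column names are made up; the idea - one batched statement and one commit instead of 400,000 single-row round trips - is what matters, whatever the database):

```python
import sqlite3

def bulk_insert(conn, rows):
    # One executemany call instead of one INSERT per row: the driver
    # batches the statements, and committing once (via the connection
    # context manager) avoids per-row transaction overhead.
    with conn:
        conn.executemany("INSERT INTO output (id, val) VALUES (?, ?)", rows)
```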

  • Best way to start a new catalog for 2009 ?

    Hi,
    What's the best way to start a new catalog for the new year, with all the keywords and all the module presets (without having to export all the folders) from the previous 2008 catalog?
    Thanks,
    Dominique

    Well there is a cost too in not being able to find all your images in a single step, and inconsistencies soon develop - eg a keyword is plural in one catalogue, singular in another. Speed or stability issues are not simply related to catalogue size, and I've seen decently-performing catalogues 50% bigger than yours (as well as slow ones of a few hundred images). Have you optimized the catalogue recently?
    But if you think it's a good idea to fragment control of your picture collection.... Presets will be carried over to a new catalogue as they belong to the machine (unless you have the save with catalogue preference turned on). Keywords can be moved via the Metadata > Export and Import Keywords command. If you have lots of collections and smart collections, then maybe make a copy of your existing catalogue and then remove all the items from it - making sure you don't trash them of course.
    John

  • Best way to extract XML value wiith an xpath

    Hello,
    I wonder what is the best way to extract a text value from an XMLType with an XPath.
    I need to insert a row into a table where the row's data comes from XPath extractions of an XMLType. I do a lot of (approximately 20) calls like:
    EXTRACTVALUE(var.myxmltype , '/an/xpath/to/extract/elem1').
    EXTRACTVALUE(var.myxmltype , '/an/xpath/to/extract/elemI').
    EXTRACTVALUE(var.myxmltype , '/an/xpath/to/extract/elem20').
    inside the insert statement.
    Is this the best way, or is there a more optimal one?
    For example, extracting the node '/an/xpath/to/extract' and, starting from this node, extracting the "elem1", ... , "elemI", ... , "elemN" children.
    Thanks for your help,
    Regards,
    Nicolas

    Hi Nicolas,
    The answer depends on your actual storage method (binary, OR, CLOB?), and db version.
    You can try XMLTable, it might be better in this case :
    SELECT x.elem1, x.elem2, ... , x.elem20
    FROM your_table t
       , XMLTable(
          '/an/xpath/to/extract'
          passing t.myxmltype
          columns elem1  varchar2(30) path 'elem1'
                , elem2  varchar2(30) path 'elem2'
                , elem20 varchar2(30) path 'elem20'
         ) x
    ;
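    The same "locate the parent node once, then read each child relative to it" idea can be sketched outside the database with Python's standard ElementTree (the path and element names are placeholders taken from the question):

```python
import xml.etree.ElementTree as ET

def extract_elems(xml_text, names):
    # Find /an/xpath/to/extract once, then read each child element
    # relative to that node instead of repeating the absolute path
    # 20 times, mirroring what XMLTable does inside the database.
    node = ET.fromstring(xml_text).find("xpath/to/extract")
    return {n: node.findtext(n) for n in names}
```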

  • GAME: best way to MOVE objects: tweenlite them or increment their x any values

    As the title says, I need the best way to move objects in a game. I've always used x and y increments. I read about TweenLite and used it - very easy. So which do I use, taking into account that these games may be ported to mobiles, i.e. which is more optimized? It's a platform-type game. I will have:
    a. A little girl that walks and climbs
    b. Little platforms that slide from side to side - you have to jump on them
    c. The little girl will jump
    d. Little baddies will fly around the screen.
    ALSO, as the girl jumps I would like it to look natural:
    a. Gravity as she falls down
    b. The correct increment on x and y as she jumps.
    Shouldn't there be an ActionScript 3 games forum? Anybody know a good site for that?
    Cheers in advance

    OK - I get the message. I should increment x.
    I must have phrased the question wrongly. I'm just looking at optimization techniques so when games go on mobile they work OK.
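    Whichever way you drive the motion, a natural-looking jump is just per-frame integration: add gravity to the vertical velocity each frame, then add the velocity to the position. A language-neutral sketch of one frame's update (the AS3 version is the same arithmetic in an ENTER_FRAME handler; the gravity value is an illustrative constant to tune):

```python
def step(x, y, vx, vy, gravity=0.5):
    # Per-frame update: gravity accelerates the fall, producing the
    # parabolic arc of a jump; vx stays constant while airborne.
    vy += gravity
    return x + vx, y + vy, vx, vy
```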

  • Best way to network Iphoto

    What is the best way to network iPhoto so that multiple libraries stay up to date and run off the one copy of photos?
    I have a Mac mini as a file server and need to keep multiple libraries working on different Macs.

    There is no way you can "sync" multiple libraries with a backup/syncing program. IMO the optimal way would be to have all of your Macs run the same library located on an external HD connected to the Mac Mini. Or you could partition the Mini's HD and have one partition for the library with the ownership set to be ignored on it. However, as Terence pointed out, iPhoto is not a multi-user application, so only one user could use the library at a time. That would not, however, prevent the other users from perusing the photos with an application like iMedia Browser.
    TIP: For insurance against the iPhoto database corruption that many users have experienced I recommend making a backup copy of the Library6.iPhoto (iPhoto.Library for iPhoto 5 and earlier) database file and keep it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That insures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. It's compatible with iPhoto 6 and 7 libraries and Tiger and Leopard. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
    Note: There is now an Automator backup application for iPhoto 5 that will work with Tiger or Leopard.

  • Best way to update an indicator

    I've attached a very simple vi to demonstrate my question.
    I'm making a test using the state-machine architecture. In the test, there is an indicator on the front panel which is to be updated at various places in the test.
    My question is ... what is the best way to update the value of the indicator? In the vi, I've wired it directly, used a property node and used a local variable, all of which achieve the same result.
    The first way - directly wiring - is obviously the best way if I have access to the input terminal of the indicator (as in state '0').
    But what if I need to update the same indicator from the second or third states? What are my options here?
    This is only a simple demonstration vi, so please don't say 'move the indicator outside the case structure and wire it through a tunnel'; I know I can do that here. My 'real' vi updates the indicator several times within a state, and I currently do this using property nodes. I read somewhere that this isn't very efficient, which is why I'm asking.
    Regards,
    Sebster
    LabVIEW 8.6, WinXP.
    Attachments:
    Update an indicator.vi ‏9 KB

    They look the same but they are implemented very differently. See this thread for some performance numbers.
    The control terminal is the most efficient technique. If you read the docs on creating XControls there is an explicit warning to only use the terminal and in cases where the indicator gets updated in some conditions and not others, we need to move the terminal into a following case and use a boolean to decide if we are writing to the indicator.
    I thought I had this list tagged already but i could not find it so here it goes again.
    In order of speed fastest to slowest.
    1) Terminal (LabVIEW has optimized code that lets the update slip in through a back door).
    2) Local variable, but these require additional copies, so the data has to be copied to each instance of the local.
    3) Property node, which has to use the user interface thread to update. This means waiting for the OS to re-schedule the work after the thread swap.
    Both Locals and Property nodes can result in a Race condition if you use the indicator for data storage. See my signature for a link to avoid Race Conditions using an Action Engine.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Best way to know correct forecast model - process chain set up with multiple forecast models

    Hi Experts,
    I need your help in selecting the best forecast model for our company. We already use several models, and because of the multiple models it takes a very long time for the process chain to finish. There is no documentation on which model was originally used and why. Please help me make our forecasting process smooth.
    - What is the best way to know which forecast model is correct and should be used for our forecasting process?
    - In case multiple forecasting models are really required, please suggest ways to schedule them optimally in the process chain.
    - At times we get messages like "not enough data available" for a specific model - is there any way to avoid this?
    - How can we optimally use parallel processing profiles for the forecasting process in the process chain?
    - Things which should be avoided.
    Request your help, please share your experiences.
    Regards
    NB

    Hi Neelesh,
    There are many points you need to consider to redesign the forecast process for your company/client.
    You need to select the best-suited forecast model first, depending on the business. This has to be well tested & agreed by the business users. Complexity will be an outcome of this exercise with the business users. Best is to give them a brief intro on all available models & then help them select the best one as per their requirement.
    Auto-selection models are generally more time-consuming & should be used only when you have no idea at all of the business/demand pattern.
    Run time will depend on how you are clubbing the CVCs to get the forecast generated & also on parallel processing. For the parallel processing profile you will need to do trial & error testing, along with help from the Basis team on how many free dialog processes are available.
    You can even run many forecast calculations in parallel if the products/CVCs are totally different - as per my personal experience, the maximum run time reduction can be achieved here.
    A daily run is not advisable except for businesses where you have too much dynamism in demand planning, i.e. you expect the demands to change overnight. Most companies run the forecast on a monthly basis, or weekly at the most.
    "Not enough data" will be a problem if irrelevant models are used in the forecast profiles. This means users have not bothered to maintain the data needed for the forecast calculations, or they are not aware of the situation at all. Running such models on a daily basis is not advised at all. In such cases, users should instead use interactive forecasting & save the results.
    Just to give a crude example, we get the forecast calculated on a monthly basis for approximately 400,000 (4 lakh) CVCs in less than 3 hrs using moving average, seasonal linear regression, seasonal trend, and Croston models. We also use parallel profiles everywhere, with 10 blocks & 500 CVCs/block.
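    The block setup described above (fixed-size blocks of CVCs fanned out over a pool of work processes) is configured in an APO parallel processing profile, not hand-coded. Purely as an analogy for how that kind of block-wise parallelism works - all class and method names below are hypothetical, and this is not SAP code - the idea can be sketched in Java:

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Analogy sketch: split a list of work items (think CVCs) into fixed-size
    // blocks, then process each block concurrently on a worker pool.
    public class BlockRunner {

        // Partition 'items' into blocks of at most 'blockSize' elements.
        public static <T> List<List<T>> blocks(List<T> items, int blockSize) {
            List<List<T>> out = new ArrayList<>();
            for (int i = 0; i < items.size(); i += blockSize) {
                out.add(items.subList(i, Math.min(i + blockSize, items.size())));
            }
            return out;
        }

        // Submit each block to the pool (one task per block, like one parallel
        // work process per block) and return the total number of items processed.
        public static int runParallel(List<List<Integer>> blocks, int workers) {
            ExecutorService pool = Executors.newFixedThreadPool(workers);
            try {
                List<Future<Integer>> results = new ArrayList<>();
                for (List<Integer> block : blocks) {
                    // A real task would forecast the block; here we just count it.
                    results.add(pool.submit(block::size));
                }
                int processed = 0;
                for (Future<Integer> f : results) {
                    processed += f.get();
                }
                return processed;
            } catch (InterruptedException | ExecutionException e) {
                throw new RuntimeException(e);
            } finally {
                pool.shutdown();
            }
        }
    }
    ```

    The design point carries over: the block size trades off scheduling overhead (too many tiny blocks) against load imbalance (too few large blocks), which is why trial & error tuning of the profile is suggested above.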
    Hope this helps. Let me know if you have any more questions, & also share the results if you use any of this.
    Regards,
    Rahul

  • Best way to assign timestamp in coherence?

    What is the best way to assign a timestamp to an object when it is added to a cache? Should I just do this or is there something more efficient?
    public class NewOrderTrigger implements MapTrigger {
        public NewOrderTrigger() {
        }
        public void process(MapTrigger.Entry entry) {
            Order o = (Order) entry.getValue();
            o.setSubmittedTime(System.currentTimeMillis());
            entry.setValue(o);
        }
        // ---- hashCode() and equals() must be implemented
        public boolean equals(Object o) {
            return o != null && o.getClass() == this.getClass();
        }
        public int hashCode() {
            return getClass().getName().hashCode();
        }
    }
    Thanks,
    Andrew

    Hi Andrew,
    There are a couple of issues I would like to bring to your attention.
    1. The System.currentTimeMillis() API does not guarantee that time moves forward. It is theoretically possible (e.g. with a time synchronization service) that of two consecutive calls, the second one returns a value less than the first. You could compensate for this effect by using either System.nanoTime() or com.tangosol.util.Base#getSafeTimeMillis().
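    As an aside, the "never goes backwards" guarantee that Base.getSafeTimeMillis() provides can be sketched in plain Java. This is only an illustrative sketch of the idea, not Coherence's actual implementation:

    ```java
    import java.util.concurrent.atomic.AtomicLong;

    // Minimal monotonic wall-clock sketch: never returns a value smaller than
    // a previously returned one, even if the system clock steps backwards.
    public class SafeClock {
        private final AtomicLong last = new AtomicLong(Long.MIN_VALUE);

        public long safeTimeMillis() {
            long now = System.currentTimeMillis();
            // Atomically advance 'last' to max(last, now) and return the result,
            // so a backwards clock step is clamped to the last value handed out.
            return last.accumulateAndGet(now, Math::max);
        }
    }
    ```

    Note that a clamp like this only smooths over backwards steps on one JVM; it does nothing about skew between different machines, which is the separate concern raised in point 3 below about updates landing on different servers.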
    2. The MapTrigger's process always has to deserialize and re-serialize the value. If your object model is POF aware, you could use the PofUpdater instead, optimizing out both steps:
    public void process(MapTrigger.Entry entry) {
        PofUpdater updater = new PofUpdater(PROP_SUBMITTED_TIME);
        updater.update(entry, new Long(Base.getSafeTimeMillis()));
    }
    3. Updates that come from a single client node but belong to different partitions and land on different servers could have significantly different timestamps. Whether or not this is a concern depends on your usage of the SubmittedTime property.
    Regards,
    Gene

  • Best way to transfer a 10g database from HP9000 to Linux Redhat?

    What is the best way to transfer a 10g database from HP9000 to Linux Redhat?

    Hi Bill,
    "What is the best way to transfer a 10g database from HP9000 to Linux Redhat?" Define "best"? There are many choices, each with their own benefits . . .
    Fastest?
    If you are on an SMP server, parallel CTAS over a database link can move large amounts of table data, fast:
    http://www.dba-oracle.com/t_create_table_select_ctas.htm
    I've done 100 gig per hour . . .
    Easiest?
    If you are a beginner, Data Pump is good, and I have some tips on doing it quickly:
    http://www.dba-oracle.com/oracle_tips_load_speed.htm
    Also, make sure to check the Linux kernel settings. I query http://www.tpc.org and search for the server type . . .
    The full disclosure reports show optimal kernel settings.
    Finally, don't forget to set direct I/O in Linux:
    http://www.dba-oracle.com/t_linux_disk_i_o.htm
    Hope this helps . . .
    Donald K. Burleson
    Oracle Press author
    Author of "Oracle Tuning: The Definitive Reference" http://www.rampant-books.com/book_2005_1_awr_proactive_tuning.htm
