One 1242 per 8,000 sq ft?

What estimated output power level should I use on a 1242 with rubber-duck antennas in a warehouse to get one 1242 per 8,000 sq ft, for use with 7921G phones?
I'm guessing 6 mW to 12 mW? I'm guessing that at 50 mW I'd have far too much power for one 1242 per 8,000 sq ft.
I'm aware that a site survey will give exact numbers, but I'm working on an AirMagnet Planner predictive site survey using -67 dBm at 12 mW for cell edges, with 20% overlap, to get one 1242 per 8,000 sq ft.
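As a back-of-the-envelope sanity check on those numbers (free-space only; warehouse racking and stock typically add well over 10 dB of extra loss, which is what pulls a free-space edge of roughly -51 dBm down toward the -67 dBm design target), the dBm/mW conversion and path loss can be sketched as below. The 2.2 dBi dipole gain and channel 1 at 2412 MHz are assumptions:

```python
import math

def mw_to_dbm(mw):
    """Convert transmit power in milliwatts to dBm."""
    return 10 * math.log10(mw)

def fspl_db(distance_m, freq_mhz=2412):
    """Free-space path loss in dB at a given distance and frequency."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# 8,000 sq ft per AP is roughly a circle of radius ~15 m (8,000 sq ft ~ 743 m^2)
radius_m = math.sqrt(8000 * 0.0929 / math.pi)
tx_dbm = mw_to_dbm(12)       # 12 mW is about 10.8 dBm
antenna_gain_dbi = 2.2       # assumed rubber-duck dipole gain
edge_rssi = tx_dbm + antenna_gain_dbi - fspl_db(radius_m)
print(f"cell radius ~ {radius_m:.1f} m, free-space edge RSSI ~ {edge_rssi:.1f} dBm")
```

The gap between that free-space figure and -67 dBm is the fade margin the predictive survey spends on obstructions, which is why 12 mW is a plausible starting point and 50 mW almost certainly too hot.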

Can anyone help me?

Similar Messages

  • Data usage analysis page. Doesn't match current usage page?

    O.K. Here's what I'm going through.
    Verizon tells me I'm way over my 5.0 GB limit. Right now I am close to 9 GB.
    So to check, I click on Usage. This is what I get:
    Data Allowance Exceeded. Additional data usage will be billed at $10 per 1.000 GB. Current Overage = 3.950 GB.
    4.0 overages x $10 = $40.00 in overage fees. You may be able to avoid data overage charges by increasing your data allowance before 11/25/12 and requesting the change be backdated to the beginning of the current bill cycle.
      Base allowance:              5 GB of 5 GB used
      +3 x 1.0 GB Overage Allowance: 3.000 GB / 3.0 GB used
      +1 x 1.0 GB Overage Allowance: 0.950 GB / 1.0 GB used
    Data usage can be measured in kilobytes, megabytes and gigabytes:
    1,024 Kilobytes (KB) = 1 Megabyte (MB); 1,024 Megabytes (MB) = 1 Gigabyte (GB)
    My Usage: 8.950 GB = 9,164.800 MB = 9,384,755.200 KB
    Total Usage: 8.950 GB = 9,164.800 MB = 9,384,755.200 KB
    Next I click View Data Details.
    I get 160 lines of mumbo jumbo. So then I go to Analyze Usage, and it tells me I'm not even close to 1 GB! What's up with that? So I read every single line and notice that the dates run from September to October. I only signed up October 25th. Shouldn't it show my current analysis from October 25th to now? How do I get this number?
    Because within the first 3 days of using Verizon it said I was OVER my 5 GB!
    I only use Facebook, email and web browsing, plus eBay and shopping channels. At the store they told me 5 GB is more than enough. What do I do? Somebody please help?
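    For what it's worth, the portal's numbers are internally consistent: the three "My Usage" figures are one total expressed in binary (1,024-based) units, and the overage arithmetic matches the $40 shown. A quick sketch:

```python
import math

# The three "My Usage" figures are the same total in GB, MB and KB using
# binary (1,024-based) units, and the overage is billed per started GB.
gb = 8.950
mb = gb * 1024          # 9,164.8 MB
kb = mb * 1024          # 9,384,755.2 KB

allowance_gb = 5.0
overage_gb = gb - allowance_gb          # 3.950 GB over
billed_gb = math.ceil(overage_gb)       # billed per started 1.000 GB block
fee = billed_gb * 10                    # $10 per GB
print(f"{mb:,.1f} MB / {kb:,.1f} KB, overage {overage_gb:.3f} GB, fee ${fee}")
```

    So the open question isn't the arithmetic but why the detail report covers a billing window that predates the October 25th activation.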

    Yes, I suppose it is. I have a 1 GB netbook whose wireless card didn't work anymore (or so Verizon thinks), so they upgraded me to a 4G WiFi hotspot, now sitting on my counter. This is the only computer in the house, and we live very rurally (the only reason I went with this). I have two neighbors within 15 acres, and 160 acres behind my house, so I don't think someone is hacking in. Besides, I just got this; I'm not sure how someone could figure out within a day or so that I have one.
    Thanks

  • How do I move a video project from Encore to AIR via Flash Builder?

    I've got a rather complex video project with lots of submenus (that works as a DVD - so it is "debugged") that I need to be able to export from Encore into an app to upload on the iTunes store and Android Market. So far, I've managed to export the project via Encore's Flash output and test the results using a web browser (it works). Everything got saved in a new folder structure with an index.html file pointing to the flashdvd.swf file pointing to the AuthoredContent.xml file that actually holds all the information on the file structure (there are 99 files in the Sources subfolder) and uses the default generated Theme.xml file.
    What I need to do is to somehow import all this into Flash Builder as a project so it can become the source for the various "app" platforms. My questions are:
    1. I've got Flash Builder 4.5 - do I need to upgrade to 4.7 in order to import the project or can this be done in 4.5?
    2. Is there some intermediate step that I'm missing?
    3. Will Flash Builder be able to repackage all these files into the file specs required by each target platform (I'm assuming it will - is that a valid assumption)?
    And moving downstream a bit...
    4. Since this is a video based project with over 200 megabytes of F4V content, how will the content arrive on the end user device? Will it have to be  streamed or can it be downloaded? - this is a big issue should the app become popular and we be forced to maintain a streaming server somewhere. What about server sizing in such an event? For example: is there a way to estimate how much bandwidth is required per 1,000 instances?
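    On question 4, a first-order sizing sketch: what matters for the server is concurrent viewers, not installed copies, so "per 1,000 instances" should be read as 1,000 simultaneous streams. The function and the 700 kbit/s average bitrate below are illustrative assumptions (no CDN, caching or adaptive bitrate):

```python
# First-order server-sizing sketch. Assumptions: every viewer streams the
# F4V content simultaneously at a constant average bitrate, with no CDN,
# caching or adaptive bitrate in front of the server.
def required_mbps(concurrent_viewers, avg_bitrate_kbps):
    """Aggregate outbound bandwidth for N simultaneous streams, in Mbit/s."""
    return concurrent_viewers * avg_bitrate_kbps / 1000

# e.g. 1,000 concurrent viewers at an assumed 700 kbit/s average bitrate
print(required_mbps(1000, 700))  # -> 700.0 Mbit/s
```

    Bundling the media in the app package (or downloading it on first launch) sidesteps this entirely, at the cost of a ~200 MB install.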

    Oh no, I think I clicked Solved, but it's not at all solved, and I don't see a way to reverse it and mark it unsolved.
    I've tried reading the manuals but still don't see the solution.
    I think I've misworded my question. Perhaps I shall have to start a new question.
    What I want to know is very basic:
    How do I take a video I see on YouTube or another site and put it either on my iPod (I guess via iTunes) or into a document or an email?
    Do I copy some link? Copy what exactly? Do I click and drag something?
    I think if I knew how to get a video into iTunes I would be able to sync it.
    But I don't know the first thing: how to get a video into iTunes, or an email, or a document.
    Thank you both.

  • How do I convert an Encore project to an iTunes or Android app?

    I've got a rather complex video project with lots of submenus (that works as a DVD - so it is "debugged") that I need to be able to export from Encore into an app to upload on the iTunes store and Android Market. So far, I've managed to export the project via Encore's Flash output and test the results using a web browser (it works).
    Everything got saved in a new folder structure with an index.html file pointing to a flashdvd.swf file pointing to an AuthoredContent.xml file that actually holds all the information on the file structure (there are 99 files totalling over 200 megabytes at the lowest resolution in the Sources subfolder) and uses a Theme.xml file to generate a background.
    So I need to somehow convert this "stuff" into something that can be uploaded to the appropriate store. I stumbled upon Jeanette Stalions's tutorials on using Flash Builder for this purpose, but can't even figure out how to open the Encore output in some meaningful way using Flash Builder so I can get started. (Yes, the .swf file can be opened, but that doesn't seem to have any particular value since all it does is point to the XML files.)
    1. I've got Flash Builder 4.5 - do I need to upgrade to 4.7 in order to import the project or can this be done in 4.5?
    2. Is there some intermediate step that I'm missing? Do I need to do something in Flash Pro first?
    3. Will Flash Builder be able to repackage all these files into the file specs required by each target platform (I'm assuming it will - is that a valid assumption)?
    And moving downstream a bit...
    4. Since this is a video based project with over 200 megabytes of F4V content, how will the content arrive on the end user device? Will it have to be  streamed or can it be downloaded? - this is a big issue should the app become popular and we be forced to maintain a streaming server somewhere. What about server sizing in such an event? For example: is there a way to estimate how much bandwidth is required per 1,000 instances?
    <moved by mod from flash pro cc forum - kglad>

    Ok, so I opened Flash Builder, created a new project and after a lot of trial & error was able to bring in all the files from the Encore output using import/general/file_system in one chunk. That at least gets me to something that might have a future!
    As to question 1 above: many of the articles I've found in the Developer Center refer to various versions of tools: Flex, Flex Builder (now called Flash Builder), Flex SDK, Spark, ActionScript and Flash Catalyst. The products seem to have undergone quite a few transformations, and specific processes like round-tripping between Catalyst and Flash Builder are now obsolete since Catalyst is a dead product. So does this mean that all (or many) of these confusing tools are now united in Flash Builder 4.7?
    And this begs the question: how does a newcomer get started building an AIR app without having to learn the entire history of Flash and its evolution from a simple display of vector graphics to a Swiss Army knife environment?

  • Help needed: configuring a 2821 to replace a Freebox

    Hello,
    I would like to completely replace my Freebox with the following equipment:
    - A Cisco 2821 router fitted with an ADSL2+ module (HWIC-1ADSL-M)
    I am running the latest Advanced Enterprise Services image, IOS version 15.1(4)M7
    - A Cisco 2960G switch connected to the 2821.
    I am on full local-loop unbundling with the ISP Free (so IP over ATM, I believe?!)
    My line specifications:
    ADSL:
    ======
      Status                         Showtime
      Protocol                       ADSL2+
      Mode                           Interleaved
                             Downstream         Upstream
      ATM rate               6570 kb/s          887 kb/s
      Noise margin           5.40 dB            7.30 dB
      Attenuation            46.00 dB           25.00 dB
      FEC                    5211               27735
      CRC                    76                 0
      HEC                    3                  267
    I want my Cisco router to behave exactly like my Freebox, that is:
    On the WAN side, the router obtains its IP from the DSLAM via the HWIC ADSL module, which is configured with my box's MAC address.
    On the LAN side, the Cisco router serves DHCP with a default gateway of 192.168.1.50, plus the DNS servers learned from the ISP (a /24 LAN), for the devices attached to the switch.
    I will need port forwarding for a few applications (one or two examples would be helpful).
    I access and configure each device over the console port with PuTTY.
    The router's current configuration:
    C2821#show config
    Using 1860 out of 245752 bytes
    ! Last configuration change at 09:19:32 UTC Sun Nov 10 2013
    version 15.1
    service timestamps debug datetime msec
    service timestamps log datetime msec
    no service password-encryption
    hostname C2821
    boot-start-marker
    boot system flash:c2800nm-adventerprisek9-mz.151-4.M7.bin
    boot-end-marker
    logging buffered 4096
    enable secret 5 $**************************************
    enable password ************
    no aaa new-model
    no process cpu autoprofile hog
    dot11 syslog
    ip source-route
    no ip routing
    no ip cef
    no ipv6 cef
    multilink bundle-name authenticated
    voice-card 0
    crypto pki token default removal timeout 0
    license udi pid CISCO2821 sn F***********2
    archive
    log config
    hidekeys
    redundancy
    controller DSL 0/1/0
    interface GigabitEthernet0/0
    ip address 192.168.1.50 255.255.255.0
    ip nat inside
    ip virtual-reassembly in
    no ip route-cache
    duplex auto
    speed auto
    no cdp enable
    no mop enabled
    hold-queue 100 out
    interface GigabitEthernet0/1
    no ip address
    no ip route-cache
    shutdown
    duplex auto
    speed auto
    no cdp enable
    interface ATM0/0/0
    mac-address 0007.****.****
    no ip address
    no ip redirects
    no ip proxy-arp
    no ip route-cache
    no atm ilmi-keepalive
    interface ATM0/0/0.1 point-to-point
    ip address dhcp
    ip nat outside
    ip virtual-reassembly in
    no ip route-cache
    pvc 8/36
    encapsulation aal5mux ip
    ip forward-protocol nd
    ip http server
    no ip http secure-server
    ip nat inside source list 1 interface ATM0/0/0.1 overload
    ip route 0.0.0.0 0.0.0.0 ATM0/0/0.1
    access-list 1 permit any
    snmp-server community public RO
    control-plane
    mgcp profile default
    line con 0
    exec-timeout 0 0
    line aux 0
    line vty 0 4
    session-timeout 9999
    exec-timeout 9999 0
    password ************
    login
    transport input all
    scheduler allocate 20000 1000
    end
    I deliberately masked the passwords, MAC address and serial number.
    Note that the command "dsl operating-mode auto" does not stick when I enter it on interface ATM0/0/0, but according to the Cisco documentation the interface appears to default to auto anyway.
    The connection to the DSLAM seems to come up correctly: the module LED blinks green, then goes solid, with no repeated disconnects/reconnects.
    Yet the connection does not work... here is what I get when I query the interfaces:
    C2821>show interface ATM0/0/0.1
    ATM0/0/0.1 is up, line protocol is up
      Hardware is HWIC-DSLSAR (with Alcatel ADSL Module), address is 0007.****.**** (bia dc7b.94d9.5423)
      Internet address will be negotiated using DHCP
      MTU 4470 bytes, BW 900 Kbit/sec, DLY 560 usec,
         reliability 81/255, txload 1/255, rxload 1/255
      Encapsulation ATM
      Keepalive not supported
         530 packets input, 42611 bytes
         15 packets output, 3880 bytes
         0 OAM cells input, 0 OAM cells output
      AAL5 CRC errors : 0
      AAL5 SAR Timeouts : 0
      AAL5 Oversized SDUs : 0
      Last clearing of "show interface" counters never
    C2821>show interface ATM0/0/0
    ATM0/0/0 is up, line protocol is up
      Hardware is HWIC-DSLSAR (with Alcatel ADSL Module), address is 0007.****.**** (bia dc7b.94d9.5423)
      MTU 4470 bytes, sub MTU 4470, BW 900 Kbit/sec, DLY 560 usec,
         reliability 108/255, txload 1/255, rxload 1/255
      Encapsulation ATM, loopback not set
      Keepalive not supported
      Encapsulation(s): AAL5
      23 maximum active VCs, 256 VCs per VP, 1 current VCCs
      VC Auto Creation Disabled.
      VC idle disconnect time: 300 seconds
      Last input 00:00:00, output 00:00:10, output hang never
      Last clearing of "show interface" counters never
      Input queue: 0/75/0/0 (size/max/drops/flushes); Total output drops: 0
      Queueing strategy: Per VC Queueing
      5 minute input rate 2000 bits/sec, 3 packets/sec
      5 minute output rate 0 bits/sec, 0 packets/sec
         739 packets input, 58437 bytes, 0 no buffer
         Received 0 broadcasts (0 IP multicasts)
         0 runts, 0 giants, 0 throttles
         0 input errors, 0 CRC, 0 frame, 0 overrun, 0 ignored, 0 abort
         19 packets output, 4936 bytes, 0 underruns
         0 output errors, 0 collisions, 0 interface resets
         0 unknown protocol drops
         0 output buffer failures, 0 output buffers swapped out
    A "show ip interface brief":
    Interface              IP-Address      OK? Method Status                Protocol
    GigabitEthernet0/0     192.168.1.50    YES NVRAM  up                    up
    GigabitEthernet0/1     unassigned      YES NVRAM  administratively down down
    ATM0/0/0               unassigned      YES NVRAM  up                    up
    ATM0/0/0.1             unassigned      YES DHCP   up                    up
    NVI0                   192.168.1.50    YES unset  up                    up
    I ran two further tests on interface ATM0/0/0.1, once with my WAN IP set statically and once with DHCP => failed in both cases.
    Here are the results:
    1° => Interface ATM0/0/0.1 with a static IP (ip address 88.xx.xx.xx 255.255.255.0)
    C2821#show dsl interface atm0/0/0
    ATM0/0/0
    Alcatel 20190 chipset information
                    ATU-R (DS)                      ATU-C (US)
    Modem Status:    Showtime (DMTDSL_SHOWTIME)
    DSL Mode:        ITU G.992.5 (ADSL2+) Annex A
    ITU STD NUM:     0x03                            0x2
    Chip Vendor ID:  'STMI'                          'BDCM'
    Chip Vendor Specific:  0x0000                    0xA197
    Chip Vendor Country:   0x0F                      0xB5
    Modem Vendor ID: 'CSCO'                          'BDCM'
    Modem Vendor Specific: 0x0000                    0x0000
    Modem Vendor Country:  0xB5                      0xB5
    Serial Number Near:    FO******** 2821
    Serial Number Far:
    ModemChip ID:    C196 (3) capability-enabled
    DFE BOM:         DFE3.0 Annex M (3)
    Capacity Used:   99%                             99%
    Noise Margin:     7.0 dB                          7.0 dB
    Output Power:    18.0 dBm                        12.5 dBm
    Attenuation:     47.0 dB                         26.0 dB
    FEC ES Errors:   22                              12001
    ES Errors:        3                               4
    SES Errors:       1                               0
    LOSES Errors:     1                               0
    UES Errors:       0                              4094
    Defect Status:   None                            None
    Last Fail Code:  None
    Watchdog Counter: 0xC9
    Watchdog Resets: 0
    Selftest Result: 0x00
    Subfunction:     0x00
    Interrupts:      342012 (0 spurious)
    PHY Access Err:  0
    Activations:     1
    LED Status:      OFF
    LED On Time:     0
    LED Off Time:    0
    Init FW:         init_AMR-4.0.015_no_bist.bin
    Operation FW:    AMR-4.0.015.bin
    FW Source:       embedded
    FW Version:      4.0.15
                     DS Channel1      DS Channel0   US Channel1       US Channel0
    Speed (kbps):             0             6016             0               900
    DS User cells:            0             2075
    US User & Idle cells:                                     0           764053
    Reed-Solomon EC:          0               33             0             27746
    CRC Errors:               0                5             0                17
    Header Errors:            0                5             0               273
    Total BER:                0E-0           4531E-9
    Leakage Average BER:      0E-0           4531E-9
    Interleave Delay:         0                8             0                61
                            ATU-R (DS)      ATU-C (US)
    Bitswap:               enabled            enabled
    LOM Monitoring : Disabled
    DMT Bits Per Bin
    000: 0 0 0 0 0 0 0 8 B B B B B B B B
    010: B A A A A A A A A A A B B B B A
    020: 0 6 7 7 7 8 8 8 9 9 9 A A A A A
    030: A A A A 9 9 9 8 7 7 8 9 A A A A
    040: A A A A A 2 A A A A A A A A A A
    050: A A 9 9 9 9 9 9 9 9 9 9 9 9 9 9
    060: 9 9 9 9 9 9 9 9 9 8 9 8 8 8 8 8
    070: 8 8 8 8 8 8 8 8 8 8 8 8 8 8 8 8
    080: 8 8 8 7 8 8 8 8 8 8 7 8 8 7 7 7
    090: 7 7 7 7 7 7 7 7 7 7 7 7 7 7 7 7
    0A0: 7 7 7 7 7 7 7 7 7 7 7 7 7 7 7 7
    0B0: 7 7 7 7 7 6 6 6 6 6 6 6 6 6 6 6
    0C0: 6 6 5 5 5 4 2 0 0 0 2 4 5 5 5 5
    0D0: 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5
    0E0: 5 5 5 5 5 5 5 5 5 5 5 4 4 4 4 4
    0F0: 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4
    100: 4 4 4 4 4 4 4 4 3 3 3 3 3 3 3 3
    110: 3 3 3 2 2 2 2 2 2 2 2 2 2 2 2 2
    120: 2 2 2 2 2 0 0 0 0 2 2 2 2 2 2 2
    130: 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 1
    140: 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0
    150: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    160: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    170: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    180: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    190: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1A0: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1B0: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1C0: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1D0: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1E0: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    1F0: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
    DSL: Training log buffer capability is not enabled
    2° => Interface ATM0/0/0.1 with DHCP
    C2821>show dsl interface atm0/0/0
    ATM0/0/0
    Alcatel 20190 chipset information
                    ATU-R (DS)                      ATU-C (US)
    Modem Status:    Showtime (DMTDSL_SHOWTIME)
    DSL Mode:        ITU G.992.5 (ADSL2+) Annex A
    ITU STD NUM:     0x03                            0x2
    Chip Vendor ID:  'STMI'                          'BDCM'
    Chip Vendor Specific:  0x0000                    0xA197
    Chip Vendor Country:   0x0F                      0xB5
    Modem Vendor ID: 'CSCO'                          'BDCM'
    Modem Vendor Specific: 0x0000                    0x0000
    Modem Vendor Country:  0xB5                      0xB5
    Serial Number Near:    F*********** 2821
    Serial Number Far:
    ModemChip ID:    C196 (3) capability-enabled
    DFE BOM:         DFE3.0 Annex M (3)
    Capacity Used:   99%                             99%
    Noise Margin:     6.5 dB                          7.0 dB
    Output Power:    18.0 dBm                        12.5 dBm
    Attenuation:     47.0 dB                         26.0 dB
    FEC ES Errors:   12001                           262
    ES Errors:        4                               4
    SES Errors:       0                               1
    LOSES Errors:     0                               1
    UES Errors:      4365                             0
    Defect Status:   None                            None
    Last Fail Code:  None
    Watchdog Counter: 0x26
    Watchdog Resets: 0
    Selftest Result: 0x00
    Subfunction:     0x00
    Interrupts:      8816 (0 spurious)
    PHY Access Err:  0
    Activations:     1
    LED Status:      OFF
    LED On Time:     0
    LED Off Time:    0
    Init FW:         init_AMR-4.0.015_no_bist.bin
    Operation FW:    AMR-4.0.015.bin
    FW Source:       embedded
    FW Version:      4.0.15
                     DS Channel1      DS Channel0   US Channel1       US Channel0
    Speed (kbps):             0             6029             0               900
    DS User cells:            0             5771
    US User & Idle cells:                                     0          3241224
    Reed-Solomon EC:          0              348             0              1922
    CRC Errors:               0                4             0                 4
    Header Errors:            0                4             0               101
    Total BER:                0E-0           9037E-10
    Leakage Average BER:      0E-0           9037E-10
                            ATU-R (DS)      ATU-C (US)
    Bitswap:               enabled            enabled
    LOM Monitoring : Disabled
    DMT Bits Per Bin
    Not able to get complete DMT bin information.Please retry "show dsl" after few seconds.
    DSL: Training log buffer capability is not enabled
    I need a bit of help putting the router configuration together; my knowledge is too limited to build a configuration this complex.
    Thanks for your help.

    I am making progress on my DHCP problem. The debug output is interesting: it clearly shows that I am not getting an IP address from Free, and that Cisco IOS presents a client-ID to which it prepends "cisco-" and appends "-AT0/0/0.1".
    That must explain why Free refuses to give me an IP address.
    I also think the MAC address format expected by Free's DHCP server is wrong: it expects a MAC of the form XX:XX:XX:XX:XX:XX, not XXXX.XXXX.XXXX.
    The command "ip address dhcp client-id interface-name" should force the client-ID to a MAC in XX:XX:XX:XX:XX:XX form, but I cannot get this command to work.
    I found this information here => http://blog.ipspace.net/2007/0.....nt-id.html
    I read the syntax carefully, but the router won't take it; it seems the ATM subinterface does not accept it?!
    Could someone help me use it?
    The DHCP debug output is below:
    Code:
    *Nov 17 10:16:33.659: DHCP: DHCP client process started: 10
    *Nov 17 10:16:33.659: RAC: Starting DHCP discover on ATM0/0/0.1
    *Nov 17 10:16:33.659: DHCP: Try 1 to acquire address for ATM0/0/0.1
    *Nov 17 10:16:33.663: DHCP: allocate request
    *Nov 17 10:16:33.663: DHCP: new entry. add to queue, interface ATM0/0/0.1
    *Nov 17 10:16:33.663: DHCP: Client socket is opened
    *Nov 17 10:16:33.663: DHCP: SDiscover attempt # 1 for entry:
    *Nov 17 10:16:33.663: Temp IP addr: 0.0.0.0  for peer on Interface: ATM0/0/0.1
    *Nov 17 10:16:33.663: Temp  sub net mask: 0.0.0.0
    *Nov 17 10:16:33.663:    DHCP Lease server: 0.0.0.0, state: 3 Selecting
    *Nov 17 10:16:33.663:    DHCP transaction id: F9D
    *Nov 17 10:16:33.663:    Lease: 0 secs,  Renewal: 0 secs,  Rebind: 0 secs
    *Nov 17 10:16:33.663:    Next timer fires after: 00:00:04
    *Nov 17 10:16:33.663:    Retry count: 1   Client-ID: cisco-0007.****.**4b-AT0/0/0.1
    *Nov 17 10:16:33.663:    Client-ID hex dump: 63697363****************************2E
    *Nov 17 10:16:33.667:                        6363*********************302F302E31
    *Nov 17 10:16:33.667:    Hostname: C2821
    *Nov 17 10:16:33.667: DHCP: SDiscover placed class-id option: 64736C666F72756D2E6F7267
    *Nov 17 10:16:33.667: DHCP: SDiscover: sending 312 byte length DHCP packet
    *Nov 17 10:16:33.667: DHCP: SDiscover 312 bytes
    *Nov 17 10:16:33.667:             B'cast on ATM0/0/0.1 interface from 0.0.0.0
    *Nov 17 10:16:37.291: DHCP: SDiscover attempt # 2 for entry:
    *Nov 17 10:16:37.291: Temp IP addr: 0.0.0.0  for peer on Interface: ATM0/0/0.1
    *Nov 17 10:16:37.291: Temp  sub net mask: 0.0.0.0
    *Nov 17 10:16:37.291:    DHCP Lease server: 0.0.0.0, state: 3 Selecting
    *Nov 17 10:16:37.291:    DHCP transaction id: F9D
    *Nov 17 10:16:37.291:    Lease: 0 secs,  Renewal: 0 secs,  Rebind: 0 secs
    *Nov 17 10:16:37.291:    Next timer fires after: 00:00:04
    *Nov 17 10:16:37.291:    Retry count: 2   Client-ID: cisco-0007.****.**4b-AT0/0/0.1
    *Nov 17 10:16:37.291:    Client-ID hex dump: 63697363****************************2E
    *Nov 17 10:16:37.291:                        6363*********************302F302E31
    *Nov 17 10:16:37.291:    Hostname: C2821
    *Nov 17 10:16:37.291: DHCP: SDiscover placed class-id option: 64736C666F72756D2E6F7267
    *Nov 17 10:16:37.291: DHCP: SDiscover: sending 312 byte length DHCP packet
    *Nov 17 10:16:37.291: DHCP: SDiscover 312 bytes
    *Nov 17 10:16:37.291:             B'cast on ATM0/0/0.1 interface from 0.0.0.0
    *Nov 17 10:16:41.291: DHCP: SDiscover attempt # 3 for entry:
    *Nov 17 10:16:41.291: Temp IP addr: 0.0.0.0  for peer on Interface: ATM0/0/0.1
    *Nov 17 10:16:41.291: Temp  sub net mask: 0.0.0.0
    *Nov 17 10:16:41.291:    DHCP Lease server: 0.0.0.0, state: 3 Selecting
    *Nov 17 10:16:41.291:    DHCP transaction id: F9D
    *Nov 17 10:16:41.291:    Lease: 0 secs,  Renewal: 0 secs,  Rebind: 0 secs
    *Nov 17 10:16:41.291:    Next timer fires after: 00:00:04
    *Nov 17 10:16:41.291:    Retry count: 3   Client-ID: cisco-0007.****.**4b-AT0/0/0.1
    *Nov 17 10:16:41.291:    Client-ID hex dump: 63697363****************************2E
    *Nov 17 10:16:41.291:                        6363*********************302F302E31
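    For reference, the direction the ipspace.net post suggests would look something like the sketch below. Treat it as untested on this setup: whether an ATM subinterface accepts the client-id keyword on this IOS release is exactly the open question here.

```
interface ATM0/0/0.1
 ! Send the raw MAC address of Gi0/0 as the DHCP client-id instead of the
 ! default "cisco-<mac>-<interface>" string
 ip address dhcp client-id GigabitEthernet0/0
```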

  • JDBC-ODBC Bridge Performance To MS Access

    Hey all, I'm running a Java app that extracts an entire table (set of tables) from Oracle and copies them into Access. The easiest implementation has been to use a DSN-less Type 1 JDBC-ODBC bridge connection to Access, but unfortunately the inserts are taking too long.
    I have tried both Statement and PreparedStatement approaches, with both single-row and batch updates. Some of the performance times on the machine I'm using (PIII 1 GHz with 256 MB RAM) are as follows:
    Via batch: about 8 minutes 15 seconds per 10,000 rows.
    Via single-row updates/inserts: about 52 minutes for 45,000 rows.
    I've tried a lot of batch-size variations, but 10,000 seemed to be the best. The largest table is only 45,000 rows (right now), but has the potential to grow much larger (obviously). This application needs to back up 4 databases, each with N tables (currently N=7, but it will expand). I'm trying to knock down the times and increase performance, but am not sure what is "reasonable" for Type 1 connections with JDBC. I've seen third-party Type 3 drivers for Access, but don't want to run a middle-tier server to filter requests through. I'd rather use Type 4, but can't seem to find any for Access.
    Any suggestions? Recommendations? Let me know! Thanks all! :)
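    To show the batching principle in a self-contained way, here is a sketch using Python's sqlite3 rather than the JDBC-ODBC bridge itself (the 45,000-row figure matches the table size above; the table name is illustrative). The JDBC analogue is addBatch()/executeBatch() on a PreparedStatement with auto-commit off:

```python
import sqlite3

# Illustrates the batch-vs-row-at-a-time principle. One transaction wrapping
# the whole batch avoids a per-row commit, which is usually where the time
# goes in row-at-a-time inserts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE backup_copy (id INTEGER, val TEXT)")

rows = [(i, "row-%d" % i) for i in range(45000)]

with conn:  # one transaction wraps the whole batch
    conn.executemany("INSERT INTO backup_copy VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM backup_copy").fetchone()[0]
print(count)  # -> 45000
```

    Whether the ODBC Access driver actually coalesces the batch into fewer round trips is driver-dependent, which may be why the batch gains plateau around 10,000 rows.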

    Hi,
    If it's a backup/batch process, why are you worrying about performance? It's only an offline process :) Even if it has to be triggered during OLTP, run this task asynchronously.
    I don't know why you chose Java for this, but in any case, prepared statements with non-scrollable result sets will give you better performance than scrollable normal statements.
    Rajesh

  • [AS] How to test the presence of at least one table?

    Hello everyone,
    I would like to test for the presence of at least one table in a document before starting a process (on edge strokes). 
    I found this, but I do not know if this is really effective:
                                  set CountOfTables1 to count of tables of every story
                                  set CountOfTables2 to every table of every story
    The first gives me a list of the number of tables in each story; the second gives me object references for every table.
    Is there another way?
    TIA.
    Oli.

    Marc
    The test I did for nested tables was flawed (a table pasted into a rectangle, and that rectangle pasted into a table). It does not work for nested tables.
    I tested .isValid and it's a lot slower.
    Uwe
    Yes, I noticed that difference after posting my eeakk comment.
    Using slice(0) after getElements can make a big difference, but the simple length is still going to be quicker in this case.
    Also, getElements() without slice(0) will throw "too many elements" if there are too many elements (somewhere between 1,500 and 15,000).
    On the other hand, for repeated access to the variable (looping) it is, as we know, normally quicker to use getElements().slice(0).length than just length.
    In summary:
    1) Your anyItem() method is going to be very quick on a document with a high ratio of stories containing tables.
    2) Although your method could quite possibly be 100 times quicker than mine, I would definitely use my method in terms of typing time and space versus the half second it might save in execution time per 10,000 tables.
    3) The only accurate method (in this thread) for counting the tables, including nested and footnote tables, is Marc's GREP method.
    So I guess the three of us can share first place.
    I just wonder if using the same technique Marc used in our discussion some time back on nested buttons might get a quicker count than the GREP method here.
    This clearly needs to be repeated on different types of document setups; one can try the below:
    if (!app.properties.activeDocument) {alert ("Really!?"); exit()};
    var scriptCount = 1;
    // Script 1 table.length
    var doc = app.activeDocument,
          start = Date.now(),
          t = doc.stories.everyItem().tables.length,
          finish = Date.now();
    $.writeln ("\rtable.length Script (" + scriptCount++ + ") took " + (finish - start) + "ms\r" + ((t) ? t + " Table" + ((t>1) ? "s" : "") : "Diddlysquat"));
    // Script 2 getElements
    var doc = app.activeDocument;
    var start = Date.now();
    var t = doc.stories.everyItem().tables.everyItem().getElements().length;
    var finish = Date.now();
    $.writeln ("\rgetElements Script (" + scriptCount++ + ") took " + (finish - start) + "ms\r" + ((t) ? t + " Table" + ((t>1) ? "s" : "") : "Diddlysquat"));
    // Script 3 getElements.slice(0)
    var doc = app.activeDocument;
    var start = Date.now();
    var t = doc.stories.everyItem().tables.everyItem().getElements().slice(0).length;
    var finish = Date.now();
    $.writeln ("\rgetElements.slice(0) Script (" + scriptCount++ + ") took " + (finish - start) + "ms\r" + ((t) ? t + " Table" + ((t>1) ? "s" : "") : "Diddlysquat"));
    // Script 4      isValid
    var start = Date.now();
    var t = doc.stories.everyItem().tables.everyItem().isValid;
    var finish = Date.now();
    $.writeln ("\risValid Script (" + scriptCount++ + ") took " + (finish - start) + "ms\rThe document contains " + ((t) ? "tables" : "no tables"));
    // Script 5   Marc's Grep
    var start = Date.now();
    var t = countTables();
    var finish = Date.now();
    $.writeln ("\rMarc's Grep Script as said only accurate one but slow (" + scriptCount++ + ") \rtook " + (finish - start) + "ms\r" + ((t) ? t + " Table" + ((t>1) ? "s" : "") : "Diddlysquat"));
    // Script 6 anyItem used a lot
    var start = Date.now();
    var myResult = atLeastOneTableInDoc(app.documents[0]);
    var finish = Date.now();
    $.writeln ("\rUwes Anyone for Bingo Script (" + scriptCount++ + ") took " + (finish - start) + "ms\rThe document contains " + ((myResult) ? "tables" : "no tables"));
    // Script 7 anyItem length
    var start = Date.now();
    var myResult = detectATable();
    var finish = Date.now();
    $.writeln ("\ranyItem length Script (" + scriptCount++ + ") took " + (finish - start) + "ms\rThe document contains " + ((myResult) ? "tables" : "no tables"));
    // Script 8 anyItem elements length
    var start = Date.now();
    var myResult = detectATable2();
    var finish = Date.now();
    $.writeln ("\ranyItem elements length Script (" + scriptCount++ + ") took " + (finish - start) + "ms\rThe document contains " + ((myResult) ? "tables" : "no tables"));
    //FUNCTION USING anyItem() EXTENSIVELY:
    function atLeastOneTableInDoc(myDoc){
        var myResult = 0;
        if(myDoc.stories.length === 0){return myResult};
        var myStories = myDoc.stories;
        //LOOP length == length of all Story objects
        //using anyItem() for checking length of Table objects
        for(var n=0;n<myStories.length;n++){
            if(myStories.anyItem().tables.length > 0){
                myResult = 1;
                return myResult;
            };
        };
        //FALL-BACK, if anyItem() will fail:
        for(var n=0;n<myStories.length;n++){
            if(myStories[n].tables.length > 0){
                myResult = 2;
                return myResult;
            };
        };
        return myResult;
    }; //END function atLeastOneTableInDoc(myDoc)
    function detectATable(){
        var s = app.documents[0].stories;
        if(s.anyItem().tables.length){
            return true; // Bingo
        };
        for(var n=0;n<s.length;n++){
            if(s[n].tables.length) return true;
        };
        return false;
    };
    function detectATable2(){
        var s = app.documents[0].stories;
        if(s.anyItem().tables.length){
            return true; // Bingo
        };
        var sl = app.documents[0].stories.everyItem().getElements().slice(0);
        for(var n=0;n<sl.length;n++){
            if(sl[n].tables.length) return true;
        };
        return false;
    };
    function countTables(){
        app.findTextPreferences = null;
        app.findTextPreferences.findWhat = "\x16";
        var t = app.properties.activeDocument||0;
        return t&&t.findText().length;
    };
    P.s.
    Marc,
    A bit of homework: I didn't test it on the above, but I found that your highRes function timer seems to have a large bias toward the first function in the comparison.
    I was testing 2 prototypes, and swapping their order swapped the result. One can make the prototypes identical and see the time difference.
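    The first-function bias described above is a common micro-benchmark pitfall: warm-up and caching effects penalise whichever candidate happens to run first. A minimal sketch of one workaround, in plain JavaScript (not Marc's actual highRes timer; the harness and its names are illustrative): alternate the run order each round and accumulate totals per candidate.

    ```javascript
    // Order-alternating micro-benchmark sketch: runs each candidate many times,
    // alternating which goes first, so neither candidate systematically
    // benefits from (or pays for) warm-up and caching effects.
    function benchAlternating(fnA, fnB, rounds) {
        var totals = { a: 0, b: 0 };
        for (var i = 0; i < rounds; i++) {
            // swap the order on every other round
            var first  = (i % 2 === 0) ? fnA : fnB;
            var second = (i % 2 === 0) ? fnB : fnA;
            var t0 = Date.now(); first();
            var t1 = Date.now(); second();
            var t2 = Date.now();
            if (i % 2 === 0) { totals.a += t1 - t0; totals.b += t2 - t1; }
            else             { totals.b += t1 - t0; totals.a += t2 - t1; }
        }
        return totals; // total ms per candidate over all rounds
    }
    ```

    With identical prototypes the two totals should come out roughly equal, which is exactly the sanity check described above.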

  • Turning On Diagnostics eats up transactions ( MACommand.xml ) - a serious issue !!!

    We were just trying out the Azure Storage Analytics Service, and something very unusual caught our attention. 
    The transaction count for the diagnostics storage account (the account to which the Diagnostics Service writes its data) was extremely high. We are talking about
    ~600 transactions per hour, all of which are GetBlob() operations, and all of them ended with an error (ClientOtherError is equal to the total number of operations).
    Further investigation revealed that each running instance which has
    Diagnostics turned on produces ~300 transactions per hour
    (we had 2 instances, thus the 600). Continuing the investigation, looking at the $logs that the Analytics Service is producing revealed what's really going on:
    The log is filled with lots of calls to an xml file that doesn't exist. The log file itself is very cluttered, but it's very clear that most of the calls are searching for
    https://*****.blob.core.windows.net/mam/MACommand.xml and also /mam/MACommanda.xml
    and /mam/MACommandb.xml
    all those calls have an error of 404.
    This issue is a real problem for us, and we have no idea what's causing it.
    Has anyone encountered this issue?
    (edit: Forgot to mention, the Diagnostics Service is not logging anything - scheduledTransferPeriod is zero for all the categories)
    (edit: here is some part of the log file, sensitive information is replaced with ****
    1.0;2012-01-06T19:38:21.5182981Z;GetBlob;ClientOtherError;404;4;4;authenticated;****;****;blob;"https://****.blob.core.windows.net/mam/****/****/TestApp.Web/TestApp.Web_IN_0/MACommand.xml";"/****/mam/****/****/TestApp.Web/TestApp.Web_IN_0/MACommand.xml";2608d6f1-1129-4abd-bbb6-2fcc1f7f9315;0;213.199.129.92:49581;2009-09-19;373;0;145;225;0;;;;;;"MAHttpClient";;
    1.0;2012-01-06T19:38:21.5482951Z;GetBlob;ClientOtherError;404;4;4;authenticated;****;****;blob;"https://****.blob.core.windows.net/mam/****/****/TestApp.Web/TestApp.Web_IN_0/MACommanda.xml";"/****/mam/****/****/TestApp.Web/TestApp.Web_IN_0/MACommanda.xml";9387eed2-9065-42af-8b7b-142458d405a1;0;213.199.129.92:49581;2009-09-19;374;0;145;225;0;;;;;;"MAHttpClient";;
    1.0;2012-01-06T19:38:21.5772922Z;GetBlob;ClientOtherError;404;5;5;authenticated;****;****;blob;"https://****.blob.core.windows.net/mam/****/****/TestApp.Web/MACommanda.xml";"/****/mam/****/****/TestApp.Web/MACommanda.xml";0f1e19ae-7796-4fdb-b575-3d115c65ef07;0;213.199.129.92:49581;2009-09-19;357;0;145;225;0;;;;;;"MAHttpClient";;
    (edit: The issue happens regardless of the role type WebRole/WorkerRole)

    After a week trying to figure this out, with the help of
    Farida Bharmal from the Azure support team, we managed to solve the issue. Those calls are expected behavior,
    starting with SDK 1.6.
    Taken from a response Farida got from the product team :
    "I also got a response from
    the product team. There is a change in SDK 1.6. These calls are expected. There are about 20 calls per 5 minutes for every role instance, which means that there will be about 6,000 calls a day. This will cost around a penny every day for every role instance.
    (Storage Transactions = $0.01 per 10,000 transactions)"
    The product team also said that they are looking into optimizing this behavior.
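    The product team's figures are easy to sanity-check with a little arithmetic (a sketch; the per-transaction price and call rate are the ones quoted above, and the 2-instance count is the poster's deployment):

    ```javascript
    // Sanity check of the quoted figures: ~20 calls per 5 minutes per role
    // instance, billed at $0.01 per 10,000 storage transactions.
    var callsPer5Min = 20;
    var callsPerDay = callsPer5Min * (24 * 60 / 5);         // 5,760 — "about 6,000 calls a day"
    var costPerDayPerInstance = callsPerDay / 10000 * 0.01; // ~$0.006/day, i.e. under a penny
    var instances = 2;
    var costPerMonth = costPerDayPerInstance * instances * 30; // ~$0.35/month for 2 instances
    ```

    So the cost is negligible; the real annoyance is the 404 noise these probes leave in the analytics logs.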

  • Delivery split for Replenishment Delivery in VL10B

    Hi Gurus,
    Can you advise me how I can split deliveries (VL10B) based on a maximum volume per delivery?
    For example, we want 1 delivery per 100,000 cubic meters.
    This can't be controlled in configuration.
    Our current process is that, we create one delivery for collective STOs then maintain handling units (through packing) the packaging material controls the volume limit.  After grouping the materials through HU, we then split the delivery in VLSPS per HU.
    Is there a way to split the delivery from its creation to consider the max volume?
    Thanks,
    sdapprentice

    Hi,
    Thanks for your reply.  But are there any other settings you have in mind?
    Example my max delivery volume should be 100,000 cubic meters.
    Open STOs due for delivery
    STO item 10 with 200,000 cubic meters as volume.  So when I run VL10B, I should have 2 deliveries created for STO item 10:
    Deliv 1 - 100,000 cubic meters
    Deliv 2 - 100,000 cubic meters
    Thanks.
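    The split logic being asked for is simple to state, whatever mechanism ends up implementing it in SAP. A sketch in JavaScript (illustrative only; a real solution would live in a delivery-split user exit or copy-control routine, which this does not claim to be):

    ```javascript
    // Illustrative only: split an STO item's total volume into deliveries,
    // each capped at maxVolume (e.g. 100,000 cubic meters).
    function splitByMaxVolume(totalVolume, maxVolume) {
        var deliveries = [];
        var remaining = totalVolume;
        while (remaining > 0) {
            var qty = Math.min(remaining, maxVolume);
            deliveries.push(qty);
            remaining -= qty;
        }
        return deliveries;
    }
    // splitByMaxVolume(200000, 100000) → [100000, 100000], i.e. the two
    // deliveries of 100,000 cubic meters each from the example above
    ```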

  • Adding new segment in IDOC

    Hi,
           I have a requirement where, based on a condition, I need to add a new segment dynamically to an inbound IDoc.
    I have written the code in the user exit of the inbound FM. It adds the new segments and processes them perfectly.
    But when I view the IDoc in WE02 or WE19, I am not able to see the newly added segment.
    Will the segment added in the FM appear in WE02?
    PS: I have changed idoc_control-maxsegnum.
    Regards,
    Niyaz

    Hi Niyaz,
    Check out the program below; it is similar to your requirement.
    IDoc creation from inbound file
    REPORT ZS7BM000006 message-id ZS7.
    */ Program Name: Creation of DESADV & INVOIC IDocs from file E021
    */ Description : This program reads in external file E021 containing
    *                shipping and invoice data from internal vendors and
    *                creates one DESADV and one INVOIC IDoc per invoice.
    */ Transaction : n/a - run from job Z_ccc_S7B_Annnnn, where
    *                'ccc' = 3-digit client and 'nnnnn' = zero-filled
    *                sequence number matching the scheduled job for E020.
    tables:  lfa1,
             lfm1,
             ekpo,
             eine,
             e1edk01,
             e1edk02,
             e1edk07,
             e1edk08,
             e1edk06,
             e1edk03,
             e1edka1,
             e1edka2,
             e1edp07,
             e1edp09,
             e1edp19,
             e1edp01,
             e1edp02,
             e1edp26,
             e1edp04,
             e1eds01,
             e1eds02,
             zst7f_ty_vendors.
    parameters:  p_path like PATH-PATHEXTERN
                       default '/ftp/atac/in/'.
    data:  INFILE LIKE PATH-PATHEXTERN,
           back_path(7) type c value 'backup/',
           offset like sy-fdpos,
           p07_ctr like sy-index,
           invoice_total type p decimals 3,
           d_seg_num like sy-index,
           i_seg_num like sy-index.
    data:  OUTFILE LIKE PATH-PATHEXTERN,
           today(8)     type c.
    data:  begin of uty_vendors occurs 10,
              lifnr like lfa1-lifnr,
              waers like lfm1-waers,
              name_abbr like zst7f_ty_vendors-name_abbr,
              ship_days like zst7f_ty_vendors-ship_days,
           end of uty_vendors.
    data:  iZSS7B21 like ZSS7B21.
    data:  desadvdata like edi_dd occurs 5 with header line.
    data:  invoicdata like edi_dd occurs 5 with header line.
    data:  dedidc like edi_dc occurs 1 with header line.
    data:  iedidc like edi_dc occurs 1 with header line.
    data:  begin of ie021 occurs 10,
            lifnr            like lfa1-lifnr,
            ship_days        like zst7f_ty_vendors-ship_days,
            invoice_no       like e1edk08-vbeln,
            stat             like e1edk01-action,
            po_number(10)    type n,
            po_lineno(5)     type n,
            slip_number      like e1edp09-vbeln,
            shipto_id        like e1edka1-partn,
            vendor_id        like e1edka1-partn,
            endcust_name     like e1edka1-name1,
            cust_partno      like e1edp09-kdmat,  "char 35
            vendor_partno    like e1edp09-matnr,  "char 35
            invoice_qty      like e1edp09-lfimg,
            qty_uom          like e1edp01-menee,
            unit_price       like e1edp01-vprei,
            price_uom        like e1edp01-pmene,
            price_qty        like e1edp01-peinh,
            line_amount      like e1edp26-betrg,
            currency         like e1edk01-curcy,
            etd              like e1edk06-datum, "ship date
            eta              like e1edk06-datum, "delivery date
            ship_id          like e1edk08-traid,
            ship_method      like e1edk08-traty,
            create_date      like e1edk03-datum,
            plant            like ekpo-werks,
           end of ie021.
    data: save_po like ie021-po_number,
          save_line like ie021-po_lineno,
          save_stat like ie021-stat,
          save_invoice like ie021-invoice_no.
    constants: hun_thou type p decimals 5 value '100000',
               thou type p decimals 3 value '1000'.
    *&      DEFINITION:  append_idoc_rec
    *       add a data record to the IDoc internal table
    define append_idoc_rec.
    &1-tabnam = &1-segnam.
    &2_seg_num = &2_seg_num + 1.
    &1-segnum = &2_seg_num.
    shift &1-segnum left deleting leading space.
    append &1.
    clear &1.
    end-of-definition.       " append_idoc_rec
    * MAIN PROCESSING LOOP
    START-OF-SELECTION.
    today = sy-datum.
    * find all internal vendors
    select a~lifnr
           b~waers
           c~name_abbr  c~ship_days
       into corresponding fields of table uty_vendors
         from lfa1 as a
              inner join lfm1 as b
                 on a~lifnr = b~lifnr
              inner join zst7f_ty_vendors as c
                 on a~lifnr = c~lifnr
         where a~ktokk = 'ZZTY' and
               b~ekorg = '7100' and
               c~ship_code = ' '.
    perform init_desadv.
    perform init_invoic.
    concatenate 'SAP' sy-sysid(3) into: iedidc-sndpor, dedidc-sndpor.
    loop at uty_vendors.
      clear ie021. refresh ie021.
      if not uty_vendors-name_abbr is initial.
    * datafiles are received with naming convention:
    * E020_<customer name abbreviation>_UTY
        concatenate p_path 'E021_' uty_vendors-name_abbr '_UTY'
            into infile.
        if not sy-subrc is initial.  "pathname too long
    * Filename too long: &
          message i016 with infile.
          continue.
        endif.
        condense infile.
        OPEN DATASET INFILE FOR INPUT IN TEXT MODE.
        if not sy-subrc is initial.
    *'Cannot open dataset & on &'
          message i013 with infile sy-datum.
          continue.
        else.
          concatenate p_path back_path 'E021_'
              uty_vendors-name_abbr '_UTY' today
                    into outfile.
          if not sy-subrc is initial.  "pathname too long
    * Filename too long: &
            message i016 with outfile.
            continue.
          endif.
          condense outfile.
          OPEN DATASET OUTFILE FOR OUTPUT IN TEXT MODE.
    * if the datestamped file cannot be created, do not process the
    * input file, because the input file is deleted after processing,
    * and there would be no record of the data.
          if not sy-subrc is initial.
    *'ERROR opening file & for output'
            close dataset infile.
            message i033 with outfile.
            continue.  "process next vendor's file
          endif.
          do.
            read dataset infile into izss7b21.
            case sy-subrc.
              when 0.
                transfer izss7b21 to outfile.
                if izss7b21-datacode = 'T'. "trailer rec
                  perform process_one_vendor using infile.
                  exit.  "process next vendor's file
                endif.
                check: izss7b21-datacode = 'A'. "data rec
                case izss7b21-status.
                  when ' '.  "new
                    ie021-stat = '000'.
                  when 'M'.  "modification
                    ie021-stat = '002'.
                  when 'D'.  "deletion
                    ie021-stat = '003'.
                endcase.
                move-corresponding uty_vendors to ie021.
                move-corresponding izss7b21 to ie021.
                perform convert_po_no using izss7b21-pono_poline
                                   changing ie021-po_number
                                            ie021-po_lineno.
                perform convert_dates using ie021-lifnr
                                            izss7b21-etd
                                            izss7b21-eta
                                            izss7b21-ship_method
                                            izss7b21-create_date
                                   changing ie021-eta
                                            ie021-ship_days.
                perform quantity_conversion
                                    using izss7b21-qty_uom
                                          izss7b21-invoice_qty
                                          izss7b21-unit_price
                                    changing ie021-qty_uom
                                             ie021-invoice_qty
                                          izss7b21-line_amount.
                perform money_conversion
                                    using izss7b21-currency
                                          izss7b21-unit_price
                                          izss7b21-price_uom
                                          izss7b21-line_amount
                                    changing ie021-currency
                                             ie021-price_uom
                                             ie021-price_qty
                                             ie021-unit_price
                                             ie021-line_amount.
                perform SAP_vendor_partno
                                    changing ie021-cust_partno.
                append ie021.
              when 4.  "EOF
                perform process_one_vendor using infile.
                exit.  "process next vendor's file
              when others.
    *ERROR reading dataset & - &
                message i015 with infile sy-datum.
                exit.
            endcase.
          enddo.
          close dataset: infile, outfile.
          delete dataset infile.
        endif.
      endif.
    endloop. "UTY_VENDORS
    *&      Form  process_one_vendor
    *       Pre-processed records from one vendor file are now in the
    *       internal table ie021 - ready to create IDocs
    FORM process_one_vendor using value(infile).
      sort ie021 by invoice_no stat po_number po_lineno.
      loop at ie021.
        if ( ie021-invoice_no <> save_invoice or
             ie021-stat <> save_stat ).
          if sy-tabix > 1.
            perform post_idocs using ie021-stat.
          endif.
          perform idoc_header_segs using ie021-stat.
        endif.
        if ( ie021-stat <> save_stat or
             ie021-po_number <> save_po or
             ie021-po_lineno <> save_line or
             ie021-invoice_no <> save_invoice ).
          if ( sy-tabix > 1 and
               ie021-stat = '000' ).
            perform idoc_poheader_segs.
          endif.
        endif.
        perform idoc_item_segs using ie021-stat.
        save_po = ie021-po_number.
        save_line = ie021-po_lineno.
        save_invoice = ie021-invoice_no.
        save_stat = ie021-stat.
      endloop.
      perform post_idocs using ie021-stat.
    * File successfully processed: &
      message s035 with infile.
    ENDFORM.                    " process_one_vendor
    *&      Form  convert_po_no
    *       Break the PO number & line field into separate fields
    FORM convert_po_no using value(infield)
                       changing po_number like ie021-po_number
                                po_line like ie021-po_lineno.
    data:  cpos like sy-fdpos,
           lpos like sy-fdpos,
           cline(6) type c.
    * if the infield contains a hyphen, assume that the preceding characters
    * represent the po number, if they are numeric. The po line number is
    * assumed to be all numeric characters after the hyphen.
      if infield ca '-'.
        if infield(sy-fdpos) co ' 0123456789'.  "numeric
          po_number = infield(sy-fdpos).
          cpos = sy-fdpos + 1.
        endif.
      else.  "no hyphen - PTY
        if infield(2) = '71'.  "SAP number range
          cpos = 10.
        else.                  "SyteLine number
          cpos = 6.
        endif.
        if infield(cpos) co ' 0123456789'.  "numeric
          po_number = infield(cpos).
        endif.
      endif.
      if not po_number is initial.
        while infield+cpos(1) co '0123456789'.
          cline+lpos(1) = infield+cpos(1).
          lpos = lpos + 1.
          cpos = cpos + 1.
        endwhile.
        shift cline left deleting leading '0'.
        if not cline is initial.
          po_line = cline.
        endif.
      endif.
    * Put out a warning in the job log, but create the IDoc to save the data
      if ( po_number is initial or
           po_line is initial ).
    * PO number - line item conversion failed: &
        message i034 with infield.
      endif.
    ENDFORM.                    " convert_po_no
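    For readers following convert_po_no, the same parsing rules can be sketched outside ABAP (illustrative JavaScript; the "71" prefix and the 10-vs-6 digit rule are taken from the form above, and the sample PO value is hypothetical):

    ```javascript
    // Sketch of convert_po_no's rules: a hyphen separates the PO number from
    // the line; without a hyphen, a '71...' prefix means a 10-digit SAP
    // number, otherwise a 6-digit SyteLine number.
    function convertPoNo(infield) {
        var po = "", line = "", cpos;
        var hyphen = infield.indexOf("-");
        if (hyphen >= 0) {
            if (/^[ 0-9]+$/.test(infield.slice(0, hyphen))) { // numeric prefix
                po = infield.slice(0, hyphen);
                cpos = hyphen + 1;
            }
        } else {
            cpos = (infield.slice(0, 2) === "71") ? 10 : 6;
            if (/^[ 0-9]+$/.test(infield.slice(0, cpos))) po = infield.slice(0, cpos);
        }
        if (po) {
            var m = infield.slice(cpos).match(/^[0-9]+/);  // digits after the split point
            if (m) line = m[0].replace(/^0+/, "");         // strip leading zeros
        }
        return { po: po, line: line };  // either may be empty, as the ABAP warns
    }
    ```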
    *&      Form  convert_dates
    *       Convert ship date to delivery date, if necessary
    FORM convert_dates using value(vendor_no)
                             value(i_ship_date)
                             value(i_delivery_date)
                             value(i_ship_code)
                             value(i_create_date)
                    changing o_delivery_date
                             ship_days.
    data:  ship_date type d.
    * if delivery date not sent, calculate it from ship date plus
    * ship days.
    * Note that this logic could leave delivery date blank,
    * if ship date is not numeric.
      if ( i_delivery_date is initial or
           i_delivery_date co ' 0' ).  "no delivery date sent
        if ( i_ship_date co ' 0123456789' and
             i_ship_date cn ' 0' ).    "ship date sent
    * move the ship date into a date field to add days
          ship_date = i_ship_date.
        elseif ( i_create_date co ' 0123456789' and
                 i_create_date cn ' 0' ).
          ship_date = i_create_date.
        endif.
        if not i_ship_code is initial.
          select single ship_days from zst7f_ty_vendors
                   into ship_days
                  where lifnr = vendor_no
                    and ship_code = i_ship_code.
        endif.
        if not ship_date is initial.
          if ship_days > 0.
            ship_date = ship_date + ship_days.
            o_delivery_date = ship_date.
            shift o_delivery_date left deleting leading ' '.
          endif.
        endif.
      else.  "delivery date sent
        o_delivery_date = i_delivery_date.
      endif.
    ENDFORM.                    " convert_dates
    *&      Form  quantity_conversion
    *       The quantities in the input file are implied 3-decimal,
    *       so need to be converted into a "real" number.
    *       Also, the unit of measure may be 'KP' indicating that the qty
    *       is given in thousands.
    FORM quantity_conversion USING    value(i_UOM)
                                      value(i_invoice_qty)
                                      value(i_unit_price)
                        CHANGING o_uom like iE021-qty_UOM
                                 o_invoice_qty like IE021-INVOICE_QTY
                                 c_LINE_AMOUNT like izss7b21-line_amount.
    data:  f_invoice_qty type f.
    data:  n_invoice_qty like lips-kcmeng.
    data:  f_unit_price type f.
    data:  f_line_amt type f.
    data:  n_line_amt0 type p decimals 0.
      if ( i_invoice_qty co ' 0123456789' and
           i_invoice_qty cn ' 0' ).
        f_invoice_qty = i_invoice_qty.
    * if no extended price is sent, calculate it
        if c_line_amount is initial.
    * the qty is implied 3-dec, the price is still implied
    * 5-dec, and line amount should be implied 3-dec.
          f_unit_price = i_unit_price.
          f_line_amt = ( f_invoice_qty * f_unit_price ) / 100000.
          n_line_amt0 = f_line_amt.
          c_line_amount = n_line_amt0.
          shift c_line_amount left deleting leading space.
        endif.
    * if the invoice qty is per 1000, the implied 3-dec times 1000 equals
    * the unconverted value. Otherwise, divide by 1000 to get the PCE qty
        if i_uom = 'KP'.
          n_invoice_qty = f_invoice_qty.
        else.
          n_invoice_qty = f_invoice_qty / thou.
        endif.
      endif.
      o_uom = 'PCE'.
      if not n_invoice_qty is initial.
        o_invoice_qty = n_invoice_qty.
        shift o_invoice_qty left deleting leading space.
      else.
        clear o_invoice_qty.
      endif.
    ENDFORM.                    " quantity_conversion
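    The implied-decimal convention handled above is easy to get wrong, so a worked example may help (illustrative JavaScript; the raw values are hypothetical): quantities arrive with 3 implied decimals, unit prices with 5, and dividing their product by 100000 yields a line amount with 3 implied decimals, matching the comment in the form.

    ```javascript
    // Worked example of the implied-decimal arithmetic in quantity_conversion:
    // (implied 3-dec qty) * (implied 5-dec price) = implied 8-dec product;
    // dividing by 100000 brings it back to an implied 3-dec line amount.
    var rawQty   = "0001500000";   // implied 3-dec → 1,500.000 pieces
    var rawPrice = "0000250000";   // implied 5-dec → 2.50000 per piece
    var lineAmountImplied3 = Math.round(Number(rawQty) * Number(rawPrice) / 100000);
    // 1500000 * 250000 / 100000 = 3750000, i.e. 3,750.000 with 3 implied decimals
    var realQty    = Number(rawQty) / 1000;     // 1500 (unless UOM is 'KP', already per 1,000)
    var realAmount = lineAmountImplied3 / 1000; // 3750, and 1500 * 2.5 = 3750 checks out
    ```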
    *&      Form  money_conversion
    *       Add the implied decimals and store price-per qty, if
    *       price per 1,000 is sent.
    FORM money_conversion USING    value(I_CURR)
                                   value(i_UNIT_PRICE)
                                   value(i_UOM)
                                   value(i_LINE_AMOUNT)
                          CHANGING o_CURRENCY like ie021-currency
                                   o_PRICE_UOM like ie021-price_uom
                                   o_PRICE_QTY like ie021-price_qty
                                   o_UNIT_PRICE like ie021-unit_price
                                   o_LINE_AMOUNT like ie021-line_amount.
    data:  n_unit_price type p decimals 5,
           n_line_amount type p decimals 3.
    * not all of the vendors send the currency code, so use the vendor
    * master default
      case i_curr(2).
        when 'US'.
          o_currency = 'USD'.
        when 'JP'.
          o_currency = 'JPY'.
        when others.
          o_currency = uty_vendors-waers.
      endcase.
    * unit price is implied 5-dec
      if ( i_unit_price cn ' 0' and
           i_unit_price co ' 0123456789' ).
        n_unit_price = i_unit_price.
        n_unit_price = n_unit_price / hun_thou.
      endif.
    * line price is implied 3-dec
      if ( i_line_amount co ' 0123456789' and
           i_line_amount cn ' 0' ).
        n_line_amount = i_line_amount.
        n_line_amount = n_line_amount / thou.
      endif.
    * 'KP' = price per thousand
      if i_uom = 'KP'.
        o_price_qty = '1000'.
      else.
        o_price_qty = '1'.
      endif.
      o_price_uom = 'PCE'.
      if not n_unit_price is initial.
        o_unit_price = n_unit_price.
        shift o_unit_price left deleting leading space.
      else.
        clear o_unit_price.
      endif.
      if not n_line_amount is initial.
        o_line_amount = n_line_amount.
        shift o_line_amount left deleting leading space.
      else.
        clear o_line_amount.
      endif.
    ENDFORM.                    " money_conversion
    *&      Form  SAP_vendor_partno
    *       replace UTY part number sent by vendor with SAP material no.
    *       from PO line item.
    FORM SAP_vendor_partno changing cust_partno like ie021-cust_partno.
    tables: makt.
    data: partno_sent like makt-maktx.
      partno_sent = cust_partno.
      clear: makt, cust_partno.
      select single matnr from ekpo into cust_partno
             where ebeln = ie021-po_number and
                   ebelp = ie021-po_lineno.
      if sy-subrc is initial.
    *compare material description to part number sent by vendor
        select single maktx from makt into makt-maktx
            where matnr = cust_partno.
        if partno_sent <> makt-maktx.
    * 'Part No. Mismatch: PO & - &, Part sent &, SAP mat.no. &'
          message i031 with ie021-po_number ie021-po_lineno
                            partno_sent makt-maktx.
        endif.
      else.  "PO line not found
    *try to find SAP material number using 20-char catalog no. sent
        select single matnr from makt into cust_partno
            where maktx = partno_sent.
        if not sy-subrc is initial.
    * 'SAP material no. not found for & - PO & - &'
          message i032 with partno_sent ie021-po_number ie021-po_lineno.
        endif.
      endif.
    *if not found, IDoc will go to workflow for missing material no.
    ENDFORM.                    " SAP_vendor_partno
    *&      Form  idoc_header_segs
    *       create internal table entries for header segments.
    *  DESADV:
    *          E1EDK07
    *          E1EDKA1
    *          E1EDK03
    *          E1EDK08
    *          E1EDKA2
    *          E1EDK06
    *  INVOIC:
    *          E1EDK01
    *          E1EDKA1(s)
    *          E1EDK02
    *          E1EDK03(s)
    FORM idoc_header_segs using value(desadv_ok).
    * INVOIC
      clear i_seg_num.
      invoicdata-segnam = 'E1EDK01'.
      e1edk01-action = ie021-stat.
      if ie021-currency(2) = 'US'.
        e1edk01-curcy = 'USD'.
      else.
        e1edk01-curcy = 'JPY'.
      endif.
      invoicdata-sdata = e1edk01.
      append_idoc_rec invoicdata i.
      clear e1edka1.
      invoicdata-segnam = 'E1EDKA1'.
      e1edka1-parvw = 'RE'.
      e1edka1-partn = ie021-shipto_id.
      invoicdata-sdata = e1edka1.
      append_idoc_rec invoicdata i.
      clear e1edka1.
      invoicdata-segnam = 'E1EDKA1'.
      e1edka1-parvw = 'LF'.
      e1edka1-partn = ie021-lifnr.
      e1edka1-lifnr = ie021-shipto_id.
      invoicdata-sdata = e1edka1.
      append_idoc_rec invoicdata i.
      if not ie021-endcust_name is initial.
        clear e1edka1.
        invoicdata-segnam = 'E1EDKA1'.
        e1edka1-parvw = 'WE'.
        e1edka1-name1 = ie021-endcust_name.
        invoicdata-sdata = e1edka1.
        append_idoc_rec invoicdata i.
      endif.
      clear e1edk02.
      invoicdata-segnam = 'E1EDK02'.
      e1edk02-qualf = '009'.
      e1edk02-belnr = ie021-invoice_no.
      invoicdata-sdata = e1edk02.
      append_idoc_rec invoicdata i.
      clear e1edk03.
      invoicdata-segnam = 'E1EDK03'.
      e1edk03-iddat = '012'.
      e1edk03-datum = ie021-create_date.
      invoicdata-sdata = e1edk03.
      append_idoc_rec invoicdata i.
      invoicdata-segnam = 'E1EDK03'.
      e1edk03-iddat = '024'.
      invoicdata-sdata = e1edk03.
      append_idoc_rec invoicdata i.
      check desadv_ok = '000'.
    * DESADV
      clear d_seg_num.
      desadvdata-segnam = 'E1EDK07'.
      e1edk07-action = ie021-stat.
      e1edk07-bolnr = ie021-invoice_no.
      desadvdata-sdata = e1edk07.
      append_idoc_rec desadvdata d.
      clear e1edka1.
      desadvdata-segnam = 'E1EDKA1'.
      desadvdata-sdata = e1edka1.
      append_idoc_rec desadvdata d.
      clear e1edk03.
      desadvdata-segnam = 'E1EDK03'.
      desadvdata-sdata = e1edk03.
      append_idoc_rec desadvdata d.
      clear e1edk08.
      desadvdata-segnam = 'E1EDK08'.
      e1edk08-vbeln = ie021-invoice_no.
      e1edk08-traid = ie021-ship_id.
      e1edk08-traty = ie021-ship_method.
      desadvdata-sdata = e1edk08.
      append_idoc_rec desadvdata d.
      clear e1edka2.
      desadvdata-segnam = 'E1EDKA2'.
      desadvdata-sdata = e1edka2.
      append_idoc_rec desadvdata d.
      clear e1edk06.
      desadvdata-segnam = 'E1EDK06'.
      e1edk06-iddat = '025'.  "document date
      e1edk06-datum = ie021-create_date.
      desadvdata-sdata = e1edk06.
      append_idoc_rec desadvdata d.
      if not ie021-eta is initial.
        clear e1edk06.
        desadvdata-segnam = 'E1EDK06'.
        e1edk06-iddat = '001'.  "delivery date
        e1edk06-datum = ie021-eta.
        desadvdata-sdata = e1edk06.
        append_idoc_rec desadvdata d.
      endif.
      if not ie021-etd is initial.
        clear e1edk06.
        desadvdata-segnam = 'E1EDK06'.
        e1edk06-iddat = '010'.  "ship date
        e1edk06-datum = ie021-etd.
        desadvdata-sdata = e1edk06.
        append_idoc_rec desadvdata d.
      endif.
    ENDFORM.                    " idoc_header_segs
    *&      Form  idoc_poheader_segs
    *       create internal table entries for DESADV PO/item segments
    *          E1EDP07
    FORM idoc_poheader_segs.
    *DESADV
      clear e1edp07.
      desadvdata-segnam = 'E1EDP07'.
      e1edp07-bstnk = ie021-po_number.
      e1edp07-posex = ie021-po_lineno.
      desadvdata-sdata = e1edp07.
      append_idoc_rec desadvdata d.
      p07_ctr = p07_ctr + 1.
    ENDFORM.                    " idoc_poheader_segs
    *&      Form  idoc_item_segs
    *       create internal table entries for PO item segments:
    *          DESADV:   E1EDP09
    *          INVOIC:   E1EDP01        Qtys
    *                    E1EDP02        ref nos. (PO number / line)
    *                    E1EDP19        part numbers
    *                    E1EDP26        amounts
    *                    E1EDP04        taxes
    FORM idoc_item_segs using value(desadv_ok).
    data:  n_line_amt  type p decimals 3.
    *INVOIC
      clear e1edp01.
      invoicdata-segnam = 'E1EDP01'.
      e1edp01-menee = ie021-qty_uom.
      e1edp01-menge = ie021-invoice_qty.
      e1edp01-vprei = ie021-unit_price.
      e1edp01-pmene = ie021-price_uom.
      e1edp01-peinh = ie021-price_qty.
      e1edp01-netwr = ie021-line_amount.
      invoicdata-sdata = e1edp01.
      append_idoc_rec invoicdata i.
      clear e1edp02.
      invoicdata-segnam = 'E1EDP02'.
      e1edp02-qualf = '001'.
      e1edp02-belnr = ie021-po_number.
      e1edp02-zeile = ie021-po_lineno.
      invoicdata-sdata = e1edp02.
      append_idoc_rec invoicdata i.
      clear e1edp19.
      invoicdata-segnam = 'E1EDP19'.
      e1edp19-qualf = '001'.
      e1edp19-idtnr = ie021-cust_partno.
      invoicdata-sdata = e1edp19.
      append_idoc_rec invoicdata i.
      clear e1edp19.
      invoicdata-segnam = 'E1EDP19'.
      e1edp19-qualf = '002'.
      e1edp19-idtnr = ie021-vendor_partno.
      invoicdata-sdata = e1edp19.
      append_idoc_rec invoicdata i.
      clear e1edp26.
      invoicdata-segnam = 'E1EDP26'.
      e1edp26-qualf = '003'.
      e1edp26-betrg = ie021-line_amount.
      invoicdata-sdata = e1edp26.
      append_idoc_rec invoicdata i.
    * dummy tax seg
      clear e1edp04.
      invoicdata-segnam = 'E1EDP04'.
      e1edp04-msatz = '0.00'.
      invoicdata-sdata = e1edp04.
      append_idoc_rec invoicdata i.
      n_line_amt = ie021-line_amount.
      invoice_total = invoice_total + n_line_amt.
      check desadv_ok = '000'.
    *DESADV
      clear e1edp09.
      desadvdata-segnam = 'E1EDP09'.
      e1edp09-vbeln = ie021-slip_number.
      e1edp09-matnr = ie021-vendor_partno.
      e1edp09-vrkme = ie021-qty_uom.
      e1edp09-lfimg = ie021-invoice_qty.
      desadvdata-sdata = e1edp09.
      append_idoc_rec desadvdata d.
    ENDFORM.                    " idoc_item_segs
    *&    Form  post_idocs
    *     create database IDocs from the idocdata tables and clear tables.
    FORM post_idocs using value(desadv_ok).
    *INVOIC
      clear e1eds01.
      invoicdata-segnam = 'E1EDS01'.
      e1eds01-sumid = '010'.
      e1eds01-summe = invoice_total.
      e1eds01-waerq = ie021-currency.
      shift e1eds01-summe left deleting leading space.
      invoicdata-sdata = e1eds01.
      append_idoc_rec invoicdata i.
      CALL FUNCTION 'INBOUND_IDOC_PROCESS'
        TABLES
          IDOC_CONTROL       =  iedidc
          IDOC_DATA          =  invoicdata.
      commit work.
    *DESADV
      if desadv_ok = '000'.
        clear e1eds02.
        desadvdata-segnam = 'E1EDS02'.
        e1eds02-sumid = '001'.
        e1eds02-summe = p07_ctr.
        shift e1eds02-summe left deleting leading space.
        desadvdata-sdata = e1eds02.
        append_idoc_rec desadvdata d.
        CALL FUNCTION 'INBOUND_IDOC_PROCESS'
          TABLES
            IDOC_CONTROL       =  dedidc
            IDOC_DATA          =  desadvdata.
        commit work.
      endif.
      refresh: desadvdata,
               invoicdata.
      clear:
        desadvdata,
        invoicdata,
        p07_ctr,
        invoice_total,
        save_stat,
        save_po,
        save_line,
        save_invoice.
    ENDFORM.                    " post_idocs
    *&      Form  init_desadv
    *       add a DESADV control record and initialize fields
    FORM init_desadv.
    clear dedidc. refresh dedidc.
    * initialize control record:
    move:  '2'        to  dedidc-direct,
          'DESADV01'  to  dedidc-doctyp,
          'DESADV'    to  dedidc-mestyp,
          'F'         to  dedidc-std,
          'E021'      to  dedidc-stdmes,
          'LS'        to  dedidc-sndprt,
          'TY_VENDORS' to dedidc-sndprn,
          sy-datlo    to  dedidc-credat,
          sy-timlo    to  dedidc-cretim.
    append dedidc.
    ENDFORM.              " init_desadv
    *&      Form  init_invoic
    *       add an INVOIC control record and initialize fields
    FORM init_invoic.
    clear iedidc. refresh iedidc.
    * initialize control record:
    move:  '2'        to  iedidc-direct,
          'INVOIC01'  to  iedidc-doctyp,
          'INVOIC'    to  iedidc-mestyp,
          'MM'        to  iedidc-mescod,
          'F'         to  iedidc-std,
          'E021'      to  iedidc-stdmes,
          'LS'        to  iedidc-sndprt,
          'TY_VENDORS' to iedidc-sndprn,
          sy-datlo    to  iedidc-credat,
          sy-timlo    to  iedidc-cretim.
    append iedidc.
    ENDFORM.              " init_invoic
    Lakshmiraj.A

  • Word count program..twisting my mind

    Hi fellow Java lovers, I'm new to this forum and I have a slightly tricky Java program to write. I have a piece of text, about 1,500 words long (in French), and I need code that can:
    1. Load the text via a menu option, convert it to lower case, and display it.
    2. Display the average word length.
    3. Display the number of commas per 1,000 words.
    4. Display the number of times the word "le" occurs per 1,000 words. (Tip: you will have to search for a space followed by "l" then "e", i.e. " le ", otherwise Java will count it as part of another word.)
    5. Allow a search for any word and display the number of times it occurs.
    Anyone want to come to my rescue?

    Try this:
    import java.util.*;
    import java.awt.*;
    import java.awt.event.*;
    import javax.swing.*;
    import java.io.*;
    public class CandyGirl1 {
         static class DisplayPanel extends JPanel {
              private JTextArea mTextArea = new JTextArea();
              private JLabel mLabel = new JLabel(" ");
              public DisplayPanel() {
                   mTextArea.setEditable(false);
                   setLayout(new BorderLayout(4, 4));
                   add(new JScrollPane(mTextArea), BorderLayout.CENTER);
                   add(mLabel, BorderLayout.SOUTH);
              }
              public Dimension getPreferredSize() {
                   return new Dimension(600, 480);
              }
              public void setText(String text) {
                   text = text.toLowerCase();
                   mTextArea.setText(text);
                   mLabel.setText(
                        "<html>" +
                        "Average word length: " + getAverage(text) + "<p>" +
                        "Commas per 1,000 words: " + getCommas(text) + "<p>" +
                        "Occurrences of 'le': " + getOccurs(text, "le") + "<p>" +
                        "</html>");
              }
              public void search(String what) {
                   mLabel.setText(
                        "Occurrences of '" + what + "' : " + getOccurs(mTextArea.getText(), what));
              }
              // counts whole-word occurrences by tokenizing on spaces
              private int getOccurs(String text, String what) {
                   int count = 0;
                   StringTokenizer st = new StringTokenizer(text, " ");
                   while (st.hasMoreTokens()) {
                        if (st.nextToken().equals(what))
                             count++;
                   }
                   return count;
              }
              // commas scaled per 1,000 words; the multiplication must come
              // before the integer division or the result truncates to 0
              private int getCommas(String text) {
                   int cnt = 0;
                   for (int i = 0; i < text.length(); i++) {
                        if (text.charAt(i) == ',')
                             cnt++;
                   }
                   int words = new StringTokenizer(text, " ").countTokens();
                   return words == 0 ? 0 : cnt * 1000 / words;
              }
              private int getAverage(String text) {
                   int count = 0;
                   int length = 0;
                   StringTokenizer st = new StringTokenizer(text, " ");
                   while (st.hasMoreTokens()) {
                        length += st.nextToken().length();
                        count++;
                   }
                   return count == 0 ? 0 : length / count;
              }
         }
         static class Application extends JFrame {
              private DisplayPanel mDisplayPanel = new DisplayPanel();
              public Application() {
                   getContentPane().add(mDisplayPanel);
                   JMenuBar bar = new JMenuBar();
                   setJMenuBar(bar);
                   JMenu menu = new JMenu("File");
                   menu.setMnemonic(KeyEvent.VK_F);
                   bar.add(menu);
                   JMenuItem item = new JMenuItem("Open");
                   item.setMnemonic(KeyEvent.VK_O);
                   menu.add(item);
                   item.addActionListener(new ActionListener() {
                        public void actionPerformed(ActionEvent e) { open(); } });
                   item = new JMenuItem("Search");
                   item.setMnemonic(KeyEvent.VK_S);
                   menu.add(item);
                   item.addActionListener(new ActionListener() {
                        public void actionPerformed(ActionEvent e) { search(); } });
              }
              private void search() {
                   String what = JOptionPane.showInputDialog(this, "Search for:");
                   if (what != null)
                        mDisplayPanel.search(what);
              }
              private void open() {
                   JFileChooser chooser = new JFileChooser();
                   if (chooser.showOpenDialog(this) == JFileChooser.APPROVE_OPTION)
                        open(chooser.getSelectedFile());
              }
              private void open(File file) {
                   try {
                        FileReader reader = new FileReader(file);
                        StringBuffer str = new StringBuffer();
                        char[] buf = new char[128];
                        int read;
                        // read() returns -1 at end of file, so test before appending
                        while ((read = reader.read(buf, 0, buf.length)) > 0)
                             str.append(buf, 0, read);
                        reader.close();
                        mDisplayPanel.setText(str.toString());
                   } catch (Exception e) {
                        e.printStackTrace();
                   }
              }
         }
         public static void main(String[] argv) {
              Application frame = new Application();
              frame.pack();
              frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
              frame.setVisible(true);
         }
    }

  • Will giving up bundling and minification in favor of serving js and css from Azure blob be beneficial?

    Hi,
    We have an MVC web site deployed as a service on Microsoft Azure. To boost performance, some of my colleagues suggested that we scrap bundling and minification and serve the .js and .css files from an Azure blob instead. Please note that the solution does not use a CDN; it merely serves the files from a blob instead of using the bundling and minification feature.
    My take on this is that just serving the files this way will not bring any major performance benefit. Since we are not using a CDN, the files will always be served from the region in which our storage is deployed. Also, since they are not bundled but kept as individual files, the number of server requests will not go down, so we forfeit the benefits of bundling and minification. The only benefit I see to this approach is that we can change the js and css files and upload them without needing to re-deploy.
    Can anyone please tell me which of the two options is preferable in terms of performance?

    Hi Nikhil,
    Thanks for posting.
    I agree with you on this one, but again it depends on the scale of your website and the number of requests you expect. If your page request volume is low, you might as well store all the static content in blob storage; blob storage is very scalable and will be good enough in most cases. Bear in mind that each page request that includes a link to the storage counts as a storage transaction. Transactions are currently priced at $0.005 per 100,000, so it would take a while for this to become costly. The next step would be to expose the container over a CDN to get distributed edge caching.
    All in all, there are performance benefits to either approach; it just depends on careful consideration of the expected traffic and the location of your customers and users.
    You can refer to the following threads, and let us know if you have any questions:
    http://social.technet.microsoft.com/Forums/azure/en-US/5fc30aae-8f72-42b5-9202-3778c28033dc/best-performance-for-storing-static-files-images-css-javascript-cdn-blobstorage-or-webrole?forum=windowsazuredata
    http://stackoverflow.com/questions/6968011/storing-css-files-on-windows-azure
    Regards,
    Nithin.Rathnakar
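    To put the quoted transaction rate in perspective, here is a quick back-of-the-envelope cost calculation; the traffic figures below are made up purely for illustration:

```java
import java.util.Locale;

public class BlobCostEstimate {

    // quoted rate: $0.005 per 100,000 storage transactions;
    // every unbundled .js/.css link on a page is one transaction
    static double monthlyCost(long pageViewsPerMonth, int staticFilesPerPage) {
        long transactions = pageViewsPerMonth * staticFilesPerPage;
        return transactions / 100_000.0 * 0.005;
    }

    public static void main(String[] args) {
        // hypothetical: 1,000,000 page views/month, 20 unbundled files per page
        System.out.printf(Locale.US, "$%.2f per month%n", monthlyCost(1_000_000, 20));
    }
}
```

    Even at a million page views the storage transactions themselves cost about a dollar; the real cost of skipping bundling is the extra HTTP requests per page, not the billing.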

  • Database size: optimization: tuning

    Hi:
    I'm using Berkeley DB as it is supplied with MacOS 10.5.6
    on an intel iMac with 2G of ram. On this machine, as well as on
    a more powerful machine with more memory I see the following:
    I'm loading a tied hash via Perl BerkeleyDB.pm with constant-length
    keys, and values that vary from 20 to 200 bytes.
    Loading time is constant at about 0.9 seconds per 25,000 records
    (0.6 per 25,000 on the more powerful machine)
    until the dbm file reaches about 660 MB (1.2 million records).
    Then there is an inflection point and loading time increases
    so that at 2 million records (dbm file = 710 MB) it is 33 seconds per
    25,000 records. If I reduce the average size of the values of each record,
    the inflection point stays constant at ~660 MB, although more records are stored.
    I wonder if there are database tuning/optimization functions I should
    be looking at.
    Thanks
    Richard Moe

    There is in fact no better way than doing the real test yourself, but you can also just do the math. Start by figuring out or estimating the expected load of the application, how many queries are executed during one request/session, and so on.
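    Taking the "do the math" advice concretely: an inflection point near 660 MB is roughly where the data outgrows the cache, so it is worth estimating what a fully cached load would need. A sketch of that arithmetic (the value range is from the question; the key size and per-record overhead are assumptions, not measured figures):

```java
public class CacheEstimate {

    // rough bytes needed to hold every record in cache:
    // key + value + assumed per-record bookkeeping overhead
    static long bytesNeeded(long records, int keyBytes, int avgValueBytes, int overheadBytes) {
        return records * (keyBytes + avgValueBytes + overheadBytes);
    }

    public static void main(String[] args) {
        // 2 million records, values 20-200 bytes (avg ~110),
        // assumed 16-byte keys and ~32 bytes overhead each
        long bytes = bytesNeeded(2_000_000, 16, 110, 32);
        System.out.println(bytes / (1024 * 1024) + " MB");
    }
}
```

    If the estimate comes out well above the default cache, raising the database cache size (Berkeley DB exposes a cache-size tuning parameter) is the first knob to try before anything more exotic.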

  • Putting items into hashmap and treemap

    Hi all, this is my first time posting here. I tried doing a search but I didn't find what I was looking for, so I hope someone out there can help me out.
    Anyway, my question is: how come when I am putting entries into either of these maps, the time per entry actually gets faster as more entries are entered?
    For instance, in my HashMap it took approximately 8 seconds to enter 1,447 entries, but when I am entering 23,042 entries it takes just a bit over 1 second.
    Does it have something to do with how the methods in HashMap take constant time (O(1))?
    The same thing happens for TreeMap, but the time cost there is O(log n) for putting entries, so I am not sure how to explain this.
    I hope someone understands my question, thanks in advance!

    8 seconds for 1400 entries sounds VERY long.
    In any case, earlier chunks may take longer due to VM startup or HotSpot warmup overhead. Resizing the backing store, GC, the VM acquiring more memory, the OS giving CPU time to other processes, etc., could all throw off your timing.
    I got the following results:
    HashMap:
    first 350,000: 1 s
    next 300,000: 1 s
    next 250,000: 1 s
    TreeMap:
    First 100,000: 1 s
    Next 50,000: 3 s
    Next 100,000: 2 s
    Next 50,000: 1 s
    Next 50,000: 1 s
    and so on--generally 1 or 2 seconds per 50,000
    import java.util.*;
    public class MapTiming {
      public static void main(String[] args) throws Exception {
        Map<Integer, Integer> hm = new HashMap<Integer, Integer>();
        Map<Integer, Integer> tm = new TreeMap<Integer, Integer>();
        long start;
        start = System.currentTimeMillis();
        for (int ix = 1; ix <= 1000000; ix++) {
          hm.put(ix, ix);
          if (ix % 50000 == 0) {
            long end = System.currentTimeMillis();
            System.out.println("HashMap " + ix + ": " + ((end - start) / 1000) + " sec.");
          }
        }
        start = System.currentTimeMillis();
        for (int ix = 1; ix <= 1000000; ix++) {
          tm.put(ix, ix);  // note: tm, not hm, or the second loop re-times the HashMap
          if (ix % 50000 == 0) {
            long end = System.currentTimeMillis();
            System.out.println("TreeMap " + ix + ": " + ((end - start) / 1000) + " sec.");
          }
        }
      }
    }
    :; java -cp classes MapTiming
    HashMap 50000: 0 sec.
    HashMap 100000: 0 sec.
    HashMap 150000: 0 sec.
    HashMap 200000: 0 sec.
    HashMap 250000: 0 sec.
    HashMap 300000: 0 sec.
    HashMap 350000: 1 sec.
    HashMap 400000: 1 sec.
    HashMap 450000: 1 sec.
    HashMap 500000: 1 sec.
    HashMap 550000: 1 sec.
    HashMap 600000: 1 sec.
    HashMap 650000: 2 sec.
    HashMap 700000: 2 sec.
    HashMap 750000: 2 sec.
    HashMap 800000: 2 sec.
    HashMap 850000: 2 sec.
    HashMap 900000: 3 sec.
    HashMap 950000: 3 sec.
    HashMap 1000000: 3 sec.
    TreeMap 50000: 0 sec.
    TreeMap 100000: 1 sec.
    TreeMap 150000: 4 sec.
    TreeMap 200000: 4 sec.
    TreeMap 250000: 6 sec.
    TreeMap 300000: 7 sec.
    TreeMap 350000: 8 sec.
    TreeMap 400000: 11 sec.
    TreeMap 450000: 12 sec.
    TreeMap 500000: 14 sec.
    TreeMap 550000: 14 sec.
    TreeMap 600000: 16 sec.
    TreeMap 650000: 18 sec.
    TreeMap 700000: 19 sec.
    TreeMap 750000: 21 sec.
    TreeMap 800000: 22 sec.
    TreeMap 850000: 25 sec.
    TreeMap 900000: 25 sec.
    TreeMap 950000: 26 sec.
    TreeMap 1000000: 29 sec.
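    Note that the times printed above are measured from `start`, so each line is cumulative rather than per chunk. A per-chunk variant (a sketch along the same lines, not from the original post) resets the clock after each report and uses `System.nanoTime()` for better resolution:

```java
import java.util.HashMap;
import java.util.Map;

public class ChunkTiming {

    // fills the map, printing the time each chunk of insertions took on
    // its own; returns the final map size
    static int fill(Map<Integer, Integer> map, int total, int chunk) {
        long chunkStart = System.nanoTime();
        for (int ix = 1; ix <= total; ix++) {
            map.put(ix, ix);
            if (ix % chunk == 0) {
                long now = System.nanoTime();
                System.out.println(ix + ": " + (now - chunkStart) / 1_000_000 + " ms");
                chunkStart = now;  // reset so each chunk is timed independently
            }
        }
        return map.size();
    }

    public static void main(String[] args) {
        fill(new HashMap<Integer, Integer>(), 1_000_000, 50_000);
    }
}
```

    With per-chunk timing the HashMap chunks stay roughly flat (apart from resize spikes), while TreeMap chunks grow slowly, which is the O(1) vs O(log n) behavior the question asks about.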

  • Io connections

    greetings,
    We have a licence for 100 I/O connections. Can I load Lookout on another PC if I keep my total connections below 100? I believe I have a problem with this PC and would like to prove the theory by running the same software on another box.
    I am doing an accelerated life test on 16 solenoids/switches through a modbus object. I miss about one count per 50,000 over the course of about two days (with no hypertrends). If I create multiple hypertrends the problem is magnified. When I dump serial modbus data to a file, the data indicates there are moments when the serial port doesn't get serviced. It's as if the machine was off doing something it thought was much more important. The more hypertrends I add, the more counts I miss.
    thank you,
    JohnG 

    greetings,
    The counters are objects created in Lookout. I am sequentially driving 16 actuator/sensors on an ASi network through a Wiedemann modbus gateway to a Dell PC (1.8 GHz P4, Seagate 40 GB ATA, 654 MB RAM). My poll rate is 0.2 seconds. I drive the solenoids to the open position for 1 second once every 20 seconds. Without any hypertrends the performance is very good (1 random error per approx. 50,000 cycles). Over the course of a week I may have 5 or so errors on different sensors. Like I said previously, if I add hypertrends, the problem is magnified (10x).
    Thank You for response,
    johng  
