Filtering XML records while bursting - processing only a subset of the data

Hi
I'm attempting to filter the records in my XML data while bursting, so that depending on the value of a data element, the entire record is either skipped or included in the burst output.
I only want a subset of the XML output to be included in the burst output.
I've tried applying a filter at the select stage -
<xapi:request select="/AR_INVOICE/LIST_EMAIL_HEAD/EMAIL_HEAD[EMAIL_IND!='']">
This keeps only the records where 'EMAIL_IND' is not null - i.e. those with an email address to send to -
but instead of giving me multiple emails, one for each /AR_INVOICE/LIST_EMAIL_HEAD/EMAIL_HEAD,
I get just one email that contains all of the data for the records that have email addresses.
I also tried putting a filter on the template
<xapi:template type="rtf" locale="" location="xdo://AR.XXARINVEMAILDUMMY.en.00/?getSource=true" translation="" filter="[EMAIL_IND!='']" />
i.e. having only one template and filtering as shown, but this has no effect.
Any ideas?
Thanks in advance.
Mike

Hi
I worked out a way to conditionally use only some of the data records in the bursting process - discarding the others.
In EBS, I generate a set of data with a data-definition template that contains entries for some people who require email delivery and some who require printed output.
In the concurrent program, I specify an output RTF template that has a filter in it to print only the records for those who don't need emails.
(Don't you just love the way the designers call both the data definition and the output definition TEMPLATES - not confusing at all.....)
This step generates a single sorted PDF file that is printed.
Then came the tricky part: using the bursting engine to send emails only to the set of people who need them, while "cleanly" disposing of the records for the people who do not want emails.
What is not clear anywhere in the documentation is that the XML Publisher bursting engine MUST handle ALL the records that it receives in the XML input.
If you specify a filter on the output template in the bursting control file that excludes some records (those not needing the email), the bursting engine doesn't know what to do with the remaining records.
Enter stage left - multiple delivery methods.
I simply defined another delivery method - type filesystem - and a document using the same template with a filter including only those records for people who do NOT need emails, routing the output to the /tmp directory - the trash.
The lesson(s) to be learnt:
1) The bursting engine needs instructions for handling ALL the XML data that is fed to it.
2) You can define as many output documents and delivery methods as you like - putting filters on each as you like - BUT ALL XML records MUST be provided for.
3) XML records that are not required in one output / bursting stream can be handled in another - and trashed in /tmp or some other disposable area.
The full bursting control file is shown below:
1) First define the two delivery methods.
2) Then define the two "documents" - each with a filter - using the two delivery methods.
Hope this helps others wanting to do similar things.
Mike
<?xml version="1.0" encoding="UTF-8"?>
<xapi:requestset xmlns:xapi="http://xmlns.oracle.com/oxp/xapi" type="bursting">
<xapi:request select="/AR_INVOICE/LIST_EMAIL_HEAD/EMAIL_HEAD">
----- Define the two delivery methods - first the "email"---------
<xapi:delivery>
<xapi:email server="10.1.1.2" port="25"
from="${EM_DOC_EMAIL_ADDRESS}" reply-to ="${EM_COLLECTOR_EMAIL_ADDRESS}">
<xapi:message id="email" to="[email protected]"
attachment="true" subject="${EM_OPERATING_UNIT} invoice for ${EM_CUSTOMER_NAME}">
Please review the attached invoice - terms are ${EM_TERMS}
This will be sent to collector email ${EM_COLLECTOR_EMAIL_ADDRESS} and customer email ${EM_DOC_EMAIL_ADDRESS}
</xapi:message>
</xapi:email>
------ Second - the "null" - type filesystem ----------------
<xapi:filesystem id="null" output="/tmp/xmlp_null_${EM_CUSTOMER_NAME}" />
</xapi:delivery>
------Then define the first document using the"email" delivery -------------
<xapi:document output-type="pdf" delivery="email">
<xapi:template type="rtf" locale=""
location="xdo://AR.XXARINVEMAILDUMMY.en.00/?getSource=true" translation="" filter=".//EMAIL_HEAD[EM_DOC_EMAIL_ADDRESS!='NO_EMAIL']">
</xapi:template>
</xapi:document>
------ Then define the other document using the "null" delivery --------------
<xapi:document output-type="pdf" delivery="null">
     <xapi:template type="rtf" locale=""
     location="xdo://AR.XXARINVEMAILDUMMY.en.00/?getSource=true" translation="" filter=".//EMAIL_HEAD[EM_DOC_EMAIL_ADDRESS='NO_EMAIL']">
     </xapi:template>
</xapi:document>
</xapi:request>
</xapi:requestset>

Similar Messages

  • I press record while I do a live stream. Where does the video get saved on my computer?

    I press record while I do a live stream. Where does the video get saved?

    iMovie (Apple) is one option; Pinnacle Studio is another. But there are a number in the App Store, and that is just if you are limited to the iPad.

  • How to delete only a portion of the data portal?

    This would be similar to DataDelAll() command, but allow only a limited set of data to be removed.
    For some background:
    - DIAdem 9.1 sp2
    - LabVIEW interface calling various DIAdem/VBS scripts
    Using DIAdem to format reports based on various scripts and TDM data
    results, I would like to be able to load and unload various data files
    (TDM) while maintaining the specific report layout (also TDM) data in
    the portal.  Two (ugly) options seem to be:
    a)  load entire data result set into the portal (including layout instructions) and process, or
    b)  keep track of layout state variables and clear portal/reload layout/load next data
    This would all be easy if there is a command to enable the right-click delete... behavior of the data portal UI.
    Thanks!
    James

    Hello James,
    there are different commands you can use to delete only a portion of the data in the portal. You can delete single channels, a selection of channels, or entire groups with these commands:
    Call ChnDel(ChnArg) - deletes one channel
    Call ChnDelete(ClpSource) - deletes a selection of channels
    Call ChnSDel(ChnArg1, ChnNo) - deletes a number of channels
    Call GroupDel(TargetGroupIndex) - deletes a group including its channels
    To create a selection of channels as argument for the second command, have a look at the functions
    ChnSelAdd, ChnSelGet, ChnSelCount
    Regards
    Ingo Schumacher
    Systems Engineer Sound & Vibration, National Instruments Germany

  • BI process has gotten stuck and the data from CRM hasn't gotten updated

    Hi friends, I am working on BI in CRM.
    The BI process has gotten stuck and the data from CRM hasn't been updated in the BI cubes.
    Can anyone help me solve this issue?
    What may be the problem and what would be the solution?
    Do I need to restart the process chains, or something else?

    Is it already scheduled for a daily run, or are you running the process chain manually?
    If you are running it manually, select your process chain, then right-click on the start process --> Maintain variant --> Scheduling --> in the scheduler screen, click on Immediate, save the changes and save in the next screen; once it comes back to the main screen, you have a scheduling icon (the clock). Click on it. Once done, it will be OK.
    Or, if only the process failed, you can right-click on the failed process and choose the "Repeat" option to repeat the process.
    Also check whether the source system connection is fine: RSA1 --> Source systems --> Check.
    Edited by: Murali M on Sep 2, 2010 6:37 AM

  • Hierarchies job failing: The job process could not communicate with the data flow

    Hi Experts,
    We have a group of hierarchies that run as a separate job on the DS scheduler. The problem is this: when we schedule the job to run during the production loads it fails, but when we run it immediately after the failure it runs completely fine. So basically, if I run it manually it runs, but when it's scheduled to run with the production job it fails. The interesting thing is that if I schedule the job to run any time before or after the production jobs are done, it works fine.
    The error I get is:
    The job process could not communicate with the data flow <XXXXXX> process. For details, see previously logged error <50406>.
    Now this XXXXXX DF has only horizontal flattening, and it does not run as a separate process, because if I make it a separate process it fails with an EOF. So I removed the "run as separate process" setting and changed the DF to use in-memory cache.
    Any suggestions on this problem...

    Thanks Mike.. I was hoping it's a memory issue, but the thing I don't understand is that when the job is scheduled to run with the production job it fails, yet when I manually run the job during the production load it runs. This kind of baffles me.
    DS 3.2 (Version 12.2.0.0)
    OS: GNU/LINUX
    DF Cache Setting :- In Memory
    processor       : 0
    vendor_id       : GenuineIntel
    cpu family      : 6
    model           : 26
    model name      : Intel(R) Xeon(R) CPU           X5670  @ 2.93GHz
    stepping        : 4
    cpu MHz         : 2933.437
    cache size      : 12288 KB
    fpu             : yes
    fpu_exception   : yes
    cpuid level     : 11
    wp              : yes
    flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss syscall nx rdtscp lm constant_tsc ida nonstop_tsc arat pni ssse3 cx16 sse4_1 sse4_2 popcnt lahf_lm
    bogomips        : 5866.87
    clflush size    : 64
    cache_alignment : 64
    address sizes   : 40 bits physical, 48 bits virtual
    power management: [8]
    processor       : 1
    vendor_id       : GenuineIntel
    cpu family      : 6
    model           : 26
    model name      : Intel(R) Xeon(R) CPU           X5670  @ 2.93GHz
    stepping        : 4
    cpu MHz         : 2933.437
    cache size      : 12288 KB
    fpu             : yes
    fpu_exception   : yes
    cpuid level     : 11
    wp              : yes
    flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss syscall nx rdtscp lm constant_tsc ida nonstop_tsc arat pni ssse3 cx16 sse4_1 sse4_2 popcnt lahf_lm
    bogomips        : 5866.87
    clflush size    : 64
    cache_alignment : 64
    address sizes   : 40 bits physical, 48 bits virtual
    power management: [8]
    Thanks for your help

  • TS3579 I found this useful because I did not know about the effect of typing in data and that you could only drag to rearrange the data.  I had typed in data before and this had caused problems but restoring defaults did not cause correct dates to show up

    I found this (TS3579: If the wrong date or time is displayed in some apps on your Mac) useful because I did not know about the effect of typing in data and that you could only drag to rearrange the data.  I had typed in data before and this had caused problems, but restoring defaults did not cause correct dates to show up in Finder.

    It sounds like there are a couple of things going on here. First check whether you have a successful install of SQL Server, then we'll figure out the connection issues.
    Can you launch SQL Server Configuration Manager and check for SQL Server (MSSQLSERVER) if it is the default instance, or SQL Server (other name) if you've configured your instance as a named instance. Once you find this, make sure the service is started.
    If it is not started, try to start it and see if it throws an error. If you get an error, post the error message you're hitting. If the service starts, you can then launch SSMS and try to connect. If you have a default instance, you can use the machine name in the connection dialog, e.g. "COWBOYS" where Cowboys is the machine name. However, if you named the SQL Server instance during install, you'll need to connect using the machine\instance format, e.g. COWBOYS\Romo (where Romo is the instance name you set during install).
    You can also look at the summary.txt file in the SQL Server setup error logs to see what happened on the most recent install. Past install history is archived in the log folder if you need to dig those up to help troubleshoot, but the most recent one may help get to the bottom of it if there is an issue with setup detecting a prior instance that needs to be repaired.
    Thanks,
    Sam Lester (MSFT)
    http://blogs.msdn.com/b/samlester
    This posting is provided "AS IS" with no warranties, and confers no rights. Please remember to click "Mark as Answer" and "Vote as Helpful" on posts that help you. This can be beneficial to other community members reading the thread.

  • My DAQ 6008 will not drop the 5V after the VI is stopped; I have a digital signal going from the error out on the DAQ in the while loop to the error in on the DAQ outside the while loop, and a boolean going to the data input of the DAQ outside

    My DAQ 6008 will not drop the 5V on a digital output after the VI is stopped. I have a digital signal going from the error out on the DAQ in the while loop to the error in on the DAQ outside the while loop, and a boolean going to the data input of the DAQ outside, but I can't seem to get it to work.

    I attached the block diagram so you can have a look.
    Attachments:
    PID Temp control.docx 120 KB

  • My iPhone 4S Siri is not working; it worked for a while and then stopped. I've checked the data plan and it is intact. I even switched my SIM to a friend's phone and his Siri worked, but mine refused even with a wireless network. Please help.

    My iPhone 4S Siri is not working; it worked for a while and then stopped. I've checked the data plan and it is intact. I even switched my SIM to a friend's phone and his Siri worked, but mine refused even with a wireless network. Please help.

    Troubleshooting Siri
    http://support.apple.com/kb/TS4079

  • My computer keeps shutting down while in use and when I reboot, the date comes up December 31, 2000

    My computer keeps shutting down while in use, and when I reboot, the date comes up as December 31, 2000. Also, I have to sign back in on my AirPort as well. Not sure what is happening?

    Also, Safari keeps popping up saying that Facebook and other sites' certificates cannot be verified; this has never happened before.

  • Only one connection with the database

    I have an MDI application that opens several windows at the same time.
    I use the Oracle ADF framework - Swing/ADF JClient.
    For each new JInternalFrame a BindingContext (bootstrap) is created, and with it a new connection to the database is created. I have already tried to share the same one, but then an action in one screen generally interferes with the others.
    How can I make this MDI/ADF application use only one connection to the database?
    Thank you!!!

    "If I have only one connection with the database, can't two threads request data at the same time?"
    If two threads use the same database connection at the same time, bad things will happen. That is why each thread must use its own database connection. The easiest way to make that happen is to have a pool of connections, as sketched below.
    This site will do translations for you:
    http://world.altavista.com/
    It is not very good but it knows Spanish better than I do.
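    A minimal, hand-rolled pool sketch using only core JDBC (the class name, pool size and connection details are invented for illustration; in a real ADF application you would normally let the framework or an application-server data source manage this):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // Tiny fixed-size connection pool: each thread borrows a connection, uses it,
    // and returns it, so no two threads ever share one connection at the same time.
    public class SimpleConnectionPool {
        private final BlockingQueue<Connection> pool;

        public SimpleConnectionPool(String url, String user, String pwd, int size) throws SQLException {
            pool = new ArrayBlockingQueue<Connection>(size);
            for (int i = 0; i < size; i++) {
                pool.add(DriverManager.getConnection(url, user, pwd));
            }
        }

        public Connection borrow() throws InterruptedException {
            return pool.take();          // blocks if every connection is currently in use
        }

        public void giveBack(Connection c) {
            pool.offer(c);
        }
    }

    Each JInternalFrame (or worker thread) then calls borrow() before running a query and giveBack() in a finally block, instead of opening its own dedicated connection.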

  • Save Output Xml Generated During Bursting Process To Database

    Hi All,
    I am using BI Publisher 10.1.3.4.1 to generate PDF invoices from an XML file. One XML file contains around 10k invoices. During the bursting process, I can see that an XML file for each individual invoice is generated in the log directory along with some other files. Can I save the individual XML files in the database instead of in a directory? How do I achieve this?
    Help from anyone is highly appreciated.
    Thanks
    Angshuman
    Edited by: Angshuman on Aug 23, 2010 12:28 AM

    You have to use the bursting engine API and burst in your Java code, registering a listener; the listener gives you the handle to the XML files, and then you can store them in the DB, along the lines of the sketch below.
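    A rough sketch of that approach, assuming the standalone 10g bursting API (oracle.xdo.batch.DocumentProcessor). The listener registration is only outlined in comments, because the listener interface and its method names should be checked against the Javadoc shipped with 10.1.3.4; the paths, table and column names here are made up:

    import java.io.File;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    import oracle.xdo.batch.DocumentProcessor;

    public class BurstAndArchive {

        public static void main(String[] args) throws Exception {
            // Control file, full data file, and a working directory for the split output
            DocumentProcessor dp = new DocumentProcessor(
                    "/tmp/burst_control.xml",   // bursting control file
                    "/tmp/invoices.xml",        // the full XML data (the 10k invoices)
                    "/tmp/burst_work");         // temp dir where per-invoice XML appears

            // Register a bursting listener here; its per-document callback hands you
            // the split XML for each invoice, which you can pass to storeXml() below.
            // (Alternatively, sweep the working directory after process() returns.)
            // dp.registerListener(new MyArchivingListener());

            dp.process();
        }

        // Store one split XML file in a CLOB column of an archive table.
        static void storeXml(Connection con, File xmlFile, String invoiceNo) throws Exception {
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO invoice_xml_archive (invoice_no, xml_doc) VALUES (?, ?)");
            try {
                ps.setString(1, invoiceNo);
                ps.setCharacterStream(2, new FileReader(xmlFile), (int) xmlFile.length());
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        }
    }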

  • If Records of different list items are entered, then the data is not getting inserted in the table.

    Hi Everyone,
    A Very Very Happy, Fun-filled, Awesome New Year to You All.
    Now coming to the discussion of my problem in Oracle Forms 6i:
    I have created a form in which the data is entered & saved in the database.
    CREATE TABLE MATURED_FD_DTL (
      ACCT_FD_NO    VARCHAR2(17 BYTE)               NOT NULL,
      CUST_CODE     NUMBER(9),
      FD_AMT        NUMBER(15),
      FD_INT_BAL    NUMBER(15),
      TDS           NUMBER(15),
      CHQ_NO        NUMBER(10),
      CREATED_DATE  DATE,
      CREATED_BY    VARCHAR2(15 BYTE),
      PREV_YR_TDS   NUMBER(15),
      ADD_FD_AMT    NUMBER(15),
      DESCRIPTION   VARCHAR2(100 BYTE),
      P_SAP_CODE    NUMBER(10),
      P_TYPE        VARCHAR2(1 BYTE)
    );
    The form looks like below:
    ENTER_QUERY     EXECUTE_QUERY     SAVE     CLEAR     EXIT
    ACCT_FD_NO
    CUST_CODE
    FD_AMT
    FD_INT_BAL
    PREV_YR_TDS
    TDS
    ADD_FD_AMT
    P_SAP_CODE
    P_TYPE
    CHQ_NO
    DESCRIPTION
    R
    W
    P
    List Item
    There are 5 push buttons namely ENTER_QUERY, EXECUTE_QUERY, SAVE, CLEAR, EXIT.
    The table above is same as in the form. All the fields are text_item, except the P_TYPE which is a List_Item ( Elements in List Item are R, W & P).
    The user will enter the data & save it.
    So all this will get updated in the table MATURED_FD_DTL.
    I am also updating one column in another table named KEC_FDACCT_MSTR, and I want these details to get inserted into another table named KEC_FDACCT_DTL, but only if P_TYPE='P'.
    CREATE TABLE KEC_FDACCT_DTL (
      FD_SR_NO                NUMBER(8)             NOT NULL,
      FD_DTL_SL_NO            NUMBER(5),
      ACCT_FD_NO              VARCHAR2(17 BYTE)     NOT NULL,
      FD_AMT                  NUMBER(15,2),
      INT_RATE                NUMBER(15,2),
      SAP_GLCODE              NUMBER(10),
      CATOGY_NAME             VARCHAR2(30 BYTE),
      PROCESS_YR_MON          NUMBER(6),
      INT_AMT                 NUMBER(16,2),
      QUTERLY_FD_AMT          NUMBER(16,2),
      ITAX                    NUMBER(9,2),
      MATURITY_DT             DATE,
      FDR_STAUS               VARCHAR2(2 BYTE),
      PAY_ACC_CODE            VARCHAR2(85 BYTE),
      BANK_CODE               VARCHAR2(150 BYTE),
      NET_AMOUNT_PAYABLE      NUMBER,
      QUATERLY_PAY_DT         DATE,
      CHEQUE_ON               VARCHAR2(150 BYTE),
      CHEQUE_NUMBER           VARCHAR2(10 BYTE),
      CHEQUE_DATE             DATE,
      MICR_NUMBER             VARCHAR2(10 BYTE),
      PAY_TYPE                VARCHAR2(3 BYTE),
      ADD_INT_AMT             NUMBER(16,2),
      ADD_QUTERLY_FD_AMT      NUMBER(16,2),
      ADD_ITAX                NUMBER(16,2),
      ECS_ADD_INT_AMT         NUMBER(16),
      ECS_ADD_QUTERLY_FD_AMT  NUMBER(16),
      ECS_ADD_ITAX            NUMBER(16)
    );
    So for the push button 'Save', I have put the following code in the WHEN-BUTTON-PRESSED trigger:
    BEGIN
         Commit_form;
              UPDATE KEC_FDACCT_MSTR SET PAY_STATUS='P' WHERE ACCT_FD_NO IN (SELECT ACCT_FD_NO FROM MATURED_FD_DTL);
              UPDATE MATURED_FD_DTL SET CREATED_DATE=sysdate, CREATED_BY = :GLOBAL.USER_ID WHERE ACCT_FD_NO = :acct_fd_NO;
    IF :P_TYPE='P' THEN
         INSERT INTO KEC_FDACCT_DTL
              SELECT FD_SR_NO, NULL, MATURED_FD_DTL.ACCT_FD_NO, FD_AMT, INT_RATE, P_SAP_CODE,
                   GROUP_TYPE, (TO_CHAR(SYSDATE, 'YYYYMM'))PROCESS_YR_MON,
                   FD_INT_BAL, (FD_INT_BAL-MATURED_FD_DTL.TDS)QUTERLY_FD_AMT , MATURED_FD_DTL.TDS,
                   MATURITY_DATE, P_TYPE, NULL, NULL, (FD_INT_BAL-MATURED_FD_DTL.TDS)NET_AMOUNT_PAYABLE,
                   NULL, NULL, CHQ_NO, SYSDATE, NULL, 'CHQ', NULL, NULL, NULL, NULL, NULL, NULL
              FROM MATURED_FD_DTL, KEC_FDACCT_MSTR
         WHERE KEC_FDACCT_MSTR.ACCT_FD_NO=MATURED_FD_DTL.ACCT_FD_NO;
    END IF;
    COMMIT;
         MESSAGE('RECORD HAS BEEN UPDATED AS PAID');
         MESSAGE(' ',no_acknowledge);
    END;
    If P_TYPE='P', then the data must get saved in the KEC_FDACCT_DTL table.
    The problem is this:
    If I enter the details with all the records as 'P', the records get inserted into the table KEC_FDACCT_DTL.
    If I enter the details with records of both 'P' and 'R', then nothing gets inserted into the table KEC_FDACCT_DTL.
    Even the records with 'P' are not getting inserted.
    I want the records with 'P' to be inserted into table KEC_FDACCT_DTL even when records of all P_TYPE values (R, W & P) are entered together.
    So, can you please help me with this?
    Thank You.
    Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
    Oracle Forms Builder 6i.

    It's not working properly.
    At the form-level trigger POST-INSERT, I have put in the following code:
    IF :P_TYPE='P' THEN
      INSERT INTO KEC_FDACCT_DTL
      SELECT FD_SR_NO, NULL, MATURED_FD_DTL.ACCT_FD_NO, FD_AMT, INT_RATE, P_SAP_CODE,
      GROUP_TYPE, (TO_CHAR(SYSDATE, 'YYYYMM'))PROCESS_YR_MON,
      FD_INT_BAL, (FD_INT_BAL-MATURED_FD_DTL.TDS)QUTERLY_FD_AMT , MATURED_FD_DTL.TDS,
      MATURITY_DATE, P_TYPE, NULL, NULL, (FD_INT_BAL-MATURED_FD_DTL.TDS)NET_AMOUNT_PAYABLE,
      NULL, NULL, CHQ_NO, SYSDATE, NULL, 'CHQ', NULL, NULL, NULL, NULL, NULL, NULL
      FROM MATURED_FD_DTL, KEC_FDACCT_MSTR
      WHERE KEC_FDACCT_MSTR.ACCT_FD_NO=MATURED_FD_DTL.ACCT_FD_NO;
      END IF;
    MESSAGE('RECORD HAS BEEN UPDATED AS PAID');
    MESSAGE(' ',no_acknowledge);
    It worked properly the first time I executed it, but the second time, duplicate values were stored in the database.
    Example: First I entered the following in the form & saved it.
    ACCT_FD_NO | CUST_CODE | FD_AMT | FD_INT_BAL | PREV_YR_TDS | TDS  | ADD_FD_AMT | P_SAP_CODE | P_TYPE | CHQ_NO | DESCRIPTION
    250398     | 52        | 50000  | 6000       | 0           | 600  | 0          | 45415      | P      | 5678   | int1
    320107     | 56        | 100000 | 22478      | 3456        | 2247 | 0          | 45215      | R      | 456    |
    320108     | 87        | 50000  | 6500       | 0           | 650  | 0          | 21545      | W      | 0      |
    In the database, in table KEC_FDACCT_DTL, the ACCT_FD_NO:250398 with P_TYPE='P' record was inserted.
    ACCT_FD_NO | P_TYPE
    250398     | P
    But the second time, when I entered the following in the form & saved:
    ACCT_FD_NO | CUST_CODE | FD_AMT | FD_INT_BAL | PREV_YR_TDS | TDS  | ADD_FD_AMT | P_SAP_CODE | P_TYPE | CHQ_NO | DESCRIPTION
    260189     | 82        | 50000  | 6000       | 0           | 600  | 0          | 45415      | P      | 5678   | interest567
    120011     | 46        | 200000 | 44478      | 0           | 4447 | 0          | 45215      | R      | 456    |
    30191      | 86        | 50000  | 6500       | 0           | 650  | 0          | 21545      | W      | 56     |
    In the database, in the table KEC_FDACCT_DTL, the following rows were inserted.
    ACCT_FD_NO | P_TYPE
    250398     | P
    250398     | P
    260189     | P
    320107     | R
    320108     | W
    There was a duplicate of 250398, which I didn't enter in the form the second time.
    All the other P_TYPE records were also inserted, but I want only the P_TYPE='P' records to be inserted into the database.
    I want only those records where P_TYPE='P' to be inserted, and duplicate rows must not be entered.
    How do I do this?

  • How to process other tasks while reading the data from a file?

    Hello,
    I have a problem. I created a GUI to control the audio data processing. When the "Start" button is pressed, a block of data is read from the audio file and processed, then the loop repeats. But other buttons or methods cannot be operated during that processing until all the data is processed.
    How can I let the other methods work under that circumstance? Should I use multiple threads?
    The relevant code:
    SourceDataLine line;
    boolean stop = false;        // set to true by the "Stop" button
    // audioInputStream is an AudioInputStream field opened elsewhere

    // When the "Start" button is pressed, the following method is invoked:
    public void play(){
        line.start();
        int bytesRead = 0;
        byte[] dataBuf = new byte[7500];
        while (bytesRead != -1 && !stop) {
            try {
                bytesRead = audioInputStream.read(dataBuf, 0, dataBuf.length);
            } catch (IOException e) {
                e.printStackTrace();
            }
            if (bytesRead >= 0) {
                int bytesWritten = line.write(dataBuf, 0, bytesRead);
            }
        }
        line.drain();
    }

    // When the "Stop" button is pressed, this method is invoked:
    public void stop(){
        stop = true;
        line.stop();
    }

    // When the "***" button is pressed, the following method is invoked:
    public void ***(){
    Many thanks. Looking forward to getting your answer.

    You have to use a separate thread.
    When you press the start button, you have to make sure a separate thread is started:
    Thread runner = new Thread(){
        public void run(){
            // Here comes your code which reads the file
        }
    };
    runner.start();
    KR,
    Jan
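    For completeness, a minimal sketch of how the two pieces could be wired together (the class and handler names are made up; the volatile keyword matters so that the stop flag set from the Swing event thread is seen by the playback thread):

    public class PlayerPanel {
        private volatile boolean stop = false;   // written by the EDT, read by the playback thread

        // "Start" button handler: run the blocking play() loop off the Event Dispatch Thread
        public void onStart() {
            stop = false;
            Thread runner = new Thread(new Runnable() {
                public void run() {
                    play();   // the existing SourceDataLine read/write loop, checking 'stop' each pass
                }
            });
            runner.start();
        }

        // "Stop" button handler: just flip the flag; the loop exits on its next pass
        public void onStop() {
            stop = true;
        }

        private void play() {
            // ... the read/write loop from the question goes here ...
        }
    }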

  • SQLDatabase: Read a lot of data at once and process in memory or read the data when I need it?

    I'm not sure how to approach this problem. I require a big chunk of data records from the SQL server. This chunk is based on variables, so I don't know beforehand which records I need. I need to do a large series of calculations, and each calculation requires one (or more) records from this chunk of data. Again: I do not know which records are required.
    Should I:
    A. Load this data into the application memory all at once
    This creates a single connection to the DB, loads ALL required data with a query command (and a forward-only DataReader) and then doesn't bother the SQL server anymore.
    The data fetch seems to be slow, since it's reading hundreds of thousands of lines into memory.
    B. Whenever the calculation needs data, retrieve it from the database
    This would open and close a connection to the SQL DB multiple times per second.
    The initial data fetch is reduced to only a few milliseconds, but it creates a massive load on the SQL server during the calculation.

    Firstly, if you can turn your whole calculation into an SQL query (or a series of queries or a stored procedure) then do so. Databases are good at this stuff, and you or a DBA may be able to do a lot to improve the query if it's still too slow.
    If not:
    Use a connection pool. Not doing so is usually crazy, unless you're writing a script that only connects once or twice.
    If you're testing this in a development environment with a local DB, beware that there can be a big difference in performance characteristics compared to a production one, and don't over-optimize based on what you measure. Network delays in particular could catch you out. Fetching one row at a time may be fine with a low network latency, and awful with a high one.
    Database sizes are usually bigger in production, and go up over time. If you fetch all the data in advance you could get caught out and run out of memory (unless you know more about your data than we do...).
    As Pieter B suggests, you're probably better off fetching data in batches if you really need a large number of rows; a sketch of that idea follows below. Then you'll neither blow everything else out of your server's memory, nor have a network latency and query overhead on every row. It'll also help if you want to report progress to the user.
    If you're really serious about making it go as fast as possible and not using SQL to do it, then you could try parallelizing your code. Then you can be calculating with one set of data whilst fetching the next, and if your production DB has multiple cores and disks you can parallelize in the DB, too. You could also look at caching, if that's appropriate (memcached and similar, or directly in your server if you know your data sizes well enough).
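    The thread is about ADO.NET, but the batching idea looks much the same from any client; here is a JDBC-flavoured sketch (table, column and batch size are invented) that pages through the input keyed on an increasing id, so you neither load the whole table nor pay one round trip per row:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class BatchedReader {
        private static final int BATCH = 5000;   // rows fetched per round trip

        public static void readAll(Connection con) throws Exception {
            long lastId = 0;
            boolean more = true;
            PreparedStatement ps = con.prepareStatement(
                    "SELECT TOP (" + BATCH + ") id, payload FROM calc_input " +
                    "WHERE id > ? ORDER BY id");                  // SQL Server paging syntax
            try {
                while (more) {
                    ps.setLong(1, lastId);
                    ResultSet rs = ps.executeQuery();
                    more = false;
                    while (rs.next()) {
                        lastId = rs.getLong("id");
                        more = true;
                        process(rs.getString("payload"));         // feed one row into the calculation
                    }
                    rs.close();
                }
            } finally {
                ps.close();
            }
        }

        private static void process(String payload) {
            // the calculation for one record goes here
        }
    }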

  • How can I import my IE8 bookmarks/favorites into Safari if I only have access to the data stored on my old hard drive? I can't run Windows to export them to an html file.

    My old PC died and I replaced it with a MacBook Pro.  Although I was able to copy the data from the PC's hard drive, I am unable to export my previous bookmarks/favorites to an html file because that PC won't boot.  Is there any other way to import them into Safari using the raw data stored on that drive or convert them somehow to a format I can import?

    You can only use a JSON backup to restore all bookmarks in Firefox.
    If you need to import Firefox bookmarks into Safari then you need to (re)install Firefox and export the bookmarks to an HTML file.
    *http://kb.mozillazine.org/Backing_up_and_restoring_bookmarks_-_Firefox
