Maximum data size in DIAdem

Hi,
I am trying to load a 2 GB text file in DIAdem. This causes DIAdem to hang, and I am forced to restart the software. Can DIAdem handle a file of this size? If so, what is an approximate loading time?
My system config is:
Windows XP OS
1 GB RAM
3 GHz, P4 processor
Thanks,
Yash

Hello Yashasvi,
Please note that loading large files is significantly faster the more memory (2 GB, or better 3-4 GB) and the faster a hard drive you have. A fast (and defragmented) hard drive helps tremendously, especially on laptop computers (make sure to have a 7,200 RPM drive rather than a 5,400 RPM drive if you have a choice).
Your file is about twice the size of your available memory, and once you add Windows and DIAdem to the memory needs, you have very little memory left for data. Memory upgrades are relatively cheap and will have a big influence on your import performance.
I can only reinforce what Brad mentioned: DIAdem probably doesn't crash, it is just very busy importing your large data file - we are talking hours here, because memory has to be swapped out to the hard drive. I recently tested something similar on my laptop (2 GB RAM, 1.83 GHz dual core) and on my home PC (quad core with 4 GB RAM and a fast hard drive). My home desktop PC was about 7x faster with the same file.
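As rough arithmetic (the 0.5 GB figure for Windows XP plus DIAdem is an assumption, not a measurement):
    memory left for data:        1 GB RAM - ~0.5 GB (OS + DIAdem) = ~0.5 GB
    file to import:              2 GB
    data that must page to disk: at least 2 GB - 0.5 GB = 1.5 GB
Shuttling that much data through a desktop hard drive is what stretches the import to hours, even though nothing has actually crashed.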
Have a great weekend,
Otmar D. Foehner
Business Development Manager
DIAdem and Test Data Management
National Instruments
Austin, TX - USA
"For an optimist the glass is half full, for a pessimist it's half empty, and for an engineer is twice bigger than necessary."

Similar Messages

  • Oracle Database Express Edition maximum data size

    Hi
    We just need to know the maximum amount of data (in GB) that Oracle Database Express Edition 10g can store.
    Thanks

    915071 wrote:
    Hi
    We just need to know the maximum amount of data (in GB) that Oracle Database Express Edition 10g can store.
    Thanks

    4 GB.

  • Maximum data size for notes in Address Book?

    Does anyone know if there is a definable limit to the amount of text the Note field will accept ?
    I'm trying to migrate a colleague from Now Contact to Address Book (primarily for the cloud/sync facilities; he has .Mac and accounts on four separate Macs). Now Contact can store multiple individual Note attachments per contact. I can script the consolidation of that data for each contact, but I was wondering whether I would run into trouble dumping as much as, say, 1024k of text into the Note field for one contact.
    Any pointers appreciated, thanks.
    Andy.

    There is no defined limit to the size of Notes data in Address Book, at least none that Apple has documented.
    Mulder

  • Maximum file size in picture ring?

    Hello folks!
    I am planning to use a picture ring with quite a large amount of data.
    My question: is there a maximum data size that I can embed in a picture ring (number of pictures or overall file size)?
    Thanks!

    If you have enough memory to keep all the images open simultaneously, then something like this might help (see the sketch after this reply). Put all your images in the same directory on disk and have no other files in that directory. Then use List Folder from the Advanced File palette to get an array of the filenames. Feed that array to a for loop where you open all the files and place the images into the picture ring. I have written a "slide show" program which does this. Never tried it with 400 images though.
    If you do not have enough memory for all the images, then you need to manage the images much more carefully.
    Lynn
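    Lynn's recipe above is LabVIEW graphical code, so there is no snippet to paste verbatim. As a rough illustration only, the same pattern (list a directory, loop over the files, keep every decoded image in memory) looks like this in Java; the folder path is hypothetical:

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;
    import javax.imageio.ImageIO;

    public class SlideShowLoader {
        public static void main(String[] args) throws Exception {
            File dir = new File("/path/to/images");  // hypothetical folder containing only image files
            File[] files = dir.listFiles();          // analogous to List Folder in LabVIEW
            if (files == null) {
                throw new IllegalStateException("Not a directory: " + dir);
            }
            List<BufferedImage> images = new ArrayList<>();
            for (File f : files) {
                images.add(ImageIO.read(f));         // decode and keep each image in memory
            }
            System.out.println("Loaded " + images.size() + " images");
        }
    }

    The memory caveat is the same in any language: every decoded image stays resident, so hundreds of large images can exhaust RAM, which is exactly Lynn's second point.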

  • Maximum size of data file that can be sent to MQ using the MQ adapter

    Dear All,
    What is the maximum size of data file that can be sent to MQ using the MQ adapter?
    The file can be CSV, fixed-length, or XML.
    Please let me know if anybody is aware of any limitations
    Best Regards
    Arc

    If you are on 10g then you are looking at a limit of 7MB, although any messages over 1MB will require some tuning.
    You can integrate using a native schema or XML, so you can use CSV, fixed-length, and XML files.
    cheers
    James

  • Maximum data file size in 10g, 11g

    DB Versions:10g, 11g
    OS & versions: Aix 6.1, Sun OS 5.9, Solaris 10
    This is what Oracle 11g Documentation
    http://download.oracle.com/docs/cd/B28359_01/server.111/b28320/limits002.htm
    says about the Maximum Data file size
    Operating system dependent. Limited by maximum operating system file size; typically 2^22 or 4 million blocks
    I don't understand what this 2^22 thing is.
    On our AIX machine, the ulimit command shows
    $ ulimit -a
    time(seconds)        unlimited
    file(blocks)         unlimited  <-------------------------------------------
    data(kbytes)         unlimited
    stack(kbytes)        4194304
    memory(kbytes)       unlimited
    coredump(blocks)     unlimited
    nofiles(descriptors) unlimited
    threads(per process) unlimited
    processes(per user)  unlimited
    So, this means that on AIX both the OS and Oracle can create a data file of any size. Right?
    What about 10g, 11g DBs running on Sun OS 5.9 and Solaris 10 ? Is there any Limit on the data file size?

    How do I determine the maximum number of blocks for an OS?
    df -g would give you the block size. The OS block size is 512 bytes on AIX.
    Let's say the db_block_size is 8K. What would the maximum file size for a data file in a smallfile tablespace and a bigfile tablespace be?
    Smallfile (traditional) tablespaces: a smallfile tablespace is a traditional Oracle tablespace, which can contain 1022 datafiles or tempfiles, each of which can contain up to approximately 4 million (2^22) blocks - 32 GB with 8K blocks.
    A bigfile tablespace contains only one datafile or tempfile, which can contain up to approximately 4 billion (2^32) blocks. The maximum size of the single datafile or tempfile is 128 terabytes (TB) for a tablespace with 32K blocks and 32 TB for a tablespace with 8K blocks.
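    To make the arithmetic behind those figures explicit (8K blocks, as in the question):
    smallfile: 2^22 blocks x 8 KB = 4,194,304 x 8 KB = 32 GB per datafile
    bigfile:   2^32 blocks x 8 KB = 4,294,967,296 x 8 KB = 32 TB for the single datafile
    bigfile:   2^32 blocks x 32 KB = 128 TB with a 32K block size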
    HTH
    -Anantha

  • Maximum package size for data packages was exceeded and Process terminated

    Hello Guru,
    When I execute the process chain I get the message "Maximum package size for data packages was exceeded" and the process terminates. Can anybody help me understand how to proceed?
    Thanks & Regards,
    Suresh.

    Hi,
    When the load fails to process because of a huge volume of data, or too many records per data packet, please try the options below.
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome this issue.
    You can also try RSCUSTV* (where * is an integer) to change the data load settings.
    To change the data package size for extraction, use transaction RSCUSTV6.
    To change the data package size for uploads from an R/3 system, set the value in R/3 Customizing: transaction SBIW -> General settings -> Maintain Control Parameters for Data Transfer (source system specific).
    Hope this helps.
    Thanks,
    JituK

  • "Maximum package size for data packages was exceded".

    Hi,
    We are getting the error below.
    "Maximum package size for data packages was exceeded."
    In our scenario we load the data to the DSO by product key (which is also a semantic key) through a start routine. The logic in the start routine calculates the unique product count per product key, so we are trying to group by product key through semantic groups.
    Ex: In this example the unique product counts should be A = 1, B = 2, C = 1
    Product Key   Products
    A             1000100
    B             2000100
    C             3000100
    B             2000300
    C             3000100
    For some product keys the data is so huge that we could not load it, and we get the error.
    Please suggest an alternate way to handle this through code, or by introducing another flow.
    Regards,
    Barla

    Hi,
    You can solve the issue by temporarily opening up the system setting for the data package size. As shown below, create two programs: one to open (raise) the setting before the load, and one to close (restore) it afterwards.
    1. "Open" program:
    DATA: z_roidocprms LIKE TABLE OF roidocprms.
    DATA: wa LIKE LINE OF z_roidocprms.
    wa-slogsys = 'system_client'. wa-maxsize = '50000'. wa-statfrqu = '10'.
    wa-maxprocs = '6'. wa-maxlines = '50000'.
    INSERT wa INTO TABLE z_roidocprms.
    MODIFY roidocprms FROM TABLE z_roidocprms.
    2. "Close" program (same structure, with the values you want to restore):
    DATA: z_roidocprms LIKE TABLE OF roidocprms.
    DATA: wa LIKE LINE OF z_roidocprms.
    wa-slogsys = 'system_client'. wa-maxsize = '50000'. wa-statfrqu = '10'.
    wa-maxprocs = '6'. wa-maxlines = '50000'.
    INSERT wa INTO TABLE z_roidocprms.
    MODIFY roidocprms FROM TABLE z_roidocprms.
    Maintain the InfoPackage settings for the data load accordingly, and build the process chain as:
    1. "Open" program
    2. Data load InfoPackage
    3. "Close" program
    This might fix the problem.
    Regards,
    Polu.

  • Maximum package size for data packages was exceeded?

    Hi Experts,
    I am facing the problem "Maximum package size for data packages was exceeded" when I try to load. I have tried reducing the data packet size and changing the DTP extraction setting to "Get All New Data Request by Request", but the same error still occurs. Can you please shed some light on this?
    Thanks,
    Krishna

    You can refer to the below OSS note:
    Note 1144332 - Consulting note: Message RSBK 250: Package size exceeded
    And other related notes: 352038, 417307
    Hope this helps.
    Murali

  • SQL*Loader maximum data file size?

    Hi - I wrote a SQL*Loader script, run through a shell script, which imports data from a CSV file into a table. The CSV file is around 700 MB. I am using Oracle 10g in a Sun Solaris 5 environment.
    My question is: is there a maximum data file size? The following code is from my shell script.
    SQLLDR=
    DB_USER=
    DB_PASS=
    DB_SID=
    controlFile=
    dataFile=
    logFileName=
    badFile=
    ${SQLLDR} userid=$DB_USER"/"$DB_PASS"@"$DB_SID \
              control=$controlFile \
              data=$dataFile \
              log=$logFileName \
              bad=$badFile \
              direct=true \
              silent=all \
              errors=5000
    Here is my control file code:
    LOAD DATA
    APPEND
    INTO TABLE KEY_HISTORY_TBL
    WHEN OLD_KEY <> ''
    AND NEW_KEY <> ''
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
            OLD_KEY "LTRIM(RTRIM(:OLD_KEY))",
            NEW_KEY "LTRIM(RTRIM(:NEW_KEY))",
            SYS_DATE "SYSTIMESTAMP",
            STATUS CONSTANT 'C'
    )
    Thanks,
    -Soma

    Hello Soma.
    How many records exist in your 700 MB CSV file? How many do you expect to process in 10 minutes? You may want to consider performing a set of simple unit tests with 1) one record, 2) 1,000 records, 3) a 100 MB file, etc., first to validate that your shell script and control file syntax work as expected (including the writing of log files), and second to gauge how long processing the full file will take (a worked estimate follows this reply).
    Hope this helps,
    Luke
    Please mark the answer as helpful or answered if it is so. If not, provide additional details.
    Always try to provide actual or sample statements and the full text of errors along with error code to help the forum members help you better.
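    To make Luke's second point concrete with hypothetical numbers: if the 100 MB test load finishes in, say, 80 seconds, and direct-path loads scale roughly linearly with file size, the full file would come in around (700 MB / 100 MB) x 80 s = 560 s, a little over 9 minutes - exactly the figure to compare against a 10-minute processing budget.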

  • Setting maximum packet size in JDBC driver to send data to database

    Could someone tell me how I can set the JDBC connection property for the maximum packet size when sending data to the database?
    Regards
    Rashed

    Hi, thanks. I'm having a strange SQLException while trying to insert BLOB image data into an Oracle database. I say strange because it throws the exception for the same image that was inserted successfully before. My program is run from Oracle Forms, and some image data is inserted into the database through a loop. I can't work out what in my code is causing the problem. In fact, when I run my program independently, not from Oracle Forms, it runs fine and every image gets inserted into the database. Given below is my code snippet:
    public void insertAccDocs(String[] accessions) throws SQLException {
        for (int q = 0; q < accessions.length; q++) {
            final String docName = accessions[q];
            // con, st, callable, dbThread, getConnected(), formatFree, formatted,
            // uiid, imgList and imgNameList are fields of the enclosing class
            dbThread = new Thread(new Runnable() {
                public void run() {
                    try {
                        System.out.println("insertDB before connection");
                        getConnected();
                        System.out.println("insertDB after connection");
                        st = con.createStatement();
                        String text = formatFree;
                        String qry = "INSERT INTO DOCUMENT VALUES ('" + docName + "','"
                                + text + "','" + formatted + "','" + uiid + "')";
                        int ok = st.executeUpdate(qry);
                        if (ok == 1) {
                            System.out.println("INSERTION SUCCESS = " + ok);
                            System.out.println("Image list size = " + imgList.size());
                            // insert the child rows, one per image
                            for (int i = 0; i < imgList.size(); i++) {
                                String imgPath = "" + imgList.get(i);
                                String img = "" + imgNameList.get(i);
                                int imgLen = (int) new File(imgPath).length(); // file size, not available()
                                BufferedInputStream bufStr =
                                        new BufferedInputStream(new FileInputStream(imgPath));
                                callable = con.prepareCall("{call prc_insert_docimage(?,?,?)}");
                                callable.setString(1, docName);
                                callable.setString(2, img);
                                callable.setBinaryStream(3, bufStr, imgLen);
                                callable.execute();
                                callable.close();
                                bufStr.close();
                            }
                            con.commit();   // commit once, after all images are inserted
                        } else {
                            System.out.println("INSERTION NOT SUCCESS");
                        }
                        con.close();        // close the connection once, outside the image loop
                    } catch (Exception err) {
                        System.out.println(err.toString());
                    }
                }
            });
            dbThread.start();
        }
    }
    And the exception thrown is:
    java.sql.SQLException: Data size bigger than max size for this type: #####
    Please let me know if possible how I can get around this.
    Regards
    Rashed
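    A note on that exception: "Data size bigger than max size for this type" is generally raised when a single bound value exceeds the maximum size its bind type allows. A common remedy - sketched below as an assumption, not a verified fix for this exact schema - is to stop concatenating values into the SQL string and bind them through a PreparedStatement instead, streaming anything large so the driver can treat it as a LOB. The table and column layout mirror the DOCUMENT insert above:

    import java.io.StringReader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class LargeValueInsert {
        // Hypothetical helper: insert one DOCUMENT row, streaming the large text column.
        static void insertDocument(Connection con, String docName, String text,
                                   String formatted, String uiid) throws Exception {
            String sql = "INSERT INTO DOCUMENT VALUES (?, ?, ?, ?)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, docName);
                // Stream the potentially large value instead of inlining it in the SQL,
                // so the driver can bind it as a LOB rather than a size-limited literal.
                ps.setCharacterStream(2, new StringReader(text), text.length());
                ps.setString(3, formatted);
                ps.setString(4, uiid);
                ps.executeUpdate();
            }
        }
    }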

  • Maximum data segment size for a process

    Hello
    we have an issue in our proxy server with the size of a process.
    The process grows in size, and when it reaches 4 GB it stops with an error that it cannot allocate memory. We checked and there is plenty of swap left; all the ulimit settings in Solaris 10 are set to 'unlimited'.
    Is there a way to see, determine, or configure the maximum process size on 64-bit Solaris?
    thank you in advance
    Mario G.

    It sounds like you're running a 32-bit app. You need a 64-bit application for it to be able to utilize more than 4 GB of RAM.
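    The 4 GB ceiling is exactly the 32-bit virtual address space: 2^32 bytes = 4,294,967,296 bytes = 4 GB. No amount of swap or ulimit tuning lets a 32-bit process address more than that; running the proxy as a 64-bit binary is what lifts the limit.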

  • Maximum IMD size that can be processed by Meter Data Management v2.x

    Does anyone know the maximum IMD size that MDM 2.x can process? I was trying to process an MV90 interval IMD with 15-minute cuts for a period of 6 months (trying to load historical usage in one go). MDM times out and fails to process the IMD. The IMD has been created, but it fails when I try to do Additional Processing (with just a dummy VEE group, so no 'real' validation is happening).

    You have two options:
    1. The internal router must have a static route to the ASA to reach the VPN networks, and must redistribute that static route so the other routers participating in EIGRP know how to reach the VPN networks.
    2. You can configure "set reverse-route" on the crypto map on the ASA, then configure EIGRP on the ASA and add "redistribute static" so that routes learned via VPN (treated as static routes) can be pushed through EIGRP.

  • How to change font size and maximum column size in the results screen?

    Hi all,
    SQL Developer is great to work with.
    But I have trouble with how to change the font size and the maximum column size in the results screen.
    My users find the font in the results screen very small, and whenever the data in a column is long, the full value is not shown; they must double-click to extend the column. Is there an option to set a default maximum size so the full data shows in each column? I have tried, but have not managed to do it.
    I would appreciate anyone's help.
    Thanks all.
    Sigmasvn

    You can't change the font for the results screen yet; however, you will be able to select an auto-fit option for selected columns, so if some columns have slightly wider text you'll be able to set the column widths to handle them.
    Also, there's the option of switching the layout of a record in the grid.
    Sue

  • What is a reasonable maximum file size for a film in Captivate?

    Hi there,
    I'm creating an e-learning course in Captivate 7, and it is being published as HTML5. This means the films I've imported are converted to MP4s, and they are around 10-20 MB in size once converted. They seem very slow to load on some computers - do you think the file size is too big? Or could it be another issue? Does anyone have a recommendation for a maximum file size for films? They are 572 x 322 px and around 1-2 minutes in length.
    Thanks in advance

    Probably better guidelines that dictate the 'size' of a VI are:
    typically no more than one video screen in size
    is it legible
    and are its function and operation clear.
    I have seen examples of '1 VI does it all' that were many screens wide and tall, totally a flustercuck, and nearly 1 MB in size.
    Globals and local variables (except for LV2 style) are typically shunned, for they can create a host of problems (race conditions, indeterminate data).
    Use the connector pane to wire controls and indicators to, then use wires between VIs to transfer data. I tend to use clusters to hold shared data.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    "It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
