Data is overwritten in database

Dear friends,
I am saving data to a database table through a screen, but the problem is that it should allow duplicate entries
without overwriting the existing record.
  WA_ORDTYP-TXT           = T003P-AUART.
  WA_ORDTYP-AUART         = T003P-AUART.
  WA_ORDTYP-ZRD_FIELD     = ZRD_FIELD.
  WA_ORDTYP-ZRD_FLHDR     = ZRD_FLHDR.
  WA_ORDTYP-ZRB_SLB_PARA1 = ZRB_SLB_PARA1.
  WA_ORDTYP-ZRB_SLB_PARA2 = ZRB_SLB_PARA2.
  WA_ORDTYP-ZRB_SLB_PARA3 = ZRB_SLB_PARA3.
  WA_ORDTYP-ZRB_SLB_PARA4 = ZRB_SLB_PARA4.
  WA_ORDTYP-ZRB_CLSS      = ZRB_CLSS.
* MODIFY updates the row when the full primary key already exists;
* it only inserts when the key is new.
  MODIFY ZRD_ORDTYP FROM WA_ORDTYP.
  CLEAR WA_ORDTYP.

Hello Pramod,
I think the fields in your database table are related through a composite PRIMARY KEY.
A composite primary key is made up of two or more fields together.
The concept of a composite primary key is this: the database compares a record using the whole primary key. For example, if your table has three key fields, it compares all three. If any value differs in any of the three fields, a new record is inserted; only when all key fields match does MODIFY overwrite the existing record instead of inserting a new one.
So check the key definition of your database table.
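To illustrate in plain SQL (not ABAP; the table and columns here are invented for the example), a composite key only treats a row as a duplicate when every key column matches:
  -- Hypothetical table with a two-column composite primary key.
  CREATE TABLE ordtyp_demo (
    auart     VARCHAR2(4),
    zrd_field VARCHAR2(10),
    txt       VARCHAR2(40),
    PRIMARY KEY (auart, zrd_field)
  );
  INSERT INTO ordtyp_demo VALUES ('ZOR1', 'A', 'first');   -- new key: inserted
  INSERT INTO ordtyp_demo VALUES ('ZOR1', 'B', 'second');  -- one key column differs: inserted
  -- Re-using the full key ('ZOR1', 'A') is a duplicate; an upsert such as
  -- ABAP's MODIFY would overwrite 'first' instead of adding a row.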
Hope this helps to clear up the concept.
Thanks
-Arun kayal

Similar Messages

  • I want to update, not overwrite my database!

    Morning all,
    In Dreamweaver CS4 - I've managed to insert data from my Repeating Region using Dynamic Checkboxes and Text Fields into my MySql database (thanks again to www.DwFAQ.info) , but when I go to insert another/new record, the checkboxes are already selected and the text from the previous entry is showing in the text fields.
    Obviously the dynamic fields are reading the data that is already in the database, but how do I send the data from the form to the database, then clear the form for the next use?
    I want to update my database with football player stats, and display the running total of games played, goals scored etc on another page, so I need to update the database each time I input data, not overwrite the previous entry.
    Should I be looking at the Dynamic input from Dreamweaver, or the MySql data/functions side of things?
    Any help or general pointers would be much appreciated!
    Many Thanks
    Andy

    I am still not clear on the total design and workflow.
    I would envision a design like this with 3 tables
    Player
    Match
    Player_Match
    The Player table stores details about the player not related to any specific match
    The Match table stores details about the match not related to a specific player
    The Player_Match stores details about a player for a specific match. This would include foreign keys to the other two tables and include attributes like cards, goals_scored, assists, time_played, no_show, etc.
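    A minimal MySQL sketch of that three-table layout (table and column names here are illustrative, not prescriptive):
    -- Illustrative schema; adjust names and types to your data.
    CREATE TABLE Player (
      player_id INT AUTO_INCREMENT PRIMARY KEY,
      name      VARCHAR(60) NOT NULL
    );
    CREATE TABLE `Match` (            -- backticks: MATCH is a reserved word in MySQL
      match_id  INT AUTO_INCREMENT PRIMARY KEY,
      played_on DATE NOT NULL
    );
    CREATE TABLE Player_Match (
      player_id    INT NOT NULL,
      match_id     INT NOT NULL,
      goals_scored INT DEFAULT 0,
      cards        INT DEFAULT 0,
      time_played  INT DEFAULT 0,     -- minutes
      PRIMARY KEY (player_id, match_id),
      FOREIGN KEY (player_id) REFERENCES Player (player_id),
      FOREIGN KEY (match_id)  REFERENCES `Match` (match_id)
    );
    Running totals (games played, goals scored, etc.) then come from COUNT()/SUM() over Player_Match, so each match adds rows instead of overwriting the previous entry.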
    Is this your concept? Are you updating any stats during the game or after the game is complete? If the latter, then you won't really be updating those rows. If the former, the workflow would be:
    Before each game: Insert a new Match row for a new Match. Insert new Player_Match rows for each player as they enter the game using insert behavior. Use the update behavior to update the Player_Match row when there is a goal, card, etc.
    >Do I need to send the data from the input form somewhere else,
    >which will then update the database, and then the form will be clear
    >again when I go into it, or is it to do with the Update/Insert functions in Mysql
    Again, if you are using the DW Insert behavior, all of the fields should be cleared. If using the update behavior, they should be populated. We may still not be understanding each other.

  • How to restore without overwriting an existing database?

    I create a test database named TESTONE
    USE MASTER
    GO
    if DB_ID('testone') IS NOT null
    drop database TESTONE
    GO
    CREATE DATABASE TESTONE
    GO
    and change recovery model to simple
    alter database TESTONE set recovery simple
    GO
    so I create a sample table
    use TESTONE
    GO
    create table table1(
    id int identity primary key,
    descr varchar(10)
    );
    go
    I take a full backup of that database
    USE master
    GO
    BACKUP DATABASE TESTONE
    TO DISK=N'c:\TEMP\testone.bak'
    WITH NOFORMAT, INIT, NAME = N'TESTONE', SKIP, NOREWIND, NOUNLOAD, STATS = 10
    GO
    After that I do something bad :-)
    USE TESTONE
    GO
    drop table table1
    go
    USE master
    GO
    USE [master]
    RESTORE DATABASE [TESTONE] FROM DISK = N'C:\TEMP\testone.bak' WITH RECOVERY, FILE = 1, NOUNLOAD, STATS = 5
    GO
    The restore command doesn't contain a WITH REPLACE clause, and yet
    if you expand the tables tree you'll find your deleted table:
    the database has been overwritten.
    Why?

    That is the default behavior of RESTORE. Since you are restoring the same DB (not a different DB) and the DB is in the simple recovery model, it will overwrite the existing DB. See https://msdn.microsoft.com/en-us/library/ms186858.aspx.
    Here is an excerpt from this link:
    REPLACE Option Impact
    REPLACE should be used rarely and only after careful consideration. Restore normally prevents accidentally overwriting a database with a different database. If the database specified in a RESTORE statement already exists on the current server and the specified database family GUID differs from the database family GUID recorded in the backup set, the database is not restored. This is an important safeguard.
    The REPLACE option overrides several important safety checks that restore normally performs. The overridden checks are as follows:
    1. Restoring over an existing database with a backup taken of another database. With the REPLACE option, restore allows you to overwrite an existing database with whatever database is in the backup set, even if the specified database name differs from the database name recorded in the backup set. This can result in accidentally overwriting a database by a different database.
    2. Restoring over a database using the full or bulk-logged recovery model where a tail-log backup has not been taken and the STOPAT option is not used. With the REPLACE option, you can lose committed work, because the log written most recently has not been backed up.
    3. Overwriting existing files. For example, a mistake could allow overwriting files of the wrong type, such as .xls files, or files that are being used by another database that is not online. Arbitrary data loss is possible if existing files are overwritten, although the restored database is complete.
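    If the goal is to get the dropped table back without touching the live database at all, another option is to restore the backup side by side under a new name (a sketch: the logical file names below are this example's defaults, so verify them first):
    -- Check the logical file names recorded in the backup:
    RESTORE FILELISTONLY FROM DISK = N'C:\TEMP\testone.bak';
    GO
    -- Restore as TESTONE_COPY next to the original instead of over it:
    RESTORE DATABASE TESTONE_COPY
    FROM DISK = N'C:\TEMP\testone.bak'
    WITH RECOVERY, FILE = 1, STATS = 5,
    MOVE N'TESTONE' TO N'C:\TEMP\TESTONE_COPY.mdf',
    MOVE N'TESTONE_log' TO N'C:\TEMP\TESTONE_COPY_log.ldf';
    GO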
    Satish Kartan www.sqlfood.com

  • Update/insert/delete data from xcelsius to Database via web service

    Hi,
    I need to create a dashboard whose functions can update/insert/delete data in a database through web services. As far as I know there are two Xcelsius add-on products which support this kind of write-back, InfoBurst and Flynet:
    InfoBurst
    http://www.infosol.com/azbocug/minutes/4-Writeback%20to%20a%20Database%20with%20Xcelsius.pdf
    Flynet
    http://www.flynetviewer.com/public/community/Blogs/FlynetXcelsiusServerUser/default.aspx
    Apart from these two paid add-ons for Xcelsius, is there any other solution?
    Maybe I need to write something in MSSQL or C# which enables insert, update, delete, etc.?
    Note: I am not using Xcelsius Engage Server, only Xcelsius Engage.
    thanks,
    regards
    s1

    Hi,
    As of now, Xcelsius/Dashboard Design has no built-in feature to insert/update/delete data in a database.
    Solution:
    Create a web service in, let's say, C# or Java which performs the insert/update/delete operation.
    In Xcelsius, add a Web Service connection and use the above web service.
    The Xcelsius Web Service connection provides options to pass input values to a web service (Input pane) and get the result (Output values pane).
    We can pass the values to be written to the database as input to the web service via the Web Service connection from Xcelsius, and so write data to the database.
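    On the database side, such a web service could simply wrap parameterized statements or a stored procedure; a T-SQL sketch (the table, procedure and parameter names are invented):
    -- Hypothetical upsert procedure the web service would call per row.
    CREATE PROCEDURE dbo.UpsertKpi
        @KpiId INT,
        @KpiValue DECIMAL(18, 2)
    AS
    BEGIN
        SET NOCOUNT ON;
        UPDATE dbo.KpiValues SET kpi_value = @KpiValue WHERE kpi_id = @KpiId;
        IF @@ROWCOUNT = 0
            INSERT INTO dbo.KpiValues (kpi_id, kpi_value) VALUES (@KpiId, @KpiValue);
    END
    The dashboard would then feed @KpiId and @KpiValue through the Web Service connection's input pane.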
    Note:
    Performing delete operations from an Xcelsius dashboard could be risky and may remove important data from the database. I would prefer not to offer delete functionality in an Xcelsius dashboard.
    Hope this helps!
    Thank you.
    Regards,
    Vinay Mhaske

  • ORA-28150 when accessing data from a remote database

    Portal Version: Portal 3.0.9.8.0
    RDBMS Version: 8.1.6.3.0
    OS/Vers. Where Portal is Installed:: Solaris 2.6
    Error Number(s):: ORA-28150
    I have a problem with using a database link to access a table in
    a remote database. So long as the dblink uses explicit logins
    then everything works correctly. When the dblink does not have a
    username then I get the ORA-28150 message. The database link is
    always public. A synonym is created locally that points to a
    synonym in the remote database. I am using the same Oracle user
    in both databases. The Oracle portal lightweight user has this
    same Oracle user as its default schema. The contents of the
    remote table are always visible to sqlplus, both when the link
    has a username and when it doesn't have a username.
    All the databases involved are on the same version of Oracle.
    I'm not sure which Oracle login is being used to access the
    remote database: if my lightweight user has a default schema
    of 'xyz', does Portal use 'xyz' to access the remote
    database? I would be very grateful for any help or pointers that
    might help to solve this problem.
    James
    To further clarify this, both my local and remote databases
    schemas are owned by the same login.
    The remote table has a public synonym.
    The link is public but uses default rather than explicit logins.
    The local table has a public synonym that points to the remote
    synonym via the database link.
    If I change the link to have an explicit login then everything
    works correctly.
    I can view the data in the remote database with TOAD and with
    sqlplus even when the database link has default login rather
    than explicit login.
    This seems to point to Portal as being the culprit. Can anyone
    tell me whether default logins can be used across database links
    with portal?
    TIA
    James
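    For reference, the two kinds of link being compared look like this (a sketch: the link names, user and TNS alias are placeholders):
    -- Default (connected user) link: the remote session runs as whoever
    -- is using the link locally.
    CREATE PUBLIC DATABASE LINK remotedb_link
      USING 'REMOTEDB';
    -- Explicit login link: the remote session always connects as xyz.
    CREATE PUBLIC DATABASE LINK remotedb_link_explicit
      CONNECT TO xyz IDENTIFIED BY password
      USING 'REMOTEDB';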

    832019 wrote:
    > One way to do this is by creating a database link and joining the two tables directly. But this option is ruled out. So please suggest me some way of doing this.
    Thus you MUST use two connection strings.
    Thus you are going to be either constructing some intricate SQL dynamically or you are going to be dragging a lot of data over the wire and doing an in-memory search.
    Although realistically that is what the database link table would have done as well.
    Might be better to look at moving the table data from one database to the other. Depends on size of course.

  • Move XMLTYPE data to a remote database

    Hello,
    I need your experience!
    I am working with a 10.2.1 database.
    I have tried to call a remote procedure (over a DBLINK) with an XMLTYPE parameter, and I get an error related to a LOB locator.
    One solution could be to save the XMLTYPE to disk.
    Does anybody know another solution to transfer XMLTYPE data to a remote database?
    Thanks in advance
    Best Regards

    A future alternative will be Oracle Streams. By the way, one of the advantages (sometimes not, but anyway) of an Oracle database is that you can use object-oriented methods, relational methods, Java solutions, etc., and nowadays XML methods. XML was once all about transporting data, creating a uniform interface between solutions (databases, application/web servers, technology stacks, etc.). Think outside the box: XMLDB isn't relational, but it is packaged in one. Make use of it and vice versa.
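    For small documents, one pragmatic workaround (a sketch: the remote procedure is hypothetical, and LOB-over-dblink restrictions vary by version) is to serialize the XMLTYPE to a plain VARCHAR2, which crosses the link without a LOB locator:
    DECLARE
      v_xml XMLTYPE := XMLTYPE('<doc><item>1</item></doc>');
    BEGIN
      -- getStringVal() returns VARCHAR2, so no LOB locator is shipped;
      -- this only works while the document fits within the VARCHAR2 limit.
      -- remote_pkg.load_xml is assumed to rebuild the document on the far
      -- side with XMLTYPE(p_text).
      remote_pkg.load_xml@remotedb(v_xml.getStringVal());
    END;
    /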

  • APEX Application accessing data from two different databases

    Hi All,
    As we all know, an APEX application resides in a database and is connected to a schema of that database.
    I want my APEX application to run against and access data from two different databases. Elaborating my question:
    Currently, my APEX production application is connected to schema XXXX of database DB1 (where APEX resides). Now I want to add some pages to this APEX application for REPORT purposes, but I want these report pages to get their data from a different schema, YYYY, in database DB2.
    Is it possible to configure this scenario?
    The reason for doing this is to avoid the report-related (ad hoc query) resource utilization affecting the production DB1 database.
    Thanks
    Nil

    1. If you do the joining of two or more tables in DB1, then all data is pulled over to DB1 and then the join is executed: more data over the database link and more work for DB1. Better to keep the joining where the data resides and pull over exactly the data you need.
    2. I don't know about your different block sizes. Seems a nice question for one of the other forums (DBA or SQL).
    3. I mean: create synonyms on DB1 for report views in DB2, as sketched below.
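    The plumbing for point 3 might look like this (a sketch: the link, schema and view names are placeholders):
    -- On DB1: a link to DB2 and a local synonym for a report view kept there.
    CREATE DATABASE LINK db2_link
      CONNECT TO yyyy IDENTIFIED BY password
      USING 'DB2';
    CREATE OR REPLACE SYNONYM report_orders FOR report_orders_v@db2_link;
    -- APEX report pages on DB1 now select from report_orders, while the
    -- heavy lifting of the view stays on DB2.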
    Hope all is clear!

  • Upload data from Excel into database through PL/SQL

    Hi All,
    I have an Excel file which contains data, let's say employee details.
    I have an upload button which is used to upload the Excel file; I then want to map the cells of the Excel file to database columns and upload the Excel data into the database through PL/SQL code.
    In short, I want to upload the data from Excel into the database using PL/SQL code,
    or suggest any other way to do this (except the data load method present in APEX).
    Thanks,
    Jitendra

    If you use APEX 4 you can define your own table.
    The code below is for APEX 3.
    PROCEDURE pro_carga_planilla_prosp( p_archivo VARCHAR2) IS
    v_blob_data BLOB;
    v_blob_len NUMBER;
    v_position NUMBER;
    v_raw_chunk RAW(10000);
    v_char CHAR(1);
    c_chunk_len number := 1;
    v_line VARCHAR2 (32767) := NULL;
    v_data_array wwv_flow_global.vc_arr2;
    v_rows number;
    v_sr_no number := 1;
    v_ok boolean := true;
    v_local_ok BOOLEAN := TRUE;
    v_reg_ok NUMBER := 0;
    v_reg_ko NUMBER := 0;
    v_localidad_id NUMBER;
    v_departamento_id NUMBER;
    v_cargo_id NUMBER;
    v_prospecto_id NUMBER;
    v_asesor_id NUMBER;
    V_REG prospectos%rowtype;
    BEGIN
    -- Read data from wwv_flow_files
    select blob_content into v_blob_data
    from wwv_flow_files
    where name= p_archivo;
    v_blob_len := dbms_lob.getlength(v_blob_data);
    v_position := 1;
    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len ) LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    -- pro_log('linea '||v_line);
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Convert comma to : to use wwv_flow_utilities
    v_line := replace(REPLACE (v_line, ',', ':'), ';',':');
    v_line := replace(replace(v_line, chr(10)),chr(13));
    if substr(v_line,1,1)= ':' then
    v_line := '0'||v_line;
    end if;
    if instr(v_line,':',1,21) = 0 then
    if instr(v_line,':',1,20) = 0 then
    v_line:=v_line||':';
    end if;
    v_line:=v_line||':';
    end if;
    -- pro_log(v_line);
    -- Convert each column separated by : into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line);
    -- Insert data into target table
    IF v_data_array(1) IS NOT NULL AND
    v_sr_no <> 1 THEN
    V_REG.NOMBRE:=ltrim(rtrim(v_data_array(2)));
    V_REG.RAZON_SOCIAL:=v_data_array(3);
    V_REG.DIRECCION := v_data_array(4)||' '||v_data_array(5);
    -- PRO_LOG('PROSP 1 ' ||v_sr_no);
    v_localidad_id := pack_empresas.get_localidad(v_data_array(6));
    -- PRO_LOG('PROSP 1.1 '||v_sr_no);
    V_REG.LOCALIDAD_ID:=v_localidad_id;
    -- PRO_LOG('PROSP 1.2 '||v_sr_no);
    V_REG.CODIGO_POSTAL:=LTRIM(RTRIM(v_data_array(7)) );
    -- PRO_LOG('PROSP 1.3 '||v_sr_no);
    -- PRO_LOG('PROSP 1.1 '||v_sr_no);
    v_departamento_id := pack_empresas.get_departamento(v_data_array(8));
    -- PRO_LOG('PROSP 1.4 '||v_sr_no);
    V_REG.DEPARTAMENTO_ID:=v_departamento_id;
    -- PRO_LOG('PROSP 1.5 '||v_sr_no);
    V_REG.TELEFONO:=v_data_array(9);
    --PRO_LOG('PROSP 1.6 '||v_sr_no);
    V_REG.TELEFONO2:=v_data_array(10);
    -- PRO_LOG('PROSP 1.7 '||v_sr_no);
    V_REG.RUBRO:=v_data_array(11);
    -- PRO_LOG('PROSP 1.8 '||v_sr_no);
    V_REG.RUC:=ltrim(rtrim(v_data_array(12)));
    -- PRO_LOG('PROSP 1.9 '||v_sr_no);
    -- pro_log(v_data_array(1));
    -- pro_log(v_data_array(2));
    V_REG.CANTIDAD_EMPLEADOS:=RTRIM(LTRIM(v_data_array(13)));
    -- PRO_LOG('PROSP 1.10 '||v_sr_no);
    -- pro_log(v_data_array(14));
    V_REG.CANTIDAD_BENEFICIARIOS:=RTRIM(LTRIM(v_data_array(14)));
    --PRO_LOG('PROSP 1.11 '||v_sr_no);
    V_REG.MAIL:=v_data_array(19);
    -- pro_log(V_REG.MAIL);
    -- PRO_LOG('PROSP 1.12 '||v_sr_no);
    -- v_data_array(20):= replace(replace(v_data_array(20),chr(10)),chr(13));
    -- Optional columns 20 and 21: read them only when present, otherwise
    -- initialize the slot so the INSERT below can reference it safely.
    if v_data_array.exists(20) then
    V_REG.PROVEEDOR := ltrim(rtrim(replace(replace(v_data_array(20),chr(10)),chr(13))));
    else
    v_data_array(20) := null;
    end if;
    -- PRO_LOG('PROSP 1.13 '||v_sr_no);
    if v_data_array.exists(21) then
    V_REG.OBSERVACIONES := v_data_array(21);
    else
    v_data_array(21) := null;
    end if;
    -- PRO_LOG('PROSP 1.14 '||v_sr_no);
    -- PRO_LOG('PROSP 1.2 '||v_sr_no);
    insert into prospectos (nombre,razon_social, direccion,localidad_id,codigo_postal,
    departamento_id, telefono, telefono2, rubro,ruc,cantidad_empleados,
    cantidad_beneficiarios,mail,proveedor,observaciones)
    values (nvl(ltrim(rtrim(v_data_array(2))),v_data_array(3)), v_data_array(3),
    v_data_array(4)||' '||v_data_array(5),
    v_localidad_id, LTRIM(RTRIM(v_data_array(7))),v_departamento_id, v_data_array(9),
    v_data_array(10),v_data_array(11), ltrim(rtrim(v_data_array(12))), RTRIM(LTRIM(v_data_array(13))),
    RTRIM(LTRIM(v_data_array(14))),v_data_array(19),v_data_array(20), v_data_array(21))
    returning prospecto_id INTO v_prospecto_id;
    -- PRO_LOG('PROSP 2');
    v_cargo_id := pack_empresas.get_cargo(v_data_array(17));
    -- PRO_LOG('PROSP 3');
    insert into prospecto_contactos (prospecto_id,nombre,apellido,cargo_id,
    telefono,mail)
    values (v_prospecto_id, nvl(v_data_array(15),'S/N'), nvl(v_data_array(16),'S/A'),
    v_cargo_id, v_data_array(18), v_data_array(19));
    -- PRO_LOG('PROSP 4');
    v_asesor_id := pack_empresas.get_asesor(v_data_array(1));
    -- PRO_LOG('PROSP 5');
    insert into asignaciones (prospecto_id,asesor_id,fecha_asignacion)
    values (v_prospecto_id, v_asesor_id, trunc(sysdate));
    -- PRO_LOG('PROSP 6');
    END IF;
    -- Clear out
    v_line := NULL;
    v_sr_no := v_sr_no + 1;
    END IF;
    END LOOP;
    delete wwv_flow_files
    where name= p_archivo;
    END pro_carga_planilla_prosp;
    function hex_to_decimal
    --this function is based on one by Connor McDonald
    --http://www.jlcomp.demon.co.uk/faq/base_convert.html
    ( p_hex_str in varchar2 ) return number
    is
    v_dec number;
    v_hex varchar2(16) := '0123456789ABCDEF';
    begin
    v_dec := 0;
    for indx in 1 .. length(p_hex_str)
    loop
    v_dec := v_dec * 16 + instr(v_hex,upper(substr(p_hex_str,indx,1)))-1;
    end loop;
    return v_dec;
    end hex_to_decimal;

  • How to get data into the MySQL database?

    First some background.
    I have a website that has outgrown its designed dimensions and is a huge burden to maintain. See PPBM5 Benchmark.
    There is a lot of maintenance work involved, so I'm investigating a PHP/MySQL approach to ease the burden and to add functionality to the site. With the current Excel-based structure and over 420 entries, it is cumbersome for me to maintain, but also for users to find what they need.
    A MySQL-based dynamic structure is a lot easier and offers vastly more selection capabilities, like selecting only records that meet specific criteria.
    Data submission is done with a form that contains most of the relevant data, but the drawback is that people submitting their data are often not technically inclined, give wrong answers due to a lack of understanding, or make typos. The test results are attached in one or two separate .txt files, but often submitters have not read the instructions correctly or did something wrong, so these attached .txt files cannot be trusted automatically; they have to be checked before inclusion.
    These were my initial thoughts:
    1. Data collection:
    To avoid spending all our energy and time on correcting typos, getting missing data and correcting errors, I am investigating the use of CPU-Z in Ghost mode to create a .txt or .html file that contains all the relevant hardware info we need and even more. It gives all the info we currently have, but adds data like number of memory sticks, DDR timings, stock clock speed and BCLK setting, video card info and VRAM size, etc.
    To see what I mean, run CPU-Z, go to the About tab, press the Save Report button and look at the results.
    This can all be done without user intervention in an automatic way, but maybe I need to add an AutoIt file to the test to make it all run as desired.
    If this works and I'm able to extract the relevant data from the created file and can insert it into the database, we may be in business for the next version, PPBM5.5 or PPBM6. It does require a modification to the instructions, making them a lot easier, because there is less data to fill out.
    2. Data submission:
    The submission form can be simplified if the CPU-Z data can be used. We have to create an automatic way to attach the created .html file from CPU-Z to the submission form, and we have to streamline the Output.txt and Output-MPE.txt files so they are more easily included in the 'form.lib.php' file. It currently is manual labor and very time consuming.
    3. Adding to Database:
    I have to find a way to create database records from the Gmail forms I receive. All incoming mail messages need to be checked for relevancy and, if relevant, added automatically to the database and then offered for approval before final inclusion in the database. Data included in the database will then include submission date and time, email address, IP address used, plus links to the files submitted and available on the website.
    4. Publication of the database:
    After approval of new records from step 3, all updates will be automatically applied to the database and accessible for users. I do not yet intend to introduce user accounts, requiring login before all functionality is accessible. Too much trouble and administration.
    Queries should be possible on things like CPU (check box), so include i7-920, i7-930, i7-950 but exclude i7-980X and i7-990X; size of memory (check box); overclocked (boolean, yes/no); SSD as OS disk; and similar options.
    The biggest problem is to keep the color grading and statistical indicators (Top, D9, Q3, Med, Q1 and D1) intact on dynamically generated queries. Say you make a query which results in 20 observations; this should show the related colors and legends. The next query results in 48 observations, and of course the color grading and legends need to reflect that. Question in my mind: does the RPI remain constant, independent of the query, or does it need to be recalculated on the basis of the query?
    Next thing is to allow a user to select a specific observation and, by simply clicking on it, be shown all the CPU-Z related information about the hardware in a separate window (detail page) or accordion.
    The graphs, Top-20 and MPE Gains, need to be dynamically adjusted, based on the query used.
    5. Ideally, external links:
    In an ideal situation, one could link the CPU-Z data to external price databases, looking up current prices for CPU, memory, video card, disks, raid controller, etc. to get instant BFTB charts, based on the query made. But that is the next step.
    Situation now:
    I have a MySQL database that is easily updated with the new submissions. Simply create a .CSV file from the submitted forms and import that into the database. The bulk of the initial work is done. Lots remain to be done as you can see above, but that is for a later time.
    Question:
    I have this table that needs to be filled with data from the submitted and attached files. Mr. X submitted his data and can be uniquely identified by his "Ref_ID". He attached one or two files in .TXT format with the relevant test data. These files are stored on the server with a concatenated name:
    "Ref_ID","-","filename"
    Say his Ref-ID is: 20110204-6cf5 and his submitted file is called: Output(99).txt then the file can be found on the server as
    20110204-6cf5-Output(99).txt
    I need to be able to open that comma-delimited file, whose contents may look like this: "439","1036","819","531", and insert these contents into the relevant record and fields.
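    Once something (PHP's fgetcsv(), for example) has parsed those numbers, the database side reduces to an UPDATE keyed on the Ref_ID; a MySQL sketch with assumed table and column names:
    -- Hypothetical results table; score_1..score_4 stand in for the
    -- per-record values parsed from the attached file.
    UPDATE results
    SET score_1 = 439,
        score_2 = 1036,
        score_3 = 819,
        score_4 = 531
    WHERE ref_id = '20110204-6cf5';
    Calculated fields like Total Score and RPI can then be computed in the SELECT (or in the PHP code) rather than stored, so they never go stale.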
    Graphically, the attached image shows what I want to achieve.
    This being my first exposure to PHP/MySQL, you can imagine I'm not clear on how to go from here.
    An added complication is that I actually have five numbers to insert per record, plus two calculated fields: Total Score and RPI should be calculated fields. I haven't yet figured out how to handle calculated fields, maybe only in the PHP/HTML code and not in the database.
    I hope someone can help me.

    You do have a very complex looking site and may need several tables in MySQL to handle all that data. If you are new to PHP/MySQL I would suggest taking a look at this tutorial; it will help get you started in understanding how to $_GET info from a database and also how to $_POST data to a database. I am no expert, just learning myself, and I found this very helpful. This is the link: http://www.adobe.com/devnet/dreamweaver/articles/first_dynamic_site_pt1.html
    There are also many tutorials on YouTube to help build a CMS (Content Management System); I would suggest the following:
    http://www.youtube.com/user/phpacademy
    http://www.youtube.com/user/betterphp
    http://www.youtube.com/user/flashbuilding
    And many more on my channel here
    http://www.youtube.com/user/Whisperingonthewind
    A CMS is easier to maintain and makes it simple to add, edit and delete content.
    I have also recently bought a book by David Powers (Training from the Source), which is very helpful.
    Anyway hope you get it sorted.

  • How to store measurement data in a single database row

    I have to store time-data series in a database and do some analysis using Matlab later. Therefore the question might be more a database question than a DIAdem question. Nevertheless, I'm interested whether anyone has some best practices for this issue.
    I have a database which holds lifecycle records for certain components of the same nature and different variants. Depending on the variant I have test setups which record certain data. There is a common set of component properties and a varying number of environmental measurements to be included. Also, the duration of data acquisition varies across the variants.
    Therefore tables appear to be a non-optimal solution for storing the data, because the needed number of columns is unknown. Additionally, I cannot create individual tables for each sample of a variant; this would produce too many tables.
    So there are different approaches I have thought of:
    1. Saving the TDM and TDX files as text or as a BLOB, respectively. This makes it necessary to use intermediate files.
    2. Saving the data as XML text. I don't know yet if I can process XML data in Matlab.
    Has anybody an advice on that problem?
    Regards
    Chris

    Chris
    Sorry for the lateness in replying to your post. 
    I have done quite a bit of work using a database to store test results (in my case this was an Oracle DB, using a system called ASAM-ODS).
    My 2 Cents:
    Three functions were needed by my users: 1) to search and find the tests, 2) to take the list of tests and process the data into a report/output summary, and 3) if the file size is large, to be able to import the data quickly into the analysis tool to speed up processing.
    1) Searching for test results. This all depends on what parameters are of value for searching. In practice this is a smaller list of values (usually under 20), but I have had great difficulty getting users to agree on what these parameters are; they tend to want to search for anything. The organization of the searching parameters has a direct relationship to how you can search. The NI DataFinder does a nice job of searching for parameters, so long as the parameter data are stored in properties of channel groups or channels. It does not search or index if the values are in channel values.
    Another note: given these are searchable parameters, it helps greatly if they have controlled entry, so that the parameters are consistent over all tests and not dependent on free-form entry by each operator. Searching becomes impossible if the operators enter dates/names in wildly different formats.
    2) A similar issue exists if you put values into databases. (I will use the database terms of table, column (parameter) and row (one instance of data, i.e. one test record).)
    An SQL SELECT statement can do fast finds if you store the searchable parameters in rows of a table, where you have one row for each test record. The files I worked with have more than 2000 parameters. Making a table that would store all of these, and be able to search for all of these, makes a very large number of columns. I did not like this approach, as it has substantial maintenance time, and when changes are made, things get ugly quickly.
    3) This is where having the file format be something that the analysis tool can quickly load is beneficial, especially if the data files are large. In DIAdem's case, it reads TDM/TDMS files very quickly into the Data Portal. It can also read MDF or HDF files, but these are hierarchical structures that require custom code to traverse the information and get it into the Data Portal for use in analysis/reporting. (It takes more time to read the data, but you have much more flexibility in the data structure than with the two-level TDM/TDMS format.)
    My personal preferences
    I would not want to put the test data into a table row. Each of the columns would be fixed and the table would be very wide.
    I personally like to put the test data into a file, like TDMS, MDF, or HDF, and then the database has an entry referencing that file. The values that are in the database are just the parameters used for test searching, either in DataFinder or in SQL commands in the user interface, as sketched below.
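    In table form, that pattern might look like this (a sketch: every name here is invented):
    -- One row per test record: a handful of agreed search parameters
    -- plus a pointer to the bulk data file.
    CREATE TABLE test_record (
      test_id    INTEGER PRIMARY KEY,
      test_date  DATE,
      operator   VARCHAR(40),
      variant    VARCHAR(40),
      max_temp_c NUMERIC(6,2),   -- one of the ~20 searchable parameters
      data_file  VARCHAR(255)    -- path/URI of the TDMS/MDF/HDF file
    );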
    Hopefully these comments help your tasks some.
    Respectfully,
    Paul
    tdmsmith.com

  • Exporting data from SQL Server database to Oracle database

    Hello All,
    We need to replicate a table's data from a SQL Server database to an Oracle database.
    Can this task be accomplished using the Import/Export wizard or linked servers?
    Can you help me regarding which Oracle data access components I should download to do this?
    I am using SQL Server 2012.
    And i have Oracle 11g release 2 client installed in my system.
    Thanks in Advance.
    Thanks and Regards, Readers please vote for my posts if the questions i asked are helpful.

    Yes, you can definitely transfer data from SQL Server to Oracle. Have a look at the links below:
    Export SQL server data to Oracle Using SSIS
    Use OLEDB data provider to transfer data from SQL server to Oracle
    Using Import Export Wizard
    Similar thread
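    With a linked server, for instance, a one-off copy can be a single INSERT ... SELECT; a T-SQL sketch (the linked server name ORACLEDB and both table names are placeholders):
    -- Push rows through a linked server created with the Oracle OLE DB provider.
    -- OPENQUERY names the remote target; the local SELECT supplies the rows.
    INSERT INTO OPENQUERY(ORACLEDB, 'SELECT ID, NAME FROM TARGET_TABLE')
    SELECT id, name
    FROM dbo.source_table;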
    Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it.
    My TechNet Wiki Articles

  • How can I save the data from the Oracle database to my local directory

    How can I save the data from the Oracle database to my local directory instead of saving the data file to a directory created on the Oracle server?
    I need to design a procedure which will pull the data from various tables and store the data in the client's local directory.

    Since SQL*Plus runs on the client, you can use SQL*Plus to spool data to your local drive.
    You could also use the database to write to a specified drive where all users have access (a mapped network drive, e.g.). I wouldn't recommend doing it that way, but it is sometimes useful when the files are created in some nightly batch run.
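    A minimal SQL*Plus spool sketch (the path and the classic EMP table are placeholders):
    -- Because SQL*Plus executes on the client, the spool file is written
    -- to the client machine, not the server.
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0 TRIMSPOOL ON
    SPOOL C:\temp\emp_export.csv
    SELECT empno || ',' || ename || ',' || sal FROM emp;
    SPOOL OFF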

  • I need to pass data from an Access database to TestStand by using the built-in Data step types (Open Database / Open SQL Statement, etc.)

    The first time I defined the system everything was fine, but when I changed the database (using MS Access), the "Open SQL Statement" step would show the tables but not their columns. I'm using Win98 Second Edition / TestStand 1.0.1.
    When I tried the same thing on another computer, the same thing happened.
    Appreciate your help.

    Hello Kitty -
    Certainly it is unusual that you can still see the tables available in your MS Access database but cannot see the columns. I am assuming you are configuring an Open SQL Statement step and are trying to use the ring control to select columns from your table?
    Can you tell me more about the changes you made to your file when you 'changed' it with MS Access? What version of Access are you using? What happens if you try and manually type in an 'Open Statement Dialog's SQL string such as...
    "SELECT UUT_RESULT.TEST_SOCKET_INDEX, UUT_RESULT.UUT_STATUS, UUT_RESULT.START_DATE_TIME FROM UUT_RESULT"
    Is it able to find the columns even if it can't display them? I am worried that maybe you are using a version of MS Access that is too new for the version of TestStand you are running. Has anything else changed aside from the file you are editing?
    Regards,
    -Elaine R.
    National Instruments
    http://www.ni.com/ask

  • XML data into non-XML database... solution anyone?

    Hi,
    My current project requires me to store the client's data on our servers. We're using Oracle9i. Daily, I will download the client's data for that day and load it into our database. My problem is that the data file is not a flat file, so I can't use SQL*Loader to load it. Instead, the data file is an XML file. What is the best way to load XML data into a non-XML database? Are there any tools similar to SQL*Loader that will load XML data into a non-XML database? Is it best for the client to give me an XML dump of their data to load into our database, or should I request a flat file? My last resort would be to write some sort of script to parse the XML data into a flat file and then run it through SQL*Loader. Is this the best solution? One thing to note is that these files could be very large.
    Thanks in advance.
    -PV

    > I assume that just putting the XML file into an extremely large VARCHAR field is not what you want. Instead, you want to extract data elements from the XML and write them to columns in a table in your database. Right?
    Yes. Your assumption is correct.
    > It sounds like you already have a script that loads a flat file into your database. In that case I would write an XSL transformation that converts the client's XML into a correctly-formatted flat file.
    Thank you. I'll look into that. Other suggestions are welcome.

  • Error while loading data from Excel to database

    Hi,
    I am using PL/SQL Developer to load the data from Excel into the database. I set the data source in the Control Panel and proceed through the ODBC importer in PL/SQL Developer to import the data.
    The exact error: when I click the filename to view the result preview, it shows an error:
    The field is too small to accept the amount of data you attempted to add. Try inserting or pasting less data.
    Kindly help with solution.
    Thanks/Regards
    Sakthivarman J.

    Hello;
    That error message comes from Microsoft, so something in your Excel sheet is the cause.
    It's a pain, but I would check the properties of each column in case Excel decided to add something, a comma for example.
    Do you have a column over 255 characters? Look there first. If any length is greater than 255 it will crash and burn.
    Or convert it to a CSV and create an external table.
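    A sketch of that external-table route (the directory path, file name and columns are placeholders):
    -- A directory object pointing at the folder holding the CSV.
    CREATE DIRECTORY ext_data AS '/data/loads';
    CREATE TABLE employees_ext (
      emp_id   NUMBER,
      emp_name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_data
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
      )
      LOCATION ('employees.csv')
    );
    -- Then the load is a plain INSERT ... SELECT.
    INSERT INTO employees SELECT * FROM employees_ext;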
    Best Regards
    mseberg
    It might also throw error 3163 where you cannot see it.
