Convert regular | delimited data into XMLType

How do we hold the data?
How do I convert/store regular data (coming from a | delimited .dat file) into an Oracle XMLType?
There will be 400+ columns on each record, and the file will have about 50 million records.
I have a requirement to use XMLType; any help would be greatly appreciated.
In some cases the incoming file has more columns on day 2 than on day 1. If the table is predefined with a fixed number of columns, then in that case I will lose the additional columns' values. The main point is: even if I receive a different number of columns, I need to capture them without altering the table.
Is there any option, without altering the table, to capture the additional columns in Oracle?
Also, are there any advantages or disadvantages to this approach?
Thanks,

Hi,
You can declare the column as XMLType and the storage as CLOB.
  1  SELECT SYS_XMLGen('T')
  2    FROM   user_tables
  3*   WHERE  rownum = 1
sql> /
SYS_XMLGEN('T')
<ROW>T</ROW>
sql> create table t2 (
  2     x xmlType)
  3   XMLType x STORE AS CLOB;
Table created.
sql> insert into t2
  2  SELECT SYS_XMLGen('T')
  3    FROM   user_tables
  4    WHERE  rownum = 1;
1 row created.
sql> commit;
Commit complete.

Thanks,
Rajesh.
Please mark this/any other answer as helpful or answered if it is so. If not, provide additional details/feedback.
Always try to provide create table and insert statements to help the forum members help you better.
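
Not part of the original reply, but here is one way to address the day-2 extra columns from the question: store each delimited record as generic XML, so the number of <COL> elements per row can vary without any ALTER TABLE. The table and function names below are made up for illustration, and this is only a minimal sketch, not a tuned loader (for 50 million rows you would want SQL*Loader or an external table feeding the function rather than row-by-row inserts).

CREATE TABLE staged_records (
  id      NUMBER,
  rec_xml XMLTYPE
) XMLTYPE rec_xml STORE AS CLOB;

CREATE OR REPLACE FUNCTION line_to_xml (p_line IN VARCHAR2)
  RETURN XMLTYPE
IS
  l_doc   CLOB := '<ROW>';
  l_rest  VARCHAR2(32767) := p_line;
  l_piece VARCHAR2(32767);
  l_sep   PLS_INTEGER;
  l_pos   PLS_INTEGER := 0;
BEGIN
  -- walk the line field by field; extra fields on day 2 simply become extra <COL> nodes
  WHILE l_rest IS NOT NULL LOOP
    l_pos := l_pos + 1;
    l_sep := INSTR(l_rest, '|');
    IF l_sep > 0 THEN
      l_piece := SUBSTR(l_rest, 1, l_sep - 1);
      l_rest  := SUBSTR(l_rest, l_sep + 1);
    ELSE
      l_piece := l_rest;
      l_rest  := NULL;
    END IF;
    l_doc := l_doc || '<COL pos="' || l_pos || '">'
                   || DBMS_XMLGEN.CONVERT(l_piece)   -- escape &, <, > in the field value
                   || '</COL>';
  END LOOP;
  RETURN XMLTYPE(l_doc || '</ROW>');
END;
/

INSERT INTO staged_records VALUES (1, line_to_xml('val1|val2|val3'));

The trade-off is the one hinted at in the question: nothing needs to be altered when extra fields arrive, but querying a specific "column" now means extracting /ROW/COL[@pos=n] from the XML, which is slower and loses type checking compared with relational columns.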

Similar Messages

  • How to convert internal table data into XML and XML data into an internal table

    Hi Guys,
    I have a requirement to convert internal table data into XML format and vice versa. For this I came to know that I have to use the Transformations concept. I have done the conversion from internal table to XML data using a standard Transformation. My questions are:
    1) Can I use the same Transformation to convert the XML data back into an ABAP internal table? If it is possible, then how?
    2) Is it possible using the standard Transformation, or do I have to go for the XSLT approach?
    Please help me out with this, guys.
    Thanks and Regards
    Koti

    Hi Koti,
    This is possible. With the help of the link below you can convert ABAP data to XML and vice versa.
    Link: http://www.heidoc.net/joomla/index.php?option=com_content&view=article&id=15:sapxslt&catid=22:sap-xslt&Itemid=31

  • Inserting large xml data into xmltype

    Hi all,
    In my project I need to insert very large XML data into xmltype column.
    My table:
    CREATE TABLE TransDetailstblCLOB ( id number, data_xml XMLType) XmlType data_xml STORE AS CLOB;
    I am using a JDBC approach to insert the values. It works fine for data less than 4000 bytes when using preparedStatement.setString(1, xmlData). As I have to insert large XML data (>4000 bytes) I am now using the preparedStatement.setClob() methods.
    My code works fine for a table whose column is declared explicitly as CLOB. But for TransDetailstblCLOB, where the column is declared as XMLTYPE with the storage option CLOB, I am getting the error: "ORA-01461: can bind a LONG value only for insert into a LONG column".
    This error means that there is a mismatch between my setClob() and the column, which means I am not storing into a CLOB column.
    I read on the Oracle site that:
    "When you create an XMLType column without any XML schema specification, a hidden CLOB column is automatically created to store the XML data. The XMLType column itself becomes a virtual column over this hidden CLOB column. It is not possible to directly access the CLOB column; however, you can set the storage characteristics for the column using the XMLType storage clause."
    I don't understand: it is stated here that there is a hidden CLOB column, so why can't I use setClob()? It worked fine for a pure CLOB column (another table), so why does it give this error for the XMLTYPE table?
    I have been stuck on this for 3 days. Can anyone help me, please?
    My code snippet:
    query = "INSERT INTO po_xml_tab VALUES (?,XMLType(?)) ";
    //query = "INSERT INTO test VALUES (?,?) ";
    // Get the statement object
    pstmt = (OraclePreparedStatement) conn.prepareStatement(query);
    // pstmt = conn.prepareStatement(query);
    //xmlData = "test";
    // If the temporary CLOB has not yet been created, create new
    temporaryClob = oracle.sql.CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
    // Open the temporary CLOB in readwrite mode to enable writing
    temporaryClob.open(CLOB.MODE_READWRITE);
    log.debug("tempClob opened" + "size bef writing data" + "length " + temporaryClob.getLength() +
              "buffer size " + temporaryClob.getBufferSize() + "chunk size " + temporaryClob.getChunkSize());
    OutputStream out = temporaryClob.getAsciiOutputStream();
    InputStream in = new StringBufferInputStream(xmlData);
    int length = -1;
    int wrote = 0;
    int chunkSize = temporaryClob.getChunkSize();
    chunkSize = xmlData.length();
    byte[] buf = new byte[chunkSize];
    // copy the XML string into the temporary CLOB
    while ((length = in.read(buf)) != -1) {
        out.write(buf, 0, length);
        wrote += length;
        temporaryClob.setBytes(buf);
        log.debug("Wrote length " + wrote);
    }
    // Bind this CLOB with the prepared statement
    pstmt.setInt(1, 100);
    pstmt.setStringForClob(2, xmlData);
    int i = pstmt.executeUpdate();
    if (i == 1) {
        log.debug("Record Successfully inserted!");
    }

    Try this; it works for me (via ADOdb):
    declare
      poXML CLOB;
    BEGIN
      poXML := '<OIDS><OID>large text</OID></OIDS>';
      UPDATE a_po_xml_tab SET podoc = XMLType(poXML) WHERE poid = 102;
    END;
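
    A minimal sketch (not from the thread) of the same insert done in PL/SQL against the table from the post, with the payload held in a CLOB variable: because the bind is a CLOB wrapped by the XMLType() constructor, the 4000-byte VARCHAR2 limit behind ORA-01461 does not apply. The JDBC equivalent would be to bind the temporary CLOB you already build with setClob() into "INSERT INTO TransDetailstblCLOB VALUES (?, XMLType(?))" instead of passing the string.
    DECLARE
      big_xml CLOB;
    BEGIN
      -- build a payload well over 4000 bytes
      big_xml := '<root>' || RPAD('x', 10000, 'x') || '</root>';
      INSERT INTO TransDetailstblCLOB (id, data_xml)
      VALUES (1, XMLType(big_xml));
      COMMIT;
    END;
    /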

  • Converting Flat File data into XML

    Hi Experts,
    Consider the message type of the SENDER system and flat file data
    <dt_sender>
    <root>
    <header1>   0..1
        <f1>
        <f2>
        <f3>
    <header2>   0..1
        <f4>
        <f5>
        <f6>
    <item>        1..unbounded
        <f7>
        <f8>
        <f9>
        <f10>
        <f11>
        <f12>
    </item>
    abc     def     ghi     jkl     mno     pqr
    123     123     123     123     123     123
    456     456      456     456     456     456
    How do I convert the flat file data into the following XML data? Please note that each field value is separated by a TAB delimiter. What parameters should be used?
    <root>
        <Header1>
            <f1>abc</f1>
            <f2>def</f2>
            <f3>ghi</f3>
        </Header1>
        <Header2>
            <f4>jkl</f4>
            <f5>mno</f5>
            <f6>pqr</f6>
        </Header2>
        <item>
            <f7>123</f7>
            <f8>123</f8>
            <f9>123</f9>
            <f10>123</f10>
            <f11>123</f11>
            <f12>123</f12>
            <f7>456</f7>
            <f8>456</f8>
            <f9>456</f9>
            <f10>456</f10>
            <f11>456</f11>
            <f12>456</f12>
        </item>
    </root>
    Points will be given to the correct answers.
    Thanks in advance.
    Faisal
    Edited by: Abdul Faisal on Feb 29, 2008 5:53 AM

    Faisal,
    When you read a file with a multiple-recordset structure, each record in the txt file should have a header from which you can identify which segment it should go to, and you identify it by using the key field value in the file adapter.
    <root>
    <header1> 0..1
    <f1>
    <f2>
    <f3>
    <header2> 0..1
    <f4>
    <f5>
    <f6>
    <item> 1..unbounded
    <f7>
    <f8>
    <f9>
    <f10>
    <f11>
    <f12>
    </item>
    For this input file:
    abc def ghi jkl mno pqr
    123 123 123 123 123 123
    456 456 456 456 456 456
    abc def ghi can be read into header1 using the file adapter's key field value, but with the same file adapter you cannot put the remaining fields (jkl mno pqr) into header2.
    Instead you should read the whole row abc def ghi jkl mno pqr into a single field and write a UDF to split the data into header1 and header2.
    Similarly you have to take care of the item records.
    If your input file were something like this:
    abc def ghi
    jkl mno pqr
    123 123 123 123 123 123
    456 456 456 456 456 456
    then abc identifies header1, jkl identifies header2, and so on.
    Read the whole line into a single field and write a UDF to split it into header1 and header2, and similarly for the items.

  • How to Convert internal table data into text output and send mail in ABAP

    Hi All,
    Good Morning.
    I have taken a glance at code that converts internal table data to an Excel file in ABAP, and also checked how to send this Excel file to a mailing list as an attachment.
    But I would like to do it without Excel.
    I mean, I have an internal table which contains fields of all types (character, integer, date, time). Since the output is only around 4 to 5 rows, why convert it to Excel at all? Instead I want to send this output to the users' mail as a normal mail body, with no attachments.
    Could anybody please suggest a way to send internal table data as a mail (not as an Excel or PDF attachment)?
    As of now my finding is that it is quite complex to convert internal table data to email (text) format, but I believe there must be some way of doing it.
    Best Regards
    Dileep VT

    Here's something I have used in the past, where we send out information about failed precalculation settings (which are stored in internal table gt_fail).
    Notice we use gt_text as the "mail body".
    TRY.
    *     -------- create persistent send request ------------------------
           gv_send_request = cl_bcs=>create_persistent( ).
    *     -------- create and set document -------------------------------
    *     create text to be sent
           wa_line = text-001.
           APPEND wa_line TO gt_text.
           CLEAR wa_line.
           APPEND wa_line TO gt_text.
           LOOP AT gt_fail ASSIGNING <fs_fail>.
             MOVE <fs_fail>-retry_count TO gv_count.
             CONCATENATE text-002
                         <fs_fail>-setting_id
                         text-003
                         gv_count
                         INTO wa_line SEPARATED BY space.
             APPEND wa_line TO gt_text.
             CLEAR wa_line.
           ENDLOOP.
           APPEND wa_line TO gt_text.
           wa_line = text-007.
           APPEND wa_line TO gt_text.
    *     create actual document
           gv_document = cl_document_bcs=>create_document(
                           i_type    = 'RAW'
                           i_text    = gt_text
                           i_length  = '12'
                           i_subject = 'Failed Precalculation Settings!' ).
    *     add document to send request
           CALL METHOD gv_send_request->set_document( gv_document ).
    *     --------- set sender -------------------------------------------
           gv_sender = cl_sapuser_bcs=>create( sy-uname ).
           CALL METHOD gv_send_request->set_sender
             EXPORTING
               i_sender = gv_sender.
    *     --------- add recipient (e-mail address) -----------------------
           LOOP AT s_email INTO wa_email.
             MOVE wa_email-low TO gv_email.
             gv_recipient = cl_cam_address_bcs=>create_internet_address(
                                               gv_email ).
             CALL METHOD gv_send_request->add_recipient
               EXPORTING
                 i_recipient = gv_recipient
                 i_express   = 'X'.
           ENDLOOP.
    *     ---------- set to send immediately -----------------------------
           CALL METHOD gv_send_request->set_send_immediately( 'X' ).
    *     ---------- send document ---------------------------------------
           CALL METHOD gv_send_request->send(
             EXPORTING
               i_with_error_screen = 'X'
             RECEIVING
               result              = gv_sent_to_all ).
           IF gv_sent_to_all = 'X'.
             WRITE text-004.
           ENDIF.
           COMMIT WORK.
    *   exception handling
         CATCH cx_bcs INTO gv_bcs_exception.
           WRITE: text-005.
           WRITE: text-006, gv_bcs_exception->error_type.
           EXIT.
       ENDTRY.
    with the following declarations
    * TABLES                                                               *
    TABLES:
       adr6,
       rsr_prec_sett.
    * INTERNAL TABLES & WORK AREAS                                         *
    DATA:
       gt_fail          TYPE SORTED TABLE OF rsr_prec_sett
                             WITH UNIQUE KEY setting_id run_date,
       gt_text          TYPE bcsy_text,
       wa_fail          LIKE LINE OF gt_fail,
       wa_line(90)      TYPE c.
    FIELD-SYMBOLS:
       <fs_fail>        LIKE LINE OF gt_fail.
    * VARIABLES                                                            *
    DATA:
       gv_count(4)      TYPE n,
       gv_send_request  TYPE REF TO cl_bcs,
       gv_document      TYPE REF TO cl_document_bcs,
       gv_sender        TYPE REF TO cl_sapuser_bcs,
       gv_recipient     TYPE REF TO if_recipient_bcs,
       gv_email         TYPE adr6-smtp_addr,
       gv_bcs_exception TYPE REF TO cx_bcs,
       gv_sent_to_all   TYPE os_boolean.
    * SELECTION-SCREEN                                                     *
    SELECT-OPTIONS:
       s_email          FOR adr6-smtp_addr NO INTERVALS MODIF ID sel.
    DATA:
       wa_email         LIKE LINE OF s_email.

  • How to convert digital array data into analog signal

    I want help converting a digital data array into an analog signal.

    shubham62 wrote:
    We are implementing a real-time audio transceiver. We have converted the input audio data (analog form) into digital data. At the receiver side we receive digital data in the form of a digital data array, but we are unable to recover our original input data. So please help us sort out the problem.
    Still no useful information. How was the data converted to digital in the first place? What conversion parameters did you use? As was said, you just need to do the math backwards to get it back into analog form.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • How to convert internal table data into excel format?

    Hi all,
    I want to convert internal table data into Excel format and then
    send it as an email attachment in a workflow.
    Please tell me how to do this.
    Regards,
    Arpita.

    Hi Arpita,
    Try this sample code:
    Send mail
    maildata-obj_name = 'TEST'.
    maildata-obj_descr = 'Test Subject'.
    loop at htmllines.
    mailtxt = htmllines.
    append mailtxt.
    endloop.
    mailrec-receiver = 'your receiver mail id'.
    mailrec-rec_type = 'U'.
    append mailrec.
    call function 'SO_NEW_DOCUMENT_SEND_API1'
    exporting
    document_data = maildata
    document_type = 'EXL'
    put_in_outbox = 'X'
    tables
    object_header = mailtxt
    object_content = mailtxt
    receivers = mailrec
    exceptions
    too_many_receivers = 1
    document_not_sent = 2
    document_type_not_exist = 3
    operation_no_authorization = 4
    parameter_error = 5
    x_error = 6
    enqueue_error = 7
    others = 8.
    if sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    endif.
    Hope it will help you.
    Regards,
    NIkita

  • How to insert 4K of XML data into XMLType column?

    I use OCCI and our Oracle version is 9.2.0.1.0. I have successfully been able to insert XML data (of any size) into a CLOB column using "insert into...values(.., empty_clob()) returning c1 into :3" and then using Statement::GetClob() to acquire a reference to the internal CLOB and populate it. I cannot seem to be able to do the same when the column is of type XMLType.
    I could not find a single code sample which demonstrates inserting into a table with an XMLType column. Using SetDataBuffer(OCCI_SQLT_STR) with data over 4000 bytes does not work.
    I'd greatly appreciate any feedback.

    Pretty sure this was a bug in the base 9.2 release which was fixed in a patch drop. Try 9.2.0.6 or later.
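
    If it helps, the exact release and patch level of an instance can be checked from SQL*Plus before and after applying the patch set:
    -- shows e.g. Oracle9i Enterprise Edition Release 9.2.0.x.0
    SELECT banner FROM v$version;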

  • Error inserting CLOB data into xmltype table on Solaris 8 Oracle 9.2.0.1.0

    Hi all,
    I have a table t of type xmltype.
    I have a function getData which parses an XML file and returns the CLOB data.
    I have a statement as
    "insert into t values(xmltype(getData('abc.xml')));"
    I get the following error
    ERROR at line 1:
    ORA-00600: internal error code, arguments: [17177], [0x0], [], [], [], [], [],
    ORA-31011: XML parsing failed
    ORA-06512: at "SYS.XMLTYPE", line 0
    ORA-06512: at line 1
    ORA-06512: at "ADAPT.AP_CREATE_INSP_LOAD", line 57
    ORA-06512: at line 1
    At line 57 I have the above mentioned "insert into..." statement.
    Can anybody tell me what the problem could be?
    Interestingly enough, the same thing works with the same Oracle version on Windows 2k, Windows 2k3, and another Solaris 8 machine.
    Please help ASAP as I am in fire-fighting mode.
    Thanks & Regards,
    Aniruddha Deshpande

    Hi Aniruddha
    I think you need to post to a db forum rather than XMLP.
    Tim
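
    Not from the original replies, but one way to narrow this down (assuming the getData function mentioned above) is to exercise the CLOB and the XMLType() constructor separately from the INSERT; if the constructor alone raises ORA-31011, the problem lies in the file contents or character set handling on that box rather than in the insert itself:
    -- 1) does the function return a sensible CLOB at all?
    SELECT DBMS_LOB.getlength(getData('abc.xml')) FROM dual;
    -- 2) does the XMLType constructor parse it outside of the INSERT?
    SELECT XMLType(getData('abc.xml')) FROM dual;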

  • Inserting XML data into xmltype column

    Oracle version: 10.1.0.5
    OpenVms Alpha V8.3
    1) I tried the following and got the error shown below. I also removed the charset and passed a zero instead; same error.
    INSERT INTO xml_demo (xml_data)   -- column of XMLType
    VALUES (
      xmltype(
        bfilename('XML_DIR', 'MOL.XML'),
        nls_charset_id('AL32UTF8')
      )
    );
    ORA-22993: specified input amount is greater than actual source amount
    ORA-06512: at "SYS.DBMS_LOB", line 637
    ORA-06512: at "SYS.XMLTYPE", line 283
    ORA-06512: at line 1
    2) This PL/SQL block works. However, the maximum RAW size is around 32K and the file can be around 100K. Maybe I could load it into a table of RAW and somehow concatenate it for the insert. I am not sure whether this is possible, but I am sure there must be a simpler way of doing this.
    A subset of the XML file is pasted below.
    set serveroutput on size 1000000
    DECLARE
      file1  BFILE;
      v_xml  XMLType;
      len1   NUMBER(6);
      v_rec1 RAW(32000);
    BEGIN
      file1 := bfilename('XML_DIR', 'MOL.XML');
      DBMS_LOB.fileopen(file1, DBMS_LOB.file_readonly);
      len1   := DBMS_LOB.getlength(file1);
      v_rec1 := DBMS_LOB.substr(file1, len1, 1);
      v_xml  := XMLType(UTL_RAW.cast_to_varchar2(v_rec1));
      INSERT INTO xml_demo (xml_data) VALUES (v_xml);
      COMMIT;
      DBMS_LOB.fileclose(file1);
    EXCEPTION
      WHEN OTHERS THEN
        dbms_output.put_line(sqlerrm);
        DBMS_LOB.fileclose(file1);
    END;
    <?xml version="1.0" encoding="UTF-8"?>
    <MolDocument DtdVersion="3" DtdRelease="0">
    <DocumentIdentification v="MOL_20100331_1500_1600"/>
    <DocumentVersion v="1"/>
    <DocumentType v="A43"/>
    <SenderIdentification codingScheme="A01" v="17X100Z100Z0001H"/>
    <SenderRole v="A35"/>
    <ReceiverIdentification codingScheme="A01" v="10XFR-RTE------Q"/>
    <ReceiverRole v="A04"/>
    <CreationDateTime v="2010-03-31T14:10:00Z"/>
    <ValidTimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Domain codingScheme="A01" v="10YDOM-1001A001A"/>
    <MolTimeSeries>
    <ContractIdentification v="RTE_20100331_1500_16"/>
    <ResourceProvider codingScheme="A01" v="10XFR-RTE------Q"/>
    <AcquiringArea codingScheme="A01" v="17Y100Z100Z00013"/>
    <ConnectingArea codingScheme="A01" v="10YFR-RTE------C"/>
    <AuctionIdentification v="AUCTION_20100331_1500_1600"/>
    <BusinessType v="A10"/>
    <BidTimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <MeasureUnitQuantity v="MAW"/>
    <Currency v="EUR"/>
    <MeasureUnitPrice v="MWH"/>
    <Direction v="A02"/>
    <MinimumActivationQuantity v="50"/>
    <Status v="A06"/>
    <Period>
    <TimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Resolution v="PT60M"/>
    <Interval>
    <Pos v="1"/>
    <Qty v="50"/>
    <EnergyPrice v="50.45"/>
    </Interval>
    </Period>
    </MolTimeSeries>
    </MolDocument>

    Marc,
    Thanks. I understand what you are saying. I have been copying files in binary mode from NT servers to VMS. I will have to get a proper XML file via FTP from the originating system to investigate further.
    I have one last item I need help with. If anything looks obvious, let me know:
    1) The XSD definition of Qty (type: QuantityType) and EnergyPrice (type: AmountType)
                   <xsd:element name="Qty" type="ecc:QuantityType">
                        <xsd:annotation>
                             <xsd:documentation/>
                        </xsd:annotation>
                   </xsd:element>
                   <xsd:element name="EnergyPrice" type="ecc:AmountType" minOccurs="0">
                        <xsd:annotation>
                             <xsd:documentation/>
                        </xsd:annotation>
                   </xsd:element>
    2) Definition of AmountType and QuantityType in the parent XSD
         <xsd:complexType name="AmountType">
              <xsd:annotation>
                   <xsd:documentation>
                        <Uid>ET0022</Uid>
                        <Definition>The monetary value of an object</Definition>
                   </xsd:documentation>
              </xsd:annotation>
              <xsd:attribute name="v" use="required">
                   <xsd:simpleType>
                        <xsd:restriction base="xsd:decimal">
                             <xsd:totalDigits value="17"/>
                        </xsd:restriction>
                   </xsd:simpleType>
              </xsd:attribute>
         </xsd:complexType>
         <!--_________________________________________________-->
         <xsd:complexType name="QuantityType">
              <xsd:annotation>
                   <xsd:documentation>
                        <Uid>ET0012</Uid>
                        <Definition>(Synonym "qty") The quantity of an energy product. Positive quantities shall not have a sign.</Definition>
                   </xsd:documentation>
              </xsd:annotation>
              <xsd:attribute name="v" type="xsd:decimal" use="required"/>
         </xsd:complexType>
         <!--_________________________________________________-->
    3) Data in the XML file
    <Period>
    <TimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Resolution v="PT60M"/>
    <Interval>
    <Pos v="1"/>
    <Qty v="50"/>
    <EnergyPrice v="50.45"/>
    </Interval>
    4) When I do the load, the EnergyPrice is saved in the XMLType column as <EnergyPrice v="50"/>, losing its decimal value of .45.
    5) When I select as follows:
    **DEV** SQL>> l
    1 SELECT
    2 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/EnergyPrice/@v') v1,
    3 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/EnergyPrice') v2,
    4 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/Qty') v3
    5 FROM balit_mol_xml x,
    6 TABLE(
    7 XMLSEQUENCE(
    8 EXTRACT(x.xml_payload, '/MolDocument/MolTimeSeries')
    9 )
    10 ) x2
    11* WHERE EXISTSNODE(x.xml_payload,'/MolDocument/DocumentIdentification[@v="MOL_20100331_1500_1600"]') = 1
    6) I get the result:
    50
    AmountType479_T(XDB$RAW_LIST_T('1301000000'), 50)
    QuantityType471_T(XDB$RAW_LIST_T('1301000000'), 50)
    7) The XDB$RAW_LIST_T('1301000000') in the output
    Does that tell you what I am doing wrong?
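
    Not from the original thread, but for the first problem here (ORA-22993 when constructing the XMLType straight from the BFILE), a common workaround is to load the whole file into a temporary CLOB with DBMS_LOB.LOADCLOBFROMFILE and build the XMLType from that CLOB; this also sidesteps the 32K RAW limit of the second approach. The directory, file and table names are the ones used in the post; treat this as a sketch.
    DECLARE
      l_bfile BFILE := BFILENAME('XML_DIR', 'MOL.XML');
      l_clob  CLOB;
      l_dest  INTEGER := 1;
      l_src   INTEGER := 1;
      l_lang  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
      l_warn  INTEGER;
    BEGIN
      DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
      DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
      -- load the entire BFILE, converting from AL32UTF8 to the database character set
      DBMS_LOB.LOADCLOBFROMFILE(
        dest_lob     => l_clob,
        src_bfile    => l_bfile,
        amount       => DBMS_LOB.LOBMAXSIZE,
        dest_offset  => l_dest,
        src_offset   => l_src,
        bfile_csid   => NLS_CHARSET_ID('AL32UTF8'),
        lang_context => l_lang,
        warning      => l_warn);
      DBMS_LOB.FILECLOSE(l_bfile);
      INSERT INTO xml_demo (xml_data) VALUES (XMLType(l_clob));
      COMMIT;
      DBMS_LOB.FREETEMPORARY(l_clob);
    END;
    /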

  • Dynpro - how to convert FLTP data into a nice numeric one?

    Hi,
    I have FLTP data entered in my dynpro via a search help, into a decimal field.
    I would like it to be in the format ___,_ (6 positions, 1 decimal).
    Thus I have declared it in the screen element list as decimal, length 6, with a mask such as ____,_.
    But the FLTP value is like '5,700000000000000E+00', and when selected it appears as ' 5,700'. So I am blocked by the following anomaly: "Input should be in the form ____,_".
    What should I do to correct this anomaly? Create or convert my data in an exit of the search help, or is there a simpler solution?
    Thanks
    Sabine

    my solution:
    DATA dynpro_data_from_help(22) TYPE c VALUE '4.500000000000000E+01'.
    DATA wl_fltp TYPE f.
    DATA wl_pd   TYPE p DECIMALS 1.
    REPLACE ',' WITH '.' INTO dynpro_data_from_help.
    wl_fltp = dynpro_data_from_help.
    wl_pd = wl_fltp.
    WRITE wl_pd TO dynpro_data_from_help.
    SHIFT dynpro_data_from_help RIGHT DELETING TRAILING '0'.

  • Convert synchronous serial data into a sequence of hex values

    Hi,
    I have a problem analysing captured data from a digital I/O board. I hope one of you can give me some advice/hints on this.
    Used software: LabVIEW Express 7.0
    Problem description:
    The captured digital data contains a clock signal on channel 0, a reset signal on channel 1, and a data signal on channel 3. The data signal is synchronized to the clock and consists of 8 serial data bits. My problem is how to convert this clock-synchronized serial data signal into a sequence (array) of bytes, each represented by 2 hex digits.
    More details on the problem environment:
    - The captured data comes from an ASCII file which contains data samples from a 16-bit digital I/O card. Every line contains one sample encoded as 4 hex digits.
    - The file is converted by a VI to a digital waveform and plotted in a digital waveform graph.
    One of my core problems on the way to a solution is how to "Trigger and Gate" the digital waveform based on the rising edge of channel 0.
    Thanks in advance for your help!
    Tryber.

    I checked with the 7.0 Express version we have loaded on an older computer, and the part that I think you can use was in that version. In the second loop I had made a note: Boolean Crossing PtByPt. This looks at a binary input and can output a true pulse (one iteration) based on the direction input condition. It has three options: False-True (rising), True-False (falling), or Either, which is the option I had used in that VI. This should allow you to detect a false-to-true state change on the clock channel. Now, you say that the clock pulse is not consistent, but what about the information pulses? That was the reason for the first two loops in my program: to ensure that the digital value was measured during one input pulse, to avoid turning two or more input pulses into one digital value.
    As far as your question about trigger functions between the two, I'm not too sure. I believe they offer the same options that I tend to use, so I haven't had a problem with either. On your block diagram menu select All Functions > Analyze > Point by Point > Other Functions PtByPt. There are some very handy tools in that menu.

  • How to convert exponential data into a number

    Hi,
    I have a table with a column which is a VARCHAR2(60). I have a CSV file in which I stored the value 77052512125510000, but it got converted into 7.71E16. And when I stored the same in the database, it got saved as 7.71E16. How can I get the original value back?
    Please help
    Regards
    Edited by: Sarma12 on May 25, 2012 3:11 AM

    Sarma12 wrote:
    The issue is, the data has already been copied into the database. Now while retrieving it from the database, I am getting that value in scientific notation and not the same as the data I have in the CSV file.
    Regards
    You need to be smarter than the tools you use:
    bcm@bcm-laptop:~$ sqlplus user1/user1
    SQL*Plus: Release 11.2.0.1.0 Production on Fri May 25 09:50:02 2012
    Copyright (c) 1982, 2009, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    09:50:03 SQL> @test
    09:50:07 SQL> drop table test;
    Table dropped.
    09:50:09 SQL> create table test (id number);
    Table created.
    09:50:10 SQL> insert into test values (77052512125510000);
    1 row created.
    09:50:10 SQL> select * from test;
         ID
    7.7053E+16
    09:50:10 SQL> column id format 999999999999999999999
    09:50:10 SQL> select * from test;
                  ID
         77052512125510000
    09:50:10 SQL>
    09:50:10 SQL>

  • Convert Non-Hierarchical data into Hierarchical data for Tree creation

    Hi guys,
    I've been trying to figure this out for about two entire days now; I'm still stuck and see no possible solution that doesn't involve creating a new table.
    The thing is that I want to create a tree to navigate through Projects and Tasks. I'm using the "Task Manager" demo app, which I have customized a bit to fit my needs.
    Basically I cannot create the tree because the relation between Projects and Tasks is not a hierarchical relation; it's a 1:N relation, like this:
    __Projects__
    ID (PK)
    PROJECT_NAME
    ___Tasks___
    ID (PK)
    PROJECT_ID (FK references Projects.ID)
    So what I need to do is "force" that 1:N relation into a hierarchical relation by creating a query (view) that joins the two tables into a single one and has 2 columns I can use as ID and PARENT_ID for the tree creation.
    This is the Data Model:
    CREATE TABLE "EBA_TASK_PROJECTS"
    (     "ID" NUMBER,
         "PROJECT_NAME" VARCHAR2(255),
         "CREATED_ON" DATE NOT NULL ENABLE,
         "CREATED_BY" VARCHAR2(255) NOT NULL ENABLE,
         "UPDATED_ON" DATE,
         "UPDATED_BY" VARCHAR2(255),
         "FLEX_01" VARCHAR2(4000),
         "FLEX_02" VARCHAR2(4000),
         "FLEX_03" VARCHAR2(4000),
         "FLEX_04" VARCHAR2(4000),
         "FLEX_05" VARCHAR2(4000),
         "PARENT_ID" NUMBER,
         "IS_ACTIVE" VARCHAR2(1),
         "DESCRIPTION" VARCHAR2(4000),
         CONSTRAINT "EBA_TASK_PROJECTS_ACTIVE_CC" CHECK (is_active in ('Y', 'N')) ENABLE,
         CONSTRAINT "EBA_TASK_PROJECTS_PK" PRIMARY KEY ("ID") ENABLE
    ) ;ALTER TABLE "EBA_TASK_PROJECTS" ADD CONSTRAINT "EBA_TASK_PROJECTS_FK" FOREIGN KEY ("PARENT_ID")
         REFERENCES "EBA_TASK_PROJECTS" ("ID") ON DELETE CASCADE ENABLE;
    CREATE TABLE "EBA_TASK_TASKS"
    (     "ID" NUMBER NOT NULL ENABLE,
         "PROJECT_ID" NUMBER NOT NULL ENABLE,
         "TASK_PRIORITY" VARCHAR2(400) NOT NULL ENABLE,
         "TASK_DIFFICULTY" VARCHAR2(400) NOT NULL ENABLE,
         "TASK_NAME" VARCHAR2(4000) NOT NULL ENABLE,
         "TASK_DETAILS" VARCHAR2(4000),
         "CREATED_ON" DATE NOT NULL ENABLE,
         "COMPLETED" DATE,
         "CREATED_BY" VARCHAR2(400) NOT NULL ENABLE,
         "STATUS" VARCHAR2(4000),
         "UPDATED_ON" DATE,
         "STARTED" DATE,
         "TASK_PREDEFINED_ID" NUMBER,
         "UPDATED_BY" VARCHAR2(255),
         "USER_NAME" VARCHAR2(255) NOT NULL ENABLE,
         "FLEX_01" VARCHAR2(4000),
         "FLEX_02" VARCHAR2(4000),
         "FLEX_03" VARCHAR2(4000),
         "FLEX_04" VARCHAR2(4000),
         "FLEX_05" VARCHAR2(4000),
         "PERCENTAGE" NUMBER,
         CONSTRAINT "EBA_TASK_TASKS_PK" PRIMARY KEY ("ID") ENABLE
    ) ;ALTER TABLE "EBA_TASK_TASKS" ADD CONSTRAINT "EBA_TASK_TASKS_PROJECTS_FK" FOREIGN KEY ("PROJECT_ID")
         REFERENCES "EBA_TASK_PROJECTS" ("ID") ON DELETE CASCADE ENABLE;
    I'm using APEX4.0
    That's pretty much it guys, hope you can help me with this. I'm really stuck on this and am about to give up.

    WOW Odie! You're awesome !! It worked like a charm, I created a view as you suggested:
    CREATE OR REPLACE FORCE VIEW "VIEW_TASKS_PROJECTS_TREE" ("ID", "PARENT_ID", "NODE_NAME") AS
    SELECT to_char(id) as id
    , null as parent_id
    , project_name as node_name
    FROM eba_task_projects
    UNION ALL
    SELECT to_char(id)
    , to_char(project_id)
    , task_name
    FROM eba_task_tasks;
    And then I created a new tree with the defaults and customized the Tree query a bit (just added the links really):
    select case when connect_by_isleaf = 1 then 0
    when level = 1 then 1
    else -1
    end as status,
    level,
    "NODE_NAME" as title,
    null as icon,
    "ID" as value,
    null as tooltip,
    'f?p=&APP_ID.:22:&SESSION.::NO::P22_ID,P22_PREV_PAGE:' || "ID" || ',3' as link
    from "#OWNER#"."VIEW_TASKS_PROJECTS_TREE"
    start with "PARENT_ID" is null
    connect by prior "ID" = "PARENT_ID"
    order siblings by "NODE_NAME"
    Thanks man, you saved me a lot of time and headaches :)

  • Data conversion while exporting data into flat files using the Export Wizard in SSIS

    Hi ,
    While exporting data to a flat file through the Export Wizard, the source table has NVARCHAR columns.
    Could you please help me with how to do the data conversion when using the Export Wizard?
    Thanks.

    Hi Avs sai,
    By default, the columns in the destination flat file will be non-Unicode columns, i.e. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input columns when outputting to the destination file, you can check the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings…" button to check the data types.
    Regards,
    Mike Yin
    TechNet Community Support
