Insert Large Binary Values

Hi all,
I want to insert a very large binary value of about 40,000 characters into a database table. I have tried using the CLOB and BLOB datatypes, but all my efforts ended in vain.
Can anyone help me with this?
Thanks in advance.

You should use the dbms_lob package.
RTM first, then ask.
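
If the value comes from a Java client, an alternative to building the statement text by hand is to bind the data as a stream, so the 40,000 bytes never appear as a string literal; dbms_lob is then mainly needed for server-side manipulation. A minimal JDBC sketch, assuming a table big_data_tab with a NUMBER id and a BLOB payload column (names and connection details are placeholders):

    import java.io.ByteArrayInputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BlobInsert {
        public static void main(String[] args) throws Exception {
            byte[] payload = new byte[40000];   // the large binary value to store
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//localhost:1521/ORCL", "scott", "tiger");
                 PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO big_data_tab (id, payload) VALUES (?, ?)")) {
                ps.setInt(1, 1);
                // Stream the bytes into the BLOB column instead of inlining them in the SQL text
                ps.setBinaryStream(2, new ByteArrayInputStream(payload), payload.length);
                ps.executeUpdate();
            }
        }
    }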

Similar Messages

  • How to insert large xml file to XMLType column?

    Hi,
    I have a table with one column of XMLType (binary XML storage option and free-text indexing). When I try to insert a large XML of about 8 KB, I get the error ORA-01704: string literal too long.
    Insert into TEST values(XMLTYPE('xml HERE'));
    How can I insert large XML values into an XMLType column?
    Regards,
    Sprightee

    For a large XML file, you basically have two options - you can load the string directly as an XMLType, or you can load the string as a CLOB and cast it on the database side to an XMLType.
    If you decide to load the XML as XmlType client-side, then you may be interested to know that versions of Oracle after 11.2.0.2 support the JDBC 4.0 SQLXML standard. See the JDBC driver release documentation here:
    http://docs.oracle.com/cd/E18283_01/java.112/e16548/jdbcvers.htm#BABGHBCC
    If you want to load as a CLOB, then you'll need to use PreparedStatement's setClob() method, or allocate an oracle.sql.CLOB object.
    For versions before 11.2.0.2, you can create an XMLType with a constructor that includes an InputStream or byte[] array.
    HTH
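
    To make the client-side option above concrete, here is a rough JDBC 4.0 sketch using the SQLXML standard mentioned in the reply, assuming the TEST table from the question and an 11.2.0.2+ driver (the file name is only an example):
    import java.io.FileReader;
    import java.io.Reader;
    import java.io.Writer;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLXML;

    public class XmlTypeInsert {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//localhost:1521/ORCL", "user", "pass")) {
                // Build a SQLXML object and stream the large document into it
                SQLXML xml = conn.createSQLXML();
                try (Reader in = new FileReader("large.xml");
                     Writer out = xml.setCharacterStream()) {
                    char[] buf = new char[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                }
                try (PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO TEST VALUES (?)")) {
                    ps.setSQLXML(1, xml);   // bound as XMLType, no string literal involved
                    ps.executeUpdate();
                }
                xml.free();
            }
        }
    }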

  • Operations on very large binary numbers

    Hi guys,
    I'm trying to write a Java class for manipulating very large binary numbers.
    I'm representing the number internally as a boolean[] (an array of booleans: false for 0, true for 1).
    I want to write an algorithm for the following operations:
    shiftLeft(boolean[] b, int n): shifting the binary number left by n positions, preferably a circular shift,
    and also the corresponding shiftRight() method.
    Can someone guide me on how to implement this?
    Thanks.

    Consider an array of ascii chars....
    array =>   | |a|b|c|d|
    offset = 1
    len = 4
    In the above there are actually five spots in the array, but the offset points to the second position in the array and the length is 4.
    If I was to extract the value it would be "abcd" because of the offset and the length.
    Now a 'shift right' means that if it is "abcd" then it should now be "abc" (because the d fell off the end.)
    I can do that like this.
    array =>   | |a|b|c|d|
    offset = 1
    len = 3
    Notice in the above that nothing changed except the length. But because the length changed, if I extracted the value I would get "abc", because the offset is 1 and the length is three.
    Notice also that there was no array copying.

    Are you sure shifting works like that? What I know about left shifting is that the 'd' should be moved left, together with a, b and c, and a 0 appended in place of 'd'. From what you said the 'd' would be gone, or am I again wrong?
    It seems I still didn't get any satisfactory answer to this problem...
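
    For the record, a circular shift on a boolean[] can also be done directly with modular index arithmetic, with no offset/length bookkeeping. A small sketch (method names follow the question; index 0 is treated as the leftmost bit):
    public class BinaryShifter {

        // Circular left shift: every bit moves n positions toward index 0,
        // and the bits that fall off the front re-enter at the end.
        public static boolean[] shiftLeft(boolean[] b, int n) {
            int len = b.length;
            boolean[] result = new boolean[len];
            for (int i = 0; i < len; i++) {
                result[i] = b[(i + n) % len];
            }
            return result;
        }

        // A circular right shift by n is a circular left shift by (len - n).
        public static boolean[] shiftRight(boolean[] b, int n) {
            return shiftLeft(b, b.length - (n % b.length));
        }

        public static void main(String[] args) {
            boolean[] bits = {true, false, true, true};   // 1011
            boolean[] shifted = shiftLeft(bits, 1);       // 0111 (circular)
            for (boolean bit : shifted) System.out.print(bit ? 1 : 0);
            System.out.println();
        }
    }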

  • Binary Values unreadable - Database corrupt?

    Hi,
    I'm experiencing a really weird problem here, I have a table with some varchar columns and two binary columns. What I do now, is to select data from it via Python with the following select statement:
    select * from torder where customerid=1624 order by torderid desc
    What happens is that only for this customerid some binary data is missing: it comes back as an empty string. If I do the select like this:
    select * from torder where customerid=1624 order by torderid
    It does work, the data is there. With this, it works too:
    select * from torder order by torderid desc
    So it seems the combination of "customerid=1624" and "desc" somehow corrupts the results.
    What's also interesting is that when I delete the last inserted row of the result, it works, and if I add a new one, it's broken again.
    I could not check whether the error is Python-related or not, as I found no way to retrieve/display the binary data in sqlcli; it displays only something like 0x8002637A6F70652E6931, but I doubt that this has anything to do with Python.
    Any clues of how to fix this?
    My database version is Kernel 7.6.06 Build 003-121-202-135 | X32/LINUX 7.6.06 Build 003-121-202-135
    Previously, I had 7.6.03 and updated it to the above version, hoping that this would fix the problem, but it did not.
    Best Regards,
    Hermann Himmelbauer

    Ok, first, many thanks for your quick reply. I did not answer at first because the problem magically went away, but unfortunately it is back today.
    The DDL statement looks like the following; there are no indexes on the table:
    CREATE TABLE torder (
            torderid INTEGER NOT NULL DEFAULT SERIAL,
            creation_date TIMESTAMP,
            transfer_date TIMESTAMP,
            signed_date TIMESTAMP,
            signed_with VARCHAR(3) CHECK (signed_with in ('TAN', 'BKU', 'MAN') OR signed_with IS NULL),
            revocation_date TIMESTAMP,
            done_date TIMESTAMP,
            formdata LONG BYTE NOT NULL,
            trans_function VARCHAR(30) NOT NULL CHECK (trans_function in ('do_bank_transfer', 'do_bank_collection',
                                    'do_cheque_transfer', 'do_cash', 'do_cashdraw')),
            trans_type VARCHAR(30) NOT NULL CHECK (trans_type in ('national','sepa','international')),
            applet_location VARCHAR(35),
            errors_text VARCHAR(100),
            errors LONG BYTE,
            customerid INTEGER,
            dbuserid INTEGER,
            torder_periodicid INTEGER,
            PRIMARY KEY (torderid),
             FOREIGN KEY(customerid) REFERENCES kunde (kundeid),
             FOREIGN KEY(dbuserid) REFERENCES dbuser (dbuserid),
             FOREIGN KEY(torder_periodicid) REFERENCES torder_periodic (torder_periodicid)
    )
    The insert statements look like the following (copied out of the SQLAlchemy SQL log); one can see that the binary values are inserted here:
    INFO:sqlalchemy.engine.base.Engine.0x...d110:INSERT INTO torder (creation_date, transfer_date, signed_date, si
    gned_with, revocation_date, done_date, formdata, trans_function, trans_type, applet_location, errors_text, err
    ors, customerid, dbuserid, torder_periodicid) VALUES (now(), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    INFO:sqlalchemy.engine.base.Engine.0x...d110:['20091027113506345604', None, None, None, None, '\x80\x02}q\x01(
    U\x0bcharges_forq\x02U\x04bothq\x03U\x05dcodeq\x04U\x0588888q\x05U\x0ccurrency_isoq\x06U\x03EURq\x07U\x06valut
    aq\x08K\x00U\x11foreign_acc_name1q\tU\x11Foreign Account 2q\nU\x0cacc_payer_noq\x0bX\t\x00\x00\x00999111110q\x
    0cU\ttextlinesq\r]q\x0e(U\x0c1st transferq\x0fU\tMultilineq\x10U\x0bFor testingq\x11eU\x06amountq\x12cdecimal\
    nDecimal\nq\x13U\x071000.00\x85Rq\x14U\x0cacc_benef_noq\x15X\t\x00\x00\x00888111110q\x16U\x11foreign_acc_name2
    q\x17U\x0bfor Testingq\x18u.', 'do_bank_transfer', 'national', 'url:test_transfer', None, None, None, None, No
    ne]
    All rows are inserted with binary data, so it's never NULL and it's also never set to an empty string ('').
    But when reading back the rows, some binary values are an empty string.
    The following Python code illustrates the problem:
    import sapdb.dbapi

    def true_false_result(result):
        if result:
            return 'Bug begins'
        else:
            return 'Bug ends'

    def check_bincols(bdb, sqlcmd):
        print "----"
        print "QUERY: %s" % sqlcmd
        print "----"
        bdbc = bdb.cursor()
        bdbe = bdbc.execute(sqlcmd)
        bug_occured = False
        stored_bin_result = False
        while 1:
            row = bdbe.fetchone()
            if row is None:
                break
            # Now check if the binary data results in an empty string.
            # This should never happen (= the database bug).
            bin_result = (row[7]() == '')
            if bin_result != stored_bin_result:
                bug_occured = True
                print "Toggle to %s at torderid %s" % (
                    true_false_result(bin_result),
                    row[0])
                stored_bin_result = bin_result
        if not bug_occured:
            print "No Bug for this query"
        bdbe.close()
        bdbc.close()
        #bdb.close()

    if __name__ == '__main__':
        bdb = sapdb.dbapi.connect('USER', 'PASS', 'DBNAME', 'LOCALHOST')
        # First try the original query, which results in a bug
        sqlcmd = 'select * from torder where customerid=1624 order by torderid desc'
        check_bincols(bdb, sqlcmd)
        # This query normally is bug-free
        sqlcmd = 'select * from torder where customerid=1624 order by torderid'
        check_bincols(bdb, sqlcmd)
        # This query has a bug, too
        sqlcmd = 'select * from torder order by torderid desc'
        check_bincols(bdb, sqlcmd)
        # But this one not
        sqlcmd = 'select * from torder order by torderid'
        check_bincols(bdb, sqlcmd)
        # This triggers the bug, too, which is curious as there's practically no ordering
        sqlcmd = 'select * from torder where customerid=1624 order by customerid desc'
        check_bincols(bdb, sqlcmd)
        # So, it seems that the bug occurs only when using "DESC" for descending ordering.
    The output of this program is:
    QUERY: select * from torder where customerid=1624 order by torderid desc
    Toggle to Bug begins at torderid 1355
    Toggle to Bug ends at torderid 582
    QUERY: select * from torder where customerid=1624 order by torderid
    No Bug for this query
    QUERY: select * from torder order by torderid desc
    No Bug for this query
    QUERY: select * from torder order by torderid
    No Bug for this query
    QUERY: select * from torder where customerid=1624 order by customerid desc
    Toggle to Bug begins at torderid 1355
    Toggle to Bug ends at torderid 582
    So, it can be seen that the problem occurs ONLY when using descending ordering. Moreover, it's interesting that the result for "select * from torder order by torderid desc" is sometimes buggy and sometimes not, which seems to depend on whether more rows have been inserted in the meantime. What's furthermore interesting is the last query, where the ordering is applied to "customerid", which is the very same for every row, so there is effectively no ordering, and the bug occurs here too.
    All this happens on my production instance. I have a testing environment where I imported the very same data set (same database version etc.) and there is no such problem (for now), so it's quite complicated to nail down the problem further, as I cannot easily disrupt the availability of the production instance.
    All I could do for now is avoid "DESC" and reorder the data in my application, but that is really suboptimal as I have to keep all results in memory.
    Any help is really appreciated!
    Best Regards,
    Hermann Himmelbauer
    Update: I tried the same query with the following C++ / SQLDBC code (I modified one of the SQLDBC examples):
    // First you have to include SQLDBC.h
    #include "SQLDBC.h"
    #include <stdio.h>

    typedef struct ConnectArgsT {
        char * username;
        char * password;
        char * dbname;
        char * host;
    } ConnectArgsT;

    static void parseArgs (ConnectArgsT * connectArgs, int argc, char **argv);

    using namespace SQLDBC;

    // Let's start the program with a main function
    int main(int argc, char *argv[])
    {
       ConnectArgsT connectArgs;
       parseArgs (&connectArgs, argc, argv);
       char errorText[200];
       // Every application has to initialize the SQLDBC library by getting a
       // reference to the ClientRuntime and calling the SQLDBC_Environment constructor.
       SQLDBC_IRuntime *runtime;
       runtime = SQLDBC::GetClientRuntime(errorText, sizeof(errorText));
       if (!runtime) {
         fprintf(stderr, "Getting instance of the ClientRuntime failed %s", errorText);
         return (1);
       }
       SQLDBC_Environment env(runtime);
       // Create a new connection object and open a session to the database.
       SQLDBC_Connection *conn = env.createConnection();
       SQLDBC_Retcode rc;
       rc = conn->connect(connectArgs.host, connectArgs.dbname,
                          connectArgs.username, connectArgs.password);
       if(SQLDBC_OK != rc) {
         fprintf(stderr, "Connecting to the database failed %s", conn->error().getErrorText());
         return (1);
       }
       printf("Successfully connected to %s as user %s\n",
              connectArgs.dbname, connectArgs.username);
       // Create a new statement object and execute it.
       SQLDBC_Statement *stmt = conn->createStatement();
       rc = stmt->execute("select * from torder where customerid=1624 order by torderid desc");
       if(SQLDBC_OK != rc) {
         fprintf(stderr, "Execution failed %s", stmt->error().getErrorText());
         return (1);
       }
       // Check if the SQL command returns a result set and get a result set object.
       SQLDBC_ResultSet *result;
       result = stmt->getResultSet();
       if(!result) {
         fprintf(stderr, "SQL command doesn't return a result set %s", stmt->error().getErrorText());
         return (1);
       }
       // Position the cursor within the result set by doing a fetch next call.
       while (1) {
         rc = result->next();
         if(SQLDBC_OK != rc) {
           break;
           //fprintf(stderr, "Error fetching data %s", stmt->error().getErrorText());
           //return (1);
         }
         char szString[30];
         char szString1[3000];
         SQLDBC_Length ind;
         // Get a string value from the column.
         rc = result->getObject(1, SQLDBC_HOSTTYPE_ASCII, szString, &ind, sizeof(szString));
         if(SQLDBC_OK != rc) {
           fprintf(stderr, "Error getObject %s", stmt->error().getErrorText());
           return (1);
         }
         rc = result->getObject(8, SQLDBC_HOSTTYPE_ASCII, szString1, &ind, sizeof(szString1));
         if(SQLDBC_OK != rc) {
           fprintf(stderr, "Error getObject %s", stmt->error().getErrorText());
           return (1);
         }
         printf("%s %s\n", szString, szString1);
       }
       // Finish the program with a return code.
       return 0;
    }

    static void parseArgs (ConnectArgsT * connectArgs, int argc, char **argv)
    {
        // setting defaults for the demo database
        connectArgs->username = (char*)"USER";
        connectArgs->password = (char*)"PASS";
        connectArgs->dbname = (char*)"MYDB";
        connectArgs->host = (char*)"localhost";
        // use values from the command line
        if (argc > 4) {
            connectArgs->host = argv [4];
        }
        if (argc > 3) {
            connectArgs->dbname = argv [3];
        }
        if (argc > 2) {
            connectArgs->password = argv [2];
        }
        if (argc > 1) {
            connectArgs->username = argv [1];
        }
    }
    This works!! So it seems that the problem is related to the Python module, which is interesting, as adding "DESC" should not make any difference to it. I personally suspect that there are some memory leaks in the code which result in this strange behavior.
    Any suggestions?
    Best Regards,
    Hermann Himmelbauer

  • Read large binary file

    How would you read a large binary file into memory, please? I thought about creating a byte array on the fly; however, you cannot create a byte array with a long, so what happens when you reach the maximum value an integer can store?

    a) You can map the file, instead of reading it physically.
    b) Let's suppose that you are running Sun JVM in Windows 2000/XP/2003 and you have 4-GB of RAM in your machine (memory is so cheap nowadays...)
    - Windows can not use the full 4GB (it reserves 2GB for itself, and if you buy a special server version, it reserves only 1GB for itself.)
    - Java can't use more than 1.6 GB in Windows due to some arcane reasons. (Someone posted the Bug Database explanation in this forum.)
    So you have an upper limit of 1.6 GB.
    1.6 GB is smaller than the maximum value of an int, so you could have an array that big (using the class java.nio.ByteBuffer and the like). Try and see.
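
    To illustrate option a) (mapping the file instead of physically reading it) in Java, a short java.nio sketch; the file name is just an example. A single MappedByteBuffer is limited to Integer.MAX_VALUE bytes, so a bigger file is mapped in windows:
    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class MapBigFile {
        public static void main(String[] args) throws Exception {
            try (RandomAccessFile raf = new RandomAccessFile("big.bin", "r");
                 FileChannel ch = raf.getChannel()) {
                long size = ch.size();
                long pos = 0;
                while (pos < size) {
                    // map at most Integer.MAX_VALUE bytes per window
                    long window = Math.min(size - pos, Integer.MAX_VALUE);
                    MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, pos, window);
                    // ... process buf.get(...) here ...
                    pos += window;
                }
            }
        }
    }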

  • Any info on CRC, checksum, or other file integity VIs for large binary files?

    Working on sending rather large binary files (U16 stream to file) via the internet. Would like to check file integrity via CRC or a comparable checksum. Would appreciate any comments/suggestions.

    Hi Brian,
    You said;
    "Would appreciate any comments/suggestions".
    You did not mention what transport mechanism you plan on using.
    As I understand it, ALL of the standard mechanisms use a CRC of some form to ensure the validity of the packet BEFORE it is ever passed up the OSI 7-layer model.
    TCP/IP-based protocols will see to it that all of the segments of a transfer are completed and in order.
    UDP, on the other hand, is a broadcast-type protocol and does not ensure any packets are received.
    So, at the very worst you should be able to handle your "sanity checks" by simply using a sequence value that is included in your outgoing message. The receiver should just have to check whether the current sequence value is equal to the previous one plus 1.
    I co-developed an app that utilized this technique to transfer status messages from an RT platform to a Windows machine. The status messages in this app were considered FYI, so the sequence counter served as a way of determining if anything was missed.
    I am interested in others' thoughts on this subject.
    Ben
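
    If it helps to prototype the file-level check itself outside LabVIEW, a CRC-32 over an arbitrarily large binary file is only a few lines in, for example, Java (a sketch, not a VI; the file name is made up):
    import java.io.FileInputStream;
    import java.util.zip.CRC32;
    import java.util.zip.CheckedInputStream;

    public class FileCrc {
        public static void main(String[] args) throws Exception {
            try (CheckedInputStream in = new CheckedInputStream(
                     new FileInputStream("capture.bin"), new CRC32())) {
                byte[] buf = new byte[64 * 1024];
                // reading through the CheckedInputStream updates the CRC as a side effect
                while (in.read(buf) != -1) {
                    // nothing to do per chunk; the checksum accumulates
                }
                System.out.printf("CRC-32: %08X%n", in.getChecksum().getValue());
            }
        }
    }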

  • Inserting large xml data into xmltype

    Hi all,
    In my project I need to insert very large XML data into xmltype column.
    My table:
    CREATE TABLE TransDetailstblCLOB ( id number, data_xml XMLType) XmlType data_xml STORE AS CLOB;
    I am using a JDBC approach to insert values. It works fine for data less than 4000 bytes when using preparedStatement.setString(1, xmlData). As I have to insert large XML data (>4000 bytes), I am now using the preparedStatement.setClob() methods.
    My code works fine for a table which has the column declared as CLOB explicitly. But for TransDetailstblCLOB, where the column is declared as XMLTYPE with storage option CLOB, I am getting the error: "ORA-01461: can bind a LONG value only for insert into a LONG column".
    This error means that there is a mismatch between my setClob() and the column, which means I am not storing into a CLOB column.
    I read on the Oracle site that:
    "When you create an XMLType column without any XML schema specification, a hidden CLOB column is automatically created to store the XML data. The XMLType column itself becomes a virtual column over this hidden CLOB column. It is not possible to directly access the CLOB column; however, you can set the storage characteristics for the column using the XMLType storage clause."
    I don't understand: it's stated here that it is a hidden CLOB column, so why can I not use setClob()? It worked fine for a pure CLOB column (another table), so why is it giving such an error for the XMLTYPE table?
    I have been stuck on this for 3 days. Can anyone help me please?
    My code snippet:
    query = "INSERT INTO po_xml_tab VALUES (?,XMLType(?)) ";
              //query = "INSERT INTO test VALUES (?,?) ";
         // Get the statement Object
         pstmt =(OraclePreparedStatement) conn.prepareStatement(query);
         // pstmt = conn.prepareStatement(query);
         //xmlData="test";
    //      If the temporary CLOB has not yet been created, create new
         temporaryClob = oracle.sql.CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
         // Open the temporary CLOB in readwrite mode to enable writing
         temporaryClob.open(CLOB.MODE_READWRITE);
         log.debug("tempClob opened"+"size bef writing data"+"length "+temporaryClob.getLength()+
                   "buffer size "+temporaryClob.getBufferSize()+"chunk size "+temporaryClob.getChunkSize());
         OutputStream out = temporaryClob.getAsciiOutputStream();
         InputStream in = new StringBufferInputStream(xmlData);
    int length = -1;
    int wrote = 0;
    int chunkSize = temporaryClob.getChunkSize();
    chunkSize=xmlData.length();
    byte[] buf = new byte[chunkSize];
    while ((length = in.read(buf)) != -1) {
    out.write(buf, 0, length);
    wrote += length;
    temporaryClob.setBytes(buf);
    log.debug("Wrote lenght"+wrote);
         // Bind this CLOB with the prepared Statement
         pstmt.setInt(1,100);
         pstmt.setStringForClob(2, xmlData);
         int i =pstmt.executeUpdate();
         if (i == 1) {
         log.debug("Record Successfully inserted!");
         }

    Try this; it works in ADOdb:
    declare poXML CLOB;
    BEGIN
    poXML := '<OIDS><OID>large text</OID></OIDS>';
    UPDATE a_po_xml_tab set podoc=XMLType(poXML) WHERE poid = 102;
    END;
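
    On the JDBC side, one variant of the CLOB route discussed above is to create a java.sql.Clob on the connection and bind it into the XMLType() constructor; a sketch assuming a recent Oracle JDBC driver and the table from the posted code:
    import java.sql.Clob;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class XmlClobInsert {
        public static void main(String[] args) throws Exception {
            String xmlData = "<OIDS><OID>large text ...</OID></OIDS>";
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//localhost:1521/ORCL", "user", "pass")) {
                // Put the large XML string into a Clob first, then let the
                // database convert it with the XMLType() constructor.
                Clob clob = conn.createClob();
                clob.setString(1, xmlData);
                try (PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO TransDetailstblCLOB (id, data_xml) VALUES (?, XMLType(?))")) {
                    ps.setInt(1, 100);
                    ps.setClob(2, clob);
                    ps.executeUpdate();
                }
                clob.free();
            }
        }
    }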

  • Reading large binary files into an array for parsing

    I have a large binary log file consisting of binary data separated by header flags scattered non-uniformly throughout the data. The file size is about 50 MB. When I read the file into an array, I get the LabVIEW "Memory full" error. The design is to read the file in and then parse it for the flags to determine where to separate the data blocks in the byte stream.
    There are a few examples that I have read on this site but none seem to give a straight answer for such a simple matter.   Does anyone have an example of how I should approach this?

    I agree with Gerd.  If you are working with binaries, why not use U8 instead of doubles.
    If the file is indeed 50MB, then the array should be expecting 52428800 elements, not 50000000.  So if you read the file in a loop and populate an element at a time, you could run out of memory fast because any additional element insertion above 50000000 may require additional memory allocation of the size above 50000000 (potentially for each iteration).  This is just speculation since I don't see the portion of your code that populates the array.
    Question:  Why do you need an array?  What do you do with the data after you read it?  I agree with Altenbach, 50MB is not that big, so working with a file of such a size should not be a problem.
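
    Written out in Java just to illustrate the chunked idea (the flag byte and file name are made up): read a fixed-size buffer, scan it for the header flag, and record block offsets instead of keeping the whole 50 MB in one array:
    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.InputStream;

    public class FlagScanner {
        private static final int FLAG = 0x7E;   // hypothetical header flag byte

        public static void main(String[] args) throws Exception {
            long offset = 0;
            try (InputStream in = new BufferedInputStream(new FileInputStream("capture.bin"))) {
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = in.read(buf)) != -1) {
                    for (int i = 0; i < n; i++) {
                        if ((buf[i] & 0xFF) == FLAG) {
                            // remember where each data block starts instead of copying the bytes
                            System.out.println("Header flag at byte " + (offset + i));
                        }
                    }
                    offset += n;
                }
            }
        }
    }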

  • Assigning a hex value to a variable and getting binary value of a variable.

    I am developing Java programs and I need to convert Unicode to EBCDIC and vice versa.
    How can I assign hex values to variables to build a UTF-EBCDIC / EBCDIC-UTF table, and how do I get the hex or binary value of my data to compare it to the values in the table?
    I did a conversion like this with PL/1 before, but I do not know how to do it in Java, because I am new to Java.
    Thank you in advance.

    I will run the Java code on a mainframe; Java uses Unicode for data by default and the mainframe environment is EBCDIC, so I have to translate the data from Unicode to EBCDIC.

    I said I think String supports EBCDIC encoding...
    String ebcdic = new String(ebcdicBytes, "Cp500");
    http://java.sun.com/j2se/1.4.2/docs/guide/intl/encoding.doc.html
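
    A small round trip of the Cp500 suggestion (assuming the JVM ships the Cp500/IBM500 charset listed in the linked encoding table):
    public class EbcdicRoundTrip {
        public static void main(String[] args) throws Exception {
            String text = "HELLO 123";
            byte[] ebcdicBytes = text.getBytes("Cp500");      // Unicode String -> EBCDIC bytes
            String back = new String(ebcdicBytes, "Cp500");   // EBCDIC bytes -> Unicode String
            for (byte b : ebcdicBytes) {
                System.out.printf("%02X ", b);                // hex value of each EBCDIC byte
            }
            System.out.println("-> " + back);
        }
    }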

  • Need to convert a binary value into decimal

    Hi, I need to convert a binary value, which I will be getting as a String object, into a decimal value... I need code for the same.

    Check Integer.parseInt
    http://java.sun.com/j2se/1.5.0/docs/api/java/lang/Integer.html#parseInt(java.lang.String,%20int)
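
    Integer.parseInt covers binary strings up to 31 bits; for longer ones, java.math.BigInteger also takes a radix:
    import java.math.BigInteger;

    public class BinToDec {
        public static void main(String[] args) {
            System.out.println(Integer.parseInt("101101", 2));   // 45

            // too long for an int, so use BigInteger with radix 2
            String bigBinary = "110101011111000011110000111100001111";
            System.out.println(new BigInteger(bigBinary, 2));
        }
    }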

  • How to insert check box value in table?

    Hi all
    Kindly help me with how to insert a check box value into the database. What code do I have to use? I am new to programming.
    Thanks in advance.

    Hi,
    There is no "Check box" in a table, a check box is a GUI (Graphical user interface) item.
    What you want is to store a boolean value in a table. For that you can use the varchar2(1) datatype and store Y or N. (or anything else)
    (you cannot define boolean as a datatype for a column).
    If you're using a front-end application like Apex then it might be useful for you to read the documentation about check boxes:
    http://download.oracle.com/docs/cd/E10513_01/doc/appdev.310/e10497/check_box.htm#CHDDFBFH
    (for the rest if it's Oracle Forms then everything is already said).
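
    If the front end is plain JDBC rather than Apex or Forms, the mapping is a single ternary when binding the value (table and column names are made up):
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class CheckboxInsert {
        public static void main(String[] args) throws Exception {
            boolean newsletterChecked = true;   // value read from the GUI check box
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//localhost:1521/ORCL", "user", "pass");
                 PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO customers (id, newsletter_flag) VALUES (?, ?)")) {
                ps.setInt(1, 1);
                // store the boolean as 'Y'/'N' in the VARCHAR2(1) column, as suggested above
                ps.setString(2, newsletterChecked ? "Y" : "N");
                ps.executeUpdate();
            }
        }
    }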

  • How to insert a blank value in not nul column using transform activity

    Can anyone help me with how to insert blank values into a NOT NULL column using a transform activity, or however else it may be possible? This is a requirement in my project.

    Vidya,
    In the DB adapter's or-mappings.xml, did you make any changes? If not, then open that file in any text editor and change the following:
    <attribute-mapping xsi:type="direct-mapping">
    <attribute-name>director</attribute-name>
    <field table="MYTABLE" name="MAKE_IT_BLANK_NOT_NULL" xsi:type="column"/>
    <attribute-classification>java.lang.String</attribute-classification>
    </attribute-mapping>
    You can try to add this:
    <attribute-mapping xsi:type="direct-mapping">
    <attribute-name>director</attribute-name>
    <field table="MYTABLE" name="MAKE_IT_BLANK_NOT_NULL" xsi:type="column"/>
    <null-value></null-value>
    <attribute-classification>java.lang.String</attribute-classification>
    </attribute-mapping>
    Refer to the link below for details:
    Re: Insertion of Blank value to a Not Null varchar column in SQL server table
    Thanks
    AJ

  • How to insert a default value into MS server in java - help please

    Hi,
    Suppose I have a table and one of the columns had a default value set when the table was created. How can I insert the default value into this column, assuming that the column is the second column and I can't specify the column names when inserting? Thanks.

    Thanks for your response. So if I have an insert statement as follows:
    insert into someTable values (1,'val1', 'val2', 'val3', 'val4')
    and column 3 has a default value defined,
    then, to insert the default value, my insert statement would look like this:
    insert into someTable values (1,'val1', 'val3', 'val4')
    but the number of values will not match the number of columns.
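
    The thread stops here, so for completeness: the usual options are to name the columns explicitly and leave the defaulted one out, or to keep the positional form and put the DEFAULT keyword in the value list. A sketch with made-up column names:
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class DefaultInsert {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:sqlserver://localhost;databaseName=test", "user", "pass");
                 Statement st = conn.createStatement()) {
                // Option 1: name the columns and skip the one with the default
                st.executeUpdate(
                    "INSERT INTO someTable (id, col1, col3, col4) VALUES (1, 'val1', 'val3', 'val4')");
                // Option 2: keep the positional form and use the DEFAULT keyword
                st.executeUpdate(
                    "INSERT INTO someTable VALUES (2, 'val1', DEFAULT, 'val3', 'val4')");
            }
        }
    }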

  • Assigning a binary value to an integer

    Hi, I am trying to assign a binary value in my program to an integer. In some assembly languages you would do the following:
    mov a, 00010101b
    for binary and for hex:
    mov a, 0E2h
    In java, you can assign a hex value with:
    int a=0x3F;
    What should I do to assign a binary value? Is it possible?
    Regards... Martin Schmiedel

    The static method parseInt in the java.lang.Integer class can be used to convert a String representing a binary number into its decimal integer value. In addition to the string representing the number, you also pass in an integer representing the radix, which for binary would be 2, hex would be 16, etc. So for example:
    int x = Integer.parseInt("00010110", 2);
    Is that what you're trying to do? Cheers,
    Chris
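
    As a side note, Java 7 and later also accept binary literals directly, which matches the assembly-style assignment in the question:
    public class BinaryLiteral {
        public static void main(String[] args) {
            int a = 0b00010101;   // binary literal (Java 7+), decimal 21
            int b = 0xE2;         // hex literal, as in the question
            System.out.println(a + " " + b);
        }
    }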

  • Enable the field in the list display and insert the new value and  save it.

    Hi
    In a report, when I am in the third list using ALV, a field which is disabled should be enabled, and I have to insert a new value into it and save.
    Please tell me how to do this using classes and methods, and also using ALVs.
    I promise to reward points.
    Regards
    Mac

    Hi madan,
    Please find the code sample,
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/79b5e890-0201-0010-7f8e-b7c207edf7c2
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/ec31e990-0201-0010-f4b6-c02d876ce033
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/117beb90-0201-0010-67a8-fff1482209ae
    Regards
    Kathirvel
