Using the event raised when data is logged into Citadel

Hi all,
I have an HMI application in LabVIEW and DSC 8 that reads my OPC items and logs data, alarms and events into Citadel. Since Citadel is event-driven, I want to use that event to detect a change in the database and update the Historical Alarm & Event List on the HMI front panel, instead of constantly polling the database.
I really appreciate any help.

Can you post more information on what you are doing? I haven't done much with Citadel, but I thought it was aimed at logging data as it changes, i.e. everything logged in there is already a change, otherwise it wouldn't have been logged in the first place.
There are block diagram functions that allow you to read the traces back and this is all you need. Check out the examples that come with the DSC module.
Let me know if I can help you further.

Similar Messages

  • Unable to store log data into database through JDBCAppender of Log4j

I am able to store the log data in a file and display it on the console, but I am unable to store it in the database. I am not getting any error or warning during execution. The contents of log.properties are as below:
    log4j.rootLogger=ERROR, C, FILE
    log4j.logger.org.firebird=ERROR, C
    log4j.logger.org.firebirdsql=ERROR, C
    log4j.logger.org.apache.joran=ERROR, C
    log4j.logger.org.apache.log4j.joran.action=ERROR, C
    log4j.appender.FILE=org.apache.log4j.FileAppender
    log4j.appender.FILE.file=/log.txt
    log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
    log4j.appender.FILE.layout.ConversionPattern=[%d{MMM dd HH:mm:ss}] %-5p (%F:%L) - %m%n
    log4j.logger.org.apache.log4j.jdbcplus.examples=DEBUG, JDBC
    # console appender
    log4j.appender.C=org.apache.log4j.ConsoleAppender
    log4j.appender.C.layout=org.apache.log4j.PatternLayout
    log4j.appender.C.layout.ConversionPattern=%d [%t] %-5p %c %x - %m%n
    # JDBC appender using custom handlers, 2a)
    log4j.appender.JDBC=org.apache.log4j.jdbcplus.JDBCAppender
    log4j.appender.JDBC.connector=org.apache.log4j.jdbcplus.examples.MySqlConnectionHandler
    log4j.appender.JDBC.sqlhandler=org.apache.log4j.jdbcplus.examples.SqlHandler
    log4j.appender.JDBC.dbclass=com.mysql.jdbc.Driver
    log4j.appender.JDBC2.url=jdbc:mysql:172.22.15.131/3306:plugins?
    log4j.appender.JDBC2.username=user18
    log4j.appender.JDBC2.password=user18
    log4j.appender.JDBC.buffer=1
    log4j.appender.JDBC.commit=true
    log4j.appender.JDBC.sql=INSERT INTO logtest (id, prio, iprio, cat, thread, msg, layout_msg, throwable, ndc, mdc, mdc2, info,
    addon, the_date, the_time, the_timestamp, created_by) VALUES (@INC@, '@PRIO@', @IPRIO@, '@CAT@', '@THREAD@', '@MSG@',
    '@LAYOUT:1@', '@THROWABLE@', '@NDC@', '@MDC:MyMDC@', '@MDC:MyMDC2@', 'info timestamp: @TIMESTAMP@', '@LAYOUT@', cast
    ('@LAYOUT:3@' as date), cast ('@LAYOUT:4@' as time), cast ('@LAYOUT:3@ @LAYOUT:4@' as timestamp), 'me')
    log4j.appender.JDBC.layout=org.apache.log4j.PatternLayout
    log4j.appender.JDBC.layout.ConversionPattern=%m
    Please help me out, as I am stuck.
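    Two details in the posted properties stand out: the connection settings (url, username, password) are declared under the prefix JDBC2 while the appender itself is named JDBC, and the MySQL URL is not in the usual jdbc:mysql://host:port/database form. Before digging further into log4j it may help to confirm the database side with plain JDBC. A minimal sketch, assuming the host, port, database and credentials from the properties above (the column list may need adjusting if logtest has other NOT NULL columns):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class LogTableCheck {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver");   // driver class from the posted properties
            String url = "jdbc:mysql://172.22.15.131:3306/plugins";   // URL rewritten into the standard form
            try (Connection con = DriverManager.getConnection(url, "user18", "user18");
                 Statement st = con.createStatement()) {
                // throw-away insert into the same table the appender targets
                st.executeUpdate("INSERT INTO logtest (id, msg, created_by) "
                               + "VALUES (9999, 'jdbc connectivity test', 'me')");
                System.out.println("Insert succeeded - the connection settings are usable");
            }
        }
    }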

    Hi,
    This might help
    http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/
    I think the heading of that blog post is misleading; it is a solution for importing CSV.
    But you can easily convert your Excel files to CSV (a rough example of automating that follows below).
    Importing pure Excel is quite hard, and I have not seen any solutions for it.
    See this post also
    Importing Excel spreadsheet into Oracle via Apex
    Br,Jari
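    If the Excel-to-CSV conversion needs to be automated rather than done by hand, a rough Java sketch using Apache POI can dump a sheet as CSV. The poi-ooxml dependency and the file names are assumptions, and completely blank cells are simply skipped by this iterator:

    import java.io.FileInputStream;
    import java.io.PrintWriter;
    import org.apache.poi.ss.usermodel.*;

    public class XlsxToCsv {
        public static void main(String[] args) throws Exception {
            DataFormatter fmt = new DataFormatter();   // renders each cell as its displayed text
            try (Workbook wb = WorkbookFactory.create(new FileInputStream("input.xlsx"));
                 PrintWriter out = new PrintWriter("output.csv", "UTF-8")) {
                Sheet sheet = wb.getSheetAt(0);
                for (Row row : sheet) {
                    StringBuilder line = new StringBuilder();
                    for (Cell cell : row) {
                        if (line.length() > 0) line.append(',');
                        // naive quoting: wrap every field and double embedded quotes
                        line.append('"')
                            .append(fmt.formatCellValue(cell).replace("\"", "\"\""))
                            .append('"');
                    }
                    out.println(line);
                }
            }
        }
    }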

  • Can i use one interface to load data into 2 different tables

    Hi Folks,
    Can I use one interface to load data into two different tables (in the same schema or in different schemas) from one source table with the same structure?
    Please advise.
    Thanks
    Raj
    Edited by: user11410176 on Oct 21, 2009 9:55 AM

    Hi Lucky,
    Thanks for your reply,
    What I am trying to do is load the data from three legacy tables into Oracle staging tables. But I need to load the same source data into two staging tables (these staging tables are in two different schemas).
    Can I load this source data into two staging tables using a single standard interface (there is some business logic involved)?
    If I can, please give me some suggestions on how to do that.
    Thanks in advance
    Raj
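    If the two staging tables live in the same Oracle instance, one alternative outside the interface itself (for example from an ODI procedure or a plain JDBC step) is Oracle's multi-table INSERT ALL, which writes one source query into both targets in a single statement. A rough sketch with made-up schema, table and column names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class DualStageLoad {
        public static void main(String[] args) throws Exception {
            String sql =
                "INSERT ALL "
              + "  INTO stage_a.stg_table (id, name, amount) VALUES (id, name, amount) "
              + "  INTO stage_b.stg_table (id, name, amount) VALUES (id, name, amount) "
              + "SELECT id, name, amount FROM legacy_src";
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
                 Statement st = con.createStatement()) {
                int rows = st.executeUpdate(sql);   // row count covers inserts into both targets
                System.out.println("Inserted " + rows + " rows in total");
            }
        }
    }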

  • Can I use Bridge to export image data into a .txt file?

    I have a folder of images and I would like to export the File Name, Resolution, Dimensions and Color Mode for each file into one text file. Can I use Bridge to export image data into a .txt file?

    Hello
    You may try the following AppleScript script. It will ask you to choose a root folder in which to start searching for *.map files and then create a CSV file named "out.csv" on the desktop, which you can import into Excel.
    set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
    if f ends with "/" then set f to f's text 1 thru -2
    do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
    use strict;
    use open IN => ':crlf';
    chdir $ARGV[0] or die qq($!);
    local $/ = qq(\\0);
    my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
    local $/ = qq(\\n);
    #     CSV spec
    #     - record separator is CRLF
    #     - field separator is comma
    #     - every field is quoted
    #     - text encoding is UTF-8
    local $\\ = qq(\\015\\012);    # CRLF
    local $, = qq(,);            # COMMA
    # print column header row
    my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    # print data row per each file
    while (@ff) {
        my $f = shift @ff;    # file path
        if ( ! open(IN, '<', $f) ) {
            warn qq(Failed to open $f: $!);
            next;
        }
        $f =~ s%^.*/%%og;    # file name
        @dd = ('', $f, '', '', '', '');
        while (<IN>) {
            chomp;
            $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
            ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
            ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
            ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
            ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
            last unless grep { /^$/ } @dd;
        }
        close IN;
        print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
    }
    EOF"
    Hope this may help,
    H
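    Coming back to the original question (image metadata rather than *.map files): if a script outside Bridge is acceptable, a small Java sketch along these lines can collect the file name, pixel dimensions and colour component count into a text file. The folder path and output name are placeholders, and true print resolution (DPI) would require reading the image metadata tree rather than ImageIO's basic read:

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.PrintWriter;
    import javax.imageio.ImageIO;

    public class ImageInfoToCsv {
        public static void main(String[] args) throws Exception {
            File folder = new File("/path/to/images");          // assumed input folder
            File[] files = folder.listFiles();
            if (files == null) return;                          // not a readable folder
            try (PrintWriter out = new PrintWriter("image-info.txt", "UTF-8")) {
                out.println("name,width,height,colorComponents");
                for (File f : files) {
                    BufferedImage img = ImageIO.read(f);        // null if the file is not a readable image
                    if (img == null) continue;
                    out.printf("%s,%d,%d,%d%n",
                            f.getName(),
                            img.getWidth(),
                            img.getHeight(),
                            img.getColorModel().getNumComponents());
                }
            }
        }
    }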

  • I have manually inputted data into a blank spreadsheet and would like to use a form to enter data into that sheet !!! How do I do that please

    I have manually entered data into a blank spreadsheet and would like to use a form to enter data into that sheet. How do I do that, please?

    Leigh,
    After creating your table, tap the tab marked "+" and select "Form" - you will be asked which table to use. (You should get into the habit of naming your tables - if the table name isn't visible, select the table and tap the Styles (Brush) Menu > Table > Turn ON Table Name, then double tap on the table to edit it.)
    Select the table you wish to fill with a form and you will see a new form based on the header data.
    The form will allow you to add one row at a time and when you get to the last row, there is an option at the bottom of the form to add a new row using the "+" button. At the top of the form is the row header which you can rename by double tapping.
    You can also rename the form on the Tab by double tapping on the name.
    Try it out.

  • Using .csv file and adding data into database

    hi,
    I'm working on a project which shows all the share prices from the FTSE 100 on a webpage.
    My problem is: I connect to yahoo.co.uk to get my share price information, which is updated every 15 minutes, and they return a .csv file to me. At the moment I am just printing the information onto my website, but is there any way I could store this information in a database if I needed the data to be used elsewhere? Thanks for the help in advance.

    Below is the code I use to get my info onto the web. I'd just like to know how I would use this to store the data in a database (a rough sketch of that follows after the code).
    import java.net.*;
    import java.io.*;
    import java.lang.*;
    import java.util.*;
    public class SharePrice {
        private String line;
        private int maxShares = 101;   // maximum shares a user can have
        private int details = 5;       // five details: name, date, time, price, change
        public String[][] shareData = new String[maxShares][details];

        public SharePrice(String[] shares) throws Exception {
            getShare(shares);
        }

        // returns a two-dimensional array containing the data of each share as a separate row
        public String[][] getShare(String[] sh) throws Exception {
            for (int i = 0; i < sh.length; i++) {
                // if the entry is null we have reached the end of the array
                if (sh[i] != null) {
                    String share = sh[i];
                    // base URL of the resource
                    String address = "http://uk.finance.yahoo.com/d/quotes.csv?s=";
                    // add the share symbol so that particular share's data is retrieved
                    address = address + share;
                    System.out.println(address);
                    try {
                        // open a connection to the resource and read the line of data
                        URL url = new URL(address);
                        BufferedReader in = new BufferedReader(
                                new InputStreamReader(url.openStream()));
                        line = in.readLine();
                        in.close();
                    } catch (Exception e) {
                        System.err.println("Exception: " + e.getMessage());
                        e.printStackTrace();
                    }
                    // the line retrieved is split and placed in a single row of the array;
                    // because each piece of data is separated by commas it is easily split
                    StringTokenizer t = new StringTokenizer(line, ",");
                    int count = t.countTokens();
                    System.out.println(" count= " + count);
                    // guard against more tokens than the row can hold
                    for (int j = 0; j < count && j < details; j++) {
                        shareData[i][j] = t.nextToken();
                    }
                }
            }
            return shareData;
        }
    }
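    To go from printing to persisting, one option is a plain JDBC batch insert right after getShare() has filled shareData. This is only a sketch: the table name share_price, its columns and the connection URL are assumptions to adapt to your own database:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class SharePriceStore {
        public static void store(String[][] shareData) throws SQLException {
            String url = "jdbc:mysql://localhost:3306/shares";   // assumed database
            String sql = "INSERT INTO share_price (name, trade_date, trade_time, price, change_pct) "
                       + "VALUES (?, ?, ?, ?, ?)";
            try (Connection con = DriverManager.getConnection(url, "user", "password");
                 PreparedStatement ps = con.prepareStatement(sql)) {
                for (String[] row : shareData) {
                    if (row[0] == null) continue;          // skip unused rows
                    for (int i = 0; i < 5; i++) {
                        ps.setString(i + 1, row[i]);       // store all five fields as text for simplicity
                    }
                    ps.addBatch();
                }
                ps.executeBatch();                          // one round trip for all rows
            }
        }
    }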

  • Using importdata to import xml data into dynamic PDF form

    Hi again,
    My colleagues and I have a problem using the importData service to import some xml data into an empty PDF form (represented as an XFA input variable).
    In the server log I get the error that only XDP data is supported for XFA forms; however, I only have the xml data and not the entire xdp available.
    Is this really not possible to do (importing xml data into a form is possible in Designer when creating forms)?
    I hope the scenario is understandable
    Sincerely
    Kim Christensen
    Dafolo A/S
    Denmark
    PS: During the various projects I am working on I keep running into problems... however, I am totally new to LiveCycle, so I consider these very informative learning steps and appreciate all your help :-)

    Hi again,
    I have been experimenting a little with both the renderPDFForm and importData services. However, I don't seem to be able to make them work as I need them to.
    My scenario is simple: I have one xdp/PDF form (call it a template) and lots of data in xml files (around 1000) that I need to import into the template. Therefore I have set up a "Watched Folder" to take the xml as a document (I guess this is a requirement), and then I need either the renderPDFForm or importData service to import the xml data into the template.
    I would like to know how I should set up the services to make this work.
    When I try to use importData I set up the following:
    PDF document: set to the template I need to import the xml into
    Input data: the document variable (an xml file) that is passed to the Watched Folder
    Data merged PDF: set to an output xfaform variable
    When I do this I get a coercion error in the server log:
    2007-11-15 13:27:05,324 ERROR [com.adobe.workflow.AWS] stalling action-instance: 1506 with message: ALC-DSC-000-000: com.adobe.idp.dsc.DSCRuntimeException: Internal error.
         at com.adobe.idp.dsc.util.CoercionUtil.toDOMDocument(CoercionUtil.java:656)
         at com.adobe.idp.dsc.util.CoercionUtil.toType(CoercionUtil.java:878)
         at com.adobe.idp.dsc.util.CoercionUtil.toType(CoercionUtil.java:803)
         at com.adobe.workflow.datatype.runtime.support.AbstractDataTypeRuntimeHandler.coerceFrom(AbstractDataTypeRuntimeHandler.java:64)
         at com.adobe.workflow.datatype.runtime.support.AbstractComplexDataTypeRuntimeHandler.getNode(AbstractComplexDataTypeRuntimeHandler.java:47)
         at com.adobe.workflow.dom.VariableElement.setBoundValue(VariableElement.java:93)
         at com.adobe.workflow.pat.service.PATExecutionContextImpl.setProcessDataValue(PATExecutionContextImpl.java:729)
         at com.adobe.workflow.pat.service.PATExecutionContextImpl.setProcessDataWithExpression(PATExecutionContextImpl.java:335)
         at com.adobe.idp.workflow.dsc.service.SetValueService.execute(SetValueService.java:46)
         ...
    Caused by: ALC-DSC-119-000: com.adobe.idp.dsc.util.InvalidCoercionException: Cannot coerce object: <document state="passive" senderVersion="3" persistent="false" senderPersistent="true" passivated="true" senderPassivated="true" deserialized="true" senderHostId="127.0.0.1/172.16.10.125" callbackId="0" senderCallbackId="7" callbackRef="null" isLocalizable="true" isTransactionBound="false" defaultDisposalTimeout="600" disposalTimeout="600" maxInlineSize="65536" defaultMaxInlineSize="65536" inlineSize="8039" contentType="application/xml" length="-1"><cacheId/><localBackendId/><globalBackendId/><senderLocalBackendId/><senderGlobalBackendId/><inline><?xml version="1.0" encoding="UTF-8"?><form1><sub_BlanketTop /><sub_SubjectTop><f...</inline><senderPullServantJndiName>adobe/idp/DocumentPullServant/adobejb_server1</senderPullServantJndiName><attributes file="c:\NCRConvert\ProcessForm\stage\Wx450d4b32843a0b0bcb8ef99e\NCR-00564_dXAE3soH.xml"/></document> of type: com.adobe.idp.Document to type: interface org.w3c.dom.Document
         at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
         at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
    However, it is possible to manually import the xml data in Acrobat Professional without any problems.
    When I use renderPDFForm with the following settings:
    Form to render: literal value that points to the template
    Form data: document variable from the watched folder (xml)
    Content Root URI: repository://
    With these settings I also get a coercion error; however, it does not seem to be exactly the same.
    Sincerely
    Kim

  • Change Log data into SAP BI

    Any best practices for getting change-log data similar to that captured in CDHDR/CDPOS tables into SAP BI? What options exist for tracking changes to sales orders or invoices into SAP BI without touching CDHDR/CDPOS tables?
    Thanks,
    Vinny Ahuja

    Hi,
    You can check whether that DataSource carries out delta loading by using transaction RSA7. With respect to sales orders or invoices: if the particular DataSource is set up for delta loading into BW, it will show a green status there, which tells you that the DataSource delivers deltas to the BW system.
    If helpful, please assign points.
    Regards,
    Harikrishna N

  • Error while using Rule file in loading data into Essbase through ODI

    Hi Experts,
    I am facing a problem while loading data into Essbase. I am able to load data into Essbase successfully, but when I use a rule file to add values to existing values I get an error.
    test is my rule file.
    com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Cannot put olap file object. Essbase Error(1053025): Object [test] already exists and is not locked by user [admin@Native Directory]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
         at java.lang.Thread.run(Thread.java:662)
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C1_HSP_RATES "HSP_Rates",C2_ACCOUNT "Account",C3_PERIOD "Period",C4_YEAR "Year",C5_SCENARIO "Scenario",C6_VERSION "Version",C7_CURRENCY "Currency",C8_ENTITY "Entity",C9_VERTICAL "Vertical",C10_HORIZONTAL "Horizontal",C11_SALES_HIERARICHY "Sales Hierarchy",C12_DATA "Data" from PLANAPP."C$_0HexaApp_PLData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    #stmt.setFetchSize(srcFetchSize)
    stmt.setFetchSize(1)
    print "executing query"
    rs = stmt.executeQuery(sql)
    print "done executing query"
    #load the data
    print "loading data"
    stats = pWriter.loadData(rs)
    print "done loading data"
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me on this...
    Thanks & Regards,
    Chinnu

    Hi Priya,
    Thanks for the reply. I have already checked that there are no locks on the rule file. I don't know what the problem is. It works fine without the rule file, but throws the error only when the rule file is used.
    Please help on this.
    Thanks,
    Chinnu

  • Using Import Man to load Data into Multi Value Fileds in a Qualified Table

    Hi there,
    When using the Import Manager, I cannot use the "append" option to load data into my multi-value field, which is contained within my qualified table.
    Manually it works fine in Data Manager, so the field has been set up correctly. The only problem is appending the data during the Import Manager load.
    Is there any reason why this option is not available during field mapping in Import Manager? The selection options are shown but greyed out.
    I would appreciate any suggestions.
    Chris Huggett

    Thanks Sowseel,
    It's a good document, but it doesn't address my problem; maybe my problem isn't clear.
    The structure(part of) that I have currently is as follows.
    Main Table - Material
                           QFTable-  MNF PN
                               LUField - MNF Name(Qualifier Single Value)
                               LUField  - BU ID  (Non Qualifier Multi Value)
                               TField   - P/N- (Non Qualifier)
    I know how to load data into the main and qualified tables, but what I cannot do, using Import Manager, is update the "LUField - BU ID (Non Qualifier Multi Value)" using the append functionality.
    Thanks
    Chris Huggett

  • Using VBA for loading Query data into Excel workbook

    Hi all.
    I simply want to load data from a BEx query into an Excel worksheet using VBA, because of the report formats and a footer section at the end of the results.
    Any code examples, tutorials, comments or suggestions would be appreciated.
    Thanks in advance,
    Gediminas

    The difficulty is that I don't know the number of rows the report will return, and I need my footer only on the LAST page of the workbook.
    Another thing I can't see how to do using standard BEx functionality is designing a complex column header set (merged columns, sub-columns, etc.).

  • Interface used in loading flat file data into bw

    What is the interface used for loading flat files into the BW system?

    Hi. Flat file load is its own interface. It leverages the S-API (Service API), the standard interface that is used to load data into the BI system.
    Start from this point:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03450525ee517be10000000a1553f6/frameset.htm
    Thanks for any points you choose to assign.
    Regards -
    Ron Silberstein
    SAP

  • Unable to use sqlloader to load large data into a varchar2 column..plz.help

    Hi,
    Currently the column in the database is a varchar2(4000) column that stores data in the form of XML, and the data has many carriage returns.
    Currently I am trying to archive this column's values and later truncate the data in this column.
    I was able to create a .csv file, but when using SQL*Loader the utility fails, as the data I am about to load is stored sometimes in the form of XML and sometimes as a list of attributes separated by newline characters.
    After several failed attempts, I would like to know whether this can be achieved using SQL*Loader, or whether SQL*Loader is a bad choice and I should go for an import/export instead.
    Thanks in advance...
    -Kevin

    Currently the column in the database is a varchar2(4000) column that stores data in the form of xml and data that has many carriage returns.
    Can I ask why your XML data has carriage returns in it? The nature of XML dictates that the structure is defined by the tags. Of course you can have CRs in your actual data between those tags, but if someone is hard-coding CRs into the XML to make it look pretty, that is not how XML was intended to be used.
    I was able to create a .csv file but when using sqlloader, the utility fails as the data I am about to load has data stored both in the form of "xml" sometimes and sometimes as a list of attributes separated by newline character.
    It probably can be done (can you provide a sample of the data so we can see the structure?), but you would probably be best revisiting the code that generates the CSV and ensuring that it is output in a simpler format.
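    As a rough alternative that sidesteps the embedded-newline problem entirely, the archiving can also stay inside the database, with no CSV or SQL*Loader step at all. A hedged JDBC sketch with made-up table and column names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ArchiveXmlColumn {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
                 Statement st = con.createStatement()) {
                // copy the key and the varchar2(4000) column into an archive table
                st.executeUpdate("CREATE TABLE my_table_archive AS "
                               + "SELECT id, xml_col FROM my_table");
                // then clear the original column (or truncate, per your requirement)
                st.executeUpdate("UPDATE my_table SET xml_col = NULL");
                System.out.println("Column archived and cleared");
            }
        }
    }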

  • Using events (?) to share data between tab pages

    I don't know whether this is an event question or an Object-Oriented design question or a JTabbedPane question.
    I'm pretty new to Java, so be patient!
    I wish to have a JTabbedPane with a tab to do 'Import' of data, and a tab to do 'Export' of data (the SAME data which was imported). Now, how do I share the data between the 2 tabs?
    It seems the data belongs at the JFrame level, but then how does the data in each tab access it? In other languages I would pass a reference to the data object in the window to each tab.
    Or how do I get (or set) the data when the tab changes? And how do I know when the tab changes?
    I think I need to trigger some sort of event to the parent window, which I suppose has a listener, and it can then query the appropriate pane? If so, I'm not sure what sort of event I should be using, where, etc.
    I know if I had one tab with an import/export functionality then I could declare the Collection within that panel.
    Help!!!
    Thanks ;)

    I think this is an OO design question.
    You basically have two views (the two panes) of the same data. So I'd suggest the Observer pattern (see the Design Patterns book) where one model holds the data and the model is observed by several views.
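    A minimal Swing sketch of that idea, with made-up names: the frame owns one shared model, the Import tab writes to it, and the Export tab listens for changes:

    import java.util.ArrayList;
    import java.util.List;
    import javax.swing.*;

    class SharedDataModel {
        private final List<String> records = new ArrayList<>();
        private final List<Runnable> listeners = new ArrayList<>();

        void addListener(Runnable l) { listeners.add(l); }

        void setRecords(List<String> newRecords) {
            records.clear();
            records.addAll(newRecords);
            listeners.forEach(Runnable::run);   // notify every view of the change
        }

        List<String> getRecords() { return new ArrayList<>(records); }
    }

    public class ImportExportFrame extends JFrame {
        public ImportExportFrame() {
            super("Import / Export");
            SharedDataModel model = new SharedDataModel();   // the data lives at the frame level

            // Import tab: pushes data into the shared model
            JPanel importPanel = new JPanel();
            JButton importButton = new JButton("Import sample data");
            importButton.addActionListener(e ->
                    model.setRecords(List.of("row 1", "row 2", "row 3")));
            importPanel.add(importButton);

            // Export tab: observes the model and always shows the current data
            DefaultListModel<String> listModel = new DefaultListModel<>();
            model.addListener(() -> {
                listModel.clear();
                model.getRecords().forEach(listModel::addElement);
            });
            JPanel exportPanel = new JPanel();
            exportPanel.add(new JScrollPane(new JList<>(listModel)));

            JTabbedPane tabs = new JTabbedPane();
            tabs.addTab("Import", importPanel);
            tabs.addTab("Export", exportPanel);
            add(tabs);
            setDefaultCloseOperation(EXIT_ON_CLOSE);
            pack();
        }

        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> new ImportExportFrame().setVisible(true));
        }
    }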

  • How to use record type for inserting data into a table

    I've read through the web the following tip:
    "When working with a large number of columns, using variables of type RECORD is a better approach. With this approach, you do not have to list anything in the INSERT statement, because you are inserting a variable into the table that is defined as a row from the same table."
    I usually define a variable as
    r_mytable     mytable%ROWTYPE;
    and inside my procedure I'll set the fields I need as
    r_mytable.field1 := 1;
    r_mytable.field3 := 7;
    and then I perform the insert as
    insert into mytable
    (field1,
    field3)
    values
    (r_mytable.field1,
    r_mytable.field3);
    According to the tip I've copied above, how can I use the TYPE RECORD of pl/sql for achieving the same result but perhaps in a smarter way?
    Thanks in advance!

    Hi,
    Are you looking for this?
    SQL> create table table1 (id number, descr varchar2(30));
    Table created
    SQL>
    SQL> DECLARE
      2   my_rec table1%rowtype;
      3  BEGIN
      4   my_rec.id := 1;
      5   my_rec.descr := 'TEST';
      6   insert into table1 values my_rec;
      7  END;
      8  /
    PL/SQL procedure successfully completed
    SQL> select * from table1;
       ID DESCR
        1 TEST
    Works with UPDATE too:
    SQL> DECLARE
      2   my_rec table1%rowtype;
      3  BEGIN
      4   my_rec.id := 2;
      5   my_rec.descr := 'TEST2';
      6   update table1 set row = my_rec where id = 1;
      7  END;
      8  /
    PL/SQL procedure successfully completed
    SQL> select * from table1;
       ID DESCR
        2 TEST2
