How to log data directly to an ODBC datasource (MS Access)

Greetings,
I have been logging the data (an array) obtained from a transducer directly into a spreadsheet using a LabVIEW VI.
Now I need to put this data directly into an Access 2000 database.
Any suggestions?

There are a couple of options for logging data to an ODBC database: the LabVIEW Datalogging and Supervisory Control (DSC) module or the LabVIEW Enterprise Connectivity Toolkit.
With LabVIEW DSC comes the ODBC-accessible Citadel database. This lets you log data easily, and you can then retrieve that data into MS Access with SQL statements. More information about the DSC module can be found at:
http://sine.ni.com/apps/we/nioc.vp?lang=US&pc=mn&cid=1010
The Enterprise Connectivity Toolkit gives you tools for database connectivity, statistical process control, and internet-enabling technologies. More information about this toolkit can be found at:
http://sine.ni.com/apps/we/nioc.vp?lang=US&pc=mn&cid=2452
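Whichever option you choose, what ultimately reaches the ODBC data source is a series of SQL INSERT statements. Purely to illustrate that pattern outside LabVIEW (this is not the DSC or toolkit VIs; the DSN "TransducerDB", the table "Readings", and its columns are invented for the example), a minimal Java sketch using the JDBC-ODBC bridge available in JREs up to Java 7 might look like this:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
public class AccessLogger {
    public static void main(String[] args) throws Exception {
        double[] readings = {1.02, 1.05, 0.98};              // hypothetical transducer array
        Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");       // load the JDBC-ODBC bridge
        // "TransducerDB" is an ODBC DSN pointing at the Access 2000 .mdb file
        Connection con = DriverManager.getConnection("jdbc:odbc:TransducerDB");
        PreparedStatement ps =
            con.prepareStatement("INSERT INTO Readings (SampleTime, ReadingValue) VALUES (?, ?)");
        for (double r : readings) {
            ps.setTimestamp(1, new java.sql.Timestamp(System.currentTimeMillis()));
            ps.setDouble(2, r);
            ps.executeUpdate();                              // one row per array element
        }
        ps.close();
        con.close();
    }
}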

Similar Messages

  • How to read data directly from clusters

    Hi all,
    How do I read data directly from clusters?
    Thanks in advance,
    Amruta.

    Use the macros:
    RP-IMP-C2-B2.
    RP-IMP-C2-B1.
    RP-IMP-C2-ZL.
    ...etc.
    For the TM cluster, you can also use BAPIs like HR_TIME_RESULTS_GET.
    For more details, see SAP HR course 350 (HR Programming).

  • How to store table data directly as a file in an FTP location

    Hi All,
    In my process I need to convert a table into a CSV file. I know how to convert it to a file, but the file needs to be stored directly in an FTP location. Can anyone help with this?
    Thanks in advance...
    Edited by: 947267 on Oct 26, 2012 2:46 AM

    Hi,
    I think you will have to store it locally first and then use the OdiFtpPut tool in a package (a rough sketch of that approach follows below).
    Do your requirements allow that?
    Regards,
    JeromeFr
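    For illustration only: the Java sketch below shows the "write locally, then push" pattern JeromeFr describes, standing in for the ODI package steps. It assumes the Apache Commons Net library for the FTP part, and the host, credentials, remote path, and table contents are placeholders.
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.PrintWriter;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;
    public class CsvToFtp {
        public static void main(String[] args) throws Exception {
            // 1) Dump the table rows to a local CSV file first
            String[][] rows = {{"1", "Alice"}, {"2", "Bob"}};   // placeholder table data
            File csv = new File("export.csv");
            try (PrintWriter out = new PrintWriter(csv)) {
                out.println("ID,NAME");
                for (String[] row : rows) {
                    out.println(String.join(",", row));
                }
            }
            // 2) Then push the local file to the FTP location
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");
            ftp.login("user", "password");
            ftp.setFileType(FTP.ASCII_FILE_TYPE);
            try (FileInputStream in = new FileInputStream(csv)) {
                ftp.storeFile("/inbound/export.csv", in);
            }
            ftp.logout();
            ftp.disconnect();
        }
    }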

  • How to marshal data directly to java.lang.String? (JAXB, Marshaller)

    The "Marshaller" interface provide some basic methods. It can marshals to a File, or a SAX ContentHandler, or a DOM Node, or a OutputStream etc...
    But I just only want to marshals to a String, and then sent the String as TextMessage via JMS.
    So, Is it exists a DIRECT way to do this? If NOT, I think I must marshal to a File first, then read the File into a String...
    Congratulations for China-Spaceflight !
    "ShenZhou NO.5" space shuttle.

    Here is my code
    String theXMLData = null;
    // Output
    Marshaller thePackager = jc.createMarshaller();
    thePackager.setProperty (Marshaller.JAXB_ENCODING, "GB2312");
    thePackager.setProperty( Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE );
    thePackager.marshal( theRootElement, System.out );          // to Console
    // to String
    StringWriter theStringWriter = new StringWriter (1024);     // Initial size = 1K bytes
    thePackager.marshal( theRootElement, theStringWriter );     // to String
    theXMLData = theStringWriter.toString();
    System.out.println (theXMLData);
    --------------------
    China-Spaceflight

  • How to extract data in a format recognized by MS Access

    Hello,
    I need help extracting a database from Essbase (Excel add-in or ESSCMD) to a file so that I can create a database to put inside Microsoft Access. The model I want is:
    ** I WANT THIS MODEL **
    Country   Month   Value
    Brazil    Jan     1000,00
    Brazil    Feb     1500,00
    USA       Jan     3000,00
    USA       Feb     3500,00
    ** I JUST KNOW HOW TO MAKE THIS ONE **
              Jan       Feb
    Brazil    1000,00   1500,00
    USA       3000,00   3500,00
    This example is a small piece of my database; I can't reformat the whole database into the wanted model. Please help, I'm a beginner in Essbase.
    Thanks a lot, and sorry for my bad English.
    Claudio G. Silva
    Brazil / Parana

    If you want to import in this format, I suggest you use the Java custom function JExport. It is more flexible than Dataexport for creating the file the way you need it. This thread shows you where to get it:
    Any body have JEXPORT utility
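    If you can only produce the crosstab layout, another option is to reshape that export before importing it into Access. The sketch below is not JExport; it is a plain Java illustration that assumes a tab-delimited export file named export.txt laid out like the second example (months across the top) and writes the Country/Month/Value layout the poster wants.
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;
    public class Unpivot {
        public static void main(String[] args) throws Exception {
            List<String> lines = Files.readAllLines(Paths.get("export.txt"));
            String[] months = lines.get(0).trim().split("\t");      // header row: Jan, Feb, ...
            try (PrintWriter out = new PrintWriter("access_import.txt")) {
                out.println("Country\tMonth\tValue");
                for (String line : lines.subList(1, lines.size())) {
                    String[] cells = line.split("\t");               // Country, then one value per month
                    for (int m = 1; m < cells.length; m++) {
                        // one output row per country/month combination
                        out.println(cells[0] + "\t" + months[m - 1] + "\t" + cells[m]);
                    }
                }
            }
        }
    }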

  • How to bring in data for a newly activated datasource in an application component

    Hi,
    I have 2 datasources (application component 11) active in the production system, and a delta load is running for them.
    Now we have a requirement to activate a 3rd datasource in the same application component 11.
    So how do I bring in data for the newly activated datasource? Do I delete the setup table data, run the statistical setup again, and run the init for all three datasources?
    How should this be done?

    Hi,
    To bring in the data for the new datasource you need to delete the data from the setup tables and fill them up again. Before filling the tables, it is better to first load the deltas for the other datasources in application component 11.
    Your deltas come from the delta queue, so deleting the setup tables will not impact your delta loads.
    So just fill up the setup tables and run your init package; it will extract the data.
    Thanks
    Srikanth

  • ODBC-datasource generating ORA-12154 (possible listener issue)

    Hi,
    I am trying to configure an ODBC datasource from a client (using Instant Client 11.2) to an Oracle 11.2 database server. The database listener is configured to use a non-standard port, 1700 in this case.
    There is a firewall between the client and the server which is configured to allow connections to the database server on the specified port. When I use telnet from the client to the server, the connection opens on this port, but when I try to test the ODBC datasource I have configured, I get the error ORA-12154.
    The link below outlines the details of the error:
    http://ora-12154.ora-code.com/
    For context, there is no problem using SQL*Plus from a computer behind the firewall to connect to the database server using the TNS name that is being used in the ODBC datasource.
    I have tried to configure the ODBC datasource both using a tnsnames.ora file and directly in the ODBC datasource using //[ip address]:[port]/TNS_NAME. Both ways of configuring the ODBC datasource generate the same error. I am currently suspecting that the error might be due to some sort of connection redirection on the part of the TNS listener on the database server which triggers the firewall to terminate the connection. Could this be the case?
    Does anyone have any suggestions as to what this error might be caused by? Any suggestions on how to continue my troubleshooting would also be valuable. Further, what kind of configuration would need to be in place on the TNS listener / database server side to make this sort of configuration work?
    Finally, I should also add that I have tested the same ODBC datasource configuration in a test system without any firewall in between, and that configuration works fine, so there seems to be no problem with the client software configuration per se.
    Any help is appreciated.
    /Eaglecoth

    Cabelcow wrote:
    I managed to solve this issue myself.
    Since there seems to be some problem locating the server I added "HOSTNAME" to the following line in the SQLNET.ORA file on the server:
    names.directory_path = (HOSTNAME,TNSNAMES)
    This solved the issue by using the following syntax in the ODBC Configuration:
    TNS Service Name: [ip-address]:[port]/[TNS NAME]
    Where the TNS_NAME should be that of the TNS_NAME for the database defined in the tnsnames.ora on the server. Note that this value is case sensitive.
    tnsnames.ora is ONLY used by the client side application. It is the tns complement to the local 'hosts' file. It is used by the CLIENT to resolve an alias (tns net service name) to a host (ultimately an ip), port, and service name. Your assertion that the tns_name should be ... hmm, now that I read that again, are you saying that the tnsnames entry on the client should match the one in the tnsnames file on the server? If so, yes and no. There is no technical requirement that they match. It is simply that it is usually assumed that the one on the server is correct and may be used as a model for what to do on the client. The server - acting as a server - doesn't even use the tnsnames.ora file. It exists on the server only to support any client process that may happen to be running on the same box as the db.
    Maybe this will help you understand the connections
    =================================
    ORA-12154: TNS:could not resolve the connect identifier specified
    This error means one thing, and one thing only. The client could not find the specified entry in the tnsnames.ora file being used.
    As a follow-on to that statement, remember that when you use a dblink, the database in which the link is defined is acting as a client to the database that is the target of the link. So in this case, the tnsnames.ora file on the host of your source should have an entry for your target db, as defined in the db_link.
    And for the umpteenth time ... this error has NOTHING to do with the status of a listener. The connection request never got far enough to reach a listener. If anyone tells you to check a listener in response to ora-12154, they are not paying attention, or do not understand how TNS works. This error is the equivalent of not being able to place a telephone call because you don't know the number of the party you want to reach. You wouldn't debug that situation by going to the other guy's house and testing his telephone, or by going to the phone company and testing the switchboard. And you don't debug an ORA-12154 by checking the listener. If I had a top ten list of "Incredibly Simple Concepts (tm)" that should be burned into the brain of everyone who claims to be an Oracle DBA, it would include "ORA-12154 Has Nothing To Do With The Listener".
    =================================
    A couple of important points.
    First, the listener is a server side only process. Its entire purpose in life is to receive requests for connections to databases and set up those connections. Once the connection is established, the listener is out of the picture. It creates the connection. It doesn't sustain the connection. One listener, with the default name of LISTENER, running from one oracle home, listening on a single port, will serve multiple database instances of multiple versions running from multiple homes. It is an unnecessary complexity to try to have multiple listeners or to name the listener as if it belongs to a particular database. That would be like the telephone company building a separate switchboard for each customer.
    Additional notes on the listener: One listener is capable of listening on multiple ports. But please notice that it is the listener using these ports, not the database instance. You can't bind a specific listener port to a specific db instance. Similarly, one listener is capable of listening on multiple IP addresses (in the case of a server with multiple NICs) But just like the port, you can't bind a specific ip address to a specific db instance.
    Second, the tnsnames.ora file is a client side issue. Its purpose is for address resolution - the tns equivalent of the 'hosts' file further down the network stack. The only reason it exists on a host machine is because that machine can also run client processes.
    Assume you have the following in your tnsnames.ora:
    larry =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = myhost)(PORT = 1521))
        )
        (CONNECT_DATA =
          (SERVICE_NAME = curley)
        )
      )
    Now, when you issue a connect, say like this:
    $> sqlplus scott/tiger@larry
    tns will look in your tnsnames.ora for an entry called 'larry'. Finding it, tns sends a request through the normal network stack to (PORT = 1521) on (HOST = myhost) using (PROTOCOL = TCP), asking for a connection to (SERVICE_NAME = curley).
    Where is (HOST = myhost) on the network? When the request gets passed from tns to the next layer in the network stack, the name 'myhost' will get resolved to an IP address, either via a local 'hosts' file, via DNS, or possibly other less used mechanisms. You can also hard-code the ip address (HOST = 123.456.789.101) in the tnsnames.ora.
    Next, the standard networking process delivers the message to port 1521 on myhost. Hopefully, there is a listener on myhost configured to listen on port 1521, and that listener knows about SERVICE_NAME = curley. If so, the listener will spawn a server process to act as the intermediary between your client and the database instance. Communication to the server process will be on a randomly selected available port. At that point the listener is out of the process and continues to use port 1521 to await other connection requests.
    What can go wrong?
    First, there may not be an entry for 'larry' in your tnsnames. In that case you get "ORA-12154: TNS:could not resolve the connect identifier specified" No need to go looking for a problem on the host, with the listener, etc. If you can't place a telephone call because you don't know the number (can't find your telephone directory (tnsnames.ora) or can't find the party you are looking for listed in it (no entry for larry)) you don't look for problems at the telephone switchboard.
    Maybe the entry for larry was found, but myhost couldn't be resolved to an IP address (say there was no entry for myhost in the local hosts file). This will result in "ORA-12545: Connect failed because target host or object does not exist"
    Maybe there was an entry for myserver in the local hosts file, but it specified a bad IP address. This will result in "ORA-12545: Connect failed because target host or object does not exist"
    Maybe the IP was good, but there is no listener running: "ORA-12541: TNS:no listener"
    Maybe the IP was good, there is a listener at myhost, but it is listening on a different port. "ORA-12560: TNS:protocol adapter error"
    Maybe the IP was good, there is a listener at myhost, it is listening on the specified port, but doesn't know about SERVICE_NAME = curley. "ORA-12514: TNS:listener does not currently know of service requested in connect descriptor"
    Third: If the client is on the same machine as the db instance, it is possible to connect without referencing tnsnames and without going through the listener.
    Now, when you issue a connect, say like this:
    $> sqlplus scott/tiger
    tns will attempt to establish an IPC connection to the db instance. How does it know the name of the instance? It uses the current value of the environment variable ORACLE_SID. So...
    $> export ORACLE_SID=fred
    $> sqlplus scott/tiger
    It will attempt to connect to the instance known as "fred". If there is no such instance, it will, of course, fail. Also, if there is no value set for ORACLE_SID, the connect will fail.
    check executing instances to get the SID
    [oracle@vmlnx01 ~]$ ps -ef|grep pmon|grep -v grep
    oracle    4236     1  0 10:30 ?        00:00:00 ora_pmon_vlnxora1
    Set ORACLE_SID appropriately, and connect:
    [oracle@vmlnx01 ~]$ export ORACLE_SID=vlnxora1
    [oracle@vmlnx01 ~]$ sqlplus scott/tiger
    SQL*Plus: Release 10.2.0.4.0 - Production on Wed Sep 22 10:42:37 2010
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Now set ORACLE_SID to a bogus value, and try to connect:
    SQL> exit
    [oracle@vmlnx01 ~]$ export ORACLE_SID=FUBAR
    [oracle@vmlnx01 ~]$ sqlplus scott/tiger
    SQL*Plus: Release 10.2.0.4.0 - Production on Wed Sep 22 10:42:57 2010
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    ERROR:
    ORA-01034: ORACLE not available
    ORA-27101: shared memory realm does not exist
    Linux Error: 2: No such file or directory
    Enter user-name:
    Now set ORACLE_SID to null, and try to connect:
    [oracle@vmlnx01 ~]$ export ORACLE_SID=
    [oracle@vmlnx01 ~]$ sqlplus scott/tiger
    SQL*Plus: Release 10.2.0.4.0 - Production on Wed Sep 22 10:43:24 2010
    Copyright (c) 1982, 2007, Oracle.  All Rights Reserved.
    ERROR:
    ORA-12162: TNS:net service name is incorrectly specified
    Ok, that is how we get from the client connection request to the listener. What about the listener's part of all this?
    The listener is very simple. Its job is to listen for connection requests and make the connection (server process) between the client and the database instance. Once that connection is made, the listener is out of the picture. If you were to kill the listener, all existing connections would continue. The listener is configured with the listener.ora file, but if that file doesn't exist, the listener is quite capable of starting up with all default values. One common mistake with the listener configuration is to specify "HOST=localhost" or "HOST=127.0.0.1". This is a NONROUTABLE ip address. LOCALHOST and ip address 127.0.0.1 always mean "this machine on which I am sitting". So, all computers are known as "localhost" or "127.0.0.1". If you specify this address, the listener will only be capable of receiving requests from the machine on which it is running. If you specified that address in your tnsnames file - on a remote client machine - the request would be routed to the machine on which the requesting client resides. Probably not what you want.
    =====================================
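    A loosely related aside: the "spell out host, port, and service name instead of relying on an alias" approach that solved the ODBC case above is also available to JDBC clients via the Oracle thin driver URL, where no tnsnames.ora is consulted at all. A minimal sketch, assuming an ojdbc driver on the classpath and with placeholder host, port, service name, and credentials:
    import java.sql.Connection;
    import java.sql.DriverManager;
    public class DirectConnect {
        public static void main(String[] args) throws Exception {
            // The connect descriptor is carried in the URL itself, so no alias resolution happens
            String url = "jdbc:oracle:thin:@//192.168.1.10:1700/MYSERVICE";
            try (Connection con = DriverManager.getConnection(url, "scott", "tiger")) {
                System.out.println("Connected: " + !con.isClosed());
            }
        }
    }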

  • Logging data after a trigger with Lookout Direct

    Hi,
    I would like to log data with Lookout Direct after a trigger input (not periodically or continuously).  I am using Lookout Direct version 4.5.1 build 19 and a Direct Logic 250 PLC.
    Does anyone have suggestions on how to do this?  I would prefer that this data be logged into Excel.  I have 24 sensors connected to the PLC and each time a sensor transitions from hi to low I would like to log the time of transition. 
    Currently I have individual spreadsheet storage objects for each sensor as well as individual latchgates to indicate when logging has been completed.  In the PLC code there is a state machine for each individual sensor as well and one of the states waits for an acknowledgement from Lookout that data logging has been completed before moving on to the next state (I will have to dig a bit deeper to remember exactly why I needed to do that).
    I am hoping there is a more traditional approach that is easier than what I am doing.  One of the problems I have been facing is Lookout Direct crashing every few days and I suspect it is because the sensors are often sensed within milliseconds of each other and opening/closing so many files is causing problems.  I worked through a list of possible reasons that Lookout may be crashing (provided by tech support) and I am nearly convinced I am just asking too much of the program...
    Any help will be greatly appreciated
    Thank you,
    David

    In case someone can help with this, here is a bit more information about my application and the PLC/Lookout code I have developed:
    Actuators have two positions, nominal and folded.  Prox sensors are used to monitor the position of actuators.  12 actuators can be monitored simultaneously.  The time at which prox sensors are sensed high is recorded so that actuator speed and actuation success is logged. 
    The PLC code consists of 12 separate state machines, each with the following 8 states:
    State 1
      System reset or nominal timer reset
      Wait for nominal prox release
    State 2
      Nominal prox released
      Wait for folded prox sense
    State 3
      Folded prox sensed
      Wait for folded ack from Lookout (acknowledges that timer value has been logged)
    State 4
      Folded ack from Lookout received
      Wait for folded timer reset (this state is active for one scan only)
    State 5
      Fold-in timer reset
      Wait for folded prox release
    State 6
      Folded prox released
      Wait for nominal prox sense
    State 7
      Nominal prox sensed
      Wait for nominal ack from Lookout (acknowledges that timer value has been logged)
    State 8
      Nominal ack received from Lookout
      Wait for nominal timer reset (this state is active for one scan only)
    Lookout acknowledges that timer values have been read and saved with LatchGates.  Lookout uses a SpreadSheet Storage Object to save the PLC timers when the PLC enters states 3 or 7 (prox sensor triggered).  The logged member of the SpreadSheet Storage Object is used to change the state of the LatchGates which, in turn, signal the PLC to proceed to the next state.  When the folded file is saved, the nominal LatchGate is turned off and the folded LatchGate is turned on and vice-versa for the nominal file.
    The LatchGate/SpreadSheet Storage combination is what I am hoping to improve upon.  I believe Lookout is crashing when 12 SpreadSheet Storage Objects log to 12 different files during the same 1 second period of time.
    If anyone has suggestions of a way to log this data in the PLC memory or a software package better suited for this application, please let me know!  I believe this would be simple with LabVIEW, unfortunately obtaining the additional hardware and software that I would need hasn't been easy! 

  • I can't figure out how to log off of my daughter's iTunes account that has been loaded to my PC.  When I want to sync my iPhone, I get her data, not mine.

    I can't figure out how to log off of my daughter's iTunes account that has been loaded to my PC.  When I want to sync my iPhone, I get her data, not mine.

    Hi, Abril_Perez17.
    This may be related to a new feature embedded in iOS7 that shows all purchased music by default.  Go to Settings > Music, then turn off Show All Music.  See if the issue ceases once the feature has been disabled.  This information is located on page 63 of the user guide below. 
    iPhone User Guide
    Regards,
    Jason H. 

  • How to extract data from an Oracle database directly into BI 7.0 (NetWeaver)

    How do I extract data from an Oracle database directly into BI 7.0 (NetWeaver)? Is it something to do with EDI? Can anybody explain in detail?
    Thanks
    York

    You can use UD Connect to get data from an Oracle database into BW.
    Data Transfer with UD Connect -
    http://help.sap.com/saphelp_nw04/helpdata/en/78/ef1441a509064abee6ffd6f38278fd/content.htm
    Prerequisites
    You have installed the SAP WAS J2EE Engine with BI Java components. You can find more information on this in the SAP BW installation guide on the SAP Service Marketplace at service.sap.com/instguides.
    Hope it helps
    Chetan
    @CP..

  • How to transfer data from the change log table of a DSO to a Z-table using ABAP code

    Hi, can you please explain how to transfer data from the change log table of a DSO to a Z-table using ABAP code, without using the function module concept?

    ** PROGRAM NAME:   ZBW_DELTA_TO_GSTAR                              **
    report ZBW_DELTA_TO_GSTAR no standard page heading
                                     line-size 120
                                     line-count 75
                                     message-id ZBW_MSG_CLS.
    tables:   ZGIV_DLTA_EBV_BB,
              ZGIV_DLTA_EM2_BL,
              ZGIV_DLTA_EM2_BK.
    * Selection Screen Definitions
    SELECTION-SCREEN: BEGIN OF BLOCK INNER WITH FRAME TITLE TEXT-001.
    SELECTION-SCREEN: SKIP 1.
    PARAMETERS:       EBVBB RADIOBUTTON GROUP ROLL,
                      EM2BL RADIOBUTTON GROUP ROLL,
                      EM2BK RADIOBUTTON GROUP ROLL.
    SELECTION-SCREEN: END OF BLOCK INNER.
    Data:  WS_UPDATE_FLAG  Type C,
           UCounter(9)      Type N,
           ICounter(9)      Type N.
    DATA:  T_ZGIV_DLTA_EBV_BB Type Standard Table of ZGIV_DLTA_EBV_BB,
           s_ZGIV_DLTA_EBV_BB LIKE line of T_ZGIV_DLTA_EBV_BB.
    DATA:  T_ZGIV_DLTA_EM2_BK Type Standard Table of ZGIV_DLTA_EM2_BK,
           s_ZGIV_DLTA_EM2_BK LIKE line of T_ZGIV_DLTA_EM2_BK.
    DATA:  T_ZGIV_DLTA_EM2_BL Type Standard Table of ZGIV_DLTA_EM2_BL,
           s_ZGIV_DLTA_EM2_BL LIKE line of T_ZGIV_DLTA_EM2_BL.
    * Standard Internal Tables - Describe usage.
    data: begin of i_AEPSD_O0140 occurs 0.
            include structure /BIC/AEPSD_O0140.
    data: end of i_AEPSD_O0140.
    data: begin of i_AEPSD_O0240 occurs 0.
            include structure /BIC/AEPSD_O0240.
    data: end of i_AEPSD_O0240.
    data: begin of i_AEPSD_O0340 occurs 0.
            include structure /BIC/AEPSD_O0340.
    data: end of i_AEPSD_O0340.
    data: begin of i_GIV_DLTA_EBV_BB occurs 0.
            include structure ZGIV_DLTA_EBV_BB.
    data: end of i_GIV_DLTA_EBV_BB.
    data: begin of i_GIV_DLTA_EM2_BK occurs 0.
            include structure ZGIV_DLTA_EM2_BK.
    data: end of i_GIV_DLTA_EM2_BK.
    data: begin of i_GIV_DLTA_EM2_BL occurs 0.
            include structure ZGIV_DLTA_EM2_BL.
    data: end of i_GIV_DLTA_EM2_BL.
    * Miscellaneous Program Variables and Constants.
    * TOP-OF-PAGE
    top-of-page.
    * START-OF-SELECTION
    start-of-selection.
      Clear: i_GIV_DLTA_EBV_BB,
             i_GIV_DLTA_EM2_BK,
             i_GIV_DLTA_EM2_BL,
             UCounter, ICounter.
      IF EBVBB = 'X'.
        PERFORM 100_EXTRACT_EBV_BB_DELTA_RECS.
      ELSEIF EM2BK = 'X'.
        PERFORM 100_EXTRACT_EM2_BK_DELTA_RECS.
      ELSE.
        PERFORM 100_EXTRACT_EM2_BL_DELTA_RECS.
      ENDIF.
    * FORM 100_EXTRACT_EBV_BB_DELTA_RECS
    FORM 100_EXTRACT_EBV_BB_DELTA_RECS.
      Refresh:   i_AEPSD_O0140,
                 i_GIV_DLTA_EBV_BB.
      Clear:      UCounter, ICounter, s_ZGIV_DLTA_EBV_BB .
      Select * From /BIC/AEPSD_O0140
        Into TABLE i_AEPSD_O0140.
      IF SY-Subrc = 0.
        LOOP AT i_AEPSD_O0140.
          MOVE-CORRESPONDING i_AEPSD_O0140 TO s_ZGIV_DLTA_EBV_BB.
          MOVE SY-DATUM to s_ZGIV_DLTA_EBV_BB-create_dt.
          INSERT ZGIV_DLTA_EBV_BB FROM s_ZGIV_DLTA_EBV_BB.
          IF SY-Subrc = 0.
            ICounter = ICounter + 1.
          ELSE.
            UPDATE ZGIV_DLTA_EBV_BB FROM  s_ZGIV_DLTA_EBV_BB.
            IF SY-Subrc = 0.
              UCounter = UCounter + 1.
            ELSE.
              Message E067 with SY-DATUM ' ' SY-UZEIT ' '.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    ENDFORM.                    "100_EXTRACT_EBV_BB_DELTA_RECS
    * FORM 100_EXTRACT_EM2_BK_DELTA_RECS
    FORM 100_EXTRACT_EM2_BK_DELTA_RECS.
    Refresh:   i_AEPSD_O0240,
               i_GIV_DLTA_EM2_BK.
      Clear:      UCounter, ICounter, s_ZGIV_DLTA_EM2_BK .
      Select * From /BIC/AEPSD_O0240
        Into TABLE i_AEPSD_O0240.
      IF SY-Subrc = 0.
        LOOP AT i_AEPSD_O0240.
          MOVE-CORRESPONDING i_AEPSD_O0240 TO s_ZGIV_DLTA_EM2_BK.
          MOVE SY-DATUM to s_ZGIV_DLTA_EM2_BK-create_dt.
            INSERT ZGIV_DLTA_EM2_BK FROM s_ZGIV_DLTA_EM2_BK.
          IF SY-Subrc = 0.
            ICounter = ICounter + 1.
          ELSE.
            UPDATE ZGIV_DLTA_EM2_BK FROM  s_ZGIV_DLTA_EM2_BK.
            IF SY-Subrc = 0.
              UCounter = UCounter + 1.
            ELSE.
              Message E067 with SY-DATUM ' ' SY-UZEIT ' '.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    ENDFORM.                    "100_EXTRACT_EM2_BK_DELTA_RECS
    * FORM 100_EXTRACT_EM2_BL_DELTA_RECS
    FORM 100_EXTRACT_EM2_BL_DELTA_RECS.
    Refresh:   i_AEPSD_O0340,
               i_GIV_DLTA_EM2_BL.
      Clear:      UCounter, ICounter, s_ZGIV_DLTA_EM2_BL .
      Select * From /BIC/AEPSD_O0340
        Into TABLE i_AEPSD_O0340.
      IF SY-Subrc = 0.
        LOOP AT i_AEPSD_O0340.
          MOVE-CORRESPONDING i_AEPSD_O0340 TO s_ZGIV_DLTA_EM2_BL.
          MOVE SY-DATUM to s_ZGIV_DLTA_EM2_BL-create_dt.
            INSERT ZGIV_DLTA_EM2_BL FROM s_ZGIV_DLTA_EM2_BL.
          IF SY-Subrc = 0.
            ICounter = ICounter + 1.
          ELSE.
            UPDATE ZGIV_DLTA_EM2_BL FROM  s_ZGIV_DLTA_EM2_BL.
            IF SY-Subrc = 0.
              UCounter = UCounter + 1.
            ELSE.
              Message E067 with SY-DATUM ' ' SY-UZEIT ' '.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    ENDFORM.                    "100_EXTRACT_EM2_BL_DELTA_RECS
    * END-OF-SELECTION
    end-of-selection.
      perform D1000_REPORT_DATA.
    * D1000_REPORT_DATA
    form D1000_REPORT_DATA.
    *Display the title of the program
      write: /25 SY-TITLE.
      skip.
    * Display the details of the user and time
      write: /1 'Executed by', 15 SY-UNAME, 30 'Date',
      38 SY-DATUM, 53 'Time', 60 SY-UZEIT.
      skip 2.
      write: /  'Delta Records have been extracted  ',
             /   'Updates : ', UCounter,
             /   'Inserts : ', ICounter.
      skip.
      skip 3.
      write: /20 'End of the report'.
    endform.                                           "D1000_REPORT_DATA
    Check it out; this may also help you.

  • How to Parse XML data directly from context variables in webdynpro

    Hello,
       I have two requirements:
    1) I have a context variable which has a string value.
       I want to write this value into a flat file.
       How do I do this in Web Dynpro?
       Any sample code for this?
    2) In Web Dynpro, I want to parse and process XML data directly from a string context variable which
       has its value in XML format.
       How do I achieve this? Any pointers or sample code for this?
    Thanks and Regards,
    Anupama.

    Anupama,
    Here is a link which talks about unpacking XML and converting it to HTML:
    http://help.sap.com/saphelp_nw04/helpdata/en/eb/3dfb402eb5f76fe10000000a1550b0/content.htm
    I have done something like this in portal development and not in Web Dynpro, but in principle it should work everywhere. A minimal JDK-only sketch for both requirements follows.
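    Since Web Dynpro for Java runs on a standard JVM, plain JDK APIs cover both points. The sketch below is only an illustration: it assumes the context attribute already holds well-formed XML, uses a placeholder file path for the flat file, and hard-codes the string that in real Web Dynpro code would come from the context node.
    import java.io.FileWriter;
    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.InputSource;
    public class ContextXmlDemo {
        public static void main(String[] args) throws Exception {
            // Stand-in for the string context attribute value
            String xmlValue = "<order id=\"42\"><item>book</item></order>";
            // 1) Write the string value to a flat file (path is a placeholder)
            try (FileWriter writer = new FileWriter("context_value.xml")) {
                writer.write(xmlValue);
            }
            // 2) Parse the XML directly from the string, with no intermediate file
            DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(new InputSource(new StringReader(xmlValue)));
            System.out.println("Root element: " + doc.getDocumentElement().getNodeName());
        }
    }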

  • How to determine Default Table Logging (log data changes )

    Does anyone know how to view exactly which tables and related data fields have change logging enabled by default? I know that some of the standard reports will produce "edit reports" showing who changed what field, when, old and new values, etc., but I don't know how to determine where the data is retrieved from.
    For example: if I look in the ABAP Dictionary at table LFA1, technical settings, it shows that "log data changes" is not checked or enabled. But if I run the standard AR Master Data Change Report, I get output showing valid field changes.
    I have seen other threads that refer to SCU3, but I can't determine the above from that report.
    Any assistance would be greatly appreciated.

    Hi Arthur,
    As far as I am aware, these are 2 different things.
    There is table logging, which is at the table level and applies if activated (i.e. it's listed in table DD0LV with PROTOKOLL = X and the table logging parameter is set in the system profile/s).
    The second one is programmatic logging via change documents, when data is maintained through a program that has been written to include a log. Unfortunately, I'm not sure how to identify a complete list of these.
    Hope that is of some assistance.

  • Announcement: new Data Direct ODBC drivers available for download

    We have Crystal Reports 9 and we are using Crystal Reports with Legal Files software.
    I have downloaded the latest Data Direct ODBC driver.
    How do we get rid of the banner that comes up every time we run a report?
    Are the drivers not free if we purchase Crystal Reports?

    Ben,
    My Data Services environment is Linux.
    I need to connect to a Progress database.
    Is there an SAP driver for that?
    Is it possible to use the Progress Windows driver on Linux?
    Thanks
    Rodrigo Silveira

  • How to store call log data for one month?

    Hi,
    I am using an iPhone 4S with iOS 7.1.2. I would like to know whether there is any setting for storing call log data for the latest month.

    That's not how it works. Recents is limited to exactly 100 calls, not a time frame. If you need your call history for a specific time frame, look on your carrier's website. Most carriers will permit you to login to your account & view call history.
