Very Important Scenario - Filling Tables with Data

Hi,
For my current project, the custom (Z) staging table will be filled every night with data, based on data and calculations from the previous day's material movements, for the company code selected on the selection screen (via a scheduled job that runs with a variant).
My question is this: say the process is executed for one company code, and then an hour later, or even at the same time, another process is executed for a different company code. And what if the same company code is entered twice?
What should I do in such situations, and where do I put the data for the other company code?
Points will be rewarded and all responses will be greatly appreciated.

Do you want the duplicate company code to overwrite the first one? If so, just do a read on the table for the company code. If it exists (check sy-subrc), you can overwrite the data. If not, you could use a timestamp plus the company code as the key; that would allow multiple records for the same company code.
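For example, a minimal sketch of that check (ZMM_STAGE, BUKRS, RUN_DATE, p_bukrs and ls_result are placeholder names for your staging table, its key fields, the selection-screen parameter and the freshly calculated row):

DATA: ls_existing TYPE zmm_stage.

" Is there already a row for this company code and run date?
SELECT SINGLE * FROM zmm_stage
  INTO ls_existing
  WHERE bukrs    = p_bukrs
    AND run_date = sy-datum.

IF sy-subrc = 0.
  " Duplicate run for the same company code: overwrite the old row
  UPDATE zmm_stage FROM ls_result.
ELSE.
  " First run for this company code: create the row
  INSERT zmm_stage FROM ls_result.
ENDIF.

A single MODIFY zmm_stage FROM ls_result would do the same insert-or-update in one statement; if you instead add a timestamp to the table key, both runs are kept as separate records.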
Regards,
Davis

Similar Messages

  • Can I use OEM to automatically fill tables with data?

    Hi, currently I have tables that contain data about our databases, such as the characteristics of the servers where these DBs are stored; users enter these data manually via an interface. I know that OEM has this information, and I want to use it to fill the tables automatically. Is that doable?

    Apparently you know nothing of OEM architecture. The obvious thing would be to look up the documentation; sadly, you belong to the majority of people here who refuse to read any documentation.
    OEM comes in two flavors: Grid Control (monitoring multiple databases) and Database Control (monitoring one database).
    OEM needs a repository. With Grid Control this is located in a separate database, and each server has the Intelligent Agent installed to collect information from that server, provided the database is registered against Grid Control.
    With Database Control, the repository is in the monitored database.
    Having OEM 'feed' your 'repository' would be about the poorest solution you could implement. Converting your 'repository' into views on the OEM schema would be the only viable solution, as Grid Control automatically has the most recent info.
    And obviously, for those who read, the OEM info is in the SYSMAN schema.
    Sybrand Bakker
    Senior Oracle DBA

  • Filling tables with XML data

    Hi!!
    I need help with this!! I don't know the right way to implement such a problem!
    Here is the scenario:
    I have a number of tables (relations) created in MS Access, and I have an XML document; what I'm looking for now is a way to fill these tables with data from the XML document.
    Here are the tables that I have (IDs are auto-numbered fields):
    Table Name: datee
    Fields:
    1- datee_ID
    2- datee_day
    3- datee_month
    4- datee_year
    Table Name: paragraph
    Fields:
    1- paragraph_ID
    2- paragraph_String
    Table Name: month
    Fields:
    1- month_ID
    2- month_String
    Table Name: notee
    Fields:
    1- notee_ID
    2- notee_to
    3- notee_from
    4- notee_heading
    5- notee_datee_day
    6- notee_datee_month
    7- notee_datee_year
    Table Name: year
    Fields:
    1- year_ID
    2- year_String
    Table Name: day
    Fields:
    1- day_ID
    2- day_String
    Table Name: to
    Fields:
    1- to_ID
    2- to_String
    Table Name: from
    Fields:
    1- from_ID
    2- from_String
    Table Name: heading
    Fields:
    1- heading_ID
    2- heading_String
    Table Name: notee_paragraph
    Fields:
    1- notee_paragraph_ID
    2- notee_ParentID
    3- notee_paragraph_String
    Here is the XML document
    <notee>
    <to>Jane</to>
    <from>Tom</from>
    <heading>Reminder</heading>
    <paragraph>Hi</paragraph>
    <paragraph>Don't be late!</paragraph>
    <datee>
    <day>13</day>
    <month>May</month>
    <year>2004</year>
    </datee>
    <to>Mark</to>
    <from>Ed</from>
    <heading>Invitation</heading>
    <paragraph>Hello</paragraph>
    <paragraph>Please come</paragraph>
    <paragraph>Take care</paragraph>
    <datee>
    <day>14</day>
    <month>March</month>
    <year>2004</year>
    </datee>
    </notee>
    I used DOM to parse the XML document, but my only problem is finding an algorithm to follow to fill in these tables. So how? I'm a bit confused!
    Any help or advice will be highly appreciated!

    For each of your elements, build a class representing the object. All your classes should have a method
    to create an instance of the class from an XML node (i.e. public static Object fromXML(Node)), and also
    a 'full constructor' (with parameters for all members of the class).
    Then, build utility classes to hold the table access.
    Short example for table DATEE:
    import org.w3c.dom.NamedNodeMap;
    import org.w3c.dom.Node;

    public class Datee {
       private int id;
       private int day, month, year;   // plain ints; a byte cannot hold a year like 2004

       public Datee(int id, int day, int month, int year) {
          this.id = id;
          this.day = day;
          this.month = month;
          this.year = year;
       }

       public static Object fromXML(Node node) {
          NamedNodeMap attributes = node.getAttributes();
          String strid = attributes.getNamedItem("id").getNodeValue();
          String strday = attributes.getNamedItem("day").getNodeValue();
          String strmonth = attributes.getNamedItem("month").getNodeValue();
          String stryear = attributes.getNamedItem("year").getNodeValue();
          int id = 0, day = 0, month = 0, year = 0;
          try {
             id = Integer.parseInt(strid);
             day = Integer.parseInt(strday);
             month = Integer.parseInt(strmonth);
             year = Integer.parseInt(stryear);
          } catch (NumberFormatException e) {
             // keep the zero defaults if a value is missing or not numeric
          }
          return new Datee(id, day, month, year);
       }
    }
    It is of course the simplest example. If an instance contains other object instances, just use their fromXML() methods to create them.
    Hope this helped,
    Regards.

  • Help on exporting Sybase IQ tables with data and importing them into another database?

    Help on exporting Sybase IQ 16 tables with data and importing them into another database?

    Hi Nilesh,
    If you have the table/index create commands (DDLs), you can create them in Developer and import the data using one of the methods below:
    Extract / Load table
    Insert location method: requires the IQ servers to be entered in the interfaces file
    Backup/Restore: copies the entire database content
    If you do not have the DDLs, you can generate them using IQ Cockpit or SCC.
    http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01773.1604/doc/html/san1288042631955.html
    http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01840.1604/doc/html/san1281564927196.html
    Regards,
    Tayeb.

  • Filling dynamic internal table with data from other internal table

    Hi Friends,
    My problem is that I have already built a dynamic internal table
    (class int_table->create), but now I want to fill it with data from another internal table.
    The dynamic table's column name and the field value of the source internal table are the same, but how do I access that column name, since I can't hard-code it?
    For example, if my WERKS field value is '8001', I want to place it under column 8001 of the dynamic table. Can anybody help me in this regard?
    Awarding points is not a problem, even for a slight hint.
    Best Regards

    Hi
    See this
    A dynamic internal table is an internal table that we create on the fly, with a flexible number of columns.
    For sample code, please look at this tutorial; hopefully it can help you.
    Check this link:
    http://www.****************/Tutorials/ABAP/DynamicInternaltable/DynamicInternalTable.htm
    Sample code:
    DATA: l_cnt(2)   TYPE n,
          l_cnt1(3)  TYPE n,
          l_nam(12),
          l_con(18)  TYPE c,
          l_con1(18) TYPE c,
          lf_mat     TYPE matnr.

    SORT it_bom_expl BY bom_comp bom_mat level.
    CLEAR: l_cnt1, <fs_dyn_wa>.

    * Loop over the component internal table
    LOOP AT it_bom_expl INTO gf_it_bom_expl.
      CLEAR: l_cnt1.

      AT NEW bom_comp.
        CLEAR: l_cnt, <fs_dyn_wa>, lf_mat.
    *   For every new BOM component the material data is moved to a temporary
    *   material table, which is used for assigning the levels and checking the count
        it_mat_temp[] = it_mat[].
    *   The component data is assigned to the field symbol, which is checked
    *   against the field of the dynamic internal table, and the component
    *   number is passed to the dynamic internal table field value
        ASSIGN COMPONENT c_comp_list OF STRUCTURE <fs_dyn_wa> TO <fs_check>.
        <fs_check> = gf_it_bom_expl-bom_comp.
      ENDAT.

      AT NEW bom_mat.
        CLEAR l_con.
      ENDAT.

      lf_mat = gf_it_bom_expl-bom_mat.

    * Loop over the temporary table and over the dynamic internal table,
    * reading it line by line into the work area; the material is assigned
    * to a field symbol which is then checked and used
      LOOP AT it_mat_temp.
        l_nam = c_mat.
        l_cnt1 = l_cnt1 + 1.
        CONCATENATE l_nam l_cnt1 INTO l_nam.
        LOOP AT <fs_dyn_table2> ASSIGNING <fs_dyn_wa2>.
          ASSIGN COMPONENT l_nam OF STRUCTURE <fs_dyn_wa2> TO <fs_xy>.
        ENDLOOP.
        IF <fs_xy> = lf_mat.
          CLEAR lf_mat.
          l_con1 = l_con.
        ENDIF.
    *   If a material exists for the component, its level is assigned to the
    *   field symbol that points at the corresponding field of the dynamic
    *   internal table
        IF <fs_xy> = gf_it_bom_expl-bom_mat.
          ASSIGN COMPONENT l_nam OF STRUCTURE <fs_dyn_wa> TO <fs_check>.
          CLEAR l_con.
          MOVE gf_it_bom_expl-level TO l_con.
          CONCATENATE c_val_l l_con INTO l_con.
          CONDENSE l_con NO-GAPS.
          IF l_con1 NE space.
            CONCATENATE l_con1 l_con INTO l_con SEPARATED BY c_comma.
            CLEAR l_con1.
            l_cnt = l_cnt - 1.
          ENDIF.
          <fs_check> = l_con.
          l_cnt = l_cnt + 1.
        ENDIF.
      ENDLOOP.

      AT END OF bom_comp.
    *   At the end of every BOM component the count is passed to the count
    *   field of the dynamic internal table, and the work area is inserted
    *   into the dynamic table
        ASSIGN COMPONENT c_count OF STRUCTURE <fs_dyn_wa> TO <fs_check>.
        <fs_check> = l_cnt.
        INSERT <fs_dyn_wa> INTO TABLE <fs_dyn_table>.
      ENDAT.
    ENDLOOP.
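    For the specific case in your question (placing a value under the column named after the WERKS value, e.g. '8001'), a minimal sketch, assuming the dynamic table's work area is already assigned to <fs_dyn_wa>, a column with that name exists, and wa_data is a hypothetical work area holding the source row:

    DATA: lv_col TYPE string.
    FIELD-SYMBOLS: <fs_cell> TYPE any.

    lv_col = wa_data-werks.          " use the field value itself as the component name, e.g. '8001'
    ASSIGN COMPONENT lv_col OF STRUCTURE <fs_dyn_wa> TO <fs_cell>.
    IF sy-subrc = 0.
      <fs_cell> = wa_data-value.     " place the value under that column
    ENDIF.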
    Reward if useful
    Anji

  • Fill a table with data coming from an RFC

    Hello everyone:
    I've followed the Weblog "How many lines of java code did i write for a simple Web Dynpro?"
    /people/durairaj.athavanraja/blog/2004/10/17/how-many-lines-of-java-code-did-i-write-for-a-simple-web-dynpro
    I've called an RFC and created a table with data coming from it (which is also a table). My question is: in this table there's a field named "UserType", and there are two possible values for this field:
    "userA"
    "userB"
    How can I get the table to show me only the "userA" records? The RFC returns all of the users, but when filling the table, can I put an if-else somewhere in my code?
    Thanks a lot
    Alejandro

    Hi Alejandro,
    Referring to the link provided "The logic of the filter process is not implemented in Web Dynpro. The application developer must implement the action to be executed."
    We would have to implement the action onFilter in the controller implementation. Ideally, we fill the data retrieved from the backend into a List (java.util.List) (this could be done on init of the view) and then subset the list to the entries meeting the criteria in the action handler (say
    onActionFilterData(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent)).
    Having done this, you may bind the resulting list back to the node (shown in the table).
    Regards,
    Chaitanya

  • Moving tables with data from one client to another

    Hello friends
    I have 2 clients 001 and 002.
    I have created a table with lots of data on client 001.
    I have to get the same table on 002 also with the same data.
    Is there an easier way of transferring this data from one table to another?
    I would appreciate any feedback on this.
    I know that mappings etc. can be done using the export and import of a tpz file. So I want to know if there is a similar way to transfer a table with data.
    THanks
    Ram

    If you do not have the client field in your table, then the table is cross-client.
    If you have the client field filled with 001, then you need to create the same entries with client 002; I don't think you can do this with a transport.
    You have to write an ABAP program that reads and writes the data.
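    A minimal sketch of such a program (ZMYTABLE is a placeholder for your client-dependent table; this assumes cross-client access is allowed in the system):

    REPORT zcopy_client_data.

    DATA: lt_data TYPE STANDARD TABLE OF zmytable.
    FIELD-SYMBOLS: <ls_data> TYPE zmytable.

    " Read the rows stored under client 001
    SELECT * FROM zmytable CLIENT SPECIFIED
      INTO TABLE lt_data
      WHERE mandt = '001'.

    " Re-key the rows for client 002
    LOOP AT lt_data ASSIGNING <ls_data>.
      <ls_data>-mandt = '002'.
    ENDLOOP.

    " Insert or update the rows in client 002
    MODIFY zmytable CLIENT SPECIFIED FROM TABLE lt_data.
    COMMIT WORK.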

  • How to move tables with data from one client to another

    Hello friends
    I have 2 clients 001 and 002.
    I have created a table with lots of data on client 001.
    I have to get the same table on 002 also with the same data.
    Is there an easier way of transferring this data from one table to another?
    I would appreciate any feedback on this.
    I know that mappings etc. can be done using the export and import of a tpz file. So I want to know if there is a similar way to transfer a table with data.
    THanks
    Ram

    hi
    thanks for the response
    We are working on a sandbox system for demo purposes.
    Actually we have two different groups of people using two different clients created on the same server.
    Our group is using client 001 and another group is using client 002.
    We don't want the changes that one group makes to the table to affect the other.
    Our group has created a Z-table and entered a considerable amount of data.
    However, the other group also needs to create the same table with the same data.
    So to avoid double work, we were trying to see if there is a way to copy the table with data created on 001 to 002.
    Any suggestions / feedback will be greatly appreciated.
    Thanks
    Ram

  • How to pass internal table with data in ABAP OO

    Hi experts ,
    Here is the problem...
    I created a screen 100 and use an internal table to get data from the database; pressing the button on screen 100 calls screen 200.
    I use ABAP OO code in screen 200, and I want to keep the internal table with the data that I got in the screen 100 part, but it seems ABAP OO only lets me create a new internal table and fetch the data from the database again. Does that mean I need to create a new table to save the data from the internal table? Or is there another solution, something like EXPORT/IMPORT or a memory ID...?
    PS: the internal table in screen 100 is declared in the global area, at the beginning of the code.
    Please give me some advice, thanks a lot!
    Regards ,
    Claire

    I have already figured out how to do it; here is my code:
    METHODS fill_tree IMPORTING itab2 LIKE itab1.
    CALL METHOD me->fill_tree EXPORTING itab2 = itab1.
    In METHOD fill_tree, declare:
    DATA: itab3 TYPE STANDARD TABLE OF itab.
    itab3[] = itab2[].
    It is useful to me, thanks a lot.
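    As a fuller sketch of the same idea (all names here are hypothetical; zline stands in for the line type of the report's global table itab1):

    CLASS lcl_tree DEFINITION.
      PUBLIC SECTION.
        TYPES: ty_itab TYPE STANDARD TABLE OF zline WITH DEFAULT KEY.
        METHODS fill_tree IMPORTING it_data TYPE ty_itab.
    ENDCLASS.

    CLASS lcl_tree IMPLEMENTATION.
      METHOD fill_tree.
        DATA: lt_local TYPE ty_itab.
        lt_local[] = it_data[].   " work on a local copy of the screen 100 data
        " ... build the tree from lt_local ...
      ENDMETHOD.
    ENDCLASS.

    " Called from the screen 200 logic, passing the global table from screen 100
    DATA: go_tree TYPE REF TO lcl_tree.
    CREATE OBJECT go_tree.
    CALL METHOD go_tree->fill_tree EXPORTING it_data = itab1.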

  • Using expdp to export a mix of tables with data and tables without data

    Hi,
    I would like to create a .dmp file using expdp, exporting a set of tables with data and another set without data. Is there a way to do this in a single .dmp file? For example, I want all the tables in a schema with data, but for the fact tables in that schema, I only want the fact table objects, not the data. I thought it might be easier to create two separate .dmp files, one for each scenario, but would be nice to have one .dmp file that satisfies my requirement. Any help is appreciated.
    Thanks,
    -Rodolfo
    Edited by: user6902559 on May 11, 2010 12:05 PM

    You could do this with where clauses. Let's say you have 10 tables to export, 5 with data and 5 without data. I would do it like this:
    tab1_w_data
    tab2_w_data
    tab3_w_data
    tab4_w_data
    tab5_w_data
    tab1_wo_data
    tab2_wo_data
    tab3_wo_data
    tab4_wo_data
    tab5_wo_data
    I would make one generic query:
    query="where rownum = 0"
    and I would make 5 specific queries
    query=tab1_w_data:"where rownum > 0"
    query=tab2_w_data:"where rownum > 0"
    query=tab3_w_data:"where rownum > 0"
    query=tab4_w_data:"where rownum > 0"
    query=tab5_w_data:"where rownum > 0"
    The first query will be applied to all tables that don't have their own specific query, and it will export no rows; each of the next 5 will apply to its corresponding table and export all rows.
    Dean

  • What is the best way to copy tables with data from development to Prod?

    Dear all,
    We have Oracle tables with data in a development server; I would like to know if there is any ‘easy’ and ‘direct’ way to copy them to the production server.
    I think import and export would be the best way. Are there any other alternatives?
    Thanks.

    There are a number of methods you could use:
    • Export and import would work, or their 10g+ equivalent, Data Pump.
    • You could create a database link between the two databases and use the SQL*Plus COPY command to transfer data on a table-by-table basis (probably quite laborious unless there are only a few tables).
    • You could use transportable tablespaces (there are some restrictions - check the documentation: http://download.oracle.com/docs/cd/E11882_01/server.112/e10595/tspaces013.htm#sthref1632).
    • You could use RMAN to clone the development database (assuming the prod database hasn't been used yet and that there's nothing in it you need to keep).
    • You could create the prod database as a standby copy of the dev database.
    Using transportable tablespaces would be much faster than Data Pump or import/export, depending on how much data there is.

  • Creating a time channel in the data portal and filling it with data - Is there a more efficient way than this?

    I currently have a requirement to create a time channel in the data portal and subsequently fill it with data. I've shown below how I am currently doing it:
    Time_Ch = ChnAlloc("Time channel", 271214, 1, , "Time", 1, 1)   'Allocate time channel
    For intLoop = 1 to 271214
      ChD(intLoop, Time_Ch(0)) = CurrDateTimeReal                   'Create time value
    Next
    I understand that the function to create and allocate memory for the time channel is extremely quick. However the time to store data in the channel afterwards is going to be highly dependent on the length I have assigned to the Time_Ch. In my application the length of Time_Ch is variable but could easily be in the order of 271214 or higher. Under such circumstances the time taken to fill Time_Ch is quite considerable. I am wondering whether this is the most appropriate way of doing things or whether there is a more efficient way of creating a time channel and filling it.
    Thanks very much for any help.
    Regards
    Matthew

    Hi Matthew,
    You are correct that there is a more efficient way to do this.  I'm a little confused about your "CurrDateTimeReal" assignment-- is this a constant?  Most people want a Time channel that counts up linearly in seconds or fractions of a second over the duration of the measurement.  But that looks like you would assign the same time value to all the rows of the new Time channel.
    If you want to create a "normal" Time channel that increases at a constant rate, you can use the ChnGenTime() function:
    ReturnValue = ChnGenTime(TimeChannel, GenTimeUnit, GenTimeXBeg, GenTimeXEnd, GenTimeStep, GenTimeMode, GenTimeNo)
    If you really do want a Time channel filled with all the same values, you can use the ChnLinGen() function and simply set the GenXBegin and GenXEnd parameters to be the same value:
    ReturnValue = ChnLinGen(TimeChannel, GenXBegin, GenXEnd, XNo, [GenXUnitPreset])
     In both cases you can use the Time channel you've already created (which as you say executes quickly) and point the output of these functions to that Time channel by using the Group/Channel syntax of the Time channel you created for the first TimeChannel parameter in either of the above functions.
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments

  • How to transport/move a table with data from development to Test to Production

    Hi,
    How do I transport/move a table with data from development to Test to Production? Exporting/importing a Delivery Unit only moves the structure, not the data.
    Reg
    Sri

    Hi Sri,
    You cannot transport data via the transport route in HANA; you can only transport code changes/structure via a DU. For data movement, you either have to do an export/import from a flat file or replication from a source system to HANA.
    Thanks Much,
    Abhishek

  • Rename the existing table with date suffix

    Hi,
    I'm trying to rename a table with a date suffix at the end of the table name, and drop the date-suffixed tables that are older than 7 days. For that I have the SQL below (I have not included the drop syntax in it).
    I'm not able to rename with the date suffix in the SQL below; I get a syntax error at '+':
    DECLARE
    @TPartitionDate date
    IF EXISTS (SELECT * FROM sysobjects WHERE Name = 'IIS_4')
    BEGIN
    SELECT
    @TPartitionDate = MAX(PartitionDate)
    FROM PartitionLog (NOLOCK)
    EXEC sp_rename 'IIS_4','IIS_4_'+ @TPartitionDate
    END

    CREATE TABLE Test (sno INT)

    DECLARE @TPartitionDate DATE = GETDATE()
    DECLARE @a VARCHAR(200)

    IF EXISTS (SELECT * FROM sysobjects WHERE Name = 'Test')
    BEGIN
        SELECT @a = 'TEST_' + CAST(@TPartitionDate AS VARCHAR(10))
        EXEC sp_rename 'TEST', @a
    END

    DROP TABLE [test_2015-04-23]
    Hope it Helps!!

  • Trying to populate a table with data from WebRowset

    Hi,
    I want to be able to populate my tables with data from WebRowSets that have been saved to files. Everything goes well until I get to acceptChanges(), at which point I get a NullPointerException.
    Here's the code...
    WebRowSet wrs = new WebRowSetImpl();
    FileReader reader = new FileReader(inputFile);
    wrs.readXml(reader);
    wrs.beforeFirst();
    CachedRowSet crs = new CachedRowSetImpl();
    crs.setSyncProvider("com.sun.rowset.providers.RIXMLProvider");
    crs.populate(wrs);
    crs.beforeFirst();
    crs.acceptChanges(con);
    Results in...
    java.lang.NullPointerException
    at com.sun.rowset.CachedRowSetImpl.acceptChanges(CachedRowSetImpl.java:867)
    at com.sun.rowset.CachedRowSetImpl.acceptChanges(CachedRowSetImpl.java:919)
    I'm using Java 1.5_02. I looked at the source code for CachedRowSetImpl, and the only thing I could think of is that maybe "provider.getRowSetWriter()" in the following snippet is returning null....
    public void setSyncProvider(String s) throws SQLException {
        provider = SyncFactory.getInstance(s);
        rowSetReader = provider.getRowSetReader();
        rowSetWriter = (TransactionalWriter) provider.getRowSetWriter();
    }
    Any ideas?? Thanks!

    I have the same problem after setting com.sun.rowset.providers.RIXMLProvider.
    Looks like a bug to me.
    By the way, why are you creating a new CachedRowSet and populating it with a WebRowSet (which extends CachedRowSet)?
