Front End not reflecting previously deleted entry while entering into Staging table

Let's say I have a table named 'Sales' as my source. It contains the columns Sal_ID and Sal_DESC.
There is a Front End entity created as Master Sales in the Model project. The corresponding staging table in the database is 'stg.Sales_Leaf'.
Now, I have entered data into stg.Sales_Leaf from the Sales table, mapping Sal_ID and Sal_DESC to the 'CODE' and 'NAME' values, and my ImportType is '0'.
As the Sales table contains 23 rows, these are imported into stg.Sales_Leaf with ImportType '0'.
I am able to see them in the Front End as well.
The problem starts now. I delete some rows directly from the Front End, the ones with CODE values 22 and 23.
These get deleted from the Front End view, but then I load the same 23 rows into stg.Sales_Leaf again with the same data (after truncating it).
I should be able to see 23 records in the Front End, but it is showing only 21 records.
The two records with CODE values 22 and 23 are not reflected in the Front End.
I think it is because the 'mdm.<table_code>' table associated with stg.Sales_Leaf shows Status_ID as 2 for them.
Please tell me how I can get this data reflected in the Front End.
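For reference, the check being described can be sketched like this; the placeholder <table_code> is kept from the post, and the exact columns of the generated mdm table may differ by version:

-- Hypothetical check of the entity table mentioned above; replace
-- <table_code> with the generated table name behind the Master Sales entity.
SELECT Code, Name, Status_ID
FROM   mdm.<table_code>
WHERE  Code IN ('22', '23');   -- members deleted from the Front End show Status_ID = 2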

All error numbers between -20000 and -20999 are not Oracle errors but rather application errors coded by whoever wrote your software.
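For illustration, errors in that range are raised by the application's own PL/SQL through RAISE_APPLICATION_ERROR; the error number and message below are made up:

BEGIN
  -- Application-defined error: any number between -20000 and -20999 is allowed.
  RAISE_APPLICATION_ERROR(-20123, 'Order is locked by another user');
END;
/
-- ORA-20123: Order is locked by another user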
As previously suggested ... see your DBA or manager.

Similar Messages

  • Source system appearing in BI folder : Not able to delete entry RSBASIDOC

    Hi all,
    I am creating a source system connection from ECC to the BI system.
    But the source system keeps appearing in the BI folder. I found note 1087980 and tried deleting the entry with SRCTYPE = M from the RSBASIDOC table with FM RSAP_BIW_DISCONNECT, but the system shows the error message OTHER_ERROR and the entry does not get deleted.
    For this, SAP recommends checking for a valid RFC connection between the ECC and BI systems; I have checked it and the RFCs are working fine as well.
    Please help.
    Regards,
    Akash.

    Follow these steps:
    In the source system, go to SE37 and launch the FM RSAP_BIW_DISCONNECT.
    You will find the BIW and OLTP logical systems in RSBASIDOC.
    So run the FM.
    If you get OTHER_ERROR, you may have to check the RFC connection. (If the source system has been refreshed, you have to create an RFC to PROD.)
    You typically encounter this error (Not able to delete entry RSBASIDOC) when you try to restore the DataSource: in that case it is in fact the IDENTIFIER (TSPREFIX) that creates the problem (so you have to delete the relevant entry from the table).
    Ciao,
    Cristian

  • Problem while inserting into a table which has ManyToOne relation

    Problem while inserting into a table (Files) which has a ManyToOne relation with another table (Folders), involving an attribute that is part of both the primary key and the foreign key, in JPA 1.0.
    Relevant code
    Entities:
    public class Files implements Serializable {
        @EmbeddedId
        protected FilesPK filesPK;
        private String filename;
        @JoinColumns({
            @JoinColumn(name = "folder_id", referencedColumnName = "folder_id"),
            @JoinColumn(name = "uid", referencedColumnName = "uid", insertable = false, updatable = false)})
        @ManyToOne(optional = false)
        private Folders folders;
    }

    public class FilesPK implements Serializable {
        private int fileId;
        private int uid;
    }

    public class Folders implements Serializable {
        @EmbeddedId
        protected FoldersPK foldersPK;
        private String folderName;
        @OneToMany(cascade = CascadeType.ALL, mappedBy = "folders")
        private Collection<Files> filesCollection;
        @JoinColumn(name = "uid", referencedColumnName = "uid", insertable = false, updatable = false)
        @ManyToOne(optional = false)
        private Users users;
    }

    public class FoldersPK implements Serializable {
        private int folderId;
        private int uid;
    }

    public class Users implements Serializable {
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Integer uid;
        private String username;
        @OneToMany(cascade = CascadeType.ALL, mappedBy = "users")
        private Collection<Folders> foldersCollection;
    }
    I left out the @Basic & @Column annotations for the sake of brevity.
    EJB method:
    public void insertFile(String fileName, int folderID, int uid) {
        FilesPK pk = new FilesPK();
        pk.setUid(uid);
        Files file = new Files();
        file.setFilename(fileName);
        file.setFilesPK(pk);
        FoldersPK folderPk = new FoldersPK(folderID, uid);
        // My understanding is that it should automatically handle folderId in the files table,
        // but it is not…
        file.setFolders(em.find(Folders.class, folderPk));
        em.persist(file);
    }
    It is giving this error:
    Internal Exception: java.sql.SQLException: Field 'folderid' doesn't have a default value
    Error Code: 1364
    Call: INSERT INTO files (filename, uid, fileid) VALUES (?, ?, ?)
        bind => [hello.txt, 1, 0]
    It is not even considering folderId while inserting into the DB.
    However, it works fine when I add a folderId variable to the Files entity and change insertFile like this:
    public void insertFile(String fileName, int folderID, int uid) {
        FilesPK pk = new FilesPK();
        pk.setUid(uid);
        Files file = new Files();
        file.setFilename(fileName);
        file.setFilesPK(pk);
        file.setFolderId(folderID); // added line
        FoldersPK folderPk = new FoldersPK(folderID, uid);
        file.setFolders(em.find(Folders.class, folderPk));
        em.persist(file);
    }
    My question is: is this behavior expected, or is it a bug?
    Is it required to add the "column_name" variable separately even when the entity has a ManyToOne reference to the foreign entity?
    I used MySQL 5.1 for the database, then generated the entities using TopLink, JPA 1.0, and GlassFish v2.1.
    I have also tested this using EclipseLink and got the same error.
    Please provide some pointers.
    Thanks

    Hello,
    What version of EclipseLink did you try? This looks like bug https://bugs.eclipse.org/bugs/show_bug.cgi?id=280436 that was fixed in EclipseLink 2.0, so please try a later version.
    You can also try working around the problem by making both fields writable through the reference mapping.
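    A minimal sketch of that workaround, reusing the mappings from the post; marking the shared uid column read-only on the embedded id (instead of on the join column) is an assumption to verify against your TopLink/EclipseLink version:
    import java.io.Serializable;
    import javax.persistence.*;

    // Sketch only (each class would live in its own file, as in the original post).
    // Both join columns are writable, so the foreign key values are written
    // through the relationship; the shared uid column is made read-only on the
    // embedded id instead.
    @Entity
    public class Files implements Serializable {
        @EmbeddedId
        protected FilesPK filesPK;

        @ManyToOne(optional = false)
        @JoinColumns({
            @JoinColumn(name = "folder_id", referencedColumnName = "folder_id"),
            @JoinColumn(name = "uid", referencedColumnName = "uid")}) // now writable
        private Folders folders;
    }

    @Embeddable
    public class FilesPK implements Serializable {
        @Column(name = "fileid")
        private int fileId;

        @Column(name = "uid", insertable = false, updatable = false) // read-only here
        private int uid;
    }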
    Best Regards,
    Chris

  • How does the EDB handle deleted items and why am I only able to recover some and not all previously deleted emails?

    Hello,
    I have been unable to locate information regarding how the EDB handles deleted files. I am able to recover deleted emails from the EDB; however, there are numerous emails that I know have been deleted that were not recovered during the recovery process.
    Is this because the EDB dumps deleted emails once it reaches a certain number of entries? Or is it more likely that the deleted items are corrupted and thus not recoverable? Thank you for your time and I look forward to your responses.
    The product in question is Microsoft Exchange Server 2007.

    Great information from the other posters; one other possible explanation is that the users did a HARD DELETE. More information on this below.
    Pre-Exchange 2010, Dumpster 1.0 worked as follows:
    1. When a user does a normal delete of an item via Outlook or OWA, it gets sent to the Deleted Items folder.
    2. If the user then empties the Deleted Items folder, those items are marked with the ptagDeletedOnFlag attribute, which in essence hides them from the user's view.
    3. At this point the user can use the Recover Deleted Items function in Outlook to recover those items, as long as the items were NOT purged using the Recover Deleted Items option and have not yet passed the deleted item retention period.
    NOTE: Every night a maintenance process looks through the DB and examines items that have been deleted. Items that have met or exceeded the deleted item retention period are purged from the system and are completely non-recoverable. More about this at the end of my post.
    4. However, if the end user deletes an item from the Inbox or another folder in Outlook by using Shift + Delete, aka a hard deletion, the item is left in its original location and gets marked with the ptagDeletedOnFlag attribute, which again just hides it from view.
    5. By default you can only use the Recover Deleted Items option on the Deleted Items folder. However, you can make it possible to recover hard-deleted items from other folders. More on that here: http://support.microsoft.com/kb/246153
    6. The problem with this method, though, IMO is that you have to know where to look, and that can be rather time consuming.
    7. If you want a more elegant method to resolve this and are open to using 3rd-party utilities, check out Lucid8's DigiScope (http://www.lucid8.com/product/digiscope.asp), which will allow you to open any offline copy of the DB from a forensic point of view. To solve your specific issue you can either turn on a filter to show ONLY hard-deleted items, OR you can search the entire store and all mailboxes for just deleted items, and then view or export them to PST or MSG format. You can download the product and obtain a 30-day demo license, which will allow you to find any hard-deleted items; however, to view or export them you would need to purchase a license.
    NOTE: When a database is backed up or copied it is an "offline" database, and while it is offline the deleted item retention period is not an issue, because the nightly online maintenance process can only act on a database that is mounted on an Exchange server. So for this reason, if you want to recover the most data possible, use offline copies/backups of the DB with a 3rd-party utility like DigiScope.
    NOTE 2: Exchange 2010 and 2013 use Dumpster 2.0, which is an entirely different beast and I will not go into it in this post since it's nearly 1 AM; however, DigiScope can expose that information as well.
    Search, Recover, & Extract Mailboxes, Folders, & Email Items from Offline Exchange Mailbox and Public Folder EDBs and Live Exchange Servers, or Import/Migrate direct from Offline EDB to Any Production Exchange Server, even cross version, i.e. 2003 --> 2007 --> 2010 --> 2013, with Lucid8's DigiScope.

  • Master data modified at SQL end not reflecting in Info object master data

    Hi All,
    We have done some data modification at the SQL end, but the modified data is not reflected in the InfoObjects' master data.
    For example:
    We have split/skill numbers like 1-1000 in the database, and we made some modifications to particular split/skill numbers and uploaded that data successfully at the SQL end. Now the problem is that the modified data is not reflected in the split/skill InfoObjects; they still show the old data, not the modified values. Can you please explain how to get the modified data reflected on the BI side?
    Regards,
    Vamshi D Krishna

    Hi Vamshi,
    What is the type of data load to the master data InfoObject (full or delta)? If you know the affected split/skill numbers, try to load them with a repair full request up to the PSA and check the data. If the data is correct, you can load it to the InfoObjects and the data will be overwritten. After loading the data into the InfoObject, go to RSA1 -> InfoObjects -> right-click and select "Activate Master Data" (this is not a mandatory step; if there is any M-version and A-version data, it will be adjusted), and the updated data will be available in the InfoObject master data.
    Regards,
    Daya Sagar

  • EHP installed front end not started

    Hi,
    I started the EHPI tool on a Linux box through Xmanager; the messages below came up and then http://linuxsap1:4239/ was started.
    I also saved the sdtdsu.jnlp file on my system, but the front end screen is not coming up?
    linuxsap1:sedadm 58> STARTUP
    Checking if C++ runtime is installed ...
    Starting from /db2/SED/EHPI/sdt...
    Checking username "sedadm" for compliance...
    Feb 17, 2012 9:36:11 PM [Info]: *************************
    Feb 17, 2012 9:36:11 PM [Info]: Starting Server
    Feb 17, 2012 9:36:11 PM [Info]: Reading server configuration.
    Feb 17, 2012 9:36:11 PM [Info]: Reading service configuration DSUService.
    Feb 17, 2012 9:36:11 PM [Info]: Configuring LogManager ...
    Feb 17, 2012 9:36:11 PM [Info]: ************************************************
    Feb 17, 2012 9:36:11 PM [Info]: Starting SL Controller listening on port 4241 ..
    Feb 17, 2012 9:36:11 PM [Info]: Starting StorageService ...
    Feb 17, 2012 9:36:11 PM [Info]: Initializing SecurityManager ...
    Feb 17, 2012 9:36:12 PM [Info]: Server certificate fingerprint is B6 48 B9 6D F3 3A 2C FC E7 55 48 E6 A5 CE ED 36
    Feb 17, 2012 9:36:12 PM [Info]: Configuring HTTPManager ...
    Feb 17, 2012 9:36:12 PM [Info]: Starting WebstartService ...
    Feb 17, 2012 9:36:12 PM [Info]: Starting RoleService ...
    Feb 17, 2012 9:36:12 PM [Info]: Starting AlertService ...
    Feb 17, 2012 9:36:12 PM [Info]: Starting NotesService ...
    Feb 17, 2012 9:36:13 PM [Info]: Starting ProcessService ...
    Feb 17, 2012 9:36:13 PM [Info]: MIDService switched off.
    Feb 17, 2012 9:36:13 PM [Info]: Starting FileService ...
    Feb 17, 2012 9:36:13 PM [Info]: LogService switched off.
    Feb 17, 2012 9:36:13 PM [Info]: Starting MailService ...
    Feb 17, 2012 9:36:13 PM [Info]: Starting services ...
    Feb 17, 2012 9:36:13 PM [Info]: Starting service "DSUService" ...
    Feb 17, 2012 9:36:13 PM [Info]: Service "DSUService" started
    Feb 17, 2012 9:36:13 PM [Info]: Services started.
    Feb 17, 2012 9:36:13 PM [Info]: Starting HTTP server listening on port 4239 ...
    Feb 17, 2012 9:36:13 PM [Info]: HTTP server started.
    Feb 17, 2012 9:36:13 PM [Info]: SL Controller started.
    Thanks
    Siva

    sdtdsu.jnlp file also I saved in my system but front end screen not coming?
    You don't have to save it; you need to run it in a Java-enabled web browser. It is hard to diagnose, as the problem can be in various areas. Try another client, another browser, check the installed Java versions, etc.
    Cheers Michael

  • Taking More Time while inserting into the table (With foreign key)

    Hi All,
    I am facing a problem while inserting the values into the master table.
    The problem:
    Table A -- User Master Table (Reg No, Name, etc)
    Table B -- Transaction Table (Foreign key reference with Table A).
    While inserting data into Table B, I also need to insert the reg no into Table B, which is mandatory. I followed the logic mentioned in the SRDemo.
    While inserting, we need to query Table A first to have the values in TableABean.java:
    final TableA tableA = (TableA) uow.executeQuery("findUser", TableA.class, regNo);
    Then we need to create the instance for TableB:
    TableB tableB = (TableB) uow.newInstance(TableB.class);
    tableB.setID(bean.getID());
    tableA.addTableB(tableB); // this is to insert the regNo of TableA into TableB. This line executes the query "select * from TableB where RegNo = <tableA.getRegNo>".
    This query takes too much time if there are many rows in TableB for that particular registration no. Because of this it takes longer to insert into TableB.
    For example: TableA regNo 101, which has few entries in TableB, means inserting a record takes less than 1 sec;
    regNo 102, which has more entries in TableB, means inserting a record takes more than 2 sec.
    There is a time delay for different users when they enter transactions in TableB.
    I need to avoid this, since in the future it will take even more time, from 2 sec to 10 sec, if the volume of data increases.
    Please help me to resolve this issue; I am facing it now in production.
    Thanks & Regards
    VB

    Hello,
    Looks like you have a 1:M relationship from TableA to TableB, with a 1:1 back pointer from TableB to TableA. If triggering the 1:M relationship is causing you delays that you want to avoid there might be two quick ways I can see:
    1) Don't map it. Leave the TableA->TableB 1:M unmapped, and instead just query for the relationship when you do need it. This means you do not need to call tableA.addTableB(tableB), and instead only need to call tableB.setTableA(tableA), so that the TableB->TableA relation gets set. It might not be the best option, but it depends on your application's usage. It does allow you to potentially page the TableB results or add other query performance options when you do need the data, though.
    2) You are currently using lazy loading for the TableA->TableB relationship - if it is untriggered, don't bother calling tableA.addTableB(tableB); instead you only need to call tableB.setTableA(tableA). This of course requires using the TopLink API to a) verify the collection is an IndirectCollection type, and b) check that it hasn't been triggered. If it has been triggered, you will still need to call tableA.addTableB(tableB), but it won't result in a query. Check out the oracle.toplink.indirection.IndirectContainer class and its isInstantiated() method. This can cause problems in highly concurrent environments, though, as other threads may have triggered the indirection before you commit your transaction, so that the A->B collection is not up to date - this might require refreshing the TableA if so.
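    A minimal sketch of that check, assuming hypothetical accessor names getTableBCollection(), setTableA() and addTableB() on the generated entity classes (adjust to your own):
    import oracle.toplink.indirection.IndirectContainer;

    // Only touch the (possibly large) TableA->TableB collection if TopLink has
    // already instantiated it; otherwise setting the back pointer is enough.
    void linkWithoutTriggering(TableA tableA, TableB tableB) {
        tableB.setTableA(tableA);                     // 1:1 back pointer, always set
        Object bs = tableA.getTableBCollection();
        if (bs instanceof IndirectContainer
                && !((IndirectContainer) bs).isInstantiated()) {
            return;                                   // lazy and untriggered: no query needed
        }
        tableA.addTableB(tableB);                     // already in memory, keep it consistent
    }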
    Change tracking would probably be the best option to use here, and is described in the EclipseLink wiki:
    http://wiki.eclipse.org/Introduction_to_EclipseLink_Transactions_(ELUG)#Attribute_Change_Tracking_Policy
    Best Regards,
    Chris

  • How to hide password while entering into second page

    I have the following script, but while going from the 1st page to the 2nd page, the ID and password are visible in the address bar....
    <HEAD>
    <TITLE>T&E Profiling</TITLE>
    <STYLE>
    .headertxt {
    font-family : Arial;
    color : #FFFFFF;
    font-size : 14pt;
    font-weight : bold;
    font-style : normal;
    }
    .boldtxt {
    font-family : Arial;
    color : #000000;
    font-size : 10pt;
    font-weight : bold;
    font-style : normal;
    }
    .normaltxt {
    font-family : Arial;
    color : #000000;
    font-size : 10pt;
    font-weight : normal;
    font-style : normal;
    }
    .titletxt {
    font-family : Arial;
    color : #000000;
    font-size : 16pt;
    font-weight : normal;
    font-style : normal;
    }
    </STYLE>
    <SCRIPT>
    function GoOn () {
      if (document.userinfo.username.value != null && document.userinfo.username.value != "") {
        var username = escape(document.userinfo.username.value);
        var password = escape(document.userinfo.password.value);
        top.document.location = "teuser.TE_SSO_PAPP.p_val_user?username=" + username + "&password=" + password;
      } else {
        alert("Please, fill username text box");
      }
    }
    </SCRIPT>
    </HEAD>
    <BODY onload="document.userinfo.username.focus();">
    <FORM NAME="userinfo" ONSUBMIT="return false;">
    <CENTER>
    <BR>
    <FONT class="titletxt">WELCOME TO THE TRAVEL & EXPENSE APPLICATION</FONT>
    <BR>
    <BR>
    <TABLE border="0" width="50%" cellpadding="0" cellspacing="0">
    <TR><TD ALIGN=left BGCOLOR="#000050"><IMG SRC="/tende/images/1x1.gif" WIDTH=1 HEIGHT=1 BORDER=0></TD></TR>
    </TABLE><br>
    <BR>
    <BR>
    <TABLE BORDER="1" CELLPADING=0 CELLSPACING=0 BGCOLOR="#EEEEEE">
    <tr>
    <TD><FONT CLASS="titletxt">USERNAME</FONT></td>
         <td>
    <INPUT NAME="username" TYPE="text" VALUE="" onChange="javascript:this.value=this.value.toUpperCase();">
    </TD></TR>
         <tr>
         <td><FONT CLASS="titletxt">PASSWORD</FONT></td>
         <td><INPUT NAME="password" TYPE="password" VALUE=""></td></tr>
    </TABLE>
    <BR><INPUT NAME="enter" TYPE="BUTTON" VALUE="Enter" OnClick="GoOn();">
    </CENTER>
    </FORM>
    </BODY>
    </HTML>

    This is not a Java question... but you should post the variables using a form instead of setting the location in JavaScript.
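    For example, a minimal sketch of that suggestion, reusing the field names and target from the page above; whether teuser.TE_SSO_PAPP.p_val_user accepts POSTed parameters is an assumption to verify:
    <FORM NAME="userinfo" METHOD="POST" ACTION="teuser.TE_SSO_PAPP.p_val_user">
      <!-- The values travel in the request body instead of the URL, so they no
           longer show up in the address bar (serve the page over HTTPS so they
           are not readable on the wire either). -->
      USERNAME <INPUT NAME="username" TYPE="text">
      PASSWORD <INPUT NAME="password" TYPE="password">
      <INPUT TYPE="submit" VALUE="Enter">
    </FORM>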

  • Error while entering values in table.

    Hi All,
    I have a quantity field in a table with data type DEC, length 13, and 5 decimal places.
    But when I enter 255 in the table, it automatically changes to .255, but it should be 255.
    Thanks and Regards,
    Amanpreet Sehgal

    Hi,
    check this:
    For calculations in business applications, use packed numbers. The program attribute Fixed point arithmetic affects calculations using packed numbers.
    If the program attribute Fixed point arithmetic is not set, type P fields are interpreted as integers without decimal places. The decimal places that you specify in the DECIMALS addition of the TYPES or DATA statement only affect how the field is formatted in the WRITE statement.
    DATA: PACK TYPE P DECIMALS 2.
    PACK = '12345'.
    WRITE PACK.
    If the program attribute Fixed point arithmetic is not set, the output is as follows:
    123.45
    If the program attribute Fixed point arithmetic is set, the output is as follows:
    12,345.00
    If the Fixed point arithmetic attribute is set, the decimal places are also taken into account in arithmetic operations. Calculations with packed numbers in ABAP use the same arithmetic as a pocket calculator. Intermediate results are calculated using up to 31 digits (before and after the decimal point). You should therefore always set the Fixed point arithmetic attribute when you use type P fields.
    DATA: PACK TYPE P.
    PACK = 1 / 3 * 3.
    WRITE PACK.
    If you have not set the Fixed point arithmetic attribute, the result is 0, since the calculation is performed using integer accuracy, and the result is therefore rounded internally to 0.
    If the program attribute Fixed point arithmetic is set, the result is 1 because the result of the division is stored internally as 0.333333333333333333333333333333 with an accuracy of up to 31 digits.

  • Business Area & Plant not picked in CIN entries while doing MIRO

    At the time of performing MIRO, the automated entries related to the CENVAT clearing account for excise in CIN are made by the system.
    At the time of excise capturing, the Business Area and Plant are picked up at the line item for the CENVAT clearing account.
    But at the time of MIRO, the Business Area and Plant are not picked up at the line item for the CENVAT clearing account.
    The client requires segregation of the entries on the basis of Plant / Business Area.
    Regards
    Dharmveer

    GB01 is the table where you define it.
    For documentation you can start from:
    http://help.sap.com/saphelp_47x200/helpdata/en/27/06e23954d9035de10000000a114084/frameset.htm
    Using the various links on this page, you can learn everything about substitution.
    Regards,
    anantha

  • How to clear previous data entries in SM30 transaction for table maintenance

    Whenever I maintain 8-10 records in SM30 for table maintenance and then go back to SM30 to enter new records, I am able to view the previous entries.
    Then I click on 'New Entries', where the data is cleared.
    Now what I need is to clear the data before clicking 'New Entries',
    meaning that for the user it should appear as a fresh screen.
    Is this possible, and if yes, how?
    Can anyone please suggest a way to do it?

    Hi Nilesh,
    When you click 'New Entries' the data is not cleared; you go to another screen, so it appears as a blank screen. If you want to delete all the records, then write the logic in your code:
    CASE sy-ucomm.
      WHEN 'NEWENTRIES'.
        DELETE FROM dbtable.   " deletes all the entries
        COMMIT WORK.
    ENDCASE.
    Regards,
    Nagaraj

  • To eliminate the duplicate entries while looping into final internal table

    Hi friends,
    I am facing the following problem.
    Get the shipment for the delivery number:
          SELECT tknum
                 vbeln
                 INTO TABLE lt_vttp
                 FROM vttp
                 FOR ALL ENTRIES IN lt_vbfa
                 WHERE vbeln = lt_vbfa-vbelv.
        IF sy-subrc EQ 0.
          SELECT vbeln kunnr
                 FROM likp
                 INTO TABLE lt_likp
                 FOR ALL ENTRIES IN lt_vttp
                 WHERE vbeln = lt_vttp-vbeln.
      endif.
    Now, in my internal table lt_vttp I have the following values:
       shipment             delivery
          1000                  2000
          1000                  2001
    In my internal table lt_likp I have the following values:
          delivery              shipto
           2000                  ABC
           2001                  ABC
    Now I am looping over those values into a final internal table:
      LOOP AT lt_likp INTO ls_likp.
        CHECK sy-subrc EQ 0.
         READ TABLE lt_vttp INTO ls_vttp WITH KEY vbeln = ls_likp-vbeln BINARY SEARCH.
        CHECK sy-subrc EQ 0.
             SELECT SINGLE * FROM vttk INTO ls_vttk
                    WHERE tknum = ls_vttp-tknum.
                  CHECK sy-subrc EQ 0.
                APPEND ls_vttk TO lt_vttk.
       endloop.
    Now my problem is: if the shipment is the same and the ship-to is the same, I need to eliminate the duplicate records.
    In the internal table I am getting 2 records with the same shipment and ship-to combination,
    but if both the shipment and the ship-to are the same, I should eliminate the duplicate entries,
    and for the same shipment, if I have different ship-tos, then I need both values.
    Can anybody tell me how I can do that?
    Regards
    Priyanka.

    Hi Priyanka,
    In your code you are using a SELECT statement inside the loop, which will affect performance.
    So declare another internal table that contains both your shipment and ship-to elements along with the other required fields; say its name is lt_new.
    sort lt_new by shipment shipto.
    delete adjacent duplicates from lt_new comparing shipment shipto.
    "then use for all entries to choose values from VTTK.
    Hope this helps you.
    Regards,
    Manoj Kumar P

  • 'FROM keyword not found where expected' error while inserting into text file

    Hi,
    I have created a simple interface for testing purposes. All it has to do is read the data from a source text file and put it into another text file (with exactly the same headers as the source)...
    I am using Oracle as my staging area.
    But at the 'Insert column Header' step it gives me the warning 'FROM keyword not found where expected', and when it goes to the next step, 'Insert Rows', it gives me the same error 'FROM keyword not found where expected' and aborts its execution.
    I checked the query for the insertion and found nothing wrong with it (the FROM clause is just where it is expected), but somehow it is not working.
    Can somebody help me out, as I have done this before on other installations and don't know why it is occurring, and for the simplest thing in ODI...
    Regards,
    Nitesh.

    Probably you missed the concat operator.
    select rpad(nvl(lu_parid,' '),1,19)|| v_comma ||
    rpad(nvl(lu_class,' '),1,3)
    from temp_te07

  • Avoiding duplicate records while inserting into the table

    Hi
    I tried the following insert statement, where I want to avoid duplicate records during the insert itself,
    but it is giving me an error like 'invalid identifier', though the column exists in the table.
    Please let me know where I'm making the mistake.
    INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate,
         FROM S_KEY sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm1
                           WHERE tm1.o_id = tm.o_id
                             AND tm1.sn_id = tm.sn_id
                             AND tm1.txt = tm.txt
                             AND tm1.typ = tm.typ
                             AND tm1.sn_time = tm.sn_time)

    Then you have to join the table with the alias tm1; where is that? Do you want something like this?
    INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate
         FROM S_KEY sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm
                           WHERE sk.obj_id = tm.o_id
                             AND 100 = tm.sn_id
                             AND sk.key_txt = tm.txt
                             AND sk.obj_typ = tm.typ
                             AND sysdate = tm.sn_time)
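    Alternatively (not suggested in the thread, just sketched here under the same columns and filters), an Oracle MERGE can insert only the rows that are missing:
    MERGE INTO t_map tm
    USING (SELECT 100 AS sn_id, sk.obj_id AS o_id, sk.key_txt AS txt,
                  sk.obj_typ AS typ, sysdate AS sn_time
             FROM S_KEY sk
            WHERE sk.obj_typ = 'AY'
              AND SYSDATE BETWEEN sk.start_date AND sk.end_date
              AND sk.obj_id IN (100170, 1001054)) src
    ON (tm.o_id = src.o_id AND tm.sn_id = src.sn_id AND tm.txt = src.txt
        AND tm.typ = src.typ AND tm.sn_time = src.sn_time)
    WHEN NOT MATCHED THEN
      INSERT (sn_id, o_id, txt, typ, sn_time)
      VALUES (src.sn_id, src.o_id, src.txt, src.typ, src.sn_time);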

  • Select Insert into staging table does not load varchar2

    I have a select-insert query that loads a prebuilt table. I am running this query through Heterogeneous Services, which points to a MySQL database.
    If I run just the select query in SQL*Plus, it returns all rows just fine.
    However, when I put the select-insert into a stored procedure, compile it, and then run it, it does not load any VARCHAR2 data, but it does load the DATE and NUMBER datatypes.
    I also have another table with VARCHAR2 columns and it loads fine.
    How can the database load only some of the items in a row and not others (the VARCHAR2 ones)?
    For instance, the query will load TRANSACTION_ID and ORDER_NUMBER but will not load any data into BILL_FIRST_NAME, and there is data to load.
    CREATE TABLE "TRAN_STG"
    (     "TRANSACTION_ID" NUMBER(11,0),
         "ORDER_NUMBER" NUMBER(20,0),
         "BILL_FIRST_NAME" VARCHAR2(100 BYTE),
    Any suggestions would be appreciated.

    Can it be that there are differences in character set? Can it be that there are differences in datatypes? Can it be that there are differences in field length?
