Mismatch of BP number.

Hi,
My client's system follows a practice where a BP is first created in CRM and then replicated to R/3 with the same number. In ECC the number range is external and in CRM it is internal. In one case, however, a user mistakenly created a sold-to party in R/3 as 3333, while the same partner was created in CRM as 5678. The requirement now is a new BP in CRM with number 3333 (no BP currently exists with 3333). Is this possible? If yes, can it be linked with the R/3 sold-to 3333?
Raju

You should remove the link first. To do this, you can delete the BP using transaction BUPA_DEL and then create the BP in CRM; it will be replicated to R/3 with the same BP number.
Please award points if this works.
Thanks,
Ivá

Similar Messages

  • Mismatch between serial number masters & its stock

    Hi,
    For plant A000 and storage location 0001, when I execute MMBE I find a stock of 1 in unrestricted stock.
    But when I check IQ02 for the number of serial numbers that exist for the material master, I find 3.
    Could someone tell me why there is a mismatch?
    Regards

    Hi,
    This difference arises when the user creates new serial numbers during a goods movement. For example, say the current stock is 3 and the user issues 2 for consumption. If, while issuing the materials, he used the 'create automatic serial numbers' function instead of taking the serial numbers from the existing stock, the system creates new serial numbers and issues them with the stock quantity. Because of this, you will see the current stock as 1 with three serial numbers.
    Actually, the system should not accept the newly created serial numbers for issues when your settings are correct.
    Please check the serial number usage for the serial number profile used in the material master in the SPRO settings below.
    Serial number usage is defined in two places in SAP PM, under the IMG activities shown below:
    First stage: IMG -> Plant Maintenance & Customer Service -> Master Data in PM & CS -> Technical Objects -> Serial Number Management -> Define Serial Number Profiles
    Then go to the serializing procedure and check the serial number usage for 'MMSL'. It should be '03'.
    Second stage: IMG -> Plant Maintenance & Customer Service -> Master Data in PM & CS -> Technical Objects -> Serial Number Management -> Define Serialization Attributes for Movement Types
    Select "Define flow type groups".
    Here also, the serial number usage should be '03'.
    Let me know if you need any more help.
    Regards
    Vijay

  • Mismatch in system number on Solution manager and SMP

    Hi All,
    I found that the System ID is different in Solution Manager and in the Service Marketplace, whereas the system number is the same in the satellite system and Solution Manager. Because the system ID is not the same, we are unable to send the solution to SAP. Please suggest how to make the system ID the same on SMP, the satellite system, and Solution Manager.
    Regards,
    Sai

    Hi,
    This is somewhat similar to this thread: How to change the system no | SCN

  • Service Desk (Configuration in R3 systems) [Number Range Issue]

    Hi,
    I am using Sol Man 3.2 version.
    I have established the RFC connection between Sol Man and R3.
    When I create a support message from the R/3 system, I get the error below:
    "Error in Local Message System: For object DNO_NOTIF, number range interval 03 does not exist. Message was not created."
    This is a case where there is a mismatch between the number ranges.
    Is there any setting I have to make in the R/3 system to enable the correct number range?
    When I log in to my Solution Manager system and create a support message, I am able to create it and can see it in the Transaction Monitor.
    I am unable to create a support message from the R/3 system.
    Regards,
    Vikas

    Dear Vikas,
    Please log on to your R/3 system which is configured in your Solution Manager landscape.
    Follow the steps below:
    1. Run Transaction SM30.
    2. Table Name "BCOS_CUST"
    3. Maintain Entries as shown below ( column info )
        a. Appl - OSS_MSG
        b. + - W
        c. Dest. - SM_<SID>CLNT<Client#>_TRUSTED
        d. + - CUST620
        e. + - 1.0
    4. Save the Entries
    5. Click on Help Menu -> Create Support Message.
    Now enter the appropriate information in the message window and click on send button.
    Once you get the notification that the support message was created, log on to your Solution Manager system, run transaction CRM_DNO_MONITOR, and check the support message.
    I hope this info helps you create messages from your R/3 system to Solution Manager.
    Thanks,
    Avinash

  • OEM error " Type Mismatch "

    Hi,
    I have OEM v1.6 on Oracle 8.0.5 on NT. While creating a job, when I hit the Schedule tab I get the following error message:
    "Type Mismatch" (no error number)
    How do I avoid this? Is there anything that needs to be set up to avoid this error message?
    I configured all the files (tnsnames.ora, listener.ora, sqlnet.ora, snmp_ro.ora, snmp_rw.ora, services.ora) correctly.
    Also, there is no proper documentation on this error.
    Has anyone ever had this problem? Please let me know if you have a solution.
    Thanks
    Chandra Kapireddy

    You need to download the latest patch. Sorry, I do not have the I.P. address handy; however, Oracle Support will gladly give you the I.P. address.

  • Missing session confirmation number error..?

    Hi All, I'm facing a missing session confirmation number error when I try to add a product to the wishlist for a logged-in user.
    Error :
    16:01:57,218 WARN [DAFDropletEventServlet] Missing session confirmation number: Request URI: /store/product/detail.jsp,
    Referer: http://localhost:8180/store/jump/productDetail/Food_&_Candy/Candy_&_Chocolate/Licorice/Licorice_Laces_(2_lb._Bag)/52607
    Code Fragment:
    <dsp:form method="post" action="${wishlisturl}" formid="wishListForm" id="wishListForm" name="wishListForm" >
    <dsp:input type="hidden" bean="GiftlistFormHandler.giftlistId" beanvalue="Profile.wishlist.id"/>
    <dsp:input bean="GiftlistFormHandler.addItemToGiftlistSucessURL" value="wishlisturl" type="hidden"/>
    <dsp:input bean="GiftlistFormHandler.addItemToGiftlistErrorURL" value="wishlisturl" type="hidden" ></dsp:input>
    <dsp:input bean="GiftlistFormHandler.quantity" id="wishQty" type="hidden" value="1" priority="9"></dsp:input>
    <dsp:input type="hidden" bean="GiftlistFormHandler.productId" value="${productId}"/>
    <dsp:input bean="GiftlistFormHandler.catalogRefIds" id="wishlistSku" value="" priority="9" type="hidden"></dsp:input>
    <dsp:input type="hidden" bean="GiftlistFormHandler.AddItemToGiftlist" value="submit"/>
    </dsp:form>
    Please explain why the error is being thrown.
    Thanks,
    Vishnu

    Usually you should not get this unless the same form is being submitted multiple times or outside the current session's context. I have also seen this happen sometimes when submitting dsp forms through AJAX. You may want to check these possibilities.
    ATG uses a request parameter, _dynSessConf, whose value is a session-confirmation number, to verify that the incoming request to ATG is legitimate. On submission of a form or activation of a property-setting <dsp:a> tag, the DAFDropletEventServlet in the request pipeline checks the value of _dynSessConf against the current session's confirmation number. If it detects a mismatch or a missing number, it can block form processing and return an error. This mechanism is provided to prevent cross-site scripting attacks, and you can configure this behavior through the following properties in /atg/dynamo/Configuration:
    enforceSessionConfirmation - specifies whether the request-handling pipeline requires session confirmation in order to process the request; the default value is true.
    warnOnSessionConfirmationFailure - specifies whether to issue a warning on a confirmation-number mismatch; the default value is true.
    You can also control session confirmation for individual requests by setting the requiresSessionConfirmation attribute to true or false on the <dsp:form> or <dsp:a> tag:
    <dsp:form requiresSessionConfirmation="false" ...>
    <dsp:a requiresSessionConfirmation="false" ...>
    When the requiresSessionConfirmation attribute is set to false as above, the _dynSessConf parameter is not included in the HTTP request and DAFDropletEventServlet skips validation of that request's session-confirmation number.

  • Does the 'default where clause' query select the ROWID by default ?

    Hi ,
    The query in default where property of a data block is as follows:
    global.prim_lang = :global.user_lang
    and upper(group_name) like upper('%' || :B_apply_inclusions.TI_group_desc || '%')
    UNION ALL
    select g.rowid, g.group_no
    from table1 t,
    table 2 g
    where :global.prim_lang != :global.user_lang
    and upper(g.group_name) = t.key(+)
    and :global.user_lang = t.lang(+)
    and upper(nvl(t.translated_value, g.group_name)) like upper('%' || :B_apply_inclusions.TI_group_desc || '%')
    The g.rowid was added in the UNION ALL portion of the query because the first part of the query was bringing back ROWID as well.
    We are on Forms version 10.1.2.3.0.
    However, for a user on Forms version 10.1.2.0.2, the query gives the error "Unable to perform query" due to a mismatch in the number of columns selected across the query union,
    because for this user ROWID is not selected as part of the default where clause query (the first part of the query before the UNION ALL).
    If g.rowid is removed from the second part of the query, it errors out on Forms version 10.1.2.3.0.
    Could you kindly clarify when this ROWID is also selected by the default where clause of a block, and why this issue is occurring? Is this issue related to the Forms version or to some other property of the block? If it is version-based, is there a patch available to deal with it?
    Thanks in Advance.

    You normally change the default_where block property only when you want to change the filter conditions for what is selected from a given block data source.
    Queries with UNION or MINUS will confuse Forms as to the ROWID, and it will no longer be able to perform the default insert/update/delete, not knowing the ROWID and the table to perform the DML on.
    A from-clause query is the best way to change dynamically the tables you select from, and also the WHERE clause. But by using that, if you wish to insert/update/delete, you will have to use ON-INSERT/UPDATE/DELETE triggers, where the processing will have to rely on some primary key columns and not on ROWID.
    Or, instead of a from-clause query, you may use a view, but that will definitely be less flexible than a from-clause query.
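    To illustrate the view suggestion, here is a rough, untested sketch using the table and column names from the question (the view name is hypothetical; the :global bind conditions would still have to stay in the block's WHERE clause, and the outer-join handling would need to be adapted to your data):
    CREATE OR REPLACE VIEW v_group_desc AS
    SELECT g.group_no,
           g.group_name,
           t.lang,
           NVL(t.translated_value, g.group_name) AS group_desc
    FROM   table2 g, table1 t
    WHERE  UPPER(g.group_name) = t.key (+);
    The block would then query a single source with no UNION, and any insert/update/delete would go through ON-INSERT/ON-UPDATE/ON-DELETE triggers keyed on group_no rather than ROWID.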

  • Creating Tabular Models using AMO

    Hi,
       I am trying to setup Tabular Models on SSAS using AMO referring to the materials available at Technet and the Codeplex AMO Tabular Tutorial. I have a few questions/issues
    currently and was hoping I could get some help from here. The tool I am working with currently supports Multidimensional mode via AMO and I am trying to extend most of its features from the Multidimensional model. So, some of the questions might not exactly
    fit into the Tabular model and if so, please do point them out.
    RELATIONSHIPS: While creating relationships among tables,
    would it suffice to create the relationship in the Data Source View Schema (as the tool currently does for the Multidimensional mode) or should we explicitly create relationships under the dimensions that we create to represent the table in Tabular mode? The
    latter is the approach mentioned in the tutorial and I wanted to know if there is any difference between the relationships created using the 2 methods.
    MEASURES: I am trying to create measures using Native
    OLAP Measure objects in the Tabular model. I understand the shortcomings of this type of measures but I would still need to create them for basic aggregation functions like SUMMATION, COUNTS etc… I tried creating the measures using the same Measure Group that
    was created to represent the table and used a ColumnBinding to the Measure’s source-column for the measure object’s source property but I get the following error:
    Binding for VertiPaq measure MEASURE_NAME in measure group MEASURE_GROUP_NAME is invalid because it does not match any property binding of the fact dimension
    TABLE_NAME
    Am I missing anything here? Is there a better way of creating OLAP measures in TABULAR model without using the default Table dimensions that we create?
    HIERARCHIES: I tried creating new Dimensions to hold
    hierarchies for a table but when creating them, I get the error about MISSING ROWNUMBER attribute. Is this attribute mandatory for every dimension that is created?
    To avoid this problem, I used the same dimension that was created to represent the table and tried adding inter-table attribute relationships to it but I get the following exception
    message which I cannot figure out.
    VertiPaq property ‘’ cannot have a name binding.
    In general, can we create separate measure groups and dimensions, apart from the ones we use to represent the table, to store the custom measures and hierarchies? Is this a recommended
    approach? This way, I am trying to keep things in parallel with the Multidimensional model that our tool currently supports but when I create such individual dimensions and measure groups, I get an error on the mismatch between the number of measure groups
    and dimensions in the table.
       Please bear with the long list of questions. I could not find any help online for these and so am posting them all here.
    Thank you.

    RELATIONSHIPS: While creating relationships among tables,
    would it suffice to create the relationship in the Data Source View Schema (as the tool currently does for the Multidimensional mode) or should we explicitly create relationships under the dimensions that we create to represent the table in Tabular mode? The
    latter is the approach mentioned in the tutorial and I wanted to know if there is any difference between the relationships created using the 2 methods.
    No - relationships in the DSV have no impact on the end model. You need to explicitly create relationships between your dimensions and measure groups for them to be picked up as relationships in your tabular model.
    MEASURES: I am trying to create measures using Native OLAP
    Measure objects in the Tabular model. I understand the shortcomings of this type of measures but I would still need to create them for basic aggregation functions like SUMMATION, COUNTS etc… I tried creating the measures using the same Measure Group that was
    created to represent the table and used a ColumnBinding to the Measure’s source-column for the measure object’s source property but I get the following error:
    Binding for VertiPaq measure MEASURE_NAME in measure group MEASURE_GROUP_NAME is invalid because it does not match any property binding of the fact dimension TABLE_NAME
    Am I missing anything here? Is there a better way of creating OLAP measures in TABULAR model without using the default Table dimensions that we create?
    There is no concept of native measures in a Tabular model. You need to create all your measures as "Calculated Measures" in AMO, but using the appropriate DAX expressions instead of MDX.
    HIERARCHIES: I tried creating new Dimensions to hold hierarchies
    for a table but when creating them, I get the error about MISSING ROWNUMBER attribute. Is this attribute mandatory for every dimension that is created?
    To avoid this problem, I used the same dimension that was created to represent the table and tried adding inter-table attribute relationships to it but I get the following exception message
    which I cannot figure out.
    VertiPaq property ‘’ cannot have a name binding.
    Yes, I believe every table needs the hidden RowNumber attribute. The
    TableAddEmptyTable function in the tabular AMO sample on codeplex shows you how to create this.
    In general, can we create separate measure groups and dimensions, apart from the ones we use to represent the table, to store the custom measures and hierarchies? Is this a recommended
    approach? This way, I am trying to keep things in parallel with the Multidimensional model that our tool currently supports but when I create such individual dimensions and measure groups, I get an error on the mismatch between the number of measure groups
    and dimensions in the table.
    No, you can't create any extra structures. Tabular projects only support a subset of AMO. You need to follow the example on codeplex very closely and read all the code comments if you are making changes, because it's very easy to break things.
    My suggestion is to create an abstraction layer either using the TabularAMO library from codeplex as it is or creating your own library if you only need a subset of the functionality. This will mean that your core code is not too tightly bound to AMO. The
    reason for this is that I would hope that MS will replace AMO with something better for Tabular models in a coming release and having a clear abstraction layer should make it easier to update to a new API.
    http://darren.gosbell.com - please mark correct answers

  • To get the reflection of a variable arity method.

    Hi,
    I have
    public static Object myMethod(Object... args)
    I need to return the reflection of this method. I have tried the following, but it throws an error about a mismatch in the number of arguments when I pass more than one argument:
    return cls.getMethod("myMethod", Object[].class);
    Please help in this regard.
    Thanks in advance.

    public class Test {
        public static void main(String argv[]) throws Throwable {
            Class cls = Test.class;
            System.out.println(Object[].class);
            java.lang.reflect.Method m = cls.getMethod("myMethod", Object[].class);
            Test t = new Test();
            // The varargs arguments must be wrapped in a single Object[] element
            // when calling invoke(), otherwise the argument count won't match.
            Object[] paramsA = new Object[0];
            System.out.println(m.invoke(t, new Object[] { paramsA }));
        }
        public Object myMethod(Object... a) {
            return "Test";
        }
    }
    seems to be working for me. Could you post your error message? Thanks

  • Warning on Exporting DAC Repository Metadata

    After finishing the export of the DAC repository metadata, we got some warnings like the following:
    W_ETL_SA: 26 objects
    31705 INFO Wed Nov 11 11:15:12 CST 2009 Exporting entity W_ETL_SA for application EBS 11.5.10.W
    31706 INFO Wed Nov 11 11:15:12 CST 2009 Moved W_ETL_SA: 26 objects
    31707 INFO Wed Nov 11 11:15:12 CST 2009 Exporting entity W_ETL_TABLE for application Universal
    31708 WARNING Wed Nov 11 11:15:12 CST 2009 Mismatch between the number of records in the object reference table and objects
    There are 14 warnings in all.
    Does it mean something is wrong with the DAC repository?
    Roger

    Hi,
    I have spoken to Oracle about this previously, apparently these warnings are not important and are nothing to worry about, see here:
    http://obiee-tips.blogspot.com/2009/09/dac-importexport.html
    They still worry me a bit though...
    Regards,
    Matt

  • MaxL - Cardinality of Member Names

    I prepared a text file to load a transparent partition via MaxL. The error I'm receiving is "cardinality of member names must match". I am very unsure about the meaning of this error. It seems to have to do with either the Mapped Target Area or the Mapped Globally syntax.
    Mapped targetarea1 ("") to (''NoEmploymentType")
    Mapped targetarea2 ("") to (''NoEmploymentType")
    Mapped targetarea3 ("") to (''NoEmploymentType")
    Mapped Globally ("Periodic") to ("")
    Mapped Globally ("Whole") to ("");
    Can anyone shed some light on this error?
    Thank you

    Hello,
    The error message points to a mismatch in the number of elements you want to connect.
    I am not sure how familiar you are with setting up partitions. If you have less experience, start small and use the wizard. When this works reliably, start creating the partitions with MaxL and expand the definition to what is needed.
    Regards,
    Philip Hulsebosch.

  • Oracle 10gR2 Dataguard quick question

    Hi -
    Just a quick question about Oracle 10gR2 Data Guard. I'm in the process of creating a Data Guard standby, which has been running for a few hours and could take a few more because of the size and the standby's network latency; I'm using OMS to create the DG. I am now in a situation where I need to create a new tablespace (A) and add one more datafile to an existing tablespace (B).
    Question: How would this new tablespace and the new datafile in the existing tablespace affect the DG standby, which is about to finish in a couple of hours? Will the DG pick up the new changes from the primary, or will it finish with an error from the mismatch in the number of files? The issue is that I can't wait for the standby creation to be done, as the additional tablespace and datafile are production critical and should be added right away.
    Note: standby_file_management is set to AUTO.
    Thanks for your response.
    regards.

    Your post is a bit vague, as OMS is an acronym for Oracle Management Server, which is a service only.
    If you had stated whether you are using Database Control or RMAN duplicate database, the picture would have been much clearer.
    Database Control uses rman duplicate database.
    Recovery is the mandatory implicit last step of this procedure to pick up all changes since you started the duplicate.
    One word of warning: Network latency is one thing to avoid like hell in a standby configuration.
    It might even slow down your production database.
    Sybrand Bakker
    Senior Oracle DBA
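    Regarding standby_file_management: since it is already set to AUTO, datafiles added on the primary should be created on the standby automatically once the redo covering those operations is applied during recovery. A minimal SQL*Plus sketch of the check on the standby (parameter name as documented for 10gR2; the directory structure or DB_FILE_NAME_CONVERT must let the new file paths resolve on the standby):
    SHOW PARAMETER standby_file_management
    -- Should report AUTO; if not, switch it before redo apply reaches the
    -- CREATE TABLESPACE / ADD DATAFILE operations:
    ALTER SYSTEM SET standby_file_management = 'AUTO' SCOPE = BOTH;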

  • NSS 324 - Management HTTPS Error

    Hi Folks,
    I've noticed that if you enable SSL management and navigate directly using https://<nss ip>, the following error is displayed:
    Secure Connection Failed
    An error occurred during a connection to ........
    SSL received a record that exceeded the maximum permissible length.
    (Error code: ssl_error_rx_record_too_long)
    However, if you navigate via http://<ip>, it is redirected to https.
    This was done using Firefox 3.6.6.
    Cheers,
    Dan

    I think the proxy in your browser settings can cause this error; I have seen that some proxies can't handle secure connections.
    You can find the connection settings in Firefox under Tools > Options > Advanced, Network tab: click on Connection and check that you have the right port for the SSL connection. If you do not need a proxy to connect to the internet, select No Proxy or set it to auto-detect proxy.
    The root cause of this particular issue is a mismatch between the IP address published for the domain to the client (the PC browser) and the server (the NSS), and the IP address in Apache's httpd-ssl.conf file (for example, 127.0.0.2 on the server vs. 127.0.0.1 on the client). When the browsers (any browser) tried to make an https SSL connection, they were trying 127.0.0.1:443; Apache wasn't listening there and the fetch failed. This error is common when there is a mismatch in the port number for the SSL connection between client and server (if a browser is set with a proxy, a default port is set, but it can be any port determined by the user, which can cause this issue).
    In your case, when HTTP is connected and redirected to the HTTPS port defined by the NSS, your browser uses the port determined by the NSS, and everything works fine. Your browser will be cached to the working port and it should continue to work. However, you may see this issue again if you do not resolve the port used between client and server for the SSL connection.
    Hope this helps! Please let me know.

  • The post-expression for the step 'Set Station Global Models' could not be evaluated.

    When running a TestStand sequence, I get the following error message:
    The post-expression for the step 'Set Station Global Models' could not be evaluated.
    Variable or property types do not match or are not compatible.
    Error Code:  -17321; Variable or property types do not match or are not compatible.
    Although I do understand what the above is telling me, fixing the root cause is somewhat another thing.
    I'll explain the scenario. The code is developed in TS 4.1.0, although that should be irrelevant.
    I inherited a TestStand project which calls LabVIEW modules. The original sequence works fine on the test PC (which is remotely located). The local and remote PCs share the same StationGlobals.ini.
    Here is why it is strange: I get the error message on the local PC whether I use the old or the new code. I was assuming that using the same StationGlobals.ini would have fixed that, but it didn't.
    When I run the old code on the remote PC, I do not get that message, but I do get it with the new code. I don't recall modifying code that affected the StationGlobals. My goal is to fix it on both machines (at least the remote one).
    This is the expression that it does not like:
    StationGlobals.Models=Locals.Models,
    However, that very same expression exists in the original code.  The one thing that did change is the addition of a new model number in the Locals.Models.newNumber.  I suspect that this contributes to the error.  But why does it give me an error when I run the original sequence locally?  Maybe I should try running it again..
    In either case, any suggestions to overcome the error?
    Thanks.
    RayR

    Please ignore post.
    I found the problem..  There was a mismatch with the number of models.  Easy fix.  It's all good.
    For someone else who might tumble onto this thread for the same reason, here's an explanation.
    I added a new model with its variants to the list of Locals.Models.  In total, there were 16 additions to the list.
    I also added these to the StationGlobals, which meant that my StationGlobals no longer matched the one on the remote PC.
    Where the error came in is that there was one missing model in the StationGlobals, so the size (number of models) did not match that of the Locals. 
    Slight oversight in the sea of models that are listed (I'm not even going to count them).  Now fixed. 
    ... and embarrassed ...
    I may as well give myself an "Accepted Solution".
    Since I've been naughty all year it will be my only Christmas gift..

  • Problem : SQL Server Delete Statement vs Oracle Delete Statement

    Hi,
    I am currently involved in a SQL Server - Oracle Migration effort and I have hit a roadblock on one of the queries.
    I have created two tables named DUMMY and DUMMY1 in MSDE with the same structure:
    CREATE TABLE DUMMY
    (
         CLIENT_ID       INT,
         NME             VARCHAR(80),
         BATCH_ID        INT,
         CATALOG_NAME    VARCHAR(50),
         DATA_SOURCE     VARCHAR(50)
    )
    GO
    I have a DELETE statement that deletes from the two tables, by selecting common data using joins:
    DELETE DUMMY
    FROM   DUMMY src
           JOIN DUMMY1 dst
             ON  src.batch_id     = dst.batch_id
             AND src.catalog_name = dst.catalog_name
             AND src.data_source  = dst.data_source
    WHERE  src.batch_id     = 1
       AND src.catalog_name = '1'
       AND src.data_source  = '1'
    Now, I have created the same tables (DUMMY and DUMMY1) in Oracle:
    CREATE TABLE DUMMY
    (
         CLIENT_ID       NUMBER(5),
         NME             VARCHAR2(80),
         BATCH_ID        NUMBER(10),
         CATALOG_NAME    VARCHAR2(50),
         DATA_SOURCE     VARCHAR2(50)
    )
    I have written a DELETE statement to mimic the functionality in this manner:
    DELETE FROM DUMMY
    WHERE ( client_id,
            nme,
            batch_id,
            catalog_name,
            data_source )
    IN
          ( SELECT *
            FROM   DUMMY src
                   JOIN DUMMY1 dst
                     ON  src.batch_id     = dst.batch_id
                     AND src.catalog_name = dst.catalog_name
                     AND src.data_source  = dst.data_source
            WHERE  src.batch_id     = 1
               AND src.catalog_name = '1'
               AND src.data_source  = '1'
          )
    However, I keep getting this error when I try to test it:
    ORA-06550: line 1, column 89:
    PL/SQL: ORA-00913: too many values
    Can you please help me modify the query to simulate the same functionality as the original SQL Server query?
    Thanks,
    Sandeep

    There is probably a mismatch between the number of columns.
    Try this * Not Tested *
    DELETE FROM DUMMY
    WHERE ( client_id,
            nme,
            batch_id,
            catalog_name,
            data_source )
    IN
          ( SELECT src.client_id,
                   src.nme,
                   src.batch_id,
                   src.catalog_name,
                   src.data_source
            FROM   DUMMY src
                   JOIN DUMMY1 dst
                     ON  src.batch_id     = dst.batch_id
                     AND src.catalog_name = dst.catalog_name
                     AND src.data_source  = dst.data_source
            WHERE  src.batch_id     = 1
               AND src.catalog_name = '1'
               AND src.data_source  = '1'
          );
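    As a possible alternative (a rough sketch, not tested, assuming the same DUMMY/DUMMY1 structures as above), a correlated EXISTS avoids listing the columns twice and sidesteps ORA-00913 altogether:
    DELETE FROM DUMMY src
    WHERE  src.batch_id     = 1
       AND src.catalog_name = '1'
       AND src.data_source  = '1'
       AND EXISTS ( SELECT 1
                    FROM   DUMMY1 dst
                    WHERE  src.batch_id     = dst.batch_id
                       AND src.catalog_name = dst.catalog_name
                       AND src.data_source  = dst.data_source );
    This also stays closer to the original SQL Server semantics, since only the three join columns are compared between the two tables rather than the full column list.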
