SQL for No Transaction

Dear All,
I have two tables: 1. Cities and 2. Transactions. There are 100 cities and each city has at least 100 transactions per day. The Transactions table contains the city name, which matches the name in the Cities table. I want to write a query that tells me which cities made NO TRANSACTION on a particular day. The application works like this: if there is a transaction, it inserts a row for each transaction into the Transactions table; if there is no transaction, nothing is inserted. I am looking for the cities that have no activity or transactions on a given day.
Can anyone help with this? I would really appreciate any help in this regard.
Thanks

It's always nice to have the DDL for the tables in question and some sample data.
I've assumed the date column in the Transactions table has a time component.
I've also assumed that the input parameter is a string datatype; if it were of datatype DATE, the TO_DATE calls should be omitted.
A NOT EXISTS predicate is the most natural way to express this logic.
SELECT *
FROM   cities c
WHERE  NOT EXISTS (SELECT 1
                   FROM   transactions t
                   WHERE  t.city_name         = c.city_name
                   -- half-open range: from midnight on the day in question
                   -- up to, but not including, midnight of the next day
                   AND    t.transaction_date >= TO_DATE(:date_in_question,'DD-MON-YYYY')
                   AND    t.transaction_date <  TO_DATE(:date_in_question,'DD-MON-YYYY') + 1);
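For anyone who wants to run this, here is a minimal sketch of DDL and sample data consistent with the query above. The column names come from the query; the datatypes, the transaction_id column and the sample rows are assumptions.

-- Hypothetical minimal schema matching the query above.
CREATE TABLE cities (
    city_name VARCHAR2(50) PRIMARY KEY
);

CREATE TABLE transactions (
    transaction_id   NUMBER PRIMARY KEY,
    city_name        VARCHAR2(50) REFERENCES cities (city_name),
    transaction_date DATE NOT NULL   -- includes a time component
);

-- Only LONDON transacts on 01-JAN-2013, so for that day the query returns PARIS.
INSERT INTO cities VALUES ('LONDON');
INSERT INTO cities VALUES ('PARIS');
INSERT INTO transactions VALUES (1, 'LONDON', TO_DATE('01-JAN-2013 09:15', 'DD-MON-YYYY HH24:MI'));
COMMIT;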

Similar Messages

  • Error connecting SQL Azure - Network access for Distributed Transaction Manager (MSDTC) has been disabled

    Sometimes I get an error connecting to SQL Azure. The error occurs in an ASP.NET application and in a Windows service running on a VM in Azure. Error details:
    System.Data.Entity.Core.EntityException: The underlying provider failed on Open. ---> System.Transactions.TransactionManagerCommunicationException: Network access for Distributed Transaction Manager (MSDTC) has been disabled. Please enable DTC for network
    access in the security configuration for MSDTC using the Component Services Administrative tool. ---> System.Runtime.InteropServices.COMException: The transaction manager has disabled its support for remote/network transactions. (Exception from HRESULT:
    0x8004D024)
       at System.Transactions.Oletx.IDtcProxyShimFactory.ReceiveTransaction(UInt32 propgationTokenSize, Byte[] propgationToken, IntPtr managedIdentifier, Guid& transactionIdentifier, OletxTransactionIsolationLevel& isolationLevel,
    ITransactionShim& transactionShim)
       at System.Transactions.TransactionInterop.GetOletxTransactionFromTransmitterPropigationToken(Byte[] propagationToken)
       --- End of inner exception stack trace ---
       at System.Transactions.Oletx.OletxTransactionManager.ProxyException(COMException comException)
       at System.Transactions.TransactionInterop.GetOletxTransactionFromTransmitterPropigationToken(Byte[] propagationToken)
       at System.Transactions.TransactionStatePSPEOperation.PSPEPromote(InternalTransaction tx)
       at System.Transactions.TransactionStateDelegatedBase.EnterState(InternalTransaction tx)
       at System.Transactions.EnlistableStates.Promote(InternalTransaction tx)
       at System.Transactions.Transaction.Promote()
       at System.Transactions.TransactionInterop.ConvertToOletxTransaction(Transaction transaction)
       at System.Transactions.TransactionInterop.GetExportCookie(Transaction transaction, Byte[] whereabouts)
       at System.Data.SqlClient.SqlInternalConnection.EnlistNonNull(Transaction tx)
       at System.Data.ProviderBase.DbConnectionPool.PrepareConnection(DbConnection owningObject, DbConnectionInternal obj, Transaction transaction)
       at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal&
    connection)
       at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal& connection)
       at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
       at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
       at System.Data.SqlClient.SqlConnection.TryOpenInner(TaskCompletionSource`1 retry)
       at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)
       at System.Data.SqlClient.SqlConnection.Open()
       at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.<>c__DisplayClass1.<Execute>b__0()
       at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)
       at System.Data.Entity.Core.EntityClient.EntityConnection.Open()
       --- End of inner exception stack trace ---
       at System.Data.Entity.Core.EntityClient.EntityConnection.Open()
       at System.Data.Entity.Core.Objects.ObjectContext.EnsureConnection()
       at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)
       at System.Data.Entity.Core.Objects.ObjectQuery`1.<>c__DisplayClassb.<GetResults>b__9()
       at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)
       at System.Data.Entity.Core.Objects.ObjectQuery`1.GetResults(Nullable`1 forMergeOption)
       at System.Data.Entity.Core.Objects.DataClasses.EntityReference`1.Load(MergeOption mergeOption)
       at System.Data.Entity.Core.Objects.DataClasses.RelatedEnd.DeferredLoad()
       at System.Data.Entity.Core.Objects.Internal.LazyLoadBehavior.LoadProperty[TItem](TItem propertyValue, String relationshipName, String targetRoleName, Boolean mustBeNull, Object wrapperObject)
       at System.Data.Entity.Core.Objects.Internal.LazyLoadBehavior.<>c__DisplayClass7`2.<GetInterceptorDelegate>b__2(TProxy proxy, TItem item)

    Hello,
    I am not an expert in MSDTC, but as we know, SQL Azure Database does not support distributed transactions. This means that SQL Azure doesn't allow the Microsoft Distributed Transaction Coordinator (MS DTC) to delegate distributed transaction handling.
    One common cause of MSDTC getting involved in Entity Framework applications is that the stack closes and reopens the same connection as needed (i.e. for each query that is executed). To avoid opening and closing the connection multiple times, you can simply open the connection explicitly and run the queries on the same connection.
    The following thread is about a similar issue, please refer to:
    http://answers.flyppdevportal.com/categories/azure/sqlazure.aspx?ID=d705a8cf-cba4-494c-96f6-96a136bd29e3
    What's more, you can also try the workaround of setting the Enlist option of the SQL Azure connection to false. For a detailed explanation, please refer to: Entity Framework and SQL Azure
    Regards,
    Fanny Liu
    TechNet Community Support
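    As a hedged illustration of the Enlist workaround mentioned above, the ADO.NET connection string would carry Enlist=False so the connection does not auto-enlist in an ambient transaction. The server, database and credential values below are placeholders:

    Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;User ID=youruser;Password=yourpassword;Encrypt=True;Enlist=False;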

  • Authorizations for which transactions are required in BW?

    Hi,
    Can anyone please give some information regarding
    the transactions for which authorizations are required in BW Production Support?
    Regards,
    Aryan

    Hi Aryan,
    Authorizations for the following transactions are required in BW
    1. RSA1
    2. SM37
    3. ST22
    4. ST04
    5. SE38
    6. SE37
    7. SM12
    8. RSKC
    9. SM51
    10. RSRV
    11. RSPC
    12. RSMON
    The Process Chain Maintenance (transaction RSPC) is used to define, change and view process chains.
    The Upload Monitor (transaction RSMO, or RSRQ if the request is known)
    The Workload Monitor (transaction ST03) shows important overall key performance indicators (KPIs) for the system performance
    The OS Monitor (transaction ST06) gives you an overview on the current CPU, memory, I/O and network load on an application server instance.
    The database monitor (transaction ST04) checks important performance indicators in the database, such as database size, database buffer quality and database indices.
    The SQL trace (transaction ST05) records all activities on the database and enables you to check long runtimes on a DB table or several similar accesses to the same data.
    The ABAP runtime analysis (transaction SE30)
    The Cache Monitor (accessible with transaction RSRCACHE or from RSRT) shows among other things the cache size and the currently cached queries. The Export/Import Shared buffer determines the cache size; it should be at least 40MB.
    ****Assign Points If Helpful****
    Regards,
    Ravikanth

  • Taking snapshot of oracle tables to sql server using transactional replication is taking a long time

    Hi All,
    I am trying to replicate around 200 Oracle tables to SQL Server using transactional replication and it is taking a long time, i.e. the initial snapshot has been running for more than 24 hours and is still going.
    Is there any way to replicate these tables faster?
    Kindly help me out..
    Thanks

    Hi,
    According to the description, I understand the replication is working, but it is very slow.
    1. Check the CPU usage on the Oracle publisher and on SQL Server. This issue may be due to slow client processing (Oracle performance) or network performance issues.
    2. Based on SQL Server 2008 Books Online 'Performance Tuning for Oracle Publishers' (http://msdn.microsoft.com/en-us/library/ms151179(SQL.100).aspx), you can enable the transaction set job and follow the instructions in
    http://msdn.microsoft.com/en-us/library/ms147884(v=sql.100).aspx.
    3. You can enable replication agent logging to check the replication behavior. To enable Distribution Agent verbose logging, please follow these steps:
    a. Open SQL Server Agent on the distribution server.
    b. Under the Jobs folder, find the Distribution Agent job.
    c. Right-click the job and choose Properties.
    d. Select the Steps tab.
    e. Select the Run agent step, click the Edit button, and append the following parameters to the end of the command in the command box:
            -Output C:\Temp\OUTPUTFILE.txt -Outputverboselevel 2
    f. Exit the dialogs.
     For more information about the steps, please refer to:
    http://support.microsoft.com/kb/312292
    Hope the information helps.
    Tracy Cai
    TechNet Community Support

  • Moving the 80 Million records from Conversion database to System Test database (Just for one transaction table) taking too long.

    Hello Friends,
    The background: I am working as a conversion manager. We move the data from Oracle to SQL Server using SSMA, then apply the conversion logic, and then move the data to System Test, UAT and Production.
    Scenario:
    Moving 80 million records from the Conversion database to the System Test database (just one transaction table) is taking too long. Both databases are on the same server.
    The questions are:
    What is the best option?
    If we use SSIS it is very slow, taking 17 hours (sometimes it gets stuck and won't let us do anything else).
    Using my own script (a stored procedure) takes only 1 hour 40 minutes. I would like to know whether there is a better process to speed this up, and why SSIS is taking so long.
    When we move the data using SSIS, does it commit after a particular row count, or does it commit all the records together after writing to the transaction log?
    Thanks
    Karthikeyan Jothi

    http://www.dfarber.com/computer-consulting-blog.aspx?filterby=Copy%20hundreds%20of%20millions%20records%20in%20ms%20sql
    Processing hundreds of millions of records can be done in less than an hour.
    Best Regards,
    Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/
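    As a hedged illustration of the batched-commit question above: when you roll your own copy, you can commit in chunks so the transaction log never has to carry all 80 million rows at once. A minimal T-SQL sketch, with hypothetical database, table and column names:

    DECLARE @BatchSize INT = 100000;
    WHILE 1 = 1
    BEGIN
        -- each INSERT runs as its own auto-committed transaction,
        -- so every batch of @BatchSize rows is committed separately
        INSERT INTO SystemTest.dbo.BigTransaction (Id, Col1, Col2)
        SELECT TOP (@BatchSize) s.Id, s.Col1, s.Col2
        FROM   Conversion.dbo.BigTransaction AS s
        WHERE  NOT EXISTS (SELECT 1
                           FROM   SystemTest.dbo.BigTransaction AS t
                           WHERE  t.Id = s.Id);

        IF @@ROWCOUNT = 0 BREAK;   -- nothing left to copy
    END;

    In SSIS, the closest equivalents are the OLE DB destination's 'Rows per batch' and 'Maximum insert commit size' fast-load settings; a commit size of 0 asks SSIS to commit everything in a single batch, which can make a very large load appear stuck.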

  • Product: Microsoft SQL Server 2012 Transact-SQL Compiler Service -- Error 1335

    Hello,
    I am getting an error while using windows update or manually downloading and running
    SQLServer2012-KB2793634-x64.exe
    Product: Microsoft SQL Server 2012 Transact-SQL Compiler Service  -- Error 1335. The cabinet file 'Redist.cab' required for this installation is corrupt and cannot be used. This could indicate a network error, an error reading from installation media,
    or a problem with this package.
    What can be done?
    Thanks
    Bye

    This KB article was issued as a fix for a known issue, per the link below:
    http://support.microsoft.com/kb/2793634
    I assume you are getting this error because the downloaded file is corrupt. Can you download it again, copy it to a local disk, and run it from there?
    http://www.microsoft.com/en-in/download/details.aspx?id=36215
    Also make sure the Windows Installer itself is not corrupt. If it is, please download it from the link below and install it, and after that run the setup again:
    http://support.microsoft.com/kb/942288
    Hope this helps
    Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers

  • Can't serialize access for this transaction error.

    Hi,
              I am getting a transaction rolled back due to a beforeCompletion exception:
              java.sql.SQLException: ORA-08177: can't serialize access for this
              transaction
              I am using Weblogic 5.10, SP5 & Oracle 8.
              Basically what I have is:
              - A java app that will get an EntityBean, named Schedule.
              - The java app will then create a SessionBean, named Watchdog.
              - The java app will then call a method on the Watchdog, passing in the
              Schedule: watchdog.check(Schedule sched)
              - The watchdog.check() method will do some processing and based on the
              outcome may or may not update the Schedule: sched.setLast(<some number>)
              - The check() method is then done & return to the java app.
              I see log messages (on the weblogic server) saying the check() method is
              done. Then I see messages, presumably from the container for:
              - isModified ------> true
              - ejbStore()
              And then I see the exception messages.
              The strange thing is that most of the time this processing works just fine.
              Only occasionally do these exceptions get thrown.
              The schedule EntityBean is set up as <transaction-isolation> for
              TRANSACTION_SERIALIZABLE.
              There is no other bean trying to update the Schedule at the same time.
              Does anybody have any ideas?
              Thanks in advance,
              Beth
              

    http://www.weblogic.com/docs51/classdocs/API_ejb/EJB_environment.html#1072968
    tells you about the limitations of TRANSACTION_SERIALIZABLE w.r.t. Oracle.
    Pavan
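    As a hedged aside: ORA-08177 is raised when a SERIALIZABLE transaction tries to read or modify data that was changed and committed after that transaction began. A minimal sketch of how it can be provoked in plain SQL (the schedule table and its columns are hypothetical stand-ins for the entity bean's table):

    -- Session 1: start a serializable transaction and read a row
    ALTER SESSION SET ISOLATION_LEVEL = SERIALIZABLE;
    SELECT last_run FROM schedule WHERE id = 1;

    -- Session 2: meanwhile, change the same row and commit
    UPDATE schedule SET last_run = 42 WHERE id = 1;
    COMMIT;

    -- Session 1: now try to update the row it read earlier
    UPDATE schedule SET last_run = 43 WHERE id = 1;
    -- ORA-08177: can't serialize access for this transaction

    It can reportedly also fire without an obvious competing writer, for example when other rows in the same block are being updated, which may explain why the failures described above are only intermittent.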
              "Beth" <[email protected]> wrote in message news:[email protected]...
              > Hi,
              >
              > I am getting a transaction rolled back due to a beforeCompletion
              exception:
              > java.sql.SQLException: ORA-08177: can't serialize access for this
              > transaction
              >
              > I am using Weblogic 5.10, SP5 & Oracle 8.
              >
              > Basically what I have is:
              > - A java app that will get an EntityBean, named Schedule.
              > - The java app will then create a SessionBean, named Watchdog.
              > - The java app will then call a method on the Watchdog, passing in the
              > Schedule: watchdog.check(Schedule sched)
              > - The watchdog.check() method will do some processing and based on the
              > outcome may or may not update the Schedule: sched.setLast(<some number>)
              > - The check() method is then done & return to the java app.
              >
              > I see log messages (on the weblogic server) saying the check() method is
              > done. Then I see messages, presumably from the container for:
              > - isModified ------> true
              > - ejbStore()
              > And then I see the exception messages.
              >
              > The strange thing is that most of the time this processing works just
              fine.
              > Only occasionally do these exceptions get thrown.
              >
              > The schedule EntityBean is set up as <transaction-isolation> for
              > TRANSACTION_SERIALIZABLE.
              > There is no other bean trying to update the Schedule at the same time.
              >
              > Does anybody have any ideas?
              > Thanks in advance,
              > Beth
              >
              >
              >
              >
              

  • SQLException: ORA-08177: can't serialize access for this transaction

    Hiya, if anyone has any suggestions on the following, I would be grateful...
    The exception is thrown by the flushing of the OutputStream in the following few lines:
    java.io.OutputStream os = ((weblogic.jdbc20.common.OracleClob) myClob).getAsciiOutputStream(); os.flush();
    The exception given is: java.io.IOException: Error while doing writeLobByteValue: java.sql.SQLException: ORA-08177: can't serialize access for this transaction at weblogic.db.oci.OciOutputStream.flush(OciOutputStream.java:152)
    ... plus a whole pile of internal stuff
    This method is called from the ejbStore() method of an EJB, and it only fails the first time that ejbStore() is called, and then after that runs smoothly.
    Any suggestions appreciated, Cheers, Karen.

    Thanks, I took a look at that previously. I am still having the problem, though; a temporary solution seems to be to delete extra records from the table in question, but this isn't the best solution by any means.
    Thanks again

  • Error while running a BDC for the Transaction F-02

    Hi,
    I'm getting the error "Parking not possible during Batch Input" while running a BDC for transaction F-02.
    When I click on the error message, it displays the message "In Customizing, you can control whether an error message is issued."
    How can I solve this issue?
    Waiting for your replies.
    Regards
    N.Senthil

    Hi,
    When you are doing the recording in SHDB, on the same screen where the TCODE to be recorded is entered, there are options called "Recording Parameters". Select the checkbox that says "Not a Batch Input Session"; this sets the sy-binpt variable to ' ' (in a recording it is 'X' by default), and you will not get this error.
    Also, while writing the BDC program, make use of the "bdc options" parameter, which has the property to switch off sy-binpt.
    Refer to the thread below for sample BDC code for F-02.
    https://forums.sdn.sap.com/click.jspa?searchID=5126766&messageID=1538409
    Regards

  • Fields in read-only mode for BP transaction in upgrade from 4.72 to ECC6.0

    Hello,
    This is regarding the query for BP transaction in upgrade from 4.72 to ECC 6.0.
    While creating or changing a BP, the system disables the fields in the company code & sales area options. SAP Note 907860 is provided for this, and it specifies that the problem is because of delta Customizing.
    All the Customizing settings are as per the note, and Synchronization Control is also activated, but the fields are still disabled.
    I have also checked the field settings in Cross-Application Components & Logistics - General.
    But these are not working for company code & sales area data.
    please let me know if anyone has faced this type of problem / found any solution on this.
    Best Regards,
    Shubhada

    SAP IMG -> Cross Application Components -> Master Data Synchronization -> Synchronization Control -> Synchronization Control -> Activate Synchronization Options
    This setting allows you to activate synchronization of BP with R/3. Not maintaining the values may result in certain fields being read-only in BP.
    Suggested Values:
    Source Object: BP | Target Object: CUSTOMER | Active Indicator : X
    Source Object: BP | Target Object: VENDOR | Active Indicator : X
    Source Object: CUSTOMER | Target Object: BP | Active Indicator : X
    Source Object: VENDOR | Target Object: BP | Active Indicator : X
    Have you set all of the above settings??
    BHARATH

  • WIP - Job Order Pending for Completion Transaction Report

    How can we get the report "job orders pending for completion transaction" from the WIP module?
    Please give us a solution.
    Thanks & Regards,
    PressureJet Systems Pvt. Ltd.
    Edited by: PressureJet Systems Pvt. Ltd. on Apr 7, 2013 9:57 PM

    Can you elaborate on your question a bit more?
    Mahendra

  • ORA-01489 Received Generating SQL for Report Region

    I am new to Apex and I am running into an issue with a report region I am puzzled by. Just a foreword: I'm sure this hack solution will get a good share of facepalms and chuckles from those with far more experience. I welcome suggestions and criticism that are helpful and edifying!
    I am on Apex 4.0.2.00.07 running on 10g, I believe R2.
    A little background: my customer has asked that an Excel spreadsheet be converted into a database application. As part of the transition they would like an export from the database that is in the same format as the current spreadsheet. Because the column count in this export is dynamic, based on the number of records in a specific table, I decided to create a temporary table for the export. The column names in this temp table are based on a "name" column from the same data table, so I end up with columns named 'REC_NAME A', 'REC_NAME B', etc. (e.g. Alpha Record, Papa Record, Echo Record, X-Ray Record). The column count is currently ~350 for the spreadsheet version.
    Because the column count is so large and the column names are dynamic, I've run into a host of challenges and errors creating this export. I am a contractor in a corporate environment, so making changes to the Apex environment or installation is beyond my influence and really beyond what could be justified by this single requirement for this project. I have tried procedures and Apex plug-ins for generating the file, however the UTL_FILE package is not available to me. I am currently generating the SQL for the query in a function and returning it to the report region in a single column (the user will be doing a text-to-column conversion later). The data is successfully being generated; however, the SQL for the headers is where I am stumped.
    At first I thought it was because I returned both queries as one, joined with a 'union all'. However, after looking closer, the SQL being returned for the headers is about 10K characters long and the SQL being returned for the data is about 14K. As mentioned above, the data is being generated and exported; however, when I generate the SQL for the headers I am receiving a report error with "ORA-01489: result of string concatenation is too long" in the file. I am puzzled why a shorter string is generating this message. I took the function from both pages and ran them at a SQL command prompt and both return their string values without errors.
    I'm hopeful that it's something obvious and noobish that I'm overlooking.
    here is the code:
    data SQL function:
    declare
      l_tbl varchar2(20);
      l_ret varchar2(32767);
      l_c number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      l_tbl := 'EXPORT_STEP';
      l_ret := 'select ';
      for rec in (select column_name from user_tab_columns where table_name = l_tbl order by column_id)
      loop
        if l_c = 1 then
            l_ret := l_ret || '||' || l_dlim || '|| to_char("'||rec.column_name||'")';
        else
            l_c := 1;
            l_ret := l_ret || ' to_char("' || rec.column_name || '")';
        end if;
      end loop;
        l_ret := l_ret || ' from ' || l_tbl;
      dbms_output.put_line(l_ret);
    end;
    header sql function:
    declare
      l_tbl varchar2(20);
      l_ret varchar2(32767);
      l_c number := 0;
      l_dlim varchar2(3) := '''|''';
    begin
      l_tbl := 'EXPORT_STEP';
      for rec in (select column_name from user_tab_columns where table_name = l_tbl order by column_id)
      loop
        if l_c = 1 then
            l_ret := l_ret || '||' || l_dlim || '||'''||rec.column_name||'''';
        else
            l_c := 1;
            l_ret := l_ret || '''' || rec.column_name || '''';
        end if;
      end loop;
        l_ret := l_ret || ' from dual';
      dbms_output.put_line(l_ret);
    end;
    -------
    EDIT: just a comment on the complexity of this export, each record in the back-end table adds 12 columns to my export table. Those 12 columns are coming from 5 different tables and are the product of a set of functions calculating or looking up their values. This is export is really a pivot table based on the records in another table.
    Edited by: nimda xinu on Mar 8, 2013 1:28 PM

    Thank you, Denes, for looking into my issue. I appreciate your time!
    It is unfortunately a business requirement. My customer has required that the data we are migrating to this app from a spreadsheet be exported in the same format, albeit temporarily, so I still must meet the requirement. I'm working around the 350 columns by dumping everything into a single column, which is working for the data; however, the headers export is throwing the 01489 error. I did run into the error you posted in your reply; I attempted to work around it with the CLOB type but ended up running into my string concatenation error again.
    I'm open to any suggestions at this point, given that I have the data. I'm so close because the data is exporting, but because the columns are dynamic, the export does me little good without the headers to go along with it.
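    A hedged aside, reusing the EXPORT_STEP table name from the post above: ORA-01489 most likely comes not from the length of the generated SQL text but from the result of running it, because the single header row concatenates all ~350 column names and exceeds the 4000-byte limit on VARCHAR2 concatenation in SQL, while the data rows apparently stay under it. One sketch of a workaround is to generate the header query around TO_CLOB so the run-time concatenation is done as a CLOB:

    declare
      l_tbl   varchar2(20) := 'EXPORT_STEP';
      l_ret   varchar2(32767);
      l_first boolean := true;
    begin
      l_ret := 'select ';
      for rec in (select column_name
                  from   user_tab_columns
                  where  table_name = l_tbl
                  order  by column_id)
      loop
        if not l_first then
          l_ret := l_ret || ' || ''|'' || ';
        end if;
        l_first := false;
        -- to_clob() keeps the run-time concatenation in CLOB territory,
        -- so the single header row is not capped at 4000 bytes.
        l_ret := l_ret || 'to_clob(''' || rec.column_name || ''')';
      end loop;
      l_ret := l_ret || ' from dual';
      dbms_output.put_line(l_ret);
    end;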

  • Error : Excise modvat accounts not defined for GRPO transaction and U1 exci

    I have created an import PO. After the planned delivery cost MIRO, when I try to perform GR it gives the error "Excise modvat accounts not defined for GRPO transaction and U1 excise group,
    Message no. 8I402".
    I have already maintained the G/L account in "Specify G/L Accounts per Excise Transaction" for excise group U1.
    This problem occurs when the additional duty of customs condition type JADC is maintained in the "Maintain Excise Defaults" node in the column "ADC Cond". If I remove the JADC condition type from there, the error does not occur, but then the AED column does not fetch any value while doing GR.
    Please help me with this issue; still no answer.
    Edited by: shiwanshu singh on Jan 28, 2009 10:26 AM

    Dear sir,
    For GRPO, have you maintained sub transaction type IP for your excise group U1? If not, please maintain it and assign a G/L account to the modvat clearing account; the G/L account should be the same as the company code CVD account.
    Regards
    jrp

  • Excise modvat accounts not defined for GRPO transaction and 58 Excise group

    Hi Experts,
    When I am doing goods receipt for a depot for subcontracting,
    I am getting the error message "Excise modvat accounts not defined for GRPO transaction and excise group". For a depot there is no modvat. Is there any configuration setting for a depot? Please do the needful.

    Hi,
    Please maintain the following. Go to SPRO:
    Specify G/L Accounts per Excise Transaction
    SPRO -> Logistics - General -> Tax on Goods Movements -> India -> Account Determination -> Specify G/L Accounts per Excise Transaction
    Enter excise group 58, excise transaction type (ETT) GRPO, and the G/L account.
    G.Ganesh Kumar

  • Excise modvat accounts not defined for DLFC transaction and excise group

    Dear Experts,
    I am raising this question after checking all the content related to sub transaction type on SDN.
    My problem is: for a raw material sales scenario I have created an order, delivery and invoice. For the invoice I have done account determination with a different G/L in VKOA. Now I have to create an excise invoice, and the accounting entry for central ED on sales should go to a different G/L account. For account determination I have done the following settings:
    Maintained Sub transaction type in
    IMG -> Logistics - General -> Tax on Goods movement -> India -> Basic Settings
    ->Maintain Sub Transaction types
    IMG -> Logistics - General -> Tax on Goods movement -> India ->
    Account determination -> Specify Excise Accounts per Excise Transaction.
    Here the sub transaction type is maintained against DLFC.
    Also
    IMG -> Logistics - General -> Tax on Goods movement -> India -> Account determination -> Specify G/L Accounts per Excise Transaction
    Here I maintained the excise group with DLFC, company code, sub transaction type, chart of accounts and all required G/L accounts.
    But when I am creating the excise invoice in J1IIN and pressing F4 on sub transaction type, no entries are shown; "No values found" is the message in green, Message no. DH801.
    If I enter a sub transaction type and the billing document, the following error message appears:
    "Excise modvat accounts not defined for DLFC transaction and excise group"
    I am unable to understand why the maintained sub transaction type is not showing in transaction J1IIN.
    Regards

    For the error message DH801, please check note 840911.
    For "Excise modvat accounts not defined for DLFC transaction and excise group", please check your G/L assignments in "Specify G/L Accounts per Excise Transaction".
    thanks
    G. Lakshmipathi
