Problem with billing date at closed period

Hi,
We have a problem with the billing date in the transaction. Previously we could set a different (earlier) billing date than the actual posting period, but now it is not working. We receive an error message that the posting period is closed.
I attached an example invoice which we managed to post and where you can see what I'm thinking about, parking number: 1024337, billing document: 841835. In this case the billing date was 30.11.2007 but the posting date was 15.07.2008, and we still managed to post.
Please help with this issue.
yps

Hi,
It is due to the posting period being closed for the month 07/2008.
You need to open the posting period for that month.
Regards,
Greeshma

Similar Messages

  • Problem with Billing date while creating Billing Document

    According to the factory calendar maintained on the client side, every Saturday is a holiday. Because of this, the billing date for every billing document (VF01) created on a Saturday gets pushed to Monday's date. I tried using VOFM routine 011 to default the billing date (VBRK-FKDAT) to the system date, and I assigned it in copy control, but this does not work. Any suggestions?

    Hello,
    This is standard behaviour, and if you want to change it you would have to modify standard SAP, which I would not recommend.
    Please use a separate factory calendar for that particular site.
    Thanks.
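    If you nevertheless decide to default VBRK-FKDAT in a routine, a minimal sketch of a custom VOFM data-transfer routine for billing documents could look like the one below; the routine number 601 is only an illustration (created via VOFM and assigned in copy control), not a standard routine:
    FORM DATEN_KOPIEREN_601.
      "Override the billing date proposed from the factory calendar
      "with the current system date.
      VBRK-FKDAT = SY-DATUM.
    ENDFORM.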

  • Problem with billing

    It says there is a problem with billing and that I should place the order later...
    Why?

    Hi sekouc5321777,
    I see you placed two orders today for an Acrobat subscription. One was "unauthorized," but I can't see why (maybe a typo in the credit card number or expiration date?). The other is currently processing.
    Please let us know if you have additional questions.
    Best,
    Sara

  • Open sales order report with billing date.

    Hi,
    I want to know if there is an available report for open sales orders with a billing date field.
    Thank you.

    Hi,
    If your case is "Order-related" billing, you can use the T.Code "VF04" for this purpose.
    One more way is to go for a custom report.
    Make a copy of the code of the T.Code "VA05n" (program name "SD_SALES_ORDERS_VIEW").
    Change the code as per your requirement to display the field "Billing Date" in the report.
    Check with your ABAPer if you wish to go for a custom report.
    Regards,
    Krishna.
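    If you do go for the custom report, a rough sketch of the core selection is shown below; it assumes the standard SD tables VBAK (order header), VBKD (business data, which holds the billing date FKDAT) and VBUK (header status), and treats "open" as an overall processing status that is not yet completed:
    REPORT ZSD_OPEN_ORDERS_BILLDATE.
    TYPES: BEGIN OF TY_OUT,
             VBELN TYPE VBAK-VBELN,   "sales order number
             AUART TYPE VBAK-AUART,   "order type
             FKDAT TYPE VBKD-FKDAT,   "billing date from header business data
           END OF TY_OUT.
    DATA: GT_OUT TYPE STANDARD TABLE OF TY_OUT,
          GS_OUT TYPE TY_OUT.
    SELECT A~VBELN A~AUART K~FKDAT
      INTO TABLE GT_OUT
      FROM VBAK AS A
      INNER JOIN VBKD AS K ON K~VBELN = A~VBELN
      INNER JOIN VBUK AS U ON U~VBELN = A~VBELN
      WHERE K~POSNR = '000000'        "header-level business data
        AND U~GBSTK <> 'C'.           "overall status not yet completed
    LOOP AT GT_OUT INTO GS_OUT.
      WRITE: / GS_OUT-VBELN, GS_OUT-AUART, GS_OUT-FKDAT.
    ENDLOOP.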

  • Problem with Java Dates and UPDATE for SQL2000

    I am having problems with the date formats for Java. I am trying to put the current date and time into a SQL table; here is the code I am using:
    var Today = new Date()
    var conn = Server.CreateObject( "ADODB.Connection" )
    conn.Open( "Provider=SQLOLEDB;Server=(local);Database=BillTracking;UID=sa;PWD=;")
    var sql = "UPDATE BillAssignments SET DatePosted = " + Today + " WHERE AssignmentID = '" + Request.QueryString("AssignmentID") + "'"
    var rs = conn.execute(sql)
    I keep getting different errors and I have been unable to find a solution yet. I know that I need to change the date format from the Java standard to the one that SQL likes.
    Help....
    Norm...

    Please tell us where the Java part of this comes in. I see that you are using JavaScript to load up data via an ADO connection (presumably on an IIS platform), but I do not see where you are using Java.
    Lee

  • Facing problem with a date column in select query

    Hi,
    I am facing a problem with a date column. Below is my query; it is failing with "invalid number format model".
    Query: SELECT *
    FROM EMP
    WHERE trunc(LAST_UPDATED) >= to_date(to_char(22-05-2009,'dd-mm-yyyy'),'dd-mm-yyyy')
    LAST_UPDATED column is "DATE" data type.
    Please help me Thanks

    Radhakrishna Sarma wrote:
    SeánMacGC wrote:
    WHERE LAST_UPDATED >= to_date('22-05-2009','dd-mm-yyyy');
    You do not need the TRUNC here in any case.
    I don't think so. What if the user wants only data for 22nd May and the table also has records with dates later than the 22nd? In that case your query will not work. In order for the index to be used, I think the query can be written like this.
    I think Sean is right, though. Use of the TRUNC function is quite useless based on the condition given here, since the TO_DATE function used by the OP will always point to midnight of the specified date, in this case 22-05-2009 00:00:00.
    Regards,
    Jo
    Edit: I think Sean proved his point... ;)

  • Problem with loading data to Essbase

    Hi All,
    I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
    But the problem is that after deleting the data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried to change the load mode from 'overwrite' to 'add to existing values' (in the rule file) but it doesn't help... Any ideas what I can do?

    Below few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
    [2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
    [2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
    [2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
    [2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
    [2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory])

  • HT204088 I have a problem with billing info.....

    After the last purchase, it said there was a problem with billing the last purchase and that I need to verify my billing info. I put in the password, it tries to load, and after a few seconds it asks me again; I put it in again... the same... I tried at least 10 times and it keeps doing that. I checked whether my credit card had money on it and it does. I checked the password and it works (I tried a wrong password and it corrected me). What is going on??? Please help...

    I think iPhone is just a bloody rip-off. I downloaded an app and now I can't download another app. I called customer service more than 10 times today and they took away 15 pounds of my call time, so I would recommend everyone not to use iPhone. They don't give much help, but they love ripping off poor customers like myself.

  • Problem with the date conversion

    Hi Friends,
    I am facing a problem with date conversion. My requirement is to pass the date to the screen based on the user settings (SU01).
    I have fetched the user's date format setting by using the function module SUSR_GET_USER_DEFAULTS. The function module picks up the exact user date setting (such as MM/DD/YYYY, MM.DD.YYYY, DD.MM.YY).
    After that I used the function module FORMAT_DATE_4_OUTPUT to convert the user-setting date format into the system date format.
    For English the function module FORMAT_DATE_4_OUTPUT works fine, but it is not supported for other languages.
    Can you please provide a function module for the user-setting date conversion?
    This function module is very important for us.
    Thanks
    Charan
    Moderator message: date conversion questions = FAQ, please search before posting.
    Edited by: Thomas Zloch on Dec 21, 2010 2:19 PM

    Hope this logic helps you.
    DATA LF_DATE    TYPE DATS VALUE '20101221'. " 21-Dec-2010 in internal format YYYYMMDD
    DATA LF_DATE_BI(10).
    WRITE LF_DATE TO LF_DATE_BI.  "Now LF_DATE_BI contains the date in the user's display format
    "Now populate the value LF_DATE_BI to the screen field

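    For the opposite direction (a date string in the user's display format back into the internal YYYYMMDD format), the standard function module CONVERT_DATE_TO_INTERNAL can be used; the sketch below is only an illustration, so please verify the interface in your release:
    DATA: LF_DATE_EXT(10),              "date as displayed/entered by the user
          LF_DATE_INT TYPE DATS.        "internal format YYYYMMDD
    LF_DATE_EXT = LF_DATE_BI.           "value produced by the WRITE above
    CALL FUNCTION 'CONVERT_DATE_TO_INTERNAL'
      EXPORTING
        DATE_EXTERNAL            = LF_DATE_EXT
      IMPORTING
        DATE_INTERNAL            = LF_DATE_INT
      EXCEPTIONS
        DATE_EXTERNAL_IS_INVALID = 1
        OTHERS                   = 2.
    IF SY-SUBRC <> 0.
      "Invalid date in the user's display format - handle the error here
    ENDIF.
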
  • PSP: problems with viewing data

    Hello.
    I'm currently working on an online shop and have some problems with viewing data from the database. When there are not many inserts into the table it works very well, but after running all the INSERTs I have, it acts strangely.
    Sample with 10 INSERTS:
    http://gafgarion.atspace.com/psp/1.jpg
    Sample with 100 INSERTS:
    http://gafgarion.atspace.com/psp/2.jpg
    I'm using Oracle 9i. When I have more data in my database it acts strangely. The SELECT is from only one table, but sometimes I get data from other tables as well.
    I didn't touch any config files or anything else. I only created a new user and DAD.
    Any ideas what I should do to fix that?
    Thanks in advance.

    Hello,
    My guess is that you are speaking about PL/SQL Server Pages (PSP) and the PL/SQL Web Toolkit.
    This is why I do not think you will get many answers here, since this forum is targeted toward Web Services developers (XML, SOAP, and so on).
    I am inviting you to ask your question on the general Oracle Application Server or PL/SQL forums.
    Regards
    Tugdual Grall

  • Problems with retrieving data from tables with 240 and more records

    Hi,
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    I installed Oracle 11.2.0 Client and I started to have problems with retrieving data from tables.
    First I used the same connection string, driver and so on (O10 Oracle 10g) then I tried ORA Oracle but with no luck. The result is like this:
    I'm able to connect to the database. I'm able to retrieve data, but only from small tables (e.g. with 110 records it works perfectly using both the O10 and ORA drivers). When I try to retrieve data from tables with around 240 or more records, the retrieval simply hangs (nothing happens at all - no error, no timeout). The application seems to hang forever.
    I'm using Powerbuilder to connect to Database (either PB10.5 using O10 driver or PB12 using ORA driver). I used DBTrace, so I see that query hangs on the first FETCH.
    So for the retrievals that hang I have something like:
    (3260008): BIND SELECT OUTPUT BUFFER (DataWindow):(DBI_SELBIND) (0.186 MS / 18978.709 MS)
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=1
    (3260008): ,len=160,type=DECIMAL,pbt=4,dbt=0,ct=0,prec=0,scale=0
    (3260008): EXECUTE:(DBI_DW_EXECUTE) (192.982 MS / 19171.691 MS)
    (3260008): FETCH NEXT:(DBI_FETCHNEXT)
    and this is the last line,
    while for retrievals that do finish, I see each FETCH producing a time and data in the buffer, then moving on to the next FETCH until all data is retrieved.
    On a side note, I have no problems retrieving the data with either SQL Developer or DbVisualizer.
    The problems started when I installed the 11.2.0 Client. Even if I want to use the 10.1.0 Client, the same problem occurs. So I guess something from 11.2.0 overrides the 10.1.0 settings.
    I will appreciate any comments/hints/help.
    Thank you very much.

    pgoel wrote:
    I've been connecting to Oracle 11g Server (not sure exact version) using Oracle 10.1.0 Client and O10 Oracle 10g driver. Everything was ok.
    Earlier (before installing the new stuff), did you ever try retrieving data from big tables (like 240 and more records)? If yes, was it working?
    Yes, with the Oracle 10g client (before installing 11g) I was able to retrieve any data, whether it was 10k+ records or 100 records. Installing the 11g client changed something, so that even using the old 10g client (which I still have installed) fails to work. The same problem occurs whether I'm using the 10g or 11g client now. PowerBuilder hangs on retrieving tables with more than about 240 records.
    Thanks.

  • Facing a lot of problems with the DATA object -- Urgent

    Hi,
    I am facing a lot of problems with the data object in VC.
    1. I created the RFC initially and then imported the data object into VC. Later I made some modifications to the RFC function module, and when I reload the data object I am not able to see the new changes made to the RFC in VC.
    2. Even after I delete the function module and redeploy the iView, results are still getting displayed.
    3. How stable is VC?
      I restarted the SQL server and the portal connection to R/3 was also made afresh... still I am seeing such surprising results.
    Please let me know what might be the problem.

    Hi Lior,
    Are you aware of this problem?
    If yes, please let me know...
    Thanks,
    Manjunatha.T.S

  • Problem with a data set: DIAdem crashes

    Hi,
    I've got a problem with a data set. When I want to zoom in DIAdem-View, DIAdem crashes with the following message (translated from German ;-):
    error type: FLOAT INEXACT RESULT or FLOAT INVALID OPERATION or FLOAT STACK CHECK
    error address: 00016CB8
    module name: gfsview.DLL
    I've got some similar data sets that do not show such problems. I also scanned the data a bit, but in the 59000 points I didn't see anything special. I did try to delete the "NOVALUE"s as well, but after that "NOVALUE"s still exist.
    Does anyone have an idea what to look for?
    Thanks,
    Carsten

    Carsten,
    Could you please upload your Citadel database to the following FTP site:
    ftp.ni.com/incoming
    If you want to compress (ZIP) and/or put a password on the data, that's fine. Please send me a private email at [email protected] (with the file name and password if you put one on the file) once you have uploaded the file and I will check it out.
    Otmar
    Otmar D. Foehner
    Business Development Manager
    DIAdem and Test Data Management
    National Instruments
    Austin, TX - USA
    "For an optimist the glass is half full, for a pessimist it's half empty, and for an engineer it's twice as big as necessary."

  • Problem with input data format - not "only" XML

    Hi Experts,
    I have a problem with the input data format.
    I get some data from JMS (MQSeries), but the input format is not clean XML.
    It is something like a flat file with XML content inside...
    Example:
    0000084202008-11-0511:37<?xml version="1.0" encoding="UTF-8"?>
    <Document xmlns="urn:xsd:test.01" xmlns:xsi=http://www.w3.org/2001/XMLSchema-instance Sndr="0001" Rcvr="SP">
    ...content....
    </Document>000016750
    The problems in this file are:
    1. data before the XML declaration: 0000084202008-11-0511:37
    2. data after the closing </Document> tag: 000016750
    This extra data breaks the XML format.
    Unfortunately XI is not the only receiver of these files, and we can't change the file format in the MQSeries queue (before it reaches XI).
    My goal is to get XML from this file:
    <?xml version="1.0" encoding="UTF-8"?>
    <Document xmlns="urn:xsd:test.01" xmlns:xsi=http://www.w3.org/2001/XMLSchema-instance Sndr="0001" Rcvr="SP">
    ...content....
    </Document>
    My questions:
    1. Is there any way or technique in XI to delete the data 0000084202008-11-0511:37 from this file?
    2. Is there any way to get only the XML from this file?
    Thanks.
    Regards,
    Seo

    Hi Buddy
    Which adapter is XI using?
    If you use inbound File adapter content conversion, you could replace these values with none and then pass the message on to the scenario.
    Does that help?
    Regards
    cK
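    If the cleanup ever has to be done on the ABAP side instead (for example in an ABAP mapping), a rough sketch of the string handling is shown below; it assumes the payload is already available as a string and always starts with '<?xml' and ends with '</Document>':
    CONSTANTS LC_CLOSE TYPE STRING VALUE '</Document>'.
    DATA: LV_RAW TYPE STRING,          "complete flat-file content from the queue
          LV_XML TYPE STRING,          "extracted XML payload
          LV_OFF TYPE I,
          LV_END TYPE I,
          LV_LEN TYPE I.
    FIND FIRST OCCURRENCE OF '<?xml' IN LV_RAW MATCH OFFSET LV_OFF.
    IF SY-SUBRC = 0.
      FIND FIRST OCCURRENCE OF LC_CLOSE IN LV_RAW MATCH OFFSET LV_END.
      IF SY-SUBRC = 0.
        LV_LEN = LV_END + STRLEN( LC_CLOSE ) - LV_OFF.
        LV_XML = LV_RAW+LV_OFF(LV_LEN).   "keep only the XML between the markers
      ENDIF.
    ENDIF.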

  • Problem with universal data cleanse

    I have a problem with universal data cleanse.
    I am using DS 3.2.x (12.2.2).
    I created:
    Dictionary: TEST
    Classification: TEST_CL
    Custom outputs: category: TEST_CAT
    Dictionary Entry:
    Primary: BO
    Classification: TEST_CL
    Gender: Unassigned
    Secondary information:
    When used as: TEST_CAT
    Standard: Business Objects
    Rule file:
    DataCleanse Rule File v2.0;
    TEST_CAT = TEST_CL;
    action = TEST_CAT;
    TEST_CAT = 1 : TEST_CAT : 1;
    end_action
    Data Cleanse:
    Input: Multiline1
    Options:
    Parsing Dictionary: TEST
    Rule file: .../test_rules.dat
    Break On Whitespace Only: Yes
    Parser Sequences Multiline1: TEST_CAT
    Output:
    Parent_component: TEST_CAT1
    Generated_field_name: TEST_CAT and RULE_LABEL
    Generated_field_class: parsed/standardized
    Contenet_type: none
    and EXTRA field
    I wanted to replace the word "BO" with the standard words "Business Objects", but the result is: "BO" is in the EXTRA field,
    and the other fields are null.
    What am I doing wrong?
    Thanks for any help!
    P.S. I don't have cleansing packages installer.

    There seem to be a couple of things going on here:
    1. If you are using your custom dictionaries, then you have to map your input to MULTILINE1 and enable the custom parsers - just something to be aware of
    2. You mentioned that you made some changes to the existing dictionary and you are not seeing any changes. To be clear, do you have different TEST and PRODUCTION environments? Or is it the same environment except that you have a local DS repository and for the dictionary you are pointing to another repository (using Dictionary --> Manage Connections)?
    Having the dictionaries on a different repository should not make any difference as long as you point to them in your designer using the Dictionary --> Manage Connection option.
    So I think there may be some issue with your job setup and/or dictionary values need to be looked at. You can start by adding another output field named "EXTRA" to see whether or not your data is getting parsed at all. Also, make sure the entry "CLEANME" is classified as FIRM_NAME_ALONE in the dictionary and that you are selecting the correct dictionary name in the Datacleanse Transform options.
