Urgent: Timezone conversion issue

Hi,
I am having an issue with timezone conversion. I'm trying to convert a Date-type value to the server timezone before inserting it into a table. I'm writing the conversion code in validateEntity() and using the OANLSServices methods for the conversion. Since I have to pass a java.util.Date to convertTimezone() as the first parameter (the date to be converted), I'm converting the required date, which is of Oracle Date type, to java.util.Date. The function returns a java.util.Date, which is converted back to Oracle Date; then I pass this value to the setter method, i.e. setJoiningDate(Date) in my case.
The code compiles with 0 errors. But when I run the page, enter a date value and click the Save button, it throws this exception: "oracle.jbo.ValidationException: JBO-28200: Validation threshold limit reached. Invalid Entities still in cache".
In the debug console I can see that the code in validateEntity() is executed 10 times. The first time it converts the date-time value to a date-only format, then the same code executes 9 more times. Can anyone help to resolve this issue? It's very urgent.
Thanks
Harsha

Harsha,
Use this code and try again...!
oracle.jbo.domain.Date join = getJoiningDate();
// oracle.jbo.domain.Date -> java.sql.Date (a java.util.Date subclass)
java.sql.Date joinSql = join.dateValue();
// convert from the user's timezone to the server's timezone
java.util.Date jdateUtil =
    getOADBTransaction().getOANLSServices().convertTimezone(joinSql,
        getOADBTransaction().getOANLSServices().getUserTimeZoneCode(),
        getOADBTransaction().getOANLSServices().getServerTimeZoneCode());
// convert back to oracle.jbo.domain.Date
java.sql.Date jdateSql = new java.sql.Date(jdateUtil.getTime());
oracle.jbo.domain.Date jdate = new oracle.jbo.domain.Date(jdateSql);
setJoiningDate(jdate);
--Mukul
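
A note on the JBO-28200 itself: calling a setter inside validateEntity() marks the entity invalid again, so the framework re-validates it, and after 10 passes it gives up with exactly this exception (which matches the 10 executions Harsha sees in the debug console). One way to avoid the loop (a sketch only, not tested; it assumes the usual OAF EOImpl class and reuses the getters/setters from the post) is to run the conversion once at post time by overriding prepareForDML() instead:

// Sketch: in the entity object's EOImpl class. prepareForDML() runs once,
// just before the row is posted, so the setter no longer re-triggers
// entity validation.
protected void prepareForDML(int operation, oracle.jbo.server.TransactionEvent e)
{
    super.prepareForDML(operation, e);
    if (operation == DML_INSERT || operation == DML_UPDATE)
    {
        oracle.jbo.domain.Date join = getJoiningDate();
        if (join != null)
        {
            java.util.Date jdateUtil =
                getOADBTransaction().getOANLSServices().convertTimezone(join.dateValue(),
                    getOADBTransaction().getOANLSServices().getUserTimeZoneCode(),
                    getOADBTransaction().getOANLSServices().getServerTimeZoneCode());
            setJoiningDate(new oracle.jbo.domain.Date(new java.sql.Date(jdateUtil.getTime())));
        }
    }
}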

Similar Messages

  • Conversion issue in encoding in PI

    Hi,
  I am having a problem with encoding conversion. Let's have a detailed look at the scenario first.
    Scenario: SNC -> PI (through proxy) -> MQ (through receiver JMS adapter) -> SeeBeyond (a middleware system).
                   From SNC, data comes into PI in UTF-8 encoding, whereas all the other systems in the flow above use ISO 8859-1, so UTF-8 needs to be converted to ISO 8859-1. PI does this conversion in the receiver JMS adapter; CCSID has been set to 00819, which is the code page for ISO 8859-1. But the XML declaration at the beginning of the document still says UTF-8, like below:
      <?xml version="1.0" encoding="utf-8" ?>
    whereas the content of the XML has already been converted to ISO 8859-1 by the receiver JMS adapter. Now let's look at the problem.
    Problem: After the data is passed to SeeBeyond, the mapping fails there, because SeeBeyond reads the XML declaration at the beginning, finds UTF-8, and tries to map the content as UTF-8 while it is actually ISO 8859-1, so the mapping fails in SeeBeyond. Please note, all the systems except SeeBeyond just pass the data through; no mapping is done anywhere except in SeeBeyond.
    Workaround: Is there any chance we can change the declaration from UTF-8 to ISO 8859-1 in the receiver JMS adapter itself in PI, like this:
    <?xml version="1.0" encoding="iso-8859-1" ?>
    This might resolve our problem, I think. Any suggestions would be appreciated.
    Thanks and Regards
    Soumya

    Hi Stefan,
            Thanks for your reply. I have gone through the link you specified. Yes, that's another way to resolve the issue in such a case. But here we have asked the receiver system (SeeBeyond) to change the encoding handling in their mapping, because changing that on the SeeBeyond side is less time-consuming than adding a new bean in the JMS adapter. Our problem got resolved by changing the mapping at the receiver end. Thanks anyway for your valuable reply.
    Thanks
    Soumya
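
    For reference, the bean approach mentioned above would essentially just have to rewrite the XML declaration after the code page conversion. A minimal standalone sketch of that rewrite in plain Java (PrologRewriter and fixProlog are made-up names for illustration, not a real PI module; it also assumes the declaration uses double quotes):

    public class PrologRewriter
    {
        // Once the payload bytes have been converted to ISO 8859-1, the XML
        // declaration must be updated to match, or a downstream parser (like
        // SeeBeyond here) will read the bytes as UTF-8.
        public static byte[] fixProlog(byte[] payload)
        {
            String text = new String(payload, java.nio.charset.StandardCharsets.ISO_8859_1);
            String fixed = text.replaceFirst("encoding=\"[Uu][Tt][Ff]-8\"",
                                             "encoding=\"iso-8859-1\"");
            return fixed.getBytes(java.nio.charset.StandardCharsets.ISO_8859_1);
        }
    }

    In PI itself, the TextCodepageConversionBean mentioned in the next message can achieve a similar effect without custom code.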

  • Umlaut Conversion issue in Sender communication channel SAP PI

    Hi Gurus,
    We are facing an umlaut conversion issue in the sender communication channel, which is why the channel is not able to pick up the file from the path.
    Sender CC error:
    Value of incoming field is too large. Segment:'IMD', Field:'7008', MaxLength:'35', value:'Plssvvkbecher Lübzer 0,4 (1280Stk p' DESCRIPTION: The length of the field value is too big!
    Actually the field contains only 35 characters: 'Plssvvkbecher Lübzer 0,4 (1280Stk p'
    We tried with "ISO-8859-1"; if the field value is less than 35 characters, it converts fine.
    Please help me out with this issue.
    Thank you.
    Regards,
    Jittu.

    Hi Jittu,
    Have you tried using the codepageconversion bean in the modules under the sender?  It would be like:
    AF_Modules/TextCodepageConversionBean with a parameter of Conversion.charset and a value of utf-8.
    Regards,
    Ryan Crosby
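
    The error is consistent with a byte-versus-character length mismatch: the value is 35 characters, but once the 'ü' is encoded in UTF-8 it takes two bytes, so a check that counts bytes sees 36. A quick illustration in plain Java (not PI code, just the arithmetic):

    public class LengthCheck
    {
        public static void main(String[] args) throws java.io.UnsupportedEncodingException
        {
            String value = "Plssvvkbecher Lübzer 0,4 (1280Stk p";
            System.out.println(value.length());                      // 35 characters
            System.out.println(value.getBytes("UTF-8").length);      // 36 bytes ('ü' = 2 bytes)
            System.out.println(value.getBytes("ISO-8859-1").length); // 35 bytes ('ü' = 1 byte)
        }
    }

    Whatever the final fix (the code page conversion bean or a longer field), the root cause is the multi-byte encoding of the umlaut.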

  • Language conversion issue in PI7.1

    Hi,
    My scenario is file to IDoc, and I am facing language conversion issues.
    For example, one of the fields in the source file has the value "Différence sur net", and in the IDoc field the value appears garbled as "DiffÃ©rence sur net".
    With the same data I have done a file-to-file scenario, with and without content conversion, and the output comes out correctly as "Différence sur net".
    I have also enabled the Unicode checkbox in the RFC destination on the XI server, but it still comes out garbled. Please advise how I can get the target field value to be the same as the source field value.
    Can anyone help me sort out this issue?
    Thanks.
    Dinesh

    Hi,
    In the channel, try using File Type: Text and Encoding: UTF-8.
    I think this will solve your issue.
    Thank you,
    Siva
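
    Assuming the garbling is the usual UTF-8-read-as-Latin-1 problem, it is easy to reproduce (plain Java, illustration only):

    public class MojibakeDemo
    {
        public static void main(String[] args) throws java.io.UnsupportedEncodingException
        {
            String source = "Différence sur net";
            // Encode as UTF-8, then wrongly decode the bytes as ISO-8859-1:
            String garbled = new String(source.getBytes("UTF-8"), "ISO-8859-1");
            System.out.println(garbled); // prints "DiffÃ©rence sur net"
        }
    }

    Declaring the file's real encoding on the sender channel, as Siva suggests, removes exactly this mismatch.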

  • Oracle timezone conversion

    I have a requirement in which the database server time is being changed to GMT, but one of the database users needs to store its data in CST or CDT, as the case may be, depending on daylight saving time. I am thinking of writing a function which, depending on the date value passed to it, returns whether it is CST or CDT (depending on the day of the year); this function would be called from a before-insert trigger on the tables, so that date data passed in GMT time is converted back to CST or CDT before being stored. Is this the best approach? Does Oracle have a built-in function for this? NEW_TIME is the function I am aware of for timezone conversion; however, it doesn't take into consideration whether it is CST or CDT. Thanks

    What is the datatype - date, timestamp, timestamp with time zone?
    There are ways of converting the date which include daylight saving time considerations:
    SQL> select sessiontimezone from dual;
    SESSIONTIMEZONE
    -08:00
    1 row selected.
    SQL> create table t(ts timestamp);
    Table created.
    SQL> insert into t values(to_timestamp('01-jun-2007 12:00:00', 'dd-mon-yyyy hh24:mi:ss'));
    1 row created.
    SQL> insert into t values(to_timestamp('01-dec-2007 12:00:00', 'dd-mon-yyyy hh24:mi:ss'));
    1 row created.
    SQL> SELECT ts,
                ts AT TIME ZONE 'UTC' utz,
                ts AT TIME ZONE 'Canada/Central' ctz
         FROM t;
    TS                         UTZ                                CTZ
    2007-06-01 12:00:00,000000 2007-06-01 20:00:00,000000 +00:00  2007-06-01 15:00:00,000000 -05:00
    2007-12-01 12:00:00,000000 2007-12-01 20:00:00,000000 +00:00  2007-12-01 14:00:00,000000 -06:00
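
    Incidentally, the CST-or-CDT decision the original question wants to hand-code is already encoded in the timezone data; if any of this logic ends up in application code rather than the database, it is a one-liner in Java (illustration only, assuming America/Chicago is the zone in question):

    import java.util.Date;
    import java.util.TimeZone;

    public class DstCheck
    {
        public static void main(String[] args)
        {
            TimeZone chicago = TimeZone.getTimeZone("America/Chicago");
            // inDaylightTime() answers "CST or CDT?" for any given instant.
            System.out.println(chicago.inDaylightTime(new Date())
                               ? "CDT (GMT-5)" : "CST (GMT-6)");
        }
    }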

  • Currency conversion issue in SPM: incorrect results with the SPM conversion function when converting from one of the document currencies to USD

    Currently we are using SPM 2.0 and we have been facing currency conversion issues.
    Please help me with the following aspects:
    1) Where does currency conversion actually happen in SPM? Is it the global program which does the conversion, or something else?
    2) We have a conversion issue for one currency, where the conversion function gives incorrect results when converting from that document currency to USD. Here the document currency is being treated as 1:1 with the dollar, which is incorrect.
    3) We have verified both the BI side (currency tables) and the ECC side.
    Please help me understand this issue, and let me know if you need more information.
    It's a production issue, and your immediate input would be appreciated.
    Thanks
    Kiran

    Hi Arun,
    The following information may be helpful to you. The SSA_HELPER_PROGRAM has options regarding currency settings.
    EXCH_RATE_TYPE: This flag governs the exchange rate type which will be used for currency conversion in data management. For example, if RSXAADMIN contains an entry EXCH_RATE_TYPE = 'ZSPM', then the conversion type used for currency conversion is ZSPM. The default value for the exchange rate type is 'M'. More details can be found in note 1278988.
    CURRENCYCONVERSION: By default, data management converts all the measures in transaction currency to reporting currency and copies them over to the corresponding measure in reporting currency. If the measure in reporting currency is already available in the source, it might be desirable to disable the currency conversion. To disable the conversion you can make an entry CURRENCYCONVERSION = ' ' in the table RSXAADMIN. This can also be achieved by running the program SSA_HELPER_PROGRAM with the option DEACTIVATE_CURRENCYCONVERSION. The conversion can be reactivated by running the same program with the option ACTIVATE_CURRENCYCONVERSION.
    UNITCONVERSION: Similar to the above. To deactivate unit conversion you can use the program with the option DEACTIVATE_UNITCONVERSION, and to reactivate it, ACTIVATE_UNITCONVERSION. By default both conversions are switched on.
    EXTERNAL_CURRENCIES: Normally most international currencies are stored with two decimal places; however, certain currencies have 0 or 1 decimal places. For example, JPY has 0 decimal places. SAP's internal format stores even these currencies with 2 decimal places and converts the value to the right number of decimal places at display time. If a file from an external source is loaded into SPM, it might carry such currencies with 0 decimal places. To convert them to the SAP standard format, post-processing needs to be done on the values. If that is the case, you can set the flag EXTERNAL_CURRENCIES = 'X' in the table, which will enable the post-processing for these values. This flag can also be set and reset using the helper program with the options TURNON_EXT_CURRENCY_FORMAT and TURNOFF_EXT_CURRENCY_FORMAT.
    Kind Regards,
    John Harris
    Senior Support Engineer, SAP Active Global Support

  • Oracle Database : Date conversion issue between timezones

    Hi All,
    We are trying to convert a date from the Europe/Amsterdam timezone to the Australia/Sydney timezone and extract the time out of it.
    We are getting incorrect times after the conversion.
    Please find the issue details below.
    Environment
    Database: Oracle 10.2.0.4
    Machine: Linux RHEL 4
    Location: Amsterdam, Netherlands
    Issue: incorrect time after converting a date from the Europe/Amsterdam timezone to the Australia/Sydney timezone
    SELECT TO_CHAR(FROM_TZ(TO_TIMESTAMP('201110201416', 'YYYYMMDDHH24MI'),
                           DBTIMEZONE) AT TIME ZONE 'Australia/Sydney', 'HH24:MI')
    FROM DUAL;
    The output of the above SQL is 22:16; the expected output is 23:16.
    The database timezone (DBTIMEZONE) is set to +02:00 (i.e. GMT+2, Europe/Amsterdam).
    If we convert the date to the Australia/Tasmania timezone, then we get the expected output, i.e. 23:16.
    Question: The expected time for Australia/Sydney is 23:16, but why does the Oracle database give 22:16 as output?
    Please note that Sydney and Tasmania are in the same timezone, yet we get different output for the two.
    The Oracle database considers GMT+10 for Australia/Sydney and GMT+11 for Australia/Tasmania.
    If we want to use Australia/Sydney, how should we get the correct Sydney time?
    Regards
    Shailendra

    I ran a test, and it showed that when I omit TO_CHAR, both queries have the same result, 11:15 PM. So it's TO_CHAR which is the 'culprit'. Since this is a globalization issue, you may want to repeat your question in this forum:
    Globalization Support
    (or one of the moderators can move the thread to that forum).
    Werner
    By the way, my local timezone is Europe/Berlin; there should be no difference from Amsterdam.

  • Urgent help - PO issue when receiving

    Dear gurus,
    I have an urgent production issue. I have done a mass conversion of all the master data from one plant to another; the scenario is that they are closing down and stopping operations at the old plant and moving the entire operation to a new plant under the same company code, so for this I did the entire MM master data conversion.
    I also changed the delivering plant and address using the MASS transaction for all the open PO line items, from the old plant to the new plant (some of the line items had partially open quantities and some entirely open quantities).
    1st issue:
    When I am doing a VL31N inbound delivery, the PO quantity is NOT showing up automatically in the delivery quantity field, so it is causing issues when trying to do a GR.
    2nd issue:
    In the Goods Movement Data tab, while using VL31N for GR, it is by default still showing the old plant, and the field is grayed out, even though the PO line item contains the new plant code and delivery address. How can we change the receiving plant to the new plant, and why is it not picking up the new plant?
    Can you please help me with this issue and let me know if there are any settings or a resolution?
    Thanks a lot in advance; I appreciate your expert advice.
    chakri

    Dear chakri,
    You have written that you have converted the material master data, but what about the existing purchase orders? You have to copy all the orders to the new plant, or, if possible, try to change the delivery address in the item details of the PO.
    Reward if useful.
    ashish

  • Content Conversion issue for header record

    Hi,
    We have a very urgent question on an issue here with one of our XI objects.
    This is an inbound interface from an external system into R/3 & BW. The inbound file has a header record (with about 8 fields) and detail records (about 900 fields per detail record). The data going into R/3 & BW has no header records; everything goes in as detail records. One field from the header of this source file should be passed to the target structure at the detail level. Also, we are NOT using BPM.
    Can someone help us define the file content conversion parameters for the file adapter?
    Thanks in advance.
    Prashant

    I'm so sorry, I wasn't subscribed to this thread and I didn't realize there were responses.
    If you have a message type made up of a Header with 1 occurrence and a Detail with 1..unbounded occurrences, you'd want to do the following in content conversion:
    Document Name - your message type
    Document Namespace - your message type's namespace
    Recordset Structure - Header,1,Detail,*
    Recordset Sequence - Ascending
    Then you'll need to set some of the parameters, depending on the layout of your incoming file.
    As for the problem of having hundreds of fields, I'm less sure about that.
    Would it be possible to break your Detail data type down into smaller data types, each with fewer fields? You'd still have to maintain every field in content conversion, but at least they'd be in separate parameters, instead of all 900 in one tiny box.
    Here's a very rough example of what I mean:
    If you have 900 fields, instead of making one Detail data type, you could make 9 data types, Detail1 through Detail9, each with 100 fields in them (or more data types with even fewer fields).
    Setting up the file content conversion would be more complex in this scenario, so it might be a toss-up whether it's worth breaking it up this way, since it means configuring quite a few more parameters.
    For example, you'd have to declare your recordset structure as Header,1,Detail1,*,Detail2,*,Detail3,* etc., and you'd have to make sure to set the .endSeparator to '0' for all of the first 8 details, so it would recognize that they were all on one line.
    I hope this helps a little bit.
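
    For the header-plus-detail layout discussed above, the sender channel's content conversion parameters might look roughly like this (a sketch only, assuming a comma-separated file whose lines start with a record-type indicator; the field names and key values are made up, and the "..." stands for the remaining field names, which content conversion requires you to list in full):

    Recordset Structure:   Header,1,Detail,*
    Recordset Sequence:    Ascending
    Key Field Name:        recordType
    Header.fieldNames:     recordType,hfield2,...,hfield8
    Header.fieldSeparator: ,
    Header.keyFieldValue:  H
    Detail.fieldNames:     recordType,dfield2,...,dfield900
    Detail.fieldSeparator: ,
    Detail.keyFieldValue:  D

    The header field that has to reach the detail level would then be copied in the message mapping (for example with the useOneAsMany standard function), since content conversion itself only parses records; it doesn't move fields between them.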

  • Conversion Issues

    I am upgrading software. I used to use MS Office Pro 2003, now 2007. In converting documents from 2007 into Acrobat Pro 8.x, I was having a problem with the page numbers not getting converted properly. For example, I use "Page x of y" at the bottom of docs, and the conversion was giving "Page x of x".
    SO, I figured it could be time to upgrade to 9.0 Pro. I am currently using the trial version. Well, I no longer have the problem above, but I have another, bigger problem. The conversion is changing the fonts of most text that is centered. I could live with that some of the time; however, certain documents must retain the specific font I use. This is pretty much a deal breaker if I cannot fix it.
    Any thoughts?
    Many thanks and Happy Halloween.

    Ok, I figured out the CAUSE of the issue. If a font is regular and not bolded in the font box, it converts correctly when bolded by the "B" button. However, if it is bolded through the bold box, it converts to another font. SO, is it Word or Adobe that is the culprit?? Therefore, what is the SOLUTION?

  • Currency conversion issue while creating PO from Shopping cart

    Hello Experts,
    I'm facing an issue in SRM during limit purchase order creation from a limit shopping cart.
    Scenario:
    The shopping cart was created on the 1st of June and approved on the 20th of June, and the PO was created the same day. But the currency conversion doesn't match anywhere, and I couldn't trace it. I have tried several times to replicate the issue, but it never happens in the test environments.
    The PO is created with reference to the vendor currency. For example, if the SC was created in USD and the vendor order currency is EUR, then the PO is created in EUR.
    The relevant notes have already been applied in the system, and it works fine when the SC is created and approved on the same day.
    Please give me some idea how to fix this issue.
    Is there any way to make the SC currency take priority over the vendor order currency when creating the PO?
    Regards
    Pratap J

    Hello,
    Read the information available in KBA 1862453; it mentions this issue.
    Regards,
    Ricardo

  • Crystal Data Conversion Issue (Error converting data type varchar to datetime)

    Hi,
    I can run the stored procedure without error in SQL Server, using my personal credentials as well as the database credentials.
    I can also run the Crystal Report, after connecting it to the stored procedure, without error on my desktop, using my personal credentials as well as the database credentials.
    But when I upload the Crystal Report to BOBJDEV and run it using the database credentials, the report fails with "Error in File ~tmp1d1480b8e70fd90.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 18456 ]", although I can run it successfully on BOBJDEV using my personal credentials.
    I googled the data conversion error message, and a lot of people suggested doing "Verify Database" in Crystal Reports. So I did that, but when I do it I get an error message like this:
    Error converting data type varchar to datetime.
    Where do you think the error might be occurring? Has anyone faced this kind of issue before? If so, how do I resolve it?
    (FYI, I am using Crystal Reports 2008, and for the stored procedure I used SSMS 2012.)
    Please help me with this issue.
    Thanks & Regards.
    Naveen.

    hello Naveen,
    since the report works fine in the CR designer on the desktop, we need to figure out where you should post this question.
    By BOBJDEV, do you mean BusinessObjects Enterprise or Crystal Reports Server? If so, please post this question in the BI Platform space.
    -jamie

  • PSE 8 upgrade album conversion issue

    I upgraded my system to Windows 7 and since I was going to have to reload everything anyway, I upgraded to PSE 8 at the same time.
    Before the upgrade I saved my catalog files to my external drive, and then moved them back to the C:\Program Files\Adobe folder. I put them there because I didn't see a catalog folder under the Elements Organizer folder.
    Anyway, I then converted my catalog, renaming it "My Catalog 7.0" to differentiate it.
    The conversion said it was good, and upon inspection all meta tags and albums seem okay. The issue is that it didn't bring in the images from May 2009 onward. I know those were in the 7.0 catalog and had tags on them.
    Any suggestions on how to get those images (with their metadata) into PSE 8.0?
    Susan

    Thanks John,
    You were right; I pulled the wrong file to convert (the one I used to convert 6 to 7).
    Thankfully I had a backup and was able to restore my correct 7 catalog. All is good now.

  • Non-English character conversion issue in LSMW BAPI inbound IDocs

    Hi Experts,
    We have some fields in the customer master LSMW data load program which can contain non-English characters, and we are facing issues in the LSMW BAPI method with the conversion of those characters. The LSMW read and convert steps show the non-English characters properly, without any issue, but while creating the inbound IDocs most of the non-English characters are replaced with '#', which causes issues in creating the customer master data in the system. In our scenario the customer data has non-English characters in the first name, last name and address details. Does any specific setting need to be done on our side? Please suggest how to resolve this issue.
    Thanks
    Rajesh Yadla

    If your language is Unicode then you need to change the options. In SAP you need to change it to Unicode on the initial screen: Customize Local Layout (Alt+F12) > Options > I18N > Encoding ....

  • Urgent Group Policy Issue - not applying despite saying it does

    Thank you for this urgent help. Auditors are checking this out tomorrow morning.
    We have a GPO that sets the event log audit settings for success or failure security events. The scope is set to Authenticated Users.
    When I run the Group Policy wizard in GPMC, it shows the settings applying to one of our servers in that OU.
    When I run gpresult /z from that server, it shows the policy applying to that server.
    But when I go into gpedit.msc, the security audit settings are all set to "Not Defined", and they are grayed out so I can't edit them manually.
    As a test I set the GPO to deny applying to that server. I ran gpupdate /force on the system and then gpresult, and it shows the GPO now not applying. But the settings are still set to Not Defined and still not editable. They are not being set by any other GPO.
    In the event logs I only see three GPO errors, but they are unrelated (a separate GPO is having issues creating user accounts). No other GPOs apply.
    Quick help would be fantastic.
    The server runs Windows Server 2008 R2 (I can edit GPOs, but not the domain ones, and I don't have access to the domain controllers).

    OK, after several hours I figured it out. Turns out there are bugs and odd functionality.
    If someone ever tested the 'advanced audit settings' (which I did in the same GPO at some point), it sets a registry key that disables the use of the older basic audit settings. But when you stop using those advanced settings in your GPO, it doesn't remove that registry bit. So I used the GPO to undo that setting; this was the first step. The setting is found under Computer Configuration > Windows Settings > Security Settings > Local Policies > Security Options > "Audit: Force audit policy subcategory settings (Windows Vista or later) to override audit policy category settings", which I set to DISABLED.
    Even with this done, sometimes the GPO files on the domain controllers don't remove the old audit settings. So, per the comments of another thread, you may have to go to \\domain-fqdn\SYSVOL\domain-fqdn\Policies\{your-policy-id-where-this-setting-was-originally-set}\Machine\Microsoft\Windows NT\ and delete the Audit folder which is left behind due to some odd bug. If you don't do this, then even after the next step the next gpupdate will bring that security setting back down.
    Next you have to reset the audit settings on your PC to the defaults. Unfortunately there is no direct way to do this; auditpol /clear does not accomplish it. The only way is to take the audit settings from another working system, export them, and then 'restore' those same settings on the affected server. To do this:
    1. On the working system, run cmd.exe as administrator and export the audit settings to a file like this:
    auditpol /backup /file:c:\working-auditpol-settings.txt
    2. Copy that file to the broken system, e.g. to the C:\ drive, and run this on the broken system:
    auditpol /restore /file:c:\working-auditpol-settings.txt
    Open gpedit.msc and verify the audit settings are back to normal under Computer Configuration > Windows Settings > Security Settings > Local Policies > Audit Policy.
    Then run gpupdate /force on the formerly broken system. Close gpedit.msc, reopen it, and verify the settings were not overwritten. If you skipped the SYSVOL Audit folder deletion step, they may come back.
    Hope this helps someone.
