Time offset in Conversion or Transformation

Hi Gurus
I have a question: can I generate a time offset in a conversion or transformation file? For example, a function so that data loaded for 2011 is automatically saved as 2012, but dynamically, without script logic?
something like this
*MAPPING
TIME=TMVL(12, 0FISCPER)
thanks and
regards
Edited by: Nayadeth jeldres on Dec 5, 2011 10:52 PM

As far as I know, we cannot use TMVL in a transformation or conversion file. It is a script logic keyword.
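For reference, the offset itself is normally applied in script logic through a *REC statement. A minimal sketch, assuming a BPC NetWeaver model; the *WHEN filter and all member names here are illustrative, not taken from the original post:

*WHEN CATEGORY
*IS "ACTUAL"
*REC(EXPRESSION=%VALUE%, TIME=TMVL(12,%TIME%))
*ENDWHEN
*COMMIT

This writes each record twelve periods later than it was read, which is the script logic counterpart of what the *MAPPING line above is trying to do.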

Similar Messages

  • Time offset problems with file i/o

    Hello everyone, I'm having a problem with the file I/O VIs. I require my application to save serial data at constant five-minute intervals. The problem I'm currently seeing is that on each save to file a small time offset is added, which eventually grows to seconds, then minutes, and so on. Since this application is meant to run for a whole year, this is a serious problem.
    Attached is the VI, an Arduino program simulating how the DAQ sends data and a file showing my time offset problem. 
    Any ideas/suggestions/fixes are appreciated.
    Thanks
    Jose Molina
    P.S.
    To run the VI, upload the code to an Arduino, select its serial port in the pop-up VI, then click OK. The VI will wait until the time is a multiple of 5 minutes, then create a folder structure in the same location as the LLB. Inside this folder structure should be a file with the data, which should be saved every five minutes if left at the default averaging time.
    Attachments:
    7-8-2012.txt (3 KB)
    Daq_Simulator.zip (1 KB)
    Weather_DAQ.zip (117 KB)

    Sorry, I forgot to mention that the averaging time is configurable in the pop-up VI at the start. I tried 20 seconds because it's much faster for testing than 5 minutes. Timing is done by a simple counter that increments each time data is received, so when I receive data the timer increments by 5, because data is sent every 5 seconds. Once the timer equals the averaging time, the data is sent to the enqueue function, and the blocking dequeue function in the second loop passes it to the file save VI, which then writes the data to the file.
    Attachments:
    counter.PNG (13 KB)

  • Insert at BWF Time Offset - Not working!

    I have some broadcast wave files generated with Cubase 5, and they definitely have BWF time offset. I can see it in File Info. When I select the option to Use BWF Time Reference Offset, and then attempt to insert the file, it just inserts at the cursor. How do I make it insert at the time offset?

    Seems this is a good thing in CS4.
    In CS3, you can split tracks that you didn't want split if you are not careful, so this seems like a very good new thing they added.
    Dave.

  • Time offset is not set properly when db is in Eire timezone

    Hi,
    We are using an Oracle 10g database (10.2.0.4) on a Solaris (SPARC) box for an enterprise product in the Eire time zone. We see the Oracle time offset set as "+01:00" for one of the DB columns whose data type is TIMESTAMP(6) WITH TIME ZONE. As per the Eire time zone, the time offset should be:
    March 27, 2011 to Oct 30, 2011 --> +01:00
    Oct 31, 2010 to March 26, 2011 --> +00:00
    So the time offset set in the Oracle DB is incorrect. Please let us know whether this is a known issue on this DB version, or whether some configuration is missing in the DB.
    Thanks,
    Periyasamy
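    One thing worth checking (a sketch of a possible cause, not a confirmed diagnosis for this system): when no explicit time zone is supplied on insert, the offset stored in a TIMESTAMP WITH TIME ZONE column comes from the inserting session's time zone, so a client session fixed at "+01:00" instead of a region name would show exactly this behaviour. Something like the following shows the relevant settings, and how a region name lets the offset follow DST:
    -- check the configured time zones
    SELECT DBTIMEZONE, SESSIONTIMEZONE FROM dual;
    -- with a region name the offset tracks daylight saving automatically
    ALTER SESSION SET TIME_ZONE = 'Europe/Dublin';
    SELECT SYSTIMESTAMP FROM dual;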

    With horizontalScrollPolicy="off" (the default), it is hard to control
    column widths because the columns are sometimes overridden to make sure they
    fit on the screen.

  • How to Load Revalued Assets in FA at the time of Data Conversion

    Dear Experts,
    How to Load Revalued Assets in FA at the time of Data Conversion?? Please help me
    Thanks & Regards
    Laxman

    Let me ask again of all the experts.
    I want condition 'A' to be repriced at the time of billing.
    For this, I would have to set the condition category to 'L' (generally new when copying).
    But I do not want to do it that way, because I am maintaining a big, live system now.
    In addition, even if I migrated open orders after changing the configuration to 'L', it would be almost impossible for us to migrate, because as a globalized system we have more than a thousand open orders per day.
    That is why I am asking.
    I could simply create a new condition, but as I mentioned, there are various recycling fees, so we have already created about 10 conditions. These recycling conditions are linked to the REA package of SAP, so creating another 10 conditions is not an option for us.
    Finally, what I do not want is for this condition to be shown only in the billing document.
    Condition 'A' should be displayed in both the sales order and the billing document.
    And simultaneously, when the billing document is created, if the user has changed the 'A' condition master, the new value, which differs from the sales order, has to be reflected in the billing document.
    Thank you in advance.

  • Problem on Oracle11g2 RAC -- PRVF-5424 : Clock time offset check failed

    I have a lot of core files written every 6 hours, and at the same time the alert"node".log file shows errors about PRVF-5424 : Clock time offset check failed.
    Yesterday I used this:
    [root@hostname2 ~]# service ntpd stop
    [root@hostname2 ~]# ntpdate
    [root@hostname2 ~]# service ntpd start
    in order to resolve it, but today the problem has started again:
    2013-04-17 20:23:06.063
    CRS-10051:CVU found following errors with Clusterware setup : PRVF-5424 : Clock time offset check failed
    PRVF-5413 : Node "" has a time offset of 69695.6 that is beyond permissible limit of 1000.0 from NTP Time Server "******"
    PRVF-5413 : Node "" has a time offset of 69940.6 that is beyond permissible limit of 1000.0 from NTP Time Server "**
    2013-04-18 02:23:08.973
    CRS-10051:CVU found following errors with Clusterware setup : PRVF-5424 : Clock time offset check failed
    PRVF-5413 : Node "" has a time offset of -1826.9 that is beyond permissible limit of 1000.0 from NTP Time Server "1***
    PRVF-5413 : Node "" has a time offset of -1364.2 that is beyond permissible limit of 1000.0 from NTP Time Server "1**
    2013-04-18 08:23:16.093
    CRS-10051:CVU found following errors with Clusterware setup : PRVF-5424 : Clock time offset check failed
    PRVF-5413 : Node "" has a time offset of -2698.5 that is beyond permissible limit of 1000.0 from NTP Time Server "1***
    PRVF-5413 : Node "" has a time offset of -1998.7 that is beyond permissible limit of 1000.0 from NTP Time Server "1***

    What is your OS name and version ?
    In general you can solve these issues by configuring NTP service running on your cluster nodes to be synchronized with other NTP servers. For example on Linux please read "Oracle RAC and NTP" in http://www.oracle-base.com/articles/linux/linux-ntp-configuration.php#oracle-rac-and-ntp.
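    As a sketch of what that configuration usually looks like on Linux (assuming a Red Hat style layout; the file path and service name may differ on your distribution), the CVU clock check expects ntpd to run with the slewing option:
    # /etc/sysconfig/ntpd
    OPTIONS="-x -u ntp:ntp -p /var/run/ntpd.pid"
    # then restart the service on every cluster node
    service ntpd restart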
    Edited by: P. Forstmann on Apr 18, 2013 13:35

  • What is the meaning of lead time offset in BOM

    Can anybody explain to me the meaning of lead time offset in a BOM?

    Hi,
    In situations where orders have long lead times, this scheduling procedure can result in components being provided much earlier than they are actually needed in the production process. To avoid this situation, the dependent requirements date of the subordinate component can be rescheduled by the follow-up time. With the follow-up time the dependent requirements date of the components is displaced from the order start date further into the future.
    The lead-time offset (in workdays) for the component in relation to the start date for production of the superior assembly. This value is not included in lead-time scheduling for a task list.
    Displacing the Dependent Requirements Date in the Future
    If you have entered a positive value in the bill of material, the dependent requirements date of the component is displaced in the future, starting from the order start date of the assembly's planned order.
    Order start date of the assembly: 11.30.1999
    Lead-time offset: 2+
    Dependent requirements date of the component: 12.02.1999
    Bringing Forward the Dependent Requirements Date
    If you have entered a negative value, the dependent requirements date will be brought forward.
    Order start date of the assembly: 11.30.1999
    Lead-time offset: 2-
    Dependent requirements date of the component: 11.28.1999
    Regards,
    Alok Tiwari

  • Real time sample rate conversion?

    Recently upgraded all my hardware and software. I want to record audio at 96 kHz, so I set my hardware to 96 and set Logic to 96 in the audio settings - no problem... except when I want to record new audio in an old project: I set the software to 96 kHz, and hey presto, my old files recorded at 44.1 play too fast.
    The ref manual says logic will do real time sample rate conversion, but doesn't suggest how this is done.
    If I have my hardware set to 96 kHz but Logic at 44.1 so the old audio plays okay, will my new audio still be recorded at 96?
    I could probably convert all the files individually using the sample rate converter in Factory, but I want to avoid this lengthy process as I'm talking the last 8 years of work!
    Anyone out there know anything about this?

    Just fancied recording at the best quality possible - but you are right, the difference won't be that noticeable and may even sound out of place... in any case I can record at 24 bit with no adverse effect.
    I noticed that trying to record audio at 96 when my hardware is at 96 and Logic at 44.1 causes serious latency issues plus disk-too-slow errors. So forget that one.
    Someone told me recently that a difference in bit resolution is more audibly noticeable as a change in audio quality than a difference in sample rate... just out of interest, I wonder if anyone knows whether this is true?
    BTW just recording guitars and stuff, some vocals, band kinda stuff...

  • TIM Interfaces data conversions

    Hi,
    I would appreciate your help.
    Please tell me what "TIM interfaces" are and
    how to do TIM interfaces data conversions.
    thanks
    AC

    Hi Arun,
    This is a very vague question, without any information about its context. What I did was go out to Google and type in "TIM interfaces".
    I think this may be the Textile Integrated Manufacturing (TIM) application. The link below takes you to a PDF document about the product.
    http://www.infics.com/collaterals/Datatex.pdf
    I don't know if it has any relevance to you. Check it out or if it is not what you are looking for, please post more information as to how you heard about it and in what context.
    Here are the Google search results
    http://www.google.com/search?sourceid=navclient&ie=UTF-8&rls=GGLD,GGLD:2005-09,GGLD:en&q=%22TIMinterfaces%22
    Srinivas

  • AE Scripting Tutorial: Time Offset Expression

    Hi all,
    Just wanted to share a recent tutorial I made which goes into scripting for After Effects. It gives a short overview of scripting, JavaScript, and resources online for learning each. Then I show how to create a script (with a simple GUI) for applying a time offset expression to a number of layers that will follow a pre-defined animation at a pre-defined offset.
    http://www.creativecongo.com/time-offset-script-tut/
    Let me know if you have any questions or even requests for tutorials, scripts or functionality!
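    For anyone skimming, the kind of time offset expression such a script applies looks roughly like this (an illustrative sketch, not the tutorial's exact code; the layer reference and the 0.5 second offset are placeholders):
    // on a follower layer's Position property: follow the layer above, delayed by half a second
    thisComp.layer(index - 1).transform.position.valueAtTime(time - 0.5)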

    This was just for setting up, but here's the file's contents:
    var banner01 = ["Name01", "Date01", "Time01"];
    var banner02 = ["Name02", "Date02", "Time02"];
    var banner03 = ["Name03", "Date03", "Time03"];
    var banner04 = ["Name04", "Date04", "Time04"];
    var banner05 = ["Name05", "Date05", "Time05"];
    var banner06 = ["Name06", "Date06", "Time06"];
    var banner07 = ["Name07", "Date07", "Time07"];
    var banner08 = ["Name08", "Date08", "Time08"];
    var banner11 = ["Name11", "Date11", "Time11"];
    var banner12 = ["Name12", "Date12", "Time12"];
    var banner13 = ["Name13", "Date13", "Time13"];
    var banner14 = ["Name14", "Date14", "Time14"];
    var banner15 = ["Name15", "Date15", "Time15"];
    var banner16 = ["Name16", "Date16", "Time16"];
    var banner17 = ["Name17", "Date17", "Time17"];
    var banner18 = ["Name18", "Date18", "Time18"];

  • OPR LEAD TIME OFFSET IN BOM

    Hi all,
    My scenario is like this: FG contains SFG10, and SFG10 contains SFG20.
    I am using backward scheduling. The results of the MRP run are as follows:
    FG:     30.03.2008  17:30:00
            25.03.2008  15:20:00
    (SFG10 forward scheduling start time is 09:30:00)
    SFG10:  25.03.2008  12:30:30
            25.03.2008  09:30:00 (start time of the work center)
    In the BOM of FG, for component SFG10, I have maintained an operation lead time offset of 120- min so that SFG10 starts at 11:30:00, but it is not happening. Where am I going wrong?
    Regards,...
    Edited by: venky  shree on Mar 26, 2008 7:32 AM

    Hi,
    Please check against the operation start time, not the order start time (i.e. the operation to which you have assigned the component), in CO02 - operations overview - dates tab.
    You can see that, relative to the start time of the operation, the requirement time for the component will be 120 min ahead (CO02 - component overview - general tab - dates tab - requirement date).
    Regards,
    sheik

  • Find time offset from GMT using time zone

    Hi,
    Is there any way I can query the time offset from GMT using the time zone of a place?
    E.g. If I am in India
    My TZNAME is 'Asia/Calcutta' and TZABBREV='IST'
    How can I get the time offset from GMT for this time zone?
    In this case I want +5:30 as the result.
    Regards
    Ravi

    SQL> select dbtimezone
      2  from dual;
    DBTIME
    +01:00
    SQL>
    Regards,
    Gerd
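    Note that DBTIMEZONE returns the database time zone, not the offset of an arbitrary region. To get the offset for a named time zone such as 'Asia/Calcutta', something along these lines should work (a minimal sketch):
    -- offset of a named time zone from GMT/UTC
    SELECT TZ_OFFSET('Asia/Calcutta') FROM dual;   -- +05:30
    -- equivalent, via EXTRACT on a TIMESTAMP WITH TIME ZONE value
    SELECT EXTRACT(TIMEZONE_HOUR   FROM SYSTIMESTAMP AT TIME ZONE 'Asia/Calcutta') AS tz_hour,
           EXTRACT(TIMEZONE_MINUTE FROM SYSTIMESTAMP AT TIME ZONE 'Asia/Calcutta') AS tz_minute
    FROM dual;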

  • EXIF time altered on conversion from PEF to DNG

    Converting Pentax PEF files from my *istDS to DNG using the Adobe DNG converter (4.1) results in the EXIF date/times all being shifted forward by six hours. I gather that the camera and the converter are miscommunicating regarding time zone. But the question is: short of running a batch operation to correct the times after the fact, what can be done about this to prevent this from happening in the future? I've also asked in the Pentax forums, in case someone there knows about any answers from the camera side of the equation.
    One workaround I've found is to generate XMP files for my images before running the converter. I have discovered that if the EXIF info is present in the XMP file - and ACDSee Pro 2 will do this for me by default as soon as I try to write copyright info or anything else to my PEF - then DNG will use that instead of the "real" EXIF info to populate the EXIF fields of the resulting DNG file. Actually, I'm kind of guessing it is still messing up the "real" EXIF version of the info in the DNG, but it is also reproducing the XMP version info, and ACDSee is using the latter, and that's probably good enough for now. Still, I'd prefer other alternatives.

    I still don't understand the issue *fully*. But examining my files with ExifTool, I see that the DNG converter is *not* in fact altering the EXIF info per se at all. What it is doing is adding an XMP block to the DNG file that contains a modified version of the DateTimeDigitized field. This modified version actually contains the time as shot, but it has the current time zone offset appended to it. That is, if the original EXIF dates read 09:00, and I am located in a time zone 6 hours behind GMT, the new DateTimeDigitized field created by the DNG Converter reads 09:00-06:00. Some applications will still display this as 9:00 AM, but others will display it as 3:00 PM - the GMT equivalent of 9:00 AM for folks in this time zone. In the case of the application I have been using (ACDSee Pro 2), if it sees time zone info on this field, it goes ahead and displays *all* times for the file in GMT, making it looks like the converter has modified more than it has.
    I don't know if the converter is wrong to append the current time zone offset, or if ACDSee (and, apparently, some other applications) are wrong to display the time in GMT. The folks at ACDSee are looking into whether and how they should change the behavior of their application. I would also suggest Adobe consider whether appending the time zone info from the computer on which the conversion is being run really makes sense (perhaps it could be made an optional behavior). But I am inclined to suspect the real problem is an overly vague specification - there may be no definitively correct behavior here.
    So for me, this is enough understanding to feel like my workaround is the way to go here. It actually suits my workflow better to generate XMP files before running the converter - it's the most convenient time for me to enter location information. Another workaround would presumably be to run ExifTool to delete the XMP:DateTimeDigitized field immediately after running the converter.
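    For the ExifTool workaround mentioned at the end, the command would be something like this (a sketch; the file name is a placeholder):
    # remove the XMP DateTimeDigitized tag that the DNG Converter added
    exiftool -XMP:DateTimeDigitized= converted.dng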

  • Conversion or transformation ?

    Hi,
    In our domain model, we have many enumerated types, or in other words, classes with a finite number of read-only instances. A good example would be a class named Country. This class has some attributes and a unique identifier.
    Since all this stuff is read-only, does such a class need to have its own database table? A class having a one-to-one relation with a Country could store the country identifier, and TopLink could use it to resolve to the single in-memory instance, avoiding an access to the Country table.
    I'm not sure if this is good design. And if it is, I don't know how to configure Toplink to do this. Conversion, transformation, variable one-to-one mapping ???
    Any comments would be appreciated
    Jean-Christian Gagné

    Jean-Christian,
    A little confused by your question. Do you want to have the table at all, or are you just trying to avoid the read on the table each time?
    If you are trying to avoid the read then I would suggest keeping the 1:1 mapping and pre-loading all Country instances at system startup. This is common for reference data and will avoid any additional queries on the table.
    If you wish to eliminate the Country table then you have a couple of options to handle the country-code to Country instance conversion.
    1. Use an ObjectTypeMapping and populate it at startup time with the static set of Country instances
    2. Use a TypeConversionMapping and specify the database type as String/int (whatever the code is). The object type will automatically be found in the mapped class. Then extend the ConversionManager to also support your Country class conversion to and from the code type. (see http://otn.oracle.com/products/ias/toplink/technical/tips/customconversion/index.html).
    Doug
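    As a generic illustration of the pre-loading idea in option 1 (plain Java, not TopLink API; the class and method names are made up):
    import java.util.HashMap;
    import java.util.Map;

    // Read-only reference data, loaded once at startup and resolved by its code.
    final class Country {
        private final String code;
        private final String name;
        Country(String code, String name) { this.code = code; this.name = name; }
        String getCode() { return code; }
        String getName() { return name; }
    }

    final class CountryCache {
        private static final Map<String, Country> BY_CODE = new HashMap<>();

        // call once at system startup with the full, static set of countries
        static void preload(Iterable<Country> countries) {
            for (Country c : countries) {
                BY_CODE.put(c.getCode(), c);
            }
        }

        // resolve a stored country code to the single in-memory instance
        static Country byCode(String code) {
            return BY_CODE.get(code);
        }
    }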

  • Using java.sql.Time: Offset by 1 hour?

    I have a problem understanding the behaviour of the java.sql.Time class. As the following example shows.
    61952000 ms is the time 17:12:32. If I feed a Time object with it and print the time or date, I'll get "18:12:32",
    an offset of 1 h. But if I use time.getTime(), I get my ms value back, which equals 17:12:32.
    As my time zone shows, I have a GMT offset of one hour. But why doesn't the getTime() method apply
    the time zone offset to the milliseconds?
    Time time = new Time(61952000);
            System.out.println(time.getTime() / 3600 / 1000); // 17 hours
            System.out.println( new Date(time.getTime()) ); // Thu Jan 01 18:12:32 CET 1970
            System.out.println(time.getTime()); // 61952000
            System.out.println(time); // 18:12:32
            System.out.println(Calendar.getInstance().getTimeZone());
            /* Output:
             *  sun.util.calendar.ZoneInfo[id="Europe/Berlin",offset=3600000,dstSavings=3600000,
             *  useDaylight=true,transitions=143,lastRule=java.util.SimpleTimeZone[id=Europe/Berlin,
             *  offset=3600000,dstSavings=3600000,useDaylight=true,startYear=0,startMode=2,startMonth=2,
             *  startDay=-1,startDayOfWeek=1,startTime=3600000,startTimeMode=2,endMode=2,endMonth=9,
             *  endDay=-1,endDayOfWeek=1,endTime=3600000,endTimeMode=2]]
             */

    Thanks for your reply. But please take a look at the following code snippet. It is meant to add a Timestamp representing the
    start of a day (time component 00:00:00) to a time-of-day component (whose date part is 01-01-1970):
            Time time = new Time(61952000);
            Calendar c = Calendar.getInstance();
            System.out.println("time: " + time); // 18:12:32
            c.set(Calendar.HOUR_OF_DAY, 0);
            c.set(Calendar.SECOND, 0);
            c.set(Calendar.MINUTE, 0);
            c.set(Calendar.MILLISECOND, 0);
            System.out.println("date: " + c.getTime()); // Fri Apr 03 00:00:00 CEST 2009
            Timestamp timestamp = new Timestamp( c.getTimeInMillis() + time.getTime() );
            Date d = new Date(timestamp.getTime() );
            System.out.println("\nresult: " + d); // result: Fri Apr 03 17:12:32 CEST 2009If I assume that getting and setting the milis always deals with timezone independent values, but operating methods
    like toString(), getHour() etc use the timezone, then I can not explain the result (last line): 17:12:32 is the timezone
    independend value. Should'nt d.toString() show the timezone dependend value of 18:12:32?
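    For what it's worth, the one hour discrepancy comes from daylight saving time: new Time(61952000) falls on 01 Jan 1970, when Europe/Berlin is at UTC+01:00 (hence it prints as 18:12:32), while 03 Apr 2009 is in CEST (UTC+02:00), so adding the raw millisecond values shifts the wall-clock result back by an hour. A sketch that copies the wall-clock fields instead of adding raw milliseconds (class and variable names are illustrative):
    import java.sql.Time;
    import java.util.Calendar;

    public class CombineDateAndTime {
        public static void main(String[] args) {
            Time time = new Time(61952000);               // prints as 18:12:32 in CET

            // read the wall-clock fields of the time-of-day in the default zone...
            Calendar tod = Calendar.getInstance();
            tod.setTime(time);

            // ...and copy them onto the target date in the same zone
            Calendar c = Calendar.getInstance();
            c.set(2009, Calendar.APRIL, 3);
            c.set(Calendar.HOUR_OF_DAY, tod.get(Calendar.HOUR_OF_DAY));
            c.set(Calendar.MINUTE, tod.get(Calendar.MINUTE));
            c.set(Calendar.SECOND, tod.get(Calendar.SECOND));
            c.set(Calendar.MILLISECOND, 0);

            System.out.println(c.getTime());              // Fri Apr 03 18:12:32 CEST 2009
        }
    }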

Maybe you are looking for

  • How to update app installed by third party

    My colleague was using this MacBook and installed Xcode. He has since disappeared, preferring a life making wooden bowls to computer programming. I cannot use Xcode until I update it, but I cannot update it as it was installed under his Apple ID. I cann

  • How to work on images in java ?

    I want to generate an image from text, meaning I want to convert text into an image. How can we do that? Second, how can I generate barcodes - is there any way to generate barcodes if we supply values?

  • Image scaling produces moire

    I use Labview 8.01 and need to read grayscale images (bmp, png) and fit them to user window for display. Images are scaled via entering a scaling factor into property node for the frontpanel display. If the scaling factor is larger or equal to one, e

  • Security Update 2012-002 Won't Install

    I have been attempting for some time now to install the latest Security Update 2012-002 on my Macbook Pro running OSX v.10.6.8. Upon restart as the computer is about to fully boot up it simply shuts off. I have run Apple Disk Utility as well as MacTu

  • Field status for T-code: CK91

    Hi All, I am trying to load data into CK91 through LSMW. Can somebody help me to suppress some of the fields in CK91, since I don't have data for those fields? I basically want to know where the field status has been maintained for the Transaction