Conversion date

Hi,
I created an InfoSet query that uses six tables, i.e. ANLA, ANLB, ANLC, ANLZ, ANEP and ANEK. The issue is that some assets show the conversion date, i.e. 05/31/2009, while other assets show the current date (if an asset has a retirement / addition / transfer posting, that posting date is what the query returns), but the business is asking for the original conversion date (05/31/2010) for converted / legacy assets.
The goal of this table view and query set is to pull the entire asset base, including converted assets, new additions, and new retirements, up to a certain point in time or period end. But we found that it was not pulling converted assets because of how the data is linked across the six tables.
There are two assets (130049 & 174068, both with sub-number 0). I've made notes for each line of data in column "A". 674068 seems to work perfectly: there is a posting row for the asset from conversion, a new $50 addition, then the retirement. 630049 is where we are having a problem, as the query is not pulling the original conversion row that I would expect to see for Rs. 109.84. What's interesting is that when I retired the total asset (including the new additions), it retired both the new additions and the converted amount. What is preventing us from pulling the conversion data row in our view or query set?
I have data sets due for tax in the very near future (I have already put them off quite a bit). If you could help us look at this issue at your earliest convenience, I would really appreciate it.
Thanks in Advance,
C Babu

Both java.sql.Date and java.sql.Timestamp inherit from java.util.Date, so you can simply cast to go back. Although you almost never need to: the only case would be to force a different overload to be called:
void foo(java.util.Date dateOrTimestampThingy) {}     // overload 1
void foo(java.sql.Timestamp dateOrTimestampThingy) {} // overload 2

Timestamp ts = ...
foo(ts);                    // calls #2 (the most specific overload)
foo((java.util.Date) ts);   // calls #1 (the upcast forces the Date overload)
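Going the other way, from a plain java.util.Date to a java.sql.Timestamp or java.sql.Date, a cast is not enough (a util.Date is not a Timestamp); you construct one from the millisecond value instead. A minimal sketch, with illustrative variable names only:
java.util.Date d = new java.util.Date();
java.sql.Timestamp ts2 = new java.sql.Timestamp(d.getTime()); // keeps millisecond precision
java.sql.Date sqlDate = new java.sql.Date(d.getTime());       // date portion, for DATE columns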

Similar Messages

  • Import conversion data table from SAP R/3 into value mapping table in XI

    Hi:
    Does anybody know how to import a table with conversion data from SAP R/3 and bring it into a value mapping table in XI?
    The purpose is to use a mapping table that can change in the future. Must I use an ABAP program that retrieves the data and builds the value mapping table?
    If so, how do I specify the group ID, the scheme, the agency and the corresponding value in the ABAP program?
    Please help me.
    Regards!!!

    Hi David,
    please refer to this section in the help: http://help.sap.com/saphelp_nw04/helpdata/en/2a/9d2891cc976549a9ad9f81e9b8db25/content.htm
    There is an interface for mass replication of mapping data. The steps you need to carry out to use this are:
    Activities
    To implement a value-mapping replication scenario, proceed as follows:
           1.      Register the Java (inbound) proxies.
    To do so, call the following URLs in the following order in your Internet browser:
    - http://:/ProxyServer/register?ns=http://sap.com/xi/XI/System&interface=ValueMappingReplication&bean=localejbs/sap.com/com.sap.xi.services/ValueMappingApplication&method=valueMappingReplication (for the asynchronous replication scenario)
    - http://:/ProxyServer/register?ns=http://sap.com/xi/XI/System&interface=ValueMappingReplicationSynchronous&bean=localejbs/sap.com/com.sap.xi.services/ValueMappingApplicationSynchronous&method=valueMappingReplicationSynchronous (for the synchronous replication scenario)
    You only need to perform this step once (for each installation).
           2.      Application programming
    The ABAP program must perform the following tasks:
    - Read the value mapping data from the external table
    - Call the outbound proxy used to transfer the data to a message, which is then sent to the Integration Server
           3.      Configuration of the replication scenario in the Integration Directory
    This involves creating all the configuration objects you need to execute the scenario successfully. One special aspect of the value-mapping replication scenario is that the receiver is predefined (it must be on the Integration Server). The sender, however, is not predefined in the replication scenario and can be defined to meet your individual requirements.
    For example, you can use the shipped ABAP proxies.
    In the case of the receiver communication channel, choose the adapter type XI. Ensure that you configure a channel for the Java proxy receiver in this case.
    Enter the path prefix /MessagingSystem/receive/JPR/XI for this purpose.
    Regards
    Christine

  • In PDF to Excel conversion, dates like 03/12/15 convert to Dec 3rd 2015 instead of the correct date of Mar 12th 2015, whereas 03/13/2015 converts correctly as March 13th 2015


    Hi DirTech,
    Are both of these dates in the same Excel file? If they're in different files, are you choosing the same language for OCR (optical character recognition)?
    If they are in the same PDF file, how was that PDF file created? Was it created from a third-party application (rather than an Adobe application)? If it was created by a third-party application, it could be that it wasn't written to spec, and that's why you're seeing some oddities in the PDF > Excel conversion.  (See Will Adobe ExportPDF convert both text and form... | Adobe Community.)
    Best,
    Sara
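    The underlying issue is that a date like 03/12/15 is genuinely ambiguous: read month-first it is March 12th 2015, read day-first it is December 3rd 2015, while 03/13/15 can only be month-first (there is no 13th month), so it always lands on March 13th. A small sketch showing how the same string parses to different dates depending on the assumed pattern; this only illustrates the ambiguity, not how the Acrobat export works internally:
        import java.time.LocalDate;
        import java.time.format.DateTimeFormatter;

        public class AmbiguousDates {
            public static void main(String[] args) {
                String s = "03/12/15";
                // Month-first (US) pattern -> 2015-03-12
                LocalDate monthFirst = LocalDate.parse(s, DateTimeFormatter.ofPattern("MM/dd/yy"));
                // Day-first pattern -> 2015-12-03
                LocalDate dayFirst = LocalDate.parse(s, DateTimeFormatter.ofPattern("dd/MM/yy"));
                System.out.println(monthFirst + " vs " + dayFirst);  // 2015-03-12 vs 2015-12-03
            }
        }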

  • ASAP Data Migration / Data Conversion / Data Transfer Templates

    Hi,
    I'm looking for the ASAP Data Migration / Data Conversion / Data Transfer templates that used to be available in ASAP.
    Some time ago, and I believe it was on this site though I can no longer find it, I found a zip file which contained the above .xls template for managing all the data conversions on a project. Also included was the data mapping template (.xls).
    Many thanks.


  • I'm in trouble You update the iPhone did not have the expense of Apple iPhone was closed and requests activation account??? What to do This conversation data

    I'm in trouble
    You update the iPhone did not have the expense of Apple iPhone was closed and requests activation account??? What to do
    This conversation data
    SerialNumber: DN*******TC0
    help   help
    help
    help
    I've stopped all my work because all the contact numbers on the phone helped me
    <Personal Information Edited by Host>

    Repost in your native language as current post makes no sense.
    Not wise to post Serial Number or IMEI in public forum - perhaps the
    hosts will edit them out.

  • Please Help !!!!! -- Regarding csscan report - 'lossy conversion' data

    Hello -
    1. We have one database (db1) with character set US7ASCII; it stores some special characters such as Microsoft Word's curly quotes (e.g. “abc”) and renders and displays them correctly from the Web application in the browser.
    2. We have another database (db2) with character set WE8ISO8859P1 that behaves the same as above.
    But we are facing the following issues -
    1. When we try to export the data from db1 (US7ASCII) and import it into db2 (WE8ISO8859P1), the Microsoft Word curly quotes are not converted correctly and are replaced with upside-down question marks. We tried different NLS_LANG options during export/import, but nothing seems to work.
    Question - I am not able to understand why the data conversion is a problem when both databases work fine with those special characters and store them fine. Besides, WE8ISO8859P1 is a strict superset of US7ASCII.
    2. We decided to alter the database character set of db1 (US7ASCII) to WE8ISO8859P1, as we need to store French characters in the future.
    WE8ISO8859P1 is a strict superset of US7ASCII and both are single-byte: 8 bits of storage for WE8ISO8859P1 and 7 bits for US7ASCII.
    We ran csscan and got a strange report with an exception telling us that some of the data would undergo 'lossy conversion'. It seems to me that this data could be those special characters like Microsoft Word's curly quotes.
    Questions -
    1. Why the 'lossy conversion' when the target is a superset of the current character set?
    2. When it says 'lossy conversion', what does it mean? Will the data be corrupted for those characters (e.g. turned into upside-down question marks)?
    3. How do we fix the 'lossy conversion' data before altering the database character set?
    I am very confused by all of the above... I really need help on how to proceed.
    Any help would be very much appreciated....
    Thanks
    Rama

    You appear to have invalid data in the database. Understanding how you can have invalid data in the database, and yet be able to have the applications appear to work, though, is a bit complicated.
    When data is being exchanged between the database and the client, there are two character sets to be concerned with -- the database character set and the client character set. If these two character sets are different, Oracle will automatically convert between the two. If the two character sets are identical, however, the data that gets passed between client and server is not validated at all -- the server assumes that the client is passing valid data.
    The problem comes, however, when the client passes in data which is not valid. If I set the NLS_LANG on my client to specify that I am sending ASCII data, and the database uses the US7ASCII character set, I can send and receive Chinese data without a problem, so long as every application knows the real character set, treats the data appropriately internally, and lies about the true character set of the data when communicating with the database. Obviously, though, you cannot validly store Chinese data-- you've basically turned the VARCHAR2 column into a binary field-- the problem is that the data will be corrupted if the application doesn't know how to decode it properly.
    In your case, the US7ASCII character set only encodes the first 128 characters (0-127), which does not include any of the special Microsoft characters. If you were to try to transfer those characters to an ISO 8859-1 database, the character set conversion that would be necessary would fail, so the data would be corrupted.
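    You can reproduce this loss outside the database. A small sketch, assuming the quotes really are the Windows characters U+201C/U+201D and that the JVM ships the windows-1252 charset (standard JREs do):
        import java.nio.charset.Charset;
        import java.nio.charset.StandardCharsets;

        public class LossyConversionDemo {
            public static void main(String[] args) {
                String curly = "\u201Cabc\u201D";   // "abc" wrapped in Microsoft curly quotes

                // Windows-1252 has code points for the curly quotes, so the round trip is lossless.
                Charset win1252 = Charset.forName("windows-1252");
                System.out.println(new String(curly.getBytes(win1252), win1252));            // “abc”

                // US-ASCII (like US7ASCII) cannot represent them; they are replaced by '?'.
                System.out.println(new String(curly.getBytes(StandardCharsets.US_ASCII),
                                              StandardCharsets.US_ASCII));                   // ?abc?

                // ISO 8859-1 does not contain them either (0x93/0x94 are control codes there),
                // which is why csscan flags these rows as lossy.
                System.out.println(new String(curly.getBytes(StandardCharsets.ISO_8859_1),
                                              StandardCharsets.ISO_8859_1));                 // ?abc?
            }
        }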
    Fixing the data corruption is relatively simple, provided you've understood the explanation above. You can, for example, write a simple OCI application that selects the columns with the Windows characters and writes them to a file. The client NLS settings would be US7ASCII but the file would, presumably, use the Windows-1252 encoding. Then, you would alter the database character set, corrupting the Windows characters. You could then load the file you created before altering the database character set, assuming you changed the client NLS settings to correctly identify the encoding of the data.
    Obviously, you will want to test this extensively before starting on your production database. You'll also probably require downtime to reload the data.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Currency conversions & date conversions

    Hi,
    Can anyone tell me about currency conversions & date conversions and where they are used? Routines are needed for that. Please tell me the scenarios for this.
    sekhar.

    Hi Chandra,
    We can do currency translation in:
    1. Transfer rules
    2. Update rules
    3. BEx Reporting
    We have to write code for this; we can also use some function modules for translating currency in transfer rules/update rules.
    go through these documents:
    Refer to Following document on currency translation in BEx
    https://wiki.sdn.sap.com/wiki/display/BI/Currencytranslationin+Bex
    check this out
    /people/rasim.manavoglu/blog/2007/04/30/currency-currency-currency
    http://help.sap.com/saphelp_nw04/helpdata/en/ec/076f3b6c980c3be10000000a11402f/frameset.htm
    Currency Translation Type
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6806e07211d2acb80000e829fbfe/content.htm
    Creating Currency Translation Types
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6813e07211d2acb80000e829fbfe/content.htm
    Editing Currency Translation Types
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6820e07211d2acb80000e829fbfe/content.htm
    Currency Translation in the Business Explorer
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a682de07211d2acb80000e829fbfe/content.htm
    Currency Translation in the Update
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6507e07211d2acb80000e829fbfe/content.htm
    Hope it helps you................
    Regards
    Chandra Sekhar T

  • Middle of loading conversion data

    Users are not able to load forms in the EBS environment.
    After our investigation we identified that only the CONVERSION user is able to log in to the application and load forms.
    No other user is able to load forms, but they are able to log in to the HTML screens.
    Since we are in the middle of loading conversion data into the system, we need a fast resolution.

    Hi;
    Did you enable trace to see what happens?
    Regards
    Helios

  • Indirect M currency conversion data

    Dear Friends
    I am facing a problem when receiving goods (MIGO): the system asks me to enter the indirect M currency conversion rate. Actually, I copied my company from a standard company whose currency is EUR.
    I changed my company currency to INR at the time of copying only. The PO etc. was generated in INR. I have not entered any transaction in EUR, nor do my vendors use this currency. I want to know:
    1. How to solve this problem.
    2. Do I need to make some more changes to my company currency?
    3. If so, in which fields should I do so?
    4. Is there any study material on this topic?
    Thanks in advance for the help.

    Hi, you have to check the setup in the IMG (e.g. go through the IMG node SAP NetWeaver - General Settings - Currencies).
    Keep in mind that exchange rate type M is hardcoded as "the" exchange rate type in the logistics area. If you need to adjust something in the exchange rate type, adjust the customizing of M - even though it might be overwritten during an upgrade!
    It is possible to assign an exchange rate type other than M in the customizing of FI & CO, but do not try to create and assign an exchange rate type of your own (to be used in logistics flows like sales order processing or purchasing) - you will never get it to work during month-end!
    To my knowledge this little trick is not documented - it's passed on from consultant to consultant!
    Rgs
    René

  • Conversion data type error at universe level

    Hi Friends,
    I am trying to convert an object's data type from number to char by using the cast function in Universe Designer. I am getting the error below. Please help me with this.
    Thanks
    Riaz

    Hi,
    Try changing the data type of the object at the business layer by selecting the related object, instead of changing the data type at the data foundation layer.
    Hope it helps.
    Grtz
    -Anila.

  • Loading historical conversion data

    Hi,
    I am trying to load Claims master data, with the history of the status, from a legacy system as a flat file into the PSA and then into a DSO (staging). From there I want to load the Claim_number master data, where the status is time-dependent. I have 166 status history records for 10 claims (I took a smaller subset) and they load fine into the PSA and the DSO. But when they are loaded into the master data, only the last status is updated and the history doesn't load. Can anyone help me here? I need to load 10 years of history into BI.

    The key fields in the DSO should be claim_no, date_from and date_to,
    so you should have multiple records for the same claim number with different date_to and date_from values.
    Check whether all of these are handled properly; then, when you load into the master data, you will get the historic data.

  • MDX to DAX Conversion - Data Mismatch

    Hi,
    I’m pulling data from a Tabular cube for a given time period as a filter. When I run an MDX query, it pulls around 9 million records in more than 6 hours. Since I need to run it frequently and quickly, I’ve converted the MDX to DAX (below). The DAX
    executes much faster (20 minutes) than the MDX, but the results of the two approaches are not the same and I’m struggling hard to find out why!
    All the dimensions used inside the SUMMARIZE clause have a relationship with the primary dimension/table. High-level observations:
    - The number of records differs between the DAX and MDX result sets.
    - The sum of each DAX measure differs from the sum of the corresponding MDX measure.
    - The DAX results contain no rows where all the measures are zero, whereas the MDX results do.
    - DAX omits many of the rows where all the measures except one are zero. It appears that the data mismatch is due to the omission of these records, but the reason for
    the omission is unknown.
    Any help will be appreciated!!
    MDX
    SELECT NON EMPTY {
        [Measures].[Calculated Measure 1],
        [Measures].[Calculated Measure 2],
        [Measures].[Calculated Measure 8]
    } ON COLUMNS,
    NON EMPTY { (
        [Dimension1].[DimAttribute1].children,
        [Dimension2].[DimAttribute1].children,
        [Dimension2].[DimAttribute2].children,
        [Dimension10].[DimAttribute1].children
    ) } ON ROWS
    FROM ( SELECT ( [Time].[Fiscal Month].&[May, 2014] ) ON COLUMNS
           FROM [Cube Name] )
    WHERE (Filter Condition)
    Converted DAX
    EVALUATE (
        CALCULATETABLE (
            ADDCOLUMNS (
                SUMMARIZE (
                    PrimaryDimension/PrimaryTableName,
                    Dimension1[Attribute1],
                    Dimension2[Attribute1],
                    Dimension2[Attribute2],
                    Dimension10[Attribute1]
                ),
                MeasureGroup1.[Calculated Measure 1],
                MeasureGroup2.[Calculated Measure 2],
                MeasureGroup4.[Calculated Measure 8]
            ),
            Time[Fiscal Month] = "May, 2014",
            Filter Condition
        )
    )
    Thanks,
    Amit

    Amit, the two queries are not semantically equivalent.
    SUMMARIZE returns rows that have at least one row in the table you pass as the first argument for the given combination of the columns you pass in the following arguments (think of a SELECT DISTINCT over all the columns you include, with an INNER JOIN between all
    the tables included by the columns you specified in the following arguments). You can read more about this here:
    http://www.sqlbi.com/articles/from-sql-to-dax-projection/
    The MDX query returns any existing combination of the cartesian product of the columns you included, and depending on the measures you include, you might see data also for combinations of column values that don't exist in the original source.
    The reason why you have a slow MDX is probably because of the cost of non empty evaluation for certain measures. The solution is probably to optimize the DAX code you are using in Tabular.
    The equivalent DAX statement should be something like
    EVALUATE
    ADDCOLUMNS (
        FILTER (
            CROSSJOIN (
                VALUES ( table1[columnA] ),
                VALUES ( table2[columnB] )
            ),
            [Measure] <> 0
        ),
        "Measure", [Measure]
    )
    But don't expect better performance comparing this to MDX. Only when you can make assumptions that allow you using SUMMARIZE you might see performance benefits, but if these assumptions are not correct, you can get different results (as you experienced).
    Marco Russo (sqlbi.com)

  • Data conversion/migration

    Hi,
    What kind of data conversion/migration are we looking at in plant maintenance and inventory management?
    Regards,
    Vijay

    Dear Vijay Srikanth,
    I think you are asking about all the activities in Plant Maintenance that can be automated using data conversion / data migration programs, whereby the user just uploads his data and the activity is carried out by the program. Please refer to the following documentation and search for the term 'Activities in Plant Maintenance'.
    http://help.sap.com/printdocu/core/print46c/en/data/pdf/MYSAP/SR_PM.pdf
    You will find all the required information there. Thank you.
    Regards,
    kartik

  • Execute SQL Task - UPDATE or Data Flow Data Conversion

    Good Morning folks,
    I want to know which is faster for converting the data type.
    I want to convert nvarchar(255) to datetime2,
    using T-SQL (Execute SQL Task):
    UPDATE TBL
    SET FIELD1 = CAST(FIELD1 AS DATETIME2)
    GO
    or a Data Conversion transformation (Data Flow)?
    Thanks.

    Itz Shailesh, my T-SQL has only one UPDATE, not many UPDATEs... so it's one batch, not 2, 3, 4... So it's only one update, e.g.: update table set field1 = cast(field1 as datetime2), field2 = cast(field2 as datetime2); not: update table set field1 = cast(field1
    as datetime2) go, then update table set field2 = cast(field2 as datetime2) go... understand?
    Yeah, I understand that you have to update just one field. What I am saying is that if you have to update millions of rows, then you should update them in batches (let's say batches of 50k). This way you only touch 50k rows at a time and not all the rows in the table.
    I see that you have fewer rows; however, I would still prefer the Data Conversion transformation over the UPDATE statement.
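    If you do go the T-SQL route on a large table, the batching idea can also be scripted outside SSIS. A rough sketch from Java/JDBC, assuming SQL Server, TRY_CAST (SQL Server 2012+), and a hypothetical separate target column FIELD1_DT2 that receives the converted values; the connection string is a placeholder:
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class BatchedConversion {
            public static void main(String[] args) throws Exception {
                try (Connection con = DriverManager.getConnection(
                        "jdbc:sqlserver://localhost;databaseName=MyDb;user=sa;password=...")) {
                    int affected;
                    do {
                        try (Statement st = con.createStatement()) {
                            // Convert at most 50,000 rows per statement so each batch keeps
                            // the transaction log and lock footprint small; rows that cannot
                            // be cast are skipped so the loop terminates.
                            affected = st.executeUpdate(
                                "UPDATE TOP (50000) TBL " +
                                "SET FIELD1_DT2 = TRY_CAST(FIELD1 AS DATETIME2) " +
                                "WHERE FIELD1_DT2 IS NULL " +
                                "AND TRY_CAST(FIELD1 AS DATETIME2) IS NOT NULL");
                        }
                    } while (affected > 0);
                }
            }
        }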

  • Data conversion from 212 format

    I have found a problem during data conversion.
    I have a file which contains ASCII characters (at least they look like ASCII in any editor). I have to convert them to integers, but I
    know that the data is organised as pairs of 12-bit numbers packed into
    byte triplets (format 212).
    I made a VI (which converts two consecutive ASCII characters to binary,
    then splits them into pairs of 4 bits, then does some rotation etc.), but the
    whole conversion takes too much time (converting a 64 kB file takes
    approx. 1 minute).
    Thank you for any suggestions,
    Michael ([email protected])

    Dear Michal,
    I hope this VI works faster.
    Input: 3 bytes, data type unsigned int (U8).
    Output: 2 values, data type integer (INT16).
    Of course, read the whole file into memory; don't read from the file 3 bytes at a time.
    Deniss
    ps. I send the VI by e-mail.
    "Michal Szaj" wrote in message
    news:[email protected]..
    > On Wed, 18 Sep 2002 02:27:25 GMT, Greg McKaskle
    > wrote:
    >
    > >This is one of those things that is quite sensitive to how it is coded.
    > > If you make your VI available, others, including myself can give you a
    > >hand with speeding it up.
    > >
    > >As for guessing at what might be going on, first look at the array wire.
    > >If you are using locals or property nodes to access the array, that is
    > >the cause. If the array is written to a global and later accessed via
    > >the global, that is causing it.
    > >
    > >I suspect that the bit manipulations on 64K should take something like a
    > >few seconds.
    > >
    > >Greg McKaskle
    >
    > Thank you for your support.
    >
    > In this case I have only one vi and I don't use any global/local
    > variables.
    > At the beginning of the vi I load all text character from file to
    > memory and then I'm making further operations.
    >
    > If it helps I can send the vi with data file.
    >
    > Thnx,
    >
    > Michael ([email protected])
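    For reference, this is roughly what the bit manipulation boils down to. A minimal sketch of unpacking format-212 byte triplets, assuming the common layout (byte 1 holds the low 8 bits of sample 1, the low and high nibbles of byte 2 hold the high 4 bits of samples 1 and 2, and byte 3 holds the low 8 bits of sample 2), and reading the whole file into memory first, as Deniss suggests; the file name is a placeholder:
        public class Format212 {
            // Decodes one 3-byte group into two signed 12-bit samples (two's complement).
            static short[] decodeTriplet(int b0, int b1, int b2) {
                int s1 = ((b1 & 0x0F) << 8) | (b0 & 0xFF);
                int s2 = ((b1 & 0xF0) << 4) | (b2 & 0xFF);
                if ((s1 & 0x800) != 0) s1 |= 0xF000;   // sign-extend from 12 to 16 bits
                if ((s2 & 0x800) != 0) s2 |= 0xF000;
                return new short[] { (short) s1, (short) s2 };
            }

            public static void main(String[] args) throws Exception {
                byte[] raw = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("data.212"));
                short[] samples = new short[(raw.length / 3) * 2];
                for (int i = 0, j = 0; i + 2 < raw.length; i += 3) {
                    short[] pair = decodeTriplet(raw[i] & 0xFF, raw[i + 1] & 0xFF, raw[i + 2] & 0xFF);
                    samples[j++] = pair[0];
                    samples[j++] = pair[1];
                }
                System.out.println(samples.length + " samples decoded");
            }
        }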
