Loading historical conversion data

Hi,
I am trying to load Claims master data, with the status history from the legacy system, as a flat file into the PSA and then into a staging DSO. From there I want to load it into the Claim_number master data, where status is a time-dependent attribute. I have 166 status history records for 10 claims (a smaller subset) and they load fine into the PSA and the DSO. But when the data loads into the master data, only the last status is updated and the history is lost. Can anyone help me here? I need to load 10 years of history into BI.

The key fields in the DSO should be CLAIM_NO, DATE_FROM and DATE_TO,
so you should have multiple records for the same claim number with different DATE_FROM and DATE_TO values.
Check that these are all handled properly; then, when you load into the master data, you will get the historical records.
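
As a quick sanity check before loading into the master data, you can verify that each claim's status intervals neither overlap nor leave gaps. Below is a minimal Python sketch, assuming the DSO contents can be exported as (claim_no, date_from, date_to, status) tuples; all names and dates are invented for illustration:

from collections import defaultdict
from datetime import date, timedelta

records = [
    ('CLM001', date(2001, 1, 1), date(2005, 6, 30), 'OPEN'),
    ('CLM001', date(2005, 7, 1), date(2008, 3, 31), 'IN_REVIEW'),
    ('CLM001', date(2008, 4, 1), date(9999, 12, 31), 'CLOSED'),
]

def check_history(records):
    # Group the intervals by claim and look for overlaps or gaps.
    by_claim = defaultdict(list)
    for claim, d_from, d_to, status in records:
        by_claim[claim].append((d_from, d_to, status))
    for claim, intervals in by_claim.items():
        intervals.sort()
        for (f1, t1, _), (f2, t2, _) in zip(intervals, intervals[1:]):
            if f2 <= t1:
                print(claim, 'overlap at', f2)
            elif f2 != t1 + timedelta(days=1):
                print(claim, 'gap between', t1, 'and', f2)

check_history(records)  # no output means the history is consistent

If the master data load keeps only the last status, a check like this often reveals that the records collapse onto one key, i.e. DATE_FROM and DATE_TO were not really part of the DSO key.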

Similar Messages

  • Load Historic Sales Data into SEM

    Hi experts,
    I have a requirement in my project where, as part of a system readiness check, we have been asked to "Load Historic Sales Data into SEM". Does anyone know how to proceed with this requirement?
    Has anyone dealt with this kind of requirement before?
    If yes, can you tell me the steps please?
    Whatever information you have, please share it with me.
    Useful answers will be rewarded.
    Thanks in advance,
    Abhinav Mahul.

    Hi Abhinav,
    Have you integrated your system with the BW system?
    If yes, then you need to upload all previous sales information from the CRM system to the BW system.
    Secondly, copy all the CRM sales order data residing in an InfoCube into the sales planning sales order InfoCube of SEM.
    This is what is meant by loading historic sales data into SEM.
    To do this you would require a SEM/BW consultant.
    You can use a function named 'COPY' in SEM-BPS to copy one sales InfoCube to another.
    Best Regards,
    Pratik Patel

  • Loading historical data with 0CO_PC_ACT_02

    Hello,
    I need to load historical values with InfoSource 0CO_PC_ACT_02. I can load the current period, but I get no data from R/3 for historical periods. When I run RSA3 on R/3 for a historical period, I get no data, so I don't believe this is a BW issue.
    My questions:
   1. Is there a job or something I need to run on R/3 to enable this DataSource to pull historical data? If so, what is it and how do I run it?
   2. Or is this DataSource simply not able to pull historical data?
    Thanks.
    Dave

    Hi All,
    I have the same issue. Has anyone found a workaround to load history data with this extractor (0CO_PC_ACT_02)?

  • Loading Historical data to the new field without deleting the data in cube

    Dear BI Experts,
    I have added a new field to a generic DataSource in BI 7.0.
    I need to load historical data into this newly appended field.
    As we have a very large data volume, it is not possible to delete the data and run the init again.
    Is there any other possibility to load the historical data for the newly appended field without deleting the old requests?
    Thanks for your Kind  help.
    Kind Regards,
    Sunil

    Dear Sushant,
    Thanks for your reply.
    But I am just wondering if there is any possibility of loading historical data for the new field using the remodeling concept without deleting old requests.
    I do not know much about the remodeling concept, but I have heard that it is not recommended.
    Can you please suggest and help?
    Can you please suggest and help.
    Thanks and Regards,
    Sunil Kotne

  • Reg: Loading historic data for the enhanced field

    Hello All,
    We need to add a new field, 0VENDOR, to our DataSource 0FI_GL_4. This field is available in the BSEG table, hence we are planning to go ahead with a DataSource enhancement.
    Now, please advise on how to update the historical data for this newly added field. I have heard there is a BW functionality/program to do so without deleting the entire data set. Kindly advise on the possible solutions.
    Thanks & Regards
    Sneha Santhanakrishnan

    HI Sneha,
    Using the remodeling option you will be able to do that, i.e. load historical data for new attributes without deleting existing data. But the limitation of remodeling is that you can only assign a constant, the value of another attribute, or values determined in a customer EXIT.
    Now, when you load data from the source system and you need the historical data as well, the best practice is to delete the existing data and reload it from the source system.
    But if you don't want to do that, I can offer one trick, though I am not sure whether it will work. The idea is to populate the historical values for 0VENDOR using the customer exit option of remodeling. To get the historical values in the customer exit you will need all that data in some BW table; here you can think of creating a generic extractor that stores all the documents and their respective vendors. As you load data from the source system you will get the historical values as well.
    Then read that table in the customer exit and populate the vendor value. This is a one-time process to populate the historical values.
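    To make the lookup idea concrete, here is a toy Python sketch of the one-time backfill logic (in BW this would sit in the remodeling customer exit in ABAP; the table and field names below are invented for illustration):

    # 'lookup' stands in for the generic extractor's table of
    # document number -> vendor; 'cube_rows' for the existing records
    # whose new 0VENDOR field is still empty.
    lookup = {'1000000001': 'VEND_A', '1000000002': 'VEND_B'}
    cube_rows = [
        {'doc_no': '1000000001', 'amount': 150.0, 'vendor': None},
        {'doc_no': '1000000002', 'amount': 200.0, 'vendor': None},
        {'doc_no': '1000000003', 'amount': 75.0,  'vendor': None},
    ]
    for row in cube_rows:
        # One keyed read per record, which is what the exit would do.
        row['vendor'] = lookup.get(row['doc_no'], '')  # blank when no match
    print(cube_rows)

    The important property is that the lookup is a one-time, key-based read; once the historical values are filled, the regular delta flow keeps 0VENDOR current.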
    Regards,
    Durgesh.

  • Init Load, Historical Data Load & Delta Load with 'No Marker Update'

    What is the difference between compression of the inventory cube with 'No Marker Update' for the init load, the historical data load and the delta load?
    Please help with this query.
    Thanks
    Gaurav

    Hi Gaurav,
    I believe you will find the answers here...
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0b8dfe6-fe1c-2a10-e8bd-c7acc921f366&overridelayout=true
    regards
    Pavel

  • Middle of loading conversion data

    Users are not able to load forms in the EBS environment.
    After investigating we identified that only the CONVERSION user is able to log in to the application and load forms.
    No other user is able to load forms, although they can log in to the HTML screens.
    Since we are in the middle of loading conversion data into the system, we need a fast resolution.

    Hi;
    Did you enable trace to see what happens?
    Regards
    Helios

  • Loading Labview Binary Data into Matlab

    This post explains the LabVIEW binary data format. I couldn't find all of this information in any one place, so this ought to help anyone in the future. I didn't want to add any overhead in LabVIEW, so I did all of my conversion in MATLAB.
    The LabVIEW VI "Write to Binary File" writes data to a file in a linear format using big-endian numbers of the type wired into the VI. The array dimensions are listed before the actual array data.
    fid = fopen('BinaryData.bin','r','ieee-be'); % Open the binary file as big-endian
    Dim1 = fread(fid,1,'uint32'); % Read the first dimension (4 bytes)
    Dim2 = fread(fid,1,'uint32'); % Read the second dimension
    Dim3 = ...
    Each dimension's length is specified by 4 bytes, with the first, second, third, and fourth bytes weighted 2^24, 2^16, 2^8, and 1 respectively. The bytes 0 0 2 38 equate to 2*256 + 38 = 550 values for that particular dimension, which is exactly what reading with 'uint32' computes for you.
    As long as you know the number of dimensions and precision of your binary data you can load it.
    Data = fread(fid,prod([Dim1 Dim2 Dim3]),'double',0,'ieee-be'); % Load double precision data
    If you have appended multiple arrays to the same file in LabVIEW, you repeat this procedure: load each dimension, then load the data, and repeat.
    Data = fread(fid,prod([Dim1 Dim2 Dim3]),'int8',0,'ieee-be'); % Load int8 precision data or boolean data
    I had to create a function for my own purposes, so I thought I'd share it with everyone else too. I uploaded it to the MATLAB File Exchange; the file is named labviewload.m.
    This was tested on MATLAB R2007a and LabVIEW 8.2.

    Thanks. I had the same questions when I tried to load LabVIEW binary data into MATLAB.
    -John

  • Inventory Cube - Historical Movement Data

    Hi experts
    In the inventory cube 0IC_C03 there seems to be some problem with the historical movement data. Now I want to execute the query ignoring the historical data. My questions are:
    1. How can I ignore the historical movements and execute a query on the inventory cube?
    2. Can I still get the correct closing balance?
    Thanks in advance.
    Dinesh Sharma

    Hi,
    For 0IC_C03, you can't execute the reports without proper historical data; if you run reports without proper data loads, you won't get correct results. Because all the key figures are non-cumulative key figures, you must have complete data in 0IC_C03. It is not like the 0SD_C03 cube, where you can restrict to a month or a day and then run the reports; in the case of 0IC_C03 the data must be complete.
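    Since the key figures are non-cumulative, a balance is never stored per day; it is always derived from a reference point plus or minus all movements in between. A toy Python sketch of why a gap in the historical movement data corrupts every balance computed across it (names and numbers invented for illustration):

    movements = [
        ('2011-01-05', +100),  # goods receipt
        ('2011-01-20', -30),   # goods issue
        ('2011-02-02', -10),
    ]
    def closing_balance(up_to, opening=0):
        # The balance is reconstructed from movements, not read directly.
        return opening + sum(qty for day, qty in movements if day <= up_to)
    print(closing_balance('2011-01-31'))  # 70
    print(closing_balance('2011-02-28'))  # 60

    Drop one movement from the list and every balance after that date is wrong, which is why restricting the query to a period cannot rescue a cube with incomplete history.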
    See the following URLs.
    Treatment of historical full loads with Inventory cube
    Setting up material movement/inventory with limit locking time
    Thanks
    Reddy

  • SPRUNConversion and historic conversion

    Hi All,
    We are using the SPRUNCONVERSION stored procedure for the historic conversion.
    We assign a historic currency rate to each single entity, but when SPRUNCONVERSION runs it converts only at the currency rate of the default entity.
    It works in BO PC 7.0 MS; we have the problem in BO PC 7.5 MS with the same business rules.
    Environment
    Single Server - 64bit
    BPC Version: 7.5 SP4
    SQL Server: 2008 SP2
    Thanks for help
    Simone

    Hello again,
    Have you any idea about the meaning of this message from SPRUNCONSO?:
    SPRunConso Version 2.07
    ERROR CSD-150 Problem extracting Data : C_REPART
    ERROR CSD-160 Problem extracting Data : C_CONSO
    It looks a little strange, as it concerns only some groups, while the top group with all the companies consolidated works fine.
    For example:
    OK:
    Global company
    |__A method 86 - POWN 100% - PGROUP 1
    |__B method 86 - POWN 100% - PGROUP 1
    |__C method 86 - POWN 100% - PGROUP 1
    NOK, with the message above:
    Sub_conso
    |__A method 86 - POWN 100% - PGROUP 1
    |__B method 86 - POWN 100% - PGROUP 1
    Thanks again
    Pascal

  • Please Help!!! -- Regarding csscan report - 'lossy conversion' data

    Hello -
    1. We have one database (db1) with character set US7ASCII. It stores some special characters, like Microsoft Word's curly quotes (e.g. “abc”), and renders and displays them correctly from the web application in the browser.
    2. We have another database (db2) with character set WE8ISO8859P1 that behaves the same as above.
    But the issues we face are as follows:
    1. When we try to export the data from db1 (US7ASCII) and import it into db2 (WE8ISO8859P1), the import cannot convert Microsoft Word's curly quotes correctly and replaces them with upside-down question marks. We tried different NLS_LANG options during export/import, but nothing seems to work.
    Question - We cannot understand why the data conversion fails when both databases work fine with those special characters and store them fine; besides, WE8ISO8859P1 is a strict superset of US7ASCII.
    2. We decided to alter the database character set of db1 (US7ASCII) to WE8ISO8859P1, as we need to store French characters in the future.
    WE8ISO8859P1 is a strict superset of US7ASCII and both are single-byte: 8 bits of storage for 8859P1 and 7 bits for US7ASCII.
    We ran csscan and its report raised an exception saying that some of the data would undergo 'lossy conversion'. It seems to me that this data could be those special characters like Microsoft Word's curly quotes.
    Questions -
    1. Why the 'lossy conversion' when the target is a superset of the current character set?
    2. When it says 'lossy conversion', what does it mean? Will the data be corrupted for those characters (turned into upside-down question marks)?
    3. How do we fix the 'lossy conversion' data before altering the database character set?
    I am very confused by all of the above and really need help on how to proceed.
    Any help would be very much appreciated.
    Thanks
    Rama

    You appear to have invalid data in the database. Understanding how you can have invalid data in the database, and yet be able to have the applications appear to work, though, is a bit complicated.
    When data is being exchanged between the database and the client, there are two character sets to be concerned with-- the database character set and the client character set. If these two character sets are different, Oracle will automatically convert between the two. If the two character sets are identical, however, the data passed between client and server is not validated at all-- the server assumes that the client is passing valid data.
    The problem comes, however, when the client passes in data which is not valid. If I set the NLS_LANG on my client to specify that I am sending ASCII data, and the database uses the US7ASCII character set, I can send and receive Chinese data without a problem, so long as every application knows the real character set, treats the data appropriately internally, and lies about the true character set of the data when communicating with the database. Obviously, though, you cannot validly store Chinese data-- you've basically turned the VARCHAR2 column into a binary field-- the problem is that the data will be corrupted if the application doesn't know how to decode it properly.
    In your case, the US7ASCII character set only encodes the first 128 characters (0-127), which does not include any of the special Microsoft characters. If you were to try to transfer those characters to an ISO 8859-1 database, the character set conversion that would be necessary would fail, so the data would be corrupted.
    Fixing the data corruption is relatively simple, provided you've understood the explanation above. You can, for example, write a simple OCI application that selects the columns with the Windows characters and writes them to a file. The client NLS settings would be US7ASCII but the file would, presumably, use the Windows-1252 encoding. Then, you would alter the database character set, corrupting the Windows characters. You could then load the file you created before altering the database character set, assuming you changed the client NLS settings to correctly identify the encoding of the data.
    Obviously, you will want to test this extensively before starting on your production database. You'll also probably require downtime to reload the data.
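    The mechanism is easy to demonstrate outside Oracle. Here is a short Python sketch of the "lying client" situation Justin describes, with Windows-1252 standing in for the client's real encoding:

    text = '\u201cabc\u201d'              # "abc" wrapped in Word's curly quotes
    raw = text.encode('cp1252')           # bytes 0x93 ... 0x94, both above 127
    # A US7ASCII database that never validates simply stores the raw bytes,
    # and a client that decodes them as cp1252 sees the quotes again:
    print(raw.decode('cp1252'))
    # But a genuine US7ASCII -> ISO 8859-1 conversion must reject bytes > 127,
    # which is exactly the 'lossy conversion' csscan warns about:
    print(raw.decode('ascii', errors='replace'))  # replacement characters: data lost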
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • How to Load Historical values from Material Prices

    Hi
    I have to load historical values for material prices.
    I'm using extractors 0CO_PC_ACT_02 and 0CO_PC_ACT_05.
    Do you know how I can do this in an easy way?
    Help is appreciated.
    thanks in advance.
    best regards

    Hi Joao,
    The extractor 0CO_PC_ACT_05 takes its data from table MBEW (data for the current period, the previous period and the last period of the previous year).
    If the material ledger is active, it takes the information from table CKMLHD for those materials.
    It does not appear to fetch from MBEWH.
    Please validate my comments above with a sample extraction for one material and plant.
    If that is the case, I would suggest you create a custom extractor based on MBEWH to get the historic price data; with 0CO_PC_ACT_05 you can then get the prices for each previous period as it completes.
    0CO_PC_ACT_02 gets you the inventory value, not the price, in the same way.
    You might also have to consider 0CO_PC_ACT_06 and 0CO_PC_ACT_07 if you have project stock and sales order stock in your business scenario.
    Please let me know, how this goes.
    Thanks,
    Krishnan.

  • Loading new master data

    Hi experts!!!
    I am going to load new master data into our BI system. Our BI consists of some cubes and DSOs which hold data from the past 3 years, based on the old master data. Now the master data has completely changed to new values and will be loaded into our BI. I have the following doubt about this:
    For the past three years, the transaction data loaded into our cubes and DSOs has been based on the old master data. If I now load the NEW master data values, what happens to the old transaction data loaded up to today? I can imagine that once I load the new values, transaction data will from then on be based on the new values only. The question is what happens to the old transaction data?
    Please consider both cases, time-independent master data and time-dependent master data, when reading my question.
    Thanks for your time.

    Hi,
    Case 1 (the normal case, happens frequently): only the attributes of the master data have changed.
    Your transaction data would reflect the old master data where you have master data lookups or reads on the master data tables (the P and Q tables for time-independent and time-dependent MD respectively) populating master data in the transaction data flow.
    If just the attributes of master data objects like Customer/Material have changed and these are used at query level, the queries will show the new attribute values. If the time validity has changed, the new time validity takes effect. For example, person X is responsible for cost center A from 01/01/2010 till 31/12/2012. Now the MD is changed and person Y is responsible for cost center A from 01/01/2011 till 31/12/2015. If the report is run now, person Y is shown in the report.
    Case 2: the master data keys have big changes in the source system, e.g. customer 10000 is now 11111 or material 12345 is now 112233.
    In these cases your historical transaction data would still show the old master data.
    For example, person X is responsible for cost center A from 01/01/2010 till 31/12/2012. Now cost center A is changed to B and person Y is responsible for the same period. Old transactions with cost center A show X as responsible, and new transactions with cost center B show Y as responsible.
    Even if your master data has changed completely, there is no effect on the existing transaction data: the new master data entries are written to the tables and the existing master data remains.
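    For the time-dependent case, here is a small Python sketch of how the Q-table lookup behaves after the change (intervals invented to mirror the cost-center example; when the new record is loaded, BW adjusts the validity of the old one):

    from datetime import date
    # (cost_center, date_from, date_to, responsible) after the new MD load
    q_table = [
        ('A', date(2010, 1, 1), date(2010, 12, 31), 'X'),  # validity shortened
        ('A', date(2011, 1, 1), date(2015, 12, 31), 'Y'),
    ]
    def responsible_on(cost_center, key_date):
        # Pick the record whose validity interval contains the key date.
        for cc, d_from, d_to, person in q_table:
            if cc == cost_center and d_from <= key_date <= d_to:
                return person
        return None
    print(responsible_on('A', date(2011, 6, 1)))  # 'Y', as in the example above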

  • UCCX 7.0.1SR5 to 8.0 upgrade while also adding LDAP integration for CUCM - what happens to agents and Historical Reporting data?

    Current State:
    •    I have a customer running CUCM 6.1 and UCCX 7.0.1 SR5. Currently their CUCM is *NOT* LDAP integrated and uses local accounts only. UCCX is AXL integrated to CUCM as usual, pulling users from CUCM and using CUCM for login validation for CAD.
    •    The local user accounts in CUCM currently match the naming format in Active Directory (John Smith is jsmith in CUCM and jsmith in AD).
    Goal:
    •    Upgrade software versions and migrate to new hardware for UCCX
    •    LDAP integrate the CUCM users
    Desired Future State and Proposed Upgrade Method
    Using the UCCX Pre Upgrade Tool (PUT), back up the current UCCX 7.0.1 server.
    Then, during a weekend maintenance window:
    •    Upgrade the CUCM cluster from 6.1 to 8.0 in 2 step process
    •    Integrate the CUCM cluster to corporate active directory (LDAP) - sync the same users that were present before, associate with physical phones, select the same ACD/UCCX line under the users settings as before
    •    Then build UCCX 8.0 server on new hardware and stop at the initial setup stage
    •    Restore the data from the UCCX PUT tool
    •    Continue setup per documentation
    At this point does UCCX see these agents as the same as they were before?
    Is the historical reporting data the same with regards to agent John Smith (local CUCM user) from last week and agent John Smith (LDAP imported CUCM user) from this week ?
    I have the feeling that UCCX will see the agents as different almost as if there is a unique identifier that's used in addition to the simple user name.
    We can simplify this question along these lines:
    Starting at the beginning with CUCM 6.1 (local users) and UCCX 7.0.1, let's say the customer decided to LDAP integrate the CUCM users and not upgrade any software.
    If I follow the same steps, re-associating the users to devices and selecting the ACD/UCCX extension, what happens?
    I would guess that UCCX would see all the users it knew about get deleted (making them inactive agents) and then see a whole group of new agents get created.
    What would historical reporting show in this case?  A set of old agents and a set of new agents treated differently?
    Has anyone run into this before?
    Is my goal possible while keeping the agent configuration and HR data as it was before?

    I was doing some more research, looking at the DB schema for UCCX 8.
    Looking at the Resource table in UCCX, it appears there is a primary key that represents each user.
    My question: is this key replicated from CUCM or created locally when the user is imported into UCCX?
    How does UCCX determine that user account jsmith in CUCM, when it's a local account, is different from user account jsmith in CUCM when it is LDAP imported?
    Would it be possible (most likely with TAC's help) to edit this field back to the previous values so that AQM and historical reporting would treat the user accounts as the same?
    Database table name: Resource
    The Unified CCX system creates a new record in the Resource table when the Unified CCX system retrieves agent information from the Unified CM.
    A Resource record contains information about the resource (agent). One such record exists for each active and inactive resource. When a resource is deleted, the old record is flagged as inactive; when a resource is updated, a new record is created and the old one is flagged as inactive.
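    The documented behavior is essentially a slowly-changing record: updates never overwrite a row, they insert a new one and flag the old one inactive. A toy Python model of why a re-imported jsmith ends up as a second agent (field names invented for illustration):

    from itertools import count
    resources = []   # each entry: {'resource_id': ..., 'login': ..., 'active': ...}
    new_id = count(1)
    def import_agent(login):
        # Flag any existing active record inactive, then insert a fresh one.
        for row in resources:
            if row['login'] == login and row['active']:
                row['active'] = False  # old record kept for historical reports
        resources.append({'resource_id': next(new_id), 'login': login, 'active': True})
    import_agent('jsmith')   # local-CUCM-user era
    import_agent('jsmith')   # after LDAP integration: a new resource_id
    print(resources)         # historical reporting now sees two distinct agents

    If the key is generated locally on import, only re-pointing it (as you suggest, with TAC's help) would make the old and new accounts line up.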

  • Import conversion data table from SAP R/3 into value mapping table in XI

    Hi:
    Does somebody know how to import a table with conversion data from SAP R/3 into a value mapping table in XI?
    The purpose is to use a mapping table that can change in the future. Must I use an ABAP program that retrieves the data and builds the value mapping table?
    If so, how do I specify the group ID, the scheme, the agency and the corresponding value in the ABAP program?
    Please help me.
    Regards!

    Hi David,
    please refer to this section in the help: http://help.sap.com/saphelp_nw04/helpdata/en/2a/9d2891cc976549a9ad9f81e9b8db25/content.htm
    There is an interface for mass replication of mapping data. The steps you need to carry out to use this are:
    Activities:
    To implement a value-mapping replication scenario, proceed as follows:
    1. Register the Java (inbound) proxies.
    To do so, call the following URLs in the following order in your Internet browser:
    - http://:/ProxyServer/register?ns=http://sap.com/xi/XI/System&interface=ValueMappingReplication&bean=localejbs/sap.com/com.sap.xi.services/ValueMappingApplication&method=valueMappingReplication (for the asynchronous replication scenario)
    - http://:/ProxyServer/register?ns=http://sap.com/xi/XI/System&interface=ValueMappingReplicationSynchronous&bean=localejbs/sap.com/com.sap.xi.services/ValueMappingApplicationSynchronous&method=valueMappingReplicationSynchronous (for the synchronous replication scenario)
    You only need to perform this step once (for each installation).
    2. Application programming.
    The ABAP program must perform the following tasks:
    - Read the value mapping data from the external table
    - Call the outbound proxy used to transfer the data to a message, which is then sent to the Integration Server
    3. Configuration of the replication scenario in the Integration Directory.
    This involves creating all the configuration objects you need to execute the scenario successfully. One special aspect of the value-mapping replication scenario is that the receiver is predefined (it must be on the Integration Server). The sender, however, is not predefined in the replication scenario and can be defined to meet your individual requirements.
    For example, you can use the shipped ABAP proxies.
    In the case of the receiver communication channel, choose the adapter type XI. Ensure that you configure a channel for the Java proxy receiver in this case.
    Enter the path prefix /MessagingSystem/receive/JPR/XI for this purpose.
    Regards
    Christine
