Delta Load to Infocube

Hi,
I am facing an issue with a delta load to an InfoCube.
Let me give you the scenario - I have a standard InfoCube and a DSO.
Data gets loaded to the InfoCube daily using delta, with a filter on sy-datum.
A new key figure was recently added to both the DSO and the InfoCube.
Our requirement is to populate history for that specific key figure in the DSO, and subsequently in the cube, without duplicating data for the already existing key figures.
Hence, we have created a self loop on the DSO updating only that specific key figure.
Before loading the data into the DSO, when I set the already existing DTP (the one that uses sy-datum as its filter) to "Initial Update Set", it gives an error saying "Overlapping selection criteria for DTP".
Does this mean we cannot change the settings of an already existing DTP?
If I do a full load, data will get duplicated for the past dates, so delta is the only option.
But how do I go about the delta then? (For reference, a sketch of a typical date filter routine for such a DTP follows this post.)
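
A minimal sketch, assuming the sy-datum filter is implemented as a routine on the date field (0CALDAY): the skeleton below follows the frame BW typically generates for InfoPackage/DTP filter routines (l_t_range, p_subrc), and the field name 'CALDAY' is an assumption that may differ in your DTP.

  " Hedged sketch of a date filter routine that restricts the daily
  " delta to today's date. Frame and field name are assumptions.
  DATA: l_idx LIKE sy-tabix.

  READ TABLE l_t_range WITH KEY fieldname = 'CALDAY'.
  l_idx = sy-tabix.

  l_t_range-fieldname = 'CALDAY'.
  l_t_range-sign      = 'I'.
  l_t_range-option    = 'EQ'.
  l_t_range-low       = sy-datum.   " restrict the selection to today

  IF l_idx <> 0.
    MODIFY l_t_range INDEX l_idx.
  ELSE.
    APPEND l_t_range.
  ENDIF.

  p_subrc = 0.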

Jaya,
If I understand correctly: data gets loaded from the DSO to the cube through delta with some selection, and a new key figure was added to the DSO and the cube... right?
To populate the new key figure, a self loop was created (DSO --> DSO) with the required mappings (key fields and the new key figure).
Just do a full load from DSO --> DSO and push the delta to the cube. There is no need to keep any filters while loading from DSO to DSO - populate the new key figure for the entire data set. When pushing to the cube, the filter is already available in the DTP (use the existing delta mechanism from DSO to cube). No full load from DSO to cube is needed.
Hope it helps
Srini

Similar Messages

  • "Error occurred in the data selection" on init delta load ODS- InfoCube

    Hi, gurus and experts!
    I'm trying to do an init delta load from the 0FIAR_O03 ODS into the 0FIAR_C03 InfoCube in the PRD environment, using the InfoPackage option "Initialize with delta transfer (with data transfer)". Immediately after the load was started I got the error
    "Error occurred in the data selection"
    with the details
    "Job terminated in source system --> Request set to red".
    There are no short dumps. There is one activated init load in the ODS - nearly 6,500,000 records.
    Any ideas about the error, and how I can load the data?

    Hi Gediminas Berzanskis,
    I faced a similar error when I tried to load data from an ODS containing a huge amount of data into a cube. In your case too the data volume is large (around 6.5 million records). The error could be due to a table space issue, so please ask your Basis team to check whether enough table space exists for both data targets.
    Meanwhile, you may also check the RFC connection of the Myself source system.
    You can replicate the DSO's DataSource once and then reactivate the transfer rules using the program RS_TRANSTRU_ACTIVATE_ALL (a one-line wrapper for running it is sketched after this reply).
    Try the load with a small amount of data using the full load option first, and then with the delta option.
    Hope this Helps,
    Prajeevan (XLNC)
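
    If it helps, here is a minimal sketch for starting that program, assuming you wrap it in a small Z report rather than running it directly in SE38; the selection screen is displayed on purpose because its parameters vary between BW releases and should be checked before running in PRD. The report name is hypothetical.

      REPORT z_run_transtru_activation.

      " Hedged sketch: start the transfer-structure activation program
      " mentioned above, let the user fill its selection screen, and return.
      SUBMIT rs_transtru_activate_all
        VIA SELECTION-SCREEN
        AND RETURN.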

  • Delta loading procedure from Write Optimized DSO to Infocube

    Hi All,
    We are using a write-optimized DSO in our project, to which I am loading data using the standard DSO 0FI_GL_12.
    From the write-optimized DSO we are loading delta records into an InfoCube. Please provide your input on the following questions:
    1) I am quite interested to know how delta records get loaded into the InfoCube when we use a write-optimized DSO, as there is no image concept in a write-optimized DSO.
    Example: if I am using a standard DSO, we have the change log table, and the image concept allows the updated value to reach the cube.
    Let us assume:
    Active table
    111            50
    111            70 (overwrite)
    Change log table
    111            -50        ('X' -- before image)
    111             70        (' ' -- after image; the symbol for the after image is a space)
    So if we load this as a delta to the target, the above two records from the change log table get loaded to the cube, and the cube will have 70 as the updated value.
    If I am using a write-optimized DSO:
    Active table
    111            50
    111            70 (overwrite)
    When these records are loaded to the cube, since the InfoCube is always additive, the total value will be 50 + 70 = 120, which is wrong.
    Please correct me - what feature will give the updated value of '70' in the cube from the write-optimized DSO?
    2) As the DataSource is delta-capable with an 'ADDITIVE' delta process, are only the delta records based on REQUEST ID loaded into the InfoCube with the updated key figure value?
    Thanks for your inputs and much appreciated.
    Regards,
    Madhu

    Hi Madhu,
    As a best practice, we use a WODSO in the initial layer and then a standard DSO, purely for mass data load/staging purposes.
    Best practice: DataSource ----> WODSO ----> standard DSO
    In your case: DataSource ----> standard DSO ----> WODSO.
    In both cases, if the data load design is not accurate, your cube will have incorrect entries.
    For example: today 9 am: 111, 50 (in the active table).
    Data load to the cube, same day 11 am: the cube will then have 111, 50.
    Same day, the value changes in the standard DSO at 1 pm: 111, 70 (overwrite in the active table).
    If you load data to the cube the same day or the next day, it will have 2 records, one with value 50 and the other with 70. So to avoid such scenarios we should plan the loads accurately, or else change the DTP settings to use the change log table as the source.
    Coming to your case:
    After the load to the standard DSO, load the data to the WODSO by changing the DTP setting 'Delta Init. Extraction from' to: Change log.
    Now the data is available in the WODSO from the change log table; then you load it to the cube in delta mode (the short illustration below shows why the change-log records net out correctly in an additive cube).
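
    A minimal, purely illustrative sketch (plain ABAP, not generated BW code) of why the change-log pair of before image and after image keeps an additive InfoCube correct:

      " Illustration only: how the change-log records for key 111 net out
      " in an additive InfoCube.
      DATA: lv_cube_value TYPE p LENGTH 15 DECIMALS 2 VALUE 50.  " already in the cube

      lv_cube_value = lv_cube_value - 50.  " before image (-50) reverses the old value
      lv_cube_value = lv_cube_value + 70.  " after image (+70) adds the new value

      " lv_cube_value is now 70, matching the active table - whereas adding
      " the two write-optimized active-table rows (50 and 70) would give 120.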

  • Delta Load on DSO and Infocube

    Hi All,
            I would like to know the procedure for the scenario mentioned below.
    Example Scenario:
    I have created a DSO with 10 characteristics and 3 key figures. I have 10 customers whose transactions happen every day. A full upload to the DSO was done on 7th October 09. How can I load their changing data from 8th Oct to date into the DSO? And what would the situation be for the same in the case of an InfoCube?
    Step by step guidance will be a great help
    Thanks in advance
    Liquid

    Hi,
    The key fields play an important role at the DSO level, as they give you the ability to overwrite records. Once that is done at the DSO level, you can simply carry on with a delta load into the cube from your change log table, and you don't have to worry about anything. Just to add: all the characteristics in the cube act as key fields, so you will get a new record for each different combination of values; only the key figures will sum up for a set of identical characteristics.
    Thanks,
    Arminder

  • Can we use both 0FI_AP_3 and 0FI_AP_4 for Delta Loads at the same time.....

    Hi Gurus:
    Currently my company uses 0FI_AP_3 for some A/P reporting. It has been heavily customized and uses delta loading. However, SAP recommends the use of 0FI_AP_4 for A/P data for delta loads. I was able to activate 0FI_AP_4 as well and do some full loads in the Dev/Test boxes. The question is whether I can use both extractors for delta loads at the same time. If there is any issue with that, what is it and how can I resolve it? Is the use of only one extractor recommended?
    Please let me know, as this impacts a lot of my development. Thanks.
    Best, ShruMaa
    PS: I had posted this in the "BI Extractors" forum but there has been no response. Hope to get some response here! Thanks.

    Hi,
    I would recommend you use 0FI_AP_4 rather than both, for several reasons:
    1. DataSource 0FI_AP_4 replaces DataSource 0FI_AP_3 and still uses the same extraction structure. For more details refer to OSS note 410797.
    2. You can run 0FI_AP_4 independently of other FI DataSources such as 0FI_AR_4 and 0FI_GL_4, or even 0FI_GL_14. For more details refer to OSS note 551044.
    3. Map 0FI_AP_4 to the DSO 0FIAP_O03 (or create a Z one as per your requirement).
    4. Load the same to an InfoCube (0FIAP_C03).
    Hope this helps.
    Thanks.
    Nazeer

  • Issue with Delta Load in BI 7.0... Need resolution

    Hi
    I am having difficulty with a delta load which uses a generic extractor. The generic extractor is based on a view of two tables. I use the system date to perform the delta load; if the system date increases by a day, the load is expected to pick up the extra records. One of the tables used in the view (for master data) does not contain the system date.
    The data does not even come up to the PSA. It keeps saying there are no records. Is it because I loaded the data for yesterday and am manually adding today's data?
    Not sure what the cause of the delta failing is.
    Appreciate any suggestions to take care of the issue.
    Thanks.... SMaa

    Hi
    A generic DataSource supports the following delta types:
    1. Calendar day
    2. Numeric pointer
    3. Timestamp
    Calendar day - the delta is based on a calendar day; we can run the delta only once per day, preferably at the end of the day, to minimize missed delta records.
    Numeric pointer - this type of delta is suitable only when we are extracting data from a table that supports only the creation of new records / changes to existing records.
    It supports:
    Additive delta: with this delta type, each record to be loaded returns only the respective changes to key figures, for key figures that can be aggregated. The extracted data is added up in BI (targets: DSO and InfoCube).
    New status for changed records: with this delta type, each record to be loaded returns the new status of all key figures and characteristics. The values in BI are overwritten (targets: DSO and master data).
    Timestamp - using a timestamp we can run the delta multiple times per day, but we need to use a safety lower limit and safety upper limit of at least 5 minutes. (For example, with a safety lower limit of 5 minutes, a delta run at 10:00 only selects records with timestamps up to 09:55; the remaining 5 minutes are picked up by the next run.)
    As you specified, the DataSource is based on a view of two tables, one containing the date and the other not.
    Kindly check the points above and verify whether the view (primary key) can be used to determine the delta for your DataSource.
    Also let us know whether any standard SAP tables are used in creating the view and, if so, what the size of the daily DataSource load is.
    Thanks.
    Nazeer

  • Loading through Process Chains 2 Delta Loads and 1 Full Load (ODS to Cube).

    Dear All,
    I am loading through process chains with 2 delta loads and 1 full load from ODS to cube in 3.5. I am in the development process.
    My loading process is:
    Start - 2 delta loads - 1 full load - ODS activation - Delete index - Further update - Delete overlapping requests from InfoCube - Create index.
    My question is:
    When I load for the first time I get some data, and for the next load I should get zero records as there is no new data, but I am getting the same number of records again. Maybe it is taking the data from the full upload, I guess. Please guide me.
    Krishna.

    Hi,
    The reason you are getting the same number of records is, as you said, the full load: after running the deltas you got all the changed records, but after those two deltas you again have a full load step, which picks up the whole of the data all over again.
    Other reasons you might be getting the same number of records:
    1> You are running the chain for the first time.
    2> You ran these delta InfoPackages for the first time. If you initialized the deltas with "Initialization without data transfer", then when you ran the deltas for the first time they picked up the whole of the data. Running a full load after that will pick up the same number of records too.
    If the two deltas you mention run one after another, then they got data because of some changes; since you are loading from a single ODS to a cube, both your delta and your full load will pick up the same data "for the first time" during data marting, because they have the same DataSource (the ODS).
    Hopefully this serves your purpose.
    Thax & Regards
    Vaibhave Sharma

  • Problem in running Delta load for 2LIS_13_VDKON

    Hi Friends,
    I am working on the LO DataSource 2LIS_13_VDKON. I have filled the setup tables for the init through OLI9BW; there were 2,389,443 records.
    Since in BW these records multiply depending on the condition types, I didn't want unnecessary records, so I ran multiple inits in BW with selection criteria, as I wanted data only from 01-Jan-10. Having done that, I now want to run deltas, but the BW system is not letting me. If I choose Delta in the InfoPackage, the selection criteria tab adds up all those init selection criteria that I have run, and I can't change them (they become non-editable). Why is it adding up all those selection criteria in the delta InfoPackage?
    I want to fetch all deltas from the day I ran OLI9BW. How do I run the delta for records updated after the OLI9BW run?
    Thanks!

    Hi,
    Follow these steps. They are written for the SD module, but for your DataSource simply change the Tcode used to fill the setup tables and replace the SD DataSource with your DataSource.
    1. First install the DataSource in RSA5, check it in RSA6 and activate it in LBWE.
    Before doing steps 2 to 6, lock the ECC system, i.e. make sure no transactions happen.
    2. Then delete the queues in LBWQ, like below:
         MCEX11  --> For 2LIS_11_* 
         MCEX12  --> For 2LIS_12_* 
         MCEX13  --> For 2LIS_13_* 
      Be careful while doing all these deletions on production servers.
    3. Then delete any entries that exist in RSA7, e.g.:
         Eg:
         2LIS_11_*
         2LIS_12_*
         2LIS_13_*
    4. Then delete the setup tables using the LBWG Tcode, selecting the application number, i.e. 11, 12 or 13.
    5. Then fill the setup tables using OLI7BW, OLI8BW and OLI9BW.
       Give a name of run = XYZ and a termination date = tomorrow's date, and execute it in the background,
       i.e. Program --> Execute in Background.
       2LIS_11_*  Use OLI7BW Tcode to fill the Setup tables
       2LIS_12_*  Use OLI8BW Tcode to fill the Setup tables
       2LIS_13_*  Use OLI9BW Tcode to fill the Setup tables
    At the time of setup table filling, no entries should exist in LBWQ in ECC for the following queues:
         MCEX11  --> For 2LIS_11_* 
         MCEX12  --> For 2LIS_12_* 
         MCEX13  --> For 2LIS_13_* 
    6. Check the job status in SM37; once it finishes, go to RSA3, execute it and check.
    7. Then replicate the DataSource in BW.
    8. Install the InfoCube/DSO from Business Content or create your own, and then map the ECC DataSource fields to the BW InfoObjects in the transfer rules (in BW 3.5) or transformations (in BI 7.0).
    9. Map the InfoObjects of the InfoSource to the InfoObjects of the InfoCube/DSO in the update rules (in BW 3.5) or transformations (in BI 7.0).
    10. Create an InfoPackage and load Init/Full.
    11. Using a DTP you can then load to the InfoCube/DSO (if it is BI 7.0).
    Thanks
    Reddy

  • Loading to Infocube

    Hi Gurus,
    I did a delta load to the InfoCube. We deleted the full and delta loads from the cube and the DSO. I am able to load the data up to the DSO fine now. I executed the DTP from the DSO to the cube, but it didn't pick up anything - the load was successful with zero records transferred. Please share your thoughts on this issue.
    Thanks,
    Rani.

    Hi,
    Please check whether you have any start routine that acts as a filter when transferring data to the InfoCube (see the sketch after this reply).
    Also check whether the request in the ODS is available for reporting; if not, please update the status accordingly.
    Please also check the data mart status of the request in the ODS.
    Regards,
    Nitin
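
    As an illustration of the first point, a hidden filter in the transformation can look like the sketch below. This assumes a BI 7.x transformation start routine; the surrounding METHOD start_routine frame is generated by the system, and the field name CALDAY and the condition are invented for the example.

      " Hedged sketch - a statement inside the generated start_routine method
      " of the DSO-to-InfoCube transformation. Any similar DELETE can make the
      " DTP finish green with zero records transferred, which is the symptom
      " described above.
      DELETE source_package WHERE calday < sy-datum.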

  • Delta loading basic question

    Hi guys,
    Since I'm learning and testing SAP NetWeaver BI 7.0, I'm loading with full uploads to the targets (DSO, InfoCube).
    Now I have read about loading with delta.
    As I understand it, a delta load enables me to load only new data to the targets.
    I'm wondering about this: in some forum threads people write that more InfoPackages are necessary if you want to load with delta.
    Is it correct that more InfoPackages are necessary for a delta load, and why?
    Is a how-to available on SDN? I couldn't find anything.
    Thanks!

    Hi,
    To do a delta, one must first load the init.
    You can do the init in two ways:
    1. Init with data transfer
    2. Init without data transfer
    The general steps to load a delta are:
    1. One InfoPackage for the init without data transfer
    2. One InfoPackage for a full load
    3. Another InfoPackage for the delta load
    Alternatively you can:
    1. Create an InfoPackage to do the init with data transfer (this is equivalent to init without data transfer + full load)
    2. Create another InfoPackage and set its update mode to delta
    Hope this is clear and helpful
    Regards,
    Pavan

  • Delta load from r3 to bw ?

    I have some questions on the mechanism of data transfer from R/3 to BI 7.0. The following are my concerns.
    Scenario:
    The total amount of data is 1000 rows, and these rows are separated into 10 data packages, each containing 100 rows.
    Now BW is trying to receive these 10 packages from R/3 in sequence.
    When 5 data packages have been transferred to the BI PSA, the network breaks and the transfer is cancelled at the 6th data package.
    What I want to know is: when the network is reconnected, does the transfer start from the 6th data package?
    How do I find out whether a transfer scheduled from R/3 to BW is a delta load or a full load, for both a cube and an ODS?
    How do I find out how many records have been transferred from R/3 to BW when the transfer ended in the middle? I mean, how do I check how many records were successfully transferred from R/3 to the cube/ODS? Please tell me for the ODS as well as the cube.
    Is the continuation of the transfer process the same for both ODS and InfoCube if it ends in the middle?
    What is the procedure for a full load and a delta load in the case of a cube and an ODS? I mean, how do I reschedule the transfer?
    And if this whole scenario is in support, I will be handling these issues on the live production server, right? So my user ID should have access to production R/3 and production BW to change the status to red, delete the request, reload the request, etc. So I should be logging on to the production server and doing all this, am I right?
    Last doubt: what is a repair request? If the transfer stops in the middle, we change the status to red and reload the request again - is that called a repair request?
    What is the difference between a repeat delta, a repair request and a repair delta? When do we choose these options, and in which scenarios?
    How do we schedule a daily load job from R/3 to BW without using process chains? I mean, every day it should fetch the records from R/3 to BW from the SD DataSources at 10.00 PM - how do we do that?
    I have all these doubts and would like the SAP experts to answer them so that I can clear them up for production support. I would really appreciate it if someone could explain these scenarios here. It would be a great help if someone clears up these doubts.

    Hi,
    "What I want to know is: when the network is reconnected, does the transfer start from the 6th data package?"
    If your connection breaks in between, even if your load goes through the PSA, you have to delete the request from the target (after setting the QM status to red) and then reload it. But before repeating the load, check in SM59 whether the connection is OK.
    "How do I find out whether a transfer scheduled from R/3 to BW is a delta load or a full load, for both a cube and an ODS?"
    You can check this in the InfoPackage scheduler on the Update tab - whether it is a delta or full load.
    You can also check in the InfoPackage monitor (you can get there through RSMO, where the IP monitor screen opens directly, or via RSA1 >> find your InfoPackage >> double-click on it >> click the monitor icon at the top). Under the Header tab you can see whether it is a delta or full update.
    Or you can check the DataSource itself:
    if it is a generic DataSource, go to RSO2 and check; if it is a standard DataSource, check it in RSA5.
    "How do I find out how many records have been transferred from R/3 to BW when the transfer ended in the middle? How do I check how many records were successfully transferred from R/3 to the cube/ODS, for the ODS as well as the cube?"
    Your load cannot end in the middle - it can fail. If it fails, you have to delete the request from the target and then repeat the load.
    "Is the continuation of the transfer process the same for both ODS and InfoCube if it ends in the middle?"
    Yes.
    "What is the procedure for a full load and a delta load in the case of a cube and an ODS? How do I reschedule the transfer?"
    If a full or delta load fails after the extraction has completed (due to a lock issue, SID issue, etc.) and your load goes through the PSA, then just delete the request from the target without setting the QM status to red and reconstruct the request - no need to repeat the whole load again.
    But if your load does not go through the PSA, then set the QM status to red, delete the request from the target and repeat the load.
    "If this whole scenario is in support, I will be handling these issues on the live production server, right? So my user ID should have access to production R/3 and production BW to change the status to red, delete the request, reload the request, etc. - am I right?"
    Yes.
    "What is a repair request? If the transfer stops in the middle, we change the status to red and reload the request again - is that called a repair request?"
    Suppose your delta load has failed and you have deleted the request from the target without setting the QM status to red. In this case your data will be lost and your delta mechanism will be corrupted. In this situation:
    1) Delete the init flag.
    2) Do a full repair (after filling the setup tables).
    3) Do an init without data transfer to set the init flag back.
    4) Then do the delta upload.
    In the case of an ODS, you cannot load a full update and a delta upload in parallel - your ODS activation will fail. So if you want a full upload into the ODS, you have to go for a full repair.
    "What is the difference between a repeat delta, a repair request and a repair delta? When do we choose these options, and in which scenarios?"
    I think the full repair concept is now clear.
    Suppose your delta load fails, and after that you delete the request after setting the QM status to red. The system will then ask for a repeat delta when you try to repeat the load, which means it will pick up the same delta records as the failed load.
    "How do we schedule a daily load job from R/3 to BW without using process chains? Every day it should fetch the records from R/3 to BW from the SD DataSources at 10.00 PM - how do we do that?"
    You can do this via an InfoPackage or an InfoPackage group - schedule it for the specific time.
    Check these links :
    http://help.sap.com/saphelp_nw04/helpdata/en/20/ad673844a7114be10000009b38f842/frameset.htm
    Re: Schedule Infopackage for last day of a fiscal period
    Hope this helps you.
    Regards,
    Debjani

  • BPC 7.5: Delta Load when loading from BI InfoProvider

    Hi,
    In BPC 7.5, running a package based on the process chain "CPMB/LOAD_INFOPROVIDER" loads data directly from an SAP BI InfoProvider into a BPC cube. Among the options you can choose "Merge Data Values" or "Replace & Clear Data Values".
    According to the descriptions, the first one "Imports all records, leaving all remaining records in the destination intact", while the second one "Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records".
    I tried both, and both (!!) result in what you would expect of "Replace & Clear": both first create cancellation (storno) records in the cube and then add the new values.
    Is this an error or intended behaviour? I didn't find any SAP Notes on this topic, but I doubt it's right.
    Is there any way to achieve a merge or - even better - a delta load?
    Thanks a lot for any input given.
    bate

    Hi Bate,
    It is indeed a bit confusing. I'll translate the BPC-speak for you.
    Replace & Clear:
    1. Look up all CATEGORY/ENTITY/TIME combinations in the incoming data
    2. Clear all records in application with the same CATEGORY/ENTITY/TIME combinations that exist in the incoming data records
    3. Load incoming data to the cube
    Merge:
    1. Load incoming data to the cube record-by-record, overwriting existing data.
    This means that "Replace & Clear" might clear out existing records in the cube that do not share the full key of the incoming records but do share the CATEGORY/ENTITY/TIME dimension values. For example, a record with CATEGORY/ENTITY/TIME/ACCOUNT values of ACTUAL/1000/2010.JAN/ACCT1 would be deleted when loading an incoming record with values ACTUAL/1000/2010.JAN/ACCT2 if you use the "Replace & Clear" method, but would not be deleted if you use the "Merge" method (a small sketch of the two behaviours follows this reply).
    There is no option to load data additively, the way data is loaded into a BW InfoCube.
    Cheers,
    Ethan
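
    Purely as an illustration of the two behaviours (this is not BPC source code; the types, field names and internal tables are invented for the example):

      " Invented record type standing in for a BPC cube record.
      TYPES: BEGIN OF ty_rec,
               category TYPE string,
               entity   TYPE string,
               time     TYPE string,
               account  TYPE string,
               amount   TYPE p LENGTH 15 DECIMALS 2,
             END OF ty_rec.
      DATA: lt_cube     TYPE SORTED TABLE OF ty_rec
                          WITH UNIQUE KEY category entity time account,
            lt_incoming TYPE STANDARD TABLE OF ty_rec,
            ls_in       TYPE ty_rec.

      " "Replace & Clear": wipe every existing record that shares the
      " CATEGORY/ENTITY/TIME combination of an incoming record...
      LOOP AT lt_incoming INTO ls_in.
        DELETE lt_cube WHERE category = ls_in-category
                         AND entity   = ls_in-entity
                         AND time     = ls_in-time.
      ENDLOOP.
      " ...then load the incoming records.
      LOOP AT lt_incoming INTO ls_in.
        INSERT ls_in INTO TABLE lt_cube.
      ENDLOOP.

      " "Merge": overwrite record by record on the full key, leaving all
      " other existing records intact.
      LOOP AT lt_incoming INTO ls_in.
        INSERT ls_in INTO TABLE lt_cube.    " insert if the full key is new
        IF sy-subrc <> 0.
          MODIFY TABLE lt_cube FROM ls_in.  " otherwise overwrite it
        ENDIF.
      ENDLOOP.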

  • How to handle the error delta load?

    Hi Experts,
    We have an InfoCube fed by three InfoSources: 2LIS_11_VAITM, 2LIS_12_VCITM, 2LIS_13_VDITM.
    The delta load of 2LIS_12_VCITM failed on Oct 23, 2006. It blocked all of the subsequent data loads, even though 2LIS_11_VAITM and 2LIS_13_VDITM loaded their data successfully. And 2LIS_12_VCITM hasn't run any load step since Oct 23, 2006.
    What should I do to fix this problem so that all of the data goes into the InfoCube correctly?
    Thanks,
    Lena

    Hi Jerome,
    Thanks so much for your help. The data now loads successfully.
    But I have another issue: because the failed request's status is red, it seems to block all data from going into the InfoCube (even though the delta load is successful). I cannot see the new data in the InfoCube, and in the InfoCube's Manage screen the "Request for Reporting Available" field for the successful request is empty.
    What should I do?
    Thanks,
    Lena

  • Delta load

    Hello Experts
    I have a cube; when I try to perform a delta load into it, it prompts me that:
    "Last delta update is not yet completed.
    Therefore, no new delta is possible.
    You can restart the request again."
    Please let me know how I can restart the data load.

    Hello,
    Please check whether the last load completed successfully. You can check this in the RSMON transaction by going to the DataSource you are having the issue with.
    Once you have checked the request, make sure its QM status has been set to red.
    Also check that the request is no longer in the InfoCube.
    After that, go to the InfoPackage and start it. When you start the package, it prompts you asking whether you want the request to be repeated.
    This repeats the last delta, and once it has been loaded you will be able to continue your deltas.
    Thanks
    Dharma.

  • Does APD support delta load?

    Hello experts,
    Does the APD support delta loading of data when the data target is
    1. a DSO?
    2. an InfoCube?
    Many thanks in advance,
    Marco

    Hello Marco,
    As far as the APD is concerned, for DSO data targets it supports only direct-update DSOs, and a direct-update DSO does not support delta. Hence, have an intermediate standard DSO before the cube, which will capture the deltas.
    The flow should be:
    Direct-update DSO --> Standard DSO --> Cube
    Hope this clears up your doubt for the scenario.
    Thanks,
    Santosh
