GL Data reload in BW

Hi,
In our GL DSO, all the key figures are set to the Overwrite option, and the DSO contains 40 million records.
I enhanced the DataSource to include 2 additional fields; to get the historical data for those 2 new fields, I need to plan a data reload for GL.
Due to the volume of data, instead of deleting the data and initializing it again, I am planning to run a repair full request during business hours, as the data load and activation take a huge amount of time.
Will this approach have any impact?
Thanks

Hi,
Yes, it's a good idea.
Your data flow is like this:
DataSource ---> DSO.
So all key figure properties are overwrite.
Rather than deleting, try a full load or a repair full request with selections (calendar year, fiscal period, or fiscal year).
Run multiple InfoPackages, and check the available application servers, to speed up your loads.
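For example, the repair full load could be split into parallel InfoPackages by fiscal year selections (the ranges below are only illustrative, not from the original post):
InfoPackage 1: 0FISCYEAR = 2010
InfoPackage 2: 0FISCYEAR = 2011
InfoPackage 3: 0FISCYEAR = 2012
Each package then loads a disjoint slice, so they can run in parallel without overlapping data.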
Try to run your loads when there is less load on ECC and BW.
Inform the Basis team to keep monitoring SM58 (ECC).
Thanks

Similar Messages

  • How do I get my data reloaded after updating my iPhone with the new iOS 6.0?

    I downloaded the iOS 6.0 update for my iPhone.  It backed up my data, but I cannot get it to restore.  My music and apps were removed.  When I try to sync it in iTunes, an error comes up that this iPhone is not allowed to sync.  What do I need to do?

    Music and apps are not part of the backup that iTunes performs at all.
    You should be transferring all purchases to your computer regularly.  Did you fail to do this?
    If you failed to put your music on your computer (not good), then you can redownload some iTunes purchases in some countries:
    Downloading past purchases from the App Store, iBookstore, and iTunes Store
    What exactly does the error message say?

  • Data reload from PSA problem

    I have a standard BI7 data load process: DataSource -> InfoPackage -> DTP -> DSO.
    I've deleted the last 3 requests in my custom-developed DSO. They do exist in the PSA. Now I want them restored from the PSA, but unfortunately there is no "Update with scheduler" menu item when I right-click on a request in the PSA.
    The DataSource, InfoPackages (full, init, delta) and the DTP are active.
    I tried to use RSA1OLD, but I didn't find the DataSource in the DataSource tree, probably because I'm using the BI7 data flow...
    So why doesn't SAP allow me to update the requests again? Are there any other ways to do it?

    Hi Gediminas,
    Once you delete the requests from the DSO, you can observe the "not updated to any data target" symbol in the PSA for those requests that were deleted from the DSO.
    So when you schedule the DTP (delta) again, it will only take the requests from the PSA which are not updated to any target; it should therefore pick up the three deleted requests, and once they are updated to the target, the symbol will change in the PSA for those requests.
    But if you make any structural change to the DataSource, then you again need to replicate and activate the DataSource and load the data up to the PSA.
    Thank you

  • How do I watch a movie downloaded to my MacBook on my ATV? The movie works fine on the computer, but I just get a checkerboard pattern on the TV screen with audio but no video. Apple support is useless and suggested that I reload the software...

    Just downloaded an HD movie to my computer (brand new MacBook Pro, all software up to date) and can't watch it on my Apple TV (brand new last year, all software up to date).  Using AirPlay mirroring works fine in general, but when I press play on the movie I lose the picture and get only a checkerboard pattern and sound.  Another MacBook we have gives a black screen with no sound.  Apple assistance is completely useless.  They suggested that I download the most recent software (all up to date), reload the movie (done, and no help) or read the FAQs (no help, although this board suggests that there is a fundamental compatibility issue with mirroring of HD content...!)   That's what the ATV is for!   The new MacBook uses a different connector and I can't plug it directly into the monitor input on my TV with the connector I had from the old computer.   I ordered a new connector for fifty bucks and they stupidly sent it UPS with a signature required for delivery since it is such a high-value item, so now I have a hassle to receive it.  In any case, plugging in a wire from across the room isn't a solution.  Is there any way to make all of this supposedly integrated hardware and software work, or has Apple completely lost their way and turned into a Microsoft clone?   Hopefully some of their market mavens will read this and steer the ship back on course.  In the meantime any help you may provide would be much appreciated.

    The movie is from iTunes.   You would think that a download from iTunes to a MacBook could be watched on an HDTV using Apple TV without any headaches, but this does not appear to be the case.

  • How to add a time characteristic (date) to an already loaded cube?

    Hi All,
    We have a cube loaded (by daily deltas) for more than two years now. How can we add a date to this cube? What are the possibilities?
    1. Do we need to delete all data, add that date, and reload? (quite tedious and time-consuming, out of the question for now)
    2. Just add that date to a new dimension and load from now on? But we may need past data for that characteristic.
    3. If we copy that cube to another cube, add that characteristic to the old cube, then reload data from the copied cube to the old cube - is it possible to load all the data for that new field for the last two years?
    Please let me know the successful solutions, as I am doing this in a production system.
    Regards,
    Robyn.

    Hi Eric,
    Yes, that might be possible, but that characteristic cannot be derived from the data present in the cube, as it needs to be derived from R/3 by changing the extract program of that cube.
    The plan is to create a new cube with a similar structure to the old one plus this new date field, populate this new cube for, say, the required years of data, then build the query using these two cubes in a MultiProvider.
    Do you have any alternative solution for this?
    Regards
    Robyn.

  • Remote historical data is not retrieved completely when viewing it in MAX 4

    Hi,
    since I installed LabVIEW 8 I have had some problems retrieving historical data from another computer. Sometimes not all data is retrieved (if I zoom in or out or move back in time), and this missing data won't ever be retrieved.
    I already deleted the Citadel cache once, but after this even less data was retrieved... What's really weird is that for channels which weren't retrieved correctly, the data does not get updated anymore!
    On the remote computer I have LabVIEW DSC Runtime 7.1 running (MAX 3.1.1.3003); on my local computer, MAX 4.0.0.3010 and LabVIEW 8 DSC (development system) are installed in parallel with LV DSC 7.1.1 (dev system). LV 8 is installed for testing purposes (I doubt we'll switch soon) and overall I like MAX 4. The HyperTrend.dll on my local computer is version 3.2.1017.
    This is really quite an annoying bug!
    So long,
        Carsten

    Hi,
    > We've been unable to reproduce this issue. If you could provide some additional information, it might help us out.
    I did fear this, as even on my computer it is happening just sometimes...
    > 1) How many traces are you viewing?
    The views I observed this in had 2 to 13 traces.
    > 2) How often are the traces being updated?
    For some it's pretty often (about once a second); for some it's very infrequent (no change in data, which means they are updated because of the max time between logs). I more often see this for traces that are updated very infrequently. But I think I've seen this for frequent traces as well (for these it does work currently).
    > 3) Are the traces being updated by a tag value change, or by the "maximum time between logs" setting in the engine?
    It happened for both types.
    > 4) What is the frequency of the "maximum time between logs" setting?
    Max time between logs is 10 minutes.
    > 5) Is the Hypertrend running in live mode when you zoom out/pan?
    I think it happened in both modes, but it definitely did in live mode.
    > 6) If you disable/re-enable live mode in the Hypertrend, does the data re-appear?
    I couldn't trigger the loading of the data. All I did was wait and work with MAX (zooming, panning, looking at data), and after quite a while (some hours), the data appeared.
    I just tested this on a view where data is missing (for some days now!), and it didn't trigger data reloading. Zooming and panning don't either. There's a gap of up to 3 days now for some traces. 7 of the 13 traces of this view are shown incompletely, all stopping at the same time but reappearing at different ones.
    AFAIR from the laboratory computer (these are temperatures and it's very plausible that they didn't change), there wasn't any change in these traces, so they all got logged because of max time...
    I just created a new view and added these traces: the gap is there as well.
    (Sorry to put this all in this entry even if it is related to your other questions, but I started this live test with disable/re-enable live mode.)
    > 7) Are the clocks on the client and server computers synchronized? If not synchronized, how far apart are the times on the two computers?
    They should be (Windows 2000 Domain synchronized to ADS), but are 5 seconds apart.
    One thing I remember now: I have installed DIAdem 10 beta 2 (10.0.0b2530, USI + DataFinder 1.3.0.2526). There I had (and reported) some problems with data loading from a Citadel database on a remote machine as well. This was attributed to some cache problem. Maybe a component is interfering?
    Thanks for investigating.
    Cheers,
        Carsten

  • Restore, Refresh, reload Win 8.1 after hard drive swap in multiple similar ThinkPads

    All -
    This is the deal...
    Had a ThinkPad T-61 with Disk 0 partitioned into logical disks C: (system and other stuff - no hidden partition), D: (disk images, data backup), and E: (reload programs and drivers). (WD SATA 320 GB, 8 GB new RAM)
    The system was drowned in unknown "goo" and was essentially DOA - no power, no battery indicators.
    The hard drive was swapped into my T-61p, and after doing a system enumeration, the drive proved to be uncorrupted (whew) - at least 12 hrs of config not blown, and data saved.
    A new T-61 base was purchased (virtually identical to the old one), the LCD screen was swapped, and after putting in the original hard drive, it went through a new enumeration as, of course, the machine now "looked different", and I did the activation dance for both Win 8.1 and Office 2013.
    The problem: looking in the registry, it appears I have 3 different "hardware profiles". However, "hardware profiles" were ditched starting with Vista. What I find is that the wireless card is now Card #2 (not #3, as mine was an Intel BGN, not a BG) and wireless devices are now Wi-Fi 3 (not Wi-Fi), as it appears the machine thinks there are 3 different hardware profiles, although only one is real. Looking at the registry, it is clear there are multiple entries - I assume one from each time the hard drive was inserted for test purposes, and finally from the repaired machine with the new mobo, etc.
    Question: can these now "bogus" profiles be deleted or overwritten somehow to make this into a "clean machine"? In XP this would have been easy - get rid of the "obsolete" hardware profiles. In 8.1, can one do this with a Restore, a Refresh or, as the worst option (and I have done this), a reinstall of 8.1 and everything else on the C: partition? (Good thing - this is someone else's ThinkPad with Office 2013 and maybe 10 other programs including utils, AV, and all data backed up; it's not mine, a T-61p used intensively as a development machine for Oracle, SQL Server, MySQL, etc.)
    Fortunately, the ThinkPads seem to be indestructible (T-61), unlike the old company Toshibas, of which there was a stack of 15 in various stages of rebuilds/repairs, for which I burned a series-compatible XP disk image that just required a data reload.
    Thanks in advance!!!

    Hi,
    A system refresh would preserve the user profile; it would be better to remove everything, including the user profile and settings, which equals reinstalling the system.
    Since your personal data is already backed up, it would be better to do a system reset to clear these old profiles.
    You can refer to the contents below for more details about system refresh, reset, restore:
    http://windows.microsoft.com/en-us/windows-8/restore-refresh-reset-pc
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • Execute SQL Task does not Update from a Date Variable Reliably

    I'm using a DateTime variable in SSIS 2008 that is used to set the SQLStatement property of an Execute SQL Task.
    "DELETE FROM Labor WHERE Week = '" + (DT_WSTR, 100) @[User::Week] + "'"
    Week is the next Sunday:
    DATEADD( "day", @[User::DaysTillSunday] , @[User::TheDayThatIsTwentyMinutesPrior] )
    DaysTillSunday:
    DATEPART( "dw", @[User::TheDayThatIsTwentyMinutesPrior] ) == 1 ? 0 : 8 - DATEPART( "dw", @[User::TheDayThatIsTwentyMinutesPrior] )
    TheDayThatIsTwentyMinutesPrior:
    (DT_DATE)(DT_DBDATE)DATEADD("minute",-20,GETDATE())
    The SSIS Package deletes the current week's data, reloads it with fresh data, then calculates the difference between the current week and last week.
    The problem is that randomly, instead of deleting the current week, it will delete the previous week.  This happens maybe 5-10% of the time.  At least it does until I rebuild the package and import it into SQL Server again.
    I'm guessing that the Execute SQL Task is not updating the value of the Week variable before it executes.  I started with the source type being a variable.  Then I decided to try direct input and pass in the Week as a parameter (OLE DB connection type).  That didn't work either.
    Most recently I tried writing the Week variable to a table first, then having a sequence container with all the tasks second.  Slightly better, but I still saw the date was wrong 2 times in about 90 executions.  I was hoping that writing the Week variable out to the database would force an update of any associated connections to it, but that didn't seem to work.
    Any ideas?  Is this a known issue, am I missing a setting?
    thanks,
    John

    John, computers either work all the time or have a bug. I suspect it is the latter.
    To find it [faster], you need to log the resulting expression that the package actually used.
    I am baffled how rebuilding a package would fix anything like setting a date.
    It might even be dependent on when you run the package.
    Why
    DATEADD("minute", -20, GETDATE())
    rather than simply
    DATEADD( "day", -8 , GETDATE() )?
    It must be enough to set the week (which appears to be a date) as above.
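    If the goal is simply to delete the target week, one hedged sketch (table and column names taken from the post; it assumes SQL Server 2008 or later and the default DATEFIRST setting, where Sunday is weekday 1) is to push the date arithmetic into the statement itself, so no package variable is involved:
    DELETE FROM Labor
    WHERE Week = DATEADD(day,
                     (8 - DATEPART(weekday, DATEADD(minute, -20, GETDATE()))) % 7,
                     CAST(DATEADD(minute, -20, GETDATE()) AS date));
    -- Mirrors the package logic: shift back 20 minutes, truncate to a date,
    -- then advance to the next Sunday (0 days if it is already Sunday).
    This removes the variable-refresh timing from the equation entirely.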
    Arthur
    MyBlog
    Twitter

  • How to setup data validation mechanism

    Hi, gurus,
    Now our customer wants to do data cleansing before loading data into an InfoCube or ODS; the source systems include R/3, flat files, and some legacy systems. The client wants to create a staging area to keep incoming data and develop some programs to check data validity. The specific requirements are as follows:
    1. Provide a user interface to let business users correct the error records;
    2. Allow coded logic to check the reason for data rejection (such as wrong field type, missing master data, etc.) and prompt these reason messages to help business users make corrections.
    They want to create a specific staging area for data cleansing purposes because, based on my prior experience with ETL tools, we can ask developers to write embedded programs in Informatica or DataStage to check data quality and write the rejection reason to a file to assist business users with corrections.
    So if we want to use the SAP BW PSA for this purpose, is it feasible? Can we modify the PSA to add one column, 'rejection reason', and add some ABAP code there? Or do you have some other good options for this function?
    And can we use the PSA to keep all data for 5 years? The annual delta volume is not very big, just 4 GB.
    Someone suggested creating a Z table in the BW system for staging purposes, but this solution cannot use the BW ETL tools to load data and needs heavy ABAP programming. Any good solution for this?
    BTW: can I modify the PSA structure to add 1 or 2 columns, such as reject reason?
    And how can an ABAP program access characteristics master data? Are there any function modules to read master data directly rather than selecting from the P table or T table?
    Regards, ls

    I think if the business user is given some basic BW training and some guidance on correcting PSA records or adding/loading missing master data, then it can work.
    The BW monitor does give you good enough messages about data load errors.
    I do not think it is good to modify the PSA, because this is against the purpose of having a PSA. The PSA holds records in transfer structure format, as sent by the source system.
    The PSA is often preserved for data reloads that might be needed at some point in the future.
    As you are saying the annual record load is not big, you can preserve the PSA and later develop a custom program to delete PSA requests older than, say, 2-3 years (there might be a standard program for this).
    Modifying or adding columns to the PSA is not general practice.
    For reading master data, maybe you can use the function module RSAU_READ_MASTER_DATA.
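    A minimal ABAP sketch of such a lookup - the parameter names below are the interface as commonly quoted on SCN, so please verify the exact signature in SE37 before relying on it:
    DATA: lv_material  TYPE /bi0/oimaterial,
          lv_matl_type TYPE /bi0/oimatl_type.
    " Read one attribute (0MATL_TYPE) of 0MATERIAL for a given value.
    " Interface assumed from common SCN examples - verify in SE37.
    CALL FUNCTION 'RSAU_READ_MASTER_DATA'
      EXPORTING
        i_iobjnm   = '0MATERIAL'
        i_chavl    = lv_material
        i_attrnm   = '0MATL_TYPE'
      IMPORTING
        e_attrval  = lv_matl_type
      EXCEPTIONS
        read_error = 1
        OTHERS     = 2.
    IF sy-subrc <> 0.
      " Handle missing or unreadable master data here.
    ENDIF.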

  • Selection profile - no data

    Hi gurus,
    this might be a basic question. Before posting here I tried to research on my own, and after getting nowhere I have come to ask the gurus to help me with the issue. APO DP is new to me and I'm learning it as well.
    In the planning book, we have multiple selection profiles, and when I tried to load the data from the selection profiles, some of them are blank and unable to load data. What could be the reason? There are some selection profiles which have data, but it's not the full list.
    A few inputs: we have a weekly job running in ECP to transfer the materials and plants from ECC to APO (CFM1 and CFM2 transactions in ECC).
    Our BW resource says that he is able to extract the sales order history and open orders from ECC and load them to APO every day, which I can more or less see in the LISTCUBE transaction. But then why are we not able to see the data via the selection profile when we can see it in LISTCUBE?
    Kindly help.
    Thanks
    Ravi

    Hello,
    You said "we have multiple selection profiles".
    But you didn't mentioned how exactly the selection profiles has created I mean for which charactorstics/attributes has created the selections.
    For how many selection profiles you are not able to see the data.what are the charactoristics/attributes are there in those selections.
    Have you given any of the Values for those charactorstics/Attributes when you create those selections.
    Can you pls provide that information?
    And also for your understanding pls check the below ways. Hope the below mentioned points will help you why the issue has raised.
    1) Is all selection profiles created in PB for all the selections whether the data is loading from BI to APO BI or not?
    2) Hope you or your BI guy already captured BI Report before facing this issue when your BI guy has loaded the data from BI to APO BI.
    3) Now you are saying that issue has got resolved by reloading master data by your BI Resource and by doing consistency checks. Pls get the bex report download after reloading master data by your BI Resource which keyfigure the data is copying from BI to APO BI. 
    4) Compare both the Reports before master data reloading and after master data reloading and try to check for the selections which the data has missed in PB.
    5) if data has missed only for particular skus, or depots. pls check clearly the attributes in the Selection profiles which data has not loaded prevoulsly and the Bex Report attributes i.e Infocube attributes/charactoristics.
    As you said by reloading the master data by your BI Guy your issue has resolved.
    Hope the above mentioned points will help you to understand completely about the issue.
    if you want indepth clarity on your issue means pls reply me the above questions which I checked.
    Regards,
    Subhan

  • Error Loading Material Master Data

    Hi All, when I try to load ZMATERIAL in BW development, I get the error below. I am on BI 7.0. It seems like bad data, but how do I fix it? Any thoughts?
    In the Details tab of the monitor, I get a red error message under
    'Transfer (IDocs and TRFC): Errors occurred '
    >>Processing (data packet): Errors occurred
    >>>Data Package 5 ( 10417 Records ) : Errors occurred
    >>>>Update ( 0 new / 0 changed ) : Errors occurred
    >>>>>Data records for package 5 selected in PSA - 1 error(s)
    >>>>>>ZMATERIAL : Data record 10416 ('000001 '): Version '000001 ' is not valid
    >>>Processing end : Errors occurred
    >>>>>Error 4 in the update

    Hi PK,
    I can give you some indications:
    This message indicates that in R/3 you have an incorrect value; the data record gives you the key with the important information.
    By checking the InfoObject and the transfer rule, you can extract the info you need to correct the data. I think in your case you have to check the length and the character type between your InfoObject in BW and the table field in R/3.
    Transaction RSA1 ==> InfoSource ==> find your InfoSource ==> double-click on your InfoSource, which takes you to the transfer rules ==> here you can check the mapping between your InfoObject (ZMATERIAL) and the original field in R/3 that you have to correct.
    On the same screen, under Transfer structure/Transfer rules, you can find the name of the DataSource, and in R/3 transaction RSO2 you can find the original table that contains the field to correct.
    Once you have identified the table and the field, communicate this info to the people who work on R/3, and after they correct the data, reload again in BW.
    I hope this info helps you.
    Bilal

  • Looking for the best way to load data to the appended ODS

    Hi all,
    I appended my ODS with some new InfoObjects.
    Is there a way to load data only into these data fields without deleting all data and then reloading it all?
    Thanks

    Hi Andrius,
    If your ODS is set to overwrite AND the newly appended InfoObjects are not key fields, then a full upload without data deletion will work, because old entries will be overwritten by new ones which have values for the appended InfoObjects.
    On the other hand, if your ODS is additive, then from my point of view you need to remove all data and reload again; otherwise key figures would be doubled after a full upload.
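    A small illustration of the overwrite case (the field names are made up for the example): suppose the key is 0DOC_NUM and the appended field is ZNEWFLD.
    Existing active record:  0DOC_NUM = 4711, AMOUNT = 100, ZNEWFLD = (blank)
    Record in full reload:   0DOC_NUM = 4711, AMOUNT = 100, ZNEWFLD = 'X'
    After activation, the reloaded record overwrites the existing one, so ZNEWFLD is filled without changing the record count.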
    Qing

  • Not all records are updated in DSO

    Dear Friends
    My scenario is: SAP FM -> DataSource -> InfoPackage -> DTP -> Transformation -> DSO.
    I have 896 records in my PSA, but when I run the DTP to load the data into the DSO, only 817 records are added; the rest of the records are not updated. I have supplied
    Purchasing organization             0PURCH_ORG
    Purchasing Group                      0PUR_GROUP
    Vendor                                      0VENDOR 
    BW: Document Number              0DOC_NUM
    BW: Document Item Number      0DOC_ITEM
    Material                                    0MATERIAL
    Accounting document number    0AC_DOC_NO
    Item in Material Document         ZZEILE
    as key fields of DSO.
    Can you suggest what could be the possible reason for not having all the records?
    Regards
    Naim

    Hi,
    Is it a full load?
    Have you applied any filters at the DTP level?
    If it's a full load, then delete the whole PSA data and DSO data,
    reload them again, and check the count.
    As per the DSO primary keys, data with the same key combination will be overwritten.
    How many records do you see under DSO --> Manage --> Requests tab,
    added and transferred?
    Thanks

  • SAP Mobile Sales 2.0 delta load issue for Sales Orders

    Hello,
    we have used Mobile Sales 2.0 with a Windows app for a while now. Our current issue is that sales reps won't see any historical sales order data on their devices.
    Background
    Due to customer requirements, we need to make small changes to customer master data attributes and reload all customers from ERP to CRM. Then we ran delta loads (MAS_PARTNER followed by all other objects) to the DOE, in which virtually all 5000+ customer accounts were compared. The delta load ran for about 3 days (some performance bottleneck we haven't located yet).
    During the delta load, data on devices was inconsistent. Accounts were missing and all transaction data disappeared. After the delta loads, all accounts and contacts are OK, save for a few. Data from activities (appointments, tasks) have reappeared, as they should. Only sales orders won't reappear. The sales orders exist in the backend and belong to active accounts and sales reps.
    Settings and troubleshooting so far
    We don't have any limitations for sales orders in CRM Sales Mobile configuration.
    We've run delta loads for all objects in transaction SDOE_LOAD.
    MAS_CUSTOMIZATION etc seem fine.
    We've re-run initial load for sales orders from CRM.
    In the test system, we've even reinitialized the whole CDS database on DOE and on the devices, then re-ran the loads.
    Checked steps suggested in discussion
    SAP CRM 2.0 initial load issue
    Historical sales orders (those created before the master data reload) exist in the backend, but don't show up on the device.
    If I change one of those historical sales orders in the backend, it gets sent to the device.
    If I create a new sales order in the backend or on the device, it is saved and replicated just fine.
    To sum it up, it seems DOE is unable to identify the sales orders relevant for replication.

    My first doubt I clarified myself, as we can go with the Unwired Runtime option.
    But I still have a doubt about:
    2. How can I modify the Main Menu for iOS?
    I am able to customize it for Windows using the SybaseCRM.Configuration.xml file.
    How can I do the same for iPhone/iPad?

  • How to rename existing members in an Essbase Outline

    Hi there,
    I would like to do as follows:
    1. List all members at Level-0 of a specified dimension (or parent member) that have names matching a wildcard string, e.g. names beginning with 'JWC'.
    2. Delete the Level-0 members that satisfy the above condition. I must verify the list from 1 above before deleting the members.
    3. Rename remaining/existing members at Level-0 of the dimension (or parent member) specified above by attaching a required prefix to the existing name.
    Note:
    1. I am working with an historical cube and I must preserve past year data. Hence I cannot select "Remove Unspecified" on the dimension build rule as my current data set contains data for current year only.
    kp> The Level-0 members that I need deleted contain data for the current year only, hence I can safely delete them.
    2. I am expecting some sort of script/s (CALC or Maxl?) that will allow me to do the above tasks (List, Delete, Rename) without any manual input.
    3. I must get this done by 31 Jan and before rolling out the oldest year data. Hence I urgently need your help here!
    Kind regards,
    Kamlesh.

    Thank you for your response. Being a novice, I must say that I am quite hesitant in taking steps that I am unfamiliar with, especially when dealing with a production cube that takes about 6 hrs to load when I have a limited timeframe! However, with additional guidance (see my kp> comments below), I may be able to try your method.
    No problem, we were all novices once upon a time ;-)
    > - export the outline using the Olap Underground Outline Extractor (http://www.appliedolap.com/default.asp?ID=51)
    kp> Can I export the outline to just any location, as I believe it will simply be a text file?
    Correct.
    > - do the necessary clean up to the exported outline text file
    kp> If this is the same delete & rename steps I want done, then I need help with some kind of script that will allow me to do this, as it involves 1000s of L0 members. (I am also considering the tips from Glenn)
    > - make a backup of your cube
    kp> I did. Just curious - the L0 members that I need deleted have actually been added via a change I made to the dim build rule. I had saved the cube folder before running the rule. I would normally restore the entire folder so that all components are in sync. Just one doubt: can I copy/paste the changed dim build & data load rules to another location, restore the cube from backup, and then simply copy/paste the saved rule files to replace the old ones? This will save me from having to delete the new L0 members.
    My suggestion is to tear down and rebuild based upon the edited outline export. If you find it easier to work in the outline directly, or to modify your load rules, that is your choice.
    > - export L0 data in columns from cube
    kp> Will the same Olap Underground Outline Extractor utility enable me to do this?
    No, this is a standard operation in Essbase. You should be able to do an export either from a MaxL session or from the console. Check the docs for details.
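    For instance, a one-line MaxL sketch (the application/database names and file path are placeholders, not from this thread):
    export database MyApp.MyDb level0 data in columns to data_file 'level0_export.txt';
    The column format is what a load rule can most easily re-import later.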
    > - build a load rule to load the export (so you can ignore errors from records that will no longer load)
    kp> Is there anything I need to be aware of here, as I must get all the data reloaded onto the original cube?
    If the members to be deleted have no data, then there should be no errors. If they do have data but the members will not be rebuilt, you will have to redirect where the data should load to. That will either mean editing the load data to point to the new member names, using alternate aliases, or recreating the load data for the new member names. Hard to be more specific as your situation may vary.
    > - reset and rebuild the cube using the edited outline export file
    kp> I am assuming I can manually delete the parent members and then run the rule based on the export file to rebuild the dimensions?
    Yes, the idea is to edit the outline text file to meet your new requirements and then to build the cube again. You can either manually clear the dimension of all members or rebuild with the revised outline file using a Remove Unspecified option load rule.
    Thank you for your suggestions.
    Regards.
    Hope it helps. Be sure to document the actual steps you use for the next time you have to do this (and there will be a next time, trust me ;-)
