Generic extraction delta init - bit confused

Hi all,
I have created a delta for a generic extraction (on a view, in the development system) and the first load was extracted with an init delta.
The data is in the cube, but how do I extract the data next time? How do I do the deletion of the delta? I found an entry in table ROOSGENDLM with DELTAID = '20050413'.
Please guide me on the next step after the initial load.
I will reward points for the solution.
Regards,
Milind

Hi Milind,
How do you extract a delta?
After your first init (ensure that you selected Init Request on the InfoPackage Update tab), select Delta on the same Update tab of the InfoPackage.
Click the Start button.
Incoming requests are now deltas.
If I have helped, please grant points...
Bye!
--Jkyle
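For context on what that ROOSGENDLM entry does: the stored DELTAID acts as a pointer, and each delta extraction selects only records changed after it, then advances it. A minimal sketch of that mechanism (illustrative Python, not SAP code; the field names are made up):

```python
# Simplified model of a generic delta pointer: each delta run selects
# only records newer than the stored pointer, then advances the pointer.

def delta_extract(records, pointer):
    """Return records changed after the pointer, plus the new pointer."""
    new = [r for r in records if r["changed_on"] > pointer]
    new_pointer = max((r["changed_on"] for r in new), default=pointer)
    return new, new_pointer

table = [
    {"id": 1, "changed_on": "20050410"},
    {"id": 2, "changed_on": "20050412"},
    {"id": 3, "changed_on": "20050414"},
]

# The init delta loaded everything and left the pointer at '20050413'
# (as in ROOSGENDLM-DELTAID); the next delta picks up only later changes.
delta, pointer = delta_extract(table, "20050413")
```

So after the init, you do not delete anything manually; the next delta request simply reads from the stored pointer onward and moves it forward.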

Similar Messages

  • Generic Delta: Init, Full Upload and Deltas...

    Dear All,
    Last week, on 10.04.2014, I created a generic DataSource with generic delta, using CalDay as the delta-specific field, with the lower limit left blank and the upper limit set to 1. On 11.04.2014 I initialized the delta with data transfer using a process chain at 1:15 p.m. (note: posting was on).
    See below screen shot:
    Result: the process chain was scheduled to run immediately; it executed, initialized the delta, and brought in records as well. See below screen shot of RSA7:
    Q1: Why is the generic delta's current status 10.04.2014 when I scheduled the delta through the process chain on 11.04.2014 at 20:50, and why is the Total column showing 0 records? I also checked Display Data Entries with Delta Repetition but could not find any records.
    The following is the InfoPackage I created for delta loads and used in another process chain for delta, scheduled daily at 20:50:
    The following is the DTP used in the delta process chain:
    On 12.04.2014 I checked my InfoCube and found the same number of records loaded again, even though the InfoPackage and DTP were created as delta; see screen shot:
    On 12.04.2014, see below PSA table:
    On 14.04.2014 I checked the InfoCube and saw deltas being loaded successfully; see below screen shot:
    On 14.04.2014, see below PSA table:
    On 14.04.2014, see below RSA7 BW Delta Queue Maintenance screen shot:
    Q2: Why was I unable to load the delta records on the day I initialized the delta with data (11.04.2014), and why was the same number of records loaded again on 12.04.2014?
    Q3: On 11.04.2014, around 1:15 p.m., I initialized the delta with data successfully, and the same day (11.04.2014, at 20:50) was my first delta load. Could this be a problem, given that the posting period was open when I initialized and I tried to load the delta on the same day?
    Q4: Why is the pointer showing Current Status: 12.04.2014 today, and why this time can I find the last delta records under the Display Data option with Delta Repetition?
    Q5: Have I missed something in the design or data flow?
    Q6: How can I verify delta record accuracy on a daily basis? Is it via RSA3? Or how can I reconcile daily deltas with the ECC delta records?
    Q7: What is the best practice when you load full data, initialize the delta, and schedule deltas for generic DataSources with CalDay as the delta-specific field?
    I will appreciate your replies.
    Many Thanks!!!
    Tariq Ashraf
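A toy model of the CalDay delta with an upper safety limit of 1 (illustrative Python, not SAP code) shows why a delta run on the init day brings nothing and why the pointer trails a day behind:

```python
from datetime import date, timedelta

# Toy model of a CalDay generic delta with upper safety limit = 1:
# each run selects postings with
#   last_pointer < posting_date <= today - 1 day,
# so records posted "today" are deliberately excluded and only arrive
# with the next day's delta run.

def delta_window(last_pointer, today, upper_limit_days=1):
    """Return the (exclusive lower, inclusive upper) CalDay selection bounds."""
    upper = today - timedelta(days=upper_limit_days)
    return last_pointer, upper

pointer, upper = delta_window(date(2014, 4, 10), date(2014, 4, 11))
# On 11.04 the window closes at 10.04: consistent with a current status
# of 10.04.2014 in RSA7 and a same-day delta bringing 0 new records.
```

Under this model, a delta scheduled the same evening as the init cannot see that day's postings; they are picked up the following day, which matches the behaviour described in Q2 and Q3.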

    I clearly got your point, but suppose I have this situation: I loaded the LO data from R/3 with an init delta, then loaded 5 delta loads as well. Due to an error in the data, I now want to reload everything. In this case, if I do a full upload, it will bring all the data up to the 5th delta load. If I do an init delta again, it will bring the same data as the full upload. In LO, for both a full load and a delta init load we need to lock R/3 users. I believe we can still load deltas after a full upload by using an "init delta without data transfer", so that won't be the barrier. So what is the difference, and how should I figure out which load to do? If I am wrong, please correct me.
    Thanks

  • BI switching to new ECC 6.0 box from R/3 4.7 box, question on delta init?

    Hi all,
    Our BI has already been upgraded to 7.0 and is currently connected to R/3 4.7.
    Now a new box has been built with ECC 6.0 and Oracle 8, which is a mirror copy of the R/3 4.7 system. Once the new box is ready, the BI 7.0 connection will be switched from R/3 4.7 to the new ECC 6.0 box.
    My question is:
    In R/3 4.7 the DataSources were already initialized for delta, so the delta pointer was already set up.
    Now, when we connect the new ECC box, do I have to initialize the delta again? Or will the new system recognize the pointer, since it is a mirror copy of the old system?
    Please advise.

    It's a bit confusing what you write - I'll try to get your point:
    - your 4.7 is running on Oracle 8 (VERY old, unsupported)
    - are you installing a NEW system on ERP 6.0, or upgrading your 4.7 copy?
    Markus

  • 0HR_PY_1_CE - Delta init with time limitations

    Hello everyone,
    I want to use the content extractor 0HR_PY_1_CE to transfer payroll data into SAP BW in delta mode.
    As the source system has been running payrolls for a very long time already, I would like to limit the data by time. Unfortunately, the extractor does not support filtering by time in delta mode. (I found this on SAP's help pages and also tried to set start and end dates; no luck, as assumed.)
    So I had a look at the source system's customizing, but the only setting regarding a time frame for extraction that I found is the one for PT data (transaction SBIW -> Settings for Application-Specific DataSources (PI) -> Human Resources -> Define Time Frame for Transfer).
    Within the PY customizing there is no option to limit the extraction interval.
    Is there any other way to limit the extraction to a specific time frame when loading data via 0HR_PY_1_CE?
    Thanks a lot in advance and regards
    Edited by: Christian Baum on Jul 27, 2010 12:15 PM

    Hi Yuvaraaj,
    thanks for taking the time to answer my question.
    For time selection I use the extractor's default date fields: BEGDA and ENDDA.
    A pseudo-delta might be an option IF the two fields you mention are processed in the extractor's selection logic. I actually don't know and would need to take a closer look at the extractor's program logic.
    Anyway, my goal is to stay as close to SAP's standard as possible.
    > Unfortunately, the extractor does not support filtering by time in delta mode. (I found this on SAP's help pages and also tried to set start and end dates; no luck, as assumed.)
    I had another look at [SAPs documentation for 0HR_PY_1_CE|http://help.sap.com/saphelp_nw70ehp2/helpdata/en/6a/cccc6015484464b25605375a5db965/content.htm] and have to retract the statement above.
    As there is no restriction mentioned regarding delta and time selection, there must be a way to pass a time frame when initializing the delta.
    I tried the following:
    1. I started an InfoPackage in full mode with the following filter criteria: BEGDA = 01.10.2009; ENDDA = 31.10.2009. The extractor delivers ~60,000 records; their INPER is always 10.2009. -> Selection works fine.
    2. I started an InfoPackage in delta init mode with exactly the same selection criteria as the full-mode InfoPackage, and I receive millions of records whose INPER is spread over several years.
    So the time selection is definitely not working in delta mode.
    Are there any customizing settings I missed in the OLTP system?
    Regards
    Chris

  • My iPhone 4 does not appear on my Windows 7 laptop; when I connect it I can hear the USB-connected sound, but it does not appear in iTunes or Computer, and it is not charging either - bit confusing

    My iPhone 4 does not appear on my Windows 7 laptop. When I connect it I can hear the USB-connected sound, but it does not appear in iTunes or in Computer, and it is not charging when connected to the laptop either. A bit confusing; help please.

    Take a look at this link, http://support.apple.com/kb/TS1538

  • Delta Init DTP DSO- Cube loads all requests

    Hi,
    I've loaded several full requests into a DSO (DataSource 2LIS_02_ITM).
    I had to fragment them into three months per full load because otherwise the data volume was too much for the rollback segment.
    Now I want to load these requests from the DSO into a cube.
    I created a delta DTP and selected the "Get All New Data Request by Request" option.
    The DTP starts as a delta init but selects all open requests at once and crashes with an "ORA-01555: snapshot too old..." error.
    As a workaround I created several full DTPs that each load only part of the data, but I really want to understand why the delta init doesn't work as expected and documented (loading request by request in a new process).
    thanks
    Carsten

    Hemant Khemani wrote:
    > Please check if you have selected the option "Get All New Data Request by Request" in the delta DTP.
    > If you have selected this option, then the delta DTP will select only one request (the oldest one not yet updated to the target) from the DSO.
    That's the option I've selected.
    When monitoring the DTP request, I see in the header data that all requests from the DSO are selected at once.

  • Simple IP profile request, now wee bit confused, p...

    Hi
    A little background as to why I'm a bit confused first.
    A while ago I was getting around 4000-4500 kbps connection speed using my HH2. Then one day I had no BB light; I phoned the helpdesk and was advised my router was dead (no surprise, as it was getting old). If I extended my contract I could get the all-singing-and-dancing HH3 free (not a problem; I've always had good service and few issues). Job done, and I just had to wait for the new hub. But after trying various other hubs to keep me online, including a borrowed HH3, it turned out it wasn't my router that was dead. Anyhow, the fault at the exchange was sorted and life was good, till I started realising my downloads and streams were nowhere near as good as before.
    I thought I'd request an IP profile reset after doing a speed test and seeing a low profile, but after reading various things I did a few tests and now am confused as to whether an IP profile reset would be worthless until I'm sure it's all good at my end.
    I receive lower connection speeds with the HH3 and a higher noise margin than I do using my HH2 (this is also my 2nd HH3, as the 1st one liked to crash a lot).
    Connections to the front plate and the test socket give very similar readings, and this is the only thing plugged into my phone line, through a filter.
    Home Hub 2 stats
    ADSL line status - Connection information
    Line state: Connected
    Connection time: 0 days, 0:32:43
    Downstream: 2,976 Kbps
    Upstream: 448 Kbps
    ADSL settings
    VPI/VCI: 0/38
    Type: PPPoA
    Modulation: ITU-T G.992.1
    Latency type: Interleaved
    Noise margin (Down/Up): 11.6 dB / 14.0 dB
    Line attenuation (Down/Up): 55.0 dB / 31.5 dB
    Output power (Down/Up): 18.7 dBm / 12.3 dBm
    Loss of Framing (Local): 11
    Loss of Signal (Local): 1
    Loss of Power (Local): 0
    FEC Errors (Down/Up): 5951 / 0
    CRC Errors (Down/Up): 0 / N/A
    HEC Errors (Down/Up): N/A / 0
    Error Seconds (Local): 2
    Home Hub 3 stats (same location, wiring etc.)
    ADSL Line Status - Connection Information
    Line state: Connected
    Connection time: 0 days, 00:06:30
    Downstream: 2.469 Mbps
    Upstream: 448 Kbps
    ADSL Settings
    VPI/VCI: 0/38
    Type: PPPoA
    Modulation: G.992.1 Annex A
    Latency type: Interleaved
    Noise margin (Down/Up): 12.5 dB / 17.0 dB
    Line attenuation (Down/Up): 55.7 dB / 31.5 dB
    Output power (Down/Up): 18.7 dBm / 12.5 dBm
    FEC Events (Down/Up): 0 / 0
    CRC Events (Down/Up): 0 / 0
    Loss of Framing (Local/Remote): 0 / 0
    Loss of Signal (Local/Remote): 0 / 0
    Loss of Power (Local/Remote): 0 / 0
    HEC Events (Down/Up): 0 / 0
    Error Seconds (Local/Remote): 61371 / 1340
    Now, previously the noise margin was 14+, but hey, it was lower that time.
    This is the speed test data from the connection with the 'faster' HH2:
    1. Best Effort Test: provides background information.
    Download speed achieved during the test was 1.8 Mbps (max achievable speed: 2 Mbps).
    For your connection, the acceptable range of speeds is 0.4 Mbps-2 Mbps.
    Additional information:
    Your DSL Connection Rate: 2.98 Mbps (downstream), 0.45 Mbps (upstream)
    IP Profile for your line is 2 Mbps.
    Please help me get back to a better, faster connection.
    ty

    Hi there, I have left everything running without reconnection etc. and my IP profile is now lower than before, with a max connect of 2 Mbps.
    1. Best Effort Test: provides background information.
    Download speed achieved during the test was 1.59 Mbps (max achievable speed: 2 Mbps). For your connection, the acceptable range of speeds is 0.4 Mbps-2 Mbps.
    Additional information: Your DSL Connection Rate: 2.08 Mbps (downstream), 0.45 Mbps (upstream). IP Profile for your line is 1.75 Mbps.
    ADSL Line Status - Connection Information
    Line state: Connected
    Connection time: 5 days, 19:47:49
    Downstream: 2.031 Mbps
    Upstream: 448 Kbps
    ADSL Settings
    VPI/VCI: 0/38
    Type: PPPoA
    Modulation: G.992.1 Annex A
    Latency type: Interleaved
    Noise margin (Down/Up): 9.9 dB / 16.0 dB
    Line attenuation (Down/Up): 55.6 dB / 31.5 dB
    Output power (Down/Up): 18.7 dBm / 12.4 dBm
    FEC Events (Down/Up): 325497558 / 1732
    CRC Events (Down/Up): 151164 / 1811
    Loss of Framing (Local/Remote): 0 / 0
    Loss of Signal (Local/Remote): 0 / 0
    Loss of Power (Local/Remote): 0 / 0
    HEC Events (Down/Up): 344750 / 1160
    Error Seconds (Local/Remote): 47134 / 2206

  • I'm a bit confused. Since my original camera format was 720/60p, and I converted the footage to Pro Res422 in order to edit in Final Cut Pro 7, should I convert back to a higher quality format before sending the file to DVD Studio Pro?

    I'm a bit confused. Since my original camera format was 720/60p, and I converted the footage to Pro Res422 in order to edit in Final Cut Pro 7, should I convert back to a higher quality format before sending the file to DVD Studio Pro? If so, which Compressor codec is best to use in order to preserve the original 720/60p?   How do I maintain the highest quality?

    No...ProRes is a high quality format. Finishing format.  Many TV networks take that as a final deliverable. 
    BUT...DVDs aren't high definition...they are SD.  You cannot make a 720p60 DVD with DVD Studio Pro.  Any DVD you make will be SD...720x480.  The only HD DVD format out there is BluRay, and for that you need a BluRay burner.
    As for making a high quality DVD...using the BEST QUALITY settings in Compressor will work:
    #42 - Quick and dirty way to author a DVD
    Shane's Stock Answer #42 - David Roth Weiss' Secret Quick and Dirty Way to Author a DVD:
    The absolute simplest way to make a DVD using FCP and DVDSP is as follows:
    1. Export a QT movie, either a reference file or self contained using current settings.
    2. Open DVDSP, select the "graphical" tab and you will see two little monitors, one blue, one green.
    3. Select the left blue one and hit delete.
    4. Now, select the green one, right click on it and select the top option "first play".
    5. Now drag your QT from the browser and drop it on top of the green monitor.
    6. Now, for a DVD from an HD source, look to the right side and select the "general tab" in the track editor, and see the Display Mode, and select "16:9 pan-scan."
    7. Hit the little black and yellow burn icon at the top of the page and put a DVD in when prompted. DVDSP will encode and burn your new DVD.
    THATS ALL!!!
    NOW...if you want a GOOD LOOKING DVD, instead of taking your REF movie into DVD SP, take it into Compressor and choose the BEST QUALITY ENCODE (2-pass VBR) that matches your show timing.  Then take THAT result into DVD SP and follow the rest of the steps.  Except you can choose "16:9 LETTERBOX" instead of PAN & SCAN if you want to see the entire image.

  • Standard and generic COPA delta

    Hello
    I don't understand why many people say "standard COPA delta" versus "generic COPA delta".
    COPA is a generated application, so the delta should be generic.
    What do you mean when you talk (in many posts in this forum) about a standard COPA delta?
    Thanks

    Hi,
    COPA extraction is a generated application based on the operating concern configured in the R/3 system, and the same applies to the delta process.
    R/3 table TKEBWTS contains information about the COPA load requests that have been posted to BW. The timestamp entries in this table can be maintained to reinitialize the delta. This may be the reason some call it generic or standard delta.
    Dev

  • Question on Delta Init Loads

    Hi All,
    I have a basic question on Delta Inits. I am a beginner in BW and would truly appreciate any input.
    At my company we are currently doing delta loads on the cube 0RT_C35 (Material Movements). The delta inits were done prior to that.
    Now, if we were to add new fields to the cube and needed to populate data for the new fields from the beginning, how do I go about doing it?
    Should I delete the delta init requests and redo the delta inits? What will happen to the delta loads that are running daily then?
    Can someone please give me the steps that I need to follow, in sequence?
    Like I said, it might be a basic question..but I am not familiar with this process.
    Appreciate your help.

    Thank You very much Ravi.
    I will not be doing it today based on the priority of tasks, but this is something I will have to do in the next 1-2 weeks. I will definitely let you know how it goes.
    So once the mapping and everything is in place and ready to be loaded, basically I need to delete data from the data target, put the delta job on hold, re-init the delta requests, and then I should be able to run the delta jobs again, right?
    Thank you so much.
    The other issue is that there are probably 25 delta inits that were done based on the number range for the material number, but there is only 1 delta. So does the delta job cover the data for all the delta init requests?
    I am assigning full points to you for being such a great help. I will let you know when I actually end up doing this.
    I will assign the remaining points when I close the request.
    Thanks again

  • Delta INIT problems in production system

    Hi!
    How do you do? I hope all is fine. Well, I have a problem in the production system: I executed the delta init and the data is in BW. The problem is that when I want to activate the data, the system gives me a processing error. Can somebody help me, please?
    BR

    Hi
    Please check with your Basis team to confirm there are enough resources.
    Also check the maximum work process time: Tcode RZ11, parameter rdisp/max_wprun_time.
    Check this thread, which mentions a lot of related notes:
    Link: [Re: DSO-data activation issue]
    Regards
    Sanjyot

  • I'm a bit confused about standby log files

    Hi all,
    I'm a bit confused about something and wondering if someone can explain.
    I have a Primary database that ships logs to a Logical Standby database.
    Everything appears to be working properly. If I check the v$archived_log table in the Primary and compare it to the dba_logstdby_log view in the Logical Standby, I'm seeing that logs are being applied.
    On the logical standby, I have the following configured for log_archive_dest_n parameters:
    *.log_archive_dest_1='LOCATION=/u01/oracle/archivedlogs/ORADB1
    VALID_FOR=(ALL_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=PNX8A_GMD'
    *.log_archive_dest_2='LOCATION=/u02/oracle/archivedlogs/ORADB1
    VALID_FOR=(ALL_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=PNX8A_GMD'
    *.log_archive_dest_3='LOCATION=/u03/oracle/archivedlogs/ORADB1
    VALID_FOR=(ALL_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=PNX8A_GMD'
    *.log_archive_dest_4='SERVICE=PNX8A_WDC ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=PNX8A_WDC'
    *.log_archive_dest_5='LOCATION=/u01/oracle/standbylogs/ORADB1
    VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) DB_UNIQUE_NAME=PNX8A_GMD'
    *.log_archive_dest_6='LOCATION=/u02/oracle/standbylogs/ORADB1
    VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) DB_UNIQUE_NAME=PNX8A_GMD'
    *.log_archive_dest_7='LOCATION=/u03/oracle/standbylogs/ORADB1
    VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) DB_UNIQUE_NAME=PNX8A_GMD'
    Here is my confusion now. Before converting from a Physical standby database to a Logical Standby database, I was under the impression that I needed the standby logs (i.e. log_archive_dest_5, 6 and 7 above) because a Physical Standby database would receive the redo from the primary and write it into the standby logs before applying the redo in the standby logs to the Physical standby database.
    I've now converted to a Logical Standby database. What's happening is that the standby logs are accumulating in the directory pointed to by log_archive_dest_6 above (/u02/oracle/standbylogs/ORADB1). They do not appear to be getting cleaned up by the database.
    In the Logical Standby database I do have STANDBY_FILE_MANAGEMENT parameter set to AUTO. Can anyone explain to me why standby log files would continue to accumulate and how I can get the Logical Standby database to remove them after they are no longer needed on the LSB db?
    Thanks in advance.
    John S

    JSebastian wrote:
    > I assume you mean in your question, why on the standby database I am using three standby log locations (i.e. log_archive_dest_5, 6, and 7)? If that is your question, my answer is that I just figured more than one location would be safer, but I could be wrong about this. Can you tell me if only one location should be sufficient for the standby logs? The more I think of this, that is probably correct, because I assume that log transport services will re-request the log from the primary database if there is some kind of error at the standby location with the standby log. Is this correct?
    Configure it as simply as below. Why use multiple destinations for the standby?
    Check the note "Step by Step Guide on How to Create Logical Standby" [ID 738643.1]:
    LOG_ARCHIVE_DEST_1='LOCATION=/arch1/boston VALID_FOR=(ONLINE_LOGFILES,ALL_ROLES) DB_UNIQUE_NAME=boston'
    LOG_ARCHIVE_DEST_2='SERVICE=chicago LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=chicago'
    LOG_ARCHIVE_DEST_3='LOCATION=/arch2/boston/ VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) DB_UNIQUE_NAME=boston'
    The following describes the archival processing defined by the initialization parameters shown above:
    LOG_ARCHIVE_DEST_1 - When boston runs in the primary role: directs archival of redo data generated by the primary database from the local online redo log files to the local archived redo log files in /arch1/boston/. When boston runs in the logical standby role: directs archival of redo data generated by the logical standby database from the local online redo log files to the local archived redo log files in /arch1/boston/.
    LOG_ARCHIVE_DEST_2 - Primary role: directs transmission of redo data to the remote logical standby database chicago. Logical standby role: ignored; LOG_ARCHIVE_DEST_2 is valid only when boston is running in the primary role.
    LOG_ARCHIVE_DEST_3 - Primary role: ignored; LOG_ARCHIVE_DEST_3 is valid only when boston is running in the standby role. Logical standby role: directs archival of redo data received from the primary database to the local archived redo log files in /arch2/boston/.
    Source:-
    http://docs.oracle.com/cd/B19306_01/server.102/b14239/create_ls.htm
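Regarding the original question about standby logs accumulating: on a logical standby, archived logs that SQL Apply has finished with can be purged by the database itself. A sketch, assuming Oracle 10.2 or later (verify the parameter against your version's DBMS_LOGSTDBY documentation before relying on it):

```sql
-- Let SQL Apply delete remote archived logs once they are no longer
-- needed by the logical standby (this is the default in newer releases):
EXECUTE DBMS_LOGSTDBY.APPLY_SET('LOG_AUTO_DELETE', 'TRUE');

-- Applied-log status can be checked in DBA_LOGSTDBY_LOG:
SELECT sequence#, applied FROM dba_logstdby_log ORDER BY sequence#;
```

Note that STANDBY_FILE_MANAGEMENT=AUTO governs datafile management, not archived-log cleanup, which may explain why the logs were not being removed.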

  • Delta Init failed

    Hi All:
    I don't understand the reason behind the delta init failure. The source system check is also OK. The message reads:
    Transfer IDocs and tRFC: Errors occurred
    Request IDoc: Application document not posted
    Info IDoc 1: Application document posted.
    Also, the extraction tab is red and reads "Error occurred in the data selection".
    Thanks!
    Manasa.

    Replicating the DataSource didn't help. I did a full update, but it's not allowing me to do an init. With the full upload, when I tried to see the data in the ODS, this message stopped me:
    "Your user master record is not sufficiently maintained for object"
    Thanks,
    Manasa.

  • Full repair or Delta Init?

    Hi Experts
    I need your advice.
    a)
    Sometimes we need to reload one month, and I'm using a 'full repair', but this gives me errors (duplicate or missing records) when users are working in R/3 at the same time. The other possibility is a 'full load' + 'delta init without data transfer'. Which is better in this case?
    b) People say that after using a 'full load' you lose the init flag and it is necessary to do an init again.
    In this case I did a 'full load' for 001.2008 and a 'full repair' for 002.2008, and the init flag / delta load seemed OK. Is this possible?
    c) When using a 'full repair' (with selection scope = 001.2008), how does it work? Does it read from R/3 again, or repeat the delta loads for the month specified in the selection in the InfoPackage?
    Thank you in advance!!

    Hello,
    Please see these links:
    data extraction via combination of full + init updates
    How to load init -delta requests when full load already exist
    Full Upload and Init Delta Load
    Thanks
    Chandran

  • Save Options a Bit Confusing

    Hi,
    I've been using LabVIEW for about 7 months now and am still a bit confused about the save options. It seems that whenever I modify a sub-VI for my app, the previous sub-VI I created ends up with the same settings that I changed in the newly modified one.
    Example: I use an existing VI created by Daytronic Corp for monitoring a binary signal. I change the channel and a few other settings and name the file oil press.vi, saving it in the new folder I created for my app. Then I close the oil press.vi I modified, then close the original, unchanged VI without saving it. When I create another VI starting with the original, unmodified one again, and save it using the same procedure but with a different filename (let's call it fuel press.vi), oil press.vi takes on the characteristics of fuel press.vi, although the filename is still oil press.vi. So I end up having to use the "replace" option on the block diagram and select oil press.vi as the replacement file. I have to keep doing this for every sub-VI I create.
    This takes a lot of time and is a bit annoying.
    I hope you can understand what I'm saying.  The Save options seem confusing to me, with all the references to keeping a vi in memory, closing the original, etc.  Is there some text or a tutorial online that explains the save options in a simpler, more direct form?  I have even read about it in my book (Labview for Everyone) and it doesn't go into much more detail about how saving the Vis actually works.
    Thanks!
    Todd Munsell
    Turbine Engine Test Technician/IT Tech
    Wood Group Pratt & Whitney Industrial Turbine Services, LLC
    Plattsburgh, NY

    Hi Mike,
    Actually, I'm using LV 8.5.  It's the descriptions of the save options that are slightly confusing.  The information about what's left in memory and what's closed and opened is what's got me baffled.
    For example, what, in layman's terms, do each of these options below mean... exactly?  I would think that an option that simply says "Save as path/filename" would be enough.  Then, the renamed, saved file stays open, since that's the one that's currently being used, and the original is closed and left as it was.  Which option (and sub-option) would I choose to do this?
    The number of options below seems a bit excessive, and not very clearly defined, at least to me.   Even the explanation in the book "LabVIEW for Everyone", which is supposed to be for beginners, doesn't do a very good job of describing exactly what each option does.  Thanks again!
    Original file - Displays the path to the open file. You can use this field to determine the location of a file on disk.
    Copy - Creates and saves a copy of the file in memory to disk with a name you choose. If you enter a new file path or name for the file, the original file on disk is not overwritten or deleted.
    Substitute copy for original - Both the original file and the new file exist on disk, but the original VI closes and the new VI opens. Use this option if you want to create a copy of the original file and immediately edit the copy. Caution: if the original file has calling VIs in memory, this option updates all these callers to refer to the new file. If the original file is in a project library or project, this option substitutes the new file for the original in the project library or project.
    Create unopened disk copy - Both the original file and the new file exist on disk, but only the original file remains open in memory. Use this option if you want to create a copy of the original file but continue editing the original file, for example, if you want to create a backup copy. This option does not update callers in memory to refer to the new file. You can update callers manually by finding all instances of a VI and updating the name of each instance. If the original file is in a library or project, this option does not add the new file to the same library or project.
    Open additional copy - Both the original file and the new file exist on disk and are open in memory. You must give the copy a new name, because two files of the same name cannot exist in the same application instance at the same time. Use this option if you want to create a copy of the original file, continue editing the original file, and also immediately edit the new file. This option does not update callers in memory to refer to the new file. You can update callers manually by finding all instances of a VI and updating the name of each instance. If the original file is in a library or project, you have the option of adding the new file to the same library or project by placing checkmarks in the appropriate checkboxes. If you place a checkmark in the checkbox to add the copy to the library, the checkbox to add the copy to the project is disabled and unchecked.
    Rename - Renames the file in memory with a new name you choose. This option closes and deletes the original file and opens the file with the new name, so only the new file exists on disk and in memory. Use this option if you want to change the name and/or location of the original file. Caution: if the original file has calling VIs in memory, this option updates all these callers to refer to the new file. If the original file is in a library or project, this option adds the new file to the same library or project, and removes the original file.
    Duplicate hierarchy to new location - Use this option if you want to save the original VI and its hierarchy to a new location. You might want to rename the new VI hierarchy after duplicating it.
    Message Edited by tmunsell on 08-09-2008 01:54 AM
    Message Edited by tmunsell on 08-09-2008 01:56 AM
    Todd Munsell
    Turbine Engine Test Technician/IT Tech
    Wood Group Pratt & Whitney Industrial Turbine Services, LLC
    Plattsburgh, NY
