Import and contain?

Hi,
My wife wants me to import a group of contacts she's compiled of invitees to a party we're throwing. The problem is twofold:
1) When I import her list, which contains some contacts I already have in my list, her duplicate contacts overwrite my existing ones, which is not good in cases where I have more detailed contact information for the person than she does. Is there a way to import her group without overwriting the duplicates using her info?
2) There are many contacts in this group that I'll want to delete from my contact list afterward, as my car pulls in all of my contacts via Bluetooth, and I'd rather purge most of this group after the invites are sent. Is there a way to import these individuals and contain them for later deletion without having them added to my main contact list?
Whew! Thanks for reading all of that, and thanks so much in advance for any help you might be able to give!
Cheers,
Rob

Oops... figured this out with the update options; and I can just save my Address Book and restore it to that saved version whenever I want.

Similar Messages

  • How do I delete photos which were being imported and are now stored in a folder which I can't access


    Go to your Pictures folder and find the iPhoto Library there. Right- (or Control-) click on the icon and select 'Show Package Contents'. A Finder window will open with the Library exposed.
    Look there for a Folder called 'Import' or 'Importing'.
    Drag it to the Desktop. *Make no other changes*.
    Start iPhoto. Does that help?
    If it does then look inside that folder on your desktop. Does it contain anything you want? If not you can trash the folder.

  • Error while importing a container dump file

    Hi,
    I am trying to import a container from one Oracle server to another server; the two are located at two different sites. I exported the container from the first server, FTPed it to the second server, and tried to import it into the second server. While doing this, I got the following message:
    Connected to: Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
    With the Partitioning option
    JServer Release 8.1.7.0.0 - Production
    Export file created by EXPORT:V08.01.07 via conventional path
    import done in WE8ISO8859P1 character set and WE8ISO8859P1 NCHAR character set
    . importing REPOS_MANAGER's objects into REPOS_MANAGER
    . . importing table "XTSYS_EXPORT_OBJECTS" 1 rows imported
    . . importing table "XTSYS_IMPORT_IRID_MAPPING"
    IMP-00009: abnormal end of export file
    IMP-00018: partial import of previous table completed: 1281 rows imported
    IMP-00033: Warning: Table "XTSYS_IMPORT_IVID_MAPPING" not found in export file
    IMP-00033: Warning: Table "XTSYS_TABS_EXPORTED" not found in export file
    IMP-00033: Warning: Table "XTSYS_RM$REPOSITORIES" not found in export file
    Import terminated successfully with warnings.
    I am able to import the same dump file into another server located in the same location as Server One without any issues.
    One thing that I have observed is: on Server 2, if a container is already present and the XTSYS_ tables are already populated, when I tried to import this dump file, it asked me "Temporary tables are already populated. Do you want to overwrite them? Otherwise the existing temporary tables will be used for this purpose". If I clicked No, then it imported the container without any problems, BUT the new container contents are not the same as the contents of Server 1's container.
    Can somebody suggest what the problem could be and how I can import that dump file?
    thanks a lot in advance
    regards,
    Vijay

    Vijay,
    There is a note <Note:160378.1> on the Oracle Support site: http://metalink.oracle.com.
    There are a number of possibilities, the most likely of these is that you are using the wrong import/export utilities for an older or newer database version.
    You don't mention which release of Designer you are using, Designer 9i uses the 9.0.1 export/import utilities by default. You cannot use the 9.0.1 export/import utilities against an 8.1.7 database.
    If not already installed, install the 8.1.7 export/import utilities from the Oracle 8i Release 3 (8.1.7) Client CD into a new Oracle Home.
    In the Registry, change [hklm]\software\oracle\HOMEx\repos61\EXECUTE_EXPORT and EXECUTE_IMPORT to point to the 8.1.7 export/import utilities.
    (HOMEx corresponds to your Designer 9i Oracle Home)
    A typical value for them would be:
    EXECUTE_EXPORT=d:\des_817\bin\exp.exe
    EXECUTE_IMPORT=d:\des_817\bin\imp.exe
    or
    EXECUTE_EXPORT=d:\V817\bin\exp.exe
    EXECUTE_IMPORT=d:\V817\bin\imp.exe
    After changing the parameters in the Registry, verify that they are active by going into the Repository Administration Utility, Check Requirements, Parameter Settings, and look for EXECUTE_EXPORT and EXECUTE_IMPORT.
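    As a quick sanity check after the registry change (just a sketch, assuming the 8.1.7 client was installed under d:\des_817), you can also run the utilities from a command prompt with help=y and confirm that the banner reports Release 8.1.7.0.0:
    d:\des_817\bin\exp.exe help=y
    d:\des_817\bin\imp.exe help=y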
    Regards
    Sue

  • Internal table with Import and Export

    Hi All,
    Please let me know the use of internal tables with IMPORT and EXPORT parameters and SET/GET parameters - in what types of cases can we use these? Please give me the syntax with some examples.
    Please give me a detailed analysis of the above.
    Regards,
    Prabhu

    Hi Prabhakar,
    There are three types of memory:
    1. ABAP memory
    2. SAP memory
    3. External memory
    1. We use EXPORT TO / IMPORT FROM MEMORY ID when we want to transfer data via ABAP memory.
    2. We use SET PARAMETER ID / GET PARAMETER ID to transfer data via SAP memory.
    3. We use EXPORT TO / IMPORT FROM SHARED BUFFER to transfer data via external memory.
    ABAP memory: two reports in the same session share ABAP memory.
    SAP memory: two different sessions share SAP memory. For example, if we call two different transactions, SE38 and SE11, they are both in SAP memory.
    External memory: two different logons share external memory.
    Syntax
    To fill the input fields of a called transaction with data from the calling program, you can use the SPA/GPA technique. SPA/GPA parameters are values that the system stores in the global, user-specific SAP memory. SAP memory allows you to pass values between programs. A user can access the values stored in the SAP memory during one terminal session for all parallel sessions. Each SPA/GPA parameter is identified by a 20-character code. You can maintain them in the Repository Browser in the ABAP Workbench. The values in SPA/GPA parameters are user-specific.
    ABAP programs can access the parameters using the SET PARAMETER and GET PARAMETER statements.
    To fill one, use:
    SET PARAMETER ID <pid> FIELD <f>.
    This statement saves the contents of field <f> under the ID <pid> in the SAP memory. The code <pid> can be up to 20 characters long. If there was already a value stored under <pid>, this statement overwrites it. If the ID <pid> does not exist, double-click <pid> in the ABAP Editor to create a new parameter object.
    To read an SPA/GPA parameter, use:
    GET PARAMETER ID <pid> FIELD <f>.
    This statement fills the value stored under the ID <pid> into the variable <f>. If the system does not find a value for <pid> in the SAP memory, it sets SY-SUBRC to 4, otherwise to 0.
    To fill the initial screen of a program using SPA/GPA parameters, you normally only need the SET PARAMETER statement.
    The relevant fields must each be linked to an SPA/GPA parameter.
    On a selection screen, you link fields to parameters using the MEMORY ID addition in the PARAMETERS or SELECT-OPTIONS statement. If you specify an SPA/GPA parameter ID when you declare a parameter or selection option, the corresponding input field is linked to that SPA/GPA parameter.
    On a screen, you link fields to parameters in the Screen Painter. When you define the field attributes of an input field, you can enter the name of an SPA/GPA parameter in the Parameter ID field in the screen attributes. The SET parameter and GET parameter checkboxes allow you to specify whether the field should be filled from the corresponding SPA/GPA parameter in the PBO event, and whether the SPA/GPA parameter should be filled with the value from the screen in the PAI event.
    When an input field is linked to an SPA/GPA parameter, it is initialized with the current value of the parameter each time the screen is displayed. This is the reason why fields on screens in the R/3 System often already contain values when you call them more than once.
    When you call programs, you can use SPA/GPA parameters with no additional programming overhead if, for example, you need to fill obligatory fields on the initial screen of the called program. The system simply transfers the values from the parameters into the input fields of the called program.
    However, you can control the contents of the parameters from your program by using the SET PARAMETER statement before the actual program call. This technique is particularly useful if you want to skip the initial screen of the called program and that screen contains obligatory fields.
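    A minimal sketch of that last point (the parameter ID 'MAT' and transaction MM03 are standard; the value is only an illustration): setting the parameter first lets the called transaction read the material number from SAP memory and skip its initial screen.
    DATA gv_matnr TYPE matnr VALUE '100-100'.
    SET PARAMETER ID 'MAT' FIELD gv_matnr.
    CALL TRANSACTION 'MM03' AND SKIP FIRST SCREEN.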
    Reading Data Objects from Memory
    To read data objects from ABAP memory into an ABAP program, use the following statement:
    Syntax
    IMPORT <f1> [TO <g1>] <f2> [TO <g2>] ... FROM MEMORY ID <key>.
    This statement reads the data objects specified in the list from a cluster in memory. If you do not use the TO <gi> option, the data object <fi> in memory is assigned to the data object in the program with the same name. If you do use the option, the data object <fi> is read from memory into the field <gi>. The name <key> identifies the cluster in memory. It may be up to 32 characters long.
    You do not have to read all of the objects stored under a particular name <key>. You can restrict the number of objects by specifying their names. If the memory does not contain any objects under the name <key>, SY-SUBRC is set to 4. If, on the other hand, there is a data cluster in memory with the name <key>, SY-SUBRC is always 0, regardless of whether it contained the data object <fi>. If the cluster does not contain the data object <fi>, the target field remains unchanged.
    Saving Data Objects in Memory
    To read data objects from an ABAP program into ABAP memory, use the following statement:
    Syntax
    EXPORT <f1> [FROM <g1>] <f2> [FROM <g2>] ... TO MEMORY ID <key>.
    This statement stores the data objects specified in the list as a cluster in memory. If you do not use the FROM <gi> option, the data object <fi> is saved under its own name. If you use the FROM <gi> option, the data object <gi> is saved under the name <fi>. The name <key> identifies the cluster in memory. It may be up to 32 characters long.
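    A minimal sketch putting EXPORT and IMPORT together (the memory ID 'FLIGHT_DATA' and the table used are only illustrations):
    DATA itab TYPE TABLE OF sflight.
    " Calling program (same session): store the internal table in ABAP memory.
    EXPORT itab FROM itab TO MEMORY ID 'FLIGHT_DATA'.
    " Called program: read it back; SY-SUBRC is 4 if no cluster exists under that ID.
    IMPORT itab TO itab FROM MEMORY ID 'FLIGHT_DATA'.
    IF sy-subrc <> 0.
      WRITE: / 'No data cluster FLIGHT_DATA found in ABAP memory'.
    ENDIF.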
    Check this link.
    http://www.sap-img.com/abap/difference-between-sap-and-abap-memory.htm
    Thanks,
    Susmitha.
    Reward points for helpful answers.

  • Using databases and container events?

    I'm having trouble finding documents that show how one should use databases and listen for container events. I'm using Berkeley Java DB. I'll need all my remote objects to have access to the database. How do I pass that in? Custom adapters? Also, when Tomcat shuts down I need to close the database; how would this be done? Custom adapters? So I may be answering my own question, but I'm asking because I really don't see anything in the documentation that points this out. It really looks like I'd be using singletons and factories for this. I could register a Servlet or ServletContext listener to listen for Tomcat events, but I figured it couldn't hurt to ask others using Blaze, as I'm very new to it. Thanks.

    So how would you tell Blaze to use this? Is there something you put into the config file? My thinking on all this is to use PicoContainer to manage all services and then create an adapter to inject all resources when creating remote objects. But this seems a little heavy if Blaze can do all this for me. The adapter seems like the injection point to me. Is that not what it's used for? Or is using it like that bad practice under Blaze? I've read all I can find on Blaze, so either it's missing docs, or I'm not finding them all. If I'm missing them, by all means let me know where to go. I don't want to ask questions that are already answered in documents elsewhere.
    On Tue, May 20, 2008 at 10:51 AM, Mete Atamel wrote:
    I think it's mainly in JavaDocs but here's a sample I have where I check to make sure the HSQLDB is running as the BlazeDS starts up:
    import java.sql.Connection;
    import java.sql.SQLException;
    import flex.messaging.config.ConfigMap;
    import flex.messaging.services.AbstractBootstrapService;
    public class DatabaseCheckService extends AbstractBootstrapService
    {
        // This is called right before the server starts up.
        public void initialize(String id, ConfigMap properties)
        {
            Connection c = null;
            try
            {
                // Check that the database is running...
                c = ConnectionHelper.getConnection();
                // ... if yes, return
                return;
            }
            catch (SQLException e)
            {
                System.out.println("DB is not running!");
            }
            finally
            {
                // ConnectionHelper is the sample's own helper; it quietly closes the connection.
                ConnectionHelper.close(c);
            }
        }

        // This is called as the server is starting.
        public void start()
        {
            // No-op
        }

        // This is called as the server is stopping.
        public void stop()
        {
            // No-op
        }
    }
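    On the config-file part of the question: as far as I recall, a bootstrap service like this one is registered in WEB-INF/flex/services-config.xml with a plain service entry, something along these lines (the id is just an illustrative name, and the class name should be fully qualified if the class lives in a package):
    <services>
        <service id="database-check-service" class="DatabaseCheckService"/>
    </services>
    BlazeDS then instantiates the class at startup and calls initialize(), start() and stop() around the server lifecycle, as the comments in the sample indicate.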

  • When to use import and export directives?

    I'm very confused about what the import and export language extensions do and when one should use them.
    I have a Bridge application with multiple source files. As long as I don't call code in another file from the startup execution path (before all source files are loaded) and only call across modules after Bridge is running (e.g. from events in Bridge), I don't seem to need import and export. All source files seem to be in the same namespace and all can see each other's global objects/functions.
    So, when would someone use import or export? And, what is the #engine that the manual discusses in relation to import and export? If some of my source files are library files that I intend to be used by multiple different files/applications, should I be using import/export for that use? Or is this only applicable when there are somehow multiple JavaScript engines involved? And when would that situation arise?
    --John

    John,
    The import and export directives don't really matter in Bridge. All scripts execute in the same namespace. That's why namespacing your scripts is crucial in Bridge.
    The #engine directive creates another JavaScript engine instance containing the script with the directive. If you then wanted to use stuff from that engine, the import and export directives would be needed. But even with that, in Bridge, the main engine will contain the script. There will just be a second engine (you can see this in the ESTK) containing the same stuff. Since it's in the main engine, there's no point in using the import and export.
    In some of the other apps (GoLive, for instance), each script executes in its own engine and namespace, and import and export are needed.
    This could change in future versions, and if it does, we'll be certain to make sure everyone knows about it.
    Bob

  • Validating CSV File BEFORE importing and moving using an SSIS package

    I'm trying to come up with a process (I'm used to doing this kind of stuff in T-SQL) in SSIS to grab a collection of .csv files from a folder, validate them, and then, depending on whether they pass validation, import them into my database and move them into an archive folder. If a .csv does not pass validation, then it does not get imported and instead gets placed into an error folder. The validation requirement is that Column2 of my .csv contains a WKT text field, and the file is considered valid if every record can be successfully converted into a Geometry/Geography type field.
    Column0 Column1 Column2
    abcd efgh LINESTRING (-71.4555487 41.6079686, -71.4550113 41.6088851)
    ijkl mnop LINESTRING (-70.0748669 48.6634506, -70.0499 48.6548479)
    qrst uvwx LINESTRING (-70.3159285 48.4199802, -70.3168512 48.4187551)

    For that, what you can do is dump the records to a staging table and use the STIsValid function; see
    http://technet.microsoft.com/en-us/library/bb933890.aspx
    Finally, take the count of failed rows and, if the count > 0, set a boolean variable to true/false accordingly. Then for the true case move the data to the final table, and for the false case (invalid values) move the file to the error folder.
    So the package will look like below:
    1. Data flow task to transfer the data to staging.
    2. Execute SQL task to call a procedure which validates the staging data and returns a bit result. Set it to an SSIS variable using the procedure's output parameter (a rough sketch of such a procedure follows this list).
    3. Connect to a data flow task to do the transfer from staging to destination. The precedence constraint option would be Expression And Constraint, with the expression
    @BooleanVariable == True
    and the constraint as OnSuccess.
    4. Connect to a file system task to archive the file. The precedence constraint option would be Expression And Constraint, with the expression
    @BooleanVariable == False
    and the constraint as OnSuccess.
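    A rough sketch of the validation procedure from step 2 (the procedure, staging table, and parameter names here are just placeholders, not anything the package creates for you):
    CREATE PROCEDURE dbo.usp_ValidateStagedWKT
        @IsValid BIT OUTPUT
    AS
    BEGIN
        DECLARE @FailedCount INT = 0;
        BEGIN TRY
            -- STIsValid() = 0 flags rows whose WKT parses but is not a valid geometry;
            -- a WKT string that cannot be parsed at all raises an error and lands in CATCH.
            SELECT @FailedCount = COUNT(*)
            FROM dbo.StagingWKT
            WHERE geometry::STGeomFromText(Column2, 0).STIsValid() = 0;
        END TRY
        BEGIN CATCH
            SET @FailedCount = 1;
        END CATCH;
        -- Map the result onto the bit the package reads into its boolean SSIS variable.
        SET @IsValid = CASE WHEN @FailedCount = 0 THEN 1 ELSE 0 END;
    END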
    Please mark this as answer if it helps to solve the issue.
    Visakh
    http://visakhm.blogspot.com/
    https://www.facebook.com/VmBlogs

  • Import and rename photos via Automator and ExifTool

    Hiya,
    Just finished my work on an Automator workflow to import and rename photos from a portable hard disk.
    I had the problem that my portable hard disk (with integrated card-reader) was not recognised by Image Capture or Aperture. Therefore I couldn't import my photos straight away into Aperture.
    Another problem I had was that Aperture doesn't rename photos while importing. While other Automator actions offer this feature, none of them fulfilled my requirements - yeah, I can be very fussy.
    That's why I started initially to write a Perl script based on ExifTool in order to copy and rename photos to my liking. Once this was done and very much in use, I started to put it into an Automator action with a little AppleScript around it.
    Eventually, I initiated the entire Automator workflow with the Check For Disk action.
    More information and the files needed can be found on my website http://photography.stephanjaeger.eu/2007/03/improve-your-digital-photography-workflow.html.
    The only drawback of this script is that only one camera is supported. At the moment, I work on a solution to support several cameras.
    The Perl script can be used by itself, if you feel comfortable using the Terminal.
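    If you just want a flavour of what ExifTool can do from the Terminal, a plain date-based rename looks something like this (the folder and date format are only examples, and this renames in place rather than copying):
    exiftool "-FileName<DateTimeOriginal" -d "%Y%m%d-%H%M%S%%-c.%%e" /path/to/photos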
    I hope you find the information and the Automator action helpful.
    Any feedback appreciated.
    Best regards,
    Stephan Jaeger
    Mac Pro   Mac OS X (10.4.8)   2.66 GHz, 4 GB RAM

    Brian,
    Thanks for giving it a try. Let me know how you get on!
    If you have any problems, please don't hesitate to contact me.
    By the way, you don't have to use an external hard disk. You could easily replace the first step in that workflow with another action, such as defining a folder that should be monitored, or manually selecting a folder that contains photos.
    I hope that I find some time over the weekend to make some changes to the script. I will also set up a page on my website with more details.
    All the best,
    Stephan

  • Importing and Organizing Thousands of Photos into IPhoto

    How can I import about 20 thousand photos into iPhoto and keep the logical organization?
    I just bought a Mac mini and started trying to move my photos. These are organized in directories by date with a customized description for the occasion (…/YEAR/YEAR-MONTH-DAY + DESCRIPTION/).
    I tried with a few directories and I realized that in order to use iPhoto and keep the logical organization I would have to create and name about 500 albums manually and look for the photos for every album again one by one.
    It sounds hard because of the number of albums and photos.

    There is a 3rd party application, iPhotoTool, that will let you select the top folder containing many subfolders and import all of them into iPhoto, creating an album for each subfolder. That will save you a lot of manual work. Then all you need to do is create folders in the Source pane to organize your albums. Be sure not to allow duplicates each time it asks or you'll get duplicates in many of the folders. You'll be asked for each subfolder above the lowest, but that's a lot easier than manually finding and dragging each folder into iPhoto's Source pane.
    TIP: For insurance against the iPhoto database corruption that many users have experienced I recommend making a backup copy of the Library6.iPhoto database file and keep it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That insures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
    I've written an Automator workflow application (requires Tiger), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.

  • Error message using SQL import and Export Wizard with Excel spread sheet

    I am trying to import an Excel spreadsheet using the Import and Export Wizard that is provided with SQL 2014.
    Everything seems to be set up OK, but then when I go to do the transfer, an error message comes up saying:
    External Table is not in the expected format.(Microsoft Jet Database Engine).   
    This is a spreadsheet and there is no database, that is why I want to import it into a database!!  So the reference to JET is a tad disappointing.
    So why the error and how do I fix the problem?
    As there are over 100 variables in the spreadsheet rows, it would be great to have this automatically create the database and populate the fields.
    I am using SQL 2014  Express and Office Excel 2013.
    Thank you in advance for taking the time to read this, and hopefully shedding some light on the issue.

    Hi AWlcurrent,
    When importing a .xlsx file to SQL Server using the SQL Server Import and Export Wizard, you see an “External Table is not in the expected format. (Microsoft Jet Database Engine)” error message. This error message indicates that the Microsoft Jet Database Engine is unable to handle something that is contained in the file.
    So please make sure there is no unsupported content in the Excel file. Alternatively, we can use ‘Microsoft Excel’ as the Data Source, then select ‘Microsoft Excel 2007’ as the Excel version to import the Excel file.
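    Under the hood, choosing ‘Microsoft Excel 2007’ makes the wizard use the ACE OLE DB provider rather than Jet; for reference, the resulting connection string looks roughly like this (the path is only an example):
    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\MyWorkbook.xlsx;Extended Properties="Excel 12.0 Xlsx;HDR=YES";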
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • No separate albums created when importing folders containing photos

    Hi ... Need some help here please..
    When importing folders containing photos in iPhoto, each folder would be made into an Event. When I do the same with Photos, all my pictures are placed into Albums > All Photos. No separate Albums are made for each folder; all the pictures are mixed up together.
    Why is Photos doing this?
    (I'm not importing from iPhoto ... new Mac)

    Simon
    Yes it does say that, but I have to say that a significant number of people can't get that to work, and I've never found an adequate reason why.
    I recommend that you begin to import at the bottom of your hierarchy. Drag to the left-hand pane and iPhoto will create a Roll and an Album of the same name. You can then create the enclosing folders in iPhoto (File -> New Folder).
    Regards
    TD

  • Can I access and edit photos direct from the Finder rather than import and duplicate in iPhoto?

    Hi, I have tens of thousands of photos so would rather not duplicate them in two places and eat into my storage. Is it possible to access photos saved in My Pictures direct from iPhoto without having to import them directly (and therefore duplicate them) into iPhoto? Any help appreciated! Thank you

    Yes but you don't want to.
    It's called running a Referenced Library and you can read about the issues and pitfalls in this thread:
    https://discussions.apple.com/thread/3062728?tstart=0
    Frankly the better way to work is to import them to iPhoto and let iPhoto copy the files. Live and work with it for a while, and when you're content, trash your pre-existing filing system - it's of no use to you anyway. If you use iPhoto (or similar apps, like Aperture, Lightroom etc) you access the photos via those apps and not the Finder - that's the point of using a Photo Manager, after all.
    If you have a folder tree or pre-existing organisation structure you can recreate it quite quickly in iPhoto. For instance:
    If you want to duplicate your Folder Tree in iPhoto:
    Start at the bottom of the hierarchy and drag a folder of images to the Album Heading in iPhoto. The pics will be imported and an Album of the same name created.
    You can then create the Enclosing Folders in the iPhoto Window (File -> New Folder) and drag the Album to it. Folders can contain other Folders (Nested Folders) and Albums.
    However, is your folder system date based? Then this form of organisation is a bit pointless in iPhoto, when Smart Albums or the Calendar tool (click on the wee magnifying glass in the Search Box) mean you can find the photos taken on any day, month or year at a click. With Smart Albums it's easy to find photos from a specific range - say, June 3 to August 25, 2009, etc.
    If your folder system is theme based - Xmas pics, Birthday pics etc, then you'll find Keywords are much more flexible, and can be used in conjunction with other criteria for making Smart Albums and searches.
    Regards
    TD

  • Import and/or export for ActionScript without extension?

    I have a project in which I need to import several hundred PNG images.
    When I import them into the library, then select all the images, then view the properties, I check Export for ActionScript and Export in frame 1.
    Now, all my images have class names that I can access through my code; however, it included the file extensions in the class names, thus making invalid class names.
    Is there any way I can import and/or set class names so they do not include the file extensions, without having to do it manually for EVERY file?

    My actual project has something like 9200 images in it.
    If my main project, let's call it "main.fla", were to contain every one of those images, every time I tried to run or debug it, it would take FOREVER to compile... so a workaround that I discovered in a previous project was to put all my graphics and media into an SWC file (let's call it MyLibrary.swc).
    While I was trying to get the SWC working, I discovered that while my media in the SWC was indeed set up to be exported for ActionScript with unique classes and everything, when I compiled and built my project I could not get any of those media elements using the getDefinitionByName function.
    After a lot of reading and trial and error, I found that the only way to get it to work was to have, someplace in my code, an actual reference to the objects that I was trying to get with getDefinitionByName. It turned out that without instantiating the object in the code, the compiler was omitting it.
    Now, by instantiate, I mean doing this:
    var bm:BitmapData = new MyImage();
    Where MyImage.png is in my swc and has a class defined as "MyImage"
    Once I do that, then I can do this
    var classRef:Class = Class(  getDefinitionByName("MyImage")  );
    But without that line above SOMEWHERE in my code (even if it's not actually executed), the getDefinitionByName returns an error saying it can't find the class.
    Now, here's the problem.
    When I import all my images into my SWC library they get named with the file extension... "MyImage.png", and when I select all the images I imported and batch-set Export for ActionScript, they automatically get assigned a class name equal to their library name, so the class name is "MyImage.png".
    so if I wanted to use getDefinitionByName to access the class, I would need to instantiate it in my code by doing this:
    var bm:BitmapData = new MyImage.png();
    BUT, that is erroneous, as the period causes an error, and the Adobe docs say right in them that class names can't use periods or any other non-alphanumeric characters.
    So while the getDefinitionByName function DOES actually work with classes with periods in them, because my media is external to the main.fla, I can't load it unless it's instantiated first, which leaves me right back where I was... having to manually rename over 9000 classes one at a time.
    I did some searching and found the "Flash Library Renamer" extension, but it requires changing all my class names to be sequential, which the vast majority are not.
    Is there a way that I can import the files into the library and have it omit the file extension?
    If not, is there a way that I can set the class names in a batch and not have it use the file extensions?
    If not, is there a Flash extension or plug-in that will allow batch renaming using regular expressions or wildcards?
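    One possibility worth trying (an untested JSFL sketch, assuming the library items expose the linkageClassName and linkageExportForAS properties, as they do in recent Flash versions): a small command run from the Commands menu could strip the extension from every linkage class name in one pass.
    var items = fl.getDocumentDOM().library.items;
    for (var i = 0; i < items.length; i++) {
        var item = items[i];
        // Only touch items exported for ActionScript whose class name ends in ".png".
        if (item.linkageExportForAS && /\.png$/i.test(item.linkageClassName)) {
            item.linkageClassName = item.linkageClassName.replace(/\.png$/i, "");
        }
    }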

  • Import and Copy Opening – One Task & One Prompt Set

      Hi Experts,
    We want to create one Data Manager package that contains both Import from BW (with UI) and Copy Opening. We ran into some issues. Maybe you can help:
    We created a new process chain that contains both tasks, and created a new DM package that has both scripts. Users only want to enter parameters such as entity and time one time for both tasks...
    When I commented out the Copy Opening prompts, relying on the prompts from the Import from BW, I received an error. The message:
    Task name LOAD
    INFOPROVIDER:
    Replace and Clear Submit count: 1114
    Replace and Clear Reject count: 0
    Aggregate count: 376
    Submit count: 312
    Reject count: 0
    Invalidate selection condition
    I believe it's because we have %SELECTION% recorded twice in the script (thinking that we would take the variables from the import and continue to use them for the Copy Opening).
    I also thought about switching the prompts around, hard-coding the cube name, and using a dimension selection screen like Copy Opening's.
    Any ideas for executing this would be really helpful. Below please find the standard scripts for Import and Copy Opening.
    Thank you.
    Eyal Feiler
    'DEBUG(ON)
    PROMPT(INFOPROVIDERSELECTION,%InforProvide%,%SELECTION%,"Please select the InfoProvider and set selection (InfoProvider list is restricted by both BW and BPC authority)",,)
    PROMPT(KEYDATE,%KEYDATE%,"Key date",0)
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%ACTNO%,%INCREASENO%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,ACT_FILE_NO,%ACTNO%)
    INFO(%MAI_TRFILE%,\ROOT\WEBFOLDERS\MAI_CONSO\Consolidation\DATAMANAGER\TRANSFORMATIONFILES\EXAMPLES\MA\T_FILE_BW_IPROVIDER.XLS)
    TASK(/CPMB/INFOPROVIDER_CONVERT,TRANSFORMATIONFILEPATH,%MAI_TRFILE%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,SUSER,%USER%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,SAPP,%APP%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,FILE,%InforProvide%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,INFOPROV_SELECTION,%SELECTION%)
    TASK(/CPMB/INFOPROVIDER_CONVERT,KEYDATE,%KEYDATE%)
    TASK(/CPMB/LOAD_IP,PREPROCESSMODE,0)
    TASK(/CPMB/LOAD_IP,TARGETMODE,2)
    TASK(/CPMB/LOAD_IP,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/LOAD_IP,ACT_FILE_NO,%ACTNO%)
    TASK(/CPMB/LOAD_IP,RUNLOGIC,1)
    TASK(/CPMB/LOAD_IP,CHECKLCK,1)
    TASK(/CPMB/LOAD_IP,KEYDATE,%KEYDATE%)
    'suppress prompts for work status default script and replace - default as yes and yes replace. Default transformation file path
    Copy opening
    PROMPT(SELECTINPUT,,,,"%ENTITY_DIM%,%CATEGORY_DIM%,%CURRENCY_DIM%,%TIME_DIM%")
    TASK(/CPMB/OPENING_BALANCES_LOGIC,SUSER,%USER%)
    TASK(/CPMB/OPENING_BALANCES_LOGIC,SAPPSET,%APPSET%)
    TASK(/CPMB/OPENING_BALANCES_LOGIC,SAPP,%APP%)
    TASK(/CPMB/OPENING_BALANCES_LOGIC,SELECTION,%SELECTION%)
    TASK(/CPMB/OPENING_BALANCES_LOGIC,LOGICFILENAME,COPY_OPENING.LGF)

    Hi Vadim,
    Thanks, isn't %SELECTION% passed to the process chain?  I see that the process chain RUN LOGIC contains a field SELECTION.  If the value is passed, then I have to change that too?
    2) If we change the value to SELECTION2, then how does this variable capture the parameters for the dimensions? I am commenting out the prompt of the Copy Opening script, keeping the first prompt below (from the Import from BW) in order to capture the parameters for both scripts:
    PROMPT(INFOPROVIDERSELECTION,%InforProvide%,%SELECTION%,"Please select the InfoProvider and set selection (InfoProvider list is restricted by both BW and BPC authority)",,)
    Thanks
    Eyal

  • Ideal import and export settings to make a master (uncompressed file)

    Editing in FCP version 6.0.5 on a MacBook Pro. The footage I'm working with is one 26 GB QuickTime file (16-minute duration), standard definition, 725 x 576, 10-bit uncompressed.
    Basically I need to make this file, which is a short film, 1 minute shorter to be eligible to enter it into the Cannes Film festival next week.
    What would be the ideal import and export settings to make a new master, i.e. to create a new file that stays as close to uncompressed as possible?
    I very much appreciate your help!

    It seems that every time you reply your goal has changed.
    If you're worried about the self-contained movie requiring FCP to open, that's a simple fix. Right-click on the file and choose Get Info. In the Info window, change the 'Opens With' to QuickTime Player. You'll maintain the best quality and others won't need FCP to open it.
    -DH
