File import best practice

I need some outside input on a process. We get a file from a bank and I have to take it and move it along to where it needs to go. Pretty straightforward.
The complexity is the import process from the bank. It's a demand-pull process where an exe needs to be written that pulls the file from the bank and drops it into a folder. My issue is that they want me to kick the exe off from inside SSIS and then use a file watcher to import the file into a database once the download is complete. My opinion is that the SSIS package that imports the file and the exe that gets the file from the bank should be totally divorced from each other.
Does anybody have an opinion on the best practice of how this should be done?

Here it is: http://social.msdn.microsoft.com/Forums/sqlserver/en-US/bd08236e-0714-4b8f-995f-f614cda89834/automatic-project-execution?forum=sqlintegrationservices
Arthur My Blog
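For what it's worth, one way to keep the two pieces completely divorced is to run them as two independent SQL Agent jobs: one job only runs the download exe, the other only runs the import package (kicked off by the file watcher or on its own schedule). A rough sketch only; every job name, path, and package name below is a placeholder:

USE msdb;
GO
-- Job 1: pull the file from the bank (download exe only, no SSIS involved)
EXEC dbo.sp_add_job       @job_name = N'Bank File Download';
EXEC dbo.sp_add_jobstep   @job_name = N'Bank File Download',
                          @step_name = N'Run download exe',
                          @subsystem = N'CmdExec',
                          @command   = N'D:\Jobs\BankFilePull.exe';                    -- placeholder path
EXEC dbo.sp_add_jobserver @job_name = N'Bank File Download';

-- Job 2: import whatever lands in the drop folder (knows nothing about the bank)
EXEC dbo.sp_add_job       @job_name = N'Bank File Import';
EXEC dbo.sp_add_jobstep   @job_name = N'Bank File Import',
                          @step_name = N'Run import package',
                          @subsystem = N'CmdExec',
                          @command   = N'dtexec /F "D:\Packages\BankFileImport.dtsx"'; -- placeholder path
EXEC dbo.sp_add_jobserver @job_name = N'Bank File Import';

Either job can then be changed, rerun, or rescheduled without touching the other, which is the point of keeping them separate.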

Similar Messages

  • FDM file format best practice

    All, we are beginning to implement an Oracle GL and I have been asked to provide input on the file format provided from the ledger to process through FDM (I know, processing directly into HFM is out... at least for now).
    Is there a "Best Practice" for file formats to load through FDM into HFM? I'm really looking for efficiency (fastest to load, easiest to maintain, etc.).
    Yes, we will have to use maps in FDM, so that is part of the consideration.
    Questions: fixed-width or delimited? Concatenate fields or not? Security? Minimize the use of scripts? Is it better to have the GL consolidate? Etc.
    Thoughts appreciated
    Edited by: Wtrdev on Mar 14, 2013 10:02 AM

    If possible a Comma or Semi-Colon Delimited File would be easy to maintain and easy to load.
    The less scripting used on the file, the better the import performance.

  • Flat File load best practice

    Hi,
    I'm looking for a Flat File best practice for data loading.
    The need is to load a flat fle data into BI 7. The flat file structure has been standardized, but contains 4 slightly different flavors of data. Thus, some fields may be empty while others are mandatory. The idea is to have separate cubes at the end of the data flow.
    Onto the loading of said file:
    Is it best to load all data flavors into 1 PSA and then separate into 4 specific DSOs based on data type?
    Or should data be separated into separate file loads as early as the PSA? So, have 4 DSources/PSAs and have separate flows from there on up to the cube?
    I guess the pros/cons may come down to where the maintenance falls: separate files vs. separate PSAs/DSOs?
    Appreciate any suggestions/advice.
    Thanks,
    Gregg

    I'm not sure if there is any best practice for this scenario (or maybe there is one), as this is data related to a specific customer's needs. But if I were you, I would handle one file into the PSA and source the data from there according to its respective ODS. That would give me more flexibility within BI to manipulate the data as needed, without having to involve the business for 4 different files (chances are that they will get them wrong when splitting the files). So in case of any issue, your troubleshooting would start from the PSA rather than going through the file (very painful and frustrating) to see which records in the file screwed up the report. I'm more comfortable handling BI objects rather than data files, because you know exactly where you have to look.

  • DW:101 Question - Site folder and file naming - best practices

    OK - my 1st post! I'm new to DW and fairly new to developing websites (I've done a couple in FrontPage and a couple in SiteGrinder), although I'm not at all new to technical concepts: building PCs, figuring things out, etc.
    For websites, I know I have a lot to learn and I'll do my best to look for answers, RTFM and all that before I post. I even purchased a few months of access to lynda.com for technical reference.
    So no more introduction. I did some research (and I kind of already knew) that for file names and folder names: no spaces, just dashes or underscores, don't start with a number, keep the names short, no special characters.
    I’ve noticed in some of the example sites in the training I’m looking at that some folders start with an underscore and some don’t. And some start with a capital letter and some don’t.
    So the question is: what is the best practice for naming files, and especially folders? And what's the best way to organize the files in the folders? For example, all the .css files in a folder called 'css' or '_css'.
    While I'm asking, are there any other things along the lines of just starting out that I should be looking at? (If this is way too general a question, I understand.)
    Thanks…
    \Dave
    www.beacondigitalvideo.com
    By the way I built this site from a template – (modified quite a bit) in 2004 with FrontPage. I know it needs a re-design but I have to say, we get about 80% of our video conversion business from this site.

    So the question is: what is the best practice for naming files, and especially folders? And what's the best way to organize the files in the folders? For example, all the .css files in a folder called 'css' or '_css'.
    For me, best practice is always the nomenclature and structure that makes most sense to you, your way of thinking and your workflow.
    Logical and hierarchical always helps me.
    Beyond that:
    Some seem to use _css rather than css because (I guess) those file/folder names rise to the top in an alphabetical sort. Or perhaps they're used to that from a programming environment.
    Some use CamelCase, some use all lowercase or special_characters to separate words.
    Some work with CMSes or in team environments which have agreed schemes.

  • Importing best practices baseline package (IT) ECC 6.0

    Hello,
    I hope this is the right forum.
    I have an SAP release ECC 6.00 with ABAP stack 14.
    In this release I have to install the preconfigured Smart Forms that are now called
    Best Practices Baseline Packages. These packages are localized and mine is for Italy:
    SAP Best Practices Baseline Package (IT)
    The installation documents say that the required support package level is stack 10.
    And they say:
    "For cases when the support package levels do not match the Best Practices requirements, especially when HIGHER support package levels are implemented, only LIMITED SUPPORT can be granted"
    (Note 1044256)
    In your experience, is it possible to do this installation at this support package level?
    Thanks
    Regards
    Nicola Blasi

    Hi,
    a company wants to implement the preconfigured Smart Forms in an ECC 6.0 landscape.
    I think these Smart Forms can be implemented using the SAP Best Practices, in particular the Baseline Package (see service.sap.com/bestpractices --> Baseline Package); once it is installed you can configure the scenario you want.
    The package to download differs by localization, for example Italy or another country, but this is not important at the moment.
    The problem is Note 1044256: it says that to implement this, I must have the support package level requested in the note, not lower and above all not higher.
    Before starting with this Baseline Package installation I'd like to know if I can do it, because I have an SP level of 14 for ABA and BASIS, for example, while the note says it wants an SP level of 10 for ABA and BASIS.
    What can I do?
    I hope it is clear now... let me know.
    Thanks
    Nicola

  • Essbase unix file system best practice

    Is there such a thing in Essbase as storing files in different file systems to avoid I/O contention? For example, in Oracle it is best practice to store index files and data files in different locations to avoid I/O contention. If everything on the Essbase server is stored under one file directory structure, as it is now, the Unix team is afraid that we may run into performance issues. Can you please share your thoughts?
    Thanks

    In an environment with many users (200+) or those with planning apps where users can run large long-running rules I would recommend you separate the application on separate volume groups if possible, each volume group having multiple spindles available.
    The alternative to planning for load up front would be to analyze the load during peak times -- although I've had mixed results in getting the server/disk SME's to assist in these kind of efforts.
    A more advanced thing to worry about is journaling filesystems, which share a common cache for all disks within a VG.
    Regards,
    -John

  • File naming best practice

    Hello
    We are building a CMS system that uses BDB XML to store the individual xhtml pages, editorial content, config files etc. A container may contain tens of thousands of relatively small (<20 Kb) files.
    We're trying to weigh up the benefit of meaningful document names such as "about-us.xml" or "my-profile.xml" versus integer/long file names such as 4382 or 5693 (without the .xml suffix) to make filename indexing as efficient as possible.
    In both situations the document names remain unique: appending '_1', '_2', etc. where necessary to the former, and always incrementing the latter by 1.
    There is a 'lookup' document that describes the hierarchy and relationships of these files (rather like a site map), so the name of the required document will be known in advance (as a reference in the lookup doc), and we believe that we shouldn't need to index the file names. XQuery will run several lookups, but only based upon the internal structure/content of the documents, not on the document names themselves.
    So is there any compelling reason not to use meaningful names in the container, even if there are > 50,000 documents?
    Thanks, David

    George,
    I was interested in finding out whether document names made of integers would be much more efficient - albeit less intuitive - in the name index than something like 'project_12345.xml'.
    We may need to return all documents of type 'project' so putting the word in the document name seemed like a good idea, but on reflection perhaps we're better off putting that info in the metadata, indexing that and leave the document name as a simple integer/long such as '12345'.
    If so, is it worth rolling out my own integer-based counter to uniquely name documents, or am I better off just using the built-in method setGenerateName()? Is there likely to be much of a performance difference?
    Regards, David

  • File Creation - Best Practice

    Hi,
    I need to create a file daily based on a single query. There's no logic needed.
    Should I just spool the query in a unix script and create the file like that or should I use a stored procedure with a cursor, utl_file etc.?
    The first is probably more efficient but is the latter a cleaner and more maintainable solution?

    I'd be in favour of keeping code inside the database as far as possible. I'm not dismissing scripts at all - they have their place - I just prefer to have all code in one place.
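    To give a flavour of the in-database route, here is a minimal UTL_FILE sketch. It assumes an Oracle directory object (EXPORT_DIR here, a placeholder) already exists with the relevant grants, and the procedure name, file name, and query are placeholders too:
    CREATE OR REPLACE PROCEDURE export_daily_file IS
       l_file UTL_FILE.FILE_TYPE;
    BEGIN
       -- 'EXPORT_DIR' is an assumed directory object pointing at the target folder
       l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'daily_extract.csv', 'w');
       FOR r IN (SELECT id, name FROM some_table) LOOP   -- placeholder query
          UTL_FILE.PUT_LINE(l_file, r.id || ',' || r.name);
       END LOOP;
       UTL_FILE.FCLOSE(l_file);
    EXCEPTION
       WHEN OTHERS THEN
          IF UTL_FILE.IS_OPEN(l_file) THEN
             UTL_FILE.FCLOSE(l_file);
          END IF;
          RAISE;
    END export_daily_file;
    /
    Keeping it in a procedure also means it can be scheduled from inside the database (e.g. via DBMS_SCHEDULER), which fits the "all code in one place" preference above.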

  • Photo file management - best practice query

    A number of recent posts have discussed file management protocols, but I wondered if some of the experts on this forum would be so kind as to opine on a proposed system set up to better manage my photo files.
    I have an imac, time machine & various external hard drives.
    I run Aperture 3 on my imac (my main computer), with about 15k referenced images. Currently my photo masters are kept on my imac, as is my Aperture library file, and vault. After editing in Aperture, I then export the edited jpegs onto another part of my imac. The imac is backed up to time machine and an off-site drive.
    Following some of the threads, the main message seems to be to take as many of my photo files as possible off my imac and store them on a separate drive. So does the following set up sound better?
    *Aperture run on imac, still using referenced images
    *Master images moved from imac to external drive 1
    *Aperture library file moved from imac to external drive 1
    *Aperture vault moved to external drive 1
    *External drive 1 backed up to external drive 2. Run idefrag on both drives regularly.
    *Edited exports from Aperture kept on imac, which then feed Apple TV, iphone, mobileme etc. Backed up to time machine.
    *If ever I ran Aperture on an additional machine, presumably I could just plug and play / synch with external hard drive 1.
    Is that a "good" set up? Any enhancements? The set up would seem to free up my boot volume, whilst hopefully maintaining the safety/integrity of my files. But I'm happy to be told it is all wrong!!
    Many thanks
    Paul

    Seems to be a good approach. However,
    Depending on how much disk space is on the local drive and the speed of the external, along with the iMac specs... you might keep the library on the iMac instead of the external, but keep the masters on the external. Assuming referenced masters, you could then keep the vault on the external (though I wouldn't put the vault on the same external drive as both the masters and the library).

  • Storing File path - Best Practice

    I have a db that stores the path of images that are loaded into a page.  All the files are stored in the same folder.  My question:  Is it better to store the entire file path in my db or should I just store the file name and make a Constant within the webpage calling the picture?  I would think it's the second option as it is less data in the db but I just wanted to make sure.
    Thanks!

    If the path is always the same, I would store just the filenames. Another option is to create a new field in your table for the path, and assign the current path as the default value. When inserting records, just add the file name; the path field will just take the default value.
    In your SQL:
    SELECT *, CONCAT(path, filename) AS image
    FROM mytable
    If you already have records stored in the table, you can update them very quickly with two SQL queries:
    UPDATE mytable SET path = '/path/to/images/folder/'
    UPDATE mytable SET filename = REPLACE(filename, 'path/to/images/folder/', '')
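    As a purely hypothetical illustration of the default-value idea (the table and column names are just the ones used above, and the ALTER syntax shown is MySQL's):
    ALTER TABLE mytable ALTER COLUMN path SET DEFAULT '/path/to/images/folder/';
    INSERT INTO mytable (filename) VALUES ('photo1.jpg');  -- path column takes the default
    SELECT CONCAT(path, filename) AS image FROM mytable;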

  • BPC 5 - Best practices - Sample data file for Legal Consolidation

    Hi,
    we are following the steps indicated in the SAP BPC Best Practices: http://help.sap.com/bp_bpcv151/html/bpc.htm
    A Legal Consolidation prerequisite is to have the sample data file that we do not have: "Consolidation Finance Data.xls"
    Does anybody have this file or know where to find it?
    Thanks for your time!
    Regards,
    Santiago

    Hi,
    From this address [https://websmp230.sap-ag.de/sap/bc/bsp/spn/download_basket/download.htm?objid=012002523100012218702007E&action=DL_DIRECT] you can obtain a .zip file for the Best Practices, including all scenarios and the CSV files (under the misc directory) used in these scenarios.
    Consolidation Finance Data.txt is in there also.
    Regards,
    ergin ozturk

  • Cp6 - Something broke! A "lifesaver" best practice reminder!

    First, I should point out a very important best practice: save your projects often and save your Captivate project under a new name each time.
    e.g. project_01, project_02, etc. (or some other variation that makes sense to you)
    Case in point…
    Just a while ago, my Captivate project became corrupted for unknown reasons. I last experienced this problem in Cp4 so this is a disappointing reminder that Cp6 may also be vulnerable to file corruption.
    Everything seems fine. All the slides seem perfect but this particular project now fails on my two computers.
    The symptom:
    Now, I can no longer run in preview mode (F4 or otherwise). When I select preview, the Captivate preview window hangs for a very long time (slide 1/37 – just a grey screen). If I attempt to click anywhere within the preview window, I will see “Preview (Not Responding)”
    If I wait about 5 minutes, the preview will actually start, however, it is very unresponsive… so clicking the NEXT button (I don’t use the slidebar ** name?) requires nearly 5 minutes to view the next slide.
    The only way out of this mess is to let Captivate crash.
    This repeats when I restart with the same file. In other words, the project is toast.
    While really, really annoying... all is not lost! Because I change my project name with every save (I anticipated this unfortunately inevitable moment), I am now going to open an older (by half an hour) project and continue.
    - Shawn

    Hello Rod,
    I am unsure what you are disagreeing with? I fully understand the legacy of Captivate as I've been there every step. I am unsure why you brought that up.
    I think you are misunderstanding or I have not explained this issue well enough.
    This course was developed entirely in Cp6. I have not done a significant amount of testing throughout development... basically building new slides each day.
    Anyhow, when I ran a rare test earlier today, I was shocked to notice the course was unresponsive, taking nearly five minutes before the first slide would show. Naturally I thought corruption. But through hiding slides and publishing older saved project revisions, I noticed a disturbing trend in project responsiveness. For instance, clicking the navigation buttons went from instantaneous down to just over five seconds per click as I tested through the published versions.
    Curiously, the following remain unaffected (responsive throughout all revisions):
    - Custom TOC button (a custom/advanced conditional action - if cpCmndTOCVisible = 0 or 1....)
    - Effects
    Everything else has become quite unresponsive (or gradually more unresponsive through the revisions):
    - BACK/NEXT buttons,
    - HELP button (a Jump to Slide x action)
    - Timeline events within a slide
    If this was a problem with preferences, I should not be able to solve the problem by publishing earlier versions or even hiding slides in the most recent revision. Additionally, I should not see this same issue on three different systems.
    But because I am willing to try anything, I cleared out the preferences and restarted Captivate and believe me, I am really disappointed to report that it did not help.
    Please understand, I am not necessarily blaming Captivate... as it could very well be something I have done that has simply made responsiveness worse as I added more slides. Figuring out what that is could be a huge challenge. :-(

  • JSP Best Practices and Oracle Report

    Hello,
    I am writing an application that obtains information from the user through a JSP/HTML form, which is then submitted to a database. The JSP page is set up using JSP best practices, in which the SQL statements, database connectivity information, and most of the Java source code live in a Java bean/Java class. I want to use Oracle Reports to call this bean and generate a JSP page displaying the information the user requested from the database. Would you please offer me guidance for setting this up?
    Thank you,
    Michelle

    JSP Best Practices.
    More JSP Best Practices
    But the most important Best Practice has already been given in this thread: use JSP pages for presentation only.

  • Bad bind variable & best practice for delete

    I am working with three tables and am very new to SQL. I need to create a procedure that will accept an ID and go through two sub-tables and delete child records. Item is the main table. I am passing the ID into the procedure and I want to use it as below. I keep getting a bad bind variable error message. I have verified that the column is set up as a number and my procedure accepts a number. I would also like someone to review this from a best-practice standpoint, as I am new to procedures.
    PROCEDURE DeleteItem (p_ItemID IN NUMBER, p_RowsAffected OUT number)
    IS
    p_RowsAffected NUMBER;
    -- select the itemdetail for the analysis
    CURSOR c_itemdetail
    IS
    SELECT
    itemdetailid
    FROM itemDETAIL
    WHERE itemid = :p_ItemID;
    BEGIN
    -- loop through each itemdetail and delete the itemdetailoutlay
    FOR r_itemdetail IN c_itemdetail
    LOOP
    BEGIN
    DELETE FROM ITEMDETAILOUTLAY
    WHERE itemdetailid = r_itemdetail.itemdetailid;
    COMMIT;
    END;
    END LOOP;
    -- delete the itemdetail
    BEGIN
    DELETE FROM ITEMDETAIL
    WHERE itemid = :p_ItemID;
    COMMIT;
    END;
    -- delete the main item
    BEGIN
    DELETE FROM ITEM
    WHERE itemdid = :p_ItemID;
    COMMIT;
    p_RowsAffected := SQL%ROWCOUNT;
    END;
    END DeleteItem;

    Hi,
    Welcome to the forum!
    As you may notice, this site normally compresses white-space. Whenever you post code, or any formatted text, on this site, type these 6 characters:
    {code}
    (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.
    I don't think you mean to use bind variables anywhere, so don't use colons before any variable names. You were doing this correctly with p_RowsAffected; do the same thing with p_ItemID.
    Try this:
    PROCEDURE DeleteItem (p_ItemID IN NUMBER, p_RowsAffected OUT number)
    IS
    -- p_RowsAffected NUMBER;     -- Don't name local variables the same as arguments
    -- select the itemdetail for the analysis
    CURSOR c_itemdetail
    IS
    SELECT
    itemdetailid
    FROM itemDETAIL
    WHERE itemid = p_ItemID;     -- No : before p_ItemID
    BEGIN
    -- loop through each itemdetail and delete the itemdetailoutlay
    FOR r_itemdetail IN c_itemdetail
    LOOP
    BEGIN
    DELETE FROM ITEMDETAILOUTLAY
    WHERE itemdetailid = r_itemdetail.itemdetailid;
    COMMIT;
    END;
    END LOOP;
    -- delete the itemdetail
    BEGIN
    DELETE FROM ITEMDETAIL
    WHERE itemid = p_ItemID;     -- No : before p_ItemID
    COMMIT;
    END;
    -- delete the main item
    BEGIN
    DELETE FROM ITEM
    WHERE itemdid = p_ItemID;     -- No : before p_ItemID
    COMMIT;
    p_RowsAffected := SQL%ROWCOUNT;
    END;
    END DeleteItem;
    The most important "best practice" with PL/SQL is to avoid doing it whenever possible. 
    If SQL offers a way to do the same thing, it's usually best not to code anything in PL/SQL.
    Have you considered foreign key constraints, with "ON DELETE CASCADE"?  That way, you could simply "DELETE FROM item", and all the dependent rows in the other tables would automatically be deleted.   You wouldn't need to remember to call a procedure like this; in fact, you would have no need for a procedure like this.
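    For illustration only (the constraint names, and the assumption that the parent keys are itemid and itemdetailid, are mine), the cascading constraints could look something like this:
    ALTER TABLE itemdetail
          ADD CONSTRAINT fk_itemdetail_item
          FOREIGN KEY (itemid) REFERENCES item (itemid)
          ON DELETE CASCADE;
    ALTER TABLE itemdetailoutlay
          ADD CONSTRAINT fk_outlay_itemdetail
          FOREIGN KEY (itemdetailid) REFERENCES itemdetail (itemdetailid)
          ON DELETE CASCADE;
    -- then a single statement removes the item and all its dependent rows:
    DELETE FROM item WHERE itemid = p_ItemID;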
    Given that you do have such a procedure:
    You're doing row-by-row processing, which some mad wags like to call "slow-by-slow" processing.
    For example, you're explicitly finding each ItemDetailID separately, and deleting each one separately, like this:
    ... CURSOR c_itemdetail
    IS
    SELECT
    itemdetailid
    FROM itemDETAIL
    WHERE itemid = p_ItemID;
    -- loop through each itemdetail and delete the itemdetailoutlay
    FOR r_itemdetail IN c_itemdetail
    LOOP
    BEGIN
    DELETE FROM ITEMDETAILOUTLAY
    WHERE itemdetailid = r_itemdetail.itemdetailid;
    COMMIT;
    END;
    END LOOP;
    It's more efficient for the system (and less coding for you) if you let SQL handle as much as possible, so do this instead:
    DELETE FROM ItemDetailOutlay
    WHERE  ItemDetailID IN
           ( SELECT itemdetailid
             FROM   itemDETAIL
             WHERE  itemid = p_ItemID
           );
    Do you really want to COMMIT 3 times?  0 or 1 times might be better.
    What happens if there is some kind of error, say, after you've deleted rows from ItemDetailOutlay and ItemDetail, but before you've deleted from Item? Wouldn't you want the entire transaction to fail, and leave all three tables in a consistent state? If so, either have the calling procedure COMMIT, or have a single COMMIT at the end of DeleteItem.
    Edited by: Frank Kulash on May 6, 2010 2:25 PM

  • SAP Best Practice for Water Utilities v 1.600

    Hi All,
    I want to install SAP Best Practice for Water Utilities v 1.600. I have downloaded the package (now available only as Mat. No. 50079055, "Docu: SAP BP Water Utilities-V1.600") from the Marketplace, but there is NO transport file included in it. It only contains documentation. Should I use the transport file from Best Practice for Utilities v 1.500?
    Thank you,
    Vladimir

    Hello!
    The file should contain eCATTs with data according to best practice preconfigured scenarios and transactions to install them.
    You can find some information about the preconfigured scenarios here:
    http://help.sap.com/bp_waterutilv1600/index.htm -> Business Information -> Preconfigured Scenarios
    Under the "Installation" path you could find "Scenario Installation Guide" for Water Utilities.
    I hope this is helpful.
    Vladimir
