FDM file format best practice

All, we are beginning to implement an Oracle GL, and I have been asked to provide input on the file format the ledger should provide for processing through FDM (I know, loading directly into HFM is out, at least for now).
Is there a "best practice" for file formats to load through FDM into HFM? I'm really looking for efficiency (fastest to load, easiest to maintain, etc.).
Yes, we will have to use maps in FDM, so that is part of the consideration.
Questions: fixed-width or delimited? Concatenate fields or not? Security? Minimize the use of scripts? Is it better to have the GL consolidate first, etc.?
Thoughts appreciated.
Edited by: Wtrdev on Mar 14, 2013 10:02 AM

If possible, a comma- or semicolon-delimited file would be easy to maintain and easy to load.
The less scripting applied to the file, the better the import performance.
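For illustration, here is a minimal sketch (in Python, assuming a hypothetical three-column Entity;Account;Amount layout — the actual columns will depend on your HFM dimensions) of pre-validating a semicolon-delimited GL extract before FDM picks it up:

```python
import csv

def validate_gl_extract(path, delimiter=";", expected_cols=3):
    """Check that each row of a delimited GL extract has the expected
    column count and a numeric amount in the last field."""
    errors = []
    with open(path, newline="") as fh:
        for lineno, row in enumerate(csv.reader(fh, delimiter=delimiter), start=1):
            if len(row) != expected_cols:
                errors.append((lineno, "wrong column count"))
                continue
            try:
                float(row[-1])  # the amount field must be numeric
            except ValueError:
                errors.append((lineno, "non-numeric amount"))
    return errors
```

Catching malformed rows up front keeps file-format errors separate from FDM mapping errors, which makes both easier to troubleshoot.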

Similar Messages

  • File import best practice

    I need some outside input on a process. We get a file from a bank, and I have to take it and move it along to where it needs to go. Pretty straightforward.
    The complexity is the import process from the bank. It's a demand-pull process where an exe needs to be written that pulls the file from the bank and drops it into a folder. My issue is that they want me to kick the exe off from inside SSIS and then use a file
    watcher to import the file into a database once the download is complete. My opinion is that the SSIS package that imports the file and the exe that gets the file from the bank should be totally divorced from each other.
    Does anybody have an opinion on the best practice of how this should be done?

    Here it is: http://social.msdn.microsoft.com/Forums/sqlserver/en-US/bd08236e-0714-4b8f-995f-f614cda89834/automatic-project-execution?forum=sqlintegrationservices
    Arthur My Blog

  • Flat File load best practice

    Hi,
    I'm looking for a Flat File best practice for data loading.
    The need is to load flat file data into BI 7. The flat file structure has been standardized, but it contains 4 slightly different flavors of data; some fields may be empty while others are mandatory. The idea is to have separate cubes at the end of the data flow.
    Onto the loading of said file:
    Is it best to load all data flavors into 1 PSA and then separate them into 4 type-specific DSOs?
    Or should the data be separated into separate file loads as early as the PSA? That is, have 4 DataSources/PSAs and separate flows from there on up to the cubes?
    I guess the pros/cons come down to where the maintenance falls: separate files vs. separate PSAs/DSOs?
    Appreciate any suggestions/advice.
    Thanks,
    Gregg

    I'm not sure there is a best practice for this scenario (or maybe there is one), as this is data specific to a customer's needs. But if I were you, I would handle one file into the PSA and route the data to its respective ODS from there. That gives you more flexibility within BI to manipulate the data as needed, without having to involve the business for 4 different files (chances are they will get the splitting wrong). In case of any issue, your troubleshooting then starts from the PSA rather than going through the file (very painful and frustrating) to see which records screwed up the report. I'm more comfortable handling BI objects than data files, because you know exactly where to look.
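The single-file routing recommended above can be sketched outside BI as follows (Python, assuming a hypothetical 'rec_type' column distinguishes the four flavors — purely to illustrate the one-PSA-to-many-targets logic):

```python
import csv
from collections import defaultdict

def split_by_type(path, type_field="rec_type", delimiter=","):
    """Read one standardized flat file and bucket rows by their
    record-type field, mirroring one PSA feeding several DSOs."""
    buckets = defaultdict(list)
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh, delimiter=delimiter):
            buckets[row[type_field]].append(row)
    return buckets
```

The point of keeping one inbound file is that this split lives in one place you control, rather than in four files the business has to prepare correctly.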

  • DW:101 Question - Site folder and file naming - best practices

    OK - my first post! I’m new to DW and fairly new to developing websites (I have done a couple in FrontPage and a couple in SiteGrinder), although not at all new to technical concepts, building PCs, figuring things out, etc.
    For websites, I know I have a lot to learn and I'll do my best to look for answers, RTFM and all that before I post. I even purchased a few months of access to lynda.com for technical reference.
    So, no more introduction. I did some research (and I kind of already knew) that for file names and folder names: no spaces, just dashes or underscores, don’t start with a number, keep the names short, no special characters.
    I’ve noticed in some of the example sites in the training I’m looking at that some folders start with an underscore and some don’t. And some start with a capital letter and some don’t.
    So the question is: what is the best practice for naming files, and especially folders? And what’s the best way to organize the files in the folders? For example, all the .css files in a folder called ‘css’ or ‘_css’.
    While I’m asking, are there any other things along the lines of just starting out I should be looking at? (If this is way too general a question, I understand.)
    Thanks…
    \Dave
    www.beacondigitalvideo.com
    By the way I built this site from a template – (modified quite a bit) in 2004 with FrontPage. I know it needs a re-design but I have to say, we get about 80% of our video conversion business from this site.

    So the question is: what is the best practice for naming files, and especially folders? And what’s the best way to organize the files in the folders? For example, all the .css files in a folder called ‘css’ or ‘_css’.
    For me, best practice is always the nomenclature and structure that makes most sense to you, your way of thinking and your workflow.
    Logical and hierarchical always helps me.
    Beyond that:
    Some seem to use _css rather than css because (I guess) those file/folder names rise to the top in an alphabetical sort. Or perhaps they're used to that from a programming environment.
    Some use CamelCase, some use all lowercase or special_characters to separate words.
    Some work with CMSes or in team environments which have agreed schemes.

  • Essbase unix file system best practice

    Is there such a thing in Essbase as storing files in different file systems to avoid I/O contention? For example, in Oracle it is best practice to store index files and data files in different locations to avoid I/O contention. If everything on the Essbase server is stored under one file directory structure, as it is now, the Unix team is afraid we may run into performance issues. Can you please share your thoughts?
    Thanks

    In an environment with many users (200+), or with Planning apps where users can run large, long-running rules, I would recommend separating the applications onto different volume groups if possible, each volume group having multiple spindles available.
    The alternative to planning for load up front would be to analyze the load during peak times -- although I've had mixed results in getting the server/disk SME's to assist in these kind of efforts.
    A more advanced thing to worry about is journaling filesystems that share a common cache for all disks within a VG.
    Regards,
    -John

  • File naming best practice

    Hello
    We are building a CMS system that uses BDB XML to store the individual XHTML pages, editorial content, config files, etc. A container may contain tens of thousands of relatively small (<20 KB) files.
    We're trying to weigh up the benefit of meaningful document names such as "about-us.xml" or "my-profile.xml" versus integer/long file names such as 4382 or 5693 (without the .xml suffix) to make filename indexing as efficient as possible.
    In both situations the documents remain unique: appending '_1', '_2', etc. where necessary to the former, and always incrementing the latter by 1.
    There is a 'lookup' document that describes the hierarchy and relationships of these files (rather like a site map), so the name of the required document will be known in advance (as a reference in the lookup doc); we therefore believe we shouldn't need to index the file names. XQuery will run several lookups, but only based on the internal structure/content of the documents, not on the document names themselves.
    So is there any compelling reason not to use meaningful names in the container, even if there are > 50,000 documents?
    Thanks, David

    George,
    I was interested in finding out whether document names made of integers would be much more efficient - albeit less intuitive - in the name index than something like 'project_12345.xml'.
    We may need to return all documents of type 'project' so putting the word in the document name seemed like a good idea, but on reflection perhaps we're better off putting that info in the metadata, indexing that and leave the document name as a simple integer/long such as '12345'.
    If so, is it worth rolling my own integer-based counter to uniquely name documents, or am I better off just using the built-in method setGenerateName()? Is there likely to be much of a performance difference?
    Regards, David

  • File Creation - Best Practice

    Hi,
    I need to create a file daily based on a single query. There's no logic needed.
    Should I just spool the query in a Unix script and create the file that way, or should I use a stored procedure with a cursor, UTL_FILE, etc.?
    The first is probably more efficient, but is the latter a cleaner and more maintainable solution?

    I'd be in favour of keeping code inside the database as far as possible. I'm not dismissing scripts at all - they have their place - I just prefer to have all code in one place.
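Either way, the "single query, no logic" daily extract is small. As a minimal sketch of the spool-style approach (Python, with sqlite3 standing in for the Oracle connection — with Oracle you would use the equivalent DB-API driver instead):

```python
import csv
import sqlite3

def export_query_to_csv(conn, sql, out_path):
    """Run a single query and spool its result set to a CSV file,
    header row included (the 'no logic needed' daily extract)."""
    cur = conn.execute(sql)
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow([col[0] for col in cur.description])  # column headers
        writer.writerows(cur)  # one CSV row per result row
```

The maintainability argument for the in-database version is that the query, the formatting, and the scheduling all live where the data does.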

  • Photo file management - best practice query

    A number of recent posts have discussed file management protocols, but I wondered if some of the experts on this forum would be so kind as to opine on a proposed system set up to better manage my photo files.
    I have an imac, time machine & various external hard drives.
    I run Aperture 3 on my iMac (my main computer), with about 15k referenced images. Currently my photo masters are kept on my iMac, as are my Aperture library file and vault. After editing in Aperture, I then export the edited JPEGs onto another part of my iMac. The iMac is backed up to Time Machine and an off-site drive.
    Following some of the threads, the main message seems to be to take as many of my photo files as possible off my iMac and store them on a separate drive. So does the following setup sound better?
    *Aperture run on imac, still using referenced images
    *Master images moved from imac to external drive 1
    *Aperture library file moved from imac to external drive 1
    *Aperture vault moved to external drive 1
    *External drive 1 backed up to external drive 2. Run idefrag on both drives regularly.
    *Edited exports from Aperture kept on imac, which then feed Apple TV, iphone, mobileme etc. Backed up to time machine.
    *If ever I ran Aperture on an additional machine, presumably I could just plug and play / synch with external hard drive 1.
    Is that a "good" setup? Any enhancements? The setup would seem to free up my boot volume while hopefully maintaining the safety/integrity of my files. But happy to be told it is all wrong!
    Many thanks
    Paul

    Seems to be a good approach. However,
    Depending on the disk space on the local drive, the speed of the external drive, and the iMac specs, you might keep the library on the iMac instead of the external, but keep the masters on the external. Assuming referenced masters, you could then keep the vault on the external (as is, I wouldn't put the vault on the same external drive as both the masters and the library).

  • Storing File path - Best Practice

    I have a db that stores the paths of images that are loaded into a page. All the files are stored in the same folder. My question: is it better to store the entire file path in my db, or should I just store the file name and define a constant within the webpage calling the picture? I would think it's the second option, as it means less data in the db, but I just wanted to make sure.
    Thanks!

    If the path is always the same, I would store just the filenames. Another option is to create a new field in your table for the path, and assign the current path as the default value. When inserting records, just add the file name; the path field will just take the default value.
    In your SQL:
    SELECT *, CONCAT(path, filename) AS image
    FROM mytable;
    If you already have records stored in the table, you can update them very quickly with two SQL queries:
    UPDATE mytable SET path = '/path/to/images/folder/';
    UPDATE mytable SET filename = REPLACE(filename, '/path/to/images/folder/', '');
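The same join can also live in application code instead of the query; a minimal sketch (the base path and function name here are illustrative, not from the original post):

```python
# The one constant path, kept out of the db; only filenames are stored.
IMAGE_BASE = "/path/to/images/folder/"

def image_path(filename):
    """Build the full image path from the filename stored in the db."""
    return IMAGE_BASE + filename
```

Either way, the path lives in exactly one place, so moving the images folder later means one change, not one update per row.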

  • BPC 5 - Best practices - Sample data file for Legal Consolidation

    Hi,
    we are following the steps indicated in the SAP BPC Business Practice: http://help.sap.com/bp_bpcv151/html/bpc.htm
    A Legal Consolidation prerequisite is a sample data file that we do not have: "Consolidation Finance Data.xls"
    Does anybody have this file or know where to find it?
    Thanks for your time!
    Regards,
    Santiago

    Hi,
    From [https://websmp230.sap-ag.de/sap/bc/bsp/spn/download_basket/download.htm?objid=012002523100012218702007E&action=DL_DIRECT] you can obtain a .zip file for the Best Practice, including all scenarios and the CSV files (under the misc directory) used in those scenarios.
    Consolidation Finance Data.txt is in there as well.
    Regards,
    ergin ozturk

  • Inserting videos in Adobe Captivate 8: best practices?

    I am looking for some information about the best format/best practices for including MP4 videos in my responsive design courses. My company uses a lot of courses that may be just a 2-3 minute video accompanied by a quiz. I don't know much about video formats, so I'm looking for some knowledge on which format will work best in our LMS while maintaining crisp quality with reduced load times. If anyone has anything they could share, or links to information, I would appreciate it.
    Thanks

    Yeah, wasn't really expecting the file extension change to work - but it has worked for me in the past with other types of software, so I reckoned it was worth a shot. Desperate times call for desperate measures...
    Anyway, I know that this was a long time ago. In fact, I was triggered by your question whether I was absolutely sure the files were made in Cp3. So I investigated further and discovered that some of the files are actually Captivate 2... We're talking 2007 here. Luckily, I was able to open the rest of the files that have been created in 2008 through Cp5. As for the one course created in Cp2, I'll probably have to extract the images from the swf and build them anew...
    (The reason we haven't updated the courses until now is that our client has never requested this. In fact, we were under the impression the courses were no longer running, but now appear to be still in use.)

  • SAP Best Practice for Water Utilities v 1.600

    Hi All,
    I want to install SAP Best Practice for Water Utilities v1.600. I have downloaded the package (currently only Mat. No. 50079055, "Docu: SAP BP Water Utilities-V1.600", is available) from the Marketplace, but there is NO transport file included in it; it only contains documentation. Should I use the transport file from Best Practice for Utilities v1.500?
    Thank you,
    Vladimir

    Hello!
    The file should contain eCATTs with data for the Best Practice preconfigured scenarios, and transactions to install them.
    Some information about the preconfigured scenarios can be found here:
    http://help.sap.com/bp_waterutilv1600/index.htm -> Business Information -> Preconfigured Scenarios
    Under the "Installation" path you can find the "Scenario Installation Guide" for Water Utilities.
    I hope this is helpful.
    Vladimir

  • Best practice for photo format: RAW+PSD+JPEG?

    What is the best practice in maintaining format of files while editing?
    I shoot in RAW and import into PS CS5. After editing, it allows me to save in various formats, including PSD and JPEG. PS says that if you want to re-edit the file, you should save as PSD, as all the layers are maintained as-is. Hence I'd prefer to save as PSD. However, in most cases the end objective is to share the image with others, and JPEG is the most suitable format. Does this mean that for each image it's important to save it in 3 formats, viz. RAW, PSD, and JPEG? Won't this increase the total space occupied tremendously? Is this how most professionals do it? Please advise.

    Thanks, everyone, for this continued discussion in my absence over two weeks. Going through it, I realize it's helpful stuff. During this period, I downloaded the Aperture trial and have learnt it (there's actually not much learning; it's incredibly intuitive and simple, yet incredibly powerful). Since I used iPhoto in the past, it just makes it easier.
    I have also started editing my pics to put them up on my photo site. And over past 10 days, here is the workflow I have developed.
    -Download RAW files onto my laptop using Canon s/w into a folder where I categorize and maintain all my images
    -Import them into Aperture, but let the photos reside in the folder structure I defined (rather than have Aperture use its own structure)
    -Complete editing of all required images in Aperture (and this takes care of 80-90% of my pics)
         -From within Aperture, open in PS CS5 those images that require editing that cannot be done in Aperture
         -Edit in CS5 and do 'Save'; this brings them back to Aperture
         -Now I have two versions of these images in Aperture - the original RAW and the new .PSD
    -Select the images that I need to put up on my site and export them to a new folder, from where I upload them
    I would be keen to know if someone else follows a more efficient or robust workflow than this; I'd be happy to incorporate it.
    There are still a couple questions I have:
    1 - Related to PS CS5: why do files opened in CS5 jump up in file size? Any RAW or JPEG file originally between 2-10 MB shows up as a minimum of 27 MB in CS. The moment you do some edits and/or add layers, it reaches 50-150 MB. This is ridiculous. I am sure I am doing something wrong. Or is this how CS5 works for everyone?
    2 - After editing a file in CS by launching it from Aperture, I now end up with two versions in Aperture: the original file and the new .PSD file (which is usually 100 MB+). I tried exporting the .PSD file to a folder to upload it to my site, and wasn't sure what format and size it would end up with. I got it as a JPEG file within reasonable file-size limits. Is this how Aperture works? Does Aperture let you choose which format to export the file in?

  • Best practice - updating figure numbers in a file, possibly to sub-sub-chapters

    Hi,
    I'm a newbie trying to unlearn my InDesign mindset to work in FrameMaker. What is best practice for producing figure numbers to accompany diagrams throughout a document? A quick Ctrl+F in the FrameMaker 12 Help book doesn't seem to point me in a particular direction. Do diagrams need to be inserted into a table, with one cell for the image and another cell for the figure details? I've read that I should use a letter and colon in the tag to keep it separate from other things that update, e.g. F: (then the figure number descriptor). Is there anything else to be aware of, such as resetting counts for chapters, etc.?
    Some details:
    Framemaker12.
    There are currently 116 chapters (aviation subjects) to make.
    Each of these chapters will be its own book in pdf form, some of these chapters run to over 1000 pages.
    Figure numbers ideally take the form "Figure (a number from one of the 1-116 chapters used) - figure number", e.g. "Figure 34 - 6" would be the 6th image in the book 'chapter 34'.
    The figure number has to cross reference to explaining text, possibly a few pages away.
    These figures are required to update as content is added or removed.
    The (aviation) chapter is an individual book.
    H1 is the equivalent of the sub-chapter.
    H2 is the equivalent of the sub-sub-chapter.
    H3 is used in the body copy styling, but is not a required detail of the figure number.
    I'm thinking of making sub-chapters in to individual files. These will be more manageable on their own. They will then be combined in the correct order to form the book for one of these (1 of 116) subject chapters.
    Am I on the right track?
    Many thanks.
    Gary

    Hi,
    Many thanks for the link you provided. I have implemented your recommendation into my file. I have also read somewhere about sizing anchored frames to an imported graphic using 'esc' + 'm' + 'p'.
    What confuses me, coming from InDesign, is being able to import these graphics at the size they were made (WxH in mm at 300 ppi) and keep them anchored to a point in the text flow.
    I currently have 1 and 2 column master pages built. When I bring in a graphic my process is:
    insert a single-cell table in the next space after the current text > drop the title below the cell > give the title a 'figure' format. When I import a graphic, it tries to fit it into the current 2-column layout, with only part of it showing in a box that is half the width of a single column!
    A current example: on page 1 (a 2-column page) the text flows for 1.5 columns. At the end of the text I inserted a single-cell table, then imported an image into the cell.
    Page 2 (2 column page) has the last line of page 1's text in the top left column.
    Page 3 (2-column page) has the last 3 words of page 1 in its top left column. The right column has the table in it, with part of the image showing. The image has also been distorted, like it's trying to fit. These columns are 14 cm wide; the cell is 2 cm wide at this point. I have tried to give cells for images 'wider' attributes using the Object Style Designer, but with no luck.
    Ideally I'm trying to make 2 versions: 1) an anchored frame that fits in a 1-column width on a 2-column page; 2) an anchored frame that fits the full width of my landscape pages (minus some border dimension); this full-width frame should be created on a new following page. I'd like to be able to drop in images to suit these different frames with as much automation as possible.
    I notice many tutorials tell you how to do a given area of the program, but I haven't been able to find one that discusses workflow order. Do you import all the text first, then add empty graphic boxes and/or tables throughout, and then import images? I'm importing text from Word, but the images are separate, having been vectored or cleaned up in Photoshop - they won't be imported from the same Word file.
    many thanks

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (it could be almost ANYTHING: WMV, MP4, FLV). I of course ask for AVIs when possible, but unfortunately I have no control over the type of file I'm given.
    And, naturally, Premiere (I just upgraded to CS5) has a hard time dealing with these files. It's unpredictable, ranging from working fine to not working at all, and everything in between. As you can imagine, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods. Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio. It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware, and has been replaced by Deskshare with newer versions, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODEC's, and does not install its own, like a few others. This DOES mean that if I get footage with an oddball CODEC, I need to go get it, and install it on the System.
    I can batch process AV files of most types/CODEC's, and convert to DV-AVI Type II w/ 48KHz 16-bit PCM/WAV Audio and at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS, my experience has been that it will be static, so I just have to adjust the sync offset by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    Only oddity that I have observed (mostly with DivX, or WMV's) is that occasionally, PrPro cannot get the file's Duration correct. I found that if I Import those problem files into PrElements, and then just do an Export, to the same exact specs., that resulting file (seems to be 100% identical, but something has to be different - maybe in the header info?) Imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different, after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most, as a converter, though there are just CODEC's, that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt
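As an alternative to the GUI converters discussed above (an assumption on my part — none of the posts mention it), the same DV-AVI target can be scripted with ffmpeg. This sketch only builds the command line; you would execute it with subprocess.run:

```python
def dv_avi_command(src, dst):
    """ffmpeg arguments for NTSC DV-AVI with 48 kHz 16-bit PCM audio
    at 29.97 fps, matching the conversion target described above."""
    return [
        "ffmpeg", "-i", src,
        "-target", "ntsc-dv",    # DV video preset: 720x480, 29.97 fps
        "-acodec", "pcm_s16le",  # 16-bit PCM audio
        "-ar", "48000",          # 48 kHz sample rate
        dst,
    ]
```

For example, `subprocess.run(dv_avi_command("clip.wmv", "clip.avi"), check=True)` would batch easily over a folder of mixed-format sources, provided ffmpeg has the needed decoders installed (the same caveat as the System-CODEC approach above).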
