Best Practice for Flat File Data Uploaded by Users

Hi,
I have the following scenario:
1.     Users would like to upload data from flat file and subsequently view their reports.
2.     SAP BW support team would not be involved in data upload process.
3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
4.     Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
What are the best practices we should adopt for this scenario?
Thanks!

Hi,
I can share what we do in our project.
We get the files from the web to the Application Server, into a path dedicated to this process. The file placed on the server has a naming convention based on your project; you can name it. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this, the process chain triggers and loads the data from that particular path fixed on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
So this happens everyday.
Rgds
SVU123
Edited by: svu123 on Mar 25, 2011 5:46 AM
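The daily cycle described above (fixed path, fixed file name, load, back up, delete) can be sketched outside BW as plain Python; the paths, file name, and `load_fn` hook are illustrative stand-ins, not part of any SAP API:

```python
import shutil
from pathlib import Path

def process_daily_file(inbox: Path, archive: Path, file_name: str, load_fn) -> bool:
    """Load today's file if it has been delivered, then archive and remove it.

    inbox     -- the fixed application-server path the InfoPackage points to
    archive   -- where the backup copy is kept after a successful load
    file_name -- the fixed name from the project's naming convention
    load_fn   -- stand-in for the process-chain load step
    """
    src = inbox / file_name
    if not src.exists():
        return False                        # nothing delivered yet today
    load_fn(src)                            # the actual load (process chain in BW)
    archive.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, archive / file_name)  # keep a backup copy...
    src.unlink()                            # ...and clear the path for tomorrow's file
    return True
```

Because the source path is cleared after every successful run, tomorrow's file with the same name starts from a clean slate.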

Similar Messages

  • Best Practice for Initial Load Data

    Dear Experts,
    I would like to know the best practices or factors to be considered when performing an initial load.
    For example:
    1) requirements from business stakeholders for data analysis
    2) age of data needed to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    check these links:
    Master Data loading
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • Best practice for putting binary data on the NMR

    Hi,
    We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
    1. setContent()?
    2. addAttachment()?
    3. setProperty()?
    If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
    Thanks,
    Bruce

    setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a Project Level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that project-level custom field data in
    project views, I have made a task-level custom field
    "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows the Supervisor Name in Schedule.aspx.
    ============
    Question: I want that project-level custom field "Supervisor Name" in
    My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx, but the data is not present (blank column).
    How can I get the data into the My Work views?
    Noman Sohail

  • Best Practice for disparately sized data

    2 questions in about 20 minutes!
    We have a cache which holds approx 80K objects, which expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high units set, giving ample space. But the data has a wide size range, from a few bytes to 30 MB and everywhere in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A

    Hi A,
    It depends... if there is a relationship between keys and sizes, e.g. if this or that part of the key means that the size of the value will be big, then you can implement a key partitioning strategy, possibly together with key association on the key, in a way that spreads the large entries evenly across the partitions (and have enough partitions).
    Unfortunately you would likely not get a totally even distribution across nodes, because the number of entries is fairly small compared to the square of the number of nodes (btw, which version of Coherence are you using?)...
    Best regards,
    Robert
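    Robert's key-partitioning idea can be illustrated with a toy model (plain Python, not the Coherence API; the partition count and size threshold here are invented for the sketch): entries known to be large are dealt out round-robin instead of hashed, so no partition collects several 30 MB values by chance.

```python
class SizeAwarePartitioner:
    """Toy model of a size-aware key partitioning strategy (illustrative only)."""

    def __init__(self, partitions: int, large_threshold: int):
        self.partitions = partitions          # e.g. one per storage member
        self.large_threshold = large_threshold
        self._next_large = 0                  # rotating slot for big entries

    def partition_for(self, key: str, size: int) -> int:
        if size >= self.large_threshold:
            # Deal large values out evenly instead of trusting the hash
            p = self._next_large
            self._next_large = (self._next_large + 1) % self.partitions
            return p
        return hash(key) % self.partitions
```

    With 128 large entries over 64 partitions, every partition ends up holding exactly two of them; a plain hash gives no such guarantee.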

  • Best practice for retraction of data from BPC 10.0 to General Ledger

    Hi All,
    I have the requirement to retract data from BPC 10.0 to General Ledger.
    What is the best practice for doing this?
    I have read the "How To... Retract data from BPC 7.5 NetWeaver to SAP ERP Cost Center Accounting, Part I", but this is prepared to transfer data to Cost Center accounting. Will General Ledger be populated automatically?
    What is your opinion on this?
    Best regards,
    JA

    Hi Rich,
    In BPC 10 NW, the data entered in Input Forms has to be retracted to ECC.
    For this retraction we are following the link below:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c029accf-6d95-2d10-79aa-cc291180bf73?QuickLink=index&overridelayout=true&59180354379512
    In this document, I am unable to get the details of the class ZCL_BPC_RTRCT_SUPERCLASS.
    As it is a Z class, I am unable to create it in my system.
    Could you please help me?

  • Best practices for submitting CF data to an AJAX page?

    Hi everyone,
    I've got a project I'm working on for work and have hit a
    little problem.
    I am extracting data from my database and then after each
    piece of data (just numbers, usually 10 chunks of numbers), I tack
    a "|" onto the end of each number. Then, I output the data to the
    page. Back on my AJAX enabled page, I get the "responseText" from
    that page, and then split it up using javascript and the
    pre-inserted "|".
    This seems to work fine, but it is quite messy.
    Also, it would really be nice to be able to do sorting and
    various other operations on the data with JavaScript instead of
    having to rely on CF's icky code logic.
    Can someone please enlighten me as to best practices for this
    type of thing? I get the suspicion that I'll probably be using XML
    somehow, but I'd like your opinion.
    Thanks!

    Check out the Samples and Documentation portions of Adobe's
    Spry website for client side use of JSON with Spry.
    http://labs.adobe.com/technologies/spry/home.html
    Here is link to Adobe's Spry Forums:
    http://www.adobe.com/cfusion/webforums/forum/categories.cfm?forumid=72&catid=602
    If you are using CF8 you can use the SerializeJSON function
    to convert a variable to JSON. You might also be interested in the
    cfsprydataset tag. CF 8 documentation:
    http://livedocs.adobe.com/coldfusion/8/htmldocs/
    If you are using a previous version of CF there is 3rd party
    JSON support. You can find links at
    http://json.org.
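    The difference between the hand-rolled "|" protocol and JSON can be shown in a few lines; Python's `json` module stands in here for what CF8's SerializeJSON produces on the server and what the browser parses on the client:

```python
import json

rows = [{"id": 1, "qty": 10}, {"id": 2, "qty": 25}]

# The approach from the question: values joined with "|" -- this loses
# the field names and types, and breaks if a value ever contains "|"
piped = "|".join(str(r["qty"]) for r in rows)

# JSON keeps the structure and types intact across the wire
payload = json.dumps(rows)
decoded = json.loads(payload)
```

    On the client side the decoded result is real objects, so sorting and filtering in JavaScript becomes straightforward.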

  • Flat file data upload to ODS failed

    Hi guys,
    While loading the flat file data to the ODS, the load fails immediately without loading a single record. The error messages show:
    <b>" Error when opening the data file <file name> (origin A)."
    "Error in the data request"
    "Error occurred in the data selection"</b>
    I am trying to load the data manually through an InfoPackage, and the file is on the Application Server with full access given to it. But the important part is that when we try to load the file again without making any changes, it runs fine and loads the data without any issues. This happens with every load, every day.
    Any solutions?
    Regards
    Sanjiv

    Hi CK,
    Step-by-step Analysis status are as given::
    <b><Red></b> Data request sent off?
    <b><No colour></b> RFC to source system successful?
    <b><No colour></b> Does selectable data exist in the source?
    <b><Red></b> Data selection successfully started?
    <b><Red></b> Data selection successfully finished?
    <b><Red></b> Processing error in source system reported?
    <b><No colour></b> RFC to Warehouse successful?
    <b><Red></b> Processing error in Warehouse reported?
    <b><No colour></b> Processing successfully finished?
    <b><No colour></b> All reported data packets received?
    <b><No colour></b> All data packets complete?
    <b><Green></b> Have all Processing Steps been Carried out?
    <b><Green></b> All Data Packets Updated in all Targets?
    <b><Green></b> Inadmissible Aggregation?
    Regards
    Sanjiv

  • Flat file data upload - happens for only 1 column

    Dear Experts,
    Goal: To upload data from a flat file (with 3 columns: char3, language, text) to a characteristic (texts).
    Problem: I am doing this through a CSV file, but when I click on preview I can see data only in the first column; the other two columns are empty.
    More details:
    I have created this data source in data source tab.
    Adapter - Load Text-Type File from Workstation
    Data format - Separated with Separator (CSV)
    Data Separator - ,
    Escape Sign - "
    File name - C:\Documents and Settings\hans\Desktop\mtype.txt
    Fields:
    /BIC/TSMOVTYPE     CHAR 3
    LANGU              CHAR 2
    TXTSH              CHAR 20
    Can anyone give me an idea?
    regards
    BI Learner

    Hi,
    You can try this:
    1. After Escape Sign, there is an option "Number of rows to be left". Enter 1 there.
    2. In your flat file, leave the first row blank or put anything in it for all three columns, and save as CSV. Note that the file must have the same field sequence as the data source.
    3. Check your transformations. The data source fields should be mapped correctly to the right side.
    Try to load now. Hope it'll help.
    Preet
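    One plausible cause of the single-column preview (beyond the header-row setting above) is a separator mismatch: if the file was actually saved with a different separator (e.g. ";" from some Excel locales) than the "," declared in the data source, the whole row lands in the first field. A quick Python illustration, with a made-up sample row:

```python
import csv
import io

line = "100;EN;Goods receipt"        # file actually saved with ";"

# Parsed with the separator the file really uses: three columns
with_right_sep = next(csv.reader(io.StringIO(line), delimiter=";"))

# Parsed with the "," declared in the data source: everything ends up
# in column one -- the symptom seen in the preview
with_wrong_sep = next(csv.reader(io.StringIO(line), delimiter=","))
```

    So it is worth opening the file in a plain text editor to confirm which separator character is really in it.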

  • Best practice for exposing internal data to external world?

    Currently we have our Internet server sitting in our corporate DMZ taking website and web service requests from the outside world.  Class libraries with compiled connection strings exist on that server.  That server then has a connection through
    the firewall to the database server.  I'm told that this is no longer the secure/recommended best practice.
    I'm told to consider having that Internet server make requests of not the database server, but rather a layer in between (application server, intranet server, whatever) that has those same Web UI methods exposed.. and then THAT server (being inside the firewall)
    connects to the database server.
    Is this the current recommended best practice to have external users interact with internal data?  It seems like lots of hoops -- outside person app queries Web UI methods on Internet server which in-turn queries same method (duplicated) on Intranet
    server which then talks to the database.
    I'm just trying to determine the simplest practice, but also what is appropriately secure for our ASP.NET applications and services.
    Thanks.

    IMO this has little to do with SOA and everything to do with DMZs. What you're trying to prevent is the same comm protocol accessing the database as accessed the web site. As long as you fulfil that, then great. WCF can help here because it helps with configuring the transport of calls. Another mechanism is to use identities, but IMO it's easier to use firewalls and transports.
    http://pauliom.wordpress.com

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI 11.1.1.6 version.
    In my ODI project, we have separate installations for Dev, Test and Prod, i.e. the master repositories are not common between the three. Now my code is ready in Dev. The Test environment has just been installed with ODI, and the Master and Work repositories are created. That's it.
    Now, I need to know the simplest and best way to import the code from Dev and migrate it to the Test environment. Can someone describe this as a step-by-step procedure in 5-6 lines?
    If this is the first time you are moving to QA, better export/import the complete work repositories. If it is not the first time, create scenarios of the specific packages and export/import them to QA. In the case of scenarios you need not bother about models/datastores. Keep in mind that the logical schema names in QA should be the same as those used in your Dev.
    Some questions on the current state:
    1. Do the IDs of the master and work repositories in Dev and Test need to be the same?
    They should be different.
    2. I usually see in an export file a repository ID of 999 and fail to understand what it is exactly. None of my master or work repositories is named with that ID.
    It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical Architecture objects and contexts do not have an export option. What is the suitable alternative for this?
    If you are exporting the topology then you will get the logical connection and context details. If not, you need to manually create the context and the other physical/logical connections.
    Thanks,
    Priya
    Edited by: 948115 on Jul 23, 2012 6:19 AM

  • EP Upgrade - SP14 - Best Practice for Modification File Comparison

    SDN  Experts -
    We are upgrading our EP from SP14 to SP16.  SAP offers a file "diff" tool that is only useful for Java application files, to assist in re-applying our mods on top of the new code stack.
    We are looking for best practices in Portal upgrades to do the following:
    - Identify all files that we have modified on existing SP
    - Diff all source code files (java, XML, GUI, other) between Current SP14 and SP16
    We are also looking for documentation that identifies the local directory structure for NWDS.  This would aid us in creating a batch process to "diff" our source code libraries.
    Any recommendations are appreciated.
    Thanks

    I'm not really getting your question, because you already state what to do:
    We are looking for best practices in Portal upgrades to do the following:
    - Identify all files that we have modified on the existing SP
    - Diff all source code files (Java, XML, GUI, other) between the current SP14 and SP16
    You should know from the documentation what has changed, I guess? Then start diff-ing the code and recompile or repackage. NWDS also has diff functionality.
    Good luck,
    Benjamin
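    Absent an SAP-provided tool, the batch "diff" of two extracted source trees can be sketched with Python's standard `filecmp` module; the directory names here are illustrative:

```python
import filecmp
from pathlib import Path

def changed_files(old_root: str, new_root: str) -> dict:
    """Recursively collect files that differ between two source trees
    (e.g. the SP14 and SP16 exports), plus files unique to either side."""
    result = {"changed": [], "only_old": [], "only_new": []}

    def walk(cmp: filecmp.dircmp, prefix: Path) -> None:
        result["changed"] += [str(prefix / f) for f in cmp.diff_files]
        result["only_old"] += [str(prefix / f) for f in cmp.left_only]
        result["only_new"] += [str(prefix / f) for f in cmp.right_only]
        for name, sub in cmp.subdirs.items():
            walk(sub, prefix / name)          # descend into common subdirectories

    walk(filecmp.dircmp(old_root, new_root), Path(""))
    return result
```

    The "changed" list is then the set of files to inspect for mods that need re-applying.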

  • Flat file data upload error

    Hello,
    I am trying to load a flat file into a DataSource. When I execute the InfoPackage, I am getting a lot of errors for different fields in the DataSource, like this:
    Error 'An exception with the type CX_SY_CONVERSION_NO_NUM' at conversion exit RSDS_CONVERT_NUMBER (field DATE_OF_BIRTH record 1, value 12/)
    What does this mean, and what do I have to do to load the data properly into the PSA?
    Thanks.

    Hi,
    This may be due to the following:
    1. I guess you are using some date data in your file. First check whether you have defined that field in your DataSource with the correct data type (Calday, Calmonth, ...).
    2. If you have declared it correctly, check whether you are giving the date data in your file in the correct format, e.g. 012007, 022008, ...
    3. Check the sequence of the fields in your file and in the DataSource, whether it's correct or not.
    rgrds,
    v.sen
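    The kind of pre-check that catches a truncated value like `12/` before the load can be sketched in a few lines of Python; the column layout and the date format here are assumptions, not the actual DataSource definition:

```python
import csv
import io
from datetime import datetime

def bad_date_rows(csv_text: str, date_col: int, fmt: str = "%Y%m%d"):
    """Return (row_number, value) for each row whose date field would fail
    conversion -- e.g. the truncated '12/' from the error message."""
    bad = []
    for i, row in enumerate(csv.reader(io.StringIO(csv_text)), start=1):
        value = row[date_col] if date_col < len(row) else ""
        try:
            datetime.strptime(value, fmt)
        except ValueError:
            bad.append((i, value))
    return bad
```

    Running something like this over the file first points you at the exact record to fix, instead of deciphering the conversion-exit dump.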

  • Best Practices for Loading Master Data via a Process Chain

    Currently, we load attributes, text, and hierarchies before loading the transactional data.  We have one meta chain.  To load the master data it is taking more than 2 hours.  Most of the master data is full loads.  We've noticed that a lot of the master data, especially text, has not changed or changed very little since we implemented 18 months ago.  Is there a precedence or best practice to follow such as do we remove these processes from the chain?  If so, how often should it be run?  We would really like to reduce the amount of the master data loading time.  Is there any documentation that I can refer to?  What are other organizations doing to reduce the amount of time to load master data?
    Thanks!
    Debby

    Hi Debby,
    I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum?
    Nevertheless, if your data doesn't change that much, maybe you could use a delta mechanism for extraction? This would send only the changed records instead of all the unchanged ones every time. But this depends on your master data and of course on your extractors.
    Cheers
    Michael
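    The delta idea (ship only records changed since the previous run instead of a full load) reduces to a timestamp filter. This Python sketch uses an invented `changed_on` field purely for illustration; in BW the real mechanism lives in the delta-enabled extractor:

```python
from datetime import datetime

def delta_records(records: list, last_run: datetime, ts_field: str = "changed_on") -> list:
    """Keep only records changed after the previous load -- the essence of
    delta extraction, versus reloading everything daily."""
    return [r for r in records if r[ts_field] > last_run]
```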

  • Best Practices for Exporting Files??

    I'm new to Premiere (coming from FCP).  I used Premiere months ago to compress some ProRes files to h.264 files for the web.  I sent the files through Media Encoder and everything seemed fine.  However, I realized after several weeks that the audio in all of the files was a few frames out of sync.  Having not been a Premiere user at the time I did not do much research and decided to just use MPEG Streamclip from then on.
    Now that I'm learning how to use Premiere, I looked up the issue on the forums and found that many people have had similar issues with the audio being out of sync after exporting. However, there are tons of different scenarios in which it seems to be occurring.  The one common variable that I've noticed (among many of the threads, but not all) is that many of the people are exporting to a QuickTime format.
    While I don't remember all the details of my export and sequence settings from my issue months ago (so I don't want to address that specific case), I am curious as to what are some "Best Practices" when exporting from Premiere Pro? Is there any advantage/disadvantage to use AME rather than exporting directly from Premiere Pro? In general, I will just be exporting as H.264 files for the web, MPEG-2 for DVD, and ProRes 422 for After Effects (or sometimes to bring into MPEG Streamclip). 
    I shoot almost entirely in AVCHD, and usually at 1080p 30fps.  I'm running CS5 on a Macbook Pro 15" 2.0 Quad Core i7 8GB RAM.
    While the question may seem broad, my main concern that I want to avoid is having the audio out of sync.  But also I just want to know of any important details to keep in mind to prevent other issues.
    Thanks,
    Mike

    > I'm running CS5...
    What specific version? We're up to 5.0.4 now.
    There have been bug fixes for audio/video sync in the updates. One of the fixes was for a bug in the conforming of audio and indexing of MPEG files, so you need to delete your media cache files and let Premiere Pro create new ones for this fix to take effect.
