Data upload without XI/PI

Dear Experts,
I am stuck on a problem and I hope you will be able to help me.
Scenario:
The client does not have XI/PI (i.e. no middleware web services) and wants to sync data from another application into SAP. The application extracts an XML file to a file server, which I then need to pick up and upload into the SAP system.
Problems:
Is it possible to pick up the file from the file server without XI/PI? (If yes, then how?)
How do I convert the file from XML to a flat format once it is picked up?
On success or failure of the data upload, there should be a log file for confirmation.
This is the overview of the problem; once I get past this, I will probably have some further questions to ask.
Any light on this is highly appreciated, and thanks in advance.
Waiting for your replies.
Cheers!
Moderator message: duplicate post locked.
Edited by: Thomas Zloch on Nov 6, 2011 3:55 PM

Hi Neeru,
Is it possible to pick up the file from the file server without XI/PI? (If yes, then how?)
Yes, it is possible: an ABAP program can read files from the application server directly with OPEN DATASET / READ DATASET.
How do I convert the file from XML to a flat format once it is picked up?
There are standard options for this, e.g. CALL TRANSFORMATION or the standard XML parsing function modules.
On success or failure of the data upload, there should be a log file for confirmation.
Schedule the program as a background job and have it collect the log (in the job log or in a log file it writes itself).
Regards,
Madhu.
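A minimal ABAP sketch of the three steps, picking the file up from the application server, flattening the XML, and writing a confirmation log. The file paths, the Simple Transformation Z_ORDERS and the structure TY_ORDER are hypothetical placeholders, not standard objects; replace them with your own:

```
REPORT zxml_file_pickup.

* Sketch only: the paths, the transformation Z_ORDERS (create it in
* transaction STRANS) and the structure below are hypothetical.
CONSTANTS: gc_in_path  TYPE string VALUE '/interface/in/orders.xml',
           gc_log_path TYPE string VALUE '/interface/log/orders.log'.

TYPES: BEGIN OF ty_order,
         docnum TYPE c LENGTH 10,
         matnr  TYPE c LENGTH 18,
         menge  TYPE c LENGTH 13,
       END OF ty_order.

DATA: gv_xml     TYPE xstring,
      gt_orders  TYPE STANDARD TABLE OF ty_order,
      gv_cnt     TYPE i,
      gv_cnt_txt TYPE c LENGTH 10,
      gv_msg     TYPE string.

START-OF-SELECTION.
* 1) Pick the file up from the file server - no XI/PI needed.
  OPEN DATASET gc_in_path FOR INPUT IN BINARY MODE.
  IF sy-subrc <> 0.
    MESSAGE 'Could not open input file' TYPE 'E'.
  ENDIF.
  READ DATASET gc_in_path INTO gv_xml.  " reads the whole file
  CLOSE DATASET gc_in_path.

* 2) Convert the XML into a flat internal table via a Simple
*    Transformation.
  CALL TRANSFORMATION z_orders
    SOURCE XML gv_xml
    RESULT orders = gt_orders.

* 3) Write a confirmation log for success/failure checking.
  DESCRIBE TABLE gt_orders LINES gv_cnt.
  gv_cnt_txt = gv_cnt.
  CONDENSE gv_cnt_txt.
  CONCATENATE 'Records read:' gv_cnt_txt 'on' sy-datum
    INTO gv_msg SEPARATED BY space.
  OPEN DATASET gc_log_path FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
  TRANSFER gv_msg TO gc_log_path.
  CLOSE DATASET gc_log_path.
```

Scheduled as a periodic background job (SM36), this gives you the pickup, conversion and log in one program.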

Similar Messages

  • Data upload from R3 to BPC using filters

We are facing a performance issue in the data upload flow from R3 to BPC. Two steps have been set up.
    Step1: R/3 --> ODS --> BW CUBE (No filters and full mode)
    Step2: BW  CUBE --> BPC CUBE (Standard BW upload package)
    The first one (from R3 to a BW Cube) has no filters, so it loads every data and it takes too much time. However, the second one (from the Cube to BPC)  has filtering options, so we can only load the data we need to (we usually use entity and time filters).
    Both are executed from the BPC DataManager.
    Any advice to improve the performance of the first step? Could we join the step 1 and step 2 and get the filters from the standard BW upload process using a custom process chain?
    Does anybody have any experience using filters to reduce the amount of records on the data uploading from R/3 to BPC?
    BPC Version 7.50.15

    Hi,
For example, in BI the data is stored as below:
    GroupAcc - Date - Balance
    100000 - 12.12.10 - 400
I'm not quite sure how you are getting the above data from the InfoCube. Are you loading data directly from PSA into the InfoCube without loading it into DSO 0FIGL_O10 (General Ledger (New): Transaction Figures)?
To get the required format of data below, nothing needs to be done while loading the data into BPC; your BI InfoCube should already maintain the data in the following format:
    GroupAcc - Date - Balance
    100000 - 12.12.10 - 100 - (LC)
    100000 - 12.12.10 - 100 - (TC)
    100000 - 12.12.10 - 200 - (GC)
To achieve the above format of data in the InfoCube, please follow the standard data flow given by SAP in BI Content (http://help.sap.com/saphelp_nw70ehp1/helpdata/en/e6/f16940c3c7bf49e10000000a1550b0/frameset.htm).
    Flow should be
    R/3 --> PSA --> DSO -->Infocube.
In this flow it is important to remember the InfoObject 0CURKEY_TC in the DSO.
    This is always the currency key of the transaction currency; it is also filled for records with a different currency type. Without this key field, postings with different transaction currencies would be overwritten after summarization and thereby be lost.
    (http://help.sap.com/saphelp_nw70ehp1/helpdata/en/a8/e26840b151181ce10000000a1550b0/content.htm)
Once you follow the standard flow, your InfoCube will contain the required format of data (LC, TC and GC) to load into the BPC application.
    hope it helps...
    regards,
    Raju

  • BW Archiving run Locks data upload

    Hello BW Experts.
We are working with archiving, but we have one problem: when we load data into an InfoProvider, the data load fails due to archiving run locks.
If we run the Delete archiving step, then we can upload data to the InfoProvider without any issues.
But we do not want to perform the Delete step immediately; we want to do it after a few days. This causes a big issue for data loading in the production system.
    Is there any program to unlock Archiving session?
    Is there any other process to unlock the tables?
I know that we can unlock an archiving session through table RSARCHREQ, but I feel this is not the right way to do it.
    Any help regarding this is appreciated.
    Thanks and Regards
    Venkat

    Hi Harsh,
    The Archiving process has two steps, the archiving step that writes the archive file and the second step that deletes the data. The Data Target is locked until the second step is completed.
If you do not wish to delete the data yet but need to release the locks, invalidate the archiving run by changing its status (transaction SARA).
    I hope this helps,
    Mike.

  • Issues with 4.1 Data Upload

I've got some issues with the new 'data upload' feature in 4.1. I had already built the pages and the data loading, and it worked perfectly. Now a new column was added to my data loading table and I have to add a new table lookup for this column. Everything is set up as it has to be; I've got 4 table lookups. When I try to upload some data and map the table columns to the data, I always get the error: Failed to retrieve the lookup value. There is no problem when I do a data load where it only has to retrieve one column from the lookup table; when it has to retrieve data from more tables for more columns, I always get the FAILED message. Does anyone know the cause of this situation? I already tried to create a totally new data load, but this also failed to do the job.

    Hi Ellenvanhees,
I don't think the number of lookups you defined is an issue here. If possible, share more details about your data and tables; the most likely culprit that comes to mind is the data itself.
If you are able to do the lookups one by one without a problem, then I think your upload is failing due to null values. The current data upload feature returns a failed lookup even when a null value is to be uploaded. This is bug #13582661 and has been fixed for a future release.
    Patrick

  • Uploading without  recording

Hi all,
I have a flat file which has to be uploaded into a Z table. I have to upload it without any recording, as there is no transaction for it. Please let me know the steps.
    regards,
    karthik

To update a Z table, you need to do four things:
1. Define an internal table typed on your Z table:
DATA itab TYPE STANDARD TABLE OF ztable.
2. Upload your text file into the internal table using FM GUI_UPLOAD; the filetype is 'ASC' as it is a text file.
3. Validate the data in the internal table against the rules of the Z table (primary key checks, check tables attached to some fields, date validations), otherwise invalid data goes into your table:
LOOP AT itab INTO wa.
* validations for all the fields
ENDLOOP.
4. After the validations, lock your table and then modify it:
CALL FUNCTION 'ENQUEUE_E_TABLE' (passing your table name)
MODIFY ztable FROM TABLE itab.
CALL FUNCTION 'DEQUEUE_E_TABLE'
Hope this helps.
Award points for useful answers.
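Putting the four steps together, a sketch of the whole program. The table name ZMYTAB and the file path are hypothetical placeholders; the lock is taken with the generic table lock object E_TABLE:

```
REPORT zupload_ztable.

* ZMYTAB and the file path are hypothetical - use your own Z table.
DATA: gt_data TYPE STANDARD TABLE OF zmytab,
      gs_data TYPE zmytab,
      gv_file TYPE string VALUE 'C:\upload\ztable.txt'.

START-OF-SELECTION.
* Step 2: upload the tab-separated text file.
  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename            = gv_file
      filetype            = 'ASC'
      has_field_separator = 'X'
    TABLES
      data_tab            = gt_data
    EXCEPTIONS
      OTHERS              = 1.
  IF sy-subrc <> 0.
    MESSAGE 'Upload failed' TYPE 'E'.
  ENDIF.

* Step 3: validate each row before it reaches the database.
  LOOP AT gt_data INTO gs_data.
*   e.g. key fields not initial, check-table lookups, date checks
  ENDLOOP.

* Step 4: lock the table, modify it, release the lock.
  CALL FUNCTION 'ENQUEUE_E_TABLE'
    EXPORTING
      tabname        = 'ZMYTAB'
    EXCEPTIONS
      foreign_lock   = 1
      system_failure = 2
      OTHERS         = 3.
  IF sy-subrc = 0.
    MODIFY zmytab FROM TABLE gt_data.
    CALL FUNCTION 'DEQUEUE_E_TABLE'
      EXPORTING
        tabname = 'ZMYTAB'.
  ENDIF.
```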

  • Fixed Assets - How to modify Accu.Dep. Amt. in G/L A/c (Data Uploaded Amt)

    Hi Experts,
In my organization, in the last financial year (2006-2007), we loaded fixed assets. Our production client has been active since last financial year; our SAP system also went live for production only last year.
So this is a case of modifying the data-uploaded amounts for the fixed assets Accumulated Depreciation G/L accounts in the last financial year (without a depreciation run).
Now we have found that we have to rectify (modify/correct) the amounts in the G/L account Accumulated Depreciation for some assets / asset classes, like Accu. Dep. Computers, Accu. Dep. Furniture & Fixtures etc., because some discrepancies have been found.
How can we modify the amounts in these respective Accumulated Depreciation G/L accounts? They are reconciliation accounts (assets) and so cannot be posted to directly.
How can we post the rectified/modified amounts, and that too in the last financial year, to these Accumulated Depreciation accounts?
Please note that we cannot do a depreciation run, because it was only last year that we uploaded the legacy data into SAP.
In summary, I want to modify/rectify the data-uploaded amounts of the Accumulated Depreciation G/L accounts in the last financial year, when our SAP production client went live.
I will definitely award points with an open heart.
It is now a production client.

    Hi
Are Acc Depn as per asset accounting and Acc Depn in the GL balances matching right now? In that case, how are you planning to make changes in AA if you are correcting in the GL?
Check the status of the company code for asset transfer: is it in status 1 (asset transfer not completed)?
Besides, how are you going to take the impact of the changes to Acc Depn for the past year? Are you planning to open periods of last year?
    I assume that you loaded the asset values last year (may be 12/31/2006) and posted GL balances as of 12/31/2006 and carried forward of GL balances to 2007.
    Please provide some details so that we can take it from there.
    Thanks
    Satya

  • Data upload for vendor balances using BDC

    hi abap experts,
    I have a requirement on data uploading using BDC.
For the vendor balances, i.e. transaction FBL1N (I was given a template for vendor balance upload and need to write a BDC program for it), I need to upload the existing transaction data to the system. Is recording necessary for this?
Can you please help me with a step-by-step process for vendor balance uploading.
    Thanks,
    Hema.

    Hi
Please follow these steps for recording:
Step 1: Go to transaction SHDB.
Step 2: Click on New Recording.
Step 3: Give the necessary details such as TCODE, description, etc.
Step 4: Do the screen-by-screen recording (avoid letting extra screens appear).
Step 5: Save the recording.
Step 6: Select the recording and click the Program button on the toolbar.
Step 7: Give the program name and select the radio button "Transfer from recording".
Step 8: It will open a new session with SE38 and a program generated from the recording.
Step 9: Then just add the basic code for the BDC.
    Regards,
    Lokesh
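Step 9 ("add the basic code for the BDC") usually means filling a BDCDATA table and calling the transaction. A sketch with hypothetical program/screen/field values; take the real ones from your SHDB recording:

```
REPORT zbdc_vendor_upload.

DATA: gt_bdc  TYPE STANDARD TABLE OF bdcdata,
      gs_bdc  TYPE bdcdata,
      gt_msgs TYPE STANDARD TABLE OF bdcmsgcoll.

* Mark the start of a new screen in the BDC table.
FORM bdc_dynpro USING iv_program iv_dynpro.
  CLEAR gs_bdc.
  gs_bdc-program  = iv_program.
  gs_bdc-dynpro   = iv_dynpro.
  gs_bdc-dynbegin = 'X'.
  APPEND gs_bdc TO gt_bdc.
ENDFORM.

* Fill one field on the current screen.
FORM bdc_field USING iv_fnam iv_fval.
  CLEAR gs_bdc.
  gs_bdc-fnam = iv_fnam.
  gs_bdc-fval = iv_fval.
  APPEND gs_bdc TO gt_bdc.
ENDFORM.

START-OF-SELECTION.
* Placeholder values - copy the real ones from the recording.
  PERFORM bdc_dynpro USING 'SAPMF05A' '0100'.
  PERFORM bdc_field  USING 'BKPF-BUKRS' '1000'.
* ... one PERFORM per recorded field ...
  CALL TRANSACTION 'F-43' USING gt_bdc
       MODE 'N'        " background processing
       UPDATE 'S'      " synchronous update
       MESSAGES INTO gt_msgs.
```

The messages collected in gt_msgs give you the success/failure record for each posted document.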

  • Basic Data Upload to MATERIAL  Using LSMW is not working

Hi All,
We are using the LSMW program /SAPDMC/SAP_LSMW_IMPORT_TEXTS to upload the basic data text of the material. All steps are executed correctly and show that the records are transferred correctly, but in MM03 the text does not appear.
    EPROC_PILOT - MASTER - TEXT_UPLOAD Basic long text 1line
    Field Mapping and Rule
            /SAPDMC/LTXTH                  Long Texts: Header
                Fields
                    OBJECT                       Texts: Application Object
                                        Rule :   Constant
                                        Code:    /SAPDMC/LTXTH-OBJECT = 'MATERIAL'.
                    NAME                         Name
                                        Source:  LONGTEXT-NAME (Name)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-NAME = LONGTEXT-NAME.
                    ID                           Text ID
                                        Source:  LONGTEXT-ID (Text ID)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-ID = LONGTEXT-ID.
                    SPRAS                        Language Key
                                        Source:  LONGTEXT-SPRAS (Language Key)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-SPRAS = LONGTEXT-SPRAS.
                                                 * Caution: Source field is longer than target field
                /SAPDMC/LTXTL                  Long Texts: Row
                    Fields
                        TEXTFORMAT                   Tag column
                                            Rule :   Constant
                                            Code:    /SAPDMC/LTXTL-TEXTFORMAT = 'L'.
                        TEXTLINE                     Text Line
                                            Source:  LONGTEXT-TEXTLINE (Text Line)
                                            Rule :   Transfer (MOVE)
                                            Code:    /SAPDMC/LTXTL-TEXTLINE = LONGTEXT-TEXTLINE.
At the end it displays the following:
    LSM Workbench: Convert Data For EPROC_PILOT, MASTER, TEXT_UPLOAD
    2010/02/01 - 10:14:25
    File Read:          EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.read
    File Written:       EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
    Transactions Read:                    1
    Records Read:                         1
    Transactions Written:                 1
    Records Written:                      2
Can anyone tell us what the problem could be?
    Regards
    Channappa Sajjanar

Hi, thanks for your reply.
I ran all the steps.
When I run the program it gives the following message:
    Legacy System Migration Workbench
    Project:                              EPROC_PILOT     eProcurement Pilot
    Subproject:                           MASTER          Master data Upload / Change
    Object:                               TEXT_UPLOAD     Basic long text 1line
    File :                                EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
    Long Texts in Total:                  1
    Successfully Transferred Long Texts:  1
    Non-Transferred Long Texts:           0
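A frequent cause of "transferred successfully, but nothing in MM03" is a wrong OBJECT/ID/NAME combination in the mapping; for material basic data text, NAME must be the material number including leading zeros, and GRUN is the usual basic data text ID (verify in SE75 for your system). A quick way to check whether the text actually reached the database is to read it back with READ_TEXT; the material number below is a placeholder:

```
* Read the uploaded long text back to verify object/ID/name.
DATA: gt_lines TYPE STANDARD TABLE OF tline,
      gv_name  TYPE thead-tdname VALUE '000000000000000123'. " placeholder

CALL FUNCTION 'READ_TEXT'
  EXPORTING
    id        = 'GRUN'       " basic data text ID (check SE75)
    language  = sy-langu
    name      = gv_name      " material number with leading zeros
    object    = 'MATERIAL'
  TABLES
    lines     = gt_lines
  EXCEPTIONS
    not_found = 1
    OTHERS    = 2.
IF sy-subrc <> 0.
  WRITE: / 'Text not found: check OBJECT, ID and NAME in the mapping.'.
ENDIF.
```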

  • How to schedule Job for data uploading from source to BI

    Hi to all,
How do I schedule a job for data uploading from the source system to BI?
Why is it required, and how do we do it?
As I am a fresher in BI, I need to understand it from the ground up.
    Regards
    Pavneet Rana

Hi.
You can create a [process chain |http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/502b2998-1017-2d10-1c8a-a57a35d52bc8?quicklink=index&overridelayout=true] for the data loading process and schedule its start process to any time/date etc.
Regards.

  • For iTunes Match, how come some songs "match" on an album and other songs on the same album "upload" without matching?

    For iTunes Match, how come some songs "match" on an album and other songs on the same album "upload" without matching?  All songs are available on iTunes.  I don't get it!  Any way to manually match songs that iTunes apparently can't match?  I have hundreds in this unmatched state.

    gsl wrote:
    Shazam has NO problem determining what these songs are. Maybe Apple needs to consider licensing their technology as it seems MUCH more accurate than what they are doing.
    You aren't comparing like with like.
The main task that Shazam has is to identify the name of a track and the artist. I believe that it suggests the album as well, but if it gets that wrong then it doesn't really matter too much.
    With iTunes Match, getting the album (i.e. identifying which version it is) is much more important. If it gets that wrong then it breaks up the flow of an album. This makes the task that match has very much harder.
    In fact, there is a strong argument to say that currently the problem iTunes has is that it is matching too many songs, resulting in matches from different albums.
    I'm sure that match is not struggling to identify whether it has a song or not, but whether it has the correct version of a song. When you introduce different masterings into the process then, depending on the thresholds set, it is probably correct in concluding that it hasn't got a particular version, even if you think it has.
The solution would appear to me to be a tweaking of the matching thresholds along with the ability to force an upload, but we just don't know what restrictions they have (remember that if Shazam gets it wrong, it doesn't present you with a copy of a new track that you didn't have previously). It is almost certainly not just a case of improving their technology.

  • Legacy asset data upload for multiple line items

    Hello
Legacy asset data upload for multiple line items. For example, Building is an asset which has different line items for the purchase of land, construction, renovation, etc. Now, to upload the legacy data, what should be considered: only one line item for Building, or a number of line items?
Which is the proper way to do this exercise?
    regards

    Hi,
It completely depends on the client's requirement, but here are a few approaches:
1. Define Building as an asset class.
2. Create a Building at a specific location as an asset code, and
3. create asset sub-numbers for the other components.
Another approach would be:
1. Define Building as an asset class.
2. Create a Building at a specific location as an asset code, and
3. create further asset codes for the other components, where the field Description is used to relate them to each other.
    Regards,
    Sayujya

• Are there any data upload tools other than BDC?

Are there any tools for data upload other than BDC?

    Hi Hassan,
LSMW
http://sapabap.iespana.es/sapabap/manuales/pdf/lsmw.pdf
Direct Input
http://help.sap.com/saphelp_di471/helpdata/EN/fa/097174543b11d1898e0000e8322d00/content.htm
BAPIs
Examples:
BAPI_QUOTATION_CREATEFROMDATA2 - Customer Quotation: Create Customer Quotation
BAPI_PO_CREATE1 - Create Purchase Order
Close your previous threads if you have got the answers.
    Regards,
    AS
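As an illustration of the BAPI route, a heavily trimmed sketch around BAPI_PO_CREATE1. The organizational values and material are placeholders, only a fraction of the available fields are filled, and the X-structures flag which fields the BAPI should take over:

```
REPORT zbapi_po_sketch.

DATA: gs_header  TYPE bapimepoheader,
      gs_headerx TYPE bapimepoheaderx,
      gt_item    TYPE STANDARD TABLE OF bapimepoitem,
      gs_item    TYPE bapimepoitem,
      gt_itemx   TYPE STANDARD TABLE OF bapimepoitemx,
      gs_itemx   TYPE bapimepoitemx,
      gt_return  TYPE STANDARD TABLE OF bapiret2.

START-OF-SELECTION.
* Placeholder organizational data - replace with real values.
  gs_header-comp_code  = '1000'.
  gs_header-doc_type   = 'NB'.
  gs_header-vendor     = '0000100001'.
  gs_header-purch_org  = '1000'.
  gs_header-pur_group  = '001'.
  gs_headerx-comp_code = 'X'.
  gs_headerx-doc_type  = 'X'.
  gs_headerx-vendor    = 'X'.
  gs_headerx-purch_org = 'X'.
  gs_headerx-pur_group = 'X'.

  gs_item-po_item   = '00010'.
  gs_item-material  = 'MAT-001'.      " placeholder material
  gs_item-plant     = '1000'.
  gs_item-quantity  = 10.
  APPEND gs_item TO gt_item.
  gs_itemx-po_item  = '00010'.
  gs_itemx-material = 'X'.
  gs_itemx-plant    = 'X'.
  gs_itemx-quantity = 'X'.
  APPEND gs_itemx TO gt_itemx.

  CALL FUNCTION 'BAPI_PO_CREATE1'
    EXPORTING
      poheader  = gs_header
      poheaderx = gs_headerx
    TABLES
      return    = gt_return
      poitem    = gt_item
      poitemx   = gt_itemx.

* BAPIs do not commit on their own; check gt_return for errors first.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = 'X'.
```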

  • Master data upload into SAP system

    Hello,
    I want to know if there is any standard method to upload material master, customer master, vendor master and finance master data into SAP system.
I am not referring to LSMWs, BDCs or BAPIs. I am aware of standard programs like RMDATIND for material master upload, RFBIDE00 for customer master upload and RFBIKR00 for vendor master upload, but these use the direct input method, and SAP recommends this only for testing purposes. I am not sure if they could really be used in live scenarios.
From some other posts in the forum, I came to know about transactions like BDLR, SXDB and BMVO. Can someone tell me how to use these transaction codes?
If someone has any detailed documentation on these transaction codes, or on standard master data upload techniques in general, please send it to [email protected]
    Thanks in advance,
    CMV

    Hi,
Define the following attributes, using the F4 input help and F1 field help:
Report: the name of a registered program for this program type.
Variant: you can only specify a variant with programs that are started directly.
With direct input, data from the data transfer file undergoes the same checks as in the online transaction and is then transferred directly into the SAP system; the database is updated directly with the transferred data.
For the documentation of the other transactions, please refer to the corresponding program documentation, which is more helpful.
Reward points if helpful.
    Regards,
    jinesh

  • BW upgrade EHP1, data uploads should stop?

    Dear experts,
we have planned a system upgrade. The current system is BW 7.0 SP 17, now planned for EHP1 and SP 9.
I know there are some post-upgrade activities, which include consistency checks for objects (InfoObjects, transfer rules, cubes, DSOs, etc.).
Could someone please confirm: do we need to stop data uploads and process chains during the system upgrade?
    Thanks in advance!
    Best Regards,
    Mannu

    Hi Ingo,
RSRT was giving proper results. We have now implemented a few SAP Notes and the issues got resolved.
    The following are the notes:
    1499233 - MDX:bXML flattening, unbalanced hierarchy, empty columns
    1485648 - MDX: bXML flattening and hierarchies and displaced columns
    1446245 - MDX: Error when RSR_MDX_BXML_GET_GZIP_DATA is called
    1441767 - MDX: No data for bXML if only ALL member is requested
    1438091 - MDX: basXML: Object MEASURE 0 not found
    1435844 - MDX:Wrong no. decimal places for basXML flattening interface
    1432162 - MDX: Flattening problems when using hierarchies
    1420169 - MDX: bXML flattening: Subsequent note
    1411491 - MDX: bXML flattening in transac. MDXTEST: Selecting packages
    1404328 - MDX: bXML flattening and PROPERTIES: Columns overwritten
    Thanks for your inputs.
    Regards,
    shesha.

  • How can I create a data connection without ODBC (directly with OLE DB)

If connecting to an OLE DB data source which is not defined as ODBC, the well-known error message appears in Acrobat that the environment is not trusted. The Designer help says that the document must be certified in Acrobat to run OLE DB without ODBC. So I certified the document in Acrobat, allowing dynamic content for it, but the error message still appears. So I have some questions about this:
What is the correct certification process in Acrobat to define a trusted environment for using OLE DB without ODBC?
Why does using ODBC create a trusted environment? With ODBC, data sources can be changed externally to Acrobat, so I would think ODBC is absolutely insecure.
Why is OLE DB without ODBC insecure? Here I have a defined connection string which cannot be changed at runtime (if the form developer doesn't want it to be).
And why is the first record displayed instead of the error message when using OLE DB without ODBC, yet navigation is impossible?
This all seems very mysterious to me.
Has anyone in the world ever created an OLE DB data connection (without ODBC) with Designer and Acrobat which runs in a trusted environment?
    Thanks for your answers.
    Michael

    For the Projects, the trick will be the Asset files. If they will fit onto DVD DL discs, you're OK. If they are too large (DV-AVI is ~ 13GB per hour), then you'll need some type of backup, or spanning software, that will write files across multiple discs. A better choice, IMO, would be a 1 - 2 TB external, where you can use Project Manager, or just Windows Explorer and Copy, to do the entire Project folder with Assets. I'd recommend against a USB and go at least for FW-400. FW-800 would be better, but you'll need FW-800 connections on your computer. eSATA would be the best, but then you'll need eSATA connectors.
    Good luck,
    Hunt

Maybe you are looking for

  • Can I connect multiple Mac's using the Airport Express?

As per the subject: I have an Intel iMac and a MacBook. Can I connect them using the AirPort Express? Or, in other words, can I create a network between them so that I can see one Mac from the other? Sorry if I raised a silly question, but I'd really like to know

  • How to cross a window in smartforms

    How to cross a window in smartforms.

  • Workflow with item type and item key  is in progress. Abort existing workfl

    Dear all, I'm using the below code from the submit button event to launch the workflow. The workflow works fine when I submit for the first time, when i try to submit for the second time from the same session it throws me the error as Workflow with i

  • KM Portal Runtime Error

Hi everybody, I installed Portal 7 in a system with BI7, with Microsoft Server 2003 and SQL 2005. The installation was good, but when I open the portal and try to go to KM, it shows me this log: Additional information: null Exception ID = b2564683-b6f3-2a10-73a2-c

  • How to manually check for iOS updates

    HI My iPad is running 4.3.3 and I want to update to iOS 5, but there isn't a check for update button, there is only an update button which allows me to only update to 4.3.5 I believe it is about 4 hrs after when iOS 5 should be released but correct m