InfoSpoke Flat File Extract to Logical Filename

I'm trying to extract data from an ODS to a flat file. So far, I've found that the InfoSpoke must write to the application server for large data volume. Also, in order for the InfoSpoke to transport properly, I must use logical filenames. I've attempted to implement the custom class and append structure as defined in the SAP document "How To... Extract Data with OPEN HUB to a Customer Defined Logical Filename". I'm getting an error when attempting to import the included transports (custom class code). It appears to be a syntax error. Has anyone encountered this, and, if so, how did you fix it?

Hello.
I'm also getting a syntax error. I did not import the transport, but applied the changes manually from the appendix. When I modify the method GET_OBJECT_REF_INT in class CL_RSB_DEST as shown below, I get a syntax error on the "create object" statement.
    when rsbo_c_desttype_int-file_applsrv.
*{   REPLACE        &$&$&$&$                                          1
*\      data: l_r_file_applsrv type ref to cl_rsb_file_applsrv.
      data: l_r_file_applsrv type ref to zcl_rsb_file_logical.
*}   REPLACE
      create object l_r_file_applsrv
        exporting i_dest    = n_dest
                  i_objvers = i_objvers
Class CL_RSB_DEST, method GET_OBJECT_REF_INT:
The obligatory parameter "I_S_VDEST" had no value assigned to it.
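If the constructor of the class being instantiated also expects the destination structure, that syntax error is exactly what you would see: the CREATE OBJECT call has to supply I_S_VDEST as well. A minimal sketch of what the call could look like, assuming the method has a suitably filled structure available (the variable name l_s_vdest is only an illustration - check the SAP how-to document for where the value comes from in your release):
      create object l_r_file_applsrv
        exporting i_dest    = n_dest
                  i_objvers = i_objvers
                  i_s_vdest = l_s_vdest.  " obligatory parameter missing from the original call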

Similar Messages

  • Changing the file name in Flat file Extraction

    Hi,
    Currently I am using flat file extraction for my forecast data, and I am sending the file through the application server.
    I have created the directory successfully, and every morning I receive a file via FTP with a name like 20060903.csv; this name is based on one field in my flat file data, e.g. /interface/asf/20060903.csv.
    Around the middle of the month we have a cut-off date, which varies from month to month. At that point the file name on the FTP changes, and a file with a different name, e.g. 20061002.csv, will exist on the application server.
    Now, in the infopackage I also need to set the deletion settings so that if the file name is the same, the previous requests are deleted. I could achieve this if I could get the file name changed.
    Alternatively, if I do not change the file name, how do I set the deletion condition so that nothing is deleted when the field (scenario) changes, e.g. from 20061002 to 20061101? I should have only one file for 20061002, one file for 20061101, and so on; if the scenario is the same, the previous request should be deleted.
    Can anyone kindly advise? This is very urgent and critical.
    Tks & regards,
    Bhuvana.

    Hi Bhuvana,
    Try the following ABAP code in a routine under the External data tab of the infopackage.
    data: begin of i_req occurs 0,
          rnr like RSICCONT-rnr,
          end of i_req.
    select * from RSICCONT UP TO 1 ROWS
                      where ICUBE  = <datatargetname>
                      order by TIMESTAMP descending.
      i_req-rnr = rsiccont-rnr .
      append i_req.
      clear i_req.
    endselect.
    loop at i_req.
      select single * from RSSELDONE where RNR eq i_req-rnr and
                                       filename = p_filename.
      if sy-subrc = 0.
        CALL FUNCTION 'RSSM_DELETE_REQUEST'
          EXPORTING
            REQUEST                    = i_req-rnr
            INFOCUBE                   = <datatargetname>
          EXCEPTIONS
            REQUEST_NOT_IN_CUBE        = 1
            INFOCUBE_NOT_FOUND         = 2
            REQUEST_ALREADY_AGGREGATED = 3
            REQUEST_ALREADY_COMDENSED  = 4
            NO_ENQUEUE_POSSIBLE        = 5
            OTHERS                     = 6.
        IF SY-SUBRC <> 0.
          MESSAGE ID sy-MSGID TYPE 'I' NUMBER sy-MSGNO
              WITH sy-MSGV1 sy-MSGV2 sy-MSGV3 sy-MSGV4.
        else.
          message i799(rsm1) with i_req-rnr 'deleted'.
        ENDIF.
      endif.
    endloop.
    Let me know if you run into any problems with this logic.
    regards,
    Raju
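    For the other half of the question, the changing file name: one option is to let the infopackage derive the name instead of hard-coding it. If you choose "routine" for the file name on the External data tab, the system generates a form (typically COMPUTE_FLAT_FILE_FILENAME with changing parameters P_FILENAME and P_SUBRC - please verify against the skeleton your system generates) in which you build the name. A rough sketch of the body, with sy-datum only as a placeholder for your own cut-off date logic and the path from your example:
    " inside the generated file name routine - replace sy-datum with the cut-off date logic
      concatenate '/interface/asf/' sy-datum '.csv' into p_filename.
      p_subrc = 0.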

  • Omit the Open Hub control file 'S_*' for flat file extracts

    Hi Folks,
    a quick question is it somehow possible to omit the control file generation for flat file extracts.
    We have some Unix scripts running that get confused by the S_* files, and we were wondering if we can switch the creation of those files off.
    Thanks and best regards,
    Axel

    Hi,
    Regarding the claim that the S_ (structure) file does not reflect the proper structure, i.e. that it lists all fields sorted alphabetically by field name instead of in the order in which they appear in the output file:
    As far as I know, and have checked, this is not the case. The S_ file lists the fields in the sequence of the InfoSpoke objects (the source structure on the transformation tab).
    I would suggest you check it again.
    Derya

  • Oracle API for Extended Analytics Flat File extract issue

    I have modified the Extended Analytics SDK application as follows:
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
    instead of
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
    I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
    Where does the FLATFILE get saved/output when using the API?
    Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
    thanks for your help,
    Chris

    Never mind. I found the location on the server.

  • Extended Analytics Flat File extract in ANSI

    Hi,
    Version: EPM 11.1.2.1
    The Flat file extract using Extended Analytics from HFM is exported in UNICODE format. Is there any option to get the export in ANSI code format?
    Thanks,
    user8783298

    Hi
    In the latest version, the data from HFM can currently be exported only in Unicode format. A defect has been filed with Oracle for this.
    At present the only workaround is to change the file encoding after it has been extracted, using third-party software.
    For example, the extracted file can be opened in Windows Notepad and the desired encoding can be selected when saving the file with Save As.
    Hope this helps.
    thank you
    Regards,
    Mahe

  • Encountering problem in Flat File Extraction

    Flat file extraction: the settings I maintain in the data source "Field" tab are conversion routine "ALPHA" and format "External". But when I load the data, a record maintained in lower case in the flat file is not converted to upper case. If I try without the ALPHA conversion, the lower case data is converted to the SAP-internal format (upper case). Could you please help me fix the problem when I use both ALPHA and External together?

    Hi, did you enable the "Lowercase letters" checkbox at the InfoObject level for the objects in which you want lower case values? Check the help.sap.com site as well.
    Good day

  • Delta upload for flat file extraction

    Hello everyone,
    Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do that.

    Hi Norton,
    For a flat file DataSource, the upload will always be FULL.
    If you need to extract delta records from a flat file, please refer to the following document; we can write a routine at infopackage level:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710
    Regards,
    Harish.

  • Hierarchy flat file extraction

    Hi experts--
    Can anyone guide me through flat file hierarchy extraction, step by step, if possible with screenshots?
    You can send it to [email protected]
    regards,
    Rambo.

    Hi,
    Flat file hierarchy extraction follows the same procedure as normal flat file extraction, but the file structure itself can be complex and is different from that of normal flat files.
    Take a look at the threads below for more details :
    Hierarchy Flat file
    Hierarchy from flat file
    Program to load a flat file in a Hierarchy
    Cheers,
    Kedar
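    For orientation, a hierarchy flat file typically carries one row per hierarchy node, with columns along the lines of NODEID, INFOOBJECT, NODENAME, LINK, PARENTID plus language and text fields. This is only an illustration from memory - the authoritative layout is the structure generated for your hierarchy DataSource, so check that in the system before building the file. A tiny example with two text nodes (0HIER_NODE) and one leaf, where ZCUSTOMER stands in for the hierarchy basic characteristic:
    NODEID;INFOOBJECT;NODENAME;LINK;PARENTID;LANGU;TXTSH;TXTMD;TXTLG
    1;0HIER_NODE;ALL_REGIONS;;0;EN;All;All regions;All regions
    2;0HIER_NODE;NORTH;;1;EN;North;Region North;Region North
    3;ZCUSTOMER;1000012;;2;;;;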

  • Generic and Flat file extraction

    Hi experts
    Can anybody send a real-time scenario for generic and flat file extraction, with examples?
    Thanks and regards,
    satya

    Hi,
    Generic extraction is an extraction where you create a generic DataSource and extract the data from R/3 based on it, which means you have an R/3 source system from which the generic DataSource pulls the data.
    Flat file extraction is an extraction where you do not need to connect to a source system; in this case your source is a PC file. You design an InfoSource based on the fields/columns of your flat file, map the corresponding InfoObjects to it, define update rules, and load the data from the file as the source.
    The two are different in nature and usage.
    Hope this helps..
    assign points if useful..
    cheers,
    Pattan.

  • How to use 'flat file extraction' option in Infospoke

    Hello Gurus,
    We are trying to extract BW data using an InfoSpoke to a 3rd party ETL tool (DataStage). In the current BW 3.5 release the customer is on SP11, and we are not able to extract any data using the 3rd party system option in the InfoSpoke. We found out from SAP that, because we are getting the error "3rd party system is not set" in the InfoSpoke, we have to upgrade the SP level from 11 to either 16 or 17. The customer we are dealing with is not going to do that until July 2007.
    So now we have a second option: write a flat file to the BW application server and FTP it to DataStage. Is this a workable option?
    In the InfoSpoke destination tab we can see two options. One is the File option, which loads the data into the standard DIR_HOME directory. I tried that, but I can't see the file and the entire data within the standard directory. Can anyone explain why, and what procedure to follow?
    The second option is to use a logical file. I have never done that before. Can anyone explain the step-by-step process, or provide a detailed document and/or link?
    Since we are at a critical point in the timeline, I would appreciate a fast response from the gurus.
    I will make sure to give reward points to every helpful answer.
    thanks in advance,
    Maulesh

    Hi shashi,
    First, let me clarify one thing: I am not saying that '3rd party tool' is an option in the InfoSpoke. I know the InfoSpoke provides only two options (DB table and file). But when you use the DB table option, you can check '3rd party destination', and that is where you define an RFC destination for the 3rd party tool. So when you execute the InfoSpoke, it extracts the data and loads it into the defined destination. Since we are on SP11 of the BW 3.5 release, we get an error while executing an InfoSpoke with the DB table option: '3rd party destination is not set'. When I searched the Service Marketplace I found an OSS note saying that we need an SP upgrade (up to 17).
    As I mentioned earlier, the client doesn't want to upgrade until next year, and we are at a critical point where we have to decide whether to hold this project until the SP upgrade happens, or find an alternate approach to extract the data from BW (somehow) and load it into the Teradata database via the DataStage BW Pack (3rd party ETL).
    So we might be able to use the second option in the InfoSpoke destination, which is the flat file load.
    Now, when I use the File option with 'Application server' checked, the data is written to the default server and directory on BW, which is DIR_HOME. I can see the extracted files in that directory in transaction AL11, but I can't see any data. Do you think I have to request an increase in a size limit? Once I can see all the data, how can I FTP it from the BW server? Do I have to work with SAP Basis, or create a UNIX script?
    When I use a logical file name, I guess I have to follow what you described in your response, correct? Can I use an existing logical path and file name? Do you have a 'How to' document on creating YEAR and PER FM? Is it possible for Basis to do this work for us? What happens after we assign a logical path and logical file name in the destination tab of the InfoSpoke? What is the use of a logical file versus a file? Overall, I am looking for the whole process of extracting the data from the InfoSpoke, using either a file or a logical file, all the way to loading it into the 3rd party database.
    Looking forward to your response!
    thanks,
    Maulesh

  • Flat file extraction with FILE

    Hi BWers,
    I want to extract a flat file to BW with an infopackage.
    This flat file is located in a physical path that changes according to the system:
    - in the development server, the path is:
    ...\dev\file.csv
    - in the quality server, the path is:
    ...\qual\file.csv
    - in the production server, the path is:
    ...\prod\file.csv
    In the 'Name of the File' field on the infopackage extraction tab, I have to put either the physical path, a logical file, or a routine.
    I don't want to put the physical path directly, because I would have to change it in the quality and production servers.
    I don't want to use a routine either.
    So I created a logical file with the FILE transaction, but in the physical path I put:
    ...\dev\<FILENAME>
    How can I use a directory variable like $(directory)<FILENAME>, which says: if we are in Dev, then $directory = ...\dev\file, else the quality path, and so on?
    Thanks for help.
    Cheers,

    Hi,
    Here is the answer I found:
    In transaction FILE, under 'Assignment of Physical Paths to Logical Path', use a variable like this in the physical path:
    <V=Z_INT_SRV>
    Then declare this variable under 'Definition of Variables':
    Z_INT_SRV
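    Just to round this off: at runtime the system resolves the logical name into the system-specific physical path, which is why the same infopackage works unchanged in development, quality and production. A minimal sketch using the standard function module FILE_GET_NAME - the logical file name Z_FORECAST_FILE is only an illustration for whatever you define in transaction FILE:
    report z_resolve_logical_file.
    data lv_physical type filename-fileextern.
    " resolve the logical file name maintained in transaction FILE into
    " the physical path of the current system (dev / qual / prod)
    call function 'FILE_GET_NAME'
      exporting
        logical_filename = 'Z_FORECAST_FILE'  " illustrative logical name
      importing
        file_name        = lv_physical
      exceptions
        file_not_found   = 1
        others           = 2.
    if sy-subrc = 0.
      write: / lv_physical.
    endif.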

  • CANNOT TRANSPORT INFOSPOKE FLAT FILE DESTINATION PATH

    Hi,
    I am writing from an InfoSpoke to a flat file on the application server. When I transported this to QA, the 'Application server' checkbox was unchecked and the path was changed to the QA path.
    Do I need to have a logical file name in order to transport? If I have 5 InfoSpokes, do I need 5 logical file names, or is there any way to pass a parameter ... ?
    Is there any other way to transport while keeping the file name option, instead of using a logical file name?
    Thanks
    Krishna

    Hi Krishna,
    You would need an ABAP program to change the entry in the table RSBFILE.
    First try this: go to SE11 -> RSBFILE -> enter your InfoHub name and check whether you have authorization to change the path.
    ELSE..
    Write an ABAP program and update the field.
    REPORT test.
    tables: RSBFILE.
    update RSBFILE set PATH = '<your path>' where OHDEST = '<your InfoHub>'.
    Bye
    Dinesh

  • Trigger flat file extraction

    Hello developers,
    Situation is as follows:
    An external system produces a flat file on the BW server on a daily basis, always with the same name. BW should read the file if it is there and delete it after successful processing.
    We want to avoid scheduling at a fixed time; instead we want to synchronize the extraction batch job with the external batch job that creates the file.
    Thanks for any input.
    Kind regards

    The solution is fine. Here are some details.
    Create the Process Chain. Trigger should be an external event. You can create events in SM62.
    The function module RSSM_EVENT_RAISE lets you trigger the event. It is RFC-enabled, so you should be able to trigger it from other, non-SAP systems too.
    The filename should be fixed so that you are always able to load the file. You can then delete the file with a small ABAP program of your own. In its simplest form it looks like this:
    REPORT  Z_DELETE_FILE.
    parameters: filename type RSFILENM.
    delete dataset filename.
    Now create a variant for Z_DELETE_FILE and put it into the process chain.
    Best regards
       Dirk
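    To close the loop on the event side: once the file has been written, the external batch (or a small wrapper report inside SAP) raises the event that the process chain start condition waits for. The sketch below uses the generic function module BP_EVENT_RAISE as an in-system illustration; for calls from a non-SAP system, the RFC-enabled RSSM_EVENT_RAISE mentioned above is the one to look at. The event name Z_FILE_ARRIVED is only an example and must exist in SM62.
    report z_raise_bw_event.
    " raise the background event that the process chain waits for
    call function 'BP_EVENT_RAISE'
      exporting
        eventid                = 'Z_FILE_ARRIVED'
      exceptions
        bad_eventid            = 1
        eventid_does_not_exist = 2
        eventid_missing        = 3
        raise_failed           = 4
        others                 = 5.
    if sy-subrc <> 0.
      write: / 'Event could not be raised, sy-subrc =', sy-subrc.
    endif.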

  • InfoSpoke flat file with tab separator

    Hello,
    I have an InfoSpoke which creates a flat file with a comma as the field separator. I would like to have a tab character as the field separator instead. How would I change it?

    Varun,
    as Ram said, you can give ',' as the separator.
    All the best.
    Regards,
    Nagesh Ganisetti.
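    The separator field in the InfoSpoke destination does not let you enter a tab character directly, so one workaround (my own suggestion, not something from this thread) is to post-process the generated file on the application server and swap the comma for a tab. A rough sketch, which is only safe if the field values themselves never contain commas:
    report z_csv_to_tab.
    parameters: p_in  type rsfilenm,
                p_out type rsfilenm.
    data lv_line type string.
    " rewrite the InfoSpoke output file with a horizontal tab as separator
    open dataset p_in  for input  in text mode encoding default.
    open dataset p_out for output in text mode encoding default.
    do.
      read dataset p_in into lv_line.
      if sy-subrc <> 0.
        exit.
      endif.
      " only safe when no field value contains a comma
      replace all occurrences of ',' in lv_line
              with cl_abap_char_utilities=>horizontal_tab.
      transfer lv_line to p_out.
    enddo.
    close dataset p_in.
    close dataset p_out.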
