Encountering a problem in Flat File Extraction

Flat file extraction: the settings I maintain in the DataSource "Fields" tab are conversion routine "ALPHA" and format "External". But when I load the data, records maintained in lower case in the flat file are not converted to upper case. Without the ALPHA conversion, however, the lower-case data is converted into the SAP format (upper case). Could you please help me fix this problem when I use both ALPHA and "External" together?

Hi, did you enable the "Lowercase letters" checkbox at the InfoObject level for the objects in which you want lower-case values? Check help.sap.com as well.
Good day
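If the conversion has to be forced explicitly, one option is a small field routine in the transfer/transformation rules. The sketch below is only an illustration, not a confirmed fix for the ALPHA/External combination; the source field name MATNR, the length 18 and the BI 7.x routine frame (SOURCE_FIELDS / RESULT) are assumptions.

" Minimal sketch: force upper case, then apply the ALPHA conversion exit
DATA: lv_value TYPE c LENGTH 18.

lv_value = SOURCE_FIELDS-matnr.              " placeholder source field
TRANSLATE lv_value TO UPPER CASE.            " lower case -> upper case

CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'  " pad to internal format
  EXPORTING
    input  = lv_value
  IMPORTING
    output = lv_value.

RESULT = lv_value.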

Similar Messages

  • Date format problem in Flat file extraction

    My flat file sends the date like this: DD/MM/YYYY HH:MM:SS. How can I extract the date and time into my ODS? I need the date and, if possible, the time in a separate object.
    How can I create my InfoObjects and how do I map them?
    Thanks
    Babu

    Hi,
    Create a routine and use the statement CONVERT TIME STAMP. Check the F1 help for details of the statement.
    Siggi
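    One simple alternative to CONVERT TIME STAMP for this fixed format is to split the string directly in the routine. The following is only a rough sketch; the source field name TRDATE and the surrounding routine frame are assumptions, not taken from this thread.

    " Sketch: split 'DD/MM/YYYY HH:MM:SS' into a date (DATS) and a time (TIMS)
    DATA: lv_in   TYPE c LENGTH 19,      " e.g. '01/10/1999 14:30:00'
          lv_dstr TYPE c LENGTH 8,
          lv_tstr TYPE c LENGTH 6,
          lv_date TYPE d,
          lv_time TYPE t.

    lv_in = SOURCE_FIELDS-trdate.        " placeholder source field name

    CONCATENATE lv_in+6(4) lv_in+3(2) lv_in(2)      INTO lv_dstr.   " YYYYMMDD
    CONCATENATE lv_in+11(2) lv_in+14(2) lv_in+17(2) INTO lv_tstr.   " HHMMSS

    lv_date = lv_dstr.
    lv_time = lv_tstr.

    RESULT = lv_date.                    " map lv_time to a separate time InfoObject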

  • Changing the file name in Flat file Extraction

    Hi,
    Currently I am using flat file extraction for my forecast data, and I am sending the file through the application server.
    I have created the directory successfully, and every morning I receive a file through the FTP server with a name like 20060903.csv; this name is based on one field in my flat file data, e.g. /interface/asf/20060903.csv.
    In the middle of the month we have a cut-off date, and this cut-off date varies for each month. At that time the file name changes on the FTP server and a file with a different name, e.g. 20061002.csv, will exist on the application server.
    Now, in the InfoPackage, I also need to set the deletion settings so that if the file name is the same, the previous requests are deleted. I could achieve this if I could get the file name changed.
    If I am not changing the file name, how do I set the deletion condition so that it does not delete when the field (scenario) changes, i.e. from 20061002 to 20061101? I should have only one request for 20061002, one for 20061101, and so on; if the scenario is the same, it should delete the old request.
    Can anyone kindly advise? This is very urgent and critical.
    Thanks & regards,
    Bhuvana.

    Hi Bhuvana,
    Try the following ABAP code in the routine under the External data tab in the InfoPackage.
    data: begin of i_req occurs 0,
            rnr like rsiccont-rnr,
          end of i_req.

    " Pick the most recent request loaded into the data target
    select * from rsiccont up to 1 rows
             where icube = <datatargetname>
             order by timestamp descending.
      i_req-rnr = rsiccont-rnr.
      append i_req.
      clear i_req.
    endselect.

    " Delete that request again if it came from the same file name
    loop at i_req.
      select single * from rsseldone where rnr eq i_req-rnr
                                       and filename = p_filename.
      if sy-subrc = 0.
        call function 'RSSM_DELETE_REQUEST'
          exporting
            request                    = i_req-rnr
            infocube                   = <datatargetname>
          exceptions
            request_not_in_cube        = 1
            infocube_not_found         = 2
            request_already_aggregated = 3
            request_already_comdensed  = 4
            no_enqueue_possible        = 5
            others                     = 6.
        if sy-subrc <> 0.
          message id sy-msgid type 'I' number sy-msgno
              with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        else.
          message i799(rsm1) with i_req-rnr 'deleted'.
        endif.
      endif.
    endloop.
    Let me know if you run into any problems with this logic.
    regards,
    Raju

  • Oracle API for Extended Analytics Flat File extract issue

    I have modified the Extended Analytics SDK application as follows:
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
    instead of
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
    I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
    Where does the FLATFILE get saved/output when using the API?
    The ultimate goal is to create a flat file through an automated process which can be picked up by a third-party application.
    thanks for your help,
    Chris

    Never mind. I found the location on the server.

  • Extended Analytics Flat File extract in ANSI

    Hi,
    Version: EPM 11.1.2.1
    The Flat file extract using Extended Analytics from HFM is exported in UNICODE format. Is there any option to get the export in ANSI code format?
    Thanks,
    user8783298

    Hi
    In the latest version, the data from HFM can currently be exported only in UNICODE format. A defect has been filed with Oracle for this.
    At present the only workaround is to change the file encoding after it has been extracted, using third-party software.
    For example, the extracted file can be opened in Windows Notepad and the desired encoding format selected when the file is saved using Save As.
    Hope this helps.
    Thank you.
    Regards,
    Mahe

  • Delta upload for flat file extraction

    Hello everyone,
    Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do that.

    Hi Norton,
    For a flat file DataSource, the upload will always be FULL.
    If you need to extract delta records from a flat file, please refer to the following document; we can also write a routine at the InfoPackage level (see the sketch below):
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710
    Regards,
    Harish.
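    As an illustration of such an InfoPackage-level routine, here is a minimal sketch. It assumes, as in the file-name thread above, that a daily file named YYYYMMDD.csv arrives in /interface/asf/ on the application server; p_filename and p_subrc are the CHANGING parameters of the generated file-name routine frame.

    " Build today's file name so each daily load picks up only the new file
    " (pseudo-delta for a flat-file DataSource)
    DATA: lv_name TYPE string.

    CONCATENATE '/interface/asf/' sy-datum '.csv' INTO lv_name.

    p_filename = lv_name.
    p_subrc    = 0.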

  • Hierarchy flat file extraction

    Hi experts,
    Can anyone guide me through flat file hierarchy extraction, step by step, if possible with screenshots?
    You can send it to [email protected]
    Regards,
    Rambo.

    Hi,
    Flat file hierarchy extraction is similar to the normal flat file extraction procedure, but the file structure itself can be complex and is different from that of normal flat files.
    Take a look at the threads below for more details :
    Hierarchy Flat file
    Hierarchy from flat file
    Program to load a flat file in a Hierarchy
    Cheers,
    Kedar

  • Generic and Flat file extraction

    Hi experts
    Can anybody send a real-time scenario for generic and flat file extraction, with examples?
    Thanks and regards,
    satya

    Hi,
    Generic extraction is an extraction where you create a generic DataSource and, based upon this, extract the data from R/3; that is, you have a source system from which the data is extracted via the generic DataSource.
    Flat file extraction is an extraction where you do not have to connect to a source system; in this case your source system is a PC. Here you design an InfoSource based upon the fields/columns of your flat file, map the corresponding InfoObjects to it, and maintain the update rules. You then load/extract the data from the file as the source.
    These two things are different in nature and usage.
    Hope this helps.
    Assign points if useful.
    Cheers,
    Pattan.

  • Omit the Open Hub control file 'S_*' for flat file extracts

    Hi Folks,
    A quick question: is it somehow possible to omit the control file generation for flat file extracts?
    We have some Unix scripts running that get confused by the S_* files, and we were wondering if we can switch the creation of those files off.
    Thanks and best regards,
    Axel

    Hi,
    You may have read that the S_ (structure) file does not reflect the proper structure, i.e. that it lists all fields sorted alphabetically by field name instead of in the order in which they appear in the output file.
    As far as I know, and as I have checked, this is not the case: the S_ file has the fields in the order of the InfoSpoke object sequence (and of the source structure on the transformation tab).
    I would suggest you check it again.
    Derya

  • "encountered problems, Cannot convert file" what does this mean?

    "Encountered problems, cannot conver file"  What does this mean?

    Hi 123456Oosse,
    I would like to help!
    What exactly are you referring to? Are you experiencing this every time you try to convert your PDF? Have you tried other PDFs?
    A few other questions:
    What is the file size? (There is a 100 MB limit.)
    What is the original doc type of the PDF?
    What are you converting it to? Excel? Word?
    Looking forward to your response!
    Regards, Stacy

  • InfoSpoke Flat File Extract to Logical Filename

    I'm trying to extract data from an ODS to a flat file. So far, I've found that the InfoSpoke must write to the application server for large data volume. Also, in order for the InfoSpoke to transport properly, I must use logical filenames. I've attempted to implement the custom class and append structure as defined in the SAP document "How To... Extract Data with OPEN HUB to a Customer Defined Logical Filename". I'm getting an error when attempting to import the included transports (custom class code). It appears to be a syntax error. Has anyone encountered this, and, if so, how did you fix it?

    Hello.
    I'm getting a syntax error also. I did not import the transport, but applied the notes through the appendix. When I modified the method GET_OBJECT_REF_INT in class CL_RSB_DEST as below, I got a syntax error on the CREATE OBJECT statement.
        when rsbo_c_desttype_int-file_applsrv.
    *{   REPLACE        &$&$&$&$                                          1
    *\      data: l_r_file_applsrv type ref to cl_rsb_file_applsrv.
          data: l_r_file_applsrv type ref to zcl_rsb_file_logical.
    *}   REPLACE
          create object l_r_file_applsrv
            exporting i_dest    = n_dest
                      i_objvers = i_objvers
    The syntax error is in class CL_RSB_DEST, method GET_OBJECT_REF_INT: "The obligatory parameter 'I_S_VDEST' had no value assigned to it."

  • Problem with flat file loading/Special characters

    Hi, All,
    We just migrated from version 3.0B to BI 7.0, and I have a problem which I can't handle at the moment. We are on Unicode; the current codepage of the server is 4103 and the codepage of the frontend is 4110 (from transaction SNL1).
    I am loading Czech texts for my materials via CSV files. The CSV file is correct on my PC; I can see the special characters.
    The problem occurs during loading: whichever character set I choose in the InfoPackage (Standard or the user-dependent character set setting), I cannot see the correct characters.
    Has anyone already encountered this kind of problem and, if yes, how did you solve it?
    Thanks

    Hi,
    Check the letter case in your flat file: if the values are in lower case, change them to upper case, or else enable the lower-case checkbox for the corresponding InfoObject.
    If helpful, provide points.
    Regards,
    harikrishna.N

  • Problem Reading Flat File from Application server

    Hi All,
    I want to upload a flat file which has a line length of 3000.
    While uploading it to the application server, only up to a length of 555 is getting uploaded.
    I am using this code:
    * TB_DATA holds one flat-file line of up to 3000 characters
    DATA: BEGIN OF TB_DATA OCCURS 0,
            LINE(3000),
          END OF TB_DATA.
    * TB_DATA was filled beforehand from the local drive (FM "UPLOAD");
    * TB_FILENAM-DATA holds the target file name on the application server.
    OPEN DATASET TB_FILENAM-DATA FOR OUTPUT IN TEXT MODE." ENCODING DEFAULT.
    IF SY-SUBRC NE 0.
      WRITE:/ 'Unable to open file:', TB_FILENAM-DATA.
    ELSE.
      LOOP AT TB_DATA.
        TRANSFER TB_DATA TO TB_FILENAM-DATA.
      ENDLOOP.
    ENDIF.
    CLOSE DATASET TB_FILENAM-DATA.
    What could be the problem?
    Could it be due to a configuration problem?
    Waiting for your replies.
    Thanks and Regards.
    Suki

    Your code looks OK, but you may have trouble displaying the full width. Try:
    WRITE: /001 TB_FILENAM-DATA+555.
    (And don't forget to append your ITAB after each read.)
    Rob
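    One way to check whether the full 3000 characters actually reach the server is to read the file back and print the content length of each line. This is only a sketch and assumes the file name is still available in TB_FILENAM-DATA.

    DATA: LV_LINE(3000) TYPE C,
          LV_LEN        TYPE I.

    OPEN DATASET TB_FILENAM-DATA FOR INPUT IN TEXT MODE." ENCODING DEFAULT.
    IF SY-SUBRC = 0.
      DO.
        READ DATASET TB_FILENAM-DATA INTO LV_LINE.
        IF SY-SUBRC <> 0.
          EXIT.
        ENDIF.
        LV_LEN = STRLEN( LV_LINE ).
        WRITE: / 'Content length of line:', LV_LEN.
      ENDDO.
      CLOSE DATASET TB_FILENAM-DATA.
    ENDIF.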

  • Problem in Flat file transaction data loading to BI7

    Hi Experts,
    I am new to BI 7. I have a problem activating the DataSource for transaction data and creating a transfer routine to calculate the sales revenue, which is not in the flat file; the flat file only has the price per unit and the quantity sold.
    The flat file fields are:
    Cust ID   SRep ID   Mat ID   Price per unit   Unit measure   Quantity   Tr. Date
    cu100     dr01      mat01    50               CS             5          19991001
    The InfoObjects I created are:
    IO_CUID
    IO_SRID
    IO_MATID
    IO_PRC
    IO_QTY
    IO_REV
    0CALDAY
    When I created IO_QTY, the unit/currency was given as 0UNIT. In the creation of the DataSource, under the Fields tab, an extra field CS was shown, and I was confused about which object to map it to. At the same time, when I entered IO_QTY a dialog box prompted me with 0UNIT, so I accepted it. Now I can't map CS to a unit, because the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure to enter a formula for calculating sales revenue at this level instead of at query level (a sketch follows below)? Your solutions will be appreciated with points.
    Awaiting your solutions.
    Thanks
    Shrinu

    Hi Sunil,
    Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab when creating the DataSource. I have not reached the data target yet.
    Could you please answer my complete question?
    Will you be able to send some screenshots to my email ID [email protected]?
    Thanks
    Shri
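    For the revenue derivation mentioned above, the usual approach is a routine (or formula) on IO_REV in the transformation that multiplies price by quantity. The following is a minimal sketch, assuming a BI 7.x field routine; SOURCE_FIELDS-io_prc and SOURCE_FIELDS-io_qty are placeholder names for the source fields mapped from the flat file.

    " Field routine for IO_REV: revenue = price per unit * quantity sold
    RESULT = SOURCE_FIELDS-io_prc * SOURCE_FIELDS-io_qty.

    The extra CS column from the file is typically treated as the unit field for the quantity (0UNIT) rather than mapped to a separate characteristic.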
