ODS to ASCII File

I have data coming into an ODS, where it is staged. How do I extract the ODS data to an ASCII file? Can we do this using ABAP? If any of you gurus have done this before, please help me here.
Thanks

Hi Pradeep,
If you want to use ABAP, just create a report with SELECT-OPTIONS as you like and do a
SELECT * FROM /BIC/A<ODSNAME>00 INTO TABLE itab  "active data table (or /BI0/A<ODSNAME>00 for a content ODS)
WHERE <your select-option conditions>.
After that, download itab using the method
cl_gui_frontend_services=>gui_download
in case you want to have the file on your workstation, or using
OPEN DATASET
TRANSFER ... TO ...
CLOSE DATASET
in case you want to have the file on the application server.
Hope this is of some help.
Regards,
Siggi

Similar Messages

  • Non-character-type structure needs to be written to an ASCII file in a Unicode system

    Hi Gurus,
    I am trying to write a structure into an ASCII file on the application server.
    My code is as follows:
    data : ls_header type BBP_PDS_CTR_HEADER_D ,
           lv_filename TYPE  rlgrap-filename VALUE '/interface/AEE/IN/kalandi'.
    CALL FUNCTION 'BBP_PD_CTR_GETDETAIL'
    EXPORTING
       I_OBJECT_ID               = '4000000196'
    IMPORTING
       E_HEADER                  = ls_header.
    open dataset lv_filename for output in text mode encoding default .
    if sy-subrc = 0.
    transfer ls_header to lv_filename.
    close dataset lv_filename.
    endif.
    but it gives a dump.
    The error is:
        "TRANSFER f TO ..."
    only character-type data objects are supported at the argument position
    "f".
    In this case, the operand "f" has the non-character-type
    "BBP_PDS_CTR_HEADER_D". The current program is a Unicode program. In the
    Unicode context, the type 'X' or structures containing not only
    character-type components are regarded as non-character-type.
    Thanks a lot in advance.
    Points will be awarded for appropriate answers.
    Regards
    kalandi

    Hi,
    The problem is that ls_header (type BBP_PDS_CTR_HEADER_D) contains non-character-type components, so in a Unicode system it cannot be passed to TRANSFER directly. Move the fields into a purely character-type structure, or concatenate them into a string, and TRANSFER that instead.
    Regards,
    Satish

  • Convert Large Binary File to Large ASCII File

    Hello, I need some suggestions on how to convert a large binary file (> 200 MB) to an ASCII file. I have a program that streams data to disk in binary format, and now I would like to either add to my application or create a new app if necessary. Here is what I want to do:
    Open the binary file
    Read a portion of the file into an array
    Convert the array to ASCII
    Save the array to a file
    Go back and read more binary data
    Convert the array to ASCII
    Append the array to the ASCII file
    Keep converting until the end of the binary file.
    I should say that the binary data is 32 bits and I do need to parse the data (bits 0-11, bits 12-23, and bits 28-31), but I can figure that out later. The problem I see is that the file will be very large, perhaps even greater than 1 GB, and I don't have a clue how to read a portion of the file, come back and read another portion, and then stop at the end of the file. I hope to save the data in a spreadsheet. If anyone has experience with a similar situation, I'd appreciate any input or example code.
    Thanks,
    joe
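    The bit-field parsing joe mentions is language-agnostic; here is a minimal Python sketch, assuming little-endian unsigned 32-bit words and the bit ranges he lists (the field names are illustrative, not from his format):

    ```python
    import struct

    def parse_word(word):
        """Split one 32-bit word into the bit fields mentioned above."""
        field_a = word & 0xFFF            # bits 0-11
        field_b = (word >> 12) & 0xFFF    # bits 12-23
        field_c = (word >> 28) & 0xF      # bits 28-31
        return field_a, field_b, field_c

    def words_from_chunk(chunk):
        """Unpack a chunk of raw bytes into unsigned 32-bit words
        (little-endian assumed)."""
        count = len(chunk) // 4
        return struct.unpack("<%dI" % count, chunk[:count * 4])
    ```

    The same masking-and-shifting pattern works in LabVIEW with the logical AND and shift primitives, or with Split Number as suggested below.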

    sle,
    In the future, please create a new thread for unrelated questions.  To answer your question, you can use "Split Number" from the Data Manipulation palette.
    Message Edited by jasonhill on 03-14-2006 03:46 PM
    Attachments:
    split number.PNG ‏2 KB

  • Convert Binary File to ASCII File

    Hello, I'm having a little trouble writing a LabVIEW program to read a binary file and save it to an ASCII file. The binary file may be large, so I've tried to read the file in chunks. I will appreciate any help you can offer. I've attached my .vi below.
    thanks,
    joe
    Attachments:
    BinToASCI.vi ‏45 KB

    I don't really understand what you mean by "the data from the binary file are really huge numbers and don't match what is expected". Are you specifying the correct datatype to the Read function?
    As far as your VI is concerned: Your initial statements indicated that you wanted to read this in chunks since you're dealing with very large files. However, while your VI reads in one number at a time, it writes out the whole thing in one big chunk. This is going to be a huge memory draw. You need to write the data to the ASCII file at the same time as when you read it, not after you've read it all. Given this, it makes no sense to only read one number at a time as you're doing. You're going to be performing a disk I/O operation for each number! This is going to take forever, not to mention take a nice bite out of the life expectancy of your hard drive. I don't say this often, but you need to restructure your VI to something like this:
    1. Provide dialog to ask for the file to read.
    2. Provide dialog to ask for the file to write to.
    3. Write out your header to the output file.
    4. Start a loop where the number of iterations is based on the number of times you have to read the file using a max chunk size. This is based on the file size and the size of the chunk that you're comfortable in reading. The chunk size is dependent on your machine - i.e., speed and amount of RAM.
    5. Read that many numbers and write out the converted numbers.
    6. Repeat until done.
    7. Close both files.
    See attached file as a starting point. The error handling is not that great, so keep that in mind.
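    The loop structure outlined in steps 1-7 maps to any language; here is a hedged Python sketch of the read-convert-write cycle (the file paths, header, and chunk size are placeholders, and the words are assumed to be little-endian unsigned 32-bit):

    ```python
    import struct

    CHUNK_WORDS = 65536   # words per read; tune for your machine's RAM
    WORD_SIZE = 4         # 32-bit binary samples

    def convert(binary_path, ascii_path):
        """Read the binary file in bounded chunks and append each
        converted chunk to the ASCII file, so memory use stays flat
        no matter how large the input is."""
        with open(binary_path, "rb") as src, open(ascii_path, "w") as dst:
            dst.write("value\n")                  # header line (step 3)
            while True:
                chunk = src.read(CHUNK_WORDS * WORD_SIZE)
                if not chunk:
                    break                          # end of binary file
                count = len(chunk) // WORD_SIZE
                words = struct.unpack("<%dI" % count,
                                      chunk[:count * WORD_SIZE])
                # write the converted numbers immediately (steps 4-5)
                dst.writelines("%d\n" % w for w in words)
    ```

    The key design point, matching the advice above, is that each chunk is written out before the next one is read, so the whole data set is never held in memory at once.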
    Message Edited by smercurio_fc on 03-17-2006 05:20 PM
    Attachments:
    Read Chunks.vi ‏68 KB

  • Webutil_file_transfer.Client_To_AS_with_progress to transfer ASCII files

    Hi,
    I am using webutil_file_transfer.Client_To_AS_with_progress to transfer a file from a local
    directory to the application server. But I found that it transfers in binary mode.
    How can I transfer in ASCII mode, since what I upload are SQL scripts?
    Thanks.
    Ivan

    Hello,
    What is the problem with binary mode?
    You could lose information by transferring a binary file in ASCII mode, but you would lose nothing by transferring an ASCII file in binary mode.
    Anyway, there is no option to set the transfer mode.
    Francois

  • Select from ODS into .CSV file

    Hi gurus!
    I need to build an ABAP program that reads a number of fields from an ODS (table) and stores the result in a .CSV file.
    I have never done this before, so any documentation or examples would be greatly appreciated!
    Thanks already!

    hi,
    http://help.sap.com/saphelp_nw04/helpdata/en/c7/dc833b2ab3ae0ee10000000a11402f/frameset.htm
    Regards,
    Sourabh
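    Conceptually the task (select rows, render them as delimited lines, write the file) is the same in any language. A minimal Python sketch of the CSV-rendering step, for orientation only; the real program would be ABAP, with a SELECT on the ODS active table and GUI_DOWNLOAD or OPEN DATASET for the output:

    ```python
    import csv
    import io

    def rows_to_csv(rows, fieldnames):
        """Render a list of row dicts as CSV text with a header line."""
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
        return buf.getvalue()
    ```

    In ABAP the equivalent is to CONCATENATE the fields of each row SEPARATED BY ',' into a string table before downloading it.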

  • Problem in loading data to ODS from Flat File

    Hi,
    I am trying to load data from a flat file, i.e. records like those below:
    Customer  Material  Calday    Price
    100       1         20080519  20
    200       2         20080520  30
    I inserted Customer in the key fields and Material, Calday and Price in the data fields. The first records were loaded successfully to the ODS. The second time, I changed the first record to:
    Customer  Material  Calday    Price
    100       1         20080519  40
    But it is not loaded to ODS. It is giving the following error msg.
    Error while inserting data records for data package 1 (see long text)
    Data records already saved in the active table
    Technical key: 00005069 / 00000001 / 00000001
    Process 000001 returned with errors
    So please give me a solution for how to load this data to the ODS.

    Hi,
    If you haven't set up the delta mode, initialize the InfoPackage and then run a delta load from then on.
    Hope this solves the issue.
    Rgs,
    I.R.K

  • How to load delta, changing the name of file for every day with open hub?

    Hi gurus,
    I load daily delta data from an ODS to a flat file, using Open Hub.
    The file is located on the server. Since every delta load from the ODS writes to the same file name, the file is overwritten each day.
    I want to load the delta data to a different file name every day, for example BW_20060101, BW_20060102, BW_20060103, ...
    How do I do that?
    Best regards,
    Uros

    Hi Uros,
    This thread may give you some idea
    Changing the file name in Flat File Extraction
    https://websmp107.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700002201112003E
    Regards,
    BVC
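    The naming scheme Uros wants is just the load date formatted into the file name. A sketch of the pattern in Python for illustration; in BW this logic would live in the open hub / flat-file routine that sets the file name, and the prefix and extension here are placeholders:

    ```python
    from datetime import date

    def daily_filename(load_date, prefix="BW_", extension=".csv"):
        """Build a per-day file name such as BW_20060101.csv, so each
        daily delta lands in its own file instead of overwriting."""
        return "%s%s%s" % (prefix, load_date.strftime("%Y%m%d"), extension)
    ```

    Calling this with each day's load date yields BW_20060101.csv, BW_20060102.csv, and so on, which is exactly the non-overwriting behaviour asked for.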

  • Error handling when loading flate files

    Hi
    We have a process chain loading flat files from a server into an ODS. We want to create a routine to avoid errors in the ODS when no file is found.
    The error we want to handle is:
    Error 1 when loading external data
    Message no. RSAR234
    We want to handle this error message as OK, to avoid a red request in the ODS.
    Please help
    Kind regards
    Ola Einar Langen

    Hi, and thanks for your hints.
    We ended up with the following code. It will mark the process chain red (but this is as wanted); however, we have implemented it so that the process chain proceeds even if the loading process fails.
    The code for the routine is implemented in the InfoPackage on the "External data" tab:
    p_filename = 'filename.txt'.
    OPEN DATASET p_filename FOR INPUT IN BINARY MODE.
    * If the file could not be opened
    IF sy-subrc NE 0.
      p_subrc = sy-subrc.
    * delete the current request
      CALL FUNCTION 'RSSM_DELETE_REQUEST'.
    ELSE.
    * close the file if it was opened OK
      CLOSE DATASET p_filename.
      p_subrc = 0.
    ENDIF.
    The solution is now OK for us.
    Kind regards
    Ola Einar Langen

  • Urgent: Flat file load issue

    Hi Gurus,
    We are loading into a data target ODS via flat file. The problem is that the flat file shows the date field values correctly with 8 characters, but when we do a preview it shows 7 characters, and the load does not go through.
    Does anyone know where the problem is, i.e. why the preview screen shows 7 characters for the date when the flat file has 8?
    Thanks
    MK

    Hi Bhanu,
    How do I check whether a conversion is specified? And another thing: it is not just one field. We have 6 date fields, and all of them show 7 characters in the PSA after loading, whereas the flat file has 8 characters in all 6 date fields.
    In the PSA I checked the error messages; there are two:
    First error message:
    The value '1# ' from field /BIC/ZACTDAYS is not convertible into the DDIC data type QUAN of the InfoObject in data record 7. The field content could not be transferred into the communication structure format.
    System Response
    The data to be loaded has a data error, or field /BIC/ZACTDAYS in the transfer structure is not mapped to a suitable InfoObject. The conversion of the transfer structure to the communication structure was terminated. The processing of data records with errors was continued in accordance with the settings in error handling for the InfoPackage (tab page: Update Parameters).
    Check the data conformity of your data source in the source system. On the 'Assignment IOBJ - Field' tab page in the transfer rules, check the InfoObject-field assignment for transfer-structure field /BIC/ZACTDAYS. If the data was temporarily saved in the PSA, you can also check the received data for consistency in the PSA. If the data is available and correct in the source system and in the PSA, you can activate debugging in the transfer rules by using update simulation on the Detail tab page in the monitor. This enables you to perform an error analysis for the transfer rules. You need experience of ABAP to do this.
    Second error message:
    Diagnosis
    The transfer program attempted to execute an invalid arithmetic operation.
    System Response
    Processing was terminated and flagged as incorrect. You can carry out the update again at any time.
    Procedure
    1. Check that the data is accurate. In particular, check that the data types and lengths of the transferred data agree with the definitions of the InfoSource, and that fixed values and routines are defined in the transfer rules. The error may have arisen because you didn't specify the existing headers.
    2. If necessary, correct the data error and start processing again.
    Thanks
    MK

  • Error in ODS data load

    Hi,
    I am trying to load data into an ODS from a CSV file, but one of the fields is not getting updated,
    although that field has data in the PSA. Why is this happening?

    Hi,
    Was the request loaded into the ODS without errors? Check the mapping for the field in the update rules if you are on 3.x, or else check the transformations.
    Hope it helps.
    Veerendra.

  • Deactivate delta update before deleting the request from ODS

    Hi all,
    Happy New Year wishes to all.
    I have a scenario where I feed data to an ODS from a flat file, and from the ODS to a cube. Here the ODS is a data mart: flat file to ODS is a full update, and ODS to cube is a first initialization followed by deltas.
    I deleted the latest request from the ODS, at which point I got a message dialog box saying:
    "Request 98888 already retrieved by data target BWVCLNT"
    "Delta update in data target BWVCLNT must be deactivated before deleting the request"
    "Do you want to deactivate the delta update in data target BWVCLNT?"
    So I pressed Execute changes. Afterwards I loaded the data into the ODS, and when I tried to load the data into the cube, as soon as I pressed the Execute button it displayed a message dialog box like:
    "Delta request REQXXXXXXX is incorrect in the monitor"
    "A repeat needs to be requested"
    "Data target still includes delta request REQXXXXX"
    "If the repeat is now loaded, then after the load"
    "there could be duplicate data in the data targets"
    "Request the repeat?"
    After executing the changes, it shows only zero records.
    What does that mean, and why is it displaying zero records instead of the delta records?
    If anybody has faced this type of problem, please let me know. This is somewhat urgent for me.
    Thanks and Regards
    Satish Chowdary

    <i>Hi Satish,
    First switch off the ODS (automatic update of data target), though this is not really necessary. Delete the request from the InfoCube, then delete it from the ODS. Next, delete the init request from the datamart InfoPackage, then do an init without data transfer into the InfoCube. Later, load data into the ODS and then run a delta update to the InfoCube.
    I hope this helps.
    Thanks,
    Krish
    Don't forget to assign points.</i>
    Dude, you just repeated my answer I gave a few hours ago. Are you hungry for points?

  • Tracking history while loading from flat file

    Dear Experts
    I have a scenario in which I have to load data from a flat file at regular intervals and also want to track the changes made to the data.
    The data looks like this, and it keeps changing; the key is P1, A1, C1:
    DAY 1 - P1,A1,C1,100,120,100
    DAY 2 - P1,A1,C1,125,123,190
    DAY 3 - P1,A1,C1,134,111,135
    DAY 4 - P1,A1,C1,888,234,129
    I am planning to load the data into an ODS and then into an InfoCube for reporting. What will the result be, and how do I track history, e.g. if I want to see both what the data was on Day 1 and the current data for P1,A1,C1?
    Just to mention: as I am loading from a flat file, I am not mapping RECORDMODE in the ODS from the flat file.
    Thanks and regards
    Neel
    Message was edited by:
            Neel Kamal

    Hi
    You don't mention your BI release level, so I will assume you are on the current release, SAP NetWeaver BI 2004s.
    Consider loading to a write-optimized DataStore object to store the data. That way, you automatically have a unique technical key for each record, and it is kept for historical purposes. In addition, load to a standard DataStore object, which will track the changes in its change log. Then load the cube from the change log (this avoids your summarization concern), as the changes will be updates (after images) in the standard DataStore object.
    Thanks for any points you choose to assign.
    Best Regards -
    Ron Silberstein
    SAP
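    The change-log behaviour Ron describes (a standard DataStore object recording a before image and an after image per key on activation) can be mimicked in a toy sketch to see why downstream targets stay consistent. This is illustrative Python, not SAP code, and the record-mode flags ('X' for before image, '' for after image) are a simplification:

    ```python
    def activate(active, change_log, key, new_values):
        """Mimic DataStore activation: when a key already exists, log a
        before image of the old values, then log the after image and
        update the active table."""
        old = active.get(key)
        if old is not None:
            change_log.append((key, "X", old))    # before image
        change_log.append((key, "", new_values))  # after image
        active[key] = new_values
    ```

    Replaying Neel's four daily loads for key (P1, A1, C1) through this function leaves the current values in the active table while the change log keeps every historical state, which is exactly what loading the cube from the change log relies on.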

  • System exec - Not finding a file

    Hi,
    I am using the System Exec VI to call a batch file that prints a .txt or .prn file, depending on the request of the user. I was having some issues with it not being able to find the batch file; I have solved that, but now I get this error from the standard output string:
    Error: could not open print_test.txt for reading
    print_test.txt is the file it is trying to print. I am not sure what is causing this.
    any ideas would be appreciated
    Thank you

    Hi Adam,
    Thanks for the post and I hope your well today.
    There are maximum limits, imposed by Windows, on how long an ASCII file name and its full path can be:
    MAX_FILE_NAME is 256
    MAX_PATH is 260
    I believe this is correct; however, there does seem to be some confusion. You can make up your own mind from various resources:
    Forum
    Max length of VI file paths and file names
    http://forums.ni.com/ni/board/message?board.id=170&message.id=28207&requireLogin=False
    http://www.british-genealogy.com/forums/archive/index.php/t-8252.html
    MSDN
    Naming a File or Directory
    http://msdn.microsoft.com/en-us/library/aa365247.aspx
    Hope this helps a little,
    Kind Regards
    James Hillman
    Applications Engineer 2008 to 2009 National Instruments UK & Ireland
    Loughborough University UK - 2006 to 2011
    Remember Kudos those who help!

  • How to find ODS creation date?

    Hi All, I'm struggling to find the creation date of a particular ODS in our BW system. Can anyone advise?
    Thanks.

    Hi,
    Go to the edit mode of the ODS. In the menu, choose Extras -> Information (log/status) and select Mass Activator. There you can see when the ODS was activated for the first time.
    Or: in SE11, look at table RSDODSO, field TIMESTMP.
    Assign points if useful.
    -Shreya
