Open Hub FULL LOAD file split.

Hi,
Can I split a large file generated by Open Hub Services?
I have created an InfoSpoke for one of my existing InfoCubes (which uses a full load). The data file extracted to the application server is about 400 MB. Is there any procedure to split the file into parts?
I ask because I have a custom ABAP program that picks up the file from the application server and puts it on a user drive, and because of the file size it cannot move the file from the application server to the presentation server: the program runs out of internal table memory.
Any suggestion is appreciated.
Regards,
Kironmoy.

Thanks for the post.
But in my case the business will run the ABAP program and download the file.
So I want to make some configuration in OHS so that it dumps the data packet-wise; maybe I can use the selection screen that is available.
Any suggestion would be appreciated.

Similar Messages

  • Open Hub export of file without meta file?

    Hi Gurus,
    I am daily exporting delta files with Open Hub Services to server.
    The name of the files include dates, so every day a new file is created and also a new meta S_ file is created.
    How can I export just the data file with Open Hub, without the metadata file?
    Otherwise every month there will be about 30 meta files that are of no use to me.
    Regards, Uros

    Hi gurus,
    I have the same problem. I don't need the S_ file. How can I block the extraction of this file?
    Thanks in advance & Best Regards,
    Domenico.

  • Open Hub - Change from File to Database table

    Hello Experts;
    I want to change the Open Hub Service that we created from File to Database table.
    We are using the delta update. However, after I change the Open Hub and activate the transformation and DTP, the table contains no values when I run it.
    If I load a new delta into my cube, the table only receives the data from that delta.
    Does anyone know how to load all the requests already in my cube the first time I run the delta DTP?
    Thanks and regards;
    Ricardo

    Create a new full DTP and execute it, so that you dump all of the current contents of the InfoCube into the table, and push/pull that table to the target prior to the next delta extraction.

  • Archiving object in open hub with logical file name

    Hello,
    I am trying to use an open hub with a logical file name.
    By the SAP help looks like you have to use/define a archiving object:
    http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/8d/3e4ec2462a11d189000000e8323d3a/frameset.htm
    Steps 1 and 2 of "Defining Logical Path and File Names" are done and tested:
           1.      Definition of the logical path name
           2.      Definition of the logical file name
           3.      Assignment of the logical file name to the archiving object
    But it is not clear to me whether an existing archiving object is supposed to be used (and if so, which one), or whether I have to create a new archiving object, and with which characteristics.
    Any help welcome,
    Alberto.

    Alberto,
    Can you explain what you are trying to do? Are you trying to archive data from a BI object, or are you trying to export data out of a BI object using open hub?

  • Open Hub Destination - csv files

    Hi all,
    I need to send information through an Open Hub Destination from a cube to a file on a server. But the Open Hub Destination generates two CSV files instead of one: one file contains just the names of the fields that will be in the columns, and the other file contains only the data.
    Is there a way to have the Open Hub Destination generate only one file, with the headers above the columns?
    Can you explain how to do it?
    Are there other options to generate CSV files from cubes automatically and send them to other applications?
    Thanks a lot.
    Regards.
    Elisabete Duarte

    Hi,
    When you send information from a cube to the application server through OHS, two files are created: one is saved on the application server, and for the other you give the path. To download the file to the application server, select the application server option on the Destination tab and activate the transformation.
    Hope it helps.
    Cheers
    Raj
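
    If a single file with a header row is really needed, one option is a small post-processing step on the application server that writes the S_ header file followed by the data file into a new file. This is only a sketch; all three file names below are assumptions for illustration, not values from the thread.

    ```abap
    * Sketch: merge the OHS header file (S_...) and the data file
    * into one CSV on the application server.
    * The three file paths are illustrative assumptions.
    CONSTANTS: gc_header TYPE string VALUE '/interfaces/out/S_SALES.CSV',
               gc_data   TYPE string VALUE '/interfaces/out/SALES.CSV',
               gc_merged TYPE string VALUE '/interfaces/out/SALES_FULL.CSV'.

    DATA lv_line TYPE string.

    OPEN DATASET gc_merged FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

    OPEN DATASET gc_header FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET gc_header INTO lv_line.
      IF sy-subrc <> 0. EXIT. ENDIF.
      TRANSFER lv_line TO gc_merged.          "header row(s) first
    ENDDO.
    CLOSE DATASET gc_header.

    OPEN DATASET gc_data FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET gc_data INTO lv_line.
      IF sy-subrc <> 0. EXIT. ENDIF.
      TRANSFER lv_line TO gc_merged.          "then the data rows
    ENDDO.
    CLOSE DATASET gc_data.

    CLOSE DATASET gc_merged.
    ```

    Such a program could run as the last step of the process chain that fills the open hub, so the consuming application only ever sees the merged file.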

  • Open Batch Multi Load file type error

    Hi,
    I have been trying to use Open Batch Multiload via the FDM Workbench, but I encounter an error at the Import stage saying the .csv file type is unknown.
    A couple of things I have tried:
    - Using Multiload via the FDM web interface to load the .csv file: success
    - Using Open Batch Multiload via the Workbench to load the .csv file: failed
    - Renaming the .csv file to .txt (without changing the content) and using Open Batch Multiload via the Workbench to load the .txt file: success
    It seems that when I execute Open Batch Multiload, FDM can read the CSV format but refuses to process the .csv file type.
    Am I missing something when using Open Batch Multiload to load a .csv file?
    Thanks,
    Erico
    *[File Content]*
    LOC1
    Budget
    1/31/2008
    2
    R,M
    Center,Description,ACCouNT,UD1,UD3,UD4,DV,v,v,v,UD2
    Sunnyvale,Description,Sales,GolfBalls,,,Periodic,1001,2001,3000,Customer2
    Sunnyvale,Description,Purchases,GolfBalls,,,Periodic,2001,3001,4000,Customer2
    Sunnyvale,Description,OtherCosts,GolfBalls,,,Periodic,4001,5001,5000,Customer2
    *[Error Message Received]*
    Invalid object Node=ML40942.3981712963_P1?MULTILOADFILE.,CSV_UNKNOWN FILE TYPE IN MULTILOAD BATCH FOR FILE [MULTILOADFILE.CSV]! - 35610
    *[FDM Version]*
    FDM 11.1.2.1

    Kemp2 wrote:
    Hi Erico,
    What is the fix for this issue? I am having same issue.
    Thx
    Kemp
    Hi Kemp,
    I didn't find a fix for this issue. Since we decided not to use Open Batch Multiload (not because of this issue), I stopped researching it.
    But here are some workarounds you might want to try:
    - simply deliver the source file as a .txt file, or
    - since open batch uses a script, change the file type from .csv to .txt via the script before executing the Multiload script.
    Hope this helps.
    -Regards

  • When QT is open, it cannot load file

    Whenever I open a QT file, it opens the first time (when I double-click the file), but to open another file afterwards I need to quit QT and double-click the other file; otherwise it does not respond.
    Any suggestions?
    Thanks!

    Here is more info to describe the issue.
    These are the movies sent to the Apple engineers to show how our workflow worked for seven years until Safari 4 came out.
    Again, this has forced us back to 10.5.8, though we enjoyed some of Snow Leopard's improvements.
    Have your sound on when listening to the Safari 4 clip so you can hear the overlaps.
    http://jenniradio.com/movies/safari3.m4v
    http://jenniradio.com/movies/safari4.m4v
    Would really like this fixed.
    Ian

  • How to load delta, changing the name of file for every day with open hub?

    Hi gurus,
    I load daily delta data from ODS to flat file, using open hub.
    The file is located on the server. Now, when I do a delta load from the ODS to the flat file, the file is overwritten every day because it has the same name.
    I want to load the delta data to a different file name every day, for example BW_20060101, BW_20060102, BW_20060103, ...
    How do I do that?
    Best regards,
    Uros

    Hi Uros,
    This thread may give you some idea
    Changing the file name n Flat file Extraction
    https://websmp107.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700002201112003E
    Regards,
    BVC
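
    The usual approach is to derive the file name from the system date, so each day's delta lands in a new file. A minimal sketch of the naming logic follows; the `BW_` prefix matches the example in the question, but where exactly to plug this in (for example, a filename routine in the open hub definition) depends on your BW release, so treat it as an assumption:

    ```abap
    * Sketch: build a date-stamped file name such as BW_20060101.csv.
    DATA lv_filename TYPE string.

    CONCATENATE 'BW_' sy-datum '.csv' INTO lv_filename.
    "sy-datum has the form YYYYMMDD, so on 2006-01-01
    "this yields 'BW_20060101.csv'
    ```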

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an open hub destination that save the result as a file in the application server.
    The issue: in the production environment we have two servers. XYZ is the database server and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save on XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    In transaction AL11 I can see the file in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content, and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are in SAP BW 7.31 support package 6.
    Any idea what could be the issue or where to look better?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter which server (XYZ or A01) I log on to or start a process chain from, the DTP job always runs on the A01 server, which causes an error since the directory doesn't exist on server XYZ.
    This occurs because the DTP setting for the background job was left blank. I followed these steps to solve the problem:
    - open the DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run", pick the desired server
    - save
    After that, no matter where I start the open hub extraction from, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • Open Hub Extract in .txt file

    Hi Expert,
    An abc.txt file is populated as the target of an open hub, through a process chain. The DTP monitor screen shows that the output data volume is around 2.8 million records every day, but I cannot find more than 1,250,000 records in the file. Whatever the record count in the DTP monitor, the .txt file always shows 1,250,000. Is there any limitation on the number of rows in a .txt file?
    Please help.
    Regards,
    Snehasish

    Thanks for the reply, Geetanjali.
    Yes, there is one field routine, but the DTP monitor screen shows a total record count of 2,608,000, while the .txt file shows 1,250,000 regardless of the DTP monitor record count. Moreover, I am only populating 3 columns in my target .txt file.

  • Deleting DTP based on Open Hub and Remote Cube

    Hi All,
    I have created a delta DTP for my Open Hub to load a flat file from a remote cube.
    I ran some test load requests, but it didn't work (no data in my .csv file). I then created another full DTP with other parameters, and it worked successfully.
    Now I want to delete my delta DTP, but the system returns a message like this:
    DTP_4IDT4BXV29E1N0MWINKCWO60B cannot be deleted at the moment (see long text)
    and in the long text i have:
    Message no. RSBK037
    Diagnosis
    You want to delete a delta DTP that has been successfully used to load requests from the source into the target. If you delete the DTP, you will also delete the information on the source data that was successfully transferred. As a result the source data would be transferred again, if you create a new delta DTP that links the same source with the same target.
    System Response
    The system terminated the delete operation.
    Procedure
    You have two options:
    1. Delete all the requests loaded with this DTP from the target and then delete the DTP.
    2. Do not delete the DTP and continue to load deltas using this DTP.
    I checked the requests loaded with the delta DTP and deleted one, but there are other requests that I cannot delete.
    I have deleted the delta DTP from my transport request.
    What should I do to delete my delta DTP for good?
    thanks for your help
    Bilal

    Do not delete entries out of table RSBKREQUEST.
    To delete your DTP, you may use program RSBKDTPDELETE.  Give the technical id of the DTP as the input parameter.

  • Open Hub Destination

    Hi All,
    I have created an Open Hub Destination with a file as the data target. The generated file has volume as one of its fields, with values like 00015 (the data length of volume is 5, which is why I get 00015). I don't want the leading 000 before the actual value 15. Is there any way to change the length of the InfoObject in the open hub destination, or to strip the leading zeros in the transformation?
    Thanks In Advance
    Bobby

    Hi,
    While mapping the field in the transformation, select rule type Routine, then write code along these lines:
    DATA v_output TYPE string.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
      EXPORTING
        input  = SOURCE_FIELDS-volume
      IMPORTING
        output = v_output.
    RESULT = v_output.
    Then activate the transformation.
    It should work: CONVERSION_EXIT_ALPHA_OUTPUT removes the leading zeros, so 00015 becomes 15.
    Regards,
    Satya

  • Open hub for PSA to Database Table...

    Hi
    My requirement is to use Open Hub to load data from a PSA (DataSource) into a BI table.
    The question is: can I make the DataSource the source of the transformation for the Open Hub?
    When I try, I get the error "Cannot connect DataSource to Open Hub Destination".
    If we can't use a DataSource as the source of the transformation in an Open Hub, could you please suggest another way to fulfil this requirement? I don't want to create an InfoProvider from my DataSource just to serve as the source of my Open Hub.
    Thanks

    Hi Harpal,
    You can have data source as source of your transformation for Open Hub Destination.
    Just check if your data source is active in the system or not.
    Regards,
    Durgesh.

  • Open Hub, Delta Transformation

    Hi All,
    Thanks in advance for your help. I am extracting data from a DSO to a flat file using an open hub and a delta DTP. The problem is that if there is no delta (i.e. zero new/changed records) since the last time the open hub ran, my file is not cleared.
    For example, if I run the DTP in the morning and receive 100 records, the open hub creates a file in the UNIX directory with 100 records. However, if I run the DTP again in the evening and there are no new records, the file is not cleared and still contains the 100 records. This causes a problem down the line because the data is sent twice.
    Is there any way for the DTP to output blank files? Or can the file be cleared out in a different step in the process chain?
    your help is very appreciated.
    Thanks

    Hi,
    This note should correct your problem: 1111259
    Summary
    Symptom
    You extract data using a Delta data transfer process (DTP) to an open hub destination of type 'File on application server'. If the source does not contain any new data, the existing file is not deleted.
    Reason and Prerequisites
    This problem is caused by a program error.
    Solution
    SAP NetWeaver 7.0 BI
               Import Support Package 17 for SAP NetWeaver 7.0 BI (BI Patch 17 or SAPKW70017) into your BI system. The Support Package is available when Note 1106569 "SAPBINews BI7.0 Support Package 17", which describes this Support Package in more detail, is released for customers.
    In urgent cases, you can implement the correction instructions as an advance correction.
    Hope this helps.
    Regards,
    Diego

  • Scheduling Open Hub

    We want to automatically push some flat files out, perhaps using Open Hub.  However, it seems that Open Hub in flat-file mode cannot be scheduled as a background process; only the mode that generates a database table seems to be schedulable.
    Is there a way to automate generating a flat file? Or should we look at somehow calling RSCRM_BAPI?
    Thanks for any advice.

    Jerry,
    Extraction of data to a flat file via open hub (on the BW server) can be scheduled (use a process chain), and then a simple ABAP program can be run to download the files to your local PC (open dataset, read into an internal table, close dataset, and call function GUI_DOWNLOAD).
    Regards,
    Marco
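
    Marco's flow can be sketched roughly as below. The file paths and the 10,000-line block size are illustrative assumptions, not values from the thread. Flushing in blocks with GUI_DOWNLOAD's APPEND option also addresses the 400 MB problem from the original question, because the internal table never holds the whole file.

    ```abap
    * Sketch only: copy an application-server file to the presentation
    * server in blocks, so the internal table stays small.
    REPORT zbw_ohs_download.

    CONSTANTS: gc_srcfile   TYPE string VALUE '/usr/sap/trans/data/ohs_extract.csv',
               gc_dstfile   TYPE string VALUE 'C:\temp\ohs_extract.csv',
               gc_blocksize TYPE i VALUE 10000.       "lines per block

    DATA: lt_lines TYPE STANDARD TABLE OF string,
          lv_line  TYPE string,
          lv_first TYPE abap_bool VALUE abap_true.

    OPEN DATASET gc_srcfile FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'Could not open source file' TYPE 'E'.
    ENDIF.

    DO.
      READ DATASET gc_srcfile INTO lv_line.
      IF sy-subrc <> 0.                               "end of file
        EXIT.
      ENDIF.
      APPEND lv_line TO lt_lines.
      IF lines( lt_lines ) >= gc_blocksize.
        PERFORM flush_block USING lv_first CHANGING lt_lines.
        lv_first = abap_false.
      ENDIF.
    ENDDO.

    IF lt_lines IS NOT INITIAL.                       "remaining lines
      PERFORM flush_block USING lv_first CHANGING lt_lines.
    ENDIF.

    CLOSE DATASET gc_srcfile.

    FORM flush_block USING    p_first  TYPE abap_bool
                     CHANGING pt_lines TYPE STANDARD TABLE.
      DATA lv_append TYPE abap_bool.
      lv_append = boolc( p_first = abap_false ).      "append after 1st block
      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename = gc_dstfile
          append   = lv_append
        TABLES
          data_tab = pt_lines
        EXCEPTIONS
          OTHERS   = 1.
      IF sy-subrc <> 0.
        MESSAGE 'Download failed' TYPE 'E'.
      ENDIF.
      CLEAR pt_lines.
    ENDFORM.
    ```

    Note that GUI_DOWNLOAD needs a SAP GUI session, so this part cannot run in a pure background job; the process chain can produce the server file in background, and a dialog user (or a frontend scheduler) runs the download step.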
