Open Hub export of file without meta file?

Hi Gurus,
I export delta files daily to the server with Open Hub Services.
The file names include dates, so every day a new data file is created and also a new S_ meta file.
How can I export just the data file with Open Hub, without the metadata file?
Otherwise every month there will be 30 meta files which are of no use to me.
Regards, Uros

Hi gurus,
I have the same problem. I don't need the S_ file. How can I block the extraction of this file?
Thanks in advance & Best Regards,
Domenico.
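
No standard switch for suppressing the S_ file is mentioned in this thread. As a hedged workaround sketch only (the path and naming pattern are placeholders, not from the thread), the S_ meta file could be deleted in a follow-on ABAP step of the process chain, for example:

REPORT z_drop_ohd_meta_file.
* Hedged sketch: remove the S_ meta file that the open hub writes
* alongside the dated data file. The path and naming pattern below
* are placeholders - adapt them to your own destination settings.
DATA lv_metafile TYPE string.

CONCATENATE '/interface/out/S_BW_DELTA_' sy-datum '.CSV' INTO lv_metafile.

DELETE DATASET lv_metafile.
IF sy-subrc <> 0.
  WRITE: / 'Meta file not found or not deleted:', lv_metafile.
ENDIF.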

Similar Messages

  • Open Hub export to ASCII file

    Is it possible to export infoprovider data through open hub to an ASCII formatted file? How would we do that in BI 7.0?

    Hi,
    You can use transaction RSANWB to go to the APD (Analysis Process Designer), where you can export data from any InfoProvider to a flat file.
    Check this link on APD:
    http://help.sap.com/saphelp_nw04/helpdata/en/49/7e960481916448b20134d471d36a6b/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/96939c07-0901-0010-bf94-ac8b347dd541
    Rgds
    SVU123

  • Archiving object in open hub with logical file name

    Hello,
    I am trying to use an open hub with a logical file name.
    According to the SAP help, it looks like you have to use/define an archiving object:
    http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/8d/3e4ec2462a11d189000000e8323d3a/frameset.htm
    Steps 1 and 2 of "Defining Logical Path and File Names" are done and tested:
           1.      Definition of the logical path name
           2.      Definition of the logical file name
           3.      Assignment of the logical file name for the archiving object
    But it is not clear to me whether an existing archiving object is supposed to be used (and if so, which one), or whether you have to create a new archiving object (and with which characteristics).
    Any help welcome,
    Alberto.

    Alberto,
    Can you explain what you are trying to do? Are you trying to archive data from a BI object, or are you trying to export data out of a BI object using open hub?

  • Open Hub - Change from File to Database table

    Hello Experts;
    I want to change the Open Hub Service that we created from File to Database table.
    We are using delta update. However, after I change the Open Hub and activate the transformation and DTP, when I run it I don't get any values in the table.
    If I load a new delta into my cube, the table only contains the data from that delta.
    Does anyone know how to load all the requests that are already in my cube the first time I run the delta DTP?
    Thanks and regards;
    Ricardo

    Create a new Full DTP and execute, so that you can dump all of the current contents of the InfoCube into the table and push/pull that table to the target prior to the next delta retraction.
    Edited by: Dennis Scoville on Nov 17, 2009 9:37 AM

  • Open Hub Destination - csv files

    Hi all,
    I need to send information from a cube to a file on a server through an Open Hub Destination. But the Open Hub Destination is generating 2 CSV files instead of 1: one file contains just the names of the fields that will be in the columns, and the other file contains only the data.
    Is there an alternative that generates only one file through the Open Hub Destination, with the headers above the columns?
    Can you explain to me how to do it?
    Are there other options to generate CSV files automatically from cubes and send them to other applications?
    Thanks a lot.
    Regards.
    Elisabete Duarte

    Hi,
    When you send information from a cube to the application server through OHS, two files are created: one is saved on the application server, and for the other you should give the path. For downloading the file to the application server, select the application server option in the Destination tab and activate it.
    Hope it helps
    Cheers
    Raj
    Message was edited by:
            Raj
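
    A minimal post-processing sketch (not from this thread; paths and file names are placeholders, and it assumes, per the description above, that the S_ file holds the column names and the second file the data rows) that concatenates the two files into one CSV with a header row on the application server:

    REPORT z_merge_ohd_files.
    * Hedged sketch: merge the S_ header file and the data file produced
    * by the open hub into one CSV. All paths below are placeholders.
    DATA: lv_line   TYPE string,
          lv_header TYPE string VALUE '/interface/out/S_SALES.CSV',
          lv_data   TYPE string VALUE '/interface/out/SALES.CSV',
          lv_target TYPE string VALUE '/interface/out/SALES_FULL.CSV'.

    OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

    * Copy the column names from the structure (S_) file first
    OPEN DATASET lv_header FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET lv_header INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      TRANSFER lv_line TO lv_target.
    ENDDO.
    CLOSE DATASET lv_header.

    * Then append the data rows
    OPEN DATASET lv_data FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET lv_data INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      TRANSFER lv_line TO lv_target.
    ENDDO.
    CLOSE DATASET lv_data.

    CLOSE DATASET lv_target.

    Such a report could run as the last step of the process chain, after the DTP that fills the open hub files.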

  • Open Hub FULL LOAD file split.

    Hi,
    Can I split a large file generated by Open Hub Services?
    I have created an InfoSpoke for one of my existing InfoCubes (which uses a full load). The data file extracted to the application server is about 400 MB in size. Is there any procedure in place to split the file into parts?
    I have a custom ABAP program which picks up the file from the application server and puts it on the user's drive, but because of the file size it cannot transfer the file from the application server to the presentation server; the program runs out of internal table memory.
    Any suggestions, please.
    Regards,
    Kironmoy.

    Thanks for the post.
    But in my case the business will run the ABAP program and download the file.
    So I want to make some configuration in OHS so that it dumps the data packet-wise; maybe I can use the selection screen that is available.
    Any suggestion would be appreciated.
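
    No packet-wise setting is described in this thread. As a hedged workaround sketch only (paths, names and the chunk size are placeholders), the extract could be split into smaller part files directly on the application server before the download program picks them up:

    REPORT z_split_ohd_file.
    * Hedged sketch: split a large open hub extract on the application
    * server into part files of LV_MAX lines each. Paths are placeholders.
    DATA: lv_source TYPE string VALUE '/interface/out/SALES.CSV',
          lv_target TYPE string,
          lv_line   TYPE string,
          lv_max    TYPE i VALUE 100000,
          lv_count  TYPE i,
          lv_part   TYPE n LENGTH 3 VALUE '001'.

    OPEN DATASET lv_source FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    CONCATENATE '/interface/out/SALES_PART' lv_part '.CSV' INTO lv_target.
    OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

    DO.
      READ DATASET lv_source INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      IF lv_count >= lv_max.          " start the next part file
        CLOSE DATASET lv_target.
        lv_part  = lv_part + 1.
        lv_count = 0.
        CONCATENATE '/interface/out/SALES_PART' lv_part '.CSV' INTO lv_target.
        OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
      ENDIF.
      TRANSFER lv_line TO lv_target.
      lv_count = lv_count + 1.
    ENDDO.

    CLOSE DATASET lv_target.
    CLOSE DATASET lv_source.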

  • Separator problem after opening a exported CSV file in Excel

    Dear experts,
    After exporting a report to a .CSV file from the BEx browser, I opened this file with Excel. I found that "," is treated as the separator, not ";", which is the default separator in the exported CSV file.
    For example, when i open the CSV in Notepad, i have:
    "15.07.2008";"E001";"No value"
    "15.07.2008";"E001";"No value, date, time"
    When it is opened in Excel, I have
    "15.07.2008";"E001";"No value"(Column A)
    "15.07.2008";"E001";"No value(Column A) date(Column B)    time"(Column C)
    Is this a problem with the Excel configuration? How can it be solved (for example, by setting ";" as the separator)?
    Thanks in advance!
    Edited by: Xiaoxuan WU on Jan 23, 2008 4:00 PM

    Hi,
    Please try opening your .csv file in Excel. I am assuming that all texts are in one column. Select this column and go to Data > Text to Columns. Set the option to Delimited and click Next. Set the delimiter to ";" and the text qualifier to ". Click Next, then Finish. Then save under another file name. Try opening this new .csv file in Notepad and see if it has the format you need.
    Hope this helps,
    Juice

  • Open hub exporting hierarchy

    Hi
    Is there a way to export hierarchies from inside BW using open hub? We need to send hierarchies out to BO Data Services using open hub.
    We use BW 7.0 SP 22.
    Kind regards
    Erik

    Check if this helps:
    [http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0403a990-0201-0010-38b3-e1fc442848cb?quicklink=index&overridelayout=true]
    Regards,
    Sujia

  • Problem opening exported Excel file

    Hi
    We recently experienced a problem that has never occurred before. When trying to open an Excel file exported from a query with hierarchical structures (using the link <SAP_BW_URL DATA_PROVIDER="DP" CMD="EXPORT" FORMAT="XLS"> on the 0ANALYZER template), we are asked to log on to the BEx web, as if it were a Web application. If the Excel file is exported from a hierarchy-free query, it opens directly without the logon prompt. According to the user, he used to be able to open downloaded hierarchy-expanded data without going through the logon process.
    Has anyone experienced the same problem before? Can you share your solution to help us? Thank you in advance!

    If I understand your issue correctly we used a different fix. Very easy to do.
    First, what I believe is happening is that your hierarchy icons are being exported as references with absolute URLs (the images themselves cannot be exported). This means that when Excel opens the file it tries to resolve each URL, and to do that it requires a login.
    You can verify that absolute URLs are being exported by opening your Excel export in a text editor (it is actually XML). Look for URLs that start with "http://.../MIME/BEx/Analyser/xxxx.gif"; those are the icons.
    This took us quite a while to figure out because Excel was not asking us to log in. It just timed out for each URL reference. Took forever.
    The fix we used was to add the following entry to this table on your BW system (LANG must be empty; CAPTION is optional):
    Table: RSZCONTROLTEXTS
    Entry:
      LANG      = "" (empty)
      TARGET_ID = "ID_FLAG_XLS_ABS_URL"
      CAPTION   = "Force Excel Export to use relative URLs in XML output for icons"
    You can experiment with the result before you commit by editing your XML file (the Excel export from the web) and removing from those icon URLs the "http:...." part up to Mime. This makes it a relative URL. The downside is that the icon cannot be found, so a box with an X is displayed instead. I believe there are techniques for importing icons into Excel. The references would look like this after surgery: <img src="Mime/BEx/Analyzer/hieric2.gif">
    If you implement the fix and don't like the effect just remove the entry from the table and all is back to before. I do not believe this is a transportable fix though.
    We discovered it when we were pretesting a bunch of service pack upgrades. See SAP note #650887
    Hope this helps
    Dave Schuh
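
    For reference, a hedged sketch of creating that entry programmatically (the field names follow the post above and should be verified in SE11; maintaining the row via SM30 in a development client is probably the safer route):

    REPORT z_add_xls_relative_url_flag.
    * Hedged sketch: insert the ID_FLAG_XLS_ABS_URL row described above
    * into RSZCONTROLTEXTS. Field names are taken from the post - check
    * them in SE11 first, and note that the LANG field must stay empty.
    DATA ls_entry TYPE rszcontroltexts.

    CLEAR ls_entry.                             " LANG remains initial
    ls_entry-target_id = 'ID_FLAG_XLS_ABS_URL'.
    ls_entry-caption   = 'Force Excel Export to use relative URLs in XML output for icons'.

    INSERT rszcontroltexts FROM ls_entry.
    IF sy-subrc = 0.
      COMMIT WORK.
    ENDIF.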

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an open hub destination that save the result as a file in the application server.
    The issue is: in the production environment we have two servers; XYZ is the database server, and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save to XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file there in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content, and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are in SAP BW 7.31 support package 6.
    Any idea what could be the issue or where to look better?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter which server (XYZ or A01) I log on to or start a process chain on, the DTP job always runs on server A01, and this causes an error since the directory doesn't exist on server XYZ.
    This occurs because the DTP setting for the background job was left blank. I followed these steps to solve the problem:
    - open DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run" I picked the desired server
    - save
    After that, no matter where I start the open hub extraction from, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • How to load delta, changing the name of file for every day with open hub?

    Hi gurus,
    I load daily delta data from an ODS to a flat file, using open hub.
    The file is located on the server. Now, if I do a delta load from the ODS to the flat file, the file is overwritten every day, because it has the same name.
    I want to load the delta data every day to a different file name, for example BW_20060101, BW_20060102, BW_20060103, ...
    How do I do that?
    Best regards,
    Uros

    Hi Uros,
    This thread may give you some ideas:
    Changing the file name in Flat File Extraction
    https://websmp107.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700002201112003E
    Regards,
    BVC
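
    One hedged sketch of such an approach (not from the linked thread; paths and the naming pattern are placeholders): keep the open hub writing to a fixed file name, and add an ABAP step to the process chain that copies the extract to a date-stamped name and removes the original:

    REPORT z_datestamp_ohd_file.
    * Hedged sketch: copy the fixed-name delta extract to a file named
    * with today's date (e.g. BW_20060101.CSV) on the application server.
    * Paths and the naming pattern are placeholders.
    DATA: lv_line   TYPE string,
          lv_source TYPE string VALUE '/interface/out/BW_DELTA.CSV',
          lv_target TYPE string.

    CONCATENATE '/interface/out/BW_' sy-datum '.CSV' INTO lv_target.

    OPEN DATASET lv_source FOR INPUT  IN TEXT MODE ENCODING DEFAULT.
    OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET lv_source INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      TRANSFER lv_line TO lv_target.
    ENDDO.
    CLOSE DATASET lv_target.
    CLOSE DATASET lv_source.

    DELETE DATASET lv_source.   " optional: drop the fixed-name file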

  • Open Hub Extract in .txt file

    Hi Expert,
    An abc.txt file is populated as the target of an open hub, through a process chain. The DTP monitor screen shows that the output data volume is around 2.8 million records every day, but I cannot find more than 1,250,000 records in the file. Whatever the record count in the DTP monitor, the .txt file always shows me 1,250,000 rows. Is there any limitation on the number of rows in a .txt file?
    Please help.
    Regards,
    Snehasish

    Thanks for the reply, Geetanjali.
    Yes, there is one field routine, but from the DTP monitor screen I can see that the total record count is 2,608,000, while when I check the .txt file it shows me 1,250,000, irrespective of the DTP monitor record count. Moreover, I am only populating 3 columns in my target txt file.

  • Trouble with Exported .pdf Files

    I created three 20"x30" one-page documents (with iPhoto-sourced images) which saved as 24.7 MB, 19.4 MB, and 10.7 MB files, respectively. I needed to send JPEG files to my service provider for high-resolution prints. Since Pages '08 doesn't have a JPEG-export option, I exported each of them as .pdf files. My plan was to open the .pdf files in Photoshop and save them down as .jpg files. The same path had worked fine with a fourth Pages '08 layout of a 2.25"x3.75" document. With the three 20"x30" files, however, I ran into trouble.
    My laptop didn't have enough memory to open the exported .pdf files in Photoshop Elements without suffering some image gray-out. I didn't think that was a problem until we discovered that my service provider couldn't open the files at all (running the current Photoshop on a souped-up system). Arghhh.
    I switched to a service provider who can print straight from the .pdf files, but that also brought trouble. Though I could open the files easily in Acrobat 5.0, and they looked great, the service provider (running the current full version of Acrobat) could not open them; all he got was what looked like a blank page.
    I don't think the trouble is related to file size, as the business-card document which sailed through without trouble saved as an 11.2 MB Pages file.
    Could it be sheet size? That's the only thing I can figure. And why would the files open in Acrobat 5.0 without trouble, but not in the current Acrobat (or Acrobat Professional, which we also tried)?
    I ended up "printing" the Pages files as "Save PDF to iPhoto," then exporting the iPhoto image as a maximum-resolution JPEG. The only trouble with that route was that the best I could get was a 200 dpi. When I went through Photoshop (instead of iPhoto) with the business card file, I ended up with a 300 dpi file.
    Am I missing something obvious here? Any help would be appreciated.

    Hi,
    I don't know if this is the correct answer, but you can open a PDF in Preview and save the document as a JPEG or JPEG-2000. I know that Leopard will also have some neat functionality around this as well. You might also want to look at printing to a PostScript file rather than a PDF and then opening that in Photoshop. I talked with a FedEx/Kinkos employee yesterday about my book, and he stated that Macs use PDF 1.5 while Adobe's Acrobat uses 1.7. Leopard should also fix this, but it may not help right now.
    HTH

  • ABAP routine in Open hub transformation

    Hello Experts,
    My BW consultant created an Open Hub service for creating a .csv file from a cube.
    For that, he had a transformation defined between the source and destination fields.
    I was asked to write a routine to bring the negative sign to the front for one of the fields.
    I wrote the following code.
    >    DATA V_OUT(25).
    >
    >   CLEAR V_OUT.
    >
    >    V_OUT = SOURCE_FIELDS-G_AVVWHC.
    >
    >      CALL FUNCTION 'CLOI_PUT_SIGN_IN_FRONT'
    >        CHANGING
    >          VALUE = V_OUT.
    >
    >      RESULT = V_OUT.
    there was no error in the syntax check.
    Now, when the open hub is run without the routine (direct assignment), it works fine.
    But when the above code runs, it TIMES OUT!!!!
    I am unable to find any reason for that.
    Could anybody please provide some guidance on this?
    Regards,
    DNP

    Hi Jerry,
    Thanks for your input again. I checked the dump in ST22, and it looks like nothing is related to the routine:
    >Information on where terminated
    >    Termination occurred in the ABAP program "CL_SQL_STATEMENT==============CP" -
    >     in "EXECUTE_QUERY".
    >    The main program was "RSAWBN_START ".
    >
    >    In the source code you have the termination point in line 19
    >    of the (Include) program "CL_SQL_STATEMENT==============CM004".
    Source code:
    >METHOD execute_query .
    >*
    >* Execute the given query.
    >*
    >
    >  DATA:
    >    stmt_ref TYPE REF TO data,
    >    sql_code TYPE i,
    >    sql_msg  TYPE dbsqlmsg,
    >    c        TYPE cursor.
    >
    >
    >* check if a statement was provided by the caller
    >  IF statement IS NOT SUPPLIED.
    >    RAISE EXCEPTION TYPE cx_parameter_invalid
    >          EXPORTING parameter = 'STATEMENT'.
    >  ENDIF.
    >
    >>  CALL 'C_DB_FUNCTION' ID 'FUNCTION'    FIELD 'DB_SQL'                     
    >>                       ID 'FCODE'       FIELD c_fcode_prepare_and_open
    >>                       ID 'CONNAME'     FIELD me->con_ref->con_name
    >>                       ID 'CONDA'       FIELD me->con_ref->con_da
    >>                       ID 'STMT_STR'    FIELD statement
    >>                       ID 'HOLD_CURSOR' FIELD hold_cursor
    >>                       ID 'INVALS'      FIELD me->parameters->param_tab
    >>                       ID 'CURSOR'      FIELD c
    >>                       ID 'SQLCODE'     FIELD sql_code
    >>                       ID 'SQLMSG'      FIELD sql_msg.
    > IF sy-subrc <> 0.
    >  31 *   call error handling macro
    >  32     m_handle_error sy-subrc sql_code sql_msg.
    >  33   ENDIF.
    >  34
    >  35 * reset the parameters in order to make the statement object reusable
    >  36 * for the next execution of an DML statement or query immediately;
    >  37 * but the cursor C itself must not be freed because it will still be
    >  38 * accessed by the created result set object; the allocated cursor
    Regards,
    DNP

  • Open Hub Destination

    Hi All,
    I have created an Open Hub Destination with a file as the data target. In the file which I have generated, volume is one of the fields, with the data 00015 (the data length of volume is 5, that's why I am getting 00015). I don't want to get 000 before the actual volume 15. Is there any possibility to change the length of the InfoObject in the open hub destination or in the transformation?
    Thanks In Advance
    Bobby

    Hi,
    While mapping in the transformation, select the rule type as routine, then write the code below:
    * Strip the leading zeros from the volume value before writing it to the file
    DATA: v_output TYPE string.   " or a CHAR field matching your target length
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
      EXPORTING
        input  = SOURCE_FIELDS-volume
      IMPORTING
        output = v_output.
    RESULT = v_output.
    Then activate the transformation.
    It should work.
    Regards,
    Satya
