Open hub space problem.

Hi all,
while I am executing a process chain that has an Open Hub destination as a UNIX file, I am getting an error like "Resource bottleneck:
The current program "CL_RSB_FILE_APPLSRV===========CP" had to be terminated because a capacity limit has been reached". What shall I do to fix this issue?
regards.
bhadra


Similar Messages

  • Open Hub transporting problems

    Hi everybody, we have Open Hub transporting problems: it just gives us return code 12 or 8. Basically, it doesn't want to transport the Open Hub Destination, nor the Transformations, nor the DTPs, and bear in mind that we're on the 15th patch level. We've looked at all the SAP Notes, but they refer to earlier patch levels...
    Another problem is that it's OK in the DEV system, but in the QUALITY system, when you launch a DTP it gives us a dump with ASSIGN_TYPE_ERROR.
    Any ideas??

    My destination path on the DEV box is simply the following (identified by BID in the following path):
          D:\usr\sap\BID\DVEBMGS00\work
    And when I locate the un-activated open hub on the QA box, the BID is already changed to BIQ, but the server name box/field is empty....
    I am going to try the object changeability option. By the way, is there any documentation on how to change these objects that are marked with object changeability on the QA box (provided that we cannot manually change anything on the QA box)?
    Anyways, I think my problem is with the server name conversion from "abc-bi-dev" to "abc-bi-qa"....!!!
    Also, please note the RSBFILE entry for this object in the QA box has the "btcsrvname" as blank...
    Do you think there is any additional setting that needs to be done on the QA box to do this translation (from "abc-bi-dev" to "abc-bi-qa")?
    The only two setups we have done as far as BI is concerned are at "Tools -> Conversion of logical system names", where you only put the conversion for the client name mapping from dev to qa and the local file system mapping from dev to qa.
    So, is there anything else to map the host name (from "abc-bi-dev" to "abc-bi-qa"), either from a BI perspective or a Basis/NetWeaver perspective?
    Thanks a lot...
    ~Sabuj.
    Edited by: Sabuj Haque on Feb 1, 2009 1:11 PM
    Edited by: Sabuj Haque on Feb 1, 2009 5:34 PM

  • Open hub destination problem - .csv file is empty

    Hi everyone,
    I am trying to extract data from our BW system using Open Hub Dest. to a .csv file
    The DTP runs successfully, i.e. it shows the total no. of records, but when I look in AL11 the records are not in the .csv file; that is, the .csv file is empty.
    How to figure out the problem?
    Best regards
    Ahmad

    @Jagadeesh:
    if I understood you correctly, you mean to say that first I should run the DTP and save to an Excel file to see whether it shows data or not. I think this is not possible: when we create an OHD, for 'Destination Type' we can choose among 'DB table', 'File' and 'third-party tool',
    so if I choose 'File' then the 'File Name' is a .csv file (same name as the OHD) and I can change it to a .xls file.
    @Arun:
    have permission to write in the folder
    Do you mean creating OH to a Database Table?
    in the DTP I have a filter on a characteristic; with 1500 records it works (data is in the .csv file), but with 3500 records the .csv file is empty
    no error message, no log
    Regards
    Ahmad

  • Open Hub Destination problem

    Hi
    I have created an Open Hub destination (OHD) on a cube and I am getting the file in CSV format on the application server.
    I have to move this file to BPC, but the problem is that when I run the DTP in the OHD it generates 2 files: one with the header and the other with the data.
    How can I make it generate one file with the header and then the data in it?
    How can I send this file automatically to the BPC server (BPC is the Microsoft version, not the NetWeaver one)?
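    One server-side option for the two-file question: in my experience the OHD DTP writes the header into a separate file (often with an S_ prefix), so a small ABAP report can append both into a single file on the application server. A sketch, with illustrative file names that are not from the thread:

    ```abap
    DATA: lv_header TYPE string VALUE '/usr/sap/BDV/DVEBMGS30/work/S_ZBPC_1000.CSV',
          lv_data   TYPE string VALUE '/usr/sap/BDV/DVEBMGS30/work/ZBPC_1000.CSV',
          lv_target TYPE string VALUE '/usr/sap/BDV/DVEBMGS30/work/ZBPC_1000_FULL.CSV',
          lv_line   TYPE string.

    OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

    " Copy the header file first, then the data file, into the target
    OPEN DATASET lv_header FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET lv_header INTO lv_line.
      IF sy-subrc <> 0. EXIT. ENDIF.
      TRANSFER lv_line TO lv_target.
    ENDDO.
    CLOSE DATASET lv_header.

    OPEN DATASET lv_data FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET lv_data INTO lv_line.
      IF sy-subrc <> 0. EXIT. ENDIF.
      TRANSFER lv_line TO lv_target.
    ENDDO.
    CLOSE DATASET lv_data.
    CLOSE DATASET lv_target.
    ```

    Such a report can be scheduled as the step after the DTP in the process chain, so the merged file is ready before it is picked up for BPC.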

    Hi Sadik
    BPC is very new in the market and there is not much help. Yes, it is based on MS SQL, but the thing is that all data loading is via CSV files only, not through direct tables and such.
    I have tried making a DB connect, but I am not able to make an RFC connection and there is no Basis support in my project.
    I have solved my problem, but there is a small sticking point: I am downloading data from an internal table into a file, but this code dumps the file onto my local machine. I want the file to be written directly to the application server.
    Following is the code:
    CALL FUNCTION 'MS_EXCEL_OLE_STANDARD_DAT'
      EXPORTING
        file_name                 = 'C:\Test'  "????????? [This is the main problem]
    *   file_name = 'DIR_HOME  E:\usr\sap\BDV\DVEBMGS30\work\ZBPC_1000' "application server location
        create_pivot              = 0
        data_sheet_name           = ' '
        pivot_sheet_name          = ' '
        password                  = ' '
        password_option           = 0
      TABLES
    *   pivot_field_tab           =
        data_tab                  = lt_data_f  "internal table with data
        fieldnames                = lt_header  "internal table with header
      EXCEPTIONS
        file_not_exist            = 1
        filename_expected         = 2
        communication_error       = 3
        ole_object_method_error   = 4
        ole_object_property_error = 5
        invalid_filename          = 6
        invalid_pivot_fields      = 7
        download_problem          = 8
        OTHERS                    = 9.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    Edited by: Navneet Thakur on Jul 2, 2009 2:53 PM
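    MS_EXCEL_OLE_STANDARD_DAT drives Excel via OLE on the frontend PC, so it can only ever write to the local machine. Writing directly to the application server needs OPEN DATASET / TRANSFER instead. A minimal sketch, assuming each row of lt_data_f can be flattened into one character line; the path, the semicolon delimiter, and the field names are illustrative, not from the thread:

    ```abap
    DATA: lv_file TYPE string VALUE '/usr/sap/BDV/DVEBMGS30/work/ZBPC_1000.CSV',
          lv_line TYPE string,
          ls_name TYPE string,              "assumes lt_header rows are character-like
          ls_data LIKE LINE OF lt_data_f.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'Could not open file on application server' TYPE 'E'.
    ENDIF.

    " Header line first: lt_header holds the column names
    LOOP AT lt_header INTO ls_name.
      IF lv_line IS INITIAL.
        lv_line = ls_name.
      ELSE.
        CONCATENATE lv_line ls_name INTO lv_line SEPARATED BY ';'.
      ENDIF.
    ENDLOOP.
    TRANSFER lv_line TO lv_file.

    " Then one line per data row (field names are illustrative)
    LOOP AT lt_data_f INTO ls_data.
      CONCATENATE ls_data-field1 ls_data-field2 INTO lv_line SEPARATED BY ';'.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.

    CLOSE DATASET lv_file.
    ```

    This also sidesteps the header/data split above: one program writes the header line and the data rows into the same file.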

  • Problem with open hub with application server

    Hello...
    I have a problem.
    I have BW 7.0 and I'm using Open Hub.
    When I configured the open hub in my development environment, in the destination tab I tried two options.
    Option 1.
    I use the application server, then I choose my server (the development server).
    But when I try to transport, I get the following error:
    Application server sapnwd01 does not exist (sapnwd01 is my development server)
    I know I can change the table RSBFILE, or write an ABAP program.
    The problem is, I need to pass the transport without errors to quality, because I need to transport to my production server.
    And on the production server I don't have permission to change tables.
    Option 2.
    Using a logical file name (this logical file name I use for other processes and it works correctly).
    But when I try to transport, I get the following error:
    Unpermitted name for a file.
    I hope you can help me!!
    Thanks in advance

    Hi Sharayu Kumatkar  
    Sorry for the delay, I have had problems with my access :$
    I checked AL11; there are a lot of directories, so I don't know which is the right one.
    My logical path is ZAPD_INVERSIONES
    My physical path is /netweaver/bw/fuentes/miig/altamira/
    My logical path definition is:
    Logical file ZAPD_INVERSIONES2
    Physical file ExtraccionInversiones.txt
    Data format ASC
    Applicat. area FI
    Logical path ZAPD_INVERSIONES
    In the definition of the variable:
    FILENAME Test.txt
    I get this error message while I'm trying to transport:
    Start of the after-import method RS_DEST_AFTER_IMPORT for object type(s) DEST (Activation mode)
    Unpermitted name for a file
    Thanks for helping me!!

  • Problem in transporting Open Hub destination

    Hi,
    I have an open hub destination. The destination is a database table. When I try to transport the open hub destination to staging, the transport fails saying "Unable to activate table /BIC/OHXXX". What could be the reason?
    Thanks,
    Satya

    Hi Satya,
    I have exactly the same problem. The reason the table could not be activated is that the referenced fields of the open hub table are wrong. Unfortunately, I do not know at the moment why this happens. It would be great if you could post the solution once you have solved the problem. I will keep you informed...
    Thanks,
    Thomas

  • Open hub problem

    Hi Gurus,
    please help me out with this problem.
    When we execute the DTP of the open hub, it creates an extract file in a
    server directory. Our problem is that the record length of the output record is
    always truncated to 110 characters. The record length of the open hub structure
    we have created is 151 characters (excluding delimiters).
    We are not using InfoSpokes as they are obsolete in 7.0.
    We are using NW BI 7.0, we are on Service Pack 20,
    and Excel 2000.
    Thank you

    Hi,
    Please let me know the length of the structure in the Open Hub including the delimiters.
    Btw,
    you can create a custom field in the open hub maintenance, specify its size to whatever you desire including the delimiters, and then assign all the source fields to it through a routine.
    Regards,
    Saikat
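    Saikat's suggestion can be sketched as a transformation field routine that fills one wide custom target field; the source field names below are illustrative, not from the thread, and the semicolon delimiter is an assumption:

    ```abap
    " Field routine for a single wide (e.g. CHAR 151) custom target field:
    " concatenate the character-like source fields with the delimiter.
    DATA lv_out TYPE c LENGTH 151.

    CONCATENATE source_fields-plant
                source_fields-stor_loc
                source_fields-material
           INTO lv_out SEPARATED BY ';'.

    result = lv_out.
    ```

    With a routine like this, the open hub writes a single field whose width you control, so the output line length is no longer tied to the structure that gets truncated at 110 characters.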

  • Problem with a Open Hub Extraction

    Hi Experts,
    I have a huge problem here.
    I'm trying to extract some data with an Open Hub application, but I'm having trouble with the application server.
    See, I have to extract the information from a DSO to a file on an application server, but the company has one application server for each environment (BD1, BQ1, etc.), the conversion is not working, and the transport of the request always fails.
    Does anyone know how to fix this?
    Thanks.
    Regards.
    Eduardo C. M. Gonçalves

    Hi,
    While creating the open hub, you need to maintain the file name details under transaction FILE.
    I hope you have maintained that.
    There you will also have to use a variable in the file path, which will change the file path in each system.
    There you will have to define:
    Logical File Path Definition
    Assignment of Physical Paths to Logical Path
    Logical File Name Definition, Cross-Client
    Once you have defined this and transported it, your Open Hub will go smoothly.
    Thanks
    Mayank
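    Mayank's three definitions can be sketched like this in transaction FILE. The reserved placeholders <SYSID>, <FILENAME> and <PARAM_1> are resolved at runtime, which is what makes the same definition valid in DEV, QA and PROD without manual changes; the ZOH_* names are illustrative:

    ```text
    Logical file path:   ZOH_EXTRACT_PATH
      Physical path:     /usr/sap/<SYSID>/work/<FILENAME>

    Logical file name:   ZOH_EXTRACT_FILE
      Physical file:     extract_<PARAM_1>.csv
      Data format:       ASC
      Logical path:      ZOH_EXTRACT_PATH
    ```

    The open hub destination then references the logical file name instead of a hard-coded server path.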

  • Problem connecting to SAP Open Hub

    Hi, I am trying to set up an SSIS job connecting to SAP Open Hub and, with support from the SAP guys, have been able to make some progress, but it has now stopped on an error message we're not able to solve. Any suggestions on what can be wrong and
    how to solve this? When I run the package I get the following error message:
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" starting.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
    Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
    Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
    Information: 0x3E8 at Data Flow Task, SAP BW Source: Process Start Process, variant has status Completed (instance DH88PUV2SZBIFKMIF48K3USME)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: Process Data Transfer Process, variant /CPMB/HMIJYDZ -> ZOH_VPL has status Ended with errors (instance DTPR_DH88PUV2SZCA46Y9QNO66A6W6)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: The component is stopping because the Request ID is "0".
    Error: 0x3E8 at Data Flow Task, SAP BW Source: No data was received.
    Error: 0xC0047062 at Data Flow Task, SAP BW Source [41]: System.Exception: No data was received.
       at Microsoft.SqlServer.Dts.SapBw.Components.SapBwSourceOHS.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
       at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
    Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on SAP BW Source returned error code 0x80131500.  The component returned a failure code when the pipeline engine called PrimeOutput().
    The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
    Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" wrote 0 rows.
    Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.
    Task failed: Data Flow Task
    Warning: 0x80019002 at Package3: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches
    the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" finished: Failure.
    The program '[6916] DtsDebugHost.exe: DTS' has exited with code 0 (0x0)
    Regards
    Paal

    Hi Paleri,
    According to the thread which has the same error message, the issue may be caused by incorrect RFC settings. Could you double-check your RFC connection configurations, such as the DNS settings?
    If that is not the case, please also make sure you have installed the correct version of the Microsoft Connector for SAP BW.
    Reference:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/e2fbafe5-d9df-490a-bfad-3d4b9784a8ea/sap-bi-connector-for-ssis-2008?forum=sqlintegrationservices
    Regards,
    Mike Yin
    TechNet Community Support

  • Open Hub not extracting all data in CUBE

    We created an FI-SL cube and use the Open Hub process to extract the current value of accounts. The problem is that not all the data is being extracted.
    I have a couple of examples where some of the data in the cube is not being extracted by the Open Hub process. I have checked the filter and it is set up properly. I have manually validated the data in the cube.
    It is very strange that the data is in the cube correctly but is just not being extracted.
    Any ideas?
    Thanks,
    Bill Lomeli

    Ashok,
    This is the first time I have seen a comment about the field limitation.
    A previous message from Deepak Simon states:
    "I want to extract data from a cube with more than 16 fields."
    We are using BI 7.0...is that restriction still in place for this version?
    Thanks,
    Bill

  • How to delete the Generated files from application server(open hub)?

    hi experts,
    when I try to execute the process chain, the DTP gives the dump below: Exception CX_RSBK_REQUEST_LOCKED logged.
    When I execute the DTP manually and try to delete the previous request, it gives the dump ITAB_DUPLICATE_KEY.
    So, to delete the generated files from the application server, how do I delete them for specific dates?
    Information on where terminated
    Termination occurred in the ABAP program "GPD6S3OE0BCVGC6L9DBNVYQARZM" - in
    "START_ROUTINE".
    The main program was "RSBATCH_EXECUTE_PROZESS ".
    In the source code you have the termination point in line 2874
    of the (Include) program "GPD6S3OE0BCVGC6L9DBNVYQARZM".
    The program "GPD6S3OE0BCVGC6L9DBNVYQARZM" was started as a background job.
    And when I check the dump, it points at the code below:
    " Populate the lookup table for 0STOR_LOC
    SELECT * FROM /bi0/tstor_loc
      INTO CORRESPONDING FIELDS OF TABLE l_0stor_loc_text
      FOR ALL ENTRIES IN source_package
      WHERE stor_loc = source_package-stor_loc.
    But the program is syntactically correct.
    How do I rectify this issue?
    regards
    venuscm
    Edited by: venugopal vadlamudi on Sep 28, 2010 1:59 PM

    hi experts,
    We have written a start routine to get the storage location text and send it to a file located on the application server through Open Hub.
    Here is the code written in the transformation.
    In the global section:
    " Text for 0STOR_LOC
    DATA: l_0stor_loc_text TYPE HASHED TABLE OF /bi0/tstor_loc
          WITH UNIQUE KEY stor_loc.
    DATA: l_0stor_loc_text_wa TYPE /bi0/tstor_loc.
    And in the code to get the text:
    " Populate the lookup table for 0STOR_LOC
    SELECT * FROM /bi0/tstor_loc
      INTO CORRESPONDING FIELDS OF TABLE l_0stor_loc_text
      FOR ALL ENTRIES IN source_package
      WHERE stor_loc = source_package-stor_loc.
    I am sure the problem is with the routine. I think I need to change the code; if so, please provide the modified version.
    thanks
    venuscm
    Edited by: venugopal vadlamudi on Sep 29, 2010 9:37 AM
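    A hedged sketch of one possible fix: ITAB_DUPLICATE_KEY here usually means the text table returned more than one row per STOR_LOC (for example, one per language), which collides with the hashed table's UNIQUE KEY stor_loc. Reading into a standard table first, restricting the language, and de-duplicating avoids the collision; the LANGU restriction and the empty-package guard are my assumptions, not from the thread:

    ```abap
    DATA: lt_stor_loc_std TYPE STANDARD TABLE OF /bi0/tstor_loc.

    " FOR ALL ENTRIES with an empty driver table would select everything,
    " so guard against an empty source package first.
    IF source_package IS NOT INITIAL.
      SELECT * FROM /bi0/tstor_loc
        INTO TABLE lt_stor_loc_std
        FOR ALL ENTRIES IN source_package
        WHERE stor_loc = source_package-stor_loc
          AND langu    = sy-langu.        " keep one language only (assumption)

      " Keep exactly one row per STOR_LOC before filling the hashed table
      SORT lt_stor_loc_std BY stor_loc.
      DELETE ADJACENT DUPLICATES FROM lt_stor_loc_std COMPARING stor_loc.

      l_0stor_loc_text = lt_stor_loc_std. " now safe for UNIQUE KEY stor_loc
    ENDIF.
    ```

    The same guard also protects the original SELECT from running with an empty SOURCE_PACKAGE in small or filtered loads.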

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an open hub destination that saves the result as a file on the application server.
    The issue is: in the production environment we have two application servers; XYZ is the database server and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save on XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file there in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content, and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are in SAP BW 7.31 support package 6.
    Any idea what could be the issue or where to look better?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter on which server (XYZ or A01) I log on or start a process chain, the DTP job always runs on the A01 server, and that causes an error since the directory doesn't exist on server XYZ.
    This occurs because the DTP setting for the background job was left blank. I followed these steps to solve the problem:
    - open the DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run", pick the desired server
    - save
    After that, no matter from where I start the open hub extraction, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • Open Hub in BW 3.5

    Hello Experts,
    I need to dump all the data (with all the characteristics) of an InfoCube into a table.
    For this, I am implementing it through an InfoSpoke.
    I am selecting all the 42 InfoObjects which are present in my InfoCube.
    However, I am getting an error which says that only 16 characteristics are allowed.
    Can someone please tell me how I could dump the entire data of the cube (with all the characteristics) into a table through open hub?
    If not through open hub, could someone please tell me if there is any alternative method to dump the cube data into a table?
    Regards
    Dipali

    Thanks Pramod and Praveen. Have assigned points.
    My problem is solved and I am able to proceed further.
    I am now proceeding to dump the data from the cube into the DB table.
    I have activated my InfoSpoke. I now have the destination as, say, /BIC/OHZTEST.
    However, under SE11 I am not able to see this table yet.
    Is there something I am missing in order to push the data from my cube into the DB table?
    PS: I have already selected the datasource as my cube
    Regards
    Dipali

  • Cube to Open Hub DB destination - Aggregation of records

    Hi Folks,
    I am puzzled with BW 7.0 open hub DB destination in regards to aggregation.
    In the BW 3.5 open hub DB destination I got already-aggregated records from the cube, depending on which fields I selected. E.g. the cube has cal week / cal month, but with only cal month selected in the InfoSpoke I get just one record per month (not several, one for each week of the month).
    In the BW 7.0 open hub destination it seems to be different. Although Cal Week is not used in any transformation rule and is not part of the destination definition, I still get all weeks of the month as single records. In theory not a problem if the records were aggregated according to the semantic key of the open hub destination DB table, but here an error is issued -> duplicate-record short dump.
    So do I understand it right that with BW 7.0, record aggregation, e.g. Cal Week / Month ---> Cal Month, is not possible at all? Or am I doing something wrong?
    Will I need an intermediate DSO in between, or is there another way to get the aggregation for open hub working "directly"?
    This is quite a shortcoming of the open hub. Not to mention the non-availability of navigational attributes, plus source only from cubes, not from MultiProviders... it seems that the open hub in BW 7.0 got worse compared to BW 3.5.
    Thanks for all replies in advance,
    Axel

    Hi Axel,
    We can use 0CALMONTH in the open hub destination. In BI 7.0 we cannot extract data from a MultiProvider using Open Hub.
    But in BW 7.30 we have this functionality (using a DTP we can extract data from a MultiProvider to an OHD).
    No need to use an intermediate DSO; we can extract data directly from the InfoCube.
    Please check the below documents.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/501f0425-350f-2d10-bfba-a2280f288c59?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/5092a542-350f-2d10-50bd-fc8cb3902e2e?quicklink=index&overridelayout=true
    Regards,
    Venkatesh

  • Open Hub does not work after upgrade

    We just upgraded our BW production system from 3.0b into 3.5 and we are running support package level 09. Now we are facing serious problems with Open Hub.
    After the upgrade, Open Hub delivering data from an InfoCube into
    CSV files does not work. When running the InfoSpoke it
    delivers 0 data records even though it should bring a considerable amount of data.
    There is no error message or short dump; it just does not deliver any data. The InfoCube has data in place and it is available for reporting. The extraction is made in FULL mode and there are no transformations. The funny thing is that extraction from an ODS or InfoObject works perfectly fine.
    Has anyone idea how to solve this?
    Thanks in advance,
    JL

    We ran into the same issue as Jari Laine today.
    We just upgraded our BW sandbox environment from 3.0b into 3.5. We are running support package level 13, though. Now we are facing serious problems with Open Hub.
    After the upgrade, our Open Hub which delivers data from an info cube into CSV-files does not work any more.
    When running the Info Spoke it will deliver 0 data records even though it should bring a considerable amount of data.
    There is no error message or short dump, it just does not deliver any data. The Info Cube has data in place and it is available for reporting.
    The Extraction is made in FULL-mode and there are no transformations.
    Has anyone an idea how to solve this?
    Thanks in advance,
    Marc
