Open Hub transporting problems

Hi everybody, we have problems transporting Open Hub objects: the transports just end with return code 12 or 8. Basically, it doesn't want to transport the Open Hub Destination, nor the Transformations, nor the DTPs, and bear in mind that we're on Patch Level 15. We've looked at all the SAP Notes, but they refer to earlier patch levels...
Another problem is that it's OK in the DEV system, but in the QUALITY system, when you launch a DTP, it gives us a short dump with ASSIGN_TYPE_ERROR.
Any ideas??

My destination path on the DEV box is simply the following (identified by BID in the following path):
      D:\usr\sap\BID\DVEBMGS00\work
And when I look at the inactive open hub on the QA box, the BID has already been changed to BIQ, but the server name field is empty....
I am going to try the object changeability option. Btw, is there any documentation on how to change objects that are marked as changeable via object changeability on the QA box (given that we cannot manually change anything on the QA box)?
Anyway, I think my problem is with the server name conversion from "abc-bi-dev" to "abc-bi-qa"!
Also, please note that the RSBFILE entry for this object in the QA box has a blank "btcsrvname"...
Do you think there is any additional setting that needs to be made on the QA box to do this translation (from "abc-bi-dev" to "abc-bi-qa")?
The only two settings we have maintained as far as BI is concerned are under "Tools -> Conversion of Logical System Names", where you only maintain the client name mapping from dev to qa and the local file system mapping from dev to qa.
So, is there anything else to map the host name (from "abc-bi-dev" to "abc-bi-qa"), either from the BI perspective or the Basis/NetWeaver perspective?
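In case it helps anyone debugging the same thing, the stored server name can be inspected directly from RSBFILE. This is only a minimal sketch: the destination name is a placeholder, and the field list (OHDEST, OBJVERS, BTCSRVNAME) is my assumption based on the blank "btcsrvname" entry mentioned above, so check the table structure in SE11 on your release:

```abap
DATA: lt_rsbfile TYPE STANDARD TABLE OF rsbfile,
      ls_rsbfile TYPE rsbfile.

" Read all versions (active/modified) of the file settings
" for one open hub destination. 'ZMY_OHD' is a placeholder.
SELECT * FROM rsbfile
  INTO TABLE lt_rsbfile
  WHERE ohdest = 'ZMY_OHD'.

LOOP AT lt_rsbfile INTO ls_rsbfile.
  " A blank BTCSRVNAME after import is exactly the symptom described above
  WRITE: / ls_rsbfile-objvers, ls_rsbfile-btcsrvname.
ENDLOOP.
```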
Thanks a lot...
~Sabuj.
Edited by: Sabuj Haque on Feb 1, 2009 1:11 PM
Edited by: Sabuj Haque on Feb 1, 2009 5:34 PM

Similar Messages

  • Open hub destination problem - .csv file is empty

    Hi everyone,
    I am trying to extract data from our BW system to a .csv file using an Open Hub Destination.
    The DTP runs successfully and shows the total number of records, but when I check in AL11 the records are not in the .csv file; the .csv file is empty.
    How can I figure out the problem?
    Best regards
    Ahmad

    @Jagadeesh:
    if I understood you correctly, you mean that I should first run the DTP and save to an Excel file to see whether it shows data or not. I don't think this is possible: when we create an OHD, for 'Destination Type' we can choose among 'DB table', 'table' and 'third-party tool',
    so if I choose 'Table', then the 'File Name' is a .csv file (same name as the OHD), and I can change it to an .xls file.
    @Arun:
    I have permission to write to the folder.
    Do you mean creating an OH to a database table?
    In the DTP I have a filter on a characteristic; with 1500 records it works (the data is in the .csv file), but with 3500 records the .csv file is empty.
    No error message, no log.
    Regards
    Ahmad

  • Open Hub Destination problem

    Hi
    I have created an open hub destination (OHD) on a cube and am getting a file in CSV format on the application server.
    I have to move this file to BPC, but the problem is that when I run the DTP of the OHD, it generates 2 files: one with the header and the other with the data.
    How can I make it generate one file with the header and then the data in it?
    How can I send this file automatically to the BPC server (BPC is the Microsoft version, not the NetWeaver one)?

    Hi Sadik
    BPC is very new in the market and there is not much help. Yes, it is based on MS SQL, but the thing is that all data loading is via CSV files only, not through direct table access and the like.
    I have tried setting up DB Connect, but I am not able to make the RFC connection, and I have no Basis support on my project.
    I have solved my problem, but there is a small snag. I am downloading data from an internal table into a file, but this code dumps the file onto my local machine. I want the file to be written directly to the application server.
    Following is the code:
    CALL FUNCTION 'MS_EXCEL_OLE_STANDARD_DAT'
      EXPORTING
        file_name                 = 'C:\Test'  "<-- this is the main problem
*       file_name = 'DIR_HOME  E:\usr\sap\BDV\DVEBMGS30\work\ZBPC_1000' "set this for application server location
        create_pivot              = 0
        data_sheet_name           = ' '
        pivot_sheet_name          = ' '
        password                  = ' '
        password_option           = 0
      TABLES
*       pivot_field_tab           =
        data_tab                  = lt_data_f  "internal table with data
        fieldnames                = lt_header  "internal table with header
      EXCEPTIONS
        file_not_exist            = 1
        filename_expected         = 2
        communication_error       = 3
        ole_object_method_error   = 4
        ole_object_property_error = 5
        invalid_filename          = 6
        invalid_pivot_fields      = 7
        download_problem          = 8
        OTHERS                    = 9.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    Edited by: Navneet Thakur on Jul 2, 2009 2:53 PM
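    For what it's worth, MS_EXCEL_OLE_STANDARD_DAT always writes via OLE to the frontend PC, so it cannot target the application server. Writing the file server-side is normally done with OPEN DATASET instead. A rough sketch, assuming lt_header and lt_data_f contain ready-made character lines (the path is just the one from the commented-out line above and may differ on your system):

    ```abap
    DATA: lv_file TYPE string VALUE 'E:\usr\sap\BDV\DVEBMGS30\work\ZBPC_1000',
          lv_line TYPE string.

    " OPEN DATASET works on the application server file system,
    " so the path must be valid on the server, not on the PC.
    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      MESSAGE 'Could not open file on application server' TYPE 'E'.
    ENDIF.

    " Header lines first, then the data lines
    LOOP AT lt_header INTO lv_line.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.
    LOOP AT lt_data_f INTO lv_line.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.

    CLOSE DATASET lv_file.
    ```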

  • Open hub space problem

    Hi all,
    While I am executing a process chain that has an Open Hub destination as a UNIX file, I am getting an error like: "Resource bottleneck:
    The current program "CL_RSB_FILE_APPLSRV===========CP" had to be terminated because a capacity limit has been reached". What shall I do to fix this issue?
    regards.
    bhadra

  • Problem in transporting Open Hub destination

    Hi,
    I have an open hub destination. The destination is a database table. When I try to transport the open hub destination to staging, the transport fails saying "Unable to activate table /BIC/OHXXX". What could be the reason?
    Thanks,
    Satya

    Hi Satya,
    I have exactly the same problem. The reason why the table could not be activated is that the referenced fields of the open hub table are wrong. Unfortunately, at the moment I do not know why this happens. It would be great if you could post the solution once you have solved the problem. I will keep you informed...
    Thanks,
    Thomas

  • Error in Open Hub Destination Transport

    Hi All,
    I am facing the following problem in transporting the openhub destination.
    We have created a Open Hub Destination with following settings.
    Destination Type :- 'FILE'
    We have ticked the application Server name checkbox.
    In the server name we have selected our BI Development Server.
    In the type of File Name we have selected 'Logical File Name'. We have maintained the details in AL11 and FILE transactions.
    While transporting it to our Quality system, it throws an error saying it does not recognize the server name (which is true; the server name is different in the Quality system).
    I need your guidance on whether there is any setting, like a source system conversion, that would change the name.
    Also, if you could guide me regarding best practices for transporting the entries we have created in AL11 and FILE, it would be really nice. I searched for this on SDN and could not find anything useful for my case.
    Thanks in advance.

    Hi,
    Your TR is failing maybe because the application server file system is different in your target system.
    Go to transaction AL11 and check that. If the same folder structure is not available, get it created, or go to RSA1 and try changing your InfoSpoke settings to point to the application server file as it exists in your target system.
    You have to point this path to the Q server.
    ~AK

  • Problem with open hub with application server

    Hello...
    I have a problem.
    I have BW 7.0 and I'm using Open Hub.
    When I configured the open hub in my development environment, I tried two options on the destination tab.
    Option 1.
    I use the application server, then I choose my server (the development server).
    But when I try to transport, I get the following error:
    Application server sapnwd01 does not exist (sapnwd01 is my development server)
    I know I can change the table RSBFILE, or write an ABAP program.
    The problem is, I need the transport to pass without errors to Quality, because I then need to transport to my production server.
    And in the production server I don't have permission to change tables.
    Option 2.
    Using a logical file name (I use this logical file name for other processes and it works correctly).
    But when I try to transport, I get the following error:
    Unpermitted name for a file.
    I hope you can help me!!
    Thanks in advance

    Hi Sharayu Kumatkar,
    Sorry for the delay, I had problems with my access.
    I checked AL11; there are a lot of directories, so I don't know which one it is.
    My logical path is ZAPD_INVERSIONES.
    My physical path is /netweaver/bw/fuentes/miig/altamira/
    My logical file definition is:
    Logical file ZAPD_INVERSIONES2
    Physical file ExtraccionInversiones.txt
    Data Format ASC
    Applicat. Area FI
    Logical Path ZAPD_INVERSIONES
    In the definition of the variable:
    FILENAME Test.txt
    I get this error message while I'm trying to transport:
    Start of the after-import method RS_DEST_AFTER_IMPORT for object type(s) DEST (Activation Mode)
    Unpermitted name for a file
    Thanks for your help!

  • Problem with a Open Hub Extraction

    Hi Experts,
    I have a huge problem here.
    I'm trying to extract some data with an Open Hub application, but I'm having trouble with the application server.
    See, I have to extract the information from a DSO to a file on an application server, but the company has one application server for each environment (BD1, BQ1, etc.); the conversion is not working and the transport of the request always fails.
    Does anyone know how to fix this?
    Thanks.
    Regards.
    Eduardo C. M. Gonçalves

    Hi,
    While creating the open hub, you need to maintain the file name details under transaction FILE.
    I hope you have maintained that.
    There you will also have to use a variable in the file path, which will change the file path in each system.
    There you will have to define:
    Logical File Path Definition
    Assignment of Physical Paths to Logical Path
    Logical File Name Definition, Cross-Client
    Once you have defined this and transported it, your Open Hub will go smoothly.
    Thanks
    Mayank
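    As a quick cross-check that the logical file name resolves to the right physical path in each system, you can call the standard function module FILE_GET_NAME after transporting the FILE definitions (the logical name below is just an example):

    ```abap
    DATA lv_physical TYPE string.

    " Resolve a logical file name (maintained in transaction FILE)
    " into the physical path valid for the current system.
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZOH_EXTRACT_FILE'   "example logical name
      IMPORTING
        file_name        = lv_physical
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.

    IF sy-subrc = 0.
      WRITE: / 'Resolved path:', lv_physical.
    ELSE.
      WRITE: / 'Logical file name not maintained in this system.'.
    ENDIF.
    ```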

  • Open Hub Destination transports

    Hi Experts..
    We have created Open Hub destinations to pull data from an ODS to a DB table in BI 7.0. But while transporting, the transformations for the OHD are failing, with the transformation becoming inactive with the error message "Target ODSO ZODS*** does not exist". But the ODS is present in the QA system and is in active version.
    So can anyone please help me with the proper sequence for an Open Hub Destination transport and the necessary objects to be captured in the request? Currently I have captured the OHD, transformation and DTP in the same request.
    <removed by moderator>
    Regards,
    Umesh.
    Edited by: Siegfried Szameitat on Jan 16, 2009 11:57 AM

    Send the OHD in one request, the transformations in another request, then the DTP in a third request.
    When you club the transformations & DTP together, it fails, in my experience.

  • Error in Transporting Open Hub Destination Objects

    Hi Experts,
    When I transport the open hub destination from Development to the Quality server, the error "No Rule Type Exists" occurs while activating the transformation.
    The objects are transported, but the transformation & DTPs are inactive.
    Please assist, as this is the first time Open Hubs are being transported in my system.
    Any help will be greatly appreciated.
    Thanks,
    Sumit

    Hi,
    the return code is 8.
    I checked and here is the error log:
    Start of the after-import method RS_TRFN_AFTER_IMPORT for object type(s) TRFN (Activation Mode)
    Activation of Objects with Type Transformation
    Checking Objects with Type Transformation
    Checking Transformation 08FM6JIPU6MCLHM2LWBY02OT6SXOPDNY
    No rule exists
    Checking Transformation 09RS1PONVSGIHIJEHRGDCWPP4UD87TBY
    No rule exists
    Checking Transformation 0MJTLNXZAUI7SKRWPKEFUIOCMB8X5TWQ
    No rule exists
    Saving Objects with Type Transformation
    Thanks,
    Sumit

  • Issue in transport regarding Open Hub Services

    Hi All,
    I have a strange scenario: I am moving one of my open hub related transports from the system test environment to UAT, but each time it fails.
    The background: I have an InfoSpoke based on one ODS, where I have included a field called BRAND in the source structure.
    This transport went fine from Dev to System test, but it fails while moving from System test to UAT.
    The error is:
    Program ZCL_IM_ZLP_WRKF===============CP, Include ZCL_IM_ZLP_WRKF===============CM001: Syntax error in line 000085
    The data object 'WA_SOURCE' does not have a component called '0IS_CLAIM__BRANDVALE'.
    Any comments will be appreciated.
    Regards,
    Kironmoy Banerjee.

    Hi,
    Generally, transports fail in different situations: missing sequence and dependencies, inactive objects, or RFC issues.
    So we should follow the sequence and dependencies while transporting the objects, and they should first be in an active state in the source system.
    In your case, check whether your source fields are mapped correctly and are in an active state.
    As per your error:
    The data object 'WA_SOURCE' does not have a component called '0IS_CLAIM__BRANDVALE'.
    Check the source 'WA_SOURCE' to see why it does not have the component '0IS_CLAIM__BRANDVALE'; I think the source is missing that component.
    See whether it is mapped correctly and active, follow the sequence if there are any dependencies, and re-transport the request.
    Regards.
    Rambabu

  • Open hub problem

    Hi Gurus,
    please help me out of this problem.
    When we execute the DTP of the open hub, it creates an extract file in a server directory. Our problem is that the record length of the output record is always truncated to 110 characters. The record length of the open hub structure we have created is 151 characters (excluding delimiters).
    We are not using InfoSpokes, as they are obsolete in 7.0.
    We are using NW BI 7.0 on Service Pack 20,
    and Excel 2000.
    Thank you

    Hi,
    Please let me know the length of the structure in the Open Hub including the delimiters.
    Btw,
    you can create a custom field in the open hub maintenance, specify whatever size you desire including the delimiters, and then assign all the source fields to it through a routine.
    Regards,
    Saikat
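    A field routine of the kind described above would look roughly like this (sketch only; the source field names are placeholders for your actual open hub fields):

    ```abap
    " BI 7.0 transformation field routine (sketch):
    " concatenate the source fields into the single wide target field,
    " putting the delimiter back in between. FIELD1..FIELD3 are placeholders.
    CONCATENATE source_fields-field1
                source_fields-field2
                source_fields-field3
           INTO result
      SEPARATED BY ';'.
    ```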

  • Open Hub Destination - Transport Error

    Hi,
    The following error occurred while transporting the Open Hub Destination (OHD) to QA.
    The OHD was created using Logical File as the file type.
    Syntax group: WINDOWS NT.
    Transport Error Log:
    Start of the after-import method RS_DEST_AFTER_IMPORT for object type(s) DEST (Activation Mode)
    Application server bidwXXX does not exist.
    Start of the after-import method RS_DEST_AFTER_IMPORT for object type(s) DEST (Delete Mode)
    Errors occurred during post-handling RS_AFTER_IMPORT for DEST L
    RS_AFTER_IMPORT belongs to package RS
    The errors affect the following components:
       BW-WHM (Warehouse Management)
    Are we missing any configuration here?
    Regards

    Ananda,
    This is my file path in Dev:
    D:\usr\sap\<SYSID>\DVEBMGS00\work\<FILENAME>. In this path I initially put BWD instead of <SYSID>; it failed, so I thought I would try the parameter <SYSID> instead of BWD, but even that failed.
    When I transported the OHD with the path D:\usr\sap\<SYSID>\DVEBMGS00\work\<FILENAME>, it was created in QA in the same directory as mentioned in Dev. So I believe I have the right file path in QA.
    Thanks.

  • Problem connecting to SAP Open Hub

    Hi, I am trying to set up an SSIS job connecting to SAP Open Hub and, with support from the SAP guys, have been able to make some progress, but it has now stopped on an error message we're not able to solve. Any suggestions on what can be wrong and how to solve this? When I run the package I get the following error message:
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" starting.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
    Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
    Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
    Information: 0x3E8 at Data Flow Task, SAP BW Source: Process Start Process, variant has status Completed (instance DH88PUV2SZBIFKMIF48K3USME)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: Process Data Transfer Process, variant /CPMB/HMIJYDZ -> ZOH_VPL has status Ended with errors (instance DTPR_DH88PUV2SZCA46Y9QNO66A6W6)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: The component is stopping because the Request ID is "0".
    Error: 0x3E8 at Data Flow Task, SAP BW Source: No data was received.
    Error: 0xC0047062 at Data Flow Task, SAP BW Source [41]: System.Exception: No data was received.
       at Microsoft.SqlServer.Dts.SapBw.Components.SapBwSourceOHS.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
       at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
    Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on SAP BW Source returned error code 0x80131500.  The component returned a failure code when the pipeline engine called PrimeOutput().
    The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
    Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" wrote 0 rows.
    Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.
    Task failed: Data Flow Task
    Warning: 0x80019002 at Package3: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches
    the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" finished: Failure.
    The program '[6916] DtsDebugHost.exe: DTS' has exited with code 0 (0x0)
    Regards
    Paal

    Hi Paleri,
    According to a thread with the same error message, the issue may be caused by incorrect RFC settings. Could you double-check your RFC connection configuration, such as the DNS settings?
    If that is not the case, please also make sure you have installed the correct version of the Microsoft Connector for SAP BW.
    Reference:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/e2fbafe5-d9df-490a-bfad-3d4b9784a8ea/sap-bi-connector-for-ssis-2008?forum=sqlintegrationservices
    Regards,
    Mike Yin
    TechNet Community Support

  • Modifying destination Directory of Application server for Open Hub

    Hi All,
    I want to load a csv file onto the application server using Open Hub.
    When I created the open hub, I specified the server name, the file name and the directory.
    But the Basis guys told me that I should use another directory; they have created a Unix directory for me, but the problem is that I can't modify the directory path in my open hub.
    Can you please tell me what the problem is?
    Another question: why can't I modify this server name / directory parameter for the open hub directly in the Production system?
    In production I can create an open hub, but when one has been transported from dev to prod, I can't modify it in prod. I have checked the object changeability option in the transport tool, and I see that for open hub it is set to "original modifiable".
    Thanks for your help
    Bilal

    I have read in some forums that to change the server name or logical file name, we can do so in the following table: RSBFILE.
    I tried to view the data in this table and found 2 records for my open hub: one for the Active version and one for the Modified version.
    I tried to change the directory path name here, but after saving, the system gives me the message that the active version and the modified version of my open hub are not equal; when I try to activate my open hub again, the system takes the old directory path.
    Is this the right table to change the directory path on the application server for my open hub, or is there another table?
    I don't work with a logical file name; I work with a file name and a directory path name.
    thanks for your help
    Bilal
