Scheduling Open Hub

We want to automatically push some flat files out, perhaps using Open Hub.  However, it seems that Open Hub in flat file mode cannot be scheduled as a background process.  Only the mode that generates a database table seems to be schedulable.
Is there a way to automate generating a flat file?  Or should we look at somehow calling <b>RSCRM_BAPI</b>?
Thanks for any advice.

Jerry,
Extraction of data to a flat file via Open Hub (on the BW server) can be scheduled (use a process chain), and then a simple ABAP program can be run to download the files to your local PC (OPEN DATASET, read into an internal table, CLOSE DATASET, then call function GUI_DOWNLOAD).
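A minimal sketch of such a download program, assuming the Open Hub process chain has already written the file; the report name and both file paths are placeholders you would adjust:

```abap
REPORT zohub_download.

" Hypothetical paths - adjust to your Open Hub destination's target file.
PARAMETERS: p_sfile TYPE localfile DEFAULT '/usr/sap/trans/ohub_out.csv',
            p_tfile TYPE localfile DEFAULT 'C:\temp\ohub_out.csv'.

DATA: lt_data  TYPE TABLE OF string,
      lv_line  TYPE string,
      lv_tfile TYPE string.

START-OF-SELECTION.
  " Read the flat file written by the Open Hub process chain
  OPEN DATASET p_sfile FOR INPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc <> 0.
    MESSAGE 'Could not open file on application server' TYPE 'E'.
  ENDIF.

  DO.
    READ DATASET p_sfile INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    APPEND lv_line TO lt_data.
  ENDDO.
  CLOSE DATASET p_sfile.

  " Download the internal table to the local PC
  lv_tfile = p_tfile.
  CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
      filename = lv_tfile
    TABLES
      data_tab = lt_data
    EXCEPTIONS
      OTHERS   = 1.
  IF sy-subrc <> 0.
    MESSAGE 'Download to local PC failed' TYPE 'E'.
  ENDIF.
```

Note that GUI_DOWNLOAD needs a SAP GUI session, so this report is run in dialog after the process chain has finished, not as part of the background chain itself.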
Regards,
Marco

Similar Messages

  • How to create an Open HUB in BI 7.0 which will be scheduled by a Process ch

    Hi Gurus,
Could you please explain how to create an Open Hub that will be scheduled by a process chain?
    Thank you.
    BR
    Pat

    Hi ,
Check these links:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/5f12a03d-0401-0010-d9a7-a55552cbe9da
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/01d3a090-0201-0010-9783-bc33ab690e70
    http://learnsapbw.blogspot.com/2008/04/open-hub-service-using-sap-bi-70.html
    Regards,
    shikha

  • Open hub destination issue

    Hi,
    In our project we have Client Instance (eg: BWPD) and Application server (eg: BWPRD) defined for load balancing.
    We have created open hub and assigned destination server BWPD.
When I execute the DTP manually in BWPD, it runs successfully.
However, the same DTP fails when placed in the process chain, with the error message:
No Such File or Directory
Could not open file D:\usr\sap\BWPD\A01\work\Material on application server
Error while updating to target ZXXXX.
Options tried:
Scheduled the process chain on the background server BWPD (the same server mentioned in the Open Hub destination); the DTP still failed.
Tried with the application server; it failed.
Tried with HOST as the option; it failed.
I couldn't make out what is going wrong. Any thoughts?
    Regards.

    Hi there,
I found this document quite useful; maybe it can shed some light on your issue.
    [Creating  Open Hub Destination  using a Logical file to extract the data|Creating  Open Hub Destination  using a Logical file to extract the data]
    Also, what OS do you have?
Is the syntax group created accordingly?

  • SAP BI 7.0 to SAP PI to FTP and with Open Hub Destination ...Help!!!!

    Dear SCN Experts,
I am currently working on a scenario where I have a requirement to push data from SAP BI 7.0 to SAP PI 7.0.
I am using a client proxy for SAP BI to send the data from BI tables to SAP PI and then write it to an FTP address.
The challenge I am facing is how to use the ABAP proxy with a process chain; I also have an Open Hub destination created for the same (specifically with the new version, i.e. 7.0).
Can you at least help me understand the steps involved for the client proxy, the process chain, how the proxy is triggered, and related points?
I have searched SDN but only found documents covering older versions, which don't serve the purpose.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

    Hi Michal,
    Thanks for the reply buddy.
I know that we can run a scheduled report for the proxy to fetch the data. But the client requirement is to use a process chain and an Open Hub destination that fetches data from the ODS into the Z-table created.
We need to fetch that data from the table via our proxy. I am familiar with the report-related method.
I have one document using the same method, but it uses an InfoSpoke (which is now obsolete) instead of an OHD, and also proxies from XI's older version, so it is not helping me.
Can you send me a sample scenario like this with screenshots? It would be a great help.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

  • Data Services - Open hub read - No Open Hub DTP error

    Hi,
I have created an Open Hub destination in BW (7.0 SP21) and created the transformation, DTP, and process chain. In Data Services (3.2) I have created the flow, job, etc., and everything seems to be as expected. However, when I start the job in DS it terminates with an error:
1136     5624     BIW-230334     13-11-2009 12:28:22     Process Chain <ZTEST2> of Open Hub destination <ZTEST> at SAP system <.......> no more contains Open Hub DTP. Please reimport the Open Hub destination to execute the appropriate Process Chain.
I've tried re-activating everything in BW and also re-importing the OHD, but nothing works. I've even tried to re-create everything from scratch. The process chain I've created does indeed contain an Open Hub DTP.
One additional question: I've activated the process chain, but do I also need to schedule it in BW (as you do with, e.g., process chains for time points used in Broadcasting)? Note that I've tried both scheduling and no scheduling, and it does not work either way.
    Any ideas/suggestions?
    Thanks in advance,
    Jacob

You must implement the following SAP Notes in the SAP systems in order to make the Data Services Open Hub functionality work correctly:
SAP Note 1270828: (Open Hub process chain is not imported into the Data Services repository)
SAP Note 1294774: (Open Hub import failed; when you call the module RSB_API_OHS_DEST_SETPARAMS, the system tries to open a GUI)
SAP Note 1330568 version 3: (This note fixes "Process Chain <ZTEST2> of Open Hub destination <ZTEST> at SAP system <.......> no more contains Open Hub DTP. Please reimport the Open Hub destination to execute the appropriate Process Chain")
SAP Note 1079698 - 70SP16: (Enable the checkbox "Automatically Repeat Red Requests in Process Chain" in the DTP Execute tab)
SAP Note 1346049: (Error 028(OSOH): Request <n> cannot be locked)
SAP Note 1338465, with 1293589 as prerequisite: (The ABAP/4 Open SQL array insert results in duplicate database records)
SAP Note 1342955: (0 data records transferred)
    Edited by: Ajit Dash on Nov 13, 2009 10:37 PM

  • Open hub service - Cube as source system

    Dear gurus,
I am new to the open hub concept.
I need to transfer values from my query to a custom table in R/3.
I'm using BI 7.
Questions:
1) I guess open hub is my solution. Is that correct?
2) Do I have to transfer values from my cube, or can I send values from my query too?
3) I have created an open hub service under the Modeling tab and selected target type TAB. It says database table /BIC/OHZTABLO.
Is that correct?
4) Then I am trying to create a transformation from open hub ZTABLO to /BIC/OHZTABLO.
In that case it shows that the transformation target is DEST ZTABLO. So the open hub service is my destination? Then what is the source? I am confused on that point.
5) Thanks all!

1) I guess open hub is my solution. Is that correct?
-> If you want to transfer data from your query, then APD (Analysis Process Designer) is the best bet. Create an APD and schedule it in a process chain.
2) Do I have to transfer values from my cube, or can I send values from my query too?
-> Using OH (open hub), you have to use the cube as a source; you can't use a query.
3) I have created an open hub service under the Modeling tab and selected target type TAB. It says database table /BIC/OHZTABLO.
-> Yes, correct. There are two ways to do it: either you send the cube data to an open hub table (as is happening in your case) or you send it to the application server as a flat file.
Is that correct?
-> Yes.
4) Then I am trying to create a transformation from open hub ZTABLO to /BIC/OHZTABLO.
In that case it shows that the transformation target is DEST ZTABLO. So the open hub service is my destination? Then what is the source? I am confused on that point.
-> Your source is the InfoCube and your target is the open hub table /BIC/OHZTABLO.
5) Thanks all!
Thank you.

  • Delete the initialization of delta on open hub table

    HI Gurus
I have loaded data from a cube to an open hub table using a DTP with full update; later I loaded the data using a DTP with delta update. I now believe the delta has been initialized on the open hub table, and I want to delete the delta initialization on that table. It would be great if someone could respond.
    Thanks
    Rohit

    Hi Sam,
If I understand you correctly:
You removed your initialization by going to InfoPackage --> Scheduler --> Initialization Options for Source --> and removed the request.
Then you did an init without data transfer.
Then you switched the InfoPackage from init without data transfer to delta and loaded the data.
Check whether the InfoPackage you mentioned in the selections of the process chain may be pointing to the init load instead of the delta.
    Sudhakar.

  • Need help in open hub

    Hi All ,
We have a NetWeaver 2004s system and want to transport data from a cube to an internal DB table via open hub. We have successfully created the open hub and connected it to the cube. Using a DTP we have made the connection and the DB table, but we are unable to schedule the data into the DB table. Can you please suggest how to schedule the data into the DB table from the InfoCube using open hub?
    Thanks

    Hi,
Thanks for the reply. I created an InfoSpoke with the database option as destination. I opened SE16 and entered the table name, which starts with /bic/.... (please correct me if I am wrong). The initial screen of SE16 came up with all the fields, but when I executed it, the table was empty. Could you say why this is happening?
    Thanks,
    RR.

  • Info spoke /Open Hub

    Hi Friends
I need to extract a few fields to a flat file (Excel); that file needs to be uploaded to APO DP.
The issue I am facing is that the data belongs to 3 InfoCubes: 2 are transactional and one is basic. So I can't use open hub, which supports only 1 cube/ODS at a time.
I created a MultiCube on top of the 3 cubes and tried to extract the data using RSCRM_BAPI by creating a report on the MultiProvider.
But I am facing these issues:
1) The MultiProvider has TYPE, which is not supported.
2) My report contains STRUCTURE elements, and the output of RSCRM_BAPI does not display the structure elements.
Please help with a solution to this.
    please help with some solution to this.
    regards
    praveen.

    Hi Praveen,
If you are on BW 3.5, you can use an APD to write the data you want into a transactional ODS, then create an export DataSource on that ODS and use it in APO.
You can schedule this at whatever time you want.
However, APD is not supported in process chains in 3.5; it is supported in BI 7.0, I believe.
    Bye
    Dinesh

  • SAP Open Hub Vs SAP JCO for Extraction

    Hi,
I would like to clarify SAP JCo connectivity from a third-party tool to an SAP BW server in comparison to open hub:
We want to connect our internal third-party tool to the SAP BW/BI server using Java technology through the SAP-provided JCo. We will then create RFCs and BAPIs on the SAP BW server for data extraction, and call those BAPIs and RFCs from our Java programs.
In such a scenario we can extract metadata as well as data for all defined data targets (InfoCube/ODS/DSO/InfoObject) and can also build a data-load scheduling program to extract data and store it in our non-SAP application.
As per my understanding, since we are using the SAP-provided JCo, there won't be any license issue. Open hub, however, requires a license if it is used to extract and send data from data targets to third-party tools.
Can someone confirm in which of the above cases a license would be required? Mainly I would like to know: if we don't use the SAP-provided JCo for connectivity and data extraction, would a license be required?
Your speedy response would be highly appreciated.
    Regards,
    Vivek

hi,
refer to these links:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b6d59590-0201-0010-5f9d-adb9f6b14d80
    help.sap.com/bp_biv270/documentation/SAP_BW_3.5_Functoin_Detail.pdf
    www.sapbasis.ru/library/ep_tech_and_program/ep_tech_and_program-7_8.pdf
    thanks
    naresh

  • Open Hub with Attributes and Texts

    Hi,
I'd like to know if it's possible to have one open hub (database table) with both texts and attributes.
There's no problem when I create it or its transformations and DTPs, but when I try to load texts after the attributes load, it always generates a dump: SAPSQL_ARRAY_INSERT_DUPREC.
I'd like to know a way without using a DSO. Any suggestions?
    Kind Regards,
    Tomas

    Hi,
Please check the SAP Notes system; SAP Note 668466 gives some help for this error.
According to SAP Note 668466:
    Summary
    Symptom
    You load master data attributes or texts for a characteristic. The loading terminates with runtime errors of the type SAPSQL_ARRAY_INSERT_DUPREC or with error message RSDMD 199. The termination point is in the DBFIRE method of the CL_RSDMD_UPDATE_MASTER_DATA class.
    Other terms
    Loading master data, attributes, texts, SAPSQL_ARRAY_INSERT_DUPREC, CL_RSDMD_UPDATE_MASTER_DATA, DBFIRE
    Reason and Prerequisites
    When you load attributes or texts, data records are usually inserted in the master data tables of the characteristic using an array insert statement (P, Q, X, Y, T tables). This results in a SAPSQL_ARRAY_INSERT_DUPREC runtime error if data records that are to be inserted and that relate to the unique primary key of the relevant database table already exist in this table. The following possible reasons are currently known for the occurrence of the SAPSQL_ARRAY_INSERT_DUPREC runtime error:
    1. Data inconsistencies in the master data tables of the characteristic
    SAPSQL_ARRAY_INSERT_DUPREC runtime errors or RSDMD 199 may occur if the data appears in an inconsistent status in the master data tables of the characteristic before you load attributes or texts. Notes 323140, 566044, 592757 and 599269 describe this (among other things) in more detail.
    2. Parallel loading of nondisjunct data packages
    If master data attributes are loaded over several dialog processes that are running in parallel, you must guarantee that the data packages of the relevant request are strictly disjunct concerning the characteristic value. This means that all data records that belong to a certain characteristic value may only be contained in a single data package. If data records of a certain characteristic value are distributed over several data packages, this may cause conflicts during the parallel loading of these packages when the master data tables are accessed and this may result in SAPSQL_ARRAY_INSERT_DUPREC runtime errors.
    The same conditions that apply to loading attributes also apply to loading language-independent master data texts. If language-dependent texts are loaded in parallel, the data packages must be disjunct in relation to the combination of characteristic value and language indicator.
    Note 566044 also describes the problematic nature of nondisjunct data packages or duplicate or overlapping data records.
    3. Parallel loading of time-dependent attributes or texts
    When you load time-dependent attributes or texts, a data record that is to be loaded is assigned to a characteristic value and also specifically to a time interval within which the contained attribute values or texts are valid. For this reason, requests for loading time-dependent attributes or texts typically contain several data records for each characteristic value. If data records of a certain characteristic value or the combination of characteristic value and language indicator are distributed over several data packages, this may (as already explained) cause conflicts when the master data tables are accessed and may result in SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199.
    4. Delta update of an ODS object in a characteristic
    If you use a delta update from an ODS object to load master data attributes or texts, the activation of the ODS object determines the contents of the data packages that are used for loading the attributes or texts. If all of the ODS loading requests that are used as the data basis for the update into the characteristic are activated together in one step, this guarantees that the system sets up the data packages strictly disjunct in relation to the key fields of the ODS object. If the key fields of the ODS object are displayed one for one on the compounded characteristic value in the update rules, the data may be updated to the characteristic in parallel and by package.
    However, if ODS loading requests that were activated at different times appear in the data basis for the update, data records of a certain ODS key or characteristic value may be distributed over several data packages. As already explained, this may cause conflicts when you access master data tables of the characteristic, and it may result in SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199.
    This problem is also described in Note 666213.
    5. Simultaneous loading of attributes and transaction data
If, when loading attributes of a characteristic, you discover that there are still no entries in the master data tables (P, Q) for a specific characteristic value, the system prepares to insert corresponding data records into these database tables. You can load transaction data into an InfoCube or an ODS object, for example, with the option to create missing master data entries for characteristics that are involved in the loading. If a request now creates entries for transaction data in the master data tables (P, Q) of the characteristic before the process has inserted corresponding data records into the master data tables (P, Q) for loading attributes for this characteristic, this may result in a SAPSQL_ARRAY_INSERT_DUPREC runtime error or error message RSDMD 199.
    This problem may also occur if you load attributes simultaneously for a characteristic that contains the other characteristic as a navigation attribute or compounding part. If a characteristic is compounded to other characteristics or if it contains characteristics other than the navigation attributes, the system also creates SID values for these dependent characteristics when the attributes are loaded depending on the settings in the InfoPackage.
    Solution
    To repair a possible inconsistency in the data, implement the RSRV test for master data.
    To avoid SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199 when you load master data, carry out the following actions. The actions to be performed in each case depend on which of the above-mentioned reasons is responsible for the runtime error:
    1. Data inconsistencies in the master data tables of the characteristic
    If data inconsistencies are responsible for the runtime error, you can usually use the tests in transaction RSRV or the RSDMD_CHECKPRG_ALL program to solve this problem. Notes 323140, 566044, 592757 and 599269 describe these (among other things) in more detail. If you cannot implement an automatic correction in this way, you may have to use an ABAP program that is adjusted to the relevant situation to restore the data consistency.
    2. Parallel loading of nondisjunct data packages
    If the runtime error does not lead back to the parallel loading of nondisjunct data packages, you must identify the duplicate data records that cause the problem. As described in Note 566044, you can change the insertion into master data tables from one array insert to an individual insert statement. If you use a single record insert statement to insert data records, duplicate data records are logged in the monitor by error messages (RSDMD 196). If the data records of a failed request are still available in the PSA, you can also search for duplicate data records directly in the relevant PSA table.
    You may also create duplicate data records if key values are changed in the transmission rules or update rules. If relevant rules are defined, you should check these to see if they can possibly create duplicate data records. If this is the case, you must correct the rules accordingly.
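The contrast the note draws between the array insert and the single-record insert can be sketched in ABAP as follows; ZOH_DEMO is a hypothetical transparent table used only for illustration:

```abap
" Array insert vs. single-record insert into a hypothetical table ZOH_DEMO.
" With a duplicate primary key, the array form raises the runtime error
" SAPSQL_ARRAY_INSERT_DUPREC, while the single-record form merely sets
" sy-subrc = 4, so the duplicate can be detected and logged.
DATA: lt_rows TYPE STANDARD TABLE OF zoh_demo,
      ls_row  TYPE zoh_demo.

" Array insert: dumps if lt_rows contains a key already in the table.
INSERT zoh_demo FROM TABLE lt_rows.

" Single-record insert: duplicates are handled instead of dumping.
LOOP AT lt_rows INTO ls_row.
  INSERT zoh_demo FROM ls_row.
  IF sy-subrc <> 0.
    " Duplicate key - this is where the monitor message (RSDMD 196)
    " described in the note would be logged.
  ENDIF.
ENDLOOP.
```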
    3. Parallel loading of time-dependent attributes or texts
    The extractors used for loading time-dependent attributes or texts cannot guarantee that all data records assigned to a certain key value are contained together in only one data package. For this reason, to avoid SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199, you should not load time-dependent attributes and texts in parallel. Data packages are processed serially if the 'PSA only' and 'Update Subsequently in Data Targets' settings under 'Processing' are used in the InfoPackage.
    4. Delta update of an ODS object in a characteristic
    To avoid SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199 during the delta update from an ODS object to a characteristic, you must ensure that the data basis used for the update to the characteristic consists exclusively of ODS loading requests that were activated together in one step only. Note 666213 describes in detail how you can ensure this.
    5. Simultaneous loading of attributes and transaction data
    If the runtime error occurs because another load task creates SID values or master data for the same characteristic at the same time when attributes are loaded, you can only solve the problem by rescheduling. The load task in question must be scheduled in such a way that you can exclude simultaneous processing.
    Regards
    CSM reddy

  • Error while assigning constant to infoobject in open hub transformation

While assigning a constant (in the rule details) to an InfoObject in a transformation of an open hub, I am getting the error "The Object Name is not allowed to be empty".
Can anyone tell me why this is happening? What should I do now?

    Hi,
In the transformation, have you connected that rule to a target field of your destination? If not, do that and then try to create the routine.
    Regards,
    Vaibhav

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
We are using open hub to download transaction files from InfoCubes to the application server, and we would like the filename to be dynamic based on period and year, i.e. the period and year of the transaction data being downloaded.
I understand we could use a logical file for this purpose.  However, we are not sure how to derive the period and year dynamically in the filename.
I have read a number of posted messages on a similar topic in SDN, and many have suggested a 'How-to' paper titled "How to Extract data with Open Hub to a Logical Filename".  However, I cannot seem to get the document from the link given.
I wonder if anyone has the correct or latest link to the document, or I would appreciate it if you could share the document with everyone on SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
After creating the open hub, press F1 in the 'Application server file name' text box. From the help window, click 'Maintain client-independent file names and file paths'; you will be taken to the Implementation Guide screen. Then:
1. Click 'Cross-client maintenance of file names'.
2. Create a logical file path via 'New Entries'.
3. Go to the logical file name definition and enter your logical file name, the physical file (your file name followed by month or year, whatever is applicable; press F1 for more info), the data format (ASC), the application area (BW), and the logical path (choose via F4 the path you created in step 2).
4. Go to 'Assignment of physical path to logical path' and enter the syntax group; the physical path is the path you gave in the logical file name definition.
We created a logical file name that identifies the file by system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help will be of much use to you. All the steps above will help you create a dynamic logical file.
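For reference, a logical file name maintained in transaction FILE can be resolved in ABAP with the standard function module FILE_GET_NAME; ZOH_MATERIAL and the parameter values here are hypothetical, and the physical file name behind it is assumed to contain the placeholders <PARAM_1> and <PARAM_2> for period and year:

```abap
" Resolve a logical file name (maintained in transaction FILE) into a
" physical path. PARAMETER_1/PARAMETER_2 fill the <PARAM_1>/<PARAM_2>
" placeholders of the physical file name definition.
DATA lv_fname TYPE filename-fileextern.

CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'ZOH_MATERIAL'   " hypothetical logical file name
    parameter_1      = '001'            " period
    parameter_2      = '2009'           " fiscal year
  IMPORTING
    file_name        = lv_fname
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.
IF sy-subrc <> 0.
  MESSAGE 'Logical file name could not be resolved' TYPE 'E'.
ENDIF.
```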
    hope this helps u to some extent.
    Regards

  • UNCAUGHT EXCEPTION while executing Open Hub

    Hi All,
I have created an Open Hub destination in BI 7.0 on a custom cube. I am trying to create a file on the application server using this open hub. I have also created a logical filename, etc.
When I ran the DTP for this open hub, I got the following error:
Runtime Errors UNCAUGHT_EXCEPTION
Except. CX_RSB_WRITE_ERROR
This was running fine on the DEV and QA systems but gives this error in PRD.
Any clue about this error?
    Thanks in advance.

    Hello,
It seems the logical path for the logical filename is not correct. Could you please check it in transaction FILE?
    Kind Regards,
    Paula Csete

  • Extract Data with OPEN HUB to a Logical Filename

    Hi Experts,
Can anybody send me the link for the how-to guide "Extract Data with OPEN HUB to a Logical Filename"?
    Thanks in advance.
    BWUser

    Hi,
Check these links:
    http://searchcrm.techtarget.com/generic/0,295582,sid21_gci1224995,00.html
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e698aa90-0201-0010-7982-b498e02af76b
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1570a990-0201-0010-1280-bcc9c10c99ee
Hope this helps.
    Regards,
    shikha
