Data transport mechanism

A customer has asked if I can describe our "data transport mechanism(s)" for moving data between their databases. I have not been able to get more detail on what they are looking for, other than that TCP is too low level.
We connect from one Oracle database to remote databases using database links for Oracle databases and Heterogeneous Services (ODBC or OLEDB) for other databases.
Can anybody help with a reference to Oracle's mechanism, or even some other "data transport mechanisms" so I can search for something at the correct level?
Thanks in advance,
David

I would think a brief description of Oracle Net and ODBC would do the job, along with a diagram showing the Oracle Net (database link) database-to-database connection and the ODBC connection.
Oracle has white papers on Heterogeneous Services (also called generic connectivity) available on MetaLink.
HTH -- Mark D Powell --
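
For the write-up, a minimal sketch of the database-link side might help; the link name, TNS alias, and table names below are made up for illustration:

-- Create a link from the local database to the remote one (names are illustrative).
CREATE DATABASE LINK remote_sales
  CONNECT TO app_user IDENTIFIED BY app_password
  USING 'remote_db';

-- Query across the link; Oracle Net carries the rows over the network.
SELECT customer_id, order_total
FROM   orders@remote_sales
WHERE  order_date >= DATE '2010-01-01';

-- Move data between the databases in a single statement.
INSERT INTO local_orders_copy
SELECT * FROM orders@remote_sales;

A link defined through Heterogeneous Services / generic connectivity to a non-Oracle database is queried the same way, so the database-link picture covers both cases.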

Similar Messages

  • Transport Mechanism in XI

    What is the transport mechanism in XI?
    How do I transport from the Development system to the Production system?
    Is there any way other than Import or Export?
    Thanks in advance,
    Jeevan.

    When you migrate the Integration Server and the Integration Builder, you install a new IS/IB
    3.0 on additional hardware. After completing the Customizing on IS 3.0, you use XI tools to
    migrate XI configuration and runtime data from the IS 2.0 to the IS 3.0 system. Finally, you
    have to switch the addressing for all application systems, adapters, and SLD parts from IS
    2.0 to IS 3.0.
    You can prepare most steps in setting up Release 3.0 without interrupting the operative use
    of Release 2.0. Therefore, the XI downtime is kept to a minimum.
    The basics of the migration procedure are as follows:
    1. Installation and Customizing of Integration Server and Integration Builder (XI 3.0) on a
    standby host machine (all-in-one)
    2. Migration of SLD data from Release 2.0 to 3.0.
    3. Migration of XI Integration Builder data from Release 2.0 to 3.0.
    4. Cache refresh.
    5. XI downtime starts, which disables the input stream into IS 2.0 / IS 3.0.
    6. Migration of essential runtime data.
    7. XI downtime ends, which enables the input stream into IS 3.0 again.
    8. Address switching of adapters and application systems from IS 2.0 to IS 3.0.
    Migration starts with the upgrade of the DEV landscape (including the DEV SLD) to Release
    3.0
    After upgrading the DEV landscape, there is no way to transport configuration data
    from the DEV 3.0 landscape to the QA 2.0 or PRD 2.0 landscapes.
    We recommend that you perform all required transports from DEV to QA or
    PRD before the migration procedure is started for the whole landscape.
    • If you need to make configuration changes after the DEV landscape migration to DEV
    3.0, for example, for emergency corrections, you need to make the configuration
    changes separately and in parallel for DEV 3.0 as well as for QA 2.0 and PRD 2.0.
    The transport mechanisms for the SLD and Integration Builder are described in the Online
    Help.
    All application systems in the DEV landscape must switch their addressing to the new
    Integration Server 3.0 of the DEV landscape. For more information, see the relevant sections
    below.
    You can find detailed information at this link:
    http://help.sap.com/bp_bpmv130/Documentation/Upgrade/XI_Upgrade_Guide.pdf
    Regards
    Naim

  • Difference between SXPG_COMMAND_EXECUTE and Open data set mechanism

    Can you please help me understand the difference between moving a file using the function module "SXPG_COMMAND_EXECUTE" and moving a file using the "open dataset - transfer - close dataset" mechanism?

    Through 'SXPG_COMMAND_EXECUTE' you can execute an external command (i.e. UNIX, Windows, OS/400) from ABAP,
    for example to convert an application server file or move a file from one directory to another.
    The command must be defined in SM69.
    Using this function module 'SXPG_COMMAND_EXECUTE', you can check the authorization of a user to execute a particular external command and run the command:
    With the arguments specified in ADDITIONAL_PARAMETERS
    On the target host system defined by OPERATINGSYSTEM and TARGETSYSTEM
    If an SAP profile parameter is inserted in the portion of the command stored in the database, then the value of this parameter is substituted into the command. If an SAP application server is active on the target system (TARGETSYSTEM), then the profile parameter values are read from the profile that is active on that system. No parameter substitution is made in ADDITIONAL_PARAMETERS.
    After substitution, the command is checked for the presence of "dangerous" characters such as the semicolon ( ; ) on UNIX systems.
    If an additional "security function module" is specified in the command definition, this function module is also called in the course of processing. This function module can prevent execution of the command.
    If the authorization checks complete successfully, the command is run on the target host system.
    Through OPEN DATASET you can read and write a file in a given directory on the application server.
    Thanks,
    SD

  • Changing Character set in SAP BODS Data Transport

    Hi Experts,
    I am facing issue in extracting data from SAP.
    Job details: I am using an ABAP data Flow which fetches the data from SAP and loads into Oracle table using Data Transport.
    Its giving me below error while executing my job:
    (12.2) 05-06-11 11:54:30 (W) (3884:2944) FIL-080102: |Data flow DF_SAP_EXTRACT_QMMA|Transform R3_QMMA_EXTRACT__AL_ReadFileMT_Process
                                                         End of file was found without reading a complete row for file <D:/DataService/SAP/Local/Z_R3_QMMA>. The expected number of
                                                         columns was <30> while the number of columns actually read was <10>. Please check the input file for errors or verify the
                                                         schema specification for the file format. The number of rows processed was <8870>.
    Reason: when I analyzed this, I found that it is caused by special characters in the data. When the data file is generated in the SAP working directory on the SAP application server, the SAP code page is 1100, so characters it cannot represent are written as #, the same character used as the file delimiter. Once the ABAP is executed and the data is read from the file, the # in the data is treated as a delimiter, which throws the above error.
    I tried to replace the special characters in the ABAP data flow, but the ABAP data flow does not support the replace_substr function. I also tried changing the Code Page value to UTF-8 in the SAP datastore properties, but this didn't work either.
    Please let me know what needs to be done to resolve this issue. Is there any way to change the character set while reading from the generated data file in BODS, to convert code page 1100 to UTF-8?
    Thanks in advance.
    Regards,
    Sudheer.

    Unfortunately, I am no longer working on this particular project/problem. What I did discover though, is that /127 actually refers to character <control>+<backspace>. (http://en.wikipedia.org/wiki/Delete_character)
    In SAP this and any other unknown characters get converted to #.
    The conclusion I came to at the time, was that these characters made their way into the actual data and was causing the issue. In fact I think it is still causing the issue, since no one takes responsibility for changing the records, even after being told exactly which records need to be updated ;-)
    I think I did try to make the changes on the above mentioned file, but without success.

  • New Data Transport Routine

    Hi,
    I am in the process of writing a new data transport routine (RV60C702) for creating billing documents (Maintain Copying Control for Billing Documents). I was informed that the new routine's name (RV60C702) should be added to the include 'RV60CNNN', and also that while transporting the new routine (RV60C702) to the Production system, I should add the program 'RV80HGEN' to a new transport, which should be run in the Production system to include the new routine in the include 'RV60CNNN'.
    Could anyone please let me know how I should add the program 'RV80HGEN' to a new transport so that it runs in the Production system?
    Could you please provide me some details on it.
    Thanks
    Priya

    Hi Naren,
    Thanks for the quick help!
    One more basic question: can I add this program to the same transport and task that contains the new routine, or should it be added to a new transport?
    Thanks..
    Priya

  • WPC Transport Mechanism

    We have currently 40 users to add content in the Web Page Composer (WPC) tool, but we are preparing for 100+ users when we start converting our existing Intranet content. We are interested in what your experience has been and what may be some lessons learned while using the WPC.
    What kind of transport mechanism do you have in place (CTS - manual or automated, ICE)? We would like to allow our users to push their content to PRD when they have completed development and testing.
    Any ideas or pointers would be greatly appreciated. Thanks!

    Simone-
    In our experience, it is best to have content admins for the intranet sites create their content directly in production. Make sure they are given permissions only to create and modify content on WPC sites.
    It is just content in production, which means they can't break anything else. Creating content in Dev and transporting it is not advisable, because users create content all the time and want it to show up immediately. It is not worth the time spent on transports.
    E.g. if a user changes one link, you have to transport it through all systems, which is much easier to just change in prod.
    Just control the permissions for users and give them content admin for that website.
    Please follow the link for more info on transporting.
    http://help.sap.com/saphelp_nw70/helpdata/en/47/276706548a2deae10000000a1553f7/frameset.htm
    Raj

  • Data transport at PAI for EXIT-COMMAND

    Hi,
    I'm using a module AT EXIT-COMMAND in my dynpro flow logic. As far as I know, data transport does not occur when a function code of type 'E' is triggered, except for the OK code. But I need all the dynpro fields to be transported even for 'E' type functions, since I have implemented the 'Back', 'Exit', and 'Cancel' buttons as 'E' type functions, and I need to offer the user the chance to save the data before exiting. Thus, I need the dynpro fields to populate the ABAP fields so I can save them. Is there any way to do this?
    Thanks in advance.
    Best regards,
    Tomas

    Hi Tomas,
    You can write your own field-validation logic in the PAI flow logic:
    process after input.
      chain.
*       << write your own validation logic here
      endchain.
    You can also put your own logic in a module handled AT EXIT-COMMAND:
    process after input.
      module user_command_0200 at exit-command.

    module user_command_0200 input.
      case ok_code.
*------Leave screen
        when 'BACK'.
          leave to screen 100.
*------Leave transaction
        when 'EXIT'.
          leave program.
        when others.
*         << write your own logic
      endcase.
    endmodule.
    Or you can put the logic in the normal PAI module instead:
    process after input.
      module user_command_0200.

    module user_command_0200 input.
      case ok_code.
*------Leave screen
        when 'BACK'.
          leave to screen 100.
*------Leave transaction
        when 'EXIT'.
          leave program.
        when others.
*         << write your own logic
      endcase.
    endmodule.
    Regards,
    Prabhudas

  • The loading date, transportation date and planned date on the STO are the same as Delivery dates.

    Hi,
    We have created an intra-plant STO (plant to plant in the same company code). The planned delivery time is 4 days, so if I create an STO today (27th March), the delivery date is 31st March (excluding Saturday and Sunday).
    During delivery creation, the loading date, transportation date and planned date also come out as 31st March. The route determination for the given shipping point and route shows 2 days for transit and 4 hours for pick/pack. There is no loading time maintained for the given combination.
    Can you please advise why the dates come out the same as the delivery date during delivery creation, without considering the durations maintained in route determination?
    Thanks!!!
    Anuja

    Hi,
    This is standard behavior of SAP in the case of a stock transfer PO.
    If you want the delivery date of the receiving plant to be transferred to the supplying plant, please use the BAdI MD_STOCK_TRANSFER.
    Hope this helps.
    Regards,
    Prashant

  • Transport Mechanism in MDM

    Hi,
    Can you please tell me how transport mechanism works for MDM from development to Testing to Production?
    (Scenario is that all the three repositories are on  different boxes and three MDM systems are maintained in SLD)

    Hi Jaydeep,
    Are you using SP03 or SP04? To my knowledge, SP03 is not built for this kind of scenario.
    For SP04 it is possible to use the transport utility. Please go through the following blog on MDM's new transport utility for reference: /people/savitri.sharma/blog/2006/08/03/increased-functionality-and-ease-of-use-the-ins-outs-of-mdm146s-new-transport-utility
    Thanks & Regards,
    Ronak Gajjar

  • Data transport encryption  between Database - Apps Server

    Hi
    We have a 10g R2 database on Linux
    and would like to set up basic encryption for data transport between the database and the apps server.
    On searching, I found a couple of encryption parameters to be placed in SQLNET.ORA, which I am not sure of.
    Could anyone suggest how I should do this setup?
    Do I also need to make changes on the apps server side?

    Check the Advanced Security Guide, specifically
    http://download-east.oracle.com/docs/cd/B19306_01/network.102/b14268/asopart2.htm#sthref141
    Note that Advanced Security is a separate licensed option.
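    For reference, a minimal server-side sqlnet.ora sketch for native network encryption (part of Advanced Security) might look like the lines below; treat the exact values as an assumption to be checked against the guide above, and note that the apps server (client) side only needs matching SQLNET.ENCRYPTION_CLIENT settings if its defaults are not acceptable:
    # sqlnet.ora on the database server (illustrative values)
    SQLNET.ENCRYPTION_SERVER = REQUIRED
    SQLNET.ENCRYPTION_TYPES_SERVER = (AES256, AES192, AES128)
    SQLNET.CRYPTO_CHECKSUM_SERVER = REQUIRED
    SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA1)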

  • Looking for a better data-transfer mechanism

    We are currently developing a rich client desktop application.
    The client can talk to back-end server via the web service.
    The web service is responsible for querying data from the back-end database (oracle).
    And for function-like requests from clients, the web service simply submit a job to the oracle db.
    The business logic resides in oracle stored procedure.
    And the data is in the same database.
    When the client requests data, the query might retrieve 1,000 records out of, say, 1,000,000.
    What we do now is make multiple web service calls, each time for 100 records (to keep the response data package small enough),
    until no more can be retrieved.
    In detail, we make all the queries appended with an order by clause,
    and we send over the starting index along with the number of records retrieving,
    and use JDBC result set (in the web service method) to re-position and get the needed data,
    then use OracleCachedRowSet to hold the data and then send back to the client.
    The problem here seems significant.
    Multiple web service calls are being wasted.
    Each web service call makes the db run the query (inefficiently) again, and wastes a whole bunch of data.
    In this querying scenario,
    we do not want to separate the data into different pages (like a table in a traditional web app),
    and we want to know the metadata for the query result (our client application has to make use of the metadata).
    Due to our inexperience, we cannot figure out an efficient mechanism for this querying scenario.
    We thought socket programming might work for us, since once it is opened the connection always remains established,
    and then we would not have to waste multiple db queries.
    But sockets cannot bypass the firewalls, and we would lose many of the benefits introduced by the web service.
    Is there an efficient way of doing this in the web service world?
    Please enlighten us.
    Any suggestion / criticism welcomed.
    Thanks in advance.
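
    One option worth considering before changing the transport layer is to push the windowing into the query itself, so each web service call returns only its slice instead of the service scrolling a JDBC result set to the starting index. A rough Oracle SQL sketch of the classic ROWNUM pagination idiom (table and column names are invented for illustration):

    -- Return rows :start_row+1 .. :end_row of the ordered result in one round trip.
    SELECT order_id, customer_id, order_total
    FROM (
        SELECT t.*, ROWNUM rn
        FROM (
            SELECT order_id, customer_id, order_total
            FROM   orders
            ORDER BY order_id
        ) t
        WHERE ROWNUM <= :end_row
    )
    WHERE rn > :start_row;

    With the ROWNUM filter on the inner query, Oracle can usually treat this as a top-N query rather than materializing the full result, and result metadata or a total count can be fetched once in a separate, cheaper call.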


  • FTP not working through SAP R/3 data transport

    Hi,
    Environment:
    Software: BO Data Services 3.2
    DI server OS: AIX
    Source data from: SAP (has BODI 11.5.2 programs)
    After recently installing BO Data Services 3.2 on the AIX box, I am able to run a dataflow with an R/3 component. This component reads a Z table from SAP and creates a .dat file on SAP in a particular path. The R/3 transport process should copy the file over from SAP to the AIX server, but it is failing to do so. The error is:
    (12.2) 08-12-10 14:12:52 (E) (360872:0001) FIL-052008: |Data flow DF_ZSETUP_STG
    FTP could not transfer file <../qsawork/ZSetup.dat> from host <161.145.174.13>: <>. Please ensure that the FTP relative path for the SAP server working directory is set correctly such that the data file is accessible to the FTP program. Also, ensure that the target path, where the transferred file is written to, is correct.
    I am able to manually follow the ftp process and able to:
    1. Login from AIX server to SAP for ftp
    2. change directory on SAP to the relative path
    3. Get file from SAP to unix in the target folder as set in the configuration setting
    While running the job, the file gets created on SAP but fails to FTP. Please advise on what we could be missing.
    Best regards,
    Manjunath Balur

    Hi,
    I was able to resolve this myself by setting the following parameters in the DSConfig.txt file in the $LINK_DIR/bin path:
    USE_UNIX_FTP_COMMAND=TRUE
    USE_UNIX_PASSIVE_FTP=FALSE
    Hope this is helpful.
    Manjunath Balur

  • EC-PCA FI DATA TRANSPORT ?

    Hi everyone!
      How can I transport to PCA some FI documents whose debit and credit are posted to customer or vendor accounts?

    Hi Zhou,
    There are user exits that you can use to change the profit center depending on your criteria.
    You can flag your FI documents, when you are using FB70 for an A/R invoice, with whatever criteria you want to mark the transaction. Once you have flagged the field RF048-BLCHR, you can use another user exit that is used in tcode F.5D: search for the flag in field BLGCHR, in addition to other criteria, and then change your profit center to the xyz value.
    This should also be applicable to your FI A/P vendor postings; you can flag these entries as well.
    Check the following user exits: EXIT_SAPLF048_001 and EXIT_SAPLF048_002.
    You will have to perform a lot of regression testing because of the integration points when posting from different modules (MM, SD, Inventory, etc.). You only want to affect entries that were entered manually in FI.
    Not sure whether this will help; it is just another way of posting FI to PCA.

  • What is the transport mechanism in OWB?

    Dear all,
    We will be using OWB to control the Transportable tablespace process from a "Reporting Mirror" of the E-Business Suite to a Data Warehouse.
    The underlying partitions we will be moving are stored on ASM.
    Am I correct in thinking that OWB, behind the scenes, will be using the DBMS_FILE_TRANSFER utility to copy the partitions from source to target?
    (ftp does appear to be a clear option within OWB but we don't want to use this)
    Many thanks,
    Barry
    Barry Andersen | Technical Architect | +44.7917.265217
    Oracle Technical Architecture Consulting
    Oracle Parkway, Thames Valley Park, Reading. RG6 1RA

    You have an old-school option as well:
    1. Connect as SYSDBA with the CONNECT / AS SYSDBA command.
    2. Shut down the database instance with the SHUTDOWN command.
    3. Rename and/or move the datafiles at the operating system level.
    4. Start the Oracle database in mount state with the STARTUP MOUNT command.
    5. Modify the name or location of the datafiles in the Oracle data dictionary using the following command syntax:
       ALTER DATABASE RENAME FILE '<fully qualified path to original data file name>' TO '<new or original fully qualified path to new or original data file name>';
    6. Open the Oracle database instance completely with the ALTER DATABASE OPEN command.
    If the datafiles that need to be changed or moved do not belong to the SYSTEM tablespace, and do not contain active rollback segments or temporary segments, there is another workaround that does not require the database instance to be shut down. Instead, only the particular tablespace that contains the data files is taken offline.
    1. Log in to SQL*Plus.
    2. Connect as SYSDBA with the CONNECT / AS SYSDBA command.
    3. Take the affected tablespace offline with the ALTER TABLESPACE <tablespace name> OFFLINE; command.
    4. Modify the name or location of the datafiles in the Oracle data dictionary using the following command syntax:
       ALTER TABLESPACE <tablespace name> RENAME DATAFILE '<fully qualified path to original data file name>' TO '<new or original fully qualified path to new or original data file name>';
    5. Bring the tablespace online again with the ALTER TABLESPACE <tablespace name> ONLINE; command.
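
    On the DBMS_FILE_TRANSFER part of the question: whether OWB calls it behind the scenes is best confirmed in the OWB documentation, but for reference a server-side copy with that package looks roughly like the block below, assuming directory objects SRC_DIR and DST_DIR (made-up names) have already been created and point at the source and target locations (file system or ASM):

    -- Copy a datafile between two directory objects on the same instance (sketch).
    BEGIN
      DBMS_FILE_TRANSFER.COPY_FILE(
        source_directory_object      => 'SRC_DIR',
        source_file_name             => 'users01.dbf',
        destination_directory_object => 'DST_DIR',
        destination_file_name        => 'users01.dbf');
    END;
    /

    For copying between databases, the same package also provides PUT_FILE and GET_FILE, which work over a database link.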

  • Master data transport steps

    Hi expert,
    What are the steps to transport master data in BI 7? I have transported the relevant R/3 data source and then replicated the data source in BW. Then I started collecting the BW InfoObjects for the master data. When collecting the BW InfoObjects for transport, do I need to include the data source, transfer rules and InfoPackages? If so, why do I need to collect the data source again in the BW system when it has already been replicated from the source system?
    Thank in advance for any advise.
    Sharon

    Hi,
    The procedure for loading master data in 7.0 is:
    1. Create / activate your data source for the InfoObject you want to load - for example the data source for attributes, texts or hierarchies - in SAP R/3
    2. Replicate the data source in SAP BW
    3. Create a data transformation (similar to update and transfer rules) between the data source and the InfoObject
    4. Create a DTP (data transfer process) to maintain the data flow. Here you have to specify that data comes from the data source and goes to the InfoObject.
    5. Create an InfoPackage on the data source and load data - bear in mind that this will load data only into the PSA table
    6. Final step - start your DTP; this will load data from the PSA to the InfoObject.
    For the majority of master data, there will be data sources in Business Content; you need to install Business Content. This will install all the data sources that you require - Customer, Material etc. - plus all the standard config - DSOs, cubes etc.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/80/1a66d5e07211d2acb80000e829fbfe/frameset.htm
    Please check the following threads for DTP:
    DTP EXPLANATION
    Delta DTP request in Yellow color
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fbd598481e1a61e10000000a422035/frameset.htm
    Hope this helps,
    Regards
    CSM Reddy
