Variables in physical schema of File data server

I want to avoid hard-coding the folder name where source files are kept, as it can vary from environment to environment (e.g. dev to production).
Can you please suggest the best way to implement this?
How can we store it in a parameter table, and how can it be fetched in the production environment?
Please advise.
Thanks in advance.
Regards,
Dinesh.

Hi Dinesh,
I believe you can substitute a variable in the physical schema as well, but you cannot test it by clicking the Test button there.
This will work only when you have declared and refreshed the variable at the beginning of a package; the physical path will then be resolved at run time.
Hope that clears it up.
The same thing works for the password field of any Oracle data server as well.
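For example, the refresh query for such a variable could look like this (the parameter table and its columns here are made up for illustration; use your own naming):

-- Hypothetical parameter table holding environment-specific settings
CREATE TABLE odi_params (
    param_name  VARCHAR2(100) PRIMARY KEY,
    param_value VARCHAR2(4000)
);
INSERT INTO odi_params VALUES ('SRC_FILE_DIR', '/data/prod/inbound');

-- Refresh query of a global variable (say GLOBAL.V_SRC_FILE_DIR); set the
-- physical schema directory to #GLOBAL.V_SRC_FILE_DIR and refresh the
-- variable as the first step of the package
SELECT param_value FROM odi_params WHERE param_name = 'SRC_FILE_DIR';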
Thanks.
http://bhabaniranjan.com/

Similar Messages

  • FILE DATA SERVER

    Hi,
    I am trying to define a Data Server and Physical Schema of type File that writes to a server different from the ODI server. I have been able to successfully build the Data Server and Physical Schema on the ODI server itself.
    But after building the Data Server, Physical Schema, Logical Schema, Model and Interface, my execution fails at the very first step.
    I think it has to do with the Physical Schema. Any suggestions?
    The objective is to write the target flat files to a server other than the ODI server.
    Thanks

    Hello,
    Here is the error message I get:
    ODI-1228: Task FDM (Integration) fails on the target FILE connection FDM
    Caused By: java.sql.SQLException: ODI-40433: File \\Server\Path/FDM.TXT was not found
    I defined the correct server, user and password on the Data Server.
    I defined the correct path on the Physical Schema.
    I defined the correct Logical Schema on the Model.
    So, kind of lost...

  • File Data server error

    Hi, I have used the following steps to create a File Data Server:
    1. Connect to Topology Manager
    2. Select Topology > Physical Architecture > Technologies > File from the tree
    3. Right click and select Insert Data Server
    4. Fill in the following fields in the Definition tab:
    • Name: Name of the Data Server as it will appear in Oracle Data Integrator
    • User/Password: Not used.
    5. Fill in the following fields in the JDBC tab, according to the driver used:
    • JDBC Driver: com.sunopsis.jdbc.driver.file.FileDriver
    • JDBC URL: jdbc:snps:dbfile
    6. Click Test
    7. Click Test in the Test Connection window
    But I am getting an error:
    http://img30.imageshack.us/img30/5173/test3i.jpg
    Can you please tell me what I am missing? I am new to ODI.

    Hi,
    It's just a typing error. You have mentioned the JDBC driver correctly here as com.sunopsis.jdbc.driver.file.FileDriver, but looking carefully at the image you sent, you have entered the driver incorrectly: the word "driver" is missing from your JDBC driver class name.
    Please correct it and it will work.

  • Not able to select physical schema directory for File data server in ODI 11g

    Hi,
    I am a beginner with ODI and am stuck on an error while doing a tutorial (mentioned in this link - http://st-curriculum.oracle.com/obe/fmw/odi/odi_11g/ODIproject_ff-to-ff/ODIproject_flatfile-to-flatfile.htm).
    While creating a physical schema for the default file server (FILE_GENERIC), I am not able to select the schema directories, and the Name field with value 'FILE_GENERIC.Directory' is grayed out (not editable).
    I have gone through many documents but could not find any relevant information on this.
    So could you please let me know whether any configuration is required for this?
    Thanks,
    Anusha

    Hi Oleg,
    Thanks for your reply.
    While creating the physical schema, the Name field is grayed out. Is that the default behaviour of the screen? In the tutorial I could see the Name field pointing to a file directory path.
    Thanks,
    Anusha

  • Integrating Flat File data to LDAP Directory using sunopsis driver

    Hello
    I need to import data from a CSV file into an LDAP directory.
    To achieve this, I used the demo physical and logical File data server (called FILE_GENERIC) and set up a new LDAP data server using the tutorial "Oracle Data Integrator Driver for LDAP - User's Manual".
    I can manually see and update data on both the file and LDAP datastores.
    The problem is that I cannot manage to import/update data from the file to the LDAP directory through a dedicated interface.
    The issue, I think, comes from the PK/FK used by the Sunopsis relational model to represent the directory.
    The LDAP DN is represented by a set of two tables, representing in my example the organizational units on one hand and the persons on the other, with an FK in the persons table pointing to an auto-generated PK in the organizational units table. My person table also has an auto-generated PK. All the directory datastore tables have been reverse-engineered through ODI.
    In my interface, I always use the cn as the update key.
    I first tried not mapping the person PK in the interface, letting the driver generate it for me (or mapping a null PK). I then catch in Operator a message like: "null : java.sql.SQLException: Try to insert null into a non-nullable column".
    Nevertheless, the first row is created in the directory and a new PK is given in the ODI datastore. Curiously, this is not, as I would presume, the last PK value + 1.
    There are gaps in the ID sequences.
    I even tried checking the "tolerated error" option on the IKM step called "Insert new row". I'm using an IKM shipped with ODI: "IKM SQL Incremental Update". The sequence finishes in Operator, but due, I guess, to the caught error, the other rows are not processed. (Anyway, I shouldn't have to tolerate errors.)
    I then tried putting unused custom PK values into my file and mapping the PK column to the LDAP datastore PK column, without much success: only one row is processed. Furthermore, the PK id in the datastore is different from the one I put in the file.
    I finally tried generating PK values through SQL instructions by creating new steps in the IKM module, but that did not work either.
    I really do not see any other way to either have the driver construct a new PK at insert/update or make it ignore the null PK problem and process all the rows.
    If anyone has an idea about this, please share...
    Greetings,
    Adrien

    Hi,
    I am facing an issue that is probably the same.
    Using ODI 10.1.3.5, I can't insert new rows into my OpenLDAP.
    One of the points I see is that the execution takes the LDAP server as the staging area and wants to create the I$ table in it, so the data is already imported into the LDAP server.
    Thanks for any help.

  • Default Data Server for Hypersonic SQL in ODI 11g

    Hi All,
    I installed ODI 11g 11.1.1.6.0 for Win 7 with 10g DB R2 and WLS 10.3.5 successfully without any issues.
    Created the DEV_ODI_REPO repository schema, also without any issues.
    Created the Master Repository snpm1 and the Work Repository snpw1.
    Created the agents (physical and logical) and tested them successfully.
    Created the data servers for the respective technologies:
    1) Technology Oracle -
        Data Server - ORACLE_ORCL_LOCAL 
        Physical Schemas - SALES_DEV & SALES_PROD and
        Logical Schema as ORACLE_ORCL_LOCAL_SALES
    2) Technology FILE -
        Data Server - FILE_GENERIC
        Physical Schema - FILE_GENERIC.E:/oracle/Middleware/Oracle_ODI1/oracledi/demo/file
        Logical Schema - FILE_DEMO_SRC
    3) Technology - XML
         Data Server - XML FILE
         Physical Schema - GEO_D
        Logical Schema - GEO_DIM
    4) Technology - Hypersonic SQL
         Data Server - HSQL   (Is this the correct name, or is some other name recommended?)
         Physical Schema - HSQL_LOCALHOST_20001_Default and HSQL_LOCALHOST_20001_Default
        Logical Schema - HSQL_DEMO_SRC and HSQL_DEMO_TARG
    All the Mappings are done properly using Development & Production context for Physical and Logical Schemas.
    I am able to create the Models for Oracle, File and XML with successful reverse-engineering (Standard, with the Development context),
    with all the data displayed correctly.
    The issue starts here:
    The main issue is with Hypersonic SQL, where I am not able to get reverse-engineering to work.
    For technology Hypersonic SQL I have defined the data server as HSQL.
    Is this the correct data server name, or is some other name suggested?
    I am able to run startdemo.bat successfully, opening the 3 DOS prompt windows:
    a) Oracle DI Demo - Repository Server
    b) Oracle DI Demo - Source Server
    c) Oracle DI Demo - Target Server
    Later I made the entries on the JDBC tab of the HSQL data server as:
    JDBC Driver : org.hsqldb.jdbc.JDBCDriver
    JDBC Url : jdbc:hsqldb:hsql://localhost
    Test Connection - OK, with a successful connection.
    Prior to this I had already run the script that created the schemas and inserted data into the SRC_ tables in my 10g database:
    ORDERS - containing the tables SRC_CITY, SRC_CUSTOMER, SRC_PRODUCT, SRC_REGION.....
    In the Designer I created a new Model, HSQL_SRC:
    Technology - Hypersonic SQL
    Logical Schema - HSQL_DEMO_SRC, as defined earlier.
    Reverse Engineer tab - selected Standard; context selected as Development from the drop-down list.
    When I right-click on the HSQL_SRC model to reverse-engineer, it executes; however, it does not display any of the inserted tables, such as those in the ORDERS schema.
    Please correct me where I am wrong: in the name I gave the data server, or somewhere else?
    Thanks
    Ajaz

    Hi,
    According to the docs
    http://docs.oracle.com/cd/E23943_01/integrate.1111/e12644/hypersonic_sql.htm#ODIKM590
    I see:
    JDBC Driver: org.hsqldb.jdbcDriver
    JDBC URL: jdbc:hsqldb:hsql://<host>:<port>
    Could you modify your settings
    JDBC Driver : org.hsqldb.jdbc.JDBCDriver
    JDBC Url : jdbc:hsqldb:hsql://localhost
    according to the docs?
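    For instance (the port here is only a guess, based on your physical schema names, which mention 20001):
    JDBC Driver: org.hsqldb.jdbcDriver
    JDBC URL: jdbc:hsqldb:hsql://localhost:20001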
    Thanks

  • Logical schema can have one or more physical schemas

    Hi Experts,
    Can a logical schema in ODI have one or more physical schemas? If so, in what case would we have that?
    In the same way, can one physical schema have multiple logical schemas, and if so, in what case?
    Please give me a brief explanation.
    Thx,
    Sahadeva.

    Hi Sahadeva,
    1) Yes. You can map it through different contexts. The goal is to use the same logical schema to point to a different physical schema for each environment (Dev, Test, Prod, ...).
    So let's say you have the schema SH_DEV on database 1 for Dev and the schema SH on database 2 for Prod:
    - Create one SH logical schema
    - Create two data servers: one for database1 and one for database2
    - Create the physical schemas SH_DEV under dataserver1 and SH under dataserver2
    - Create two contexts: Dev and Prod
    - Map logical SH to dataserver1.SH_DEV through context Dev
    - Map logical SH to dataserver2.SH through context Prod
    2) Technically yes, through different contexts, though I don't see a use case for it.
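    To make (1) concrete, here is roughly what ODI would generate at run time for an interface targeting a table of the logical schema SH (the CUSTOMERS/STG_CUSTOMERS tables are invented for the example):

    -- Same interface, executed in context Dev (SH -> dataserver1.SH_DEV):
    INSERT INTO SH_DEV.CUSTOMERS (CUST_ID, CUST_NAME)
    SELECT CUST_ID, CUST_NAME FROM SH_DEV.STG_CUSTOMERS;
    -- Executed in context Prod (SH -> dataserver2.SH):
    INSERT INTO SH.CUSTOMERS (CUST_ID, CUST_NAME)
    SELECT CUST_ID, CUST_NAME FROM SH.STG_CUSTOMERS;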
    Hope it helps.
    Regards,
    JeromeFr

  • Can we create logical/ physical schema by running any script (unix/window)?

    I wish to create logical/physical schemas in Oracle Data Integrator
    by running a Windows or (preferably) Unix script.
    Is there any method or script provided in ODI for this?

    I think you could insert the info into the Master Repository tables:
    Connect info: snp_connect
    Logical schemas: snp_lschema
    Physical schemas: snp_pschema
    Or you may write some SQL to do this.
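    If you go that route, inspect the tables first; their layout is internal and version-specific, so treat direct inserts as unsupported. For example:

    -- Connect to the master repository schema and look at the structures
    -- before writing anything:
    SELECT * FROM snp_connect;   -- data server definitions
    SELECT * FROM snp_lschema;   -- logical schemas
    SELECT * FROM snp_pschema;   -- physical schemas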

  • Dynamic Creation of Physical Data Server / Agent cache Refresh

    Scenario:
    I have a requirement to load data from an XML source into an Oracle DB. The XML source changes at run time, but the XSD of the XML stays the same (so I don't have to change the logical data server, models, mappings, interfaces and scenarios; only the physical data server changes at runtime). I have created all the ODI artifacts using ODI Studio in my work repository, and I am using the ODI SDK to create the physical data server for the changed XML data source and then invoking the agent programmatically.
    Problem:
    The data is loaded from the XML source to the Oracle DB the first time, but it does not work from the second run onwards. If I restart the agent, it works again for one more run. I think the agent maintains some sort of cache of the physical data server details from the first run, so whenever I change the data server something goes wrong, leading to the following exception. So I want to know whether there is any mechanism to handle dynamic data servers, or any way of clearing the agent cache, if such a cache exists.
    Caused By: org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 41, in <module>
    AttributeError: 'NoneType' object has no attribute 'createStatement'
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:346)
         at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2458)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
         at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:540)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1596)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:582)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
         at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1070)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$1.run(DefaultAgentTaskExecutor.java:50)
         at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor.executeAgentTask(DefaultAgentTaskExecutor.java:41)
         at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doExecuteAgentTask(TaskExecutorAgentRequestProcessor.java:93)
         at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.process(TaskExecutorAgentRequestProcessor.java:83)
         at oracle.odi.runtime.agent.support.DefaultRuntimeAgent.execute(DefaultRuntimeAgent.java:68)
         at oracle.odi.runtime.agent.servlet.AgentServlet.processRequest(AgentServlet.java:445)
         at oracle.odi.runtime.agent.servlet.AgentServlet.doPost(AgentServlet.java:394)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:821)
         at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:503)
         at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:389)
         at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
         at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181)
         at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
         at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:417)
         at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
         at org.mortbay.jetty.Server.handle(Server.java:326)
         at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:534)
         at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:879)
         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:747)
         at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
         at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
         at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
         at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:520)

    Hi,
    If you want to load multiple files (with the same structure) through one connection, then in Topology create M.XSD for an M.XML file.
    Create three directories:
    RAW -- contains the files under their original names.
    PRO -- processing area where files are moved one by one and renamed to M.XML.
    OUT -- once a file's data has been loaded into the tables, move M.XML from PRO to OUT.
    Go to odiexperts to see how to create the loop.
    Use OdiFileMove (to move and rename/mask) to move A.XML from RAW to PRO and rename it to M.XML.
    Use OdiFileMove to move M.XML to the OUT folder and then rename it back to A.XML.
    Use variables to store the file names and refresh them.
    'NoneType' object has no attribute 'createStatement': it seems the structure of your file is different and you are trying to load different files into the same schema. If the structure is the same, then use the procedure command "SYNCHRONIZE ALL" after every load...
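    For example, the last step of each loop iteration could be an ODI procedure step, executed against the XML logical schema (technology XML), containing only the driver command below, so the driver re-reads the file into its relational schema before the next load:

    SYNCHRONIZE ALL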
    Edited by: neeraj_singh on Feb 16, 2012 4:47 AM

  • Using XML File as a Data Server

    OK - here's my scenario:
    - I have an XML file in which I hold parameters (e.g. the log file folder)
    - I have set the XML file up as a data server (which test-connects OK) and created physical and logical schemas in Topology Manager
    - In Designer, I have created a data model for this file
    What I am trying to do is populate/refresh a global variable with a value from the XML file; however, when I make a change to the value in the XML file, the value does not get passed into the variable.
    The global variable is set to refresh from the XML data source, and in a package I am using a variable step set to refresh, but still the variable does not pick up the current values in the file.
    A couple of other things: if I go to the data model and view the data, it shows the value that is in the XML file, but the variable does not get the same value; it seems to be using some kind of cached value. Also, if I close any open ODI apps (Designer, Operator, etc.) and re-open them, the values (sometimes) seem to reflect what is in the file.
    Crux of the post: am I missing something so obvious that I cannot see it, or has anyone else experienced this kind of issue?
    Thanks,
    Gee

    OK - I got this working, but to do so I followed some info from another post and changed the structure of the XML.
    Previously, it was structured as:
    <params>
    <param>
    <name>Test1</name>
    <value>Some Value</value>
    </param>
    <param>
    <name>Test2</name>
    <value>Some Value2</value>
    </param>
    </params>
    Now I have changed it to:
    <params>
    <param name="Test1" value="Some Value" />
    <param name="Test2" value="Some Value2" />
    </params>
    I also added the dod=true property to the JDBC URL.
    Now, one of these changes fixed my issue (of which I am very glad), but I don't know which one :-) .... and I currently don't have the time to find out.
    I may investigate this further in the future (and post accordingly).
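    For anyone hitting the same problem, here is a sketch of how the pieces fit together (the table and column names depend on how the driver reverse-engineers the file, so treat them as illustrative):

    -- Data server JDBC URL with dod=true (drop-on-disconnect, forces a re-read of the file):
    --   jdbc:snps:xml?f=d:\params.xml&s=PARAMS&dod=true
    -- Variable refresh query, assuming the attribute-style <param> elements were
    -- reversed into a PARAM table with NAME and VALUE columns:
    SELECT VALUE FROM PARAM WHERE NAME = 'Test1'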
    Thanks,
    G

  • Procedure not selecting SQL Server database specified in Physical Schema

    Hi all,
    I'm still green with ODI and couldn't find an answer to this after searching through the forum. I have 10.1.3.6.2 installed right now.
    I have my Physical Architecture set up with Microsoft SQL Server, Essbase, and some others. The SQL Server connection is JDBC, using integrated security (I don't think integrated security is supported, but I got it to work, although I had to hack up the 10.1.3.6 upgrade script to get the patch to work...).
    Under the Physical Architecture I have several data servers. Right now the only fields I specify are the Database (Catalog) and a context; everything else is standard (even Owner (Schema) is just <Undefined> right now, but changing it doesn't seem to make a difference).
    So, I have my job all designed in Designer and can run it, and everything works just fine, except one little snag: the database in SQL Server doesn't seem to get selected automatically (by virtue of being specified in the physical schema). However, if I put a step into the job so that it executes "USE database_name", then everything works.
    Normally I could live with this, but the databases are named differently on different servers, so I'd prefer to just get it to work from the physical schema instead of jumping through hoops with variables.
    Also, while I thought about specifying the database in the JDBC URL (databaseName=X), that doesn't work either (and it's probably poor form anyway).
    So... shouldn't specifying the database/catalog in the physical schema make it the default, so that SQL queries run against it?
    Thanks for any help/insight, as always,
    Jason

    Jason,
    Using the database option in the URL is the right way to do it; the URL I use is:
    jdbc:sqlserver://myserver:1433;database=ODIM;selectMethod=cursor
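    If you want to double-check which catalog a session actually selected, a quick test you can run from a procedure step against that data server (standard T-SQL):

    -- Returns the database the current session is using
    SELECT DB_NAME() AS current_db;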
    Craig

  • Unable to create a data server for xml file

    Hi,
    I have created a data server for an XML file, but when I test the connection I get the error below.
    java.sql.SQLException: class org.xml.sax.SAXException
    Prefix not found: xsi
    at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter.doGetConnection(LoginTimeoutDatasourceAdapter.java:133)
    at oracle.odi.jdbc.datasource.LoginTimeoutDatasourceAdapter.getConnection(LoginTimeoutDatasourceAdapter.java:62)
    at com.sunopsis.sql.SnpsConnection.testConnection(SnpsConnection.java:1100)
    at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.getLocalConnect(SnpsDialogTestConnet.java:371)
    at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.localConnect(SnpsDialogTestConnet.java:794)
    at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.jButtonTest_ActionPerformed(SnpsDialogTestConnet.java:754)
    at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.connEtoC1(SnpsDialogTestConnet.java:137)
    at com.sunopsis.graphical.dialog.SnpsDialogTestConnet.access$1(SnpsDialogTestConnet.java:133)
    at com.sunopsis.graphical.dialog.SnpsDialogTestConnet$IvjEventHandler.actionPerformed(SnpsDialogTestConnet.java:87)
    at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1995)
    at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2318)
    at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:387)
    at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:242)
    at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:236)
    at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:272)
    at java.awt.Component.processMouseEvent(Component.java:6263)
    at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
    at java.awt.Component.processEvent(Component.java:6028)
    at java.awt.Container.processEvent(Container.java:2041)
    at java.awt.Component.dispatchEventImpl(Component.java:4630)
    at java.awt.Container.dispatchEventImpl(Container.java:2099)
    at java.awt.Component.dispatchEvent(Component.java:4460)
    at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4574)
    at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4238)
    at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4168)
    at java.awt.Container.dispatchEventImpl(Container.java:2085)
    at java.awt.Window.dispatchEventImpl(Window.java:2478)
    at java.awt.Component.dispatchEvent(Component.java:4460)
    at java.awt.EventQueue.dispatchEvent(EventQueue.java:599)
    at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
    at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
    at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
    at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
    at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
    at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
    Can anyone help me out with this issue?
    Thanks in advance,
    Balajitk

    Hi Balaji,
    Below is a sample XML file which I have saved as test.xml and placed on the D: drive.
    Test.xml:
    <employee>
    <emp>
    <eno>1</eno>
    <ename>Test1</ename>
    </emp>
    <emp>
    <eno>2</eno>
    <ename>Test2</ename>
    </emp>
    <emp>
    <eno>2</eno>
    <ename>Test2</ename>
    </emp>
    </employee>
    Go to Topology > expand Technologies > right-click on XML and select New Data Server.
    In the Definition tab, provide the data server name.
    In the JDBC tab, provide the details below:
    JDBC Driver: com.sunopsis.jdbc.driver.xml.SnpsXmlDriver
    JDBC URL: jdbc:snps:xml?f=d:\test.xml&d=test.dtd&re=employee&s=TEST&ro=false
    Save it.
    Click Test Connection and then Test in the Test Connection window; it should report a successful connection.
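    Once the connection tests OK, the driver exposes the file as relational tables, so with the sample above you should be able to run something like the following (the exact table and column names come from the driver's reverse-engineering, so this is only indicative):

    SELECT ENO, ENAME FROM EMP;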
    Hope this helps.
    Regards,
    Phanikanth

  • Change values in File physical schema during runtime?

    Is there any way to change the Directory in the physical schema at runtime?
    This is needed so that we can dynamically pick up files from different locations.
    Any help is highly appreciated.

    Hi,
    Yes, there is:
    Point your physical schema at a parent directory and, in the Resource Name of the file datastore, use a variable before the name of the file.
    Compose the variable with the "child" directory; I like to build the path at execution time.
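    For instance (the variable, table and file names here are only an illustration):

    -- File datastore Resource Name, composed with a variable:
    --   #GLOBAL.V_CHILD_DIR/source_file.csv
    -- Refresh query for GLOBAL.V_CHILD_DIR, returning the child directory
    -- relative to the parent directory configured in the physical schema:
    SELECT child_dir FROM file_locations WHERE source_system = 'BILLING';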
    Does it help you?

  • Grants for physical schema and data-servers

    Hi,
    I'd like to know:
    What grants are needed for the owners of each physical schema?
    For example: DDL grants (e.g. drop table) and DML grants (select / update / delete / insert).
    And what grants are needed for the data servers' connection users?
    Bovolini

    It depends on what technology you plan to use.
    If you plan to use Oracle: is the data server connection user different from the owner of the physical schemas?
    In addition to granting connect and resource to each of these users, you will also need to give the data server connection user privileges on the objects of the physical schema owners.
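    As a rough Oracle-flavoured sketch (user and table names are invented, and the exact privilege set depends on the knowledge modules you use, since ODI also needs to create its C$/I$/E$ work tables in the staging schema):

    GRANT CONNECT, RESOURCE TO odi_conn;
    -- DML on the objects owned by the physical schema owner:
    GRANT SELECT, INSERT, UPDATE, DELETE ON app_owner.customers TO odi_conn;
    -- If the KMs must create and drop work tables outside odi_conn's own
    -- schema, broader DDL privileges may be needed:
    GRANT CREATE ANY TABLE, DROP ANY TABLE TO odi_conn;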

  • Physical Path in File Adapter for Windows Server

    Hi there,
    I am trying to read a file from a directory on a Windows server and write it to a different directory on the same server.
    In the file adapter I gave the physical path the value below:
    "\\10.xx.34.xxx\Share\input"
    However, I am getting the following message in the debug output:
    "Value specified for input Physical/Logical Directory is not a directory or is not readable."
    Hence my question: what should the physical path value be for a Windows server?
    Thanks in advance,
    Krishna
    Edited by: user452458 on Jul 19, 2010 11:30 AM

    Thanks for your reply again.
    Actually, they are not on the same file system.
    I have successfully used the FTP Adapter to connect to a UNIX server.
    However, is it possible to use the FTP Adapter for a Windows server? If so, how should the corresponding JNDI be defined?
    I am getting the error below when I use the FTP Adapter for a Windows server:
    Unable to send file to server. [Caused by: A remote host refused an attempted connect operation.]
    ; nested exception is:
         ORABPEL-11429
    Error sending file to FTP Server.
    Unable to send file to server. [Caused by: A remote host refused an attempted connect operation.]
    Please ensure 1. Specified remote output Dir has write permission 2. Output filename has not exceeded the max chararters allowed by the OS and 3. Remote File System has enough space.
    Please advise.
    Thanks
    Krishna
