Anybody used OWB to populate a Netezza warehouse?

Hi,
We are currently using OWB to populate an Oracle warehouse.
We are now considering switching to a Netezza warehouse.
Does anybody have any experience or thoughts on using OWB to populate Netezza as the target?
Many Thanks

Thanks for the advice.
We had been expecting this - we do make heavy use of the cube and dimension operators.
We are currently exploring possibilities for new hardware, and Netezza was suggested as a possible target warehouse, which would give us the option of porting our ETL relatively simply by just changing the target database from Oracle to Netezza.
At this stage we are looking at the advantages/disadvantages of the various hardware platforms (Netezza, Oracle Exadata, etc.), particularly an integrated hardware/software approach.
Do you have any information outlining the pros/cons of Netezza vs. Exadata?

Similar Messages

  • Can I create a BI Beans compliant cube using OWB?

    Can I create a cube that I can browse using BI Beans through OWB 9.0.4, or are there additional steps that I need to take using other tools such as Enterprise Manager?
    Are there any known incompatibilities between OWB 9.0.4 and BI Beans 9.03.1?
    I will also pose this question in the BI Beans forum.
    Thanks for any replies.
    Cor

    Hi,
    I am trying to build an analytic workspace using the OWB (9.0.4.8) Transfer Bridge and I get a similar error.
    None of the view/MV SQL statements were generated, and the analytic workspace was not created either.
    FYI.
    **! Transfer logging started at Wed May 14 18:07:41 EDT 2003 !**
    OWB Bridge processed arguments
    Default local= en_US
    Exporting project:OM_SAMPLE
    initializing project:OM_SAMPLE
    Initializing module :WH
    Exporting cube:SALES
    Exporting dimension:CHANNELS
    Exporting dimension:COUNTRIES
    Exporting dimension:CUSTOMERS
    Exporting dimension:PRODUCTS
    Exporting mappings
    Exporting table:CHANNELS
    Exporting table:COUNTRIES
    Exporting table:CUSTOMERS
    Exporting table:PRODUCTS
    Exporting table:SALES
    Exporting datatypes
    Exporting project OM_SAMPLE complete.
    setting parameter: olapimp.deploytoaw = Y
    setting parameter: olapimp.awname = OWBTARDEMO
    setting parameter: olapimp.awobjprefix = OWBTAR_
    setting parameter: olapimp.awuser =
    setting parameter: olapimp.createviews = Y
    setting parameter: olapimp.viewprefix = OWBTAR_
    setting parameter: olapimp.viewaccesstype = OLAP
    setting parameter: olapimp.creatematviews = Y
    setting parameter: olapimp.viewscriptdir = /opt/oracle
    setting parameter: olapimp.deploy = N
    setting parameter: olapimp.username = OLAPSYS
    setting parameter: olapimp.password = manager
    setting parameter: olapimp.host = 10.215.79.139
    setting parameter: olapimp.port = 1521
    setting parameter: olapimp.sid = INDEXDB
    setting parameter: olapimp.inputfilename = C:\TEMP\bridges\null-nullMy_Metadata_Transfer1052950061353.XMI
    setting parameter: olapimp.outputfilename = C:\Panneer\owbtardemo.sql
    Loading Metadata
    Loading XMI input file
    processing dim: CHANNELS
    processing level: CHANNELin dimension CHANNELS
    processing level attribute use: CHL_ID in level CHANNEL for level attribute ID
    processing level attribute : ID in level CHANNEL
    processing level attribute use: CHL_LLABEL in level CHANNEL for level attribute LLABEL
    processing level attribute : LLABEL in level CHANNEL
    processing level attribute use: CHL_SLABEL in level CHANNEL for level attribute SLABEL
    processing level attribute : SLABEL in level CHANNEL
    processing level: CLASSin dimension CHANNELS
    processing level attribute use: CLS_ID in level CLASS for level attribute ID
    processing level attribute : ID in level CLASS
    processing level attribute use: CLS_LLABEL in level CLASS for level attribute LLABEL
    processing level attribute : LLABEL in level CLASS
    processing level attribute use: CLS_SLABEL in level CLASS for level attribute SLABEL
    processing level attribute : SLABEL in level CLASS
    processing hierarchy: CHANNEL_HIERARCHY in dimension CHANNELS
    processing dim: COUNTRIES
    processing level: REGIONin dimension COUNTRIES
    processing level attribute use: RGN_ID in level REGION for level attribute ID
    processing level attribute : ID in level REGION
    processing level attribute use: RGN_LLABEL in level REGION for level attribute LLABEL
    processing level attribute : LLABEL in level REGION
    processing level attribute use: RGN_SLABEL in level REGION for level attribute SLABEL
    processing level attribute : SLABEL in level REGION
    processing level: COUNTRYin dimension COUNTRIES
    processing level attribute use: CTY_ID in level COUNTRY for level attribute ID
    processing level attribute : ID in level COUNTRY
    processing level attribute use: CTY_LLABEL in level COUNTRY for level attribute LLABEL
    processing level attribute : LLABEL in level COUNTRY
    processing level attribute use: CTY_SLABEL in level COUNTRY for level attribute SLABEL
    processing level attribute : SLABEL in level COUNTRY
    processing hierarchy: COUNTRY_HIERARCHY in dimension COUNTRIES
    processing dim: CUSTOMERS
    processing level: CUSTOMERin dimension CUSTOMERS
    processing level attribute use: CTR_CREDIT_LIMIT in level CUSTOMER for level attribute CREDIT_LIMIT
    processing level attribute : CREDIT_LIMIT in level CUSTOMER
    processing level attribute use: CTR_EMAIL in level CUSTOMER for level attribute EMAIL
    processing level attribute : EMAIL in level CUSTOMER
    processing level attribute use: CTR_ID in level CUSTOMER for level attribute ID
    processing level attribute : ID in level CUSTOMER
    processing level attribute use: CTR_NAME in level CUSTOMER for level attribute NAME
    processing level attribute : NAME in level CUSTOMER
    processing dim: PRODUCTS
    processing level: PRODUCTin dimension PRODUCTS
    processing level attribute use: PDT_DESCRIPTION in level PRODUCT for level attribute DESCRIPTION
    processing level attribute : DESCRIPTION in level PRODUCT
    processing level attribute use: PDT_ID in level PRODUCT for level attribute ID
    processing level attribute : ID in level PRODUCT
    processing level attribute use: PDT_LIST_PRICE in level PRODUCT for level attribute LIST_PRICE
    processing level attribute : LIST_PRICE in level PRODUCT
    processing level attribute use: PDT_MIN_PRICE in level PRODUCT for level attribute MIN_PRICE
    processing level attribute : MIN_PRICE in level PRODUCT
    processing level attribute use: PDT_NAME in level PRODUCT for level attribute NAME
    processing level attribute : NAME in level PRODUCT
    processing level: CATEGORYin dimension PRODUCTS
    processing level attribute use: CTY_ID in level CATEGORY for level attribute ID
    processing level attribute : ID in level CATEGORY
    processing level attribute use: CTY_LLABEL in level CATEGORY for level attribute LLABEL
    processing level attribute : LLABEL in level CATEGORY
    processing level attribute use: CTY_SLABEL in level CATEGORY for level attribute SLABEL
    processing level attribute : SLABEL in level CATEGORY
    processing hierarchy: PRODUCT_HIERARCHY in dimension PRODUCTS
    processing cube: SALES
    processing classification type is := Warehouse Builder Business Area
    processing catalog name := SALESCOLLECTION ,and description is := null
    processing catalog entry element name := SALES
    processing Cube
    processing catalog entity cube := SALES
    processing measure := COSTS , in a cube := SALES
    processing measure := SALES , in a cube := SALES
    processing catalog entry element name := CHANNELS
    processing catalog entry element name := COUNTRIES
    processing catalog entry element name := CUSTOMERS
    processing catalog entry element name := PRODUCTS
    processing catalog entry element name := CHANNELS
    Class Name CHANNELS is TableImpl@405ffd not supported
    processing catalog entry element name := COUNTRIES
    Class Name COUNTRIES is TableImpl@5e1b8a not supported
    processing catalog entry element name := CUSTOMERS
    Class Name CUSTOMERS is TableImpl@6232b5 not supported
    processing catalog entry element name := PRODUCTS
    Class Name PRODUCTS is TableImpl@6f144c not supported
    processing catalog entry element name := SALES
    Class Name SALES is TableImpl@14013 not supported
    processing classification type is := Dimensional Attribute Descriptor
    Classification type Dimensional Attribute Descriptor is not supported
    closing output file
    closing log stream
    **! Transfer process 2 of 2 completed with status = 0 !**
    **! Transfer logging stopped at Wed May 14 18:07:47 EDT 2003 !**
    But when I run "select * from dba_registry", everything seems to be valid:
    CATALOG     Oracle9i Catalog Views     9.2.0.2.0     VALID     24-APR-2003 09:39:24     SYS     SYS     DBMS_REGISTRY_SYS.VALIDATE_CATALOG
    CATPROC     Oracle9i Packages and Types     9.2.0.2.0     VALID     24-APR-2003 09:39:24     SYS     SYS     DBMS_REGISTRY_SYS.VALIDATE_CATPROC
    OWM     Oracle Workspace Manager     9.2.0.1.0     VALID     24-APR-2003 09:39:27     SYS     WMSYS     OWM_VALIDATE
    JAVAVM     JServer JAVA Virtual Machine     9.2.0.2.0     VALID     23-APR-2003 22:19:09     SYS     SYS     [NULL]
    XML     Oracle XDK for Java     9.2.0.2.0     VALID     24-APR-2003 09:39:32     SYS     SYS     XMLVALIDATE
    CATJAVA     Oracle9i Java Packages     9.2.0.2.0     VALID     24-APR-2003 09:39:32     SYS     SYS     DBMS_REGISTRY_SYS.VALIDATE_CATJAVA
    ORDIM     Oracle interMedia     9.2.0.2.0     LOADED     23-APR-2003 23:16:42     SYS     SYS     [NULL]
    SDO     Spatial     9.2.0.2.0     LOADED     23-APR-2003 23:17:06     SYS     MDSYS     [NULL]
    CONTEXT     Oracle Text     9.2.0.2.0     VALID     23-APR-2003 23:17:26     SYS     SYS     [NULL]
    XDB     Oracle XML Database     9.2.0.2.0     VALID     24-APR-2003 09:39:39     SYS     XDB     DBMS_REGXDB.VALIDATEXDB
    WK     Oracle Ultra Search     9.2.0.2.0     VALID     24-APR-2003 09:39:42     SYS     WKSYS     WK_UTIL.VALID
    OLS     Oracle Label Security     9.2.0.2.0     VALID     24-APR-2003 09:39:43     SYS     LBACSYS     LBAC_UTL.VALIDATE
    ODM     Oracle Data Mining     9.2.0.1.0     LOADED     12-MAY-2002 17:59:03     SYS     ODM     [NULL]
    APS     OLAP Analytic Workspace     9.2.0.2.0     LOADED     23-APR-2003 22:49:51     SYS     SYS     [NULL]
    XOQ     Oracle OLAP API     9.2.0.2.0     LOADED     23-APR-2003 22:51:49     SYS     SYS     [NULL]
    AMD     OLAP Catalog     9.2.0.2.0     VALID     02-MAY-2003 15:00:13     SYS     OLAPSYS     CWM2_OLAP_INSTALLER.VALIDATE_CWM2_INSTALL
    Your help is appreciated!
    Thanks
    Panneer

  • Using OWB in MOC map I am getting error: java.lang.reflect.invoc

    Hi All,
    I am trying to deploy mappings using OWB for Oracle MOC.
    As part of the deployment, I first tried the following mapping:
    using MTH --> MTH_TARGET --> Mapping
    · List of deployable objects and category (EBS-Specific / Non-EBS-Specific)
    · EBS-specific maps: MTH_WO_OI_INIT_DMF_MAP
    I am getting the following error while opening the map:
    java.lang.reflect.InvocationTargetException
    Because of that, it is not deploying at all for any of the mappings.
    I also sometimes get an error that nested transactions are being carried out, but if I log out and log back in that error does not come up and the error mentioned above (java.lang.reflect.InvocationTargetException) appears instead.
    I will appreciate it if anybody has seen the same error or has a solution.
    Regards,
    Ashok.

    Your error seems to indicate that your embedded LDAP isn't in sync with the database. Did you delete files from your file system, clean up the portal DB, or use a copy of some existing database for your domain?

  • How to run a PLSQL mapping without using OWB Interface

    Hi there, does anybody know how to call a PL/SQL mapping's <mapname>.main function without using the OWB interface? We will have to create some other mechanism to call those generated mappings.
    Thanks in advance

    Hi Marcelo,
    You can use the following code at the SQL prompt to run the mapping:
    DECLARE
      RetVal NUMBER;
      P_ENV  WB_RT_MAPAUDIT.WB_RT_NAME_VALUES;  -- runtime name/value parameters (can be left empty)
    BEGIN
      RetVal := <Mapping Name>.MAIN(P_ENV);     -- call the generated mapping package
      COMMIT;
    END;
    /
    Regards,
    Shashin

  • Use OWB packages to deploy on another database

    Hello,
    Is it possible to use OWB to create the ETL packages and then copy the packages to another database (where OWB has not been set up) and execute them there, or must I set up OWB on the new database?
    thanks

    You may also benefit from this:
    http://www.oracle.com/technology/products/warehouse/htdocs/OWBExchange.html?cat=ALP&submit=search
    "Headless Deployment of Generated Code" (10gR2, May 2009):
    Generate OWB code to allow for deployment to, and execution on, database hosts where OWB is not installed.
    It is also described here:
    http://blogs.oracle.com/warehousebuilder/2009/05/headless_operation_owb_code_generation_going_it_alone.html
    You will at least need a repository in the target, but you do not need to install anything on the file system, and no Control Center Service is needed there.
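    To make that concrete, here is a minimal sketch of running a mapping on such a host, assuming a generated mapping package MY_MAP (a hypothetical name) has already been deployed into the target schema:
    DECLARE
      retval NUMBER;
      p_env  WB_RT_MAPAUDIT.WB_RT_NAME_VALUES;  -- runtime parameters, left empty here
    BEGIN
      retval := MY_MAP.MAIN(p_env);  -- MY_MAP is hypothetical; no Control Center Service is involved
      COMMIT;
    END;
    /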
    Regards,
    Erik

  • Anybody using 11g RAC Database with OIM 9.1.0.2?

    Hi,
    We are having poor performance problems with an 11g RAC database and OIM 9.1.0.2.
    It looks like the issue could be caused by the RAC cluster.
    Is anybody using an 11g RAC database with OIM 9.1.0.2?
    If so, which JDBC driver are you using? Also, which app server (WebLogic, OAS, etc.)?
    Appreciate your input.
    Regards
    Vijay Chinnasamy

    We are using WebLogic 10.3, and I am sorry to tell you that we have yet to move to production.
    We are planning to have the database on RAC, so I just thought of taking inputs from you.
    Can you tell me under what operations you face the poor performance issue?
    For us the maximum number of users in production is around 6k, so I am wondering how RAC availability will affect performance.
    Thanks for sharing your inputs.

  • How to load flat files using OWB?

    Hi all,
    I would like to access a flat file (.csv) using OWB. I am able to import the files into the source module of OWB, but while executing the mapping I got the following error:
    Starting Execution MAP_CSV_OWB
    Starting Task MAP_CSV_OWB
    SQL*Loader: Release 10.1.0.2.0 - Production on Fri Aug 11 16:34:22 2006
    Copyright (c) 1982, 2004, Oracle. All rights reserved.
    Control File: C:\OWBTraining\owb\temp\MAP_CSV_OWB.ctl
    Character Set WE8MSWIN1252 specified for all input.
    Data File: \\01hw075862\owbfiles\employee.csv
    File processing option string: "STR X'0A'"
    Bad File: C:\OWBTraining\owb\temp\employee.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 200 rows, maximum of 50000 bytes
    Continuation: none specified
    Path used: Conventional
    Table "OWNER_STG"."EMP_EXCEL", loaded from every logical record.
    Insert option in effect for this table: APPEND
    Column Name Position Len Term Encl Datatype
    "EMPNO" 1 * , CHARACTER
    "EMPNAME" NEXT * , CHARACTER
    value used for ROWS parameter changed from 200 to 96
    SQL*Loader-500: Unable to open file (\\01hw075862\owbfiles\employee.csv)
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: The system cannot find the file specified.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    Table "OWNER_STG"."EMP_EXCEL":
    0 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 49536 bytes(96 rows)
    Read buffer bytes: 65536
    Total logical records skipped: 0
    Total logical records read: 0
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on Fri Aug 11 16:34:22 2006
    Run ended on Fri Aug 11 16:34:22 2006
    Elapsed time was: 00:00:00.09
    CPU time was: 00:00:00.03
    RPE-01013: SQL Loader reported error condition, number 1.
    Completing Task MAP_CSV_OWB
    Completing Execution MAP_CSV_OWB
    Could you please help me?
    Thanks and regards,
    Gowtham Sen

    Thank you, my friends.
    As you said, I had given the wrong file name.
    It's solved. Thank you.
    I have another problem:
    How do I load data from an Excel file into OWB? Is it possible the same way we do it for flat files?
    I did it using ODBC + Heterogeneous Services (HS).
    But after creating a mapping and deploying it, I got the following error:
    "error occurred when looking up remote object <unspecified>.EmployeeRange@EXCEL_SID.US.ORACLE.COM@DEST_LOCATION_EXCEL_SOURCE_LOC
    ORA-00604: error occurred at recursive SQL level 1
    ORA-02019: connection description for remote database not found
    Could you please help me?
    Thanks and regards,
    Gowtham
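    ORA-02019 usually means the database link behind the ODBC/HS location cannot resolve its connect alias. Below is a minimal sketch of such a link, assuming a tnsnames.ora alias EXCEL_SID that points at the Heterogeneous Services (hsodbc) listener entry; the link name and credentials are hypothetical:
    -- Hypothetical names/credentials; the alias in USING must exist in
    -- tnsnames.ora with (HS = OK) and map to the ODBC DSN for the workbook.
    CREATE DATABASE LINK excel_link
      CONNECT TO "owb" IDENTIFIED BY "owb"
      USING 'EXCEL_SID';
    -- Sanity check before redeploying the mapping (EmployeeRange is the named
    -- range from the error message above):
    SELECT * FROM "EmployeeRange"@excel_link;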

  • Loading an XML file using OWB

    Hi gurus,
    I am new to OWB, and as per a requirement we need to load XML files into an Oracle table using OWB.
    Below is the XML file:
    <bookstore>
    <book category="COOKING">
    <title lang="en">Everyday Italian</title>
    <author>Giada De Laurentiis</author>
    <year>2005</year>
    <price>30.00</price>
    </book>
    <book category="CHILDREN">
    <title lang="en">Harry Potter</title>
    <author>J K. Rowling</author>
    <year>2005</year>
    <price>29.99</price>
    </book>
    <book category="WEB">
    <title lang="en">Learning XML</title>
    <author>Erik T. Ray</author>
    <year>2003</year>
    <price>39.95</price>
    </book>
    </bookstore>
    Please help me with loading the above XML file using OWB.

    You can leverage the XML SQL functions to extract from XML using the database; see the blog post below:
    https://blogs.oracle.com/warehousebuilder/entry/leveraging_xdb
    For example, to extract information from your XML document, the following SQL can be generated from OWB:
    select extractValue(value(s), '/book/author'),
           extractValue(value(s), '/book/year'),
           extractValue(value(s), '/book/price')
    from  (select XMLType('<bookstore>
                             <book category="COOKING">
                               <title lang="en">Everyday Italian</title>
                               <author>Giada De Laurentiis</author>
                               <year>2005</year>
                               <price>30.00</price>
                             </book>
                             <book category="CHILDREN">
                               <title lang="en">Harry Potter</title>
                               <author>J K. Rowling</author>
                               <year>2005</year>
                               <price>29.99</price>
                             </book>
                             <book category="WEB">
                               <title lang="en">Learning XML</title>
                               <author>Erik T. Ray</author>
                               <year>2003</year>
                               <price>39.95</price>
                             </book>
                           </bookstore>') adoc from dual) r,
          table(XMLSequence(extract(r.adoc, '/bookstore/book'))) s;
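    To land those rows in a table, the same query can feed an INSERT. A sketch under assumed names - BOOKS as the target table and BOOKS_STAGE(doc CLOB) standing in for however the XML file is staged:
    INSERT INTO books (title, author, year_published, price)
    SELECT extractValue(value(s), '/book/title'),
           extractValue(value(s), '/book/author'),
           to_number(extractValue(value(s), '/book/year')),
           to_number(extractValue(value(s), '/book/price'))
    FROM  (SELECT XMLType(b.doc) adoc FROM books_stage b) r,
          table(XMLSequence(extract(r.adoc, '/bookstore/book'))) s;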
    Cheers
    David

  • Is anybody using RDX Removable Disk Backup System

    I'm sorry, this is a new post about an old discussion, but I would like to start a fresh dialogue for today, 2011.
    Is anybody using the RDX removable disk backup system (with or without Time Machine)?
    Is it an affordable alternative to LTO tape, with proven compatibility with Mac OS?
    Is it possible to fit an RDX drive as a second optical unit inside a Mac Pro?
    Any happy RDX + Mac users so far?
    Thanks.

    I'm not using RDX on a Mac, but I do have one client for whom I'm using RDX on a Windows server. While RDX is somewhat cost effective compared to LTO/Ultrium, IMHO it probably wouldn't be the best for Time Machine. Time Machine expects the previous backup to be online, and if you switch cartridges, TM would have to create a new backup from scratch. Then if you switch back the cartridges, your TM bundles are out of sync. I'm not saying it's not possible, just that this is not the "normal" way TM works. So you need to understand what you would need to do if you were planning to switch cartridges. (If you weren't planning to switch cartridges, then a "normal" external or internal hard drive is more cost effective than RDX.)
    RDX cartridges are basically 2.5" SATA hard drives. The RDX "drive" is, for the most part, just a SATA to USB converter (or SATA to eSATA, or whatever interface you get). On the PC the advantage is that you don't have to "safely remove hardware" and you don't lose the drive letter. On the Mac, a "normal" external hard drive is almost as convenient as RDX, since there's no drive letter to worry about. The RDX's advantage on the Mac would be not having to plug/unplug a USB cable on the external units. I suppose you could get an internal eSATA RDX to put into a Mac; I don't know of anyone doing that.
    I did use my client's USB RDX drive on his Macbook, just to see what we could do with it.  While it worked, we didn't do extensive testing so I can't comment on long term satisfaction with OSX.  But so far it's been reliable with Windows for almost two years.  (Windows Server 2008 with Backup Exec 2010.)  However since RDX is basically just a hard drive, I've gotten the client to realize that in a year or two, we'll probably have to start replacing cartridges, assuming his data size doesn't grow beyond the RDX capacity.  This is based on the assumption that the RDX cartridges would have similar longevity to laptop hard drives, which have 3-5 year warranties.  We probably will get more lifespan than that, but better to have the client plan in the budget now, rather than have backups fail.

  • Loading data from SQL server 2000 to Oracle using OWB 10.2

    Hi All,
    I have to move data from SQL Server to Oracle using OWB. Any idea how to connect to SQL Server through the OWB Design Center console? There is no detail available in the documentation. The OWB documentation says that "OWB Integrator for Oracle DB & Apps 3.0" is available to connect to non-Oracle databases (including Sybase, Informix, ODBC).

    Hi,
    Yes, you can use ODBC and configure HSODBC on the database server to create a database link to the SQL Server. If your Oracle server is Microsoft-based you are ready; otherwise you have to buy an ODBC driver for Linux, Unix or whatever you use.
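    As an illustration, here is a minimal sketch of such a link, assuming an ODBC DSN and a tnsnames.ora alias both named SQLSRV that resolve to the hsodbc agent; all names and credentials are hypothetical:
    -- Hypothetical names/credentials; adjust to your DSN and listener setup.
    CREATE DATABASE LINK sqlsrv_link
      CONNECT TO "sa" IDENTIFIED BY "password"
      USING 'SQLSRV';
    -- Once the link resolves, OWB source modules (or plain SQL) can read from it:
    SELECT COUNT(*) FROM "Customers"@sqlsrv_link;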
    Ciao Stephan

  • Has anybody used the Microsoft JDBC 2.0 driver for SQL Server 2000?

    Hi,
    Has anybody used the JDBC 2.0 driver for SQL Server 2000 downloadable from the Microsoft website? When I try using it with WL 6.1 SP1 it says it can't load the driver. When I try viewing the class file from the jar file using the jar utility, it gives an "unknown zip format" error. Does anybody have a solution for this? If anybody has managed to work with this Microsoft driver, I would be grateful if they could provide me with a solution.
    Thanks
    Thomas

    Hello Thomas,
    You may want to download the driver again and reinstall it.
    Here's a sample XML tag in config.xml:
    <JDBCConnectionPool
      DriverName="com.microsoft.jdbc.sqlserver.SQLServerDriver"
      InitialCapacity="3" MaxCapacity="12" Name="MSpool"
      Password="{3DES}fUz1bxR0zDg=" Properties="user=uid"
      Targets="myserver"
      URL="jdbc:microsoft:sqlserver://mydbserver:1433"/>
    Ensure that you follow the instructions from Microsoft. To use the SQL Server 2000 driver you will need to have Install_dir/lib/msbase.jar and Install_dir/lib/msutil.jar in the CLASSPATH, in addition to Install_dir/lib/mssqlserver.jar.
    hth
    sree
    "Thomas" <[email protected]> wrote in message
    news:3c91ec0e$[email protected]..
    Hi,
    Has anybody used the JDBC 2.0 driver for sql server 2000 downloadable from
    the
    microsoft website?When I try using it with WL 6.1 sp1 it says it can't load
    the
    driver.I try viewing the class file from the jar file using the jar utility
    it
    gives an unknown Zip format error.Anybody has any solution for this ?If
    anybody
    has managed to work with this microsoft driver i will be grateful if they
    provide
    me with a solution.
    Thanks
    Thomas

  • BAPI_PO_CREATE1: use of POTEXTITEM in TABLES - can I use this to populate text?

    Hi,
    I need to know about the use of POTEXTITEM in the TABLES section of BAPI_PO_CREATE1 - can I use it to populate text at item level?
    My requirement is to populate the item text at item level in transaction ME23N.
    So how do I use it? Please suggest how to achieve this.
    In ME23N, under the Texts tab at item level, the bottommost section "Item text" is what I need to populate.
    Regards,
    Nishant

    Hi,
    In fact my price values were not so small - 2.15, 1.55, 2.45, etc., nothing less than 1.00. Nevertheless, you pointed me in the right direction - to search OSS for the answer. OSS Note 571860:
    https://websmp230.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/sapnotes/index2.htm?numm=571860
    helped me to find out what I had missed in my scenario: I didn't set ITEM-NET_PRICE at all, only via the condition table. Now, when I added the following:
    item-net_price = some_price. itemx-net_price = 'X'.
    item-po_price = '2'. itemx-po_price = 'X'.   " <-- in fact, PO is created even without this line
    all of my POs are created with correct prices from the input.
    Problem solved.
    Thanks!
    Ivaylo

  • How to implement "delete" on a table using OWB?

    Hi All,
    How do I implement a simple delete using OWB?
    Assumption:
    (1) Table abcd (id number, age number, amt number)
    (2) the_number is a variable
    (3) the_id is a variable
    I want to implement the following transformation in OWB:
    DELETE FROM ABCD WHERE AMT=0 AND number = the_number AND id = the_id ;
    Rgds,
    Deepak

    We implemented delete mappings and delete flows to be able to reverse a failed load. In my opinion this is a very sound and valid reason for deleting from a data warehouse. If the need is there, it could also be used for deleting old, superfluous data from the data warehouse.
    There are a few things to consider: closed records in Type II dimensions should be opened up again (post mapping).
    Test, test, test.
    It is indeed a bit tricky to realize, but it certainly works and is possible.
    The steps to take are as follows (a sketch of the resulting delete appears after the list):
    1) Create a new mapping.
    2) Drop the table you want to delete from onto the mapping twice: once as source, once as target.
    3) Map all fields from the source to their corresponding fields in the target, except the ones that determine the "where" clause (referred to as filter fields).
    4) Either create a select, or a mapping input parameter, that generates the filter values for your delete.
    5) Map the above step to the filter fields.
    6) Define a delete mapping by altering the target table properties as follows:
    6a) Loading Type => DELETE
    6b) Match by constraint => No constraints
    7) Set the properties of each field as follows:
    7a) Filter fields: Match column when deleting => Yes
    7b) Other fields: Match column when deleting => No
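    The net effect is a delete like the one below - a sketch only, not OWB's literal generated code, using the table and filter columns from the original question; THE_ID stands for the mapping input parameter from step 4 and its value here is hypothetical:
    DECLARE
      the_id NUMBER := 100;  -- hypothetical value supplied by the mapping input parameter
    BEGIN
      DELETE FROM abcd
       WHERE amt = 0
         AND id  = the_id;
      COMMIT;
    END;
    /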
    Hope this helps,
    Wilco

  • Anybody using Hypersonic SQL in JSP?

    Hello,
    Does anybody use Hypersonic SQL?
    I implemented a guestbook system with JSP that saves data in Hypersonic SQL. This system runs on www.mycgiserver.com. My JSP program has the thread-safe option set to false to make sure only one thread is reading and writing the database at a time. However, every once in a while, the database gets wiped out mysteriously.
    So I am just wondering: does this happen to anyone else? Is this a Hypersonic SQL bug, is it the server that goofs up, or did I do something wrong with the JSP settings? Any ideas or comments, please? Thanks!

    Hi there!
    I use Hypersonic locally and on mycgiserver.com. Luckily, I haven't encountered your problems so far. I use a similar approach: I have a bean that handles a singleton connection to the DB and queues up all requests. But so far I haven't had that much load on the app, so maybe I'll get back to you on this with the same problem... Anyone at mycgiserver who has had an idea?
    regards
    Markus

  • Is anybody using a WiSM with an FWSM on a CAT 6500 switch?

    Hi,
    Is anybody using a WiSM with an FWSM on a CAT 6500 switch?
    Are there any problems when they are used together?
    And how can I set them up to connect to each other?
    I have found a document about this on the Cisco website called "Integrating Cisco WiSM and Firewall Services Module".
    I have a question concerning it:
    Why do I have to create a VRF for them to communicate with each other?
    Please let me know.

    As far as the FWSM is concerned, you can have each of the wireless VLANs come into the same context of the FWSM and then just add those VLANs to the FWSM as separate VLANs.
