Best practices for ODI interfaces

I was wondering how everyone handles the errors that occur when running an interface with ODI.
Our scenario:
We have customer data that we want to load each night via ODI. The data is in a flat file, and a new file is provided each night.
We have come across an issue where a numeric field contained non-numeric data, so ODI created a bad file with the bad record in it, and an error file with the error message. We also had some defined constraints that forced records into the E$ table.
My question is: how does everyone handle looking for these errors? We would like them to be reported to just one place (an Oracle table), so that when the process runs we can look at that one table and then act on the issues. As shown above, ODI puts errors in two different places: loader/database errors in a flat file and user-defined constraint errors in the E$ tables.
Has anyone come across this issue before, and can you tell me how you handled the errors that occur, or what the best practices might be for handling them?
Thanks for any assistance.
Edited by: 832187 on Sep 29, 2011 1:18 PM

If only a few fields are affected by the conversion problem, you could try adding an ODI constraint for them, or you could modify the LKM to load the bad file into a table when one is present.
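For what it's worth, below is a minimal sketch of one way to consolidate both error sources into a single Oracle table at the end of the nightly load, for example as a final step in the ODI package. Every object name in it (ODI_ERR_DIR, CUSTOMER.BAD, LOAD_ERRORS, E$_CUSTOMER, CUST_ID, CUST_NAME) is an assumption for illustration, and the E$ column ERR_MESS follows the usual ODI error-table convention, so adjust everything to your repository and release.

    -- Sketch only: all object names are illustrative assumptions.
    -- 1) Expose the bad file written by the file-loading LKM as an external table.
    --    ODI_ERR_DIR is a directory object pointing at the folder where ODI writes
    --    the .BAD/.ERROR files; '|' is assumed never to occur in the data, so each
    --    whole line of the bad file lands in RAW_RECORD.
    CREATE TABLE customer_bad_ext (
      raw_record VARCHAR2(4000)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY odi_err_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        NOBADFILE NOLOGFILE
        FIELDS TERMINATED BY '|'
        MISSING FIELD VALUES ARE NULL
        (raw_record CHAR(4000))
      )
      LOCATION ('CUSTOMER.BAD')
    )
    REJECT LIMIT UNLIMITED;

    -- 2) After each run (e.g. as a final ODI procedure step), merge both error
    --    sources into one central audit table that operations can query.
    INSERT INTO load_errors (run_date, error_source, error_message, rejected_record)
    SELECT SYSDATE, 'BAD FILE', 'Rejected during load - see the matching .ERROR file', raw_record
    FROM   customer_bad_ext
    UNION ALL
    SELECT SYSDATE, 'E$ TABLE', err_mess, TO_CHAR(cust_id) || ',' || cust_name
    FROM   e$_customer;
    COMMIT;

With something like that in place, operations only ever has to check LOAD_ERRORS after the nightly run.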

Similar Messages

  • Best practices for customizing for performance

    Hello,
    I would like to know the list of best practices for customizing BPC NW 7.5 for performance.
    Best regards
    Bastien

    Hi,
    There are a few how-to guides on SDN which will give you a basic idea of script logic. Apart from this, you can refer to the help guide on help.sap.com.
    The templates might also affect performance. The number of EVDRE functions, the number of expansion dimensions, and the number of members on which the expansion takes place will all affect performance. Complex formatting in the template will also have an effect.
    Hope this helps.

  • Best practices for GATP by SAP

    Hi all,
    I am not able to download the best practices for GATP by SAP from http://help.sap.com/bp_scmv250/BBLibrary/HTML/ATP_EN_DE.htm. It seems the documents have been removed. Can someone who has already downloaded them share them with me?
    Also, can you provide working links for best practices for SNP and PP/DS?
    Thank you,
    Ram

    Hello Ram
    Please check this wiki page - it has good content and some useful links
    APO-GATP General Information - Supply Chain Management (SCM) - SCN Wiki
    and
    Find out more on RDS solution for GATP at : http://service.sap.com/rds-gatp
    If you search http://service.sap.com/bestpractices you will find documents about best practice in GATP. The GATP section of help.sap.com is a good resource to start with as well.
    You can also read the blog below, written by me:
    Global Available To Promise (GATP) Overview
    Hope this will help
    Thank you
    Satish Waghmare

  • ADF Faces & BC: Best practices for project layout

    Season greetings my fellow JDevelopers!
    Our software group has been working with ADF for around 5 years and through the years we have accumulated a good amount of knowledge working with JDeveloper and ADF. Much of our current application structure has been resurrected in the early days of JDeveloper 10 where there were more samples codes floating around then there were "best pratice" documentation. I understand this is a subjective topic and varies site to site, but I believe there is a set of common practices our group has started to identify as critical to streamlining a development process(reusable decorated ui components, modular common biz logic, team development with svn, continuous integration/build, etc..). One of our development goals is to minimize dependency between each engineer as everyone is responsible for both client and middle layer implementation without losing coding consistency. After speaking with a couple of the aces at the last openworld, I understand much of our anticipated architectural requirements are met with JDeveloper 11(with the introduction of templates, declarative components, bounded task flows, etc..) but due to time constraints on upcoming deliverables we are still about an year away before moving on with that new release. The following is a little bit about our group/application.
    JDeveloper version: 10.1.3.4
    Number of developers: 7
    Developer responsibilities: Build both Faces & BC code
    We have two applications currently in our production environments.
    1.A flavor of Steve Muench's dynamic jdbc credentials login module
    2.Core ADF Faces & BC application
    In our Core ADF Faces application, we have the following structure:
    OurApplication
         -OurApplicationLib (Common framework files)
         -OurApplicationModel (BC project)
              -src/org/ourapp/module1
              -src/org/ourapp/module2
         -OurApplicationView (Faces project)
              public_html/ourapp/module1
              public_html/ourapp/module2
              src/org/ourapp/backing/module1
              src/org/ourapp/backing/module2
              src/org/ourapp/pageDefs/
    Total Number of Application Modules: 15 (Including one RootApplicationModule which references module specific AMs)
    Total Number of View Objects: 171
    Total Number of Entities: 58
    Total Number of BC Files: 1734
    Total Number of JSPs: 246
    Total Number of pageDefs: 236
    Total Number of navigation cases in faces-config.xml: 127
    Total Number of application files: 4183
    Total application size: 180megs
    Are there any other ways to divide up this application? I.e., module-specific projects with separate faces-config files/data bindings? If so, how can these files be "hooked" together? A couple of the ACEs have recommended that we separate all the entity files into their own project, which makes sense. Also, we are looking into Maven builds, which should remove those pesky Model.jpr files that constantly get "touched". I would love to hear how other groups are organizing their applications, and anything else they would like to share as an ADF best practice.
    Cheers,
    Wes

    After discussions over the summer/autumn by members of the ADF Methodology Group I have published an ADF Coding Standards wiki page that people may find useful:
    [http://wiki.oracle.com/page/ADF+Coding+Standards]
    It's aimed at ADF 11g and is intended to be a living document - if you have comments or suggestions please post them to the ADF Methodology google group ( [http://groups.google.com/group/adf-methodology?hl=en] ).

  • Any best practices for secondary interface/IP

    Hello,
    I am working on migrating a firewall to ASA now. As far as I know, the ASA does not support secondary interface IP addresses.
    However, my existing firewall setup uses this method to bind different subnets to a single interface.
    Are there any best practices for migrating to the ASA environment?
    Thanks!

    Hi,
    This depends on your current environment, which we don't know about.
    As ASA firewalls cannot have secondary IP addresses on a single interface, the typical options would be to either:
    Move the gateway of these internal subnets (which need to be under the same interface) to an internal L3 switch or router, then configure a link network between that device and the ASA interface and route the subnets through that link subnet.
    Or, configure the subnets on different ASA interfaces (actual physical interfaces, or subinterfaces if using trunking) and separate those subnets into different VLANs on your switch network (or, if not using VLANs, simply onto different switches).
    I guess it would also be possible to have two separate physical ASA interfaces connected to the same switch network (VLAN) where the two subnets are used, and simply configure one gateway/subnet on one physical interface and the other on the other. I assume it could work, but I am really hesitant to even write this, as it is something I would not consider unless I were in some really urgent situation with no other options.
    - Jouni

  • Best layout for my interface?

    Hi,
    I'm trying to make a login interface for an authenticator I'm working on and I've tried a few layouts but I can't find one that works and looks similar on both Ubuntu Linux and Windows 7.
    Here's what I'm currently using:
    import java.awt.FlowLayout;
    import javax.swing.JButton;
    import javax.swing.JCheckBox;
    import javax.swing.JComponent;
    import javax.swing.JFrame;
    import javax.swing.JLabel;
    import javax.swing.JPanel;
    import javax.swing.JPasswordField;
    import javax.swing.JTextField;
    import javax.swing.SwingUtilities;
    public class Gui {
         private static JFrame frame = new JFrame();
         private JPanel loginPanel = new JPanel();
         private JTextField usernameField = new JTextField(10);
         private JPasswordField passwordField = new JPasswordField(10);
         private JCheckBox rememberMeCheckBox = new JCheckBox();
         private JButton loginButton = new JButton("Login");
         private JButton cancelButton = new JButton("Cancel");
         public static void main(String[] args) {
              SwingUtilities.invokeLater(new Runnable() {
                   public void run() {
                        frame.setSize(190, 160);
                        frame.setTitle("Login");
                        frame.setLocationRelativeTo(null);
                        frame.getContentPane().add(new Gui().initialize());
                        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                         frame.setVisible(true);
                    }
               });
          }
         public JComponent initialize() {
              usernameField = new JTextField(10);
              passwordField = new JPasswordField(10);
              rememberMeCheckBox = new JCheckBox();
              loginButton = new JButton("Login");
              cancelButton = new JButton("Cancel");
              loginPanel.setLayout(new FlowLayout());
              loginPanel.add(new JLabel("Username: "));
              loginPanel.add(usernameField);
              loginPanel.add(new JLabel("Password:  "));
              loginPanel.add(passwordField);
              loginPanel.add(rememberMeCheckBox);
              loginPanel.add(new JLabel("Remember my information"));
              loginPanel.add(loginButton);
              loginPanel.add(cancelButton);
               return loginPanel;
          }
     }
    Here's what I want it to look like (I know a lot of people won't look, but I'm hoping some will. All links are the same, just in case):
    http://i55.tinypic.com/4gsn5j.png
    http://yfrog.com/jygui0p
    http://img718.imageshack.us/img718/8819/gui0.png

    aeternaly wrote:
    Hi,
    I'm trying to make a login interface for an authenticator I'm working on and I've tried a few layouts...
    "The trick is" - to paraphrase Rob - to learn LayoutManagers. Core has several; they all have their specific behaviour, which is (mostly) well defined in their API docs. Plus there are several more - free! - open-source layout managers out there which are more powerful and at the same time easier to use than any of the core ones.
    Simply "trying out" stuff at random will not get you far on your way to mastering Swing ;-)
    Cheers
    Jeanette
    BTW: SwingX has a fully functional LoginPane plus authentication support ...

  • Best practices for RMAN backup management for many databases

    Dear all,
    We have many 10g databases (>40) hosted on multiple Windows servers, which are backed up using RMAN.
    A year ago, all backups were implemented through Windows Scheduled Tasks using some batch files.
    We have been busy (re)implementing/migrating these backups in Grid Control.
    I personally prefer to keep the backup management in Grid Control, but a colleague now wants to go back to the batch files.
    What I am looking for here is advice on managing RMAN backups for multiple databases: do you use Grid Control, a third-party backup management tool, or even a home-made solution?
    One of the discussion topics is the work involved if the central backup location changes.
    Well... any real-life advice on best practices/strategies for RMAN backup management for many databases will be appreciated!
    Thanks,
    Thierry

    Hi Thierry,
    Thierry H. wrote:
    Thanks for your reaction.
    So, I understand that you do not use Grid Control to manage the backups, and as a consequence you also have no "direct" overview of the job schedules.
    One of my concerns is also to avoid too many backups being started at the same time, to prevent network/storage overload. Such an overview is available in Grid Control's Jobs screen.
    And, based on your strategy, do you recreate a "one-time" Oracle scheduled job for every backup, or do your scripts create an Oracle job with a repeating schedule?
    You're very welcome!
    Well, Grid Control is not an option for us, since each customer is on a separate infrastructure, with their own licensing. I have no real way (in contrast to your situation) to have a centralized point of control, but on the other hand that means I don't have to consider network/storage congestion like you do.
    The script is run from a "permanent" job within DBMS_SCHEDULER, created like this:
    begin
      dbms_scheduler.create_job(
        job_name        => 'BACKUP',
        job_type        => 'EXECUTABLE',
        job_action      => '/home/oracle/scripts/rman_backup.sh',
        start_date      => trunc(sysdate)+1+7/48,
        repeat_interval => 'trunc(sysdate)+1+7/48',
        enabled         => true,
        auto_drop       => false,
        comments        => 'execute backup script at 03:30');
    end;
    /
    The "master script" then determines which backup level to use, based on the weekday from the OS. The actual job schedule (start date, run interval, etc.) is agreed with the customer's IT/IS department, to avoid congestion on the backup resources.
    I have no overview of the backup status, run times, etc., but I have made monitoring scripts that alert me if/when a backup either fails or runs for too long. This, together with scheduled disaster/recovery tests, lets me sleep rather well at night. ;-)
    I realize there might be better ways of doing backup scheduling in your environment, since my requirements are so completely different from yours, but I guess we all face the same challenge of unifying our environments as much as possible, to minimize the amount of manual work we have to do. :-)
    Good luck!
    //Johan
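    For reference, here is a minimal sketch of the kind of monitoring query Johan describes, assuming you poll V$RMAN_BACKUP_JOB_DETAILS from a scheduled job or an external script; the one-day window and the four-hour "runs too long" threshold are just illustrative values:
        -- Flag RMAN backup jobs from the last day that failed, completed with
        -- errors, or are still running after more than 4 hours (threshold assumed).
        SELECT session_key,
               input_type,
               status,
               start_time,
               ROUND((NVL(end_time, SYSDATE) - start_time) * 24, 1) AS hours_run
        FROM   v$rman_backup_job_details
        WHERE  start_time > SYSDATE - 1
        AND    (   status = 'FAILED'
                OR status LIKE '%WITH ERRORS'
                OR (status LIKE 'RUNNING%'
                    AND (SYSDATE - start_time) * 24 > 4));
    The rows it returns can then be mailed out by the calling script (or by UTL_MAIL) as the alert.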

  • Best practices for wide-aperture lenses (what are they?)

    Wide angle + wide aperture = rejected images in LPC.
    What's recommended?
    - Use a chart (same size) with bigger squares
    - Move less (right/left)
    - Combination of both (!)
    Wide aperture is where corrections are needed the most.
    I'm surprised that the LPC algorithm doesn't start with small-aperture images and then finish with the wider apertures.

    I believe there is a recommendation in the shooting guide that suggests using a checkerboard with larger squares when calibrating fisheye/circular fisheye lenses, because of the large amount of distortion at the image periphery. Specifying the correct (smallest) on-screen size of the checker squares becomes more important.

  • Need best practices advice

    Hey guys,
    Can anyone share with me the best practices for the setup of an Oracle database? I know that the amount of redo, the grouping, the file system layout, etc. depend on the size of your DB, so to help, here is the spec of mine:
    oradata: 200GB
    change rate: 50k/s (I got that by dividing the size of my archived redo logs by the amount of time between the first and last archive log).
    This is a standard database (not OLTP or data warehouse) used to store client information.
    My RPO (Recovery Point Objective) is 30 minutes.
    Some quick questions:
    1. How should I lay out the file system?
    2. How many redo logs/groups, and what size?
    3. How many control files, and where should I put them?
    4. How should I set up log switching?
    Any quick doc? I don't want to read a 300-page Oracle document :-) That's why I'm drawing on your knowledge.
    Thanks
    Edited by: Sabey on 9-Feb-2011 8:01 AM

    Sabey wrote:
    > OK, a bit more information.
    > Storage: SAN, RAID 5 disks only
    Since it's SAN, the RAID 5 (which is generically bad for performance in any update environment) will have minimal adverse effect (because the RAID 5 is hidden behind a massive cache). Just try to spread the data files across as many disks as possible.
    Oracle works best for datafiles on 'SAME' (Stripe and Mirror Everything). Spread the data files across all possible disks and mix data and index to try to get randomization.
    > No ASM
    Pity. A lot of potential transparency will be side-stepped.
    > OS: Solaris 10 on an M4000 (2 SPARC 2.1 GHz CPUs, 4 cores each), 16GB RAM
    Finally some meat. ;-)
    I assume Enterprise Edition, although for the size, the transaction rate proposed, and for the configuration, Standard Edition would likely be sufficient. Assuming you don't need EE-specific features.
    You don't talk about the other things that will be stealing CPU cycles from Oracle, such as the app itself or batch jobs. As a result, it's not easy to suggest an initial guess for memory sizing. App behaviour will dictate PGA sizing, which can be as important as SGA size - if not more so. For the bland description of the app you provide, I'd leave 2GB for the OS, subtract whatever else is required (app & batch, other stuff running on the machine) and split the remaining memory 50/50 between SGA and PGA until I had stats to change that.
    > Like I said, I expect a change rate of 50k/s; is there a rule of thumb for the size of the redo logs, the number of them, etc.? No bulk loads - data is entered by people from a user interface, no machine-generated data. Read queries for reports, but not a lot.
    Not too much to worry about then. I'd shoot for a minimum of 8 redo logs, mirrored by Oracle software to separate disks if at all possible, and size the log files to switch roughly every 15 minutes under typical load. From the looks of it, that would be (50k/s * 60 sec/min * 15 min), roughly 45M, so call it 50M - moderately tiny. And set ARCHIVE_LAG_TARGET to 15 minutes so you have a predictable switch frequency.
    > BTW, what about direct I/O? Should I mount all Oracle file systems in that mode to prevent use of the OS buffer cache?
    Again, this would be a non-issue with ASM, but... here is Tom Kyte's answer confirming direct I/O: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:4159251866796
    Your environment is very, very small in Oracle terms. Not too much to fuss over. Just make sure you have a decent backup/recovery/failover strategy in place and tested. Use RMAN for the backup/recovery, and either Data Guard (or DBVisit for Standard Edition) for the failover.
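    To make the sizing advice above concrete, here is a minimal sketch of the corresponding statements. The file paths and group numbers are illustrative assumptions, and the memory figures simply follow the "2GB for the OS, split the rest 50/50" rule of thumb on a 16GB box; adjust everything to your own layout and real statistics.
        -- Force a log switch at least every 15 minutes (parameter is in seconds).
        ALTER SYSTEM SET archive_lag_target = 900 SCOPE=BOTH;
        -- Add 50M redo log groups, multiplexed across two mount points
        -- (group numbers and paths are assumptions).
        ALTER DATABASE ADD LOGFILE GROUP 5
          ('/u01/oradata/mydb/redo05a.log', '/u02/oradata/mydb/redo05b.log') SIZE 50M;
        ALTER DATABASE ADD LOGFILE GROUP 6
          ('/u01/oradata/mydb/redo06a.log', '/u02/oradata/mydb/redo06b.log') SIZE 50M;
        -- Rough starting memory split on the 16GB host described above:
        -- ~2GB for the OS, the remainder split between SGA and PGA.
        ALTER SYSTEM SET sga_target = 7G SCOPE=SPFILE;
        ALTER SYSTEM SET pga_aggregate_target = 7G SCOPE=SPFILE;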

  • Best Practice for Migration of BO  from one server to another

    Hi All,
               I would like to know the best practice for migration of BO from one server to another.
    I have installed BO XI R2 on my server.
    Thanks,
    Anendu Bothra
    Edited by: Anendu Bothra on Mar 5, 2009 10:24 AM

    You need to copy your input and output file stores from the old server to the new server. By default these are located in the <Business Objects install path>\FileStore directory.
    Then you need to stop the CMS, right-click the CMS, click the Configuration tab, and then click Specify.
    Choose Copy, then click OK.
    Choose the version information for the source CMS database.
    Select the database type for the source CMS database, and then specify its database information (including host name, user name, and password).
    Select the database type for the destination CMS database, and then specify its database information (including host name, user name, and password).
    When the CMS database has finished copying, click OK.
    Once this process has completed, start the CMS and click Update Objects, located at the top of the CCM.
    I'd advise taking full backups beforehand.

  • Best practice for Smartview when upgrading from Excel 2003 to Excel 2007?

    Does anyone know the best practice for Smartview when upgrading from Excel 2003 to Excel 2007?
    Current users have Microsoft Excel 2003 with Smartview 9.3.1.2.1.003.
    Computers are being upgraded to Microsoft Excel 2007.
    What is the best practice for Smartview in this situation?
    1. Do nothing with Smartview and just install Excel 2007.
    2. Install Excel 2007 and then uninstall and reinstall Smartview
    3. Uninstall Smartview, Install Excel 2007, and then install Smartview
    4. Something else?
    Thanks!

    We went with option 1 and it worked out fine. Be aware that SV processes noticeably slower in Excel 2007 than 2003. Many users were/are unhappy with the switch. We haven't tested SV v11 yet, so I'm not sure if it has improved performance with Excel 2007 or not (hopefully it does).

  • 2K8 - Best practice for setting the DNS server list on a DC/DNS server for an interface

    We have been referencing the article 
    "DNS: DNS servers on <adapter name> should include their own IP addresses on their interface lists of DNS servers"
    http://technet.microsoft.com/en-us/library/dd378900%28WS.10%29.aspx, but there are some parts that are a bit confusing. In particular, there is this statement:
    “The inclusion of its own IP address in the list of DNS servers improves performance and increases availability of DNS servers. However, if the DNS server is also a domain
    controller and it points only to itself for name resolution, it can become an island and fail to replicate with other domain controllers. For this reason, use caution when configuring the loopback address on an adapter if the server is also a domain controller.
    The loopback address should be configured only as a secondary or tertiary DNS server on a domain controller.”
    The paragraph switches from using the term "its own IP address" to "loopback address". This is confusing because technically they are not the same. Loopback addresses are 127.0.0.1 through 127.255.255.255. The resolution section then
    goes on and adds the "loopback address" 127.0.0.1 to the list of DNS servers for each interface.
    In the past we have always set up DCs to use their own IP address as the primary DNS server, not 127.0.0.1. Based on my experience and my reading of the article, I am under the impression we could use the following setup:
    Primary DNS:  Locally assigned IP of the DC (i.e. 192.168.1.5)
    Secondary DNS: The assigned IP of another DC (i.e. 192.168.1.6)
    Tertiary DNS:  127.0.0.1
    I guess the secondary and tertiary addresses could be swapped based on the article. Is there a document that provides clearer guidance on how to set up the DNS server list properly on Windows 2008 R2 DC/DNS servers? I have seen some other discussions that talk about the pros and cons of using another DC/DNS server as the primary. MS should have clear guidance on this somewhere.

    Actually, my suggestion, which seems to be the mostly agreed method, is:
    Primary DNS:  Locally assigned IP of the DC (i.e. 192.168.1.5)
    Secondary DNS: The assigned IP of another DC (i.e. 192.168.1.6)
    Tertiary DNS:  empty
    The tertiary more than likely won't be hit (besides being superfluous, the list will reset back to the first one), due to the client-side resolver algorithm's timeout process, as I mentioned earlier. Here's a full explanation of how it works and why:
    This article discusses:
    WINS NetBIOS, Browser Service, Disabling NetBIOS, & Direct Hosted SMB (DirectSMB).
    The DNS Client Side Resolver algorithm.
    If one DC or DNS goes down, does a client logon to another DC?
    DNS Forwarders Algorithm and multiple DNS addresses (if you've configured more than one forwarder)
    Client side resolution process chart
    http://msmvps.com/blogs/acefekay/archive/2009/11/29/dns-wins-netbios-amp-the-client-side-resolver-browser-service-disabling-netbios-direct-hosted-smb-directsmb-if-one-dc-is-down-does-a-client-logon-to-another-dc-and-dns-forwarders-algorithm.aspx
    DNS Client side resolver service
    http://technet.microsoft.com/en-us/library/cc779517.aspx 
    The DNS Client Service Does Not Revert to Using the First Server in the List in Windows XP
    http://support.microsoft.com/kb/320760
    Ace Fekay
    MVP, MCT, MCITP EA, MCTS Windows 2008 & Exchange 2007 & Exchange 2010, Exchange 2010 Enterprise Administrator, MCSE & MCSA 2003/2000, MCSA Messaging 2003
    Microsoft Certified Trainer
    Microsoft MVP - Directory Services
    Complete List of Technical Blogs: http://www.delawarecountycomputerconsulting.com/technicalblogs.php
    This posting is provided AS-IS with no warranties or guarantees and confers no rights.
    I agree with this proposed solution as well:
    Primary DNS:  Locally assigned IP of the DC (i.e. 192.168.1.5)
    Secondary DNS: The assigned IP of another DC (i.e. 192.168.1.6)
    Tertiary DNS:  empty
    One thing to note, in this configuration the Best Practice Analyzer will throw the error:
    The network adapter Local Area Connection 2 does not list the loopback IP address as a DNS server, or it is configured as the first entry.
    Even if you add the loopback address as a Tertiary DNS address the error will still appear. The only way I've seen this error eliminated is to add the loopback address as the second entry in DNS, so:
    Primary DNS:  The assigned IP of another DC (i.e. 192.168.1.6)
    Secondary DNS: 127.0.0.1
    Tertiary DNS:  empty
    I'm not comfortable not having the local DC/DNS address listed so I'm going with the solution Ace offers.
    Opinion?

  • What's the best FireWire audio/MIDI interface for LE?

    I'm looking for an interface that is clean and has 8 audio inputs combined with a MIDI interface. What are the best options?

    Issueboy wrote:
    I'm looking for an interface that is clean and has 8 audio inputs combined with a MIDI interface. What are the best options?
    Check out the Alesis io26 -- 8 i/o's plus lightpipe for expansion -- pristine mic/line preamps, solid and portable, and very reasonable pricing
    Description & specs at: http://www.alesis.com/product.php?id=96

  • Digital video interfaces? Is AJA the best choice for the money?

    I am using FCP HD and have always worked with Mini DV. But we just purchased a Sony PVW-2600 Beta deck, and I believe I need to purchase some kind of interface.
    Is AJA the only or best choice for me? Can you give me some suggestions?
    Thanks so much!
    Bryce

    It's not the only choice, as Blackmagic/Decklink make good cards too, as do Aurora. But whether it's the best choice is another matter. There are lots of different decisions to make - HD compatibility? Effects acceleration? Monitoring outputs? Rack mountability?
    Others will respond.

  • Best practice links for HCM/HR

    Hello Gurus...
    I am trying to get some best practices for HR/HCM for a BI implementation, but the links that I got in SDN threads don't work.
    Can any one please guide me to the correct URL?
    Also, is there any difference between HR and HCM?
    Is there any change in HR between R/3 4.6 and ECC? I have documents for 4.6.
    Thank you,
    Kris

    http://help.sap.com/bp_bw370/html/index.htm
    then Business Intelligence -> Preconfigured Scenarios ... here you can find the best practices for HCM.
    HCM is the new terminology in NW BI for what was called HR in BW.
    There must be some differences between 4.6 and ECC, but most of the ECC content is available as 3.x content.
    http://help.sap.com/saphelp_nw70/helpdata/en/2a/77eb3cad744026e10000000a11405a/frameset.htm
