Real world use of Lookup, Fuzzy Lookup and Fuzzy Grouping?

Hello Experts,
Could someone please tell me the use of the Lookup, Fuzzy Lookup and Fuzzy Grouping tasks, giving one example of each?
I know what these tasks do, but I want to know in which scenario and for what functionality we use them.
I'd appreciate it if you wrote down your own experiences.
Please, no URLs to MSDN/TechNet and so on. I am tired of reading them.
-Vaibhav Chaudhari

Hello Vaibhav,
A use of Lookup: verifying whether a corresponding record already exists in the target before inserting it; it is also useful for merging / syncing data.
Fuzzy Lookup Transformation: for when you need to find similarities in text. In simple words, it is loosely like the T-SQL SOUNDEX function (does it sound like what it does to you?). So if you have, say, car brand names, e.g. BMW and VW, it will confidently match these (depending on the threshold chosen) even though they are not closely related. But it would not match BMW to GMC.
Fuzzy Grouping is for when you just need to group related text. It is a close cousin of Fuzzy Lookup, but it is applied within a single data set, forming groups of similar rows. It is used for data de-duplication (by match score).
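To make the SOUNDEX analogy concrete, here is a tiny T-SQL sketch using plain SOUNDEX/DIFFERENCE (an analogy only; the actual Fuzzy Lookup transformation computes a token-based similarity score rather than phonetic codes):

SELECT  SOUNDEX('Johnson')              AS code_1,      -- J525
        SOUNDEX('Jonson')               AS code_2,      -- J525: same code, likely the same name
        DIFFERENCE('Johnson', 'Jonson') AS similarity;  -- 4 = strongest match on the 0-4 scale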
Arthur My Blog

Similar Messages

  • Real world performance/speed difference between 3ghz and 3.2

    Hello,
I have a 2008 (not Clovertown) Harpertown 3.0GHz Mac Pro. I wanted to know if it's feasible to upgrade the CPUs to the 3.2GHz ones. Also, what is the REAL WORLD speed/performance difference between the 3.0 and the 3.2? I am so stressed out over this that I really need an answer.
Not that I am going to buy the 3.2GHz processors; I just want to see what I am missing, in terms of overall percentage, between my Mac Pro and a 3.2GHz Mac Pro from 2008.
    Thank you,

    You realize that by now you can probably guess what some of our answers might be?
    Real world... well, in the real world you drive to work stuck in traffic most of the time, too.
Spend your money on a couple of new solid state drives.
    You want this for intellectual curiosity, so look at your Geekbench versus others.

  • Query using Appends,  Lookups and Functions?

    I'm new to SQL, databases and APEX! I'm sure this is very easy for you all, but I've been racking my brains for two days on this; time for the experts.
    My very simple query is:
    select first_name||' '||middle_name||' '||last_name "Full Name"
    from name_table;
The problem I'm having is that each name above is in a foreign language (that I don't understand, by the way!?), but I have a table that has "English" translations for each name... so my question is how to substitute my "translated" value in each instance...
    to translate the values, I need to do a:
    select translated_name
    from translated_table
where original_name = first_name (or middle_name, or even last_name)
and translation source = 'Dictionary' (there can be three or four translation sources for each name...)
    It seems like a 'function' would be the best way to do this...after much googling, this is what I came up with (which obviously doesn't work!!!):
    create or replace function TRANS_NAME (v_original in varchar2)
    return varchar2
    is
    v_translated varchar2;
    begin
    select translated_name into v_translated
    from translated_table
    where original_name = v_original
    and source='Dictionary'
    return v_translated;
    end;
    I'm using the "APEX Object Browser" to create the function, for a couple reasons:
1. SQL*Plus will not display the foreign language (it just shows a bunch of ??????????s); the APEX SQL tool shows the original language!
    2. It gives immediate, if not cryptic, feedback on syntax errors.
    FYI, I'm using Oracle 11g, on a Sun server running Solaris 10.
    Appreciate any help you might be able to provide!

    Hi,
    leonhardtk wrote:
    to translate the values, I need to do a:
    select translated_name
    from translated_table
where original_name = first_name (or middle_name, or even last_name)
and translation source = 'Dictionary' (there can be three or four translation sources for each name...)
Do you mean there can be 3 or 4 different rows with the same original_name?
    Is the combination (original_name, translation_source) unique? (That is, if several rows do have the same original_name, will they all have different translation_sources?)
    It seems like a 'function' would be the best way to do this...after much googling, this is what I came up with (which obviously doesn't work!!!):
    create or replace function TRANS_NAME (v_original in varchar2)
    return varchar2
    is
    v_translated varchar2;
    begin
    select translated_name into v_translated
    from translated_table
    where original_name = v_original
    and source='Dictionary'
    return v_translated;
end;
What's wrong with it?
If it produces an error, post the complete error message, including the line number. Indicate whether it's a compile-time or run-time error.
    If it produces the wrong results, post the input, the relevant data from the table, and the correct results you need.
FYI, I'm using Oracle 11g, on a Sun server running Solaris 10.
Thanks, that could be useful information.
It would also be useful to see some sample data, and what you want to produce from that data.
    A function is only one possible way. Joining to the translated_table is another. Each way has its advantages and disadvantages.
    There's a separate forum for Oracle Application Express (APEX). If it looks like you'll need specific Apex features to solve this problem, post another question there. It's very rude to post the same question over and over, even in separate forums, so if you do post a question there, you should mark this one as "Answered".
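For what it's worth, here is a minimal corrected sketch of the posted function (untested, assuming the table and column names as posted): in PL/SQL a local VARCHAR2 variable needs a length, and the SELECT statement was missing its terminating semicolon before RETURN.
create or replace function trans_name (v_original in varchar2)
return varchar2
is
    v_translated varchar2(4000);      -- a local VARCHAR2 needs a length
begin
    select translated_name
    into   v_translated
    from   translated_table
    where  original_name = v_original
    and    source = 'Dictionary';     -- this semicolon was missing
    return v_translated;              -- raises NO_DATA_FOUND if there is no match
end;
/
And a sketch of the join alternative mentioned above, with one outer join per name column:
select  f.translated_name || ' ' || m.translated_name || ' ' || l.translated_name "Full Name"
from    name_table n
left join translated_table f on f.original_name = n.first_name  and f.source = 'Dictionary'
left join translated_table m on m.original_name = n.middle_name and m.source = 'Dictionary'
left join translated_table l on l.original_name = n.last_name   and l.source = 'Dictionary';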

  • Real World Experience - TIPS - Pal to NTSC and Authoring DVDS on both

    I have just completed a DVD project.
    It was shot and edited in Pal and then authored to both Pal and NTSC DVDS ( Pal for Europe - NTSC for Japan and USA)
    Standards Conversion:
I used both Compressor and Graham Nattress' excellent G (for Graham?) Standards Converter.
Compressor was a fast workaround for a quick PAL to NTSC conversion at OK quality. Since my footage is very demanding (lots of handheld heli shots, water housing footage), I imagine it was giving the converters a real workout.
The Nattress G converter was much slower, but the results were as close to perfection as I would imagine a software solution can get. Obviously the extra render time was working extra hard to get the conversion looking good.
For anyone going down this path: for your master you just have to use the Nattress converter. Thanks Graham - best $100 I have spent this year.
MPEG-2 Encoding:
I did this using Compressor and it worked well.
Here is the rub - I THOUGHT Compressor was not doing my MPEG-2 conversion at acceptable quality - it was actually the lack of quality out of Compressor's fast and easy solution for PAL to NTSC that was giving me banding on moving objects and color issues.
I suspect this may be where some people think they are coming unstuck with Compressor. (Go back and check your NTSC conversion relative to your original PAL quality.)
I hope this saves anyone else in my situation some time.
In summary - when working in PAL (i.e. you are shooting and editing in Europe or Australia), your workflow should be: shoot and edit in PAL, convert with Nattress, encode with Compressor (Compressor 2.1 - I use 7.7 Mbps max, 6.0 Mbps average, 2-pass variable, AC3 for audio) - which has actually been working well for me (despite various threads about issues with Compressor).
Hope this comes up nicely for anyone doing a search. Thank you to everyone who has helped me on the way, and good night!

    Rory85 wrote:
    Ok cool, thanks Stan.
Re-doing the Encore stuff, as much as that would/will suck, it's not my worst fear... the thing that scares me the most is the thought of re-doing all of the Premiere work again. If an NTSC Encore project can convert a PAL Premiere working file to an NTSC DVD, I can live with re-working the Encore stuff. And yeah, Dynamic Link takes care of all of the chapter markers, so that wouldn't be a big issue - it'd just be a matter of re-building the menus.
But yeah, if anyone has a definite yes or no answer on whether or not Encore can turn a PAL Premiere project into an NTSC DVD, I'd love to know. The person I'm doing the work for doesn't want to spend thousands of dollars getting PAL DVDs printed etc. unless we absolutely know it's going to be a complete re-start to get it to NTSC.
    Thanks again,
    Rory
    Hi Rory.
    It's not possible in Encore or Premiere.
    The conversion process is a complex one & consists of changing the actual resolution from 720x576 to 720x480, which gives a very different shaped pixel as well.
    Then it also changes the frame rate of the footage from 25 to 29.97.
The easiest way to do this with Adobe tools is to use After Effects, making sure that the AE render is locked to the duration of the composition - this is critical, or you will end up with footage at the wrong speed. All menus will need to be rebuilt.
    Sorry I cannot give you better news.
As a general rule of thumb, if in doubt, create & author in NTSC: there are almost no PAL setups that cannot output either pure NTSC or PAL-60, yet there are very few that can go the other way.

  • How to use two lookup in single interface in ODI 11g

    Hi All,
    I am trying to load GL_CODE_COMBINATIONS CC + FND_FLEX_VALUES_VL FF to target table W_GL_CODE_COMBINATIONS.
    I duplicated FND_FLEX_VALUES_VL as FND_FLEX_VALUES_VL1 FF1 for my join condition.
    In target table I have included 2 new columns named SEGMENT2_DESC and SEGMENT3_DESC.
In my interface I am using FND_FLEX_VALUES_VL as a lookup, and the join condition is CC.SEGMENT2=FF.FLEX_VALUE.
The mapping expression of target column SEGMENT2_DESC is FF.DESCRIPTION.
In my interface I am also using FND_FLEX_VALUES_VL1 as a lookup, and the join condition is CC.SEGMENT3=FF1.FLEX_VALUE.
The mapping expression of target column SEGMENT3_DESC is FF1.DESCRIPTION.
Execution of this interface is taking more time. When using a single lookup and only SEGMENT2_DESC, data loading is fast.
Kindly advise me regarding this.
    Thanks in advance.

Are the number of records in the lookup very large?
Why can't you use the same lookup once and have the join contain both conditions?
You don't have to use a lookup at all. You can also drag the "so called" lookup table into the source and build the join conditions yourself.
The join condition would be
CC.SEGMENT2=FF.FLEX_VALUE and CC.SEGMENT3=FF.FLEX_VALUE
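For reference, the two-lookup interface effectively generates a join like this sketch (datastore names as posted; outer joins assumed so rows without a matching flex value still load):
select  cc.*,
        ff.description  as segment2_desc,
        ff1.description as segment3_desc
from    gl_code_combinations cc
left join fnd_flex_values_vl ff  on cc.segment2 = ff.flex_value
left join fnd_flex_values_vl ff1 on cc.segment3 = ff1.flex_value;
Note that ANDing both conditions into a single lookup, as above, returns a row only when the same FLEX_VALUE matches both segments; when SEGMENT2 and SEGMENT3 differ, two aliases (as in the original interface) are required.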

  • Character Styles in the Real World

    Rick:
    Thanks for your efforts, and let me add my Amen to both
    subjects (on file locations and on Character styles).
    My real-world use of Character styles is a combination usage
    of Paragraph and Character styles for Notes: I have a Paragraph
    style called Note, which simply adds margins of .15in Left, 10pt
    Top, and 8pt Bottom. Within this paragraph style, multiple labels
    announce the type of Note with the use of Character styles
    NoteLabel (Navy), RecommendLabel (Teal), CAUTIONLabel (Purple), and
    WARNINGLabel (Red).
    This way, you can change the color of one or more labels
    without worrying about the paragraph settings (or vice versa).
    Also, when placing a Note inside a table cell (which might
    have limited horizontal space, especially with three or four
    columns), we still use the "Label" character styles but
    without the Notes paragraph style. This still sets off the
    text visually, without adding unnecessary extra vertical space.
    Thanks again, Rick!
    Leon

    I can tell you about two sites.
    1. A system which allocates and dispatches crews, trucks, backpack hoses, spare socks, etc to bushfires (wildfires to you). It operates between two Government departments here in Australia. Each of those despatchable items is a remote object and there have been up to 50,000 active in the system at a time during the hot summer months. This is a large and life-critical system.
    2. A monitoring system for cable TV channels. A piece of hardware produces a data stream representing things like channel utilization, error rates, delay, etc and this is multiplexed via RMI to a large number of operator consoles. Again this is a major and business-critical system.
    And of course every J2EE system in existence uses RMI internally, albeit almost entirely RMI/IIOP.

  • Real World Battery Life of MacBook Pro (Feb 2012)?

I keep seeing that the MacBook Pro (Feb 2012) is rated for up to 7 hours of battery life. I can't find any real world ACTUAL use results. What have those who bought this model been getting in real world use?

I meant a June 2012 MacBook Pro.

  • 3G vs 3Gs in the 'real world'?

Hi, I currently have the iPhone 3G and am due an upgrade soon. I have no idea what phone to go for, if any, as I don't think there's any other phone worth having other than the iPhone. Therefore the only logical upgrade is the 3GS. I know all the spec differences on paper, but what are they like in the 'real world'? Has anyone had both, and can you tell me if there is any noticeable difference (other than the video camera)? The thing I'm most interested in is the speed. Is there a real difference in apps and on the internet?
    Cheers, Toby

I know it's been a while since the original post, but I thought I'd post my 2 cents anyway.
For speed, I've noticed a big difference between the two. I sold my 3G to my dad, so I've compared the two, and pretty much every aspect of it is faster and more fluid, like scrolling between pages of apps, or moving apps around.
Also, GPS and searching Facebook are faster as well, from what I noticed.
And like the other person mentioned, more gigabytes, so you get more space.
Also the video recording is nice; considering most phones have it anyway, now you can have it on your iPhone as well.
I'm sure you won't regret getting the 3GS if you loved the 3G.

  • Exact difference between Fuzzy Lookup and Fuzzy Grouping

    Hi all,
Can you please explain the difference between Fuzzy Lookup and Fuzzy Grouping in simple words?
    Thanks
    Selva

    Hi Selva,
In brief, the Fuzzy Grouping Transformation can be used to group similar rows in the source dataset and identify rows of data that are likely to be duplicates, while the Fuzzy Lookup Transformation can match records between the source table and a reference table that are similar, but not identical, to the lookup key.
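As a rough T-SQL analogue of what Fuzzy Grouping does for de-duplication (hypothetical dbo.customers table; the real transformation scores similarity far more accurately than SOUNDEX):
with scored as (
    select customer_id, customer_name,
           row_number() over (partition by soundex(customer_name)
                              order by customer_id) as rn
    from   dbo.customers
)
select customer_id, customer_name
from   scored
where  rn = 1;   -- keep one canonical row per "fuzzy" group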
    Here are good examples about the two transformations:
    http://ssis-tutorial-online.blogspot.com/2013/04/fuzzy-grouping-transformation.html 
    http://www.codeproject.com/Tips/528243/SSIS-Fuzzy-lookup-for-cleaning-dirty-data 
    Regards,
    Mike Yin
    TechNet Community Support

  • Lookup creation and using this lookup in SQL query

I have two tables: one called T_KEY_VALUES (KEY_ID, VALUE), and the other is my transaction table T_TRANSACTIONS (VERSION_ID, COL_VENDOR, COL_PREFIX, COL_RECIPTID, COL_STATE, COL_COUNTRY ...)
    The data looks like below:
    T_KEY_VALUES:
    KEY_ID , VALUE,
    10, CA
    11, NY
    13, NJ
    20, USA
    21, CANADA
    101 , AMC
    102, REGAL
    1001, MOVIES
    1002, MALLS
    T_TRANSACTIONS:
    VERSION_ID , COL_VENDOR , COL_PREFIX , COL_RECIPTID , COL_SATE , COL_COUNTRY
    1, 101 , 1001 , 100001 , 10 , 20
    2, 102 , 1002  , 100002 , 11 ,20
Generally, the COL_VENDOR, COL_PREFIX, COL_STATE, and COL_COUNTRY field values exist in the T_KEY_VALUES table.
    So How can I use T_KEY_VALUES as Lookup and write the one SQL query to get the data like below:
    1, AMC , MOVIES , 100001 , CA ,USA
    2, REGAL , MALLS , 100002 , NY , USA

    Hi,
    One way is to join t_transactions to 4 copies of t_key_values:
    SELECT  t.version_id
    ,       v.value           AS vendor
    ,       p.value           AS prefix
    ,       t.col_reciptid
    ,       s.value           AS state
    ,       c.value           AS country
    FROM    t_transactions  t
    JOIN    t_key_values    v  ON  v.key_id  = t.col_vendor
    JOIN    t_key_values    p  ON  p.key_id  = t.col_prefix
    JOIN    t_key_values    s  ON  s.key_id  = t.col_state     -- or col_sate
    JOIN    t_key_values    c  ON  c.key_id  = t.col_country
    If you'd care to post CREATE TABLE and INSERT statements for the sample data, then I could test this.
    The query above assumes all 4 coded columns in t_transactions have matching values in t_key_values, as they do in the sample data.  If that assumption is wrong, then use outer joins in some (or all) of the places where I used inner joins above.
    Another approach is to UNPIVOT t_transactions into 4 times as many rows, do a single join to t_key_values, and then PIVOT those results back to the original number of rows.
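A sketch of that unpivot/join/pivot approach (Oracle 11g syntax, untested; column names as posted):
select *
from  (
       select  u.version_id, u.col_reciptid, u.col_name, k.value
       from    t_transactions
               unpivot (key_id for col_name in
                        (col_vendor, col_prefix, col_state, col_country)) u
       join    t_key_values k on k.key_id = u.key_id
      )
pivot (max(value) for col_name in
       ('COL_VENDOR' as vendor, 'COL_PREFIX' as prefix,
        'COL_STATE' as state,   'COL_COUNTRY' as country));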

  • Using Color Lookup Tables with cwgraph3d

    Hi,
When searching old forums and other references, I can only find information about using color lookup tables in LabVIEW, and I am using VB6.
I want to let the user pick a given number of colors, have this turned into a gradient over a table of size 256, and then use the custom colormap style in my 3D graph. I want to be able to change the palette in real time to change my image. Basically, I want the color palette screen to show up all the time, and that gradient that appears on the right side (vertically) to be my new color lookup table for the image (if that makes any sense).
    Thanks. Any help would be greatly appreciated.
    Kevin
    [email protected]

    Hello Kevin,
    Attached is a small VB6 example that uses the CWGraph3D's ColorMap properties. I think this will demonstrate the color lookup table capability that you're asking about, and you'll be able to use this as a building block for your own application.
    David Mc.
    NI Applications Engineer
    Attachments:
colormap.zip (5 KB)

  • Making Effective Use of the Hybrid Cloud: Real-World Examples

    May 2015
    Explore
    The Buzz from Microsoft Ignite 2015
    NetApp was in full force at the recent Microsoft Ignite show in Chicago, and it was clear that NetApp's approach to hybrid cloud and Data Fabric resonated with the crowd. NetApp solutions such as NetApp Private Storage for Cloud are solving real customer problems.
    Hot topics at the NetApp booth included:
    OnCommand® Shift. A revolutionary technology that allows you to move virtual machines back and forth between VMware and Hyper-V environments in minutes.
    Azure Site Recovery to NetApp Private Storage. Replicate on-premises SAN-based applications to NPS for disaster recovery in the Azure cloud.
    Check out the following blogs for more perspectives:
    Microsoft Ignite Sparks More Innovation from NetApp
    ASR Now Supports NetApp Private Storage for Microsoft Azure
    Four Ways Disaster Recovery is Simplified with Storage Management Standards
    Introducing OnCommand Shift
    SHIFT VMs between Hypervisors
    Infront Consulting + NetApp = Success
    Richard Treadway
    Senior Director of Cloud Marketing, NetApp
    Tom Shields
    Senior Manager, Cloud Service Provider Solution Marketing, NetApp
    Enterprises are increasingly turning to cloud to drive agility and closely align IT resources to business needs. New or short-term projects and unexpected spikes in demand can be satisfied quickly and elastically with cloud resources, spurring more creativity and productivity while reducing the waste associated with over- or under-provisioning.
    Figure 1) Cloud lets you closely align resources to demand.
    Source: NetApp, 2015
    While the benefits are attractive for many workloads, customer input suggests that even more can be achieved by moving beyond cloud silos and better managing data across cloud and on-premises infrastructure, with the ability to move data between clouds as needs and prices change. Hybrid cloud models are emerging where data can flow fluidly to the right location at the right time to optimize business outcomes while providing enhanced control and stewardship.
    These models fall into two general categories based on data location. In the first, data moves as needed between on-premises data centers and the cloud. In the second, data is located strategically near, but not in, the cloud.
    Let's look at what some customers are doing with hybrid cloud in the real world, their goals, and the outcomes.
    Data in the Cloud
    At NetApp, we see a variety of hybrid cloud deployments sharing data between on-premises data centers and the cloud, providing greater control and flexibility. These deployments utilize both cloud service providers (CSPs) and hyperscale public clouds such as Amazon Web Services (AWS).
Use Case 1: BlackLine Partners with Verizon for Software as a Service Colocation and Integrated Disaster Recovery in the Cloud
    For financial services company BlackLine, availability, security, and compliance with financial standards is paramount. But with the company growing at 50% per year, and periodic throughput and capacity bursts of up to 20 times baseline, the company knew it couldn't sustain its business model with on-premises IT alone.
    Stringent requirements often lead to innovation. BlackLine deployed its private cloud infrastructure at a Verizon colocation facility. The Verizon location gives them a data center that is purpose-built for security and compliance. It enables the company to retain full control over sensitive data while delivering the network speed and reliability it needs. The colocation facility gives Blackline access to Verizon cloud services with maximum bandwidth and minimum latency. The company currently uses Verizon Cloud for disaster recovery and backup. Verizon cloud services are built on NetApp® technology, so they work seamlessly with BlackLine's existing NetApp storage.
    To learn more about BlackLine's hybrid cloud deployment, read the executive summary and technical case study, or watch this customer video.
    Use Case 2: Private, Nonprofit University Eliminates Tape with Cloud Integrated Storage
    A private university was just beginning its cloud initiative and wanted to eliminate tape—and offsite tape storage. The university had been using Data Domain as a backup target in its environment, but capacity and expense had become a significant issue, and it didn't provide a backup-to-cloud option.
    The director of Backup turned to a NetApp SteelStore cloud-integrated storage appliance to address the university's needs. A proof of concept showed that SteelStore™ was perfect. The on-site appliance has built-in disk capacity to store the most recent backups so that the majority of restores still happen locally. Data is also replicated to AWS, providing cheap and deep storage for long-term retention. SteelStore features deduplication, compression, and encryption, so it efficiently uses both storage capacity (both in the appliance and in the cloud) and network bandwidth. Encryption keys are managed on-premises, ensuring that data in the cloud is secure.
    The university is already adding a second SteelStore appliance to support another location, and—recognizing which way the wind is blowing—the director of Backup has become the director of Backup and Cloud.
    Use Case 3: Consumer Finance Company Chooses Cloud ONTAP to Move Data Back On-Premises
    A leading provider of online payment services needed a way to move data generated by customer applications running in AWS to its on-premises data warehouse. NetApp Cloud ONTAP® running in AWS proved to be the least expensive way to accomplish this.
    Cloud ONTAP provides the full suite of NetApp enterprise data management tools for use with Amazon Elastic Block Storage, including storage efficiency, replication, and integrated data protection. Cloud ONTAP makes it simple to efficiently replicate the data from AWS to NetApp FAS storage in the company's own data centers. The company can now use existing extract, transform and load (ETL) tools for its data warehouse and run analytics on data generated in AWS.
    Regular replication not only facilitates analytics, it also ensures that a copy of important data is stored on-premises, protecting data from possible cloud outages. Read the success story to learn more.
    Data Near the Cloud
    For many organizations, deploying data near the hyperscale public cloud is a great choice because they can retain physical control of their data while taking advantage of elastic cloud compute resources on an as-needed basis. This hybrid cloud architecture can deliver better IOPS performance than native public cloud storage services, enterprise-class data management, and flexible access to multiple public cloud providers without moving data. Read the recent white paper from the Enterprise Strategy Group, “NetApp Multi-cloud Private Storage: Take Charge of Your Cloud Data,” to learn more about this approach.
    Use Case 1: Municipality Opts for Hybrid Cloud with NetApp Private Storage for AWS
    The IT budgets of many local governments are stretched tight, making it difficult to keep up with the growing expectations of citizens. One small municipality found itself in this exact situation, with aging infrastructure and a data center that not only was nearing capacity, but was also located in a flood plain.
    Rather than continue to invest in its own data center infrastructure, the municipality chose a hybrid cloud using NetApp Private Storage (NPS) for AWS. Because NPS stores personal, identifiable information and data that's subject to strict privacy laws, the municipality needed to retain control of its data. NPS does just that, while opening the door to better citizen services, improving availability and data protection, and saving $250,000 in taxpayer dollars. Read the success story to find out more.
    Use Case 2: IT Consulting Firm Expands Business Model with NetApp Private Storage for Azure
    A Japanese IT consulting firm specializing in SAP recognized the hybrid cloud as a way to expand its service offerings and grow revenue. By choosing NetApp Private Storage for Microsoft Azure, the firm can now offer a cloud service with greater flexibility and control over data versus services that store data in the cloud.
    The new service is being rolled out first to support the development work of the firm's internal systems integration engineering teams, and will later provide SAP development and testing, and disaster recovery services for mid-market customers in financial services, retail, and pharmaceutical industries.
    Use Case 3: Financial Services Leader Partners with NetApp for Major Cloud Initiative
    In the heavily regulated financial services industry, the journey to cloud must be orchestrated to address security, data privacy, and compliance. A leading Australian company recognized that cloud would enable new business opportunities and convert capital expenditures to monthly operating costs. However, with nine million customers, the company must know exactly where its data is stored. Using native cloud storage is not an option for certain data, and regulations require that the company maintain a tertiary copy of data and retain the ability to restore data under any circumstances. The company also needed to vacate one of its disaster-recovery data centers by the end of 2014.
    To address these requirements, the company opted for NetApp Private Storage for Cloud. The firm placed NetApp storage systems in two separate locations: an Equinix cloud access facility and a Global Switch colocation facility both located in Sydney. This satisfies the requirement for three copies of critical data and allows them to take advantage of AWS EC2 compute instances as needed, with the option to use Microsoft Azure or IBM SoftLayer as an alternative to AWS without migrating data. For performance, the company extended its corporate network to the two facilities.
The firm vacated the data center on schedule, a multimillion-dollar cost avoidance. Cloud services are being rolled out in three phases. In the first phase, NPS will provide disaster recovery for the company's 12,000 virtual desktops. In phase two, NPS will provide disaster recovery for enterprise-wide applications. In the final phase, the company will move all enterprise applications to NPS and AWS. NPS gives the company a proven methodology for moving production workloads to the cloud, enabling it to offer new services faster. Because the on-premises storage is the same as the cloud storage, making application architecture changes will also be faster and easier than it would be with other options. Read the success story to learn more.
    NetApp on NetApp: nCloud
When NetApp IT needed to provide cloud services to its internal customers, the team naturally turned to NetApp hybrid cloud solutions, with a Data Fabric joining the pieces. The result is nCloud, a self-service portal that gives NetApp employees fast access to hybrid cloud resources. nCloud is architected using NetApp Private Storage for AWS, FlexPod®, clustered Data ONTAP and other NetApp technologies. NetApp IT has documented details of its efforts to help other companies on the path to hybrid cloud. Check out the following links to learn more:
    Hybrid Cloud: Changing How We Deliver IT Services [blog and video]
    NetApp IT Approach to NetApp Private Storage and Amazon Web Services in Enterprise IT Environment [white paper]
    NetApp Reaches New Heights with Cloud [infographic]
    Cloud Decision Framework [slideshare]
    Hybrid Cloud Decision Framework [infographic]
    See other NetApp on NetApp resources.
    Data Fabric: NetApp Services for Hybrid Cloud
As the examples in this article demonstrate, NetApp is developing solutions to help organizations of all sizes move beyond cloud silos and unlock the power of hybrid cloud. A Data Fabric enabled by NetApp helps you more easily move and manage data in and near the cloud; it's the common thread that makes the use cases in this article possible. Read Realize the Full Potential of Cloud with the Data Fabric to learn more about the Data Fabric and the NetApp technologies that make it possible.
    Richard Treadway is responsible for NetApp Hybrid Cloud solutions including SteelStore, Cloud ONTAP, NetApp Private Storage, StorageGRID Webscale, and OnCommand Insight. He has held executive roles in marketing and engineering at KnowNow, AvantGo, and BEA Systems, where he led efforts in developing the BEA WebLogic Portal.
    Tom Shields leads the Cloud Service Provider Solution Marketing group at NetApp, working with alliance partners and open source communities to design integrated solution stacks for CSPs. Tom designed and launched the marketing elements of the storage industry's first Cloud Service Provider Partner Program—growing it to 275 partners with a portfolio of more than 400 NetApp-based services.

Dave:
"David Scarani" <[email protected]> wrote in message
news:3ecfc046$[email protected]..
>
I was looking for some real world "Best Practices" for deploying J2EE applications
into a production WebLogic environment.
We are new at deploying applications to J2EE application servers and are currently
debating 2 methods.
1) Store all configuration (application as well as domain configuration) in properties
files and use Ant to rebuild the domain every time the application is deployed.
2) Have a production domain built one time, configured as required and always
up and available, then use Ant to deploy only the J2EE application into the existing,
running production domain.
I would be interested in hearing how people are doing this in their production
environments and any pros and cons of one way over the other.
Thanks.
Dave Scarani

I am just a WLS engineer, not a customer, so my opinions have in some
regards little relative weight. However, I think you'll get more mileage out
of creating your config.xml once, checking it into source control, and
versioning it. I would imagine that application changes are more
frequent than server/domain configuration, so it seems a little heavyweight
to regenerate the entire configuration every time an application is
deployed/redeployed. Either way, you should check out the wlconfig Ant task.
Cheers
mbg

  • Cisco Jabber for Mac - Directory Lookup and Contact Fields

    Hello,
After having issues myself with Cisco Jabber for Mac 9.2, specifically with Active Directory lookups not working or contact information not populating (and therefore not being able to call users from the contact list, as no telephone information exists), I am including a sample jabber-config.xml file that works for me.
    Please note: there are many different ways to configure this. What I will be showing is the method that works for me and my deployment, which is pretty standard.
    As always and as a disclaimer, once again, this is what has worked for my deployment scenario. Always keep backups of your configuration files, and always be mindful of anything you have configured already, especially in the jabber-config.xml file.
    Background
    My deployment is based on CUCM 9, with 1 publisher and 2 subscribers. I also have a CUCM IM & Presence 9.0 server. This assumes that you have already configured your deployment and Jabber is functioning already, albeit with the aforementioned issues.
    As for Active Directory, my deployment will be based on Windows 2008 R2 Domain Controllers running in native 2008 mode. For this example, we will be searching directly against one of the DCs with a Global Catalogue role. Please be aware that in large deployments you will have to plan accordingly with regards to lookup traffic from Jabber clients to the DCs.
Also, as of Cisco Jabber for Mac 9, the client can no longer search via the CUCM internal directory (which may be made up of local end users, Active Directory synced users, or a mix of both); this must be done via an LDAP mechanism.
    Scenario
    In this deployment scenario, Cisco Jabber for Windows is working properly - you can search and add people from Active Directory, and contacts in your contact list have all the appropriate fields populated from Active Directory. However, when trying the same with Jabber for Mac, Jabber for iPhone or Jabber for iPad you notice that you cannot perform a directory lookup, and if you add people directly (for example, [email protected]) the user only has the IM field populated. No telephone, email or additional information is displayed.
    Solution
Whereas Cisco Jabber for Windows uses the EDI mechanism (native Windows lookups, used when running from a computer that is on the domain, or when searching for contacts in another Active Directory domain where a domain trust exists), Jabber for Mac / iPhone / iPad uses the BDI mechanism.
In this case, you will need to provision a jabber-config.xml file that you will upload to your TFTP server (or Publisher) and that will be downloaded by your Jabber for Mac client and also used by the iPhone and iPad clients. You can configure many options in the jabber-config.xml file, but for this example we will place just the information that we need in order for these clients to request and display Active Directory information.
    Please note that the configuration may vary depending on your deployment, but at the very least we will be:
    - Configuring a DC where we will perform the lookup.
    - Configure credentials that will be used to perform the lookup. This will be an Active Directory account that has read rights on the Active Directory domain. Please note that these credentials are saved in plain text in the file, so ensure that the account that you will be using is not privileged.
    - Configure the server port that we will be using to perform the lookup.
- Configure the search base. This is basically where we want the directory lookup to happen. You can either choose for this lookup to start at the "base" of the domain (and therefore the search will iterate through all the user accounts and OUs below the root base) or define a specific OU where you want to search.
    Caution!
    - There is a current limitation with Cisco Jabber for Mac whereby you can only have 1 search base configured. Please keep this in mind if, like me, you have multiple OUs (like an OU for each company in your organization) and under these OUs you have sub OUs as a user account container.
    - If using the top level search base, unless you specify a filter, you will potentially be able to search for all user accounts in the domain. You will need to configure the <BDIBaseFilter> parameter if you want to fine tune your search ability.
    Steps
    These are the steps I have followed. Other steps or considerations may vary.
    - Log in to your TFTP server and download the jabber-config.xml file and keep it as a backup. If you are already using the jabber-config.xml file for other purposes, do not worry - you can add your BDI information parameters inside.
    - Remove the jabber-config.xml file
    - Edit the jabber-config.xml file and configure thus:
    <?xml version="1.0" encoding="UTF-8"?>
    <config version="1.0">
        <Directory>
            <DirectoryServerType>BDI</DirectoryServerType>
            <BDILDAPServerType>AD</BDILDAPServerType>
            <BDIPrimaryServerName>DOMAIN CONTROLLER IP ADDRESS</BDIPrimaryServerName>
            <BDIPresenceDomain>YOUR PRESENCE DOMAIN</BDIPresenceDomain>
            <BDIServerPort1>3268</BDIServerPort1>
            <BDISearchBase1>YOUR SEARCH BASE</BDISearchBase1>
        <BDIConnectionUsername>USERNAME IN UPN FORMAT</BDIConnectionUsername>
            <BDIConnectionPassword>PASSWORD</BDIConnectionPassword>
            <BDIEnableTLS>0</BDIEnableTLS>
        </Directory>
    </config>
For example, let's assume the following:
- The domain controller IP address is 10.1.1.2.
- Your presence domain is test.local.
- Your search base will be test.local, using the top level of the domain.
- The username with which you will be doing your searches is called walt. Usually you can identify walt either as test.local\walt or as walt@test.local. It is always best, in these sorts of scenarios, to use the UPN convention, so we will be configuring walt@test.local.
- The password is the Active Directory password for the account walt.
    - I have disabled TLS in my case. There are issues with the Jabber for Mac client when using other security methods.
    <?xml version="1.0" encoding="UTF-8"?>
    <config version="1.0">
        <Directory>
            <DirectoryServerType>BDI</DirectoryServerType>
            <BDILDAPServerType>AD</BDILDAPServerType>
            <BDIPrimaryServerName>10.1.1.2</BDIPrimaryServerName>
            <BDIPresenceDomain>test.local</BDIPresenceDomain>
            <BDIServerPort1>3268</BDIServerPort1>
            <BDISearchBase1>DC=test,DC=local</BDISearchBase1>
        <BDIConnectionUsername>walt@test.local</BDIConnectionUsername>
            <BDIConnectionPassword>walt01!</BDIConnectionPassword>
            <BDIEnableTLS>0</BDIEnableTLS>
        </Directory>
    </config>
Once you have configured the jabber-config.xml file, you will need to upload it to your TFTP server. Once uploaded, you will need to restart the Cisco TFTP service. Again, my TFTP server is on my CUCM Publisher, so:
    - I go to Cisco Unified OS Administration on my Publisher server, TFTP File Management and I upload jabber-config.xml to / directory
    - I then go to Cisco Unified Serviceability on my Publisher server, I locate the Cisco TFTP service and I restart the service
Once this is done, you can fire up your Jabber for Mac client. As a test, on your Mac (using Terminal) go to:
    /Users/username/Library/Application Support/Cisco/Unified Communications/Jabber/Config
In here you will see several files, but the one we want is jabber-config.xml. As soon as you start the Jabber for Mac client and log in, the jabber-config.xml file will be downloaded from your TFTP server and saved here. When you see it appear, just type more jabber-config.xml in your terminal window and make sure that the output is the same as the XML file you created.
From there, try doing a directory search. If you have previously added contacts and they still lack attribute information, you will need to remove them (sometimes the information will not refresh properly) and add them again from the directory.
I will be updating this guide and amending anything that is incorrect, but this is meant to be a quick checklist with steps to get this, at least in its most basic version, up and running for Jabber for Mac.

    Hello, 
Thanks for this post! It works; I can do lookups, and I can also add a found contact to the contact list and get information about the contact from LDAP.
One more question: I can't get all the information about a contact. I don't get e.g. the mobile phone number and various other attributes. I have tried to expand your file as follows:
    <?xml version="1.0" encoding="UTF-8"?>
    <config version="1.0">
        <Directory>
            <DirectoryServerType>BDI</DirectoryServerType>
            <BDILDAPServerType>AD</BDILDAPServerType>
            <BDIPrimaryServerName>IP of AD</BDIPrimaryServerName>
            <BDIPresenceDomain>Presence Domain</BDIPresenceDomain>
            <BDIServerPort1>3268</BDIServerPort1>
            <BDISearchBase1> Search Base</BDISearchBase1>
            <BDIConnectionUsername>User</BDIConnectionUsername>
            <BDIConnectionPassword>Password</BDIConnectionPassword>
            <BDIEnableTLS>0</BDIEnableTLS>
            <BDISipUri>msRTCSIP-PrimaryUserAddress</BDISipUri>
            <BDIPhotoSource>thumbnailPhoto</BDIPhotoSource>
            <BDIBusinessPhone>telephoneNumber</BDIBusinessPhone>
            <BDIMobilePhone>mobile</BDIMobilePhone>
            <BDIHomePhone>homePhone</BDIHomePhone>
            <BDIOtherPhone>otherTelephone</BDIOtherPhone>
            <BDITitle>title</BDITitle>
            <BDICompanyName>company</BDICompanyName>
            <BDILocation>co</BDILocation>
            <BDIPostalCode>postalCode</BDIPostalCode>
            <BDICity>l</BDICity>
            <BDIState>st</BDIState>
            <BDIStreetAddress>streetAddress</BDIStreetAddress>
        </Directory>
    </config>
    But it didn't help.
When I capture the lookup via Wireshark, I can see that Jabber sends the search request with a bunch of attributes, but the LDAP answer contains only 8 attributes. (See attached screenshots.)

  • How to use JDBC Lookup in PI 7.1 ?

    Hi,
Please advise how to use a JDBC lookup in message mapping in PI 7.1. Any reference link / document?
    I have followed this step below :
    1. Create the external definition for the database table.
    2. Use the external definition (table) in message mapping JDBC Lookup.
But the target is still "yellow", meaning the mapping has not been completed yet. Why? And when I double-click the JDBC Lookup,
there is an error message:
"No suitable parameter found; define new parameter of type 'Channel' first"
    Please advise.
    Thank You and Best Regards
    Fernand

    Hi Fernand,
JDBC Lookup can be done in PI 7.1 using the steps below:
    1) Create a communication channel between PI and the database to connect to database.
    2) Import the table data as External Definition.
3) In the message mapping where this lookup is to be used, select JDBC Lookup under Conversions and map it.
    4) Double Click on JDBC Lookup
    5) Select parameter and a database table (imported as the external definition). All the elements of the table will appear in the middle column. Select and move the input parameters to the left side column and the output parameters to the right side column. Click OK. 
6) Under message mapping, go to the Signature tab and define the parameter as a channel with category JDBC Adapter Type.
7) Under operation mapping, define the parameter and associate it with the parameter defined in the message mapping.
    Thanks
    Amit

  • Use of Lookup Files in BI Applications

    Hi,
Can anyone tell me the use of lookup files in BI Applications? While doing the configurations (for all modules) for these lookup files, we get the data from the ERP database tables using the query provided by Oracle, populate the file, and then map it with some domain values.
Why can't we directly populate the data warehouse tables using the ERP tables instead of populating through the lookup files (which we anyhow populate from ERP tables)? Also, what is the use of domain values? Are these used for reporting purposes and for analyzing the data or business?
    Thank You,

    For e.g, when you map the GL group account numbers, the predefined group codes such as 'CASH', 'AR', 'AP' , 'TAX' etc are the domain values. In the repository, the metrics are defined based on these domain values.
    Say, you mapped your CASH accounts from 0011 through 0100, then OBIEE calculates the metrics related to CASH as summation (or some kind of calculation) on the accounts that have the group codes (domain values) as CASH.
It should be possible to update the domain values directly from the ERP database tables, but the ETL expects the mapping to be in files. Also, in some cases we cannot change the default domain values; all we can change is the mapping to these domain values, or we can create new domain values. At one client, we had to create a new account group code called 'FIXED ASSET' because the default domain values were not sufficient, but then we had to change the repository to include new metrics and change existing ones to incorporate this new group code.
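As a hypothetical illustration of why the metrics hinge on domain values rather than raw account numbers (the table names here are invented for the sketch):
select   g.group_account_code,            -- domain value: 'CASH', 'AR', 'AP', ...
         sum(b.balance_amount) as total_balance
from     w_gl_balance_f    b
join     w_group_account_d g on g.account_number = b.account_number
group by g.group_account_code;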
    -Nilesh
    http://www.appsbi.com
