Why do my rules fail in the data flow from connector view to Meta view?

I have Meta Directory 5.0 installed along with the iPlanet Directory Server 5.0, and both are working fine.
I have created an instance of the NT Domain Connector, which retrieves entries into a Connector view.
Where can I find examples of writing data flow rules for the NT Domain Connector that flow specific entries from the CV to the MV? Basically, I do not want the NT groups in the MV. I also want to create an additional attribute, e.g. myflag, whose value will be updated manually in the CV; if myflag = 0 the entry should not be moved to the MV, and if myflag = 1 it should be.
I tried to write a few rules, but they fail in testing (Rule Tester) and I am not able to locate the exact error in my rule. Does this require any specific configuration?
Thanks
Amol Talap

You should post your rule.
But either way, have you tried this:
(objectclass==ntuser) or
(objectclass!=groupofuniquenames)
The first allows only entries that are users.
The second allows only entries that are not groups.
As for the flags, try this:
(myflag==1) or
(myflag!=1)
Same effect as above.
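If the rule language supports combining conditions with a boolean operator (an assumption on my part; please check the Meta Directory rule syntax documentation, since I only know the forms shown above), something along these lines might express both requirements in a single rule:
(objectclass==ntuser) and (myflag==1)
That would flow only user entries whose flag has been set to 1, which is what you describe.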
Furthermore, if rule testing fails, it could be that you are not referencing the right directory when using the Rule Tester; the Rule Tester does not always point to the right location.
J.F.

Similar Messages

  • Need to check the data flow from R/3 to BW server.

    Hi BI experts,
    This query is regarding the need to check the data flow from R/3 to the BW server.
    As of now I have a set of reports which I need to bring into BW. The requirement is to go through the list of transaction codes for these reports in R/3 and find out whether there are already any existing objects in the BW system that I can use for them.
    So, can you please help me?

    It depends on which T-codes or reports your users run in R/3 and want reproduced in BW. BI Content delivers out-of-the-box reports; you can activate those, load data, and use them.
    Give me the T-codes you have and I can send you the standard BI reports, or the cubes you can get them from.
    ~AK

  • How to migrate the data flow from a DB Connect source system from 3.5 to BI 7

    Hi
    Can anyone tell me how to migrate the data flow from a DB Connect source system from 3.5 to BI 7?

    Hi,
    Go to the InfoProvider that your DB Connect DataSource feeds, right-click the DataSource, then choose Migrate -> With Export. You then have to build the new 7.0 transformations, DTPs, etc.
    ~AK

  • Mapping data flow from R/3 to BW

    Hello,
    I am pretty new to BW and I have been tasked with creating a detailed map of the data flow from R/3 into BW.
    I need to record where the data originates in R/3 (field names/tables) and literally track the flow of that data all the way, including any InfoObjects along the way, to any cubes it may be sitting in.
    How do I track this flow? And how can I identify what a characteristic in BW corresponds to in R/3?
    Has anybody had to create a similar data flow map? If so, how did you approach it?
    Many Thanks,
    Matt

    Hi Matthew,
    From the R/3 side:
    BW treats all the data coming from R/3 as DataSources.
    From the DataSource, the upload of data to the cube is done as:
    <b>Datasource->Transfer Rule->psa/infosource->communication structure->cube</b>
    (for a 3.5 system)
    in case of 7.0 system... data flow is as follows...
    <b>Datasource->infopackage->psa->transformation/DTP-> Data target(cube)</b>
    -> Go to transaction <b>RSA5</b> (for Business Content DataSources) or <b>RSA6</b> (for all active DataSources) in the source system.
    -> There you can find all the data that you want (for your mapping purpose this will do).
    -> You can also check from the BI side in transaction RSA1: click the Monitor button on the left (for custom objects) or the Business Content button, choose the object from the tree, then right-click and replicate to find out whether all of them were used.
    Hope this helps!!
    <b>*</b><i>Reward Pts if useful</i><b>*</b>
    regards,
    Naveenan.

  • Data flow from the source system side: LUWs and extract structures

    Hi
    Can anybody explain the data flow from the source system to the BI system? Especially, I mean the extract structure and the LUWs: where do they come into the picture in the core data flow of the inbound and outbound queues? A link to a document would also be helpful.
    Regards
    Santosh

    Hi, see these articles:
    http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Data Flow from LBWQ/SMQ1 to RSA7 in ECC (Records Comparison).
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/enterprise-data-warehousing/data%20flow%20from%20lbwq%20smq1%20to%20rsa7%20in%20ecc%20(Records%20Comparison).pdf
    Checking the Data using Extractor Checker (RSA3) in ECC Delta Repeat Delta etc...
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/80f4c455-1dc2-2c10-f187-d264838f21b5&overridelayout=true 
    Data Flow from LBWQ/SMQ1 to RSA7 in ECC and Delta Extraction in BI
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/d-f/data%20flow%20from%20lbwq_smq1%20to%20rsa7%20in%20ecc%20and%20delta%20extraction%20in%20bi.pdf
    Thanks
    Reddy

  • Delta failing on the data request

    Hello All,
    I have the following error on a delta that didn't even make it into BW (it fails on the data request): "Syntax error in GP_ERR_RSDRO_UPDATE, row 1.893". Any idea what the problem is?
    Thank you and kind regards,
    Keith

    Hi,
    This error seems to be in the update from the ODS, so check your update rules once again for proper mapping, and concentrate on the newly enhanced InfoObject. You say you have added that InfoObject only in the InfoCube; what about the ODS? Where does that InfoObject get its data from, and how are you updating it? Check that as well.
    Also check whether the table corresponding to the ODS (/BIC/A<ODS Object>40 or /BI0/A<ODS Object>40) is active. If not, just run the program RSOD_XPRA_BDS_TO_KW to create the table and load it again...
    -Shreya

  • ABAP code in update rules to convert the date

    Hi,
    Could anyone send me the ABAP code that is written in the update rules to convert the date from DD/MM/YYYY (length 10) to YYYYMMDD (length 8) format?
    Also, please let me know where I should write this code: while creating the update rules or while creating the InfoSource?
    Thanks,

    Hi Bharath,
    I suggest you do the conversion of dates in the transfer rules. Here is the correct code you need:
    * Assuming the source data field is called MYDATE (format DD/MM/YYYY)
    * Place the following in the routine in the transfer rules.
    * It rearranges the year (offset 6), month (offset 3) and day (offset 0)
    * into YYYYMMDD:
    concatenate tran_structure-mydate+6(4) tran_structure-mydate+3(2) tran_structure-mydate(2) into result.
    Replace MYDATE with the name of the source field (10 characters) in the transfer structure. Hope this helps.

  • SQL Server (PC1) --- PC2: Login failed. The login is from an untrusted domain and cannot be used with Windows Authentication

    Hey,
    I want to make a connection from my laptop (xxx.xxx.xxx.xxx = A) to a fixed computer (SQL Server, xxx.xxx.xxx.xxx = B). My connection string is "Provider=SQLNCLI11; Data Source=name-pc\SQLEXPRESS; Integrated Security=SSPI; Initial Catalog=Database name" for Visual Studio C#.
    Laptop -> PC1: error.
    It works when I use localhost or 127.0.0.1, and I can read my database without any problems if I install SQL Server on my laptop. Now I have installed it on PC1 and uninstalled it on my laptop. When I replace name-pc with an IP address I get this error: Login failed. The login is from an untrusted domain and cannot be used with Windows Authentication. I did some research on multiple forums where they mention the Local Security Policy (secpol.msc), but I don't have this file.
    PC2 -> PC3: works fine, but I want to work from my laptop and I don't understand why it isn't working with my laptop.
    Can someone help me?
    Thanks a lot, and sorry about my English.
    Thibaut

    Hello,
    Yes, for the Windows Authentication to work you should be using the same Windows account and password.
    Are you willing to create SQL logins inside SQL Server and allow your users to connect to SQL Server using SQL Authentication
    instead of Windows Authentication? That could be a solution on a workgroup network.
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • How to find the data loaded from R/3 to BW

    Hi,
    How can I verify that the data loaded from R/3 to BW is correct? I am not able to find which field in the query is connected to which field in R/3, i.e. where the data is coming from in R/3. Is there any process to find out which field and table the data comes from? Please help.
    Thanks in advance to you all.

    Hi Veda ... the mapping between R/3 fields and BW InfoObjects takes place in the transfer rules; other transformations can take place in the update rules.
    So you could proceed this way: look at the InfoProvider data model and see whether the query performs any calculations (even with virtual key figures / characteristics). Then go back to the update rules and search for other calculations / transformations. Finally there are the transfer rules and, possibly, DataSource / extraction enhancements.
    As you can easily see, there are many places you have to look ... it is quite complex work, but very useful.
    Once you have identified all the mappings / transformations, check whether the BW data matches R/3 (taking the calculations into account).
    Good job
    GFV

  • Error 18452 "Login failed. The login is from an untrusted domain and cannot be used with Windows authentication" on SQL Server 2008 R2 Enterprise Edition 64-bit SP2 clustered instance

    Hi there,
    I have a Windows 2008 R2 Enterprise x64 SP2 cluster which has 2 SQL Server 2008 R2 Enterprise Edition x64 SP2
    instances.
    A domain account "Domain\Login" is an administrator on both physical nodes and "sysadmin" on both SQL Server instances.
    Currently both instances are running on same node.
    While logging on to SQL Server instance 2 through "Domain\Login" using "IP2,port2", I get error 18452 "Login failed. The login is from an untrusted domain and cannot be used with Windows authentication". This happened in the past as well, but the issue was resolved after installation of SQL Server 2008 R2 SP2. It has now re-occurred. However, it connects using 'SQLVirtual2\Instance2' without issue.
    The same login with the same rights is able to access instance 1 via both 'SQLVirtual1\Instance1' and "IP1,port1" without any issue.
    Please help resolve the issue.
    Thanks,
    AY

    Hello,
    I confirm that I encountered the same problem when the first domain controller was down.
    While the first domain controller was restarting, I tried to fail over my SQL Server instance to a second node; after that, SQL Server logins could authenticate, but Windows logins returned error 18452.
    Once the first DC had finished restarting, everything was OK.
    The question here: why doesn't the clustered instance use the second DC?
    Best Regards     
    J.K

  • How to store the data coming from a network analyser into a text or Excel file

    Hi everyone,
    I'm using an Agilent 8719ET network analyser and wish to store the data coming from the network analyser in a text/Excel file.
    Presently I'm able to get the data on a LabVIEW graph using GPIB. Can anyone suggest how to proceed after the Collect Data sub-VI? How can the data be stored in a file as well as being shown on the graph?
    Attached is the vi for kind consideration...
    Looking for help
    Regards
    Rohit
    Attachments:
    Agilent 87XX Series Exceed Max Meas.vi ‏43 KB

    First let me say that your code really looks pretty good. The data handling could be made more efficient by calculating the number of data points that will be in the completed dataset and preallocating the entire array -- but depending on your answers to my questions, the logic in the lower shift register may be going away, so we won't worry about that right now.
    The thing I need to know before addressing the data storage question is: each time you call "Collect and Display Data.vi", how many elements are in the array? Are you reading single data points, or a group of data? (BTW: if the answer to that question is obvious based on the way the other VIs are set up, I don't have the drivers so I can't tell what the setup values are.) Second, how fast does the loop iterate? Are we talking msec per loop? Seconds? Fortnights?
    The issues here are two-fold: how much data, and how fast is it coming? The answers to these will tell you how to save the data.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • I updated my Mac from Snow Leopard to Mountain Lion 3 days ago. I had some important data in my Library folder and it got overwritten. Is there any way I can get the data back from my Snow Leopard Library folder?

    I updated my Mac from Snow Leopard to Mountain Lion 3 days ago. I had some important data in my Library folder and it got overwritten. Is there any way I can get the data back from my Snow Leopard Library folder?
    I tried MacKeeper to recover the files, but it could not. Is there any other way anyone has tried to recover a system Library folder after an OS upgrade?

    No, it doesn't store a clone. You would have needed to make one with either SuperDuper or CarbonCopy Cloner.
    If the files were in your ~/Library folder then they may still be there. As I said, you can access it by going to the Finder "Go" menu and holding the Option key to choose "Library". I wouldn't think an upgrade would overwrite anything in ~/Library.
    If you have a Time Machine backup you may also be able to use that to retrieve them.

  • How can I get the data back from my game

    How can I get the data back from Minecraft if I deleted the app and bought it with a different Apple ID?

    No, they are tied to the ID that purchased them, and cannot be transferred to anyone else.

  • How to convert the date/time from MM-DD-YYYYThh:mm:ss to YYYY-MM-DDThh:mm:ss format

    Hi All,
    I have a requirement in my project to convert a date/time from one format to another: from MM-DD-YYYYThh:mm:ss to YYYY-MM-DDThh:mm:ss. I am using SOA Suite 11.1.1.6.
    Can anyone suggest how to do this conversion in the BPEL transformation?
    Thanks,
    Sanju.

    Hi Sanju,
    Store the date to be converted into a variable, e.g. dateVar. Then evaluate an expression in an Assign activity such as: xp20:format-dateTime($dateVar,'[Y0001]-[M01]-[D01]T[H01]:[m01]:[s01]').
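    If it helps to see the intended result outside BPEL, here is a small, purely illustrative Java sketch of the same conversion (this is not part of the BPEL solution; the class name, sample value and patterns are just assumptions for the example):
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;

    public class DateReformat {
        public static void main(String[] args) {
            // Incoming value in MM-DD-YYYYThh:mm:ss form (sample value made up)
            String input = "08-25-2014T13:45:07";

            DateTimeFormatter inFmt  = DateTimeFormatter.ofPattern("MM-dd-uuuu'T'HH:mm:ss");
            DateTimeFormatter outFmt = DateTimeFormatter.ofPattern("uuuu-MM-dd'T'HH:mm:ss");

            // Parse with the source pattern, then re-emit in the target pattern
            String output = LocalDateTime.parse(input, inFmt).format(outFmt);
            System.out.println(output); // prints 2014-08-25T13:45:07
        }
    }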
    Regards

  • Does the client automatically cache the data it gets from the cache server?

    Hi expert,
    I have two questions about the client local cache. Could you please give me some suggestions?
    1. Will the client automatically cache locally the data it gets from the cache server the first time, and automatically update the data in the local cache when it gets the same data from the cache server again? I have gone through the API reference but cannot find any API to query the data currently held in the local cache.
    2. If the client does automatically cache the data it gets from the cache server, is there any way for the client to receive the events that happen to its local cache, such as an entry being created in, deleted from, or updated in the local cache? In my opinion, when an entry is fetched from the cache server for the first time, the MapListener's entry-created event should be triggered; when the same entry is fetched again, the entry-updated event should be triggered.
    However, I have tried a client with a replicated cache, a client with a partitioned cache, an extend client with a remote cache, and a client with a local cache (the front-cache part of a near cache); the client (with the MapListener set on the NamedCache object) does not get any event notification after getting data from the cache server. By the way, my listener itself is fine, since the entry-created and entry-updated events are triggered when I put data.
    Your suggestions are very much appreciated. :)

    Hi
    If I were you I would read this http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/toc.htm
    and particularly the section about Near Caching here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/nearcache.htm#CDEFEAJG
    which is what you are asking about in your question.
    Near caching is how Coherence stores data locally, which is the answer to your first question. How near caching works is explained in the documentation.
    Events, which you ask about in your second question are explained here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/delivereventsjava.htm#CBBIIEFA
    It might be that ContinuousQueryCache is closer to what you want. This is explained here http://download.oracle.com/docs/cd/E14526_01/coh.350/e14510/queryabledatafabric.htm#sthref38 A ContinuousQueryCache is like having a subset of the underlying cache on the local client, which you can then listen to, etc.
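    As a rough sketch only (the cache name "example-cache", the always-matching filter and the printouts are placeholders, and this is written against the 3.x API from memory rather than tested code), a ContinuousQueryCache gives the client a local view that does raise events as entries enter, change or leave that view:
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.net.cache.ContinuousQueryCache;
    import com.tangosol.util.AbstractMapListener;
    import com.tangosol.util.MapEvent;
    import com.tangosol.util.filter.AlwaysFilter;

    public class CqcExample {
        public static void main(String[] args) {
            // Underlying cache held on the cluster
            NamedCache cache = CacheFactory.getCache("example-cache");

            // Local, continuously updated view of the entries matching the filter
            ContinuousQueryCache localView =
                new ContinuousQueryCache(cache, new AlwaysFilter());

            // Events fire as the local view changes, not only on explicit puts
            localView.addMapListener(new AbstractMapListener() {
                public void entryInserted(MapEvent evt) {
                    System.out.println("inserted locally: " + evt.getKey());
                }
                public void entryUpdated(MapEvent evt) {
                    System.out.println("updated locally: " + evt.getKey());
                }
                public void entryDeleted(MapEvent evt) {
                    System.out.println("deleted locally: " + evt.getKey());
                }
            });

            // Reads go through the local view, which stays in sync with the server
            Object value = localView.get("some-key");
            System.out.println("read: " + value);
        }
    }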
    JK

Maybe you are looking for

  • Playlists and the previous/next buttons

    Once again, Encore proves to be difficult to work with. I have a bunch of m4v and matching ac3 files that I want to put into Encore to create a Blu-ray. However, I cannot place them all in a timeline and be done with it. Because of the null space at

  • How can I parse the document in WebI using sdk?

    I want to parse the document in WebIntelligence using an SDK. My question is: 1) With which SDK can I parse the document? The 'Report Application Server SDK'? 2) I want to parse the 'Self-Defined SQL' and 'Query' components of the document. Can the SDK

  • How to test a 'leaf' node when parses a DTD?

    Hi All: Suppose I have <!ELEMENT Name (#PCDATA)> in my dtd. What is the correct way to test the node is a 'leaf' node? I am thinking to use ElementDecl.ELEMENTS to eliminate leaves. Is it correct? As far as I see, the type of leaves is ElementDecl.MI

  • Hi, whats the best free video player/converter?

    Hi, can anyone advise me on the best free video player / converter to play some of the videos I've downloaded that QuickTime can't play?

  • Different interest percentages for different over due days

    Hi Experts, I have the following scenario with respect to the Interest calc for customers. 1. If the line item is over due till 90 days interest will be charged 10% 2. If the same line item is over due more than 90 days interest should be charged 12%