ALE conversion rules - Using a Mapping table

Hi, I need to use ALE conversion rules to modify object IDs using a custom mapping table. I'm looking at transaction BD79, but I can't find a place to use a mapping table. Does anyone know how to do it?
Thanks!

FYI
http://help.sap.com/saphelp_erp2005/helpdata/en/80/411a73382b44d0a1667816716f5b37/content.htm
http://help.sap.com/saphelp_erp2005/helpdata/en/f9/c3075cac7d11d39460005004580996/content.htm
Hope this gives you an idea!
Please award the points.
Good luck
Thanks
Saquib Khan
"Some are wise and some are otherwise"

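For what it is worth, the rule maintenance in BD79 itself only offers fixed options (constant, sender field, variable, general rule) and has no built-in lookup against a custom table. A common workaround is to choose the general rule with a special conversion routine and let that routine read the mapping table. Below is a minimal sketch, assuming a hypothetical mapping table ZOBJ_MAP with fields OLD_ID and NEW_ID and a routine name ZMAPO; all Z names are illustrative, while the CONVERSION_EXIT_<name>_INPUT naming pattern is the standard convention for conversion routines.

FUNCTION conversion_exit_zmapo_input.
*"  IMPORTING
*"     VALUE(INPUT)
*"  EXPORTING
*"     VALUE(OUTPUT)
  " Look up the incoming object ID in the custom mapping table and return the
  " mapped value; pass the value through unchanged when no entry exists.
  DATA lv_new TYPE zobj_map-new_id.

  SELECT SINGLE new_id FROM zobj_map INTO lv_new
         WHERE old_id = input.
  IF sy-subrc = 0.
    output = lv_new.
  ELSE.
    output = input.
  ENDIF.
ENDFUNCTION.

The routine name is then entered under the general rule in BD79 rule maintenance; an analogous CONVERSION_EXIT_ZMAPO_OUTPUT module is usually created as well so the routine can be used in both directions.
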
Similar Messages

  • Query about ALE conversion rule

    Hi,
       I am trying to copy an ALE conversion rule from one system to another.
       In the rule, for one of the variables, the option 'Set Variable' is selected and a variable '&PRTMVxxxx' (Name changed)
       is set under 'Rule Type'.
       I was not aware of how this variable gets filled or what additional setting is required.
       Could someone help me out with this option under the conversion rules?
    Regards,
    Sudeep

    Hi Gaurav,
    I have exactly the same problem that you had back in late 2006, with the Conversion Rules somehow not passing the data to the IDOC segment.
    The IDOC segment is cleared of all values with the Conversion Rule active.
    I have managed to determine that the exit is working properly, at least to the point where it is finished in EXIT_SAPFKCIM_001, and SENDER_SET_NEW is filled correctly.
    Did you manage to figure out what was wrong?
    Does anyone else know how to get this working?
    Thanks,
    Bruno

  • ALE : conversion rule for field value to become null

    Dear experts,
    I need to pass a material from one system to another with ALE. The problem is that the field MARC-PRCTR (profit center) of this material is populated in the first system, but it does not exist in the second, so I assume transaction BD79 should be used to somehow convert the value of PRCTR to initial.
    How can I do that? I have already tried defining a conversion rule and assigning it to message type MATMAS, segment E1MARCM, but it did not work.
    Please help.

    Did you check this? http://help.sap.com/saphelp_erp2005/helpdata/en/d3/06f6679aaf0e44b67ca6d6b58d91df/content.htm

  • ALE Conversion rules not processed

    Hi all,
    I defined some conversion rules for an inbound IDoc with transaction BD79 in my backend system.
    If I send an IDoc from XI to the backend system the rules are ignored:
    Status 64
    No filters, No conversion, No version change.
    If I try the SAME IDoc with WE19 using the standard inbound, it works:
    Status 64
    No filters, Fields converted, No version change.
    Any idea? I guess it's a bug.
    Thank you.
    Stefano

    FYI
    http://help.sap.com/saphelp_erp2005/helpdata/en/80/411a73382b44d0a1667816716f5b37/content.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/f9/c3075cac7d11d39460005004580996/content.htm
    Hope this gives you an idea!
    Please award the points.
    Good luck
    Thanks
    Saquib Khan
    "Some are wise and some are otherwise"

  • About conversion rule using BD79

    Dear experts,
    I would like to ask for your help with a conversion rule in BD79.
    In our system, an HTTP connection is used for sending IDocs to a website. The unit of measurement needs to be converted when sending out to the receiver side. The problem is that when I convert using BD79 with rule type 'Convert sender fields' and set a condition to convert unit '%' to 'wt%', the value in the receiver field is always displayed in upper case ('WT%').
    Can anyone help me with this problem? If it cannot be done this way, please show me another possible solution.
    Also, is it possible to convert '%' to 'wt%' with some condition so that it applies only to certain characteristics?
    Many thanks in advance.
    Regards,
    Benny


  • ALE conversion rules

    Hi All,
    We are downloading HR master data into CRM through ALE.
    I have a requirement where in the telephone numbers for employees should not come into CRM from HR.
    I am trying to do this with the help of conversion rules, where I specify that the particular field in the segment be overwritten by a constant. However, I cannot have a blank value as a constant, so I am having to put in some character like a '.'
    Can anyone please let me know if there is an option by which any particular field can be just blanked out?
    Regards,
    -Sweta

    Hi,
    In the ALE implementation guide (IMG), choose:
    Transaction SALE → IDoc Interface/Application Link Enabling (ALE) → Model and Implement Business Processes → Set-Up Conversion between Sender and Recipient
    Proceed as follows:
           1.      Create rule: The rules are defined per segment.
           2.      Maintain rule: Rule maintenance specifies conversion rules at field level.
           3.      Assign rule to a message type: The assignment specifies when the rule is to be applied. This is sender/recipient and message type specific.
    Regards
    Shashi

  • ALE Conversion Rules:  Make recv'r dependant on different sender segment?

    Hi Everyone,
    I am putting in a conversion rule for MBEW-LIFO. I need this field to be checked on the receiving end ONLY for certain material types.
    I know I can create a conversion rule to place an 'X' in MBEW-LIFO, but can I make that dependent upon the material type, which is not in the E1MBEWM segment? I believe material type is in E1MARCM....
    thanks!
    Jen

    Hi Sudeep,
    Kindly look at this thread as a reference in order to resolve your problem:
    aler&idocs
    Regards
    Saurabh Goel
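    Since a conversion rule only sees the fields of its own segment, a dependency on the material type usually has to be handled in the user exit of the message type instead, where all data records of the IDoc are available. A rough sketch of the idea, assuming the exit exposes the data records as a table of EDIDD (called t_idoc_data here) and that the material type travels in an earlier segment of the same IDoc; the segment and field names and the example material types are illustrative and should be checked in WE60:

    DATA: ls_maram TYPE e1maram,   " material master segment carrying MTART
          ls_mbewm TYPE e1mbewm,   " valuation segment to be adjusted
          lv_mtart TYPE mtart.
    FIELD-SYMBOLS <ls_data> TYPE edidd.

    LOOP AT t_idoc_data ASSIGNING <ls_data>.
      CASE <ls_data>-segnam.
        WHEN 'E1MARAM'.
          " Remember the material type of the material currently being processed.
          ls_maram = <ls_data>-sdata.
          lv_mtart = ls_maram-mtart.
        WHEN 'E1MBEWM'.
          " Set the flag only for the relevant material types.
          IF lv_mtart = 'FERT' OR lv_mtart = 'HALB'.
            ls_mbewm = <ls_data>-sdata.
            ls_mbewm-lifo = 'X'.   " field name as given in the question; verify the exact name in WE60
            <ls_data>-sdata = ls_mbewm.
          ENDIF.
      ENDCASE.
    ENDLOOP.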

  • Can we write code using BD79(ALE IDoc Conversion Rule)

    Hi -
    I have created a conversion rule for the field account group (KTOKD) on KNA1. I need to include the following logic:
    If the customer number (KNA1-KUNNR) is in the range 0010000000-0019999999 or 0140000000-0149999999, then change the account group value to "SKIP".
    So I created a conversion rule using BD62.
    In BD79, I set the constant to SKIP.
    But where do I include the following logic:
    if the customer number (KNA1-KUNNR) is in the range 0010000000-0019999999 or 0140000000-0149999999, then change the account group value to "SKIP"?
    Thanks,
    Gyanaraj

    Hi Shital,
    Thank you very much for replying to my question.
    But when I try to create it by choosing 'Use general rule' and specifying the special conversion routine ZBLNK, it throws the following error: "Conversion exit ZBLNK does not exist".
    Can you please help me out?
    Thanks,
    Gyanaraj
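    That error usually just means that the function modules behind the routine have not been created yet: a conversion routine name like ZBLNK is expected to be backed by function modules following the fixed naming pattern CONVERSION_EXIT_<name>_INPUT and CONVERSION_EXIT_<name>_OUTPUT. A minimal sketch of a blank-out routine follows (the body is illustrative; note that such a routine only receives the value of the one field it is attached to, so the customer-number range check from the original question still has to live elsewhere, for example in a user exit):

    FUNCTION conversion_exit_zblnk_input.
    *"  IMPORTING
    *"     VALUE(INPUT)
    *"  EXPORTING
    *"     VALUE(OUTPUT)
      " Whatever arrives, return an initial value so the target field is blanked out.
      CLEAR output.
    ENDFUNCTION.

    FUNCTION conversion_exit_zblnk_output.
    *"  IMPORTING
    *"     VALUE(INPUT)
    *"  EXPORTING
    *"     VALUE(OUTPUT)
      " Same behaviour in the output direction.
      CLEAR output.
    ENDFUNCTION.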

  • Import conversion data table from SAP R/3 into value mapping table in XI

    Hi:
        Does anybody know how to import a table with conversion data from SAP R/3 into a value mapping table in XI?
        The purpose is to use a mapping table that can change in the future. Must I use an ABAP program that retrieves the data and builds the value mapping table?
        If so, how do I specify the group ID, the scheme, the agency and the corresponding value in the ABAP program?
        Please help me.
        Regards!

    Hi David,
    please refer to this section in the help: http://help.sap.com/saphelp_nw04/helpdata/en/2a/9d2891cc976549a9ad9f81e9b8db25/content.htm
    There is an interface for mass replication of mapping data. The steps you need to carry out to use this are:
    Activities
    To implement a value-mapping replication scenario, proceed as follows:
           1.      Register the Java (inbound) proxies.
    To do so, call the following URLs in the following order in your Internet browser:
    - http://<host>:<port>/ProxyServer/register?ns=http://sap.com/xi/XI/System&interface=ValueMappingReplication&bean=localejbs/sap.com/com.sap.xi.services/ValueMappingApplication&method=valueMappingReplication (for the asynchronous replication scenario)
    - http://<host>:<port>/ProxyServer/register?ns=http://sap.com/xi/XI/System&interface=ValueMappingReplicationSynchronous&bean=localejbs/sap.com/com.sap.xi.services/ValueMappingApplicationSynchronous&method=valueMappingReplicationSynchronous (for the synchronous replication scenario)
    You only need to perform this step once (for each installation).
           2.      Application programming
    The ABAP program must perform the following tasks:
    - Read the value mapping data from the external table
    - Call the outbound proxy used to transfer the data to a message, which is then sent to the Integration Server
           3.      Configuration of the replication scenario in the Integration Directory
    This involves creating all the configuration objects you need to execute the scenario successfully. One special aspect of the value-mapping replication scenario is that the receiver is predefined (it must be on the Integration Server). The sender, however, is not predefined in the replication scenario and can be defined to meet your individual requirements.
    For example, you can use the shipped ABAP proxies.
    In the case of the receiver communication channel, choose the adapter type XI. Ensure that you configure a channel for the Java proxy receiver in this case.
    Enter the path prefix /MessagingSystem/receive/JPR/XI for this purpose.
    Regards
    Christine
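    To give an impression of step 2 (the ABAP side), here is a rough sketch of such a replication program. It assumes that the outbound proxy for the ValueMappingReplication interface has already been generated in SPROXY and that a custom table ZCONV_DATA holds the conversion data; every Z name, the proxy class ZCO_VALUE_MAPPING_REPLICATION, its request type and the exact structure of the message are placeholders that will differ in your system, so take this only as the shape of the program, not as copy-and-paste code:

    REPORT z_value_mapping_replication.

    DATA: lo_proxy   TYPE REF TO zco_value_mapping_replication, " generated proxy class (name differs per system)
          ls_request TYPE zvalue_mapping_replication,           " generated request structure (name differs per system)
          lt_conv    TYPE STANDARD TABLE OF zconv_data,
          ls_conv    TYPE zconv_data.

    START-OF-SELECTION.
      " 1. Read the value mapping data from the source table in R/3.
      SELECT * FROM zconv_data INTO TABLE lt_conv.

      " 2. Fill the replication message: group ID, agency, scheme and value for
      "    each entry (the field names of the generated structure are
      "    system-specific, so this loop only marks where the data goes).
      LOOP AT lt_conv INTO ls_conv.
        " ... append ls_conv-src_value / ls_conv-tgt_value to ls_request ...
      ENDLOOP.

      " 3. Send the message to the Integration Server via the outbound proxy.
      CREATE OBJECT lo_proxy.
      TRY.
          lo_proxy->execute_asynchronous( output = ls_request ). " method/parameter names as generated
          COMMIT WORK.   " asynchronous proxy calls are actually sent on commit
        CATCH cx_ai_system_fault.
          MESSAGE 'Error calling the value mapping replication proxy' TYPE 'E'.
      ENDTRY.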

  • ALE IDoc Conversion Rule or BADI

    Hi,
    I'm new to ALE IDocs. I have a requirement to hide some sensitive data fields in the HR information when the IDoc is sent outbound.
    For example, I need to send out the IT8 Basic Pay with pay scale area etc., but not the basic pay amount. I can convert the basic pay amount to zero when the IDoc is being created.
    I've searched through some information and found that there are several ways to achieve this. However, I'm not sure which one is the better way.
    Is it better to use a conversion rule (BD62), a BAdI, or IDoc reduction?
    Kindly give me some hints, as I'm very confused about which is the better solution.
    Many thanks in advance.

    No user exit is needed for a conversion rule.
    However, you should remember that conversion rules are applied to segment fields for a particular message type, so the rule will be applied wherever that IDoc message type is used. If you want to apply the rule based on any other condition in addition to the message type, it is better to go with a BAdI.
    These are the steps to create and apply a conversion rule to a message type:
    1. Transaction BD62 - Create the conversion rule and assign it to an IDoc segment.
    2. Transaction BD79 - Define the conversion rule by selecting your field and clicking the display button. Once inside, you can use various options, such as setting a constant to map the basic pay to 0.
    3. Finally, assign the conversion rule to your message type using BD55.
    As for BAdIs, IDOC_DATA_MAPPER is also suited to your requirement; its PROCESS method allows you to manipulate your IDoc fields in whichever way you want. You get the control record and the data records, which you can use to restrict your manipulation to a specific message type or other conditions.
    Regarding the above post: BAdI IDOC_CREATION_CHECK should not be used for mappings/conversions, as its specific purpose is to stop the creation of an IDoc under certain conditions.
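    To make the BAdI/exit option a bit more concrete, the manipulation itself is just a loop over the IDoc data records. A minimal sketch, assuming the basic pay sits in a segment such as E1P0008 with amount fields BET01-BET20 (check the exact segment and field names in WE60, and the exact PROCESS method signature in SE18, before wiring this in):

    " Inside the PROCESS method of the IDOC_DATA_MAPPER implementation
    " (the data record table parameter is called ct_idoc_data here; check the
    " real parameter names of the method in SE18):
    DATA ls_p0008 TYPE e1p0008.
    FIELD-SYMBOLS <ls_data> TYPE edidd.

    LOOP AT ct_idoc_data ASSIGNING <ls_data> WHERE segnam = 'E1P0008'.
      ls_p0008 = <ls_data>-sdata.
      " Keep pay scale area etc., but zero out the amount fields.
      CLEAR: ls_p0008-bet01, ls_p0008-bet02, ls_p0008-bet03.
      <ls_data>-sdata = ls_p0008.
    ENDLOOP.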

  • Lsmw..........Step 5: Maintain Field Mapping and Conversion Rules

    Hi All,
    After maintaining the first four steps in LSMW, when I go to step 5, I cannot see any of the fields which I created in the recording, so I cannot map them to the source fields. If I remember correctly, there is a button to click to see all the fields under "Fields" after TCODE. Can anyone explain how to see all the fields which I created in the recording session?
    Any inputs will be highly appreciated. I can explain the issue further if someone didn't get it.
    Regards
    Shane.

    Hi All,
    My question is: in step 5, when I go to map the fields, under "Fields" I can only see four sub-nodes, and they are:
      Fields
              __BEGIN_OF_RECORD__ Before Using Conversion Rules
                        Rule :   Default Settings
                        Code:    MATMAS = INIT_MATMAS.
              TABNAME                      Table Name
                        Rule :   Default Settings
                        Code:    MATMAS-TABNAME = 'MATMAS'.
              TCODE                        Transaction Code
                        Rule :   Default Settings
                        Code:    MATMAS-TCODE = 'MM01'.
            __END_OF_RECORD__   After Using Conversion Rules
                        Rule :   Default Settings
                        Code:    transfer_record.
    My question is: after TCODE, I should see all my fields so that I can assign the source structure fields. I can also send a screenshot if any of you reply to me at sapshane27 at gmail.
    Shane
    Edited by: shane shane on Jan 18, 2009 7:14 PM

  • Use of Association Tables in Place of FK's  --  O/R Mapping

    We are currently in the process of redesigning one of our systems following an OO model to be implemented in Java. Our current Oracle database is 8.1.7. We are coding SQL in JDBC directly, without the use of mapping tools. My role in the project is a jr. schema DBA.
    The OO designer on our team objects to holding foreign key attributes in the Java classes when (from the object model's perspective) they aren't needed (i.e. using collections will indicate parent-child relationships).
    To get around the "foreign key problem" the OO designer is pushing for the use of "association tables" in the database to replace foreign key relationships. The designer and other members of the team (the sr. schema DBA) cite literature on O/R mapping which suggests that this technique is a valid and viable approach.
    So as a simple example...
    DEP              EMP
    =======          =======
    # dep_no         # emp_no
    dep_name         dep_no
                     emp_name
    Becomes...
    DEP              EMP              A_DEP_EMP
    =======          =======          =========
    # dep_no         # emp_no         # dep_no
    dep_name         emp_name         # emp_no
    Where A_DEP_EMP is the association table showing the relationship between department and employee.
    To me it seems like we are making poor and unnecessary database design decisions to support the object model. My objection to this association table approach is that we are failing to use the inherent strengths of a relational database:
    1) Both the dep and emp must exist before the association is built (assuming the association has FK's to the DEP and EMP tables) -- and yes I'm aware delaying the enforcement of FKs until commit time is an option
    2) The model does not protect against inserting orphaned emps into the EMP table (assuming we'd want to protect against this)
    3) The association table indicates a N:M cardinality. Assuming the design calls for a 1:M cardinality a UK on the emp_no column in the association table would be required. When I brought this up, the sr. schema dba raised concerns about the overhead of the UK and suggested that the application would be responsible for enforcing the cardinality requirements.
    The alternative that seems more appealing is managing the FK details in a persistence layer of the Java classes (currently the notion of separate view and entity objects is absent in our design -- the database interface is directly in each low-level java class).
    If I am correct in my understanding of Oracle's implementation of BC4J, the foreign keys are handled in the persistence layer (view object) without the need for actively maintaining them in the entity object. Other O/R mapping models also seem to suggest that this is the way to go (e.g. Martin Fowler's recommendations http://www.martinfowler.com/isa/).
    So my question for the community at large:
    - Which approach would you use?
    - Has anyone been working with association tables in place of foreign keys? If so what are your experiences?
    - Am I off-base with my objections?
    any feedback / comments much appreciated

    Martin,
    I have seen these kind of discussions over and over again and now you happen to be right in the middle of one. Without going into technical details some statements that might help:
    1) object model designers and relational model designers will NEVER 'agree' since the techniques used are, by nature, too different (they solve different problems, which is fine, but when you start to talk about them they give problems)
    2) my experience is that they do not have to 'agree', as long as they can 'work alongside each other'. You are not there to tell the object designers how to do their work, the same way they are not there to tell you how to do your job.
    3) as a senior DBA, I regard it as my primary task to enforce regulations that will make the data stored as meaningful and correct as possible. If I'm following relational principles to do so, I will use all of them where I find them applicable, regardless of how that data is used later on. Any suggestion by any person that undermines this principle will not get implemented. Exceptions (<0.1%) not taken into account. Also, there is a 'politically correct' way of telling this and a 'bold' way.
    4) choose the 'politically correct' one, since it will save you a lot of stress :)
    5) if you violate rule 3, and shit happens, you will be held accountable (and rightfully so).
    6) if you have a more senior person telling you what to do, do it, but state your concerns very clearly in an email and save that mail + reply somewhere safe for future discussions. Refer to this if shit happens and you are, unrightfully so, held accountable.
    7) as a DBA, never trust application logic to enforce data integrity. It is your job, not the application developer's and you have the facilities to do so much easier than the app dev. Application developers don't have a primary task of testing data integrity, if it looks ok on the screen they pass their tests. you on the other hand perform special tests they don't. This is your additional value to the project.
    8) in Oracle 8i and above I found the use of views and 'instead of' triggers very useful for mimicking objects. Object designers/developers think they get simple tables they can map objects to, and underneath you yourself check that everything is OK. The disadvantage of this is that you need to do some extra work not easily accounted for, and you might run into some performance issues, but that is your challenge as you want to become a senior DBA ;). The advantage is that the discussion is over :) and reporting and other, non-object applications can interface more easily (which might be an actual decisive design argument; I've used it successfully multiple times).
    9) try to stay away from discussions of how things are done best between the two methods and why one method 'is better' than the other. Try to find a workable solution that will violate the least of the two designing principles and implement that.
    A lot is dependent on your project and it might be that in your case these senior people are quite right in their suggestions, but I've found that the above 'rules' work most of the time for me. I hope they at least give you something to think about. From what I read from your post, it sounds like you know the technical pros and cons but you have a tough time communicating to your peers. Perhaps these rules help a bit.
    Success,
    Lennert

  • Maintain Field Mapping and Conversion Rules//LSMW

    Hello Friends,
    I want to add new fields in step 5 (Maintain Field Mapping and Conversion Rules).
    In detail: I am going to upload the GL balances. For the DR and CR line items the fields are the same, and the system does not accept the same field twice, so I have added a 1 to the CR line item fields, as in the example below.
    BSEG-WRBTR (Dr line item)
    BSEG-WRBTR1 (Cr line item)
    However, the BSEG-WRBTR1 (Cr line item) field is not displayed in step 5 for mapping to a source field.
    please let me know the solution for this.
    thanks
    swapna.

    Hi,
    I would like to ask few questions.
    1. Are you using batch input recording or a program for uploading (through LSMW)?
    2. Are all your debit and credit line items the same for every transaction? I believe they should be, because you are uploading balances.
    You should not have two fields with the same name; for example, if one field is WMBTR, then WMBTR should not appear again: it should be WMBTR1 and WMBTR2. Make sure you have done the field mapping properly. When you do the field mapping, all the fields must be mapped. If any one of the fields is not mapped, it will not be uploaded.
    Please see the following LSMW sample guide:
    http://www.scmexpertonline.com/downloads/SCM_LSMW_StepsOnWeb.doc
    Maintain Object Attributes: Do the recording. Make sure that you do not have two fields with the same name; if you do, double-click on the field name and add a 1 or 2 to differentiate the field names. Copy those fields and descriptions into an Excel sheet and delete the blank lines; then, in Excel, choose Data => Text to Columns, and your field names and descriptions will now be in two columns. Copy them, put your cursor on the next sheet, then choose Edit => Paste Special => Transpose; all the columns will become rows. Now your file structure is ready.
    Maintain Source Structures: Give a unique structure name and description.
    Maintain Source Fields: Here you add the fields that are used in the first Excel sheet; just copy them and make all the fields type C with a length of 60.
    Maintain Structure Relations: Though the structure relations are already created, go to this step, click on Edit, then click on Create Structure Relation, and accept the message stating that the structure relation has already been created.
    Maintain Field Mapping and Conversion Rules: Do the field mapping for all the fields. All the fields will be stretched and you will see five rows against each one. If there is any row that has NOT stretched, there is something wrong in the mapping.
    Maintain Fixed Values, Translations, User-Defined Routines: There is nothing to be done at this step; you can simply ignore it.
    Specify Files: Make sure you have saved your Excel file as .txt (before saving, make sure you have copied the data from sheet 2 to sheet 3, and that sheet 3 is saved as a Text (Tab delimited) file). Select your file, make SURE that you select the "Tabulator" radio button, and say OK.
    Assign Files: Go to this step, click on the Create Assignment button, accept the message and say OK.
    Read Data: Remove the two check boxes and just click on the execute button. Check the log; make sure the number of entries (lines) in your Excel file matches it.
    Display Read Data: Display the data, specify lines 1 to 999, and double-click on one of the lines to see whether the fields are mapped correctly or not.
    Convert Data: Execute and check in the log that the number of entries matches.
    Display Converted Data: Display the converted data, specify 1 to 999, and double-click on one of the lines to see whether the fields are mapped correctly or not.
    Create Batch Input Session: Check the Keep Batch Input Sessions check box, then execute. If you select that check box, the session will still be there after execution and you can analyse what happened.
    Run Batch Input Session (takes you to SM35): Go to SM35, select the batch and click on the process button (execute). Make sure you have checked the first three check boxes on the right-hand side and FOREGROUND (because you want to see what it is creating). Say OK, and keep pressing ENTER on your keyboard to move the session forward.
    If you follow these steps along with the guide, you should be successful. There may be small differences between the file and what I have explained, but ultimately the purpose is the same. Hope this is useful, and let me know in case you have any issues.
    Regards, Ravi
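    For reference, once the target field appears in step 5, each mapping is just a short ABAP assignment maintained against the field, along the lines of the following (the source structure and source field names here are purely illustrative):

    BSEG-WRBTR  = SOURCE-WRBTR_DR.    " debit line item amount
    BSEG-WRBTR1 = SOURCE-WRBTR_CR.    " credit line item amount (the added field)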

  • Re: Error in LSMW Field Mapping and Conversion Rules

    Hi Friends,
    In Step Field Mapping and Conversion Rules:
    I am not able to find the fields.
    1. I checked the source structure relationship it is ok.
    2. I have assigned the fields in recording too.
    Please help me out.
    Thanks in advance
    vivek

    Hi Ram,
    I think you have gone wrong in the first step itself.
    Do it like this:
    1) Go to Maintain Object Attributes.
    2) Select Batch Input Recording and enter your recording name.
    3) Then select Recording Overview.
    4) Then press DEFAULT ALL on your application toolbar, or select Edit -> Default All.
    5) All your source fields will be mapped to recording fields.
    6) Now you can see your fields in the Field Mapping and Conversion step.
    I hope this will work; I assume you are using the batch input method.
    Reward me if useful.
    Harimanjesh AN

  • Conversion rule in ALE/IDocs

    Hi,
       I have a requirement to make use of conversion rules in ALE/IDocs.
       For a particular field, I need to concatenate the values of two other fields in the IDOC segment
       and populate the value. The other direct options like setting a constant or changing a field's value
       work fine.
       But I was not able to figure out how to implement this option. Does anyone have an idea?
    Regards,
    Sudeep

    Hi Sudeep,
    Could you let me know the exact steps you followed to create the conversion rule for IDocs?
    Thanks in advance
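    As far as I know, none of the standard rule types in BD79 (constant, sender field, variable, general rule) can combine two source fields into one, so concatenation is usually done in the user exit / customer function of the message type, where the whole data record is available. A rough sketch of the idea, assuming an illustrative segment Z1SEG with source fields FLD1 and FLD2, a target field FLD3, and the data record table exposed by the exit as t_idoc_data:

    DATA ls_seg TYPE z1seg.   " illustrative segment structure
    FIELD-SYMBOLS <ls_data> TYPE edidd.

    LOOP AT t_idoc_data ASSIGNING <ls_data> WHERE segnam = 'Z1SEG'.
      ls_seg = <ls_data>-sdata.
      " Build the target field from the two source fields.
      CONCATENATE ls_seg-fld1 ls_seg-fld2 INTO ls_seg-fld3 SEPARATED BY space.
      <ls_data>-sdata = ls_seg.
    ENDLOOP.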
