Business Configuration Set: Consistency Check

Dear experts,
I installed Compliance Calibrator successfully in our dev box. I'm trying to instruct the BASIS guy to install it on the QA box, and he received an error while installing the RTA.
While activating the BC sets, he received the following errors:
- /VIRSA/ZAUTHT - Checked activation of data for object /VIRSA/ZAUTHT not allowed for techn. reasons
- /VIRSA/ZAUTHT - Data records for object /VIRSA/ZAUTHT cannot be locked, for technical reasons
- /VIRSA/ZAUTH - Checked activation of data for object /VIRSA/ZAUTH not allowed for techn. reasons
- /VIRSA/ZAUTH - Data records for object /VIRSA/ZAUTH cannot be locked, for technical reasons
- /VIRSA/CCAUTH_52 - User Language D (German) is not in BC Set /VIRSA/CCAUTH_52
Do you know why we'd see this in our QA box, and not in our DEV box?
Thanks,
Santosh

Have them check whether the client is open for changes (SCC4) and whether the System Change Option (SE06) is set to "Modifiable" for the VIRSA or VIRSANH namespace.

Similar Messages

  • Business Configuration Set

    Can anyone provide me details on Business Configuration Sets? What is their use in SAP FICO? What is their significance? What points should be kept in mind while activating BC Sets?

    Hi Divya,
    In SAP R/3, a Business Configuration Set (BC Set) is a management tool that allows the user to record, save, and share customized settings. By creating a BC Set, the user gets a snapshot of a system's customized settings that can later be used as a template; SAP also provides pre-packaged BC Sets designed for specific industries and applications.
    BC Sets are useful because they provide continuity and prevent project team members from overwriting each other's settings.
    Please refer to the link below for more info on BC sets
    http://www.sap.com/services/pdf/BWP_SAP_Best_Practices_for_SMB.pdf
    http://help.sap.com/saphelp_nw04/helpdata/en/90/c811c8411111d395bb00a0c930dcc1/content.htm
    Assign points if helpful.
    Thanks
    radhika

  • Business Configuration sets

    Hi,
      Could anyone please tell me what the disadvantages of using BC Sets are? Reply soon.
    Thanks,

    Hi
    The major disadvantage of BC Sets is that you cannot configure the business process around the client; rather, the client has to adapt to the process that is delivered.
    Because BC Sets are preconfigured, they have a particular process mapped, and that process is meant to be followed.
    Although the process can be changed, this is normally not advisable.
    Rave
    +91.99206.33669

  • Business Configuration in ByD

    Hi, experts,
         Business Adaptation Catalog (BAC)
         Business Configuration Content(BCC)
         Business Configuration Object(BCO)
         Business Configuration Set
         Business Topic and Business Options
         When to use standard BCOs and When to use custom BCOs and Why?
         All about the business configuration.
         It is still hard for me to use them efficiently because I do not understand the relationship among them and how to use them.
         Please enlighten me about the Business Configuration Feature in ByD.
         Please provide some simple use case that can help me understand the business configuration more logically.
    Thanks in advance.
    Regards,
    Fred.
    PS. Please think of me as someone who doesn't know anything about Business Configuration in ByD.

    Hi Elsen,
    If you are trying to run B1i and B1iS, they are incompatible on the same machine. That said, I am not clear on what type of scenario you are attempting. If the incoming data is in the form of an SBO object, you can confirm that the action happened on the sending system, and do the same in reverse on the receiver system.
    If the sender system completed the task, then you have set up the inbound criteria or SLD incorrectly, and therefore B1i is not being triggered.
    I hope this helps you,
    Mike

  • Business configuration view - not assigned

    I am currently working on one extension field where users could change dropdown values.
    Currently what I have is a business configuration object, a business configuration set and a business configuration view. The dropdown field type is this business configuration object, and everything works. The problem comes when I want to edit those values. I assign the business configuration view of that object to the ImplementationProjects work center view (I saw the example in SAP's tutorial documentation). But then, in the Business Configuration work center and implementation projects, when I click my scoping question and option I receive this strange error (see screenshot), although the activity is in the project.
    Do you have any solution for this?
    Or what is your way of implementing dropdowns that can be edited?
    I have been struggling with this for two weeks now; any kind of advice or help would be really appreciated.

    Barry,
    You should ideally have assigned the business systems in your QA landscape to one group and the business systems in your dev landscape to another group. Also, I hope you have created transport targets for all your business systems in the dev environment. In your QA SLD, you should have two entries for each business system, something like
    DEV-BS-1 -> transport target -> QA-BS-1
    QA-BS-1 -> transport source -> DEV-BS-1
    Thanks
    Saravana

  • Business Configuration - Task to Set Default Logistics Area

    Hello!
    I created a customer-specific solution with a BO extension. The BO extension includes a field logisticsArea, and that field should have a default value defined by the user.
    The difficult part is that the user should be able to set a default logistics area. We thought about creating a task in the business configuration where the user has to define the default value for the implementation project.
    Is this possible? How would you provide default value setting?
    Kind regards,
    Christine Toblier

    Hi, thank you for your answer!
    Maybe the label identification is a misunderstanding. This is the key value of the code list data type. The second column should be the default logistics area. I defined the input field of this column as an OVS.
    The problem is that I cannot enter a new value for the key, so I cannot create new default values. In the UI, the input field for the key is defined to be of type "Input Field", but it is displayed as a dropdown list. I just want an input field where I can enter 01 or 02.
    I dropped the user column because I thought it was not relevant for this use case. Can you please explain why I need a user-specific setting? Is it possible to have just one default logistics area for every user?
    Kind regards,
    Christine Toblier

  • Importance of Global Consistency check

    Hello,
    I have always checked my rpd for global consistency without knowing the actual meaning behind it. But last night I created a logical column with the following expression:
    max(VALUEOF(NQ_SESSION......))
    Basically, I created an aggregation over a logical column that obtains its value from a session variable. I know that if we need to use a column as an aggregation column, we should use the aggregation tab in the column properties. When we choose an aggregation, it disables the editable column formula field.
    I put in the above formula, which violates that rule. The result is perfect as long as I don't check for global consistency. The check throws an error that looks something like this:
    [38083] The Attribute 'Acceptance Rate Target' defines a measure using an obsolete method.
    The question is: what is the significance of the global consistency check, what are the consistency criteria, and is it OK to save the rpd without checking for global consistency (this does not cause the BI Server to crash when trying to start)?
    Thanks

    First, forget about the variable approach.
    Now you need to do the following steps:
    1) Import the table to the physical layer
    2) Create logical table in existing Business Model in BMM layer with the imported table as logical table source
    3) Create another logical table in existing Business Model in BMM layer with the imported table as logical table source
    4) Create a complex join between both tables; you now have one logical dimension table and one logical fact table
    5) In the logical fact table you need to select the column ("Target") and add an aggregation rule, like MAX or MIN
    6) Assuming you have a hierarchy for every dimension in your BMM layer, you need to set the logical levels of the new measure to the Grand Total Level of each dimension hierarchy.
    By doing this, you get a "level based measure", for more info: read this:
    http://oraclebizint.wordpress.com/2007/12/03/oracle-bi-ee-101332-level-based-measures-lbms/
    By setting all logical levels to the grand total level, the measure will be "immune" to all dimensions used in your report.
    So when you have a report like
    Month__Actual___Target
    The BI Server will create two queries:
    select month, sum(sales) from calendar, sales_table where calendar.id = sales_table.calendar_id group by month
    and
    select max(value) from target_table
    The BI Server will then stitch both results together.
    Regards,
    Stijn

  • Repository Consistency Check 39008 "does not join"?

    I'm using Administration Tool 11.1.1.6.0 with a Repository version of 318.
    I have imported my star schema metadata from the database using an OCI connection. All the joins were included, so I can go to Physical->Fact Table->Physical Diagram->Object(s) and Direct Joins and it shows my fact table linked to all my dimension tables. I then clicked-and-dragged my schema to the Business layer. I created my dimension by right-clicking on my logical tables in the Business layer and choosing Create Logical Dimension -> Dimension with Level-Based Hierarchy. This worked for all the dimensions that had only one level (a base level and a grand total level), but resulted in some odd errors when done for dimensions with more than one level. I got around these errors by manually creating these dimensions, clicking-and-dragging the logical columns in, and setting up the keys.
    Only now, when I do a consistency check, I get three of the following warnings, one for each dimension that has more than one level:
    WARNINGS:
    Business Model [Business Model]:
    [39008] Logical dimension table [Logical Table] has a source [Physical Table?] that does not join to any fact source. At least, I think it is referring to the Physical Table, but changing the name of the Physical Table doesn't change the error message, though changing the Logical Table name does, so I'm not really sure what it is referring to. Here is what one looks like precisely:
    [39008] Logical dimension table Time has a source TIME that does not join to any fact source. Now, each of the three multi-level dimensions has a base level with a key that is present in the Fact Table. I can even right-click on the Fact Table in the Business layer and go to Business Model Diagram or Physical Model Diagram and get a diagram of my fact table linked to all of its dimensions, including the three in question. Analyses made in OBIEE work as long as I don't use those three dimensions.
    Does anybody have any idea what I'm missing here?

    Thanks, it looks like the field for those three logical dimensions was left blank for some reason. So it was because the Logical Dimensions weren't joining to the Fact Table, rather than the Logical Tables?

  • Routing Consistency Check in batch ?

    Hey folks!
    A business requirement has been raised to perform a consistency check for all Routing Task Lists for a specific plant. I'm aware of the consistency check within the Routing that can be performed online. Does anyone know of a batch/background job that can validate routings and report errors?
    The business case is as follows: Reference Operation Sets are utilized within Routing Task Lists. If a change is made to a Reference Operation Set (say, additional operations are added) and these additional operations conflict with the routing in which the ROS is utilized, this will cause a routing error. Yet there is no way to identify this error until someone checks the routing or writes a production order against it.
    Any help on the above would be great. Points are waiting to be awarded!

    Nathan,
    I do not think there is a background program that can be run to check for such inconsistencies.
    However, when MRP runs, the system will detect the routing inconsistency and display an exception message against the planned order.
    So the net result is that the MRP controller will know about this well before he converts the planned order to a production order.
    They can look for exception messages either under transaction MD04/MD05 or collectively under MD06/MD07, grouped by exception message.
    Message 62 - "Scheduling: Master data inconsistent" is one such message, and it falls under exception group 4 in standard SAP.
    So if the MRP controller checks transaction MD06 and limits the selection criteria to exception group 4 messages, he can nail down all such errors in advance.
    Thanks,
    Ram

  • LiveCache Consistency Check question, OM17

    I have a general question about the liveCache consistency check (transaction OM17). I know that it checks for data inconsistencies between the SAP APO database and SAP APO liveCache. But what does that mean to a functional user? Can someone explain this in layman's terms?

    In layman's terms, you can say that this checks for INCONSISTENCY between liveCache and the database.
    Here is a detailed documentation on each of the Object Types
    Consistency Check for Setup Matrices
    The consistency check for setup matrices contains:
    · A check whether the setup matrices exist in liveCache
    · A check whether the setup transitions exist in liveCache
    · A field comparison between the setup transitions in the database and those in liveCache
    When the setup matrices are corrected, the setup matrices in liveCache are completely generated from those in the database. Previously nonexistent setup matrices and setup transitions are newly created in liveCache. Superfluous setup transitions are deleted from liveCache. Setup transitions that differ at the field level are adjusted to match the database status.
    Consistency Comparison for Block Basis Definitions
    Use
    When you set this indicator, checks are performed in liveCache and in the database on characteristic values for block basis definitions:
    · The existence of block basis definitions is checked.
    · The consistency of the characteristic values is checked.
    After the checks you can call a correction function in the check results display. When correcting the error, the system deletes obsolete block basis definitions in liveCache. The system completes or corrects missing or inconsistent block basis definitions in liveCache.
    Note
    The check is performed independently of the planning version.
    Consistency Check for Resources
    The consistency check for resources contains:
    · A check that the resource and corresponding time stream exist in liveCache
    · A check that a resource's characteristic blocks exist in liveCache
    · A field comparison between the database resource and the liveCache resource
    When correcting the resources, the resources in liveCache are completely generated from the database resources. Previously nonexistent resources are created in liveCache.
    Consistency check for downtimes caused by maintenance order
    The consistency check for maintenance downtimes contains:
    A check that the maintenance downtime has a reference to an existing maintenance order.
    A check that the dates of the maintenance downtime correspond to the existing maintenance order.
    When correcting the maintenance downtime errors, downtimes without maintenance order are deleted and wrong dates of downtimes are corrected in relation to the maintenance order.
    Consistency Check for Product Location Combinations
    Use
    If you set this indicator the system executes a consistency check for product location combinations. The consistency check for product location combinations includes:
    · A check for the existence of a product location combination in the database and in liveCache
    · A field comparison between product location combinations in the database and in liveCache
    · The determination of obsolete entries for product location combinations in the database
    · A check for the existence of characteristic value assignments for product location combinations in the database and in liveCache
    · A field comparison of characteristic value assignments for product location combinations in the database and in liveCache
    After the check you can call a correction function from the display of check results. For the correction, the system deletes obsolete product location combinations from the database and in liveCache. The system corrects inconsistent product location combinations and characteristic value assignments for product location combinations in liveCache.
    Consistency Check for Stocks
    Use
    If you set this indicator the system executes a consistency check for stocks. The consistency check for stocks includes:
    · A check for the existence of a stock in the database and in liveCache
    · A field comparison between stocks in the database and in liveCache
    · The determination of obsolete entries for stocks in the database
    · A check for the existence of characteristic value assignments in the database and in liveCache
    · A field comparison between characteristic value assignments for batch stocks in the database and in liveCache
    After the check, you can call a correction function from the display of check results. For the correction, the system attempts to correct inconsistent stocks in the database and in liveCache. If a correction is not possible, the stocks are deleted in the database and in liveCache. The system corrects characteristic value assignments for batch stocks in liveCache.
    After inconsistent stocks have been corrected, it may be necessary to start the delta report in order to reconcile SAP APO and SAP R/3.
    Consistency Comparison of Configuration/CDP for Orders
    Use
    When you set this indicator, the system performs a consistency check with regard to configuration or CDP (characteristic value assignments/ requirements) for receipts/requirements belonging to orders:
    · In the case of products with variant configuration and product variants, the system checks whether there is a referenced configuration in the database.
    · In the case of products with CDP, the system checks whether CDP characteristics exist.
    Note
    In the area Restrictions, you can use the indicator CDP: Detailed Check to define a detailed check for CDP characteristics. If you set this indicator, the CDP data used for the orders is also compared with the product master.
    · For products without configuration/CDP, the system checks whether invalid references to variant configuration or invalid CDP characteristics data exist.
    After the check, you can call a correction function in the check results display. When executing the correction, the system tries to adjust inconsistent orders in liveCache.
    After inconsistent orders have been corrected, you may need to start the delta report to compare the SAP R/3 system and SAP APO again.
    Dependencies
    The consistency check for configuration or CDP data is very time-consuming. You should therefore limit the comparison as far as possible to certain products or locations.
    Consistency Check for Production Campaigns
    If orders are assigned to production campaigns that do not exist in the database, this leads to inconsistent campaigns.
    You can correct inconsistent production campaigns by removing all orders from them. That means that the campaign assignments are removed from the orders in liveCache.
    Consistency Check for Operations
    In the database table /SAPAPO/OPR (operations), there may exist operations that have no orders in liveCache, no orders for a simulation version, orders for deleted simulation versions, or no external operation number. These operations place an unnecessary load on the database table and can hinder system performance.
    Consistency Check for Planning Matrices
    As planning matrices are not master data, they are only located in liveCache. For each production version, there is a record in the database with information about matrices that must exist for this production version and whether the last matrix explosion was successful.
    The consistency check for planning matrices checks:
    · Whether the matrices associated with each record on the database exist in liveCache
    · Whether the records associated with all the matrices in liveCache exist on the database
    · Whether the last matrix explosion was successful.
    If inconsistencies are discovered, they can be corrected. As corrections are made by recalculating the inconsistent matrices, the process can take a while; for large matrices (with many orders or many item variants) it should only be done at times when it is guaranteed not to hinder any other system processes.
    Consistency Check for Simulation Versions
    This is a check for whether simulation versions exist in liveCache.
    Correction does not take place automatically. Simulation versions that still exist in the database but no longer exist in liveCache do not influence the running of the system. If necessary, they can be deleted using transaction /SAPAPO/CDPSS0.
    Consistency Check for Product Allocations
    The consistency check for product allocations checks the data for product allocation assignment from the database and compares this with the incoming orders quantity in Demand Planning. Surpluses or shortages are displayed and can be corrected.
    The reconcile is only executed for product allocation groups with a direct connection to the product allocation group in the planning area if the connection is also fully defined.
    There may be long runtimes during the consistency check due to the data structure. The following factors can hinder performance:
    · Number of characteristics combinations
    · Number of periods in a time series
    · Number of sales orders that take product allocations from a time stream
    Error in the reconcile
    If it is not possible to reconcile the incoming orders quantity, the data records are issued again with a relevant error message. Check the following causes and attempt again.
    Check:
    · If the planning area to be checked is locked
    · If the time streams are initialized (after liveCache has been initialized)
    · If all characteristics combinations are available in the planning area
    · The wildcard indicator for collective product allocation
    · The settings for your planning area
    Due Delivery Schedules/Confirmations Consistency Check
    When a scheduling agreement release is received from a customer for sales and distribution scheduling agreement items, a due delivery schedule is created and stored in liveCache. As soon as a confirmation for a due delivery schedule containing at least one schedule line with a quantity larger than zero is generated, an object is also created for it in liveCache. The transaction data for sales and distribution scheduling agreement items contains, amongst other things, an entry with the key of the due delivery schedule object currently located in liveCache and an entry with the key of the confirmation that is currently valid in the database. During the check, the system checks whether liveCache objects exist for sales and distribution scheduling agreement items and whether the transaction data entries are correct.
    The following individual checks are made for active sales and distribution scheduling agreement items:
    · Is there an operative scheduling agreement release and/or forecast/planning delivery schedule in the database, but no associated liveCache object?
    · Is there a confirmation in the database, but no associated liveCache object?
    · Is there a due delivery schedule in liveCache, without at least one existing operative scheduling agreement release and/or forecast/planning delivery schedule?
    · Is there a confirmation in liveCache, without an existing confirmation in the database?
    · Is the key in the transaction data in the database that shows the current due delivery schedule in liveCache, also that of the actual liveCache object?
    · Is there actually a confirmation in the database for the key in the transaction data that shows the currently valid confirmation in the database?
    If a sales and distribution scheduling agreement item is inactive, there are not allowed to be any due delivery schedules or confirmations in liveCache. In this case, the following checks are made:
    · Is there a due delivery schedule in liveCache for an inactive sales and distribution scheduling agreement item?
    · Is there a confirmation in liveCache for an inactive sales and distribution scheduling agreement item?
    Consistency Check for Production Backflushes
    Partially confirmed orders cannot be deleted from liveCache. For each partially confirmed order of the database table, there must be a corresponding order in liveCache. If no order exists, there is a data inconsistency that can only be rectified by deleting the order from the database tables of the confirmation.
    Entries for orders that have already been confirmed exist in the status matrix. The entry in an order's status matrix is deleted when the confirmed order is deleted by the /SAPAPO/PPC_ORD archiving report. Each status matrix entry for which the database tables of the confirmation do not exist presents an inconsistency that can only be removed by deleting the status matrix entry.
    Consistency Check for iPPE Objects
    The iPPE object is not an iPPE master data structure. It is a data extract that is generated for each iPPE access object.
    The consistency check for the object checks that it exists in liveCache and also determines its identity using the backup copy in the database. When correcting the object, the copy from the database is written to liveCache.
    It is necessary to check the object if the following error message occurs: 'Error while calling COM routines via application program' (/sapapo/om 102) with return code 1601, 1602, or 1603. This does not apply to liveCache initialization.
    Consistency Check for Procurement Scheduling Agreement Items
    The following three objects represent procurement scheduling agreement items (scheduling agreement in short):
    1. Scheduling agreement schedule lines
    2. Release schedule lines
    3. Confirmations
    All these objects are located in liveCache. Release schedule lines and confirmations are also located in the database with a historical record. Depending on the process that was set up for the scheduling agreement, not all objects exist in liveCache or have historical records.
    If goods movements exist for an object, there must always be at least one entry in liveCache. If all schedule lines are covered by goods receipts, at least one schedule line will exist in liveCache with the number '0000000000' and an open quantity of 0.
    A liveCache crash, operator errors, and program errors can all cause inconsistencies. Below is a list of all the inconsistent statuses that have been identified and that can be removed:
    1. The object is not in liveCache but goods receipts exist.
    2. The number of input nodes and output nodes is different.
    3. There are no input nodes at the order, but the material exists in the source location for the order.
    4. The original quantity at the source of an order is different from that at the destination.
    5. The accumulated quantities in liveCache are different from those in the database (the cumulative received quantity, for example).
    6. The set process is compared with the status in liveCache.
    7. A check is made to see whether the scheduling agreement is being planned in APO or in R/3 and whether the schedule lines have the appropriate specification.
    If a schedule line inconsistency is identified, no more checks for inconsistencies are made; instead, the check moves on to the next schedule line.
    Consistency Check for MSP Orders
    Provides a list of maintenance and slot orders that
    ·     Exist in the database but have no corresponding orders in the liveCache
    ·     Exist in the liveCache, but have no corresponding orders in the database
    Procedure
    From within the list, you can either
    ·     Correct the inconsistencies
    If you choose to do this, the system deletes the selected orders from the database and/or liveCache.
    You receive a message that the selected order(s) have been deleted.
    ·     Leave the inconsistencies in the database and/or liveCache
    Such inconsistencies place an unnecessary load on the database and/or liveCache. Moreover, those orders that exist in the liveCache but have no corresponding orders in the database influence the scheduling results of subsequent orders in the liveCache.
    Hope this helps
    Regards
    Kumar Ayyagari

  • Business Function Set: Enterprise AddOns in Solution Manager

    Hello
    For testing purposes I want to activate the Enterprise AddOn EA_PLM in transaction SFW5 inside a Solution Manager 4 installation, but no Enterprise AddOn is available. Additionally, there is only one Business Function Set, called "telco reference pack", with two Business Functions. Do I have to set up something before I can activate the Enterprise AddOns? I thought PLM was part of the Solution Manager 4 installation.

    Hello Marc,
    the Solution Manager consists "only" of a CRM component, the SolMan component itself and the NetWeaver 2004s platform. So no PLM (and no Enterprise AddOn) is available. SolMan covers some functionality of Software Lifecycle Management, but does not include the PLM coding itself. For EA_PLM you need an ERP system.
    Best regards, Alexander

  • Consistency check error in sender file content conversion

    Dear Experts,
    I am trying one simple file-to-file scenario in which I have only two fields: 1) salesorder_number 2) Description
    I created a txt file with the line item "100,crudeoil" and the file name salesorder.txt
    In the sender file adapter I have configured the file conversion parameters as follows:
    Document name : mt_salesorder_sender  
    Document namespace : http://se.com.sa/sec-sa-qax          (namespace of the message type)
    Recordset Structure : document,*
    document.endSeparator : 'nl'
    document.fieldSeparator: ,
    document.fieldNames : salesorder_number,Description
    After this, when I run the scenario, the file adapter gives me an error like "Conversion initialization failed: java.lang.Exception: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found: Parameter 'document.fieldFixedLengths' or 'document.fieldSeparator' is missing Consistency check: no. of arguments in 'document.fieldFixedLength' does not match 'document.fieldNames' (0 <> 2) "
    Please let me know where I am going wrong.
    Regards,
    Santhosh

    Hi,
    please maintain the order as
    Recordset Structure : document,*
    document.fieldNames : salesorder_number,Description
    document.fieldSeparator: ,
    document.endSeparator : 'nl'
    Rgds
    joel
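    For reference, here is a minimal sketch of the complete sender file content conversion parameter set for the two-field comma-separated structure above (the document name, namespace and field names are taken from the question; the values are an untested illustration, not a verified configuration):
    Document name : mt_salesorder_sender
    Document namespace : http://se.com.sa/sec-sa-qax
    Recordset Structure : document,*
    document.fieldNames : salesorder_number,Description
    document.fieldSeparator : ,
    document.endSeparator : 'nl'
    The sender adapter expects document.fieldFixedLengths or document.fieldSeparator for each structure; when it recognizes neither (for example because of a typo or a stray space in the parameter name), it reports the "0 <> 2" mismatch against document.fieldNames quoted in the question.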

  • Configuring double invoice check for vendor invoices posted through FB60

    Dear all
    Can anyone tell me how to configure the double invoice check for vendor invoices posted through FB60?
    For MIRO documents we can use transaction OMRDC.
    Is there any such transaction that can be used to configure the double invoice check for FI invoices?
    regards
    Expertia

    Dear Expertia,
    In FI, when checking for duplicate invoices, the system compares the following: vendor, currency, company code, gross amount of the invoice, reference document number and invoice document date.
    SAP Note 305201 clarifies this in more detail; please read it.
    The following fields must be identical for Duplicate invoice check
         Company code                              (BUKRS)
         Vendor number                             (LIFNR)
         Currency                                  (WAERS)
         Reference number                          (XBLNR)
         Amount in document currency               (WRBTR)
         Document date                             (BLDAT)
    If any one of the above fields is different, the system does not consider the document a duplicate invoice.
    The system also checks the duplicate invoice check flag in the vendor master data and whether the "sales-related" checkbox is selected for the posting key.
    The setting you make in OMRDC (Materials Management -> Logistics Invoice Verification -> Incoming Invoice -> Set Check for Duplicate Invoices) is only valid for MM and not for FI invoices posted via FB60/FB65.
    You should check the F1 help on field "Chk double inv." (LFB1-REPRF)
    in the relevant vendor master record (transaction FK03).
    Please also check that message F5 117 has been set up correctly in the IMG using this path:
    Financial Accounting -> Financial Accounting Global Settings -> Document -> Default Values for Document Processing -> Change Message Control for Document Processing
    Finally, and most importantly, check that the relevant posting key is defined as sales-related in transaction OB41. You have to flag this field for the duplicate invoice check to work.
    I hope this helps You.
    mauri

  • Reg: consistency check in recordset structure validation

    Hi all,
    I am doing an FCC to FCC scenario and I am getting this error. Please help me out; it is very urgent for me.
    Sender Adapter v2723 for Party '', Service 'bs_text2text':
    Configured at 2007-08-21 08:00:49 UTC
    History:
    - 2007-08-21 09:29:06 UTC: Retry interval started. Length: 2.000 s
    - 2007-08-21 09:29:06 UTC: Error: Conversion of complete file content to XML format failed around position 0: Exception: ERROR consistency check in recordset structure validation (line no. 87: missing structure(s) in last recordset
    - 2007-08-21 09:29:06 UTC: Processing started
    - 2007-08-21 09:29:04 UTC: Error: Conversion of complete file content to XML format failed around position 0: Exception: ERROR consistency check in recordset structure validation (line no. 87: missing structure(s) in last recordset
    - 2007-08-21 09:29:04 UTC: Processing started
    Regards,
    Ajay.

    Hi Ajay,
    There is a problem with the file content conversion parameters you specified in the sender file adapter.
    Please let us know your data type structure and the FCC parameters specified in the adapter.
    Regards,
    Sumit

  • ERROR consistency check in recordset structure

    Hello All,
    Below is my input file format looks like:
    HEADER
    REC_1
    REC_1
    REC_1
    REC_2
    FOOTER
    And I have defined the Recordset Structure as HEADER,1,REC_1,*,REC_2,*,FOOTER,1. But the occurrence of REC_2 is optional (0 to unbounded).
    Conversion of file content to XML failed at
    position 0: java.lang.Exception: ERROR consistency check in recordset structure
    validation (line no. 4: missing structure(s) before type 'HEADER'
    It looks like the error is due to the missing REC_2 segment in the input file.
    Can you please tell me how to handle this in FCC?
    Thanks,

    Hi Naresh
    IMHO, I don't think that the error is due to the missing REC_2 in the input file.
    From the SAP Library, * also includes 0 occurrences.
    Converting Text Format in the Sender File/FTP Adapter to XML - Configuring the File/FTP Adapter in Integration Directory…
    Under Recordset Structure, enter the sequence and the number of substructures as follows: <NameA,nA,NameB,nB,...>, where nA = 1,2,3,... or * (for a variable, unlimited number, including 0).
    Please provide screenshot of your FCC configuration and also sample input file that is causing the error so that it can be analysed further.
    Rgds
    Eng Swee
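    As a rough sketch only (the field names and key field values below are placeholders, not taken from the actual configuration): when a recordset contains several structure types with variable occurrences, the adapter normally needs a key field so it can tell the record types apart, along these lines:
    Recordset Structure : HEADER,1,REC_1,*,REC_2,*,FOOTER,1
    Key Field Name : record_type
    HEADER.fieldSeparator : ,
    HEADER.fieldNames : record_type,header_field
    HEADER.keyFieldValue : HDR
    REC_1.fieldSeparator : ,
    REC_1.fieldNames : record_type,field1,field2
    REC_1.keyFieldValue : R1
    REC_2.fieldSeparator : ,
    REC_2.fieldNames : record_type,field1,field2
    REC_2.keyFieldValue : R2
    FOOTER.fieldSeparator : ,
    FOOTER.fieldNames : record_type,footer_field
    FOOTER.keyFieldValue : FTR
    If the key field values do not match what actually appears in the file, lines can be attributed to the wrong structure, which is the kind of situation that produces "missing structure(s) before type ..." errors; the screenshot of the actual FCC configuration that Eng Swee asks for would show whether that is the case here.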
