Process / job that populates data in LBWQ

Hi,
We have 2 clients - a master client and a development client. I want to make some changes to 2LIS_11_VASCL. I deleted the setup tables, scheduled the job in LBWE and also made sure to delete the data from RSA7. But when I go to LBWQ it still has MCEX11 in the queue. I am just not able to enhance because of this.
Is there a background job/process that runs to populate the queue? If so, what is it, so that I can stop it in SM37? I have checked all released and scheduled jobs in SM37 and none of them are related to 2LIS_11_VASCL. I have set up a job in the job control of LBWE for a daily run, but there is still data in LBWQ. Please help.

Hi Suja,
You have to schedule a V3 update job in LBWE in order to clear the outbound queue. Data in the outbound queue is populated whenever a user modifies sales order schedule lines. Please ensure that no one is changing sales orders while you enhance the DataSource structure.
Regards,
Celeste

Similar Messages

  • Identify Jobs that are not running on their Scheduled date time

    I have 29 scheduled jobs that run at different intervals. Some run once a day, a few others run on an hourly basis, while others run on Sundays.
    I was working on a query that would let me know if a particular job did not run on its scheduled date and time.
    SELECT * FROM all_scheduler_jobs WHERE state <> 'DISABLED'; gives me a list of all jobs that I have to monitor and that are not in the disabled state. But how can I verify that the jobs are running at their scheduled date and time?
    Any help please? I need to create a view of all such jobs and then plan to send an alert so that appropriate action can be taken and it is assured that all important jobs run as per schedule.
    Thanks.

    Hi,
    I can see 2 approaches.
    - for jobs that have run but ran very late, query dba_scheduler_job_run_details and filter on the difference between req_start_date and actual_start_date
    - for jobs that should have run but haven't, query DBA_SCHEDULER_JOBS for jobs in state SCHEDULED whose next_run_date is in the past (see the sketch below)
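    A minimal sketch of both checks, assuming the standard scheduler dictionary views and an arbitrary 15-minute lateness threshold (adjust it to your own definition of "late"):
    -- Jobs that ran, but started much later than requested
    SELECT job_name, req_start_date, actual_start_date
      FROM dba_scheduler_job_run_details
     WHERE actual_start_date - req_start_date > INTERVAL '15' MINUTE;
    -- Jobs that should already have run but have not
    SELECT job_name, state, next_run_date
      FROM dba_scheduler_jobs
     WHERE state = 'SCHEDULED'
       AND next_run_date < SYSTIMESTAMP;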
    Hope this helps,
    Ravi.

  • Set the custom timer job that copy the data in the same list

    Hi
    How can I set up a custom timer job that copies the data in the same list?
    Thanks,

    hello
    Please note that it is wrong to use this to copy all of the List3 items into List1 - the example below effectively copies only one item.
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using Microsoft.SharePoint;
    //This namespace is used for the SPJobDefinition class
    using Microsoft.SharePoint.Administration;

    namespace GENERAL_CustomTimerJob
    {
        //To create a custom timer job, first add a class to your SharePoint project and
        //inherit from SPJobDefinition. Implement the constructors and override the Execute
        //method as shown below. To install your timer job, and set the schedule, you must
        //add a Feature and a Feature receiver.
        class GENERAL_CustomTimerJob : SPJobDefinition
        {
            #region Constructors
            //You must implement all three constructors
            public GENERAL_CustomTimerJob()
                : base()
            {
            }

            public GENERAL_CustomTimerJob(string jobName, SPService service,
                SPServer server, SPJobLockType targetType)
                : base(jobName, service, server, targetType)
            {
            }

            public GENERAL_CustomTimerJob(string jobName, SPWebApplication webApplication)
                : base(jobName, webApplication, null, SPJobLockType.ContentDatabase)
            {
                //Set the title of the job, which will be shown in the Central Admin UI
                this.Title = "Simple Example Timer Job";
            }
            #endregion

            //Override the Execute method to run code.
            public override void Execute(Guid targetInstanceId)
            {
                //Get the Web Application in which this Timer Job runs
                SPWebApplication webApp = this.Parent as SPWebApplication;

                //Get the site collections of the content database the job runs against
                SPSiteCollection timerSiteCollection = webApp.ContentDatabases[targetInstanceId].Sites;

                //Look for the source list ("List3") in the RootWeb of each SPSite
                SPList timerJobList = null;
                foreach (SPSite site in timerSiteCollection)
                {
                    timerJobList = site.RootWeb.Lists.TryGetList("List3");
                    if (timerJobList != null)
                    {
                        string sourceUrl = "http://aydi-pc";
                        string destinationUrl = "http://aydi-pc/sites/Site1/";
                        string sourceList = "List3";
                        string destinationList = "List1";

                        using (SPSite sourceSite = new SPSite(sourceUrl))
                        using (SPWeb sourceWeb = sourceSite.OpenWeb())
                        using (SPSite destSite = new SPSite(destinationUrl))
                        using (SPWeb destWeb = destSite.OpenWeb())
                        {
                            SPList ObjSourcelist = sourceWeb.Lists.TryGetList(sourceList);
                            SPList ObjDestinationlist = destWeb.Lists.TryGetList(destinationList);

                            //Note: only one destination item is created here, so the loop
                            //below overwrites it and only the last source item is kept.
                            SPListItem newItem = ObjDestinationlist.Items.Add();
                            SPListItemCollection items = ObjSourcelist.Items;
                            if (items.Count > 0)
                            {
                                foreach (SPListItem item in items)
                                {
                                    newItem["Title"] = item["Title"];
                                    newItem["Nom"] = item["Nom"];
                                    newItem["Prenom"] = item["Prenom"];
                                    newItem.Update();
                                }
                            }
                        }
                    }
                }
            }
        }
    }

  • After upgrading my iPhone, I lost all my contacts..... and they were replaced with my wife's contacts even though she upgraded after I did. I did a time machine backup before I started the upgrade process, how do I get that saved data back to my phone?

    After upgrading my iPhone, I lost all my contacts and they were replaced with my wife's contacts even though she upgraded after I did. I did a time machine backup before I started the upgrade process, how do I get that saved data back to my phone?

    Hi bluegrandpanash,
    Thanks for visiting Apple Support Communities.
    If you backed up your iPhone to iCloud before updating the software, first try the steps under "Restore from an iCloud backup" in this article to recover your data:
    iOS: Back up and restore your iOS device with iCloud or iTunes
    http://support.apple.com/kb/HT1766
    Best Regards,
    Jeremy

  • Process the result so that the dates are ordered by date

    I am retrieving a query result from a SQL database in which one column contains a string of this format: 09/08/2008
    How can I process this result so that the dates are ordered by date?
    Thanks for any help :)

    What's your DB and version?
    Pretty much all DBs have built-in string and date functions, a combination of which will let you convert your field to a date/datetime.
    You will probably want to create a new field in the table to store the converted dates, then check that they have all been converted correctly, then delete the current date field and rename the new field to the old date field's name.
    Using your DB's string functions you will need to extract the day, month and year parts of your date text and pass them to your DB's date creation function.
    If your DB is MySQL, it has a very handy STR_TO_DATE(str, format) function... check the details in the MySQL reference manual.
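    For example, a sketch in MySQL, assuming the strings are day/month/year and using hypothetical table/column names my_table and date_text (swap the format mask to '%m/%d/%Y' if they are month/day/year):
    -- Convert the stored string to a real DATE and sort by it
    SELECT STR_TO_DATE(date_text, '%d/%m/%Y') AS real_date
    FROM my_table
    ORDER BY real_date;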
    Azadi Saryev
    Sabai-dee.com
    http://www.sabai-dee.com/

  • What are the major process to transfer the data from legacy to sap system.

    What are the major processes used to transfer data from a legacy system to the SAP system using BDC in real time?

    Hi,
    BATCH DATA COMMUNICATION
    The main methods are:
    1. SESSION METHOD
    2. CALL TRANSACTION
    3. DIRECT INPUT
    Advantages offered by the BATCH INPUT method:
    1. It can process large data volumes in batch.
    2. It can be planned and submitted in the background.
    3. No manual interaction is required while data is transferred.
    4. Data integrity is maintained because all data reaches the tables through transactions, so batch input data is subjected to all the usual checks and validations.
    To implement one of the supported data transfers, you must often write the program that exports the data from your non-SAP system. This program, known as a “data transfer” program must map the data from the external system into the data structure required by the SAP batch input program.
    The batch input program must build all of the input to execute the SAP transaction.
    Two main steps are required:
    • To build an internal table containing every screen and every field to be filled in during the execution of an SAP transaction.
    • To pass the table to SAP for processing.
    Prerequisite for Data Transfer Program
    Writing a data transfer program involves the following prerequisites:
    Analyzing the data in the local file
    Analyzing the transaction
    Analyzing the transaction involves finding out:
    • The transaction code, if you do not already know it.
    • Which fields require input, i.e., are mandatory.
    • Which fields you can allow to default to standard values.
    • The names, types, and lengths of the fields that are used by the transaction.
    • The screen numbers and the name of the module pool program behind the transaction.
    To analyze a transaction:
    • Start the transaction by menu or by entering the transaction code in the command box.
    (You can determine the transaction name by choosing System – Status.)
    • Step through the transaction, entering the data that will be required for processing your batch input data.
    • On each screen, note the program name and screen (dynpro) number.
    (A dynpro is a "dynamic program", i.e., a screen together with its flow logic.)
    • Display these by choosing System – Status. The relevant fields are Program (dynpro) and Dynpro number. If pop-up windows occur during execution, you can get the program name and screen number by pressing F1 on any field or button on the screen.
    The technical info pop-up shows not only the field information but also the program and screen.
    • For each field, check box, and radio button on each screen, press F1 (help) and then choose Technical Info.
    Note the following information:
    - The field name for batch input, which you'll find in its own box.
    - The length and data type of the field. You can display this information by double-clicking on the Data Element field.
    • Find out the identification code for each function (button or menu) that you must execute to process the batch input data (or to go to a new screen).
    Place the cursor on the button or menu entry while holding down the left mouse button, then press F1.
    In the pop-up window that follows, choose Technical info and note the code that is shown in the Function field.
    You can also run any function that is assigned to a function key by way of the function key number. To display the list of available function keys, click the right mouse button. Note the key number that is assigned to the functions you want to run.
    Once you have the program name, screen number and field name (screen field name), you can start writing the data transfer program.
    Declaring internal table
    First, an internal table with a structure similar to the local file.
    Declaring an internal table like BDCDATA
    The data from the internal table is not transferred directly to the database table; it has to go through a transaction. You need to pass the data to a particular screen and to a particular screen field. Data is passed to the transaction in a particular format, hence the need for a batch input structure.
    The batch input structure stores the data that is to be entered into SAP system and the actions that are necessary to process the data. The batch input structure is used by all of the batch input methods. You can use the same structure for all types of batch input, regardless of whether you are creating a session in the batch input queue or using CALL TRANSACTION.
    This structure is BDCDATA, which can contain the batch input data for only a single run of a transaction. The typical processing loop in a program is as follows:
    • Create a BDCDATA structure
    • Write the structure out to a session or process it with CALL TRANSACTION USING; and then
    • Create a BDCDATA structure for the next transaction that is to be processed.
    Within a BDCDATA structure, organize the data of screens in a transaction. Each screen that is processed in the course of a transaction must be identified with a BDCDATA record. This record uses the Program, Dynpro, and Dynbegin fields of the structure.
    The screen identifier record is followed by a separate BDCDATA record for each value, to be entered into a field. These records use the FNAM and FVAL fields of the BDCDATA structure. Values to be entered in a field can be any of the following:
    • Data that is entered into screen fields.
    • Function codes that are entered into the command field. Such function codes execute functions in a transaction, such as Save or Enter.
    The BDCDATA structure contains the following fields:
    • PROGRAM: Name of module pool program associated with the screen. Set this field only for the first record for the screen.
    • DYNPRO: Screen Number. Set this field only in the first record for the screen.
    • DYNBEGIN: Indicates the first record for the screen. Set this field to X, only for the first record for the screen. (Reset to ‘ ‘ (blank) for all other records.)
    • FNAM: Field Name. The FNAM field is not case-sensitive.
    • FVAL: Value for the field named in FNAM. The FVAL field is case-sensitive. Values assigned to this field are always padded on the right, if they are less than 132 characters. Values must be in character format.
    Transferring data from local file to internal table
    Data is uploaded into the internal table with the UPLOAD or WS_UPLOAD function, for example:
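    A minimal sketch, assuming a tab-delimited local file and an illustrative internal table IT_DATA whose line type matches the file (ZFILE_LINE and the path are illustrative, not standard objects):
    DATA: it_data TYPE TABLE OF zfile_line.    "line type matching the flat file
    CALL FUNCTION 'WS_UPLOAD'
      EXPORTING
        filename = 'C:\data\upload.txt'        "illustrative path
        filetype = 'DAT'                       "tab-delimited file
      TABLES
        data_tab = it_data.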
    Population of BDCDATA
    For each record of the internal table, you need to populate another internal table whose structure is BDCDATA. Two small helper routines are typically used for this; a sketch follows.
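    A minimal sketch of the two helper routines most BDC programs use (the names BDC_DYNPRO and BDC_FIELD are illustrative, not prescribed by SAP):
    DATA: it_bdcdata TYPE TABLE OF bdcdata,
          wa_bdcdata TYPE bdcdata.
    FORM bdc_dynpro USING p_program p_dynpro.
      "Start a new screen: first BDCDATA record for the screen
      CLEAR wa_bdcdata.
      wa_bdcdata-program  = p_program.
      wa_bdcdata-dynpro   = p_dynpro.
      wa_bdcdata-dynbegin = 'X'.
      APPEND wa_bdcdata TO it_bdcdata.
    ENDFORM.
    FORM bdc_field USING p_fnam p_fval.
      "Add one field value (or an OK code) for the current screen
      CLEAR wa_bdcdata.
      wa_bdcdata-fnam = p_fnam.
      wa_bdcdata-fval = p_fval.
      APPEND wa_bdcdata TO it_bdcdata.
    ENDFORM.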
    All these five initial steps are necessary for any type of BDC interface.
    DATA TRANSFER program can call SESSION METHOD or CALL TRANSACTION. The initial steps for both the methods are same.
    First step for both the methods is to upload the data to internal table. From Internal Table, the data is transferred to database table by two ways i.e., Session method and Call transaction.
    SESSION METHOD
    About Session method
    In this method you transfer data from the internal table to the database table through sessions.
    An ABAP/4 program reads the external data that is to be entered into the SAP system and stores it in a session. A session stores the actions that are required to enter your data using normal SAP transactions, i.e., the data is transferred to the session, which in turn transfers the data to the database table.
    The session is an intermediate step between the internal table and the database table. The data is stored in the session together with its actions, i.e., the data for the screen fields, which screen it is passed to, the program name behind it, and how the next screen is processed.
    When the program has finished generating the session, you can run the session to execute the SAP transactions in it. You can either explicitly start and monitor a session or have the session run in the background processing system.
    Unless the session is processed, the data is not transferred to the database table.
    BDC_OPEN_GROUP
    You create the session in your program with the BDC_OPEN_GROUP function module.
    Parameters of this function are:
    • User Name: the user under whose name the session is processed
    • Group: the name of the session
    • Lock Date: the date before which the session cannot be processed
    • Keep: pass 'X' if you want to retain the session after processing, or ' ' to delete it after processing
    BDC_INSERT
    This function inserts the data for one transaction into the session.
    Parameters to this function are:
    • Tcode: Transaction Name
    • Dynprotab: BDC Data
    BDC_CLOSE_GROUP
    This function closes the BDC group. It needs no parameters. Putting the three calls together, the session method looks roughly like the sketch below.
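    A minimal sketch of the session method, assuming the illustrative internal table IT_DATA from the upload step and a routine FILL_BDCDATA that builds IT_BDCDATA for one record (the session name and transaction code are illustrative as well):
    DATA: wa_data LIKE LINE OF it_data.
    CALL FUNCTION 'BDC_OPEN_GROUP'
      EXPORTING
        client = sy-mandt
        group  = 'ZCUSTOMER'              "session name, illustrative
        user   = sy-uname
        keep   = 'X'.
    LOOP AT it_data INTO wa_data.
      PERFORM fill_bdcdata.               "build IT_BDCDATA for this record
      CALL FUNCTION 'BDC_INSERT'
        EXPORTING
          tcode     = 'XD02'              "illustrative transaction code
        TABLES
          dynprotab = it_bdcdata.
      REFRESH it_bdcdata.
    ENDLOOP.
    CALL FUNCTION 'BDC_CLOSE_GROUP'.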
    Some additional information for session processing
    When the session is generated using the KEEP option of BDC_OPEN_GROUP, the system always keeps the session in the queue, whether it has been processed successfully or not; after it has been processed you have to delete it manually. When session processing completes successfully and the KEEP option was not set, the session is removed automatically from the session queue; the log for that session is not removed.
    If a batch input session terminates with errors, it appears in the list of incorrect sessions and can be processed again. To correct an incorrect session you can analyze it: the Analysis function allows you to determine which screen and which value produced the error. If you find small errors in the data, you can correct them interactively; otherwise you need to modify the batch input program that generated the session, or often even the data file.
    CALL TRANSACTION
    About CALL TRANSACTION
    This technique is similar to the session method, but while batch input is a two-step procedure, CALL TRANSACTION does both steps online, one after the other. In this method, you call a transaction from your program with:
    Call transaction <tcode> using <BDCTAB>
    Mode <A/N/E>
    Update <S/A>
    Messages into <MSGTAB>.
    Parameter 1 is the transaction code.
    Parameter 2 is the name of the BDCTAB table.
    Parameter 3 specifies the mode in which the transaction is executed:
    A is all-screens mode: all the screens of the transaction are displayed.
    N is no-screen mode: no screen is displayed when you execute the transaction.
    E is error-screens mode: only those screens are displayed on which there is an error record.
    Parameter 4 specifies the update type with which the database is updated:
    S is for synchronous update: the program waits until the table and all related tables have been updated, and only then is sy-subrc returned, i.e., sy-subrc is returned once and for all.
    A is for asynchronous update: sy-subrc is returned as soon as the data has been passed, and the update of the affected tables happens afterwards. So if the system fails to update those tables, the returned sy-subrc is still 0 (set when the first step succeeds).
    Parameter 5: when you update a database table, the operation is either successful, unsuccessful, or successful with warnings. These messages are stored in an internal table that you specify with the MESSAGES INTO addition. This internal table should be declared like BDCMSGCOLL, a structure available in ABAP/4. It contains the following fields:
    1. Tcode: Transaction code
    2. Dyname: Module pool (program) name
    3. Dynumb: Screen (dynpro) number
    4. Msgtyp: Batch input message type (A/E/W/I/S)
    5. Msgspra: Language ID of the message
    6. Msgid: Message ID
    7. MsgvN: Message variables (N = 1 - 4)
    For each entry that is updated in the database, a message is available in BDCMSGCOLL. As BDCMSGCOLL is a structure, you need to declare an internal table of this type so that it can hold multiple records (unlike a structure).
    Steps for CALL TRANSACTION method
    1. Internal table for the data (structure similar to your local file)
    2. BDCTAB like BDCDATA
    3. UPLOAD or WS_UPLOAD function to upload the data from local file to itab. (Considering file is local file)
    4. Loop at itab.
    Populate BDCTAB table.
    Call transaction <tcode> using <BDCTAB>
    Mode <A/N/E>
    Update <S/A>.
    Refresh BDCTAB.
    Endloop.
    (To populate BDCTAB, You need to transfer each and every field)
    The major differences between the session method and CALL TRANSACTION are as follows:
    1. Session method: data is not updated in the database table unless the session is processed. CALL TRANSACTION: the database table is updated immediately.
    2. Session method: no sy-subrc is returned. CALL TRANSACTION: sy-subrc is returned.
    3. Session method: an error log is created for error records. CALL TRANSACTION: errors need to be handled explicitly.
    4. Session method: the database update is always synchronous. CALL TRANSACTION: the database update can be synchronous or asynchronous.
    Error Handling in CALL TRANSACTION
    When the session method updates records in the database table, error records are stored in the log file. With CALL TRANSACTION there is no such log file, and error records are lost unless they are handled. Usually you need to produce a report of all the error records, i.e., records that were not inserted or updated in the database table. This can be done with the following method:
    Steps for the error handling in CALL TRANSACTION
    1. Internal table for the data (structure similar to your local file)
    2. BDCTAB like BDCDATA
    3. Internal table BDCMSG like BDCMSGCOLL
    4. Internal table similar to Ist internal table
    (Third and fourth steps are for error handling)
    5. UPLOAD or WS_UPLOAD function to upload the data from the local file to itab. (Considering file is local file)
    6. Loop at itab.
    Populate BDCTAB table.
    Call transaction <tr.code> using <Bdctab>
    Mode <A/N/E>
    Update <S/A>
    Messages <BDCMSG>.
    Perform check.
    Refresh BDCTAB.
    Endloop.
    7. Form check:
    IF sy-subrc <> 0 (CALL TRANSACTION returns a non-zero sy-subrc if the update is not successful):
    Call function FORMAT_MESSAGE (this function is called to format the message issued by the system so it can be displayed along with the record).
    Append the record to itab2.
    Display the record and message.
    A sketch of this error handling follows.
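    A minimal sketch of CALL TRANSACTION with explicit error handling, assuming IT_BDCDATA has already been filled for one record (the transaction code 'XD02' is illustrative):
    DATA: it_bdcmsg       TYPE TABLE OF bdcmsgcoll,
          wa_bdcmsg       TYPE bdcmsgcoll,
          lv_message(255) TYPE c.
    CALL TRANSACTION 'XD02' USING it_bdcdata
         MODE   'N'
         UPDATE 'S'
         MESSAGES INTO it_bdcmsg.
    IF sy-subrc <> 0.
      "Format every collected message and show it with the failed record
      LOOP AT it_bdcmsg INTO wa_bdcmsg.
        CALL FUNCTION 'FORMAT_MESSAGE'
          EXPORTING
            id   = wa_bdcmsg-msgid
            lang = sy-langu
            no   = wa_bdcmsg-msgnr
            v1   = wa_bdcmsg-msgv1
            v2   = wa_bdcmsg-msgv2
            v3   = wa_bdcmsg-msgv3
            v4   = wa_bdcmsg-msgv4
          IMPORTING
            msg  = lv_message.
        WRITE: / lv_message.
      ENDLOOP.
      "Here you would also append the failed source record to itab2 for the error report.
    ENDIF.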
    DIRECT INPUT
    About Direct Input
    In contrast to batch input, this technique does not create sessions, but stores the data directly. It does not simulate the online transaction. To enter the data into the corresponding database tables directly, the system calls a number of function modules that execute any necessary checks. In case of errors, the direct input technique provides a restart mechanism. However, to be able to activate the restart mechanism, direct input programs must be executed in the background only. Direct input checks the data thoroughly and then updates the database directly.
    You can start a Direct Input program in two ways;
    Start the program directly
    This is the quickest way to see if the program works with your flat file. This option is possible with all direct input programs. If the program ends abnormally, you will not have any logs telling you what has or has not been posted. To minimize the chance of this happening, always use the check file option for the first run with your flat file. This allows you to detect format errors before transfer.
    Starting the program via the DI administration transaction
    This transaction restarts the processing, if the data transfer program aborts. Since DI document are immediately posted into the SAP D/B, the restart option prevents the duplicate document posting that occurs during a program restart (i.e., without adjusting your flat file).
    Direct input is usually done for standard data like material master, FI accounting document, SD sales order and Classification for which SAP has provided standard programs.
    First time you work with the Direct Input administration program, you will need to do some preparation before you can transfer data:
    - Create variant
    - Define job
    - Start job
    - Restart job
    Common batch input errors
    - The batch input BDCDATA structure tries to assign values to fields which do not exist in the current transaction screen.
    - The screen in the BDCDATA structure does not match the right sequence, or an intermediate screen is missing.
    - On exceptional occasions, the logic flow of a batch input session does not exactly match that of manual online processing. This can be discovered by testing the sessions online.
    - The BDCDATA structure contains fields, which are longer than the actual definition.
    - Authorization problems.
    RECORDING A BATCH INPUT
    A batch input recording allows you to record an R/3 transaction and generate a program that contains all the screen and field information in the required BDCDATA format.
    You can either use transaction SHDB for recording, or go to
    System → Services → Batch input → Edit
    and from there click Recording.
    Enter a name for the recording.
    (Dates are optional.)
    Click Recording.
    Enter the transaction code.
    Press Enter.
    Click the Save button.
    You finally come to a screen where you have all the information for each screen, including BDC_OKCODE.
    • Click Get Transaction.
    • Return to batch input.
    • Click Overview.
    • Position the cursor on the entry you just recorded and click Generate program.
    • Enter a program name.
    • Press Enter.
    The program is generated for the particular transaction.
    BACKGROUND PROCESSING
    Need for Background processing
    When a large volume of data is involved, batch input is usually run in the background.
    The R/3 system includes functions that allow users to work non-interactively or offline; the background processing system handles these functions.
    Non-interactively means that instead of executing the ABAP/4 programs and waiting for an answer, users can submit those programs for execution at a more convenient, planned time.
    There are several reasons to submit programs for background execution:
    • The maximum time allowed for online execution should not exceed 300 seconds. The user gets a TIMEOUT error and an aborted transaction if the execution time exceeds 300 seconds. To avoid this type of error, you can submit jobs for background processing.
    • You can keep using the system while your program is executing.
    This does not mean that interactive or online work is not useful. Both types of processing have their own purposes. Online work is the most common: entering business data, displaying information, printing small reports, managing the system and so on. Background jobs are mainly used for the following tasks: processing large amounts of data, executing periodic jobs without human intervention, and running programs at a more convenient, planned time outside normal working hours, i.e., nights or weekends.
    The transaction for background processing is SM36.
    Or
    Tools → Administration → Jobs → Define jobs
    Or
    System → Services → Jobs
    Components of the background jobs
    A job in background processing is a series of steps that can be scheduled; each step is a program to be run in the background.
    • Job name. The name assigned to the job; it identifies the job. You can specify up to 32 characters for the name.
    • Job class. Indicates the background processing priority assigned to the job.
    The job class determines the priority of a job. The background system provides three job classes: A, B and C, which correspond to the job priority.
    • Job steps. Parameters to be passed for this screen are as follows:
    Program name.
    Variant, if it is a report program
    Start criteria for the job. The options available are as follows:
    Immediate - allows you to start the job immediately.
    Date/Time - allows you to start the job at a specific date and time.
    After job - allows you to start the job after a particular job.
    After event - allows you to start the job after a particular event.
    At operation mode - allows you to start the job when the system switches to a particular operation mode.
    Defining Background jobs
    It is a two-step process: first you define the job, then you release it.
    When users define a job and save it, they are actually scheduling the report, i.e., specifying the job components, the steps and the start time.
    When users schedule a program for background processing, they are instructing the system to execute an ABAP/4 report or an external program in the background. Scheduled jobs are not executed until they are released. When jobs are released, they are sent for execution to the background processing system at the specified start time. Both scheduling and releasing of jobs require authorizations.
    HANDLING OF POP UP SCREEN IN BDC
    Many times a pop-up screen appears in a transaction, and for this screen you do not pass any data record, only an indication telling the system to proceed further.
    To handle such a screen, the system provides a variable called BDC_CURSOR. You pass this variable in BDCDATA and process the screen.
    Such screens appear in many transactions. In this case you are just passing the information that Yes, you want to save the data, i.e., that Yes should be clicked. So you transfer this information in BDCDATA using the field name of the Yes option, which is usually SPOT_OPTION; instead of BDC_OKCODE, you pass BDC_CURSOR.
    BDC_CURSOR is also used to place the cursor on a particular field.
    Consider a simple transaction where you enter a customer number on the first screen, and on the next screen the data for that customer is displayed. The fields we are changing here are name and city. When you click Save, the changed record gets saved.
    The prerequisites for writing this BDC interface, as indicated earlier, are:
    1. Find the screen numbers.
    2. Find the screen field names, and the type and length of each field.
    3. Find the BDC_OKCODE for each screen.
    4. Create the flat file.
    Generally, batch input is used to transfer large amounts of data. For example, when you implement a new SAP project you will need to transfer data from the legacy system to the SAP system.
    CALL TRANSACTION is used especially for integration between two SAP systems or between different modules. Users sometimes wish that clicking a button or an item would make SAP insert or change data automatically; in such cases CALL TRANSACTION should be considered.
    To transfer data for multiple transactions, the batch input method is usually used.
    check these sites for step by step process:
    For BDC:
    http://myweb.dal.ca/hchinni/sap/bdc_home.htm
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/home/bdc&
    http://www.sap-img.com/abap/learning-bdc-programming.htm
    http://www.sapdevelopment.co.uk/bdc/bdchome.htm
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://help.sap.com/saphelp_47x200/helpdata/en/69/c250684ba111d189750000e8322d00/frameset.htm
    http://www.sapbrain.com/TUTORIALS/TECHNICAL/BDC_tutorial.html
    Check these link:
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://www.sap-img.com/abap/question-about-bdc-program.htm
    http://www.itcserver.com/blog/2006/06/30/batch-input-vs-call-transaction/
    http://www.planetsap.com/bdc_main_page.htm
    call Transaction or session method ?

  • How to use CrossReference and DVM in ODI &how to populate data into Xref

    Can anyone tell me how to use Domain Value Maps and Cross Referencing in ODI?
    DVMs, or Domain Value Maps, are created and used in the ESB console of the SOA suite.
    My actual requirement is as follows:
    The below steps describe loading data from ERP Application 1 to ERP Application 2.
    1. The Source Application ERP APP1, populates the interface table using their native technology.
    2. A job scheduler invokes the Source side ODI Package.
    3. ODI then extracts the data from Source Interface table and populates the Target Interface table.
    4. After populating the Target interface table the ODI populates the X-ref table with App 1 ID and generated common ID.
    5. The ODI either deletes or updates the rows that were processed from the Source interface table.
    6. On the Target Application ERP APP2, the native application extracts data from the target interface table and populates the target database, thereby generating the ERP Application 2 ID.
    7. A job scheduler on the Target application invokes the ODI package to populate the Application 2 ID onto the Xref table matching on the Common ID.
    I just want to know :
    1. How to populate data into the Xref table from the source datastore?
    2. And if data is successfully loaded from the target datastore into the actual base table of the target, how to populate the target ID into the cross-reference table?

    Can anyone suggest an answer? It would be of great help.

  • Microsoft Exchange Mailbox Replication service was unable to process jobs in a mailbox database 1006.

    HI all,
    I ran into a problem with one of the servers in my DAG. Due to a backup failure, the log partition filled up and the Exchange database went offline on this particular server. The DAG continued to function on the other server. This particular database, DB01, had only some test accounts in it, so I removed the database copy. Unfortunately the removal must not have completed correctly.
    After recovering the needed space in the partition, I tried to recopy the database and it failed. So I deleted the entire database from the active server and it was successfully removed there. Now the database is not showing on either server, however I'm receiving the error below, which indicates to me that the service still thinks the database exists. I am also getting event 4113 "Database redundancy health check failed" for the database copy, as it shows there is only one copy of the database (failed redundancy) when there should be zero copies.
    This database no longer exists in AD, as I have verified with ADSI, so there must be something on the Exchange server that is still telling the service the database exists.
    Can you point me to the location to clean this up?
    Log Name:      Application
    Source:        MSExchange Mailbox Replication
    Date:          02/25/2011 2:00:07 PM
    Event ID:      1006
    Task Category: Service
    Level:         Warning
    Keywords:      Classic
    User:          N/A
    Computer:      email1.domain.com
    Description:
    The Microsoft Exchange Mailbox Replication service was unable to process jobs in a mailbox database.
    Database: DB01
    Error: MapiExceptionUnknownUser: Unable to open message store. (hr=0x80004005, ec=1003)
    Diagnostic context:
        Lid: 55847   EMSMDBPOOL.EcPoolSessionDoRpc called [length=243]
        Lid: 43559   EMSMDBPOOL.EcPoolSessionDoRpc returned [ec=0x0][length=200][latency=0]
        Lid: 23226   --- ROP Parse Start ---
        Lid: 27962   ROP: ropLogon [254]
        Lid: 17082   ROP Error: 0x3EB    
        Lid: 26937 
        Lid: 21921   StoreEc: 0x3EB    
        Lid: 27962   ROP: ropExtendedError [250]
        Lid: 1494    ---- Remote Context Beg ----
        Lid: 26426   ROP: ropLogon [254]
        Lid: 22086 
        Lid: 27206 
        Lid: 17990 
        Lid: 16966   StoreEc: 0x8004010F
        Lid: 31433   StoreEc: 0x8004010F
        Lid: 8620    StoreEc: 0x3EB    
        Lid: 1750    ---- Remote Context End ----
        Lid: 26849 
        Lid: 21817   ROP Failure: 0x3EB    
        Lid: 26297 
        Lid: 16585   StoreEc: 0x3EB    
        Lid: 32441 
        Lid: 1706    StoreEc: 0x3EB    
        Lid: 24761 
        Lid: 20665   StoreEc: 0x3EB    
        Lid: 25785 
        Lid: 29881   StoreEc: 0x3EB    
    Event Xml:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="MSExchange Mailbox Replication" />
        <EventID Qualifiers="32772">1006</EventID>
        <Level>3</Level>
        <Task>1</Task>
        <Keywords>0x80000000000000</Keywords>
        <TimeCreated SystemTime="2011-02-25T22:00:07.000000000Z" />
        <EventRecordID>181576</EventRecordID>
        <Channel>Application</Channel>
        <Computer>email1.domain.com</Computer>
        <Security />
      </System>
      <EventData>
        <Data>DB01</Data>
        <Data>MapiExceptionUnknownUser: Unable to open message store. (hr=0x80004005, ec=1003)
    Diagnostic context:
        Lid: 55847   EMSMDBPOOL.EcPoolSessionDoRpc called [length=243]
        Lid: 43559   EMSMDBPOOL.EcPoolSessionDoRpc returned [ec=0x0][length=200][latency=0]
        Lid: 23226   --- ROP Parse Start ---
        Lid: 27962   ROP: ropLogon [254]
        Lid: 17082   ROP Error: 0x3EB    
        Lid: 26937 
        Lid: 21921   StoreEc: 0x3EB    
        Lid: 27962   ROP: ropExtendedError [250]
        Lid: 1494    ---- Remote Context Beg ----
        Lid: 26426   ROP: ropLogon [254]
        Lid: 22086 
        Lid: 27206 
        Lid: 17990 
        Lid: 16966   StoreEc: 0x8004010F
        Lid: 31433   StoreEc: 0x8004010F
        Lid: 8620    StoreEc: 0x3EB    
        Lid: 1750    ---- Remote Context End ----
        Lid: 26849 
        Lid: 21817   ROP Failure: 0x3EB    
        Lid: 26297 
        Lid: 16585   StoreEc: 0x3EB    
        Lid: 32441 
        Lid: 1706    StoreEc: 0x3EB    
        Lid: 24761 
        Lid: 20665   StoreEc: 0x3EB    
        Lid: 25785 
        Lid: 29881   StoreEc: 0x3EB     </Data>
      </EventData>
    </Event>

    Hi
    This error occurs because the MailboxReplicationService (MRS) currently caches Mailbox Database information past the lifetime of some databases. 
    Restarting the service will clear the cache.
    Maybe that is the simplest way to solve it.
    If it doesn't work, you can check it from the Exchange Management Shell:
    Get-MailboxDatabase: can you still see the database you deleted?
    Remove-MailboxDatabase -Identity "MyDatabase": if the database is still listed, you can try to remove it.

  • How to populate data in pdf

    Hi,
    My aim is to populate data in a standard pdf file and print the same.
    How can I do this? I am using the Struts framework.
    What are the APIs that can be used for this?
    Would it be better to convert an RTF to PDF?
    Please help.
    Thanks!
    Archna

    Thanks for the link.
    I tried etymon PJ but there are problems with word alignment and word wrap. The process that I followed is like this:
    1. Converted a word document to pdf 1.4 using acrobat 7.0
    2. I then use this pdf which contains tags like <Address> <City> etc and have this file as an input
    3. Use Etymon PJ to replace text using these tag delimiters
    The file is converted but there is no word wrap and alignment issues occur. What could be the cause?
    Archna

  • Job for alert data reorganisation is not release or running -ERROR BPM

    Hi
    I am trying to set up BPM in Solution Manager EHP1.
    I have created a business process and also added the required job. When I try to generate and activate the BPM I get an error:
    "Job for alert data reorganisation is not release or running"
    Kindly suggest what I need to do to resolve this issue, which job has to be scheduled, and any relevant documentation for the same.
    I have a project deadline to meet, so kindly help me ASAP.
    Thanks
    Michael

    Hi Michael
    Did it show a red alert?
    Do you still get the same alert if you generate and activate
    the BPMon session again?
    If yes, there seems to be some problem with the job "BPM_ALERT_REORG".
    Maybe it is scheduled but not released.
    (Do you see the scheduled job in SM37?)
    First, according to your description, you are on EHP1.
    In that case, as described in note 521820, do you already have the
    required notes, especially those related to ST-SER, like the following?
    1273127
    1275225
    1298310
    1319473
    1332197
    1355132
    1390111
    I quickly checked the description of each note but could not find a
    correction related to the alert reorganization job. Still, it is better to
    implement the above notes if you do not have them yet.
    And if you still have the problem (try generation and activation again)
    even with the above notes, I recommend you create an SAP customer message
    for component "SV-SMG-MON-BPM".
    Best Regards
    Keiji

  • Background Job Still Running - Data Can Be Displayed Only Msg Populated in Maintain Versions (OKEQ) Screen

    Hi,
    I tried to create FY 2013 under "Settings for Each Fiscal Year" in OKEQ - Maintain Versions, but unfortunately I created a null FY. When I tried to delete the null FY, the system asked to schedule a job; I cancelled the job and reopened the Maintain Versions (OKEQ) screen, at which point the system displayed the message "Background Job still running - Data can be displayed only".
    In the Settings for Each Fiscal Year screen there is no New Entries button, all fiscal years are in blocked status, and changes are not allowed.
    How can I resolve this?
    Waiting for your valuable suggestions.
    Regards,
    Nageswar.

    Dear Rajneesh Saxena
    Thanks for your prompt reply.
    Yes, you are correct: at the time of deletion it asks whether to run a background job, and for immediate deletion you must select "No".
    When the "Settings for Each Fiscal Year" screen shows the message "Background Job Still Running - Data Can Be Displayed Only", open the screen in display mode, enter the command "=KILL" in the command bar and press Enter; you will then be allowed to make changes and will get the New Entries button as well.
    The problem is solved.
    Regards,
    Nageswar.

  • Pre-processing job for Transactions

    Hi Archiving Forum,
    In the SAP help it is mentioned that the pre-processing job is optional. I am trying to run the write job, without the pre-processing job, for a few of the transactions whose status is Completed.
    But when I debugged the write program for the CRM_SERORD archiving object, the program internally checks the I1100 status for each service ticket/order.
    Here, all service tickets were marked with Completed status on 24/03/2009.
    The residence time for all transaction types is 1 day.
    I am running the write job for service tickets and service orders on 24/03/2009, but the job is not able to write the data into archive files.
    Should I wait for one day (until the residence time is completed) to run the write job?
    If I wait for one day, the write program still internally checks the status I1100 (archivable status). Is this status updated by default in the CRM_JEST table after the residence time is completed?
    I need concrete answers, please.
    Thanks in Advance,
    Kishore

    Hi Kishore,
    The pre-processing job should be scheduled for archiving objects before running the write program, because it performs the basic archivability checks and lays the groundwork for the write program.
    Without executing the pre-processing program, the write program will fail to archive the data, for example for CO_ORDER, MM_EKKO, MM_EBAN and so on in the R/3 system.
    Similarly, in the CRM system the pre-processing program performs the check and makes the following database changes for CRM_SERORD:
    1) The date of the archivability check is updated.
    2) For the archivable data, the status is changed to 'Archivable'. The data is then no longer changeable.
    3) For the archivable data, the system sends a deletion BDoc type with message type 'A' (for archiving) via the CRM Middleware, so that the other components involved in mySAP CRM solutions are informed about an archiving of the data in the CRM server. The system sends the deletion BDoc types at this time because the change to an unchangeable status is the status change which is relevant from a business point of view.
    In CRM 3.0, the check occurs in an archiving object-specific preprocessing program entered for the archiving object in transaction AOBJ.
    As of mySAP CRM 5.0 it is controlled using the check/delete programs in transaction SARA.
    In your case:
    Even after one day of residence time has passed, the service order or service ticket will not be archived, because the write program does not set the archivable status I1100 itself. The pre-processing program calls a function module and sets the status to archivable (I1100) once all the prerequisites are met, and the write program then archives that data.
    The status of the service ticket/service order is not changed automatically once the residence time has been met.
    Please also read OSS note no 500551.
    -Thanks,
    Ajay
    Edited by: Ajay Kumar on Mar 25, 2009 2:17 AM

  • The maximum report processing jobs limit configured has been reached -Error

    I have created a common page that has a CrystalReportViewer control (the name of this page is ShowReport.aspx). The report name and database name required for the report are passed in a querystring. The database connection info is pulled from the web.config file. All of the reports that I am dealing with have dynamic parameters, and the Crystal prompt page is automatically created by the Crystal viewer for these. Everything in my application works fine except that when I try to access any report for the 76th time I get the following error: "The maximum report processing jobs limit configured by your system administrator has been reached."
    I have already researched this error and am aware that the PrintJobLimit can be modified to increase this limit or can be set to -1 if we need to allow unlimited connections. However, doing this is not an option due to the degradation of server performance.
    The other option that I have tried is to make sure I close and dispose of the report document object in Page_Unload or Page_SaveStateComplete(); however, on doing so, even the session variable that I am using to store the originally created ReportDocument loses all of the values it requires to display the report. The session variable is still available, i.e., it is still of type ReportDocument, but it has no values for any of the properties like FileName, database etc.; basically, for all of those properties it shows an error "Invalid File Path" when viewed in debug mode.
    I have already tried several approaches but with no luck. Every single time I close the originally created ReportDocument object I lose all the required values in the session.
    I am using Crystal Reports XI R2, .NET 2.0 and ASP.NET.
    Following is the code: (Any help will be highly appreciated) Thanks:
    Option Strict On
    Imports CrystalDecisions.CrystalReports.Engine
    Imports CrystalDecisions.Shared
    Imports System.Data.SqlClient
    Imports System.IO
    Partial Class _ShowReport
    Inherits System.Web.UI.Page
    Private FechReport As ReportDocument
    Dim strSelectedDatabase As String
    Dim strReportsFolderPath As String =
    System.Configuration.ConfigurationManager.AppSettings("ReportsFolderPath").ToString()
    Dim strReportFileName As String
    Dim strReportFullPath As String
    Dim iInsertedLogId As Integer 'This variable is used to store the inserted log id for the executed report.
    Dim strConnString As String = System.Configuration.ConfigurationManager.AppSettings("ConnString").ToString()
    Dim strServerName As String = System.Configuration.ConfigurationManager.AppSettings("CR_ServerName").ToString()
    Dim strUserName As String = System.Configuration.ConfigurationManager.AppSettings("CR_UserName").ToString()
    Dim strPassword As String = System.Configuration.ConfigurationManager.AppSettings("CR_Password").ToString()
    Protected Sub Page_OnSaveStateComplete(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.SaveStateComplete
    If IsPostBack Then
    If iInsertedLogId > 0 Then
    UpdateReportLog_ReportServedTime(iInsertedLogId)
    If Not FechReport Is Nothing Then
    FechReport.Close()
    End If
    End If
    End If
    End Sub
    Sub Page_Unload(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Unload
    If Not FechReport Is Nothing Then
    'FechReport.Close()
    'FechReport.Dispose()
    'GC.Collect()
    End If
    End Sub
    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
    Dim sRptFileName As String
    If Request.QueryString("database") <> "" Then
    strSelectedDatabase = Request.QueryString("database")
    Else
    Response.Write("A Valid Database has not been supplied to this page")
    Response.End()
    End If
    If Request.QueryString("ReportFileName") <> "" Then
    strReportFileName = Request.QueryString("ReportFileName")
    strReportFullPath = strReportsFolderPath & strReportFileName
    Else
    Response.Write("A Valid Report has not been supplied to this page")
    Response.End()
    End If
    sRptFileName = strReportFullPath
    If Not IsPostBack Then
    FechReport = New ReportDocument
    If Not FechReport Is Nothing Then
    ShowReport(sRptFileName)
    End If
    Else
    If (Session("oReportDocument") Is Nothing) Then
    FechReport = New ReportDocument
    ShowReport(sRptFileName)
    Else
    'FechReport = New ReportDocument
    'FechReport = CType(Session("oReportDocument"), ReportDocument)
    myCrystalReportViewer.ReportSource = Session("oReportDocument")
    'myCrystalReportViewer.ReportSource = FechReport
    End If
    End If
    End Sub
    Public Function ShowReport(ByVal strReportFileName As String) As Boolean
    Dim blNoErrors As Boolean = True
    Dim crDatabase As Database
    Dim crTables As Tables
    Dim crTable As Table
    Dim crTableLogOnInfo As TableLogOnInfo
    Dim crConnectionInfo As ConnectionInfo
    FechReport.FileName = strReportFileName
    myCrystalReportViewer.ReportSource = FechReport
    crConnectionInfo = New ConnectionInfo()
    With crConnectionInfo
    .ServerName = strServerName
    .DatabaseName = strSelectedDatabase
    .UserID = strUserName
    .Password = strPassword
    End With
    Try
    crDatabase = FechReport.Database
    crTables = crDatabase.Tables
    For Each crTable In crTables
    crTableLogOnInfo = crTable.LogOnInfo
    crTableLogOnInfo.ConnectionInfo = crConnectionInfo
    crTable.ApplyLogOnInfo(crTableLogOnInfo)
    Next
    Catch ex As Exception
    Response.Write(ex.Message & ControlChars.NewLine & ex.InnerException.ToString & ControlChars.NewLine)
    Exit Function
    End Try
    Session("oReportDocument") = FechReport
    'FechReport.Close()
    'FechReport.Dispose()
    'GC.Collect()
    Return blNoErrors
    End Function
    End Class

    I have looked into caching the report document as well. However, as you mentioned in your post, it is only useful when the DB and the report parameters remain the same, which is not the case in our application. We have multiple identical databases and hundreds of reports. Our users can use a combination of any database and any report, each report having numerous parameters.
    Since one user can only access one report at a time, I do have cleanup code that removes the session variable used to store the ReportDocument object in the page that initially calls the ShowReport.aspx page.
    I understand now that the CR .NET SDK is only good for light reporting. Unfortunately, when we started development based on all of the articles that I gathered, I didn't anticipate running into issues like this. But I guess that's the nature of the business :-). And hence there are people like you who go out of their way to answer these difficult questions.
    Regards,

  • The maximum report processing jobs limit configured by your system administ

    I have created a common page that has a CrystalReportViewer control (the name of this page is ShowReport.aspx). The report name and database name required for the report are passed in a querystring. The database connection info is pulled from the web.config file. All of the reports that I am dealing with have dynamic parameters, and the Crystal prompt page is automatically created by the Crystal viewer for these. Everything in my application works fine except that when I try to access any report for the 76th time I get the following error: "The maximum report processing jobs limit configured by your system administrator has been reached."
    I have already researched this error and am aware that the PrintJobLimit can be modified to increase this limit or can be set to -1 if we need to allow unlimited connections. However, doing this is not an option due to the degradation of server performance.
    The other option that I have tried is to make sure I close and dispose of the report document object in Page_Unload or Page_SaveStateComplete(); however, on doing so, even the session variable that I am using to store the originally created ReportDocument loses all of the values it requires to display the report. The session variable is still available, i.e., it is still of type ReportDocument, but it has no values for any of the properties like FileName, database etc.; basically, for all of those properties it shows an error "Invalid File Path" when viewed in debug mode.
    I have already tried several approaches but with no luck. Every single time I close the originally created ReportDocument object I lose all the required values in the session.
    I am using Crystal Reports XI R2, .NET 2.0 and ASP.NET.
    Following is the code: (Any help will be highly appreciated) Thanks:
    Option Strict On
    Imports CrystalDecisions.CrystalReports.Engine
    Imports CrystalDecisions.Shared
    Imports System.Data.SqlClient
    Imports System.IO
    Partial Class _ShowReport
        Inherits System.Web.UI.Page
        Private FechReport As ReportDocument
        Dim strSelectedDatabase As String
        Dim strReportsFolderPath As String =
    System.Configuration.ConfigurationManager.AppSettings("ReportsFolderPath").ToString()
        Dim strReportFileName As String
        Dim strReportFullPath As String
        Dim iInsertedLogId As Integer 'This variable is used to store the inserted log id for the executed report.
        Dim strConnString As String = System.Configuration.ConfigurationManager.AppSettings("ConnString").ToString()
        Dim strServerName As String = System.Configuration.ConfigurationManager.AppSettings("CR_ServerName").ToString()
        Dim strUserName As String = System.Configuration.ConfigurationManager.AppSettings("CR_UserName").ToString()
        Dim strPassword As String = System.Configuration.ConfigurationManager.AppSettings("CR_Password").ToString()
        Protected Sub Page_OnSaveStateComplete(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.SaveStateComplete
            If IsPostBack Then
                If iInsertedLogId > 0 Then
                    UpdateReportLog_ReportServedTime(iInsertedLogId)
                    If Not FechReport Is Nothing Then
                        FechReport.Close()
                    End If
                End If
            End If
        End Sub
        Sub Page_Unload(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Unload
            If Not FechReport Is Nothing Then
                'FechReport.Close()
                'FechReport.Dispose()
                'GC.Collect()
            End If
        End Sub
        Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
            Dim sRptFileName As String
            If Request.QueryString("database") <> "" Then
                strSelectedDatabase = Request.QueryString("database")
            Else
                Response.Write("A Valid Database has not been supplied to this page")
                Response.End()
            End If
            If Request.QueryString("ReportFileName") <> "" Then
                strReportFileName = Request.QueryString("ReportFileName")
                strReportFullPath = strReportsFolderPath & strReportFileName
            Else
                Response.Write("A Valid Report has not been supplied to this page")
                Response.End()
            End If
            sRptFileName = strReportFullPath
            If Not IsPostBack Then
                FechReport = New ReportDocument
                If Not FechReport Is Nothing Then
                    ShowReport(sRptFileName)
                End If
            Else
                If (Session("oReportDocument") Is Nothing) Then
                    FechReport = New ReportDocument
                    ShowReport(sRptFileName)
                Else
                    'FechReport = New ReportDocument
                    'FechReport = CType(Session("oReportDocument"), ReportDocument)
                    myCrystalReportViewer.ReportSource = Session("oReportDocument")
                    'myCrystalReportViewer.ReportSource = FechReport
                End If
            End If
        End Sub
        Public Function ShowReport(ByVal strReportFileName As String) As Boolean
            Dim blNoErrors As Boolean = True
            Dim crDatabase As Database
            Dim crTables As Tables
            Dim crTable As Table
            Dim crTableLogOnInfo As TableLogOnInfo
            Dim crConnectionInfo As ConnectionInfo
            FechReport.FileName = strReportFileName
            myCrystalReportViewer.ReportSource = FechReport
            crConnectionInfo = New ConnectionInfo()
            With crConnectionInfo
                .ServerName = strServerName
                .DatabaseName = strSelectedDatabase
                .UserID = strUserName
                .Password = strPassword
            End With
            Try
                crDatabase = FechReport.Database
                crTables = crDatabase.Tables
                For Each crTable In crTables
                    crTableLogOnInfo = crTable.LogOnInfo
                    crTableLogOnInfo.ConnectionInfo = crConnectionInfo
                    crTable.ApplyLogOnInfo(crTableLogOnInfo)
                Next
            Catch ex As Exception
                Response.Write(ex.Message & ControlChars.NewLine & ex.InnerException.ToString & ControlChars.NewLine)
                Exit Function
            End Try
            Session("oReportDocument") = FechReport
            'FechReport.Close()
            'FechReport.Dispose()
            'GC.Collect()
            Return blNoErrors
        End Function
    End Class

    Please post your question in the Business Objects SDK Application Development forums.

  • Need FM or Tables to see Process Chain scheduled(future) date and time

    Hello Gurus!
    Can anybody tell me where/how I can get a process chain's scheduled (future) date and time?
    I mean, I want to see at what date and time the PC is scheduled to run tomorrow/next month etc., like TBTCO-SDLSTRTDT / TBTCO-SDLSTRTTM, along with the process chain name or the PC's start variant.
    I will give points for all the answers.
    Bye... Happy New Year

    Hi Anil,
    You can check in SM37 by giving the job name; then you can see when that particular chain is scheduled.
    Go to SM37 and enter:
    Job name = *
    User name = *
    Select job status Ready and Scheduled, then press F8 (Execute).
    Then you can see which job is scheduled at what time and date.
    Regards,
    Lakshman.G
