Issue with Incremental migration - Change log data gone

Project Info:
We are migrating the client's MOSS 2007 environment to SharePoint 2013 using the AvePoint DocAve migrator. It is a combined migration and new-development project, in which the client requires new components and UI to be developed.
Problem statement:
We last ran an incremental migration in August 2014. When we attempted another incremental migration in December, we found that the change logs, on which DocAve bases its incremental migration, had been wiped out because the client had set the change log retention duration to 60 days.
DocAve support says the only way to get correct data is to perform a full migration. Since we have made a lot of page fixes post-migration, we want to avoid a full migration.
Just for information: the change logs refer to the "EventCache" table created for each content database, and the retention duration can be set via
Central Administration > Application Management > select Web application > General Settings > Resource Throttling > Change Log
Any thoughts on how we should proceed to tackle this situation?
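For illustration, the retention arithmetic behind the problem can be checked with a short sketch (the exact dates below are assumptions, since the post only gives the months):

```python
from datetime import date, timedelta

def change_log_covers(last_sync: date, today: date, retention_days: int) -> bool:
    """True if the change log still reaches back to the last incremental sync."""
    oldest_retained = today - timedelta(days=retention_days)
    return last_sync >= oldest_retained

# Assumed dates: last incremental in mid-August, retry in December, 60-day retention
print(change_log_covers(date(2014, 8, 15), date(2014, 12, 10), 60))  # False: window lost
print(change_log_covers(date(2014, 11, 1), date(2014, 12, 10), 60))  # True: still covered
```

With a roughly four-month gap and a 60-day window, every change made shortly after the August run has aged out of the log, which is why DocAve can no longer compute a delta.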
Regards,
Amit

Hi,
As the AvePoint DocAve migrator is a third-party product, I suggest you contact AvePoint support for help.
If you want to migrate MOSS 2007 to SharePoint 2013, you can follow the article below:
http://technet.microsoft.com/en-us/library/ee947141(v=office.15)
Best Regards
Jerry Guo
TechNet Community Support

Similar Messages

  • Delete change log data

    I was going through posts about deleting change-log data, but I am a bit confused by the answers. My data moves from an ODS to a cube, and for performance reasons I want to delete change-log data from PSA and keep only the last 3 days' requests.
    My question: if I delete the change log, does it affect the delta data from the ODS to the cube? The change log maintains images, so if I keep only 3 requests, there will be images for only those 3 requests' data. If so, will it affect the delta data going into the cube?
    Let me know your suggestions.
    Rgs,

    Hi,
     Yes, you can delete the data from the change log, as long as you will not need to reconstruct the data from the change log table again.
     As a safety principle, if the data is not too large, keep 7 days of it. That leaves a buffer: say something goes wrong over a weekend and 2 days pass, the user notices and raises a ticket on the 3rd day, and the 4th day goes to analysis and reconstruction.
     So, unless the data is too large, keep 7 days in the change log table and delete the rest. As the other members said, deleting from the change log table does not affect the delta, because the delta compares against the active table and creates before-image and after-image records, which it writes to the change log table.
    Raj
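The point above, that the delta compares against the active table rather than old change-log entries, can be sketched as follows (illustrative Python, not SAP code; the record-mode letters "N"/"X"/"" follow SAP's usual convention):

```python
def build_images(active_row, new_row):
    """Sketch of how activation derives change-log images by comparing the
    incoming record with the ACTIVE table, which is why deleting old
    change-log entries does not break future deltas."""
    if active_row is None:
        return [("N", new_row)]                  # brand-new record
    if active_row == new_row:
        return []                                # unchanged: no images written
    return [("X", active_row), ("", new_row)]    # before image + after image

print(build_images({"qty": 5}, {"qty": 7}))  # [('X', {'qty': 5}), ('', {'qty': 7})]
```

The images needed for the next delta are always regenerated from the active table at activation time, so pruning old change-log requests only limits how far back you can reconstruct, not future deltas.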

  • How to delete Change log data from a DSO?

    Hello Experts,
    I am trying to delete the change log data for a DSO which has some 80 crore (800 million) records in it.
    I am following the standard procedure of using the process chain variant and giving the number of days, but somehow the data is not getting deleted.
    However, the process chain completes successfully with a green (G) state.
    Please let me know if there are any other ways to delete the data.
    Thanks in Advance.
    Thanks & Regards,
    Anil.

    Hi,
    Then there might be something wrong with your change log deletion variant.
    Can you recreate the change log deletion variant and set it up again?
    Try to check the settings below with the new variant:
    Red mark - don't select it.
    Provide the DSO name and InfoArea, the 'older than' days, and select the blue mark.
    Blue mark - it deletes only successfully loaded requests that are older than N days.
    Have you tested this change log deletion process type, per your data flow, before moving it to production?
    Thanks
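The blue-mark semantics described above (delete only successfully loaded requests older than N days) can be sketched like this (illustrative Python; the field names and statuses are assumptions):

```python
from datetime import date, timedelta

def requests_to_delete(requests, today, older_than_days):
    """Only successfully loaded (green) requests older than N days are
    candidates for deletion; failed or recent requests are kept."""
    cutoff = today - timedelta(days=older_than_days)
    return [r["id"] for r in requests
            if r["status"] == "green" and r["loaded_on"] < cutoff]

reqs = [
    {"id": 1, "status": "green", "loaded_on": date(2014, 1, 1)},
    {"id": 2, "status": "red",   "loaded_on": date(2014, 1, 1)},  # failed: never deleted
    {"id": 3, "status": "green", "loaded_on": date(2014, 3, 1)},  # too recent: kept
]
print(requests_to_delete(reqs, date(2014, 3, 5), 30))  # [1]
```

If the variant's date or status conditions match nothing, the process step still completes green with nothing deleted, which matches the symptom in the question.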

  • Issue with Incremental Recon?

    We're facing an issue with incremental recons. If a full recon is run on a resource and an entry exists on the resource but NOT in IdM, the entry is put in the account index with a situation of UNMATCHED.....so far so good....
    But if an IdM user is created for that entry and an incremental recon is then run, the incremental does not link the resource account to the IdM user. It seems that the incremental doesn't handle the situation change from UNMATCHED to CONFIRMED. Not sure if this is a bug (I could've sworn this used to work in a previous version of IdM) or if it's intended functionality and only the full recon is designed to handle this particular situation change. Thanks in advance for any insight!

    No link. Because the account index already has this resource account flagged as UNMATCHED the incremental reconciliation ignores it.
    You can manually assign the resource to the user account or perform a full reconciliation to resolve this UNMATCHED to CONFIRMED. This doesn't help in the case when there are a large number of UNMATCHED accounts or a full reconciliation takes a long time.
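The behavior described above can be sketched in Python (the situation names come from the post; the account-index structure is an assumption):

```python
def incremental_recon(account_index, idm_users, resource_accounts):
    """Incremental recon only evaluates accounts it has not indexed yet,
    so an UNMATCHED entry whose IdM user was created later is never
    re-evaluated."""
    for acct in resource_accounts:
        if acct in account_index:   # already indexed (e.g. UNMATCHED): skipped
            continue
        account_index[acct] = "CONFIRMED" if acct in idm_users else "UNMATCHED"
    return account_index

def full_recon(account_index, idm_users, resource_accounts):
    """Full recon re-evaluates every account, resolving UNMATCHED -> CONFIRMED."""
    for acct in resource_accounts:
        account_index[acct] = "CONFIRMED" if acct in idm_users else "UNMATCHED"
    return account_index

index = full_recon({}, set(), {"jdoe"})
print(index)                                    # {'jdoe': 'UNMATCHED'}
index = incremental_recon(index, {"jdoe"}, {"jdoe"})
print(index)                                    # still UNMATCHED: incremental skipped it
index = full_recon(index, {"jdoe"}, {"jdoe"})
print(index)                                    # {'jdoe': 'CONFIRMED'}
```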

  • Change Log Data from DSO to CUBE

    Hi Experts,
      Is it possible to load/transfer change log data as well from a DSO to a cube?
    Thanks

    Hi,
    Yes, it is possible.
    Create a DTP from the DSO to the cube.
    In the extraction tab of the DTP, choose the option
    "Datasource Change Log".
    Hope it helps!!
    Reward if useful
    Manish

  • Change Log data into SAP BI

    Any best practices for getting change-log data, similar to that captured in the CDHDR/CDPOS tables, into SAP BI? What options exist for tracking changes to sales orders or invoices in SAP BI without touching the CDHDR/CDPOS tables?
    Thanks,
    Vinny Ahuja

    hi,
    You can check whether that DataSource carries out delta loading by using
    transaction RSA7. Regarding sales orders or invoices: if the particular DataSource is
    enabled for delta loading into BW, it will show a green status there. That is how you
    can see that the DataSource carries out delta loading into the BW system.
    If helpful, provide points.
    regards
    harikrishna N

  • Extracting HR Change Log Data to BW

    Hi Gurus,
    does anybody know how to extract data from the HR change log (among others, cluster table PCL4)? I found two function modules HR_INFOTYPE_LOG_GET_LIST and HR_INFOTYPE_LOG_GET_DETAIL
    that should be giving the data in the correct format.
    Has anybody ever used these Function modules for (delta) extraction to BW?
    Or are there other ways to extract this type of data?
    I already got some pointers from CSM Reddy in a previous posting:
    http://help.sap.com/bp_biv170/html/Bw.htm
    http://help.sap.com/bp_biv235/BI_EN/html/bw.htm
    But they don't seem to work any more.
    Best regards,
    Arno

    Those links you gave didn't work for me . . .
    I have toyed with using those function modules in the past. I was trying to solve an issue I had: with heavy customization on 0EMPLOYEE, it took forever to extract the data for that InfoObject, as we had more than 10 extractors . . .
    I no longer have access to the BI system, but I used one of the function modules you mentioned and loaded the result into a DSO. From there, I used the DSO as a filter for my 0EMPLOYEE extractors. Using some ABAP, I passed the extractors a list of employees whose data had changed the previous day and extracted all of their information (you should extract everything in that case, to properly get the begda and endda for the master data).
    It had the potential to really speed things up, but I never got to finish, so I've never seen it in a production environment. Hopefully this helps . . .
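The filtering idea above can be sketched like this (illustrative Python; the record layout and employee IDs are assumptions):

```python
from datetime import date, timedelta

def employees_changed_yesterday(change_log, today):
    """From change-log records (as loaded into the DSO), list the employees
    whose data changed the previous day; the extractors then pull ALL records
    for just these employees, so begda/endda intervals stay consistent."""
    yesterday = today - timedelta(days=1)
    return sorted({rec["employee"] for rec in change_log
                   if rec["changed_on"] == yesterday})

log = [
    {"employee": "00001234", "changed_on": date(2014, 6, 2)},
    {"employee": "00005678", "changed_on": date(2014, 6, 1)},
    {"employee": "00001234", "changed_on": date(2014, 6, 2)},
]
print(employees_changed_yesterday(log, date(2014, 6, 3)))  # ['00001234']
```

The win is that the heavy 0EMPLOYEE extractors run only for the handful of changed employees instead of the whole workforce.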

  • Issue with creating array of custom data type - WebLogic Integration

    Hi,
    We are doing a WebLogic integration with Siebel, for which we have generated (on the Siebel side) a Java wrapper class that has all the methods and inputs/outputs defined; the in/out params are serialized, with get/set methods. Following are the details of the input/output args.
    Account_EMRIO.java
    public class Account_EMRIO implements Serializable, Cloneable, SiebelHierarchy {
    protected String fIntObjectFormat = null;
    protected String fMessageType = "Integration Object";
    protected String fMessageId = null;
    protected String fIntObjectName = "Account_EMR";
    protected String fOutputIntObjectName = "Account_EMR";
    protected ArrayList<AccountIC> fintObjInst = null;
    The above class also includes constructors/overloaded constructors and getters/setters:
    public AccountIC getfintObjInst() {
        if (fintObjInst != null) {
            return (AccountIC) fintObjInst.clone();
        } else {
            return null;
        }
    }

    public void setfintObjInst(AccountIC val) {
        if (val != null) {
            if (fintObjInst == null) {
                fintObjInst = new ArrayList<AccountIC>();
            }
            fintObjInst.add(val);
        }
    }
    The nested user-defined data type AccountIC is defined in another Java class, as below:
    AccountIC.java
    public class AccountIC implements Serializable, Cloneable, SiebelHierarchy {
        protected String fname = null;
        protected String fParent_Account_Id = null;
        protected String fPrimary_Organization = null;
        // ... further fields and accessors omitted in the post
    }
    With the above, I was able to get AccountIC in the WSDL correctly, and using this I was able to set the input param in the client.
    WSDL:
    <xs:complexType name="accountEMRIO">
      <xs:sequence>
        <xs:element name="fIntObjectFormat" type="xs:string" minOccurs="0"/>
        <xs:element name="fIntObjectName" type="xs:string" minOccurs="0"/>
        <xs:element name="fMessageId" type="xs:string" minOccurs="0"/>
        <xs:element name="fMessageType" type="xs:string" minOccurs="0"/>
        <xs:element name="fOutputIntObjectName" type="xs:string" minOccurs="0"/>
        <xs:element name="fintObjInst" type="tns:accountIC" minOccurs="0"/>
      </xs:sequence>
    </xs:complexType>
    <xs:complexType name="accountIC">
      <xs:sequence>
        <xs:element name="fName" type="xs:string" minOccurs="0"/>
        <xs:element name="fParent_Account_Id" type="xs:string" minOccurs="0"/>
        <xs:element name="fPrimary_Organization" type="xs:string" minOccurs="0"/>
      </xs:sequence>
    </xs:complexType>
    Now I wanted to make a slight change to the getter method getfintObjInst of class Account_EMRIO, so that an array of AccountIC is returned as output.
    public ArrayList<AccountIC> getfintObjInst() {
        if (fintObjInst != null) {
            return (ArrayList<AccountIC>) fintObjInst.clone();
        } else {
            return null;
        }
    }
    With the above change, once the WSDL is generated, I no longer get the fintObjInst field of type AccountIC, due to which I am unable to get the array list of AccountIC.
    WSDL:
    <xs:complexType name="accountEMRIO">
    <xs:sequence>
    <xs:element name="fIntObjectFormat" type="xs:string" minOccurs="0"/>
    <xs:element name="fIntObjectName" type="xs:string" minOccurs="0"/>
    <xs:element name="fMessageId" type="xs:string" minOccurs="0"/>
    <xs:element name="fMessageType" type="xs:string" minOccurs="0"/>
    <xs:element name="fOutputIntObjectName" type="xs:string" minOccurs="0"/>
    </xs:sequence>
    </xs:complexType>
    The issue I am facing is: we have a custom data type (AccountIC) which has several fields in it, and in the output I need an array of AccountIC. When this type was defined in the Java bean as
    protected ArrayList<AccountIC> fintObjInst = null; I was unable to get an array of AccountIC. This gets more complicated because inside an AccountIC there is an array of contacts as well, and I always get just the first record instead of an array.
    To summarize: how do I get an xsd:list or a maxOccurs property in the WSDL for a field of a user-defined/custom data type?
    Can someone help me with this.
    Thanks,
    Sudha.

    can someone help with this??

  • Search Issue with OCR Files and Meta data

    We have set up SharePoint 2013 on a Windows 2012 server, with cumulative updates through August; the version is 15.0.4535.1000.
    I have created two site content types of the document type. Both content types have 4 site columns in common, and each has two further site columns of its own. I configured the Search service application, which is working very well and does not report any
    kind of error. I added these two content types to one document library, which has a lot of sub-folders.
    All files are OCR PDFs, which is done via metadata.
    Now the issue: when I search for anything from the search center, sometimes I get results with data from the site columns, and sometimes I get no data, just a link to the file and the name of the file. The weird thing is that the columns are common, and both full
    and incremental crawls run properly.
    Below is the screen for my search, where you can see that there is no summary. How can I bring the summary back?
    So what can be the issue here?

    Hi,
    For metadata, which metadata are you not seeing? Are they custom properties within the PDF, and have you checked whether you have crawled properties matching them?
    I know there's an issue with the last modified date on PDFs (http://sharepointfieldnotes.blogspot.no/2013/05/understanding-and-getting-sharepoint.html).
    Thanks,
    Mikael Svenson
    Search Enthusiast - SharePoint MVP/MCT/MCPD - If you find an answer useful, please up-vote it.
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

  • Issue with xfa.event.change in XFA 3.3

    HI all
    I've found an issue with the way XFA 3.3 processes the script below on a change event when a user pastes more than one character into a text field.
    var sChange = xfa.event.change;
    if (sChange.length > 1) {
        app.alert("Bad User\nCopy and paste has been disabled for this field");
        // discard the change
        xfa.event.change = "";
    }
    In XFA 3.0 the alert is shown and the change is removed; in XFA 3.3 the alert is shown but the change persists.
    Any Ideas?

    Hi Mark,
    Unfortunately that's not possible, as I'm using Flash fields with data being passed between the PDF and the Flex app, which only works in XFA 3.3. Let me know when it's fixed.
    Kind Regards
    Kevin Mortimer
    Solutions Architect
    Avoka

  • Issue with Apex and IE , losing data for items on submit

    Hi All,
    I am facing a problem in APEX when I run my page in IE 8. When I submit the page, I lose the data for all my items, while the same does not happen in Mozilla Firefox or Google Chrome.
    I have one select list with a dynamic action associated with it: when I select an option from this list, other objects become enabled or disabled by that action and the page gets submitted.
    But when I make a selection in this list in IE, the page gets submitted and the value I selected gets lost.
    Please let me know as soon as possible whether this is an issue with IE or with APEX.
    This is a bit urgent.
    Thanks

    Modify your code as below, run the page in debug mode, and note the value shown for P610_X:
    DECLARE
      l_vc_arr2 APEX_APPLICATION_GLOBAL.VC_ARR2;
    BEGIN
      wwv_flow.debug('value for p610_x is ' || :P610_X);
      l_vc_arr2 := APEX_UTIL.STRING_TO_TABLE(:P610_X, '~');
      FOR z IN 1 .. l_vc_arr2.count LOOP
        htp.p(l_vc_arr2(z) || '<br>');  -- closing literal was lost in the post; '<br>' is a guess
      END LOOP;
    END;
    varad
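For reference, APEX_UTIL.STRING_TO_TABLE splits a '~'-delimited item value into a collection; a rough Python equivalent of what the debug loop prints (the item name P610_X and the '~' separator come from the post):

```python
def string_to_table(value, sep="~"):
    """Rough Python analogue of APEX_UTIL.STRING_TO_TABLE: split the
    submitted item value into its component strings."""
    return [] if value is None else value.split(sep)

print(string_to_table("10~20~30"))  # ['10', '20', '30']
print(string_to_table(None))        # []
```

An empty result here would confirm that the item value really is lost on submit in IE, rather than mis-parsed afterwards.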

  • Issue with Report refresh of the data

    Hi All,
    we recently upgraded from version 11.1.1.5 to 11.1.1.7 of OBIEE, and since then we have been facing a few issues with data refresh.
    Whenever data gets loaded in the back end, the report doesn't seem to get updated. Neither Reload Metadata nor the Refresh option works.
    We have to explicitly go into the RPD, update the row counts of the objects, and Close All Cursors to reflect the data.
    Is there any way we can avoid doing this....
    Thanks & Regards
    Shashank

    What about the cache settings?

  • Issue with Saving the Query output data in Excel format

    Hi,
    Recently we upgraded from 4.6C to ECC 6.0.
    In the ECC 6.0 environment, when a user tries to export the query output, we get only an XML option for saving the data.
    But the user wants to save the data in Excel format, which he was able to do in 4.6C.
    Please provide some inputs on this issue.
    Thanks,
    Sanketh.

    I cannot for the life of me imagine why a link to a post in the 'Business One Forum', where one uses ODBC to transfer query data to Microsoft Excel, is of relevance to the OP's question, even if the same is not a security issue.
    Never mind. [note 402155|https://service.sap.com/sap/support/notes/402155] deals with various symptoms in the ALV-Excel combination as of release 4.6C. There are various others, mostly in components BC-SRV-ALV and BC-ABA-LI. Also: I remember that when we upgraded from 4.5B to 4.7C there was an issue with Excel templates -> the solution was in the release notes somewhere. So, in addition to SMP, you might want to check the release notes and/or upgrade guide for solutions.
    And yes, moderators ... this is not a security issue; this should go to ECC-Applications/Upgrade.

  • Issues with Purchase order change documents

    Hi Gurus,
    I have a peculiar issue with a purchase order. The PO line item has the deletion indicator set, but I can't find who set it. The PO change documents don't show anything related to the deletion indicator.
    Please help to fix the Issue.
    Regards,
    Senthil

    Hi
    Check it in transaction SE16,
    in tables CDHDR and CDPOS.
    Regards,
    Raman

  • Issue with backing up Archive logs

    Hi All,
    Please help me with the issues/confusions I am facing :
    1. Currently, the  "First active log file  = S0008351.LOG"  from "db2 get db cfg for SMR"
        In the log_dir, there should be logs >=S0008351.LOG
        But in my case, in addition to these logs, there are some old logs like S0008309.LOG, S0008318.LOG, S0008331.LOG  etc...
        How can I clear all these 'not-really-wanted' logs from the log_dir ?
    2. There is some issue with the archive backup, as a result of which the archive backups are not running fine.
        Since this is a very low activity system, there are not much logs generated.
        But the issue is :
        There are so many archive logs in the "log_archive" directory, I want to cleanup the directory now.
        The latest online backup is @ 26.07.2011 04:01:04
        First Log File      : S0008344.LOG
        Last Log File       : S0008346.LOG
        Inside log_archive there are archive logs from  S0008121.LOG   to   S0008304.LOG
        I won't really require these logs, correct?
    Please clear my confusions...

    Hi,
    >
    > 1. Currently, the  "First active log file  = S0008351.LOG"  from "db2 get db cfg for SMR"
    >     In the log_dir, there should be logs >=S0008351.LOG
    >     But in my case, in addition to these logs, there are some old logs like S0008309.LOG, S0008318.LOG, S0008331.LOG  etc...
    >     How can I clear all these 'not-really-wanted' logs from the log_dir ?
    >
    You should not delete logs from log_dir, because those are online redo logs; if you delete them, there will be problems starting the database.
    > 2. There is some issue with archive backup as a result the archive backups are not running fine.
    >     Since this is a very low activity system, there are not much logs generated.
    >     But the issue is :
    >     There are so many archive logs in the "log_archive" directory, I want to cleanup the directory now.
    >     The latest online backup is @ 26.07.2011 04:01:04
    >     First Log File      : S0008344.LOG
    >     Last Log File       : S0008346.LOG
    >   
    If your archive logs have been backed up from the log_archive directory, then you can delete the old logs.
    Thanks
    Sunny
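The rule above can be sketched as follows (illustrative Python; it assumes the latest backup's "First Log File" S0008344.LOG from the post, and that the older archive logs are already backed up elsewhere):

```python
import re

def deletable_archive_logs(archived, first_needed="S0008344.LOG"):
    """Archive logs older than the first log file required by the latest
    online backup can be deleted once they are safely backed up.
    Names follow the S<7 digits>.LOG pattern from the post."""
    def num(name):
        return int(re.fullmatch(r"S(\d{7})\.LOG", name).group(1))
    cutoff = num(first_needed)
    return sorted(n for n in archived if num(n) < cutoff)

print(deletable_archive_logs(["S0008121.LOG", "S0008304.LOG", "S0008352.LOG"]))
# ['S0008121.LOG', 'S0008304.LOG']
```

Since everything in log_archive here (S0008121 through S0008304) predates S0008344, the whole directory is a deletion candidate once backed up; logs in log_dir itself must never be touched.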
