SLO Data Integrity check

Hi,
Currently we have 10 company codes in our production system (e.g. company codes P01 ~ P10).
However, there is a requirement that one of them needs to be separated and migrated into a separate system (e.g. company code P10).
The idea is to copy the client using SLO, then purge the data so the copied client will only contain the data of company code P10.
Can anybody give me advice on how to test the data integrity efficiently in the copied client after the data purging process?
I am afraid that data corruption may occur during the data purging.
As additional information, we have 11 SAP modules involved.
Thanks in advance.
Arif

Hi Arif,
We faced a similar scenario, and SLO was used to separate one of our company codes' data.
There was no data corruption nor any major issue. However, we designed a testing cycle that included all the other teams, with people from every module designing test cases and scenarios.
From the Basis perspective, the SLO work was done by SAP, and post-processing was standard verification done from our side.
Regards,
Divyanshu
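For the post-purge verification, one lightweight first pass (before the module teams run their functional test cases) is to reconcile per-table row counts for the retained company code between the source client and the copied client. A minimal sketch in Python, assuming both systems are reachable over SQL and that the tables carry the standard BUKRS company-code column; the table list and connections here are hypothetical stand-ins, not a complete SLO verification:

```python
import sqlite3

# Hypothetical subset of tables that carry the company code column BUKRS.
TABLES = ["BKPF", "BSEG", "EKKO"]

def company_code_counts(conn, bukrs):
    """Row count per table for one company code."""
    counts = {}
    for table in TABLES:
        cur = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE BUKRS = ?", (bukrs,)
        )
        counts[table] = cur.fetchone()[0]
    return counts

def reconcile(source_conn, copy_conn, bukrs="P10"):
    """Compare per-table counts; return only the tables whose counts differ."""
    src = company_code_counts(source_conn, bukrs)
    dst = company_code_counts(copy_conn, bukrs)
    return {t: (src[t], dst[t]) for t in TABLES if src[t] != dst[t]}
```

Matching counts alone will not catch field-level corruption, so this is only a cheap smoke test to run before the module teams' functional scenarios, not a replacement for them.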

Similar Messages

  • Shared Variable Data Integrity

    I am using Network Shared Variables for a project and have a question about if there are any data integrity checks that are built into the NI-PSP protocol.
    I'm more of a hardware engineer and I am just wondering if there is any way that data can be corrupted when using Network Shared Variables.  I could probably just cast the data to U8 and then CRC it on both ends, but wanted to check about integrity inherent to LabVIEW for this.  Because we mainly develop medical devices, any LabVIEW code needs to be validated if used for verification testing.
    If there is NI documentation/studies on data integrity for this, I'd appreciate it if someone could point me to it.
    Thanks!
    Thad

    Hi Thad,
    The following article talks a little about data integrity when using shared variables.
    http://www.ni.com/white-paper/4679/en/
    I hope it helps.
    Regards, 
    kruiz17
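    Thad's fallback idea (cast the data to U8 and CRC it on both ends) is easy to prototype outside LabVIEW too. A sketch in Python, purely to illustrate the framing scheme: the sender appends a CRC-32 to the byte payload, and the receiver recomputes and compares it.

```python
import struct
import zlib

def frame(payload: bytes) -> bytes:
    """Append a CRC-32 (little-endian u32) to the payload before sending."""
    return payload + struct.pack("<I", zlib.crc32(payload))

def unframe(data: bytes) -> bytes:
    """Verify and strip the trailing CRC-32; raise on mismatch."""
    payload, (crc,) = data[:-4], struct.unpack("<I", data[-4:])
    if zlib.crc32(payload) != crc:
        raise ValueError("CRC mismatch: payload corrupted in transit")
    return payload
```

    Any flipped bit in the payload changes the recomputed CRC, so the receiver fails loudly instead of silently accepting bad data.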

  • Using Data Integrator Migration Mechanisms and Migration Tools

    Data Integrator provides two mechanisms for migrating jobs from development to test to production:
    - Export/import: the basic mechanism for migrating Data Integrator applications between phases. First, you export jobs from the local repository to either a file or a database, then you can import them into another local repository.
    - Multi-user development: instead of exporting and importing applications, multi-user development provides a more secure check-in, check-out, and get mechanism, using a central repository to store the master copies of your application elements.
    Regardless of which migration mechanism you choose, Business Objects recommends you prepare for migration using one or more tools that best fit your development environment:
    - Naming conventions: just as Business Objects recommends you standardize Data Integrator object prefixes, suffixes, and path name identifiers to simplify your projects internally, we also recommend the use of naming conventions externally for migration purposes.
    - Datastore and system profiles: with multiple profiles, instead of a separate datastore (and datastore configuration) for each database instance, you can associate multiple datastore profiles with a single datastore connection.
    Each mechanism and tool is recommended for specific types of projects. For more detail, please see the Data Integrator Advanced Development and Migration Guide provided with Data Integrator 6.5.

    Hej Fahad!
    Thanks for your valuable information, but I have no background in data migration. I am a mobile communications developer, so this is my first assignment between two companies. The first company will give me the Oracle data, probably in the form of dump files; I will set up my virtual server and test there whether the data is valid. If the data is valid, I will transfer it to the target platform.
    So right now I have no access to the source system the data is coming from. Can you please explain a little more about invalid objects, and whether export/import still works the same?
    Waiting for reply
    Regards
    Hani

  • Error occured in the integration checks(read log)

    Hi,
    I tried to post planning of machinery hours in KP26, and at that time I got this error, and the data is not saved either: "Error occurred in the integration checks (read log)". Kindly solve this problem ASAP.
    Regards
    PRINCE

    Sir,
    Where do I maintain the nominal rate for KP26? Please give me a path or transaction code.
    Regards
    Prince

  • "Oracle Data Integrator" with "Total Recall"

    Hi all,
    We are planning to use Oracle Data Integrator 11g for performing ELT in Oracle 11g database. We are also planning to enable the "total recall" (flashback) technology and house all our tables on it.
    The question I have right now is: will ODI and Total Recall work well together?
    Background
    Say we have an interface defined with the target data store defined on a tablespace with flashback enabled. Say there are 100 rows in the source, of which 10 rows violated the check constraint . The "bad" data, violating the constraint, will be moved to the E$ table while the rest of the 90 rows are loaded into the target.
    Questions
    1) If our business rule dictates zero tolerance for errors and a ROLLBACK is issued, what will happen to the data in the E$ table?
    2) Say we have committed the 90 rows and want to use a flashback transaction query to undo the changes; how will it affect the E$ table?
    3) Will the rows be deleted from the E$ table along with the rollback of the changes in the target?
    4) If the errors in E$ are recycled and this interface is restarted after the rollback is performed, will the I$ table contain 110 rows, i.e. source data + data from E$?
    5) How does ODI handle recycling / reprocessing of the violations in the E$ table?
    Please advise.
    Thank you.
    CC

    1.) The data in E$ will remain there
    2.) The data in E$ will remain there
    3.) The data in E$ will remain there
    4.) 90 rows. The recycled rows will still error out
    5.) To me the recycling feature is pretty lame. You need to fix the errors in the E$ table and then recycle will load the data.

  • SAP TechEd 2011- Data Integration and Quality Managment Sessions

    Looking for answers on "Data Integration and Quality Management," check out the related educational sessions at SAP TechEd 2011 ([Las Vegas|http://bit.ly/qE0dZs], [Bangalore|http://bit.ly/oVdGsM], [Madrid|http://bit.ly/rnSNOT]). [Register today|http://www.sdn.sap.com/irj/scn/sapteched]!

    Hi,
    you can check the SAP TechEd site and register there; the link is below:
    [SAP Teched Banglore 2011|http://www2.sapevents.com/SAP/TechEd2011Bangalore/index.cfm?fuseaction=agenda.sessionCatalogueHOS&view=overview]
    Thanks,
    Jansi

  • Do integrity checks by file comparison really work?

    Hello,
    I use a program (Beyond Compare) which has a Folder Compare module, with a function for doing a binary comparison between the files inside two folders. I use it to check the integrity of my backups, by comparing my main hard disk files, with the ones on
    the backup hard disks. If two files get recognized as being different, then one of them should be corrupt (if they didn't get modified "normally").
    My question is: will this method work to detect real file corruptions? Will it detect any kind of corruption, both copy corruptions, and "bit rot" corruptions? Will there be no problem regarding read cache? I mean: if the first file to compare gets read
    from disk, and is kept in a read cache (by Windows or by the hard disk, or by something else), will the second file get read from the same cache (if Windows or the hard disk think they are identical files)? Do Windows and hard disks have some kind of procedure
    to detect if a file to read from disk is already available in cache, even if it is in a different folder than the "original" one? Perhaps some kind of file-checksum-system which decides that the files are same? (And this system would not notice if the file
    to compare is corrupt). If this would be true, then integrity checks by file comparison would not work, because in practice the same file would be read twice (first from disk, and then from cache), instead of reading both the two files to be compared from
    disk.
    I have already done tests by manually "corrupting" files (changing slightly the contents, while keeping size and timestamp the same), and it works (the files get recognized as different). But I'm not sure if it will work also with "real" corrupt files.
    I'm interested mostly about Windows 8 Pro 64bit and NTFS (but would like to know also in general).
    Thanks.

    I also have Beyond Compare and have used it to check backup data.
    Yes, Windows does RAM caching.  If you have a comparison program read enough file data that you
    can be sure the RAM cache is flushed, then you can be sure that reading the file data back and comparing it with the original data is a valid way to ensure you have uncorrupted backups.
    Note that NTFS now does online recovery, so just the act of reading back the data can trigger processes inside the file system implementation that will recover the data if it should start to experience "bit rot" (weakening or failure of the
    storage medium).  But this is not strictly necessary as other operations, such as defragmentation etc., will occasionally access the disk data as well.
    Some time back I wrote a small utility that calculates a CRC-32 on all the data from all the files in a set of folders.  This is another way that re-reading all the data can be triggered, as well as producing a summary number that can be easily
    compared to determine that all is well - though one doesn't need my software to do it...  There are hash programs available that can accomplish the same things.  Search for SHA-1 programs, but beware there can be malware associated with free programs
    and download sites.
    It's good that people think about data integrity.  There's all too little of that nowadays.
    -Noel
    Detailed how-to in my eBooks:  
    Configure The Windows 7 "To Work" Options
    Configure The Windows 8 "To Work" Options
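    Noel's folder-CRC idea can be sketched in a few lines of Python (this is not his actual utility, just an illustration of the approach): walk the tree, CRC every file's bytes, and fold the per-file CRCs into one summary number that can be compared between the original and the backup.

```python
import os
import zlib

def file_crc32(path):
    """CRC-32 of one file, read in chunks so large files need not fit in memory."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            crc = zlib.crc32(chunk, crc)
    return crc

def tree_crc32(root):
    """Summary CRC over all files under root, visited in a stable sorted order."""
    crc = 0
    for dirpath, dirnames, filenames in sorted(os.walk(root)):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root).encode()
            crc = zlib.crc32(rel, crc)  # fold in the relative path...
            crc = zlib.crc32(file_crc32(path).to_bytes(4, "little"), crc)  # ...and the file's CRC
    return crc
```

    tree_crc32(original) == tree_crc32(backup) gives the same yes/no answer as a full binary compare, while touching every file; note it does not bypass the OS read cache, which is exactly the caveat discussed in this thread.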

  • Data Integration from Oracle GL to Hyperion  Planning

    Hi all,
    Can I know the steps for extracting the data from Oracle GL tables to Hyperion Planning?
    We have installed Hyperion 9.3.1 and ODI 10.
    Please let me know how we can do this process using ODI?
    Thanks

    Hi User##########,
    In a nutshell you need to do the following...
    1. Create a Project in ODI
    2. Define your Source and Target data stores in Topology
    3. Import the appropriate Knowledge Modules
    4. Reverse Engineer your Source and Target modules to create a model
    5. Create an integration to load from your Source (Oracle GL) to your Target (Hyperion Planning -- I actually prefer to load the data directly to Essbase)
    I just did this last month. When loading the data I used an Essbase load rule to ensure Essbase knew the column order.
    I was also able to load data via ODI from a flat file WITHOUT a load rule but the order of the data in the file was very important. I was also not able to substitute or alter the information in the integration when this was done. So in the end I still needed to have a load rule handy.
    I'd recommend reading up on the ODI documentation for this -- some of it is actually quite good. http://download.oracle.com/docs/cd/E10530_01/doc/nav/portal_6.htm Check out the "Oracle Data Integrator Adapter for Hyperion xxx" section.
    -Chris
    P.S. Check out John Goodwin's blog -- there's some neat stuff in there and you'll probably figure out some neat stuff on your own too.

  • Data Integrator Web Service Job State

    I would like to know if there's any way, through DI's web services to know if a job is running on the job server or not.  I am building a .NET application that will allow a user to manually launch a job, but I need to prevent the users from launching the job if it's already running.
    I am using Data Integrator 7.2
    Thanks

    You can check the following posts, which have similar discussions:
    Re: Extracting JOB_ID for WebClient via webservice
    Extracting JOB_ID within the DS Job

  • IDM Database integrity checks

    Are there any routines or jobs that check / repair the integrity of the IDM database? In particular, the linkages between MSKEYVALUEs and MSKEYs.
    In our development IDM instance, in the MXIV_ENTRIES table, we have some MXREF_MX_PRIVILEGE records which point to MSKEYs that don't exist. We found this problem when a user deletion through the GUI failed with a 'privilege doesn't exist' error. Since development is used for all sorts of destructive testing and initial installs of service pack upgrades, it is no wonder the data integrity is suspect.
    The other option is to clear the lot and simply reload from all the clients. But I was just wondering if others have had any integrity problems and if there are 'fix' routines available.

    Hi Phil,
    I'm not aware of any standard mechanism in SAP IDM that you can use to cleanup your database.
    I guess you have to implement this on your own. The following SQL query should give you all the assigned privileges that no longer exist in the identity store:
    select mskey, attrname, searchvalue
    from mxiv_sentries
    where attrname = 'MXREF_MX_PRIVILEGE'
      and searchvalue not in (select mskey from mxiv_sentries)
    You could then loop through the result and delete all the attribute values.
    Best regards
    Holger
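    If you want to script what Holger describes, a safe first step is a read-only job: run his query, review the orphans, and only delete afterwards (ideally through IDM's own tasks rather than raw SQL). A hedged Python sketch; the connection is a stand-in and the schema below is only a test double, not a real IDM database:

```python
# Holger's query from the reply above, verbatim.
ORPHAN_QUERY = """
    SELECT mskey, attrname, searchvalue
    FROM mxiv_sentries
    WHERE attrname = 'MXREF_MX_PRIVILEGE'
      AND searchvalue NOT IN (SELECT mskey FROM mxiv_sentries)
"""

def find_orphaned_privileges(conn):
    """Return (mskey, attrname, searchvalue) rows whose privilege target is gone.

    Deliberately read-only: review the list before deleting anything.
    """
    return list(conn.execute(ORPHAN_QUERY))
```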

  • ITunes database integrity check?

    In iTunes I have a few ! marks that have appeared in the first column, indicating iTunes can't find the file. So far I have found three folders (albums) that are missing from my music library disc, and I don't understand how or when they disappeared. I haven't found any individual missing files yet, just missing whole folders. It appears iTunes doesn't update the ! indicator until it has some reason to actually go open the file.
    Is there a way to automate this? So far I've been looking at each song with Command-I to check the Where info under Summary, or selecting the first song of an album and using Command-R to view the songs in Finder. This is going to take a long time with nearly 8000 songs in my library. I'm trying to get a handle on the extent of the problem.
    I am careful to only use iTunes to manage the library (I don't move files around with Finder). My library is on an external FireWire drive. Ideally, I would like there to be an "iTunes database integrity check" command.
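    One way to automate the check, assuming iTunes' library XML export is available (the "iTunes Music Library.xml" file), is to read the library with the standard plist parser and test whether each track's file still exists on disk. A hedged sketch in Python:

```python
import os
import plistlib
from urllib.parse import unquote, urlparse

def missing_tracks(library_xml_path):
    """Return (track name, path) for every track whose file no longer exists."""
    with open(library_xml_path, "rb") as f:
        library = plistlib.load(f)
    missing = []
    for track in library.get("Tracks", {}).values():
        location = track.get("Location")
        if not location:
            continue
        # Location is a file:// URL with percent-encoded characters.
        path = unquote(urlparse(location).path)
        if not os.path.exists(path):
            missing.append((track.get("Name", "?"), path))
    return missing
```

    This finds every broken link in one pass instead of waiting for iTunes to flag tracks with ! as it happens to open them.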


  • How to load an XML schema with Data Integrator ?

    Post Author: Kris Van Belle
    CA Forum: Data Integration
    Is someone having experience with loading data from a regular table into an XML schema ?
    What are exactly the steps to undertake ? The DI user manual does not provide that much information...

    Post Author: bhofmans
    CA Forum: Data Integration
    Hi Kris,
    In the Designer.pdf there is a chapter called 'nested data' with more information, but you can also check this website with some detailed instructions and examples on how to build a nested relational data model (NRDM).
    http://www.consulting-accounting.com/time/servlet/ShowPage?COMPANYID=43&ELEMENTID=161
    (Thanks to Werner Daehn for putting this together).

  • How to install File Operations components in Data Integrator

    Hello Endeca Forum,
    I'm interested in using the File Operations components in Data Integrator, but getting and installing them is surprisingly obscure.
    I'm talking about...
    http://doc.cloveretl.com/documentation/UserGuide/index.jsp?topic=/com.cloveretl.gui.docs/docs/file-operations.html
    The Clover documentation (http://doc.cloveretl.com/documentation/UserGuide/index.jsp?topic=/com.cloveretl.gui.docs/docs/components.html) states...
    "+Note if you cannot see this component category, navigate to Window → Preferences → CloverETL → Components in Palette and tick both checkboxes next to File Operations+"
    However, when I go to Window --> Preferences --> CloverETL --> Components in Palette, I don't even see File Operations. This tells me that the version of Clover packaged with OEID doesn't have the components installed.
    Going to Help --> Check for Updates also didn't yield anything that looked like the File Operations components.
    I'm hoping someone here is familiar with the necessary steps to get these components installed...
    Thanks,
    Jerome

    I see what you're saying Purvesh, thanks for the clarification.
    Again, it's a little challenging to find this out, but there's a Clover support team member who mentions here...
    http://forum.cloveretl.com/viewtopic.php?f=4&t=6428&view=previous
    ... that the File Operations were introduced in the 3.3.0 M3 release.
    Guess this means we're out of luck, which is unfortunate.
    I'm fully aware that the SystemExecute component is an option for file management, but it's not the best fit for what I need to do. In any case, my question has been answered.
    Thanks!
    Jerome

  • Database Integrity check failed, how to find an un-corrupted backup for recovery

    I have a database integrity check task that runs weekly. The job ran on March 23rd but failed on March 30th. We have identified that there is corruption in the database, and now the task is to restore it from a backup (with data loss). We have a database backup running every night, and I need to know how I can find the latest backup that is not corrupted.
    The MSDN documentation says the RESTORE VERIFYONLY command does not verify whether the structure of the data contained within the backup set is correct. Does it mean the restore command will not be able to detect corruption in the database, and I just need to restore each of the backups, starting from the latest, to see if the integrity check fails after restore? Or will RESTORE VERIFYONLY confirm that the database is uncorrupted?

    The MSDN documentation says the RESTORE VERIFYONLY command does not verify whether the structure of the data contained within the backup set is correct. Does it mean the restore command will not be able to detect corruption in the database, and I just need to restore each of the backups, starting from the latest, to see if the integrity check fails after restore? Or will RESTORE VERIFYONLY confirm that the database is uncorrupted?
    As the documentation suggests, RESTORE VERIFYONLY checks the structure of the backup but not the database itself.  You'll need to restore the backup to check the database consistency.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • Verifying Data Integrity Outlook 2010

    Hi,
    We are using Microsoft SBS 2011, including Exchange 2010.
    One of our users could not start his Outlook 2010; the screen shows "Verifying Data Integrity", which takes hours.
    I checked his .ost file on his computer; it is about 50 GB.
    Does Outlook 2010 have a capacity limit?
    MT

    Hello,
    Yes, there are limitations by default. Please see:
    How to configure the size limit for both (.pst) and (.ost) files in Outlook 2010, Outlook 2007, and in Outlook 2003
    http://support.microsoft.com/kb/832925
    Thanks,
    Simon
