SAP Archiving's Write Job (SARA) - Execution Target
Dear All,
As you know, whenever we trigger the write job via transaction SARA, a SUBMISSION job (e.g. ARV_FI_DOCUMNT_SUB20100527121927) is created first. When that SUBMISSION job completes, the WRITE job (e.g. ARV_FI_DOCUMNT_WRI20100530070050) is triggered.
However, I noticed that the WRITE job triggered by the SUBMISSION job is automatically assigned the DB server as its execution target.
My question is: is it possible to avoid this?
Our WRITE jobs always get delayed because there are only a limited number of background (BGD) work processes on the DB server, and we don't plan to increase them because that would affect the DB server's performance.
Kindly advise me on this.
Thanks in advance.
SAP Release : 640
Best Regards,
Ken
Hi Ken,
Have you tried configuring the server group for background processing in the cross-archive object customizing?
Hope this feature will resolve your problem. Have a look in Customizing -> Cross-Archiving Object Customizing -> Server Group for Background Processing
Link: http://help.sap.com/saphelp_470/helpdata/en/6d/56a06a463411d189000000e8323d3a/frameset.htm
Thanks,
Naveen
Similar Messages
-
Does MM_EKKO archive write job lock the tables?
Archive experts,
We run MM_EKKO archiving twice a week. From my understanding, the write job just reads the data and writes it to archive files. But when we run replenishment jobs that hit the EKPO table, those jobs run slowly, and I noticed that the archive write job is holding locks on this table. As soon as I cancelled the write job, the replenishment jobs moved faster. Why does this happen? Archive write jobs should not be causing any performance issues; only the delete jobs should impact performance. Am I correct? Is anyone experiencing similar issues?
Sam
Hi Sam,
Interesting question! Your understanding is correct: the write job just reads the data from the tables and writes it into archive files... but the write job of MM_EKKO (and MM_EBAN) is a bit different. The MM_EKKO write job also takes care of setting the deletion indicator (depending on whether it is one-step or two-step archiving). So it is possible that it places a lock while setting the deletion indicator, as that is a change to the database.
Please have a look at the following link for an explanation of one-step and two-step archiving:
http://help.sap.com/saphelp_47x200/helpdata/en/9b/c0963457889b37e10000009b38f83b/frameset.htm
Hope this explains the reason for the performance problem you are facing.
Regards,
Naveen -
CO_COSTCTR Archiving Write Job Fails
Hello,
The CO_COSTCTR archiving write job fails with the error messages below.
Input or output error in archive file \\HOST\archive\SID\CO_COSTCTR_201209110858
Message no. BA024
Diagnosis
An error has occurred when writing the archive file \\HOST\archive\SID\CO_COSTCTR_201209110858 in the file system. This can occur, for example, as the result of temporary network problems or a lack of space in the file system.
The job logs do not indicate other possible causes, and neither do the OS and system logs. When I ran it in test mode it finished successfully after a long 8 hours. The error only happens in production mode, where the system actually generates the archive files. The weird thing is, I do not have this issue on our QAS system (a DB copy of our Prod). I was able to archive successfully in QAS using the same path name and logical name (we transported the settings).
Considering the above, I suspect some system or OS-related parameter that is unique or different from our QAS system, a parameter that is not saved in the database, since our QAS is a DB copy of our Prod system. Such a parameter could affect archiving write jobs (which read from and write to the file system).
I already checked the network session timeout settings (CMD > net config server) and the settings are the same between our QAS and Prod servers. There are no problems with disk space. The archive directory is a local shared folder \\HOST\archive\SID\<filename>, where HOST and SID are variables unique to each system. The difference is that our Prod server is HA-configured (clustered) while our QAS is standalone. There might be other relevant settings I am not aware of. Has anyone encountered this before and managed to resolve it?
We're running SAP R3 4.7 by the way.
Thanks,
Tony
Hi Rod,
We have tried a couple of times already; all runs got cancelled with the error above. As much as we wanted to trim down the variant, CO_COSTCTR only accepts an entire fiscal year. The data it has to go through is quite a lot, and the test run took us more than 8 hours to complete. I have executed the same in our QAS without errors, which is why I am a bit confused about getting this error in our production system. Even though our QAS is refreshed from our PRD using a DB copy, it can run the archiving without any problems. So I am led to think that there may be unique contributing factors or parameters, not saved in the database, that affect the archiving. Our PRD is configured for high availability; the hostname is not the physical host but a virtual host of two clustered servers. But this was no concern with the other archiving objects; only CO_COSTCTR gives us this error. QAS has archiving logs turned off, if that is relevant.
Archiving fiscal year 2007 cancels after around 7,200 seconds each time, while fiscal year 2008 cancels earlier, at around 2,500 seconds. I think that while the write program is looping through the data, by the time it needs to access the archive file again, the connection has been disconnected or has timed out. The reason it cancels almost consistently after the same amount of time is the variant: there is not much scope to trim down the data, so the program reads the same set of data objects each run. When it reaches that one point of failure (after the expected time), it cancels. If this is true, I need to find where to extend that timeout, or whatever else is causing the error above.
Thanks for all your help. This is the best way I can describe it. Sorry for the long reply.
Tony -
Error in write Job of MM_EKKO archive
Greetings,
I have a problem. Unexpectedly, a large amount of data was generated during MM_EKKO archiving. The write job failed with an error because no more space was available. I have now deleted the files that showed as archived. One file that was in write mode is not showing in the archive management section, so we can neither delete it nor tell how much data is in it. As the job log was cleared, the session shows as complete.
My question is: at what point is the deletion indicator set for POs?
1. At the time of Write
2. At the time of Delete job.
Regards,
Vikram
Hello,
The deletion job will start only after the archive file has been created successfully.
If archive management is not showing the archive session number, it means the archive files were not created and no deletion happened.
Before executing the write job, it is always better to make a note of the total number of entries in the main table.
Thanks,
Ajay -
SAP Archiving - STO error on version ECC 6.0
Hi experts. I have spent a long time on this error and I hope you can help me. Let me explain the problem.
We have an archiving project with non-standard objects. These objects ran on SAP version 4.6C, but now, on ECC 6.0, the store program gives an execution error. The log of the STO job in the SARA transaction is as follows:
Job started
Step 001 started (program RSARCH_STORE_FILE, variant , user ID ARUIZ)
Archive file 000241-001ZIECI_RF02 is being processed
Archive file 000241-001ZIECI_RF02 does not exist (Message no. BA111)
Error occurred when checking the stored archive file 000241-001ZIECI_RF02 (Message no. BA194)
Job cancelled after system exception ERROR_MESSAGE (Message no. 00564)
The Write and Delete programs runs correctly.
A standard archiving object like FI_TCJ_DOC runs OK (WRI, DEL, and STO programs). The customizing for both objects is nearly identical in the OAC0 and FILE transactions. The differences are in the SAP directories. Here are the most important customizing settings:
Transaction: FILE
ZIECI_RF02 (No Standard)
Logical File Path --> ZZA_ARCHIVE_GLOBAL_PATH
Physical Path --> /usr/sap/CX6/ADK/files/ZA/<FILENAME>
Logical File Name Definition -->
ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02
FI_TCJ_DOC (Standard)
Logical File Path --> ARCHIVE_GLOBAL_PATH
Physical Path --> <P=DIR_GLOBAL>/<FILENAME>
Logical File Name Definition --> ARCHIVE_DATA_FILE
Transaction: AOBJ
Customizing settings:
ZIECI_RF02
Logical File Name --> ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02
FI_TCJ_DOC (Standard)
Logical File Name --> ARCHIVE_DATA_FILE
I also tried assigning the standard logical file name ARCHIVE_DATA_FILE to our own archiving object (ZIECI_RF02), and the error persists.
For the other parameters, the values for both objects are:
Delete Jobs: Start Automatically
Content Repository: ZA (Start Automatically)
Sequence: Delete Before Storing
I located the point in the store (STO) program (RSARCH_STORE_FILE) where our archiving fails. I wrote a small program with the same function call and, debugging it, I can see the following:
REPORT zarm_06_prueba_http_archivado.

DATA: length TYPE i,
      t_data TYPE TABLE OF tabl1024.

BREAK aruiz.

CALL FUNCTION 'SCMS_HTTP_GET'
  EXPORTING
    mandt                 = '100'
    crep_id               = 'ZA'
    doc_id                = '47AAF406C02F6C49E1000000805A00A9'
    comp_id               = 'data'
    offset                = 0
    length                = 4096
  IMPORTING
    length                = length
  TABLES
    data                  = t_data
  EXCEPTIONS
    bad_request           = 1
    unauthorized          = 2
    not_found             = 3
    conflict              = 4
    internal_server_error = 5
    error_http            = 6
    error_url             = 7
    error_signature       = 8
    OTHERS                = 9.

IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
If I execute the program, SAP returns the following error:
Error in HTTP Access: IF_HTTP_CLIENT -> RECEIVE 1
The sy-subrc variable value is 6 -> error_http
The method call is the following:
call method http_client->receive
  exceptions
    http_communication_failure = 1
    http_invalid_state         = 2
    others                     = 3.
The sy-subrc value is 1 -> http_communication_failure
The unusual thing about this case is that the archiving objects work perfectly on SAP 4.6C;
furthermore, on ECC 6.0 both the standard and non-standard objects use SAP Content Server with the same connection (HTTP), IP, port, and repository.
I hope someone can help me. Many thanks for your time.
Best regards.
Hi Samanjay,
To answer your questions:
1) I tested archiving from SARA with the object ZIECI_RF02 with both customizing settings. The store program fails with logical file name = ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02 and with logical file name = ARCHIVE_DATA_FILE (example in answer 2).
2) AL11 Transaction (Sap Directories):
/usr/sap/CX6/ADK/files/ZA/
cx6adm 09.04.2008 18:02:15 ZIECI_RF02_20080409.180215.ARUIZ
This is the file from ZIECI_RF02 in AL11; I checked it yesterday. The name 000241-001ZIECI_RF02 is the archive key generated when saving the file to repository ZA. In this case the archive key is 000342-001ZIECI_RF02. You can view this in table ADMI_FILES via transaction SE11.
Entries on ADMI_FILES with create date = 09.04.2008
DOCUMENT: 342
ARCHIV KEY: 000342-001ZIECI_RF02
CREAT DATE: 09.04.2008
CREAT TIME : 18:18:28
OBJ COUNT: 1
FILENAME: ZIECI_RF02_20080409.18182
STATUS OPT: Not Stored
STATUS FIL: Archiving Completed
PATHINTERN: ZZA_ARCHIVE_GLOBAL_PATH
CREP:
ARCH DOCID:
Now, I put the same information from FI_TCJ_DOC (Standard Object):
AL11 Transaction (Sap Directories):
/usr/sap/CX6/SYS/global
cx6adm 10.04.2008 11:24:15 FI_FI_TCJ_DOC_20080410_112409_0.ARCHIVE
Entries on ADMI_FILES with create date = 10.04.2008
DOCUMENT: 343
ARCHIV KEY: 000343-001FI_TCJ_DOC
CREAT DATE: 10.04.2008
CREAT TIME: 11:24:09
OBJ COUNT: 2
FILENAME: FI_FI_TCJ_DOC_20080410_112409_0.ARCHI
STATUS OPT: Stored
STATUS FIL: Archiving Completed
PATHINTERN: ARCHIVE_GLOBAL_PATH
CREP: ZA
ARCH DOCID: 47FD890364131EABE1000000805A00A9
Finally, I ran the example with archiving object ZIECI_RF02, but assigning the standard logical file name.
AOBJ (Customizing settings):
Object Name: ZIECI_RF02 (data archiving: retired invoices)
Logical File Name: ARCHIVE_DATA_FILE
Now, I execute the archiving in the SARA transaction.
AL11 Transaction (Sap Directories):
/usr/sap/CX6/SYS/global
cx6adm 10.04.2008 12:33:25 FI_ZIECI_RF02_20080410_123324_0.ARCHIVE
Entries on ADMI_FILES with create date = 10.04.2008
DOCUMENT: 345
ARCHIV KEY: 000345-001ZIECI_RF02
CREAT DATE: 10.04.2008
CREAT TIME: 12:33:24
OBJ COUNT: 1
FILENAME: FI_ZIECI_RF02_20080410_123324_0.ARCHIVE
STATUS OPT: Not Stored
STATUS FIL: Archiving Completed
PATHINTERN: ARCHIVE_GLOBAL_PATH
CREP:
ARCH DOCID:
That is the unusual part. At first, I thought the problem was the SAP directory, but with this test I found a new possible cause of the error.
3) The details of repository ZA are:
Content Rep: ZA
Description: Document Area
Document Area: Data Archiving
Storage type: HTTP Content Server
Version no: 0046 Content Server version 4.6
HTTP server: 128.90.21.59
Port Number: 1090
HTTP Script: ContentServer/ContentServer.dll
Phys. Path: /usr/sap/CX6/SYS/global/
Many thanks for your answer and your time. If you have any questions, please ask.
Best regards. -
Lock NOT set for: Archiving the data from a data target
Dear Expert,
I am trying to archive one of my InfoCubes. When I started to write the archive file, the free space in the archive folder was not enough and the process failed with an error.
Then I changed the archive folder to another path with enough free space. But when I started to write the archive file with a new variant, this error message came up:
==============================================================
An archiving run locks InfoProvider ZICCPS810 (archivation object BWCZICCP~0)
Lock NOT set for: Archiving the data from a data target
InfoProvider ZICCPS810 could not be locked for an archiving session
Job cancelled after system exception ERROR_MESSAGE
==============================================================
Please Help Me.
Wawan S
Hi Wawan,
If the earlier archive session resulted in an error, please try to invalidate that session in archive management and run the archive job again.
Hope this helps,
Naveen -
Archive routing in SAP archiving
Has anyone used archive routing? I keep getting the error message "No suitable content repository / logical file name found", but I know the logical file name has been defined properly, and so has the variant. Any ideas?
Hello again,
I think you will need to provide more information as there is some type of inconsistency between what you have configured for archive routing (rules and conditions) and the selection criteria in the write job variant.
Out in the SAP Help Portal, it states:
When entering the selection criteria in the variant it is important to make sure that the set of data covered by the selection criteria of the variant falls AT LEAST into the set of data covered by the rules (it does not have to be exactly the same, but at least a subset). If this is not the case, the archiving session is terminated before the write job begins.
Best Regards,
Karin Tillotson -
MII Scheduler Execution Target
Hello,
we have cpu-intensive scheduled transactions that we want to separate from the load of end users on our production cluster:
MII Production system PRD
Server 1 - DB and 2 NW Java server nodes
Server 2 - 3 NW Java server nodes
Server 3 - 3 NW Java server nodes
By using SAP Web Dispatcher, we can direct users to servers 2 and 3, sparing server 1 for the DB and background processing. But in the MII scheduler we can't set the execution target to server 1. For ABAP jobs we have an execution target, where we can define an application server or server group where the job will run. How can we accomplish the same in the MII scheduler? We're using MII 12.2.5 Build(66).
Thank you,
Marcos
Salvatore,
Thank you for your explanation. I read your document on performance and found that we're trying a 'big data' variation of scenarios 5 and 6. In general terms, our problem is scaling the background processing while doing online processing concurrently on the same server. Since we cannot control which transaction will run on each instance, our approach was to split the productive system into many systems, each one responsible for part of the load:
System 1: user processing - does not have scheduled transactions
System 2: batch processing part 1 - scheduled transactions for sites A and B
System 3: batch processing part 2 - scheduled transactions for site C (part 1)
System 4: batch processing part 3 - scheduled transactions for site C (part 2)
System 5: batch processing part 4 - scheduled transactions for site D
This architecture has many obvious support drawbacks, but it is the only way we found to separate user and batch loads on MII. On ABAP systems we have job target control.
I wonder what other users are doing to separate the load; maybe they don't have that much processing on one server, as in your scenario. Here at Petrobras we are used to dealing with massive amounts of data, say 5K tags per request times the varying granularity.
Please advise.
Regards,
Marcos -
Data archiving for Write Optimized DSO
Hi Gurus,
I am trying to archive data in Write Optimized DSO.
It allows me to archive on a request basis, but it archives all requests in the DSO (meaning all data).
However, I want to select a range of requests to archive (my own selection of requests).
Please guide me.
I got the details below from SDN. Kindly check:
Archiving for write-optimized DSOs follows request-based archiving, as opposed to the time-slice archiving of a standard DSO. This means that partial request activation is not possible; only complete requests can be archived.
The characteristic for the time slice can be a time characteristic present in the WDSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the defaults are 0REQUEST & 0DATAPAKID.
The actual process of archiving remains the same i.e
Create a Data Archiving Process
Create and schedule archiving requests
Restore archiving requests (optional)
Regards,
kiruthika
Hi,
Please check the below OSS Note :
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
-Vikram -
Mass deletion of batches - SAP Archiving Object MM_SPSTOCK
__SAP Archiving Object MM_SPSTOCK (Batches and Special Stocks)__
Would like to know if there is any standard program or report available through which the batches (to be deleted) can be set with a deletion flag. We are in the process of implementing the archiving object MM_SPSTOCK and the batches have not been flagged. There is no pre-processing job for MM_SPSTOCK that can be used to set the deletion flag for all eligible batches.
This object MM_SPSTOCK does not have any date or selection criteria for the write job. All batches with set deletion flag will be archived/deleted.
We are in the process of implementing LO_CHVW (batch where-used data), for which implementing MM_SPSTOCK is a prerequisite.
Would appreciate any leads around this. Any other suggestions around the MM_SPSTOCK & LO_CHVW will be great.
Thanks
Sam
I am basically looking for a method (program/report) through which a mass deletion flag can be set for eligible batches prior to using the MM_SPSTOCK archiving object to archive/delete the batches.
Thanks
Sam -
Dear All,
What do the items below mean?
"Maintain account type life"?
"Maintain document type life"?
Must this always be done before archiving?
Configuration
In SARA tcode,
Object: FI_DOCUMNT (for archiving FI documents) and press Enter
click on customizing button
Maintain account type life
Maintain document type life
Refer to the following:
http://help.sap.com/saphelp_47x200/helpdata/en/29/b6963457889b37e10000009b38f83b/frameset.htm
and refer SAP notes 99620 and 94144
Thanks,
Naveen -
SAP Archive Link (Error in opening file for writing)
I am trying to export a file to a folder in the server using a logical path.
The logical path I am using is MARKETING_FILES. I have changed its physical mapping to 'G:\Segmentation_Exports\<FILENAME>'
I am scheduling the program as a background job. But it gets canceled throwing the following error.
SAP ArchiveLink (file G:\Segmentation_Exports\100AC-$100OFF_201105300730 could not be opened for writing)
The problem occurs only in production. In development and Quality servers the program works fine.
In production, when the mapping is changed back to default i.e., <P=DIR_GLOBAL>/<FILENAME>. the program works fine.
Suggestions to solving the issue??
Edited by: qwerty_m on May 30, 2011 9:55 AM
I would check the obvious things first:
1) Does the directory "G:\Segmentation_Exports" exist on the Production server?
2) If so, does the SAP user have write access to the directory?
3) If so, was a directory inadvertently created with the same name as the output file?
You can also use transactions SM21 and ST22 for more information about the error.
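The first three checks can be scripted. Here is a minimal sketch for a Unix-style host; note that the thread's G:\Segmentation_Exports path is a Windows share, so the directory and file names below are illustrative placeholders, not the real ones:

```shell
# Illustrative stand-ins for the real archive path and output file name.
DIR="/tmp/segmentation_exports_demo"
FILE="$DIR/100AC-100OFF_201105300730"

mkdir -p "$DIR"

# 1) Does the directory exist on the server?
[ -d "$DIR" ] && echo "directory exists"

# 2) Does the current user have write access to it?
[ -w "$DIR" ] && echo "directory is writable"

# 3) Was a directory inadvertently created with the output file's name?
#    (A directory with the file's name would make the open-for-write fail.)
if [ -d "$FILE" ]; then
  echo "name collision: a directory blocks the output file"
else
  echo "no name collision"
fi
```

On Windows the same checks would be done in Explorer or with `dir` and the share's security settings; the logic is identical.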
Regards,
D. -
DI install is stuck at phase "unpack SAP archive"
I am trying to install a DI on an MSCS cluster with SQL Server.
At the phase called "unpack SAP archive", SAPinst cannot move forward.
The sapinst_dev.log reads: (many repeated loops as follows)
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-03-31 18:37:33.681 NWInstall._getFiles()
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-03-31 18:37:33.681 NWInstall._getList(DBINDEP\SAPEXE.SAR)
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-03-31 18:37:33.681 NWInstall._getList() done: instance.lst
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-03-31 18:37:33.681 NWInstall._getFiles(): found archive in table.
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-03-31 18:37:33.681 NWInstall._getFiles() done: [
The sapinst.log reads:
WARNING 2009-03-31 14:52:04
Unable to get information about file
\\NT001C5555EP1\sapmnt\EP1\SYS\exe\uc\NTAMD64 using FindFirstFile. Operating system error message: You were not connected because a duplicate name exists on the network. Go to System in Control Panel to change the computer name and try again.
However, Explorer can access
\\NT001C5555EP1\sapmnt\EP1\SYS\exe\uc\NTAMD64 without any issue.
We can ping NT001C5555EP1 without errors.
DNS shows only one entry for NT001C5555EP1.
How to fix it? Modify control.xml to allow the unpack anyway?
Points will be given. Thanks a lot!
Dear Friends:
I found the following in sapinst_dev.log:
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-04-01 15:03:41.212 NWInstall._needUnpacking() done:
NT001C5555EP1/sapmnt/EP1/SYS/exe/uc/NTAMD64/sapstartsrv.exe.new is missing, returning true.
TRACE [iaxxcdialogdoc.cpp:142]
CDialogDocument::evaluateDependsAndDependsNot()
Do front side checks at backend for dialog element unpack
TRACE [iaxxgenimp.cpp:293]
CGuiEngineImp::showDialogCalledByJs()
Dialog d_nw_unpack_now was skipped because of ( getCurrentDialog()->skip() or !getCurrentDialog()->shallShow() )returns true.
TRACE [iaxxejsexp.cpp:199]
EJS_Installer::writeTraceToLogBook()
2009-04-01 15:03:41.227 NWInstall._needUnpacking({
cd:UKERNEL
codepage:Unicode
destination:
\\NT001C5555EP1\sapmnt\EP1\SYS\exe\uc\NTAMD64
list:
needUnpacking:true
ownPath:
path:DBINDEP\SAPEXE.SAR
sid:QP1
unpack:false
wasUnpacked:false
In control.xml, could you tell me how to edit the following so that the unpack can
take place? Right now I am stuck at "unpack SAP archives":
NWInstall.prototype._needUnpacking = function(archive) {
  NW.trace("NWInstall._needUnpacking(", dump_properties(archive), ")");
  if (!archive.cd) {
    NW.trace("NWInstall._needUnpacking() done: optional archive, return false.");
    return false;
  }
  var dest = FSPath.get(archive.destination);
  if (!dest.isExisting()) {
    if (installer.onUnix()) {
      // Handle the case where /usr/sap/SID/SYS/exe does not yet exist, but
      // /sapmnt/SID/exe is there. /usr/sap/SID/SYS/exe/run is a softlink to /sapmnt/SID/exe,
      // but /usr/sap/SID/SYS/exe/uc/platform should be /sapmnt/SID/exe/uc/platform.
      var destString = dest.toString();
      var sysExe = "/usr/sap/" + this.getSID() + "/SYS/exe";
      if (destString.indexOf(sysExe) == 0) {
        destString = this.getSAPMountDir() + "/" + this.getSID() + "/exe" + destString.substr(sysExe.length);
        destString = destString.replace(/\/run$/, "");
        NW.trace("NWInstall._needUnpacking() done: destination ", dest, " does not exist, testing ", destString);
        dest = FSPath.get(destString);
        if (!dest.isExisting()) {
          NW.trace("NWInstall._needUnpacking() done: destination ", dest, " does not exist, returning true.");
          return true;
        }
      } else {
        NW.trace("NWInstall._needUnpacking() done: destination ", dest, " does not exist, returning true.");
        return true;
      }
    } else {
      NW.trace("NWInstall._needUnpacking() done: destination ", dest, " does not exist, returning true.");
      return true;
    }
  }
  var files = NWInstall._getFiles(archive);
  for (var i = 0; i < files.length; ++i) {
    var f = dest.concat(files[i]);
    if (!f.isExisting()) {
      NW.trace("NWInstall._needUnpacking() done: ", f, " is missing, returning true.");
      return true;
    }
  }
  NW.trace("NWInstall._needUnpacking() done: false");
  return false;
};
Thanks!! -
Hi all,
How do we archive data from SAP when the data has a deletion flag?
Can anyone please explain the archiving process in SAP? We need to delete production order data.
Thanks in advance.
Regards,
veera
Edited by: Veerab on Oct 3, 2011 9:52 AM
Hi,
The main transaction for all SAP archiving objects is SARA:
1) Complete the customization steps of the SAP archive object on SARA
2) In order to delete the documents, you need to archive the object successfully, first
3) Then, you can delete the archive session object
4) Do not forget to reorganize the archived tables and create the statistics
The object name is PP_ORDER.
Best regards,
Orkun Gedik -
SAP archiving object in MM: cross-verify and test
Hi All,
We are in the process of archiving data from production; configuration and object setup are done up to the quality client, and we have started testing in quality. As a functional consultant, how do I cross-verify and check whether data like POs, PRs, and material documents has been archived or not?
Currently, data has been archived only for May 2008. I would like to test whether, within that period, the data is properly archived or not, and whether there are any differences.
Can you please let me know how to take care of this and do the testing for SAP archiving in MM perspective.
Thanks for all your help.
Hi Hare,
For SAP data archiving testing you have to look mainly at two things: the residence period and the business completion requirements. The residence period is the time the data needs to remain in SAP before it is eligible to be archived (data old enough), and the business completion check verifies that the object is "closed" before archiving it. For instance, you cannot archive a service order that is still open or an equipment record that is still installed.
You should check different combinations like data old enough (for the period that you want) and not complete (should not be archived) and then data old enough and complete (should be archived).
Check the data before running the archiving jobs and see what should be / not be archived according to you, run the archiving jobs and then check the data again to see if results are expected.
Hope these guidelines help you.
Cheers.