BW Archiving run Locks data upload

Hello BW Experts.
We are working with archiving, but we have one problem: when we load data into an InfoProvider, the data load fails due to archiving-run locks.
If we run the deletion step of the archiving run, we can then load data into the InfoProvider without any issues.
But we do not want to perform the deletion step immediately; we want to run it after a few days. This causes a big issue for data loading in the production system.
Is there a program to unlock an archiving session?
Is there any other process to unlock the tables?
I know we can unlock an archiving session through table RSARCHREQ, but I feel this is not the right way to do it.
Any help regarding this is appreciated.
Thanks and regards,
Venkat

Hi Harsh,
The archiving process has two steps: the write step, which creates the archive file, and the deletion step, which deletes the archived data from the data target. The data target stays locked until the deletion step has completed.
If you do not want to delete the data yet but need to release the locks, invalidate the archiving run by changing the status of the archiving request (transaction SARA).
I hope this helps,
Mike.

Similar Messages

  • Saving data with a Write To Measurement File VI on an NI PXI-1042Q in real-time mode does not work, but the VI saves to the file when run without deploying it to the PXI

    Hi, I am trying to save data with a Write To Measurement File VI, using an NI PXI-1042Q and an NI PXI-6229 DAQ in real-time mode, but it is not working. When I run it without deploying it to the PXI, it does save to the file. Please find my VI attached.
    Attachments:
    PWMs.vi ‏130 KB

     Another problem is that the DAQmx channel only works in real-time mode, not in a stand-alone VI, using LabVIEW 8.2 and Real-Time 8.2.

  • Archiving BI.7.0: Must data in InfoCube be compressed before archiving run?

    Hi Expert
    I have created a Data Archiving Process (DAP). It is executed via an archiving process variant in a process chain.
    Archiving an InfoCube works fine when all requests are compressed.
    However, when some requests in the InfoCube are not compressed, I get the following error:
    07.11.2008 14:47:07 Data area to be archived is not completely compressed yet I RSDA 148
    07.11.2008 14:47:07 Exception condition  in row 105 of include CL_RSDA_ARCHIVING_REQUEST=====CM015 (program CL_RSDA_ARCHIVING_REQUEST=====CP) I RSDA 140
    I cannot find it stated anywhere that all requests have to be compressed, and I cannot find any OSS notes on this subject either.
    Have you got any experience in this matter?
    Thanks in advance and kind regards,
    Torben

    Hi,
    First of all, I want to thank you for your help!
    But I have a question about a case:
    Loading Date   Loading No.   FISCYEAR   Order No.   Turnover
    31/12/2007     1             2007       Order 1         100
    01/01/2008     2             2008       Order 1         -50
    02/01/2008     3             2007       Order 1          60
    The loading date corresponds to the request date.
    In our case, we have compressed the data whose request date is < 2008, and we want to archive the data whose FISCYEAR = 2007.
    In this case, load 1 ends up in the E table, and loads 2 and 3 in the F table.
    The goal of the third load is to recover data that belongs to year 2007 but was not received by the SAP system.
    For example, in my case, data (which come directly from the stores) are sent to SAP every day, but due to a problem we did not receive some data.
    So the third load allows us to recover those data.
    How can I archive all the data corresponding to FISCYEAR = 2007, knowing that not all of it is in requests whose loading date is < 2008?
    Thanks for your help.
    Salah Lamaamla

  • Error in "Archive Enterprise Portal data" phase of system copy

    Hi
    We are facing an error in the system copy export of our Java-only system, in the 6th phase, "Archive Enterprise Portal data".
    ERROR      2011-11-02 09:00:15.080
               CJSlibModule::writeError_impl()
    MUT-03011  Execution of the command "/tmp/sapinst_exe.12534.1320217175/SAPCAR -c -v -i -C /usr/sap/NED/SYS/global -f /temp/NED/JAVA/APPS/KMC/GLOBAL.KMC.SAR config/cm" finished with return code 44. Output:
    a config/cm
    a config/cm/config ....................................................
    We have NetWeaver 7.0, Java stack.
    Please help with any clues.
    Regards
    Priyanka Kaul

    Hi
    Please find further logs:-
    TRACE      2011-11-02 15:09:42.832 [syxxcfile.cpp:123]
               CSyFileImpl::decideIfMoveCopyNode(const CopyMoveDestinationInfo & , ISyNode::CopyMoveMode_t 0x3, PSyNodeInt &) const 
               lib=syslib module=syslib
    Target node does not exist and (mode & ISyNode::MISSING) ==> I will copy/move.
    INFO       2011-11-02 15:09:42.846 [syuxcpath.cpp:471]
               CSyPath::createFile() lib=syslib module=syslib
    Creating file /tmp/sapinst_instdir/NW04S/LM/COPY/ORA/EXP/CENTRAL/AS-JAVA/EXP/GLOBAL.log.
    INFO       2011-11-02 15:09:42.860
               CJSlibModule::writeInfo_impl()
    Output of /tmp/sapinst_exe.17133.1320239372/SAPCAR is written to the logfile GLOBAL.log.
    INFO       2011-11-02 15:17:45.369
               CJSlibModule::writeInfo_impl()
    Output of /tmp/sapinst_exe.17133.1320239372/SAPCAR -c -v -i -C /usr/sap/<SID>/SYS/global -f /temp/<SID>/SYSTEMCOPY/JAVA/APPS/KMC/GLOBAL.KMC.SAR config/cm:
    a config/cm
    a config/cm/config
    a config/cm/config/time.sys
    a config/cm/config/locks
    a config/cm/backups
    a config/cm/backups/backup-com.sap.portal.supportability.isolde-2010-02-22-13-03-08-863.zip
    a config/cm/backups/backup-bc.util.prjconfig-2010-02-22-13-03-18-87.zip
    a config/cm/backups/backup-kmc.util.core-2010-02-22-13-03-27-842.zip
    a config/cm/backups/backup-com.sapportals.portal.prt.service.soap-2010-02-22-13-03-37-369.zip
    a config/cm/backups/backup-bc.protocol.prjconfig-2010-02-22-13-03-46-356.zip
    a config/cm/backups/backup-bc.sf.prjconfig-2010-02-22-13-03-54-965.zip
    a config/cm/backups/backup-kmc.people.appl.presence-2010-02-22-13-04-03-709.zip
    a config/cm/backups/backup-bc.sf.service.prjconfig-2010-02-22-13-04-14-331.zip
    a config/cm/backups/backup-kmc.util.sor-2010-02-22-13-04-23-879.zip
    a config/cm/backups/backup-kmc.people.shared.cpr-2010-02-22-13-04-32-793.zip
    a config/cm/backups/backup-bc.rf.prjconfig-2010-02-22-13-04-42-322.zip
    a config/cm/backups/backup-sap.comcafkm.ep.repmanager-2010-02-22-13-04-51-426.zip
    a config/cm/backups/backup-bc.rf.repository.service.prjconfig-2010-02-22-13-05-00-577.zip
    1. We tried running the export with the sidadm user as well and followed Note 970518, but ended up with the same error.
    2. We are already using the latest Installation Master.
    3. Where should we check the OS logs?
    Regards
    Priyanka

  • ORA-01157: cannot identify/lock data file error in standby database.

    Hi,
    I have a primary database and a standby database (11.2.0.1.0) running in ASM with different disk group names. I applied an incremental backup on the standby database to resolve an archive-log gap, generated a standby control file on the primary database, and restored that control file on the standby database. But when I start the MRP process, it does not start, and the alert log throws ORA-01157: cannot identify/lock data file. When I query the standby database files, the names shown are the primary database datafile locations, not the standby ones.
    PRIMARY DATABASE
    SQL> select name from v$datafile;
    NAME
    +DATA/oradb/datafile/system.256.788911005
    +DATA/oradb/datafile/sysaux.257.788911005
    +DATA/oradb/datafile/undotbs1.258.788911005
    +DATA/oradb/datafile/users.259.788911005
    STANDBY DATABASE
    SQL> select name from v$datafile;
    NAME
    +STDBY/oradb/datafile/system.256.788911005
    +STDBY/oradb/datafile/sysaux.257.788911005
    +STDBY/oradb/datafile/undotbs1.258.788911005
    +STDBY/oradb/datafile/users.259.788911005
    The actual physical location of the standby database files in ASM on the standby server is shown below:
    ASMCMD> pwd
    +STDBY/11gdb/DATAFILE
    ASMCMD>
    ASMCMD> ls
    SYSAUX.259.805921967
    SYSTEM.258.805921881
    UNDOTBS1.260.805922023
    USERS.261.805922029
    ASMCMD>
    ASMCMD> pwd
    +STDBY/11gdb/DATAFILE
    I even tried to rename the datafiles in the standby database, but it throws this error:
    ERROR at line 1:
    ORA-01511: error in renaming log/data files
    ORA-01275: Operation RENAME is not allowed if standby file management is
    automatic.
    Regards,
    007

    Hi saurabh,
    I tried to rename the datafiles in the standby database after restoring it; it throws the error below:
    ERROR at line 1:
    ORA-01511: error in renaming log/data files
    ORA-01275: Operation RENAME is not allowed if standby file management is
    automatic.
    Also, in my pfile I have set the following parameters:
    *.db_create_file_dest='+STDBY'
    *.db_domain=''
    *.db_file_name_convert='+DATA','+STDBY'
    *.db_name='ORADB'
    *.db_unique_name='11GDB'
    Regards,
    007
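    ORA-01275 means renames are blocked while STANDBY_FILE_MANAGEMENT is set to AUTO. A commonly used sequence, sketched below on the assumption that managed recovery is stopped first and using the control-file and ASM names from the listings above (only the SYSTEM file is shown; the RENAME is repeated per datafile), is to switch to MANUAL, rename, and switch back:

    ```sql
    -- On the standby, mounted. Sketch only; verify names against V$DATAFILE
    -- and the ASMCMD listing before running.
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
    ALTER SYSTEM SET STANDBY_FILE_MANAGEMENT = MANUAL;
    ALTER DATABASE RENAME FILE
      '+STDBY/oradb/datafile/system.256.788911005'
      TO '+STDBY/11gdb/datafile/system.258.805921881';
    ALTER SYSTEM SET STANDBY_FILE_MANAGEMENT = AUTO;
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
    ```

    After the renames, the standby control file points at the files that actually exist under +STDBY/11gdb/DATAFILE, and MRP can be restarted.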

  • Basic Data Upload to MATERIAL  Using LSMW is not working

    HI All,
      We are using the LSMW program /SAPDMC/SAP_LSMW_IMPORT_TEXTS to upload the basic data text of the material. All steps execute correctly and show that the records are transferred correctly, but in MM03 the text does not appear.
    EPROC_PILOT - MASTER - TEXT_UPLOAD Basic long text 1line
    Field Mapping and Rule
            /SAPDMC/LTXTH                  Long Texts: Header
                Fields
                    OBJECT                       Texts: Application Object
                                        Rule :   Constant
                                        Code:    /SAPDMC/LTXTH-OBJECT = 'MATERIAL'.
                    NAME                         Name
                                        Source:  LONGTEXT-NAME (Name)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-NAME = LONGTEXT-NAME.
                    ID                           Text ID
                                        Source:  LONGTEXT-ID (Text ID)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-ID = LONGTEXT-ID.
                    SPRAS                        Language Key
                                        Source:  LONGTEXT-SPRAS (Language Key)
                                        Rule :   Transfer (MOVE)
                                        Code:    /SAPDMC/LTXTH-SPRAS = LONGTEXT-SPRAS.
                                                 * Caution: Source field is longer than target field
                /SAPDMC/LTXTL                  Long Texts: Row
                    Fields
                        TEXTFORMAT                   Tag column
                                            Rule :   Constant
                                            Code:    /SAPDMC/LTXTL-TEXTFORMAT = 'L'.
                        TEXTLINE                     Text Line
                                            Source:  LONGTEXT-TEXTLINE (Text Line)
                                            Rule :   Transfer (MOVE)
                                            Code:    /SAPDMC/LTXTL-TEXTLINE = LONGTEXT-TEXTLINE.
    And at the end it displays the following:
    LSM Workbench: Convert Data For EPROC_PILOT, MASTER, TEXT_UPLOAD
    2010/02/01 - 10:14:25
    File Read:          EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.read
    File Written:       EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
    Transactions Read:                    1
    Records Read:                         1
    Transactions Written:                 1
    Records Written:                      2
    Can anyone tell us what the problem could be?
    Regards
    Channappa Sajjanar

    Hi, thanks for your reply.
    I ran all the steps.
    When I run the program, it gives the following message:
    Legacy System Migration Workbench
    Project:                              EPROC_PILOT     eProcurement Pilot
    Subproject:                           MASTER          Master data Upload / Change
    Object:                               TEXT_UPLOAD     Basic long text 1line
    File :                                EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
    Long Texts in Total:                  1
    Successfully Transferred Long Texts:  1
    Non-Transferred Long Texts:           0

  • Asset Data Uploading

    Hi
    Guru's
    I want to upload the asset balances (data).
    I am confused about which data to take: should I take the current balance of the assets, or the acquisition value of the assets?
    Please also explain depreciation data upload, in detail and step by step.
    Example: an asset purchased on 25.02.2007 for Rs 55,000.
    Depreciation is 10%, so for 2 years it is 11,000.
    On 25.02.2009 I want to upload the data:
    55,000 - acquisition value of the asset
    11,000 - depreciation for 2 years
    44,000 - net book value of the asset on 25.02.2009
    In this case, which value do I have to take?
    If I take 55,000 (the acquisition value), is it possible to upload the depreciation amount, or is it necessary to run depreciation for the two years?
    Kindly give your valuable suggestions; I am waiting for your reply.
    With regards,
    Arun Kumar

    You upload the assets with AS91, and the balance values with ABF1 or OASV.
    When you search for these transaction codes on this forum, you will find more information:
    http://help.sap.com/saphelp_erp60_sp/helpdata/EN/4f/71e42b448011d189f00000e81ddfac/frameset.htm

  • Java.sql.SQLException: ORA-01157: cannot identify/lock data file 7 - see DB

    I am deploying my application components on an Oracle RAC database. When I install my app component, I run a script that creates a user and a tablespace on both RAC nodes (node1 and node2).
    As the database is clustered, the users are created on the two nodes and the tablespace is stored in a location shared by the two RAC nodes.
    I was able to run my script successfully on rac-node1 and install my component there, but when I started installing the app component on node2 I got this error:
    java.sql.SQLException: ORA-01157: cannot identify/lock data file 7 - see DBWR trace file
    ORA-01110: data file 7: '/db/db/db/ora10g/10.2.0/admin/dbadmin_01.dbf'
    ORA-06512: at "ADMIN.XL_SP_DBCHECK", line 48
    ORA-06512: at line 1
    "dbadmin_01.dbf" is the datafile of the custom tablespace I created.
    I am getting this error while installing the app component on rac-node2.

    You should not create database files outside of ASM.
    I'm not sure you can move the datafile. I would export the data from the tablespace (if there is data you need there), delete the tablespace, recreate it in ASM, and then import the data back.
    If you want to move the datafile, you can try using RMAN (from node1).
    First connect to the database with SQL*Plus and execute:
    alter tablespace <tbs_name> offline;
    Then start RMAN (rman target /) and run:
    BACKUP AS COPY DATAFILE '<file>' FORMAT '+<ASM_DG>';
    Then in SQL*Plus, execute:
    alter tablespace <tbs_name> online;
    I'm not sure it will work, and I hope I got the commands right because I can't check it right now.
    Good luck
    Liron Amitzi
    Senior DBA consultant
    [www.dbsnaps.com]
    [www.orbiumsoftware.com]
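    One caveat on the suggestion above: a copy made with BACKUP AS COPY does not by itself change where the control file points, so the database would still open the old file. A fuller sequence, sketched here with the same <tbs_name>, <file>, and <ASM_DG> placeholders as above, adds an RMAN SWITCH and a recovery of the tablespace:

    ```sql
    -- SQL*Plus: take the tablespace offline first.
    ALTER TABLESPACE <tbs_name> OFFLINE;
    -- RMAN (rman target /): copy the datafile into the ASM disk group,
    -- then repoint the control file at the copy.
    BACKUP AS COPY DATAFILE '<file>' FORMAT '+<ASM_DG>';
    SWITCH DATAFILE '<file>' TO COPY;
    -- SQL*Plus: recover the moved file and bring the tablespace back online.
    RECOVER TABLESPACE <tbs_name>;
    ALTER TABLESPACE <tbs_name> ONLINE;
    ```

    This is a sketch, not a tested procedure; check the RMAN documentation for your release before running it against a RAC database.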

  • How do I schedule regular daily/weekly/monthly/quarterly data uploads?

    Hello gurus!
    How do I schedule regular daily/weekly/monthly/quarterly data uploads?  How can I make it "automatic"?
    Thank you very much!
    Philips

    Hi,
    There are lots of documents available on how to design a process chain; what you build depends on your requirement.
    For example, suppose you load master data into BW from R/3 daily, followed by the transaction data. You create a chain containing a meta chain for the master data: drag in a Load InfoPackage step (with the variant set to the InfoPackage you created for that particular master data), do the same for texts and hierarchies (for hierarchies, use the save option after the hierarchy step), and then use an attribute change run step to activate the master data. Then create another meta chain that loads the transaction data. Normally an InfoPackage loads the data from the source system into an ODS, so use an InfoPackage step for that, then a step to activate the ODS data, and then another InfoPackage to send the data on to a cube. If aggregates are built on that cube, use a roll-up step to roll up the data.
    If you can give me your mail ID, I can send you some documents on process chains.
    Regards
    Srini

  • Data upload problem

    Hi all,
    I am facing the following problems:
    1. The delta upload does not finish automatically. I can find some entries in SM58. How can I solve this problem?
    2. When I run the delta upload, the system message is:
    Processing in Warehouse timed out; processing steps missing
    Diagnosis
    Processing the request in the BW system is taking a long time and the processing step "Second step in the update" has still not been executed.
    System response
    Caller is still missing.
    Procedure
    Check in the process overview in the BW system whether processes are running in the BW system under the background user.
    If this is not the case, check the short dump overview in the BW system.
    What is wrong with the data upload, and how can I solve it? Can you give me some advice?
    Regards & thanks!
    zagory

    Hi Zagory,
    I think that at the time this particular job starts, other jobs are running under the background user, or other background jobs are running with priority A. So try to reschedule the job to a later point in time when the system is less busy; one more option is to increase the system dialog runtime via transaction RZ10 (ask your Basis team, they will do it for you).
    But try the first option first.
    Assign points if it helps.
    Regards,
    Vijay.

  • Error when I try to lock data

    When I try to lock data for an entity for one year, one month, and one scenario, I get the following message:
    "Cannot complete this action because calculations, translations, or consolidations need to be performed"
    When I look in Process Control, however, everything is OK...

    If it had been a validation issue, the message would have read "validations need to be performed". Have you changed your metadata after uploading the balances into the system? If yes, were any journals posted with the deleted accounts? Does the consolidation run to completion?

  • Archiving Run - 2LIS_03_UM

    Hi All,
    I activated the DataSource 2LIS_03_UM in RSA5 and then in LBWE. I am running transaction OLIZBW to fill the setup tables.
    I enter the company code, the run name, and the run date (a future date). When I click Execute, the cursor jumps to "Archiving Run"; it looks like it wants me to enter an archiving run session.
    I tried entering various numbers like 1, 2, 3, etc., but it gives me the error message "File does not exist".
    Can someone who has faced this scenario guide me?
    Thanks

    Rekha
    As I said, the logistics DataSources are not active in my system, so I cannot check and answer exactly. I will try to access my other BW system and let you know.
    Thanks
    Sat

  • Unable to view locked data in Data Forms

    Hi,
    Does anyone know why we are not able to view locked data in data entry forms (the form just shows empty orange cells)? We added a column with the prior quarter's information to the data form. It is for information purposes only, not for entering data, so that users see in one single form what they entered in the prior quarter. Unfortunately, the cells do not show any data when the prior quarter is locked.
    Any idea?
    Thanks a lot!

    Unlock the data, run the updated business rules, and lock it again...

  • Setup table - Archiving run option

    Hi all,
    I want to check whether my inventory data had an archiving run while filling the setup tables. Also, in which table can I find the archiving run data? If the required R/3 data is in the archive tables, we do not need to think about backing up the data in BW.
    thanks

    Hey Raj,
    Your question is not clear.
    If you are talking about archived data:
    - Archived data will no longer be present in the source tables; you need a special mechanism to retrieve it.
    - If you extract the data to BW using the DataSource, you will not be able to extract the archived data. For this you need to write an exit in CMOD for that particular DataSource to additionally extract the archived data, for example where BSID-XARCH = 'X' or BSEG-XARCH = 'X'.
    One more way is to extract the archived data by creating a generic DataSource on top of the archive tables, selecting only those records which are archived (XARCH = 'X').
    Just perform a full update, as archived documents do not receive updated records. Creating a generic DataSource would be the better option to extract the archived data, instead of disturbing the existing data flow.
    Regards
    KP
    Edited by: prashanthk on Nov 17, 2011 11:58 AM
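    KP's second suggestion boils down to a selection like the one below. This is only an illustration: in practice the generic DataSource (transaction RSO2) would sit on a DDIC view or extractor rather than raw SQL, and the table and field names (BSID, XARCH) are taken from the reply above, not verified here:

    ```sql
    -- Illustration of the filter the generic DataSource would apply:
    -- pick up only the records flagged as archived.
    SELECT *
      FROM bsid
     WHERE xarch = 'X';
    ```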

  • Time Data Upload

    We are using positive time management here; all actual clock-in and clock-out times are recorded by the time recording machines installed at the factory gates.
    We want the daily time data to be uploaded to the SAP system in order to run time evaluation.
    Does SAP have a standard report/tool for the data upload from the time machines to SAP, or do we need to write an ABAP program to upload the data?
    Can you please send me the steps required for the integration? Is a standard format available for the upload, and do you have any documents on this?
    Please mail to [email protected]
    Thanks in Advance
    Peeyoosh

    Hi Peeyoosh,
    please have a look at following thread:
    How to upload time events?
    Regards
    Bernd

Maybe you are looking for

  • I open up my game and the monitor says "no signal"

    I open Arma 3, but after a few minutes of loading, or on the main screen, my monitor cuts out and says "no signal". It is really bugging me. It is not my monitor, because we have switched monitors and tried other things; the only things that could be

  • MSI GT70 onc

    Hello, I have an MSI GT70 that I got about 1 month ago. I noticed that the computer has a slow startup, so I purchased two OCZ Nocti Series 120 GB mSATA drives that I'm going to use in the Super RAID card. I took the back off and noticed my computer does not ha

  • Change Comment background color?

    Is it possible to change the background of a comment? We have multiple people editing and commenting on a document and having different color comments for each user would make things much easier...

  • Migrate Single Node 10g STreams to Multi Node RAC

    We are currently running 10g Streams: one capture process enqueues LCRs into one queue, a propagation job dequeues the messages and propagates them to a destination queue, and then our apply process at the destination database dequeues and applies the changes. We have de

  • Cisco ESW 520 in Cisco LMS

    Hi, I have a question: can CiscoWorks LMS manage Cisco ESW 520 switches? I can import one in RME, but it says "unknown device". Thanks, Ashley