Archive data using DART: job lock problem in table TSP01

Hi,
I am facing a problem while archiving from the production system to UNIX using DART.
We use transaction FTW1A to extract the data. Once the data has been extracted, we need to run a verification process through transaction FTWE1 (background job RTXWCHK4) and transaction FTWD (background job RTXWCHK2).
When I run transaction FTWD (background job RTXWCHK2) to verify, it holds an extensive lock on table TSP01 for a long period of time, which blocks other processes that use this table, so we have to terminate the job. The workaround for now is to run the job on weekends, but I would like a real solution.
Can anybody help me with this problem?
Regards,
Nupur S Jaipuriyar

Locking a row that does not exist can be difficult.
On most databases you can lock an entire table with "LOCK TABLE <table>", but that may be extreme. Alternatively, you could insert an empty row with the ID you want to lock; you would then hold a write lock on that row until you commit the transaction.
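
To illustrate the second suggestion, here is a minimal sketch in ABAP Open SQL. The lock table ZLOCK_SLOT and its key field LOCK_ID are hypothetical; the point is only that an INSERT holds a database row lock until the next COMMIT WORK or ROLLBACK WORK:

* Sketch only: ZLOCK_SLOT is an assumed custom table with key LOCK_ID.
DATA ls_slot TYPE zlock_slot.

ls_slot-lock_id = 'TSP01_MAINT'.    " the key we want to serialize on
INSERT zlock_slot FROM ls_slot.     " acquires a DB write lock on this row
IF sy-subrc = 0.
* ... critical section: work that must not run concurrently ...
  DELETE zlock_slot FROM ls_slot.   " remove the placeholder row again
  COMMIT WORK.                      " releases the row lock
ELSE.
* Row already exists: another session currently holds the lock.
ENDIF.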

Similar Messages

  • Report for archived data using an executable program

    Hi,
    In an existing Z report, data is fetched from tables like this:
    SELECT belnr bewtp budat lfbnr lfpos FROM ekbe INTO TABLE it_ekbe
      WHERE bewtp IN ('R', 'Q') "Only IRs
        AND budat IN s_date.
    The report fetches data like this from many tables, many times over.
    Now my requirement is this: the archiving consultants have archived old data. Some tables are archived, some are not.
    So I need to fetch the old data from the archive files as well. The archiving consultants gave me the details below.
    zarixmm3 ------------> archive index table for the MM-related tables after the data was archived.
    I searched Google and found some code to fetch data from archive files, shown below.
    1. I fetched the archive key number from the index table.
    SELECT archivekey FROM zarixmm3 INTO TABLE it_get
      WHERE budat IN s_date
        AND werks IN s_werks.
    CLEAR: wa_ekbe.
    LOOP AT it_get INTO wa_get.
    * The first 6 characters of the archive key identify the archive file
      wa_arcindx1a-archivekey = wa_get-archivekey.
      wa_arcindx1a-v_arcdoc   = wa_get-archivekey+0(6).
      APPEND wa_arcindx1a TO it_arcindx1a.
      CLEAR: wa_get, wa_arcindx1a.
    ENDLOOP.
    * Keep only one entry per archive file
    SORT it_arcindx1a BY v_arcdoc.
    REFRESH it_get.
    CLEAR: v_arcdoc.
    DELETE ADJACENT DUPLICATES FROM it_arcindx1a COMPARING v_arcdoc.
    2. I passed the keys to the function modules.
    LOOP AT it_arcindx1a INTO wa_arcindx1a.
      v_key = wa_arcindx1a-archivekey+0(6).
      CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
        EXPORTING
          archive_document = v_key
    *     archive_name     = wa_arcindx1-archivekey
          object           = 'MM_EKKO'
        IMPORTING
          archive_handle   = lv_handle
        EXCEPTIONS
          OTHERS           = 1.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE 'I' NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        EXIT.
      ENDIF.
      DO.
        CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
          EXPORTING
            archive_handle = lv_handle
          EXCEPTIONS
            end_of_file    = 1.          "only the exceptions we
        IF sy-subrc <> 0.                "really want to react to
          EXIT.
        ENDIF.
        CALL FUNCTION 'ARCHIVE_GET_TABLE'
          EXPORTING
            archive_handle        = lv_handle
            record_structure      = 'EKBE'
            all_records_of_object = 'X'
          TABLES
            table                 = git_ekbe_temp
          EXCEPTIONS
            end_of_object         = 0.
    *   Parentheses required: AND binds more tightly than OR
        LOOP AT git_ekbe_temp ASSIGNING <ls_ekbe>
                               WHERE ( bewtp = 'R' OR bewtp = 'Q' )
                                 AND budat IN s_date.
          MOVE-CORRESPONDING <ls_ekbe> TO wa_ekbe.
          APPEND wa_ekbe TO it_ekbe.
          MOVE-CORRESPONDING <ls_ekbe> TO wa_ekbee.
          APPEND wa_ekbee TO it_ekbee.
          CLEAR wa_ekbe.
          CLEAR wa_ekbee.
        ENDLOOP.
      ENDDO.
      CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
        EXPORTING
          archive_handle = lv_handle.
      CLEAR wa_arcindx1a.
    ENDLOOP.
    I did the same for all the tables used in the Z report.
    Please tell me whether I am going in the right direction. The above logic takes a long time because it reads all the data in each archive file.
    If there is better logic, please share it so I can improve the performance of the Z report.
    Regards,
    Maruthi S
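
    For reference, here is a sketch of one way to avoid scanning entire archive files, modelled on the billing-document example elsewhere on this page. It assumes that zarixmm3, like other archive index tables (e.g. zarixsd1), also has an archiveofs column, and that it_get/wa_get are extended accordingly; with key and offset you can position directly on each archived object via ARCHIVE_READ_OBJECT instead of looping with ARCHIVE_GET_NEXT_OBJECT:

    * Sketch only: assumes ZARIXMM3 also stores the offset (ARCHIVEOFS).
    SELECT archivekey archiveofs FROM zarixmm3
           INTO CORRESPONDING FIELDS OF TABLE it_get
           WHERE budat IN s_date
             AND werks IN s_werks.
    LOOP AT it_get INTO wa_get.
    * Position directly on the archived object; no full file scan
      CALL FUNCTION 'ARCHIVE_READ_OBJECT'
        EXPORTING
          object         = 'MM_EKKO'
          archivkey      = wa_get-archivekey
          offset         = wa_get-archiveofs
        IMPORTING
          archive_handle = lv_handle
        EXCEPTIONS
          OTHERS         = 1.
      CHECK sy-subrc = 0.
      CALL FUNCTION 'ARCHIVE_GET_TABLE'
        EXPORTING
          archive_handle        = lv_handle
          record_structure      = 'EKBE'
          all_records_of_object = 'X'
        TABLES
          table                 = git_ekbe_temp
        EXCEPTIONS
          end_of_object         = 0.
    * Filter git_ekbe_temp into it_ekbe as in the original logic
    ENDLOOP.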

    Hi Friends,
    I am waiting for a solution to the above.
    Regards,
    Maruthi. S

  • Setting up archive delete functionality using a background job

    Hi,
    Through transaction SARA we can set up an archive delete job by selecting the files and the job timing details. Instead of this, we want to set up the same functionality as a background job according to our variant and timing requirements.
    Is there any option to set up a background job for the archive delete functionality, selecting the files through a program or function module? Please let me know if anyone has done this before.
    Thanks,
    Siva

    Hello,
    Take a look at OSS note 205585, which describes how to schedule archive delete jobs indirectly using program RSARCHD.
    Hope this helps.
    Best Regards,
    Karin Tillotson
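
    For reference, a minimal sketch of scheduling such a run programmatically via the standard background-processing function modules JOB_OPEN and JOB_CLOSE; the RSARCHD variant name 'ZDELETE01' is a placeholder you would define yourself:

    DATA: lv_jobname  TYPE btcjob VALUE 'ARCHIVE_DELETE',
          lv_jobcount TYPE btcjobcnt.

    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = lv_jobname
      IMPORTING
        jobcount = lv_jobcount.

    * RSARCHD then schedules the delete jobs for the archive files
    * selected by the (placeholder) variant.
    SUBMIT rsarchd USING SELECTION-SET 'ZDELETE01'
           VIA JOB lv_jobname NUMBER lv_jobcount
           AND RETURN.

    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobname   = lv_jobname
        jobcount  = lv_jobcount
        strtimmed = 'X'.   "start the job immediately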

  • Regarding REFRESHING of data in the data warehouse using the DAC incremental approach

    My client is planning to move from Discoverer to OBIA, but before that we need some answers.
    1) My client needs the data to be refreshed every hour (incremental load using DAC) because they use a lot of real-time data.
    We don't have much updated data (e.g., 10 invoices per hour plus some others). How much time does it usually take to refresh those tables in the data warehouse using DAC?
    2) While a table is being refreshed, can we use that table to generate a report? If yes, what is the state of the data: stale or incorrect (undefined)?
    3) How does the refresh of Financial Analytics work? Is it one module at a time, or does it treat all three modules (GL, AR and AP) as a single unit of refresh?
    I would really appreciate answers to all of these questions.
    Thank You,

    Here are your answers:
    1) It shouldn't be much of a problem for such a small amount of data. It all depends on your execution plan in DAC, which can always be created anew and customized to load data only for those tables (star schema). Approx. 15-20 minutes, as it does many things apart from loading the tables.
    2) Reports in OBIEE will show the previous data, as the cache will (or should) be turned on. You will get the new data in reports after the refresh is complete and the cache is cleared using one of various methods (event polling preferred).
    3) Again, for Financial Analytics or any other module, you will have out-of-the-box execution plans, but you can create and execute your own. GL, AR and AP are also provided separately.
    Hope this answers your questions. You will learn more by going through the Oracle docs, particularly those for DAC.

  • Archive data

    Hi,
    I am new to the Oracle database, and I need to archive data that is older than 60 days and delete that data from the tables. The tables have referential integrity constraints. The archived data needs to be restorable to the tables when necessary. I have read the previous messages posted by others and learned that there are at least three ways to do it.
    Firstly, I could use Oracle Partitioning, but this is only available in Enterprise Edition. The version I will be using is Standard Edition One, so I can't use the partitioning method.
    Secondly, I can use the Oracle Export utility with the query option to export the data, and delete the data from the tables after the export.
    Thirdly, I can create the same set of tables (historical tables) and write scripts to copy the data from the current tables to the historical tables, then delete the data from the current tables.
    In my opinion, the second method is more like true "archiving", as Export writes the data into a file which can be stored on some sort of storage device.
    The third method simply stores the historical data in the database; I would still need to back up those historical tables in case the database crashes.
    If you know any other methods or any improvements to the above methods, please let me know. Thanks.

    You've got all the options covered.
    The devil is in the details: knowing the schema well enough to be able to
    write the correct queries for the export, or to copy to the archive tables.
    Note: those archive tables can be moved to another database, which you
    can make available to users if they need to query it for "historical" data,
    provided all parent-child / master-detail data relationships are maintained.

  • Reading archived data of different archive objects in Z programs

    Hi,
    I have Z programs which get data from various tables. Now I need to read the archived data of those tables in my Z programs as well, i.e. I need to modify those programs so that data archived from those tables is also extracted. For example, if a program fetches data from the MARA, EKKO, LIKP and KONP tables based on certain selection criteria, how do I fetch the archived data for the same criteria, given that each of these tables may belong to a different archiving object? Please help me. Thanks in advance.

    Hi Naganath,
    Using transaction SARA you can identify the archiving objects for the tables you want to use.
    For the material master you can use object MM_MATNR, for purchasing documents object MM_EKKO, for conditions SD_COND, and for deliveries object RV_LIKP.
    Retrieve the archive key and offset from table ZARIXMM5 (for purchasing documents) for your select-options, pass the key and offset to function ARCHIVE_READ_OBJECT to get the handle, and then pass the handle to function ARCHIVE_GET_TABLE to read the records. Have a look at the code below:
    select archivekey archiveofs into corresponding fields of table lt_arch_keys
                   from zarixsd1 "SD Index table for billing documents
                  where vbeln in s_docrng
                    and kunrg in s_arc_kn
                    and fkdat in s_arc_dt
                    and fkart in s_arc_rt
                    and vkorg in s_arc_og.
      loop at lt_arch_keys assigning <fs_lt_arch_keys>.
    *Read information from archive
        call function 'ARCHIVE_READ_OBJECT'
          exporting
            object         = 'SD_VBRK'
            archivkey      = <fs_lt_arch_keys>-archivekey
            offset         = <fs_lt_arch_keys>-archiveofs
          importing
            archive_handle = ls_handle
          exceptions
            others         = 1.
    *Get the requested header detail data for billing document
        call function 'ARCHIVE_GET_TABLE'
          exporting
            archive_handle          = ls_handle
            record_structure        = 'VBRK'
            all_records_of_object   = 'X'
            automatic_conversion    = 'X'
          tables
            table                   = lt_vbrk "Header data
          exceptions
            end_of_object           = 1
            internal_error          = 2
            wrong_access_to_archive = 3
            others                  = 4.
    *Get the requested line item detail data for billing document
        call function 'ARCHIVE_GET_TABLE'
          exporting
            archive_handle          = ls_handle
            record_structure        = 'VBRP'
            all_records_of_object   = 'X'
            automatic_conversion    = 'X'
          tables
            table                   = lt_vbrp "Line item data
          exceptions
            end_of_object           = 1
            internal_error          = 2
            wrong_access_to_archive = 3
            others                  = 4.
        clear: ls_vbrk.
        loop at lt_vbrp assigning <fs_lt_vbrp>.
          read table lt_vbrk into ls_vbrk with key vbeln = <fs_lt_vbrp>-vbeln.
          move-corresponding <fs_lt_vbrp> to lt_archive_records.
          move-corresponding ls_vbrk to lt_archive_records.
          append lt_archive_records to gt_archive_records.
        endloop.
      endloop.
    Do the same for your material master and condition records as well.
    Regards,
    Isaac Prince

  • Does the MM_EKKO archive write job lock the tables?

    Archive experts,
    We run MM_EKKO archiving twice a week. From my understanding, the write job just reads the data and writes it to archive files. But we run replenishment jobs which hit the EKPO table, and those jobs run slowly; I noticed that the archive write job was holding locks on this table. As soon as I cancelled the write job, the replenishment jobs moved faster. Why does this happen? Archive write jobs should not cause any performance issues; only the delete jobs should impact performance. Am I correct? Is anyone experiencing similar issues?
    Sam

    Hi Sam,
    Interesting question! Your understanding is correct: a write job normally just reads the data from the tables and writes it into archive files. But the write job of MM_EKKO (and MM_EBAN) is a bit different: it also takes care of setting the deletion indicator (depending on whether it is one-step or two-step archiving). So it is possible that it holds locks while setting the deletion indicator, as that is a change to the database.
    Please have a look at the following link for an explanation of one-step and two-step archiving:
    http://help.sap.com/saphelp_47x200/helpdata/en/9b/c0963457889b37e10000009b38f83b/frameset.htm
    Hope this explains the reason for the performance problem you are facing.
    Regards,
    Naveen

  • TS3579: I found this useful because I did not know about the effect of typing in data and that you could only drag to rearrange the data. I had typed in data before and this had caused problems, but restoring defaults did not cause correct dates to show up

    I found this (TS3579: If the wrong date or time is displayed in some apps on your Mac) useful because I did not know about the effect of typing in data and that you could only drag to rearrange the data. I had typed in data before and this had caused problems, but restoring defaults did not cause correct dates to show up in Finder.

    It sounds like there are a couple of things going on here. First check whether you have a successful install of SQL Server; then we'll figure out the connection issues.
    Launch SQL Server Configuration Manager and check for SQL Server (MSSQLSERVER) if it is a default instance, or SQL Server (other name) if you've configured yours as a named instance. Once you find it, make sure the service is started.
    If it is not started, try to start it and see if it throws an error. If you get an error, post the error message you're hitting. If the service starts, you can then launch SSMS and try to connect. If you have a default instance, you can use the machine name in the connection dialog. Ex: "COWBOYS", where Cowboys is the machine name. However, if you named the SQL Server instance during install, you'll need to connect using the machine\instance format. Ex: COWBOYS\Romo (where Romo is the instance name you set during install).
    You can also look at the summary.txt file in the SQL Server setup error logs to see what happened on the most recent install. Past install history is archived in the log folder if you need to dig it up, but the most recent log may help get to the bottom of it if there is an issue with setup detecting a prior instance that needs to be repaired.
    Thanks,
    Sam Lester (MSFT)
    http://blogs.msdn.com/b/samlester
    This posting is provided "AS IS" with no warranties, and confers no rights. Please remember to click "Mark as Answer" and "Vote as Helpful" on posts that help you. This can be beneficial to other community members reading the thread.

  • Lock Problem while Broadcasting multiple reports using process chains

    Hi All,
    I am trying to broadcast 10 reports using process chains, via the program RSRD_START and variants created in the Broadcaster. The program jobs start fine, but I am facing an error: the RSRA_CA_LOG table gets locked while the reports are being broadcast. Can anyone help me on how to run the jobs without the locking problem? Suggestions please.
    Thanks,
    Mike.

    Hi,
    Thanks for the inputs and the SAP Notes.
    I am trying to broadcast the reports from the BEx Analyzer and getting this lock error; it is the reports, not the workbooks.
    I am also trying to broadcast the workbooks, but could not find the options to broadcast them. We are on SAP BW 3.5 SP 12. Is it possible to broadcast workbooks at this patch level, or do we need to upgrade to achieve this? If so, can anyone provide me the procedure to broadcast the workbooks?
    Thanks,
    Mike.

  • Problem in Retrieval of archived data

    I archived data from a DSO successfully, but after some time the activation of new data records on that DSO failed, showing the error '176 records are locked by the archiving object'. To remove this activation error I deleted the archiving object of that DSO, and the issue was resolved.
    But now, when I want to retrieve that archived data into a copy of that DSO, the DTP is not able to fetch any archived data records and finishes with green status.
    I also used transaction SARA to reload data into the DSO. I entered the archiving object name, but when I press the Archive Selection tab it shows nothing to select.
    Kindly help me with this issue.

    Hello,
    Correct me if I am wrong in my understanding: you want to retrieve the archived data, but when running the read program you are not able to display data from the archive file.
    - Run the read program in batch mode, not in dialog mode.
    Reload will load the data back into your database tables. If with the reload option you are not getting an archive session to select, then check your archive session status: if the archive session is incomplete, it will not appear.
    -Thanks,
    Ajay

  • The data form formname is currently in use and is locked by user username

    We run Planning 9.3.1 on WebLogic 9.1, and sometimes our administrators find our forms locked, so they are not able to edit them.
    The following message appears: "The data form <formname> is currently in use and is locked by user <username>".
    Of course it always happens with users who are not available themselves to unlock the form for others. ;-)
    Only a restart of the Planning Windows service unlocks this; in our case the form was not unlocked even after 2 hours.
    I found the following passage via Google. It suggests that there is a certain (time) property which unlocks the form after a given period.
    Does anyone know where to set this property, or how to solve this otherwise?
    Only one person at a time can use the Add a Row function on a particular form. When you have the form open, everyone else is locked out. If you close the form (using the X) instead of cancelling it, it will remain locked by you for the next two hours, even though it's not on your screen anymore. If this happens, you will be the only one who can reopen the form. You can unlock the form by reopening it and then clicking the Cancel button to exit properly.

    No solution found. A restart of the services did the job. Hopefully this will not be necessary again.

  • Locking problem in BPS solution not getting solved using RSPLSE. SEM_BPS_S

    Can anyone tell me exactly what RSPLSE offers to solve locking in BPS?
    We have made the correct selection of characteristics (in our case the "cost center" InfoObject) in the "Lock characteristics" tab, but still, when two people try to write data to the same transactional cube against different cost centers, the system throws the error message "Cube zIC_ccp is locked by username". (Or, if a single person uses two packages simultaneously in two web sessions, he gets the same error.)
    Our assessment is that he should not be facing this locking problem, because the cost centers here act as a "key" to the different selections as configured in RSPLSE.
    Can anyone tell me how to use the parameter SEM_BPS_SAVE_UNLOCK? I want to know the transaction code and other details needed to execute SEM_BPS_SAVE_UNLOCK.

    Hi,
    Please check OSS note 635244.
    From the OSS note:
    Notes on SET/GET parameters SEM_BPS_NO_LOCK, SEM_BPS_SAVE_UNLOCK:
    These two parameters have nothing to do with the problem described above. They are only designed to facilitate the Customizing process, if users carry out Customizing in transaction BPS0 within a project.
    SEM_BPS_NO_LOCK: This parameter has the effect that transaction data is not locked at all. Never use this parameter in a production system. As of Support Package 14 for Release 3.1B, parameter SEM_BPS_NO_LOCK is released via an additional switch in table upc_dark2: you have to maintain a record with param = ENABLE_NO_LOCK and value = X in table upc_dark2. Only then does the system consider parameter SEM_BPS_NO_LOCK. The parameter can be useful if you want to test functions or manual planning with 'test data' in Customizing.
    SEM_BPS_SAVE_UNLOCK: This parameter only works in transaction BPS0; it has the effect that the system unlocks data (if possible) after it has been saved. The system simply simulates an exit and re-entry into transaction BPS0 with the last active detail application. This parameter can therefore be helpful if many people work on Customizing in transaction BPS0, because objects can be unlocked earlier.
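
    To the original question of how to execute SEM_BPS_SAVE_UNLOCK: since it is an ordinary SET/GET parameter, it is normally maintained per user on the Parameters tab of transaction SU3 (or SU01) with value X. For the current session it can also be set programmatically; a minimal sketch, assuming value 'X' activates the behaviour:

    * Sketch: set the SET/GET parameter for the current session only
    SET PARAMETER ID 'SEM_BPS_SAVE_UNLOCK' FIELD 'X'.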
    Regards,
    Siva.

  • I was just copying my data from a standby phone using phone-to-phone data transfer, but my phone also got locked with the iCloud login of the standby phone. What do I do?

    I was just copying my data from a standby phone using phone-to-phone data transfer, but the phone also got locked with the iCloud login of the standby phone. What should I do?

    If it's a hardware problem, then the phone will need to be replaced.
    There is no magic that can fix a hardware problem.

  • DART for archived data

    Hello Sap experts
    We have a requirement where the business wants us to execute the DART process for archived data. We have opened the archived files for this, but we have a doubt regarding the use of transaction FTWB, because I believe this transaction is also used when we need to run DART for archived data; however, we are not sure when and how this transaction is used.
    Kindly suggest.
    Thanks & Regards
    Deepak Garg

    Hi
    Follow this link:
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/CAGTFDART/CAGTFDART.pdf
    Regards.

  • I bought a new iPhone 5 3 days back; after that it asked for the new update, 6.1.4. After updating my iPhone I am not able to use cellular data services. I called up the data provider; they say it is a problem with the new software update. There is no option add

    I bought a new iPhone 5 three days back; after that it asked for the new update, 6.1.4. After updating my iPhone I am not able to use cellular data services. I called up the data provider; they say it is a problem with the new software update. There is no option to add an APN. Now when I switch to Safari it says I am not subscribed to cellular data, but I am able to use data on another phone.
    Will you please help me in this regard?
    Another issue: since I bought my new iPhone there has been dust inside the back main camera.
    Your advice is highly appreciated.

    Hey Shaiju isac,
    I'd take a look at the following article; it'll guide you through steps to troubleshoot cellular data issues on your iPhone:
    iPhone: Troubleshooting a cellular data connection
    http://support.apple.com/kb/ts3780
    Cheers,
    David
