Losing records when updating

I am experiencing a problem with a record occasionally being deleted. From
what I can tell, about 1 out of every 150 to 200 records is deleted when I
update.
To explain the process: I insert a record with a "sales order" number and
then go to a form screen that is completed and submitted. On submit it goes
to the action page, which I have attached. Could someone read through it and
see whether anything in the code might be causing the problem? Or is it on
the DB table end?
Just a note: it was happening more often until I switched from a SQL
statement to CFUPDATE.

The people using the system are also keeping a paper copy of the
information (too long to explain). When it's time to invoice, we do a quick
reference check against the papers, and we found records that were never
entered. I then set up a trigger on DELETE to see what might be happening.
Yesterday I was told of a record that was not saved. When I looked into the
audit table, I could see the record with the sales order number and a status
of DELETED.

Similar Messages

  • Losing applications when updating firmware.

    Can anyone help me out? I'm about to update the firmware in my N95 to v21 but don't want to lose the apps I've installed, like RotateMe etc. Stupidly, I didn't save a copy of them on my PC. If I store these apps on my memory card before updating, will they be safe, or will I have to go through the rigmarole of signing them again?
    Any help would be most appreciated.
    Many thanks

    If the apps have been installed on a memory card, and the application developer has done their job properly, they won't be lost, but will be automatically reinstalled.
    If the application developer didn't do a good job, the automatic reinstallation will not work.
    If the application was installed in phone memory, it will be gone and you have to reinstall it from the original .sisx file (or .jar file, if it was a Java app).
    If the application developer did a really good job, then the "Memory" app or "PC Suite" backup/restore will also work for the application. In this space most developers don't do a good job and leave things at the default state, which means that the backup will not back up applications at all.
    I always save/keep the original .sisx files around. Safest that way.

  • How to only update existing records when loading master data ?

    Hello experts, I need your lights one more time.
    Here is my need :
    I have created an infoobject (IO) which is a very simple version of 0material, let's call it Znewmat --> Znewmat has material type and trademark as attributes, those two fields are available in 2 different datasources :
    - 0MATERIAL_ATTR for material type (field MTART)
    - 0MAT_SALES_ATTR for trademark (field MVGR2)
    When loading my new IO from 0MATERIAL_ATTR I use a filter (at DTP level) to get only a few material types (I get something like 1,000 records).
    Here is my issue: when I load from 0MAT_SALES_ATTR, the field "material type" is not available to apply the same filter as for 0MATERIAL_ATTR. Existing records are updated with the trademark, but I also get 5,000 records I don't need, and my master data is "polluted" with useless lines.
    *And my question: is there a way, while performing the second load, to ONLY UPDATE EXISTING RECORDS AND NOT ADD ANY NEW RECORDS? (I didn't find anything in the main options of my DTP.)*
    (I'd like to avoid the solution of updating the 0MAT_SALES_ATTR datasource to add the missing field.)
    Thanks in advance for any help, points will be distributed.
    Guillaume P.
    Still no idea ?

    In the start routine of the transformation from 0MAT_SALES_ATTR to ZNEWMAT, do the following:
    SELECT material FROM /bic/pznewmat INTO TABLE i_mat
      FOR ALL ENTRIES IN source_package
      WHERE material EQ source_package-material.
    LOOP AT source_package.
      p_ind = sy-tabix.
      READ TABLE i_mat WITH KEY material = source_package-material.
      IF sy-subrc NE 0.
        DELETE source_package INDEX p_ind. " drop rows not previously loaded
      ENDIF.
    ENDLOOP.
    This way you'll only update records that have previously been loaded by the 0MATERIAL_ATTR DS.
    loading sequence:
    first load ZNEWMAT from 0MATERIAL_ATTR. then activate ZNEWMAT. then load 0MAT_SALES_ATTR to ZNEWMAT.
    M.

  • Dynamic update of cursor records when table gets updated

    Hi,
    I am having a table with 4 columns as mentioned below
    For a particular prod, a value less than 5 should be rounded up to 5 and a value greater than 5 should be rounded up to 10. The rounded surplus should then be adjusted against the following rows of the same prod, in order of rank; otherwise leave the value as is.
    Table1
    Col1     prod     value1     rank
    1     A     2     1          
    2     A     6     2
    3     A     5     3
    4     B     6     1
    5     B     3     2
    6     B     7     3
    7     C     4     1
    8     C     2     2
    9     C     1     3
    10     C     7     4
    Output
    Col1     prod     value1     rank
    1     A     5     1          
    2     A     5     2
    3     A     3     3
    4     B     10     1
    5     B     0     2
    6     B     6     3
    7     C     5     1
    8     C     5     2
    9     C     0     3
    10     C     4     4
    I have fetched all the records into a cursor. After rounding the value for rank 1 and adjusting the values of the following ranks, I try to round the value for rank 2 the same way, but it does not take the recently updated value (i.e. the value adjusted while rounding rank 1).
    This is because the cursor still holds the old value.
    Is there any way to handle scenarios where cursor records get dynamically updated when a table record is updated?
    Any help really appreciated.
    Thanks in Advance

    Hi,
    Below is the scenario. Which I am looking for.
    ITEM_ID(A)
    ITEM_ID Value Date
    A          3     D1     
    A          5     D2
    A          3     D3     
    A          5     D4
    A          3     D5     
    A          5     D6
    Rounding for item A has to be done for the rows up to D2, the rounding value is x, and the value adjustment is to be done from the very next row.
    --For record D1, rounding is done and the value adjustment is made from D2 onwards until the adjustment value is 0.
    --For record D2 (the updated value from the D1 rounding has to be taken), the adjustment is made from the very next row, D3, onwards until the adjustment value is 0.
    --From row D3 onwards, no rounding has to be done.
    ITEM_ID(B)
    B          7     D1     
    B          8     D2
    B          9     D3     
    B          5     D4
    B          4     D5     
    B          3     D6
    Rounding for item B has to be done for the rows up to D3, the rounding value is y, and the value adjustment is to be done from the very next row.
    --For record D1, rounding is done and the value adjustment is made from D2 onwards until the adjustment value is 0.
    --For record D2 (the updated value from the D1 rounding has to be taken), the adjustment is made from the very next row, D3, onwards until the adjustment value is 0.
    --For record D3 (the updated value from the D2 rounding has to be taken), the adjustment is made from the very next row, D4, onwards until the adjustment value is 0.
    --From row D4 onwards, no rounding has to be done.
    Thanks in Advance
    Edited by: unique on Apr 16, 2010 11:20 PM
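    One way around the stale-cursor problem is to avoid re-reading the table mid-process: fetch the rows into a collection, do the rounding and adjustments entirely in memory, and write the results back in a single pass. Below is a minimal Java sketch, assuming the rounding rule implied by the example above (an adjusted value up to 5 rounds to 5, a larger value rounds to 10, only the first k ranks are rounded, a value already driven to 0 is skipped, and each surplus is absorbed by the following rows in rank order); the class and method names are made up for illustration:

    ```java
    import java.util.Arrays;

    public class RoundAdjust {
        // Round the first k values of one product (in rank order) and absorb
        // the surplus from the following rows. Rules assumed from the example:
        //   adjusted value <= 5 rounds up to 5, > 5 rounds up to 10;
        //   a value that has been driven to 0 is left at 0.
        static int[] roundAndAdjust(int[] values, int k) {
            int[] v = values.clone();
            for (int i = 0; i < k && i < v.length; i++) {
                if (v[i] <= 0) continue;            // nothing left to round
                int target = (v[i] <= 5) ? 5 : 10;  // assumed rounding rule
                int surplus = target - v[i];
                v[i] = target;
                // take the surplus from the following rows, in rank order
                for (int j = i + 1; j < v.length && surplus > 0; j++) {
                    int take = Math.min(surplus, v[j]);
                    v[j] -= take;
                    surplus -= take;
                }
            }
            return v;
        }

        public static void main(String[] args) {
            System.out.println(Arrays.toString(roundAndAdjust(new int[]{2, 6, 5}, 2)));   // [5, 5, 3]
            System.out.println(Arrays.toString(roundAndAdjust(new int[]{6, 3, 7}, 2)));   // [10, 0, 6]
            System.out.println(Arrays.toString(roundAndAdjust(new int[]{4, 2, 1, 7}, 2))); // [5, 5, 0, 4]
        }
    }
    ```

    Run against the sample data, this reproduces the expected output for products A, B and C. In PL/SQL the same idea would typically use BULK COLLECT into a collection, then a single FORALL update, so no row is ever re-read at its stale pre-adjustment value.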

  • When updating to target only few records got added to the target

    When updating to the target, only a few records got added to the target from the transferred records ... what could be the reason?
    Edited by: MohanDP on Jun 28, 2010 7:44 PM

    Hi Mohan,
    When updating to the target, only a few records got added from the transferred records ... what could be the reason?
    There might be a routine in your transformation which is filtering out a few records, or:
    It is possible that you have multiple records for one key field in the PSA. When you load the same data to a data target such as a cube, the key figures will be added up.
    For example, say you have profit center US00000593 in the PSA in two rows, with key figure (say some ABC) values 100 and -50.
    When you load the data to the data target, you will have only one row in it, with profit center US00000593 and key figure ABC = 50 (i.e. 100 - 50 = 50). So most of the time the added records will be fewer than the transferred records.
    Hope it is clear!
    Regards,
    Pavan
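    The additive update Pavan describes can be sketched as a key-based merge. This is an illustrative Java sketch only; the class name and the flat (key, value) row model are assumptions, not BW structures:

    ```java
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class CubeAggregation {
        // Cube-style additive update: rows with the same characteristic key
        // are summed into a single record, so "added" records can be fewer
        // than "transferred" records.
        static Map<String, Integer> aggregate(String[][] rows) {
            Map<String, Integer> target = new LinkedHashMap<>();
            for (String[] row : rows) {
                // sum the key figure for rows sharing the same key
                target.merge(row[0], Integer.parseInt(row[1]), Integer::sum);
            }
            return target;
        }

        public static void main(String[] args) {
            String[][] psaRows = {
                {"US00000593", "100"},   // two PSA rows for the same profit center
                {"US00000593", "-50"},
            };
            // transferred: 2 rows; added: 1 row with value 100 + (-50) = 50
            System.out.println(aggregate(psaRows));  // {US00000593=50}
        }
    }
    ```

    With two transferred rows collapsing into one added row, the "added less than transferred" effect falls out of the summation.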

  • FRM-40501 ORACLE error: unable to reserve record for update or delete when

    Hello.
    I have two tab pages and one item on each page. Second tab page item, is mirror item of the first one. I use synchronize property on the mirrored one.
    When i try to update mirrored one i get that error: FRM-40501 ORACLE error: unable to reserve record for update or delete when.
    How can i solve that?
    Thanks

    Hi Dejan,
    The error you are getting means that the record cannot be locked. This is usually caused when you had locked the record from somewhere else (you or another user), or when Forms cannot find a corresponding column in the base table to lock the record. You are probably facing the second situation. I'm not sure that Forms can commit a change to an item that is synchronized, because synchronized items ignore their base-table attribute and just copy their value from the other item.
    Why don't you try the Copy Value from Item property, using <block_name>.<item_name> syntax? Your item will have a value copied from the other item, but you will have no problem with the DB transactions, I suppose.
    Hope this helps,
    teo

  • Nervous about losing personal data when updating Xperia X10 to 3.0

    I want to download 3.0.1.6.0.75 to my Xperia X10, which currently has 2.1.1.A.0.6, but I am worried about losing my personal data. I have already used Backup and Restore to create a backup of my data, but when I go to PC Companion on my home computer, it tells me:
    "Personal data such as contacts, messages... saved in the phone memory will be overwritten..."
    Could you please give me ALL the steps I need to follow to ensure I do not lose my personal data when updating to 3.0? The last time I updated my phone, about 5 months ago, there were good step-by-step instructions to follow, but now I cannot find them on your website.
    thank you,
    SP

    Don't forget to check the other Apps to backup your phone
    Contacts
    just to play it safe sync your contacts w/ google if you haven't already
    open your phonebook
    press MENU button -> tap on Send Contacts -> select all -> send it to your email
    on your computer
    -> download the file -> go to your gmail -> click on contacts -> click on more actions -> on the drop down menu click on "import" -> choose the file -> and click import
    Apps
    Don't worry so much about backing up your apps
    on your computer go to
    market.android.com -> log in -> click on My Library -> and send all your apps to your phone
    SMS/MMS
    if you don't want to use the backup & restore app then
    Install "backup SMS" to back up your SMS and MMS to your Gmail
    https://market.android.com/details?id=com.zegoggles.smssync&feature=search_result
    or for your SMS use
    "backup & restore SMS"
    https://market.android.com/details?id=com.riteshsahu.SMSBackupRestore&feature=search_result
    or for your MMS use "Save MMS"
    https://market.android.com/details?id=com.schwimmer.android.mmsextract&feature=search_result
    Call Logs
    And for call logs this is great
    https://market.android.com/details?id=com.yang.android.ansta

  • Update info record when changing units of measure in material master

    Hello,
    is there any possibility for an automatic change of info record when changing conversion factors in the material master ?
    We have 3 units of measure in our material master: the base unit and 2 alternative units. The first alternative unit is the order unit and the second alternative unit is the order price unit. So in our purchase info records we maintain an order unit, which differs from the base unit and we also maintain an order price unit, which differs from the base unit and the order unit.
    If we now change the conversion factors between the units in the material master, we would like to get an automatic change of the conversions in the info record (order unit and order price unit) that have the same previous conversions.
    For IS R it seems to be implemented (see OSS-Note 975954).
    Is there any solution for a "normal" industry system ?
    Thanks for answering
    Edited by: Britta Heinsen on Jul 23, 2009 1:52 PM

    I don't think such an auto-update is available. In fact, I only learned that such an update exists in IS-R after reading your query.
    Obviously, you can always create a workflow-based enhancement that synchronizes info records with material master changes, or an enhancement that regularly reads the material master change history for such changes and triggers a synchronization by executing the info record change automatically.

  • Losing unprocessed records when blocking queue entries are serialized

    Hi,
    I am developing a batch framework using the Java 1.5 thread APIs. We have a requirement to save the state of the batch processes when it terminates abnormally or when it is killed for any reason. I have implemented this functionality using shutdown hooks.
    I have initialized the thread pool with two thread instances. Now, while things are behaving as expected, I am losing up to 2 records when I kill the batch process. These are the 2 records that were being processed by the thread pool threads when the kill signal was sent.
    Is there a mechanism to prevent the batch job from exiting before these two threads have finished processing, or to keep these 2 thread pool tasks in the pool until they have completed gracefully, i.e. remove the tasks from the thread pool queue only when they have finished execution?
    Regards,
    Hitesh

    ExecutorService.shutdown()
    ExecutorService.awaitTermination()
    One would expect the jobs that are currently executing to have been already removed from the queue.
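    The pattern the reply points to can be sketched as follows; the pool size matches the question, while the 30-second timeout and the task bodies are placeholder assumptions:

    ```java
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class GracefulBatchShutdown {
        static final AtomicInteger processed = new AtomicInteger();

        // Drain the pool without losing in-flight records: stop accepting
        // new tasks, then block until the running tasks have completed.
        static void drain(ExecutorService pool) throws InterruptedException {
            pool.shutdown();                              // no new tasks accepted
            if (!pool.awaitTermination(30, TimeUnit.SECONDS)) {
                pool.shutdownNow();                       // give up after the grace period
            }
        }

        public static void main(String[] args) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(2);

            // In the batch framework this drain would live in the shutdown
            // hook, so a kill signal waits for the two in-flight records.
            Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                try {
                    drain(pool);
                } catch (InterruptedException ignored) {
                }
            }));

            for (int i = 0; i < 4; i++) {                 // placeholder batch work
                pool.submit(() -> { processed.incrementAndGet(); });
            }
            drain(pool);                                  // normal completion path
            System.out.println("records processed: " + processed.get());
        }
    }
    ```

    Note that shutdown hooks only run on an orderly JVM termination (e.g. SIGTERM or System.exit); a SIGKILL bypasses them entirely, so records in flight at that moment can still be lost unless the tasks themselves checkpoint their state.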

  • Syndication only when records are updated, but ONLY for selected fields

    Again, one of my questions: is that even possible at all?
    The case: we have a customer repository where, among other things, we add some extra info (classification ABC and similar, mainly) to a small percentage of the customers (around 5-10%),
    and we would like to distribute only those changes, i.e. changes in a few fields.
    Of course we load plenty of fields from R3 and other systems, and we also modify records to assign the customers to the different business organizations that have to look after them; these assignments also change very often (they can affect 30 to 40% of the customers). All this data does not need to be distributed.
    So in figures (as an example, to make it easier to understand), on a weekly basis:
    - approx. 250,000 records in MDM
    - approx. 1,000 new records every week from source systems
    - approx. 8,000 updated records every week from source systems (yes, a big number; some marketing attributes and soft stuff they like to see in MDM)
    - approx. 100 records are "enhanced" with data we need to distribute
    - approx. 50,000 records are re-assigned or reclassified with data we do NOT need to distribute (mostly assignments and so on that keep changing every week)
    So, even selecting "suppress unchanged records", MDM keeps distributing something like 60,000 records, when we only need those 100.
    Is there any direct way to do this?
    As a workaround, I have thought of a workflow that flags a new field (to_be_distributed) always and only when a user changes one of these specific fields, then filtering the distribution on this field = True, plus another workflow that sets the mentioned field back to False just after the distribution is done.
    Crazy idea? Anything better?
    (we are in MDM 5.5, if the solution require 7.0, migration plans are in the horizon)
    thanks in advance, everybody

    Hi,
    We had the same scenario, where we have to syndicate only certain fields to the legacy system. As per your scenario, add the classification data to your existing customer repository. The next step is to create a map for the classification data. You write your own conditions in the Free Form search of the MDM Syndicator; whenever those conditions are satisfied, the MDM system will syndicate.
    Next option: if you are implementing a workflow, create an extra field called Syndicate_Legacy. Syndicate_Legacy should be a flat lookup with the values Syndicate and No Syndicate. Before the end of the WF you should have an assignment step; use the assignment to change the value to Syndicate or No Syndicate. For this, along with the map, you should also write a condition in the Free Form search in the Syndicator stating that syndication happens only if Field(Syndicate_Legacy) = Value(Syndicate). I believe this should solve your issue; if not, please let me know.
    Thanks
    Ganesh Kotti

  • How to deactivate/ignore R/3 info records when creating Shopping Cart?

    Hi all and thanks for reading...
    We have the requirement of ignoring/deactivating R/3 info records when creating Shopping Carts SRM , so that no Vendor is proposed in transactions BBPSC01/BBPSC02.
    At the moment, when info records exist in backed, system is proposing vendor and other data and we want them to be completely ignored, both in classic and extended classic scenarios.
    How can we accomplish that? Is it possible to use BBP_SOS_BADI or is this BADI only valid for SRM local sources of supply?
    Has anybody had the same problem and solved it before?
    Thanks in advance for your help, regards
    David

    Hi  David
    An info record is a source of supply for the classic scenario only.
    Find and Check Sources of Supply
    Use
    With the Business Add-In BBP_SOS_BADI, you can search for and check sources of supply according to your own rules. These sources of supply include contracts, vendor list entries and product linkages. For this, the customer fields of the shopping cart or purchase order are transferred to the BAdI.
    Standard settings
    The BAdI provides the following methods:
    1. BBP_SOS_INDEX_UPDATE_CHECK
    Use: Check and update contract items in the source of supply table.
    2. BBP_SOS_SEARCH
    Use: Search for sources of supply according to your own rules.
    3. BBP_SOS_CHECK
    Use: Check and filter the sources of supply found by the standard search according to your own rules.
    4. BBP_SOS_PD_CHECK
    Use: Carrying out your own additional checks when creating a shopping cart document item with an assigned contract.
    Activities
    Implement the BAdI if you wish to determine or check sources of supply according to your own rules.
    See also
    Implementation
    As Prasanna mentioned: do you want to disable both sides, or only one side?
    Muthu

  • Error in replicat when updating a row

    Oracle 11gR2
    OGG 11.1.1.1.5
    I am getting the following error when updating a row that exists in both the target and source db:
    OCI Error ORA-01403: no data found, SQL <UPDATE...>. The following is in the discard file:
    Record not found
    Mapping problem with compressed key update record (target format)...
    ...The row DOES exist in the target db for sure. But for some reason it thinks that it is not there...why?
    Here are the exact steps I did:
    1. Created a table in the source.
    2. Created the same table in the target (explicitly, not through replication, as we have an exclude filter on 'CREATE').
    3. Inserted rows into the source table (which were replicated to the target table).
    4. Deleted a row in the source table (which was not applied to the target, as I am using the IGNOREDELETE parameter).
    5. Updated a row in the source table (this is where I got the above-mentioned error ORA-1403, even though the row does exist).

    mb_ogg is very likely right: you forgot to "add trandata", as that is the single most common reason for an ORA-1403 (ANSI 100) no data found error.
    The problem is that Oracle does not automatically log the PK; it only logs what changed. Down at the replicat on the target, it tries to update, but the WHERE clause has PK = NULL because the PK was not logged in the redo. To have it logged in the redo, so that the target UPDATE statement has a correct value for the PK, you need to use GGSCI to issue "add trandata", which performs an "ALTER TABLE ... ADD SUPPLEMENTAL LOG GROUP ... ALWAYS".
    "INFO TRANDATA" (in OGG version 11.2+ on Oracle) will tell you whether logging is enabled on a table and for which columns.
    Good luck,
    -joe
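    For reference, the commands Joe mentions are issued from GGSCI on the source system; the login credentials and the schema.table name below are placeholders:

    ```
    GGSCI> DBLOGIN USERID ggadmin, PASSWORD *****
    GGSCI> ADD TRANDATA scott.emp
    GGSCI> INFO TRANDATA scott.emp
    ```

    After ADD TRANDATA, supplemental logging of the key columns applies to new transactions; changes captured before that point may still arrive without a usable key.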

  • Info record not updated from PO , PO updated from Info record

    Dear Gurus,
    I want the base price in the PO to be copied from a valid info record, and users should not be permitted to change the base price once an info record is maintained.
    Kindly tell me the configuration steps to adopt this process.
    I also don't want the info record to be updated from the PO.
    I know that while creating a PO I can untick "InfoUpdate", but that is user-specific.
    Thanks in advance
    With regards
    SD

    Dear Sidi,
    Thanks . Problem solved.
    When an info record is maintained, the price condition will be taken from there. For this, change condition type P001 and set "D" in the manual entries column.
    When an info record is not maintained, the system will try to get the price from the last PO; if none is present, it will throw the error "Net price must be greater than 0" and set the price condition PBXX (manual entry) so you can manually enter the price you want.
    What you can do:
    Make the condition type P000 automatic only (option D, as above) and do the same for condition type PBXX.
    Regards
    Soumen

  • Error when updating Idocs in Source System - Urgent

    Hi Team,
    When I was loading data from SAP R/3 to BW, I faced the error "Error when updating Idocs in Source System" (0 from 0 records).
    When I check Environment > Transaction RFC > In the Source System, it displays the error:
    <b>"Import container contains errors (are any obligatory parameters missing?) in the function module IDOC_ERROR_WORKFLOW_START_R".</b>
    Can anyone please help me solve this error?
    This is an urgent requirement for me to deliver.
    Thanks & Best Regards,
    Venkata.

    Hello Venkata,
    How are you?
    It seems the workflow settings are not correct. Ask the Basis people to correct the workflow settings, then try updating the IDocs manually.
    Also check the inbound and outbound IDocs in the source system. The outbound ones should contain some warnings or errors.
    Best Regards,
    Sankar Kumar
    +91 98403 47141

  • After I did the update it deleted my phone contacts :( Trying to restore, but iTunes only has the last restore from yesterday, when the update was done. SMH

    After I did the update it deleted my phone contacts. I am trying to restore, but iTunes only has the last restore from yesterday, when the update was done. SMH

    Yes, all contacts are checked, and I am getting a pop-up message that says "your contacts, calendars, & bookmarks are being synced with iCloud over the air, so it isn't necessary to sync these items using iTunes. iCloud sync can be configured using your device's Mail, Contacts & Calendars settings".
    I have that setting as referenced above, and the only contacts showing are the 9 contacts I re-entered manually today after losing all 300 contacts. I then restored my iPhone back to original and still no contacts. Please help.

Maybe you are looking for

  • Unable to generate So through Back ground scheduling : URGENT

    Dear All, I have coded a customized BAPI to create a sales order from the data sent by the SAP.NET scheduler. If I test the BAPI as a function module I am able to generate an SO for the same data which is there in the external system, but the SAP.NET scheduler

  • Listener error with dg4odbc and mysql

    Hi all, First of all, sorry for my English, I'm learning :-) I'm trying to create a dblink from oracle to mysql, but for now this seems impossible. I have read lots of manuals and post, but no way. I can connect to remote mysql server with "isql my_p

  • Mdx Query performance problem

    Hi Is there any way to control the performance of Mdx expressions that use the Filter function? The following Mdx statement is an example of a query we are generating to return filtered characteristic values for users to make selections for variables

  • Kernel task uses 750MBs of RAM since installing Mavericks

    Hi All, I'm new to these forums and could really use some help. After installing Mavericks I noticed my system was running deathly slow even with no applications open. At first I assumed it was indexing so I waited a day. I noticed kernel task, the f

  • T-Code SMQS

    Hello everyone! I am facing some problems replicating BPs between the CRM and CS systems. I've noticed that SMQS is always in status WAITING, hence the outbound queue (SMQ1) always has lots of entries. That could be the reason for the performance proble