What is the best way to kill/stop a data load?

Hi.
What is the best way to kill/stop a data load?
I have a data load from my QA R/3 system that is extracting 115,000,000+ records. The problem is that the selection in the function module used by the DataSource does not work, and this was not detected earlier because of the nature of the data on the development system.
I could kill the processes owned by my background user (on both R/3 and BW), but I risk killing other loads, and sometimes the job seems to restart if I just kill processes. If I remove the transactional RFCs in SM58 the load does not terminate; I only skip one or more data packages. I have also tried changing the QM status in the monitor to red, but that does not stop the load either...
So isn't there a nice fool-proof way of stopping a data load?
Best regards,
Christian Frier

Hi,
There are two ways to kill the job.
One is via transaction RSMO: locate the request and open the Status tab, then double-click the yellow light shown on the Total line. A pop-up titled 'Set overall status' appears; select the desired status (red) and save it. Then return to the monitor, open the Header tab, double-click the data target, right-click it and choose 'Manage'. The faulty request should be sitting there, probably with a yellow light; highlight that line, click the Delete button and then click the Refresh button.
The second is via SM37: select 'Active' jobs, enter the job name and click Execute. The job should appear; highlight the job name and click the stop icon in the toolbar (third from the left).
Hope that is clear.
Regards-
Siddhu

Similar Messages

  • What's the best way to cleanly stop Goldengate?

    For routine maintenance/upgrades, what's the best way to cleanly stop GoldenGate? I don't want to wait endlessly. I currently use this and have seen no issues:
    stop er *!
    kill er *!
    stop manager!
    Thanks,
    Shankar

    shiyer wrote:
    For routine maintenance/upgrades what's the best way to cleanly stop GoldenGate? I don't want to wait endlessly. I use this currently and seen no issues :-
    stop er *!
    kill er *!
    stop manager!
    For routine maintenance/upgrades, just "stop er *" should be preferred; and when all processes are stopped, mgr can (optionally) be stopped. If you really are waiting endlessly for this to return, the real question is "why": then, perhaps there are other parameters that can be adjusted to make GG stop more quickly. (On the other hand, "stop mgr!" is harmless, the "!" simply prevents it from asking "are you sure?" before stopping the process.)
    I really wouldn't use "kill" unless you really have a good reason to (and the reason itself requiring the process to be killed should be analyzed & resolved). To "kill" a process shouldn't cause data loss (GG always maintains checkpoints to prevent this) -- but still it seems unnecessary, unless there really is something that should be killed. (Aside: I mean, I can kill -9 / "force quit" my browser and/or 'halt' my laptop every time as well, and it would probably be "faster" to -- but it can cause issues (and wasted time) upon restart: i.e., fsck, recover sessions, whatever. Same basic idea, imo.)
    There's a reason there are different commands to 'stop' processes (stop vs. stop! vs. kill). Just for example, "stop replicat !" causes current transactions to be rolled back; there typically is no reason for this, you'll just restart that txn when the process is restarted; might as well let the current one finish. And "kill extract" (I believe) will not warn about potentially "long running transactions" that can cause (painful) issues at startup (missing old archive logs, etc). There are probably other examples, as well.
    So if this really is "routine", then "stop", don't "stop!" or "kill". If there are long delays, see why first & see if they can be addressed independently. (This really is just a stock answer for a generic question, it would be irresponsible for me to answer otherwise :-) )

  • What is the best way to stack DAQ aquired data in labview?

    I'm developing an application to work with an M-series DAQ card and LabVIEW 8.5 to output a signal and then record on 8 differential inputs for a short period of time (~10 ms). I need to stack my data, however, because the incoming signal will be very, very small, even after amplification. So basically I'm running a slightly modified version of the Multi-Function Synch AI-AO.vi (included with the install of DAQmx). What is the best way for me to rerun this VI a set number of times and add the new data directly to the old data (not concatenating or anything; like |sample 1 of run 1| + |sample 1 of run 2| = stacked sample 1)?
    A slightly modified version of the Multi-Function Synch AI-AO.vi is attached.
    Attachments:
    des_v2_Multi-Function-Synch AI-AO.vi (143 KB)

    Hi LSU,
    see the attachment for how to "stack" several measurements. I simply add the waveforms and use a shift register to keep the last iteration's value.
    Writing to files in each iteration is extremely CPU-consuming - especially with Express VIs. Using a for loop for just one iteration is pointless. You could enable the conditional terminal of the for loop to realize your stop feature.
    For your message 4:
    Have you ever tried all the things you asked about? Sometimes it's easiest to just use trial and error.
    And for the "n=n+x" question: It really helps to take the free online courses offered by NI!
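    If it helps to see the same shift-register accumulation outside LabVIEW, here is a rough Java sketch of the idea: keep one running-sum array and add each new run into it. The acquisition call is a placeholder, not a real DAQmx API.
    ```java
    public class StackedAcquisition {
        // Placeholder for one acquisition run; in LabVIEW this is the AI read.
        static double[] acquireOnce(int samples) {
            double[] run = new double[samples];
            // ... fill with measured data ...
            return run;
        }

        public static void main(String[] args) {
            int samples = 1000;   // samples per run
            int runs = 50;        // number of repetitions to stack
            double[] stacked = new double[samples];  // plays the role of the shift register

            for (int r = 0; r < runs; r++) {
                double[] run = acquireOnce(samples);
                for (int i = 0; i < samples; i++) {
                    stacked[i] += run[i];  // sample i of run 1 + sample i of run 2 + ...
                }
            }
            // Optionally divide each element by 'runs' to get the averaged, noise-reduced signal.
        }
    }
    ```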
    Message Edited by GerdW on 11-11-2009 06:27 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
    Attachments:
    des_v2_Multi-Function-Synch AI-AO.vi (128 KB)

  • What is the Best way To Copy and paste data from 1 book to another

    I have 18 sheets in 5 different books from which I want to extract data from specific cells. What is the best way to do this? Example: one sheet is called Numbers E-O1, with data in 13:WXYZ. The data updates and moves up one row every time I enter a new number. So let's say I enter the number 12. Through a lot of calculations the output goes into 13:WXYZ, into what I call a counter, which is a 4-digit number. Anyway, how can I send that 4-digit number to a totally different sheet? To bullet what I'm talking about:
    data in cells row 13:WXYZ in the book called Numbers, sheet E-O1
    send data to the book called "Vortex Numbers", sheet E-O, row 2001:CDEF
    What formula or macro can I use to make this work?
    Thank you!

    Hello Larbec,
    Syntax:
    '[BookName]SheetName'!Range
    Enter in cell  2001:CDEF:
    ='[Numbers]E-O1'!13:WXYZ
    This assumes that the file is open in Excel. Otherwise you need to add the path:
    'ThePath[BookName]SheetName'!Range
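    For example, assuming the workbook is saved in C:\Data\ as Numbers.xlsx (path and file name are placeholders, using the same range notation as above):
    ='C:\Data\[Numbers.xlsx]E-O1'!13:WXYZ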
    Best regards, George

  • What's the best way to handle all my data?

    I have a black box system that connects directly to a PC and sends 60 words of data at 10 Hz (worst-case scenario). The black box continuously transmits these words, which contain a large amount of data that is continuously updated from up to 50 participants (again, worst-case scenario),
    i.e. 60 words * 16 bits * 10 Hz * 50 participants = 480 kbps. All of this is via a UDP Ethernet connection.
    I have LabVIEW reading the data without any problem. I now want to manipulate this data and then distribute it to other PCs on a network via TCP/IP.
    My question is: what is the best way of storing my data locally on the interface PC so that I can then have clients request the information they require via TCP/IP? Each message that comes in via the Ethernet will relate to one of the participants, so I need to be able to check whether I already have data about that participant - if I do then I can just update it, if I don't I need to create a record for the participant, and if I haven't heard from one for a while I will need to delete it. I don't want to create unnecessary network traffic. I also want to avoid global variables if possible - especially considering that I may have up to 3000 variables to play with.
    I'm not after a solution, just some ideas about how to tackle this problem... I thought I could perhaps create a database and have LabVIEW update a table with the data, adding a record for each participant. Alternatively, is there a better way of storing all the data in memory besides global variables?
    Thanks in advance.

    Hi russelldav,
    one note on your data handling:
    When each of the 50 participants sends the same 60 "words", you don't need 3000 global variables to store them!
    You can reorganize those data into a cluster for each participant, and use an array of clusters to keep all the data in one "block".
    You can initialize this array at the start of the program for the maximum number of participants; there is no need to (dynamically) add or delete elements from this array...
    Edited:
    When all "words" have the same representation (I16?) you can use a 2D array instead of an array of clusters...
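    As a rough illustration of the same keyed-record idea outside LabVIEW, a per-participant store might look like this in Java (participant IDs, the word count and the purge timeout are assumptions):
    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class ParticipantStore {
        static class Record {
            short[] words = new short[60];   // the 60 data words (I16)
            long lastUpdateMillis;           // used to purge stale participants
        }

        private final Map<Integer, Record> records = new HashMap<>();

        // Update (or create) the record for one participant when a UDP message arrives.
        void update(int participantId, short[] words) {
            Record rec = records.computeIfAbsent(participantId, id -> new Record());
            System.arraycopy(words, 0, rec.words, 0, words.length);
            rec.lastUpdateMillis = System.currentTimeMillis();
        }

        // Drop participants that have not been heard from for a while.
        void purge(long timeoutMillis) {
            long now = System.currentTimeMillis();
            records.values().removeIf(r -> now - r.lastUpdateMillis > timeoutMillis);
        }

        public static void main(String[] args) {
            ParticipantStore store = new ParticipantStore();
            store.update(7, new short[60]);  // message from participant 7 arrives
            store.purge(5_000);              // drop anyone silent for more than 5 s
        }
    }
    ```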
    Message Edited by GerdW on 10-26-2007 03:51 PM
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • What is the best way to extract/consume HR data to and from Peoplesoft?

    I have to do an integration between PeopleSoft and Oracle ERP to send HR data from PeopleSoft to Oracle HRMS. I will use Oracle Fusion Middleware to build the interface. I don't have much experience working with PeopleSoft tools such as PeopleTools, Application Engine, Integration Broker, Component Interface and the Oracle adapter for PeopleSoft. Can somebody please let me know the most popular, recommended or best way to extract HR data from PeopleSoft?
    The second interface will consume some HR data from Oracle. Once again I need to know the different ways to insert data into PeopleSoft.
    I would appreciate it if you could share your experience with the above two scenarios.

    If you plan on buying an Apple laptop you have two options. You can use Time Machine, or pull the data off of it directly and put it onto the external drive. I advise the second choice. This way you do not have stuff that you don't need filling up the space on your new drive. Get your applications, music, documents, photos, videos and perhaps downloads and desktop; put all of these on your external. And then reinstall OS X onto your drive.
    *** If buying a Windows machine next, check the format of the drive in Disk Utility. You will need a program called Paragon NTFS to format the drive for Windows compatibility. You have to pay for the full version but can download a trial version for free.
    http://www.paragon-software.com/home/ntfs-mac/
    Did I leave anything unanswered?

  • What is the best way to put LabVIEW DSC data into an Oracle database?

    I have been collecting data using LabVIEW DSC 7.0 for several years and have always accessed the data from the Citadel database via the Historical Data Viewer.  I would now like to begin putting this data into an Oracle database.  My company stores all their data in Oracle and it would provide me all the benefits of their existing infrastructure such as automated backups, data mining tools, etc.
    My initial thought is to use "Read Trace.vi" in LabVIEW to pull historical data from the Citadel database at regular intervals (e.g. 1 minute) and insert this data into Oracle via ODBC. In this way, I do not need to track the value changes in order to know when to write to Oracle. I also considered replicating the Citadel database using some other method, but I recall that the tables used by Citadel are somewhat complicated. I only need a simple table with columns for channel, timestamp, and data. The "Read Trace.vi" will provide me data in this format.
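    For illustration, the simple channel/timestamp/value insert described above might look roughly like this in Java with JDBC (the table name, columns and connection details are assumptions; in practice the writes would go through ADO or the LabVIEW Database Connectivity Toolkit):
    ```java
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;

    public class TraceToOracle {
        public static void main(String[] args) throws Exception {
            // Connection details are placeholders; requires the Oracle JDBC driver on the classpath.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521/ORCL", "user", "password");
            conn.setAutoCommit(false);

            // Assumed table: HIST_DATA(CHANNEL VARCHAR2, TS TIMESTAMP, VALUE NUMBER)
            String sql = "INSERT INTO hist_data (channel, ts, value) VALUES (?, ?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                // One row per point returned by Read Trace.vi for the last interval.
                ps.setString(1, "Boiler/Temperature");
                ps.setTimestamp(2, new Timestamp(System.currentTimeMillis()));
                ps.setDouble(3, 72.4);
                ps.addBatch();
                ps.executeBatch();   // batch the rows rather than one round trip per point
            }
            conn.commit();
            conn.close();
        }
    }
    ```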
    I do not need to update the Oracle database in real time, a few minute delay is acceptable. If anyone has a better idea or additional insight please let me know. Thanks.

    In terms of connectivity, you want to use ADO, not ODBC. Beyond that, it all depends on the structure of the data and what you are going to want to do with it. This is a very big question on which you should get some in-depth assistance.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • What is the best way to handle specific requirements for loading?

    Hello Experts,
    There are many, many loads to be done for various company and time combinations. For this, I am logically thinking of process chains.
    However, there are many time conditions, tied to the company calendar, for starting the loads.
    Ideally, I would like to use process chains without too much heavy coding (as in using ABAP as a process of a chain). Light ABAP routines are fine. But would this be possible? Or, to put it another way, can process chains (without coding) address different unique time conditions to trigger InfoPackages (even if more chains have to be created)?
    Hope I can receive your good advice.
    E.g. of unique time conditions: run every Sunday after the 15th of every month, or run at 3am and 10am before the 15th.
    Thanks,
    Ernesta Corvino

    Ernesta
    Since you have so many loads depending on time combinations, as you rightly said, rather than using ABAP coding I suggest you create different process chains with different time combinations and club them into a meta process chain, or run them independently depending on the time.
    Hope I understood your question correctly.
    Thanks
    sat

  • When traveling abroad, what's the best way to go about voice/data services?

    I will be traveling for 5 weeks in Turkey and I would like to be able to use my iPhone to make emergency calls and maybe get 3G coverage for navigation and information purposes.
    I understand that service fees in this kind of situation can be astronomical, but does anybody know if one can sign up for local service, since the iPhone recognizes foreign networks and connects to them when in another country?

    Thank you all for your help.
    Unfortunately my iPhone is locked to Telcel, and I just got a special voice plan for Turkey from them at about $4 USD per minute, which, while expensive, is about half of what the charge would be without it.
    But no data deal, so I think I will be limited to WiFi only.

  • What is the best way to geotag photos with data from GPS Tracker Log (GPX format)?

    I just returned from a 4 month trip to Southeast Asia. Before I left, I set the time on my camera, a Canon 7D, to match the GPS time on my GPS tracker, a QStarz BT-Q1000X. Both times were set to UTC, also known as GMT.
    I have imported the data from the GPS log into Aperture as a GPX file. Now I would like to sync the location metadata on my photos with the data from the GPS log. How can I do this using Aperture 3?
    I seem to be having two problems:
    Aperture wants to shift the times based on which part of the track I drag the images onto. Since the camera and gps tracker times were sync'd, I do not want any shift.
    The time shifts that are displayed are 12 hours and a few minutes, rather than 0 hours and a few minutes.
    A few additional notes:
    When I imported the GPS track, Aperture automatically changed the time zone of the track to Asia/Ho Chi Minh City, which is correct (GMT+7).
    I have batch changed the images from Camera Time: UTC to Actual Time: GMT+7
    The time on my computer is US Eastern Daylight Time (GMT-4), or Eastern Standard Time (GMT-5) when the photos were taken.
    Since I was gone for so long, and my GPS tracker memory is limited, I set the tracker to record a trackpoint once every 60 seconds.
    Is there a solution to my problems using Aperture 3, or should I start looking for third party applications to geotag the photos before importing them into Aperture?
    Thank you.

    I think the 12-hour discrepancy is more likely a result of the 12-hour difference between GMT+7 and GMT-5. The dates and times of the photos and the GPS track are displayed properly in Aperture. See the images below.
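    If you end up doing the matching outside Aperture in a third-party tool or a small script, the core operation is just a nearest-timestamp lookup against the trackpoints, with both sides kept in UTC. A rough Java sketch (the trackpoint data and photo time are made-up inputs):
    ```java
    import java.time.Instant;
    import java.util.List;

    public class GpxMatcher {
        static class TrackPoint {
            Instant time;      // trackpoint time from the GPX file (UTC)
            double lat, lon;
            TrackPoint(Instant time, double lat, double lon) {
                this.time = time; this.lat = lat; this.lon = lon;
            }
        }

        // Return the trackpoint whose timestamp is closest to the photo's capture time.
        // Because camera and logger were synced to UTC, no extra offset is applied.
        static TrackPoint nearest(List<TrackPoint> track, Instant photoTime) {
            TrackPoint best = null;
            long bestDelta = Long.MAX_VALUE;
            for (TrackPoint p : track) {
                long delta = Math.abs(p.time.getEpochSecond() - photoTime.getEpochSecond());
                if (delta < bestDelta) {
                    bestDelta = delta;
                    best = p;
                }
            }
            return best;
        }

        public static void main(String[] args) {
            List<TrackPoint> track = List.of(
                    new TrackPoint(Instant.parse("2011-03-05T08:00:00Z"), 10.77, 106.70),
                    new TrackPoint(Instant.parse("2011-03-05T08:01:00Z"), 10.78, 106.71));
            TrackPoint p = nearest(track, Instant.parse("2011-03-05T08:00:40Z"));
            System.out.println(p.lat + ", " + p.lon);
        }
    }
    ```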

  • What's the best way to read JSON data?

    Hi all;
    What is the best way to read in JSON data? And once it is read in, is the best way to use it to turn it into XML and apply XPath?
    thanks - dave

    jtahlborn wrote:
    without having a better understanding of what your definition of "use it" is, this question is essentially unanswerable. Jackson is a fairly popular library for translating JSON to/from Java objects. The json website provides a very basic library for parsing to/from XML. Which one is the "best" depends on what you want to do with it.
    Good point. We have a reporting product (www.windward.net) and we've had a number of people ask us for JSON support. But how complex the data is and what they want to pull is all over the place. The one thing that's common is that they generally want to pull down the JSON data and then put specific items from it in the report.
    XML/XPath struck me as a good way to do this for a few reasons. First, it seems to map well to the JSON data layout. Second, it provides a known query language. Third, we have a really good XPath wizard and we could then use it for JSON also.
    ??? - thanks - dave
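    To make the Jackson option concrete, reading JSON into a tree and pulling specific items out of it looks roughly like this (a sketch; the document structure and field names are invented for the example):
    ```java
    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonReadExample {
        public static void main(String[] args) throws Exception {
            String json = "{\"report\": {\"title\": \"Q3 Sales\", \"total\": 1250.5}}";

            ObjectMapper mapper = new ObjectMapper();
            JsonNode root = mapper.readTree(json);

            // Navigate the tree much like an XPath expression would.
            String title = root.path("report").path("title").asText();
            double total = root.path("report").path("total").asDouble();

            System.out.println(title + " = " + total);
        }
    }
    ```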

  • What is the best way to charge my iPod touch? Is it bad to kill the battery fully or leave it in the red? Please get back to me ASAP, thanks

    What is the best way to charge my iPod touch? Is it bad to kill the battery fully or leave it in the red? Please get back to me ASAP, thanks.
    Please, I don't want to destroy my iPod. Thanks,
    Chad

    Whatever you do in terms of charging will not destroy your iPod.
    Many people charge when it's convenient. Others just connect the iPod to the computer for syncing and leave it charging overnight, regardless of the remaining charge level. The life of modern batteries is measured in full charge cycles. A charge cycle is either a full charge or any combination of partial charges that adds up to 100%.
    The only real recommendations I've seen (including from Apple) are to keep the battery away from heat and to do a full charge once a month: discharge the battery to about 20%, then fully charge it.

  • What is the best way to protect it?

    I HATE getting scratches or any sort of cosmetic damage on my devices. I am getting my first MBP on Friday, and I want to know what the best way is to keep it like new.
    It won't be leaving the house regularly for a while, but I do have a neoprene case which I think it will fit for travelling anyway.
    I have seen many people with cases on their Macs - I don't know what the proper name for them is, but they are like hard shells attached to the back of the lid. Are these recommended? I have read about some heat issues with them.
    What is the best way to protect the screen? I don't ever touch it, so I don't think that will be a problem, but cleaning dust off etc.?
    Lastly, will the trackpad wear away? My laptop's trackpad has faded slightly from heavy use; are MBPs prone to this too? I will try to use a mouse as much as possible.
    Are there any other things I should know about? I read that the bottom can get scratched easily too? I would only use it on a flat surface or something like my lap, so depending on how soft the metal is... it can't be that bad, can it?
    Thanks!

    Wouldn't buy that, nope.
    Hard cases protect the finish of your MacBook, but they trap heat... many here have said the same.
    A major part of a MacBook Pro, especially a Retina Pro, is dissipating heat through the alloy case, which such a case prevents from happening.
    Yes, you're stopping all the scratches... and likewise keeping the MacBook from dissipating a lot of heat.
    When I said INCASE, I meant this:
    http://www.amazon.com/Incase-CL57482-Nylon-Sleeve-13-Inch/dp/B0043NTOKC/ref=sr_1_1?ie=UTF8&qid=1382551803&sr=8-1&keywords=incase+carry

  • What's the best way to back up to the cloud?

    I am going to be traveling on a boat in Europe. Access to WiFi could be limited and slow. I want to back up my photos and catalog to the cloud, as there is potential to lose the laptop and external drive. Since I have a 25 MP camera and will be shooting in Raw a lot, there will be some big files, as each photo could be around 27 gb.
    All of my files will be in the My Pictures folder LR set up when I created my new catalog. I planned to back up these folders to the cloud, then send just the new folders each time I add them to LR during import. Then I was going to have the catalog backup that is created each time you close LR (only the data is in this file) sent to the cloud as well.
    Then if my laptop is lost I can copy the backed-up catalog (just the data file) and the My Pictures folder that has all my photos in folders to a new laptop and restore my catalog. I'm told by Adobe help that this won't work, as the connection between the photos and the catalog backup will be lost.
    Is this correct? And if so, what's a better method?

    Thanks Jim, and yes I meant 27 mb. I will be living on a sailboat, so there is potential to lose both the laptop and the external drive. The cloud was to be my third copy.
    JimHess wrote:
    You really have images that are 27 GB each? I suspect you meant 27 MB. But, even then, I think cloud backup might be cumbersome. I would suggest carrying a small external hard drive. I have a 1 TB USB 3 drive that is really fast and very small. No external power supply. Plug it in, and copy the files. That's what I would do.

  • My old computer is dying and I want to transfer my account to my new computer.  I will not be using the old computer at all.  What is the best way to do this?

    I want to stop using my old computer completely and transfer my itunes account to my new computer.  What is the best way to do this?

    A simple search
