Data hanging in sub-system

Hi,
We have one main SAP system that receives data transferred from different SAP sub-systems. One of our sub-systems shows postings hanging in the interface. Reason: the cost centre is blocked for actual primary costs. I saw this error through SM37 with the help of the job number. These postings come from a legacy system into our sub-system.
I am not allowed to unblock the cost centre, but I am supposed to change the cost centre in the posting. I cannot find a way to carry out this change in the posting.
Can anybody help me?
Thanks

Hi ,
First of all, I would like to ask what transaction you are executing when you get the error.
Is it a Z transaction or a SAP standard one? You commented that you cannot change the cost center; what is the reason for that?
In your previous message you mentioned that you don't want to unblock the postings, and then you said you cannot change the cost center. Please elaborate on what you are performing and why you are unable to change the cost center.
Cheers ,
Dewang

Similar Messages

  • I/O Sub-System - What is That?

    I use the term “I/O sub-system” quite often, and some wonder what that is. Usually, I will give a bit of detail, but probably need to take things to a higher level, and also explain why the I/O sub-system is important to video editing. The term I/O is Input/Output, and the concept of the “sub-system” encompasses everything about the HDD’s (Hard Disk Drives). As used in these fora, the I/O sub-system refers to the HDD’s, and all the specs of those HDD’s.
    It starts with the HDD’s themselves. The main considerations are:
    The number of physical HDD’s (not partitions)*
    HDD capacity/size
    HDD space used, and remaining (defragmented)
    Format Type: FAT-32, or NTFS**
    The speed of the HDD, i.e. the RPM performance of the HDD
    The controller type, say SATA II, and this will specify the connection speed
    The allocation of each HDD, i.e. where one has the OS, the programs, the Project files, the Scratch Disks, the Exported files, etc.
    For Video editing, having 2 physical HDD’s is just about the minimum, splitting the I/O load over the separate, physical HDD’s. Having 3, and splitting the I/O load over all 3, is even better. For even greater performance, one can add additional HDD’s, and allocate them for even more spread.
    With a 2 HDD I/O sub-system, one would allocate in this manner:
    C:\ OS, programs and probably the Windows Virtual Memory [Page File]
    D:\ Project files, with Assets, Scratch Disks and Exports
    With a 3 HDD I/O sub-system, one would split D:\ by perhaps putting the Project files and Scratch Disks on one disk and the Assets on a separate disk. Note: one can also split the Scratch Disks between D:\ and E:\ for even better performance.
    The more that one can separate the needs for reads/writes to/from the HDD’s, the greater the performance will be, as the computer will not be waiting for those reads and writes.
    Why is proper allocation of the HDD’s, and spreading the I/O load important? One of the first laws of physics is that an object cannot be in more than one place at the same time. When running a video editing program, there are several programs, and processes telling the computer to do many things at the same time. First, the OS is demanding reads from the HDD, and in the case of the Page File, writes, as well. The OS gets first dibs on these reads and writes. Next, the program, our NLE (Non Linear Editor) here, requires reads from the HDD, and will also require writes for many operations. This is just to run. When one is working on a Project, there are many writes to the Scratch Disks, to create, or work from large working files. The NLE will also need to gather information from the Assets in the Project, hence plenty of reads. When playing back the Timeline, the NLE will require many high-speed reads. When Exporting, the NLE first writes working files, and then reads from them, to create the Export file - more writes. If the computer is requested to do reads and writes for the OS and the NLE at the same time, someone must wait. If these reads and writes are requested on a single HDD, the wait is much, much longer. If one has spread the I/O load over several HDD’s, the wait is far, far shorter. The better you spread this load, the greater your performance and the fewer lags you will have. This is especially noticeable with playback of the Video and the Audio.
    One problem with HDD’s is that as they fill up, the heads have to move greater distances to do the reads and writes. Most HDD’s begin to slow down at about 70% capacity, as their read/write times go up. This also causes mechanical wear and tear on the head actuating mechanism, which can lead to premature failure. The exact same thing happens when an HDD is fragmented. The heads have to hunt all over the platters to get the pieces of a file, which will then need to be “assembled.” Some applications and operations will also not work well, if at all, if there is not adequate defragmented free space available. Also, as video editing requires very large working files just to function, having adequate free space is very important.
    One might think only in terms of, say, the space required to write to a DVD-5 (approximately 4.7GB), but first the program creates working files, which can be much larger, and then the program will write a separate Video and a separate Audio file, and finally combine them into one muxed MPEG-2 file. Last, the program gathers up this MPEG-2 file, plus the navigation and Menus, and bundles everything into the IFO, BUP and VOB files. In PrE and Encore, a module, the Vobulator, does this last function, and will need plenty of working files to produce that 4.7GB DVD. If the I/O sub-system cannot provide adequate free space, the process will fail. This shows why one must have adequate defragmented free space available.
    This brings us to additional reading, and to a Q & A session:
    * See Partitions, a No-No for Video Editing
    See Harm on HDD Setup for Video Editing
    See Harm on To Raid, or Not to Raid
    See Windows Virtual Memory, as the Page File requires I/O interaction, and how you have this set up, can affect things on several levels.
    ** See Converting FAT-32 to NTFS
    Question: Can I use external HDD’s in my I/O sub-system?
    Answer: yes, but the connection type will be important. USB 2.0 connections are very, very slow, and because of the speed of most computers, and the great deal of read/write activity required by an NLE, the connection is often too slow just to perform, leading to read/write failures, including the dreaded Delayed Write Failure, which can totally corrupt the external, requiring full low-level reformatting and the loss of ALL data on that HDD. USB 2.0 HDD’s are great for archiving data, but not much more. FireWire 400 is better, but is still very slow. FW-800 is adequately fast for editing to/from, and is stable. The use of eSATA is even better, and will equal in real-world terms, about what one gets with internal SATA II HDD’s.
    Question: What about SSD drives?
    Answer: They are fast, but still cost a lot for the storage, and that storage is still limited. They really help with boot-up speed, but then how often does one boot up the system? In time, as the capacity goes up and the cost comes down, they will very likely replace the mechanical HDD’s. They do bear watching, but beyond the boot disk, are not that well-suited as additional drives for video editing. That will likely change in the future.
    Question: Do I really need to frequently defragment my HDD’s?
    Answer: Absolutely - see above. After every editing session, I will run a defragmentation program, and will do so weekly, even if I have not been editing.
    Question: Can I edit video on my laptop with a 40GB HDD?
    Answer: Perhaps, but you will want to have only the OS (pared down to the minimum) and the NLE program, plus you will need to keep the Projects very small, and not have any more Assets than are absolutely necessary. Then, you will need to be prepared to wait, and possibly deal with crashes when that wait-time becomes too much for the OS, or the NLE, to bear. You will want to defragment the HDD very, very often. This holds true, to a degree, with any single-disk I/O, regardless of the size, speed or controller type. There are many reads/writes, and they will bottleneck, waiting for that one poor HDD to serve them.
    Question: Does the HDD’s speed really matter?
    Answer: Yes. For Video and Audio, one needs at least a 7200 RPM HDD to play back smoothly under all load conditions. A 5400 RPM unit might be able to keep up under otherwise ideal conditions, but lags and jumpy playback are very common. A 10K RPM unit will likely not help much in playback over a 7200 RPM drive, but will greatly improve other read/write functions, meaning that one might not notice that extra speed just with playback, but that other functions will be improved, speed-wise.

    Dear Mr. Hunt,
    I'm an experienced Oracle DBA.
    The technique of organizing VLDB systems (very large databases) is quite similar to what you have shown here.
    May I suggest, perhaps for your readers' benefit (especially those who do professional large-scale editing projects), that it would be a good idea to elaborate on some modern storage system architectures like NetApp, EMC, etc.
    Using a modern storage system can solve many of the I/O problems, and it seems to me after reading this article that you and I are dealing with the same I/O problem: large amounts of data that need to be written and accessed very fast.
    Thank you for your effort.
    Sincerely Yours,
    shimon.

  • Accessing data from other SAP system

    Hi experts
    I need to access data in one SAP system from another. My requirement is to run applications (programs) in one SAP system while using/accessing data from another SAP system.
    For example, when we press F4 we get the match code (search help) for the field; that window is already created by SAP and it shows the data for that field. Of course the application (like any other program/application) reads the data from the SAP system in which it is executing; well, I need to execute that match code but show (take) the data from another SAP system. I do not mean another MANDT in the same system, but another SAP system.
    The communication between SAP1 system and SAP2 system is already created from BASIS.
    My doubt is: how can I execute that standard application (search help) but 'tell' the standard application to take its data from the other system?
    I guess I could create a Z search help which calls a Z function (RFC) to get the required data from the other system;
    or maybe enhance the standard search help to call that Z RFC. But the standard search help is already created, it accesses the data in several ways and in several places in the code, and it has several sub-windows and search features; and I just need to execute exactly the same search help but accessing data from the other SAP system. So my doubt is: is there some way to indicate to SAP that wherever these applications access data, they take it from the other SAP system,
    as if we 'redirected' the database to the other SAP system?
    SAP1 applications are executing -
    > but all data (records) are taken from SAP2
    Maybe the first suggestion will simply be: why not execute the applications in SAP2 instead of SAP1? That's because the version package in the SAP2 system (where we have the data) does not support some (non-SAP) applications we need to use.
    Excuse the long thread. Can somebody help? Any ideas?
    Best Regards
    Frank

    Hi Frank,
    Below are my thoughts. These are only conceptual; I haven't had an opportunity to try these possibilities, and I am not sure this is the right thing.
    1) Create an entry in DBCON using TCode DBCO for the database of system SAP2. Then, in the application on SAP1, use the "SET CONNECTION" construct as the very first statement, so that subsequent SQL statements will point at the DB of SAP2.
    2) I have read somewhere that the work processes are connected to the default database, and I am not sure if you can change this setting. But if you can, and provided you have multiple application servers in your SAP1 landscape, then pick one of the application servers, change the settings on its work processes to point at the DB of SAP2, and have the end users log in to this particular app server.
    3) If there are multiple app servers in the SAP1 landscape, then choose one of the app servers and change its default database setting to point at the DB of SAP2.
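To illustrate the idea behind Chen's first suggestion (a named secondary connection, with subsequent SQL directed at it, as "SET CONNECTION" does in ABAP), here is a minimal sketch in Python using sqlite3 as a stand-in for the two SAP databases. The connection names and the table are purely illustrative, not part of any SAP API.

```python
# Sketch only: a registry of named connections, like DBCON entries.
# sqlite3 stands in for the SAP1/SAP2 databases; names are invented.
import sqlite3

connections = {
    "DEFAULT": sqlite3.connect(":memory:"),  # SAP1's own database
    "SAP2": sqlite3.connect(":memory:"),     # the DBCON-style secondary entry
}

def run_sql(sql, params=(), connection="DEFAULT"):
    """Like SET CONNECTION: choose which database the statement hits."""
    return connections[connection].execute(sql, params)

# All statements below go to the "SAP2" database, not the default one.
run_sql("CREATE TABLE t (v INTEGER)", connection="SAP2")
run_sql("INSERT INTO t VALUES (1)", connection="SAP2")
rows = run_sql("SELECT v FROM t", connection="SAP2").fetchall()
```

The point is only that redirection happens per statement via the connection name; in a real SAP landscape the schemas of the two systems would also have to match for this to work.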
    Regards,
    Chen

  • Should I use Sub-Equipment Hierarchy for sub-systems (first post)?

    I am 2 months into the maintenance manager role for a fleet of locomotives, and senior management has decided to use SAP PM based on a separate business arm's requirements.
    I believe for this to be successful the first step is to get the structure right so that data is collected in the most logical manner.
    I intend to use SAP to schedule all maintenance tasks based on measuring points and dates of previous work orders, capture system and sub-system failure history, capture costs and parts usage and trend performance of subsystems based on their own and system level measuring points.
    I have been told that all locomotives are assigned to a single functional location within the business model. The individual locomotives themselves are assigned as separate pieces of equipment in this functional location.
    Within the individual locomotives there are a series of subsystems that have associated maintenance requirements based on time and usage and they are serial number tracked. These items are exchanged between the locomotives and the warehouse upon failure, upgrade or overhaul and include:
    AC traction Motors
    Wheelsets
    Diesel Engine
    Main Generator
    Auxiliary Generator
    Dynostarter
    Gearbox
    Compressors
    Traction motor blowers (Fans)
    Radiator fans and AC radiator fan motors
    Is it logical to put these as sub-equipment within an Equipment Hierarchy with the Locomotive being the superior equipment?
    Should notifications regarding their failure be assigned to the Superior Equipment (Locomotive) or the Sub-Equipment and what are the implications of each? If assigned to sub-systems can I still report on system reliability by "rolling up" the failures?
    Do the measuring points (e.g. kilometres) assigned to the Superior Equipment (Locomotive) accumulate to the Sub-Equipment automatically and does installation and removal from different Locomotives with different kilometres affect this? For example a traction motor in Locomotive 001 is installed from 100,000 to 360,000 locomotive kilometres and then it is installed in Locomotive 002 from 300,000 to the current 340,000 locomotive kilometres. Does SAP accumulate 300,000 kilometres of life to the traction motor sub system based on its superior equipment's measuring points?
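The arithmetic behind the traction motor example can be sketched as follows; this is only an illustration of the intended accumulation (sum of counter deltas per installation interval), not of how SAP PM measurement documents actually store it.

```python
# Hypothetical sketch: accumulate kilometres on a sub-equipment across
# installations in different superior equipments (locomotives). Each
# installation is (counter_at_install, counter_at_removal), read from
# the superior equipment's measuring point.

def accumulated_km(installations):
    """Sum the counter deltas over all installation intervals."""
    return sum(removal - install for install, removal in installations)

# Traction motor from the question: Loco 001 from 100,000 to 360,000 km,
# then Loco 002 from 300,000 to 340,000 km.
intervals = [(100_000, 360_000), (300_000, 340_000)]
print(accumulated_km(intervals))  # prints 300000 (260,000 + 40,000)
```

Whether SAP performs this accumulation automatically on dismantle/install is exactly the question; the reply below addresses the measurement reading transfer option.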
    I have many more questions to come that I will most likely solve through experimentation and future questions, but in order to hit the ground running I would appreciate your feedback if any of you have similar experiences.
    Thanks in advance
    Glenn Sanders

    In SAP, you can define "N" number of hierarchy levels for Functional Locations & Equipment.
    But in reporting, you can get the costs at the Functional Location level and at the Equipment level below it; it is not possible to report at the Sub Functional Location or Sub Equipment level.
    Also, you have mentioned that you will interchange the sub-components between locomotives, and that you want to maintain them using serial numbers. You can define these sub-components as assemblies with serial numbers, so that equipment can be created for those combinations.
    To maintain a measuring point for a sub-component, equipment must be created. In SAP, you have the option to transfer measuring points from one equipment (superior equipment) to another equipment (sub-equipment).
    While interchanging a sub-component from one locomotive to another, you can dismantle & install it without a measurement reading transfer.

  • BSD Sub System Installation?

    I've recently installed 10.3 on my system, downloaded the 10.3.9 upgrade from the site, and installed it according to the instructions. Somewhere along the line I lost or didn't install the BSD subsystem. This is causing a problem with the installation of the Adobe CS software.
    The question is this: how do I install the BSD sub-system without destroying or changing any of my current system settings?
    Any assistance on this would be much appreciated
    Thanks,
    Ron
    350 Mhz Power PC G4   Mac OS X (10.3.9)  

    Hello Again! You may have to reinstall all of your apps. As Cornelius has said, Pacifist may be able to extract the item from the install disk, but I'm not sure that will work, as it's part of the base OS X system. Since it's part of the OS, doing an archive and install, which only replaces the system, should work and preserve your other apps and data. Before attempting any solution, be sure you have your data backed up. Tom
    Read This

  • Problem : Additional required sub system in use

    Hi, I am having trouble running my program.
    I have 2 DTOLcreatetask calls: one for analog input and the other for analog output.
    The analog inputs are for manipulating data from sensors, and the analog output is for controlling a valve.
    I made 2 buttons for the output, open and close.
    First I run the program and it works fine, but when I press the button an error appears ("Additional required sub system in use").
    Can I run 2 or more DTOLcreatetask calls together?
    Best regards,
    Citras
    Attachments:
    DataAcquisition5.vi ‏106 KB

    1- Where is the timing function for the while loops?
    2- Why two Cases for Closed and Open? Put them in one Case!
    3- Use another architecture, like the State Machine!
    http://expressionflow.com/2007/10/01/labview-queued-state-machine-architecture/
    You can also integrate the Event Structure in the main loop.
    Open LabVIEW and go to New --> From Template --> Design Patterns --> Producer/Consumer Design Pattern (Events)
    Kais Mekacher
    Applications Engineer
    Germany - Munich

  • How do I install all my old programs and data from an old system folder after I have reinstalled the same OSX system after a crash?

    The system is OSX10.5.8 Leopard on a 2009 imac. A new system was installed from the installation disks and the original system saved to a folder.
    I need to use my Adobe programs and rescue my email, iTunes and iPhoto data. Disk Utility indicates that my Time Machine backup disk is damaged, and I don't want to take the risk of having Time Machine erase my hard drive and try to reinstall the exact system that existed at the time of the crash. There were over 650 GB of stored files that I was copying and removing from the drive at the time it crashed. The total size of the original system folder is still about 650 GB.
    I would prefer to go back in Time Machine and rescue only the programs, as most of the files have been copied to external hard drives, but I can't access the backup hard drive from the new version of Time Machine. Nor do I want Time Machine to start copying the new operating system, which would include all the data in the old system folder. Time Machine was working fine at the time of the crash.

    No, the disk was backed up with Time Machine a few hours prior to the crash. I was unable to start the computer when I tried to restart it: I got a grey screen with the spinning disk; after a few minutes the screen would go black and it would reboot continuously, but not load any images or programs. I started the computer from the 10.5.4 installation disks and checked both the Time Machine external hard drive and the iMac internal drive with Disk Utility. Both showed as damaged: the internal drive and permissions were repaired, but the external drive (the Time Machine backup) was damaged and not repairable by Disk Utility. I don't believe the external Time Machine drive was connected to the computer at the time of the crash, as I was copying files to a different hard drive. And I was not having any problems with the TM backup drive prior to the crash.
    I accessed the iMac internal disk by FireWire (as a target disk) and copied as many data files as I had room for on the external hard drives available. And I deleted quite a few files from the iMac internal drive (mostly just JPEGs, duplicate TIFs, etc.; nothing that was used by iPhoto, iTunes or the Mail program).
    Then I installed a new OS X 10.5.4 system from the installation disk, and the old system was moved to a folder on the hard drive. I previously had the option to reinstall the complete system from Time Machine when I connected that drive and booted from the installation disks with the C key depressed. But it didn't seem like a good option, because I was unsure of the condition of that external disk and whether it would be able to reinstall my data correctly once it had erased my internal hard drive.
    I'm considering buying some new external hard drives and backing up the present system to Time Machine (so I'll still have my old data in the old system folder). Then I would try using the old Time Machine backup to reinstall the system from before the crash. That backup would reinstall about 700 GB of data, operating software and programs, which sounds like a lengthy restore. Since I have never used Time Machine to do a full reinstallation (I've only used it for individual files), I'm reluctant to do anything rash.
    I'm a professional designer (with a deadline), but I can still use Illustrator and Photoshop by opening them from the old system folder and saving the files to an external drive. So it's not necessary to do anything hasty, except delete some of the excess art and document files that were causing the computer to run slowly and the Adobe programs to crash when I tried to save my work. I have quite a few books on tape in the iTunes folder, which is probably taking up tons of space, but I don't know where the iTunes files live.
    Thanks for any help. Peggy
    Message was edited by: peggy toole

  • Not able to change the data of test data containers in production system

    Dear All,
    We have created eCATT scripts in the Development SolMan system and moved the transports to the Production SolMan system. The customer wants to change the data in the test data containers and run the scripts in the production system, but we are not able to edit the data.
    The reason may be that transaction SCC4 has the option below set:
    Changes and transports for client-specific objects
    • No changes allowed
    The customer doesn't want to change the above option, but wants to change the test data containers to supply different data and run the eCATT scripts.
    Could you please let me know the solution for this?
    Your help is really appreciated.
    Thanks,
    Mahendra

    eCATT has a feature where you don't need to transport the scripts or test configuration to your target system. You can keep all your scripts and test data in SolMan and run a script in any other system in your landscape using the system data container and target system.
    Maintain production as one of the target systems in the system data container in SolMan, and point to that system while running the script. Change the test data in SolMan to run the script.
    Let me know if you need more information
    thanks
    Venkat

  • Uploading data from non sap system to sap system

    hi to all experts,
    my requirement is to upload data into the SAP system from a non-SAP system. The data is in a flat file, but the problem is that in the flat file the field lengths and field order are not the same as in the SAP system. How can I upload the data into the SAP system?

    hi,
    the data is in a flat file, but the problem is that in the flat file the field lengths and field order are not the same as in the SAP system.
    If the data is in excel sheet use this FM.
    CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
      EXPORTING
        FILENAME                = P_FILE
        I_BEGIN_COL             = 1
        I_BEGIN_ROW             = 1
        I_END_COL               = 17
        I_END_ROW               = 3000
      TABLES
        INTERN                  = ITAB
      EXCEPTIONS
        INCONSISTENT_PARAMETERS = 1
        UPLOAD_OLE              = 2
        OTHERS                  = 3.
    It will read the data from the Excel sheet column-wise and store it in ITAB. You can change the order after reading the file.
    In the case of a text file, use FM GUI_UPLOAD.
    You can change the order of the fields after reading them from the file, but you have to make the lengths compatible with the standard SAP fields.
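The reordering and length-fitting Sachin describes can be sketched in a language-neutral way as follows. The field names and lengths here are invented for illustration; the real layout would come from the SAP target structure.

```python
# Hypothetical sketch: reorder flat-file fields and pad/truncate them
# to the target SAP field lengths before building upload records.
# Field names and widths are assumptions, not a real SAP structure.

csv_order = ["matnr", "plant", "qty"]                     # order in the legacy file
sap_layout = [("matnr", 18), ("qty", 13), ("plant", 4)]   # target order and widths

def to_sap_record(csv_line):
    """Map one comma-separated legacy line to a fixed-length record."""
    values = dict(zip(csv_order, csv_line.split(",")))
    # truncate to the SAP width, then pad with spaces, in SAP field order
    return "".join(values[name][:width].ljust(width) for name, width in sap_layout)

record = to_sap_record("ABC123,1000,5")
```

In ABAP one would do the same thing by moving each file field into the matching component of a structure typed against the SAP target.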
    Regards,
    Sachin

  • Migrate Closed Sales Data from one SAP system to Other

    Hi,
    We have a requirement to migrate Closed Sales Order data from one SAP System to the New SAP system.
    Please share your thoughts on the considerations / dos and don'ts for such cases.
    Appreciate your response.
    Kind Regards,
    Tanuji

    Hi,
    If you want to migrate closed SOs, then first consider why, and from what date (i.e. how far back)?
    Following are the considerations:
    1) You can't keep the creation dates the same as in your previous system; this might lead to incorrect reporting. So you'll have to check whether it's really worth it.
    2) You might want to carry a reference number from the previous system somewhere in the new transactions.
    3) If you want to migrate the deliveries and invoices, then you also have to consider GL migration as well as payments.
    Regards,
    Raghu.

  • Data load from Legacy system to BW Server through BAPI

    Requirements: We have different kinds of legacy systems and a SAP BW server. We want to load all legacy system data into the SAP BW server using BAPIs. Before loading, we have to validate all data. If there is bad or missing data, we have to let the legacy system user/operator know, with a detailed explanation, so they can fix the data in their system. When it is fixed, we have to load the data again.
    Load scenario: We have two options to load data from the legacy systems to the BW server.
    1.     Load data directly from the legacy system to the BW server using a BAPI program.
    2.     The legacy systems' data would be on workstations or a flash drive as a .txt (one line, comma-separated) or .csv file. Load from the .txt/.csv file to the BW server using a BAPI program.
    What do we want in the BAPI program code?
    It will read/retrieve data from the text/csv file and put it into an internal table. The internal table structure would be based on the BAPI InfoObject structure.
    Call the BAPI InfoObject function module ‘BAPI_IOBJ_CREATE’ to create the InfoObject, include all necessary/default components, do the error check, load the data and return the status.
    Could someone help me with sample code, please? I am new to ABAP/BAPI coding.
    Is there any better idea for loading data from a legacy system to the BW server? BTW, we are using BW 3.10. Is there any better option with BI 7.0 to resolve the issue? I appreciate your help.
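The validate-then-report step described in the requirements can be sketched as below. The required fields and the error format are assumptions for illustration; a real implementation would derive the rules from the BW InfoObject definitions.

```python
# Minimal sketch of the validate-before-load step: split incoming rows
# into loadable ones and errors to report back to the legacy operator.
# Field names and rules are illustrative assumptions.

def validate_rows(rows, required=("customer", "amount")):
    """Return (loadable_rows, errors); errors carry line number and reason."""
    good, errors = [], []
    for lineno, row in enumerate(rows, start=1):
        missing = [f for f in required if not row.get(f)]
        if missing:
            errors.append((lineno, f"missing fields: {', '.join(missing)}"))
        else:
            good.append(row)
    return good, errors

rows = [{"customer": "C1", "amount": "100"},
        {"customer": "", "amount": "50"}]   # bad row: no customer
good, errors = validate_rows(rows)
```

Only the rows in `good` would then be passed to the load step; `errors` becomes the detailed explanation sent back to the legacy system operator.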

    my answers:
    1. This is a scenario for a data push into SAP BW. You can only use SOAP-based transfer of data.
    http://help.sap.com/saphelp_nw04/helpdata/en/fd/8012403dbedd5fe10000000a155106/frameset.htm
    (here for BW 3.5, but you'll find something similar for 7.0)
    In this scenario you'll have an RFC dynamically created for every InfoSource you need to transfer data to.
    2. You can make a process chain for each data load, and call the RFC "RSPC_API_CHAIN_START" to start the chain from outside.
    The second solution is simpler and available on every release.
    Regards,
    Sergio

  • Loading of transaction data from SAP ECC system failed

    Hi!
    I successfully connected SAP ECC system to SAP BI system.
    The following steps have been executed:
    - user ALEREMOTE with max. authorization
    - RFC destination
    - Distributing Data model
    - Generated Partner profile
    - Maintaining message types in WE20
    Now when I try to load any data from the SAP ECC system, the loading process hangs in status "yellow" and never completes.
    [0FI_AR_4|http://www.file-upload.net/view-1447743/0FI_AR_4.jpg.html]
    The following steps within Load process are yellow:
    Extraction (messages): Missing messages
      Missing message: Request received
      Missing message: Number of sent records
      Missing message: Selection completed
    Transfer (IDocs and TRFC): Missing messages or warnings
      Request IDoc : Application document posted (is green)
      Data Package 1 : arrived in BW ; Processing : 2nd processing step not yet finished
      Info IDoc 1 : sent, not arrived ; Data passed to port OK
      Info IDoc 2 : sent, not arrived ; Data passed to port OK
      Info IDoc 3 : sent, not arrived ; Data passed to port OK
      Info IDoc 4 : sent, not arrived ; Data passed to port OK
    Subseq. processing (messages) : Missing messages
        Missing message: Subseq. processing completed
        DataStore Activation (Change Log) : not yet activated
    Question:
    Can some one give me some technical steps (tcode, report) to solve this problem?
    Thank you very much!
    Holger

    Hi!
    Many thanks for your answer.
    Via BD87 on the BW system I detected that all the IDocs (type RSRQST) were received from the SAP ECC system.
    Via tcode SM58 I could not detect any entries.
    However, the loading status from yesterday is set to "red".
    The errors are:
    Extraction (messages): Missing messages
    Data Package 1 : arrived in BW ; Processing : 2nd processing step not yet finished
    Info IDoc 1 : sent, not arrived ; Data passed to port OK
    Info IDoc 2 : sent, not arrived ; Data passed to port OK
    Can you investigate my issue again?
    Thank you very much!

  • How can I allow a sub-vi to run independent of the main program once it has been called while still sending data to the sub-vi

    I have a main program where I call a sub-VI. In this sub-VI there is a while loop that is used to wait for commands. While the while loop is running, I cannot continue with normal operation of the main program. I would like the sub-VI to run independently once it has been called, without holding up the main program. As well, I still need to be able to send data to the sub-VI.

    One way is to use VI Server, which has been mentioned by others. This will allow you to start another VI (by name) and run it entirely independently of the calling VI. This is a good way to start various independent VIs from a main menu, for example. None of the VIs thus called need have any connection to the others.
    Another way is to have the SubVI in a separate while loop on the calling VI's BD. Then, use a local var to start this sub VI from the main loop. The calling VI sets a local START bit and continues running. The sub VI's while loop watches for this START bit to go true, and then runs the Sub VI. The advantage here is that one can more easily pass arguments to the SubVI when it is started, using local vars, which are preferable to globals. Once the SubVI is running, however, you must use a global Stop bit, set in the calling VI, to stop it when the calling VI exits, or the calling VI will hang, waiting for the Sub VI to close and exit its while loop.
    If you need an example of this, email me. ([email protected]). I can also recommend Gary W. Johnson's excellent book which discusses this. ("LabVIEW Graphical Programming", 2nd Ed).
    Note: Where possible, I try to call a subvi from within the main VI, wait till it is done, then continue. It avoids the use of locals & globals, and results in cleaner code, with fewer "race" conditions. However, the main VI stops until the subVI is done, hence one should make the subVI modal.
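The start/stop-bit pattern described above is LabVIEW-specific, but the same idea can be sketched in text form. Here is a rough Python analogue (all names are illustrative, not LabVIEW API): a worker thread stands in for the SubVI's while loop, a queue carries data from the "main VI", and an event plays the role of the global Stop Bit.

```python
import queue
import threading

def sub_vi(data_q: queue.Queue, stop: threading.Event, results: list) -> None:
    # Stands in for the SubVI's while loop: wait for data until the
    # caller sets the stop bit.
    while not stop.is_set():
        try:
            item = data_q.get(timeout=0.1)  # poll so the stop bit is noticed
        except queue.Empty:
            continue
        results.append(item * 2)            # placeholder "processing"
        data_q.task_done()

data_q: queue.Queue = queue.Queue()
stop = threading.Event()
results: list = []

worker = threading.Thread(target=sub_vi, args=(data_q, stop, results))
worker.start()                  # the "SubVI" now runs on its own

for n in (1, 2, 3):             # the main loop keeps running and sends data
    data_q.put(n)

data_q.join()                   # wait until everything sent was processed
stop.set()                      # the global "Stop Bit" set by the caller
worker.join()                   # let the "SubVI" exit its while loop
print(results)                  # → [2, 4, 6]
```

As in the LabVIEW case, forgetting to set the stop flag before joining would hang the caller forever, which is exactly the failure mode described above.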

  • Error while extracting data from a remote system

    Hi,
    I am facing a problem while extracting data from a remote system. The connection is alright; I can extract the required table from the remote system, but when I deploy it I get this error:
    ORA-04052: error occurred when looking up remote object [email protected]@ORACLE_UBN_15_LOCATION1
    ORA-00604: error occurred at recursive SQL level 1
    ORA-28000: the account is locked
    ORA-02063: preceding line from UBNDW@ORACLE_UBN_15_LOCATION1
    Here, Scott.demo1 is the table, UBNDW is the SID of the remote system, and ORACLE_UBN_15_LOCATION1 is the location. Please help me out with this.
    Thanks
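In an ORA error stack like the one above, the line just before ORA-02063 ("preceding line from ...") is the error raised on the remote side, so the root cause here is ORA-28000: the account on UBNDW is locked and its DBA needs to unlock it. A small hypothetical helper (not part of any Oracle client library) to pull the codes out of such a stack in order:

```python
import re

def ora_codes(error_text: str) -> list:
    # Collect ORA-NNNNN codes in the order they appear in the stack.
    return re.findall(r"ORA-\d{5}", error_text)

stack = """\
ORA-04052: error occurred when looking up remote object
ORA-00604: error occurred at recursive SQL level 1
ORA-28000: the account is locked
ORA-02063: preceding line from UBNDW
"""
print(ora_codes(stack))
# → ['ORA-04052', 'ORA-00604', 'ORA-28000', 'ORA-02063']
```

Reading the stack bottom-up like this usually gets you to the actionable error faster than the outermost ORA-04052.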

    Hi,
    IDocs need to be processed manually, either in OLTP or in BW, depending on the failure. The error message in the monitor status will take you to either BW or OLTP whenever there is a problem. Process the IDocs; this will restart the left-over packets and finish the load.
    Check the IDocs in transaction WE05 and note their status (51, 52, 53, and so on), then go to WE19 and re-execute the existing IDoc so that it loads successfully.
    Also go to ST22 and look at the short dump error message.
    Post if you need any further information.
    Thanks,
    Shreya

  • The value in flexfield context reference web bean does not match with the value in the context of the Descriptive flexfield web bean BranchDescFlex. If this in not intended, please go back to correct the data or contact your Systems Administrator for assistance

    Hi ,
    We have enabled a context-sensitive DFF in the Bank Branch page for the HZ_PARTIES DFF. We have created a Flex Map so that only bank branch context fields are displayed in the Bank Branch page, and since the party information DFF is shared by the Supplier and Customer pages, we did not want to see any bank branch fields or context information on those pages.
    We have achieved the requirement, but when we open existing branches, the bank branch update throws the error message below:
    "The value in flexfield context reference web bean does not match with the value in the context of the Descriptive flexfield web bean BranchDescFlex. If this in not intended, please go back to correct the data or contact your Systems Administrator for assistance."
    This error is thrown only when we open existing branches; if we save an existing branch and then open it, no error message is thrown.
    Please let us know reason behind this error message.
    Thanks,
    Mruduala

    Have you read the error message?
    Quote:
    java.sql.SQLSyntaxErrorException: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
    Check the trigger and it should work again.
    Timo
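Timo's suggestion amounts to finding out why the trigger is invalid and recompiling it. As a sketch, this hypothetical helper (not any Oracle API; the names PMS and PROJECT_SEQ come from the error message above) builds the two statements a DBA would typically run:

```python
def revalidation_statements(owner: str, trigger: str) -> list:
    # Hypothetical helper: builds the SQL a DBA might run to diagnose
    # and re-validate an invalid trigger such as PMS.PROJECT_SEQ.
    return [
        # 1. See why compilation failed (ALL_ERRORS is a standard
        #    Oracle data dictionary view):
        f"SELECT line, text FROM all_errors "
        f"WHERE owner = '{owner}' AND name = '{trigger}' AND type = 'TRIGGER'",
        # 2. After fixing the source, recompile:
        f"ALTER TRIGGER {owner}.{trigger} COMPILE",
    ]

stmts = revalidation_statements("PMS", "PROJECT_SEQ")
print(stmts[1])  # → ALTER TRIGGER PMS.PROJECT_SEQ COMPILE
```

Once the trigger compiles cleanly, the ORA-04098 on insert should disappear.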

Maybe you are looking for

  • How to create Insert & Update on master-detail form JPA/EJB 3.0

    Are there any demonstrations or tips on how to insert records on a master-detail form for JPA/EJB 3.0 with ADF binding?

  • Converting AVI DV to QuickTime DV. Do you lose quality?

    I posted this in the FCE forum but wonder if any DV gurus in here might have a definite answer. I have some dv streams in avi format that were captured using Sony Vegas from DV tapes. The files suffixed .avi will not open in Quicktime, but I f

  • Icons in the addon bar have moved from the right hand corner to the left

    Hi everybody, I've recently updated to the most current Firefox 4b10. I'm currently using the addon bar on the bottom that holds several addon-related icons (like firebug etc.). Those do still work, but for some reason they have moved from the right-

  • IMac Retina - External Monitor support

    Does anyone know where I might find some information about support for an additional monitor for the new iMac Retina?  I currently have a 32" Dell 4K monitor, will I be able to utilize it? Thanks Ralf

  • No disc menu templates

    I'm new to video editing and have just installed Premiere Elements 7, and I have no menu templates in the disc menu section.  Should they be there? Where can I get them?  Thanks