Duplicate IR through parallel processing for automated ERS
Hi,
We got a duplicate IR issue in production when running the parallel processing for the automated ERS job. The issue does not happen every time; it happens only once in a while. For example, it occurred twice in June. What could be the reasons for this issue? On those days the job took more time compared to usual, and we are unable to replicate the same scenario. When I test, the job creates IRs successfully. Please provide the possible reasons for this.
Wow - long post to say "can I use hardware boxes as inserts?" and the answer is yes, and you have been able to for a long time.
I don't know why you're doing some odd "duplicated track" thing... weird...
So, for inserts of regular channels, just stick Logic's I/O plug on the channel. Tell it which audio output you want it to send to, and which audio input to receive from. Patch up the appropriate ins and outs on your interface to your hardware box/patchbay/mixer/whatever and bob's your uncle.
You can also do this on aux channels, so if you want to send a bunch of tracks to a hardware reverb, you'd put the I/O plug on the aux channel you're using in the same way as described above. Now simply use the sends on each channel you want to send to that aux (and therefore hardware reverb).
Note you'll need to have software monitoring turned on.
Another way is to just set the output of a channel or aux to the extra audio outputs on your interface, and bring the outputs of your processing hardware back into spare inputs and feed them into the Logic mix using input objects.
Lots of ways to do it in Logic.
And no duplicate recordings needed...
I still don't understand why the Apple developers didn't think of including such a plug-in, because it could allow amazing routing possibilities. In this case, you could send the audio track to the main output (1-2 or whatever) BUT also to alternate hardware outputs, so you can use a hardware reverb unit plus a hardware delay unit etc. to which the audio track is sent, and then blend the results back in Logic more easily.
You can just do this already with mixer routing alone, no plugins necessary.
Similar Messages
-
Parallel processing for ABAP programs in Process chain.
Hi All,
In one of the process chains, we have added an ABAP program. In the backend, the job runs as "BI_PROCESS_ABAP".
I just want to know: just like with DTPs, can we enable parallel processing for ABAP programs as well? Please suggest.
Thanks.
Hello Jalina,
Also check with BASIS that the memory allocated to run this program has not overflowed, and keep the selections in your ABAP program in small chunks; use variants to run them in parallel or in series.
Thanks
Abhishek Shanbhogue -
Parallel processing for information broadcasting
Hi SDN,
How can we control parallel processing for information broadcasting in BI background management?
Early answer is appreciated.
Thanks in Advance.
Namrata
Hi,
I agree with the above postings.
you can find more details regarding this in below given link
http://help.sap.com/saphelp_nw70/helpdata/en/ef/4c0b40c6c01961e10000000a155106/frameset.htm
hope this helps
Regards,
rik -
The parallel process for MRP.
hi experts
we plan to run total planning with a planning scope as a background job.
while doing that, the system asks about parallel processing for MRP.
what are the Customizing steps and the procedure to set up parallel processing for MRP?
Dear Raj,
With the help of parallel processing procedures, you can significantly improve the runtime of the total planning run.
To process in parallel, you can either select various sessions on the application server or various servers.
Parallel processing runs according to packages using the low-level code logic:
The work package, with a fixed number of materials that are internally defined in the program, is distributed over the individual servers/sessions. Once a server/session has finished processing a package, it starts processing the next package.
If a low-level code is being planned, the servers/sessions that have finished must wait until the last server/session has finished its package to avoid inconsistencies. Then the next low-level code is processed per packages.
The parallel processing procedure is switched on in the initial screen of total planning.
Activities
Define the application server with the number of sessions that can be used:
If you want to define various servers for parallel processing, enter the server with the number of sessions.
If you only want to use one server, but several sessions, enter the application server and the appropriate number of sessions.
Further notes
Parallel processing shortens the time required for calculation, however, it cannot shorten the database time as the system still only operates using one database.
The Customizing Transaction is OMIQ
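The package logic described above can be sketched roughly as follows (a simplified illustration, not SAP code; the material names, session count and package size are made up). Each low-level code is started only once every package of the previous level has finished, which is the waiting behaviour mentioned above:

```python
# Sketch of the MRP total-planning package logic: materials at each
# low-level code are split into fixed-size work packages, the packages
# are distributed across parallel sessions, and all sessions must
# finish one level before the next level starts.
from concurrent.futures import ThreadPoolExecutor

def plan_package(package):
    # placeholder for planning the materials in one work package
    return [f"planned:{m}" for m in package]

def total_planning_run(levels, sessions=3, package_size=2):
    results = []
    for materials in levels:
        packages = [materials[i:i + package_size]
                    for i in range(0, len(materials), package_size)]
        # all packages of one low-level code run in parallel; leaving
        # the with-block waits for every worker, acting as the barrier
        # before the next low-level code is processed
        with ThreadPoolExecutor(max_workers=sessions) as pool:
            for planned in pool.map(plan_package, packages):
                results.extend(planned)
    return results

levels = [["FERT1", "FERT2"], ["HALB1", "HALB2", "HALB3"]]
print(total_planning_run(levels))
# → ['planned:FERT1', 'planned:FERT2', 'planned:HALB1', 'planned:HALB2', 'planned:HALB3']
```

As the sketch shows, parallelism helps within a level, but the barrier between low-level codes is unavoidable, which is why the database time mentioned above cannot be shortened the same way.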
Regards
PSV -
Job fail with Timeout for parallel process (for SID Gener.): 006000
Hello all,
I'm getting the error below and am not able to find any issue on the Basis side. Can anyone please help with this?
Job started
Data package has already been activated successfully (will be skipped)
Process started
Process started
Process started
Process started
Process started
Import from cluster of the data package to be activated () failed
Process 000001 returned with errors
Process 000002 returned with errors
Process 000003 returned with errors
Process 000004 returned with errors
Background process BCTL_4XU7J1JPLOHYI3Y5RYKD420UL terminated due to missing confirmation
Process 000006 returned with errors
Data pkgs 000001; Added records 1-; Changed records 0; Deleted records 0
Log for activation request ODSR_4XUG2LVXX3DH4L1WT3LUFN125 data package 000001...000001
Errors occured when carrying out activation
Analyze errors and activate again, if necessary
Activation of M records from DataStore object CRACO20A terminated
Activation is running: Data target CRACO20A, from 1,732,955 to 1,732,955
Overlapping check with archived data areas for InfoProvider CRACO20A
Data to be activated successfully checked against archiving objects
Parallel processes (for Activation); 000005
Timeout for parallel process (for Activation): 006000
Package size (for Activation): 100000
Task handling (for Activation): Backgr Process
Server group (for Activation): No Server Group Configured
Parallel processes (for SID Gener.); 000002
Timeout for parallel process (for SID Gener.): 006000
Package size (for SID Gener.): 100000
Task handling (for SID Gener.): Backgr Process
Server group (for SID Gener.): No Server Group Configured
Activation started (process is running under user *****)
Not all data fields were updated in mode "overwrite"
Data package has already been activated successfully (will be skipped)
Process started
Process started
Process started
Process started
Process started
Import from cluster of the data package to be activated () failed
Process 000001 returned with errors
Process 000002 returned with errors
Process 000003 returned with errors
Process 000004 returned with errors
Errors occured when carrying out activation
Analyze errors and activate again, if necessary
Activation of M records from DataStore object CRACO20A terminated
Report RSODSACT1 ended with errors
Job cancelled after system exception ERROR_MESSAGE
Thanks for the link TSharma, I will try that today.
UPDATE:
I ran a non-parallel Data Pump and just let it run overnight. This time it finished after 9 hours. In this run I set the STATUS=300 parameter in the PARFILE, which basically echoes STATUS updates to standard out every 300 seconds (5 minutes).
And as before, after 2 hours it had finished 99% of the export and just spat out WAITING status for the last 7 hours until it finished. The remaining tables it exported (a few hundred) were all very small or had zero rows. There is clearly something going on that is not normal. I've done this expdp before on clones of this database, and it usually takes about 2-2.5 hours to finish.
The database is about 415 Gigabytes in size.
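For reference, a Data Pump parameter file of the kind described might look like this (only STATUS=300 is taken from the post; the directory object, file names and schema are placeholder values):

```text
# expdp system/... PARFILE=export.par   -- sketch, names are placeholders
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=full_export.dmp
LOGFILE=full_export.log
SCHEMAS=APPUSER
STATUS=300
```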
I will update what the TRACE finds and I'm also opening a case with MOS. -
Parallel processing for one large message
I have some trouble from a messaging performance perspective.
Sender:ABAP Proxy
Receiver:File Adapter
I'd like to use parallel processing for one large message,
and the file for the receiver needs to end up as one file.
Could you let me know how to set this up?
Best regards,
Koji Nagai
Hi,
Can you elaborate your requirement more?
How are you trying to achieve parallel processing in XI.
Since you mentioned that the source is a proxy, there should be some trigger mechanism, say a selection screen; you can restrict the values there, use the append strategy on the File side, and execute the same.
Regards
Krish -
Parallel processing for increasing the performance
various ways of parallel processing in Oracle, especially using hints.
Please let me know if there exists any online documentation for understanding the concept.
First of all: as a rule of thumb, don't use hints. Hints make programs too inflexible. A hint may be good today but might make things worse in the future.
There are lots of documents available concerning parallel processing:
Just go to http://www.oracle.com/pls/db102/homepage?remark=tahiti and search for parallel (processing)
In my experience with 10g, enabling parallel processing can slow down processing extremely for regular tables. The reason is lots of waits in the coordination of the parallel processes.
If, however, you are using parallel processing for partitioned tables, parallel processing works excellent. In this case, take care to choose the partitioning criterion properly to be able to distribute processing.
If, for example, your queries / DMLs work on data corresponding to a certain time range, don't use the date field as partitioning criterion, since in this case parallel processing might work on just a single partition. Which again would result in massive waits for process coordination.
Choose another criterion that distributes the data to be accessed across at least <number of CPUs - 1> partitions (one CPU is needed for the coordination process). Additionally, consider using parallel processing only in cases where large tables are involved. Compare this situation with writing a book: if you plan to have several people write a (technical) book consisting of just 10 pages, it wouldn't make any sense at all in terms of time reduction. If, however, the book is planned to have 10 chapters, each chapter could be written by a different author, reducing the resulting time to about 1/10 compared to a single author writing all chapters.
To enable parallel processing for a table use the following statement:
alter table <table name> parallel [<integer>];
If you don't use the <integer> argument, the DB will choose the degree of parallelism; otherwise it is controlled by your <integer> value. Remember that you always need a coordinator process, so don't choose <integer> to be larger than <number of CPUs minus 1>.
You can check the degree of parallelism by the degree column of user_/all_/dba_tables.
To do some timing tests, you also can force parallel dml/ddl/query for your current session.
ALTER SESSION FORCE PARALLEL DML/DDL/QUERY [<PARALLEL DEGREE>]; -
Parallel Processing for a single Package
Hi,
I have PKg1 that have mixture of For Each Loop container, DFT's and Seq containers and I want to run more than one thread for this package where i can process data in parallel.
Please let me know how i can create this using SSIS 2012.
Thanks,
Hi,
The DFTs are connected by precedence constraints, and I want to run this package more than once (multiple threads) at a given point in time. Is this possible? If yes, please let me know how I can achieve it.
Thanks..
If the DFTs are connected then there will be no parallel processing at all. Running the same package in parallel will most likely result in a lock. It depends on how it is architected, but with an RDBMS in a default installation, or with files, it is not going to fly.
When you have DFTs with, say, OLE DB destinations, each using its own connection, and they are not connected, then each connection gets opened independently, allowing you to ingress data simultaneously.
Arthur My Blog -
Using Parallel Processing for Collection worklist Generation
We are scheduling the program UDM_GEN_WORKLIST in Background mode with the below mentioned values in the variant
Collection Segment - USCOLL1
Program Control:
Worklist valid from - Current date
Distribution Method - Even Distribution to Collection Specialists
Parallel Processing:
Number of jobs - 1
Package Size - 500.
Problem:
The worklist gets generated, but it drops a lot of customers' items from the worklist when the program is scheduled in background using the above parameters.
Analysis:
- When I run the program UDM_GEN_WORKLIST in online mode, all customers come through correctly on the worklist.
- When I simulate the strategy with the missing customers, it evaluates them high, so there is nothing wrong with the strategy and evaluation.
- I increased the package size to its maximum, but it still doesn't work.
- Nothing looks different in terms of the Collection Profile on the BP master.
- There is always a fixed set of BPs missing from the worklist.
It looks like there is something that I don't know about running these jobs correctly with the parallel processing parameters; any help or insight provided in this matter would be highly appreciated.
Thanks,
Mehul.
Hi Mehul,
I have a similar issue now; for the last couple of days, the WL generation has failed in background mode, although when I run it in foreground processing it completes without any problem.
My question is: would you confirm that you reduced the package size to 1?
So your parameters are: nr of jobs: 1 and package size: 1.
Is that right? Did it completely solve your issue? -
Always have to go through the registering process for my iPods...
Hey Cats,
Since upgrading to OS X (10.4.11), every time I plug in my iPod nano and/or my iPod shuffle, iTunes treats them both as if they are brand-new iPods and tries to get me to register my "new" iPod. I have obliged while online and gone through the process, and still, when I plug either iPod in, iTunes acts as if it is seeing it for the first time. If anyone can offer any guidance on this topic I would appreciate it. Thanks for your time
Contact iTS Customer Support from this link - http://www.apple.com/support/itunes/
MJ -
Parallel processing for program RBDAPP01
Hi All,
I am running the program RBDAPP01 daily, every 30 minutes, to clear the error IDocs (status 51, "Application document not posted"). When I run this job it only clears a few IDocs because of the status message "Object requested is currently locked by user ADMINJOBS". That means when one IDoc is being updated and a second one tries to update at the same time for the same order, same customer, same material and same plant but a different ship-to party, it finds the object locked and cannot be posted.
Can anyone tell me what parallel processing is and whether it will help in my case?
Thanks
You didn't specify which release you use, so I can just give some suggestions:
Note 547253 - ALE: Wait for end of parallel processing with RBDAPP01
Note 715851 - IDoc: RBDAPP01 with parallel processing
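As a side note, one generic way to reduce such lock collisions in any parallel-posting setup (an illustration of the idea, not what RBDAPP01 itself does; the field names and worker count are made up) is to partition the IDocs by their lock key, so that documents for the same order, customer, material and plant are always handled serially by the same worker:

```python
# Sketch: route IDocs that share a lock key to the same worker queue,
# so two parallel workers never post against the same application
# document at the same time.
from collections import defaultdict

def partition_by_lock_key(idocs, workers):
    queues = defaultdict(list)
    for idoc in idocs:
        # the lock key mirrors the collision described above:
        # same order, customer, material and plant
        key = (idoc["order"], idoc["customer"], idoc["material"], idoc["plant"])
        queues[hash(key) % workers].append(idoc["docnum"])
    return dict(queues)

idocs = [
    {"docnum": 1, "order": "A", "customer": "C1", "material": "M1", "plant": "P1"},
    {"docnum": 2, "order": "A", "customer": "C1", "material": "M1", "plant": "P1"},
    {"docnum": 3, "order": "B", "customer": "C2", "material": "M2", "plant": "P2"},
]
print(partition_by_lock_key(idocs, 4))
```

Documents 1 and 2 share a lock key and therefore always land in the same queue, where they are processed one after the other instead of colliding.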
Markus -
Parallel Processing for BI Load
Hi All,
I have a datasource which I migrated from BW 3.x to BI 7.
I am loading the data from the datasource to an ODS.
In the DTP -> Execute tab I can see only 'Serial Extraction
and processing of source package'. I think because of this
I am not able to do parallel processing. I mean, when I try to load data from PSA to ODS by DTP, the data loads package by package (it does not trigger parallel jobs while loading).
Could you please advise why I am not able to see
'Serial Extraction, Immediate parallel processing' in the
Execute tab of my DTP.
Is there anything I need to configure at the DataSource level?
Please help me.
Regards
Santosh
Edited by: santosh on Jun 3, 2008 2:37 AM
Hi, check your extraction tab on the DataSource. I am pretty sure this has something to do with it. This is what the help says for DTP processing:
Processing Mode
The processing mode describes the order in which processing steps such as extraction, transformation and transfer to the target are processed at runtime of a DTP request. The processing mode also determines when parallel processes are to be separated.
The processing mode of a request is based on whether the request is processed asynchronously, synchronously or in real-time mode, and on the type of the source object:
o Serial extraction, immediate parallel processing (asynchronous processing)
A request is processed asynchronously in a background process when a DTP is started in a process chain or a request for real-time data acquisition is updated. The processing mode is based on the source type. -
Parallel processing for compression
Hello Experts,
Is there a way to control the number of parallel processes (background) used when compressing a request in a cube?
Sunil
Hi Sunil,
Kindly have a look at below link, hope this helps.
http://help.sap.com/saphelp_nw70/helpdata/en/c5/40813b680c250fe10000000a114084/content.htm
Regards,
Mani -
How to limit number of parallel processes for a query???
Hi,
I have set the table's parallelism to a degree (4 in my case), and when I run a query on that table I see in v$session that the query is using 8 parallel processes.
Why is my query using all these processes? Can I limit this number? I think this will cause poor performance if all my parallel processes stay BUSY in v$pq_slave.
1 1 P000 BUSY 22 0 8 0 2051 10 2 15 0 79179 76884
2 1 P001 BUSY 22 0 8 0 2054 10 2 15 0 81905 77443
3 1 P004 BUSY 2 0 1592 0 0 0 0 1592 0 1039 3
4 1 P005 BUSY 2 0 1592 0 0 0 0 1592 0 1038 4
5 1 PZ99 BUSY 533 0 0 0 1 3 0 0 0 1071 1107
6 2 P000 BUSY 14 0 8 0 2053 10 3 15 1 53014 73297
7 2 P001 BUSY 14 0 8 0 2048 10 2 15 1 51266 73318
8 2 P002 BUSY 14 0 8 0 2052 10 2 15 2 51043 73271
9 2 P003 BUSY 14 0 8 0 2053 9 2 15 2 49417 73327
10 2 P004 BUSY 13 0 8 0 2055 9 2 15 2 68428 12468
11 2 P005 BUSY 13 0 8 0 2059 10 2 15 1 69968 12473
12 2 PZ99 BUSY 461 0 0 0 1 3 0 0 0 921 936
Tks,
Paulo.
select /*+ PARALLEL(a,4) */ ...... from owner.table a;
Or
ALTER SESSION FORCE PARALLEL DML PARALLEL <degree>;
(But I am not sure whether this will affect the degree of the select also?)
(Note also that a degree of 4 does not cap the slave count at 4: operations that need two slave sets, such as a parallel sort, can use up to 2 x degree processes, which is why a degree-4 query can show 8 busy slaves.) -
Preventing parallel processing for a program using OPEN DATASET
Hi
Has anyone come across a situation where they had to build an inbound interface
(i.e. using OPEN DATASET, scanning all the files in AL11)?
How do you handle programs running in parallel in this situation?
What I mean is that if the same program does an OPEN DATASET and processes the same file in AL11, inconsistencies may occur.
Is there a way to lock a running program so that it triggers an error message?
Hi,
I too faced a similar situation and this is what I implemented.
Whenever your program starts processing a particular AL11 file, create a new flag file in the same directory.
Say my file name is usr\bin\ABC.DAT; before the OPEN DATASET command on this file, I would create a new file usr\bin\ABC.DAT.FLAG.
After the processing of ABC.DAT is done, I delete the flag file.
In the meantime, if some other program (or for that matter the same program) executes in a different session and tries to open the same file, we can check for the existence of the flag file.
So the code is as follows.
CHECK for the existence of the FLAG file.
IF the FLAG file exists, STOP. " another session is processing it
ELSE.
  CREATE the FLAG file.
  OPEN DATASET, and the rest of the processing.
  CLOSE DATASET.
  DELETE the FLAG file.
ENDIF.
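For comparison, here is the same idea as a runnable sketch (the file and flag names follow the example above). One caveat with the check-then-create approach is a small race window between the existence check and the flag creation; opening the flag with an exclusive-create flag makes the check and the create a single atomic step:

```python
# Flag-file locking sketch: process a file only if no other session
# holds its flag file, and always remove the flag when done.
import os

def process_file(path):
    flag = path + ".FLAG"
    try:
        # O_CREAT | O_EXCL fails if the flag already exists, so the
        # "check for flag" and "create flag" happen atomically
        os.close(os.open(flag, os.O_CREAT | os.O_EXCL))
    except FileExistsError:
        return "locked"  # another session is processing this file
    try:
        with open(path) as f:   # stands in for OPEN DATASET ...
            return f.read()     # ... and the actual processing
    finally:
        os.remove(flag)         # always release the lock
```

If the processing session crashes without reaching the cleanup, a stale flag file remains and blocks later runs, so production setups usually also age out old flags; the sketch leaves that out for brevity.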
Hope this helps.
Thanks,
Surya