Stereo bus or aux for parallel compression
Hi. I'm trying to do simple parallel compression on a kit, but for some reason I can't get my bus or aux to be stereo. There seems to be no phase problem, so bouncing and putting the result on a stereo audio track is not an issue. I have unselected "universal track mode", which did give me stereo bus outs on tracks, but my other tracks (guitar, bass, etc.) were knocked out and the aux channel was still in mono. What am I doing wrong? When I put a reverb on an aux it comes up as stereo. Any help please......
I use MOTU's 896HD. I worked out why my mono tracks were knocked out - that was just an output setting. But still, with my bus set to out 1-2 etc., the input on my aux track is mono.....
Similar Messages
-
Is there ANY way to install windows (for parallels) WITHOUT a superdrive?
I have Parallels (build 5600) and Win XP running fine on my iMac.
I want to run it on my MBA, too.
I have successfully installed Parallels (on a new trial license for now), and when I went to install Windows, learned that it can't be done via Remote Disk.
I'd really rather not spend $100 on an external drive JUST for this purpose.
Can I make a disk image of the windows disk?
Or connect to the iMac in target disk mode (or something?!) to use THAT drive?
Or... can I somehow just copy my "virtual machine" from the iMac to the MBA? Not sure if that will make the Windows serial number police freak out, though...
Glancing through a quick search here, it seems it's possible to migrate the virtual machine. But what files, exactly, do I need to migrate? Just the .hdd file?
I'd prefer to do a nice clean install from the CD, but I don't know if that's possible.
Any help is appreciated!
Thanks!

OK, I may have answered my own question.
I am copying my virtual machine from my iMac, but it will take a long time to copy and takes 10 GB of space, so that is a last resort.
In the meantime I found this on the Parallels site. It looks like you CAN use a disk image. I'm going to try it now - fingers crossed!
http://kb.parallels.com/en/4826
I'm using Macbook Air and cannot install Windows to Parallels Virtual Machine by means of CD/DVD drive “borrowing” feature. What should I do?
RESOLUTION
Unfortunately Macbook Air CD/DVD drive “borrowing” feature doesn’t support direct media streaming and thus cannot be used to install Windows (or any other Guest OS ) to Parallels Virtual Machine. To install a Windows VM on Macbook Air you can either create an .iso image of Windows installation disk on some other computer with CD/DVD drive and copy it to Mac or use an external CD/DVD drive that Apple offers for Macbook Air.
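On the computer that has the optical drive, a Mac can make that .iso itself with hdiutil; a command sketch (the volume name and output path below are examples, not from the thread):

```shell
# Create an ISO-format image from the mounted Windows install disc.
# "WINXP" is whatever name the disc appears under in /Volumes.
hdiutil makehybrid -iso -joliet -o ~/Desktop/winxp.iso /Volumes/WINXP
```

Copy the resulting winxp.iso to the MacBook Air and point the Parallels VM's CD/DVD device at the image file.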
Please refer to this article for the instructions on how to create an .iso image for Parallels VM on Mac - http://kb.parallels.com/entry/32/458/ -
Task 0085 for parallel SID assignment terminated with errors
Hi,
While activating the ODS object we are getting the following error
Task 0085 for parallel SID assignment terminated with errors
Can anyone help me with this issue?
Thanks
Sheela Datla

Hi Sheela,
Does the problem happen with activating the ODS object (definition) itself or with activating the data in the ODS? I think there must be other errors apart from Task 0085. At the time of the error you should check SM21 and ST22 for additional error messages and dumps. If the problem is with the activation of data in the ODS, you should also check SM37 for the job that is created for the ODS activation; you may find more detailed information in the job log. You should also check the activation step in the 'Details' tab in the monitor for the load; there may be more detailed information there as well.
In transaction RSCUSTA2 (ODS customizing) you should try the following settings for the activation:
No. of Par. Proc. 3
Min. No. Data Recs. 10000
Wait Time in Sec. 600
Before changing the values in RSCUSTA2, please take note of the current values in case you want to change them back.
Best Regards,
Des. -
Need SM Bus Controller Driver for A75-S231
Unfortunately I am without my system and backup disks, and I am in desperate need of a driver for the SM Bus Controller. The problem surfaced after a re-install of XP Home.
I called Toshiba tech support to identify the motherboard chipset shipped with my unit (specs listed at the end of this post) so I could locate and download the driver. Unbelievably, tech support is not equipped to identify the make or model of my motherboard or identify the chipset. Third-party software tells me I have a Toshiba EDW10 motherboard, and tech searched the internet and told me I have some VIA motherboard and good luck finding your driver.
Please help. I didn't think that Toshiba of all people would be unable to identify their own systems, but here I am 4 hours later bothering you good people and still hunting. Does anyone have the SM Bus Controller driver for the A75-S231, or can anyone tell me what chipset shipped with my unknown motherboard?
My system specs are as follows:
OS Name Microsoft Windows XP Home Edition
Version 5.1.2600 Service Pack 2 Build 2600
System Manufacturer TOSHIBA
System Model Satellite A75
System Type X86-based PC
Processor x86 Family 15 Model 4 Stepping 1 GenuineIntel ~3333 Mhz
Processor x86 Family 15 Model 4 Stepping 1 GenuineIntel ~3333 Mhz
BIOS Version/Date TOSHIBA V1.30, 9/10/2004
SMBIOS Version 2.31
Hardware Abstraction Layer Version = "5.1.2600.2180 (xpsp_sp2_rtm.040803-2158)"
Total Physical Memory 512.00 MB
SM Bus Controller Device ID (from Device Manager) PCI\VEN_1002&DEV_4353&SUBSYS_FF001179&REV_18\3&61AAA01&0&A0
Motherboard Toshiba EDW10???
Thanks for your help.

Hi

As far as I know your unit must be a US notebook model, so please check the Toshiba US download page at http://www.csd.toshiba.com/cgi-bin/tais/su/su_sc_home.jsp
Please don't be mad at tech support; I don't see any reason why they should know every technical detail of your notebook. Toshiba has provided drivers for every hardware component, which you can get anytime you want and install 1000 times if you like. Toshiba guarantees that they will work properly, and I don't see any problem there.
Technical details you can get from Authorized service partners in your country.
Good luck! -
Unable to clone File Adapter receiver channel for parallel processing
Hi Experts,
I am using variable substitution for File - RFC - File with out BPM scenario(using request response, oneway bean).
When I place the file in the sender FTP folder, the file doesn't get picked up, and in communication channel monitoring I get the error 'Unable to clone File Adapter receiver channel for parallel processing'.
Can anybody provide suggestions to solve this error?
Note: without variable substitution, the interface works fine.
Could it be because I am using the source structure field in the response file adapter?

Hi,
In your CC, do you use some additional parameters?
Like the ones in points 47/48 of [OSS Note 821267 - FAQ: XI 3.0 / PI 7.0 / PI 7.1 File Adapter|https://service.sap.com/sap/support/notes/821267]
Maybe there is a conflict between a parallel connection and the bean used to do the async-sync bridge...
Mickael -
We are using VS 2013, and I need to run multiple Coded UI ordered tests in parallel on different agents.
My requirement :
Example: I have 40 Coded UI test scripts in a single solution/project. I want to run them in different OS environments (for example, 5 OSes), so I have created 5 ordered tests with the same 40 test cases.
I have one controller machine and 5 test agent machines. Now I want my tests distributed so that every agent gets 1 ordered test to execute.
Machine_C = Controller (controls Machines 1-5)
Machine_1 = Test Agent 1 (should execute Ordered Test 1; OS: Win 7)
Machine_2 = Test Agent 2 (should execute Ordered Test 2; OS: Win 8)
Machine_3 = Test Agent 3 (should execute Ordered Test 3; OS: Win 2008 Server)
Machine_4 = Test Agent 4 (should execute Ordered Test 4; OS: Win 2012 Server)
Machine_5 = Test Agent 5 (should execute Ordered Test 5; OS: Win 2003 Server)
I have changed the "MinimumTestsPerAgent" app setting value to '1' in the controller's configuration file (QTController.exe.config).
When I run the ordered tests from Test Explorer, every test agent shows an ordered test with status 'running', but of the 5 agents only 2 actually execute test cases. The remaining 3 agents execute nothing yet still show 'running' for a long time (more than 3 hours), after which they stop responding.
I need to know how I can configure my controller or how I can tell it to execute these tests in parallel on different test agents. This will help me reducing the script execution time.
I am not sure what steps I am missing.
It will be of great help if someone can guide me how this can be achieved.
One more thing: can I run one Coded UI ordered test on one specific test agent?
For example: run Ordered Test 1 on the Win 7 machine (Test Agent 1) only.
Thanks in advance.

Hi Divakar,
Thank you for posting in MSDN forum.
As far as I know, we cannot specify that a Coded UI ordered test should run on a specific test agent; it is mainly the test controller that determines which ordered test is assigned to which agent.
Generally, if we want to run multiple Coded UI ordered tests over multiple test agents in parallel using a test controller:
We need to change the MinimumTestsPerAgent property to 1 in the test controller configuration file (QTController.exe.config), as you said.
And then we need to set bucketSize = (number of tests) / (number of machines) in the test settings.
For more information about how to set the bucketSize value, please refer to the following blog.
http://blogs.msdn.com/b/aseemb/archive/2010/08/11/how-to-run-automated-tests-on-different-machines-in-parallel.aspx
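For reference, the controller-side change mentioned above is an appSettings entry in the controller's configuration file; a minimal sketch (only the key name comes from this thread, the surrounding elements are the standard .NET config layout):

```xml
<!-- QTController.exe.config on the test controller machine -->
<configuration>
  <appSettings>
    <!-- Allow the controller to assign as few as 1 test per agent,
         so each ordered test can land on a different agent -->
    <add key="MinimumTestsPerAgent" value="1" />
  </appSettings>
</configuration>
```

The test controller service typically needs a restart after editing the file so the new value is picked up.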
You can also refer to Jack's suggestion to run your Coded UI ordered tests in a lab environment or load test:
https://social.msdn.microsoft.com/Forums/vstudio/en-US/661e73da-5a08-4c9b-8e5a-fc08c5962783/run-different-codedui-tests-simultaneously-on-different-test-agents-from-a-single-test-controller?forum=vstest
Best Regards,
-
Java Proxy Generation not working - Support for Parallel Processing
Hi Everyone,
As per SAP Note 1230721 - Java Proxy Generation - Support for Parallel Processing, when we generate a Java proxy from an interface we are supposed to get 2 archives (one for serial processing and another, suffixed with "PARALLEL", for parallel processing of Java proxies in JPR).
https://websmp230.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/sapnotes/index2.htm?numm=1230721
We are on the correct patch level as per the note; however, when we generate a Java proxy from the IR for an outbound interface, it generates only 1 zip archive (whose name we ourselves provide in the create-new-archive section). This does not enable parallel processing of the messages in JPR.
Could you please help me with this issue and advise how the archives can be generated for parallel processing.
Thanks & Regards,
Rosie Sasidharan.

Hi,
Thanks a lot for your reply, Prateek.
I have already checked SAP Note 1142580 - "Java Proxy is not processing messages in parallel", where they ask to modify the ejb-jar.xml. However, after performing the change in ejb-jar.xml, while building the EAR I get the following error:
Error! The state of the source cache is INCONSISTENT for at least one of the request DCs. The build might produce incorrect results.
Then, going through SAP Note 1142580 again, I realised that SAP Note 1230721 also needs to be looked at, since it is needed for generating the Java proxy from message interfaces in the IR for parallel processing.
Kindly help me if any of you have worked on such a scenario.
Thanks in advance,
Regards,
Rosie Sasidharan. -
Revision: 13916
Author: [email protected]
Date: 2010-02-01 16:42:09 -0800 (Mon, 01 Feb 2010)
Log Message:
Added DynamicDRMTrait, Composite Unit tests for parallel and serial compositions. Bug fixes in CompositeDRMTrait.
Modified Paths:
osmf/trunk/framework/OSMF/org/osmf/composition/CompositeDRMTrait.as
osmf/trunk/framework/OSMFTest/org/osmf/OSMFTests.as
osmf/trunk/framework/OSMFTest/org/osmf/utils/DynamicMediaElement.as
Added Paths:
osmf/trunk/framework/OSMFTest/org/osmf/composition/TestParallelElementDRMTrait.as
osmf/trunk/framework/OSMFTest/org/osmf/composition/TestSerialElementDRMTrait.as
osmf/trunk/framework/OSMFTest/org/osmf/utils/DynamicDRMTrait.as -
Default Exchange rate type at document type level for parallel currencies
Hi,
I have a scenario wherein 2 parallel currencies have been maintained (Grp & index based) against a Co Code and have maintained a default exchange type (Z2) in the Document type settings (OBA7).
But when I post a foreign currency document, the system picks Z2 rates for the company code currency conversion and not for the group and index-based currencies; in fact it picks the M rate for these additional currencies.
I know that for parallel currencies the system will always use the exchange rate type defined in transaction OB22 (the "M" rate), which is at company code level, but I want the default exchange rate type Z2 to be picked at document type level for all the parallel currencies.
Please suggest how to achieve this.
Thanks,
Sam

Dear Sam,

The exchange rate type defined in the FI document type (the field "exchange rate type" in transaction OBA7) is only used for the currency translation from transaction/document currency into the first local currency, not for the translation into the second local (group) currency or into the third local currency. The exchange rate types for the translation into the second local (group) currency and the third local currency are defined in transaction OB22.

Unfortunately there is no option to change the exchange rate of the 2nd or 3rd local currency in our posting transactions; the design doesn't cover that requirement. But you have the chance to adjust the parallel currencies in OB22 or to change the amounts manually.
I hope this helps.
Mauri -
Gather_Plan_Statistics + DBMS_XPLAN A-rows for parallel queries
It looks like gather_plan_statistics + dbms_xplan displays incorrect A-Rows for parallel queries. Is there any way to get the correct A-Rows for a parallel query?
Version details:
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit, on HP-UX
Create test tables:
-- Account table
create table test_tacprof
parallel (degree 2) as
select object_id ac_nr,
object_name ac_na
from all_objects;
alter table test_tacprof add constraint test_tacprof_pk primary key (ac_nr);
-- Account revenue table
create table test_taccrev
parallel (degree 2) as
select apf.ac_nr ac_nr,
fiv.r tm_prd,
apf.ac_nr * fiv.r ac_rev
from (select rownum r from all_objects where rownum <= 5) fiv,
test_tacprof apf;
alter table test_taccrev add constraint test_taccrev_pk primary key (ac_nr, tm_prd);
-- Table to hold query results
create table test_4accrev as
select apf.ac_nr, apf.ac_na, rev.tm_prd, rev.ac_rev
from test_taccrev rev,
test_tacprof apf
where 1=2;
Run query with parallel dml/query disabled:
ALTER SESSION DISABLE PARALLEL QUERY;
ALTER SESSION DISABLE PARALLEL DML;
INSERT INTO test_4accrev
SELECT /*+ gather_plan_statistics */
apf.ac_nr,
apf.ac_na,
rev.tm_prd,
rev.ac_rev
FROM test_taccrev rev, test_tacprof apf
WHERE apf.ac_nr = rev.ac_nr AND tm_prd = 4;
SELECT *
FROM TABLE (DBMS_XPLAN.display_cursor (NULL, NULL, 'ALLSTATS LAST'));
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers | OMem | 1Mem | Used-Mem |
|* 1 | HASH JOIN | | 1 | 30442 | 23412 |00:00:00.27 | 772 | 1810K| 1380K| 2949K (0)|
| 2 | TABLE ACCESS FULL| TEST_TACPROF | 1 | 26050 | 23412 |00:00:00.01 | 258 | | |
|* 3 | TABLE ACCESS FULL| TEST_TACCREV | 1 | 30441 | 23412 |00:00:00.03 | 514 | | |
ROLLBACK ;
A-rows are correctly reported with no parallel.
Run query with parallel dml/query enabled:
ALTER SESSION enable PARALLEL QUERY;
alter session enable parallel dml;
insert into test_4accrev
select /*+ gather_plan_statistics */ apf.ac_nr, apf.ac_na, rev.tm_prd, rev.ac_rev
from test_taccrev rev,
test_tacprof apf
where apf.ac_nr = rev.ac_nr
and tm_prd = 4;
select * from table(dbms_xplan.display_cursor(null,null,'ALLSTATS LAST'));
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers | OMem | 1Mem | Used-Mem |
| 1 | PX COORDINATOR | | 1 | | 23412 |00:00:00.79 | 6 | | | |
| 2 | PX SEND QC (RANDOM) | :TQ10001 | 0 | 30442 | 0 |00:00:00.01 | 0 | | | |
|* 3 | HASH JOIN | | 0 | 30442 | 0 |00:00:00.01 | 0 | 2825K| 1131K| |
| 4 | PX BLOCK ITERATOR | | 0 | 30441 | 0 |00:00:00.01 | 0 | | | |
|* 5 | TABLE ACCESS FULL | TEST_TACCREV | 0 | 30441 | 0 |00:00:00.01 | 0 | | |
| 6 | BUFFER SORT | | 0 | | 0 |00:00:00.01 | 0 | 73728 | 73728 | |
| 7 | PX RECEIVE | | 0 | 26050 | 0 |00:00:00.01 | 0 | | | |
| 8 | PX SEND BROADCAST | :TQ10000 | 0 | 26050 | 0 |00:00:00.01 | 0 | | | |
| 9 | PX BLOCK ITERATOR | | 0 | 26050 | 0 |00:00:00.01 | 0 | | | |
|* 10 | TABLE ACCESS FULL| TEST_TACPROF | 0 | 26050 | 0 |00:00:00.01 | 0 | | | |
rollback;
A-Rows are zero except for the final step.

I'm sorry for posting the following long test case.
But it's the most convenient way to explain something. :-)
Here is my test case, which is quite similar to yours.
Note on the difference between "parallel select" and "parallel dml(insert here)".
(I know that Oracle implemented the PSC (parallel single cursor) model in 10g, but the details of the implementation are quite a mystery, as Jonathan said... )
SQL> select * from v$version;
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
SQL>
SQL> alter system flush shared_pool;
System altered.
SQL>
SQL> alter table t parallel 4;
Table altered.
SQL>
SQL> select /*+ gather_plan_statistics */ count(*) from t t1, t t2
2 where t1.c1 = t2.c1 and rownum <= 1000
3 order by t1.c2;
COUNT(*)
1000
SQL>
SQL> select sql_id from v$sqlarea
where sql_text like 'select /*+ gather_plan_statistics */ count(*) from t t1, t t2%';
SQL_ID
bx61bkyh9ffb6
SQL>
SQL> select * from table(dbms_xplan.display_cursor('&sql_id',null,'allstats last'));
Enter value for sql_id: bx61bkyh9ffb6
PLAN_TABLE_OUTPUT
SQL_ID bx61bkyh9ffb6, child number 0 <-- Coordinator and slaves shared the cursor
select /*+ gather_plan_statistics */ count(*) from t t1, t t2 where t1.c1 = t2.c
1 and rownum <= 1000 order by t1.c2
Plan hash value: 3015647771
PLAN_TABLE_OUTPUT
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers | OMem | 1Mem | Used-Mem |
| 1 | SORT AGGREGATE | | 1 | 1 | 1 |00:00:00.62 | 6 | | | |
|* 2 | COUNT STOPKEY | | 1 | | 1000 |00:00:00.62 | 6 | | | |
| 3 | PX COORDINATOR | | 1 | | 1000 |00:00:00.50 | 6 | | | |
| 4 | PX SEND QC (RANDOM) | :TQ10002 | 0 | 16M| 0 |00:00:00.01 | 0 | | | |
|* 5 | COUNT STOPKEY | | 0 | | 0 |00:00:00.01 | 0 | | | |
|* 6 | HASH JOIN BUFFERED | | 0 | 16M| 0 |00:00:00.01 | 0 | 1285K| 1285K| 717K (0)|
| 7 | PX RECEIVE | | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
| 8 | PX SEND HASH | :TQ10000 | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
| 9 | PX BLOCK ITERATOR | | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
|* 10 | TABLE ACCESS FULL| T | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
| 11 | PX RECEIVE | | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
| 12 | PX SEND HASH | :TQ10001 | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
| 13 | PX BLOCK ITERATOR | | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
|* 14 | TABLE ACCESS FULL| T | 0 | 10000 | 0 |00:00:00.01 | 0 | | | |
38 rows selected.
SQL>
SQL> select sql_id, child_number, executions, px_servers_executions
2 from v$sql where sql_id = '&sql_id';
SQL_ID CHILD_NUMBER EXECUTIONS
PX_SERVERS_EXECUTIONS
bx61bkyh9ffb6 0 1
8
SQL>
SQL> insert /*+ gather_plan_statistics */ into t select * from t;
10000 rows created.
SQL>
SQL> select sql_id from v$sqlarea
where sql_text like 'insert /*+ gather_plan_statistics */ into t select * from t%';
SQL_ID
9dkmu9bdhg5h0
SQL>
SQL> select * from table(dbms_xplan.display_cursor('&sql_id', null, 'allstats last'));
Enter value for sql_id: 9dkmu9bdhg5h0
PLAN_TABLE_OUTPUT
SQL_ID 9dkmu9bdhg5h0, child number 0 <-- Coordinator Cursor
insert /*+ gather_plan_statistics */ into t select * from t
Plan hash value: 3050126167
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
| 1 | PX COORDINATOR | | 1 | | 10000 |00:00:00.20 | 3 |
| 2 | PX SEND QC (RANDOM)| :TQ10000 | 0 | 10000 | 0 |00:00:00.01 | 0 |
| 3 | PX BLOCK ITERATOR | | 0 | 10000 | 0 |00:00:00.01 | 0 |
|* 4 | TABLE ACCESS FULL| T | 0 | 10000 | 0 |00:00:00.01 | 0 |
SQL_ID 9dkmu9bdhg5h0, child number 1 <-- Slave(s)
insert /*+ gather_plan_statistics */ into t select * from t
PLAN_TABLE_OUTPUT
Plan hash value: 3050126167
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
| 1 | PX COORDINATOR | | 0 | | 0 |00:00:00.01 | 0 |
| 2 | PX SEND QC (RANDOM)| :TQ10000 | 0 | 10000 | 0 |00:00:00.01 | 0 |
| 3 | PX BLOCK ITERATOR | | 1 | 10000 | 2628 |00:00:00.20 | 16 |
|* 4 | TABLE ACCESS FULL| T | 4 | 10000 | 2628 |00:00:00.02 | 16 |
SQL>
SQL> select sql_id, child_number, executions, px_servers_executions
2 from v$sql where sql_id = '&sql_id'; <-- 2 child cursors here
SQL_ID CHILD_NUMBER EXECUTIONS
PX_SERVERS_EXECUTIONS
9dkmu9bdhg5h0 0 1
0
9dkmu9bdhg5h0 1 0
4
SQL>
SQL> set serveroutput on
-- check mismatch
SQL> exec print_table('select * from v$sql_shared_cursor where sql_id = ''&sql_id''');
Enter value for sql_id: 9dkmu9bdhg5h0
SQL_ID : 9dkmu9bdhg5h0
ADDRESS : 6AD85A70
CHILD_ADDRESS : 6BA596A8
CHILD_NUMBER : 0
UNBOUND_CURSOR : N
SQL_TYPE_MISMATCH : N
OPTIMIZER_MISMATCH : N
OUTLINE_MISMATCH : N
STATS_ROW_MISMATCH : N
LITERAL_MISMATCH : N
SEC_DEPTH_MISMATCH : N
EXPLAIN_PLAN_CURSOR : N
BUFFERED_DML_MISMATCH : N
PDML_ENV_MISMATCH : N
INST_DRTLD_MISMATCH : N
SLAVE_QC_MISMATCH : N
TYPECHECK_MISMATCH : N
AUTH_CHECK_MISMATCH : N
BIND_MISMATCH : N
DESCRIBE_MISMATCH : N
LANGUAGE_MISMATCH : N
TRANSLATION_MISMATCH : N
ROW_LEVEL_SEC_MISMATCH : N
INSUFF_PRIVS : N
INSUFF_PRIVS_REM : N
REMOTE_TRANS_MISMATCH : N
LOGMINER_SESSION_MISMATCH : N
INCOMP_LTRL_MISMATCH : N
OVERLAP_TIME_MISMATCH : N
SQL_REDIRECT_MISMATCH : N
MV_QUERY_GEN_MISMATCH : N
USER_BIND_PEEK_MISMATCH : N
TYPCHK_DEP_MISMATCH : N
NO_TRIGGER_MISMATCH : N
FLASHBACK_CURSOR : N
ANYDATA_TRANSFORMATION : N
INCOMPLETE_CURSOR : N
TOP_LEVEL_RPI_CURSOR : N
DIFFERENT_LONG_LENGTH : N
LOGICAL_STANDBY_APPLY : N
DIFF_CALL_DURN : N
BIND_UACS_DIFF : N
PLSQL_CMP_SWITCHS_DIFF : N
CURSOR_PARTS_MISMATCH : N
STB_OBJECT_MISMATCH : N
ROW_SHIP_MISMATCH : N
PQ_SLAVE_MISMATCH : N
TOP_LEVEL_DDL_MISMATCH : N
MULTI_PX_MISMATCH : N
BIND_PEEKED_PQ_MISMATCH : N
MV_REWRITE_MISMATCH : N
ROLL_INVALID_MISMATCH : N
OPTIMIZER_MODE_MISMATCH : N
PX_MISMATCH : N
MV_STALEOBJ_MISMATCH : N
FLASHBACK_TABLE_MISMATCH : N
LITREP_COMP_MISMATCH : N
SQL_ID : 9dkmu9bdhg5h0
ADDRESS : 6AD85A70
CHILD_ADDRESS : 6B10AA00
CHILD_NUMBER : 1
UNBOUND_CURSOR : N
SQL_TYPE_MISMATCH : N
OPTIMIZER_MISMATCH : N
OUTLINE_MISMATCH : N
STATS_ROW_MISMATCH : N
LITERAL_MISMATCH : N
SEC_DEPTH_MISMATCH : N
EXPLAIN_PLAN_CURSOR : N
BUFFERED_DML_MISMATCH : N
PDML_ENV_MISMATCH : N
INST_DRTLD_MISMATCH : N
SLAVE_QC_MISMATCH : N
TYPECHECK_MISMATCH : N
AUTH_CHECK_MISMATCH : N
BIND_MISMATCH : N
DESCRIBE_MISMATCH : N
LANGUAGE_MISMATCH : N
TRANSLATION_MISMATCH : N
ROW_LEVEL_SEC_MISMATCH : N
INSUFF_PRIVS : N
INSUFF_PRIVS_REM : N
REMOTE_TRANS_MISMATCH : N
LOGMINER_SESSION_MISMATCH : N
INCOMP_LTRL_MISMATCH : N
OVERLAP_TIME_MISMATCH : N
SQL_REDIRECT_MISMATCH : N
MV_QUERY_GEN_MISMATCH : N
USER_BIND_PEEK_MISMATCH : N
TYPCHK_DEP_MISMATCH : N
NO_TRIGGER_MISMATCH : N
FLASHBACK_CURSOR : N
ANYDATA_TRANSFORMATION : N
INCOMPLETE_CURSOR : N
TOP_LEVEL_RPI_CURSOR : N
DIFFERENT_LONG_LENGTH : N
LOGICAL_STANDBY_APPLY : N
DIFF_CALL_DURN : Y <-- Mismatch here. diff_call_durn
BIND_UACS_DIFF : N
PLSQL_CMP_SWITCHS_DIFF : N
CURSOR_PARTS_MISMATCH : N
STB_OBJECT_MISMATCH : N
ROW_SHIP_MISMATCH : N
PQ_SLAVE_MISMATCH : N
TOP_LEVEL_DDL_MISMATCH : N
MULTI_PX_MISMATCH : N
BIND_PEEKED_PQ_MISMATCH : N
MV_REWRITE_MISMATCH : N
ROLL_INVALID_MISMATCH : N
OPTIMIZER_MODE_MISMATCH : N
PX_MISMATCH : N
MV_STALEOBJ_MISMATCH : N
FLASHBACK_TABLE_MISMATCH : N
LITREP_COMP_MISMATCH : N
PL/SQL procedure successfully completed. -
No destination is currently free for parallel processing - SAPRCK10
Hello ABAPers,
We have scheduled SAP standard program "SAPRCK10" in background processing at our production server on daily basis.
It worked fine until last week, but it is now giving an error that says "No destination is currently free for parallel processing".
Could you please guide us how to resolve this issue ?
Thanks,
vthilagaraj

Dear François Henrotte,

Could you please let me know how to increase the number of processes in RZ04?
Thanks,
vthilagaraj -
SAP job not using all dialog processes that are available for parallel processing
Hi Experts,
The customer is running a job which is not using all the dialog processes that are available for parallel processing. It appears to use up the parallel processes (60) for the first 4-5 minutes of the job and then maxes out at about 3-5 processes for the remainder of the job.
How do I analyze the job to find out the issue from a Basis perspective?
Thanks,
Zahra

Hi Daniel,
Thanks for replying!
I don't believe its a standard job.
I was thinking of starting a trace using ST05 before the job. What do you think?
Thanks,
Zahra -
Troubleshooting the lockwaits for parallel processing jobs
Hi Experts,
I am having difficulty tracing a job which is interfering with a business-critical job.
The job in discussion is using parallel processing and the other jobs running at the time are also using the same.
If I see a lockwait for some process, which may be a dialog process or an update process spawned by these jobs, I have difficulty knowing which one is holding the lock and which one is waiting.
So, is there any way to identify the dialog or update processes that are used for parallel processing by a particular background job?
Please help, as this is business critical and we have high visibility in this area.
Any suggestions will be appreciated.......
Regards
Raj

Hi Raj,
First of all, please indicate if you are using SAP Business One. If yes, then you need to check those locks under SQL Management Studio.
Thanks,
Gordon -
Configuration of Cost threshold for Parallelism
Hi all,
I want to adjust the 'cost threshold for parallelism' on a SQL Server running OLTP databases.
MSDN says that this is an estimated query duration, in seconds.
Is this estimate available in a DMV? I want to look at my server's query workload & assess which queries I want to be considered for parallel processing and which ones I don't.
Some potential candidates include:
dm_exec_query_stats.max_elapsed_time / dm_exec_query_stats.execution_count
I don't want to use this, because it is an actual measurement, whereas the setting is an estimate (on some arbitrary server).
I thought it might be the Estimated Subtree Cost in the query plan? Though I've read this has nothing to do with an estimated number of seconds.
Any assistance greatly appreciated.
Cheers, Clay

Thanks for the replies SQL24 and Ashwin -
I am looking at a server which is struggling to keep up with a substantial increase in volumes.
My top 3 waits are:
- CXPACKET @ 55%
- LCK_M_IX @ 25%
- SOS_SCHEDULER_YIELD @9% (25% signal waits)
The production server has only 2 CPUs - the current MAXDOP and Cost Threshold for Parallelism are the default, 0 and 5.
Though 2 CPUs is a tiny amount of CPU - I'm hesitant to request more until I can prove CPU is struggling (my
In regards to "issues relating to parallelism": my guess (I'm not a DBA) is that too many queries are being parallelised. I'm basing this guess on the high level of CXPACKET waits, the fact that the server primarily handles small transactions (plus some manifesting/grouping of those transactions each afternoon), and the fact that I currently only have 2 CPUs!
To my original question - MSDN states "The cost refers to an estimated elapsed time in seconds required to run the serial plan on a specific hardware configuration"
http://technet.microsoft.com/en-us/library/ms188603(v=sql.105).aspx
It is the 'specific hardware configuration' part of this description that renders all of the actual times (be it a max or an average) that I can obtain from dm_exec_query_stats worthless... for this purpose.
My guess (again) was that it could be compared to the query plan's estimated subtree cost - but before I attempted extraction of this info from query plans, I wanted to confirm.
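For what it's worth, the number the optimizer compares against 'cost threshold for parallelism' is the estimated cost of the serial plan, so pulling StatementSubTreeCost out of the cached plans is a reasonable way to survey candidates. A sketch, not verified against this server (assumes standard SQL Server 2005+ DMVs and the showplan XML schema):

```sql
-- List cached statements by estimated subtree cost, the unit that
-- 'cost threshold for parallelism' is expressed in.
WITH XMLNAMESPACES
  (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT TOP (20)
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       qp.query_plan.value('(//StmtSimple/@StatementSubTreeCost)[1]',
                           'float') AS est_subtree_cost,
       st.text AS sql_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY est_subtree_cost DESC;
```

Statements whose est_subtree_cost sits just above the configured threshold (5 by default) are the ones most likely to flip between serial and parallel plans when the threshold is raised.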
Thanks again. -
Group currency settings for parallel valuation
hi,
We need to go for parallel currency valuation for our SA company code, but I need to know the impacts of changing the existing configuration.
The company code is productive.
Controlling area assignment control is "controlling area same as company code"
Currency setting: currency type 10 is assigned
OB22 is not maintained
Currency setting at client level (SCC4) is SAR
Regards
Sanjith

Hi,
Of course, in case you are changing the valuations after go-live, there could be data inconsistency for additional currency values in reports.
I take it you want to have SAR as the additional currency (LC2).
In OBY6, for your company code, enter SAR in the field Hard Currency and SAVE.
Add SAR as an additional currency in OB22 for your company code, with the settings below.
Curr Type: 40, Valuation BLANK, ER Type: M, Sour Curr: 1, Trs Type Det: 2.....SAVE
Regards,
Srinu