Performance issue while inserting data

I have a .dat file containing the required data for the table. It has almost half a million rows. When I loaded the data using SQL*Loader it took a whopping 70 minutes. How can I load the data faster? What should I include in my control file to speed up the load?
Or is there any other method?
Oracle 11gR2.
I can't use external tables because the data is not of fixed length.
Edited by: Rahul_India on Oct 6, 2012 1:56 AM

True but irrelevant.
If the reason for the 70-minute load is CPU starvation, the load time will remain 70 minutes.
If the reason for the 70-minute load time is that a DBA implemented workload management, the load time will remain 70 minutes.
If the reason for the 70-minute load time is network saturation, the load time will remain 70 minutes.
If the reason for the 70-minute load time is that the SAN bus is saturated, the load time will remain 70 minutes.
If the OP is on a RAC cluster and the load is happening on the wrong node, causing remastering, the load time will remain 70 minutes.
And I could add many dozens more to this list without breaking a sweat.
None of us have any basis for making a recommendation. My intent here is not to be rude, but sending the OP off on a wild goose chase by saying "DIRECT LOAD is faster" (of course it is), or "external tables are faster", or whatever, has no value if you do not first stop and ask the single most important question: "Why is the load taking 70 minutes?" That is a question none of us can answer.
What I am encouraging you to do is stop giving advice when you don't have a single byte of relevant information on which to base a recommendation.
Are you seriously suggesting parallel execution? I do have experience with it. If CPU starvation is the root cause, work through the impact of your suggestion: you will bring the server to its knees begging for mercy. So again, please stop making recommendations until the OP responds with facts.
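
For reference only, and only once the diagnosis above has been done: if the load itself (rather than CPU, network, or I/O) turns out to be the limiting factor, a direct-path load is the usual first experiment. A minimal sketch of a control file and invocation; the file, table, and column names are invented for illustration and do not come from the OP:

OPTIONS (DIRECT=TRUE, ROWS=50000, ERRORS=100)
LOAD DATA
INFILE 'mydata.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( col1
, col2
, col3 DATE "YYYY-MM-DD"
)

Invoked as, for example, sqlldr userid=scott/tiger control=load.ctl log=load.log. With DIRECT=TRUE, SQL*Loader writes formatted blocks above the high-water mark instead of running conventional INSERTs, which is usually faster but takes different locks and bypasses some constraints and triggers, so it is not automatically the right answer.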

Similar Messages

  • Access issues while inserting data in a table in same schema

    Hi All.
    I have a script that first creates and then populates a table. The script used to run fine in the production environment until a few hours ago, but all of a sudden it started throwing an error while inserting data into the table.
    Error message: "Insufficient Privileges".
    Please suggest what the reasons for this kind of error might be.
    Thanks in advance

    Sonika wrote:
    ... all of a sudden, it is popping up an "Insufficient Privileges" error while inserting data into the table. What may be the reasons for this kind of error?
    1) something changed
    2) you are hitting a bug
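    A quick way to check the first possibility is to look at what the executing user can actually do right now (a sketch; run it as the user that executes the script):
    -- privileges currently available to this session (direct grants plus enabled roles)
    SELECT * FROM session_privs ORDER BY privilege;
    -- roles granted to the current user, and whether they are default roles
    SELECT * FROM user_role_privs;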

  • Performance issues while query data from a table having large records

    Hi all,
    I have a performance issue with queries on the mtl_transaction_accounts table, which has around 48,000,000 rows. One of the queries is below:
    SQL ID: 98pqcjwuhf0y6 Plan Hash: 3227911261
    SELECT SUM (B.BASE_TRANSACTION_VALUE)
    FROM
    MTL_TRANSACTION_ACCOUNTS B , MTL_PARAMETERS A  
    WHERE A.ORGANIZATION_ID =    B.ORGANIZATION_ID 
    AND A.ORGANIZATION_ID =  :b1 
    AND B.REFERENCE_ACCOUNT =    A.MATERIAL_ACCOUNT 
    AND B.TRANSACTION_DATE <=  LAST_DAY (TO_DATE (:b2 ,   'MON-YY' )  )  
    AND B.ACCOUNTING_LINE_TYPE !=  15  
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      3      0.02       0.05          0          0          0           0
    Fetch        3    134.74     722.82     847951    1003824          0           2
    total        7    134.76     722.87     847951    1003824          0           2
    Misses in library cache during parse: 1
    Misses in library cache during execute: 2
    Optimizer mode: ALL_ROWS
    Parsing user id: 193  (APPS)
    Number of plan statistics captured: 1
    Rows (1st) Rows (avg) Rows (max)  Row Source Operation
             1          1          1  SORT AGGREGATE (cr=469496 pr=397503 pw=0 time=237575841 us)
        788242     788242     788242   NESTED LOOPS  (cr=469496 pr=397503 pw=0 time=337519154 us cost=644 size=5920 card=160)
             1          1          1    TABLE ACCESS BY INDEX ROWID MTL_PARAMETERS (cr=2 pr=0 pw=0 time=59 us cost=1 size=10 card=1)
             1          1          1     INDEX UNIQUE SCAN MTL_PARAMETERS_U1 (cr=1 pr=0 pw=0 time=40 us cost=0 size=0 card=1)(object id 181399)
        788242     788242     788242    TABLE ACCESS BY INDEX ROWID MTL_TRANSACTION_ACCOUNTS (cr=469494 pr=397503 pw=0 time=336447304 us cost=643 size=4320 card=160)
       8704356    8704356    8704356     INDEX RANGE SCAN MTL_TRANSACTION_ACCOUNTS_N3 (cr=28826 pr=28826 pw=0 time=27109752 us cost=28 size=0 card=7316)(object id 181802)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   SORT (AGGREGATE)
    788242    NESTED LOOPS
          1     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_PARAMETERS' (TABLE)
          1      INDEX   MODE: ANALYZED (UNIQUE SCAN) OF
                     'MTL_PARAMETERS_U1' (INDEX (UNIQUE))
    788242     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_TRANSACTION_ACCOUNTS' (TABLE)
    8704356      INDEX   MODE: ANALYZED (RANGE SCAN) OF
                     'MTL_TRANSACTION_ACCOUNTS_N3' (INDEX)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      row cache lock                                 29        0.00          0.02
      SQL*Net message to client                       2        0.00          0.00
      db file sequential read                    847951        0.40        581.90
      latch: object queue header operation            3        0.00          0.00
      latch: gc element                              14        0.00          0.00
      gc cr grant 2-way                               3        0.00          0.00
      latch: gcs resource hash                        1        0.00          0.00
      SQL*Net message from client                     2        0.00          0.00
      gc current block 3-way                          1        0.00          0.00
    ********************************************************************************
    On a 5-node RAC environment the program completes in 15 hours, whereas on a single-node environment it completes in 2 hours.
    Is there any way I can improve the performance of this query?
    Regards
    Edited by: mhosur on Dec 10, 2012 2:41 AM
    Edited by: mhosur on Dec 10, 2012 2:59 AM
    Edited by: mhosur on Dec 11, 2012 10:32 PM

    CREATE INDEX mtl_transaction_accounts_n0
      ON mtl_transaction_accounts (
                                   transaction_date
                                 , organization_id
                                 , reference_account
                                 , accounting_line_type
                                  )
    /
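    Once the index exists and statistics have been gathered on it, a quick sanity check (just a sketch) is to see whether the optimizer actually picks it up for the problem statement:
    EXPLAIN PLAN FOR
    SELECT SUM (b.base_transaction_value)
      FROM mtl_transaction_accounts b, mtl_parameters a
     WHERE a.organization_id = b.organization_id
       AND a.organization_id = :b1
       AND b.reference_account = a.material_account
       AND b.transaction_date <= LAST_DAY (TO_DATE (:b2, 'MON-YY'))
       AND b.accounting_line_type != 15;
    SELECT * FROM TABLE (DBMS_XPLAN.DISPLAY);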

  • Performance issue while using data type 'STRING'.

    Hello All,
    I have created a table for storing the values of different features that come under a particular country. Since the last field, 'Value field', has to hold text of up to 800 characters for some features, I have used the data type 'String' with a character length of 1000. I am able to store values of up to 1000 characters using this. Also, the table has to hold a large amount of data, and it will keep growing in the future.
    Since I have declared the data type as 'String', I have one doubt: whether this will affect performance. The length of most of the values in my value field is less than 75 characters, and only in some cases does it exceed 700 characters. So my question is whether the 'String' data type will allocate the full length specified in the table for each entry, even though the value entered is shorter than the specified length.
    For example, if the value of my value field is 'Very High Complexity', which is 20 characters long, will the space allocated be 20 characters or still 1000 characters?
    Hope someone can clarify my doubt.
    Thanks In Advance,
    Shino
    Moved to appropriate forum
    Edited by: Rob Burbank on Feb 23, 2009 4:27 PM

    Hi Shino,
    Well, it is possible to store such values using STRING or LCHR in transparent tables. There are some underlying facts here:
    1. You can only have one such field per table.
    2. You cannot view them in the SE11 / SE16 table content browser.
    3. You will need to maintain an additional field for storing the length of the STRING or LCHR field.
    Regarding the performance:
    Even though ABAP allows storing STRING or LCHR type fields in transparent tables, as soon as the length of the field crosses 255 characters it is not advisable to store it directly in the transparent table.
    You should store that field in the knowledge repository and keep only a pointer to the knowledge repository in the transparent table field.
    Anyway, since you have only one field with such a requirement, I would suggest you use STRING instead of LCHR, because with LCHR you must assign a fixed length (like 1000), so even if you store only 20 or 300 characters the system reserves a slot of 1000 characters. This is not the case with STRING, where everything is dynamic.
    The result is that the read time increases in the case of LCHR.
    I hope this answers your question.
    Regards,
    Sagar.

  • Performance issue while extracting data from non-APPS schema tables.

    Hi,
    I have developed an ODBC application to extract data from Oracle E-Business Suite 12 tables.
    e.g. I am trying to extract data from the HZ_IMP_PARTIES_INT table of the Receivables application (the table is in the "AR" database schema) using this ODBC application.
    The performance of extraction (i.e. the number of rows extracted per second) is very low if the table belongs to a non-APPS schema (e.g. HZ_IMP_PARTIES_INT belongs to the "AR" database schema).
    Now if I create the same table (HZ_IMP_PARTIES_INT) with the same data in the "APPS" schema, the extraction performance improves a lot (i.e. the number of rows extracted per second increases a lot) with the same ODBC application.
    The "APPS" user is used to connect to the database in both scenarios.
    Also note that my ODBC application creates multiple threads, and each thread creates its own connection to the database. Each thread extracts different data using a SQL filter condition.
    So, my question is:
    Is the APPS schema optimized for data extraction in some way?
    I will really appreciate any pointer on this.
    Thanks,
    Rohit.

    Hi,
    Is the APPS schema optimized for data extraction? I would say no, as data extraction performance should be the same for all schemas. Do you run the "Gather Schema Statistics" concurrent program for ALL schemas on a regular basis?
    Regards,
    Hussein
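    For reference, a minimal sketch of gathering statistics for one application schema with the generic DBMS_STATS API (in E-Business Suite the supported route is the "Gather Schema Statistics" concurrent program, which wraps FND_STATS; the schema name below is just the one from this thread):
    BEGIN
      -- gather optimizer statistics for all tables and indexes owned by AR;
      -- CASCADE => TRUE includes index statistics
      DBMS_STATS.GATHER_SCHEMA_STATS(
        ownname          => 'AR',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
        cascade          => TRUE);
    END;
    /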

  • Performance issue while transferring data from one itab to another itab

    hi experts,
    I have stored all the general material details in one internal table, and the description and valuation details of the material are stored in another internal table, which is a standard table. Now I need to transfer all the data from these two internal tables into one final internal table, but it is taking a lot of time as it has to transfer lakhs of records.
    I have declared the output table as shown below:
    DATA:
      t_output TYPE standard TABLE
               OF type_output
               INITIAL SIZE 0
               WITH HEADER LINE.
    (according to the standard I have to declare it like this, and the two internal tables are declared similarly to the one above)
    Could somebody suggest how I should proceed with this...
    thanks in advance....
    regards,
    Deepu

    Have a look at the following article, which you may find useful:
    Improving performance in nested loops: https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40729734-3668-2910-deba-fa0e95e2c541
    good luck, damian

  • Issue while insert and update data to DB tables

    Hello all,
    I am having an issue while inserting data into a DB table.
    My scenario is DB1 to DB2. I have a sender channel with a SELECT query which fetches data from DB1 and inserts it into DB2.
    The SELECT query fetches the records that were INSERTED into DB1 as well as the records that were UPDATED in DB1, and these need to be inserted/updated in the DB2 table.
    Now the issue is that I am able to insert the records, but I am not able to update the records in the DB2 table due to a primary key violation.
    In my message mapping:
    sender message type is as follows:
    <src_message1>
    ----<row>
    -------<fieldA>
    -------<fieldB>
    -------<fieldC>
    Receiver message type as follows:
    <trgt_message1>
    ----<STATEMENT_1>
    ----<TABLE_NAME>
    ----<ACTION> INSERT
    ----<TABLE>
    ----<ACCESS>
    ----<field1> primary key
    ----<field2>
    ----<field3>
    ----<field4>
    ----<KEY>
    ----<field1>
    ----<field2>
    ----<field3>
    ----<field4>
    My query in the sender channel is: select fieldA, fieldB, fieldC from test_table where createdate = sysdate or updatedate = sysdate
    So it fetches the data from DB1 and inserts it into DB2, but it does not update the existing records in DB2 due to the primary key issue.
    Please suggest how to solve this... will it be solved by using UPDATE_INSERT for the action?
    Best Regards,SARAN

    Hi Nagarjuna,
    I have done the following changes to the target mapping structure:
    1. action as UPDATE_INSERT
    2. in the Access tab, I mapped fieldDate to field4.
    3. in the Key tab, I assigned sysdate to field4.
    But the issue still exists. Could you please check whether my changes above are correct; if they are wrong, please provide the details of what needs to be done.
    Thanks in advance.
    I'm providing the error details again:
    My query in the sender channel is: select fieldA, fieldB, fieldC, fieldDate from TEST_TABLE where fieldDate = sysdate or updatedate = sysdate
    It returns 4 records as follows:
    fieldA  fieldB  fieldC  fieldDate
    1001    EU      1       2011-11-10
    1002    CN      0       2011-11-10
    1003    AP      1       2008-03-15 (already exists in DB2)
    1004    JP      1       2007-04-12 (already exists in DB2)
    The first two records were created today, and for the remaining two records fieldC was updated from 0 to 1 (in DB1).
    While inserting these 4 records into DB2, we get the following error: "java.sql.SQLException: ORA-00001: unique constraint (data.TEST_TABLE_PK) violated".
    Best Regards,SARAN
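    For what it's worth, with UPDATE_INSERT the JDBC receiver first attempts an UPDATE of the row identified by the KEY block and performs an INSERT only when no row was updated. In plain SQL terms the intended effect is roughly a MERGE; the table and column names below are just the hypothetical ones from this thread:
    MERGE INTO test_table t
    USING (SELECT :fieldA AS fieldA, :fieldB AS fieldB,
                  :fieldC AS fieldC, :fieldDate AS fieldDate
             FROM dual) s
       ON (t.fieldA = s.fieldA)  -- the primary key column
    WHEN MATCHED THEN
      UPDATE SET t.fieldB = s.fieldB, t.fieldC = s.fieldC, t.fieldDate = s.fieldDate
    WHEN NOT MATCHED THEN
      INSERT (fieldA, fieldB, fieldC, fieldDate)
      VALUES (s.fieldA, s.fieldB, s.fieldC, s.fieldDate);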

  • Performance issue while generating Query

    Hi BI Gurus.
    I am facing a performance issue while running a query on 0IC_C03.
    It has a (from & to) variable for generating the report for a particular time duration.
    If the (from & to) variable fields are filled, the query runs for a long time and then shows a runtime error.
    If the query is executed without specifying the variable (which is optional), the data is extracted from the beginning to the current date, and the same query takes less time to execute.
    The period then has to be selected manually via the "keep filter value" option. Please suggest how I can solve this error.
    Regards
    Ritika

    Hi Ritika,
    Welcome to SDN.
    You have to check the following runtime segments using transaction ST03N:
    High Database Runtime
    High OLAP Runtime
    High Frontend Runtime
    If it is high Database runtime:
    - check the aggregates, or create aggregates on the cube; this will help you.
    If it is high OLAP runtime:
    - check the user exits, if any.
    - check whether hierarchies are used and fetched at a deep level.
    If it is high Frontend runtime:
    - check whether a very high number of cells and formattings are transferred to the frontend (use "All data" to get the value "No. of Cells"), which causes high network and frontend (processing) runtime.
    For the from and to date variables, create one more set, use it, and try.
    Regs,
    VACHAN

  • ORA-00600: internal error code while inserting data in table

    hi gems..
    I am getting the below error while inserting data into a table:
    *ORA-00600: internal error code, arguments: [kqd-objerror$ ] , , [0], [98], [BIN$sm1O+fYhF1jgRAAhKNYyZA==$0], [], [], [], [], [], [], []*
    I can select from the table absolutely fine, but I cannot insert data (and this is the schema owner, so inserts should work).
    I have checked the alert log; the last few entries look like this:
    <msg time='2011-11-25T03:08:55.763+05:30' org_id='oracle' comp_id='clients'
    type='UNKNOWN' level='16' host_id='ICS167DOR'
    host_addr='10.184.134.139'>
    <txt>Directory does not exist for read/write [oracle/ora11g/app/ora11g/product/11.2.0/dbhome_1/log] [oracle/ora11g/app/ora11g/product/11.2.0/dbhome_1/log/diag/clients]
    </txt>
    </msg>
    please help...thanks in advance
    Edited by: user12780416 on Nov 25, 2011 3:29 AM

    hi...
    Finally I got the solution. I know this problem may occur for other reasons too for different users, but the cause of the error the developers were facing in this case is below:
    They hit the error while importing dumps into the server. At the same time, the application developers reported that they could select from the tables but could not insert any data.
    After listening to this, I suspected a space problem with the SYSTEM tablespace, as it is responsible for storing the data dictionary.
    I checked the free space in the SYSTEM tablespace and found the reason: it had only 0.2% left.
    I told them to issue a resize command for the system01.dbf datafile (allocating 2 GB more) and the problem was resolved.
    Hope this helps. Thanks
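    A minimal sketch of that kind of check and fix (the datafile path and target size below are illustrative only):
    -- how much free space is left in SYSTEM?
    SELECT tablespace_name, ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
      FROM dba_free_space
     WHERE tablespace_name = 'SYSTEM'
     GROUP BY tablespace_name;
    -- grow the SYSTEM datafile
    ALTER DATABASE DATAFILE '/u01/oradata/ORCL/system01.dbf' RESIZE 4G;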

  • Issue while inserting a BDC program in Inbound Proxy.JDBC-- PI-- SAP.

    The scenario is JDBC sender -> SAP PI -> inbound proxy (ECC 6.0).
    The issue is related to the SAP Plant Maintenance module, where we have a requirement to create maintenance items programmatically from (SQL database) legacy data for one of the interfaces.
    Since there are no standard BAPIs, IDocs, or function modules available from SAP for creating the maintenance item, I have written a BDC program, and with the help of a SUBMIT statement in the inbound proxy program I am calling the BDC program to create the maintenance item.
    When I tested the program independently on the proxy side, the maintenance item is created successfully, but when I executed it end-to-end, i.e. SQL database -> SAP PI -> SAP, the message gets stuck in the queue and the queue stops on the SAP ECC side; the status of the message is "scheduled" in the SXMB_MONI transaction of SAP ECC.
    As the message is in the scheduled state, multiple maintenance items are getting created with the same values.
    Has anyone encountered this type of issue while calling a BDC program from an inbound proxy? Please help in fixing this issue. FYI, I am using SAP PI 7.0 with service pack 24 and ECC 6.0.
    cheers,
    Ram

    Raj,
    Thanks for the reply. I have tried registering the queues, but the problem is still the same. The message got stuck in the queue on ECC and shows the message below:
    function module                  StatusText
    SXMS_ASYNC_EXEC                  connection closed (no data)
    I have checked the forums for this particular issue, but no one has provided an answer.
    Thanks and Regards
    Ram

  • Error while insert data using execute immediate in dynamic table in oracle

    Error while inserting data using EXECUTE IMMEDIATE into a dynamic table created in Oracle 11g.
    First the dynamic nested table (op_sample) was created using EXECUTE IMMEDIATE...
    The object type is:
    CREATE OR REPLACE TYPE ASI.sub_mark AS OBJECT (
    mark1 number,
    mark2 number
    );
    t_sub_mark is a collection type of sub_mark:
    CREATE OR REPLACE TYPE ASI.t_sub_mark is table of sub_mark;
    create table sam1(id number, name varchar2(30));
    nested table is created below:
    begin
    EXECUTE IMMEDIATE ' create table '||op_sample||'
    (id number,name varchar2(30),subject_obj t_sub_mark) nested table subject_obj store as nest_tab return as value';
    end;
    Now the data from the sam1 table and the object (subject_obj) are inserted into the dynamic table:
    declare
    subject_obj t_sub_mark;
    begin
    subject_obj:= t_sub_mark();
    EXECUTE IMMEDIATE 'insert into op_sample (select id,name,subject_obj from sam1) ';
    end;
    and got the below error:
    ORA-00904: "SUBJECT_OBJ": invalid identifier
    ORA-06512: at line 7
    then when we tried to insert the data into the dynam_table with the subject_marks object as null,we received the following error..
    execute immediate 'insert into '||dynam_table ||'
    (SELECT

    887684 wrote:
    ORA-00904: "SUBJECT_OBJ": invalid identifier
    ORA-06512: at line 7
    The problem is that your variable subject_obj is not in scope inside the dynamic SQL you are building. The SQL engine does not know your PL/SQL variable, so it tries to find a column named SUBJECT_OBJ in your SAM1 table.
    If you need to use dynamic SQL for this, then you must bind the variable. Something like this:
    EXECUTE IMMEDIATE 'insert into op_sample (select id, name, :bind_subject_obj from sam1)' USING subject_obj;
    Alternatively, you might figure out how to use static SQL rather than dynamic SQL (if possible for your project). In static SQL the PL/SQL engine binds the variables for you automatically.
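    A minimal sketch of that static-SQL alternative, assuming the table name is known at compile time (here the literal name op_sample from this thread):
    DECLARE
      v_subject_obj t_sub_mark := t_sub_mark();  -- empty collection, as in the thread
    BEGIN
      -- static SQL: v_subject_obj is a PL/SQL variable, not a column of sam1,
      -- so the PL/SQL engine binds it automatically
      INSERT INTO op_sample (id, name, subject_obj)
        SELECT id, name, v_subject_obj FROM sam1;
    END;
    /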

  • Performance issue while opening the report

    HI,
    I am working with BO XI R3.1. There is a performance issue while opening a report on the BO Solaris server, but on the Windows server it is comparatively fast.
    We have a few reports which contain 5 fixed prompts and 7 optional prompts.
    Out of the 5 fixed prompts, 3 are static (they contain only 3-4 records) and come from a materialized view.
    We have already used many things to improve report performance, like:
    1) Index Awareness
    2) Aggregate Awareness
    3) Array fetch size - 250
    4) Array bind time - 32767
    5) Login timeout - 600
    The issue is that, before refresh, opening the report itself takes 1.30 min on the BO Solaris server, but the same report takes 45 sec on the BO Windows server. Even when we import it onto other BO Solaris servers it takes the same time as on the old Solaris server (1.30 min).
    When we close the trace on the Solaris server, it takes 1.15 sec. In this initial phase it is not hitting the database much, so why is it taking that much time just to open the report?
    Could you please guide us on where exactly the problem is and how we can improve the report-opening performance? In case the problem is related to the Solaris server, what would it be and how can we rectify it?
    In case any further input is required for the same, feel free to ask me.

    Hi Kumar,
    If this is happening with all the reports, then this issue seems to be due to the firewall or security settings of the Solaris OS.
    Please try lowering the security level in Solaris and test for the issue.
    Regards,
    Chaitanya Deshpande

  • Issue While sending data to Webserver

    Hi All,
    I am facing an issue while sending data to a web server. My scenario is SAP (Proxy) -> XI -> web server. It is an asynchronous scenario. The same communication channel and the same WSDL are used to send many interfaces, but when I send data for one particular interface I get the error "SOAP: response message contains an error XIAdapter/HTTP/ADAPTER.HTTP_EXCEPTION - HTTP 400 Bad Request".
    Few points for reference -
    1. I am sending valid data to the web server (confirmed by the web server guy).
    2. The WSDL is the latest one, and the same WSDL is used at their end to receive the data.
    3. The schema I am using to send the data is also the latest one.
    4. The same payload reaches the web server when sent through the SOAPUI tool.
    5. The same communication channel is used to send data for all the messages. All other messages reach the web server except this one.
    Can anyone please provide me some solution for this issue.
    Thanks in advance.
    Regards,
    Sarat

    Since you are using only one receiver channel for all WSDL operations, I hope you are setting the SOAP action for each message type (operation) using dynamic configuration during mapping, or at module level (in the SOAP receiver channel using the DC bean).
    >> 5. The same communication channel is used to send data for all the messages. All other messages reach the web server except this one.
    Cross-verify all the design and configuration again. Check in MONI the dynamic configuration header for this particular message if you are setting it during mapping.
    >> 4. The same payload reaches the web server when sent through the SOAPUI tool.
    Finally, you can use a sniffer tool to inspect the requests being sent from XI and compare them with the SOAPUI request, which in your case is the one sending the correct request.

  • Error while inserting data into a table.

    Hi All,
    I created a table. While inserting data into the table I am getting an error. It says "Create data Processing Function Module". Can anyone help me with this?
    Thanx in advance
    anirudh

    Hi Anirudh,
    It seems there is already an entry in the table with the same primary key.
    An INSERT statement will give a short dump if you try to insert data with the same key.
    Why don't you use the MODIFY statement to achieve the same result?
    Reward points if this helps.
    Manish

  • Special character issue while loading data from SAP HR through VDS

    Hello,
    We have a special character issue while loading data from SAP HR to IdM, using a VDS and following the standard documentation: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e09fa547-f7c9-2b10-3d9e-da93fd15dca1?quicklink=index&overridelayout=true
    French accented characters (é, à, è, ù) are loaded correctly, but Turkish special ones (like Ş, İ, ł) are transformed into "#" in IdM.
    The question is: does someone know of any special setting to make in the VDS or in IdM for special characters to solve this issue?
    Our SAP HR version is ECC6.0 (ABA/BASIS7.0 SP21, SAP_HR6.0 SP54) and we are using a VDS 7.1 SP5 and SAP NW IdM 7.1 SP5 Patch1 on oracle 10.2.
    Thanks

    We are importing directly into the HR staging area, using the transactions/programs "HRLDAP_MAP", "LDAP" and "/RPLDAP_EXTRACT", then we have a job which extracts data from the staging area to a CSV file.
    So before the import the character appears correctly in SAP HR, but by the time it comes through the VDS into IdM's temporary table, it becomes "#".
    Yes, our data is coming from a Unicode system.
    So, could it be a Java parameter to change / add in the VDS?
    Regards.
