Problem with getting a unique set of records: too many rows returned

This is a query that joins a table to itself for two different reasons.
The d alias selects the records that are not voided, and the lmc alias selects the records that are the voids.
I have to identify the valid records to be voided and then mark them accordingly, but I am getting repeating values:
SELECT DISTINCT
       d.CHARGE_ID,
       lmc.CHARGE_ID AS CHARGE_ID_THAT_VOIDED_ME,
       ROW_NUMBER() OVER (PARTITION BY d.CHARGE_ID
                          ORDER BY lmc.SOURCE_SYSTEM_RECORD_ID DESC) AS rown
FROM   CHARGES d
INNER JOIN CHARGES lmc
       ON d.ENCOUNTER_ID = lmc.ENCOUNTER_ID
WHERE  d.TRANSACTION_TYPE_CODE   = 0   -- not voided
AND    lmc.TRANSACTION_TYPE_CODE = 1   -- void records that have limited values to tell me what to void
AND    d.SOURCE_SYSTEM = 'MM'          -- this is my source system
AND    (d.IMPUTED_SB_MAP + lmc.IMPUTED_SB_MAP) = 0  -- sadly, one of those values is the dollar amount; the problem occurs when there are two voids of the same value
CHARGE_ID CHARGE_ID_THAT_VOIDED_ME ROWN
10138466 10138459 1
10138467 10138460 1
10138468 10138462 1
10138468 10138461 2
10138469 10138462 1
10138469 10138461 2
10138470 10138463 1
As you can see, I do not get a proper one-to-one pairing of each charge to be voided with the charge ID that voided it.
I am having a real issue with this one.
Please help

Many looked at this and were not able to help. By all means, I hold nothing against anyone. Thankfully I never give up, and this is what we did:
1. A table called CHARGES_N_VOIDS_STG was created.
2. In the declaration section of my procedure:
cursor c_voids is select * from CHARGES_N_VOIDS_STG;
Then, after the BEGIN:
-- If there were any voids in these clean charges, then you must set the valid
-- record that each one is offsetting to a transaction type code of 2.
EXECUTE IMMEDIATE 'TRUNCATE TABLE CHARGES_N_VOIDS_STG';
--Insert the voids that have not been processed.
--If CHARGE_ID_I_VOIDED is null then it has not been processed.
INSERT INTO CHARGES_N_VOIDS_STG
(   CHARGE_ID_Void,
    IMPUTED_SB_MAP_Void,
    BATCH_ID_Void,
    RECORD_SEQ_Void,
    ENCOUNTER_ID_Void,
    SERVICE_CODE_Void)
SELECT DISTINCT
    v.CHARGE_ID,
    v.IMPUTED_SB_MAP,
    v.BATCH_ID,
    v.RECORD_SEQ,
    v.ENCOUNTER_ID,
    v.SERVICE_CODE
FROM   CHARGES v
WHERE  v.TRANSACTION_TYPE_CODE = 1
AND    v.SOURCE_SYSTEM = 'MM'
AND    v.CHARGE_ID_I_VOIDED IS NULL;
--For each void record in the temp table..
FOR r_void IN c_voids LOOP
    --Update the stage table records with a valid charge to be voided
    UPDATE CHARGES_N_VOIDS_STG
    SET    CHARGE_ID_Valid =
           (   SELECT MAX(CHARGE_ID) AS CHARGE_ID
               FROM   CHARGESDS.CHARGES
               WHERE  ENCOUNTER_ID = r_void.ENCOUNTER_ID_Void
               AND    CHARGE_ID_THAT_VOIDED_ME IS NULL
               AND    (IMPUTED_SB_MAP + r_void.IMPUTED_SB_MAP_Void) = 0
               AND    SERVICE_CODE = r_void.SERVICE_CODE_Void)
    WHERE  BATCH_ID_Void = r_void.BATCH_ID_Void
    AND    RECORD_SEQ_Void = r_void.RECORD_SEQ_Void
    AND    ENCOUNTER_ID_Void IN
           (   SELECT ENCOUNTER_ID
               FROM   CHARGESDS.CHARGES
               WHERE  ENCOUNTER_ID = r_void.ENCOUNTER_ID_Void
               AND    CHARGE_ID_THAT_VOIDED_ME IS NULL
               AND    (IMPUTED_SB_MAP + r_void.IMPUTED_SB_MAP_Void) = 0);
    --Update the charge with the record's charge ID that voided it, change the transaction
    --type code to 2, and set last modified and last modify process ID.
    UPDATE CHARGES
    SET    CHARGE_ID_THAT_VOIDED_ME = r_void.CHARGE_ID_Void,
           TRANSACTION_TYPE_CODE = 2,
           LAST_MODIFIED = SYSDATE,
           LAST_MODIFY_PROCESS_ID = pCHGDS_PROCESS_ID
    WHERE  CHARGE_ID IN
           (   SELECT MAX(CHARGE_ID) AS CHARGE_ID
               FROM   CHARGESDS.CHARGES
               WHERE  ENCOUNTER_ID = r_void.ENCOUNTER_ID_Void
               AND    CHARGE_ID_THAT_VOIDED_ME IS NULL
               AND    (IMPUTED_SB_MAP + r_void.IMPUTED_SB_MAP_Void) = 0
               AND    SERVICE_CODE = r_void.SERVICE_CODE_Void);
END LOOP; --For each void in the temp table..
--Update the voids in CHARGES with a value that indicates the void has been processed.
UPDATE CHARGES
SET    CHARGE_ID_I_VOIDED =
       (   SELECT CHARGE_ID_Valid
           FROM   CHARGES_N_VOIDS_STG
           WHERE  CHARGES.CHARGE_ID = CHARGE_ID_Void)
WHERE  EXISTS
       (   SELECT CHARGE_ID_Valid
           FROM   CHARGES_N_VOIDS_STG
           WHERE  CHARGES.CHARGE_ID = CHARGE_ID_Void);
We had no choice but to find the voids first and then, for each one, find only one record to void that matched our criteria. Some of you may think the client should have supplied more data to make it easier to find each record, but some clients are what they are, and you have to make the best of it.
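For completeness: the same one-to-one pairing can also be done in a single statement by numbering the live charges and the voids independently within each matching partition (encounter, service code, amount) and joining on equal row numbers, so each void claims exactly one charge. A sketch of the idea, using SQLite (3.25+ for window functions) and simplified, hypothetical column names rather than the real CHARGES layout:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE charges (
    charge_id INTEGER PRIMARY KEY,
    encounter_id INTEGER,
    service_code TEXT,
    transaction_type_code INTEGER,  -- 0 = live charge, 1 = void
    imputed_sb_map REAL             -- dollar amount; a void carries the negated amount
);
INSERT INTO charges VALUES
    (10138461, 1, 'X', 1, -50.0),   -- two voids of the same value...
    (10138462, 1, 'X', 1, -50.0),
    (10138468, 1, 'X', 0,  50.0),   -- ...and two matching live charges
    (10138469, 1, 'X', 0,  50.0);
""")

pairs = con.execute("""
WITH live AS (
    SELECT charge_id, encounter_id, service_code, imputed_sb_map,
           ROW_NUMBER() OVER (PARTITION BY encounter_id, service_code, imputed_sb_map
                              ORDER BY charge_id) AS rn
    FROM charges WHERE transaction_type_code = 0
),
voids AS (
    SELECT charge_id, encounter_id, service_code, imputed_sb_map,
           ROW_NUMBER() OVER (PARTITION BY encounter_id, service_code, imputed_sb_map
                              ORDER BY charge_id) AS rn
    FROM charges WHERE transaction_type_code = 1
)
-- rn = rn forces a one-to-one pairing even when amounts repeat
SELECT live.charge_id, voids.charge_id AS charge_id_that_voided_me
FROM live
JOIN voids ON  voids.encounter_id = live.encounter_id
           AND voids.service_code = live.service_code
           AND voids.imputed_sb_map + live.imputed_sb_map = 0
           AND voids.rn = live.rn
ORDER BY live.charge_id
""").fetchall()
print(pairs)  # [(10138468, 10138461), (10138469, 10138462)]
```

The same shape works in Oracle; how you order within each partition (here by charge_id) decides which void is matched to which charge, but every pairing is distinct, which is what the looping solution above enforces procedurally.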
Thanks and You are welcome
NAMASTE

Similar Messages

  • Result set does not fit; it contains too many rows

    Dear All,
    We are on BI 7, running reports in Excel 2007. Even though the row limit in Excel 2007 is more than one million, when I try to execute a report with more than 65k records of output, the system generates output for only 65k rows, with the message "Result set does not fit; it contains too many rows".
    Our Patch levels:
    GUI - 7.10
    Patch level is 11
    Is there any way to generate more than 65000 rows in Bex?
    Thanks in advance...
    regards,
    Raju
    Dear Gurus,
    Could you please shed some light on this issue?
    thanks and regards,
    Raju
    Edited by: VaraPrasadraju Potturi on Apr 14, 2009 3:13 AM

    Vara Prasad,
    This has been discussed on the forums - for reasons of backward compatibility I do not think BEx supports more than 65,000 rows. I am still not sure, since I have not tried a query with more than 65K rows in Excel 2007, but I think this is not possible...

  • Spmetal generation problem when a list "choice type" field has too many Values

    Hello,
    in a SharePoint list I have a Choice column called City where the user should be able to pick a city. I am using SPMetal to generate my entity context. All enum values receive an integer based on 2^n.
    However, after the maximum int value SPMetal uses 0 as the identifier (see code below):
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Macao")]
    Macao = 1073741824,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Beijing")]
    Beijing = -2147483648,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Tianjin")]
    Tianjin = 0,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Chongqing")]
    Chongqing = 0,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Shanghai")]
    Shanghai = 0,
    This is causing problems while using LINQ. For example, in my LINQ result, the city field of a list entry always shows "Chongqing" even if its real value is "Shanghai".

    The LINQ result seems OK after I modified the INT value as follows:
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Macao")]
    Macao = 1073741824,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Beijing")]
    Beijing = 1073741825,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Tianjin")]
    Tianjin = 1073741826,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Chongqing")]
    Chongqing = 1073741827,
    [Microsoft.SharePoint.Linq.ChoiceAttribute(Value="Shanghai")]
    Shanghai = 1073741828,
    Can anyone help to confirm if this is a real solution?
    Thank you. 
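    The broken values above are exactly what wrapping 2^n into a signed 32-bit integer produces. A quick illustration in plain Python (a simulation of C# int overflow, not SPMetal itself):

```python
def to_int32(value):
    # Truncate an arbitrary integer to a signed 32-bit value,
    # the way a C# int silently wraps.
    value &= 0xFFFFFFFF
    return value - (1 << 32) if value >= (1 << 31) else value

# Choice number n gets the flag 2**n; trouble starts at the 32nd choice.
print(to_int32(1 << 30))  # 1073741824  (last positive flag, like Macao)
print(to_int32(1 << 31))  # -2147483648 (wraps negative, like Beijing)
print(to_int32(1 << 32))  # 0           (and every later choice is 0 too)
```

    This matches the generated listing: every choice past the 32nd truncates to 0, so distinct cities collide on the same enum value, which is why renumbering the values by hand changes the LINQ results.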

  • Data set created out of csv file: columns not recognized, too many rows

    After uploading a csv file that I got out of the OPP, the columns don't get recognized in the dataset viewer. Perhaps this happens because the delimiter is a semicolon instead of a comma? Unfortunately, I have no influence over how the data gets exported out of the OPP. Is there any option to configure which delimiter to recognize, as is possible in Excel?
    Further, is it possible to configure from which row on the data should be taken into account, similar to Excel? With the csv file I get out of the OPP, the data only starts at row 7, while row 6 has the headings.

    After uploading a csv file that I got out of the OPP, the columns don't get recognized in the dataset viewer. Perhaps this happens because the delimiter is a semicolon instead of a comma? Unfortunately, I have no influence over how the data gets exported out of the OPP. Is there any option to configure which delimiter to recognize, as is possible in Excel?
    The delimiter cannot be configured. Comma is the delimiter. I would suggest that you open this file in text editor then replace semicolon with comma. After that you can reupload your dataset.
    Further, is it possible to configure from which row on the data should be taken into account, similar to Excel? With the csv file I get out of the OPP, the data only starts at row 7, while row 6 has the headings.
    This is not configurable as well.

  • How do I solve problem that safari cannot open because of too many redirects?

    How do I fix problem that safari can't open because of too many redirects?

    Try clearing the browser cache and delete any cookies for that site. If that doesn't resolve it, then that's a problem with the web-site itself, so unless it is your own website, you can't fix it.
    Often it's a bug in the website code that detects what device/browser you are using and it gets stuck in a re-direct loop. You could contact the website developers and inform them.

  • Special Ledger Roll up: too many records created in receiver table

    Hello,
    When I execute a roll-up from one special ledger to another (leaving out a number of fields), 4 records (postings) are selected, but 92 records (4 x 23) are created in the receiver table, i.e. 88 records too many. For each selected record in the sender table, 7 records contain only 0 amounts and have an incorrect RPMAX value (starting at 16, up to 336); the 8th record has an amount and the correct RPMAX value (352). In both the sender and receiver ledger, the fiscal year variant Daily Balance is assigned (= each day is a period). Any hint on how to prevent the empty records?

    Installing the patch, re-importing the SAP Provisioning framework (I selected 'update'), and recreating the jobs didn't yield any result.
    When examining pass 'ReadABAPRoles' of job 'AS ABAP - Initial Load' -> tab 'source', there are no scripts used.
    After applying the patch we decided to verify the scripts (sap_getRoles, sap_getUserRepositories) in our Identity Center against those of 'Note 1398312 - SAP NW IdM Provisioning Framework for SAP Systems', and they are different.
    The file sizes of SAP Provisioning Framework_Folder.mcc for SP3 Patch 0 and Patch 1 are also exactly the same.
    Opening file SAP Provisioning Framework_Folder.mcc with WordPad and searching for 'sap_getRoles':
    <GLOBALSCRIPT>
    <SCRIPTREVISIONNUMBER/>
    <SCRIPTLASTCHANGE>2009-05-07 08:00:23.54</SCRIPTLASTCHANGE>
    <SCRIPTLANGUAGE>JScript</SCRIPTLANGUAGE>
    <SCRIPTID>30</SCRIPTID>
    <SCRIPTDEFINITION> ... string was too long to copy
    paste ... </SCRIPTDEFINITION>
    <SCRIPTLOCKDATE/>
    <SCRIPTHASH>0940f540423630687449f52159cdb5d9</SCRIPTHASH>
    <SCRIPTDESCRIPTION/>
    <SCRIPTNAME>sap_getRoles</SCRIPTNAME>
    <SCRIPTLOCKSTATE>0</SCRIPTLOCKSTATE>
    -> Script last change 2009-05-07 08:00:23.54 -> that's no update!
    So I assume the updates mentioned in Note 1398312 aren't included in SP3 Patch 1. I manually replaced the current scripts with those of the note and re-tested: no luck, same issue.
    Thanks again for the help,
    Kevin

  • Problems setting the recording path

    I have recorded hundreds of songs successfully by setting the recording path to the project folders on my external hard drive. Now, whenever I open a new project and go to record an audio file, it asks me every time to set the recording path. I pick the location on my external drive, click save, then hit the record button, and the whole process starts all over.

    Does your drive have a Sleep mode (or something like that) preference? After a certain time of inactivity it goes to sleep.
    Might be that...
    A

  • How to divide a set of records into groups in SQL itself

    Hi, I am using Oracle 10.2.4.0.
    I have one requirement in which I have to divide a set of records into certain groups, so that they can be executed in parts rather than in one run.
    So in the SELECT clause itself I want to assign a particular value (say 1) to the first 50000 records, then another value (say 2) to the next 10000, and so on. The total count of records will also vary from time to time; if the total count of the record set is less than 10000, it should assign '1' to all the records. I will set the group values (1, 2, 3, ...) as another column.
    Can you please let me know if this can be done in SQL without going for PL/SQL?

    Hi,
    That's called a Pagination Query, and here's one way to do it:
    WITH got_grp AS
    (
        SELECT  x.*
        ,       CEIL ( ROW_NUMBER () OVER (ORDER BY x_id)
                     / 50000
                     )   AS grp
        FROM    table_x x
    --  WHERE   ...      -- If you need any filtering, put it here
    )
    SELECT  *            -- Or list the columns you want
    FROM    got_grp
    WHERE   grp = 1
    ;
    ROW_NUMBER () OVER (ORDER BY x_id) assigns unique integers 1, 2, 3, ... to all rows, in the same order as x_id (even if x_id is not unique).
    CEIL (ROW_NUMBER () OVER (ORDER BY x_id) / 50000) maps the 1st 50,000 of those numbers to 1, the 2nd 50,000 to 2, and so on.
    Analytic functions (like ROW_NUMBER) are computed after the WHERE clause is applied, so, to use the results in a WHERE clause, you need to compute them in a sub-query. If you just want to display the number, and not use it in a WHERE clause, then you don't need a sub-query.
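    The grouping arithmetic can be sanity-checked outside the database; a small sketch in plain Python, with the group size shrunk to 3 so the pattern is visible (math.ceil stands in for Oracle's CEIL):

```python
import math

def grp(row_number, group_size):
    # Mirrors CEIL(ROW_NUMBER() / group_size):
    # rows 1..n -> group 1, rows n+1..2n -> group 2, and so on.
    return math.ceil(row_number / group_size)

groups = [grp(rn, 3) for rn in range(1, 8)]
print(groups)  # [1, 1, 1, 2, 2, 2, 3]
```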
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
    In the case of a DML operation (such as UPDATE) the sample data should show what the tables are like before the DML, and the results will be the contents of the changed table(s) after the DML.
    Explain, using specific examples, how you get those results from that data.
    Always say what version of Oracle you're using.
    See the forum FAQ {message:id=9360002}

  • Set maximum record time?

    Can I set maximum record time in Logic 8? Recording an hour-long live program unsupervised, & I'd like it to stop afterwards.
    (I don't know any problem with continuing recording for hours until the hard drive is full? But it's not necessary.)
    thanks

    You can use the Auto-Punch In/Out function to set Logic to stop recording automatically.
    Also, set the bar value for the project end (in the transport bar, under the tempo).
    How long you can record depends on your HD capacity, but also on the file format (AIF, WAV, CAF...), the bit depth, and the sample rate;
    each format has its own limit.
    Refer to the user manual and check for the best file format for your purpose.

  • ESB Process reads a Batch file for a limited set of records only..?

    Dear All,
    I have an ESB process that reads a batch file (csv) that has around 10,000 products' information.
    I have another BPEL process that creates the products in Oracle Apps through a standard API (a main BPEL process that calls a child process for creating each product).
    I am invoking the BPEL process from ESB, and that works fine.
    Now, the issue is: I am able to create only around 10 products at a time (the main process calls the child process in a loop). The main process instance is not getting created, but the child process instance is created for a set of records; after that, the child process instances stop being created. The main process instance could not be found in the console. I don't understand why the process is not visible and why it does not complete.
    What could be the problem here? Do I need to change any environment configurations?
    Please update...
    Many Thanks in advance...

    Does this apply to you?
    https://social.technet.microsoft.com/Forums/en-US/9ccd8e50-b0af-4f58-9787-6435834e4c52/dfsr-not-working-on-new-windows-2008r2-server
    Thanks for the link, but no - not the same scenario, although the error is the same. The RGs I'm working with are all in sync and communication is working; it's just a matter of getting the backlog reported correctly.
    To reiterate, I can paste two versions of the exact same command into the DOS window buffer: one copied from OneNote and one copied from my batch file. Executing the one from OneNote succeeds and reports the RG in sync, while the one copied from the batch file fails.
    I can repeat the results by pressing up-arrow once to retrieve the command pasted from the batch file and seeing it fail, then up-arrow twice to retrieve the command pasted from OneNote and seeing it report correctly (illustrated in the graphic).
    Let me add that the command in the batch file was originally copied from OneNote and pasted into the batch file; it's as if going into the batch file somehow corrupts it.
    - a -

  • WLI problem when processing a high number of records - SQLException: Data e

    Hi
    I'm having some trouble with a process in WLI when processing a high number of records from a table. I'm using WLI 8.1.6 and Oracle 9.2.
    The exception I'm getting is:
    javax.ejb.EJBException: nested exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.
    I think the problem is not with the table because it's pretty simple. I'll describe the steps in the JPD below.
    1) A DBControl checks to see if the table has records with a specific value in a column.
    select IND_PROCESSADO from VW_EAI_INET_ESTOQUE where IND_PROCESSADO = 'N'
    2) If there is one or more records, we update the column to another value (in other DBControl)
    update VW_EAI_INET_ESTOQUE  set IND_PROCESSADO = 'E' where IND_PROCESSADO = 'N'
    3) We then start a transaction with following steps:
    3.1) A DBControl queries for records in a specific condition
    select
    COD_DEPOSITO AS codDeposito,
    COD_SKU_INTERNO AS codSkuInterno,
    QTD_ESTOQUE AS qtdEstoque,
    IND_ESTOQUE_VIRTUAL AS indEstoqueVirtual,
    IND_PRE_VENDA AS indPreVenda,
    QTD_DIAS_ENTREGA AS qtdDiasEntrega,
    DAT_EXPEDICAO_PRE_VENDA AS dataExpedicaoPreVenda,
    DAT_INICIO AS dataInicio,
    DAT_FIM AS dataFim,
    IND_PROCESSADO AS indProcessado
    from VW_EAI_INET_ESTOQUE
    where IND_PROCESSADO = 'E'
    3.2) We transform all the records found into an XML message (XQuery).
    3.3) We update the same column again, as in #2, to another value:
    update VW_EAI_INET_ESTOQUE set IND_PROCESSADO = 'S' where IND_PROCESSADO = 'E'
    4) The process ends.
    When the table has few records under the specified condition, the process works fine. But if we test it with 25,000 records, the process fails with the exception below, sometimes in step 3.1 and other times in step 3.3.
    Can someone help me please?
    Exception:
    <A message was unable to be delivered from a WLW Message Queue.
    Attempting to deliver the onAsyncFailure event>
    <23/07/2007 14h33min22s BRT> <Error> <EJB> <BEA-010026> <Exception occurred during commit of transaction
    Xid=BEA1-00424A48977240214FD8(12106298),Status=Rolled back. [Reason=javax.ejb.EJBException: nested
    exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length  1.048.576 specified in create table.],numRepliesOwedMe=0,numRepliesOwedOthers= 0,seconds since begin=118,seconds left=59,XAServerResourceInfo[JMS_cgJMSStore]=(ServerResourceInfo[JMS_cgJMSStore]=(state=rolledback,assigned=cgServer),xar=JMS_cgJMSStore,re-Registered =
    false),XAServ erResourceInfo[weblogic.jdbc.wrapper.JTSXAResourceImpl]=(ServerResourceInfo[weblogic.jdbc.wrapper.JTSXAResourceImpl]=
    (state=rolledback,assigned=cgServer),xar=weblogic.jdbc.wrapper.JTSXAResourceImpl@d38a58,re-Registered =false),XAServerResourceInfo[CPCasaeVideoWISDesenv]=
    (ServerResourceInfo[CPCasaeVideoWISDesenv]=(state=rolledback,assigned=cgServer),xar=CPCasaeVideoWISDesenv,re-Registered = false),SCInfo[integrationCV+cgServer]=(state=rolledback),
    properties=({weblogic.jdbc=t3://10.15.81.48:7001, START_AND_END_THREAD_EQUAL=false}),
    local properties=({weblogic.jdbc.jta.CPCasaeVideoWISDesenv=weblogic.jdbc.wrapper.TxInfo@9c7831, modifiedListeners=[weblogic.ejb20.internal.TxManager$TxListener@9c2dc7]}),OwnerTransactionManager=ServerTM[ServerCoordinatorDescriptor=
    (CoordinatorURL=cgServer+10.15.81.48:7001+integrationCV+t3+,
    XAResources={JMS_FileStore, weblogic.jdbc.wrapper.JTSXAResourceImpl, JMS_cgJMSStore, CPCasaeVideoWISDesenv},NonXAResources={})],CoordinatorURL=cgServer+10.15.81.48:7001+integrationCV+t3+): javax.ejb.EJBException: nested exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.
            at com.bea.wlw.runtime.core.bean.BMPContainerBean.ejbStore(BMPContainerBean.java:1844)
            at com.bea.wli.bpm.runtime.ProcessContainerBean.ejbStore(ProcessContainerBean.java:227)
            at com.bea.wli.bpm.runtime.ProcessContainerBean.ejbStore(ProcessContainerBean.java:197)
            at com.bea.wlwgen.PersistentContainer_7e2d44_Impl.ejbStore(PersistentContainer_7e2d44_Impl.java:149)
            at weblogic.ejb20.manager.ExclusiveEntityManager.beforeCompletion(ExclusiveEntityManager.java:593)
            at weblogic.ejb20.internal.TxManager$TxListener.beforeCompletion(TxManager.java:744)
            at weblogic.transaction.internal.ServerSCInfo.callBeforeCompletions(ServerSCInfo.java:1069)
            at weblogic.transaction.internal.ServerSCInfo.startPrePrepareAndChain(ServerSCInfo.java:118)
            at weblogic.transaction.internal.ServerTransactionImpl.localPrePrepareAndChain(ServerTransactionImpl.java:1202)
            at weblogic.transaction.internal.ServerTransactionImpl.globalPrePrepare(ServerTransactionImpl.java:2007)
            at weblogic.transaction.internal.ServerTransactionImpl.internalCommit(ServerTransactionImpl.java:257)
            at weblogic.transaction.internal.ServerTransactionImpl.commit(ServerTransactionImpl.java:228)
            at weblogic.ejb20.internal.MDListener.execute(MDListener.java:430)
            at weblogic.ejb20.internal.MDListener.transactionalOnMessage(MDListener.java:333)
            at weblogic.ejb20.internal.MDListener.onMessage(MDListener.java:298)
            at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:2698)
            at weblogic.jms.client.JMSSession.execute(JMSSession.java:2610)
            at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
            at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)
    Caused by: javax.ejb.EJBException: nested exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.
            at com.bea.wlw.runtime.core.bean.BMPContainerBean.doUpdate(BMPContainerBean.java:2021)
            at com.bea.wlw.runtime.core.bean.BMPContainerBean.ejbStore(BMPContainerBean.java:1828)
            ... 18 more

    Hi Lucas,
    Following is the information regarding the issue you are getting and might help you to resolve the issue.
    ADAPT00519195- Too many selected values (LOV0001) - Select Query Result operand
    For XIR2 Fixed Details-Rejected as this is by design
    I have found that this is a limitation by design; when the values exceed 18000 we get this error in BO.
    There is no fix for this issue, as it's by design. The product has always behaved in this manner.
    Also an ER (ADAPT00754295) for this issue has already been raised.
    Unfortunately, we cannot confirm if and when this Enhancement Request will be taken on by the developers.
    A dedicated team reviews all ERs on a regular basis for technical and commercial feasibility and whether or not the functionality is consistent with our product direction. Unfortunately we cannot presently advise on a timeframe for the inclusion of any ER to our product suite.
    The product group will then review the request and determine whether or not the functionality/feature will be included in a future release.
    Currently I can only suggest that you check the release notes in the ReadMe documents of future service packs, as it will be listed there once the ER has been included
    The only workaround which I can suggest for now is:
    Workaround 1:
    Test the issue by keeping the value of the MAX_Inlist_values parameter at 256 at the designer level.
    Workaround 2:
    The best solution is to combine 'n' queries via a UNION. You should first highlight the first 99 or so entries from the LOV list box and then combine this query with a second one that selects the remaining LOV choices.
    Using UNION between queries is the only possible workaround.
    Please do let me know if you have any queries related to the same.
    Regards,
    Sarbhjeet Kaur

  • L9 - all tracks seem to "software monitor" when set to record! BUG?!

    Hi Guys,
    Not sure if this is another Logic 9 bug or if I am just going crazy!
    Often when recording tracking parts, I need to record three or four tracks at the same time. For example, today I was recording vocals, keyboard and bass.
    I always monitor the vocals through Logic, so the singer has some reverb in real time. However, for zero latency monitoring on piano and bass, I would usually just monitor them straight through the mixer and not through Logic.
    To achieve this in Logic 8, I would press the "software monitoring" button (the green one beside the transport window). I'd also press the orange "Low latency mode" button to make sure the tracks that were software monitored would not have any latency.
    Then I would select record for all tracks I was recording, and I would press the "i" button (orange button for input monitoring) for the tracks that I actually wanted to software monitor. Tracks with "i" pressed (usually just vocals) would monitor through Logic with effects, the rest would just be monitored through the mixer.
    In Logic 9, this isn't working! Even if I only press the "i" button on one track, ALL of the tracks are coming through Logic. It seems to be applying software monitoring to ANY TRACK THAT IS IN RECORD MODE, even if the input button isn't pressed for that track!
    So I either have to have ALL record-enabled tracks software monitored (which induces a little bit of latency for the keys and bass) or NONE (and leave the vocalist without any real-time effects in the mix while he sings).
    Am I missing something here? Is there a new setting somewhere that automatically sets all record-enabled tracks to software monitor mode when the software monitoring button is pressed in the transport bar?
    OR... is this a new bug?! It's a big one if it is!
    I'd love to hear what other people think... or if anyone has a suggestion that might help with this? I might just be missing something obvious here, but I don't think so...
    Thanks heaps guys!
    Mike

    yeloop wrote:
    Hi Guys,
    Not sure if this is another Logic 9 bug or if I am just going crazy!
    Often when recording tracking parts, I need to record three or four tracks at the same time. For example, today I was recording vocals, keyboard and bass.
    I always monitor the vocals through Logic, so the singer has some reverb in real time. However, for zero latency monitoring on piano and bass, I would usually just monitor them straight through the mixer and not through Logic.
    To achieve this in Logic 8, I would press the "software monitoring" button (the green one beside the transport window). I'd also press the orange "Low latency mode" button to make sure the tracks that were software monitored would not have any latency.
    That's not what Low Latency Mode does, it only bypasses high latency inducing plugins, there's a threshold setting in Logic. But that's not the problem...
    Then I would select record for all tracks I was recording, and I would press the "i" button (orange button for input monitoring) for the tracks that I actually wanted to software monitor. Tracks with "i" pressed (usually just vocals) would monitor through Logic with effects, the rest would just be monitored through the mixer.
    That's not how it works here (on version 8.02)
    With software monitoring enabled any track in record mode is passed through Logic's audio engine. Input monitoring only works when a track is -not- record enabled. At least that's how it's working with my RME hardware.
    Perhaps you're thinking of "Auto Input Monitoring", try adding that button to the transport bar and see if that's working the way you expect. Any tracks set to input monitoring and record will pass their signal thru Logic, tracks set to record only, will not. The sequencer has to be running so the vocalist wouldn't hear reverb until you actually start recording.
    pancenter-

  • Problem in Adhoc Query's set operation functionality.

    Hi Experts,
    I am facing a problem executing Adhoc Query's set operation functionality.
    In the Selection tab, the following operations are performed:
    Execute a query and mark it as 'Set A' (say hit list = X).
    Execute another query and mark it as 'Set B' (say hit list = Y).
    In the Set operation tab, the following operations are performed:
    Carry out the operation 'Set A minus Set B',
    which results in Resulting Set = Z.
    Transfer the resulting set 'in hit list' and press the copy resulting set button.
    In the Selection tab, the hit list is populated with Z.
    But when the output button is pressed, I see the 'Y' list and not the 'Z' list.
    Kindly help.
    Thanks.
    Yogesh


  • Same set of Records not in the same Data package of the extractor

    Hi All,
    I have one scenario. While extracting the records from ECC, based on some condition, I want to add some more records into ECC. To be more clear: based on some condition, I want to add additional lines of data by giving APPEND C_T_DATA.
    For eg.
    I have a set of records with the same company code, same contract, same delivery leg and different pricing legs.
    If delivery leg and pricing leg are 1, then I want to add one line of record.
    There will be several records with the same company code, contract, delivery leg and pricing leg. In the extraction logic I will extract with the command i_t_data[] = c_t_data[], then sort by company code, contract, delivery and pricing leg, then DELETE ADJACENT DUPLICATES to get one record; based on this record, with some condition, I will populate the new line of record that my business needs.
    My concern is:
    if the same set of records overshoots the data package size, how do I handle this? Is there any option?
    My data package size is 50,000. Suppose I get a same set of records (same company code, contract, delivery leg and pricing leg) at the 49,999th record, and there are 10 records with the same characteristics; the extraction will then happen in 2 data packages, and the delete-duplicates logic above will go wrong. How can I handle this scenario? Will a delta-enabled function module help me to tackle this? I want to do it only in extraction, as a data source enhancement.
    Anil.
    Edited by: Anil on Aug 29, 2010 5:56 AM

    Hi,
    You will have to do the enhancement of the data source.
    Please follow the below link.
    You can write your logic to add the additional records in the case statement for your data source.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035c402-3d1a-2d10-4380-af8f26b5026f?quicklink=index&overridelayout=true
    Hope this will solve your issue.

  • MSN's latest update is crashing Firefox many times per day. When I try to send the crash report to Firefox, it says it has trouble sending the report. Is anyone else having this problem? Is there a setting that can help?

    Firefox is crashing every time I have my MSN email account up. (Since I use it for business, that's many times per day.) It seems to coincide with hotmail's latest update where the email is checked and downloaded to the inbox automatically. Then when it asks me if I want to send the crash report to Firefox, I select yes and it tells me it had trouble sending the report. Is anyone else having this problem? Is there a setting I can change? I can't seem to keep my email account up for more than 10 minutes.
    p.s if you're reading this Microsoft, it's still not bad enough to make me to go back to Internet Explorer!

    Hi,
    /Users/sarahschadek/Desktop/Safari.app/Contents/MacOS/Safari
    Move the Safari app from the Desktop to the Applications folder.
    Restart your Mac.
    That's why you see this:
    When I try to do the updates, my computer says it is ready, goes through like it is downloading them, and then at the end it says some of the files could not be saved to "/".
    After your restart your Mac, click the Apple  menu (top left in your screen) then click:  Software Update ...
    Carolyn  
