Problem in data extraction using 2LIS_11_VAITM

Hi,
For sales order item details I am using the standard DataSource 2LIS_11_VAITM, and I have not written any code in the customer exit.
For certain dates, some of the Sales Document: Item Data was pulled to BI by the delta upload and some was not. Since sales order items are missing in BI, I identified the affected sales orders in R/3.
The problem occurs only for records created on 04.10.2008, 27.10.2008 and 01.11.2008.
The process chains for 05.10.2008, 28.10.2008 and 02.11.2008, which were supposed to pull the data created on 04.10.2008, 27.10.2008 and 01.11.2008, did not execute because the server was down at the scheduled time, so the error is in the delta upload.
The data was therefore supposed to come with the next delta upload. The deltas on 06.10.2008, 29.10.2008 and 03.11.2008 executed successfully, but not all of the records created on 04.10.2008, 27.10.2008 and 01.11.2008 were pulled to BI; I checked the PSA and only some of the records created on those dates are there.
Please suggest your ideas.
Points will be awarded.

Hi,
Are you loading to a DSO?
If so:
1) Fill the setup table only for the dates on which you missed records.
2) Do a full repair load to the targets for these dates after the setup table is filled.
3) Schedule the normal delta to the further targets from this DSO.
If you are loading to the cube directly:
1) Do a selective deletion from the cube for these dates.
2) Fill the setup table for these dates only.
3) Do a full repair load to the targets for these dates.
In either case there is no need to delete or touch the existing init.
Thanks
Ajeet

Similar Messages

  • Problem with date format using TEXT_CONVERT_XLS_TO_SAP

    I'm using FM TEXT_CONVERT_XLS_TO_SAP to upload an XLS file.
    I have the following problem: the date in the spreadsheet is 01.01.2010, but the result in the internal table after using the FM is 01.jan.2010. What must I do to get 01.01.2010 in the internal table?
    My date format setting in SAP is DD.MM.YYYY.

    Hi,
    What are the type and size of your internal table field? Convert your Excel column to a text field; at the moment it is presumably of type date.
    The best option is to always have the value in Excel in YYYYMMDD (the SAP-internal format) and the internal table field typed as sy-datum.
    After uploading, use a WRITE ... TO statement into a character variable to convert the date according to the user settings. Your current logic may not work if the date setting is different.
    Thanks,
    Vinod.
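    A minimal sketch of the conversion Vinod describes, assuming the internal table field is typed as sy-datum and the value arrives in YYYYMMDD (variable names are illustrative only):

        DATA: lv_date TYPE sy-datum,        " internal date in YYYYMMDD
              lv_text TYPE c LENGTH 10.     " formatted character output

        lv_date = '20100101'.               " value as uploaded from the spreadsheet
        WRITE lv_date TO lv_text.           " formats it according to the user's date settings, e.g. 01.01.2010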

  • Infocube data extraction using ABAP

    Hi,
    We need to extract characteristic values from an InfoCube, but LISTCUBE cannot be used for flexibility reasons.
    Can anybody suggest a way of extracting the data from the InfoCube? I tried using 'RSD_CUBE_DATA_GET' but without much success, and SAP will not support it in later releases.
    If anybody has a template program, could you please send it
    to email id [email protected]?
    Thanks and Regards,
    Arunava

    Hi
    In BW 3.1 you can use function module RSDRI_INFOPROV_READ (see the demo program RSDRI_INFOPROV_READ_DEMO).
    Hope this is of some help.
    Regards,

  • Problem in Data extraction for NEW GL DSO 0FIGL_O10

    Hi ,
    I am facing a problem in the extraction of records from SAP to BW.
    I have installed the Business Content for the New GL DSO 0FIGL_O10.
    When I extract the data from SAP R/3 to this DSO (0FIGL_O10), records are getting overwritten.
    For example, when I go to the Manage option (InfoProvider administration), the transferred records and the added records are not the same; the added records are fewer than the transferred records.
    This is happening because of the key field definition.
    I have 16 characteristics in the key fields, which is the maximum I can have, but in some cases the incoming data is only unique at a finer granularity than that key.
    As a result the data gets added up in the DSO, and hence my balances do not match SAP R/3 for the GL accounts.
    There are 31 characteristics in total in the DataSource (0FI_GL_10), of which only 16 can be included in the key field area.
    Please suggest some solution.
    Regards,
    Nilesh Labde

    Hi,
    For safety, the delta process uses a lower interval setting of one hour (this is the default setting). In this way, the system always transfers all postings made between one hour before the last delta upload and the current time. The overlap of the first hour of a delta upload causes any records that are extracted twice to be overwritten by the after image in the ODS object via the MOVE update. This ensures 100% data consistency in BW.
    But you can achieve your objective in a different manner:
    Create a custom InfoObject ZDISTINCT and populate it in the transformation using ABAP code. In the routine, concatenate the values from different characteristics so that one compounded characteristic is built, and use ZDISTINCT in your DSO as a key.
    Just a thought; maybe it can solve your problem.
    Ravish.
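    A minimal sketch of the body of such a field routine in a BI 7.x transformation, assuming a hypothetical key InfoObject ZDISTINCT; the source field names are illustrative only and must be replaced with the characteristics that actually make your records unique:

        " Field routine for ZDISTINCT:
        " build one compounded key value out of several source characteristics.
        CONCATENATE source_fields-comp_code
                    source_fields-gl_account
                    source_fields-fiscper
               INTO result.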

  • Problem in Data extraction into BI

    Hi Guys,
    I am having a strange problem and am unable to identify the root cause. I am hoping that someone can help me with this issue.
    System Details:
    BI 7.0
    SAP ECC 5.0
    Current Design:
    We have created a DSO in the BI system and created a custom extractor to pull the data from SAP ECC 5.0 into the DSO.
    The custom extractor uses the option "Extraction from SAP Query" with a custom InfoSet. It is a transaction data DataSource.
    Problem Statement:
    When I run transaction RSA3 for this extractor, it returns 1870 records. However, when I run the InfoPackage in the BI system to bring the records into the PSA (and on to the DSO), I only get 823 records.
    Question:
    Why am I seeing this difference in the number of records between RSA3 and the BI system? I am NOT using any selection or filter conditions in the InfoPackage, and I am only loading the data up to the PSA, so I was expecting to see the same 1870 records in the PSA that I see in RSA3. Any idea what is missing?
    Thanks in advance !!

    Hi,
    The InfoPackage ran as a full load and not as a delta load.
    I looked at RSMO and at first glance everything looks fine, but when I look at the Details tab I can see that the overall status shows as RED and it says:
    Overall Status: Errors occurred: or: Missing messages
    I am not clear what this means. I don't see any other nodes in RED; every node is green except the topmost node, which is RED and shows the above message.
    Any idea what the problem could be?

  • Problem in data upload using pricing conditions with sales deal

    hi...
    I have to upload the following fields:
    Condition Type
    Condition Table
    Valid from
    Valid to
    Sales deal
    Amount/Rate
    Currency / %
    Pricing Unit
    Pricing UoM
    Sales organization
    Distribution channel
    Sold to Party
    Material Number
    Material Pricing Grp
    Batch number
    Buying Group of Sold-to
    Customer
    Customer Group
    CustomerHierarchy 01
    CustomerHierarchy 02
    CustomerHierarchy 03
    CustomerHierarchy 04
    CustomerHierarchy 05
    Division
    Sales Order Type
    Sales Document Type
    End user
    Material Group
    Tax Classification Material
    Payer
    Plant
    Price Group
    Price list type
    Pricing reference material
    Prod. Hier -1
    Prod. Hier -2
    Prod. Hier -3
    Prod. Hier -4
    Prod. Hier -5
    Product hierarchy
    Region of Dly Plant
    Sales district
    Sales group
    Sales office
    Sales unit
    Ship-To
    Shipping point
    Buying Group of end user
    Tax classification for customer
    Type of Transaction
    Scale Basis1
    Scale Rate1
    Scale Basis2
    Scale Rate2
    Scale Basis3
    Scale Rate3
    Scale Basis4
    Scale Rate4
    using t-code XK15, and SALES DEAL is the major concern.
    1) First I used RV14BTCI, the standard report for uploading pricing conditions. With this, all fields are updated except SALES DEAL, because it is not present in the structures (like BKOND1, BKOND2, BKOND3, etc.) used by the RV14BTCI program. I searched other structures as well, but SALES DEAL is not there.
    2) Second, I tried to find a function module that contains SALES DEAL and found two FMs, IDOC_INPUT_COND_A and BAPI_PRICES_CONDITIONS, but:
    a) the FM IDOC_INPUT_COND_A is used with ALE where a third party is involved, so it requires control data and status data, which I do not have, so we cannot use it;
    b) the FM BAPI_PRICES_CONDITIONS is also not working for SALES DEAL; the upload fails because some mandatory information related to the sales deal, like sales organisation and distribution channel, is not available in this FM.
    3) To upload this we could use the BDC recording method, but there are almost 15 condition types, and based on these almost 20-25 condition tables per condition type, each with a different screen sequence. If we go for BDC recording we would have to make about 325 recordings, which is not a feasible solution.
    So please give suggestions for this problem, and check my efforts as well; maybe I missed something that could be a solution.
    Thanks in advance to all. Please help.

    Hi Jitendra,
    Go to transaction RSA3 in the source system and check whether you are able to extract the data there.
    If so, replicate the DataSource once in the BW system, activate the DataSource, transfer rules, etc., and try to load it again.
    Hope this solves your problem!
    Regards,
    Pavan

  • Problem during data extraction program (tran - RSA3) in SC system

    Hi Experts,
    We are extracting data from the DataSource 9ALS_SBC through transaction RSA3, using active version 000.
    However, the transaction shows the error below:
    Error when generating program
    Message no. /SAPAPO/TSM141
    Diagnosis
    Generated programs are programs that are generated based on individual data objects, such as planning object structures, planning areas and InfoCubes. These programs are then executed in the transaction. An error occurred during the generation of such a program.
    There are two possible causes:
    The template has been corrupted.
    The object that the template uses to generate the program contains inconsistencies, for instance an InfoCube has not been activated.
    Could someone tell me how to solve this error?
    I have corrected all the inconsistencies, but it still shows the same error.
    Thanks in advance for your help!
    With best regards,
    Jay

    Dear Jay,
    The error can have several causes; it depends on your business scenario and system settings.
    Here are some possible reasons and solutions:
    1.
    The problem can be that the technical DataSource name has changed inside the planning area, i.e. the planning area has a different extract structure assigned than the DataSource.
    This could have happened due to a planning area or datasource transport.
    The best way to repair this is to create a dummy-datasource for the planning area with transaction /N/SAPAPO/SDP_EXTR. This will update the relevant tables with the new datasource structure.
    Please check if the other datasources will work correctly as well. If not you may create similar dummy-datasources for the other aggregates as well.
    Please do not forget to delete the dummy-datasources as well in  /N/SAPAPO/SDP_EXTR.
    2.
    You can try to re-generate the data source at transaction /SAPAPO/SDP_EXTR and recheck the issue.                   
    3.
    Run /sapapo/ts_pstru_gen via transaction se38.         
    In the selection, enter the Planning Object Structure used and select  'Planning area extractor' and set the flags on the 2 bottom checkboxes 'Reset generation time stamp' and 'Generate'. 
    4.
    Maybe the extract structure that is assigned to the planning area does not exist in your system. This can happen at the time of transport.
    Please refer to following content from note 549184:                                                                               
    =======================================================================
    Q4: Why could I have extraction problem after transport of DataSource? 
    A4: DataSources for DP/SNP planning areas depend directly on the                                                                               
    structure of the planning areas. That's why the planning area MUST                                                                               
    ALWAYS be transported with or before the DataSource.                   
    ========================================================================
    To solve this inconsistency please try the following:
    - Reactivate the DataSource on the planning area.
    - Activate all active transfer structures for the source system with program RS_TRANSTRU_ACTIVATE_ALL (transaction SE38).
    I hope these proposals could solve the error.
    Regards,
    Tibor

  • Data extraction using ODS objects

    Hello everybody,
    I am trying to extract CRM data into SAP BW.
    I found a standard ODS object, 0CRMBPKPI, which I think is relevant as I need to extract business partner data from CRM.
    Does anybody know the procedure for extracting data using this ODS object,
    or is there any standard InfoCube for extracting business partner data?
    Thanks in advance.
    Regards,
    Mohan

    Hi mohan!
    Have you checked the SAP Help on this particular ODS?
    http://help.sap.com/saphelp_nw04/helpdata/en/83/19bf78cb902e48bc9efc0a64e59907/frameset.htm
    hope this helps.
    with regards
    ashwin

  • Problem in data export using DisplayTag

    Hello Friends,
    I am getting the following exception when i try to export the data using display tag's built-in facility.
    [2008-02-26 16:54:27,472] WARN  http-7070-Processor22 (BaseNestableJspTagException.java:99  ) - Exception: [.LookupUtil] Error looking up property "mgrname" in object type "java.util.ArrayList". Cause: Unknown property 'mgrname'
    java.lang.NoSuchMethodException: Unknown property 'mgrname'
         at org.apache.commons.beanutils.PropertyUtilsBean.getSimpleProperty(PropertyUtilsBean.java:1122)
         at org.apache.commons.beanutils.PropertyUtils.getSimpleProperty(PropertyUtils.java:408)
         at org.displaytag.util.LookupUtil.getProperty(LookupUtil.java:271)
         at org.displaytag.util.LookupUtil.getBeanProperty(LookupUtil.java:129)
         at org.displaytag.model.Column.getValue(Column.java:124)
         at org.displaytag.export.BaseExportView.doExport(BaseExportView.java:265)
         at org.displaytag.tags.TableTag.writeExport(TableTag.java:1404)
         at org.displaytag.tags.TableTag.doExport(TableTag.java:1356)
         at org.displaytag.tags.TableTag.doEndTag(TableTag.java:1227)
         at org.apache.jsp.WEB_002dINF.jsps.common.tableViewTag_jsp._jspx_meth_displayTag_table_0(tableViewTag_jsp.java:195)
         at org.apache.jsp.WEB_002dINF.jsps.common.tableViewTag_jsp._jspService(tableViewTag_jsp.java:89)
         at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:97)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    There is no problem while displaying the data in table form on the page, but when I try to export it (CSV, Excel, XML) it throws the above exception, which is a bit surprising to me.
    The displaytag-related tags are in a JSP. This JSP is included in a Spring tag handler class by pageContext.include("xyz.jsp"). This tag (defined by the Spring tag handler class) is used in another JSP where the table is displayed. Paging works perfectly, but when I click on export, the exception occurs.
    I am using the followings:
    JDK1.5,Displaytag 1.1 and Spring 1.2.7
    The Actual flow is something like this.
    1)Controller forwards the request to jsp page.
    2)This jsp page uses a custom tag.
    3) Now the control goes to the custom tag handler class, where I set all the data into the request:
       pageContext.getRequest().setAttribute("tableViewTag_data", data);
    4) Then I include the page:
       pageContext.include("/WEB-INF/jsps/common/xyz.jsp");
    5) This xyz.jsp contains the following code:
        <displayTag:table pagesize="10" requestURI="${cmd.metaClass}.htm" name="tableViewTag_data" class="displaytag" decorator="${tableViewTag_options['decorator']}" export="true">
             <displayTag:setProperty name="paging.banner.placement" value="top"/>
             <c:forEach var="property" varStatus="propertyStatus" items="${tableViewTag_columnProperties}">
                  <c:set var="propertyTitle"><fmt:message key="field.${cmd.metaClass}.${property}" /></c:set>
                  <displayTag:column property="${property}" title="${propertyTitle}" />
             </c:forEach>
        </displayTag:table>
    Here I am able to retrieve all the data.
    6) So, in this way the page gets rendered.
    I have also included export filter into web.xml file.
    Hope i have provided all the information.
    I think i haven't made any silly mistake. -:)
    Looking forward to hear from you.
    Thanks
    Vishal Pandya

    Hi,
    expdp and exp are different Oracle export utilities, hence the output file sizes are not the same; that is where the difference comes from.
    No, this is not a problem.
    Since this is not a problem, there is no solution needed.
    Why do you see this as a problem?
    Cheers
    Anurag

  • Problem in Data Extracting from BEx to Webi

    Hi guys, this is Sathish. We have a problem with Webi reports. We created Webi reports on top of BEx queries; BEx shows the correct data, but in the Webi reports some cells do not show the same values as BEx.
    How can we resolve this problem? Please help me.

    We are using BO XI R3.1. The Webi reports were built 3 months ago. Now the problem is that in one particular report some fields show wrong values, e.g. BEx shows 1200545 but the Webi report shows 346432. This happens in only 3 to 4 fields; the remaining fields are correct. We tried to create a new report and it shows the correct values, but that particular report itself keeps showing wrong values.
    Thanks for your reply Stratos
    Sathish
    Edited by: M.V.Sathish on Mar 27, 2011 1:26 PM

  • Master data extraction using RDA???

    Hi all,
    I have an idea of how to extract transaction data at regular intervals using the RDA concept.
    But I have a requirement in my project: is there any way to extract master data at regular intervals using RDA, so that any changes to master data in ECC are immediately available in BW?
    If there is a solution using RDA, please guide me; otherwise please suggest the best solution for this.

    hi Kishore,
    Yes, it is possible to extract master data using the RDA mechanism. Your DataSource has to be RDA-enabled; if it is a standard DataSource, you need the access key to change it.
    For generic DataSources, you can mark the DataSource as "Real-Time enabled".
    Please go through the document in the below link:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/607db21f-11ad-2d10-5594-e82adcd12ca3?QuickLink=index&…
    Hope this helps!
    -Sheen

  • R/3 Master Data Extraction Using Function Module

    Hi Experts,
    Could you please explain why, and how (detailed procedure), to extract R/3 master data using a function module?
    My team manager asked me to extract master data from R/3 for 3 more attributes of InfoObject 0MATERIAL, which is already defined as a data target.
    I would like to know why they are extracting the data using a function module.
    Thanks

    The steps for creating an extractor using a function module:
    1. Create a new function group (if you have not already done so) in SE80.
    2. Copy function module RSAX_BIW_GET_DATA_SIMPLE under a suitable name.
    3. Change the code that populates the data.
    The following table describes the parameters:
    Parameter - Description
    I_REQUNR (import) - BW provides this request identifier. It is a system-generated identifier in the form REQU_XXXXXX. BW uses this same identifier in all function module calls that relate to a single load.
    I_DSOURCE (import) - The name of the generic extractor.
    I_MAXSIZE (import) - The maximum number of records that BW expects in each data packet.
    I_INITFLAG (import) - A Boolean flag that indicates whether this is the initialization (first) call to the function module.
    I_READ_ONLY (import) - A test flag not needed in most extraction scenarios.
    I_T_SELECT (table) - This table holds any selections from the BW InfoPackage. The function module should examine these selections and only return data that matches them.
    I_T_FIELDS (table) - This table holds the fields that BW requests.
    E_T_DATA (table) - The function module fills this table with data records. These records are returned to BW as data packets. This table has the same structure as the extract structure defined in the generic DataSource.
    NO_MORE_DATA (exception) - The function module raises this exception when no more data is available.
    ERROR_PASSED_TO_MESS_HANDLER (exception) - The function module raises this exception if an error occurred during the extraction. It alerts BW to check the error logs.
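    For orientation, a rough sketch of the interface such a copied function module has, taken over from RSAX_BIW_GET_DATA_SIMPLE (the name Z_BIW_GET_DATA is illustrative, and E_T_DATA is typed with the demo structure SFLIGHT, which you replace with your own extract structure):

        FUNCTION Z_BIW_GET_DATA.
        *"  IMPORTING
        *"     VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
        *"     VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
        *"     VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
        *"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
        *"     VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
        *"  TABLES
        *"      I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
        *"      I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
        *"      E_T_DATA STRUCTURE SFLIGHT OPTIONAL
        *"  EXCEPTIONS
        *"      NO_MORE_DATA
        *"      ERROR_PASSED_TO_MESS_HANDLER
        "   ... extraction logic as in the copied template ...
        ENDFUNCTION.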
    Change the following code to add your selection fields:
    * Select ranges
      RANGES: L_R_CARRID FOR SFLIGHT-CARRID,
              L_R_CONNID FOR SFLIGHT-CONNID.
    Change the following to populate the data:
        OPEN CURSOR WITH HOLD S_CURSOR FOR
          SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
                 WHERE CARRID IN L_R_CARRID
                   AND CONNID IN L_R_CONNID.
      ENDIF.                             "First data package?
    * Fetch records into the interface table,
    * named E_T_<name of extract structure>.
      FETCH NEXT CURSOR S_CURSOR
            APPENDING CORRESPONDING FIELDS
            OF TABLE E_T_DATA
            PACKAGE SIZE S_S_IF-MAXSIZE.
    See also the threads below:
    Re: functionmodule
    Re: FM for G. extractor
    ***Assign points if it helps.**
    Regards
    CSM Reddy

  • Change pointers in CRM for Product Data Extraction using MDM_CLNT_EXTR

    Hi All,
    We want to extract Product data in Delta mode using MDM_CLNT_EXTR.
    I think change pointers are activated in the CRM system in a different way. Please let me know the steps to activate change pointers in the CRM system for product data.
    Thanks in advance for your help.
    Regards,
    Shiv

    Hi,
       Please follow the steps below for change pointers.
    1. In the Implementation Guide (IMG, transaction SALE), choose Modeling and Implementing -> Master Data Distribution -> Replication of Modified Data -> Activate Change Pointers - Generally.
    2. Set the activation status in Activate Change Pointers - Generally and save your entry (i.e. transaction BD61, Activate Change Pointers).
    3. Choose the activity Activate Change Pointers for Message Types for message types like ORDERS.
    4. Set the active indicator for the message type.
    5. Save your entries.
    warm regards
    Mahesh.

  • Data extraction using Function Module

    Dear Experts,
    Previously we had an FM to extract data from an ODS into a CSV file.
    Now we want to extract data from an InfoCube using a similar method (again to save into a CSV file).
    Before, we were extracting from the ODS active table /BIC/AYPU_O1140.
    Now I want to extract from the InfoCube 0CCA_C11,
    so I thought I should use /BIC/F0CCA_C11.
    But the actual fields are not there; only dimension keys are present.
    Example:
    I want the value type 0VTYPE; the InfoObject is in one of the dimensions of 0CCA_C11.
    But in F0CCA_C11 there is only a field KEY_0CCA_C113...
    So my question is: where is the data?
    For 0VTYPE (dim 3), should it be in the /BI0/D0CCA_C113 table?
    And what about navigational attributes (like 0PROFIT_CTR) or key figures (like 0AMOUNT)?
    How can I extract them? Where is the active data stored?
    I'm looking forward to reading your suggestions.
    Kind regards,
    Alice

    Hi,
    Search the forum for 'star schema' to understand the setup of the cube/dimension/SID tables.
    To read the data in the cube you need to use the FM 'RSDRI_INFOPROV_READ'. Read the thread below for more info/links:
    RSDRI_INFOPROV_READ?
    M.
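    A minimal sketch of such a read, loosely based on the demo program RSDRI_INFOPROV_READ_DEMO; the choice of characteristic and key figure (0VTYPE, 0AMOUNT) and all variable names are illustrative only, and the exact interface should be checked against the demo program in your system:

        TYPES: BEGIN OF ty_data,
                 vtype  TYPE /bi0/oivtype,    " value type
                 amount TYPE /bi0/oiamount,   " key figure
               END OF ty_data.

        DATA: lt_sfc   TYPE rsdri_th_sfc,     " characteristics to read
              ls_sfc   TYPE rsdri_s_sfc,
              lt_sfk   TYPE rsdri_th_sfk,     " key figures to read
              ls_sfk   TYPE rsdri_s_sfk,
              lt_range TYPE rsdri_t_range,    " selections (empty = read everything)
              lt_data  TYPE STANDARD TABLE OF ty_data,
              lv_end   TYPE rs_bool,
              lv_first TYPE rs_bool VALUE 'X'.

        ls_sfc-chanm    = '0VTYPE'.
        ls_sfc-chaalias = 'VTYPE'.            " alias must match the field name in ty_data
        INSERT ls_sfc INTO TABLE lt_sfc.

        ls_sfk-kyfnm    = '0AMOUNT'.
        ls_sfk-kyfalias = 'AMOUNT'.
        ls_sfk-aggr     = 'SUM'.
        INSERT ls_sfk INTO TABLE lt_sfk.

        WHILE lv_end IS INITIAL.              " read the cube package by package
          CALL FUNCTION 'RSDRI_INFOPROV_READ'
            EXPORTING
              i_infoprov    = '0CCA_C11'
              i_th_sfc      = lt_sfc
              i_th_sfk      = lt_sfk
              i_t_range     = lt_range
              i_packagesize = 50000
            IMPORTING
              e_t_data      = lt_data
              e_end_of_data = lv_end
            CHANGING
              c_first_call  = lv_first.
          " ... append LT_DATA to the CSV file here ...
        ENDWHILE.

    Note that the FM returns the data already aggregated over the requested characteristics, so there is no need to join the dimension and fact tables yourself.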

  • Oracle 10g - Problem with Date Ranges using Between

    I am keeping track of patients who have not been contacted during a date range. If a nurse adds an event, or a note of type 1, 3 or 4, then this counts as a contact. The following query works, but if the event or note was made on the same day the report is run, the current contact is not counted. Any help to improve the query and identify the problem would be appreciated. Also, if you change n.created_date_time BETWEEN '10-Jan-2010' AND TRUNC(SYSDATE) to n.created_date_time BETWEEN '10-Jan-2010' AND TRUNC(SYSDATE) + 1, it works. What is wrong?
    SELECT upper(symptom_text),
      COUNT(UNIQUE(c.patient_id)) ,
      COUNT(UNIQUE(
      CASE
        WHEN e.eventdate BETWEEN '10-Jan-2010' AND TRUNC(SYSDATE)
        OR (n.created_date_time BETWEEN '10-Jan-2010' AND TRUNC(SYSDATE)
        AND n.note_type_id                                             IN (1,3,4))
        THEN c.patient_id
        ELSE 0
      END)) - 1
    FROM patient c,
      cust_info ci,
      event e,
      note n
    WHERE c.physician_id = 74
    AND c.patient_id      = ci.patient_id
    AND ci.info_type_id    = 32
    AND ci.symptom_text      IS NOT NULL
    AND c.patient_id      = e.patient_id(+)
    AND c.patient_id      = n.pk_id(+)
    AND n.table_name(+)    = 'patient'
    GROUP BY upper(symptom_text)
    ORDER BY DECODE(upper(symptom_text), 'A+', 1, 'A', 2, 'B', 3, 'C', 4, 'D', 5, 99)
    I suspect the end date is not inclusive (TRUNC(SYSDATE) is midnight of today, so rows time-stamped later today fall outside the range). The fields are DATE data types. Thanks for any help.

    Hi,
    achtung wrote:
    "Understood. Frank was correct."
    Do you mean about the dates? Is that issue solved now?
    It would help a lot if you posted some sample data (CREATE TABLE and INSERT statements) and the results you want from that data. Simplify as much as possible. For example, if everything involving the e and n tables is working correctly, forget about them for now. Post a question that only involves the c and ci tables.
    achtung wrote:
    "But, additionally, when a record in the cust_info doesn't exist my contact count is inaccurate. How can you explain this conceptually? I understand the query path is checking for this record due to the predicate. Perhaps the query should be redesigned. Thanks for your input!"
    Again, you can see the results, and you know what they should be. Nobody else has that information. Please post some sample data and the results you want from that data.
    In your earlier message you said:
    "Could there be a problem if a record does not exist in the ci table? The condition AND ci.symptom_text IS NOT NULL filters for this, right? Maybe this could be part of the problem, why I'm not seeing records when a note is added to a patient's doc. How would this technically be explained?"
    You're doing an inner join between c and ci:
        AND c.patient_id      = ci.patient_id
        AND ci.info_type_id   = 32
        AND ci.symptom_text   IS NOT NULL
    Rows from c will be included only if there is a row in ci with the same patient_id; moreover, that matching row in ci must also have info_type_id = 32 and a non-NULL symptom_text, otherwise the row from c will be ignored.
    If you want rows from c to be included even if there is no matching row in ci, then do an outer join, like you're already doing with the e and n tables:
      WHERE c.physician_id = 74
        AND c.patient_id         = ci.patient_id (+)
        AND ci.info_type_id (+)  = 32
        AND ci.symptom_text (+)  IS NOT NULL
        AND c.patient_id         = e.patient_id(+)
        AND c.patient_id         = n.pk_id(+)
        AND n.table_name(+)      = 'patient'
