Looping over ArrayCollection with SAP data

Hello,
I have asked a lot of people, but I still have a problem.
I am trying to build an organizational chart in Adobe Flex with data from the SAP system.
When I try to access the data that I passed in through the Flash Islands container, it doesn't work.
I do:
[Bindable]
public var datasource:ArrayCollection;
Then I want to loop over this datasource with a for loop, but it says that the object is null:
for (var i:int = 0; i < datasource.length; i++)
When I use datasource as the dataProvider for a DataGrid in Flex, it works perfectly.
Please answer quickly, because I need it for school.
Greetings
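
For reference, here is a minimal sketch of one way the loop could be guarded. It is only an illustration, not the confirmed cause: it assumes the Flash Islands container pushes the data into the property asynchronously, so the property is still null when code in creationComplete runs, and it uses a setter so the loop only fires once the data has actually arrived. The property name datasource is taken from the post; everything else is made up.

import mx.collections.ArrayCollection;

private var _datasource:ArrayCollection;

// Called through the Flash Islands property binding once the SAP data arrives.
[Bindable]
public function set datasource(value:ArrayCollection):void
{
    _datasource = value;
    if (_datasource == null)
        return;                          // nothing to do until the island has sent data
    for (var i:int = 0; i < _datasource.length; i++)
    {
        var item:Object = _datasource.getItemAt(i);
        trace(item);                     // build the org-chart nodes here instead of tracing
    }
}

public function get datasource():ArrayCollection
{
    return _datasource;
}

Under that assumption, the DataGrid works because its dataProvider binding fires again when the island sets the property, while a loop written at creationComplete runs before the data is there.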


Similar Messages

  • XML generation with SAP data using XML schema - Reg

    Hello experts,
    My requirement is that SAP data (from a Z table) has to be transferred to a third-party software folder. The third party uses XML, so they require the output from SAP in XML format.
    For that, the third-party developers told me that they will give me their own XML schema. I have to generate the XML file with SAP data using their XML schema.
    I want to stress that the XML file must be generated according to their schema.
    I have read that the CALL TRANSFORMATION statement helps with this.
    Even so, I don't have a clear idea about this topic.
    Please brief me on how to use their XML schema to generate XML from my own SAP data.
    Thanks in advance, experts.
    Kumar

    Please try this sample program and see how it works:
    *& Report  z_xit_xml_check
      REPORT  z_xit_xml_check.
      TYPE-POOLS: ixml.
      TYPES: BEGIN OF t_xml_line,
              data(256) TYPE x,
            END OF t_xml_line.
      DATA: l_ixml            TYPE REF TO if_ixml,
            l_streamfactory   TYPE REF TO if_ixml_stream_factory,
            l_parser          TYPE REF TO if_ixml_parser,
            l_istream         TYPE REF TO if_ixml_istream,
            l_document        TYPE REF TO if_ixml_document,
            l_node            TYPE REF TO if_ixml_node,
            l_xmldata         TYPE string.
      DATA: l_elem            TYPE REF TO if_ixml_element,
            l_root_node       TYPE REF TO if_ixml_node,
            l_next_node       TYPE REF TO if_ixml_node,
            l_name            TYPE string,
            l_iterator        TYPE REF TO if_ixml_node_iterator.
      DATA: l_xml_table       TYPE TABLE OF t_xml_line,
            l_xml_line        TYPE t_xml_line,
            l_xml_table_size  TYPE i.
      DATA: l_filename        TYPE string.
      PARAMETERS: pa_file TYPE char1024 DEFAULT 'c:\temp\orders_dtd.xml'.
    * Validation of XML file: Only DTD included in xml document is supported
      PARAMETERS: pa_val  TYPE char1 AS CHECKBOX.
      START-OF-SELECTION.
    *   Creating the main iXML factory
        l_ixml = cl_ixml=>create( ).
    *   Creating a stream factory
        l_streamfactory = l_ixml->create_stream_factory( ).
        PERFORM get_xml_table CHANGING l_xml_table_size l_xml_table.
    *   wrap the table containing the file into a stream
        l_istream = l_streamfactory->create_istream_itable( table = l_xml_table
                                                        size  = l_xml_table_size ).
    *   Creating a document
        l_document = l_ixml->create_document( ).
    *   Create a Parser
        l_parser = l_ixml->create_parser( stream_factory = l_streamfactory
                                          istream        = l_istream
                                          document       = l_document ).
    *   Validate a document
        IF pa_val EQ 'X'.
          l_parser->set_validating( mode = if_ixml_parser=>co_validate ).
        ENDIF.
    *   Parse the stream
        IF l_parser->parse( ) NE 0.
          IF l_parser->num_errors( ) NE 0.
            DATA: parseerror TYPE REF TO if_ixml_parse_error,
                  str        TYPE string,
                  i          TYPE i,
                  count      TYPE i,
                  index      TYPE i.
            count = l_parser->num_errors( ).
            WRITE: count, ' parse errors have occurred:'.
            index = 0.
            WHILE index < count.
              parseerror = l_parser->get_error( index = index ).
              i = parseerror->get_line( ).
              WRITE: 'line: ', i.
              i = parseerror->get_column( ).
              WRITE: 'column: ', i.
              str = parseerror->get_reason( ).
              WRITE: str.
              index = index + 1.
            ENDWHILE.
          ENDIF.
        ENDIF.
    *   Process the document
        IF l_parser->is_dom_generating( ) EQ 'X'.
          PERFORM process_dom USING l_document.
        ENDIF.
    *&      Form  get_xml_table
      FORM get_xml_table CHANGING l_xml_table_size TYPE i
                                  l_xml_table      TYPE STANDARD TABLE.
    *   Local variable declaration
        DATA: l_len      TYPE i,
              l_len2     TYPE i,
              l_tab      TYPE tsfixml,
              l_content  TYPE string,
              l_str1     TYPE string,
              c_conv     TYPE REF TO cl_abap_conv_in_ce,
              l_itab     TYPE TABLE OF string.
        l_filename = pa_file.
    *   upload a file from the client's workstation
        CALL METHOD cl_gui_frontend_services=>gui_upload
          EXPORTING
            filename   = l_filename
            filetype   = 'BIN'
          IMPORTING
            filelength = l_xml_table_size
          CHANGING
            data_tab   = l_xml_table
          EXCEPTIONS
            OTHERS     = 19.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                     WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ENDIF.
    *   Writing the XML document to the screen
        CLEAR l_str1.
        LOOP AT l_xml_table INTO l_xml_line.
          c_conv = cl_abap_conv_in_ce=>create( input = l_xml_line-data replacement = space  ).
          c_conv->read( IMPORTING data = l_content len = l_len ).
          CONCATENATE l_str1 l_content INTO l_str1.
        ENDLOOP.
        l_str1 = l_str1+0(l_xml_table_size).
        SPLIT l_str1 AT cl_abap_char_utilities=>cr_lf INTO TABLE l_itab.
        WRITE: /.
        WRITE: /' XML File'.
        WRITE: /.
        LOOP AT l_itab INTO l_str1.
          REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>horizontal_tab IN
            l_str1 WITH space.
          WRITE: / l_str1.
        ENDLOOP.
        WRITE: /.
      ENDFORM.                    "get_xml_table
    *&      Form  process_dom
      FORM process_dom USING document TYPE REF TO if_ixml_document.
        DATA: node      TYPE REF TO if_ixml_node,
              iterator  TYPE REF TO if_ixml_node_iterator,
              nodemap   TYPE REF TO if_ixml_named_node_map,
              attr      TYPE REF TO if_ixml_node,
              name      TYPE string,
              prefix    TYPE string,
              value     TYPE string,
              indent    TYPE i,
              count     TYPE i,
              index     TYPE i.
        node ?= document.
        CHECK NOT node IS INITIAL.
        ULINE.
        WRITE: /.
        WRITE: /' DOM-TREE'.
        WRITE: /.
        IF node IS INITIAL. EXIT. ENDIF.
    *   create a node iterator
        iterator  = node->create_iterator( ).
    *   get current node
        node = iterator->get_next( ).
    *   loop over all nodes
        WHILE NOT node IS INITIAL.
          indent = node->get_height( ) * 2.
          indent = indent + 20.
          CASE node->get_type( ).
            WHEN if_ixml_node=>co_node_element.
    *         element node
              name    = node->get_name( ).
              nodemap = node->get_attributes( ).
              WRITE: / 'ELEMENT  :'.
              WRITE: AT indent name COLOR COL_POSITIVE INVERSE.
              IF NOT nodemap IS INITIAL.
    *           attributes
                count = nodemap->get_length( ).
                DO count TIMES.
                  index  = sy-index - 1.
                  attr   = nodemap->get_item( index ).
                  name   = attr->get_name( ).
                  prefix = attr->get_namespace_prefix( ).
                  value  = attr->get_value( ).
                  WRITE: / 'ATTRIBUTE:'.
                  WRITE: AT indent name  COLOR COL_HEADING INVERSE, '=',
                                   value COLOR COL_TOTAL   INVERSE.
                ENDDO.
              ENDIF.
            WHEN if_ixml_node=>co_node_text OR
                 if_ixml_node=>co_node_cdata_section.
    *         text node
              value  = node->get_value( ).
              WRITE: / 'VALUE     :'.
              WRITE: AT indent value COLOR COL_GROUP INVERSE.
          ENDCASE.
    *     advance to next node
          node = iterator->get_next( ).
        ENDWHILE.
      ENDFORM.                    "process_dom
    Reward points if it is useful.
    Girish

  • Wrong file in the "Populating an ArrayCollection with retrieved data" exercise

    In the "Populating  an ArrayCollection with retrieved data" there is a file ex2_04_starter.zip, which contains ex2_04_solution.mxml file and should contain ex2_04_start.mxml, right ?
    Great videos! Thank you!

    Hi Burpix,
    Looks like the link for the starter file within the exercise points to the wrong file. If you pull the project archive for Day 2, you will get all of the starter and solution files and the video transcripts for all of the Day 2 exercises. The starter file for ex2.04 is correct if you download it from the project archive zip. Adobe will fix the link within the exercise later, but for now you can pull the correct starter file from the project archive zip.

  • Oracle BI Apps with SAP data sources

    What is the delivered content provided by Oracle BI Apps to map with SAP data sources?
    Thank you!

    As things stand right now, SAP R/3 is only supported on a much older release of BI Apps (7.8.4). That release used the Informatica SAP PowerConnect technology to populate the warehouse.
    I posted before on this forum the list of supported SAP Modules.
    BI Apps cannot use SAP BW. BW can be a direct data source for BI EE however.
    In the near future our support for SAP will be up to date and back on track. There are internal efforts underway that I cannot discuss here in the forum.

  • How to make a link between Xcelsius components and SAP data using a Web service

    Hi all,
    I have a question regarding the connection between Xcelsius components and SAP data.
    I created a Web service from a function module and made a connection between Xcelsius and that Web service using the binding URL. It shows the input and output parameters perfectly.
    But I can't figure out how to connect Xcelsius components to these parameters.
    Can anybody help me out?
    Please, it's urgent.
    Thanks,
    Simadri

    Have you bound your output parameters to ranges of cells? Select the item, then click the icon to the right of the Insert In: box and select the cells.
    Add a spreadsheet component to your chart and bind it to the cells, then preview the model. Do you see the data coming through?
    If you do, then you can click File > Snapshot > Export Excel Data. Then close Preview mode, and import data from spreadsheet and select the sheet you just exported. This gives you real data to work with when designing the dashboard.
    Hope that helps.

  • How to call and run parameters in Procedures with Sap Data Services?

    Hello Guys,
    I'm migrating all my SSIS 2008 packages to SAP Data Services.
    During this process I ran into a difficulty with running stored procedures (SQL Server 2008 R2) within SAP Data Services.
    I need help to convert this code sample:
    EXEC dbo.prcInserirLogExecucaoSSIS
    @FEED = ?,
    @TIPO_ENTRADA = 'NOVA_CARGA',
    @ARQUIVO = ?
    to Sql ('datastore', 'example') with parameter passing ...

    Import the stored procedure as a function in a datastore.
    Drag the stored procedure to your query browser and you will be set. 

  • ABOUT INTEGRATING BO WITH SAP DATA

    Hi all, kindly help me out with this problem.
    We are in a project to develop Crystal Reports XI R2 reports on the BO platform, without using any ETL part or Data Integrator.
    Kindly clarify whether it is possible to get data without ETL, and also clarify the SAP Integration Kit. Can anybody help me out? It's urgent.
    regards
    paneer

    Hi,
    This may be of help.
    BEST PRACTICES FOR CRYSTAL REPORTING WITH SAP Dan Kearnan, Business Objects
    http://sfarea.org/P13.pdf
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/10c3bca6-7dbc-2a10-7aa8-81d2731c7bb1
    Best Practices for implementing Business Objects on top of BW 
    /people/ingo.hilgefort/blog/2008/02/07/businessobjects-and-sap-part-i
    /people/ingo.hilgefort/blog/2008/02/19/businessobjects-and-sap-part-2
    /people/ingo.hilgefort/blog/2008/02/27/businessobjects-and-sap-part-3
    /people/ingo.hilgefort/blog/2008/03/23/businessobjects-and-sap-part-4
    /people/ingo.hilgefort/blog/2008/03/24/businessobjects-and-sap-part-5
    /people/scott.jones/blog/2007/09/13/installing-and-configuring-sap-interactive-forms-by-adobe-for-the-sap-netweaver-composition-environment
    Hope this helps.
    Thanks,
    JituK

  • Perl API: growing memory problem in loops over large sets of data

    Hi,
    When going through all XmlResults like this:
    while ($results->next($val)) {
        print $val->asString, "\n";
    }
    The process size keeps growing. It does not grow when I comment the $val->asString call out, but then I have no way of getting the results.
    This becomes a significant problem when the number of results is huge. I am doing this on a database of over a million short XML documents (400-800 bytes each).
    The more complete code is here:
    eval {
        $env = new DbEnv();
        $env->open($dbDir, Db::DB_JOINENV | Db::DB_INIT_LOCK
                   | Db::DB_INIT_MPOOL | Db::DB_CREATE, 0);
        my $mgr = new XmlManager($env, DbXml::DBXML_ADOPT_DBENV);
        my $db = $mgr->openContainer(undef, $dbName, Db::DB_RDONLY);
        my $context = $mgr->createQueryContext(XmlQueryContext::LiveValues,
                                               XmlQueryContext::Lazy);
        my $lookup = $mgr->createIndexLookup($db, "", $nodeName,
                                             "node-$nodeType-equality-$syntax",
                                             new XmlValue($types{$nodeType}, $value),
                                             XmlIndexLookup::GTE);
        my $results = $lookup->execute(undef, $context);
        my $val = new XmlValue();
        while ($results->next($val)) {
            print $val->asString, "\n";
        }
    };
    if (my $e = catch std::exception) {
        die $e->what() . "\n";
    }
    The process size just grows until the system limit is reached, then the process quits saying 'Out of memory'.
    I suspect the problem is with the std::string result returned by C++ XmlValue::asString() const.
    The (left-hand-side) result string is likely allocated by new std::string and receives the value by calling the string copy operator. Then the Perl scalar result is prepared, but when it gets returned to my code, the C++ string is not deleted.
    Moving the Sleepycat::XmlValue Perl object inside the loop does not help either:
    while ($results->hasNext()) {
        my $val = new XmlValue();
        $results->next($val);
        print $val->asString, "\n";
    }
    In fact, the process seems to grow faster, possibly because the old $val instances do not get destroyed by Perl at the end of the loop. Where is Perl's garbage collection?
    I am using DB XML version: 2.2.13; BDB version: 4.4.20.2; OS: FreeBSD 6-STABLE. However the problem seems to be common for any OS or BDB XML version as it involves Perl-to-C++ interface.
    Has anyone experienced similar problems?
    Thanks,
    Konstantin.
    Konstantin @ Chuguev.com

    Good catch - you found a memory leak. Luckily the fix is very straightforward. Edit the file
    dbxml/src/perl/common.h
    and find this line
    #define newSVfromString(str) newSVpvn(str.c_str(), str.length())
    Change it to this
    #define newSVfromString(str) sv_2mortal(newSVpvn(str.c_str(), str.length()))
    and recompile the module.
    Paul

  • Pull the data from legacy System into report and display with SAP data

    Hi Friends,
    My requirement is:
    Create a report by processing data from SAP tables and prepare the output. Before displaying the output, I have to pull data from a non-SAP system, which comes ready-made (it will arrive as a flat file with fields similar to the report structure), and finally display the records from both SAP and the legacy system, filtering out duplicates.

    Steps:
    Define the file path on the selection screen:
      Selection screen data
        select-options   (s_)
          parameters     (p_)
          radio buttons  (r_)
          checkboxes     (x_)
          pushbuttons    (b_)
    SELECTION-SCREEN BEGIN OF BLOCK block1 WITH FRAME TITLE text-f01.
    parameters: p_file type text_512 obligatory.
    SELECTION-SCREEN END OF BLOCK block1.
    start-of-selection.
      data : l_fname type string. " File Name
      l_fname = p_file .
      call function 'GUI_UPLOAD'
        exporting
          filename                = l_fname
          filetype                = 'ASC'
          has_field_separator     = 'X'   " 'X' = file fields are tab-separated
        tables
          data_tab                = lt_data
        exceptions
          file_open_error         = 1
          file_read_error         = 2
          no_batch                = 3
          gui_refuse_filetransfer = 4
          invalid_type            = 5
          no_authority            = 6
          unknown_error           = 7
          bad_data_format         = 8
          header_not_allowed      = 9
          separator_not_allowed   = 10
          header_too_long         = 11
          unknown_dp_error        = 12
          access_denied           = 13
          dp_out_of_memory        = 14
          disk_full               = 15
          dp_timeout              = 16
          others                  = 17.
      if sy-subrc <> 0.
        message e000 with 'Unable to upload file from the PC'(t13).
      endif.
    lt_data has the same structure as the fields in the file.
    For filtering duplicates (sort first so that duplicates are adjacent):
    sort lt_data.
    delete adjacent duplicates from lt_data.
    Now display the records using either ALV or WRITE statements.
    You can display the records in any way you want.

  • ArrayCollections with same data source

    Hi Everyone,
    Recently I joined a project which uses Tree objects to visualize and manipulate objects. The data source of these trees must be the same, but with different visualizations. In some views the tree will be filtered and in other places it won't. The data source must be kept consistent: if an object is updated in one tree, the updates should be reflected in the other Trees.
    My question is quite simple: are ArrayCollections really expensive objects in terms of memory, even when they use the same data source? Or does the cost come from the fact that they are manipulating different data sources as well?
    Please share your experiences.
    Thanks

    Hi,
    ArrayCollection is a very useful construct.
    http://blog.flexdevelopers.com/2009/03/flex-basics-arraycollection.html
    Just make sure you clean the open nodes and it will perform fine.
    http://kb2.adobe.com/cps/897/cpsid_89785.html#products
    That is my two cents
    Best
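
    As an illustration of sharing one data source between several collection views, here is a minimal sketch. It is not from the thread; the flat sample data and the view names are made up, and a real Tree would normally use hierarchical data. Both views wrap the same underlying IList, so adds, removes and item updates made through one are seen by the other, while each view keeps its own filter:

    import mx.collections.ArrayCollection;
    import mx.collections.ListCollectionView;

    // One shared source collection.
    var source:ArrayCollection = new ArrayCollection([
        { name: "Alice", active: true },
        { name: "Bob",   active: false }
    ]);

    // Two views over the same underlying list.
    var activeView:ListCollectionView = new ListCollectionView(source.list);
    activeView.filterFunction = function(item:Object):Boolean
    {
        return item.active;                 // this view only shows active items
    };
    activeView.refresh();

    var fullView:ListCollectionView = new ListCollectionView(source.list);   // unfiltered view
    fullView.refresh();

    // A change made through the source shows up in both views.
    source.addItem({ name: "Carol", active: true });

    Roughly speaking, the view objects hold only references plus their own filter/sort index, so the items themselves are not copied; the memory cost comes mainly from duplicating the underlying data, not from the collection wrappers.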

  • Pre-fill Office documents with SAP data from WebDynPro screens

    I have a requirement for Word/Excel-type file templates that we can open and have pre-filled with data attributes from SAP objects.
    I know that we can do this relatively easily from within the SAP GUI in transactions such as SCASE etc., but our solution needs to have this type of functionality invoked from WebDynPro screens that are delivered through the Portal.
    We have looked at Adobe Document Services, which would give us the functionality described, but because of licensing constraints we are unable to use ADS at this stage.
    Does anyone have any knowledge or ideas on how we may achieve pre-filling of Word templates etc. from WebDynPro screens?

    Paul,
      Did you figure out how to do this? We have a very similar requirement now. Appreciate any suggestions you may have.
    Thanks
    Madhu

  • Pre-filling Office documents with SAP data?

    I have a requirement for Word/Excel-type file templates that we can open and have pre-filled with data attributes from SAP objects.
    I know that we can do this relatively easily from within the SAP GUI in transactions such as SCASE etc., but our solution needs to have this type of functionality invoked from WebDynPro screens that are delivered through the Portal.
    We have looked at Adobe Document Services, which would give us the functionality described, but because of licensing constraints we are unable to use ADS at this stage.
    Does anyone have any knowledge or ideas on how we may achieve pre-filling of Word templates etc. from WebDynPro screens?

    Paul,
      Did you figure out how to do this? We have a very similar requirement now. Appreciate any suggestions you may have.
    Thanks
    Madhu

  • Working with SAP Data Format yyyy.mm.dd in data services

    Hi,
    I had 2 questions about date formats coming from SAP to DS .
    1) I am trying to convert a date field from SAP that is formatted as yyyy.mm.dd to mm/dd/yyyy, but I am not having any luck. I am using the to_date function; when I run it as below I get null values in my template table. I think it has to do with me not stating my input date format, but I cannot seem to get the correct syntax. Any ideas?
    2) Does Data Services recognize dates which are formatted as yyyy.mm.dd? I added a column to a template table and added the sysdate command, and it returned the system date as yyyy.mm.dd. But when I try the fiscal_day function (which should bring back the number of days between the input, my SAP source date, and the system date, both formatted the same way, yyyy.mm.dd), I get a format error ("input parameter 2009.03.10 is not valid"). Any ideas?
    Thanks for your time,
    Brett

    Hello Brett
    to_date converts a STRING to a date based on the input format.
    As I understand it, your source data is not in string format. Is this correct?
    Use the combination to_char -> to_date for your transformation.
    Regards
    Kanstantsin Chernichenka

  • Looping inside Smartform with PWB data

    Hi,
    I'm trying to access the field belzart and inside the following path:
    PWB_DATA-> T_BBP_HEADER->T_BI_HEADER->T_BI_ITEM-> WA_BI_ITEM.
    Can anyone please tell me how I should loop over the table so that I can access the data?
    If I loop
    PWB_DATA-T_BBP_HEADER-T_BI_HEADER-T_BI_ITEM into SU_BI_BILL_S_BI_ITEM, it gives the error:
    "No component exists in PWB_DATA with name T_BBP_HEADER-T_BI_HEADER-T_BI_ITEM".
    Awaiting early response.
    Thanks & Regards,
    Anshumita Baksi.

    Hi,
    In the initialization tab, copy the data of ts_customer into another internal table as follows:
    data: it_cust type table of ts_customer.
    it_cust[] = ts_customer[].
    sort it_cust by kunnr.
    delete adjacent duplicates from it_cust comparing kunnr.
    Now your internal table contains no duplicate kunnr values.
    First loop over it_cust.
    Then loop over ts_customer where kunnr = it_cust-kunnr.
    After this loop, create a command node, select 'Go to' and give the next page.
    Now it will generate separate letters for each customer.
    Tnks,
    NN.

  • Performance issue: looping over queries with a query results set

    I have code that works, but I think I should be able to run the code faster. I could try a stored procedure, but there are so many variables to set. I tried wrapping cftransaction around the code, but it didn't make a noticeable difference. I need to go through the data row by row to fill my query object.
    Here's an ABBREVIATED sample of the code:
    <cfset tot_AllActiveListing = QueryNew(
    "AnnounceNum, JP_PDLoc, JP_JS_Title, JP_JS_KWID, JP_JS, JP_Open, JP_Close, JP_CloseType, JP_CloseName, JP_PosNeed, JP_DirectHire, JP_Desc, JP_Draft, JP_Archived, JP_State, JP_AreaID, JP_AreaName, JP_AreaAlias, JP_Fac_SU, JP_Fac_Facility, JP_FAC_ID, JP_Grade1, JP_Grade2, JP_Grade3, JP_Grade4, JP_Grade5, JP_Posted, JP_TypeHire, JP_HRemail",
    "VARCHAR,VARCHAR,VARCHAR,INTEGER,INTEGER,TIMESTAMP,TIMESTAMP,INTEGER,VARCHAR,INTEGER,BIT,V ARCHAR,BIT,BIT,VARCHAR,INTEGER,VARCHAR,VARCHAR,VARCHAR,VARCHAR,INTEGER,VARCHAR,VARCHAR,VAR CHAR,VARCHAR,VARCHAR,TIMESTAMP,INTEGER,VARCHAR")
    />
    <cfquery name="getAllActiveListing" datasource="#request.at_datasource#">
        SELECT j.JOB_AnnounceNum, j.JOB_PDLoc, j.fk_JS_code, j.Job_JPOpen, j.Job_JPClose, j.fk_CloseType, j.JOB_JPPosNeed, j.JOB_DirectHire, j.JOB_JPDesc, j.Job_JPDraft, j.JOB_JPArchived, j.JOB_State,
        j.fk_FACID, j.Posted, j.JOB_IHSvITU, f.Fac_Area, f.Fac_ServiceUnit, f.fac_Facility, f.Fac_Addr1, f.Fac_Addr2, f.Fac_City, f.Fac_State, f.Fac_Zip
        from JOB_JP j INNER JOIN #generaldb#IHSFacility f
        ON j.fk_FACID  = f.Fac_ID
        WHERE
                JOB_JPDraft = 0
                and (Job_JPClose = #Now()# or Job_JPClose > #Now()# or fk_CloseType = 2 or fk_CloseType = 3)
                and (JOB_JPArchived = 0 or JOB_JPArchived IS NULL)
                 <cfif IsDefined("qAltPostID") and qAltPostID.recordcount gt "0">
                and JOB_AnnounceNum IN (<cfqueryparam list="yes" cfsqltype="CF_SQL_varchar" value="#ValueList(qAltPostID.fk_Job_AnnounceNum)#">)
                <cfelseif option is "JPPostListing" and StructKeyExists(session,"IHSUID")>
                and  j.WhoCreated = #session.IHSUID#
                 </cfif>
                 Order by j.Job_JPOpen desc
        </cfquery>
        <cfloop from="1" to="#session.getAllActiveListing.recordcount#" index="i">       
                <cfquery name="getAllActiveListingGrade" datasource="#request.at_datasource#">
                    SELECT fk_Job_AnnounceNum, Grade
                    from Job_JP_Grade
                    Where Job_JP_Grade.fk_Job_AnnounceNum = '#session.getAllActiveListing.Job_AnnounceNum[i]#'
                </cfquery>    
                <cfif IsDefined("session.getAllActiveListing") and session.getAllActiveListing.recordcount neq "0">       
                    <cfquery name="getAllActiveListingIHSArea" datasource="#at_datasource#">
                    SELECT JOBIHSArea_ID, JOBIHSArea_Name, JOBIHSArea_Alias
                    from JOB_IHSArea_LKUP
                    where JOBIHSArea_Alias = '#session.getAllActiveListing.Fac_Area[i]#'
                    </cfquery>
                </cfif>       
                <cfset session.getAllActiveListingGrade = getAllActiveListingGrade />
                <cfquery name="getAllActiveListingCloseName" datasource="#at_datasource#">
                SELECT JOB_CloseName
                from JOB_CloseType_LKUP
                where JOB_CloseType_LKUP.JOB_CloseType = #session.getAllActiveListing.fk_CloseType[i]#
                </cfquery>
                    <cfscript>                                       
                       newRow=QueryAddRow(tot_AllActiveListing);
                        QuerySetCell(tot_AllActiveListing, "AnnounceNum", "#session.getAllActiveListing.Job_AnnounceNum[i]#");
                        QuerySetCell(tot_AllActiveListing, "JP_PDLoc", "#session.getAllActiveListing.JOB_PDLoc[i]#");
                        QuerySetCell(tot_AllActiveListing, "JP_Draft", "#session.getAllActiveListing.Job_JPDraft[i]#");
                        QuerySetCell(tot_AllActiveListing, "JP_Archived", "#session.getAllActiveListing.Job_JParchived[i]#");
                        QuerySetCell(tot_AllActiveListing, "JP_Posted", "#session.getAllActiveListing.Posted[i]#");
                        QuerySetCell(tot_AllActiveListing, "JP_PosNeed", "#session.getAllActiveListing.JOB_JPPosNeed[i]#");
                        QuerySetCell(tot_AllActiveListing, "JP_DirectHire", "#session.getAllActiveListing.JOB_DirectHire[i]#");
                     </cfscript>       
            </cfloop>
    Any ideas will be greatly appreciated. If stored procedures are the best way to handle this and will run appreciably faster, I'll try it.
    Thanks.
    JoyRose

    Thanks for your reply.
    So now here is the entire code written with LEFT JOIN:
    <cfquery name="getAllActiveListing" datasource="#request.at_datasource#">
        SELECT j.JOB_AnnounceNum, j.JOB_PDLoc, j.fk_JS_code, j.Job_JPOpen, j.Job_JPClose, j.fk_CloseType, j.JOB_JPPosNeed, j.JOB_DirectHire, j.JOB_JPDesc, j.Job_JPDraft, j.JOB_JPArchived, j.JOB_State,
        j.fk_FACID, j.Posted, j.JOB_IHSvITU, f.Fac_Area, f.Fac_ServiceUnit, f.fac_Facility, f.Fac_Addr1, f.Fac_Addr2, f.Fac_City, f.Fac_State, f.Fac_Zip, g.Grade, a.JOBIHSArea_ID, a.JOBIHSArea_Name, a.JOBIHSArea_Alias, c.JOB_CloseName, s.Title, p.HRContact, p.HRContactType, e.Email, k.fk_KWID, k.fk_AnnounceNum, w.JOB_KWName, w.JOB_KWID
        from JOB_JP j INNER JOIN #generaldb#IHSFacility f
        ON j.fk_FACID  = f.Fac_ID
        LEFT OUTER JOIN JOB_JP_Grade g
        ON j.JOB_AnnounceNum = g.fk_Job_AnnounceNum
        LEFT OUTER JOIN JOB_IHSArea_LKUP a
        ON j.Fac_Area = a.JOBIHSArea_Alias
        LEFT OUTER JOIN JOB_CloseType_LKUP c
        ON j.fk_CloseType = c.JOB_CloseType
        LEFT OUTER JOIN JOB_Series_LKUP s
        ON j.fk_js_code = s.fk_js_code
        LEFT OUTER JOIN JOB_JPContacts p
        ON j.JOB_AnnounceNum = p.fk_Job_AnnounceNum
        LEFT OUTER JOIN #globalds#Email e
        ON p.HRContact = e.table_ID
        LEFT OUTER JOIN JOB_JPKW k
        ON j.JOB_AnnounceNum = k.fk_AnnounceNum
        LEFT OUTER JOIN JOB_KW_LKUP w
        ON k.fk_KWID = w.JOB_KWID 
        WHERE
                JOB_JPDraft = 0
                and (Job_JPClose = #Now()# or Job_JPClose > #Now()# or fk_CloseType = 2 or fk_CloseType = 3)
                and (JOB_JPArchived = 0 or JOB_JPArchived IS NULL)
                 <cfif IsDefined("qAltPostID") and qAltPostID.recordcount gt "0">
                and JOB_AnnounceNum IN (<cfqueryparam list="yes" cfsqltype="CF_SQL_varchar" value="#ValueList(qAltPostID.fk_Job_AnnounceNum)#">)
                <cfelseif option is "JPPostListing" and StructKeyExists(session,"IHSUID")>
                and  j.WhoCreated = #session.IHSUID#
                 </cfif>
                 Order by j.Job_JPOpen desc
        </cfquery>
    I'm concerned about the queries below that I converted to the LEFT JOIN code above:
    <cfquery name="getAllActiveListingHRContact" datasource="#at_datasource#">
                SELECT HRContact, HRContactType
                from JOB_JPContacts
                where fk_Job_AnnounceNum = '#session.getAllActiveListing.JOB_AnnounceNum[i]#'
                </cfquery>
                <cfif CompareNoCase(getAllActiveListingHRContact.HRContactType,"HRContactID") is 0>       
                    <cfquery name="getAllActiveListingHREmail" datasource="#globalds#">
                    SELECT Email
                    from Email
                    where Table_ID = #getAllActiveListingHRContact.HRContact#
                    </cfquery>
                    <cfset session.getAllActiveListingHREmail = getAllActiveListingHREmail />
                </cfif>
                <cfquery name="getAllActiveListingMasterKey" datasource="#at_datasource#">
                SELECT fk_KWID, fk_AnnounceNum, JOB_KWName, JOB_KWID
                from JOB_JPKW, JOB_KW_LKUP
                where JOB_JPKW.fk_AnnounceNum = '#session.getAllActiveListing.JOB_AnnounceNum[i]#'
                and JOB_KW_LKUP.JOB_KWID = JOB_JPKW.fk_KWID
                </cfquery>
    I appreciate your help with this.
