Lock a specific number of records using ENQUEUE & DEQUEUE

Hi,
Is it possible to lock a group of records in R/3?
My requirement is to update a set of records in the VBAP table. I'm not using a BAPI here; instead, I use a direct UPDATE.
In this case, I know I can lock individual records by passing VBELN and POSNR. But what if I have to lock 10 records?
Is this possible in any way?
Thanks in advance.
The current solution is:
1) LOOP at ITAB
2) LOCK each entry
3) UPDATE VBAP for that entry
4) UNLOCK the entry
5) ENDLOOP
I thought this solution might work (assume 10 records are present in ITAB):
1) LOOP at ITAB (lock all 10 entries)
2) LOCK that entry
3) ENDLOOP
4) UPDATE VBAP from ITAB (updates all 10 entries in one database access)
5) LOOP at ITAB (unlock all 10 entries)
6) UNLOCK that entry
7) ENDLOOP
Any help will be appreciated.
Tabraiz.

Hello,
Both of your solutions will work.
With solution 1 there is always only one lock entry at a time, because you enqueue, perform the update and dequeue for each record in turn.
This means that while your program runs you will only ever see one enqueue entry for your user ID in SM12 at any given moment.
Solution 2 is also possible, but there several lock entries will exist at the same time, because you enqueue everything, then perform the update, and then dequeue everything.
In SM12 (lock entries) this shows up as multiple enqueue records for your user ID while your program runs.
Pay attention, though: lock entries (SM12) are stored in a lock table of limited size, so with solution 2 make sure you don't overflow the lock table!
Via transaction RZ11 you can check the parameter enque/table_size (size of the lock table).
Check the parameter value, but also its documentation, and you will understand why you should limit the number of open lock entries.
Success.
Wim Van den Wyngaert
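
For reference, a minimal ABAP sketch of solution 2: lock everything first, update VBAP in a single database access, then release the locks. It assumes a custom lock object EZVBAP keyed on VBELN and POSNR, whose generated function modules would be called ENQUEUE_EZVBAP and DEQUEUE_EZVBAP; these names are hypothetical, so substitute the lock object you actually created in SE11.

DATA: lt_vbap   TYPE STANDARD TABLE OF vbap,
      ls_vbap   TYPE vbap,
      lv_failed TYPE c LENGTH 1.

* Step 1: lock all entries up front (hypothetical lock object EZVBAP)
LOOP AT lt_vbap INTO ls_vbap.
  CALL FUNCTION 'ENQUEUE_EZVBAP'
    EXPORTING
      vbeln          = ls_vbap-vbeln
      posnr          = ls_vbap-posnr
    EXCEPTIONS
      foreign_lock   = 1
      system_failure = 2
      OTHERS         = 3.
  IF sy-subrc <> 0.
    lv_failed = 'X'.          "someone else already holds a lock
    EXIT.
  ENDIF.
ENDLOOP.

* Step 2: one mass database access, only if every lock was obtained
IF lv_failed IS INITIAL.
  UPDATE vbap FROM TABLE lt_vbap.
  COMMIT WORK.
ENDIF.

* Step 3: release all locks again (dequeuing an entry that was never
* locked is harmless)
LOOP AT lt_vbap INTO ls_vbap.
  CALL FUNCTION 'DEQUEUE_EZVBAP'
    EXPORTING
      vbeln = ls_vbap-vbeln
      posnr = ls_vbap-posnr.
ENDLOOP.

Keeping the UPDATE outside the loop is what gives the single database access; the trade-off, as Wim notes, is that all ten lock entries exist in SM12 at the same time, so the size of the lock table becomes the limiting factor.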

Similar Messages

  • Capability of inserting a specific number of records ...

    Hi,
    Is there any way to permit the end-user to enter a specific number of records in a multi-record block, according to the number of fetched records in another block?
    I assume that the WHEN-CREATE-RECORD trigger can do that. Are there any other solutions?
    Thanks,
    Simon

    ..Or,
    this is for a single block, but I believe it also works for a multi-record block.
    A parameter defines the limit for the number of records the user can query.
    1. Define a parameter, :max_record, which is the limit for the number of records
    the user can enter. Make sure to define this parameter as numeric
    and provide a default value.
    2. For a form with a single block, create the following triggers at block level:
    a. Attach the following PL/SQL block to a KEY-CREREC trigger to create a
    record only when :system.cursor_record is less than :max_record.
    DECLARE
      a NUMBER;
      b NUMBER;
    BEGIN
      a := :system.cursor_record;
      LAST_RECORD;
      b := :system.cursor_record;
      IF b >= :parameter.max_record THEN
        GO_RECORD(a);
        MESSAGE('max record exceeded - create rec III');
        RAISE FORM_TRIGGER_FAILURE;
      END IF;
      GO_RECORD(a);
      IF :system.cursor_record < :parameter.max_record THEN
        CREATE_RECORD;
      ELSE
        MESSAGE('max record exceeded - create rec ');
        RAISE FORM_TRIGGER_FAILURE;
      END IF;
    END;
    b. To navigate to the next record when :system.cursor_record is
    less than :max_record, create a KEY-DOWN trigger:
    IF :system.cursor_record < :parameter.max_record THEN
      DOWN;
    ELSE
      MESSAGE('max records key-down');
    END IF;

  • Selecting specific number of records

    Hello,
    How can we query a certain number of records from a table? For example, if a table has thousands of records and I wish to query:
    1. The first 500 records
    2. Records between 500 and 1000, or between <any number> and <any number>
    3. Records less than <some number>
    I cannot do this via the primary key, as it is not serial.
    I tried to use ROWNUM, but I cannot use it when I want to select rows less than 100 or rows between 100 and 200.
    How can I accomplish this?
    Regards
    Sunny

    1. first 500 records
    select *
    from ( YOUR_QUERY_GOES_HERE -- including the order by )
    where rownum <= 500
    Another way, using analytical functions, can be found in the documentation under Top-N Ranking.
    2. Records between 500 and 1000, or between <any number> and <any number>
    select *
      from ( select a.*, rownum rnum
               from ( YOUR_QUERY_GOES_HERE -- including the order by ) a
              where rownum <= MAX_ROWS )
     where rnum >= MIN_ROWS
    http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:127412348064
    3. Records less than <some number>
    select *
    from my_table
    where my_column < <some number>

  • How to check for locks on a material - without using Enqueue function

    Hi
    We have a requirement where we need to check for an (exclusive) lock on a material before we call the BAPI that updates the material.
    I don't want to actually lock the material before calling the BAPI; the BAPI itself acquires the lock as part of its processing. All I want to do is check whether a lock on the material exists before calling the BAPI, thus avoiding any lock-related issues on the material during the BAPI call.
    How do we merely check whether a material is locked? Is there a standard function module or SAP table where the material lock is stored that we can use/interrogate?
    Correct answers will be promptly rewarded.

    Hi,
    There is a standard function module, ENQUEUE_READ, that reads the current lock entries.
    Pass the material number to FM ENQUEUE_READ; if it returns lock entries for that material, the material is locked, otherwise it is not.
    Please reward if useful.
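
    A minimal sketch of that check, assuming the material master lock shows up in SM12 under table MARA with a lock argument of client + material number (verify the exact table name and argument format for your lock object by looking at an existing entry in SM12); the parameter and variable names are illustrative only:

    PARAMETERS p_matnr TYPE matnr.

    DATA: lt_enq  TYPE STANDARD TABLE OF seqg3,   "lock entries as shown in SM12
          lv_garg TYPE seqg3-garg.

    " Lock argument: client + material number
    CONCATENATE sy-mandt p_matnr INTO lv_garg.

    CALL FUNCTION 'ENQUEUE_READ'
      EXPORTING
        gclient = sy-mandt
        gname   = 'MARA'      "table behind the material master lock
        garg    = lv_garg
        guname  = space       "initial = locks of all users, not only your own
      TABLES
        enq     = lt_enq.

    IF lt_enq[] IS NOT INITIAL.
      " at least one lock entry exists for this material - skip or delay the BAPI call
    ENDIF.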

  • How to use Enqueue/Dequeue to prevent concurrent write?

    Hi All,
    I have a report program that allows multiple users to save to the database tables.
    How do I incorporate ENQUEUE and DEQUEUE statements to prevent concurrent writes and ensure data integrity? Is there any sample code I could refer to?
    Thanks

    Hi,
    Here the notification is locked first, then updated, then unlocked:
    CALL FUNCTION 'ENQUEUE_EIQMEL'
      EXPORTING
        mandt          = sy-mandt
        qmnum          = i_final_ap-qmnum   "notification number
      EXCEPTIONS
        foreign_lock   = 1
        system_failure = 2
        OTHERS         = 3.
    IF sy-subrc = 0.                         "proceed only if the lock was obtained
      " Set the task "PE03" for the notification
      CALL FUNCTION 'IQS4_ADD_DATA_NOTIFICATION'
        EXPORTING
          i_qmnum    = i_final_ap-qmnum
          i_conv     = ' '
          i_post     = c_x
          i_commit   = c_x
          i_wait     = c_x
        TABLES
          i_viqmsm_t = i_viqmsm_tmp
          return     = i_return.
    ENDIF.
    CALL FUNCTION 'DEQUEUE_EIQMEL'
      EXPORTING
        mandt = sy-mandt
        qmnum = i_final_ap-qmnum.

  • Not receiving email when sending large number of records using a FM?

    Hi,
    I am using the function module SO_DOCUMENT_SEND_API1 to send email.
    When there is a single record, or around 5-6 records, the email arrives successfully.
    But when there are more records, say around 100, the email does not arrive. I checked transaction SOST and the status there is red, with the error message "Internal error: SO_OBJECT_MIME_GET Exception: 2".
    What could be the reason behind this problem?
    I have another problem: my output has over 60 fields, but the email I receive contains only around 10 fields. How can I solve this?
    Please help.

    Well, right now I am trying to get only the first 2 fields, but even in this case I am not getting the email if there are around 15 records.
    I am using the code given below, which I found on SDN. In this code, data is selected from EKPO. I tried changing the number of rows selected, and in that case the attachment arrives as desired; but when I use the same code in my program I am not getting the mail, even if there are only 10 records or so.
    *& Report  ZT062108   ALV Header                                    *
    REPORT  zt062108.
    TABLES: ekko.
    PARAMETERS: p_email   TYPE somlreci1-receiver
                                      DEFAULT '<give email here>'.
    TYPES: BEGIN OF t_ekpo,
      ebeln TYPE ekpo-ebeln,
      ebelp TYPE ekpo-ebelp,
      aedat TYPE ekpo-aedat,
      matnr TYPE ekpo-matnr,
    END OF t_ekpo.
    DATA: it_ekpo TYPE STANDARD TABLE OF t_ekpo INITIAL SIZE 0,
          wa_ekpo TYPE t_ekpo.
    TYPES: BEGIN OF t_charekpo,
      ebeln(10) TYPE c,
      ebelp(5)  TYPE c,
      aedat(8)  TYPE c,
      matnr(18) TYPE c,
    END OF t_charekpo.
    DATA: wa_charekpo TYPE t_charekpo.
    DATA:   it_message TYPE STANDARD TABLE OF solisti1 INITIAL SIZE 0
                    WITH HEADER LINE.
    DATA:   it_attach TYPE STANDARD TABLE OF solisti1 INITIAL SIZE 0
                    WITH HEADER LINE.
    DATA:   t_packing_list LIKE sopcklsti1 OCCURS 0 WITH HEADER LINE,
            t_contents LIKE solisti1 OCCURS 0 WITH HEADER LINE,
            t_receivers LIKE somlreci1 OCCURS 0 WITH HEADER LINE,
            t_attachment LIKE solisti1 OCCURS 0 WITH HEADER LINE,
            t_object_header LIKE solisti1 OCCURS 0 WITH HEADER LINE,
            w_cnt TYPE i,
            w_sent_all(1) TYPE c,
            w_doc_data LIKE sodocchgi1,
            gd_error    TYPE sy-subrc,
            gd_reciever TYPE sy-subrc.
    *START_OF_SELECTION
    START-OF-SELECTION.
    *   Retrieve sample data from table ekpo
      PERFORM data_retrieval.
    *   Populate table with details to be entered into .xls file
      PERFORM build_xls_data_table.
    *END-OF-SELECTION
    END-OF-SELECTION.
    * Populate message body text
      perform populate_email_message_body.
    * Send file by email as .xls spreadsheet
      PERFORM send_file_as_email_attachment
                                   tables it_message
                                          it_attach
                                    using p_email
                                               'Example .xls document attachment'
                                               'TXT'
                                               'filename'
                                               ' '
                                               ' '
                                               ' '
                                 changing gd_error
                                          gd_reciever.
    *   Instructs mail send program for SAPCONNECT to send email(rsconn01)
      PERFORM initiate_mail_execute_program.
    *&      Form  DATA_RETRIEVAL
    *       Retrieve data from EKPO table and populate itab it_ekpo
    FORM data_retrieval.
      SELECT ebeln ebelp aedat matnr
       UP TO 1000 ROWS
        FROM ekpo
        INTO TABLE it_ekpo.
    ENDFORM.                    " DATA_RETRIEVAL
    *&      Form  BUILD_XLS_DATA_TABLE
    *       Build data table for .xls document
    FORM build_xls_data_table.
    *  CONSTANTS: con_cret TYPE x VALUE '0D'.  "OK for non Unicode
    *             con_tab TYPE x VALUE '09'.   "OK for non Unicode
    *If you have the Unicode check active in program attributes then you will
    *need to declare constants as follows
    *class cl_abap_char_utilities definition load.
    constants:
        con_tab  type c value cl_abap_char_utilities=>HORIZONTAL_TAB,
        con_cret type c value cl_abap_char_utilities=>CR_LF.
      CONCATENATE 'EBELN' 'EBELP' 'AEDAT' 'MATNR'
             INTO it_attach SEPARATED BY con_tab.
      CONCATENATE con_cret it_attach  INTO it_attach.
      APPEND  it_attach.
      LOOP AT it_ekpo INTO wa_charekpo.
        CONCATENATE wa_charekpo-ebeln wa_charekpo-ebelp
                    wa_charekpo-aedat wa_charekpo-matnr
               INTO it_attach SEPARATED BY con_tab.
        CONCATENATE con_cret it_attach  INTO it_attach.
        APPEND  it_attach.
      ENDLOOP.
    ENDFORM.                    " BUILD_XLS_DATA_TABLE
    *&      Form  SEND_FILE_AS_EMAIL_ATTACHMENT
    *       Send email
    FORM send_file_as_email_attachment tables pit_message
                                              pit_attach
                                        using p_email
                                              p_mtitle
                                              p_format
                                              p_filename
                                              p_attdescription
                                              p_sender_address
                                              p_sender_addres_type
                                     changing p_error
                                              p_reciever.
      DATA: ld_error    TYPE sy-subrc,
            ld_reciever TYPE sy-subrc,
            ld_mtitle LIKE sodocchgi1-obj_descr,
            ld_email LIKE  somlreci1-receiver,
            ld_format TYPE  so_obj_tp ,
            ld_attdescription TYPE  so_obj_nam ,
            ld_attfilename TYPE  so_obj_des ,
            ld_sender_address LIKE  soextreci1-receiver,
            ld_sender_address_type LIKE  soextreci1-adr_typ,
            ld_receiver LIKE  sy-subrc.
      ld_email   = p_email.
      ld_mtitle = p_mtitle.
      ld_format              = p_format.
      ld_attdescription      = p_attdescription.
      ld_attfilename         = p_filename.
      ld_sender_address      = p_sender_address.
      ld_sender_address_type = p_sender_addres_type.
    * Fill the document data.
      w_doc_data-doc_size = 1.
    * Populate the subject/generic message attributes
      w_doc_data-obj_langu = sy-langu.
      w_doc_data-obj_name  = 'SAPRPT'.
      w_doc_data-obj_descr = ld_mtitle .
      w_doc_data-sensitivty = 'F'.
    * Fill the document data and get size of attachment
      CLEAR w_doc_data.
      DESCRIBE TABLE it_attach LINES w_cnt.   "number of attachment lines
      READ TABLE it_attach INDEX w_cnt.
      w_doc_data-doc_size =
         ( w_cnt - 1 ) * 255 + STRLEN( it_attach ).
      w_doc_data-obj_langu  = sy-langu.
      w_doc_data-obj_name   = 'SAPRPT'.
      w_doc_data-obj_descr  = ld_mtitle.
      w_doc_data-sensitivty = 'F'.
      CLEAR t_attachment.
      REFRESH t_attachment.
      t_attachment[] = pit_attach[].
    * Describe the body of the message
      CLEAR t_packing_list.
      REFRESH t_packing_list.
      t_packing_list-transf_bin = space.
      t_packing_list-head_start = 1.
      t_packing_list-head_num = 0.
      t_packing_list-body_start = 1.
      DESCRIBE TABLE it_message LINES t_packing_list-body_num.
      t_packing_list-doc_type = 'RAW'.
      APPEND t_packing_list.
    * Create attachment notification
      t_packing_list-transf_bin = 'X'.
      t_packing_list-head_start = 1.
      t_packing_list-head_num   = 1.
      t_packing_list-body_start = 1.
      DESCRIBE TABLE t_attachment LINES t_packing_list-body_num.
      t_packing_list-doc_type   =  ld_format.
      t_packing_list-obj_descr  =  ld_attdescription.
      t_packing_list-obj_name   =  ld_attfilename.
      t_packing_list-doc_size   =  t_packing_list-body_num * 255.
      APPEND t_packing_list.
    * Add the recipients email address
      CLEAR t_receivers.
      REFRESH t_receivers.
      t_receivers-receiver = ld_email.
      t_receivers-rec_type = 'U'.
      t_receivers-com_type = 'INT'.
      t_receivers-notif_del = 'X'.
      t_receivers-notif_ndel = 'X'.
      APPEND t_receivers.
      CALL FUNCTION 'SO_DOCUMENT_SEND_API1'
           EXPORTING
                document_data              = w_doc_data
                put_in_outbox              = 'X'
                sender_address             = ld_sender_address
                sender_address_type        = ld_sender_address_type
                commit_work                = 'X'
           IMPORTING
                sent_to_all                = w_sent_all
           TABLES
                packing_list               = t_packing_list
                contents_bin               = t_attachment
                contents_txt               = it_message
                receivers                  = t_receivers
           EXCEPTIONS
                too_many_receivers         = 1
                document_not_sent          = 2
                document_type_not_exist    = 3
                operation_no_authorization = 4
                parameter_error            = 5
                x_error                    = 6
                enqueue_error              = 7
                OTHERS                     = 8.
    * Populate zerror return code
      ld_error = sy-subrc.
    * Populate zreceiver return code
      LOOP AT t_receivers.
        ld_receiver = t_receivers-retrn_code.
      ENDLOOP.
    ENDFORM.
    *&      Form  INITIATE_MAIL_EXECUTE_PROGRAM
    *       Instructs mail send program for SAPCONNECT to send email.
    FORM initiate_mail_execute_program.
      WAIT UP TO 2 SECONDS.
      SUBMIT rsconn01 WITH mode = 'INT'
                    WITH output = 'X'
                    AND RETURN.
    ENDFORM.                    " INITIATE_MAIL_EXECUTE_PROGRAM
    *&      Form  POPULATE_EMAIL_MESSAGE_BODY
    *        Populate message body text
    form populate_email_message_body.
      REFRESH it_message.
      it_message = 'Please find attached a list of test EKPO records'.
      APPEND it_message.
    endform.                    " POPULATE_EMAIL_MESSAGE_BODY

  • Counting number of records using where clause in a large table

    SQL SERVER 2008
    I have to find the count of the rows matching some conditions. The table is very large, with around 50 columns, and around 3,000 rows are added every day.
    My business logic is to count the number of payments made by a particular user; if the count exceeds 10 the user is a GoldUser, else a NormalUser.
    Here:
    select count(payment_id) from payments_master_table where payment_user = @payment_user;
    Here payment_id is the primary key. This query is very slow, since it has to scan the whole table. Is there any other, optimized way to find the count?
    Also, one way I could do it is to stop counting once the count of payment_id reaches 10,
    but I do not know how to do this.
    Can anyone help me?
    Thanks!

    Not TOP 10; but suppose the count(payment_id) reaches 10 for a particular user before the whole
    table has been scanned, why proceed further? My condition is to treat a user as GoldUser if the count is 10 or more, so stop the count once it reaches 10.
    Then in that case you need to implement it as a correlated query,
    i.e. something like
    SELECT t.payment_user, COUNT(*)
    FROM payments_master_table t
    CROSS APPLY (SELECT TOP 10 payment_id
                 FROM payments_master_table
                 WHERE payment_user = t.payment_user
                ) t1
    GROUP BY t.payment_user
    It will try to fetch at most 10 rows per user and then get the count. But even in this case you can't guarantee it will stop at exactly 10 records for every user, as that depends on a few other factors too.
    In any case, an index on the payment_user column will really help.
    Visakh

  • Processing of large number of records using JDBC Sender Channel

    Hi experts,
    We have a JDBC-to-File scenario where the tables contain about 500K records on average.
    I used multi-mapping to generate a flat file for every 10K records. The problem is that when I start the JDBC sender communication channel, memory usage goes up and the J2EE engine restarts. In the sender CC I also set the "Disconnect from the database" option. The query is SELECT * from TABLE and the update statement is <TEST>. Please help me work out how to solve this.
    Regards.

    Hi
    Use a query that limits the number of rows returned, for example:
    -- Oracle
    select colname from tblname where rownum <= 1000;
    -- MSSQL
    select top 1000 colname from tblname
    Regards
    Ramg

  • 2515 Locking FMs to Enqueue/Dequeue/ Which one to use?

    I know this question is redundant and has been responded to a million times... all differently. However, given there are 2,515 function modules to choose from, what are the best choices for locking and unlocking a custom database table?
    And why are there 2,515 of them?
    I take that back... many, many more!
    Thank-You
    Edited by: Tom Matys on Sep 23, 2011 12:15 PM

    Hi
    I think the best way is to create a new lock object for every custom table.
    Every table should have its own lock object (and therefore its own generated enqueue/dequeue function modules), because that makes sure the same lock object is not used for another table at the same time.
    If the same lock object is used for many tables, you risk finding a certain record reported as locked although nobody has really locked it.
    Max
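
    As an illustration, a hedged sketch of that per-table pattern. Assume a custom table ZPAYMENTS with key field PAYMENT_ID and an SE11 lock object EZPAYMENTS, whose generated modules are ENQUEUE_EZPAYMENTS and DEQUEUE_EZPAYMENTS; all of these names are hypothetical.

    DATA: ls_payment    TYPE zpayments,
          lv_payment_id TYPE zpayments-payment_id.

    CALL FUNCTION 'ENQUEUE_EZPAYMENTS'
      EXPORTING
        payment_id     = lv_payment_id
        _wait          = 'X'             "retry briefly instead of failing at once
      EXCEPTIONS
        foreign_lock   = 1
        system_failure = 2
        OTHERS         = 3.

    IF sy-subrc = 0.
      UPDATE zpayments FROM ls_payment.
      COMMIT WORK.
      CALL FUNCTION 'DEQUEUE_EZPAYMENTS'
        EXPORTING
          payment_id = lv_payment_id.
    ELSE.
      MESSAGE 'Record is locked by another user' TYPE 'I'.
    ENDIF.

    Because the lock object belongs to ZPAYMENTS alone, a lock held here can never make an unrelated table's records appear locked, which is exactly the risk Max describes.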

  • Select specific number of rows

    I'm listing all data entries in a database.
    For that I want to select the first ten entries, then the next ten, and so on, ordered
    by date_column.
    How can I do that? Is it possible with ROWNUM?
    In MySQL/PHP I did it with LIMIT:
    $query = "select * from article where parent_id=0 order by date desc, time desc limit $select, 10";
    please email me.
    thanks for help
    chris

    But what about ROWNUM if I want the
    records from 10-20?
    Originally posted by Chen Zhao:
    If you are using Oracle 8i, you can use ROWNUM to specify the specific number of records. You can see the ROWNUM by querying:
    SELECT *
    FROM (SELECT column_name FROM table_name ORDER BY column_name)
    WHERE ROWNUM < 10
    CHEN

  • Maximum number of records to 'BAPI_PIRSRVAPS_SAVEMULTI'

    Hi All ,
    Could anybody tell me the maximum number of records that can be passed to the BAPI
    BAPI_PIRSRVAPS_SAVEMULTI?
    This BAPI is used for forecast upload to an SNP planning area (which can be seen in the product view: /sapapo/rrp3).
    Win full points for the resolution...
    Thanks in advance...
    Chandan Dubey

    Hi Chandan - There is no simple answer to this question.
    BAPI_PIRSRVAPS_SAVEMULTI has a built-in package counter (number of records to process) which sends packets of data to liveCache for creating the data. By default this BAPI will process all records at once, but there is a BAdI in this BAPI that allows you to set the package size as well as many other things. The performance will depend on things like your system, environment and volume of data. There are two limitations: 1) the pre-reading (retrieval of matlocids, matids, locids, pegids, etc.) which happens prior to the liveCache call, and 2) the liveCache call itself. The pre-reading can cause a memory overload, but that is less likely to happen compared to a liveCache problem. The procedures that call liveCache are more likely to run out of memory than the ABAP tables are, and can cause the program to dump as well; that dump may be hard to understand.
    What I have done with many programs is to add a wrapper around a liveCache BAPI (or FM) call and use my own counter to send blocks or packets of data to the BAPI. For example, loop through the records in the program and call the BAPI for every 1000 records, accumulating the return info in an internal table. The number of records in each packet or block is driven by a parameter on a selection screen or a value in a Z-table, so the number can be tested and adjusted as needed. The reaction of liveCache BAPIs will differ from system to system due to things such as hardware configuration and volume of data.
    If you do not code the BAPI call as described above, place code in the BAdI to set the packet size, or limit the number of records being input some other way, then you are taking a risk that one day a specific number of records will cause a dump in this BAPI.
    I would think you would be safe with 500-1000 records, but you should really test in your system and consider the options for packeting the number of records.
    Andy
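
    A minimal sketch of the packeting wrapper Andy describes: split the input into blocks of a configurable size and call the BAPI once per block, collecting the returns in one table. The packet-size parameter, the ty_forecast line type and the call_bapi_for_packet routine are placeholders; the actual BAPI_PIRSRVAPS_SAVEMULTI parameters have to be filled according to its documented interface.

    PARAMETERS p_packet TYPE i DEFAULT 1000.   "packet size, adjustable for testing

    DATA: lt_all    TYPE STANDARD TABLE OF ty_forecast,  "ty_forecast = your input structure
          lt_packet TYPE STANDARD TABLE OF ty_forecast,
          ls_row    TYPE ty_forecast,
          lt_return TYPE STANDARD TABLE OF bapiret2.

    LOOP AT lt_all INTO ls_row.
      APPEND ls_row TO lt_packet.
      IF lines( lt_packet ) >= p_packet.
        " one liveCache call per packet; messages are accumulated in lt_return
        PERFORM call_bapi_for_packet USING lt_packet CHANGING lt_return.
        CLEAR lt_packet.
      ENDIF.
    ENDLOOP.

    " don't forget the last, partially filled packet
    IF lt_packet IS NOT INITIAL.
      PERFORM call_bapi_for_packet USING lt_packet CHANGING lt_return.
    ENDIF.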

  • To get the number of records from CMP

    How can I get a specific number of records (25 records) from CMP using WebLogic 8?
    Anybody know? Please tell.

    http://java.sun.com/j2se/1.4.1/docs/api/java/io/File.html

  • Optimal number of records to fetch from Forte Cursor

    Hello everybody:
    I'd like to ask a very important question.
    I opened a Forte cursor with approximately 1.2 million records, and now I am trying to figure out the number of records per fetch needed to obtain acceptable performance.
    To my surprise, fetching 100 records at once gave me only about a 15 percent performance gain compared with fetching records one by one.
    I haven't found a significant difference in performance fetching 100, 500 or 10,000 records at once. At the same time, fetching 20,000 records at once makes performance approximately 20% worse (a fact I cannot explain).
    Does anybody have any experience with improving performance when fetching from a Forte cursor with a large number of rows?
    Thank you in advance
    Genady Yoffe
    Software Engineer
    Descartes Systems Group Inc
    Waterloo On
    Canada

    You can do it by writing code in the start routine of your transformation.
    1. If you have any specific filter criteria, go with that and delete the unwanted records.
    2. If you want to load a specific number of records based on a count, then in the start routine of the transformation loop through the source package records, keeping a counter until you reach your desired count, and copy those records into an internal table.
    Then delete the records in the source package and assign the records stored in the internal table back to the source package.
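
    A minimal sketch of step 2, assuming a BW 7.x transformation start routine where the input table is called SOURCE_PACKAGE; the cut-off of 25 records is purely illustrative:

    CONSTANTS lc_max TYPE i VALUE 25.          "desired number of records (hypothetical)

    DATA: lt_keep LIKE SOURCE_PACKAGE,
          ls_src  LIKE LINE OF SOURCE_PACKAGE,
          lv_cnt  TYPE i.

    LOOP AT SOURCE_PACKAGE INTO ls_src.
      lv_cnt = lv_cnt + 1.
      IF lv_cnt > lc_max.
        EXIT.                                  "stop once the counter reaches the limit
      ENDIF.
      APPEND ls_src TO lt_keep.
    ENDLOOP.

    " replace the source package with the first lc_max records only
    SOURCE_PACKAGE = lt_keep.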

  • Issue with Enqueue / Dequeue

    We are working on a scenario where data is brought into SAP through webMethods to create sales orders.
    If we were to use enqueue/dequeue function modules while doing this, would the webMethods connection hold until the process is complete, or would it time out in the middle?
    Is there a timeout period for the connection with webMethods, and if so, how can it be reset to an alternate value?

    Hi,
    In ITS, if you click on the object you can see parameters like
    ~TRANSACTION
    ~TIMEOUT
    Give a larger value for the timeout parameter; that should solve the problem.
    Regards
    Vijay D T T.

  • Enqueue/Dequeue in Oracle Aq Adapter

    Hi,
    I need to use the enqueue/dequeue option in the Oracle AQ adapter.
    However, I need to know how the matching of IDs has to happen.
    Can anyone provide me with an example with detailed steps?
    Thanks,
    Rosh

    The URL below provides information about the AQ adapter:
    http://docs.oracle.com/cd/E21764_01/integration.1111/e10231/adptr_aq.htm
    An example of the AQ adapter:
    http://jamessmith73.wordpress.com/oracle-fusion-middleware/oracle-soa-bpm-11g-blogs/soa-10g/soa-hands-on-4/
    Thanks,
    Vijay
