Issues while transferring time data (CATSDB) to HR infotypes

We need to allow users to transfer time data (CATSDB) to infotypes using
program RPTEXTPT.
(RPTEXTPT: Transfer time data to HR time management)
Some of the users do not have authorization to maintain infotypes 2001/2002/2003 via PA30,
so program RPTEXTPT cannot be used to transfer data to these infotypes.
How can we suppress the authorization check so that program RPTEXTPT can be used?
Appreciate the help.
Thank you.

Hi Rohan,
Users who run RPTEXTPT/CAT6 but do not have authorization to create infotypes 2001/2002/2003 can use a background job to create the infotypes:
1. Create a non-dialog (system) user with authorization to create infotypes 2001/2002/2003. Ask your Basis team for help.
2. Use that user name in SM36 while scheduling the batch job (a programmatic sketch follows below).
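If you prefer, the scheduling can also be done from a small program. A minimal sketch, assuming a system user named ZBATCH_HR with the required infotype authorizations already exists (the user name and job name here are hypothetical placeholders):

REPORT zschedule_rptextpt.

* Minimal sketch: run RPTEXTPT as a background job under a dedicated
* system user that holds the infotype 2001/2002/2003 authorizations.
* 'ZBATCH_HR' and the job name are hypothetical placeholders.
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_CATS_TRANSFER',
      lv_jobcount TYPE tbtcjob-jobcount.

* Open the job definition
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount
  EXCEPTIONS
    OTHERS   = 1.
CHECK sy-subrc = 0.

* Add RPTEXTPT as a job step that runs under the batch user
* (add WITH ... selections or USING SELECTION-SET '<variant>' as needed)
SUBMIT rptextpt
  USER 'ZBATCH_HR'
  VIA JOB lv_jobname NUMBER lv_jobcount
  AND RETURN.

* Release the job for immediate execution
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount  = lv_jobcount
    jobname   = lv_jobname
    strtimmed = 'X'
  EXCEPTIONS
    OTHERS    = 1.
IF sy-subrc <> 0.
  MESSAGE 'Job could not be released' TYPE 'E'.
ENDIF.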
Let me know if you need any further info.
Br/Manas

Similar Messages

  • How can I extend the table control while transferring the data

    hi
    how can I extend the table control while transferring the data?

    Hi,
    For a table control we have to handle the page-down activity (P+, or whatever function code is assigned to it) in our own coding.
    Just check out this code: it is a BDC to update transaction XK01 (vendor creation).
    Here we use a table control for the bank details. The code and the flat files are included below.
    Coding
    REPORT zprataptable2 NO STANDARD PAGE HEADING LINE-SIZE 255.

    * Vendor header data (one line per vendor, read from first1.txt)
    DATA: BEGIN OF itab OCCURS 0,
            i1    TYPE i,
            lifnr LIKE rf02k-lifnr,
            bukrs LIKE rf02k-bukrs,
            ekorg LIKE rf02k-ekorg,
            ktokk LIKE rf02k-ktokk,
            anred LIKE lfa1-anred,
            name1 LIKE lfa1-name1,
            sortl LIKE lfa1-sortl,
            land1 LIKE lfa1-land1,
            akont LIKE lfb1-akont,
            fdgrv LIKE lfb1-fdgrv,
            waers LIKE lfm1-waers,
          END OF itab.

    * Bank details for the table control, read from second.txt
    * (linked to the header via j1 = i1)
    DATA: BEGIN OF jtab OCCURS 0,
            j1    TYPE i,
            banks LIKE lfbk-banks,
            bankl LIKE lfbk-bankl,
            bankn LIKE lfbk-bankn,
          END OF jtab.

    DATA: cnt(2) TYPE n,      " table control row index; 2 digits to match (01)...(99)
          fdt(20) TYPE c.     " dynamically built screen field name, e.g. LFBK-BANKS(01)

    INCLUDE bdcrecx1.

    START-OF-SELECTION.

    * Upload the two flat files (WS_UPLOAD is obsolete on newer releases;
    * GUI_UPLOAD is its replacement)
      CALL FUNCTION 'WS_UPLOAD'
        EXPORTING
          filename = 'C:\first1.txt'
          filetype = 'DAT'
        TABLES
          data_tab = itab.

      CALL FUNCTION 'WS_UPLOAD'
        EXPORTING
          filename = 'C:\second.txt'
          filetype = 'DAT'
        TABLES
          data_tab = jtab.

      LOOP AT itab.
    *   Initial screen
        PERFORM bdc_dynpro USING 'SAPMF02K' '0100'.
        PERFORM bdc_field  USING 'BDC_CURSOR'  'RF02K-KTOKK'.
        PERFORM bdc_field  USING 'BDC_OKCODE'  '/00'.
        PERFORM bdc_field  USING 'RF02K-LIFNR' itab-lifnr.
        PERFORM bdc_field  USING 'RF02K-BUKRS' itab-bukrs.
        PERFORM bdc_field  USING 'RF02K-EKORG' itab-ekorg.
        PERFORM bdc_field  USING 'RF02K-KTOKK' itab-ktokk.

    *   Address screen
        PERFORM bdc_dynpro USING 'SAPMF02K' '0110'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFA1-LAND1'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '/00'.
        PERFORM bdc_field  USING 'LFA1-ANRED' itab-anred.
        PERFORM bdc_field  USING 'LFA1-NAME1' itab-name1.
        PERFORM bdc_field  USING 'LFA1-SORTL' itab-sortl.
        PERFORM bdc_field  USING 'LFA1-LAND1' itab-land1.

    *   Control screen
        PERFORM bdc_dynpro USING 'SAPMF02K' '0120'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFA1-KUNNR'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '/00'.

    *   Payment transactions screen with the bank details table control
        PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFBK-BANKN(01)'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '=ENTR'.

        cnt = 0.
        LOOP AT jtab WHERE j1 = itab-i1.
          cnt = cnt + 1.
    *     Build the screen field names for the current table control row
          CONCATENATE 'LFBK-BANKS(' cnt ')' INTO fdt.
          PERFORM bdc_field USING fdt jtab-banks.
          CONCATENATE 'LFBK-BANKL(' cnt ')' INTO fdt.
          PERFORM bdc_field USING fdt jtab-bankl.
          CONCATENATE 'LFBK-BANKN(' cnt ')' INTO fdt.
          PERFORM bdc_field USING fdt jtab-bankn.
    *     After five rows, page down (P+) so the following entries start
    *     again at row (01); adjust 5 to the number of rows visible in
    *     your table control
          IF cnt = 5.
            cnt = 0.
            PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
            PERFORM bdc_field  USING 'BDC_CURSOR' 'LFBK-BANKS(01)'.
            PERFORM bdc_field  USING 'BDC_OKCODE' '=P+'.  " page down activity
            PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
            PERFORM bdc_field  USING 'BDC_CURSOR' 'LFBK-BANKN(02)'.
            PERFORM bdc_field  USING 'BDC_OKCODE' '=ENTR'.
          ENDIF.
        ENDLOOP.

        PERFORM bdc_dynpro USING 'SAPMF02K' '0130'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFBK-BANKS(01)'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '=ENTR'.

    *   Accounting information
        PERFORM bdc_dynpro USING 'SAPMF02K' '0210'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFB1-FDGRV'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '/00'.
        PERFORM bdc_field  USING 'LFB1-AKONT' itab-akont.
        PERFORM bdc_field  USING 'LFB1-FDGRV' itab-fdgrv.

    *   Payment transactions (company code)
        PERFORM bdc_dynpro USING 'SAPMF02K' '0215'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFB1-ZTERM'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '/00'.

    *   Correspondence
        PERFORM bdc_dynpro USING 'SAPMF02K' '0220'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFB5-MAHNA'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '/00'.

    *   Purchasing data
        PERFORM bdc_dynpro USING 'SAPMF02K' '0310'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'LFM1-WAERS'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '/00'.
        PERFORM bdc_field  USING 'LFM1-WAERS' itab-waers.

        PERFORM bdc_dynpro USING 'SAPMF02K' '0320'.
        PERFORM bdc_field  USING 'BDC_CURSOR' 'RF02K-LIFNR'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '=ENTR'.

    *   Confirm the save popup
        PERFORM bdc_dynpro USING 'SAPLSPO1' '0300'.
        PERFORM bdc_field  USING 'BDC_OKCODE' '=YES'.

        PERFORM bdc_transaction USING 'XK01'.
      ENDLOOP.

      PERFORM close_group.
    Flat files for the above code:
    Initial screen data file:
    1 63190 0001 0001 0001 mr bal188 b in 31000 a1 inr
    2 63191 0001 0001 0001 mr bal189 b in 31000 a1 inr
    Table control data file:
    1 in sb 11000
    1 in sb 12000
    1 in sb 13000
    1 in sb 14000
    1 in sb 15000
    1 in sb 16000
    1 in sb 17000
    1 in sb 18000
    1 in sb 19000
    1 in sb 20000
    1 in sb 21000
    1 in sb 22000
    2 in sb 21000
    2 in sb 22000
    Regards,
    Kumar.

  • Date issue while transferring pics from MacBook to external drive

    Hello
    I am transferring pictures from my MacBook to an external hard drive, and my issue is that the date of the picture (date created/modified) is being replaced by the date of the transfer. So, let's say all my 2013 pictures now have today's date, 19th June 2014. I tried dragging the pics as well as exporting them, but I get the same issue either way. I do transfers regularly and this is the first time I have encountered this issue. Funnily enough, it works when I drag one picture at a time, but not for more than 3 pictures. Very weird. Your help will be much appreciated!

    There are two kinds of metadata involved when you consider a JPEG or other image file.
    One is the file metadata. This is what the Finder shows. It tells you nothing about the contents of the file, just about the file itself.
    The problem with file metadata is that it can easily change as the file is moved from place to place or exported, e-mailed, uploaded, etc.
    Photographs also have Exif and IPTC metadata. The date and time at which your camera snapped the photograph are recorded in the Exif metadata. Regardless of what the file date says, this is the actual time recorded by the camera.
    Photo applications like iPhoto, Aperture, Lightroom, Picasa, Photoshop, etc. get their date and time from the Exif metadata.
    When you export from iPhoto to the Finder, a new file is created containing your photo (and its Exif data). The file date is, quite accurately, reported as the date of export.
    However, the photo date doesn't change.
    The problem is that the Finder doesn't work with Exif.
    So your photo has the correct date, and so does the file, but they are different things. To sort on the photo date you'll need to use a photo app.

  • Issue while loading Master Data through Process Chain in Production

    Hi All,
    We are getting an error in a process chain while loading master data:
    Non-updated Idocs found in Source System
    Diagnosis
    IDocs were found in the ALE inbox for Source System that are not updated.
    Processing is overdue.
    Error correction:
    Attempt to process the IDocs manually. You can process the IDocs manually using the Wizard or by selecting the IDocs with incorrect status and processing them manually.
    I have checked the PSA but could not find any record, and the strange thing is that the job itself is not getting scheduled. Can anyone help me resolve this issue?
    Regards
    Bhanumathi

    Hi
    This problem is not related to the process chain. You can try this:
    In RSMO, select the particular load you want to monitor.
    In the menu bar, choose Environment >>> Transact. RFC >>> and select whichever is required, BW or Source System.
    In the next screen select the Execute button and the IDocs will be displayed.
    Check Note 561880 - Requests hang because IDocs are not processed.
    OR
    Transact. RFC - status running yellow for a long time (Transact. RFC is enabled in the Status tab in RSMO).
    Step 1: Go to Details > Status, get the IDoc number, and go to BD87 in R/3. Place the cursor on the red IDoc entries in the tRFC
    queue under outbound processing and click Display IDoc in the menu bar.
    Step 2: In the next screen click on Display tRFC calls (this takes you to the particular tRFC call in SM58),
    place the cursor on the particular transaction ID, go to Edit in the menu bar --> press 'Execute LUW'
    (Display tRFC calls (takes you to the particular tRFC call in SM58) ---> select the TransID ---> Edit ---> Execute LUW)
    Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc number, as it takes you
    to the particular tRFC request for that IDoc.
    OR
    Go into the job overview of the load; there you should be able to find the data package ID.
    (For this, in the RSMO screen > Environment > there is an option for Job Overview.)
    This data package TID is the transaction ID in SM58.
    OR
    In SM58, enter * / the user name or the background (ALEREMOTE) user name and execute. It will show all the pending tRFCs with
    their transaction IDs.
    In the Status Text column you can see two statuses:
    Transaction Recorded and Transaction Executing.
    Don't disturb it if the status is the second one (Transaction Executing). If the status is the first one (Transaction Recorded), manually
    execute it via 'Execute LUWs'.
    OR
    Go directly to SM58 > enter * / the user name or the background (ALEREMOTE) user name and execute. It will show the tRFCs to be executed
    for that user. Find the particular tRFC (SM37 > request name > TID from the data packet with sysfail), select the TransID (SM58) --->
    Edit ---> Execute LUW.
    (from JituK)
    Hope it helps
    Darshan
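    For reference, the stuck entries that SM58 displays live in the tRFC state table ARFCSSTATE. A minimal sketch for listing them in a report, assuming the field names ARFCDEST, ARFCSTATE and ARFCFNAM (worth verifying in SE11 on your release):

    REPORT ztrfc_stuck_list.

    * Minimal sketch: list tRFC entries as SM58 would show them.
    * ARFCSSTATE is the standard tRFC state table; the field names used
    * here are assumptions to be verified in SE11 on your release.
    TYPES: BEGIN OF ty_trfc,
             arfcdest  TYPE arfcsstate-arfcdest,   " RFC destination
             arfcstate TYPE arfcsstate-arfcstate,  " e.g. SYSFAIL, CPICERR, RECORDED
             arfcfnam  TYPE arfcsstate-arfcfnam,   " called function module
           END OF ty_trfc.

    DATA: lt_trfc TYPE STANDARD TABLE OF ty_trfc,
          ls_trfc TYPE ty_trfc.

    SELECT arfcdest arfcstate arfcfnam
      FROM arfcsstate
      INTO TABLE lt_trfc
      WHERE arfcstate IN ('SYSFAIL', 'CPICERR', 'RECORDED').

    LOOP AT lt_trfc INTO ls_trfc.
      WRITE: / ls_trfc-arfcdest, ls_trfc-arfcstate, ls_trfc-arfcfnam.
    ENDLOOP.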

  • Issues while generating Schema DAT files

    We are facing two types of issues when generating schema ".dat" files from an Informix database on Solaris, using the
    "IDS9_DSML_SCRIPT.sh" script.
    We execute the command at the Solaris prompt as follows:
    "IDS9_DSML_SCRIPT.sh <DBName> <DB Server Name>"
    The first issue: after the command is executed, the following error occurs for many tables while the ".dat" files are being generated:
    19834: Error in unload due to invalid data : row number 1.
    Error in line 1
    Near character position 54
    Database closed.
    This happens randomly for some schemas, so we move the script to a different folder in Unix and execute it again.
    Can we get a solution for avoiding this error?
    2. The second issue is as follows:
    When the ".dat" files are generated without any errors using the script, these .dat files are provided to the OMWB tool to load the source model.
    The issue here is that sometimes OMWB is not able to complete the process of creating the source model from the .dat files and gets stuck.
    Sometimes the tables are loaded, but with wrong names.
    For example, the .dat file has the table name s\ysmenus for the sysmenus table,
    and when loaded into Oracle the table is created with the name s_ysmenus.
    Based on our analysis and understanding, this error occurs because of the delimiter.
    For example, this is a snippet from a .dat file generated by the IDS9_DSML_SCRIPT.sh script: the table name sysprocauthy is generated as s\ysprocauthy.
    In Oracle this table is created with the name s_ysprocauthy.
    s\ysprocauthy║yinformixy║y4194387y║y19y║y69y║y4y║y2y║y0y║y2005-03-31y║y65537y║yT
    y║yRy║yy║y16y║y16y║y0y║yy║yy║y╤y
    Thanks & Regards
    Ramanathan KrishnaMurthy

    Hello Rajesh,
    Thanks for your prompt reply. Please find my findings below:
    *) Have there been any changes in the extractor logic causing it to fail before the write-out to file, since the last time you executed it successfully? - I am executing only the standard extractors out of the extractor kit, so presumably this shouldn't be an issue.
    *) Can this be an issue with changed authorizations? - I will check this today, but again this does not seem possible, as the same object executed fine and a file was created for a different test project I created.
    *) Has the export folder been locked or write-protected at the OS level? Have the network settings (if it is a virtual directory) changed? - It does not seem so, for the reason above.
    I will do some analysis today and report back.
    Regards
    Gundeep

  • Issue while Installing Oracle Data Access Software for Windows

    All,
    I am getting the following error while installing Oracle Data Access Software for Windows. I am installing on Windows XP, with Oracle 9i release 9.2.0.7.0 database and client in the same box.
    It shows:
    The specified key was not found while trying to GetValue
    * Stop installation of all products
    * Stop installation of this component only.
    Kindly let me know why this error is showing up.
    Regards
    Ramesh

    Most probably you have hit this issue:
    "If you have more than one Oracle Home installed on the same machine (e.g. Oracle8i client and Oracle9i Release 2 client), use the Oracle Home Selector to run your applications with Oracle9i Release 2 client. "
    As documented on the Oracle Data Access Software for Windows. Release 9.2.0.4.0
    ~ Madrid.

  • Parent not O.K.:MATERIAL issue while uploading CMIR data in CRM

    Hi All,
    I am facing an issue while doing the initial load for the customer material information record (CMIR). As per the information in the SAP note, I deleted the parent adapter object BUPA_MAIN and added CUSTOMER_MAIN. I tried again, but the problem persists, now saying "Parent not O.K.: CUSTOMER_MAIN". I removed CUSTOMER_MAIN as well, but now I am getting "Parent not O.K.: MATERIAL". So the load has not moved ahead at all. Please have a look and get back to me.
    Shyam K Gangisetti

    Hi Raymond,
    Thanks for your reply. Yes, if I use CONVERSION_EXIT_CUNIT_INPUT in my program, the issue is this: assume the user enters PC as the value for the UOM field in the flat file and uploads it. The value PC is uploaded successfully to the UOM field in transaction VK13, but in the database table (KONP) the value shows as ST.
    Regards,
    Chakradhar.
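    For what it's worth, that behavior looks correct rather than a bug: PC is the external (language-dependent) representation of the unit, while ST is the internal key stored on the database, which is why KONP shows ST. A minimal sketch of the conversion call, assuming the target field is typed as MEINS:

    DATA lv_meins TYPE meins.   " internal unit of measure

    * Convert the external value 'PC' to its internal key (typically 'ST')
    CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
      EXPORTING
        input          = 'PC'
        language       = sy-langu
      IMPORTING
        output         = lv_meins
      EXCEPTIONS
        unit_not_found = 1
        OTHERS         = 2.
    IF sy-subrc = 0.
      WRITE: / 'Internal unit:', lv_meins.
    ENDIF.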

  • Performance issue while transferring data from one itab to another itab

    hi experts,
    I have stored all the general material details in one internal table, and the description and valuation details of the material are stored in another internal table, which is a standard table. Now I need to transfer all the data from these two internal tables into one final internal table, but it is taking a lot of time as it has to transfer lakhs of records.
    I have declared the output table as shown below:
    DATA:
      t_output TYPE STANDARD TABLE
               OF type_output
               INITIAL SIZE 0
               WITH HEADER LINE.
    (according to the standard I have to declare it like this, and the two internal tables are declared similarly to the one above)
    Could somebody suggest how I should proceed with this?
    Thanks in advance.
    Regards,
    Deepu

    Have a look at the following article, which you may find useful:
    Improving performance in nested loops: https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40729734-3668-2910-deba-fa0e95e2c541
    good luck, damian
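    The usual fix is to avoid a nested LOOP over the second table and instead read it with a binary search on a sorted copy. A minimal sketch, with hypothetical structures keyed by MATNR (adapt the field names to your tables):

    * Sketch: merge two internal tables without a nested loop.
    * The structures and field names are hypothetical placeholders.
    TYPES: BEGIN OF ty_mat,
             matnr TYPE matnr,
             mtart TYPE mtart,
           END OF ty_mat,
           BEGIN OF ty_desc,
             matnr TYPE matnr,
             maktx TYPE maktx,
           END OF ty_desc,
           BEGIN OF ty_out,
             matnr TYPE matnr,
             mtart TYPE mtart,
             maktx TYPE maktx,
           END OF ty_out.

    DATA: lt_mat    TYPE STANDARD TABLE OF ty_mat,
          lt_desc   TYPE STANDARD TABLE OF ty_desc,
          lt_output TYPE STANDARD TABLE OF ty_out,
          ls_mat    TYPE ty_mat,
          ls_desc   TYPE ty_desc,
          ls_out    TYPE ty_out.

    * Sort once, then READ ... BINARY SEARCH per line: O(n log n)
    * instead of the O(n*m) of a nested loop over lakhs of records.
    SORT lt_desc BY matnr.

    LOOP AT lt_mat INTO ls_mat.
      CLEAR ls_out.
      MOVE-CORRESPONDING ls_mat TO ls_out.
      READ TABLE lt_desc INTO ls_desc
           WITH KEY matnr = ls_mat-matnr
           BINARY SEARCH.
      IF sy-subrc = 0.
        ls_out-maktx = ls_desc-maktx.
      ENDIF.
      APPEND ls_out TO lt_output.
    ENDLOOP.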

  • Idoc issue while loading the data into BI

    Hello Gurus,
    Initially I had a source system connection problem. After it was fixed by Basis, I followed the process below.
    I am loading the data using a generic extractor with a selection; it is a full load. When I load the data from the R/3 PP application, I see the following result in the BW monitor screen:
    1. The job completed in the R/3 system and 1 million records were fetched by the extractor.
    2. The records were not posted to the BW side because of a tRFC issue.
    3. I got the IDoc number and processed it in transaction BD87, but it did not process successfully; it gives the following error: "Error when saving BOM. Please see the application log."
    4. When I check the application log using transaction SLG1 with the date and time of that particular process, it is in yellow status.
    Kindly let me know how I can resolve this issue. I have already tried repeating the InfoPackage job, but I am facing the same issue. I have also checked the connection; it is OK.
    Regards

    hello veerendra,
    Thanks for your quick response. Yes, I am able to process it manually; after processing, it ended with status 51, application document not posted.
    Could you please help me out with the same?
    regards

  • Error while transferring planning data

    In HCM Personnel Cost Planning, I am getting an error while releasing a cost plan run: "No cost assignment available for period...". This happens only for cost items related to persons, for periods from 2010 onward. The system gives no error for previous periods or for other cost objects. Is this due to missing customizing settings in Controlling, since the program I am using transfers data to Controlling?
    Regards

    Hello,
    Check your controlling area validity (OKKP).
    Check your cost element validity (KA02).
    Check your cost center validity (KS02).
    Check your version validity for the fiscal year (OKEQ).
    Regards,
    Ravi

  • Issue while loading of data from DSO to InfoCube

    Hi Experts,
    Can you tell me what the root cause might be? The data coming into the DSO from R/3 is correct, as required, but after loading it from the DSO to the InfoCube the cube shows wrong data: some line items that were closed are shown as open in the cube, and the key figure values are not right either.
    Also, there is no routine code involved between the DSO and the InfoCube.
    Thanks in advance.
    NP

    I hope you didn't delete some requests from the DSO without deleting the change log; this might cause inconsistency.
    If so, delete the data from the DSO (right-click > Delete Data) and reload.

  • Issue while reflecting the data to ADF form from ADF Table

    Hi All,
    I have one scenario as follows:
    I have to open a form in entry mode, so I have used a method call activity in the task flow to call "Create".
    The form then opens in entry mode. I have an ADF table on the page showing the entered data.
    When the form opens in entry mode, instead of entering values in the fields the user selects a record in the ADF table, but because the form is in create mode it throws an error about filling the mandatory fields.
    Please tell me how, when the user selects a record in the ADF table instead of entering a new record, the values can be reflected in the ADF form.
    If I do a rollback on selecting a row of the ADF table, I get the error "Row currency has changed since the user interface was rendered. The expected row key was oracle.jbo.Key[null]", because there is no row key in the cache when the form opens.
    Please suggest a way to complete the task.
    I am using JDeveloper 11.1.2.1.0.
    Thanks,
    Gobi

    No, no, you are complicating things for yourself. :) You should describe your use case first.
    First: go through the books and the ADF blogs to learn and understand the framework, as the person above said.
    I assume you have the Create method call as the default activity while the page renders; it makes your af:form empty (ready for insertion).
    If you are doing it like that, I am sure it will throw mandatory-field errors, and the errors will not go away unless some data is entered.
    Coming to the af:table: consider the immediate attribute (true/false). With immediate = true, validation is skipped or bypassed in certain phases, depending on the component you are using (say an editable value holder or af:commandButton).
    The information is presented more legibly here:
    http://adfpractice-fedor.blogspot.in/2012/02/understanding-immediate-attribute.html
    Let me know your use case. These will assist you:
    https://blogs.oracle.com/shay/entry/executing_an_action_on_jsf_pag
    http://tanveeroracle.blogspot.in/2009/09/adf-11g-createinsert-to-display-blank.html

  • Error while transferring the data from local to master repository

    Hi, I am working on an SAP Learning Solution implementation; we are using LSOAE200 as the authoring tool. When I transfer the learning net from the authoring tool to the master repository, which is in the portal, I am facing the following error:
    Type=E, ID=NR, Number=751, Parameter='', Message='For object  , number range interval  does not exist'
    com.sap.hcm.ls.shared.datamodel.ebo.EBOCatalogException: Type=E, ID=NR, Number=751, Parameter='', Message='For object  , number range interval  does not exist'
    We can make out that the error means the number range cannot be found in R/3. Can any one of you please suggest where exactly to create the number ranges in R/3?
    Regards..
    Kishore

    Hi,
    Please check table T77IV and see whether you have assigned a number range for the object type you want to transfer.
    Please let me know in case you require any further clarifications.
    With Regards,
    Kaustuv Goswami.

  • OIM 11g R2  - Issue while removing child data

    Hi,
    We are facing the following issue when we try to submit a "Modify Account" request by removing all the child form data. The issue occurs only if the child form contains attributes of type integer, date, etc. (non-string).
    Steps followed
    ==========
    1. Create a parent process form with an attribute (e.g. FirstName)
    2. Create a child process form with 3 attributes (EmpID --> Integer, Date of Joining --> Date, Address --> String)
    3. Create a resource object
    4. Create a process definition and attach this resource object and the parent form
    5. Create some process tasks (Create User, Child Data Insert, etc.) and attach tcCompleteTask
    6. Provision this resource object to a user with one entry of child data. Since tcCompleteTask is attached, the status of the account is now "Provisioned"
    7. Click on the "Modify Account" button, remove the child entry (so that no child entry is present) and click Submit
    8. We get an error in the UI saying "IAM-2050061: Type mismatch for the attribute EmpID. The type passed is string but the corresponding type in dataset is integer."
    Any idea on how to solve this issue? Thanks.

    This could be a bug. Try raising an SR. Also attach more logs if you can.

  • Issue while loading master data from BI7 to BPC

    Dear Experts,
    I'm trying to load master data from BI 7 to BPC NW using scenario 2 mentioned in the document below.
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00380440-010b-2c10-70a1-e0b431255827
    My requirement is to load 0GL_ACCOUNT attribute and text data from the BI 7 system to BPC.
    1. As mentioned in the how-to document, I created a dimension called GL_ACCOUNT using the BPC Admin client.
    2. I am able to see GL_ACCOUNT in RSA1, but when I try to create a transformation (step 17, page 40) to load the attribute data, I cannot find 0GL_ACCOUNT (which exists in BI 7) as the source object of the transformation. When I press F4 in the Name field, I can only see the dimensions available in the BPC system.
    What could be the reason I am not getting the BI InfoObject as a source in BPC?
    Thanks in advance...
    regards,
    Raju

    Dear Gurus,
    My issue got resolved. So far I had been trying to pull data from R/3 > BW > BPC. In the existing landscape, BW and BPC are two different boxes; that is the reason I couldn't see the BW objects in BPC. To resolve the issue, I created a new InfoObject (via RSD1) in BPC, and the data now loads from R/3 > BPC InfoObject (created through RSD1) > BPC dimension.
    Thanks and regards,
    Raju
