Table /SAPAPO/RES_HEAD update?

Hi All,
There is an inconsistency between the values shown in the Resource Master and those in the table /SAPAPO/RES_HEAD.
What could be the possible reasons for this? Is the table refreshed from the parent structure periodically? What do I do if I need the updated values in /SAPAPO/RES_HEAD?
Warm Regards,
Shiva

Shiva,
Do you mean the fields below are not matching?
LC DAYS MINUS
LC DAYS PLUS
If yes:
These values are stored in the table /SAPAPO/RESLCT.
The values in CFC9 define how many time slots CIF transfers to APO
for each resource with an external time stream. If nothing is
defined there, the system uses -30...+600 as default values.
But the length of the time stream has to be defined in the resource
itself. Otherwise the system uses a default value defined in the
database table /SAPAPO/RESLCT.
If you delete those entries and create your own entry, this should work as
expected for new resources, since these values are created when you transfer the resource for the first time and are taken from the CFC9 settings.
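As a quick cross-check (a minimal sketch only: the two table names come from this thread, everything else, such as reading all entries rather than one resource, is an assumption - verify the key fields of /SAPAPO/RES_HEAD in SE11), you can compare what is actually stored on the database:

DATA: lt_res_head TYPE STANDARD TABLE OF /sapapo/res_head,
      lt_reslct   TYPE STANDARD TABLE OF /sapapo/reslct,
      lv_lines    TYPE i.

" Resource header records as currently stored on the database
SELECT * FROM /sapapo/res_head INTO TABLE lt_res_head UP TO 100 ROWS.

" Default time stream lengths used when nothing is maintained on the resource
SELECT * FROM /sapapo/reslct INTO TABLE lt_reslct.

DESCRIBE TABLE lt_res_head LINES lv_lines.
WRITE: / 'Resource header records read from /SAPAPO/RES_HEAD:', lv_lines.
DESCRIBE TABLE lt_reslct LINES lv_lines.
WRITE: / 'Default entries in /SAPAPO/RESLCT:', lv_lines.

Comparing these entries (for example in SE16) with what the Resource Master shows should tell you whether the defaults from /SAPAPO/RESLCT are being used.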
If my understanding of your problem is wrong, let me know the exact field in which you see the inconsistency.
-sundar

Similar Messages

  • Model updation in table /sapapo/model

    Hi,
    Could anybody give a solution to this query?
    Which remote function module can we use to update the table /sapapo/model?
    I have created a model using the function module RSDMESC_DB_MODEL_CREATE;
    this model is updated in the table RSDMA_DBT_MODELT.
    But I need this newly created model to be available in the table /sapapo/model.
    How can I do this? Please update me.
    Regards

    Hi,
    As per my requirement, the data is not getting updated in the table /sapapo/model.
    The function module you have given just checks the model and returns a status saying whether it is available or not.
    Regards

  • Issues with BDLS for table /SAPAPO/TSTROBJR after system refresh

    Hi,
    After a recent refresh of our QA system from the Production system, when we run BDLS for the table /SAPAPO/TSTROBJR to convert the logical system from Production to QA, it does not allow us to make the change.
    Error message reads:
    /SAPAPO/TSTROBJR          LOGSYS                              1                              0 <<<<
    <<<< Error in field  of table . Manual correction required.
    I have already tried to run the report /SAPAPO/SCHED_BUFFER_RESET, but this has not been of much help.
    What can we do to resolve this issue?
    Do we need to take some extra steps before we can run BDLS, or is there some other way to update these time streams for the unloading point?
    Thanks - Pawan

    The scheduling buffer reset did not update the time stamps, but the new GATP checks are now able to change the time streams, so the issue no longer exists.
    Closing this thread.

  • ROW COMPRESSION - table /SAPAPO/BOP

    Hello guys,
    I have a big performance problem with table /SAPAPO/BOP in our SCM system. I run /SAPAPO/BOP_DELETE to delete entries older than 7 days. I see a lot of executions in the database (sequential reads, updates, and deletes) during SPP and GATP backorder processing. As ROW COMPRESSION is active for this table, I would like to know whether it could be contributing to the performance issue.
    Could you please advise about this?
    Many thanks,
    Carlos.

    Hi
    Please check the note below; it may help you:
    Note 1416044 - BOP performance improvement for updates back from ERP to SCM
    Thanks
    Sadiq

  • Writeback column showing 0 as No and 1 as Yes: how to update the database with 1 when Yes is entered

    I have a column where I have implemented writeback, and it is working fine. On top of this I need to show 0 as No and 1 as Yes in our report; that is also done. Now, when I enter Yes in a column where it was No, I want the database table to be updated with 1. I am not sure how to do it. Someone please help me out.

    Hi ,
    In your writeback XML, try an insert query along these lines:
    INSERT INTO TABLE_XYZ (attribute1) SELECT CASE WHEN '@{C1}' = 'Yes' THEN 1 WHEN '@{C1}' = 'No' THEN 0 ELSE NULL END FROM dual
    Regards
    Rajagopal

  • How to populate customer specific field data in table /SAPAPO/ORDFLDS

    Dear Gurus,
    I have explained the problem we face in detail below. I guess anyone who has implemented the enhancement /SAPAPO/RRP_IO_COL in their system can help me out.
    Background:
    Purchase requisitions in APO are created from an IDoc that comes from a legacy system, using CALL FUNCTION 'BAPI_POSRVAPS_SAVEMULTI3'.
    Business Requirement:
    I have a business requirement to populate an additional piece of data, 'Original delivery date', from the IDoc during PR creation in the product view.
    Development:
    To achieve the above requirement, we are following the procedure below in our development system.
    1. We are using the enhancement /SAPAPO/RRP_IO_COL, methods RRP_USEX_COLS_FILL_01 and RRP_USEX_COLS_GET_TEXT_01, to display an additional field 'Original delivery date' in the /sapapo/rrp3 elements view. This field is restricted to purchase requisitions (order category AG) only. We plan to populate the 'Original delivery date' in this customer-specific field and store it in table /SAPAPO/ORDFLDS at the time of PR creation.
    2. Table /SAPAPO/ORDFLDS has been appended with the customer-specific field.
    3. We could not find any documentation on how the data can be populated in table /SAPAPO/ORDFLDS.
    4. How do we populate the liveCache data in the table /SAPAPO/ORDFLDS (i.e. using which connection parameter)?
    I would appreciate it if you could throw some light on this.
    Thanks
    Vignesh M

    Hi Vignesh,
    Any luck on this? I am trying almost the same thing and am stuck at the same point.
    Please let us know if you have any more information.

  • File to Proxy----Tables not getting updated.

    Hi all,
    I have a file-to-proxy scenario where data from a file is uploaded into a BAPI, which in turn
    updates the tables.
    If I take the payload from SXMB_MONI and test it in SPROXY, the tables are updated.
    But when I run the scenario from XI, the tables are not updated.
    Please help.

    Hi,
    Check this out in case you missed any step; it covers exactly your scenario:
    /people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy
    Hope this helps.
    Regards
    Aashish Sinha
    PS : reward points if helpful

  • C$sync_history table not getting updated

    I am doing a file-based sync using the Oracle APIs. While syncing, there is no entry in the table showing that the sync happened, but the C$SYNC_LOG table is getting updated.
    Can anybody tell me what the issue is?
    Thanks and Regards
    Lijo Lawrance

    Something additional: if removeSession throws an exception, or does not work as expected, the only way to continue is to extend the HeliosSession class and add to the history yourself.
    The code for the extended HeliosSession is listed below.
    The only thing you need is to call
    ((HeliosSessionExtended) localHeliosSession).addToSyncHistory(this); // this is the FDSession, or the new one used in the constructor of HeliosSession, or in the createSession method from syncService
    after startSession completes successfully.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.sql.Timestamp;
    import java.util.Iterator;
    import java.util.LinkedList;
    import oracle.lite.sync.HeliosSession;
    import oracle.lite.sync.HeliosTransport;
    import oracle.lite.sync.SiteDef;
    import oracle.lite.sync.Subscription;
    import oracle.lite.sync.SyncPubItemStatus;
    public class HeliosSessionExtended extends HeliosSession {

      public HeliosSessionExtended(HeliosTransport paramHeliosTransport) throws Throwable {
        super(paramHeliosTransport);
      }

      // override the addToSyncHistory method
      void addToSyncHistory(Connection paramConnection) {
        // nothing is written when sync history recording is disabled
        if (!SiteDef.SYNC_HISTORY) {
          return;
        }
        Long SessionId = this.getSessionId();
        Long DeviceSessionId = this.getDeviceSessionId();
        String clientId = this.getClientId();
        String memberId = this.getMemberId();
        String devicePlatform = this.getDevicePlatform();
        int result = this.getResult();
        String localHeliosMessage = this.getMessage();
        long startTime = this.getStartTimeMs();
        long finishTime = this.getFinishTimeMs();
        long uploadStartTime = this.getUploadStartTimeMs();
        long uploadFinishTime = this.getUploadFinishTimeMs();
        int recordCount = this.getUploadRecordCount();
        long uploadedByteCount = this.getUploadByteCount();
        long compressedByteCount = this.getUploadCompressedByteCount();
        LinkedList uploadedPubItems = this.getUploadPubItems();
        int uploadPubItemsSize = 0;
        if (uploadedPubItems != null) {
          uploadPubItemsSize = uploadedPubItems.size();
        }
        long downloadStartTime = this.getDownloadStartTimeMs();
        long downloadFinishTime = this.getDownloadFinishTimeMs();
        int downloadRecordCount = this.getDownloadRecordCount();
        long downloadedByteCount = this.getDownloadByteCount();
        long downloadedCompressedByteCount = this.getDownloadCompressedByteCount();
        LinkedList downloadedPublicationItems = this.getDownloadPubItems();
        int downloadPubItemsSize = 0;
        if (downloadedPublicationItems != null) {
          downloadPubItemsSize = downloadedPublicationItems.size();
        }
        int completeRefItemsCount = this.getCompleteRefreshItemCount();
        PreparedStatement preparedStatement = null;
        try {
          // one row per session goes into C$SYNC_HISTORY
          String insertString = "INSERT INTO C$SYNC_HISTORY (SESSION_ID,CLIENT_ID,DEVICE_PLATFORM,RESULT,START_TIME,FINISH_TIME,UPLOAD_START_TIME,UPLOAD_FINISH_TIME,UPLOAD_RECORD_COUNT,UPLOAD_BYTE_COUNT,UPLOAD_COMPRESSED_BYTE_COUNT,DOWNLOAD_START_TIME,DOWNLOAD_FINISH_TIME,DOWNLOAD_RECORD_COUNT,DOWNLOAD_BYTE_COUNT,DOWNLOAD_COMPRESSED_BYTE_COUNT,COMPLETE_REFRESH_ITEM_COUNT,DEVICE_SESSION_ID,UPLOAD_PUB_ITEM_COUNT,DOWNLOAD_PUB_ITEM_COUNT,MEMBER_CLIENT_ID) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)";
          preparedStatement = paramConnection.prepareStatement(insertString);
          preparedStatement.setLong(1, SessionId.longValue());
          if ((clientId != null) && (clientId.length() > 0)) {
            SetStringParameter(preparedStatement, 2, clientId);
          } else {
            SetStringParameter(preparedStatement, 2, " ");
          }
          if ((devicePlatform != null) && (devicePlatform.length() > 0)) {
            SetStringParameter(preparedStatement, 3, devicePlatform);
          } else {
            SetStringParameter(preparedStatement, 3, " ");
          }
          if (result == 0) {
            preparedStatement.setString(4, "SUCCESS");
          } else if (result == 1) {
            preparedStatement.setString(4, "FAILURE");
          } else if (result == -1) {
            preparedStatement.setString(4, "UNKNOWN");
          } else {
            preparedStatement.setNull(4, 12);
          }
          SetTimeParameter(preparedStatement, 5, startTime);
          SetTimeParameter(preparedStatement, 6, finishTime);
          SetTimeParameter(preparedStatement, 7, uploadStartTime);
          SetTimeParameter(preparedStatement, 8, uploadFinishTime);
          preparedStatement.setInt(9, recordCount);
          preparedStatement.setLong(10, uploadedByteCount);
          preparedStatement.setLong(11, compressedByteCount);
          SetTimeParameter(preparedStatement, 12, downloadStartTime);
          SetTimeParameter(preparedStatement, 13, downloadFinishTime);
          preparedStatement.setInt(14, downloadRecordCount);
          preparedStatement.setLong(15, downloadedByteCount);
          preparedStatement.setLong(16, downloadedCompressedByteCount);
          preparedStatement.setInt(17, completeRefItemsCount);
          if (DeviceSessionId != null) {
            preparedStatement.setLong(18, DeviceSessionId.longValue());
          } else {
            preparedStatement.setNull(18, 2);
          }
          preparedStatement.setInt(19, uploadPubItemsSize);
          preparedStatement.setInt(20, downloadPubItemsSize);
          if (memberId != null) {
            preparedStatement.setString(21, memberId);
          } else {
            preparedStatement.setNull(21, 12);
          }
          preparedStatement.executeUpdate();
          preparedStatement.close();
          preparedStatement = null;
          Object localObjcClob;
          if (localHeliosMessage != null) {
            // the session message is stored as a CLOB
            localObjcClob = new Long[] { SessionId };
            Subscription.setCLOB(paramConnection, "C$SYNC_HISTORY", "MESSAGE", localHeliosMessage, "SESSION_ID=? ", (Object[]) localObjcClob);
          }
          if (((uploadedPubItems != null) && (uploadedPubItems.size() > 0)) || ((downloadedPublicationItems != null) && (downloadedPublicationItems.size() > 0))) {
            // one row per publication item goes into C$SYNC_HIS_PUB_ITEMS
            insertString = "INSERT INTO C$SYNC_HIS_PUB_ITEMS (SESSION_ID,PHASE,PUBLICATION,PUB_ITEM,DEVICE_TABLE,START_TIME,FINISH_TIME,IS_COMPLETE_REFRESH,RECORD_COUNT,BYTE_COUNT,COMPRESSED_BYTE_COUNT,MESSAGE) VALUES (?,?,?,?,?,?,?,?,?,?,?,?)";
            preparedStatement = paramConnection.prepareStatement(insertString);
            preparedStatement.setLong(1, SessionId.longValue());
            SyncPubItemStatus localSyncPubItemStatus;
            if ((uploadedPubItems != null) && (uploadedPubItems.size() > 0)) {
              preparedStatement.setString(2, "UPLOAD");
              localObjcClob = uploadedPubItems.iterator();
              while (((Iterator) localObjcClob).hasNext()) {
                localSyncPubItemStatus = (SyncPubItemStatus) ((Iterator) localObjcClob).next();
                SetStringParameter(preparedStatement, 3, localSyncPubItemStatus.publication);
                SetStringParameter(preparedStatement, 4, localSyncPubItemStatus.pubItem);
                SetStringParameter(preparedStatement, 5, localSyncPubItemStatus.deviceTable);
                SetTimeParameter(preparedStatement, 6, localSyncPubItemStatus.startTimeMs);
                SetTimeParameter(preparedStatement, 7, localSyncPubItemStatus.finishTimeMs);
                if (localSyncPubItemStatus.isCompleteRefresh) {
                  preparedStatement.setString(8, "YES");
                } else {
                  preparedStatement.setString(8, "NO");
                }
                preparedStatement.setInt(9, localSyncPubItemStatus.recordCount);
                preparedStatement.setLong(10, localSyncPubItemStatus.byteCount);
                preparedStatement.setLong(11, localSyncPubItemStatus.compressedByteCount);
                SetStringParameter(preparedStatement, 12, localSyncPubItemStatus.message);
                preparedStatement.executeUpdate();
              }
            }
            if ((downloadedPublicationItems != null) && (downloadedPublicationItems.size() > 0)) {
              preparedStatement.setString(2, "DOWNLOAD");
              localObjcClob = downloadedPublicationItems.iterator();
              while (((Iterator) localObjcClob).hasNext()) {
                localSyncPubItemStatus = (SyncPubItemStatus) ((Iterator) localObjcClob).next();
                SetStringParameter(preparedStatement, 3, localSyncPubItemStatus.publication);
                SetStringParameter(preparedStatement, 4, localSyncPubItemStatus.pubItem);
                SetStringParameter(preparedStatement, 5, localSyncPubItemStatus.deviceTable);
                SetTimeParameter(preparedStatement, 6, localSyncPubItemStatus.startTimeMs);
                SetTimeParameter(preparedStatement, 7, localSyncPubItemStatus.finishTimeMs);
                if (localSyncPubItemStatus.isCompleteRefresh) {
                  preparedStatement.setString(8, "YES");
                } else {
                  preparedStatement.setString(8, "NO");
                }
                preparedStatement.setInt(9, localSyncPubItemStatus.recordCount);
                preparedStatement.setLong(10, localSyncPubItemStatus.byteCount);
                preparedStatement.setLong(11, localSyncPubItemStatus.compressedByteCount);
                SetStringParameter(preparedStatement, 12, localSyncPubItemStatus.message);
                preparedStatement.executeUpdate();
              }
            }
            if (preparedStatement != null) {
              preparedStatement.close();
              preparedStatement = null;
            }
          }
        } catch (Throwable localThrowable) {
          // some error logging
        } finally {
          try {
            if (preparedStatement != null) {
              preparedStatement.close();
              preparedStatement = null;
            }
          } catch (SQLException localSQLException2) {
            // some SQLException logging
          }
        }
      }

      void SetStringParameter(PreparedStatement paramPreparedStatement, int paramInt, String paramString) throws SQLException {
        if (paramString != null) {
          paramPreparedStatement.setString(paramInt, paramString);
        } else {
          paramPreparedStatement.setNull(paramInt, 12);
        }
      }

      void SetTimeParameter(PreparedStatement paramPreparedStatement, int paramInt, long paramLong) throws SQLException {
        if (paramLong > 0L) {
          paramPreparedStatement.setTimestamp(paramInt, new Timestamp(paramLong));
        } else {
          paramPreparedStatement.setNull(paramInt, 93);
        }
      }
    }
    Edited by: FlorinA on Jul 6, 2010 12:37 AM
    Edited by: FlorinA on Jul 6, 2010 12:45 AM

  • File-to-RFC: database tables are not updating

    Hi XI friends,
    In my file-to-RFC scenario, without BPM,
    SXMB_MONI shows the message as successful, but the database tables in SAP are not updated.
    my source structure..
    workorders 1..1
    ..order 1..unbounded
    ...id
    ...operation 1..unbounded
    .....id
    .....closingdate
    .....status
    .....comment
    my target is Zbapi_alm_conf_create..
    Zbapi_alm_conf_create
    ...Zdetail_return 1..1
    .....item 0..unbounded
    ...Ztimetickets
    .....item 0..unbounded
    .......orderid
    .......operation
    .......fin_conf
    .......con_text
    .......exec_fin_date
    In the message mapping
    MM_file_to_zrfc
    I changed the occurrence of the target to unbounded.
    The message mapping looks like this:
    my source structure:
    workorders 1..1
    ..order 1..unbounded      ---------> Zrfc 0..unbounded
    ...id                     ---------> Ztimetickets-item-order
    ...operation 1..unbounded ---------> Ztimetickets-item 0..unbounded
    .....id                   ---------> Ztimetickets-item-operation
    .....closingdate          ---------> Ztimetickets-item-exec_fin_date
    .....status               ---------> Ztimetickets-item-fin_conf
    .....comment              ---------> Ztimetickets-item-conf_text
    I also changed the target occurrence to unbounded in the interface mapping,
    and in ID I used enhanced interface determination and selected the interface mapping with occurrence unbounded.
    In SXMB_MONI it shows success.
    In the adapter monitoring (receiver):
    Receiver channel 'cc_sap_work' for party '', service 'SAP_ERP__DEV' (internal name 'RfcClient[cc_sap_work]')
    Client data: {jco.client.lang=EN, jco.client.snc_mode=0, jco.client.client=400, jco.client.passwd=****, jco.webas.ignore_jdsr_error=1, jco.client.user=aar, jco.client.sysnr=10, jco.client.ashost=53.247.192.84}
    Repository data: {jco.client.lang=EN, jco.client.snc_mode=0, jco.client.client=400, jco.client.passwd=****, jco.webas.ignore_jdsr_error=1, jco.client.user=thotv, jco.client.sysnr=10, jco.client.ashost=53.247.192.84}
    Current pool size: 0, maximum pool size : 1
    Channel History
    - OK: 2006-12-31 14:19:47 CET: Message processed for interface ZBAPI_ALM_CONF_CREATE
    - OK: 2006-12-31 14:18:50 CET: Message processed for interface ZBAPI_ALM_CONF_CREATE
    But the database tables are not updated. If I execute ZBAPI_ALM_CONF_CREATE manually in SAP, the tables are updated.
    Please guide me.
    Thanks in advance.
    regards
    Ram

    Hi,
    my mapping is like this:
    message                      message
    .message1                    message1
    ..workorders 1..1
    ..order 1..unbounded      ---------> Zrfc 0..unbounded
    ...id                     ---------> Ztimetickets-item-order
    ...operation 1..unbounded ---------> Ztimetickets-item 0..unbounded
    .....id                   ---------> Ztimetickets-item-operation
    .....closingdate          ---------> Ztimetickets-item-exec_fin_date
    .....status               ---------> Ztimetickets-item-fin_conf
    .....comment              ---------> Ztimetickets-item-conf_text
    I did not map the message nodes at the root. Is it necessary to map the messages?
    Please tell me.
    regards
    ram
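    One common cause worth ruling out (an assumption on my side, it is not mentioned above): when XI calls the Z-BAPI remotely it runs in its own LUW, so nothing is saved unless the function commits explicitly, whereas a manual test in SE37 commits when the test frame ends. A minimal sketch of the explicit commit inside the wrapper:

    " At the end of ZBAPI_ALM_CONF_CREATE, after the time tickets have been
    " passed to the underlying confirmation BAPI without errors:
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.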

  • Excise Tables are not updating

    Hi,
    I have an issue in a depot sale. I created a GR against a PO and then posted an excise invoice, but the excise tables are not updated.
    Why is that? Please explain.
    Thanks & Regards
    Saeed

    Hi,
    While doing the GR through transaction MIGO (Goods Receipt + Purchase Order), tick "Item OK" and press Enter; one more additional menu will appear: Excise Invoice. It has two options: 1) No Excise Entry, 2) Create RG23D Entry.
    If you need to update the excise details through MIGO, select "Create RG23D Entry"; if you do not want to update the excise entry through MIGO, select "No Excise Entry" and, after generating the GR document, proceed with the excise update through transaction J1IG.
    I hope this is clear.
    Thanks & Regds,
    Rajesh Kolhe

  • From which program is the table TSTC updated

    Hi all ,
    Can anybody tell me from which program the table TSTC, which stores all the transaction codes, is updated?
    Regards,
    Madhavi

    The transaction for maintaining transaction codes is SE93.
    Go to SE93 > System > Status > Program.
    Program name: SAPLSEUK
    include lseuktop.      " Global Data
    include lseukuxx.      " Function Modules
    include lseuko01.
    include lseuki01.
    include lseukf01.
    include lseukf00.
    include lseuki00.
    " Navigation transaction
    include lseuktn0.
    include lseuke01.
    include lseukfwm.
    Regards,
    Ansari.
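    If you only need to read an entry rather than maintain it, TSTC is an ordinary transparent table and can be selected directly; a small sketch (using SE93 itself as the example transaction):

    DATA ls_tstc TYPE tstc.

    " Read the TSTC entry for one transaction code
    SELECT SINGLE * FROM tstc INTO ls_tstc WHERE tcode = 'SE93'.
    IF sy-subrc = 0.
      WRITE: / 'Transaction', ls_tstc-tcode, 'runs program', ls_tstc-pgmna.
    ELSE.
      WRITE: / 'Transaction code not found in TSTC.'.
    ENDIF.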

  • From which transaction the table OBEW is updated

    Hi Experts,
    From which transaction is the table OBEW updated?
    Thanks,
    Nagendra

    The table OBEW gets updated through MI21/MI22 and MB01/MB02.

  • Table Maintenance Generator update problem

    Hello,
    I am having a problem related to the table maintenance generator update.
    I am fetching the data in the table maintenance generator from a standard table.
    The problem is that when I fetch some records, some records are updated properly but some are not.
    Please suggest a solution.
    Thanks.
    Swati.

    >
    Swati Khandelwal wrote:
    > Hello All.
    > Thanks for your reply.
    > The field which is not updating is not the key field.
    >
    > Thanks.
    > Swati
    It doesn't matter.
    But for the field that is not updating, you need to check whether its key exists in the table.
    I assume it does.
    One more thing I would like to confirm: are you using UPDATE or MODIFY?
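    For reference, the distinction matters for records whose keys do not exist yet; a minimal sketch with a hypothetical table ZMY_TABLE (not from this thread):

    DATA ls_row TYPE zmy_table.

    " UPDATE changes only a row whose key already exists;
    " for a new key nothing is written and sy-subrc is 4.
    UPDATE zmy_table FROM ls_row.
    IF sy-subrc <> 0.
      " MODIFY inserts when the key is new and updates otherwise,
      " which is usually what a maintenance dialog needs.
      MODIFY zmy_table FROM ls_row.
    ENDIF.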

  • HR Tables are not updated due to compression error

    We have a custom program that is used to copy data from the Production system to other QA systems. The program uses the function module
    TABLE_DECOMPRESS to decompress the file downloaded from Production and update the HR tables (the PA* series) in QA.
    While uploading those tables from the file, it updates half of the tables; the other tables are
    not updated because TABLE_DECOMPRESS raises an exception saying "Compression error". What could the problem be here?
    Edited by: Ganesh Kumar on Feb 2, 2011 10:14 AM
    Edited by: Ganesh Kumar on Feb 2, 2011 10:25 AM
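    One way to narrow this down is to wrap the call so that a single failing table is logged instead of stopping the whole upload. The sketch below is only an illustration: the TABLES parameters IN/OUT, their line type, and the COMPRESS_ERROR exception name are assumptions from memory, so check the interface of TABLE_DECOMPRESS in SE37 before using it.

    DATA: lt_compressed TYPE STANDARD TABLE OF tab512,   " compressed data from the file (assumed line type)
          lt_data       TYPE STANDARD TABLE OF tab512.   " decompressed table content (assumed line type)

    " NOTE: parameter and exception names below are assumptions - verify in SE37
    CALL FUNCTION 'TABLE_DECOMPRESS'
      TABLES
        in             = lt_compressed
        out            = lt_data
      EXCEPTIONS
        compress_error = 1
        OTHERS         = 2.
    IF sy-subrc <> 0.
      WRITE: / 'Decompression failed for this table, sy-subrc =', sy-subrc.
      " continue with the next table instead of aborting the whole upload
    ENDIF.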

    I have provided some information below that may help.
    Step 1. Update the device using VZ Access manager. Instruction below:
    http://support.vzw.com/clc/devices/knowledge_base.html?id=14636
    Step 2. Unplug the device (if tethered), restart the computer, and try another USB port (if applicable).
    Step 4. Deleting and recreating dial-up connections.
    http://support.vzw.com/clc/devices/knowledge_base.html?id=23830
    Step 5. Try to connect.
    Step 6. (If Step 5. Fails) Disconnect, unplug the device, completely uninstall the current version of VZ Access Manager, and restart  the computer.
    Step 7. Download VZ Access Manager from our website www.vzam.net
    -On the main page, select Supported Devices on the left hand side.
    -Under the Data Devices section, look for the Verizon Wireless USB 760 (second from the bottom of the row)
    -Select the link under the Windows 7 column for the Verizon Wireless USB 760
    -Click Download Now and Save to the desktop
    Step 8. Install VZ Access Manager using the downloaded file
    -Follow the setup instructions
    -Plug in the device when prompted
    -Try to connect
    I hope this helps.

  • How to find out when a table was last updated?

    Is there a way to find out when a table was last updated/inserted/deleted? Thanks!

    There may be an easier way, but if you are trying to get information on something that has already happened, look at your redo logs and archived logs. It would be hard, but in V$LOGMNR_CONTENTS you could find the maximum time for a given object. Note that to use this you need to set up LogMiner. Since you did not give a version, try the Oracle 9i DBA Guide, page 9-1.

    Hi, Is it possible to have a (home) network with a multi & full accessed read/write iPhoto library? I've an old PowerPC MacMini primarily used (like a NAS) as a central storage device. It contains my music, photo libraries etc which are further store