Error in ME21N: 'The date 1,245 is not convertible (please correct)'

While preparing a stock transport order (STO) through transaction ME21N, the system gives the error 'The date 1,245 is not convertible (please correct)'.
Can you please help us analyse why this error occurs?
Regards,
Paras

Hi,
Enter the date in the format defined in your user profile.
Check transaction SU3, Defaults tab, Date format field,
for example MM/DD/YYYY, DD.MM.YYYY or YYYY-MM-DD.
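For illustration, a minimal ABAP sketch (assuming the standard function module CONVERT_DATE_TO_INTERNAL) of how such a conversion fails when the external date string does not match the date format maintained in the user profile:
DATA lv_internal TYPE sy-datum.
CALL FUNCTION 'CONVERT_DATE_TO_INTERNAL'
  EXPORTING
    date_external            = '1,245'      " the value the system could not interpret
  IMPORTING
    date_internal            = lv_internal
  EXCEPTIONS
    date_external_is_invalid = 1
    OTHERS                   = 2.
IF sy-subrc <> 0.
  " Same situation as the ME21N message: the string cannot be converted
  WRITE: / 'The date is not convertible - check the date format in SU3.'.
ENDIF.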
Nagaraj K

Similar Messages

  • Error: ORA-16525: the Data Guard broker is not yet available

    Hi ,
    After upgrading from 11.2.0.1 to 11.2.0.3 on AIX, GI/RDBMS is upgraded on the standby, but the primary database has not been upgraded yet. I had set dg_broker_start=false and disabled the configuration before I started the upgrade.
    Once the GI for Oracle Restart was upgraded, I upgraded the RDBMS binaries and brought up the standby in mount mode. While trying to enable the configuration it throws the error below. I had already started the broker process.
    SQL> show parameter dg_
    NAME                    TYPE     VALUE
    dg_broker_config_file1  string   /u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr1mvmxs2.dat
    dg_broker_config_file2  string   /u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr2mvmxs2.dat
    dg_broker_start         boolean  TRUE
    DGMGRL> show configuration;
    Configuration - Matrxrep_brkr
    Protection Mode: MaxAvailability
    Databases:
    mvmxp2 - Primary database
    mvmxs2 - Physical standby database
    Error: ORA-16525: the Data Guard broker is not yet available
    Fast-Start Failover: DISABLED
    Configuration Status:
    ERROR
    from drcmvmxs2.log
    Starting Data Guard Broker bootstrap <<
    Broker Configuration File Locations:
    dg_broker_config_file1 = "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr1mvmxs2.dat"
    dg_broker_config_file2 = "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr2mvmxs2.dat"
    12/19/2012 16:05:33
    Data Guard Broker shutting down
    DMON Process Shutdown <<
    12/19/2012 16:10:20
    Starting Data Guard Broker bootstrap <<
    Broker Configuration File Locations:
    dg_broker_config_file1 = "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr1mvmxs2.dat"
    dg_broker_config_file2 = "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr2mvmxs2.dat"
    ~
    Regards

    Hi,
    I removed the configuration, removed the broker files from the RAC primary (mvmxp2) and the single-instance standby (mvmxs2), and re-created them. I tried this many times but keep getting error ORA-16532. I need to have this standby back up before I start upgrading the primary.
    SQL> alter system set dg_broker_start=true scope=both;
    System altered.
    SQL> exit
    Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, Automatic Storage Management, OLAP, Data Mining
    and Real Application Testing options
    palmer60:/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs>dgmgrl
    DGMGRL for IBM/AIX RISC System/6000: Version 11.2.0.3.0 - 64bit Production
    Copyright (c) 2000, 2009, Oracle. All rights reserved.
    Welcome to DGMGRL, type "help" for information.
    DGMGRL> connect sys@mvmxp2
    Password:
    Connected.
    DGMGRL> CREATE CONFIGURATION 'Matrxrep' AS
    > PRIMARY DATABASE IS 'mvmxp2'
    > CONNECT IDENTIFIER IS 'mvmxp2';
    Configuration "Matrxrep" created with primary database "mvmxp2"
    DGMGRL> ADD DATABASE 'mvmxs2' AS
    > CONNECT IDENTIFIER IS 'mvmxs2';
    Database "mvmxs2" added
    DGMGRL> SHOW CONFIGURATION;
    Configuration - Matrxrep
    Protection Mode: MaxPerformance
    Databases:
    mvmxp2 - Primary database
    mvmxs2 - Physical standby database
    Fast-Start Failover: DISABLED
    Configuration Status:
    DISABLED
    DGMGRL> ENABLE CONFIGURATION;
    Enabled.
    DGMGRL> SHOW DATABASE MVMXS2;
    Database - mvmxs2
    Role: PHYSICAL STANDBY
    Intended State: APPLY-ON
    Transport Lag: (unknown)
    Apply Lag: (unknown)
    Real Time Query: OFF
    Instance(s):
    mvmxs2
    Database Status:
    DGM-17016: failed to retrieve status for database "mvmxs2"
    ORA-16532: Data Guard broker configuration does not exist
    ORA-16625: cannot reach database "mvmxs2"
    DGMGRL>
    I tailed drcmvmxs2.log during stop and start of the broker:
    palmer60:/u01/app/omvmxp1/diag/rdbms/mvmxs2/mvmxs2/trace>tail -f drcmvmxs2.log
    12/19/2012 20:32:20
    drcx: cannot open configuration file "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr1mvmxs2.dat"
    ORA-27037: unable to obtain file status
    IBM AIX RISC System/6000 Error: 2: No such file or directory
    Additional information: 3
    12/19/2012 20:32:55
    drcx: cannot open configuration file "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr2mvmxs2.dat"
    ORA-27037: unable to obtain file status
    IBM AIX RISC System/6000 Error: 2: No such file or directory
    Additional information: 3
    12/19/2012 20:59:10
    Data Guard Broker shutting down
    DMON Process Shutdown <<
    12/19/2012 20:59:35
    Starting Data Guard Broker bootstrap <<
    Broker Configuration File Locations:
    dg_broker_config_file1 = "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr1mvmxs2.dat"
    dg_broker_config_file2 = "/u01/app/omvmxp1/product/11.2.0/dbhome_2/dbs/dr2mvmxs2.dat"
    Not sure how to fix this one.
    Regards

  • ORA-16525: the Data Guard broker is not yet available while connecting ...

    Hi,
    I am trying to connect to the Oracle database instance at the DGMGRL prompt.
    The following error message appears:
    DGMGRL>
    DGMGRL> connect sys/<password>@<oracle_service_name>
    Connected.
    Error:
    ORA-16525: the Data Guard broker is not yet available
    ORA-06512: at "SYS.DBMS_DRS", line 124
    ORA-06512: at line 1
    DGMGRL> exit
    $ e
    We are using oracle enterprise manager ...
    SSM

    DG broker is the way to go. There are only minor points you can't control - for example, when you have standby redo logs and no redo application delay, DG broker will always start real-time apply.
    Also, DG broker is the only way if you want fast-start failover (a 10gR2 feature).
    What is the latency of your network, especially under load? Add this latency to each commit - will the commit time be acceptable? If not, you cannot use sync.
    What version do you use? In 10gR1 there are issues with async - there is only about a 50MB buffer, and if you get such a backlog, even with async a commit will wait for the standby. (In 10gR2 this is solved; async will read from the on-disk redo logs.)
    NET_TIMEOUT - set it low for sync (so commits won't hang long), higher for async. The default is 180 seconds, and Oracle recommends a minimum value of 10 seconds. I would start with 10 seconds for sync and 180 for async. The main question is how reliable your network is, and how long a brownout you want to survive without affecting data protection (at the expense of performance).
    Regarding the proper setting of all the bells and whistles of log_archive_dest_n - I had the best experience with leaving it to DG broker. Just set the LogXptMode property to sync/async/arch, and let DG broker set everything else (service, valid_for, affirm, retries...).
    The authoritative source: http://www.oracle.com/technology/deploy/availability/htdocs/maa.htm#Publications
    If you are on 10.2.0.1 - 10.2.0.3, please see Metalink bugs 5578157 and 5587231 - if sqlnet.outbound_connection_timeout is not set, the LGWR can hang even if net_timeout is set. (Yes, real-world experience :-)
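    As a sketch only (the standby name 'standby_db' below is hypothetical; LogXptMode and NetTimeout are the broker properties mentioned above), letting the broker drive the destination settings looks roughly like this:
    DGMGRL> EDIT DATABASE 'standby_db' SET PROPERTY LogXptMode='SYNC';
    DGMGRL> EDIT DATABASE 'standby_db' SET PROPERTY NetTimeout=10;
    DGMGRL> SHOW DATABASE VERBOSE 'standby_db';
    The broker then generates the matching log_archive_dest_n attributes (service, valid_for, affirm and so on) itself.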

  • Error while extracting the data in the R/3 production system

    Hi Team,
    We got the following error while extracting the data in R/3 production system,
    Error 7 When Sending an IDoc R3 3
    No Storage space available for extending the inter 44 R3 299
    No storage space available for extending the inter R3 299
    Error in Source System RSM 340
    Please guide us to fix the issue

    It's very difficult to help you without knowing
    - what is going to be transferred
    - where you get this error
    - the system configuration
    - the actual memory usage
    - the operating system
    - the database and its configuration, etc.
    I suggest you open an OSS call and let support have a look at your system. It's much easier to find the cause of the problem with access to the system.
    Markus

  • Error when updating the data from DSO to cube

    Hi,
    I am getting an error when uploading data from the DSO to the cube.
    The following is the error message:
    Unable to determine period for date 20090101, fiscal year variant Z2: Error #
    How can I solve this issue?
    Regards
    Annie

    Hi,
    To maintain the fiscal year variant, go into Customizing for Financial Accounting (FI) under Financial Accounting Global Settings >>> Fiscal Year >>> Maintain Fiscal Year Variant.
    Check these links:
    http://help.sap.com/saphelp_scm41/helpdata/en/50/0d89f2ad919c40b95b9ae7583c8c96/frameset.htm
    http://help.sap.com/saphelp_scm41/helpdata/en/50/0d89f2ad919c40b95b9ae7583c8c96/content.htm
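    For illustration only, a minimal ABAP sketch (the date and variant are taken from the error message) using the standard function module DATE_TO_PERIOD_CONVERT, which shows whether the variant actually returns a period for the failing date; if it raises an exception, the periods for that year are not maintained:
    DATA: lv_buper TYPE poper,
          lv_gjahr TYPE gjahr.
    CALL FUNCTION 'DATE_TO_PERIOD_CONVERT'
      EXPORTING
        i_date         = '20090101'   " date from the error message
        i_periv        = 'Z2'         " fiscal year variant from the error message
      IMPORTING
        e_buper        = lv_buper
        e_gjahr        = lv_gjahr
      EXCEPTIONS
        input_false    = 1
        t009_notfound  = 2
        t009b_notfound = 3
        OTHERS         = 4.
    IF sy-subrc <> 0.
      WRITE: / 'No period found - maintain variant Z2 for fiscal year 2009.'.
    ELSE.
      WRITE: / 'Period', lv_buper, 'in fiscal year', lv_gjahr.
    ENDIF.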
    Regards,
    shikha

  • Errors in the high-level relational engine. The data source view does not contain a definition for the table or view. The Source property may not have been set.

    Hi All,
    I have a cube in which I'm using the time dimension that I created in the warehouse. I now want a new measure in the cube, an average over time, and when I tried to create the new measure I got a message that no time dimension was defined. So I created a new time dimension in SSAS using the wizard. But when I tried to process the new time dimension I got the following error message:
    "Errors in the high-level relational engine. The data source view does not contain a definition for "SSASTIMEDIM" the table or view. The Source property may not have been set."
    Can anyone please tell me why I cannot create a new measure averaged over time using my time dimension? Also, what am I doing wrong with SSASTIMEDIM that I'm getting this error?
    Thanks

    Hi PMunshi,
    According to your description, you get the above error when processing the time dimension, right?
    In this scenario, since you have updated the DSV, there should be no problem with the table's existence. One possibility is that the table has been specified for tracking in the notifications for proactive caching, but isn't available any more for some reason. Please change the Proactive Caching setting to "MOLAP".
    Reference:
    How To Implement Proactive Caching in SQL Server Analysis Services SSAS
    If you have any question, please feel free to ask.
    Best Regards,
    Simon Hou
    TechNet Community Support

  • Error while uploading the data using FM "UPLOAD"

    Hi,
    I am encountering an error while uploading data from a text file with function module UPLOAD.
    The error is "File does not exist or cannot be opened".
    But a file with that name and extension does exist.
    Regards
    Vishnu

    You have to create RC29P-IDNRK(var) using a CONCATENATE statement. Try this:
    * The field name in the BDC table must carry the row index, e.g. RC29P-IDNRK(01)
    DATA: new_mark TYPE bdcdata-fnam.
    CONCATENATE 'RC29P-IDNRK(' var ')' INTO new_mark.
    * Pass the dynamically built screen field name and the value to the BDC routine
    PERFORM bdc_field USING new_mark W_BOM-QTY.

  • Error while loading the data from PSA to Data Target

    Hi to all,
    I'm facing an error while loading the data to the data target.
    Error: Record 1: Value 'Kuldeep Puri Milan Joshi ' (hex. '004B0075006C0064006500650070002000500075007200690'
    Details:
    Requests (messages): Everything OK
    Extraction (messages): Everything OK
    Transfer (IDocs and TRFC): Errors occurred
          Request IDoc : Application document posted
          Info IDoc 2 : Application document posted
          Info IDoc 1 : Application document posted
          Info IDoc 4 : Application document posted
          Info IDoc 3 : Application document posted
          Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 er
    Processing (data packet): Errors occurred
          Update PSA ( 2462  Records posted ) : No errors
          Transfer Rules ( 2462  -> 2462  Records ) : No errors
          Update rules ( 2462  -> 2462  Records ) : No errors
          Update ( 0 new / 0 changed ) : Errors occurred
          Processing end : Errors occurred
    I'm totally new to this issue. Please help me solve this error.
    Regards,
    Saran

    Hi,
    I think you are facing an invalid character issue.
    This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step is to identify whether all the records are in the PSA; you can find this out by checking the Details tab in RSMO, the job log, or by sorting the PSA records based on status. Once that is confirmed, force the request to red and delete the particular request from the target cube. Then go to the PSA and edit the incorrect records (correcting or blanking out the invalid entries in the particular field/InfoObject of the incorrect record) and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request and update it to the target manually (right-click on the PSA request > Start update immediately).
    I will add the step-by-step procedure to edit PSA data and update it into the target (request based).
    In your case the error message says Error: Record 1: Value 'Kuldeep Puri Milan Joshi '. You just need to convert this to capital letters in the PSA and reload.
    Edit the field to KULDEEP PURI MILAN JOSHI in the PSA and push it to the target.
    Identifying incorrect records:
    The system won't show all the incorrect records at the first attempt. You need to search the PSA table manually to find all of them.
    1. First see RSMO > Details > expand the update rules / processing tabs and you will find some of the error records.
    2. Then go to the PSA and filter using the status of the records. Filter all the red requests. This may still not show all the incorrect records.
    3. Then go to the PSA and filter the incorrect records based on the particular field.
    4. If this also doesn't work out, go to the PSA and sort (not filter) the records based on the particular field with incorrect values and it will show all the records. Note down the record numbers and then edit them one by one.
    If you want to confirm, find the PSA table and search manually.
    Also run the report RS_ERRORLOG_EXAMPLE. With this report you can display all incorrect records of the load and also find whether the error occurred in the PSA or in the transfer rules.
    Steps to resolve this:
    1. Force the request to red in RSMO > Status tab.
    2. Delete the request from the target.
    3. In RSMO, at the top right you can see the PSA maintenance button > click it to go to the PSA.
    4. Edit the record.
    5. Save the PSA data.
    6. Go to RSA15 > search by request name > right-click > update the request from the PSA to the target.
    Refer to How to Modify PSA Data:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
    This should solve your problem for now.
    As a long-term fix you can apply a user exit on the source system side, or change your update rules to ensure that this field is blanked out before it is loaded into the cube, or add that particular character to the permitted character list in BW:
    RSKC --> type ALL_CAPITAL --> F8 (Execute)
    OR
    Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and enter ALL_CAPITAL or the characters you want to add.
    Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the characters you have entered.
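    For illustration only, a minimal ABAP sketch of the kind of check BW applies to characteristic values (the permitted character set below is assumed to be roughly the standard default; ALL_CAPITAL additionally requires upper case):
    DATA gv_allowed TYPE string
      VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.
    DATA lv_value TYPE string VALUE 'Kuldeep Puri Milan Joshi'.
    TRANSLATE lv_value TO UPPER CASE.   " what the manual PSA edit / ALL_CAPITAL fix does
    IF lv_value CO gv_allowed.          " CO = 'contains only' these characters
      WRITE: / 'Value would be accepted by the load.'.
    ELSE.
      WRITE: / 'Value still contains characters outside the permitted set.'.
    ENDIF.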
    Refer
    /people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
    /people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
    /people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
    http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
    For adding Other characters
    OSS note #173241 – “Allowed characters in the BW System”
    Thanks,
    JituK

  • No data Exists error while retrieving the data from a table

    Hi Everyone!
    I am getting "No Data Exists" error while retrieving the data from a table....where i need to check...if possible please give me example link.....please help me regarding this

    Hi!
    Thanks for your response.
    I have written vo.executeQuery for the table. My page is running but I am not getting the data. I have to select two LOVs, and when I click on the Go button the data has to display in the table. After selecting the LOVs, when I click Go, the "No data exists" message appears in the table. The table has data and the query is also executing. Please tell me where I made the mistake.

  • Error when loading the data from PSA to ODS......

    Hi BW guru's,
    I am facing a problem while loading data from the PSA to the ODS, so please help me in this regard.
    Please give me step-by-step guidelines.
    The errors while loading the data from the PSA to the ODS are "There are no PSA tables for these selection criteria" and "An error occurred when reading PSA data".
    thanks in advance,
    ashok.

    Hi Ashok,
    You can push data from the PSA to the ODS. To do this, go to the PSA in RSA1 > PSA > go to that request > right-click > select "Schedule update immediately"; the data will then be moved from the PSA to the ODS.
    Or:
    In the ODS, delete the failed request, go to the Processing tab of the InfoPackage, select the third option "PSA and then subsequently into data targets", and schedule the InfoPackage.
    Bye,
    Sunil

  • Getting Error while accessing the data using odata service

    Hi All,
    I am new to SAP Fiori.
    I am getting the errors below while accessing the data using an OData service:
    "Failed to load resource: the server responded with a status of 404 (Not found)"
    "No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin "
    I have tried all the solutions, like changing the URL pattern "proxy/http",
    and disabled security in Chrome (Chrome is the updated version).
    I tried with IE and still got the same problem.
    I also installed all the required software in Eclipse.
    While installing the GWPA plugin I got the following error.
    Let me know if anyone has an idea.
    Thanks in advance.

    > Do you want to add and/or update the data in the already existing tables or do you want to replace the content completely?
    >
    > so in that way :
    > both the options are fine, whatever takes less time.
    Sorry mate, but YOU have to know what you want here.
    I gave you an easy-to-follow set of steps.
    As you don't seem to mind the outcome, you might just use them...
    > I wanted to know whether I can use the loadercli for this export/import or not? If yes, are there any new steps to do before I do the export/import?
    We had this discussion before...
    >
    > For that the easiest option would be just to drop the tables of SAPR3 and run the import again.
    >
    > For ease of use you could also just do:
    > - logon as superdba
    > - drop user SAPR3
    > - create user SAPR3 password SOMEPW not exclusive dba
    >
    > After these steps you can easily pump the data into the database again.
    >
    > So here in the above given steps, I am creating a new SAPR3 user - and why is it not exclusive dba?
    > I already have that user SAPR3 - can I use the same one?
    Yes, you do have the SAPR3 user.
    But you don't seem to like reading documentation or learning how the tools work or anything like that.
    Therefore I gave you a simple way to reach your goal.
    Of course it's possible to reuse the user.
    But then you would have to deal with already existing tables, already existing data etc.
    You don't seem to be able to do that. So, the easy steps might be better suited for your needs.
    regards,
    Lars

  • Error "Encountered an error while exporting the data. View the lof file (DPR-10126)

    Hello,
    I would like to have suggestions and help to solve a problem in IS 4.2.
    I've created a view from two tables, joined together on the key from each one.
    Then I've bound the rule to the view (mentioned above) and calculated the score for it.
    The calculation is successful; however, when I try to export all the failed data I get the following error:
    "Encountered an error while exporting the data. View the log file (DPR-10126)"
    When exporting failed data from calculations made directly against the data sources I do not encounter any problem, but when I try to export all data to a CSV file (not only the 500-result limit) I get this kind of error.
    However, when I visualize the data from the created view, it is also possible to export it to a CSV file.
    In summary, the error occurs only when exporting all failed data from the view.
    Can anyone help with this issue?
    Thx
    Best Regards,

    Hello,
    Your information is correct.
    I've just confirmed it in the 4.2 SP3 Release Notes and they specifically refer to this issue.
    A user who has 'View objects' rights for the project and a failed rule connection cannot export the failed data. This issue has been fixed in this release.
    Thank you very much for your input.
    Best Regards,

  • Error, while pushing the data from Oracle to MSSQL.

    Hi,
    I am facing the error below while pushing data from Oracle to MSSQL.
    ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
    [Transparent gateway for MSSQL][Microsoft][ODBC SQL Server Driver][SQL Server]Update or insert of view or function 'View_Name' failed because it contains a derived or constant field.[Microsoft][ODBC SQL Server Driver][SQL Server]Statement(s) could not be prepared. (SQL State: 00000; SQL Code: 8180)
    Please suggest how I can overcome this.
    Thanks.

    Replying to the question above:
    This is an error from SQL Server being passed back to Oracle, so you can see the underlying problem.
    What it is saying is that you are trying to insert (or update) data through a database view, and that view contains a column that is either a constant (literal) value or is derived (a calculation, formula, string concatenation etc.).
    As such, you can't insert or update data in that column because it has nowhere to go in the database table(s) that underlie the view.
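    For illustration only (the table, column and database-link names below are hypothetical; adjust them to your gateway setup), the usual workaround is to write to the underlying base table over the gateway link, listing only the real columns and leaving the derived/constant column out:
    -- insert into the base table instead of the view, skipping the derived column
    INSERT INTO base_table@mssql_link (order_id, order_qty)
    VALUES (1001, 5);
    -- likewise, restrict updates to non-derived columns only
    UPDATE base_table@mssql_link
       SET order_qty = 6
     WHERE order_id = 1001;
    COMMIT;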

  • 5200: Error executing query: The data form grid is invalid.

    Hi,
    I am getting this error: "5200: Error executing query: The data form grid is invalid. Verify that all members selected are in Essbase. Check log for details.
    com.hyperion.planning.HspException;hasPovDims=1;povXML=<?xml version="1.0"?><datasources><datasource name="Profit " dsid="228af2c3_129ca3dd85c_-77b1" allowEdit="1"><dim name="Period" dimIndex="1" dsName="Profit " keyDimName="Period" memberName="Apr" displayName="Period: Apr"/><dim name="Year" dimIndex="2" dsName="Profit " keyDimName="Year" memberName="FY10" displayName="Year: FY10"/><dim name="Entity" dimIndex="5" dsName="Profit " keyDimName="Entity" memberName="9999" displayName="Entity: 9999"/></datasource></datasources>"
    I have checked that both systems, Essbase and Planning, are the same; nothing's different. Still I cannot find the exact reason why the report is not working.
    Can anyone please help me?
    By the way, I am using Hyperion Planning 9.3.1.
    Thanks,
    BIMS

    Hi
    You probably deleted some members from the outline that are used by the report.
    I would recommend you open your report/grid via the reporting client and try to find which elements are absent.
    Regards
    Alexander

  • Error occurred in the data selection; if I do a repeat, then it is working fine

    Hi Freinds,
    Here is the error message,
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Details Tab:
    Extraction (messages): Errors occurred
    Error occurred in the data selection ....
    If I do a repeat delta, then it works fine.
    As it is a daily load, I have been manually repeating the load for the past 10 days.
    I have tried using program RS_TRANSTRU_ACTIVATE_ALL.
    Any inputs please...

    Hi Friends,
    Thanks for your valuable inputs; the problem is solved.
    The reason is that there is a time difference between the R/3 system and the BI system.
    The tolerable time difference is only 60 seconds.
    The Basis team took downtime at the OS level and synchronized the clocks.
    The problem is that on the source system GNP the time differences
    persist, which can cause problems with the delta upload.
    Universal Time Coordinated UTC....: 1291187626
    Date and time of database.........: 01.12.2010 12:46:59
    Date and Time of R/3-Kernel.......: 01.12.2010 12:43:46
    Date and Time of ABAP-Processor...: 01.12.2010 12:46:58
    ABAP Time zone Setup ..............: 19800
    Date and Time / local time ........: 01.12.2010 12:43:46
    This leads to an exception in the delta processing, because it
    tolerates a deviation of only 60 seconds. (When the repeat is processed,
    the "local time" in ABAP is higher than the qRFC counter at the time the delta was processed, so a repeat works fine.)
    Please synchronize the clocks as described in notes 101726 and 741734 to eliminate the time difference.
    With Regards
    Badri Thiriveedhi.
