Clear steps for HD data migration needed

I have spent a lot of time searching this and other forums, and am looking for step-by-step instructions for replacing a hard drive and, especially, transferring the data to a new internal drive. I have viewed the video available on YouTube and the info provided by Apple. The actual drive installation seems straightforward, but I am confused about: 1) what type of enclosure do I need (USB vs. FireWire)? 2) do I need an additional application, or is Disk Utility sufficient? Carbon Copy Cloner? Migration Assistant? SuperDuper? Cables? Power adapters? Oy!!
macbook 1.83Ghz   Mac OS X (10.4.9)   2 GB RAM

I did this last week, replaced my 80G drive with a 160G drive.
Items required:
- new Serial ATA drive - I chose 160G WD drive from newegg.com, $95 plus tax, free shipping - $102
- Serial ATA external case. I got USB 2.0; FireWire is technically superior, but more difficult to find. I got mine from OWC (macsales.com) for $34.95, plus shipping. Whether you choose FireWire or USB 2.0, MOST important is that the case is for SATA (Serial ATA) drives - be absolutely certain of this.
- tools: #0 Phillips screwdriver, Torx T8 screwdriver
- download the free SuperDuper
Procedure:
- close ALL programs first, especially Mail. You don't want your received mail to change during this process, though that's not a disaster.
- use Disk Utility to Repair Permissions on your old drive - or just leave it to SuperDuper, below.
- install NEW drive in external enclosure. Don't completely reassemble enclosure, since this is temporary
- connect new drive/enclosure to Mac with USB cable
- Use Disk Utility to partition the new external drive as one partition, USING the GUID partition scheme. (The normal Mac partition scheme would also work, but the drive then can't be booted over USB on an Intel Mac.) Use GUID.
- To be most thorough, erase the new drive using Mac OS Extended (Journaled), and select 'zero out data' under Security Options. The 'zero' part isn't strictly necessary, but many recommend it as the best way to initialize a new drive, allowing the drive to recognize any problem sectors. Takes a while - but worth it.
- SuperDuper. Copy (your internal drive) to (your new drive), using 'Backup - all files'. Under Options | General, check 'Repair Permissions...', and select 'Erase (new drive), then copy files from (old drive)'. Click 'OK', then 'Copy Now'. You'll be asked for an admin user/password.
- wait a long time. It took a couple of hours for about 60G.
At this point, you should have a bootable, exact clone of your old drive on the new, USB drive. Depending on a SuperDuper setting I forgot to mention above, it might even have rebooted already, from the new drive.
Anyway, reboot from the external drive, take a few minutes to see that everything works as expected. Probably best not to start mail.
Once you're certain the new clone works correctly, shutdown the Mac.
- swap the internal drive with the external drive.
- reboot the Mac, with the USB disconnected. If all has gone as planned, it should work exactly as it did before. If not, your original drive is untouched, you can swap it back in and try again, or boot from the old drive via USB and try again.
That's from memory - I don't think I forgot anything... I'm sure somebody will correct me if I did.
It's much easier than it sounds - good luck!
MacBook 2.0GHz Core 2 Duo, 2GB, 160G HD; PowerBook G4 1.5GHz 15"   Mac OS X (10.4.10)   Logic Pro 7.2.3, Focusrite Saffire...

Similar Messages

  • Need help for HR data migration

    Hi Experts,
     I need help with data migration for the PA30 and PP01 transactions. If you have any standard BAPI for uploading the data, or any BDC programs, please let me know so I can proceed. An early reply would be great.
    Thanks,
    Deesanth

    You can use the standard batch program RPUSTD00.  Also you can use the function module HR_INFOTYPE_OPERATION.
    Hope this is helpful.
    Thanks,
    Pavan

  • RDS for SAP data migration through IDocs

    Dear Experts
      We have just installed BODS 4.1, and RDS is also installed. I am in the process of exploring the RDS. I found that all the migration objects are available in the form of IDocs. Now my questions are:
    How do we work with custom IDocs? Do I need to code extensively on the R/3 side to upload the custom fields, such as a user exit or BAdI for custom fields?
    If yes, then I would say the implementation time through RDS will remain more or less the same.
    Through BODS or the RDS migration objects, is there any way to migrate custom fields without coding on the ABAP R/3 side?
      Please let me know how to achieve the above requirements with RDS.
    Thanks
    Vijay Mukunthan

    Hi Vijay,
    You are right: the standard content of the Rapid Data Migration RDS packages brings the content with the standard IDoc interface for the SAP standard only, so custom objects are not pre-built with the package. However, we incorporated the foundation to include custom coding as easily as possible:
    1) IDoc
    We are using the standard interface technology IDoc. IDocs have an enhancement concept which is described in the online help. There are three main cases, and all of them are supported; however, two need additional coding:
    Z-fields: additional fields in existing tables lead to IDoc segment extensions. With the IDoc segment versioning concept this is easy to realize without any ABAP coding.
    Z-tables: additional IDoc segments will be necessary and lead to a new IDoc version. With the IDoc versioning concept you can easily add Z-segments by leveraging the provided user exits.
    Custom business objects: you can even create a completely custom IDoc message type in the Z-namespace, leveraging the given ALE layer of the IDoc middleware.
    2) Enhancement guide
    With the RDS package we offer an enhancement guide which helps you modify the jobs and data flows in the SAP Data Services content according to the changes in the target IDoc structure, to reflect additional fields and tables. We built it as a step-by-step guide following a sample IDoc type, but it will work similarly for any IDoc.
    Best regards,
    Frank Densborn.

  • Steps for master data loading by using process chain

    Hi
    I want to load the master data by using a process chain. Can anyone tell me the step-by-step method for loading the master data?
    I mean, which process types should I include in the process chain? Please tell me in sequence.
    I'll assign the points.
    kumar

    Hi,
    The process of loading master data is similar to transaction data in terms of using the InfoPackages.
    First you have to load attributes, followed by texts and hierarchies. For master data you have to include the process type called Attribute Change Run. This can be included as a separate process type or by using the AND process.
    Hope this helps.
    Assign points if useful.
    Regards,
    Venkat

  • Steps for loading data into the infocube in BI7, with dso in between

    Dear All,
    I am loading data into the InfoCube in BI7, with a DSO in between. My data flow looks like....
    Top to bottom:
    InfoCube (Customized)
    Transformation
    DSO (Customized)
    Transformation
    DataSource (Customized).
    The mapping and everything else looks fine, and data is also seen in the cube on the FULL load.
    But due to some minor error (I guess), I am unable to see the DELTA data in the DSO, although it is loaded into the DataSource through process chains.
    Kindly advise me where I went wrong.
    OR... step-by-step instructions for loading data into the InfoCube in BI7, with a DSO in between, would be really helpful.
    Regards,

    Hi,
    My first impulse would be to check if the DSO is set to "direct update". In this case no delta is possible, because the change log is not maintained.
    My second thought would be to check the DTP moving data between the DSO and the target cube. If this is set to Full, you will not get a delta. It is only possible to create one DTP, so if you created one in Full mode you can't switch it to Delta; create the DTP in Delta mode instead.
    Hope this helps.
    Kind regards,
    Jürgen

  • Need clear steps for doing performance tuning on SQL Server 2008 R2 (DB Engine, Reporting Services and Integration Services)

    We have to investigate a reporting solution where things are getting slow (it may be hardware, database design, or network matters).
    I have read a lot in MSDN and some books about performance tuning on SQL Server 2008 R2 (or other versions), but frankly, I feel a little lost in all that stuff.
    I'm looking for practical steps in order to do the tuning. Perhaps someone has a recipe for that: a success story...
    My (brainstorm) methodology would follow these steps:
     Resource bottlenecks: CPU, memory, and I/O bottlenecks
     tempdb bottlenecks
     A slow-running user query: missing indexes, statistics, ...
     Use performance counters: there are many; can someone give us a list of the most important ones?
     How to fine-tune the SQL Server configuration
     SSRS and SSIS configuration?
    And then make the recommendations.
    Thanks
    "there is no Royal Road to Mathematics, in other words, that I have only a very small head and must live with it..."
    Edsger W. Dijkstra

    Hello,
    There is no clearly defined procedure that can be categorized as step-by-step performance tuning. Your first goal is to find the cause, drilling down to the factor causing the slowness of SQL Server: it can be a poorly written query, missing indexes, outdated stats, a RAM crunch,
    a CPU crunch, and so on and so forth.
    I generally refer to the doc below for SQL Server tuning:
    http://technet.microsoft.com/en-us/library/dd672789(v=sql.100).aspx
    For SSIS tuning I refer to the docs below:
    http://technet.microsoft.com/library/Cc966529#ECAA
    http://msdn.microsoft.com/en-us/library/ms137622(v=sql.105).aspx
    When I face an issue I generally look at wait stats; wait stats give you an idea of which resource the query was waiting on.
    -- By Jonathan Kehayias
    SELECT TOP 10
        wait_type ,
        max_wait_time_ms AS wait_time_ms ,
        signal_wait_time_ms ,
        wait_time_ms - signal_wait_time_ms AS resource_wait_time_ms ,
        100.0 * wait_time_ms / SUM(wait_time_ms) OVER ( )
            AS percent_total_waits ,
        100.0 * signal_wait_time_ms / SUM(signal_wait_time_ms) OVER ( )
            AS percent_total_signal_waits ,
        100.0 * ( wait_time_ms - signal_wait_time_ms )
            / SUM(wait_time_ms) OVER ( ) AS percent_total_resource_waits
    FROM sys.dm_os_wait_stats
    WHERE wait_time_ms > 0 -- remove zero wait_time
      AND wait_type NOT IN -- filter out additional irrelevant waits
        ( 'SLEEP_TASK', 'BROKER_TASK_STOP', 'BROKER_TO_FLUSH',
          'SQLTRACE_BUFFER_FLUSH', 'CLR_AUTO_EVENT', 'CLR_MANUAL_EVENT',
          'LAZYWRITER_SLEEP', 'SLEEP_SYSTEMTASK', 'SLEEP_BPOOL_FLUSH',
          'BROKER_EVENTHANDLER', 'XE_DISPATCHER_WAIT', 'FT_IFTSHC_MUTEX',
          'CHECKPOINT_QUEUE', 'FT_IFTS_SCHEDULER_IDLE_WAIT',
          'BROKER_TRANSMITTER', 'KSOURCE_WAKEUP',
          'LOGMGR_QUEUE', 'ONDEMAND_TASK_QUEUE',
          'REQUEST_FOR_DEADLOCK_SEARCH', 'XE_TIMER_EVENT', 'BAD_PAGE_PROCESS',
          'DBMIRROR_EVENTS_QUEUE', 'BROKER_RECEIVE_WAITFOR',
          'PREEMPTIVE_OS_GETPROCADDRESS', 'PREEMPTIVE_OS_AUTHENTICATIONOPS',
          'WAITFOR', 'DISPATCHER_QUEUE_SEMAPHORE', 'XE_DISPATCHER_JOIN',
          'RESOURCE_QUEUE' )
    ORDER BY wait_time_ms DESC
    Use the link below to analyze wait stats:
    http://www.sqlskills.com/blogs/paul/wait-statistics-or-please-tell-me-where-it-hurts/
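    The split the query computes (resource wait = total wait minus signal wait) can also be checked offline; here is a minimal Python sketch, with made-up sample numbers standing in for exported sys.dm_os_wait_stats rows:

```python
# Sketch: resource vs. signal wait breakdown from exported wait-stats rows.
# Each row is (wait_type, wait_time_ms, signal_wait_time_ms); numbers are made up.
def wait_breakdown(rows):
    total = sum(wait_ms for _, wait_ms, _ in rows)
    out = []
    for wait_type, wait_ms, signal_ms in rows:
        resource_ms = wait_ms - signal_ms  # time spent waiting on the resource itself
        out.append((wait_type, resource_ms, 100.0 * wait_ms / total))
    return out

rows = [("PAGEIOLATCH_SH", 8000, 500), ("CXPACKET", 2000, 1500)]
for wait_type, resource_ms, pct in wait_breakdown(rows):
    print(f"{wait_type}: resource wait {resource_ms} ms, {pct:.1f}% of total waits")
```

    A high signal-wait share points at CPU pressure; a high resource-wait share points at the resource named by the wait type.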
    HTH
    PS: for Reporting Services you can post in the SSRS forum.
    Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers

  • What cords do I need for a data migration from a 2008 MacBook to a 2012 MacBook Air

    My 2008 MacBook has a FireWire port and my new 2012 MacBook Air has a Thunderbolt port. I have a Thunderbolt cable, but I can't find an adapter to FireWire, or a FireWire cord. Should I return the Thunderbolt cable and surrender the 100 bucks for One to One, or go through the tedious process of using my external hard drive as the 'messenger boy'?

    There are all sorts of Thunderbolt to FireWire adapters out there. Here's one on Amazon: http://www.amazon.com/Apple-Thunderbolt-to-FireWire-Adapter/dp/B008RXYOKY/ref=sr_1_11?ie=UTF8&qid=1346277186&sr=8-11&keywords=thunderbolt+firewire
    You'd also need a FireWire 800 to 400 adapter: http://www.amazon.com/elago-FireWire-400-800-Adapter/dp/B002TF0ZDQ/ref=sr_1_8?ie=UTF8&qid=1346277133&sr=8-8&keywords=thunderbolt+to+firewire

  • User defined step for polling data

    Hi,
    I'm creating a User Defined VI, to be called from SignalExpress, that needs to poll a serial device for data.
    The serial device makes new data available in a register array at a rate of 10Hz, and uses an additional register as a "message ID" to indicate when new data is available.
    This means that my VI needs to poll this message ID, and when it increments, I can read the register array into a waveform and return it to SignalExpress.
    Now, what is the best/correct way of writing this VI in regard to the polling wait time?
    I see two options here for the case when my VI is called and it detects that no new message is available:
    1) It immediately returns to SignalExpress ("non-blocking").
    In this case, should I return an empty waveform or the previous waveform?
    (I do not want duplicated data logged in SignalExpress.)
    2) It blocks the SignalExpress loop where the user-defined step is placed, by waiting in my VI until a new message arrives, and then returns this new data in a waveform.
    My user defined step is placed in a separate loop in signal express so the blocking in case 2 should not affect other DAQ loops in my SignalExpress project, right?
    /LeifS
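    A minimal sketch of the polling loop described above (the read_message_id and read_registers helpers are hypothetical stand-ins for the actual serial reads):

```python
import time

def poll_for_data(read_message_id, read_registers, last_id,
                  timeout=0.5, interval=0.02):
    """Poll the device's message-ID register until it changes (new data arrives
    at ~10 Hz), then read the register array. Returns (new_id, data), or
    (last_id, None) on timeout so the caller never logs duplicated data."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        current = read_message_id()
        if current != last_id:      # message ID incremented: new data available
            return current, read_registers()
        time.sleep(interval)        # don't busy-wait between polls
    return last_id, None
```

    With a timeout of 0 this behaves like option 1 (non-blocking, returning None instead of a duplicate); with a timeout longer than the 100 ms update period it behaves like option 2 (blocking until a new message arrives).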

    I don't think it is directly possible for changing this at the front end level.
    You might need to make changes in some SE24 Classes

  • Steps for transferring data to extra tab fields

    Hi,
          I have to populate target days from the range of coverage into an extra tab field. What are the steps I need to follow?
    First, I need to use the FM EXIT_SAPLCMAT_001 to get the value on the R/3 side and send it to APO.
    I think I need to create a field and give a name for it in the APO product master Extra tab.
    I know how to do the first part, but I don't have any idea how to proceed after that.
    I would appreciate any help.
    Thanks.

    I found the procedure myself.
    Here it is:
    1. Create a project in CMOD for the user exit FM “EXIT_SAPLCMAT_001”.
    2. Enter your coding to populate one of the fields AT101 to AT105 in the table CT_CIF_MATLOC (AT10X, X being one of 1 to 5).
    3. In APO, maintain the field names in SPRO > APO > Master Data > Maintain Freely Definable Attributes for the field AT10X as “Your required text here”.
    When the material is CIFed from R/3 to APO, the value is filled into the field “Your required text here”.

  • Steps for Master data upload from ECC6.0 to BI7.0

    Hi experts,
    I need to load material master and customer master data from ECC 6.0 to BI 7.0, and I could not see the steps for doing so on SDN.
    Can anyone please give me the steps from beginning to end using the Business Content 0MATERIAL_ATTR/TEXT/HIER?
    I will assign deserving points to all the answers.
    Thanks in advance.

    Hi,
    Steps:
    1. Go to RSA5 in ECC6 to install the Business Content DataSource.
    2. If you want to make any changes, go to RSA6 in ECC6.
    3. Replicate the DataSource in BW.
    4. Create a Transformation (to connect the DataSource to the master data object).
    5. Create an InfoPackage (to load from the source system to the PSA).
    6. Create a DTP (to load from the PSA to the master data objects).
    Please let me know if I am missing any steps.
    Ali.

  • How to clear cache for XML data source?

    Hi All,
    I'm facing a problem loading XML data into Oracle using Oracle Data Integrator. The problem is that ODI caches the data of the XML file the first time it loads the data, and afterwards, when the XML file has changed, ODI still loads data from the cache.
    I did several tests on this. If I run the interface in the local agent of the Designer, I must restart the Designer to clear the cache. If I run the interface in an agent, I must restart the agent to clear the cache.
    Below is my configuration:
    ODI version: 10.1.3.5.0
    The technology of the source: XML.
    JDBC driver: com.sunopsis.jdbc.driver.xml.SnpsXmlDriver
    Does anybody know how to clear the cache of the agent or XML?
    Thanks.

    Hi ,
    Use drop_on_disc (dod) in your JDBC URL.
    For example:
    jdbc:snps:xml?f=C:\mutation.xml&d=C:\mutation.xsd&lf=C:\mutation.log&ro=true&dod=yes
    The dod (drop_on_disc) option automatically drops the schema when the JDBC connection is closed.
    dod must be the last option in the URL.
    This option removes the reference to the schema from the driver, and if the schema is stored in memory, it also drops the schema. If the schema is stored in an external database (set via db_props), the driver tries to drop the database schema.
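    If the URL is assembled in code, the "last option" rule is easy to enforce by always appending dod at the very end. A small sketch (the helper name is my own, not part of ODI):

```python
def add_dod(jdbc_url):
    # dod (drop_on_disc) must be the LAST option in the ODI XML-driver URL,
    # so append it after every other property is already in place.
    sep = "&" if "?" in jdbc_url else "?"
    return jdbc_url + sep + "dod=yes"

url = add_dod(r"jdbc:snps:xml?f=C:\mutation.xml&d=C:\mutation.xsd&ro=true")
# url now ends with "&dod=yes"
```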
    Thanks,
    Sutirtha

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open Purchase Orders? What's the normal strategy for the data migration and cut-over stage?
    My client wants to know how many open Purchase Orders there are in the legacy system, and then determine whether to do manual or automatic data migration. If manual, how do we do it? If automatic, how? Because all the material, vendor, and plant numbers are different. How do we track them? How do we find the matches between new and old?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy system migration workbench) to load MM master data (material masters), Inventory (WIP, RM, FG, etc) Purchasing Master data (Vendors, Purchase Info Records, Source Lists, Quota Arrangements), and Purchasing transactional documents (POs, PurReqs, Scheduling Agreements, etc).  Depending on the complexity and volume of data, it  may be necessary to write custom programs to load the data.  You will find this out during your requirements gathering.
    It is uncommon but possible to load all of these data manually.  I have never run across a client that wants to pay a consultant's hourly rate to sit at a terminal and peck away loading master data, so if the client intends to have his own users enter the data manually, the project manager should make provision that there will be qualified, TRAINED client employees available for this data entry.  I did once help manually with a portion of a conversion, of Sales Credits, but there were only about 30 SD docs to load.   I did this the evening before go-live day, while I was waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For Purchase orders on the SAP side, you could use any of the standard SAP Purchasing reports, such as ME2W, ME2M, ME2C, ME2L, ME2N.  If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report.
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data.  For an SAP consultant, this is not very realistic (unless the legacy system is another SAP system).  Most of us do not understand the workings of the myriad legacy systems.  The client is usually expected to produce one or more legacy system technical experts for you to liaise with.  You normally negotiate with the technical expert about every facet of the data migration.  In addition, you will liaise with business users, who will help you and the implementation team to logically validate that the final solution (a turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you mentioned how to track the mapping of master data between legacy and SAP.  There are many ways to do this.  I normally try to get the legacy person to do the conversion on his end, e.g., when he gives you the load file, you would like him to have already translated the master data and inserted the SAP-relevant values into the file.  If this is not possible, I usually use MS Access databases to maintain a master map, and I perform the mapping on a PC.  If your data package is small, you can probably get by using MS Excel or similar.
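    The master-map idea (translate legacy keys to SAP values before loading, and surface anything unmapped rather than loading it) can be sketched in a few lines; all field names and numbers here are hypothetical:

```python
import csv, io

# Hypothetical master map: legacy vendor numbers -> SAP vendor numbers.
vendor_map = {"V-1001": "0000100001", "V-1002": "0000100002"}

# Stand-in for the legacy extract file.
legacy_file = io.StringIO("legacy_vendor,amount\nV-1001,250\nV-1002,75\n")

translated, unmapped = [], []
for row in csv.DictReader(legacy_file):
    sap_vendor = vendor_map.get(row["legacy_vendor"])
    if sap_vendor is None:
        unmapped.append(row)   # surface mapping gaps instead of loading bad data
    else:
        translated.append({"sap_vendor": sap_vendor, "amount": row["amount"]})
```

    The unmapped list is exactly the comparison report you would hand back to the legacy technical expert.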
    Good Luck,
    DB49

  • SAP Legacy data Migration Starategy/Process

    Hi Friends
    I would request, if anybody can let me know, the process of legacy data migration into SAP. What steps do I have to follow while doing legacy data migration? My questions are:
    1) How to upload GL balances from legacy to SAP
    2) How to upload vendor and customer open items into SAP
    3) How to upload asset balances
    4) What is the use of migration clearing accounts?
    Kindly provide any documents for legacy data migration into SAP.
    Thanks in advance
    Rao

    Dear Rao,
    Just check the below link; you can easily understand how to upload the balances.
    http://www.saptechies.com/sap-pdf-books-download/SAP_Go_live_strategy11248141773.pdf
    Check the below procedure and do it the same way.
    First create 5 dummy GL codes. These GL codes should be zero at the end of uploading the balances.
    1  GL Offset Account               Dr
           Asset Offset Account            Cr
           AP Offset Account               Cr
           Sales                           Cr
           Other GLs                       Cr
    2  Cash                            Dr
       Consumption                     Dr
       Asset Offset Account            Dr
       AR Offset Account               Dr
       Material Offset Account         Dr
       Other GLs                       Dr
           GL Offset Account               Cr
    3  AP Offset Account               Dr
           Vendor 1                        Cr
           Vendor 2                        Cr
           Vendor 3                        Cr
    4  Customer 1                      Dr
       Customer 2                      Dr
       Customer 3                      Dr
           AR Offset Account               Cr
    5  Fixed Asset 1                   Dr
       Fixed Asset 2                   Dr
           Asset Offset Account            Cr
           Acc. Depreciation Account       Cr
    6  Stock Inventory Account         Dr
           Material Offset Account         Cr
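    The rule that each batch balances and that the dummy offset accounts come back to zero can be checked mechanically. A small sketch with made-up amounts, using the account names from the scheme above:

```python
from collections import defaultdict

# Signed postings per batch: Dr amounts positive, Cr amounts negative.
# Amounts are made up; each batch must itself balance to zero.
batches = [
    [("GL Offset", 100), ("Asset Offset", -60), ("AP Offset", -40)],
    [("Cash", 40), ("Asset Offset", 60), ("GL Offset", -100)],
    [("AP Offset", 40), ("Vendor 1", -40)],
]

balances = defaultdict(int)
for batch in batches:
    assert sum(amount for _, amount in batch) == 0, "batch does not balance"
    for account, amount in batch:
        balances[account] += amount

# The dummy offset accounts must net to zero after all uploads.
for offset in ("GL Offset", "Asset Offset", "AP Offset"):
    assert balances[offset] == 0, f"{offset} did not clear"
```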
    Regards,

  • Pls Help me with steps to add data from xml file to SAP B1 through B1iSN.

    Please help me with the steps to add data from an XML file to SAP B1 through B1iSN. I am getting stuck in the XSL transformation: I am not able to understand where the mapping code needs to be added.
    Please explain the steps for adding data from XML to a B1 quotation, step by step.
    thanks and regards
    Priya

    Hi,
    Have you checked this: https://sap.na.pgiconnect.com/p45508295/?launcher=false&fcsContent=true&pbMode=normal ?
    Thanks,
    Gordon

  • Data Migration From Peoplesoft , JDEdwards To SAP.

    Hi,
    This is Kiran here. We are doing data migration work from PeopleSoft and JDEdwards to SAP. On the SAP side it involves master data tables related to Customer, Vendor, and Material, and metadata tables related to SD, MM, and FI. We as SAP consultants identified the fields from the above tables and marked them as Required, Not Required, or Mandatory. The PeopleSoft and JDEdwards folks came up with the same from their side. Now we want to map the fields. As I am new to data migration, can anybody suggest what steps are involved in data migration, and how to do the data mapping? Thanks in advance.
    Thanks
    Kiran.B

    Hi Kiran,
    Good... Check out the following documentation and links
    Migrating from one ERP solution to another is a very complex undertaking. I don't think I would start with comparing data structures. It would be better to understand the business flows you have currently with any unique customizations and determine how these could be implemented in your target ERP. Once this is in place, you can determine the necessary data unload/reload to seed your target system.
    A real configuration of an ERP system will only happen when there is real data in the system. The mapping of legacy system data to a new ERP is a long difficult process, and choices must be made as to what data gets moved and what gets left behind. The only way to verify what you need to actually run in the new ERP environment is to migrate the data over to the ERP development and test environments and test it. The only way to get a smooth transition to a new ERP is to develop processes as automatic as possible to migrate the data from the old system to the new.
    Data loading is not a project that can be done after everything else is ready. Just defining the data in the legacy system is a huge horrible task. Actually mapping it to one of the ERP system schemas is a lesson in pain that must be experienced to be believed.
    The scope of a data migration project is usually a fairly large development process with a lot of proprietary code written to extract legacy data, transform and load the data into the ERP system. This process is usually called ETL (extract, transform, load.)
    How is data put into the ERP?
    There is usually a painfully slow data import facility with most ERP systems. Mashing data into the usually undocumented table schema is also an option, but must be carefully researched. Getting the data out of the legacy systems is usually left to the company buying the ERP. These export - import processes can be complex and slow, sometimes specialized ETL tools can help, sometimes it is easier to use what ever your programmers are familiar with, tools such as C, shell or perl.
    An interesting thing to note is that many bugs and quirks of the old systems will be found when the data is mapped and examined. I am always amazed at what data I find in a legacy system; usually the data has no relational integrity. Note that it does not have much more integrity once it is placed in an ERP system, so accurate and clean data going in helps to create a system that can work.
    The Business Analysts (BAs) that are good understand the importance of data migration and have an organized plan to migrate the data, allocate resources, give detailed data maps to the migrators (or help create the maps) and give space estimates to the DBAs. Bad BAs can totally fubar the ERP implementation. If the BAs and management cannot fathom that old data must be mapped to the new system, RUN AWAY. The project will fail.
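    The ETL flow described above, stripped to a skeleton (all field names and values are hypothetical; the real extract and load stages would talk to the legacy export and to SAP's import facility):

```python
# Minimal extract-transform-load skeleton for a legacy-to-SAP migration.
def extract():
    # Stand-in for a legacy export; in practice this reads files or a database.
    return [{"cust_no": "C1", "name": " acme corp "}]

def transform(rows):
    # Cleansing and mapping to the target schema; this is the stage where the
    # legacy system's quirks and integrity problems surface.
    return [{"KUNNR": r["cust_no"], "NAME1": r["name"].strip().upper()}
            for r in rows]

def load(rows):
    # Stand-in for the real import facility (LSMW, IDoc, BAPI, ...).
    return len(rows)

loaded = load(transform(extract()))
```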
    Check these links
    http://pdf.me.uk/informatica/AAHN/INFDI11.pdf
    http://researchcenter.line56.com/search/keyword/line56/Edwards%20Sap%20Migration%20Solutions/Edwards%20Sap%20Migration%20Solutions
    http://resources.crmbuyer.com/search/keyword/crmbuyer/Scm%20Data%20Migration%20On%20Peoplesoft%20Peoplesoft%20Data%20Migration/Scm%20Data%20Migration%20On%20Peoplesoft%20Peoplesoft%20Data%20Migration
    Good Luck and Thanks
    AK
