Live-cache for IDES

Hi Gurus,
              I have tested the IDES system and it does not have liveCache. Can I still use the system? Please confirm for me, as requested, that it does not work without it.
Thanks in advance
regards,
Kumar

Kumar.
I am only restating what has already been said: you cannot use your system without a liveCache installation. Please get in touch with your Basis administrator to get it configured and activated.
For further reading, see:
SAP liveCache technology
Abhi

Similar Messages

  • How to install liveCache for an Oracle DB

    Hi,
    How do we install liveCache for an Oracle DB?
    We have an SCM Dev system with an Oracle database. Now we are planning to install liveCache on the same DB. Is that possible?
    Please give the appropriate steps.
    Regards,
    Siva.

    Thanks for your response.
    I understood that liveCache is a product that should be applicable to any DB.
    I need to install liveCache on the existing system (running with an Oracle DB).
    Please guide me through the procedure.
    Am I going in the wrong direction? Please guide.
    Thanks,
    Siva

  • Live cache error while correcting inconsistency in Resource

    Dear Expert
    While correcting an inconsistency in liveCache, the inconsistency is not getting corrected.
    The error is: Bucket vector does not exist in liveCache for resource WR201_1000_008.
    We are using block planning in PP/DS. I have checked /SAPAPO/RST02, but the resource is there in liveCache and its status is green.
    I am attaching the error file.
    Thanks & Regards
    Virender

    Hi Virender,
    The problem is an SCM issue, so I have already triggered "alert moderator" to forward it to the SCM forum.
    Best regards,
    James

  • (Very urgent) Read data from BW to liveCache

    Hi,
         I am very new to APO liveCache. I need to read data from BW into liveCache for forecasting. I know transactions LC10 and LC11, but the problem is that I need to know where to enter the BW cubes or BW information into liveCache.
    Would appreciate your help.
    Regards,
    Suguna.S

    Try transaction /SAPAPO/TSCUBE or program /SAPAPO/RTSINPUT_CUBE.
    Regards
    Krishna Chaitanya.P

  • SCM 5.0 liveCache issue

    Hello,
    I am new to SCM 5.0. I have 2 questions regarding liveCache for SCM 5.0.
    1) If I want to run SNP or PP/DS, liveCache is necessary; that means we MUST install liveCache for the APO planning run.
    2) From now on there is no 32-bit liveCache version for SCM 5.0; I can only install the 64-bit liveCache version for SCM 5.0, right?
    Could anyone give me some comments on these 2 issues? Thanks in advance.
    Best Regards,
    Karl

    Hi Karl,
    1. Whichever planning runs you execute in APO, there has to be a liveCache. The data is refreshed from the database into liveCache and accessed there.
    In APO, data is stored either as order series or as time series.
    2. Yes, you have to install the 64-bit version.
    Regards,
    Kishore Reddy.

  • ST22 timeout for all LC-related transactions; liveCache start/stop from LC10

    Hi Team
    We are getting an ST22 timeout for all LC-related transactions and for liveCache start/stop from LC10.
    LC version 7.9
    OS AIX
    SAP SCM 7
    SDBVERIFY gives the following error:
    Checking package Redist Python
    Checking package Loader
    Checking package ODBC
    Checking package Messages
    Checking package JDBC
    Checking package DB Analyzer
    ERR: Failed
    ERR: Checking installation "Legacy" failed
    ERR: Group of directory /sapdb/data/config/install changed [0 =>
    sdbregview -l is showing good.
    Any idea what might have gone wrong?
    We are trying to use sdbverify -repair_permissions, but are not sure about the exact syntax to use.
    It is not related to the timeout parameter; we tested with different timeout values, but still get the same error.
    thanks
    Kishore Ch

    Hello Kishore,
    you could check the sizing of the liveCache data:
    * Report /SAPAPO/TS_LCM_REORG_SNP checks the SNP planning areas for superfluous objects.
    * Delete old/temporary APO data.
    * Report /SAPAPO/TS_LCM_REORG checks time series for superfluous objects.
    If you did not just create planning versions, copy planning versions, or load data to liveCache, then create an SAP message to have your system checked with regard to data area usage.
    If you have long-running APO transactions, the performance of the SCM system has to be checked.
    If you have a bottleneck in liveCache and cannot solve the case by yourself, create an SAP message on component BC-DB-LVC and get SAP support.
    Best regards, Natalia Khlopina

  • What liveCache issues can turn up for demand planning?

    HI all,
    Can anyone tell me the issues that may arise in liveCache, and how to ensure the performance and maintenance of the liveCache that a planning area is utilizing?
    If the utilization is high, how do I bring it down?
    Can anyone please guide me on this?
    Thanks
    Pooja

    Hi Pooja,
    1) Accumulation of logs created during demand planning jobs will have an impact on performance
    and affect liveCache, so they should be cleared periodically.
    2) liveCache consistency checks should be performed at planned intervals,
    which will reduce liveCache issues.
    3) Through transaction OM13, you can analyse liveCache and the LCA objects
    and act accordingly.
    4) You can also carry out /SAPAPO/PSTRUCONS (Consistency Check for Planning Object
    Structure) and /SAPAPO/TSCONS (Consistency Check for Time Series Network) related to
    demand planning, to avoid liveCache issues later.
    Regards
    R. Senthil Mareeswaran.

  • Transaction code for live cache

    Hi experts,
    Please help for the following query.
    Log on to the liveCache with SQL Studio/dbmcli as the liveCache user, and execute the following command:
    SELECT COUNT(*) FROM "/SAPAPO/ORDKEY"
                    WHERE SIMID = '<noted VRSIOID>'
    I don't know the transaction code for liveCache; please suggest how I can log in and execute the query.

    Hi,
    The transaction code for liveCache is 'DB59'. Please let us know if you require further input.
    Regards,
    Birendra
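    To complement the answer above, here is a sketch of running that query directly from the OS level with dbmcli, as the original question asked. All names here (LC1 as the liveCache name, control as the DBM operator, SAPR3 as the SQL user) are placeholder assumptions, not values from this thread; substitute the ones valid for your system.

```
# Log on with the DBM operator (names are placeholders):
dbmcli -d LC1 -u control,<password>
# Inside the dbmcli session, open an SQL session and run the statement:
sql_connect SAPR3,<password>
sql_execute SELECT COUNT(*) FROM "/SAPAPO/ORDKEY" WHERE SIMID = '<noted VRSIOID>'
exit
```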

  • In which table is the Live cache data stored?

    Hi experts,
       I am new to APO. Can anyone let me know in which database table the liveCache data is stored?
    Thanks in advance
    regards
    Ashwin.

    Hello Ashwin,
    the idea of the liveCache is to have data permanently and quickly available without having to read/write from/to the database. Therefore, the liveCache is NOT a physical database; it is a program in C++ that simulates a database and holds the data in memory.
    However, it does need so-called liveCache anchors, which are saved on the database and are similar to pointers.
    You can extract data from liveCache using BAdIs or by creating a DataSource on a planning area (for DP and SNP); manipulation can likewise only be done via BAdIs or sophisticated programming (which basically uses RFCs).
    I hope this answers your question.
    Regards,
    Klaus

  • Live Cache configuration problem

    Dear All,
    I have just installed IDES SCM 5.0 and scheduled and released the default jobs from SM36. But in SM37 I can see that the jobs are getting cancelled with:
    1) "Connection to LiveCache Failed".
    2) "LiveCache Lock Server not Accessible".
    I have installed ABAP + Java and logged in to client 001.
    Please advise.
    Regards,
    Souren

    Hi Natalia,
    Thanks for the reply.
    Ours is a Win2k3 32-bit system where we have installed:
    (1) Oracle 10.2 and
    (2) the SCM 5.0 server (central system installation).
    During installation, SAPinst asked for liveCache-related details (e.g. the liveCache ID, for which we gave LC1, the control user's password, etc.) along with the installation DVD for it, for which we used DVD 51032259_11 (NW 2004s SR2 liveCache 7.6.0 Windows Server on IA32 32-bit).
    The installation completed successfully. Our SCM system ID is SR3.
    Does this mean that the liveCache server got installed?
    After the above installation, we referred to "Installation - SAP liveCache Technology on Win - NW 7.0 SR2" and tried to do the post-installation steps (assuming that liveCache got installed).
    We created the RFC user with all the authorizations described in the corresponding SAP Note.
    During the post-installation step
    dbmcli -d <SID> -u <controluser>,<password> db_offline
    we got the following error:
    ERROR! Connection failed to node <LOCAL> for database <SID>:
    Database instance not found.
    We tried with both SR3 and LC1 as the SID, but the result is the same.
    Further, we tried the next post-installation step using LC10.
    After integrating LCA, LDA and LEA, when we tried to monitor the liveCache (by clicking the "LiveCache: Monitoring" button), we got the following error:
    DBMRFC function: DBM_CONNECT
    Error: DBM error
    Return code: -4
    Error message: database instance not found
    How should we proceed now?
    Thanks & Regards,
    Souren

  • Live Cache 7.5 installation on SCM 4.1: ERROR MDB-070020

    Hi,
    we're trying to install liveCache 7.5.00.21 on SCM 4.1
    (the MaxDB version of the normal DB is 7.6.05.09).
    We had another version installed before, which we deleted because of other errors.
    Now we're getting the following error message - any idea?
    WARNING 2009-01-16 11:19:37
    Execution of the command "/special1/TAD_NEU/lcinst/UNIX/HPIA64/SDBINST '-indep_prog' '/sapdb/programs' '-indep_data' '/sapdb/data' '-depend' '/sapdb/LCA/db'
    '-profile' 'APO LiveCache' '-o' '1100' '-g' '1100' '-b'" finished with return code 1. Output:
    starting preparing phase of package Base 7.5.00.21 64 bit
    cannot downgrade package
    skipping package
    starting preparing phase of package Server Utilities 7.5.00.21 64 bit
    cannot downgrade package
    skipping package
    starting preparing phase of package Database Kernel 7.5.00.21 64 bit
    cannot do update check - test file "/sapdb/LCA/db/pgm/kernel" not found
    installation exited abnormally  at Fr, Jan 16, 2009 at 11:19:37
    ERROR 2009-01-16 11:19:37
    MDB-070020 The database installer reported an error. Some database applications are probably still running.
    Check the log file sdbinst.log.

    Michael Schneider wrote:
    > Hi,
    >
    > we're trying to install liveCache 7.5.00.21 on SCM 4.1
    > (the MaxDB version of the normal DB is 7.6.05.09).
    >
    > We had another version installed before, which we deleted because of other errors.
    >
    > Now we're getting the following error message - any idea?
    Sure - how did you uninstall the former installation?
    rm -f /sapdb/<DBSID> ??
    To correctly uninstall a MaxDB/liveCache installation you always have to use the SDBUNINST tool.
    Since there seems to be another installation on the same server, it's now a bit complicated to clean up the registration mess... You should open a support message for that. Then we'll see what we can do here.
    regards,
    Lars
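    As a sketch of what Lars suggests: first inspect what is registered, then remove the old software with the uninstall tool instead of deleting directories by hand. The exact SDBUNINST options depend on your MaxDB version, so the call below is an assumption; `sdbregview -l` is the same listing command quoted earlier in this thread.

```
# List the registered MaxDB/liveCache software installations:
sdbregview -l

# Remove the obsolete installation with the tool shipped on the
# installation medium (run it without arguments for interactive mode;
# check the options of your SDBUNINST version before using it):
./SDBUNINST
```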

  • MAXLOCKS alert generation - SAP Live Cache

    Hello Guru's,
    We have a requirement to set up alerting for the parameter MAXLOCKS on the liveCache server when it reaches a threshold value, but we couldn't find any monitoring set for it in RZ20.
    So we had the idea to use COUNT(*) on LOCKSTATISTICS and compare it against a threshold, but we are not sure how to do that through a shell script for MaxDB. Please suggest any other options as well.
    Thanks and Regards,
    Sri's

    Hello Sri's,
    1.
    Please review the Notes:
    65946       Lock escalations
    1243937     FAQ: MaxDB SQL locks
    As you know, the system table LOCKLISTSTATISTICS displays the statistics of the
    lock management since the last database start. This system table is insufficient for the analysis of locking conflicts.
    2.
    Review the sapact script attached to SAP Note 974758 as a hint for collecting the output
    of the statements
    SELECT * FROM SYSDBA.LOCKLISTSTATISTICS     and
    SELECT * FROM SYSDBA.LOCKSTATISTICS
    periodically while the job/application in question is running.
    Also use the DB Analyzer to find the bottleneck in liveCache.
    3.
    Create an SAP message and get SAP support.
    It would be helpful to know the version of your system, the SAP Basis SP, the liveCache version, and the motivation for your project.
    Did you get DB Analyzer warnings about lock escalation?
    Regards, Natalia Khlopina
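    A minimal shell sketch of the threshold idea from the question. All names here (LC1, the control user, the password, the threshold value) are assumptions, not values from this thread; the dbmcli call is shown commented out and replaced by a stub so the comparison logic can run and be read on its own.

```shell
#!/bin/sh
# Sketch: alert when the lock-entry count exceeds a threshold.
THRESHOLD=100000

# In a real run, the count would come from dbmcli, e.g. (placeholders):
# COUNT=$(dbmcli -d LC1 -u control,secret sql_execute \
#   "SELECT COUNT(*) FROM SYSDBA.LOCKSTATISTICS" | tail -n 1 | tr -cd '0-9')
# Stubbed here so the sketch runs without a MaxDB installation:
COUNT=42

if [ "$COUNT" -gt "$THRESHOLD" ]; then
    echo "ALERT: $COUNT lock entries exceed threshold $THRESHOLD"
else
    echo "OK: $COUNT lock entries (threshold $THRESHOLD)"
fi
```

    Such a script could then be scheduled via cron and its ALERT output forwarded to whatever alerting channel you use.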

  • How to create a cache for JPA Entities using an EJB

    Hello everybody! I have recently got started with JPA 2.0 (I use EclipseLink) and EJB 3.1, and am having trouble figuring out how best to implement a cache for my JPA entities using an EJB.
    In the following I try to describe my problem. I know it is a bit verbose, but I hope somebody will help me.
    I have the following JPA entities:
    @Entity
    class Genre {
        private String name;
        @OneToMany(mappedBy = "genre", cascade = {CascadeType.MERGE, CascadeType.PERSIST})
        private Collection<Novel> novels;
    }

    @Entity
    class Novel {
        @ManyToOne(cascade = {CascadeType.MERGE, CascadeType.PERSIST})
        private Genre genre;
        private String titleUnique;
        @OneToMany(mappedBy = "novel", cascade = {CascadeType.MERGE, CascadeType.PERSIST})
        private Collection<NovelEdition> editions;
    }

    @Entity
    class NovelEdition {
        private String publisherNameUnique;
        private String year;
        @ManyToOne(optional = false, cascade = {CascadeType.PERSIST, CascadeType.MERGE})
        private Novel novel;
        @ManyToOne(optional = false, cascade = {CascadeType.MERGE, CascadeType.PERSIST})
        private Catalog appearsInCatalog;
    }

    @Entity
    class Catalog {
        private String name;
        @OneToMany(mappedBy = "appearsInCatalog", cascade = {CascadeType.MERGE, CascadeType.PERSIST})
        private Collection<NovelEdition> novelsInCatalog;
    }
    The idea is to have several Novels, each belonging to a specific Genre, for which more than one edition can exist (different publisher, year, etc.). For simplicity, a NovelEdition can belong to just one Catalog, such a Catalog being represented by a text file like:
    FILE 1:
    Catalog: Name Of Catalog 1
    "Title of Novel 1", "Genre1 name","Publisher1 Name", 2009
    "Title of Novel 2", "Genre1 name","Pulisher2 Name", 2010
    FILE 2:
    Catalog: Name Of Catalog 2
    "Title of Novel 1", "Genre1 name","Publisher2 Name", 2011
    "Title of Novel 2", "Genre1 name","Pulisher1 Name", 2011
    Each entity has an associated Stateless EJB that acts as a DAO, using a transaction-scoped EntityManager. For example:
    @Stateless
    public class NovelDAO extends AbstractDAO<Novel> {
        @PersistenceContext(unitName = "XXX")
        private EntityManager em;

        protected EntityManager getEntityManager() {
            return em;
        }

        public NovelDAO() {
            super(Novel.class);
        }

        // NovelDAO-specific methods
    }
    I am interested in the moment when the catalog files are parsed and the corresponding entities are built (I usually read a whole batch of catalogs at a time).
    Since parsing is a String-driven procedure, I don't want to repeat actions like novelDAO.getByName("Title of Novel 1"), so I would like to use a centralized cache for mappings of type string identifier -> entity object.
    Currently I use 3 objects:
    1) The file parser, which does something like:
    final CatalogBuilder catalogBuilder = // JNDI lookup
    // for each file:
    String catalogName = parseCatalogName(file);
    catalogBuilder.setCatalogName(catalogName);
    // for each novel edition:
    String title = parseNovelTitle();
    String genre = parseGenre();
    catalogBuilder.addNovelEdition(title, genre, publisher, year);
    // end for each
    catalogBuilder.build();
    2) The CatalogBuilder, a Stateful EJB which uses the Cache, gets re-initialized every time a new catalog file is parsed, and gets removed after a catalog is persisted:
    @Stateful
    public class CatalogBuilder {
        @PersistenceContext(unitName = "XXX", type = PersistenceContextType.EXTENDED)
        private EntityManager em;
        @EJB
        private Cache cache;
        private Catalog catalog;

        @PostConstruct
        public void initialize() {
            catalog = new Catalog();
            catalog.setNovelsInCatalog(new ArrayList<NovelEdition>());
        }

        public void addNovelEdition(String title, String genreStr, String publisher, String year) {
            Genre genre = cache.findGenreCreateIfAbsent(genreStr); // ##
            Novel novel = cache.findNovelCreateIfAbsent(title, genre); // ##
            NovelEdition novEd = new NovelEdition();
            novEd.setNovel(novel);
            // novEd.set... publisher, year, catalog
            catalog.getNovelsInCatalog().add(novEd);
        }

        public void setCatalogName(String name) {
            catalog.setName(name);
        }

        @Remove
        public void build() {
            em.merge(catalog);
        }
    }
    3) Finally, the problematic bean: Cache. For CatalogBuilder I used an EXTENDED persistence context (which I need, as the parser executes several successive transactions) together with a Stateful EJB; but in this case I am not really sure what I need. In fact, the cache:
    - Should stay in memory until the parser has finished its job, but not longer (it should not be a singleton), as the parsing is a very particular activity which happens rarely.
    - Should keep all of the entities in context, and should return managed entities from the methods marked with ##; otherwise the attempt to persist the catalog will fail (duplicated INSERTs).
    - Should use the same persistence context as the CatalogBuilder.
    What I have now is:
    @Stateful
    public class Cache {
        @PersistenceContext(unitName = "XXX", type = PersistenceContextType.EXTENDED)
        private EntityManager em;
        @EJB
        private sessionbean.GenreDAO genreDAO;
        // DAOs for other cached entities
        Map<String, Genre> genreName2Object = new TreeMap<String, Genre>();

        @PostConstruct
        public void initialize() {
            for (Genre g : genreDAO.findAll()) {
                genreName2Object.put(g.getName(), em.merge(g));
            }
        }

        public Genre findGenreCreateIfAbsent(String genreName) {
            if (genreName2Object.containsKey(genreName)) {
                return genreName2Object.get(genreName);
            }
            Genre g = new Genre();
            g.setName(genreName);
            g.setNovels(new ArrayList<Novel>());
            genreDAO.persist(g);
            genreName2Object.put(genreName, em.merge(g));
            return g;
        }
    }
    But honestly I couldn't find a solution which satisfies these 3 points at the same time. For example, using another stateful bean with an extended persistence context (PC) would work for the 1st parsed file, but I have no idea what should happen from the 2nd file on. Indeed, for the 1st file the PC will be created and propagated from CatalogBuilder to Cache, which will then use the same PC. But after build() returns, the PC of CatalogBuilder should (I guess) be removed and re-created during the subsequent parsing, although the PC of Cache should stay "alive": shouldn't an exception be thrown in this case? Another problem is what to do when the Cache bean is passivated. Currently I get the exception:
    "passivateEJB(), Exception caught ->
    java.io.IOException: java.io.IOException
    at com.sun.ejb.base.io.IOUtils.serializeObject(IOUtils.java:101)
    at com.sun.ejb.containers.util.cache.LruSessionCache.saveStateToStore(LruSessionCache.java:501)"
    Hence, I have no idea how to implement my cache. Can you please tell me how you would solve the problem?
    Many thanks!
    Bye

    Hi Chris,
    thanks for your reply!
    I've tried to add the following into persistence.xml (although I've read that EclipseLink uses an L2 cache by default):
    <shared-cache-mode>ALL</shared-cache-mode>
    Then I replaced the Cache bean with a stateless bean which has methods like:
    Genre findGenreCreateIfAbsent(String genreName) {
        Genre genre = genreDAO.findByName(genreName);
        if (genre != null) {
            return genre;
        }
        genre = new Genre(); // build the new Genre object (set name, etc.)
        genreDAO.persist(genre);
        return genre;
    }
    As far as I understood, the shared cache should automatically store the genre and avoid querying the DB multiple times for the same genre, but unfortunately this is not the case: with a FINE logging level, I see really a lot of SELECT queries, which I didn't see with my "home-made" Cache...
    I am really confused.. :(
    Thanks again for helping + bye

  • Error during Live Cache Server Installation on SCM 4.1 system

    Hi All,
    I have an SCM 4.1 ABAP system running on MSSQL 2005 and Windows 2003 Server. I would like to install the liveCache server on the same server. The liveCache client was installed as part of the SCM 4.1 installation.
    I have installed the MaxDB software, and now, when I'm trying to install the liveCache server instance, I get the error below.
    I'm performing the installation with the user root, and it is an Administrator.
    WARNING 2011-12-09 11:01:25
    Execution of the command "change 'user' '/install'" finished with return code 1. Output: Install mode does not apply to a Terminal server configured for remote administration.
    Installation start: Friday, 09 December 2011, 11:01:23; installation directory: G:\SCM_4.1_Media\Media_Live_Cache\New_Media\51031447_2\CD_SAP_SCM_4.1_liveCache_64bit\SAPINST\NT\AMD64; product to be installed: SAP SCM 4.1> Additional Services> Install a liveCache Server instance
    Transaction begin ********************************************************
    WARNING 2011-12-09 11:02:33
    Error 3 (The system cannot find the path specified.
    ) in execution of a 'CreateProcess' function, line (265), with parameter (G:\SCM_4.1_Media\Media_Live_Cache\New_Media\51031447_2\CD_SAP_SCM_4.1_liveCache_64bit\NT\AMD64\SDBUPD.EXE -l).
    Transaction end **********************************************************
    WARNING 2011-12-09 11:02:34
    The step Fill_sapdb_db_instance_context with step key LIVECACHESERVER|ind|ind|ind|ind|ind|0|LC_SERVER_INSTALL|ind|ind|ind|ind|ind|0|Fill_sapdb_db_instance_context was executed with status ERROR.
    Has anyone seen this error before? Any pointers would be helpful.
    Regards,
    Ershad Ahmed.

    Subprocess starts at 20111209154957
    Execute Command : C:\Program Files\sdb\programs\pgm\dbmcli.exe -n XXXXXXXXX db_enum
    Execute Session Command : exit
    > Subprocess stops at 20111209154957
    OK
    > Subprocess starts at 20111209155027
    Execute Command : C:\Program Files\sdb\programs\pgm\dbmcli.exe -n XXXXXXXXX db_enum
    Execute Session Command : exit
    > Subprocess stops at 20111209155027
    OK
    > Subprocess starts at 20111209155221
    Execute Command : C:\Program Files\sdb\programs\pgm\dbmcli.exe -n XXXXXXXXX db_enum
    Execute Session Command : exit
    > Subprocess stops at 20111209155221
    OK
    > Subprocess starts at 20111209155323
    Execute Command : C:\Program Files\sdb\programs\pgm\dbmcli.exe -n XXXXXXXXX inst_enum
    Execute Session Command : exit
    > Subprocess stops at 20111209155324
    OK
    7.5.00.31    f:\sapdb\liv\db
    7.6.06.10    f\sapdb\sdb\7606
    7.6.06.10    C:\Program Files\sdb\7606
    > Subprocess starts at 20111209155324
    Execute Command : C:\Program Files\sdb\programs\pgm\dbmcli.exe -n XXXXXXXXX inst_enum
    Execute Session Command : exit
    > Subprocess stops at 20111209155324
    OK
    7.5.00.31    f:\sapdb\liv\db
    7.6.06.10    f\sapdb\sdb\7606
    7.6.06.10    C:\Program Files\sdb\7606
    > Subprocess starts at 20111209161349
    Execute Command : C:\Program Files\sdb\programs\pgm\dbmcli.exe -n XXXXXXXXX inst_enum
    Execute Session Command : exit
    > Subprocess stops at 20111209161349
    OK
    7.5.00.31    f:\sapdb\liv\db
    7.6.06.10    f\sapdb\sdb\7606
    7.6.06.10    C:\Program Files\sdb\7606
    Regards,
    Ershad Ahmed.

  • Unable to delete order that does not exist in liveCache but exists in table POSMAPN

    Hi Experts,
    We are facing an issue where a purchase order is not available in liveCache (which means it has no GUID) but exists in database table POSMAPN. We have tried to delete it using the standard SAP inconsistent-order deletion program and also using BAPI BAPI_POSRVAPS_DELMULTI, but we are not able to delete it.
    Can anybody suggest a method by which we can get rid of this order from the system?
    Thanks a lot.
    Best Regards,
    Chandan

    Hi Chandan,
    Apologies for taking your question from the wrong perspective. If you want to delete the order, you need to re-CIF it from ECC so that it comes and sits in liveCache. Once done, try using the BAPI.
    If you are not successful with the above approach, try running the consistency report /SAPAPO/SDRQCR21 in the APO system,
    so that it first corrects the inconsistency between ECC and APO (liveCache + DB tables), and then use the BAPI to delete the PO.
    Not sure if you have tried this. If it does not solve your problem, you will need to check the SAP Notes.
    Thanks,
    Babu Kilari
