Data Migration Testing

1) How do you write queries to fetch data from a table in one database and compare it with the corresponding table in another database? Please explain with an example.
2) How can the individual queries written for testing data transformations be automated?

For this, first create a database link (db_link) to the other database; you can then compare the tables with set queries, or synchronize them with a MERGE command, as sketched below.
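A minimal sketch of the comparison in Oracle SQL, assuming a remote database reachable under the TNS alias REMOTE_DB and an illustrative CUSTOMERS table keyed by CUSTOMER_ID (all object names here are hypothetical):

    -- 1) Create a database link to the other database.
    CREATE DATABASE LINK target_link
      CONNECT TO migration_user IDENTIFIED BY migration_pwd
      USING 'REMOTE_DB';

    -- 2) Rows in the source that are missing or different in the target:
    SELECT * FROM customers
    MINUS
    SELECT * FROM customers@target_link;

    -- 3) The reverse direction (rows in the target but not in the source):
    SELECT * FROM customers@target_link
    MINUS
    SELECT * FROM customers;

    -- 4) To synchronize rather than just report, MERGE can push source
    --    rows into the target over the link:
    MERGE INTO customers@target_link t
    USING customers s
    ON (t.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET t.customer_name = s.customer_name
    WHEN NOT MATCHED THEN
      INSERT (customer_id, customer_name)
      VALUES (s.customer_id, s.customer_name);

Both MINUS queries returning zero rows means the two tables match exactly.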
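For the second question, one hedged way to automate many such checks is a PL/SQL block that loops over the tables under test and counts the symmetric difference for each with dynamic SQL (table names again illustrative; run with SERVEROUTPUT on):

    -- Prints, per table, how many rows differ between local and remote.
    DECLARE
      v_diff NUMBER;
    BEGIN
      FOR t IN (SELECT table_name FROM user_tables
                WHERE table_name IN ('CUSTOMERS', 'VENDORS', 'MATERIALS')) LOOP
        EXECUTE IMMEDIATE
          'SELECT COUNT(*) FROM ('
          || '(SELECT * FROM ' || t.table_name
          || ' MINUS SELECT * FROM ' || t.table_name || '@target_link)'
          || ' UNION ALL '
          || '(SELECT * FROM ' || t.table_name || '@target_link'
          || ' MINUS SELECT * FROM ' || t.table_name || '))'
          INTO v_diff;
        DBMS_OUTPUT.PUT_LINE(t.table_name || ': ' || v_diff || ' differing rows');
      END LOOP;
    END;
    /

Scheduled via DBMS_SCHEDULER (or a cron job calling SQL*Plus), this turns the individual transformation checks into a repeatable regression test.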

Similar Messages

  • Data migration test: Deletion of POs

    Hello gurus,
    We would like to do a migration test for open purchase orders.
    Is there a better way to delete the imported POs afterwards than SE16N (e.g. a report)? We would like to run several tests and delete the imported data after each one.
    Thanks
    Alicia

    http://sap-img.com/materials/how-to-use-me98-to-delete-po-from-system-completely.htm

  • Details of Test Data Migration Server(TDMS)

    Hi Friends,
    I am very interested in the new Test Data Migration Server (TDMS) tool. Is anyone aware of this tool? I have searched in detail on Google and the SAP sites, but I could not find much information. Please help me find out about this tool.
    Thanks
    Rajeev

    Hi Rajeev,
    There are documents available on these pages:
    Service Marketplace
    http://service.sap.com/customdev-tdms
    SAP Homepage
    http://www.sap.com/services/customdev/tdms
    If you have specific questions, just let me know.
    Regards,
    Manfred

  • Required Unit test Plan for Data migration

    Dear All,
    I am looking for unit test plan Draft documents/templates which covers
    1. Testing tools
    2. Methods
    3. Error handling
    4. Reviews and approvals
    The project is an Oracle 10g data migration from one schema to another.
    It would be a great help if anyone could forward the same to me.
    Thank you.

    Hi Vaishali,
    You may wish to refer to the links below...
    https://service.sap.com/instguides --> SAP NetWeaver --> Release 2004s --> Upgrade
    https://www.sdn.sap.com/irj/sdn/developerareas/bi
    Latest on upgrade to BI 7.0
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/2e8e5288-0b01-0010-2ea8-bcd4df5084a7
    Upgrading BW 3.X to SAP NetWeaver 2004s BI (PDF 2.6 MB)
    Front End Migration strategy
    Here's the migration strategy: Rolling out the New SAP NetWeaver 2004s BI Frontend Tools
    Presentation
    http://csc-studentweb.lrc.edu/swp/Berg/Articles/PM_2006_upgrade_NW2004s_Bjarne_Berg_v12.ppt
    Here are the frontend requirements:
    Troubleshoot the SAP NetWeaver 2004s BI Frontend Installation
    Here are the backend requirements in the product availability matrix:
    https://websmp110.sap-ag.de/~form/handler?_APP=00200682500000001303&_EVENT=DISP_NEW&00200682500000002804=01200615320900001250
    Migration of Web Templates
    832713 - Migration of Web templates from BW 3.x to Netweaver
    Assign points if this helps.
    Regards,
    Anil
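
    Returning to the original question, an Oracle 10g schema-to-schema migration, here is a minimal sketch of one reusable unit-test check, assuming hypothetical schema names SRC and TGT:

        -- Compare per-table row counts between source and target schemas.
        -- NUM_ROWS comes from optimizer statistics, so gather statistics
        -- first (DBMS_STATS.GATHER_SCHEMA_STATS), or replace this with
        -- dynamic COUNT(*) queries for exact figures.
        SELECT s.table_name,
               s.num_rows AS src_rows,
               t.num_rows AS tgt_rows
        FROM   all_tables s
        LEFT JOIN all_tables t
               ON  t.owner = 'TGT'
               AND t.table_name = s.table_name
        WHERE  s.owner = 'SRC'
        AND    (t.num_rows IS NULL OR t.num_rows <> s.num_rows);

    A unit test plan would wrap checks like this, plus column-level MINUS comparisons, into scripted steps with expected results, error handling, and review/sign-off columns.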

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open purchase orders? What is the normal strategy for the data migration and cut-over stages?
    My client wants to know how many open purchase orders exist in the legacy system, and then decide between manual and automatic data migration. If manual, how is it done? If automatic, how is it done? All the material, vendor, and plant numbers are different. How do we track them, and how do we match the new numbers to the old ones?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy System Migration Workbench) to load MM master data (material masters), inventory (WIP, RM, FG, etc.), purchasing master data (vendors, purchase info records, source lists, quota arrangements), and purchasing transactional documents (POs, purchase requisitions, scheduling agreements, etc.). Depending on the complexity and volume of data, it may be necessary to write custom programs to load the data. You will find this out during your requirements gathering.
    It is uncommon but possible to load all of these data manually. I have never run across a client willing to pay a consultant's hourly rate to sit at a terminal pecking away at data entry, so if the client intends to have his own users enter the data manually, the project manager should ensure that qualified, TRAINED client employees are available for this data entry. I did once help manually with a portion of a conversion, of Sales Credits, but there were only about 30 SD documents to load. I did this the evening before go-live day, while waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For purchase orders on the SAP side, you could use any of the standard SAP purchasing reports, such as ME2W, ME2M, ME2C, ME2L, or ME2N. If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report (see the sketch after this message).
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data. For an SAP consultant, this is not very realistic (unless the legacy system is another SAP system); most of us do not understand the workings of the myriad legacy systems. The client is usually expected to produce one or more legacy system technical experts for you to liaise with, and you normally negotiate with them about every facet of the data migration. In addition, you will liaise with business users, who will help you and the implementation team to logically validate that the final solution (a turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you asked how to track the mapping of master data between legacy and SAP. There are many ways to do this. I normally try to get the legacy person to do the conversion on his end, i.e. by the time he gives you the load file, the master data has already been translated and the SAP-relevant values inserted into the file. If this is not possible, I usually use MS Access databases to maintain a master map and perform the mapping on a PC. If your data package is small, you can probably get by with MS Excel or similar.
    Good Luck,
    DB49
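
    As a hedged illustration of the comparison query mentioned above, assuming both the legacy extract and an extract of the loaded SAP data have been staged into relational tables (all names hypothetical):

        -- LEGACY_PO: legacy extract; SAP_PO: extract of the loaded SAP data
        -- (e.g. from ME2N); PO_MAP: legacy-to-SAP purchase order number map.
        SELECT m.legacy_po_no,
               m.sap_po_no,
               l.open_qty AS legacy_open_qty,
               s.open_qty AS sap_open_qty
        FROM   po_map m
        JOIN   legacy_po l ON l.po_no = m.legacy_po_no
        LEFT JOIN sap_po s ON s.po_no = m.sap_po_no
        WHERE  s.po_no IS NULL               -- PO missing after the load
           OR  l.open_qty <> s.open_qty;     -- open quantity mismatch

    An empty result set is the pass criterion for the load.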

  • Data Migration of FI-CA Cleared Items

    Hi,
    Currently I am working on a data migration project to migrate data from our FI-CA module on ERP 4.7 to ECC 6.0.
    There has been a request from the business to migrate historical data (e.g. cleared items).
    Is there an SAP-recommended approach or tool to load this data into the target environment?
    Currently all documentation around SAP data migration talks about strategies for open-item data migration, but I have seen nothing about migrating historical financial data.
    Is this because it is not recommended or technically impossible?
    Regards
    Adam Gunn

    That BAPI is typically used for straightforward vanilla GL, AR, and AP postings; however, you still have to create the other side of the entry and then clear it.
    I need to be able to migrate full history, which means from an FI-CA  viewpoint:
    1. Migrate FI-CA posting of liability against BP/contract account.
    2. Migrate associated payments.
    3. And then clearing documents.
    Basically the requirement is to represent the historical data in the new system as if it was posted and matched off.
    Is there a technical way to do this?
    OR,
    Do you migrate the FI-CA liabilities, then the associated payments, and then run clearing on the target system?
    I suspect this is an almost impossible data migration requirement, as development of the extraction and load process would be extremely complex and testing would take months to cover all posting scenarios in FI-CA. However, I would be interested to hear whether anyone has attempted this before.
    Adam

  • Data Migration From Peoplesoft , JDEdwards To SAP.

    Hi,
    This is Kiran. We are doing data migration work from PeopleSoft and JDEdwards to SAP. On the SAP side this involves master data tables related to Customer, Vendor, and Material, and metadata tables related to SD, MM, and FI. As SAP consultants we identified fields from the above tables and marked them as required, not required, or mandatory. The PeopleSoft and JDEdwards folks came up with the same from their side. Now we want to map the fields. As I am new to data migration, can anybody suggest the steps involved in data migration and how to do data mapping in a migration? Thanks in advance.
    Thanks
    Kiran.B

    Hi Kiran,
    Good... Check out the following documentation and links
    Migrating from one ERP solution to another is a very complex undertaking. I don't think I would start with comparing data structures. It would be better to understand the business flows you have currently with any unique customizations and determine how these could be implemented in your target ERP. Once this is in place, you can determine the necessary data unload/reload to seed your target system.
    A real configuration of an ERP system will only happen when there is real data in the system. The mapping of legacy system data to a new ERP is a long, difficult process, and choices must be made as to what data gets moved and what gets left behind. The only way to verify what you need to actually run the new ERP environment is to migrate the data over to the ERP development and test environments and test it. The only way to get a smooth transition to a new ERP is to develop processes that are as automated as possible to migrate the data from the old system to the new.
    Data loading is not a project that can be done after everything else is ready. Just defining the data in the legacy system is a huge horrible task. Actually mapping it to one of the ERP system schemas is a lesson in pain that must be experienced to be believed.
    The scope of a data migration project is usually a fairly large development effort, with a lot of proprietary code written to extract legacy data, then transform and load it into the ERP system. This process is usually called ETL (extract, transform, load); see the sketch after this message.
    How is data put into the ERP?
    There is usually a painfully slow data import facility with most ERP systems. Mashing data directly into the (usually undocumented) table schema is also an option, but must be carefully researched. Getting the data out of the legacy systems is usually left to the company buying the ERP. These export/import processes can be complex and slow; sometimes specialized ETL tools can help, and sometimes it is easier to use whatever your programmers are familiar with, tools such as C, shell, or Perl.
    An interesting thing to note is that many bugs and quirks of the old systems will be found when the data is mapped and examined. I am always amazed at what data I find in a legacy system; usually the data has no relational integrity. Note that it does not gain much more integrity once it is placed in an ERP system, so accurate and clean data going in helps to create a system that can work.
    The Business Analysts (BAs) that are good understand the importance of data migration and have an organized plan to migrate the data, allocate resources, give detailed data maps to the migrators (or help create the maps) and give space estimates to the DBAs. Bad BAs can totally fubar the ERP implementation. If the BAs and management cannot fathom that old data must be mapped to the new system, RUN AWAY. The project will fail.
    Check these links
    http://pdf.me.uk/informatica/AAHN/INFDI11.pdf
    http://researchcenter.line56.com/search/keyword/line56/Edwards%20Sap%20Migration%20Solutions/Edwards%20Sap%20Migration%20Solutions
    http://resources.crmbuyer.com/search/keyword/crmbuyer/Scm%20Data%20Migration%20On%20Peoplesoft%20Peoplesoft%20Data%20Migration/Scm%20Data%20Migration%20On%20Peoplesoft%20Peoplesoft%20Data%20Migration
    Good Luck and Thanks
    AK
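
    As a hedged sketch of the mapping step described above, staged in SQL with all table and column names hypothetical: legacy keys are translated through an explicit mapping table before the load file is produced, and unmapped rows are flagged for resolution.

        -- LEGACY_VENDOR: raw extract; VENDOR_MAP: legacy-to-SAP number map;
        -- STG_SAP_VENDOR: transformed load file for LSMW or a BAPI load.
        INSERT INTO stg_sap_vendor (lifnr, name1, city)
        SELECT m.sap_vendor_no,              -- mapped SAP vendor number
               UPPER(TRIM(l.vendor_name)),   -- example cleansing rule
               l.city
        FROM   legacy_vendor l
        JOIN   vendor_map m ON m.legacy_vendor_no = l.vendor_no;

        -- Legacy rows with no mapping entry must be resolved before go-live:
        SELECT l.vendor_no
        FROM   legacy_vendor l
        LEFT JOIN vendor_map m ON m.legacy_vendor_no = l.vendor_no
        WHERE  m.legacy_vendor_no IS NULL;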

  • SAP Data Migration from R/3 4.6C to ECC 6.0

    Hey All
    We are planning to upgrade SAP R/3 4.6C to ECC 6 in the near future. I am aware that data migration is not an issue, as the upgrade takes care of it all. What we are concerned about is this: if, a few days after the upgrade, when the system is functional and users have performed some quantum of transactions on it, the system fails and we have to revert to R/3 4.6C, how can we get all the information that we processed on ECC back into R/3?
    I would like to know if anyone who has performed an upgrade, had some kind of fail over strategy for such a situation. Is there some tool we can use for this purpose?
    Regards
    Bilal Nazir

    > What we are concerned about is this: if, a few days after the upgrade, when the system is functional and users have performed some quantum of transactions on it, the system fails and we have to revert to R/3 4.6C, how can we get all the information that we processed on ECC back into R/3?
    Simple answer: not possible.
    > I would like to know if anyone who has performed an upgrade, had some kind of fail over strategy for such a situation. Is there some tool we can use for this purpose?
    No - there is no tool.
    Table structures change and update programs change; it is not possible to "roll back" to a given date and re-do everything on 4.6C.
    To minimize the risk, I would copy the production system to a sandbox, do the upgrade there, and let the (key) users test their functionality intensively.
    Markus

  • New GL activated without data migration

    Hi all,
    we have three projects lined up, one of which is New GL. For various reasons we have postponed the New GL go-live, but the other two projects want to go ahead with theirs, so we have provided them an environment where New GL is active but has no migrated data.
    What would be the risks when regression testing is done in such an environment? I have listed a few below; can anything else be expected?
    Cannot perform closing operations, as balances are not known and no previous data is available.
    Clearing and reconciliation functions cannot be performed.
    Cannot use the reporting function, as there would be discrepancies in the output.
    Cannot check the balances.
    Transactions referring to previous data, such as bills of exchange, down payments, etc., cannot be executed.
    If someone has gone through such a situation before, please share your experience.
    thanks
    kumar

    Hi Kumar,
    Find below the answer to your question
    Cannot perform closing operations, as balances are not known and no previous data is available.
    - You can perform the closing operations without any problem; it's just that you will not have the previous data added to your existing data.
    Clearing and reconciliation functions cannot be performed.
    - You can perform the reconciliation function to the extent that you have posted data, but not for the previous data.
    Cannot use the reporting function as there would be discrepancies in the output.
    - You can take out reports, but they will not give the correct picture, as the previous data is missing; please keep in mind that balance sheet G/L accounts have cumulative balances.
    Cannot check the balances.
    - You can, to the extent that you have posted data.
    Transactions referring to previous data, such as bills of exchange, down payments, etc., cannot be executed.
    - Yes; to execute these you need to migrate the previous data.
    Suggestion: activating New GL without data migration is an incomplete process; you should do the data migration at the earliest opportunity.
    Hope the above statements answer your questions.
    Regards
    Paul

  • What component is included with Microsoft SQL Server that can be used to perform a broad range of data migration tasks?

    a. Full Text Search service
    b. SQL Notification Server
    c. SQL Reporting Server
    d. SQL Server Integration Services

    d. SQL Server Integration Services (SSIS).
    Are you having a test and trying to cheat?

  • Delta data migration on SAP upgread from 4.6C to ECC 6.0

    Hi,
    We are planning an SAP upgrade in our production environment. We face the constraints listed below and need your advice on the options for performing the ECC upgrade.
    1) The present hardware is already a bottleneck, so no additional load can be added to it.
    2) We have a completely new hardware environment, because of which we have not been able to perform a test upgrade yet.
    Also, is there any way to perform the PREPARE activities on the new hardware (a copy of the production system) first, and then do a delta data migration of the transactional data from the present production system to the new system, without affecting the PREPARE data?
    Thanks & Regards,
    Pratap

    Hi Pratap,
    You can first make a copy of your production system on the new hardware via a homogeneous system copy, then do the test upgrade there to verify that everything is fine. Then, as suggested earlier, migrate your production system to the new hardware and do the upgrade.
    Thanks
    Sunny

  • DBCS data migration

    I need to migrate client data from an on-premise Oracle database (11g) to a cloud database. I went through this website, which talks about all possible ways to load DBCS database tables: http://www.ateam-oracle.com/loading-data-in-oracle-database-cloud-service/ . I am mostly interested in using outbound SOAP and REST invocation APIs, but I am not very sure how to approach this process. Can someone guide me on the high-level steps needed to transfer data from table A in an on-premise Oracle database to a cloud database table using SOAP and REST? I am using the trial version of Oracle cloud database.
    From my initial investigation I think there has to be PL/SQL code written in the on-premise database which can read data from the database tables and transfer it to the cloud table. On the cloud side, we need a REST service whose PL/SQL code POSTs this data to the cloud tables. If this is true, some pseudo-code for the PL/SQL on the on-premise side would be helpful too.
    Also, I am aware of and have tested data migration using SQL Developer, but I don't think this is appropriate for a heavy data load; again, you can correct me if I am wrong here, and if you are using anything better to migrate enterprise-wide data, please do share your experience.
    Regards
    Bond

    Have you looked at this?
    http://docs.oracle.com/cloud/latest/dbcs_schema/CSDBU/GUID-3B14CF7A-637B-4019-AAA7-A6DC5FF3D2AE.htm#CSDBU179
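
    For the on-premise side the poster asks about, here is a minimal sketch using Oracle's UTL_HTTP package to POST rows as JSON to a REST endpoint; the endpoint URL, table, and payload format are all assumptions, and the database's network ACLs must permit outbound HTTP.

        -- Posts each row of a source table as JSON to a hypothetical
        -- cloud REST endpoint (requires DBMS_NETWORK_ACL_ADMIN setup).
        DECLARE
          v_req  UTL_HTTP.req;
          v_resp UTL_HTTP.resp;
          v_body VARCHAR2(4000);
        BEGIN
          FOR r IN (SELECT customer_id, customer_name FROM customers) LOOP
            v_body := '{"customer_id":' || r.customer_id
                   || ',"customer_name":"' || r.customer_name || '"}';

            v_req := UTL_HTTP.begin_request(
                       url    => 'https://example-cloud-host/rest/customers',
                       method => 'POST');
            UTL_HTTP.set_header(v_req, 'Content-Type', 'application/json');
            UTL_HTTP.set_header(v_req, 'Content-Length', LENGTH(v_body));
            UTL_HTTP.write_text(v_req, v_body);

            v_resp := UTL_HTTP.get_response(v_req);
            DBMS_OUTPUT.PUT_LINE('HTTP ' || v_resp.status_code);
            UTL_HTTP.end_response(v_resp);
          END LOOP;
        END;
        /

    Note that row-by-row HTTP calls are slow for large volumes; batching many rows per request, or bulk options such as Data Pump, scale far better, which matches the poster's doubt about per-row approaches.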

  • Data migration in LSMW with CRM_XIF_PARTNER_SAVE

    Hi,
    For a data migration of business partners from a legacy system into a CRM 5.0 system, I want to use the CRM_XIF_PARTNER_SAVE or CRM_XIF_PARTNER_SAVE_M IDoc. Everything works fine when I migrate a business partner with only one address and only one bank account, but what settings do I have to make to be able to migrate a business partner with multiple addresses and bank accounts?
    I hope you can help me. Thanks in advance,
    Timo

    Hi Timo,
    you need to repeat the corresponding IDoc segments, e.g. those for "Bank Details",
    for IDoc CRMXIF_PARTNER_SAVE01 (display via WE30 or navigate via BDFG).
    Here is the example for phone data (I didn't expand all nodes):
    CRMXIF_PARTNER_SAVE01 - IDoc structure for data type CRMXIF_PARTNER_COMPLEX
      E1010821140028  Complex structure for business partners                CRMXIF_PARTNER_COMPLEX
        E1010821140029  Header structure for business partners               CRMXIF_PARTNER_HEADER
          E1010328105722  External Interface: Central Data                   BUS_EI_CENTRAL_DATA
            E1010328105723  External Interface: Central Business Partner Data  BUS_EI_BUPA_CENTRAL
            E1010328105735  External Interface: Roles                        BUS_EI_ROLES
            E1010328105737  External Interface: Bank Details                 BUS_EI_BANKDETAIL
            E1010328105741  External Interface: Credit Card Details          BUS_EI_CREDITCARD
            E1010328105745  External Interface: Industries                   BUS_EI_INDUSTRYSECTOR
            E1010328105750  External Interface: Identification Numbers       BUS_EI_IDENTIFICATION
            E1010328105755  External Interface: Tax Numbers                  BUS_EI_TAXNUMBER
            E1010328105801  External Interface: Communication Types          BUS_EI_COMMUNICATION
              E1010328105802  External Interface: Communication Data Telephone  BUS_EI_TEL_DATA
                E1010328105803  External Interface: Data for Telephone       BUS_EI_BUPA_TELEPHONE
                  E1010328105804  External Interface: Data for Creating a Telephone Number  BUS_EI_BUPA_TELEPHONE_CON
                  E1010328105807  External Interface: Communications Notes   BUS_EI_COMREM
                **** start repeating here: insert these segments once per additional phone number
                E1010328105803  External Interface: Data for Telephone       BUS_EI_BUPA_TELEPHONE
                  E1010328105804  External Interface: Data for Creating a Telephone Number  BUS_EI_BUPA_TELEPHONE_CON
                  E1010328105807  External Interface: Communications Notes   BUS_EI_COMREM
                **** end insertion
              E1010328105811  External Interface: Communication Data Fax     BUS_EI_FAX_DATA
    You can try the same for address data. Note: you don't need to renumber the repeated segments in the IDoc; the system does that for you.
    Make sure that the sequence of the other segments after the duplication stays the same, or you will get an error.
    Go to transaction WE19 and key in an existing, successfully posted IDoc. You can then edit this IDoc and insert the new segments for additional phone numbers for testing.
    Rgds
    JP

  • Treasury Data Migration

    Friends,
    One of our clients has decided to use SAP Treasury and Risk Management, and I have been asked to lead data migration in this area. I would like to know the master and transaction data generally used and needed for SAP Treasury Management. Also, let me know the various ways of migrating data into SAP TRM, e.g. BAPIs, Transaction Manager, LSMW, etc.
    Regards,
    Amey

    Hello Amey,
    Please find below a list of steps which might be helpful for the data migration.
    Legacy Data Transfer
    1. Legacy data transfer always happens on a key date.
    2. The procedure differs between securities and OTC transactions because they deal differently with positions on a key date.
    3. For securities, we simply need to copy the positions with all their values as of the key date.
    4. For OTC transactions, the entire transaction is copied, as there is no explicit position management in the Transaction Manager.
    Customizing the Legacy Data Transfer:
    You primarily need customizing for the update types that the flows of the legacy data transfer business transactions are to carry. The settings are under Treasury and Risk Management → Transaction Management → General Settings → Tools → Legacy Data Transfer → Flow Data → Update Types.
    The update types are divided into valuation-area-independent and valuation-area-dependent ones; the independent ones relate only to securities. While the legacy data transfer is usually not intended to update the general ledger, you must nevertheless mark most of the update types as posting-relevant and maintain an account determination in customizing. This is necessary to ensure that the data in position management is complete and, in particular, that the account assignment reference transfers can be performed correctly.
    Legacy Data Transfer for OTC:
    Step 1: Existing OTC transactions are entered the same way as new transactions, either manually or imported via the existing BAPIs. Use BAPI_FTR_FD_CREATE and BAPI_FTR_FXT_CREATE for creating fixed-term deposit and forex transactions respectively; there are similar BAPIs for each product, and you can get the list in transaction BAPI under the node Financials → Financial Supply Chain Management. Use transaction FTR_BAPI to test-run the Treasury BAPIs. You may define a separate transaction type for legacy transactions so that settlement or posting release is not necessary.
    Step 2: Now change the status of all the historical flows created in Step 1 to POSTED without actually generating accounting documents. To do this, use transaction TBB1_LC. This is similar to transaction TBB1; unlike TBB1, however, TBB1_LC affects all valuation areas without offering the option to affect only the operational area. You can see the flows posted by TBB1_LC in the posting journal (TPM20) without any reference to accounting documents.
    Step 3: So far the historical flows do not yet explain the book value of the OTC transactions, so you must first enter the historical valuation results. This is done using TPM63C, which you can find under Treasury and Risk Management → Transaction Management → General Settings → Tools → Legacy Data Transfer → Flow Data → Enter Valuation Area Dependent Data for MM. After entering the valuation results, use TPM63 to transfer the values to internal position management.
    Legacy Data Transfer for Futures & Listed Options:
    Step 1: Similar to OTC.
    Step 2: Similar to OTC.
    Step 3: Use transaction TPM63D (valuation area dependent data for futures). You must establish the relationship between the lot and the transaction on which it is based via the transaction number.
    Legacy Data Transfer for Securities:
    For a securities position, it does not matter whether it arose in the past through a single purchase or a series of business transactions; the legacy data transfer therefore seeks to build up the positions through one business transaction on the transfer key date.
    Step 1: Before entering business transactions, first create the master data of the security positions, i.e. their position indicators. I) Use TPM61A, or the menu path Treasury and Risk Management → Transaction Management → General Settings → Tools → Legacy Data Transfer → Position Data → Enter Position Information for Securities. II) Then create the position indicators using TPM61.
    Step 2: Now enter the business transactions: I) valuation-area-independent data for securities, for the external position; II) valuation-area-dependent data for securities, for the internal position. Note that you need to use the customizing activity "effects of update type on position" to make the update types change the positions. If you want to enter lot positions with the legacy data transfer transactions, use the item number, and ensure that the business transaction generating the position and the one carrying the valuation results have the same item number. If you have built up lots via the normal procedures, you must specify the corresponding transaction number in the transaction column when entering the valuation-area-dependent business transaction.
    Step 3: Now generate the business transactions using TPM63. You can initially build only the valuation-area-independent data (the external position) by checking the "only valuation area independent data" button in TPM63. The external position can be checked in the securities account cash flow, i.e. the external securities position, in TPM40.
    Also, we created an LSMW project for each product.
    Regards,
    Jain

  • Cross-Empire Data Migration - DST?

    We are still evaluating NSM 3.0.4 Cross-Empire Data Migration....
    It looks like the migration doesn't work with Dynamic Shadow Technology (DST) shadow volumes....
    I thought that, seeing as the NSM Engine and/or Agent doing the migration has to have the Novell Client installed, it would be able to migrate from both the primary and shadow volumes...
    However, our initial testing shows that it is ignoring the shadow-based data...
    Is this an oversight? Use of an incorrect/outdated API?
    Has anyone else any experience of this? We have a rather large amount of data in our shadow volumes!
    Cheers
    David

