Best practice for checking out a file

Hello,
What is the SAP best practice to check out a file? (DTR -> Edit or DTR -> Edit Exclusive?)
What are pros and cons of checking out a file exclusively?
Thanks
MLS

Thanks Pascal.
Also, I think that if a developer checks out exclusively, makes changes, and leaves the company without checking in, the only way out is to revert those files, in which case all of their changes will be gone.

Similar Messages

  • What is the best practice for checking if CR2008 runtime is installed?

    I've created our crystal report functions in a .Net exe and it is launched from a Delphi application.
    I need to check if the .Net runtime for CR 2008 is installed on a machine at startup of the delphi application.
    Right now I am checking the registry at
    HKEY_LOCAL_MACHINE\Software\Sap BusinessObjects\Crystal Reports For .Net Framework 4.0\Crystal Reports
    and checking the value for CRRuntime32Version.
    This works great if the user is an admin on the machine; however, I'm assuming that due to group policies and restrictions this registry key cannot be read for some reason. My prompt continues to show up after installation because it cannot get the value of the registry key.
    So before I get winded and ramble on, what is the best practice to test whether the runtime has been installed? Is there a particular section of the registry I can check? My next thought is to check for the runtime directory, but that might not be as efficient as I would hope.

    Registry and folder is about all I can think of. Problem is, you're never guaranteed that something was not installed and then uninstalled and "stuff" like folders are getting left behind (a common occurrence from my experience...). I've also seen registry entries left behind. Perhaps looking for crpe32.dll in the c:\Program Files\Business Objects\BusinessObjects Enterprise 12.0\win32_x86 folder will be best. I've never seen the crpe32.dll orphaned after an uninstall.
    Other than that, when you run the app and there is no runtime, you will get an error and you could trap that, possibly launch the installer... naaa - too goofy...
    Ludek
    Follow us on Twitter http://twitter.com/SAPCRNetSup
    Got Enhancement ideas? Try the [SAP Idea Place|https://ideas.sap.com/community/products_and_solutions/crystalreports]
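For what it's worth, the crpe32.dll check Ludek suggests can be a plain file-existence test. A minimal sketch follows; the directory is the one named in the reply above and may differ by version or bitness, so treat it as an assumption:

```java
import java.nio.file.Files;
import java.nio.file.Paths;

// Sketch of the crpe32.dll file-existence check suggested above.
// DEFAULT_DIR is the path from the reply; it can differ per machine,
// version, and bitness, so it is an assumption, not a constant truth.
class CrRuntimeCheck {
    static final String DEFAULT_DIR =
        "C:\\Program Files\\Business Objects\\BusinessObjects Enterprise 12.0\\win32_x86";

    // Returns true only if crpe32.dll exists as a regular file in dir.
    static boolean isRuntimePresent(String dir) {
        return Files.isRegularFile(Paths.get(dir, "crpe32.dll"));
    }
}
```

Unlike the registry read, this needs no admin rights; if it returns false, the app can fall back to prompting for the installer.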

  • Best practices for checked exceptions in Runnable.run

    Runnable.run cannot be modified to pass a checked exception to its parent, so it must deal with any checked exceptions that occur. Simply logging the error is inadequate, and I am wondering if there are any "best practices" on how to deal with this situation.
    Let me give a real-world example of what I'm talking about.
    When writing I/O code for a single-threaded app, I'll break the logic into methods, and declare these methods as throwing an IOException. Basically, I'll ignore all exceptions and simply pass them up the stack, knowing that Java's checked exception facility will force the caller to deal with error conditions.
    Some time later, I might try to improve performance by making the I/O code multithreaded. But now things get tricky because I can no longer ignore exceptions. When I refactor the code into a Runnable, it cannot simply toss IOExceptions to some future unnamed caller. It must now catch and handle the IOException. Of course, dealing with the problem by simply catching and logging the exception is bad, because the code that spawned the I/O thread won't know that anything went wrong. Instead, the I/O thread must somehow notify its parent that the exception occurred. But just how to do this is not straightforward.
    Any thoughts? Thanks.

    My suggestion: don't use Threads and Runnables like this.
    Instead implement Callable which can throw any Exception.
    Then use an ExecutorService to run that Callable.
    This will return a Future object which can throw an ExecutionException on get(), which you can then handle.
    This has the additional advantage that you can easily switch from a single-threaded serialized execution to a multi-threaded one by switching ExecutorService implementations (or even by tweaking the parameters of the ExecutorService implementation).
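A minimal sketch of that approach may help; the task, its failure message, and the class names here are invented for illustration:

```java
import java.io.IOException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the Callable + ExecutorService pattern described above:
// the task keeps declaring its checked exception, and the caller
// handles it via ExecutionException from Future.get().
class CallableDemo {
    // Stands in for the I/O work; declares its checked exception normally.
    static String readData(boolean fail) throws IOException {
        if (fail) throw new IOException("disk on fire");
        return "data";
    }

    static String run(boolean fail) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            // submit() accepts a Callable, which may throw any Exception
            Future<String> future = pool.submit(() -> readData(fail));
            return future.get(); // the checked exception resurfaces here
        } catch (ExecutionException e) {
            // e.getCause() is the original IOException from the worker thread
            return "failed: " + e.getCause().getMessage();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return "interrupted";
        } finally {
            pool.shutdown();
        }
    }
}
```

Swapping Executors.newSingleThreadExecutor() for a pooled implementation changes the execution model without touching the task code, which is the advantage mentioned above.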

  • What are best practices for rolling out cluster upgrade?

    Hello,
    I am looking for your input on approaches to implementing a Production RAC upgrade without having a spare set of RAC servers. We have a 2-node RAC 11.1 database in Production. We are planning to upgrade to 11.2, but only have a single database on which the pre-Production integration can be verified. Our concern is that the integration may behave differently on RAC vs. the single database instance. How does everybody else approach this problem?

    You want to test a RAC upgrade on a non-RAC database. If you ask me, that is a risk, but it depends on many things.
    Application configuration - if your application is configured for RAC, FAN, etc., you cannot test it on non-RAC systems.
    Cluster upgrade - if your standalone database is RAC One Node, you can probably test your cluster upgrade there. If you have a non-RAC database, then you will not be able to test the cluster upgrade or CRS.
    Database upgrade - there are differences when you upgrade a RAC vs. a non-RAC database which you will not be able to test.
    I think the best way for you is to convert your standalone database to a RAC One Node database and test there. That will take you close to multi-node RAC.

  • Best practice for exporting a .mov file for YouTube

    I am using FCPX 10.0.9 with a MacBook Pro. We are needing to upload relatively small-size high school football videos as .mov files to YouTube for my newspaper. I have been choosing the Export File option with the setting set to h.264, exporting the file to the desktop and then uploading that .mov file to YouTube. (I do not have Compressor installed on this Mac.) The other night the first of three files done like this uploaded fine, but the next two just kept processing and processing at YouTube and never finished. The following morning the files processed fine. Any thoughts about why this might be happening?
    And would it be better to export in another file format -- .mp4 files -- instead, which might cause less of a problem? And which of the export options would be best?
    Thanks,
    Douglas

    douglas i wrote:
    Am I correctly uploading a .mov file to YouTube by using the Export File command with the export setting set to h.264, if I want a reasonably sized file for upload?
    Sure. There are many ways to get videos to YT and yours is a reasonable approach.
    "Reasonable size" should be thought of in terms of upload time. Again, YT will re-compress whatever you give it. So depending on how quickly you need to get it up on the Web, it's better to upload files that have more information than less information. (To that end, some folks upload very large Pro Res files; they also have a lot of patience.) 
    Just to let you know, the majority of FCPX users follow one of two workflows:
    1) Share>You Tube option and have FCP handle the uploading;
    2) Export as a Pro Res Master File; bring the master file into Compressor and apply the user's custom settings; upload using YT's uploader.
    FWIW, I prefer the second workflow.
    Russ

  • Best practice for sending client ai files

    If a client requests ai files...and they don't have Illustrator - but they want the orig files for printers and vendors, should I save the file using legacy or just save as a normal cs4 ai file?
    Thx.
    Chemol

    in this case it's just a simple 2-colour logo (same file you helped me with earlier) - no effects, nothing to be changed. Nothing mentioned in the contract, I just want them to have the original file so they can provide it to print providers if need be in the future for whatever projects they use it for.
    Just a simple situation, wanting to provide most versatile format for them in the long run. I have CS4, and have been told by one past client (for a signage project) that the printer was unable to open the file - I think they had an older version of AI....
    Just curious - any insight would be helpful.
    Thx.

  • Best practices for setting up projects

    We recently adopted using Captivate for our WBT modules.
    As a former Flash and Director user, I can say it’s
    fast and does some great things. Doesn’t play so nice with
    others on different occasions, but I’m learning. This forum
    has been a great source for search and read on specific topics.
    I’m trying to understand best practices for using this
    product. We’ve had some problems with file size and
    incorporating audio and video into our projects. Fortunately, the
    forum has helped a lot with that. What I haven’t found a lot
    of information on is good or better ways to set up individual
    files, use multiple files and publish projects. We’ve decided
    to go the route of putting standalones on our Intranet. My gut says
    yuck, but for our situation I have yet to find a better way.
    My question for discussion, then is: what are some best
    practices for setting up individual files, using multiple files and
    publishing projects? Any references or input on this would be
    appreciated.

    Hi,
    Here are some of my suggestions:
    1) Set up a style guide for all your standard slides. Eg.
    Title slide, Index slide, chapter slide, end slide, screen capture,
    non-screen capture, quizzes etc. This makes life a lot easier.
    2) Create your own buttons and captions. The standard ones
    are pretty ordinary, and it's hard to get a slick looking style
    happening with the standard captions. They are pretty easy to
    create (search for add print button to learn how to create
    buttons). There should be instructions on how to customise captions
    somewhere on this forum. Customising means that you can also use
    words, symbols, colours unique to your organisation.
    3) Google elearning providers. Most use captivate and will
    allow you to open samples or temporarily view selected modules.
    This will give you great insight on what not to do and some good
    ideas on what works well.
    4) Timings: Using the above research, I got others to
    complete the sample modules to get a feel for timings. The results
    were clear, 10 mins good, 15 mins okay, 20 mins kind of okay, 30
    mins bad, bad, bad. It's truly better to have a learner complete
    2-3 short modules in 30 mins than one big monster. The other
    benefit is that shorter files equal smaller size.
    5) Narration: It's best to narrate each slide individually
    (particularly for screen capture slides). You are more likely to
    get it right on the first take, it's easier to edit and you don't
    have to re-record the whole thing if you need to update it in
    future. To get a slicker effect, use at least two voices: one male,
    one female and use slightly different accents.
    6) Screen capture slides: If you are recording filling out
    long window-based database pages where the compulsory fields are
    marked (eg. with a red asterisk) - you don't need to show how to
    fill out every field. It's much easier for the learner (and you) to
    show how to fill out the first few fields, then fade the screen
    capture out, fade the end of the form in with the instructions on
    what to do next. This will reduce your file size. In one of my
    forms, this meant the removal of about 18 slides!
    7) Auto captions: they are verbose (eg. 'Click on Print
    Button' instead of 'Click Print'; 'Select the Print Preview item'
    instead of 'Select Print Preview'). You have to edit them.
    8) PC training syntax: Buttons and hyperlinks should normally
    be 'click'; selections from drop down boxes or file lists are
    normally 'select': Captivate sometimes mixes them up. Instructions
    should always be written in the correct order: eg. Good: Click
    'File', Select 'Print Preview'; Bad: Select 'Print Preview' from
    the 'File Menu'. Button names, hyperlinks, selections are normally
    written in bold
    9) Instruction syntax: should always be written in an active
    voice: eg. 'Click Options to open the printer menu' instead of
    'When the Options button is clicked on, the printer menu will open'
    10) Break all modules into chapters. Frame each chapter with
    a chapter slide. It's also a good idea to show the Index page
    before each chapter slide with a progress indicator (I use an
    animated arrow to flash next to the name of the next chapter), I
    use a start button rather than a 'next' button for the start of each
    chapter. You should always have a module overview with the purpose
    of the course and a summary slide which states what was covered and
    that they have completed the module.
    11) Put a transparent click button somewhere on each slide.
    Set the properties of the click box to take the learner back to the
    start of the current chapter by pressing F2. This allows them to
    jump back to the start of their chapter at any time. You can also
    do a similar thing on the index pages which jumps them to another
    chapter.
    12) Recording video capture: best to do it at normal speed
    and be conscious of where your mouse is. Minimise your clicks. Most
    people (until they start working with captivate) are sloppy with
    their mouse and you end up with lots of unnecessary slides that
    you have to delete out. The speed will default to how you recorded
    it and this will reduce the amount of time you spend on changing
    timings.
    13) Captions: My rule of thumb is minimum of 4 seconds - and
    longer depending on the amount of words. Eg. Click 'Print Preview'
    is 4 seconds; a paragraph is longer. If you are creating
    knowledge-based modules, make the timing long (eg. 2-3 minutes) and put in a
    next button so that the learner can click when they are ready.
    Also, narration means the slides will normally be slightly longer.
    14) Be creative: Captivate is desk-bound. There are some
    learners that just don't respond no matter how interactive
    Captivate can be. Incorporate non-captivate and desk free
    activities. Eg. As part of our OHS module, there is an activity
    where the learner has to print off the floor plan, and then wander
    around the floor marking on the map key items such as: fire exits;
    first aid kit, broom and mop cupboard, stationery cupboard, etc.
    Good luck!

  • Best practice for optimizing processing of big XMLs?

    All,
    What is the best practice when dealing with large XML files (say, a couple of MBs)?
    Instead of having to read the file from the file system every time a static method is run, what would be the best way for the program to read the file once and then keep it in memory, so that the next time it would not have to read and parse it all over again?
    Currently my code just reads the file in the static method, like ...
    public static String doOperation(String path, ...) throws Exception {
        try {
            String masterFile = path + "configfile.xml";
            Document theFile = (Document) getDocument(masterFile);
            Element root = theFile.getDocumentElement();
            NodeList nl = root.getChildNodes();
            // ... operations on file
        }
        // ...
    }
    Optimization tips and tricks most appreciated :-)
    Thanks,
    David

    The best practice for multi-megabyte XML files is not to have them at all.
    However if you must, presumably you don't need all of the information in your XML, repeatedly. Or do you? If you need a little bit of it here, then another little bit of it there, then yet another little bit of it later, then you shouldn't have stored your data in one big XML.
    Sorry if that sounds unhelpful, but I'm having trouble imagining a scenario in which you need all the data in an XML document repeatedly. Perhaps you could expand on your design?
    PC²
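If the big file genuinely must be consulted from a static method repeatedly, the usual answer to David's question is to parse once and cache the result. A hedged sketch follows; the class and method names are invented, and a String stands in for the parsed Document so the example is self-contained:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative parse-once cache: the first call for a given path "parses"
// the file and stores the result; later calls return the cached object.
// In real code the value type would be org.w3c.dom.Document; note that a
// shared Document must not be mutated concurrently.
class ConfigCache {
    private static final Map<String, String> CACHE = new ConcurrentHashMap<>();

    static String get(String path) {
        // computeIfAbsent invokes parse() only on the first lookup per path
        return CACHE.computeIfAbsent(path, ConfigCache::parse);
    }

    private static String parse(String path) {
        // real code would run the XML parser here and return a Document
        return "parsed:" + path;
    }
}
```

If the file can change on disk, the cache would also need an invalidation rule (e.g. comparing the file's last-modified time), which this sketch omits.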

  • Best Practices for migrating .rpd

    Hi,
    Can someone please share with us the industry best practices for migrating the .rpd file from a source (QA) to a destination (Prod) environment? Our Oracle BI server runs on a Unix box, and currently what we do to migrate is save copies of the QA and Prod .rpds on our local machine and do a Merge. Once merged, we FTP the resulting .rpd to the destination (Prod) machine. This approach does not seem reliable, and we are looking for a better-established approach for .rpd migration. Kindly point me to where I can find the documentation. Any help in this regard would be highly appreciated.

    I don't think there is an industry best practice as such. What applies for one customer doesn't necessarily apply at another site.
    Have a read of this from Mark Rittman, it should give you some good ideas:
    http://www.rittmanmead.com/2009/11/obiee-software-configuration-management-part-2-subsequent-deployments-from-dev-to-prod/
    Paul

  • Redo log best practice for performace - ASM 11g

    Hi All,
    I am new to the ASM world... can you please give your valuable suggestions on the topic below.
    What is the best practice for online redo log files? I read somewhere that Oracle recommends having only two disk groups: one for all the DB files, control files, and online redo log files, and the other disk group for recovery files, like multiplexed redo files, archive log files, etc.
    Will there be any performance improvement from making a different diskgroup for online redo logs (separate from datafiles, control files, etc.)?
    I am looking for an Oracle document on the best practice for redo logs (performance) on ASM.
    Please share your valuable views on this.
    Regards.

    ASM is only a filesystem.
    What really counts for I/O performance is the storage design, i.e. the RAID level used, array sharing, hard disk RPM, and so on.
    ASM itself is only a layer for reading and writing on ASM disks, which means that if your storage design is OK, ASM will be OK. (Of course there are best practices for ASM, but storage must come first.)
    e.g. Is there any performance improvement from making a different diskgroup?
    It depends, and this leads to another question:
    Are the LUNs on the same array?
    If yes, the performance will end up at the same point, no matter whether you have one diskgroup or many.
    Comment added:
    Comparing ASM to Filesystem in benchmarks (Doc ID 1153664.1)
    Your main concern should be how many IOPS, what latency, and what throughput the storage can give you. Based on these values you will know if your redo logs will be fine.

  • Best practice for importing non-"Premiere-ready" video files

    Hello!
    I work with internal clients that provide me with a variety of different video types (it could be almost ANYTHING: WMV, MP4, FLV).  I of course ask for AVIs when possible, but unfortunately, I have no control over the type of file I'm given.
    And, naturally, Premiere (just upgraded to CS5) has a hard time dealing with these files.  Unpredictable, ranging from working fine to not working at all, and everything in between.  Naturally, it's become a huge issue for turnaround time.
    Is there a best practice for preparing files for editing in Premiere?
    I've tried almost everything I can think of: converting the file(s) to AVIs using a variety of programs/methods.  Most recently, I tried creating a Watch Folder in Adobe Media Encoder and setting it for AVI with the proper aspect ratio.  It makes sense to me that that should work: using an Adobe product to render the file into something Premiere can work with.
    However, when I imported the resulting AVI into Premiere, it gave me the Red Line of Un-renderness (that is the technical term, right?), and had the same sync issue I experienced when I brought it in as a WMV.
    Given our environment, I'm completely fine with adding render time to the front-end of projects, but it has to work.  I want files that Premiere likes.
    THANK YOU in advance for any advice you can give!
    -- Dave

    I use an older conversion program (my PrPro has a much older internal AME, unlike yours), DigitalMedia Converter 2.7. It is shareware, and has been replaced by Deskshare with newer versions, but my old one works fine. I have not tried the newer versions yet. One thing that I like about this converter is that it ONLY uses System CODEC's, and does not install its own, like a few others. This DOES mean that if I get footage with an oddball CODEC, I need to go get it, and install it on the System.
    I can batch process AV files of most types/CODEC's, and convert to DV-AVI Type II w/ 48KHz 16-bit PCM/WAV Audio and at 29.97 FPS (I am in NTSC land). So far, 99% of the resultant converted files have been perfect, whether from DivX, WMV, MPEG-2, or almost any other format/CODEC. If there is any OOS, my experience has been that it will be static, so I just have to adjust the sync offset by a few frames, and that takes care of things.
    In a few instances, the PAR flag has been missed (Standard 4:3 vs Widescreen 16:9), but Interpret Footage has solved those few issues.
    Only oddity that I have observed (mostly with DivX, or WMV's) is that occasionally, PrPro cannot get the file's Duration correct. I found that if I Import those problem files into PrElements, and then just do an Export, to the same exact specs., that resulting file (seems to be 100% identical, but something has to be different - maybe in the header info?) Imports perfectly into PrPro. This happens rarely, and I have the workaround, though it is one more step for those. I have yet to figure out why one very similar file will convert with the Duration info perfect, and then a companion file will not. Nor have I figured out exactly what is different, after running through PrE. Every theory that I have developed has been shot down by my experiences. A mystery still.
    AME works well for most, as a converter, though there are just CODEC's, that Adobe programs do not like, such as DivX and Xvid. I doubt that any Adobe program will handle those suckers easily, if at all.
    Good luck,
    Hunt

  • Best/Easiest Practice for Distribution of Webhelp files for Fat(ish) Client Application

    So another in my long line of questions in trying to change the implementation of the help systems at my new employer.
    So, the pressure is to convince the existing guard to toss out the old method and go with a new method... One of the issues has come up for CSH Webhelp (via RH10)...
    Apparently, in the old method, every help topic was its own CHM (no, I'm not kidding).  They kept it that way so that any time a help file was updated, they could just send out that one CHM to update that one topic.
    Now they are worried that if we switch to webhelp using topics in one single big project that we can't just send out a single updated file any more...
    So I'm trying to get ammunition as to how updates to the help can be distributed without it being a big hassle...
    We do have small updates that go out about every week, and they are afraid that going to the webhelp (1 project with lots of topics) method will require more time and effort for the techs because they will have to complete the update of a huge help system every time.
    One solution we have suggested is that we only update help files on major releases and add any changes to help procedures in the Release Notes or in a separate PDF, and also include a note that the help files "will be updated to reflect this change in the XX/2015 XYZ Release").
    Does ANYONE have any other ideas of how distribution could be done efficiently so that we don't have to continue this "project per topic" fiasco?
    HELP!!! PLEASE!!!!

    Amebr - you hit the nail on the head... I want to apologize to everyone for all this mass hysteria... but this has been a freaking rollercoaster... one day they are happy with the plan, the next day they decide they want to complicate it more... So now my job is to find out how viable, how tricky, and perhaps how dangerous it will be to send out JUST the files that have been changed.  Especially for CSH!
    Anyone have any experience with this or have any advice?  Peter?  I suspect you might know how this might work... are there any pitfalls to watch out for? I just have this fear of sending out a few files from an entire project... Although, it appears when I publish, ONLY the files that have been edited in some way show a new date, and that includes non-topic files - so I'm assuming sending all those will keep the TOC, Index, and Search features working properly... as well as incoming and outgoing links from an edited topic?
    Another question would be how to handle new images in a project... as the image folder tends to be the largest, I don't want to have to send the entire image folder... Thoughts...????  Advice?  Jeff? Peter? Amebr? Bueller?  Bueller?  Bueller....?
    THANKS!
    OH... and Jeff... my apologies for the misnomer of using the term "Fat-ish" client in reference to this... I totally misspoke and should have said Mobile App... but in my mind, any time you have to download something to use it, it's a fat client... sorry!  :-/

  • Best practice for including additional DLLs/data files with plug-in

    Hi,
    Let's say I'm writing a plug-in which calls code in additional DLLs, and I want to ship these DLLs as part of the plug-in.  I'd like to know what is considered "best practice" in terms of whether this is ok  (assuming of course that the un-installer is set up to remove them correctly), and if so, where is the best place to put the DLLs.
    Is it considered ok at all to ship additional DLLs, or should I try and statically link everything?
    If it's ok to ship additional DLLs, should I install them in the same folder as the plug-in DLL (e.g. the .8BF or whatever), in a subfolder of the plug-in folder or somewhere else?
    (I have the same question about shipping additional files too, such as data or resource files.)
    Thanks
                             -Matthew

    Brother wrote:
    I probably should have posed this question here before I delved into writing Java to get data for reports, but better late than never.
    Our ERP is written in COBOL. We have a third party ODBC which allows us to access data using a version of SQL. I have several Java sources compiled in my database that access the data and return something relevant. The Java sources are written in a procedural style rather than taking advantage of object oriented programming with attributes and methods.
    OO is a choice, not a mandate. Using Java in a procedural way is certainly not ideal, but given that it is existing code, I would look more into whether it is well-written procedural code rather than at the lack of OO.
    Now that I am becoming more comfortable with the Java language, I would greatly appreciate any feedback as to best practices for incorporating Java into my database.
    My guess is that it would be helpful to model the ERP "tables" with Java classes that would have attributes, which correspond to the fields, and methods to return the attributes in an appropriate way. Does that sound reasonable? If so, is there a way to automate the task of modeling the tables? If not reasonable, what would you recommend?
    Normally you create a data model driven by business need. You then implement, using whatever means seem expedient in terms of other business constraints, to closely model that data model.
    It is often the case that there is a strong correlation between data models and tables, but certainly in my experience it is rare that there are not other needs driven by the data model (such as how foreign keys and link tables are implemented and used).
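For illustration, the "model the tables with Java classes" idea described above might look like the following; the class, its fields, and the formatting are invented for this sketch, not taken from any real ERP schema:

```java
// Illustrative "table as a class" sketch: fields map to the ERP table's
// columns, and accessors return the attributes "in an appropriate way"
// (here, an integer cents column formatted as display currency).
class Customer {
    private final String id;
    private final int creditLimitCents;

    Customer(String id, int creditLimitCents) {
        this.id = id;
        this.creditLimitCents = creditLimitCents;
    }

    String getId() { return id; }

    // Format the raw column value for use in a report.
    String getCreditLimitDisplay() {
        return String.format("$%d.%02d",
                creditLimitCents / 100, creditLimitCents % 100);
    }
}
```

The data-model point above still applies: a class per table is a starting shape, but relationships (foreign keys, link tables) usually force the model to diverge from the physical tables.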

  • Best practice for how to access a set of wsdl and xsd files

    I've recently been poking around with the Oracle ESB, which requires a bunch of WSDL and XSD files from HOME/bpel/system/xmllib. What is the best practice for including these files in a BPEL project? It seems like a bad idea to copy all these files into every project that uses the ESB, especially if there are quite a few consumers of the bus. Is there a way I can reference this directory from the project so that the files can just stay in a common place for all the projects that use them?
    Bret

    Hi,
    I created a project (JDeveloper) with local XSD files and tried to delete and recreate them in the structure pane with references to a version on the application server. After reopening the project I deployed it successfully to the BPEL server. The process is working fine, but in the structure pane there is no information about any of the XSDs anymore, and for the payload in the variables there is an exception (problem building schema).
    How does BPEL know where to look for the XSD files, and how does the mapping still work?
    This cannot be the way to do it correctly. Do I have a chance to rework an existing project, or do I have to rebuild it from scratch in order to have all the references right?
    Thanks for any clue.
    Bette
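For reference, "keeping the files in a common place" usually means pointing the schema imports at a hosted copy instead of per-project copies. A hedged illustration follows; the namespace and host are invented, and it assumes the shared xmllib directory is served over HTTP:

```xml
<!-- Illustrative import referencing a shared, hosted schema rather
     than a copy checked into each BPEL project. -->
<xsd:import namespace="http://example.com/esb/common"
            schemaLocation="http://soa-host:8888/xmllib/CommonTypes.xsd"/>
```

The trade-off is that both design-time tooling and the runtime need network access to that location, which may explain the structure-pane errors described above when the reference cannot be resolved.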

  • Editing XMP metadata in Bridge for checked-out files?

    Our custom connector does not implement the WRITE_XMP_METADATA Capability because we do not want to change the XMP of a file in our DAM system without checking in a new file version.
    However, when a file is checked out through Adobe Drive, we would expect to be able to edit its XMP metadata in Bridge just like we can for local files.
    It looks like this is still not possible with Adobe Drive 3; i.e., if your connector does not implement the WRITE_XMP_METADATA Capability, you cannot edit XMP for checked-out files either.
    It seems weird that you can edit XMP metadata for checked-out files with regular CS applications but not with Bridge. Is this something that is expected to be fixed in a future version?
    Thanks

    In our design, editing XMP data in Bridge and in regular CS applications is different: the former is routed to the SetXMPHandler in your connector, while in the latter case CS applications take the normal file-writing path to embed XMP data in the file content.
    The purpose of this design is to offer a similar user experience for normal users when they edit XMP data for AD assets and local file assets; they don't need to check out files first.
    To address your issue, I think you can add some code in SetXMPHandler to auto check-out files before the real XMP operation, and check them in before exiting the handler.
    We will consider making some enhancements for this part, that is, allowing XMP data to be edited for checked-out files in Bridge, in the next release. Thanks for this advice.
